The TensorZero Gateway can be used with the TensorZero Python client, with OpenAI clients (e.g. Python/Node), or via its HTTP API in any programming language.
You can install the TensorZero Python client with:

```bash
pip install tensorzero
```
You can avoid `await` in `build_embedded` by setting `async_setup=False`. This is useful for synchronous contexts like `__init__` functions, where `await` cannot be used. However, avoid it in asynchronous contexts, as it blocks the event loop; in async contexts, use the default `async_setup=True` with `await`. For example, it's safe to use `async_setup=False` when initializing a FastAPI server, but not while the server is actively handling requests.
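As a sketch of the two patterns above (assuming the Python client's `AsyncTensorZeroGateway.build_embedded` and an illustrative config path):

```python
class MyService:
    """Illustrative service that builds an embedded gateway client at startup."""

    def __init__(self):
        # Sync context: __init__ cannot await, so async_setup=False builds
        # the client eagerly. This blocks, so only do it at startup, never
        # while an event loop is actively serving requests.
        from tensorzero import AsyncTensorZeroGateway  # assumed import path

        self.client = AsyncTensorZeroGateway.build_embedded(
            config_file="config/tensorzero.toml",  # hypothetical path
            async_setup=False,
        )


async def build_client():
    # Async context: keep the default async_setup=True and await the result
    # so setup does not block the event loop.
    from tensorzero import AsyncTensorZeroGateway  # assumed import path

    return await AsyncTensorZeroGateway.build_embedded(
        config_file="config/tensorzero.toml",  # hypothetical path
    )
```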
You can avoid `await` in `build_http` by setting `async_setup=False`. See above for more details.
You can avoid `await` in `patch_openai_client` by setting `async_setup=False`. See above for more details.
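A minimal sketch of the synchronous pattern for `patch_openai_client` (assuming `from tensorzero import patch_openai_client` and an illustrative config path):

```python
def build_patched_client():
    # Imports deferred so this sketch only needs the packages when called.
    from openai import OpenAI
    from tensorzero import patch_openai_client  # assumed import path

    client = OpenAI()
    # async_setup=False returns the patched client directly rather than a
    # coroutine, which is what you want in synchronous startup code.
    return patch_openai_client(
        client,
        config_file="config/tensorzero.toml",  # hypothetical path
        async_setup=False,
    )
```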
The `model` parameter should be one of the following:
- `tensorzero::function_name::<your_function_name>`. For example, if you have a function named `generate_haiku`, you can use `tensorzero::function_name::generate_haiku`.
- `tensorzero::model_name::<your_model_name>`. For example, if you have a model named `my_model` in the config file, you can use `tensorzero::model_name::my_model`. Alternatively, you can use default models like `tensorzero::model_name::openai::gpt-4o-mini`.
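The model strings above are plain prefixed names; a small sketch of how they are formed (the function and model names are illustrative):

```python
def function_model(function_name: str) -> str:
    # Route the request to a TensorZero function defined in your config.
    return f"tensorzero::function_name::{function_name}"


def named_model(model_name: str) -> str:
    # Route the request to a model, either one from the config file or a
    # default model like "openai::gpt-4o-mini".
    return f"tensorzero::model_name::{model_name}"


print(function_model("generate_haiku"))    # tensorzero::function_name::generate_haiku
print(named_model("openai::gpt-4o-mini"))  # tensorzero::model_name::openai::gpt-4o-mini
```

Either string is then passed as the `model` argument of a chat completion request made through an OpenAI client pointed at the gateway.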
You can provide TensorZero-specific parameters (e.g. `episode_id` and `variant_name`) by prefixing them with `tensorzero::` in the `extra_body` field in OpenAI client requests.
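For example, a sketch of building such a payload (the episode ID and variant name are illustrative values):

```python
def tensorzero_params(episode_id: str, variant_name: str) -> dict:
    # TensorZero-specific parameters are namespaced with the
    # "tensorzero::" prefix inside the OpenAI client's extra_body.
    return {
        "tensorzero::episode_id": episode_id,
        "tensorzero::variant_name": variant_name,
    }


extra_body = tensorzero_params(
    episode_id="00000000-0000-0000-0000-000000000000",  # illustrative UUID
    variant_name="my_variant",  # illustrative variant name
)
print(sorted(extra_body))  # ['tensorzero::episode_id', 'tensorzero::variant_name']
```

You would then pass `extra_body=extra_body` to `client.chat.completions.create(...)`.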