Learn how to use TensorZero with Google AI Studio LLMs: open-source gateway, observability, optimization, evaluations, and experimentation.
We recommend the shorthand `google_ai_studio_gemini::model_name` to use a Google AI Studio (Gemini API) model with TensorZero, unless you need advanced features like fallbacks or custom credentials.

You can use Google AI Studio (Gemini API) models in your TensorZero variants by setting the `model` field to `google_ai_studio_gemini::model_name`.
For example:
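A minimal configuration sketch is shown below; the function name, variant name, and model ID (`gemini-2.0-flash`) are illustrative placeholders, so substitute your own:

```toml
# tensorzero.toml
[functions.my_function]
type = "chat"

[functions.my_function.variants.my_variant]
type = "chat_completion"
model = "google_ai_studio_gemini::gemini-2.0-flash"
```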
Additionally, you can set `model_name` in the inference request to use a specific Google AI Studio (Gemini API) model, without having to configure a function and variant in TensorZero.
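For instance, a request to the gateway's inference endpoint might look like the following sketch (the endpoint path and model ID are assumptions based on TensorZero's native API; adjust them to your deployment):

```http
POST /inference
Content-Type: application/json

{
  "model_name": "google_ai_studio_gemini::gemini-2.0-flash",
  "input": {
    "messages": [
      { "role": "user", "content": "What is the capital of Japan?" }
    ]
  }
}
```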
By default, the gateway reads your credentials from the `GOOGLE_AI_STUDIO_API_KEY` environment variable. Make sure to set it before running the gateway.
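For example, you can export the variable in the shell session that launches the gateway (the key value here is a placeholder):

```shell
# Export the API key (placeholder value) so the gateway can read it at startup
export GOOGLE_AI_STUDIO_API_KEY="your-api-key-here"

# Confirm the variable is set
echo "$GOOGLE_AI_STUDIO_API_KEY"
```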
You can customize the credential location by setting the `api_key_location` field to `env::YOUR_ENVIRONMENT_VARIABLE` or `dynamic::ARGUMENT_NAME`.
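For example, to read the key from a different environment variable, you could define the model explicitly in your configuration; the model name, provider layout, and variable name in this sketch are assumptions, so check the Configuration Reference for the exact schema:

```toml
# tensorzero.toml
[models.my_gemini]
routing = ["google_ai_studio_gemini"]

[models.my_gemini.providers.google_ai_studio_gemini]
type = "google_ai_studio_gemini"
model_name = "gemini-2.0-flash"
api_key_location = "env::MY_GEMINI_API_KEY"
```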
See the Credential Management guide and Configuration Reference for more information.
Then, start the TensorZero gateway with `docker compose up`.
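A minimal Docker Compose sketch for the gateway is shown below; the image name, config path, and port are assumptions, so consult the TensorZero deployment docs for the current setup:

```yaml
# docker-compose.yml
services:
  gateway:
    image: tensorzero/gateway
    volumes:
      # Mount the directory containing tensorzero.toml
      - ./config:/app/config:ro
    environment:
      # Forward the API key from the host environment
      - GOOGLE_AI_STUDIO_API_KEY=${GOOGLE_AI_STUDIO_API_KEY}
    ports:
      - "3000:3000"
```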