Fetch prompts
This guide shows you how to fetch the deployed version of your prompt in your code. You can do this using the Agenta SDK (Python).
Fetching prompts with the Agenta SDK
Step 1: Setup
Make sure to install the latest version of the Agenta Python SDK (`pip install -U agenta`).
- Set up the environment variables:
  - `AGENTA_API_KEY` for cloud users.
  - `AGENTA_HOST` set to `http://localhost` if you are self-hosting.
  - `AGENTA_PROJECT_ID` set to your project ID.
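
If you prefer to set these from Python rather than exporting them in your shell, a minimal sketch is shown below. The values are placeholders; set them before initializing the SDK.

```python
import os

# Placeholder values - replace with your own key, host, and project ID.
os.environ["AGENTA_API_KEY"] = "your-api-key"        # cloud users only
os.environ["AGENTA_HOST"] = "http://localhost"       # only if you are self-hosting
os.environ["AGENTA_PROJECT_ID"] = "your-project-id"
```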
Step 2: Fetch the prompt
```python
from agenta import Agenta

agenta = Agenta()
# Fetch the configuration deployed to "production", cached for 200 seconds
config = agenta.get_config(base_id="xxxxx", environment="production", cache_timeout=200)
```
The response object is an instance of `GetConfigResponse` from `agenta.client.backend.types.get_config_response`. It contains the following attributes:
- `config_name`: the name of the configuration, for example `'default'`.
- `current_version`: the version number of the configuration, for example `1`.
- `parameters`: a dictionary containing the configuration of the application, for instance:
```python
{'temperature': 1.0,
 'model': 'gpt-3.5-turbo',
 'max_tokens': -1,
 'prompt_system': 'You are an expert in geography.',
 'prompt_user': 'What is the capital of {country}?',
 'top_p': 1.0,
 'frequence_penalty': 0.0,
 'presence_penalty': 0.0,
 'force_json': 0}
```
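
To illustrate how the fetched parameters might be wired into an LLM call, here is a minimal sketch. It assumes the OpenAI Python client and an example `country` value of `"France"`; neither is part of the Agenta SDK, and you can map the parameters onto whichever client you actually use.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment
p = config.parameters  # dictionary fetched above

response = client.chat.completions.create(
    model=p["model"],
    temperature=p["temperature"],
    top_p=p["top_p"],
    messages=[
        {"role": "system", "content": p["prompt_system"]},
        # Fill the prompt template variable; "France" is just an example value.
        {"role": "user", "content": p["prompt_user"].format(country="France")},
    ],
)
print(response.choices[0].message.content)
```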