- Set your Requesty API key
- Set your Requesty base URL
- Choose one of the 300+ supported models
- Access to all the best LLMs
- A single API key to access all the providers
- Very clear spending dashboards
- Telemetry and logging out of the box
Option no. 1 - Configure via environment variables
Set:

- OPENAI_API_KEY=[Your Requesty API key]
- OPENAI_BASE_URL="https://router.requesty.ai/v1"

Then use your OpenAI client as usual (just set the model parameter).
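As a minimal sketch, the two environment variables can be exported like this (the key value is a placeholder for your real Requesty API key):

```shell
# Placeholder key — replace with your actual Requesty API key
export OPENAI_API_KEY="YOUR_REQUESTY_API_KEY"

# Point the OpenAI-compatible client at the Requesty router
export OPENAI_BASE_URL="https://router.requesty.ai/v1"
```

With these set, any OpenAI SDK that reads the standard environment variables will route requests through Requesty without code changes.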
Option no. 2 - Configure the client
Load your Requesty API key any way you want. Pass the api_key and api_base_url to the client, set the model parameter to any model, and you're done!
(Yes, you can use xAI or any other model without changing anything but the model parameter.)