Do you use PydanticAI?

Integrating Requesty is a super simple three-step process:

  • Set your Requesty API key
  • Set your Requesty base URL
  • Choose one of the 300+ supported models

And get immediate value:

  • Access to all the best LLMs
  • A single API key to access all the providers
  • Very clear spending dashboards
  • Telemetry and logging out of the box

You can use the Requesty router to access any supported LLM and get cost management, monitoring, and fallbacks out of the box.

Configure via environment variables

Set the following environment variables:
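
These are the defaults read by the OpenAI-compatible client that PydanticAI's OpenAIModel uses under the hood; the router URL shown is the usual Requesty endpoint, so check your Requesty dashboard for the exact value and for your API key:

  OPENAI_API_KEY=<your Requesty API key>
  OPENAI_BASE_URL=https://router.requesty.ai/v1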

Change the model parameter to any supported model, and you’re done!

(Yes, you can use Anthropic or any other provider’s model without changing anything but the model parameter.)

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

# The model name is the Requesty model ID; the request is sent to whatever
# base URL the OpenAI client is configured with (here, the Requesty router).
model = OpenAIModel(
    "anthropic/claude-sonnet-4-20250514",
)

agent = Agent(model)

async def main():
    result = await agent.run("What should I build with the Requesty router, now that I have access to 300+ LLMs?")
    print(result.output)  # the run result's output holds the model's text response

if __name__ == "__main__":
    import asyncio
    asyncio.run(main())
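
If you'd rather not rely on environment variables, recent PydanticAI versions also let you pass the Requesty configuration explicitly through a provider. This is a minimal sketch, assuming the OpenAIProvider API and the same router URL as above, with the key read from a REQUESTY_API_KEY variable of your choosing:

import os

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

# Point the OpenAI-compatible model at the Requesty router explicitly,
# instead of relying on OPENAI_API_KEY / OPENAI_BASE_URL.
model = OpenAIModel(
    "anthropic/claude-sonnet-4-20250514",
    provider=OpenAIProvider(
        base_url="https://router.requesty.ai/v1",
        api_key=os.environ["REQUESTY_API_KEY"],
    ),
)

agent = Agent(model)

The rest of the agent code stays exactly the same.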