LlamaIndex Integration
We're excited to announce observability support for LlamaIndex applications.
If you're using LlamaIndex, you can now see detailed traces in Agenta to debug your application.
The integration uses auto-instrumentation: add a single line of code and all of your LlamaIndex operations will be traced automatically.
This helps when you need to understand what's happening inside your RAG pipeline, track performance bottlenecks, or debug issues in production.
Check out the tutorial and the Jupyter notebook for more details.