The best thing about OpenLLMetry is that it supports a wide range of LLMs and vector databases out of the box: you just install the SDK and get metrics, traces, and logs without any extra work. Check out the list of supported systems on Python and on TypeScript. If your favorite vector DB or LLM is not supported by OpenLLMetry, you can still use OpenLLMetry to report the LLM and vector DB calls manually. Please open an issue for us as well so we can prioritize adding support for your favorite system. Here's how you can do that manually in the meantime:
Reporting LLM calls
To track a call to an LLM, just wrap that call in your code with the withLLMCall function in TypeScript or track_llm_call in Python.
These functions pass a parameter you can use to report the request and response from this call.
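To make the wrap-and-report pattern concrete, here is a minimal, self-contained Python sketch. The real track_llm_call comes from the Traceloop SDK; the stand-in context manager and span class below are hypothetical and only mimic the shape of the API (report the request, make your call, report the response), so the example runs without the SDK or an LLM provider.

```python
# Hypothetical stand-in for the SDK's track_llm_call, for illustration only.
from contextlib import contextmanager
from dataclasses import dataclass, field

@dataclass
class LLMSpan:
    vendor: str
    type: str
    request: dict = field(default_factory=dict)
    response: dict = field(default_factory=dict)

    def report_request(self, model: str, messages: list) -> None:
        # Record what you sent to the LLM.
        self.request = {"model": model, "messages": messages}

    def report_response(self, model: str, completions: list) -> None:
        # Record what the LLM returned.
        self.response = {"model": model, "completions": completions}

@contextmanager
def track_llm_call(vendor: str, type: str):
    # The real SDK would open an OpenTelemetry span here and
    # attach the reported data to it as attributes.
    span = LLMSpan(vendor=vendor, type=type)
    yield span

# Usage: wrap your own LLM client call and report both sides.
with track_llm_call(vendor="my-llm", type="chat") as span:
    span.report_request(
        model="my-model-v1",
        messages=[{"role": "user", "content": "Hello"}],
    )
    completion = "Hi there!"  # pretend this came from your LLM client
    span.report_response(model="my-model-v1", completions=[completion])
```

The key point is only the shape of the flow: report the request before the call, then report the response after it, all inside the wrapper so everything lands on one span.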
Reporting Vector DB calls
To track a call to a vector DB, just wrap that call in your code with the withVectorDBCall function.
This function passes a parameter you can use to report the query vector as well as the results from this call.
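The same wrap-and-report pattern applies to vector DB calls. The sketch below uses a hypothetical Python stand-in (the document names withVectorDBCall for TypeScript; the Python name and span methods here are assumptions for illustration), so it runs without any vector database installed.

```python
# Hypothetical stand-in for a vector DB call wrapper, for illustration only.
from contextlib import contextmanager
from dataclasses import dataclass, field

@dataclass
class VectorDBSpan:
    vendor: str
    query_vector: list = field(default_factory=list)
    results: list = field(default_factory=list)

    def report_query(self, vector: list) -> None:
        # Record the embedding you searched with.
        self.query_vector = vector

    def report_results(self, results: list) -> None:
        # Record what the similarity search returned.
        self.results = results

@contextmanager
def with_vector_db_call(vendor: str):
    # The real SDK would open an OpenTelemetry span here and
    # attach the query and results to it as attributes.
    span = VectorDBSpan(vendor=vendor)
    yield span

# Usage: wrap your similarity search and report the query and its results.
with with_vector_db_call(vendor="my-vector-db") as span:
    query = [0.1, 0.2, 0.3]
    span.report_query(query)
    hits = [{"id": "doc-1", "score": 0.92}]  # pretend these came from the DB
    span.report_results(hits)
```

As with the LLM case, reporting both the query vector and the results inside the wrapper ties them to a single span in your traces.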

