IFTracer is a Python SDK developed by InsightFinder to facilitate monitoring and debugging of Large Language Model (LLM) executions. Built on top of OpenTelemetry, it provides non-intrusive tracing capabilities, allowing seamless integration with your observability stack. With the iftracer-sdk and InsightFinder's AI Observability platform, you can monitor, analyze, and debug your LLM applications with deep visibility into distributed system operations.
Features
- Distributed Tracing: Collect and correlate data across services.
- Custom Annotations: Add domain-specific insights to trace data.
- Real-Time Analytics: Powered by InsightFinder’s monitoring platform.
- New Tags: Additional data extracted from LLM executions as tags, such as the LLM model's name, the vector database's embedding model name, and the dataset retrieved by RAG, to assist tracing.
- Customizable: More tags can be added if needed.
- Easy to Use: No need for third-party services such as a self-hosted OpenTelemetry collector; the only requirement is an InsightFinder account.
Installation
Prerequisites
Ensure that Python is installed on your system. The SDK is compatible with Python versions 3.9 and above.
Installing via PyPI
To install the iftracer-sdk from PyPI using pip, execute the following command:
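The install command was lost from this section; the sketch below assumes the published package name matches the SDK name used throughout this document (iftracer-sdk):

```shell
pip install iftracer-sdk
```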
Installing Directly from GitHub
To install iftracer-sdk directly from the source code on GitHub, execute the following command:
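The command was lost from this section; the sketch below assumes the repository lives under the insightfinder GitHub organization (replace the URL with the actual repository location):

```shell
pip install git+https://github.com/insightfinder/iftracer-sdk.git
```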
Installing Using Local Source Code
The iftracer-sdk can also be installed from a local copy of the source code by executing the following command:
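The command was lost from this section; assuming the source has already been cloned locally, an editable install from the project root is a common approach:

```shell
cd iftracer-sdk   # path to your local clone of the source code
pip install -e .  # editable install from the local source tree
```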
Setup
Importing the Package
Once the iftracer-sdk is installed, it can be imported into any Python project. To import the SDK, add the following line to your code:

```python
from iftracer.sdk import Iftracer
```
Initialization
To initialize the iftracer-sdk, the following values will be needed:
| Configuration Setting | Env Variable | Description | Example |
| --- | --- | --- | --- |
| api_endpoint | IFTRACER_BASE_URL | InsightFinder's API endpoint for the trace collector. | "https://otlp.insightfinder.com" |
| iftracer_user | IFTRACER_USER | InsightFinder account user name. | "user" |
| iftracer_license_key | IFTRACER_LICENSE_KEY | InsightFinder account's license key. | "jws786923h8690a716001133" |
| iftracer_project | IFTRACER_PROJECT | The trace project in InsightFinder used for analysis. | "trace-project-1" |
The iftracer-sdk can be initialized by adding the following code to your project's `__init__.py` or the entry point of the project:

```python
from iftracer.sdk import Iftracer

Iftracer.init(
    api_endpoint="http://<host>:<port>",  # contact InsightFinder devops to get your unique URL
    iftracer_user="...",  # found on the User Account Information page (https://app.insightfinder.com/account-info)
    iftracer_license_key="...",  # the 'License Key' value on the same page
    iftracer_project="...",  # your project's name; any string works
)
```
The iftracer-sdk needs to be initialized only once; the call sets the configuration values for the rest of the process.
Using Environment Variables
The iftracer-sdk can also be initialized using environment variables. Use the bash export command to configure the following variables:

```shell
export IFTRACER_BASE_URL="http://<host>:<port>"
export IFTRACER_USER="xxx"
export IFTRACER_LICENSE_KEY="xxx"
export IFTRACER_PROJECT="xxx"
```
The iftracer-sdk can now be initialized as:
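With the variables above exported, the initialization call needs no arguments; a minimal sketch:

```python
from iftracer.sdk import Iftracer

# Picks up the IFTRACER_* values from the environment.
Iftracer.init()
```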
The export command(s) set the environment variables in the OS environment, so the iftracer-sdk can then be initialized without passing any values to the initialization call; it reads the variables configured by the export command(s).
IFTracer Decorators
The iftracer-sdk provides a set of decorators that can be applied to functions to capture tracing details. Different decorators should be used depending on the use case:
- @workflow and @task can be used over synchronous functions; for asynchronous functions, @aworkflow and @atask should be used.
- @workflow or @aworkflow should be used when:
- the function calls multiple tasks or workflows and combines their results
- the function is a high-level orchestration of a process
- you need to get more tags
- you intend to create a logical boundary for a workflow execution.
- For all other functionalities, @task or @atask should be used
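The guidance above can be sketched structurally. The decorators below are no-op stand-ins so the example runs without the SDK installed; in real code you would import @workflow/@task (and @aworkflow/@atask for async functions) from the iftracer-sdk package instead:

```python
import functools

def task(name=None):
    """Stand-in for iftracer's @task: a no-op pass-through decorator."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            return fn(*args, **kwargs)
        return inner
    return wrap

workflow = task  # stand-in for @workflow; same decorator shape

@task(name="fetch_context")
def fetch_context(question):
    # A leaf unit of work -> decorate with @task.
    return f"context for {question!r}"

@workflow(name="answer_question")
def answer_question(question):
    # Orchestrates other tasks and combines their results -> @workflow.
    context = fetch_context(question)
    return f"{context}\n\nQ: {question}"

print(answer_question("What is tracing?"))
```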
Trace Prompt and Response Content
Sometimes the tracer can't capture the details of the LLM model. For example, responses from LangChain's ainvoke()/invoke() normally require users to attach an LLM callback_handler to surface model details such as prompt tokens. The iftracer-sdk package provides the function trace_model_response(<Response which you want to get details>) to capture those details and avoid the handler. The functionality can be used as follows:

```python
async def create_joke():
    response = await joke_generator_langchain.ainvoke(
        {"question": prompt},
        config=config,
    )
    trace_model_response(response)  # optional; required if you must show LLM model details
    return response
```
Example/Quick Start
```python
import openai

from iftracer.sdk import Iftracer
# Also import the @workflow decorator from the iftracer-sdk package
# (the import was omitted in the original example).

@workflow(name="get_chat_completion_test")
def get_gpt4o_mini_completion(messages, model="gpt-4o-mini", temperature=0.7):
    """
    Function to get a response from the GPT-4o-mini model using the updated
    OpenAI API. Note: set your OpenAI API key before using this function.

    Args:
        messages (list): List of messages (system, user, assistant).
        model (str): The model to use; default is "gpt-4o-mini".
        temperature (float): Controls the randomness of the response.

    Returns:
        The model-generated response.
    """
    try:
        # Make a request to OpenAI's Chat Completions API
        response = openai.chat.completions.create(
            model=model, messages=messages, temperature=temperature
        )
        # Return the full response object
        return response
    except openai.OpenAIError as e:
        # Catch OpenAI-specific errors
        print(f"OpenAI API error: {e}")
        return None


# Example usage
if __name__ == "__main__":
    # Set your OpenAI API key here, or export it as an environment variable.
    openai.api_key = "sk-proj-..."

    Iftracer.init(
        api_endpoint="http://<host>:<port>",
        iftracer_user="...",
        iftracer_license_key="...",
        iftracer_project="...",
    )

    messages = [
        {"role": "system", "content": "You are a helpful assistant."},
        {
            "role": "user",
            "content": "Can you write a haiku about recursion in programming?",
        },
    ]

    response = get_gpt4o_mini_completion(messages)
    if response:
        print("GPT-4o-mini Response:", response)
```
FAQ
How to obtain the InsightFinder User account information and License Key?
To obtain the user information and license key, click the top-right profile icon and select 'User Settings'.
In the Account Profile Page, the ‘iftracer_user’ value should be the ‘User Name’, and the ‘iftracer_license_key’ value should be taken from the ‘License Key’.
Why can’t I find the new tags?
The iftracer-sdk won't extract metadata such as prompt tokens from the response if the chain (e.g., joke_generator_langchain) contains RunnablePassthrough(). Try removing RunnablePassthrough() from the chain and stringifying the result later. If the issue persists, please contact our support team.