I am testing the Langfuse integration for observability with a LangGraph graph. I have a runnable object after compiling the workflow:

```ts
workflow.addNode(...)
workflow.addNode(...)
workflow.addEdge(...)
this.runnable = workflow.compile();

const callbackHandler = new CallbackHandler({
  userId,
});

const finalState = await this.runnable.invoke(initialState, {
  configurable: { llm },
  callbacks: [callbackHandler],
});
```
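
For reference, here is a condensed, self-contained sketch of roughly what I am doing (the state annotation, node name, and `userId` value are placeholders; `CallbackHandler` is imported from `@langfuse/langchain`):

```ts
import { StateGraph, Annotation, START, END } from "@langchain/langgraph";
import { CallbackHandler } from "@langfuse/langchain";

// Placeholder state: a single string channel.
const StateAnnotation = Annotation.Root({
  answer: Annotation<string>,
});

const workflow = new StateGraph(StateAnnotation)
  .addNode("generate", async (_state) => ({ answer: "hello" }))
  .addEdge(START, "generate")
  .addEdge("generate", END);

const runnable = workflow.compile();

// One handler per request; userId is attached to the resulting Langfuse trace.
const callbackHandler = new CallbackHandler({ userId: "user-123" });

const finalState = await runnable.invoke(
  { answer: "" },
  { callbacks: [callbackHandler] }
);
console.log(finalState);
```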
In the service, I also have the following initialization code:

```ts
// ... initialization of keys via config

const sdk = new NodeSDK({
  spanProcessors: [
    new LangfuseSpanProcessor({ publicKey, secretKey, baseUrl, environment }),
  ],
});
sdk.start();
```
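
For completeness, the same initialization written out as a self-contained sketch, including an explicit flush on shutdown; the environment variable names are placeholders for however the keys are actually loaded from config:

```ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { LangfuseSpanProcessor } from "@langfuse/otel";

const sdk = new NodeSDK({
  spanProcessors: [
    new LangfuseSpanProcessor({
      publicKey: process.env.LANGFUSE_PUBLIC_KEY!,   // placeholder env vars
      secretKey: process.env.LANGFUSE_SECRET_KEY!,
      baseUrl: process.env.LANGFUSE_BASE_URL,
      environment: process.env.LANGFUSE_ENVIRONMENT,
    }),
  ],
});
sdk.start();

// ... run the graph ...

// Flush and shut down explicitly, to rule out spans being dropped
// because they were never exported before the process exited.
await sdk.shutdown();
```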
The credentials and base URL appear to be set up correctly, since I can retrieve prompts, but after following the documentation I still cannot see any traces for my graph invocations.
Relevant packages & versions:

```json
"@opentelemetry/sdk-node": "^0.207.0",
"@langfuse/client": "^4.3.0",
"@langfuse/core": "^4.3.0",
"@langfuse/langchain": "^4.3.0",
"@langfuse/otel": "^4.3.0",
"langchain": "^0.3.19",
"@langchain/community": "^0.3.35",
"@langchain/core": "^0.3.42",
"@langchain/google-genai": "^0.2.0",
"@langchain/langgraph": "^0.2.55",
"@langchain/openai": "^0.4.4",
```