Module: PlatformSdk::Observability::Langfuse::RubyLLMAdapter
- Defined in:
- lib/platform_sdk/observability/langfuse/ruby_llm_adapter.rb
Overview
Fires `llm_call.platform_sdk` ActiveSupport::Notifications events with a payload extracted from a RubyConversations- or RubyLLM-shaped chat.
Apps that use the RubyLLM (or RubyConversations) gem call this from any LLM call site to capture cost, token counts, model, input, and output in Langfuse without needing to know the OTel attribute keys:
PlatformSdk::Observability::Langfuse::RubyLLMAdapter.fire(
  conversation: conversation_manager,
  context: 'chat_response'
)
The adapter is RubyLLM-aware but does not require the RubyLLM gem at load time — `RubyLLM::Content` references are guarded by `defined?`, so apps that don't use RubyLLM can load the SDK without paying for this code path.
Dedup: nested or repeated calls that share the same `chat.messages.last` object are skipped, so a single LLM response can't double-fire when multiple layers of error-handling each wrap the same yield. The dedup state is registered via `Langfuse.track_thread_local`, so `Traceable`-wrapped Sidekiq jobs get it cleared in their ensure block. Callers outside `Traceable` (Rails controllers, long-lived Puma threads) should invoke `Langfuse.clear_tracked_thread_locals!` at their request/operation boundary to avoid dedup state leaking between unrelated calls.
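The thread-local dedup described above can be sketched in plain Ruby. This is a hypothetical minimal version for illustration only — the module and method names (`LastMessageDedup`, `fire?`, `clear!`) are not part of the SDK; the real adapter stores its state under `:platform_sdk_langfuse_last_message_id` and clears it via `Langfuse.clear_tracked_thread_locals!`:

```ruby
# Minimal sketch (hypothetical names) of object-identity dedup keyed
# on the last chat message, stored in a thread-local.
module LastMessageDedup
  KEY = :platform_sdk_langfuse_last_message_id

  # Returns true (and records the message) only the first time a given
  # message object is seen on this thread; repeats are skipped.
  def self.fire?(message)
    return false if Thread.current[KEY] == message.object_id

    Thread.current[KEY] = message.object_id
    true
  end

  # Call at a request/operation boundary, analogous to
  # Langfuse.clear_tracked_thread_locals!.
  def self.clear!
    Thread.current[KEY] = nil
  end
end

msg = Object.new
LastMessageDedup.fire?(msg) # => true  (first sighting fires)
LastMessageDedup.fire?(msg) # => false (same message object skipped)
LastMessageDedup.clear!
LastMessageDedup.fire?(msg) # => true  (fresh after the boundary)
```

Keying on `object_id` rather than message content is what makes nested error-handling wrappers safe: each wrapper sees the very same message object, so only the innermost call fires.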
Constant Summary
- MAX_PROMPT_MESSAGES = 30
- THREAD_LOCAL_KEY = :platform_sdk_langfuse_last_message_id
Class Method Summary
-
.fire(context:, conversation: nil, chat: nil, error: nil) ⇒ Object
Fire an `llm_call.platform_sdk` notification for a single LLM call.
Class Method Details
.fire(context:, conversation: nil, chat: nil, error: nil) ⇒ Object
Fire an `llm_call.platform_sdk` notification for a single LLM call. Accepts either a RubyConversations conversation (preferred — the adapter reads `conversation.chat` and `conversation.model_identifier`) or a raw RubyLLM::Chat. Returns nil — never raises.
# File 'lib/platform_sdk/observability/langfuse/ruby_llm_adapter.rb', line 43

def fire(context:, conversation: nil, chat: nil, error: nil)
  payload = build_payload(context:, conversation:, chat:, error:)
  return unless payload

  ActiveSupport::Notifications.instrument(LLM_CALL_EVENT, payload)
rescue StandardError => e
  OpenTelemetry.handle_error(
    message: "RubyLLMAdapter.fire failed: #{e.class}: #{e.message[0, 200]}"
  )
  nil
end
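The "returns nil — never raises" contract can be illustrated with a self-contained sketch. `safe_fire` and `emit` below are hypothetical stand-ins, and the sketch reports via `warn` where the real method uses `OpenTelemetry.handle_error`; only the rescue shape and the 200-character message truncation mirror the adapter:

```ruby
# Hypothetical stand-in showing the never-raises pattern: any error in
# the instrumented path is caught, truncated, reported, and swallowed.
def safe_fire(context:)
  emit(context) # the part that may raise
rescue StandardError => e
  # Report instead of re-raising; cap the message at 200 chars like
  # the adapter does with e.message[0, 200].
  warn "fire failed: #{e.class}: #{e.message[0, 200]}"
  nil
end

# Hypothetical emitter: fails loudly when context is missing.
def emit(context)
  raise ArgumentError, 'context is required' if context.nil?

  :instrumented
end

safe_fire(context: 'chat_response') # => :instrumented
safe_fire(context: nil)             # => nil (error reported, not raised)
```

This shape guarantees observability code can never break the LLM call site it is instrumenting: the worst case is a dropped telemetry event, never a user-facing exception.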