Module: Langfuse::PromptCacheEvents

Included in:
ApiClient
Defined in:
lib/langfuse/prompt_cache_events.rb

Overview

Prompt cache event emission for ApiClient.

Includers must expose:

  • `cache_backend_name` — used in #event_payload to tag the cache backend

  • `logger` — used to warn on observer/notifier failures
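
As a sketch, an includer satisfying this contract might look like the following (the class name and the "memory" backend tag are illustrative assumptions, not part of the gem):

```ruby
require "logger"

# Hypothetical includer sketch; ApiClientLike and the "memory" tag are
# invented for illustration.
class ApiClientLike
  # include Langfuse::PromptCacheEvents  # mixed in by the real gem

  # Tags each event payload's :backend field.
  def cache_backend_name
    "memory"
  end

  # Receives warnings when an observer or notifier raises.
  def logger
    @logger ||= Logger.new($stdout)
  end
end
```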

Constant Summary

PROMPT_CACHE_NOTIFICATION =

ActiveSupport::Notifications event name used for prompt cache events.

"prompt_cache.langfuse"

Class Method Summary

Instance Method Summary

Class Method Details

.build_payload(key, cache_status:, source:, backend:, extra: {}) ⇒ Hash

Build a prompt cache event payload from a key, status, source, and backend. Shared by the ApiClient mixin and PromptCacheCoordinator so a payload-shape change can’t drift between the two emitters.

Parameters:

  • key (PromptCacheKey)

    Logical and storage cache key

  • cache_status (Symbol)

    Cache status

  • source (Symbol)

    Event source

  • backend (String)

    Backend identifier

  • extra (Hash) (defaults to: {})

    Additional payload fields

Returns:

  • (Hash)

    Event payload



# File 'lib/langfuse/prompt_cache_events.rb', line 23

def self.build_payload(key, cache_status:, source:, backend:, extra: {})
  {
    name: key.name,
    version: key.version,
    label: key.resolved_label,
    logical_key: key.logical_key,
    storage_key: key.storage_key,
    backend: backend,
    cache_status: cache_status,
    source: source
  }.merge(extra)
end
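
A usage sketch of the payload shape (the `PromptKey` Struct is a stand-in for `PromptCacheKey`, and all field values are invented for illustration):

```ruby
# Self-contained copy of build_payload from the listing above.
module PromptCacheEventsSketch
  def self.build_payload(key, cache_status:, source:, backend:, extra: {})
    {
      name: key.name,
      version: key.version,
      label: key.resolved_label,
      logical_key: key.logical_key,
      storage_key: key.storage_key,
      backend: backend,
      cache_status: cache_status,
      source: source
    }.merge(extra)
  end
end

# Stand-in for PromptCacheKey; the real class ships with the gem.
PromptKey = Struct.new(:name, :version, :resolved_label, :logical_key, :storage_key,
                       keyword_init: true)

key = PromptKey.new(name: "greeting", version: 3, resolved_label: "production",
                    logical_key: "greeting@production",
                    storage_key: "langfuse:prompt:greeting@production")

payload = PromptCacheEventsSketch.build_payload(
  key, cache_status: :hit, source: :memory, backend: "memory", extra: { latency_ms: 4 }
)
payload[:cache_status] # => :hit
payload[:latency_ms]   # => 4
```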

Instance Method Details

#emit_prompt_cache_event(event, payload = nil) ⇒ void

This method returns an undefined value.

Emit a prompt cache event to configured hooks. Accepts an eager payload hash or a block that builds one. The block is only evaluated when at least one listener is active, avoiding hash allocations on the hot path.

Parameters:

  • event (Symbol)

    Event name

  • payload (Hash, nil) (defaults to: nil)

    Event payload (omit when passing a block)

Yield Returns:

  • (Hash)

    Lazily constructed payload



# File 'lib/langfuse/prompt_cache_events.rb', line 54

def emit_prompt_cache_event(event, payload = nil)
  observer_callable = @cache_observer_callable
  as_listening = active_support_listening?
  return if observer_callable.nil? && !as_listening

  payload ||= block_given? ? yield : {}
  normalized_payload = payload.merge(event: event.to_sym)
  notify_cache_observer(normalized_payload) if observer_callable
  notify_active_support(normalized_payload) if as_listening
end
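
The lazy-payload behavior can be sketched in isolation (`LazyEmitter` is a hypothetical stand-in; the real method additionally checks for ActiveSupport::Notifications listeners):

```ruby
# Minimal re-creation of the lazy-payload pattern above: the block that
# builds the payload is only evaluated when a listener is attached.
class LazyEmitter
  def initialize(observer: nil)
    @observer = observer
  end

  def emit(event, payload = nil)
    return if @observer.nil?              # no listener: the block is never called
    payload ||= block_given? ? yield : {}
    @observer.call(payload.merge(event: event.to_sym))
  end
end

built = 0
silent = LazyEmitter.new
silent.emit(:hit) { built += 1; { name: "greeting" } }
built # => 0  (block skipped: no listener, no hash allocation)

events = []
loud = LazyEmitter.new(observer: ->(p) { events << p })
loud.emit(:hit) { built += 1; { name: "greeting" } }
built        # => 1
events.first # => { name: "greeting", event: :hit }
```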

#emit_prompt_fallback_event(key, cache_status:, error:) ⇒ void

This method returns an undefined value.

Emit a fallback event for a prompt fetch that fell back to caller-provided content.

Parameters:

  • key (PromptCacheKey)

    Logical and storage cache key

  • cache_status (Symbol)

    Cache status to report

  • error (StandardError)

    The error that triggered the fallback



# File 'lib/langfuse/prompt_cache_events.rb', line 71

def emit_prompt_fallback_event(key, cache_status:, error:)
  emit_prompt_cache_event(:fallback) do
    event_payload(key, cache_status, CacheSource::FALLBACK,
                  error_class: error.class.name, error_message: error.message)
  end
end
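
The surrounding fetch-and-fallback flow might look like this sketch (`fetch_prompt_with_fallback` and any payload fields beyond `error_class`/`error_message` are illustrative assumptions):

```ruby
# Hypothetical caller-side sketch: serve caller-provided default content when
# the fetch fails, emitting a :fallback event carrying the error's class and
# message.
def fetch_prompt_with_fallback(default_text, emit)
  raise "connect timed out"   # simulated remote failure
rescue StandardError => e
  emit.call({ event: :fallback, cache_status: :miss, source: :fallback,
              error_class: e.class.name, error_message: e.message })
  default_text
end

events = []
result = fetch_prompt_with_fallback("Hello!", ->(payload) { events << payload })
result                     # => "Hello!"
events.first[:error_class] # => "RuntimeError"
```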

#setup_prompt_cache_events(cache_observer:) ⇒ void

This method returns an undefined value.

Configure prompt cache event dispatch. Wraps the observer once into a 1-arg callable so the per-event hot path never re-checks arity.

Parameters:

  • cache_observer (#call, nil)

    Optional observer



# File 'lib/langfuse/prompt_cache_events.rb', line 41

def setup_prompt_cache_events(cache_observer:)
  @cache_observer_callable = wrap_cache_observer(cache_observer)
  @active_support_notifications = defined?(ActiveSupport::Notifications) ? ActiveSupport::Notifications : nil
end
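
The one-time arity normalization mentioned above can be sketched like so (`wrap_observer` and the 2-arg `(event, payload)` calling convention are assumptions for illustration):

```ruby
# Normalize any observer into a 1-arg callable once, at setup time, so the
# per-event hot path can always do observer.call(payload) without re-checking
# arity.
def wrap_observer(observer)
  return nil if observer.nil?
  arity = observer.respond_to?(:arity) ? observer.arity : observer.method(:call).arity
  return observer if arity == 1 || arity < 0  # already 1-arg (or splat): use as-is
  # Assumed legacy convention: 2-arg observers take (event, payload).
  ->(payload) { observer.call(payload[:event], payload) }
end

seen = []
two_arg = ->(event, payload) { seen << [event, payload] }
wrapped = wrap_observer(two_arg)
wrapped.call({ event: :hit, name: "greeting" })
seen.first # => [:hit, { event: :hit, name: "greeting" }]
```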