Class: Openlayer::Integrations::GoogleConversationalSearchTracer

Inherits: Object
Defined in:
lib/openlayer/integrations/google_conversational_search_tracer.rb

Overview

Tracer for Google Cloud DiscoveryEngine ConversationalSearchService

This class provides integration with Google’s ConversationalSearchService to automatically trace answer_query calls and send them to the Openlayer platform.

Examples:

Basic usage

require 'openlayer/integrations/google_conversational_search_tracer'
require 'google/cloud/discovery_engine/v1'

google_client = Google::Cloud::DiscoveryEngine::V1::ConversationalSearchService::Client.new
openlayer = Openlayer::Client.new(api_key: ENV['OPENLAYER_API_KEY'])

Openlayer::Integrations::GoogleConversationalSearchTracer.trace_client(
  google_client,
  openlayer_client: openlayer,
  inference_pipeline_id: 'your-pipeline-id'
)

# Now all answer_query calls are automatically traced
response = google_client.answer_query(
  serving_config: "projects/.../servingConfigs/default",
  query: { text: "What is the meaning of life?" }
)
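With a fixed session and user ID (a sketch; both IDs below are hypothetical). When given, these keyword arguments take precedence over any session or user auto-extracted from the response:

```ruby
Openlayer::Integrations::GoogleConversationalSearchTracer.trace_client(
  google_client,
  openlayer_client: openlayer,
  inference_pipeline_id: 'your-pipeline-id',
  session_id: 'session-123', # hypothetical session ID
  user_id: 'user-456'        # hypothetical user ID
)
```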

Class Method Summary

Class Method Details

.send_trace(args:, kwargs:, response:, start_time:, end_time:, openlayer_client:, inference_pipeline_id:, session_id: nil, user_id: nil) ⇒ void

This method returns an undefined value.

Send trace data to the Openlayer platform

Parameters:

  • args (Array)

    Original method positional arguments

  • kwargs (Hash)

    Original method keyword arguments

  • response (Google::Cloud::DiscoveryEngine::V1::AnswerQueryResponse)

    The API response

  • start_time (Time)

    Request start time

  • end_time (Time)

    Request end time

  • openlayer_client (Openlayer::Client)

    Openlayer client instance

  • inference_pipeline_id (String)

    Pipeline ID

  • session_id (String, nil) (defaults to: nil)

    Optional session ID (takes precedence over auto-extracted)

  • user_id (String, nil) (defaults to: nil)

    Optional user ID



# File 'lib/openlayer/integrations/google_conversational_search_tracer.rb', line 99

def self.send_trace(args:, kwargs:, response:, start_time:, end_time:, openlayer_client:, inference_pipeline_id:, session_id: nil, user_id: nil)
  # Calculate latency
  latency_ms = ((end_time - start_time) * 1000).round(2)

  # Extract query from request
  query_text = extract_query(args, kwargs)

  # Extract answer and metadata from response
  answer_data = extract_answer_data(response)

  # Extract additional metadata (the helper name is reconstructed from the
  # surrounding extract_* calls; the identifier was garbled in this listing)
  metadata = extract_metadata(args, kwargs, response, latency_ms)

  # Rough estimate of prompt and completion tokens
  prompt_tokens = (query_text.length / 4.0).ceil
  completion_tokens = (answer_data[:answer_text].length / 4.0).ceil

  # Extract grounding information from metadata for step root level
  citations = metadata.delete(:citations)
  references = metadata.delete(:references)
  related_questions = metadata.delete(:relatedQuestions)

  # Extract context from references (array of content strings)
  context = if references && references.is_a?(Array)
    references.map { |ref| ref[:content] }.compact
  else
    nil
  end

  # Extract nested steps (Google's execution steps)
  answer = response.respond_to?(:answer) ? response.answer : nil
  nested_steps = answer ? extract_steps(answer, query_text) : nil

  # Build step object
  step = {
    name: "Conversational Search answer_query",
    type: "chat_completion",
    provider: "Google",
    startTime: start_time.to_i,
    endTime: end_time.to_i,
    latency: latency_ms,
    metadata: metadata,
    inputs: {
      prompt: [
        {role: "user", content: query_text}
      ]
    },
    output: answer_data[:answer_text],
    promptTokens: prompt_tokens,
    completionTokens: completion_tokens,
    tokens: prompt_tokens + completion_tokens,
    model: "google-discovery-engine"
  }

  # Add grounding information at step root level
  step[:citations] = citations if citations
  step[:references] = references if references
  step[:relatedQuestions] = related_questions if related_questions

  # Add nested steps (Google's execution steps as RetrieverSteps)
  step[:steps] = nested_steps if nested_steps && !nested_steps.empty?

  # Build trace data in Openlayer format
  trace_data = {
    config: {
      inputVariableNames: ["query"],
      outputColumnName: "answer",
      latencyColumnName: "latency_ms",
      timestampColumnName: "timestamp"
    },
    rows: [
      {
        query: query_text,
        answer: answer_data[:answer_text],
        latency_ms: latency_ms,
        timestamp: start_time.to_i,
        metadata: metadata,
        steps: [step]
      }
    ]
  }

  # Add context column if available
  if context && !context.empty?
    trace_data[:rows][0][:context] = context
    trace_data[:config][:contextColumnName] = "context"
  end

  # Determine which session to use (kwarg takes precedence over auto-extracted)
  final_session = session_id || metadata[:session]
  if final_session
    trace_data[:rows][0][:session_id] = final_session
    trace_data[:config][:sessionIdColumnName] = "session_id"
  end

  # Determine which user_id to use (kwarg takes precedence over auto-extracted)
  user_pseudo_id = extract_user_pseudo_id(response)
  final_user_id = user_id || user_pseudo_id
  if final_user_id
    trace_data[:rows][0][:user_id] = final_user_id
    trace_data[:config][:userIdColumnName] = "user_id"
  end

  # Send to Openlayer
  openlayer_client
    .inference_pipelines
    .data
    .stream(
      inference_pipeline_id,
      **trace_data
    )
end
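The token counts above come from a chars-divided-by-4 heuristic rather than a real tokenizer, so they are only rough estimates and will diverge from model-specific token counts. The heuristic in isolation:

```ruby
# Rough token estimate used by send_trace: roughly 4 characters per token,
# rounded up. Real tokenizers vary by model, so treat this as approximate.
def estimate_tokens(text)
  (text.length / 4.0).ceil
end

query = "What is the meaning of life?"            # 28 chars
answer = "A much-debated philosophical question." # 38 chars

prompt_tokens = estimate_tokens(query)        # => 7
completion_tokens = estimate_tokens(answer)   # => 10
puts prompt_tokens + completion_tokens        # => 17
```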

.trace_client(client, openlayer_client:, inference_pipeline_id:, session_id: nil, user_id: nil) ⇒ void

This method returns an undefined value.

Enable tracing on a Google ConversationalSearchService client

Parameters:

  • client (Google::Cloud::DiscoveryEngine::V1::ConversationalSearchService::Client)

    The Google client instance to trace

  • openlayer_client (Openlayer::Client)

    The Openlayer client instance for sending traces

  • inference_pipeline_id (String)

    The Openlayer inference pipeline ID to send traces to

  • session_id (String, nil) (defaults to: nil)

    Optional session ID to use for all traces. Takes precedence over auto-extracted sessions.

  • user_id (String, nil) (defaults to: nil)

    Optional user ID to use for all traces.



# File 'lib/openlayer/integrations/google_conversational_search_tracer.rb', line 46

def self.trace_client(client, openlayer_client:, inference_pipeline_id:, session_id: nil, user_id: nil)
  # Store original method reference
  original_answer_query = client.method(:answer_query)

  # Define traced wrapper method
  client.define_singleton_method(:answer_query) do |*args, **kwargs, &block|
    # Capture start time
    start_time = Time.now

    # Execute the original method
    response = original_answer_query.call(*args, **kwargs, &block)

    # Capture end time
    end_time = Time.now

    # Send trace to Openlayer (with error handling)
    begin
      GoogleConversationalSearchTracer.send_trace(
        args: args,
        kwargs: kwargs,
        response: response,
        start_time: start_time,
        end_time: end_time,
        openlayer_client: openlayer_client,
        inference_pipeline_id: inference_pipeline_id,
        session_id: session_id,
        user_id: user_id
      )
    rescue StandardError => e
      # Never break the user's application due to tracing errors
      GoogleConversationalSearchTracer.warn_if_debug("[Openlayer] Failed to send trace: #{e.message}")
      GoogleConversationalSearchTracer.warn_if_debug("[Openlayer] #{e.backtrace.first(3).join("\n")}") if e.backtrace
    end

    # Always return the original response
    response
  end

  nil
end
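The wrapping technique above can be exercised in isolation. The sketch below uses a stand-in `FakeClient` (not the real Google client) to show how `Method#call` plus `define_singleton_method` intercepts a call on a single instance without touching the class:

```ruby
# Stand-in for the Google client; only illustrates the wrapping pattern.
class FakeClient
  def answer_query(query:)
    "answer to #{query}"
  end
end

client = FakeClient.new
original = client.method(:answer_query) # bound reference to the original
calls = []                              # stands in for sending a trace

# Redefine the method on this instance only (its singleton class).
client.define_singleton_method(:answer_query) do |*args, **kwargs, &block|
  start_time = Time.now
  response = original.call(*args, **kwargs, &block)
  calls << { kwargs: kwargs, latency_ms: ((Time.now - start_time) * 1000).round(2) }
  response # always return the original response, as trace_client does
end

puts client.answer_query(query: "hello") # => "answer to hello"
puts calls.length                        # => 1
```

Because the override lives on the singleton class, other instances of the client class remain untraced, and the original response is always returned even if recording fails.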

.warn_if_debug(message) ⇒ void

This method returns an undefined value.

Log warning message if debug mode is enabled

Parameters:

  • message (String)

    Warning message



# File 'lib/openlayer/integrations/google_conversational_search_tracer.rb', line 642

def self.warn_if_debug(message)
  warn(message) if ENV["OPENLAYER_DEBUG"]
end
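The guard can be tested in isolation; `debug_warn` below is a stand-in for `warn_if_debug`, not the real method. `Kernel#warn` writes to `$stderr`, and any non-nil value of `OPENLAYER_DEBUG` enables output:

```ruby
require "stringio"

# Same ENV-gated pattern as warn_if_debug: suppress unless the flag is set.
def debug_warn(message)
  warn(message) if ENV["OPENLAYER_DEBUG"]
end

$stderr = StringIO.new # capture warnings for inspection

ENV.delete("OPENLAYER_DEBUG")
debug_warn("hidden")              # suppressed: flag unset

ENV["OPENLAYER_DEBUG"] = "1"
debug_warn("[Openlayer] visible") # written to $stderr
```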