Module: LLM::OpenAIMethods

Included in:
OpenAI, OpenWebUIMethods
Defined in:
lib/scout/llm/backends/openai.rb

Overview

OpenAI Chat Completions backend.

Implemented as a module exposing singleton methods (`LLM::OpenAI.ask`, etc.). We compose the backend by:

- prepending OpenAIMethods into the singleton class (overrides)
- including Backend::ClassMethods into the singleton class (shared logic)
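The prepend/include split determines method lookup order on the singleton class: prepended overrides are consulted before included shared logic, and `super` in an override falls through to the shared implementation. A minimal stand-alone sketch of this pattern (module and method names here are illustrative, not the real Scout modules):

```ruby
# Overrides is consulted first because it is prepended to the
# singleton class; SharedLogic provides the fallback via `super`.
module Overrides
  def greet
    "overridden: " + super
  end
end

module SharedLogic
  def greet
    "hello"
  end
end

module Backend
  singleton_class.prepend Overrides    # overrides win
  singleton_class.include SharedLogic  # shared defaults
end

puts Backend.greet  # => "overridden: hello"
```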

Instance Method Summary

Instance Method Details

#format_tool_call(message) ⇒ Object



# File 'lib/scout/llm/backends/openai.rb', line 44

def format_tool_call(message)
  tool_call = IndiferentHash.setup(JSON.parse(message[:content]))
  arguments = tool_call.delete('arguments') || {}
  name = tool_call[:name]
  tool_call['type'] = 'function'
  tool_call['function'] ||= {}
  tool_call['function']['name'] ||= name || 'function'
  tool_call['function']['arguments'] = arguments.to_json
  { role: 'assistant', tool_calls: [tool_call] }
end
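A stand-alone sketch of the transformation this method performs, using plain Hashes and string keys in place of IndiferentHash (an adaptation for illustration; the real method operates on Scout message objects):

```ruby
require 'json'

# Plain-Hash sketch: the message content is JSON describing a tool call;
# it is reshaped into an OpenAI assistant message carrying `tool_calls`.
def format_tool_call_sketch(message)
  tool_call = JSON.parse(message[:content])
  arguments = tool_call.delete('arguments') || {}
  name = tool_call['name']  # string key; IndiferentHash also accepts :name
  tool_call['type'] = 'function'
  tool_call['function'] ||= {}
  tool_call['function']['name'] ||= name || 'function'
  tool_call['function']['arguments'] = arguments.to_json
  { role: 'assistant', tool_calls: [tool_call] }
end

msg = { content: { id: 'call_1', name: 'get_weather',
                   arguments: { city: 'Madrid' } }.to_json }
result = format_tool_call_sketch(msg)
```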

#format_tool_definitions(tools) ⇒ Object



# File 'lib/scout/llm/backends/openai.rb', line 24

def format_tool_definitions(tools)
  tools.values.collect do |obj, definition|
    definition = obj if Hash === obj

    definition = case definition[:function]
                 when Hash
                   definition
                 else
                   { type: :function, function: definition }
                 end

    definition = IndiferentHash.add_defaults definition, type: :function

    definition[:parameters].delete :defaults if definition[:parameters]

    definition
  end
end
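A plain-Hash sketch of the per-definition normalization: bare function definitions are wrapped into the `{type: :function, function: ...}` envelope the Chat Completions API expects (`IndiferentHash.add_defaults` is approximated here with `||=`; names are illustrative):

```ruby
# Wrap a bare function definition into the API envelope, defaulting
# the type, and drop any top-level :defaults under :parameters.
def normalize_tool_definition(definition)
  unless Hash === definition[:function]
    definition = { type: :function, function: definition }
  end
  definition[:type] ||= :function
  definition[:parameters].delete(:defaults) if definition[:parameters]
  definition
end

bare = { name: 'get_weather', parameters: { type: 'object' } }
wrapped = normalize_tool_definition(bare)

pre = { function: { name: 'f' } }  # already wrapped: only type defaulted
norm = normalize_tool_definition(pre)
```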

#format_tool_output(message, last_id = nil) ⇒ Object



# File 'lib/scout/llm/backends/openai.rb', line 55

def format_tool_output(message, last_id = nil)
  info = JSON.parse(message[:content])
  id = info.delete('call_id') || info.dig('id') || last_id
  info['role'] = 'tool'
  info['tool_call_id'] = id
  info
end
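A stand-alone sketch of the same reshaping: the stored tool result (JSON in the message content) becomes a `tool` role message linked back to the originating call id (plain Hashes for illustration):

```ruby
require 'json'

# The call id is taken from 'call_id', then 'id', then the caller-supplied
# fallback, and attached as 'tool_call_id' on the 'tool' role message.
def format_tool_output_sketch(message, last_id = nil)
  info = JSON.parse(message[:content])
  id = info.delete('call_id') || info['id'] || last_id
  info['role'] = 'tool'
  info['tool_call_id'] = id
  info
end

out = format_tool_output_sketch({ content: { call_id: 'call_1',
                                             content: '21C' }.to_json })
```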

#parse_tool_call(info) ⇒ Object

Tool-calls in the Chat Completions API are shaped like:

{"id":"call_...", "type":"function", "function": {"name":"...", "arguments":"{...}"}}


# File 'lib/scout/llm/backends/openai.rb', line 65

def parse_tool_call(info)
  IndiferentHash.setup(info)

  function = info['function'] || info[:function] || {}
  IndiferentHash.setup(function)

  name = function[:name] || info[:name]
  id = info[:id] || info['id'] || info[:call_id] || info['call_id']

  arguments = function[:arguments] || info[:arguments] || info['arguments'] || '{}'
  arguments = begin
                JSON.parse(arguments)
              rescue
                arguments
              end if String === arguments

  { arguments: arguments, id: id, name: name }
end
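A stand-alone sketch of parsing the Chat Completions tool-call shape shown above, with plain string-keyed Hashes in place of IndiferentHash:

```ruby
require 'json'

# Pull name, id, and arguments out of the nested 'function' payload,
# decoding the arguments JSON string when possible.
def parse_tool_call_sketch(info)
  function = info['function'] || {}
  name = function['name'] || info['name']
  id = info['id'] || info['call_id']
  arguments = function['arguments'] || info['arguments'] || '{}'
  arguments = (JSON.parse(arguments) rescue arguments) if String === arguments
  { arguments: arguments, id: id, name: name }
end

call = { 'id' => 'call_abc', 'type' => 'function',
         'function' => { 'name' => 'get_weather',
                         'arguments' => '{"city":"Madrid"}' } }
parsed = parse_tool_call_sketch(call)
```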

#process_response(messages, response, tools, options, &block) ⇒ Object

Raises:

  • (Exception)


# File 'lib/scout/llm/backends/openai.rb', line 84

def process_response(messages, response, tools, options, &block)
  raise Exception, response['error'] if response['error']

  message = response.dig('choices', 0, 'message')

  tool_calls = response.dig('choices', 0, 'tool_calls') ||
    response.dig('choices', 0, 'message', 'tool_calls')

  if tool_calls && tool_calls.any?
    tool_calls = tool_calls.collect { |tool_call| parse_tool_call(tool_call) }
    LLM.process_calls(tools, tool_calls, &block)
  else
    [IndiferentHash.setup(message)]
  end
end
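The two response shapes this method distinguishes can be illustrated with plain Hashes (the actual dispatch to `LLM.process_calls` is omitted here):

```ruby
# A plain-text answer: no tool_calls, so the message itself is returned.
response_text = { 'choices' => [{ 'message' => {
  'role' => 'assistant', 'content' => 'Hi' } }] }

# A tool-call answer: tool_calls live under the message (some providers
# also surface them directly under the choice, hence the double dig).
response_tool = { 'choices' => [{ 'message' => {
  'role' => 'assistant',
  'tool_calls' => [{ 'id' => 'call_1', 'type' => 'function',
                     'function' => { 'name' => 'f',
                                     'arguments' => '{}' } }] } }] }

message    = response_text.dig('choices', 0, 'message')
tool_calls = response_tool.dig('choices', 0, 'tool_calls') ||
             response_tool.dig('choices', 0, 'message', 'tool_calls')
```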

#query(client, messages, tools = [], parameters = {}) ⇒ Object



# File 'lib/scout/llm/backends/openai.rb', line 12

def query(client, messages, tools = [], parameters = {})
  parameters[:messages] = messages
  parameters[:tools] = format_tool_definitions(tools) if tools && tools.any?

  begin
    client.chat(parameters: parameters)
  rescue
    Log.debug 'Input parameters: ' + "\n" + JSON.pretty_generate(parameters)
    raise $!
  end
end
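The `client` argument only needs to respond to `#chat(parameters:)`, as the ruby-openai gem's client does. A usage sketch with a stub client (the StubClient and the trimmed-down `query_sketch`, without the Log dependency, are illustrative assumptions):

```ruby
# Stand-in for a real OpenAI client: same #chat(parameters:) interface,
# canned response in the Chat Completions shape.
class StubClient
  def chat(parameters:)
    { 'choices' => [{ 'message' => { 'role' => 'assistant',
                                     'content' => 'ok' } }] }
  end
end

# Mirrors the body of query above, minus tools and error logging.
def query_sketch(client, messages, parameters = {})
  parameters[:messages] = messages
  client.chat(parameters: parameters)
end

response = query_sketch(StubClient.new,
                        [{ role: 'user', content: 'Hello' }])
```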