Class: RubynCode::LLM::Adapters::OpenAI

Inherits:
Base < Object

Includes:
JsonParsing, OpenAIMessageTranslator
Defined in:
lib/rubyn_code/llm/adapters/openai.rb

Direct Known Subclasses

OpenAICompatible

Constant Summary

API_URL = 'https://api.openai.com/v1/chat/completions'
MAX_RETRIES = 3
RETRY_DELAYS = [2, 5, 10].freeze
AVAILABLE_MODELS = %w[gpt-4o gpt-4o-mini gpt-4.1 gpt-4.1-mini gpt-4.1-nano o3 o4-mini].freeze
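The retry constants suggest increasing backoff between attempts: up to MAX_RETRIES retries, sleeping RETRY_DELAYS[n] seconds before retry n. A minimal sketch of that pattern (the `with_retries` helper and its `sleeper:` parameter are invented for illustration, not the adapter's actual internals):

```ruby
MAX_RETRIES = 3
RETRY_DELAYS = [2, 5, 10].freeze

# Illustrative only: run the block, retrying up to MAX_RETRIES times on
# error, sleeping RETRY_DELAYS[attempt] seconds between attempts. The
# sleeper: lambda is injectable so the delay can be stubbed in tests.
def with_retries(sleeper: ->(s) { sleep(s) })
  attempt = 0
  begin
    yield
  rescue StandardError
    raise if attempt >= MAX_RETRIES

    sleeper.call(RETRY_DELAYS[attempt] || RETRY_DELAYS.last)
    attempt += 1
    retry
  end
end
```

With these constants a request is attempted at most four times (one initial try plus three retries), waiting 2, 5, then 10 seconds between attempts.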

Instance Method Summary

Constructor Details

#initialize(api_key: nil, base_url: nil) ⇒ OpenAI

Returns a new instance of OpenAI.



# File 'lib/rubyn_code/llm/adapters/openai.rb', line 20

def initialize(api_key: nil, base_url: nil)
  super()
  @api_key = api_key
  @base_url = base_url
end

Instance Method Details

#chat(messages:, model:, max_tokens:, tools: nil, system: nil, on_text: nil, task_budget: nil) ⇒ Object

rubocop:disable Metrics/ParameterLists, Lint/UnusedMethodArgument -- LLM adapter interface requires these params



# File 'lib/rubyn_code/llm/adapters/openai.rb', line 34

def chat(messages:, model:, max_tokens:, tools: nil, system: nil, on_text: nil, task_budget: nil) # rubocop:disable Metrics/ParameterLists, Lint/UnusedMethodArgument -- LLM adapter interface requires these params
  body = build_request_body(
    messages: messages, model: model, max_tokens: max_tokens,
    tools: tools, system: system
  )

  return stream_request(body, on_text) if on_text

  execute_with_retries(body, on_text)
end
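The method body shows the dispatch: when an `on_text` callback is supplied, the request goes down a streaming path; otherwise it is executed as a blocking call with retries. A self-contained sketch of that shape (class and method bodies here are stand-ins invented for the example, not the gem's real implementations):

```ruby
# Illustrative stand-in mirroring #chat's streaming/buffered dispatch.
class SketchAdapter
  def chat(messages:, model:, max_tokens:, on_text: nil)
    body = { messages: messages, model: model, max_tokens: max_tokens }

    # With a callback, stream chunks as they arrive; without one,
    # block until the full response is available.
    return stream_request(body, on_text) if on_text

    execute_with_retries(body)
  end

  private

  # Stand-in for SSE streaming: feeds text chunks to the callback.
  def stream_request(_body, on_text)
    %w[Hel lo].each { |chunk| on_text.call(chunk) }
    :streamed
  end

  # Stand-in for the blocking request-with-retries path.
  def execute_with_retries(_body)
    :buffered
  end
end
```

A caller that wants incremental output passes a lambda as `on_text:`; omitting it yields the buffered behavior.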

#models ⇒ Object



# File 'lib/rubyn_code/llm/adapters/openai.rb', line 30

def models
  AVAILABLE_MODELS
end
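Since #models simply returns AVAILABLE_MODELS, a caller can use it to validate a requested model name before issuing a request. A hedged example of that usage pattern (the `validate_model!` helper is hypothetical, not part of the gem):

```ruby
AVAILABLE_MODELS =
  %w[gpt-4o gpt-4o-mini gpt-4.1 gpt-4.1-mini gpt-4.1-nano o3 o4-mini].freeze

# Hypothetical helper: return the model name if the adapter supports it,
# otherwise raise with the supported options listed.
def validate_model!(model)
  return model if AVAILABLE_MODELS.include?(model)

  raise ArgumentError, "unknown model #{model.inspect}; " \
                       "choose one of: #{AVAILABLE_MODELS.join(', ')}"
end
```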

#provider_name ⇒ Object



# File 'lib/rubyn_code/llm/adapters/openai.rb', line 26

def provider_name
  'openai'
end