Class: RubynCode::LLM::Adapters::OpenAI
- Includes:
- JsonParsing, OpenAIMessageTranslator
- Defined in:
- lib/rubyn_code/llm/adapters/openai.rb
Direct Known Subclasses
Constant Summary
- API_URL = 'https://api.openai.com/v1/chat/completions'
- MAX_RETRIES = 3
- RETRY_DELAYS = [2, 5, 10].freeze
- AVAILABLE_MODELS = %w[gpt-4o gpt-4o-mini gpt-4.1 gpt-4.1-mini gpt-4.1-nano o3 o4-mini].freeze
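The retry constants suggest a bounded backoff schedule: up to MAX_RETRIES re-attempts, sleeping 2, 5, then 10 seconds between them. The adapter's actual retry helper is private and not shown on this page; the `with_retries` method below is a hypothetical stand-in that only illustrates how these two constants could drive such a loop.

```ruby
# Hypothetical sketch: how MAX_RETRIES and RETRY_DELAYS could drive a
# backoff loop. Not the adapter's real (private) implementation.
MAX_RETRIES = 3
RETRY_DELAYS = [2, 5, 10].freeze

def with_retries(max_retries: MAX_RETRIES, delays: RETRY_DELAYS)
  attempt = 0
  begin
    yield
  rescue StandardError
    attempt += 1
    raise if attempt > max_retries

    # Sleep the scheduled delay for this attempt, reusing the last
    # entry if there are more retries than delays.
    sleep(delays[attempt - 1] || delays.last)
    retry
  end
end
```

With these constants a persistently failing request is attempted four times in total (the original call plus three retries) before the error propagates.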
Instance Method Summary
- #chat(messages:, model:, max_tokens:, tools: nil, system: nil, on_text: nil, task_budget: nil) ⇒ Object
  rubocop:disable Metrics/ParameterLists, Lint/UnusedMethodArgument -- LLM adapter interface requires these params.
- #initialize(api_key: nil, base_url: nil) ⇒ OpenAI (constructor)
  A new instance of OpenAI.
- #models ⇒ Object
- #provider_name ⇒ Object
Constructor Details
#initialize(api_key: nil, base_url: nil) ⇒ OpenAI
Returns a new instance of OpenAI.
# File 'lib/rubyn_code/llm/adapters/openai.rb', line 20

def initialize(api_key: nil, base_url: nil)
  super()
  @api_key = api_key
  @base_url = base_url
end
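The constructor simply stores the credentials (after calling `super()` into its base adapter). A minimal stand-in with the same signature shows how the keyword defaults behave; the `attr_reader` and the omitted `super()` are illustration-only departures from the real class.

```ruby
# Stand-in mirroring the documented constructor signature. attr_reader
# is added here for illustration and is not part of the documented API;
# the real class also calls super() into its base adapter.
module RubynCode
  module LLM
    module Adapters
      class OpenAI
        attr_reader :api_key, :base_url

        def initialize(api_key: nil, base_url: nil)
          @api_key = api_key
          @base_url = base_url
        end
      end
    end
  end
end
```

Because both keywords default to nil, `OpenAI.new` is valid with no arguments, leaving the key and base URL unset until the caller supplies them.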
Instance Method Details
#chat(messages:, model:, max_tokens:, tools: nil, system: nil, on_text: nil, task_budget: nil) ⇒ Object
rubocop:disable Metrics/ParameterLists, Lint/UnusedMethodArgument -- LLM adapter interface requires these params.

# File 'lib/rubyn_code/llm/adapters/openai.rb', line 34

def chat(messages:, model:, max_tokens:, tools: nil, system: nil, on_text: nil, task_budget: nil) # rubocop:disable Metrics/ParameterLists, Lint/UnusedMethodArgument -- LLM adapter interface requires these params
  body = build_request_body(
    messages: messages,
    model: model,
    max_tokens: max_tokens,
    tools: tools,
    system: system
  )
  return stream_request(body, on_text) if on_text

  execute_with_retries(body, on_text)
end
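#chat's control flow is easy to see in isolation: build a request body, then stream when an `on_text` callback is supplied, otherwise fall through to the retrying request path. The self-contained sketch below reproduces that dispatch; the `stream_request` and `execute_with_retries` bodies are placeholders, not the adapter's real HTTP code.

```ruby
# Simplified sketch of #chat's dispatch between streaming and retrying
# paths. The private helpers here are placeholders, not real HTTP code.
class ChatDispatch
  def chat(messages:, model:, max_tokens:, tools: nil, system: nil, on_text: nil, task_budget: nil)
    body = { messages: messages, model: model, max_tokens: max_tokens,
             tools: tools, system: system }
    return stream_request(body, on_text) if on_text

    execute_with_retries(body, on_text)
  end

  private

  def stream_request(_body, on_text)
    # The real adapter would emit SSE text chunks as they arrive; this
    # placeholder invokes the callback once and reports which path ran.
    on_text.call('streamed chunk')
    :streamed
  end

  def execute_with_retries(_body, _on_text)
    :retried
  end
end
```

Passing any callable as `on_text` (for example a lambda that prints each chunk) is what selects the streaming path; omitting it selects the blocking, retrying path.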
#models ⇒ Object
# File 'lib/rubyn_code/llm/adapters/openai.rb', line 30

def models
  AVAILABLE_MODELS
end
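Since #models returns the frozen AVAILABLE_MODELS list, callers can validate a model name before issuing a request. The `supported?` helper below is hypothetical, not part of the adapter; the list is copied from the constant summary above.

```ruby
# AVAILABLE_MODELS copied from this page's constant summary.
AVAILABLE_MODELS = %w[gpt-4o gpt-4o-mini gpt-4.1 gpt-4.1-mini gpt-4.1-nano o3 o4-mini].freeze

# Hypothetical helper; the adapter itself does not define this.
def supported?(model)
  AVAILABLE_MODELS.include?(model)
end
```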
#provider_name ⇒ Object
# File 'lib/rubyn_code/llm/adapters/openai.rb', line 26

def provider_name
  'openai'
end