Class: RubyPi::LLM::OpenAI

Inherits:
BaseProvider
Defined in:
lib/ruby_pi/llm/openai.rb

Overview

OpenAI provider implementation. Communicates with the OpenAI Chat Completions API to generate text completions, handle tool/function calls, and stream responses via Server-Sent Events.

Examples:

Basic usage

provider = RubyPi::LLM::OpenAI.new(
  model: "gpt-4o",
  api_key: ENV["OPENAI_API_KEY"]
)
response = provider.complete(messages: [{ role: "user", content: "Hello!" }])
puts response.content
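
Streaming (as noted in the overview) arrives as Server-Sent Events from the Chat Completions API. The sketch below shows one way to pull content deltas out of a raw SSE chunk; `extract_deltas` is a hypothetical helper for illustration and is not part of RubyPi's API — the provider's actual stream handling is internal.

```ruby
require "json"

# Hypothetical helper: collects the "delta" content fragments from a raw
# SSE chunk in the shape the Chat Completions streaming API emits.
def extract_deltas(sse_text)
  sse_text.each_line.filter_map do |line|
    next unless line.start_with?("data: ")
    payload = line.delete_prefix("data: ").strip
    next if payload == "[DONE]" # end-of-stream sentinel
    JSON.parse(payload).dig("choices", 0, "delta", "content")
  end
end

chunk = <<~SSE
  data: {"choices":[{"delta":{"content":"Hel"}}]}

  data: {"choices":[{"delta":{"content":"lo"}}]}

  data: [DONE]
SSE

extract_deltas(chunk).join # => "Hello"
```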

Constant Summary

BASE_URL = "https://api.openai.com"

Base URL for the OpenAI API.
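
Request URLs are built relative to `BASE_URL`. The path below is the conventional Chat Completions endpoint, shown purely for illustration; the path RubyPi actually appends is internal to the provider.

```ruby
require "uri"

BASE_URL = "https://api.openai.com"

# Joining the base URL with an absolute path yields the full endpoint.
endpoint = URI.join(BASE_URL, "/v1/chat/completions")
endpoint.to_s # => "https://api.openai.com/v1/chat/completions"
```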

Instance Attribute Summary

Attributes inherited from BaseProvider

#max_retries, #retry_base_delay, #retry_max_delay

Instance Method Summary

Methods inherited from BaseProvider

#complete

Constructor Details

#initialize(model: nil, api_key: nil, **options) ⇒ OpenAI

Creates a new OpenAI provider instance.

Parameters:

  • model (String) (defaults to: nil)

    the OpenAI model identifier (e.g., "gpt-4o")

  • api_key (String, nil) (defaults to: nil)

    OpenAI API key (falls back to global config)

  • options (Hash)

    additional options passed to BaseProvider



# File 'lib/ruby_pi/llm/openai.rb', line 31

def initialize(model: nil, api_key: nil, **options)
  super(**options)
  config = RubyPi.configuration
  @model = model || config.default_openai_model
  @api_key = api_key || config.openai_api_key
end
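
Arguments not given explicitly fall back to the global configuration, so a bare `OpenAI.new` picks up `default_openai_model` and `openai_api_key` from `RubyPi.configuration`. The self-contained sketch below reproduces that fallback pattern; `Config` and `Provider` here are illustrative stand-ins, not RubyPi's real classes.

```ruby
# Stand-in for the global configuration object.
Config = Struct.new(:default_openai_model, :openai_api_key)

# Stand-in provider showing the explicit-argument-over-config fallback.
class Provider
  attr_reader :model, :api_key

  def initialize(config, model: nil, api_key: nil)
    @model   = model   || config.default_openai_model
    @api_key = api_key || config.openai_api_key
  end
end

config   = Config.new("gpt-4o", "sk-from-config")
explicit = Provider.new(config, model: "gpt-4o-mini") # keyword wins
fallback = Provider.new(config)                       # config fills the gaps

explicit.model   # => "gpt-4o-mini"
fallback.model   # => "gpt-4o"
fallback.api_key # => "sk-from-config"
```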

Instance Method Details

#model_name ⇒ String

Returns the OpenAI model identifier.

Returns:

  • (String)


# File 'lib/ruby_pi/llm/openai.rb', line 41

def model_name
  @model
end

#provider_name ⇒ Symbol

Returns :openai as the provider identifier.

Returns:

  • (Symbol)


# File 'lib/ruby_pi/llm/openai.rb', line 48

def provider_name
  :openai
end