Class: Inquirex::LLM::OpenAIAdapter

Inherits:
Adapter
  • Object
Defined in:
lib/inquirex/llm/openai_adapter.rb

Overview

OpenAI Chat Completions adapter for inquirex-llm.

Uses the Chat Completions API with response_format: { type: "json_object" }, which constrains the model to return a valid JSON object. For structured extraction this is more reliable than prompt-only "please return JSON" instructions.
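The shape of the constrained request can be sketched as below. This is an illustrative helper, not the adapter's actual internals; the field names follow the OpenAI Chat Completions conventions, and the prompt strings are placeholders.

```ruby
require "json"

# Build a Chat Completions request body with JSON-object output enforced.
# Hypothetical sketch; mirrors the payload the adapter is expected to send.
def build_payload(model:, system:, user:, temperature: 0.2, max_tokens: 2048)
  {
    model: model,
    messages: [
      { role: "system", content: system },
      { role: "user",   content: user }
    ],
    temperature:     temperature,
    max_tokens:      max_tokens,
    # The key piece: constrain the model to emit a valid JSON object.
    response_format: { type: "json_object" }
  }
end

payload = build_payload(
  model:  "gpt-4o-mini",
  system: "Extract the requested fields and reply with a JSON object.",
  user:   "Name: Ada Lovelace"
)
puts JSON.generate(payload)
```

Note that response_format only guarantees syntactically valid JSON; conformance to the node's schema is still checked separately (see #validate_output!).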

Usage:

adapter = Inquirex::LLM::OpenAIAdapter.new(
  api_key: ENV["OPENAI_API_KEY"],
  model:   "gpt-4o-mini"
)
result = adapter.call(engine.current_step, engine.answers)

Constant Summary

API_URL =
"https://api.openai.com/v1/chat/completions"
DEFAULT_MODEL =
"gpt-4o-mini"
DEFAULT_MAX_TOKENS =
2048
MODEL_MAP =

Maps Inquirex DSL model symbols to concrete OpenAI model ids. Claude symbols are accepted as well, with sensible OpenAI equivalents substituted, so flow definitions written against Anthropic still run against this adapter.

{
  gpt_4o:        "gpt-4o",
  gpt_4o_mini:   "gpt-4o-mini",
  gpt_4_1:       "gpt-4.1",
  gpt_4_1_mini:  "gpt-4.1-mini",
  claude_sonnet: "gpt-4o",
  claude_haiku:  "gpt-4o-mini",
  claude_opus:   "gpt-4o"
}.freeze
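One plausible implementation of the resolve_model helper referenced in #call is sketched below. It is an assumption, not the library's actual code: known symbols are mapped through MODEL_MAP, unknown values are passed through as literal model ids, and nil falls back to the adapter-wide default.

```ruby
MODEL_MAP = {
  gpt_4o:        "gpt-4o",
  gpt_4o_mini:   "gpt-4o-mini",
  gpt_4_1:       "gpt-4.1",
  gpt_4_1_mini:  "gpt-4.1-mini",
  claude_sonnet: "gpt-4o",
  claude_haiku:  "gpt-4o-mini",
  claude_opus:   "gpt-4o"
}.freeze

DEFAULT_MODEL = "gpt-4o-mini"

# Hypothetical resolution logic: map a DSL symbol to a concrete model id,
# pass unknown values through as-is, fall back to the default for nil.
def resolve_model(node_model, default: DEFAULT_MODEL)
  return default if node_model.nil?
  MODEL_MAP.fetch(node_model.to_sym) { node_model.to_s }
end

resolve_model(:claude_haiku) # => "gpt-4o-mini"
resolve_model(nil)           # => "gpt-4o-mini"
resolve_model("gpt-4.1")     # => "gpt-4.1"
```

Passing unknown ids through unchanged lets flows reference newly released models without waiting for a MODEL_MAP update.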

Instance Method Summary

Methods inherited from Adapter

#source_answers, #validate_output!

Constructor Details

#initialize(api_key: nil, model: nil) ⇒ OpenAIAdapter

Returns a new instance of OpenAIAdapter.

Parameters:

  • api_key (String, nil) (defaults to: nil)

    defaults to ENV["OPENAI_API_KEY"]; raises ArgumentError if neither is provided

  • model (String, nil) (defaults to: nil)

    default model id when a node does not specify one



# File 'lib/inquirex/llm/openai_adapter.rb', line 42

def initialize(api_key: nil, model: nil)
  super()
  @api_key = api_key || ENV.fetch("OPENAI_API_KEY") {
    raise ArgumentError, "OPENAI_API_KEY is required (pass api_key: or set the env var)"
  }
  @default_model = model || DEFAULT_MODEL
end

Instance Method Details

#call(node, answers = {}) ⇒ Hash

Returns structured data matching the node’s schema.

Parameters:

  • node (Inquirex::LLM::Node)

    the current LLM step

  • answers (Hash) (defaults to: {})

    all collected answers so far

Returns:

  • (Hash)

    structured data matching the node’s schema

Raises:



# File 'lib/inquirex/llm/openai_adapter.rb', line 55

def call(node, answers = {})
  source      = source_answers(node, answers)
  model       = resolve_model(node)
  temperature = node.respond_to?(:temperature) ? (node.temperature || 0.2) : 0.2
  max_tokens  = node.respond_to?(:max_tokens)  ? (node.max_tokens  || DEFAULT_MAX_TOKENS) : DEFAULT_MAX_TOKENS

  response = call_api(
    model:       model,
    system:      build_system_prompt(node),
    user:        build_user_prompt(node, source, answers),
    temperature: temperature,
    max_tokens:  max_tokens
  )

  result = parse_response(response)
  validate_output!(node, result)
  result
end
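The parse_response step invoked above can be sketched as follows, under the assumption that it unwraps the standard Chat Completions envelope (the model's text lives at choices[0].message.content and, with response_format json_object in effect, is itself a JSON string). The helper and sample response here are illustrative only.

```ruby
require "json"

# Hypothetical sketch: extract the model's JSON-object reply from a
# decoded Chat Completions response body.
def parse_response(response)
  content = response.dig("choices", 0, "message", "content")
  raise ArgumentError, "no completion content in response" if content.nil?
  JSON.parse(content)
end

response = {
  "choices" => [
    { "message" => { "content" => '{"name":"Ada","age":36}' } }
  ]
}
parse_response(response) # => {"name"=>"Ada", "age"=>36}
```

The parsed Hash is then handed to #validate_output! to confirm it matches the node's schema before being returned to the engine.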