Class: RubynCode::LLM::Adapters::Base

Inherits:
Object
Defined in:
lib/rubyn_code/llm/adapters/base.rb

Overview

Abstract base for all LLM provider adapters.

Every adapter must implement #chat, #provider_name, and #models. The Client facade delegates to whichever adapter is active.
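
For orientation, a concrete adapter only needs to subclass Base and supply the three methods. The sketch below is illustrative only: the EchoAdapter name and the LLM::Response keyword arguments are assumptions, not part of this library.

module RubynCode
  module LLM
    module Adapters
      # Hypothetical adapter, shown only to illustrate the required interface.
      class EchoAdapter < Base
        def provider_name
          "echo" # analogous to 'anthropic' or 'openai'
        end

        def models
          ["echo-1"] # available model identifiers
        end

        def chat(messages:, model:, max_tokens:, tools: nil, system: nil, on_text: nil, task_budget: nil)
          # A real adapter would call its provider's API here; this sketch just
          # echoes the last user message. The LLM::Response keyword arguments
          # are an assumption, not the documented constructor.
          text = messages.last[:content].to_s
          on_text&.call(text) # invoke the streaming callback if one was given
          LLM::Response.new(text: text)
        end
      end
    end
  end
end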

Direct Known Subclasses

Anthropic, OpenAI

Instance Method Summary

Instance Method Details

#chat(messages:, model:, max_tokens:, tools: nil, system: nil, on_text: nil, task_budget: nil) ⇒ LLM::Response

Parameters:

  • messages (Array<Hash>)

    Conversation messages

  • model (String)

    Model identifier

  • max_tokens (Integer)

    Max output tokens

  • tools (Array<Hash>, nil) (defaults to: nil)

    Tool schemas

  • system (String, nil) (defaults to: nil)

    System prompt text

  • on_text (Proc, nil) (defaults to: nil)

    Streaming text callback

  • task_budget (Hash, nil) (defaults to: nil)

    Optional task budget context

Returns:

  • (LLM::Response)

Raises:

  • (NotImplementedError)


# File 'lib/rubyn_code/llm/adapters/base.rb', line 19

def chat(messages:, model:, max_tokens:, tools: nil, system: nil, on_text: nil, task_budget: nil) # rubocop:disable Metrics/ParameterLists -- LLM adapter interface requires these params
  raise NotImplementedError, "#{self.class}#chat must be implemented"
end
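
For orientation, a call through a concrete adapter might look like the sketch below. The adapter variable stands for any subclass instance (e.g. Anthropic or OpenAI); the message hash keys, system prompt text, and callback shape are assumptions based on common chat-API conventions, not guarantees of this interface.

# Hypothetical usage; construction of the adapter is not shown here.
response = adapter.chat(
  messages:   [{ role: "user", content: "Summarize this diff" }], # conversation messages
  model:      adapter.models.first,                               # model identifier
  max_tokens: 1024,                                               # max output tokens
  system:     "You are a concise code reviewer.",                 # optional system prompt
  on_text:    ->(chunk) { print chunk }                           # optional streaming callback
)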

#models ⇒ Array<String>

Returns the available model identifiers.

Returns:

  • (Array<String>)

    Available model identifiers

Raises:

  • (NotImplementedError)


# File 'lib/rubyn_code/llm/adapters/base.rb', line 29

def models
  raise NotImplementedError, "#{self.class}#models must be implemented"
end

#provider_name ⇒ String

Returns the provider identifier (e.g. 'anthropic', 'openai').

Returns:

  • (String)

Provider identifier (e.g. 'anthropic', 'openai')

Raises:

  • (NotImplementedError)


# File 'lib/rubyn_code/llm/adapters/base.rb', line 24

def provider_name
  raise NotImplementedError, "#{self.class}#provider_name must be implemented"
end