Class: Strata::CLI::AI::Client

Inherits:
Object
Defined in:
lib/strata/cli/ai/client.rb

Overview

Client wraps the ruby_llm gem to provide unified access to LLM providers.

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(config = Configuration.new) ⇒ Client

Returns a new instance of Client.



# File 'lib/strata/cli/ai/client.rb', line 16

def initialize(config = Configuration.new)
  @config = config
  configure_ruby_llm if enabled?
end
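The constructor only configures the underlying provider when the configuration is enabled, so an unconfigured client can still be instantiated without side effects. A minimal self-contained sketch of that guard pattern (the `Fake*` class names are illustrative stand-ins, not the real Strata internals):

```ruby
# Stand-in for Strata's Configuration: enabled? is true only when
# an API key has been supplied.
class FakeConfiguration
  attr_reader :api_key

  def initialize(api_key: nil)
    @api_key = api_key
  end

  def enabled?
    !api_key.nil? && !api_key.empty?
  end
end

class FakeClient
  attr_reader :config, :configured

  # Mirrors Client#initialize: provider setup is skipped entirely
  # when no key is configured, so the client stays inert but usable.
  def initialize(config = FakeConfiguration.new)
    @config = config
    @configured = false
    configure_provider if config.enabled?
  end

  private

  # Stands in for configure_ruby_llm; just records that setup ran.
  def configure_provider
    @configured = true
  end
end

FakeClient.new.configured                                         # => false
FakeClient.new(FakeConfiguration.new(api_key: "sk-test")).configured # => true
```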

Instance Attribute Details

#chatObject (readonly)

Returns the value of attribute chat.



# File 'lib/strata/cli/ai/client.rb', line 14

def chat
  @chat
end

#configObject (readonly)

Returns the value of attribute config.



# File 'lib/strata/cli/ai/client.rb', line 14

def config
  @config
end

Instance Method Details

#complete(prompt, system_prompt: nil) ⇒ String

Complete a prompt using the configured provider

Parameters:

  • prompt (String)

    The user prompt

  • system_prompt (String, nil) (defaults to: nil)

    Optional system context

Returns:

  • (String)

    The AI response



# File 'lib/strata/cli/ai/client.rb', line 30

def complete(prompt, system_prompt: nil)
  raise AIError, "AI is not enabled. Configure ai_api_key in .strata" unless enabled?

  chat = RubyLLM.chat(model: @config.model_or_default)
  chat.with_instructions(system_prompt) if system_prompt
  response = chat.ask(prompt)
  response.content
rescue RubyLLM::Error => e
  raise AIError, "LLM error: #{e.message}"
end
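Note the rescue clause: provider-level `RubyLLM::Error` exceptions are re-raised as `AIError`, so callers only ever rescue one exception class regardless of which backend failed. A self-contained sketch of that wrapping pattern (the provider here is a stub standing in for RubyLLM, not the real gem):

```ruby
# Single error class the caller rescues, as in Strata::CLI::AI.
class AIError < StandardError; end

# Stub provider standing in for RubyLLM; raises its own error type.
module FakeProvider
  class Error < StandardError; end

  def self.ask(prompt)
    raise Error, "rate limited" if prompt.empty?
    "echo: #{prompt}"
  end
end

# Mirrors #complete's shape: do the call, translate provider errors
# into the library's single AIError with a prefixed message.
def complete(prompt)
  FakeProvider.ask(prompt)
rescue FakeProvider::Error => e
  raise AIError, "LLM error: #{e.message}"
end

complete("hello")  # => "echo: hello"
begin
  complete("")
rescue AIError => e
  e.message        # => "LLM error: rate limited"
end
```

Because `#complete` always raises `AIError` (both for a missing configuration and for provider failures), callers need a single `rescue AIError` rather than depending on ruby_llm's exception hierarchy.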

#enabled?Boolean

Check if AI is enabled and configured

Returns:

  • (Boolean)


# File 'lib/strata/cli/ai/client.rb', line 22

def enabled?
  @config.enabled?
end