Class: Strata::CLI::AI::Client
- Inherits:
Object
- Defined in:
- lib/strata/cli/ai/client.rb
Overview
Client wraps the ruby_llm gem for unified LLM access.
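A minimal usage sketch (assumed flow; the prompt strings are illustrative, and the .strata key name follows the error message documented under #complete):

client = Strata::CLI::AI::Client.new

if client.enabled?
  summary = client.complete(
    "Summarize the pending changes.",
    system_prompt: "Answer in one short paragraph."
  )
  puts summary
end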
Instance Attribute Summary collapse
-
#chat ⇒ Object
readonly
Returns the value of attribute chat.
-
#config ⇒ Object
readonly
Returns the value of attribute config.
Instance Method Summary collapse
-
#complete(prompt, system_prompt: nil) ⇒ String
Complete a prompt using the configured provider.
-
#enabled? ⇒ Boolean
Check if AI is enabled and configured.
-
#initialize(config = Configuration.new) ⇒ Client
constructor
A new instance of Client.
Constructor Details
#initialize(config = Configuration.new) ⇒ Client
Returns a new instance of Client.
# File 'lib/strata/cli/ai/client.rb', line 16

def initialize(config = Configuration.new)
  @config = config
  configure_ruby_llm if enabled?
end
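A sketch of constructing the client with an explicit configuration object; nothing about Configuration's API beyond what this page shows (#enabled?, #model_or_default) is assumed:

config = Strata::CLI::AI::Configuration.new
client = Strata::CLI::AI::Client.new(config)
client.config.equal?(config)  # => true, exposed via the read-only attribute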
Instance Attribute Details
#chat ⇒ Object (readonly)
Returns the value of attribute chat.
# File 'lib/strata/cli/ai/client.rb', line 14

def chat
  @chat
end
#config ⇒ Object (readonly)
Returns the value of attribute config.
# File 'lib/strata/cli/ai/client.rb', line 14

def config
  @config
end
Instance Method Details
#complete(prompt, system_prompt: nil) ⇒ String
Complete a prompt using the configured provider
# File 'lib/strata/cli/ai/client.rb', line 30

def complete(prompt, system_prompt: nil)
  raise AIError, "AI is not enabled. Configure ai_api_key in .strata" unless enabled?

  chat = RubyLLM.chat(model: @config.model_or_default)
  chat.with_instructions(system_prompt) if system_prompt
  response = chat.ask(prompt)
  response.content
rescue RubyLLM::Error => e
  raise AIError, "LLM error: #{e.message}"
end
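Example of a guarded call with error handling; AIError is assumed to live under Strata::CLI::AI, and the prompt text is illustrative:

begin
  answer = client.complete(
    "Explain what this command does.",
    system_prompt: "Be brief."
  )
  puts answer
rescue Strata::CLI::AI::AIError => e
  warn e.message
end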
#enabled? ⇒ Boolean
Check if AI is enabled and configured
# File 'lib/strata/cli/ai/client.rb', line 22

def enabled?
  @config.enabled?
end
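Typical guard before issuing a request, so callers avoid the AIError that #complete raises when AI is not configured (sketch; prompt text is illustrative):

client = Strata::CLI::AI::Client.new
puts client.complete("Describe the current plan.") if client.enabled?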