Class: Ace::LLM::Providers::CLI::ClaudeOaiClient
- Inherits: Organisms::BaseClient
  - Object
  - Organisms::BaseClient
  - Ace::LLM::Providers::CLI::ClaudeOaiClient
- Includes:
- CliArgsSupport
- Defined in:
- lib/ace/llm/providers/cli/claude_oai_client.rb
Overview
Client for Claude over Anthropic-compatible APIs (Z.ai, OpenRouter, etc.). Uses the claude CLI subprocess with backend-specific environment variables to route requests through alternative Anthropic-compatible endpoints.
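The routing described above can be illustrated with a small sketch. The claude CLI reads `ANTHROPIC_BASE_URL` and `ANTHROPIC_AUTH_TOKEN` from its environment; the helper name and hash shape below are assumptions for illustration, not part of this class's API:

```ruby
# Hypothetical helper: map a backend entry to the env vars the claude CLI
# subprocess reads when targeting an alternative Anthropic-compatible endpoint.
def backend_subprocess_env(base_url:, auth_token:)
  {
    "ANTHROPIC_BASE_URL" => base_url,
    "ANTHROPIC_AUTH_TOKEN" => auth_token
  }
end

env = backend_subprocess_env(base_url: "https://api.z.ai", auth_token: "sk-test")
# env can then be merged into the environment of the spawned claude process.
```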
Constant Summary collapse
- API_BASE_URL = "https://api.z.ai"
- DEFAULT_GENERATION_CONFIG = {}.freeze
- DEFAULT_MODEL = "zai/glm-5"
Class Method Summary collapse
-
.provider_name ⇒ Object
Instance Method Summary collapse
-
#generate(messages, **options) ⇒ Hash
Generate a response from the LLM.
-
#initialize(model: nil, **options) ⇒ ClaudeOaiClient
constructor
A new instance of ClaudeOaiClient.
-
#list_models ⇒ Object
List available models.
- #needs_credentials? ⇒ Boolean
-
#split_backend_model(model_string) ⇒ Array<String>
Split “backend/model” into [“backend”, “model”].
Constructor Details
#initialize(model: nil, **options) ⇒ ClaudeOaiClient
Returns a new instance of ClaudeOaiClient.
# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 32

def initialize(model: nil, **options)
  @model = model || DEFAULT_MODEL
  @options = options
  @generation_config = options[:generation_config] || {}
  @backends = options[:backends] || {}
  @skill_name_reader = Molecules::SkillNameReader.new
end
Class Method Details
.provider_name ⇒ Object
# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 26

def self.provider_name
  "claudeoai"
end
Instance Method Details
#generate(messages, **options) ⇒ Hash
Generate a response from the LLM.
# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 48

def generate(messages, **options)
  validate_claude_availability!

  prompt = build_prompt(messages) # helper name elided in the source; assumed here
  subprocess_env = options.delete(:subprocess_env)
  working_dir = Atoms::ExecutionContext.resolve_working_dir(
    working_dir: options[:working_dir],
    subprocess_env: subprocess_env
  )
  prompt = rewrite_skill_commands(prompt, working_dir: working_dir)

  cmd = build_claude_command(options) # argument elided in the source; assumed here
  stdout, stderr, status = execute_claude_command(
    cmd,
    prompt,
    subprocess_env: subprocess_env,
    working_dir: working_dir,
    subprocess_command_prefix: options[:subprocess_command_prefix]
  )

  parse_claude_response(stdout, stderr, status, prompt, options)
rescue => e
  handle_claude_error(e)
end
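#generate delegates the subprocess work to helpers not shown on this page. Under the assumption that they shell out to the claude binary with a stdin prompt, an extra environment, and a working directory, the underlying pattern is Ruby's Open3.capture3; `run_cli` below is a hypothetical name, demonstrated with `cat` rather than the real binary:

```ruby
require "open3"

# Hypothetical sketch: run a CLI command with the given stdin, extra env
# vars, and working directory; returns [stdout, stderr, status].
def run_cli(cmd, stdin_data, env: {}, chdir: Dir.pwd)
  Open3.capture3(env, *cmd, stdin_data: stdin_data, chdir: chdir)
end

# `cat` simply echoes its stdin, which makes the data flow easy to see.
stdout, _stderr, status = run_cli(["cat"], "hello")
# stdout == "hello"; status.success? == true
```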
#list_models ⇒ Object
List available models.
# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 74

def list_models
  [
    {id: "zai/glm-5", name: "GLM-5", description: "Z.ai flagship model (Anthropic-compatible)", context_size: 128_000},
    {id: "zai/glm-4.7", name: "GLM-4.7", description: "Z.ai balanced model (Anthropic-compatible)", context_size: 128_000},
    {id: "zai/glm-4.6", name: "GLM-4.6", description: "Z.ai fast model (Anthropic-compatible)", context_size: 128_000}
  ]
end
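Each id above embeds its backend as the prefix before the first "/", which is what #split_backend_model relies on. A minimal standalone sketch of grouping the catalogue by that prefix (using the same literal data):

```ruby
models = [
  {id: "zai/glm-5", name: "GLM-5"},
  {id: "zai/glm-4.7", name: "GLM-4.7"},
  {id: "zai/glm-4.6", name: "GLM-4.6"}
]

# Group model entries by the backend prefix before the first "/".
by_backend = models.group_by { |m| m[:id].split("/", 2).first }
by_backend["zai"].map { |m| m[:name] }
# => ["GLM-5", "GLM-4.7", "GLM-4.6"]
```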
#needs_credentials? ⇒ Boolean
# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 40

def needs_credentials?
  false
end
#split_backend_model(model_string) ⇒ Array<String>
Split "backend/model" into ["backend", "model"].
# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 85

def split_backend_model(model_string)
  return [nil, nil] unless model_string

  parts = model_string.split("/", 2)
  return [nil, nil] unless parts.length == 2

  parts
end
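The method is self-contained, so its behavior is easy to check in isolation. The same logic as a standalone snippet, with the edge cases spelled out (note that split with a limit of 2 keeps any later "/" inside the model part):

```ruby
def split_backend_model(model_string)
  return [nil, nil] unless model_string

  parts = model_string.split("/", 2)
  return [nil, nil] unless parts.length == 2

  parts
end

split_backend_model("zai/glm-5")                   # => ["zai", "glm-5"]
split_backend_model("openrouter/anthropic/claude") # => ["openrouter", "anthropic/claude"]
split_backend_model("glm-5")                       # no backend prefix => [nil, nil]
split_backend_model(nil)                           # => [nil, nil]
```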