Class: Ace::LLM::Providers::CLI::ClaudeOaiClient

Inherits:
  Organisms::BaseClient < Object
Includes:
  CliArgsSupport
Defined in:
lib/ace/llm/providers/cli/claude_oai_client.rb

Overview

Client for Claude over Anthropic-compatible APIs (Z.ai, OpenRouter, etc.). Uses the claude CLI subprocess with backend-specific env vars to route requests through alternative Anthropic-compatible endpoints.
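Concretely, the routing amounts to handing the claude subprocess an environment that points at the alternate endpoint. A minimal sketch, assuming the CLI honors ANTHROPIC_BASE_URL and ANTHROPIC_AUTH_TOKEN; the variable names here are assumptions, not taken from this client's source:

```ruby
# Sketch: assemble env vars that would point the claude CLI at an
# Anthropic-compatible backend. Variable names are assumed, not
# copied from ClaudeOaiClient.
def backend_subprocess_env(base_url:, token:)
  {
    "ANTHROPIC_BASE_URL" => base_url,
    "ANTHROPIC_AUTH_TOKEN" => token
  }
end
```

A hash like this could be merged into the subprocess environment before the CLI is spawned.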

Constant Summary

API_BASE_URL = "https://api.z.ai"
DEFAULT_GENERATION_CONFIG = {}.freeze
DEFAULT_MODEL = "zai/glm-5"

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(model: nil, **options) ⇒ ClaudeOaiClient

Returns a new instance of ClaudeOaiClient.



# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 32

def initialize(model: nil, **options)
  @model = model || DEFAULT_MODEL
  @options = options
  @generation_config = options[:generation_config] || {}
  @backends = options[:backends] || {}
  @skill_name_reader = Molecules::SkillNameReader.new
end
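The constructor's defaulting can be exercised in isolation. A sketch that mirrors the option handling above (resolve_client_config is an illustrative name, not part of the class; DEFAULT_MODEL is the constant from the summary):

```ruby
DEFAULT_MODEL = "zai/glm-5"

# Mirrors the constructor: a nil model falls back to DEFAULT_MODEL,
# and generation_config/backends default to empty hashes.
def resolve_client_config(model: nil, **options)
  {
    model: model || DEFAULT_MODEL,
    generation_config: options[:generation_config] || {},
    backends: options[:backends] || {}
  }
end

resolve_client_config(generation_config: {temperature: 0.2})
#=> model defaults to "zai/glm-5", backends to {}
```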

Class Method Details

.provider_name ⇒ Object



# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 26

def self.provider_name
  "claudeoai"
end

Instance Method Details

#generate(messages, **options) ⇒ Hash

Generate a response from the LLM

Parameters:

  • messages (Array<Hash>)

    Conversation messages

  • options (Hash)

    Generation options

Returns:

  • (Hash)

    Response with text and metadata



# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 48

def generate(messages, **options)
  validate_claude_availability!

  prompt = format_messages_as_prompt(messages)
  subprocess_env = options.delete(:subprocess_env)
  working_dir = Atoms::ExecutionContext.resolve_working_dir(
    working_dir: options[:working_dir],
    subprocess_env: subprocess_env
  )
  prompt = rewrite_skill_commands(prompt, working_dir: working_dir)

  cmd = build_claude_command(options)
  stdout, stderr, status = execute_claude_command(
    cmd,
    prompt,
    subprocess_env: subprocess_env,
    working_dir: working_dir,
    subprocess_command_prefix: options[:subprocess_command_prefix]
  )

  parse_claude_response(stdout, stderr, status, prompt, options)
rescue => e
  handle_claude_error(e)
end
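Because the claude CLI consumes a single prompt string, the message array must be flattened before dispatch. The private format_messages_as_prompt is not shown here, so its exact format is an assumption; one plausible sketch:

```ruby
# Assumed flattening: each message rendered as "role: content",
# joined with blank lines. The real private helper may differ.
def flatten_messages(messages)
  messages.map { |m| "#{m[:role]}: #{m[:content]}" }.join("\n\n")
end
```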

#list_models ⇒ Object

List available models



# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 74

def list_models
  [
    {id: "zai/glm-5", name: "GLM-5", description: "Z.ai flagship model (Anthropic-compatible)", context_size: 128_000},
    {id: "zai/glm-4.7", name: "GLM-4.7", description: "Z.ai balanced model (Anthropic-compatible)", context_size: 128_000},
    {id: "zai/glm-4.6", name: "GLM-4.6", description: "Z.ai fast model (Anthropic-compatible)", context_size: 128_000}
  ]
end
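The entries are plain hashes, so callers can query them with ordinary Enumerable methods, e.g. resolving a model id by display name (the array literal below is copied from list_models above):

```ruby
models = [
  {id: "zai/glm-5", name: "GLM-5", description: "Z.ai flagship model (Anthropic-compatible)", context_size: 128_000},
  {id: "zai/glm-4.7", name: "GLM-4.7", description: "Z.ai balanced model (Anthropic-compatible)", context_size: 128_000},
  {id: "zai/glm-4.6", name: "GLM-4.6", description: "Z.ai fast model (Anthropic-compatible)", context_size: 128_000}
]

# Look up an entry by its display name.
flagship = models.find { |m| m[:name] == "GLM-5" }
flagship[:id] #=> "zai/glm-5"
```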

#needs_credentials? ⇒ Boolean

Returns:

  • (Boolean)


# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 40

def needs_credentials?
  false
end

#split_backend_model(model_string) ⇒ Array<String>

Split "backend/model" into ["backend", "model"]

Parameters:

  • model_string (String)

e.g. "zai/glm-5"

Returns:

  • (Array<String>)

e.g. ["zai", "glm-5"]



# File 'lib/ace/llm/providers/cli/claude_oai_client.rb', line 85

def split_backend_model(model_string)
  return [nil, nil] unless model_string

  parts = model_string.split("/", 2)
  return [nil, nil] unless parts.length == 2

  parts
end
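Exercised standalone (the body is copied verbatim from above), the split behaves as follows:

```ruby
# Verbatim copy of #split_backend_model for standalone use.
def split_backend_model(model_string)
  return [nil, nil] unless model_string

  parts = model_string.split("/", 2)
  return [nil, nil] unless parts.length == 2

  parts
end

split_backend_model("zai/glm-5") #=> ["zai", "glm-5"]
split_backend_model("glm-5")     #=> [nil, nil]
split_backend_model(nil)         #=> [nil, nil]
```

Note that the split limit of 2 keeps any further slashes inside the model part: "zai/org/model" yields ["zai", "org/model"].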