Class: Ace::LLM::Providers::CLI::CodexOaiClient

Inherits:
  Organisms::BaseClient < Object
Includes:
  CliArgsSupport
Defined in:
lib/ace/llm/providers/cli/codex_oai_client.rb

Overview

Client for interacting with the Codex CLI targeting OpenAI-compatible providers. Dynamically configures codex to use any backend (Z.ai, DeepSeek, etc.) via `-c` flag overrides for the `model_provider` and `model_providers` config.
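As a rough illustration of the override mechanism described above, the `-c` flags might be assembled like this. This is a sketch, not the class's actual implementation: the override key names and the placeholder URL are assumptions, not confirmed from the source.

```ruby
# Hypothetical sketch: build `codex -c` override flags for an
# OpenAI-compatible backend. The keys `model_provider` and
# `model_providers.<id>.base_url` are assumed names, not confirmed API.
def codex_override_args(backend_id, base_url)
  [
    "-c", "model_provider=#{backend_id}",
    "-c", "model_providers.#{backend_id}.base_url=#{base_url}"
  ]
end

# Placeholder URL; a real backend would supply its own endpoint.
args = codex_override_args("zai", "https://example.invalid/v1")
```

Each `-c` pair overrides one config key, so any number of backends can be targeted without editing a config file on disk.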

Constant Summary

API_BASE_URL = "https://api.openai.com"
DEFAULT_GENERATION_CONFIG = {}.freeze
DEFAULT_MODEL = "zai/glm-5"

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(model: nil, **options) ⇒ CodexOaiClient

Returns a new instance of CodexOaiClient.



# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 33

def initialize(model: nil, **options)
  @model = model || DEFAULT_MODEL
  @options = options
  @generation_config = options[:generation_config] || {}
  @backends = options[:backends] || {}
  @skill_name_reader = Molecules::SkillNameReader.new
end
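The constructor's model-defaulting can be seen in isolation. A minimal self-contained sketch (with `DEFAULT_MODEL` copied from the constant above, and the helper name `resolve_model` introduced here for illustration only):

```ruby
# Sketch of the constructor's defaulting: an explicit model wins,
# otherwise DEFAULT_MODEL is used.
DEFAULT_MODEL = "zai/glm-5"

def resolve_model(model)
  model || DEFAULT_MODEL
end
```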

Class Method Details

.provider_name ⇒ Object



# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 27

def self.provider_name
  "codexoai"
end

Instance Method Details

#generate(messages, **options) ⇒ Hash

Generate a response from the LLM

Parameters:

  • messages (Array<Hash>)

    Conversation messages

  • options (Hash)

    Generation options

Returns:

  • (Hash)

    Response with text and metadata



# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 49

def generate(messages, **options)
  validate_codex_availability!

  prompt = format_messages_as_prompt(messages)
  subprocess_env = options[:subprocess_env]
  working_dir = Atoms::ExecutionContext.resolve_working_dir(
    working_dir: options[:working_dir],
    subprocess_env: subprocess_env
  )
  prompt = rewrite_skill_commands(prompt, working_dir: working_dir)

  cmd = build_codex_oai_command(prompt, options, working_dir: working_dir)
  stdout, stderr, status = execute_codex_command(cmd, prompt, options)

  parse_codex_response(stdout, stderr, status, prompt, options)
rescue => e
  handle_codex_error(e)
end
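The private helper `format_messages_as_prompt` is not shown in this documentation; one plausible implementation, offered purely as an assumption about what it might do, flattens role/content pairs into a single prompt string:

```ruby
# Hypothetical sketch only: the real format_messages_as_prompt is not
# shown in this doc. This version joins "role: content" lines with
# blank lines between messages.
def format_messages_as_prompt(messages)
  messages.map { |m| "#{m[:role]}: #{m[:content]}" }.join("\n\n")
end
```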

#list_models ⇒ Object

List available models



# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 69

def list_models
  [
    {id: "zai/glm-5", name: "GLM-5", description: "Z.ai flagship model", context_size: 128_000},
    {id: "zai/glm-4.7", name: "GLM-4.7", description: "Z.ai balanced model", context_size: 128_000},
    {id: "zai/glm-4.6", name: "GLM-4.6", description: "Z.ai fast model", context_size: 128_000}
  ]
end
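The returned array can be filtered like any Ruby collection; for example, resolving an id to its metadata (entries copied from the listing above, with `find_model` introduced here for illustration):

```ruby
# Look up a model's metadata by id; returns nil for unknown ids.
MODELS = [
  {id: "zai/glm-5", name: "GLM-5", context_size: 128_000},
  {id: "zai/glm-4.7", name: "GLM-4.7", context_size: 128_000},
  {id: "zai/glm-4.6", name: "GLM-4.6", context_size: 128_000}
].freeze

def find_model(id)
  MODELS.find { |m| m[:id] == id }
end
```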

#needs_credentials? ⇒ Boolean

Returns:

  • (Boolean)


# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 41

def needs_credentials?
  false
end

#split_backend_model(model_string) ⇒ Array<String>

Split "backend/model" into ["backend", "model"]

Parameters:

  • model_string (String)

e.g. "zai/glm-5"

Returns:

  • (Array<String>)

e.g. ["zai", "glm-5"]



# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 80

def split_backend_model(model_string)
  return [nil, nil] unless model_string

  parts = model_string.split("/", 2)
  return [nil, nil] unless parts.length == 2

  parts
end
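Copied verbatim from the listing above, the method's edge-case behaviour can be demonstrated directly. Note the split limit of 2, which keeps any further slashes inside the model part:

```ruby
# Verbatim copy of split_backend_model from the listing above.
def split_backend_model(model_string)
  return [nil, nil] unless model_string

  parts = model_string.split("/", 2)
  return [nil, nil] unless parts.length == 2

  parts
end

split_backend_model("zai/glm-5")       # => ["zai", "glm-5"]
split_backend_model("zai/team/model")  # => ["zai", "team/model"] (split limit of 2)
split_backend_model("glm-5")           # => [nil, nil] (no backend prefix)
split_backend_model(nil)               # => [nil, nil]
```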