Class: Ace::LLM::Providers::CLI::CodexOaiClient
- Inherits: Organisms::BaseClient
  - Object
  - Organisms::BaseClient
  - Ace::LLM::Providers::CLI::CodexOaiClient
- Includes:
- CliArgsSupport
- Defined in:
- lib/ace/llm/providers/cli/codex_oai_client.rb
Overview
Client for interacting with the Codex CLI targeting OpenAI-compatible providers. Dynamically configures codex to use any backend (Z.ai, DeepSeek, etc.) via `-c` flag overrides for the `model_provider` and `model_providers` config keys.
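The override mechanism described above can be sketched as follows. This is an illustrative assembly, not the gem's actual implementation: the backend name and base URL are placeholders, and only the Codex CLI's generic `-c key=value` config-override syntax is assumed.

```ruby
# Sketch of assembling codex `-c` overrides for an OpenAI-compatible backend.
# The backend name and URL below are illustrative placeholders, not values
# shipped by this gem.
backend  = "zai"
base_url = "https://example.invalid/api/v1" # placeholder, not a real endpoint

overrides = [
  "-c", "model_provider=#{backend}",
  "-c", "model_providers.#{backend}.name=#{backend}",
  "-c", "model_providers.#{backend}.base_url=#{base_url}"
]

# A full invocation would splat the overrides into the codex command line.
cmd = ["codex", "exec", *overrides, "Hello"]
```

Each `-c` pair rewrites one key in codex's own configuration, which is what lets a single client class target any OpenAI-compatible provider without a codex config file per backend.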
Constant Summary
- API_BASE_URL = "https://api.openai.com"
- DEFAULT_GENERATION_CONFIG = {}.freeze
- DEFAULT_MODEL = "zai/glm-5"
Class Method Summary
- .provider_name ⇒ Object
Instance Method Summary
- #generate(messages, **options) ⇒ Hash
  Generate a response from the LLM.
- #initialize(model: nil, **options) ⇒ CodexOaiClient constructor
  A new instance of CodexOaiClient.
- #list_models ⇒ Object
  List available models.
- #needs_credentials? ⇒ Boolean
- #split_backend_model(model_string) ⇒ Array<String>
  Split "backend/model" into ["backend", "model"].
Constructor Details
#initialize(model: nil, **options) ⇒ CodexOaiClient
Returns a new instance of CodexOaiClient.
# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 33

def initialize(model: nil, **options)
  @model = model || DEFAULT_MODEL
  @options = options
  @generation_config = options[:generation_config] || {}
  @backends = options[:backends] || {}
  @skill_name_reader = Molecules::SkillNameReader.new
end
Class Method Details
.provider_name ⇒ Object
# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 27

def self.provider_name
  "codexoai"
end
Instance Method Details
#generate(messages, **options) ⇒ Hash
Generate a response from the LLM.

# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 49

def generate(messages, **options)
  validate_codex_availability!
  prompt = (messages) # NOTE: the prompt-building helper call was lost in extraction
  subprocess_env = options[:subprocess_env]
  working_dir = Atoms::ExecutionContext.resolve_working_dir(
    working_dir: options[:working_dir],
    subprocess_env: subprocess_env
  )
  prompt = rewrite_skill_commands(prompt, working_dir: working_dir)
  cmd = build_codex_oai_command(prompt, options, working_dir: working_dir)
  stdout, stderr, status = execute_codex_command(cmd, prompt, options)
  parse_codex_response(stdout, stderr, status, prompt, options)
rescue => e
  handle_codex_error(e)
end
#list_models ⇒ Object
List available models.

# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 69

def list_models
  [
    {id: "zai/glm-5", name: "GLM-5", description: "Z.ai flagship model", context_size: 128_000},
    {id: "zai/glm-4.7", name: "GLM-4.7", description: "Z.ai balanced model", context_size: 128_000},
    {id: "zai/glm-4.6", name: "GLM-4.6", description: "Z.ai fast model", context_size: 128_000}
  ]
end
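A usage sketch for consumers of #list_models. The literal array below mirrors the documented return values rather than calling the gem, so only the return shape shown above is assumed:

```ruby
# Hypothetical consumer of #list_models' return shape: an array of hashes
# with :id, :name, :description, and :context_size keys.
models = [
  {id: "zai/glm-5", name: "GLM-5", description: "Z.ai flagship model", context_size: 128_000},
  {id: "zai/glm-4.7", name: "GLM-4.7", description: "Z.ai balanced model", context_size: 128_000},
  {id: "zai/glm-4.6", name: "GLM-4.6", description: "Z.ai fast model", context_size: 128_000}
]

# Look up the entry matching DEFAULT_MODEL and collect the available ids.
default = models.find { |m| m[:id] == "zai/glm-5" }
ids = models.map { |m| m[:id] }
```

The `:id` values use the same "backend/model" convention that #split_backend_model parses.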
#needs_credentials? ⇒ Boolean
# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 41

def needs_credentials?
  false
end
#split_backend_model(model_string) ⇒ Array<String>
Split "backend/model" into ["backend", "model"].

# File 'lib/ace/llm/providers/cli/codex_oai_client.rb', line 80

def split_backend_model(model_string)
  return [nil, nil] unless model_string

  parts = model_string.split("/", 2)
  return [nil, nil] unless parts.length == 2

  parts
end
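A behavior sketch of #split_backend_model, reproducing the documented method body as a standalone function so the edge cases can be seen directly. Note that `split("/", 2)` keeps any further slashes inside the model part:

```ruby
# Standalone copy of the documented method body, for illustration only.
def split_backend_model(model_string)
  return [nil, nil] unless model_string

  parts = model_string.split("/", 2)
  return [nil, nil] unless parts.length == 2

  parts
end

split_backend_model("zai/glm-5")  # => ["zai", "glm-5"]
split_backend_model("glm-5")      # => [nil, nil]  (no backend prefix)
split_backend_model(nil)          # => [nil, nil]
split_backend_model("a/b/c")      # => ["a", "b/c"] (split limit of 2)
```

Returning `[nil, nil]` rather than raising lets callers destructure the result (`backend, model = split_backend_model(str)`) without a nil check on the input.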