Class: Telnyx::Resources::AI::OpenAI
- Inherits: Object
  - Object
  - Telnyx::Resources::AI::OpenAI
- Defined in:
- lib/telnyx/resources/ai/openai.rb,
lib/telnyx/resources/ai/openai/chat.rb,
lib/telnyx/resources/ai/openai/embeddings.rb
Defined Under Namespace
Classes: Chat, Embeddings
Instance Attribute Summary collapse
- #chat ⇒ Telnyx::Resources::AI::OpenAI::Chat readonly
- #embeddings ⇒ Telnyx::Resources::AI::OpenAI::Embeddings readonly
OpenAI-compatible embeddings endpoints for generating vector representations of text.
Instance Method Summary collapse
- #initialize(client:) ⇒ OpenAI constructor private
A new instance of OpenAI.
- #list_models(request_options: {}) ⇒ Telnyx::Models::AI::OpenAIListModelsResponse
Lists every model currently available to your account on Telnyx Inference, including SOTA open-source LLMs hosted on Telnyx GPUs (for example `moonshotai/Kimi-K2.6`, `zai-org/GLM-5.1-FP8`, and `MiniMaxAI/MiniMax-M2.7`), embedding models, and any fine-tuned models you have created.
Constructor Details
#initialize(client:) ⇒ OpenAI
This method is part of a private API. You should avoid using this method if possible, as it may be removed or be changed in the future.
Returns a new instance of OpenAI.
# File 'lib/telnyx/resources/ai/openai.rb', line 50

def initialize(client:)
  @client = client
  @embeddings = Telnyx::Resources::AI::OpenAI::Embeddings.new(client: client)
  @chat = Telnyx::Resources::AI::OpenAI::Chat.new(client: client)
end
Instance Attribute Details
#chat ⇒ Telnyx::Resources::AI::OpenAI::Chat (readonly)
# File 'lib/telnyx/resources/ai/openai.rb', line 13

def chat
  @chat
end
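The `#chat` resource fronts the OpenAI-compatible endpoint `POST /v2/ai/openai/chat/completions`. This page does not document the Ruby-side method name on `Chat`, so the sketch below shows only the request-body shape, which is standard across OpenAI-compatible APIs; treat the exact call on the resource as an assumption.

```ruby
# Standard OpenAI-style chat completion payload; the model id is one you
# would discover via #list_models. How this hash is ultimately passed to
# the Chat resource is not confirmed by this page.
payload = {
  model: "moonshotai/Kimi-K2.6",
  messages: [
    { role: "system", content: "You are a concise assistant." },
    { role: "user",   content: "Summarize vector embeddings in one line." }
  ]
}

puts payload[:messages].length # => 2
```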
#embeddings ⇒ Telnyx::Resources::AI::OpenAI::Embeddings (readonly)
OpenAI-compatible embeddings endpoints for generating vector representations of text
# File 'lib/telnyx/resources/ai/openai.rb', line 10

def embeddings
  @embeddings
end
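Since `#embeddings` exposes OpenAI-compatible endpoints for generating vector representations of text, a common follow-up is comparing two returned vectors. The resource path (`client.ai.openai.embeddings`) follows this class's attribute layout, but the request method and response field names are not documented on this page, so only the downstream vector math is shown runnable here.

```ruby
# Cosine similarity between two embedding vectors, the usual way to
# compare vector representations returned by an embeddings endpoint.
def cosine_similarity(a, b)
  dot    = a.zip(b).sum { |x, y| x * y }
  norm_a = Math.sqrt(a.sum { |x| x * x })
  norm_b = Math.sqrt(b.sum { |y| y * y })
  dot / (norm_a * norm_b)
end

v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 0.0, 1.0]
v3 = [0.0, 1.0, 0.0]

puts cosine_similarity(v1, v2) # identical vectors  => 1.0
puts cosine_similarity(v1, v3) # orthogonal vectors => 0.0
```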
Instance Method Details
#list_models(request_options: {}) ⇒ Telnyx::Models::AI::OpenAIListModelsResponse
Lists every model currently available to your account on Telnyx Inference, including SOTA open-source LLMs hosted on Telnyx GPUs (for example `moonshotai/Kimi-K2.6`, `zai-org/GLM-5.1-FP8`, and `MiniMaxAI/MiniMax-M2.7`), embedding models, and any fine-tuned models you have created.
Each entry is a `ModelMetadata` object describing the model id, owner, task, context length, supported languages, billing tier, pricing per 1M tokens, deployment regions, and whether the model supports vision or fine-tuning. Use this endpoint to discover model ids you can pass to `POST /v2/ai/openai/chat/completions`.
Model ids follow the `organization/model_name` convention from Hugging Face (for example `moonshotai/Kimi-K2.6`). This endpoint is OpenAI-compatible: clients pointed at `api.telnyx.com/v2/ai/openai` can call `client.models.list()` to retrieve the same payload.
# File 'lib/telnyx/resources/ai/openai.rb', line 38

def list_models(params = {})
  @client.request(
    method: :get,
    path: "ai/openai/models",
    model: Telnyx::Models::AI::OpenAIListModelsResponse,
    options: params[:request_options]
  )
end
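Per the signature above, the call itself is `client.ai.openai.list_models`. A typical next step is filtering the returned `ModelMetadata` entries by task to find ids you can pass to chat completions. The hash shape below is a local stand-in for illustration: the `data` array and field names mirror the description above but are not confirmed by this page, and `example-org/example-embedding` is a hypothetical id.

```ruby
# Stand-in for a list_models response body; each entry carries a subset
# of the documented ModelMetadata fields (id, task).
models = [
  { id: "moonshotai/Kimi-K2.6",        task: "chat" },
  { id: "zai-org/GLM-5.1-FP8",         task: "chat" },
  { id: "example-org/example-embedding", task: "embedding" }
]

# Discover chat-capable model ids for POST /v2/ai/openai/chat/completions.
chat_ids = models.select { |m| m[:task] == "chat" }.map { |m| m[:id] }
puts chat_ids.inspect # => ["moonshotai/Kimi-K2.6", "zai-org/GLM-5.1-FP8"]
```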