Class: Telnyx::Resources::AI::OpenAI
- Inherits: Object
- Defined in:
  - lib/telnyx/resources/ai/openai.rb
  - lib/telnyx/resources/ai/openai/chat.rb
  - lib/telnyx/resources/ai/openai/embeddings.rb
Defined Under Namespace
Classes: Chat, Embeddings
Instance Attribute Summary
- #chat ⇒ Telnyx::Resources::AI::OpenAI::Chat readonly
- #embeddings ⇒ Telnyx::Resources::AI::OpenAI::Embeddings readonly
OpenAI-compatible embeddings endpoints for generating vector representations of text.
Instance Method Summary
- #create_response(body:, request_options: {}) ⇒ Hash{Symbol=>Object}
Chat with a language model.
- #initialize(client:) ⇒ OpenAI constructor private
A new instance of OpenAI.
- #list_models(request_options: {}) ⇒ Telnyx::Models::AI::OpenAIListModelsResponse
Lists every model currently available to your account on Telnyx Inference, including SOTA open-source LLMs hosted on Telnyx GPUs (for example `moonshotai/Kimi-K2.6`, `zai-org/GLM-5.1-FP8`, and `MiniMaxAI/MiniMax-M2.7`), embedding models, and any fine-tuned models you have created.
Constructor Details
#initialize(client:) ⇒ OpenAI
This method is part of a private API. You should avoid using this method if possible, as it may be removed or be changed in the future.
Returns a new instance of OpenAI.
# File 'lib/telnyx/resources/ai/openai.rb', line 75

def initialize(client:)
  @client = client
  @embeddings = Telnyx::Resources::AI::OpenAI::Embeddings.new(client: client)
  @chat = Telnyx::Resources::AI::OpenAI::Chat.new(client: client)
end
Instance Attribute Details
#chat ⇒ Telnyx::Resources::AI::OpenAI::Chat (readonly)
# File 'lib/telnyx/resources/ai/openai.rb', line 13

def chat
  @chat
end
#embeddings ⇒ Telnyx::Resources::AI::OpenAI::Embeddings (readonly)
OpenAI-compatible embeddings endpoints for generating vector representations of text.
# File 'lib/telnyx/resources/ai/openai.rb', line 10

def embeddings
  @embeddings
end
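As a usage sketch (assuming a configured client; the accessor chain `client.ai.openai` is inferred from the resource namespace and is not confirmed by this page), the two sub-resources are reached through these readers:

require "telnyx"

# Hypothetical setup; the constructor keyword and env var name are assumptions.
client = Telnyx::Client.new(api_key: ENV["TELNYX_API_KEY"])

openai = client.ai.openai  # Telnyx::Resources::AI::OpenAI
openai.chat                # => Telnyx::Resources::AI::OpenAI::Chat
openai.embeddings          # => Telnyx::Resources::AI::OpenAI::Embeddings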
Instance Method Details
#create_response(body:, request_options: {}) ⇒ Hash{Symbol=>Object}
Chat with a language model. This endpoint is consistent with the [OpenAI Responses API](https://platform.openai.com/docs/api-reference/responses) and may be used with the OpenAI JS or Python SDK. The response `id` parameter is not supported at the moment; use the `conversation` parameter to leverage the persistent conversations feature.
# File 'lib/telnyx/resources/ai/openai.rb', line 29

def create_response(params)
  parsed, options = Telnyx::AI::OpenAICreateResponseParams.dump_request(params)
  @client.request(
    method: :post,
    path: "ai/openai/responses",
    body: parsed[:body],
    model: Telnyx::Internal::Type::HashOf[Telnyx::Internal::Type::Unknown],
    options: options
  )
end
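A minimal call sketch, assuming the client setup shown earlier; the `model` and `input` fields mirror the OpenAI Responses API shape that this endpoint is documented to be consistent with, and the model id is one of the examples listed under #list_models:

# Hedged example: body fields follow the OpenAI Responses API convention.
response = client.ai.openai.create_response(
  body: {
    model: "moonshotai/Kimi-K2.6",
    input: "Write a one-line greeting for a support call."
  }
)
response  # => Hash{Symbol=>Object} holding the raw response payload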
#list_models(request_options: {}) ⇒ Telnyx::Models::AI::OpenAIListModelsResponse
Lists every model currently available to your account on Telnyx Inference, including SOTA open-source LLMs hosted on Telnyx GPUs (for example `moonshotai/Kimi-K2.6`, `zai-org/GLM-5.1-FP8`, and `MiniMaxAI/MiniMax-M2.7`), embedding models, and any fine-tuned models you have created.
Each entry is a `ModelMetadata` object describing the model id, owner, task, context length, supported languages, billing tier, pricing per 1M tokens, deployment regions, and whether the model supports vision or fine-tuning. Use this endpoint to discover model ids you can pass to `POST /v2/ai/openai/chat/completions`.
Model ids follow the `organization/model_name` convention from Hugging Face (for example `moonshotai/Kimi-K2.6`). This endpoint is OpenAI-compatible: clients pointed at `api.telnyx.com/v2/ai/openai` can call `client.models.list()` to retrieve the same payload.
# File 'lib/telnyx/resources/ai/openai.rb', line 63

def list_models(params = {})
  @client.request(
    method: :get,
    path: "ai/openai/models",
    model: Telnyx::Models::AI::OpenAIListModelsResponse,
    options: params[:request_options]
  )
end
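A discovery sketch, again assuming the client setup shown earlier; the `data` array and `id` reader are assumptions based on the OpenAI-compatible list payload and the `ModelMetadata` fields described above:

models = client.ai.openai.list_models
# Each entry is a ModelMetadata object; print the ids you can pass
# to chat completions or create_response.
models.data.each { |m| puts m.id }  # e.g. "moonshotai/Kimi-K2.6"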