Class: Telnyx::Resources::AI::OpenAI

Inherits:
Object
Defined in:
lib/telnyx/resources/ai/openai.rb,
lib/telnyx/resources/ai/openai/chat.rb,
lib/telnyx/resources/ai/openai/embeddings.rb

Defined Under Namespace

Classes: Chat, Embeddings

Instance Attribute Summary

Instance Method Summary

Constructor Details

#initialize(client:) ⇒ OpenAI

This method is part of a private API. You should avoid using this method if possible, as it may be removed or be changed in the future.

Returns a new instance of OpenAI.

Parameters:



# File 'lib/telnyx/resources/ai/openai.rb', line 75

def initialize(client:)
  @client = client
  @embeddings = Telnyx::Resources::AI::OpenAI::Embeddings.new(client: client)
  @chat = Telnyx::Resources::AI::OpenAI::Chat.new(client: client)
end

Instance Attribute Details

#chat ⇒ Telnyx::Resources::AI::OpenAI::Chat (readonly)



# File 'lib/telnyx/resources/ai/openai.rb', line 13

def chat
  @chat
end

#embeddings ⇒ Telnyx::Resources::AI::OpenAI::Embeddings (readonly)

OpenAI-compatible embeddings endpoints for generating vector representations of text



# File 'lib/telnyx/resources/ai/openai.rb', line 10

def embeddings
  @embeddings
end
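The embeddings resource returns vector representations of text; a common downstream step is comparing two such vectors by cosine similarity. A minimal sketch, using tiny hand-made stand-in vectors rather than real model output (real embeddings have hundreds or thousands of dimensions):

```ruby
# Cosine similarity between two equal-length vectors: the dot product
# divided by the product of the vector magnitudes. Returns 1.0 for
# identical directions, 0.0 for orthogonal ones.
def cosine_similarity(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

# Stand-in vectors, not actual embedding output.
v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 0.0, 1.0]
cosine_similarity(v1, v2) # => 1.0
```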

Instance Method Details

#create_response(body:, request_options: {}) ⇒ Hash{Symbol=>Object}

Chat with a language model. This endpoint is consistent with the [OpenAI Responses API](platform.openai.com/docs/api-reference/responses) and may be used with the OpenAI JS or Python SDK. The response id parameter is not supported at the moment; use the `conversation` parameter to leverage the persistent conversations feature.

Parameters:

Returns:

  • (Hash{Symbol=>Object})

See Also:



# File 'lib/telnyx/resources/ai/openai.rb', line 29

def create_response(params)
  parsed, options = Telnyx::AI::OpenAICreateResponseParams.dump_request(params)
  @client.request(
    method: :post,
    path: "ai/openai/responses",
    body: parsed[:body],
    model: Telnyx::Internal::Type::HashOf[Telnyx::Internal::Type::Unknown],
    options: options
  )
end
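A hedged usage sketch for `create_response`. The `body` fields follow the OpenAI Responses API shape, the `conversation` value is a hypothetical identifier, and `Telnyx::Client.new(api_key:)` is assumed (not confirmed here) to be the SDK's top-level constructor; the network call only runs when an API key is present:

```ruby
# Request body in the OpenAI Responses API shape; `conversation` is
# Telnyx's hook into persistent conversations (value is hypothetical).
body = {
  model: "moonshotai/Kimi-K2.6",
  input: "Explain SIP trunking in one sentence.",
  conversation: "example-conversation-id" # hypothetical conversation id
}

if ENV["TELNYX_API_KEY"]
  require "telnyx"
  client = Telnyx::Client.new(api_key: ENV["TELNYX_API_KEY"]) # assumed constructor
  response = client.ai.openai.create_response(body: body)
  puts response
end
```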

#list_models(request_options: {}) ⇒ Telnyx::Models::AI::OpenAIListModelsResponse

Lists every model currently available to your account on Telnyx Inference, including state-of-the-art open-source LLMs hosted on Telnyx GPUs (for example `moonshotai/Kimi-K2.6`, `zai-org/GLM-5.1-FP8`, and `MiniMaxAI/MiniMax-M2.7`), embedding models, and any fine-tuned models you have created.

Each entry is a `ModelMetadata` object describing the model id, owner, task, context length, supported languages, billing tier, pricing per 1M tokens, deployment regions, and whether the model supports vision or fine-tuning. Use this endpoint to discover model ids you can pass to `POST /v2/ai/openai/chat/completions`.

Model ids follow the `organization/model_name` convention from Hugging Face (for example `moonshotai/Kimi-K2.6`). This endpoint is OpenAI-compatible: clients pointed at `api.telnyx.com/v2/ai/openai` can call `client.models.list()` to retrieve the same payload.
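Because model ids follow the `organization/model_name` convention, both halves can be recovered with a single split; the id below is one of the examples named above:

```ruby
model_id = "moonshotai/Kimi-K2.6"
org, name = model_id.split("/", 2) # split only on the first "/"
# org  => "moonshotai"
# name => "Kimi-K2.6"
```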

Parameters:

Returns:

See Also:



# File 'lib/telnyx/resources/ai/openai.rb', line 63

def list_models(params = {})
  @client.request(
    method: :get,
    path: "ai/openai/models",
    model: Telnyx::Models::AI::OpenAIListModelsResponse,
    options: params[:request_options]
  )
end
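A sketch of narrowing the listed models down to chat-capable ids. The hashes below are hand-made stand-ins illustrating a few of the documented `ModelMetadata` fields (id, task, context length); the field names and `task` values are assumptions, not real API output:

```ruby
# Stand-in entries mirroring the documented ModelMetadata shape
# (field names and values are illustrative assumptions).
models = [
  { id: "moonshotai/Kimi-K2.6", task: "text-generation", context_length: 131_072 },
  { id: "zai-org/GLM-5.1-FP8", task: "text-generation", context_length: 131_072 },
  { id: "example-org/embedder", task: "embedding", context_length: 8_192 }
]

# Keep only chat-capable ids suitable for the chat completions endpoint.
chat_ids = models.select { |m| m[:task] == "text-generation" }.map { |m| m[:id] }
# chat_ids => ["moonshotai/Kimi-K2.6", "zai-org/GLM-5.1-FP8"]
```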