Class: Ace::LLM::Organisms::TogetherAIClient

Inherits:
BaseClient
  • Object
Defined in:
lib/ace/llm/organisms/togetherai_client.rb

Overview

TogetherAIClient handles interactions with the Together AI API.

Constant Summary

API_BASE_URL =
"https://api.together.xyz"
DEFAULT_MODEL =
"meta-llama/Llama-3-70b-chat-hf"
DEFAULT_GENERATION_CONFIG =
{
  temperature: 0.7,
  max_tokens: nil,
  top_p: nil,
  top_k: nil,
  repetition_penalty: nil
}.freeze
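The defaults above leave most sampling parameters unset (`nil`). A common pattern for such a config hash — a sketch only, not necessarily this client's actual implementation — is to merge caller-supplied options over the defaults and drop the `nil` entries so they are omitted from the request body:

```ruby
# Sketch only: merge user options over the defaults, then drop unset (nil)
# parameters so they never reach the API request body.
DEFAULT_GENERATION_CONFIG = {
  temperature: 0.7,
  max_tokens: nil,
  top_p: nil,
  top_k: nil,
  repetition_penalty: nil
}.freeze

params = DEFAULT_GENERATION_CONFIG.merge(temperature: 0.2, max_tokens: 512).compact
# params == { temperature: 0.2, max_tokens: 512 }
```

`Hash#compact` (Ruby 2.4+) removes the `nil`-valued keys, so only explicitly set parameters are serialized.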

Constants inherited from BaseClient

BaseClient::DEFAULT_SYSTEM_PROMPT_SEPARATOR

Instance Attribute Summary

Attributes inherited from BaseClient

#api_key, #base_url, #generation_config, #http_client, #model

Class Method Summary

Instance Method Summary

Methods inherited from BaseClient

#initialize, #needs_credentials?, #provider_name

Constructor Details

This class inherits a constructor from Ace::LLM::Organisms::BaseClient

Class Method Details

.provider_name ⇒ String

Get the provider name

Returns:

  • (String)

    Provider name



# File 'lib/ace/llm/organisms/togetherai_client.rb', line 22

def self.provider_name
  "togetherai"
end

Instance Method Details

#generate(messages, **options) ⇒ Hash

Generate a response from Together AI

Parameters:

  • messages (Array<Hash>, String)

    Messages or prompt

  • options (Hash)

    Generation options

Returns:

  • (Hash)

    Response with text and metadata



# File 'lib/ace/llm/organisms/togetherai_client.rb', line 30

def generate(messages, **options)
  messages_array = build_messages(messages)
  generation_params = extract_generation_options(options)

  request_body = build_request_body(messages_array, generation_params)
  response = make_api_request(request_body)

  parse_response(response)
rescue => e
  handle_api_error(e)
end
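As documented above, `#generate` accepts either an Array of message hashes or a plain String prompt, normalized by the inherited `build_messages` helper. A hypothetical normalization along these lines — the real helper is private and may differ — would be:

```ruby
# Hypothetical sketch: wrap a bare String prompt as a single user message,
# leaving an already-structured messages array untouched.
def normalize_messages(messages)
  return messages if messages.is_a?(Array)

  [{ role: "user", content: messages.to_s }]
end

normalize_messages("Hello")
# => [{ role: "user", content: "Hello" }]
```

Note that any error raised during the request is rescued and routed through `handle_api_error`, so callers see either a parsed response Hash or the client's normalized error handling.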