Class: RubynCode::LLM::Adapters::OpenAIStreaming
- Inherits:
- Object
- Includes:
- JsonParsing
- Defined in:
- lib/rubyn_code/llm/adapters/openai_streaming.rb
Overview
SSE streaming parser for OpenAI Chat Completions API.
Parses `data: …` lines from the SSE stream, accumulates content deltas and tool_calls, and produces a normalized LLM::Response via #finalize.
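The `data:` line format this parser consumes can be illustrated with a minimal, self-contained sketch. The chunk payloads below are representative OpenAI delta shapes written for illustration, not captured output, and the loop is a simplification of what the class does internally:

```ruby
require 'json'

# Minimal sketch of SSE 'data:' line handling, independent of this class.
# Each event line carries a JSON chunk; the literal '[DONE]' sentinel
# ends the stream.
raw = <<~SSE
  data: {"id":"chatcmpl-1","choices":[{"delta":{"content":"Hel"}}]}

  data: {"id":"chatcmpl-1","choices":[{"delta":{"content":"lo"}}]}

  data: [DONE]
SSE

text = +''
raw.each_line do |line|
  line = line.strip
  next unless line.start_with?('data: ')
  payload = line.delete_prefix('data: ')
  break if payload == '[DONE]'
  delta = JSON.parse(payload).dig('choices', 0, 'delta', 'content')
  text << delta if delta
end

puts text # => "Hello"
```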
Defined Under Namespace
Classes: Event
Constant Summary
- STOP_REASON_MAP =
  {
    'stop' => 'end_turn',
    'tool_calls' => 'tool_use',
    'length' => 'max_tokens',
    'content_filter' => 'end_turn'
  }.freeze
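The table maps OpenAI `finish_reason` values onto the normalized stop reasons used by `LLM::Response`; per the `||` chain in #finalize, unmapped reasons pass through unchanged and a missing reason defaults to `'end_turn'`. A small sketch of that lookup (the `normalize_stop` helper is hypothetical, named here for illustration):

```ruby
STOP_REASON_MAP = {
  'stop' => 'end_turn',
  'tool_calls' => 'tool_use',
  'length' => 'max_tokens',
  'content_filter' => 'end_turn'
}.freeze

# Mirrors the fallback chain in #finalize: mapped value, else the raw
# finish_reason, else 'end_turn' when the stream never reported one.
def normalize_stop(finish_reason)
  STOP_REASON_MAP[finish_reason] || finish_reason || 'end_turn'
end

normalize_stop('tool_calls') # => "tool_use"
normalize_stop('custom')     # => "custom"
normalize_stop(nil)          # => "end_turn"
```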
Instance Method Summary
- #feed(chunk) ⇒ Object
- #finalize ⇒ Object
- #initialize(&block) ⇒ OpenAIStreaming (constructor)
  A new instance of OpenAIStreaming.
Constructor Details
#initialize(&block) ⇒ OpenAIStreaming
Returns a new instance of OpenAIStreaming.
  # File 'lib/rubyn_code/llm/adapters/openai_streaming.rb', line 24

  def initialize(&block)
    @callback = block
    @buffer = +''
    @content_text = +''
    @tool_calls = {}
    @response_id = nil
    @model = nil
    @finish_reason = nil
    @usage = nil
  end
Instance Method Details
#feed(chunk) ⇒ Object
  # File 'lib/rubyn_code/llm/adapters/openai_streaming.rb', line 35

  def feed(chunk)
    @buffer << chunk
    consume_sse_events
  end
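#feed may receive chunks that split an SSE event across network reads, which is why it appends to a buffer before consuming. The loop below is a simplified stand-in for the private `consume_sse_events` (whose implementation is not shown in this doc): complete events are delimited by a blank line, and any trailing partial event stays buffered for the next call.

```ruby
# Simplified stand-in for buffered SSE consumption: complete events are
# delimited by "\n\n"; anything after the last delimiter stays buffered.
buffer = +''
events = []

feed = lambda do |chunk|
  buffer << chunk
  while (idx = buffer.index("\n\n"))
    events << buffer.slice!(0, idx + 2).strip
  end
end

# An event arriving split across two network reads:
feed.call("data: {\"a\":1}\n\ndata: {\"b\"")
feed.call(":2}\n\n")

events # => ["data: {\"a\":1}", "data: {\"b\":2}"]
```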
#finalize ⇒ Object
  # File 'lib/rubyn_code/llm/adapters/openai_streaming.rb', line 40

  def finalize
    content = build_content_blocks
    stop = STOP_REASON_MAP[@finish_reason] || @finish_reason || 'end_turn'
    RubynCode::LLM::Response.new(
      id: @response_id,
      content: content,
      stop_reason: stop,
      usage: @usage || RubynCode::LLM::Usage.new(input_tokens: 0, output_tokens: 0)
    )
  end
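One detail behind the `@tool_calls` hash that #finalize draws on: OpenAI streams each tool call's JSON arguments as string fragments keyed by index, so they must be concatenated before they can be decoded. The accumulator below is a hypothetical sketch of that pattern, not the class's private code:

```ruby
require 'json'

# Hypothetical accumulator for streamed tool_call deltas: the call's id and
# name arrive once, and its JSON arguments arrive as string fragments,
# all keyed by the call's index.
tool_calls = {}

deltas = [
  { 'index' => 0, 'id' => 'call_1',
    'function' => { 'name' => 'get_weather', 'arguments' => '{"cit' } },
  { 'index' => 0, 'function' => { 'arguments' => 'y":"Paris"}' } }
]

deltas.each do |d|
  tc = tool_calls[d['index']] ||= { 'id' => nil, 'name' => +'', 'arguments' => +'' }
  tc['id'] ||= d['id']
  fn = d['function'] || {}
  tc['name'] << fn['name'] if fn['name']
  tc['arguments'] << fn['arguments'] if fn['arguments']
end

call = tool_calls[0]
JSON.parse(call['arguments']) # => {"city"=>"Paris"}
```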