Class: Brute::Middleware::LLMCall
- Inherits: Object
- Defined in:
- lib/brute/middleware/llm_call.rb
Overview
The terminal “app” in the pipeline — performs the actual LLM call.
When streaming, on_content fires incrementally via AgentStream; when not streaming, on_content fires once, post-hoc, with the full response text.
Instance Method Summary
Instance Method Details
#call(env) ⇒ Object
# File 'lib/brute/middleware/llm_call.rb', line 11

def call(env)
  ctx = env[:context]
  response = ctx.talk(env[:input])

  # Only fire on_content post-hoc when NOT streaming
  # (streaming delivers chunks incrementally via AgentStream)
  unless env[:streaming]
    if (cb = env.dig(:callbacks, :on_content)) && response
      text = safe_content(response)
      cb.call(text) if text
    end
  end

  response
end
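The callback behavior above can be sketched with a minimal, self-contained stand-in. Here `FakeContext` and the simplified `safe_content` are hypothetical stubs for illustration, not part of Brute's real API; the point is to show that on_content fires post-hoc only when `env[:streaming]` is falsy.

```ruby
# Hypothetical stand-in for the pipeline context; `talk` mimics an LLM call.
class FakeContext
  def talk(input)
    { "content" => "echo: #{input}" } # stub response in place of a real model reply
  end
end

# Simplified content extraction for this sketch only.
def safe_content(response)
  response["content"]
end

# Mirrors the middleware's call logic: invoke the model, then fire
# on_content post-hoc only in the non-streaming case.
def llm_call(env)
  ctx = env[:context]
  response = ctx.talk(env[:input])
  unless env[:streaming]
    if (cb = env.dig(:callbacks, :on_content)) && response
      text = safe_content(response)
      cb.call(text) if text
    end
  end
  response
end

received = []
env = {
  context: FakeContext.new,
  input: "hello",
  streaming: false,
  callbacks: { on_content: ->(text) { received << text } }
}
llm_call(env)
puts received.inspect # ["echo: hello"]
```

With `streaming: true` the same env produces no post-hoc callback, since chunk delivery would instead happen incrementally during the call.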