Module: LLM::VLLMMethods
Overview
vLLM exposes an OpenAI-compatible Responses API, so this backend reuses the Responses backend behaviour and only tweaks tool-call parsing.
Instance Method Summary
Instance Method Details
#parse_tool_call(info) ⇒ Object
# File 'lib/scout/llm/backends/vllm.rb', line 10

def parse_tool_call(info)
  tool_call = super
  name = tool_call[:name].to_s
  # Strip a "channel" marker (and the token that follows it) from the tool name
  name.sub!(/[^a-zA-Z_]+channel[^a-zA-Z_]+[a-zA-Z_]+/, '')
  tool_call.merge(name: name)
end
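A minimal sketch of what the regex does in isolation. The input string and tool name here are hypothetical examples (not from the source): some models served by vLLM can emit a channel marker such as `<|channel|>commentary` appended to the tool name, and the substitution removes the marker together with the word that follows it.

```ruby
# Hypothetical raw tool name with an embedded channel marker
name = "get_weather<|channel|>commentary"

# Same substitution used by #parse_tool_call:
# [^a-zA-Z_]+ matches "<|", then "channel", then "|>", then
# [a-zA-Z_]+ consumes the trailing word ("commentary")
name.sub!(/[^a-zA-Z_]+channel[^a-zA-Z_]+[a-zA-Z_]+/, '')

puts name  # => "get_weather"
```

A plain tool name with no marker is left untouched, since `sub!` returns `nil` when nothing matches and the string stays as-is.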