Module: Inquirex::LLM
- Defined in:
- lib/inquirex/llm.rb,
lib/inquirex/llm/node.rb,
lib/inquirex/llm/errors.rb,
lib/inquirex/llm/schema.rb,
lib/inquirex/llm/adapter.rb,
lib/inquirex/llm/version.rb,
lib/inquirex/llm/null_adapter.rb,
lib/inquirex/llm/openai_adapter.rb,
lib/inquirex/llm/dsl/flow_builder.rb,
lib/inquirex/llm/anthropic_adapter.rb,
lib/inquirex/llm/dsl/llm_step_builder.rb
Overview
LLM integration layer for Inquirex flows.
Extends the core DSL with four LLM-powered verbs that run server-side:
- clarify — extract structured data from free-text answers
- describe — generate natural-language text from structured data
- summarize — produce a summary of all or selected answers
- detour — dynamically generate follow-up questions
LLM calls never happen on the frontend. These steps are marked `requires_server: true` in the JSON wire format, so the JS widget knows it must round-trip to the server to execute them.
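As a minimal sketch of the idea, a serialized flow might distinguish server-side steps by that flag. The exact wire-format keys below are assumptions for illustration, not the gem's documented schema:

```ruby
require "json"

# Hypothetical wire format: key names are illustrative assumptions.
# The point is that LLM steps carry requires_server: true while
# plain ask steps do not.
steps = {
  "description" => { type: "ask",     requires_server: false },
  "extracted"   => { type: "clarify", requires_server: true }
}

wire   = JSON.generate(steps)
parsed = JSON.parse(wire)

# The JS widget would consult this flag before deciding whether
# a step can run locally or needs a server round-trip.
server_steps = parsed.select { |_id, step| step["requires_server"] }
puts server_steps.keys.inspect  # => ["extracted"]
```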
Usage:

    require "inquirex"
    require "inquirex-llm"

    Inquirex.define id: "intake" do
      start :description
      ask(:description) { type :text; question "Describe your business."; transition to: :extracted }
      clarify(:extracted) { from :description; prompt "Extract info."; schema name: :string; transition to: :done }
      say(:done) { text "Done!" }
    end
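The Adapter / NullAdapter class names suggest a pluggable backend with a null-object fallback for tests and offline runs. The sketch below assumes a `#complete` method as the adapter interface; that name and signature are illustrative guesses, not the gem's actual API:

```ruby
# Hypothetical base adapter: the #complete contract is an assumption
# for illustration, not Inquirex::LLM's documented interface.
class Adapter
  def complete(prompt:, schema: nil)
    raise NotImplementedError, "subclasses must implement #complete"
  end
end

# A null adapter returns canned, schema-shaped responses so flows can
# run in tests without touching a real LLM provider.
class NullAdapter < Adapter
  def complete(prompt:, schema: nil)
    # Build an empty response keyed by the schema fields, if given.
    schema ? schema.keys.to_h { |k| [k, nil] } : ""
  end
end

adapter = NullAdapter.new
puts adapter.complete(prompt: "Extract info.", schema: { name: :string })
```

Real backends (an `OpenAIAdapter` or `AnthropicAdapter`) would implement the same contract against their provider's API, which is what makes the backend swappable per flow.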
Defined Under Namespace
Modules: DSL, Errors
Classes: Adapter, AnthropicAdapter, Node, NullAdapter, OpenAIAdapter, Schema
Constant Summary
- VERSION = "0.3.0"