Class: Inquirex::LLM::Node

Inherits:
  Inquirex::Node
    • Object
Defined in:
lib/inquirex/llm/node.rb

Overview

Enriched node for LLM-powered steps. Extends Inquirex::Node with attributes needed by the server-side LLM adapter: prompt template, output schema, source step references, and model configuration.

LLM verbs:

:clarify   — extract structured data from a free-text answer
:describe  — generate natural-language text from structured data
:summarize — produce a summary of all or selected answers
:detour    — dynamically generate follow-up questions based on an answer

All LLM nodes are collecting (they produce answers) and require server round-trips. The frontend shows a “thinking” state while the server processes.

Constant Summary

LLM_VERBS =
%i[clarify describe summarize detour].freeze

Instance Attribute Summary

Class Method Summary

Instance Method Summary

Constructor Details

#initialize(prompt:, schema: nil, from_steps: [], from_all: false, model: nil, temperature: nil, max_tokens: nil, fallback: nil) ⇒ Node

Returns a new instance of Node.



# File 'lib/inquirex/llm/node.rb', line 38

def initialize(prompt:, schema: nil, from_steps: [], from_all: false,
  model: nil, temperature: nil, max_tokens: nil, fallback: nil, **)
  @prompt = prompt
  @schema = schema
  @from_steps = Array(from_steps).map(&:to_sym).freeze
  @from_all = !!from_all
  @model = model&.to_sym
  @temperature = temperature&.to_f
  @max_tokens = max_tokens&.to_i
  @fallback = fallback
  super(**)
end
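The constructor coerces its inputs into stable types before storing them. A standalone sketch of the same coercions, written as a plain function for illustration (the helper name `coerce_llm_options` is hypothetical, not part of the library):

```ruby
# Mirrors the coercions #initialize applies to the LLM-specific keywords.
def coerce_llm_options(from_steps: [], from_all: false, model: nil,
                       temperature: nil, max_tokens: nil)
  {
    from_steps:  Array(from_steps).map(&:to_sym).freeze, # single id or list => frozen Symbol array
    from_all:    !!from_all,                             # any value => strict true/false
    model:       model&.to_sym,                          # nil stays nil, String => Symbol
    temperature: temperature&.to_f,
    max_tokens:  max_tokens&.to_i
  }
end

coerce_llm_options(from_steps: "name", model: "claude_sonnet")
# => { from_steps: [:name], from_all: false, model: :claude_sonnet,
#      temperature: nil, max_tokens: nil }
```

Note that `Array(from_steps)` accepts a single step id, an array of ids, or `nil` (which becomes `[]`), so callers don't have to wrap a lone id in an array.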

Instance Attribute Details

#fallback ⇒ Proc? (readonly)

server-side fallback (stripped from JSON)

Returns:

  • (Proc, nil)

    the current value of fallback



# File 'lib/inquirex/llm/node.rb', line 26

def fallback
  @fallback
end

#from_all ⇒ Boolean (readonly)

whether to pass all collected answers to the LLM

Returns:

  • (Boolean)

    the current value of from_all



# File 'lib/inquirex/llm/node.rb', line 26

def from_all
  @from_all
end

#from_steps ⇒ Array&lt;Symbol&gt; (readonly)

source step ids whose answers feed the LLM

Returns:

  • (Array<Symbol>)

    the current value of from_steps



# File 'lib/inquirex/llm/node.rb', line 26

def from_steps
  @from_steps
end

#max_tokens ⇒ Integer? (readonly)

optional max output tokens

Returns:

  • (Integer, nil)

    the current value of max_tokens



# File 'lib/inquirex/llm/node.rb', line 26

def max_tokens
  @max_tokens
end

#model ⇒ Symbol? (readonly)

optional model hint (e.g. :claude_sonnet)

Returns:

  • (Symbol, nil)

    the current value of model



# File 'lib/inquirex/llm/node.rb', line 26

def model
  @model
end

#prompt ⇒ String (readonly)

LLM prompt template

Returns:

  • (String)

    the current value of prompt



# File 'lib/inquirex/llm/node.rb', line 26

def prompt
  @prompt
end

#schema ⇒ Schema? (readonly)

expected output structure (required for clarify/detour)

Returns:

  • (Schema, nil)

    the current value of schema



# File 'lib/inquirex/llm/node.rb', line 26

def schema
  @schema
end

#temperature ⇒ Float? (readonly)

optional sampling temperature

Returns:

  • (Float, nil)

    the current value of temperature



# File 'lib/inquirex/llm/node.rb', line 26

def temperature
  @temperature
end

Class Method Details

.from_h(id, hash) ⇒ LLM::Node

Deserializes from a plain Hash (string or symbol keys).

Parameters:

  • id (Symbol, String)
  • hash (Hash)

Returns:

  • (LLM::Node)

# File 'lib/inquirex/llm/node.rb', line 91

def self.from_h(id, hash)
  verb             = hash["verb"]        || hash[:verb]
  question         = hash["question"]    || hash[:question]
  text             = hash["text"]        || hash[:text]
  transitions_data = hash["transitions"] || hash[:transitions] || []
  skip_if_data     = hash["skip_if"]     || hash[:skip_if]
  llm_data         = hash["llm"]         || hash[:llm] || {}

  transitions = transitions_data.map { |t| Inquirex::Transition.from_h(t) }
  skip_if = skip_if_data ? Inquirex::Rules::Base.from_h(skip_if_data) : nil

  prompt     = llm_data["prompt"]     || llm_data[:prompt]
  schema_raw = llm_data["schema"]     || llm_data[:schema]
  from_raw   = llm_data["from_steps"] || llm_data[:from_steps] || []
  from_all   = llm_data["from_all"]   || llm_data[:from_all] || false
  model      = llm_data["model"]      || llm_data[:model]
  temp       = llm_data["temperature"] || llm_data[:temperature]
  max_tok    = llm_data["max_tokens"]  || llm_data[:max_tokens]

  schema = schema_raw ? Schema.from_h(schema_raw) : nil
  from_steps = from_raw.map(&:to_sym)

  new(
    id:,
    verb:,
    question:,
    text:,
    transitions:,
    skip_if:,
    prompt:,
    schema:,
    from_steps:,
    from_all:,
    model:,
    temperature: temp,
    max_tokens:  max_tok
  )
end
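Every lookup above follows the same pattern: try the String key, then the Symbol key, then a default. A sketch of a helper capturing that pattern (the name `fetch_either` is hypothetical, not a library method):

```ruby
# Looks the key up as a String first, then as a Symbol, then falls back
# to a default. Like the || chains above, an explicit nil or false value
# under the String key is treated as absent.
def fetch_either(hash, key, default = nil)
  hash[key.to_s] || hash[key.to_sym] || default
end

fetch_either({ "verb" => "clarify" }, :verb)       # => "clarify"
fetch_either({ from_steps: [:name] }, :from_steps) # => [:name]
fetch_either({}, :transitions, [])                 # => []
```

The `||` fallthrough is harmless here because every defaulted field (`transitions`, `from_steps`, `from_all`) defaults to a falsy-equivalent value anyway.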

.llm_verb?(verb) ⇒ Boolean

Whether this verb is a recognized LLM verb.

Returns:

  • (Boolean)


# File 'lib/inquirex/llm/node.rb', line 131

def self.llm_verb?(verb)
  LLM_VERBS.include?(verb.to_sym)
end
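The predicate reads naturally in isolation; a standalone sketch (the constant is reproduced here so the snippet runs on its own):

```ruby
# Recognized LLM verbs, as defined on Inquirex::LLM::Node.
LLM_VERBS = %i[clarify describe summarize detour].freeze

# Accepts a Symbol or a String; anything responding to #to_sym works.
def llm_verb?(verb) = LLM_VERBS.include?(verb.to_sym)

llm_verb?(:clarify)   # => true
llm_verb?("describe") # => true
llm_verb?(:ask)       # => false
```

Like the library method, this raises `NoMethodError` for inputs without `#to_sym` (e.g. `nil`), so callers should guard untrusted input.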

Instance Method Details

#collecting? ⇒ Boolean

LLM verbs always collect output (the LLM provides the “answer”).

Returns:

  • (Boolean)


# File 'lib/inquirex/llm/node.rb', line 53

def collecting? = true

#display? ⇒ Boolean

LLM verbs are never display-only.

Returns:

  • (Boolean)


# File 'lib/inquirex/llm/node.rb', line 56

def display? = false

#llm_verb? ⇒ Boolean

Whether this is an LLM-powered step requiring server processing.

Returns:

  • (Boolean)


# File 'lib/inquirex/llm/node.rb', line 59

def llm_verb? = true

#to_h ⇒ Hash

Serializes to a plain Hash. LLM metadata is nested under “llm”. Fallback procs are stripped (server-side only), and the node itself is marked “requires_server” => true so the frontend knows to round-trip.

Returns:

  • (Hash)


# File 'lib/inquirex/llm/node.rb', line 66

def to_h
  hash = { "verb" => @verb.to_s }
  hash["question"] = @question if @question
  hash["text"] = @text if @text
  hash["transitions"] = @transitions.map(&:to_h) unless @transitions.empty?
  hash["skip_if"] = @skip_if.to_h if @skip_if
  hash["requires_server"] = true

  llm_hash = { "prompt" => @prompt }
  llm_hash["schema"] = @schema.to_h if @schema
  llm_hash["from_steps"] = @from_steps.map(&:to_s) unless @from_steps.empty?
  llm_hash["from_all"] = true if @from_all
  llm_hash["model"] = @model.to_s if @model
  llm_hash["temperature"] = @temperature if @temperature
  llm_hash["max_tokens"] = @max_tokens if @max_tokens
  hash["llm"] = llm_hash

  hash
end
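For a clarify step with a prompt, a schema, and one source step, the serialized shape looks roughly like this (all values illustrative; the inner schema hash and the `{{answer}}` placeholder syntax are assumptions, since `Schema#to_h` and the prompt-template format are not shown on this page):

```ruby
serialized = {
  "verb"            => "clarify",
  "question"        => "What is your address?",
  "requires_server" => true,
  "llm" => {
    "prompt"     => "Extract street, city, and zip from: {{answer}}",
    "schema"     => { "street" => "string", "city" => "string", "zip" => "string" },
    "from_steps" => ["address_freeform"]
  }
}

# Optional attributes that were nil/false/empty are simply absent,
# and the fallback proc never appears in the serialized form.
serialized.key?("fallback")        # => false
serialized["llm"].key?("from_all") # => false
```

Because absent and default values are omitted, the JSON payload sent to the frontend stays minimal, and `from_h` restores the defaults on deserialization.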