Module: Legion::LLM::Inference::Conversation
- Extended by: Legion::Logging::Helper
- Defined in: lib/legion/llm/inference/conversation.rb
Constant Summary
- MAX_CONVERSATIONS = 256
- METADATA_ROLE = :__metadata__
- CURATED_ROLE = :__curated__
Class Method Summary
- .append(conversation_id, role:, content:, parent_id: nil, sidechain: false, message_group_id: nil, agent_id: nil, **metadata) ⇒ Object
- .branch(conversation_id, from_message_id:) ⇒ Object — Create a new conversation branched from from_message_id.
- .build_chain(conversation_id, include_sidechains: false) ⇒ Object — Build an ordered chain from parent links.
- .cancel_skill!(conversation_id) ⇒ Object — Reads the current state, clears :skill_state, and sets the :skill_cancelled flag.
- .clear_cancel_flag(conversation_id) ⇒ Object
- .clear_skill_state(conversation_id) ⇒ Object
- .conversation_exists?(conversation_id) ⇒ Boolean
- .create_conversation(conversation_id, **metadata) ⇒ Object
- .in_memory?(conversation_id) ⇒ Boolean
- .messages(conversation_id) ⇒ Object — Returns a flat, ordered message array (backward-compatible).
- .migrate_parent_links!(conversation_id) ⇒ Object — Migrate existing sequential messages to use parent links.
- .raw_messages(conversation_id) ⇒ Object — Returns ALL messages, including internal-role entries (__metadata__, __curated__).
- .read_metadata(conversation_id, tail_n: 20) ⇒ Object — Read metadata stored by store_metadata; scans the tail of the message list.
- .read_sticky_state(conversation_id) ⇒ Object — Returns the sticky_state hash for this conversation, loading from persistent storage on a cache miss when available.
- .replace(conversation_id, messages) ⇒ Object
- .reset! ⇒ Object
- .set_skill_state(conversation_id, skill_key:, resume_at:) ⇒ Object
- .sidechain_messages(conversation_id, agent_id: nil) ⇒ Object — Return sidechain messages; optionally filter by agent_id.
- .skill_cancelled?(conversation_id) ⇒ Boolean — :skill_cancelled is distinct from a nil :skill_state.
- .skill_state(conversation_id) ⇒ Object
- .store_metadata(conversation_id, title: nil, tags: nil, model: nil) ⇒ Object — Store session metadata as a special entry (tail-window pattern).
- .write_sticky_state(conversation_id, state) ⇒ Object — Writes sticky_state to an in-memory conversation and persists it when a DB backing store is available.
Class Method Details
.append(conversation_id, role:, content:, parent_id: nil, sidechain: false, message_group_id: nil, agent_id: nil, **metadata) ⇒ Object
# File 'lib/legion/llm/inference/conversation.rb', line 17

def append(conversation_id, role:, content:, parent_id: nil, sidechain: false,
           message_group_id: nil, agent_id: nil, **metadata)
  ensure_conversation(conversation_id)
  id = SecureRandom.uuid
  seq = next_seq(conversation_id)
  msg = {
    id: id, seq: seq, role: role, content: content,
    parent_id: parent_id, sidechain: sidechain,
    message_group_id: message_group_id, agent_id: agent_id,
    created_at: Time.now, **metadata
  }
  conversations[conversation_id][:messages] << msg
  touch(conversation_id)
  persist_message(conversation_id, msg) # helper name inferred; link text elided in source
  msg
end
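The record append assembles can be sketched in isolation. This is a standalone illustration using plain hashes rather than the module itself; the `store` hash and `next_seq` lambda are local stand-ins for the module's internal state.

```ruby
require 'securerandom'

# Stand-in for the module's per-conversation storage.
store = { messages: [] }
next_seq = -> { store[:messages].size + 1 }

# Build a message record with the same fields append assembles.
msg = {
  id: SecureRandom.uuid,
  seq: next_seq.call,
  role: :user,
  content: 'Hello',
  parent_id: nil,        # nil marks a root message
  sidechain: false,
  message_group_id: nil,
  agent_id: nil,
  created_at: Time.now
}
store[:messages] << msg

p msg[:seq] # => 1
```

Extra keyword arguments passed as `**metadata` are merged straight into this hash, so callers can attach arbitrary per-message annotations.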
.branch(conversation_id, from_message_id:) ⇒ Object
Create a new conversation branched from from_message_id. Copies all messages up to and including that message into a new conversation.
# File 'lib/legion/llm/inference/conversation.rb', line 83

def branch(conversation_id, from_message_id:)
  raw = raw_messages(conversation_id)
  target = raw.find { |m| m[:id] == from_message_id }
  raise ArgumentError, "Message #{from_message_id} not found in #{conversation_id}" unless target

  chain = reconstruct_chain(raw)
  # Only keep messages up to (and including) the target message by seq
  cutoff_seq = target[:seq]
  prefix = chain.select { |m| m[:seq] <= cutoff_seq }

  new_id = SecureRandom.uuid
  create_conversation(new_id)
  prefix.each_with_index do |msg, i|
    new_msg = msg.merge(seq: i + 1, id: SecureRandom.uuid, parent_id: nil, created_at: Time.now)
    conversations[new_id][:messages] << new_msg
    persist_message(new_id, new_msg) # helper name inferred; link text elided in source
  end
  touch(new_id)
  new_id
end
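The prefix-copy-and-renumber step at the heart of branch can be sketched on its own. A standalone illustration with plain hashes; the three-message `chain` is invented test data, not anything from the module.

```ruby
require 'securerandom'

# A reconstructed chain of three messages in seq order.
chain = (1..3).map { |s| { id: "m#{s}", seq: s, content: "msg #{s}" } }

# Branch at the second message: keep everything up to its seq,
# then renumber from 1 and mint fresh ids for the copies.
cutoff_seq = 2
prefix = chain.select { |m| m[:seq] <= cutoff_seq }
branched = prefix.each_with_index.map do |m, i|
  m.merge(seq: i + 1, id: SecureRandom.uuid, parent_id: nil)
end

p branched.map { |m| m[:content] } # => ["msg 1", "msg 2"]
```

Minting new ids keeps the branch fully independent: edits to the copies can never collide with message ids in the source conversation.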
.build_chain(conversation_id, include_sidechains: false) ⇒ Object
Build ordered chain from parent links. Excludes sidechain messages by default.
# File 'lib/legion/llm/inference/conversation.rb', line 66

def build_chain(conversation_id, include_sidechains: false)
  raw = raw_messages(conversation_id)
  raw = raw.reject { |m| m[:sidechain] } unless include_sidechains
  raw = raw.reject { |m| internal_role?(m[:role]) }
  reconstruct_chain(raw)
end
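reconstruct_chain itself is a private helper not shown here, but the parent-link walk it implies can be sketched as follows. This assumes the chain starts at the message whose parent_id is nil and that each message has at most one child on the main chain; `follow_parent_links` is a hypothetical name, not the module's.

```ruby
# Standalone sketch: order messages by following parent links from the root.
def follow_parent_links(messages)
  by_parent = messages.group_by { |m| m[:parent_id] }
  chain = []
  current = by_parent[nil]&.first # the root has no parent
  while current
    chain << current
    current = by_parent[current[:id]]&.first
  end
  chain
end

msgs = [
  { id: 'c', parent_id: 'b', content: 'third' },
  { id: 'a', parent_id: nil, content: 'first' },
  { id: 'b', parent_id: 'a', content: 'second' }
]
p follow_parent_links(msgs).map { |m| m[:content] }
# => ["first", "second", "third"]
```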
.cancel_skill!(conversation_id) ⇒ Object
Reads current state, clears :skill_state, sets :skill_cancelled flag. Returns the previous state (for use in skill.cancelled payload), or nil if none.
# File 'lib/legion/llm/inference/conversation.rb', line 215

def cancel_skill!(conversation_id)
  ensure_conversation(conversation_id)
  state = conversations[conversation_id].delete(:skill_state)
  if state
    conversations[conversation_id][:skill_cancelled] = true
    touch(conversation_id)
  end
  state
end
.clear_cancel_flag(conversation_id) ⇒ Object
# File 'lib/legion/llm/inference/conversation.rb', line 233

def clear_cancel_flag(conversation_id)
  return unless in_memory?(conversation_id)

  conversations[conversation_id].delete(:skill_cancelled)
  touch(conversation_id)
end
.clear_skill_state(conversation_id) ⇒ Object
# File 'lib/legion/llm/inference/conversation.rb', line 206

def clear_skill_state(conversation_id)
  return unless in_memory?(conversation_id)

  conversations[conversation_id].delete(:skill_state)
  touch(conversation_id)
end
.conversation_exists?(conversation_id) ⇒ Boolean
# File 'lib/legion/llm/inference/conversation.rb', line 180

def conversation_exists?(conversation_id)
  in_memory?(conversation_id) || db_conversation_exists?(conversation_id)
end
.create_conversation(conversation_id, **metadata) ⇒ Object
# File 'lib/legion/llm/inference/conversation.rb', line 166

def create_conversation(conversation_id, **metadata)
  conversations[conversation_id] = { messages: [], metadata: metadata, lru_tick: next_tick }
  evict_if_needed
  persist_conversation(conversation_id, metadata)
end
.in_memory?(conversation_id) ⇒ Boolean
# File 'lib/legion/llm/inference/conversation.rb', line 184

def in_memory?(conversation_id)
  conversations.key?(conversation_id)
end
.messages(conversation_id) ⇒ Object
Returns flat ordered message array — backward-compatible. Uses chain reconstruction when parent links exist; falls back to seq order. Internal-only roles (__metadata__, __curated__) are filtered out.
# File 'lib/legion/llm/inference/conversation.rb', line 43

def messages(conversation_id)
  if in_memory?(conversation_id)
    touch(conversation_id)
    raw = conversations[conversation_id][:messages].reject { |m| internal_role?(m[:role]) }
    chain_or_seq(raw)
  else
    load_from_db(conversation_id)
  end
end
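The internal-role filter that separates messages from raw_messages can be sketched in isolation. The two role constants come from the module's constant summary; the `internal_role?` body itself is an assumption (it is a private helper not shown in this page), and the `raw` array is invented test data.

```ruby
# Standalone sketch of the internal-role filter messages applies.
METADATA_ROLE = :__metadata__
CURATED_ROLE  = :__curated__
INTERNAL_ROLES = [METADATA_ROLE, CURATED_ROLE].freeze

def internal_role?(role)
  INTERNAL_ROLES.include?(role)
end

raw = [
  { role: :user,         content: 'hi' },
  { role: METADATA_ROLE, content: '{"title":"t"}' },
  { role: :assistant,    content: 'hello' }
]
visible = raw.reject { |m| internal_role?(m[:role]) }
p visible.map { |m| m[:role] } # => [:user, :assistant]
```

Callers that need the curation markers or metadata entries should use raw_messages instead, which skips this filter.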
.migrate_parent_links!(conversation_id) ⇒ Object
Migrate existing sequential messages to use parent links. Safe to call on already-migrated data (no-op when parent links present).
# File 'lib/legion/llm/inference/conversation.rb', line 242

def migrate_parent_links!(conversation_id)
  ensure_conversation(conversation_id)
  msgs = conversations[conversation_id][:messages].sort_by { |m| m[:seq] }
  return if msgs.empty?
  return if msgs.any? { |m| m[:parent_id] }

  prev_id = nil
  msgs.each do |msg|
    msg[:parent_id] = prev_id
    prev_id = msg[:id] ||= SecureRandom.uuid
  end
  touch(conversation_id)
end
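The migration loop can be sketched on its own. A standalone illustration with plain hashes; the two-message array is invented test data, and one message deliberately lacks an id to show the backfill.

```ruby
require 'securerandom'

# Standalone sketch: link seq-ordered messages into a parent chain.
msgs = [
  { id: nil,  seq: 2, content: 'b' },
  { id: 'a1', seq: 1, content: 'a' }
].sort_by { |m| m[:seq] }

prev_id = nil
msgs.each do |msg|
  msg[:parent_id] = prev_id
  prev_id = msg[:id] ||= SecureRandom.uuid # backfill missing ids as we go
end

p msgs.map { |m| [m[:content], m[:parent_id]] }
# => [["a", nil], ["b", "a1"]]
```

The early return on any existing parent_id is what makes the real method idempotent: re-running it against already-migrated data changes nothing.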
.raw_messages(conversation_id) ⇒ Object
Returns ALL messages including internal-role entries (__metadata__, __curated__). Use this when you need access to curation markers or metadata entries.
# File 'lib/legion/llm/inference/conversation.rb', line 55

def raw_messages(conversation_id)
  if in_memory?(conversation_id)
    touch(conversation_id)
    conversations[conversation_id][:messages].dup
  else
    load_all_from_db(conversation_id)
  end
end
.read_metadata(conversation_id, tail_n: 20) ⇒ Object
Read metadata stored by store_metadata; scans tail of message list.
# File 'lib/legion/llm/inference/conversation.rb', line 126

def read_metadata(conversation_id, tail_n: 20)
  raw = raw_messages(conversation_id)
  tail = raw.last(tail_n).select { |m| m[:role] == METADATA_ROLE }
  return nil if tail.empty?

  entry = tail.last
  Legion::JSON.parse(entry[:content])
rescue Legion::JSON::ParseError => e
  handle_exception(e, level: :debug, handled: true, operation: 'llm.conversation.metadata_json_parse')
  nil
end
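The tail-window scan can be sketched with stdlib JSON standing in for Legion::JSON. Only the last matching entry wins, so a newer store_metadata call shadows older ones; the sample `raw` array is invented test data.

```ruby
require 'json'

# Standalone sketch of the tail-window metadata scan.
METADATA_ROLE = :__metadata__

raw = [
  { role: :user,         content: 'hi' },
  { role: METADATA_ROLE, content: { title: 'First' }.to_json },
  { role: METADATA_ROLE, content: { title: 'Latest' }.to_json }
]

tail = raw.last(20).select { |m| m[:role] == METADATA_ROLE }
meta = tail.empty? ? nil : JSON.parse(tail.last[:content])
p meta # => {"title"=>"Latest"}
```

Because only the last tail_n entries are scanned, metadata written very early in a long conversation can scroll out of the window; re-storing it periodically keeps it visible.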
.read_sticky_state(conversation_id) ⇒ Object
Returns the sticky_state hash for this conversation, loading from persistent storage on cache miss when available. Does NOT call ensure_conversation, avoiding resurrection of unknown conversations.
# File 'lib/legion/llm/inference/conversation.rb', line 141

def read_sticky_state(conversation_id)
  unless in_memory?(conversation_id)
    persisted = nil
    persisted = db_load_sticky_state(conversation_id) if db_available?
    return persisted if persisted.is_a?(Hash) && persisted.any?

    return {}.freeze
  end

  conversations[conversation_id][:sticky_state] ||= begin
    persisted = db_load_sticky_state(conversation_id) if db_available?
    persisted.is_a?(Hash) ? persisted : {}
  end
end
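The lookup order (in-memory cache, then persistent store, then a frozen empty hash) can be sketched with plain hashes. The `cache` and `db` hashes are stand-ins for the module's internals, and the memoizing write-back of the real method is omitted to keep the sketch short.

```ruby
# Standalone sketch of the cache-miss fallback in read_sticky_state.
cache = {}
db = { 'conv-1' => { 'mode' => 'focused' } } # stand-in for the DB store

read = lambda do |id|
  return cache[id] if cache.key?(id)

  persisted = db[id]
  return persisted if persisted.is_a?(Hash) && persisted.any?

  {}.freeze # frozen so callers cannot mutate a shared default
end

p read.call('conv-1')           # => {"mode"=>"focused"}
p read.call('conv-404')         # => {}
p read.call('conv-404').frozen? # => true
```

Returning a frozen empty hash on a total miss is what lets the real method avoid ensure_conversation: an unknown conversation is never resurrected just because something asked about its sticky state.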
.replace(conversation_id, messages) ⇒ Object
# File 'lib/legion/llm/inference/conversation.rb', line 172

def replace(conversation_id, messages)
  ensure_conversation(conversation_id)
  conversations[conversation_id][:messages] = messages.each_with_index.map do |msg, i|
    msg.merge(seq: i + 1, created_at: msg[:created_at] || Time.now)
  end
  touch(conversation_id)
end
.reset! ⇒ Object
# File 'lib/legion/llm/inference/conversation.rb', line 188

def reset!
  @conversations = {}
  @lru_counter = 0
end
.set_skill_state(conversation_id, skill_key:, resume_at:) ⇒ Object
# File 'lib/legion/llm/inference/conversation.rb', line 193

def set_skill_state(conversation_id, skill_key:, resume_at:)
  ensure_conversation(conversation_id)
  conversations[conversation_id][:skill_state] = { skill_key: skill_key, resume_at: resume_at }
  touch(conversation_id)
end
.sidechain_messages(conversation_id, agent_id: nil) ⇒ Object
Return sidechain messages; optionally filter by agent_id.
# File 'lib/legion/llm/inference/conversation.rb', line 74

def sidechain_messages(conversation_id, agent_id: nil)
  raw = raw_messages(conversation_id)
  result = raw.select { |m| m[:sidechain] && !internal_role?(m[:role]) }
  result = result.select { |m| m[:agent_id] == agent_id } unless agent_id.nil?
  result.sort_by { |m| m[:seq] }
end
.skill_cancelled?(conversation_id) ⇒ Boolean
:skill_cancelled is distinct from a nil :skill_state. nil skill_state also occurs after normal completion — use this flag to detect cancel.
# File 'lib/legion/llm/inference/conversation.rb', line 227

def skill_cancelled?(conversation_id)
  return false unless in_memory?(conversation_id)

  conversations[conversation_id][:skill_cancelled] == true
end
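The reason for the separate flag can be shown with a plain hash standing in for a conversation record. After normal completion, :skill_state is simply gone; after cancel_skill!, :skill_state is gone and :skill_cancelled is set, which is the only way to tell the two apart.

```ruby
# Standalone sketch of the cancel flow: clearing :skill_state alone is
# ambiguous (it also happens on normal completion), so cancellation
# additionally sets a :skill_cancelled flag.
conv = { skill_state: { skill_key: :summarize, resume_at: 3 } }

# Cancel: remove the state and mark the flag (mirrors cancel_skill!).
previous = conv.delete(:skill_state)
conv[:skill_cancelled] = true if previous

cancelled = conv[:skill_cancelled] == true
p previous  # => {:skill_key=>:summarize, :resume_at=>3}
p cancelled # => true
```

The returned `previous` state is what the real method hands back for use in a skill.cancelled payload; clear_cancel_flag then resets the flag once the cancellation has been handled.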
.skill_state(conversation_id) ⇒ Object
# File 'lib/legion/llm/inference/conversation.rb', line 199

def skill_state(conversation_id)
  return nil unless in_memory?(conversation_id)

  touch(conversation_id)
  conversations[conversation_id][:skill_state]&.dup
end
.store_metadata(conversation_id, title: nil, tags: nil, model: nil) ⇒ Object
Store session metadata as a special entry (tail-window pattern).
# File 'lib/legion/llm/inference/conversation.rb', line 105

def store_metadata(conversation_id, title: nil, tags: nil, model: nil)
  ensure_conversation(conversation_id)
  payload = { title: title, tags: tags, model: model }.compact
  msg = {
    id: SecureRandom.uuid,
    seq: next_seq(conversation_id),
    role: METADATA_ROLE,
    content: payload.to_json,
    parent_id: nil,
    sidechain: false,
    message_group_id: nil,
    agent_id: nil,
    created_at: Time.now
  }
  conversations[conversation_id][:messages] << msg
  touch(conversation_id)
  persist_message(conversation_id, msg) # helper name inferred; link text elided in source
  msg
end
.write_sticky_state(conversation_id, state) ⇒ Object
Writes sticky_state to an in-memory conversation and persists it when a DB backing store is available.
# File 'lib/legion/llm/inference/conversation.rb', line 158

def write_sticky_state(conversation_id, state)
  return unless in_memory?(conversation_id)

  conversations[conversation_id][:sticky_state] = state
  touch(conversation_id)
  db_persist_sticky_state(conversation_id, state) if db_available?
end