Class: ClaudeMemory::Store::SQLiteStore
- Inherits: Object
  - Object
  - ClaudeMemory::Store::SQLiteStore
- Includes: RetryHandler, SchemaManager
- Defined in: lib/claude_memory/store/sqlite_store.rb
Overview
SQLite-backed fact store for ClaudeMemory. Manages all database tables (content_items, entities, facts, provenance, conflicts, fact_links, etc.) via Sequel with Extralite adapter. Includes RetryHandler for transient lock recovery and SchemaManager for automatic migrations on open.
Constant Summary
Constants included from SchemaManager
ClaudeMemory::Store::SchemaManager::SCHEMA_VERSION
Constants included from RetryHandler
RetryHandler::MAX_RETRIES, RetryHandler::RETRY_BASE_DELAY
Instance Attribute Summary
-
#db ⇒ Sequel::Database
readonly
The underlying Sequel database connection.
Instance Method Summary
-
#aggregate_ingestion_metrics ⇒ Hash?
Compute aggregate ingestion metrics across all distillation runs.
-
#backfill_distillation_metrics! ⇒ Integer
Mark all undistilled content items as distilled with zero token counts.
-
#checkpoint_wal ⇒ void
Checkpoint the WAL file to prevent unlimited growth.
-
#close ⇒ void
Disconnect from the database.
- #conflicts ⇒ Sequel::Dataset
-
#content_item_by_transcript_and_mtime(transcript_path, mtime_iso8601) ⇒ Hash?
Find a content item by transcript path and source modification time.
- #content_items ⇒ Sequel::Dataset
-
#count_undistilled(min_length: 200) ⇒ Integer
Count content items that have not yet been distilled.
- #delta_cursors ⇒ Sequel::Dataset
- #entities ⇒ Sequel::Dataset
- #entity_aliases ⇒ Sequel::Dataset
- #fact_links ⇒ Sequel::Dataset
- #facts ⇒ Sequel::Dataset
-
#facts_for_slot(subject_entity_id, predicate, status: "active") ⇒ Array<Hash>
Find all facts for a given subject + predicate combination (a “slot”).
-
#facts_with_embeddings(limit: 1000) ⇒ Array<Hash>
Retrieve active facts that have stored embeddings.
-
#find_fact_by_docid(docid) ⇒ Hash?
Look up a fact by its short document identifier.
-
#find_or_create_entity(type:, name:) ⇒ Integer
Find an entity by its slug or create a new one.
-
#get_content_item(id) ⇒ Hash?
Fetch a single content item by primary key.
-
#get_delta_cursor(session_id, transcript_path) ⇒ Integer?
Get the last-read byte offset for a session/transcript pair.
-
#get_meta(key) ⇒ String?
Retrieve a value from the meta table.
- #ingestion_metrics ⇒ Sequel::Dataset
-
#initialize(db_path) ⇒ SQLiteStore
constructor
Open (or create) a SQLite database and migrate to the current schema.
-
#insert_conflict(fact_a_id:, fact_b_id:, status: "open", notes: nil) ⇒ Integer
Record a conflict between two facts.
-
#insert_fact(subject_entity_id:, predicate:, object_entity_id: nil, object_literal: nil, datatype: nil, polarity: "positive", valid_from: nil, status: "active", confidence: 1.0, created_from: nil, scope: "project", project_path: nil) ⇒ Integer
Insert a new fact (subject-predicate-object triple) with an auto-generated docid.
-
#insert_fact_link(from_fact_id:, to_fact_id:, link_type:) ⇒ Integer
Create a directional link between two facts (e.g. supersession).
-
#insert_mcp_tool_call(tool_name:, duration_ms:, result_count: nil, scope: nil, error_class: nil, called_at: nil) ⇒ Integer
Record a single MCP tool invocation for telemetry.
-
#insert_provenance(fact_id:, content_item_id: nil, quote: nil, attribution_entity_id: nil, strength: "stated", line_start: nil, line_end: nil) ⇒ Integer
Record a provenance link between a fact and its source evidence.
-
#insert_tool_calls(content_item_id, tool_calls_data) ⇒ void
Bulk-insert tool call records for a content item.
- #llm_cache ⇒ Sequel::Dataset
-
#llm_cache_key(operation, model, input) ⇒ String
Compute the cache key for an LLM operation.
-
#llm_cache_lookup(cache_key) ⇒ Hash?
Look up a cached LLM result by its cache key.
-
#llm_cache_prune(max_age_seconds: 604_800) ⇒ Integer
Delete LLM cache entries older than the given age.
-
#llm_cache_store(operation:, model:, input_hash:, result_json:, input_tokens: nil, output_tokens: nil) ⇒ void
Store or update a cached LLM result.
- #mcp_tool_calls ⇒ Sequel::Dataset
-
#open_conflicts ⇒ Array<Hash>
Retrieve all unresolved conflicts.
- #operation_progress ⇒ Sequel::Dataset
- #provenance ⇒ Sequel::Dataset
-
#provenance_for_fact(fact_id) ⇒ Array<Hash>
Retrieve all provenance records for a given fact.
-
#record_ingestion_metrics(content_item_id:, input_tokens:, output_tokens:, facts_extracted:) ⇒ Integer
Record token usage and extraction counts for a distillation run.
-
#reject_fact(fact_id, reason: nil) ⇒ Hash?
Reject a fact as incorrect (e.g. a distiller hallucination).
- #schema_health ⇒ Sequel::Dataset
-
#schema_version ⇒ Integer?
Current schema version stored in the meta table.
-
#set_meta(key, value) ⇒ void
Set a key-value pair in the meta table (upsert).
- #tool_calls ⇒ Sequel::Dataset
-
#tool_calls_for_content_item(content_item_id) ⇒ Array<Hash>
Retrieve tool calls for a content item, ordered by timestamp.
-
#undistilled_content_items(limit: 3, min_length: 200) ⇒ Array<Hash>
Fetch content items that have not yet been distilled, ordered newest first.
-
#update_delta_cursor(session_id, transcript_path, offset) ⇒ void
Create or update the byte-offset cursor for a session/transcript pair.
-
#update_fact(fact_id, status: nil, valid_to: nil, scope: nil, project_path: nil, embedding: nil) ⇒ Boolean
Selectively update one or more fields on a fact.
-
#update_fact_embedding(fact_id, embedding_vector) ⇒ void
Overwrite the embedding vector for a fact.
-
#upsert_content_item(source:, text_hash:, byte_len:, session_id: nil, transcript_path: nil, project_path: nil, occurred_at: nil, raw_text: nil, metadata: nil, git_branch: nil, cwd: nil, claude_version: nil, thinking_level: nil, source_mtime: nil) ⇒ Integer
Insert a content item or return the existing id if a duplicate (same text_hash + session_id) already exists.
-
#vector_index ⇒ Index::VectorIndex
Lazily-initialized vector index for semantic search.
Methods included from RetryHandler
#transaction_with_retry, #with_retry
Constructor Details
#initialize(db_path) ⇒ SQLiteStore
Open (or create) a SQLite database and migrate to the current schema.
# File 'lib/claude_memory/store/sqlite_store.rb', line 28

def initialize(db_path)
  @db_path = db_path
  @db = connect_database(db_path)
  ensure_schema!
end
Instance Attribute Details
#db ⇒ Sequel::Database (readonly)
Returns the underlying Sequel database connection.
# File 'lib/claude_memory/store/sqlite_store.rb', line 24

def db
  @db
end
Instance Method Details
#aggregate_ingestion_metrics ⇒ Hash?
Compute aggregate ingestion metrics across all distillation runs.
# File 'lib/claude_memory/store/sqlite_store.rb', line 529

def aggregate_ingestion_metrics
  # standard:disable Performance/Detect (Sequel DSL requires .select{}.first)
  result = ingestion_metrics
    .select {
      [
        sum(:input_tokens).as(:total_input),
        sum(:output_tokens).as(:total_output),
        sum(:facts_extracted).as(:total_facts),
        count(:id).as(:total_ops)
      ]
    }
    .first
  # standard:enable Performance/Detect

  return nil if result.nil? || result[:total_ops].to_i.zero?

  total_input = result[:total_input].to_i
  total_output = result[:total_output].to_i
  total_facts = result[:total_facts].to_i
  total_ops = result[:total_ops].to_i
  efficiency = total_input.zero? ? 0.0 : (total_facts.to_f / total_input * 1000).round(2)

  {
    total_input_tokens: total_input,
    total_output_tokens: total_output,
    total_facts_extracted: total_facts,
    total_operations: total_ops,
    avg_facts_per_1k_input_tokens: efficiency
  }
end
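The avg_facts_per_1k_input_tokens figure is a plain ratio over the summed columns. A minimal standalone sketch of the same arithmetic (the helper name is hypothetical, not part of the store):

```ruby
# Facts extracted per 1,000 input tokens, computed the same way as
# aggregate_ingestion_metrics (hypothetical standalone helper).
def facts_per_1k_tokens(total_facts, total_input)
  return 0.0 if total_input.zero?
  (total_facts.to_f / total_input * 1000).round(2)
end

puts facts_per_1k_tokens(42, 15_000)  # => 2.8
puts facts_per_1k_tokens(5, 0)        # => 0.0 (guards against empty runs)
```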
#backfill_distillation_metrics! ⇒ Integer
Mark all undistilled content items as distilled with zero token counts. Used for backfilling legacy content that predates the metrics table.
# File 'lib/claude_memory/store/sqlite_store.rb', line 564

def backfill_distillation_metrics!
  undistilled_ids = content_items
    .left_join(:ingestion_metrics, content_item_id: :id)
    .where(Sequel[:ingestion_metrics][:id] => nil)
    .select_map(Sequel[:content_items][:id])
  return 0 if undistilled_ids.empty?

  now = Time.now.utc.iso8601
  undistilled_ids.each do |cid|
    ingestion_metrics.insert(
      content_item_id: cid,
      input_tokens: 0,
      output_tokens: 0,
      facts_extracted: 0,
      created_at: now
    )
  end
  undistilled_ids.size
end
#checkpoint_wal ⇒ void
This method returns an undefined value.
Checkpoint the WAL file to prevent unlimited growth.
# File 'lib/claude_memory/store/sqlite_store.rb', line 49

def checkpoint_wal
  @db.run("PRAGMA wal_checkpoint(TRUNCATE)")
end
#close ⇒ void
This method returns an undefined value.
Disconnect from the database.
# File 'lib/claude_memory/store/sqlite_store.rb', line 37

def close
  @db.disconnect
end
#conflicts ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 84

def conflicts = @db[:conflicts]
#content_item_by_transcript_and_mtime(transcript_path, mtime_iso8601) ⇒ Hash?
Find a content item by transcript path and source modification time.
# File 'lib/claude_memory/store/sqlite_store.rb', line 185

def content_item_by_transcript_and_mtime(transcript_path, mtime_iso8601)
  content_items
    .where(transcript_path: transcript_path, source_mtime: mtime_iso8601)
    .first
end
#content_items ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 63

def content_items = @db[:content_items]
#count_undistilled(min_length: 200) ⇒ Integer
Count content items that have not yet been distilled.
# File 'lib/claude_memory/store/sqlite_store.rb', line 503

def count_undistilled(min_length: 200)
  content_items
    .left_join(:ingestion_metrics, content_item_id: :id)
    .where(Sequel[:ingestion_metrics][:id] => nil)
    .where { byte_len >= min_length }
    .count
end
#delta_cursors ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 66

def delta_cursors = @db[:delta_cursors]
#entities ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 69

def entities = @db[:entities]
#entity_aliases ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 72

def entity_aliases = @db[:entity_aliases]
#fact_links ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 81

def fact_links = @db[:fact_links]
#facts ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 75

def facts = @db[:facts]
#facts_for_slot(subject_entity_id, predicate, status: "active") ⇒ Array<Hash>
Find all facts for a given subject + predicate combination (a “slot”). Used by the resolver to detect supersession and conflicts.
# File 'lib/claude_memory/store/sqlite_store.rb', line 408

def facts_for_slot(subject_entity_id, predicate, status: "active")
  facts
    .where(subject_entity_id: subject_entity_id, predicate: predicate, status: status)
    .select(:id, :subject_entity_id, :predicate, :object_entity_id, :object_literal,
      :datatype, :polarity, :valid_from, :valid_to, :status, :confidence,
      :created_from, :created_at)
    .all
end
#facts_with_embeddings(limit: 1000) ⇒ Array<Hash>
Retrieve active facts that have stored embeddings.
# File 'lib/claude_memory/store/sqlite_store.rb', line 393

def facts_with_embeddings(limit: 1000)
  facts
    .where(Sequel.~(embedding_json: nil))
    .where(status: "active")
    .select(:id, :subject_entity_id, :predicate, :object_literal, :embedding_json, :scope)
    .limit(limit)
    .all
end
#find_fact_by_docid(docid) ⇒ Hash?
Look up a fact by its short document identifier.
# File 'lib/claude_memory/store/sqlite_store.rb', line 311

def find_fact_by_docid(docid)
  facts.where(docid: docid).first
end
#find_or_create_entity(type:, name:) ⇒ Integer
Find an entity by its slug or create a new one.
# File 'lib/claude_memory/store/sqlite_store.rb', line 259

def find_or_create_entity(type:, name:)
  slug = slugify(type, name)
  existing = entities.where(slug: slug).get(:id)
  return existing if existing

  now = Time.now.utc.iso8601
  entities.insert(type: type, canonical_name: name, slug: slug, created_at: now)
end
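The slugify helper is private and its implementation is not shown on this page. A plausible sketch, assuming slugs normalize the name and are namespaced by type so identically named entities of different types do not collide (entirely hypothetical, for illustration only):

```ruby
# Hypothetical slug derivation: downcase, collapse non-alphanumerics
# to hyphens, and prefix with the entity type so that
# "person"/"Alice" and "project"/"Alice" yield distinct slugs.
def slugify(type, name)
  normalized = name.to_s.downcase.strip
    .gsub(/[^a-z0-9]+/, "-")
    .gsub(/\A-|-\z/, "")
  "#{type}:#{normalized}"
end

puts slugify("person", "Alice  Smith")  # => "person:alice-smith"
puts slugify("project", "My_Repo!")     # => "project:my-repo"
```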
#get_content_item(id) ⇒ Hash?
Fetch a single content item by primary key.
# File 'lib/claude_memory/store/sqlite_store.rb', line 177

def get_content_item(id)
  content_items.where(id: id).first
end
#get_delta_cursor(session_id, transcript_path) ⇒ Integer?
Get the last-read byte offset for a session/transcript pair.
# File 'lib/claude_memory/store/sqlite_store.rb', line 229

def get_delta_cursor(session_id, transcript_path)
  delta_cursors.where(session_id: session_id, transcript_path: transcript_path).get(:last_byte_offset)
end
#get_meta(key) ⇒ String?
Retrieve a value from the meta table.
# File 'lib/claude_memory/store/sqlite_store.rb', line 656

def get_meta(key)
  @db[:meta].where(key: key).get(:value)
end
#ingestion_metrics ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 96

def ingestion_metrics = @db[:ingestion_metrics]
#insert_conflict(fact_a_id:, fact_b_id:, status: "open", notes: nil) ⇒ Integer
Record a conflict between two facts.
# File 'lib/claude_memory/store/sqlite_store.rb', line 457

def insert_conflict(fact_a_id:, fact_b_id:, status: "open", notes: nil)
  now = Time.now.utc.iso8601
  conflicts.insert(
    fact_a_id: fact_a_id,
    fact_b_id: fact_b_id,
    status: status,
    detected_at: now,
    notes: notes
  )
end
#insert_fact(subject_entity_id:, predicate:, object_entity_id: nil, object_literal: nil, datatype: nil, polarity: "positive", valid_from: nil, status: "active", confidence: 1.0, created_from: nil, scope: "project", project_path: nil) ⇒ Integer
Insert a new fact (subject-predicate-object triple) with an auto-generated docid.
# File 'lib/claude_memory/store/sqlite_store.rb', line 285

def insert_fact(subject_entity_id:, predicate:, object_entity_id: nil, object_literal: nil,
  datatype: nil, polarity: "positive", valid_from: nil, status: "active",
  confidence: 1.0, created_from: nil, scope: "project", project_path: nil)
  now = Time.now.utc.iso8601
  docid = generate_docid(subject_entity_id, predicate, object_literal, now)
  facts.insert(
    subject_entity_id: subject_entity_id,
    predicate: predicate,
    object_entity_id: object_entity_id,
    object_literal: object_literal,
    datatype: datatype,
    polarity: polarity,
    valid_from: valid_from || now,
    status: status,
    confidence: confidence,
    created_from: created_from,
    created_at: now,
    scope: scope,
    project_path: project_path,
    docid: docid
  )
end
#insert_fact_link(from_fact_id:, to_fact_id:, link_type:) ⇒ Integer
Create a directional link between two facts (e.g. supersession).
# File 'lib/claude_memory/store/sqlite_store.rb', line 479

def insert_fact_link(from_fact_id:, to_fact_id:, link_type:)
  fact_links.insert(from_fact_id: from_fact_id, to_fact_id: to_fact_id, link_type: link_type)
end
#insert_mcp_tool_call(tool_name:, duration_ms:, result_count: nil, scope: nil, error_class: nil, called_at: nil) ⇒ Integer
Record a single MCP tool invocation for telemetry. Inserts synchronously; callers wrap in with_retry at the call site if needed.
# File 'lib/claude_memory/store/sqlite_store.rb', line 115

def insert_mcp_tool_call(tool_name:, duration_ms:, result_count: nil, scope: nil, error_class: nil, called_at: nil)
  mcp_tool_calls.insert(
    tool_name: tool_name,
    called_at: called_at || Time.now.utc.iso8601,
    duration_ms: duration_ms,
    result_count: result_count,
    scope: scope,
    error_class: error_class
  )
end
#insert_provenance(fact_id:, content_item_id: nil, quote: nil, attribution_entity_id: nil, strength: "stated", line_start: nil, line_end: nil) ⇒ Integer
Record a provenance link between a fact and its source evidence.
# File 'lib/claude_memory/store/sqlite_store.rb', line 429

def insert_provenance(fact_id:, content_item_id: nil, quote: nil, attribution_entity_id: nil, strength: "stated", line_start: nil, line_end: nil)
  provenance.insert(
    fact_id: fact_id,
    content_item_id: content_item_id,
    quote: quote,
    attribution_entity_id: attribution_entity_id,
    strength: strength,
    line_start: line_start,
    line_end: line_end
  )
end
#insert_tool_calls(content_item_id, tool_calls_data) ⇒ void
This method returns an undefined value.
Bulk-insert tool call records for a content item.
# File 'lib/claude_memory/store/sqlite_store.rb', line 199

def insert_tool_calls(content_item_id, tool_calls_data)
  tool_calls_data.each do |tc|
    tool_calls.insert(
      content_item_id: content_item_id,
      tool_name: tc[:tool_name],
      tool_input: tc[:tool_input],
      tool_result: tc[:tool_result],
      compressed_summary: tc[:compressed_summary],
      is_error: tc[:is_error] || false,
      timestamp: tc[:timestamp]
    )
  end
end
#llm_cache ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 99

def llm_cache = @db[:llm_cache]
#llm_cache_key(operation, model, input) ⇒ String
Compute the cache key for an LLM operation.
# File 'lib/claude_memory/store/sqlite_store.rb', line 630

def llm_cache_key(operation, model, input)
  input_hash = Digest::SHA256.hexdigest(input)
  Digest::SHA256.hexdigest("#{operation}:#{model}:#{input_hash}")
end
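The key is a SHA-256 over the operation, model, and a SHA-256 of the raw input, so hashing the (possibly large) input first keeps the outer digest input small while identical inputs still map to the same cache row. The same derivation as a self-contained sketch (standalone function, not the store method):

```ruby
require "digest"

# Two-level hashing, as in llm_cache_key: digest the input first,
# then digest the operation/model/input-hash triple.
def cache_key_for(operation, model, input)
  input_hash = Digest::SHA256.hexdigest(input)
  Digest::SHA256.hexdigest("#{operation}:#{model}:#{input_hash}")
end

k1 = cache_key_for("distill", "claude-sonnet", "some transcript text")
k2 = cache_key_for("distill", "claude-sonnet", "some transcript text")
puts k1 == k2   # => true (deterministic: safe to use as a cache key)
puts k1.length  # => 64 (hex-encoded SHA-256)
```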
#llm_cache_lookup(cache_key) ⇒ Hash?
Look up a cached LLM result by its cache key.
# File 'lib/claude_memory/store/sqlite_store.rb', line 591

def llm_cache_lookup(cache_key)
  llm_cache.where(cache_key: cache_key).first
end
#llm_cache_prune(max_age_seconds: 604_800) ⇒ Integer
Delete LLM cache entries older than the given age.
# File 'lib/claude_memory/store/sqlite_store.rb', line 638

def llm_cache_prune(max_age_seconds: 604_800)
  cutoff = (Time.now - max_age_seconds).utc.iso8601
  llm_cache.where { created_at < cutoff }.delete
end
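The default of 604_800 seconds is seven days. The comparison works on timestamp strings because ISO 8601 UTC timestamps sort lexicographically in time order. A sketch of the cutoff computation (helper name hypothetical; the `now:` parameter is added here for determinism):

```ruby
require "time"

# Compute the same cutoff llm_cache_prune uses: cache rows whose
# created_at is earlier than this ISO 8601 string get deleted.
def prune_cutoff(max_age_seconds: 604_800, now: Time.now)
  (now - max_age_seconds).utc.iso8601
end

cutoff = prune_cutoff(now: Time.utc(2024, 6, 8))
puts cutoff  # => "2024-06-01T00:00:00Z"

# ISO 8601 UTC strings compare lexicographically in time order,
# so a plain string comparison matches the SQL `created_at < cutoff`.
puts "2024-05-30T12:00:00Z" < cutoff  # => true  (older than a week: pruned)
puts "2024-06-05T12:00:00Z" < cutoff  # => false (recent: kept)
```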
#llm_cache_store(operation:, model:, input_hash:, result_json:, input_tokens: nil, output_tokens: nil) ⇒ void
This method returns an undefined value.
Store or update a cached LLM result. Uses upsert on the cache_key.
# File 'lib/claude_memory/store/sqlite_store.rb', line 603

def llm_cache_store(operation:, model:, input_hash:, result_json:, input_tokens: nil, output_tokens: nil)
  cache_key = Digest::SHA256.hexdigest("#{operation}:#{model}:#{input_hash}")
  llm_cache
    .insert_conflict(target: :cache_key, update: {
      result_json: result_json,
      input_tokens: input_tokens,
      output_tokens: output_tokens,
      created_at: Time.now.utc.iso8601
    })
    .insert(
      cache_key: cache_key,
      operation: operation,
      model: model,
      input_hash: input_hash,
      result_json: result_json,
      input_tokens: input_tokens,
      output_tokens: output_tokens,
      created_at: Time.now.utc.iso8601
    )
end
#mcp_tool_calls ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 102

def mcp_tool_calls = @db[:mcp_tool_calls]
#open_conflicts ⇒ Array<Hash>
Retrieve all unresolved conflicts.
# File 'lib/claude_memory/store/sqlite_store.rb', line 470

def open_conflicts
  conflicts.where(status: "open").all
end
#operation_progress ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 90

def operation_progress = @db[:operation_progress]
#provenance ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 78

def provenance = @db[:provenance]
#provenance_for_fact(fact_id) ⇒ Array<Hash>
Retrieve all provenance records for a given fact.
# File 'lib/claude_memory/store/sqlite_store.rb', line 445

def provenance_for_fact(fact_id)
  provenance.where(fact_id: fact_id).all
end
#record_ingestion_metrics(content_item_id:, input_tokens:, output_tokens:, facts_extracted:) ⇒ Integer
Record token usage and extraction counts for a distillation run.
# File 'lib/claude_memory/store/sqlite_store.rb', line 517

def record_ingestion_metrics(content_item_id:, input_tokens:, output_tokens:, facts_extracted:)
  ingestion_metrics.insert(
    content_item_id: content_item_id,
    input_tokens: input_tokens,
    output_tokens: output_tokens,
    facts_extracted: facts_extracted,
    created_at: Time.now.utc.iso8601
  )
end
#reject_fact(fact_id, reason: nil) ⇒ Hash?
Reject a fact as incorrect (e.g. a distiller hallucination). Sets status to “rejected”, closes any open conflicts involving the fact, and records the reason in conflict notes when provided. All updates run in a single transaction.
# File 'lib/claude_memory/store/sqlite_store.rb', line 363

def reject_fact(fact_id, reason: nil)
  row = facts.where(id: fact_id).first
  return nil unless row

  now = Time.now.utc.iso8601
  resolved = 0
  @db.transaction do
    facts.where(id: fact_id).update(status: "rejected", valid_to: now)

    open_conflict_rows = conflicts
      .where(status: "open")
      .where { (fact_a_id =~ fact_id) | (fact_b_id =~ fact_id) }
      .all

    open_conflict_rows.each do |conflict|
      suffix = reason ? " | resolved: rejected fact #{fact_id} (#{reason})" : " | resolved: rejected fact #{fact_id}"
      notes = "#{conflict[:notes]}#{suffix}"
      conflicts.where(id: conflict[:id]).update(status: "resolved", notes: notes)
    end
    resolved = open_conflict_rows.size
  end

  {rejected: true, conflicts_resolved: resolved}
end
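The conflict-note annotation is plain string appending; nil notes simply produce an empty prefix. A standalone sketch of the same formatting step (helper name hypothetical):

```ruby
# Append the resolution marker reject_fact writes into conflict notes.
# A nil existing note interpolates as "", so the suffix stands alone.
def resolution_note(existing_notes, fact_id, reason = nil)
  suffix = reason ? " | resolved: rejected fact #{fact_id} (#{reason})" : " | resolved: rejected fact #{fact_id}"
  "#{existing_notes}#{suffix}"
end

puts resolution_note("duplicate slot", 17, "distiller hallucination")
# => "duplicate slot | resolved: rejected fact 17 (distiller hallucination)"
```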
#schema_health ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 93

def schema_health = @db[:schema_health]
#schema_version ⇒ Integer?
Current schema version stored in the meta table.
# File 'lib/claude_memory/store/sqlite_store.rb', line 55

def schema_version
  @db[:meta].where(key: "schema_version").get(:value)&.to_i
end
#set_meta(key, value) ⇒ void
This method returns an undefined value.
Set a key-value pair in the meta table (upsert).
# File 'lib/claude_memory/store/sqlite_store.rb', line 649

def set_meta(key, value)
  @db[:meta].insert_conflict(target: :key, update: {value: value}).insert(key: key, value: value)
end
#tool_calls ⇒ Sequel::Dataset
# File 'lib/claude_memory/store/sqlite_store.rb', line 87

def tool_calls = @db[:tool_calls]
#tool_calls_for_content_item(content_item_id) ⇒ Array<Hash>
Retrieve tool calls for a content item, ordered by timestamp.
# File 'lib/claude_memory/store/sqlite_store.rb', line 216

def tool_calls_for_content_item(content_item_id)
  tool_calls
    .where(content_item_id: content_item_id)
    .order(:timestamp)
    .all
end
#undistilled_content_items(limit: 3, min_length: 200) ⇒ Array<Hash>
Fetch content items that have not yet been distilled, ordered newest first.
# File 'lib/claude_memory/store/sqlite_store.rb', line 489

def undistilled_content_items(limit: 3, min_length: 200)
  content_items
    .left_join(:ingestion_metrics, content_item_id: :id)
    .where(Sequel[:ingestion_metrics][:id] => nil)
    .where { byte_len >= min_length }
    .order(Sequel.desc(:occurred_at))
    .limit(limit)
    .select_all(:content_items)
    .all
end
#update_delta_cursor(session_id, transcript_path, offset) ⇒ void
This method returns an undefined value.
Create or update the byte-offset cursor for a session/transcript pair.
# File 'lib/claude_memory/store/sqlite_store.rb', line 238

def update_delta_cursor(session_id, transcript_path, offset)
  now = Time.now.utc.iso8601
  delta_cursors
    .insert_conflict(
      target: [:session_id, :transcript_path],
      update: {last_byte_offset: offset, updated_at: now}
    )
    .insert(
      session_id: session_id,
      transcript_path: transcript_path,
      last_byte_offset: offset,
      updated_at: now
    )
end
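The insert_conflict call gives upsert semantics on the composite (session_id, transcript_path) key: the first write inserts a row, later writes for the same pair overwrite the offset. Those semantics can be modeled with a plain Hash (a sketch of the behavior, not the Sequel call; paths and ids are made up):

```ruby
# Model the delta-cursor upsert: exactly one byte offset per
# [session_id, transcript_path] pair, overwritten on conflict.
def update_cursor(cursors, session_id, transcript_path, offset)
  cursors[[session_id, transcript_path]] = offset
end

cursors = {}
update_cursor(cursors, "sess-1", "/tmp/a.jsonl", 1024)
update_cursor(cursors, "sess-1", "/tmp/a.jsonl", 4096)  # same pair: updated in place
update_cursor(cursors, "sess-2", "/tmp/a.jsonl", 512)   # new pair: inserted

puts cursors[["sess-1", "/tmp/a.jsonl"]]  # => 4096
puts cursors.size                         # => 2
```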
#update_fact(fact_id, status: nil, valid_to: nil, scope: nil, project_path: nil, embedding: nil) ⇒ Boolean
Selectively update one or more fields on a fact. Only provided (non-nil) keyword arguments are written. Setting scope to “global” automatically clears project_path.
# File 'lib/claude_memory/store/sqlite_store.rb', line 326

def update_fact(fact_id, status: nil, valid_to: nil, scope: nil, project_path: nil, embedding: nil)
  updates = {}
  updates[:status] = status if status
  updates[:valid_to] = valid_to if valid_to
  if scope
    updates[:scope] = scope
    updates[:project_path] = (scope == "global") ? nil : project_path
  end
  if embedding
    updates[:embedding_json] = embedding.to_json
  end
  return false if updates.empty?

  facts.where(id: fact_id).update(updates)
  true
end
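The scope handling means promoting a fact to "global" also clears its project_path, and omitted keywords never touch their columns. A standalone sketch of just the hash-building step, matching the documented behavior (helper name hypothetical):

```ruby
require "json"

# Build the column updates the way update_fact does: only non-nil
# keywords are written, and scope "global" forces project_path to nil.
def fact_updates(status: nil, valid_to: nil, scope: nil, project_path: nil, embedding: nil)
  updates = {}
  updates[:status] = status if status
  updates[:valid_to] = valid_to if valid_to
  if scope
    updates[:scope] = scope
    updates[:project_path] = (scope == "global") ? nil : project_path
  end
  updates[:embedding_json] = embedding.to_json if embedding
  updates
end

puts fact_updates(scope: "global", project_path: "/repo").inspect
# scope is kept, project_path cleared to nil
puts fact_updates(embedding: [0.1, 0.2]).inspect
# only embedding_json is written
```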
#update_fact_embedding(fact_id, embedding_vector) ⇒ void
This method returns an undefined value.
Overwrite the embedding vector for a fact.
# File 'lib/claude_memory/store/sqlite_store.rb', line 350

def update_fact_embedding(fact_id, embedding_vector)
  facts.where(id: fact_id).update(embedding_json: embedding_vector.to_json)
end
#upsert_content_item(source:, text_hash:, byte_len:, session_id: nil, transcript_path: nil, project_path: nil, occurred_at: nil, raw_text: nil, metadata: nil, git_branch: nil, cwd: nil, claude_version: nil, thinking_level: nil, source_mtime: nil) ⇒ Integer
Insert a content item or return the existing id if a duplicate (same text_hash + session_id) already exists. Wrapped in retry logic.
# File 'lib/claude_memory/store/sqlite_store.rb', line 146

def upsert_content_item(source:, text_hash:, byte_len:, session_id: nil, transcript_path: nil,
  project_path: nil, occurred_at: nil, raw_text: nil, metadata: nil, git_branch: nil,
  cwd: nil, claude_version: nil, thinking_level: nil, source_mtime: nil)
  with_retry("upsert_content_item") do
    existing = content_items.where(text_hash: text_hash, session_id: session_id).get(:id)
    return existing if existing

    now = Time.now.utc.iso8601
    content_items.insert(
      source: source,
      session_id: session_id,
      transcript_path: transcript_path,
      project_path: project_path,
      occurred_at: occurred_at || now,
      ingested_at: now,
      text_hash: text_hash,
      byte_len: byte_len,
      raw_text: raw_text,
      metadata_json: metadata&.to_json,
      git_branch: git_branch,
      cwd: cwd,
      claude_version: claude_version,
      thinking_level: thinking_level,
      source_mtime: source_mtime
    )
  end
end
#vector_index ⇒ Index::VectorIndex
Lazily-initialized vector index for semantic search.
# File 'lib/claude_memory/store/sqlite_store.rb', line 43

def vector_index
  @vector_index ||= Index::VectorIndex.new(self)
end