Class: Factorix::Cache::S3
Overview
S3-based cache storage implementation.
Stores cache entries in AWS S3 under an automatically generated key prefix. TTL is tracked via custom metadata on each object. Supports distributed locking using conditional PUT operations.
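The TTL scheme works by storing an absolute expiry time, as a string of epoch seconds, in the object's user metadata, and comparing it to the clock on read. A minimal sketch of that check (the metadata key `"expires-at"` and the helper name are illustrative, not the class's actual private API):

```ruby
# Sketch of the TTL-via-metadata check: no expiry recorded means the entry
# never expires; otherwise compare the stored epoch seconds to the clock.
def expired_metadata?(metadata, now: Time.now)
  value = metadata["expires-at"]   # hypothetical metadata key name
  return false if value.nil?       # no TTL recorded: never expires

  now.to_i > Integer(value, 10)
end

fresh = {"expires-at" => (Time.now.to_i + 60).to_s}
stale = {"expires-at" => (Time.now.to_i - 60).to_s}
expired_metadata?(fresh)  # => false
expired_metadata?(stale)  # => true
```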
Constant Summary

- DEFAULT_LOCK_TIMEOUT = 30
  Default timeout, in seconds, for distributed lock acquisition.
Instance Attribute Summary
Attributes inherited from Base
Instance Method Summary

- #age(key) ⇒ Float?
  Get the age of a cache entry in seconds.
- #backend_info ⇒ Hash
  Return backend-specific information.
- #clear ⇒ void
  Clear all cache entries in this prefix.
- #delete(key) ⇒ Boolean
  Delete a cache entry.
- #each {|key, entry| ... } ⇒ Enumerator
  Enumerate cache entries.
- #exist?(key) ⇒ Boolean
  Check if a cache entry exists and is not expired.
- #expired?(key) ⇒ Boolean
  Check if a cache entry has expired based on TTL.
- #initialize(bucket:, cache_type:, region: nil, lock_timeout: DEFAULT_LOCK_TIMEOUT) ⇒ S3 (constructor)
  Initialize a new S3 cache storage.
- #read(key) ⇒ String?
  Read a cached entry.
- #size(key) ⇒ Integer?
  Get the size of a cached entry in bytes.
- #store(key, src) ⇒ Boolean
  Store data in the cache.
- #with_lock(key) { ... } ⇒ Object
  Execute a block with a distributed lock.
- #write_to(key, output) ⇒ Boolean
  Write cached content to a file.
Constructor Details
#initialize(bucket:, cache_type:, region: nil, lock_timeout: DEFAULT_LOCK_TIMEOUT) ⇒ S3
Initialize a new S3 cache storage.
```ruby
# File 'lib/factorix/cache/s3.rb', line 55

def initialize(bucket:, cache_type:, region: nil, lock_timeout: DEFAULT_LOCK_TIMEOUT, **)
  super(**)
  @client = Aws::S3::Client.new(**{region:}.compact)
  @bucket = bucket
  @prefix = "cache/#{cache_type}/"
  @lock_timeout = lock_timeout
  logger.info("Initializing S3 cache", bucket: @bucket, prefix: @prefix, ttl: @ttl, lock_timeout: @lock_timeout)
end
```
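As the constructor shows, the key prefix is derived from `cache_type` alone, so each cache type gets its own namespace under `cache/` in the bucket. A sketch of that computation (the `"downloads"` cache type is an illustrative value):

```ruby
# Mirrors the @prefix computation in the constructor.
def cache_prefix(cache_type)
  "cache/#{cache_type}/"
end

cache_prefix("downloads")  # => "cache/downloads/"
```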
Instance Method Details
#age(key) ⇒ Float?
Get the age of a cache entry in seconds.
```ruby
# File 'lib/factorix/cache/s3.rb', line 157

def age(key)
  resp = head_object(key)
  Time.now - resp.last_modified
rescue Aws::S3::Errors::NotFound
  nil
end
```
#backend_info ⇒ Hash
Return backend-specific information.
```ruby
# File 'lib/factorix/cache/s3.rb', line 248

def backend_info
  {
    type: "s3",
    bucket: @bucket,
    prefix: @prefix,
    lock_timeout: @lock_timeout
  }
end
```
#clear ⇒ void
This method returns an undefined value.
Clear all cache entries in this prefix.
```ruby
# File 'lib/factorix/cache/s3.rb', line 138

def clear
  logger.info("Clearing S3 cache prefix", bucket: @bucket, prefix: @prefix)
  count = 0
  list_all_objects do |objects|
    keys_to_delete = objects.filter_map {|obj| {key: obj.key} unless obj.key.end_with?(".lock") }
    next if keys_to_delete.empty?

    @client.delete_objects(bucket: @bucket, delete: {objects: keys_to_delete})
    count += keys_to_delete.size
  end
  logger.info("Cache cleared", objects_removed: count)
end
```
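Note that #clear deletes everything under the prefix except lock objects, which are filtered out by their `.lock` suffix. A small sketch of that selection using a plain struct in place of the SDK's object summaries:

```ruby
# Stand-in for an S3 object summary; only the key is needed here.
Obj = Struct.new(:key)

# Select deletable keys the way #clear does, skipping lock objects.
def deletable_keys(objects)
  objects.filter_map {|obj| {key: obj.key} unless obj.key.end_with?(".lock") }
end

objs = [Obj.new("cache/mods/a"), Obj.new("cache/mods/a.lock"), Obj.new("cache/mods/b")]
deletable_keys(objs)  # => [{key: "cache/mods/a"}, {key: "cache/mods/b"}]
```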
#delete(key) ⇒ Boolean
Delete a cache entry.
```ruby
# File 'lib/factorix/cache/s3.rb', line 127

def delete(key)
  return false unless exist_without_expiry_check?(key)

  @client.delete_object(bucket: @bucket, key: storage_key(key))
  logger.debug("Deleted from cache", key:)
  true
end
```
#each {|key, entry| ... } ⇒ Enumerator
Enumerate cache entries.
```ruby
# File 'lib/factorix/cache/s3.rb', line 230

def each
  return enum_for(__method__) unless block_given?

  list_all_objects do |objects|
    objects.each do |obj|
      next if obj.key.end_with?(".lock")

      # Method name reconstructed (lost in extraction): a private helper that
      # derives the logical key and entry from the object's metadata.
      logical_key, entry = metadata(obj)
      next if logical_key.nil? # Skip entries without logical key metadata

      yield logical_key, entry
    end
  end
end
```
#exist?(key) ⇒ Boolean
Check if a cache entry exists and is not expired.
```ruby
# File 'lib/factorix/cache/s3.rb', line 68

def exist?(key)
  head_object(key)
  !expired?(key)
rescue Aws::S3::Errors::NotFound
  false
end
```
#expired?(key) ⇒ Boolean
Check if a cache entry has expired based on TTL.
```ruby
# File 'lib/factorix/cache/s3.rb', line 168

def expired?(key)
  return false if @ttl.nil?

  resp = head_object(key)
  value = resp.metadata[EXPIRES_AT_KEY]
  return false if value.nil?

  Time.now.to_i > Integer(value, 10)
rescue Aws::S3::Errors::NotFound
  true
end
```
#read(key) ⇒ String?
Read a cached entry.
```ruby
# File 'lib/factorix/cache/s3.rb', line 79

def read(key)
  return nil if expired?(key)

  resp = @client.get_object(bucket: @bucket, key: storage_key(key))
  resp.body.read
rescue Aws::S3::Errors::NoSuchKey
  nil
end
```
#size(key) ⇒ Integer?
Get the size of a cached entry in bytes.
```ruby
# File 'lib/factorix/cache/s3.rb', line 184

def size(key)
  return nil if expired?(key)

  resp = head_object(key)
  resp.content_length
rescue Aws::S3::Errors::NotFound
  nil
end
```
#store(key, src) ⇒ Boolean
Store data in the cache.
```ruby
# File 'lib/factorix/cache/s3.rb', line 108

def store(key, src)
  metadata = {LOGICAL_KEY_KEY => key}
  metadata[EXPIRES_AT_KEY] = (Time.now.to_i + @ttl).to_s if @ttl

  @client.put_object(
    bucket: @bucket,
    key: storage_key(key),
    body: src.binread,
    metadata:
  )
  logger.debug("Stored in cache", key:, size_bytes: src.size)
  true
end
```
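The metadata hash built in #store always records the logical key, and adds an absolute expiry only when a TTL is configured. A sketch of that construction (the string values of the two key constants are assumptions; the real constants are private to the class):

```ruby
# Assumed values for the class's private metadata-key constants.
LOGICAL_KEY_KEY = "logical-key"
EXPIRES_AT_KEY  = "expires-at"

# Build the user-metadata hash the way #store does: logical key always,
# expiry only when a TTL is set.
def build_metadata(key, ttl, now: Time.now)
  metadata = {LOGICAL_KEY_KEY => key}
  metadata[EXPIRES_AT_KEY] = (now.to_i + ttl).to_s if ttl
  metadata
end

build_metadata("mod-list", 3600, now: Time.at(0))
# => {"logical-key" => "mod-list", "expires-at" => "3600"}
build_metadata("mod-list", nil)
# => {"logical-key" => "mod-list"}
```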
#with_lock(key) { ... } ⇒ Object
Execute a block with a distributed lock. Uses conditional PUT for lock acquisition.
```ruby
# File 'lib/factorix/cache/s3.rb', line 199

def with_lock(key)
  lkey = lock_key(key)
  lock_value = SecureRandom.uuid
  deadline = Time.now + @lock_timeout

  loop do
    if try_acquire_lock(lkey, lock_value)
      logger.debug("Acquired lock", key:)
      break
    end

    cleanup_stale_lock(lkey)
    raise LockTimeoutError, "Failed to acquire lock for key: #{key}" if Time.now > deadline

    sleep 0.1
  end

  begin
    yield
  ensure
    @client.delete_object(bucket: @bucket, key: lkey)
    logger.debug("Released lock", key:)
  end
end
```
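The acquire loop above relies on a conditional PUT: the write succeeds only if the lock object does not already exist, so exactly one contender wins each round. A minimal in-memory sketch of that acquire/retry/deadline shape, with a hash standing in for S3 (all names here are illustrative, not the class's API):

```ruby
require "securerandom"

# In-memory stand-in for S3 conditional PUT: the write succeeds only when
# the key is absent, mimicking an If-None-Match: * precondition.
class FakeLockStore
  def initialize = @objects = {}

  def put_if_absent(key, value)
    return false if @objects.key?(key)

    @objects[key] = value
    true
  end

  def delete(key) = @objects.delete(key)
end

# Same shape as #with_lock: spin until the conditional PUT wins or the
# deadline passes, then always release the lock in an ensure block.
def with_fake_lock(store, key, timeout: 1)
  value = SecureRandom.uuid
  deadline = Time.now + timeout
  until store.put_if_absent(key, value)
    raise "lock timeout for #{key}" if Time.now > deadline

    sleep 0.01
  end
  begin
    yield
  ensure
    store.delete(key)
  end
end

store = FakeLockStore.new
with_fake_lock(store, "k.lock") { :work }  # => :work
store.put_if_absent("k.lock", "x")         # => true (lock was released)
```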
#write_to(key, output) ⇒ Boolean
Write cached content to a file.
```ruby
# File 'lib/factorix/cache/s3.rb', line 93

def write_to(key, output)
  return false if expired?(key)

  @client.get_object(bucket: @bucket, key: storage_key(key), response_target: output.to_s)
  logger.debug("Cache hit", key:)
  true
rescue Aws::S3::Errors::NoSuchKey
  false
end
```