Class: SidekiqUniqueJobs::Digests
- Inherits: Redis::SortedSet
  - Object
  - Redis::Entity
  - Redis::SortedSet
  - SidekiqUniqueJobs::Digests
- Defined in: lib/sidekiq_unique_jobs/digests.rb
Overview
Class Digests provides access to the unique digests
Constant Summary collapse
- DEFAULT_COUNT = 1_000
  Returns the number of matches to return by default.
- SCAN_PATTERN = "*"
  Returns the default pattern to use for matching.
Instance Attribute Summary
Attributes inherited from Redis::Entity
Instance Method Summary collapse
-
#add(digest) ⇒ Object
Adds a digest.
-
#delete_by_digest(digest, **_opts) ⇒ Object
Delete a lock by its digest.
-
#delete_by_pattern(pattern, count: DEFAULT_COUNT) ⇒ Hash<String,Float>
Deletes unique digests by pattern.
-
#entries(pattern: SCAN_PATTERN, count: DEFAULT_COUNT) ⇒ Hash<String, Float>
The entries in this sorted set, keyed by digest with the score as value.
-
#initialize(digests_key = DIGESTS) ⇒ Digests
constructor
A new instance of Digests.
-
#page(cursor: 0, pattern: SCAN_PATTERN, page_size: 100) ⇒ Array<Integer, Integer, Array<Lock>>
Returns a page of entries: the total set size, the next cursor, and an array of Locks.
Methods inherited from Redis::SortedSet
#byscore, #clear, #count, #rank, #score
Methods inherited from Redis::Entity
#count, #exist?, #expires?, #pttl, #ttl
Methods included from Timing
clock_stamp, now_f, time_source, timed
Methods included from JSON
dump_json, load_json, safe_load_json
Methods included from Script::Caller
call_script, debug_lua, do_call, extract_args, max_history, normalize_argv, now_f, redis_version
Methods included from Logging
#build_message, included, #log_debug, #log_error, #log_fatal, #log_info, #log_warn, #logger, #logging_context, #with_configured_loggers_context, #with_logging_context
Constructor Details
#initialize(digests_key = DIGESTS) ⇒ Digests
Returns a new instance of Digests.
Instance Method Details
#add(digest) ⇒ Object
Adds a digest.

# File 'lib/sidekiq_unique_jobs/digests.rb', line 25

def add(digest)
  redis { |conn| conn.zadd(key, now_f, digest) }
end
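`add` stores the digest with the current epoch time (`now_f` from the `Timing` mixin) as its sorted-set score, so digests order chronologically. A minimal sketch of that scoring scheme, using a plain Ruby Hash in place of the Redis sorted set (illustrative only, not the gem's API; digest values are hypothetical):

```ruby
# Simulate ZADD key, score, member: the score is a Float epoch timestamp.
fake_zset = {}
fake_zset["uniquejobs:aaa"] = 1_700_000_000.1 # hypothetical digests/scores
fake_zset["uniquejobs:bbb"] = 1_700_000_000.3
fake_zset["uniquejobs:ccc"] = 1_700_000_000.2

# Sorted-set iteration order: ascending by score, i.e. insertion time.
ordered = fake_zset.sort_by { |_digest, score| score }.map(&:first)
puts ordered.inspect # oldest digest first
```

Because the score is a timestamp, range queries such as `#byscore` (inherited from Redis::SortedSet) can select digests by age.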
#delete_by_digest(digest, **_opts) ⇒ Object
Delete a lock by its digest.

# File 'lib/sidekiq_unique_jobs/digests.rb', line 49

def delete_by_digest(digest, **_opts)
  result, elapsed = timed do
    redis do |conn|
      conn.multi do |pipeline|
        pipeline.call("UNLINK", "#{digest}:LOCKED")
        pipeline.call("ZREM", key, digest)
      end
    end
  end

  log_info("#{__method__}(#{digest}) completed in #{elapsed}ms")

  result
end
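Two keys are removed atomically inside the `MULTI` block: the digest's companion key with a `:LOCKED` suffix (via `UNLINK`), and the digest's own entry in the digests sorted set (via `ZREM`). A sketch of the key naming, with a hypothetical digest value:

```ruby
digest = "uniquejobs:9e9b5ce5d419"  # hypothetical digest value
locked_key = "#{digest}:LOCKED"     # removed with UNLINK in the pipeline
# The digest string itself is then ZREMed from the digests sorted set.
puts locked_key
```

`UNLINK` is used rather than `DEL` so the actual memory reclamation happens asynchronously on the Redis side.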
#delete_by_pattern(pattern, count: DEFAULT_COUNT) ⇒ Hash<String,Float>
Deletes unique digests by pattern.

# File 'lib/sidekiq_unique_jobs/digests.rb', line 35

def delete_by_pattern(pattern, count: DEFAULT_COUNT)
  result, elapsed = timed do
    digests = entries(pattern: pattern, count: count).keys
    redis { |conn| BatchDelete.call(digests, conn) }
  end

  log_info("#{__method__}(#{pattern}, count: #{count}) completed in #{elapsed}ms")

  result
end
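The `pattern` argument is a Redis `SCAN`-style glob, not a regular expression. Ruby's `File.fnmatch` follows similar glob rules and can be used to preview which digests a pattern would select (a rough approximation; Redis's matcher differs in minor details, and the digest names here are hypothetical):

```ruby
digests = %w[
  uniquejobs:order-1
  uniquejobs:order-2
  uniquejobs:email-9
]

# "*" (the SCAN_PATTERN default) matches everything; a narrower glob selects a subset.
matched = digests.select { |digest| File.fnmatch("uniquejobs:order-*", digest) }
puts matched.inspect
```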
#entries(pattern: SCAN_PATTERN, count: DEFAULT_COUNT) ⇒ Hash<String, Float>
The entries in this sorted set, as a Hash of digest to score.

# File 'lib/sidekiq_unique_jobs/digests.rb', line 72

def entries(pattern: SCAN_PATTERN, count: DEFAULT_COUNT)
  redis { |conn| conn.zscan(key, match: pattern, count: count).to_a }.to_h
end
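In the code above, `zscan` called without an explicit cursor appears to enumerate `[member, score]` pairs, so `.to_a.to_h` yields a digest-to-score Hash (which is why `#delete_by_pattern` can call `.keys` on the result). The conversion step in isolation, with pair arrays standing in for the Redis reply:

```ruby
# Shape of the zscan enumeration: [member, score] pairs, scores as Floats.
pairs = [
  ["uniquejobs:aaa", 1_700_000_000.1], # hypothetical digest/score pairs
  ["uniquejobs:bbb", 1_700_000_000.2],
]

entries = pairs.to_h
puts entries["uniquejobs:aaa"]
```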
#page(cursor: 0, pattern: SCAN_PATTERN, page_size: 100) ⇒ Array<Integer, Integer, Array<Lock>>
Returns a page of entries: the total set size, the next cursor, and an array of Locks.

# File 'lib/sidekiq_unique_jobs/digests.rb', line 85

def page(cursor: 0, pattern: SCAN_PATTERN, page_size: 100)
  redis do |conn|
    total_size, digests = conn.multi do |pipeline|
      pipeline.zcard(key)
      pipeline.zscan(key, cursor, match: pattern, count: page_size)
    end

    # NOTE: When debugging, check the last item in the returned array.
    [
      total_size.to_i,
      digests[0].to_i, # next_cursor
      digests[1].each_slice(2).map { |digest, score| Lock.new(digest, time: score) }, # entries
    ]
  end
end
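Inside `MULTI`, the raw `ZSCAN` reply has the shape `[next_cursor, flat_array]`, where members and scores alternate in the flat array; hence the `each_slice(2)` pairing above. The reshaping step on its own, with a stand-in for the raw reply and plain pairs in place of `Lock.new` (values are hypothetical):

```ruby
# Raw ZSCAN reply in a pipeline: [cursor, [member, score, member, score, ...]]
raw = ["42", ["uniquejobs:aaa", "1700000000.1", "uniquejobs:bbb", "1700000000.2"]]

next_cursor = raw[0].to_i
entries = raw[1].each_slice(2).map { |digest, score| [digest, score.to_f] }

puts next_cursor
puts entries.inspect
```

A `next_cursor` of 0 signals that the scan is complete; any other value should be fed back into `page(cursor: ...)` to fetch the next page.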