Class: Findbug::PerformanceEvent
- Inherits: ActiveRecord::Base
  - Object
  - ActiveRecord::Base
  - Findbug::PerformanceEvent
- Defined in: app/models/findbug/performance_event.rb
Overview
PerformanceEvent stores captured performance data in the database.
DATABASE SCHEMA
create_table :findbug_performance_events do |t|
t.string :transaction_name, null: false
t.string :transaction_type, default: 'request'
t.string :request_method
t.string :request_path
t.string :format
t.integer :status
t.float :duration_ms, null: false
t.float :db_time_ms, default: 0
t.float :view_time_ms, default: 0
t.integer :query_count, default: 0
t.column :slow_queries, <json_type>, default: []
t.column :n_plus_one_queries, <json_type>, default: []
t.boolean :has_n_plus_one, default: false
t.integer :view_count, default: 0
t.column :context, <json_type>, default: {}
t.string :environment
t.string :release_version
t.datetime :captured_at
t.timestamps
end
AGGREGATION STRATEGY
Unlike errors (which we group by fingerprint), we store every performance event individually. This allows:
- Percentile calculations (p50, p95, p99)
- Trend analysis over time
- Individual slow request investigation
For dashboards, we aggregate on read using SQL GROUP BY.
Constant Summary collapse
Transaction types:
- TYPE_REQUEST = "request"
- TYPE_CUSTOM = "custom"
- TYPE_JOB = "job"
Class Method Summary collapse
- .aggregate_for(transaction_name, since: 24.hours.ago) ⇒ Hash
  Aggregate stats for a transaction.
- .create_from_event(event_data) ⇒ PerformanceEvent
  Create a performance event from Redis data.
- .n_plus_one_hotspots(since: 24.hours.ago, limit: 10) ⇒ Array<Hash>
  Get the transactions with the most N+1 issues.
- .parse_captured_at(value) ⇒ Object
- .percentile(sorted_array, percentile) ⇒ Object
- .slowest_transactions(since: 24.hours.ago, limit: 10) ⇒ Array<Hash>
  Get the slowest transactions.
- .throughput_over_time(since: 24.hours.ago, interval: "hour") ⇒ Array<Hash>
  Get throughput over time (request counts per time bucket).
Class Method Details
.aggregate_for(transaction_name, since: 24.hours.ago) ⇒ Hash
Aggregate stats for a transaction
# File 'app/models/findbug/performance_event.rb', line 129

def self.aggregate_for(transaction_name, since: 24.hours.ago)
  events = where(transaction_name: transaction_name)
           .where("captured_at >= ?", since)

  return nil if events.empty?

  durations = events.pluck(:duration_ms).sort

  {
    transaction_name: transaction_name,
    count: events.count,
    avg_duration_ms: durations.sum / durations.size.to_f,
    min_duration_ms: durations.first,
    max_duration_ms: durations.last,
    p50_duration_ms: percentile(durations, 50),
    p95_duration_ms: percentile(durations, 95),
    p99_duration_ms: percentile(durations, 99),
    avg_query_count: events.average(:query_count).to_f.round(1),
    n_plus_one_count: events.where(has_n_plus_one: true).count
  }
end
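The shape of the returned hash is easiest to see without the database in the way. The sketch below mirrors the same computation over a plain in-memory array (the `percentile` helper is a standalone copy of the class method, and the sample durations are made up for illustration):

```ruby
# Standalone copy of the linear-interpolation percentile used by the model.
def percentile(sorted, pct)
  return 0 if sorted.empty?
  k = (pct / 100.0) * (sorted.length - 1)
  f, c = k.floor, k.ceil
  return sorted[f] if f == c
  sorted[f] + (k - f) * (sorted[c] - sorted[f])
end

# Hypothetical duration_ms values, already sorted as aggregate_for does.
durations = [100.0, 120.0, 150.0, 300.0, 900.0]

stats = {
  count: durations.size,
  avg_duration_ms: durations.sum / durations.size.to_f,  # 314.0
  min_duration_ms: durations.first,
  max_duration_ms: durations.last,
  p50_duration_ms: percentile(durations, 50),            # median: 150.0
  p95_duration_ms: percentile(durations, 95)             # interpolated near the max
}
```

Note how one 900 ms outlier drags the average to 314 ms while the median stays at 150 ms, which is why the dashboard leans on percentiles rather than averages.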
.create_from_event(event_data) ⇒ PerformanceEvent
Create a performance event from Redis data
# File 'app/models/findbug/performance_event.rb', line 100

def self.create_from_event(event_data)
  create!(
    transaction_name: event_data[:transaction_name],
    transaction_type: event_data[:transaction_type] || TYPE_REQUEST,
    request_method: event_data[:request_method],
    request_path: event_data[:request_path],
    format: event_data[:format],
    status: event_data[:status],
    duration_ms: event_data[:duration_ms],
    db_time_ms: event_data[:db_time_ms] || 0,
    view_time_ms: event_data[:view_time_ms] || 0,
    query_count: event_data[:query_count] || 0,
    slow_queries: event_data[:slow_queries] || [],
    n_plus_one_queries: event_data[:n_plus_one_queries] || [],
    has_n_plus_one: event_data[:has_n_plus_one] || false,
    view_count: event_data[:view_count] || 0,
    context: event_data[:context] || {},
    environment: event_data[:environment],
    release_version: event_data[:release],
    captured_at: parse_captured_at(event_data[:captured_at])
  )
end
.n_plus_one_hotspots(since: 24.hours.ago, limit: 10) ⇒ Array<Hash>
Get transactions with most N+1 issues
# File 'app/models/findbug/performance_event.rb', line 184

def self.n_plus_one_hotspots(since: 24.hours.ago, limit: 10)
  with_n_plus_one
    .where("captured_at >= ?", since)
    .group(:transaction_name)
    .select(
      "transaction_name",
      "COUNT(*) as occurrence_count",
      "AVG(query_count) as avg_queries"
    )
    .order("occurrence_count DESC")
    .limit(limit)
    .map do |row|
      {
        transaction_name: row.transaction_name,
        n_plus_one_count: row.occurrence_count,
        avg_queries: row.avg_queries.round(1)
      }
    end
end
.parse_captured_at(value) ⇒ Object
# File 'app/models/findbug/performance_event.rb', line 248

def self.parse_captured_at(value)
  case value
  when Time, DateTime
    value
  when String
    Time.parse(value)
  else
    Time.current
  end
end
.percentile(sorted_array, percentile) ⇒ Object
# File 'app/models/findbug/performance_event.rb', line 234

def self.percentile(sorted_array, percentile)
  return 0 if sorted_array.empty?

  k = (percentile / 100.0) * (sorted_array.length - 1)
  f = k.floor
  c = k.ceil

  if f == c
    sorted_array[f]
  else
    sorted_array[f] + (k - f) * (sorted_array[c] - sorted_array[f])
  end
end
.slowest_transactions(since: 24.hours.ago, limit: 10) ⇒ Array<Hash>
Get slowest transactions
# File 'app/models/findbug/performance_event.rb', line 157

def self.slowest_transactions(since: 24.hours.ago, limit: 10)
  where("captured_at >= ?", since)
    .group(:transaction_name)
    .select(
      "transaction_name",
      "AVG(duration_ms) as avg_duration",
      "MAX(duration_ms) as max_duration",
      "COUNT(*) as request_count"
    )
    .order("avg_duration DESC")
    .limit(limit)
    .map do |row|
      {
        transaction_name: row.transaction_name,
        avg_duration_ms: row.avg_duration.round(2),
        max_duration_ms: row.max_duration.round(2),
        count: row.request_count
      }
    end
end
.throughput_over_time(since: 24.hours.ago, interval: "hour") ⇒ Array<Hash>
Get throughput over time (request counts per time bucket, sized by interval)
# File 'app/models/findbug/performance_event.rb', line 210

def self.throughput_over_time(since: 24.hours.ago, interval: "hour")
  time_sql = Findbug::AdapterHelper.date_trunc_sql(interval, "captured_at")

  where("captured_at >= ?", since)
    .group(Arel.sql(time_sql))
    .select(
      Arel.sql("#{time_sql} as time_bucket"),
      "COUNT(*) as request_count",
      "AVG(duration_ms) as avg_duration"
    )
    .order(Arel.sql(time_sql))
    .map do |row|
      time = row.time_bucket
      time = Time.parse(time.to_s) unless time.respond_to?(:strftime)
      {
        time: time,
        count: row.request_count,
        avg_duration_ms: row.avg_duration&.round(2) || 0
      }
    end
end