Module: LlmGateway::Errors
- Defined in:
- lib/llm_gateway/errors.rb
Defined Under Namespace
Classes: APIConnectionError, APIStatusError, APITimeoutError, AuthenticationError, BadRequestError, BaseError, ClientError, ConflictError, HallucinationError, InternalServerError, InvalidResponseGrammar, MissingMapperForProvider, NotFoundError, OverloadError, PermissionDeniedError, PromptError, PromptTooLong, RateLimitError, UnknownError, UnknownModel, UnprocessableEntityError, UnsupportedModel, UnsupportedProvider
Constant Summary
- OVERFLOW_PATTERNS =
  [
    /prompt is too long/i,                     # Anthropic
    /exceeds the context window/i,             # OpenAI
    /reduce the length of the messages/i,      # Groq
    /maximum context length is \d+ tokens/i,
    /context[_ ]length[_ ]exceeded/i,
    /too many tokens/i,
    /token limit exceeded/i,
    /request too large.*tokens per min/i,      # OpenAI TPM wording
    /input tokens per minute/i,                # Anthropic TPM wording
    /reduce the prompt length/i,
    /input or output tokens must be reduced/i
  ].freeze
Class Method Summary
Class Method Details
.context_overflow_message?(message) ⇒ Boolean
# File 'lib/llm_gateway/errors.rb', line 48

def self.context_overflow_message?(message)
  text = message.to_s
  return false if text.empty?

  OVERFLOW_PATTERNS.any? { |pattern| pattern.match?(text) }
end
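A minimal self-contained usage sketch of the predicate. The module body below restates the method and a subset of `OVERFLOW_PATTERNS` from the source so the snippet runs without the gem installed; the error strings passed in are hypothetical, not verbatim provider output:

```ruby
# Sketch: restates context_overflow_message? and a pattern subset
# from lib/llm_gateway/errors.rb so this runs standalone.
module LlmGateway
  module Errors
    OVERFLOW_PATTERNS = [
      /prompt is too long/i,             # Anthropic
      /context[_ ]length[_ ]exceeded/i,
      /too many tokens/i
    ].freeze

    def self.context_overflow_message?(message)
      text = message.to_s
      return false if text.empty?

      OVERFLOW_PATTERNS.any? { |pattern| pattern.match?(text) }
    end
  end
end

# Hypothetical provider error strings, for demonstration only.
puts LlmGateway::Errors.context_overflow_message?("context_length_exceeded")  # true
puts LlmGateway::Errors.context_overflow_message?(nil)                        # false (nil.to_s is empty)
```

Because matching is case-insensitive and the method coerces its argument with `to_s`, it is safe to pass raw exception messages or `nil` directly.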