Class: ActionAI::Agent
- Inherits: AbstractController::Base
  - Object
  - AbstractController::Base
  - ActionAI::Agent
- Includes:
- AbstractController::AssetPaths, AbstractController::Caching, AbstractController::Callbacks, AbstractController::Helpers, AbstractController::Logger, AbstractController::Rendering, AbstractController::Translation, Callbacks, Parameterized, Previews, QueuedExecution, Rescuable, ActionView::Layouts, Memery
- Defined in:
- lib/action_ai/agent.rb
Overview
Action AI Agent
Action AI allows you to send AI prompts from your application using an agent model and views.
Agent Models
To use Action AI, you need to create an agent model.
$ bin/rails generate ai:agent Generator
The generated model inherits from ApplicationAI which in turn inherits from ActionAI::Agent. An agent model defines methods used to generate an AI prompt. In these methods, you can set up variables to be used in the prompt views, options on the AI model used such as the :model ID, and attachments.
class ApplicationAI < ActionAI::Agent
  default model: 'gpt-4o-mini',
          provider: :openai
end
class Generator < ApplicationAI
  default model: 'gpt-4o'

  def code(task, language)
    @task = task
    @language = language
  end
end
Within the agent method, you have access to the following methods:
- attachments - Allows you to add attachments to your prompt in an intuitive manner: attachments << 'url_or_path/to/filename.png'
- ask - Allows you to specify a prompt to be sent. Like render in Action Controller, this is optional.
The ask method uses RubyLLM::Chat#ask under the hood and provides the same API, except that the prompt is optional. If the prompt is not specified, it is rendered from the view.
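To make the relationship between these two helpers concrete, here is a self-contained plain-Ruby sketch. FakeChat stands in for RubyLLM::Chat, and MiniAgent and rendered_prompt are illustrative names, not Action AI API:

```ruby
# Pure-Ruby sketch of the attachments/ask flow described above.
# FakeChat stands in for RubyLLM::Chat; MiniAgent and rendered_prompt
# are illustrative names, not real Action AI API.
class FakeChat
  def ask(prompt, with: [])
    "asked: #{prompt} (#{with.size} attachment(s))"
  end
end

class MiniAgent
  def attachments = @attachments ||= []

  # Like Action AI's #ask: the prompt falls back to the rendered view,
  # and attachments travel along via the with: keyword.
  def ask(prompt = rendered_prompt, with: attachments)
    FakeChat.new.ask(prompt, with: with)
  end

  private

  def rendered_prompt = "You are an expert developer."
end

agent = MiniAgent.new
agent.attachments << "spec.md"
puts agent.ask
# => "asked: You are an expert developer. (1 attachment(s))"
```

The point of the sketch is the default-argument trick: when no prompt is passed, the rendered view is used, mirroring how render is optional in Action Controller.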
Prompt views
Like Action Controller, each agent class has a corresponding view directory in which each method of the class looks for a template with its name.
To define a template to be used with an agent, create an .erb file with the same name as the method in your agent model. For example, in the agent defined above, the template at app/ai/prompts/generator/code.erb would be used to generate the prompt.
Variables defined in the methods of your agent model are accessible as instance variables in their corresponding view.
Prompts by default are sent in plain text, so a sample view for our model example might look like this:
You are an expert <%= @language.to_s.camelize %> developer.
Write clean, well-commented code to accomplish the following:
<%= @task %>
You can even use Action View helpers in these views. For example:
You are an expert <%= t @language, scope: 'languages' %> developer.
Write clean, well-commented code to accomplish the following:
<%= @task %>
If you need to access the AI provider or model in the view, you can do that through the chat object:
You are a <%= chat.model.name %> model, an expert <%= @language.to_s.camelize %> developer.
Write clean, well-commented code to accomplish the following:
<%= @task %>
Generating URLs
URLs can be generated in agent views using url_for or named routes. Unlike controllers from Action Pack, the agent instance doesn’t have any context about the incoming request, so you’ll need to provide all of the details needed to generate a URL.
When using url_for you’ll need to provide the :host, :controller, and :action:
<%= url_for(host: "example.com", controller: "projects", action: "show", id: @project.id) %>
When using named routes you only need to supply the :host:
<%= project_url(@project, host: "example.com") %>
You should use the named_route_url style (which generates absolute URLs) and avoid using the named_route_path style (which generates relative URLs), since a model reading the prompt will have no concept of a current URL from which to determine a relative path.
It is also possible to set a default host that will be used in all agents by setting the :host option as a configuration option in config/application.rb:
config.action_ai.default_url_options = { host: "example.com" }
You can also define a default_url_options method on individual agents to override these default settings per-agent.
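A per-agent override might look like the following sketch; the method body (host and protocol values) is illustrative, not taken from the library:

```ruby
class Generator < ApplicationAI
  # Hypothetical per-agent override of the application-wide defaults;
  # the host and protocol values shown are illustrative.
  def default_url_options
    { host: "staging.example.com", protocol: "https" }
  end
end
```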
By default when config.force_ssl is true, URLs generated for hosts will use the HTTPS protocol.
Sending prompts
Once an agent action and template are defined, you can chat with an AI model using your prompt or defer its creation and all the interactions for later:
Generator.code("Parse CSV", :ruby).content # returns the generated code
prompt = Generator.code("Parse CSV", :ruby) # => an ActionAI::Interaction object
prompt.run # generates and executes the prompt now
The ActionAI::Interaction class is a wrapper around a delegate that will call your method to generate the prompt. If you want direct access to the delegator, or RubyLLM::Message, you can call the message method on the ActionAI::Interaction object.
Generator.code("Parse CSV", :ruby).message # => a RubyLLM::Message object
Action AI is nicely integrated with Active Job, so you can generate and send prompts in the background (for example, outside of the request-response cycle, so the user doesn’t have to wait on it):
Generator.code("Parse CSV", :ruby).later # enqueue the AI processing to Active Job
Note that later will execute your method from the background job.
You never instantiate your agent class. Rather, you just call the method you defined on the class itself.
Attachments
Sending attachments with prompts is easy:
class Generator < ApplicationAI
  def code(task, language, spec_file = nil)
    @task = task
    @language = language
    attachments << spec_file if spec_file
  end
end
If you need to send attachments with no prompt, you need to create an empty view for it, or pass an empty prompt explicitly:
class Generator < ApplicationAI
  def code(spec_file)
    attachments << spec_file
    ask ""
  end
end
Default Hash
Action AI provides some intelligent defaults for your AI interactions. These are usually specified in a default method inside the class definition:
class Generator < ApplicationAI
  default model: 'gpt-4o'
end
You can pass in any config value that a RubyLLM::Chat accepts.
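As a sketch, extra chat settings can ride along in the same hash; whether a given key (like temperature below) is accepted depends on your RubyLLM version, so treat these values as assumptions:

```ruby
class Generator < ApplicationAI
  # Illustrative only: keys beyond model/provider depend on what your
  # RubyLLM::Chat version accepts.
  default model: 'gpt-4o',
          provider: :openai,
          temperature: 0.2
end
```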
Finally, Action AI also supports passing Proc and Lambda objects into the default hash, so you can define methods that evaluate as the message is being generated:
class Generator < ApplicationAI
  default model: -> { Current.user.preferred_model },
          api_key: proc { Current.user.ai_api_key }
end
Note that the proc/lambda is evaluated right at the start of the prompt generation, so if you set something in the default hash using a proc, and then set the same thing inside of your agent method, it will get overwritten by the agent method.
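That evaluation order can be sketched in plain Ruby; the hash names below are illustrative and this is not Action AI's actual implementation:

```ruby
# Illustrative sketch of the precedence described above, not Action AI's
# internals. Callable defaults are resolved first; anything set inside
# the agent method then wins on merge.
defaults = { model: -> { "gpt-4o-mini" }, provider: :openai }

resolved = defaults.transform_values { |v| v.respond_to?(:call) ? v.call : v }
# The agent method sets :model explicitly, overwriting the resolved default:
options = resolved.merge(model: "gpt-4o")

p options[:model] # => "gpt-4o"
```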
It is also possible to set these default options that will be used in all agents through the default_options= configuration in config/application.rb:
config.action_ai.default_options = { provider: :openai, model: "gpt-4o-mini" }
Callbacks
You can specify callbacks using before_action and after_action to manage your AI interactions, and using before_execution and after_execution for wrapping the prompt execution process. For example, when you want to add default attachments and log execution for all prompts executed by a certain agent class:
class Generator < ApplicationAI
  before_action :add_shared_context!
  after_execution :log_costs

  def code(task, language)
    @task = task
    @language = language
  end

  private

  def add_shared_context!
    @context = Rails.root.join('ARCHITECTURE.md').read
  end

  def log_costs
    Rails.logger.info "Generated code using #{message.input_tokens} input and #{message.output_tokens} output tokens."
  end
end
Action callbacks in Action AI Agent are implemented using AbstractController::Callbacks, so you can define and configure callbacks in the same manner that you would use callbacks in classes that inherit from ActionController::Base.
Note that unless you have a specific reason to do so, you should prefer using before_action rather than after_action in your Action AI Agent classes for setup.
Rescuing Errors
rescue blocks inside of an agent method cannot rescue errors that occur outside of rendering – for example, record deserialization errors in a background job.
To rescue errors that occur during any part of the AI interaction process, use rescue_from:
class Generator < ApplicationAI
  rescue_from RubyLLM::ApiQuotaExceededError do |error|
    Rails.logger.warn "API quota exceeded: #{error.message}"
  end

  def code(task, language)
    @task = task
    @language = language
  end
end
Previewing prompts
You can preview your prompt templates visually by adding a prompt preview file to the ActionAI::Agent.preview_paths. Since prompts may do something interesting with database data, you may need to write some scenarios to load messages with fake data:
class GeneratorPreview < ActionAI::Preview
  def code
    Generator.code("Sort an array efficiently", :ruby)
  end
end
Methods must return a RubyLLM::Message object, which can be generated by calling the agent method without the additional .content / .later call. The location of the agent preview directories can be configured using the preview_paths option, which defaults to test/ai/agents/previews:
config.action_ai.preview_paths << "#{Rails.root}/lib/ai/previews"
An overview of all previews is accessible at http://localhost:3000/rails/ai/agents on a running development server instance.
Configuration options
These options are specified at the class level, like ActionAI::Agent.raise_execution_errors = true:
- default_options - You can pass this in at a class level, as well as within the class itself as per the above section.
- logger - Used for generating information on prompt execution, if available. Can be set to nil for no logging. Compatible with both Ruby’s own Logger and Log4r loggers.
- execution_job - The job class used with later. Agents can set this to use a custom execution job. Defaults to ActionAI::ExecutionJob.
- execute_later_queue_name - The queue name used by later with the default execution_job. Agents can set this to use a custom queue name.
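Taken together, a class-level setup might look like this sketch; CustomExecutionJob and the :ai_prompts queue name are hypothetical:

```ruby
class ApplicationAI < ActionAI::Agent
  self.logger = nil                            # disable prompt-execution logging
  self.execution_job = CustomExecutionJob      # hypothetical custom job class
  self.execute_later_queue_name = :ai_prompts  # hypothetical queue name
end
```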
Constant Summary
- PROTECTED_IVARS = AbstractController::Rendering::DEFAULT_PROTECTED_INSTANCE_VARIABLES + [:@_action_has_layout]
Class Attribute Summary
- .agent_name ⇒ Object: Returns the name of the current agent.
Class Method Summary
- .controller_path ⇒ Object: Returns the name of the current agent.
- .default(value = nil) ⇒ Object (also: default_options=): Allows setting defaults through app configuration.
- .supports_path? ⇒ Boolean: Prompts do not support relative path links.
Instance Method Summary
- #agent_name ⇒ Object: Returns the name of the agent object.
- #ask(prompt = self.prompt, with: use_attachments) ⇒ Object: The main method that sends the rendered prompt to the AI model.
- #attachments ⇒ Object: Allows you to add attachments to a prompt.
- #process(method_name, *args) ⇒ Object: :nodoc:
- #prompt ⇒ Object
- #response_body ⇒ Object
Methods included from Rescuable
Class Attribute Details
.agent_name ⇒ Object
Returns the name of the current agent. This method is also being used as a path for a view lookup. If this is an anonymous agent, this method will return anonymous instead.
# File 'lib/action_ai/agent.rb', line 312

def agent_name
  @agent_name ||= anonymous? ? "anonymous" : name.underscore
end
Class Method Details
.controller_path ⇒ Object
Returns the name of the current agent. This method is also being used as a path for a view lookup. If this is an anonymous agent, this method will return anonymous instead.
# File 'lib/action_ai/agent.rb', line 317

def agent_name
  @agent_name ||= anonymous? ? "anonymous" : name.underscore
end
.default(value = nil) ⇒ Object Also known as: default_options=
Allows setting defaults through app configuration:
config.action_ai.default_options = { provider: :openai }
# File 'lib/action_ai/agent.rb', line 322

def default(value = nil)
  self.default_params = default_params.merge(value).freeze if value
  default_params
end
.supports_path? ⇒ Boolean
Prompts do not support relative path links.
# File 'lib/action_ai/agent.rb', line 420

def self.supports_path? # :doc:
  false
end
Instance Method Details
#agent_name ⇒ Object
Returns the name of the agent object.
# File 'lib/action_ai/agent.rb', line 362

def agent_name = self.class.agent_name
#ask(prompt = self.prompt, with: use_attachments) ⇒ Object
The main method that sends the rendered prompt to the AI model. There are two ways to call this method: with a block or without one. If prompt is omitted, it is rendered from the matching template.
It accepts an optional with: keyword for attachments:
- :with - Array of file paths or URLs to attach to the prompt.
You can set default model options using the ::default class method:
class Generator < ActionAI::Agent
  default model: 'gpt-4o', provider: :openai
end
It will find a template in the view paths using, by default, the agent name and the name of the method it is called from; it will then call RubyLLM::Chat#ask and return the resulting RubyLLM::Message.
For example:
class Generator < ActionAI::Agent
  default model: 'gpt-4o'

  def code(task, language)
    @task = task
    @language = language
    ask # can be omitted, like +render+ in action controllers
  end
end
This will look for templates at “app/ai/prompts/generator” named “code”. If no code template exists, it raises an ActionView::MissingTemplate error.
However, those can be customized:
ask render(template: 'shared/prompt')
You can even render plain text directly without using a template:
ask("Write Ruby code for the following task: #{task}")
# File 'lib/action_ai/agent.rb', line 414

def ask(prompt = self.prompt, with: use_attachments, &)
  @_message = chat.ask(prompt, with:, &)
end
#attachments ⇒ Object
Allows you to add attachments to a prompt, like so:
attachments << '/path/to/filename.jpg'
# File 'lib/action_ai/agent.rb', line 370

def attachments = @attachments ||= []
#process(method_name, *args) ⇒ Object
:nodoc:
# File 'lib/action_ai/agent.rb', line 346

def process(method_name, *args) # :nodoc:
  payload = {
    agent: self.class.name,
    action: method_name,
    args: args
  }

  ActiveSupport::Notifications.instrument("process.action_ai", payload) do
    super
  end
end
#prompt ⇒ Object
# File 'lib/action_ai/agent.rb', line 364

def prompt = render_to_string
#response_body ⇒ Object
# File 'lib/action_ai/agent.rb', line 359

def response_body = message&.content