Action AI – Easy AI prompt execution and testing

Action AI is a framework for designing AI interaction layers. These layers are used to consolidate prompt generation and execution in one place, instead of scattering provider calls across controllers, jobs, and models.

Action AI is in essence a wrapper around Action Controller and the RubyLLM gem. It provides a way to make AI prompts using templates in the same way that Action Controller renders views using templates.

The architecture is intentionally modeled after Action Mailer: class-level actions, view-backed templates, and lazy execution. Action AI rebuilds that shape for AI interactions rather than building an unrelated API from scratch.

Sending prompts

The framework works by having each action method set the instance variables it wants available in the prompt template.

This can be as simple as:

class Generator < ApplicationAI
  default model: "gpt-4o"

  def code(task, language)
    @task     = task
    @language = language
    ask
  end
end

After the action method completes, the framework will automatically:

  1. Render the prompt from the corresponding template (e.g., app/ai/prompts/generator/code.erb)

  2. Send it to the configured AI model via ask

  3. Return an ActionAI::Interaction object

The prompt text is rendered with an Action View template (regular ERB) that has access to the instance variables set in the agent action.

So the corresponding template for the code method above could look like this:

You are an expert <%= @language.to_s.camelize %> developer.
Write clean, well-commented code for the following task:

<%= @task %>

If the task description was “Parse a CSV file and return unique values”, the rendered prompt would look like this:

You are an expert Ruby developer.
Write clean, well-commented code for the following task:

Parse a CSV file and return unique values
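The rendering step above can be sketched with plain ERB from the Ruby standard library. This is only an illustration of how the instance variables reach the template — the framework itself renders through Action View, and the PromptContext class here is a hypothetical stand-in (note it uses capitalize instead of the Rails-only camelize helper):

```ruby
require "erb"

# The template text, as it would appear in
# app/ai/prompts/generator/code.erb (capitalize substituted for
# camelize, since ActiveSupport is not loaded here).
template = <<~TEMPLATE
  You are an expert <%= @language.to_s.capitalize %> developer.
  Write clean, well-commented code for the following task:

  <%= @task %>
TEMPLATE

# Hypothetical context object: holds the instance variables an agent
# action would set, and exposes its binding to ERB for rendering.
class PromptContext
  def initialize(task, language)
    @task     = task
    @language = language
  end

  def render(template)
    ERB.new(template).result(binding)
  end
end

puts PromptContext.new("Parse a CSV file and return unique values", :ruby).render(template)
```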

To execute a prompt, call the action method and then call run on the return value, or call content to get the result directly.

Calling the method returns an ActionAI::Interaction object; nothing is sent until you execute it:

prompt = Generator.code("Parse CSV and dedupe", :ruby) # => Returns an ActionAI::Interaction object
prompt.run                                             # => executes the prompt

Or you can simply chain the methods together:

Generator.code("Parse CSV and dedupe", :ruby).content  # Returns AI's response for the prompt
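The lazy-execution shape described above can be sketched in a few lines of plain Ruby. The Interaction name comes from the docs; the internals here are an assumption for illustration, not the real implementation:

```ruby
# Hypothetical sketch: building the prompt is cheap, and nothing is
# sent to the provider until #run or #content is called. The sender
# block stands in for the RubyLLM call.
class Interaction
  def initialize(prompt, &sender)
    @prompt = prompt
    @sender = sender
  end

  # Executes the prompt once, memoizing the response, and returns self
  # so calls can be chained.
  def run
    @response ||= @sender.call(@prompt)
    self
  end

  # Runs the prompt if needed and returns the response text.
  def content
    run
    @response
  end
end

interaction = Interaction.new("Write a haiku") { |p| "response to: #{p}" }
interaction.content # => "response to: Write a haiku"
```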

Setting defaults

It is possible to set default values that will be used in every action in your Action AI agent class. To do so, call the public class method default, which is available on every subclass of ActionAI::Agent. This method accepts a Hash as its parameter. You can use any option supported by RubyLLM::Chat, such as :provider and :model. You can also pass a Proc, which will be evaluated when it is needed.

Note that every value you set with this method will get overwritten if you use the same key in your agent method.

Example:

class Generator < ApplicationAI
  default model: proc { Current.user.preferred_model }
end
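The merging behaviour described above — per-action keys win, and Proc values are evaluated at send time — can be sketched in plain Ruby. This is an assumption about the mechanics, not the framework's actual code; resolve_options is a hypothetical helper:

```ruby
# Class-level defaults, including a lazily evaluated Proc value.
DEFAULTS = { model: proc { "gpt-4o" }, provider: :openai }

# Hypothetical helper: options given in the agent method overwrite
# defaults with the same key, then any Proc values are called.
def resolve_options(action_options = {})
  DEFAULTS.merge(action_options).transform_values do |value|
    value.respond_to?(:call) ? value.call : value
  end
end

resolve_options(model: "gpt-4o-mini")
# => { model: "gpt-4o-mini", provider: :openai }
```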

Configuration

See ActionAI::Agent for the full list of configuration options. Here’s an example:

ActionAI::Agent.default_options = {
  provider: :openai,
  model: "gpt-4o-mini"
}

Download and installation

The latest version of Action AI can be installed with RubyGems:

$ gem install action_ai

License

Action AI is released under the MIT license.