AI Token Usage and API Keys

Written by Justin Yan
Updated today

With the introduction of the AI Copilot, Frontly V2's AI token usage differs considerably from V1. Below is a breakdown of how AI tokens work in V2, how to manage usage, and how to integrate your own API keys for expanded access.

For V1-related documentation on AI token usage, click here.


Available LLMs

Frontly offers a selection of high-performance language models (LLMs) across both the AI Copilot and the AI Request action. These models are subject to change as newer, more capable options become available. If you're unsure which to use or what’s most current, feel free to reach out to our team.

AI Copilot LLMs:

  • openai/gpt-4.1

  • openai/gpt-4.1-mini

  • openai/o4-mini-high

  • openai/gpt-4.1-nano

  • anthropic/claude-3.7-sonnet

AI Request LLMs:

  • OpenAI GPT-4o Mini

  • OpenAI GPT-4o

  • OpenAI o3 Mini

  • Google Gemini Flash 1.5 8B

  • DeepSeek R1

  • Anthropic Claude 3.5 Sonnet


AI Token Usage

Each action you take with the AI Copilot or an AI Request consumes a number of tokens, based on the size of your input and output data and which LLM you're using. More powerful models generally consume more tokens.

While we don’t yet have a published comparison of which model provides the best “bang for your buck,” we’re actively analyzing usage trends and plan to share clearer guidance in the near future.
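To make the scaling intuition concrete, here is a minimal sketch of how per-action cost can depend on data volume and model choice. The multiplier values are invented purely for illustration; Frontly's actual accounting may differ.

```python
# Hypothetical token-cost model: cost grows with the total data volume
# (input + output) weighted by how expensive the chosen model is.
# Multiplier values are illustrative only, not Frontly's real rates.
MODEL_MULTIPLIER = {
    "openai/gpt-4.1-nano": 1,   # lightweight model: cheapest per token
    "openai/gpt-4.1-mini": 2,
    "openai/gpt-4.1": 8,        # most capable model: highest token cost
}

def estimated_token_cost(input_tokens, output_tokens, model):
    """Estimate tokens consumed by one AI action."""
    return (input_tokens + output_tokens) * MODEL_MULTIPLIER[model]

# The same prompt can cost very different amounts depending on the model:
print(estimated_token_cost(400, 100, "openai/gpt-4.1-nano"))  # 500
print(estimated_token_cost(400, 100, "openai/gpt-4.1"))       # 4000
```

Under these made-up numbers, an identical prompt costs 8x more on the flagship model than on the nano model, which is why model choice matters when budgeting tokens.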

To monitor usage:

  • Navigate to your Usage and Billing tab to see your monthly AI token balance.

  • Usage is organization-wide, meaning all AI actions across all apps count toward your limit.


Monthly Limits

Your monthly AI token limit depends on your plan and is replenished on a rolling basis.

What does that mean?

If you use 3,000 tokens today at 2 p.m., those tokens will be replenished 30 days later at 2 p.m. — not at the start of the calendar month.

This gives you more flexibility to spread out usage over time.
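The rolling behavior described above can be sketched in a few lines: each spend is restored exactly 30 days after it occurred, rather than everything resetting at the start of a calendar month. The function and data names are illustrative, not Frontly's actual implementation.

```python
from datetime import datetime, timedelta

# Each token spend replenishes exactly 30 days after it occurred.
ROLLING_WINDOW = timedelta(days=30)

def tokens_available(limit, spends, now):
    """Remaining tokens: the plan limit minus all spends made within
    the last 30 days (older spends have already replenished)."""
    recent = sum(amount for ts, amount in spends
                 if now - ts < ROLLING_WINDOW)
    return limit - recent

spends = [
    (datetime(2024, 1, 1, 14, 0), 3000),   # 3,000 tokens at 2 p.m. on Jan 1
    (datetime(2024, 1, 20, 9, 0), 1000),
]

# 29 days after the first spend, it still counts against the limit:
print(tokens_available(10_000, spends, datetime(2024, 1, 30, 14, 0)))  # 6000

# 30 days later, the 3,000 tokens from Jan 1 are back:
print(tokens_available(10_000, spends, datetime(2024, 1, 31, 15, 0)))  # 9000
```

Because each spend ages out independently, spreading actions across the month keeps more of your balance available at any given moment than bursting it all at once.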


API Keys

You can optionally bring your own API key to extend your token limits even further.

To connect:

  • Go to your Integrations tab

  • Add a valid OpenAI or OpenRouter API key

When you connect your own key:

  • Your monthly AI token balance in Frontly increases 10x

  • You can choose between your key and the built-in key on a per-app basis

  • You gain flexibility over pricing and model behavior
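Both providers use recognizable key prefixes, so a quick sanity check before pasting a key into the Integrations tab can catch copy-paste mistakes. This helper is illustrative and not part of Frontly; the prefix conventions ("sk-" for OpenAI, "sk-or-" for OpenRouter) are an assumption based on current provider formats and may change.

```python
def key_provider(api_key: str) -> str:
    """Guess a key's provider from its prefix.
    OpenRouter keys currently begin with 'sk-or-';
    OpenAI keys begin with 'sk-'. (Assumed conventions.)"""
    if api_key.startswith("sk-or-"):
        return "openrouter"
    if api_key.startswith("sk-"):
        return "openai"
    return "unknown"
```

A key that comes back "unknown" was likely truncated or copied incorrectly and is worth re-checking before connecting it.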

Why is there still a limit even with my own API key?

Even when using your own key, AI actions still pass through Frontly’s systems for request handling, validation, and formatting. Because of this, we still need to impose a high cap to ensure overall platform stability. These safeguards are in place to protect all users in case of unexpected spikes in usage.
