Use another AI Service


The beauty of BoltAI is that you can use the same features with different AI services beyond OpenAI. BoltAI supports popular AI services such as Anthropic, Google AI, Perplexity AI, and Mistral AI, as well as local LLMs via Ollama.

Let's set up Ollama and use a local model with BoltAI.

Install Ollama

Ollama is a tool that helps us run large language models on our local machine and makes experimentation more accessible.

Installing Ollama is straightforward.

  1. Go to the Ollama website and download the latest version

  2. Run Ollama. You should see the Ollama icon on the Mac Menubar

  3. Open Terminal and run ollama run mistral to download and chat with the Mistral model

You can find more details in the Ollama GitHub repository.
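For reference, here are the Ollama commands you'll typically use from Terminal. The model names below (mistral, llama3) are just examples from the Ollama model library; pick whichever model suits your machine.

```bash
# Download and chat with the Mistral model (the first run pulls the weights)
ollama run mistral

# See which models are installed locally
ollama list

# Pull another model without starting a chat session
ollama pull llama3
```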

Set up Ollama in BoltAI

  1. Open the BoltAI app and go to Settings > Models. Click the plus button (+) and choose Ollama

  2. Choose a default model. Optionally, you can click "Refresh" to fetch the list of available models

  3. Click "Save Changes"

Use Ollama in Chat UI

  1. Start a new chat, then switch the AI Service to Ollama.

  2. (Optional) Choose a different model or a custom System Instruction.

  3. You're ready to chat.

Use Ollama with AI Command

Go to Settings > Commands, choose an AI Command, and set the AI Provider to Ollama. You can now use this AI Command 100% offline.

Demo: Set up Ollama