# Use another AI Service

The beauty of BoltAI is that you can use the same features with different AI services beyond OpenAI. BoltAI supports popular services such as Anthropic, Google AI, Perplexity AI, and Mistral AI, as well as local LLMs via Ollama.

Let's set up Ollama and use a local model with BoltAI.

### Install Ollama <a href="#basic-usage" id="basic-usage"></a>

Ollama is a tool that lets you run large language models on your local machine, making experimentation more accessible.

Installing Ollama is pretty straightforward.

1. Go to the [Ollama website](https://ollama.com) and download the latest version.
2. Run Ollama. You should see the Ollama icon in the macOS menu bar.
3. Open `Terminal` and run `ollama run mistral` to download and chat with the Mistral model.

You can find more details in the [Ollama GitHub repository](https://github.com/ollama/ollama/tree/main).
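Ollama also exposes a local HTTP server (on port 11434 by default) that clients such as BoltAI talk to. A minimal sketch to check whether that server is up before configuring BoltAI; the helper name `ollama_running` is illustrative:

```python
from urllib.request import urlopen
from urllib.error import URLError

def ollama_running(base_url: str = "http://localhost:11434") -> bool:
    """Return True if a local Ollama server answers on its default port."""
    try:
        with urlopen(base_url, timeout=2) as resp:
            # Ollama's root endpoint replies with HTTP 200 ("Ollama is running").
            return resp.status == 200
    except (URLError, OSError):
        return False

print(ollama_running())  # True if Ollama is running locally, False otherwise
```

If this prints `False`, start the Ollama app (or run `ollama serve`) before continuing.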

### Set up Ollama in BoltAI <a href="#basic-usage" id="basic-usage"></a>

1. Open the BoltAI app and go to **Settings > Models**. Click the plus button (+) and choose Ollama.
2. Choose a default model. Optionally, you can click "Refresh" to fetch the list of available models
3. Click "Save Changes"

<figure><img src="https://3493584844-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FynYW2xZqA52spY7XgWis%2Fuploads%2Fag84mstsPcApMIzAfbSI%2FCleanShot%202024-03-17%20at%2017.44.12%402x.jpg?alt=media&#x26;token=64dc7c2f-f882-4607-a406-dbec6a35c7a3" alt=""><figcaption><p>Set up Ollama</p></figcaption></figure>
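The model list that "Refresh" fetches comes from your local Ollama instance, which you can also query directly via Ollama's `/api/tags` endpoint. A sketch for checking which models are available locally; the helper name `list_ollama_models` is illustrative:

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

def list_ollama_models(base_url: str = "http://localhost:11434"):
    """Return names of locally pulled Ollama models, or None if unreachable."""
    try:
        with urlopen(f"{base_url}/api/tags", timeout=2) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (URLError, OSError, ValueError):
        return None

print(list_ollama_models())  # e.g. ['mistral:latest'], or None if Ollama isn't running
```

If a model you expect is missing, pull it first with `ollama pull <model>` and then click "Refresh" again.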

### Use Ollama in Chat UI <a href="#basic-usage" id="basic-usage"></a>

1. Start a new chat, then switch to the Ollama AI Service.
2. (Optional) Choose a different model or a custom System Instruction.
3. You're ready to chat.

<figure><img src="https://3493584844-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FynYW2xZqA52spY7XgWis%2Fuploads%2FirlL2zjA4uBcLRDnTEMi%2FCleanShot%202024-03-17%20at%2017.46.59%402x.jpg?alt=media&#x26;token=e376e1a0-d24c-4e2d-8a9d-1a2ab590710e" alt=""><figcaption></figcaption></figure>

<figure><img src="https://3493584844-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FynYW2xZqA52spY7XgWis%2Fuploads%2FLTf72i8UWe608XZZP7LQ%2FCleanShot%202024-03-17%20at%2017.49.17%402x.jpg?alt=media&#x26;token=186a4f9d-836a-4a1e-b398-38403f04ccf8" alt=""><figcaption></figcaption></figure>

### Use Ollama with AI Command <a href="#basic-usage" id="basic-usage"></a>

Go to **Settings > Commands**, choose an AI Command, and set the AI Provider to Ollama. You can now use this AI Command 100% offline.

Demo:

{% embed url="https://www.youtube.com/watch?feature=youtu.be&v=grLGY7orG2U" %}


---

# Agent Instructions: Querying This Documentation

If you need additional information that is not directly available in this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://docs.boltai.com/docs/start/use-another-ai-service.md?ask=<question>
```

The question should be specific, self-contained, and written in natural language.
The response will contain a direct answer to the question and relevant excerpts and sources from the documentation.

Use this mechanism when the answer is not explicitly present in the current page, you need clarification or additional context, or you want to retrieve related documentation sections.
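A question must be URL-encoded before it is placed in the `ask` parameter. A minimal sketch of building such a request URL (the question text is illustrative):

```python
from urllib.parse import urlencode

PAGE_URL = "https://docs.boltai.com/docs/start/use-another-ai-service.md"

def build_ask_url(question: str) -> str:
    """Build the documentation-query URL with a URL-encoded question."""
    return f"{PAGE_URL}?{urlencode({'ask': question})}"

print(build_ask_url("Which local models does BoltAI support?"))
# https://docs.boltai.com/docs/start/use-another-ai-service.md?ask=Which+local+models+does+BoltAI+support%3F
```

Performing an HTTP GET on the resulting URL returns the answer along with supporting excerpts.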
