How to self-host a proxy server for Anthropic Claude
BoltAI supports Anthropic Claude via a proxy server. You may wonder why a proxy server is needed. Why not send requests directly to Anthropic's API servers?
The answer is that, at the moment, BoltAI only supports OpenAI-compatible API endpoints.
I understand this is not ideal and will fix this in the next version.
The proxy server translates Anthropic's Claude API into the OpenAI Chat API format, so that it is compatible with BoltAI.
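To illustrate the idea, here is a minimal sketch of the kind of request translation such a proxy performs. This is an assumption based on the public OpenAI and Anthropic API shapes, not the proxy's actual source code; the model name and defaults are placeholders.

```python
# Hypothetical sketch: map an OpenAI Chat Completions request body to
# Anthropic's legacy text-completion format. Illustrative only; the real
# proxy's mapping may differ.

def openai_to_claude(request: dict) -> dict:
    # OpenAI roles mapped onto Anthropic's "Human:" / "Assistant:" turns
    role_map = {"system": "Human", "user": "Human", "assistant": "Assistant"}
    parts = []
    for msg in request["messages"]:
        parts.append(f"\n\n{role_map[msg['role']]}: {msg['content']}")
    parts.append("\n\nAssistant:")  # Claude completes the assistant turn
    return {
        "model": "claude-2",  # assumed model name, for illustration
        "prompt": "".join(parts),
        "max_tokens_to_sample": request.get("max_tokens", 1024),
        "stream": request.get("stream", False),
    }
```

The proxy would perform this mapping on every incoming request, forward the result to Anthropic, and translate the response back into an OpenAI-style chat completion.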
By default, BoltAI uses the proxy server provided by BoltAI (claude-proxy-us.boltai.com).
The source code of this proxy server is completely open. You can host it yourself by following the instructions in this repository:
https://github.com/longseespace/claude-to-chatgpt
❗ Notice: use the Docker option. The Cloudflare Worker deployment does not support streaming and performs poorly.
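As a rough sketch, a Docker-based deployment could look like the following compose file. The image name and port are assumptions for illustration; check the repository's README for the actual values.

```yaml
# docker-compose.yml — illustrative only; verify the image name and
# port against the repository's README before use.
services:
  claude-proxy:
    image: claude-to-chatgpt:latest  # assumed image tag
    ports:
      - "8000:8000"  # assumed listening port
    restart: unless-stopped
```

Once the container is running, point BoltAI's API endpoint at your own host (e.g. your server's address and port) instead of the default claude-proxy-us.boltai.com.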
If you are new here, BoltAI is a native macOS app that allows you to access ChatGPT inside any app. Download now.