# IDE Integration
Connect your code editor to Teale for AI-powered autocomplete, chat, and code generation --- all running locally on your Mac.
## Prerequisites

- Teale running with a model loaded (`teale up`)
- Your editor of choice installed
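Before configuring an editor, you can confirm the endpoint is reachable. Below is a minimal standard-library sketch, assuming the default base URL used throughout this page; `extract_model_ids` is a hypothetical helper for pulling model names out of an OpenAI-style `/v1/models` response.

```python
import json
import urllib.request

# Assumed default base URL; matches the examples on this page.
TEALE_BASE = "http://localhost:11435/v1"

def extract_model_ids(payload: dict) -> list:
    """Pull model names out of an OpenAI-style /v1/models response."""
    return [m["id"] for m in payload.get("data", [])]

def list_models(base: str = TEALE_BASE) -> list:
    """Query the local endpoint for its currently loaded models."""
    with urllib.request.urlopen(f"{base}/models") as resp:
        return extract_model_ids(json.load(resp))

if __name__ == "__main__":
    # Requires Teale to be running (`teale up`).
    print(list_models())
```

If this prints an empty list, no model is loaded yet; if it fails to connect, Teale is not running.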
## Continue.dev (VS Code / JetBrains)

Continue is an open-source AI code assistant that supports custom backends.

1. Install the Continue extension from the VS Code marketplace (or the JetBrains plugin marketplace).
2. Open Continue's configuration file. In VS Code, run the command **Continue: Open Config File**, or edit `~/.continue/config.json` directly.
3. Add Teale as a model provider:

   ```json
   {
     "models": [
       {
         "title": "Teale Local",
         "provider": "openai",
         "model": "llama-3.1-8b-instruct-4bit",
         "apiBase": "http://localhost:11435/v1",
         "apiKey": "not-needed"
       }
     ]
   }
   ```

4. Save the file. Continue will reload automatically.
5. Open the Continue sidebar and select "Teale Local" from the model dropdown.
### Tab autocomplete

To use Teale for tab completions, add a `tabAutocompleteModel` entry:

```json
{
  "tabAutocompleteModel": {
    "title": "Teale Autocomplete",
    "provider": "openai",
    "model": "llama-3.1-8b-instruct-4bit",
    "apiBase": "http://localhost:11435/v1",
    "apiKey": "not-needed"
  }
}
```
## Cursor

Cursor supports custom OpenAI-compatible endpoints.

1. Open Cursor Settings (Cmd+,).
2. Navigate to Models > Add Model.
3. Select OpenAI-compatible as the provider type.
4. Set the base URL to `http://localhost:11435/v1`.
5. Leave the API key field empty or enter any placeholder value.
6. Select the model name from the dropdown (Cursor will query the `/v1/models` endpoint).
7. Save and start using Teale in Cursor's chat and inline edit features.
## Open WebUI

Open WebUI is a self-hosted chat interface that supports OpenAI-compatible backends.

1. Open your Open WebUI instance in a browser.
2. Go to Settings > Connections.
3. Add a new OpenAI connection:
   - URL: `http://localhost:11435`
   - API Key: leave empty or enter any value
4. Save. Available models will appear in the model selector.
## Zed

Zed supports custom language model providers.

1. Open Zed settings (Cmd+,).
2. Add an OpenAI-compatible provider in your `settings.json`:

   ```json
   {
     "language_models": {
       "openai": {
         "api_url": "http://localhost:11435/v1",
         "available_models": [
           {
             "name": "llama-3.1-8b-instruct-4bit",
             "display_name": "Teale Local",
             "max_tokens": 8192
           }
         ]
       }
     }
   }
   ```

3. Open the assistant panel and select "Teale Local" from the model picker.
## Any OpenAI-compatible tool

Any application that lets you configure a custom OpenAI API endpoint works with Teale. The pattern is always the same:

1. Set the base URL to `http://localhost:11435/v1`.
2. Set the API key to any non-empty string (or leave it blank if the tool allows it).
3. Select a model name returned by the `/v1/models` endpoint.

If the tool requires a specific model name, run `teale models list` to see what is currently loaded.
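The same pattern works from plain code: point an OpenAI-style chat-completion request at the local base URL. This is a standard-library sketch, not an SDK example; `build_chat_request` is a hypothetical helper, and the model name is an assumption (check `teale models list` for yours).

```python
import json
import urllib.request

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for a local endpoint."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(body).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer not-needed",  # any non-empty key works
        },
    )

if __name__ == "__main__":
    # Requires Teale to be running with a model loaded.
    req = build_chat_request(
        "http://localhost:11435/v1",
        "llama-3.1-8b-instruct-4bit",  # assumed model name
        "Write a haiku about local inference.",
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
```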
## Next steps

- Use Teale with OpenAI SDK --- programmatic access from Python and Node.js
- Manage Models --- switch between models for different tasks
- API Key Management --- secure access when sharing over the network