## Prerequisites
- Windows, macOS, or Linux computer
- Sixth installed in VS Code
## Setup Steps
1. Install Ollama
- Visit ollama.com
- Download and install for your operating system
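If you want to confirm the install before moving on, a quick check from any terminal (this sketch assumes the installer added `ollama` to your PATH):

```shell
# Confirm the Ollama CLI is installed and on PATH.
if command -v ollama >/dev/null 2>&1; then
  msg="ollama installed: $(ollama --version)"
else
  msg="ollama not found - re-run the installer or check your PATH"
fi
echo "$msg"
```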
2. Choose and Download a Model
- Open your terminal and run the download command for your chosen model.
✨ Your model is now ready to use within Sixth!
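The download step above uses the Ollama CLI. A minimal sketch, assuming the model name `llama3.2` (any model from the Ollama library works; the guard is only so the snippet fails gracefully if Ollama isn't installed yet):

```shell
# Example: download a model and confirm it is available locally.
# "llama3.2" is an illustration - pick any model from ollama.com/library.
model="llama3.2"
if command -v ollama >/dev/null 2>&1; then
  ollama pull "$model"   # downloads the model weights (may take several minutes)
  ollama list            # shows the models available locally
else
  echo "ollama not found - complete step 1 first"
fi
```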
3. Configure Sixth
- Open VS Code
- Click the Sixth settings icon
- Select "Ollama" as the API provider
- Enter the configuration:
  - Base URL: `http://localhost:11434/` (the default; can be left as is)
- Select the model from your available options
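To confirm the Base URL points at a live Ollama server, you can query its `/api/tags` endpoint, which returns the locally downloaded models as JSON (this sketch assumes the default port 11434):

```shell
# Query the Ollama HTTP API at the default base URL.
# /api/tags returns a JSON list of locally downloaded models.
base_url="http://localhost:11434"
if curl -fsS "$base_url/api/tags"; then
  echo "Ollama answered at $base_url"
else
  echo "No response from $base_url - make sure Ollama is running"
fi
```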
## ⚠️ Important Notes
- Start Ollama before using it with Sixth
- Keep Ollama running in the background
- First model download may take several minutes
## 🔧 Troubleshooting
If Sixth can't connect to Ollama:
- Verify Ollama is running
- Check that the base URL is correct
- Ensure the model is downloaded
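The three checks above can be scripted as one quick triage, assuming the default base URL (adjust `url` if you changed the Base URL in Sixth's settings):

```shell
# Quick triage for the checklist above.
url="http://localhost:11434"

# 1) Is the CLI installed?
if command -v ollama >/dev/null 2>&1; then
  echo "[ok] ollama CLI found"
else
  echo "[!!] ollama CLI missing - install it from ollama.com"
fi

# 2) Is the server up, and 3) is your model downloaded?
if curl -fsS "$url/api/tags" >/dev/null 2>&1; then
  echo "[ok] server reachable at $url"
  ollama list   # your configured model should appear in this list
else
  echo "[!!] no server at $url - start Ollama, then retry"
fi
```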
Need more info? Read the Ollama Docs.