Prerequisites
- Windows, macOS, or Linux computer
- Sixth installed in VS Code
Setup Steps
1. Install Ollama
- Visit ollama.com
- Download and install for your operating system
2. Choose and Download a Model
- Open your terminal and run the download command for your chosen model
✨ Your model is now ready to use within Sixth!
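The download step can be sketched as below; `llama3.1` is an illustrative model name only, so substitute any model from the ollama.com library:

```shell
# Illustrative sketch of the download step.
# "llama3.1" is an example model name; substitute any model
# from the ollama.com library.
if command -v ollama >/dev/null 2>&1; then
  ollama pull llama3.1   # first download may take several minutes
  ollama list            # confirm the model is available locally
else
  echo "ollama not found; install it from ollama.com first"
fi
```

Run `ollama list` at any time to see which models are available locally.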
3. Configure Sixth
- Open VS Code
- Click the Sixth settings icon
- Select "Ollama" as the API provider
- Enter the configuration:
- Base URL: http://localhost:11434/ (default value, can be left as is)
- Select your model from the available options
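Before saving the configuration, you can confirm that the base URL responds by querying Ollama's HTTP API directly; `/api/tags` returns the locally downloaded models as JSON:

```shell
# Query the Ollama server at the default base URL.
# /api/tags lists the locally downloaded models.
curl -s --max-time 2 http://localhost:11434/api/tags \
  || echo "Ollama is not reachable at http://localhost:11434/"
```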
⚠️ Important Notes
- Start Ollama before using it with Sixth
- Keep Ollama running in the background
- First model download may take several minutes
🔧 Troubleshooting
If Sixth can't connect to Ollama:
- Verify Ollama is running
- Check that the base URL is correct
- Ensure your model is downloaded
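These checks can be walked in order with a short script; it assumes the default port 11434 and uses Ollama's `/api/tags` endpoint, which lists locally downloaded models:

```shell
# Walk the troubleshooting checks in order (assumes the default port 11434).
if ! curl -s --max-time 2 http://localhost:11434/ >/dev/null; then
  echo "Ollama is not reachable; start it and re-check the base URL"
elif ! curl -s --max-time 2 http://localhost:11434/api/tags | grep -q '"name"'; then
  echo "Ollama is running but no models are downloaded yet"
else
  echo "Ollama is running and at least one model is available"
fi
```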
Need more info? Read the Ollama Docs.