🤖 Setting Up LM Studio with Sixth

Run AI models locally using LM Studio with Sixth.

📋 Prerequisites

  • Windows, macOS, or Linux computer with AVX2 support
  • Sixth installed in VS Code

🚀 Setup Steps

1. Install LM Studio

  • Visit lmstudio.ai
  • Download and install for your operating system
LM Studio download page

2. Launch LM Studio

  • Open the installed application
  • You’ll see four tabs on the left: Chat, Developer (where you start the server), My Models (where downloaded models are stored), and Discover (where you add new models)
LM Studio interface overview

3. Download a Model

  • Open the “Discover” tab and browse the available models
  • Select and download your preferred model
  • Wait for download to complete
Downloading a model in LM Studio

4. Start the Server

  • Navigate to the “Developer” tab
  • Toggle the server switch to “Running”
  • Note: The server runs at http://localhost:1234 by default (a quick way to verify it is shown below)
Starting the LM Studio server
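
Before moving on, you can optionally confirm the server is reachable by querying LM Studio’s OpenAI-compatible API. The sketch below is TypeScript (Node 18+, which ships a built-in fetch); it assumes the default address http://localhost:1234 and the standard /v1/models route, so adjust the URL if you changed the port.

```typescript
// Minimal check that the LM Studio server is up, printing the models it reports.
// Assumes the default endpoint http://localhost:1234/v1/models.
async function checkLmStudio(baseUrl = "http://localhost:1234"): Promise<void> {
  const res = await fetch(`${baseUrl}/v1/models`);
  if (!res.ok) {
    throw new Error(`LM Studio responded with HTTP ${res.status}`);
  }
  const body = (await res.json()) as { data?: { id: string }[] };
  const ids = (body.data ?? []).map((m) => m.id);
  console.log("Models reported by LM Studio:", ids.join(", ") || "(none)");
}

checkLmStudio().catch((err) => console.error("Could not reach LM Studio:", err));
```

Run it with a TypeScript runner of your choice (for example `npx tsx`); if it prints your model ids, the server side is ready for Sixth.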

5. Configure Sixth

  1. Open VS Code
  2. Click Sixth settings icon
  3. Select “LM Studio” as the API provider
  4. Select your model from the available options
Configuring Sixth with LM Studio

⚠️ Important Notes

  • Start LM Studio before using it with Sixth
  • Keep LM Studio running in the background
  • First model download may take several minutes depending on size
  • Models are stored locally after download

🔧 Troubleshooting

If Sixth can’t connect to LM Studio:

  1. Verify the LM Studio server is running (check the Developer tab)
  2. Ensure a model is loaded
  3. Confirm your system meets the hardware requirements

If these checks pass and the connection still fails, the snippet below lets you test the server directly.
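
Sending a request straight to LM Studio, bypassing Sixth entirely, tells you which side of the connection is failing. The sketch below is a hedged TypeScript example (Node 18+) against LM Studio’s OpenAI-compatible /v1/chat/completions endpoint on the default port; the model id is a placeholder you should replace with the identifier shown in My Models.

```typescript
// One-off prompt sent directly to LM Studio to confirm the loaded model responds.
async function smokeTest(baseUrl = "http://localhost:1234"): Promise<void> {
  const res = await fetch(`${baseUrl}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "your-model-id-here", // placeholder: copy the id from the My Models tab
      messages: [{ role: "user", content: "Reply with the single word: ready" }],
      max_tokens: 10,
    }),
  });
  if (!res.ok) {
    throw new Error(`HTTP ${res.status}: ${await res.text()}`);
  }
  const body = (await res.json()) as {
    choices?: { message?: { content?: string } }[];
  };
  console.log("Model replied:", body.choices?.[0]?.message?.content ?? "(no content)");
}

smokeTest().catch((err) => console.error("Smoke test failed:", err));
```

If this request succeeds while Sixth still reports a connection error, the problem most likely lies in Sixth’s provider settings (wrong provider or model selected) rather than in LM Studio itself.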