
It looks like Copilot has direct support for Ollama if you're willing to set that up: https://docs.ollama.com/integrations/vscode

For LM Studio, under the server settings you can start a local server that exposes an OpenAI-compatible API; you'd then point Copilot at that. I don't use Copilot, so I'm not sure of the exact steps on that side.
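
In case it helps, here's a quick sanity check that the LM Studio server is actually up before wiring anything else to it. This is just a sketch, not Copilot-specific: it assumes the server is running on LM Studio's default port (1234) and uses the openai Python package; the api_key is a placeholder since LM Studio doesn't validate it, and the model name is whatever you have loaded.

    # Sanity-check LM Studio's OpenAI-compatible local server.
    # Assumes the default port 1234 -- check LM Studio's server
    # settings if you've changed it.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:1234/v1",  # LM Studio's OpenAI-compatible endpoint
        api_key="lm-studio",                  # placeholder; the key isn't checked
    )

    # Model name is a placeholder; LM Studio serves whichever model is loaded.
    response = client.chat.completions.create(
        model="local-model",
        messages=[{"role": "user", "content": "Say hello"}],
    )
    print(response.choices[0].message.content)

If that prints a reply, the server side is working and any remaining setup is on the Copilot end.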


