Using a local LLM as an AI code assistant
If you want to use AI for coding, there are plenty of tools to choose from. Popular options include GitHub Copilot, Claude Code, and Gemini. But what if you’d rather not send your code to external services? With consumer hardware becoming ever more powerful, is it possible to run a large language model locally and use it as your own coding assistant?