
# Introducing Continue: The Open Source Alternative to GitHub Copilot for Coding

The video introduces Continue, an open-source alternative to GitHub Copilot that brings AI coding assistance into VS Code and JetBrains. It can connect to a range of models and providers, including Claude 3, GPT-4, Groq, and Ollama. I walk through the installation in VS Code, demonstrating how to select from the built-in free models, accept or reject code suggestions, add new models by inserting API keys, and edit code directly within the editor. Highlighted features include intuitive keyboard shortcuts, the ability to ask questions about highlighted code, and support for numerous AI providers such as OpenAI, Anthropic, and Ollama.

- 00:00 Introduction to Continue: The Open Source Alternative
- 00:18 Getting Started with Continue on VS Code
- 00:52 Exploring Continue's Features and Models
- 01:21 Setting Up and Using Different Models
- 02:13 Editing and Customizing with Continue
- 03:12 Conclusion and Encouragement to Support Continue
---
type: transcript
date: 2024-05-05
youtube_id: qXNecVIxRi0
---

# Transcript: Continue: Incredible Open Source GitHub Copilot Alternative. Use Groq + Llama-3, Ollama and more

This is absolutely amazing. I'm going to be showing you Continue, which is essentially an open-source alternative to GitHub Copilot. What it allows you to do is plug in different models such as Claude 3 Sonnet, GPT-3.5, and GPT-4; you can plug in your Ollama models, you can plug in Groq, and a ton of other options. It's free on VS Code and JetBrains. Let me show you how to get started. I'll be demonstrating in VS Code, but I'd imagine the setup is similar for JetBrains as well.

To get started with the extension, just search for Continue in the VS Code Extension Marketplace and install it. The first thing you'll want to do as soon as you install it is drag the Continue icon over to the right-hand side, so the chat sits in that panel. You can press Cmd+Option+L to toggle it on and off (on Mac, at least; there will be a similar shortcut on Windows).

Once you have it installed, you can select from the handful of free models they offer. If you want to try, say, Claude 3 Sonnet, I can just write "write me a hello world Express server," and you can see it's really quick, really nice, and intuitive. Press Cmd+Shift+Enter to accept the suggestion or Cmd+Shift+Backspace to reject it; you can also accept it line by line if you'd like. I can also highlight a piece of code, press Cmd+L to pull it into the chat panel, ask "what is this doing?", and it gives me a nice little description.

But the thing I really like is that you can add different models. There are a ton of providers here: if I open the list up, you can see there's OpenAI, there's Anthropic, there's Cohere, and there are even things like Ollama and Together AI.
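Adding one of these providers generally comes down to an entry in Continue's config.json. Below is a hedged sketch of what Groq and Ollama entries might look like; the `title`/`provider`/`model`/`apiKey` field names and the `llama3-70b-8192` model ID reflect Continue's config format around the time of this video and may differ in newer versions, so check the Continue docs for your install.

```json
{
  "models": [
    {
      "title": "Groq Llama 3 70B",
      "provider": "groq",
      "model": "llama3-70b-8192",
      "apiKey": "<YOUR_GROQ_API_KEY>"
    },
    {
      "title": "Llama 3 (local via Ollama)",
      "provider": "ollama",
      "model": "llama3"
    }
  ]
}
```

After saving the file, the new entries appear in the model dropdown at the bottom of the Continue panel.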
And then, as my channel likely knows, there's Groq. To set up a new model you have a couple of options: you can put in your API key and select the model right there, or alternatively you can just open up the config.json, which is what I did for the Groq model. Now that it's in there, I can go down, select Groq, press Cmd+L once it's all set up, and try the same thing again: a hello world server that takes in a message and a model name. There you go, it's practically instant. That's the Groq feel inside VS Code. Easy, right? You can grab a free API key from Groq right now, and then just accept the suggestion with Cmd+Shift+Enter.

If I want to edit code, I can highlight it, press Cmd+I, and write the edit. I could say "change this to Next," press Enter, and boom, done. Really great.

If you want to use Ollama, you can do that as well. Run `ollama run llama3`, and once it's up and running, select your Llama 3 model (or whatever model you have running), and it works the same way; you can run it all for free, locally, on your machine. If I do the same "hello world Express server" prompt, there you go, now it's streaming out from the local model.

There are a ton of other features in here. There are slash commands you can use, and you can also bring in different files: if you want to ask a question about a particular file, select the file and ask "what is within this file?", and you can ask questions about it just like that. There's a ton built into this; I just wanted to do a really quick video highlighting it for you. Congrats to the team over at Continue, this is incredible work, and I'm really excited to see how this project evolves over time. I'd encourage you to go over to GitHub and star the repo. If you found this video useful, please like,
comment, share, and subscribe. Otherwise, until the next one!