
In this video, I will demonstrate how to set up your own self-hosted version of Coolify on an AWS EC2 instance. Coolify is an open-source and self-hostable alternative to Heroku, Netlify, or Vercel, and it supports multiple programming languages and platforms. You'll learn how to configure your EC2 instance, set up SSL certificates, and deploy applications without vendor lock-in. Additionally, I'll walk you through deploying my LLM answer engine project using Coolify. This video is perfect for anyone looking to gain more control and flexibility over their application deployments.

- 00:00 Introduction to Coolify
- 00:57 Setting Up AWS EC2 Instance
- 01:53 Configuring EC2 Instance for Coolify
- 03:11 Installing Coolify on EC2
- 04:49 Accessing and Setting Up Coolify
- 05:54 Deploying a Project with Coolify
- 07:57 Exploring Coolify Features
- 08:59 Conclusion and Final Thoughts
---
type: transcript
date: 2024-06-10
youtube_id: V-sjuTPj-3s
---

# Transcript: How to Self Host Coolify on AWS EC2: Step-by-Step Guide

In this video I'm going to show you how you can set up your own self-hosted version of Coolify. We're going to be setting it up on an EC2 instance on AWS. Before I get into that, I just want to quickly show you Coolify and why it's both impressive and why you might find it interesting. Here it's described as an open-source and self-hostable Heroku, Netlify, or Vercel alternative. It's programming-language agnostic: you can deploy this on anything from a Raspberry Pi to an EC2 instance to a DigitalOcean droplet. It allows you to deploy on a single server, on multiple servers, or even with Docker Swarm, with Kubernetes support on the way. It will set up your SSL certificates for you. There's also push to deploy: if your application is set up to deploy whenever everything's merged to the main branch, you can do that with this. There's no vendor lock-in, you have complete control over all of your data, and there are automatic database backups which you can plug directly into S3. There are a ton of other nice features in here.

I'm going to be showing you how you can deploy the LLM answer engine project that I have to AWS. If you don't have an AWS account, it's pretty easy to get set up, and you'll likely be able to get hundreds of dollars in credits to try out different features within AWS. Once you're signed up and signed in to AWS, you'll have their console, and this is what it looks like. We're going to go to EC2, and once you're within the EC2 dashboard, you can just click the button to launch an instance. Once you're on that page, we're going to name this; you can name it whatever you want, and in this case I'm going to call it coolify. We're going to select that we're using the latest version of Ubuntu, and then we're going to scroll down. As for the requirements for Coolify, we need two CPUs and 2
gigs of RAM, and the closest tier to that is the t2.medium: here we have two CPUs as well as 4 gigs of memory, a little bit more on the memory side. Now for the key pair: this is what you'll use to SSH into your server. If it's the first time you've used AWS, you can just go ahead and create a new key pair here; we'll call this coolify-keys, and we're going to be using the .pem extension. So you can create that, it will download the file, and we'll keep that in mind for the next step. We're going to allow traffic from HTTPS as well as HTTP, and keep scrolling. Now, if we go back to the requirements, we need at least 30 gigs of storage, so if we just bump this number up to 30, we can go ahead and launch our instance.

It might take a moment for it to launch; you'll see it in the pending state for a moment or two. Now, if we scroll down to our security groups and click on launch-wizard-1, we're going to edit the inbound rules and add one rule: we'll choose Custom TCP and expose port 8000 on Anywhere-IPv4. Once you have that, just scroll down to the bottom, save the rules, and hop back to our instances.

Now that we see it's running, there are a couple of different options. You can click into the instance and see some of the details, like the IP and all of that. What we're going to do is connect to the instance: if you click Connect and go over to SSH client, we're going to copy this command. This goes back to that .pem key that we had just downloaded. I have that key within a folder here; if I just ls this out, we see that we have our coolify-keys file. Now, if I copy this command, you can either run it directly, depending on how the permissions are set up, or you can also sudo ssh. You
can put in your password, and then we'll get a security prompt the first time we connect to it. We'll just say yes, we see that it's permanently added, and now we're within the server. If I just ls out here, you can see that we're in that Ubuntu instance with the IP of our EC2. Now, one thing on AWS: if you just copied this command, I'll show you what it does; it says "please run as root." If you see that, don't be too concerned, because with this curl command we're requesting the install script from the Coolify CDN. So again, I'm just going to copy this, and this time, instead of just piping to bash, we're going to pipe to sudo bash to give it those root permissions. Once you see it start to install, it has to install Docker and run a handful of different commands to get everything running, so it just takes a couple of moments to set everything up. You'll see everything start to install and download the necessary dependencies, and once that's done we can move on to the next step.

All right, so now we see "Congratulations! Your Coolify instance is ready to use. Please visit this link to get started." This ties back to the port that we had just exposed within our security group. So now, if we go over to our browser and paste in that IP with that port, if everything's working you'll get a screen like this. If you don't get a screen like this because you skipped the step of exposing port 8000 in the inbound rules, just make sure that you circle back and actually set that up, because otherwise you won't be able to get to this page. You can plug in a name and an email here, and then set up a password to log in to your instance. We'll click register, we'll click save, and we're just going to skip the onboarding in this example, but feel free to go through it if you'd like. I'm
going to disable this popup. So this is what you see when you log in: you have this nice dashboard with projects, servers, and sources. I'm going to run you through a really quick example. If I go over to my profile and grab a repo that I have public, I'm going to grab the LLM answer engine project. Then I'm going to click to add a project; in this one I'm just going to call it "answer" and click continue. From here, we're going to click into the project environment and add a resource. You can select from a ton of different resources here; this is just to give you a little bit of an idea of the different things you can plug in. There's a huge list of different services, there are a number of different databases you can set up, and you can also set it up with a Docker image. In this example I'm going to show you how to set it up with a public repository, but you can also set this up with a private repository as well. If I click Public Repository, click localhost, and choose Standalone Docker, then you just have to put in the link to the repository and click Check repository. Then, depending on your project, there's Nixpacks, there's Static, there's Dockerfile or Docker Compose. In this case I have a Dockerfile within my project, so I'm going to go ahead and select that; I'll be using Dockerfile for the build pack variable in the next step, and then you can deploy.

Before I deploy, I'm just going to go over to environment variables, click Developer view, and paste in all of my environment variables from my .env here. Once that's plugged in, you can switch back to the normal view and you'll see all of your different keys plugged in. Now I'm going to click deploy, and then we have this deployment log, like you'd see in some of these popular
services, whether it's Netlify or Vercel. It will start to go through the different steps of reaching for the GitHub repository, pulling everything down, and going through the build process. In this case, since it's using Docker, it will spin up that Docker image and then subsequently grab all the resources and build it out from there. It goes without saying that this is already pretty impressive, right? All right, so now we see that the update is complete. If we go back to the configuration of the project, you'll see that you have these domains. Once you have the link, you can open a new tab and just paste it in, and there we go: now we have a self-hosted version of the answer engine. We can just click through to it, and we see that it's working just as you'd expect.

I wanted to give a huge shout-out to the creator of Coolify; it's incredibly impressive what it allows you to do. The other thing that I didn't really note is that it doesn't just need to be for one project: if you have a ton of projects, you can throw them all in here. You can provision your server on whatever you like: a Raspberry Pi, an old laptop, an EC2 instance as in this case, a DigitalOcean droplet, whatever it might be. It gives you a lot of autonomy over your project, and it's a really impressive option if you do want to self-host your application. That's it for this video. If you found this video useful, please like, comment, share, and subscribe. Otherwise, until the next one!
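The setup walked through in the video boils down to a few shell commands. As a rough sketch (the key filename and host are placeholders, not the exact values from the video; the install URL matches Coolify's documented install script, but check the current Coolify docs before running it):

```shell
# Restrict permissions on the key pair downloaded from AWS
# (filename is a placeholder for whatever you named your key)
chmod 400 coolify-keys.pem

# SSH into the instance; the host comes from the EC2 "Connect" tab
ssh -i coolify-keys.pem ubuntu@<instance-public-dns>

# On the server: fetch Coolify's install script and run it as root
# (it installs Docker and sets up the Coolify containers)
curl -fsSL https://cdn.coollabs.io/coolify/install.sh | sudo bash

# Then open http://<instance-public-ip>:8000 in a browser --
# this only works if port 8000 is open in the instance's security group
```

Piping the script to `sudo bash` rather than plain `bash` is what addresses the "please run as root" message mentioned above.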