
Check out CopilotKit on GitHub at https://go.copilotkit.ai/copilotkit to view the demo and more featured in this video. While you're there, star ⭐️ their repository and support open source.

In this video, I demonstrate how to build an application that integrates UI components with an AI assistant using CopilotKit's new CrewAI integration. We start with a recipe creator example and adapt it for a different use case, exploring setup, data communication, and UI interaction. I guide you through pulling the repository, setting up environment variables, and ensuring seamless synchronization between the front end and back end. Along the way, I show how to customize the application for specific needs, such as creating a workout plan. Key features include predictive state updates and streaming responses from a large language model.

- 00:00 Introduction to Building AI-Integrated Applications
- 02:09 Getting Started with CopilotKit
- 03:21 Exploring the Demo Viewer
- 05:23 Customizing the Application
- 11:00 Integrating AI with CrewAI
- 15:29 Final Thoughts and Next Steps
---
type: transcript
date: 2025-03-27
youtube_id: HYYiydjNh-g
---

# Transcript: Building AI-Powered Full-Stack Web Apps with CopilotKit + CrewAI

In this video I'm going to show you how to start building an application where your UI components and your AI assistant work together seamlessly and stay perfectly in sync. I'll show you how to get started with CopilotKit, and specifically their new CrewAI integration. We're going to start with this recipe creator, and ultimately we're going to adapt it for a different use case. I'll walk you through the setup process, as well as the different areas you can navigate and update to make this application your own. By the end of the video, you'll understand how the data communicates between the pieces: where we're reaching for information, where we're ultimately sending information to a large language model, and how everything stays up to date and in sync within the UI.

There are a few different areas where a user can tweak things: the dropdowns, the different checkboxes, and the ingredients and instructions. They can click the improve-with-AI button, or alternatively type a message directly into the right-hand panel. Effectively, when a user sends in a request, the nice thing about CopilotKit is that it gives us shared state between what's happening on the back end of our application and what's in the front end.

Here's how our application is broken out: we have our Next.js application layer, and within that a simple API route that connects to our Python backend architecture. That's where we have a FastAPI endpoint, and within this, that's going to be how
we communicate those messages back and forth with our CrewAI agents, as well as what we ultimately send back as a response to the front end. In terms of requests, we're sending in the details of the form as well as what's typed in the natural-language pane; on the back end, we're streaming back text responses from an LLM if necessary, or alternatively updating the information within the UI state. Without further ado, let's get started.

The first thing we're going to do is pull down the repo. Once you've cloned the repository, we're going to go into the examples folder. What I showed you at the outset of the video lives in the demo viewer. The thing I love about this demo viewer is that a ton of CopilotKit's integrated features are all built within it. One of my favorite features included in this example is predictive state updates. It's really neat because it's similar to Cursor: I can say something like "I want a title for this blog post," and it will analyze what we have in the text panel and give us suggestions inline. As we can see, we get the suggested title "Anthropic Economic Index: Understanding AI's Impact on Jobs and the Economy," and you even have the ability to confirm or reject it. If I want to reject that title, I can click reject; alternatively, you can accept it and work through documents like that. It's a really neat way to interact with a text document, and if you want to build out something like Cursor for writing, you can do all of that within this same demo and the same repo I'm about to show you.

The first thing I'm going to do is open up two
different terminals side by side. On the left-hand side we're going to have our agent architecture: within the demo viewer, I'll go into the agent folder and set up a quick environment variable. Similarly on the right, we'll go into the examples folder, then into the demo viewer, and install all of the dependencies; I'm just going to pnpm install everything.

Now, within our agent folder, under demo, we see the backend agent architectures for all of the different examples we saw in the UI. The shared-state example is that recipe example, and that's the one I'm going to focus on: I'll show you how to take it, tweak it, and make it your own for whatever use case you might have in mind. The one thing we are going to be leveraging in this example is the OpenAI API; that's how we communicate and interact with an AI model. So go ahead and get your API key from OpenAI. We're going to create a .env file, add our OpenAI API key, and save it out. Once you've saved the .env within the agent directory, make sure you copy the same environment variable and paste it within the demo viewer directory as well. Once we have that, I can run poetry run server to start our agent architecture, and similarly on the front end I'll start our Next.js application. Now we can open up our application, and we should have what we saw at the outset of the video: all of those different examples. What I love about this repo is that it collects basically all of these really great use cases for leveraging CopilotKit in one place.
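For reference, the .env file is tiny. A sketch (the value is a placeholder; use your own key from the OpenAI dashboard):

```
OPENAI_API_KEY=<your-openai-api-key>
```

The same file goes in both the agent directory and the demo viewer directory, since both sides read the key.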
So now if I just test this out, I'll say "I want to make a steak in 15 minutes with some healthy vegetables" and send that in. Just like we saw, we now have this working: it set high protein, we have all of the ingredients, and we also have the instructions. Now that we have it set up, I want to go through how this works and how you can begin to make it your own.

First, within our demo directory, I'll go into the source directory and down to our configuration. This is a list of all of those different capabilities we have in the sidebar; when the app loads, it iterates through these and maps them all out. I want to set the default state of our application to the shared-state demo. In the page where these are all listed out, under the selected demo ID, I'll set the initial state of the application to the shared state. Now if I go back to our application and refresh the page, it should always open on this shared state between agent and UI. Next, I'm actually going to get rid of the sidebar in its entirety; in this example I really want to focus on this core demo, since what I'm going to build is something related to exercise. The first thing I'll do is remove the style for the background image, and instead I'll say I want a background that is black, purple, and blue. I'll save that out, and now our application has the black, purple, and blue interface.

Now that we have this, I want to go side by side and show you the UI layer as well as the agent architecture, and how you can begin to make something like this your own. What I'm going
to do is put our agent architecture, which is the agent.py file, on the left-hand side, and the page on the right. If I open these up side by side, we'll see a lot of similar pieces shared between the state of the front end and the back end of our application. The first one is the skill level. Instead of the recipe, let's just call this the workout; we can keep beginner, intermediate, and advanced, since they're all still applicable. As for the special preferences, I'm going to make these different types of exercises instead. The one key piece here is that when you make a change on the back end, you have to make sure you reflect that change on the front end as well, and vice versa. The reason I like to work side by side is that it makes it really easy to keep continuity between your front and back end. There are some nice tricks if you're using something like Windsurf or Cursor: I can say "update these" and paste in what I had just generated on the left-hand side. I'll submit that, and there we go, it's updated on the front end as well. If I pause there, update both of these, and show you the application so far, we now have the special preferences; we still have to update things like the cooking time, but instead of being about recipes, everything is now in the context of a workout. We can go through and update some of the labels, like "workout type." Pretty much the last thing to swap out is going to be the ingredients. You can also take this a step further and swap out things like the variable names as well.
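To make the "mirror both sides" point concrete, here's a minimal Python sketch of what the shared workout state might look like after the rename. All field names here are hypothetical stand-ins for whatever keys you actually use in agent.py and the page component:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical field names -- the actual keys in the demo's agent.py
# and page component may differ.
@dataclass
class WorkoutState:
    skill_level: str = "beginner"  # beginner | intermediate | advanced
    special_preferences: List[str] = field(default_factory=list)  # exercise types
    workout_time: int = 30  # renamed from cooking_time, in minutes
    equipment: List[str] = field(default_factory=list)  # renamed from ingredients
    instructions: List[str] = field(default_factory=list)

# Whatever shape you choose, the front-end state object must mirror it
# one-for-one so CopilotKit's shared state stays in sync.
state = WorkoutState(skill_level="intermediate", special_preferences=["strength"])
```

Whenever you rename a field on one side, grep for the old name on the other side before moving on; that's the whole continuity discipline in practice.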
Now that we have the visual representation of what we want to update within our application, we can go and swap out a handful of different things. If you want to swap out variables, you can highlight them and rename them to something like workout time. Next up is the cooking time: if you press Cmd+D within VS Code, Cursor, or Windsurf, you get multiple cursors so you can easily swap things out. I'll type "workout time" and save that out.

The next part is a core piece of this component. Everything within here is mapped to the different parameters we have within this function call, or tool call as it's otherwise known. In the UI we have the skill level, the special preferences, the cooking time, ingredients, and instructions, and you can see that it maps to all of these various fields. To quickly touch on what a function call or tool call is: it's effectively a way to tell the large language model you're using what your function does, the parameters it accepts, the types of those parameters, and what the function returns. While it may sound like a lot, they're relatively straightforward once you get the hang of it. Instead of having instructions geared toward recipe generation, we can gear this toward a workout plan, or whatever the interaction is. This could be any UI interaction you want your copilot to drive; it could be a proprietary application where you have a list and you want to use natural language to select and email different individuals, or whatever it might be. You can do quite a bit with this type of mechanism. Within here, we're going to have our recipe schema, and this is
going to be an object with a handful of properties inside it. One of them is the skill level of the recipe, but in this case we can just swap it out to workout; for the special preferences, again, we can swap out basically every instance of "recipe" for "workout." One thing I'm not going to demonstrate, but that you could definitely do, is swap out these values wherever that key or variable is referenced throughout the application. If you're tidying this up for your particular use case, just make sure you change the variable names as well as the keys so they're specific to what you're mapping toward.

Once we have that, the key portion of our application, where we actually integrate with the large language model, is our CrewAI flow. If you haven't used flows before, it's a similar concept to the LangGraph architecture, where you have interconnected nodes, sometimes referred to as a cognitive architecture. The way to think about flows is that you have an entry point where the graph starts, and from there the language model can take a series of different routes depending on what conditions are met or what the particular task is. Our flow is a relatively simple example: we send a request to a large language model, and similar to the other examples, the prompt says you are a helpful assistant, and this is the current state of the workout plan. You can swap out all of the recipe values there too. So instead of recipes, similar to above: you are a helpful assistant for creating workouts; this is the current state of the workout plan; you can modify the workout plan by calling the generate recipe tool; if you have created or modified the workout plan, just answer in one sentence about what you did.
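As a rough sketch, the tool definition handed to the model looks something like the following. This follows the common OpenAI-style function-calling format; the names and fields are illustrative, so check the actual schema in the demo's agent code:

```python
# Illustrative tool/function definition in the OpenAI-style format.
# The real schema in the demo's agent.py may name things differently.
GENERATE_WORKOUT_TOOL = {
    "type": "function",
    "function": {
        "name": "generate_workout",
        "description": "Create or modify the workout plan shown in the UI.",
        "parameters": {
            "type": "object",
            "properties": {
                "skill_level": {
                    "type": "string",
                    "enum": ["beginner", "intermediate", "advanced"],
                },
                "special_preferences": {
                    "type": "array",
                    "items": {"type": "string"},
                },
                "workout_time": {"type": "integer", "description": "Minutes"},
                "equipment": {"type": "array", "items": {"type": "string"}},
                "instructions": {"type": "array", "items": {"type": "string"}},
            },
            "required": ["skill_level", "workout_time", "instructions"],
        },
    },
}
```

Each property here corresponds to one of the UI fields, which is exactly why filling in the tool call fills in the form: the schema is the contract between the model and the interface.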
Even though the results come back in a stream of text and tokens, similar to what you get from something like ChatGPT, we can have that structure stream into our application and map to the UI. Just to demonstrate the streaming effect: if I say "I want a high-intensity strength training workout that lasts 30 minutes," it happens very fast, but what you'll see is that it goes through the structure of the form and effectively fills in that schema of values. In this case, the skill level was inferred, we have the 30 minutes just as I instructed, strength and high intensity are checked off, we have the equipment (the dumbbells, yoga mat, and timer), and finally we have the instructions.

Finally, we're just going to run this model and set the response to stream with CopilotKit. That's the nice thing about CopilotKit: they have these really helpful methods and functions, as well as hooks on the front end, so you can stream in state and keep that continuity between what your agent is doing and what your front end shows. The great thing about what they're building is that they're really trying to narrow the gap in the complexity it takes to build these full-stack, agentic AI applications. Previously you might see a lot of examples that ran in the terminal, or front ends that were a bit of a patchwork in how they were implemented; with this, there's a standardized format, so if you want to swap between CrewAI or LangGraph or whatever else, there are rails in between that make that process easier. Finally, within here, we're setting our model to OpenAI GPT-4o. It's using LiteLLM, so if you do want to swap this out for
another model, you definitely can; just be mindful of swapping out and setting up all of the relevant API keys. Within the model call, we're sending in the system prompt we specified just above, the whole state of all of those messages, and finally the tools we just defined. In this case it's just that one tool, acting as a structured schema for how our application renders the different pieces in the UI. From there, we're basically appending the message we get back from the model to the state array, so we keep the context of all of the previous messages.

Then we have how we're handling the tool calls. Tool calls involve a little bit of parsing to get the results back from the LLM, because what we get back is a JSON payload. Once we have the tool call results and their parameters, we have to loop through those parameters; depending on what your tool does and how many tools you have, you'll do a bit of parsing and organizing, and sometimes, if you're using something like an external provider, you'll actually invoke those methods to get information that you then pass further into the context of your application, or back to the large language model. Once we get the tool call result, we use it to append to our messages: in this case, the recipe, or rather the workout, that was generated.

Otherwise, that's pretty much it. Obviously you can build on top of this, but this is just an introductory example of how you can get started integrating the front end with the backend base of an agent architecture. Like I mentioned, there are a number of different examples within the demo repo.
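The parsing step described above can be sketched like this. The payload shape follows the common OpenAI-style tool-call format; the field names and values are illustrative, not taken from the demo's code:

```python
import json

# A simulated tool-call entry as it might come back from the model,
# in the common OpenAI-style shape (illustrative, not the demo's exact code).
tool_call = {
    "id": "call_1",
    "function": {
        "name": "generate_workout",
        "arguments": '{"skill_level": "intermediate", "workout_time": 30, '
                     '"instructions": ["Warm up for 5 minutes", "3x10 goblet squats"]}',
    },
}

# The arguments arrive as a JSON *string*, so parse it first,
# then loop through the individual parameters.
args = json.loads(tool_call["function"]["arguments"])
for key, value in args.items():
    print(f"{key}: {value}")

# Append the tool result to the running message history so the model
# keeps the context of what it just generated.
messages = []
messages.append({
    "role": "tool",
    "tool_call_id": tool_call["id"],
    "content": json.dumps(args),
})
```

The same loop generalizes to multiple tools: dispatch on the function name, parse each arguments string, and append one tool message per call.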
If you have writing use cases or human-in-the-loop use cases, I really encourage you to try these out; they're definitely a strong basis for building out these types of applications. Otherwise, that's pretty much it for this video. If you found it useful, please comment, share, and subscribe. Until the next one!