
Build a custom chatbot without needing to embed all your data into a vector database. In this example I use the OpenAI functions agent in LangChain to show you how you can build a chatbot based on your own custom functions and data sources.

Links:
- https://github.com/vercel-labs/ai/tree/main/examples/next-langchain
- https://smith.langchain.com/projects
- https://platform.openai.com/account/api-keys

GitHub Repo: https://github.com/developersdigest/Agent-Based-Chatbot-with-Langchain-and-Next.js/blob/main/route.ts
---
type: transcript
date: 2023-08-07
youtube_id: qVM2BhAnKqo
---

# Transcript: Building an Agent-Based Chatbot with Langchain and Next.js in 15 Minutes

In this video I'm going to show you how to build your own custom chatbot with LangChain in Next.js. What I'll be showing is a way to set up a chatbot without having to embed, or create vectors of, all the information you want to interact with. We'll be using the OpenAI function calling feature, which lets the chatbot ping whatever APIs you want to access. To demonstrate how this can work: if I ask "what is the price of Bitcoin" (I use this example because it's a free API that anyone can use to get up and running, and it doesn't even need an API key), you can see what it's doing under the hood: it reaches out for that cryptocurrency price and returns the value. You can also ask "what is the price of Bitcoin in Canadian dollars"; the function takes two arguments, one required and one optional, so you can pass in the currency type as well. The main point here isn't really to build a chatbot for interacting with cryptocurrency prices; it's to show you how to scaffold calls out to APIs of your own, whether they're proprietary or just an idea you have for ingesting data from a different API or data source. We'll be using a number of different pieces of LangChain, but the code itself is pretty small and reusable, so if you want to take this, fork it, and use it however you want, feel free.

The other thing I'm going to quickly incorporate in the project is LangSmith, which I encourage you to use in your LangChain projects. It gives you an overview of things like how many times your chatbot (or your application generally; it doesn't need to be a chatbot) has run and how many tokens have been used. Really, it's a whole lot of detail: it shows the metadata passed back and forth in and out of the LLM, the tools that were used, the response times of everything, and even the nuances of every token being used, whether completion tokens or prompt tokens. Super useful; like I said, I'd encourage you to use it in all your LangChain applications.

So the first thing we're going to do is dive in and open up VS Code, then head over to the first link above. It's the AI repository from Vercel, and it gives you the ability to use a number of different frameworks very easily. If you look at the examples, I'm using the next-langchain example here, but if you want to use another framework, what I'm about to show you should work pretty seamlessly, because I'm not going to touch the UI layer at all, just the back end. You can run through the setup steps there with npx, yarn, or pnpm, whichever you'd like: copy the command, paste it into your terminal, and you should end up with a directory structure like this. Once you have it, go ahead and npm install everything. The one extra thing I'd encourage you to do is upgrade to langchain@latest (with npm or pnpm, whatever command you chose), just so you have the latest version; I've noticed the versions pinned in the repository sometimes fall out of date with LangChain. Once that's set up, you can go over to the OpenAI website to get your API keys.
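The setup steps above can be sketched as shell commands. These follow the create-next-app example pattern the Vercel AI repo uses; the app directory name is a placeholder, and the exact flags may differ from the current README, so treat this as a sketch rather than the repo's verbatim instructions.

```shell
# Scaffold the next-langchain example from the Vercel AI repo
# (the app name "my-agent-chatbot" is just a placeholder)
npx create-next-app my-agent-chatbot --example \
  "https://github.com/vercel-labs/ai/tree/main/examples/next-langchain"
cd my-agent-chatbot

# Install dependencies, then upgrade langchain to the latest version,
# since the version pinned in the example can fall out of date
npm install
npm install langchain@latest
```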
Getting a key is super straightforward: just make an account (you'll probably get some free credits off the bat if you haven't used it yet), copy your key, then copy .env.local.example to .env.local and paste the key into the OPENAI_API_KEY spot. Next, LangSmith: head over to LangSmith, go through the flow to make a new project, click TypeScript, and you'll get all the environment variables you need. It gives you a little block you can copy, so just paste those four into your .env file, and you can grab your LangSmith API key from the bottom left. Once you have that, that's all you need to have LangSmith running in the background, and you can basically forget about it: you don't need to declare it anywhere in the JavaScript, it's just there.

The one place in the application we'll be working from is the route file. Like I said, I'm not going to touch the actual UI layer, but it's very simple, so feel free to edit it once the back end is all set up. Okay, the first thing we're going to do is import some packages. I'll run through these quickly and touch on each one a bit more once we actually use it. We'll be using DynamicTool and DynamicStructuredTool: one takes a string as its argument and the other takes an object, so one example will be very simple and the other a bit more involved. DynamicStructuredTool is what I used for that cryptocurrency example, and I'll show you a simpler one as well. We'll be using the ChatOpenAI class, which is how we actually interact with the OpenAI model. The agent we're going to use is initializeAgentExecutorWithOptions, and we'll specify the OpenAI functions agent type once we get to using it. Then we'll be using the WikipediaQueryRun tool; I'll show you where and how to use built-in tools if you don't want to write your own, or if you just want to see what else is included in LangChain. Part of the reason I chose this repo as a starting point is to be able to use the streaming back and forth it has set up between the different frameworks; that's what it does really well, and it gives you a very simple example to build from. We'll also be incorporating the Zod library for schema validation; I'll touch on that once we actually get there, and it should make a bit more sense if you haven't used Zod before. Then, because the template is built by Vercel, you'll see an export that sets the runtime to Edge, so once you're done and set up, if you want to deploy, it should be pretty straightforward to deploy on Vercel.

Okay, so if you're following along, go into this file, remove everything inside POST, and we'll go line by line through everything we need to set up. The first thing, pretty straightforward, is to extract and destructure the incoming request from the front end. Next we initialize the chat model; here you can specify the temperature, streaming, and all sorts of other OpenAI options right in the object. Now, I'm using the Wikipedia query tool just to demonstrate the different built-in tools you can reach for. I'm familiar with this one because it's the one I added to the Node.js implementation of LangChain; I'm not as familiar with the other ones, just from not having used them as much.
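For reference, here is a sketch of what .env.local ends up holding. The values are placeholders; the four LANGCHAIN_* variable names are what LangSmith's TypeScript setup flow handed out around the time of this video, so double-check them against the block LangSmith actually gives you.

```shell
# .env.local (values are placeholders)
OPENAI_API_KEY=sk-...

# The block LangSmith's project-setup flow gives you:
LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT=https://api.smith.langchain.com
LANGCHAIN_API_KEY=ls__...
LANGCHAIN_PROJECT=my-agent-chatbot
```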
The tool itself is pretty simple: it gives you the top results (you can specify, say, the top two results by changing that to two) and the length of the document that's returned. The one thing to note is that I'm using a relatively short length, because the thing to consider is that if you jack this up to, say, 3000, you'll be passing all those tokens into the LLM back and forth, so be mindful of that. I've found something around 300 will give you a generally short blurb about a topic, enough to inform the LLM's context. An example: ask "what is LangChain." If you did that in ChatGPT, it might give you an answer like "LangChain is a blockchain project," which isn't true, whereas if you run this tool you get that layer of validation: it reaches out to the LangChain Wikipedia page and gives you that first little blurb of text from the page.

Okay, next is the simplest example: the simplest custom tool whose function gets passed to the LLM. If I go back to the application and ask "what is foo" (nope, not food), you can see side by side that it triggered the foo function, and foo is "a demo for YouTube." Oops, I even had a typo in there, but it still caught it: "This is a demo for YouTube." Very simple: we have the function name and a description of what the function does. The reason the description is important is that it's how the LLM is informed of what the function can do, and it becomes even more important in the next step, where you have structured data and a structured tool that accepts an object of arguments, and you need to specify those arguments. There are a few other pieces, beyond just writing the function, that you need so it interacts with the LLM and plays nicely, and that's what this example aims to show you.
This one I called fetchCryptoPrice, with the description "fetches the current price of a specified cryptocurrency," and then the function itself. We log out the options it was passed, and in my examples you can see the currency argument when I specified Canadian dollars. Notice that I didn't put "CAD"; I just wrote "Canadian" and the model passed the right thing in. You can put in a different crypto name too, Ethereum or whatever you see fit. I used this API just because it's a simple URL with no API key, so for this step you don't need to go get a key to follow along: it's a simple fetch request to the URL, get the response, and return the crypto name and the price itself.

Next we just declare all the tools we'll be using in our example: Wikipedia, foo, and fetchCryptoPrice. Then here is where we actually establish which agent type we're going to use. As I mentioned, we're using the OpenAI functions agent executor, which handles all the responses in and out, including when multiple functions need to run. So if I ask "what is the value of foo and the value of Bitcoin in lira," you can see that if you want multiple functions invoked within one response, you can do that: the value of foo is "this is a demo for YouTube," and the value of Bitcoin in lira comes back alongside it. You're not isolated to using one function at a time. Okay, next we extract the most recent message, which is what we'll pass in. Again, change this as you see fit; I'm deliberately using a small example that uses as few tokens as possible, just to be mindful if you're running and testing this, and to keep it straightforward and to the point.
Then here is where we actually execute the agent: it runs in the background with the input from the user. After that comes how we break up the response from the agent executor. The thing with LangChain is that streaming is becoming more and more readily available, but right now, within the Node.js implementation, I didn't find a way to stream results out of an agent that worked the way I wanted. So essentially all this does is create a readable stream and produce that familiar output. It's not a perfectly consistent stream painting a line across the screen; it adds a bit of randomness to the output, so it feels like it's streaming right out to the UI. Then all we have to do is return that response with the StreamingTextResponse class built into the ai library, and that's pretty much it: you're off to the races. You can take this, swap out the functions, and add multiple dynamic tools, or any of the tools LangChain already has implemented. If you want to find the built-in tools for LangChain, go over to langchain/tools and take a look at the tools that are available. A number of them do require API keys, and a lot of those have free tiers you can access with a key, so I encourage you to look at what's already pre-built; maybe you don't need to build it yourself and can just reach for one of these. But I wanted to show an example of writing your own functions and building your own custom agent, with a simple implementation and boilerplate, so you can come in, remove everything inside the function, swap out the name and schema, and get off to the races building out your own custom chatbot.
If you're curious how to build a chatbot where you do embed data, where you have a lot of data and want to do that similarity search between documents, I'll link a couple of videos in the description and within the video here that you can click on. Otherwise, if you found this video useful, please like, comment, share, and subscribe, and until the next one.