
In this video I take a first look at the Vercel AI SDK. I show you the Vercel AI Playground, then how to get up and running with the SDK using Hugging Face, Anthropic, LangChain, or OpenAI in Next.js, Nuxt, or Svelte in minutes.

Links:
- https://sdk.vercel.ai/
- https://sdk.vercel.ai/docs
- https://vercel.com/blog/introducing-the-vercel-ai-sdk

`npm install ai`
---
type: transcript
date: 2023-06-16
youtube_id: 7Cw1g5d9HD0
---

# Transcript: Getting Started with Vercel AI SDK & AI Playground

In this video I wanted to show you the new Vercel AI SDK as well as their AI Playground. Right off the bat, before getting into the SDK, I want to show their AI Playground, which I think is a phenomenal execution of a product. What it gives you is the ability to compare LLMs side by side. Just to demonstrate, you can select a handful of different models. Some are behind the pro tier, and some, like these Anthropic models, you will need to log in for, but you can add or remove models and also sync the chat. So if I go ahead here and say "hello world" and hit send, you'll see that all of a sudden I get responses from two separate LLMs within one screen.

You can imagine how this would be helpful if, say, you're developing an application, or thinking of an application to develop, and you want to figure out which LLM might be best for the use case. This gives you a nice interface with quite a few LLMs already integrated: models from Hugging Face, the Anthropic models, obviously OpenAI, and a handful of others. It's a really nice environment just to play around with the kinds of responses you get. Maybe an open-source Hugging Face model accomplishes the task for you, and you don't necessarily need something like ChatGPT, GPT-3.5, or GPT-4. So if I just say "demonstrate what you can do"... obviously you can get a lot more creative with your prompts here. I encourage you to go check it out: just go to sdk.vercel.ai and you can see this interface.

Now within here there's also a prompt view. Say you're in the realm of prompt engineering, or really trying to refine a good prompt: it's a similar experience, where you can add or remove
models and see, side by side, how those models handle the prompts themselves. So it's sort of similar, just a different look on the interface.

The other thing to note is that this is very new, so you saw a little error come up there. I was just going back and forth with Jared on Twitter about a versioning issue I had, and I want to give a huge shout-out to Jared and team: they resolved the issue within minutes, and I was able to get up and running to get this video going.

The other thing to note with the Vercel AI SDK is that there's a huge emphasis on streaming. Streaming UI is what you can think of as the ChatGPT-like interface: as the response comes back, it streams to the screen word by word, or piece of code by piece of code, rather than waiting for the entire response to load before showing it. You can see an example here of the problem they're trying to solve, and within the documentation there's a bit of a technical explanation of what they're doing and why. I encourage you to check out their documentation as a whole; it's great from what I've seen, though I haven't gone through everything quite yet, since this is really a first look and the SDK just came out yesterday.

As for the integrations within the AI SDK, if I pull them up and search for examples, you can see some of the features that are built in. It's compatible with Vercel's Edge Network, so you can scale and deploy quickly to a fast environment, and there are built-in adapters for Hugging Face, OpenAI, Anthropic, and LangChain, which is great. The streaming-first approach I think is really huge; it will prevent a lot of hiccups and friction for developers trying to build their own solution for this.
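To make the streaming idea concrete, here is a minimal, dependency-free sketch. This is not the SDK's implementation (its helpers build on web streams); it just illustrates the token-by-token pattern the SDK wraps for you:

```typescript
// Dependency-free sketch of token-by-token streaming with an async
// generator. NOT the Vercel AI SDK's code, just the general pattern:
// the producer emits tokens one at a time instead of a full response.
async function* tokenStream(tokens: string[]): AsyncGenerator<string> {
  for (const token of tokens) {
    // In a real backend, each token arrives as the model generates it.
    yield token;
  }
}

// A consumer (like a chat UI) renders each chunk as it arrives,
// instead of waiting for the whole completion to finish.
async function renderStream(stream: AsyncIterable<string>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk; // a chat UI would append this to the screen here
  }
  return text;
}
```

The point of the SDK is that you never hand-roll this plumbing; its adapters and hooks handle the producer and consumer ends for you.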
Having this within a package that you can just install, with streaming ready out of the gate, I think is huge. It saves a lot of effort and potential technical complications from trying to roll this solution yourself.

Among the templates and examples within the repo, you have Next.js and Svelte integrations: one with OpenAI, one with Hugging Face, one with LangChain. I did see somewhere that they're going to support new frameworks as they go, so this is just the first rollout, and I expect a lot more features in it. Some of the notable features are the built-in hooks, `useChat` and `useCompletion`; I'll be pulling down their repo in just a second to show you how easy it is to get up and running with them. So I encourage you: read through the blog post, check out their documentation, and if you're like me and learn best by diving into the code head-on, go over to their GitHub repository and pull it down.

One thing to note with the SDK is that they were able to secure the `ai` namespace, if we call it that, on npm, so you can just `npm install ai` and that's their package. Pretty great, and I think they'll get a lot of traction even from that aspect alone. It goes without saying the package itself is very helpful and useful, but it's pretty neat, just as an aside, that they were able to secure that name.

Once you're on their GitHub repo page, you can either go to the examples themselves and look up the one you want to start with (say you have a Next application: you can go through the readme and execute the command there in your terminal to get started), or you can go back to the root of the repository and pull it all down with a git clone from your VS Code. That's what I'm going to show you here, in case you want to try out a handful of these examples.
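As a rough illustration of what a hook like `useChat` manages for you (this is not the SDK's source, just a sketch of the state bookkeeping): it keeps a message list and grows the assistant's in-progress reply as chunks stream in.

```typescript
// Hypothetical sketch of the chat-state bookkeeping a hook like
// useChat handles: a message list where the assistant's reply grows
// chunk by chunk as the stream comes in. Not the SDK's actual code.
type Message = { role: "user" | "assistant"; content: string };

function appendChunk(messages: Message[], chunk: string): Message[] {
  const last = messages[messages.length - 1];
  if (last && last.role === "assistant") {
    // Grow the in-progress assistant message immutably.
    return [...messages.slice(0, -1), { ...last, content: last.content + chunk }];
  }
  // First chunk of a new assistant reply starts a new message.
  return [...messages, { role: "assistant", content: chunk }];
}
```

In the real hook, each update like this re-renders your component, which is why the reply appears to type itself out on screen.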
So if we git clone and pull down that repo, you'll see we have this `ai` directory. We can go into `examples`, and these are the ones we just saw; let's take the Next.js OpenAI one. Once we're within that directory, go ahead and install, and the one thing you'll have to do is remove `.example` from the environment file, then reach out to OpenAI to get an API key. Once you have your key, put it in your `.env.local`, pasting it in after the equals sign, then save and close that out.

Once that's all set up, you can simply run `npm run dev` and open it up in your browser. I'll put the terminal off to the side just to see if we get any errors; I don't see any. And if I type "hello world", you can see that streaming response. "Tell me a story that is relatively long", just to demonstrate the streaming aspect of it. So you can see this is a really quick way to get up and running with a chatbot, or even just a chat interface of sorts, within Next.js or the other examples they have. This one is an OpenAI example, but as you can see there's LangChain, Hugging Face, a Next implementation, SvelteKit, a handful of things in here you can play around with. Really quick to get off and running, and really phenomenal work by the team at Vercel in getting this up here for all of us to use.

A couple of other things to note. Within the AI Playground, another nice feature is that you can get code for the model, and there's a dropdown with a number of selections, so whether you want SvelteKit or just the Node.js code, you can go in and select it right there.
So when I was looking at this off the bat, I did notice some variances to look out for: some of the examples are a little bit different from one another. I'm trying to think of one that would demonstrate that, but just be mindful that this came out yesterday, so if you hit any sort of issues or rough edges, let them know. They're a really great, really responsive team, and I have no doubt that any issues will get resolved pretty quickly as they get surfaced.

So you can go in here, copy the code, and you'll have examples for whatever model you want to interact with. Say instead of the OpenAI example you want to play around with Claude: if you're able to get an API key, you can go over to Claude, and there's your example code for Next.js, and here's the example code for Node.js. You can really see how easy this is for a developer: come in here, get boilerplate, and just get up and running. You don't have to worry about how to handle the streaming of responses; you can just handle the business logic and all the other facets of your application.

Now, one thing that I haven't checked out yet but am excited for is the LangChain implementation. If you haven't used LangChain yet, I really encourage you to. There is so much within the LangChain ecosystem where you can build on these LLMs and make some really cool things, whether it's agents or things similar to AutoGPT, where if there's a particular task you want accomplished, you can run through it. There's also support now for OpenAI functions, which is a whole other level of computational execution for handling some of the drawbacks of an LLM. For instance, if you want up-to-date information, you can use function calls for that, or if you want to do particular calculations that the LLM doesn't handle well, you can use the function calling feature for that.

So there's a ton of stuff to play around with. I think this solves a huge pain point in the development of applications, so I'm definitely going to be using this a whole lot more, and I'll be going into more examples using it in future videos. As always, if you found this video useful, please like, comment, share, and subscribe, and otherwise, until the next one.
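As a footnote to the function-calling mention above, here is a minimal, hypothetical sketch of the dispatch step: the model replies with a function name plus JSON arguments, and your code runs it. All names here are made up for illustration, not from OpenAI's or Vercel's APIs:

```typescript
// Hypothetical sketch of dispatching a model's function call.
// The model emits { name, arguments }; your code executes the match
// and can feed the result back to the model for a final answer.
type FunctionCall = { name: string; arguments: string };

// Example tools (made-up names); real ones would fetch live data
// or do exact math, covering what the LLM alone handles poorly.
const tools: Record<string, (args: any) => string> = {
  add: (args) => String(args.a + args.b),
  today: () => new Date().toISOString().slice(0, 10),
};

function dispatch(call: FunctionCall): string {
  const fn = tools[call.name];
  if (!fn) throw new Error(`unknown function: ${call.name}`);
  return fn(JSON.parse(call.arguments));
}
```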