
Dive into the world of cutting-edge Natural Language Processing (NLP) with Hugging Face! Through its open-source libraries and model hub, Hugging Face provides a collection of transformer-based models tailored for NLP tasks. Whether you're a seasoned data scientist or just getting started, Hugging Face offers a powerful, user-friendly platform for your machine learning and AI projects. Its robust API, extensive selection of pre-trained models, and active community support make it an invaluable tool for any NLP endeavour. Start your journey with Hugging Face today and transform the way you work with language data.
---
type: transcript
date: 2023-05-15
youtube_id: z41vJlPMqnE
---

# Transcript: Huggingface.js: Step-by-Step Guide to Getting Started

All right. In this video, I'm going to be talking to you about Huggingface.js and how you can get started using open-source models with it. I'm not going to show you how to build a model or anything extreme like that; I'm just going to show you how to leverage existing models that are on their platform. But before I get into that, I want to touch on Hugging Face itself. If you're not familiar with it, Hugging Face is a company that's well known for its work in artificial intelligence, and it's becoming almost like a GitHub of open-source models and datasets: a place where you can go browse models and interact with them, all through a nice interface, and an ecosystem is developing around it. There's been a well-established Python ecosystem for quite some time, and what's exciting is that there's now also a JavaScript library implementation that makes it super simple to interact with their models. So without further ado, to get this up and running, go ahead and make an account on Hugging Face. You don't need a credit card or anything like that to get set up. Once you've done that, head over to the top right-hand corner, go to Settings, and make an API key. On the settings page, go to Access Tokens; it will give you the choice between a read and a write token, and for this example you'll just need a read token. Once you have that, we're going to put it into our .env file. So I'll just open up VS Code here.
Go into an empty directory (or any directory, really, but an empty one keeps it simple) and create a .env file if you haven't already. Make a variable name for your API key, then variable name, equals, and paste the key in there. Once you've done that, I'll run through what we're going to be doing here. We're going to import the required library; in this case, we're using the Hugging Face inference library, so you can go ahead and npm i @huggingface/inference, and we'll also be using dotenv, unless you're putting your API key inline here. You can go ahead and install those; I already have them installed, so I won't run that. Once we have that declared, we're going to initialize, and this is where you'll access the variable that you put in .env, or you could paste the key inline here if you didn't want to do all the .env stuff. From there, we're going to actually initialize the Hugging Face inference class, and this is where a lot of the work happens. In the first example, I'm going to show you how to reference a specific model. Then, in the following examples, I'll show you how you can go into their docs and quickly get going with what they have built into their JavaScript wrapper. So, if you want to use a specific model on their Hub, the first thing you can do is go over and click Models. Once you've done that, you'll have a whole list of all sorts of models. Now, even though there's a lot on the page, the thing that I found useful is to take a look at the tasks. Think about the application: if you have an idea of how you want to leverage AI, or if you just want to start getting ideas, this is a great place to start.
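The setup described above can be sketched in a few lines. This is a minimal example, assuming a `.env` file containing a line like `HF_API_KEY=hf_...` (the variable name `HF_API_KEY` is just an illustration; any name works) and that `@huggingface/inference` and `dotenv` are installed:

```javascript
// Load .env into process.env, then initialize the inference client.
import "dotenv/config";
import { HfInference } from "@huggingface/inference";

// HF_API_KEY is whatever variable name you chose in your .env file.
const hf = new HfInference(process.env.HF_API_KEY);
```

From here, `hf` is the object you call all the task methods on (image-to-text, summarization, and so on).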
You can see you can do text-to-text generation, or image-to-image, or text-to-image. In the first example, I'm going to be using a model that does image-to-text. The way you can find the model I'll be showing you: go to Image-to-Text and sort by most downloaded, or however you want; you might have a particular use case you're looking for, and you can always search as well. And here is the model we're going to use. Just copy the model name and paste it in right as you see here. In this example, I'm going to be using this image here, which I have in this tab, and it's simply a picture of a zebra. So it's going to take this image; we're first going to fetch it. I'll just get the code here. First we fetch it, and once we've fetched it, we convert the image into a blob. That blob, the data representation of the image, is what we pass to the model. You can think of inference as what we're using to get the prediction; it's almost like a prediction API: pick the model, pass in an input, and wait for a prediction. The reason I want to say prediction is that when you get a result you're not happy or pleased with, just know that this is how it works. It's statistics and probability; it's not something absolute where you pass in an image and it tells you exactly what it is. But what I've found with this model in particular, and it's their most downloaded model, is that it works really well. I'd say the hit rate, with what I've tried, is probably 80-plus percent. Now, it can sometimes struggle with abstract images or something very detailed.
If you're expecting a very verbose explanation of a detailed scene, it might struggle with that, but if you pass it the sort of typical images you might use, it does a pretty good job. But instead of just talking through that, I'll demonstrate it here. We pass the result from our image, specify our model, and simply log the result. So if I run node huggingface, it reaches for that image, gets the blob, and once it has the blob, passes it through. And we see: "a zebra standing in a field of tall grass." So it did a great job in this example. Obviously, play around with this. The nice thing about this example is that, since you're fetching the image and converting it to a blob, you'll be able to pass in whatever image URL you'd like; you don't need to download images locally to play around with this. You can just pass a URL inline in your code here. Now, to actually leverage the wrapper the way they've built it, let me just pull it up here. If we go back to Huggingface.js and go to the Inference page, this page is great because it shows you all of the built-in methods and examples you can use. So if it's, say, a summarization model you want to use, it gives you an example with the model, the input, and the parameters. This is a really great place to start. If you're just toying with an idea, I'd encourage you to look here, maybe even before looking at the custom models, because it has a lot of good information: both working examples and the actual different types of models themselves.
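The fetch-then-blob flow described above looks roughly like this. A sketch, assuming the client setup from earlier; the image URL is a placeholder, and the model id is an assumption (it matches the most-downloaded image-to-text model on the Hub at the time, but the video doesn't name it explicitly):

```javascript
import { HfInference } from "@huggingface/inference";

const hf = new HfInference(process.env.HF_API_KEY); // read token from .env

// Fetch any image URL and convert the response body into a blob.
const response = await fetch("https://example.com/zebra.jpg"); // placeholder URL
const imageBlob = await response.blob();

// Pass the blob to an image-to-text (captioning) model and wait for the prediction.
const result = await hf.imageToText({
  model: "nlpconnect/vit-gpt2-image-captioning", // assumed model id
  data: imageBlob,
});

console.log(result.generated_text);
```

Because the input is just a fetched URL, you can swap in any image without downloading it locally first.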
So, if you have a use case you're thinking of and want to explore, take a look through this page. It's a quick read; there are quite a few methods, but you could read through it in a few minutes, really. And the nice thing is that once you have the library installed, and I'll just go back and make a couple of tweaks here, remove the comments and log the result, you can start to see that you can really get going and leverage these often incredibly powerful models with very few lines of code. Here I'm just going to get rid of everything on these lines and change my variable name to match the convention they're using in the docs. Once you have that set up, you can go ahead and just start reaching for the examples here. I'll show you a couple; I'm not going to go through too many, so hopefully you'll be encouraged to go explore and try these things yourself. So here we have summarization, using the facebook/bart-large-cnn model. Now if we run this again... oh, instead of that, we actually have to get our result here. So we'll run that again, and there's the summarized version. Now, obviously, if the input was a fair bit longer, it might be a better demonstration of summarizing, but this is just an example. Once you have that set up, you can really just come in here and find the model you want to use; translation, say. Some of these models might require you to pass other parameters, like specifying the language, but this is a really good place to get started. And the thing is, if I just hop back to Models, what's nice with Hugging Face is that it's not just NLP models. There are lots of different types of models.
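The summarization call shown in the video can be sketched like this, again assuming the client setup from earlier; the input text and `max_length` value are illustrative:

```javascript
import { HfInference } from "@huggingface/inference";

const hf = new HfInference(process.env.HF_API_KEY);

// Summarize a passage with the model used in the video.
const result = await hf.summarization({
  model: "facebook/bart-large-cnn",
  inputs:
    "Hugging Face hosts thousands of open-source models and datasets. " +
    "Its hub lets developers browse models by task, try them in the browser, " +
    "and call them from Python or JavaScript with only a few lines of code.",
  parameters: { max_length: 60 }, // illustrative cap on summary length
});

console.log(result.summary_text);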
There are models for image-to-text, like I just demonstrated, but there are lots of other models coming out here too. You can use it for voice recognition, or all sorts of different use cases. There might be tabular data you want to work with, or even text-to-speech, which is pretty interesting. Say you want an application that will speak back to you: I haven't played around with this, but that could be an interesting use case, say you want to build something like Siri. This is a great platform to start to flesh out and tinker with whatever ideas you might have. I'm not going to go into any more examples here; you can hop back through the video if you'd like to see where I found these things. But if you simply Google Huggingface.js and go to the Inference API docs, you'll get a host of different examples. And yeah, I hope you found this useful. If you did, please like, comment, share, and subscribe. If you have any questions, as always, feel free to drop them in the comments below. And otherwise, until the next one.
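For the text-to-speech idea mentioned above, which the video doesn't demonstrate, the same wrapper exposes a `textToSpeech` method. A hedged sketch; the model id is an assumption taken from commonly used Hub text-to-speech models, and the output filename is arbitrary:

```javascript
import { writeFile } from "node:fs/promises";
import { HfInference } from "@huggingface/inference";

const hf = new HfInference(process.env.HF_API_KEY);

// textToSpeech returns the synthesized audio as a Blob.
const audio = await hf.textToSpeech({
  model: "espnet/kan-bayashi_ljspeech_vits", // assumed model id
  inputs: "Hello from Hugging Face",
});

// Write the audio blob to disk so it can be played back.
await writeFile("hello.wav", Buffer.from(await audio.arrayBuffer()));
```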