
# No-Code AI Automation with VectorShift: Integrations, Pipelines, and Chatbots

In this video, I introduce VectorShift, a no-code AI automation platform that enables you to create AI solutions effortlessly. I demonstrate how to leverage various integrations such as Google Drive, OneDrive, Salesforce, HubSpot, and Notion to build AI applications. The video walks through creating a pipeline from scratch, setting up knowledge bases, configuring integrations, and using nodes for input and output. Additionally, I showcase examples of building chatbots, accessing AI models like GPT-4o, and integrating with platforms like Google Drive, Dropbox, Notion, and Google Docs. Follow along as I detail how to retrieve information from documents, perform web searches, and utilize customization options to build powerful, production-ready AI applications.

00:00 Introduction to VectorShift
00:19 Getting Started with VectorShift
00:53 Creating a Pipeline from Scratch
01:12 Working with Nodes and Integrations
02:27 Building a Knowledge Base
04:59 Running Queries and Viewing Results
06:07 Advanced Integrations and Configurations
07:55 Leveraging Notion and Google Docs
10:56 Integrating Typeform and Final Thoughts
11:16 Conclusion and Thanks
---
type: transcript
date: 2024-11-12
youtube_id: e1zHm5ydO4s
---

# Transcript: Build No-Code AI Agents, Automations and Apps with VectorShift

In this video I'm going to be showing you VectorShift, which is a no-code AI automation platform that allows you to easily create out-of-the-box AI solutions. What I'm going to be showing and focusing on in this video is how you can use different integrations, whether it's Google Drive, OneDrive, Salesforce, HubSpot, Notion and more.

To get started, you can make a free account on VectorShift, and once you've logged in you'll see a page that looks just like this. There are a ton of pre-built examples within here, and you can search through them, whether you're trying to search a CSV, or you want to create an assistant, chatbot, productivity tool, finance, strategy, or content creation solution and more. Now, probably one of the easiest ways to create a chatbot or an AI automation based on files that you already have is by semantically searching something like Google Drive, OneDrive, Notion, or what have you.

What I wanted to show you is how you can create a pipeline from scratch and build out an application that's going to automatically read your documents and get the best results for whatever you're searching for, as well as show you some of the other aspects of the platform and how you can integrate these pieces together for practical use cases. Once you're within the pipeline, you have all of these different nodes that you can leverage to build out your workflow. You have the different integrations, and you can just see across the board here there's an absolute ton of them. I'm going to be showing you a couple of different examples, as well as how you can combine them with other aspects of the platform to create a comprehensive solution for whatever you might be doing, whether it's an API, or you want to access information from YouTube; they have Exa within here as
well, and there's basically everything within here that you need to build out a fully featured, production-ready application.

The first thing that we're going to do is start with an input node as well as an output node. Once you have that, you can drag in the model that you want to use, so say you want to use something like GPT-4o, you can just integrate it like this, and if you want to access variables you can just put something like input and then connect the nodes of whatever you're integrating, just like that. The first thing that I'm going to do is call this knowledge base, and in this case let's just call it something like Google Drive. Now that we have that, we can get the result from this node, and we can also pass in the query here. The way this is going to work is that when your user sends in a message, we're going to pass the message as context into the LLM that we specified, but we're also going to pass it into the knowledge base reader.

Creating a new knowledge base is really straightforward, and you can play around with the chunk size as well as the chunk overlap. As for what vectors are: they're effectively the numerical representation of all of the different pieces of text within your data. When a user sends in a query, it's going to convert that query to the numerical representation, or vector, of what the user is looking for, and then it's going to do a comparison to what's within the knowledge base. That's how it retrieves the top results, or in other words the closest results to the query the user has specified. In this case I'm going to specify the chunk size to be 800; you can play around with this a bit if you'd like. Then for the embeddings model, I'm going to change this out to text-embedding-3-large. Once you're within here you have a few different options: you can choose to upload files, add integrations, scrape URLs, or create a folder. If
I go to choose an integration, you'll see all of the different connected apps and integrations that you've already set up. In my case I've already set up GitHub, Dropbox, Google Drive, Google Docs, Gmail, as well as Notion, and if you wanted to add a new one, it's as easy as clicking Connect App; you can scroll through the list here, where there are a ton of different options. This is going to be what it uses as a knowledge source. Once you have an integration, you'll see the folder here, so here is my Dropbox example, and if I go within the folder you can see one of the files that I put in here, and mind you, I can put in a bunch of different files.

An important piece with this is that you can determine the re-scrape frequency. Let's just say you have a folder on your computer that's updating every hour: you might be adding new documents, and you want the latest information within that folder to be used within your AI automation or chatbot or whatever you might be building. You could specify this to be hourly, daily, or whatever it might be. In this example I'm going to be showing you some SEC filings from Apple, and I'm going to download a couple of the latest annual reports. To give you an idea of what these documents look like: they're very dense legal material, and you can see a ton of different information within here. Once you have the documents and the folders selected, as well as the area that you're targeting within the integration, you'll have it automatically wired up, and it's going to be re-scraped at the frequency that you've specified.

Now I'm just going to put in a system message, and I'm going to say: answer questions based solely on the context of all provided sources; do not infer information beyond what is explicitly stated in the sources; maintain objectivity and avoid adding personal opinions or assumptions; ensure responses are concise and relevant to the user's question. Now if I were to just wire this up and go
and click to run this pipeline: if I look within one of the SEC filings here, let's say I ask what the net income is for Apple. I'll run the query "What is the net income of Apple in 2023 and 2024?", and what you'll see is the runtime in terms of how long it took, and if it's using AI models, how many AI credits it took. Then here we have our answer within our output. We can compare this: for 2023 the net income was 96,995, and for 2024 the net income was 93,736, and if we go back to the SEC document we see 93,736. It's over 200 pages worth of context that it's retrieving from, and this can be considerably higher: you could put in a ton of SEC filings, you could potentially do this with all of the company's SEC filings and start to ask questions if you'd like, and you can get creative with it. It doesn't necessarily need to be a file like this; you can put in your own documents and really leverage it in different ways.

Now let's begin to build on this a little bit so you can leverage some of the other integrations within here. Let's say I want to use results from Perplexity as well. We can go and select Perplexity, and in this case let's just specify a variable; we'll just say user message here within a variable. We'll wire this up, and I'll just say "concisely answer the user's query". There are some different models that you can leverage from Perplexity; I'm just going to set this to the Sonar Large model. Once we have that, we can wire this up and pipe the response out to the output. I could say something like "What is the latest news from Apple, as well as what was the net income from 2023?" and run that. Now, the thing with this is that it's going to send the same query to both the knowledge base reader and Perplexity, so be mindful of that. You could play around with this and potentially put Perplexity later within your application, or have some conditional logic as well. Here we
see that Apple reported a new September quarter revenue record of 94.9 billion, marking a 6% increase. What's cool with this is that you can start to integrate some of the latest news and web search capabilities within VectorShift, as well as leveraging the knowledge bases that you might have with your proprietary data.

You have the ability to share this. If you want to make this a chatbot, you can create your chatbot and then configure it from here, so you can edit basically every feature within here. You can also embed it as one of those icons that you have in the corner of your screen. Once you're done within here, you can deploy it and then export it, so you can access it via the chat here; here we see the chatbot. You can also password protect it, or alternatively you can add it within a script tag or within an iframe.

Let's just build on this a little bit further. Let's say instead of just an output, I want to also write this to a Google Doc. What you can do in this case is configure it to a Google Doc that you specify, and in this case I have a Google Doc called VectorShift demo. Once you've selected your file, you can just go ahead and save your configuration, and then you can also send this response-to-output within a Google Doc. Now, if I ask "What is Apple's latest news, as well as their net income from 2023?" and I run that query, I see the output results here, but I'll also see them within my Google Doc. This will just constantly append to it: if I just say "What was the major release today from Google DeepMind?", as soon as I get that back, I have that appended result there.

Say you have something within Notion you want to write to; you can also read a Notion page. I have a Notion page with some notes about how I develop YouTube videos, for instance. If I want to leverage that context, I can just say notion context, and you can be more specific with this. Another thing that some people do is
they wrap the context within these tags. Sometimes the model might not know where the context starts and ends, and XML can be an effective way to mark that if you'd like; let's just say notion, just to show you what this could potentially look like. If the context is relatively small, you could connect it directly to an LLM. Alternatively, say you have a large context: you do have this knowledge base node called search, and what this allows you to do is create a temporary vector database within your pipeline. Say this page is really long and it's going to exceed the context of the LLM call that I have within the application; you might want to leverage something like this just to make sure that it will be compatible. In this case it's just a handful of lines within the Notion document, so it will give us a little warning like we see here, but I know that it should be okay.

Now I can try and leverage that Notion context, and I can say "Write a YouTube script on AlphaFold 3". We'll go ahead and run this, and once we get that response we'll see it within the output, and it's also going to be within that Google Doc. Here's our output, and it's relatively long, so having it within a Google Doc or a Notion document or something like that can be very helpful. Here we see the example of the script without me having to feed it the pieces of how I generally structure a YouTube video; it's leveraging that context from Notion and it has a general idea of the style of my YouTube instructions. In this case, within the document I basically said the YouTube channel is Developers Digest, which focuses on AI and development, and you can see an example of a script that's geared towards the audience that I have. This is just to give you an idea of one of the many things that you can do with VectorShift.

Another thing that you can do with the integrations is integrate something like a Typeform. Say, for instance, I have
a Typeform instead of an input, and in this case I want to write a script about something. I can effectively use this Typeform in place of an input and then have it route through all of the nodes that I set up, ultimately routing to writing to that Google Doc for me.

That's it for this video. I wanted to thank VectorShift for partnering on this video. If you found this video useful, please comment, share and subscribe. Otherwise, until the next one.
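The knowledge-base mechanics described in the video — splitting documents into overlapping chunks, turning text into vectors, and comparing the query vector against the stored chunks — can be sketched in plain Python. Note this is an illustrative sketch, not VectorShift's implementation: the `embed` function here is a toy bag-of-words stand-in for a real embedding model like text-embedding-3-large.

```python
# Sketch of what a knowledge base does under the hood: chunk the text,
# embed each chunk, and return the chunks closest to the query vector.
import math
from collections import Counter


def chunk(text, size=800, overlap=100):
    """Split text into chunks of `size` characters, each sharing `overlap`
    characters with the previous chunk so sentences aren't cut off."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]


def embed(text):
    """Toy 'embedding': lowercase word counts. A real system would call an
    embedding model and get back a dense float vector instead."""
    return Counter(text.lower().split())


def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def top_k(query, chunks, k=2):
    """Return the k chunks most similar to the query, i.e. the 'top results'."""
    qv = embed(query)
    return sorted(chunks, key=lambda c: cosine(qv, embed(c)), reverse=True)[:k]
```

This is why chunk size matters: smaller chunks give more precise matches but less surrounding context per result, while the overlap keeps a sentence that straddles a boundary retrievable from at least one chunk.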
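At run time, the pieces wired together in the pipeline meet in a single LLM request: the system message dictated in the video, the retrieved knowledge-base chunks, and the user's question. A sketch of that assembly, using the common OpenAI-style chat message shape (the actual model call is omitted, since VectorShift handles it as a node):

```python
# Assemble the grounded prompt: system instructions + retrieved context
# + the user's question, in chat-message format.
SYSTEM_MESSAGE = (
    "Answer questions based solely on the context of all provided sources. "
    "Do not infer information beyond what is explicitly stated in the sources. "
    "Maintain objectivity and avoid adding personal opinions or assumptions. "
    "Ensure responses are concise and relevant to the user's question."
)


def build_messages(question, retrieved_chunks):
    """Combine the system message, retrieved context, and user query."""
    context = "\n\n".join(retrieved_chunks)
    return [
        {"role": "system", "content": SYSTEM_MESSAGE},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ]
```

The system message is what keeps the model answering from the SEC filings rather than from its training data, which is why the net income figures come back matching the documents.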
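The fan-out behavior the video cautions about — the same user message going to both the knowledge-base reader and the web-search model before the final LLM sees either result — looks roughly like this. The `search_kb`, `search_web`, and `llm` callables are hypothetical stand-ins for the VectorShift nodes, not a real API:

```python
# Sketch of the dual-source pipeline: one user message fans out to two
# retrieval sources, and both results feed the final LLM prompt.
def run_pipeline(user_message, search_kb, search_web, llm):
    kb_result = search_kb(user_message)    # proprietary documents
    web_result = search_web(user_message)  # e.g. Perplexity's Sonar model
    prompt = (
        f"Knowledge base says:\n{kb_result}\n\n"
        f"Web search says:\n{web_result}\n\n"
        f"Question: {user_message}"
    )
    return llm(prompt)
```

The "conditional logic" mentioned in the video would amount to gating `search_web` behind a check (for example, only calling it when the question mentions recent news), so a purely historical query doesn't pay for an unnecessary web search.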
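The XML-tag trick described in the video — wrapping each piece of context in named tags so the model can tell where one source ends and the next begins — is simple to sketch. The tag names are arbitrary labels you choose (like `notion` in the video), not a VectorShift API:

```python
# Wrap each named context source in matching open/close tags so the LLM
# can see the boundaries between sources.
def wrap_context(sources):
    """Given {name: text}, return all sources delimited by XML-style tags."""
    return "\n".join(f"<{name}>\n{text}\n</{name}>" for name, text in sources.items())
```

For example, `wrap_context({"notion": "my video notes"})` produces a `<notion>…</notion>` block that can be pasted into the prompt ahead of the user's question.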