Run AI Models on Your Laptop - No Coding Required!
Do you want to run open-source pre-trained models on your own computer?
This walkthrough is for you!
Meet Ollama.ai, an open-source tool that lets you install and run powerful models directly on your own desktop. 🖥️
In this developer-centric video you'll find a comparison between Ollama.ai, WebML, and the traditional REST API approach. We'll uncover the drawbacks of backend-hosted LLMs and how Ollama's client-side solution attempts to change that.
Join me as I demonstrate the performance of local models like Llama2, Mistral, LLaVA and more, showcasing interaction through the CLI and locally hosted REST API calls.
Don't miss out on the future of AI – watch now! 🔍🎬 #AILearning #OllamaAI #LanguageModels #mistral
Chapters:
0:00 Developer's Take
5:56 Getting Started
10:16 Installation
12:12 Using Llama2
14:00 Using Mistral (summarise a webpage from a URL)
15:25 Using Llava - Multimodal demo 1 - coffee shop image
17:20 Multimodal demo 2 - promotional image
17:45 Multimodal demo 3 - detecting a scene
18:43 Multimodal demo 4 - Economic Infographic
20:38 Arguments on Uncensored LLMs
22:22 Local WebAPI for LLM tasks
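If you want to try the local WebAPI shown in the last chapter yourself, here is a minimal Python sketch of calling Ollama's locally hosted REST API. It assumes Ollama is installed and running with its default endpoint (http://localhost:11434/api/generate) and that a model such as llama2 has already been pulled; the helper names are my own, not part of Ollama.

```python
import json
import urllib.request

def build_payload(model, prompt):
    # Builds the JSON body Ollama's /api/generate endpoint expects.
    # "stream": False asks for a single JSON response instead of a stream.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def generate(prompt, model="llama2", host="http://localhost:11434"):
    # Sends the prompt to the locally running Ollama server and
    # returns the generated text from the "response" field.
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

For example, `generate("Why is the sky blue?")` will return the model's answer as a plain string, with everything running on your own machine.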
Check out my WebML project using Tensorflow.JS (mentioned in the introduction): https://youtu.be/NStucy_xte8?si=uotU5bKhr2hmmCdb