How to Use a Local LLM within Cursor (and save money on AI tokens!)

Published: November 20, 2025
On the channel: Coulter Peterson
13,816
202

Cursor is a fantastic AI-powered IDE, but what if you want to power it with your own locally-hosted LLM using your powerful gaming PC or Apple Silicon? In this video we cover how to do just that using LM Studio and an additional helpful tool.
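The workflow from the video can be sketched as a few terminal commands. This is a hedged outline, not the exact steps shown on screen: it assumes LM Studio's `lms` CLI is installed, that LM Studio serves on its default port 1234, and that the ngrok URL shown is a placeholder you replace with your own.

```shell
# Start LM Studio's OpenAI-compatible local server (default port: 1234).
# The lms CLI ships with LM Studio; load a model in the app or via the CLI first.
lms server start

# Expose the local server publicly so Cursor's backend can reach it.
# Requires a free ngrok account and a configured authtoken.
ngrok http 1234

# Then, in Cursor's model settings, override the OpenAI Base URL with the
# https forwarding address ngrok prints (placeholder example below), and enter
# the model identifier exactly as LM Studio reports it:
#   https://example-subdomain.ngrok-free.app/v1
```

After that, selecting the custom model in Cursor routes requests through the ngrok tunnel to your own machine instead of a paid cloud model.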

▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬

📹 Related Videos: 📹

Vibe coding a Minecraft clone ►    • I built Minecraft in 300 seconds by Vibe C...  

▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬

🔔 Subscribe! 🔔

Subscribe ►    / @coulterpeterson  

The Channel ►    / @coulterpeterson  

▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬

🚀 About the Channel 🚀

I'm a nerd who codes and hacks around with old and new(ish) hardware.

▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬

🔖 Chapters 🔖

0:00 Intro and why you'd do this
0:44 Install and configure LM Studio
2:17 Install and configure ngrok
3:00 Bring it all together and use in Cursor

⚠️ Disclaimer: This description may contain affiliate links, which provide a small commission that helps support the channel.

#cursor #coding #ai