Friday May 16, 2025 TBA
Discover how to harness the power of local LLMs to create your own AI-powered mental model assistant—all while staying completely offline. This hands-on session demonstrates building a privacy-preserving AI system on your laptop using open-source tools like Ollama and Streamlit. Through live coding and practical examples, you'll see how to develop an AI thought partner that helps discover and refine mental models without relying on cloud services or external APIs.

Attendees will learn how to:
1. Set up and run powerful LLMs locally on their laptops for complete privacy and control
2. Build an interactive AI assistant using Ollama, Streamlit, and DeepEval that operates entirely offline
3. Create and implement custom knowledge bases that enhance the AI's ability to discover and refine mental models
4. Develop and test prompt engineering strategies for reliable, consistent results from local LLMs

This session is perfect for developers and tech leaders interested in exploring the practical possibilities of running AI locally while maintaining data privacy and reducing cloud dependencies.
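To give a flavor of what the live-coding portion covers, here is a minimal illustrative sketch (not the speaker's actual code) of a Streamlit chat app backed by a locally running Ollama model. It assumes the open-source ollama and streamlit Python packages are installed, the Ollama server is running on the laptop, and a model such as llama3 has already been pulled; the file name, model choice, and system prompt are all hypothetical.

```python
# Minimal sketch of a fully local chat assistant (assumptions: Ollama is
# running locally, the "llama3" model has been pulled, and the "ollama"
# and "streamlit" packages are installed).
import ollama
import streamlit as st

st.title("Local mental-model thought partner")

# Keep the conversation in Streamlit's session state so it survives reruns.
if "messages" not in st.session_state:
    st.session_state.messages = [
        {"role": "system",
         "content": "You help the user discover and refine mental models."}
    ]

# Replay prior turns (skip the system prompt).
for msg in st.session_state.messages[1:]:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if prompt := st.chat_input("Describe a decision you're wrestling with"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    # Inference happens against the local Ollama server; nothing leaves the laptop.
    response = ollama.chat(model="llama3", messages=st.session_state.messages)
    reply = response["message"]["content"]
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.write(reply)
```

Saved as, say, app.py, this would be launched with streamlit run app.py to get a local chat UI whose inference never touches a cloud service or external API.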
Speakers

Jared Olhoft, MBA

VP of Engineering, Collaboration.AI
As VP of Engineering at Collaboration.AI, Jared Olhoft brings more than 20 years of technology leadership across multiple industries and startups. A former Marine, he combines disciplined execution with innovation in practical AI solutions.