Building Private, Offline-Ready AI Experiences #88
Replies: 3 comments 2 replies
Excited to be here! Good luck to everyone participating!
Hello, when an AI model is downloaded via Microsoft's Foundry CLI or through VS Code's AI Toolkit extension, does that model run exclusively on our computers, offline? For example, I downloaded the phi-4-mini model and asked it whether it was running offline; it responded that it processes its results in the cloud. Am I correct in assuming that this particular model is simply unaware that it is running locally/offline on my computer?
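Your assumption is right: a model's weights contain no knowledge of where they are being executed, so its answer about "the cloud" is just a plausible-sounding guess. A more reliable check is to verify that the endpoint your application talks to is a loopback address, so requests can never leave the machine. A minimal sketch (the port `5273` is an assumption for illustration; Foundry Local reports its actual endpoint when the service starts):

```python
from urllib.parse import urlparse
import ipaddress
import socket

def is_local_endpoint(url: str) -> bool:
    """Return True if the URL's host is a loopback address,
    i.e. requests sent to it never leave this machine."""
    host = urlparse(url).hostname
    if host is None:
        return False
    if host == "localhost":
        return True
    try:
        # Host given as a literal IP address.
        return ipaddress.ip_address(host).is_loopback
    except ValueError:
        # Host given as a name: resolve it, then test the address.
        try:
            addr = socket.gethostbyname(host)
        except socket.gaierror:
            return False
        return ipaddress.ip_address(addr).is_loopback

# Example: a hypothetical local Foundry endpoint vs. a cloud API.
print(is_local_endpoint("http://localhost:5273/v1"))   # True
print(is_local_endpoint("https://api.openai.com/v1"))  # False
```

Disconnecting from the network entirely and confirming the model still answers is the simplest end-to-end test of the same property.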
Foundry Local Livestream Recap & Summary
Thanks to everyone who joined yesterday's insightful sessions on Foundry Local and local AI development!
Here's a summary of the key questions and topics discussed. Please continue the conversations below:
- Foundry Local vs. Ollama
- System Requirements & Platform Support
- Azure Subscription & Offline Capabilities
- Model Availability & Integrations
Happy building!
Welcome to the Local AI discussion hub! 🎉
This is a dedicated space for our community to collaborate, learn, and share everything related to building AI applications that run entirely on-device.
What is Foundry Local?
Foundry Local enables you to run AI models and integrate them into your applications directly on your device, with privacy by design, zero cloud dependency, and fast inference. Whether building for the JavaScript AI Buildathon or exploring local AI in general, this is your space.
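Because Foundry Local serves models through an OpenAI-style chat-completions endpoint on your own machine, integrating it is mostly a matter of pointing an HTTP request at localhost. A minimal stdlib-only sketch (the base URL, port, and the `phi-4-mini` alias are assumptions; check the endpoint your local service actually reports):

```python
import json
import urllib.request

# Assumed endpoint; the Foundry Local service reports its real
# address and port when it starts, so adjust BASE_URL accordingly.
BASE_URL = "http://localhost:5273/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload for a local model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(model: str, prompt: str) -> str:
    """POST the payload to the local endpoint and return the reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires a running Foundry Local service with the model loaded.
    print(chat("phi-4-mini", "Say hello in five words."))
```

Because the request shape matches the OpenAI API, existing OpenAI client libraries can usually be reused by overriding their base URL to point at the local service.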
Use This Discussion For:
Getting Started:
New to Foundry Local? Start here:
Community Guidelines:
Have a question, project, or insight to share? Start a new reply or thread below!
Discussion host: @HamidOna