You can harness the power of open-source AI models directly on your computer with tools such as Ollama.
By running these models locally, you avoid depending on the cloud and gain advantages such as offline access and better privacy.
How to Run AI Models Locally on Windows With Ollama
Even without an internet connection, you can run local alternatives to cloud services like ChatGPT on your computer. Numerous tools support this workflow; here, we’ll use Ollama to demonstrate.
Download and Install Ollama
- Visit the Ollama website and click the “Download” button.
- Select your operating system and click Download.
- Double-click the downloaded file, select Install, and follow the installation prompts.
- After installation, a pop-up should appear confirming that Ollama is running.
- Open a terminal: press Windows + R, type cmd, and press Enter.
- Use the `ollama run` command to download and start your first AI model. Replace the model name with an actual model from the Ollama library, such as Llama 3, Phi 3, Mistral, or Gemma. Downloading can take some time, so be patient.
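For example, assuming you want Llama 3 (any model name from the Ollama library works in its place), the commands look like this:

```shell
# Download the Llama 3 model from the Ollama library (one-time, several GB)
ollama pull llama3

# Start an interactive chat session with the downloaded model
ollama run llama3

# List the models currently available on your machine
ollama list
```

Once `ollama run` starts, you can type prompts directly in the terminal; enter `/bye` to exit the session.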