Running Ollama on Windows Made Easy

Ajay Kumar
3 min read · Apr 4, 2025

Table of Contents

What’s New in Ollama on Windows?

1. Native Windows Experience

2. OpenAI Compatibility

Step-by-Step Guide to Running Ollama on Windows

1. Get Started

2. Run Your First Model

3. Using the Ollama API

Features Overview

Hardware Acceleration

Extensive Model Library

Background API Service

Updating and Feedback

Conclusion

Ollama, the versatile platform for running large language models (LLMs) locally, is now available on Windows. This update empowers Windows users to pull, run, and create LLMs with a seamless native experience. Packed with features like GPU acceleration, access to an extensive model library, and OpenAI-compatible APIs, Ollama on Windows is designed to deliver a robust and efficient AI development environment.

What’s New in Ollama on Windows?

1. Native Windows Experience

The Windows preview brings Ollama’s capabilities to a new audience, offering:

GPU Acceleration: Built-in support for NVIDIA GPUs and modern CPU instruction sets like AVX and AVX2 ensures faster model performance. No configuration or virtualization is required!

Full Model Library Access: From language models like Llama 2 to vision models like LLaVA 1.6, the entire Ollama library is now accessible on Windows. Vision models even allow drag-and-drop image inputs in the terminal during runtime.

Always-On API: Ollama’s API runs automatically in the background on http://localhost:11434, allowing tools and applications to connect seamlessly.
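Because the API is already listening on localhost:11434, you can sanity-check it from a terminal once Ollama is installed. A minimal sketch, assuming the Ollama service is running and that you have pulled the `llama2` model (substitute any model from your local library):

```shell
# Ask the local Ollama API for a single, non-streamed completion.
# "stream": false returns one JSON object instead of a stream of chunks.
curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

Note that the single-quote style above is for a POSIX-style shell; in PowerShell or cmd.exe the JSON quoting needs to be escaped differently.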

2. OpenAI Compatibility

Ollama on Windows supports the same OpenAI-compatible API as its macOS counterpart. This means you can integrate Ollama with existing OpenAI-compatible tooling and workflows for local model execution.
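As a rough sketch of what that compatibility looks like, an OpenAI-style chat request can be pointed at the local server. The model name is whatever you have pulled locally; no API key is shown because Ollama does not require one:

```shell
# OpenAI-compatible chat completion served by the local Ollama instance.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama2",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}]
  }'
```

Tools built against the OpenAI chat API can typically be repointed at this endpoint by changing only the base URL.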

Step-by-Step Guide to Running Ollama on Windows

1. Get Started

Download Ollama on Windows
Visit Ollama’s website and download the Windows preview installer.
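Once the installer finishes, a quick way to confirm the setup is to check the CLI and run a first model from the terminal. A sketch, assuming `ollama` is on your PATH and using `llama2` as an example model:

```shell
# Confirm the CLI is installed and print its version.
ollama --version

# Download a model from the Ollama library, then chat with it interactively.
ollama pull llama2
ollama run llama2
```

`ollama list` will show every model you have pulled so far.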


Written by Ajay Kumar

I work as a full-stack software engineer.
