How to use NVIDIA GPUs on a Windows notebook with Linux
Daniel Nashed – 24 April 2025 22:42:35
The following is actually a "run this at home" instead of a "don't try this at home".
I have been playing around with VMware Workstation today. It turns out that VMware Workstation 17.6 Pro can provide 3D acceleration.
But from what it looks like, it can't do true vGPU passthrough.
On the other hand, running LLMs on Linux is quite desirable.
Instead of running Linux on VMware, it makes a lot of sense to use WSL2 anyway.
But WSL2 is also the right choice for GPUs, as you can see below.
WSL + Ubuntu = NVIDIA GPU support
If you have the right drivers installed, you can use the GPU on Windows (for example with Ollama) and also access it in WSL at the same time, sharing the card.
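A quick way to verify the card is really shared is to query it from both sides. The small Python sketch below is just one way to do that check (it assumes nvidia-smi is on the PATH): run it once on the Windows host and once inside the WSL Ubuntu instance and both should report the same card.

```
# gpu_check.py - minimal sketch: run it on the Windows host and again inside
# the WSL Ubuntu instance to confirm both sides see the same (shared) card.
# Assumes the NVIDIA driver is installed and nvidia-smi is on the PATH.
import subprocess

def gpu_status() -> str:
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=name,memory.used,memory.total,utilization.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    print(gpu_status())
```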
Usually you want to run Ollama only once and share the loaded models via REST requests.
But it would be possible to run Ollama inside an Ubuntu WSL instance.
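For the REST route, a client only needs to talk to the Ollama endpoint. Here is a minimal sketch using Python's standard library; it assumes Ollama is listening on its default port 11434 and that the model name is one you have already pulled (llama3.2 is just an example).

```
# ask_ollama.py - minimal sketch: send a prompt to a single shared Ollama
# instance via its REST API. Assumes Ollama listens on its default port 11434
# and that the model named below has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3.2",           # example model name - use one you have pulled
    "prompt": "Say hello from the GPU.",
    "stream": False,               # return one JSON document instead of a stream
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    answer = json.loads(response.read())

print(answer["response"])
```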
Not only that. With the right drivers, you can also run Docker containers on WSL with the GPU exposed.
WSL + Ubuntu + Docker = better NVIDIA GPU support
There is one special case where you even need a Docker container in this context.
The very useful nvtop tool (included in Ubuntu) does not run on WSL2.
The reason is that WSL does not expose the low-level hardware that nvtop relies on.
nvidia-smi (a tool shipped by NVIDIA to query GPU information) works in the WSL Ubuntu instance,
but nvtop only works inside a container, because the NVIDIA drivers for the Docker engine provide the full support there.
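To confirm that a container really gets the GPU, you can run nvidia-smi inside a throwaway container. The sketch below assumes Docker inside WSL with the NVIDIA Container Toolkit configured; the ubuntu:24.04 image is just an example. The same kind of container is also where you can install and run nvtop.

```
# container_gpu_check.py - minimal sketch: start a throwaway container with the
# GPU exposed (--gpus all) and run nvidia-smi inside it. Assumes Docker inside
# WSL with the NVIDIA Container Toolkit configured; the image is an example.
import subprocess

result = subprocess.run(
    ["docker", "run", "--rm", "--gpus", "all",
     "ubuntu:24.04",
     "nvidia-smi"],
    capture_output=True, text=True, check=True,
)

print(result.stdout)

# For nvtop, start the same kind of container interactively
# (docker run -it --rm --gpus all ubuntu:24.04 bash) and
# install it with: apt-get update && apt-get install -y nvtop
```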
You can mix and match between the Windows host, the WSL Ubuntu instance, and a Docker container running Ubuntu inside WSL.
Usually you pick one way to run LLMs. But it is good to understand all the different options for your specific use case.
I am running all of those in parallel for testing on my lab notebook.
Now that Domino IQ shipped with external GPU support, those options might become more interesting for you.
But you can also run Ollama without GPU support on modern CPUs, with quite decent performance for a lab environment.
With a GPU, Domino on Linux could run natively inside the WSL Ubuntu instance or on Docker inside WSL.
Windows --> WSL2 Ubuntu 24.04 --> Docker 28.x --> Ubuntu 24.04 container --> Domino 14.5 EA3
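As a sketch of the container layer in this stack, the following starts a long-running Ubuntu 24.04 container inside WSL with the GPU exposed, as a base into which Domino could then be installed. The container name is just an example and the Domino setup itself is not covered here.

```
# domino_gpu_container.py - minimal sketch of the container layer in the stack
# above: start a long-running Ubuntu 24.04 container inside WSL with the GPU
# exposed. The container name is just an example; the Domino setup itself
# is not shown here.
import subprocess

subprocess.run(
    ["docker", "run", "-d", "--gpus", "all",
     "--name", "domino-gpu-lab",   # example container name
     "ubuntu:24.04",
     "sleep", "infinity"],          # keep the container running
    check=True,
)

# Verify the GPU is visible from inside the running container:
subprocess.run(
    ["docker", "exec", "domino-gpu-lab", "nvidia-smi"],
    check=True,
)
```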
I am not providing step-by-step instructions for installing the drivers, because the procedure and versions change over time.
You can ask ChatGPT, for example, to provide detailed steps -- which are actually pretty good.