If you're looking at using Ollama on your PC to run local LLMs (Large Language Models), you have two options on a Windows machine. The first is to simply use the Windows app and run it natively.
Despite not being a developer myself, I can appreciate that Windows 11 really is a powerhouse for those who are. A big part of that is the Windows Subsystem for Linux (WSL), offering ...
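For reference, here's a rough sketch of what the WSL route looks like in practice. This assumes you're on a recent Windows 11 build with WSL available and uses Ollama's standard Linux install script; model names and versions are just examples.

```shell
# In a Windows terminal (PowerShell or Command Prompt), set up WSL
# with the default Ubuntu distribution if you haven't already:
wsl --install

# Then, inside the WSL/Ubuntu shell, install Ollama using its
# official Linux install script:
curl -fsSL https://ollama.com/install.sh | sh

# Pull and chat with a model (example model name; pick whatever
# model suits your hardware):
ollama run llama3
```

From there, Ollama behaves just as it would on a native Linux box, serving its API locally so other tools in WSL (or on the Windows side) can talk to it.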