Running local LLMs is all the rage these days in self-hosting circles. And if you've been intrigued, or have dabbled in it yourself, you've likely heard of both Koboldcpp and LM Studio. While I'd previously ...