Ollama slow inference

Native MLX Backend

Ollama is now built directly on top of Apple’s open-source...
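The passage introduces a native MLX backend for Ollama, aimed at faster inference on Apple silicon. As a minimal, hedged sketch of how a client talks to a local Ollama server (independent of which backend serves the model), the following builds a request for Ollama's `/api/generate` HTTP endpoint. The model name `llama3.2` and a running local server are assumptions, not details from the source:

```python
import json
import urllib.request

# Default endpoint of a locally running `ollama serve` instance.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server."""
    payload = json.dumps({
        "model": model,    # e.g. "llama3.2" -- assumes the model was pulled locally
        "prompt": prompt,
        "stream": False,   # ask for a single JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Actually sending the request requires a running server:
# with urllib.request.urlopen(build_generate_request("llama3.2", "Hello")) as resp:
#     print(json.loads(resp.read())["response"])
```

The request-building step is separated from the network call so the payload shape can be inspected without a server running.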