How are you hosting your LLM locally? I tried Ollama on an M4 Mac mini, and even with a smaller model, the performance was very poor.