Hacker News

I picked up the same model; with shipping to Canada, it ended up costing a lot for what it is.

How are you hosting your LLM locally? I tried Ollama on an M4 Mac mini; even with a smaller LLM, the performance was very poor.
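For reference, a minimal sketch of the kind of Ollama workflow being described, assuming Ollama is installed; the specific model tag (`llama3.2:3b`) is just an example of a small quantized model, not necessarily the one the commenters used:

```shell
# Pull a small quantized model (~2 GB); smaller models are usually
# the first thing to try on memory-constrained Apple Silicon machines.
ollama pull llama3.2:3b

# Run an interactive prompt against it.
ollama run llama3.2:3b "Summarize this in one sentence: ..."

# List loaded models and how they are scheduled; on macOS, Ollama
# uses Metal for GPU acceleration automatically when available.
ollama ps
```

If performance is poor even for a 3B-parameter model, checking `ollama ps` for memory pressure (models spilling out of unified memory) is a reasonable first diagnostic.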


