I guess your response highlights that it's not an either-or proposition. It's a sliding scale between developer resources and machine resources, based on the costs in your situation.


Yes, the exact tradeoff has to be calculated per instance: exactly how much programmer time it would take to save how much machine time.

However, when you compare X hours of average programmer time against X hours of average machine time, machines are cheap. You should default to optimizing for programmer time, not machine time.
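
A back-of-the-envelope break-even check makes this concrete (a minimal sketch in Python; the hourly rates and the worth_optimizing helper are illustrative assumptions, not real figures):

    # Break-even check: is an optimization worth the programmer time?
    # Both rates below are assumed for illustration only.
    DEV_RATE = 100.0      # dollars per programmer-hour (assumed)
    MACHINE_RATE = 0.50   # dollars per machine-hour (assumed)

    def worth_optimizing(dev_hours, machine_hours_saved_per_month, months=12):
        """True if the machine-time savings over `months` exceed
        the one-time cost of the programmer's time."""
        cost = dev_hours * DEV_RATE
        savings = machine_hours_saved_per_month * MACHINE_RATE * months
        return savings > cost

    # One week of dev work to save 500 machine-hours per month:
    print(worth_optimizing(40, 500))
    # -> False: $3,000 saved over a year vs. $4,000 of programmer time

At those (assumed) rates, even a sizable machine-time saving doesn't pay for a week of programmer time, which is the point: by default, spend machine time, not programmer time.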



