
My sysadmin: "we won't switch to git because it can't handle binary files and our code base is too big"

Our whole codebase is 800MB.



I hope that was a conversation from 5 years ago.

Otherwise, I hope you replaced your sysadmin.


Our codebase (latest tree) is a similar size, but the obstacle to switching to git is the total history size. Our history is well over 25GB, which git doesn't handle very gracefully.


History shouldn't be a problem: you can do a shallow clone. But you will still have to store at least the working tree on your workstation.
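For reference, a shallow clone is a single flag, and history can be deepened later on demand (the URL and depth here are placeholders):

```shell
# Clone only the most recent commit; the rest of the history stays on the server.
git clone --depth 1 https://example.com/big-repo.git

# Later, pull in more history if you find you need it:
git -C big-repo fetch --deepen=100   # fetch 100 more commits of history
git -C big-repo fetch --unshallow    # or give up and fetch everything
```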

That solves the next scaling problem too: not having to manage the whole working tree (without requiring narrow clones, which have significant downsides).
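If the goal is to materialize only part of the tree on disk, git's sparse-checkout covers that case. A sketch (the paths and branch name are hypothetical):

```shell
# Clone without populating the working tree, then opt in to specific subtrees.
git clone --no-checkout https://example.com/big-repo.git
cd big-repo
git sparse-checkout set src docs   # only these directories will appear on disk
git checkout main                  # populates the working tree per the sparse spec
```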


Yeah, having the working tree locally works well, and that's what we do with svn currently.

The problem is that I also want fast log/blame for any file, back to the beginning of time - but I'm OK with that requiring devs to connect to the server that holds the history (as with svn).

I also haven't found a way to make git work smoothly with shallow mode as the default. E.g., can I make checkout of a branch always remember it must stay shallow? Can I make log use the remote history when necessary? I don't want to fight the tool all the time because I'm using a nonstandard approach.
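Newer git versions have partial (blobless) clone, which may be closer to the svn-like model described here than shallow clone: all commit and tree metadata is local, so log is fast offline, while file contents are fetched from the server on demand (the server must allow filters). A sketch, with hypothetical paths:

```shell
# Blobless partial clone: full commit graph locally, file contents on demand.
git clone --filter=blob:none https://example.com/big-repo.git
cd big-repo
git log --oneline -- some/file.c   # fast: commit history is entirely local
git blame some/file.c              # fetches the blobs it needs from the server
```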


Isn't that the case that Git LFS solves? I've got 30-ish GB of binary blobs stored in my repos.
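For the record, LFS targets large binaries rather than deep history: tracked patterns go into .gitattributes, and the repo stores small pointer files while content lives on an LFS server. A sketch (the patterns and file names are hypothetical, and it assumes git-lfs is installed):

```shell
git lfs install                  # one-time per machine: sets up the LFS filters
git lfs track "*.psd" "*.bin"    # writes the patterns to .gitattributes
git add .gitattributes
git add assets/big-texture.psd   # committed as a pointer; content goes to LFS storage
git commit -m "Track large binaries with LFS"
```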



