The US also largely operates on at-will employment, so regardless of how legally enforceable his contract may be (even if they can't sue him for violating it), they can still fire him for violating it.
I used to live in an at-will state, but I wouldn't ever go back there to work. I was under the impression that states were either union (California?) or "right to work".
I work in the US and have never had a job where there wasn't some kind of signed contract. I'd consider a company that didn't put its expectations of its employees in writing somewhat shady.
I've worked numerous jobs in the US and have never had to sign a contract. One tried to make me sign one of those "anything you invent, even on your own time, is our property" documents, and I told them no, to which they said, "don't worry about it, then."
The company I work for now was acquired a while back, and the software I work on is one of the main reasons for the acquisition. I fully expected them to present some contracts, but they didn't. Most likely they correctly guessed that I wouldn't sign one without an increase in pay and benefits, and they didn't want to give me any additional money.
EDIT: P.S. Even if it's a signed contract, that still doesn't mean it's enforceable.