Chicken and egg. Should they (1) crawl you first, or (2) send you traffic first? If (1), how long should they keep crawling before either sending traffic or giving up; if (2), how would that even work?
Why not adjust the crawl rate based on how often the site's content changes? If DuckDuckGoogle crawls me once an hour for a week, and notices that I only updated my content twice, why not scale the crawling back to a more reasonable rate, like once a day?
This seems logical, but what about the person who updates rarely (once or twice a month) yet gets a lot of attention when they do? Search engines wouldn't want to miss the opportunity to capture that traffic.
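For what it's worth, the Sitemaps protocol already gives site owners a way to hint at both update frequency and recency: the optional <changefreq> and <lastmod> fields. A minimal sketch (the URL and date are made up, and crawlers treat these values as hints at best - Google has said it largely ignores <changefreq> and <priority>):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- updated rarely, but worth fetching promptly when it changes -->
      <url>
        <loc>https://example.com/blog/</loc>
        <lastmod>2024-01-15</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>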
I wonder if there is some sort of reverse process you could opt into - i.e., never crawl me until I ask to be crawled.
Disclaimer: I know little about this sort of thing.
You could probably do this with a robots.txt file that blocks all crawling. When you want to be re-crawled, edit it to allow the relevant spiders back in. I would imagine that spiders rarely re-check sites which disallow everything, but Google (and probably others) lets you manually submit URLs to be crawled.
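As a sketch, a "closed by default" robots.txt might look like this (the crawler name is just an example, and the change only takes effect the next time a spider happens to fetch the file):

    # Closed by default: keep all crawlers out
    User-agent: *
    Disallow: /

    # When you want a re-crawl, replace the above with something like:
    # User-agent: Googlebot
    # Disallow:

An empty Disallow line means "allow everything", so flipping between the two states is a one-line edit.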
Presumably you could set a minimum interval between crawls - an hour, or a day. That would make sure big updates are noticed without putting undue stress on the server.
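The closest existing knob I know of is the non-standard Crawl-delay directive in robots.txt, which asks for a minimum number of seconds between successive requests. It rate-limits requests within a crawl rather than setting how often a site is revisited, and support varies: Bing documents support for it, while Googlebot ignores it. A sketch:

    # Non-standard; honored by e.g. bingbot, ignored by Googlebot
    User-agent: *
    Crawl-delay: 10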
Google is the only crawler I see using the protocol on my site. It does get updates into Google within minutes, so that alone is reason enough to implement push.
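Assuming "the protocol" here is WebSub (formerly PubSubHubbub) - that is a guess on my part - the push side is just a form-encoded POST telling a hub that a topic URL has fresh content. For example, with curl against Google's public hub (the feed URL is a placeholder):

    curl -X POST https://pubsubhubbub.appspot.com/ \
      -d "hub.mode=publish" \
      -d "hub.url=https://example.com/feed.xml"

Subscribers (crawlers included) that have registered interest in that topic then get notified and can fetch the update immediately, which matches the within-minutes behavior described above.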