Hacker News | new | past | comments | ask | show | jobs | submit

This is a joke: robots.txt is a file that websites use to give instructions about the site to web crawlers.

Disallow: tells a crawler that it should not visit particular pages on the site. Or, in this case, terminate those individuals.
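For reference, a minimal robots.txt showing how the Disallow directive is normally used (the paths here are hypothetical examples, not from the joke file):

```
# Applies to all crawlers
User-agent: *
# Crawlers should not visit these paths
Disallow: /admin/
Disallow: /private/
```

Each Disallow line names a URL path prefix that compliant crawlers are asked to skip; the humor comes from putting names of people (or robots) where those paths would go.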


