Hacker Times

from the article:

>Such manipulation of photos is typically invisible to the human eye, the researchers found.

I think it's more like attacking a machine learning model: the perturbation between the two photos may not be visually apparent at all, so a human reviewer might not catch it.
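A minimal numpy sketch of the idea, using an FGSM-style perturbation on a toy linear classifier (everything here, including the model and the eps value, is illustrative and not from the article): each pixel moves by only 2% of the dynamic range, but thousands of tiny moves aligned with the gradient add up and flip the prediction.

```python
import numpy as np

# Toy linear classifier on a 100x100 grayscale "photo" (pixels in [0, 1]).
# score > 0 -> class A, score < 0 -> class B.  All of this is illustrative.
rng = np.random.default_rng(0)
n = 100 * 100
w = rng.choice([-1.0, 1.0], size=n)          # toy weights

def predict(x):
    return np.sign(w @ (x - 0.5))

# An image the model classifies confidently as class A.
x = 0.5 + 0.01 * w                           # faint class-A pattern
print(predict(x))                            # 1.0

# FGSM-style attack: nudge every pixel by eps against the gradient.
# For a linear model the gradient of the score w.r.t. x is just w.
eps = 0.02                                   # 2% of the pixel range
x_adv = np.clip(x - eps * np.sign(w), 0.0, 1.0)

print(np.max(np.abs(x_adv - x)))             # every pixel moved <= 0.02
print(predict(x_adv))                        # -1.0: prediction flipped
```

The key point is that the per-pixel change is bounded (an L-infinity constraint), which is exactly why such perturbations tend to be invisible to a human reviewer while still steering the model.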


