They tried that. What came out was Perl. Lots and lots of layers of syntactic sugar layered on syntactic sugar layered on more sugar, so you now have lots of sweet ways to do the same thing. But between all the sugar it's becoming harder and harder to see the real substance it was all about. The computer doesn't have any problem crunching through the sugar (it doesn't have any teeth to worry about), but as a programmer you don't find it any easier to recognize the vegetables if your cauliflower is sometimes covered in marshmallow, other times drenched in syrup, and the third time someone made a half-hearted attempt to caramelize it. (Of course nobody serves the cauliflower as just plain cauliflower anymore.) So yes, you cook for the computer first; innocent bystanders don't need to know what went into your program.
In the real world, the innocent bystanders matter, code is written as much for other people to understand as it is for computers to execute. I think that instead of plastering over all your content with sweeties, languages should be designed in such a way that the sugar is not necessary by choosing the right fundamental concepts, so programmers can understand what's going on. Learn to cook with the right ingredients, and learn when to add spices. And when not to.
For another example of a syntactic-sugar-friendly design, have a look at C macros.