I have just finished reading Outnumbered by David Sumpter. It is very readable, and says some interesting things about modern machine learning. Machine learning, in particular deep learning, is a hot topic at the moment, so I was curious to read about it, and about related topics such as how Facebook, Google, and others use it.
Modern machine learning techniques can be very powerful, and can do many things as well as or better than we can. But one way to gauge their progress is to ask which tasks they can beat us at, and which tasks we still have the edge in. In 2015, researchers at Google's DeepMind studied how good modern learning algorithms are at learning to play computer games. They found that the algorithms could beat you at Space Invaders, but were pretty hopeless at Pac-Man.
The reason is interesting, and sheds light on the limitations of modern machine learning. I think the basic point is best illustrated by another task where machine learning is making rapid advances, but still cannot compete with us.
This is translation: Google Translate will do a decent job of translating a single sentence from one language to another, but it cannot do justice to a paragraph, let alone a book. Machine learning can deal with ten words, but hundreds of words, with complex relationships between sets of them, are too much.
Similarly, you can win at Space Invaders without much forward planning, but in Pac-Man you need a strategy for eating the pills that allow you to turn the tables on the ghosts. We humans can see beyond the details of making Pac-Man turn left or right, or up or down, and realise that the power pills are crucial, and that a strategy is needed for when best to use them. This is a bit like a scientist abstracting a physical law from a mass of messy data, or a detective linking two crimes. For this, our brains still rule, although who knows how long that will last.
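The gap between reacting and planning can be illustrated with a toy example (a made-up game, nothing to do with the actual DeepMind system): at each step an agent can grab a small snack now, or hold back for a big payoff that only arrives if it shows enough patience. A greedy player, who only ever takes the reward visible right now, loses to a player who searches ahead.

```python
from itertools import product

# Toy game: over 3 steps, pick "snack" (+1 immediately) or "save" (0 now).
# Saving on at least two steps unlocks a +10 bonus at the end -- a delayed
# reward, loosely analogous to Pac-Man's power pills.
def total_reward(actions):
    reward = 0
    saved = 0
    for a in actions:
        if a == "snack":
            reward += 1
        else:  # "save"
            saved += 1
    if saved >= 2:  # patience unlocks the big payoff
        reward += 10
    return reward

# Greedy play: always take the +1 that is visible right now.
greedy = ["snack", "snack", "snack"]

# Planning: search over all action sequences for the best total reward.
best = max(product(["snack", "save"], repeat=3), key=total_reward)

print(total_reward(greedy))      # 3
print(total_reward(list(best)))  # 11  ("snack" once, "save" twice)
```

The greedy player scores 3, while looking ahead scores 11. Algorithms that score decisions only by their immediate payoff behave like the greedy player; seeing the value of the power pills requires connecting an action now to a reward several steps away.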