Last month, a computer program apparently passed the
famous #TuringTest, presenting itself convincingly as a 13-year-old to a
panel of judges – or at least to a third of them. There has been much hoopla
around this result – which should have been totally predictable. A few years
ago #NicholasCarr sounded the alarm (based on his own disturbing self-observations
and some relevant research) that exposure to the incessant stream of
cacophonous information relayed through the internet was inducing in users a
kind of “artificial intelligence” – a mode of thinking marked by dampened
emotional responsiveness and mechanical analysis. If this is indeed the case,
then the thinking gap between human and machine has obviously shrunk, making it
that much easier for a mega-app to reach across it – even without credibly
mimicking a real human, and without Scarlett Johansson’s unmechanical, sexy voice.
The
report about the Turing-test milestone also reminded me of Average is Over, the latest book by economist #TylerCowen (who a
few years back contended that higher education worked mostly as a placebo). In
a nutshell, Cowen believes the Knowledge Economy 2.0 we are entering
will disproportionately reward individuals who can team up productively with
computers. If this is the case, then Nicholas Carr should stop fretting about
the mind-numbing effects of IT and encourage every responsible parent and
school system to cultivate precisely the kind of “artificial intelligence” he
has bemoaned – and which is the only kind of intelligence that can make
human-machine collaboration seamless and natural. But even if Carr does not
repent, the legions of parents who stick smartphones and tablets in their
babies’ hands, and the schools that go mostly or entirely digital, are already arming the
upcoming generation with the proper skill set for the 21st century.