Sunday, December 5, 2010

The crisis of values

Here is how Paul Krugman renders the last 20 years of Irish history in the NYT ("Eating the Irish"):

"The Irish story began with a genuine economic miracle. But eventually this gave way to a speculative frenzy driven by runaway banks and real estate developers, all in a cozy relationship with leading politicians. The frenzy was financed with huge borrowing on the part of Irish banks, largely from banks in other European nations.

"Then the bubble burst, and those banks faced huge losses. You might have expected those who lent money to the banks to share in the losses. After all, they were consenting adults, and if they failed to understand the risks they were taking that was nobody’s fault but their own. But, no, the Irish government stepped in to guarantee the banks’ debt, turning private losses into public obligations.

"Before the bank bust, Ireland had little public debt. But with taxpayers suddenly on the hook for gigantic bank losses, even as revenues plunged, the nation’s creditworthiness was put in doubt. So Ireland tried to reassure the markets with a harsh program of spending cuts.

"Step back for a minute and think about that. These debts were incurred, not to pay for public programs, but by private wheeler-dealers seeking nothing but their own profit. Yet ordinary Irish citizens are now bearing the burden of those debts.

"Or to be more accurate, they’re bearing a burden much larger than the debt — because those spending cuts have caused a severe recession so that in addition to taking on the banks’ debts, the Irish are suffering from plunging incomes and high unemployment."

So, private investors and bankers pocketed huge profits while the market was on a roll (or the bubble was being inflated); but the Irish government promptly nationalized their potential losses when the chips were down. One might wonder what this scheme would do to the sense of fairness and just returns of the Irish - and others who have seen socially destructive economic practices lavishly rewarded by "the market" and subsequent losses shifted onto the gullible public. But never mind - we all know that the crisis of values in modern societies comes from post-modernist nihilism and the indoctrination of the young by a motley gang of feminists, gay rights activists, and unshaven academics in tweed jackets. Why, oh, why is it so difficult even for highly cultured and talented people like Theodore Dalrymple and Kay Hymowitz to connect the dots?

Testing the Zeitgeist

Writing for the “Language Log,” Mark Liberman deconstructs the recent NYT article on the potential perils of “Growing up Digital.” He apparently wants to send out a general warning against the recent explosion of alarmist pop neuroscience, since his post is titled “Your Brain on …?” Liberman thinks one of the main studies quoted in the NYT piece has methodological flaws, and therefore that the article provides no sound proof regarding the effects of video games and thrilling video material on kids’ brains. He warns against alarmist stories with “a high ratio of stereotype and anecdote to fact,” as opposed to “serious large-scale studies of causes and effects.” Each impressionistic account should be seen for what it truly is – just another case of “ritual inter-generational hand-wringing.” Like, you know, Socrates’s worries about the negative effects of writing, concerns about the printing press, or the telephone (Liberman quotes a NYT article from 1924 describing the telephone as that “most persistent and … most penetrating” aspect of “the jagged city and its machines,” which “go by fits, forever speeding and slackening and speeding again, so that there is no certainty”; with the benefit of hindsight, we can now see how totally, utterly baseless all such alarmist premonitions have been).

The comments below Liberman’s post mostly support his blasé attitude. True, one reader (a self-described scientist who knows all too well “that anecdotal evidence doesn't mean squat in science”) does sound a note of concern. In his view, sometimes a phenomenon may become so “prevalent that you don't need science to tell you of its existence, instead perhaps only of its severity, and even then, sometimes it takes a while for scientists to come up with a good way to empirically quantify these things.” This might just be the case now, if we take his own observations seriously: “I'm young, and I have experienced for myself how constant exposure to the internet and games have severely harmed my own ability to concentrate and focus on tasks for long periods at a time (meaning, any longer than half an hour). But moreover, all my friends are having the same problem.” Another reader, though, immediately counters these worries. Citing his own superhuman powers of concentration at age 58, after decades of gaming, he concludes: “Your anecdotal narrative is no more proof of anything than mine.” A more sympathetic young reader admits: “I … use the internet many hours most days, and have serious trouble with concentration, procrastination, discipline at work, and so on, and yes, many of my friends have similar problems.” But does he see any causal link here? Not necessarily. In his universe, “without a control group of friends who don’t use the internet so much, I don’t see how we can fairly put the blame on it!”

I have the following hypothesis regarding the total reliance of such superintelligent researchers (most are male, thus the ubiquitous “he” above; but women are not immune to the syndrome) on clear-cut experimental proof and their intense scorn for “anecdotal narratives” (why would some call it “evidence,” really?). If the left hemisphere of your brain is overdeveloped (a requirement for – and partly the effect of – a successful scientific career these days), it will inhibit the more inchoate impulses generated by the right hemisphere. As a result, you will tend to focus on observable causal relationships among isolated “variables”; and you won’t be able to step back and sense some overall patterns.
Liberman and his fellow-travelers will, of course, dismiss such a glib explanation as a groundless overgeneralization by someone who should never have been granted a Ph.D. in a social discipline. They will continue to study language, of all things, applying the one and only scientific method that can produce true knowledge; and to throw out the CVs of job applicants who show the slightest deviation from the scientific canon. More ominously, others will continue to apply the same mindset and methodology to the study of society, politics, the economy – and even the human psyche. And their predictions will never be proven wrong, no matter how severe the next crisis they miss may turn out to be. As Iain McGilchrist notes in “The Master and His Emissary,” one of the benefits of having a hypertrophied left hemisphere is immunity from self-doubt.

Don’t daydream – ever!

A recent study has concluded that a “wandering mind may lead to unhappiness.” That is, if your mind strays too often from the task you need to perform, you are likelier to experience some depressive thoughts and feelings. Staying focused on that task, on the other hand, would make you happier. As we all know, it can even give you that elusive high referred to as “flow.” Since moments of mind-wandering tended to precede spurts of moodiness, the researchers concluded that the former were causing the latter, not the other way around. My money, though, would be on a different explanation. Could this be another spurious correlation, with both variables being determined by a third, less obvious one? Maybe more impulsive (or compulsive) people would be more likely to succumb to uncontrollable ruminations. This, of course, is a classic recipe for depression. But the weaker self-control which produces impulsiveness is generally associated with negative emotionality (except for cases of hypomania, when individuals experience a chronic, invigorating high). So, people with robust self-control (those who would always wait for the second marshmallow) have no reason to fear they might experience a temporary mood disorder if they spend a bit longer ironing or self-grooming (the kind of tasks which seem to predispose us most to mind-wandering). Cutting down on such chores, though, might make everyone happier. I hope some clever research team will think up a series of ingenious experiments to test this hypothesis.
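If my hunch is right, the pattern the researchers found is easy to mimic with a toy simulation. Here is a minimal sketch (the variable names and effect sizes are invented purely for illustration, not taken from the study): a single “impulsivity” trait drives both mind-wandering and low mood, the two outcomes come out correlated even though neither causes the other, and the correlation all but vanishes once the confounder is held fixed.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical confounder: trait impulsivity (i.e. weak self-control).
impulsivity = rng.normal(size=n)

# Neither outcome causes the other; both are driven by impulsivity plus noise.
mind_wandering = 0.6 * impulsivity + rng.normal(size=n)
negative_mood = 0.6 * impulsivity + rng.normal(size=n)

# Raw correlation looks like "wandering minds are unhappy minds"...
print(np.corrcoef(mind_wandering, negative_mood)[0, 1])  # roughly 0.26

# ...but it largely disappears once the confounder is regressed out
# (a crude partial correlation computed from the residuals).
def residualize(y, x):
    slope = np.cov(x, y)[0, 1] / np.var(x, ddof=1)
    return y - slope * x

resid_wandering = residualize(mind_wandering, impulsivity)
resid_mood = residualize(negative_mood, impulsivity)
print(np.corrcoef(resid_wandering, resid_mood)[0, 1])  # roughly 0.0
```

Of course, a simulation like this only shows that such a confounded pattern is possible, not that it is what the study actually picked up.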

Saturday, December 4, 2010

Just focus!

A couple of weeks ago it was revealed that typesetters had accidentally opened the wrong file for the British edition of Jonathan Franzen’s thick new novel, “Freedom.” As a result, 80,000 copies of the wrong draft were printed and had to be pulped. This may seem unrelated, but a recent study of data compiled in Oklahoma has revealed that surgical errors may be on the increase, despite detailed protocols aimed at avoiding them (and the related malpractice lawsuits). In recent years, these have included things like operating on the wrong patient, organ, side of the brain, etc. – sometimes with lethal consequences. And the BBC web site still has a “Skillswise Factsheet” posted providing instructions on how to construct a good paragraph – and it contains the following gem: “What does the topic sentence do? It introduces the main idea of the sentence.” It has been maybe two years since I first saw it, and it is still there, unchanged. I would be curious to know to what extent this apparent difficulty in staying focused and paying attention to detail might make engineering errors (and even friendly-fire accidents) more common than they would otherwise be. But there must be a way to dispel the mental fog induced by an increasingly complex, fast-paced, and technologically saturated social environment – yes, you guessed it, by applying even more innovative and immersive technologies.

Don't cut off your ear!

Jonah Lehrer describes in his blog (“Feeling Sad Makes Us More Creative?”) a recent study which seems to confirm “that people who are a little bit miserable” (like Van Gogh) are more creative (or innovative). He concludes that “the cliché might be true after all. Angst has creative perks.” I recall that some time ago Lehrer wrote about the upside of depression, but his focus is narrower now. When a researcher induced sad feelings and thoughts in experimental subjects, they produced collages which were judged a (statistically significant) tad more creative than those of controls. Sadness also seemed to make subjects more attentive and detail-oriented, and to generally sharpen their “information processing strategies.” Apparently, such focus and diligence are quite helpful in performing various tasks – “writing a poem or solving a hard technical problem.” As further proof, Lehrer points to a survey which found that 80 per cent of writers who participated in one workshop “met the formal diagnostic criteria for some form of depression.” In my naïveté, I have always thought artistic creativity is a bit different from the sparks of innovation that have given us the atomic bomb and Facebook. While the latter could easily come to people who meet the diagnostic criteria for a disorder on the autistic spectrum, the former would seem to hinge on intense emotional attunement and expression. In that case, heightened sensitivity could produce both depressive slumps (or even madness) and creative surges. So having raw nerves would make it likelier that you 1) cut off your ear, and 2) achieve artistic recognition, and maybe even greatness. It seems like a classic case where one independent variable (emotional sensitivity/intensity) determines two dependent variables (depressive moods and creativity); therefore the correlation between the two does not signify causation. On the basis of this theory, I do have the hunch that sacrificing any body part is highly unlikely to unleash the creative potential pent up in your skull. And this would apply to geeks, too – so maybe Lehrer is right and there is no meaningful difference between the two areas of creativity or innovation.

Facebook will save the world

I knew Facebook had already done a lot to upgrade the lives of millions, but apparently its most important contribution to humanity still lies ahead ("The Age of Possibility"). In the NYT, Roger Cohen describes a momentous global transformation which will shift the center of economic and political gravity in the world from the West to the rest. He is well aware that similar transitions in the past have involved enormous bloodshed and suffering. Yet he is "not too worried." What gives him confidence that this time things will be different? The first item on his list is "the web of social networks that now span the globe." The one example Cohen gives on this account reads: "Half a billion Facebook users constitute some sort of insurance against disaggregation." In his view, "being in touch in ways that dissolve national borders makes it more difficult to be in large-scale violent conflict across fault lines." I am thinking - who else wasn't "too worried" and thought that this time things would be different? Oh, yes - the dotcom crowd 10 years ago, and the bankers (plus the millions of small "investors" in "home equity" who took their bait). But let's stay away from such far-fetched analogies. We all know that one day things will be really, truly different. If we could only believe strongly enough.