Sunday, December 5, 2010

The crisis of values

Here is how Paul Krugman renders the last 20 years of Irish history in the NYT ("Eating the Irish"):

"The Irish story began with a genuine economic miracle. But eventually this gave way to a speculative frenzy driven by runaway banks and real estate developers, all in a cozy relationship with leading politicians. The frenzy was financed with huge borrowing on the part of Irish banks, largely from banks in other European nations.

"Then the bubble burst, and those banks faced huge losses. You might have expected those who lent money to the banks to share in the losses. After all, they were consenting adults, and if they failed to understand the risks they were taking that was nobody’s fault but their own. But, no, the Irish government stepped in to guarantee the banks’ debt, turning private losses into public obligations.

"Before the bank bust, Ireland had little public debt. But with taxpayers suddenly on the hook for gigantic bank losses, even as revenues plunged, the nation’s creditworthiness was put in doubt. So Ireland tried to reassure the markets with a harsh program of spending cuts.

"Step back for a minute and think about that. These debts were incurred, not to pay for public programs, but by private wheeler-dealers seeking nothing but their own profit. Yet ordinary Irish citizens are now bearing the burden of those debts.

"Or to be more accurate, they’re bearing a burden much larger than the debt — because those spending cuts have caused a severe recession so that in addition to taking on the banks’ debts, the Irish are suffering from plunging incomes and high unemployment."

So, private investors and bankers pocketed huge profits while the market was on a roll (or the bubble was being inflated); but the Irish government promptly nationalized their potential losses when the chips were down. One might wonder what this scheme would do to the sense of fairness and just returns of the Irish - and others who have seen socially destructive economic practices lavishly rewarded by "the market" and subsequent losses shifted onto the gullible public. But never mind - we all know that the crisis of values in modern societies comes from post-modernist nihilism and the indoctrination of the young by a motley gang of feminists, gay rights activists, and unshaven academics in tweed jackets. Why, oh, why is it so difficult even for highly cultured and talented people like Theodore Dalrymple and Kay Hymowitz to connect the dots?

Testing the Zeitgeist

Writing for the “Language Log,” Mark Liberman deconstructs the recent NYT article on the potential perils of “Growing up Digital.” He apparently wants to send out a general warning against the recent explosion of alarmist pop neuroscience since his post is titled “Your Brain on …?” Liberman thinks one of the main studies quoted in the NYT piece has methodological flaws, therefore the article provides no sound proof regarding the effects of video games and thrilling video material on kids’ brains. He warns against alarmist stories with “a high ratio of stereotype and anecdote to fact,” as opposed to “serious large-scale studies of causes and effects.” Each impressionistic account should be seen for what it truly is – just another case of “ritual inter-generational hand-wringing.” Like, you know, Socrates’s worries about the negative effects of writing, concerns about the printing press, or the telephone (Liberman quotes a NYT article from 1924 describing the telephone as that “most persistent and … most penetrating” aspect of “the jagged city and its machines,” which “go by fits, forever speeding and slackening and speeding again, so that there is no certainty”; with the benefit of hindsight, we can now see how totally, utterly baseless all such alarmist premonitions have been). The comments below Liberman’s post mostly support his blasé attitude. True, one reader (a self-described scientist who knows all too well “that anecdotal evidence doesn't mean squat in science”) does sound some concern. 
In his view, sometimes a phenomenon may become so “prevalent that you don't need science to tell you of its existence, instead perhaps only of its severity, and even then, sometimes it takes a while for scientists to come up with a good way to empirically quantify these things.” This might just be the case now, if we take his own observations seriously: “I'm young, and I have experienced for myself how constant exposure to the internet and games have severely harmed my own ability to concentrate and focus on tasks for long periods at a time (meaning, any longer than half an hour). But moreover, all my friends are having the same problem.” Another reader, though, immediately counters these worries. Citing his own superhuman powers of concentration at age 58, after decades of gaming, he concludes: “Your anecdotal narrative is no more proof of anything than mine.” A more sympathetic young reader admits: “I … use the internet many hours most days, and have serious trouble with concentration, procrastination, discipline at work, and so on, and yes, many of my friends have similar problems.” But does he see any causal link here? Not necessarily. In his universe, “without a control group of friends who don’t use the internet so much, I don’t see how we can fairly put the blame on it!” I have the following hypothesis regarding the total reliance of such superintelligent researchers (most are male, thus the ubiquitous “he” above; but women are not immune to the syndrome) on clear-cut experimental proof and their intense scorn for “anecdotal narratives” (why would some call it “evidence,” really?). If the left hemisphere of your brain is overdeveloped (a requirement for – and partly the effect of – a successful scientific career these days), it will inhibit the more inchoate impulses generated by the right hemisphere. 
As a result, you will tend to focus on observable causal relationships among isolated “variables”; and you won’t be able to step back and sense some overall patterns. Liberman and his fellow-travelers will, of course, dismiss such a glib explanation as a groundless overgeneralization by someone who should never have been granted a Ph.D. in a social discipline. They will continue to study language, of all things, applying the one and only scientific method that can produce true knowledge; and to throw out the CVs of job applicants who show the slightest deviation from the scientific canon. More ominously, others will continue to apply the same mindset and methodology to the study of society, politics, the economy – and even the human psyche. And their predictions will never be proven wrong, no matter how severe the next crisis they miss may turn out to be. As Iain McGilchrist notes in “The Master and His Emissary,” one of the benefits of having a hypertrophied left hemisphere is immunity from self-doubt.

Don’t daydream – ever!

A recent study has concluded that a “wandering mind may lead to unhappiness.” That is, if your mind strays too often from the task you need to perform, you are likelier to experience some depressive thoughts and feelings. Staying focused on that task, on the other hand, would make you happier. As we all know, it can even give you that elusive high referred to as “flow.” Since moments of mind-wandering tended to precede spurts of moodiness, the researchers concluded that the former were causing the latter, not the other way around. My money, though, would be on a different explanation. Could this be another spurious correlation, both variables being determined by a third, less obvious one? Maybe more impulsive (or compulsive) people would be more likely to succumb to uncontrollable ruminations. This, of course, is a classic recipe for depression. But the weaker self-control which produces impulsiveness is generally associated with negative emotionality (except for cases of hypomania, when individuals experience a chronic, invigorating high). So, people with robust self-control (those who would always wait for the second marshmallow) have no reason to fear they might experience a temporary mood disorder if they spend a bit longer ironing or self-grooming (the kind of tasks which seem to predispose us most to mind-wandering). Cutting down on such chores, though, might make everyone happier. I hope some clever research team will think up a series of ingenious experiments to test this hypothesis.
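For the statistically inclined, the common-cause pattern suggested above is easy to simulate. Here is a minimal sketch (with entirely made-up variables and numbers, purely for illustration): a hidden trait ("impulsiveness") drives both mind-wandering and low mood, so the two observed variables end up correlated even though neither causes the other.

```python
import random

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(0)
# The hidden third variable: each simulated person's impulsiveness.
impulsiveness = [random.gauss(0, 1) for _ in range(10_000)]
# Both observed variables depend on impulsiveness plus independent
# noise; crucially, neither one depends on the other.
mind_wandering = [s + random.gauss(0, 1) for s in impulsiveness]
unhappiness = [s + random.gauss(0, 1) for s in impulsiveness]

# A clearly positive correlation emerges (about 0.5 under these
# assumptions) despite the absence of any causal link between them.
print(round(pearson_r(mind_wandering, unhappiness), 2))
```

Time-ordering does not rescue the causal inference either: if impulsive rumination simply tends to set in before the mood dips, mind-wandering will precede moodiness in the data while still causing none of it.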

Saturday, December 4, 2010

Just focus!

A couple of weeks ago it was revealed that typesetters had accidentally opened the wrong file for the British edition of Jonathan Franzen’s thick new novel, “Freedom.” As a result, 80,000 copies of the wrong draft were printed, and needed to be pulped. This may seem unrelated, but a recent study of data compiled in Oklahoma has revealed that surgical errors may be on the increase, despite detailed protocols aimed at avoiding them (and related malpractice lawsuits). In recent years, these have included things like operating on the wrong patient, organ, side of the brain, etc. – sometimes with lethal consequences. And the BBC web site still has a “Skillwise Factsheet” posted providing instructions on how to construct a good paragraph – and it contains the following gem: “What does the topic sentence do? It introduces the main idea of the sentence.” It has been maybe two years since I first saw it, and it is still there, unchanged. I would be curious to what extent this apparent difficulty in staying focused and paying attention to detail might make engineering errors (and even friendly fire accidents) more common than they would normally be. But there must be a way to dispel the mental fog induced by an increasingly complex, fast-paced and technologically saturated social environment – yes, you guessed it, by applying even more innovative and immersive technologies.

Don't cut off your ear!

Jonah Lehrer describes in his blog (“Feeling Sad Makes Us More Creative?”) a recent study which seems to confirm “that people who are a little bit miserable” (like Van Gogh) are more creative (or innovative). He concludes that “the cliché might be true after all. Angst has creative perks.” I recall some time ago Lehrer already wrote about the upside of depression, but his focus is narrower now. When a researcher induced sad feelings and thoughts in experimental subjects, they produced collages which were judged a (statistically significant) tad more creative as compared to controls. Sadness also seemed to make subjects more attentive and detail-oriented, and to generally sharpen their “information processing strategies.” Apparently, such focus and diligence are quite helpful in performing various tasks – “writing a poem or solving a hard technical problem.” As further proof, Lehrer points to a survey which found that 80 per cent of writers who participated in one workshop “met the formal diagnostic criteria for some form of depression.” In my naïveté, I have always thought artistic creativity is a bit different from the sparks of innovation that have given us the atomic bomb and Facebook. While the latter could easily come to people who meet the diagnostic criteria for some part of the autistic spectrum, the former would seem to hinge on intense emotional attunement and expression. In that case, heightened sensitivity could produce both depressive slumps (or even madness) and creative surges. So having raw nerves would make it likelier that you 1) cut off your ear, and 2) achieve artistic recognition, and maybe even greatness. It seems like a classic case where one independent variable (emotional sensitivity/intensity) determines two dependent variables (depressive moods and creativity); therefore the correlation between those does not signify causation. 
On the basis of this theory, I do have the hunch that sacrificing any body part is highly unlikely to unleash the creative potential pent up in your skull. And this would apply to geeks, too – so maybe Lehrer is right and there is no meaningful difference between the two areas of creativity or innovation.

Facebook will save the world

I knew Facebook had already done a lot to upgrade the lives of millions, but apparently its most important contribution to humanity still lies ahead ("The Age of Possibility"). In the NYT, Roger Cohen describes a momentous global transformation which will shift the center of economic and political gravity in the world from the West to the rest. He is well aware that similar transitions in the past have involved enormous bloodshed and suffering. Yet, he is "not too worried." What gives him confidence that this time things will be different? The first item on his list is "the web of social networks that now span the globe." The one example Cohen gives on this account reads: "Half a billion Facebook users constitute some sort of insurance against disaggregation." In his view, "being in touch in ways that dissolve national borders makes it more difficult to be in large-scale violent conflict across fault lines." I am thinking - who else wasn't "too worried" and thought that this time things would be different? Oh, yes - the dotcom crowd 10 years ago, and the bankers (plus the millions of small "investors" in "home equity" who took their bait). But let's stay away from such far-fetched analogies. We all know that one day things will be really, truly different. If we could only believe strongly enough.

Sunday, October 31, 2010

The Harvard culinary show

Harvard has started offering a basic science course where the lab work is cooking. It employs a couple of celebrity chefs and is immensely popular. The idea is to introduce some formulas and concepts from chemistry and physics to analyze different processes students observe in the course of food preparation. A couple of years ago I read somewhere that as high school students were increasingly uninterested in basic science, they were being offered new courses in applied disciplines - like forensic science. Let me think - if Harvard is now using cooking as a way to make chemistry and physics more relevant to its students, what should Indiana University in Gary do to achieve a similar effect? Teach massage?

Just like you and me

There is a new German movie out on the life of the young Goethe. A young actress interviewed on Deutsche Welle TV described how she had been thrilled to find out that Goethe had, in fact, loved, drunk, and joked – just like her and her friends. My first reaction was: “Yeah, right…” But, on second thought, why put anyone on a pedestal, really? And why assume that 200 years ago celebrities – of any age – were any different from what they are today?

The importance of being earnest

Michael Kimmelman describes in the NYT a visit to the House of Humor and Satire in Gabrovo – a humorless, once industrial city in Bulgaria (“Take My Bulgarian Joke Book. Please.”). Most of the piece is quite condescending toward the place and the tour guide/PR officer who welcomed the author. Until at the very end he recognizes in what he sees a kind of earnestness he and his kind seem to have lost. Granted, approaching the outside world with a sense of irony and detachment is a sign of unmistakable sophistication. But being unable to leave your irony behind must be an utmost curse. G. K. Chesterton once wrote: “Oscar Wilde said that sunsets were not valued because we could not pay for sunsets. But Oscar Wilde was wrong; we can pay for sunsets. We can pay for them by not being Oscar Wilde.” By the way, the Don Quixote statue next to the Gabrovo satirical shrine is astounding – all made of iron scrap welded together to capture the true spirit of Cervantes’s otherworldly hero. Apparently, Mr. Kimmelman wasn’t sufficiently impressed to include it in the picture of the Gabrovo attraction accompanying his article.

Monday, October 25, 2010

The price of progress?

As the Chilean president was handing medals and flags to the 33 miners in his palace, he took time to hug each one of them. Emotions were clearly overflowing - no Pan Am smiles there. I was going to say - of course, this is the reason why even Chile, the Latin American tiger (if there is one), will never be Switzerland. But let's not stereotype.

Sunday, October 24, 2010

The continuing conquest of cool

A NYT article (“Looking to a Sneaker for a Band’s Big Break”) says lifestyle brands are fast becoming the new recording labels. Converse, for example, has set up a studio in which young musicians can make new recordings for free. Companies will sponsor different aspects of the music production, marketing, and distribution process, and sometimes acquire songs to give away at their own web sites. The overall strategy is for the youth-oriented brands to become patrons of hip music stars and thus acquire “coolness by association.” In the past, such “arrangements would have carried a stigma for the artists,” being seen as a sellout to the evil empire. Now they are embraced with casual enthusiasm. A hot female musician confidently proclaims: “Music is everywhere now, and if you have it tied to a brand, there’s nothing wrong with that.” The article mentions that the new largesse Converse and others have adopted is part of a more general strategy aiming “to infiltrate the lives of their customers on an ever deeper cultural level.” But never mind, we cannot really expect young musicians to connect those dots, can we? Even if some boast college degrees.

Planet of the apes

Writing in the NYT, renowned primatologist Frans de Waal explains the utter feasibility of “Morals without God.” He bases his conviction on a theory of “continuity between human and animal,” or a denial of “human exceptionalism.” From this point of view, there is no qualitative difference between the way the human brain churns out an ethical judgment, and how a chimp’s brain motivates some altruistic behaviors. Nay, there isn’t really a meaningful quantitative difference in the structure of the human and the ape brain – “even our vaunted pre-frontal cortex turns out to be of typical size: recent neuron-counting techniques classify the human brain as …” Really? Reading Dr. de Waal’s exposé, I would suspect that his and my brain click in qualitatively different rhythms, to say nothing of the brains of Antoine de Saint-Exupéry and Cheeky Charley.

De Waal focuses on the altruistic tendencies we share with our primate cousins. A chimp, you see, will sometimes help – without the promise of any reward! – an arthritic elderly female climb a tree to hang out with her kin; or will console a male who has lost a fight. Though de Waal says we should see the whole package of human motivations and behaviors as a product of evolution and a legacy we share with the animal kingdom, it’s clear where his heart lies. He wants to revive the old tale of the “noble savage” once popularized by Rousseau – the altruistic, compassionate side of our psychological makeup is inborn or natural (and thus shared with kindly behaving animals); and the nasty aspects come from the way our natural goodness has been twisted by “civilization.” Forget about those allegedly aggressive drives Freud fretted over, ready to break through the “veneer” of civilized “propriety.” But why forget about them? In an older article, de Waal drew a contrast between chimps and their close relatives, the bonobos. The latter have apparently invented the ape version of la dolce vita: they engage in constant mutual grooming and casual sex, and spend most of their time in leisurely companionship and relaxation. Chimps, on the other hand, live in troops with rigid hierarchies where status is won and lost by a combination of fierce fighting and Machiavellianism. Submissive families occasionally stage coups against dominant ones, and chimp platoons sometimes even wage “wars” against other colonies. In general, the lives of young males (who, after puberty, need to win acceptance in a new troop) are often nasty, brutish, and short. Females fare a bit better, but most also need to show deference to the dominant female. They can also be savagely attacked by raiding males from other troops. So, should we attribute human bestiality, not just those spurts of altruism de Waal highlights, to the natural endowment we share with chimps?

As I was reading de Waal’s incisive analysis, I repeatedly cringed at all those evocations of obvious behavioral kinship between “us” and apes. Apparently, he did not cringe while typing out the whole piece. This could be a matter of personal idiosyncrasies, including presence or lack of the “left-brained” sharpness needed for a high-flying scientific career. I am trying to banish from my mind another heretical thought, though. Could de Waal’s attitudes, which betray some very peculiar patterns of brain activation, be partly attributed to his own biocultural heritage? He is Dutch, and if you look at Dutch society, it seems pervaded – even against the backdrop of rising xenophobic fears – by a kind of bonobo-style, relaxed permissiveness combined with easy-going utilitarianism. Soft drugs, harder drugs administered to addicts, prostitutes posing in display windows, euthanasia, open-minded attitudes toward teenage sex, open-air urinals at events attracting large numbers of beer-gulping young men – please, help yourself, you can have it all. Could such broad-mindedness partly translate into cheerful praise for the natural goodness we ostensibly share with those good-hearted, altruistic chimps? But probably we shouldn't stereotype - neither the Dutch, nor the chimps.

The end of disgust (among other things)?

An economist recently complained in the NYT about the rise of inequality in the United States. After the furor caused by the publication of The Spirit Level, it has apparently become acceptable even for practitioners of the “dismal science” to frown at extreme inequality. If not to condemn it on the basis of a frivolous “value judgment,” at least to point to its troublesome social consequences – including its negative externalities for even some of the top dogs. Call it the “negative utility” of wealth. The NYT contributor draws a stark contrast between two ages in American history: the 1950s, when decreased income and wealth inequalities (and a marginal federal tax rate of 91 per cent for incomes over $200,000 – or two million in today’s dollars) went hand in hand with rapid economic growth; and the period since the 1980s featuring rapid economic polarization, much slower growth rates, and a series of financial hiccups. Incidentally, there are other contrasts between the two periods. In the 1960s, the majority of American Caucasians professed to feel disgust if forced to drink from a water fountain after an African-American person. Three decades later such squeamishness at even imaginary contact with individuals belonging to a different “race,” sexual orientation, or subculture had miraculously evaporated. Could this amazing march of tolerance have also come to include tolerance of gross inequality, as a result of weakened disgust at the obscene salaries and profits reaped by the best and the brightest in some sectors of the economy? And their casual flaunting of extreme opulence?

Tuesday, October 19, 2010

The experimenter’s dilemma

A lengthy feature in Prospect (“Matters of Life and Death”) describes the multiple experiments carried out by experimental philosophers seeking to understand the nature of ethical judgment. These all revolve around the famous trolley/footbridge dilemma: a runaway trolley is racing down a rail track and is going to kill five people. Would you pull a lever to divert it onto a different track where it will kill only one? Now imagine the same situation, but you are standing on a footbridge above the track, and the only way to avert the bloodbath is to push a “large” person standing next to you in front of the trolley. Would you do it? In the first case, most experimental subjects respond “yes,” on the basis of a simple utilitarian calculation: it is well worth saving five lives at the price of one. In the second case, most participants say they will not do it. And often cannot explain why. In my naïveté, I thought these thought experiments demonstrated that ethical judgments can be influenced by our instinctive emotional reactions. This is most likely to happen in situations which feel close-up-and-personal – like pushing someone to his death (I hope the gendered language would be acceptable here). When we operate a mechanical device (like those drones hunting down those Taliban militants in Pakistan?), it’s easier to keep our emotions at bay and rely on utilitarian calculations. A new crop of experimental philosophers, though, are unsatisfied with this interpretation. They want to know on the basis of what ethical doctrine exactly most people can decide to pull the lever, but would not push a warm, breathing human body in front of the racing trolley. So they design increasingly clever experiments to tease this out – in dozens of versions. What if the person on the second track had been tied down there by bullies? What if those bullies, unknowingly, had also put themselves in harm’s way by picnicking on the first track? 
What if the second track looped and joined the first track – in which case you would need to wish that the single person be killed in order to save the others? One scientist (“scholar” doesn’t seem the right word to describe this academic occupation) explains the goal of her experiments (for some reason, most of these practitioners are women): “Real-life cases have a lot of factors going on, and it’s hard to test whether it’s this factor that’s crucial or that factor. You have to artificially construct cases to focus on the factors that are important. It’s like the scientist in the lab who has to figure out whether, say, the dust particle makes a difference to friction, and tries to hold everything else constant.” You know, as they do it in real science. As I was reading, I was increasingly thinking: “What is wrong with these people? Why can’t they just accept the ‘fox doctrine’ (after the fox from the Little Prince) or the ‘Pascal doctrine,’ both stating that certain things can be understood only through the heart?” One possibility is that to the uniformly cheerful researchers conducting the experiments the different scenarios don’t feel all that different. So finding some coherent doctrine seems the only possible explanation for choosing one course of action over other similarly unpalatable options. On the face of it, this seems unlikely – the article says responses among experimental subjects are uninfluenced by social status or educational level. But the article also mentions a dispute during WW II between Winston Churchill and one of his cabinet ministers on whether to try to have more V1 cruise missiles rain over south London. 
The minister, a policeman’s son, “perhaps felt more keenly … the risk that the people in the working-class areas of south London would be running.” So maybe there are some meaningful differences in how individuals think and feel about ethical dilemmas – and the philosophers conducting the experiments (like Churchill) are overly clever and upbeat outliers.

Monday, October 11, 2010

Can we handle the truth?

Yesterday on CNN, Fareed Zakaria hosted a panel which discussed the tea party movement in the United States. On it, two liberal historians faced off against a journalist from the Wall Street Journal. One of the historians kept repeating that the movement had been fanned by Fox News and funded by a few angry billionaires. It’s funny how easily any trend we dislike turns into a shallow conspiracy. Milosevic once alleged as much of Albanian nationalism in Kosovo. And liberal Western intellectuals thought the same of Milosevic’s brand of Serb nationalism. Meanwhile, a New York Times article (“Voter Disgust Isn’t Only About Issues”) says independent voters participating in focus groups indicated they saw specific political and economic problems as part of a broader social malaise: “the larger breakdown of civil society – the disappearance of common courtesy, the relentless stream of data from digital devices,* the proliferation of lawsuits and the insidious influence of media on their children.” The Wall Street Journal woman retorted that the tea party in fact had a sensible economic agenda of shrinking a bloated government machine. The other historian lamented that all those stirrings around tea party populism were leaving aside a fundamental issue – the problem of social justice. Historically, governments have been charged with restraining the “innumerable vultures” John Stuart Mill thought could be found in any society. The tea party rank and file, though, feel quite happy to side with the sharks against the only force which could potentially control their predatory greed and “perpetual and restless desire of power after power.” Apparently, no amount of liberal hand-wringing can help the “government-is-the-problem” crowd start connecting the dots. Meanwhile, in Belgrade protesters tried to attack a gay rights procession and injured over 80 riot policemen. That outburst must have been another political conspiracy. 
You know, in the sense of politics is about “who gets what, when, how.” Except, it’s unclear what the rioters could possibly hope to “get” in this case. Such violence may raise a troubling question: can “civilized” political institutions function in a society which includes a critical mass of uncivilized young males? I guess most political scientists would answer in the affirmative. Some would discount the significance of political culture, arguing that political actors respond rationally to the incentive structures they face. Others would argue that an appropriate set of attitudes can develop as a result of learning within a democratic institutional framework. I do hope one of these theories is right.

* Oops – the Google founders will probably be ticked that many Americans don’t seem to take their corporate slogan seriously

Tuesday, October 5, 2010

Wittgenstein did have a wonderful life!!!

In his book, The Temperamental Thread, Harvard developmental psychologist Jerome Kagan summarizes his findings from decades of painstaking research and hundreds of clever experiments. He describes two basic inborn temperaments: low-reactive and high-reactive. While low-reactive individuals are relaxed, high-reactives are uptight perfectionists who are easily disturbed by sights, sounds, and even minor incidents. In a couple of places Dr. Kagan expresses sympathy for the high-reactives whose lives seem to be one unending torture. “I confess to some sadness,” he says, “when I reflect on the fact that some adults, because of the temperament they inherited, find it difficult to experience on most days the relaxed feeling of happiness that a majority in our society believe is life’s primary purpose.” And since high-reactives tend to be deeply introverted, Dr. Kagan expresses sorrow that they “miss the joys that come from meeting new people and visiting new places.” Well, they do “have the advantage of living a few years longer than extroverts.” But how can this compensate for all the cheerless suffering they are destined to endure? And what if some high-reactives find their own life satisfying at some deeper level? Dr. Kagan thinks they should know better. He gives the example of Ludwig Wittgenstein who suffered many personal misfortunes, “never put roots down in any one place,” and “was profoundly depressed and anxious his entire life.” At one low point he even confessed “that he could not imagine a future with any joy or friendship.” Yet, on his deathbed he said to an attending relative: “Tell them I’ve had a wonderful life.” Dr. Kagan’s verdict? “This comment provides sufficient reason to question the meaning and accuracy of what people say about their moods and behaviors.” So, Wittgenstein wasn’t really in his right mind. I wish Dr. 
Kagan could fathom what it means to lead a truly intense life like Wittgenstein’s; to say nothing of the lives of all those poets, philosophers, mathematicians, etc. who have descended into madness, committed suicide, or narrowly escaped such a fate. A book with an evocative title, Living with Intensity, offers a good introduction to this tricky issue once addressed by Polish psychologist Kazimierz Dabrowski. But reading a book, or piles of psychological “findings” for that matter, won’t help you appreciate that extatic mode of “being-in-the-world” unless you can feel some of its emotional intensity in your own gut. Judging by the unfailingly reserved and even tone of Dr. Kagan’s writing, he has successfully avoided that developmental curse. He does recognize the usefulness of all those wretched high-reactives in his own work, though. He has regularly hired them as research assistants because they are oh so conscientious.

Friday, September 24, 2010

Long live transgression!

The Venice film festival was overshadowed by the controversy surrounding the awarding of the main prize to Sofia Coppola – by a jury chaired by her former boyfriend Quentin Tarantino. What caught my attention, though, was another movie featured briefly in the “Cinema” segment on Euronews. I think the actors spoke German, though I am not quite sure. In one of the scenes, a 20-something daughter and her 40-something dad sit on a row of white chairs in some waiting room – a few feet apart in a spotlessly sterile environment, meant probably as a metaphor for their existential distance. Here is roughly the dialogue that ensued:
Daughter: “Have you imagined me naked?”
Dad: “No, I haven’t.”
Daughter: “Is it because it is some kind of taboo?”
Dad: “Yes. And such taboos exist for a reason among mammals – so that they can procreate.”
Daughter: “Well… I have imagined you naked.”
As that famous ad addressed to young women in the 1960s said, “You’ve come a long way, baby.” Indeed. If anyone has doubts on that account, how about that German gunwoman who killed three people (including her five-year-old son and her former husband) before dying in a hail of police-fired bullets?

But the desire to transgress constraining prohibitions is neither gender- nor country-bound. It has become truly all-encompassing – a trend which comes through very clearly, for example, in a NYT article about a new strand of culinary experimentation (“Waiter, There’s Soup in My Bug”). It features a chef and artist who recently organized a feast with all kinds of insects and larvae – dead and alive – on the menu. He raises these in his own apartment, in miniature houses designed by his girlfriend – also an artist. These are now on display in some gallery as a daring work of art. The event itself was billed as half meal, half performance art, with a modest 85-dollar price tag. A few of the guests could not overcome their narrow-minded prejudice or disgust and went home hungry. But most relished the treats they were served. And it wasn’t just the taste of it all – no, they were exhilarated that they had crossed such a difficult threshold. Now, some felt, anything was possible – nothing could hold them back in the pursuit of all kinds of life satisfaction. At a similar event some time ago, the intoxication produced by this act of culinary transgression apparently helped the participants overcome some unrelated inhibitions and they began hugging each other; a few even started groping and kissing in a corner. As the culinary artist says, once you see people eating insects as if it’s the most natural thing to do, “it turns your world upside down a little bit.”

I thought eating insects – in addition to inspiring that invigorating feeling of personal liberation – could solve some nutritional problems. Maybe it could unlock our access to a new locally grown, organic source of protein which is, after all, commonly consumed around the world. But an expert is quoted as saying the global population of edible insects is not that significant on a per capita basis, and people in places where malnutrition is a real problem already snack on all kinds of insects. So maybe we need to go a step further and consider some other organic substances which are currently off the menu – but could be nourishing and abundant if properly prepared and marketed. Even if some of these seem off limits now, maybe in 50 years no such silly squeamishness will stand in the way of technological, social, and moral progress. The kids in that famous psychological experiment who, at maybe four years of age, begin to wrinkle their noses in disgust at the sight of a giant cockroach floating in a glass of water? Maybe their grandchildren will just slurp it – or any other digestible item in its place – without the slightest twitch; and ask for more. A small step on the way to a more rational, or cost-benefit, analysis of what is now still a nutritional dilemma – which could help resolve humankind’s alleged Malthusian predicament once and for all. For now, though, let’s take things one transgression at a time…

Saturday, September 11, 2010

High on pixels?

Dr. Richard Friedman describes in the NYT (“Lasting Pleasures, Robbed by Drug Abuse”) how many of his patients who are addicted to drugs like cocaine seem to lose the ability to enjoy the small things in life. After a while, even the drug itself no longer gives them the same high. And these effects persist even many years after they have kicked the habit. Dr. Friedman explains that mind-altering drugs hijack the reward system of the human brain as they act much more powerfully on it than any natural stimuli. With time, neurons in the brain become less sensitive to all the dopamine being released under the influence of the drug, and pleasure fades away. Is this mechanism activated only by chemical substances, though? I am looking at a review of Fun, Inc. by Tom Chatfield. He thinks that video games are clearly the greatest invention in the history of humanity. Yet, even he recognizes that games are designed to tap into the same reward circuits that are activated by sugar, alcohol, and other drugs. Could they, then, have a similar effect, desensitizing the brain to the smaller pleasures of life?

Karate wisdom

Our daughter took me to see Karate Kid the other day (a remake of the 1984 original). Some of the fighting sequences in the movie are a bit too graphic, but it has a great lesson at its heart: kung fu (which has now replaced karate) is not about beating up on the enemy; it’s about achieving internal balance and self-control. The Chinese bully knows all the moves, but cannot suppress his rage – so he must bow his head in defeat. Most of the critics reviewing the movie are completely missing this point, and make unfavorable comparisons to the original movie which they probably saw in their own youth. This means most kids will probably miss the main point, too. I do hope, though, it will stick in the mind of Will Smith’s cute son who plays the leading role. He will need plenty of self-control as he grows up in order to resist all the temptations and distractions bound to plague the life of a celebrity kid.

Wednesday, September 1, 2010

Liberation technology?

A recent Newsweek article has an ominous title: "The Creativity Crisis." It says creativity scores were rising among young Americans until the early 1990s. Then they started to slowly but steadily fall. Experts are scratching their heads, and most tend to blame the decline on educational reforms emphasizing standardized testing and rote memorization. Incidentally, the 1990s saw the spread of personal computers, video games, and internet use - all on top of hours of TV viewing. Could there be a link here? According to a recent NYT article, neuroscientists now believe the incessant use of electronic devices for instant communication, entertainment, and access to information (including hand-held computers masquerading as cell phones) may be depriving the brain of much needed downtime ("Digital Devices Deprive Brain of Needed Downtime"). This is an aspect of computer use left out of that famous Mac ad evoking Orwell.

"Does Your Language Shape How You Think?"

According to this fascinating NYT article, the short answer is "yes." The only problem with it is that the whole argument proceeds from a very disembodied understanding of thinking as a purely mental activity. So language "shapes" thought by inducing certain "habits" of thinking. It might be much more insightful to see the whole issue in a different light - to consider the way in which our native language must be influencing our brain wiring. From this point of view, it becomes clearer why being a native English speaker (a language which - quite unusually - does not assign gender to inanimate objects) may not exactly enhance your emotional connectedness to the larger social and natural world.

Saturday, August 21, 2010

Creative destruction

The Pavlovsk experimental station outside St. Petersburg holds seeds from millions of varieties of fruits and berries. Most of the seeds there, which come from many countries, are not kept any place else in the world. They need to be planted in order to be preserved as they would not survive freezing. The station was established by an eminent biologist who died in a Stalinist camp in 1943. Now a Russian state agency has decided the land it occupies is not being used profitably. So they want to hand it over to developers who plan to construct luxury apartments on it. A court will rule on that decision on Monday, but is unlikely to find any legal grounds to overturn it. During World War II, twelve Soviet scientists starved to death rather than eat from the seeds which they felt a duty to preserve for future generations. With the benefit of hindsight, they should have known better. Though Putin may still ride in on a white horse and somehow save the plants. Which would cement further his reputation as a protector of ordinary Russians (and some sort of common good) from all sorts of greedy vultures – the only one on offer for some time to come.

Tuesday, August 10, 2010

In memoriam

A journalist writing in a Bulgarian paper called Tony Judt "one of the greatest contemporary historians and analysts." Employing the same word to refer to Judt that is commonly used with reference to TV pundits - it's hard to think of a greater sacrilege. His last piece in the New York Review of Books ("Words"), composed as he was losing control of his vocal cords (after many months on assisted breathing), is a true tribute to the endurance of the human spirit. May he rest in peace...

Sunday, August 8, 2010

The Iron Mask

In an old episode of Mad Men (from its first season) Don Draper meets his younger brother, whom he hasn’t seen in maybe 15 years, having meanwhile started a new life under an assumed identity. One brother comes across as sincere and emotional; the other emits zero emotion and speaks with utter indifference. Guess who is the successful marketing executive and who is the janitor living in a cheap rental room. That emotional suppression comes in very handy in the incessant verbal ping-pong the ad men practice, exchanging wisecracks on all sorts of issues. Repressed emotions also help them maintain a credible façade as they cheat on their frustrated wives. Oh, the price of civilization…

P.S. A few episodes later, I am left with one lingering sensation – the emotional distance between the characters is just staggering. One of the young writers complains that his newly acquired wife is just another stranger - as they all are to one another. To be unable to reach out emotionally to another human being – this must be a truly unusual and severe punishment. This emotional deficit creates incessant jostling for power and status, and thus a treacherous and hostile social (and office) milieu - the kind game theory assumes to be natural and ubiquitous. Of course, Tocqueville foresaw it all when he wrote: “not only does democracy make every man forget his ancestors, but it hides his descendants and separates his contemporaries from him; it throws him back forever upon himself alone and threatens in the end to confine him entirely within the solitude of his own heart.” On the other hand, there seems to be a mismatch between the spirit of the costume drama and the Zeitgeist of the time as reflected in the style of the objects and ads from 1960. Who knows what those people truly thought and felt...

Tuesday, August 3, 2010

Dual use

A NYT piece says "a professor of engineering and neuroscience at Brown University is studying how human brain signals could combine with modern electronics to help paralyzed people gain greater control over their environments." Obviously, the same technology could be used to help perfectly healthy young guys fly fighter jets and helicopters, guide missiles to their targets, etc., all without a "joystick." The Pentagon, of course, is working on this. I am wondering if it is also part of the engineer/neuroscientist's business plan.

Saturday, July 24, 2010

My rest über alles

The latest data says one quarter of American couples now normally sleep in separate bedrooms. Apparently, this is a recent trend which is expected to grow steadily. A former museum director offered the following explanation to a NYT reporter: “Not that we don’t love each other, but at a certain point you just want your own room.” As a sleep specialist observed, "what happened in the last decade, is that people are suddenly making their own sleep a priority. If their rest is being impaired by their partner, the attitude now is that I don’t have to put up with this.” Why, indeed, put up with any personal inconvenience if you don't really have to?

Good guys (and gals) finish last

Two researchers on child development comment in the NYT on a new law in Massachusetts requiring schools to institute anti-bullying programs, investigate complaints, report serious cases, etc. They are concerned the new legislation will encourage schools to make mostly superficial efforts which will not produce real results. The title of the article proclaims: "There’s Only One Way to Stop a Bully" - and this is "to teach children how to be good to one another" and to instill in them "a sense of responsibility for the well-being of others." Let's say schools and teachers decide to make an all-out, determined effort to "instill" these laudable values. On the other hand, children cannot remain blind to the fact that in the jostling for social status going on everywhere around them - from school cliques to boardrooms - it is often the nice guys and gals who finish last. And existing social mechanisms for the distribution of material and non-material rewards often favor, in the words of Paul Krugman, "bad actors." As Donald Trump likes to remind his "apprentices," you do need to be tough, sometimes even mean, if you want to play with the big boys. Not that there is anything inherently wrong with this state of affairs - many will argue that this very harshness of the competition at all levels is at the heart of the dynamism which sets American capitalism apart from the more lethargic European versions. But it's quite obvious which lesson will leave a stronger mark on the minds of most impressionable kids and adolescents.

The revolution will be monetized - well, maybe not this time

A web site called the Great American Apparel Diet invites visitors to commit not to buy any new clothes for a whole year. Since last September, 150 people have taken the pledge, though some have quit. Even the founder of the web site cheated twice, so going cold turkey on new apparel must be really tough. More curiously, the pioneering dieter behind the website told a NYT reporter "she had thought about ways to make money off the diet." In the end, she decided to pass on the maintenance of the web site to future apparel dieters. One might sense a glaring contradiction between the spirit of the whole initiative and the urge to make money off its success. On the other hand, if we become overly sensitive about such cultural contradictions of capitalism, the whole economy could grind to a halt - like in North Korea.

Tuesday, July 20, 2010

Back to the future

A week ago, I came back from Amsterdam. I arrived there on the eve of the Spain-Netherlands final game of the World Cup, and the whole city was overcrowded, euphoric, and draped in orange. The most striking sight, however, was the dozens of open-air portable urinals throughout the downtown area. Those were lined up not just on the edges of the Museum Square where the giant TV screens were set up, but also on other city squares, canal bridges, etc. - and were put to good use by mostly young men. I guess this is sound, pragmatic public policy, reflecting Holland’s famously utilitarian spirit – most of the male fans were carrying cases of Heineken, so they needed a convenient place to pass all that liquid if they were not to pee on trees (which some did anyway). Such open-air urinals have also been deployed in Denmark, Britain, and a few other countries. In one Chinese city, the locals shunned the unfamiliar facilities, so municipal officials were instructed to use them in order to set an enlightened example. The whole issue, however, has another curious aspect. There are all these writings coming out about the erosion of the modern state and a return – politically – to the Middle Ages. I guess the return is also cultural. Centuries ago, people freely ate with greasy hands, belched, passed gas, urinated and did other things in public or semi-public settings. Then they gradually developed more “civilized” attitudes, at the core of which was an acute sense of embarrassment. That uncomfortable feeling drove such bodily functions underground, into an intensely private cocoon. Now all sorts of previously private activities are out in the open again, without any discernible sense of embarrassment or awkwardness. Any resistance or hesitancy, like in that distant Chinese city, will be whittled away. And most would see this as progress – a laudable peeling away of silly taboos and inhibitions.

Monday, June 28, 2010

A prophet in his own land?

The newly opened Dostoyevskaya station on the Moscow metro has caused some controversy. It is decorated with large black and gray images evoking disturbing scenes of murder and suicide from Dostoyevsky's novels. Critics are worried that it could become a "suicide Mecca," but the artist who did the murals is unrepentant. In his defense, he told journalists: "What did you want? Scenes of dancing? Dostoevsky does not have them." Indeed, it's all Dostoyevsky's fault - he should have written more cheerful stuff. On the other hand, 19th-century Russia knew neither Prozac nor positive psychology. How, then, can we blame any Russian writer for being excessively gloomy?

Wednesday, June 9, 2010

The failed poets society

It is sometimes said that people who could not make decent poets become critics instead. It seems brain science offers some corroboration of this old hunch. On his “Psychology Today” blog, Norman Holland recounts an improvised “experiment” he once witnessed. Someone brought together poets and critics to read poetry, and it turned out they had very different takes on it. While the poets paid attention to the sound of words and to rhythm, “the critics concerned themselves with things like repetitions and contrasts of themes and meanings.” Now a neuroscientist has done a clever study of the “information-processing approach” of poets and critics. It turns out that exposure to poetry activates different neural networks in the brains of individuals belonging to the two groups. While poets tend to experience poetry more immediately (I would add, maybe more emotionally), critics apply a more top-down semantic-conceptual analysis. Holland makes it seem like this is mostly a matter of choosing the reading strategies appropriate for the occupations poets and critics have taken up. It’s probably more a matter of some individuals having the inclination (and underlying brain wiring) to approach the written word in one manner, and others – very differently.

Beyond stereotyping

Yesterday Madrid saw a mass demonstration by public sector workers. They were protesting government plans to cut their salaries by five per cent. Meanwhile, the Irish government has already cut the salaries of its public sector workers by 18 per cent. The measure, though unpopular, was met only with muted public grumbling. Now, someone could say: what can you expect? Spaniards, as quintessential southerners, are apparently less inclined to intellectualize away the challenges life (and class enemies) throws up at them. And Greeks seem to have an even harder time bracing for the inevitable. But, of course, we have learned to look away from such tired cultural clichés.

Tuesday, June 8, 2010

Driven to distraction

Despite the alleged erosion of our brain powers that Nicholas Carr laments, once in a while a longer, earth-shattering, potentially life-changing article does shoot to the top of the NYT’s “most emailed” list. The latest example is Matt Richtel’s “Hooked on Gadgets, and Paying a Mental Price.” It highlights the way multiple electronic devices can become addictive and undermine our ability to focus, filter out irrelevant stimuli, and stay “connected” in our first lives. It also offers a snapshot of the “work station” of a troubled (yet successful) IT entrepreneur – semi-surrounded by four computer screens, he looks like Tom Cruise in front of all those controls in “Top Gun.” As published on the web, though, the article has an ironic twist. It is interlaced with 19 blue-tinged hyperlinks enticing us to click away from it while reading. At school, this would be called “teaching by negative example.” Richtel cites a couple of neuroscientists who sound worried that this state of being incessantly hooked to IT devices must be rewiring our plastic brains in somewhat unhealthy ways. Yet, most of the concerns are still related to changes in our thinking and ability to empathize – as a result of the direct pressures and enticements we experience. It should by now be evident that any profound changes in the way we think or feel must be linked to changes in our brain wiring and activation, not only to cognitive or behavioral adaptations. But the old frame of reference lingers on.

The pursuit of scientific understanding

Lehrer’s faith that the scientific experiments he cites offer a valid explanation of the effects of electronic gadgets and web surfing on the brain calls to mind an older post by Norman Holland on his “Psychology Today” blog. So, I am quite relieved – apparently, my mind hasn’t yet gone completely the way of HAL’s. Here is what Holland says regarding the mountain of psychological experiments that have piled up over the decades: “Each experiment defines its independent and dependent variables in unique ways and adds a unique methodology. The result is a collection of discrete experimental results, each of which is thoroughly scientific, but that, as a whole, do not cumulate in the manner of a science. Rather the collection of experiments maintains a continuing conversation in the manner of the humanities.” I guess this wouldn’t resonate emotionally with someone like Lehrer, either – to say nothing of the more technically smart researchers who have not given up their research careers to write non-fiction.

Everything bad is really, truly good for you

In the NYT, Jonah Lehrer reviews (“Our Cluttered Minds”) Nicholas Carr’s book, “The Shallows: What the Internet Is Doing to Our Brains” (an elaboration of his much discussed Atlantic piece, “Is Google Making Us Stupid?”). Carr essentially argues that the powerful distractions generated by the Internet are toasting our brains, thus undermining our ability to focus and “deep-read.” Lehrer objects that “the preponderance of scientific evidence suggests that the Internet and related technologies are actually good for the mind.” For example, one study found that “[video] gaming led to significant improvements in performance on various cognitive tasks, from visual perception to sustained attention.” And another “found that performing Google searches led to increased activity in the dorsolateral prefrontal cortex” – the brain area that “underlies the precise talents, like selective attention and deliberate analysis, that Carr says have vanished in the age of the Internet.” The bottom line for Lehrer is that “Google … isn’t making us stupid – it’s exercising the very mental muscles that make us smarter.” And, clearly, “the negative side effects of the Internet” Carr obsesses about do not “outweigh its efficiencies” – as Carr argues. Why do these two intelligent authors have such a fundamental disagreement? I have a hypothesis which I wish I could test. In what Lehrer identifies as a “melodramatic flourish,” Carr begins his book with a vignette from his earlier article. He reminds us of that memorable scene from “2001: A Space Odyssey” when HAL, the spaceship “supercomputer” (having maybe less computational power than a sweatshop-made cell phone), pleads as he senses his electronic life seeping out of his silicon veins: “My mind is going. I can feel it. I can feel it.” Then Carr comments: “I can feel it too. Over the last few years, I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory.” Apparently, even a highly sophisticated and reflexive scientist-turned-science-writer like Lehrer can’t feel it. Here is the comment I posted on Lehrer’s blog under the entry pointing to his review of Carr’s book: “In ‘How We Decide’, Jonah Lehrer makes clear that emotional attunement to one's natural and social environment … is crucial to making judicious judgments and ad hoc decisions. I am just curious: are there any studies demonstrating that video games and web surfing are beneficial for this aspect of our mental lives? Or are they improving mostly the nerdish cognitive skills boasted by the neuroscientists conducting all those clever experiments?”

Saturday, June 5, 2010

Save our mental environment!

A few years ago I maintained a simple web site grandly proclaiming that the erosion of our brain powers by all kinds of audiovisual pollution will be the "new frontier of environmental awareness." Adbusters now have a new issue of their magazine out under the heading "The Whole Brain Catalog" - with a nod to the Whole Earth Catalog which catalyzed the environmental movement in the 1960s. The promo they emailed warns that "the mental environment is now the terrain where our fate as humans will be decided." The lead article is a mental manifesto by Bill McKibben titled "The Environmental Movement of the Mind," and the other pieces look promising, too. Welcome to the new wave of mental environmentalism.

Thursday, June 3, 2010

Launching your kid in life

The NYT carries an article on "Teaching Work Values to Children of Wealth." Apparently, this is a new craze among the wealthy - as the crisis bites, and seemingly well educated college students or grads have increasing difficulty "getting into a purposeful path" in life. Of course, this new business opportunity is eagerly seized by all kinds of coaches and consultants - giving a much needed boost to the GDP. One expert offers the following sound advice to the anxious well-off: "Those families that treat their kids' launch like any other endeavor are having the most success." Indeed, what could be the difference between giving your offspring a good start in life and, say, launching a new celebrity fashion line or social website? All you need is cool business sense and/or professional judgment. And isn't this what helped the rich get rich in the first place (unless they inherited it all)?

Tuesday, June 1, 2010

Ironic democracy

The municipal elections in Reykjavik were won by an upstart party humbly calling itself the Best Party. Headed by Iceland's best-known comedian (who could now become mayor), the party ran a satirical campaign making extravagant promises - if elected, they would bring a polar bear to the city zoo, hand out free towels at all swimming pools, construct a Disneyesque theme park at the airport, etc. For decades, political scientists have tried to come up with qualifiers to describe obviously imperfect, yet seemingly democratic systems: "protodemocracy," "authoritarian democracy," "neopatrimonial democracy," etc. One study found 550 such examples of "democracy with adjectives" in the scholarly literature. Maybe it's time to add a new, less ominous entry to the list - "ironic democracy." This would be a system in which no one would be expected to take elections, and the post-election gimmicks of elected "politicians," seriously. Some would say the Reykjavik vote was an aberration, reflecting the rage of the electorate at the traditional parties that steered the country into its current financial meltdown. But let's not forget the trail-blazers who set the trend years ago - gifted self-styled entertainers (some detractors would call them clowns) like Silvio Berlusconi and Boris Johnson.

Wednesday, May 26, 2010

"Trade is to culture as sex is to biology."

This sparkling wit flows from the keyboard of Matt Ridley, a libertarian zoologist who was once on the staff of the Economist. He calls himself "the rational optimist" (the title of his new book), and offers a neat summary of his evolutionary thinking in the Wall Street Journal ("Humans: Why They Triumphed"). In his view, just as sex is essential to biological evolution, trade is at the heart of cultural and technological progress. It was the invention of trade that unleashed a process of increasing specialization and exchange of ideas. The Neanderthals, on the other hand, never adopted trading practices and ended up in the dustbin of history, despite their larger brains. From the very beginning of urban-based civilization, though, the spread of trade was marked by an enormous injustice: "Tax and even slavery began to rear their ugly heads. Thus was set the pattern that would endure for the next 6,000 years — merchants make wealth; chiefs nationalize it." This should probably apply to bankers, too - as Ridley was able to observe first-hand in his capacity as "non-executive chairman" of Northern Rock. He was pocketing GBP 300,000 a year there - until he was forced to resign when the bank cried uncle in September 2007. Luckily, there is light at the end of the tunnel: "Given that progress is inexorable, cumulative and collective if human beings exchange and specialize, then globalization and the Internet are bound to ensure furious economic progress in the coming century — despite the usual setbacks from recessions, wars, spendthrift governments and natural disasters."

Sunday, May 23, 2010


I am looking at the site of a Bulgarian cram school called Magnet. One of the subjects they teach is Bulgarian language and literature. On its welcoming page, in Bulgarian, there are four brief paragraphs of text. In those four paragraphs, there are at least 10 commas that have been left out. There is only one word that can capture such a breathtaking degree of careless sloppiness, and it happens to be Russian. Or - the people who wrote and proofread Magnet's sales pitch are scrupulously meticulous, but this is the educational level they were able to reach after maybe a combined 138 years of education. Frankly, I am not sure which is worse.

The market knows best

William Keegan writing in the Guardian/Observer: "Although there is no getting away from the fact that Greece woefully mishandled its financial affairs, the financial markets seem to have put many eurozone countries in a classic Catch-22 position: they 'short' whole countries whose fiscal position is considered unsound. And then when the fiscal masochism they advocate is put into practice (or promised) they 'short' them again, because of the inevitable effect this will have on prospects for economic growth."

Saturday, May 22, 2010

Childhood dreams

Jordan Romero, a beaming American boy, holds the American flag in his outstretched hands. The picture was taken at some peak in Indonesia, and now he is climbing Mount Everest accompanied by his father and three sherpas. At 13, he hopes to become the youngest person to climb the highest peak in the world. Someone could say: what better icon for the American dream of making it - with guts and determination - to the very top? Except that now kids from around the world are rushing to tackle superhuman challenges, with solo navigation around the world being the most common choice. The young climber, of course, blogs about his exhilarating experiences. He recently wrote: "Every step I take is finally toward the biggest goal of my life, to stand on top of the world." You must admire this unchildish focus and dedication. And yet, what happens to you when you reach the highest goal of your whole life at 13? It must be all downhill from there. Peaking too early in life is not always the best strategy, but I do hope it will work for Jordan.

Friday, May 21, 2010

"Do Nice Gals Finish Last?"

This NYT blog post by economist Nancy Folbre says one of the reasons women get lower pay is that they are less pushy or Machiavellian when negotiating their salaries - and will often shrink from bringing the whole embarrassing issue up. The author asks: "Shouldn't we try to reward nice behavior?" This is the funniest question I have seen or heard in about 18 months. The first step Folbre suggests sounds a bit more realistic: "We could start by making stronger efforts to penalize bullies and cheats." If only that "we" still existed...

Simply the pursuit of wealth

Harvard economic historian David S. Landes explains vividly ("The Enterprise of Nations," Wilson Quarterly) how in the past China clung stubbornly to silly traditions and missed out on 400 years of technological and social progress. Now the Chinese have learned their lesson, unlike some even more thick-headed holdouts. In other regions, all the recent success stories are countries which, like China, have jumped bravely into the global maelstrom of ideas, money, stuff, etc. In Latin America, for example, Brazil, Chile, and - er - Colombia have "done well" (curiously, no mention of Mexico). Now some naysayers are grumbling about the loss of jobs as companies move production to more efficient locations. Take the governor of Massachusetts, who had the following comment on the decision of the new owners of Polaroid to shut down the company's main factory in his state: "I think it has been fairly sinister the company has been cut apart like a stolen car at a chop shop while the employees are left holding the spare tire." To Prof. Landes, such populist rhetoric is quite misplaced. As he patiently explains, "the process of job transfer ('outsourcing') is a central aspect of contemporary entrepreneurship and globalization." Well, entrepreneurs do "prefer profits to sentiments," but how can you expect them not to? Landes could have quoted here Milton Friedman, who once famously stated that "the social responsibility of business is to increase its profits." The Harvard professor, though, does make clear that, with the inevitable collateral damage wrought by the whirlwind of creative destruction, outsourcing is a natural next step in a natural process which goes back thousands of years. This process is "simply the pursuit of wealth" - case rested.
Of course, we don't even need to be reminded that any attempt to somehow constrain this process (maybe using as an excuse the excesses of unregulated entrepreneurship which allegedly created the current economic mess) leads inexorably down that slippery slope, onto the road to serfdom, directly into the gulag. As it happened in that former quasi-satellite of the Soviet Union, Finland.

Wednesday, May 19, 2010

"Exploring the Complexities of Nerdiness, for Laughs"

This is a NYT review of the sitcom The Big Bang Theory, which features, unbelievably, a couple of young physicists. The article offers the following quote by one of the actors: "A lot of people thought it would be a show that poked fun at smart people, but it has become a show that defends smart people much more often than that. These guys, as socially inept as they might be, are the type of people that are molding our future as a society." I knew there was something wrong with this sitcom. Indeed, the producers have picked the wrong genre. They should have concocted a dystopian miniseries about the coming merger of humans with machines. This hot topic was already discussed at the Global Catastrophic Risk Conference held at Oxford two years ago. It's now time for the science popularizers to step in and prepare us for the coming "singularity."

Saturday, May 15, 2010

Why worry now?

This is the headline of a post on one of the NYT's multiple blogs: "Ash Falls on Reykjavik, and Icelanders Shrug." Apparently, they did the same when their banks and a few buccaneers embarked on that financial binge which sank Iceland's economy. But maybe this is mere coincidence.

Death to Facebook?

"A few months back, four geeky college students, living on pizza in a computer lab downtown on Mercer Street, decided to build a social network that wouldn’t force people to surrender their privacy to a big business." As the NYT article makes clear, they have already raised on the internet more than twice the money they would need to design the open-source software intended to decentralize social networking. In their nerdy naïveté, they wanted to do this without promising big returns to venture capitalists. They should learn from the people who launched that web site last November allowing kids to make anonymous disparaging comments and ask embarrassing questions of each other. They knew their ECO 101 and had amassed 2.5 million dollars in venture capital before starting work on their similarly creative project.

Friday, May 14, 2010

Babies do need to grow up

Dr. Paul Bloom writes in the New York Times about his work aimed at pinning down "The Moral Life of Babies." It turns out, they are pretty uncooperative subjects - "because, even compared to rats or birds, they are behaviorally limited: they can’t run mazes or peck at levers." But he and his colleagues at Yale's Infant Cognition Center used some very clever experiments to elicit reliable data on the babies' ethical preferences. They found something quite disturbing - babies are, to put it mildly, intolerant. Gordon Brown might have even called them "bigoted." In other words, from a very early age, they exhibit clear-cut in-group preferences. How could I have been so naive? All that scholarly literature on nationalism I have devoured maintains that it is quite natural for millions of people of various colors, ethnicities, religions, etc. to live in perfect harmony in complex multicultural societies; and the bigotry of Serb nationalists or Hutu thugs is a cultural aberration which can be explained only by sophisticated conspiracy theories. It turns out this was all wrong: "in fact, our initial moral sense appears to be biased toward our own kind." Happily, this primitive morality can be transcended in modern societies with market economies, where rational individuals are able to negotiate the impartial morality needed to oil the wheels of commerce. According to Dr. Bloom, our morality must have such impartiality at its core in order to be truly mature. This reminds me of another NYT article, "When the Ties That Bind Unravel." It describes an increasingly common phenomenon in American society - "parents who have become estranged from their own children." At some point a son or a daughter decides that maintaining a strained relationship with his or her parents is too burdensome, and cuts them off. What could be more impartial than that, really - treating one's parents as chance acquaintances whom you can easily leave behind as you move on with your life?

Thursday, May 13, 2010

The joy of motherhood

A 62-year-old Bulgarian woman gave birth to twins. This, in itself, would not be news - geriatric parenthood has long become a perfectly acceptable stab at happiness. The fact that she appeared before cameras made up as a clown cannot be held against her either - who can say what forms of self-expression are spooky these days? The twin girls were born from donated eggs and are severely underweight (one is a kilo, or a little over two pounds, the other half that weight), but this also turns out to be very much the norm in such cases. In fact, what seems most unusual about the elderly mother is her occupation. She is a certified psychiatrist. Now, this is precisely the kind of medical expert who can get you out of the doldrums if you have serious mental health issues.

Wednesday, May 12, 2010

The algorithm of creativity

This is from a NYT piece on the booming scientific study of creativity: "'Creativity is a complex concept; it’s not a single thing,' [Dr. Kounios] said, adding that brain researchers needed to break it down into its component parts." He studies the neural basis of creativity, and defines it as "the ability to restructure one’s understanding of a situation in a nonobvious way." I wish the scientists doing such research could show some creativity in their own work - in addition to their natural impulse to take things apart to see how they work.

Brave new Britain

Here is an excerpt from the Wikipedia entry on David Cameron: "Ex-Conservative MP Quentin Davies, who defected to Labour on 26 June 2007, branded him 'superficial, unreliable and [with] an apparent lack of any clear convictions' and stated that David Cameron had turned the Conservative Party's mission into a 'PR agenda'." These, of course, are gripes coming from an avowed personal enemy. And PR strategizing has already become a job requirement for political leaders anyway. Just think of the powerful Obama brand. One comment about Cameron's performance in the third prime-ministerial debate said that, as the BBC-hosted show progressed, "he began to look increasingly like a made-up mannequin," but we shouldn't be so judgmental in our tastes. Another comment suggested that Cameron's thick hair in fact saved him from looking like a plastic mannequin.

Monday, May 3, 2010

Rule of law

There are reports that the parking lot adjacent to the Bulgarian parliament, used for the cars servicing the MPs, was set up illegally. It turns out the Sofia municipality never issued a permit for part of the public square to be fenced off.
Two boys, 13 and 14 years old, sneaked out of their homes to take a midnight walk in Sofia. Chased by a pack of stray dogs, they climbed on the side of an overpass. The younger boy slipped and fell on the boulevard below. A van immediately hit and killed him, and the driver sped away. This avoidable tragedy seems to capture in a teardrop the unrelenting unraveling of Bulgarian society.

Sunday, May 2, 2010

An examined life

In the NYT, Gary Wolf describes a new trend among computer geeks (“The Data-Driven Life”). Using various sensors and gadgets, they are now trying to collect as much data about their personal lives as possible. One has even compiled a searchable database of all ideas he has discussed with others or considered himself since the early 1980s. The idea behind this “self-tracking” is apparently not only to achieve greater efficiency in specific areas, but also to make sense of their lives – in the only way that can make sense to a geek. To some, it has obviously become a compulsion they see no reason to resist. In his sympathetic account, Wolf asks: “We use numbers when we want to tune up a car, analyze a chemical reaction, predict the outcome of an election. We use numbers to optimize an assembly line. Why not use numbers on ourselves?” Why not, really – it does sound quite logical. The early adopters are keenly aware that they are abnormal geeks, but are confident that what seems socially awkward now will soon be the new normal. That, I guess, is logical, too.

Thursday, April 29, 2010

The mystery of the universe

Stephen Wolfram argues on TED that "our universe is the product of a simple computational rule." But, of course - what could be more logical than this?

Wednesday, April 28, 2010

Who said political scientists can't be hip?

Following my previous comment, I took a solemn vow of blogosilence. I decided I would not write ANYTHING for at least two hours, and use that precious time to reevaluate my whole take on blogging. If my scattered thoughts appeal to the wrong target audience, would any neurotypicals take interest in them? This required some careful analysis. But then I got in my inbox a piece of news so exciting I couldn’t resist the itch. Can you believe it – the European Consortium for Political Research is now on Facebook and Twitter! Finally! What took them so long?

I, Robot

Last week, during pre-registration, a student came by to check my body temperature. I didn't turn him back since I am always thrilled to see students who are genuinely curious. His curiosity stemmed from the following comment he had read about me: “He is a robot! Very knowledgeable, indeed, but also routine-driven to the extreme... at the same time, he maintains a pretty interesting blog.” I am still wondering if I should feel hurt. Like his pals, the author is an übergamer who humbly styles himself as “Gatekeeper” (for the uninitiated, this would make him a Doom junkie). He apparently wants to keep the civic spirit AUBG so cherishes alive by dispensing empowering knowledge to the downtrodden and inexperienced (he has posted something like 8,000 comments already). Judging by his writing style, I probably gave him a C-, if I was in a good mood. I should have known that no good deed goes unpunished. Anyway, it seems the opening qualification shouldn’t bother me much. Really, how could I hope to come alive for someone used to the electrifying excitement of immersive gaming? I would need to be a creature out of Avatar to achieve this. Plus, it’s quite obvious that Spaceship Earth will soon be taken over by cyborgs who hide their empty eyes behind dark glasses. If this is the case, becoming an early adopter of Al Gore’s* presentation style could give me the sharpest competitive edge. Ergo, I should be quite OK with "Gatekeeper" calling me a “robot.” His praise for my blog, though, nags me big time. His brain has surely been toasted from those 10,000 hours** of gaming he must have clocked in his second life (that order may have been reversed if he is still stuck in some dreary Bulgarian city, chained to a job incommensurate with his bloated self-importance). Hm, if such a person can still find what I write “interesting,” there is perhaps something I am not doing right...

* Al Gore is a former US vice president and current environmental campaigner. He was once rumored to have a winding key sticking between his shoulder blades. That bulge is no longer visible under his suit, so “he” must have upgraded to a more powerful source of energy.

** How do I know it’s 10,000? Malcolm Gladwell (Outliers) has calculated that this is roughly the amount of practice it takes to achieve superb excellence in almost any area – even the silliest.

Sunday, April 25, 2010

Saved by the geeks

On FORA, the creator of "Heroes," a cross-platform franchise which in the past would have been a mere TV series, explains the beauty of "audience sourcing." You see, there are all those millions of kids (some over 20, even 30) who, as a matter of principle, would never watch TV but would rather download - hm, illegally - their favorite shows. This is a fact of life, so this fickle audience must somehow be tapped. How do you do it most efficiently? Very simple - by creating a string of clever web sites through which the digital fugitives will click compulsively (after they see, for example, the address for one of them flashed on the screen - all ingeniously embedded in the plot line). And bingo - you have delivered millions of surplus eyeballs to the advertisers chasing after this elusive demographic. You must really admire such inventive genius. At one point the creator makes a reference to a new survey which says US kids and teens now spend on average 7.5 hours (11.5 if you factor in multi-tasking) glued to screens of different sizes. What is his first thought? Wow, what an opportunity to reach into the brains of all those young addicts! And you can do it incessantly, practically all the time while they are awake - on behalf of advertisers and anyone willing to invest in that all-out effort. You must admire such clear focus and determination, too. But if you think the guy is merely bent on making a quick buck from his creative brilliance, you will be completely wrong. His "heroes" are all involved in frantic efforts to save the world - and his stated goal is to inspire all young minds watching them to follow in their footsteps. This, apparently, is the only feasible way to avoid all those environmental and other disasters which would otherwise ruin human civilization in the future. This reminds me of another talk on FORA - by a young neuroscientist who lists "six easy steps to avert the collapse of civilization."
One of those steps is linked to the way the internet opens the gates of education. A "motivated teen" anywhere in the world, he says, can have access to the totality of human knowledge collected since the invention of writing. There is a slight problem here - all that compulsive clicking may not exactly contribute to the development of the focus and motivation essential to learning. But, of course, we all know that the Luddites' resistance to the onward march of progress was silly and futile, and we don't want to repeat their mistake all over again.

Your monetized second life

On FORA there is a round table on some exciting efforts to bring "smarter TV" to the masses. One of the participants works for a company which offers its customers the opportunity to point a remote at, say, the watch worn by a celebrity actor and buy it on the spur of the moment. He explains that this is part of an overall project aimed at monetizing all sorts of emotional communities of people - fans of this and that. It's curious how many companies build their business plans on inducing customers to buy impulsively and incessantly, for themselves or nagged by their kids. Some sullen conservatives still think, though, that decorum and self-restraint in modern society are undermined primarily by a few feminists and other liberal intellectuals. Of course, they reject all conspiracy theories as hopelessly naïve.

We've come a long way, baby

"At 40, Earth Day Is Now Big Business" (NYT):

So strong was the antibusiness sentiment for the first Earth Day in 1970 that organizers took no money from corporations and held teach-ins “to challenge corporate and government leaders.”

Forty years later, the day has turned into a premier marketing platform for selling a variety of goods and services, like office products, Greek yogurt and eco-dentistry.

For this year’s celebration, Bahama Umbrella is advertising a specially designed umbrella, with a drain so that water “can be stored, reused and recycled.”

In part, said Robert Stone, an independent documentary filmmaker whose history of the American environmental movement is being broadcast on public television this week, the movement has been a victim of its own success in clearing up tangible problems with air and water. But that is just part of the problem, he noted.

“Every Earth Day is a reflection of where we are as a culture,” he said. “If it has become commoditized, about green consumerism instead of systemic change, then it is a reflection of our society.”

Taking the high road

Mary Billard, "A Yoga Manifesto" (NYT)

Yoga is definitely big business these days. A 2008 poll, commissioned by Yoga Journal, concluded that the number of people doing yoga had declined from 16.5 million in 2004 to 15.8 million almost four years later. But the poll also estimated that the actual spending on yoga classes and products had almost doubled in that same period, from $2.95 billion to $5.7 billion.

“The irony is that yoga, and spiritual ideals for which it stands, have become the ultimate commodity,” Mark Singleton, the author of “Yoga Body: The Origins of Modern Posture Practice,” wrote in an e-mail message this week. “Spirituality is a style, and the ‘rock star’ yoga teachers are the style gurus.”

Well, maybe it is the recession, but some yogis are now saying “Peace out” to all that.

Sunday, April 18, 2010

The freedom to smoke

The Bulgarian parliament has finally found an issue many MPs feel passionate about - the proposed lifting of the ban on smoking in public places before the latter even went into effect. The battle lines were drawn quite clearly between smokers and non-smokers - which must have been a coincidence since we know that MPs are solemnly committed to putting broader societal interests before their own narrowly conceived lifestyle preferences. One heavy smoker took a really principled stance: "As a free-thinking person I don't want to be told by law what to do and what not to do." If there is one sentence which brilliantly distills the whole dazzling complexity of modern Bulgarian political culture, this is it.

Friday, April 16, 2010


"Reaping Profit from Poland's Tragedy" (NYT):

"By 11:54 a.m. Saturday, less than three hours after President Lech Kaczynski’s plane had crashed in Russia, killing all 96 people on board, one opportunistic Pole had already manufactured 50 T-shirts emblazoned with the Polish flag and “RIP” and was peddling them on the Internet for $8.50 each, tax and delivery fees not included.

"Within hours of the crash, which also killed the president’s wife, Maria, the governor of Poland’s Central Bank and dozens of political and military leaders, sellers were hawking everything from commemorative plastic clocks adorned with images of the first couple smiling in front of a map of Poland to Internet domain names containing the late president’s name."

Here is the funnier part:

"Yet some Poles said the crass commercialism that also greeted the tragedy showed the extent to which Poland, 20 years after the revolution that overthrew Communism, had become a healthy capitalist economy, even as the free market was challenging the Roman Catholic Church as the new religion." No, I am not making this up. Maybe this kind of ruthlessly creative entrepreneurship can explain why Poland was the only EU economy which did not dip into recession last year.

Economists tried to predict the economic effect of the "commercial response to the crisis," but concluded it would be insignificant. But, who knows, a multitude of such entrepreneurial ripples can perhaps merge into a tidal wave powerful enough to lift Poland's economy even further.

On the day of the accident, a woman selling Polish flags on the street in Warsaw said she had made a cool $700 in profit. The flags were $6.80 apiece, made in - where else? - China. So, a much needed boost to globalization, too. Every cloud must have a silver lining, indeed.

Wednesday, April 14, 2010

No regrets

A New York Times article (“Facing a Financial Pinch, and Moving In With Mom and Dad”) says last year in the US “37 percent of 18-to-29-year-olds were unemployed or no longer looking for work.” As a result, 10 per cent of those aged 18 to 34 moved back with their parents. A young scuba instructor interviewed for the article now lives in a rent-subsidized apartment with his 90-year-old grandmother. He says: “Part of me stays because of the financial benefits — I could never find an apartment like this one for even double the current rent — and part of it is that, while this might sound pessimistic, the truth is my grandmother is not going to live forever so I want to spend as much time with her as possible so no regrets later on.” It’s really neat to have such a clear focus on your own emotional needs without being much distracted by those of others – particularly if they will soon be dead anyway. As Winifred Gallagher argues in Rapt, the skillful management of attention and the capacity to maintain unrelenting focus comprise the protocol for the good life which philosophers have pursued for at least 26 centuries.

Saturday, April 10, 2010

Why hold anything back?

There is a piece in the New York Times about people who take pictures of everything – or almost everything they eat and then post the photos on the web (“First Camera, Then Fork”). For many, it has become an obsession/compulsion. And it’s now such a viral fad that N***n, O*****s, S**y, and F**i have all recently released camera models with designated “food” or “cuisine” modes (from now on, I have decided not to spell out any brand name for free). As I read this, I had a Eureka moment: if so many people are taking pictures of what comes in, why not photograph what comes out, too? This would be truly provocative – yet a logical next step toward the full digital self-baring so many seem to pursue. Then my enthusiasm cooled off as I thought some conceptual artist must already be doing this as part of some “project” with a cleverly mystifying name.

The true worth of things

Paul Krugman has a longer piece in the New York Times magazine on the challenges of “Building a Green Economy.” As a warm-up, he dispenses some Econ 101 axioms, like this one: “If there’s a single central insight in economics, it’s this: There are mutual gains from transactions between consenting adults. If the going price of widgets is $10 and I buy a widget, it must be because that widget is worth more than $10 to me. If you sell a widget at that price, it must be because it costs you less than $10 to make it. So buying and selling in the widget market works to the benefit of both buyers and sellers.” A few hours after I read this I saw a leatheresque jacket in a shop window worth around $700. Applying Krugman’s analysis, it’s easy to see why someone would sell the item at this price: it probably cost them $30 to – I was going to say “produce,” but we all know clothing companies don’t produce anything these days. The other half of Krugman’s observation made me scratch my head as I tried to imagine the kind of person to whom such garb would be “worth” the GDP per capita of Niger. My next thought was about a dose of cocaine – but let’s focus on the legal economy. I have only one explanation as to why Krugman can still use the term “efficiency” with – I assume – a straight face: he hasn’t watched Food, Inc. On second thought, putting hefty price tags on a few flashy items can probably help address one of the chief complaints John the Savage once made about “civilization”: “Nothing costs enough here.”

What is culture, really?

Four neuroanthropologists quote the following neat definition of culture: “Culture can be broadly defined as the repertoire of socially generated behaviors typical within a group of interrelated individuals.” Therefore, “behavior patterns exhibited by a range of social species are cultural – including location-behavior associations in fish, dialect variants of bird songs, seed feeding and gathering procedures in rats,” etc. True, “culture is almost always exclusively discussed with reference to the one species that has made a virtue of cultural production, reproduction, and transformation, that is, the human species” – but the difference between the behavioural patterns of guppy fish and human beings appears to be one of degree. In my naïveté I thought “culture” was about weaving those webs of meaning older anthropologists claimed to study. But maybe “behaviour patterns” is all we are left with after all those centuries of disenchantment, desacralization, profanization, immanentization of existence – you name it. Or – this is all such contemporary “engineers of the human soul” can observe and study experimentally?

Monetizing the power of “people brands”

An upstart called OpenSky helps people who have built a loyal following in the blogosphere or on social web sites to recommend online purchases to their friends and fans – for a modest commission. I am looking forward to the day when it will be impossible to open one’s eyes without having some commercial message within one’s field of vision. Life will then become one blurred rolling temptation which will require an even more inhuman willpower to resist. That willpower, of course, will meanwhile be sapped as distractions become the water we – and particularly our kids – swim in. And then the only way to get rid of a temptation is to follow the Oscar Wilde Rule – just yield to it.

The elegant beauty of cost-benefit analysis

Writing on the Foreign Affairs web site (“Hardly Existential: Thinking Rationally about Terrorism”), an American political scientist and an Australian civil engineer offer the following “elemental observation”: “As a hazard to human life in the United States, or in virtually any country outside of a war zone, terrorism under present conditions presents a threat that is hardly existential. Applying widely accepted criteria established after much research by regulators and decision-makers, the risks from terrorism are low enough to be deemed acceptable. Overall, vastly more lives could have been saved if counterterrorism funds had instead been spent on combating hazards that present unacceptable risks.” I guess the geek squad working for Robert McNamara 53 years ago would have cheered such a clear-headed approach to existential threats. To be fair to the two respected scholars/experts, they do recognize that the psychological effect of terrorist attacks is slightly different from the anguish caused by incidents like people drowning in their own bathtubs. Luckily, government experts in charge of putting some neat numbers on hazardous risks have already made some adjustments to reflect this awareness: “In order to deal with the emotional and political aspects of terrorism, a study recently conducted for the U.S. Department of Homeland Security suggested that lives lost to terrorism should be considered twice as valued as those lost to other hazards.
That is, $1 billion spent on saving one hundred deaths from terrorism might be considered equivalent to $1 billion spent on saving two hundred deaths from other dangers.” To their credit, the authors of the Foreign Affairs piece do recognize that their “rational analysis” is unlikely to enlighten high-level deciders as Washington remains ridden with bureaucratic inertia and psychological rigidity: “The cumulative increased cost of counterterrorism for the United States alone since 9/11 -- the federal, state, local, and private expenditures as well as the opportunity costs (but not the expenditures on the wars in Iraq or Afghanistan) -- is approaching $1 trillion. However dubious and wasteful, this enterprise has been internalized, becoming, in Washington parlance, a ‘self-licking ice cream cone,’ and it will likely last as long as terrorism does.” This probably points to the somewhat limited utility of number crunching as a policy crutch on issues which even number crunchers can recognize as existential – at least potentially so.

Monday, April 5, 2010

The bearable lightness of living on impulse

Between 2001 and 2006, Tom Bissell wrote several books and over 50 magazine articles. Then he discovered the Grand Theft Auto series and cocaine - a perfect match. He spent the next four years playing, occasionally squeezing in some sleep between marathon gaming sessions. He seems to have few regrets because video games apparently enriched his life, giving him the most intense experiences (though he does acknowledge some collateral damage). I do hope this spin is part of the marketing strategy for Bissell's new book. The lead character in GTA IV is Niko Bellic, seemingly a former Serb paramilitary with some blood on his hands. But, hey, he has a friend called Hassan - which is a nice way for the makers of that allegedly hyperviolent video game to promote the cause of diversity and multiculturalism.

Thursday, April 1, 2010

Ironic faith

This comes from Riazat Butt's blog on the Guardian web site:
St Matthew's in Auckland describes itself as a "progressive Anglican church with a heart for the city and an eye to the world". That's an understatement.
Last Christmas it offended and intrigued in equal parts with a Saatchi-designed billboard that depicted a deflated Joseph in bed with a disappointed Mary and the caption "Poor Joseph. God was a hard act to follow". Its attempt to provoke was more successful than expected and the poster was promptly attacked with a knife.
For Easter, the most important festival in the Christian calendar, the people at St Matthew's have come up with another ruse to get people engaging with their faith. This billboard shows Jesus nailed to a crucifix, thinking to himself: "Well this sucks. I wonder if they'll remember anything I said". The vicar at St Matthew's, Glynn Cardy, says the poster is a reminder that "Easter is about more than a rugged cross, a supernatural miracle, or a chocolate bunny".

I guess this is a response to Christians who take Holy Scripture literally - to the point of trying to calculate the volume of the blood that will be spilled when X billion individuals are slaughtered at Armageddon. The ad agency, which designed at least one of the billboards pro bono, apparently thought it was performing a public service – or at least pretended to.

Wednesday, March 31, 2010

The end of decorum (among other things)

As reported in Der Spiegel, two German entrepreneurs recently received permission from the EU trademark authority to register the name "F***ing Hell" (spelled out in full) to be used for the marketing of a new brand of beer. They explained in their application that in parts of Germany and Austria "hell" refers to a variety of light ale; and the word that goes with it is, in fact, the name of a small town in Austria. In the informed opinion of the EU trademark authority, the phrase was "an interjection used to express a deprecation, but it does not indicate against whom the deprecation is directed. Nor can it be considered as reprehensible to use existing place names in a targeted manner (as a reference to the place), merely because this may have an ambiguous meaning in other languages." In reality, the meaning of the name of that Austrian town is not overly ambiguous in English; and the two German entrepreneurs plan to use the innovative brand name to market clothing and many other items; and the Austrian town doesn't have a brewery; and its uptight citizens and mayor are not too excited about the attention their native town is attracting; and... But why should some petty objections be allowed to stand in the way of such creative, cheerfully subversive entrepreneurship? I can't wait to see the billboards. Oh, and Germany has a couple of other towns whose names have richly evocative meanings, like Kissing, Petting, and Pissing. There must be some products or services out there that can be joyfully branded with them, too – to help foster the self-expression values celebrated by successive waves of the World Values Survey.

Saturday, March 27, 2010

Epic boast

Jane McGonigal, a game designer, argues on TED that only a dramatic increase in the total hours of global video gaming (to some mind-boggling yet oddly precise number) can save the world. She flashes a picture of a freakishly exhilarated teenager which, she claims, rather than capturing a gaming high or even climax, shows some optimistic wrinkles on his face. Those come from his sensation that he is on the verge of what is known in the industry as an Epic Win. The picture ostensibly demonstrates that gamers, as “super-empowered hopeful individuals,” are in search of “epic meaning” as they exit into parallel online worlds. All that is needed is to harness that ocean of energy for the solution of pressing global problems, like the coming shortage of oil. McGonigal claims her Institute for the Future is doing just that – by developing a few video games immersing players in, say, a future of gasoline scarcity. She and her fellow game designers thus want not just to imagine but to make the future. I thought self-serving delusion had reached its peak somewhere between 1933 and 1953 (later in China), but the flight of geek imagination appears to have raised the bar in this area – as in many others.


The Bulgarian franchisee of Big Brother has blanketed Sofia with billboards enticing viewers to check out the new Family Big Brother they are putting out. The ads refer to one of the participants as a caring wife and mother of two children – and lover to three neighbors. I dearly wish I could time-manage and multitask as efficiently.

Status anxiety squared

In his new book, “The Genius in All of Us,” David Shenk argues that talent has been overrated. It’s self-discipline and motivation that count, and even those can be cultivated – so every child is a potential genius capable of breathtaking flights of creativity and imagination. He draws on recent findings in neurobiology to send a message similar to the point made by Malcolm Gladwell, who used famous “outliers” as examples to demonstrate that anyone who clocks 10,000 hours of practice can achieve supreme excellence in almost any area. The egalitarian spirit infusing such an upbeat assessment of the potential for creative genius in “all of us” is something to behold and admire. The implications of embracing this new outlook, though, may be ironic. In his book and documentary, “Status Anxiety,” Alain de Botton argues that a belief in social equality makes those who have failed to achieve the “American dream” miserable and resentful, as they cannot blame their failure on anyone else or on larger social forces. If this is taken seriously, a belief in neural equality could take the rat race to a whole new level. Wouldn’t it be a nicer and kinder intellectual gesture to allow the majority of people to lead a dignified life devoid of much creative flair? What is the point of dangling before everyone the promise of universal “outlier” achievements? By implication, those who have not become creative celebrities (like Gladwell) will then be branded as failures because they have betrayed their ostensibly limitless potential. Come to think of it, Shenk’s and Gladwell’s invitation to everyone to follow in their footsteps strikes me as a bit smug – to say nothing of socially irresponsible.