Sunday 31 January 2016

Written your child’s personal statement yet? Get a move on...

Since it is now so difficult to isolate a child’s contribution, why not go straight to the source – and start testing the parents?


 
Co-authorship seems to be the order of the day. Photograph: PhotoAlto/Alamy


Catherine Bennett in The Guardian

For as long as I can remember, I have been fascinated by the importance attached to personal statements, written to a formula, and not by the candidates alone, as part of applications to British universities. The longer it persists, the more farcical, unfair – and excruciating for students – this requirement becomes.

Every year, experts on the process refine their advice and share disdainful lists of cliches that should never be used to start a personal statement, such as: “For as long as I can remember I have been fascinated” (used 196 times in 2013), or: “Nursing is a profession I have always looked upon with” (178 times), thereby adding to the self-consciousness of students required to appeal, in around 700 fresh and original – yet thoughtful and relevant – words, to admissions tutors, some of whom admit they never glance at these exercises in supplication.

And why, anyway, the horror of cliches? They are not, most of these 600,000 or so young applicants, applying for BAs in Being Martin Amis, in a world where originality of expression is the key to worldly success. You need only read one of George Osborne’s op-ed contributions, or cast your mind back to the Labour leadership hustings, to appreciate that 17-year-olds are being held to far higher standards than middle-aged politicians.

If Osborne can begin a piece: “Britain is firmly on the road to recovery, thanks to the hard graft of working people”, and Cameron blither: “I’m so passionate”, and Corbyn believe that “Jeff wants to know” constitutes a compelling barb, why shouldn’t an aspiring nurse begin a personal statement with: “Nursing is a profession I have always looked upon with...”? It can only mean, in the nurse’s case, that in the event of a tiebreak between two equally qualified candidates, a more dashingly phrased as opposed to unvarnished expression of intent would be taken to indicate superior potential. How much British politics has to learn.

Now, to add to the pressure on our future accountants and chemists to demonstrate tenacity without tedium, vivacity without froth, research suggests that much of the advice given by schools to young statement writers is wrong.

Arriving just after the final deadline for this year’s Ucas submissions, a report for the Sutton Trust, written by Dr Steven Jones of Manchester University, may well cause consternation among candidates who have just put, for example, a declaration of commitment, carefully stripped of cliches, before proof of intellectual curiosity.

It was not unexpected, perhaps, that support with personal statements, routine in independent schools, should help students from comprehensives make more successful applications to highly selective universities. But Jones also discovered that some existing guidance may be counterproductive. “Admissions tutors,” he writes, “tend to value focused and sustained analysis of a specific topic of interest or case study rather than broad statements about a subject or attempts to make the statement more ‘personal’.”

He shows how one candidate’s expression of enthusiasm (“I am particularly interested in Victorian literature”), and explanation – “the social constraints and etiquette of the time are vividly portrayed, yet the novels of this period remain timeless” – was considered “bland” by an admissions tutor. “There is a much stronger way to put this tension.”

No doubt. Then again, the hapless candidate, who has, we note, commendably avoided the now deprecated word “passion”, is only aiming to please. It could be that the author is capable of stronger phraseology, on the febrile, emotionally explosive content of Victorian literature, but reluctant to go over the top, since she is applying for five different courses, in places offering conflicting advice on personal statements. Perhaps this application’s authors have been studying the university website, which holds up as exemplary the hardly less tepid: “Computing is a thought-provoking subject, covering a range of disciplines and has permeated every aspect of modern life.”

The writers might be focusing, instead, on transferable and relevant skills: “Your child can also mention how their current qualifications have broadened their knowledge,” the University of Bath tells parents. Except that qualifications speak for themselves. So don’t bother. Don’t mention sport. Try to be interesting, instead. OK, do mention sport. It’s character-building. Ditto the Duke of Edinburgh award, which has no conceivable connection with your passion for, sorry, interest in, medieval history. “This is your opportunity to sell yourself,” is Ucas’s charming summary of this exercise, one which appears to have few, if any, parallels outside the UK. So grab attention. But don’t be glib. Or a prat. “We want you to be different, but not TOO different,” one tutor told Which?, a whimsical preoccupation, perhaps, when English teenagers are ranked the world’s worst for literacy and not much better at maths.

But what is uniform, in all this advice, is the assumption that the statement is co-authored. The mothers who ask you: “Have you done yours yet?”, who urge early submission, to beat the crowds, and who go on websites to share about “personal statement hell”, are doing no more than comply with Ucas expectations, as they follow through on years of help with homework, securing work experience, renting violins. “Do ask people that you trust, like your teacher/adviser or parent/carer to read through what you have written and give you feedback,” Ucas says, advice that is unlikely to acquaint tutors with the true state of applicant literacy.

Having never, mercifully, been invited to collaborate on a personal statement, I can’t be sure how tempting it is to tinker, redraft, maybe reword the thing, in line with Ucas hints and its stern warnings on plagiarism. Very, probably, when your child is effectively competing with adults, including dedicated coaches in independent schools, brilliant stay-at-home mothers and others willing to pay £150 for a shop-bought item, from, say, Oxbridge Personal Statements. Liberal instructions to parents – “Your child’s personal statement needs to create a strong impression” – confirm that its editing is, indeed, your duty.

While coursework is being phased out in GCSEs and A-levels, for reasons having to do with the lamentable shortage, in many homes, of parents qualified to write 3,000 words on Macbeth, personal statements continue to make parental competence, finances and cultural capital a factor in applications to universities whose graduates will dominate the professions. So long as inequality of access remains a theme in higher education, the survival of the personal statement, in its cheat-friendly, whole-family format, must vitiate all corrective projects.

Whatever happens to the Sutton Trust’s recommendations, that personal statements be demystified, perhaps reformed, its findings on admissions tutor preferences guarantee one thing: formal analytical passages, modelled on Dr Jones’s examples, will be all the rage in next year’s statements.

In fact, since adults are likely to supervise this development, it would save precious student time, time perhaps better devoted to numeracy and literacy, if the universities invited parents to submit the resulting creations to Ucas under their own names, with the child merely confirming it is the adult’s own, unaided work.

Wednesday 27 January 2016

China accuses George Soros of 'declaring war' on yuan

Billionaire investor ‘trying to create panic for profit’, says scathing editorial, after he predicted the Chinese economy is headed for a hard landing


 
George Soros said China had left it too late to move from an export-led to a consumer-led economy. Photograph: Pascal Lauener/Reuters


Agence France-Presse in Beijing

Chinese state media has stepped up a salvo of biting commentaries against George Soros and other currency traders as the yuan comes under pressure, with the billionaire investor accused of “declaring war” on the unit.

At the annual World Economic Forum in Davos last week, Soros told Bloomberg TV that the world’s second-largest economy – where growth has already slowed to a 25-year low according to official figures – was heading for more troubles.

“A hard landing is practically unavoidable,” he said.



Soros – whose enormous trades are still blamed in some countries for contributing to the Asian financial crisis of 1997 – pointed to deflation and excessive debt as reasons for China’s slowdown.

The normally stable yuan, whose value is closely controlled by Beijing, has come under pressure in recent weeks and months in overseas markets and from capital outflows. Authorities have spent hundreds of billions of dollars to defend it.

China’s official Xinhua news agency on Wednesday said that Soros had predicted economic troubles for China “several times in the past”.

“Either the short-sellers haven’t done their homework or … they are intentionally trying to create panic to snap profits,” it said.

An English-language op-ed in the nationalistic Global Times newspaper blamed “westerners” for not “accepting responsibility for the mess” in the world economy.

The comments came after the overseas edition of the People’s Daily, the official mouthpiece of the Communist party, published a front-page article Tuesday titled “Declaring war on China’s currency? Ha ha” that was widely shared on Chinese social media.

Soros “publicly ‘declared war’ on China”, the paper said, citing the 85-year-old as saying that he had taken positions against Asian currencies.

But some readers questioned whether the official rhetoric could fuel Chinese investors’ fears.

“They say a lot of loud slogans, but do official media even know that Chinese investors are in hell?” said one poster on social media network Weibo.

“I’m afraid that Chinese investors will die in a stampede before Soros even shows his hand.”

In the 1990s Soros led speculators in bets against the Bank of England, which unsuccessfully sought to defend the pound’s exchange rate peg.

“The Chinese left it too long” to change their growth model from dependence on exports to a consumer-led one, Soros said, even though Beijing had “greater latitude” than others to manage such a transition because of its currency reserves, which stand at over US$3tn.

Monday 25 January 2016

Back from the enemy country

Pervez Hoodbhoy in The Dawn

RARELY are Pakistanis allowed to cross their eastern border. We are told that’s so because on the other side is the enemy. Visa restrictions ensure that only the slightest trickle of people flows in either direction. Hence ordinary academics like me rarely get to interact with their Indian counterparts. But an invitation to speak at the Hyderabad Literary Festival, and the fortuitous grant of a four-city non-police reporting visa, led to my 11-day 12-lecture marathon at Indian universities, colleges, and various public places. This unusual situation leads me here to share sundry observations.

At first blush, it seemed I hadn’t travelled far at all. My first public colloquium was delivered in Urdu at the Maulana Azad National Urdu University (MANUU) in Hyderabad. With most females in burqa, and most young men bearing beards, MANUU is more conservative in appearance than any Urdu university (there are several) on the Pakistani side.

Established in 1998, it seeks to “promote and develop the Urdu language and to impart education and training in vocational and technical subjects”. Relative to its Pakistani counterparts, it is better endowed in terms of land, infrastructure and resources.

But there’s a still bigger difference: this university’s students are largely graduates of Indian madressahs while almost all university students in Pakistan come from secular schools. Thus, MANUU’s development of video “bridge courses” in Urdu must be considered a significant effort to teach English and certain marketable skills to those with only religious training. I am not aware of any comparable programme in Pakistan. Shouldn’t we over here be asking how the surging output of Pakistani madressahs is to be handled? Why have we abandoned efforts to help those for whom secular schooling was never a choice?

To my embarrassment, I was unable to fulfil my host’s request to recommend good introductory textbooks in Urdu from Pakistan. But how could I? Such books don’t exist and probably never will. Although I give science lectures as often in Urdu as English, the books I use are only in English. Somehow Pakistan never summoned the necessary vigour for transplanting modern ideas into Urdu. The impetus for this has been lost forever. Urdu, as the language of Islam in undivided India, once had enormous political significance. Education in Urdu was demanded by the Muslim League as a reason for wanting Pakistan!

A little down the road lies a different world. At the Indian Institute of Information Technology (IIIT) the best and brightest of India’s young, selected after cut-throat competition, are engaged in a furious race to the top. IIIT-H boasts that its fresh graduates have recently been snapped up with fantastic Rs1.5 crore (Indian) salaries by corporate entities such as Google and Facebook.

This face of modern India is equally visible at the various Indian Institutes of Technology (IIT), whose numbers have exploded from four to 18. They are the showpieces of Indian higher education. I spoke at three — Bombay, Gandhinagar, and Delhi — and was not disappointed. But some Indian academics feel otherwise.

Engineering education at the IITs, says Prof Raghubir Sahran of IIT-GN, has remained “mainly mimetic of foreign models (like MIT) and captive to the demands of the market and corporate agendas”. My physicist friend, Prof Deshdeep Sahdev, agrees. He left IIT-K to start his own company that now competes with Hewlett Packard in making tunnelling electron microscopes, and says IIT students are strongly drill-oriented, not innovative.

Still, even if the IITs are not top class, they are certainly good. Why has Pakistan failed in making its own version of the IITs? One essential condition is openness to the world of ideas. This mandates the physical presence of foreign visitors.

Indeed, on Indian campuses one sees a large number of foreigners — American, European, Japanese, and Chinese. They come for short visits as well as long stays, enriching universities and research centres.

Not so in Pakistan where foreigners are a rarity, to be regarded with suspicion. For example, at the National Centre for Physics, which is nominally a part of Quaid-i-Azam University but is actually ‘owned’ by the Strategic Plans Division (the custodian of Pakistan’s nuclear weapons), academic visitors are so tightly restricted that they seek to flee their jails soon after arrival. Those who came from Canada, Turkey and Iran to a recent conference at the NCP protested in writing and privately told us that they would never want to come back.

Tensions between secular and religious forces appear high in Modi’s India. Although an outsider cannot accurately judge the extent, I saw sparks fly when Nayantara Sahgal, the celebrated novelist who was the first of 35 Indian intellectuals to hand back their government awards, shared the stage with the governor of Andhra Pradesh and Telangana. After she spoke on the threats to writers, the murder of three Indian rationalists, and the lynching of a Muslim man falsely accused of possessing beef, the enraged governor threw aside his prepared speech and excoriated her for siding with terrorists.

Hindutva ideology has put the ‘scientific temper’ of Nehruvian times under visible stress. My presentations on science and rationality sometimes resulted in a number of polite, but obviously unfriendly, comments from the audience.

Legitimate cultural pride over path-breaking achievements of ancient Hindu scholars is being seamlessly mixed with pseudoscience. Shockingly, an invited paper at the recent Indian Science Congress claimed that Lord Shiva was the world’s greatest environmentalist. Another delegate blew on a ‘conch’ shell for a full two minutes because it would exercise the rectal muscles of Congress delegates!

Pakistan and India may be moving along divergent paths of development but their commonalities are becoming more accentuated as well. Engaging with the other is vital — and certainly possible.

Although I sometimes took unpopular political positions, at no point did I, as a Pakistani, experience hostility. The mature response of both governments to the Pathankot attack gives hope that Pakistan and India might yet learn to live with each other as normal neighbours. This in spite of the awful reality that terrorism is here to stay.

Sunday 24 January 2016

Want To Reduce Abortions? Don't Stigmatise Sex

 

The religious conservatives are responsible for high abortion rates; they are responsible for the injury and death of women.


Here is the fact that everyone debating abortion should know. There is no association between its legality and its incidence. In other words, banning abortion does not stop the practice; it merely makes it more dangerous.

The abortion debate is presented as a conflict between the rights of embryos and the rights of women. Enhance one, both sides sometimes appear to agree, and you suppress the other. But once you grasp the fact that legalising women's reproductive rights does not raise the incidence of induced abortions, only one issue remains to be debated. Should they be legal and safe or illegal and dangerous? Hmmm, tough question.

There might be no causal relationship between reproductive choice and the incidence of abortion, but there is a strong correlation: an inverse one. As the Lancet's most recent survey of global rates and trends notes, "The abortion rate was lower … where more women live under liberal abortion laws."

Why? Because laws restricting abortion tend to be most prevalent in places where contraception and comprehensive sex education are hard to obtain, and in which sex and childbirth outside marriage are anathematised. Young people have sex, whatever their elders say; they always have and always will. Those with the least information and the least access to birth control are the most likely to suffer unintended pregnancies. And what greater incentive could there be for terminating a pregnancy than a culture in which reproduction out of wedlock is a mortal sin?

How many more centuries of misery, mutilation and mortality are required before we understand that women — young or middle aged, within marriage or without — who do not want a child may go to almost any lengths to terminate an unwanted pregnancy? How much more evidence do we need that, in the absence of legal, safe procedures, such sophisticated surgical instruments as wire coathangers, knitting needles, bleach and turpentine will be deployed instead? How many more poisonings, punctured guts and burst wombs are required before we recognise that prohibition and moral suasion will not trounce women's need to own their lives?

The most recent meta-analysis of global trends, published in 2012, discovered that the abortion rate, after a sharp decline between 1995 and 2003, scarcely changed over the following five years. But the proportion that were unsafe (which, broadly speaking, means illegal) rose from 44% to 49%.

Most of this change was due to a sharp rise in unsafe abortions in West Asia (which includes the Middle East), where Islamic conservatism is resurgent. In the regions in which Christian doctrine exerts the strongest influence over legislation — west and middle Africa and central and south America — there was no rise. But that's only because the proportion of abortions that were illegal and unsafe already stood at 100%.

As for the overall induced abortion rate, the figures tell an interesting story. Western Europe has the world's lowest termination rate: 12 per year for every 1000 women of reproductive age. The more godly North America aborts 19 embryos for every 1000 women. In South America, where (when the figures were collected) the practice was banned everywhere, the rate was 32. In eastern Africa, where ferocious laws and powerful religious injunctions should — according to conservative theory — have stamped out the practice long ago, it was 38.

The weird outlier is eastern Europe, which has the world's highest abortion rate: 43 per 1000. Under communism, abortion was the only available form of medical birth control. The rate has fallen from 90 since 1995, as contraception has become easier to obtain, but there's still a long way to go.

Facts, who needs 'em? Across the red states of the US, legislators have been merrily passing laws that make abortion clinics impossible to run, while denying children effective sex education. In Texas, thanks to restrictive new statutes, over half the clinics have closed since 2013. But women are still obliged to visit three times before receiving treatment: in some cases this means travelling 1000 miles or more. Unsurprisingly, 7% of those seeking medical help have already attempted their own solutions.

The only reason why this has not caused an epidemic of abdominal trauma is the widespread availability, through unlicensed sales, of abortion drugs such as misoprostol and mifepristone. They're unsafe when used without professional advice, but not as unsafe as coathangers and household chemicals.

In June, the US Supreme Court will rule on the constitutionality of the latest Texan assault on legal terminations, the statute known as HB2. If the state of Texas wins, this means, in effect, the end of Roe v Wade, the decision that deemed abortion a fundamental right in the United States.

In Northern Ireland the new first minister, Arlene Foster, who took office on Monday, has vowed to ensure that the 1967 abortion act, which covers the rest of the United Kingdom, will not apply to her country. Women there will continue to buy pills (and run the risk of confiscation as the police rifle their post) or travel to England, at some expense and trauma. Never mind the finding of a High Court judge: "there is no evidence before this court that the law in Northern Ireland has resulted in any reduction in the number of abortions". It just warms the heart to see Protestant and Catholic fundamentalists setting aside their differences to ensure that women's bodies remain the property of the state.

Like them, I see human life as precious. Like them, I want to see a reduction in abortions. So I urge states to do the opposite of what they prescribe. If you want fewer induced abortions, you should support education that encourages children to talk about sex without embarrassment or secrecy; contraception that's freely available to everyone; an end to the stigma surrounding sex and birth before marriage.

The religious conservatives who oppose these measures have blood on their hands. They are responsible for high abortion rates; they are responsible for the injury and death of women. And they have the flaming cheek to talk about the sanctity of life.

Saturday 23 January 2016

Silence from big six energy firms is deafening

If this were a competitive market, our fuel bills would be £850 a year instead of £1,100

Patrick Collinson in The Guardian


 
UK consumers are not seeing their tariffs cut despite the fall in wholesale gas and oil prices. Photograph: Alamy


You cannot hope to bribe or twist the British journalist (goes the old quote from Humbert Wolfe) “But, seeing what the man will do unbribed, there’s no occasion to.” Much the same could be said about Britain’s energy companies. You cannot call them a cartel. But seeing what they do without actively colluding, there’s no occasion to.

Almost every day the price of oil and gas falls on global markets. But this has been met with deafening inactivity from the big six energy giants. Their standard tariffs remain stubbornly high, bar tiny cuts by British Gas last year and E.ON this week.

If this were a competitive market, which reflected the 45% fall in wholesale prices seen over the last two years, the average dual-fuel consumer in Britain would be paying £850 or so a year, rather than the £1,100 charged to most customers on standard tariffs.

But it is not a competitive market. The energy giants know that around 70% of customers rarely switch, so they can be very effectively milked through the pricey standard tariff, which is, itself, set at peculiarly similar levels across the big providers. The advent of paperless billing probably helps the companies, too, with busy householders failing to spot that they are paying way over the odds.

The gap between the standard tariffs and the low-cost tariffs is now astounding – £1,100 a year vs £775 a year. Yes, the 30% of households who regularly switch can, and do, benefit. But why must we have a business model where seven out of 10 customers lose out, while three out of 10 gain?

The vast majority would rather have an honest tariff deal where their energy company passes on reductions in wholesale prices without having to go through the rigmarole of switching.

Instead, we have a regulatory set-up which believes that the problem is that not enough of us switch. It thinks that it will be solved by getting that 30% figure up to 50% or more. Unfortunately, too many regulators have a mindset that is almost ideologically attuned to a belief in the efficacy of markets, and the benefits of competition. If competition is not working, then they think the answer is simply more competition.

What would benefit consumers in these natural monopoly markets would be less competition and more regulation. We now have decades of evidence of how privatised former monopolies behave, and what it tells us is that they are there to benefit shareholders and bonus-seeking management, rather than customers.

In March we will hear from the Competition and Markets Authority about the results of its investigation into the energy market. Maybe it will conclude that privatisation and competition have failed, but my guess is that it won’t. The clue is in the name of the authority.

• A final word about home insurance. Last week I said every insurer is in on the game, happy to rip off loyal customers, particularly older ones. I received a letter from a 90-year-old householder in Richmond upon Thames who, for 20 years, has bought home and contents cover from the Ecclesiastical Insurance company.

After seeing my coverage, he nervously checked his premiums, as he had been letting them go through on direct debit for years without scrutiny.

To his delight, he discovered that Ecclesiastical had, unprompted, been cutting his insurance premiums.

One company, at least, doesn’t think it should skin an elderly customer just because it can probably get away with it. We should perhaps praise the lord there is an insurer out there with a conscience.

Is Ecclesiastical the only “ethical” insurer, or are there any others who are not “in on the game”, asks our reader from Richmond. Let me know!

Is mindfulness making us ill?

It’s the relaxation technique of choice, popular with employers and even the NHS. But some have found it can have unexpected effects

Dawn Foster in The Guardian


 
Illustration: Nick Lowndes for the Guardian

I am sitting in a circle in a grey, corporate room with 10 housing association employees – administrators, security guards, cleaners – eyes darting about nervously. We are asked to eat a sandwich in silence. To think about every taste and texture, every chewing motion and bite. Far from being relaxed, I feel excruciatingly uncomfortable and begin to wonder if my jaw is malfunctioning. I’m here to write about a new mindfulness initiative, and since I’ve never to my knowledge had any mental health issues and usually thrive under stress, I anticipate a straightforward, if awkward, experience.

Then comes the meditation. We’re told to close our eyes and think about our bodies in relation to the chair, the floor, the room: how each limb touches the arms, the back, the legs of the seat, while breathing slowly. But there’s one small catch: I can’t breathe. No matter how fast, slow, deep or shallow my breaths are, it feels as though my lungs are sealed. My instincts tell me to run, but I can’t move my arms or legs. I feel a rising panic and worry that I might pass out, my mind racing. Then we’re told to open our eyes and the feeling dissipates. I look around. No one else appears to have felt they were facing imminent death. What just happened?

For days afterwards, I feel on edge. I have a permanent tension headache and I jump at the slightest unexpected noise. The fact that something seemingly benign, positive and hugely popular had such a profound effect has taken me by surprise.

Mindfulness, the practice of sitting still and focusing on your breath and thoughts, has surged in popularity over the last few years, with a boom in apps, online courses, books and articles extolling its virtues. It can be done alone or with a guide (digital or human), and with so much hand-wringing about our frenetic, time-poor lifestyles and information overload, it seems to offer a wholesome solution: a quiet port in the storm and an opportunity for self-examination. The Headspace app, which offers 10-minute guided meditations on your smartphone, has more than three million users worldwide and is worth over £25m. Meanwhile, publishers have rushed to put out workbooks and guides to line the wellness shelves in bookshops. 

Large organisations such as Google, Apple, Sony, Ikea, the Department of Health and Transport for London have adopted mindfulness or meditation as part of their employee packages, claiming it leads to a happier workforce, increased productivity and fewer sick days. But could such a one-size-fits-all solution backfire in unexpected ways?

Even a year later, recalling the sensations and feelings I experienced in that room summons a resurgent wave of panic and tightness in my chest. Out of curiosity, I try the Headspace app, but the breathing exercises leave me with pins and needles in my face and a burgeoning terror. “Let your thoughts move wherever they please,” the app urges. I just want it to stop. And, as I discovered, I’m not the only person who doesn’t find mindfulness comforting.

Claire, a 37-year-old in a highly competitive industry, was sent on a three-day mindfulness course with colleagues as part of a training programme. “Initially, I found it relaxing,” she says, “but then I found I felt completely zoned out while doing it. Within two or three hours of later sessions, I was starting to really, really panic.” The sessions resurfaced memories of her traumatic childhood, and she experienced a series of panic attacks. “Somehow, the course triggered things I had previously got over,” Claire says. “I had a breakdown and spent three months in a psychiatric unit. It was a depressive breakdown with psychotic elements related to the trauma, and several dissociative episodes.”

Four and a half years later, Claire is still working part-time and is in and out of hospital. She became addicted to alcohol, when previously she was driven and high-performing, and believes mindfulness was the catalyst for her breakdown. Her doctors have advised her to avoid relaxation methods, and she spent months in one-to-one therapy. “Recovery involves being completely grounded,” she says, “so yoga is out.”

Research suggests her experience might not be unique. Internet forums abound with people seeking advice after experiencing panic attacks, hearing voices or finding that meditation has deepened their depression after some initial respite. In their recent book, The Buddha Pill, psychologists Miguel Farias and Catherine Wikholm voice concern about the lack of research into the adverse effects of meditation and the “dark side” of mindfulness. “Since the book’s been published, we’ve had a number of emails from people wanting to tell us about adverse effects they have experienced,” Wikholm says. “Often, people have thought they were alone with this, or they blamed themselves, thinking they somehow did it wrong, when actually it doesn’t seem it’s all that uncommon.”

One story in particular prompted Farias to look further into adverse effects. Louise, a woman in her 50s who had been practising yoga for 20 years, went away to a meditation retreat. While meditating, she felt dissociated from herself and became worried. Dismissing it as a routine side-effect of meditation, Louise continued with the exercises. The following day, after returning home, her body felt completely numb and she didn’t want to get out of bed. Her husband took her to the doctor, who referred her to a psychiatrist. For the next 15 years she was treated for psychotic depression.

Farias looked at the research into unexpected side-effects. A 1992 study by David Shapiro, a professor at the University of California, Irvine, found that 63% of the group studied, who had varying degrees of experience in meditation and had each tried mindfulness, had suffered at least one negative effect from meditation retreats, while 7% reported profoundly adverse effects including panic, depression, pain and anxiety. Shapiro’s study was small-scale; several research papers, including a 2011 study by Duke University in North Carolina, have raised concerns at the lack of quality research on the impact of mindfulness, specifically the lack of controlled studies.

Farias feels that media coverage inflates the moderate positive effects of mindfulness, and either doesn’t report or underplays the downsides. “Mindfulness can have negative effects for some people, even if you’re doing it for only 20 minutes a day,” Farias says. “It’s difficult to tell how common [negative] experiences are, because mindfulness researchers have failed to measure them, and may even have discouraged participants from reporting them by attributing the blame to them.”

Kate Williams, a PhD researcher in psychiatry at the University of Manchester and a mindfulness teacher, says negative experiences generally fall into one of two categories. The first is seen as a natural emotional reaction to self-exploration. “What we learn through meditation is to explore our experiences with an open and nonjudgmental attitude, whether the experience that arises is pleasant, unpleasant or neutral,” she says.

The second, Williams says, is more severe and disconcerting: “Experiences can be quite extreme, to the extent of inducing paranoia, delusions, confusion, mania or depression.” After years of training, research and practice, her own personal meditation has included some of these negative experiences. “Longer periods of meditation have at times led me to feel a loss of identity and left me feeling extremely vulnerable, almost like an open wound,” Williams says. As an experienced mindfulness teacher, however, she says she is able to deal with these negative experiences without lasting effect.

Rachel, a 34-year-old film-maker from London, experimented with mindfulness several years ago. An old school friend who had tried it attempted to warn her off. “He said, ‘It’s hardcore – you’ll go through things you don’t want to go through and it might not always be positive.’ I suppose sitting with yourself is hard, especially when you’re in a place where you don’t really like yourself. Meditation can’t ‘fix’ anyone. That’s not what it’s for.”

After a few months of following guided meditations, and feeling increasingly anxious, Rachel had what she describes as a “meltdown” immediately after practising some of the techniques she’d learned; the relationship she was in broke down. “That’s the horrible hangover I have from this: instead of having a sense of calm, I overanalyse and scrutinise everything. Things would run round in my mind, and suddenly I’d be doing things that were totally out of character, acting very, very erratically. Having panic attacks that would restrict my breathing and, once, sent me into a blackout seizure on the studio floor that involved an ambulance trip to accident and emergency.” Rachel has recovered to some extent; she experiences similar feelings on a lower level even today, but has learned to recognise the symptoms and take steps to combat them.


Illustration: Nick Lowndes for the Guardian

So are employers and experts right to extol the virtues of mindfulness? According to Will Davies, senior lecturer at Goldsmiths and author of The Happiness Industry, our mental health has become a money-making opportunity. “The measurement of our mental and emotional states at work is advancing rapidly at the moment,” he says, “and businesses are increasingly aware of the financial costs that stress, depression and anxiety saddle them with.”

Rather than removing the source of stress, whether that’s unfeasible workloads, poor management or low morale, some employers encourage their staff to meditate: a quick fix that’s much cheaper, at least in the short term. After all, it’s harder to complain that you’re under too much stress at work if your employer points out that they’ve offered you relaxation classes: the blame then falls on the individual. “Mindfulness has been grabbed in recent years as a way to help people cope with their own powerlessness in the workplace,” Davies says. “We’re now reaching the stage where mandatory meditation is being discussed as a route to heightened productivity, in tandem with various apps, wearable devices and forms of low-level employee surveillance.”

One former Labour backbencher, Chris Ruane, recently proposed meditation for civil servants, on the basis that it would cut Whitehall costs by lowering sick leave through stress, rather than making the workplace and jobs less stressful in the first place. “The whole agenda is so fraught with contradictions, between its economic goals and its supposedly spiritual methods,” Davies argues. “It’s a wonder anyone takes it seriously at all.”

Mindfulness has also been adopted by the NHS, with many primary care trusts offering and recommending the practice in lieu of cognitive behavioural therapy (CBT). “It fits nicely with the Nutribullet-chugging, clean-eating crowd, because it doesn’t involve any tablets,” says Bethan, a mental health nurse working in east London. “My main problem with it is that it’s just another word for awareness.”

Over the past few years, Bethan has noticed mindfulness mentioned or recommended increasingly at work, and says many colleagues have been offered sessions and training as part of their professional development. But the move towards mindfulness delivered through online or self-help programmes isn’t for everyone. “It’s fine, but realising you have depression isn’t the same as tackling it,” she says. “I don’t see it as any different from the five-a-day campaign: we know what we should be eating, but so many of us don’t do it. We know that isolating ourselves isn’t helpful when we feel blue, but we still do that.”

Part of the drive is simple cost-cutting. With NHS budgets squeezed, resource-intensive and diverse therapies that involve one-on-one consultations are far more expensive to dispense than online or group therapies such as mindfulness. A CBT course costs the NHS £950 per participant on average, while mindfulness-based cognitive therapy, because it’s delivered in a group, comes in at around £300 a person. “It’s cheap, and it does make people think twice about their choices, so in some respects it’s helpful,” Bethan says.

But in more serious cases, could it be doing more harm than good? Florian Ruths has researched this area for 10 years, as clinical lead for mindfulness-based therapy in the South London and Maudsley NHS foundation trust. He believes it is possible to teach yourself mindfulness through apps, books or online guides. “For most people, I think if you’re not suffering from any clinical issues, or illness, or from stress to a degree that you’re somewhat disabled, it’s fine,” he says. “We talk about illness as disability, and disability may arise through sadness, it may arise through emotional disturbance, like anxiety. Then, obviously, it becomes a different ballgame, and it would be good to have a guided practice to take you through it.” This runs counter to the drive towards online mindfulness apps, delivered without supervision, and with little to no adaptation to individual needs or problems.

But for Ruths, the benefits outweigh the risk of unusual effects. “If we exercise, we live longer, we’re slimmer, we’ve got less risk of dementia, we’re happier and less anxious,” he says. “People don’t talk about the fact that when you exercise, you are at a natural risk of injuring yourself. When people say in the new year, ‘I’m going to go to the gym’ – out of 100 people who do that, about 20 will injure themselves, because they haven’t been taught how to do it properly, or they’ve not listened to their bodies. So when you’re a responsible clinician or GP, you tell someone to get a good trainer.”

Certain mental health problems increase the risk of adverse effects from mindfulness. “If you have post-traumatic stress disorder, there is a certain chance that you may find meditation too difficult to do, as you may be re-experiencing traumatic memories,” Ruths says. “Once again, it’s about having experienced trainers to facilitate that. We’ve seen some evidence that people who’ve got bipolar vulnerability may struggle, but we need to keep in mind that it may be accidental, or it may be something we don’t know about yet.”

Of course, people may not know they have a bipolar vulnerability until they try mindfulness. Or they might have repressed the symptoms of post-traumatic stress disorder, only for these to emerge after trying the practice.

How can an individual gauge whether they’re likely to have negative side-effects? Both Farias and Ruths agree there isn’t a substantial body of evidence yet on how mindfulness works, or what causes negative reactions. One of the reasons is obvious: people who react badly tend to drop out of classes, or stop using the app or workbook; rather than make a fuss, they quietly walk away. Part of this is down to the current faddishness of mindfulness and the way it’s marketed: unlike prescribed psychotherapy or CBT, it’s viewed as an alternative lifestyle choice, rather than a powerful form of therapy.

Claire is clear about how she feels mindfulness should be discussed and delivered: “A lot of the people who are trained in mindfulness are not trained in the dangers as well as the potential benefits,” she says. “My experience of people who teach it is that they don’t know how to help people if it goes too far.”

There is currently no professionally accredited training for mindfulness teachers, and nothing to stop anyone calling themselves a mindfulness coach, though advocates are calling for that to change. Finding an experienced teacher who comes recommended, and not being afraid to discuss negative side-effects with your teacher or GP, means you’re far more likely to enjoy and benefit from the experience.

As both Claire and I have found, there are alternative relaxation methods that can keep you grounded: reading, carving out more time to spend with friends, and simply knowing when to take a break from the frenetic pace of life. Meanwhile, Claire’s experience has encouraged her to push for a better understanding of alternative therapies. “No one would suggest CBT was done by someone who wasn’t trained,” she says. “I’d like to see a wider discussion about what mindfulness is – and on what the side-effects can be.”

Friday 22 January 2016

Don’t blame China for these global economic jitters

In truth the west failed to learn from the 2008 crash. Any economic ‘recovery’ was built on asset bubbles

Ha-Joon Chang in The Guardian


 
‘There has never been a real recovery in North America and western Europe since 2008.’ Photograph: Kai Pfaffenbach/Reuters


The US stock market has just had the worst start to a year in its history. At the same time, European and Japanese stock markets have lost around 10% and 15% of their values respectively; the Chinese stock market has resumed its headlong dash downward; and the oil price has fallen to the lowest level in 12 years, reflecting (and anticipating) worldwide economic slowdown.

According to the dominant economic narrative of recent times, 2016 was the year when the world economy would recover fully from the 2008 crash. The US would lead this recovery by generating growth and jobs via fiscal conservatism and pro-business policies. Reflecting the economy’s robust growth, the US stock market reached new heights in 2015, although disrupted by the mess in the Chinese stock market over the summer. By last October, US unemployment had fallen from the post-crisis peak of 10% to 5%, bringing it back close to the pre-crisis low. In a show of confidence, last month the US Federal Reserve finally raised its interest rate for the first time in nine years.


Not far behind the US, the story goes, have been Britain and Ireland. Hit harder than the US by the financial crisis, they have, however, recovered handsomely because they kept their nerve and stuck to the right, if unpopular, policies. Spending cuts, focused on wasteful welfare spending, accelerated job creation by making it more difficult for people to live off the taxpayer. They sensibly didn’t give in to the banker-bashers and chose not to over-regulate the financial sector.

Even the continental European economies have been finally picking up, it was said, having accepted the need for fiscal discipline, labour market reform and cutting business regulations. The world – at least the rich world – was finally set for a full recovery. So what has gone wrong?

Those who put forward the narrative are now trying to blame China in advance for the coming economic woes. George Osborne has been at the forefront, warning this month of a “dangerous cocktail of new threats” in which the devaluation of the Chinese currency and the fall in oil prices (both in large part due to China’s economic slowdown) figured most prominently. If our recovery was to be blown off course, he implied, it would be because China had mismanaged its economy.

China is, of course, an important factor in the global economy. Only 2.5% of the world economy in 1978, on the eve of its economic reform, it now accounts for around 13%. However, its importance should not be exaggerated. As of 2014, the US (22.5%), the eurozone (17%) and Japan (7%) together accounted for nearly half of the world economy. The rich world vastly overshadows China. Unless you are a developing economy whose export basket is mainly made up of primary commodities destined for China, you cannot blame your economic ills on its slowdown.

The truth is that there has never been a real recovery from the 2008 crisis in North America and western Europe. According to the IMF, at the end of 2015, inflation-adjusted income per head (in national currency) was lower than the pre-crisis peak in 11 out of 20 of those countries. In five (Austria, Iceland, Ireland, Switzerland and the UK), it was only just higher – by between 0.05% (Austria) and 0.3% (Ireland). Only in four countries – Germany, Canada, the US and Sweden – was per-capita income materially higher than the pre-crisis peak.

Even in Germany, the best performing of those four countries, per capita income growth rate was just 0.8% a year between its last peak (2008) and 2015. The US growth rate, at 0.4% per year, was half that. Compare that with the 1% annual growth rate that Japan notched up during its so-called “lost two decades” between 1990 and 2010.

To make things worse, much of the recovery has been driven by asset market bubbles, blown up by the injection of cash into the financial market through quantitative easing. These asset bubbles have been most dramatic in the US and UK. They were already at an unprecedented level in 2013 and 2014, but scaled new heights in 2015. The US stock market reached the highest ever level in May 2015 and, after the dip over the summer, more or less came back to that level in December. Having come down by nearly a quarter from its April 2015 peak, Britain’s stock market is currently not quite so inflated, but the UK has another bubble to reckon with, in the housing market, where prices are 7% higher than the pre-crisis peak of 2007.

Thus seen, the main causes of the current economic turmoil lie firmly in the rich nations – especially in the finance-driven US and UK. Having refused to fundamentally restructure their economies after 2008, the only way they could generate any sort of recovery was with another set of asset bubbles. Their governments and financial sectors talked up anaemic recovery as an impressive comeback, propagating the myth that huge bubbles are a measure of economic health.

Whether or not the recent market turmoil leads to a protracted slide or a violent crash, it is proof that we have wasted the past seven years propping up a bankrupt economic model. Before things get any worse, we need to replace it with one in which the financial sector is made less complex and more patient, investment in the real economy is encouraged by fiscal and technological incentives, and measures are brought in to reduce inequality so that demand can be maintained without creating more debts.

None of these will be easy to implement, but we know what the alternative is – a permanent state of low growth, instability, and depressed living standards for the vast majority.

Thursday 21 January 2016

Wait for a bus and then tell me the market knows best

Deregulation of buses in 1986 was a disaster for those unable to afford a car. Municipalisation is the key to a fairer system

Owen Jones in The Guardian


 

Liverpool’s Queen’s Square bus station. Since 1986, ‘bus trips in big cities outside London have collapsed from 2bn to 1bn a year’. Photograph: Christopher Thomond for the Guardian


Of the issues differentiating the metropolitan mindset in the capital from opinions voiced elsewhere, the starkest is probably transport. We hear much about the overcrowded rail network in London and the south east, where fares are among the most expensive in Europe. Of course we do: much of the national media works from London. And we have warm words for the buses in the capital, where since Ken Livingstone’s first mayoral administration, starting in 2000, the mayoralty and Transport for London have assumed regulatory powers, determining prices and frequency with dramatic success.

Travel outside London, however, and Britain’s deregulated bus system reveals itself as the source of widespread, justified disgruntlement – an overpriced, inefficient, poor-quality mess. According to a report to be published this week, since deregulation in 1986 – unleashed with the promise that “more people would travel” – bus trips in big cities outside London have collapsed from 2bn to 1bn a year. In London, on the other hand, where everything from how much we pay to which routes exist is decided by the mayor and Transport for London, bus use since the 1980s has gone in the opposite direction: from around 1bn to more than 2bn trips a year. Britain’s bus privatisation disaster is a story of profit before need, and a discomfiting tale for those who believe the private sector automatically trumps the public realm.

I asked Twitter for bad experiences of buses, and turned my feed into one long howl of anguish. Chronic delays, “virtually no evening travel”, old “clapped-out buses”, infrequency, poor punctuality, extortionate prices: these were common complaints. “No evening and barely a weekend bus service in Helmsley, North Yorks,” complained one. “Need a car to live here.” Another cry of frustration: “Costly, cash only, fragmented among several providers, no unified ticketing, virtually ends at 1800 hours, v[ery] poor on Sunday.”

It is the less well-off who suffer most. Fewer than one in three of the poorest tenth of the population own their own car. With our bus system in such a state, it is unsurprising that those with the least money are three times more likely to use a cab than the richest.


Our journey towards the great British bus disaster is uncovered in meticulous detail in the report, by Ian Taylor and Lynn Sloman. “Bus users in Britain inhabit different worlds,” they note, ranging from London’s centrally regulated bus system to rural dwellers who “live in a third world with a skeletal service or, in some places, no service at all”. Whatever Margaret Thatcher’s programme of deregulation in the 1980s offered, the opposite happened: fares rose, services worsened and bus use fell.

Private bus companies have one motive, after all: making profit, rather than catering to the needs of their users. Between 2003 and 2013, £2.8bn ended up as dividend payments in the bank balances of shareholders, rather than invested in improving bus services. About 40p in every pound of their total revenues comes directly from the taxpayer: yet another example of Britain’s publicly subsidised “free market” economy.

Bus services are a hopelessly confused patchwork of provision. Some tickets you can use with different providers, some you can’t. You may find enough buses at peak hours but struggle to catch one at other times. If you live in a rural area, you may struggle to find any buses to catch at all. Local authorities can pay for services deemed crucial for local communities, but they do so at great expense. Often the buses themselves are from another age: cold, noisy, rickety vehicles driven by woefully underpaid drivers. Those who can afford to insulate themselves by using their own vehicles can minimise the inconvenience, but that’s not an option for the less fortunate.

Our rail system, we debate. We know that a decisive majority of Britons want to see our railways publicly owned. But where is the debate about rescuing Britain’s bus services?

Bringing buses into accountable, municipal ownership would transform the system. Consider France, or Germany, where apparently 88% of local public transport trips are on publicly owned trams, trains or buses, but also look closer to home. In 2013, after running an £8m surplus, municipally owned Lothian Buses in Edinburgh reported satisfaction rates of 96%, beating all competition in Britain. Nottingham City Transport – majority shareholder the city council – has satisfaction rates of 92% in “one of the least car-dependent cities” in Britain.


Here’s what municipalisation can achieve, say Taylor and Sloman. A “comprehensive network” for all communities, without allowing private companies to cherrypick the most profitable routes or obliging local authorities to intervene at “disproportionate cost”. Rather than having a bewildering array of fares, there could be “simple, area-wide fares” that – as in London – could be used across all transport services, from trains to buses, with daily costs capped. Instead of passengers being stuck for the best part of an hour at a freezing bus stop after changing buses, timetables could be coordinated, and public money be spent improving services, not frittered away as shareholder dividends.

We hear so little about buses, undoubtedly, because in London they are good enough for the middle classes to use, while outside the capital those with sharper elbows often avoid them. But here is surely an issue for Labour to champion. Those who suffer the most are often in areas where Labour could do well to try to win support, such as Cornwall, and in many Tory-dominated rural communities. Municipal ownership, with bus passengers encouraged to help run services, would improve the lives of many poorer citizens. There would also be fewer cars on the road. Whether or not Lady Lindsay of Dowhill ever really said “Anybody seen in a bus over the age of 30 has been a failure in life”, the stigma could be overcome for the public good.

Putting the great bus privatisation disaster on the agenda would be another reminder that the inherent superiority of the market is not a foregone conclusion.

This is a rich country. If we cannot even produce a decently functioning bus service for our citizens, we need to start asking ourselves some serious questions.

Arguing the toss

Nathan Leamon in Cricinfo


Will awarding the toss to the away team even up the playing field and deliver more away Test wins, or is this yet another case of received cricketing wisdom not stacking up with the facts?


You will rarely be criticised for choosing to bat. Batting is the default setting; bowling first is seen as the gamble © Getty Images



On the first morning of the first Test between Pakistan and England in Abu Dhabi, three events came to mind. One current, one recent, one infamous. The first was the conversation between Michael Atherton and both captains at the toss, and the unanimity of all concerned. The second was the recent proposal from Ricky Ponting and Michael Holding, among others, that the toss be done away with in Test cricket and the choice given instead to the away captain. The third was Brisbane 2002, and Nasser Hussain choosing to bowl first on a day almost as hot as the one in Abu Dhabi.

Let's start with the second. The suggestion of awarding the toss to the away captain was made by Ponting as a possible solution to the perceived problem of home teams tailoring wickets to suit their strengths, and the resulting domination of home teams. "It has never been harder to win away from home", we are told repeatedly.

Ironically, the decline of away wins is one of those facts that is assumed to be true without, it would seem, often being checked. In fact, it has never been easier to win on the road. More Tests are won by the away team now than at any time in recent history.


AWAY WINS IN TESTS

Decade     Win%
2010s        28.8
2000s        28.4
1990s        23.1
1980s        21.1
1970s        22.7
1960s        21.5


This is largely down to the decline of the draw. There have been more and more results in Tests, and although the proportion of those results going the way of the visitors has shifted slightly in the home team's favour, the overall effect has been a significant rise in away wins.
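
Numbers like those in the table above are easy to check against a match-level results database. Here is a minimal sketch in Python of how the decade breakdown could be reproduced, assuming a purely hypothetical tests.csv file with year and result columns (the file name and column names are illustrative, not a real dataset):

```python
# Sketch: away-win and draw percentages by decade.
# Assumes a hypothetical tests.csv with columns: year, result
# where result is one of "home", "away", "draw".
import pandas as pd

tests = pd.read_csv("tests.csv")              # hypothetical match-level data
tests["decade"] = (tests["year"] // 10) * 10  # e.g. 2013 -> 2010

summary = (
    tests.groupby("decade")["result"]
    .value_counts(normalize=True)             # share of each result type
    .unstack(fill_value=0)
    .mul(100)
    .round(1)
)
print(summary[["away", "draw"]])  # rising away wins alongside falling draws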

That said, there are other factors that suggest the balance of power is shifting slightly towards the home team. The gap between averages at home and averages away is growing, for example. So let's assume for now that the premise is true, and that home teams are increasingly dominant.

Holding and Ponting have suggested giving the toss to the visiting captain to prevent home teams stacking the conditions in their favour. I don't know whether this is a good idea or not. But there are three reasons that we should question whether it would achieve its aims.

Firstly, it assumes groundsmen can reliably bake certain characteristics into a pitch. In practice, pitch preparation seems to be an inexact science. I have stood before Test matches around the world and listened to groundsmen describe how the pitch is going to play, only to watch it do something completely different half an hour later.

It also presupposes that the interests of groundsman and home team are aligned, which is often not the case. In England for example, venues are heavily incentivised to maximise revenues from the Tests they host by ensuring five full days' play. So groundsmen, understandably, often pay less attention to the needs of the visiting circus than to the people who pay their salary for the other 51 weeks of the year.

Secondly, there is a law of unintended consequences in sporting rule changes that can often produce the opposite result to the one intended. If a home captain had control over the pitch, the framers of this law are assuming he would back away from tilting it in his favour. Is it not just as likely that he would go the other way and seek to produce a pitch so extreme that the toss ceased to matter? This, after all, is what MS Dhoni openly sought to do when England and Australia last toured: produce pitches that turn big from ball one, and so take the toss out of the equation. Equally, you could imagine England or Australia producing genuine green-tops that would be as helpful to the quicks on day four as on day one.

But lastly, and most importantly, it assumes that captains are able to use the toss to their advantage. This is not in any way proven. In fact the evidence suggests it just isn't the case.

At the time of writing, 1,048 Tests have been played since January 1990. During that period, the side that won the toss has lost slightly more (377) matches than it has won (374). Winning the toss in the modern era appears to give a side no advantage at all.
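
A quick significance check makes the point concrete. Treating each decided Test as a coin flip for the toss winner, 374 wins against 377 losses is indistinguishable from chance. A sketch, assuming SciPy is available:

```python
# Sketch: is the toss winner's record since 1990 distinguishable from a coin flip?
from scipy.stats import binomtest

wins, losses = 374, 377               # figures quoted in the article
n = wins + losses                     # decided Tests for the toss winner
result = binomtest(wins, n=n, p=0.5)  # two-sided by default
print(f"toss winner's win rate in decided Tests: {wins / n:.3f}")
print(f"p-value against a fair coin: {result.pvalue:.2f}")  # nowhere near significant
```

The same check, applied to the bat-first versus bowl-first split discussed below, would tell you whether that gap clears the statistical noise.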

It wasn't always so. On uncovered pitches, batting first in almost all instances was a robustly successful strategy. If it rained during the match, the pitch would deteriorate, affecting the side batting second disproportionately. Until 1970, the side batting first in a Test won 36 per cent of matches, and lost 28 per cent.

But in the modern era, the advantage of winning the toss seems to have disappeared. This is, of course, stunningly counterintuitive.
Test cricket is an asymmetric game. One team bats first, then the other. And the two teams' chances of winning are not equal. The team batting first has different requirements for victory to the team batting second, and the pitch changes over the course of the match, affecting the balance of power between bat and ball. Therefore, we would assume, teams that win the toss can choose the best conditions and so gain an advantage. But they don't. How can that possibly be?

Dropped catches and a sickening injury to Simon Jones didn't help Nasser Hussain after he chose to bowl in Brisbane in 2002 © Getty Images





Sometimes, a perfectly reasonable response to current circumstances becomes a habit, then a tradition, then an article of faith that outlives the circumstances that created it. We rarely question what we know to be self-evidently true. And so the bias towards batting first seems to have outlived the circumstances that created it by several decades.

"If you win the toss, nine times out of ten you should bat. On the tenth occasion you should think about bowling and then bat."

That was a very successful strategy to adopt for the first century of Test cricket. And one that is still the default setting for most captains. In the 700 Tests played since January 2000, nearly twice as many captains have batted first as have chosen to bowl. Is it still successful?

In a word, no. In that period, the side batting first has won 36 per cent of those Tests, the side bowling first 39 per cent. The bat-first bias at the toss would seem to be neutral at best, and probably counter-productive.


It is still hard to believe that captains aren't able to use the toss to their advantage. There are venues where the evidence is stark. Some pitches clearly favour the side batting first, some the side batting second. In the 40 Tests played in Lahore, the team batting first has won just three. Adelaide, by contrast, is a classic bat-first venue. It starts as a batsman's paradise, but by the fifth day can be very tricky to bat on, with considerable turn for the spinners. In the 74 Tests played at the ground, the side batting first have won 35 and the side batting second 19. Since 1990, batting averages there have been 44.6 in the first innings, 38.9 in the second, 30.1 in the third and 27.1 in the fourth; as you would expect, in that period 25 out of 26 captains have chosen to bat first, gaining a considerable advantage by doing so.

These are not isolated cases. Many pitches have similarly skewed characteristics. Galle and Old Trafford, for example, both have records similar to Adelaide's. Karachi is as bowl-first friendly as Lahore.
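
Venue-level skew of this kind can be surfaced with the same sort of data. A sketch of the Adelaide calculation, again with entirely hypothetical file and column names:

```python
# Sketch: runs per wicket by innings number at one venue since 1990.
# Assumes a hypothetical innings.csv with columns: venue, year, innings, runs, wickets.
import pandas as pd

inns = pd.read_csv("innings.csv")  # hypothetical innings-level data
adelaide = inns[(inns["venue"] == "Adelaide Oval") & (inns["year"] >= 1990)]

totals = adelaide.groupby("innings")[["runs", "wickets"]].sum()
avg = (totals["runs"] / totals["wickets"]).round(1)
print(avg)  # should fall from the mid-40s in the 1st innings towards the high 20s in the 4th
```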



****



Captains' behaviour at the toss seems to be yet another example of received cricketing wisdom not concurring with the evidence: what teams do doesn't seem to maximise their chances of winning. Why is this the case?

Well, part of the story involves how our brains handle information. There has been a great deal of research into memory and perception, and the results are both surprising and illuminating when it comes to our decision-making in sport. For a start, our memories don't work as we might expect. They are not akin to a videotape; we don't record a series of events and then play them back as and when they are needed.

The disturbing truth is that our unaided recall is not very good. The human brain encodes less than 10 per cent of what we experience; the rest it simply makes up. Our minds construct a narrative around the encoded memories we do have, filling in the gaps with a plausible story. Faced with a huge number of random or near-random events (a cricket match, for instance), our brains spot patterns even where there are none. Our minds look for those events that they can form into a pattern or story, and that becomes the meaning or lesson we take away from the match. Even if the vast majority of events that occurred didn't fit the pattern, we disproportionately remember the ones that did.

At their best, then, our memories seem to work along the lines of Albert Camus's description of fiction: they are the lie through which we tell the truth. What we remember didn't actually happen; what we remember is a story that our brains have fabricated, but one that we hope contains the essential truth of what happened, in a form we can understand and retain.

Our fallible memories are only part of the reason captains and coaches behave the way they do. There is another, far more powerful reason for the choices they make, and one that is harder to argue against. For this we need to go back to Brisbane in 2002, and Nasser Hussain choosing to bowl.


"The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function."
- F. Scott Fitzgerald



It was the first Test of the Ashes. The Australian team were at the peak of their powers and playing at home at 'Fortress Brisbane', the hardest ground in the world for a visiting team to win at: no visitors had won in the previous 26 Tests played at the 'Gabbattoir'. Hussain won the toss and chose to bowl; Australia were 364-2 by the close of play and went on to win comfortably.

It is no use looking back with hindsight and using that to determine whether a decision was right or wrong. I am sure that if Nasser had known that choosing to bowl first would bring a host of dropped chances, the loss of a bowler to injury and Australia piling up the first-innings runs, he would have chosen to have a look behind door B and strapped his pads on.
But he didn't know, and in evaluating a past decision, we shouldn't know either. We need to remain behind the veil of ignorance, aware of all the potential paths the match could have taken, but ignorant of the one that it did.

One way we can do that is to simulate the match. There are various models that allow us to simulate matches given the playing strengths of the two sides and give probabilities for the outcome. When we do this for that Brisbane Test, we get the following probabilities for England:


Decision                  Win                  Draw                 Lose
Bat First                   4%                     3%                   93%
Bowl First                 4%                   10%                   86%
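
Leamon doesn't describe the model behind these figures, but the general shape of such simulators is simple enough to sketch. The toy below illustrates the approach only; every parameter is an invented assumption, not the author's model. Each side's scoring strength is a made-up number, innings totals are drawn from a normal distribution, and a 'wear' factor shrinks scores as the pitch deteriorates:

```python
# Toy Monte Carlo match simulator - an illustration of the approach,
# not the model used for the table above. All numbers are invented.
import random

def innings_total(strength, wear):
    # Draw an innings total: stronger sides score more; 'wear' shrinks
    # scores as the pitch deteriorates later in the match.
    return max(0, int(random.gauss(strength * wear, 90)))

def simulate(bat_first, bat_second, n=20_000):
    tally = {"bat-first wins": 0, "bat-second wins": 0, "draw": 0}
    for _ in range(n):
        a = innings_total(bat_first, 1.00) + innings_total(bat_first, 0.85)
        b = innings_total(bat_second, 0.95) + innings_total(bat_second, 0.80)
        if a + b > 1450:              # crude proxy for a draw: very
            tally["draw"] += 1        # high-scoring matches run out of time
        elif a > b:
            tally["bat-first wins"] += 1
        else:
            tally["bat-second wins"] += 1
    return {k: round(v / n, 3) for k, v in tally.items()}

# Brisbane 2002, crudely: a far stronger Australia against England.
print("England bat first: ", simulate(bat_first=300, bat_second=420))
print("England bowl first:", simulate(bat_first=420, bat_second=300))
```

A real model would be calibrated on historical data and handle declarations, weather and pitch-specific behaviour; the point here is simply that you evaluate the decision across thousands of possible matches, not the one that happened.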



Every batsman in Australia's top seven for that match finished his career averaging over 45 (three averaged 50-plus); none of the English players did, and only two averaged over 40. England had a decent bowling attack. Australia had Warne, McGrath and Gillespie, with 1,000 wickets between them already.

England were a pretty good side: they'd won four and lost two of their previous 10 matches. But they were hopelessly outgunned, and in alien conditions. Steve Waugh, the Australian captain, was also going to bowl had he won the toss; Australia were almost certainly going to win regardless of who did what at the toss.

But none of that made any difference. Hussain's decision to bowl first was castigated by the public and press of both countries. Wisden described it as "one of the costliest decisions in Test history". One senior journalist wrote that the decision should prompt the England captain "to summon his faithful hound, light a last cigarette and load a single bullet into the revolver".

For Nasser in Brisbane, read Ricky Ponting at Edgbaston in 2005, another decision to insert the opposition that has never been lived down. Yet, if either of them had batted first and lost, no one would ever remember their decision at the toss. You will rarely if ever be criticised for choosing to bat. Batting is the default setting; bowling first is seen as the gamble. And remember, the side that bats first loses significantly more than it wins.

Test cricket is one of the greatest contests in sport, a brilliant, multi-faceted test of mind and body. But it is also a game of numbers. If you can tilt the numbers slightly in your favour, get them working for you, not against you, and plot a slightly more efficient path to victory, then you are always working slightly downhill rather than toiling against the slope.

As I write this, Pakistan are about to go out and bowl for the fourth consecutive day of England's first innings in Abu Dhabi on a pitch that you could land light aircraft on. They have home advantage, have made the orthodox decision, played well, and yet there is only one team that can win the match from here, and it isn't them. If this is what home advantage and winning the toss looks like then they are welcome to it.

It is all but certain that if they had ended up batting second they would now be in a considerably better position. Reverse the first innings as they happened and Pakistan would now be batting past an exhausted England side, about to put them under the pump for a difficult final three sessions. And, as we have seen, the alternative scenarios in which one side or the other gained a first-innings lead work disproportionately in favour of the side batting second.

But, we all do it. We look at a pristine wicket, flat, hard and true, and batting seems the only option. It is written into our cricketing DNA. The evidence may suggest there is a small marginal gain in bowling. But small margins be damned. If the marginal gain erodes your credibility and authority, then that is probably not an exchange you are willing to make. There are tides you can't swim against.

Which brings us back to Alastair Cook and Misbah-ul-Haq, standing in Abu Dhabi in the baking heat. Both are men of considerable character; brave, implacable and preternaturally determined to win. Each has withstood the slings and arrows of captaining their country through some fairly outrageous fortunes. Each is ready to bat first without a second thought. Because while they are certainly brave, they are not stupid. And you would have to be really stupid to make the right decision.
And there, of course, you have the central problem of much decision-making in cricket. This pitch is slightly different from every other pitch there has ever been. And you don't know for certain how it is going to play, or how that will influence the balance of power in the match. There are those who would argue that this is why stats are useless, or at best very limited.

I would agree entirely that stats are never sufficient to make a decision. There is nuance and subtlety to weigh; the brain and eye have access to information that the laptop doesn't. The feel and instincts of coaches and players – the hard-wired learning from decades in the game – contain incredibly valuable information and will always be the mainstay of decision-making that must stay flexible and fluid through changing match situations. But if we are honest, we must also accept that the sheer weight and tonnage of what we don't know about how cricket works would sink a battleship. To use stats and nothing else to make decisions would be incredibly foolish, and as far as I am aware no one ever has. But to insist on making decisions on incomplete information, without ever reviewing the effectiveness of those decisions, would seem almost equally perverse.

I'm not saying that everyone was wrong in Abu Dhabi. I'm not saying that Misbah should have bowled. The weight of opprobrium heaped on him doesn't bear thinking about. It's the sort of decision that ends captaincies. No, Misbah had only one option and he took it. But maybe, just maybe, one day there will come a time when it isn't such an obvious choice.

Deny the British empire's crimes? No, we ignore them

New evidence of British colonial atrocities has not changed our national ability to disregard it.

George Monbiot in The Guardian


 
Members of the Devon Regiment round up local people in a search for Mau Mau fighters in Kenya in 1954. Photograph: Popperfoto/Getty Images


There is one thing you can say for the Holocaust deniers: at least they know what they are denying. In order to sustain the lies they tell, they must engage in strenuous falsification. To dismiss Britain's colonial atrocities, no such effort is required. Most people appear to be unaware that anything needs to be denied.

The story of benign imperialism, whose overriding purpose was not to seize land, labour and commodities but to teach the natives English, table manners and double-entry book-keeping, is a myth that has been carefully propagated by the rightwing press. But it draws its power from a remarkable national ability to airbrush and disregard our past.

Last week's revelations, that the British government systematically destroyed the documents detailing mistreatment of its colonial subjects, and that the Foreign Office then lied about a secret cache of files containing lesser revelations, are by any standards a big story. But they were either ignored or consigned to a footnote by most of the British press. I was unable to find any mention of the secret archive on the Telegraph's website. The Mail's only coverage, as far as I can determine, was an opinion piece by a historian called Lawrence James, who used the occasion to insist that any deficiencies in the management of the colonies were the work of "a sprinkling of misfits, incompetents and bullies", while everyone else was "dedicated, loyal and disciplined".



The British government's suppression of evidence was scarcely necessary. Even when the documentation of great crimes is abundant, it is not denied but simply ignored. In an article for the Daily Mail in 2010, for example, the historian Dominic Sandbrook announced that "Britain's empire stands out as a beacon of tolerance, decency and the rule of law … Nor did Britain countenance anything like the dreadful tortures committed in French Algeria." Could he really have been unaware of the history he is disavowing?

Caroline Elkins, a professor at Harvard, spent nearly 10 years compiling the evidence contained in her book Britain's Gulag: the Brutal End of Empire in Kenya. She started her research with the belief that the British account of the suppression of the Kikuyu's Mau Mau revolt in the 1950s was largely accurate. Then she discovered that most of the documentation had been destroyed. She worked through the remaining archives, and conducted 600 hours of interviews with Kikuyu survivors – rebels and loyalists – and British guards, settlers and officials. Her book is fully and thoroughly documented. It won the Pulitzer prize. But as far as Sandbrook, James and other imperial apologists are concerned, it might as well never have been written.

Elkins reveals that the British detained not 80,000 Kikuyu, as the official histories maintain, but almost the entire population of one and a half million people, in camps and fortified villages. There, thousands were beaten to death or died from malnutrition, typhoid, tuberculosis and dysentery. In some camps almost all the children died.

The inmates were used as slave labour. Above the gates were edifying slogans, such as "Labour and freedom" and "He who helps himself will also be helped". Loudspeakers broadcast the national anthem and patriotic exhortations. People deemed to have disobeyed the rules were killed in front of the others. The survivors were forced to dig mass graves, which were quickly filled. Unless you have a strong stomach I advise you to skip the next paragraph.

Interrogation under torture was widespread. Many of the men were anally raped, using knives, broken bottles, rifle barrels, snakes and scorpions. A favourite technique was to hold a man upside down, his head in a bucket of water, while sand was rammed into his rectum with a stick. Women were gang-raped by the guards. People were mauled by dogs and electrocuted. The British devised a special tool which they used for first crushing and then ripping off testicles. They used pliers to mutilate women's breasts. They cut off inmates' ears and fingers and gouged out their eyes. They dragged people behind Land Rovers until their bodies disintegrated. Men were rolled up in barbed wire and kicked around the compound.

Elkins provides a wealth of evidence to show that the horrors of the camps were endorsed at the highest levels. The governor of Kenya, Sir Evelyn Baring, regularly intervened to prevent the perpetrators from being brought to justice. The colonial secretary, Alan Lennox-Boyd, repeatedly lied to the House of Commons. This is a vast, systematic crime for which there has been no reckoning.

No matter. Even those who acknowledge that something happened write as if Elkins and her work did not exist. In the Telegraph, Daniel Hannan maintains that just eleven people were beaten to death. Apart from that, "1,090 terrorists were hanged and as many as 71,000 detained without due process".

The British did not do body counts, and most victims were buried in unmarked graves. But it is clear that tens of thousands, possibly hundreds of thousands, of Kikuyu died in the camps and during the round-ups. Hannan's is one of the most blatant examples of revisionism I have ever encountered.

Without explaining what this means, Lawrence James concedes that "harsh measures" were sometimes used, but he maintains that "while the Mau Mau were terrorising the Kikuyu, veterinary surgeons in the Colonial Service were teaching tribesmen how to deal with cattle plagues." The theft of the Kikuyu's land and livestock, the starvation and killings, the widespread support among the Kikuyu for the Mau Mau's attempt to reclaim their land and freedom: all vanish into thin air. Both men maintain that the British government acted to stop any abuses as soon as they were revealed.

What I find remarkable is not that they write such things, but that these distortions go almost unchallenged. The myths of empire are so well-established that we appear to blot out countervailing stories even as they are told. As evidence from the manufactured Indian famines of the 1870s and from the treatment of other colonies accumulates, British imperialism emerges as no better and in some cases even worse than the imperialism practised by other nations. Yet the myth of the civilising mission remains untroubled by the evidence.