Shared posts

18 Feb 14:17

#6 Dinah the Aspie Dinosaur and the Doctor Appointment

by Dinah



18 Feb 14:16

#10 Dinah the Aspie Dinosaur and the Waiting Room (2)

by Dinah



18 Feb 14:16

#12 Dinah the Aspie Dinosaur and the Library

by Dinah



03 Feb 12:29

Practically-A-Book Review: Dying To Be Free

by Scott Alexander

I am the last person with a right to complain about Internet articles being too long. But if I did have that right, I think I would exercise it on Dying To Be Free, the Huffington Post’s 20,000-word article on the current state of heroin addiction treatment. I feel like it could have been about a quarter the size without losing much.

It’s too bad that most people will probably shy away from reading it, because it gets a lot of stuff really right.

The article’s thesis is also its subtitle: “There’s a treatment for heroin addiction that actually works; why aren’t we using it?” To save you the obligatory introductory human interest story: that treatment is suboxone. Its active ingredient is the drug buprenorphine, which is kind of like a safer version of methadone. Suboxone is slow-acting, gentle, doesn’t really get people high, and is pretty safe as long as you don’t go mixing it with weird stuff. People on suboxone don’t experience opiate withdrawal and have greatly decreased cravings for heroin. I work at a hospital that’s an area leader in suboxone prescription, I’ve gotten to see it in action, and it’s literally a life-saver.

Conventional heroin treatment is abysmal. Rehab centers aren’t licensed or regulated and most have little interest in being evidence-based. Many are associated with churches or weird quasi-religious groups like Alcoholics Anonymous. They don’t necessarily have doctors or psychologists, and some actively mistrust them. All of this I knew. What I didn’t know until reading the article was that – well, it’s not just that some of them try to brainwash addicts. It’s more that some of them try to cargo-cult brainwashing, doing the sorts of things that sound like brainwashing to them without really knowing how brainwashing works, assuming it’s even a coherent goal to aspire to. Their concept of brainwashing is mostly just creating a really unpleasant environment, yelling at people a lot, enforcing intentionally over-strict rules, and in some cases even having struggle-session-type things where everyone in the group sits in a circle, screams at the other patients, and tells them they’re terrible and disgusting. There’s a strong culture of accusing anyone who questions or balks at any of it of just being an addict, or “not really wanting to quit”.

I have no problem with “tough love” when it works, but in this case it doesn’t. Rehab programs make every effort to obfuscate their effectiveness statistics – I blogged about this before in Part II here – but the best guesses by outside observers are that, for a lot of them, about 80% to 90% of their graduates relapse within a couple of years. Even this paints too rosy a picture, because it excludes the people who gave up halfway through.

Suboxone treatment isn’t perfect, and relapse is still a big problem, but it’s a heck of a lot better than most rehabs. Suboxone gives people their dose of opiate and mostly removes the biological half of addiction. There’s still the psychological half of addiction – whatever it was that made people want to get high in the first place – but people have a much easier time dealing with that after the biological imperative to get a new dose is gone. Almost all clinical trials have found treatment with methadone or suboxone to be more effective than traditional rehab. Even Cochrane Review, which is notorious for never giving a straight answer to anything besides “more evidence is needed”, agrees that methadone and suboxone are effective treatments.

Some people stay on suboxone forever and do just fine – it has few side effects and doesn’t interfere with functioning. Other people stay on it until they reach a point in their lives when they feel ready to come off, then taper down slowly under medical supervision, often with good success. It’s a good medication, and the growing suspicion it might help treat depression is just icing on the cake.

There are two big roadblocks to wider use of suboxone, and both are enraging.

The first roadblock is the #@$%ing government. They are worried that suboxone, being an opiate, might be addictive, and so doctors might turn into drug pushers. So suboxone is possibly the most highly regulated drug in the United States. If I want to give out OxyContin like candy, I have no limits but the number of pages on my prescription pad. If I want to prescribe you Walter-White-level quantities of methamphetamine for weight loss, nothing is stopping me but common sense. But if I want to give even a single suboxone prescription to a single patient, I have to take a special course on suboxone prescribing, and even then I am limited to only being able to give it to thirty patients a year (eventually rising to one hundred patients when I get more experience with it). The (generally safe) treatment for addiction is more highly regulated than the (very dangerous) addictive drugs it is supposed to replace. Only 3% of doctors bother to jump through all the regulatory hoops, and their hundred-patient limits get saturated almost immediately. As per the laws of supply and demand, this makes suboxone prescriptions very expensive, and guess what social class most heroin addicts come from? Also, heroin addicts often don’t have access to good transportation, which means that if the nearest suboxone provider is thirty miles from their house they’re out of luck. The List Of Reasons To End The Patient Limits On Buprenorphine expands upon and clarifies some of these points.

(in case you think maybe the government just honestly believes the drug is dangerous – nope. You’re allowed to prescribe without restriction for any reason except opiate addiction)

The second roadblock is the @#$%ing rehab industry. They hear that suboxone is an opiate, and their religious or quasi-religious fanaticism goes into high gear. “What these people need is Jesus and/or their Nondenominational Higher Power, not more drugs! You’re just pushing a new addiction on them! Once an addict, always an addict until they complete their spiritual struggle and come clean!” And so a lot of programs bar suboxone users from participating.

This doesn’t sound so bad given the quality of a lot of the programs. Problem is, a lot of these are closely integrated with the social services and legal system. So suppose somebody’s doing well on suboxone treatment, and gets in trouble for a drug offense. Could be that they relapsed on heroin one time, could be that they’re using something entirely different like cocaine. Judge says go to a treatment program or go to jail. Treatment program says they can’t use suboxone. So maybe they go in to deal with their cocaine problem, and by the time they come out they have a cocaine problem and a heroin problem.

And…okay, time for a personal story. One of my patients is a homeless man who used to have a heroin problem. He was put on suboxone and it went pretty well. He came back with an alcohol problem, and we wanted to deal with that and his homelessness at the same time. There are these organizations called three-quarters houses – think “halfway houses” after inflation – that take people with drug problems and give them an insurance-sponsored place to live. But the catch is you can’t be using drugs. And they consider suboxone to be a drug. So of about half a dozen three-quarters houses in the local area, none of them would accept this guy. I called up the one he wanted to go to, said that he really needed a place to stay, said that without this care he was in danger of relapsing into his alcoholism, begged them to accept. They said no drugs. I said I was a doctor, and he had my permission to be on suboxone. They said no drugs. I said that seriously, they were telling me that my DRUG ADDICTED patient who was ADDICTED TO DRUGS couldn’t go to their DRUG ADDICTION center because he was on a medication for treating DRUG ADDICTION? They said that was correct. I hung up in disgust.

So I agree with the pessimistic picture painted by the article. I think we’re ignoring our best treatment option for heroin addiction and I don’t see much sign that this is going to change in the future.

But the health care system not being very good at using medications effectively isn’t news. I also thought this article was interesting because it touches on some of the issues we discuss here a lot:

The value of ritual and community. A lot of the most intelligent conservatives I know base their conservatism on the idea that we can only get good outcomes in “tight communities” that are allowed to violate modern liberal social atomization to build stronger bonds. The Army, which essentially hazes people with boot camp, ritualizes every aspect of their life, then demands strict obedience and ideological conformity, is a good example. I do sometimes have a lot of respect for this position. But modern rehab programs seem like a really damning counterexample. If you read the article, you will see that these rehabs are trying their best to create a tightly-integrated religiously-inspired community of exactly that sort, and they have abilities to control their members and force their conformity – sometimes in ways that approach outright abuse – that most institutions can’t even dream of. But their effectiveness is abysmal. The entire thing is for nothing. I’m not sure whether this represents a basic failure in the idea of tight communities, or whether it just means that you can’t force them to exist ex nihilo over a couple of months. But I find it interesting.

My love-hate relationship with libertarianism. Also about the rehabs. They’re minimally regulated. There’s no credentialing process or anything. There are many different kinds, each privately led, and low entry costs to creating a new one. They can be very profitable – pretty much any rehab will cost thousands of dollars, and the big-name ones cost much more. This should be a perfect setup for a hundred different models blooming, experimenting, and then selecting for excellence as consumers drift towards the most effective centers. Instead, we get rampant abuse, charlatanry, and uselessness.

On the other hand, when the government rode in on a white horse to try to fix things, all they did was take the one effective treatment, regulate it practically out of existence, then ride right back out again. So I would be ashamed to be taking either the market’s or the state’s side here. At this point I think our best option is to ask the paraconsistent logic people to figure out something that’s neither government nor not-government, then put that in charge of everything.

Society is fixed, biology is mutable. People have tried everything to fix drug abuse. Being harsh and sending drug users to jail. Being nice and sending them to nice treatment centers that focus on rehabilitation. Old timey religion where fire-and-brimstone preachers talk about how Jesus wants them to stay off drugs. Flaky New Age religion where counselors tell you about how drug abuse is keeping you from your true self. Government programs. University programs. Private programs. Giving people money. Fining people money. Being unusually nice. Being unusually mean. More social support. Less social support. This school of therapy. That school of therapy. What works is just giving people a chemical to saturate the brain receptor directly. We know it works. The studies show it works. And we’re still collectively beating our heads against the wall of finding a social solution.

03 Feb 12:24

MANIC STREET PREACHERS – “The Masses Against The Classes”

by Tom

#845, 22nd January 2000

“All over the world I will back the masses against the classes” – William Ewart Gladstone.

Hello, it’s us again. Welcome to Popular. Welcome to 21st century pop music, now fifteen years old and dreadfully teenager-ish in its surly refusal to admit to any pigeonhole you might want to place it in. Putting the pop culture of this century’s first decade into a historical context is an unsatisfying job: it’s wriggly and shapeless. Some would gloomily have it that pop descended into an ahistorical inertia in the 00s, cycling through a tatty parade of old signifiers. Others would point to this tribe or that as keeping its vital spirit alive. From either perspective, trying to grab onto this century’s music through its number one records seems a strange proposition.

Maybe Gladstone can help. His famous placing of bets is no kind of socialist endorsement: he was appealing to his notion of a spirit in “the masses” that transcends factional (class) interest – the surges of support for a noble cause that led, in his eyes, to many of Victorian politics’ grand reforming moments, and overturned any partisan support of particular classes for the status quo. By focusing critically on only the best-selling record of any given moment, I’ve tried to place myself to pick up on as many of pop’s broad-based swells of sentiment as I can. There’s a nagging feeling that those kind of hits – the ones that stick around and define a summer, a winter, or a year – are more genuine and worthy of note than the mayfly one-week wonders that might surround them. But this is misguided. The pop charts have always also been about the classes – a mess of overlapping factions and specialisms that sometimes, somehow, get their message through. And the format of Popular also forces me to pay attention to this jabber of enthusiasms that a smoother history might overrule.

So number ones are a volatile balance of the masses and the classes, and that’s why I like to write about them. Still, though, 2000 is a shit of a year for doing it.

There are forty-two singles to cover, more than any year before or since. This berserk turnover is no accident: let’s remind ourselves of what getting a number one took at the turn of the 00s. In general, a hundred thousand sales would do the job. Pick the right week and you could hit the top on barely half that. Competition for number ones was planned to a degree, with release dates shifting back and forth to give bands with strong fanbases the best chance of a week’s glory. Those fanbases knew exactly what was coming, because singles were released to radio weeks in advance so they could build or mobilise an audience. On the relevant Monday, multiple formats in the main record shops helped fluff performance and ensure a high entry and peak. It was an unromantic business: marketers and fans united in what amounted to a business planning exercise, with all the thrill of a well-ordered Gantt chart.

With hindsight, you notice two things. First, it’s astonishing the charts of the late 90s and early 00s are as representative as they are. There is a ridiculous number of number ones, but no more injustices than usual. Big records have always missed number one, whole styles have been neglected, but this period is no worse for it than any other. The masses remain in full voice.

The second curious thing is that this system, far from being sewn up, was ripe for gaming, vulnerable to the influence of faction. If you had a big enough fanbase, and picked the right week, you could get anything at all to number one. The charts have never been more open to the possibility of pop theatre than in the 00s: it might have been a golden age for wannabe McLarens. But almost nobody took advantage of it. Of course, once bands built the kind of support to make trolling the charts a live possibility, most of them simply couldn’t be arsed any more.

It wasn’t just the innate conservatism of the act with an audience to please, though – the very idea of the charts as something that should be “subverted” seemed to belong to a prior age. One of the things that happened after punk was that the relationship of the underground and the mainstream changed. British psychedelic and prog bands didn’t shun the singles charts quite as much as lazy history might summarise, but there was hardly ever an ideological angle to their occasional visits. Punk, and the Pistols specifically, altered that. The near-miss or spiking of “God Save The Queen” became a feat to emulate, a crime to avenge. By 1980 the charts were highly winnable territory – former or tangential punkers like Adam Ant and Paul Weller going straight in at No.1, then pop itself restaffed by the eager and glorious theorists of the New Pop. A series of peaks – of fondly recalled victories – followed: Paul Morley and Trevor Horn’s tactical conquest with Frankie; 4AD getting M/A/R/R/S to the top; the situationist pop chaos of the KLF; the Battle of Britpop. And all these became rolled up into a general sense of an era when the charts “mattered”. But you got the feeling that to many, they mattered because of this possibility of minor, nose-tweaking shock – the classes winning out against the masses, if only for a week or two – and their day-to-day operation was a mere backdrop to that.

The Manic Street Preachers, on the threshold of the 21st century, are almost the last in this tradition. They had the opportunity – a band with enough fans to do something in that torpid millennial January. They had the motive – a band with a long-standing interest in quixotic pop gestures, and a fan’s love of theatrical subversion: they’d even called one of their videos Leaving The 20th Century, after a Situationist slogan. They also had the method – “Masses Against The Classes” was a limited single release, to be deleted after a week, guaranteeing it a compressed sales burst and a high debut placing.

But did they have the song? And who were they actually aiming it at?

It’s very doubtful that “The Masses Against The Classes” is meant as any kind of coherent statement, even more doubtful that you can parse it as one. The Occam’s Razor interpretation is that they wanted a Number One, saw a way of getting it, and slapped a Chomsky quote at the start as a bit of decorative brand-building and because it tickled Nicky Wire. If you take the perspective that a Cuban flag in HMV was an inherent pop good, and the Manics are fixing the charts to provide an alternative to complacency – then “Masses” works as an unfocused blast of wrath. It’s better – a lot better – than Westlife’s “Seasons In The Sun”. High praise, eh? But let’s offer the band the respect of at least trying to read too much into it all.

The quotes it’s topped and tailed by – Albert Camus bringing up the rear – sit in uneasy relationship. “This country was founded on the principle that the primary role of government is to protect property from the majority, and so it remains.”; “A slave begins by demanding justice, and ends by wanting to wear a crown”. Side by side, Camus’ fatalism makes this a glum pairing: the liberation of property (which a slave, by definition, is) inevitably ends in the re-establishment of government, and the cycle begins again. The title – Gladstone’s invocation of the historical spirit of the masses, magically separated from their economic interests – offers some kind of way out, however mystical. Break the cycle by backing the decent impulses of the masses against the classes. It’s not an analysis I would agree with, but that’s how the salad of sources works together, for me.

That’s the title and the quotes. The actual song, meanwhile, takes a rather different approach: it’s a haters-gonna-hate sneer. “Success is an ugly word / Especially in your tiny world”: A lot of people, it seems, didn’t like the furrowed-brow AOR direction the Manic Street Preachers had taken themselves in for This Is My Truth, Tell Me Yours, and the band, in the grand tradition of successful bands throughout history, interpreted boredom as envy. “The Masses” has them striking a defiant pose. Their grumpy old fans are the petty, factional classes, and their stadium-rockin’ newer ones are the noble masses.

It’s a nasty little record by this reading, but to really get how nasty it is, consider what it sounds like. This angry defense of a change of direction is packaged up in a song that’s a deliberate callback to their very first records. The early Manics single that “Masses” most reminds me of is “You Love Us” – snotty, scrappy, and the kind of audience- and critic-baiting statement of belief that feels terribly 1992 but no less electric for that. “Masses” is determinedly uglier, though, janking and grinding along on its basic rock undercarriage like a car dragging a broken exhaust along a road. The early Manics never sounded quite this loud, either – boosted by compression steroids into a very deliberate kind of rawness, though compared to those early records what’s been gained in power has been lost in swagger. So what we’ve got is a song played in a manner designed to excite the band’s old fans, powered to number one at a time and using a gimmick that calls back to the Manics’ early theatrical streak, but which is actually a brutal dismissal as elitists of the very people most likely to get enthused by those things. Now there’s subversion. Oof.

“I guess at heart I remain some kind of a crinkly English situationist who wants to have his MTV and critique it too. I am reminded of the story of how high priest of situationism Guy Debord rushed over to London from Paris in the 1960s when he heard that a trained guerilla combat unit was ready for his inspection in Ladbroke Grove. He was directed to military headquarters on the All Saints Road where he discovered a young guy watching Match Of The Day on his sofa with a can of McEwan’s Special Export in his hand…. Debord, quite naturally, stalked off in a rage.”
– Steve Beard, Aftershocks: The End Of Style Culture

03 Feb 12:23

Life of Brian's Parody of the Sermon on the Mount in Jesus Films

by Mark Goodacre
The most iconic scene in Life of Brian is, of course, the opening post-credit scene in which Jesus is delivering the Sermon on the Mount to assembled thousands (In Judea. AD 33. Saturday afternoon. At around tea time).  Such is the success of Life of Brian that most of us are now more familiar with the parody than with what is being parodied.  Everyone knows Life of Brian but relatively few are familiar with its source material.  As it happens, the scene is even funnier when viewed against the background of the Jesus films of the 1960s and 1970s.

Geoffrey Burgon's score is a superb pastiche of just what one hears at this point in both King of Kings and Greatest Story Ever Told. After shots of the crowds gathering, we have a clear shot of Jesus (Kenneth Colley) silhouetted against a blue sky.  As the camera pans back from Jesus, and as we get further and further away, we hear Jesus less and less distinctly until we arrive at Brian and his mother Mandy. Mandy cannot hear a word and shouts, "Speak up!"  Iconic lines follow.  "Blessed are the cheesemakers!"  "Did you hear that? Blessed are the Greek!"

The point of the parody is the depiction of the Sermon on the Mount in the epic Jesus films like King of Kings.  Jesus is speaking to a cast of thousands and it is hardly surprising that people cannot hear:


But if you think it would take a lot of projection to speak to that crowd, compare Jesus (Jeffrey Hunter) about to give the Sermon on the Mount in King of Kings:


Jesus is so far in the distance in that shot that you can hardly see him.  Here is a little help:


It is not very different in the Greatest Story Ever Told in which Jesus (Max Von Sydow) gives his sermon to a group of disciples arranged around him in a circle, with a crowd listening at greater distance (imitating Matt. 5.1-2 and 7.28-29), and the vast landscape of Utah visible in the background:


It's not easy imagining being able to hear a word Jesus said from that kind of distance.  But take a look also at the way that Kenneth Colley is presented in close up against the blue sky in Life of Brian:


The similarity with the close-up of Jeffrey Hunter in King of Kings, who is similarly preaching against the backdrop of blue sky, is clear:


The colour in King of Kings is exquisite.  Jesus wears this unusual but rather striking red outer garment only during the Sermon on the Mount sequence in the film, and Ray makes sure to accentuate the contrast with the luscious, cloudless blue sky. 

But this draws to our attention the fact that while there are real similarities between the King of Kings sermon and the Life of Brian sermon that parodies it, there is one quite noticeable difference.  Kenneth Colley in Life of Brian looks nothing like Jeffrey Hunter or Max Von Sydow.  Why is that?  Colley in fact looks similar to Robert Powell in Jesus of Nazareth. And in 1979, when Life of Brian was released, Jesus of Nazareth was a very recent memory.  In fact, as Matt Page reminds us, Life of Brian even used some of the same sets that were used by Jesus of Nazareth out in Tunisia.  

Jesus of Nazareth does not feature a classic Sermon on the Mount scene, though it does repeatedly feature the teaching from the Sermon, and it has one scene in which Jesus gives both the beatitudes and the Lord's Prayer to a gathered multitude relatively late in the production:


The composition here, with its gorgeous oranges and browns, is quite unlike the Life of Brian and King of Kings sermons with their blue sky mountain shots, but Robert Powell's Jesus provides a close analogue to the few seconds we see of Kenneth Colley's Jesus.  Here is Colley:


His long dark hair and beard, and the arrangement of his garments, are just like Powell's:


The sermon in Life of Brian thus parodies not only the scope and grandeur of King of Kings and Greatest Story, but also the very look of the Jesus most familiar to viewers in the late 1970s, Robert Powell in Jesus of Nazareth.

If you need a refresher, here is the scene from Life of Brian:



03 Feb 11:15

This Year's Infallible Super Bowl Prediction

by evanier

Right again!

03 Feb 11:15

This Year's Infallible Super Bowl Prediction

by evanier

I'm not watching.

I predict that every year and so far, I've never been wrong.

03 Feb 11:01

Natalie Bennett should beware the sharp elbows of Caroline Lucas

by Jonathan Calder
I have a soft spot for Natalie Bennett. She and I both used to be members of the team that took turns to host the Britblog Roundup.

But her performance on the Sunday Politics the other week has led some to wonder whether the Green Party would do better to field Caroline Lucas in the leaders' debates later this year.

And Natalie should be careful, because Caroline Lucas has sharp elbows and previous form.

Brighton Pavilion was the Greens' best prospect of a gain at the last general election largely because a local activist called Keith Taylor had built it up.

Then in 2007 Lucas descended on the town, professed her "admiration and respect" for Taylor and took the nomination from him.
If she expresses her admiration and respect for the Greens' current leader, Natalie really will be in trouble.

03 Feb 10:34

Outside the Government Final: The Five(ish) Doctors Reboot

by noreply@blogger.com (Philip Sandifer)
Given that The Day of the Doctor was massively successful and immediately catapulted to the top of the “best Doctor Who stories ever” list, it is perhaps no small thing that it was not actually the most beloved piece of the fiftieth anniversary. That honor, of course, goes to the spectacle of Steven Moffat visibly wishing he was sitting in the chair from Terror of the Autons during the botched satellite link-up with One Direction during the BBC Three “afterparty.” But second to the single most sublime moment of television that Steven Moffat has ever provided the world was Peter Davison’s The Five(ish) Doctors Reboot.

It is worth stating, up front, that this is a genuinely sweet and delightful piece of television. It is routinely hilarious, from the brilliant use of the Shada clip to cover Tom Baker’s lack of involvement to the Peter Jackson/Ian McKellen cameo to Sylvester McCoy’s beautiful delivery of “I’d like to go home now” to Steven Moffat playing with action figures in his office to… I mean, this paragraph could go on for quite a bit, couldn’t it? This is a joyous thing chock full of charming moments. 

In some ways, actually, that’s the headline. Colin Baker has said in at least one interview that he genuinely was hurt not to be included in Day of the Doctor, especially given that Tom Baker was. This is an old wound, of course - the same one that flared twenty years earlier over the disparity of emphasis in The Dark Dimension. But it’s honestly difficult to imagine that Davison and McCoy didn’t have similar emotions. Of course they were all jealous of Baker and McGann for getting to be involved. Wouldn’t anyone be?

And yet this exists, full of warmth and humor and good feelings. On, to be clear, all sides. Moffat is there to send himself up as a disdainful and cantankerous man who has it in for any part of the series he didn’t create. Because, of course, he’d have loved to do The Thirteen Doctors as much as Colin Baker would have loved to be in it. Nobody didn’t want to see all of the classic series Doctors. And perhaps more to the point, in skewering himself so thoroughly, Moffat is tacitly making it clear that he understands Davison, Baker, and McCoy’s disappointments, and, more to the point, is sympathizing with them.

One might, of course, fairly ask why he didn’t just give them parts in the special then. But the reason for that is straightforward: the special still had to, at the end of the day, work as an episode of television and as a narrative. And the available options for trying to craft a narrative featuring all of the Doctors were all, in their own ways, unappealing. Seventy-five minutes is not nearly enough to handle eight leads with their own plot threads - an ensemble of that size needs far more room to breathe. And a chain of cameo appearances would quickly grow dull. It’s an absolute delight when Tom Baker shows up on screen. But if Tom Baker’s appearance were the first (or last) in a chain that went on through the list, it would swiftly become a tedious exercise in box ticking. The strictly episodic nature of that plot is, frankly, dull - just ask anyone who’s read The Eight Doctors. Or imagine the “Doctor’s reward” segment of The End of Time stretched out for an hour. Either will do.

On top of that, once you start going for completism, the necessary gaps become all the more visible. Suddenly you need to come up with a reason why you’re stopping at Tom Baker. Already The Day of the Doctor tangibly had Eccleston’s absence hanging over it, both in the conspicuousness of John Hurt and in the obvious but totally ignored question of why the Moment apparently deemed the Doctor’s next incarnation inadequate for the task of preventing the Doctor from using it. Adding more Doctors just makes the ones who aren’t there more conspicuous, much like Tom Baker’s absence quietly dominates The Five Doctors. And, of course, there’s the passage of time - the fact that casting Davison, Baker, and McCoy as the Fifth through Seventh Doctors in 2013 would look like what it was: old men dressing like they’re thirty years younger than they are. Indeed, the fact that this is what happens if you put Davison, Baker, and McCoy in their Doctor costumes (or at least, in their coats thrown over the souvenir t-shirt versions of their costumes) is one of the central jokes of The Five(ish) Doctors Reboot.

Because, of course, the truth is that we didn’t all want the classic Doctors to be cast in the 50th anniversary special. What we wanted was for it to simultaneously be 1988, 1985, 1982, 1976, and for that matter 1973, 1968, and 1964, and to have the Doctors of our youth reborn on screen in all their glory. We wanted the past to still be accessible. We wanted a show about time travel to be capable of it. We wanted an anniversary special that was boundlessly long, and consisted not of cameos but of whole new seasons of all of our favorites. We wanted everything. And instead we live in a world with time, and history, and death. 

And the reality is that Peter Davison, Colin Baker, and Sylvester McCoy all know and understand that. They get that Moffat’s job is to create the best Doctor Who he possibly can in 2013, and that at the end of the day he has to go with his creative judgment because he’s the man at the keyboard. They’re all professionals and gentlemen. They understand how art is made, and that the moment you start trying to do something you don’t really believe in is the moment you’re lost. And that’s ultimately what The Five(ish) Doctors Reboot is: everyone involved getting together in a big group hug to show that they understand that this is how it works, and that they’re all still friends.

Which brings us to the most barbed gag in the entire thing - when McCoy stops and asks why they’re doing this, noting, “I’ve travelled twelve thousand miles to get here, I’m in breach of contract, my film career’s in tatters, and for what,” prompting Baker to ask what the point of this mad folly is. At which point Davison, after a moment’s thought, says, “for the fans,” an answer everyone readily accepts, dismissing any doubts they had about the wisdom of this.

It works on many levels. Part of the joke of the whole piece, after all, is the underlying egotism of their desire to be in the 50th. “For the fans,” then, is used, as it often is, to make a show of altruism around an altogether more selfish act. This is just as much for the fans as convention appearances and Big Finish audios are, which is to say that the fans are a lovely supplement to a paycheck. Indeed, the fact that what is “for the fans” is more often for the fans’ disposable income goes a long way towards explaining the vast number of stunning misadventures undertaken in the fans’ names. Of course Colin Baker will break into the Roath Lock Studios for the fans. He already did Attack of the Cybermen for them, after all.

But it’s also the case that “the fans” serve as the entire reason The Five(ish) Doctors Reboot exists, and not in an entirely pleasant way. While it is largely about everyone lampooning their own pretensions and ego, the occasion for this lampooning only exists because there are people who are absolutely ludicrously invested in their sense of what The Day of the Doctor should have been. Colin Baker may have admitted to hurt feelings in an interview, but portions of Doctor Who fandom were absolutely seething at the lack of classic Doctors, and were happy to accuse Moffat of egregious disrespect. Just as they were perfectly happy to get angry at him for failing to persuade Eccleston to come back, and, for that matter, perfectly happy to get angry at Eccleston for not understanding that playing the Doctor is actually a life sentence and not, as he might have imagined on the basis of his contract, a nine month gig. The number of genuinely ugly sentiments about this topic that “the fans” have expressed is vast, and while “not all fans” and even “not most fans” are absolutely accurate and solid defenses, the toxic dickheads have their effect.

And yet somehow the inherent ugliness that fuels this doesn’t detract from its fundamental charm. Yes, this exists because of a multi-layered and toxic culture of entitlement and arrogance that, ultimately, absolutely nobody involved with this is actually innocent of participation in, the audience included. And yet it’s fun. It’s a bunch of people from across fifty years of history throwing a party to celebrate both that history and the way in which that history has delivered the present day. And the sheer love of that history and of Doctor Who comes through in every scene and every time that people send up their own pretensions and rough edges. And that’s perhaps the thing that’s worth pointing out. Yes, everybody who does something like this “for the fans” is at the end of the day really doing it for themselves. But then, they’re the fans too, for better and for worse.


But in the case of The Five(ish) Doctors Reboot, pretty much entirely for better. 
03 Feb 09:04

You don’t need to vote for the Elite Consensus Party, because they’ll win anyway

by Nick

A couple of snippets from the Sundays to help you understand how British politics works today:

First, the Telegraph is very eager to tell us that Liz Kendall has suddenly emerged as the favourite for a Labour leadership race that may or may not be taking place at some unspecified point in the future. How has she achieved this impressive, yet somewhat nebulous, feat? Some mass mobilisation of Labour members? A series of impressive performances in the House of Commons? Perhaps she’s set out some important new ideas for the future of the Labour Party? Maybe it’s through a long period of helping and campaigning for other MPs?

No, it’s because ‘she gave an interview to House magazine saying that for the NHS “What matters is what works”’. Yes, reusing some old bit of Blairite managerialist pabulum – what matters is ‘what works’, ignoring that ‘what works’ is determined by whoever decides what ‘working’ means – is enough to catapult you into the lead. It’s definitely not that she’s stressed the importance of private healthcare in the NHS while her old boss works for Alliance Boots at the same time its chairman is attacking Ed Miliband. What would the Telegraph have to gain by promoting someone who’d push Labour back towards the managerialist centre?

Meanwhile, over at the Independent, John Rentoul’s remarkable career as a political commentator who doesn’t understand the concept of ideology continues apace. Today, he’s setting out just why the country needs a Labour government, but headed by David Cameron. It’s the sort of centrist managerialist fantasy one would expect from an arch-Blairite, yet again stressing that what is important in a politician isn’t believing in something but being possessed of some nebulous form of ‘competence’. Like ‘what works’, ‘competence’ is purely in the eye of the beholder, and the beholder is normally a well-paid newspaper columnist or editor assessing who they’ve been told good things about.

(Incidentally, although I’ve said I don’t think a grand coalition is likely after the next election, the number of times I’ve seen people in the press suggesting it does make me wonder if the ground is being prepared for one, just in case it turns out to be necessary)

What both these articles represent, though, is the cosy consensuses that dominate British politics. Stick firmly within the lines of what’s acceptable within the elite consensus and you’ll be praised to the skies for your competence and put forward as possible leadership material. Question it, or stray outside the mainstream consensus for just a little bit, and you’ll be a maverick on the fringes, and you definitely won’t get the media singing your praises. It’s a lot easier for everyone in the elite consensus if politics is just a matter of deciding between competing managerial visions without letting any of that horrible ideology get in there. It’s why the memory of Blairism lingers so much amongst the commentariat: things were easier then, when all you had to worry about was ‘what works’, not what anyone might actually want.

03 Feb 09:03

Hope and the politics of the future

by Nick

This was the future, once.

Amidst all the coverage of the Greek election last week, one line from a report stood out for me:

The ruling New Democracy party is still wondering how its platform of Endless Suffering For Everyone was defeated by Syriza’s competing message of Maybe Not That.

Yes, it’s from the Mash, but as so often one line of satire gets closer to the truth than thousands of pieces of punditry. When traditional politics and traditional parties neglect to offer a positive vision of the future, there’s a natural appeal to anyone who can offer something that resembles a vision. Even if it’s just ‘Maybe Not That’, it’s much more appealing than offering people nothing more than the status quo, perhaps with slightly better gadgets.

This links to what I was saying yesterday – if all that mainstream politics can offer is a red, blue or yellow-tinged version of the elite consensus, and that consensus doesn’t offer a positive vision of the future, then why are we surprised that people are looking for alternatives?

What we don’t have, and what no one seems to be offering in the upcoming election, is a positive vision of the future. David Cameron has been touting the ultimate uninspiring managerialist vision of ‘Britain living within its means’, while Ed Miliband offers ‘maybe that, but not quite that’ and Nick Clegg promises ‘that or that, but perhaps slightly less of it’. Meanwhile, when confronted with any vision of the future, Nigel Farage runs screaming to the past and while the Greens at least acknowledge the future is likely to be radically different, their vision for it is short of hope.

We used to dream about the future. Yes, there were things in those dreams that were never going to happen like flying cars, jet packs and double-digit Jaws sequels, but there was hope in those visions. We had futures where the whole wealth of world knowledge was at our fingertips, instant worldwide communication was simple and the need to work was reduced or even eliminated by robotics and automation. We’ve got those, but the world we live in now resembles a cyberpunk dystopia rather than the utopian dreams of the future we had.

What we need is to reclaim and reinvent the idea that the future can be better and different than today. Our politics and culture aren’t offering that vision anymore, instead retreating into expecting tomorrow to look much like today and offering purely managerial solutions to try and keep things running much the way they have been. The problem, I think, is that even if people can’t articulate it, they know that vision doesn’t work. It might not be as obvious in Britain as it is in somewhere like Greece or Spain, but keeping things the same equates to keeping everyone at the same level of insecurity they’re already feeling. That’s not a future anyone would want to sign up for. Merely offering people endless workism from now until the end of time isn’t a vision, it’s a punishment for sins we never committed.

This is why I think ideas like basic income are important. As Paul Mason explains here, it’s too often seen through the prism of our current system, not as a transformative idea that would change the way our society works. It’s getting back to those old visions of the future, where technology has made fulfilling our basic needs – food, clothing, shelter, heat, light – such a simple task that they’re available to everyone without question or effort, just something you get by dint of being you. Instead, we often end up discussing it in terms of how we’d implement it through current systems as though they’re the only way of doing things.

The future doesn’t have to be about basic income, but I think there is a yearning out there for someone, be it politician or artist, who can provide a vision of something different and better, a future that we hope will come about, not one that we dread finally coming to pass. We have occasional moments when we recognise the importance of hope – even the audacity of it – but then we forget about it, or think that what people hope for is more management and more targets to regulate their lives. That might be some people’s vision of the future – a human finger, clicking off the items on an assessment checklist, forever – but surely we can come up with something better?

03 Feb 09:03

The Greens, Citizens Income and how journalists still don’t understand how political parties work

by Nick

After it flared up into media prominence over the last week, the Telegraph today eagerly covered the news that the Green Party won’t be including Citizens Income as a policy in their General Election manifesto.

However, there seems to be a problem with that news: it’s not true. Reading an account from a Green Party member, it seems that the party’s conference has insisted that the policy is included in the manifesto, and the Telegraph’s report is merely extrapolating wildly from some comments by Caroline Lucas. The member’s account suggests that she has opposed the inclusion of it in the manifesto, but even with that news, the Telegraph appears to be stretching her words. It reports that she said:

“The citizens’ income is not going to be in the 2015 general election manifesto as something to be introduced on May 8th. It is a longer term aspiration; we are still working on it,”

The key point they’re not factoring into their story is ‘as something to be introduced on May 8th’, instead focusing on the first part of the sentence. Let’s be honest, I don’t think even the most hardened supporter of a basic income scheme thinks it could be introduced quickly, and it helps to show the ignorance of reporters who believe that is the case.

However, I think this comes back to the point I made a couple of weeks ago about how journalists don’t understand how policy making within parties actually works. As someone with experience of seeing similar things in the Lib Dems, it’s almost pleasant to see another party being similarly misunderstood. Journalists like to believe that all political parties are run from the top down, not the bottom up, and of course ‘senior party figures’ are always happy to encourage this impression. So, when Caroline Lucas says something (and it’s misheard) it’s easy for them to leap to ‘the party has changed its policy!’ rather than ‘hmm, better check that for accuracy.’

It does make me think about the Iron Law of Oligarchy – the idea that all political organisations will progress from democracy to oligarchy over time – and whether the media have a role in encouraging and fostering that process. Could one even argue that social pressures and the expectation that an organisation will be run from the top are as much a pressure making it happen as the role of bureaucracy concentrating power in the organisation? Something else to add to the list of things I need to think about and write about some more…

02 Feb 14:20

The Parable Of The Talents

by Scott Alexander

[Content note: scrupulosity and self-esteem triggers, IQ, brief discussion of weight and dieting. Not good for growth mindset.]

I.

I sometimes blog about research into IQ and human intelligence. I think most readers of this blog already know IQ is 50% to 80% heritable, and that it’s so important for intellectual pursuits that eminent scientists in some fields have average IQs around 150 to 160. Since IQ this high only appears in 1/10,000 people or so, it beggars coincidence to believe this represents anything but a very strong filter for IQ (or something correlated with it) in reaching that level. If you saw a group of dozens of people who were 7’0 tall on average, you’d assume it was a basketball team or some other group selected for height, not a bunch of botanists who were all very tall by coincidence.
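
Not from the original post, just an illustration of the rarity arithmetic behind that claim: a minimal sketch, assuming IQ scores follow the conventional normal distribution with mean 100 and standard deviation 15, of how rare scores in the 150–160 range are.

```python
# Hypothetical illustration (not from the post): how rare are IQs of 150-160,
# assuming scores are normally distributed with mean 100 and SD 15?
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)

for threshold in (150, 155, 160):
    tail = 1 - iq.cdf(threshold)  # fraction of the population at or above this score
    print(f"IQ >= {threshold}: roughly 1 in {round(1 / tail):,}")

# Prints roughly 1 in 2,300 (150), 1 in 8,100 (155), and 1 in 31,600 (160) --
# broadly consistent with the "1/10,000 people or so" order of magnitude above.
```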

A lot of people find this pretty depressing. Some worry that taking it seriously might damage the “growth mindset” people need to fully actualize their potential. This is important and I want to discuss it eventually, but not now. What I want to discuss now is people who feel personally depressed. For example, a comment from last week:

I’m sorry to leave such a self-absorbed comment, but reading this really upset me and I just need to get this off my chest…How is a person supposed to stay sane in a culture that prizes intelligence above everything else – especially if, as Scott suggests, Human Intelligence Really Is the Key to the Future – when they themselves are not particularly intelligent and, apparently, have no potential to ever become intelligent? Right now I basically feel like pond scum.

I hear these kinds of responses every so often, so I should probably learn to expect them. I never do. They seem to me precisely backwards. There’s a moral gulf here, and I want to throw stories and intuitions at it until enough of them pile up at the bottom to make a passable bridge. But first, a comparison:

Some people think body weight is biologically/genetically determined. Other people think it’s based purely on willpower – how strictly you diet, how much you can bring yourself to exercise. These people get into some pretty acrimonious debates.

Overweight people, and especially people who feel unfairly stigmatized for being overweight, tend to cluster on the biologically determined side. And although not all believers in complete voluntary control of weight are mean to fat people, the people who are mean to fat people pretty much all insist that weight is voluntary and easily changeable.

Although there’s a lot of debate over the science here, there seems to be broad agreement on both sides that the more compassionate, sympathetic, progressive position, the position promoted by the kind of people who are really worried about stigma and self-esteem, is that weight is biologically determined.

And the same is true of mental illness. Sometimes I see depressed patients whose families really don’t get it. They say “Sure, my daughter feels down, but she needs to realize that’s no excuse for shirking her responsibilities. She needs to just pick herself up and get on with her life.” On the other hand, most depressed people say that their depression is more fundamental than that, not a thing that can be overcome by willpower, certainly not a thing you can just ‘shake off’.

Once again, the compassionate/sympathetic/progressive side of the debate is that depression is something like biological, and cannot easily be overcome with willpower and hard work.

One more example of this pattern. There are frequent political debates in which conservatives (or straw conservatives) argue that financial success is the result of hard work, so poor people are just too lazy to get out of poverty. Then a liberal (or straw liberal) protests that hard work has nothing to do with it, success is determined by accidents of birth like who your parents are and what your skin color is et cetera, so the poor are blameless in their own predicament.

I’m oversimplifying things, but again the compassionate/sympathetic/progressive side of the debate – and the side endorsed by many of the poor themselves – is supposed to be that success is due to accidents of birth, and the less compassionate side is that success depends on hard work and perseverance and grit and willpower.

The obvious pattern is that attributing outcomes to things like genes, biology, and accidents of birth is kind and sympathetic. Attributing them to who works harder and who’s “really trying” can stigmatize people who end up with bad outcomes and is generally viewed as Not A Nice Thing To Do.

And the weird thing, the thing I’ve never understood, is that intellectual achievement is the one domain that breaks this pattern.

Here it’s would-be hard-headed conservatives arguing that intellectual greatness comes from genetics and the accidents of birth and demanding we “accept” this “unpleasant truth”.

And it’s would-be compassionate progressives who are insisting that no, it depends on who works harder, claiming anybody can be brilliant if they really try, warning us not to “stigmatize” the less intelligent as “genetically inferior”.

I can come up with a few explanations for the sudden switch, but none of them are very principled and none of them, to me, seem to break the fundamental symmetry of the situation. I choose to maintain consistency by preserving the belief that overweight people, depressed people, and poor people aren’t fully to blame for their situation – and neither are unintelligent people. It’s accidents of birth all the way down. Intelligence is mostly genetic and determined at birth – and we’ve already determined in every other sphere that “mostly genetic and determined at birth” means you don’t have to feel bad if you got the short end of the stick.

Consider for a moment Srinivasa Ramanujan, one of the greatest mathematicians of all time. He grew up in poverty in a one-room house in small-town India. He taught himself mathematics by borrowing books from local college students and working through the problems on his own until he reached the end of the solvable ones and had nowhere else to go but inventing ways to solve the unsolvable ones.

There are a lot of poor people in the United States today whose life circumstances prevented their parents from reading books to them as a child, prevented them from getting into the best schools, prevented them from attending college, et cetera. And pretty much all of those people still got more educational opportunities than Ramanujan did.

And from there we can go in one of two directions. First, we can say that a lot of intelligence is innate, that Ramanujan was a genius, and that we mortals cannot be expected to replicate his accomplishments.

Or second, we can say those poor people are just not trying hard enough.

Take “innate ability” out of the picture, and if you meet a poor person on the street begging for food, saying he never had a chance, your reply must be “Well, if you’d just borrowed a couple of math textbooks from the local library at age 12, you would have been a Fields Medalist by now. I hear that pays pretty well.”

The best reason not to say that is that we view Ramanujan as intellectually gifted. But the very phrase tells us where we should classify that belief. Ramanujan’s genius is a “gift” in much the same way your parents giving you a trust fund on your eighteenth birthday is a “gift”, and it should be weighted accordingly in the moral calculus.

II.

I shouldn’t pretend I’m worried about this for the sake of the poor. I’m worried for me.

My last IQ-ish test was my SATs in high school. I got a perfect score in Verbal, and a good-but-not-great score in Math.

And in high school English, I got A++s in all my classes, Principal’s Gold Medals, 100%s on tests, first prize in various state-wide essay contests, etc. In Math, I just barely by the skin of my teeth scraped together a pass in Calculus with a C-.

Every time I won some kind of prize in English my parents would praise me and say I was good and should feel good. My teachers would hold me up as an example and say other kids should try to be more like me. Meanwhile, when I would bring home a report card with a C- in math, my parents would have concerned faces and tell me they were disappointed and I wasn’t living up to my potential and I needed to work harder et cetera.

And I don’t know which part bothered me more.

Every time I was held up as an example in English class, I wanted to crawl under a rock and die. I didn’t do it! I didn’t study at all, half the time I did the homework in the car on the way to school, those essays for the statewide competition were thrown together on a lark without a trace of real effort. To praise me for any of it seemed and still seems utterly unjust.

On the other hand, to this day I believe I deserve a fricking statue for getting a C- in Calculus I. It should be in the center of the schoolyard, and have a plaque saying something like “Scott Alexander, who by making a herculean effort managed to pass Calculus I, even though they kept throwing random things after the little curly S sign and pretending it made sense.”

And without some notion of innate ability, I don’t know what to do with this experience. I don’t want to have to accept the blame for being a lazy person who just didn’t try hard enough in Math. But I really don’t want to have to accept the credit for being a virtuous and studious English student who worked harder than his peers. I know there were people who worked harder than I did in English, who poured their heart and soul into that course – and who still got Cs and Ds. To deny innate ability is to devalue their efforts and sacrifice, while simultaneously giving me credit I don’t deserve.

Meanwhile, there were some students who did better than I did in Math with seemingly zero effort. I didn’t begrudge those students. But if they’d started trying to say they had exactly the same level of innate ability as I did, and the only difference was they were trying while I was slacking off, then I sure as hell would have begrudged them. Especially if I knew they were lazing around on the beach while I was poring over a textbook.

I tend to think of social norms as contracts bargained between different groups. In the case of attitudes towards intelligence, those two groups are smart people and dumb people. Since I was both at once, I got to make the bargain with myself, which simplified the bargaining process immensely. The deal I came up with was that I wasn’t going to beat myself up over the areas I was bad at, but I also didn’t get to become too cocky about the areas I was good at. It was all genetic luck of the draw either way. In the meantime, I would try to press as hard as I could to exploit my strengths and cover up my deficiencies. So far I’ve found this to be a really healthy way of treating myself, and it’s the way I try to treat others as well.

III.

The theme continues to be “Scott Relives His Childhood Inadequacies”. So:

When I was 6 and my brother was 4, our mom decided that as an Overachieving Jewish Mother she was contractually obligated to make both of us learn to play piano. She enrolled me in a Yamaha introductory piano class, and my younger brother in a Yamaha ‘cute little kids bang on the keyboard’ class.

A little while later, I noticed that my brother was now with me in my Introductory Piano class.

A little while later, I noticed that my brother was now by far the best student in my Introductory Piano Class, even though he had just started and was two or three years younger than anyone else there.

A little while later, Yamaha USA flew him to Japan to show him off before the Yamaha corporate honchos there.

Well, one thing led to another, and right now if you Google my brother’s name you get a bunch of articles like this one:

The evidence that Jeremy [Alexander] is among the top jazz pianists of his generation is quickly becoming overwhelming: at age 26, Alexander is the winner of the Nottingham International Jazz Piano Competition, a second-place finisher in the Montreux Jazz Festival Solo Piano Competition, a two-time finalist for the American Pianist Association’s Cole Porter Fellowship, and a two-time second-place finisher at the Phillips Jazz Competition. Alexander, who was recently named a Professor of Piano at Western Michigan University’s School of Music, made a sold-out solo debut at Carnegie Hall in 2012, performing Debussy’s Etudes in the first half and jazz improvisations in the second half.

Meanwhile, I was always a mediocre student at Yamaha. When the time came to try an instrument in elementary school, I went with the violin to see if maybe I’d find it more to my tastes than the piano. I was quickly sorted into the remedial class because I couldn’t figure out how to make my instrument stop sounding like a wounded cat. After a year or so of this, I decided to switch to fulfilling my music requirement through a choir, and everyone who’d had to listen to me breathed a sigh of relief.

Every so often I wonder if somewhere deep inside me there is the potential to be “among the top musicians of my generation.” I try to recollect whether my brother practiced harder than I did. My memories are hazy, but I don’t think he practiced much harder until well after his career as a child prodigy had taken off. The cycle seemed to be that every time he practiced, things came fluidly to him and he would produce beautiful music and everyone would be amazed. And this must have felt great, and incentivized him to practice more, and that made him even better, so that the beautiful music came even more fluidly, and the praise became more effusive, until eventually he chose a full-time career in music and became amazing. Meanwhile, when I started practicing it always sounded like wounded cats, and I would get very cautious praise like “Good job, Scott, it sounded like that cat was hurt a little less badly than usual,” and it made me frustrated, and want to practice less, which made me even worse, until eventually I quit in disgust.

On the other hand, I know people who want to get good at writing, and make a mighty resolution to write two hundred words a day every day, and then after the first week they find it’s too annoying and give up. These people think I’m amazing, and why shouldn’t they? I’ve written a few hundred to a few thousand words pretty much every day for the past ten years.

But as I’ve said before, this has taken exactly zero willpower. It’s more that I can’t stop even if I want to. Part of that is probably that when I write, I feel really good about having expressed exactly what it was I meant to say. Lots of people read it, they comment, they praise me, I feel good, I’m encouraged to keep writing, and it’s exactly the same virtuous cycle as my brother got from his piano practice.

And so I think it would be too easy to say something like “There’s no innate component at all. Your brother practiced piano really hard but almost never writes. You write all the time, but wimped out of practicing piano. So what do you expect? You both got what you deserved.”

I tried to practice piano as hard as he did. I really tried. But every moment was a struggle. I could keep it up for a while, and then we’d go on vacation, and there’d be no piano easily available, and I would be breathing a sigh of relief at having a ready-made excuse, and he’d be heading off to look for a piano somewhere to practice on. Meanwhile, I am writing this post in short breaks between running around hospital corridors responding to psychiatric emergencies, and there’s probably someone very impressed with that, someone saying “But you had such a great excuse to get out of your writing practice!”

I dunno. But I don’t think of myself as working hard at any of the things I am good at, in the sense of “exerting vast willpower to force myself kicking and screaming to do them”. It’s possible I do work hard, and that an outside observer would accuse me of eliding how hard I work, but it’s not a conscious elision and I don’t feel that way from the inside.

Ramanujan worked very hard at math. But I don’t think he thought of it as work. He obtained a scholarship to the local college, but dropped out almost immediately because he couldn’t make himself study any subject other than math. Then he got accepted to another college, and dropped out again because they made him study non-mathematical subjects and he failed a physiology class. Then he nearly starved to death because he had no money and no scholarship. To me, this doesn’t sound like a person who just happens to be very hard-working; if he had the ability to study other subjects he would have, for no reason other than that it would have allowed him to stay in college so he could keep studying math. It seems to me that in some sense Ramanujan was incapable of putting hard work into non-math subjects.

I really wanted to learn math and failed, but I did graduate with honors from medical school. Ramanujan really wanted to learn physiology and failed, but he did become one of history’s great mathematicians. So which one of us was the hard worker?

People used to ask me for writing advice. And I, in all earnestness, would say “Just transcribe your thoughts onto paper exactly like they sound in your head.” It turns out that doesn’t work for other people. Maybe it doesn’t work for me either, and it just feels like it does.

But you know what? When asked about one of his discoveries, a method of simplifying a very difficult problem to a continued fraction, Ramanujan described his thought process as: “It is simple. The minute I heard the problem, I knew that the answer was a continued fraction. ‘Which continued fraction?’ I asked myself. Then the answer came to my mind”.

And again, maybe that’s just how it feels to him, and the real answer is “study math so hard that you flunk out of college twice, and eventually you develop so much intuition that you can solve problems without thinking about them.”

(or maybe the real answer is “have dreams where obscure Hindu gods appear to you as drops of blood and reveal mathematical formulae”. Ramanujan was weird).

But I still feel like there’s something going on here where the solution to me being bad at math and piano isn’t just “sweat blood and push through your brain’s aversion to these subjects until you make it stick”. When I read biographies of Ramanujan and other famous mathematicians, there’s no sense that they ever had to do that with math. When I talk to my brother, I never get a sense that he had to do that with piano. And if I am good enough at writing to qualify to have an opinion on being good at things, then I don’t feel like I ever went through that process myself.

So this too is part of my deal with myself. I’ll try to do my best at things, but if there’s something I really hate, something where I have to go uphill every step of the way, then it’s okay to admit mediocrity. I won’t beat myself up for not forcing myself kicking and screaming to practice piano. And in return I won’t become too cocky about practicing writing a lot. It’s probably some kind of luck of the draw either way.

IV.

I said before that this wasn’t just about poor people, it was about me being selfishly worried for my own sake. I think I might have given the mistaken impression that I merely need to justify to myself why I can’t get an A in math or play the piano. But it’s much worse than that.

The rationalist community tends to get a lot of high-scrupulosity people, people who tend to beat themselves up for not doing more than they are. It’s why I push giving 10% to charity, not as some kind of amazing stretch goal that we need to guilt people into doing, but as a crutch, a sort of “don’t worry, you’re still okay if you only give ten percent”. It’s why there’s so much emphasis on “heroic responsibility” and how you, yes you, have to solve all the world’s problems personally. It’s why I see red when anyone accuses us of entitlement, since it goes about as well as calling an anorexic person fat.

And we really aren’t doing ourselves any favors. For example, Nick Bostrom writes:

Searching for a cure for aging is not just a nice thing that we should perhaps one day get around to. It is an urgent, screaming moral imperative. The sooner we start a focused research program, the sooner we will get results. It matters if we get the cure in 25 years rather than in 24 years: a population greater than that of Canada would die as a result.

If that bothers you, you definitely shouldn’t read Astronomical Waste.

Yet here I am, not doing anti-aging research. Why not?

Because I tried doing biology research a few times and it was really hard and made me miserable. You know how in every science class, when the teacher says “Okay, pour the white chemical into the grey chemical, and notice how it turns green and begins to bubble,” there’s always one student who pours the white chemical into the grey chemical, and it just forms a greyish-white mixture and sits there? That was me. I hated it, I didn’t have the dexterity or the precision of mind to do it well, and when I finally finished my required experimental science classes I was happy never to think about it again. Even the abstract intellectual part of it – the one where you go through data about genes and ligands and receptors in supercentenarians and shake it until answers come out – requires exactly the kind of math skills that I don’t have.

Insofar as this is a matter of innate aptitude – some people are cut out for biology research and I’m not one of them – all is well, and my decision to get a job I’m good at instead is entirely justified.

But insofar as there’s no such thing as innate aptitude, just hard work and grit – then by not being gritty enough, I’m a monster who’s complicit in the death of a population greater than that of Canada.

Insofar as there’s no such thing as innate aptitude, I have no excuse for not being Aubrey de Grey. Or if Aubrey de Grey doesn’t impress you much, Norman Borlaug. Or if you don’t know who either of those two people are, Elon Musk.

I once heard a friend, upon his first use of modafinil, wonder aloud if the way they felt on that stimulant was the way Elon Musk felt all the time. That tied a lot of things together for me, gave me an intuitive understanding of what it might “feel like from the inside” to be Elon Musk. And it gave me a good tool to discuss biological variation with. Most of us agree that people on stimulants can perform in ways it’s difficult for people off stimulants to match. Most of us agree that there’s nothing magical about stimulants, just changes to the levels of dopamine, histamine, norepinephrine et cetera in the brain. And most of us agree there’s a lot of natural variation in these chemicals anyway. So “me on stimulants is that guy’s normal” seems like a good way of cutting through some of the philosophical difficulties around this issue.

…which is all kind of a big tangent. The point I want to make is that for me, what’s at stake in talking about natural variations in ability isn’t just whether I have to feel like a failure for not getting an A in high school calculus, or not being as good at music as my brother. It’s whether I’m a failure for not being Elon Musk. Specifically, it’s whether I can say “No, I’m really not cut out to be Elon Musk” and go do something else I’m better at without worrying that I’m killing everyone in Canada.

V.

The proverb says: “Everyone has somebody better off than they are and somebody worse off than they are, with two exceptions.” When we accept that we’re all in the “not Elon Musk” boat together (with one exception) a lot of the status games around innate ability start to seem less important.

Every so often an overly kind commenter here praises my intelligence and says they feel intellectually inadequate compared to me, that they wish they could be at my level. But at my level, I spend my time feeling intellectually inadequate compared to Scott Aaronson. Scott Aaronson describes feeling “in awe” of Terence Tao and frequently struggling to understand him. Terence Tao – well, I don’t know if he’s religious, but maybe he feels intellectually inadequate compared to God. And God feels intellectually inadequate compared to Johann von Neumann.

So there’s not much point in me feeling inadequate compared to my brother, because even if I was as good at music as my brother, I’d probably just feel inadequate for not being Mozart.

And asking “Well what if you just worked harder?” can elide small distinctions, but not bigger ones. If my only goal is short-term preservation of my self-esteem, I can imagine that if only things had gone a little differently I could have practiced more and ended up as talented as my brother. It’s a lot harder for me to imagine the course of events where I do something different and become Mozart. Only one in a billion people reach a Mozart level of achievement; why would it be me?

If I loved music for its own sake and wanted to be a talented musician so I could express the melodies dancing within my heart, then none of this matters. But insofar as I want to be good at music because I feel bad that other people are better than me at music, that’s a road without an end.

This is also how I feel when some people on this blog complain that they feel dumb for not being as smart as some of the other commenters.

I happen to have all of your IQ scores in a spreadsheet right here (remember that survey you took?). Not a single person is below the population average. The first percentile for IQ here – the one such that 1% of respondents are lower and 99% of respondents are higher – corresponds to the 85th percentile of the general population. So even if you’re in the first percentile here, you’re still pretty high up in the broader scheme of things.

At that point we’re back on the road without end. I am pretty sure we can raise your IQ as much as you want and you will still feel like pond scum. If we raise it twenty points, you’ll try reading Quantum Computing since Democritus and feel like pond scum. If we raise it forty, you’ll just go to Terence Tao’s blog and feel like pond scum there. Maybe if you were literally the highest-IQ person in the entire world you would feel good about yourself, but any system where only one person in the world is allowed to feel good about themselves at a time is a bad system.

People say we should stop talking about ability differences so that stupid people don’t feel bad. I say that there’s more than enough room for everybody to feel bad, smart and stupid alike, and not talking about it won’t help. What will help is fundamentally uncoupling perception of intelligence from perception of self-worth.

I work with psychiatric patients who tend to have cognitive difficulties. Starting out in the Detroit ghetto doesn’t do them any favors, and then they get conditions like bipolar disorder and schizophrenia that actively lower IQ for poorly understood neurological reasons.

The standard psychiatric evaluation includes an assessment of cognitive ability; the one I use is a quick test with three questions. The questions are – “What is 100 minus 7?”, “What do an apple and an orange have in common?”, and “Remember these three words for one minute, then repeat them back to me: house, blue, and tulip”.

There are a lot of people – and I don’t mean floridly psychotic people who don’t know their own name, I mean ordinary reasonable people just like you and me – who can’t answer these questions. And we know why they can’t answer these questions, and it is pretty darned biological.

And if our answer to “I feel dumb and worthless because my IQ isn’t high enough” is “don’t worry, you’re not worthless, I’m sure you can be a great scientist if you just try hard enough”, then we are implicitly throwing under the bus all of these people who are definitely not going to be great scientists no matter how hard they try. Talking about trying harder can obfuscate the little differences, but once we’re talking about the homeless schizophrenic guy from Detroit who can’t tell me 100 minus 7 to save his life, you can’t just magic the problem away with a wave of your hand and say “I’m sure he can be the next Ramanujan if he keeps a positive attitude!” You either need to condemn him as worthless or else stop fricking tying worth to innate intellectual ability.

This is getting pretty close to what I was talking about in my post on burdens. When I get a suicidal patient who thinks they’re a burden on society, it’s nice to be able to point out ten important things they’ve done for society recently and prove them wrong. But sometimes it’s not that easy, and the only thing you can say is “f#@k that s#!t”. Yes, society has organized itself in a way that excludes and impoverishes a bunch of people who could have been perfectly happy in the state of nature picking berries and hunting aurochs. It’s not your fault, and if they’re going to give you compensation you take it. And we had better make this perfectly clear now, so that when everything becomes automated and run by robots and we’re all behind the curve, everybody agrees that us continuing to exist is still okay.

Likewise with intellectual ability. When someone feels sad because they can’t be a great scientist, it is nice to be able to point out all of their intellectual strengths and tell them “Yes you can, if only you put your mind to it!” But this is often not true. At that point you have to say “f@#k it” and tell them to stop tying their self-worth to being a great scientist. And we had better establish that now, before transhumanists succeed in creating superintelligence and we all have to come to terms with our intellectual inferiority.

VI.

But I think the situation can also be somewhat rosier than that.

Ozy once told me that the law of comparative advantage was one of the most inspirational things they had ever read. This was sufficiently strange that I demanded an explanation.

Ozy said that it proves everyone can contribute. Even if you are worse than everyone else at everything, you can still participate in global trade and other people will pay you money. It may not be very much money, but it will be some, and it will be a measure of how your actions are making other people better off and they are grateful for your existence.

(in real life this doesn’t work for a couple of reasons, but who cares about real life when we have a theory?)

After some thought, I was also inspired by this.

I’m never going to be a great mathematician or Elon Musk. But if I pursue my comparative advantage, which right now is medicine, I can still make money. And if I feel like it, I can donate it to mathematics research. Or anti-aging research. Or the same people Elon Musk donates his money to. They will use it to hire smart people with important talents that I lack, and I will be at least partially responsible for those people’s successes.

If I had an IQ of 70, I think I would still want to pursue my comparative advantage – even if that was ditch-digging, or whatever – and donate that money to important causes. It might not be very much money, but it would be some.

Our modern word “talent” comes from the Greek word talenton, a certain amount of precious metal sometimes used as a denomination of money. The etymology passes through a parable of Jesus’. A master calls three servants to him and gives the first five talents, the second two talents, and the third one talent. The first two servants invest the money and double it. The third literally buries it in a hole. The master comes back later and praises the first two servants, but sends the third servant to Hell (metaphor? what metaphor?).

Various people have come up with various interpretations, but the most popular says that God gives all of us different amounts of resources, and He will judge us based on how well we use these resources rather than on how many He gave us. It would be stupid to give your first servant five loads of silver, then your second servant two loads of silver, then immediately start chewing out the second servant for having less silver than the first one. And if both servants invested their silver wisely, it would be silly to chew out the second one for ending up with less profit when he started with less seed capital. The moral seems to be that if you take what God gives you and use it wisely, you’re fine.

The modern word “talent” comes from this parable. It implies “a thing God has given you which you can invest and give back”.

So if I were a ditch-digger, I think I would dig ditches, donate a portion of the small amount I made, and trust that I had done what I could with the talents I was given.

VII.

The Jews also talk about how God judges you for your gifts. Rabbi Zusya once said that when he died, he wasn’t worried that God would ask him “Why weren’t you Moses?” or “Why weren’t you Solomon?” But he did worry that God might ask “Why weren’t you Rabbi Zusya?”

And this is part of why it’s important for me to believe in innate ability, and especially differences in innate ability. If everything comes down to hard work and positive attitude, then God has every right to ask me “Why weren’t you Srinivasa Ramanujan?” or “Why weren’t you Elon Musk?”

If everyone is legitimately a different person with a different brain and different talents and abilities, then all God gets to ask me is whether or not I was Scott Alexander.

This seems like a gratifyingly low bar.

[more to come on this subject later]

31 Jan 22:04

Anybody betting on two general elections this year should first read this analysis by Chris Huhne

by MikeSmithson

The political and legal environment makes it very difficult

Nearly a year ago the former LD cabinet minister, Chris Huhne, wrote an excellent piece in the Guardian on how the Fixed-term Parliaments Act would make it difficult to hold a second general election shortly after an indecisive outcome – as looks highly likely in May.

” The Fixed-term Parliaments Act means that the prime minister can no longer call an election at a time of his choosing. … Elections are held every five years, except when two thirds of the Commons votes for one, or a government loses a vote of confidence and there is no further successful vote within 14 days.

True, a minority-government prime minister could engineer the loss of a vote of no confidence, but they would then run the risk that the main opposition party would establish a new administration and delay the election. Since the prime minister would only attempt to force an election if he thought he would win, the opposition would have every incentive to avoid losing. So that stratagem looks flawed.

The fixed-term act introduces a further difficulty for minority governments, because the timing of an election would now be in the hands of the combined opposition majority. Any loss of a vote of confidence would trigger an election if the government could not scrabble together a majority. A minority government would constantly be at risk of an election being called at a time of the opposition’s choosing.

The opposition strategy would then be clear: let the government flounder. Deny or amend ministerial legislation. Maybe even deprive the government of money. None of this would cause it to fall, because the fixed-term act requires a specific vote of no confidence. When the administration was looking truly shambolic, you force and win a vote of no confidence, calling an election at the point of the governing party’s maximum disadvantage.

What if Ed Miliband and David Cameron begin to dislike the fixed term? What if they were jointly keen to re-establish the prime minister’s prerogative to call general elections? They could, of course, combine to do so. But why would the opposition to a minority government want to hand over control of the timing of the next general election to its principal opponent?

All of which tells me that minority governments will be less popular in future, and that coalitions are more likely to be the response to a hung parliament. And as for hung parliaments, we shall see. If Labour and the Tories are closely competitive, and if Scotland stays part of the union, it will be hard for the winner to take all.”

I find it hard to argue with Huhne’s logic.

Mike Smithson

For 11 years viewing politics from OUTSIDE the Westminster bubble


31 Jan 22:02

Not Watching This Weekend: Mr Benn

by Nick

The Trailer: Voiceover man begins with ‘some heroes wear many costumes’. The whole trailer is shot through heavy filters, mostly dark and grey just to ensure everyone is clear that this is a Serious Film taking the source material Seriously. As it’s a trailer, we see all the best bits of the film mashed together through hyper-kinetic editing, complete with out of context quotes scattered over them.

We see Mr Benn (Benedict Cumberbatch) in a pinstripe suit and bowler hat, hear the Shopkeeper (Jim Broadbent) give a garbled explanation of how this is a role handed down from generation to generation to protect history and fantasy. There’d be flash cuts of fighting as a knight and as a gladiator, doing complicated things as a spaceman and casting magic as a wizard, all shot in glorious Grimdark-Serious-O-Vision.

‘Protecting them from who?’ he asks, and the trailer shows the designated Bad Guy (Matt Smith), possibly interspersed with occasional shots of the Official Love Interest (Sienna Miller), cropping up in various times and places. Then the trailer slows to show us the Big Dramatic Scene.

Mr Benn, in a cowboy outfit, is celebrating something when a bloodstained fez rolls across the screen and lands against his feet. He picks it up, looks out and sees the Bad Guy wearing a suit and bowler hat.

“You wore a costume and stepped into my world. Didn’t you realise that I could wear one and step into yours too?”

Another blizzard of disconnected images then the screen goes black. Voiceover Guy: ‘This summer, choose your outfit carefully.’ Graphics tell us MR BENN: THE MOVIE is Coming Soon.

Likelihood of director and writer claiming that this was always the intended vision for the character: High
Likelihood of anyone who’s seen the TV series keeping a straight face while watching it: Low
Likelihood of straight-to-streaming sequels with a tiny budget and none of the original cast: High

31 Jan 21:59

There are no beans in chili con carne.

31 Jan 21:53

Slavery and the Creation of a Counterfeit ‘Biblical Civilization’ in America: 1619-1865

by Fred Clark

You probably weren’t able to get to Oxford University for the 2013 Astor Lecture, delivered by Notre Dame historian Mark Noll. I wasn’t either. But don’t worry — Deane Galbraith has us covered, providing an .mp3 of Noll’s lecture.

“Mark Noll identifies the year 1865 as the year in which the American Bible civilization cracked,” Galbraith writes, teasing Noll’s lecture, titled “Biblical Criticism and the Decline of America’s Biblical Civilisation, 1865-1918.”

Allow me to do a couple of imprudent and dangerous things here. First, I’m going to respond to Noll’s lecture before I’ve had a chance to listen to it. And, second, I’m going to disagree with Mark Noll about the history of Christianity in America. The latter is particularly reckless, since Noll is an incredibly smart and perceptive historian who seems to have read and digested everything about the history of American Christianity and surely knows more about that subject than I ever will.

But, still, this needs to be said: the title of Noll’s lecture is misleading and … well, it’s just wrong. It’s based on a false premise.

Any discussion of “the Decline of America’s Biblical Civilization, 1865-1918” is doomed from the start because it assumes that America was some kind of “biblical civilization” in 1865. It wasn’t. “America’s Biblical Civilization” could not have cracked in 1865 because the possibility of such a thing ever existing had been negated back in 1619.

Perhaps American Christians in 1865 imagined they were living in a “Biblical Civilization.” After all, the majority of white American Christians in the centuries leading up to 1865 regarded themselves as “biblical” people. They said as much quite a bit.

But if we’re going to understand America and American Christianity, we can’t just take their word for it. We have to evaluate what they meant by this claim, whether that meaning is meaningful, and whether it is in any sense accurate.

I think that claim was accurate, but I do not think it was meaningful. Because slavery.

The existence of slavery — the reliance upon slavery — renders the claim of “America’s Biblical Civilization: 1619-1865” absurd and meaningless, as “any man whose judgment is not blinded by prejudice, or who is not at heart a slaveholder, shall not confess to be right and just.”

That last bit is from Frederick Douglass’ 1852 speech “What to the Slave Is the Fourth of July?” wherein Douglass contends directly with the claim that America was ever, in any meaningful sense, a “biblical civilization”:

The church of this country is not only indifferent to the wrongs of the slave, it actually takes sides with the oppressors. It has made itself the bulwark of American slavery, and the shield of American slave-hunters. Many of its most eloquent Divines, who stand as the very lights of the church, have shamelessly given the sanction of religion and the Bible to the whole slave system. They have taught that man may, properly, be a slave; that the relation of master and slave is ordained of God; that to send back an escaped bondman to his master is clearly the duty of all the followers of the Lord Jesus Christ; and this horrible blasphemy is palmed off upon the world for Christianity.

Yeah, that.

Those same “eloquent Divines” were not just the foremost proponents of American-style “biblical” Christianity, they were the inventors and creators of the thing. And the thing was invented and created and designed in order to give “the sanction of religion and the Bible to the whole slave system.”

Jonathan Edwards’ receipt for his purchase of “Negro Girle named Venus,” dated June 7, 1731.

Christendom never described itself as “biblical civilization” until the 17th century. For the previous 16 centuries of Christianity, the Bible did not play such a role in the way that Christians and “Christian civilization” identified and imagined themselves. Such an idea just wasn’t available or possible before then. The transformation of Christendom from “Christian civilization” into “biblical civilization” was not a thing that could have happened until after the printing press and the widespread availability of non-Latin translations.

And as soon as such a thing became possible — as soon as the English-speaking colonists who would later become “Americans” first had the opportunity to redefine themselves and begin to identify as “biblical” Christians — it began to be shaped by the nearly concurrent rise of the institution of slavery.

The King James Version of the Bible was completed in 1611. The first African slaves were imported into Jamestown in 1619. “Biblical” Christianity and the idea of “biblical civilization” grew up alongside slavery. The latter shaped the former, and the two things have been inextricably intertwined ever since.

The invention of “biblical” Christianity and of the idea of “biblical civilization” was for the purpose of accommodating slavery. That may not have been its exclusive purpose, but it was an essential function of the thing. It was a concept shaped and designed and tailored so that it could and would defend and perpetuate slavery.

In broad terms, it was a mechanism to allow American Christians to avoid the inconveniently unambiguous implications of the question “Is slavery Christian?” Put that way, Christians can offer only one answer. But what if we change the question? What if we begin to ask, instead, “Is slavery biblical?” Ah, now we have room to work. Now we can introduce technicalities and proof-texts. Now we can shift the debate from the damning, insurmountable obstacles of the greatest commandment and the imperative to “do justice,” and we can begin, instead, debating arcane questions involving the exposition of disparate isolated texts.

It was a dodge. And that’s a feature, not a bug. This is what American-style “Bible Christianity” was designed and intended to do.

That’s why the perpetual antebellum “debate” among American Christians over whether or not slavery was “biblical” was rigged from the get-go. “Biblical” was an adjective primarily designed to describe a form of Christianity that had mutated to approve and defend the institution of slavery. “Biblical” may have also meant other things, but that was always a part of its meaning. That’s why those pre-1865 debates over whether or not “slavery is biblical” were ultimately just a form of disingenuous theater.

Of course slavery was “biblical” and of course opposition to slavery was “unbiblical.” That was what those words meant. That was the whole point of declaring that American Christians should think of themselves as a biblical civilization rather than a Christian one.

All of which is why the dominant narrative in historical and theological discussion of pre-1865 American Christian “debates” about slavery gets the whole thing backwards and upside-down. I love Mark Noll’s The Civil War as Theological Crisis. You should read it. It’s a terrific, incisive, engaging book full of profound questions and insights. But it also gets the core of its argument backwards and upside-down.

The perverse part of that argument and that narrative is this: It asserts that pre-1865 “biblical” Christians approved of slavery because of the way they read their Bibles. That’s not true. That’s the opposite of what is true. Pre-1865 “biblical” Christians read their Bibles the way they did because they approved of slavery.

 

30 Jan 18:03

Individuals can be identified from 3 or 4 "anonymized" credit card purchase records.

30 Jan 17:17

Rutland earthquake latest

by Jonathan Calder

Thanks to Tired Old Git on Twitter.
30 Jan 17:00

Sclerotica

by Tom

Before I start writing about Number Ones again, a quick bit of stattery around the current state of the charts. This is an extremely wonkish post, so reader beware.

The official Top 100 Streaming chart was launched 30 weeks ago. Let’s see what’s up with it.

Of the 100 songs in the streaming chart:

44 have been in it for all 30 weeks (i.e. since the first ever chart).
38 have been in it for between 10 and 29 weeks.
18 have been in it for 9 weeks or fewer.

Compare this to the official chart – which incorporates the streaming chart, of course.

33 in it for 30+ weeks.
36 in it for between 10 and 29 weeks.
31 in it for 9 weeks or fewer.

I should have done that as a graphic or something. Basically, streaming charts are slower. As streaming grows as a proportion of “sales”, it’s likely to slow the charts down further, as the die-off of hits gets longer.

How about just looking at the Top 40?

For the streaming chart using the same definitions as before, you get 15 out of 40 sluggards (vs 7 in the main chart), 10 out of 40 newbies (vs 14 in the main chart), and 15 in the middle (vs 19). Again, a big skew towards long-stayers in the streaming chart – maybe even bigger than looking at the full 100. And streaming is going to grow in importance.
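
(If you want to run this kind of tally yourself, here is a minimal Python sketch. The week counts below are invented purely for illustration – only the three bucket boundaries follow the definitions used above – and real figures would have to come from the published chart data.)

    from collections import Counter

    # Hypothetical weeks-on-chart figures for a 100-song chart; real numbers
    # would come from the published streaming chart. Only the bucket
    # boundaries below follow the definitions used in the post.
    weeks_on_chart = [30] * 44 + [17] * 38 + [5] * 18

    def bucket(weeks):
        # Assign a song to one of the three buckets used above.
        if weeks >= 30:
            return "all 30 weeks"
        if weeks >= 10:
            return "between 10 and 29 weeks"
        return "9 weeks or fewer"

    tally = Counter(bucket(w) for w in weeks_on_chart)
    for label, count in tally.most_common():
        print(count, "songs:", label)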

None of this is reflected in the Number Ones, because record label marketing teams are usually very good at getting hits to a high initial placing, so there’s a high turnover of Number Ones that’s completely atypical of the sludgey inertia below.

There’s an argument that a slow chart reflects listener behaviour and taste better than a faster one. That’s true, but only up to a point, and I think the streaming chart goes well past that point. There is an average wear-out time for any given hit – the number of weeks people want to keep hearing it. There is also a distribution curve for each hit – not everyone is introduced to it at once, and new people keep on discovering it.

So imagine you have two people who both like Hozier’s “Take Me To Church”. One heard it in the first week of release. The other in the 14th. And let’s say that for a Hozier fan, 15 weeks is about the die-off time before even they are tired of hearing it. What it means is that the first Hozier fan is getting tired of “Take Me To Church” just at the time the second fan is getting enthused about it.

Record shops – and even online shops – have always favoured the first person, explicitly (in the days when rack space was at a premium) and implicitly (in the days when homepage real estate is). Even radio playlists – which the UK charts have never included – have some commitment to refreshing themselves. Streaming services do no such thing – the music is there, free, all the time, catering for fans all along the distribution curve. That sounds very fair. Except it means that everyone at the front of the distribution curve – who hear songs early – now have to wait for a song to work its way along the curve before it goes away. Eventually, enough of the people who will ever like Hozier will have heard Hozier and got tired of Hozier, and Hozier will drop out of the charts. But that might not be for a very long time.
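
(Here is a toy sketch of that argument in Python, with the numbers mostly invented: the 15-week wear-out comes from the Hozier example above, but the 20-week discovery window is made up. Each listener discovers the song in a different week and tires of it after the same wear-out period, so the song’s chart life stretches far beyond any single listener’s tolerance.)

    # Toy model of the distribution-curve argument above. The wear-out period
    # follows the Hozier example; the discovery window is invented.
    discovery_weeks = range(1, 21)   # one new listener discovers the song each week for 20 weeks
    wear_out = 15                    # weeks before any single listener is tired of it

    def listeners_in_week(week):
        # How many listeners are still streaming the song in a given week.
        return sum(1 for start in discovery_weeks if start <= week < start + wear_out)

    last_week = max(discovery_weeks) + wear_out - 1   # week the final listener tires of it
    for week in (1, 15, 20, last_week):
        print("week", week, "-", listeners_in_week(week), "listeners still streaming")

    print("Any one fan is done after", wear_out, "weeks,")
    print("but the song keeps charting for", last_week, "weeks.")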

Short version: streaming is a subtle flip in the way the charts work. Sales charts have always been geared towards reflecting people who pay attention to new music and come to a song quickly. Streaming charts are equally geared towards reflecting people who don’t pay attention to new music and come to a record late. There is nothing wrong with those people! But the result is a slower-moving chart and hits whose lifespan is a function of their slowest fans, with chart stays extended beyond most individual fans’ tolerance. The question of what a weekly chart is FOR – or at least, who its audience is – becomes more and more open.

(This may have a bizarre – and surely unintentional – outcome. The behaviour familiar to a generation of British pop fans – listening to the Top 40 to hear the latest music – is now actually penalised. If you listen every week to the charts, you will run through your tolerance for individual hits with the bulk of their lifespan still to run. A slow moving chart rewards occasional dips more than loyal following.)

30 Jan 13:38

The Moment Has Been Prepared For (The Day of the Doctor)

by noreply@blogger.com (Philip Sandifer)
Well, at least it's just the one who committed genocide and not the one in that awful coat.
Cometh the hour, cometh the man. The hour, in this case, was actually about seventy-five minutes long, commencing at 7:50 PM on November 23rd, 2013. Martin Garrix was at number one with “Animals,” with Lily Allen, Lorde, One Direction, Lady Gaga, and Eminem also charting. In the six months since The Name of the Doctor had aired, Edward Snowden had created a major international stir when he leaked a significant trove of classified information about the extent of surveillance operations being routinely carried out by the US and UK governments, Mohamed Morsi was deposed as President of Egypt in a military coup, and Typhoon Haiyan struck the Philippines and Vietnam, killing more than six thousand people. The man, of course, was Steven Moffat. 

The Day of the Doctor did not quite win universal praise. It just won stellar ratings, an impressive 88% AI rating, the distinction of being only the second episode of Doctor Who ever to hit the number one slot in the weekly ratings, and Doctor Who Magazine’s 50th Anniversary poll for the greatest story of all time. Nothing is all things to everyone, but it is difficult to imagine something coming much closer than The Day of the Doctor did. And yet The Day of the Doctor arrived after an enormously troubled production season, and was hardly an uncomplicated production in its own right, with executive producer Caro Skinner quitting the series a few weeks before shooting began, on top of the entire mess of Eccleston initially expressing interest and subsequently declining to reprise the role. That it avoided being an outright disaster given these circumstances seems a lucky break. That it was an insta-classic seems a small miracle.

At the heart of its success is a script by Moffat that is unapologetically committed to the episode’s grandeur. The episode deploys big set piece after big set piece, rarely waiting long between them. The TARDIS helicopter lift starts at the two minute mark. At seven minutes, we jump into the Time War. Billie Piper shows up six minutes later, David Tennant six minutes after that, at around the twenty minute mark. At twenty-seven minutes, Smith and Tennant share the screen for the first time. The big Zygon awakening/invasion breaks out at thirty-five minutes. Six minutes later is the big “did you ever count the children” three-way confrontation among the Doctors, which, while lacking the immediate grandeur of some of the other instances, is nevertheless a huge moment. The biggest slow period of the episode is the subsequent ten minutes building to the Doctors blasting their way out of Gallifrey Falls No More, bringing us to the fifty-two minute mark. By fifty-eight minutes, all three Doctors are agreeing to commit double genocide. And seven minutes later the thirteen-Doctor montage has kicked off. And six minutes thereafter the Curator shows. 

But for all the accelerated pace involved in jumping from set piece to set piece, what’s also striking is the way in which the individual set pieces are generally given room to breathe. It’s not quite accurate to say that Moffat has slowed down the pace for this story, because there are moments where it absolutely screams through sequences, but there’s a sense of what scenes are going to need room to breathe that hasn’t entirely been on display in Moffat’s Doctor Who since The Impossible Astronaut/Day of the Moon. The result is an episode that feels, in many ways, like a linked sequence of mini-episodes. The Day of the Doctor watches very well as a single seventy-five minute bit of cinematic television, but it’s also an episode that divides very well into smaller segments. The counterpart to the huge chain of set pieces is that there’s a great place to pause and see how dinner’s coming every seven minutes or so.

The other thing to point out, structurally, is that Moffat, in the most obvious move imaginable, tapped Nick Hurran to direct this. Hurran is typically adept, and even manages to make the 3-D effects work to his advantage at times - the handling of the dimensionally transcendental paintings is one of the few genuinely great shots in the history of the generally awful technology of stereoscopic film. Fast-paced scripts have always had a friend in Hurran, whose use of inserts and double images lets him communicate information with considerable efficiency, in a manner not unlike how Sherlock speeds things up with its superimposed text. He’s also incredibly deft at abandoning strict continuity editing, as in the Zygon breakout scene, which doesn’t parse as linear action at all, opting to very clearly communicate “oh no, Zygons everywhere, and now Osgood is cornered” instead of trying to actually show the entire process of Osgood running from the statue room to the elevator. 

That, at least, explains the structure. But The Day of the Doctor is far more than just that. It is a story that has to make a definitive statement on what Doctor Who is. And the way Moffat approaches that is revealing. It has been observed, not inaccurately, that The Day of the Doctor is largely about the new series. Yes, John Hurt is there to, in a real sense, allow the classic series to pass comment on the new one, but it’s worth noting that we pick up with the War Doctor, essentially, the day before Rose, a fact that’s heavily emphasized up front by Billie Piper’s intrusion from the Doctor’s immediate future. The major plot point, the Time War, is a new series invention. Yes, you’ve got the Zygons there as fanservice for David Tennant, and a smattering of classic series references and jokes, but this really is mostly about the recent past of the series. It’s much closer to being a new series version of The Three Doctors than it is to being The Thirteen Doctors

Some of this is simply a matter of practicality. Moffat surely rewatched The Three Doctors and The Five Doctors in planning this, and it would not have escaped his notice that The Five Doctors sagged badly under the weight of its cast size, and that was only really four Doctors. And down the road of trying to include all of the past Doctors lies a wealth of significant logistical challenges, to say the least. The only Doctors it would be straightforward to bring back were McGann, Eccleston, and Tennant, and one of them wasn’t interested. So it’s inevitable that this would be a new series-focused anniversary. 

But it’s also worth recalling what the Time War means in terms of the new series. By the end of the Davies era, the Time War had been built into, essentially, a metaphor for the cancellation - as the consequence of an actual narrative collapse. One of the things The Day of the Doctor is very much about, then, is suturing that wound. Between The Night of the Doctor and the War Doctor’s regeneration scene, Moffat tacitly removes the gap that had existed between McGann and Eccleston, symbolically restoring an unbroken narrative to Doctor Who so that it has something resembling an unbroken fifty year history.

There are, of course, lies in this. The Wilderness Years get a significant rewrite, in particular. Moffat remarked that he couldn’t really see McGann’s Doctor destroying Gallifrey, which is, to say the least, ironic given the Eighth Doctor Adventures. Yes, The Night of the Doctor goes out of its way to nod to Big Finish, but the McGann era is still a messy and hazily defined thing. All the same, it’s worth noting how much more destructive to the McGann era it would have been to give the War Doctor’s part to him. Deciding to have McGann’s Doctor only ever have flitted about the edges of the Time War at least leaves his era untouched, instead of declaring that the Doctor people enjoyed in Alien Bodies or The Chimes of Midnight, or even God help them, the TV Movie became, in the end, someone who committed double genocide. While the nature of what happened in the Wilderness Years remains muddy, whatever happened, it at least happened how fans remember it.

The bigger lie comes in the form of the War Doctor. Not, to be clear, because he’s a brazen and unapologetic retcon. Rather, it’s because he doesn’t actually fill the hole he’s meant to slot into. John Hurt is seventy-five, and The Day of the Doctor is overwhelmingly likely to be his only actual appearance in Doctor Who. His casting actually serves to render the Time War even less representable than it had been. Prior to The Day of the Doctor you could have done a Time War story provided Paul McGann was still alive. Now, however, the Time War is a truly lost era. (Yes, Engines of War exists. And someday the McGann/Eccleston book will too.) 

But in some ways this entire train of thought misses the point. One of the more on-point critiques that Lawrence Miles and Tat Wood make in the About Time series is the observation that the biggest problem with The Five Doctors is that it fails to use itself to kick off a new direction for the series. For all that it tries to culminate in a “well, isn’t this how it all started” moment, it’s not a story with consequences of any sort. This contrasts with The Three Doctors, which goes through a lengthy celebration of the past and then, more importantly, emphatically moves forward by restoring the Doctor’s ability to travel freely in time and space. Once again, it’s fairly obvious that Moffat looked at the past and thought about what worked and what didn’t, because The Day of the Doctor is firmly in the tradition of The Three Doctors. It’s over-hyping things to say that this is about setting up the next fifty years of Doctor Who (although it is worth noting that Moffat would ultimately make sure that the series began its fifty-first year with the Doctor at the beginning of a cycle of regenerations), but Moffat does use The Day of the Doctor to set up a new metaplot for the series.

As with much of this story, there’s considerable subtlety to this. It is now inevitable that Gallifrey will return someday in Doctor Who. But it’s not inevitable along any particular timeframe, a point Moffat makes especially clear when he finishes his “of the Doctor” triptych by demonstrating that you can do Gallifrey stories other than “Gallifrey returns.” Instead The Day of the Doctor just marks a sort of narrative apex - the point where the course of things turns and we finally clearly start approaching Gallifrey’s return, which, let’s face it, some showrunner was always going to do. It’s not hard to imagine Moffat reaching the end of his time in charge of Doctor Who without ever bringing Gallifrey back. What we’re changing here really is the shape of Doctor Who’s metaplot.

But what’s more important, ultimately, is the reasoning behind that change. It’s not just that The Day of the Doctor reverses the outcome of the Time War, after all. It’s that it does so as part of an argument about the Doctor’s nature. This is, to a real extent, an outright moment of disagreement between Moffat and Russell T Davies. Moffat has said that he never really thought the Doctor would commit double genocide, and here he makes that argument explicit, having Clara frame her case for the Doctor not doing it in terms of what it means to be the Doctor, which in turn gets framed in terms of Terrance Dicks’s old “never cruel nor cowardly” line. The resolution of the story, in other words, is a statement of what Doctor Who is for, as a cultural object, which in turn justifies the existence of another fifty years of it.

(It’s also worth addressing the way in which Moffat handles the issue of the Doctor spending seven seasons thinking he’s committed a double genocide, namely by declaring that the Doctor doesn’t remember this adventure until it happens to Eleven. Moffat actually goes to considerable length throughout the story to make sure it fits meticulously with existing Doctor Who continuity, and so this is no surprise. But there’s also an emotional honesty to it that rarely gets remarked upon. It’s significant that it’s Matt Smith’s Doctor who gets to figure out how to save Gallifrey, and not Hurt or Tennant’s. It’s not until the Doctor accomplishes this - until he actually finds a better way - that he gets absolution. This is, in fact, entirely fitting. Eccleston, Tennant, and, until this story, Smith all thought they made the best choice available to them, and so lived with the consequences of that belief. It’s not that the Doctor was wrong about Gallifrey being destroyed in the Davies era - it’s that he hadn’t saved it yet.)

But what is this justification? Yes, he’s neither cruel nor cowardly, and he never gives up or gives in. Both lovely statements, but clearly not the whole of it. The Doctor, and Doctor Who itself, are more than just that. So what are they? Certainly many of the answers we’ve looked at throughout this project are not really present here. The relationship between eccentricity and the mainstream that Doctor Who has always mediated, the mercurial urge to tear down the world and always, endlessly change, these just aren’t the themes that are in play here, or, at least, they’re not at the forefront of the episode. 

No, instead we get material social progress. That, in the end, is the point of the Doctor. To find a better way. That’s why we need it, and, perhaps more to the point, why we always need more of it: because material social progress is always possible. Because there’s always more to do. Because making the world a better place is fundamentally, perpetually unfinished work. It is not, admittedly, the exact theme I would have preferred. But it’s a good theme, and a powerful one, and, perhaps most importantly, it’s a sensible choice for the fiftieth anniversary, because it’s the explanation for what Doctor Who is for that most obviously explains why it should keep going.

Which brings us, of course, to the cleverest thing that Moffat does in the course of The Day of the Doctor, which is, of course, the Curator. As we noted, despite creating a new Doctor to fill in the gap for the Time War, Moffat actually makes it even less possible to depict the Time War by having the War Doctor be played by a seventy-five year old actor with better things to do than pop back for another Doctor Who appearance. The same logic, of course, applies to the Curator. Tom Baker is eighty-one, and this almost certainly marks his final televised appearance in Doctor Who. But unlike the Time War, this does not create an unrepresentable space in the program’s past. Instead it creates one in the future. The Curator is a future era of Doctor Who that can never happen, but that is also now “canon,” as it were. The Doctor simultaneously will eventually regenerate into Tom Baker again and can never possibly regenerate into Tom Baker again. 


And this is, in the end, the real content and result of The Day of the Doctor. It doesn’t just heal the gap in the series’ past. It forever and permanently rejects the idea of Doctor Who being something with an ending. Sure, there may be more cancellations and Time Wars to come. But the story, like material social progress, will never actually be finished. Half a century down. Forever to go. Happy birthday, Doctor Who.
30 Jan 12:07

Taken for Granted

by LP

Goodness knows our society has long mistaken the exhortation to kill its idols for an invitation to shit on its most talented members, but sometimes it seems that comics fandom in particular is determined to isolate the best and brightest of their chosen medium’s creators and relentlessly heap bile on them until they go away, leaving the job of telling the stories they love so much to the most mediocre and predictable elements available.  Nowhere is this more apparent than in the case of Grant Morrison.

Alan Moore, at the very least, has the good grace to be an irascible old crank who hates comics, comic companies, comics fans, and pretty much everything else.  (And God — or giant snake-god — bless him for it.)  This makes it easy for fans to despise him and forget that he created at least a half-dozen of the finest works ever written for the superhero genre.  With Morrison, though, it can’t be attributed to crazy-old-manhood, or a case of being jilted, or personal or professional bitterness.  Although he’s possessed of plenty of the same quirks as Moore, he houses them in an urbane, sophisticated display that does nothing to piss off fans on a personal level; he’s still almost alarmingly enthusiastic about the superhero medium, and doesn’t bum out America’s man-children by pointing out that their primary hobby is kind of embarrassing; and far from having burned his bridges (or, more accurately, having had them burned), he’s worked his way to a level of almost unprecedented power and influence at venerable DC Comics.  And yet, while he’s still widely celebrated by many fans and critics, there remains a segment — vocal, hyperbolic, and not markedly small — who openly despise him and call him the worst thing to happen to the industry since Wertham.

Even his success is turned against him:  some of the same people who vilify Moore as a prickly, difficult snob who never learned to play ball claim that Morrison is a toady, a suck-up, a company man who betrays the legacy he loves just for the chance to play with them in his cosmos-sized sandbox.  Morrison’s work is too difficult, some of them say, with the charge of pretentiousness — that is, difficulty for difficulty’s sake — never too far away.  Morrison has no respect for the characters and wreaks arbitrary change on them for his own amusement, some of them say, with others going so far as to accuse him of not understanding the essence of these four-color icons in the first place.  Morrison is contemptuous of the fans, some of them say, and does not respond to their wishes.  Morrison leaves jobs undone, some of them say, dangling threads of plot everywhere, failing to make connections, forgetting his place and leaving the universe over which he has been given charge a total mess.

I must confess to a near-complete lack of appreciation for these charges; whether as a summary referendum on his character in general, or an analysis of specific cases, I find little merit in them.  Morrison’s work may be difficult to the kind of fan who despises ambiguity, fails to appreciate mystery, and resents being asked to make an effort, but it is only difficult by the still-paltry standards of superhero comics, and its best qualities — by turns elegant and simple — can be discerned by anyone who approaches his writing with a modicum of respect.  He may play ball with the powers that be to get himself into a position to effect change in the editorial mission of DC Comics, but it is a job that quite badly needs doing, and who else would do a better job of it?  Surely none of the cut-rate toadies, egotists, and hacks who make up the rest of Dan DiDio’s entourage.  The idea that Morrison doesn’t understand superhero comics is particularly bizarre, given that he literally wrote the book on the subject; it may not be the best book on the subject, but differences of interpretation aside, this is clearly not a man who doesn’t care about his medium.

The charge that he disrespects the characters bears a bit more examination.  The lack of vividness and uniqueness in some of his character work (though certainly not all) can probably be attributed to the fact that, especially of late, he has been called on to work with a cast of literally hundreds of individual characters; a few are sure to come off as a bit samey, but when the rubber meets the road, he can still hit you in the heart.  I think part of the confusion arises from the fact that one of Morrison’s great strengths — it can be witnessed anywhere, even in a brief and largely inconsequential panel on page 6 of this week’s Multiversity Handbook, where he strips eternal Marvel Family nemesis Dr. Sivana down to his purest visual essence with the recurring motif of two adjacent circles symbolizing his Coke-bottle glasses — is to immediately ascertain the purest iconic nature of a character and use it as a through-line in his storytelling.  This criticism comes up again and again, and to me, it seems to come from people who not only perceive this quality as a weakness rather than a strength, but also simply disagree with his assessment of what is essential about those characters.

Since I do my best to stay out of the rat-hole that is comics fandom, I cannot speak to whether or not the man holds his fans in contempt; he was certainly quite open and kind to me when I interviewed him a few years ago.  I don’t remember reading any instances where he was openly hostile or insulting to fans, but allowing for the fact that I might have just missed them, I have to consider that this hostility might run only one way, and that Morrison — a man who comes across to me as friendly and forthcoming — may be more hated than hateful.  The final charge, that he is a genius at coming up with ideas and a fool at executing them, is common enough, and there is some truth to it; the DC Universe is tangled with the plot and story threads he has left dangling all over the place.  I’m willing to let him off the hook for this, as I am for the fact that he can indeed be hit-and-miss as a writer, because his workload is phenomenally heavy; he’s producing an enormous amount of material for DC even aside from his editorial duties, and he’s bound to leave some stuff hanging from the rafters.  But when he is allowed, as he often isn’t, to complete a story, it’s more often than not airtight, with even sprawling and complex storylines (as in 7 Soldiers of Victory) wrapped up neatly in the end.  It should also be recognized that someone with an eye as far ahead as his may not be forgetful or careless so much as he is deliberately planting seeds for himself, or others, to harvest down the road.

Grant Morrison is a clever, intelligent, passionate, and prolific writer.  His work habits are by all accounts extraordinary; he has an encyclopedic knowledge of comics that he brings to bear on work that is creative rather than academic; and he has done the near-impossible by introducing metafictional techniques to comics without making them cheap and obvious, telling stories about stories in a way far less ham-fisted than the far more celebrated Neil Gaiman does.  He has managed to retain a subversive, progressive, and artistically refined outlook on his work while rising to a position of great authority in his chosen métier.  And on top of all that, he has produced an astonishing number of comics of great accomplishment and quality:  Doom Patrol, The Authority, The Invisibles, Batman, 7 Soldiers of Victory, Flex Mentallo, Seaguy, We3, and All-Star Superman, among others.  His failures have been few, and his greatest successes have been great indeed.  I don’t expect everyone to love him, but I don’t understand why so many people hate him; there are few people in comics with his combination of work ethic and talent, and we banish him at our peril.

28 Jan 22:42

My 16 Bitcoin predictions for 2015!

(Inspired by this euphoric fever dream of an infographic.)

  1. There will continue to be nothing that Bitcoin does better than existing systems, apart from money laundering and purchasing illicit goods.

  2. No, not remittances. The expensive bit of Western Union is in fact that last mile.

  3. It will remain difficult to turn your Bitcoins into conventional currency (which is the only reason there's such a spread between exchanges).

  4. It will get even harder to turn your conventional currency into Bitcoins, as any exchange not being run by blatant crooks puts you through the anti-money-laundering mill.

  5. The protocol problems will continue not to be fixed, unless most of the hashing power and Mircea "socks and cocks"* Popescu can be convinced to go along with the Bitcoin Foundation. No 20-meg blocks for you!

  6. 99% of current hashing power came online in 2014; this will be very price-sensitive, and much will go offline as the price drops, maybe coming back at the next difficulty adjustment.

  7. Miners will continue to sell their coins immediately to cover costs: we are circling equilibrium, where the cost of mining 1 BTC is about 1 BTC (a rough back-of-the-envelope sketch of this follows the list). The pool of money to pay for them comes from new Greater Fools.

  8. Transaction irreversibility will remain Bitcoin's sticking point, as speculators who are insufficiently computer-savvy keep getting burnt. "No chargebacks" will continue to repel customers and not attract businesses.

  9. More exchanges will get hacked and/or just take everyone's money. (So far just this year: BitStamp, 796, LocalBitcoins, EgoPay.)

  10. Everyone who bought in the last year and held is a bagholder. Their claims and speculation will get increasingly frenzied. Ask for numbers supporting all claims, particularly the ones in the above-linked infographic.

  11. The bagholders and gambling addicts will continue to be taken by obvious scams, e.g. the two Ponzi scheme sites in just the last month.

  12. Sidechains will continue to be vaporware and not in fact a thing that exists, let alone solves any problems. Bitcoiners will still talk about them as if they already exist, and hence as a reason to ignore the fact that altcoins are possible. Edit: OTOH, pettycoin might get finished this year.

  13. Altcoins will continue to be even scammier than the Bitcoin ecosphere, boggled as I was to realise this.

  14. The price is presently being held up by speculation and wishful thinking. No new reason will come along. The "fundamentals" are a castle in the air.

  15. Nobody actually wants smart contracts. They know that the plot of Dr. Strangelove is literally an unstoppable smart contract going wrong. Real customers want problems to be fixable when circumstances change, real companies want to retain the option of lawyering out of a stupid deal. The only people who would want smart contracts are businesses looking to screw over their customers even more than "mandatory arbitration" clauses do. This is about as appealing to customers as no chargebacks, for the same reason.

  16. Blockchains, even if by some remarkable wrinkle they turn out useful for something, will not lug Bitcoiners' 33 GB of SatoshiDice penny shavings with them. Bitcoiners will continue to bring up "blockchain technologies" as a reason to bother with Bitcoin regardless, because that's literally all they have.

    (Bitcoiners misunderstand that when a techie calls something "interesting" they don't necessarily mean "useful", "feasible" or "practical" — often they mean "what the hell even is that" or "I ain't even mad, that's amazing". The blockchain, particularly as implemented in Bitcoin, is very much the last.)
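
(An aside not in the original post: to make the break-even claim in prediction 7 concrete, here is a minimal Python sketch. Every figure in it, the hashrate, power draw, electricity price, block reward and BTC price, is an illustrative assumption rather than a measurement.)

    # Minimal break-even sketch for prediction 7 (illustrative assumptions only).
    # Assumed: a 1 TH/s miner drawing 1,000 W, a 300,000 TH/s network,
    # $0.10/kWh electricity, a 25 BTC block reward, roughly 144 blocks per day.

    def btc_mined_per_day(hashrate_ths, network_ths, block_reward=25.0, blocks_per_day=144):
        """Expected BTC earned per day, proportional to the miner's share of total hashrate."""
        return (hashrate_ths / network_ths) * block_reward * blocks_per_day

    def power_cost_per_day(watts, usd_per_kwh):
        """Electricity cost of running the hardware for 24 hours, in USD."""
        return watts / 1000.0 * 24 * usd_per_kwh

    revenue_btc = btc_mined_per_day(1.0, 300000.0)   # about 0.012 BTC/day
    revenue_usd = revenue_btc * 220.0                # at an assumed $220 per BTC
    cost_usd = power_cost_per_day(1000, 0.10)        # $2.40/day
    print(f"revenue ${revenue_usd:.2f}/day vs electricity ${cost_usd:.2f}/day")

Under these made-up numbers, revenue and electricity cost nearly cancel (hardware costs aside), which is the sense in which mining 1 BTC costs roughly 1 BTC and miners sell what they mine as fast as they mine it.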


If you know nothing about Bitcoin and find the above largely confusing, here's the short FAQ and the RationalWiki article (which I started). I'm not such a fan.
Update 2016: Pettycoin still isn't finished. Otherwise, pretty good.

* Technically this is blatant ad hominem, but it's definitely a post so amazing it should be linked anywhere his name is mentioned, ever. Archive link.



28 Jan 22:08

An Ageing System

by Jack Graham
It's always puzzled me, this thing about people getting right-wing as they get older.  You'd think that the opposite would happen.

I mean, as you get older, you notice that the world keeps having the same problems, and that they tend to have the same underlying causes, and that nothing is ever done about them.  You notice more and more of the same kinds of scandals recurring over the years, time and again, and always based upon imbalances of power, and upon powerful people being unaccountable.  You're more likely to have a mortgage and debts, the older you are.  You're more likely to owe lots of money to banks and credit agencies, and to be crippled by these debts.  You're more likely to have health problems, and thus to need medical care, and thus to see that the Health Service is underfunded and overstretched.  You're more likely to realise that your investments and savings (if you have any) don't pay off in anything like the way you're told they will when you're younger.  You're more likely to worry about how you will look after yourself and your partner in old age.  You're more likely to notice your retirement receding into the distance.  You're more likely to find yourself paying through the nose for medications that go with age, medications your parents didn't have to pay through the nose for.  You're more likely to meet more and more people of different 'races' and nationalities, and thus to notice that they're not too different to you.  You're more likely to have been mistreated at work, or sacked, or been forced to find second jobs or third jobs.  You're more likely to have seen talentless people around you rising while you stay still, simply because they were born with advantages you never had.  You're more likely to have lived through several wars and recessions instead of just one or two.  You're more likely to have observed the way the world's weather systems have catastrophically changed even in the last few decades.  You're more likely to have seen friends' lives ruined by discrimination or depression or stress.  You're more likely to have seen your parents face an uncertain and neglected old age.  And you're more likely to have kids and grandkids, and to see all the challenges they face.  A restricted job market, more and more pressure on them to work harder to have a chance of being employed one day, the escalating cost of further and higher education, more debt earlier in life, less chance of being able to afford to buy a house or even move out of their parents' home, spiralling costs of living, fewer social safety nets, a squeezed education system.  And on and on it goes.

Decades of life means decades of observing the world getting worse, and the so-called solutions never working, and the so-called progressive parties always selling people out, and the persistence of poverty and corruption never being addressed, and inequality and injustice always being at the root of the problems.  And it means becoming more and more vulnerable, as an individual or as part of a family, to the insecurity and hardships capitalism causes and relies upon.

Basically, the saying "you get more right-wing as you get older" is something only applicable to people who are already economically and socially privileged.  Such people can afford to dabble with being left-wing when they're young, if they feel like it.  They can rely on rich parents to bankroll such flirtations with leftiness.  When they're young and cushioned, it costs them nothing.  Principles are always easier if they cost you nothing.  Then, as they get older, they take their place in the system of privilege that was always waiting to welcome them.  They see their savings and investments rising (because, generally, the more you have the more you make from it), their property becoming more and more valuable, their financial situation getting comfier and more secure every day.  At the very least, they see themselves comparatively insulated.  Their parents enjoy a luxurious old age and then die, leaving them more property and investments and savings.  Their own children and grandchildren don't have to worry about the deficiencies of state education, or the deficiencies of the NHS, or the debts accrued during further and higher education, or the uncertainties of the job market.  Jobs wait for them.  Economic security is built into life for such people and their families.  Like all people with privilege, their primary focus is to hold onto it ferociously.  We all know that privilege causes people to feel aggrieved by the faintest suggestion of a challenge to their privilege.  This is at the root of much misogyny.  Patriarchy causes men to think of women as appliances.  How angry would you be if your vacuum cleaner suddenly refused to work?  I dunno about you, but when things like that happen to me I swear at the errant machine and feel pretty damned aggrieved.  Men extend the same logic to women who don't want to be treated as equipment.  And Corey Robin, a historian of ideas, has done some very good work describing how modern conservatism is, at bottom, the ideological expression of the struggle to retain privilege.

This pattern is most pronounced in the very rich, of course.  But it used to hold pretty reliably (if scaled down) for the moderately rich, and down to the middle classes.  (It even used to hold to a certain extent for the working classes during the long post-war boom, which looks like a utopia of economic justice, job security and progressive welfarism compared to where we are now.)  However, like many old cultural certainties of capitalism, this pattern is breaking up - with increasing speed.  (Capitalism does this.)  The middle classes and petty bourgeoisie are more and more squeezed, more and more threatened by uncertainty.  The iron law of neoliberalism is the redistribution of wealth upwards - and this is really just a variation on what always happens in capitalism during its built-in periodic crises.  And, with the bottom looking increasingly drained, neoliberalism - especially now in this era of crisis and austerity - is feeding on the middle.

Of course, the people in the middle can't be expected to draw left-wing conclusions from all this.  Such people generally drift to... da daaa!... the populist far-right (e.g. UKIP, the BNP, the Tea Party, various European equivalents).  And, in the absence of organised organs of workers' struggle, the people at the bottom will often drift that way too.  Especially when you factor in the massive effort neoliberal capitalism puts into pushing and sustaining ideological disorientation.

Neoliberal capitalism itself is getting more ideologically right-wing as it gets older.  The only sane response is to go in the other direction.

But it's not easy.  Because, with that lifetime of confronting the horrors, there also comes a lifetime of getting entangled in the very hard work and stress it takes to navigate them, and the fatalism of seeing nothing ever getting better, and the pessimism of constant defeats, and the confusion sown by decades of 'There Is No Alternative'.  That's why the only thing that has a chance of fundamentally altering this downward spiral is a resurgence in working class organisation, in the teeth of the longstanding and hugely successful neoliberal project of destroying the power of unions.

That's why every success for the left, however compromised, however embedded in reformist politics and parliamentarianism, has to be seized upon. 

Because the dialectic of change has to start somewhere.
28 Jan 12:13

http://www.andrewrilstone.com/2015/01/special-offer-to-celebrate-publication.html

by Andrew Rilstone
SPECIAL OFFER
To celebrate the publication of One Hundred and Forty Characters in Search of an Argument, all my books are reduced by 30% on Lulu.  

PLUS Lulu is running a four-for-the-price-of-three offer {CODE TRGE15}


This means that you could buy

One Hundred and Forty Characters
The Viewer's Complete Tale
George and Joe and Jack and Bob
Do Balrogs Have Wings
Where Dawkins Went Wrong and
Who Sent The Sentinels 


You know it makes sense....


28 Jan 11:38

A one letter typo accidentally destroyed a 124-year-old engineering business.

27 Jan 21:20

Americans Are Fleeing Religion and Republicans Are To Blame

by Lisa Wade, PhD

Over the past 40 years, Americans have become increasingly likely to deny an affiliation with a religion. The graph below shows that people with “no religious preference” rose from about 5% of the population in 1972 to about 20% today. Overall, however, Americans do not report a corresponding decline in belief in God, life after death, or other religious ideas. What’s going on?

[Graph: percentage of Americans reporting “no religious preference,” 1972 to today]

Sociologists Michael Hout and Claude Fischer — the guys who made the graph above — argue that the retreat from religious affiliation is, essentially, a retreat from the political right. Religion has become strongly associated with conservative politics, so left-leaning people are choosing, instead, to identify as “spiritual but not religious.”

Here is some of their evidence. The data below represent the likelihood of rejecting a religious affiliation according to one’s political views: the more politically liberal people are, the more likely they are to have rejected religion.

[Graph: likelihood of rejecting a religious affiliation, by political views]

Using fancy statistical analyses, they explain: “generational differences in belief add nothing to explaining the cohort differences in affiliation.” That is, people haven’t lost their faith; they just disagree with religious leaders and institutions.  Hout and Fischer conclude:

Once the American public began connecting organized religion to the conservative political agenda — a connection that Republican politicians, abortion activists, and religious leaders all encouraged — many political liberals and moderates who seldom or never attended services quit expressing a religious preference when survey interviewers asked about it.

Democrats have wondered how to break the association of the right with religion and claim a little bit of moral authority for themselves. It looks like they may not need to; having failed to do so may even have come with a surprise advantage.

Lisa Wade is a professor of sociology at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. You can follow her on Twitter and Facebook.

(View original at http://thesocietypages.org/socimages)

27 Jan 21:16

Releasing a tame aspie into the wild

by Amanda J Harrington



This information is designed to guide you but should not be taken as the only way to naturalise your tame aspie. Caution is advised in any situation where an aspie is released into the wild for the first time as their usual response to wide, open spaces full of nature is one of abject terror and incomprehension.

Your good intentions in rescuing your tame aspie have brought you to the goal you have planned for months: finally, it is time to release your aspie into the wild and hope they can fend for themselves.

To ensure this is a happy experience for you and as stress-free as possible for your aspie, please consider:

Training your aspie to feed itself

This can be a long process but is ultimately rewarding. Start small and guide your aspie towards bread and butter, then work up to jam. Buy cheap jam as you will lose a lot in misguided attempts to use a knife as a spoon.

Work up to cooking hot meals. These will mainly only be warm or incinerated for a while as your aspie is either too impatient to wait or forgets they are cooking.

Try not to rely on pizza as a foodstuff and introduce your aspie to using more than one pan/tray at a time.

Do not try leaving your aspie alone with a cookbook: contrary to what you might think, it will not encourage them to do the whole process from start to finish just because it is written down. More likely, the book will be ignored because the font is wrong and the pictures are badly contrasted and the instructions have no basis in fact.

Survival skills

Your aspie will be aware that the wild is a dangerous place. Try to persuade them not to base their instinctive reactions on Zombie-Feed 6 and Planet of Terrorrrrrr!

Equip them instead with a map of the relevant wild area, a quick link on their phone, a written note reminding them to use the map and the quick link, and a gentle push to get them past the front gate.

Make sure they are aware of the usefulness of little old ladies in getting them where they need to go. Aspies are naturally suspicious of anyone else in the wild, but little old ladies are often the best first request for help as they appreciate being asked and love to give advice.

Do not let your aspie bring home any little old ladies they find in the wild. Life is complicated enough.

Getting around

Explain to your aspie about the useful but potentially terrifying power of public transport. Furnish them with another map, complete with highlighted areas and make sure they know which trains/buses not to get on.

When your aspie returns full of accusations against you because they have spent three hours on the wrong train/bus, try colour-coding your highlighted areas so they don't automatically choose the routes you told them to avoid.

When you have to collect them from Hambilam East at midnight, consider training them to use taxis.

Wild, romantic encounters

It's entirely possible that, once out of the confines of their tame environment, an aspie's attentions may turn to romance. This is only natural but does bring with it some significant challenges for the tame aspie.

As with Survival Skills, make sure your aspie does not base romantic encounters on those portrayed in their favourite video games. Instead, hand them a copy of Pride and Prejudice and hope for the best.

If your aspie tries out their own brand of romantic engagement, do be prepared to find yourself housing two tame aspies instead of one. Aspies have a tendency to run for home when going through new experiences and their new love will most likely follow. With any luck, you will now have someone to take care of you and your tame aspie for the next few years. If it doesn't work out this way, do be prepared to build on an extra room for all the gaming equipment.

Try and try again

Aspies going out into the wild are timid and fearful of danger. They need the security of home while knowing the wild world waits for them. Do be patient with the many returns your aspie will make as they learn to cope with new challenges. It is always better to welcome your aspie back, then help them leave again, than to expect them to leave all at once and end up being eaten by wolves.

If all else fails...

If your aspie becomes stubborn and is determined to be tame forever, consider how important it is to live in the wild world. Has your aspie got it right? Should we live inside, where it is warm and safe and the internet is always on? Is this not better than the cold outdoors with intermittent wifi and internet cafes where people actually expect to socialise instead of just using the internet?

If your tame aspie has tried the wild and done their best, consider that they have become too accustomed to life where it is peaceful and safe. Leave your efforts on simmer for a while, then try again. It is always worth trying again because the next time your aspie might realise they do like the wild, it was just that you were doing it all wrong and now that they have a system, everything is fine!

And when they return again, with or without love, with or without little old ladies, with or without the shopping you have been waiting for since 11am, then open the door and smile as they clatter through. Sooner or later your tame aspie will go out into the wild again and this time they may remember the milk.

Put the kettle on in anticipation and enjoy your much-deserved relaxation. Right before your phone lights up with the Hambilam East area code.

Amanda




My books and writing blog, with free stuff.
Find me on Facebook and Twitter!



27 Jan 14:26

1945 – How Churchill won the war, but lost the election

by Alun Wyburn-Powell
It is the fiftieth anniversary of the death of Winston Churchill and nearly seventy years since the 1945 election. Although he is remembered as a highly successful politician, Churchill in fact failed to win a seat in five of the 21 contests which he fought, and as party leader he never led his party to win the most votes in an election. Despite this, he served as prime minister of three very different governments.

The first was the successful wartime coalition from May 1940 to May 1945. The second was the now almost-forgotten caretaker government, which was in power from May to July 1945 after the other parties withdrew from the coalition in advance of the general election. For thirteen of its members, the 1945 caretaker government provided their only ministerial appointment. They included Ronnie Tree, son of Arthur Tree and Ethel Field, who was appropriately enough appointed as a minister at the Ministry of Town and Country Planning. One day David Cameron may look back and think that a two-month single-party caretaker government, with a brief reward of office for some of his overlooked MPs, might have been a good idea.

When the votes of the 1945 election were counted in July, Churchill’s Conservative Party had gone down to a crushing defeat at the hands of Clement Attlee’s Labour Party. Should Churchill have been surprised by his defeat in 1945? Not really. Opinion polls were available and had consistently been showing a solid lead for the Labour Party. But how did Churchill manage to lose the 1945 election after leading the Allies to victory in the war?

Among the excuses which the Conservatives offered was that the Army Bureau of Current Affairs had indoctrinated service personnel to vote Labour. This excuse was at least plausible, but also probably fairly flimsy. Parties tend to cling on to strange excuses after a poor result. After one by-election in the 1950s the Conservatives blamed the size of the constituency for their lacklustre performance, although presumably it was the same size for their opponents!

In 1945 the Conservatives lost the ‘ground war’. The party was in a weakened state on the ground with a depleted band of agents. The Conservatives, in contrast to the other parties, had stuck rigidly to the spirit and the letter of the wartime electoral truce. They had only held one party conference during the war and had put little effort into policy development and constituency organisation.

Public memory had a bearing on the outcome of the 1945 election. Lloyd George was still considered to be the man who won the First World War, but his record as prime minister after the war was dismal, with broken promises, unemployment, industrial unrest and threats to start another war. The popular conclusion was that good war leaders do not necessarily make good peacetime leaders. In 1945 the Conservatives were also still tarred with the taint of being the ‘Guilty Men’, so-called after a book which had appeared in 1940, blaming the party for the policy of appeasement which had failed to prevent the war.

British society had changed during the war and voters had become less class-bound. Evacuation of urban children to rural areas, service of all classes in the armed forces, and civilians sharing bomb shelters with strangers, had all led to a new degree of social mixing. After the First World War many people had wanted a return to life as it had been. After the second, most people wanted a complete break with the past. The forward-looking 1945 Labour pledge, ‘Let us face the future’, generated more enthusiasm than the Conservatives’ plea to let Churchill ‘finish the job’.

Churchill bore much personal responsibility for the failure of the Conservatives’ election campaign, including mishandling a party election radio broadcast in which he claimed that the Labour Party would have to employ a form of ‘Gestapo’ to implement its policies. The Labour leader, Clement Attlee, a moderate and unassuming man, had been responsible for much of Britain’s domestic policy during the war – exactly the area on which most people wanted the post-war government to concentrate. Labour ministers had proved themselves capable in key domestic roles. Although all the parties supported the proposals of the Beveridge Report, the Labour Party was more enthusiastic about its implementation than the Conservatives.

After his 1945 defeat, Churchill remained party leader and led the Conservatives into the following general election in February 1950. But he lost again. However, he was given one more opportunity and he did win the following election in 1951 – at least in terms of seats. The Conservatives won fewer votes but more seats than Labour, and went on to form a government over which Churchill presided for three and a half years until he retired at the age of 80. He survived another ten years and died on 24 January 1965.


A shorter version of my article above appeared on the Conversation: http://bit.ly/1BiUNmk