Shared posts

21 May 14:31

The M&Ms Store and the M&Mification of the World: Understanding the New Reification | Infinitely Full Of Hope

by andrewhickeywriter
03 Apr 20:32

ps please distribute this content to your professional audience of influencers across various verticals on linkedin

March 31st, 2014: ECCC was amazing and also the best. Thank you everyone who came out! It was... the most fun?? I will write more about it later when I am less exhausted!

One year ago today: beach-based fun times, FINALLY

– Ryan

02 Apr 14:51

Conditioned eye contact.

by Neurodivergent K
TW: forced eye contact, restraint & other abusive 'treatment' descriptions

I can remember the last few times I made true eye contact, as clear as a snapshot.

-I was too distressed to make words happen and no one around knew sign. Eyes wildly snapped from person to person, looking for someone who could read my mind.

-I was in the ER getting stitches on my finger, which I had accidentally locked in a car door. I told them to not tell me when they were sticking me with the needles. They told me. Eyes flew to those of the person with me.

-One of those folks who thinks boundaries don't apply to him was getting too close and too cozy on the bus. Again, probing for someone who could recognize my distress.

-Someone was threatening violence in my direction at a thing I used to do, and I was meeting with one of the folks who had authority. His words on the experience are "please stop trying to set me on fire with your mind."

The elements of these scenarios are the same. Something is happening. I do not like it. I do not like it at all, and want it to stop immediately. Yelling and swearing has not worked. I can't hit it or kick it or pretend it doesn't exist. Those strategies have failed or have a 99.9999% chance of failing based on pattern data from years of experiences.

But eye contact makes them stop doing the unpleasant thing.

This doesn't make sense, does it? You've heard that eye contact is about sharing and social referencing and subtle messages and cues being sent among communicative partners. That's not what this is at all! This is the sledgehammer. This is the safeword, if you will, the "this stops now it has to it has to it has to make it stop nownownownownow no matter what".

Where did I get this idea? Therapy. That's where.

When I was a very small child, the first thing they tried to get me to do was "look at me". If I were a small child now, they'd still be coercing looking at them. The new and improved way of forcing eye contact is to hold a desirable item between the adult's eyes and then give it to the small child when they look at it. This is still gross.

Back in my day, however, it was all out war. They would grab your face, they'd hold your hands down, they'd pretty much sit on you. It was a full out wrestling match until you submitted and looked them in the eye. Then, they immediately stopped. They immediately let go of your face or your hands or stopped sitting on you or stopped holding your shoulders so hard that the bones ground or what have you.

I was small. Hitting didn't work (I tried). Kicking was a no. Headbutting only worked once, biting was iffy. Covering my face got my hands dragged into my lap and held there. Dumping the chair and running was only a few seconds' reprieve and led to the least comfortable hold ever. They had no compunctions about prying my eyes open when I squeezed them shut as tight as I could. No boundary violations were out of bounds. The only way to make the awful stop was to look in their eyes.

Reality land does not, in fact, work that way. Eye contact is not the way to make things stop. People who know me understand that it means "something that is happening needs to not be happening right. now." Most people don't know that. People who only sort of know me can grasp that it's bad (see: "stop trying to set me on fire with your mind") but they don't know what it means. Strangers take eye contact to mean the opposite of what it does.

My brain knows that for most people a straight-in-the-eyes stare is not the signal for "something needs to stop right. now." but it isn't that easy. One of the deepest conditioned things I have is "eye contact is giving in. If you do that, the bad will stop." This is irrational and untrue and the world doesn't work that way. It's deep, though, as the first and most consistent of the wrestling matches I had with adults as a small child.

This isn't what they thought they were teaching me. They claimed to be teaching me all sorts of things about eye contact. They didn't though. They wrestled me to the ground over and over to grind a lie into my head.

02 Apr 11:28

Night-blooming Flowers: Sudden skill acquisition and extreme context-dependence

by chavisory
There was a graphic that went completely viral in both the autism parent and theater communities of Facebook a while ago, which makes occasional reappearances.  It’s a performance report note from the stage manager of a popular children’s show.  In one matinee performance, there had been some chattering from the audience at one point.  Afterward, the stage manager had learned from one of the teachers what the source of the commotion had been:  an autistic little boy, who had, as far as anyone knew, never spoken before, had suddenly started talking to the teacher sitting next to him all about what he was seeing.  No one had even thought that he knew his teacher’s name.

A couple of parents who commented were skeptical of the account.  While I resented the incident being passed around as inspiration porn by many, it was entirely possible, I argued, that under new and unique circumstances, a kid had displayed an ability for language that he’d never demonstrated before.

I was indignantly lectured by one mother about how autism is a neurobiological condition that can’t be cured by the magic of theater.

But being able to do something suddenly for the first time, or under specific circumstances, or gaining speech abilities, isn't recovery from autism; it's a really common experience, if not usually that dramatic. Autistic people can possess extreme sensitivity to environmental detail, patterning, and circumstance, and it can affect our abilities in any given moment.  Hadn't they ever had something similar happen?  Was this really not a thing that non-autistic people experienced?

They didn’t know what I was talking about.

*

We do gain skills without necessarily being drilled or pushed, just in ways that might not be apparent to observers.  Just because progress or learning aren’t being displayed to others does not mean they aren’t happening.  An ability probably wasn’t learned or gained at the moment it was first displayed, but has been “under construction,” internally, for a long time.  Things that are intuitive and easy to typical people can require long periods of interior fermentation and distillation in order to develop, and then for us to feel safe or comfortable enough to use them.

And then seemingly suddenly, when the time or circumstances are right, there they are, like a night-blooming flower.

That uncertainty about whether they will ever show themselves doesn’t make it a miracle when they do; it means it took the time or circumstances it took for that to happen.

I’ve also said before that I feel that very few of my autistic traits are, themselves, positive or negative.  They’re all double-edged swords.  Disabling or painful in one context, necessary or pleasurable in a different context. 

Amanda Baggs has characterized autism not as a specific set of permanent deficits, but as a particular way that the brain allocates cognitive resources.

That an autistic person might suddenly display an ability they’d never outwardly demonstrated before, or be able to do something under extremely specific circumstances or specific kinds of stress that they can’t do under typical, everyday circumstances, is neither magic nor miraculous.  It’s a common aspect of being autistic, and it’s one of my favorite parts.  It’s part of how being autistic works that distinct skills can develop at atypical times, seemingly in isolation from other aspects of development, or be usable in extremely specific contexts when they normally aren’t.

Part of presuming competence is maintaining awareness that just because a skill has never manifested itself yet does not mean that it never will.  And that just because someone might, in fact, never gain some particular skill does not mean that they won’t gain others.

*

Once, in the first summer I lived in my apartment, I planted some pots of morning glories out on my fire escape, where they could trail up the railings.  I saw their long, spiraling buds form, and anticipated their blooming.

Then something strange started happening.  I would get home from work in the afternoons, and the buds that had looked healthy and ready to burst that morning would be withered and dead looking.  I thought that somehow they were dying before they ever bloomed.  I couldn’t figure out what could be happening to them.

On a rare day off, I went to take a book and cup of coffee out to the fire escape to enjoy the mid-morning sun.

All the morning glories were in gorgeous, bright pink bloom.  I realized they must only open their buds at a certain angle or intensity of sunlight.  Once they did, the flowers only lasted for a few hours before wilting.  Nothing was wrong with the plants.  That was just what they did.  They weren’t the same as marigolds or tulips.

-Moonflowers bloom only at night.
-Four o’clocks bloom only in shade or late afternoon; you can trick them with an umbrella.
-Nasturtiums flower only in arid conditions; they never will if they get too much water.
-Giant corpse blossoms can bloom only once every several years, or even several decades.  That doesn’t mean they don’t, just because someone could watch one for years and never see it happen.
-The seeds of some native prairie plants can germinate only after exposure to the intense heat of a wildfire.

As a kid, I didn’t pick up bike riding at the same age as everyone else I knew.  It just didn’t work for me the way it apparently did for everyone else, and I couldn’t stay upright without training wheels for much longer than was socially acceptable.  Discouraged and embarrassed, I gave up and threw my bike into a corner of the garage.  Months or years went by, and I didn’t look at it or touch it or attempt to practice.  I couldn’t ride a bike.  That was it.

Then one day when I was 8 or 9, I just felt like I probably could.  I dug my bike out of the garage and on the very first try, started riding in perfect circles around the driveway.

For a long time, I couldn’t summon the coordination necessary to blow my nose.  Then one day I just could.  I was 17.  (I was driving before that.  I could drive before I could blow my nose.)

One day I just understood how to make my bed.  I was 31.

Gain in abilities, even when sudden and seemingly inexplicable, is not recovery from autism; the fact that we learn very differently from other people is intrinsic to autism, and one of those differences seems to be, very commonly, that immense periods of internal processing, combined with specific circumstances, can be necessary before a skill can be externalized.

There’s nothing wrong with that, nor is there anything particularly miraculous about it.

31 Mar 11:06

Not To Rain On Anyone's Same-Sex Marriage Parade But...

by noreply@blogger.com (Jae Kay)
Way back in 2009 and 2010 when I was discussing marriage equality with some left-wingers (such as members of LGBT Labour), their reluctance to listen to any mention of marriage stemmed quite strongly from their belief that I was "undermining" civil partnerships. My attempts to point out that civil partnerships were not perfect and did not grant legal equality (on issues I laid out here) were seen as an attack on their current, or future, relationship.

They'd taken civil partnerships to their hearts and any criticism of the legal issues was seen as a criticism of their own personal choices. So, as I am about to embark on yet another adventure in being the only miserable one at the gay pride party, I want to make it clear from the start here: I do not begrudge anyone who is going to enter into a marriage under the newly introduced law. I plan to take advantage of the opportunity, partner willing of course, to enter a blessed state of matrimony (blessed by the arms of Thor in case you were wondering ;) ). I wish those getting married every happiness and success. Enjoy yourselves!

However... same-sex marriage is not what I was arguing for back in 2009. As I said here, what we are getting solves only a small percentage of the issues that created the need for something better than civil partnerships in the first place.

Our Government has done marriage equality on the cheap. Stonewall, after they got over the opposition of 10% of their members drowning out the other 90%, have been next to useless, doing little other than cheering from the sidelines. I don't think even today they've realised how rubbish the legislation really is, and it'll probably be another 10 years before they even get around to declaring their support for a correction of its failings.

We must continue to fight for a correction to the errors made over the last couple of years and make the same-sex marriage act into something even better. I'm not sure I have the heart for that fight. But I'm principled enough to point out that we are not yet there...

31 Mar 11:02

Lorenzo Semple Jr., R.I.P.

by evanier

Writer Lorenzo Semple Jr. has died at the age of 91. A lot of folks reading this will recognize him as the main writer behind the Batman TV show of the sixties…and he did many other things, as this obit will tell you. I was not a fan of that show (or what it did to the public image of comic books) but Mr. Semple had a splendid body of work and he deserves to be celebrated for it…and for things other than Batman.

Anthony Tollin sent me a link to the obit. And Anthony Fiske sent me a question, asking why Semple is being billed as the "creator of Batman" in some reports. Well, actually, I think they say he was the creator of the TV series, which is also not the proper phraseology. The Writers Guild has changed the rules from time to time but generally now, the person who writes the bible and pilot for a new series based on pre-existing material — which is what Semple did for Batman — gets the credit, "Developed for television by…" It was a little less formalized back when the show went on.

31 Mar 10:47

Wheat: Much More Than You Wanted To Know

by Scott Alexander

After hearing conflicting advice from diet books and the medical community, I decided to look into wheat.

There are two sets of arguments against including wheat in the diet. First, wheat is a carbohydrate, and some people support low carbohydrate diets. Second, something might be especially dangerous about wheat itself.

It was much easier to figure out the state of the evidence on low-carbohydrate diets. They seem to be at least as good and maybe a little better for weight loss than traditional diets, but this might just be because there are lots of carbohydrates that taste very good and when forced to avoid them, people eat less stuff. They may or may not positively affect metabolic parameters and quality of life. (1, 2, 3, 4). They don’t seem to cause either major health benefits or major health risks in the medium term, which is the longest term for which there is good data available – for example, they have no effect on cancer rates. Overall they seem solid but unspectacular. But there’s a long way between “low carbohydrate diet” and “stop eating wheat”.

So I was more interested in figuring out what was going on with wheat in particular.

Wheat contains chemicals [citation needed]. The ones that keep cropping up (no pun intended) in these kinds of discussions are phytates, lectins, gluten, gliadin, and agglutinin, the last three of which for your convenience have been given names that all sound alike.

Various claims have been made about these chemicals’ effects on health. These have some prima facie plausibility. Plants don’t want to be eaten [citation needed] and they sometimes fill their grains with toxins to discourage animals from eating them. Ricin, a lectin in the seeds of the castor oil plant so toxic it gets used in chemical warfare, is a pretty good example. Most toxins are less dramatic, and most animals have enzymes that break down the toxins in their preferred food sources effectively. But if humans are insufficiently good at this, maybe because they didn’t evolve to eat wheat, some of these chemicals could be toxic to humans.

On the other hand, this same argument covers pretty much every grain and vegetable and a lot of legumes – pretty much every plant-based food source except edible fruits. So we need a lot more evidence to start worrying about wheat.

I found the following claims about negative effects of wheat:

1. Some people without celiac disease are nevertheless sensitive to gluten.
2. Wheat increases intestinal permeability, causing a leaky gut and autoimmune disease.
3. Digestion of wheat produces opiates, which get you addicted to wheat.
4. Wheat something something something autism and schizophrenia.
5. Wheat has been genetically modified recently in ways that make it much worse for you.
6. The lectins in wheat interfere with leptin receptors, making people leptin resistant and therefore obese.

I’ll try to look at each of those and then turn to the positive claims made about wheat to see if they’re strong enough to counteract them.

Some People Without Celiac Disease Are Sensitive To Gluten
Mostly true but of limited significance

Celiac disease is one source of concern. Everybody on all sides of the wheat debate agrees about the basic facts of this condition, which affects a little less than 1% of the population. People with celiac disease have severe reactions to the gluten in wheat. The disease is mostly marked by gastroenterological complaints – diarrhea, bloating, abdominal pain – but it is also associated with vitamin deficiencies, anaemia, skin reactions, infertility, and “malaise”. It can be pretty straightforwardly detected by blood tests and gut biopsies and is not subtle.

People start to disagree about the existence of “gluten sensitivity”, which if it existed would be a bad reaction to gluten even in people who don’t test positive for celiac disease. Many people believe they have gastrointestinal (or other) symptoms that go away when they eat gluten-free diets, but science can’t find anything wrong with their intestines that could be causing the problems.

A recent study somewhat vindicated these people. Biesiekierski 2011 describes a double-blind randomized controlled trial: people who said they had “gluten-sensitive” irritable bowel syndrome were put on otherwise gluten-free diets and then randomly given either gluten or a placebo. They found that the patients given gluten reported symptoms (mostly bowel-related and tiredness) much more than those given placebo (p = 0.0001) but did not demonstrate any of the chemical, immunological, or histological markers usually associated with celiac disease. A similar Italian study found the same thing, except that they did find a higher rate of anti-gluten antibodies in their patients. Another study found that non-celiacs with antibodies to gluten had higher rates of mortality. And another study did find a histological change in bowel barrier function on this group of patients with the introduction of gluten. And another study from the same group found that maybe FODMAPs, another component of wheat, are equally or more responsible.

The journal Gastroenterology, which you may not be surprised to learn is the leading journal in the field of gastroenterology, proclaims:

The current working definition of nonceliac gluten sensitivity (NCGS) is the occurrence of irritable bowel syndrome (IBS)-like symptoms after the ingestion of gluten and improvement after gluten withdrawal from the diet after exclusion of celiac disease based on negative celiac serologies and/or normal intestinal architecture and negative immunoglobulin (Ig)E-mediated allergy tests to wheat. Symptoms reported to be consistent with NCGS are both intestinal (diarrhea, abdominal discomfort or pain, bloating, and flatulence) and extra-intestinal (headache, lethargy, poor concentration, ataxia, or recurrent oral ulceration). These criteria strongly and conveniently suggest that NCGS is best understood as a subset of IBS or perhaps a closely related but distinct functional disorder. Although the existence of NCGS has been slowly gaining ground with physicians and scientists, NCGS has enjoyed rapid and widespread adoption by the general public.

But even this isn’t really that interesting. Maybe some people with irritable bowel syndrome or certain positive antibodies should try avoiding gluten to see if it helps their specific and very real symptoms. At most ten percent of people test positive on antibody testing, and not all of those even have symptoms. That’s still a far cry from saying no one should eat wheat.

But the anti-wheat crowd says an alternative more sensitive antibody test could raise sensitivity as high as a third of the population. The test seems to have been developed by a well-respected and legitimate doctor, but it hasn’t as far as I can tell been submitted for peer review or been confirmed by any other source. Meh.

That’s boring anyway. The real excitement comes from sweeping declarations that the entire population is sensitive to wheat.

Wheat Increases Intestinal Permeability Causing A Leaky Gut
Probably true, of uncertain significance

There are gluten-induced mucosal changes in subjects without small bowel disease. And gliadin increases intestinal permeability in the test tube, which should be extremely concerning to any test tubes reading this.

But probably the bigger worry here is lectins, which include wheat germ agglutinin. WGA affects the intestinal permeability of rats, which should be extremely concerning to any rats reading this. The same substance has been found to produce pro-inflammatory cytokines and interfere with the growth of various organs, including the gut.

So there’s pretty good evidence that chemicals in wheat can increase intestinal permeability. Who cares?

For years, “leaky gut syndrome” was an alternative medicine diagnosis that was soundly mocked by the mainstream medical establishment. Then the mainstream medical establishment confirmed it existed and did that thing where they totally excused their own mocking of it but were ABSOLUTELY OUTRAGED that the alternative medicine community might have in some cases been overenthusiastic about it.

Maybe I’m being too harsh. The alternative medicine community often does take “leaky gut syndrome” way too far.

On the other hand, it’s probably real and Nature Clinical Practice is now publishing papers saying it is “a key ingredient in the pathogenesis of autoimmune diseases” and “offers innovative, unexplored approaches for the treatment of these devastating diseases” and gut health has been deemed “a new objective in medicine”. Preliminary changes to intestinal permeability have been found in asthma, in diabetes, and even in depression.

But it’s not yet clear if this is cause and effect. Maybe the stress of having asthma increases intestinal permeability somehow. Or maybe high intestinal permeability causes asthma somehow. It sure seems like the latter might work – all sorts of weird antigens and stuff from food can make it into the bloodstream and alarm the immune system – but right now this is all speculative.

So what we have is some preliminary evidence that wheat increases intestinal permeability, and some preliminary evidence that increased intestinal permeability is bad for you in a variety of ways.

And I don’t doubt that those two facts are true, but my knowledge of this whole area is so weak that I wonder how much to worry.

What other foods increase intestinal permeability? Do they do it more or less than wheat? Has anyone been investigating this? Are there common things that affect intestinal permeability a thousand times more than wheat does, such that everything done by wheat is totally irrelevant in comparison?

Do people without autoimmune diseases suffer any danger from increased intestinal permeability? How much? Is it enough to offset the many known benefits of eating wheat (to be discussed later)? Fiber seems to decrease intestinal permeability and most people get their fiber from bread; would decreasing bread consumption make leaky gut even worse?

I find this topic really interesting, but in a “I hope they do more research” sort of way, not an “I shall never eat bread ever again” sort of way.

Digestion Of Wheat Produces Opiates, Which Get You Addicted To Wheat
Probably false, but just true enough to be weird

Dr. William Davis, a cardiologist, most famously makes this claim in his book Wheat Belly. He says that gliadin (a component of gluten) gets digested into opiates, chemicals similar to morphine and heroin with a variety of bioactive effects. This makes you addicted to food in general and wheat in particular, the same way you would get addicted to morphine or heroin. This is why people are getting fat nowadays – they’re eating not because they’re hungry, but because they’re addicted. He notes that drugs that block opiates make people want wheat less.

Does Wheat Make Us Fat And Sick, a review published in the Journal of Cereal Science (they have journals for everything nowadays), is a good rebuttal to some of Davis’ claims and a good pro-wheat resource in general.

They say that although gliadin does digest into opiates, those opiates are seven-unit peptides and so too big to be absorbed from the gut into the bloodstream.

(note that having opiates in your gut isn’t a great idea either since there are lots of nerves there controlling digestion that can be affected by these drugs)

But I’m not sure this statement about absorption is even true. First, large proteins can sometimes make it out of the gut intact. Second, if all that leaky gut syndrome stuff above is right, maybe the gut is unusually permeable after wheat consumption. Third, there have been sporadically reported cases of gliadin-derived opiates found in the urine, which implied they got absorbed somehow.

There’s a better counterargument on the blog The Curious Coconut. She notes that there’s no evidence these peptides can cross the blood-brain barrier, a precondition for having any psychological effects. And although the opiate-blocker naloxone does decrease appetite, this effect is not preferential for wheat, and probably more related to the fact that opiates are the way the brain reminds itself it’s enjoying itself (so that opiate-blocked people can’t enjoy eating as much).

And then there’s the usual absence of qualifiers. Lots of things are “chemically related” to other chemicals without having the same effect; are gliadin-derived opiates addictive? Are they produced in quantities high enough to be relevant in real life? Corn, spinach, and maybe meat can all get digested into opiates – is there any evidence wheat-derived opiates are worse? This is really sketchy.

The most convincing counterargument is that as far as anyone can tell, wheat makes people eat less, not more:

Prospective studies suggest that weight gain and increases in abdominal adiposity over time are lower in people who consume more whole grains. Analyses of the Physicians’ Health Study (27) and the Nurses’ Health Study (26) showed that those who consumed more whole grain foods consistently weighed less than those who consumed fewer whole grain foods at each follow-up period of the study. Koh-Banerjee et al. (27) estimated that for every 40-g increase in daily whole grain intake, the 8-y weight gain was lower by 1.1 kg.

I’ll discuss this in more detail later, but it does seem like a nail in the coffin for the “people eat too much because they’re addicted to wheat” theory.

Still, who would have thought that wheat being digested into opiates was even a little true?

Wheat Something Something Something Autism And Schizophrenia
Definitely weird

Since gluten-free diets get tried for everything, and everything gets tried for autism, it was overdetermined that people would try gluten-free diets for autism.

All three of the issues mentioned above – immune reactivity to gluten, leaky guts, and gliadin-derived opiates – have been suggested as mechanisms for why gluten free diets might be useful in autism.

Of studies that have investigated, a review found that seven reported positive results, four negative results, and two mixed results – but that all of the studies involved were terrible and the ones that were slightly less terrible seemed to be more negative. The authors described this as evidence against gluten-free diets for autism, although someone with the opposite bias could have equally well looked at the same review and described it as supportive.

However, a very large epidemiological study found (popular article, study abstract) that people with antibodies to gluten had three times the incidence of autism spectrum disease than people without, and that the antibodies preceded the development of the condition.

Also, those wheat-derived opioids from the last section – as well as milk-derived opioids called casomorphins – seem to be detected at much higher rates in autistic people.

Both of these factors may have less to do with wheat in particular and more to do with some general dysregulation of peptide metabolism in autism. If for some reason the gut kept throwing peptides into the body inappropriately, this would disrupt neurodevelopment, lead to more peptides in the urine, and give the immune system more chance to react to gluten.

The most important thing to remember here is that it would be really wrong to say wheat might be “the cause” of autism. Most likely people do not improve on gluten-free diets. While there’s room to argue that people might have picked up a small signal of them improving a little, the idea that this totally removes the condition is right out. If we were doing this same study with celiac disease, we wouldn’t be wasting our time with marginally significant results. Besides, we know autism is multifactorial, and we know it probably begins in utero.

Schizophrenia right now is in a similar place. Schizophrenics are five to seven times more likely to have anti-gliadin antibodies as the general population. We can come up with all sorts of weird confounders – maybe antipsychotic medications increase gut permeability? – but that’s a really strong result. And schizophrenics have frank celiac disease at five to ten times the rate of the general population. Furthermore, a certain subset of schizophrenics sees a dramatic reduction in symptoms when put on a strict gluten-free diet (this is psychiatrically useless, both because we don’t know which subset, and because given how much trouble we have getting schizophrenics to swallow one lousy pill every morning, the chance we can get them to stick to a gluten-free diet is basically nil). And like those with autism, schizophrenics show increased levels of weird peptides in their urine.

But a lot of patients with schizophrenia don’t have reactions to gluten, a lot don’t improve on a gluten free diet, and other studies question the research showing that any of them at all do.

The situation here looks a lot like autism – a complex multifactorial process that probably isn’t caused by gluten but where we see interesting things going on in the vague territory of gluten/celiac/immune response/gut permeability/peptides, with goodness only knows which ones come first and which are causal.

Wheat Has Been Genetically Modified Recently In Ways That Make It Much Worse For You
Probably true, especially if genetically modified means “not genetically modified” and “recently” means “nine thousand years ago”

If you want to blame the “obesity epidemic” or “autism epidemic” or any other epidemic on wheat, at some point you have to deal with people eating wheat for nine thousand years and not getting epidemics of these things. Dr. Davis and other wheat opponents have turned to claims that wheat has been “genetically modified” in ways that improve crop yield but also make it more dangerous. Is this true?

Wheat has not been genetically modified in the classic sense, the one where mad scientists with a god complex inject genes from jellyfish into wheat and all of a sudden your bread has tentacles and every time you try to eat it it stings you. But it has been modified in the same way as all of our livestock, crops, and domestic pets – by selective breeding. Modern agricultural wheat doesn’t look much like its ancient wild ancestors.

The Journal Of Cereal Science folk don’t seem to think this is terribly relevant. They say:

Gliadins are present in all wheat lines and in related wild species. In addition, seeds of certain ancient types of tetraploid wheat have even greater amounts of total gliadin than modern accessions… There is no evidence that selective breeding has resulted in detrimental effects on the nutritional properties or health benefits of the wheat grain, with the exception that the dilution of other components with starch occurs in modern high yielding lines (starch comprising about 80% of the grain dry weight). Selection for high protein content has been carried out for bread making, with modern bread making varieties generally containing about 1–2% more protein (on a grain dry weight basis) than varieties bred for livestock feed when grown under the same conditions. However, this genetically determined difference in protein content is less than can be achieved by application of nitrogen fertilizer. We consider that statements made in the book of Davis, as well as in related interviews, cannot be substantiated based on published scientific studies.

In support of this proposition, in the test tube ancient grains were just as bad for celiac patients’ immune systems as modern ones.

And yet in one double-blind randomized-controlled trial, people with irritable bowel syndrome felt better on a diet of ancient grains than modern ones, and showed lower inflammatory markers and generally better nutritional parameters than people on a modern grain one. Isn’t that interesting?

Even though it’s a little bit weird and I don’t think anyone understands the exact nutrients at work, sure, let’s give this one to the ancient grain people.

The Lectins In Wheat Interfere With Leptin Receptors, Making People Leptin Resistant And Therefore Obese
Currently at “mere assertion” level until I hear some evidence

So here’s the argument. Your brain has receptors for the hormone leptin, which tells you when to stop eating. But “lectin” sounds a lot like “leptin”, and this confuses the receptors, so they give up and tell you to just eat as much as you want.

Okay, this probably isn’t the real argument. But even though a lot of wheat opponents cite the heck out of this theory, the only presentation of evidence I can find is Jonsson et al (2005), which points out that there are a lot of diseases of civilization, they seem to revolve around leptin, something common to civilization must be causing them, and maybe that thing could be lectin.

But civilization actually contains more things than a certain class of proteins found in grains! There’s poor evidence of lectin actually interfering with the leptin receptor in humans. The only piece of evidence they provide is a nonsignificant trend toward more cardiovascular disease in people who eat more whole grains in one study, and as we will see, that is wildly contradicted by all other studies.

This one does not impress me much.

Wheat Is Actually Super Good For You And You Should Have It All The Time
Probably more evidence than the other claims on this list

Before I mention any evidence, let me tell you what we’re going to find.

We’re going to find very, very many large studies finding conclusively that whole grains are great in a lot of different ways.

And we’re not going to know whether it’s at all applicable to the current question.

Pretty much all these studies show that people with some high level of “whole grain consumption” are much healthier than people with some lower level of same. That sounds impressive.

But what none of these studies are going to do a good job ruling out is that whole grain is just funging against refined grain which is even worse. Like maybe the people who report low whole grain consumption are eating lots of refined grain, and so more total grain, and the high-whole-grain-consumption people are actually eating less grain total.

They’re also not going to rule out the universal problem that if something is widely known to be healthy (like eating whole grains) then the same health-conscious people who exercise and eat lots of vegetables will start doing it, so when we find that the people doing it are healthier, for all we know it’s just that the people doing it are exercising and eating vegetables.

That having been said, eating lots of whole grain decreases BMI, metabolic risk factors, fasting insulin, and body weight (1, 2, 3, 4, 5).

The American Society For Nutrition Symposium says:

Several mechanisms have been suggested to explain why whole grain intake may play a role in body weight management. Fiber content of whole grain foods may influence food volume and energy density, gastric emptying, and glycemic response. Whole grains have also been proposed to play an important role in promoting satiety; individuals who eat more whole grain foods may eat less because they feel satisfied with less food. Some studies comparing feelings of fullness or actual food intake after ingestion of certain whole grains, such as barley, oats, buckwheat, or quinoa, compared with refined grain controls indicated a trend toward increased satiety with whole grains. These data are in accordance with analyses determining the satiety index of a large number of foods, which showed that the satiety index of traditional white bread was lower than that of whole grain breads. However, in general, these satiety studies have not observed a reduction in energy intake; hence, further research is needed to better understand the satiety effects of whole grains and their impact on weight management.

Whole grains, in some studies, have also been observed to lower the glycemic and insulin responses, affect hunger hormones, and reduce subsequent food intake in adults. Ingestion of specific whole grains has been shown to influence hormones that affect appetite and fullness, such as ghrelin, peptide YY, glucose-dependent insulinotropic polypeptide, glucagon-like peptide 1, and cholecystokinin. Whole grain foods with fiber, such as wheat bran or functional doses of high molecular weight β-glucans, compared with lower fiber or refined counterparts have been observed to alter gastric emptying rates. Although it is likely that whole grains and dietary fiber may have similar effects on satiety, fullness, and energy intake, further research is needed to elucidate how, and to what degree, short-term satiety influences body weight in all age groups.

Differences in particle size of whole grain foods may have an effect on satiety, glycemic response, and other metabolic and biochemical (leptin, insulin, etc.) responses. Additionally, whole grains have been suggested to have prebiotic effects. For example, the presence of oligosaccharides, RS, and other fermentable carbohydrates may increase the number of fecal bifidobacteria and lactobacilli (49), thus potentially increasing the SCFA production and thereby potentially altering the metabolic and physiological responses that affect body weight regulation.

In summary, the current evidence among a predominantly Caucasian population suggests that consuming 3 or more servings of whole grains per day is associated with lower BMI, lower abdominal adiposity, and trends toward lower weight gain over time. However, intervention studies have been inconsistent regarding weight loss.

The studies that combined whole and refined grains are notably fewer. But Dietary Intake Of Whole And Refined Grain Breakfast Cereals And Weight Gain In Men finds that among 18,000 male doctors, those who ate breakfast cereal (regardless of whether it was whole or refined) were less likely to become overweight several years later than those who did not (p = 0.01). A book with many international studies reports several that find a health benefit of whole grains, several that find a health benefit of all grains (Swedes who ate more grains had lower abdominal obesity; Greeks who ate a grain-rich diet were less likely to become obese; Koreans who ate a “Westernized” bread-and-dairy diet were less likely to have abdominal obesity) and no studies that showed any positive association between grains and obesity, whether whole or refined.

I cannot find good interventional trials on what happens when a population replaces non-grain with grain.

On the other hand, Dr. Davis and his book Wheat Belly claim:

Typically, people who say goodbye to wheat lose a pound a day for the first 10 days. Weight loss then slows to yield 25-30 pounds over the subsequent 3-6 months (differing depending on body size, quality of diet at the start, male vs. female, etc.)

Recall that people who are wheat-free consume, on average, 400 calories less per day and are not driven by the 90-120 minute cycle of hunger that is common to wheat. It means you eat when you are hungry and you eat less. It means a breakfast of 3 eggs with green peppers and sundried tomatoes, olive oil, and mozzarella cheese for breakfast at 7 am and you’re not hungry until 1 pm. That’s an entirely different experience than the shredded wheat cereal in skim milk at 7 am, hungry for a snack at 9 am, hungry again at 11 am, counting the minutes until lunch. Eat lunch at noon, sleepy by 2 pm, etc. All of this goes away by banning wheat from the diet, provided the lost calories are replaced with real healthy foods.

Needless to say, he has no studies supporting this assertion. But the weird thing is, his message board is full of people who report having exactly this experience, my friends who have gone paleo have reported exactly this experience, and when I experimented with it, I had pretty much exactly this experience. Even the blogger from whom I took some of the strongest evidence criticizing Davis says she had exactly this experience.

The first and most likely explanation is that anecdotal evidence sucks and we should shut the hell up. Are there other, less satisfying explanations?

Maybe completely removing wheat from the diet has a nonlinear effect relative to cutting down on it? For example, in celiac disease there is no such thing as “partially gluten free” – if you have any gluten at all, your disease comes back in full force. This probably wouldn’t explain Dr. Davis’ observation – neither I nor my other wheatless-experimentation friends were as scrupulous as a celiac would have to be. But maybe there’s a nonlinear discrepancy between people who have 75% the wheat of a normal person and 10% the wheat of a normal person?

Maybe there’s an effect where people who like wheat but remove it from the diet are eating things they don’t like, and so eat less of them? But people who don’t like wheat like other stuff, and so eat lots of that?

Maybe wheat in those studies is totally 100% a confounder for whether people are generally healthy and follow their doctor’s advice, and the rest of the doctor’s advice is really good but the wheat itself is terrible?

Maybe cutting out wheat has really positive short-term effects, but neutral to negative long-term effects?

Maybe as usual in these sorts of situations, the simplest explanation is best.

Final Thoughts

Non-celiac gluten sensitivity is clearly a real thing. It seems to produce irritable bowel type symptoms. If you have irritable bowel type symptoms, it might be worth trying a gluten-free diet for a while. But the excellent evidence for its existence doesn’t seem to carry over to the normal population who don’t experience bowel symptoms.

What these people have are vague strands of evidence. Something seems to be going on with autism and schizophrenia – but most people don’t have autism or schizophrenia. The intestinal barrier seems to become more permeable with possible implications for autoimmune diseases – but most people don’t have autoimmune disease. Some bad things seem to happen in rats and test tubes – but most people aren’t rats or test tubes.

You could, of course, take a position of maximum caution – wheat seems to do all these things, and even though none of them in particular obviously hurts me directly, all of them together make it look like the body just doesn’t do very well with this substance, and probably other ways the body doesn’t do very well with this substance will turn up, and some of them probably affect me.

There’s honor in a position of maximum caution, especially in a field as confusing as nutrition. It would not surprise me if the leaky gut connection turned into something very big that had general implications for, for example, mental health. And then people who ate grain might regret it.

But stack that up against the pro-wheat studies. None of them are great, but they mostly do something the anti-wheat studies don’t: show direct effect on things that are important to you. Most people don’t have autism or schizophrenia, but most people do have to worry about cardiovascular disease. We do have medium-term data that wheat doesn’t cause cancer, or increase obesity, or contribute to diabetes, or any of that stuff, and at this point solely based on the empirical data it seems much more likely to help with those things than hurt.

I hope the role of intestinal permeability in autoimmune disease gets the attention it deserves – and when it does, I might have to change my mind. I hope people stop being jerks about gluten sensitivity, admit it exists, and find better ways to deal with it. And if people find that eliminating bread from their diet makes them feel better or lose weight faster, cool.

But as far as I can tell the best evidence is on the pro-wheat side of things for most people at most times.

[EDIT: An especially good summary of the anti-wheat position is 6 Ways Wheat Can Destroy Your Health. An especially good pro-wheat summary is Does Wheat Make Us Fat And Sick?]

31 Mar 10:00

Labour party proposes to start charging separately for healthcare.

28 Mar 16:33

the shock of the library: oasis versus all of art and culture

by pˆnk s lord sükråt cunctør

“Unless we can somehow recycle the concept of the great artist so that it supports Chuck Berry as well as it does Marcel Proust, we might as well trash it altogether” — Robert Christgau

“But rock criticism does something even more interesting, changing not just our idea of who gets to be an artist but of who gets to be a thinker. And not just who gets to be a thinker, but which part of our life gets to be considered ‘thought’. Say that – using rockers like Chuck and Elvis as intellectual models – young Christgau, Meltzer, Bangs, Marcus et al. grow up to understand that rock ’n’ roll isn’t just what you write about, it’s what you do. It’s your mode of thought. And if you do words on the page, then your behavior on the page doesn’t follow standard academic or journalistic practice, and is baffling for those who expect it to.” –Frank Kogan (responding to Xgau; update: see Frank’s comments below for link to complete piece, or go here)

“People who write and read and review books are fucking putting themselves a tiny little bit above the rest of us who fucking make records and write pathetic little songs for a living.” — Noel Gallagher

Some time in the late 80s or early 90s, I heard a story that may not be true, or anyway may not be fair. I remember it because it’s quite funny, and seemed at the time telling (though I think not in the way I then felt it was). It was told of one-time Kirkcaldy-born art-punker turned model turned TV presenter turned film-maker Richard Jobson, and it came to me via two writers very much self-taught as readers and intellectuals, both from more or less working-class backgrounds, both bookish even by my standards (and both unnamed, as I don’t entirely trust my memory for gossip over 25 years, and I don’t want to get them into bother). As Writer2 told it, Writer1 was interviewing Jobson in his home. Making some point about working class cultural heritage and self-education — and his own hungry autodidact fascination with, for example, the War Poets — Jobson had pointed to a large bookcase, full of books: Writer1 asked how many he’d actually read. “Well,” said Jobson, “I’ve definitely made sure to read the first page of every one.”

I say 25 years — but I’m not quite sure when this interview took place: The Skids dissolved in 1983; Jobson’s second band, The Armoury Show, lasted until about 1988; not long after it folded, Jobson was riding high as a cultural gatekeeper on TV, a presenter and tastemaker. Certainly it was during this last phase that I *heard* the story — when I took it to be a tale against the kind of television Jobson was involved in; fast-food magazine shows in which everything (fiction, film, art, politics) was self-regardingly skimmed, and converted into this week’s glamourbot accessory. Nothing was ever mastered.

But there was always also a kind of carnival naughtiness about them. What had punk’s original promise been, after all? Let’s start over! Say that everyone’s equal and begin from there! Set all the cultural value-meters back to zero and re-run the entire race, starting now! Here was a return to its energy and levelling allure: here was all culture, and here we all were, barrelling through it all, to enthuse and explore and play and pretend; to learn as well as to gurn; to take what we found and use it as we pleased. And this pleasure of course included playing pranks on the swots of this world, and how they imagine they grasp the world better. There’s a punkily disarming cheek about Jobson’s admission — or boast, or performance, or whatever it is. And an urchin challenge too: I’m a Gemini, not George Steiner, I haven’t read all my books either, maybe more than one page each, but even if we ignore the ones I haven’t quite er begun yet, how much more than one is acceptable, when it isn’t all (and a stiff exam passed)? On what page does the authentic cultural capital kick in? I very much trust my boredom.

This idea of the year-zero reset was as adolescent and as silly as it was intoxicating, of course: and as intoxicating as it was old-school. The 1913 Armory Show had been a reset: the New York exhibition that announced the arrival of the new “modern” art in the USA, a flamboyantly new creative language and attitude. And here was a more or less hitless 80s new wave group — minor league by most accounting — snaffling this legendary name: at once a blatant appeal to high-art authority via fancy reference, and an insolent snook-cocking upturning of same: we name ourselves thus because we are EXACTLY AS IMPORTANT. And even as they rolled their eyes at such an absurd — not to say pretentious — spectacle, all kinds of commentators caught up in punk’s aftermath (“flamboyantly new creative language and attitude”) had placed themselves at the exact same oedipal fork: of course they too want to be urchins running through museums, but there’s also the urge to seek employment as enthusiastic ushers, showing one and all how exactly this (old-school) radical art ought to be understood and used.

And so there was always already a schoolyard-type squabble over who gets to be considered a “thinker”. Rock was always a dramatisation of growing up in public; less a refusal of the demands and changes and skills that school might produce than a theatre of the confused hope of an alternative: combination NO and YES. And we all have to start somewhere: and sometimes you need the goofy and even the deluded elements to help you open up cultural space for what maybe might matter (if only to you): the schoolyard play-acting and grandstanding.

And sometimes it’s a help to have missed the fights on the day, to grasp better where they’re really coming from. I’d left The Wire in early 1994, and really didn’t pay attention to pop for several years — Britpop’s distant alarums were very distant indeed for me; I didn’t engage; I also didn’t get bored; I was teaching myself ancient music history, pretty much. So there’s this cartoon of Be Here Now as mere lout-culture frenzy, driving down all possibility from wider pop of the art of the weird, the queer, the clever, the political, the experimental, the art-school (as Weej more or less puts it on a Popular thread): but coming back at it with a skewed ear, and probably more burnt out on the institutionalised complacency of experiment and avant-garde rebel pose, that’s not really what I hear.

These exhausting seven-minute drone epics all set about with clouds of FX, the oddly delicate seagulls of reverse-tape guitar: someone should do a compare-and-contrast with Band of Susans! Is Liam’s singing really any less placemarker than Sonic Youth’s? What happens when you recognise the vocals as basically instrumental elements? Pedal-point noise-roar against quiet integument-backdrops of musique concrète; the deliberately evasive Burroughsian cut-ups in the words: Oasis couldn’t pass an exam on the lineage of any of this — but the mark-grubbing parading of approved non-pop forebears is the problem they intuitively set themselves against. A Gallagher would never hint that he admired any school-approved non-pop mode of aesthetic sensibility — because in that moment he’d see himself become a kind of Manc Momus, the usher of culture stamping on a self-taught council-estate urchin face forever. But of course there are garden-variety avant-gardists also, of a lineage pre-redeemed by chart-topping sales — and plastered with Lennon’s (rather than Yoko’s) face. The clutching at the (dad)rock canon isn’t simply truculent and disastrous, it’s actively misleading. In clinch with the swotty twerp Albarn — well, let’s just say the clash often pushed both sides out towards the worst of themselves. The oddities at the edges of Oasis arrangements seem designed to be drowned out by the dominant lout-stomp; to be treated as ornament. You really have to listen against the main lairy bellow of the sound for the (very) masked musicianly detail. Even a decade later, with their 2008 return-and-farewell Dig Out Your Soul, when you could finally recognise a miniaturist’s mannerist sensibility, a still small connoisseur’s mutter, it’s still mostly framed in maximalist grind, Liam’s chant upfront. Who you never need to listen to twice, to catch anything you didn’t get the first time: a triumphant coarsened framing context from which the arty stuff peeks, in hopeful wait for the aesthetic respect it never quite gets…

Look back at the quote up top, Noel G’s pronouncements about books. It’s natural to see them as anti-book and anti-reading. But look past the anxiety and the aggression, and he isn’t even (quite) saying he’s anti-fiction; he’s saying he’s pro-music-making. To speak in very unGallagherly mode, he’s arguing that the dominant critical hierarchy exalts fiction-making (and those who enjoy the results) over music-making (and those who ditto ditto); but that he (relevantly a music-maker) does not. [Update: as Frank pointed out in comments, my reasoning seemed a bit opaque here -- I had in mind a section on the GQ interview I'd forgotten I hadn't quoted, Noel G contrasting his taste in books with his wife's: “I only read factual books. I can’t think of… I mean, novels are just a waste of f***ing time. I can’t suspend belief in reality… I just end up thinking, ‘This isn’t f***ing true.’ I like reading about things that have actually happened. I’m reading this book at the minute – The Kennedy Tapes. It’s all about the Cold War, the Cuban Missile Crisis – I can get into that.”]

Oasis came up entirely in post-punk time, in age and as industry backdrop: they never had to tussle with prog or whiteblues R&B, or even punk, really. But yes, they too confronted the oedipal fork: for all that it resolves itself in the Oasis-Blur wars — each combatant assigned a clear role — you look a bit harder and it’s right there in the interviews, that patented Gallagher performance, structured as much as anything round a self-loathing sibling rivalry: which brother is the least deluded; which best retains the pap-art [update: POP-art] faith…

A pitch to reassert music-making as a value is everywhere taken — masked as it is in a dogmatic rockcanon doggedness — as an assault on every other kind of cultural value; the Gallaghers themselves make exactly this mistake. So the Gallaghers may loudly and proudly scorn actually existing higher art; and may insist that rock is properly anti-school, and that they are properly and heroically anti-intellectual. But this is only true if the “mode of thought” of musicians is cast out from the ordinary understanding of intellect. Somehow, in the 70s, in the 80s, music-as-an-art-in-itself had become cut off from the ordinary trust the rest of the arts still somewhat received. I think it’s right to resist this, and — however inchoately — I think resentment at it is what drives Oasis. Their Oasis — their hubristic triumph, its as-swift collapse into a stupid shadow of itself — was a symptom of this curious situation.

There was never a golden age of countercultural comity of course — at most what there was, within rock, was a brief imagined utopia of progressive quilted fusion (discussed here): punk broke prog’s programme of radical inclusivity (all modes of sound-practice gathered into one stream) into useless pieces and furiously dispersed the fragments; post-punk was the various fragments unblurring and emerging seemingly whole in themselves, against a miasmatic background of ambition and doubt (explored somewhat cryptically here). It was at once almost hysterically out-facing (can we not attitudinally master and absorb anything now? from football to fashion to radical politics to avant-garde performance art?) and pervasively jittery with anxiety at the problem of cohesion. Before punk, a kind of anyone-can-do-it hard rock blues sound (guitar and vocal styling) had functioned as the thing “we all” agreed on — the notionally shared tongue, the element that made it all “rock” — into which could be poured anything from country to raga, the music-college swagger and self-belief of prog with the New Deal/Civil Rights/anti-war movement vagueness of hippie politics… After punk, a futile subcultural precession trooped the audition catwalk for the role of countercultural centre: two-tone, goth, rockabilly, afropop and electropop, besuited jazz, revenant psychedelia, post-no wave artnoise… And week after week in the NME, popstars of all kinds sketched out their own individual pan-cultural gallimaufry: Portrait of the Artist as a Consumer.

The post-punk break-up had seemed an amazing liberation — and it surely helped that so many from the unexpected sides of the tracks were emboldened to let themselves loose among all this high cultural stuff. But (as is often the way of amazing liberations, perhaps) it had quickly clotted into a landscape of jealously embattled fiefdoms and fandoms…

Plight (extrait) from Véronique Mouysset on Vimeo.

Which isn’t the only way. Here’s another story, from 1986, about who gets to be a thinker. The BBC’s Arena was shooting a documentary about a Joseph Beuys installation at the Anthony D’Offay Gallery, Plight: in effect a cave fashioned of rolls of one of Beuys’ signature materials, felt, containing a grand piano, some sheet music with nothing written on it, and some thermometers. (You can see it here and in the top four images here.) It was only months after the 1984-85 miners’ strike ended, and Beuys had expressed an interest in meeting Arthur Scargill, president of the National Union of Mineworkers. Unfortunately, Beuys died shortly after filming began, so the meeting never took place — but, their interest piqued, the documentary-makers decided to follow through (“We liked the idea of bringing together two charismatic figures from completely different worlds and had no idea what the outcome would be,” director Christopher Swayne told me), and invited Scargill to the gallery to comment on the work. He “was terrific,” Swayne continues: “(…) more sensitive and suggestive than I would have imagined, and the seemingly incongruous conjunction between Bond Street art and a trade union leader gave us a rather surreal scene.” Like Yoko Ono, Beuys was a member of the Fluxus group, part-prankster, part-primitivist — and of course (in the UK especially), trawling for responses from voices outside the usual circle of art-savvy suspects can be a risky tactic. There’s suspicion, a stand-off of knowing or philistine scorn — when you’re not sure you know enough to comment, sometimes you lash out; and sometimes when what you know is merely dismissed, you lash back. Overcompensation on either side, and the conversation devolves into mutually spiteful misrepresentations and contempt (“jealously embattled fiefdoms”).

Beuys talks about the installation here, but famous as he was for explaining pictures to a dead hare, it’s not a very good discussion. The interviewer doesn’t get the right conversation from him; can’t seem to focus on the interesting things Beuys does begin to say; perhaps fails even to spot them as such. I can’t find the Arena documentary on the internet, unfortunately: to get a sense of Scargill’s responses I’ve used these three pages, scanned from a book that discusses it (Theatre and Everyday Life: An Ethics of Performance by Alan Read, pp. 172-3, Routledge):

[three scanned pages from Theatre and Everyday Life, pp. 172-3]

Though muffled a little by somewhat unfortunate writing, the passage nevertheless gives a sense of what also excited the documentary-makers: the fact that this comes across as a cross-exploration of different realms of expertise, Scargill taking the project perfectly seriously, going out of his way to put engaged thought into his response, digging deep into his own adult experience as a miner. In conversation with the art if not the artist, Scargill brings his own knowledge and belief and passion to an encounter of equals, and a blend of confidence and curiosity. There’s no sense that he feels he isn’t equipped or entitled to be talking about it. We’ve stepped fully out of urchin-time.

“I only listen to music derived from, or from, the 60s. I’m not interested in jazz or hip-hop or whatever’s going round at the minute; indie shit. I don’t loathe it but I don’t listen to it. My education as a songwriter was from listening to the Kinks and the Who and the Beatles. I don’t listen to avant-garde landscapes and think, ‘I could do that.’ I’m not a fan of Brian Eno. It’s Ray Davies, John Lennon and Pete Townshend for me.” — Noel Gallagher

“Because our time tends to work with a kind of ideology, they call those things sculpture and paintings visual art. But I think that vision plays only one role and there are twelve other senses at least implied in looking at an artwork (…) If people are training and are really interested in art they could develop more senses. So this is now related to the senses of touching and surely also to seeing. This remains. I am not against vision because it’s one of the most important senses. You have a kind of acoustic effect, because everything is muffled down. Then there is the effect of warmness. As soon as there are more than twenty people in the room the temperature will rise immediately. Then there is the sound as an element muffling away the noise and the sound (…) So I try principally to do this thing further on over the threshold where modern art ends into an area where anthropological art has to start — in all fields of discussion, not only in the art world, be it in medicine, be it in miners’ problems, be it in the information of state and constitution, be it in the money system.” — Joseph Beuys, 1985 (quote reordered to amplify a point)

“It would take all the magic out of it to break down ‘I Am the Walrus’ to its basic components. I listen to it and go, ‘It’s fucking amazing; why is it amazing? I don’t know, it just is.’ That’s why I find journalists such joyless fucking idiots. They have to break music down and pull it apart until there’s nothing left, until they know it all; they analyse it down until it’s bland nonsense. They don’t listen to music like the rest of us.” — Noel Gallagher

“I think it’s a bit unfair to erase the contribution of morons to British pop music – Sabbath, Troggs, Animals, Damned etc… Sometimes you need a Lemmy or Lonnie Donegan or John Fogerty figure to stop the intellectuals getting out of hand” — @dsquareddigest (Daniel Davies) on twitter

Step back again. What I’m trying to describe is a choppy and changeable sea of shifting layers, of the sense of authority within the assumed cultural hierarchy — complex in the sense that a lot of moving parts were moving a lot, in very different ways at very different levels, for quite a long time. And to push back a little against the melodramatic attempts of everyone involved (not least the Gallaghers) to simplify it.

A: Begin with Noel’s very hostile response to the way a writer might want to think about music — more as a spur for writing, for telling good (meaning sellable?) stories, than to reflect the needs of the musician. Noel can’t possibly mean that he never pulled apart his own songs, to get at what made them work — to see if they can’t be better. He might mean that such public analysis is bad practice for a musician — who may well trust instinct and intuition, without ever having the words to explain what he hears or wants to any non-musician outsider. Straight away it’s difficult to determine what’s punching up and what’s punching down; the decision might be acutely hard to make. As much as anything, an official education is the elaboration of a shareable jargon of discussion and exploration between experts; someone talented but untrained may well be acutely aware that the extant jargons aren’t pinpointing or naming or describing precisely what they’re hearing and valuing. Hence: you clam up and trust only yourself and your pals.

B: The root of all “up” and “down”, punching-wise, is class, of course. And however much training and expertise can be a route up and out, a smart working class kid will always be conscious of the middle class kids round them, with more time and space and money for extra tuition, with connected parents in the same field, with better means to game the system. And not unrelatedly, the line in the UK between the fine and the applied arts has always boiled with resentment (up) and snobbery (down). Historically, despite movements and revolts to counter it, in most eras it’s never quite enough to be an artisan, a professional craftsperson, a technician — the higher layers of creative respect will remain shut to you. (The genderline in pop is policed by exactly this prejudice.)

C: Age. Post-war, youth had seemed to have a much amplified power to reverse matters. Suddenly only the young had sex, grasped politics, understood the direction the future was arriving from… a position of seeming weakness had become one of saleable (exploitable) pseudo-strength. By the 90s, this had gone stringy and tangled. The potent founding tokens of youth culture were all old or dead; to insist on their continued value was to impose merely parental mores (or worse, the icons and ideations of bad cultural studies). Always there will be new names and pretty teenage faces, sure, but the counter to dadrock can quite soon just be a kind of dizzy, headachey flicker.

D: Back to Beuys for a moment: he’s saying that there are two further hierarchies. The first (which folds back into B and C) is a hierarchy of genre, which is why his work only reaches the top galleries, to make it as respectable art, once it’s been de-fanged and renamed: “sculpture”. The second is a hierarchy of the senses (he says there are 12 of these, rather than the canonic five). The categories of art history and critical theory tend to be highly conservative in effect; even theory that explores and celebrates the revolutionary moment is just stiff with cousins of the “original intent” movement in US constitutional theory. And this second hierarchy harks back to A: the serried layers of sensual response, in which the visual has pre-eminence, with sound — and powers over it — always assigned a subaltern role.

(Because in the original bookshelf anecdote, right up top, what isn’t anywhere discussed is the large collection of records that was very surely standing quite near Jobson’s bookshelves. Because one of the things that successful musicians have likely done, at quite an early point in life, is to prioritise which archive exactly they’re going to devote their time to, and learn from. We are all time-poor: this above all was something that punk dramatised — the leisure that mastery requires as a mark of privilege. But for a period — a period we may or may not yet have moved out of — the self-constructed private archive from out of a vast pool of LPs and 45s (CDs? mp3s?) was at once enormously more accessible and unregimented than gallery art or cinema or even book-learning. The freedom to do it all WRONG — which musical self-education via 45 and 33 very much allowed — was a central element in the sense of liberation that rock had once seemed to offer, and the curious topsy-turvy authority it at first gathered to itself.)

E: and finally, very quickly, there’s all the ins and outs and shifting fashions and facts of what functions at any moment as the cultural “outside”. But this is rough notes towards a map and not at all the map itself — even a brief attempt to sketch how jazz or rock, born in black-created sound in a Jim Crow world, operated and evolved once borrowed or replicated elsewhere is going to end up a very long attempt. Because it’s quite complicated.

At any point, the verticality is unlikely to be clear or fixed; anything but. Move just a little and dimensions can reverse, distances can balloon or shrink, perspective can turn itself inside out. And let’s not beat about the bush: Oasis had no clear analytical grasp of all this or any of it, their responses very often purely reactive. It’s what we turn to art or music for — descriptions of ambivalence and complexity — but of course they only trusted themselves, and scorned the tools or even the desire to analyse same. And in this refusal, could never avail themselves of a refreshed context of self-awareness or experiment or outside input — as Lennon did with Yoko Ono.

Arthur Scargill’s argument about art — that there’s an artist in everyone, that something like the strike can bring a fruitful curiosity out in anyone — invokes the confidence that derives from a challenge faced and a fight sustained against odds: if you didn’t see yourself as an equal at some point, you’d always dodge away from fights. And sustained conversations are only possible between equals. The strike failed because the miners were defeated — and surely a loss of the relevant kind of confidence followed this, with all kinds of consequences in education: who it’s for and how. The social upheavals that allowed the rock generation to feel equals and more — even the late odd echoes that set the young siblings of punk against their older brothers and sisters — were not so much the kinds of upheaval that can be defeated or reversed. But of course they were vulnerable to time’s passing and the conflicted precession of generations. Even as late as the 80s, working class kids like Jobson (or less ambiguously, Mark E. Smith) could wage autodidact play-battle with echoes of the counterculture, and foray far beyond the expectations of their own upbringing, on their own terms. But with the stutterstep of doubt introduced into musicianship as a shared value, and the surrender implied by the sourcing of authority extrinsically to music (in any written school of criticism or theory, for example), the options for similarly confident play couldn’t last beyond the early thrash of joyful contrarian noisemaking for Oasis. Semi-occluded semi-avant-garde musicianly grace notes hidden behind a defensively coarsened praetorian guard of bellow and bar-chord: this is — at worst and best — the hypocrisy-as-tribute that vice pays to virtue.

28 Mar 16:25

ELTON JOHN – “Candle In The Wind ’97” / “Something About The Way You Look Tonight”

by Tom

#774, 20th September 1997

Every Popular entry starts with the same question: why this record? This time it’s especially loud. “Candle In The Wind ’97” is the highest-selling single of all time in the UK, almost 2 million clear of its nearest competitor. This is as big as pop gets. But “why?” might strike you as a silly question here, because its answer is so obvious: Diana, duh. So reframe it: why Diana?

The death of Princess Diana is recognisably a global news event, in the way we experience them now: the sudden in-rush of information into a new-made vacuum of speculation; the real-time grapple for meaning; and most of all the flood of public sentiment, deforming the story and becoming the story. It was also inescapable in a way nothing in my lifetime had been. But there are elements which feel very distant, and this single is one of them. It pushed the machineries of pop – literal ones, like CD presses and distribution vans, and metaphorical ones, like the charts – to their limits. HMV stores carried signs warning of a limit of 5 copies per person, and still sold out. There were reports of people buying 50 copies – for a shrine, perhaps, or just because CD singles had briefly become, like flowers and bears, part of a currency of devotion.

And still, because Diana so inconveniently died in the small hours of a Sunday, it felt to me like it arrived at No.1 late, a week after the funeral and two after the death. If its copies sold had been evenly distributed it could have managed months at No.1 – instead it racked up 5 weeks, fewer than Puffy. “Candle In The Wind ‘97” sets itself up to be a tribute that will last, but really it only made sense at the funeral, still in the heat of the story’s first phase: part of a fight about what Diana did or meant, and what her legacy might be.

Narratives overlapped, jostled for attention. Everyone had an agenda, everyone claimed her for it. Tony Blair, mesmerised by unifying figures and great causes, saw her as one – the “people’s princess”. TV news announcers, wrestling the story at its source, spat the word “paparazzi” with sudden, fearful distance. What they dreaded seemed to come true with Earl Spencer’s funeral speech, the ancien regime emerging to set the bloodline and duty of old England against its hateful, media-ridden, fallen reality. Murdoch’s Sun, meanwhile, had seen its opportunity. It raged at the family Diana had detested, damning their reticence. When others were a step behind, wringing their hands at the media for killing Diana, the Sun brazenly took that outrage and turned it into a lever to crack open the rest of the Royal Family. The remainder of the Establishment retreated to their diaries, writing in despair of a Britain drowned in sentiment, left stained and sodden by this freak tide of petals, plushies and tears.

Legacy is part of what “Candle In The Wind” was always about – Bernie Taupin’s self-satisfied, sentimental recovery of the real girl beneath a superstar. “Candle In The Wind” is a song that’s angry about how men in Hollywood used and reshaped Norma Jean Baker, but then casually asserts the right of other men – Elton and Bernie – to revise the story and define an “authentic” version of the woman. Even the private life of Marilyn becomes a commodity, to be piously invoked by people who never met her. They all sexualised you, Nice Guy Bernie makes Elton simper – of course that’s not what I’m doing, way back in the obsessive dark of the cinema. Sometime in her teens, Diana Spencer sold her cassette of Goodbye Yellow Brick Road to her friend and flatmate, for 50 pence. She signed it before she handed it over.

A song about a dead woman whose place in our memory gets fought over by a vast establishment on one hand and people who never met her on the other: Taupin’s job here isn’t so much to bring the lyrics of “Candle In The Wind” up to date as to urgently make them less pointedly about Diana. The original “Candle” inevitably haunts this one – not just because it’s too resonant to be smothered, but because it makes it obvious how rushed, overdone, and fatuous the new version is. Forgivably so, perhaps. Elton didn’t know Marilyn but he did know Diana – he might have been at the funeral by right of friendship even if it wasn’t a gig. And compared to the knowing, late-night regrets and ruminations of the original, on “Candle In The Wind ’97” he sings like he’s in a black suit and tie and nervously fingering the collar. (Flip to the ‘double A-side’ – yeah right – for a useful comparison: that’s what a relaxed Elton sounds like). He sings key words – “GROW in our hearts…the GRACE that PLACED yourself…” – with an unctuous precision. Peak smarm is hit on “now you belong to Heaven”, where Elton sounds like a Sunday School teacher explaining to a 5-year-old where Bunny has gone.

For the biggest televised funeral of all time, though, some hyperbole is expected. Taupin certainly doesn’t risk caution – “from a country lost without a soul” sobs the lyric. Behind all this rending of garments, more intriguing touches lurk.

There’s the William Blake reference, for instance – “Your footsteps will always fall here, round England’s greenest hills”, an obvious nod to the verse which has ended up known as “Jerusalem”: “And did those feet in ancient time, walk upon England’s mountains green?”. Blake was referring to the legend that the young Jesus visited Britain, making the reference the closest “Candle In The Wind ‘97” comes to tying up all its vague messianic imagery into an implication that really would be startling. But there’s something more here. “Jerusalem” in its most famous sung arrangement also has currency as an alternative national anthem: it’s what England might have if we finally got rid of the Royal Family. Referencing it in a song for a woman who had stepped outside that family is a very interesting choice.

This reading of “Candle In The Wind ’97” seems tenuous – but it’s backed up by the version of Diana the song chooses to emphasise. What we’re hearing about is Saint Diana, Our Lady Of The Landmines – the grace that placed itself where lives were torn apart. This was also the version of herself she most enjoyed. I don’t think she was cynical about her good works – while obviously living a life of astonishing privilege, she seems to have been a genuinely kind person, and on the right side of social history on some important issues – but she also knew the extent to which they threatened the monarchy.

One of the ways in which the monarchy managed to survive, retaining its power in an age where things might have gone badly for it, was turning Elizabeth II’s personal talent for rapid intimacy into a defining asset. The Queen, like Bill Clinton, has a famously good memory for faces, names, and small personal details – and this is turned by monarchists into an argument in favour of the whole institution. The Royals are valuable because they work so hard, and have such a bond with their subjects.

Since Divine Right won’t cut it, and the economic case is too grubby and unglamorous, this feels like the most solid defence of the Royals that monarchists have. But fixing a job description to monarchy is a secret attack on its legitimacy. If the job of monarchy simply amounts to empathising with people and remembering their names, then the monarch should be whoever does that job best. Diana’s challenge to the monarchy was that she took its nickname – The Firm – literally. She had been fired by the firm, and like a true entrepreneur she set up her own business as its competitor, disrupting it by doing exactly the same things – touring the world, visiting the poor or sick or industrious – with less protocol and more agility. The ultimate 80s icon was taking 80s politics to its unthinkable conclusion: privatise the monarchy. To do it, she used things the Royal Family could hardly touch – the media; youth; even pop.

This was why Diana’s modest assertion to Martin Bashir that perhaps she might be a princess “in people’s hearts” was such dynamite. What if, she was sweetly suggesting, simple popularity is a higher legitimacy than custom and tradition? This is a destabilising question. It’s the question implied by the NME when it modestly begins, in a paper full of critics, to list the records that sell the most every week. Which brings us back round to the original question: why is this record the biggest-selling single of all time?

Because they’re only based on sales, the British charts are a very crude cultural seismograph, able in their barefaced capitalist simplicity to pick up tremors other methods might smooth over. A colossal global news event should always show up on them, even overload them. But the unprecedented scale of this (really bad and hard to listen to) single’s success goes beyond that. Diana’s entire project – acting as a competitor to the Royal Family based on popularity and affection rather than iron tradition – means that a colossal show of genuine, bottom-up public mourning wasn’t just an inevitable reaction from her fans, it was the right one. And even if “Candle In The Wind ‘97” was a little late by our advanced standards, it was released in time to catch that wave.

Even so there’s a bigger question – why did this event manifest so strongly in pop, specifically? What sort of pop figure was Diana? It’s tempting and easy to look at her unearthly celebrity and simply pronounce her a pop star, but during her life that wasn’t how she figured into pop music. Instead, she was the archetypal fan. Though not for her taste, which was never going to wow critics – there’s a case for saying that “Something About The Way You Look Tonight” is the real tribute here, in that a bombastic bit of AOR with vaguely de Burghish themes is What She Would Have Wanted.

Diana’s most famous encounters with pop were fannish ones: being thrilled to meet Duran Duran backstage, dancing in front of a Kensington Palace mirror to “Girls On Film”. That second one, in particular, is iconic pop behaviour, but not star behaviour. It’s one of the classic images of fandom and the fan’s self-definition through pop: singing or dancing into the bedroom mirror. Diana’s performance of it in a palace calls back the original public idea of her, before the good deeds and the hugs. Diana was a symbol of pop – youth, energy, blah blah – at the heart of the establishment.

By 1997, that Diana was almost redundant. It still seemed like it might be important – and not just a trick of the demographic light – that the President played the sax and the Prime Minister had been some kind of rock band longhair. But the other possibility – that simply liking pop or rock music had no implications whatsoever – looked increasingly likely. A pop fan at the palace – or even thrown out of it – was no longer much of a story.

Dancing-in-the-mirror fandom has never been the only role for young women in mainstream pop, but it’s tended to serve as a default. It’s a coincidence that Diana dies just as the record industry starts to get its shit together about the Spice Girls and how to sell to their fans, but as we’ll see it makes “Candle In The Wind ‘97” a weirdly cathartic moment. The longer term trends in pop, as we close in on the modern day, are towards more women in the Top 10 and more solo artists, and mainstream pop in the last fifteen years is more defined than it ever was in Diana’s lifetime by the stories, presence, creativity and image of individual women.

The world these women are negotiating, resisting or conquering is similar to the one Diana faced. In adapting “Candle In The Wind” at her funeral, Elton John gave an account of Diana that stressed her enormous popularity and linked it with apparent sainthood. It was not the only version available – the intimate portrait by her brother, Earl Spencer, made the headlines by excoriating the press for hounding his sister. “Candle In The Wind ‘97” sounds awkward and overstated now, to say the least – Diana’s “legend” may not burn out any time soon, but it’s settled into a dull, emberish glow. But Spencer’s speech has fared worse: the era of the paparazzi, of the press hunting and being used by the famous, manifestly did not end.

Instead of the world rejecting the paparazzi, it caught up to them. Diana was, through circumstance, a post-privacy pioneer. The conditions of her adult life – forever observed, forever performing, always improvising to use that to her advantage – are replicated today not just for the stars who have to master those skills, but for all the billions living part of their life in public. Her question, the pop question – who is popular, and what does that mean? – is the architecture of the social media world, its algorithmic cement. The iconography of Lady Di is frozen in the 1980s – scrape it away and you find an uncomfortably modern, uncomfortably pop figure.

28 Mar 09:52

i'm just kidding. facebook will store all the information you give it even when you use a fake name.


March 26th, 2014: EXCITING THINGS HAPPENING TODAY:

One year ago today: i am a sentient mouth on legs, and if that is chocolate you've got, know that i want to shove it in me

– Ryan

28 Mar 09:52

at time of writing, some of those poop names are still available. for some reason??

i am in Seattle at ECCC today and all weekend! come say hi!


March 27th, 2014: EXCITING THINGS HAPPENING TODAY AND ALSO THIS WEEKEND:

One year ago today: imagine...a DUDE

– Ryan

28 Mar 09:25

#566 Cash Flow

by noreply@blogger.com (treelobsters)
28 Mar 09:22

Tales From Costco #9

by evanier


This will probably amuse about six people but since I'm one of them and it's my blog, here it is. At first, it may not sound like a Tale From Costco but be patient. We'll get there.

We all have silly little words and phrases that inexplicably make us laugh. One of mine came from a comedy writer named Fred Rosenberg who, alas, is no longer with us. There are a number of Fred Rosenbergs alive and working in the entertainment industry but not the one who did the great impression of comedian Jack Carter.

It was a really dumb impression — that was kind of the point of it — but Fred did it every time I was around him and I always found it funny. He wouldn't replicate Carter's voice but he'd mimic the man's posture and frantic onstage manner. (For those of you who came in late to comedy: Jack Carter was a popular comic of the fifties and sixties, often seen on The Ed Sullivan Show and any other venue that booked stand-ups.)

[photo: Jack Carter]

So Fred would do this inane impression and it consisted of one line out of Carter's act. This is from around 1962…

These kids today with their dances! They do the Mashed Potato! They do the Baked Potato!

(And again, to translate for these kids today: The Mashed Potato was a dance somewhat like The Twist. There was probably a dance called the Baked Potato somewhere but Carter's joke was based on the assumption that there wasn't…and wasn't it absurd to think there might be? That's part of why Fred's impression was so funny to me, hauling out that silly line.)

Anyway, I laughed whenever Fred did the impression and to this day when I see mashed potatoes somewhere, I am often reminded of Fred's portrayal and I laugh. Yesterday afternoon, I was in Costco. I put a rotisserie chicken in my cart. I put a packaged Kosher-for-Passover turkey breast in my cart. I put four large bags of Arm & Hammer Baking Soda in my cart. I put a case of organic chicken broth in my cart. I put a case of canned cat food in my cart. I did not put either a large package of paper towels or toilet paper in my cart and I was wondering if they'd let me out of the building without one.

And then I spotted a big package of prepared mashed potatoes on display and it made me think of Fred Rosenberg's Jack Carter impression and I laughed. It's been more than twenty years since I last saw Fred but I heard his voice in my head and I laughed out loud. Other shoppers looked at me strangely but I didn't care.

One elderly gent who glared at me was driving around on one of those little scooters that Costco provides for shoppers who can't walk. He stared at me as if to say, "What's that idiot laughing about?" And I stared at him and thought, "Hey, I know who that is…" And I swear to God — it was Jack Carter.

28 Mar 09:19

Before the Internet

We watched DAYTIME TV. Do you realize how soul-crushing it was? I'd rather eat an iPad than go back to watching daytime TV.
28 Mar 09:19

How to Get to the Truth of the Matter

by Scott Meyer

You can still enter to win a signed copy of the new edition of Off to Be the Wizard (Magic 2.0). In addition to that book, the prize package will also include the Basic Instructions 2014 Box Calendar, and a signed copy of a book by a different author. This week, the bonus book is The Dreams of a Dying God (The Godlanders War, Book One) by Aaron Pogue.

You can enter below by friending Off to Be the Wizard on Facebook, or by following me on Twitter.

Sadly, the offer is only open to people in the United States. Shipping costs, what can I say? The winner will be selected on Sunday, March 30th (Not Saturday, as reported. Error on my part. Sorry.), then a new giveaway with a different bonus book will begin. Thanks, and good luck!

 

a Rafflecopter giveaway

As always, thanks for using my Amazon Affiliate links (US / UK / Canada).

28 Mar 00:00

Corporate Hobbies

by mike

The Supreme Court seems poised to radically revise American corporate law, by entertaining the idea that a corporation can have religious beliefs. (Theaporetic is not a corporate lawyer, so if any reader has more accurate knowledge he welcomes the correction to what follows).

Hobby Lobby, a privately held corporation, objects to the fact that Obamacare requires it to provide health insurance that includes contraception. Hobby Lobby’s CEO, David Green, rejects contraception on religious grounds and resents having to pay for it.

But according to US legal precedent, he is not paying for it; the corporation is. The corporation is a different person from David Green. But from the left and the right, the nature of corporate personhood appears to be changing. Corporations are being re-imagined as representations of the people who make them up.

If the reader will forgive a review, a corporation is an entity created by law which is, for most of the purposes of the law, a person. “Hobby Lobby” is a legal person–it can buy and sell land like a person; it can sue in court; it can sign contracts. It has many but not all of the rights that biological persons have–protection from illegal search and seizure, for example. It has a right to free speech. But it can’t vote.

One of the main advantages of a corporation is limited legal liability. If Hobby Lobby goes broke, its CEO, Green, is not liable for the debts Hobby Lobby owes. The fictitious person, “Hobby Lobby,” is liable. If Hobby Lobby sells toxic paint, and people who buy it die, Green is not liable, Hobby Lobby is liable. It’s pretty remarkable.

For this to work, a corporation has to be understood as separate from the people who make it up. That is, it has to be understood as a real person, not simply as a device for amplifying and projecting the desires of its shareholders. It may indeed do that, but if it is understood only as the collective representation of its shareholders, how can there be any claim to limited liability? For limited liability to make sense in law, the corporation has to have an existence separate from its shareholders. Legal scholars argued throughout the 19th century that the corporation was a real, artificial person, much in the same way that a man-made lake, an artificial lake,  is nevertheless a real lake, full of water and fish. The person created by law was a “real” person in law, not simply the voice of its members.


So corporations, historically, can’t have a race or a gender. See for example the 1908 case of People’s Pleasure Park v. Rohleder, in which a group of African Americans formed a corporation to build an amusement park in a whites-only area of Richmond. The VA Supreme Court concluded that corporations are real legal persons, distinct from their members, and having no body, they cannot have a race. Corporations until recently have been seen as distinct from the biological persons who make them up.

So if a corporation cannot have a race, it cannot have a gender, and it would seem it cannot have religious beliefs or the claim to a soul or redemption by Christ. It’s a person, but not that kind of person. David Green has religious beliefs, but David Green and “Hobby Lobby” are, in the history of American law, distinct and separate persons. Hobby Lobby cannot go to church or receive God’s grace; cannot be baptized or feel guilt or love or awe or mystery. Green would, in traditional corporate law, have no case.

In the now famous Citizens United decision, however, the Supreme Court recast corporations as instruments designed to amplify the voices of their members. A corporation was like a labor union or any other advocacy group: an entity through which people join to advance their interests. They have political beliefs and the corporation expresses them, so it’s unconstitutional to limit their political speech.

The Hobby Lobby case appears to build on this precedent. Hobby Lobby is held to have religious beliefs. These just so happen to be identical to those of David Green, and Hobby Lobby and David Green seem to have become interchangeable. So it seems to me (again not a lawyer) that Green’s lawyers are building on the revision of corporate law imagined in Citizens United.

Similarly, in 2013 the US Fourth Circuit Court ruled, in Carnell Construction v. Danville Redevelopment, that corporations–in this case a minority-owned corporation–can have a race: “we hold that a corporation can acquire a racial identity and establish standing to seek a remedy for alleged race discrimination.” The corporation is not imagined as having been born with a race, but it can “acquire” a racial identity. So as in the case of David Green and Hobby Lobby, the corporation is merging with the people who make it up.

The Supreme Court tilts politically right; the author of the Carnell decision, though, was an Obama appointee. What this suggests to me is a major cultural shift, across the political spectrum, in thinking about the nature of the corporation. There are all sorts of possibilities for legal challenge if you start to reconsider the corporation in this way.

If the corporation is merely a megaphone through which people speak, or if a corporation bears the race of its owners, then it’s not really distinct from them and so how can liability be limited? If it is the instrument of their will then surely they, and not the corporation, are liable. This is why Citizens United seems like a radical revision of corporate law.

Oddly, Citizens United is kind of a “socialist” decision, in that it imagines, as Mitt Romney put it, “that corporations are people,” they are a community, a collective, a social phenomenon, a coming together. This is entirely counter to the history of American corporate law since at least 1820, which has insisted that corporations are abstract, disembodied persons created out of market drives and needs, lacking sentimentality or a conscience, rational, distinct from the people who work for or direct the corporation’s movements.

Yesterday Justice Roberts suggested in questioning that corporations which are “closely held” (s-corporations) might embody their holders’ selves. In this he seems to be following the logic of the Carnell case as well as Citizens United. Roberts’ questions suggested that large corporations like Coca Cola are not “closely held” and so can’t be imagined to have a religion, though why this would be the case is less than clear.

What does a world full of male or female, Baptist or Presbyterian corporations look like? Corporations become representations, not simply of abstract market forces, but of ourselves, in the way that elected officials were once imagined as representatives of ourselves. Male corporations, female corporations, cranky old man corporations, corporations composed of small people or people who believe the earth is flat: it would continue the process whereby politics is corporatized.

26 Mar 09:18

Day 4833: In and Proud

by Millennium Dome
Wednesday:

Did you miss me? I've been EVER so busy (definitely NOT sulking. JUST because I'm a TEENAGER now!)

Tonight we'll be supporting a POSITIVE case for IN, cheering on Captain Clegg when he faces off against evil former-banker Mr Farrago, aka Dr Nigel No, Chief Kipper of the UKPNuts.

And HOORAY! for that! It's been AGES since I was so PROUD of the Liberal Democrats*. Just watch the video:



Doesn't that WARM your FLUFF!

It's certainly woken Daddy Alex up. He's written some NEW words to help the Captain out: Putting #WhyIAmIN Into What the Lib Dems Stand For!

And, like BAGPUSS, when Daddy wakes up, all his fluffy friends wake up! (Don't be RUDE! I mean ME!)

Like Daddy, I've had another look at my answer to his challenge last year and like Daddy I've added more guff… er, included even more good Liberal values!

“The Liberal Democrats stand for freedom.

Freedom from poverty, ignorance and conformity.

Freedom for every individual, family, group, community, society or nation.

Freedom from inheriting the financial and environmental mistakes of earlier generations.

Freedom to live your life enjoying the rewards for your own endeavour, governed by your own choices – with equality before the law; without harming others.

To deliver these freedoms, for today and the future, needs both fairness and practicality; opportunity and compassion.

  • An economy that is stronger, and sustainable for the future, where everyone also pays their share.
  • A society which is fairer and that recognises that we work better together, together locally, together in the UK and together in Europe, to fight crime, ensure fair trade, tackle climate change and break down barriers to understanding.

The Liberal Democrats believe in a better future. That’s why Liberal Democrats are working to build a fairer, freer society and a stronger, greener economy, enabling every person to live the life they want.”

With all that help, I'm sure that Captain Clegg will do well.



*What has a Liberal Government ever done for us, eh?

Apart from being the only Western country with falling inequality, obviously, what have the Lib Dems in government actually done for us?

That is apart from falling inequality and getting through the worst recession for a hundred years with falling unemployment, falling inflation and a falling deficit, what have the Liberal Democrats achieved?

You know, set aside the falling inequality, strengthening economy, and raising the personal allowance so that millions of people on low and average earnings have had a tax cut (while taxing the rich more), can you think of anything that the Lib Dems have actually done?

Because obviously we're not mentioning the falling inequality, strengthening economy, fairer taxes and sorting out the pension system, with a triple lock to maintain the value of current pensions, with better provision for everyone in future AND trusting people with their own money when they retire, can you think of any reason we should be proud of the Lib Dems in power?

I mean, not counting falling inequality, strengthening economy, fairer taxes, better pensions, and turning Labour student loans into what is effectively a graduate tax when graduates will pay less each month and that has seen more people from less well-off backgrounds than ever before going to University, what use have the Lib Dems been in the Coalition?

So if we ignore the falling inequality, strengthening economy, fairer taxes, better pensions, wider access to University, and of course free school meals… and ending child detention… and the world's first green bank… and apprenticeships… and no I.D. cards… and the pupil premium… APART from all that…

What HAVE the Liberal Democrats done to make us feel proud?

Oh, and before we all have a Miranda moment, we got equal(er) marriage too.

26 Mar 08:17

E-Cig Study: Much Smoke, Little Light

by Scott Alexander

New study shows that e-cigarette users are no more likely to quit smoking tobacco after a year than non-e-cigarette users. In fact, the trend is in the opposite direction – e-cigarette users are less likely to give up their regular cigarettes. I’m skeptical. r/science is skeptical. The experts are skeptical. Even the authors of the study sound maybe a little skeptical.

The study surveyed tobacco smokers for various information including whether they smoked e-cigarettes in addition to their tobacco. A year later, they went back and surveyed everyone again and asked them if they were still smoking tobacco. And the people smoking the e-cigarettes were no more likely to have quit than the others.

Let’s transition from reality to Hypothetical World. In Hypothetical World, there are only two kinds of smokers, Short Smokers and Long Smokers. The moment someone smokes their first cigarette, God flips a coin and assigns them into one of the two groups based on the result. Short Smokers are predestined to smoke for exactly one year before quitting; Long Smokers are predestined to smoke for exactly fifty years before quitting.

A scientist in Hypothetical World wants to discover what percent of first-time-smokers become Short Smokers versus Long Smokers (the real proportion is 50-50 since God’s coin is fair, but she doesn’t know that). So she uses the same methodology as this study. She hangs around a tobacco shop and accosts the first thousand people who come in to buy cigarettes, getting their names and phone numbers. Then a year and a day later, she calls them all up to ask if they are still smoking – since anyone who keeps smoking for a year and a day must be a Long Smoker.

So she finds something close to 2% of people are Short Smokers and a whopping 98% are Long Smokers. She incorrectly concludes that God is rolling a d100 and only assigning Short Smoker status to those who come up 99 or 00.

Don’t see why she would make this mistake? Consider a particular generation of Hypothetical people over their lifetimes. The Short Smokers will only smoke a single year out of their lifetime; the Long Smokers will smoke fifty years. When the scientist does her study in a randomly selected year, she only has a 1/average_lifespan chance of catching any given Short Smoker, but a 50/average_lifespan chance of catching a Long Smoker. So, her original sample will contain fifty times more Long Smokers than Short Smokers, and she will mistakenly conclude that their pattern is fifty times more common.
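If you’d rather watch the arithmetic run than take my word for it, here’s a minimal simulation sketch of Hypothetical World (toy numbers from the thought experiment only, nothing from the actual study): sample smokers in proportion to how long they smoke, and a true 50-50 split comes out looking like the scientist’s 2%/98% finding.

    import random

    # Hypothetical World: God flips a fair coin. Short Smokers smoke for
    # 1 year, Long Smokers for 50.
    random.seed(0)
    smokers = [("short", 1) if random.random() < 0.5 else ("long", 50)
               for _ in range(100_000)]

    # True proportion of Short Smokers: ~50%, by construction.
    true_short = sum(1 for kind, _ in smokers if kind == "short") / len(smokers)

    # A cross-sectional survey catches each smoker with probability
    # proportional to the time they spend smoking, so weight by duration.
    durations = [years for _, years in smokers]
    surveyed = random.choices(smokers, weights=durations, k=100_000)
    surveyed_short = sum(1 for kind, _ in surveyed if kind == "short") / len(surveyed)

    print(f"true share of Short Smokers:     {true_short:.1%}")     # ~50%
    print(f"surveyed share of Short Smokers: {surveyed_short:.1%}") # ~2%, i.e. 1/51

Swap the 1 and the 50 for one month and 120 months and you get the e-cigarette version of the same bias.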

Now transition back to reality. Suppose there are two types of e-cigarette users – successful and unsuccessful. The successful e-cigarette users try e-cigarettes, immediately decide they are better than regular cigarettes, and switch to using e-cigarettes exclusively within one month. The unsuccessful e-cigarette users try e-cigarettes but just don’t get everything they love about tobacco from them. They sort of futz around with e-cigarettes and regular cigarettes and tell themselves that one of these days, they’re really going to stop the regular ones entirely and transition totally to e-cigarettes. These people continue futzing for let’s say ten years before they either finally quit tobacco, give up on e-cigarettes, or die.

In that case, any sample of tobacco smokers taken at a particular time will include a hundred twenty times as many unsuccessful e-cigarette users as successful ones (ten years of futzing is 120 months, against a single month of switching). We expect unsuccessful e-cigarette users to continue their past pattern of futzing around, so it’s not surprising that this sort of sample finds most e-cigarette users not only can’t easily quit tobacco using e-cigarettes, but actually have a harder time quitting tobacco than normal smokers – they’ve already been preselected as The Group That Even E-Cigarettes Can’t Help; as The Group That Tried Something Billed As An Anti-Smoking Aid But Failed At It. It’s a pretty general rule of medicine that people who failed treatment once are more likely to fail treatment a second time.

This is a very speculative explanation and I haven’t heard anyone respectable at a major institution advance it yet, but it seems to me like the most likely reason for these findings. All I have to go on with the study right now is a preliminary “research letter”, but hopefully we’ll know more when the real thing comes out.

Lest this post be entirely pro-drug, here’s a clip of my addiction-medicine teacher and sometime-boss lecturing people about marijuana on Fox News last weekend. He is a great doctor and it’s neat to see him finally getting some of the celebrity he deserves. Even though his politics are terrible.

25 Mar 15:12

Putting #WhyIAmIN Into What the Lib Dems Stand For 2014.07 #LibDemValues

by Alex Wilcock

Tomorrow, Liberal Democrat Leader Nick Clegg puts the positive case for being in the European Union, in the first of two debates with UKIP Leader Nigel Farage putting the negative case for Little-Englanderism (the other two ‘Leaders’ can’t decide, so they’ve bottled it). That makes two headlines for What the Lib Dems Stand For: “Stronger Economy, Fairer Society” and “IN Europe, IN Work”. So how do those two fit together? Here’s my go at something slightly more than a slogan, but still punchy. If you like it, please borrow it to stick on a leaflet or add to a speech. If you don’t, please send me something better!

  • The Liberal Democrats stand for freedom for every individual – freedom from poverty, ignorance and conformity.
  • To make that freedom real needs both fairness and economic responsibility: an economy that works, that encourages enterprise, and where everyone pays their fair share.
  • So freedom from poverty requires responsible spending, not debt, built on fairer taxes where lower earners pay less tax and the wealthiest pay more; being in Europe to be in work, to fight crime and tackle climate change; and building green jobs for the future.
  • Freedom from ignorance needs better education and training, so people have the opportunity to realise their potential.
  • And freedom from conformity, supported by freedom from poverty and ignorance, means everyone should have the liberty to live their lives as they choose – without harming others; with equality before the law; with a better say, because no government always knows best.
  • These values are why Liberal Democrats are working for a greener, stronger economy in a fairer society, enabling every person to get on in life.

Last year I wrote a series of articles on What the Lib Dems Stand For, looking for something to fit in a box on a leaflet or in a minute’s speech – something to enthuse and inspire Lib Dem believers, something to attract and persuade potential supporters, something that’s more than a slogan or a soundbite but short enough to get in one go.

I opened it up into a meme, and many other Lib Dems took part as well – you can find the links to where I’ve published theirs, too, below. That’s still open: if you can do better, whether saying ‘Change this little bit because…’ or suggesting your own from scratch, I want to hear from you, too (my email link’s on the sidebar. I’d like to hear if you make use of mine, too).

Especially now we’re sharing power, it’s important to assert our values. I wanted to answer: What makes us different, and makes us stay? How is that reflected in our priorities in Coalition Government? How does that pick out the central message of the Preamble to our Constitution? How does that expand on the party’s ‘core message’ slogan of “Stronger Economy, Fairer Society”? And how can we best express it in language that feels natural to us and anyone listening to us? All the while making a positive case for us, not just ‘…But the others are worse’? I’ve tried to do all that together. Does it work for you?


Adding ‘IN’

Last year I tried to combine everything at once, with the added challenge to make it short and to make it make sense, rather than just being a storm of buzzword-salad. It works pretty well for me telling the story of how the three big freedoms of our Preamble fit together with fairness, but even then I knew it had to leave some things out, and the biggest one I couldn’t seem to fit in was internationalism. And what’s our big theme for this year’s European Elections? Who could have guessed?

My main change today is… Adding more words. Compared to last year’s, below, it’s up 16 words, taking it further from my target of 150 to fit in a medium-sized box on a leaflet or just one minute in a speech. What do you think of “being in Europe to be in work, to fight crime and tackle climate change”? Does it work? Does it fit? Is it too specific compared to the more values-based rest of the text? I based it on our three main campaigning messages about Europe, because I couldn’t find a way of getting the ‘drawbridge down’ sort of values behind them to work in just a few words. Can you?

And yes, now it makes the “Freedom from ignorance” bit look a bit short, but I’m not adding even more. Though if you don’t like semi-colons and want to break it up a bit more to make it easier to read, the pedant in me doesn’t like it, but you can use full stops instead to make the “Freedom from poverty” and “Freedom from conformity” bullets punchier.


The ‘What the Liberal Democrats Stand For Challenge’ So Far

This aims to be something short and simple that Lib Dem members can look at and think, ‘Yes, that’s some of why we bother’, and that other people can look at and think, ‘Oh, that’s what the Lib Dems are for, and I like it’. Feel free to borrow it for a box on your Focus leaflets, to be part of your speeches, for your members’ newsletters, your Pizza’N’Politics evenings – wherever it’ll do some good. And here’s what I’ve done with it so far, including many other Lib Dems’ own versions…

Happy 25th Birthday, Liberal Democrats – and What the Lib Dems Stand For 2013.1

Why we should sum up What the Lib Dems Stand For, and how it’s developed over the years.

What the Lib Dems Stand For 2013.2 – a Challenge and a Meme #LibDemValues

Setting out my ‘What the Lib Dems Stand For’ based on the Preamble, practice and core messaging, and challenging other Lib Dems to come up with their own.

The Liberal Democrat What Do We Stand For Challenge 2013.3 – Eight Answers (so far) #LibDemValues

After receiving the first set of responses, rounding up eight different Liberal Democrats’ versions of what we stand for – so far…

The Liberal Democrat What Do We Stand For Challenge 2013.4 – What It’s All About #LibDemValues

Inviting people to use my short declaration of ‘What the Lib Dems Stand For’ and explaining what each bit of it means.

What the Lib Dems Stand For 2013.5 – Why I Am A Liberal Democrat #LibDemValues

This one’s very different – longer and more personal: how did I get here? Why did I become a Lib Dem in the first place? And why do I stay?

The Liberal Democrat What Do We Stand For Challenge 2013.6 – Another Eight Answers #LibDemValues

Another eight different Liberal Democrats’ versions of what we stand for in the second set of responses people sent in.


Last year’s slightly shorter version:
The Liberal Democrats stand for freedom for every individual – freedom from poverty, ignorance and conformity.

To make that freedom real needs both fairness and economic responsibility: an economy that works, that encourages enterprise, and where everyone pays their fair share.

So freedom from poverty requires responsible spending, not debt, built on fairer taxes where lower earners pay less tax and the wealthiest pay more, and building green jobs for the future.

Freedom from ignorance needs better education and training, so people have the opportunity to realise their potential.

And freedom from conformity, supported by freedom from poverty and ignorance, means everyone should have the liberty to live their lives as they choose – without harming others; with equality before the law; with a better say, because no government always knows best.

That’s why Liberal Democrats are working for a stronger, greener economy in a fairer society, enabling every person to get on in life.

Once again, there will be more.


Featured on Liberal Democrat Voice
25 Mar 14:03

Should You Reverse Any Advice You Hear?

by Scott Alexander

Suppose a lot of that stuff about bravery debates is right.

That lots of the advice people give is useful for some people, but that the opposite advice is useful for other people.

For example, “You need to stop being so hard on yourself, remember you are your own worst critic” versus “Stop making excuses for yourself, you will never be able to change until you admit you’ve hit bottom.”

Or “You need to remember that the government can’t solve all problems and that some regulations are counterproductive” versus “You need to remember that the free market can’t solve all problems and that some regulations are necessary.”

Or “You need to pay more attention to your diet or you’ll end up very unhealthy” versus “You need to pay less attention to your weight or you’ll end up in a spiral of shame and self-loathing and at risk of eating disorders.”

Or “Follow your dreams, you don’t want to be working forever at a job you hate”, versus “Your dream of becoming a professional cosplayer may not be the best way to ensure a secure future for your family, go into petroleum engineering instead.”

Or “You need to be more conscious of how your actions in social situations can make other people uncomfortable and violate their boundaries” versus “You need to overcome your social phobia by realizing that most interactions go well and that probably talking to people won’t always make them hate you and cause you to be ostracized forever.”

Or “You need to be less selfish and more considerate of the needs of others” versus “You can’t live for others all the time, you need to remember you deserve to be happy as well.”

People often form groups based on pushing one or another side of these dichotomies. Most obviously, the Libertarian Party pushes one side of the political-regulation one and the Communist Party pushes the other. The fat acceptance movement pushes one side of the diet-health one and the American Heart Association pushes the other. Some religious groups and the effective altruism movement push one side of charity-selfishness, the Objectivists push the other.

Most of these groups have the stated purpose of moving society as a whole, but their primary short-term effect is to change the opinions of their members.

For example, maybe you join the Objectivist movement. You follow lots of Objectivist blogs that give you strong arguments for selfishness, hear lots of anecdotes of people being hurt by excessive altruism, and get exposed to any studies that seem to support the pro-selfishness point of view. You probably end up more selfish than you were before you joined the Objectivists.

Consider two possible interpretations of that result.

First, Objectivism might be a successful support group. People who aren’t selfish enough realize they need more selfishness in their lives, join the Objectivists, and support each other as they work to overcome their inbuilt disposition to ignore their own needs. Gradually they all become psychologically healthier people.

Or second, Objectivism might be a vicious cycle. The people who are already too selfish see an opportunity to be selfish with a halo. They join Objectivism, egg each other on, and become even more selfish still. Meanwhile, the people who could really have benefitted from Objectivism, the people who feel guilted into living for others all the time while ignoring their own needs, are off in some kind of effective charity group, egging each other on to be even more self-destructively altruistic.

The first dynamic definitely sometimes exists, and in fact I was cued in to this whole issue from a friend in the first situation who was genuinely helped by Objectivism.

But I think the second dynamic is usually more common. It’s much easier to join a group that celebrates your natural proclivities than one that demands you fight against them. Then you end up with dueling death spirals in which two separate communities become more and more certain of their own position.

I was talking to a friend on Facebook about marriage (I’ll default to anonymous; she can identify herself in comments if she wants). She was annoyed at a blog that criticized a supposed group of people who jumped into marriage unreflectively because they felt divorce was an easy and low cost escape if it didn’t go well. She thought this was a cheap shot, which of course it is.

But I pointed out that this criticism really wasn’t entirely off the mark, because these people exist and are in fact very common. The fallout from their recklessness sometimes requires a psychiatrist to help sort out, so I meet them and their children in treatment all the time.

And she said that sure, these people do exist, but there are a lot of people who are stuck in abusive relationships and already feel like divorce is too stigmatizing, and we shouldn’t be too quick to mock people who jump to divorce because that’s just going to make it harder for these people to get the divorces they really need.

And I said that okay, definitely those people existed as well, but it seemed kind of unfair for this demographic to hold hostage society’s ability to suggest people be more responsible with marriage and divorce, when there are so many people who would benefit from that advice.

And she said that yes, it would be nice to provide these irresponsible people the information that they need to think carefully before making major life choices, but that these probably weren’t the sort of people who read preachy conservative blogs about the virtues of the married life anyway.

And I didn’t have a good answer to that, because it was obviously true. The best I could do was point out that this would delegitimize pretty much all discourse. Every piece of social commentary is most likely to go to the people who need it least.

For just this reason I worry that everything I post on my blog is correct, but wrong relative to readers of my blog. For example, I post about how everyone needs to be much more mindful of the role biological factors play in human social systems. And I’m 99% sure that the average person is not sufficiently aware of or concerned about this.

But I’m much less certain that the average reader of my blog isn’t sufficiently aware or concerned about this. Maybe people who are really interested in biodeterminism search “biodeterminism” on Google, find my blog and several others, and end up way too biodeterminist. Maybe their time would be much better served reading some blog on how many things are due to fuzzy hard-to-measure social factors like who your third-grade teacher was.

And when a young person is looking for job advice, I worry that all the artsy creative people whose heads are already way too high in the skies will be reading books by artsy creative people who urge them to follow their dreams, and so be even less mindful of the importance of a secure future. And all the hard-headed down-to-earth people will naturally gravitate toward reading Have A Very Secure Future By Going Into Business by Warren Buffett, and maybe never get reminded of the importance of following dreams.

(This is even sadder when the groups aren’t equal in size, when society is much more in need of one side than the other, but that group is stuck in a tiny but super-intense inward-facing spiral. The Venn diagram of the people who most need to learn about LGBTQ rights compared against the people who most often hear about LGBTQ rights consists of two circles, one in Canada and the other in Peru. And so we end up with a big community of people who want trans people bullied out of society, plus a tiny community who spend a lot of time panicking that they might be unintentionally offensive by using the wrong form of “trans-” vs. “trans*”.)

I wonder whether everyone would be better off if they automatically reversed any tempting advice that they heard (except feedback directed at them personally). Whenever they read an inspirational figure saying “take more risks”, they interpret it as “I seem to be looking for advice telling me to take more risks; that fact itself means I am probably risk-seeking and need to be more careful”. Whenever they read someone telling them about the obesity crisis, they interpret it as “I seem to be in a very health-conscious community; maybe I should worry less about my weight all the time.”

Probably this wouldn’t literally work as written. Too much advice is applicable to everybody; the absence of advice to play more Russian roulette is directly linked to Russian roulette being a really bad idea for pretty much everyone.

And sometimes even the self-selected aren’t self-selected enough. Even though I’m terribly biodeterminist compared to the population at large, I feel like I keep finding the rabbit hole goes even deeper than I expected. And no matter how paranoid I am about the statistics asserted by a group whose identity I will not mention to avoid having to slog through the ensuing controversy, I always find I wasn’t quite paranoid enough – that the claims I was holding up as virtuous exceptions to the rule were also misleading or doctored.

But advice reversal might at least be worth considering. The checklist could be something like:

1. Are there plausibly near-equal groups of people who need this advice versus the opposite advice?

2. Have you self-selected into the group of people receiving this advice by, for example, being a fan of the blog / magazine / TV channel / political party / self-help-movement offering it?

3. Maybe the opposite advice, for you in particular, is at least as worthy of consideration.
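
(If it helps to see the checklist as a procedure, here is a minimal sketch in Python; it is an editorial illustration, not anything from the post, and the function name and inputs are invented. It only flags when reversal is worth considering, rather than deciding anything for you.)

    # Toy sketch of the advice-reversal checklist above (illustrative only;
    # the function and its inputs are invented, not from the post).
    def weigh_advice(advice: str,
                     opposite_group_near_equal: bool,
                     audience_self_selected: bool) -> str:
        # 1. Is there a near-equal group who needs the opposite advice?
        # 2. Did you self-select into the audience receiving this advice?
        if opposite_group_near_equal and audience_self_selected:
            # 3. Then the opposite advice may be at least as worth considering.
            return "Consider the reverse of: " + repr(advice)
        return "Take at face value: " + repr(advice)

    print(weigh_advice("Take more risks",
                       opposite_group_near_equal=True,
                       audience_self_selected=True))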

24 Mar 14:22

An Open Letter to IBM's Open Letter

by schneier

Last week, IBM published an "open letter" about "government access to data," where it tried to assure its customers that it's not handing everything over to the NSA. Unfortunately, the letter (quoted in part below) leaves open more questions than it answers.

At the outset, we think it is important for IBM to clearly state some simple facts:
  • IBM has not provided client data to the National Security Agency (NSA) or any other government agency under the program known as PRISM.

  • IBM has not provided client data to the NSA or any other government agency under any surveillance program involving the bulk collection of content or metadata.

  • IBM has not provided client data stored outside the United States to the U.S. government under a national security order, such as a FISA order or a National Security Letter.

  • IBM does not put "backdoors" in its products for the NSA or any other government agency, nor does IBM provide software source code or encryption keys to the NSA or any other government agency for the purpose of accessing client data.

  • IBM has and will continue to comply with the local laws, including data privacy laws, in all countries in which it operates.

To which I ask:

  • We know you haven't provided data to the NSA under PRISM. It didn't use that name with you. Even the NSA General Counsel said: "PRISM was an internal government term that as the result of leaks became the public term." What program did you provide data to the NSA under?

  • It seems rather obvious that you haven't provided the NSA with any data under a bulk collection surveillance program. You're not Google; you don't have bulk data to that extent. So why the caveat? And again, under what program did you provide data to the NSA?

  • Okay, so you say that you haven't provided any data stored outside the US to the NSA under a national security order. Since those national security orders prohibit you from disclosing their existence, would you say anything different if you did receive them? And even if we believe this statement, it implies two questions. Why did you specifically not talk about data stored inside the US? And why did you specifically not talk about providing data under another sort of order?

  • Of course you don't provide your source code to the NSA for the purpose of accessing client data. The NSA isn't going to tell you that's why it wants your source code. So, for what purposes did you provide your source code to the government? To get a contract? For audit purposes? For what?

  • Yes, we know you need to comply with all local laws, including US laws. That's why we don't trust you -- the current secret interpretations of US law require you to screw your customers. I'd really rather you simply said that, and worked to change those laws, than pretend that you can convince us otherwise.

EDITED TO ADD (3/25): One more thing. This article says that you are "spending more than a billion dollars to build data centers overseas to reassure foreign customers that their information is safe from prying eyes in the United States government." Do you not know that National Security Letters require you to turn over requested data, regardless of where in the world it is stored? Or do you just hope that your customers don't realize that?

24 Mar 12:02

This review of Max Tegmark’s book also occurs infinitely often in the decimal expansion of π

by Scott

Two months ago, commenter rrtucci asked me what I thought about Max Tegmark and his “Mathematical Universe Hypothesis”: the idea, which Tegmark defends in his recent book Our Mathematical Universe, that physical and mathematical existence are the same thing, and that what we call “the physical world” is simply one more mathematical structure, alongside the dodecahedron and so forth.  I replied as follows:

…I find Max a fascinating person, a wonderful conference organizer, someone who’s always been extremely nice to me personally, and an absolute master at finding common ground with his intellectual opponents—I’m trying to learn from him, and hope someday to become 10^-122 as good.  I can also say that, like various other commentators (e.g., Peter Woit), I personally find the “Mathematical Universe Hypothesis” to be devoid of content.

After Peter Woit found that comment and highlighted it on his own blog, my comments section was graced by none other than Tegmark himself, who wrote:

Thanks Scott for your all to [sic] kind words!  I very much look forward to hearing what you think about what I actually say in the book once you’ve had a chance to read it!  I’m happy to give you a hardcopy (which can double as door-stop) – just let me know.

With this reply, Max illustrated perfectly why I’ve been trying to learn from him, and how far I fall short.  Where I would’ve said “yo dumbass, why don’t you read my book before spouting off?,” Tegmark gracefully, diplomatically shamed me into reading his book.

So, now that I’ve done so, what do I think?  Briefly, I think it’s a superb piece of popular science writing—stuffed to the gills with thought-provoking arguments, entertaining anecdotes, and fascinating facts.  I think everyone interested in math, science, or philosophy should buy the book and read it.  And I still think the MUH is basically devoid of content, as it stands.

Let me start with what makes the book so good.  First and foremost, the personal touch.  Tegmark deftly conveys the excitement of being involved in the analysis of the cosmic microwave background fluctuations—of actually getting detailed numerical data about the origin of the universe.  (The book came out just a few months before last week’s bombshell announcement of B-modes in the CMB data; presumably the next edition will have an update about that.)  And Tegmark doesn’t just give you arguments for the Many-Worlds Interpretation of quantum mechanics; he tells you how he came to believe it.  He writes of being a beginning PhD student at Berkeley, living at International House (and dating an Australian exchange student who he met his first day at IHouse), who became obsessed with solving the quantum measurement problem, and who therefore headed to the physics library, where he was awestruck by reading the original Many-Worlds articles of Hugh Everett and Bryce deWitt.  As it happens, every single part of the last sentence also describes me (!!!)—except that the Australian exchange student who I met my first day at IHouse lost interest in me when she decided that I was too nerdy.  And also, I eventually decided that the MWI left me pretty much as confused about the measurement problem as before, whereas Tegmark remains a wholehearted Many-Worlder.

The other thing I loved about Tegmark’s book was its almost comical concreteness.  He doesn’t just metaphorically write about “knobs” for adjusting the constants of physics: he shows you a picture of a box with the knobs on it.  He also shows a “letter” that lists not only his street address, zip code, town, state, and country, but also his planet, Hubble volume, post-inflationary bubble, quantum branch, and mathematical structure.  Probably my favorite figure was the one labeled “What Dark Matter Looks Like / What Dark Energy Looks Like,” which showed two blank boxes.

Sometimes Tegmark seems to subtly subvert the conventions of popular-science writing.  For example, in the first chapter, he includes a table that categorizes each of the book’s remaining chapters as “Mainstream,” “Controversial,” or “Extremely Controversial.”  And whenever you’re reading the text and cringing at a crucial factual point that was left out, chances are good you’ll find a footnote at the bottom of the page explaining that point.  I hope both of these conventions become de rigueur for all future pop-science books, but I’m not counting on it.

The book has what Tegmark himself describes as a “Dr. Jekyll / Mr. Hyde” structure, with the first (“Dr. Jekyll”) half of the book relaying more-or-less accepted discoveries in physics and cosmology, and the second (“Mr. Hyde”) half focusing on Tegmark’s own Mathematical Universe Hypothesis (MUH).  Let’s accept that both halves are enjoyable reads, and that the first half contains lots of wonderful science.  Is there anything worth saying about the truth or falsehood of the MUH?

In my view, the MUH gestures toward two points that are both correct and important—neither of them new, but both well worth repeating in a pop-science book.  The first is that the laws of physics aren’t “suggestions,” which the particles can obey when they feel like it but ignore when Uri Geller picks up a spoon.  In that respect, they’re completely unlike human laws, and the fact that we use the same word for both is unfortunate.  Nor are the laws merely observed correlations, as in “scientists find link between yogurt and weight loss.”  The links of fundamental physics are ironclad: the world “obeys” them in much the same sense that a computer obeys its code, or the positive integers obey the rules of arithmetic.  Of course we don’t yet know the complete program describing the state evolution of the universe, but everything learned since Galileo leads one to expect that such a program exists.  (According to quantum mechanics, the program describing our observed reality is a probabilistic one, but for me, that fact by itself does nothing to change its lawlike character.  After all, if you know the initial state, Hamiltonian, and measurement basis, then quantum mechanics gives you a perfect algorithm to calculate the probabilities.)
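
(To make that last claim concrete, here is a minimal sketch, my illustration rather than Aaronson’s, of the “perfect algorithm” for a toy two-level system, using numpy and scipy. The Hamiltonian, evolution time, and measurement basis are arbitrary choices for the example.)

    # Given an initial state, a Hamiltonian, and a measurement basis,
    # compute the outcome probabilities via the Born rule (toy 2-level system).
    import numpy as np
    from scipy.linalg import expm

    H = np.array([[0.0, 1.0],
                  [1.0, 0.0]])                  # toy Hamiltonian (Pauli-X), hbar = 1
    psi0 = np.array([1.0, 0.0], dtype=complex)  # initial state |0>
    t = 0.7                                     # evolution time

    U = expm(-1j * H * t)                       # unitary time evolution exp(-iHt)
    psi_t = U @ psi0                            # evolved state

    basis = np.eye(2, dtype=complex)            # measurement basis {|0>, |1>} as columns
    probs = np.abs(basis.conj().T @ psi_t) ** 2 # Born rule: |<b_k|psi_t>|^2

    print(probs, probs.sum())                   # outcome probabilities, summing to 1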

The second true and important nugget in the MUH is that the laws are “mathematical.”  By itself, I’d say that’s a vacuous statement, since anything that can be described at all can be described mathematically.  (As a degenerate case, a “mathematical description of reality” could simply be a gargantuan string of bits, listing everything that will ever happen at every point in spacetime.)  The nontrivial part is that, at least if we ignore boundary conditions and the details of our local environment (which maybe we shouldn’t!), the laws of nature are expressible as simple, elegant math—and moreover, the same structures (complex numbers, group representations, Riemannian manifolds…) that mathematicians find important for internal reasons, again and again turn out to play a crucial role in physics.  It didn’t have to be that way, but it is.

Putting the two points together, it seems fair to say that the physical world is “isomorphic to” a mathematical structure—and moreover, a structure whose time evolution obeys simple, elegant laws.   All of this I find unobjectionable: if you believe it, it doesn’t make you a Tegmarkian; it makes you ready for freshman science class.

But Tegmark goes further.  He doesn’t say that the universe is “isomorphic” to a mathematical structure; he says that it is that structure, that its physical and mathematical existence are the same thing.  Furthermore, he says that every mathematical structure “exists” in the same sense that “ours” does; we simply find ourselves in one of the structures capable of intelligent life (which shouldn’t surprise us).  Thus, for Tegmark, the answer to Stephen Hawking’s famous question—”What is it that breathes fire into the equations and gives them a universe to describe?”—is that every consistent set of equations has fire breathed into it.  Or rather, every mathematical structure of at most countable cardinality whose relations are definable by some computer program.  (Tegmark allows that structures that aren’t computably definable, like the set of real numbers, might not have fire breathed into them.)

Anyway, the ensemble of all (computable?) mathematical structures, constituting the totality of existence, is what Tegmark calls the “Level IV multiverse.”  In his nomenclature, our universe consists of anything from which we can receive signals; anything that exists but that we can’t receive signals from is part of a “multiverse” rather than our universe.  The “Level I multiverse” is just the entirety of our spacetime, including faraway regions from which we can never receive a signal due to the dark energy.  The Level II multiverse consists of the infinitely many other “bubbles” (i.e., “local Big Bangs”), with different values of the constants of physics, that would, in eternal inflation cosmologies, have generically formed out of the same inflating substance that gave rise to our Big Bang.  The Level III multiverse is Everett’s many worlds.  Thus, for Tegmark, the Level IV multiverse is a sort of natural culmination of earlier multiverse theorizing.  (Some people might call it a reductio ad absurdum, but Tegmark is nothing if not a bullet-swallower.)

Now, why should you believe in any of these multiverses?  Or better: what does it buy you to believe in them?

As Tegmark correctly points out, none of the multiverses are “theories,” but they might be implications of theories that we have other good reasons to accept.  In particular, it seems crazy to believe that the Big Bang created space only up to the furthest point from which light can reach the earth, and no further.  So, do you believe that space extends further than our cosmological horizon?  Then boom! you believe in the Level I multiverse, according to Tegmark’s definition of it.

Likewise, do you believe there was a period of inflation in the first ~10^-32 seconds after the Big Bang?  Inflation has made several confirmed predictions (e.g., about the “fractal” nature of the CMB perturbations), and if last week’s announcement of B-modes in the CMB is independently verified, that will pretty much clinch the case for inflation.  But Alan Guth, Andrei Linde, and others have argued that, if you accept inflation, then it seems hard to prevent patches of the inflating substance from continuing to inflate forever, and thereby giving rise to infinitely many “other” Big Bangs.  Furthermore, if you accept string theory, then the six extra dimensions should generically curl up differently in each of those Big Bangs, giving rise to different apparent values of the constants of physics.  So then boom! with those assumptions, you’re sold on the Level II multiverse as well.  Finally, of course, there are people (like David Deutsch, Eliezer Yudkowsky, and Tegmark himself) who think that quantum mechanics forces you to accept the Level III multiverse of Everett.  Better yet, Tegmark claims that these multiverses are “falsifiable.”  For example, if inflation turns out to be wrong, then the Level II multiverse is dead, while if quantum mechanics is wrong, then the Level III one is dead.

Admittedly, the Level IV multiverse is a tougher sell, even by the standards of the last two paragraphs.  If you believe physical existence to be the same thing as mathematical existence, what puzzles does that help to explain?  What novel predictions does it make?  Forging fearlessly ahead, Tegmark argues that the MUH helps to “explain” why our universe has so many mathematical regularities in the first place.  And it “predicts” that more mathematical regularities will be discovered, and that everything discovered by science will be mathematically describable.  But what about the existence of other mathematical universes?  If, Tegmark says (on page 354), our qualitative laws of physics turn out to allow a narrow range of numerical constants that permit life, whereas other possible qualitative laws have no range of numerical constants that permit life, then that would be evidence for the existence of a mathematical multiverse.  For if our qualitative laws were the only ones into which fire had been breathed, then why would they just so happen to have a narrow but nonempty range of life-permitting constants?

I suppose I’m not alone in finding this totally unpersuasive.  When most scientists say they want “predictions,” they have in mind something meatier than “predict the universe will continue to be describable by mathematics.”  (How would we know if we found something that wasn’t mathematically describable?  Could we even describe such a thing with English words, in order to write papers about it?)  They also have in mind something meatier than “predict that the laws of physics will be compatible with the existence of intelligent observers, but if you changed them a little, then they’d stop being compatible.”  (The first part of that prediction is solid enough, but the second part might depend entirely on what we mean by a “little change” or even an “intelligent observer.”)

What’s worse is that Tegmark’s rules appear to let him have it both ways.  To whatever extent the laws of physics turn out to be “as simple and elegant as anyone could hope for,” Tegmark can say: “you see?  that’s evidence for the mathematical character of our universe, and hence for the MUH!”  But to whatever extent the laws turn out not to be so elegant, to be weird or arbitrary, he can say: “see?  that’s evidence that our laws were selected more-or-less randomly among all possible laws compatible with the existence of intelligent life—just as the MUH predicted!”

Still, maybe the MUH could be sharpened to the point where it did make definite predictions?  As Tegmark acknowledges, the central difficulty with doing so is that no one has any idea what measure to use over the space of mathematical objects (or even computably-describable objects).  This becomes clear if we ask a simple question like: what fraction of the mathematical multiverse consists of worlds that contain nothing but a single three-dimensional cube?

We could try to answer such a question using the universal prior: that is, we could make a list of all self-delimiting computer programs, then count the total weight of programs that generate a single cube and then halt, where each n-bit program gets assigned 1/2^n weight. Sure, the resulting fraction would be uncomputable, but at least we’d have defined it. Except wait … which programming language should we use? (The constant factors could actually matter here!) Worse yet, what exactly counts as a “cube”? Does it have to have faces, or are vertices and edges enough? How should we interpret the string of 1’s and 0’s output by the program, in order to know whether it describes a cube or not? (Also, how do we decide whether two programs describe the “same” cube? And if they do, does that mean they’re describing the same universe, or two different universes that happen to be identical?)
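
(As an illustration of the weighting scheme only, not of any resolution to the measure problem, here is a toy sketch. The “interpreter” is a stand-in of my own invention; genuine self-delimiting programs would keep the total weight below 1 by Kraft’s inequality, so the truncation and normalisation here are artifacts of the toy.)

    # Toy universal-prior-style weighting: each n-bit "program" weighs 2^-n.
    from itertools import product

    def toy_run(bits):
        # Stand-in interpreter: the program's "output" is the integer its
        # bits encode. A real universal machine would go here.
        return int("".join(map(str, bits)), 2)

    TARGET = 5      # stand-in for "outputs a single cube, then halts"
    MAX_LEN = 12    # truncate the infinite enumeration of programs

    total_weight = 0.0
    target_weight = 0.0
    for n in range(1, MAX_LEN + 1):
        w = 2.0 ** -n                        # weight of each n-bit program
        for bits in product((0, 1), repeat=n):
            total_weight += w
            if toy_run(bits) == TARGET:
                target_weight += w

    print(target_weight / total_weight)      # weighted fraction hitting TARGET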

These problems are simply more-dramatic versions of the “standard” measure problem in inflationary cosmology, which asks how to make statistical predictions in a multiverse where everything that can happen will happen, and will happen an infinite number of times.  The measure problem is sometimes discussed as if it were a technical issue: something to acknowledge but then set to the side, in the hope that someone will eventually come along with some clever counting rule that solves it.  To my mind, however, the problem goes deeper: it’s a sign that, although we might have started out in physics, we’ve now stumbled into metaphysics.

Some cosmologists would strongly protest that view.  Most of them would agree with me that Tegmark’s Level IV multiverse is metaphysics, but they’d insist that the Level I, Level II, and perhaps Level III multiverses were perfectly within the scope of scientific inquiry: they either exist or don’t exist, and the fact that we get confused about the measure problem is our issue, not nature’s.

My response can be summed up in a question: why not ride this slippery slope all the way to the bottom?  Thinkers like Nick Bostrom and Robin Hanson have pointed out that, in the far future, we might expect that computer-simulated worlds (as in The Matrix) will vastly outnumber the “real” world.  So then, why shouldn’t we predict that we’re much more likely to live in a computer simulation than we are in one of the “original” worlds doing the simulating?  And as a logical next step, why shouldn’t we do physics by trying to calculate a probability measure over different kinds of simulated worlds: for example, those run by benevolent simulators versus evil ones?  (For our world, my own money’s on “evil.”)

But why stop there?  As Tegmark points out, what does it matter if a computer simulation is actually run or not?  Indeed, why shouldn’t you say something like the following: assuming that π is a normal number, your entire life history must be encoded infinitely many times in π’s decimal expansion.  Therefore, you’re infinitely more likely to be one of your infinitely many doppelgängers “living in the digits of π” than you are to be the “real” you, of whom there’s only one!  (Of course, you might also be living in the digits of e or √2, possibilities that also merit reflection.)
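
(The normality intuition is easy to poke at in miniature; this sketch is mine, not Aaronson’s. If π is normal, a d-digit string should appear roughly once per 10^d digits, so short strings are findable, while “your entire life history” would need astronomically more digits than any computer will ever produce. Using the mpmath library:)

    # Look for a short digit string in the first 100,000 digits of pi.
    from mpmath import mp

    mp.dps = 100_000              # working precision: 100,000 digits
    digits = str(+mp.pi)[2:]      # "1415926535..." (drop the leading "3.")

    target = "0314"               # a hypothetical short, date-like string
    pos = digits.find(target)
    print(pos if pos >= 0 else "not found in the first 100,000 digits")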

At this point, of course, you’re all the way at the bottom of the slope, in Mathematical Universe Land, where Tegmark is eagerly waiting for you.  But you still have no idea how to calculate a measure over mathematical objects: for example, how to say whether you’re more likely to be living in the first 10^(10^120) digits of π, or the first 10^(10^120) digits of e.  And as a consequence, you still don’t know how to use the MUH to constrain your expectations for what you’re going to see next.

Now, notice that these different ways down the slippery slope all have a common structure:

  1. We borrow an idea from science that’s real and important and profound: for example, the possible infinite size and duration of our universe, or inflationary cosmology, or the linearity of quantum mechanics, or the likelihood of π being a normal number, or the possibility of computer-simulated universes.
  2. We then run with that idea until we smack right into a measure problem, and lose the ability to make useful predictions.

Many people want to frame the multiverse debates as “science versus pseudoscience,” or “science versus science fiction,” or (as I did before) “physics versus metaphysics.”  But actually, I don’t think any of those dichotomies get to the nub of the matter.  All of the multiverses I’ve mentioned—certainly the inflationary and Everett multiverses, but even the computer-simuverse and the π-verse—have their origins in legitimate scientific questions and in genuinely-great achievements of science.  However, they then extrapolate those achievements in a direction that hasn’t yet led to anything impressive.  Or at least, not to anything that we couldn’t have gotten without the ontological commitments that led to the multiverse and its measure problem.

What is it, in general, that makes a scientific theory impressive?  I’d say that the answer is simple: connecting elegant math to actual facts of experience.

When Einstein said, the perihelion of Mercury precesses at 43 seconds of arc per century because gravity is the curvature of spacetime—that was impressive.

When Dirac said, you should see a positron because this equation in quantum field theory is a quadratic with both positive and negative solutions (and then the positron was found)—that was impressive.

When Darwin said, there must be equal numbers of males and females in all these different animal species because any other ratio would fail to be an equilibrium—that was impressive.

When people say that multiverse theorizing “isn’t science,” I think what they mean is that it’s failed, so far, to be impressive science in the above sense.  It hasn’t yet produced any satisfying clicks of understanding, much less dramatically-confirmed predictions.  Yes, Steven Weinberg kind-of, sort-of used “multiverse” reasoning to predict—correctly—that the cosmological constant should be nonzero.  But as far as I can tell, he could just as well have dispensed with the “multiverse” part, and said: “I see no physical reason why the cosmological constant should be zero, rather than having some small nonzero value still consistent with the formation of stars and galaxies.”

At this, many multiverse proponents would protest: “look, Einstein, Dirac, and Darwin are setting a pretty high bar!  Those guys were smart but also lucky, and it’s unrealistic to expect that scientists will always be so lucky.  For many aspects of the world, there might not be an elegant theoretical explanation—or any explanation at all better than, ‘well, if it were much different, then we probably wouldn’t be here talking about it.’  So, are you saying we should ignore where the evidence leads us, just because of some a-priori prejudice in favor of mathematical elegance?”

In a sense, yes, I am saying that.  Here’s an analogy: suppose an aspiring filmmaker said, “I want my films to capture the reality of human experience, not some Hollywood myth.  So, in most of my movies nothing much will happen at all.  If something does happen—say, a major character dies—it won’t be after some interesting, character-forming struggle, but meaninglessly, in a way totally unrelated to the rest of the film.  Like maybe they get hit by a bus.  Then some other random stuff will happen, and then the movie will end.”

Such a filmmaker, I’d say, would have a perfect plan for creating boring, arthouse movies that nobody wants to watch.  Dramatic, character-forming struggles against the odds might not be the norm of human experience, but they are the central ingredient of entertaining cinema—so if you want to create an entertaining movie, then you have to postselect on those parts of human experience that do involve dramatic struggles.  In the same way, I claim that elegant mathematical explanations for observed facts are the central ingredient of great science.  Not everything in the universe might have such an explanation, but if one wants to create great science, one has to postselect on the things that do.

(Note that there’s an irony here: the same unsatisfyingness, the same lack of explanatory oomph, that make something a “lousy movie” to those with a scientific mindset, can easily make it a great movie to those without such a mindset.  The hunger for nontrivial mathematical explanations is a hunger one has to acquire!)

Some readers might argue: “but weren’t quantum mechanics, chaos theory, and Gödel’s theorem scientifically important precisely because they said that certain phenomena—the exact timing of a radioactive decay, next month’s weather, the bits of Chaitin’s Ω—were unpredictable and unexplainable in fundamental ways?”  To me, these are the exceptions that prove the rule.  Quantum mechanics, chaos, and Gödel’s theorem were great science not because they declared certain facts unexplainable, but because they explained why those facts (and not other facts) had no explanations of certain kinds.  Even more to the point, they gave definite rules to help figure out what would and wouldn’t be explainable in their respective domains: is this state an eigenstate of the operator you’re measuring?  is the Lyapunov exponent positive?  is there a proof of independence from PA or ZFC?

So, what would be the analogue of the above for the multiverse?  Is there any Level II or IV multiverse hypothesis that says: sure, the mass of the electron might be a cosmic accident, with at best an anthropic explanation, but the mass of the Higgs boson is almost certainly not such an accident?  Or that the sum or difference of the two masses is not an accident?  (And no, it doesn’t count to affirm as “non-accidental” things that we already have non-anthropic explanations for.)  If such a hypothesis exists, tell me in the comments!  As far as I know, all Level II and IV multiverse hypotheses are still at the stage where basically anything that isn’t already explained might vary across universes and be anthropically selected.  And that, to my mind, makes them very different in character from quantum mechanics, chaos, or Gödel’s theorem.

In summary, here’s what I feel is a reasonable position to take right now, regarding all four of Tegmark’s multiverse levels (not to mention the computer-simuverse, which I humbly propose as Level 3.5):

Yes, these multiverses are a perfectly fine thing to speculate about: sure they’re unobservable, but so are plenty of other entities that science has forced us to accept.  There are even natural reasons, within physics and cosmology, that could lead a person to speculate about each of these multiverse levels.  So if you want to speculate, knock yourself out!  If, however, you want me to accept the results as more than speculation—if you want me to put them on the bookshelf next to Darwin and Einstein—then you’ll need to do more than argue that other stuff I already believe logically entails a multiverse (which I’ve never been sure about), or point to facts that are currently unexplained as evidence that we need a multiverse to explain their unexplainability, or claim as triumphs for your hypothesis things that don’t really need the hypothesis at all, or describe implausible hypothetical scenarios that could confirm or falsify the hypothesis.  Rather, you’ll need to use your multiverse hypothesis—and your proposed solution to the resulting measure problem—to do something new that impresses me.

24 Mar 11:45

The "Lump of Criminality" fallacy

by Mark Thompson

There is a widely understood and recognised fallacy within economics known as "Lump of Labour".

The rebuttal to it essentially says that the amount of work available within an economy is not fixed (as the fallacy would have us believe) but rather changes as the economy grows and changes. This can be a result of organic growth, or of other factors such as legislative and technological change, immigration and so on.

This fallacy is highly reminiscent of an argument I have heard put forward to justify the retention of the current laws on drugs. It goes along these lines: "Do you really think that if we legalised drugs, the drug dealers and barons would suddenly become nice people and stop being law breakers? Of course they wouldn't, they'd simply find other nefarious activities to replace the lost drugs revenues."

This argument seems plausible on a prima facie basis. If someone has got themselves involved in drug dealing they are likely to be something of a wrong 'un, and it does not take much of a leap of imagination to conclude they might get up to other dodgy behaviours were the drugs option taken away from them.

But this is very similar to "Lump of Labour". It's assuming that there is a fixed amount of criminality in the economy and that nothing we do can change that. It presupposes that all of those people involved in the black market drugs world would have progressed to crime anyway and just happened to choose drugs, as if a criminal lifestyle were somehow predetermined, outside of any cultural, legal and economic factors.

We might want to call this the "Lump of Criminality" fallacy. It is surely pretty self-evidently not true. Drugs at the moment are a highly lucrative business. There are various estimates, but one recent study values the UK drug trade at £7 billion per year. That is a huge amount of money currently available to the black market, and even a tiny slice of those sorts of revenues will make getting involved in criminality a more attractive option for some people. If you bring currently illegal drugs entirely within the legal economy, those monies disappear. Yes, I am sure some of those dodgy characters will find other illegal ways of funding their lifestyles, but it is also likely that a fair number of them will move into the legal economy.

It is also worth bearing in mind that some of those who decide to remain in the criminal world will do so because it is probably all they have ever known. They will have found themselves drug dealing to fund, say, their own drug habit, and now all of their friends and their entire lifestyle are on the wrong side of the tracks. But if drugs had been legal from the get-go, they would have been much less likely to end up in a situation like this. Hence, as time goes on, the chances of people in marginal situations turning to crime will be reduced too.

I'm not claiming legalising drugs will be a panacea. Of course it won't be. Drugs can be dangerous and even under a legalised framework some addicts will commit crimes. But the idea that changing the drugs laws will not do anything about the level of criminality in our society is a knee-jerk reactionary fallacy that does not really bear more than a few minutes' scrutiny.

24 Mar 08:14

Almost everything you've been told about unhealthy foods is wrong.

24 Mar 08:11

Two places with similar urban/college populations: the one with more immigrants produces more notable people.

24 Mar 08:07

THE VERVE – “The Drugs Don’t Work”

by Tom

#773, 13th September 1997

“Whenever we played that live there would be rows of grown men crying. It was almost like these guys couldn’t cry when they needed to cry, but that song operated like a pressure valve for them and it was okay for them to cry at a big rock concert.” – Richard Ashcroft on “The Drugs Don’t Work”

The list of number ones is not a complete history of anything except itself: it’s an iceberg party, a throng of bobbing and jostling tips – rock, hip-hop, reggae, indie, cinema, politics, comedy, charity, marketing and more, each one an incomplete and distorted story. But sometimes – when a berg seems over-familiar – the tiny and partial story told by the tip can put a new spin on it.

So the rock and indie number ones of 1996-1997 have seemed to me to tell a story about anxiety, a crisis of legitimacy for rock music. “Setting Sun” brutally demonstrated that it was impossible simply to pick up where the 60s innovations had left off. “Discotheque” suggested that other musics could no longer be easily absorbed into the working practices of a rock band. And Oasis were a walking declaration that a traditional band line-up should be the centre of pop, simply by right and by confidence – and it had worked, until Be Here Now showed the limits of this fiat rock.

But there are other things rock can do beyond innovation and simple hugeness. “The Drugs Don’t Work” leads us to one of them: rock could get emotional. Specifically, rock could thrive as a venue for great big male emotions, a conduit by which confused 21st century guy feels could be expressed and released at stadium scale and numbing pace. Creation’s Alan McGee, a partisan of more swaggering styles, coined an ugly, dismissive term for it: ‘bedwetter music’. He was talking about Coldplay, but he could have been talking about Keane, Athlete, Snow Patrol – bands who, like them or not, were Britain’s main solution to the “what is rock for?” riddle.

Tying this to The Verve might seem wrong. The Verve were part of Oasis’ moment, not Coldplay’s – their previous album, A Northern Soul, used Oasis’ producer Owen Morris, and as Matt DC pointed out to me, the reason Urban Hymns ended up outselling Be Here Now was because it offered a similarly mammoth, but apparently more consistent and thoughtful, alternative for disappointed buyers.

That side of the band always fought against a still earlier incarnation. At the heart of the group was an instinct to meander. Early singles – like 1992’s “Gravity Grave” – cast Richard Ashcroft as a psychedelic pilgrim, cloudwalking wide-eyed through his band’s blown-out songs. It was an outrageously corny take on psychedelia, all the more so for its fixed-stare sincerity. At the time I thought myself far too hip for it, but secretly enjoyed it anyway.

But once they started writing more structured songs, their best tracks were usually the ones where Ashcroft tapped into this questing side. “History” wraps itself up in William Blake references and comes on like a Northern Jim Morrison, and again uses aggressive sincerity as a get-out-of-jail card to cover the track’s wayward structure: when you mean it this much, who cares that the song just fizzles at the end? “Bitter Sweet Symphony” – the breakthrough – does the same thing with a fantastic stolen hook, and a groove and theme which means the endless voyaging and the lack of resolution become the point rather than something Ashcroft is trying to front his way through. But the famous video sums up the underlying game very well – Ashcroft crashing into passers-by who get in the way of his vision quest. It’s the Gallagher attitude applied to philosophy: weaponised introspection.

That’s the link between The Verve and the Coldplay era – that sense that the singer’s giant sensitive feelings are the most important thing in the world, and that as such they deserve only the broadest, slowest, most self-serious music as accompaniment. As you’ve probably realised, I don’t like this music very much – not that this makes me a critical maverick – and “The Drugs Don’t Work” both succeeds and fails by pointing towards this glum, widescreen version of rock.

“The Drugs Don’t Work” is a small, bleak song nestled inside a larger, lazier one, and the small song takes The Verve out of their psychedelic comfort zone and back down to grey, inescapable Earth: it’s Richard Ashcroft writing about his dying Dad. Of course, the song works if it’s just about comedowns, or a chemically-defined relationship, but this is one time when learning the song’s authorised subject improves it. It’s already got the cat in the bag metaphor – an ear-seizing image, one of the year’s most arresting lyrics – but “If you want a show / Just let me know / And I’ll sing in your ear again” becomes a devastating line when you set it in the hopeless quiet of a hospital ward. Ashcroft tones down his rock prophet style to sound confused and exhausted, and Nick McCabe drops in the occasional lonesome whale cry.

It’s sombre, effective, it’s what (I guess) the record is best remembered for, and it’s only about half the song. All the “ooo sha la la” parts, all the “whoa Lords”, and especially Ashcroft’s vamping at the end are big rock boilerplate, and for me they blow the effect. A song which works because it’s grounded in a relatable experience turns into another trot through the rock frontman playbook. The ‘Mad Richard’ urban shaman shows up again at the most inappropriate time.

The most obvious effect was just to make “The Drugs Don’t Work” longer. We are in an era of Number One bloat, where bolting on an extra minute comes as standard, and I think it particularly hurts this record. Perhaps I’d feel different front-and-centre at a Verve gig, but for me that whole string-driven coda doesn’t feel redemptive or healing, it’s just a reminder that – as with “History” – Ashcroft is awful at sticking the landings of songs and prefers to bluff his way out of them. The bluff plainly worked, but the ideas and the emotional weight of “The Drugs Don’t Work” ease up well before halfway. In the wider story of British rock, it’s a transitional Number One, a song whose hurt and confusion are sabotaged by its worn-out nods to rock enormity. The next generation of massive UK bands would smooth out these conflicting impulses, and find ways of doing emotion at arena scale. Most of their hits are as dreary and draining as “The Drugs Don’t Work” winds up being, but few are as frustrating as this song, because few of them have its kernel of quality in the first place.

22 Mar 17:00

So are there any good historical movies?

by mike

Mr smarty pants critic, theaporetic, thinks Twelve Years a Slave is a bad movie, and thinks he proved why, without having actually seen the movie, quite an absurd position indeed. Criticism is easy–are there any good historical movies?

Theaporetic, having a nine year old child, does not go to the movies much because babysitting. But thinking back, what historical movies are good? Off the top of my head, I really liked Pirates of the Caribbean, the first one. The sequels were bloated and tiresome. I liked O Brother Where Art Thou a lot, and still do. I think the key for me is the degree to which the film maker is willing to abandon realism.

Historical movies and academic histories both tell stories about the present, using the past as material. As mentioned, director Steve McQueen likes to call Twelve Years a Slave “a story of today.” Similarly, historians always write about questions that animate the present, so a modern book about pirates will consider the way gender roles played out and pay a lot of attention to Anne Bonney and Mary Read, and a book about banking written in the last twenty years will be about regulation. So both historical movies and academic histories are always really about the present moment.

I generally think that should be made really obvious, not hidden. As historians, we should make it clear that our work is engaged with the present and asking questions useful to the present, not pretend that we are writing from some position outside space and time where the real truth is revealed.

Pirates of the Caribbean is a ridiculous movie, a modern fantasy of piracy. Like recent academic histories of piracy, it stresses multiculturalism and democratic anti-authoritarianism. But it takes them to absurd lengths. I’ll focus on one aspect, the role of women in the movie. Yes, there are two known cases of women as pirates in the Caribbean. And probably communities that tolerated/encouraged piracy didn’t care too much about respectable gender propriety. Upper class women abandoning a life of respectable ease to be pirates? Umm, no.

But that’s ok, because the movie is always sharing its jokiness with its audience. It’s loud and extreme. The histories of piracy that discuss women as pirates are of necessity quieter and much more modest, because the examples of women as pirates are quieter and more modest. But in both cases what we are considering is “what can women be and do” by imagining what they might have been and might have done in the past.

O Brother Where Art Thou is a broad farce with great music. Like Pirates it’s historically “real looking” but at the same time obviously fantasy. It enacts the famous observation that although American formal political culture has usually been viciously racist, American popular culture has always wanted to mix and “miscegenate.” The climax, where a race-baiting politician fails to convince the audience to reject a racially mixed musical group, reverses what actually happened in the 1930s, which is that race-baiting pretty much always worked. The film is a fantasy of the power of art in history. And it’s ok, because just as there were some female pirates, there were some cases where art trumped the politics of segregation, and art does sometimes trump politics.

You’d be foolish to take O Brother or Pirates for accurate accounts of life in those times, but I don’t think the films encourage their audiences to do that. They encourage the audience to think of the past as a field of difference, where alternative identities might be possible. This appeals to me for the same reason history appeals to me–it’s a record of alterity, of radical difference.

This seems to me to be the best use of history in movies, to dramatize alterity. When movies present themselves as realistic or accurate, then the problems begin. I wrote before about the absurdity of considering Saving Private Ryan “realistic.” It may or may not be a good movie, but from the opening sequence, “realism” is out the window. You cannot see the scene from 12 different perspectives, including the perspectives of both the Nazis and the American soldiers, and call it realistic. You can call it “vivid.” But there’s a narrative structure to movies which undermines the claim to realism.

You could argue the same thing about historical narrative, and I’d agree: history often claims a degree of “realism” that it can’t ever deliver. Good histories don’t do this–they make the contingency and “constructedness” of their claims really apparent. As with the films mentioned above.

As I think more about it, probably the first two Godfather movies are good historical movies. They aren’t good accounts of organized crime–they are probably ridiculous on that score. But they dramatize the problem of immigration and assimilation. I gave Spielberg’s Lincoln an honest B and Amistad an honest B+, probably because they tried hard and did some research. But I couldn’t get with Saving Private Ryan, the Spielbergian sentimentality and phony realism were too much. I kind of like The Searchers because it’s overheated. I thought There Will be Blood was ridiculous because incoherent. Gangs of New York terrible every way: morally incoherent, historically inaccurate, bloated and self important, dumb. Admittedly, this is not a very coherent or deeply thought list.

21 Mar 11:36

Typical Mind and Disbelief In Straight People

by Scott Alexander

Lots of no doubt very wise people avoid comment threads here, but the comments on Universal Human Experiences are really worth it. The post was about how people mistake their own unique and individual ways of experiencing the world for human universals and assume anyone who says otherwise is speaking metaphorically. And boy did people have a lot of good examples of this.

I want to turn lots of them into extended discussions eventually, but for now I’ll settle for the one that reminded me how I still fail at Principle of Charity.

Principle of Charity, remember, says you should always assume your ideological opponents’ beliefs must make sense from their perspective. If you can’t even conceive of a position you oppose being tempting to someone, you don’t understand it and are probably missing something. You might be missing a strong argument that the position is correct. Or you might just be missing something totally out of left field.

People always accuse me of going too far with this, and every time I’m tempted to believe them I always get reminded I don’t go nearly far enough.

I’m pretty good at feeling tempted by most positions I oppose, which satisfies me that I understand them enough to feel justified in continuing to reject them. The biggest exception is that opposition to homosexuality has never made sense to me. I can sort of understand where it fits into a natural law theology, but a lot of anti-gay activists are, no offense, not exactly Thomas Aquinas.

Chris Hallquist commented:

I remember hearing once on Dan Savage’s podcast that he gets letters from gay men who grew up in very conservative parts of the country, who didn’t know that being straight was a thing. They assumed all men were attracted to men, but just hid it.

Martin responded:

Dr. Paul Cameron, founder of the anti-gay Family Research Institute, is quoted as saying: “If all you want is the most satisfying orgasm you can get – and that is what homosexuality seems to be – then homosexuality seems too powerful to resist… It’s pure sexuality. It’s almost like pure heroin. It’s such a rush. They are committed in almost a religious way. And they’ll take enormous risks, do anything.” He says that for married men and women, gay sex would be irresistible. “Marital sex tends toward the boring end,” he points out. “Generally, it doesn’t deliver the kind of sheer sexual pleasure that homosexual sex does.” So, Cameron believes, within a few generations homosexuality would become the dominant form of sexual behavior. Apparently, some people build their entire lives around not knowing that being straight is a thing.

So imagine that you’re one of those people Dan Savage was talking about – a closeted gay guy who doesn’t realize he’s a closeted gay guy. He just thinks – reasonably, given his own experience! – that the natural state of the human male is to be attracted to other men, but that men grudgingly have sex with and marry women anyway because society tells them they have to.

(I don’t know if this generalizes to women.)

In that case, the exact anti-gay position conservatives push makes perfect sense, for exactly the reasons they say it makes sense.

Allowing gay marriage would destroy straight marriage? Yes! If everyone’s secretly gay, then as soon as gay marriage is allowed, they will breathe a sigh of relief and stop marrying opposite-sex partners whom they were never very attracted to anyway.

Gay people are depraved and licentious? Yes! Everyone else is virtuously resisting all of these unbearable homosexual impulses, and gay people are the ones who give in, who can’t resist grabbing the marshmallow as soon as it is presented to them.

(I’m referring to this experiment, not some sort of creepy sexual euphemism. Get your heads out of the gutter.)

Teaching children about homosexuality will turn them gay? Yes! The only thing preventing them all from being gay already is the social stigma against it. Teaching them in school that homosexuality is okay and shouldn’t be stigmatized cuts the last thin thread connecting them to straightness.

This can’t be a universal explanation for anti-gay attitudes. Something like half the US population is against gay marriage (previously much more) and probably five percent or less is gay. Closeted gay people don’t explain more than a small fraction of the anti-gay movement.

But it’s probably bigger than the fraction who read Thomas Aquinas. Maybe all these idiosyncratic arguments that only a few people can really appreciate turn into soundbites and justifications that get used by other people who feel vague discomfort but don’t have a good grounding for why. That means they’d have an impact larger than the size of the groups that produce them.

21 Mar 10:38

Guest Post: Steven Moffat - A Case For The Prosecution

by noreply@blogger.com (Philip Sandifer)
Personally, as I've said, I quite like the Moffat era. I dare say I love it. But I recognize that this is a controversial opinion, and it seems fair, before we go in, to give the opposition a chance to state the case against. So here's Jack Graham doing what he does best: being loud, angry, and leftist about Doctor Who. We'll be back on Monday with a Pop Between Realities to kick off the Moffat era, and then The Eleventh Hour on Wednesday. Also, to clarify, the illustration on this one was my pick, meant as a cheekily representative example of anti-Moffat criticism, and not Jack's at all.

I’m here to up the anti, so to speak.  As such, I’m a sacrificial lamb.  Firstly, I’m going to annoy you all by delaying the arrival of the proper posts about the Moffat/Smith era.  And Phil specifically asked me for a polemic, so what follows will be an angry screed.  There really isn’t space here to do more than gesticulate irritably at the issues (since I’ve tried to have some mercy on you regarding word count).  I’ve had to leave out lots of points, cut loads of stuff about the wider political context (I promise you - it was all in the first draft, and it was brilliant) and, of necessity, simplify.  So this post is going to sit here seething, vulnerable and alone, a hostage to fortune, as Phil winds up his phenomenal Eruditorum project with a series of (doubtless) nuanced and subtle redemptive readings of the era I’m about to intemperately trash.  At least nobody can say I balked at making myself a sitting target.  

(I must acknowledge Phil’s own ‘Definitive Moffat and Feminism Post’, which I used as a springboard, and the generous assistance of Richard Pilbeam and Jonathan Barlow, from whom I have cribbed shamelessly.  All errors are, of course, mine alone.)


Phil’s recent, devastating piece on ‘The End of Time - Part 2’ describes what I find most worrying about the late RTD era.  But for Phil, ‘The Eleventh Hour’ was probably like the cavalry turning up.  For me, ‘The Eleventh Hour’ was like the cavalry turning up and immediately opening fire on me.  

It’s interesting to note that whereas RTD usually drafted in celebrities – Chris Morris style – to satirise themselves without knowing it (which they always fell for), Moffat drafts Patrick Moore in to be flattered.  RTD got Sharon Osbourne and Anne Widdecombe to endorse Harold Saxon (i.e. the Anti-Christ).  Moffat gets Sir Patrick to play himself as a charming, twinkly-eyed expert.  Aside from lauding a well-known reactionary and sexist, Moore’s guest appearance is emblematic of a wider complacency.  Amy’s village is a retreat from anything like the Powell Estate.  This is maundering ‘long shadows on village greens’ territory. The episode celebrates cutesy, ostensibly lovable, eccentric Britishness.

There’s a near-immediate attempt to problematize such things in ‘The Beast Below’… but the actual result is instructive.  Starship UK is all of Britain, divided into counties that are now housed within tower blocks.  There seem to be no social divisions preventing some residents from fitting easily into a tower block with others.  Starship UK is a great communal space in which everyone fits happily into the same shopping mall.  This isn’t a problem.  Indeed, it is specifically portrayed as wonderful until the Doctor deduces that Starship UK is also a police state… not that there’s much evidence of this, or of any need for it, beyond the arbitrary decision on the part of the authorities to feed children to a monster that refuses to eat them.  There are problems, of course, but social or economic hierarchy isn’t one of them because it doesn’t exist.  There are no poor or homeless people.  There’s a Queen, but she seems to be the only protestor in existence (usurping an entire subculture of activists and transferring their bravery and commitment to the monarchy).  There are sinister government types, but they regularly admit their secrets openly to the entire population… just like our own elites don’t.  The exploitation of the Space Whale becomes the collective fault of the entire population, all kept fully informed by their extraordinarily forthcoming government.  The people all choose – from a position of total freedom – to voluntarily embrace hypocrisy and complacency.  Moffat is groping towards a critique of complicity.  And there’s a germ of a point here because it’s true, to an extent, that many of us know that our society is monstrously unfair yet often choose to do nothing about it.  But Moffat has the government openly admitting its crimes (which misrepresents the reality of government secrecy and media propaganda) to a population who are all equally free and empowered (which misrepresents the reality of people struggling under huge disadvantages, preoccupations and economic blackmail), and who all equally choose to co-operate in exploitation (which… you get the gist).  The exploitation is carried out by the population as a whole.  But who are they exploiting?  The Whale isn’t akin to the working class because they’re not there anymore (in accordance with bourgeois wisdom).  Is the Whale meant to represent the victims of economic or military imperialism?  Precious little sign of it.  This attempt at a spiky, satirical episode resolves into a standard, victim-blaming, liberal whinge about how ‘we’re all middle class now, we get the government we deserve, we’re all complicit, even the activists and protestors’.  This kind of ‘critique’ leads nowhere except to more of the ‘apathy’ that it supposedly attacks.  Economic inequality, economic coercion, layers of social management, the suffocating obfuscation of the media, the manufacture of consent, even capitalism itself… none of these things are anywhere to be seen.  The dark secrets come from nowhere, except maybe the dark side of mankind or something.  And, in the end, the oppressed creature is held to a higher standard of morality than the oppressors, as is usual in these bourgeois morality plays.

Even Moffat’s best attempt at an angrily political story, ‘The Bells of Saint John’, is compromised by the kind of bourgeois reductionism that writes off the London riots – an explosion of rage at austerity and police bullying – as a glitch in the software.  His underlying metaphysics supports the baddies’ claims about how the human brain works, and thus cuts agency away from anyone who might try to protest.  The idea of social rebellion is incomprehensible in this neoliberal vista, except as a moment when the top-down control briefly malfunctions.  Even if it turns nasty, the liberal capitalist millennium is escape-proof.

Instead, you have to fit into it.  Madame Vastra is a Silurian who, thanks to her own comfortable position in bourgeois society, has made her peace with the world.  The Silurians were always the Palestinians of the Who universe.  Displaced, kept down, promised recompense by the Doctor – the great, well-meaning liberal compromiser – and then betrayed so that the status quo can be reset.  In Moffat’s version, one of our heroes is a Silurian who has been separated from her defeated people, bought off and reconciled to the conquerors.  (But then, on Moffat’s watch, all nuance was dropped from the Silurians, with their return story featuring a metaphor about how sometimes good people just have to torture terrorists to protect the innocent.)  Meanwhile, Strax is Vastra’s comedy sidekick.  The Sontarans no longer have anything to say about militarism.  This dimension cannot be explored in Moffat’s version of the show.  It wouldn’t be recognisable to modern TV as a palatable part of a profitable franchise.  The only thing you can do with a metaphor about militarism is laugh at it.

Jenny, Vastra and Strax are a perfect illustration of how Steven Moffat waters down any of the satirical or polemical acid in the show’s signifiers, of how he neutralises and sanitises the show’s inbuilt tendency to engage in (admittedly imperfect) political critique.  He excises anything potentially worrying to the mainstream.  He is semiotic paint stripper.  He makes Doctor Who safe for neoliberalism.

Related to this are the repeated and jarring ethical failures.  There’s the Doctor becoming a neoconservative of other people’s souls, rearranging their innards until they become more to his liking.  In ‘A Christmas Carol’, the Doctor rewrites a man’s life – while he protests.  I realise Kazran is a horrible man, and full marks to Moffat for relating his callousness to his commodification of people as collateral.  That has some potential bite.  But Moffat doesn’t see Kazran as representing a system as opposed to individual villainy (even RTD’s capitalists are more systemic than Kazran).  He doesn’t adapt the parts of Dickens’ fable where Scrooge is persuaded through arguments about social justice.  He leaves out the levelling parties from Scrooge’s youth and replaces them with celeb bashes where Kazran and the Doctor exploit women.  Abigail is a literal ‘woman in a refrigerator’… except that she makes the hero and the villain like each other!  Abigail is literally put in and out of the freezer according to the whims of the boys.  This is only one signal of a wider malaise, as we get to the end of the episode without the Doctor even asking about any of the other frozen people.  We may assume that the Doctor’s newly created version of Kazran will free them (unfair economic systems happen because one or two guys are old meanies, natch) but it isn’t like the Doctor to not even check.  He says he’s “never met an unimportant person” but his actions say otherwise.  How Abigail feels about the fact that she’s been a slave for years, and is now about to die, fails to be a blip on anyone’s radar.  If the Doctor’s going to alter the past, why permit the system that puts Abigail in the fridge to start with?  Why is Kazran the priority here?  Because, once again, the system is inescapable.  Moffat and the Doctor have made peace with it.

The most egregious failure of moral priorities – again linked to a distinctly neoliberal kind of political context-removal – is in ‘The Day of the Moon’, in which the Doctor defeats the Silence by brainwashing the entire human race to become mindless, genocidal killers.  Moffat thinks everybody on Earth has seen the Moon landing on TV, apparently discounting the humanity of anyone too poor or non-Western to get televised updates on American triumphalism.  It is never explained how or why the Silence are any worse than Nixon, with whom the Doctor gets comfortably pally, given that Nixon dropped more tonnage of bombs on small peasant countries in South East Asia than was dropped in the whole of WWII.  Indeed, the Doctor seems to think that Nixon is the leader of a slave rebellion.  In effect, Steven Moffat has compared Richard Nixon to Spartacus or Toussaint L’Ouverture.  But then, the Doctor would be a hypocrite if he waxed superior to Nixon in this story, given his own – apparently heroic, amusing and sexy – engineering of ethnic cleansing.  He’s hypocritical enough in waxing superior to the Silence, who are never actually shown doing anything much wrong (aside from one murder that seems to have been put in solely to justify calling them the baddies).  Remember what I was saying last time about villains sometimes having an objectively superior moral position?

Moffat’s at it again with the snuggling up to powerful leaders in ‘The Day of the Doctor’.  Moffat has the Tenth Doctor conducting a romance with Elizabeth I, during which she jokes about how routine it is for her to have people killed – and he doesn’t bat an eyelid.  How this makes her any better than a Zygon is never explored.  I’ve read irritated posts on tumblr about how Moffat reduces Elizabeth I to a simpering, infantile, love-struck girlfriend.  And yeah, I can see the sexism there… but I’m equally worried by the idea of the Doctor being friendly with her at all, given that she was a ruthless, blood-splattered autocrat.  Just like all feudal monarchs.  And like Richard Nixon.  And like several other people the Doctor has gotten matey with during the Moffat era.  Moffat’s repeated tendency to have him cosy up to rulers, presidents, kings and queens, bosses, etc, is quite revolting.  Okay, the classic Doctors used to occasionally talk about being pally with Napoleon or Mao or Nelson or Marie Antoinette… but you could usually rely on them to distrust or dislike the ruling classes they actually met.  RTD’s depiction of Queen Victoria was far too sympathetic (in reality she was a horrible person who lived in obscene luxury at the apex of a brutal empire) but she didn’t exactly end up as the Doctor’s best friend.

On the subject of sexism…  I have a quite simplistic view of this.  I think the reason that lots of people think Steven Moffat’s version of Doctor Who is sexist is because it repeatedly acts and sounds sexist.  It may be that Moffat consciously tries to craft his Who as feminist or pro-feminist.  If so, I don’t think there’s any better illustration of the crucial point that, in a sexist society, however much of an ‘ally’ you may be, if you’re a man then you still enjoy male privilege, and probably don’t realise it half the time.

The Doctor describes Clara as “a mystery wrapped in an enigma squeezed into a skirt that’s just a little bit too tight”.  The Doctor describes Marilyn Monroe as though she really was nothing more than the stereotypical ‘man crazy’ ditz she played in some of her movies.  Rory likens being married to Amy to being trapped inside a giant robot duplicate of her.  We get dialogue like “Why did she try to kill you and then want to marry you?”  “Because she’s a woman”.  Osgood, a scientist, is shown to be secretly obsessed with jealousy towards her prettier sister.  A Dalek develops a female alter-ego, and she spends her time cooking.

Moffat’s show is crammed with tropetastic Manic Pixie Dreamgirls who tease and tantalise our hero.  Like all MPDs, they each play the role of muse to our broken hero.  The jokey put-downs are part and parcel of the MPD by the way – she always admonishes the hero out of his slough.  The put-downs are not a sign of independence or a genuinely critical attitude; they’re part of the MPD’s job.  The hero is slumped in melancholy – really, was there ever a Doctor as petulantly self-pitying as the Eleventh?  Ten isn’t even in the running – and along comes the MPD, all quippy and perky and happy and impossible – to heal, inspire and reawaken him.  Even as the MPDs have such an excess of personality, every last one of them revolves eternally around the hero.  The Doctor is, of course, the lead character… but he doesn’t need to be the flame around which all these fluttering butterfly girls dance.

(This, by the way, is a manifestation of Moffat’s unhealthy and point-missing insistence on the Doctor being what the show is about, rather than a way of unifying the polemics and allegories and metaphors and satires and pastiches inside one meta-text.  That isn’t me calling for the Doctor to be characterless.  Indeed, the trouble now is that he’s more characterless than ever before.  The more Moffat concentrates on him, the more characterless he becomes, because he becomes more and more a narcissistic navel-gazer, rather than an actor in social events outside himself.  Once again, the true neoliberal attitude: the atomised individual is what matters, not the social act.)

In Moffat’s show, women are overwhelmingly defined by their traditional gender roles or bodily functions.  It doesn’t matter that their excellence in these gender roles is praised by show and lead character.  It doesn’t matter that we’re supposed to be impressed by the virtuosity with which River tricks people using her feminine wiles.  It doesn’t change anything that the Doctor goes into rhapsodies about the wonders of motherhood.  That isn’t liberating; it’s still the mapping of male, patriarchal conceptions of female value onto female characters.

River exists entirely because of the Doctor.  Who the hell is River?  She is an assemblage of gender essentialist tropes and wisecracks.  When does she ever – beyond, arguably, her first appearance – behave like an academic or a scientist?  When does she ever display anything resembling erudition or intellectual curiosity?  When does she ever do or say anything to show or engender love?  Admittedly, the Doctor seems to be sexually aroused by the way she shoots people… which is just charming.  In ‘Let’s Kill Hitler’, she is incarnated as Mels, a character we’ve never seen or heard of before, and plonked unceremoniously into the story out of sheer, brazen convenience.  She stalks Amy and Rory (her unwitting mother and father) for years, pretending to be their friend, all because of her pre-programmed monomaniacal desire to get to the Doctor.  She regenerates while “concentrating on a dress size”.  She spends the rest of the episode obsessing over her hair, clothes, shoes and weight.  River’s instability is finally conquered by the love of a good man.  This seems intensely hostile and patronising.  If that isn’t what was aimed at, then somebody is a very bad shot.  

It doesn’t matter that River is ‘powerful’.  Fetishizing ‘power’ in women characters – having them kicking ass and always being ready with a putdown - isn’t the same as writing them as human beings.  Moffat’s Who may not be quite as blatant an iteration of this misprision as, say, Zack Snyder’s Sucker Punch, but, strangely, it might even be worse precisely because it seems superficially better.  Snyder’s film might, on a first viewing, be mistaken for something which knowingly panders to adolescent male fantasies only to ruthlessly pull the rug out from under the leering male viewer… Snyder does, after all, pepper his sexploitation sequences with grim representations of real sexual exploitation.  Moffat doesn’t, for instance, put River into suspenders and pigtails (though he gets close to this sort of thing a few times with Amy) but then neither does he bother to really address the consequences of gendered violations like forced pregnancy.  Moffat can get away with as much, if not more, than Snyder (or, for that matter, Uncle Tewwance) precisely because he seems to be ‘on side’.  There’s neoliberal feminism for you in a nutshell: the absorption and assimilation of protest by the dominant hegemonic culture.

The show repeatedly reduces Amy to the roles by which patriarchy constructs femininity: girlfriend, fiancée, wife, mother, ex…  There’s even a ‘comedy’ episode in which Amy is said to have used her sexuality to pass her driving test (tsch, these women drivers!), is split into two people and literally fancies herself, thus providing lesbian fantasy fodder for the men around her.  Remember that episode of Red Dwarf where Rimmer and Lister meet their female counterparts in a parallel universe?  At one point, Rimmer is aghast because his female opposite has, in an effort to pull him, “gone to get some sexy videos” because “she seems to imagine that seeing two men together will turn me on”.  Moffat may be writing Doctor Who-as-sitcom, but a line in a real SF sitcom skewers the absurdity of the male idea that all women secretly want to engage in lesbian sex for male titillation.  The rest of the ostensible laughs come from Rory staring up Amy’s skirt… because it’s hilarious to violate her privacy without consent!  But, of course, Rory and Amy are married by this point, so that effectively makes Amy into Rory’s property (remember how, once they’re married, the Doctor has to ask Rory’s permission to hug her?).  Rory’s half-hearted acceptance of this new arrangement is all part of the joke of Rory.  He is, of course, a slave to Amy.  He can’t stop himself staring up her skirt because she’s just too pretty (putting it all on her) and has to accept that he’s now ‘Mr Pond’… but it’s all part of the trade-off that the Nice Guy makes.

Rory is an emblem of Nice Guy Syndrome.  He’s the bloke who thinks he’s entitled to the girl because he helped her move house that time.  In Rory’s case, he was her playmate during all the childhood years when she wanted to play Raggedy Man (again, the Doctor-fixation).  He hung around throughout school and college, waiting for his time to come.  As with so much of Moffat’s work, this is a recycling of stuff that was big in the 90s.  Rory is a reiteration of the Nice Guys Who Waited in 90s sitcoms – Niles in Frasier, Ross in Friends.  Like them, he’s a self-pitying, yet idealised, nerd Mary Sue (contrary to sexist myth, Mary Sues are not just the province of female fan-fic writers).  He hangs around pining, being loyal (for thousands of years in Rory’s case), and thus earns the girl.  He accrues his entitlement to her via years in the ‘friend zone’.  Initially, she barely notices his maleness because it is obscured by his niceness (girls like jerks) and his allegedly less-than-stunning looks (girls like hunks, the selfish…) while he, of course, desires her for… well, without any self-awareness, he desires her for her stunning looks.  “You’re so beautiful,” moans Rory on one of the occasions when he dies.  It is only later, after she has been educated in her role as wife and mother, that he gives any indication of liking anything about her personality.  Rory is an illustration of all this, written as lovable, swathed in the alibi of irony.  Later Rory gets his forceful and decisive moments.  But even here, it’s hard to not see these as a reassertion of proper male authority.  In the meantime, he gets repeatedly slaughtered in order to hammer home the point that Amy is, for all her supposed independence, slavishly dependent upon him.  They may not fancy the nice guys, but they need them!

There’s a feminist reading that claims Moffat’s female characters as role models.  Post-Demon’s Run Amy, for instance, can be seen as a rape survivor who refuses to allow the experience to define the rest of her life.  However, the trouble here – aside from the trotting out, yet again, of the Mystical Pregnancy trope by a male writer who feels entitled to reduce his female characters to uncanny uteruses – is the sheer blithe glibness of the representation in question.  Yes, it might be an admirable thing to show a woman who, having been violated with an unwanted pregnancy and birth, only to have her baby stolen from her, lives past such trauma and refuses to allow it to define her… if we were ever given any real sense that the experience had been traumatic for her.  It might be objected that this complaint amounts to asking for more concentration on the rape and the trauma.  But I didn’t want to see SF rape on Doctor Who again at all!  Even so, given that Steven Moffat made the unforced artistic/business choice to put SF rape in there, I’d have much preferred to see some indication that the victim found it more than slightly and briefly unpleasant.  After all, violence against women – sexual and otherwise – is currently at epidemic levels globally, and getting worse (one of the social by-products of neoliberal crisis and austerity).

We’re all supposed to think modern Who is so much better than old Who because it’s ‘emotional’… well, you can’t just turn that off when it suits you and not expect anyone to cry shenanigans.  And I can’t help noticing that the moment when the supposed emotional maturity of modern Who fails most catastrophically is also the moment when it forces itself into the corner of dealing with the sexual violation of a young woman.  

Tell you what: why don’t we get some women writers back onto the show, and they can decide if they want to write about rape or pregnancy in SF terms, and, if so, how they think it should be done.   It’s a thought.  

Relatedly, there’s the issue of heteronormativity.  This is the first era to have a married couple aboard the TARDIS as regulars, the first era to have the Doctor married off, or even to have him straightforwardly and explicitly straight (unless you want to argue about Susan).  This is the first era that repeatedly focuses on heterosexual couples, the first era that repeatedly has alien menaces defeated by the aggressive assertion of heteronormative gender relations.  It happens again and again, be it through the declaration of heterosexual love (‘The Lodger’), the power of heterosexual fatherhood (‘Black Spot’, ‘Night Terrors’), the emotions of the heterosexual nuclear family (‘The Doctor, the Widow and the Wardrobe’, ‘The Snowmen’), the restatement of heterosexual parenthood as an implicit good (‘Closing Time’), etc.

Yes, there are positively drawn gay characters here and there - Canton in ‘Day of the Moon’, for instance.  But anyone who thinks that a relationship between a humanoid reptile and a subservient housemaid/ninja (in which they solve crimes in Victorian London) qualifies as normative….  In the midst of a sea of heteronormativity, it’s hardly a workable defence to point to one homosexual relationship which is sketched in the most outlandishly Fantasy terms, and features no characters of any consistency.  Jenny, for instance, becomes a ninja – out of the blue - whenever required.  It is a mark of how little genuine respect is shown these characters that her unveiling as a ninja in ‘The Crimson Horror’ is just that: an unveiling, with the camera panning up her legs, clad in tight leather for the benefit of the male gaze.  Oh look, we’re back at the sexism.

‘The Crimson Horror’ is Gatiss’ best script yet for the TV series, featuring lots of quite well-drawn female characters… so it’s tragic, and illustrative of the lack of care which undermines this era, that the episode also features two lapses into breathtaking male privilege: the Doctor’s offhand description of Tegan as “a gobby Australian” and the truly jaw-dropping moment when the Doctor forcibly snogs – i.e. sexually harasses – a young woman whom he knows to be in a committed same-sex relationship.  This moment is, needless to say, brushed off immediately and treated as another bit of fun.  

We get something similar (though arguably not quite as bad) in ‘Time of the Doctor’ when the Doctor slaps Clara’s backside to prove that she’s his property… I mean, girlfriend.  I realise, by the way, that this is supposed to be a sign of the Doctor’s unworldly, immature, crass misunderstanding of human relationships.  But that’s an unconvincing alibi.  Firstly, ‘it’s just a joke’ has long been discredited as an excuse for sexism… or should’ve been.  Secondly, this is the same character who is supposed to be emotionally mature enough to sustain a romantic relationship with River which we are meant to find noble and moving.  You can’t just move the character’s maturity dials up and down from episode to episode depending on… oh, hang about.

On the subject of Clara… she’s really the ultimate example of the Moffatian variant of the MPD: the woman-as-puzzle.  She is also emblematic of the Moffatian habit of simply altering the female character to fit each episode as it comes along.  With Clara, he literally makes her a succession of different people.

There is something about the spurious way that all the multiple Claras are declared – by fiat – to be part of one whole, that strangely mirrors Moffat’s obsession with forcing all the previous Doctors into line with his.  Here Moffat shows a truly fannish impulse towards syncretism.  He crams all the old Doctors into his ‘of the Doctor’ tetralogy, just as he’s been crowbarring clips of them in at every opportunity since ‘The Eleventh Hour’.  The aim is to bring the past into his orbit, the better to overwrite it.  The truth is, Steven Moffat has become the Great Intelligence.  The threat posed by the Intelligence in ‘Name of the Doctor’ is that he/it will take over the Doctor’s entire life and rewrite it to suit himself.  This in the same episode in which Moffat literally inserts his own character, Clara, into every moment of the Doctor’s life, having her meet every single one of his incarnations, putting her at the very moment when the Doctor first leaves Gallifrey, telling him which TARDIS to steal.  This in the same episode in which he introduces an entirely new, never-before-seen incarnation of the Doctor.  Whatever else you can say about him, Moffat isn’t a writer who allows himself to be troubled by an excess of self-awareness.  Of course, all Who is a palimpsest.  But Steven Moffat seems to be the only writer to work on the show who is absolutely determined to overwrite his own personality on top of the whole damned thing.

Part and parcel of this seems to be his fixation upon the Doctor himself.  Admittedly, this is a recurring problem of new-Who going back almost to the start in 2005.  But again, Moffat seems to have taken it to a new level.  And, y’know… this show isn’t fundamentally about the Doctor.  He’s a narrative device for moving us from satire to polemic to allegory to metaphor to pastiche to whatever.  That doesn’t, of course, mean that he has to be written as a cipher without interiority.  On the contrary, I love it when he is written as having interiority.  But the thing is… well, there are several things.  Firstly, as mentioned, the kind of interiority he has now is of the self-involved, atomised individual of neoliberalism, not the social actor he should be.  Secondly, there’s no necessary connection between giving the Doctor interiority and making the show about him.  Thirdly, even if we accept that the show must be intensely emotional and focused upon the feelings of the lead character, wouldn’t it be better if it actually… umm… did that?  

The reason I feel ill when the Doctor snogs River’s ghost at the end of ‘Name of the Doctor’ is not that I hate emotion in Who, or that I want – because I’m a sexually and emotionally repressed nerd or something – Doctor Who to be emotionless.  Rather, the opposite of this is the truth.  The reason I feel ill at moments like that is rather that I hate fake emotion, cheap emotion, unearned emotion.  Commodified emotion.  Packaged, marketed, profitable, sugary, junk emotion.  Sentimentality, in other words.  

Sentimentality is disgusting because it’s not fundamentally about other people, or relationships.  It’s about oneself.  It’s self-regarding, self-comforting, self-pleasing.  It isn’t social.  It’s narcissistic.  This is precisely what is so horribly wrong with all those Moffatian emotional tornadoes.  How can they be touching when the characters and relationships are so shallow?  When we’re watching narcissists adoring their own reflections in their partner’s eyes?

This post has gone on too long, but there’s so much still to say about these last few years.  There’s the hubris of having the Doctor frighten away enemies by touting his reputation.  There’s the moralistic preaching and speechifying.  There’s the sheer boredom of the story arc mysteries, impossible to care about because they’re always waved away with some bit of nonsense made-up-on-the-fly.  There’s the constant undermining and reversal of death.  There’s the banalization of the Time War into a Lucasfilm space battle.  There’s Matt Smith (luckily, my politics means I’m used to being in a tiny minority).  There’s ‘Victory of the Daleks’.  There’s the relentless middle class-ness of almost everyone and everything.  There’s the way the Doctor’s behaviour never changes, no matter how many times he learns his lesson.  There’s “the tears of a whole family on Christmas Eve…”  There’s the inconsistency of claiming that the Doctor’s moral status has been challenged by comparing him to a warrior when the show is chock-full of unambiguously noble warriors.  There’s the stigmatising of loners.  There’s the use of slightly surreal environments which then get fully explained, just in case anything off-the-wall makes the viewers uncomfortable.  There’s the reliance on tropes that were big in the 90s… I’ve mentioned Nice Guys from US sitcoms, but how about Greys and Area 51?  There’s the stalking-as-romance trope.  There’s the sheer privilege-blindness involved in making the first all-black guest cast in Doctor Who play a bunch of fools who need to be captured and threatened into moral behaviour by the Doctor, or in giving a greedy trader the name ‘Solomon’.  There’s the Doctor relating to his vehicle by fancying it.  There’s the way ‘Night Terrors’ lectures the working class on how to be better parents.  There’s the way every resolution seems to involve solving the monsters to death (usually with love), thus defeating the rump gothic with the power of comforting banalities.  There’s James Corden.  There’s… oh, that’ll do.  I’ve had enough.  I don’t like having to hate this show.  I want to love it.


I’ll say one more thing.  It may be true that previous production teams have, at one time or another, been guilty of things similar to everything I’ve just been talking about.  But I think that’s a red herring.  What does it matter?  What does it change?  Besides, even if it’s true that Moffat gets a lot of extra stick because expectations have been raised… well, that isn’t unfair.  Even if he’s the one who raised them (which I don’t buy), it still isn’t unfair.  It’s part of how things get better.  The people in power, the privileged, deliver something, and instead of saying “thanks boss”, you say “not enough – do better.”  Moffat has a harder time pleasing everybody because more people are politicised and vocal about stuff like sexism.  The neoliberal feminism of a privileged ‘ally’ isn’t good enough for them.  And that’s as it should be.  Be reasonable, I say.  Demand the impossible.