Shared posts

13 Mar 19:27

traces / strangers

there is an experience that soft chambers wonders whether you have ever had

it is not an experience that can be had today

this is a historical feeling – it was created by specific circumstances and is unlikely to exist again in the same way

it is the experience of renting a videogame for a period of time and discovering the traces left by a stranger, the evidence of another person’s encounter with the world

dropping into the half-completed quests, the half-built villages, we felt as if we could nearly touch these strangers, talk to them

coming across worlds that had been completed, rescued, saved, we marvelled at the skill and power of these others whom we would never meet

we wondered, standing before the cave housing the greatest monster in the world, the beast that would be god, how many people’s invisible efforts had gone into reaching this point

how many hands did a single save file pass between from the time a game was stocked on the shelf of a small rental store to the time that store sold their old cartridges and replaced them with compact discs?

soft chambers wonders: is this simple nostalgia? or was there something ineffable and important shared through that experience?

25 Jan 21:57

The Paradox of the Muddy Children

by Greg Ross

Three children return home after playing outside, and their father tells them that at least one of them has a muddy face. He repeats the phrase “Step forward if you have a muddy face” until all and only the children with muddy faces have stepped forward.

If there’s only one child with a muddy face, then she’ll step forward immediately — she can see that no other children have muddy faces, so her father must be talking about her. Each of the other children will see her muddy face and stand fast, since they have no way of knowing whether their own faces are muddy.

If there are two children with muddy faces, then no one will step forward after the first request, since each might think the father is addressing the other one. But when no one steps forward after the first request, each will realize that there must be two children with muddy faces, and that she herself must be one of them. So both will step forward after the second request, and the rest will stand fast.

A pattern emerges: If there are n children with muddy faces, then n will step forward after the nth request.

But now imagine a scenario in which more than one of the children has a muddy face, but the father does not tell them that at least one of them has a muddy face. Now no one steps forward after the first request, for the same reason as before. But no one steps forward at the second request either, because the fact that no one stepped forward after the first request no longer means that there is more than one child with a muddy face.

This is perplexing. In the second scenario all the children can see that at least one of them has a muddy face, so it seems needless for the father to tell them so. But without his statement the argument never gets going; despite his repeated requests, no child will ever step forward. What’s missing?
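The inductive argument in the first scenario can be checked mechanically. Below is a minimal sketch (the function name and the list-of-booleans representation are my own): each child counts the muddy faces she can see, and steps forward on the round at which the accumulated common knowledge ("after r−1 silent rounds, there are at least r muddy faces") forces the conclusion that her own face must be one of them.

```python
def rounds_until_step(muddy_flags):
    """Simulate the muddy-children puzzle after the father's announcement.

    muddy_flags[i] is True if child i has a muddy face. Returns the round
    on which the first children step forward, and which children they are.
    """
    k = sum(muddy_flags)  # total muddy faces (no child can see her own)
    for r in range(1, len(muddy_flags) + 1):
        # A child sees every muddy face except (possibly) her own.
        # After r-1 rounds of silence it is common knowledge that there
        # are at least r muddy faces; a child who sees only r-1 of them
        # can deduce she is the r-th, and steps forward.
        steppers = [i for i, muddy in enumerate(muddy_flags)
                    if (k - muddy) == r - 1]
        if steppers:
            return r, steppers

# n muddy children step forward together on the n-th request:
print(rounds_until_step([True, True, False, False]))  # (2, [0, 1])
```

Running it with any number of muddy faces reproduces the pattern in the text: exactly the muddy children step forward, and they do so on request number n.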

(From Michael Clark, Paradoxes From A to Z, 2007.)

25 Jan 21:53

January 25, 2015


The secret kangaroo penguin club meeting went really well. I couldn't believe all the celebrities who showed up!
25 Jan 05:49

Sundance Review: The Exquisite Holy Terror Of ‘The Witch’ Will Chill Your Bones & Haunt Your Soul

by Rodrigo Perez
In cinema, witches are traditionally depicted as campy, melodramatic cartoons. They sport wicked noses, grotesque make-up and shrill cackling voices. But if one were to subtract all elements of broad humor and clichéd corniness and distill the frightening concept of witchcraft and evil to their base essentials, a chilling picture begins to emerge. Such is Parts & Labor’s “The Witch” a spellbinding, absolutely nightmarish picture that will genuinely make your blood run cold. Set in 1630 New England, a devout Christian family is about to be excommunicated from their village thanks to the prideful and arrogant father. The family then makes their own pilgrimage to a nearby...
25 Jan 05:49

Angry Atheist Rant

by ozymandias
Zephyr Dear

I care a lot about whether it's good. =/

I am not saying that religious people are universally stupid. I’m saying that religion is factually incorrect.

I am not saying that atheists are universally smart. I’m saying that religion is factually incorrect.

I am not saying that religion doesn’t provide comfort to and enrich the lives of billions of people. I’m saying that religion is factually incorrect.

I am not saying that everything in the modern scientific consensus is true. I’m saying that religion is factually incorrect.

I am not saying that everything that some asshole passes off as science is true. I’m saying that religion is factually incorrect.

I am not saying that there are no people who attempt to reconcile science and religion. I’m saying that religion is factually incorrect.

I am not saying that nothing good has been done in the name of religion. I’m saying that religion is factually incorrect.

I am not saying that nothing evil has been done in the name of science. I’m saying that religion is factually incorrect.

I am not saying that the tradition and community people get from their religions isn’t valuable. I’m saying that religion is factually incorrect.

I am not saying that religion is inherently tyrannical and oppressive. I’m saying that religion is factually incorrect.

There are atheists that say some of those things that I’m not saying. Which is bad, because all of those things I’m not saying are douchebaggy and factually incorrect and also completely unnecessary.

My radical position is that you should not believe things that are factually incorrect. Even if believing them makes you feel nice. Even if it makes you a better person. Even if it connects you to your community and your ancestors. (You can go to the rituals without believing in God, anyway.) Even if some other people over there believe factually incorrect things sometimes too.

I feel like having to justify why believing things that aren’t true is bad is like having to justify why hitting people who don’t want to get hit is bad. It… is? Duh? But the number of otherwise intelligent people who say “they’re nice and they aren’t hurting anyone, what do you care?” suggests otherwise.

The doublethink necessary to believe things that aren’t true hurts your ability to figure out what’s true and what’s not. Not necessarily; humans are very good at compartmentalizing and often, say, take “I feel it in my heart” as adequate on matters of theology but not on matters of medicine. But in aggregate turning off your critical thinking and rationality sometimes hurts your ability to be rational.

If you believe things that aren’t true, you’re going to make decisions based on the false things you believe, and decisions made based on inaccurate information are usually bad decisions. If you believe apricot pits cure cancer, you might skip chemo. If you believe there aren’t any cars on the street when there are, you might cross and get hit by an SUV. If you believe all-nighters improve your grades, you might flunk an exam. If your map says that Disney World is in Michigan, you will never get to meet the Mouse. For that matter, look at basically any atrocity in history; nearly all were caused by people believing untrue things (usually “this group of people is inherently evil,” “God said so,” or “our ludicrous political system totally works”).

As a practical matter, there are a lot of false beliefs in the world, and most forms of theism I encounter in day-to-day life are not actively hurting people and, in fact, might be making people’s lives better. So I’d prefer to try to get people to not believe in untrue things that are also hurting people first.

This is really just a plea for atheists and theists to stop arguing about whether religion is good. Who cares? The question is whether it’s true.


25 Jan 05:46

cute-bird-dad: when will game of thrones and the walking dead go away. when will this wave of...

cute-bird-dad:

when will game of thrones and the walking dead go away. when will this wave of cynical grimdark “only the strong survive” fiction finally end. why is star trek about explosions now. please give me the power to destroy this disgusting redpill bro nerd culture that is wallowing in its cheap easy discomfort and gratuitous shock value and refusal to move forward with society

24 Jan 20:44

Somebody Get This Guy a Budget

by Peter Watts

We open on two civilians waiting to board a train. To their left stands a SWAT cop in riot gear; to their right, a battered drone right out of Blade Runner hovers menacingly at heart level. Glances are exchanged, though no words are spoken: If you have nothing to hide, you have nothing to fear. Slowly— making no sudden moves— the civilians climb aboard.

Welcome to the Special Economic Zone. You’ll like the shops, if you like to sweat.

*

That is not a Donnie Darko hoodie. Then again, it could be a Donnie Darko shout-out.

When Jim Munroe was first starting out, he broke up with Rupert Murdoch. He turned his back on conventional publishing after a successful debut with Harper-Collins, choosing instead to go indie. Of course, they didn’t call it “going indie” back in the final year of the twentieth century. They barely even called it “self-publishing”. What they really called it was career suicide.

But damned if he didn’t make a go of it, years before Howey and Weir and all those other late-comers jumped on the bandwagon. Novels were only the start; Munroe branched out into games, graphic novels, lo-fi movies. His first web serial, Infest Wisely, had a budget of about twenty bucks and looked it. Its most expensive prop was a mock-up of an ATM (or possibly, an actual ATM boosted from the local 7-11 à la first-season Breaking Bad. Jim’s resourceful that way.)

His second outing was a mockumentary purporting to be a Chinese human-interest piece set in 2045, tut-tutting about the poor white underclass that had emerged following America’s bankruptcy (and subsequent repossession of the Cloud) way back in 2016. Android babies and human spam figured on-stage; the giant mutant spiders, in deference to budget reality, stayed out of frame. Ghosts With Shit Jobs weighed in at a still-meager budget of $4K and 7,000 hours of volunteer effort— constraints which didn’t stop it from showing at festivals from London to Beijing, and two dozen cities in between. It got a shitload of rave reviews and took home the Best Feature award at Sci-Fi-London in 2012. Ghosts looked significantly better than Infest— largely because Toronto’s Dundas Square makes a pretty shiny SFnal backdrop for free— but the shoestring, while thicker, remained.

Third time out, though, Jim graduated from shoestrings to bootstraps. He got 150K from the Independent Production Fund, so he could pay his crew. He raised another 25 grand on Kickstarter to cover post-production. The result is an eight-episode web-series (or, if you prefer, a 70-minute movie) called Haphead (here’s the trailer, here’s the fb page)— still made for an infinitesimal fraction of your typical movie, but still forty times richer than last time.

It shows. It premiered in Toronto just last Thursday and it’s already picked up its first nomination (Best Score, out at Vancouver Webfest). I rather expect more will be coming.

A Clockwork Orange glower for the 21st Century. Look into her eyes: that’s the Aster*sk corporate logo. It scrubs out the inside of your head every time you boot up.

Haphead is a story of the near future, set in Hamilton’s “Special Economic Zone”— basically an industrial ghetto, liberated from such anticompetitive woes as the minimum wage, or safety standards. Our guide is an assembly-line grunt named Maxine, who makes a marginal living slapping together brain-game interfaces for overseas markets. The device clings to the base of your skull like a leech, bypasses your sense organs in favor of writing input directly onto the sensory cortex (I’m guessing some kind of TMS or targeted ultrasound, although we’re never told). Maxine isn’t allowed to use them herself; the tech hasn’t been approved for domestic distribution. But she steals one anyway, uses it as a passport into a virtual game world that— well, imagine Skyrim inhabited entirely by bipedal sapient kick-boxing ninja bunnies.

No, really.

Max spends a lot of time there.

The thing is, this interface is immersive. It doesn’t just fill the senses, it works out the body. Spend enough time fighting killer rabbits in fantasy-land and you develop moves— not to mention improved stamina and muscle tone — back here in meatspace. Which comes in handy when someone close to Maxine dies suddenly under mysterious circumstances…

Haphead is way better than it has any right to be. Little gems of technosocial extrapolation glitter throughout Munroe’s screenplay: upscale malls with perky automated security systems, apologetically refusing entry to consumers with “mixed-income backgrounds”; insurance companies with their own paramilitary SWAT teams to go after false claimants. The plot itself— at first glance a straightforward lefty bit of capitalist-bashing— takes turns you might not expect. People are not always who they seem to be; the victims aren’t always who you might think (or if they are, they might be a bit less deserving of sympathy than they first seem). Star Elysia White is a real find; whether Max is mourning or raging, pondering some mystery or cracking wise, her performance is spot-on throughout.

You’re in good hands with Allstate.

It’s not a perfect film. The extrapolation’s a bit sloppy in places, the narrative occasionally inconsistent. I love, for example, the face-recognition specs that flash your net financial worth to any mall cop who crosses your path— but I’m skeptical that a social infrastructure with that level of casual surveillance would also let you extract two million dollars from a corner ATM by dragging an unconscious account-holder up to the keyboard and smushing their fingerprint onto the ID pad. And after seeing a security drone break up an after-hours Fight Club on the factory floor mere moments after it starts, I gotta wonder why none of those bots show up when Max gets into an extended knock-down-drag-out with an actual supervisor in the same building.

It’s important to note, though, that when I find something wanting in this series— something that can’t be obviously forgiven as a budgetary artefact— the fault I’m finding is that Haphead occasionally descends to the quality of Hollywood productions with Hollywood budgets. If the dialog is clunky now and then— as when a couple of generic bad guys loom and spout threatening clichés in the first few minutes, before fading away to make room for the main story— it’s still no worse than much of the dialog I’ve endured while catching up on “Person of Interest”. If some of the secondary characters don’t always hit their marks acting-wise, they still look pretty good next to some of the performances in a cult favorite like, say, Babylon 5. The only reason I grumble about them at all is because I hold Postopia Productions to a higher standard, teensy budget notwithstanding.

Personally I would’ve preferred Cat World, but to each his own.

And in fact, most of the nits I’d pick pretty much do come down to budget. It’s impressive enough that Munroe’s modest funds were sufficient to render a high-altitude zoom from jet stream to worktable in a single continuous shot, or anchor a futuristic CGI shopping mall onto a real-world industrial park; I shouldn’t complain just because there weren’t enough customers in the parking lot. Ninja Bunny World looks curiously retro for a 2025 game environment, and a plot-critical piece of biotech seems a bit too advanced for the world we see on screen— but this is the price you pay for rendering macro concepts with modest funding. For me, the choice between rich-but-dumb vs. smart-but-poor is a no-brainer (granted, the existence of Michael Bay makes it pretty obvious that mine is a minority view). Still. Wouldn’t it be great to have that third option? Wouldn’t it be awesome to live in a world where smart-and-rich came along more often than a solar eclipse?

Wouldn’t it be great if this guy got some serious money behind him?

From troposphere to tabletop in one smooth shot. I would’ve built an animated gif showing the sequence, but I couldn’t be bothered.


Munroe continues to do wonders with the means at his disposal. Every time he’s up to bat, he hits the ball further. I don’t know what his next project is going to be (beyond, hopefully, another season of Haphead). I don’t know what kind of budget he’ll manage to put together. But the trend is unambiguous. One of these days— sooner rather than later, I’m thinking— Jim Munroe is going to hit it right out of the goddamn park.

Just like that battered menacing drone at the train station, I’m going to be watching him every step of the way.


24 Jan 20:01

we saw blade runner final cut on our date tonight there was one scene i found compelling if you...

Zephyr Dear

yesss watch more movies and have interesting thoughts on them that roughly mirror my own (not yet fully-formed) take yessssss do that

we saw blade runner final cut on our date tonight

there was one scene i found compelling if you read deckard as realizing he is a replicant, which is the one where he arguably assaults rachel

to my reading, she has just realized she is not “human” (so what should she feel? can she trust her feelings? etc) and he is probably realizing the same of himself, and his pushing her to sleep with him happens in this very telling way where he has to talk her through how a woman should react to a man, essentially

at best, rachel is reluctant to participate in deckard’s fantasy — which makes sense, the real tragedy here is that she isn’t given a chance to reclaim a subjectivity for herself, she is just embedded into deckard’s. anyway, in the particular scene,

it is as if they are realizing their performances (their humanness, but also their gender) are just that, performances

the film would’ve made a lot more sense if deckard had been fully recognized by the narrative as a total asshole. in other words if the film had been willing to fully criticize men and masculinity.

the foiling of deckard vs batty could then follow a better form: deckard should be unmasked and crushed by the narrative, whereas batty should achieve recognition for his humanity. the film almost goes there, kinda giving batty that chance, but it is too invested in harrison ford so it can’t seal the deal

and it also misses a ton of chances to show how humans commodify each other’s performances (priss and zora are both sex workers, for instance, but that is not even really fleshed out).

the visuals are beautiful (we got to see it on 35mm which was nostalgic for me, though the print wasn’t the best) but the story just doesn’t quite track

24 Jan 19:58

cityofdreamingspanners: ughzuko: Adam Hills rant about Rupert...



cityofdreamingspanners:

ughzuko:

Adam Hills’ rant about Rupert Murdoch’s Charlie Hebdo tweets

I will get behind almost anything that concludes with describing the oldest, whitest, richest scumbag on earth as a “Massive fuck knuckle”.

24 Jan 02:20

Improve Education By Teaching Less

by ozymandias

(Okay, everyone, we know that Ozy Is Not An Education Reform Specialist Or A Teacher And Has Actually Not Even Taken Sociology of Education Yet Despite It Being Offered Every Year, right? Please take this as more “food for thought” than “serious policy suggestion.”)

Right now, schools try to cram in a lot of stuff. Anyone remember the American history classes that never got past the Civil War? Biology classes that never got around to covering vertebrates? World History classes that have a week for Greece and Rome or a day for Vietnam? Math classes where half the textbook could have been blank white pages, because you’re certainly not addressing any of it?

Not just that, but a lot of the stuff schools teach is totally useless. I spent three years in middle school learning how to write five-paragraph essays and four years in high school learning how to write literary criticism. I find it amazing that I spent seven years of my life learning how to write the only two genres that absolutely no one in the world wants to read. I mean, Jesus, I could have spent seven years working on my sonnet skills. At least that would get me laid.

Let me be clear: I’m not blaming teachers for this shit. Most teachers are good people doing the best they can in an absolutely crappy incentive system.

“Our students should learn about Important Subject X!” is popular and “maybe we should take some of this out of the textbook?” leads to cries of dumbing down education. So textbooks have more information than you could possibly teach to a bunch of bored freshmen no matter how good a teacher you are. Someone decided that reading and writing ought to be taught in the same class and “students write about what they’re reading” is a natural way to synthesize that. Someone else decided that a single essay in a prescribed and absurdly artificial format* should be the sole way of assessing how well students are doing at writing, and you cannot blame people for responding rationally to incentives.

But the problem remains: students are being taught things they don’t need to know, and not being taught things they need to know, and this is a terrible way to run an educational system.

I suggest triage. Create a bare minimum list of things that people absolutely need to know– things that are highly effective in making people happier and better citizens, that either you or people around you will seriously regret your not knowing– and concentrate on teaching that. My preliminary list:

  • How to read.
  • How to write a clear sentence and paragraph.
  • Some foreign language fluently. Probably more than one.
  • Basic arithmetic.
  • Statistics.
  • How to assess information for quality (statistics is related to this).
  • The scientific method.
  • Basic science: how evolution works, what atomic theory is, etc.
  • Civics.
  • Basic psychology.
  • Et cetera, I highly doubt this list is complete.

“But Ozy?” I hear you say. “What about the love of learning, knowledge for its own sake? Don’t you value that?” Of course I do. I read textbooks for fun. But the love of learning cannot be coerced. You can’t make someone be passionate about learning world history because you passed a law that says everyone in tenth grade has to learn about world history. And even if that magically worked, they would probably be endlessly frustrated that you only spent a week on Greece and Rome.

Once you teach people the absolute basics, they can go where their passions take them: solving math problems, doing experiments, going to Shakespeare performances, writing poems, playing drums, programming, learning everything there is to know about the Abbasids. If people do things they care about, they are more likely to actually remember them a decade later; furthermore, it teaches important skills like How To Find Things Out that are way more important than a half-remembered quadratic equation.

Some people think people wouldn’t learn unless you coerced them. I highly doubt this. Humans’ comparative advantage is intelligence; we evolved to be thinking animals. It is really a sign of success at… something… that schools have managed to convince so many people that learning is boring and sucks. Besides, I highly doubt Hypothetical Would Rather Sit On Her Ass Than Learn To Play Drums Or Something Lady would be much good at learning things in a regular school system either. 

So basically I propose modified unschooling! With a caveat that some things are important enough that everyone has to learn them even if they don’t want to! Okay.

*People who didn’t spend three years learning how to write a five-paragraph essay, you cannot imagine how terrible they are. Imagine the guidelines an Overly Literal Genie who’s read too much Strunk and White would give for writing an essay. “There must be an introduction, a conclusion, and three paragraphs of evidence. The introduction contains two sentences of hook, two sentences of transition, and a thesis statement which says exactly what the next three paragraphs are going to say…”


24 Jan 02:11

Why I’m A Moral Nihilist

by ozymandias

Please note that I am strictly talking about moral nihilism here, which I’m going to refer to as “nihilism” for the rest of the post because I like saving on the typing. There are lots of other kinds of nihilism, and most of them make my head hurt (I am down with the existential nihilism though). I am not the person to ask about mereological nihilism.

When I say I’m a nihilist, I mean that I believe that there is no such thing as objective morality. All morality is just a kind of preference– when you say “X is morally right” you mean “I would prefer to live in a world where there was X.” It’s erroneous to think that there is an objective system of morals Out There, the same way that it is erroneous to think that, just because I prefer lima beans to microwave popcorn, there is an objective scale of tastiness with lima beans at the top and microwave popcorn at the bottom.

When I say I’m a utilitarian, I mean “I really like happiness, and I really don’t like pain. In fact, these feelings are so strong I want the most possible happiness and the least possible pain!” But I believe that there’s no way I can convince you to like happiness unless you already like happiness. If you believe that gaining honor through war is the highest goal of human life, or (like many medieval Christians) that suffering is good because it leads to the purification of the soul, the closest thing I can come to an argument is “look at all this unnecessary pain you’re causing! You monster!” Which is not really a good argument, because if they primarily cared about minimizing human pain they’d be utilitarians and we wouldn’t be having this argument.

When I argue for assisted suicide after counselling, I say “assisted suicide lets people die if the rest of their life would contain too much pain, which is good because pain makes people unhappy, which is bad because happiness is good, which is good because… I don’t know, it just is.” Similarly, someone else might argue “if people commit assisted suicide they will die, which is bad because they could have lived longer, which is bad because life is good, which is good because… I don’t know, it just is.” And a third person might argue “assisted suicide should be available upon demand, because that way people can die when they want to, which is good because people can control their own bodies, which is good because freedom is good, which is good because… I don’t know, it just is.” I see no reason to prioritize my “just is” over other people’s.

(Of course, you can value freedom or life as a means to happiness. But that’s not what I’m talking about. Many people do, in fact, view both of the above as ends in themselves.)

It’s true that human morality tends to share certain traits cross-culturally, which at first blush is evidence for some kind of objective moral system. But– well, first, I’m disinclined to accept that argument because I’d suddenly have to start believing that obedience to authority and maintaining purity are morally good instead of two of the largest sources of evil. And second, all of the moral beliefs found cross-culturally are things that have obvious evolutionary advantages for a social species. Is it more likely that there are objective morals that we have developed a “moral sense” for, or that humans who were loyal to their friends tended to survive better than humans that weren’t? And if the former, do vampire bats have a moral sense too?

Whenever I say I’m a nihilist, someone immediately concludes that I’m contradicting myself because I have a moral system and do things like “only buy ethically made clothes” and “eat veganish” and “give a tenth of my income to charity” and “blog about social justice.” This makes no sense to me. It’s like saying “saying you like lima beans is a fact about your brain, not a fact about lima beans. Therefore you should like lima beans and microwave popcorn equally!” My morality is an arbitrary preference I have, I arbitrarily happen to prefer happiness to unhappiness, and I act to increase the amount of happiness in the world. Where’s the contradiction?

There are also people who believe that, since I don’t believe morals exist except as human preferences, I shouldn’t judge other people’s morals. This makes no sense to me either. There is only so much happiness I can cause! If I want to maximize human happiness, I need to get other people on board with Happiness and the Maximizing Thereof. Social shaming is an excellent method of convincing people of things, particularly things that are fundamentally arational. (To put it another way: my arbitrary moral preference does not include an arbitrary moral preference for not shaming other people’s arbitrary moral preferences. Nyeh.)

(Arbitrary moral preference has stopped looking like words.)


23 Jan 18:54

It’s A Speaker Dock & Sunglasses. Deal With It. $18

Zephyr Dear

I <3 meh

  • Lousy speaker dock + cheap sunglasses + Internet = hilarity
  • Lenses are polarized, like our community's response to this joke bundle
  • Puts the "deal" in "deal with it"
  • And then takes it right back out again

"Deal sites" must die.

Whaaa? When we only sell one item per day, why would we waste an entire day on a pointless offer like this? A crappy speaker dock and a pair of sunglasses, bundled together to make a played-out meme reference: is that any way to run a deal site?

No, because "deal sites" are bullshit.

Without a community around them, we mean. When they cultivate a wacky just-us-deal-nerds image, but insist on a one-way relationship with their customers just as alienating as Big Retail: "shut up while we push product at you." Meh is a community that happens to have a deal attached. Without the community, we'd basically be Groupon with vastly better copywriting.

No doubt, the deals pay the bills. We love it when people buy stuff from us. We love putting a kick-ass deal together. But just look at the real estate we give to this story, and that video over there, and the forum links, compared to the holy product photo and title and buy button. That's no accident. When we were concocting Meh's DNA, we very intentionally made "Internet" the dominant gene and "store" the recessive one. Content and community come first.

And to prove it, here's a lame deal that won't make us any money. The speaker dock is the same piece of garbage that inspired our original profane rant about stupid sucky speaker docks. The sunglasses aren't anything special, either. But put them together and it's funny! Kind of! If you look at it from the right angle! And possibly after ingesting the right substances!

Consider today a day off from shopping. Check out the video. Answer the poll. Most of all, get into the community and shoot the shit. Tell us how idiotic and insane we are for shooting ourselves in the foot with bad ideas like today's bundle. "Deal with it" by making your feelings known and finding common cause with your fellow disappointed shoppers. That's what you can do in a community. Anything else is just a deal site.

22 Jan 01:03

The Mariner's Rule

by John Michael Greer
One of the things my readers ask me most often, in response to this blog’s exploration of the ongoing decline and impending fall of modern industrial civilization, is what I suggest people ought to do about it all. It’s a valid question, and it deserves a serious answer.

Now of course not everyone who asks the question is interested in the answers I have to offer. A great many people, for example, are only interested in answers that will allow them to keep on enjoying the absurd extravagance that passed, not too long ago, for an ordinary lifestyle among the industrial world’s privileged classes, and is becoming just a little bit less ordinary with every year that slips by.  To such people I have nothing to say. Those lifestyles were only possible because the world’s industrial nations burnt through half a billion years of stored sunlight in a few short centuries, and gave most of the benefits of that orgy of consumption to a relatively small fraction of their population; now that easily accessible reserves of fossil fuels are running short, the party’s over. 

Yes, I’m quite aware that that’s a controversial statement. I field heated denunciations on a regular basis insisting that it just ain’t so, that solar energy or fission or perpetual motion or something will allow the industrial world’s privileged classes to have their planet and eat it too. Printer’s ink being unfashionable these days, a great many electrons have been inconvenienced on the internet to proclaim that this or that technology must surely allow the comfortable to remain comfortable, no matter what the laws of physics, geology, or economics have to say.  Now of course the only alternative energy sources that have been able to stay in business even in a time of sky-high oil prices are those that can count on gargantuan government subsidies to pay their operating expenses; equally, the alternatives receive an even more gigantic “energy subsidy” from fossil fuels, which make them look much more economical than they otherwise would.  Such reflections carry no weight with those whose sense of entitlement makes living with less unthinkable.

I’m glad to say that there are a fair number of people who’ve gotten past that unproductive attitude, who have grasped the severity of the crisis of our time and are ready to accept unwelcome change in order to secure a livable future for our descendants. They want to know how we can pull modern civilization out of its current power dive and perpetuate it into the centuries ahead. I have no answers for them, either, because that’s not an option at this stage of the game; we’re long past the point at which decline and fall can be avoided, or even ameliorated on any large scale.

A decade ago, a team headed by Robert Hirsch and funded by the Department of Energy released a study outlining what would have to be done in order to transition away from fossil fuels before they transitioned away from us. What they found, to sketch out too briefly the findings of a long and carefully worded study, is that in order to avoid massive disruption, the transition would have to begin twenty years before conventional petroleum production reached its peak and began to decline. There’s a certain irony in the fact that 2005, the year this study was published, was also the year when conventional petroleum production peaked; the transition would thus have had to begin in 1985—right about the time, that is, that the Reagan administration in the US and its clones overseas were scrapping the promising steps toward just such a transition.

A transition that got under way in 2005, in other words, would have been too late, and given the political climate, it probably would have been too little as well. Even so, it would have been a much better outcome than the one we got, in which most of us have spent the last ten years insisting that we don’t have to worry about depleting oilfields because fracking was going to save us all. At this point, thirty years after the point at which we would have had to get started, it’s all very well to talk about some sort of grand transition to sustainability, but the time when such a thing would have been possible came and went decades ago. We could have chosen that path, but we didn’t, and insisting thirty years after the fact that we’ve changed our minds and want a different future than the one we chose isn’t likely to make any kind of difference that matters.

So what options does that leave? In the minds of a great many people, at least in the United States, the choice that apparently comes first to mind involves buying farmland in some isolated rural area and setting up a homestead in the traditional style. Many of the people who talk enthusiastically about this option, to be sure, have never grown anything more demanding than a potted petunia, know nothing about the complex and demanding arts of farming and livestock raising, and aren’t in anything like the sort of robust physical condition needed to handle the unremitting hard work of raising food without benefit of fossil fuels; thus it’s a safe guess that in most of these cases, heading out to the country is simply a comforting daydream that serves to distract attention from the increasingly bleak prospects so many people are facing in the age of unraveling upon us.

There’s a long history behind such daydreams. Since colonial times, the lure of the frontier has played a huge role in the American imagination, providing any number of colorful inkblots onto which fantasies of a better life could be projected. Those of my readers who are old enough to remember the aftermath of the Sixties counterculture, when a great many young people followed that dream to an assortment of hastily created rural communes, will also recall the head-on collision between middle-class fantasies of entitlement and the hard realities of rural subsistence farming that generally resulted. Some of the communes survived, though many more did not; that I know of, none of the surviving ones made it without a long and difficult period of readjustment in which romantic notions of easy living in the lap of nature got chucked in favor of a more realistic awareness of just how little in the way of goods and services a bunch of untrained ex-suburbanites can actually produce by their own labor.

In theory, that process of reassessment is still open. In practice, just at the moment, I’m far from sure it’s an option for anyone who’s not already traveled far along that road. The decline and fall of modern industrial civilization, it bears repeating, is not poised somewhere off in the indefinite future, waiting patiently for us to get ready for it before it puts in an appearance; it’s already happening at the usual pace, and the points I’ve raised in posts here over the last few weeks suggest that the downward slope is probably going to get a lot steeper in the near future. As the collapse of the fracking bubble ripples out through the financial sphere, most of us are going to be scrambling to adapt, and the chances of getting everything lined up in time to move to rural property, get the necessary equipment and supplies to start farming, and get past the worst of the learning curve before crunch time arrives are not good.

If you’re already on a rural farm, in other words, by all means pursue the strategy that put you there. If your plans to get the necessary property, equipment, and skills are well advanced at this point, you may still be able to make it, but you’d probably better get a move on. On the other hand, dear reader, if your rural retreat is still off there in the realm of daydreams and good intentions, it’s almost certainly too late to do much about it, and where you are right now is probably where you’ll be when the onrushing waves of crisis come surging up and break over your head.

That being the case, are there any options left other than hiding under the bed and hoping that the end will be relatively painless? As it happens, there are.

The point that has to be understood to make sense of those options is that in the real world, as distinct from Hollywood-style disaster fantasies, the end of a civilization follows the famous rule attributed to William Gibson: “The future is already here, it’s just not evenly distributed yet.”  Put another way, the impacts of decline and fall aren’t uniform; they vary in intensity over space and time, and they impact particular systems of a falling civilization at different times and in different ways.  If you’re in the wrong place at the wrong time, and depend on the wrong systems to support you, your chances aren’t good, but the places, times, and systems that take the brunt of the collapse aren’t random. To some extent, those can be anticipated, and some of them can also be avoided.

Here’s an obvious example. Right now, if your livelihood depends on the fracking industry, the tar sands industry, or any of the subsidiary industries that feed into those, your chances of getting through 2015 with your income intact are pretty minimal.  People in those industries who got to witness earlier booms and busts know this, and a good many of them are paying off their debts, settling any unfinished business they might have, and making sure they can cover a tank of gas or a plane ticket to get back home when the bottom falls out. People in those industries who don’t have that experience to guide them, and are convinced that nothing bad can actually happen to them, are not doing these things, and are likely to end up in a world of hurt when their turn comes.

They’re not the only ones who would benefit right now from taking such steps. A very large part of the US banking and finance industry has been flying high on bloated profits from an assortment of fracking-related scams, ranging from junk bonds through derivatives to exotic financial fauna such as volumetric production payments. Now that the goose that laid the golden eggs is bobbing feet upwards in a pond of used fracking fluid, the good times are coming to a sudden stop, and that means sharply reduced income for those junior bankers, brokers, and salespeople who can keep their jobs, and even more sharply reduced prospects for those who don’t.

They’ve got plenty of company on the chopping block.  The entire retail sector in the US is already in trouble, with big-box stores struggling for survival and shopping malls being abandoned, and the sharp economic downturn we can expect as the fracking bust unfolds will likely turn that decline into freefall, varying in intensity by region and a galaxy of other factors. Those who brace themselves for a hard landing now are a good deal more likely to make it than those who don’t, and those who have the chance to jump to something more stable now would be well advised to make the leap.

That’s one example; here’s another. I’ve written here in some detail about how anthropogenic climate change will wallop North America in the centuries ahead of us. One thing that’s been learned from the last few years of climate vagaries is that North America, at least, is shifting in exactly the way paleoclimatic data would suggest—more or less the same way it did during warm periods over the last ten or twenty million years. The short form is that the Southwest and mountain West are getting baked to a crackly crunch under savage droughts; the eastern Great Plains, Midwest, and most of the South are being hit by a wildly unstable climate, with bone-dry years alternating with exceptionally soggy wet ones; while the Appalachians and points eastward have been getting unsteady temperatures but reliable rainfall. Line up your choice of subsistence strategies next to those climate shifts, and if you still have the time and resources to relocate, you have some idea where to go.

All this presumes, of course, that what we’re facing has much more in common with the crises faced by other civilizations on their way to history’s compost heap than it does with the apocalyptic fantasies so often retailed these days as visions of the immediate future. I expect to field a flurry of claims that it just ain’t so, that everything I’ve just said is wasted breath because some vast and terrible whatsit will shortly descend on the whole world and squash us like bugs. I can utter that prediction with perfect confidence, because I’ve been fielding such claims over and over again since long before this blog got started. All the dates by which the world was surely going to end have rolled past without incident, and the inevitable cataclysms have pulled one no-show after another, but the shrill insistence that something of the sort really will happen this time around has shown no sign of letting up. Nor will it, since the unacceptable alternative consists of taking responsibility for doing something about the future.

Now of course I’ve already pointed out that there’s not much that can be done about the future on the largest scale. As the fracking bubble implodes, the global economy shudders, the climate destabilizes, and a dozen other measures of imminent crisis head toward the red zone on the gauge, it’s far too late in the day for much more than crisis management on a local and individual level. Even so, crisis management is a considerably more useful response than sitting on the sofa daydreaming about the grandiose project that’s certain to save us or the grandiose cataclysm that’s certain to annihilate us—though these latter options are admittedly much more comfortable in the short term.

What’s more, there’s no shortage of examples in relatively recent history to guide the sort of crisis management I have in mind. The tsunami of discontinuities that’s rolling toward us out of the deep waters of the future may be larger than the waves that hit the Western world with the coming of the First World War in 1914, the Great Depression in 1929, or the Second World War in 1939, but from the perspective of the individual, the difference isn’t as vast as it might seem. In fact, I’d encourage my readers to visit their local public libraries and pick up books about the lived experience of those earlier traumas. I’d also encourage those with elderly relatives who still remember the Second World War to sit down with them over a couple of cups of whatever beverage seems appropriate, and ask about what it was like on a day-by-day basis to watch their ordinary peacetime world unravel into chaos.

I’ve had the advantage of taking part in such conversations, and I’ve also done a great deal of reading about historical crises that have passed below the horizon of living memory. There are plenty of lessons to be gained from such sources, and one of the most important also used to be standard aboard sailing ships in the days before steam power. Sailors in those days had to go scrambling up the rigging at all hours and in all weathers to set, reef, or furl sails; it was not an easy job—imagine yourself up in the rigging of a tall ship in the middle of a howling storm at night, clinging to tarred ropes and slick wood and trying to get a mass of wet, heavy, wind-whipped canvas to behave, while below you the ship rolls from side to side and swings you out over a raging ocean and back again. If you slip and you’re lucky, you land on deck with a pretty good chance of breaking bones or worse; if you slip and you’re not lucky, you plunge straight down into churning black water and are never seen again.

The rule that sailors learned and followed in those days was simple: “One hand for yourself, one hand for the ship.” Every chore that had to be done up there in the rigging could be done by a gang of sailors who each lent one hand to the effort, so the other could cling for dear life to the nearest rope or ratline. Those tasks that couldn’t be done that way, such as hauling on ropes, took place down on the deck—the rigging was designed with that in mind. There were emergencies where that rule didn’t apply, and even with the rule in place there were sailors who fell from the rigging to their deaths, but as a general principle it worked tolerably well.

I’d like to propose that the same rule might be worth pursuing in the crisis of our age. In the years to come, a great many of us will face the same kind of scramble for survival that so many others faced in the catastrophes of the early 20th century. Some of us won’t make it, and some will have to face the ghastly choice between sheer survival and everything else they value in life. Not everyone, though, will land in one or the other of those categories, and many of those who manage to stay out of them will have the chance to direct time and energy toward the broader picture.

Exactly what projects might fall into that latter category will differ from one person to another, for reasons that are irreducibly personal. I’m sure there are plenty of things that would motivate you to action in desperate times, dear reader, that would leave me cold, and of course the reverse is also true—and in times of crisis, of the kind we’re discussing, it’s personal factors of that sort that make the difference, not abstract considerations of the sort we might debate here. I’ll be discussing a few of the options in upcoming posts, but I’d also encourage readers of this blog to reflect on the question themselves: in the wreck of industrial civilization, what are you willing to make an effort to accomplish, to defend, or to preserve?

In thinking about that, I’d encourage my readers to consider the traumatic years of the early 20th century as a model for what’s approaching us. Those who were alive when the first great wave of dissolution hit in 1914 weren’t facing forty years of continuous cataclysm; as noted here repeatedly, collapse is a fractal process, and unfolds in real time as a sequence of crises of various kinds separated by intervals of relative calm in which some level of recovery is possible. It’s pretty clear that the first round of trouble here in the United States, at least, will be a major economic crisis; at some point not too far down the road, the yawning gap between our senile political class and the impoverished and disaffected masses promises the collapse of politics as usual and a descent into domestic insurgency or one of the other standard patterns by which former democracies destroy themselves; as already noted, there are plenty of other things bearing down on us—but after an interval, things will stabilize again.

Then it’ll be time to sort through the wreckage, see what’s been saved and what can be recovered, and go on from there. First, though, we have a troubled time to get through.
21 Jan 23:31

Photo



21 Jan 18:11

Freedom, control and good ideas

by Seth Godin

Where should great programmers choose to work?

[I say 'choose' because anyone who has worked with programmers understands that the great ones are worth far more than the average ones. Sometimes 50 times as much. That's because great programmers are able to architect systems that are effective, that scale, and that do things that other programmers can't imagine until after they're done.]

While this is a post about people who work to become great programmers, I think it applies to most fields, including sales and design.

Many programmers are drawn to famous, hip, growing tech companies. There are literally tens of thousands of programmers working at Apple, Google and Facebook, and each company receives more than a thousand resumes a day.

It might not be a great choice, though. Not for someone willing to exchange the feeling of security for the chance to matter.

The first challenge is freedom: Not just the freedom to plan your day and your projects, but the freedom to try new things, to go all the way out to the edge, to launch things that might not work.

A key element of freedom is control. Controlling what you work on and how you do it. If you are part of a team of a hundred people working on an existing piece of software, you will certainly learn a lot. But the areas you have control over, responsibility for, the ability to change—are small indeed.

The team that built the Mac (arguably one of the most important software teams in history) was exactly the right size for each member to have freedom and control while also shipping important work. 

Alas, when an organization gets bigger, the first technical choice they make is to build systems based on programming jobs that don't need brilliant engineers. The most reliable way to build a scalable, predictable industrial organization is to create jobs that can be done by easily found (and replaced) workers. Which means less freedom and less control for the people who do the work, and more freedom and more control for the organization.

When faced with the loss of freedom and control, many talented people demand an increase in security and upside. That's one big reason (irony alert) that fast-growing companies go public—so they will have the options currency to pay their team handsomely, which puts the future of the company in the hands of Wall Street, which will happily exchange stock price growth for the banality of predictable. This, of course, leads to programmers losing even more freedom and even more control.

It's entirely possible that an industrialized organization is going to change the world, but they're going to do it with you or without you.

The alternative, as talented outliers like Marco Arment have shown us, is to take a good idea (like Tumblr or Overcast) and make it into something great.

The challenges here are that finding a great idea is a lot of work (and a distinct skill) and making it into a company that succeeds is a lot of work as well. Programmers who do both those jobs are often left fighting for the time to do the programming they actually love to do. (Mark Zuckerberg decided to give up serious programming at Facebook, Dave Filo chose not to at Yahoo).

The alternative? Be as active in finding the right place to work as great founders are in finding you. The goal might not be to find a famous company or even a lucrative gig. Instead, you can better reach your potential by finding the small shop, the nascent organization, the powerful agent of change that puts you on the spot on a regular basis. 

This is a lot of work. Not only do you need to do your job every day, and not only do you need to continually hone your skills and get ever better at your work, but now you're expected to spend the time and energy to find clients/bosses/a team where you are respected and challenged and given the freedom and control to do even better work.

If I were a great programmer, I'd be spending the time to figure out what I'd want my day to look like, then going to events, startup weekends, VC firms and other places where good idea people are found. The best jobs might be the most difficult to find.

Bernie Taupin needed Elton John as much as John needed Taupin.

You can't get away with this strategy of self-selection if you're simply a good programmer. It won't work if you don't have a point of view about your craft and if you need management supervision in order to ship great code. You need to build a trail that proves you're as good as you assert you are. But those are all skills, skills worth acquiring in an age when they are worth more than ever before. 

Once you have those chops, though, the onus is on you to choose not to be a cog in a well-oiled machine that will rob you of freedom and control, not to mention the personal development and joy that come with a job where you matter.

To be really clear, it's entirely possible to be a great programmer doing important work at a big company. But those companies must work overtime to create an environment where systems-creep doesn't stifle the desire and talent of the best people on the team.

The naive person wonders, "how come so many great architects build iconic buildings early in their career?" In fact, the truth is:

doing the work that earns a commission for an iconic building makes you into a great architect.

Michael Graves and Zaha Hadid didn't wait for someone to offer them a great project. They went and got it.

[If this resonates with you, I might have precisely the right gig for the right programmer. You can read the details here. If you know someone, please share.]

21 Jan 18:05

Why Chaucer Now? #MLA15

by Jeffrey Cohen
My daughter made me this for MLA
by J J Cohen

I had the honor of participating in a lively roundtable at the most recent MLA on "Why Chaucer Now?" Emma Lipton put the roundtable together and moderated the session so well that the conversation was prolonged and lively. Audience participation and MLA seldom go hand in hand. Inevitably papers go far too long (senior scholars seem especially bad at timing themselves; at least I'd like to think it's a timing issue rather than presumption). Chairs often exhibit a "What can you do?" attitude -- forgetting that some people have traveled very long distances for the conference, and would rather have engagement with the speakers than the live delivery of a performance that could have more comfortably been viewed via video at home.  Admittedly, we also lost two of our Chaucer speakers to the hazards of travel -- but I liked that instead of filling in the opened space with longer presentations the audience was invited to prod and provoke.

Patricia Ingham started the session with an enormously helpful historicization of the question that had brought us together as a panel: should the MLA combine divisions like Old English, Chaucer, and Middle English Exclusive of Chaucer? She ruminated over a long archive of the MLA pondering such bureaucratic and institutional questions embedded within historical ones, and emphasized the public pedagogical outreach that the organization used to do (e.g., being involved with the teaching of Chaucer in the high school curriculum). Nicole Nolan Sidhu spoke of the flourishing of supposedly surpassed racism and misogyny within internet porn (so easily accessible, yet "private" and thereby not amenable to intervention or challenge) and used Chaucer to draw an alternative history of obscenity as a public and reconfigurable discourse.

I'd decided that my own contribution would be about the teaching of Chaucer, and that I'd involve my students in its composition. I'm happy I did. One more quick note and then I'll post it. Emma asked me before I presented how important Chaucer is to my work, and not that long ago I would have said not all that much. I realize that isn't true though. Yes, I've written a few essays, contributed to handbooks, have a Chaucer chapter in one of my books, and so on. But doing the index for Stone and seeing the sheer number of references to Chaucer permeating a book that is in no way about him made me realize that even when I don't think I'm engaging his work, I often quietly am.

Why Chaucer?


            To the question “Why Chaucer?” my response not long ago would have been historicist, emphasizing inertia and disciplinary conservatism. Because we’ve been teaching him in classrooms for so long, one of the few single author courses still on the books at many institutions. Because his London dialect became ascendant, and does not require the depth of special training needed to read other medieval works like the poems of the Gawain poet or anything in Old English. Because the academy is conservative. Because we are inheritors of the pronouncement that Chaucer is the father of English poetry, and even though we know we don’t need a primal father we continue to canonize him through our specialty societies, our publications, our MLA divisions. Because most of us teach in English rather than Literature Departments (and some strong challenges to Chaucer’s supremacy have come from scholars attempting to restore to the British archipelago its roiled diversity of cultures and tongues). Feminism and postcolonial studies can buttress Chaucer’s position at the generative center of medieval English literature, but they have also made us see that Marie de France was just as sophisticated a poet (too bad she wrote in French), and that trilingual John Gower conveys the polyglot truth of late 14th century literature better. We teach and study Chaucer because that’s the field we have been trained to teach and study. Chaucer will vanish as medievalist jobs in English Departments do (followed most likely by English Departments themselves: it's interesting to think who will be coming to these MLA meetings a hundred years from now).

            Why Chaucer? My answer is more complicated now than it would have been last summer. This semester, my twentieth at George Washington University, saw me teaching the Canterbury Tales to undergraduates yet again. Three things occurred that changed how I teach and think about Chaucer’s works. First, the Cengage-owned Riverside Chaucer is now so expensive even in paperback that no professor can reasonably ask students to purchase the volume. For the first time I ordered Jill Mann’s edition of the Canterbury Tales (Penguin Classics, $14). That change in text meant I taught the class from a fresh, clean book lacking the decades of marginalia that have accrued on my Riverside (an amply glossed hardcover held together by duct tape). It was liberating to let those textual predeterminations go, an inscribed history of my normative training as a medievalist. Second, during the fall semester of 2014, it was impossible to close the door of any classroom and expect to bar the entry of the aftermath of the racism and violence in Ferguson, or the growing awareness of the depth and persistence of rape culture on college campuses. The former president of GW made some victim-blaming remarks about drinking and sexual assault just before I taught the Wife of Bath’s Tale, with its casual narration of a knight who rapes a maiden “maugree hir hed.” I teach that text on day three of the course, so our concerns were clear from the start. Third, I was fortunate enough to have 24 extraordinary students in the class, and they were diverse. 
Two of them changed the class profoundly, at least from my point of view: K., a student who declared on the first day that they would like to be referred to by third person plural and thereby gender indeterminate pronouns (I took it as a good sign that all the students in the room simply nodded to the request, registering no surprise; and that they were later happy to find in the Pardoner and John/Eleanor Rykener medieval genderqueers); and C., a student who had taken a previous class with me, who emailed me after reading the Wife of Bath’s tale to ask if sexual assault would be a frequent topic in the class because she would likely not be able to sit through discussions. I arranged it so that C. knew in advance what classes would probably touch upon rape – and with Chaucer, that is quite a few. She sat near the door, and if she felt the need, she left. I assured her that was no problem. We had three GW students commit suicide in the spring and one more publicly attempt it in the fall semester: those deaths took a toll on my students, and made me realize we need to be more vigilant to signs of distress. I told C. that if leaving class is something she needs to do as an act of self-preservation, then I applaud her for speaking to me and choosing not to endure an unbearable topic. I never understood the recent furor over trigger warnings. They don’t limit discussion, they simply respect the fact that not every one of our students has had the life we wish them to have had.

            Keeping C. in mind, I dreaded discussion of the Reeve’s Tale, with its vengeance rapes. I always teach it with the Cook’s Tale, so we started there, and using the work of Paul Strohm and David Wallace to talk about London, community making, exclusion, and violence directed at those identified as foreign bodies (the Cook’s Flemish proverb as invitation to speak of the murder of the Flemings in 1381 in Chaucer’s boyhood neighborhood). We circled around the edges of the Reeve’s Tale, speaking of dialect and humor, the animal noises and animal desires, the glimpses of rural life, the fabliau structure. But we did not speak directly of the sexual violence at the center of the tale, the events that within my glossed Riverside are given a long contextual history as pranks and topoi, with the aubade supposed to be especially funny. Instead I told my students when we reached the bedroom scene that they know what comes next, that rape is not entertainment, that I did not want to participate in a long history of making light of sexual assault. I asked them to leave thinking about the source of humor in the tale and the essential conversation about rape culture that college campuses are having now. They departed in an uncharacteristically somber mood.


            And that’s all to the good. As I brought the class to a close in December, I asked my students the “Why Chaucer?” question that we are pondering together today. Their answers surprised me. Because he is someone to disagree with as well as be inspired by. Because he is difficult, complicated, artful. Because of the problems he conveys. Because his idea of fellowship as an unlikely gathering of the diverse gets enacted when we read his works together. Because having to learn Middle English together reduces every student in the class to the same starting point, to the same position of shared vulnerability. Because the pilgrims never exactly arrive in Canterbury, and the conversation they start is worth carrying on.

21 Jan 17:16

kawaiibooties: things you should know: peoples sexualities can change so can their gender and their...

kawaiibooties:

things you should know:

  • people’s sexualities can change
  • so can their gender
  • and their romantic attraction
  • people are not limited to being one sexuality/gender just because at one point they identified as it
21 Jan 02:07

NRA: Antichrist & Sons

by Fred Clark

Nicolae: The Rise of Antichrist; pp. 292-294

Here we arrive at one of the oddest variations in Tim LaHaye’s “Bible prophecy” prophesying. He strays from the main End Times script here, introducing a new element to the story that you won’t find in the footnotes to your Scofield Reference Bible.

The main outline of this “Bible prophecy” story was firmly established more than a century ago by Darby and Scofield. You got your Rapture, then your Antichrist and Tribulation with your seals and trumpets and vials of destruction. Then you got your Armageddon and your Second (Third?) Coming, your Great White Throne sorting and The End. So far in the Left Behind series, Tim LaHaye has offered only minor variations in that main story, mostly only tweaking arcane details — when do the Two Witnesses begin preaching? — that will be of interest only to those well-versed in the minutiae of End Times fandom.

But here, halfway through the third book in his series, LaHaye introduces something quite original.

Jerry Jenkins sets the scene for us, and since no one is on the telephone or in an airport, that must mean …

Despite his years of flying, Rayford had never found a cure for jet lag, especially going east to west. His body told him it was the middle of the evening, and after a day of flying, he was ready for bed. But as the DC-10 taxied toward the gate in Milwaukee, it was noon Central Standard Time. Across the aisle from him, the beautiful and stylish Hattie Durham slept. Her long blonde hair was in a bun, and she had made a mess of her mascara trying to wipe away her tears.

That short paragraph displays several of Jenkins’ signature touches in these books: an obsession with travel logistics, clichéd sexist stereotypes, and a casual disregard for massive suffering and death.

Hattie hasn’t been crying because flying into Milwaukee instead of the recently destroyed O’Hare forces her to think of the nuclear destruction of Chicago a few days ago. She isn’t crying over the many friends, colleagues, memories and beloved places she lost in that very recent senseless horror. No one in these books will ever cry about that, or even give it much of a second thought.

No, Hattie has been crying because her relationship with Nicolae Carpathia has been going badly. Sure, he may be People magazine’s “Sexiest Man Alive” two years in a row, beloved by nearly everyone on the planet, but he’s also apparently a sub-optimal boyfriend.

She had wept off and on almost the entire flight. Through two meals, a movie, and a snack, she had unburdened herself to Rayford. She had lost her love for the man. She didn’t understand him. While she wasn’t ready to say he was the Antichrist, she certainly was not as impressed with him behind closed doors as most of his public was with him.

It’s the little things, you know? He’s inattentive. He can be moody, distracted. Oh, and some days he just goes around nuking cities and killing tens of millions of people for no apparent reason.

Most troubling to Rayford was Hattie’s turmoil over her pregnancy. He wished she would refer to what she was carrying as a child. But it was a pregnancy to her, an unwanted pregnancy.

Do you sense that Rayford and the authors are warming up for a big sermon on abortion? Because, yes, Rayford and the authors are warming up here for a big sermon on abortion.

Do you expect that such a thing may be horrible? Right again. It’s horrible.

I’m not sure if it’s worse for readers who are opposed to abortion or for readers who support abortion rights. In the latter case, you’re certainly not going to want to read a clumsy, straw-man anti-abortion rant delivered by Rayford Steele to his former pseudo-mistress. But in the former case, I would imagine that Rayford is probably just about the last person you’d want to see made the official spokesman and embodiment of your perspective.

We’ll contend with the particular horrors of that in our next installment. Here I just want to step back and focus on the context that prompts that discussion, which is this: The Antichrist is having a baby!

This is not part of the usual End Times check list or an idea even considered in most “Bible prophecy” conferences. It’s just not part of the Rapture/Tribulation/Armageddon story as told and re-told over the years. Tim LaHaye is introducing a new element here — something that never would have occurred to Scofield or Ironsides or Hal Lindsey or John Walvoord.

None of those other “Bible prophecy scholars” have included any discussion of the Antichrist’s baby in their versions of this scheme.

That’s partly because it’s not a “Bible prophecy” that’s prophesied anywhere in the Bible. That’s true of most of the items on the usual End Times check list, of course (which is why, after all this time, I still use scare quotes for that phrase “Bible prophecy”). But there’s still a logic of sorts that’s usually invoked to justify most of the things “Bible prophecy” types say the Bible predicts about the future. Some of these predictions are based on specific texts — the massive earthquake described in Revelation 6, for example. Other predictions are justified indirectly — through the application of the “prophetic” decoder formulas, or through numerology, or even raw assertion based on the flimsiest of pretexts (“Magog” = Moscow!).

But nowhere in any of that has anyone previously located a verse, or a hidden coded message, or a symbological divination, that suggested that the Antichrist is going to have a baby.

Remember where this character — “the Antichrist” — comes from. It’s a composite gleaned from tyrants and villains scattered throughout the Bible. You start with the “Beast” and the “conqueror” in John’s Apocalypse, and then you add a bunch of details from every other biblical story that involves any wicked ruler — Pharaoh, Ahab, Nebuchadnezzar, Herod, etc. A major theme in most of those stories is that wicked rulers will not ultimately prevail — they will be cast down, overthrown, and cut off. Remember what happened to Pharaoh’s first-born? That’s the general pattern. And since “the Antichrist” was assembled as the culmination of this pattern, the character has never been presented as having an heir to the throne.

The idea also just doesn’t fit with the rest of the scheme. It doesn’t work with the End Times timetable: Rapture, Antichrist, Tribulation, Armageddon, The End. That all takes place in seven short years. So even if the Antichrist gets busy gettin’ busy immediately after the Rapture, his oldest children would only be 6 years old when Killer Jesus came back to wipe out their daddy and all his minions.

So what’s the point? Why introduce the idea of the Antichrist’s baby into the established story? Why would the Antichrist bother with this, and why would the authors bother with having him do so? And what does it mean for the rest of the story to change this part of it in this way?

One thing it might suggest is that Nicolae Carpathia doesn’t know that he’s the Antichrist. He doesn’t seem to realize that history is in its final throes, with the last curtain closing in less than six years from this point in the story.

Unfortunately, that doesn’t seem to work with the rest of what we’ve been told or shown about Nicolae. I’ve wished it were true, because I think his character — and the entire story — would be a lot more interesting if he had no idea that he was the Antichrist, and no idea of all the “Bible prophecies” and End Times check list events he was required/predetermined to fulfill. But the authors have never given us any other motive or explanation for Nicolae’s behavior. His agenda — from the construction of New Babylon to the rebuilding of the Temple in Jerusalem — is just too weird and arbitrary to be explained by anything other than his working from the same prophecy check list that’s pinned up on the wall of Bruce Barnes’ study.

So he’s gotta know he’s the Antichrist, and he surely knows the final countdown of the seven-year Tribulation is under way.

But maybe he really thinks he can win. Maybe he imagines that he’ll be able to change the predetermined outcome at Armageddon, preventing the End of the World and thereby continuing his reign as global dictator for the rest of his natural life. In that case, it might make sense for him to want children — to raise an heir who could inherit his throne. Maybe Nicolae really thinks he has a chance of keeping history and the world going longer than just the next five and a half years, long enough to see his child graduate from school, maybe get married some day and make him not just a father but a grandfather. …

But that scenario – the Antichrist-who-thinks-he-can-win-Armageddon possibility — raises another huge problem. It turns the whole story upside-down. It gives us a story in which powerful forces are at work trying to bring about the end of the world, while one man rises up to oppose them — fighting to save the world and to prevent the slaughter of billions of people. If that’s the story, then there’s no way that guy isn’t the hero of the story — even if that guy is also the Antichrist.

Fighting to save the world makes you the hero. That’s how that always works.

So what’s left? What else could possibly explain why LaHaye and Jenkins felt the need to change the existing folklore and storyline of the End Times time line? What made them decide to have the Antichrist have a baby? There’s nothing in their “Bible prophecies” to require such a change, and it doesn’t seem to fit in with the rest of the story they’re trying to tell, so why add this?

I suspect it has to do with that horrible sermon Rayford is about to give Hattie on the evils of abortion.

Hal Lindsey wrote a series of blockbuster best-selling “Bible prophecy” books back in the 1970s. White American evangelical Christians loved The Late Great Planet Earth and There’s a New World Coming, embracing those books as a reflection of their beliefs and values and concerns. But neither of those books had anything to say about abortion.

By 1995, it was no longer possible to write for that white evangelical audience without mentioning abortion. Abortion politics had become a central part — perhaps the central part — of white evangelicalism, and for LaHaye’s reboot of Lindsey’s earlier formula to work in the 1990s, it was going to have to include plenty of anti-abortion material, even if that meant shoe-horning it into the story through an awkward innovation that doesn’t otherwise fit and serves no other function.

20 Jan 19:07

3 ways to monkey-patch without making a mess

Monkey Patching. When you first try Ruby, it’s amazing. You can add methods right to core classes! You don’t have to call Time.now.advance(days: -1), you can write 1.day.ago! It makes Ruby a joy to read and write. Until…

You hit weird bugs because a patch changed Hash.

You get confused about which code actually ran, so you can’t debug it when it breaks.

And you finally figure out that all your problems were caused six months ago, when you monkey-patched Enumerable to make one line of code five characters shorter.

But what’s the alternative? No convenience at all? Code that looks like GroupableArray.new([1, 2, 3, 4]).in_groups_of(2)? Waiting for blank? to make it into core Ruby before you’re allowed to have nice user input handling?

You don’t want to give up on monkey patches entirely. But how can you write monkey patches that won’t make you want to fire yourself for incompetence the next time you see them?

Put them in a module

When you monkey patch a class, don’t just reopen the class and shove your patch into it:

class DateTime
  def weekday?
    !sunday? && !saturday?
  end
end

Why not?

  • If two libraries monkey-patch the same method, you won’t be able to tell.

    The first monkey-patch will get overwritten and disappear forever.

  • If there’s an error, it’ll look like the error happened inside DateTime.

    While technically true, it’s not that helpful.

  • It’s harder to turn off your monkey patches.

    You have to either comment out your entire patch, or skip requiring your monkey patch file if you want to run code without it.

  • If you, say, forgot to require 'date' before running this monkey patch, you’ll accidentally redefine DateTime instead of patching it.
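That first pitfall is easy to demonstrate. In the sketch below, the two reopenings of DateTime stand in for two hypothetical, unrelated gems; whichever file happens to load last wins, and nothing tells you the first definition is gone:

```ruby
require 'date'

# Pretend each reopening lives in a different gem (hypothetical names).
class DateTime
  # "Gem A": a weekday is Monday through Friday.
  def weekday?
    !sunday? && !saturday?
  end
end

class DateTime
  # "Gem B": silently replaces Gem A's version.
  def weekday?
    !sunday?
  end
end

# 2015-01-17 was a Saturday. Gem A's patch would say false,
# but Gem B's definition won, with no error raised.
DateTime.new(2015, 1, 17).weekday?  # => true
```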

Instead, put your monkey patches in a module:

module CoreExtensions
  module DateTime
    module BusinessDays
      def weekday?
        !sunday? && !saturday?
      end
    end
  end
end

This way, you can organize related monkey patches together. When there’s an error, it’s clear exactly where the problem code came from. And you can include them one group at a time:

# Actually monkey-patch DateTime
DateTime.include CoreExtensions::DateTime::BusinessDays

If you don’t want the patch anymore, just comment out that line.
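The whole pattern fits in one small, runnable file. Here is a sketch, with two arbitrary dates (a Monday and a Sunday) chosen to exercise the patch:

```ruby
require 'date'

module CoreExtensions
  module DateTime
    module BusinessDays
      def weekday?
        !sunday? && !saturday?
      end
    end
  end
end

# The one line that actually applies the patch.
DateTime.include CoreExtensions::DateTime::BusinessDays

DateTime.new(2015, 1, 19).weekday?  # => true  (a Monday)
DateTime.new(2015, 1, 18).weekday?  # => false (a Sunday)

# A side benefit of the module approach: the patch shows up in the
# ancestor chain, so you can see exactly where a method came from.
DateTime.ancestors.include?(CoreExtensions::DateTime::BusinessDays)  # => true
```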

Keep them together

When you monkey patch core classes, you add to the core Ruby APIs. Every app with core patches feels a little bit different. So you have to have a way to quickly learn those changes when you jump into a new codebase. You have to know where your monkey patches live.

I mostly follow Rails’ monkey-patching convention. Patches go into lib/core_extensions/class_name/group.rb. So this patch:

module CoreExtensions
  module DateTime
    module BusinessDays
      def weekday?
        !sunday? && !saturday?
      end
    end
  end
end

would go into lib/core_extensions/date_time/business_days.rb.

Any new developer could browse through the Ruby files in lib/core_extensions and learn what you added to Ruby. And they’ll actually use those convenient new methods you wrote, instead of those methods just getting in the way.

Think through the edge cases

I don’t know why Enumerable doesn’t have a sum method. So often, I wish I could write [1, 2, 3].sum, or ["a", "b", "c"].sum, or [Article.new, Article.new, Article.new].sum… Oh.

When you monkey patch a class, you’re usually thinking about one thing you want to make easier. You want to calculate a sum of numbers, but forget that Arrays can hold other things.

Right now, it makes sense. You’d never try to calculate the average of a bunch of hashes. But when you attach methods to an object that just plain fail when you call them sometimes, you’ll confuse yourself later on.

You can deal with this in a few ways. From best to worst:

  • Handle unexpected input reasonably.

    This works well if your patch works with strings. You can get something reasonable from almost anything if you call to_s on it first. And Confident Ruby will teach you a ton of ways to deal with different kinds of input.

  • Handle the error in a clearer way.

    This could be as easy as throwing an ArgumentError with a good message when you see input you’re not expecting. Don’t depend on someone else understanding random NoMethodErrors.

  • Document the kind of input you expect in a comment.

    The other two options are better, if you can use them. But if you can’t check for edge cases inside your patch, at least document them. That way, once your caller figures out it’s your patch that’s causing their problem, they’ll have an idea of what you were trying to do.
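The second option, failing fast with a clear ArgumentError, might look like this for a sum-style patch (the module and method names here are illustrative, not from the post):

```ruby
module CoreExtensions
  module Enumerable
    module Summing
      # Illustrative patch: sum numbers, and reject anything else
      # loudly instead of letting a confusing NoMethodError surface later.
      def sum_numbers
        each do |element|
          unless element.is_a?(Numeric)
            raise ArgumentError,
                  "sum_numbers expects Numerics, got #{element.class}"
          end
        end
        inject(0) { |total, element| total + element }
      end
    end
  end
end

Array.include CoreExtensions::Enumerable::Summing

[1, 2, 3].sum_numbers   # => 6
[].sum_numbers          # => 0
# [{}, {}].sum_numbers  # raises ArgumentError: sum_numbers expects Numerics, got Hash
```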

My all-time favorite monkey patch

Finally, I want to leave you with my all-time favorite monkey patch: Hash#string_merge:

lib/core_extensions/hash/merging.rb
module CoreExtensions
  module Hash
    module Merging
      def string_merge(other_hash, separator = " ")
        merge(other_hash) {|key, old, new| old.to_s + separator + new.to_s}
      end
    end
  end
end

{}.string_merge({:class => "btn"}) # => {:class=>"btn"}

h = {:class => "btn"} # => {:class=>"btn"}
h.string_merge({:class => "btn-primary"}) # => {:class=>"btn btn-primary"}

It makes attaching CSS classes to HTML elements so much nicer.

Sensible monkey patching

Monkey-patching core classes isn’t all bad. When you do it well, it makes your code feel more like Ruby. But just like any of Ruby’s sharp edges, you have to take extra care when you use them.

If you keep your patches together, group them into modules, and handle the unexpected, your monkey patches will be as safe as they can be.

What’s the best monkey patch you’ve ever written (or seen)? Leave a comment and tell me about it!

20 Jan 19:04

Five Reasons Why Gendered Products are a Problem

by Lisa Wade, PhD

Our Pointlessly Gendered Products Pinterest board is funny, no doubt. When people make male and female versions of things like eggs, dog shampoo, and pickles, you can’t help but laugh. But, of course, it’s not just funny. Here are five reasons why.

1. Pointlessly gendered products affirm the gender binary.

Generally speaking, men and women today live extraordinarily similar lives. We grow up together, go to the same schools, and have the same jobs. Outside of dating — for some of us — and making babies, gender really isn’t that important in our real, actual, daily lives.

These products are a backlash against this idea, reminding us constantly that gender is important, that it really, really matters whether you’re male or female when, in fact, that’s rarely the case.


But if there were no gender difference, there couldn’t be gender inequality; one group can’t be widely believed to be superior to the other unless there’s an Other. Hence, #1 is important for #3.

Affirming the gender binary also makes everyone who doesn’t fit into it invisible or problematic. This is, essentially, all of us. Obviously it’s a big problem for people who don’t identify as male or female or for those whose bodies don’t conform to their identity, but it’s a problem for the rest of us, too. Almost every single one of us takes significant steps every day to try to fit into this binary: what we eat, whether and how we exercise, what we wear, what we put on our faces, how we move and talk. All these things are gendered and when we do them in gendered ways we are forcing ourselves to conform to the binary.

2. Pointlessly gendered products reinforce stereotypes.

Pointlessly gendering products isn’t just about splitting us into two groups, it’s also about telling us what it means to be in one of those boxes. Each of these products is an opportunity to remind us.


3. Pointlessly gendered products tell us explicitly that women should be subordinate to or dependent on men.

All too often, gender stereotypes are not just about difference, they’re about inequality. The products below don’t just affirm a gender binary and fill it with nonsense, they tell us in no uncertain terms that women and men are expected to play unequal roles in our society.

Girls are nurses, men are doctors:


Girls are princesses, men are kings:


4. Pointlessly gendered products cost women money.

Sometimes the masculine and feminine version of a product are not priced the same. When that happens, the one for women is usually the more expensive one. If women aren’t paying attention — or if it matters to them to have the “right” product — they end up shelling out more money.  Studies by the state of California, the University of Central Florida, and Consumer Reports all find that women pay more. In California, women spent the equivalent of $2,044 more a year (the study was done in 1996, so I used an inflation calculator).

This isn’t just something to get mad about. This is real money. It’s feeding your kids, tuition at a community college, or a really nice vacation. When women are charged more it harms our ability to support ourselves or lowers our quality of life.

5. Pointlessly gendered products are stupid. There are better ways to deliver what people really need.

One of the most common excuses for such products is that men and women are different, but most of the time they’re using gender as a measure of some other variable. In practice, it would be smarter and more efficient to just use the variable itself.

For example, many pointlessly gendered products advertise that the one for women is smaller and, thus, a better fit for women. The packaging on these ear buds, sent in by LaRonda M., makes this argument.


Maybe some women would appreciate smaller earbuds, but it would still be much more straightforward to make ear buds in different sizes and let the user decide which one they wanted to use.

Products like these make smaller men and larger women invisible. They also potentially make them feel bad or constrain their choices. When the imperative for women is to be small and dainty, how do women who don’t use smaller earbuds feel?  Or, maybe the small guy who wants to learn how to play guitar never will because men’s guitars don’t fit him and he won’t be caught dead playing this:


In sum, pointlessly gendered products aren’t just a gag. They’re a ubiquitous and aggressive ideological force, shaping how we think, what we do, and how much money we have. Let’s keep laughing, but let’s not forget that it’s serious business, too.

Lisa Wade is a professor at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. Find her on Twitter, Facebook, and Instagram.

(View original at https://thesocietypages.org/socimages)

20 Jan 18:48

American Sniper, and the Murderers Hall of Infamy

by Arthur Silber
Just ten days ago, our wondrously life-affirming Western, and especially American, culture offered the spectacle of talk radio hosts giggling as they tried to determine the best way to celebrate the murders of those accused of the Charlie Hebdo killings.

But our glorious culture is capable of achievements greater by far than a few hosts laughing about the deaths of less than a handful of people. When it comes to nauseating spectacles that celebrate violence and bloody death, we are always outdoing ourselves. Perhaps "celebrate" isn't quite the right word in this context. "Consecrate" captures the dynamic more accurately; truly, violence, especially gratuitous violence, and bloody death, the bloodier the better, constitute our civic religion these days.

So this week began with the inspiring news that huge audiences flocked to see the new film, American Sniper. The enormous success of the film, which according to most reports took all the Hollywood-watchers and predictors by surprise (important reminder: so-called "experts" in any and every field -- foreign policy, economics, even Hollywood -- are the last people whose judgment you should trust, save for exceptions so rare they fail to constitute a serious challenge to the rule's application) is described as a "juggernaut," with the film expected to gross more than $105 million for the four-day holiday weekend: "The film is still setting mega-records including the largest January-February opening ever, the largest MLK four-day haul and an uber-career high for [Clint] Eastwood [the director]."

American Sniper is the story of Chris Kyle, a Navy SEAL who served four tours of duty during the war in Iraq. He is considered to be the most lethal sniper in American history, with 160 confirmed kills, out of 255 probable kills. I have not seen the film yet (and have no plans to do so in the near future), but I'm reading Kyle's book. A number of reviews of the movie confirm that Kyle's own view of what he did, as stated in his book, is faithfully rendered in the film.

Kyle's view of his actions is very straightforward and uncomplicated. He considered all those he killed to be "savages" who represented "despicable evil." Kyle's total of 160 confirmed kills may represent a record, but Kyle declares: "I only wish I had killed more."

Kyle also says this:
I loved what I did. I still do. If circumstances were different -- if my family didn't need me -- I'd be back in a heartbeat. I'm not lying or exaggerating to say it was fun. I had the time of my life as a SEAL.
After reading many reviews and comments about the film, I am confident in stating that the consensus view of Kyle and his "achievement" is that he was a hero. The film, and most members of the audience, are profoundly sympathetic to Kyle. They are keenly aware of the enormous pain Kyle suffered -- all of which had to do with fellow Americans who were killed, especially those Americans whose lives he thought he might have saved. Neither Kyle, nor his book, nor, it appears, the film expends even a moment's energy or thought for the suffering of the Iraqis. (There are brief mentions in his book of concern for those Iraqis who were "loyal to the new government," but it is hardly a subject of great moment to him. And those are the only Iraqis who merit a glimmer of compassion. All the rest of the Iraqis embody "despicable evil.")

Kyle's attitude toward the murders he committed -- that "it was fun," and that he "had the time of [his] life as a SEAL," immediately put me in mind of Matthew Hoh. You may recall that five years ago Hoh was much praised and lauded by the usual suspects among "dissident" writers because he resigned from the military "in protest about the Afghan war." Hoh did not deserve such praise: his only objection to the Afghan war was that it was "ineffective" and "counterproductive," not that it was a war of aggression, or that he objected to the U.S. government's foreign policy of ceaseless bombings, invasions, covert operations, and so on, all for the purpose of American global hegemony. Hoh had no objection at all to any of that. He supported that policy.

I explained why I assessed Hoh's actions in a radically different way from those commentators who praised him in two articles: "The Denial Continues, and the Horror Remains Unrecognized," and "Desperately Seeking Peacenik, Pot-Smoking Hippies." In the first piece, I set forth what I regarded as the worst of Hoh's own comments about his military experience:
"I'm not some peacenik, pot-smoking hippie who wants everyone to be in love," Hoh said. Although he said his time in Zabul was the "second-best job I've ever had," his dominant experience is from the Marines, where many of his closest friends still serve.

"There are plenty of dudes who need to be killed," he said of al-Qaeda and the Taliban. "I was never more happy than when our Iraq team whacked a bunch of guys."
Chris Kyle: "It was fun. ... I had the time of my life as a SEAL." Matthew Hoh: "I was never more happy than when our Iraq team whacked a bunch of guys."

Bloodthirsty killers, brothers-at-arms, soulmates. If you wish to nominate Kyle and Hoh to the Murderers Hall of Infamy, I will offer no objection.

Following Hoh's comments, I offered a concise explanation of why Hoh's view is unforgivably wrong. I repeat it here, for this passage can be applied with full force to Kyle's evaluation of his experience. (Kyle provides a "justification" of his kills which is unsurprisingly identical to Hoh's: "Everyone I shot in Iraq was trying to harm Americans or Iraqis loyal to the new government.") I wrote:
The critical facts are few in number, and remarkably easy to understand: Iraq never threatened the U.S. in any serious manner. Our leaders knew Iraq did not threaten us. Despite what should have been the only fact that mattered, the U.S. invaded and occupied, and still occupies, a nation that never threatened us and had never attacked us. Under the applicable principles of international law and the Nuremberg Principles, the U.S. thus committed a monstrous, unforgivable series of war crimes. Those who support and continue the occupation of Iraq are war criminals -- not because I say so, but because the same principles that the U.S. applies to every other nation, but never to the U.S. itself, necessitate that judgment and no other.

While it may be true that some "dudes" threatened Hoh's life and the lives of those with whom he served, Hoh could never have been threatened in that manner but for the fact that he was in Iraq as part of a criminal war of aggression. In other words, he had no right to be in Iraq in the first place. And if he had not been, he would never have been in a position to "whack[] a bunch of guys."

Hoh joined the U.S. military voluntarily. He was obliged to understand this.
Kyle and Hoh could have acted differently. In the earlier article, I discussed the notable, genuinely inspiring example of Ehren Watada, who refused to serve: "My participation would make me party to war crimes." Watada deserves great praise and admiration; Kyle and Hoh absolutely do not.

In the last few days, I've heard and read many awful and frequently idiotic remarks about the great "success" of American Sniper. I've heard how many audiences apparently cheer wildly at the conclusion of the film, in approval of this portrait of a great American hero. A number of commentators insist that Americans are "starved" for this version of "unapologetic patriotism."

In the midst of this blood-drenched celebration of unnecessary, avoidable murder, I heard one especially stupid comment. A local Los Angeles radio host lamented that, during his time in office, Mr. Obama has never uttered Chris Kyle's name. Obama has failed to grant the recognition due this great hero. When I heard that, I had an odd, funny thought. Of course Obama isn't going to mention Kyle, I thought. Kyle is his competition.

Obama is the Murderer-in-Chief. He devoted years and enormous energy to becoming the Murderer-in-Chief. And you expect him to share this great achievement with some two-bit sniper? 160 confirmed kills? That's a morning's work for Obama. Surely we recall that Obama devotedly continues -- and expands -- the infernal work of American Empire in Afghanistan, Pakistan, Libya, Syria, Somalia, and in countless countries around the globe. Surely we recognize that Obama considers the lives of tens of thousands of people, most of them entirely innocent by any standard, as completely expendable in the pursuit of American global hegemony.

On top of this, we surely recall that Obama has a Kill List, and that his Murder Program has been widely publicized in the nation's leading newspapers. The U.S. government has been at great pains to make sure that we all know about the Kill List and the Murder Program in excruciating detail -- and that we know that Obama himself is critical in directing all these operations. The U.S. government, led by Obama, claims that it may kill anyone it chooses, anywhere in the world, for any reason it offers, or for no reason at all. Why would Obama even notice a pipsqueak like Kyle?

Yet the truth is that the overwhelming majority of Americans recognize and remember none of this. Although the Kill List was much written about for a brief period, it has joined the long list of horrors in the cesspool of Americans' amnesia. It's ancient history; who cares about it any longer? Almost no one. The truth is far worse than that: as I have noted, even during and immediately after the extensive coverage of the Kill List and the Murder Program, as far as most Americans were concerned, all the stories and discussion "caused almost no reaction at all ... It was as if nothing of any significance had been said."

And so we have huge numbers of Americans eager to see this celebration of murder in a criminal war of aggression, and a film which offers an undiluted version of Kyle's view of the Iraqis he killed as embodying "despicable evil." I've seen a few reports indicating that the film may offer a somewhat more complex perspective, and might even cause a viewer to wonder if the war was "worth it" -- but solely because of the great suffering endured by Kyle, not by the Iraqis -- yet this obviously is not the primary reason for the film's success. Even the reviews that claim this greater "complexity" for the film stress that it is enormously sympathetic to Kyle. It is certain that the film does not even begin to approach the idea that Kyle was a serial murderer, who killed people when he had no right to do so -- and when he had no right even to be in their country.

In short: Kyle committed a series of unforgivable crimes. What he did was unforgivably wrong, and unforgivably evil.

That view does not translate into boffo box office, not in these United States of America.

What, then, do I consider the real explanation for the film's notable success? One of the essays linked above provided that explanation, in two brief opening paragraphs. From "To Honor the Value of a Single Life: The First Murder":
We live in a culture drenched with cruelty, violence and blood. From our earliest days as children, we are taught to hate those who are not like us. We learn that compassion and empathy are signs of weakness, and failings to be viewed with contempt. By the time we are adults, most people have internalized these lessons completely. They refuse even to question them. They will despise you, or simply ignore you, if you dare to challenge these beliefs.

We are also taught that the fundamental virtue is obedience to authority. Whatever else we may question -- and, in truth, there is no longer much at all that may be questioned -- the inherent goodness of the primary authority figures we are taught to revere is an absolute that we must accept. The authority figures we are told we must obey, if, that is, we wish to be civilized and decent, are our parents in the first instance; as we grow older, and when the roots of obedience are left to grow and strengthen, as they are in almost every case, the same mechanism encompasses additional authority figures: political leaders, and the military and police, are among the prime examples.
American Sniper mixes these elements together in a lethal combination. Americans' comfort with extreme cruelty and violence, and their unquestioning acceptance of the necessity of obedience to authority (Kyle repeatedly stresses that he was "simply" doing "his duty," but any questions as to why he chose this duty are ignored entirely), are offered to audiences as a version of themselves they view with great favor. Indeed, they revel in it.

The great success of American Sniper immediately follows the latest exercise in the ongoing demonization of Islam and Muslims. Americans' penchant for violence and unending aggression requires the existence of targets who "deserve" whatever they get, even and often especially when what they get is brutality, torture and murder. Empire is greatly skilled and inventive at feeding the appetites of this ravenous monster. Given recent developments, the horrors will not be ending anytime soon. It is more likely that the pressure for new explosions of these hatreds will grow. The dedication to violence demands an outlet. Tell many Americans that their hatred and their desire to wreak vengeance are "justified," and they will love you for it.

At this point, it doesn't appear that most Americans can even imagine a profoundly different way of living, let alone begin to make it real. The deadly disease that consumes America can be described in many ways -- but, at least for me, "living" isn't one of them.
20 Jan 18:28

Ignoring when real life gives you a better ending: THE IMITATION GAME

by The Bitter Script Reader
I saw THE IMITATION GAME and AMERICAN SNIPER last week. As the two of them have had a lot of Oscar buzz surrounding them, I went into both films with high hopes. As it turned out, the Top 20 films of 2014 posts that I wrote a couple of weeks ago are in no need of being revised.

Both films are "based on a true story," and what was interesting to me was that both films shared a common infuriating flaw: the protagonist's death was relegated to a footnote. My jaw dropped when THE IMITATION GAME tossed off Alan Turing's suicide as a text caption over a final celebratory moment. Later, I was flat out infuriated at the way AMERICAN SNIPER threw in a description of Chris Kyle's murder with all the grace of a handwritten note explaining "Note: Poochie died on his way back to his home planet."

I am not one of those people who thinks a "based on a true story" film needs to be a total biopic of its subject from birth to death. Hell, I prefer it when it's not. When I was a reader, I saw so many bad biopic scripts that were just trying to cram in EVERYTHING about their subject. You can usually get a far more effective movie if you home in on one particular aspect of a character's life and explore that. A good example of this done right? SELMA.

The issue with the two movies I'm discussing today is that their subjects' deaths inform so much about their lives and are directly relevant to the stories that are told in the narrative. Alan Turing built a computer that broke Nazi codes in World War II and in doing so, likely shortened the war by two years and saved tens of thousands of lives. This man was a hero as surely as anyone who fought in World War II.

And what happened to him later in life? Well, his work was classified, so there was no public recognition of what he did. Even worse than the lack of glory was the fact that he was later prosecuted for being a homosexual, because in 1952, it was a crime to be gay in the United Kingdom. Offered a choice of going to jail or chemical castration, he took the castration.  (Presumably the irony of punishing homosexual acts by sending one to jail would not set in until years later when the HBO series OZ reached the UK's shores.)

In the film, we're not shown a trial. There's no big dramatic moment when we see the state pronounce sentence on a man who saved their asses in World War II. We're given a framing story that would have set all this up, but we learn about the conviction and the castration in one scene between Turing and his ex-beard. It dulls the impact of the injustice somewhat, and then, even more offensively, his suicide gets NO on-screen depiction.

Yeah, he kills himself - almost certainly because of the pain of what the state put him through - and somehow the movie glides right past that. I'm not just angry as someone fascinated by Turing, I'm angry as a writer. How the hell do you leave that dramatic moment on the table?

Despite that, THE IMITATION GAME is a serviceable movie. It's well-acted and competently directed. It's not a standout, and it's the kind of film that in a few years you might strain to remember, but it does more right than it does wrong.

AMERICAN SNIPER on the other hand.....

You know what? Let's deal with that tomorrow.
20 Jan 02:21

What Makes Groups Smart? Women, According to MIT Scientists

What Makes Groups Smart? Women, According to MIT Scientists:

Imo this dude is naive as fuck but interesting study, especially if you’re willing to see manhood as a constructed problem.

19 Jan 02:11

in nge, the big mechas are actually just containment suits on gigantic monsters, as if you’d...

in nge, the big mechas are actually just containment suits on gigantic monsters, as if you’d put an aot titan inside a gundam mecha

my interpretation being that the artifice of human-ness is this prosthetic thing on top of being an animal without language,

and the connection between the mecha and the mass of flesh is the pilot (the ego i guess) injected in via the plug

I FIND THAT REALLY COMPELLING

18 Jan 18:20

Je suis Bat Boy

by Fred Clark

I started at the newspaper in October 2001. Here’s your desk, here’s your phone, over there’s where your mailbox will be eventually when our mailboxes are returned.

Mail delivery to the newsroom had been suspended. All of the paper’s incoming mail was being handled and inspected elsewhere because of the anthrax attacks.

Those attacks, weirdly forgotten just a few years later, lasted for several weeks during the immediate aftermath of the 9/11 attacks. The main targets were newspapers and TV news departments, but they also included the offices of two Democratic U.S. senators, the headquarters of several supermarket tabloids, and a few other random citizens whose infections and deaths have never been explained.

Ultimately, the anthrax letters sent through the mail would wind up infecting 22 people with the disease, killing five of them. And then, after a few weeks, the letters just stopped showing up and the crisis faded away.

Back in October 2001, though, nobody knew how many of the letters had been sent, who was sending them, or why. No one knew where those letters might arrive next.

The first victim of the attacks, Robert Stevens, died on Oct. 5. Stevens was a photo editor for American Media, Inc., the Boca Raton, Florida-based publisher of several supermarket tabloids, including the National Enquirer, the Sun (where Stevens worked), and the Sun’s main rival for paranormal hilarity, the Weekly World News.

The Sun and the Weekly World News were simultaneously tabloids and parodies of tabloids. They were satire. Sometimes that satire was pretty brilliant. Other times it was hilariously over-the-top. Often it walked a troubling line by seeming to reinforce the very things it was ridiculing (the “Ed Anger” columns seem to have been read un-ironically by many people who shared the column’s apparent racist, sexist, nativist, homophobic views). And more often the satire of these tabloids adopted a sneering tone toward all the marks and rubes who weren’t in on the joke — who read its accounts of alien abductions, “Bible prophecy,” cryptozoology, ghosts, and the like without fully realizing it was all a gag.

But my point here is not to critique or to analyze the quality or meaning of those now-defunct satiric tabloids. I just wanted to remind us all that just a little more than a decade ago, a terror attack here in the United States targeted and killed a visual artist who worked for a satiric tabloid.

The anthrax attacks targeted journalists and satirists alike, and prompted a great deal of chest-thumping, defiant affirmations of the freedom of speech and freedom of the press.

And then, three weeks after the death of Robert Stevens, the Patriot Act passed the U.S. Senate on a vote of 98-to-1 and was signed into law.

Plus ça change. …

18 Jan 08:16

Sunny day + fun waves = big grin on my face :D

17 Jan 20:17

allthingslinguistic: socimages: Map of the week: 57% of...



allthingslinguistic:

socimages:

Map of the week: 57% of languages do not have gendered pronouns.

By Lisa Wade, PhD

The map above is an interactive map available at the World Atlas of Language Structures. It represents an extensive, but not quite comprehensive, collection of world languages. Each dot represents one language. White dots are languages that do not include gendered pronouns. No “he” or “she.” Just a gender-neutral word that means person.

The colored dots refer to languages with gendered pronouns, but there is more than one kind, as indicated by the Values key. The number on the right, further, indicates how many languages fit into each group. Notice that the majority of languages represented here (57%) DO NOT have gendered pronouns.

The map at the site is interactive. Go there to click on those dots and explore.

Lisa Wade is a professor of sociology at Occidental College and the co-author of Gender: Ideas, Interactions, Institutions. You can follow her on Twitter and Facebook.

Note that while WALS has around 300 languages, and there are actually around 6000 languages in the world, this map almost certainly overstates the proportion of languages that have a masculine/feminine distinction in pronouns, for two reasons. 

1) In linguistic terms, a gender distinction is a distinction between various noun classes, and feminine/masculine is only one way to split up noun classes. Other common ways are animate/inanimate (for example Algonquian languages, aka many of the pink dots in North America) or having a much larger group of noun classes that may include human/nonhuman but not feminine/masculine (for example Bantu languages, aka many of the red dots in Africa). 

2) The languages that are represented on this map include disproportionate numbers of widely-studied languages, because the creators are working from published grammars. The most studied language family is Indo-European, in which most languages do have a masculine/feminine gender distinction in pronouns, but many less-studied language families do not. 

But I do agree with the overall point: while having separate words for “she” and “he” seems natural to English speakers, a large proportion of languages get along just fine without them. (And singular they has been around since Chaucer, anyway.)

17 Jan 18:16

newyorker: An illustration by Tom Gauld.



newyorker:

An illustration by Tom Gauld.

15 Jan 18:58

Photo



14 Jan 20:17

This is really interesting! So, now that all the primaries are out, why would the different houses gravitate towards joining the Death Eaters? Like, why do the Death Eaters pull more Slytherins than Hufflepuffs?

Well, our Primaries don’t always match up with what people tend to be sorted in-canon, and when they do it’s not because we intended them to! So to say that Slytherin Primaries would gravitate more than the other houses toward being Death Eaters, I would actually say, is probably not the case.

You’re right in thinking that Hufflepuffs have a lot of potential to become Death Eaters. We sort Bellatrix Lestrange as a Hufflepuff Primary. If they can be convinced that muggles and muggle borns are inhuman, or at least less human than wizards and witches, then the actions of the Death Eaters probably wouldn’t carry the same level of horror for them. Ravenclaws would join the Death Eaters if they accepted the system that Voldemort proposed, and Gryffindors could be swayed by twisted appeals to their morals. Get their logic or their gut to follow the teachings of Voldemort, and you’ve got yourself some new Death Eaters. I really don’t think that in the face of something as plainly evil as Voldemort, you’re going to have any particular house favoring the cause. 

A big part of the why then, in the case of the books, is that…JKR really doesn’t like Slytherin. Her Slytherins are all evil. The most redeemable Slytherin we probably have is Slughorn (Slytherpuff, we think), who’s still unsavory. And arguably Narcissa, when she lies to Voldemort to save Draco, but…she’s still married to and supportive of a Death Eater. She doesn’t get to be considered a good person. As far as I can tell, JKR’s justification for all this is that power corrupts. According to how she writes them, all Slytherins get caught up in their ambitions and their schemes and they forget the things that really matter— bravery, kindness, friendship. Things that she doesn’t particularly show Slytherins being capable of.

And if you take people like that and put them in a war-torn world where one of the ways out, one of the ways to protect your family, is to join with Voldemort? I can understand the train of logic that would get you there. And it’s more of a stretch, but I can understand the thematic logic, at least, that would get you to the final battle and not have a single Slytherin student wanting to fight.

There are some other interesting theories, like the idea that the importance Slytherin puts on blood purity would encourage, propagate, or at the least be sympathetic to Voldemort’s violence, and like the social importance put on your Hogwarts house and the pressures that would go along with that, but none of those would explain why Slytherins are unanimously awful. The story of how ambition and selfishness and family loyalty can be positive things just wasn’t the story that JKR was interested in telling.

Which like, yeah okay, they’re her books and she can do what she wants with them. 

But as far as your question? I don’t think Slytherins are more likely than the other houses to turn Death Eater. I could come up with some answer about protecting your family by siding with whoever the biggest bad in the room is, but I don’t actually think that’s a strong enough motivation for most people in that situation. You’re going to end up with Slytherins who flee and Slytherins who stay quiet and who join the Ministry and try to help from political angles, and you’re going to end up with Slytherins who roar right alongside those Gryffindors and join the fight with all their gusto. 

-Kat