Shared posts

03 May 14:53

She wanted to vaccinate their kids against COVID. He didn't. A judge had to decide

by Nina Feldman | WHYY
Heather wanted to have her two children vaccinated against COVID-19, while her ex-husband did not.

A divorced Pennsylvania couple could not agree on whether to vaccinate their children — and ended up in court. Since the vaccine was approved for kids, cases like these have skyrocketed in the state.

(Image credit: Emma Lee/WHYY)

15 Jul 15:08

Rick Bayless Leaves Randolph Street Restaurants Leña Brava and Cruz Blanca

by Ashok Selvam
Rick Bayless is walking away from Cruz Blanca and Leña Brava. | Barry Brecheisen/Eater Chicago

The West Loop taco bar and restaurant will continue with a new chef

Rick Bayless has left his West Loop restaurants, depleting Randolph Street of one of its biggest guns. Bayless, the Oklahoma-born chef who’s made a fortune cooking and selling Mexican food, has exited Leña Brava and Cruz Blanca, the neighboring restaurants that opened in 2016. Bayless has left the restaurants to his longtime business partner, Manny Valdes. The two founded Frontera Foods, the Mexican food company known for chips, sauces, and more. They sold Frontera in 2016 to ConAgra Foods.

Leña Brava centers on cooking food over a wood fire. The restaurant aims to showcase Bayless’s talent along Randolph Restaurant Row, where pricey mezcal sits on shelves behind the bar. Bayless and Monteverde chef Sarah Grueneberg famously danced behind the bar celebrating their wins at the James Beard Awards gala after-party.

“For me, [leaving is] super painful, super sad,” says Bayless. “I’ve always dreamed of being able to have a restaurant with an all wood-fired kitchen. I grew up in barbecue with all that wonderful wood flavor and everything. Every restaurant I’ve opened had a wood-fired element — to be able to do a whole wood-fired kitchen was such a dream for me.”

Cruz Blanca plays the casual little sibling as an Oaxacan taqueria and brewery. While draft beer sales have plummeted during the pandemic, retail beer sales have helped weather the storm, says Valdes. While Valdes says “we’ve had to throw away a lot of beer,” sales at stores like Mariano’s and Whole Foods are up 50 to 60 percent. It’s not enough to offset COVID-19-related losses, but it’s slowed the bleeding.

Rick Bayless on stage in 2019 in New York. Photo by John Lamparski/Getty Images for NYCWFF

Both Bayless and Valdes are frank about the fact that their partnership has deteriorated. Valdes wouldn’t go into specifics of how the relationship had frayed, but says they’ve been considering the change since March. The pandemic has been grueling, and success requires an owner-operator on the premises, he says. Valdes naturally oversaw the West Loop restaurants, while Bayless focused on River North’s Frontera Grill, Topolobampo, and Bar Sotano.

Bayless says he and Valdes were already moving in different directions before the pandemic hit. Then the industry was upended almost overnight, and the fissures between the two became untenable. “It [wasn’t] working for either one of us, to constantly be at loggerheads,” Bayless says. “He wanted to make decisions I wasn’t comfortable with, so we decided it’s time now... in better times maybe we would have been able to work things out, but not now — it’s just too much.”

The economics of a pandemic restaurant industry have Valdes hoping to make the Randolph Street restaurants feel more like an affordable destination for locals, where a customer can easily come in for a glass of wine and some wood-fired oysters without a reservation. Cruz Blanca stayed open for carryout and delivery during the pandemic, and Valdes says the support from West Loop neighbors has astonished him. Leña Brava’s menu will slowly add more small plates: “We really want to become a neighborhood restaurant,” Valdes says.

Bayless is not nearly so optimistic about the state of the industry in River North. Historically reliant in large part upon hotel guests in the Loop and downtown workers, his restaurants are now fighting for the same small group of diners as every other establishment in the area. Even if he could operate at full capacity indoors, the demand just isn’t there, he says. “We started off in River North 33 years ago when it was a horrible neighborhood, and we’ve helped build it into the good neighborhood that it is now,” he says. “We’re seeing that overnight it can go from one of the best neighborhoods [for restaurants] to one of the worst.”

He’s primarily concerned with his clutch of Clark Street establishments, but his other ventures have suffered too — in January, he opened Tortazo, a partnership with Filipino fast-food giant Jollibee, inside the Willis Tower. At the time, 16,000 people worked there, and it was the best address in the city for a fast-casual restaurant, according to Bayless. Six weeks later, the restaurant was shut down because of the pandemic.

Barry Brecheisen/Eater Chicago
Ownership wants to make Leña Brava more of a casual restaurant.

During the shutdown and early stages of reopening, Bayless has turned his attention to activism, in part through the Independent Restaurant Coalition. He’s among those advocating for a bipartisan bill — the RESTAURANTS Act of 2020 (Real Economic Support that Acknowledges Unique Restaurant Assistance Needed to Survive) — that seeks economic relief for operators sinking further into the red every day. “Every restaurateur we talk to is barely holding on,” Bayless says. “We need an influx of capital right now so we can stop losing money and start moving toward growing our business back to where we can break even.”

Valdes is of Cuban descent. Bayless’s story of becoming a cultural ambassador for Mexican food has been widely told, whether on PBS or elsewhere. The Bayless name has drawn even more attention to Randolph Street, a place where Robert De Niro just last week opened a Chicago location of Nobu Hotel. But a big-time chef’s name can be intimidating to some customers who come in with preconceptions of what to expect, Valdes says. They may feel the restaurant is reserved only for special occasions.

Bayless and Valdes continue to work together. Valdes is a minority partner in Tortas Frontera, the fast-casual restaurants with locations inside O’Hare International Airport. In January, the concept was more or less rebranded as Tortazo. When the restaurant first opened, Bayless and company hoped to open others across the country. Valdes also is an investor in Bayless’s Disney World restaurant in Orlando, Florida.

15 Feb 14:49

Heaven’s Vault preview: an ouroboros of wordery

by Alice Bell
Crowtrobot

80 Days is one of my top adventure games of all time. Really looking forward to this.

I love words, me. They’re so expressive and weird and can do so much. I think of people who are very good at word-balling as like conductors mixed with magicians. They marshal all these weird squiggles into a specific order that makes other people spontaneously feel real feelings, in whatever order they want. Madness! And the origins and cross-pollinations of language, especially one as terrible as English (consider the words ‘content’ and ‘content’, for example), are so rich that you can study words specifically, which is called etymology.

For example, did you know that avocados and orchids are both basically big swinging balls? Avocado is a corruption of the Aztec word for testicle, and orchid comes from the Ancient Greek for same, orchis*. This once again goes to show that humans have evolved very little. But we have at least evolved to the point that we can make video games with an etymological flair. That gold standard segue leads me to Heaven’s Vault.

(more…)

15 Feb 14:45

Have You Played… Need for Speed: Hot Pursuit?

by Brendan Caldwell
Crowtrobot

they're really on my late 90s PC obsession kick.

This was such a fun game.

Have You Played? is an endless stream of game retrospectives. One a day, every day, perhaps for all time.

No, not that one. This one. Confusingly known in North America as Need For Speed III: Hot Pursuit, this 1998 racer was a steel-blue feast of dropdown menus infested with cars. Click on one of these three-dimensional motorbeasts to hear the silky voice of every US infomercial telling you about horsepower and gear boxes and how the Alfa Romeo Scighera “embraces dynamic new design influences.” Oh yes, that’s the good stuff. That’s some good cars right there.

(more…)

14 Feb 18:00

Have you played… Might & Magic VI: The Mandate of Heaven?

by Ollie Toms
Crowtrobot

Hell yeah I have. We're super due for a game made in this style. This was such a great dungeon crawler.

Have You Played? is an endless stream of game retrospectives. One a day, every day, perhaps for all time.

It’s very difficult to pick a favourite moment out of the unsightly behemoth that was Might & Magic VI: The Mandate of Heaven. It may have been when I went on a murder-hobo rampage through the starting town of New Sorpigal, killing every peasant, every guard, every trader; and then heading to the temple and donating gold until my reputation had improved from “Notorious” back to “Respectable”.

Or it may have been that fateful moment where, due to a simple misclick, I accidentally mixed one potion with another in my inventory and caused a chemical explosion that instantly killed my entire party.

(more…)

14 Feb 15:22

The New Yorker Running for President of Nigeria

by Pawel Slabiak
Crowtrobot

weird that a magazine is running for president, but i'm not familiar with nigeria's constitution.

One evening last summer, Chike Ukaegbu, a 35-year-old New York tech entrepreneur, called his uncle, Augustine Akalonu: “Are you sitting down?” Ukaegbu asked. Dr. Akalonu was sitting down; he was driving home from his pediatrics practice in the Bronx. Ukaegbu had just been named among the global 100 most influential people of African descent under forty. He had been speaking at conferences around the country and running a lauded tech accelerator in New York. But he wasn’t calling about any of that.  

“I’m going to run for president,” Ukaegbu said, a characteristic note of mischief in his voice.

Dr. Akalonu, a jovial man in his mid-sixties with an easy, unhurried manner, said, “Great. President of what?”

“Of Nigeria,” replied Ukaegbu.

“I nearly had a car accident,” Dr. Akalonu told me a few months later in November, at a fundraiser he hosted in Nyack, New York. Ukaegbu had become the youngest presidential candidate in Nigerian history. 


“Gerontocracy.” That’s the word critics use to describe Nigerian politics. The country has a population of almost 200 million—the largest on the continent, and the youngest: Sixty percent of Nigerians are under age 30; only six percent are older than 60. But government leadership positions are overwhelmingly filled by the aged. This disparity illustrates a broader African trend. In 2017, researchers estimated the median age of the continent’s population was 20, while the average age of a head of state was 62. In May 2018, however, a month after angering Nigerian youth with disparaging comments at an event in London, President Muhammadu Buhari, 75, signed a law lowering the minimum age for presidential candidates from 40 to 35.

Ukaegbu came to the U.S. in 2002 at the age of 19 to study biomedical engineering at the City College of New York. That’s where I first encountered him, although we were not friends and moved in different circles. “I don’t remember all students who participated in our program, but I do remember him,” Nora Heaphy, then a program director for the college’s Colin Powell Fellows in Leadership and Public Service, told me. “He was extremely charismatic, very open to learning, very engaging.” While many fellows went to entry-level jobs at D.C. think tanks upon graduation, she said, Ukaegbu stayed in New York. In 2010, he and a friend he had met at a church concert, Kevaughn Isaacs, launched a non-profit called Re:LIFE, training disconnected youth in Harlem and Washington Heights to develop business plans and helping them complete their education. Ukaegbu himself enrolled in distance learning courses in business management and venture capitalism at Cornell, the University of Pennsylvania, and Stanford, as his plans expanded.

In 2013, Re:LIFE ran its first startup class requiring each trainee to launch a business by the program’s end, and Ukaegbu began noticing what he saw as bias. “I saw several brilliant founders who were not getting funded,” he told me. “And I heard several bullshit stories from investors explaining why they’re not funding these people.” In 2015, he founded Startup52, a Manhattan-based “start-up accelerator” aiming to offer young entrepreneurs from underrepresented demographics intensive coaching in polishing their business plans and investor pitches. Two years later, the accolades for Re:LIFE and Startup52 helped him receive a green card through the United States’ National Interest Waiver category.

“This is the new Nigerian dream,” Damilare Ogunleye, Ukaegbu’s 33-year-old Lagos-based campaign director, told me when we met in Lagos in January. “To leave, embed in the system in the U.S., U.K., or wherever, and skip all the problems of Nigeria.”

Chike Ukaegbu and his brother Chibueze Ukaegbu Jr. (left), in the campaign office in Abuja in January. Photographs by Pawel Slabiak

Nigerian politics, with its volatile mix of money, violence, and humor, is often surreal. Out of more than 70 candidates who initially declared intent to stand in the presidential election on February 16, 2019, only two have emerged as frontrunners. One is current president Buhari, who due to months-long medical leaves in the U.K. has had to publicly deny rumors that he has died and been replaced by his double, a hypothetical Sudanese man named Jubril. “It’s the real me,” Buhari assured an audience of Nigerian expatriates at a U.N. climate summit in Poland in December. Despite Transparency International placing Nigeria 148th out of 180 countries on a corruption index last year, with scores virtually unchanged during a four-year presidency premised on fighting corruption, Buhari has titled his reelection manifesto Next Level, unironically vowing even greater progress.

Buhari’s chief rival is Atiku Abubakar, a 72-year-old businessman who has been unable to travel to the U.S. since 2005, when an American investigation implicated him in a transatlantic bribery case involving an FBI sting, a telecommunications deal, and $90,000 stashed in a Louisiana congressman’s freezer. Abubakar finally managed a visit to Washington, D.C. in January, reportedly helped by Brian Ballard, a lobbyist with extensive links to the Trump administration. In Nigeria, Abubakar has faced numerous corruption allegations.

“The surest way to riches and power,” former U.S. ambassador John Campbell and Chatham House fellow Matthew Page wrote of Nigeria in 2018, “is through elected office and the opportunities for kleptocratic state capture that it offers.” Successful candidates today are often “godfathered” by prior office holders who have amassed great wealth from public funds, bankrolling party and campaign costs in exchange for a continued cut of national treasure once the sponsored candidate assumes office. Political parties resemble businesses, low on ideology or policy objectives, funded top-down by the godfather, the top office holder, or, during campaigns, by the candidate for highest office.


Ukaegbu’s eccentric campaign began with a chance encounter at a Johns Hopkins School of Advanced International Studies event last April. Ogunleye, who worked on political campaigns while at a Nigerian communications firm from 2012 to 2015, was visiting with a delegation of Lagosian entrepreneurs—he currently runs a T-shirt and mug customization website called Suvenia.com. The two men bonded over a passion for tech innovation: Ukaegbu had been reaching out to Nigerian officials and one presidential candidate with a technology blueprint for the country, a platform politicians could adopt and promote, with no response. “You should run,” Ogunleye told him.

In June, Ukaegbu began holding daily videoconferences from New York with a core team: his older brother Chibueze, who runs a coding academy and software firm in Aba; Ogunleye in Lagos; and a banker in Abuja. The four began planning how to introduce Ukaegbu to voters, raise funds, and obtain a political party nomination, as required by law. They found inspiration, Ogunleye told me, in an Old Testament anecdote in which four lepers, rather than starve in a city besieged by the Syrian army, set out for the Syrian camp, hoping for nourishment or swift death. When they arrive, they find it full of provisions but abandoned: The Syrians heard the thunder of charging chariots and fled in haste. “How do you with four lepers create the sound of chariots?” asked Ogunleye, the sleeves of his suit pinned with card cufflinks featuring the ace of diamonds and ace of spades. “Lead with the media.”

Nigerian news media didn’t bite, but through a contact in D.C. Ukaegbu landed interviews on CNN, BBC, and Al Jazeera. “Everyone got interested,” said Ogunleye. “Who is this guy who is on all these global platforms? What kind of backing does he have?”

In August, after 16 years in New York, Ukaegbu moved back to Nigeria to launch his campaign. Other parties showed up to offer him office nominations and cabinet positions with them—a standard tactic to neutralize political opponents. During one meeting Ogunleye found particularly gratifying, a consultant from Atiku Abubakar’s People’s Democratic Party, eager to demonstrate that the unknown Ukaegbu stood no chance, tapped a bystander. “Do you know this guy?” she asked. The man squinted. “Don’t force yourself,” said the consultant. But the man kept looking at Ukaegbu. “Hey, aren’t you the guy on CNN?” he said. “You’re good.”

At this point, the scrappy #Chike4Nigeria team included Re:LIFE cofounder Isaacs, now a New York high-school teacher of graphic design and photography, producing all campaign graphics during early morning hours and school recess. Ukaegbu’s older brother, Chibueze Ukaegbu Jr., CEO of a coding academy and software firm named LearnFactory Nigeria, contributed personal funds to the campaign’s shoestring budget and directed the efforts of his eight full-time and four part-time staffers toward building websites, apps, and WhatsApp broadcasts. In New York, Ezinne Kwubiri, head of inclusion and diversity at H&M, worked her contacts to generate media coverage and organize fundraisers, in addition to helping Ukaegbu manage his increasingly chaotic calendar.

By fall, which is when I came across the campaign on social media, Ukaegbu had become the nominee of the new Advanced Allied Party (AAP). AAP chose Safiya Ibrahim Ogoh, a woman from the country’s predominantly Muslim north, as his running mate—balancing out Ukaegbu’s roots in the mostly Christian south. 

Election posters cover walls and fences in Lagos, Nigeria’s largest city. Tens of thousands of candidates are running for office across all government levels in 2019—more candidates than ever before, according to Nigeria’s Independent National Electoral Commission.

Ukaegbu’s campaign message is that Nigeria stands at a pivotal point in its trajectory—young, rich in resources, and full of ingenuity, but also undereducated, plagued by conflict and corruption, and underdeveloped. Without leaders committed to aggressive investment in education and technology, he and Ogunleye believe, Nigerians will be crushed by the impending fourth industrial revolution—a revolution Ogunleye perceived signs of in China while on an Alibaba fellowship in 2017. “You think it’s bad now?” Ukaegbu asked a crowd in Philadelphia in November. “Just wait for artificial intelligence,” he said. “That will be real hunger games.”

While not all experts may share Ukaegbu’s and Ogunleye’s tech focus, many would agree that Nigeria is in trouble. In a 2019 analysis, Eurasia Group ranks the Nigerian presidential race among this year’s top ten geopolitical risks, thanks to three probable electoral outcomes: Buhari wins (“an elderly, infirm leader who lacks the energy, creativity, or political savvy to move the needle on Nigeria’s most intractable problems”), Abubakar wins (“another gerontocrat who would focus on enriching himself and his cronies”) or no clear winner emerges (“a dangerous wildcard”). Meanwhile, relentless demographic growth continues. In a 2018 report on “Severely Off-Track Countries,” Brookings projects that by 2030 the number of extremely poor in the nation may reach 160 million. 

“Wake up!” says Ukaegbu in one of his campaign videos. “We’re in a fight for our lives, and we don’t even know it.” 


One searingly hot day in January, I watched Ukaegbu deliver this message to a group of kids and teachers at a private school in Abuja, in the shade of a mango tree. Ukaegbu introduced himself to each person and remembered their names at question time. Despite his dire warnings, he radiated enthusiasm for what he called “amazing Nigeria.” Ben Dike, a teacher at the school who later invited Ukaegbu to teach a fifth-grade math lesson, asked, “How do you intend to compete with these gladiators?”

Realistically, Ukaegbu doesn’t stand a chance in the upcoming election. (“Who’s that?” asked my driver in Lagos when I told him whom I’d come to cover—a dampening response I had heard more than once.) It doesn’t seem to bother him. Ukaegbu “doesn’t see ceilings,” his cousin Ezemdi Akalonu, a junior at Brown University, told me. His family knows him as the 17-year-old who ran away from home to attend Lagos University instead of the local college prescribed by his parents—enraging his mother, a school principal, and his father, a civil servant and entrepreneur—before looking further afield, to the United States. Now, he’s the local boy made good: His high school in Aba invites him back to speak whenever he visits.

Chike Ukaegbu visits a private school in Abuja on January 18th, 2019.

Following the speech at the Abuja school, Ukaegbu and his brother Chibueze decamped to a spartan mezzanine room in a mostly vacant office building. An enormous desk in the otherwise empty, low-ceilinged space, and a bizarrely miniature bathroom gave the room an Alice-in-Wonderland vibe. Chibueze peered into his laptop, working on a website that would allow voters to find young candidates by district. Chike monitored some of the two dozen WhatsApp channels that he said bombard him with “five thousand messages a day.” The lights went off, as frequently happens, until a generator kicked in, and a fan resumed noisily circulating tropical air.

“We’re on the list,” Ukaegbu said, scanning a copy of the roster of presidential candidates, just released by Nigeria’s Independent National Electoral Commission. It was a relief—backroom intrigue and string-pulling, whether from within his own party or elsewhere, could no longer prevent his name from appearing on the ballot. 

Despite the early buzz, Ukaegbu’s run has struggled to attract donors. The party, which expected Ukaegbu to bring in the money, is not happy about it. The #Chike4Nigeria team predicted enthusiastic donations from the highly educated diaspora, which remitted home $25 billion in 2018, World Bank data show. According to numbers published by the Center for Social Justice, one tenth of one percent of that sum would match the total for media expenditures made in the 2015 campaign by PDP, the biggest spender and, as the then-governing party with a sitting president, an entity with unrivaled, direct, and unscrutinized access to the country’s oil revenue. But the money hasn’t materialized.

The team has also encountered skepticism from the very Nigerian demographic it hoped would welcome Ukaegbu’s bid: the 18-to-35-year-old voters who make up more than half the electorate. Instead, the most excited supporters have often been retirees, people in their sixties and older.

It’s hard to know how much of that is about the Ukaegbu campaign, and how much of it is about the realities of Nigerian politics. When I asked Matthew Page whether he could imagine an outsider, even a well-resourced one, mounting a successful presidential bid without established patronage-client relationships, he seemed unconvinced. “I think we’re a few decades away from that,” he said.

As I was boarding a flight back to New York in late January, Ukaegbu called me. He sounded tired. For days he had been trying to connect me with his vice-presidential pick, but she kept demurring. Two nights before, at a tense party meeting in a Stygian room in Abuja, I had watched him square off against members who wanted to declare support for Buhari and catch the last trickle of cash from the dwindling stream of the ruling party’s electoral funds. I asked whether he felt discouraged. He said he did not. “It’s certainly been an education,” he told me on the call. “It’s been like a PhD in Nigerian politics.”

14 Feb 14:14

Blackface Didn't End In The 1980s

by Hafsa Quraishi
Crowtrobot

anyone who had a diet of 00's video sketches certainly saw a lot of edgy "ironic" blackface

A model is seen wearing a black Gucci turtleneck with red lips. After the company was widely criticized for what many saw as resembling blackface, Gucci withdrew the sweaters from its shelves.

Two top Virginia lawmakers have admitted to wearing blackface in the 1980s. But blackface didn't stop then.

(Image credit: Andrea Ronchini/Getty Images)

12 Feb 18:18

French-Style Brown Butter New Potatoes

by Sasha Marx
Crowtrobot

Sounds good.

French-Style Brown Butter New Potatoes

Get Recipe!

12 Feb 14:21

Off the Map

by Patrick Iber
Crowtrobot

Sounds interesting.

In February 2003, Al Jazeera broadcast an interview with Donald Rumsfeld, then George W. Bush’s secretary of defense. “Would it worry you,” the interviewer asked him, “if you go by force into Iraq that this might create the impression that the United States is becoming an imperial, colonial power?” Rumsfeld swatted away the question. “I’m sure that some people would say that,” he responded, “but it can’t be true because we’re not a colonial power. We’ve never been a colonial power.… That’s just not what the United States does. We never have and we never will.” In March, the United States invaded Iraq. By April, it was operating a government of occupation. By May, it had effectively placed a proconsul in charge of the country.

In the early years of the Iraq war, the idea of the United States as an imperial power was, for a moment, a subject of serious debate. Longstanding left-wing critics of empire like Noam Chomsky were now joined by conservative hawks such as Niall Ferguson in agreeing that the United States was an empire, though they differed deeply on whether this was a good thing. But both Rumsfeld and the journalist questioning him exhibited a kind of historical amnesia. Rumsfeld denies the possibility that the United States could ever be an empire; the journalist asks if it is in the process of becoming one. But what if it had been all along?

HOW TO HIDE AN EMPIRE: A HISTORY OF THE GREATER UNITED STATES by Daniel Immerwahr. Farrar, Straus and Giroux, 528 pp., $30.00

That is the question Daniel Immerwahr pursues in How to Hide an Empire: A History of the Greater United States. “One of the truly distinctive features of the United States’ empire,” he observes, “is how persistently ignored it has been.” In order to address this historical amnesia, we must, he argues, consider the United States not as it is typically represented on the map—as the mainland United States with corners in Washington state, Maine, Florida, and Southern California—but as a collection of all of the territories in which the United States has exercised sovereignty. This “Greater United States” includes not only Puerto Rico, whose colonial status is at least widely recognized if not deeply considered, but also other territories ranging from rocks covered in bird excrement to the approximately 800 military bases that the United States still operates around the world. (Britain and France have 13 bases combined; Russia has nine.)

The book is written in 22 brisk chapters, full of lively characters, dollops of humor, and surprising facts. (Did you know that the U.S. greenback, for example, is modeled on the Philippine colonial currency, and not the other way around?) It entertains and means to do so. But its purpose is quite serious: to shift the way that people think about American history. Americans tend to see their country as a nation-state, not as an imperial power. As such, its global reach and influence are often invisible to its own citizens. So are the complex reactions that its actions around the world produce. Without an understanding of empire, Americans may see events elsewhere—a caravan of Central American migrants heading through Mexico, let’s say—as foreign threats, rather than as a consequence of the global distribution of power and violence, as something shaped by the history and politics of the United States.


Histories of U.S. empire often start in 1898 with the Spanish-American War, out of which the United States took control of Puerto Rico and the Philippines. But nearly since its founding, the United States had been in the process of expanding and debating how it would expand from 13 Eastern states across the continent. Early national elites were divided on how to apportion land for Native Americans and maintain peace between them and European settlers. Daniel Boone, for years taught to schoolchildren as a “pioneer,” was in his own time a criminal, who, by taking white settlers westward into Native American lands, risked involving the U.S. government in conflict. Thomas Jefferson’s original idea for the Louisiana Purchase was that it would primarily provide access to southern ports, and that much of the remaining land would be kept for Native Americans, along with free black people, Catholics, and others he judged unfit for citizenship.

But over the course of the nineteenth century, opportunities for profit pushed individuals and governments to seize more and more land. The white settlers’ beliefs about racial difference served both to justify this expansion and to limit it. New states applied for admission to the union: some successfully, while others—like Lincoln, West Dakota, Deseret, Cimarron, Montezuma, and the majority-Indian Sequoyah—were rejected. After the U.S.-Mexico War of 1846 to 1848, the United States acquired not the populous central and southern zones but only the relatively sparsely populated northern territories of Mexico, partly because of racism. Representatives who opposed expansion for moral reasons were joined by those who opposed trying to incorporate large numbers of people they considered racially unfit for democratic government. Those new territories would become states decades later only when white rule could be assured.

An unexploded shell from the USS Charleston in the streets of Caloocan in the Philippines in 1898. Douglas White/Naval History and Heritage Command

But while racist thought shaped the country’s enlargement, commerce was never far from the discussion either. One particularly strong chapter in How to Hide an Empire deals with the “guano islands” scattered throughout the Pacific and the Caribbean. By the 1850s, guano had become a highly prized commodity. Intensive, industrialized agriculture required supplies of nitrogen fertilizer, which could be made from bird excrement. To this end, an 1856 law allowed any American citizen to “take peaceable possession” of any previously unclaimed island where they discovered guano deposits. In that simple manner, the territory would come to belong to the United States.

While territories that bordered the nation could eventually expect full statehood, the guano islands resembled traditional overseas colonies. Because the islands were unpopulated—apart from the birds—the workers who would toil among mountains of guano often had to be tricked and coerced into going there. In some places, business owners employed Native Hawaiians; in others they exploited African Americans, who labored in conditions resembling convict camps. An 1889 uprising by workers on Navassa Island, off the coast of Haiti, led to the deaths of five whites; President Benjamin Harrison commuted the death sentences of its leaders after an investigation revealed that the workers had no recourse to any government official. The case nevertheless reinforced the notion that the islands were part of the United States.


The years after 1898 usually are seen as a rupture in the history of U.S. empire, as the United States acquired more substantial overseas colonies. But it wasn’t the fact of expansion that was new—that had been ongoing throughout the nineteenth century. Nor was it the conflict with foreign nations and the conquest of territory—Native American communities were foreign nations and had suffered violence and displacement as a result. Nor, for that matter, was it the acquisition of territory outside of the mainland—as the guano islands show. What was different was that the United States had acquired overseas territories with substantial native populations understood as nonwhite.

At first, the United States carried out its conquests of Puerto Rico, Cuba, and the Philippines in the name of liberation from Spanish tyranny. But hopes for genuine self-rule among the people in those former Spanish colonies soon met the reality of U.S. occupation. Cuba gained technical independence but with compromised sovereignty, as the United States insisted on the right to intervene in its affairs and frequently did so. In the Philippines, pro-independence forces led by Emilio Aguinaldo suffered a brutal counterinsurgent campaign, which cast doubt on the efficacy, not to mention the morality, of U.S. policy. Mark Twain, who became the most prominent anti-imperialist in the mainland United States, imagined colonial subjects thinking: “There must be two Americas: one that sets the captive free, and one that takes a once-captive’s new freedom away from him, and picks a quarrel with him with nothing to found it on; then kills him to get his land.”

As Immerwahr puts it, the United States faced a trilemma. It could have at most two of the following three things: republicanism, white supremacy, or overseas expansion. It was republicanism that lost out. In a 1901 ruling, the Supreme Court established that the Constitution did not fully apply to overseas territories. Their residents did not have the same rights as mainland Americans. Immerwahr details the many consequences of making these legal gray zones: counterinsurgency in the Philippines followed by colonial government; medical experimentation and sterilizations in Puerto Rico, and an economy arranged around exploiting its status as a tax haven; and Hawaii turned over to the military after Pearl Harbor. By the end of World War II, Immerwahr reckons, there were more people living in colonies and under U.S. occupation—including in Japan and the U.S. zones of Germany—than in the mainland.

This was a temporary condition. The third era in the history of U.S. empire, from the postwar period to the present, saw a retreat from formal occupation. The Philippines obtained independence in 1946; Alaska and Hawaii got full incorporation; Puerto Rico, Guam, and Samoa got civilian rule. But the United States was not necessarily acting out of altruism. The struggles of colonized people for independence, anti-Filipino sentiment in California, and America’s desire to claim the mantle of freedom in the Cold War conflict with the Soviet Union were all part of the picture. The United States was also strategically shedding the undesirable parts of its power: the obligation to maintain order, to support colonial administrations, and, from the racist point of view of many white Americans, the colonial ties that facilitated migration to the United States.

Instead, it now kept an empire of bases: signing a 99-year lease for land in the Philippines, keeping control of Vieques in Puerto Rico, Guantánamo in Cuba, Okinawa in Japan, and many hundreds of other locations. In an era of airplanes and wireless communication, territory was not required for dominance; indeed, it could be a source of friction. Immerwahr, borrowing a term from historian Bill Rankin, calls this new geography of power the “pointillist empire,” and others have called it an “archipelago.”


Immerwahr’s book will arrive as perhaps the most hotly anticipated release in American diplomatic history in some time, in part because of an unusual professional dispute: He debuted the arguments of this book in early 2016 in the Society for Historians of American Foreign Relations’ prestigious Bernath lecture; the text of his speech was published, as is customary, in the field’s leading journal, Diplomatic History. So far, so normal. What set tongues wagging was Diplomatic History’s decision to publish, on the eve of the appearance of Immerwahr’s book, an article-length rebuttal to the Bernath lecture, written by Paul Kramer, a historian of the Philippines, and entitled “How Not to Write the History of Empire.” Since its appearance in September last year, the Immerwahr-Kramer affair has been one of the first subjects of conversation among historians in this field.

For an article in an academic publication, Kramer’s piece made an astonishingly personal attack. It accuses Immerwahr’s essay of “reflecting deep historical currents of nationalist arrogance and short-sightedness.” Kramer says that in Immerwahr’s account, the people of the colonies rarely appear and only count when they are part of empire; that the colonies matter to him only for the way they affect the history of the United States. Kramer also objects to Immerwahr’s claim that mainstream American history has not accounted for the imperial role of the United States. Who, Kramer asks, counts as mainstream? His article includes an eight-page bibliographic appendix, listing books and dissertations on the subject of American colonialism.

If How to Hide an Empire resembled the work described by Kramer’s critique, it would indeed be a problem. Yet as How to Hide an Empire makes clear, Immerwahr’s argument is mostly intended for a mass audience, who won’t likely be familiar with the large body of work on this subject. Its goal is to convince a reading public of the centrality of empire to U.S. history. To do this, it does anchor parts of the argument with familiar figures; it is written from the mainland out, rather than the colonies in. Making the opposite choice would also yield an interesting and valuable book, but it would be a different book. “The history of the United States is the history of empire” is the line that closes How to Hide an Empire, and that is the point. On this, Kramer and Immerwahr would surely agree, in spite of the sparks.

Kramer’s strongest argument is that writing a history of U.S. empire as a history of territory leaves out a great deal, since “most expressions of American global power in the twentieth century” do not involve the conquest of new lands. Generations of American political leaders and government officials have sought and successfully developed more informal mechanisms of control, from economic pressure to CIA intervention. Parts of How to Hide an Empire bear out this critique. Writing of the 1960s, by which time Alaska and Hawaii had become states and the only colonies that remained were the Virgin Islands, American Samoa, Guam and other parts of Micronesia, Immerwahr asks, “Where had [America’s] imperialist spirit gone?” But in a decade that saw major escalation of the war in Vietnam, the U.S. invasion of the Dominican Republic, and America’s encouragement of political violence in Brazil and Indonesia, it’s not hard to find the “imperialist spirit” at work.

Left to right: housing for American troops in Okinawa in 1957, and aircraft maintenance at a base in U.S.-occupied Okinawa, Japan, in 1951. John Dominis/The Life Picture Collection/Getty; Werner Bischof/Magnum Photos

Immerwahr’s response is that thinking about territory, and not just informal mechanisms of control, can lead to important insights. (This approach also has the potential to reach people who are not convinced that indirect control or influence is worth describing as “empire.”) Immerwahr explains, for example, that the United States was willing to retreat from formal colonial possession when it could substitute the gains of empire at home, whether through natural resources or new technology. Indeed, one reason why the United States did not acquire major overseas colonies in tropical zones, while other European powers did, is that the country is already so large that it passes through multiple climatic zones, can produce products like sugar domestically, and possesses abundant resources of strategic products such as oil. As the twentieth century moved onward, synthetic chemicals further reduced the need for access to tropical markets. World War II saw acute shortages of rubber, for example, which the United States had largely obtained from European colonies, now seized by Japan. FDR imposed a national speed limit of 35 miles per hour to save tires; but by the end of the war, synthetic rubber had solved the problem.

The size of the U.S. economy in the postwar era—it accounted for half of global economic output—allowed it to spread international standards and practices (the octagonal red stop sign is nearly universal), spread the use of English, and shape the global economy, all but guaranteeing the United States access to the resources it needed. The possibilities of chemical substitution had dramatically lessened fears that the country would run out of resources: Fibers could be replaced by nylon or polyester, and plastic proliferated throughout postwar consumer society. The major exception was oil, as U.S. demand began to exceed supply. It is a telling exception, for it is precisely for access to the lifeblood of the global economy that the United States has repeatedly been willing to transgress international law. Facing the oil crises of the 1970s, Henry Kissinger mused that the United States “may have to take some oil fields.” “I’m not saying we have to take over Saudi Arabia,” he remarked at a National Security Council meeting in January 1975. “How about Abu Dhabi, or Libya?”

Immerwahr convincingly argues that the United States looks less like an empire than its European counterparts did not because U.S. policy maintains any inherent commitment to anti-imperialism, but because its empire is disguised first as continuous territory and later by the development of substitutes for formal territorial control. The United States “replaced colonies with chemistry,” and partially “substituted technology for territory.” It is a powerful and illuminating economic argument. To this, I think it must be added that non-territorial mechanisms of control, from CIA interventions to the policies of the International Monetary Fund, were essential to building and maintaining U.S. power in the twentieth century. That power can’t be understood without the Marshall Plan, or America’s support for the removal of Prime Minister Mohammed Mossadegh in Iran in 1953 and of President Salvador Allende in Chile in 1973. These get little attention in a book with a focus on formal territory. The pointillist strategy is part of hiding an empire, but it is not the whole story.


Nevertheless, the book succeeds in its core goal: to recast American history as a history of the “Greater United States.” Immerwahr’s final chapter, for example, centers on the bin Laden family. Yemen-born Mohamed bin Laden became the Saudi government’s preferred builder in the 1950s and worked on many classified projects for the U.S. military before his death in a plane crash in 1967. One of his 54 children, Osama, embraced a radically anti-Western interpretation of Islam. His primary grievance, among a litany of complaints about Western culture and behavior, was the presence of U.S. troops on bases in Saudi Arabia, reopened after the first Gulf War. From Afghanistan, he planned the attacks of September 11, 2001, which killed thousands and baited the Bush administration into a conceptually endless War on Terror.

Two of the things that have characterized the War on Terror have been the use of torture, carried out in the legal gray zones of Guantánamo Bay and military bases throughout the world, and the use of drones launched from a similar list of locations. It has been a war both provoked by America’s empire of bases, and fought from them. It is typical of U.S. imperial history that someone like Donald Rumsfeld could deny the very existence of empire while authorizing brutality that depends on it. Immerwahr’s book deserves a wide audience, and it should find one. In making the contours of past power more visible, How to Hide an Empire may help make it possible to imagine future alternatives.

*This article has been updated to correct the date of the Navassa uprising.

12 Feb 14:16

Mario Kart 8 Deluxe World Record Foiled By Last Second Blue Shell

by Heather Alexandra
Crowtrobot

oh man

A Mario Kart 8 Deluxe speedrun stalled out right at the finish line, seconds away from a new world record. It’s an ending that reminds us all that blue shells are absolute bullshit.

Read more...

06 Feb 18:07

Japan's Blueberry Cheese Burger Looks Freaky But Delicious 

by Brian Ashcraft
Crowtrobot

I disagree that it looks delicious.

That hamburger looks really gray and weird.

And just texturally I'm concerned.

Japan’s Freshness Burger has upped its bonkers burger game with its latest release: the Blueberry Cheese Burger.

Read more...

06 Feb 15:20

Breath Of The Wild Now Runs Beautifully In 4K (On The PC)

by Luke Plunkett
Crowtrobot

as seen here: in an animated GIF

While the latest Zelda game has been available in 4K for a while now on the PC via emulation, its performance wasn’t always the best. It is now.

Read more...

06 Feb 14:41

Sunless Skies’ wonderful-horrible Officers make a murderous void feel like home

by Alec Meer
Crowtrobot

I don't know why I want to buy this considering I thought Sunless Sea was merely "ok"

The word ‘lonely’ comes up often when discussing Sunless Skies, which seems like an odd thing to say about a game in which you haul yourself around the stars in the company of up to two dozen crew members. But that’s the tone of Failbetter’s twisted sci-fi Victoriana roguelike: feeling desperately alone and vulnerable, in a desperately large and lethal place. Those crew? They’re all nameless, faceless, hired only to die on your dime. In Sunless Skies’ merciless vacuum, care is a luxury you cannot afford.

(more…)

30 Jan 17:40

Why Ted Cruz Is Happy Hiding in Donald Trump’s Shadow

by Steven Cohen

On Monday, a few hours before Donald Trump called for “a total and complete shutdown of Muslims entering the United States,” Ted Cruz was asked whether he expects Trump to come after him, now that one leading poll has the Texas senator ahead in the coveted early voting state of Iowa. “Listen, I like and respect Donald Trump,” said Cruz. “I continue to like and respect Donald Trump. While other candidates in this race have gone out of their way to throw rocks at him, to insult him, I have consistently declined to do so, and I have no intention of changing that now.”

True to his word, Cruz refused to join the pack of Republican hopefuls who piled onto the front-runner’s latest obscenity. At a press conference the following morning to announce a Senate bill barring the resettlement of Syrian refugees, Cruz appeared alongside Texas Governor Greg Abbott and continued to dance around the question of Trump’s naked racism, at one point commending the Donald for “focusing the American people’s attention” on the urgency of fending off foreign invaders. Pressed for a direct response to Trump’s ban on Muslims, Cruz finally conceded, “I do not agree with [Trump’s] proposal. I do not think it is the right solution.”

The right solution, you may be surprised to learn, is Cruz’s solution, which he just happened to introduce in the Senate the morning after Trump belched out his own. The modestly titled “Terrorist Refugee Infiltration Prevention Act” would replace Trump’s blanket, possibly unconstitutional ban with a more targeted—and, in certain senses, crueler—three-year moratorium on the resettlement of refugees from Syria, Iraq, Libya, Somalia, Yemen, and any other country determined to contain “terrorist-controlled territory.” Where Trump’s answer is typically lacking in nuance, Cruz’s bill is designed to “focus very directly on the threat.” He’s casting it as the principled, measured alternative to a vaguely defined problem that both candidates insist exists.

“This is not about the Islamic faith,” Cruz explained to NPR’s Steve Inskeep on Wednesday. “It is about Islamism, which is a very different thing.” The conservatives Cruz is courting don’t appear to recognize the distinction, and it would be naive to think that Cruz isn’t perfectly aware of that. According to a new Bloomberg poll, two-thirds of likely Republican voters support Trump’s indiscriminate prohibition; one-third say it makes them more inclined to vote for him. 

If Cruz truly wanted to set his intentions apart from Trump’s, he could start by refuting the white-supremacist propaganda Trump has pointed to as evidence that “Muslim” is indeed synonymous with “terrorist sympathizer.” But Cruz, the champion debater and seasoned appellate attorney, is careful to present his disagreement with Trump as rooted in policy, not premise. “That is not my view of how we should approach it,” Cruz told NPR. He’s happy to let voters decide what the “it” is.  

Trump’s precipitous descent into outright fascism is widely considered to be a problem for the GOP—and in some ways it is. But for Cruz, never a party loyalist to begin with, it’s also created a unique opportunity to channel the energies of racial anxiety into a comparatively palatable, mainstream campaign for the presidency. A number of commentators have noted that Cruz is positioning himself to consolidate Trump’s support in the event of his eventual collapse—which, we keep being told, will be arriving any day now.

But the net, and more dangerous, effect of Cruz’s strategy is to legitimize the racism that informs Trump’s. Two weeks ago, Cruz was on the extreme end of a national debate over admitting people fleeing the ravages of countries the United States has made war on. By allowing Trump to “effectively outbid” him in the wake of the San Bernardino massacre, as NPR’s Inskeep put it, Cruz has come out looking relatively moderate and responsible in an entirely new discussion about whether the basis of U.S. policy should be overt xenophobia or implied xenophobia.

Each of the remaining Republican contenders is cognizant of the need to create rhetorical distance from Trump without disavowing the sentiments he’s churned up from below. Carly Fiorina called closing the borders to Muslims an “overreaction”—a euphemism that became a false equivalence when she compared it to President Obama’s “dangerous” underreaction to the supposed threat. Marco Rubio criticized the form of Trump’s comments but not their substance, saying only that Trump’s “habit of making offensive and outlandish statements will not bring Americans together.” Jeb Bush, who supports imposing a religious test on the admission of refugees, called Trump “unhinged.” Ben Carson, who disagrees with Trump’s proposed ban because he does “not advocate being selective on one’s religion,” has previously stated that a Muslim shouldn’t be allowed to be president.

The other candidates may recognize the dilemma posed by the stubborn popularity of Trump’s ravings, but no one has been as deliberate, or effective, in incorporating the strains of white nationalism into their own overarching strategy as Cruz has. He’s hewed closely—but, critically, not too closely—to Trump’s noxious line on immigration and refugees, which Cruz frequently ties together with warnings of an impending invasion from the south. “Border security is national security,” he said in a statement on Sunday prior to President Obama’s address about terrorism and the San Bernardino shootings. “I will shut down the broken immigration system that is letting jihadists into our country,” he reiterated later.

So far, Trump’s flamboyant nativism has drawn all the scrutiny, leaving Cruz to concentrate on raising money and building out his ground game. He knows better than to openly embrace the most jarring of Trump’s flourishes, but he won’t attack them, either—and when others do, Cruz is right there holding the flank. President Obama sounds like a “condescending school marm lecturing the American people against Islamophobia,” Cruz told NPR’s Inskeep. At the last Republican debate, he invoked his Cuban-American heritage as a cover for the field’s more general shift in the direction of mass deportation and wall-building: “For those of us who believe people ought to come to this country legally, and we should enforce the law, we’re tired of being told it’s anti-immigrant. It’s offensive.” Two weeks later, campaigning on the road in Iowa alongside Representative Steve “Cantaloupe Calves” King, perhaps the most aggressively ignorant anti-immigration crusader in Congress, Cruz assured reporters that “tone matters” when it comes to these issues.

In an effort to explain his latest step down the road to the internment camp, some have speculated that Trump is attempting to fend off Cruz’s surging poll numbers. If so, he misunderstands the nature of Cruz’s maneuvering, as well as the depth of Cruz’s patience. With each reflexive lurch toward a darker, more explicitly ugly politics, Trump draws more attention to himself but also clears more ideological space for Cruz. Lindsey Graham, who’s polling somewhere ahead of Louis Farrakhan in the race for the Republican nomination, told the Guardian, “It’s time for Ted Cruz to quit hiding in the weeds and speak out against Donald Trump’s xenophobia and racial bigotry.” 

But Ted Cruz likes it in the weeds just fine. He’s made it this far trudging through the muck, and there’s no reason for him to change course anytime soon.

19 Aug 17:11

Police are operating with total impunity in Ferguson

by Matthew Yglesias

Above you'll see a picture of Scott Olson, the Getty photographer who's brought us many of the most striking images of protests and police crackdown that followed the shooting of Michael Brown.

The other two men in the photograph, despite presumably being police officers, are not identifiable at this time. Unlike normal police officers, they are not wearing name tags or badges with visible numbers on them. When police arrested the Washington Post's Wesley Lowery and the Huffington Post's Ryan Reilly, they weren't wearing badges or nametags either. Reasonable people can disagree about when, exactly, it's appropriate for cops to fire tear gas into crowds. But there's really no room for disagreement about officers of the law taking off their badges and policing anonymously.

There's only one reason to do this: to evade accountability for your actions.

Olson was released shortly after his arrest, as were Reilly and Lowery before him. Ryan Devereaux from The Intercept and Lukas Hermsmeier from the German tabloid Bild were likewise arrested last night and released without charges after an overnight stay in jail. In other words, they never should have been arrested in the first place. But nothing's being done to punish the mystery officers who did the arresting.

And what's particularly shocking about this form of evasion is how shallow it is. I can't identify the officers in that photograph. But the faces are clearly visible. The brass at the Ferguson Police Department, Saint Louis County Police Department, and Missouri Highway Patrol should be able to easily identify the two officers who were out improperly arresting photographers. By the same token, video taken at the Lowery and Reilly arrests should allow for the same to be done in that case.

Policing without a nametag can help you avoid accountability from the press or from citizens, but it can't possibly help you avoid accountability from the bosses.

on another level, it would almost be nicer to hear that nobody in charge thinks there's been any misconduct

For that you have to count on an atmosphere of utter impunity. It's a bet many cops operating in Ferguson are making, and it seems to be a winning bet.

In his statement today, President Obama observed that "there's no excuse for excessive force by police or any action that denies people the right to protest peacefully," seeking to tap into the widespread view that some instances of excessive force and denial of First Amendment rights have taken place. But Obama did not even vaguely hint that any officer of the law would or should face even the slightest sanction for this inexcusable behavior.

Statements from Governor Jay Nixon and Highway Patrol Captain Ron Johnson have suffered from the same problem. It is nice, of course, to hear that one's concerns are in some sense shared by the people in power.

But on another level, it would almost be nicer to hear that nobody in charge thinks there's been any misconduct. After all, a lack of police misconduct would be an excellent reason for a lack of any disciplinary action. What we have is something much scarier: impunity. The sense that misconduct will occur and even be acknowledged without punishment. Of course there are some limits to impunity. Shoot an unarmed teenager in broad daylight in front of witnesses, and there'll be an investigation. But rough up a reporter in a McDonald's for no reason? Tear-gas an 8-year-old? Parade in front of the cameras with no badges on? No problem.


According to a Pew poll released earlier today, most white people have a good amount of confidence in the investigation into Michael Brown's death. They have the good sense, however, to at least admit to some misgivings about the handling of the protests.

What they ought to see is that the two are hardly so separable. The protests would not be handled so poorly if the officers doing the handling felt that they were accountable for their actions. And a policing culture that doesn't believe cops should be accountable for their actions is not a culture that lends itself to a credible investigation.

04 Aug 15:39

Revenge of the conservative nerds

by Ezra Klein
Crowtrobot

"Theirs is the nerd-dom of Star Wars, not Star Trek; of Mario Kart and not World of Warcraft; of the latest X-Men movie rather than the comics themselves." -- these guys know that star trek is/was written by a bunch of soggy lefties right?

The National Review recently published an odd, but interesting, essay by Charles C. W. Cooke about "America's nerd problem."

The article begins by accusing a number of writers, broadcasters, politicians, and scientists (including myself, Matt Yglesias, and Dylan Matthews, as well as Neil deGrasse Tyson and Al Gore) of being faux-nerds: "Theirs is the nerd-dom of Star Wars, not Star Trek; of Mario Kart and not World of Warcraft; of the latest X-Men movie rather than the comics themselves."

Yglesias defends himself against the scurrilous accusation that he's not a true trekkie here. And anyone who doubts Matthews' nerd credentials has never met him. Still, I'll cop to some of the charges: I prefer Mario Kart to World of Warcraft and have little patience for either Star Trek or Star Wars. My knowledge of old X-Men comics, however, is embarrassingly complete (and let's not get started on X-Force).


It’s like a magazine, but it’s for nerds.

Nerd-offs aside, Cooke's essay, though putatively about progressivism, is an interesting window into the state of contemporary conservatism. The old conservative critique of nerds — or, to be more precise about it, technocrats and intellectuals — was that their approach to knowledge was fundamentally flawed.

"I would rather be governed by the first 2,000 people in the Boston telephone directory than by the 2,000 people on the faculty of Harvard University," William F. Buckley, the founder of The National Review, famously said. (I admit the data here is poor, but I would guess that the Boston telephone directory tilts towards Star Wars while the Harvard faculty favors Star Trek.)

Yuval Levin, one of conservatism's foremost thinkers, has argued that America's two political traditions are rooted in this debate.

He's written a book about the arguments between Edmund Burke and Thomas Paine, which he frames as a disagreement about what we can know in society, with Burke prizing "social knowledge" and Paine prizing "technical knowledge." Conservatism, he's argued, emerges from the Burkean tradition: it is skeptical of what the nerds can know and reverent of what the common man has learned — that's where you get Buckley's quote, for instance. Liberalism, he says, is just the reverse.

Cooke's essay is convoluted and expresses, at times, both pro- and anti-nerd sentiments. But the framing is clear, and it reflects an emergent trend in conservatism. Its argument isn't the classically conservative argument that the left is full of nerds and their ambitious, arrogant designs should be mistrusted; it's that the left is full of faux-nerds who lack scientific training but nevertheless wear glasses — and their ambitious, arrogant designs should be mistrusted. Or, to put it more simply, the problem isn't nerds so much as poseurs.

"Sorry, America," he concludes. "Science is important. But these are not the nerds you're looking for."

A version of this transition can be seen in the Republican Party's lurch from George W. Bush to Paul Ryan. The left's knock on Bush (and, before him, Reagan) was that he was dumb and inarticulate; the right's riposte was that the left prized the wrong kind of knowledge, and that Republicans were smart enough to know that ordinary Americans were a helluva lot wiser than Ivy League elitists. And the right was comfortable in that response: it was an argument they kept winning at the polls.


But after Bush's disastrous presidency and Obama's political successes, today's Republicans don't want to towel snap the Democratic Party's nerds. They want to out-nerd them. The party's standard-bearer, insofar as there is one, is Paul Ryan. He became the GOP's champion after video of him blasting President Obama with charts and graphs during the Blair House debate over health-care reform went viral. The National Review called it "a devastating critique" that proved "that the Democrats just don't have an answer to Ryan's arguments." (The answer to Ryan's arguments was that they were mostly wrong.) A few weeks later Real Clear Politics wrote that you could easily imagine Ryan at a Star Trek convention.

Ryan then became the Republican Party's vice presidential nominee on the strength of his unusually detailed budgets, which, again, were contrasted with Obama's faux-wonkery: "In an era that seemingly rewards shallow oratorical excellence over substance (see Obama, Barack Hussein), his political brilliance is the capacity to educate on a vision, run on a record of accomplishment, and — yes — stand on his feet and talk persuasively about both," enthused conservative economist Doug Holtz-Eakin at the Daily Caller.

Even the case for Ted Cruz gets made in terms of a nerd-off with Obama.

"Cruz went to Princeton University, where he was a national champion debater, and got a law degree from Harvard. Cruz's legal career was objectively more impressive than Obama's," wrote Jonah Goldberg at the National Review. "He clerked on the appellate court and for Chief Justice William Rehnquist on the Supreme Court. He held numerous prestigious jobs in and out of government. Like Obama, he taught law, but Cruz was also the solicitor general of Texas and argued before the Supreme Court nine times." His World of Warcraft guild could probably crush Obama's.

01 Aug 20:02

The Times of Israel published an article on "when genocide is permissible." Here's their funder

by Matthew Yglesias

A writer named Yochanan Gordon briefly had an article up on the website of The Times of Israel calling for genocide of the Palestinian population of the Gaza Strip.

The site's editors seem to have come around to the view that this kind of content is wildly inappropriate and took it down.

The incident naturally raises curiosity about the nature of the Times, which turns out to be a fairly small-audience, English-language, digital-only project. What's interesting, however, is that though it's formally organized as a for-profit, it's pretty clearly a money-losing endeavor undertaken by the somewhat idiosyncratic Boston-area hedge funder and philanthropist Seth Klarman. Klarman is mostly a Republican, but he backs marriage equality for gay and lesbian couples. He's also one of the major sources of funding for the Ending Spending Action Fund.

More relevantly, according to the Forward, he's very active in a number of US-based pro-Israel groups of various stripes:

But the foundation also has a focus on Israel advocacy. Klarman has been a board member of, and a major donor to, The Israel Project, a fast-growing pro-Israel advocacy group that seeks to provide information useful to working journalists. He gave the group nearly $4 million between 2008 and 2010.

The foundation has also given smaller amounts to the Middle East Media Research Group, an anti-Islamist research group whose board members include Elliott Abrams, a senior aide in several Republican administrations, and Steve Emerson, a researcher devoted to exposing ties, as he perceives them, between American Muslims and extremist Muslim movements. Klarman has also contributed to the Committee for Accuracy in Middle East Reporting, a group devoted to combating what it sees as anti-Israel bias in the media.

Klarman has also been the longtime chairman of The David Project, a Boston-based group mostly concerned with pro-Israel advocacy on campus. The group is also known for its long-running, and ultimately failed, effort to oppose the construction of a Boston mosque. Klarman said in an interview with the Forward that his interest in The David Project was in its campus work. The group has recently adopted a more moderate approach to campus activism.

Obviously the folks running the show at the Times of Israel have the good sense to recognize after the fact that open calls for genocide are not in keeping with an institutional mission to try to make Israel look good in the world.

31 Jul 17:12

America's marijuana policy isn't funny. It's racist.

by German Lopez
Crowtrobot

"Hahahahaha! We're completely insulated from the consequences of the war on drugs."

When NBC's Meet the Press held a roundtable over the weekend about the New York Times Editorial Board's decision to endorse marijuana legalization, participants seemed to take the issue very lightly — regularly making jokes between a few serious policy points.

It was obvious where the conversation would go from the start, when host David Gregory mentioned marijuana and giggles went around the table. From that point, the jokes flowed. "I don't know what they've been smoking up there," said columnist David Brooks about the New York Times Editorial Board. Judy Woodruff of PBS Newshour said, "When I think of grass, I think of something to walk on. When I think of pot, I think of something to put a plant in."

there's a very serious disconnect about what marijuana legalization would mean for America

The chuckles are typical in conversations about marijuana policy. At one of his first town halls, President Barack Obama joked, "There was one question that was voted on that ranked fairly high: that was whether legalizing marijuana would improve the economy and job creation. And I don't know what this says about the online audience." In June, former President Bill Clinton asked, "Rocky Mountain high?" to chuckles before going into an answer that seemed to support state-based reform. Hillary Clinton got in some jokes about marijuana before answering a similar question at a CNN-hosted town hall in the spring.

In some cases, these quips help lighten a conversation about drugs that many Americans, especially parents, are simply uncomfortable with. But the jokes also reflect a problem in discussions about US drug policy: there's a very serious disconnect about what marijuana legalization means for Americans.

It's easy to joke about marijuana policy when the idea of legalization feels more like a new freedom, which might be the case for whiter and wealthier populations. As someone from a privileged background who socializes with people from similarly privileged backgrounds, my social circle's conversations about pot legalization largely revolve around how cool and liberating it might be to buy pot legally. What we rarely mention in these conversations: race, the criminal justice system, and the fear of getting arrested if someone were to buy pot illegally.

black people are 3.7 times more likely to get arrested for pot possession

But for minority and poorer populations, marijuana policy is much closer to a civil rights issue. Marijuana isn't just a drug that they would like to be able to use and carry in the open. Marijuana criminalization has historically been used to harass and arrest people in minority and poor communities at hugely disproportionate rates.

A 2013 report from the American Civil Liberties Union found that, even though white and black people use pot at similar rates, black people are 3.7 times more likely to get arrested for marijuana possession. In the District of Columbia, black people were 8.3 times more likely to get arrested for the drug.


Black and white people use pot at similar rates, but black people are much more likely to be arrested for it. (ACLU)

These racially disproportionate marijuana-related arrest rates have persisted in some states even after decriminalization, under which criminal penalties are removed but the drug remains technically illegal.

New York, for instance, decriminalized marijuana in 1977, but as of 2012 had one of the highest arrest rates for pot possession. The problem: New York law allows arrests for marijuana that's within public view. Police officers in New York City regularly used this exception to arrest people, particularly minorities, by getting them to empty their pockets during stop-and-frisk searches and expose marijuana that would otherwise have remained hidden. (According to a report from the New York City Public Advocate's office, the vast majority of stop-and-frisk searches in 2012 — roughly 84 percent — involved black or Hispanic people.)

There is a legitimate debate to be had about whether these arrests focus on drug traffickers instead of users, highlight broader problems in the war on drugs and the criminal justice system, or signify higher crime rates among minority communities.

marijuana policy is simply no joking matter

But Meet the Press didn't even give that debate a chance. The roundtable instead focused on the health effects and whether legalization increases pot use. These are very important issues that need to be discussed, but they're also the kinds of issues more privileged Americans can focus on because they just don't see the skewed effects of criminalization in their everyday lives.

Just imagine, for example, if the couple of minutes the roundtable spent on jokes were instead spent discussing racially uneven drug policy enforcement. As Ryan Cooper of The Week points out, this would be much more valuable to Meet the Press' audience. Because for a large chunk of the US population, marijuana policy is simply no joking matter.

To learn more about marijuana legalization, read our full explainer and watch the video below:


29 Jul 16:32

Habanero Carrot Cake From 'Sweet and Vicious'

by Emma Kobolakis
Crowtrobot

oooooh

Habanero Carrot Cake From 'Sweet and Vicious'
If the fire-breathing dragon wasn't hint enough, one bite will prove this cake is packing heat. In her recently released cookbook, Sweet and Vicious: Baking with Attitude, Libbie Summers stirs hot pepper extract into a lightly spicy batter, and spikes the cream cheese frosting with spiced pecans. The fruitiness of the pepper works well with the carrot-heavy batter, further enhanced by traditional cinnamon, nutmeg, and ground cloves. Get Recipe!
25 Jul 18:54

Corporations used to pay almost one-third of federal taxes. Now it's one-tenth.

by Danielle Kurtzleben

Everyone agrees the corporate income tax is broken, but meaningful changes to it never seem to happen. And despite a flurry of recent attention, action on inversions — reincorporating a business in a foreign country in order to take advantage of lower rates — seems to keep being punted into the future.

Obama railed against inversions this week, calling for "economic patriotism." But Senate Democrats say they don't think Congress can address the issue before August recess, as Bloomberg reported this week. And even then, agreement looks tough: Republicans want broader tax-code reform, and not all Democrats are behind Obama.

Corporations are shouldering far less of a tax burden than they used to. Corporate tax revenues have declined as a share of GDP over the years, but individual tax revenues have held steady, according to a 2013 GAO report.


Corporations account for a much smaller share of the tax revenue pie than they used to. In 1952, corporations accounted for 32.1 percent of federal revenue. As of 2013, it was less than 10 percent.

It's understandable why US corporations seek out inversions — the US has the highest nominal corporate tax rate among developed countries, with a 35 percent top federal rate and a 39.1 percent average combined rate. While other countries' rates have fallen, the US rate has stayed high.

But corporations are also very good at finding ways to pay less. The GAO found that all corporations that filed M-3s (a tax form for large and international corporations) paid an average effective tax rate of 22.7 percent. Among profitable companies only, the rate was even lower, at 17 percent.

So even while the US corporate tax rate is high, its corporate tax revenue collections are low. That's one reason why, as of 2011, the US was on the low end of corporate tax revenue among OECD nations.


21 Jul 15:14

Cajun Pork and Beef Pie with Savory Cream Cheese Topping

by Jennifer Olvera
Crowtrobot

this sounds delightful and also like it will kill me

Cajun Pork and Beef Pie with Savory Cream Cheese Topping
This Paul Prudhomme-inspired pie is essentially a sweet pastry crust filled with a savory mixture of Cajun-spiced ground pork and beef. It's topped with rich seasoned cream cheese, which turns bubbly and browned in the oven—in short, it's bliss on a plate. Get Recipe!
18 Jul 01:59

Chocolate Mousse and Marshmallow Icing S'mores Cake

by Ideas in Food
Chocolate Mousse and Marshmallow Icing S'mores Cake
Borrowing all the classic flavors of a campfire s'more, the Ideas in Food team creates a graham cracker cake that's flavored with browned butter, layered with a dulce de leche-spiked chocolate mousse, and topped with a toasted bourbon-marshmallow icing. Get Recipe!
17 Jul 15:14

Tipping perpetuates racism, classism, and poverty — let's get rid of it!

by Brandon Ambrosino

"Welcome to my restaurant; now please pay my employees."

That's tipping in a nutshell, according to Mark Ventura, a former waiter and an economics major at Miami University. Ventura was quoted last week in an article profiling the restaurant Packhouse Meats, which opened in January in Newport, KY. The restaurant has a no-tipping policy. Signs proudly announcing the embargo are on full display in the restaurant, and the credit card slip only has a place for your signature — no extra line for gratuity.

Plenty of people have written about the indignities of the American tipping system. English author Lynne Truss once compared visiting New York to visiting the Third World: "In this great financial capital ... tips are not niceties: give a 'thank you' that isn't green and foldable and you are actively starving someone's children." The Village Voice's Foster Kamer called tipping "an assault on fairness" for everyone involved in the transaction: "It reinforces an economically and socially dangerous status quo, while buttressing a functional aristocracy," he wrote in "The Death of Tipping". Meanwhile Michael Lewis, in one of the most well-known essays on the subject, argued against it from the consumer's perspective, comparing obligatory tipping — and what sort of tipping isn't in some sense obligatory? — to a government tax: "I feel we are creeping slowly toward a kind of baksheesh economy in which everyone expects to be showered with coins simply for doing what they've already been paid to do."

And yet for some reason, the customary practice of tipping endures, and all of us who read these essays and hope they catch on continue to actively participate in the system we seem to so publicly hate. As William Scott pointed out almost a century ago in The Itching Palm, one of the first published anti-tipping screeds, "There are abundant indications of a widespread distaste for the custom but the sentiment is unorganized and inarticulate."

Here, then, is the complete case against tipping.

1) Tipping lets employers off the hook

The first and most compelling rebuttal to any case against tipping is always BUT THAT'S HOW SERVERS MAKE MOST OF THEIR INCOME.

Yes, that's right — and that's the problem. Restaurant servers' hourly wages are ridiculously low — $2.13 an hour, in fact, in most states — and they do depend on tips to account for the bulk of their income. Taking away a server's tips would put her in a bad place financially — unless her employer ups her hourly wage. As it now stands, the tipping model lets business owners make more money at the expense of their employees' hard work. But rather than let their employees grovel for tips, restaurateurs ought to be required to pay their employees a living wage.

Consumers should not be responsible for paying the incomes of a restaurant owner's employees. For one thing, it isn't fair to the consumers. But more troublingly, it isn't fair to the employees: a server's ability to pay his bills shouldn't be subject to the weather, the frequency with which he touches his guests, or the noise level of the restaurant, all of which are factors that contribute to the tip amount left by a consumer.

As the Economic Policy Institute (EPI) notes,

Tipped workers — whose wages typically fall in the bottom quartile of all U.S. wage earners, even after accounting for tips — are a growing portion of the U.S. workforce. Employment in the full-service restaurant industry has grown over 85 percent since 1990, while overall private-sector employment grew by only 24 percent. In fact, today more than one in 10 U.S. workers is employed in the leisure and hospitality sector, making labor policies for these industries all the more central to defining typical American work life.

EPI also cites research finding that the poverty rate of tipped workers is nearly double that of other workers (as the chart below indicates), and that tipped employees are three times more likely to be on food stamps.

(Chart: poverty rates of tipped workers versus other workers. Economic Policy Institute)

EPI also argues it is false to suggest that "these workers' tips provide adequate levels of income and reasonable economic security," as 2014 reports from the White House and the Congressional Budget Office argued. Further, they say, research clearly shows that poverty rates are reduced in those states where the minimum wage rate for tipped workers has been raised.

2) Tipping is undemocratic

"The itching palm is a moral disease," wrote Scott in his 1916. To him, tipping was a threat to the founding principle of democracy: that all men are created equal. Allowing an American citizen (i.e. the person being tipped) to adopt the posture of a sycophant is deeply undemocratic, argued Scott, because it limits self-respect to the "governing classes" (i.e. the tippers).

According to Michael Lynn, a professor of consumer behavior at the Cornell University School of Hotel Administration, the practice of tipping originated in Europe and only later migrated to America just after the Civil War. (As for why the practice started in Europe in the first place, Kamer discusses different theories.) Wealthy Americans returning home from European vacations wanted to show off what they'd learned abroad, and so they started tipping their service workers.

Tipping, in other words, is rooted in an aristocratic tradition. It should come as no surprise that tipping took off in Europe, a continent that promoted a clear distinction between the servant class and higher forms of society. But as Scott notes, America prides itself on not distinguishing social groups based solely on their financial means. In fact, he notes, "Tipping, and the aristocratic idea it exemplifies, is what we left Europe to escape."

Scott isn't the only one with this view. According to Yoram Margalioth of Tel Aviv University Law School, tipping in America was at first "met with fierce opposition as fostering a master-servant relationship ill suited to a nation whose people were meant to be social equals." The Anti-Tipping Society was founded in 1904 in Georgia, and convinced its 100,000 members to forswear tipping for an entire year. Labor unions, too, came out against tipping, as did the president of the American Federation of Labor, Samuel Gompers. Opposition to tipping finally got codified into law when Washington State passed a no-tipping law in 1909. Five other states followed suit, though, according to Wachter, none of the laws were enforced, and as a result, all of them were repealed by 1926.

Today, tipping continues to be de rigueur in America, while, ironically, the European custom has been replaced on its home continent by a service charge.


3) Tipping doesn't do what it's supposed to do

As Margalioth notes, many people view tipping "as an informal service contract between the customer and the waiter, acting as a consumer-monitoring mechanism." This informal contract reinforces the belief that customers are able to monitor the service they receive and reward it accordingly. In other words, the argument goes, tipping motivates the server to do her best work. This makes some sense in theory, but in reality, it's really, really wrong.

After a study of more than 2,600 dining parties at 21 different restaurants, Lynn concluded that "tips are only weakly related to service." As Margalioth notes, the most important factor for patrons deciding on tip amounts is the amount of the check, not the efficiency, or inefficiency, of the server; the quantity of the food they order, not the quality with which it's served to them. This finding, Lynn argues, "raises serious questions about the use of tips as a measure of server performance or customer satisfaction as well as the use of tips as incentives to deliver good service." It also emphasizes the fact that tipping is really, painfully unfair: how in the world is bringing a customer a $1,000 bottle of wine any more work than bringing her a $60 bottle? If Lynn is right, and customers generally tip on amount alone, the difference between the hypothetical 20 percent gratuities would be $188 — a $200 tip versus a $12 tip.
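
To make the check-size effect concrete, here's a minimal arithmetic sketch using only the hypothetical wine numbers from the paragraph above; the `tip` function is ours, purely for illustration, and has nothing to do with Lynn's actual methodology:

```python
# Illustrative only: if tips track the size of the check rather than the
# quality of service, the same 20 percent rate pays out very differently
# for essentially identical work.

def tip(check_amount, rate=0.20):
    """Return the gratuity owed on a check at the given rate."""
    return check_amount * rate

expensive_bottle = tip(1000)  # $200.00 for carrying a $1,000 bottle
cheap_bottle = tip(60)        # $12.00 for carrying a $60 bottle

print(f"${expensive_bottle:.2f} vs ${cheap_bottle:.2f}")
print(f"Difference for the same labor: ${expensive_bottle - cheap_bottle:.2f}")  # $188.00
```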

Steve Dublanica, author of two books on the service industry, said that any server would agree with Lynn's findings:

If you've waited tables, you know this is true. I learned this on the job years ago. You can give people amazing service and they'll stiff you. You can give them horrible service, and they can give you a great tip. There's no rhyme or reason to it. If only 2 percent of the tip is based on the service, what are the other 98 percent doing? If they're not tipping on service, they're tipping on psychological processes that are happening.

Jay Porter, owner of the Linkery restaurant in San Diego, said it's "silly" to think that servers are motivated merely by prospective tips. "Servers are motivated to do a good job in the same ways that everyone else is," he wrote in Slate, noting that they're motivated by wanting to keep their jobs and earn raises, and because they take pride in their work. He added: "In any workplace, everyone is required to perform well, and tips have nothing to do with it."

Not that tipping isn't a powerful motivator. It is — just not for the employee. The thought of being able to hire labor at around two bucks an hour is probably great news to employers looking to turn profits. Again, that's problematic. (See #1.)

4) Tipping is discriminatory … and it might be illegal

The way we tip reflects our prejudices, argues Freakonomics' Stephen Dubner. Here's what he told Brian Lehrer: "The data show very clearly that African Americans receive less in tips than whites, and so there is a legal argument to be made that as a protected class, African American servers are getting less for doing the same work. And therefore, the institution of tipping is inherently unfair."

But not only are black servers making less money than white servers — black diners are perceived to be leaving less money than white diners. Data collected in 2009 from over 1,000 servers all across the US "found that over sixty-five percent [of servers] rated African Americans as below average tippers." As a result, restaurant workers of all colors dislike waiting on black customers, studies found. The economy of tipping is so racially charged that both servers and diners are affected by prejudice.

Racism isn't the only kind of discrimination baked into the American tipping system. Female servers, too, face routine discrimination. As Lynn told Dubner: blonde, slender, larger-breasted women in their 30s earn some of the highest tips. Granted, the decision of how large a tip to leave is up to the subjective whims of the tipper, and different people have their own aesthetic preferences. But when a server's main source of income is her tips, and if those tips are regulated by the prejudices of the tippers, then a case could potentially be made that certain wage practices of restaurants are discriminatory.

This is the very case Kamer made (emphasis mine): "In 1971's Griggs v. Duke Power, the Civil Rights Act of 1964 was ruled to prohibit businesses with discriminatory practices against those protected under it, even if that effect is unintended. Tipping, which has been proven to be discriminatory, could be downright unconstitutional."


5) Tipping might be psychologically harmful

In response to the question, "Do you feel pressured to tip at a restaurant even if you feel you received bad service?" 70 percent of those polled answered "yes." Margalioth wrote, "This seems to prove the social norm of tipping is so strong that many people feel extorted to tip."

But why do we feel such an intense pressure to tip? According to Lynn, we tip in order to prevent feeling guilty or ashamed for violating the social norm of tipping: "Perhaps [the tipper] dislikes having someone disapprove of her," he says. Or maybe she's "internalized some standard of fairness that leads her to feel guilty if she does not reward the server for his efforts." Ofer H. Azar, economist and professor at Ben-Gurion University of the Negev, agrees with Lynn: "people tip because this is the social norm and, when they disobey the norm, they suffer a psychological disutility because of social disapproval, embarrassment, and feeling guilty and unfair."

There's another way tipping could take a toll on our psyches. Margalioth argues that tipping is a form of "negative externality imposed by wealthy people on the rest of society." According to Margalioth's theory, when top earners spend more money, those who earn less feel pressured to keep up, as research has shown. In other words, she suggests, middle-class and poor Americans feel like they have to be as "visibly impressive" as wealthier Americans. This pressure might be a motivating factor in tipping, she says.

The upshot of this research is summed up by Lynn: "I think it's quite possible that tipping norms undermine overall satisfaction or happiness."

6) Tipping is not really charitable

Arguing that we do away with tipping seems like a mean thing to do: the world needs more charity, thank you, so you should keep tipping your server. But the problem with this argument is that leaving a gratuity is not actually charitable.

The word "gratuity" comes from a word meaning "gift." But that word doesn't really make sense in the context of tipping, which is, of course, a quid pro quo arrangement. You don't gift the waiter money, you release funds to him that he, by virtue of simply being your server, has earned. He is rightfully entitled to that money, and you are ethically obligated to give him by social norms that seem to be as binding as any government law.

Scott sees tipping as "misguided generosity." While we are right to feel gratitude for those serving us, he argues we go awry when we feel obligated to express our "appreciation in terms of money." After all, notes Scott, "Self respect is satisfied with verbal appreciation."

Of course, verbal appreciation won't pay the bills of tipped workers, almost 13 percent of whom live in poverty. But rather than satisfy our consciences with trivial thoughts about how tips are really charitable, we should start holding restaurant owners accountable for their employees' wages. If they argue that servers actually like the tipping system because they come out on top, we should ask these owners to put their money where their mouths are and cut their own pay down to two bucks an hour.

Plus tips.

16 Jul 17:48

Mildly Sweet Sweet-Potato Biscuits

by Marissa Sertich Velie
Mildly Sweet Sweet-Potato Biscuits
Sweet potatoes started out as a way of stretching expensive refined flour in biscuit doughs for those who couldn't otherwise afford it, but they're not just an economical step: They create moist, flavorful biscuits that are even more likely to be tender, because some of that sweet potato replaces what would otherwise be wheat gluten. Here are the steps to make them. Get Recipe!
16 Jul 16:10

Is organic food any healthier? Most scientists are still skeptical.

by Brad Plumer

You've no doubt noticed that organic foods are a fair bit more expensive at the grocery store. An organic head of lettuce can cost twice as much as a regular one. But is it any healthier for you?

In recent years, most scientists have answered this question with a flat "no." There simply doesn't seem to be much evidence that organic foods are more nutritious than conventional foods.

In 2009, the United Kingdom's Food Standards Agency reviewed 67 studies on this topic and couldn't find much difference in nutrient quality between the two food types. In 2012, a larger review of 237 studies published in the Annals of Internal Medicine also found that organic foods didn't appear to be any healthier or safer to eat than their conventionally grown counterparts.

But there have long been dissenters who argue that there must be some health benefits to organic. And a July 2014 study in the British Journal of Nutrition, led by Carlo Leifert of Newcastle University, reopened this debate by adding a small twist. The researchers reviewed 347 previous studies and found that certain organic fruits and vegetables had higher levels of antioxidants than conventionally grown crops.

Unfortunately, this doesn't prove very much by itself. No one knows if those moderately higher levels of antioxidants actually boost your health. For that to happen, they'd have to be absorbed into your bloodstream and distributed to the right organs — and there just hasn't been much good research showing that. For now, there's little evidence to suggest concrete health benefits from eating organic.

In the meantime, some commentators have suggested that this endless health debate has become a distraction. Marion Nestle of New York University argues that the best reasons to buy organic produce involve environmental impacts and production values. Any nutritional benefit is a "bonus," if there even is one.

Other experts point out that most Americans don't eat enough fruits or vegetables of any type — a far more pressing health concern than whatever minor differences may exist between organic and conventional food. "What's missing in this debate is the important fact that the best thing consumers can do is to eat lots of fruits and vegetables, period, regardless of whether they are produced organically or conventionally," says Carl Winter of the University of California Davis. (He's also skeptical, by the way, that organic food is any healthier for you.)

Here's an overview of this often-contentious topic:

It's not easy to compare organic and conventional foods

Organic farm in Yakima, Washington. (sagebrush photography/Flickr)

One major hurdle for anyone trying to compare "conventional" and "organic" foods is that these are incredibly broad terms.

In the United States, there's technically a dividing line between the two: farms certified as "organic" by the USDA are prohibited from using synthetic pesticides, petroleum-based fertilizers, or sewage sludge. Organic animals can't be fed antibiotics or growth hormones.

But that still leaves a lot of room for variation. Some conventional farms go heavy on the synthetic fertilizer and pesticides. But others spray more selectively or use alternative pest management techniques.

Likewise, some organic farms use natural pesticides that are considered "organic" but can nonetheless be quite toxic. And some organic farms use compost that can contain more contaminants like lead or cadmium than conventional fertilizer. It all depends on the situation — there's no single "conventional" farming system or single "organic" system.

What's more, there are endless variables that can affect the nutritional value of crops, from soil type to climate conditions to the crop cultivars being planted. Controlling for all these factors to make a grand statement about "organic" versus "conventional" farming is incredibly difficult.

And, not surprisingly, scientists have struggled to find clear nutritional differences so far. One 2013 study found that organic tomatoes have more vitamin C, but they're also smaller than conventional tomatoes, so the differences are fairly minimal. Another 2013 study found that organic milk in the US contained more omega-3s — though this may be more about the types of feed used than a unique property of "organic" farming.

More recently, researchers have been conducting large meta-analyses — studies of studies — to try to pinpoint some big-picture lessons here. To date, those reviews have usually found little nutritional difference between organic and conventional produce (see here and here). But now comes a new study with a slight dissent.

A big 2014 study claimed possible health benefits for organic produce…

John Williamson holds a handful of flax seed December 13, 2012, on his 200-acre organic farm in North Bennington, Vermont. (Robert Nickelsberg/Getty Images)

In a big meta-analysis done in 2014, Leifert and his colleagues reviewed 347 studies comparing organic and conventional produce around the world. They concluded that organic fruits and vegetables had, on average, higher levels of antioxidants and lower levels of synthetic pesticide residue.

What's not clear, however, is whether these differences have any actual health impact on human beings. And there were a few sharp criticisms of the study. Let's take a closer look at the paper's findings:

1) Organic produce, on average, had higher levels of antioxidants. This is the part of the study that got the most attention from the media. On average, the authors found, organic produce had higher levels of flavonoids, phenolic acids, anthocyanins, and carotenoids — in some cases, 20 to 40 percent higher.

These compounds — referred to as "antioxidants" — are essentially plant defenses, produced when the plants are stressed by their environment. So one possibility is that organic crops create more of them since they're not protected by chemical pesticides and have to deal with more pests.

But there's a catch: we don't really know whether these compounds improve people's health. We don't know how many of these extra antioxidants are actually absorbed by humans. We don't know what the optimal level of antioxidant intake actually is. It's true that some studies have shown that a diet rich in fruits and vegetables can protect against disease. But those studies looked at people mostly eating conventionally grown vegetables — and the precise role of antioxidants is still being debated.

Washington State University's Charles Benbrook, a co-author of that 2014 pro-organic study, noted this point in a blog post: "Our team, and indeed all four reviews, acknowledges that many questions remain about the bioavailability of plant-based antioxidants, how necessary they are at different life stages, and how inadequate intakes shift the burden of disease." He added that there were some reasons to think those antioxidants are beneficial, but it's hard to say for sure.

2) Organic grains, on average, had 48 percent lower cadmium levels. Cadmium is a heavy metal that is taken up by plants in the soil and is harmful to humans in very high doses. So at first glance this seems like a point in favor of organic.

Yet it's hard to see why organic farming per se would lead to lower cadmium levels — this may just reflect differences in various soils. (Crops from some organic farms can be quite heavy in cadmium.)

It's also not clear this is a pressing health concern. The EPA says the average American gets 0.0004 micrograms of cadmium per kilogram of body weight per day from food — 10 times lower than levels that would cause kidney damage. If you want to reduce your cadmium intake, focus first on quitting smoking and eating less shellfish. Those are bigger sources.

3) Organic fruits and vegetables had less synthetic pesticide residue. This shouldn't be too surprising — synthetic pesticides aren't used on organic farms, and the study didn't test for organic pesticides (which can themselves be quite toxic). Still, some experts are unconvinced that pesticide residue is a big problem either way.

"From my 27 years of research on pesticides and food safety, I remain skeptical that the extremely low levels of pesticide residues we encounter from foods have any impact on public health, and slightly lowering such levels even more would not have any additional impact," Carl Winter, a pesticide and risk assessment specialist at the University of California Davis, wrote to me in an email. "Our typical exposure to pesticide residues is at levels 10,000 to 10,000,000 times lower than doses that cause no observable effect in laboratory animals that are fed pesticides daily throughout their entire lifetimes." (Here's some of his research on that.)

That said, anyone who's worried should wash their fruits and vegetables in tap water to significantly reduce pesticide residue.

4) Organic produce had lower levels of protein, fiber, and nitrates. This was another finding that didn't get as much attention and might actually be a point in favor of conventional produce, as Tom Sanders, a nutritional scientist at King's College London, points out. Note, however, that there's still some debate over whether higher or lower levels of nitrates in vegetables are preferable.

Yet the pro-organic study also attracted some criticism

As with all big studies on a contentious topic, Leifert's pro-organic study also received a fair bit of criticism — you can see a roundup here. A few points made:

1) The health benefits of those antioxidants are still uncertain. "There is no evidence provided that the relatively modest differences in the levels of some of these compounds would have any consequences (good or bad) on public health," said Richard Mithen of the Institute for Food Research. He added this twist: "The additional cost of organic vegetables to the consumer and the likely reduced consumption would easily offset any marginal increase in nutritional properties, even if they did occur, which I doubt."

2) The analysis may have included too many low-quality studies. Alan Dangour — the scientist who led the 2009 review that found no significant differences between conventional and organic food — argued that Leifert likely included too many low-quality studies in his review. Leifert shot back that Dangour's own study excluded too many studies. This is often a point of contention when dealing with meta-analyses.

3) Comparisons between "organic" and "conventional" may be inherently flawed. And still other experts reiterated the point made above that it's difficult to compare "organic" to "conventional" farming because practices vary so widely. For instance: on average, cadmium levels may be lower in organically grown cereal crops. But some organic farms use compost that's extremely high in cadmium.

To make this even trickier, the Leifert review surveyed studies across the entire world — 70 percent of the studies were in Europe, with the rest in the United States, Canada, Brazil, and Japan. There may well be regional variations within that average.

Is it time to put this debate aside?

The Greenfresh market in Renton, Washington. (I-5 Design and Manufacture/Flickr)

Some commentators have suggested that the debate over whether organic or conventional food is healthier is becoming increasingly useless.

Back in 2009, food writer James McWilliams pointed out that only about 2.5 percent of food eaten in the United States is organic — and the typical consumers tend to be college-educated and fairly well-off. That means we're quibbling over marginal nutritional differences (if any) for a population that's already fairly healthy.

By contrast, about 73 percent of the US population doesn't eat the recommended five or more servings of fruits and vegetables each day. For many nutritionists, that's a much more pressing concern — fixing that shortfall would swamp any health benefits organic food might have.

Indeed, a few experts wonder if the endless debate over organic versus conventional might even be counterproductive: "I worry that some consumers might actually reduce their consumption of fruits and vegetables because of pesticide residue concerns," notes Winter, "which would do them more harm than good."

Even some proponents of organic food have suggested that the nutrition question is a bit of a sideshow. On her blog, Marion Nestle argues that the case for buying organic produce hinges more on how our food is produced and concern for the environment: "As I said, if they are more nutritious, it's a bonus, but there are plenty of other good reasons to prefer them."

Nestle doesn't list those reasons, but proponents often cite things like less fertilizer runoff and pollution or fewer antibiotics being used in farms or less pesticide exposure for farmworkers.

Still, the Guardian recently cited one survey suggesting that at least 55 percent of organic buyers list "healthy eating" as a reason for purchasing. So it's unlikely this debate will go away anytime soon — and it'll remain of keen interest to a lot of people.

Further reading:

  • Here's the July 2014 study that kicked off this debate. Here's co-author Charles Benbrook's blog post explaining it. And here is a list of responses from the Science Media Centre, some positive, most negative.
  • Here's an in-depth review of the British Journal of Nutrition study by Cornell doctoral candidate Kevin Klatt.
  • For those who want to pick over different studies with a fine-tooth comb, Tamar Haspel had an excellent piece looking at the evidence on a variety of fruits, vegetables, milk, and meat.
  • Melinda Wenner Moyer recently wrote a long reported essay for Slate on why she felt fine feeding her kids conventional fruits and vegetables.

WATCH: What's wrong with food in America

16 Jul 14:52

In mayoral fund-raising, Emanuel sets the rules and brings in the money

by Mick Dumke
Crowtrobot

rahm... a hypocrite?????? what no

Wealthy donors with business ties to the city keep giving to the mayor.

by Mick Dumke and Ben Joravsky

July 10 was another productive day for Mayor Rahm Emanuel's fund-raising machine. Chicago Forward, the political action committee put together by some of the mayor's friends and run by his former aides, reported collecting $325,000 in contributions that day from just six people.…

[ Read more ]


16 Jul 14:30

Former Dictator Manuel Noriega Is Suing Over Call of Duty

by Brian Ashcraft

Former Dictator Manuel Noriega Is Suing Over Call of Duty

Former Panamanian dictator Manuel Noriega is taking Activision to court over his depiction in Call of Duty: Black Ops II.

Read more...

15 Jul 15:36

Why the Myers-Briggs test is totally meaningless

by Joseph Stromberg
Crowtrobot

a lot of people put this in their dating profiles and it was always baffling to me.


The Myers-Briggs Type Indicator is probably the most widely used personality test in the world.

About 2 million people take it annually, at the behest of corporate HR departments, colleges, and even government agencies. The company that produces and markets the test makes around $20 million off it each year.

The only problem? The test is completely meaningless.

"There's just no evidence behind it," says Adam Grant, an organizational psychologist at the University of Pennsylvania who's written about the shortcomings of the Myers-Briggs previously. "The characteristics measured by the test have almost no predictive power on how happy you'll be in a situation, how you'll perform at your job, or how happy you'll be in your marriage."

The test claims that based on 93 questions, it can group all the people of the world into 16 different discrete "types" — and in doing so, serve as "a powerful framework for building better relationships, driving positive change, harnessing innovation, and achieving excellence." Most of the faithful think of it primarily as a tool for telling you your proper career choice.

But the test was developed in the 1940s based on the totally untested theories of Carl Jung and is now thoroughly disregarded by the psychology community. Even Jung warned that his personality "types" were just rough tendencies he'd observed, rather than strict classifications. Several analyses have shown the test is totally ineffective at predicting people's success in various jobs, and that about half of the people who take it twice get different results each time.

Yet you've probably heard people telling you that they're an ENFJ (extroverted intuitive feeling judging), an INTP (introverted intuitive thinking perceiving), or another one of the 16 types drawn from Jung's work, and you may have even been given this test in a professional setting. Here's an explanation of why these labels are so meaningless — and why no organization in the 21st century should rely on the test for anything.

The Myers-Briggs rests on wholly unproven theories

Carl Jung in 1960. (Douglas Glass/Paul Popper/Popperfoto/Getty Images)

In 1921, Jung published the book Psychological Types. In it, he put forth a few different interesting, unsupported theories on how the human brain operates.

Among other things, he explained that humans roughly fall into two main types: perceivers and judgers. The former group could be further split into people who prefer sensing and others who prefer intuiting, while the latter could be split into thinkers and feelers, for a total of four types of people. All four types, additionally, could be divided based on attitudes into introverts and extroverts. These categories, though, were approximate: "Every individual is an exception to the rule," Jung wrote.

Even these rough categories, though, didn't come out of controlled experiments or data. "This was before psychology was an empirical science," says Grant, the Penn psychologist. "Jung literally made these up based on his own experiences." But Jung's influence on the early field was enormous, and this idea of "types" in particular caught on.

Jung's principles were later adapted into a test by Katharine Briggs and her daughter Isabel Briggs Myers, a pair of Americans who had no formal training in psychology. To learn the techniques of test-making and statistical analysis, Myers worked with Edward Hay, an HR manager for a Philadelphia bank.

They began testing their "Type Indicator" in 1942. It copied Jung's types but slightly altered the terminology, and modified it so that people were assigned one possibility or the other in all four categories, based on their answers to a series of two-choice questions.

Raise two (the number of possibilities in each category) to the fourth power (the number of categories) and you get 16: the different types of people there apparently are in the world. Myers and Briggs gave titles to each of these types, like the Executive, the Caregiver, the Scientist, and the Idealist.
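
For the combinatorially inclined, here's a minimal sketch of that arithmetic; the letter pairs are the standard Myers-Briggs ones, but the enumeration itself is purely illustrative and has nothing to do with how CPP scores the actual test:

```python
# Four binary categories yield 2**4 = 16 possible four-letter type codes.
from itertools import product

CATEGORIES = [
    ("E", "I"),  # extroversion / introversion
    ("S", "N"),  # sensing / intuition
    ("T", "F"),  # thinking / feeling
    ("J", "P"),  # judging / perceiving
]

types = ["".join(combo) for combo in product(*CATEGORIES)]
print(len(types))  # 16
print(types)       # ['ESTJ', 'ESTP', 'ESFJ', ..., 'INFP']
```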

The test has grown enormously in popularity over the years — especially since it was taken over by the company CPP in 1975 — but has changed little. It still assigns you a four-letter type to represent which result you got in each of the four categories:

(Chart: the 16 Myers-Briggs types and their four-letter codes. Wikimedia Commons/Jake Beech)

The Myers-Briggs uses false, limited binaries

With most traits, humans fall on different points along a spectrum. If you ask people whether they prefer to think or feel, or whether they prefer to judge or perceive, the majority will tell you a little of both. Jung himself admitted as much, noting that the binaries were useful ways of thinking about people, but writing that "there is no such thing as a pure extravert or a pure introvert. Such a man would be in the lunatic asylum."

But the test is built entirely on the premise that people are all one or the other. It arrives at that conclusion by giving people questions such as "You tend to sympathize with other people" and offering them only two blunt answers: "yes" or "no."

It'd be one thing if there were good empirical reasons for these strange binary choices that don't seem to describe the reality we know. But they come from the disregarded theories of an early-20th-century thinker who believed in things like ESP and the collective unconscious.

Actual data tells psychologists that these traits do not have a bimodal distribution. Tracking a group of people's interactions with others, for instance, shows that as Jung noted, there aren't really pure extroverts and introverts, but mostly people who fall somewhere in between.

All four of the categories in the Myers-Briggs suffer from these kinds of problems, and psychologists say they aren't an effective way of distinguishing between different personality types. "Contemporary social scientists are rarely studying things like whether you make decisions based on feelings or rational calculus — because all of us use both of these," Grant says. "These categories all create dichotomies, but the characteristics on either end are either independent from each other, or sometimes even go hand in hand." Even data from the Myers-Briggs test itself shows that most people are somewhere in the middle for any one category, and just end up being pigeonholed into one or the other.

This is why some psychologists have shifted from talking about personality traits to personality states — and why it's extremely hard to find a real psychologist anywhere who uses the Myers-Briggs with patients.

There's another related problem with these limited choices: look at the chart above, and you'll notice that words like "selfish," "lazy," or "mean" don't appear anywhere. No matter what type you're assigned, you get a flattering description of yourself as a "thinker," "performer," or "nurturer."

This isn't a test designed to accurately categorize people, but rather a test designed to make them feel happy after taking it. This is one of the reasons it's persisted for so many years in the corporate world after being disregarded by psychologists.

The Myers-Briggs provides inconsistent, inaccurate results


Theoretically, people might still get value out of the Myers-Briggs if it accurately indicated which end of a spectrum they were closest to for any given category.

But the problem with that idea is the fact that the test is notoriously inconsistent. Research has found that as many as 50 percent of people arrive at a different result the second time they take a test, even if it's just five weeks later.

That's because the traits it aims to measure aren't the ones that are consistently different among people. Most of us vary in these traits over time — depending on our mood when we take the test, for instance, we may or may not think that we sympathize with people. But the test simply tells us whether we're "thinking" or "feeling" based on how we answered a handful of binary questions, with no room in between.

Another indicator that the Myers-Briggs is inaccurate is that several different analyses have shown it's not particularly effective at predicting people's success at different jobs.

If the test gives people such inaccurate results, why do so many still put stock in it? One reason is that the flattering, vague descriptions for many of the types have huge amounts of overlap — so many people could fit into several of them.

This is called the Forer effect, a technique long used by purveyors of astrology, fortune telling, and other pseudoscience to persuade people that they have accurate information about them.

The Myers-Briggs is largely disregarded by psychologists

All this is why psychologists — the people who focus on understanding and analyzing human behavior — almost completely disregard the Myers-Briggs in contemporary research.

Search any prominent psychology journal for analyses of personality tests, and you'll find mentions of several systems developed in the decades since the Myers-Briggs was introduced, but not the Myers-Briggs itself. Apart from a few analyses finding it flawed, virtually no major psychology journal has published research on the test; almost all of it appears in dubious outlets like The Journal of Psychological Type, which were created specifically for this kind of research.

CPP, the company that publishes the test, has three leading psychologists on its board, but none of them has ever used it in their own research. "It would be questioned by my academic colleagues," Carl Thoresen, a Stanford psychologist and CPP board member, admitted to the Washington Post in 2012.

Apart from introversion/extroversion, which they share with the Myers-Briggs, the newer, empirically driven tests focus on entirely different categories. The five-factor model measures people's openness, conscientiousness, extroversion, agreeableness, and neuroticism — factors that do differ widely among people, according to actual data. And there's some evidence that this scheme has predictive power in determining people's ability to succeed at various jobs and in other situations.

One thing it doesn't have: the marketing machine that surrounds the Myers-Briggs.

So what is the Myers-Briggs useful for?

(The Shifted Librarian)

The Myers-Briggs is useful for one thing: entertainment. There's absolutely nothing wrong with taking the test as a fun, interesting activity, like a BuzzFeed quiz.

But there is something wrong with CPP peddling the test as "reliable and valid, backed by ongoing global research and development investment." The company makes an estimated $20 million annually, with the Myers-Briggs as its flagship product. Among other things, it charges between $15 and $40 to each person who wants to take the test, and $1,700 to each person who wants to become a certified test administrator.

Why would someone pay this much to administer a flawed test? Because once you have that title, you can sell your services as a career coach to both people looking for work and the thousands of major companies — such as McKinsey & Co., General Motors, and a reported 89 of the Fortune 100 — that use the test to sort employees and potential hires into "types" and assign them training programs and responsibilities accordingly. Once certified, test administrators become cheerleaders for the Myers-Briggs, ensuring that the outdated instrument stays in use.

If private companies want to throw their money away on the Myers-Briggs, that's their prerogative. But about 200 federal agencies reportedly waste money on the test too, including the State Department and the CIA. The military in particular relies heavily on the Myers-Briggs, and the EPA has given it to about a quarter of its 17,000 employees.

It's 2015. Thousands of professional psychologists have evaluated the century-old Myers-Briggs, found it to be inaccurate and arbitrary, and devised better systems for evaluating personality. Let's stop using this outdated test — which has about as much scientific validity as your astrological sign — and move on to something else.


Correction: This piece previously stated that the military uses the Myers-Briggs for promotions in particular, rather than using it as a general tool.

14 Jul 20:32

Pan-Roasted Salmon With Arugula and Avocado Salad

by Yasmin Fahr
Pan-Roasted Salmon With Arugula and Avocado Salad Creamy, buttery avocado, nutty shavings of Parmesan, and a bright, tangy dressing set the stage for a delicious piece of salmon. Did we mention it only takes 10 minutes to cook? Get Recipe!
14 Jul 15:50

Takeout-Style Kung Pao Chicken (Diced Chicken With Peppers and Peanuts)

by J. Kenji López-Alt
Takeout-Style Kung Pao Chicken (Diced Chicken With Peppers and Peanuts) As much as I now love real-deal Sichuan kung pao chicken, my absolute favorite Chinese dish as a kid was this mildly spiced Americanized version—and to be honest, I still love it today. Just because it's a Chinese-American standard, complete with a slightly gloppy sauce and mild heat, doesn't make diced chicken with peppers and peanuts any less delicious. Here's how to make it at home. Get Recipe!
Takeout-Style Kung Pao Chicken (Diced Chicken With Peppers and Peanuts) As much as I now love real-deal Sichuan kung-pao chicken, my absolute favorite Chinese dish as a kid was this mildly spiced Americanized version—and to be honest, I still love it today. Just because it's a Chinese-American standard, complete with slightly-gloppy-sauce and mild heat doesn't make diced chicken with peppers and peanuts any less delicious. Here's how to make it at home. Get Recipe!