Shared posts

19 Jun 00:49

Scott Coker replaces Bjorn Rebney as head of Bellator MMA

by Luke Thomas

Former Strikeforce CEO Scott Coker has replaced Bjorn Rebney as Bellator MMA CEO, and the promotion also plans to move away from its longstanding tournament format.

The Bjorn Rebney era in Bellator might soon be coming to an end.

According to sources with knowledge of the situation, executives at Viacom, the majority owner of Bellator MMA, informed Rebney, the founder and CEO of the promotion, earlier this week in New York that they are working towards replacing him with former Strikeforce CEO and founder Scott Coker. Josh Gross first reported the news early Wednesday morning.

The early plan was to announce the move later this week, perhaps as early as Wednesday; however, according to one source who spoke on the condition of anonymity, "the fight is on." In other words, Rebney, who founded Bellator in 2007, isn't ceding control easily.

Coker was not available to comment on the news.

It had long been rumored that Viacom executives and Rebney did not see eye-to-eye on the future of Bellator and were looking to make a change. Once Coker's contract and non-compete clause with Zuffa, which purchased Strikeforce in 2011, expired in March, the wheels were set in motion to bring him on board.

Viacom purchased a majority stake of Bellator in late 2011 after the UFC signed a new seven-year broadcast deal with the FOX Broadcasting Company. Bellator debuted on Spike TV, which is owned by Viacom, in early Jan. 2013.

According to a Bellator press kit, Rebney, a former boxing promoter, "spent all his savings and put a second, then a third mortgage on his house" to launch the fight promotion seven years ago. However, it was only when Connecticut-based hedge fund Plainfield Asset Management decided to invest in "the mid-tens of millions" that the promotion was officially born in 2008. The organization's first event was held on April 3, 2009.

With Rebney leading the way as the face of the promotion, Bellator's first season aired on ESPN Deportes. The promotion later moved to FOX Sports Net, MTV2 and finally Spike TV. When the media giant Viacom purchased a majority stake of the company in Oct. 2011, Rebney was part of the deal. Tim Danaher, the former VP of Plainfield Asset Management, who spearheaded the initial move to invest in the company six years ago, left to become Bellator president in 2010. It is unclear at this time whether he will remain on board and whether Coker will bring in his own staff to run the promotion behind the scenes. Bellator's next event -- its 122nd -- is scheduled for July 25.

Coker launched Strikeforce in 1985 as a kickboxing promotion. Once California legalized the sport in 2005, Coker began promoting MMA, with the first event taking place on March 10, 2006 in front of 18,625 fans at the HP Pavilion in San Jose, Calif. The San Jose-based promotion signed with Showtime in Feb. 2009 and started to develop momentum as the legitimate No. 2 MMA promotion behind the UFC. However, in a shocking turn of events, Zuffa purchased the promotion from its parent company, Silicon Valley Sports & Entertainment, in March 2011, and Strikeforce closed its doors in Jan. 2013. Once that happened, Coker was hardly seen or heard from in MMA, until his contract with the UFC expired three months ago.

It remains to be seen whether Rebney will be successful in fighting off the move, but if Viacom has its way, Coker will be introduced as the new face of Bellator sooner rather than later.

21 Jun 21:53

Me on the Lou Dobbs Show

by schneier

I was on the Lou Dobbs Show earlier this week.

18 Jun 12:02

Leave No Veteran Behind Wins First Bonus Challenge

by craigconnects

Hey, I want to congratulate Leave No Veteran Behind and their team for all their hard work, and for winning the first Bonus Challenge in the CrowdRise Veterans Charity Challenge. They've really got their boots on the ground helping out Vets and their families.

The Challenge was: "Raise the Most between May 28th and June 3 at 12:00pm EST and you'll win $5000 for your charity."


Leave No Veteran Behind  (LNVB) has accomplished a lot in the four years they've been around. Here's a brief summary of what they've done:

  • Paid off the student loans of 7 Veterans totaling just under $100k
  • Directly employed over 150 Vets in transitional work with Chicago Public Schools
  • Paid out over $1.2 million in transitional wages to Vets and their families
  • Trained over 250 Veterans in employment skills, including:
    • security
    • private investigation
    • bookkeeping
    • administrative support
    • information technology
    • executive management skills
  • Decreased youth violence on the South Side of Chicago, according to public school behavior records, police department crime rates, and qualitative data aggregated by the University of Chicago's Consortium on School Research
  • Currently, LNVB serves over 7,000 Chicago school children on a daily basis with safety and mentorship services
  • LNVB places Veterans in long-term employment scenarios
  • LNVB has partnered with West Point Military Academy to deliver STEM training to over 100 youth in Chicago
  • LNVB has advocated for the State of Illinois' Veteran Earned Income Tax Credit to help decrease statewide Veteran unemployment rates
  • 82% of every dollar goes to direct program support

I'm honored to be able to give money to this nonprofit. I just figure that if someone's willing to risk their life for me, I should give back, and this is the least that I can do. A nerd's gotta do what a nerd's gotta do.

Make sure to check out the Veterans Charity Challenge to find out which Bonus Challenges are coming up next, and if you're on Twitter, you can follow the conversation using #VetsChallenge.




18 Jun 12:02

America's 50 Worst Charities

by craigconnects

Hey, this is a really big deal. How often do you give to charities because you believe in their cause? You may think the org is doing really good work, but I'm actively involved with the Center for Investigative Reporting (CIR), and unfortunately I've learned that's not always what's happening.

Hundreds of charities claim to help those in need. But of the millions of dollars raised each year, how much goes to cancer patients, disabled vets, and dying children? For some charities, almost nothing goes toward direct aid. CIR and the Tampa Bay Times worked to create a list of the 50 worst charities in America.


It's important to make sure that you're not giving your hard-earned money to bad actors.

The nation’s 50 worst charities have paid their solicitors nearly $1 billion over the past 10 years that could have gone to charitable works.


Some of the findings that CIR released include that:

  • The 50 worst charities in America devote less than 4% of donations raised to direct cash aid.
  • Some give far less than even that.
  • Over a decade, one diabetes charity raised nearly $14 million and gave about $10,000 to patients.
  • Six orgs spent nothing at all on direct cash aid.

There are a lot of charities out there who really have their boots on the ground doing good work and spending their donations wisely, but you need to make sure to do your research before you give. Bad actors are out there to take your money, and the cause you were donating to may never see the actual money.

The Center for Investigative Reporting shares a CNN report that will air tomorrow, 6/13, at 8pm EDT. They'll show you what happens when a reporter tries to confront the executives of some of the charities on the list above. And you'll find out how one charity network spent nearly 70% of the millions they raised on fundraisers.

This is a really big deal, and very important. Don't get fooled by bad charities. CIR gives tips for making sure that your money is actually going to a worthy cause, and there are many resources to help you find orgs that are the real deal.

Please help me spread the word by sharing this across your social platforms, and help to stop these bad actors.



11 Jun 14:07

A Woman’s Tattoo Doubles The Chance of a Man Approaching

by Jeremy Dean


The incredible dating power of a woman's tattoo.

Not long ago I reported on a study which found that guitar cases have considerable power over women when they are asked on a date.

The French psychologist who conducted that study, Nicolas Gueguen, has been up to his old tricks again on the Atlantic Coast of France.

In a new experiment, reported in the journal Archives of Sexual Behavior, he had some women lying on a beach, face-down, reading a book (Gueguen, 2013). Sometimes they displayed a 10cm x 5cm temporary tattoo of a butterfly on their lower backs and sometimes not.

Then another research assistant counted how many times a man came up to them and tried to start a conversation.

Without a tattoo they were approached 10% of the time, but with the tattoo this shot up to 24%. Not only that, the tattoo also cut the average time before a man approached from 35 minutes down to 24 minutes.
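For readers who want a feel for how big that jump is, here is a quick back-of-the-envelope significance check. The 10% and 24% approach rates come from the study as reported; the per-condition sample size of 100 sessions is my assumption, since the post doesn't give the study's n.

```python
import math

def two_prop_ztest(x1, n1, x2, n2):
    """Pooled two-proportion z-test for a difference in approach rates."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF.
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval

# Assumed: 100 observation sessions per condition, at the reported rates.
z, p = two_prop_ztest(10, 100, 24, 100)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

Under that assumed sample size the 10%-to-24% jump is comfortably significant; with much smaller samples it would not be, which is why the assumed n matters.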

I will pass no judgement on the rights or wrongs of tattoos, the types of men that might have been approaching or anything else, but simply leave this for you to interpret as you will.

Oh wait: one final fact does need mentioning. When men were asked to evaluate the women with or without the tattoo, they judged that the women with tattoos were more likely to say yes to their advances and were probably more promiscuous.

Whether or not either of these assumptions is actually true is a totally different matter. It may well be that men misinterpret tattoos and/or that women don't realise how men perceive them.

OK, now discuss.

Image credit: Stephanie Wallace

If you value PsyBlog, please support it by spreading it to others through email, social networks or even old-fashioned talking! Thank you!

Making Habits, Breaking Habits

In his new book, Jeremy Dean--psychologist and author of PsyBlog--looks at how habits work, why they are so hard to change, and how to break bad old cycles and develop new healthy, creative, happy habits.

"Making Habits, Breaking Habits", is available now on Amazon.

09 Jun 16:06

LaMP Teaches You a Foreign Language via Movie and YouTube Subtitles

by Melanie Pinola


Windows/Web: LaMP, or Lingual Media Player, is a neat program for learning a new language through watching videos—one of the best ways to become more proficient in a foreign language. The program displays foreign language subtitles for any film you load into the media player or videos you select from YouTube.

To use LaMP, click on the Video File folder to load a video, then add the subtitle file(s) you want displayed (found in the video folder's languages subfolders). You can display two subtitles at once: For example, the language you're learning and your native language. LaMP supports a great many languages, from Arabic to Vietnamese.

As you watch the video, you'll see, in addition to the subtitles, a transcript of all the subtitles with their time marks, and you can set the player to pause after each subtitle to help you soak it in. Additionally, you can use Bing or Google Translate to instantly confirm a subtitle's translation.
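The dual-subtitle transcript is easy to picture if you've seen the underlying subtitle files. Here's a minimal sketch, assuming the common .srt format (the post doesn't say which formats LaMP accepts), that parses two tracks and pairs them line-by-line with their time marks:

```python
import re

def parse_srt(text):
    """Parse .srt subtitle text into (start, end, line) entries.

    Blocks are separated by blank lines: an index line, a time-range
    line ("HH:MM:SS,mmm --> HH:MM:SS,mmm"), then the subtitle text.
    """
    entries = []
    for block in re.split(r"\n\s*\n", text.strip()):
        lines = block.splitlines()
        m = re.match(r"(\S+) --> (\S+)", lines[1]) if len(lines) >= 3 else None
        if m:
            entries.append((m.group(1), m.group(2), " ".join(lines[2:])))
    return entries

french = parse_srt("""1
00:00:01,000 --> 00:00:03,500
Bonjour tout le monde.

2
00:00:04,000 --> 00:00:06,000
Comment ça va ?""")

english = parse_srt("""1
00:00:01,000 --> 00:00:03,500
Hello, everyone.

2
00:00:04,000 --> 00:00:06,000
How are you?""")

# A LaMP-style transcript: both languages side by side, with time marks.
for (start, _, fr), (_, _, en) in zip(french, english):
    print(f"[{start}] {fr}  |  {en}")
```

This is the whole trick behind showing "the language you're learning and your native language" at once: two parsed tracks keyed to the same timecodes.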

If you don't have any foreign language videos to use, LaMP can search YouTube or connect to a compiled list of YouTube videos with closed captions. The online version of LaMP has fewer features than the Windows version (e.g., no manual syncing of subtitles and sound), but is still a convenient way to watch and learn a language using YouTube videos.

It would be nice if there were some marker for each word as it's being said, but overall LaMP is a great utility for the language learner.

LaMP Lingual Media Player | via AddictiveTips

09 Jun 16:06


Good thing we're too smart to spend all day being uselessly frustrated with ourselves. I mean, that'd be a hell of a waste, right?
05 Jun 05:09

Craving Coffee Might be Genetic

by 23andMe

By Amick B.


There are many studies that look at the health impacts of drinking coffee, but you’d be forgiven if you weren’t clear about whether it’s good for you.

Research has found consuming coffee could potentially prevent heart disease, type-2 diabetes, Parkinson’s disease and liver cancer, but the results are often conflicting. Beyond that, other variables can potentially negate any benefits from drinking a cup of Joe.  Whether you like your coffee black or with a bit of cream and sugar, for example, makes a difference for your health. How much you drink and even the kinds of filters you use to make coffee could also have an impact.

All the conflicting information is enough to keep non-coffee drinkers from lining up at Starbucks, while those who indulge are left scratching their heads about whether they need to cut back or keep on imbibing.

But there’s one area where the science seems certain: some of us are just more genetically wired to like coffee. A number of studies have identified certain genetic variants associated with caffeine consumption and metabolism, which helps explain why some of us choose coffee and others don’t.

Recently, a new study found some additional genetic variants associated with a tendency toward java enjoyment, including one near NRCAM, a gene implicated in vulnerability to addiction. The study looked at nearly 8,000 Caucasian coffee drinkers and found significant evidence of association at the single nucleotide polymorphism (SNP) rs382140 near NRCAM. Using coffee consumption as a model, this new research may help scientists better understand the genetics of addiction and how to treat it.

The recent study also looked at two SNPs that lie in the region between CYP1A1 and CYP1A2 genes – CYP1A1 is known to metabolize polycyclic aromatic hydrocarbons, which are important constituents of coffee, whereas caffeine is primarily metabolized by CYP1A2. The study confirmed previous studies and added a new SNP, rs2470893, to the list of those associated with consumption of coffee and metabolism of caffeine.

23andMe looks at a variant that affects how quickly a person metabolizes caffeine, based on preliminary research on the SNP rs762551. According to that research, people with the version of rs762551 linked to the slower form of the CYP1A2 enzyme who also drank at least two to three cups of coffee per day had a significantly increased risk of a non-fatal heart attack, while fast metabolizers could actually reduce their heart attack risk by drinking coffee.

Additionally, in a 2011 blog post, 23andMe discussed two studies that found an association between coffee and caffeine consumption and SNP variants near the CYP1A1 and AHR genes, shedding light on which genes influence the amount of coffee people consume. One study found the SNP rs2472297 near CYP1A1 had a fairly consistent effect — people with the T version were prone to drink about a quarter cup more coffee a day — and the other showed that people with two copies of a T at rs4410790 near the AHR gene drank about a third of a cup more coffee each day than those without any copies.

To find out whether genetic variants influence their caffeine consumption, 23andMe customers can look up their data for rs2472297 and rs4410790 using the Browse Raw Data feature.
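For the technically inclined, pulling those two SNPs out of a raw-data export takes only a few lines of code. This sketch assumes the tab-separated layout 23andMe has used for its raw-data downloads (rsid, chromosome, position, genotype, with '#'-prefixed comment lines); the sample lines, positions, and genotypes below are fabricated for illustration.

```python
import io

def genotype_lookup(lines, rsids):
    """Find genotypes for SNPs of interest in a 23andMe-style raw-data
    export: tab-separated rsid, chromosome, position, genotype, with
    '#'-prefixed comment lines."""
    wanted = set(rsids)
    found = {}
    for line in lines:
        if line.startswith("#"):
            continue  # header/comment lines carry no genotype data
        fields = line.rstrip("\n").split("\t")
        if len(fields) == 4 and fields[0] in wanted:
            found[fields[0]] = fields[3]
    return found

# Fabricated excerpt in that layout (positions and genotypes are made up).
sample = io.StringIO(
    "# rsid\tchromosome\tposition\tgenotype\n"
    "rs4410790\t7\t17000000\tCT\n"
    "rs2472297\t15\t75000000\tTT\n"
)
print(genotype_lookup(sample, ["rs2472297", "rs4410790"]))
```

On a real export you would iterate over the open file the same way; the rsids in the first column are exactly the identifiers quoted in the post.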

Amick Boone is a freelance writer who’s written about health and life sciences for nearly 10 years.  When she’s not at her computer, she’s usually in motion – doing yoga, riding her bike or traveling.

05 Jun 05:09

Self service requires information, which requires design

by Seth Godin

Consider travel as an example:

If you've arranged the flights on the monitor in order of flight time, not destination, requiring me to stop and take out my ticket, you have failed.

If you've hidden the room numbers (or given them fancy names) so that only an employee can find the right spot, you've failed as well.

The label on prescription drugs, the instructions post-doctor visit, the manual for using software or putting together furniture--if we're getting rid of service and turning it into self-service, we owe it to our newly deputized employees (our customers) to give them the tools they need to not need us.

Sure, you need someone in charge of customer service. But you also need someone in charge of service design. Someone responsible for fixing what's broken, not merely apologizing for it again and again.

It's not cheap, but it's way cheaper than answering the phone or annoying the people who pay our bills.

05 Jun 05:06

The Problems with Managing Privacy by Asking and Giving Consent

by schneier

New paper from the Harvard Law Review by Daniel Solove: "Privacy Self-Management and the Consent Dilemma":

Privacy self-management takes refuge in consent. It attempts to be neutral about substance -- whether certain forms of collecting, using, or disclosing personal data are good or bad -- and instead focuses on whether people consent to various privacy practices. Consent legitimizes nearly any form of collection, use, or disclosure of personal data. Although privacy self-management is certainly a laudable and necessary component of any regulatory regime, I contend that it is being tasked with doing work beyond its capabilities. Privacy self-management does not provide people with meaningful control over their data. First, empirical and social science research demonstrates that there are severe cognitive problems that undermine privacy self-management. These cognitive problems impair individuals' ability to make informed, rational choices about the costs and benefits of consenting to the collection, use, and disclosure of their personal data.

Second, and more troubling, even well-informed and rational individuals cannot appropriately self-manage their privacy due to several structural problems. There are too many entities collecting and using personal data to make it feasible for people to manage their privacy separately with each entity. Moreover, many privacy harms are the result of an aggregation of pieces of data over a period of time by different entities. It is virtually impossible for people to weigh the costs and benefits of revealing information or permitting its use or transfer without an understanding of the potential downstream uses, further limiting the effectiveness of the privacy self-management framework.

05 Jun 05:05

The Security Risks of Unregulated Google Search

by schneier

Someday I need to write an essay on the security risks of secret algorithms that become part of our infrastructure. This paper gives one example of that. Could Google tip an election by manipulating what comes up in search results on the candidates?

The study’s participants, selected to resemble the US voting population, viewed the results for two candidates on a mock search engine called Kadoodle. By front-loading Kadoodle’s results with articles favoring one of the candidates, Epstein shifted enough of his participants' voter preferences toward the favored candidate to simulate the swing of a close election. But here’s the kicker: in one round of the study, Epstein configured Kadoodle so that it hid the manipulation from 100 percent of the participants.

Turns out that it could. And it wouldn't even be illegal for Google to do it.
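The mechanism Epstein exploited is easy to simulate. The sketch below is a toy model, not his design: the read-probability-by-rank numbers and the preference rule are assumptions, chosen only to show how pure ordering of identical pools of articles can swing a simulated electorate.

```python
import random

random.seed(0)

# Assumed position bias: chance a voter reads the article at each rank.
READ_PROB = [0.9, 0.6, 0.4, 0.25, 0.15]

def simulate_voter(ranking):
    """A voter scans results top-down; each pro-candidate article actually
    read adds one point to that candidate's score."""
    score = {"A": 0, "B": 0}
    for rank, favors in enumerate(ranking):
        if random.random() < READ_PROB[rank]:
            score[favors] += 1
    if score["A"] == score["B"]:
        return random.choice("AB")
    return "A" if score["A"] > score["B"] else "B"

def share_for_A(ranking, n=10_000):
    return sum(simulate_voter(ranking) == "A" for _ in range(n)) / n

# Same three pro-A and two pro-B articles in both lists; only order differs.
mixed        = ["B", "A", "A", "B", "A"]
front_loaded = ["A", "A", "A", "B", "B"]
print(share_for_A(mixed), share_for_A(front_loaded))
```

With identical content, giving one side the top slots swings the simulated electorate decisively, and nothing in the "interface" reveals that anything was done.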

The author thinks that government regulation is the only reasonable solution.

Epstein believes that the mere existence of the power to fix election outcomes, wielded or not, is a threat to democracy, and he asserts that search engines should be regulated accordingly. But regulatory analogies for a many-armed, ever-shifting company like Google are tough to pin down. For those who see search results as a mere passive relaying of information, like a library index or a phone book, there is precedent for regulation. In the past, phone books -- with a monopoly on the flow of certain information to the public -- were prevented from not listing businesses even when paid to do so. In the 1990s, similar reasoning led to the "must carry" rule, which required cable companies to carry certain channels to communities where they were the only providers of those channels.

As I said, I need to write an essay on the broader issue.

05 Jun 05:05

The Problems with CALEA-II

by schneier

The FBI wants a new law that will make it easier to wiretap the Internet. Although its claim is that the new law will only maintain the status quo, it's really much worse than that. This law will result in less-secure Internet products and create a foreign industry in more-secure alternatives. It will impose costly burdens on affected companies. It will assist totalitarian governments in spying on their own citizens. And it won't do much to hinder actual criminals and terrorists.

As the FBI sees it, the problem is that people are moving away from traditional communication systems like telephones onto computer systems like Skype. Eavesdropping on telephones used to be easy. The FBI would call the phone company, which would bring agents into a switching room and allow them to literally tap the wires with a pair of alligator clips and a tape recorder. In the 1990s, the government forced phone companies to provide an analogous capability on digital switches; but today, more and more communication happens over the Internet.

What the FBI wants is the ability to eavesdrop on everything. Depending on the system, this ranges from easy to impossible. E-mail systems like Gmail are easy. The mail resides in Google's servers, and the company has an office full of people who respond to requests for lawful access to individual accounts from governments all over the world. Encrypted voice systems like Silent Circle are impossible to eavesdrop on—the calls are encrypted from one computer to the other, and there's no central node to eavesdrop from. In those cases, the only way to make the system eavesdroppable is to add a backdoor to the user software. This is precisely the FBI's proposal. Companies that refuse to comply would be fined $25,000 a day.
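The structural point about systems like Silent Circle can be shown with a toy sketch. This uses a one-time pad, which is not how any real product works, but the topology is the same: the endpoints share a key, the relay in the middle never does, so there is nothing useful to tap at the central node.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, key))

# The two endpoints agree on a key; the relay in the middle never sees it.
key = secrets.token_bytes(64)

def endpoint_send(plaintext: bytes) -> bytes:
    return xor_bytes(plaintext, key)     # encrypt at the sender's computer

def central_relay(ciphertext: bytes) -> bytes:
    # All a wiretap here can collect is this opaque blob.
    return ciphertext

def endpoint_receive(ciphertext: bytes) -> bytes:
    return xor_bytes(ciphertext, key)    # decrypt at the recipient's computer

msg = b"meet at noon"
wire = central_relay(endpoint_send(msg))
print(endpoint_receive(wire) == msg)
```

The only way to give the relay readable traffic is to change the endpoint software itself, which is exactly the backdoor the proposal amounts to.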

The FBI believes it can have it both ways: that it can open systems to its eavesdropping, but keep them secure from anyone else's eavesdropping. That's just not possible. It's impossible to build a communications system that allows the FBI surreptitious access but doesn't allow similar access by others. When it comes to security, we have two options: We can build our systems to be as secure as possible from eavesdropping, or we can deliberately weaken their security. We have to choose one or the other.

This is an old debate, and one we've been through many times. The NSA even has a name for it: the equities issue. In the 1980s, the equities debate was about export control of cryptography. The government deliberately weakened U.S. cryptography products because it didn't want foreign groups to have access to secure systems. Two things resulted: fewer Internet products with cryptography, to the insecurity of everybody, and a vibrant foreign security industry based on the unofficial slogan "Don't buy the U.S. stuff -- it's lousy."

In 1993, the debate was about the Clipper Chip. This was another deliberately weakened security product, an encrypted telephone. The FBI convinced AT&T to add a backdoor that allowed for surreptitious wiretapping. The product was a complete failure. Again, why would anyone buy a deliberately weakened security system?

In 1994, the Communications Assistance for Law Enforcement Act mandated that U.S. companies build eavesdropping capabilities into phone switches. These were sold internationally; some countries liked having the ability to spy on their citizens. Of course, so did criminals, and there were public scandals in Greece (2005) and Italy (2006) as a result.

In 2012, we learned that every phone switch sold to the Department of Defense had security vulnerabilities in its surveillance system. And just this May, we learned that Chinese hackers breached Google's system for providing surveillance data for the FBI.

The new FBI proposal will fail in all these ways and more. The bad guys will be able to get around the eavesdropping capability, either by building their own security systems -- not very difficult -- or buying the more-secure foreign products that will inevitably be made available. Most of the good guys, who don't understand the risks or the technology, will not know enough to bother and will be less secure. The eavesdropping functions will 1) result in more obscure -- and less secure -- product designs, and 2) be vulnerable to exploitation by criminals, spies, and everyone else. U.S. companies will be forced to compete at a disadvantage; smart customers won't buy the substandard stuff when there are more-secure foreign alternatives. Even worse, there are lots of foreign governments who want to use these sorts of systems to spy on their own citizens. Do we really want to be exporting surveillance technology to the likes of China, Syria, and Saudi Arabia?

The FBI's short-sighted agenda also works against the parts of the government that are still working to secure the Internet for everyone. Initiatives within the NSA, the DOD, and DHS to do everything from securing computer operating systems to enabling anonymous web browsing will all be harmed by this.

What to do, then? The FBI claims that the Internet is "going dark," and that it's simply trying to maintain the status quo of being able to eavesdrop. This characterization is disingenuous at best. We are entering a golden age of surveillance; there's more electronic communications available for eavesdropping than ever before, including whole new classes of information: location tracking, financial tracking, and vast databases of historical communications such as e-mails and text messages. The FBI's surveillance department has it better than ever. With regard to voice communications, yes, software phone calls will be harder to eavesdrop upon. (Although there are questions about Skype's security.) That's just part of the evolution of technology, and one that on balance is a positive thing.

Think of it this way: We don't hand the government copies of our house keys and safe combinations. If agents want access, they get a warrant and then pick the locks or bust open the doors, just as a criminal would do. A similar system would work on computers. The FBI, with its increasingly non-transparent procedures and systems, has failed to make the case that this isn't good enough.

Finally there's a general principle at work that's worth explicitly stating. All tools can be used by the good guys and the bad guys. Cars have enormous societal value, even though bank robbers can use them as getaway cars. Cash is no different. Both good guys and bad guys send e-mails, use Skype, and eat at all-night restaurants. But because society consists overwhelmingly of good guys, the good uses of these dual-use technologies greatly outweigh the bad uses. Strong Internet security makes us all safer, even though it helps the bad guys as well. And it makes no sense to harm all of us in an attempt to harm a small subset of us.

This essay originally appeared in Foreign Policy.

05 Jun 05:02

Movie Studio Set To Comb Desert Landfill For Notoriously Terrible ‘E.T.’ Atari Game

by Mary Beth Quirk

Is he in a sandy grave, waiting to return to the surface?


There is a legend, a legend of a magical place filled with millions of copies of the notoriously terrible 1982 E.T. title, among other failed Atari games. According to gamer lore, after the title flopped, millions of Atari cartridges were buried somewhere in the desert of New Mexico, perhaps in an attempt to forever bury the shame of the game’s extreme terribleness. Now one film company has been granted the rights to search a landfill in a quest to see if the legend is real.

As legend and Wikipedia have it, the major failure that was E.T. could’ve contributed to Atari’s downfall and subsequent sale to another company in 1983. The Atari Graveyard has, until now, been mostly considered an urban myth, a cautionary tale whispered by the glow of computer screens by game developers wary of making the same mistakes.

Apparently, E.T. keeps falling into a hole, causing gamers to constantly be shunted back to the beginning of the game, ad nauseam, ad infinitum. Thus, its disposal in a dry, desert grave.

The Alamogordo Daily News dips into the local lore: the city commission has granted a Canadian film studio permission to search a landfill near the city to hunt for the buried games, which could include other titles as well as the unlucky E.T.

A District Commissioner involved in the plan reportedly admitted to having played the game himself and affirmed that yes, it was horrible. The city has given Fuel Industries the go-ahead to dig up the old landfill where the games are said to be buried, and it’s got six months to do it.

As for what will happen if and when the games are uncovered, that plan currently resides in the minds of studio executives but will likely include some kind of documentary about their undertaking. Godspeed.

A film company poised to search for ‘worst video game ever’ in Alamogordo [Alamogordo Daily News]

05 Jun 04:51

Join us for a presentation about "Hybrid Semantics in Media and Scientific Publishing" at Semantic Publishing Meet-up, June 4th, San Francisco

by reneta

Ontotext will share success stories of mass-media and scientific publishers taking the leap from pre-full-text-search-indexing times to context-supported authoring, a data-rich editorial process, and dynamic semantic publishing. The talk will show how hybrid semantics is done and what it takes.

Over the years, a recurring solution pattern for semantic publishing has been refined and applied again and again. Ontotext will present how it works and how to benefit from it – from domain-aware schema modeling, through sophisticated text analytics, to publishing that adapts to readers’ profiles.

To join, register here.

05 Jun 04:50

Interrogating p-values

by Andrew


This article is a discussion of a paper by Greg Francis for a special issue, edited by E. J. Wagenmakers, of the Journal of Mathematical Psychology. Here’s what I wrote:

Much of statistical practice is an effort to reduce or deny variation and uncertainty. The reduction is done through standardization, replication, and other practices of experimental design, with the idea being to isolate and stabilize the quantity being estimated and then average over many cases. Even so, however, uncertainty persists, and statistical hypothesis testing is in many ways an endeavor to deny this, by reporting binary accept/reject decisions.

Classical statistical methods produce binary statements, but there is no reason to assume that the world works that way. Expressions such as Type 1 error, Type 2 error, false positive, and so on, are based on a model in which the world is divided into real and non-real effects. To put it another way, I understand the general scientific distinction of real vs. non-real effects but I do not think this maps well into the mathematical distinction of θ=0 vs. θ≠0. Yes, there are some unambiguously true effects and some that are arguably zero, but I would guess that the challenge in most current research in psychology is not that effects are zero but that they vary from person to person and in different contexts.

But if we do not want to characterize science as the search for true positives, how should we statistically model the process of scientific publication and discovery? An empirical approach is to identify scientific truth with replicability; hence, the goal of an experimental or observational scientist is to discover effects that replicate in future studies.

The replicability standard seems to be reasonable. Unfortunately, as Francis (in press) and Simmons, Nelson, and Simonsohn (2011) have pointed out, researchers in psychology (and, presumably, in other fields as well) seem to have no problem replicating and getting statistical significance, over and over again, even in the absence of any real effects of the size claimed by the researchers.

. . .

As a student many years ago, I heard about opportunistic stopping rules, the file drawer problem, and other reasons why nominal p-values do not actually represent the true probability that observed data are more extreme than what would be expected by chance. My impression was that these problems represented a minor adjustment and not a major reappraisal of the scientific process. After all, given what we know about scientists’ desire to communicate their efforts, it was hard to imagine that there were file drawers bulging with unpublished results.
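
The inflation from opportunistic stopping rules is easy to demonstrate with a short simulation (my sketch, not from the article): a researcher collects data under a true null effect, tests after every new observation, and stops as soon as p < .05. Even though each individual test is nominally at the 5% level, the chance of ever declaring significance climbs well above 5%.

```python
# Sketch: opportunistic stopping under a true null. We compare the
# false-positive rate of a single fixed-n test against "peeking" after
# every observation and stopping at the first p < .05.
import math
import random

def p_two_sided(z):
    """Two-sided p-value for a standard-normal test statistic."""
    return math.erfc(abs(z) / math.sqrt(2))

def false_positive_rate(n_max=100, peek_from=10, sims=2000, peek=True, seed=0):
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        total, n = 0.0, 0
        significant = False
        for _ in range(n_max):
            total += rng.gauss(0, 1)   # null is true: the mean really is 0
            n += 1
            if peek and n >= peek_from:
                z = (total / n) * math.sqrt(n)  # known sigma = 1
                if p_two_sided(z) < 0.05:
                    significant = True          # stop early and "publish"
                    break
        if not peek:  # single test at the final sample size
            z = (total / n_max) * math.sqrt(n_max)
            significant = p_two_sided(z) < 0.05
        hits += significant
    return hits / sims

print("fixed n :", false_positive_rate(peek=False))  # near the nominal 0.05
print("peeking :", false_positive_rate(peek=True))   # substantially higher
```

The fixed-n rate lands near the nominal 5%, while peeking after every observation from n = 10 to n = 100 pushes the rate several-fold higher: the nominal p-value no longer describes the actual probability of a chance finding.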

More recently, though, there has been a growing sense that psychology, biomedicine, and other fields are being overwhelmed with errors (consider, for example, the generally positive reaction to the paper of Ioannidis, 2005). In two recent series of papers, Gregory Francis and Uri Simonsohn and collaborators have demonstrated too-good-to-be-true patterns of p-values in published papers, indicating that these results should not be taken at face value.

. . .

Although I do not know how useful Francis’s particular method is, overall I am supportive of his work as it draws attention to a serious problem in published research.

Finally, this is not the main point of the present discussion but I think that my anti-hypothesis-testing stance is stronger than that of Francis (in press). I disagree with the following statement from that article:

For both confirmatory and exploratory research, a hypothesis test is appropriate if the outcome drives a specific course of action. Hypothesis tests provide a way to make a decision based on data, and such decisions are useful for choosing an action. If a doctor has to determine whether to treat a patient with drugs or surgery, a hypothesis test might provide useful information to guide the action. Likewise, if an interface designer has to decide whether to replace a blue notification light with a green notification light in a cockpit, a hypothesis test can provide guidance on whether an observed difference in reaction time is different from chance and thereby influence the designer’s choice.

I have no expertise on drugs, surgery, or human factors design and so cannot address these particular examples—but, speaking in general terms, I think Francis is getting things backward here. When making a decision, I think it is necessary to consider effect sizes (not merely the possible existence of a nonzero effect) as well as costs. Here I speak not of the cost of hypothetical false positives or false negatives but of the direct costs and benefits of the decision. An observed difference can be relevant to a decision whether or not that difference is statistically significant.

The post Interrogating p-values appeared first on Statistical Modeling, Causal Inference, and Social Science.

04 Jun 13:51

girlgoesgrrr: TODAY IN TURKEY

National Protest: Istanbul: 01-02 June 2013

ACAB Worldwide


27 May 00:43

Why Math Can't Bail Us Out

by Emanuel Derman
I don’t think mathematical models can do much to get us out of the mess that we’re in. There’s a limit. The mess that we’re currently in, in the United States and maybe in the world in general, I don’t think it’s really a mess brought about by mathematics or curable by mathematics. You ...

Read More
26 May 06:08

Have the climate sceptics really won?


Despite recent fears of sceptics winning public debates, they are not all powerful, but have cast a spell upon their opponents. Earlier this week, Martin Wolf of the Financial Times announced that the "climate sceptics have won". His comments echo those of former Nasa scientist James Hansen who told an audience in Edinburgh last year that the sceptics "have been winning the public debate with the help of tremendous resources." The action needed in response to this situation was spelt out by Lord Stern – the eponymous author of the well-known 2007 report on the economics of climate change – who once called sceptics "forces of darkness" who had to be "driven back."

Such comments reflect a conventional wisdom in the climate debate. Climate sceptics, or deniers as they are often called, are presented as all-powerful forces bankrolled by rich corporations who have wielded their awesome power to block efforts to deal with the threat of human-caused climate change. How do we know that climate sceptics have such power? As Martin Wolf explains, it is the "world's inaction" on climate policy which reveals their power.

From this perspective then, a key challenge of securing action on climate change is to defeat the sceptics – to drive back the forces of darkness so that the forces of good might prevail. Victory will be achieved by winning the battle for public opinion on the state of climate science.

However, a closer look at the logic underlying such arguments reveals a chain of causality which scholars of the public understanding of science have long critiqued as the ineffectual "deficit model" of science. Even more troubling, there is reason to believe that the focus of attention by climate campaigners on sceptics actually works against effective action.

The so-called "deficit model" suggests that the public lacks certain knowledge that if it were known properly (so closing the deficit) would lead them to favor certain policy actions. In other words, if only you understood the "facts" as I understand them, then you would come to share my policy preferences.

The deficit model helps to explain why people argue so passionately about "facts" in public debates over policies with scientific components. If you believe that acceptance of certain scientific views is a precondition for, or a causal factor in determining what policy views people hold, then arguments over facts serve as political debate by proxy.

Dan Kahan, professor of psychology at Yale Law School, has conducted several studies of public views on climate change and finds that the causal mechanisms of the "deficit model" actually work in reverse: people typically "form risk perceptions that are congenial to their values." Our political views shape how we interpret facts. On an issue as complex as climate, there are enough data and interpretations to offer support to almost any political agenda. Thus we have arguments over the degree or lack of consensus among scientists, and see efforts to delegitimise outlier positions in order to assert one true and proper interpretation. Added to the mix is the temptation to push "facts" beyond what science can support, which offers each side the opportunity for legitimate critique of the excesses of their opponents. These dynamics can (and do) go on forever.

In the first half of the 20th century, the American political commentator Walter Lippmann recognized that uniformity of perspective was not necessary for action to take place in democracies. He explained that the goal of politics is not to make everyone think alike, but to help people who think differently to act alike. A vast body of scholarship supports the limitations of the deficit model, yet it remains a defining feature of debates over climate policy today.

Read the entire article

NEWS SOURCE: the guardian



09 May 23:24

The 2013 Colorado Legislative Session just ended. What happened and where do we go from here?

by steve

Almost one year ago, Great Education Colorado and a couple dozen organizations set a goal for the General Assembly: make 2013 the Year of the Student.

Today, the Year of the Student coalition includes 170 organizations and 9,500+ individuals, and the 2013 legislative session has drawn to a close.

So what happened?

Because of the Year of the Student coalition’s efforts, students will be much better off next year than they were this year, but there’s still a lot of work to do.

This year, legislators were deluged like never before with constituent contacts about education funding, and, for the first time in years, education funding was a focal point of legislative debate and action. The outcomes?

  • $30 million increase for higher ed
  • 15% increase in Special Education
  • Estimated $200 million additional deposit in the State Education Fund
  • 3,200 new preschool slots
  • Opportunity for a statewide ballot initiative to increase P-12 funding.

Still on the extensive to do list:

  • K-12 funding is still $1 billion below inflation and enrollment growth since 2009 and even farther behind what’s necessary to ensure all students can meet standards and succeed.
  • Thousands of eligible children still have no access to preschool.
  • Only one state provides less support for higher ed per capita than Colorado.

Of course, 2013 isn’t over and there are still opportunities for progress.  It’s likely that voters will have the chance in November to make a substantial down-payment on the funding needed to provide all students with the educational opportunities they need to succeed.

And Great Ed will continue to build on this year’s success and support legislators in their long-term efforts to do right by students.

We’ve come so far this year, but we can’t let up in the long-term fight to create an education funding system that enables all students to succeed. So stay tuned.

Thanks for all you do to make progress possible and sustainable.


02 May 18:29

stop cispa

by ballgame

UPDATE below (2013/4/25).

They’re trying again, hoping we’re asleep.

TechDirt’s take on where things stand.

Latest news from Electronic Frontier Foundation.

UPDATE: Some additional links worth reading:

For readers in the UK, privacy groups allege that “[c]ommunications providers are conspiring with the government to keep quiet over the effects of the Communications Data Bill, known to critics as Snooper’s Charter.” AFAICT, it appears that privacy rights in Britain are decaying, but not as rapidly as they are in the U.S. (h/t Slashdot.)

RT has two recent posts about the issue. In one, the Senate does not appear to be in a hurry to pass the bill, while in the other, Obama is apparently already deploying CISPA-like programs to monitor the Internet. (h/t Reddit)

02 May 16:28

Can open peer review work? Biology Direct suggests it can…

by ciaranoneill

Peer review is a process integral to the scientific research cycle, and, for the majority of biology journals, one that takes place behind closed doors. In conventional blind peer review, reviewers remain anonymous to the authors, and their comments are not made available once a paper is published.

The closed nature of this review can have its problems: high-profile commentaries on lengthy, iterative review calling for an end to ‘the tyranny of reviewer experiments’ and ‘painful publishing’ have generated much discussion on the issue in recent years.

But can one of the touted solutions to the problem, namely opening up the peer review process for all to see, work in practice?

Biology Direct launched 7 years ago as a community experiment in open peer review, whereby both the names of the reviewers, and their comments, are included in the final publication. As Editors-in-Chief Eugene Koonin, Laura Landweber and David Lipman put it in their launch Editorial, their aims were ‘unapologetically ambitious’:

“to establish a new, perhaps, better system of peer review and, in the process, bolster productive scientific debate, and provide scientists with useful guides to the literature”

Their logic was as follows – scientific debate at conferences is priceless in providing fresh perspectives and directions, and while discussion can be heated, academics are required to stand by the comments they make. Without the veil of anonymity, responsibility is increased – why not apply the same rubric to the peer-review process?

Other progressive policies of the journal include inviting the authors to select reviewers from the Editorial board, and allowing the author, rather than an Editor, to decide whether to publish a paper in light of the reviewer comments.

Now in its 8th year, the journal is reorganising and expanding its scope, and in an Editorial to announce the changes, the Editors-in-Chief take a look back at the successes, and challenges, that this open peer-review scheme has met with.

Anecdotally, their discussions with scientists support the impression that Biology Direct has become a well-known forum for constructive, open discussion of new results and ideas, with readers often turning first to the exchange between authors and reviewers.

The quality of research published in the journal is also clear: Biology Direct has been recognized twice in the BioMed Central Annual Research Awards, most recently for Geoffrey Diemer and Kenneth Stedman’s discovery of a novel virus genome from an extreme environment, which received the overall award. The journal is also well accessed and cited – suggesting that concerns that this model would compromise the quality of research published are not substantiated.

The successes have led to the expansion and reorganisation of the journal into the following sections:

  • Bioinformatics
  • Evolutionary Biology
  • Genomics
  • Mathematical Biology
  • Non-coding DNA and RNA
  • Structural and Molecular Biology
  • Systems Biology



Full details of the newly appointed Section Editors, and Editorial Board members are available here.

While the journal is expanding, an air of pragmatism remains at this point – one particularly interesting insight from the Editors is that the journal has been more successful in generating discussion in some areas of its scope (namely evolutionary genomics, systems biology and bioinformatics) than in others.

Perhaps this suggests that open review is currently more attractive to authors of non-experimental research, and it will be interesting to see whether this is reflected in the proliferation of journals with similar open review policies, such as F1000 Research.

What is clear is that the scientific community acknowledges the need for alternative approaches that attempt to address perceived inefficiencies in the traditional system (the subject of a recent panel discussion organised by BioMed Central), and whilst a ‘one-size-fits-all’ approach may not be the solution, Biology Direct has forged a unique niche for open peer review and scientific debate.