1. Abundant evidence → Clever synthesis → Informed crime-prevention decisions The What Works Crime Toolkit beautifully synthesizes – on a single screen – the evidence on crime-prevention techniques. This project by the UK's @CollegeofPolice provides quick answers to what works (the car breathalyzer) and what doesn't (the infamous "Scared Straight" programs). It includes easy-to-use filters for evidence quality and crime type. Just outstanding.
2. Insights → Strategic reuse → Data-driven decision making Tom Davenport explains why simply generating a bunch of insights is insufficient: "Perhaps the overarching challenge is that very few organizations think about insights as a process; they have been idiosyncratic and personal." A truly insight-driven organization must carefully frame, create, market, consume, and store insights for reuse. Via @DeloitteBA.
3. Sloppy science → Weak replication → Psychology myths Of 100 studies published in top-ranking journals in 2008, 75% of the social psychology experiments and half of the cognitive studies failed to replicate. @iansample delivers the grim news in The Guardian: the psych research and publication process is seriously flawed. Thanks to @Rob_Briner.
4. Flawed policy → Ozone overreach → Burden on business Tony Cox writes in the Wall Street Journal that the U.S. EPA lacks causal evidence to support its restrictions on ground-level ozone. The agency links the pollutant to a higher incidence of asthma, but Cox argues the new rules won't improve health outcomes and will impose a substantial economic burden on business.
5. Opaque process → Peer-review fraud → Bad evidence More grim news for science publishing. Springer has retracted 64 papers from 10 journals after discovering that the peer reviews were linked to fabricated email addresses. The Washington Post story notes that just nine months earlier, BioMed Central – a Springer imprint – retracted 43 studies. @RetractionWatch says this wasn't even a thing before 2012.