Understand evidence

1. Know someone who effectively promotes evidence?
Nominations are open for the 2016 John Maddox Prize for Standing up for Science, which recognizes an individual who promotes sound science and evidence on a matter of public interest while facing difficulty or hostility in doing so.

Researchers in any area of science or engineering, or those who work to address misleading information and bring evidence to the public, are eligible. Sense About Science (@senseaboutsci) explains that the winner will be someone who effectively promotes evidence despite challenge, difficulty, or adversity, and who takes responsibility for public discussion beyond what would be expected of someone in their position. Nominations are welcome until August 1.

2. Evidence to improve surgical outcomes.
Based in Oxford, UK, the IDEAL Collaboration is an initiative to improve the quality of research in surgery, radiotherapy, physiotherapy, and other complex interventions. The IDEAL model (@IDEALCollab) describes the stages of innovation in surgery: Idea, Development, Exploration, Assessment, and Long-Term Study. Besides holding its annual conference, the collaboration also proposes and advocates for assessment frameworks, such as the recent IDEAL-D for assessing medical device safety and efficacy.

3. Could artificial intelligence replace executives?
In the MIT Sloan Management Review, Sam Ransbotham asks Can Artificial Intelligence Replace Executive Decision Making? Most problems faced by executives are unique, poorly documented, and lack structured data, so they don't provide the training material an artificial intelligence system needs. More useful would be analogies and examples of similar decisions, not a search for concrete patterns. AI needs repetition, and most executive decisions don't lend themselves to A/B testing or other research methods. However, some routine or small issues could eventually be handled by cognitive computing.

4. Can data be labeled for quality?
Jim Harris (@ocdqblog) describes must-haves for data quality. His SAS blog post compares consuming data of unknown quality to purchasing unlabeled food. One possible solution: a data-quality 'label' implemented as a series of yes/no or pass/fail flags appended to each data structure. These flags could indicate whether all critical fields were completed, and whether specific fields were populated with a valid format and value.
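To make the idea concrete, here is a minimal sketch of such a label in Python. The field names and validation rules are purely illustrative, not from Harris's post; the point is simply that a record carries its own pass/fail quality flags.

```python
import re

# Illustrative critical fields and per-field validators (assumptions,
# not from the original post). A real label would use the organization's
# own data dictionary.
CRITICAL_FIELDS = ["customer_id", "email", "signup_date"]

VALIDATORS = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$").match,
    "signup_date": re.compile(r"^\d{4}-\d{2}-\d{2}$").match,  # ISO date
}

def label_record(record):
    """Return the record with a data-quality 'label' of pass/fail flags appended."""
    label = {
        # True only if every critical field is present and non-empty
        "all_critical_fields_complete": all(
            record.get(f) not in (None, "") for f in CRITICAL_FIELDS
        ),
    }
    # One flag per validated field: was it populated with a valid format/value?
    for field, is_valid in VALIDATORS.items():
        label[f"{field}_valid"] = bool(is_valid(str(record.get(field, ""))))
    return {**record, "quality_label": label}

row = {"customer_id": "C-102", "email": "pat@example.com", "signup_date": "2016-07-01"}
print(label_record(row)["quality_label"])
```

A consumer of the data could then filter or weight records by their flags without re-running the quality checks, much as a shopper reads a nutrition label instead of testing the food.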

Evidence & Insights Calendar:

August 24-25; San Francisco. Sports Analytics Innovation Summit.

September 13-14; Palo Alto, California. Nonprofit Management Institute: The Power of Network Leadership to Drive Social Change, hosted by Stanford Social Innovation Review.

September 19-23; Melbourne, Australia. International School on Research Impact Assessment. Founded by the Agency of Health Quality and Assessment, RAND Europe, and Alberta Innovates.
