1. Management research: Alchemy → Chemistry?
McKinsey’s Michael Birshan and Thomas Meakin set out to “take a data-driven look” at the strategic moves of newly appointed CEOs, and how those moves influenced company returns. The accompanying podcast (with transcript), CEO transitions: The science of success, says “A lot of the existing literature is quite qualitative, anecdotal, and we’ve been able to build a database of 599 CEO transitions and add a bunch of other sources to it and really try and mine that database hard for what we hope are new insights. We are really trying to move the conversation from alchemy to chemistry, if you like.”
The research was first reported in How new CEOs can boost their odds of success. McKinsey’s evidence says new CEOs make similar moves, with similar frequency, whether they’re taking over a struggling company or a profitable one (see chart). For companies not performing well, Birshan says the data support his advice to be bold and make multiple moves at once. Depending on how you slice the numbers, both external and internal hires fared well in the CEO role (8).
Is this science? CEO performance was measured by excess total returns to shareholders (TRS), “which is the performance of one company over or beneath the average performance of its industry peers over the same time period”. Bottom line: can you attribute excess TRS directly to a CEO’s activities? Organizational redesign was correlated with significant excess TRS (+1.9 percent) for well-performing companies. The authors say “We recognize that excess TRS CAGR does not prove a causal link; too many other variables, some beyond a CEO’s control, have an influence. But we do find the differences that emerged quite plausible.” Hmm, correlation still does not equal causation.
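To make the metric concrete, here’s a minimal sketch of excess TRS as the quoted definition describes it: a company’s total return to shareholders minus the average TRS of its industry peers over the same period. The function names and numbers are my own illustration, not McKinsey’s methodology; the report’s actual measure is a multi-year CAGR, which this sketch skips for clarity.

```python
# Illustrative sketch of excess TRS (a simplification, not McKinsey's
# methodology): a company's total return to shareholders minus the
# average TRS of its industry peers over the same period.

def trs(start_price, end_price, dividends):
    """Total return to shareholders: price change plus dividends,
    as a fraction of the starting price."""
    return (end_price - start_price + dividends) / start_price

def excess_trs(company, peers):
    """Company TRS minus the mean TRS of its industry peers.
    Each argument holds (start_price, end_price, dividends)."""
    peer_avg = sum(trs(*p) for p in peers) / len(peers)
    return trs(*company) - peer_avg

# A company returning 12% against peers averaging 10% shows +2% excess TRS:
company = (100.0, 110.0, 2.0)           # 12% TRS
peers = [(100.0, 108.0, 2.0),           # 10%
         (50.0, 54.0, 1.0),             # 10%
         (200.0, 216.0, 4.0)]           # 10%
print(round(excess_trs(company, peers), 4))  # 0.02
```

Even with the metric pinned down this precisely, the attribution question in the text remains: the arithmetic says nothing about whether the CEO caused the gap.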
Examine the evidence. The report’s end notes answer some key questions: Can you observe or measure whether a CEO inspires the top team? Probably not (1). Where do you draw the line between a total re-org and a management change? They define ‘management reshuffle’ as 50+% turnover in the first two years (5). But we have other questions: How were these data collected and analyzed? Some form of content analysis would likely be required to assign values to variables. How were the 599 CEOs chosen as the sample? Selection bias is a concern. Were some items self-reported, from interviews or survey results? Were findings validated by assigning a second team to check for internal reliability? External reliability?
2. ICER + pharma → Fingerpointing.
There’s a kerfuffle between pharma companies and the nonprofit ICER (@ICER_review). The Institute for Clinical and Economic Review publishes reports on drug valuation, and studies comparative efficacy. Biopharma Dive explains that “Drugmakers have argued ICER’s reviews are driven by the interests of insurers, and fail to take the patient perspective into account.” The National Pharmaceutical Council (@npcnow) takes issue with how ICER characterizes its funding sources.
ICER has been doing some damage control, responding to a list of ‘myths’ about its purpose and methods. Its rebuttal, Addressing the Myths About ICER and Value Assessment, examines criticisms such as “ICER only cares about the short-term cost to insurers, and uses an arbitrary budget cap to suggest low-ball prices.” Also, ICER’s economic models “use the Quality-Adjusted Life Year (QALY) which discriminates against those with serious conditions and the disabled, ‘devaluing’ their lives in a way that diminishes the importance of treatments to help them.”
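For readers unfamiliar with the QALY, a minimal sketch of how the metric works (illustrative numbers and function names, not ICER’s actual economic model): each year of life is weighted by a health utility between 0 and 1, which is exactly why critics argue the metric can ‘devalue’ treatments for people whose health states are assigned lower weights.

```python
# Minimal sketch of the QALY calculation (illustrative, not ICER's model):
# each year of life is weighted by a health utility between 0 (death) and
# 1 (full health), so a treatment's benefit is the utility-weighted years
# it adds.

def qalys(years_and_utilities):
    """Sum of (years lived) x (utility weight) over health states."""
    return sum(years * utility for years, utility in years_and_utilities)

# Ten years at utility 0.75 counts as 7.5 QALYs. The critics' point:
# the same ten years count for less when the utility weight assigned
# to a person's baseline health state is lower.
print(qalys([(10, 0.75)]))  # 7.5
```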
3. Immortal time bias → Overstated findings.
You can’t get a heart transplant after you’re dead. The must-read Hilda Bastian writes on Statistically Funny about immortal time bias, a/k/a event-free time or competing risk bias. It arises when an analysis misclassifies follow-up time during which the outcome of interest could not have occurred: patients must survive long enough to receive a heart transplant, so the waiting time before transplant is ‘immortal’. Numerous published studies, particularly those including Kaplan-Meier analyses, may suffer from this bias.
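A toy simulation makes the trap visible. In the sketch below (all numbers and distributions invented for illustration, not taken from Bastian’s post), transplant has no effect on survival at all, yet naively grouping patients by “ever transplanted” makes transplant look protective, because only patients who survive the waiting period can be transplanted.

```python
# Toy simulation of immortal time bias. Transplant here has NO effect on
# survival, yet the naive analysis that groups patients by "ever
# transplanted" makes transplant look protective: you must survive the
# waiting time in order to be transplanted at all.
import random

random.seed(42)

def simulate(n=10_000):
    transplanted, controls = [], []
    for _ in range(n):
        survival = random.expovariate(1 / 12)  # months lived; same process for everyone
        wait = random.expovariate(1 / 6)       # months until an organ arrives
        if wait < survival:                    # lived long enough to get the transplant
            transplanted.append(survival)
        else:
            controls.append(survival)
    mean = lambda xs: sum(xs) / len(xs)
    return mean(transplanted), mean(controls)

t, c = simulate()
print(f"'transplanted' mean survival: {t:.1f} months")
print(f"'control' mean survival:      {c:.1f} months")
# The transplanted group appears to live far longer purely because the
# pre-transplant waiting time is 'immortal': dying early assigns you to
# the control group by construction.
```

The fix in the literature is to treat transplant as a time-varying exposure, so pre-transplant person-time is counted as unexposed rather than credited to the transplant group.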
4. Climate change → Weird weather?
This week the US is battling huge fires and disastrous floods: Climate change, right? Maybe. There’s now a thing called event attribution science, in which researchers apply probabilistic methods to estimate whether a given extreme weather event resulted from climate change. The idea is to establish, and eventually predict, climate change’s role in adverse impacts.
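One quantity commonly reported in this literature is the fraction of attributable risk (FAR), which compares an extreme event’s probability in the actual world with its probability in a modeled counterfactual world without human-caused warming. A minimal sketch, with made-up probabilities:

```python
# Fraction of attributable risk (FAR), a standard quantity in event
# attribution: with p0 the probability of an extreme event in a modeled
# world without human-caused warming and p1 its probability in the actual
# world, FAR = 1 - p0/p1 is the share of the event's risk attributable
# to climate change. The probabilities below are made up for illustration.

def fraction_attributable_risk(p0, p1):
    if p1 <= 0:
        raise ValueError("p1 must be positive")
    return 1 - p0 / p1

# A flood that was a 1-in-100-year event without warming but is now a
# 1-in-25-year event:
print(fraction_attributable_risk(1 / 100, 1 / 25))  # 0.75
```

The hard part, of course, is not this arithmetic but estimating p0 and p1 credibly from climate model ensembles.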
Evidence & Insights Calendar:
September 13-14; Palo Alto, California. Nonprofit Management Institute: The Power of Network Leadership to Drive Social Change, hosted by Stanford Social Innovation Review.
September 20-22; Newark, New Jersey. Advanced Pharma Analytics. How to harness real-world evidence to optimize decision-making and improve patient-centric strategies.
Photo credit: Fingerpointing by Tom Hilton.