If you have been too busy running gels…
Replication: The U.S. Government Accountability Office (GAO) published a report on Federal Actions Needed to Promote Stronger Research Practices [link]. The recommendations, which apply to the NIH, NSF, and NASA, require the agencies to collect information on research quality and to increase the transparency of research results. See more below.
Nudges: A large meta-analysis, originally published in January, is making the rounds again. The analysis showed that research on “choice architecture interventions” (bah!) suffers from publication bias and unexplained heterogeneity. See below.
Webb: After reading this news, you may attempt to relax by looking at the amazing picture of the Cartwheel Galaxy, only 500 million light-years away.
Its appearance, much like that of the wheel of a wagon, is the result of an intense event – a high-speed collision between a large spiral galaxy and a smaller galaxy not visible in this image. … The Cartwheel Galaxy sports two rings — a bright inner ring and a surrounding, colorful ring. These two rings expand outwards from the center of the collision, like ripples in a pond after a stone is tossed into it.
Here’s to hoping…
According to the GAO report (see above): “Officials from NIH, NSF, and NASA emphasized that it would be time consuming and resource intensive to attempt to replicate or reproduce the work underlying the thousands of published manuscripts each year. In addition, officials from these agencies maintained to us that the grant application review process and the pre-publication peer review process are adequate to ensure appropriate rigor.” This attitude is disappointing.
However, the report itself includes various interesting suggestions for improvement that are not part of the official recommendations.
… NIH made changes in 2015 to require that—beginning in January 2016—grant applicants provide an explicit discussion and evaluation of the rigor of the relevant prior research, how they intend to address any weaknesses therein, and how their proposed experimental design and methods will achieve robust and unbiased results.
Read the report [pdf].
Not nudges again?!
So at least the reliability problems in social psychology and behavioral economics, the ones that drew attention to research quality in the first place, were solved, right? Nah.
Read the original meta-analysis on nudges from January.
And then read the discussion that it sparked. We suspect that Maier et al.’s conclusion comes close to the truth of the matter: they find no evidence for nudging after adjusting for publication bias.
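Maier et al.’s adjustment is, as we understand it, a Bayesian model-averaged meta-analysis; we won’t reproduce it here, but for readers wondering what “adjusting for publication bias” means in practice, here is a rough, illustrative sketch (simulated data, not their method) of a simpler PET-style check: regress observed effect sizes on their standard errors and read the precision-weighted intercept as the effect a hypothetical infinitely precise study would see. If only “significant” small studies get published, the naive average looks impressive while the intercept sinks toward zero.

```python
# Illustrative sketch only: a PET-style (precision-effect test) look at
# small-study effects. This is NOT the Bayesian model-averaging analysis
# used by Maier et al.; the data below are simulated for demonstration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

n = 200
se = rng.uniform(0.05, 0.40, size=n)      # studies vary in precision
d_obs = rng.normal(0.0, se)               # true effect is zero in this toy world
signif = (d_obs / se) > 1.64              # crude one-sided "significance"
lucky = rng.random(n) < 0.2               # some null results slip through anyway
published = signif | lucky                # significant studies are far likelier to appear

d_pub, se_pub = d_obs[published], se[published]
print("naive mean of published effects:", d_pub.mean())

# PET regression: observed effect ~ intercept + slope * SE, weighted by precision.
# The intercept estimates the effect as SE -> 0, i.e. corrected for small-study bias.
X = sm.add_constant(se_pub)
fit = sm.WLS(d_pub, X, weights=1.0 / se_pub**2).fit()
print("bias-adjusted intercept:", fit.params[0], " small-study slope:", fit.params[1])
```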
[Cass] Sunstein continues to write as if nudges and cost-benefit analysis—the whole technocratic shebang—are cheeky new ideas worth giving a shot, and not the codified bedrock of an approach to government with a real and unflattering historical track record. Experience has not shaken the Sunstein worldview. It remains as smug as ever.
Savor this nasty 2019 review of Cass Sunstein by Aaron Timms.
Odds and Ends
It’s the fiftieth anniversary of the revelations about the Tuskegee Syphilis Study.
A major publication demonstrates the use of stem cells to create synthetic mouse embryos. Read about the commercial and ethical implications of this work in MIT Technology Review.