This originated as a post to a mailing list on the subject of blockchains and how they might help the cause of open science. The quote below is the claim I was directly responding to.
Shauna: “but the scientific community is arguably the most effective trust-based system in human history” – according to this view we wouldn’t need version control systems or preregistration either. I couldn’t disagree more; trust has no place in science. To me, that’s one of the major things with open scientific practices: removing the trust from science and instead practice transparency.
Trust is a fundamental issue in all human relationships and communities. Every single interaction we have with other human beings involves some level of trust. Just today I had to trust the people who harvested and packaged my food not to have accidentally or maliciously poisoned it, the drivers on the street to obey traffic conventions, and my housemate to have locked the house and not invited anyone dangerous inside — and those are only the most obvious examples. I could come up with dozens more.
Different situations require different levels of trust. If a situation requires more trust than you currently have, you can try to increase trust in a number of ways. You can build new technologies, but you can also strengthen relationships, create neutral institutions, or add legal or regulatory force to your agreements. None of these work perfectly, and often you’re best off pursuing a combination of them. In all cases, though, you will have to trust someone at some point – it’s just a matter of deciding which system will allow you to trust in a way that’s acceptable to you.
The scientific community has trust issues, yes, like every other human community. But its trust issues are of a specific type. When you read a scientific paper, what makes you doubt the findings? Personally, I’m not worried that the authors have faked the data, or that the publisher has changed the content of the paper without anybody knowing, or that the paper is stolen or plagiarized. I know that the scientific community has very strong norms against these types of violations, and so they’re relatively rare. Broadly speaking, I trust the scientific community to minimize these problems. There aren’t many communities I would trust like that, which is why I claimed that science is special in this way.
The trust issues that the scientific community currently has are largely based around misaligned incentives. I trust most scientists not to engage in outright fraud, but I can’t trust them to make choices in their research practices that may hurt their careers. They know how the funding, publication, and tenure systems work, and they know that replications, preregistration, and following strict practices to minimize false positives will hurt their careers. Simply put: most scientists don’t trust that taking actions to make science better will be rewarded rather than punished. In a world of decreasing funding and a decaying social safety net, is anyone surprised that people do what’s best for themselves within the existing norms of the community?