Really, really good essay. The levels of tacit knowledge embedded in the work are often hard to extract, and since part of the question is how much of what seems routine creates variation in outcomes, one answer here too is more replication, purely to capture the stochasticity in outputs as we run more of it.
Also, on a similar note: https://www.strangeloopcanon.com/p/two-stories-about-tacit-knowledge
This hotdog story is pure gold!
It's also a pity that there is no more efficient way to do this than for researchers to observe in real life for a period of weeks. This might be more in the domain of sociology than meta-research, though. I.e., perhaps we are limited by our sociological behavior, so that we simply must resort to these basic observational practices, no matter how great the tech gets. That is, until a technology is so advanced it can implicitly account for all the variables and inform us what they are. But the hotdog story illustrates that even in exceedingly "simple" cases, it's nigh impossible to pick up on all of them.
I wonder if this is an actual application for the Internet of Things; it should be possible to design lab equipment that allows you to export experimental parameters to a LIMS in a verifiable manner.
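To make that slightly more concrete, here's a rough sketch of what a verifiable export could look like (Python, with invented field names and a hypothetical per-device key; this is not any real instrument or LIMS API): the instrument signs the parameters it actually recorded, so the LIMS can later check that the record wasn't edited after the fact.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

# Hypothetical per-instrument secret, provisioned when the device is installed.
DEVICE_KEY = b"replace-with-a-provisioned-secret"


def export_run_record(instrument_id: str, parameters: dict) -> dict:
    """Package one run's parameters with a signature that a LIMS could verify later."""
    record = {
        "instrument_id": instrument_id,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "parameters": parameters,
    }
    # Canonical JSON (sorted keys) so the same record always serialises the same way.
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    record["signature"] = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return record


def verify_run_record(record: dict) -> bool:
    """Recompute the signature over everything except the signature field itself."""
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode("utf-8")
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])


# Example: a water bath reporting what it actually did, not just what the protocol said.
run = export_run_record("waterbath-03",
                        {"set_temp_c": 37.0, "measured_temp_c": 36.4, "duration_min": 30})
assert verify_run_record(run)
```

An HMAC is only one option; the point is just that the record of what actually happened comes from the equipment itself rather than being reconstructed from memory.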
I enjoyed your article, and the hot dog story is a great one! I appreciate how clearly you unpack the LLM issue in terms of reading and understanding scientific literature as well.
Since starting in my current research integrity role, I've been on a bit of a quest to improve our scientific method writing. It's not the be-all and end-all of the problem, but I do think it's something that can be significantly improved.
One thing I'm trying is to encourage researchers to add more detail to their methods. I wrote a bit about this here - https://news.cancerresearchuk.org/2022/12/07/research-with-integrity-the-madness-of-short-methods/ - where I blame page limits in printed journals for setting a precedent for methods to be very short. People need to be shown more examples of good and detailed methods and protocols (and I like your point about video methods, too) so that we can be clearer about what was done and how it was done.
Another puzzle piece I'm trying to fit is bringing in electronic lab notebooks. Many biology labs in the UK still use physical lab books, and while these have many advantages, they are quite laborious to write in and don't provide much structure. By moving to an ELN I hope to offer ways of recording experiments more easily, using templates, forms and other structures to help capture experiments in more detail (a rough sketch of the sort of thing I mean is below). There are technical and cultural challenges to overcome in bringing in a new way of working, and I'd be interested in your thoughts: have you considered whether ELNs or similar would be beneficial (or not) for science?
Again, not the entire solution, but hopefully some positive steps forward.
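On the templates point above, even a very simple structured entry can prompt people to record the details that usually go missing. Just as an illustration (the field names here are invented, not taken from any particular ELN):

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Reagent:
    name: str
    supplier: str
    lot_number: str  # lot/batch differences are a classic hidden variable


@dataclass
class ExperimentEntry:
    title: str
    protocol_link: str                   # e.g. a protocols.io URL or an internal SOP
    reagents: List[Reagent] = field(default_factory=list)
    equipment: List[str] = field(default_factory=list)
    ambient_conditions: str = ""         # temperature, humidity, anything else noticed
    deviations_from_protocol: str = ""   # a prompt to write down the "hot dog" details
    raw_data_location: str = ""


entry = ExperimentEntry(
    title="Western blot, replicate 2",
    protocol_link="https://example.org/sop/western-blot-v3",
    reagents=[Reagent("Anti-actin antibody", "ExampleCo", "LOT-4521")],
    equipment=["Blot imager #2"],
    deviations_from_protocol="Transfer ran 5 minutes longer than specified.",
)
```

The exact fields matter less than having a place where the "invisible" details are asked for every single time.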
Good thoughts -- lab notebooks and protocols (e.g., Protocols.io) might help a lot here, but even so, what about the things no one even thinks to write in a protocol because it is such an invisible feature of the environment that it isn't even noticed?
Agree this is a big issue - how do we know the things we don't know we don't know? (I play this clip of Donald Rumsfeld during our Institute induction sessions - https://youtu.be/REWeBzGuzCc?si=VOgbhASjTDif-52B - to try to make this point!)
I do like the idea of filming experiments, and I'm trying a bit of that with researchers, but it's quite a lot of work, and you still have to make sure you capture everything! Plus, you end up having to decide whether to include the best example of how to do it, a 'kind of in the middle' example, or some of the ones that don't quite go right as well!
Both of these stories speak to the confounding nature of (what we might think are) mundane details. Explicit knowledge can be studied, tacit knowledge must be demonstrated — and this sort is... not even technically *knowledge*? They're like incidental byproduct effects baked into a process chain that even the tacit knowledge-having experts aren't conscious of.
Love this essay!
I think this counts as what James Scott called "metis".