Sports Science Monthly – January 2021
Every month we take a deep dive into the latest research in sports science. To kick off 2021 we’ve put together one of our biggest editions yet, reviewing 12 new articles on a range of topics, from a critique of data-driven coaching and the repeatability of training improvements to caffeine periodization, crowd wisdom, and what coaches can learn from hunter-gatherers.
As always, the full Sports Science Monthly is available exclusively to HMMR Plus Members. You can browse the past topics on our archive page. The first topic below is free to everyone, but sign up now to read about all the latest research. To get an idea of what Sports Science Monthly is all about, the April 2016 edition is available in its entirety for free.

This Month’s Topics
- The illogic of being data driven
- Caffeine periodization
- Managing jet lag and travel fatigue
- Repeatability of training improvements
- Just how much wisdom is there in a crowd?
- Lessons from hunter-gatherers to guide sports innovation
- Quick-fire round
The illogic of being data driven
Quick Summary – The blind use of data within sport can lead to poor decision making and performance outcomes. Instead, we should include coach-led intuitive “feel” as an additional data point when making decisions, and ensure the data we do collect is valid and reliable.
The recent HMMR Media site theme was sporting technology and data, which has been a hot topic within sports science over the last couple of years. In December, I was fortunate to be an (online) delegate at the Australian Institute of Sport Sports Technology and Applied Research Symposium (STARS), where many presentations focused on current and future uses of data and technology within sport. At the same time, there was a general awareness that our answers can only be as good as the data we collect, and that, sometimes, we run the risk of over-reliance. Legendary Australian hockey coach Ric Charlesworth was one such speaker, with his session titled “The numbers matter—but which ones?” Charlesworth’s point was that data has huge potential, but we must recognize its limitations and avoid becoming over-reliant on it.
Along a similar theme is a recent commentary piece published in Science and Medicine in Football. In the article, titled “The illogic of being data-driven: reasserting control and restoring balance in our relationship with data and technology in football”, the authors take a look at some issues with our frequent—and often blind—use of data in football, issues which I think apply across the whole of sport. Their main point is that, in today’s climate of wearable sensors and data analytics, we are becoming increasingly data-driven in our decisions, meaning that, in line with Charlesworth’s comments, decisions are made “by the numbers.”
In football, this is especially problematic due to the sport’s highly complex nature; unlike throwing the discus, for example, which is a somewhat closed skill, football (and, by extension, other team sports) is affected by a large number of factors, with the twenty-two players on the pitch able to interact in a myriad of ways that drive different outcomes. This complexity is a problem when it comes to analysis: we likely can’t capture or quantify all the influencing factors, and so any data analysis is incomplete.
According to the authors, we have become data-driven and data-reliant in part because of the sports science arms race; as technology develops, staff and clubs want to gain an edge on their competitors, and are happy to spend money to do so. Stories of data analysis delivering a winning edge, such as Michael Lewis’ Moneyball, only add fuel to the fire. Furthermore, it can be hard to fully evaluate the various data collection and analysis technologies, as many utilize a “black box” method, whereby the analysis methods and algorithms are proprietary and, as a result, unknown to the practitioners using them. This can be especially problematic when the black box corrects data inputs, an issue found to affect the calculation of the acute:chronic workload ratio, the source of many recent controversies.
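To make that metric concrete, here is a minimal sketch of the most commonly described rolling-average form of the acute:chronic workload ratio, using made-up daily load values; the specific numbers and the 7/28-day windows are illustrative assumptions, and real implementations vary, which is exactly the black-box concern:

```python
import pandas as pd

# Hypothetical daily training loads (e.g., session RPE x minutes); values are invented
loads = pd.Series([420, 380, 0, 510, 450, 0, 300] * 5)

acute = loads.rolling(window=7).mean()     # acute load: 7-day rolling average
chronic = loads.rolling(window=28).mean()  # chronic load: 28-day rolling average
acwr = acute / chronic                     # ratios well above 1.0 are often read as load spikes

print(acwr.dropna().round(2).tail())
```

If a proprietary tool silently smooths, imputes, or caps the inputs before this division, practitioners can end up acting on a ratio that differs from the one they think they are monitoring.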
Becoming data-driven can also affect decision making and incentives. Goodhart’s Law states that, when a measure becomes a target, it ceases to be a good measure. In sport, whenever we use training data as a proxy for performance, we run the risk of getting into trouble, especially if the particular metric of interest doesn’t transfer to performance as well as we hope it does.
This isn’t to say that we shouldn’t utilize data; it can be hugely useful, and intricate statistical models can often process more data than we, as humans, can. Instead, write the authors, we need to be more judicious in how we use data, making decisions that take into account both hard data and more subjective, qualitative assessments—often termed the coach’s eye. If we view technology and analytics as tools that can support us—instead of drive us—in making a decision, we can get both a deeper and more nuanced perspective. In turn, we become data-informed, as opposed to data-driven.
To become data-informed, we need to be clear on the questions we want to answer; defining these then determines what data is needed to provide the answer, which in turn drives the technology and techniques we will use. We should ensure our data collection technologies are valid—they collect what they purport to collect—and reliable, whereby they produce the same outputs over time, as opposed to having a large amount of error. The authors also believe we need to be increasingly selective around data sharing; just because we’ve collected a large amount of data doesn’t mean it all should be blindly shared with coaches and athletes. Instead, we should share only the data that is required to assist in decision making, to avoid overburdening and confusing all involved. As a result, being data-informed means that we use data to support decision making, as opposed to blindly following it as a means of making automatic decisions. We can also use data to review decisions we’ve made, increasing our confidence in the various methods we use as time goes on—and making us aware of any errors that might be creeping in.
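As an illustration of the reliability point, a common field check is a simple test-retest analysis. The sketch below computes the typical error of measurement and its coefficient of variation from hypothetical repeated trials; the test, device, and values are all assumed for the example:

```python
import numpy as np

# Hypothetical test-retest data: the same athletes measured twice on the same device
trial1 = np.array([41.2, 38.5, 44.0, 39.8, 42.1])  # e.g., jump height (cm), day 1
trial2 = np.array([40.8, 39.1, 43.2, 40.5, 41.6])  # same test repeated, day 2

diff = trial2 - trial1
typical_error = diff.std(ddof=1) / np.sqrt(2)  # typical error of measurement
cv_percent = 100 * typical_error / np.mean([trial1.mean(), trial2.mean()])

print(f"Typical error: {typical_error:.2f} cm ({cv_percent:.1f}% CV)")
```

If the typical error is large relative to the changes we hope to detect, the device’s outputs are telling us more about measurement noise than about the athlete.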
In short, we must be better at using data to support our own decisions, as opposed to relying on it to “automate” our decision-making processes. In doing so, we can make better decisions and, as a result, hopefully drive performance onto a higher level. In closing, the authors bring out a Hans Rosling quote: “the world cannot be understood without numbers, and it cannot be understood with numbers alone.”