In this post we discuss why getting fitness and rehab advice from ChatGPT can lead to problems and why human judgement is essential.

Have you ever read a newspaper article on a topic you know well and been struck by how many inaccuracies it contains?
I’m certain this isn’t unique to my field; most likely it happens across the board.
Unless the journalist is a specialist in a particular subject, it’s almost inevitable.
I frequently have the same experience using ChatGPT.
Whilst you can marvel at its ability to answer a broad array of questions, dig a little deeper and it’s easy to find the flaws.
It’s instructive that people who’ve worked on developing these models tell their own friends and children not to use them; that alone speaks volumes.
Ultimately, these models don’t actually know anything. They work by predicting what text is likely to come next based on patterns in the data they were trained on.
At best they give you a middle-of-the-road answer, and its quality depends entirely on the quality of that training data.
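To make that point concrete, here is a toy sketch of next-word prediction. This is illustrative only, not how production models are built: it simply counts which word follows which in a tiny made-up corpus and returns the most common continuation, which is the same basic idea (patterns in training data, no actual understanding) at a vastly smaller scale.

```python
from collections import Counter, defaultdict

# A tiny "training corpus" (invented for illustration).
corpus = "stretch before you run and rest after you run and hydrate".split()

# Count which word follows which: this is the only "knowledge" the model has.
follows = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    # Return the word most frequently seen after `word` in the training text,
    # or None if the word never appeared. No meaning is involved, only counts.
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("you"))  # → "run" ("run" follows "you" twice in the corpus)
```

The model can only ever reproduce patterns already present in its data, which is exactly why its answers trend towards the middle of the road rather than towards expert nuance.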
Whilst this might suffice for general queries, it won’t provide enough depth and nuance to arrive at valuable solutions.
Certainly not in my field and probably not in yours either.
Get your fitness and rehab advice from humans for the time being.