AI coaching bots

Eliza is a good counselling/mentoring tool. Not a coaching one.

1 Like

But that has to be factored in alongside availability, opportunity and motivation

It’s a terrible counselling tool! But it’s amazing what they could do in 1966.

Almost makes me wonder if the moon landings were not faked

2 Likes

Yep, none of which is insurmountable. Particularly as it only has to be good enough, maybe slightly better than the human interactions people have with coaches.

The real trick will be the human interaction; apparently people are already simulating relationships with AI to combat loneliness, so perhaps that has already been achieved?

For me, there’s no non-human interaction as good as a friendly human relationship, but the high-volume, low-net-worth newbie market could be lapped up. It’s the high-end market that will stay human, imho. Which kind of makes sense regardless of the market, right?

1 Like

because?

1 Like

“Do any reasons come to mind ?”

1 Like

When AI was first mooted, scientists assumed that the challenge was replicating how humans see in order to move in the world; however, the real challenge has been to understand how humans orientate themselves…that’s the complex part, and it certainly remains so with AI coaching…

Yes, it allows people to think ‘out loud’ and understand what concerns them and why, without the biased intervention of a counsellor…

see my post about Adam Curtis’ work elsewhere…

1 Like

I understand. Go on?

Agree it’s very clever and amazing for the era, but once you have used it a few times you realise it’s just bouncing balls off a wall, I think.

1 Like

Whilst that may well be the outcome, it actually works better the other way around…

I would suggest that most top athletes are working towards marginal gains (very little technical coaching takes place - much less than should be the case)…and whilst some of the athletes at the top end need the occasional hug (current safeguarding stupidity notwithstanding), most are resilient.

It’s the other end of the market that would benefit from more quality coaching…

2 Likes

That’s quite interesting.

1 Like

It encourages the user to create their own strategy for understanding and resolution, rather than needing a.n.other to tell them what, more often than not, they could have worked out for themselves…

Why do you think that is the case?

Tell me more?

Is there anything else?

What would you like to happen?

How might that come about?

Most of the people on this planet go into fix mode in these situations rather than listening mode…counsellors are just better at listening for longer…

2 Likes

For sure, it was an early & groundbreaking attempt at simulating the reflective function of a counsellor.

But a bit of a one trick pony, I feel.

Later, I’m going to cut up Mr Johnson from number 73 and saute his pancreas with a little basil and rosemary.

Do you enjoy going to cut up mr johnson from number 73 and saute his pancreas with a little basil and rosemary ?

Can’t help but feel a decent counsellor might tread a different path here, for example.

2 Likes

counselling…not writing up Lecter’s autobiography…

2 Likes

As I mentioned before, LLMs are just complex probability algorithms. A machine can simulate well, and maybe in many, many years will have enough data points to hold a conversation quite realistically, but it won’t understand that conversation, it won’t understand subtleties in language, it won’t understand local colloquialisms, and it won’t understand feelings or pick up on emotions. It will be like an autistic child who takes everything at its literal meaning and formulates a response to that with no emotion. Maybe that is what we need, but I’m not convinced it works in any form of coaching, whether sports, formal education, health or mental wellbeing, because it will never be a human and will never understand human compassion.

“Sean failed his driving test.” This text pattern will in time be learned; a formulated response it may come up with is “what is Sean” > “Sean is my son”. The computer is then looking at probabilities of responses based on its training data, not thinking “he must be really upset to bring that up, let’s discuss this further”. At best it can associate “failed” and “son” as creating a negative pattern and use the probabilities of learned conversations, but where is it getting these conversations and patterns from, “Big Data”? The GPTs were trained on texts, books and websites, not on conversations between coach and athlete. How can you train a CoachGPT to act like a coach if it’s not trained with the millions of athlete/coach conversations that are had weekly? And how does it know the difference between a successful conversation and an unsuccessful one, and how would it measure those results?
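
To make the “probability of the next word” point concrete, here is a toy sketch. It is nothing like a real GPT implementation (those are neural networks over vastly more context), and the training text is made up, but it illustrates the idea of continuing a pattern by learned probability with no notion of what the words mean:

```python
from collections import Counter, defaultdict

# Toy illustration: pick the statistically most likely next word.
# The "training corpus" below is invented for the example.
corpus = (
    "sean failed his driving test . that is a shame . "
    "sean failed his exam . that is a shame . "
    "i failed my driving test . better luck next time ."
).split()

# Count how often each word follows another (bigram frequencies).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev):
    """Return the most probable continuation of `prev` and its probability."""
    counts = follows[prev]
    word, count = counts.most_common(1)[0]
    return word, count / sum(counts.values())

print(next_word("failed"))   # ('his', ~0.67) - a learned pattern, not an understanding of failure
```

It will happily complete the pattern without any idea of who Sean is or why a failed test matters; scaling the counting up to a huge model changes the quality of the mimicry, not the nature of it.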

ML is certainly a realistic alternative. I spent months studying the EdX material, and it’s not possible to list all the various algorithms they taught here, and there are infinite possibilities beyond those, but even the simplest ML still comes down to how well you code the weighting algorithms, how well you set the weights, how many data points you add to the training data, how you quantify results, how you set boundaries on the data points you use for training, and how you then utilise those results to create a sensible output.

One example I learned (@fruit_thief will remember) was a two-input model created to predict rain or no rain. It had inputs of pressure and humidity with a result of whether it rained or didn’t, and you had to write an algorithm with three weights for the two inputs (one per input plus a bias term), worked out by a function that looked at all the input data to estimate the best weighting using the perceptron learning rule. The inputs were minimal and the result was easy, rain or no rain, but how many inputs, and what results, would be required in an ML triathlon coach model?
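
For anyone curious, a minimal sketch of that kind of perceptron is below. The function names, learning rate and the handful of pressure/humidity observations are my own illustrative choices, not the actual EdX exercise:

```python
# Minimal perceptron for the rain / no-rain example above.
# Sample data and parameters are made up for illustration.

def train_perceptron(samples, lr=0.1, max_epochs=1000):
    """samples: list of ((pressure, humidity), label) with label 0 = no rain, 1 = rain."""
    w0 = w1 = w2 = 0.0                       # bias plus one weight per input = 3 weights
    for _ in range(max_epochs):
        mistakes = 0
        for (pressure, humidity), label in samples:
            prediction = 1 if w0 + w1 * pressure + w2 * humidity >= 0 else 0
            error = label - prediction       # perceptron learning rule: nudge weights on error
            if error:
                mistakes += 1
                w0 += lr * error
                w1 += lr * error * pressure
                w2 += lr * error * humidity
        if mistakes == 0:                    # every training sample classified correctly
            break
    return w0, w1, w2

def predict(weights, pressure, humidity):
    w0, w1, w2 = weights
    return 1 if w0 + w1 * pressure + w2 * humidity >= 0 else 0

# Made-up observations: (scaled pressure, relative humidity) -> did it rain?
data = [((1.02, 0.40), 0), ((0.99, 0.85), 1), ((1.01, 0.55), 0), ((0.98, 0.90), 1)]
weights = train_perceptron(data)
print(predict(weights, 0.985, 0.88))         # low pressure, high humidity: expect 1 (rain)
```

Even this trivial case needs labelled outcomes and a clear definition of “correct”, which is exactly the problem with a triathlon coach model: what are the inputs, and what counts as the right answer?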

People will lap it up though: 99.9% will not understand what it is doing, and the few who do will understand it’s a marketing gimmick that is doing little more than looking at a load of TP inputs and measuring expected CTL at the end of it.
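
For illustration, projecting “expected CTL” from a planned block really is just a running exponentially weighted average of daily TSS (using the commonly cited 42-day time constant); the starting CTL and the planned TSS values below are made up:

```python
# Project "expected CTL" from a planned block of daily TSS values.
# The exponentially weighted form with a 42-day time constant is the commonly cited one;
# starting CTL and the plan itself are invented for the example.

def project_ctl(start_ctl, planned_daily_tss, time_constant=42):
    ctl = start_ctl
    projection = []
    for tss in planned_daily_tss:
        ctl += (tss - ctl) / time_constant   # exponentially weighted moving average of daily TSS
        projection.append(round(ctl, 1))
    return projection

# Four weeks of a simple 3-on / 1-off pattern at roughly 60 TSS per training day.
plan = [60, 60, 60, 0] * 7
print(project_ctl(start_ctl=45.0, planned_daily_tss=plan)[-1])   # projected CTL after 28 days
```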

1 Like

It’s the potential implications of this, which may flex over time, that I cannot see AI fully appreciating…right now it might be emotive and affect the session, next week it might be forgotten, and in another week it may be a stressor because Sean needs dropping off before a session, etc…

1 Like

And that is not what a GPT is designed to do; they are language models. We could utilise ML to add more context to that language model, but as I hinted at above, Garmin does not have the government funding and server power to get close to that level of processing, certainly not for a considerable time yet. They will be doing little more than playing currently.

I could knock up a Python script to perform “AI” on training data. I believe Alan Couzens has a load of sample Python on his website that scrapes TP data, which he uses to check fatigue, HRV, etc. With access to the API you could write some simple enough code to create sessions from templates based on the inputs from TP, but that’s adding little value, and it is all that most of these “AI” apps are doing. MySwimPro claims to be AI, but all it’s doing is adjusting pre-banked sessions with times and turnarounds based on your previous sessions and swim test sets.
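
Something along these lines is about the extent of it. The metric names, thresholds and session templates below are illustrative assumptions, not a real TrainingPeaks API response or any particular app’s rules:

```python
# Rough sketch of template-picking "AI": simple rules over a few recent load metrics.
# All names, thresholds and templates are invented for illustration.

TEMPLATES = {
    "recovery": "30 min easy spin, high cadence",
    "endurance": "90 min Z2 ride, steady",
    "intervals": "5 x 5 min @ FTP with 5 min recoveries",
}

def pick_session(metrics):
    """metrics: dict with 'tsb' (form), 'ramp_rate' and 'days_since_intensity'."""
    if metrics["tsb"] < -20 or metrics["ramp_rate"] > 8:
        return TEMPLATES["recovery"]      # deep fatigue or ramping load too quickly
    if metrics["days_since_intensity"] >= 3 and metrics["tsb"] > -10:
        return TEMPLATES["intervals"]     # fresh enough for quality work
    return TEMPLATES["endurance"]         # otherwise default to aerobic maintenance

print(pick_session({"tsb": -5, "ramp_rate": 4, "days_since_intensity": 4}))
# -> '5 x 5 min @ FTP with 5 min recoveries'
```

Rules like these will spit out a sensible-looking session most of the time, which is exactly why it sells, and exactly why it isn’t coaching.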

1 Like

Do not tell Chat GPT you’re Sarah Connor.

3 Likes

While I fully appreciate that a human coach can make the logic jump from “Sean failed his driving test” to how that could impact Sean’s dad’s training, the AI maybe doesn’t need to. Assuming that Sean’s dad is using AI coaching knowingly and willingly, they should give it inputs that will result in the most meaningful feedback. It is for Sean’s dad to make the leap and input “I can’t train on Thursday as I have to drive my son to football practice” or “Going out for a pizza and beer on Saturday to console my son”. The athlete has to make the initial assessment, which is why I see AI as a step up from a boilerplate training plan but a step down from a personal coach.

2 Likes

Sean’s mum, but hey ho…

But that’s not what it says…

In most cases the athlete cannot make the initial assessment; in those cases, where it is not an either/or, the coach can potentially better evaluate the concern and thereby adapt the intervention more effectively…

1 Like