
Your smartwatch buzzes at 2pm: “You seem stressed based on your heart rate. Try this 3-minute breathing exercise?” Meanwhile, your phone suggests a personalised workout based on last night’s sleep quality. Sound familiar?
We’re living in an age where artificial intelligence isn’t just transforming our phones and cars; it’s becoming our health companion. But as these AI tools grow more sophisticated and widespread, we need to ask some hard questions. Are they actually helping us, or are we creating new problems we haven’t thought about yet?
What Exactly Are AI Health Coaches?
Think of AI health coaches as that friend who’s always tracking their steps, but infinitely more patient and available 24/7. These systems analyse everything from your fitness tracker data to what you log in apps. They then use machine learning to give you personalised advice on diet, exercise, sleep, and lifestyle habits.
Take Numan, a UK-based digital health platform that launched an AI assistant in early 2025. Users can have actual conversations about managing weight and chronic conditions. It’s like texting a health professional who never gets tired of your questions.
The appeal is obvious: instant access without the £40–180 per session that traditional therapy costs in the UK, or the months-long wait for a GP appointment that many of us know all too well.
📌 Related Reading:
Advice on Better Dieting – Why Most Diets Fail and What Actually Worked – Everyday Mastery
AI Therapists: A Double-Edged Sword
Mental health support through AI is perhaps the most hotly debated application. Apps like Wysa (which the NHS has trialled for patients on waiting lists) offer cognitive behavioural therapy techniques, mood tracking, and crisis support. While traditional therapy can cost hundreds per session, AI support is often under £20 per month or even free.
Stanford Health’s 2025 research found these tools can genuinely help with anxiety and depression symptoms, especially as early interventions. But here’s where it gets tricky.
📌 Related Reading:
Gut Health & Depression: The Dangerous Truth You Need to Know
The “Creepy Factor” Is Real
Let’s be honest: many people find AI health advice unsettling. There’s something strange about pouring your heart out to a bot, however clever it is. If you’re dealing with depression, anxiety or trauma, would you trust an algorithm to read your feelings?
Some users say they feel judged or misunderstood by AI therapists, especially when replies seem scripted or miss the emotional cues a human would notice. One person said it was “like talking to a smart but emotionally tone-deaf friend.”
When AI Health Gets It Wrong: The Stakes Are High
Here’s a scenario that keeps regulators awake at night: what happens when an AI therapist misses the warning signs of a serious mental health crisis? Or gives advice that doesn’t fit someone’s background, or could even be unsafe for their health?
Recent lawsuits, including a 2025 case where parents alleged ChatGPT contributed to their teen’s suicide, highlight these very real risks.
Traditional therapy in the UK costs £40–180 per session, but human therapists carry professional insurance and must follow strict rules. When an AI system makes a mistake, who’s liable? The app company? The user? It’s murky legal territory that’s still being worked out.
The Integration Challenge: Does AI Play Well with the NHS?
The NHS is slowly adopting AI, but integrating it with existing systems isn’t easy. Many rollouts have been delayed by 4–10 months, and by June 2025 only 43 of 66 trusts were using AI tools in practice.
The reality is that many GP practices are still using outdated computer systems that struggle to connect with modern AI health apps. So while your AI coach might suggest you discuss your sleep patterns with your doctor, there’s no smooth way for that information to reach your medical records.
Some successes are emerging, though. AI-powered scheduling has increased NHS theatre efficiency from 73% to 86%, cutting waiting lists by 28%. But these are behind-the-scenes improvements, not the kind of personal health coaching we’re discussing.
The Cost Reality Check of AI Health
Let’s talk money. Popular AI therapy apps like Wysa cost around £75 annually or £150 for lifetime access, with free versions offering basic chatbot support. Compare this to private therapy at £40–180 per session, and the money-saving appeal is clear.
But here’s what the marketing often skips: many people use AI tools alongside human care, not instead of it. That means you could end up paying for both: your AI coach subscription now, and a professional consultation later when things get more complex.
In the UK, personal trainers usually charge £30–60 per session. An AI fitness app might cost £10–20 a month, but it can’t check your form, adjust equipment, or give you the motivation that comes from real human support. You could also try free AI tools like ChatGPT or Claude, though they still have limits.
Age and Cultural Divides
Reactions to AI health tools differ a lot by age. Younger people (18–35) often welcome AI coaches for fitness and mental health support. But older adults, who may need more health help, often find these tools confusing or too impersonal.
Culture matters too. AI systems trained mostly on Western data might give advice that doesn’t fit people from other backgrounds. For example, diet tips that ignore religious food rules, or exercise plans that overlook traditions around gender and activity.
The Loneliness Paradox
Here’s perhaps the most troubling question: At a time when loneliness is considered a public health crisis, are AI health tools helping or making things worse?
Human connection has clear healing benefits. When we swap it for AI chats, we may fix short-term issues but create bigger ones later. Yes, AI therapy is easy to access and low cost, but it can also feel lonely. It suggests that human care and understanding are extras, not essential parts of recovery.
Some users report becoming overly dependent on AI validation, checking in multiple times daily for reassurance that a human friend or therapist might provide more meaningfully once a week. I have friends who use AI chat simply as a companion to hang out with.
Unanswered Questions About Tech Exposure
Another issue that’s rarely discussed is the impact of constant Bluetooth and Wi-Fi exposure from wearables and smart devices. These signals are considered safe by regulators, but the truth is we don’t yet have long-term data on what decades of daily use might mean for the body. While the risks may be small, it’s an area where science is still catching up, and many people prefer to take a cautious approach.
What’s Next For AI Health?
The UK government’s 10-year health plan sees AI as central to improving the NHS. We’ll likely see smarter AI health tools, better links to medical records, and clearer rules on safety and responsibility.
But the biggest progress may come from mixed approaches: AI handling daily check-ins and simple advice, while human professionals focus on harder cases and emotional support.
👉 Recent studies show that bringing AI into healthcare has been harder than leaders expected. This means we need realistic expectations about what these tools can and can’t do.
The Bottom Line
AI health coaches, therapists, and personal trainers offer genuine benefits: accessibility, affordability, and 24/7 availability. They’re particularly valuable for people who might otherwise have no support at all.
But they’re not magic solutions. They come with privacy concerns, the potential for misuse, and the risk of replacing human connection when we need it most. The key is being realistic about their limitations while making the most of their strengths.
The real future is likely a partnership: AI for daily tracking and reminders, humans for deeper care and understanding. The challenge will be making sure these tools improve care without replacing what makes healthcare human.
Would you trust an AI with your health journey? Maybe the better question is: How can we use these tools wisely while preserving what makes healthcare fundamentally human? How much do you use AI for your health at the moment?
Sources
- Generative AI–Enabled Therapy Support Tool for Improved Treatment Adherence and Outcomes: an observational study of 244 patients in NHS group-based CBT programmes. The group using the AI support tool had higher attendance, fewer dropouts, and better improvement and recovery rates than the control group using standard CBT materials.
- Systematic Review Exploring Human, AI, and Hybrid Health Interventions: this review found that AI coaching and hybrid models generally report positive improvements in psychological wellbeing, including reductions in depressive symptoms.