Apple has unveiled an advanced artificial intelligence model that may change how we view health tracking through wearables. In collaboration with the University of Southern California, researchers at Apple have developed a new AI system called the Wearable Behaviour Model, or WBM, which focuses not on the raw sensor data commonly used in fitness devices, but rather on processed behavioural signals. These include step count, sleep duration, REM cycles, gait, and overall weekly activity patterns. The idea is to understand how people behave over time, rather than just measuring what their body is doing at a given moment.
The research is part of the broader Apple Heart and Movement Study and challenges the industry's current reliance on heart rate, blood oxygen, and other single-moment metrics as the best indicators of health. The team behind the study believes that although raw sensor data is useful, it often lacks the context needed to accurately reflect real-world health conditions. It can also be inconsistent due to technical glitches or user-specific variations. By contrast, behavioural data paints a fuller picture of an individual's lifestyle and health trajectory, offering more meaningful insight when processed correctly.
One major reason this kind of data has been avoided so far is its overwhelming size and noise. With millions of users generating gigabytes of activity data every day, most systems struggle to make sense of it. Apple tackled this by using structured, refined data sourced from over 162,000 Apple Watch users, amounting to 2.5 billion hours of wearable activity information. This clean, high-volume dataset became the training foundation for the new AI model.
Once trained, WBM used 27 types of behavioural indicators sorted into categories such as mobility, cardiovascular fitness, sleep, and general activity. The model was tested across 57 different health tasks, ranging from identifying chronic conditions like diabetes to tracking recovery from injuries or infections. In 39 of those tasks, the behavioural model outperformed baseline methods, demonstrating its value for evaluating long-term health outcomes.
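To make the idea concrete, here is a minimal sketch of how weekly behavioural indicators might be grouped into the categories described above. The feature names and groupings are hypothetical illustrations; the study's actual 27 indicators are not reproduced here.

```python
# Hypothetical sketch: grouping behavioural indicators into the categories
# mentioned in the study (mobility, cardiovascular fitness, sleep, activity).
# All feature names below are invented for illustration.

BEHAVIOURAL_FEATURES = {
    "mobility": ["step_count", "gait_speed"],
    "cardio_fitness": ["vo2_max_estimate", "flights_climbed"],
    "sleep": ["sleep_duration_h", "rem_fraction"],
    "activity": ["active_energy_kcal", "exercise_minutes"],
}

def weekly_feature_vector(daily_records: list[dict]) -> dict:
    """Average each behavioural indicator over a week of daily records,
    treating missing values as 0.0 for simplicity."""
    all_names = [f for group in BEHAVIOURAL_FEATURES.values() for f in group]
    return {
        name: sum(day.get(name, 0.0) for day in daily_records) / len(daily_records)
        for name in all_names
    }

# A week of (made-up) daily readings: step counts rising by 500 each day.
week = [{"step_count": 8000 + 500 * i, "sleep_duration_h": 7.0} for i in range(7)]
vec = weekly_feature_vector(week)
print(round(vec["step_count"]))  # mean daily steps across the week
```

The point of the aggregation is the one the article makes: the model reasons over behaviour summarised across time, not raw per-second sensor streams.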
To test the robustness of the model, the researchers compared it to a second model that only used heart rate data from sensors. While there was no clear winner between the two on their own, the real breakthrough came when both models were combined. This fusion of behavioural and traditional sensor data significantly improved the AI’s ability to predict and assess health conditions. It showed that combining different forms of information could lead to a much more accurate and practical healthcare tool.
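The study does not spell out its exact fusion mechanism here, but a standard way to combine two complementary models is to blend their predicted probabilities. A minimal, hypothetical sketch:

```python
# Hedged sketch of late fusion: a weighted average of the behavioural model's
# and the heart-rate (sensor) model's predicted probabilities for some health
# condition. This simple ensemble is one common technique; it is not
# necessarily the method used in the Apple/USC study.

def fuse_predictions(p_behavioural: float, p_sensor: float,
                     weight: float = 0.5) -> float:
    """Blend two models' probabilities; weight=0.5 treats them equally."""
    return weight * p_behavioural + (1 - weight) * p_sensor

# One model is fairly confident, the other less so; fusion balances both.
print(round(fuse_predictions(0.9, 0.6), 2))
```

Even this crude averaging captures the article's point: two imperfect views of a person's health can disagree individually yet be more reliable together.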
One of the biggest advantages of the behavioural model is its interpretability. Because it relies on data that people intuitively understand, such as how much they walk or how well they sleep, its insights can be more relatable and actionable for both users and healthcare providers. Relying on aggregated behavioural signals also makes the model less sensitive to momentary technical faults or sensor malfunctions.
However, the researchers noted limitations in their study. The dataset only represents Apple Watch users in the United States, which limits how broadly the findings generalise. Also, since accurate behavioural tracking still requires relatively expensive devices, this technology may not be accessible to everyone, which could limit the reach of preventive healthcare globally.
Despite these limitations, the new model marks a significant shift in how wearable devices might be used in the future. With AI capable of understanding patterns in how we live rather than just what our sensors pick up, the future of health tech could be both more insightful and more human.
For more updates on AI in wearables and the future of health tracking, follow Tech Moves on Instagram and Facebook.