Lessons from Owlet and Rhapsody on turning continuous data into real-time clinical impact
For a parent, the difference between reassurance and escalation can come down to a single moment: an alert, a change in pulse rate or oxygen level, or a signal that their baby needs them. Together, Owlet and Rhapsody bring this at-home monitoring data directly to the clinician, inside their electronic health record (EHR).
At this year’s ViVE event in Los Angeles, one theme came through clearly: healthcare doesn’t have an AI problem, it has a data problem. In a conference session featuring Owlet and Rhapsody, the conversation wasn’t about model accuracy or the latest advances in generative AI. It focused on something far more fundamental: what it actually takes to turn continuous health data into meaningful clinical action.
That distinction matters, because as AI becomes more embedded in care delivery, the real challenge is no longer generating insight; it’s operationalizing it.
The Shift from Monitoring to Intervention
Owlet’s work in infant monitoring offers a clear example of where the industry is heading. Using continuous, real-time data from connected devices, the goal is not simply to track vital signs, but to potentially enable earlier, more proactive intervention. AI predictive models are increasingly being used to identify patterns that may indicate risk before a clinical event occurs.
However, detection alone is not the endpoint; it is the starting point.
This reflects a broader shift in healthcare: from episodic care to continuous monitoring, and from reactive treatment to more predictive intervention. Yet getting there requires more than better models. It requires the ability to move real-time data both reliably and securely across devices, applications, and systems.
Why Insight Alone Isn’t Enough
One of the most important takeaways from the session is that insight, on its own, has limited value. A real-time, integrated model can help connect the dots. But if that information doesn't reach the right clinician, in the right system, at the right time, and in a format that can be acted on, it may not change outcomes.
This is where many AI initiatives fall short.
Healthcare data remains fragmented across devices, EHRs, and digital health applications. Even when organizations generate high-quality data, they often lack the infrastructure to integrate it into clinical workflows.
The result is a familiar pattern: promising signals that never translate into action.
“Healthcare doesn’t struggle to generate data, it struggles to use it,” said Michelle Blackmer, CMO of Rhapsody. “Bridging that gap is what determines whether AI actually works.”
What Owlet and Rhapsody are doing together offers a clear example of how that gap can be closed. Through their partnership, continuous pulse rate and oxygen level readings from Owlet's FDA-cleared BabySat® are securely integrated into clinical workflows via the EHR. Rather than existing in a separate application, infant monitoring data becomes part of the patient record, available to clinicians in the systems they already use.
This creates a standards-based, real-time data flow from home to hospital, reducing the need for custom integrations while ensuring data is accurate, timely, and actionable. Clinicians can access continuous insights alongside other clinical data, enabling earlier intervention without adding workflow burden.
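A standards-based flow like this typically means expressing each reading in a format the EHR can ingest, such as an HL7 FHIR resource. The sketch below is a minimal, hypothetical example of a single SpO2 reading represented as a FHIR R4 Observation, using the public LOINC code for pulse-oximetry oxygen saturation. It is illustrative only and does not reflect Owlet's or Rhapsody's actual payloads or integration design.

```python
import json
from datetime import datetime, timezone

def spo2_observation(patient_id: str, spo2_percent: float, when: datetime) -> dict:
    """Build a minimal FHIR R4 Observation for a pulse-oximetry SpO2 reading.

    Illustrative only: field choices follow the public FHIR R4 specification
    and LOINC, not any specific vendor's payload.
    """
    return {
        "resourceType": "Observation",
        "status": "final",
        "category": [{
            "coding": [{
                "system": "http://terminology.hl7.org/CodeSystem/observation-category",
                "code": "vital-signs",
            }]
        }],
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": "59408-5",  # Oxygen saturation by pulse oximetry
                "display": "Oxygen saturation in Arterial blood by Pulse oximetry",
            }]
        },
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": when.isoformat(),
        "valueQuantity": {
            "value": spo2_percent,
            "unit": "%",
            "system": "http://unitsofmeasure.org",
            "code": "%",
        },
    }

# Hypothetical patient identifier, for illustration only.
obs = spo2_observation("example-infant-001",
                       96.0,
                       datetime(2025, 2, 18, 3, 14, tzinfo=timezone.utc))
print(json.dumps(obs, indent=2))
```

Because the resource uses shared vocabularies (LOINC for the measurement, UCUM for the unit), a receiving EHR can file it alongside in-hospital vitals without a custom mapping for each device vendor.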
In practice, this is what it means to operationalize AI: not just generating signals, but embedding them into the environments where care decisions are made.
Interoperability as the Enabler of AI
The Owlet + Rhapsody session reinforced a critical point: interoperability is no longer a background concern. It is central to whether AI works at all. AI systems depend on access to complete, contextual data.
AI can generate insight anywhere, but it can only drive impact when it's connected to the systems where care actually happens. That means connecting device-generated data with clinical records and operational systems, and doing so in a way that supports real-time decision-making. Without that foundation, AI remains siloed, confined to individual applications rather than embedded across workflows.
Trust, Context, and Clinical Relevance
There’s another layer to this challenge: trust.
In a clinical environment, data isn’t useful unless it is trusted. That requires high data quality, clear provenance, and alignment with clinical context.
For connected device data in particular, this becomes even more important. Continuous monitoring generates large volumes of information, but not all of it is actionable. The role of AI, and the systems supporting it, is to filter, contextualize, and deliver the right signal at the right moment.
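One common way to separate signal from noise in continuous monitoring is to require that a reading stay out of range for a sustained window before alerting, which suppresses one-off dips such as motion artifacts. The sketch below illustrates that idea; the threshold and window values are arbitrary placeholders, since real alert criteria are set clinically and cleared through regulatory review.

```python
from collections import deque

def sustained_low_spo2(readings, threshold=90.0, window=5):
    """Return indices where SpO2 has stayed below `threshold` for `window`
    consecutive samples. Single-sample dips are ignored.

    Illustrative values only; not actual clinical alert criteria.
    """
    recent = deque(maxlen=window)  # sliding window of the latest samples
    alerts = []
    for i, value in enumerate(readings):
        recent.append(value)
        if len(recent) == window and all(v < threshold for v in recent):
            alerts.append(i)
    return alerts

# A brief dip is ignored; only a sustained run of low readings alerts.
dip = [97, 96, 88, 96, 97]
sustained = [97, 89, 88, 87, 88, 89, 96]
print(sustained_low_spo2(dip))        # → []
print(sustained_low_spo2(sustained))  # → [5]
```

Delivering that filtered signal with context (the trend, the device, the patient record) is what turns a stream of numbers into something a clinician can act on.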
That’s a technical challenge, but it’s also a design one.
What This Means for the Future of Digital Health
The conversation at ViVE made one thing clear: the future of AI in healthcare will not be defined by models alone. It will be defined by whether organizations can connect data across systems, embed intelligence into workflows, and enable action, not just insight.
The shift from data to detection is already underway. The question now is whether healthcare systems are prepared to take the next step: from detection to action.