Algorithmic suspicion — when sleep data makes us doubt the people we love
In a world where sleep tech turns our most intimate moments into data-driven detective stories, one partner’s midnight heart rate spike becomes fuel for suspicion – a reminder that when love and loyalty meet algorithms, privacy takes a backseat to paranoia.
It starts quietly. An eerie glow in the dark. One half of a couple, sleepless in bed, scrolling with bated fingers through the other’s sleep report from a few hours earlier. There it is: a charted spike in heart rate starting at 2:47am.
The sleepless partner’s imagination activates: “Why’s she up this late? She’s supposed to be on a business trip in another city. Surely she’d be exhausted. Then why’s her heart rate so high? Is she cheating? Maybe I’m overthinking this. But why would the heart rate keep racing? She’s travelling with that colleague I don’t like…”
Visceral dread wrenches the sleepless partner’s gut – all enabled by algorithms built to do exactly that: find patterns, fill in blanks and tell stories the heart is too afraid to ask.
Algorithmic suspicion
Welcome to the age of algorithmic suspicion, where intimacy is mediated by sleep-technology devices and betrayal is inferred from biometrics – the body’s signatures: heart rate, fingerprints, facial patterns – our biology translated into data. Algorithmic suspicion is what happens when intimacy borrows the logic of surveillance; when data becomes evidence and affection becomes inquiry.
For the uninitiated, “sleep tech” refers to the growing class of devices that monitor our sleep cycles with clinical precision – or so the marketing insists. Think smart watches, rings, headbands and even mattresses that track heart rate, body temperature, movement and breathing; all to rate how “efficiently” we rest. Behaviour-informed marketers promise better sleep (and thus better health) through data, turning rest into a performance metric and the bedroom into a quantified arena.
Who wouldn’t want better sleep, given our unyieldingly demanding lives?
But while sleep tech was, to be fair, initially designed to optimise rest and recovery, it has since drifted into the domain of domestic espionage. In most instances, spousal surveillance appears to be an unintended feature of these devices; in others, it is by design.
For instance, in 2016, a Spanish company unveiled a mattress embedded with motion sensors to detect “suspicious” activity while its owner was away. More recently, a proposed smart ring claimed to analyse skin temperature, heart rhythm and voice tension to alert one’s partner to potential infidelity. Nothing says romance like a push notification about your partner’s purported impropriety.
The delusion of data-driven certainty
The trouble isn’t that sleep-tech devices collect data, but that we find it almost irresistible to treat that data as gospel.
In our search for certainty, we read these charts as a kind of scripture – their lines interpreted with the fervour of faith, their errors forgiven as mystery. But, in reality, sleep-tech accuracy remains highly variable: most devices offer only estimates, not diagnostics, and their reliability depends as much on calibration and consistent use as on the sophistication of the sensor itself. In truth, sleep data reveals as much about the user as it does about the device.
Sleep tech sells certainty, but what it measures best is our appetite for reassurance. A partner’s racing pulse could signal passion, panic, caffeine, or even a dream. Why then does sleep data lead us to assume the worst? Because the human mind detests uncertainty.
A spike in heart rate at 2:47am isn’t evidence; it’s an empty space the imagination rushes to fill. Faced with ambiguity, anxiety demands closure.
Evolution has taught us to err on the side of danger rather than denial. The device’s authority gives that fear credibility – numbers feel incorruptible, so any irregularity must mean something. We measure, we compare, we suspect. So, when emotion supplies the story, data becomes the proof.
It’s not that the evidence points to infidelity; it’s that uncertainty feels unbearable. The mind prefers the cruel certainty of betrayal to the quiet ache of not knowing.
When privacy meets the pillow
But what of privacy? It’s undisputed that sleep data – often biometric – counts as personal information. And laws like the Protection of Personal Information Act (Popia) treat biometric data as sacred, classified as “special personal information”.
However, Popia was written to restrain corporations and the government, not couples. Perhaps that was understandable at the time; few could have imagined that love itself would become a site of data collection. Accordingly, the Act excludes processing done for “purely personal or household activity”.
The South African Law Reform Commission, in formulating Popia, probably never envisaged a domestic surveillance state powered by private insecurity.
Scholars refer to this oversight as the “intimate threat”: a paradigm in which privacy violations stem not from hackers or corporations, but from partners who already hold the passwords, devices and emotional leverage necessary for access. These are not breaches of system security, but of trust. The law, preoccupied with external attackers, has yet to account for the domestic realm as a site of digital harm.
As a result, when partners weaponise shared data or sleep-tracking information against one another, victims often find themselves outside the reach of data protection authorities, reliant instead on fragmented common law remedies or criminal statutes never designed for the subtleties of love, consent and surveillance.
It follows that we are in new legal and moral territory, raising questions such as:
When the violator and the violated share a bed, what is privacy?
When consent is given out of fear, is it consent at all?
And when one partner says, “If you’ve got nothing to hide, sync the app”, what kind of love demands surveillance?
The new frontier
In exploring the use of sleep tech, we often assume that knowing a partner’s metrics is inevitable. It’s not. There is a quiet case for not knowing – or not being entitled to know – especially amid the power asymmetries of modern love. Perhaps we need a right to privacy that survives inside love, and, paradoxically, becomes a measure of it.
To protect intimacy, we may first have to defend its silences.
This may be privacy’s new frontier: freedom from biometric measurement in private – the right to rest unmeasured, to remain uncharted in a world determined to chart everything.
In the faint blue light of the quantified age, that might be the last kind of darkness we’re allowed. DM