Smart Devices in Surveillance Activities
How consumer-grade wearables are being quietly transformed into tools for psychological insight and control
In the past decade, the explosion of smart devices (watches, chest straps, neurotrackers, and wearables of every kind) has reshaped how we think about health. They promised us empowerment: better sleep, more focus, personalized fitness, data-driven mindfulness. But as with all technology, the same tools that give can also take. What happens when the quantified self becomes a monitored self? When wellness data becomes evidence? When the devices meant to help us understand ourselves are turned toward understanding us, without consent?
This is not science fiction. It’s already happening.
From Biofeedback to Behavioral Surveillance
Most consumer wearables today offer some form of biofeedback: heart rate, galvanic skin response, temperature, EEG activity, and more. These signals, when interpreted with the right tools, can offer insights into not just health but mood, stress, cognitive load, and even intent.
Researchers have known this for decades. In clinical settings, devices like the Polar H10 chest strap or the Muse 2 EEG headset are used to monitor autonomic and neural responses. But what’s new is how easily these devices can be adapted for entirely different contexts, such as training simulations, behavioral analysis, and yes, interrogation support.
Consider the Human Performance Monitor (HPM), a project I was involved in: a situational-awareness system that aggregates real-time physiological data from consumer devices such as the eSense GSR sensor, the Polar H10 heart monitor, and the Muse 2 EEG headband. Originally designed to provide feedback on mental and emotional state during high-pressure tasks, it quickly proved its potential in broader, more sensitive domains.
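The HPM internals aren't published here, but the ingestion layer for this kind of system is genuinely mundane. As one illustrative piece (a sketch, not HPM's actual code), here is a parser for the standard Bluetooth GATT Heart Rate Measurement characteristic (0x2A37), which devices like the Polar H10 stream over BLE:

```python
def parse_hr_measurement(payload: bytes):
    """Parse a GATT Heart Rate Measurement (0x2A37) notification.

    Returns (heart_rate_bpm, rr_intervals_ms). Per the spec, the flags
    byte says whether HR is uint8 or uint16, whether an Energy Expended
    field is present, and whether RR intervals (in 1/1024-second units)
    follow.
    """
    flags = payload[0]
    if flags & 0x01:                 # HR value is uint16, little-endian
        hr = int.from_bytes(payload[1:3], "little")
        idx = 3
    else:                            # HR value is uint8
        hr = payload[1]
        idx = 2
    if flags & 0x08:                 # skip Energy Expended field (uint16)
        idx += 2
    rr_ms = []
    if flags & 0x10:                 # RR intervals present
        while idx + 1 < len(payload):
            raw = int.from_bytes(payload[idx:idx + 2], "little")
            rr_ms.append(raw * 1000.0 / 1024.0)
            idx += 2
    return hr, rr_ms
```

Feed each BLE notification payload into a function like this and you already have beat-by-beat heart rate plus RR intervals ready for downstream analysis. That is the whole trick: the hard part was standardized years ago.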
Biointelligence: The New OSINT?
Open-source intelligence (OSINT) has long relied on digital breadcrumbs: social media activity, location metadata, image analysis. But biosignals add a new layer: biological open-source intelligence, or BioOSINT (hey, I'm just making up this term, I swear). Imagine not just knowing where a person was, or what they said, but how they felt while saying it. Their cognitive strain. Their stress spike. Their momentary hesitation.
It’s not hard to connect the dots. A heart rate spike during questioning. A sudden GSR fluctuation when shown a photo. A drop in alpha waves when discussing a timeline. With proper calibration, biosignals become subtle tells in a behavioral poker game.
This doesn’t mean the data is perfect, or even self-explanatory. Interpretation remains tricky, context-dependent, and riddled with false positives. But the value isn’t in absolutes; it’s in probabilities. In edge cases. In supplemental signals that paint a fuller picture.
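To make those "subtle tells" concrete: a GSR fluctuation is, at its simplest, a sample rising well above its recent baseline. A minimal, hypothetical detector might look like this (the window size and z-threshold are illustrative, not calibrated values):

```python
from statistics import mean, stdev

def detect_gsr_spikes(samples, window=10, z_threshold=3.0):
    """Flag indices where skin conductance jumps well above its
    rolling baseline. A toy sketch: real pipelines would filter
    motion artifacts and calibrate thresholds per person."""
    spikes = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (samples[i] - mu) / sigma > z_threshold:
            spikes.append(i)
    return spikes
```

Timestamp those spikes against a transcript of the conversation and you have exactly the "fluctuation when shown a photo" scenario above, in about a dozen lines.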
A Shift in Intent
It’s crucial to understand that the shift isn’t just technological. It’s ideological. When developers begin prototyping systems that visualize stress, cognition, and mental readiness in real time, they step into domains once reserved for neuroscience labs and intelligence agencies.
You don’t need military-grade EEG rigs anymore. You need a laptop, a few BLE sensors, and an open-source stack.
For example, even in its early version, the HPM system already integrates:
- GSR (Galvanic Skin Response): A measure of skin conductivity correlated with sympathetic nervous system arousal (e.g., stress).
- HR/RR (Heart Rate and RR Intervals): Indicators of cardiovascular stress and autonomic regulation.
- EEG (Brainwaves across five bands): Raw and processed brain data indicating levels of attention, relaxation, and neural synchronization.
These aren’t just physiological measurements. They’re psychophysiological markers, gateways into the mind-body interface.
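One concrete example of how raw HR/RR data becomes such a marker: RMSSD, a standard time-domain heart-rate-variability metric, collapses a stream of RR intervals into a single number that roughly tracks autonomic regulation (higher values generally reflect vagal tone, lower values stress dominance). A minimal sketch:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive RR-interval differences, in ms.
    A standard time-domain HRV metric; interpretation is context-
    dependent, but trends are widely used as a stress proxy."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

A number like this is precisely the kind of psychophysiological marker described above: one arithmetic step away from the raw beats, yet it reads as a statement about a person's inner state.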
And here’s the rub: they’re collected with the user’s implicit consent, under the banner of productivity or wellness. But what if that consent is blurred, coerced, or simply misunderstood?
From Interrogation to Optimization
Let’s imagine a few settings:
- Corporate Interviews: Candidates hooked up to wearables during interviews, under the pretense of performance enhancement. In truth, their stress and honesty are being profiled.
- Remote Training Simulations: Soldiers, athletes, or emergency responders being monitored for stress resilience, attention decay, or emotional dysregulation.
- Intelligence Debriefings: Operatives questioned post-mission, with their EEG and GSR data silently recorded to detect omissions or inconsistencies.
These scenarios are not hypothetical. In fact, as someone who has built software systems that ingest, process, and visualize this type of real-time biosignal data, I can say with confidence: the infrastructure is trivial to assemble. The bottleneck is no longer the tech. It’s the ethics.
The Illusion of Consent
Every user agrees to terms when they pair a smart device. But no terms of service can prepare them for the data being redirected into inference engines. No checkbox anticipates emotional cartography. No privacy policy covers latent biometric profiling.
When you wear a GSR sensor in a meditation app, you think you’re optimizing yourself. But the raw data could just as easily be used to measure persuasion resistance. Or spot micro-reactions to subliminal stimuli.
The same goes for EEG. A Muse 2 user might believe they’re training focus, when in fact they’re providing real-time markers of their cognitive threshold, fatigue curve, or suggestibility.
The Power of Visualization
It’s not just about data collection. The real transformation comes when that data is made legible, when it’s visualized. A heartbeat is just a beat, until it’s charted over time, colored by state, sonified into alerts, and mapped to known patterns.
In the HPM prototype, physiological signals are not just plotted, they’re interpreted. Heart rate abnormalities trigger tones. RR intervals show coherence or fragmentation. EEG charts reveal dominant frequency bands. GSR lines climb during narrative tension.
This isn’t just signal monitoring. It’s emotional forensics.
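As a sketch of what "interpreted, not just plotted" means in practice, here is a toy mapping from a heart-rate reading to an alert state, the way a dashboard might color a trace or trigger a tone. The bands are illustrative placeholders, not clinical thresholds:

```python
def hr_alert_state(hr_bpm, resting_hr=60):
    """Map a heart-rate reading to a coarse alert state.
    The 0.8x / 1.5x bands are arbitrary illustrations; a real
    system would calibrate them per user and per context."""
    if hr_bpm < resting_hr * 0.8:
        return "low"
    if hr_bpm > resting_hr * 1.5:
        return "elevated"
    return "normal"
```

The moment a raw number passes through a function like this, it stops being a measurement and becomes a judgment: the system is now saying something about you, not just recording you.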
Smart Devices as Tactical Tools
Here’s the deeper truth: once we can measure these signals, we can modulate them. Biofeedback loops allow systems not just to observe but to nudge.
- A subtle change in music when stress rises.
- A shift in lighting when attention wanes.
- A prompt when alpha waves dip.
This is where performance monitoring crosses into performance shaping. It starts as training. It ends in influence.
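That closed loop can be caricatured in a few lines. Every signal name and threshold here is a hypothetical placeholder, but the structure is the point: measurement feeding directly back into the environment:

```python
def choose_nudge(stress, attention, alpha_power,
                 stress_max=0.7, attention_min=0.4, alpha_min=0.3):
    """Pick at most one environmental nudge per tick, mirroring the
    loop described above. Inputs are assumed normalized to 0..1;
    all names and thresholds are illustrative placeholders."""
    if stress > stress_max:
        return "soften_music"      # subtle change in music when stress rises
    if attention < attention_min:
        return "shift_lighting"    # shift in lighting when attention wanes
    if alpha_power < alpha_min:
        return "show_prompt"       # prompt when alpha waves dip
    return None                    # no intervention this tick
```

Run that every second against live biosignals and you have a system that doesn't just watch a person's state, it steers it.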
The Ethical Stack
Every layer of this system (the sensors, the streamers, the processors, the dashboard) is ethically neutral until it isn’t. The stack is agnostic. But the intention behind it changes everything.
You can use a Muse 2 to help someone manage ADHD. You can use it to assess political loyalty.
You can use a Polar H10 to monitor PTSD. You can use it to detect lies.
You can use GSR to train emotional regulation. You can use it to measure interrogation compliance.
The tools are the same. Only the story changes.
A Call for Bio-Situational Awareness
As we move into a world where smart devices mediate not just our health but our truthfulness, we must cultivate a new kind of literacy: a bio-situational awareness.
It means understanding:
- What your devices are really capturing.
- How that data can be interpreted.
- How visualization transforms signals into narratives.
- How narratives shape perception.
This is no longer about privacy. It’s about presence. It’s about how our biology becomes a surface of interaction: visible, traceable, and exploitable.
Toward an Informed Future
I don’t share this as a warning. I share it as someone who has built and shipped these systems. As someone who’s passionate about cognitive tech. As someone who sees the potential in this frontier, and the risk.
Used well, biointelligence can improve mental health, optimize performance, and build empathy.
Used poorly, it becomes the biometric panopticon.
We need technologists, designers, policymakers, and citizens to co-create the rules of engagement. To build ethical defaults. To design with care. To imagine not just what’s possible, but what’s responsible.
The Human Performance Monitor was born as a curiosity project. A mashup of open data, wearable sensors, and real-time charts. But it has revealed something deeper: that surveillance is no longer out there. It’s in here, in our heartbeat, in our skin, in our thoughts.
And we carry it willingly.
Oh, and we haven't even talked about AI integration yet.
Until we ask better questions 👋