Smartphones are equipped with many different sensors that allow us to interact with our real-world surroundings. However, our growing dependence on smartphones is creating considerable new threats, owing to this army of sensors being increasingly vulnerable to attack. Ever wondered what black magic powers apps such as Sky Maps, letting them know the precise orientation of your phone while you gaze at the heavens above? What about ride-hail apps such as Uber that generate stunningly granular reports on drivers' braking and acceleration patterns, or apps like Metal Detector that help you hunt ghosts?
In the next few minutes, you will be ushered into the world of the humble trio of inertial measurement unit (IMU) sensors, or motion sensors: accelerometers, gyroscopes and magnetometers. These unsung little workhorses lurk unnoticed on our phones' motherboards, furiously transducing and spitting out data from every sinew of motion and vibration around us, while quietly fuelling an emergent wave of artificial intelligence (AI) that will deeply influence and shape our lives in the coming decade.
Now, the question is: does your smartphone have these as well? If, like me, you own a reasonably new smartphone, chances are that dialling a secret USSD code will give you backdoor access to a screen like the one shown in Fig. 2.
Woah! You did not realise there was an entire menagerie of sensors prowling in your phone, did you? You are certainly not alone. According to a recent survey, a staggering 61 per cent of smartphone users had never even heard of accelerometers, and an even higher proportion (73 per cent) had never heard of gyroscopes.
Let me give you an idea of how incredibly sensitive and fast these sensors are. A typical IMU sensor has a linear acceleration sensitivity as fine as 0.00061 g/LSB (gravitational acceleration per least significant bit) and an angular rate sensitivity of 0.004375 degrees per second/LSB. It can be sampled at the rate of 6664 samples per second, although, in order to save battery, the mobile operating system limits the sampling rate to 416 samples per second, which is still impressively high. These sensors are so incredibly sensitive that they can be used as a reliable alternative to an electromyogram (EMG) for tremor frequency assessment in the care of patients diagnosed with Parkinson's disease, essential tremor, Holmes' tremor and even orthostatic tremor.
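To make the numbers above concrete, here is a minimal Python sketch of how raw register readings (in LSB counts) are converted to physical units using the example sensitivities quoted above; the function names and raw values are purely illustrative.

```python
# Illustrative conversion of raw IMU register readings (LSB counts)
# to physical units, using the sensitivities quoted in the text.
ACCEL_SENS_G_PER_LSB = 0.00061     # g per least significant bit
GYRO_SENS_DPS_PER_LSB = 0.004375   # degrees per second per LSB

def accel_g(raw_lsb: int) -> float:
    """Convert a raw accelerometer reading to units of g."""
    return raw_lsb * ACCEL_SENS_G_PER_LSB

def gyro_dps(raw_lsb: int) -> float:
    """Convert a raw gyroscope reading to degrees per second."""
    return raw_lsb * GYRO_SENS_DPS_PER_LSB

# A raw count of 1000 LSB corresponds to about 0.61 g of acceleration,
# or 4.375 degrees/second of rotation.
```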
These examples are part of a very exciting revolution happening in the personal healthcare space, where the emergence of the motion-sensor-laden smartphone as the centrepiece of the democratisation of healthcare hardware has yielded incredible results in domains such as the treatment of gait- and posture-related disorders, geriatric care and the treatment of neurodegenerative disorders (Fig. 3).
Besides healthcare, these motion sensors are driving incredible advances in other areas such as imaging, virtual reality (VR), augmented reality (AR), indoor positioning and password-free personal authentication.
However, this very sensitivity and power open the door for some heinous attacks, hitherto unachievable, to be carried out with much ease.
In the rest of this article, I survey the landscape of such attacks, categorise them and highlight the flagship examples in each category. The goal of this venture is not just to highlight these threats to the security community at large, but also to pique the interest of the sensor designers and manufacturers who will play a crucial role in overcoming them.
Most security attacks on sensors broadly fall into two categories: spoofing attacks and attacks with intent of malicious use. Spoofing attacks target modification of the sensor data being used by a legitimate application, either by cunningly injecting fake or synthetic data somewhere in the pipeline, or by introducing adversarial vibrations around the phone (by playing sounds of certain frequencies, for example). This dirties up the very data that the sensor is picking up.
Attacks with malicious intent involve nefariously eavesdropping on sensor data and inferring what the user might be typing or saying, without having to access the traditional, well-guarded sources of such information, such as the microphone.
Now, let’s dive deeper.
Spoofing attacks in the context of smartphone sensors are carried out in two ways: spoofing by injection and spoofing by transduction.
Spoofing by injection entails an attacker injecting spurious sensor data (synthetically crafted or otherwise) into a system in a way that is indistinguishable from real data, thereby poisoning the rest of the sensor data ingestion pipeline. A classic example is a GPS spoofing attack, where a software-defined radio is engineered to synthesise fake GPS signals that the targeted device ingests as if they were beamed from satellites.
Recently, a US$ 1 million grant was awarded to Clemson University researchers to counter GPS spoofing attacks on the network time protocol (NTP). Rooted or jail-broken devices are also targeted for spoofing-by-injection attacks, where synthetic gyroscopic, accelerometric and magnetometric (compass) data is injected as if it emanated from the physical sensors (think cheating in fitness apps or mobile games).
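As a toy illustration of how such synthetic data might be crafted, the sketch below generates an accelerometer-magnitude trace that crudely mimics walking as a roughly 2 Hz oscillation around 1 g, one peak per step. This is a deliberate simplification (the cadence, amplitude and signal shape are all assumptions made here for illustration); a real injection attack would also need a channel into the sensor pipeline, which is out of scope.

```python
import math

def fake_walk(steps: int, cadence_hz: float = 2.0, sample_rate: int = 100):
    """Synthesise an accelerometer-magnitude trace mimicking walking.

    The trace oscillates around 1 g (the phone at rest) with one
    peak per step. All parameter values are illustrative.
    """
    n = int(steps / cadence_hz * sample_rate)
    return [1.0 + 0.3 * math.sin(2 * math.pi * cadence_hz * i / sample_rate)
            for i in range(n)]

trace = fake_walk(steps=4)  # 2 seconds of "walking" at 100 Hz
```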
Spoofing by transduction attacks involve maliciously targeting the very physical environment being sensed by the sensor. The two motion sensors, the accelerometer and the gyroscope, can both be targeted acoustically by strategically playing certain sound frequencies that induce maximum sensing response (resonance) from the respective MEMS chips. These sound signals can emanate from a pre-calibrated sonic gun, a nearby laptop playing a YouTube video, or a silent partner-in-crime residing on the device itself: the vibrator motor.
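At its core, such an acoustic attack amounts to playing a pure tone at the target chip's resonant frequency. The sketch below generates the samples of such a tone; the resonant frequency is chip-specific and would have to be found by sweeping, so the value used here is purely hypothetical.

```python
import math

def tone(freq_hz: float, duration_s: float, sample_rate: int = 44100):
    """Generate samples of a pure sine tone at freq_hz.

    An attacker would sweep freq_hz until the target MEMS chip shows
    maximum response (resonance). The frequency below is hypothetical.
    """
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

samples = tone(19000.0, 0.5)  # hypothetical near-ultrasonic probe tone
```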
With regard to the vibrator motor specifically, one very interesting development is the latest Android API (level 26), which allows the vibrator motor to be driven by much more complex waveforms, with precise control over duration, waveform shape and duty cycle. This will pave the way for exciting enhancements in the richness of haptic interactions, but might also contribute to increasingly sophisticated security threats.
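Concretely, the API accepts a pair of arrays, segment durations in milliseconds and per-segment amplitudes (0 to 255, with 0 meaning off), via VibrationEffect.createWaveform(). The Python sketch below mirrors that representation and shows how the duty cycle falls out of it; the concrete values are illustrative, not taken from any real attack.

```python
# Mirror of the (timings, amplitudes) representation accepted by
# Android's API level 26 VibrationEffect.createWaveform().
# The values below are illustrative.
timings_ms = [20, 30, 20, 30]   # duration of each segment, milliseconds
amplitudes = [255, 0, 255, 0]   # 0-255 per segment; 0 means motor off

def duty_cycle(timings, amps):
    """Fraction of the waveform during which the motor is on."""
    on_ms = sum(t for t, a in zip(timings, amps) if a > 0)
    return on_ms / sum(timings)
```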
Malicious use of sensor data
This category of attacks entails the attacker munging sensor data to infer aspects of the person's, or even the device's, identity with malicious intent.
As a recent study revealed, a significant number of smartphone users had never even heard of the accelerometer MEMS chip present in their device, and were relatively far less concerned about the malicious use of these sensors than about overtly active biometric sensors such as the fingerprint reader, camera or microphone. This threat perception (or lack thereof) fits squarely into the colloquial notion of what eavesdropping actually entails. It opens the door to a plethora of rather sinister attacks that can be carried out with the user unwittingly allowing them by granting sensor data access permissions.
Let us survey some of the examples that are well known in the academic community.
Device fingerprinting. Most commercial smartphone sensor chips provide measurements that deviate slightly from ground-truth values in a way that is rather specific to that particular device. This is due to idiosyncratic birth defects, or hardware imperfections, etched into the silicon. These device-specific defects manifest as offsets and scaling factors in some measurements, or in the measurements these yield in response to pre-authored triggers controlled by the attacker.
By using the right algorithm, the attacker can very accurately fingerprint the device, allowing it to be tracked even when the user has taken standard precautions such as using the browser's incognito mode or deleting suspicious cookies. The sinister aspect of this attack is that it empowers the attacker to keep consolidating the data collected from the victim and track him or her across disparate locations and over time. The attack becomes ineffective only if the device is changed, or the sensor ages or wears out so dramatically that its fingerprint becomes obsolete. Studies have shown, however, that the fingerprints lasted over nine months of experimentation with more than 100 accelerometers.
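As a simplified illustration of how such offsets and scaling factors can be estimated, the sketch below uses z-axis accelerometer readings taken with the phone lying stationary face up (ideally +1 g) and face down (ideally -1 g). The fingerprinting schemes in the literature use much richer feature sets, so treat this as a toy model of the underlying idea.

```python
from statistics import mean

def z_axis_fingerprint(z_face_up, z_face_down):
    """Estimate the device-specific offset and gain of the accelerometer
    z-axis from two stationary poses.

    An ideal sensor reads exactly +1 g face up and -1 g face down;
    with the model measured = gain * true + offset, the two poses give
    offset = (up + down) / 2 and gain = (up - down) / 2.
    """
    up, down = mean(z_face_up), mean(z_face_down)
    offset = (up + down) / 2     # shifts both readings equally
    gain = (up - down) / 2       # scales the +/-1 g span
    return offset, gain
```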
Acoustic (voice) data collection without using the microphone
Two factors facilitate this attack. First, both the accelerometer and the gyroscope are extremely sensitive to any mechanical vibration, including human speech. Second, both iOS and Android allow sampling rates high enough that a non-trivial part of the human speech spectrum is covered.
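The second factor is just the Nyquist criterion: a sensor sampled at rate f can capture vibrations only up to f/2. A quick back-of-the-envelope check, using the 416-samples-per-second OS cap mentioned earlier and the commonly cited 85-255 Hz range for the fundamental frequency of adult speech:

```python
def nyquist_hz(sample_rate_hz: float) -> float:
    """Highest frequency recoverable without aliasing."""
    return sample_rate_hz / 2

# At the OS-capped 416 samples/second, the sensor covers vibrations
# up to 208 Hz, overlapping much of the roughly 85-255 Hz fundamental
# frequency range of adult human speech.
covered_up_to = nyquist_hz(416)
```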
Projects such as Gyrophone and AccelWord demonstrate how gyroscopic and accelerometric vibration data can be used to achieve efficient speaker information retrieval or hot-word detection. What makes this truly worse is that sensor data can be extracted via browsers as well, albeit at a lower sampling rate.
On a concluding note, I would like to emphasise that this article is not meant to spread fear among smartphone users with regard to IMU motion sensors, but merely to raise awareness about the sinister threats posed by the misuse of these sensors. These sensors will be at the heart of some exciting advancements spanning domains such as healthcare, authentication, entertainment and videography.
Vinay Uday Prabhu is principal machine learning scientist at UnifyID Inc., a security startup based in San Francisco, California. His current research lies at the intersection of deep learning, security and smartphone sensors. Prabhu holds a PhD in electrical and computer engineering from Carnegie Mellon University.