Galaxy S10 has an ultrasonic fingerprint scanner. Here’s why you should care

The Galaxy S10’s in-screen fingerprint scanner may look just like the one on the OnePlus 6T, but don’t be fooled. Samsung’s flagship Galaxy S10 and S10 Plus are the first phones to use Qualcomm’s ultrasonic in-screen fingerprint technology, which uses sound waves to read your print. 

A relative of the ultrasound used in a doctor’s office, this “3D Sonic Sensor” technology works by bouncing sound waves off your skin. It can capture your print through water, lotion and grease, at night or in bright daylight. Qualcomm also claims it’s faster and much more secure than the optical fingerprint sensors you’ve seen in phones before this. That’s because the ultrasonic reader takes a 3D capture of all the ridges and valleys that make up your skin, compared with the 2D image — basically a photo — that an optical reader captures using light rather than sound waves. 

The debate between ultrasonic and optical fingerprint scanners comes at a time when biometric security is on the ascent. In-screen fingerprint readers are a hot trend in phone design because they don’t take up any room on the phone face, and require less groping around than a sensor embedded on the phone’s power button or back casing. That design dovetails nicely with the move toward an all-screen face with barely visible bezels. 


Video: Why the Galaxy S10’s ultrasonic fingerprint reader matters (2:48)

“Security and biometrics have been integrating into mobile platforms at a rapid pace,” Alex Katouzian, Qualcomm senior vice president of mobile technology, said in December at Qualcomm’s yearly tech summit in Hawaii. “This is the future of fingerprint technology.”

The ultrasonic fingerprint sensor built into the screen layers replaces iris scanning as the biometric sensor of choice on the Galaxy S10 and S10 Plus in particular. (The Galaxy S10E has a traditional capacitive fingerprint reader on the power button.) Iris scanning has been a Galaxy staple since the Galaxy Note 7, so Samsung’s move away from it is a surprising about-face. Rumor has it that Google might fold 3D face scanning into the next version of Android, referred to as Android Q.

For an animated explanation of how ultrasonic fingerprint scanners work, be sure to check out the video in this story. Meanwhile, here’s what you need to know about the ultrasonic in-screen fingerprint reader on the Galaxy S10 and S10 Plus.

What’s an ‘ultrasonic’ fingerprint sensor?

Qualcomm’s 3D Sonic Sensor uses sound waves (that’s the “sonic” part) to “read” your fingerprint when you unlock your phone. The trend these days is to embed the sensor underneath the display, which is why it’s also called an in-screen fingerprint reader: you unlock the phone by putting your finger or thumb on a target in the center of the screen. This type of sensor could also live in a device’s home button. CNET saw a prototype of the ultrasonic sensor in 2015.

In this case, the ultrasonic sensor is integrated into one of the several layers that make up your phone’s display. When you put your finger on the target area, you’re touching the glass that tops your phone, not the sensor itself. But your skin does send out a tiny electrical pulse that activates the sensor and gets it to do its thing. The sensor has been snoozing; by touching the phone, you’re giving it a wake-up nudge.


The OnePlus 6T was the first phone sold by a US carrier to have an (optical) in-screen fingerprint reader.


James Martin/CNET

How exactly does the ultrasonic sensor work?

When your electrical signal hits the sensor, it emits sound waves that bounce back up to your skin. Your skin’s surface isn’t flat — your fingertip is a unique pattern of ridges and valleys, which is what makes fingerprinting a useful form of identification. The ultrasonic waves reflect back to the processor, which maps your fingerprint based on the pressure of the sound waves bouncing off your skin. Specifically, it reads different levels of voltage corresponding to your ridges and valleys.

In a simplified example, say that your ridges read as a 1 and your valleys as a 0. The ultrasonic sensor module can map out that data to form a detail-rich 3D image of your fingerprint. The sound waves can also detect your blood flow, so the sensor would reject a print from a severed finger, and Qualcomm says it can’t be fooled by fake fingers or synthetic skin.
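To make that idea concrete, here is a minimal, purely illustrative Python sketch of the concept: threshold per-pixel echo readings into a ridge-or-valley map, then compare a new scan against a stored template. The voltage values, grid size, cutoff and matching rule are all invented for this example; they are not Qualcomm’s actual sensor output or algorithm.

```python
# Hypothetical sketch: turning ultrasonic echo readings into a ridge/valley map.
# All values and the matching rule are invented for illustration only.

RIDGE_THRESHOLD = 0.5  # assumed cutoff between a "ridge" echo and a "valley" echo

def to_ridge_map(voltages):
    """Map each per-pixel echo voltage to 1 (ridge) or 0 (valley)."""
    return [[1 if v >= RIDGE_THRESHOLD else 0 for v in row] for row in voltages]

def match_score(candidate, enrolled):
    """Fraction of pixels that agree between a new scan and the enrolled template."""
    total = sum(len(row) for row in enrolled)
    agree = sum(
        1
        for cand_row, enr_row in zip(candidate, enrolled)
        for a, b in zip(cand_row, enr_row)
        if a == b
    )
    return agree / total

# Toy 4x4 "scan": higher voltage where sound bounced off a ridge.
scan = [
    [0.9, 0.8, 0.2, 0.1],
    [0.7, 0.9, 0.3, 0.2],
    [0.2, 0.3, 0.8, 0.9],
    [0.1, 0.2, 0.9, 0.7],
]
enrolled_template = to_ridge_map(scan)  # pretend this was stored at enrollment
print(match_score(to_ridge_map(scan), enrolled_template))  # 1.0 for an identical scan
```

A real matcher works on far denser data and far more sophisticated features, but the basic flow from echo strength to ridge map to comparison is the idea described above.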

Ultrasonic versus optical fingerprint readers: How are they different?

An optical fingerprint reader — like the one powered by component-maker Synaptics that we first saw in Vivo phones — bounces light up to your finger and back down to the sensor, which interprets the reading as a 2D image. It essentially takes a photo of your finger to determine the pattern of ridges and valleys. But experts say this approach is easier to fool with a photo, a fingerprint transfer (for example, if someone lifted your print) or a prosthetic fingerprint.
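As a rough illustration of why a flat 2D capture is easier to spoof, here is a toy Python sketch (not any real sensor’s pipeline): a matcher that only compares 2D intensity patterns has no depth or liveness information, so a printed copy of the enrolled pattern scores just as well as the real finger.

```python
# Illustrative sketch only: why a purely 2D optical match is easier to spoof.
# The "images" and the matcher below are toy stand-ins, not a real sensor pipeline.

def pixel_agreement(img_a, img_b):
    """Share of pixels that match between two same-sized 2D grayscale patterns."""
    total = sum(len(row) for row in img_a)
    same = sum(
        1
        for row_a, row_b in zip(img_a, img_b)
        for a, b in zip(row_a, row_b)
        if a == b
    )
    return same / total

enrolled_photo = [      # 2D pattern captured at enrollment (toy values)
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
]
printed_copy = [row[:] for row in enrolled_photo]  # a flat printout of the same pattern

# With only 2D intensity to go on, the copy scores as well as the real finger;
# there is no depth or blood-flow signal to tell them apart.
print(pixel_agreement(printed_copy, enrolled_photo))  # 1.0
```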

Qualcomm claims that its ultrasonic fingerprint sensor is powerful enough to get a reading 4 millimeters deep when it scans your print. That’s pore level. We haven’t tried the Galaxy S10’s ultrasonic fingerprint reader yet, so we can’t say for sure if that claim holds up. But it’s a sure bet everyone in the security community will test it to the limit.


Goodbye, capacitive fingerprint reader. Hello, ultrasonic.


Angela Lang/CNET

How is ultrasonic different from a physical fingerprint reader?

When you put your finger on a smooth reader on a phone’s back, side or home button, your fingerprint is captured by a capacitive fingerprint sensor. Yes, “capacitive” is the same term that applies to your phone’s touchscreen, where your conductive skin disturbs the screen’s electrical field so the display can sense exactly where you’re touching.

It’s the same idea here. Capacitors throughout the scanner detect the placement of your finger’s ridges by measuring electrical charge, then match the pattern of those charges against your registered fingerprints.
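Here’s a loose Python sketch of that idea, with invented charge values, thresholds and templates rather than any phone’s real matching code: threshold the measured charges into a ridge pattern, then compare it against each registered finger and unlock only if the best match clears a cutoff.

```python
# Rough sketch of the idea behind capacitive matching: higher measured charge
# where a ridge touches the sensor, lower over a valley. All values, thresholds
# and templates here are invented for illustration.

CHARGE_CUTOFF = 0.5   # assumed ridge/valley cutoff
ACCEPT_SCORE = 0.95   # assumed minimum agreement needed to unlock

def charges_to_pattern(charges):
    """Convert per-capacitor charge readings into a 1/0 ridge pattern."""
    return [[1 if c >= CHARGE_CUTOFF else 0 for c in row] for row in charges]

def best_match(pattern, registered):
    """Compare a scan against every registered finger; return the best score and name."""
    def score(a, b):
        total = sum(len(row) for row in b)
        return sum(x == y for ra, rb in zip(a, b) for x, y in zip(ra, rb)) / total
    return max((score(pattern, tmpl), name) for name, tmpl in registered.items())

registered_fingers = {
    "right thumb": [[1, 0, 1], [0, 1, 0], [1, 0, 1]],
    "right index": [[1, 1, 0], [0, 0, 1], [1, 1, 0]],
}
scan = charges_to_pattern([[0.8, 0.2, 0.9], [0.1, 0.7, 0.3], [0.9, 0.2, 0.8]])
score, finger = best_match(scan, registered_fingers)
print(f"{finger}: {score:.2f}", "unlock" if score >= ACCEPT_SCORE else "reject")
```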


What’s the advantage of using this over Face ID and other face unlocking features?

Apple’s Face ID uses a depth map made of 30,000 infrared dots to map your facial contours. Samsung used to use competing technology to scan your irises, but has removed that from the new phones. The Galaxy S10 phones use a third method that’s baked into Google’s Android, which essentially takes a photo of your face. Face ID and iris scanning are considered secure enough to use for mobile payments, but the facial-recognition option in Android Pie won’t support mobile payments. It’s there for convenience, but won’t give you strong protection. 

It’s possible that Samsung ran into difficulties getting a front-facing 3D face scanner to work in time for the Galaxy S10 launch. It’s also possible that Samsung is tying its fate to whatever Google supports, rather than spending on its own solution when Google’s might be just a few months away. It wouldn’t be completely out of bounds to think that Samsung wanted to save a Face ID-style option for an August release of the Galaxy Note 10.

Whatever the plan, whatever the motivation, it’s likely that the ultrasonic in-screen fingerprint reader will stick around for several generations — especially if it proves to be as fast, convenient and secure as Qualcomm says.

Published: Dec. 4, 2018. Updated: Feb. 21, 2019 at 10:01 a.m. PT.
