In 2017, Apple introduced secure facial biometrics backed by hardware dedicated to that function. By then, Android already offered a software-only solution that was extremely insecure and could be circumvented with printed or digital photographs.
That idea inspired Google to launch the Pixel 4 this year, which features dedicated hardware to unlock the screen with the user's face. Thanks to this, the Pixel 4, like the iPhone 11, does away with fingerprint biometrics entirely.
But what are the differences between each mechanism? A YouTuber has made an extensive video about it, which you can watch below:
Registering a face
A veteran in this field, Apple offers a much simpler registration process. The user opens the menu and fits their face into a wide field of view, rotating their head once so that the sensor captures images with depth data and builds a 3D model stored on the device. The procedure is repeated a second time as a safeguard.
On the Pixel 4, the menu is very similar to Apple's, but the field of view is smaller and the Soli sensor is not yet as polished as Apple's sensors. The user can therefore take as long as a minute on this one-step setup, which displays small blocks wherever the sensor has not yet captured enough data.
Unlocking the screen
On the iPhone 11 and other devices with Face ID, the system is ready to recognize a face as soon as a tap on the screen (even while it is off) is detected or the phone is picked up. The screen then unlocks for the user; if the face is not recognized, the lock screen remains, and notifications can be viewed if so configured.
Face ID, however, has a curious limitation: on Apple's phones it only works with the device in portrait orientation, with a maximum tilt of 45 degrees. Last year's iPad Pro has no such limitation.
On the Pixel 4, everything begins when the user's hand, or even their face, approaches the device. The Soli sensor picks up motion in its vicinity and "warns" the face-unlock sensor to start working. As soon as the screen turns on, the user is taken straight to the home screen.
The Pixel 4 has no angle limitation for unlocking.
The eye problem
As we have already reported here on AllCellular, the Pixel 4 shipped with a dangerous feature enabled by default that, for now, cannot be configured otherwise: unlocking works even when the user's eyes are closed. A dead person's handset could thus be unlocked, or someone could gain access to the device while the owner sleeps simply by pointing the phone at their face.
Google will release a corrective update that will allow open eyes to be required.
Apple, with a long history of investing in security before shipping features, does not suffer from this problem: by default, the iPhone 11 requires user attention, i.e., eyes looking at the screen. This option can be disabled, however, making unlocking less secure.
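The difference between the two defaults can be summed up as a single extra check before unlocking. Here is a minimal conceptual sketch in Python; the `FaceScan` type, its fields, and the `should_unlock` function are hypothetical illustrations, not the actual logic of either phone:

```python
from dataclasses import dataclass


@dataclass
class FaceScan:
    """Hypothetical result of one face-unlock sensor read."""
    matches_owner: bool  # did the face match the enrolled template?
    eyes_open: bool      # were the eyes detected as open?


def should_unlock(scan: FaceScan, require_attention: bool) -> bool:
    """Unlock only if the face matches and, when attention is
    required, the eyes are open."""
    if not scan.matches_owner:
        return False
    if require_attention and not scan.eyes_open:
        return False
    return True


# Pixel 4 at launch: attention not required, so a sleeping
# owner's face still unlocks the phone.
print(should_unlock(FaceScan(True, False), require_attention=False))  # True

# iPhone 11 default: attention required, so closed eyes block it.
print(should_unlock(FaceScan(True, False), require_attention=True))   # False
```

Google's promised update effectively adds the `require_attention` toggle that Apple has offered since the start.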
Behind the scenes
Google proudly detailed how its face unlock was designed to be secure. As soon as the Pixel 4's sensors activate, an image is captured and sent over a dedicated bridge to Titan, the chip responsible for storing the user's facial biometric data. If the data match, Titan frees the device for use.
On the iPhone 11, Face ID does more complex work: as a face approaches, the sensor captures several 2D images of it along with depth data. Apple's neural network then processes the images with the recorded depth information locally and decides whether the result matches not only the previously enrolled data but also the first image taken by the sensor at the moment of unlocking.
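In both systems, the core step is comparing freshly captured facial data against a stored template and accepting only a close enough match. A common way to sketch this idea is with feature vectors and a similarity threshold; the numbers, threshold, and function names below are purely illustrative and do not reflect Titan's or Face ID's actual algorithms:

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


def matches(enrolled: list[float], captured: list[float],
            threshold: float = 0.95) -> bool:
    """Accept the face only if the captured vector is close enough
    to the enrolled template."""
    return cosine_similarity(enrolled, captured) >= threshold


# Toy feature vectors: the same person with a bit of sensor noise,
# versus a completely different face.
enrolled = [0.12, 0.80, 0.55, 0.20]
same_person = [0.11, 0.82, 0.54, 0.21]
impostor = [0.90, 0.10, 0.05, 0.40]

print(matches(enrolled, same_person))  # True
print(matches(enrolled, impostor))     # False
```

The threshold is the security dial: set it too low and impostors slip through; too high and the owner gets rejected in bad lighting. Both companies keep this comparison inside dedicated hardware precisely so the template never leaves the chip.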
In the end, both companies offer a very sophisticated solution to protect user privacy. But we want to know: would you trust one more than the other? If so, why? Tell us in the comments!