How does the Pixel series of Google smartphones accurately detect the movement of the finger touching the display?

Most smartphones use a touch display, which allows intuitive operation with a finger. Finger operations come in many varieties, however, such as taps, double taps, long presses, pinch-ins, pinch-outs, drags, and flicks, and the smartphone has to determine correctly which operation was performed. An update released in March 2020 improved the accuracy of gesture recognition on Google's Pixel smartphones, and a researcher on Google's Android UX team has explained what kind of system was added.

Google AI Blog: Sensing Force-Based Gestures on the Pixel 4
https://ai.googleblog.com/2020/06/sensing-force-based-gestures-on-pixel-4.html

The capacitive touch sensors used in most smartphone displays consist of drive electrodes, sense electrodes, and a non-conductive dielectric such as glass sandwiched between them. Each drive and sense electrode is extremely small, and together a pair forms a tiny capacitor cell that holds a small charge. When a conductive finger comes close, part of that charge is drawn away and the capacitance drops slightly; detecting where this drop occurs is the principle by which a capacitive sensor locates a touch.
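To make that principle concrete, here is a minimal sketch (all numbers are made up for illustration, and this is not how a real touch controller is implemented) that locates a touch by finding the cell whose capacitance has dropped furthest from its untouched baseline:

```python
# Minimal illustration of capacitive touch localization: a finger draws charge
# away from nearby cells, so the cell with the largest drop from its baseline
# marks the touch location. Values are invented for the example.
import numpy as np

ROWS, COLS = 32, 15                        # Pixel 4 touch sensor grid (per the article)

baseline = np.full((ROWS, COLS), 100.0)    # hypothetical untouched readings
reading = baseline.copy()
reading[9:12, 6:9] -= 3.0                  # small drop on cells near the finger
reading[10, 7] -= 12.0                     # largest drop directly under the finger

drop = baseline - reading                  # per-cell capacitance decrease
row, col = np.unravel_index(np.argmax(drop), drop.shape)
print(f"touch detected at sensor cell ({row}, {col})")   # -> (10, 7)
```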

The cells of a capacitive touch sensor are packed tightly across the display, but they are still far coarser than the display's pixels. For example, the Pixel 4 has a display resolution of 2280 x 1080 pixels, but its touch sensor is only a 32 x 15 grid of cells.
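As a rough illustration of that resolution gap, the sketch below (using the Pixel 4 numbers above; the nearest-cell mapping is a simplifying assumption, since real controllers interpolate between cells) converts a display pixel coordinate into the sensor cell that covers it:

```python
# Each of the 32 x 15 sensor cells covers roughly 71 x 72 display pixels,
# which is why the raw sensor image is so much coarser than the screen.
DISPLAY_H, DISPLAY_W = 2280, 1080
SENSOR_ROWS, SENSOR_COLS = 32, 15

def pixel_to_cell(x: int, y: int) -> tuple[int, int]:
    """Return the (row, col) of the sensor cell covering display pixel (x, y)."""
    col = min(x * SENSOR_COLS // DISPLAY_W, SENSOR_COLS - 1)
    row = min(y * SENSOR_ROWS // DISPLAY_H, SENSOR_ROWS - 1)
    return row, col

print(pixel_to_cell(540, 1140))  # centre of the screen -> (16, 7)
```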

A GIF animation in the original post visualizes the raw readings from the touch sensor cells during, from left to right, a tap, a long press, and a flick. The fine movements differ between gestures, but the sensor's resolution is coarse enough that the various gestures can be difficult to tell apart correctly.

"Pressure at the time of touch" is attracting attention as a solution to the problem of "how to determine the correct gesture?". However, the hardware sensor required to sense the touch pressure is very expensive to design and install, and it is very difficult for humans to precisely control the touch pressure. Introduction to smartphones has been postponed.

Capacitive touch sensors do not respond to pressure itself, but they are tuned to be extremely sensitive to changes in finger distance within a few millimeters of the display. As a result, when a finger touches the display, the cells near the center of the contact area saturate, while the cells around the edge of the contact retain a high dynamic range.

When you press the display firmly, the soft tissue of your fingertip deforms and spreads out. How it deforms depends on the size and shape of the finger and its angle to the screen, and it also changes dynamically with finger movement, orientation, and force. In other words, by reading and analyzing how the fingertip spreads across the capacitive touch sensor, the user's finger pressure and motion can be sensed indirectly.
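One simple way to picture this indirect sensing (a hand-rolled illustration, not Google's algorithm) is to measure how far the capacitance "blob" spreads around its weighted center: a firmer press flattens the fingertip, activates more neighbouring cells, and so produces a larger spread:

```python
# Sketch of "spread as a pressure proxy": the signal-weighted standard
# deviation of the touched cells grows as the fingertip flattens out.
import numpy as np

def contact_spread(drop: np.ndarray, threshold: float = 1.0) -> float:
    """Signal-weighted spatial spread (in cells) of the capacitance drop."""
    rows, cols = np.nonzero(drop > threshold)
    if rows.size == 0:
        return 0.0
    weights = drop[rows, cols]
    cy = np.average(rows, weights=weights)
    cx = np.average(cols, weights=weights)
    var = np.average((rows - cy) ** 2 + (cols - cx) ** 2, weights=weights)
    return float(np.sqrt(var))

light = np.zeros((32, 15)); light[10:12, 7:9] = 8.0   # light tap: small patch
firm = np.zeros((32, 15)); firm[9:14, 5:11] = 8.0     # firm press: flattened patch
print(contact_spread(light), contact_spread(firm))    # ~0.71 vs ~2.22 cells
```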

Google therefore designed a machine learning algorithm that analyzes and classifies these dynamic changes. The gesture classification model combines a convolutional neural network (CNN), which focuses on the spatial features of the signals observed by the sensor, with a recurrent neural network (RNN), which focuses on their temporal characteristics.
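The post gives only an outline of the architecture, so the PyTorch sketch below is a toy reconstruction based on illustrative assumptions (the layer sizes, the choice of a GRU, and the exact gesture list are not from the published model): a small CNN extracts spatial features from each 32 x 15 capacitance frame, and a recurrent layer models how those features evolve over time before a final classifier predicts the gesture.

```python
# Toy CNN + RNN gesture classifier over sequences of 32 x 15 capacitance frames.
# A per-frame CNN extracts spatial features; a GRU models their evolution in time.
import torch
import torch.nn as nn

GESTURES = ["tap", "scroll", "drag", "long_press", "light_long_press"]

class GestureClassifier(nn.Module):
    def __init__(self, num_classes: int = len(GESTURES)):
        super().__init__()
        # Spatial feature extractor, applied to every sensor frame independently.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)), nn.Flatten(),   # -> 32 * 4 * 4 = 512 features
        )
        # Temporal model over the sequence of per-frame features.
        self.rnn = nn.GRU(input_size=512, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, num_classes)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 32, 15) capacitance images
        b, t, h, w = frames.shape
        feats = self.cnn(frames.reshape(b * t, 1, h, w)).reshape(b, t, -1)
        _, hidden = self.rnn(feats)          # hidden: (1, batch, 64)
        return self.head(hidden[-1])         # per-gesture logits

# Example: a batch of 2 touch sequences, 30 sensor frames each.
logits = GestureClassifier()(torch.randn(2, 30, 32, 15))
print(logits.shape)  # torch.Size([2, 5])
```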

The gesture classification model was trained on input data covering gestures such as long presses, light long presses, taps, scrolls, and drags. It was rolled out to the Pixel series with the second Pixel feature drop in March 2020, and gestures can now be identified accurately even with a capacitive sensor.

Google's Android UX team says, "By integrating machine learning algorithms with careful interaction design, we were able to provide a more expressive touch experience for Pixel users. We will continue researching and developing these capabilities to refine the Pixel touch experience and explore new touch interactions."
