Tap, Tap – Android 11/iOS 14 Back Tap Gestures for any Android phone!
Following the release of the first Android 11 Developer Preview back in February, we learned that Google was working on a new set of gestures code-named "Columbus." This feature lets you double tap the back of your Pixel phone to perform actions like launching the Google Assistant, launching the Google Camera app, controlling media playback, and more. In Android 11 Developer Preview 2, Google continued work on these gestures, adding new actions for taking a screenshot and opening the recent apps overview. However, these gestures remained hidden from Pixel users and were removed entirely in the subsequent Android 11 Beta releases. Thankfully, developer Kieron Quinn, also known as Quinny899 on our forums, managed to port the feature so it works on basically any Android device.
His new app, called Tap, Tap, brings the double back tap gesture to any ARMv8 device running Android 7.0 Nougat or higher. In the demo video embedded above, I double tapped the back of my Pixel 4 to launch the camera app, while in a second video, developer Kieron Quinn launches the OnePlus Camera app by double tapping the back of his OnePlus 7T Pro. Launching the camera app isn't all that Tap, Tap can do, though. Using an Accessibility Service, Tap, Tap recognizes when you tap the back of your Android phone and then performs certain actions, such as simulating a home, recent apps, or back button press.
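I haven't dug through Tap, Tap's source here, but the "simulate a button press" part maps onto Android's public AccessibilityService API, which exposes global actions for home, back, and recents. Here's a minimal sketch of that piece; the class name BackTapService and the onDoubleBackTap hook are hypothetical stand-ins for wherever the app's gesture detector reports a hit:

```kotlin
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityEvent

// Hypothetical sketch: Tap, Tap's real service is more involved, but the
// navigation simulation itself boils down to performGlobalAction() calls.
class BackTapService : AccessibilityService() {

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {
        // No-op: this service dispatches actions rather than consuming events.
    }

    override fun onInterrupt() {
        // Required override; nothing to interrupt in this sketch.
    }

    // Assumed hook, called by the gesture detector (not shown) once the
    // model reports a double tap on the back of the device.
    fun onDoubleBackTap(action: String) {
        when (action) {
            "home" -> performGlobalAction(GLOBAL_ACTION_HOME)
            "recents" -> performGlobalAction(GLOBAL_ACTION_RECENTS)
            "back" -> performGlobalAction(GLOBAL_ACTION_BACK)
        }
    }
}
```

As with any accessibility service, it would also have to be declared in the app's manifest with the BIND_ACCESSIBILITY_SERVICE permission and enabled manually by the user in Settings.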
Tap, Tap uses the same machine learning models that Google trained to recognize double taps on the back of the Pixel 3 XL, Pixel 4, and Pixel 4 XL. That means it works best on one of those three phones or on a device with similar dimensions and build. Thus, your mileage may vary in how well Tap, Tap recognizes double taps (especially if you have a thick case on), but I've managed to get this working on the ASUS ROG Phone 3 and Huawei P40 Pro. No special hardware or software version is needed for the app to work since all it's doing is reading changes in the device's accelerometer and gyroscope sensors. Google trained the machine learning models to recognize the accelerometer and gyroscope readings produced when you tap the rear of the device, while high-pass and low-pass filters further refine the sensitivity. Theoretically, then, this feature, or one just like it, will work well on any device that has an ML model trained for it, which is likely how it works on Apple devices running iOS 14 and how it'll work when Xiaomi rolls it out for some devices in MIUI 12.
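For the curious, the raw input side is just the standard SensorManager API. The sketch below shows the general shape: register for accelerometer and gyroscope updates, then low-pass filter out gravity so the high-pass residue, the sharp spike a back tap produces, can be handed to a classifier. The alpha value and filtering math are illustrative assumptions, not Tap, Tap's actual parameters:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Illustrative only: Tap, Tap feeds readings like these into Google's
// trained models; the filter below just demonstrates the general idea.
class TapSensorListener(context: Context) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val gravity = FloatArray(3)

    fun start() {
        // Taps are short, sharp events, so sample as fast as the hardware allows.
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_FASTEST)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_FASTEST)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent?) {
        event ?: return
        if (event.sensor.type == Sensor.TYPE_ACCELEROMETER) {
            val alpha = 0.8f // assumed smoothing factor, not Tap, Tap's value
            val highPass = FloatArray(3)
            for (i in 0..2) {
                // Low-pass filter: track the slow-moving gravity component.
                gravity[i] = alpha * gravity[i] + (1 - alpha) * event.values[i]
                // High-pass residue: the sharp spike a back tap produces.
                highPass[i] = event.values[i] - gravity[i]
            }
            // A real implementation would buffer these samples and hand
            // them to the ML classifier here.
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {
        // Not needed for this sketch.
    }
}
```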