Apple is researching how AirPods could use sensors like the ones in Face ID to read the user's lips and work out what the user wants, even when there isn't a spoken request.

The "Wearable skin vibration or silent skin gesture detector" patent proposes using what it calls self-mixing interferometry to recognize more nuanced gestures. Beyond full head movements like a nod or shake, much smaller ones, such as a smile or a whispered command, could be detected.

Deformations in the skin, or skin and muscle vibrations, could be spotted and interpreted by the interferometry sensor. The idea is that as a user speaks, the movement of the jaw and cheeks becomes detectable by a Vertical Cavity Surface Emitting Laser (VCSEL) housed in the frame of the device, with the VCSEL emitter and sensor working as a combination similar to the one used in Face ID.

Users could then select how the AirPods react to different skin and lip movements picked up by that combination of emitter and sensor.

In the case of AirPods that go inside the ear, rather than solely the over-ear AirPods Max, Apple also says that "the self-mixing interferometry sensor may direct the beam of light toward a location in an ear of the user." Changes in that light and its reflection back to the sensor would be the result of head movement, or of skin and muscle movement.

The patent itself is specifically about methods by which such movement could be detected, but beyond the specifics, there are two clear benefits.

One is that movement detection allows for what Apple calls silent commands. Currently, AirPods Pro support a silent nod or shake of the head to accept or reject phone calls, but they could be set to interpret a mouth movement as meaning "skip track."
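As a rough illustration of what that kind of user-configurable mapping could look like, here is a minimal Swift sketch of a gesture-to-action table. The gesture names, actions, and default bindings are assumptions made up for illustration, not anything from Apple's patent or its APIs.

```swift
// Hypothetical sketch of a user-configurable gesture-to-action table.
// Gesture and action names are illustrative, not Apple's implementation.

enum SilentGesture: Hashable {
    case nod, headShake, smile, jawOpen
}

enum EarbudAction {
    case acceptCall, rejectCall, skipTrack, playPause
}

struct GestureRouter {
    // Defaults mirror the behaviors the article describes; all remappable.
    private var bindings: [SilentGesture: EarbudAction] = [
        .nod: .acceptCall,
        .headShake: .rejectCall,
        .jawOpen: .skipTrack,
    ]

    mutating func bind(_ gesture: SilentGesture, to action: EarbudAction) {
        bindings[gesture] = action
    }

    func action(for gesture: SilentGesture) -> EarbudAction? {
        bindings[gesture]
    }
}

// Example: the user remaps a smile to play/pause.
var router = GestureRouter()
router.bind(.smile, to: .playPause)
if let action = router.action(for: .smile) {
    print("Smile triggers: \(action)")   // Smile triggers: playPause
}
```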
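As for the sensing itself: in self-mixing interferometry, light reflected off the skin re-enters the VCSEL cavity and modulates the laser's output power, and the dominant frequency of that modulation tracks the velocity of the vibrating surface. The Swift sketch below shows the rough shape of that signal processing under stated assumptions: a hypothetical photodiode trace, a naive DFT in place of whatever Apple would actually use, and an illustrative 940 nm wavelength. None of these names or numbers come from the patent.

```swift
import Foundation

// Find the dominant frequency in a photodiode trace via a naive DFT
// peak search. A real implementation would use an FFT; this is a sketch.
func dominantFrequency(of samples: [Double], sampleRate: Double) -> Double {
    let n = samples.count
    guard n >= 4 else { return 0 }
    let mean = samples.reduce(0, +) / Double(n)
    let centered = samples.map { $0 - mean }   // drop the DC component

    var bestBin = 1
    var bestMagnitude = 0.0
    // Only bins up to Nyquist carry unique information.
    for k in 1..<(n / 2) {
        var re = 0.0
        var im = 0.0
        for (t, x) in centered.enumerated() {
            let phase = -2.0 * Double.pi * Double(k) * Double(t) / Double(n)
            re += x * cos(phase)
            im += x * sin(phase)
        }
        let magnitude = re * re + im * im
        if magnitude > bestMagnitude {
            bestMagnitude = magnitude
            bestBin = k
        }
    }
    return Double(bestBin) * sampleRate / Double(n)
}

// Convert a Doppler beat frequency to surface velocity: v = f * λ / 2.
// 940 nm is a typical near-infrared VCSEL wavelength, assumed here.
func surfaceVelocity(beatFrequency: Double, wavelength: Double = 940e-9) -> Double {
    beatFrequency * wavelength / 2.0
}

// Example: a synthetic 2 kHz beat sampled at 16 kHz, corresponding to a
// skin-surface velocity of roughly 1 mm/s at the assumed wavelength.
let sampleRate = 16_000.0
let trace = (0..<512).map { sin(2.0 * Double.pi * 2000.0 * Double($0) / sampleRate) }
let beat = dominantFrequency(of: trace, sampleRate: sampleRate)
print("Beat frequency: \(beat) Hz")                            // 2000.0 Hz
print("Surface velocity: \(surfaceVelocity(beatFrequency: beat)) m/s")
```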