A new feature being tested in iOS 7 lets you control the iPhone interface with head movements
The feature, reported by 9to5 Mac, is an accessibility option that can help the disabled
There is no guarantee the option will be in the final version of iOS 7 when it comes out in the fall
Apple appears to be testing out motion control features that will let people control their iPhones by moving their heads.
The feature, first reported by 9to5 Mac, was discovered in the developer version of Apple’s latest mobile operating system, iOS 7, and is not yet publicly available. It is an option in the Accessibility menu that overrides the usual touchscreen controls.
It works by using the front-facing iPhone or iPad camera to detect head movements, according to the report. When enabled, the system automatically cycles through every option on the device screen until the user turns his or her head to one side to select the desired option.
The left and right head movements can be assigned to different tasks, including returning to the main home screen, displaying notifications or activating the voice assistant Siri.
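The scanning-and-selection behavior described above resembles a standard assistive technique called switch scanning. The sketch below is a hypothetical illustration of that idea, not Apple's actual implementation, which is not public; the option names and event values are invented for the example.

```python
# Hypothetical sketch of "switch scanning": the system cycles through
# on-screen options at a fixed interval, and a head turn acts as a
# switch press that selects whichever option is currently highlighted.

def scan_and_select(options, head_events):
    """Cycle the highlight through options; select on a head-turn event.

    options:     list of on-screen option labels.
    head_events: one input per scan tick, e.g. None, "left", or "right".
    Returns (selected_option, gesture), or None if no selection occurs.
    """
    index = 0
    for gesture in head_events:
        if gesture in ("left", "right"):
            return options[index], gesture
        index = (index + 1) % len(options)  # advance the highlight
    return None

# Example: the highlight starts on "Home"; after two empty ticks it
# rests on "Siri", and a left head turn selects it.
print(scan_and_select(["Home", "Notifications", "Siri"],
                      [None, None, "left"]))
```

Because the highlight advances on a timer rather than under direct user control, selection is slow but requires only a single reliable gesture, which is exactly what makes the approach workable for users with very limited mobility.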
The approach appears time-consuming, but navigating the phone interface with head movements could be genuinely useful for people with limited mobility or other disabilities that make touch or voice interfaces difficult to use.
The new version of the iOS mobile operating system is still in beta testing, a stage in which developers use it to build new apps while Apple fixes bugs and polishes the interface. Regular iPhone users will be able to upgrade to iOS 7 in the fall.
The head-movement feature is meant as an accessibility option, but it’s easy to imagine similar motion control features becoming more commonplace on devices in the future.
Motion detection technology that lets you control a computer interface has taken off in the past few years. The Microsoft Kinect and a soon-to-be-released device from Leap Motion use powerful depth sensor cameras. The Kinect can detect a person’s skeleton, face and even expressions. The Leap detects gestures such as finger movements.
In a pinch, regular smartphone cameras can handle some basic motion detection, but the technology is still in its early stages and needs more work before it is a practical tool.
Motion control options are already popping up in smartphones. The Samsung Galaxy S4 supports eye scrolling, which moves a page when the user looks up or down, and it can tell when the user's gaze has left the screen: look away from a video while it is playing, and the video automatically pauses. The phone also supports gestures, such as answering calls when you wave a hand over the device.