What is Apple’s Door Detection feature, how does it work and how do you use it?



Apple has unveiled a range of new software features that work in tandem with the hardware capabilities of some of its high-end devices to help users with certain physical disabilities. These features include Door Detection on iPhone and iPad, Live Captions on iPhone, iPad and Mac, and more. Apple said these features will be available through software updates on its devices later this year.

The iPhone offers some of the most comprehensive accessibility feature sets in the industry for people with disabilities. Year after year, the tech giant has unveiled new software features that, combined with the hardware capabilities of some of its high-end devices, help users with certain physical disabilities.

People with low or no vision, in particular, are at the center of these accessibility developments. One such recent development is Door Detection, a new feature that informs blind or partially sighted people about the characteristics of a door and how to operate it.

What is door detection?
One of the biggest challenges faced by people who are blind or have low vision in a new environment is negotiating doors. Door Detection can help such users locate a door when arriving at a new destination, understand how far away it is, and describe its characteristics: whether it is open or closed and, if it is closed, whether it can be opened by pushing, turning a knob or pulling a handle. It can even read out signs and symbols on the door, such as a room number. All of this makes exploring unfamiliar territory much easier for a visually impaired person.
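Apple has not published the internals of Door Detection, but the sign-reading part of it resembles what developers can already build with the Vision framework's on-device text recognition. The sketch below is purely illustrative (the readDoorSigns function is an invented name, not Apple's API); it shows how text such as a room number could be pulled from a captured image of a door.

```swift
import Foundation
import CoreGraphics
import Vision

/// Recognizes any text (for example, a room number or a "Push"/"Pull" sign)
/// in a captured image of a door and hands the strings to a completion handler.
/// Illustrative only: this is not how Apple's Door Detection is implemented.
func readDoorSigns(in image: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the top candidate string for each piece of text found in the frame.
        let strings = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(strings)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```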

How does door detection work?
Apple’s Door Detection works using a range of cameras and sensors present in the latest generation of the more expensive iPhone models. Specifically, it uses the LiDAR (light detection and ranging) sensor to measure how far an object, in this case a door, is from the user. It also uses the cameras, in conjunction with the LiDAR sensor and the phone's on-device machine learning, to read and interpret a live scene.
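Door Detection itself is not a public API, but the distance measurement it depends on is the same kind of data ARKit already exposes to developers on LiDAR-equipped iPhones. The following sketch, which assumes an iPhone with a LiDAR scanner, samples the per-pixel depth (in meters) at the center of the camera frame to estimate how far away the object in front of the user is.

```swift
import ARKit
import CoreVideo

/// A minimal sketch of measuring distance with the LiDAR scanner via ARKit's
/// scene-depth feed. It samples the depth value at the center of each frame;
/// it only illustrates the kind of data (per-pixel depth in meters) the
/// hardware makes available, not Apple's Door Detection pipeline.
final class DepthSampler: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires a LiDAR-equipped device (recent Pro models).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        // Read the Float32 depth value (in meters) at the center pixel of the depth map.
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        let rowPointer = base.advanced(by: (height / 2) * rowBytes)
        let distance = rowPointer.assumingMemoryBound(to: Float32.self)[width / 2]
        print(String(format: "Object at center of frame is %.2f m away", distance))
    }
}
```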

How do I use the Door Detection feature?
While Door Detection will only arrive with a major software update later this year, the idea is for a visually impaired person to pull out a LiDAR-equipped iPhone and scan their immediate vicinity with an app or the camera itself. The device then reads the scene, analyzes the different elements in it, calculates where they are and how far they are from the user, and outputs audio cues that direct the user to the door. If the scan is successful, it can also tell users how to open the door, whether by pushing or pulling, and describe other attributes that make negotiating the door much easier. Note that for this to work, users will need to enable the relevant accessibility settings on their iPhones.
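To give a sense of the kind of spoken feedback described above, here is a hypothetical sketch that turns a detection result into an audio announcement using the system speech synthesizer. The DoorInfo type and announce function are invented for illustration; Apple's own feature delivers its feedback through the system's built-in accessibility tools rather than an app-level synthesizer.

```swift
import Foundation
import AVFoundation

/// Hypothetical shape of a detection result, used only to illustrate
/// the audio cues described in the article.
struct DoorInfo {
    let distanceInMeters: Float
    let isOpen: Bool
    let opensBy: String   // e.g. "pushing", "pulling the handle", "turning the knob"
}

let synthesizer = AVSpeechSynthesizer()

func announce(_ door: DoorInfo) {
    let state = door.isOpen ? "open" : "closed"
    let message = String(
        format: "Door %.1f meters ahead. It is %@ and opens by %@.",
        door.distanceInMeters, state, door.opensBy
    )
    let utterance = AVSpeechUtterance(string: message)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}

// Example: announce a closed door two meters away that opens by pushing.
announce(DoorInfo(distanceInMeters: 2.0, isOpen: false, opensBy: "pushing"))
```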

Apple will also release a number of other features intended to improve accessibility. For example, it will add new capabilities that help users with disabilities better control their Apple Watch from their iPhone. It will also add Live Captions, allowing people with hearing disabilities to follow audio-based content, such as phone calls or FaceTime calls, using real-time captioning.

All of these feature sets are currently being tested by Apple and will be available to general users after a major upcoming update.
