Apple previewed new software features for cognitive, vision, hearing, and mobility accessibility, along with tools for individuals with speaking difficulties, ahead of the 12th Global Accessibility Awareness Day, which falls on the third Thursday of May, Trend reports citing Xinhua.
Apple has been working in collaboration with community groups representing users with disabilities to develop accessibility features, and the updates draw on advances in hardware and software, including on-device machine learning to ensure user privacy, the company said on Tuesday.
Coming later this year, Assistive Access will let users with cognitive disabilities use iPhone and iPad with greater ease and independence. Nonspeaking individuals can type to speak during calls and conversations with Live Speech, and people can use Personal Voice to create a synthesized voice that sounds like them for connecting with family and friends. For users who are blind or have low vision, Detection Mode in Magnifier offers Point and Speak, which identifies text users point toward and reads it out loud, according to Apple.
"We're excited to share incredible new features that build on our long history of making technology accessible so that everyone has the opportunity to create, communicate, and do what they love," said Tim Cook, Apple's CEO.
"The intellectual and developmental disability community is bursting with creativity, but technology often poses physical, visual, or knowledge barriers for these individuals," said Katy Schmid, senior director of National Program Initiatives at The Arc of the United States.
The cognitively accessible experience "means more open doors to education, employment, safety, and autonomy. It means broadening worlds and expanding potential," she added.