Add overview
akaDuality committed Dec 10, 2023
1 parent ef9d76e commit c16ea13
Showing 1 changed file with 3 additions and 3 deletions.
@@ -5,19 +5,19 @@ Accessibility part of UIKit and SwiftUI frameworks helps developers to represent
## Overview


- **VoiceOver** helps blind or low-vision people use a phone by listening to an audio description of the UI and issuing commands through swipes and indirect touches. The developer prepares a text description of each element, and the iPhone generates the spoken description from that text (a code sketch follows the gesture diagram below).

[Video how to navigate by VoiceOver](https://www.youtube.com/watch?v=qDm7GiKra28)

![VoiceOver gestures](VoiceOverGestures)
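
A minimal sketch of how that text description can be supplied, assuming a hypothetical purchase button and pizza row: UIKit's `accessibilityLabel` and SwiftUI's `accessibilityLabel(_:)` / `accessibilityValue(_:)` modifiers carry the text VoiceOver speaks.

```swift
import UIKit
import SwiftUI

// UIKit: VoiceOver speaks the label you provide instead of the visible title alone.
let purchaseButton = UIButton(type: .system)
purchaseButton.setTitle("$4.99", for: .normal)
purchaseButton.accessibilityLabel = "Buy Pepperoni pizza" // spoken by VoiceOver

// SwiftUI: the same idea expressed with modifiers.
struct PizzaRow: View {
    var body: some View {
        Label("Pepperoni", systemImage: "flame")
            .accessibilityLabel("Pepperoni pizza")    // name of the element
            .accessibilityValue("4 dollars 99 cents") // value spoken after the name
    }
}
```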

- **Voice Control** adds a layer of commands over the graphical UI so a phone can be driven by voice. A Voice Control user can see the screen but can't touch it, so they speak commands like "select Pepperoni", "tap Purchase" or "close screen". The iPhone recognizes the speech, converts it to text, and matches the command to the elements' descriptions (a sketch follows the screenshot below).

[Video how to use Voice Control](https://www.youtube.com/watch?v=eg22JaZWAgs)

![Voice Control modes: with labels, enumerated elements or grid](VoiceControlOverview)
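
A sketch of how a spoken command gets linked to an element, assuming the same hypothetical purchase button: `accessibilityUserInputLabels` lists the alternative names Voice Control will accept, in addition to the VoiceOver label.

```swift
import UIKit

let purchaseButton = UIButton(type: .system)
purchaseButton.setTitle("Purchase", for: .normal)

// What VoiceOver speaks when the element gets focus.
purchaseButton.accessibilityLabel = "Purchase Pepperoni pizza"

// Names Voice Control listens for, so "tap Purchase", "tap Buy"
// and "tap Pay" all reach the same button.
purchaseButton.accessibilityUserInputLabels = ["Purchase", "Buy", "Pay"]
```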

- **Switch Control** allows connecting external devices and linking them to any command. As a result, people with motor impairments can control a phone with simple signals: a finger movement, a muscle stretch, etc. The iPhone's camera can also recognize a facial expression, or the phone can treat a sound, as a command. In the end, the user moves focus around the screen and passes a command to the focused element through a submenu that is presented after selection.
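
A sketch of how extra commands can appear in that submenu, assuming a hypothetical pizza cell: custom actions attached via `accessibilityCustomActions` are offered to Switch Control (and VoiceOver) once the element is selected.

```swift
import UIKit

final class PizzaCell: UITableViewCell {

    func setupAccessibility() {
        isAccessibilityElement = true
        accessibilityLabel = "Pepperoni pizza"

        // Offered in the menu presented after the element is selected.
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Add to order") { _ in
                // place the order here
                return true
            },
            UIAccessibilityCustomAction(name: "Show ingredients") { _ in
                // open the details screen here
                return true
            }
        ]
    }
}
```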

> Note: Watch [Apple's playlist about Accessibility](https://www.youtube.com/playlist?list=PLIl2EzNYri0cLtSlZowttih25VnSvWITu) for inspiration.
