diff --git a/Sources/AccessibilityDocumentation/Documentation.docc/Articles/AssistiveTechnologies/SwitchControl.md b/Sources/AccessibilityDocumentation/Documentation.docc/Articles/AssistiveTechnologies/SwitchControl.md
index a57b7cb..48b0070 100644
--- a/Sources/AccessibilityDocumentation/Documentation.docc/Articles/AssistiveTechnologies/SwitchControl.md
+++ b/Sources/AccessibilityDocumentation/Documentation.docc/Articles/AssistiveTechnologies/SwitchControl.md
@@ -1,4 +1,4 @@
-# SwitchControl
+# Switch Control
 
 Allows users to connect external devices and link them to any command. As a result, people with motor impairments can control a phone with simple signals: a finger movement, a muscle stretch, etc. The iPhone's camera can also recognize a facial expression as a command, and any sound can act as one too. In the end, the user moves focus across the screen and passes a command to the focused element through a submenu that is presented after selection.
 
diff --git a/Sources/AccessibilityDocumentation/Documentation.docc/Articles/AssistiveTechnologies/VoiceControl.md b/Sources/AccessibilityDocumentation/Documentation.docc/Articles/AssistiveTechnologies/VoiceControl.md
index 4ef778e..d652ce5 100644
--- a/Sources/AccessibilityDocumentation/Documentation.docc/Articles/AssistiveTechnologies/VoiceControl.md
+++ b/Sources/AccessibilityDocumentation/Documentation.docc/Articles/AssistiveTechnologies/VoiceControl.md
@@ -1,4 +1,4 @@
-# VoiceControl
+# Voice Control
 
 Adds additional commands over the graphical UI so a phone can be controlled by voice. A Voice Control user can see their phone but can't touch it, so they speak commands like "select Pepperoni", "tap purchase", or "close screen". The iPhone recognizes the speech, converts it to text, and matches the command to elements' descriptions.
 
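For the Switch Control article, a minimal UIKit sketch (not part of the patch; the cell type and action name are hypothetical) of how an element can populate the submenu Switch Control presents after the user selects it, via `accessibilityCustomActions`:

```swift
import UIKit

final class TrackCell: UITableViewCell {
    /// Hypothetical callback; not part of the original article.
    var onAddToFavorites: (() -> Void)?

    override func awakeFromNib() {
        super.awakeFromNib()
        // Custom actions appear in the submenu Switch Control presents
        // after the user selects this element.
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Add to Favorites") { [weak self] _ in
                self?.onAddToFavorites?()
                return true // report that the action succeeded
            }
        ]
    }
}
```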
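For the Voice Control article, a sketch of how elements' descriptions feed command matching, assuming a UIKit button whose visible title differs from the names users might speak; `accessibilityUserInputLabels` (iOS 13+) supplies the alternative spoken names:

```swift
import UIKit

let purchaseButton = UIButton(type: .system)
purchaseButton.setTitle("Buy Now", for: .normal)

// Voice Control matches spoken commands against element descriptions.
// Listing alternative names means "tap Purchase" works even though
// the visible title is "Buy Now".
purchaseButton.accessibilityUserInputLabels = ["Purchase", "Buy", "Buy Now"]
```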