Don't worry, I've cut & pasted it and tweaked it a little for visual readers
# Customization
- Subtle haptic feedback for most sound cues - navigating to and tapping items, turning the rotor, various errors
- Ability to turn every sound and haptic feedback effect on and off individually, in addition to just having master switches for sounds and haptics
- Ability to create custom punctuation pronunciation levels in addition to the usual none, some and all. These let you change whether a character is just passed to the speech synthesizer or spoken in a different way. For example, you can change the default pronunciation of the # character from "number" to "hashtag", or even shorter, just "hash". These punctuation schemes are synced over iCloud to other iOS devices and Macs, and can be exported to a file that can be shared with other people or backed up for importing later. (There's a toy sketch of the idea at the end of this list.)
- The ability to make custom activities - a set of settings that can be switched to quickly by hand, or applied automatically when you enter a specific app or a specific context like working in a word processor. Currently, as of Dev Beta 1, the settings you can apply include the voice, speech rate and volume, as well as the punctuation level.
- You can reassign existing gestures/Bluetooth keyboard commands and add new ones to perform different actions. These range from basic things like navigating to different kinds of elements, adjusting speech settings, and quickly going to the home screen, app switcher, notification center and control center, to really advanced and powerful ones like running any shortcut. Most gestures can be changed, with the exception of the 1-finger swipes that move through items and the double-tap that performs a tap on whatever is focused.
- You can now completely turn off reading of emoji. If you interact with people who like to spam them without adding much benefit, or who include them in their usernames on social media, you can now silence emoji at the OS level.
- You can now customize how VoiceOver handles image descriptions. Ever since iOS 11, Apple has been using machine learning to guess objects and text in pictures and having VoiceOver read them out when you performed a 3-finger tap to get additional information. Now, you can have VoiceOver read these descriptions automatically when they're available, or have it play a sound to let you know about them.
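On the developer side, the best way to not rely on those guessed descriptions is still to supply one yourself. Here's a minimal Swift sketch of that; the asset name and label text are just made-up placeholders:

```swift
import UIKit

/// Minimal sketch: if an app supplies its own description, VoiceOver reads
/// that instead of falling back on a machine-learned guess.
/// The asset name and label text below are made-up placeholders.
func makeAccessiblePhotoView() -> UIImageView {
    let photoView = UIImageView(image: UIImage(named: "beach-sunset"))
    photoView.isAccessibilityElement = true   // expose the image to VoiceOver
    photoView.accessibilityLabel = "Two people walking along a beach at sunset"
    photoView.accessibilityTraits = .image    // announced as an image
    return photoView
}
```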
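And for the custom punctuation levels mentioned above, here's a toy Swift sketch of the general idea - a substitution table applied to text before it reaches the synthesizer. This is only an illustration of the concept, not Apple's actual implementation or any public API:

```swift
// Toy illustration only: map characters to the words you'd rather hear.
let scheme: [Character: String] = ["#": "hash", "@": "at", "&": "and"]

func applyPunctuationScheme(to text: String, using table: [Character: String]) -> String {
    var result = ""
    for character in text {
        if let spoken = table[character] {
            result += " \(spoken) "   // speak a word instead of the raw symbol
        } else {
            result.append(character)
        }
    }
    return result
}

// applyPunctuationScheme(to: "#VoiceOver", using: scheme) -> " hash VoiceOver"
```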
# Braille
- VoiceOver now includes the open-source Liblouis braille translator to provide braille translation for braille displays. Liblouis has basically become an industry standard, is also used by Microsoft and Google, and offers a large number of languages. However, if you prefer the old braille tables, they are still available as well.
- There is now a separate rotor for changing the braille language table. Previously, this was tied to the speech language rotor.
- Typing on a braille display has been sped up greatly, which should be particularly noticeable when using contracted braille.
- VoiceOver now displays position information inside a list in braille. So, for example, if you focus the airplane mode switch in the Settings app, in addition to the switch itself being shown in braille, you'll also see a message like "1/50", indicating that this is the first item out of 50 in the list of settings. This doesn't appear to be indicated with speech for the moment, at least in the context of lists.
# Misc
- Performance in general has been improved rather noticeably, particularly when quickly dragging your finger through a lot of items or when switching through screens in an app.
- The camera app provides additional guidance while taking a picture. In addition to telling you when 1 or more faces are in frame and where they are, you are now told if you are tilting your device, and you get additional audio, haptic and spoken feedback when you hold the phone level. (There's a rough sketch of the general idea at the end of this list.)
- If you make a screen recording, VoiceOver speech is now included in the recorded audio. Previously, it wasn't. This makes the feature extremely helpful if you want to report accessibility issues to an app developer, because you can just make a recording and demonstrate exactly where and how things aren't read the way they should be.
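Since the camera item above mentions level feedback, here's a rough Swift sketch of how an app could produce similar feedback with CoreMotion and haptics. This is not how Apple implements it; the threshold and the "level" check are simplified assumptions of mine:

```swift
import UIKit
import CoreMotion

/// Rough sketch, not Apple's implementation: fire a haptic tap and a VoiceOver
/// announcement when the device is (approximately) held level.
final class LevelFeedback {
    private let motion = CMMotionManager()                        // supplies device attitude
    private let haptics = UIImpactFeedbackGenerator(style: .light)
    private var wasLevel = false

    /// `threshold` is in radians (about 3 degrees by default) and is an arbitrary choice.
    func start(threshold: Double = 0.05) {
        guard motion.isDeviceMotionAvailable else { return }
        haptics.prepare()
        motion.deviceMotionUpdateInterval = 0.1
        motion.startDeviceMotionUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let attitude = data?.attitude else { return }
            let isLevel = abs(attitude.pitch) < threshold && abs(attitude.roll) < threshold
            // Only give feedback on the transition into the "level" state.
            if isLevel && !self.wasLevel {
                self.haptics.impactOccurred()
                UIAccessibility.post(notification: .announcement, argument: "Level")
            }
            self.wasLevel = isLevel
        }
    }

    func stop() {
        motion.stopDeviceMotionUpdates()
    }
}
```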