Accessibility Industry Update: June Edition

Welcome to QualityLogic’s monthly accessibility industry update, reviewing key events from May 2024. In this edition we’ll be covering Global Accessibility Awareness Day (GAAD), our takeaways, and the most important discussion points from the annual event, held on May 16. This one was packed with some truly spectacular talks, networking opportunities, a little competition, and, of course, troves of important information. Also, in a first, Apple and Google both unveiled many new accessibility features coming to their software later this year. This edition is going to be a bit longer than usual, but we promise to keep it interesting!

Global Accessibility Awareness Day (GAAD) #13

The purpose of GAAD is to get everyone talking, thinking, and learning about digital access/inclusion and people with different disabilities. (The GAAD Foundation) 

Held on Thursday, May 16, this year’s GAAD featured 208 events (that we know about, at least) sponsored by nearly 100 organizations and interest groups championing digital accessibility across nearly every sector.

These included 55 published in-person events, 93 virtual events, 52 published private/internal events, and 8 activities. Many of the virtual ones are now available to watch at your leisure.

Events We Recommend

With over 200 different presentations in one day (many of which lasted at least thirty minutes), it can be especially difficult to sift through them all. These are the ones we recommend: 

[On-Demand Webinar] Prepare Your Website for New Accessibility Rules (lumar.io): Shameless plug! We (QL) partnered with our friends at Lumar to go over recent and upcoming accessibility regulations, as well as basic recommendations to get started. 

As always, you can view the full list on the GAAD events page.

W3C Announces 174 New Outcomes for Web Accessibility

GAAD got off to quite the start when the World Wide Web Consortium (W3C) published its latest WCAG 3.0 working draft. Though incomplete and under active discussion, this new version contains a total of 174 new outcomes. For the unaware, the term “outcomes” replaces “success criteria” in 3.0 and beyond. For comparison, WCAG 2.2 has 87 success criteria. A couple of key points:

  • It is likely that a good number of the proposed outcomes will be removed as feedback comes in, since there is still a lot of research to be done. 
  • Conformance levels have not yet been defined, so the number of outcomes that organizations will realistically need to think about is much smaller.

The changes under discussion that we find most interesting are: 

  • No memorization (EXPLORATORY): Tasks can be completed without memorizing and recalling information from previous stages of the process. 
  • AI editable (EXPLORATORY): Auto-generated text descriptions are editable by the content creator. 
  • Use of spatial audio (EXPLORATORY): Information is not conveyed with spatial audio alone (illustrated in the sketch after this list). 
  • Comparable keyboard effort (EXPLORATORY): The number of input commands required to complete a task using the keyboard is similar to the number of input commands when using other input modalities. 
  • Specific pressure (EXPLORATORY): Click activation using a pointer device does not require applying a specific pressure. 
  • Non-verbal cues (EXPLORATORY): Media alternatives explain nonverbal cues, such as tone of voice, facial expressions, body gestures, or music with emotional meaning. 
  • Single idea (EXPLORATORY): Each segment of text [such as sentence, paragraph, bullet] presents one concept. 
  • Sentence voice (EXPLORATORY): The voice used is easiest to understand in context. 
  • Disability information privacy (EXPLORATORY): Disability information is not disclosed to or used by third parties and algorithms (including AI). 
  • Algorithm bias (EXPLORATORY): Algorithms (including AI) used are not biased against people with disabilities. 
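
To make one of these outcomes concrete, here is a short TypeScript sketch (our illustration, not the W3C’s) of the “use of spatial audio” outcome on the web: a notification beep is panned with the Web Audio API, but the same information also lands in a visible ARIA live region, so spatial position is never the only carrier of meaning. The status element and messages are our own assumptions.

```typescript
// Illustration of the "use of spatial audio" outcome: spatial audio
// may enhance a notification, but must not be its only channel.

const audioCtx = new AudioContext();

function playPannedBeep(pan: number): void {
  // pan: -1 (full left) .. 1 (full right)
  const osc = audioCtx.createOscillator();
  const panner = new StereoPannerNode(audioCtx, { pan });
  osc.frequency.value = 880;
  osc.connect(panner).connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + 0.15);
}

function notify(side: "left" | "right", message: string): void {
  playPannedBeep(side === "left" ? -1 : 1);

  // Redundant, non-spatial channel: visible text that screen readers
  // also announce. Assumes <div id="status" aria-live="polite"> exists.
  const region = document.getElementById("status");
  if (region) region.textContent = `${message} (${side} panel)`;
}

notify("left", "New message from Dana");
```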

Apple Announces New Accessibility Features, Including Eye Tracking

Apple was kind enough to give us a sneak peek at additions coming later this year as part of iOS 18. New accessibility options include eye tracking, music haptics, and vocal shortcuts.

Eye Tracking will let users navigate and use the iPhone and iPad with just their eyes, including performing most gestures. The feature is powered by artificial intelligence and will use the front-facing camera to set up a profile based on the configuration of the user’s eyes. This is one to stay on top of, especially for those of us who routinely do mobile accessibility testing.

Music Haptics is a revolutionary concept that aims to help users who are deaf or hard of hearing enjoy and experience music. When the accessibility setting is turned on, the Taptic Engine plays taps, textures, and refined vibrations that correspond with the music’s audio. Music Haptics already works with millions of songs from the Apple Music catalog, and an API will be released for developers to bring this to their applications.
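
Apple has not yet published that developer API, so there is nothing official to show here. As a rough, web-flavored analogy of the same idea (emphatically not Apple’s implementation), this sketch samples the loudness of playing audio with the Web Audio API and mirrors it through the standard Vibration API; the energy threshold and pulse mapping are our own guesses.

```typescript
// Web analogy of the Music Haptics idea (NOT Apple's API): estimate
// the energy of the playing audio and mirror it as vibration pulses.

const ctx = new AudioContext();

function hapticsFor(audioEl: HTMLAudioElement): void {
  const source = ctx.createMediaElementSource(audioEl);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 256;
  source.connect(analyser).connect(ctx.destination);

  const bins = new Uint8Array(analyser.frequencyBinCount);

  const tick = () => {
    analyser.getByteFrequencyData(bins);
    const energy = bins.reduce((sum, b) => sum + b, 0) / bins.length; // 0..255
    // Louder passages produce longer pulses; a real implementation
    // would distinguish textures, not just loudness.
    if (energy > 64 && "vibrate" in navigator) {
      navigator.vibrate(Math.round(energy / 4)); // up to ~64 ms
    }
    requestAnimationFrame(tick);
  };
  requestAnimationFrame(tick);
}

// Assumes an <audio> element exists on the page.
hapticsFor(document.querySelector("audio")!);
```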

Vocal Shortcuts will allow users to train Siri to respond to custom utterances. Shortcuts can do anything from launching apps to obtaining information and performing more complex tasks throughout the operating system.

Listen for Atypical Speech is a new option that will use on-device machine learning algorithms to try to understand unconventional speech patterns, such as those that are often the result of conditions like cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke. 

Vehicle Motion Cues will help to reduce motion sickness by adding animated dots to the corners of the screen that correspond to vehicle motion, but that do not inhibit a user’s ability to see or interact with displayed content. 
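
Apple hasn’t described how the feature is built, but the core idea (peripheral dots that move with sensed motion so the eyes agree with the inner ear) is easy to prototype. Here is a TypeScript sketch using the standard devicemotion event; the .motion-dot elements, clamping, and pixel ranges are our own assumptions, and some browsers require a permission prompt before delivering motion events.

```typescript
// Sketch of the Vehicle Motion Cues idea (not Apple's implementation):
// nudge edge-pinned dots in response to sensed acceleration so the
// visual field agrees with what the vestibular system feels.

const dots = Array.from(
  document.querySelectorAll<HTMLElement>(".motion-dot") // dots pinned to screen edges via CSS
);

const clamp = (v: number, limit: number) =>
  Math.max(-limit, Math.min(limit, v));

window.addEventListener("devicemotion", (event: DeviceMotionEvent) => {
  const accel = event.accelerationIncludingGravity;
  if (!accel || accel.x == null || accel.y == null) return;

  // Shift dots a few pixels opposite to the acceleration, mimicking
  // how loose objects appear to shift in a turning vehicle.
  const dx = clamp(-accel.x, 8);
  const dy = clamp(accel.y, 8);
  for (const dot of dots) {
    dot.style.transform = `translate(${dx}px, ${dy}px)`;
  }
});
```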

CarPlay is getting voice control (so users can control CarPlay-supported applications), sound recognition (so those who are deaf or hard of hearing can be alerted to crucial sounds on the road like sirens), and color filters (making the interface visually easier to use for color-blind users, along with options like bold text).

They also mentioned that updates will be coming to existing and widely used features, like: 

  • VoiceOver will be getting new voices, a flexible Voice Rotor, custom volume control, and the ability to customize VoiceOver keyboard shortcuts on Mac. 
  • Magnifier will offer a new Reader Mode and the option to easily launch Detection Mode with the Action button. 
  • Improvements to Braille Screen Input (notably support for multi-line braille, and a way to start BSI more rapidly). 
  • Switch Control will include the option to use the cameras in iPhone and iPad to recognize finger-tap gestures as switches. 
  • Voice Control now works with custom vocabularies and complex words. 
  • Hover Typing shows larger text when typing in a text field, and in a user’s preferred font and color. 
  • Personal Voice will make it easier for users to create custom voices, cutting down on the time and number of sentences needed to do so.

New AI Accessibility Updates Across Google Products

Like Apple, Google also took advantage of GAAD to talk about the new features related to accessibility coming to their products. 

  • As announced at I/O 2024, Gemini is now being integrated into TalkBack; there is another article going over this in greater detail. The short version is that Gemini Nano (a local, on-device model) will be integrated to provide image descriptions in real time (a web-flavored sketch of the same pattern follows this list). 
  • Lookout, which lets blind and visually impaired users get live descriptions of their surroundings using the camera, will now let users find specific types of objects. Users select from a list of seven categories (like seating and tables, or bathrooms), and as they move the camera around the room, Lookout will announce the direction and distance to that item. 
  • The Look to Speak app lets users who have difficulty speaking look at phrases to have them spoken aloud. It is now also possible to choose custom emojis, symbols, and photos. 
  • In Google Maps, screen reader users will get more information about their surroundings, and walking directions gain detailed voice guidance (like letting you know when you’re headed in the right direction or being re-routed). Google Lens will also announce the name, category, and distance of nearby businesses as the user holds up and rotates the camera. 
  • Maps now has accessibility information for more than 50 million businesses, like wheelchair-friendly entrances and accessible restrooms, parking, and seating options. There is also an option to filter reviews for “accessibility”. 
  • Business owners can now add the Auracast attribute to their business profile in Maps so that users of Bluetooth hearing aids know where they can tune into broadcast audio.
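
TalkBack’s Gemini integration happens at the OS level, so app developers never call the model directly. Still, the underlying pattern (an on-device model fills in missing image descriptions, flagged so a human can review them, echoing the “AI editable” outcome from the WCAG draft above) can be sketched on the web. describeImageLocally below is a hypothetical stand-in for whatever on-device captioning call a platform actually exposes.

```typescript
// Pattern sketch: auto-describe images that shipped without alt text.

// HYPOTHETICAL stand-in for an on-device captioning model (the role
// Gemini Nano plays inside TalkBack); swap in a real model call here.
async function describeImageLocally(src: string): Promise<string> {
  throw new Error(`no on-device captioner wired up for ${src}`);
}

async function fillMissingDescriptions(): Promise<void> {
  const images = document.querySelectorAll<HTMLImageElement>("img:not([alt])");
  for (const img of images) {
    try {
      const caption = await describeImageLocally(img.currentSrc || img.src);
      img.alt = caption;
      // Mark the text as machine-generated so authors or users can
      // review and edit it rather than treating it as ground truth.
      img.dataset.altSource = "generated";
    } catch {
      img.alt = ""; // fall back to decorative rather than guessing
    }
  }
}

fillMissingDescriptions();
```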

Intriguing Reads and Information
