Accessibility in Flutter on the Web
How Flutter aims to make canvas-rendered apps accessible to users of assistive technologies
Apr 16, 2024
One of the target platforms the Flutter framework supports is the web. Flutter applications achieve pixel perfection and platform consistency by rendering all UI onto a canvas element. However, canvas elements are not accessible by default. This case study explains how accessibility support works for such canvas-rendered Flutter apps.
Flutter has a large number of default widgets that generate an accessibility tree automatically. An accessibility tree is a tree of accessibility objects that assistive technology can query for attributes and properties and perform actions on. For custom widgets, Flutter’s Semantics class lets developers describe the meaning of their widgets, helping assistive technologies make sense of the widget content.
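As a minimal sketch of how a custom widget can be annotated (the widget and label text here are hypothetical, not from the Flutter Gallery), a custom-drawn rating indicator might expose a single labeled node instead of its individual icons:

```dart
import 'package:flutter/material.dart';

// A hypothetical custom widget: a row of star icons that would otherwise
// be announced icon by icon. Semantics gives assistive technologies one
// meaningful label, and ExcludeSemantics hides the decorative children.
class RatingIndicator extends StatelessWidget {
  const RatingIndicator({super.key, required this.stars});

  final int stars;

  @override
  Widget build(BuildContext context) {
    return Semantics(
      label: '$stars out of 5 stars',
      // Only the label above is exposed in the accessibility tree.
      child: ExcludeSemantics(
        child: Row(
          mainAxisSize: MainAxisSize.min,
          children: [
            for (var i = 0; i < 5; i++)
              Icon(i < stars ? Icons.star : Icons.star_border),
          ],
        ),
      ),
    );
  }
}
```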
For performance reasons, at the time of this writing, Flutter’s accessibility support on the web is opt-in. The Flutter team would like to eventually turn semantics on by default in Flutter Web. At the moment, however, doing so would incur noticeable performance costs in a significant number of cases, so some optimization is required before the default can change. Developers who want to always turn on Flutter’s accessibility mode can do so with the following code snippet.
import 'package:flutter/foundation.dart' show kIsWeb;
import 'package:flutter/material.dart';
import 'package:flutter/semantics.dart';

void main() {
  runApp(const MyApp());
  if (kIsWeb) {
    SemanticsBinding.instance.ensureSemantics();
  }
}
Note: If your app absolutely needs to know whether a user is using assistive technology such as a screen reader, let users opt in.
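With such an opt-in in place, an app can observe whether the engine has semantics turned on. A minimal sketch, assuming the standard SemanticsBinding and PlatformDispatcher APIs:

```dart
import 'dart:ui' show PlatformDispatcher;

import 'package:flutter/foundation.dart';
import 'package:flutter/semantics.dart';

// A minimal sketch: check whether semantics is currently enabled and get
// notified when that changes, for example after the user activates an
// accessibility opt-in.
void watchSemantics() {
  debugPrint(
      'Semantics enabled: ${SemanticsBinding.instance.semanticsEnabled}');
  PlatformDispatcher.instance.onSemanticsEnabledChanged = () {
    debugPrint(
        'Semantics enabled: ${PlatformDispatcher.instance.semanticsEnabled}');
  };
}
```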
Once you’ve opted in to Flutter’s accessibility support, the HTML changes automatically, as shown in the rest of this page.
Note: Screen readers are only one example of assistive technology that benefits from the described approach. For readability, this page uses screen readers as a stand-in for assistive technologies in general.
Flutter’s accessibility opt-in
Flutter’s opt-in mechanism is a hidden button. To be exact, Flutter places an <flt-semantics-placeholder> element with role="button" in its HTML, invisible and unreachable for sighted users. It’s a custom element with styling applied so that it doesn’t show and isn’t selectable unless you use a screen reader.
<flt-semantics-placeholder
role="button"
aria-live="polite"
aria-label="Enable accessibility"
tabindex="0"
style="
position: absolute;
left: -1px;
top: -1px;
width: 1px;
height: 1px;"
></flt-semantics-placeholder>
/* `<flt-semantics-placeholder>` inherits from `<flutter-view>`. */
flutter-view {
user-select: none;
}
Changes after the opt-in
What happens when a screen reader user clicks this button? Consider a moderately complex example, such as the card from the Flutter Gallery displayed in the following screenshot.
To better understand what changes when a user clicks the button, compare the before and after screenshots of Chrome DevTools when you inspect the accessibility tree. The second screenshot exposes much more semantic information than the first.
Before opt-in:
After opt-in:
Details of the implementation
The core idea in Flutter is to create an accessible DOM structure that reflects what is currently displayed on the canvas. This consists of an <flt-semantics-host> parent custom element with <flt-semantics> and <flt-semantics-container> child elements, which in turn can be nested. Consider a button widget, such as TextButton. This widget is represented by an <flt-semantics> element in the DOM. The ARIA annotations (e.g., role or aria-label) and other DOM properties (tabindex, event listeners) on the <flt-semantics> element allow the screen reader to announce the element as a button and support clicking and tapping on it, even though it’s not a literal <button> element. In the following screenshot, the Share button is one example of such a button.
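Put together, the generated DOM for such a button might look roughly like the following. This is an illustrative sketch, not verbatim engine output; the exact element attributes vary by Flutter version.

```html
<!-- Hypothetical semantics DOM for a "Share" button (illustrative only). -->
<flt-semantics-host>
  <flt-semantics-container>
    <flt-semantics role="button" aria-label="Share" tabindex="0">
    </flt-semantics>
  </flt-semantics-container>
</flt-semantics-host>
```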