The accessibility service is a feature of the Android framework designed to provide user interface enhancements that assist users with disabilities, or those who may temporarily be unable to fully interact with a device. In these cases, people might need additional or alternative feedback such as text-to-speech or haptic feedback. Accessibility services run in the background and receive callbacks from the system when accessibility events are fired. Such events denote some state transition in the user interface, for example, the focus has changed, a button has been clicked, etc. Such a service can also optionally request the capability to query the content of the active window.
Accessibility services receive events from the system when the UI of the active application changes or user input occurs. They may also retrieve the content of the active window and perform actions on UI components.
What can we do with accessibility services?
Let’s check what an accessibility service is able to do with the user interface.
The service may react to a bunch of events. The AccessibilityEvent class represents a system event and contains contextual information about what happened in the user interface, including the event type, the source of the event, the event time, etc. For example, there are several very useful event types such as TYPE_VIEW_CLICKED, TYPE_WINDOW_STATE_CHANGED, TYPE_VIEW_TEXT_SELECTION_CHANGED and others.
The accessibility service configuration allows us to filter events by type and by application package. In other words, it is possible to receive all accessibility events from all applications, or only specific events from specific applications.
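As an illustration, here is a minimal sketch of doing the same filtering programmatically via AccessibilityServiceInfo instead of in the XML configuration; the package name com.example.someapp is a placeholder:

```java
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.AccessibilityServiceInfo;
import android.view.accessibility.AccessibilityEvent;

public class FilteringService extends AccessibilityService {
    @Override
    protected void onServiceConnected() {
        AccessibilityServiceInfo info = new AccessibilityServiceInfo();
        // Receive only click and window-state events...
        info.eventTypes = AccessibilityEvent.TYPE_VIEW_CLICKED
                | AccessibilityEvent.TYPE_WINDOW_STATE_CHANGED;
        // ...and only from one specific application (placeholder package).
        info.packageNames = new String[] { "com.example.someapp" };
        info.feedbackType = AccessibilityServiceInfo.FEEDBACK_GENERIC;
        setServiceInfo(info);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // React to the filtered events here.
    }

    @Override
    public void onInterrupt() {
    }
}
```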
As far as interacting with the UI is concerned, you can do it in several ways. The accessibility service doesn’t use the View class directly, but AccessibilityNodeInfo instead. This class allows you to click views (using the ACTION_CLICK or ACTION_LONG_CLICK actions), scroll views (using the ACTION_SCROLL_FORWARD and ACTION_SCROLL_BACKWARD actions) or perform text selection operations (using actions like ACTION_SET_SELECTION, ACTION_COPY, ACTION_CUT, etc.). The whole list of actions is available in the javadoc of the class.
It is also possible to get view-related information from AccessibilityNodeInfo, such as isScrollable, isClickable, etc. Given the ability to retrieve the window content and perform gestures, accessibility services seem to be quite a powerful tool.
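To make this concrete, here is a minimal sketch of finding a clickable node by its view id and performing ACTION_CLICK on it; the id com.example.someapp:id/submit is a placeholder:

```java
import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

public class ClickingService extends AccessibilityService {
    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Requires android:canRetrieveWindowContent="true" in the service config.
        AccessibilityNodeInfo root = getRootInActiveWindow();
        if (root == null) {
            return;
        }
        // Look up nodes by their fully qualified view id (placeholder id).
        for (AccessibilityNodeInfo node :
                root.findAccessibilityNodeInfosByViewId("com.example.someapp:id/submit")) {
            if (node.isClickable()) {
                node.performAction(AccessibilityNodeInfo.ACTION_CLICK);
            }
            node.recycle();
        }
        root.recycle();
    }

    @Override
    public void onInterrupt() {
    }
}
```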
What can we use them for?
As I mentioned, the accessibility service is designed to assist users with disabilities by providing additional feedback. But its possible usage is not limited to these user interface enhancements. An accessibility service may be developed as part of a bigger application and used to communicate with other components like services or activities. Hence, we can use such services for our own needs. Let me give a couple of examples.
One of my previous projects was quite specific: an Android application was part of a big testing infrastructure developed for testing the Bluetooth connectivity of embedded software. The aim of the application was to emulate user actions such as accepting a Bluetooth pairing request, rejecting a Bluetooth pairing request, allowing a remote Bluetooth device to access the phonebook/messages, and so on. For these purposes, I used the accessibility service to perform actions on system dialogs.
The second example of using accessibility services (you can find several suggestions on Stack Overflow) is obtaining the response of a USSD code by retrieving the content of a system dialog. These codes, also known as “quick codes”, are usually used for communicating with a GSM operator. Often they look like *100#, *123#, etc. Here is a demo application I wrote for this post. Below I will describe the development details.
I created a list of possible usage scenarios:
1. Automating/emulating user actions (may be used for testing purposes)
– Accepting/rejecting runtime permissions
– Accepting/rejecting bluetooth pairing request
– Answering/rejecting calls
– Using as alternative to uiautomator/espresso (for system apps)
2. Reacting to system events if there is no broadcast event for them
– Receiving USSD code result
– Appearance of application dialogs
3. Retrieving information from the UI elements of other applications (including system ones)
– Google Now on Tap-like functionality (analysing the content of textviews in any app and making a Google search)
– Spy-ish software (looking for keywords in textviews)
4. Dispatching gestures
– Controlling other applications by performing gestures (kind of user action emulation; see the sketch after this list)
5. Listening to and controlling the soft keyboard state
– Gathering key-press statistics
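For the gesture-dispatching scenario, here is a minimal sketch of tapping at given coordinates; dispatchGesture is available from API 24, and the class name and coordinates are placeholders:

```java
import android.accessibilityservice.AccessibilityService;
import android.accessibilityservice.GestureDescription;
import android.graphics.Path;
import android.view.accessibility.AccessibilityEvent;

public class GestureService extends AccessibilityService {
    // Dispatches a short tap at the given screen coordinates.
    // Requires android:canPerformGestures="true" in the service config.
    void tapAt(float x, float y) {
        Path path = new Path();
        path.moveTo(x, y);
        GestureDescription gesture = new GestureDescription.Builder()
                // One stroke: press at (x, y) for 50 ms, starting immediately.
                .addStroke(new GestureDescription.StrokeDescription(path, 0, 50))
                .build();
        dispatchGesture(gesture, null, null);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
    }

    @Override
    public void onInterrupt() {
    }
}
```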
As you can see, there are several unusual ways of using accessibility services. I wrote ‘unusual’ because all these scenarios differ from the original purpose of accessibility services. They are a bit tricky, but I believe we can widely use our own services as part of user interaction testing (together with espresso and uiautomator).
Any restrictions?
Accessibility services might seem to be a kind of security issue, but they are not. Firstly, the user must manually activate an accessibility service on the corresponding settings screen. Secondly, the services run in user space, not in system space and not in middleware. So, a simple adb shell pm dump com.yourapp.package command will show all information about an accessibility service, even its class name. Moreover, to get the content of some textview you have to know its package, id and other information (the same as when using uiautomator).
Show me an example
I won’t go into details of how to develop and configure accessibility services in general. You can find a good guide on the Android developer documentation website. Instead, I will focus on the approaches from the list above.
The demo application allows the user to add a USSD code to the list, run it by clicking on the list item, receive the USSD code response, dismiss the dialog and show the response content within the application. The application may be downloaded from the Play Store and the source code is available on GitHub.
This video shows how the application works:
The accessibility service must be able to do the following:
- Receive an accessibility event when the dialog with the USSD code response appears;
- Retrieve the content of the dialog (the response of the USSD code);
- Dismiss the dialog by clicking the OK button.
So, firstly, we have to configure the accessibility service to receive events from the application which performs USSD codes (basically, this is the application we use to make calls). We also need to receive only events that represent the opening of dialogs. In order to do that, we have to know the package of the phone application and the corresponding accessibility event type. Secondly, to read the USSD response text, we need to know the id of the dialog textview where the response is shown. And finally, we need to know the id of the OK button to emulate a click that dismisses the dialog.
The uiautomatorviewer tool helps to find out the package of the application and the ids of the textview and the button. Here is a good tutorial on how to use it.
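The configuration file itself was originally embedded separately; below is a rough reconstruction based on the attributes discussed next (the file name and the feedback type are assumptions):

```xml
<!-- res/xml/ussd_service_config.xml: a reconstruction, not the original file. -->
<accessibility-service
    xmlns:android="http://schemas.android.com/apk/res/android"
    android:accessibilityEventTypes="typeWindowStateChanged"
    android:packageNames="com.android.phone"
    android:accessibilityFeedbackType="feedbackGeneric"
    android:canRetrieveWindowContent="true" />
```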
There are a couple of important lines in this configuration:
- The android:packageNames="com.android.phone" line tells the system that we are interested only in events from the application with this package. On my Nexus 6 device, this is the Phone application made by Google. Other manufacturers (like Samsung, LG, etc.) may change the package of the phone application.
- The android:accessibilityEventTypes="typeWindowStateChanged" line adds filtering of events by type. This event type represents a change of the window state, which covers the opening of dialogs, popups and so on.
- The android:canRetrieveWindowContent="true" line enables the accessibility service to go through the layout tree and get view-related information.
Now, let’s implement the accessibility service itself. Below is a code snippet which contains the overridden onAccessibilityEvent method of the AccessibilityService class. It shows how to get the text from the textview and how to perform an action on the button.
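The original snippet was embedded externally, so here is a minimal sketch of the idea; the view ids android:id/message and android:id/button1 are assumptions based on standard AlertDialog layouts, not necessarily what every phone application uses:

```java
import android.accessibilityservice.AccessibilityService;
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

public class UssdService extends AccessibilityService {
    private static final String TAG = "UssdService";

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        AccessibilityNodeInfo root = getRootInActiveWindow();
        if (root == null) {
            return;
        }
        // Retrieve the USSD response text from the dialog's message textview.
        for (AccessibilityNodeInfo node :
                root.findAccessibilityNodeInfosByViewId("android:id/message")) {
            Log.d(TAG, "USSD response: " + node.getText());
        }
        // Dismiss the dialog by clicking its positive (OK) button.
        for (AccessibilityNodeInfo node :
                root.findAccessibilityNodeInfosByViewId("android:id/button1")) {
            node.performAction(AccessibilityNodeInfo.ACTION_CLICK);
        }
        root.recycle();
    }

    @Override
    public void onInterrupt() {
    }
}
```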
Please let me know if you use accessibility services in your apps and for what purpose. Also, add comments if you find other unusual scenarios of accessibility service usage.