Hey, Alexa: Put Inclusion on the Agenda


By Porter Wells

Voice-activated virtual assistants can set alarms, check the weather, start conference calls, or read that email from your boss aloud. This technology is sprinting into offices, and employers must ensure their workers aren’t left at the starting line.

More than a quarter (29 percent) of companies in North America and Europe are already using an artificial intelligence assistant or chatbot in the workplace or plan to start using one in the next 12 months, an April 2018 study found. The virtual assistant market as a whole could surpass $16.1 billion by 2023, a separate report projected.

Technological advances can increase productivity. But employers need to be aware that some employees may have difficulty accessing or interacting with new voice-activated technology (VAT). Two groups of workers in particular should be considered: those with disabilities and those who speak with accents.

VAT in the workplace is “a very big issue for major companies,” said Leslie Wilson, vice president of Disability:IN’s Inclusion Works, a network of disability rights advocates and businesses including Fortune 500 companies like Amazon, Dell, 3M, Bank of America, and Comcast NBCUniversal.

Accessibility Is Key

When it comes to disabled workers, VAT presents a mixed bag.

On the one hand, “it’s a boon for workers with manual dexterity limitations,” Simon Dermer said. That is to say, voice commands at the office could significantly help workers who struggle with fine motor skills or have limited physical mobility. Dermer is the co-founder and managing director of eSSENTIAL Accessibility, a company that works with businesses to ensure people with disabilities can access their web services.

But deaf workers and those with speech-related disabilities require additional consideration, Dermer said. Wilson agreed, noting that companies often purchase off-the-shelf products for workplace use that haven’t been robustly tested for usability by such workers.

Tech companies seem to have realized this, as virtual assistants are now incorporating visual cues, such as colored lights and display screens, Dermer told Bloomberg Law.

VAT can perform functions that help disabled workers beyond simply acknowledging and executing commands. During a forward-looking technology presentation at Microsoft’s Build 2018 conference, a deaf engineer said she needs a sign language interpreter in meetings. However, because Microsoft’s virtual assistant, Cortana, can transcribe speech to text, she’s able to give meeting conversations her full attention without the need to take notes.
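For readers curious what that kind of transcription looks like in practice, here is a minimal, hypothetical sketch using the open-source Python SpeechRecognition package rather than Cortana’s actual pipeline; the recording name “meeting.wav” and the choice of recognition service are assumptions for illustration only.

    # Illustrative sketch only: transcribe a recorded meeting segment to text.
    # This is not Microsoft's Cortana implementation; it assumes the
    # open-source SpeechRecognition package and a local file "meeting.wav".
    import speech_recognition as sr

    recognizer = sr.Recognizer()

    # Load the recorded audio and capture it for recognition.
    with sr.AudioFile("meeting.wav") as source:
        audio = recognizer.record(source)

    try:
        # Send the audio to a speech-to-text service and print the transcript.
        transcript = recognizer.recognize_google(audio)
        print(transcript)
    except sr.UnknownValueError:
        print("Speech could not be understood.")
    except sr.RequestError as err:
        print(f"Transcription service unavailable: {err}")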

Jennifer Betts, a partner with labor and employment law firm Ogletree Deakins, encourages companies to do their homework before rolling out VAT for their workers’ use.

“Who is the developer? What was the tech’s data development process and how much diverse data was utilized? Has the tech been developed and validated to ensure that it avoids these kinds of failures? Who in the industry is using it? If it hasn’t been widely adopted, why not?” Betts told Bloomberg Law in an email.

And because employers are required under the Americans with Disabilities Act to make reasonable accommodations for disabled workers, Betts strongly advises that companies confer with legal counsel before rolling out VAT to their workforce.

Thus far, litigation alleging workplace discrimination involving voice-activated technology hasn’t been widely pursued, with no such claims filed in federal courts over the last year, according to Bloomberg Law’s Litigation Intelligence Center.

When it comes to the general introduction of virtual assistants to the workplace, “I don’t see any downsides,” Wilson told Bloomberg Law. “I’m not being Pollyanna. All I see is optimism. The thing about technology is that when it works, it doesn’t leave anyone behind,” she said.

Learning Curve Patience

Google Home and Amazon’s Alexa still have statistically observable difficulties understanding commands given in certain accents, according to data compiled and analyzed by Globalme, a language technologies company headquartered in Vancouver. The assistants struggle not just with commands given by nonnative English speakers, but also by Americans who speak with regional accents.

Globalme co-founder Emre Akkas said that result was expected, given the data on which those Silicon Valley products were built. “The data is biased towards the voices of male, native English speakers without regional accents because, well, that’s the demographic that has provided the most data to these devices so far,” Akkas said.

There’s no law like the ADA that protects workers who speak with an accent, Betts told Bloomberg Law. Still, it’s generally unlawful to discriminate against someone because of their national origin—an issue that could crop up with VAT. “Depending on the circumstances of your tech, there could be a lurking claim of disparate impact”—that is, a disproportionately negative impact on a protected group of people—“that employers should be cautious about,” Betts said.

But there’s an upside unique to machine learning: your AI assistant of choice doesn’t need to wait for a system upgrade from the engineers at headquarters to improve its performance. Each device learns on its own, constantly taking in and processing data from the voices speaking to it and the programs connected to its software. Still, some employees may have to repeat themselves several times in front of co-workers or troubleshoot a virtual assistant that performs the wrong function.

Tech companies are working on the data bias, Akkas told Bloomberg Law. They’re continuing to improve VAT’s understanding not only of accented English, but of other languages as well. Alexa speaks three languages: English, German, and Japanese. Google Home understands and speaks those three languages, plus French, Italian, and Spanish. Microsoft’s Cortana is the only one of the three that speaks Mandarin Chinese.

But the tech won’t stop there. Back-and-forth dialogue with virtual assistants is still a challenge, said Joshua Montgomery, co-founder of Mycroft.ai, an open-source virtual assistant company that crowdsources VAT coding and software. What excites Montgomery about the future is the day when you can hold a real conversation with an AI and clarify, naturally, exactly what you want the tech to do.
