Amazon unexpectedly renews employee monitoring debate
Amazon is reportedly considering an employee monitoring tool that would block certain words, such as 'robots.' It may never happen, the company said, but it has spurred debate.
Some employee monitoring technologies give employers the capability to flag inappropriate language on apps and identify harassment. If that were all Amazon was doing, its consideration of a tool to block speech on an internal messaging app might not have raised eyebrows.
But based on leaked documents, Amazon is reportedly considering blocking words and phrases such as "injustice," "ethics," "living wage," "this is dumb," "robots," "restrooms" and others that might signal employee discontent over working conditions.
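In practice, a screen like that amounts to keyword and phrase matching against a blocklist. The sketch below is purely illustrative: it assumes a simple case-insensitive, whole-word match and uses the terms reported by The Intercept only as example data; it does not reflect Amazon's actual design.

```python
import re

# Hypothetical blocklist: the terms reported by The Intercept, used here only
# as example data, not an actual Amazon configuration.
BLOCKED_PHRASES = [
    "injustice", "ethics", "living wage", "this is dumb", "robots", "restrooms",
]

# One case-insensitive pattern with word boundaries, so "robots" matches
# but "robotics" does not.
_PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(p) for p in BLOCKED_PHRASES) + r")\b",
    re.IGNORECASE,
)

def screen_message(text: str) -> tuple[bool, list[str]]:
    """Return whether the message would be blocked and which phrases matched."""
    matches = _PATTERN.findall(text)
    return bool(matches), matches

if __name__ == "__main__":
    blocked, hits = screen_message("The robots never slow down and we can't get to the restrooms.")
    print(blocked, hits)  # True ['robots', 'restrooms']
```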
Details of the Amazon documents, which the company isn't denying, were recently published in a story by The Intercept. Amazon said the program to censor messages hasn't been launched, and if it does launch, the only words "that may be screened are ones that are offensive or harassing, which is intended to protect our team," said Barbara Agrait, an Amazon spokesperson, in an email to SearchHRSoftware.
Still, the leaked documents raise broad questions about employee monitoring -- whether organizations should deploy the tools and, if they do, how they should be used. HR technology is available to analyze employee sentiment in communications and flag signals of, for instance, potential burnout, or to identify someone at risk of quitting. Other employee monitoring tools can help assess productivity, such as by checking calendars and emails to gauge an employee's effort.
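Sentiment-analysis tools of this kind generally score message text and surface aggregate trends rather than individual remarks. As a rough, hypothetical sketch -- using the open source VADER analyzer from NLTK rather than any commercial HR product, and an arbitrary example threshold -- such a flag might work like this:

```python
# Toy illustration of sentiment-based flagging, not any vendor's product.
# Requires: pip install nltk
from statistics import mean

import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

def flag_negative_trend(messages: list[str], threshold: float = -0.3) -> bool:
    """Flag a batch of messages whose average compound sentiment is strongly negative.

    The -0.3 threshold is an arbitrary example value, not an industry standard.
    """
    scores = [sia.polarity_scores(m)["compound"] for m in messages]
    return mean(scores) < threshold

team_channel = [
    "Another mandatory overtime shift this weekend.",
    "I'm exhausted and nobody is listening to our concerns.",
    "Honestly thinking about looking for another job.",
]
print(flag_negative_trend(team_channel))  # likely True for this sample
```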
Employees have no legal right to privacy as long as employers notify them of their monitoring policies, said Paul Starkman, a labor and employment attorney for Clark Hill PLC in Chicago.
But "from a publicity and brand and reputation standpoint," Starkman said, Amazon's proposal is "dangerous." The company "should expect to get some backlash in the court of public opinion about it."
The risk of employee monitoring
Starkman said most organizations don't monitor employee communications, and if they do, it's typically to check for profanity.
"Most companies don't have the resources or the time or inclination to monitor employees," said Starkman, who noted that there is a risk of employee discontent and higher turnover that comes with using these tools.
Plus, there are other ways to learn what employees' concerns are without monitoring their communications in the background.
Remesh Inc. makes an HR tool that uses a messaging app to gather and rank employee opinions. Managers, for instance, can solicit feedback from employees at group meetings and in real time. The tool summarizes relevant employee feedback using machine learning and other algorithms. The AI identifies and prioritizes the responses that are most important and most representative of the full range of opinions, said Andrew Konya, co-founder and CEO of Remesh.
Konya sees two significant risks with Amazon's approach: missing the truth and attrition.
"By putting in systems that suppress or steer conversations, you might steer away from important truths that you need to know," he said. "The second risk is that by creating a situation where employees feel muted, that their voice feels suppressed, you will lead them to quit."
Konya said employers should seek "to create situations where there is two-way consent to the mutual exchange of truth and honesty."
Patrick Thibodeau covers HCM and ERP technologies for TechTarget. He's worked for more than two decades as an enterprise IT reporter.