
Lawmakers take aim at employee monitoring software

AI-enabled employee monitoring software's role in measuring performance is raising concerns that it might help automate hiring, firing and promotion decisions.



As AI-enabled employee monitoring software tools expand, so does the legislative pushback. Some lawmakers are worried about the implications of monitoring everything employees do and the possibility that employers could rely on algorithms in hiring, firing and promotion decisions.

Today, few U.S. laws govern how AI is used to monitor employees in the workplace, but lawmakers are trying to change that. One effort is from Massachusetts State Rep. Dylan Fernandes, a Democrat, who has been trying to win approval of his provocatively named bill, An Act Preventing a Dystopian Work Environment.

The "gaps in employee privacy protections are becoming increasingly concerning," Fernandes said in an email to TechTarget Editorial about his proposed legislation. He said his bill confronts "this creeping surveillance by requiring transparency and empowering employees with the right to know what data is collected and how it's used."

At 39 pages, the bill governs employee data collection and use, and requires notification of a company's specific form of employee monitoring; it also regulates the use of automated decision-making in employment, in part by requiring an Algorithmic Impact Assessment before using a system. The Canadian government uses the AIA methodology, which asks a series of questions related to automated decision-making.

The Massachusetts bill is just one of many emerging in states such as New York, New Jersey and California, as well as in Congress, that would set transparency requirements for employee monitoring software and AI use, said Joseph O'Keefe, an employment law partner at Faegre Drinker in New York City.

Employee monitoring software outpacing laws

AI is evolving so fast that it will be some time before vendors, their customers and governments understand the legal complexities of these systems, according to O'Keefe. But unless employees understand how algorithms are used and how their work is measured, these systems will create problems, he said.

The legislature didn't adopt the Massachusetts bill this session, which ended in August, but it could be reintroduced.

"The best way to preserve trust," O'Keefe said, "is to make sure that the employees understand how their performance is being monitored."


How AI systems gather employee monitoring information is opaque, said Hatim Rahman, an assistant professor of management and organizations at the Kellogg School of Management at Northwestern University, who believes legislation similar to the proposed Massachusetts bill will be needed.

Performance measuring systems can't be separated from employee monitoring, according to Rahman, whose book, Inside the Invisible Cage: How Algorithms Control Workers, was published last month by University of California Press.

"For these systems to work, to collect the data, they have to be monitoring in one way or another," he said.

For software engineers, employee monitoring software could measure how many lines of code they produce or how quickly they respond to messages, Rahman said.

Worker experiences aren't easily measured

However, Rahman sees this as a form of data cherry-picking with a limited view of employee performance. He said employees might not have a clear path for recourse or accountability in how they are measured.

"Even the most sophisticated algorithms have difficulty capturing the complexity of workers' experiences," Rahman said.

While employers sort out these issues, vendors are making tools to help them better understand their workforce.

Visier, a people analytics software maker, said this month that it has "enhanced" its corporate mission to help companies address productivity growth, among other issues. Its AI tools aim to help employers understand "Which employees are most and least productive and why?" and answer questions such as "How should we pay people to improve fairness, equity, as well as business impact?"

In an email response to questions from TechTarget Editorial, Paul Rubenstein, Visier's chief people officer, said he sees AI as capable of producing a fairer measure of employee performance.

"When the CHRO is asked, 'Who are the top performers?' they have to rely on a list of performance ratings -- a measure highly influenced by personal perspective and human bias," he said.

But that's different from how employees think about their performance, Rubenstein argued. At the end of the day, an employee counts up Jira tickets closed, sales made in Salesforce, calls taken and tracked in ServiceNow, and meals served and tracked in the point-of-sale system, he said.

Long before AI, Rubenstein said, employees knew how their supervisors appraised them.

"The AI tools being added to support these employees aren't creating new metrics -- they empower employees to achieve those metric targets more easily," Rubenstein said.

Patrick Thibodeau is an editor at large for TechTarget Editorial who covers HCM and ERP technologies. He's worked for more than two decades as an enterprise IT reporter.
