Future Tech

Microsoft Dynamics 365 called out for worker surveillance

Tan KW
Publish date: Wed, 31 Jul 2024, 04:49 PM

Microsoft Dynamics 365 provides "field service management" that allows customers to monitor mobile service workers through smartphone apps - allegedly to the detriment of their autonomy and dignity.

According to a case study from Cracked Labs - an Austrian nonprofit research group - the software is part of a broader set of applications that disempowers workers through algorithmic management.

The case study [PDF] summarizes how employers in Europe actually use software and smartphone apps to oversee field technicians, home workers, and cleaning staff.

It's part of a larger ongoing project helmed by the group called "Surveillance and Digital Control at Work," which includes contributions from AlgorithmWatch, Jeremias Adams-Prassl, professor of law at the University of Oxford, and trade unions UNI Europa and GPA.

Mobile maintenance workers used to have a substantial amount of autonomy when they used basic mobile phones, the study notes, but smartphones have allowed employers to track what mobile workers do, when they do it, where they are, and gather many other data points.

The effect of this monitoring, the report argues, is diminished worker discretion, autonomy, and sense of purpose due to task-based micromanagement. The shift has also accelerated and intensified work stress, with little regard for workers' capabilities, differences in lifestyle, and job practices.

It's not just Microsoft

The case study looks specifically at Microsoft Dynamics 365, but observes that there are many field service management applications - from vendors like Oracle, SAP, Salesforce, IFS (Sweden), Nomadia (France), OverIT (Italy), Praxedo (France), ServiceMax (US), and ServiceNow (US) - which are very similar.

Field service management is one part of Dynamics 365, a cloud-based enterprise system for customer relationship management and enterprise resource planning, the report explains.

The software can be used for managing various types of work and provides tools for communication, time and location tracking, billing, and management of information about customers, equipment, and materials. It allows employers to manage and automate the scheduling and dispatch of workers, work orders, and service tasks.

Wolfie Christl, a public interest researcher with Cracked Labs and author of the report, told The Register that one of the issues with the sort of algorithmic management enabled by Dynamics 365 is that it facilitates intense performance monitoring.

"This can clearly be used to pressure workers to accelerate work and so on," Christl explained, noting that there are different approaches to workplace monitoring in different regions. "For example, in Austria and Germany, you need an agreement with the employer. The employer needs an agreement with workers and with the works council to implement this kind of individual-level performance rating."

In the US, he noted, the situation is different because there is less consensus that this sort of monitoring should be subject to discussion before being permitted.

Even more problematic, Christl argued, is the way such systems are being used to micromanage work - telling workers where they should go, which client they should visit, when they should arrive, the tasks they should perform, and the target time for each task.

"This clearly, from my perspective, affects autonomy," declared Christl. "It affects work discretion. And there can be many things that can go wrong because, if the system does not really accurately digitally represent all the steps and the tasks that should be done, then workers have the problem how they handle it with the system."

According to Christl, Dynamics 365 lets employers predict how long it takes to complete specific work based on a worker's past activities and feedback from AI models.

"The analysis may, for example, suggest that a particular client, region, weekday, task, or worker will likely increase the time required to carry out the work," he suggested. "The example report shows how the system accuses a particular worker named 'Bob Kozak' of being slower than expected."
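The predictive feature Christl describes amounts to attributing deviations in task duration to factors such as the worker, weekday, or region. The following is a minimal, hypothetical sketch of that idea only; it is not Microsoft's model, and the workers and figures are invented for illustration:

```python
# Illustrative sketch of "predictive work duration" analysis -- NOT
# Microsoft's implementation. It computes a baseline task duration from
# (invented) historical records, then attributes average deviations from
# that baseline to a single factor, such as which worker did the job.
from collections import defaultdict
from statistics import mean

# Hypothetical history: (worker, weekday, minutes taken to complete task)
history = [
    ("Alice", "Mon", 60), ("Alice", "Tue", 62),
    ("Bob",   "Mon", 75), ("Bob",   "Tue", 78),
]

# Overall average duration across all recorded tasks
baseline = mean(minutes for _, _, minutes in history)

def factor_effect(index):
    """Average deviation from the baseline for each value of one factor
    (index 0 = worker, index 1 = weekday)."""
    groups = defaultdict(list)
    for record in history:
        groups[record[index]].append(record[2] - baseline)
    return {key: mean(devs) for key, devs in groups.items()}

# A positive effect means that factor value is associated with tasks
# taking longer than the baseline -- the kind of per-worker flag the
# report's "Bob Kozak" example describes.
worker_effect = factor_effect(0)
```

With the toy data above, the baseline is 68.75 minutes and "Bob" shows a positive deviation, which is exactly the sort of individual-level inference the report criticizes.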

The report notes that Microsoft advises customers against using this data for personnel actions.

"Microsoft emphasizes in the documentation that its 'predictive work duration' system is 'not intended for use in making, and should not be used to make, decisions that affect the employment of an employee or group of employees, including compensation, rewards, seniority, or other rights or entitlements.'"

It also observes that Microsoft has integrated its generative AI technology, known as Copilot, into its field management system - for doing things like summarizing information related to a particular work order.

"Dispatchers are told to 'review' the summary before 'using' it in order to 'ensure AI-generated content is accurate, complete, and appropriate,'" the report explains. "Copilot also offers to automatically create draft work orders based on the contents of emails with customer requests."

Show a little respect

Saiph Savage, assistant computer science professor at Northeastern University and director of the Northeastern Civic AI research lab, told The Register that she sees a problem with the way these sorts of tools enforce specific work patterns without respect to worker abilities and cultural norms.

"They talk a lot about how the employer can see how much time workers are taking for specific tasks, how much time they're expected to take. And if you think about it, all of that is assuming that people have specific types of work patterns," she explained.

"It's not considering, for example, that certain cultures simply have different types of time management. For example, some cultures are more monochronic, where they're doing one task at a time. Others are polychronic, where they're doing multiple tasks at a time.

"I think the problem with this is that it assumes that everyone has to behave similarly," Savage opined. "I'm also worried about people with disabilities or older adults who are completing the work. It might simply take them slightly longer to complete the task. And it still might be okay."

Savage also sees other problems with employee monitoring. "It also limits workers' creativity," she added. "It makes workers feel that they are not trusted, and it can create an us versus them dynamic."

Microsoft disputes that its software uses AI to make performance-based recommendations, contends that its software helps field service personnel do their jobs more effectively, and notes that it's up to employers to follow applicable laws related to monitoring and privacy.

"Field service workers travel to multiple locations servicing different products every day," a Microsoft spokesperson told The Register. "Dynamics 365 Field Service and its Copilot capabilities are designed to help field service workers schedule, plan and provide onsite maintenance and repairs in the right location, on time with the right information and workplace guides on their device to complete their jobs.

"Dynamics 365 Field Service does not use AI to recommend individual workers for specific jobs based on previous performance. Dynamics 365 Field Service was developed in accordance with our Responsible AI principles and data privacy statement. Customers are solely responsible for using Dynamics 365 Field Service in compliance with all applicable laws, including laws relating to accessing individual employee analytics and monitoring."

Asked about the report's claims that automated management tools restrict worker autonomy and personal dignity, Lili Cheng, corporate VP of business applications and platforms at Microsoft, told The Register: "That's one of our key goals, to uphold the dignity and experience of workers."

Cheng challenged the report's contention that its software supports performance-based job recommendations.

"There were certain things that we don't actually support at all in the product," she argued. "So we don't do things like recommend individual workers for specific jobs based on their performance. We're not ranking people. And that's really important to us just for responsible AI and those kinds of policies."

Cheng said field service work is often complicated and both employers and workers appreciate having the kind of information available through Dynamics 365 - particularly younger workers who have grown up using smartphones.

"So one of the things that we want to make sure is that we have, you know, good tools for the workforce so that they can do their jobs and that companies can actually retain and make that job experience better for workers," she explained.

Asked about concerns that algorithmic work management systems potentially discriminate against older workers and fail to account for different work patterns, Cheng replied, "Obviously, we look at all the responsible AI principles and we follow that with pretty much all the workplace tools we have." ®

 

https://www.theregister.com//2024/07/31/microsoft_dynamics_365_surveillance/
