Microsoft is in the spotlight once again, not for its games and devices but for how it handles loss in the age of AI. In the wake of layoffs that hit several Xbox studios, including Arkane Austin and Tango Gameworks, Matt Turnbull, Executive Producer at Xbox Game Studios Publishing, shared a LinkedIn post making the "well-meaning suggestion" that the people affected turn to AI for help.
He seemed to genuinely believe this was helpful. The post was removed shortly thereafter. Intentions aside, the message, though meant to support, drew ire for its tone-deafness. The backlash revived larger conversations about how tech companies navigate real loss and emotion, both through the technologies they build and the strategic choices they make.
Layoffs Across Xbox Studios
Microsoft’s gaming division has recently gone through a significant reorganization, resulting in many studio closures and massive layoffs. This was part of Microsoft’s larger goal of streamlining its business after it acquired Activision Blizzard for $69 billion.
The affected studios include ZeniMax, Arkane Austin, and Tango Gameworks. Employees from both creative and technical teams were let go, breaking up game development pipelines and deeply hurting morale.
Using Copilot and ChatGPT as Coping Mechanisms
As reported by the BBC, Xbox executive Matt Turnbull suggested that laid-off staff use AI tools like Copilot and ChatGPT for emotional support, sparking public backlash. He encouraged affected employees to use the tools not only for emotional support but also for career-related tasks such as resume building, job searches, and networking.
“These are challenging times, and if you’re navigating a layoff or even quietly preparing for one, you’re not alone and you don’t have to go it alone,” Turnbull said. “No AI tool is a replacement for your voice or your lived experience. But at a time when mental energy is scarce, these tools can help get you unstuck faster, calmer, and with more clarity,” as reported by The Verge.
Cold Comfort or Useful Innovation?
The reaction was mixed. Some saw value in a practical, modern idea. Others considered it insensitive and tone-deaf, particularly coming from the company that had orchestrated the layoffs.
Much of the criticism centered on the idea of processing emotion through AI, with critics arguing that human connection and comfort are essential in times of crisis. Others noted the irony of a company offering software instead of support at a moment when employees were mourning the loss of their teams and their careers.
This tension between automation and emotional experience is playing out across industries. In sports, China’s use of AI referees has raised similar debates about whether technology enhances fairness or erodes human connection.
Microsoft’s Broader Context: Aftermath of Activision
Microsoft closed studios as part of the restructuring that followed its Activision acquisition, telling employees that it needed to "free up money" for its long-term vision.
Microsoft also told staff that the changes reflected a greater emphasis on efficiency and the company's future identity. The emotional reactions of the displaced teams suggest that the human element of change was not a primary consideration. As ScreenHub reported, analysts said the Xbox layoffs were symptoms of deeper problems in Microsoft's monetization and investment strategies, compounded by the financial weight of its aggressive studio acquisitions.
Microsoft's restructuring is part of a broader company-wide process: as IndiaTimes reported, Microsoft intends to lay off up to 9,000 more employees while pursuing artificial intelligence investments, underscoring the tension between pursuing innovation and making impactful changes to the workforce.
Can AI Understand Grief?
According to Gadgets360, Arkane Studios founder Raphaël Colantonio weighed in on X (formerly Twitter) on Saturday: "Why is no one talking about the elephant in the room? Cough cough (Gamepass)."
Similar concerns are growing in healthcare, where AI diagnostic tools are prompting questions about accuracy, empathy, and medical trust.
The Moral Dimensions of Empathy in an AI World
This event exposes a broader tension at work: how companies negotiate technological implementations with authentic human empathy. As AI and similar tools integrate into work, we must assess their impact on emotional and ethical decision-making processes.
While Copilot and ChatGPT offer temporary benefits, they cannot substitute for genuine connections during grief and mental distress.
I believe technology should support humans, not replace them. AI can help organize thoughts and draft messages, but when it substitutes for empathy, human connection is lost. There are moments in a crisis when people need validation and support from real humans. AI for emotional support may seem efficient, yet it lacks the genuine care and respect essential to meaningful moments.
