Microsoft has introduced a new internal feature that allows employees to confidentially report potential misuse of the company’s technology. The move comes amid growing criticism over the company’s alleged involvement in military surveillance operations.
The tool, called ‘Trusted Technology Review,’ has been added to Microsoft’s existing Integrity Portal. It enables employees to raise ethical or policy concerns related to the development or deployment of Microsoft’s products, functioning similarly to systems used for reporting workplace misconduct or security issues.
In a memo sent to staff on 5 November, Microsoft President Brad Smith said the new mechanism will ensure employees can report concerns anonymously and without fear of retaliation. The initiative reflects the company’s effort to enhance oversight and reinforce accountability in light of increasing ethical challenges in the technology sector.
The move follows months of public and internal pressure after reports linked Microsoft’s cloud and AI technologies to Israel’s Unit 8200 — a military intelligence division accused of surveillance operations during the Gaza conflict. Earlier this year, Microsoft confirmed elements of The Guardian’s investigation, saying it had found supporting evidence that some of its Azure services were used by Israel’s Ministry of Defense. The company has since disabled certain subscriptions tied to those activities.
Smith said the new reporting system is part of a broader push to strengthen Microsoft’s governance and human rights due diligence processes before entering contracts involving sensitive technologies. The company is now refining its pre-contract reviews to ensure that future partnerships undergo deeper ethical evaluation.
The update comes amid a wave of employee activism, including campaigns such as No Azure for Apartheid, which have demanded greater transparency around Microsoft’s defense and surveillance contracts. By formalising a reporting channel for ethical concerns, Microsoft aims to create an internal safeguard that identifies potential risks before they escalate.
While it remains to be seen whether the move will satisfy critics, the initiative marks a notable step toward embedding ethical accountability into Microsoft’s culture — signalling a shift from reactive damage control to proactive governance.