According to a Guardian report, Israel is using Microsoft’s cloud services in its conflict with Palestine…


Israel’s war with Palestine has stretched beyond physical borders. According to reporting by The Guardian, Israel has been using sophisticated digital systems supported by Microsoft’s technology for surveillance, data processing and targeting operations.

Microsoft’s Azure cloud, along with Israeli AI tools like Lavender, is being used in warfare, and this has raised several ethical and humanitarian questions.


Mass Surveillance via Microsoft Azure

According to the investigation report, an Israeli military intelligence unit called Unit 8200 has been using a dedicated, segregated section within Microsoft’s Azure cloud to store enormous quantities of Palestinian phone calls since 2022.

It all began with a 2021 meeting between Microsoft CEO Satya Nadella and Unit 8200’s commander Yossi Sariel. The system in place allows the collection and storage of “millions of mobile phone calls made each day by Palestinians in Gaza and the West Bank.”

The cloud infrastructure provides “near-limitless storage”, and findings from the report show that the archive spans approximately 11,500 terabytes, the equivalent of roughly 200 million hours of phone calls.
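A quick back-of-the-envelope check (our own arithmetic, not a figure from The Guardian) suggests those two numbers are consistent: 11,500 terabytes spread over 200 million hours works out to roughly 57 MB per hour of audio, or about 128 kbps, a plausible bitrate for stored voice recordings. A minimal sketch:

```python
# Back-of-the-envelope check (our own arithmetic, not from The Guardian's report):
# does 11,500 TB plausibly correspond to ~200 million hours of recorded calls?

archive_bytes = 11_500 * 10**12   # 11,500 terabytes (decimal TB)
call_hours = 200_000_000          # reported equivalent in hours of audio

bytes_per_hour = archive_bytes / call_hours       # ~57.5 MB per hour of audio
bitrate_kbps = bytes_per_hour * 8 / 3600 / 1000   # ~128 kbps average bitrate

print(f"{bytes_per_hour / 1e6:.1f} MB per hour of audio")
print(f"{bitrate_kbps:.0f} kbps average bitrate")
```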

This surveillance data has directly helped in the preparation and planning of airstrikes in both Gaza and the West Bank.

Microsoft CEO Satya Nadella has denied any knowledge of what Israel’s military has stored in its cloud (Image Courtesy: Wikimedia Commons)

Microsoft’s Public Denial

The Redmond-based company has explicitly denied any wrongdoing, claiming it had no knowledge of the nature of the data Israeli officials intended to store in its cloud.

Under pressure from reports by The Guardian and other outlets in January this year, the tech giant launched a formal investigation into the matter. Microsoft claimed that the review had “found no evidence to date” that Azure or its AI products were “used to target or harm people” in the territory.

A Microsoft spokesperson claimed the company’s “engagement with Unit 8200 has been based on strengthening cybersecurity and protecting Israel from nation state and terrorist cyber-attacks”. “At no time during this engagement has Microsoft been aware of the surveillance of civilians or collection of their cellphone conversations using Microsoft’s services, including through the external review it commissioned.”


AI-Driven Targeting with Lavender

Israel’s use of AI is not limited to data storage. An earlier Guardian report from 2024 mentioned an AI-powered database called Lavender, which was being used in the war in Gaza to identify up to 37,000 potential targets based on suspected links to Hamas.

This, coupled with intelligence drawn from the phone-call repositories held in Azure, is being used to research and identify bombing targets in Gaza. Speaking to The Guardian, a Unit 8200 source claimed that when planning an airstrike on an individual in a densely populated area, the cloud-based system would be used to examine calls made by people in the immediate vicinity.

The sources also claimed that the use of the system had increased during the campaign in Gaza, which has killed more than 60,000 people, the majority of whom are civilians, including over 18,000 children.

Lavender is believed to have played a crucial role in the initial months of the conflict by producing a list of presumed militants and low-ranking operatives. After some internal refinement, the system reportedly achieved “a 90% accuracy rate when it comes to verified target recommendations”.


Ethical Questions & Corporate Accountability

The latest reports raise several ethical questions. Microsoft’s claimed ignorance and its belated investigation show how corporations can struggle, or choose to turn a blind eye, when their technology intersects with human rights.

The implications are severe. AI and cloud computing are streamlining warfare, but at what cost? The mass surveillance enabled by Azure, combined with AI-led targeting, is a direct attack on human rights. Critics have long called for tech companies to uphold their moral responsibilities when their products are drawn into armed conflict.

The Last Word

As Microsoft continues to look into its involvement in the conflict in Gaza, and as artificial intelligence continues to support and shape military outcomes in the region, the bigger question is about the ethical boundaries that tech companies need to draw.

The intersection of data, tech and war needs greater scrutiny. And accountability.


Adarsh hates personal bios, Chelsea football club and Oxford commas. When he's not writing, he's busy playing FIFA on his PlayStation.
