The Weaponization of Things: Israel’s Techno-Violence, A Litmus Test for Technologists

Afsaneh Rigot / Nov 4, 2024

From Israel’s use of AI and facial recognition in Gaza to the remote detonation of pagers and walkie-talkies in Lebanon, this moment should be a massive wake-up call for anyone doing tech and society work: nothing will remain neutral, and the development of new technologies cannot continue as normal, writes Afsaneh Rigot.

Palestinians charge mobile phones and batteries using portable charging stations on a street in western Khan Younis, Gaza, on Tuesday, Oct. 31, 2023. Photographer: Ahmad Salem/Bloomberg via Getty Images

Israel’s weaponization of technology in its war on Gaza — and now in Lebanon and other parts of the region — has reached unprecedented levels. Both low-tech everyday technologies and high-tech emerging ones are being turned into weapons deployed with impunity.

What is happening in Palestine and Lebanon is consequential to everyone. If we are fearful of the potential harms of technology and emerging tech – or the “existential threats” they could pose – the only way to stop a fatalistic future is to recognize this most lethal weaponization of tech and push firmly back at any normalization. These are not sci-fi scenarios but very real violence that is happening now. In other words, the only way to challenge hypothetical threats is to understand the current violence – and invest in stopping it.

In the past year alone, we have seen the most dystopian uses of AI and facial recognition systems for automated killing and deadly drone attacks, vast disinformation campaigns on social media, and the remote detonation of low-tech pager devices. These are new thresholds for violence through technology.

In April, +972 Magazine reported that Israel uses an AI system called “Lavender” to automate military targeting. It scores Palestinians in Gaza on the basis of their suspected affiliation with Palestinian armed groups – often arbitrarily – in order to categorize them as potential targets. Everyday data such as call records, location data, social network connections, and biometric information feed into the score, and when a target meets the threshold, they are placed on a kill list. A companion tracking system, called “Where’s Daddy?”, then follows the target – often to home, where they may be with family – before a strike. The name itself is disquieting.

The list of oppressive tech is lengthy, historic, and harrowing. Israel has long used varying levels of technology for the systematic enforcement of its apartheid regime and occupation. The Palestine Laboratory, a 2023 book by investigative journalist Antony Loewenstein, chronicles how these systems have been tested on the Palestinian people for seven decades. However, the violence of these methods is never confined to Palestinians – they are then exported around the world.

What we are seeing today is not only how advanced systems are created and tested in a tech-facilitated genocide but also new methods in the weaponization of everyday things. In September, Israel’s pager and walkie-talkie attacks in Lebanon came amidst the use of more advanced tech on Palestinians. This was the use of low-tech, cheap, everyday technologies as weapons to kill and maim. At least 32 people, including at least four children and several hospital workers, were killed in the attacks, and more than 3,300 others were injured. In the aftermath, parents were afraid to turn on their baby monitors. People unplugged their televisions. Grown men wondered if the next time they went to light a cigarette, the lighter would explode.

“People think they’re going to lose something by talking about Palestine. But they should think about what they lose by not talking about Palestine.” — Ta-Nehisi Coates, author and journalist

Exploding pagers and the “Weaponization of Things”

For years, Hezbollah, whose members were the purported targets of the most recent pager attacks, had reverted to low-tech alternatives to avoid smartphones, believing those devices were vulnerable to surveillance and other forms of weaponization by Israel, which is known for the mass production and export of spyware technology.

Israel and its supporters framed it as a legal “precision attack,” justified because Hezbollah is designated a terror group and regional foe. But the United Nations High Commissioner for Human Rights and legal experts determined it was not precise but was, in fact, a war crime perpetrated with the intent to cause psychological fear and terror among civilians. This is techno-terror. Importantly, discussions of terrorism often ignore the possibility of state terrorism, but this attack is arguably an example of it. UN experts called it a “terrifying” violation of international law. Numerous international and regional NGOs have called for prompt and independent investigations and accountability.

This form of attack has now been tested on the Lebanese people, and some have even framed it as a “masterstroke,” greenlighting the concept that if a people or group are defined as terrorists, then anything goes – including damage inflicted on innocent bystanders. The fact that the assault was done in a “devastating and extravagantly public fashion” suggests the dawn of a new era for the use of these methods. It ushers in an age of techno-warfare on the masses through the weaponization of everyday technologies and the exploitation of vulnerabilities in international supply chains for violence and surveillance.

The psychological impact of using cheaper everyday technologies as weapons — the Orwellian feeling of control that the most mundane object can explode at any time — further alienates marginalized communities around the world from the tech they might need to survive.

Boomerangs and a reckoning for tech and society

“Believe us when we say we are a surveillance testing lab in every sense of the word.” — Marwa Fatafta, Palestinian policy analyst and digital rights expert

If Israel can inflict terror on Lebanese and Palestinian civilians at this scale — and have it applauded — why would it not do so again elsewhere? Even further, why wouldn’t other groups and states use weaponized everyday tech against their opponents, or even their own citizens, for control? If emerging technologies can be used for genocide and as human grading systems set by machines that kill, why would this automation of oppression not be used by other repressive states? Once something is proven technically possible, it will likely be used again.

If these attacks are normalized without proper, meaningful, independent investigations and demands for accountability, the precedent is sinister. As Daniel Levy, president of the US/Middle East Project and a former Israeli negotiator, recently warned: “The way this war is being conducted should terrify everyone in terms of what the future — which is here today for Palestinians [and Lebanese] — looks like.” The “Palestine Laboratory” has spawned a spyware economy and been used as political currency by Israel. Palestinians and global experts have been ringing alarm bells about the global risks of this new tech warfare and say it should be understood in line with a long history of invasive human rights abuse tactics. Tools and systems that started as military weapons, anti-terror tools and strategies, and “national security” systems are commercialized and used against general populations.

We can understand how to work and build against techno-terror by first understanding how the goalposts of what is deemed acceptable have been moved historically. Once the road is paved to use these tactics en masse, it becomes harder to challenge. History teaches us that. Some of the worst systems of oppression, surveillance, and abuse used on people daily come from sinister beginnings: colonial tactics, warfare, anti-terror “gray areas,” and white supremacy, to name a few. They often start as a violent system, tool, or tactic used against those deemed threats to hegemonies – usually the most marginalized, or those dehumanized by the power-holders of the time. Once “tested” on these groups, they are later used on the masses.

A version of this is seen in the “colonial, terror or imperial boomerang effect.” The term was originally coined to show that empires use their colonies as laboratories for methods of “counter-insurgency, social control and repression” — methods later used against their own citizens. We are now experiencing versions of this boomerang from the expanded powers and rights abuses seen in “counter-terrorism” campaigns. For decades, violations of international law and of human and civil rights have been made permissible through anti-terror laws and prevention tactics. The remnants of military, invasive, and violent tactics used in “anti-terror” law or military campaigns are woven into everyday state controls and abuses of power – think of the impact of changes post-9/11.

Through my work, I have seen this play out on groups regularly dehumanized in the mainstream, like queer people, prisoners, migrants and refugees, Black people, and Palestinians. They are often used as the initial “petri dish” for these violent systems. The systems are eventually exported as weapons or technologies to control everyday people in the “new normal” — from policing and immigration controls to mass surveillance systems and the use of tear gas and administrative detentions. Let’s not forget that US policing itself has roots in the crimes of slavery, and that its methods figure in the imperial boomerang. For years, US police forces known for brutality and racial profiling have “swapped” tactics and training with the Israeli forces in charge of enforcing apartheid systems.

Tech building cannot continue as normal

In our organization’s work at The De|Center, we investigate how technology evolves in various global contexts, support our communities, and push for tech (especially everyday communication technologies) that are built while centering the most marginalized and criminalized people (the decentered). I have been working in this field for a decade now. There is no way to improve the design of violent tech, such as those used militarily or for mass policing and surveillance, for the margins. Such tech can only be “bettered” by abolition. However, it is hard to think about how we can push for better technologies and digital security for communities when we are witnessing the heaviest, most sinister forms of communication tech weaponization.

The Washington Post reported that the devices used in Lebanon had a “two-step de-encryption procedure,” or a two-factor authentication feature, to ensure that the pagers were in the hands of the user when they detonated – with the aim of maiming hands and eyes. This feature is one that people working in tech, society, and privacy often recommend to our communities and activists as a safety measure. But this time, it was the weapon. How do we continue to recommend this, knowing what our communities have seen?

The pager attacks pose larger questions than the (albeit important) threat of supply-chain vulnerabilities: the weaponization of physical communication tools redefines the relationship between a tool or product and its user. All of our tech involves physical elements, and if devices can be manipulated into explosives with impunity, very little around us remains safe – regardless of whether the vulnerability is introduced through a compromised supply chain or the initial design. It is that impunity and normalization that create a bleak future.

Palestinians in Gaza have seen technology created for civilian purposes — such as Google Photos or cloud computing systems — used as real weapons of mass destruction. There have been accounts of people in Gaza who choose not to use WhatsApp for fear of being tracked by machines that might make them a target for Lavender — where “every phone call they make, every friendship they build, and every shop they visit can push their score up or down.” People cannot trust tools when they see them used to incite genocide and violence captured in images seen all around the world.

The chilling effect of these acts of terror is vast. At a time when people are at war and under bombardment, when they need their everyday tech to communicate with family and loved ones or to gather news for their safety, these escalations are a laceration of trust in both the digital and physical things they need the most.

This fear can lead people to use more unsafe options for their communications or to avoid digital communications altogether. In my own work alone, I have heard people from Europe to Africa, Latin America to the Middle East say they are worried about what tech to use and who to trust.

Ending the cycle of tech-destruction

While Israel hailed its walkie-talkie attacks as a success in its “precision” targeting of Hezbollah, the victims told a different story. The deadly war rages on in Lebanon, and there is the specter of a wider war with Iran.

We are witnessing what may come to be known as the first techno-genocide. Since October 2023, the war on Gaza has created an “apocalyptic” “hell on Earth,” with an estimated 42,000 killed, 52% of those identified being women and children. In February 2024, Francesca Albanese, the UN Special Rapporteur on the occupied Palestinian territories, stated that “the threshold indicating the commission of the crime of genocide [in Gaza]…has been met.” The number of those directly targeted by killer tech is unknown and will likely remain unknown for years to come.

The weaponization of communication technologies like pagers as bombs, or of social media as lethal grading systems, raises vital questions about the design and use of technology, particularly its potential for dual use. Developers and tech companies have an obligation to design and build in ways that prevent weaponization. First, every person or institution that works on “tech and society” must push for meaningful accountability and investigations into these sinister techno-terrors. Second, the technologists of conscience who continue to advance such work with these harms in mind must be supported and followed, even when doing so carries immediate repercussions. Finally, we must call for the abolition of the use of consumer technologies for warfare and an end to the genocide, the occupation, and the system of apartheid. Without these steps, no serious investigation into the societal harms of technologies and their impacts can continue.

Pushing back on the normalization of techno-terror is important for the lives of millions of people in the Middle East, but also for everyone on the globe. If we are to contend with any work related to “tech and society,” the boomerang must be grabbed and broken.

The author wishes to acknowledge the De|Center team for their contributions to the review of this post, including Roja Heydarpour. Thanks are also due to Marwa Fatafta, Dia Kayyali, Mahsa Alimardani, Reem Almasri, and Nina Nilofa.

Authors

Afsaneh Rigot
Afsaneh Rigot is the founder and Principal Researcher of the De|Center, advisor to the Cyberlaw Clinic at Harvard Law School and ARTICLE 19, as well as an affiliate at the Berkman Klein Center at Harvard University. As well as general privacy and security topics, Rigot's focus is law, technology, hu...
