Israel Acknowledges Use of Computer Algorithm For Palestinian Arrests
Mass arrests of over 400 Palestinians were the result of a computer algorithm based on their social media usage, explains TRNN’s Shir Hever
SHARMINI PERIES: It’s The Real News Network. I’m Sharmini Peries, coming to you from Baltimore.
Israeli authorities have recently acknowledged that, in the past few weeks, they have arrested 400 Palestinians who committed no crime, but were identified by a computer program as "likely to commit a crime in the future."
It is also the second week of a hunger strike by Palestinian political prisoners in Israeli jails who are protesting prison conditions and overcrowding. According to the Palestinian prisoners' rights organization, Addameer, there are currently 6,300 Palestinian political prisoners, including 500 administrative detainees who are held without charges, 61 female prisoners, and 300 child prisoners. These numbers do not include the 400 I mentioned above.
Last week, we spoke with Sahar Francis, the director of Addameer, in an interview with The Real News, and she talked not only about the hunger strike, but also about the mass arrests. Here is what she had to say.
SAHAR FRANCIS: Actually, these mass arrest campaigns were announced more than two years ago under the argument that all these people are involved in incitement. This has become the trend in the last couple of years, arresting people for their social media activities, whether publishing things on Facebook, or Twitter, or on other social media. And they are arresting lots of activists on the ground, peaceful activists who are joining the demonstrations against settlements, and the wall, and so on.
SHARMINI PERIES: According to the Israeli police and the Israeli Security Agency, ISA, they are already arresting Palestinians if they suspect that they might commit a crime, although they have not yet done so. Over 400 Palestinians were arrested after a computer algorithm was used to analyze their behavior patterns.
To further explain what is going on here, I'm joined by Shir Hever. Shir is a Real News correspondent in Heidelberg, Germany. Shir previously lived in Israel, where he worked with the Alternative Information Center.
Thanks for joining us today, Shir.
SHIR HEVER: Thanks for having me, Sharmini.
SHARMINI PERIES: Shir, what is this algorithm? And how do the Israeli police and the secret police know when someone is about to commit a crime?
SHIR HEVER: Of course, we don't know exactly what the program is or how it works. But a series of Israeli companies have developed this kind of algorithm. And the idea is that they track social media, but not only social media, also other information about the individual, in order to try to predict their behavior.
And the point is that if somebody has lost a relative, for example, to an attack by the Israeli military or by colonists, or if somebody has just lost their job, they’re considered now to be a security risk. Because maybe they will be frustrated and maybe they will seek revenge.
So, this creates a situation in which people are doubly punished. For example, you've lost a sibling to Israeli aggression, and then suddenly you are arrested because you might consider committing a crime. And, therefore, people are punished twice.
SHARMINI PERIES: Right. And, Shir, why would the Israeli police and secret police even admit to arresting innocent people?
SHIR HEVER: This is a major part of the story. The story was actually uncovered by John Brown, an Israeli journalist who covers human rights violations by the Israeli authorities. But this is not his real name. It's a pseudonym. He prefers to remain anonymous. His sources are actually Israeli police and secret police operatives, who are willing to admit that they are arresting people based on predictions of their future behavior, because this creates a business opportunity for these companies. The companies providing these algorithms are not just selling them to the Israeli police; they are also trying to export them to other governments.
So, the point is, if they can show that they are successfully preventing terrorism by arresting these people, then they will have better sales for their program.
SHARMINI PERIES: Shir, in an age of widespread terrorism and hyper-national security concerns, isn't it actually a good thing to be able to detect that a crime is being planned, or is underway, and to use the technology you have to pre-emptively stop the process by arresting the people involved? Is that what's going on?
SHIR HEVER: Well, I don’t think it’s a good thing. I think it’s a basic violation of the right of human beings to be considered innocent until proven guilty. And these programs, they don’t focus on the situations that create violence. They focus on the individuals. So, rather than saying maybe 50 years of military occupation is causing people to rise up in anger and to fight against occupation, and that causes violence, instead the program focuses on individuals and says, oh, this person who just lost their job, who just lost a family member to Israeli aggression, that person might pose a security risk. And they will be punished without having actually committed a crime.
In the long term, this creates a lot more resistance, a lot more violence, and prevents any chance of reconciliation.
SHARMINI PERIES: And, Shir, are the streets of Israel safer as a result of the usage of this kind of technology?
SHIR HEVER: Absolutely not. The Israeli officials, and the Israeli journalists who are reporting on this new algorithm, are trying to promote these companies. But they also have to admit that the attacks continue. Palestinians are not willing to sit tight and remain under occupation. Because the level of frustration is very high, this now being the 50th year of military occupation in Palestine, a lot of Palestinians do resort to violence. And there have been a series of attacks, even following this massive wave of arrests that we're now talking about, which proves that when somebody simply decides to act violently, these algorithms have no success in predicting it.
SHARMINI PERIES: And at this moment, as we mentioned earlier, there is a hunger strike of Palestinian prisoners that is getting worldwide attention. So why are the Israeli authorities stuffing their prisons with more prisoners? Aren't they making the hunger strike even stronger, and drawing more international attention to what's going on there?
SHIR HEVER: Well, absolutely. The Israeli authorities are not acting in a rational manner. As a colonial power, they are trapped in a sort of power struggle with the Palestinians. They have to keep showing and proving to the Israeli public, and to the Palestinians, and to themselves, that they are able to always act any way they want. Even if it doesn’t make sense or even if it’s not the best course of action.
That means that if the Palestinians choose to protest the occupation, the Israeli authorities have to prove that they can crush these protests. And this year, the 50th year of Israeli occupation, is a very important milestone, and a lot of Palestinians are indeed protesting.
And, actually, they are doing that in a very arbitrary, very heavy-handed manner. And by increasing the prisoner population just as this major hunger strike enters its second week, the Israeli authorities are certainly making themselves look bad. But they have no choice. They are trapped in this course of action.
Any kind of concession, any kind of attempt to defuse the situation would be seen as weakness by the Israeli public.
SHARMINI PERIES: Right. It is not unusual for Israeli authorities to arrest Palestinians and hold them for extended periods of time without trial; it's going on now. Does it matter if it is based on an algorithm of this sort, if they're doing it anyway?
SHIR HEVER: It matters because the very pretence that there is rule of law has been dropped. Rather than saying that these people are guilty of incitement, which was the argument so far, now they're not even bothering to accuse them of incitement. They're just arresting them for future crimes. So, a line has been crossed.
But, I think more important than that is the issue of how it affects people outside of Israel and Palestine. One of the companies developing these kinds of algorithms is called Faception. It developed technology that is somehow supposed to identify terrorists by their behavior. Faception claimed that terrorism runs in people's DNA, and that by the shape of their face, it can tell in advance whether somebody is going to be a terrorist or not.
These kinds of companies are hawking their technology, and they get the Israeli authorities to use it in order to arrest Palestinians. Right now, they have no success, because this technology is not succeeding in actually preventing resistance by Palestinians, or violent acts by individual Palestinians.
But they only need it to work once. Because if, at some point, they arrest a lot of Palestinians and violence then goes down, they will be able to claim that this is thanks to their technology, and they'll be able to sell that technology to customers in Europe, in America, and in other countries around the world. And that's exactly the risk, because in the end, it's people all around the world who would be subject to that kind of algorithm, which might get them arrested without their having done anything.
SHARMINI PERIES: Shir Hever, I thank you so much for joining us today.
SHIR HEVER: Thank you, Sharmini.
SHARMINI PERIES: And thank you for joining us here on The Real News Network.