Monday, December 23, 2024

Big Tech’s liability for terrorist content at stake in pending Supreme Court battle

The Daily Observer London Desk: Reporter John Furner

For Twitter, one man’s terrorist is another man’s freedom fighter.

And then there’s Donald Trump.

As the Supreme Court ponders Big Tech companies’ liability for their handling of terrorists who try to use their platforms, the justices were reminded of the words of one Twitter official who, in 2014, described the company’s reluctance to police its users.

“One man’s terrorist is another man’s freedom fighter,” the Twitter employee, who remained anonymous, told Mother Jones at the time, reflecting the company’s struggle with, and avoidance of, defining “terrorism.”

Iran’s Ayatollah Ali Khamenei, who has tweeted that Israel is “a malignant cancerous tumor” that must be “removed and eradicated,” remains on the platform. So does Afghanistan’s Taliban.

But Mr. Trump was ousted in the wake of the events of Jan. 6, 2021, for what Twitter called “incitement to violence.”

“It does seem odd they would feel justified in removing Donald Trump but not the leaders of groups that have been on the State Department system as foreign terrorist organizations,” said Max Abrahms, an expert in national security and terrorism at Northeastern University.

The freedom fighter quote came up during oral arguments at the Supreme Court on Feb. 22 in a case testing whether Twitter can be sued by the families of terrorism victims for failing to sift out terrorist content.

It was coupled with a similar case, with Google as the main platform, that delved into whether platforms are liable for the algorithms they use to promote content.

The tech companies insist they do try to cull terrorist content, though they admit they can’t be perfect.

Eric Schnapper, the lawyer who represented the families in the Twitter case, said it sometimes didn’t seem like the companies really were trying.

That’s when he cited the Twitter official’s comment, given in the 2014 interview, when asked why Twitter wasn’t taking down Islamic State content three months after ISIS killers executed two Americans.

The quote dates back at least to the 1970s and in the decades since has become a trite absolution of responsibility for trying to sort right and wrong.

Mario Diaz, a lawyer with Concerned Women for America, introduced the quote as part of a “friend of the court” brief filed in the tech cases. He said he was trying to puncture the tech companies’ contention that they are honest arbiters.

“They actively chose to ban the president of the United States in the aftermath of January 6 but refused to suspend known international terrorist accounts, even after being explicitly alerted to them,” Mr. Diaz said. “They are, therefore, knowingly, willingly and actively aiding and abetting these organizations in conducting terrorist attacks, as the families are alleging in this case.”

The justices didn’t dwell long on the quote.

Immediately after Mr. Schnapper raised it, Justice Ketanji Brown Jackson moved the lawyer on to talk about the intricacies of blame and how much assistance an enterprise had to give to a terrorist operation before becoming liable.

The tech companies say they aren’t liable for what users post on their platforms under Section 230 of the Communications Decency Act. Their opponents say that 1990s-era provision has been stretched beyond its breaking point, particularly when tech companies use algorithms to promote users’ content.

The case involving Twitter centers on liability under the Anti-Terrorism Act.

Twitter, in response to an email inquiry for this story, replied with a “poop” emoji. That has been the company’s standard reply to press inquiries since Elon Musk took over the platform in October.

Inquiries to Google and Facebook went unanswered.

The companies each scolded Mr. Trump in January 2021.

Twitter, for example, ousted the then-president for two posts on Jan. 8, one of which said his followers “will not be disrespected” and another that announced he wouldn’t attend the inauguration. Though the posts were perhaps innocuous on their own, Twitter said it read them “in the context of broader events in the country and the ways in which the president’s statements can be mobilized by different audiences, including to incite violence, as well as in the context of the pattern of behavior from this account in recent weeks.”

The company banned him for glorification of violence.

He has since been reinstated, following Mr. Musk’s takeover.

Figuring out who stays and who goes has always been a bit of an art.

In the heady days of social media early in the previous decade, the companies often took a hands-off approach — as the Twitter official’s quote to Mother Jones suggests.

As evidence mounted that ISIS was sustaining itself through online recruiting, though, the platforms began to take a more active role in cleansing themselves.

“Facebook, Twitter, Google and the others already have ‘a national security or terrorism unit’ doing a lot of work in these areas, and they also rely on lots of algorithms to help patrol and eliminate terrorist content,” said James Forest, a professor at UMass Lowell.

Twitter, for example, announced it had suspended nearly a million accounts linked to terrorist content.

In 2022, Twitter found itself under pressure to cancel accounts linked to Iran’s Islamic Revolutionary Guard Corps. At first it resisted, telling the Counter Extremism Project that @irgciran hadn’t violated its policies, even though the account had tweeted a threat to assassinate President Trump.

Eventually, the account was axed.

Yet several top Taliban officials remain active, sharing the oppressive regime’s doings in Afghanistan.

A spokesperson from the Counter Extremism Project said terrorist content has decreased on social media platforms since 2014, but there’s still much work to do.

In a report issued in January, the group found 813 links to extremist content in 2022.

“The companies did not suddenly become concerned that their platforms were misused by terrorists, only that it could cost them financially and legally,” the spokesperson noted. “Any progress was the result of public and advertiser pressure and the prospect of regulation from authorities.”

Jason Blazakis, a professor at the Middlebury Institute of International Studies at Monterey in California, said Congress should amend the laws to make clear where liability falls for tech companies, since the definition of terrorism varies from company to company and even between government departments.

“This inevitably results in subjective determinations,” he said. “This is clearly a problem, yet the problem is not one that social media companies can fix.”
