
‘Jawboning’ politicians who push to delete social media posts draw scrutiny

In the weeks after the Jan. 6 attack on the U.S. Capitol, activists gathered outside the Houston home of Sen. Ted Cruz to protest his refusal to certify President Biden’s election victory — as well as his decision to jet off to Cancún on vacation during a deadly winter storm in Texas.

As images of the protesters rippled across Twitter (now X), Cruz’s team called and texted people in Twitter’s D.C. office, insisting that some of the posts threatened his safety and demanding that they be removed, according to people familiar with the matter who spoke on the condition of anonymity to discuss private deliberations.

“We would hear very frequently from [Cruz’s] office,” said one former employee. “They were one of the more frequent congressional offices to complain.”

For years, politicians like Cruz (R-Tex.) have tapped private contacts at social media firms to influence a range of decisions, from deleting a specific post to changing policies around hate speech, voter suppression and public health misinformation, according to more than a dozen people familiar with the tech companies’ operations, many of whom spoke on the condition of anonymity to discuss internal matters.

The practice has become so routine that it even has a nickname — “jawboning” — and tech companies have responded by establishing internal systems to ensure that influential users receive prompt responses, the people said. Those internal rules also help guard against such requests exerting undue influence, they added.

Now, the Supreme Court is set to decide whether politicians’ attempts to influence the tech giants violate the First Amendment, defining for the first time the constitutional bounds of the practice. On Monday, the justices are scheduled to hear oral arguments in Murthy v. Missouri, a landmark case that could shape the future of free speech on social media.

The case was initiated by Republican attorneys general in Louisiana and Missouri, who sued the Biden administration, alleging its communications with platforms urging the removal of posts containing misinformation about the pandemic and elections amounted to illegal censorship. The Justice Department is defending the Biden administration, arguing that the Constitution permits the use of the bully pulpit to protect the public.

“This case has potential to really reshape the rules of the road here,” said Daphne Keller, who directs the program on platform regulation at Stanford’s Cyber Policy Center and is a former associate general counsel for Google. “It’s the fundamental question of how we govern what speech is and isn’t allowed on platforms and what information they’re allowed to use.”

While the case focuses on the Biden administration, politicians from both parties frequently leverage relationships to try to remove unfavorable posts, the people said. In one instance, the office of former House speaker John A. Boehner, a Republican from Ohio, asked Twitter to remove a post circulating his wife’s phone number. Twitter ultimately declined after staffers reviewed the tweets and found that Debbie Boehner, a real estate agent, advertised the number prominently on her own website, one of the people said. Neither Boehner nor Cruz responded to requests for comment.

Still, a legal movement has arisen to challenge what many conservatives allege is a vast liberal censorship regime. House Republicans led by Rep. Jim Jordan (Ohio) are investigating how tech companies handle requests from Biden administration officials, demanding thousands of documents from internet platforms. Conservative activists also have filed lawsuits and records requests for private correspondence between tech companies and academic researchers studying election- and health-related conspiracies.

“We have uncovered substantial evidence that the Biden administration directed and coerced Big Tech companies to censor Americans’ free speech,” Jordan spokeswoman Nadgey Louis-Charles said in a statement.

The legal campaign has blunted coordination as the 2024 election looms. Federal agencies have stopped sharing information with some social networks about foreign disinformation campaigns, shutting down a line of communication opened after revelations of Russian interference in the 2016 election.

Tech industry executives and civil society groups say the case now before the Supreme Court requires a nuanced review, especially as the evolution of artificial intelligence presents new disinformation risks in a critical election year. Alex Abdo, litigation director of the Knight First Amendment Institute at Columbia University, which filed a brief in support of neither party, urged the court to clarify the constitutional line between coercion and persuasion.

“The government has no authority to threaten platforms into censoring protected speech, but it must have the ability to participate in public discourse so that it can effectively govern and inform the public of its view,” he said.

During the Obama administration, Facebook, Google and other tech juggernauts were the darlings of Washington. Silicon Valley employees would often weave in and out of Capitol Hill offices showing congressional staffers how to use their platforms. But in August 2014, a video of journalist James Foley being executed by ISIS circulated on YouTube, Twitter and other services — and the relationship grew complicated.

As ISIS increasingly used the tech platforms to recruit new members, Lisa Monaco, now deputy attorney general, and other Obama aides pushed companies to combat terrorist content. The companies complied, breaking with prior practices. After months of internal deliberation, Twitter announced a plan to fight violent extremism, removing accounts suspected to have ISIS ties. YouTube also invested in detecting and taking down terrorist videos.

Tech companies deepened their relationships with government and law enforcement following revelations of Russian interference, sharing findings on how foreign operatives, terrorists and extremists were using the internet to mislead people. When the pandemic hit and social media became a hot spot for conspiracies, public health officials kept social media companies updated on the latest developments.

As Washington policymakers increasingly scrutinized social media, they more frequently sought to influence the companies’ decisions.

“Both parties do it,” said Nu Wexler, a former congressional aide who also worked at Google, Meta and Twitter. “A lot of them are at war with political opponents on social media. They think their access to social media companies will help them get their opponents suspended.”

In response, tech companies developed systems to handle the deluge of requests. Meta lobbyists and staffers sent complaints about social media posts from politicians and other high-profile figures to an email alias for an expedited review. Meta declined to comment.

Before Elon Musk’s takeover, Twitter largely prohibited lobbyists or advertising reps — who might have connections to politicians — from deciding whether a tweet should be removed or left up. Instead, those employees would send those requests to the trust and safety team responsible for content moderation, the people said.

“I never felt pressured by the FBI or the White House because I wasn’t … dealing with them,” said Anika Collier Navaroli, a senior fellow at the Tow Center for Digital Journalism at Columbia University and a former senior Twitter policy official.

In 2021, as the Biden administration urged Americans to get the coronavirus vaccine, the White House and federal public health officials bickered with tech companies about how their actions might impact the push, according to documents publicly released through the Murthy v. Missouri case, House Republicans’ probe and X owner Elon Musk’s Twitter Files. The White House referred The Post to the Justice Department’s brief.

Soon after Inauguration Day in January 2021, then-White House staffer Clarke Humphrey pressed Twitter to remove a tweet by anti-vaccine activist Robert F. Kennedy Jr. linking baseball player Hank Aaron’s death to coronavirus vaccines. The tweet remains up.

Former White House staffer Rob Flaherty questioned why Meta was hosting a video of conservative talk show host Tucker Carlson voicing skepticism about the vaccine. A Meta employee, whose name is redacted in court documents, responded that the post didn’t violate company rules and that the company had limited its spread. After the employee didn’t respond to a slew of follow-ups for two days, Flaherty shot back: “These questions weren’t rhetorical.”

These tense conversations appeared to have an impact on some company policies. In an email exchange, Meta global affairs president Nick Clegg questioned why Meta was removing claims that the coronavirus was “man made.”

“Because we were under pressure from the administration and others to do more,” a Meta employee responded in the July 2021 exchange.

That same month, the White House said it was reviewing policies to hold tech companies responsible for misinformation, including amending tech companies’ prized legal shield, Section 230 — an idea Biden had floated as early as 2020. Humphrey and Flaherty did not respond to requests for comment.

These emails — along with thousands of messages between Biden administration officials and social media companies — are included in the record of the Supreme Court case, in which the states argue that the White House, the FBI, the Centers for Disease Control and Prevention and other federal offices coerced social media companies into taking down users’ posts.

The state attorneys general argue these sometimes contentious conversations show federal officials violated the First Amendment, which prohibits the government from infringing on private speech or punishing people for expressing different views.

Justice Department lawyers say the exchanges show the federal government educating the tech platforms about posts they thought were causing “preventable deaths,” arguing that the companies were free to make their own decisions. They say the state attorneys general failed to show the government tied regulatory threats to specific content moderation decisions.

“There’s a very clear line between education and coercion. I think the question is where exactly do courts draw that line?” said Matt Perault, director of the Center on Technology Policy at the University of North Carolina at Chapel Hill and a former Meta policy official.

In July, a federal judge in Louisiana sided with the state attorneys general, issuing a sweeping injunction that limited how thousands of employees across a wide range of government departments and agencies could communicate with the tech companies. In September, the U.S. Court of Appeals for the 5th Circuit narrowed that order to the White House, the surgeon general’s office, the CDC and the FBI.

As the case heads to the Supreme Court, there are early indications of how some justices view these issues. In October, the three most conservative justices dissented when the majority temporarily allowed the Biden administration to resume communications with social media companies while the litigation continued.

Justice Samuel A. Alito Jr., joined by Justices Clarence Thomas and Neil M. Gorsuch, called the majority’s decision to block a lower-court ruling against the Biden administration “highly disturbing,” saying that “government censorship of private speech is antithetical to our democratic form of government.”

Perault and other experts said the Murthy v. Missouri case has convinced many in the tech industry of the need for clearer rules governing contacts from government actors. One idea that has gained traction is publicly registering complaints from officials and politicians.

Such openness might have been revelatory back in September 2019. That’s when the Trump White House asked Twitter to remove a tweet by celebrity Chrissy Teigen calling former president Donald Trump “a p—- a— b—-.” The company declined, said Navaroli.

“I think that there are genuine conversations that should be had about the role of the American First Amendment,” Navaroli said in an interview. But there is this “theory out there that social media companies were being coerced into taking down content. It’s literally just not been proved and the information that we have that’s out there has said the exact opposite.”

This post appeared first on The Washington Post
