Twitter and Facebook have in recent years removed more than 150 fake personas and media sites created in the United States, Internet researchers Graphika and the Stanford Internet Observatory revealed last month. While the researchers did not attribute the fake accounts to the U.S. military, two officials familiar with the matter said that U.S. Central Command is among those whose activities are under scrutiny. Like others interviewed for this report, they spoke on the condition of anonymity to discuss sensitive military operations.
The researchers did not specify when the removals occurred, but those familiar with the matter said they took place within the past two or three years. Some, they said, were recent and involved posts from this summer advancing anti-Russia narratives that cited the Kremlin's "imperialist" war in Ukraine and warned of the conflict's direct impact on Central Asian countries. Notably, the researchers found that the fake personas, which used tactics employed by countries such as Russia and China, did not gain much traction, and that overt accounts actually attracted more followers.
Centcom, headquartered in Tampa, is responsible for U.S. military operations in 21 countries across the Middle East, North Africa, and Central and South Asia. A spokesman declined to comment.
Air Force Brig. Gen. Patrick Ryder, the Pentagon press secretary, said in a statement that the military's information operations "support our national security priorities" and must be conducted in compliance with relevant laws and policies. "We are committed to enforcing those safeguards," he said.
Spokesmen for Facebook and Twitter declined to comment.
According to the researchers' report, the deleted accounts included a fake Persian-language media site that shared content republished from the U.S.-funded Voice of America Farsi and Radio Free Europe. The report said another account was linked to a Twitter handle that in the past had claimed to operate on behalf of Centcom.
One fake account posted an inflammatory tweet claiming that relatives of deceased Afghan refugees had reported that bodies repatriated from Iran were missing organs, according to the report. The tweet linked to a video that was part of an article posted on a U.S. military website.
Centcom has not commented on whether these accounts were created by its employees or contractors. One defense official said that if the organ harvesting tweet turned out to be from US Central Command, it would “certainly be a violation of doctrine and training practices.”
Independently of the report, The Washington Post learned that in 2020 Facebook disabled fake personas created by Centcom to counter disinformation spread by China suggesting that the coronavirus responsible for covid-19 was created in a U.S. Army laboratory at Fort Detrick, Md., according to officials familiar with the matter. The fake profiles, active in Facebook groups that conversed in Arabic, Farsi and Urdu, were used to amplify truthful information from the U.S. Centers for Disease Control and Prevention about the virus's origins in China, the officials said.
The U.S. government's use of artificial social media accounts, though permitted by law and policy, has stirred controversy inside the Biden administration, with the White House pressing the Pentagon to explain and justify its policies. Several U.S. officials said the White House, agencies such as the State Department, and even some Defense Department officials were concerned that the policies were too broad, leaving room for tactics that, even when used to disseminate truthful information, risked undermining U.S. credibility.
"Our adversaries are absolutely operating in the information realm," said a second senior defense official. "There are some who think we shouldn't do anything clandestine in that space. Ceding an entire domain to an adversary would be unwise. But we need stronger guardrails."
A spokeswoman for the National Security Council, part of the White House, declined to comment.
Kahl disclosed his review at a virtual meeting convened by the National Security Council on Tuesday, saying he wants to know what types of operations have been carried out, who they are targeting, what tools are being used, why military commanders have chosen those tactics, and how effective they have been, several officials said.
The message was basically, “You have to justify to me why you’re doing these kinds of things,” the first defense official said.
Pentagon policy and doctrine discourage the military from promoting falsehoods, but there are no specific rules mandating the use of truthful information for psychological operations. For example, the military sometimes uses fiction and satire for persuasion purposes, but in general the messages are supposed to stick to the facts, officials said.
In 2020, officials at Facebook and Twitter contacted the Pentagon to raise concerns about fake accounts they were having to remove, suspecting they were linked to the military. That summer, David Agranovich, Facebook's director for global threat disruption, spoke to Christopher C. Miller, then the assistant secretary of defense for special operations/low-intensity conflict, who oversees policy on influence operations, warning him that if Facebook could identify the accounts, so could America's adversaries, several people familiar with the conversation said.
"His point was, 'Guys, you've been caught. That's a problem,'" one person said.
Before Miller could take action, he was tapped to head a different agency, the National Counterterrorism Center. Then the November election happened, and the Trump administration ran out of time to address the matter, even though Miller spent the final weeks of Donald Trump's presidency as acting secretary of defense.
With the rise of Russia and China as strategic competitors, military leaders wanted to respond, including online. Congress supported the effort. Frustrated by perceived legal obstacles to the Defense Department's ability to conduct clandestine activities in cyberspace, Congress in late 2019 passed a law affirming that the military can conduct operations in the "information environment" to defend the United States and to counter foreign disinformation aimed at undermining its interests. The measure, known as Section 1631, allows the military to carry out clandestine psychological operations without encroaching on what the CIA has claimed as its covert-action authority, easing some of the friction that had previously impeded such operations.
"Combatant commanders were really excited," the first defense official recalled. "They were very eager to utilize these new authorities. The defense contractors were equally eager to land lucrative classified contracts to enable clandestine influence operations."
At the same time, the official said, military commanders were not trained to oversee “technically complex operations by contractors” or to coordinate such activities with other stakeholders elsewhere in the US government.
Last year, with a new administration in place, Agranovich tried again. This time he took his complaint to President Biden's deputy national security adviser for cyber, Anne Neuberger. Agranovich, who had served on the National Security Council during the Trump administration, told Neuberger that Facebook was taking down fake accounts because they violated the company's terms of service, according to people familiar with the exchange.
The accounts were easily detected by Facebook, which has strengthened its ability to identify fake personas and sites since Russia's campaign to interfere in the 2016 presidential election. In some cases, the company removed profiles that appeared to be linked to the military and that promoted information fact-checkers deemed false, a person familiar with the matter said.
Agranovich also spoke to officials at the Pentagon. His message was: "We know what the Department of Defense is doing. It violates our policies. We will enforce our policies," and so "the Department of Defense should stop it," a U.S. official briefed on the matter said.
In response to the White House's concerns, Kahl ordered a review of Military Information Support Operations, or MISO, the Pentagon's term for psychological operations. A draft concluded that policies, training and oversight all needed tightening, and that coordination with other agencies, such as the State Department and the CIA, needed strengthening, according to officials.
The review also found that while there have been instances in which false information was pushed by the military, they resulted from inadequate oversight of contractors and deficient training of personnel, not from systemic problems, officials said.
Two officials said the Pentagon's leadership did little with the review until Graphika and Stanford published their report on Aug. 24, prompting a flurry of news coverage and questions for the military.
The State Department and the CIA were alarmed by the military's use of clandestine tactics. The first defense official said State Department officials admonished the Defense Department: "Don't amplify our policies with fake personas, because we don't want to be seen as manufacturing a phony grass-roots effort."
One diplomat described it this way: "Generally speaking, we shouldn't be employing the same kinds of tactics our adversaries use, because the bottom line is we hold the moral high ground. We are a society built on a certain set of values. We promote those values around the world, and when we use tactics like these, it undermines our argument about who we are."
Psychological operations to promote American narratives abroad are nothing new for the military, but the worldwide popularity of Western social media has led to an expansion of tactics, including the use of artificial personas and synthetic imagery, sometimes called "deepfakes." The rationale is that opinions voiced by what appears to be, say, an Afghan woman or an Iranian student may be more persuasive than the same message pushed openly by the U.S. government.
Officials said the majority of the military’s influence operations are public, promoting US policies in the Middle East, Asia and elsewhere in its own name. They said there are good reasons to use covert tactics, such as trying to infiltrate a closed terrorist chat group.
A key question for senior policymakers now is whether the military's clandestine influence operations are producing results. "Is the juice worth the squeeze? Does our approach really have the potential to deliver the return on investment we hoped for, or does it just create more challenges?" said one person familiar with the discussions.
The Graphika and Stanford report notes that the covert activity had little impact. It observed that the "vast majority of posts and tweets" reviewed received "no more than a handful of likes or retweets," and that only 19 percent of the fake accounts had more than 1,000 followers. "Frankly," the report stated, "the most-followed assets in the data provided by Twitter were overt accounts that publicly declared a connection to the U.S. military."
Such tactics must be employed judiciously, said Michael Lumpkin, a former senior Pentagon official who handled information operations policy and a former head of the State Department's Global Engagement Center. "Otherwise we risk making more enemies than friends."
Alice Crites contributed to this report.