At a glance.
- Deepfakes, detection and data poisoning.
- Openness on Rossiya.
- Russian information operations related to the invasion of Ukraine.
Deepfakes, detection and data poisoning.
Deepfakes have typically been viewed as a deception technique: they fake a plausible image, video, or audio clip to persuade people to believe a lie. This is undoubtedly one of the risks, and it's commonly addressed through fact-checking, rumor control, and other forms of debunking. There are also technical tools, available or under development, that can help detect deepfakes. Researchers at the University of Tokyo this week announced work on a technique they call “self-blended images,” which they say trains detection algorithms better than training on existing, recognized deepfakes does. At the moment the technology works better with images than with videos. “Of course we want to improve this idea. It currently works best with still images, but videos can exhibit temporal artifacts that we can’t detect yet. In addition, deepfakes are usually only partially synthesized. We could also explore ways to detect fully synthetic images,” Engineering & Technology quotes lead researcher Toshihiko Yamasaki as explaining.
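The general idea behind this style of training data, as we understand it, is to blend an image with an altered copy of itself so a detector learns to spot generic blending artifacts rather than the quirks of any one deepfake generator. The sketch below is our own illustration of that idea, not the researchers' published method (which uses face landmarks, deformable masks, and richer augmentations); the function name and the simple elliptical mask are assumptions for the sake of the example.

```python
import numpy as np

def make_pseudo_fake(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Illustrative 'self-blended' pseudo-fake: blend an image with a
    mildly altered copy of itself, so the blend boundary carries the
    kind of artifacts a deepfake detector is trained to spot.

    `image` is float RGB in [0, 1] with shape (h, w, 3).
    """
    h, w, _ = image.shape
    # "Target" view: a color-jittered copy standing in for a swapped face.
    jitter = rng.uniform(0.9, 1.1, size=(1, 1, 3))
    target = np.clip(image * jitter, 0.0, 1.0)
    # Soft elliptical mask standing in for a face-region blending mask.
    ys, xs = np.mgrid[0:h, 0:w]
    dist = ((ys - h / 2) / (h / 2)) ** 2 + ((xs - w / 2) / (w / 2)) ** 2
    mask = np.clip(1.0 - dist, 0.0, 1.0)[..., None]
    # Blend the two views; the image stays untouched outside the mask.
    return mask * target + (1.0 - mask) * image

# A training set would pair (image, label=real) with
# (make_pseudo_fake(image), label=fake) for each real image.
```

The appeal of the approach is that every real image yields a matched pseudo-fake for free, with no dependence on any particular generation tool.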
There are other considerations for detecting and debunking deepfakes. Carey O’Connor Kolaja, CEO of AU10TIX, recently discussed deepfakes with the CyberWire and described ways organizations could work together against the threat. Some of her suggestions will look familiar: they resemble best practices advocated in other, more general cybersecurity contexts. “To combat disinformation and misrepresentation, we must unite and form consortia to safely and legally exchange signals within ecosystems,” she said. “Technology allows us to do this with zero-trust architectures and cryptographic signatures, but commercially we need to find a way forward. When CNN misrepresents something, it’s bad for all media outlets. When GoFundMe makes false statements, it’s bad for all crowdfunding platforms.”
Cross-referencing data from different sources may clarify some claims, especially as each institution strengthens its own security posture. “We should also bring together unexpected data signals to determine what is authentic and what is not,” Kolaja said, “[c]ombining social signals with local signals, for example, with deepfake technology. Organizations should also try to adopt and embrace new policies, standards, and interoperability tools [such as those of the Coalition for Content Provenance and Authenticity (C2PA)], and new approaches to maintaining trust in our digital/physical world.”
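The provenance idea behind standards like C2PA is to cryptographically bind a claim about who made a piece of media to a hash of its bytes, so any later tampering is detectable. The following is a deliberately minimal sketch of that idea using only the Python standard library; real C2PA manifests use X.509 certificates and COSE signatures rather than the shared-secret HMAC assumed here, and the key and field names are invented for illustration.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret; real provenance schemes use public-key
# signatures so anyone can verify without holding the signing key.
SIGNING_KEY = b"publisher-secret-key"

def make_claim(media: bytes, creator: str) -> dict:
    """Bind a creator assertion to a hash of the media bytes and sign it."""
    claim = {
        "content_sha256": hashlib.sha256(media).hexdigest(),
        "creator": creator,
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return claim

def verify_claim(media: bytes, claim: dict) -> bool:
    """Check both the signature and that the media still matches its hash."""
    body = {k: v for k, v in claim.items() if k != "signature"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, claim.get("signature", ""))
            and body["content_sha256"] == hashlib.sha256(media).hexdigest())
```

Verification fails if either the media bytes or any field of the claim changes, which is exactly the property that makes signed provenance useful against deepfakes passed off as authentic footage.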
She pointed to other areas where deepfakes are a problem. “Deepfakes are becoming more common and will penetrate unexpected areas,” she said. “The expected uses will continue to be media in all forms, including political campaigns, social campaigns and even commercial campaigns. Other forms of impersonation that may prove even more problematic can range from a fraudulent person taking exams to fake doctors offering fraudulent online services.”
“In the identity space,” she continued, “crooks will try to fool authentication systems with synthetic images [or] videos of someone other than themselves. Scammers could also create videos of family members to try to collect ransom.”
The countermeasures point to another problem with deepfakes: the prospect of data poisoning, the injection of false data into databases where it may go undetected for long periods of time.
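Cross-referencing independent sources, as Kolaja suggests, is also one defense against that kind of poisoning: a record is treated as suspect unless some quorum of sources can confirm it. This small sketch is our own illustration of that principle, not anything from the article; the source names and quorum threshold are hypothetical.

```python
# Illustrative cross-source consistency check as a defense against data
# poisoning: a record injected into one database will not be corroborated
# by independent sources, and so falls below the confirmation quorum.

def suspect_records(sources: dict[str, set[str]], quorum: int = 2) -> set[str]:
    """Return records confirmed by fewer than `quorum` independent sources."""
    counts: dict[str, int] = {}
    for records in sources.values():
        for record in records:
            counts[record] = counts.get(record, 0) + 1
    return {record for record, n in counts.items() if n < quorum}
```

For example, a record appearing in only one of three registries would be flagged for review rather than silently trusted.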
Openness on Rossiya.
The fact that Russia’s war against Ukraine has so far been less than fully successful was acknowledged with unusual openness on Russian television. “The situation for us is clearly going to get worse,” the New York Times quotes Mikhail M. Khodaryonok as saying on 60 Minutes, a widely watched news talk show on the Rossiya network. Khodaryonok is a retired colonel and “a conservative commentator on military affairs.” He continued, “We are in total geopolitical isolation, and the whole world is against us, even if we don’t want to admit it.” The Times and other outlets carry a link to the program, with English subtitles added; it’s worth watching (and listening to) in full.
Russian information operations related to the invasion of Ukraine.
Mandiant this morning released an overview of the Russian information operations it has tracked from the run-up to Russia’s war against Ukraine through the invasion to date. Senior analyst Alden Wahlstrom, one of the report’s lead authors, said the research aims to show “how known actors and campaigns can be leveraged or otherwise refocused to support emerging security interests, including large-scale conflicts. For years, analysts have documented that Ukraine, a key strategic interest of Russia’s, serves as a proving ground for Russian cyber threat activity that could subsequently be deployed elsewhere. Now we see how pro-Russian actors have used (in whole or in part) the assets and campaign infrastructure developed over time to target Ukraine.”
The operations exhibit a mix of disinformation and disruptive attacks (principally ransomware, wiper malware disguised as ransomware, and harassment-level distributed denial-of-service attacks). The activity began as early as January 14th of this year, with defacements of Ukrainian government websites accompanied by claims of data theft and subsequent deletion. “The defacements likely coincided with the January deployment of the destructive tools PAYWIPE, an MBR wiper disguised as ransomware, and the file corrupter SHADYLOOK against the Ukrainian government and other targets.” On February 23rd, the eve of the actual invasion, the style of attack was repeated. In this case the defacements “coincided with destructive attacks against Ukrainian government targets using the NEARMISS Master Boot Record (MBR) wiper, disguised as ransomware, and the PARTYTICKET wiper.” And during the war itself, on March 16th, a deepfake video of Ukrainian President Zelenskyy appearing to announce a surrender to Russia circulated via compromised Ukrainian news sites. That incident coincided with another wiper attack: “On the same day, Mandiant identified the JUNKMAIL wiper deployed against a Ukrainian entity. The malware was configured via a scheduled task to execute approximately three hours before Zelenskyy’s scheduled address to the US Congress.”
Several familiar threat actors surfaced. APT28 (Fancy Bear, the GRU) was behind much of the Russian activity, and the allied Ghostwriter operators run by Belarus’s satellite intelligence and security services were also active in the Russian interest. The election-meddling troll farm known as the Internet Research Agency also seems to have resurfaced as “Kiber [that is, Cyber] Force Z” and resumed influence and amplification operations. And there were the usual covert media working under inauthentic personas. Kiber Force Z’s style is as familiar as it is tasteless: a Russian-uniformed Pepe the Frog (an Orthodox cross, blasphemously, around the neck; a “Z” badge in the place of honor on the left shoulder) calls for an air strike on Azovstal, manned by three pig-faced soldiers of the Azov battalion. (The Azov soldiers look better uniformed and better equipped than Comrade Soldier Pepe, who seems a bit slovenly and unconcerned in his turnout. Perhaps Kiber Force Z realized that President Zelenskyy’s casual self-presentation played better than President Putin’s expensive suits, long tables, and Ruritanian guards.)
There has also been some nominally hacktivist activity in support of Russia. “Established hacktivist figures JokerDNR and Beregini have actively continued their attacks on Ukraine leading up to and since the Russian invasion, including by releasing allegedly leaked documents containing possible personally identifiable information (PII) of Ukrainian military personnel,” Mandiant notes, and cautiously continues: “In addition, newly formed ‘hacktivist’ groups whose affiliation with the Russian state remains unknown, such as Killnet, Xaknet, and RahDit, have engaged in hacktivist threat activity in support of Russia, including distributed denial-of-service (DDoS) attacks, hack-and-leak operations, and defacements.” We think it very likely that these hacktivist personae are operating under the control, or at least the direction, of Moscow’s intelligence services.
Russian disinformation has two sides. One, for foreign consumption, runs in the familiar tabloidesque, entropic style aimed more at sowing confusion than at persuading, which has been a staple of Russian election meddling over the past decade. This line includes such claims as the discovery of US biowar labs in Ukraine, Poland’s systematic harvesting of organs from Ukrainian refugees for sale on the black-market transplant trade, and so on. The other is aimed principally at domestic audiences and emphasizes the foreign threat to Russia, Ukrainian atrocities against ethnic Russian enclaves, and, above all, the alleged Nazi cabal said to rule in Kyiv. These lines of disinformation are meant to be persuasive. They have also been closely adhered to, with few deviations, which makes Colonel Khodaryonok’s remarks on Rossiya’s 60 Minutes all the more remarkable.
The report concludes with an assessment of the prospects for influence campaigns in support of Russian objectives. Russian operators can be expected to continue spreading disinformation, likely with the support of their satellite services in Belarus. China and Iran serve as allies of convenience, retailing Russian themes when doing so serves those regimes’ long-standing anti-Western strategic goals:
“Information operations observed in the context of Russia’s invasion of Ukraine have displayed both tactical objectives, responding to or attempting to shape events on the ground, and strategic objectives, attempting to influence the shifting geopolitical landscape. While these operations have posed an outsized threat to Ukraine, they have also threatened the US and other Western countries. As a result, we expect such operations, including those involving cyber threat activity and potentially other disruptive and destructive attacks, to continue as the conflict progresses.
“A notable feature of the operations attributed to hitherto known actors is their apparent consistency with the respective campaigns’ established motives and tactics, techniques, and procedures (TTPs), in support of tactical and strategic objectives directly tied to the conflict itself. This is especially the case where facts on the ground drive Russia’s need to influence events in Ukraine, marshal domestic Russian support, and manage global perceptions of Russia’s actions. Meanwhile, pro-PRC and pro-Iran campaigns have opportunistically used the Russian invasion to advance long-held strategic goals. We likewise expect this momentum to continue, and we are actively monitoring whether the scope of their information operations around the conflict expands.”