Video script & research: Alya Shandra, design: Ganna Naronina, narration: Lianna Makuch.
How can you brainwash people into believing what you want? Russia has it down to a science.
In Soviet times, journalism and philology students studied spetspropaganda, or combat propaganda – the art of sowing discord in the enemy’s ranks by means of disinformation and manipulation. Known in the West as psychological warfare, spetspropaganda was an essential part of the 53 wars of Russia and the USSR.
Today, the Kremlin is using it against Russia’s own population. Tactics of spetspropaganda are easily identified in the broadcasts of Russia’s state TV, which have managed to keep Vladimir Putin in power for 20 years – and God knows for how many more. Moreover, elements of psychological warfare are obvious in Russia’s foreign influence operations.
Spetspropaganda came to wide attention after a viral article by Russian journalist Vladimir Yakovlev, the owner of the Kommersant publishing house, who is said to be the founder of Russia’s post-Soviet format of journalism. He recalled some widespread techniques of spetspropaganda which he had been forced to study in an atmosphere of top secrecy at Moscow State University. Writer Zarina Zabrisky, who translated the post into English, wrote about being forced to study spetspropaganda as a philology student at St. Petersburg State University, and has been avidly raising awareness of the technique’s use in Russian media manipulations – see this and this article.
According to Ms. Zabrisky, the classes were conducted in an atmosphere of total secrecy, and students were not allowed to take their notes with them. The teacher counted the manually numbered pages in the notebooks after each lesson to make sure no written information escaped.
Despite these secretive measures, it is possible to make out the full theoretical framework of Russian psychological warfare. In 2017, the Moscow Times brought to light an allegedly unauthorized copy of a GRU handbook which, according to former students, was the basis for the courses on spetspropaganda for reserve officer training cadets at Moscow State University’s (MGU) journalism faculty. The Moscow Times is most likely referring to the book “Secrets of Psychological War: Goals, Tasks, Methods, Forms, Experience” by Vladimir Krysko, published in Minsk in 1999: the citations match, and so does the publishing date.
But it appears that the teaching of spetspropaganda is not confined to MGU journalism students.
According to communications and hybrid war specialist Heorhiy Pocheptsov, psychological warfare is also taught at MGU’s global policy faculty, at the Russian State Humanitarian University, and at the Moscow Institute of International Relations. Techniques of spetspropaganda are also studied in such Russian institutions as the Military Institute of Information and Foreign Languages, which boasts a faculty of foreign military information that offers two-month spetspropaganda courses for officers and trains military journalists.
So, swarms of Moscow university students graduate each year with knowledge of spetspropaganda. Where do they all go?
Psychological warfare for war and for peace
“Psychological warfare has been conducted from the earliest ages, but nowhere and never has it been advertised,” starts the textbook.
Although the essence of psychological warfare is understood only by specialists, Krysko says it is quite close to the regular kinetic war we all know – it achieves the same thing as a war with guns and tanks, but without firing a shot.
“One of the main goals of any ‘ordinary’ war is precisely to change the psychology of the enemy. What does it mean to ‘force him to surrender to the grace of the winner,’ or ‘accept the proposed settlement plan’ of some kind of problem? This implies, among other things, forming a belief in him that further resistance is pointless, depriving him of faith in his success, i.e. changing his psychology.”
But spetspropaganda is not only for war, the textbook stresses. In peacetime – which it calls the “threatening period” – spetspropaganda has a range of tasks aimed at the population and army of enemy countries, all directed towards either preventing a conflict or preparing for one.
At the strategic level, these include predictable “white PR” tasks such as promoting views in the interests of your own country on the international stage, cultivating trust towards its army, and propagandizing its economic, military, and political superiority – all commonplace strategic communications goals. More revealing are the insidious tasks of peacetime spetspropaganda:
- Form a negative attitude towards a possible war among the population and servicemen of the target country;
- Break apart coalitions of hostile countries;
- Intimidate potential enemies: demonstrate military power, exert political pressure, impose economic blockades, etc.;
- Criticize the spiritual and military ideals of the enemy country; discredit its doctrines, military theories, and the views of its leadership;
- Incite political, national, and religious enmity between various groups of the population and armed forces, and thus
- Achieve a weakening of the moral-political potential of the population and military of a potential enemy.
The short-term tasks provide more detail:
- Conduct specific events to discredit the political-military leadership of the enemy country in front of its citizens and the world community;
- Obtain from allied countries moral approval of your actions and real help for your country and its upcoming military actions;
- Mobilize your citizens to unconditionally support your upcoming military actions, and neutralize pacifist attitudes;
- Conduct maskirovka (camouflage) of your genuine measures to counteract the ideas of your enemy;
- Incite the population and servicemen of the enemy to anti-social actions which destabilize normal life;
- Activate existing religious and national prejudices, incite contradictions between specific social and national groups of the population of the enemy country;
- Incite the anti-government forces in the enemy camp to active measures (prepare the “fifth column”);
- Organize effective counteraction to the psychological operations of the enemy.
For the Russia watcher, it will be fairly obvious that the abovementioned lists of tasks constitute a theoretical framework of Russia’s hybrid war. Inciting enmity, engineering conflicts, and otherwise demoralizing the population and leadership of western countries has become the trademark of Russian political warfare and is described extensively by analysts drawing conclusions from astute empirical observations.
On the other side of the coin are psychological warfare operations for war. These include measures that are similar but harsher, relatively well known as PSYOPs. Strategically, these include:
- Get neutral countries on your side and discredit your enemies on the international stage;
- Neutralize enemy propaganda directed at your population and military;
- Create panic and mass psychosis, overcome the enemy’s will to fight, make him believe he is weaker than he is, make him oppose the policy of war;
- Incite enmity and conflicts between various groups of the enemy;
- Disinform the population and military about the real situation at the front;
- Undermine morale and incite the enemy’s military to desert the battlefield.
The technology of spetspropaganda
Krysko’s book pays homage to the widespread concept of white, gray, and black propaganda:
- white propaganda is open about its sources and cites official state information;
- black propaganda hides its real sources of information and has lies as its goal;
- gray propaganda is in between.
The goal of a psychological war is not to use exclusively black propaganda, but to be able to productively combine all three, Krysko states.
Also important is studying the target population. Nationalities vary by their psychology; taking these differences into account improves the effect of propaganda by one-third, Krysko claims.
It is essential to study the objects of spetspropaganda. Apart from a thorough assessment of the state of the military, the socio-political sphere of the country is studied to influence the civilian population, which, according to Krysko, is much more easily influenced than the military. Thus, the propagandist must know what the different population groups think about the internal and external policies of their state and its neighbors, how they view the possibility of a war, how its media work, whether there are pacifist demonstrations, etc.
The two main methods of psychological war are:
- Persuasion – to influence the critical thinking of the objects of propaganda.
- Suggestion – to influence the subconscious of those who don’t think critically.
Persuasion
Persuasion works by first getting the person to internally agree with certain conclusions and then forming and consolidating new attitudes corresponding to the set goal. Persuasion is considered successful when the victim is deeply confident in the truth of the newly assimilated ideas, which allows them to make unambiguous decisions. Thanks to this confidence, attitudes are formed which influence the behavior of people in everyday life.
How does one achieve persuasion? Here are some ways.
1. Repeat ad nauseam. Invented by Nazi propagandist-in-chief Dr. Goebbels.
“However, the repetition should not be stereotypical; the propaganda thesis should be supported by different arguments,” Krysko writes, giving the example of how Red Army propaganda used repetition during the 1939 invasion of Poland.
Eleven recurrent topics were used: five concerned the Polish government fleeing to Romania (a lie, since it was actually still on Polish territory near the Romanian border), two concerned the government stealing the country’s gold reserves (also a lie), and one the luxurious lifestyle of its members.
2. Repetition works best when paired with a Big Lie. “The bigger the lie, the more likely people will believe it” is a quote attributed to Goebbels but actively applied by Russia today. A classic example of a Big Lie is “Operation Infektion” – the USSR’s Cold War-era claim that the US invented AIDS, which is believed to this day.
Today, Russia spreads the Big Lie that US-backed Ukrainian Nazis want to commit ethnic genocide against Russians in eastern Ukraine. Russian central TV airs staged interviews and fakes to “illustrate” the Big Lie and smear the Ukrainian army. One staged interview broadcast on Rossiya 1 became emblematic of Russian propaganda about Ukraine: a boy supposedly “crucified” by the Ukrainian army in Sloviansk, despite not a single eyewitness being found to corroborate the story.
Another example of staged footage illustrating Russia’s Big Lie about Ukraine is a video in which figures claimed to be members of Ukrainian volunteer battalions appear to hang a pro-Russian militant and his pregnant wife. This video, too, was staged – the unnatural movements of the “hanged” bodies suggest that mountain-climbing equipment was used to carry out the stunt.
Huge numbers of people are buying the Big Lie; in fact, it is mobilizing militants from Russian-speaking regions to flock to Donbas, like Manas, who came from Kyrgyzstan to fight Ukrainian “Nazis” but left without finding any.
3. The primacy effect, also attributed to Goebbels, who said, “The communication must reach the audience ahead of competing propaganda” – because we are predisposed to prioritize the first version we hear.
Russia used this effect in the first weeks of the occupation of Crimea. Prior to the military operation, Russia went into overdrive portraying the Euromaidan revolution as a neo-Nazi coup – and a Russian takeover of the peninsula as a way of escaping it.
This version was given broad international coverage after Russian President Vladimir Putin’s press conference about Euromaidan on 4 March 2014, where he claimed there was a threat of “semi-fascist” elements taking over Ukraine.
Not only did many Crimeans believe this lie, Western media did too. The myth of a neo-Nazi coup in Ukraine persists to this day, and it has become fashionable to imply that Ukraine somehow has a neo-Nazi or far-right problem larger than anywhere else. The accusation has stuck to Ukraine and is constantly brought up in discussions about the country.
4. The 60-40 principle. To cultivate trust in a source, make 60% of its coverage objective and 40% manipulation, says Yakovlev. The spetspropaganda manual offers more detail, providing four main principles for making the audience trust you:
- If there are no serious reasons to hide facts or show them only from a certain angle, tell your audience about them adequately;
- Apart from considerations of military secrecy, only the assumption that the audience will not believe the facts can be a serious reason for concealing or distorting them;
- Every time the audience believes that the propagandist is lying, omits or adds serious details, the effect of propaganda on the audience is seriously weakened;
- Because of this, propaganda should never use falsified facts which may be exposed by the audience.
Additional persuasion methods include ensuring the credibility of the propaganda source by making it look like it has “special knowledge” not covered by official sources, creating the image of being “objective, independent, and alternative,” and urging the audience to adopt a “wider perspective.”
Arguments used in persuasion are separated into “true facts,” arguments appealing to positive expectations (such as “if you surrender, you will have good food rations as a POW”) and negative expectations (such as “if you do not surrender, you will freeze to death on the battlefield”).
Other effects of persuasion used by “psychological war specialists” when addressing a mass audience include:
- Strong arguments against the point of view (POV) of the “object” will be effective when he is distracted by something (illustrations in a leaflet, music and noise in a radio program, the video sequence in a TV show);
- The object will assess arguments which appear diametrically opposed to his POV as totally unacceptable;
- If the persuasive message seems to only slightly differ from the object’s POV, the object will often identify his own view with the persuasive message.
Suggestion
Suggestion, on the other hand, aims to influence the subconscious. In the process of suggestion, intellectual activity is either missing or weakened, and information, attitudes, feelings, and stereotypes of behavior are perceived based on mechanisms of contagion and imitation.
Suggestion usually happens between the lines. The restlessness, fear, or apathy one might feel after watching RT is the result of suggestion, the goal being to “achieve a weakening of the moral-political potential of the population and military of a potential enemy.”
It is crucial that the indoctrinated persons are able to perceive another person’s words as instructions for action – that they are receptive to being psychologically influenced. Indoctrination requires neither a system of logical arguments nor active thinking.
Although every person is susceptible to suggestion, women are more susceptible than men, and children more than adults, the manual notes. Heightened emotionality, weak logical thinking, religiousness, anxiety, shyness, a tendency to imitate, sensitivity, a sense of hopelessness, low willpower, depression, and fear are some of the traits and states that increase one’s receptivity to suggestion.
It works best when the figure exerting the suggestive influence is authoritative, meaning she can be trusted, is confident, competent, acts benevolently towards the propaganda victim, and demonstrates that she believes the information being shared. Most likely, it is for this reason that RT employs fringe figures acting as experts: they are shown as figures who can be trusted, and thus the process of suggestion becomes more effective.
Methods of suggestion, according to the manual, include the seven “tricks of the trade” of a successful propagandist, as identified by the Institute of Propaganda Analysis.
- Name-calling – to discredit and dehumanize;
- Glittering generalities – vague, sweeping statements using language associated with values and beliefs deeply held by the audience without providing supporting information or reason;
- Transfers – carrying over the authority and approval of something we respect and revere to something the propagandist would have us accept;
- Testimonials – quotations from figures whom the victims of propaganda either love or hate, in order to make them love or hate the idea;
- Plain Folks – convincing the audience that the spokesperson is of humble origin, someone they can trust and who has their interests at heart;
- Bandwagon – persuading the audience to follow the crowd by creating the impression of widespread support;
- Card stacking – selectively choosing facts (either positive or negative) to make the best possible case for one’s own side and the worst for the opposing viewpoint.
Another common tactic of suggestion is the rotten herring. Yakovlev describes it thus:
“A false accusation is found. It is important that it is as dirty and scandalous as possible — petty theft, say, child molestation, or murder, desirably motivated by greed. These work well.
Proving the charge is not the purpose of rotten herring. The purpose is to invite a broad, public discussion of its… UNfairness and injustice.
The nature of the human psyche is such that as soon as the charge becomes a subject of public discussion, ‘supporters’ and ‘opponents,’ ‘specialists’ and ‘experts,’ rabid ‘prosecutors’ and ardent ‘defenders’ of the accused inevitably enter the scene.
But regardless of their views, the discussion participants use the name of the accused in conjunction with dirty and scandalous accusations, over and over, thus rubbing more ‘rotten herring’ into his ‘clothes,’ until the ‘smell’ follows him everywhere. And the question of whether he ‘killed, stole, seduced or did not’ becomes associated with his name.”
Zarina Zabrisky finds uses of the “rotten herring” in the election campaign of Hillary Clinton, who was smeared with a purported connection to ISIS based on an email hack, and in the case of Emmanuel Macron, to whom Russian media applied many labels, from being a US agent to being gay to having his campaign funded by Saudi Arabia.
In the case of Ukraine, Russia’s accusations of Nazism in post-Euromaidan Ukraine have worked particularly well. Russian trolls do not miss a chance to belt out “Nazi junta” when talking about Ukraine, and foreign journalists still dedicate significant column space to searching for the phantom of “overlooked” neo-Nazism in Ukraine.
However, the rotten herring can be applied to anybody the Kremlin wants to smear, not necessarily politicians. Such was the case with historian Yuriy Dmitriev, a man who was unearthing Russia’s darkest Gulag-era secrets and who was detained on fabricated accusations of child pornography. This accusation of pedophilia – a heinous crime that makes most people keep their distance from the accused, just in case – thinned the ranks of Dmitriev’s supporters early on, notes writer Boris Shenderovich (although later, most figured out that the historian was innocent).
Other methods include rumors, which are essential for obscuring the truth and reinforcing stereotypes among the target groups. Krysko classifies rumors into wishful rumors, frightening rumors, and aggressive divisive rumors; by reliability, rumors can be absolutely false, unreliable with elements of credibility, plausible, or authentic with elements of improbability.
While word-of-mouth rumors were widespread in the wars of the past, in today’s social media age conspiracy theories serve the same function.
- Part 1: Propaganda prepares Russia for war
- Part 2: Whataboutism
- Part 3: Rapid fire conspiracy theories
- Part 4: Russian propaganda operates by law of war
- Part 5: Reflexive Control