On February 24, 2022, Russian President Vladimir Putin declared the beginning of what he dubbed the “special military operation,” and the Russian Armed Forces invaded Ukrainian territory. What the Russian authorities assumed would be a swift operation soon became a drawn-out, full-fledged war. Many events occurred over the course of the first year of war, keeping Russians in suspense, forcing them to detach themselves from the situation, giving them hope, and then driving them to despair. When we conducted our first interviews in spring 2022, many thought the war would not last long.
Since then, it has become clear that the war will be with us for a while. The daily life of Russian citizens has been invaded time and again by dramatic events. The Russian retreat from the occupied territories, the annexation of new regions, the bombing of Kiev, the first Crimean Bridge explosion, and the “partial mobilization”— to name just a few. Have these events changed the average Russian’s view of the war, and if so, how? How did residents of the Russian Federation perceive the “special military operation” more than half a year later? These questions are the focus of the report you see before you.
There are several research teams monitoring changes in Russian perceptions of the war through opinion polls (for example,
Russian Field and
Chronicles). The work they are doing is very important. However, like any research method, surveys have their drawbacks—there are some things they simply will not show. For example, surveys do not always allow us to understand a respondent’s attitude towards sensitive or hot-button topics, as sometimes people have a tendency to
hide their true views. But more importantly, for Russians largely removed from the political process, perceptions of such politically charged issues as the “special military operation,” war, and military conflict do not fit neatly into the standardized set of coherent positions that a survey is capable of capturing. These perceptions may be complex and contradictory, and in this case, in-depth interviews and long conversations with people allow us to better understand the idiosyncrasies of each viewpoint. To our knowledge, we are the only team that systematically monitors Russian perceptions of the war using qualitative (interview) rather than quantitative (survey) methods.
We released our first analytical report in September 2022. You can read it
here (in Russian) and
here (in English). In it, we presented the results of our qualitative study through interviews conducted over several months after the start of the war, in March, April, and May 2022. Our interviewees held a variety of opinions on the military conflict—there were those who supported the hostilities in one way or another (war supporters), those who condemned military aggression (war opposers), and those who tried to avoid giving any explicit assessment of the situation (undecided). We compared these three groups of respondents with each other: how they perceived the armed conflict, what emotions they associated with it, and how they consumed information, assessed the victims of the conflict, discussed the situation with loved ones, reflected on the consequences of the war, and so on. We have also published the results of this research in analytical media outlets, a few examples of which can be found
here,
here, and
here, as well as in scientific journals, such as those found
here (in Russian) and
here.
The paper you are currently reading is the second analytical report we have published and a continuation of this research. It is based on qualitative sociological interviews with Russian citizens conducted in fall 2022, seven to nine months after the outbreak of the war. We wanted to determine how Russian perceptions of the war had changed during this period. This time, we excluded subjects who consistently opposed the war from the sample and decided to focus our study on the specifics of perceptions held by Russian citizens who did not have an unambiguous anti-war stance.
We conducted 88 interviews, each of which lasted from 30 minutes to two hours (the majority were around an hour long). Among these were repeat interviews: we spoke again with some participants from our first wave of interviews, who had been classified in spring 2022 as supporters or undecided. When analyzing a repeat interview, we also referred back to the subject’s first interview from the spring in order to see how the same subject’s opinion had changed after more than half a year of war. We also conducted new interviews with subjects we had not spoken to before. As in the first wave of research, during our fall session, we found our subjects through advertisements on social media and through snowball sampling. You can read more about our research and data collection methods in the next section.
This time, we decided not to categorize our subjects as “supporters” or “undecided.” The fact is, the lines dividing these two camps are becoming increasingly blurred. Of course, there are still staunch supporters of the war who could hardly be called “undecided.” But among our subjects are many who support certain aspects of the “special operation,” are uncertain about the necessity of others, and flatly disagree with still others. Therefore, we group all our subjects together while attempting to show the differences between various types of perception of the war.
This report comprises two parts, aside from the introduction, research methods, and conclusion. The first investigates how Russian perceptions of the war had changed more than half a year after it began. In this section, we often compare a subject’s first and second interviews in order to show how their views had changed (if at all). We also address interviews with new subjects and illustrate new trends in perceptions of the war that were practically nonexistent in spring 2022. The second part deals with perceptions of the war in fall 2022. In addition to the overall perceptions of non-opponents, we also analyze their reaction to the announcement of “partial mobilization,” their emotions surrounding the protraction of the war, how they consume information about the conflict, and how they see it ending. In the conclusion, we present the main findings of the study. Some preliminary results of this second phase of the study have already been published in the media, for example
here,
here and
here.
This study was organized by the Public Sociology Laboratory (PS Lab). The PS Lab is an informal research collective that studies politics and society in Russia and the post-Soviet space from a comparative perspective. The Lab has studied the 2011–2013 “Fair Elections Movement” and the post-protest local activism that followed it in Russia, the Euromaidan and Anti-Maidan Movements in Ukraine, the 2014 Donbas War, and now, Russian perceptions of the military conflict between Russia and Ukraine. You can learn more about the PS Lab’s work
here.
At the same time, the team conducting this research extends beyond the walls of the PS Lab. The interviews used in this report were conducted by (in alphabetical order) Violetta Alexandrova, Serafima Butakova, Kira Evseenko, Svetlana Erpyleva, Oleg Zhuravlev, Sasha Kappinen, Irina Kozlova, Nadezhda Korytnikova, Alexander Makarov, Vadim Maleiko, Natalia Savelyeva, Vladislav Siiutkin, Yulia Strizhenova, and Igor Chernivsky. Analysis of the data and drafting of the report were performed by Maxim Alyukov, Violetta Alexandrova, Serafima Butakova, Alya Denisenko, Svetlana Erpyleva, Oleg Zhuravlev, Sasha Kappinen, Irina Kozlova, Nadezhda Korytnikova, Anatoly Kropivnitskyi, Darya Lupenko, Natalia Savelyeva, Vladislav Siiutkin, Yulia Strizhenova, and Igor Chernivsky. The report was edited by Svetlana Erpyleva and Sasha Kappinen.
This research project has no clients or outside funding. Such a huge undertaking was made possible thanks to the fact that there are so many of us and each understands the importance of determining how war affects our society. We are all employed in different places and have other projects we worked on in tandem with this one. We are all (with the exception of one researcher in Ukraine) Russian citizens with a variety of political views, but all of us oppose the war. We are aware that our view of the situation does not align with the views of many of our interview subjects. We treat their opinions with respect, withholding judgement, and simply describing and analyzing them from an investigative, sociological point of view. This is why there are so many quotes throughout the text: they are direct quotes from our respondents.
We are grateful to Mikhail Oleinikov for formatting the report. We would also like to thank our friends and colleagues who aided us in our search for subjects. And we would like to thank the interview subjects who agreed to speak with us despite the complexity of the topic and the military censorship throughout the country.
Below, we discuss what data we collected, how we collected and analyzed it, and the limitations of our collection and analysis methods.
Data Collection
We felt the acute need for a second stage of research immediately after the “partial mobilization” was declared in Russia. At that moment, the distant war once again (or for the first time) hit home for millions of Russians, invading their everyday lives. Friends, acquaintances, and even strangers on the street began talking—sometimes whispering—about the war again. We all felt like things were changing. Nevertheless, we knew that even the strongest emotions sometimes fade quickly and may not leave significant marks on people’s attitudes. Therefore, we decided to wait a few weeks before starting the second phase of our study.
The majority of the interviews used in this study were conducted between October 11 and December 29, 2022. Two subjects were interviewed later, in late January and early February 2023. Most of the interviews were collected in the second half of October and November 2022.
We used several methods to find our respondents.
Firstly, we approached the supporters and undecided respondents whom we spoke with in spring 2022 with a request for a second interview. Around half of them agreed to speak with us again.
Secondly, we put out an advertisement calling for interview respondents on social media—both on the Lab’s page and on the researchers’ personal profiles. In these announcements, we indicated that we were looking for people who did not identify as staunch opponents of the war, as well as those who thought their views on the war had changed significantly over the previous six months (in any direction).
Thirdly, we relied on the snowball sampling method. We asked all our respondents to recommend a few people whom we could contact and also used our own social media pages to search for potential respondents. We reached out to distant relatives, former classmates, childhood friends, and even former respondents from prior research projects. Since there were more than 20 people on our team (researchers and interviewers), we were able to contact a wide variety of unrelated interview subjects.
This time, with rare exceptions, we conducted the interviews over the internet. The first stage of research revealed that the majority of respondents preferred to meet online, since it was easier for them to find time for the conversation and they felt more secure and anonymous. We asked our respondents to choose whichever platform was most comfortable for them. Most of the interviews were conducted via Telegram, some through Zoom, and the rest through WhatsApp. The interviews lasted from 30 minutes to two hours, but most conversations lasted between an hour and an hour and a half. With respondent permission, conversations were recorded on a voice recorder and then transcribed by a professional transcriber (always the same person).
All interviews were taken anonymously. In some cases, we essentially did not know the real names of our respondents—we only had their contacts and nicknames or online handles. In cases when the names were known to us, we asked the respondents not to say them during the recorded conversation. After the interviews were transcribed, all personally identifiable information was removed from the transcripts. Recordings and transcripts are stored on a password-protected cloud service that only researchers have access to. When we quote extracts from interviews in the report (and in other publications), we include them in such a way that it is impossible to identify the respondents—we indicate only gender, age, profession, and the date of the interview.
The interviews were conducted either by professional researchers or by volunteers with (or receiving) an education in the social sciences. Most of the volunteer interviewers had prior experience with us—they had helped us collect interviews in the first phase of our research. All volunteers received detailed interview instructions from the project coordinator.
At this stage, we used two different sets of guidelines (lists of main themes/questions) for the interviews—one for repeat interviews with spring respondents, and the other for interviews with new respondents. These guidelines differed only slightly from one another. Both sets included questions about changes in perceptions of the war from the moment it started to the moment the interview was conducted; the mobilization; respondents’ sources of information about the war and how they consumed it; their conversations with loved ones about the “special operation”; their attitude towards the protests, both against the war and against mobilization; and their view of how the Russo-Ukrainian War might play out. The guidelines for interviews with new respondents also included a block of questions about the respondent’s political views in general, and in particular, concerning Russian-Ukrainian relations before February 24, 2022. (We did not ask these questions in the repeat interviews because we had already discussed this topic with respondents in the spring.)
Half of the supporters and undecided respondents with whom we spoke in the spring of 2022 responded to our request to talk to us again. Some we simply could not reach because we no longer had their contact information (we purposefully neglected to save it in order to protect the anonymity of our respondents). Others did not respond to our messages. Some said no—partly because they found our first analytical report “biased.” However, many pro-war respondents had a positive opinion about our first report and gladly agreed to speak with us again. For example, one of the respondents called our first report “a breath of fresh air between the propaganda of one side and the propaganda of the other side” (m., age 25).
Finding new interview subjects was not easy. We encountered the same problems in fall 2022 as we had in spring of the same year—many did not want to discuss the war with researchers for various reasons. One person expressed very concrete fears for his own safety, and not even because he did not trust the researchers: “I have nothing to say. No thanks. I still have family in Russia. I have no doubt about the honesty and anonymity of your colleagues, but I also have no doubt about the quality of the secret police’s work” (m., age unknown). An acquaintance of one researcher initially agreed to an anonymous interview, then wrote back a week later to cancel. She explained that her husband was adamantly opposed to any interview by phone (or online messenger)—it was not safe. In her words, after having a few phone conversations about certain topics, she had started to hear a noise in the background of all her calls and that “everything was telling her this was bad news” (f., age 33). An acquaintance of a different researcher, on the contrary, called her back in response to a message to explain why she was not ready to talk on the record. According to her, everything leaves a digital trace and it is impossible to discuss such topics now. Her contract with the university had just been renewed, and she was afraid that they might find out about her giving the interview and fire her (f., age 65).
Many believed that such conversations could be potentially dangerous, but they could not (or did not want to) articulate what exactly this danger was. As an acquaintance of one researcher said in her response to our request for an interview: “I would like to help you, but we live in a certain society, one where you don’t want to be participating in social surveys. That’s all I can say” (f., age 33). Many people simply did not want to talk about such a complex, incomprehensible, contradictory topic: “The special operation is a difficult topic. A lot of people I know are for it, and just as many I know are against it. So I’m afraid I’m not willing to speak on the topic” (m., age 40). “I don’t actually want to talk about that topic. I don’t have any answers” (m., age 35). Of course, we also encountered entirely unexpected reactions. For example, a repost of our public call for interview subjects received the following comment: “Honestly, no offense, but offers like this sound like they’re recruiting for the very same sleeper agents. Where the interview turns into something else” (m., aged approx. 20).
Once again, we want to express our gratitude to all those who agreed to talk with us in these difficult conditions. We promise to preserve your anonymity.
Description of Data
As part of the second phase of our research (in the fall), we conducted 88 interviews with non-opponents of the war; 40 of these were repeat interviews with respondents we had already spoken to in the spring. We were able to speak to 18 out of 29 of the “undecided” subjects interviewed in the spring and 22 out of 40 “supporters” interviewed in the same period. We also conducted 48 new interviews.
Our sample is not representative of the Russian population as a whole—as in the spring, people with higher education were overrepresented, as were residents of Moscow and Saint Petersburg. This means that it is impossible to say how the “majority” and “minority” of Russians feel about the war based on our data, but it is possible to describe existing attitudes towards the war in society. This description of the various attitudes and rationales behind the Russian populace’s view of the war does not require hundreds of interviews. Sociologists working with qualitative methods are often guided by the principle of “saturation” when collecting data: as soon as the arguments and reasoning about a phenomenon begin to recur within the study and new interviews cease to bring new information, data collection ends. In our data, we see a number of recurring types of reasoning and explanations of the events at hand. These frequently repeated thought patterns and ways of assessing the war and the course of hostilities allow us to hypothesize, despite the skewed nature of the sample, about the mechanisms that lead different groups of people to one or another perception of warfare, casualties, and information.
Below, we describe our subject pool according to five characteristics: gender, age, education, income, and geographic location. The ratios shown in the graphs below in no way indicate how the gender, age, education, and income of non-opponents of the war are distributed throughout Russia. Rather, they reflect the specific makeup of our study sample and the ways in which it is skewed.