
Author: Yana Tsymbalenko,
Chair of the Supervisory Board of the National Association of Lobbyists of Ukraine (NALU),
Authorized Officer for Corruption Prevention
In contemporary geopolitical conflicts, information has long ceased to be neutral. Manipulative technologies are actively employed as tools to influence public opinion and shape social behavior. Through news outlets, social media, messaging platforms, and pseudo-expert commentary, targeted pressure is exerted on collective consciousness. That pressure can alter perceptions of events, trigger fear, aggression, or apathy, and steer society toward predetermined decisions and actions.
Manipulation operates not only at the level of rational thinking but also through emotions, subconscious patterns, and ingrained stereotypes. This is why disinformation often disguises itself as an “alternative viewpoint,” “personal experience,” or “independent analysis,” creating the illusion of choice while, in reality, imposing a pre-designed narrative.
The information environment of democratic societies is particularly vulnerable, as freedom of speech and the right to self-expression are among their core values. In conditions of information overload—when audiences are exposed to vast volumes of content on a daily basis—false information easily blends with credible sources. This dynamic is further reinforced by the logic of the “post-truth” era, in which emotions and personal beliefs frequently outweigh verified facts. As a result, disinformation ceases to be merely a latent threat and begins to exert tangible influence on social processes, public trust in institutions, and the overall stability of democracy.
A Practical Approach to Analyzing Disinformation
To understand how disinformation penetrates the information space and how it can be effectively countered, it is essential to combine several applied approaches.
An analytical approach makes it possible to identify common manipulative techniques, such as emotionally charged headlines, facts taken out of context, appeals to fear, and the construction of an “enemy” image. This enables audiences and professionals alike to recognize disinformation at the point of consumption.
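The cues named above (emotional charge, fear appeals, “enemy” images) can be illustrated with a toy heuristic. The cue lexicon, category names, and matching rule below are hypothetical examples invented for illustration; real disinformation detection requires context-aware analysis and fact-checking, not keyword matching.

```python
# Illustrative sketch only: the categories and term lists are assumptions,
# not a validated lexicon of manipulative language.
MANIPULATION_CUES = {
    "fear_appeal": ["catastrophe", "collapse", "threat", "panic"],
    "enemy_image": ["traitors", "enemies", "fifth column"],
    "urgency_pressure": ["shocking", "urgent", "you won't believe"],
}

def flag_headline(headline: str) -> list[str]:
    """Return the cue categories whose terms appear in the headline."""
    text = headline.lower()
    return [
        category
        for category, terms in MANIPULATION_CUES.items()
        if any(term in text for term in terms)
    ]

print(flag_headline("Shocking: traitors push the country toward collapse"))
# → ['fear_appeal', 'enemy_image', 'urgency_pressure']
```

Even this crude sketch shows why such content is recognizable at the point of consumption: manipulative framing tends to leave surface-level traces that trained readers can spot.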
Comparative analysis of international and national experiences demonstrates that different countries rely on a diverse set of countermeasures—from legal regulations and platform moderation policies to media literacy and public education programs. Such comparison helps determine which practices prove most effective under specific political, social, and cultural conditions.
Viewing the information environment as a system allows for a clearer understanding of the roles played by key actors, including traditional media, social networks, state institutions, civil society organizations, and users themselves. Disinformation does not exist in isolation; it spreads through the interaction of these actors and is often amplified by the algorithms of digital platforms.
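The amplification dynamic mentioned above can be sketched as a deterministic toy model: a hypothetical ranking algorithm allocates a fixed pool of impressions in proportion to accumulated engagement, and emotionally charged content is assumed to convert impressions into new engagement at a higher rate. Every number here (rates, pool size, step count) is an arbitrary assumption chosen to show the feedback loop, not a description of any real platform's algorithm.

```python
# Assumed conversion rates: emotionally charged content earns engagement
# faster per impression. These values are illustrative, not empirical.
GAIN_RATES = {"emotional": 0.3, "neutral": 0.1}

def simulate_reach(gain_rates=GAIN_RATES, steps=50, impressions=100.0):
    """Toy model of engagement-based ranking.

    Each round the algorithm splits a fixed pool of impressions among posts
    in proportion to their accumulated engagement; the higher-converting
    post's share of reach therefore compounds round after round.
    """
    engagement = {pid: 1.0 for pid in gain_rates}
    for _ in range(steps):
        total = sum(engagement.values())
        engagement = {
            pid: e + impressions * (e / total) * gain_rates[pid]
            for pid, e in engagement.items()
        }
    total = sum(engagement.values())
    return {pid: e / total for pid, e in engagement.items()}  # reach shares

print(simulate_reach())
```

Starting from identical positions, the emotionally charged post ends up with the overwhelming majority of reach: a small per-view advantage, compounded by engagement-proportional ranking, is enough to crowd out sober content without anyone lying outright.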
A predictive approach makes it possible not only to identify existing threats but also to anticipate future scenarios of information attacks. This, in turn, provides a basis for developing practical recommendations aimed at strengthening societal information resilience and improving mechanisms for protection against disinformation.
How Information Shapes Perception and Behavior
In today’s information environment, public communication has long ceased to be a simple exchange of messages. Any form of information—from news reports to social media posts—directly influences how people perceive reality, as well as their emotions and behavioral responses. With the advancement of digital technologies and virtual spaces, this influence increasingly operates unconsciously, at the level of subconscious attitudes and psychological triggers.
In democratic societies undergoing digital transformation, public policy effectively becomes a form of continuous communication. It functions not only through official statements or formal decisions, but also through subtle interactions between individuals’ conscious actions and their unconscious reactions. In this sense, informational influence extends both to individual citizens and to society as a whole.
Public policy is inherently linked to people’s needs and interests, as well as to the responsibility of state institutions and civil society. Communication grounded in democratic principles serves as a crucial tool for building trust, strengthening social institutions, and rethinking democracy in the digital age. At the same time, this very space gives rise to new vulnerabilities associated with information manipulation.
Information plays a central role in shaping knowledge, forecasting social trends, and structuring interactions between individuals and institutions. Systematized data on citizens’ needs, attitudes, emotions, motivations, and behaviors can serve as a powerful instrument for protecting democratic values—or, conversely, as a means of manipulation. Likewise, information related to traditions, symbols, and cultural codes shapes national identity and core societal values, making it particularly sensitive to external influence.
How Disinformation Affects Society and Democratic Processes
An analysis of the modern information environment shows that communication is increasingly being used as a tool of governance. The media sphere plays a significant role in this, as it can not only inform but also deliberately shape desired narratives. The consequence of such influence is a communicative crisis — a decline in trust in traditional media and official sources of information. In this situation, an individual simultaneously acts as both an active participant in the information space and its object.
Digital tools for disseminating information are now accessible to almost everyone. This creates the illusion of complete freedom of self-expression while making it harder to establish who is actually speaking in cyberspace. The information environment becomes a complex social system in which freedom of speech can serve as cover for spreading disinformation, especially in democratic contexts.
Based on the nature of user-generated content, the internet can be roughly divided into several segments: knowledge platforms, commercial services, social media, and social networks. The latter two have become the main environments for spreading disinformation: convenient for communication, but also open to the mass infiltration of manipulative, aggressive, or morally harmful content.
In practical terms, disinformation is not only outright falsehoods. It can appear as distortions of facts, manipulation of context, emotional pressure, or the imposition of oversimplified and dangerous explanations of complex processes. Such influence gradually erodes trust between people and undermines the foundations of democratic communication, often with delayed but long-lasting effects.
In information confrontation, “soft” and “hard” forms of information warfare can be distinguished. The former operates through half-truths, selective emphasis, and emotional interpretation, without resorting to direct lies. The latter openly substitutes fabricated narratives for reality and actively promotes them. Ukraine is currently facing these hard forms of information influence.
Content of an anti-constitutional nature is particularly dangerous. This includes:
- Materials that directly or indirectly threaten national interests, territorial integrity, and the democratic system.
- Deliberate distortion of real socio-political processes and historical events.
- Substitution of concepts, creation of artificial myths, and promotion of hostile narratives and values.
- Populist manipulations and the use of political clichés to discredit state institutions and officials.
The systematic and deliberate dissemination of manipulative and false content can have serious social consequences — ranging from increased anxiety and suicidal tendencies to social destabilization and the justification of violence.
Conclusion
Disinformation, integrated into everyday information space, has become an effective tool for influencing mass perception and citizen behavior. Its spread undermines trust, weakens democratic institutions, and creates long-term risks for the stability of democratic systems on a global scale.
Effective counteraction to these threats requires a systemic approach combining legal, organizational, informational, and technological measures. Strengthening cybersecurity, fostering responsible communication, and enhancing societal resilience to manipulative influence are key conditions for preserving democracy in the digital age.