Artificial intelligence, digital social networks, and climate emotions

Main
The potential of AI technologies to augment climate change monitoring and decision-making, and to help reduce the carbon and environmental footprint of human activities, has gained considerable attention in recent years. A much less explored aspect of current conversations about the climate dimensions of AI is that AI technologies can also affect behavior by influencing human emotions (e.g., refs. 1,2).
The efficacy of such emotion technologies has been questioned on various grounds3,4,5. Yet uses of AI to try to measure and influence human emotions continue to grow despite this well-founded criticism. Since emotions are a fundamental part of human behavior, and relate to climate decision-making and collective action in several ways6,7,8,9, the interplay between AI, emotions, and climate action demands continued attention and scrutiny.
This Perspective synthesizes research and ideas related to the ways that AI could affect climate action by influencing human emotions at scale. Our work complements previous research (e.g., refs. 10,11,12,13), and asks: (a) How are emotions, climate action, and AI related? (b) What are the possible effects of increased applications of AI on individual and collective behavior mediated through emotions? (c) What are the most important research challenges in this area?
We offer a conceptual overview of the topic and therefore do not follow a systematic literature search strategy. Instead, we present key insights from relevant literature and selected examples to showcase the complex relationship between emotions, AI, and climate action. We also suggest new avenues of research, as well as a number of methodological and research-ethics challenges. Box 1 summarizes key terms used in this Perspective.
Emotions and climate action
Emotions are an inherent part of decision-making and collective action, which in turn have implications for climate action6,7,8,9,14. Several aspects of human behavior are affected by emotions, including the enforcement of norms15, climate risk perceptions16, public support for climate policies17, perceptions of desirable climate futures18, and the translation of climate knowledge into climate action19.
Given these influences of emotions on climate action, it is critical to understand the underlying psychological mechanisms that link climate emotions with action, as well as their connection to expanding digital social networks and to advances in AI that influence human emotions, both directly and indirectly.
Emotions that relate to climate change and its impacts on people, nature, and future generations—here simply denoted as “climate emotions”—are to some extent no different from emotions (like fear, anger, sadness, and despair) induced by crises such as pandemics, war, terror attacks, or personal tragedies. Some of the mechanisms that link emotions to action, however, make them particularly interesting to examine in the context of climate action, and in relation to expanding digital social networks20.
These mechanisms include, for example, affective forecasting—whereby people act in response to anticipated future emotional outcomes of climate change or mitigation behaviors21; value congruence—where emotions align with personal values and identities, thus strengthening commitment to action8; and social norm activation—whereby emotions are amplified or regulated by social dynamics and group behavior8. In addition, temporal immediacy means that emotions are more strongly triggered by vivid, near-term threats than by abstract, long-term risks22. This last mechanism implies that the long-term aspects of climate change differ from other risks people are exposed to, and thus may not trigger the same emotional reaction. This may lead to an underestimation of climate risks and to delayed climate action22.
Emotions and behaviors related to climate change (and other crises perceived as existential threats) also contain a cross-generational and cross-species dimension that makes them different from other domains. In other words, people feel worry or empathy not only for themselves and their closest family, but also for future generations23 and non-human species24.
A new socio-technological context for climate emotions and action
The psychological processes that link emotions to action do not unfold in a vacuum, but are deeply social phenomena8,25. Humans are influenced by the specific environments in which they make decisions, the social groups with which they identify, and the broader contexts in which their lives unfold7,25,26,27. Being part of different social and cultural groups enables or constrains certain behaviors through habits, styles, or norms28,29, and shapes cognitive processes30,31 as well as what people value and feel31,32. These social interactions all take place in a changing socio-technological context33. Below, we elaborate on how this socio-technological context has changed in recent decades due to expanding digital social networks and advances in AI.
These changes include (a) the way massive digital social networks help diffuse and amplify collective climate emotions; (b) the influence of AI-enabled systems on affective information online; and (c) the emerging capabilities of Generative AI (GenAI) to enable large-scale and targeted production of affective content. While this new context is not unique to climate emotions, we highlight its potential, and so far largely ignored, influence on climate action at scale. We explore each of these points below.
a. Expanded digital social networks and collective emotions
Social media have fundamentally expanded the geographical scale and changed the properties of human social networks, making communication increasingly rapid, virtual, and borderless33,34, influencing news consumption35, and shaping individual attitudes, feelings, and behaviors36,37. Information about climate change and its impacts, such as wildfires, floods, and droughts, is a fundamental part of an increasingly automated news ecosystem38,39.
The expansion of such digital social networks and the diffusion of information about climate events allow for extensive emotion propagation and for collective emotions on climate issues13. Emotion propagation describes the spreading of emotions and related behaviors in large social groups. Propagation evolves as a result of interacting psychological, social network, and algorithmic factors40,41. The resulting collective emotions can be more intense, and evolve over time differently, compared to emotions that are experienced by individuals or in small groups42,43. The 2024 floods in Valencia (Spain) offer an illustration of this phenomenon (Box 2).
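To make the notion of emotion propagation concrete, the sketch below simulates how heightened emotions could spread from a few directly affected users through a digital social network. It is a minimal illustration only: the scale-free network, the susceptibility and decay parameters, and the neighborhood-averaging update rule are assumptions made for this example and do not implement any of the models cited above.

```python
# Minimal, illustrative simulation of emotion propagation in a digital social
# network. Network structure, parameters, and update rule are assumptions for
# demonstration only, not a model from the cited literature.
import random
import networkx as nx

random.seed(42)

# A scale-free graph as a crude stand-in for a digital social network.
G = nx.barabasi_albert_graph(n=500, m=3, seed=42)

# Most users start with low emotional arousal; a few users who directly
# experienced a climate event (e.g., a flood) start highly aroused.
arousal = {node: random.uniform(0.0, 0.2) for node in G.nodes}
for seed_node in random.sample(list(G.nodes), 10):
    arousal[seed_node] = 0.9

SUSCEPTIBILITY = 0.3  # how strongly users adopt their neighbors' emotions
DECAY = 0.02          # emotions fade without reinforcement

def step(current):
    """One round of contagion: each user drifts toward the neighborhood mean."""
    updated = {}
    for node in G.nodes:
        neighbors = list(G.neighbors(node))
        local_mean = sum(current[n] for n in neighbors) / len(neighbors)
        new_value = current[node] + SUSCEPTIBILITY * (local_mean - current[node]) - DECAY
        updated[node] = min(1.0, max(0.0, new_value))
    return updated

for t in range(31):
    if t % 5 == 0:
        mean_arousal = sum(arousal.values()) / len(arousal)
        print(f"step {t:2d}: mean arousal = {mean_arousal:.3f}")
    arousal = step(arousal)
```

Varying the seed nodes, susceptibility, or decay in such a toy model gives a feel for how network structure and individual-level parameters jointly shape whether a local emotional shock fades out or becomes a collective emotion.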
Social feedback dynamics also change in digital social networks, since these networks allow for qualitatively different forms of social interaction compared to “offline” social networks. This includes the anonymized nature of communication, the absence of face-to-face interactions44,45, and the lack of non-verbal emotional links through eye contact46,47. These features have been proposed to amplify competition between groups48. Emotionally charged online news and social media content about climate change is abundant49, and tends to become amplified, politicized, personalized, and polarized (e.g., refs. 50,51). Studies also suggest that moral-emotional language, anger, and outrage about political out-groups (e.g., climate denialists or climate activists) strongly amplify sharing, and induce emotional responses52,53. All these phenomena are, as we discuss below, deeply infused by various uses of AI.
b. Recommender systems, emotion data, and social bots
The structure of digital social networks and the associated flows of information within them are not random, but fundamentally mediated by AI in various ways. Recommender systems—which often rely on deep learning-based AI54—influence both information diffusion and the structure of digital social networks33,39. People-recommender systems (like “People You May Know” on Facebook or “Who to Follow” on X/Twitter) affect how digital social networks evolve over time and, as a result, affect the type of information users encounter online40. Content-recommender systems are known to play an important role in amplifying emotionally charged content online20,42,55.
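As a simple illustration of how a content-recommender system can end up amplifying emotionally charged posts, consider the sketch below. The toy posts, the emotional-intensity and predicted-engagement features, and the scoring weights are invented for illustration; production recommender systems learn these signals from large-scale behavioral data and are far more complex.

```python
# Illustrative sketch of an engagement-driven content ranker. Posts, features,
# and weights are invented for demonstration; real recommender systems are
# learned models operating on many more signals.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    emotional_intensity: float   # 0 (neutral) to 1 (highly charged), assumed given
    predicted_engagement: float  # output of some engagement model, assumed given

def ranking_score(post: Post, emotion_weight: float = 0.6) -> float:
    """Score posts for the feed; because emotionally charged content tends to
    drive engagement, it is implicitly up-weighted."""
    return ((1 - emotion_weight) * post.predicted_engagement
            + emotion_weight * post.emotional_intensity)

feed = [
    Post("Agency releases updated mitigation scenarios", 0.2, 0.4),
    Post("Outrage as officials ignore flood victims!", 0.9, 0.7),
    Post("Local council publishes heat-wave plan", 0.1, 0.3),
]

# The most emotionally charged post rises to the top of the ranked feed.
for post in sorted(feed, key=ranking_score, reverse=True):
    print(f"{ranking_score(post):.2f}  {post.text}")
```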
Social bots, as another example, seem to further amplify emotive climate content online. Automated communication through social bots has become an integral part of today’s social media ecosystem56 and has grown increasingly sophisticated over time57,58, at times being specifically designed to target human emotions59. Research indicates that social bots are active in debates about climate change60, and may contribute to polarization by simultaneously promoting both positions in online engagements39,61,62. Their capabilities may, as we discuss next, improve considerably due to advances in GenAI.
Communication in these digital social networks takes place in an increasingly “emotionalized” digital media environment that includes the AI-supported collection and analysis of a diversity of “emotion data” about users, with the intention of keeping them engaged for as long as possible by nurturing emotional engagement63,64. Such data are collected by digital media outlets and social media platforms by design (e.g., “like,” “love,” and “sad” reaction buttons), and can also be extracted as “surplus” from online behavior as users are profiled based on, for example, their browsing and engagement behavior online65.
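The sketch below gives a minimal sense of how such by-design emotion data could be aggregated into per-user affect profiles. The reaction log and reaction categories are invented for illustration; real profiling pipelines combine many more behavioral signals and far larger volumes of data.

```python
# Illustrative aggregation of "emotion data" collected by design (reaction
# buttons) into simple per-user affect profiles. The log and categories are
# invented for demonstration; real profiling combines many more signals.
from collections import Counter, defaultdict

# (user_id, reaction) pairs as a platform might log them.
reaction_log = [
    ("u1", "sad"), ("u1", "angry"), ("u1", "angry"),
    ("u2", "love"), ("u2", "like"),
    ("u3", "sad"), ("u3", "sad"), ("u3", "like"),
]

profiles = defaultdict(Counter)
for user, reaction in reaction_log:
    profiles[user][reaction] += 1

# Normalize counts into shares, and note each user's dominant reaction.
for user, counts in sorted(profiles.items()):
    total = sum(counts.values())
    shares = {reaction: round(n / total, 2) for reaction, n in counts.items()}
    dominant = counts.most_common(1)[0][0]
    print(f"{user}: dominant={dominant}, profile={shares}")
```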
c. The rise of Generative AI
GenAI, with its multimodal abilities to analyze and generate various forms of content, is an additional form of AI that connects to human emotions at scale. GenAI applications like ChatGPT (by OpenAI), Llama (by Meta), Gemini (by Google), Midjourney, Qwen (by Alibaba), and DeepSeek-R1 (by DeepSeek) are examples of AI models with the capability to automate the production of emotionally charged synthetic content at scale66, and may result in AI-amplified biased emotional judgments67.
Such synthetic content is already visible in public conversations about climate change and the need for climate action. The wide diffusion of false information and AI-generated content—often displaying strongly emotionally charged images and text—after hurricanes Milton and Helene in the United States in 2024, and the Los Angeles fires in 2025, is a clear example of this phenomenon, even though the longer-term impacts on public opinion are uncertain. (See The Guardian (2024): “Russia shares AI images of Hurricane Milton as disinformation abounds in USA” https://www.theguardian.com/us-news/2024/oct/10/russia-ai-hurricane-milton-disinformation; NPR (2024): “AI-generated images have become a new form of propaganda this election season” https://www.npr.org/2024/10/18/nx-s1-5153741/ai-images-hurricanes-disasters-propaganda; CBS (2024): “Bogus videos from the hurricanes are going viral. Here’s how to spot old and fabricated footage online” https://www.cbsnews.com/news/hurricane-viral-video-how-to-spot-old-fabricated-ai-footage/; NPR (2025): “LA’s wildfires prompted a rash of fake images. Here’s why” https://www.npr.org/2025/01/16/nx-s1-5259629/la-wildfires-fake-images.)
The potential influence of GenAI on emotions can unfold through personalized synthetic media content and/or through conversational persuasion. In the former case, studies suggest that GenAI can be used not only to produce highly realistic synthetic content (i.e., text, images, videos), but also to tailor such production to specifically target the emotional state, age, and/or ideological predisposition of a target audience68,69,70.
Chatbots infused with GenAI capabilities also allow for human–machine engagement through lengthy conversations71,72, potentially influencing human emotions at scale due to these technologies’ accessibility and the low cost of scaling them up. While their use to influence emotions is still experimental59, their inherent and growing capabilities, including the ability to combine various forms of synthetic content, can be used to engage emotionally with users on climate change issues (e.g., refs. 72,73).
These technological advances, in combination with GenAI’s ability to integrate and process large and diverse sets of data—including individual data as part of its targeting efforts—are often referred to as these technologies’ capacity for “hypersuasion”74, thus further contributing to a changing technological context for climate emotions and action.
A framework to explore AI, climate emotions, and climate action
This changing socio-technological context may relate to psychological mechanisms in various ways. For example, exposure to images, videos, and sounds of climate-related catastrophes and their impacts on people and communities, amplified through AI-powered digital networks, could influence affective forecasting as people try to anticipate what such events could entail for them and their families. Seeing online users with similar values respond to such events could induce further individual actions through value congruence and social norm activation. Engaging with synthetic material or other forms of GenAI-empowered technologies (say, chatbots) could make the consequences of growing carbon emissions feel more temporally immediate. Table 1 summarizes how some of these connections could unfold in more detail.
Possible impacts of AI-mediated emotions for climate action
The impacts of emotions on climate action are to a large extent collective, and extend over long time periods13. We suggest that such longer-term impacts mediated by AI can play out in at least two ways: first, through the connection between emotions, collective mobilization, and identity; and second, through new forms of empathy.
Emotions, mobilization, and identity
Participating in collective actions, such as climate rallies, can create “upward spirals” of emotions and actions that may reinforce each other over time75,76. Emotional engagement can foster group formation and mobilize online supporters77, which, in some cases, has been shown to lead to tangible impacts on climate policies and emission reductions78. However, such collective emotions not only support coordinated collective climate action, but can also contribute to the formation of enduring identities as people identify with a group united by a common goal as a “by-product of the collective emotion” (ref. 43, p. 156).
Young climate activists in “Fridays for Future” offer an example of how emotional messaging through digital social networks can result in political mobilization79,80, and contribute to the formation of political attitudes81,82. The online presence and messaging of Greta Thunberg is characterized not only by its emphasis on “listening to the science,” but also by its strong emotional framing83. Olesen83 speaks of a new generation of technology-savvy and digitally amplified “political icons”—i.e., individuals “with high public visibility who ha[ve] come to embody a political cause or movement for a large group of people, and who command […] significant emotional and moral attachment from their audiences” (p. 3).
Needless to say, affective online communication can also have negative effects on climate action. Toxic language, hate, and insults from opposing groups such as climate denialists and proponents of the extreme right are a common feature of online conversations about climate change84. New “carnivore diet,” pro-meat, and anti-sustainability online communities such as “#yes2meat”61, and anti-wind power movements85, are additional examples of emotionally driven online mobilization against climate action. AI-mediated and emotive digital mobilization can also amplify emotions such as climate anxiety, fear, and despair, which may reduce collective climate action. The feeling of being “disaffected, unengaged, skeptical or disillusioned,” for example, decreases the willingness to engage in collective climate action86,87. Such negative emotions can be exacerbated by media reporting and through algorithmic amplification88,89.
Expanded digital social networks, the growing use and analysis of emotion-related data, and the accelerating use of GenAI to produce various forms of climate communication that influence human emotions thus allow for the mobilization of different social movements. Reinforcing “upward spirals” of emotions, shaped by collective action and the formation of collective identities, can result both in movements that contribute to climate action and in movements that undermine it.
Digital empathy
Empathy—that is, the ability to understand and care for the experiences of another person, animal, or element of the natural world—allows people to feel strong bonds to nature, and can thus help garner support for climate action90,91. People who feel strong emotional connections to others, for example, tend to care more about climate change and its impacts on society, and also engage more often with climate issues92. The collective emotional connection of rural communities to a place can also be reinforced after natural disasters such as tornadoes or floods, thus spurring more forceful local climate action93. The extent to which increased uses of AI in digital social networks (Fig. 1) affect empathy could thus prove important.

Fig. 1: A Simplified version of information sharing and communication patterns in a social network. Such patterns affect perceptions, emotions, and individual and collective behavior. B An extended digital social network where such information sharing and communication patterns are influenced and mediated through different AI applications, for example, extracted emotion-related data, synthetic content (text and images) designed to induce emotional responses, and/or algorithmic systems and automation that amplify or dampen emotionally charged online content. The blue circles highlight that the specific patterns of information sharing and communication, as well as any emerging perceptions, emotions, and behaviors, take place within a broader context in which individuals and groups are embedded (e.g., socio-cultural or biophysical). Such contexts can exert significant influence, leading to context-specific behaviors in both (A) and (B). C Connects the changed socio-technological context of climate emotions to a selection of psychological mechanisms linked to various aspects of climate action. Illustration by Azote based on Elsa Wikander/Azote, Figs. 1 and 3 in ref. 25.
Digital platforms seem to allow the general public to form emotional bonds to non-human species in new ways (e.g., refs. 94,95), thus further contributing to a sense of urgency around climate action. The use of AI methods to help analyze bioacoustics data has been argued to underpin new forms of empathy toward non-human animals96. A less explored dimension of AI and climate emotions relates to the possibility of generating synthetic content—ranging from text to sound and video—in ways that are personalized, and that explicitly support digital empathy toward others, non-human species, and future generations. While early attempts at using GenAI to create stronger emotional bonds to climate change issues do exist97,89, these are all at the experimental stage.
Technological developments in general may, however, undermine emotional connections to the “natural” world insofar as people have less direct experience of nature90,98. The expansion of digital farming, for example, has been suggested to lead to the loss of local knowledge and of emotive connections to the natural world99, both important sources of social resilience to climate change.
Taken together, affective information reaching more people via digital media, AI-supported emotion analysis, and novel uses of GenAI could lead to the extension of empathy. They could also foster the diffusion of climate norms and behavior through digitally amplified “political icons.” On the other hand, these developments may also undermine such emotional connections and behaviors as people have less direct experience of nature.
Below, we summarize our argument, illustrated as an extension of frameworks presented in refs. 25 (p. 199) and 33 (p. 1079). Figure 1A shows a simplified version of a social network where perceptions, emotions, and behavior are affected primarily by communication and interactions within a small social group. Figure 1B shows an expanded digital social network, where more individuals and groups become connected, while social interactions are increasingly mediated by uses of AI. Figure 1C illustrates the hypothesized interplay between AI technologies, selected psychological mechanisms, and their links to climate emotions and action.
Conclusion and research frontiers
The diffusion of information about climate change unfolds within an increasingly “emotionalized” and automated media environment. However, the complex causal connections between individual and collective behavior and the diverse contexts and technological systems in which they are embedded do not allow for simple causal claims.
There are a number of methodological challenges in this domain: for example, the limited external validity of experiments in artificial settings, difficulties in distinguishing between behavioral intentions and real-world behavior, and the lack of robust measurements of emotions5,13. Studies based on computational analyses of real-world social media data (e.g., refs. 100,101,102) offer a promising approach to systematically assess the medium-term collective effects of emotion propagation, although access to such data from social media companies has recently become more limited (e.g., Twitter/X). Such approaches could be complemented with other context-sensitive methodological approaches and data, such as controlled experiments and pre-registered surveys42, and hypothesis testing and generation using artificial agents that may include emotion dynamics103.
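To illustrate what hypothesis generation with artificial agents could look like, the sketch below compares collective “anger” under a random feed and under a feed that always shows the most emotionally charged posts. All agent rules, parameters, and feed mechanisms are assumptions made for this illustration; the sketch does not implement any published model, but it suggests testable hypotheses about algorithmic amplification.

```python
# Illustrative agent-based sketch for hypothesis generation: how might an
# engagement-optimizing feed change collective "anger" compared to a random
# feed? Agent rules, parameters, and feed mechanisms are assumptions for
# demonstration only; they do not implement any model from the literature.
import random

random.seed(7)

N_AGENTS = 200
N_STEPS = 50
FEED_SIZE = 10
SUSCEPTIBILITY = 0.2   # how strongly agents drift toward the tone of their feed
DECAY = 0.002          # emotions fade slowly without reinforcement

def run(rank_by_emotion: bool) -> float:
    """Return mean anger after N_STEPS under a ranked or a random feed."""
    anger = [random.uniform(0.0, 0.4) for _ in range(N_AGENTS)]
    for _ in range(N_STEPS):
        if rank_by_emotion:
            # Engagement-optimizing feed: the most charged posts are shown to everyone.
            feed = sorted(anger, reverse=True)[:FEED_SIZE]
        else:
            # Baseline feed: a random sample of posts is shown.
            feed = random.sample(anger, FEED_SIZE)
        exposure = sum(feed) / len(feed)
        anger = [
            min(1.0, max(0.0, a + SUSCEPTIBILITY * (exposure - a) - DECAY))
            for a in anger
        ]
    return sum(anger) / N_AGENTS

print(f"random feed: mean anger = {run(False):.2f}")
print(f"ranked feed: mean anger = {run(True):.2f}")
```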
We propose three research areas that require further multi- and inter-disciplinary elaboration to examine the increasing intersection of emotions, AI, and climate action.
First, there is a need to further explore the causal relationship between emotions induced or amplified by engagement with digital media (e.g., social media posts, online news, synthetic content) and climate strategies and actions across different contexts (e.g., political, cultural). While such digital content can trigger immediate affective responses, it remains unclear to what extent these emotive experiences have long-lasting effects on behavior, and how context-specific these effects are75,104. For example, do different types of content (e.g., images/video, authentic/GenAI-produced, neutral framing/negative framing) induce different types of emotional responses? Does the emotion triggered depend on the sender (e.g., family member, user on social media, AI-augmented chatbot)? Does a specific emotion (say, anger) translate into some specific form of online behavior (e.g., sharing, commenting), and how much is this determined by users’ ideological predisposition (e.g., liberal, conservative, green, or left-leaning)?
Second, analyzing the interplay and feedback between algorithmic systems and emotions remains highly challenging, and algorithmic effects are far from straightforward to isolate33,40,67. There is therefore an urgent need to advance new methods and multidisciplinary approaches—such as agent-based models and network approaches—to unpack these complex connections between emotions and behavior mediated by AI technologies. Critical questions here relate not only to long-term impacts, but also to which types of AI technologies to include (e.g., social bots, ranking algorithms, production of synthetic media), and at what scale (e.g., small controlled experiments and/or social media activity around international events). The literature in this domain is growing, but has yet to explore issues relevant to climate action.
Third, it is possible to develop direct applications of AI to assess and induce emotional responses with the ambition of contributing to improved climate literacy and spurring climate action, including new forms of AI-augmented climate communication (e.g., ref. 105), and the use of immersive technologies (e.g., ref. 106). While such intentional uses are intriguing, ethical concerns and legal issues remain a pressing challenge. The use of AI to induce emotive responses could be seen as a form of psychological manipulation, particularly if individuals are not aware of the technology’s use (e.g., refs. 107,108). Thus, any attempt to leverage emotional AI technologies to support climate action must carefully consider their complex legal and ethical implications109.
The extraction of emotion-related data, uses of emotion analysis and AI, and the creation and dissemination of emotional texts, images, and videos to influence human emotions are likely to increase significantly in the coming years. The abilities of AI technologies, including GenAI, to analyze and influence emotional responses may continue to improve, allowing for more targeted and sophisticated applications. The reliability of applying AI to such data remains controversial and will continue to raise challenging legal and ethical concerns. Assessing their impacts requires novel inter-disciplinary approaches and methods, such as the ones we have presented here. We hope that this overview offers some insights, and new entry points as to why this is both an important and urgent undertaking.