Related Articles

Pathogen stress heightens sensorimotor dimensions in the human collective semantic space

Infectious diseases have been major causes of death throughout human history and are assumed to broadly affect human psychology. However, whether and how conceptual processing, an internal world model central to various cognitive processes, adapts to such a salient stressor remains largely unknown. To address this, we conducted three studies examining the relationship between pathogen severity and semantic space, probed through the main neurocognitive semantic dimensions revealed by large-scale text analyses: one cross-cultural study (across 43 countries) and two historical studies (over the past 100 years). Across all three studies, increasing pathogen severity was associated with an enhancement of the sensorimotor dimension in the collective semantic space. These patterns remained robust after controlling for sociocultural variables, including economic wealth and societal norms of tightness. The results point to universal dynamic mechanisms of collective semantics, suggesting that pathogen stress drives sensorially oriented semantic processing.
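
For concreteness, the sketch below shows one way a sensorimotor dimension could be probed in a word-embedding semantic space: project word samples onto an axis defined by sensorimotor versus abstract seed words. The seed lists, the GloVe vectors, and the projection approach are illustrative assumptions, not the method reported in the article.

import numpy as np
import gensim.downloader as api

# Pretrained word vectors stand in for the large-scale text analyses in the study.
vectors = api.load("glove-wiki-gigaword-100")

# Hypothetical seed words defining a sensorimotor-versus-abstract axis.
sensorimotor_seeds = ["touch", "grasp", "taste", "smell", "walk"]
abstract_seeds = ["idea", "justice", "theory", "belief", "logic"]

axis = (np.mean([vectors[w] for w in sensorimotor_seeds], axis=0)
        - np.mean([vectors[w] for w in abstract_seeds], axis=0))
axis /= np.linalg.norm(axis)

def sensorimotor_loading(words):
    """Mean projection of a word sample onto the sensorimotor axis."""
    vecs = [vectors[w] / np.linalg.norm(vectors[w]) for w in words if w in vectors]
    return float(np.mean([v @ axis for v in vecs]))

# Compare two illustrative word samples, e.g. drawn from different countries or decades.
print(sensorimotor_loading(["hand", "fever", "wash", "skin"]))
print(sensorimotor_loading(["policy", "market", "concept", "debate"]))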

The radiogenomic and spatiogenomic landscapes of glioblastoma and their relationship to oncogenic drivers

Glioblastoma is a highly heterogeneous brain tumor, posing challenges for precision therapies and patient stratification in clinical trials. Understanding how genetic mutations influence tumor imaging may improve patient management and treatment outcomes. This study investigates the relationship between imaging features, spatial patterns of tumor location, and genetic alterations in IDH-wildtype glioblastoma, as well as the likely sequence of mutational events.

Language measures correlate with other measures used to study emotion

Researchers are increasingly using language measures to study emotion, yet less is known about whether language relates to other measures often used to study emotion. Building on previous work that focuses on associations between language and self-report, we test associations between language and a broader range of measures (self-report, observer report, facial cues, vocal cues). Furthermore, we examine associations across different dictionaries (LIWC-22, NRC, Lexical Suite, ANEW, VADER) used to estimate valence (i.e., positive versus negative emotion) or discrete emotions (i.e., anger, fear, sadness) in language. Associations were tested in three large, multimodal datasets (Ns = 193–1856; average word count = 316.7–2782.8). Language consistently related to observer report, and consistently related to self-report in two of the three datasets. Statistically significant associations between language and facial cues emerged for language measures of valence but not for language measures of discrete emotions. Language did not show consistent significant associations with vocal cues. Results tended not to vary significantly across dictionaries. The current research suggests that language measures (in particular, measures of valence) correlate with a range of other measures used to study emotion. Researchers may therefore wish to use language to study emotion when other measures are unavailable or impractical for their research question.
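
As a concrete illustration of one such dictionary-based measure, the sketch below scores valence with VADER and correlates it with self-report; the tiny data frame and the column names ("text", "self_report_valence") are hypothetical placeholders, not data from the studies described.

import pandas as pd
from scipy.stats import pearsonr
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

# Hypothetical mini-dataset: each row pairs a text sample with a self-reported valence rating.
df = pd.DataFrame({
    "text": ["I had a wonderful, relaxing day.",
             "Everything went wrong and I am furious.",
             "It was fine, nothing special."],
    "self_report_valence": [6.5, 1.5, 4.0],  # e.g., on a 1-7 rating scale
})

analyzer = SentimentIntensityAnalyzer()
# VADER's compound score summarizes valence on a -1 (negative) to +1 (positive) scale.
df["language_valence"] = df["text"].apply(
    lambda t: analyzer.polarity_scores(t)["compound"])

r, p = pearsonr(df["language_valence"], df["self_report_valence"])
print(f"language/self-report correlation: r = {r:.2f}, p = {p:.3f}")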

Genetic architectures of childhood maltreatment and causal influence of childhood maltreatment on health outcomes in adulthood

Childhood maltreatment is increasingly recognized as a pivotal risk factor for adverse health outcomes, yet comprehensive analyses of its long-term impact are scarce. This study aims to fill this gap by examining the genetic architectures of childhood maltreatment and its influence on adult health and socioeconomic outcomes. Using data from the UK Biobank (N = 129,017), we conducted sex-combined and sex-stratified genome-wide association studies to identify genomic loci associated with five childhood maltreatment subtypes. We then performed genetic correlation and Mendelian randomization (MR) analyses to assess the effects of childhood maltreatment on high-burden diseases, healthcare costs, lifespan, and educational attainment. We identified several novel loci for childhood maltreatment: one locus for sexual abuse in the sex-combined analysis, one for sexual abuse in males, one for emotional neglect in females, and one for sexual abuse in females. Pairwise genetic correlations between childhood maltreatment subtypes were moderate to high, with similar patterns observed in males and females. After multiple-testing correction, childhood maltreatment was significantly genetically correlated with ten of 16 high-burden diseases. Moreover, MR analyses suggest that childhood maltreatment may increase the risk of age-related and other hearing loss, low back pain, major depressive disorder, and migraine in adulthood, and may reduce lifespan. Our study elucidates the genetic architecture of specific childhood maltreatment subtypes and the influence of childhood maltreatment on adult health outcomes, highlighting its enduring consequences for lifelong health. Developing prevention strategies that lower the incidence of childhood maltreatment, together with support and care for its victims, is important for improving long-term health outcomes in the population.
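
For readers unfamiliar with MR, the sketch below shows the standard inverse-variance weighted (IVW) estimator that such analyses commonly rely on; the per-SNP effect sizes are made-up placeholders, not estimates from this study.

import numpy as np

# Hypothetical per-SNP instrument effects on the exposure (a childhood maltreatment
# subtype) and on an outcome (e.g., major depressive disorder), with outcome standard errors.
beta_exposure = np.array([0.042, 0.055, 0.038, 0.061])
beta_outcome = np.array([0.015, 0.022, 0.011, 0.019])
se_outcome = np.array([0.004, 0.005, 0.004, 0.006])

# Fixed-effect IVW estimate: precision-weighted average of per-SNP Wald ratios.
weights = beta_exposure**2 / se_outcome**2
ivw_estimate = np.sum(beta_exposure * beta_outcome / se_outcome**2) / np.sum(weights)
ivw_se = np.sqrt(1.0 / np.sum(weights))

print(f"IVW causal estimate: {ivw_estimate:.3f} (SE {ivw_se:.3f})")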

A unified acoustic-to-speech-to-language embedding space captures the neural basis of natural language processing in everyday conversations

This study introduces a unified computational framework connecting acoustic, speech and word-level linguistic structures to study the neural basis of everyday conversations in the human brain. We used electrocorticography to record neural signals across 100 h of speech production and comprehension as participants engaged in open-ended real-life conversations. We extracted low-level acoustic, mid-level speech and contextual word embeddings from a multimodal speech-to-text model (Whisper). We developed encoding models that linearly map these embeddings onto brain activity during speech production and comprehension. Remarkably, this model accurately predicts neural activity at each level of the language processing hierarchy across hours of new conversations not used to train the model. The internal processing hierarchy in the model aligns with the cortical hierarchy for speech and language processing: sensory and motor regions better align with the model’s speech embeddings, and higher-level language areas better align with the model’s language embeddings. The Whisper model captures the temporal sequence of language-to-speech encoding before word articulation (speech production) and speech-to-language encoding after articulation (speech comprehension). The embeddings learned by this model outperform symbolic models in capturing neural activity supporting natural speech and language. These findings support a paradigm shift towards unified computational models that capture the entire processing hierarchy for speech comprehension and production in real-world conversations.
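
The sketch below illustrates the general form of such a linear encoding model: ridge regression from per-word embeddings to electrode activity, evaluated by correlation on held-out data. The random arrays, dimensions, and train/test split are placeholders; the study's actual pipeline uses Whisper embeddings and electrocorticography recordings from real conversations.

import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import RidgeCV

rng = np.random.default_rng(0)
n_words, n_dims, n_electrodes = 2000, 384, 64

# Placeholder word-level embeddings and simulated electrode activity.
embeddings = rng.standard_normal((n_words, n_dims))
true_weights = 0.1 * rng.standard_normal((n_dims, n_electrodes))
neural = embeddings @ true_weights + rng.standard_normal((n_words, n_electrodes))

# Fit a regularized linear map on earlier words, test on held-out later words.
split = int(0.8 * n_words)
model = RidgeCV(alphas=np.logspace(-2, 4, 7))
model.fit(embeddings[:split], neural[:split])
pred = model.predict(embeddings[split:])

# Encoding performance: correlation between predicted and recorded activity per electrode.
scores = [pearsonr(pred[:, e], neural[split:, e])[0] for e in range(n_electrodes)]
print(f"mean held-out correlation across electrodes: {np.mean(scores):.2f}")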
