What patients and caregivers want to know when consenting to the use of digital behavioral markers

Introduction

Artificial intelligence (AI)-based computer perception (CP) technologies, including digital phenotyping [1,2,3,4], affective computing [5, 6], and computational behavioral analysis [7,8,9], are increasingly integrated into clinical decision-making across clinical domains, most notably in psychology and psychiatry [10]. These technologies use passive, continuous data collection via device sensors (e.g., microphones or accelerometers on smartphones and wearables) to provide insights into patients’ emotional and behavioral functioning during clinical assessments, research, and daily life. Depending on the context, clinical condition, or types of devices employed, a wide range of personal data can be collected, including data not directly related to the main study or clinical assessment, such as visual appearance, movements and morphology, speech, geolocation, accelerometry, phone screen capture/usage, call and text logs, social media content, physiological and even neural activity [6]. This is especially true for devices employed outside of the clinic, where an “ecological” approach allows providers to understand patients’ behavior and mood in their natural environments. Combining these data with remotely administered self-report surveys (“ecological momentary assessments” or EMAs) and standardized clinical assessments may improve providers’ ability to detect certain clinical conditions, assess their severity, and guide personalized treatment recommendations [11,12,13].

While clinical inference is the primary goal for using CP technologies in care, these tools can also lead to incidental or secondary observations that may be sensitive and not yet consented to. Ethicists have raised concerns about patient privacy and the need for informed consent to the use of such sensitive data [14,15,16]. However, it remains unclear what patients need or want to know about AI-based CP tools, including their potential impacts on privacy and care quality, to provide meaningful consent [17, 18]. Understanding patients’ informational needs is critical for developing targeted education and consent procedures that address their concerns [19,20,21]. As we argue elsewhere [22, 23], these understandings can support participatory approaches to co-designing treatment plans that enhance patient engagement, autonomy and empowerment [24, 25]. This paper aims to sensitize researchers and practitioners to what patients and caregivers need to know to meaningfully consent to the use of CP technologies in clinical care.

Background

Little is known about the range of patients’ and caregivers’ questions and concerns regarding the use of AI-based technologies using CP data in clinical care. Approaches like digital phenotyping and computational behavioral analysis combine digital sensors with algorithms to analyze vast amounts of data [26]. This technological complexity and data density, as well as the nature of data collection in both clinical and ecological settings, introduce questions about how (and what) to communicate to patients about risks and benefits. Beyond concerns around privacy highlighted in the existing literature [14, 27, 28], many patients may not fully understand what data are collected, what inferences may be made, how data are stored, used or shared, and other ethical and practical considerations that may impact patients’ informed consent decisions [15].

In their “ethics checklist” designed to help researchers establish procedural safeguards around digital phenotyping, Shen et al. [29] include three key questions related to informed consent, namely whether researchers have 1) “appropriately adapted [their] informed consent procedures to [the] specific study population, including possible use of surrogate consent”; 2) provided “background education on relevant technologies, such as explaining what social media companies may already be doing with the participant’s data”; and 3) “determined what a reasonable person would want to know, and explained in [their] IRB proposal the evidence on which [they] reached that determination.” To date, there is limited empirical data to provide robust answers to these questions, leaving gaps in understanding patients’ and caregivers’ informational needs when considering the use of digital phenotyping or related approaches in their care. It also remains unclear how consent procedures should be tailored to different stakeholders, considering factors like age, role (e.g., patients versus surrogate decision-makers), socioeconomic status, and other sociodemographic variables. Effective informed consent requires understanding patients’ and caregivers’ knowledge gaps about the purpose and impacts of these technologies [30]. This paper presents insights from interviews with adolescents and caregivers, offering an overview of their informational needs to help researchers and providers develop educational strategies that respect patient autonomy and enhance patient engagement in using CP technologies in ways that align with treatment goals.

Methods

As part of a 4-year study funded by the National Center for Advancing Translational Sciences (R01TR004243), we conducted in-depth, semi-structured interviews with adolescent (n = 20) and caregiver (n = 20) dyads to explore their perspectives on potential benefits, risks and concerns around the integration of CP technologies into their care. Respondents were recruited from a “sister” study (5R01MH125958) aiming to validate CP tools designed to quantify objective digital biobehavioral markers of socio-emotional functioning.

Participants

Participants included a clinical sample of adolescents (aged 12–17 years) with varied diagnoses, including autism, Tourette’s, anxiety, obsessive-compulsive disorder, and Attention Deficit/Hyperactivity Disorder (ADHD), as well as their caregivers (typically biological parents, Table 1). Diagnostic presentations for all adolescents were confirmed by expert providers using a battery of established clinical measures. Adolescent-caregiver dyads were referred to the current study by the sister study’s coordinator and then contacted by a research assistant (AS) via phone or email to schedule an interview. Participants were interviewed between January 2023 and August 2023.

Table 1 Demographics for Interviewed Adolescents and Caregivers.

Data collection

Separate but parallel interview guides were developed for adolescents and their caregivers (KK-Q, AS, MH), with the same constructs explored across both stakeholder groups, including: perceived benefits and concerns regarding integrating CP tools into clinical care; impacts on care; attitudes towards automatic and passive detection of emotional and behavioral states; perceived accuracy and the potential for misinterpretation, misattribution or misclassification of symptoms or conditions; clinical utility and actionability; data security and privacy concerns; potential for unintended uses; and perceived generalizability and potential for bias. These domains were chosen based on issues raised in the clinical and ethics literatures (see Background) and with the guidance of experienced bioethicists and child mental health experts. Initial drafts of the interview guides were piloted with two psychologists (ES, CJZ) specializing in adolescent mental health, resulting in minor clarifications in wording. Interviews were conducted by a research assistant (AS) via a secure video conferencing platform (Zoom for Healthcare) and lasted an average of ~45 min. This study was reviewed and approved by the Baylor College of Medicine Institutional Review Board (H-52227), which waived the requirement for written consent; thus, participants provided verbal consent.

Data analysis

Interviews were audio-recorded, transcribed verbatim, and analyzed using MAXQDA software. Led by a qualitative methods expert (KK-Q), team members (AS, MH) developed a codebook to identify thematic patterns in adolescent and caregiver responses to questions addressing the topics above. Each interview was coded by merging the work of two separate coders (AS, MH) to reduce interpretive bias and enhance reliability. We used Thematic Content Analysis [31] to inductively identify themes by progressively abstracting relevant quotes, a process that entails reading every quotation to which a given code was attributed, paraphrasing each quotation (primary abstraction), identifying which constructs were addressed by each quotation (secondary abstraction), and organizing constructs into themes. To enhance the validity of our findings, all abstractions were validated by at least one other member of the research team. We then calculated the number of respondents who discussed each theme to obtain a sense of the proportion of stakeholders raising or discussing each concern. In rare cases where abstractions reflected different interpretations, members of our research team met to reach consensus. Frequencies and percentages are reported as descriptive rather than inferential statistics and are not intended to suggest any level of statistical significance.
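To make the frequency tabulation step concrete, the sketch below illustrates one way per-theme respondent counts and percentages (as reported in Table 2) could be derived from coded segments. The record format, respondent identifiers, and group sizes are illustrative assumptions; the actual analysis was performed on MAXQDA-coded transcripts rather than with this script.

```python
from collections import defaultdict

# Hypothetical coded-segment records: (respondent_id, stakeholder_group, theme).
# Each record corresponds to one coded quotation; identifiers are illustrative.
coded_segments = [
    ("A01", "adolescent", "Clinical Utility and Value"),
    ("A01", "adolescent", "Data Security, Privacy and Misuse"),
    ("C01", "caregiver", "Clinical Utility and Value"),
    ("C02", "caregiver", "Patient Consent, Control and Autonomy"),
]

GROUP_SIZES = {"adolescent": 20, "caregiver": 20}  # n per stakeholder group


def theme_frequencies(segments):
    """Count unique respondents per (theme, group) and convert to percentages."""
    respondents = defaultdict(set)  # (theme, group) -> set of respondent ids
    for respondent_id, group, theme in segments:
        respondents[(theme, group)].add(respondent_id)
    return {
        key: (len(ids), 100 * len(ids) / GROUP_SIZES[key[1]])
        for key, ids in respondents.items()
    }


for (theme, group), (n, pct) in sorted(theme_frequencies(coded_segments).items()):
    print(f"{theme} [{group}]: n = {n} ({pct:.0f}%)")
```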

Results

In order of salience (frequency discussed, see Methods), adolescents and caregivers highlighted questions and concerns across seven domains: 1) Clinical Utility and Value; 2) Evidence, Explainability, Evaluation and Contestation; 3) Accuracy and Trustworthiness; 4) Data Security, Privacy and Misuse; 5) Patient Consent, Control and Autonomy; 6) Physician-Patient Roles and Relationship; and 7) Patient Safety, Security, Well-being and Dignity. While both adolescents and caregivers expressed informational needs and concerns across these domains, the greatest difference between the two groups concerned the importance of ensuring informed consent, control and autonomy over which data are collected and how data are integrated into care, which caregivers emphasized more often (75%) than adolescents (35%) (see Table 2). Caregivers (95%) also had more questions than adolescents (65%) about the clinical utility and value of CP for care. Adolescents (90%), on the other hand, expressed a greater desire than caregivers (60%) to know why and how certain inferences are made from their data and wished to preserve the capacity to contest inferences that may contradict their own subjective experiences. Details are discussed in turn below, and relevant quotations illustrating each theme are presented in Tables 3–9.

Table 2 Observed Frequencies of Stakeholders’ Informational Needs about Integrating Computer Perception into Clinical Care.
Table 3 Questions about Clinical Utility and Value.
Table 4 Questions about Evidence, Explainability, Evaluation and Contestation.
Table 5 Questions about Accuracy and Trustworthiness of CP.
Table 6 Questions about Data Security, Privacy and Misuse.
Table 7 Questions about Patient Consent, Control and Autonomy.
Table 8 Questions about Provider-Patient Relationship and Role.
Table 9 Questions about Patient Well-Being, Safety and Dignity.

Clinical utility and value

Nearly all caregivers (95%) and most adolescents (65%) expressed a need for information about the clinical utility and value of integrating CP tools into care. Eight adolescents (40%) and 15 caregivers (75%) wanted to know what tangible improvements CP would offer, the justifications for using it in their care, and how actionable CP would be in influencing the direction of treatment. Around half of the caregivers sought specific justifications for using CP in their child’s care in order to understand its clinical utility, and wanted to know how providers gauge the clinical significance of the data being collected. Six adolescents wanted to know whether CP tools may help them gain greater self-awareness of their symptoms and unconscious behaviors. Both groups had questions about CP’s potential to improve providers’ ability to detect or predict future disorders. Overall, the informational needs in this domain focus on understanding CP’s tangible benefits, expected feedback, and the clinical rationale behind data collection.

Evidence, explainability, evaluation and contestation

Almost all adolescents (90%) and over half of caregivers (60%) raised questions about how clinical inferences are made from CP data and wanted the ability to challenge data that might contradict a patient’s subjective experience. Eleven caregivers and six adolescents suggested that a basic understanding of CP algorithms could help them assess accuracy and trustworthiness and contest inferences with which they disagree. Seven respondents wondered whether providers would explain how CP algorithms derive inferences (e.g., what pieces of information are used to classify a diagnosis) and what evidence would support their validity. One caregiver emphasized the importance of seeking confirmatory evidence before making significant clinical decisions based on CP outputs. However, 12 caregivers and nine adolescents expressed that they did not necessarily need to understand the algorithm’s workings, trusting their provider’s judgment instead. These findings illustrate the need for information on the trustworthiness and explainability of CP algorithms and on available corroborating evidence, as well as for preserving the right to contest algorithmic conclusions.

Accuracy and trustworthiness of CP

Over half of adolescents and 17 caregivers expressed informational needs regarding the accuracy, clinical validity, bias, and relevance of CP tools in healthcare. A few adolescents (three) wanted details about the quality and inclusivity of training data sets – specifically the distributions of race/ethnicity, age and gender – before consenting to CP use in their care. Around half of caregivers sought similar information, including success rate data from validation studies, the developmental phase of the technologies, and whether patient perspectives were considered during development. Both groups also wanted to know how contextual factors, such as weather, schoolwork, socioeconomic circumstances or environmental insecurity, are accounted for in data interpretation.

Data security, privacy, misuse

A majority of adolescents (60%) and caregivers (75%) had informational needs related to what types of data and how much data would be collected, who might access or use their data, how the data would be protected against unwanted uses or triangulation with other personal data, and how patients would be safeguarded from stigma or discrimination based on the data collected. About a third of participants (eight caregivers and five patients) emphasized the importance of data privacy, with some viewing CP data as more sensitive and deserving of greater protection and security than other physiological data (e.g., heart rate, blood glucose). Over half of all caregivers and adolescents expressed concern over potential unintended uses of their data and sought assurances that using CP tools would not negatively impact the patient now or in the future. Caregivers, in particular, worried about stigma and discrimination if the data were accessible to future schools, employers or the justice system. Some patients preferred that privacy guidelines be clearly communicated early in the consent process to ensure informed consent when using CP.

Patient consent, control and autonomy

Three quarters (75%) of caregivers and approximately a third (35%) of adolescents highlighted concerns about transparency in data collection and their rights to participate in decisions about using CP tools in care. Both groups wanted to be informed about when CP would be used in their care and questioned whether they would be afforded a choice over which data are collected, which feedback is returned, and whether they would have access to data, inferences and summaries. Preferences varied, with some wanting full access to the patient’s raw data to track changes and identify patterns, while others preferred to receive only summaries or alerts of significant changes. Some adolescents (five) and caregivers (six) wanted to know how involved they would be in decisions regarding data disclosure to providers. Certain adolescents (five) viewed themselves as owners of their data, wanting the right to directly grant or refuse access. Around three-quarters of caregivers and adolescents (13 adolescents, 16 caregivers) expressed a desire to restrict data access to providers and research teams, excluding corporations. Others (five adolescents and six caregivers) wanted the capacity to vet which information is shared, preferring that device interfaces share results with patients first and ask permission before sending data to providers. Over half of all respondents (14 adolescents and 11 caregivers) were concerned about being able to turn off passive monitoring in intimate, private or atypical settings.

Physician-patient relationship / Role of a physician while employing CP

A total of 11 adolescents (55%) and 12 caregivers (60%) had questions about the role CP data would play in their clinical care, how providers would ensure that personal experiences remain prioritized in CP-guided care, and how CP tools might impact their interactions with providers, especially when CP contradicts patients’ self-reported experiences. Some respondents (seven caregivers and three adolescents) raised concerns that providers may over-rely on data in clinical decision-making and sought assurances that patients’ subjective experiences would still be central to care. One caregiver suggested that while CP data should be considered, providers should use CP outputs to inform questions and discussion points with patients, rather than stand in for their perspectives. Five respondents (two caregivers and three adolescents) expressed concerns about CP tools disrupting daily life, and six caregivers felt that CP use might over-extend the provider-patient relationship, potentially crossing boundaries if providers could automatically detect a patient’s emotions. These informational needs address the preservation of patient-centered care in a data-driven environment, ensuring that patients feel heard, seen and autonomous.

Patient safety, well-being and dignity

Five adolescents and ten caregivers wanted to know what measures are in place to ensure CP technologies are officially sanctioned, respect diverse expressions and experiences of distress, and maintain patients’ dignity in social settings. A small number of caregivers (four) shared that they would like to use CP as a way to prioritize patient safety by becoming aware when their child might be putting themselves in danger during a mental health crisis. Another caregiver suggested that CP might help ensure clinical attention for patients who may not otherwise be able or willing to communicate when they are experiencing anxiety, sadness, or depression. On the other hand, some adolescents expressed that this must be balanced with allowing adolescents to deal with their emotions independently, when appropriate, and not using CP as an additional means of parental monitoring. In addition to protecting patient safety, some caregivers (four) also expressed wanting to preserve their child’s dignity and prevent stigma associated with wearing CP devices. Some worried about CP devices drawing unwanted attention to patients in ways that may be experienced as stigmatizing, particularly among adolescents undergoing numerous developmental and social changes.

Discussion

This paper aims to inform providers and researchers about the informational needs and preferences of adolescent patients and their caregivers regarding the use of CP tools in their care. While ethical discussions to date have focused on data security, privacy and misuse of sensitive data [14, 15, 20, 24, 27], our findings highlight that questions about the clinical utility and value of CP technologies are equally important. Adolescents and caregivers seek clear justifications for the data collection enabled by CP tools, questioning whether the clinical benefits outweigh the privacy risks associated with generating large amounts of digital behavioral data. Both groups also wanted to know how inferences will be personalized to account for significant diversity and variability in the meaning (clinical significance) of behavioral patterns.

These knowledge needs echo ethical considerations highlighted by Shen et al. [29], who recently called for ethical frameworks to safeguard patient rights and address procedural and regulatory inconsistencies in CP research. They proposed a checklist of ethical considerations covering issues related to equity, diversity, access, privacy, research-industry partnerships, legal concerns, and return of results. Three of their key questions focused on establishing and adapting informed consent to meet the specific needs of various study populations, including surrogate decision-makers. This paper provides the first empirical insights into the informational needs of patients and caregivers considering the integration of CP-related approaches into care. Although focused on adolescents in clinical research, these findings are likely relevant across different study populations. In Table 10, we present an Informational Needs checklist aimed at guiding providers and researchers in providing information to patients and research participants during patient education and informed consent, followed by suggestions and challenges for effectively addressing these needs. We hope that this checklist serves as a guideline, rather than an exhaustive or mandatory list, for providers to tailor their discussions based on a patient’s needs and values. Our hope is to support the delivery of patient-centered care while remaining responsive to the specific informational needs of patients and caregivers.

Table 10 Checklist of Stakeholder Informational Needs about Integrating Computer Perception Technologies into Clinical Research and Care.

Clearly convey clinical utility of CP

The emphasis on the clinical utility and value of CP in care reflects a desire to better understand the risk-benefit calculus of using sensitive and private information beyond traditional clinical settings. Caregivers, all biological parents in this study, seek clear clinical justifications for using an intervention they view as new and investigational. This finding is noteworthy given the strong emphasis in the literature on privacy as a central ethical concern of these technologies and the relative under-emphasis on demonstrating the clinical utility of CP tools for benefiting patient care [16, 32, 33]. While patients using CP may exchange privacy for effective treatment, as is done in other areas of medicine (e.g., gynecology [34]; gastroenterology [35]), they are currently asked to accept CP’s invasiveness without sufficient empirical evidence of its benefits. As CP technologies move from research to regular care, patients are likely to demand stronger clinical justification for data collection outside the clinic. Providers must demonstrate that CP offers significant informational advantages over conventional methods and clarify for which conditions CP is appropriate.

Address trustworthiness concerns

Both respondent groups also had questions about the validity of CP inferences, i.e., whether they accurately reflect a patient’s subjective experiences and behaviors. Many sought to understand how algorithms generate conclusions (explainability), how interpretation of outputs is complicated by patient heterogeneity and comorbidities, and how these outputs affect providers’ interpretations of a patient’s condition or risk for future conditions (interpretability and actionability). They also had questions about how providers and patients could evaluate or contest the accuracy and validity of CP outputs (contestability). Adolescents were more concerned than caregivers about understanding how conclusions are made and wanted the ability to challenge outputs that may contradict subjective experience. Both groups also questioned how CP technologies might impact the patient-provider relationship, especially if providers prioritize algorithmic insights over patient perspectives, particularly when these conflict. For some, this raised concerns about the dehumanization of care through an overreliance on technology; others seemed to want to retain epistemic authority over the meaning of their symptoms and experiences.

These findings reflect a justified reluctance to trust algorithmic interpretations over patients’ own understanding of their condition. Current limitations in algorithmic accuracy, such as bias in training data, lack of generalizability, data gaps, and misuse of proxies, necessitate careful interpretation of CP results [36,37,38,39]. Researchers and providers must be prepared to clearly communicate the accuracy, relevance and epistemological limitations of these tools, including their potential for bias, ensuring that patients understand these constraints. Since CP outputs remain investigational, many lack demonstrated validity, reliability and generalizability beyond small datasets. Further, the assumption that these are objective metrics remains contested. Their clinical utility remains unverified and likely varies by condition and patient group, which limits providers’ ability to reassure patients of the accuracy and validity of CP outputs.

Further, even if outputs were indeed valid, accurate and generalizable, research is lacking on how to effectively communicate their meaning in patient education and care. Outputs from CP algorithms, like those from other statistical and machine learning models, typically consist of probabilities (e.g., of risk) calculated from numerous quantified behavioral observations. Research on risk communication suggests that when confronted with probabilistic risk estimates, patients tend to overestimate [40] or underestimate their risk [41], and struggle to integrate multiple probabilities into an overall understanding of risk [40]. These difficulties may be influenced by health literacy [20] or the presence of a clinician who can help interpret algorithmic outputs, similar to a genetic counselor. Emerging literature [42, 43] suggests that algorithmic outputs should include additional information, such as risk variation across patient populations or clinical sites, confidence intervals, explanatory variables, and alternative information sources, to enhance and contextualize interpretations. Multistakeholder collaboration across the development, evaluation, and deployment of CP tools might increase construct validity and mitigate sources of bias, which can inform best practices for communicating results and providing feedback to patients. More research is needed to determine how best to present these outputs in ways that facilitate understanding for both patients and providers, enabling clinical action where appropriate.
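As one illustration of this recommendation, the sketch below shows how a single CP output might be packaged with the contextual fields the risk-communication literature suggests (a confidence interval, the reference population, explanatory variables, and alternative information sources). The class, field names, and example values are hypothetical and are not drawn from any existing CP tool.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class CPRiskReport:
    """Hypothetical container for one CP output, bundling the contextual
    fields suggested in the risk-communication literature (all names are
    illustrative, not part of any existing CP tool)."""
    outcome: str                              # what the model is estimating
    risk_estimate: float                      # point estimate, 0-1
    confidence_interval: Tuple[float, float]  # plausible range, 0-1
    reference_population: str                 # population/site the estimate was validated on
    explanatory_variables: List[str] = field(default_factory=list)
    alternative_sources: List[str] = field(default_factory=list)  # e.g., clinical interview, EMA

    def patient_summary(self) -> str:
        """Render a plain-language summary a provider could review with a patient."""
        low, high = self.confidence_interval
        return (
            f"Estimated likelihood of {self.outcome}: {self.risk_estimate:.0%} "
            f"(plausible range {low:.0%}-{high:.0%}), based on data from "
            f"{self.reference_population}. Main contributing signals: "
            f"{', '.join(self.explanatory_variables)}. Consider alongside: "
            f"{', '.join(self.alternative_sources)}."
        )


report = CPRiskReport(
    outcome="a period of elevated anxiety in the coming month",
    risk_estimate=0.32,
    confidence_interval=(0.21, 0.45),
    reference_population="adolescents at two urban clinics",
    explanatory_variables=["reduced speech rate", "lower nighttime mobility"],
    alternative_sources=["clinical interview", "self-report (EMA)"],
)
print(report.patient_summary())
```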

Providers should also be ready to address conflicts between algorithmic outputs and patients’ experiences. Several respondents expressed a desire to challenge algorithmic conclusions that could impact diagnosis and treatment plans; however, it remains unclear how best to reconcile conflicts between algorithms and patients, as well as between algorithms and providers. No protocols yet exist for handling scenarios where an algorithmic output aligns with a provider’s human judgment but not a patient’s, or when outputs from a validated algorithm challenge a provider’s judgment but perhaps align with a patient’s experience. Further research and ethical consideration are required to equip clinical and research teams to manage these conflicts and explore whether patients’ rights to challenge validated algorithmic conclusions differ from their rights to contest traditional clinical assessments.

Provide patients with a roadmap for integrating CP into clinical care/research

Adolescents in particular raised questions about the role of CP data in clinical assessments and decision-making, and wondered how much access they would have to the clinical inferences drawn from their data. Some feared that too much or too frequent feedback might be disruptive, while too little might feel like surveillance without benefit. These concerns highlight varying preferences for feedback, suggesting that clinical teams should tailor feedback strategies to individual needs. As we have argued elsewhere [22], personalized “roadmaps” balancing patient preferences with clinical reasoning may enhance the integration of CP into care.

Advocate for and participate in maintaining data privacy and protections

Our findings also emphasize the need to assure patients that their data will be securely collected, stored, and protected from unwanted disclosure. Both groups expressed concerns about data security, privacy, and misuse. Providers and researchers should inform patients about data use and the risks of unintended disclosure [15, 25]. However, their ability to do so is limited by providers’ own awareness of such risks and capacity to mitigate them, as well as by varying institutional data security norms and practices. Current guidelines for managing CP data remain unclear, spurring some researchers (e.g., Muurling et al. [20]) to call for greater attention to harmonizing research methodologies, data management, and responsible stewardship of CP data. These data often differ from more traditional forms of health data in that they are not always directly or exclusively clinical in nature and often do not receive the same legal protections, leaving them more vulnerable to misuse by third-party controllers (e.g., targeted marketing, discrimination or exploitation) [44]. Further, they are often generated in the context of research-industry partnerships, leading to uncertainties about data storage, ownership and decision-making rights, and gaps in legal protection under the Health Insurance Portability & Accountability Act (HIPAA [45]) in the U.S. or Europe’s General Data Protection Regulation (GDPR [46]). This could be problematic for the several patients and caregivers who wanted to know what resources for legal redress might exist in cases of unintended use or disclosure of sensitive data or inferences, as to date there appear to be few.

Similarly, the growing risk of re-identification, where de-identified data can be traced back to individuals, further challenges data security efforts [47, 48]. Providers and researchers often lack the cybersecurity expertise needed to address these vulnerabilities effectively, suggesting a need to consult with specialized bioinformatics and cybersecurity professionals in clinical research or care involving CP. Research teams should allocate resources to include these experts, while providers should advocate for institutional support to mitigate data risks. Addressing these challenges will likely require significant shifts in funding and prioritization to protect patient data and ensure ethical data stewardship.

Limitations

While our study captures a broad range of informational needs expressed by patients and caregivers regarding the ethical integration of CP tools in healthcare, the findings may not be generalizable to patients with different clinical and demographic profiles. We interviewed adolescents with a range of psychiatric diagnoses, along with their adult caregivers, across two different urban clinical sites. Further research is needed to explore informational needs that may be unique to individuals in other geographical regions, age groups, and condition types.

Conclusion

This study provides an empirically derived checklist of key informational needs expressed by adolescents and caregivers regarding the use of CP tools in clinical research and care. This checklist is intended to help providers and researchers anticipate and address questions critical for ensuring informed consent. However, uncertainties around the clinical utility of CP tools, knowledge gaps in how best to communicate these issues to patients, and evolving data control and cybersecurity challenges limit providers’ and researchers’ ability to fully address many of these concerns. This study offers suggestions to help address these questions within the current landscape of clinical knowledge, risk communication approaches and data protections relevant to CP technologies.

Citation diversity statement

The authors have attested that they made efforts to be mindful of diversity in selecting the citations used in this article.
