Related Articles

Bayesian p-curve mixture models as a tool to dissociate effect size and effect prevalence

Much research in the behavioral sciences aims to characterize the “typical” person. A statistically significant group-averaged effect size is often interpreted as evidence that the typical person shows an effect, but this is true only under certain distributional assumptions for which explicit evidence is rarely presented. Mean effect size varies with both within-participant effect size and population prevalence (the proportion of the population showing the effect). Few studies consider how prevalence affects mean effect-size estimates, and existing estimators of prevalence are, conversely, confounded by uncertainty about effect size. We introduce a widely applicable Bayesian method, the p-curve mixture model, that jointly estimates prevalence and effect size by probabilistically clustering participant-level data based on their likelihood under a null distribution. Our approach, for which we provide a software tool, outperforms existing prevalence estimation methods when effect size is uncertain and is sensitive to differences in prevalence or effect size across groups or conditions.
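The core idea — treating each participant's p-value as drawn either from a uniform null distribution or from an alternative distribution shaped by the effect size, and inferring the mixing proportion (prevalence) jointly with that effect size — can be sketched in a few lines. The toy grid-posterior example below assumes one-sided z-test p-values and invented simulation settings; it illustrates the mixture idea only and is not the authors' published tool.

```python
# Toy sketch of a two-component p-curve mixture model (not the authors' code).
# Assumes each participant contributes a one-sided p-value from a z-test:
# under the null, p is Uniform(0, 1); under the alternative with standardized
# effect delta, the p-value density is phi(z - delta) / phi(z), z = Phi^{-1}(1 - p).
# Prevalence pi and effect size delta get flat priors on a grid posterior.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulate 100 participants: 60% show a true effect of delta = 1.5 (illustrative values).
true_pi, true_delta, n = 0.6, 1.5, 100
has_effect = rng.random(n) < true_pi
z = rng.normal(loc=np.where(has_effect, true_delta, 0.0), scale=1.0)
p_values = 1.0 - norm.cdf(z)

# Grid posterior over (pi, delta) with flat priors.
pis = np.linspace(0.0, 1.0, 101)
deltas = np.linspace(0.0, 4.0, 81)
z_obs = norm.ppf(1.0 - p_values)                                            # shape (n,)
alt_density = norm.pdf(z_obs[None, :] - deltas[:, None]) / norm.pdf(z_obs)  # (n_delta, n)

log_post = np.empty((pis.size, deltas.size))
for i, pi in enumerate(pis):
    mix = pi * alt_density + (1.0 - pi) * 1.0          # null density of p is 1
    log_post[i] = np.log(mix).sum(axis=1)
post = np.exp(log_post - log_post.max())
post /= post.sum()

print("posterior mean prevalence:", (post.sum(axis=1) * pis).sum())
print("posterior mean effect size:", (post.sum(axis=0) * deltas).sum())
```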

Bayesian stability and force modeling for uncertain machining processes

Accurately simulating machining operations requires knowledge of the cutting force model and system frequency response. However, these data are collected ex situ using specialized instruments. Bayesian statistical methods instead learn the system parameters from cutting test data, but, to date, these approaches have considered only milling stability. This paper presents a physics-based Bayesian framework that incorporates both spindle power and milling stability. Initial probabilistic descriptions of the system parameters are propagated through a set of physics functions to form probabilistic predictions about the milling process. The system parameters are then updated using automatically selected cutting tests, with spindle power measurements used to learn the cutting force model, reducing parameter uncertainty and identifying more productive cutting conditions. The framework is demonstrated through both numerical and experimental case studies. Results show that the approach accurately identifies both the system natural frequency and the cutting force model.
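As a loose illustration of the "propagate a prior through a physics function, then update from measurements" step, the sketch below infers a specific cutting energy coefficient from noisy spindle power readings, using the common approximation that cutting power equals the specific cutting energy times the material removal rate. The coefficient, noise level, and cutting conditions are invented for the example and are not values from the paper.

```python
# Hedged sketch of one Bayesian-update step: infer a specific cutting energy
# coefficient K (J/mm^3) from noisy spindle power measurements, assuming the
# simple physics function  P_cut = K * MRR  (material removal rate, mm^3/s).
# All numbers are illustrative, not values from the paper.
import numpy as np

rng = np.random.default_rng(1)

true_K = 2.0                                   # J/mm^3 (hypothetical material)
mrr = np.array([500., 800., 1200., 1500.])     # mm^3/s for four selected cuts
noise_std = 100.0                              # W, assumed measurement noise
power_meas = true_K * mrr + rng.normal(0.0, noise_std, mrr.size)

# Prior belief about K, represented on a grid.
K_grid = np.linspace(0.5, 4.0, 701)
prior = np.exp(-0.5 * ((K_grid - 2.5) / 0.8) ** 2)    # broad Gaussian prior
prior /= prior.sum()

# Likelihood of the measured powers under each candidate K (Gaussian noise model).
resid = power_meas[None, :] - K_grid[:, None] * mrr[None, :]
log_like = -0.5 * (resid / noise_std) ** 2
posterior = prior * np.exp(log_like.sum(axis=1))
posterior /= posterior.sum()

K_mean = (posterior * K_grid).sum()
K_std = np.sqrt((posterior * (K_grid - K_mean) ** 2).sum())
print(f"posterior K = {K_mean:.3f} +/- {K_std:.3f} J/mm^3")
```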

Unifying fragmented perspectives with additive deep learning for high-dimensional models from partial faceted datasets

Biological systems are complex networks in which measurable functions emerge from interactions among thousands of components. Many studies aim to link biological function with molecular elements, yet quantifying the contributions of these elements simultaneously remains challenging, especially at the single-cell level. We propose a machine-learning approach that integrates faceted data subsets to reconstruct a complete view of the system using conditional distributions. We develop both polynomial regression and neural network models, validated on two examples: a mechanical spring network under external forces and an 8-dimensional biological network involving the senescence marker P53, analyzed with single-cell data. Our results demonstrate successful system reconstruction from partial datasets, with predictive accuracy improving as more variables are measured. This approach offers a systematic method to integrate fragmented experimental data, enabling unbiased and holistic modeling of complex biological functions.
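A toy version of the additive idea — learning each variable's contribution from a facet in which only that variable is measured alongside the output, then summing the learned contributions — might look like the sketch below. It assumes the unmeasured variable is independent of the measured one within each facet and uses simple polynomial regression; it illustrates the additive decomposition only and is not the authors' model.

```python
# Toy sketch: reconstruct an additive model y = f1(x1) + f2(x2) from two
# "faceted" datasets, one measuring only (x1, y) and one measuring only (x2, y).
# Assumes the unmeasured variable is independent of the measured one in each
# facet, so each partial fit recovers its term up to a constant offset.
import numpy as np

rng = np.random.default_rng(2)

def simulate(n):
    x1 = rng.uniform(-2, 2, n)
    x2 = rng.uniform(-2, 2, n)
    y = 2.0 * x1 + x2 ** 2 + rng.normal(0, 0.1, n)    # hidden ground truth
    return x1, x2, y

# Facet A observes (x1, y); facet B observes (x2, y).
x1_a, _, y_a = simulate(2000)
_, x2_b, y_b = simulate(2000)

# Partial polynomial fits: each absorbs the mean of its unmeasured term.
f1 = np.polynomial.Polynomial.fit(x1_a, y_a, deg=3)
f2 = np.polynomial.Polynomial.fit(x2_b, y_b, deg=3)
offset = 0.5 * (y_a.mean() + y_b.mean())              # avoid double-counting the mean

# Combine the partial views into a full-system prediction and test it.
x1_t, x2_t, y_t = simulate(500)
y_pred = f1(x1_t) + f2(x2_t) - offset
print("RMSE of reconstructed additive model:", np.sqrt(np.mean((y_pred - y_t) ** 2)))
```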

Ultrafast pump-probe phase-randomized tomography

Measuring fluctuations in matter’s low-energy excitations is the key to unveiling the nature of the non-equilibrium response of materials. A promising outlook in this respect is offered by spectroscopic methods that address matter fluctuations by exploiting the statistical nature of light-matter interactions with weak few-photon probes. Here we report the first implementation of ultrafast phase-randomized tomography, combining pump-probe experiments with quantum optical state tomography, to measure the ultrafast non-equilibrium dynamics in complex materials. Our approach utilizes a time-resolved multimode heterodyne detection scheme with phase-randomized coherent ultrashort laser pulses, overcoming the limitations of phase-stable configurations and enabling a robust reconstruction of the statistical distribution of phase-averaged optical observables. This methodology is validated by measuring the coherent phonon response in α-quartz. By tracking the dynamics of the shot-noise-limited photon number distribution of few-photon probes with ultrafast resolution, our results set an upper limit on the non-classical features of the phononic state in α-quartz and provide a pathway to access non-equilibrium quantum fluctuations in more complex quantum materials.
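One simple phase-averaged observable relevant here is the mean photon number of the probe. As a rough numerical illustration (not the paper's analysis pipeline), the sketch below simulates phase-randomized homodyne quadrature samples of a coherent probe and recovers its mean photon number from the phase-averaged second moment, using the convention that the vacuum quadrature variance is 1/2.

```python
# Rough illustration (not the paper's analysis): with a uniformly random
# local-oscillator phase, the phase-averaged quadrature second moment of a
# coherent state obeys <x^2> = <n> + 1/2 (vacuum quadrature variance = 1/2),
# so the probe's mean photon number can be read off shot-noise-limited data.
import numpy as np

rng = np.random.default_rng(3)

alpha = 2.0                          # coherent amplitude -> <n> = |alpha|^2 = 4
n_shots = 200_000
theta = rng.uniform(0.0, 2.0 * np.pi, n_shots)        # randomized phase
x = np.sqrt(2.0) * alpha * np.cos(theta) + rng.normal(0.0, np.sqrt(0.5), n_shots)

n_mean_est = np.mean(x ** 2) - 0.5
print(f"estimated mean photon number: {n_mean_est:.3f} (true value {alpha**2:.1f})")
```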

A neural network-based synthetic diagnostic of laser-accelerated proton energy spectra

Machine learning can revolutionize the development of laser-plasma accelerators by enabling real-time optimization, predictive modeling, and experimental automation. Given the broad range of laser and plasma parameters and the shot-to-shot variability currently inherent in laser-driven ion acceleration, continuous monitoring with real-time, non-disruptive ion diagnostics is crucial for consistent operation. Machine learning provides effective solutions for this challenge. We present a synthetic diagnostic method using deep neural networks to predict the energy spectrum of laser-accelerated protons. The model combines variational autoencoders for dimensionality reduction with feed-forward networks that make predictions from secondary diagnostics of the laser-plasma interactions. Trained on data from fewer than 700 laser-plasma interactions, the model achieves an error level of 13.5% and improves with more data. This non-destructive diagnostic enables high-repetition laser operation, and the approach can be extended to a full surrogate model that predicts realistic ion beam properties, unlocking potential for diverse applications of these promising sources.
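The described architecture — a variational autoencoder that compresses the proton energy spectrum into a low-dimensional latent vector, plus a feed-forward network that maps secondary-diagnostic features to that latent space so the decoder can reconstruct the spectrum non-destructively — can be sketched as below. The layer sizes, latent dimension, and feature count are placeholders, not the values used in the paper.

```python
# Minimal PyTorch sketch of the described architecture (dimensions are
# placeholders, not the paper's): a VAE compresses the proton energy spectrum,
# and a feed-forward network predicts the latent code from secondary diagnostics,
# so a spectrum can be reconstructed without a destructive spectrometer measurement.
import torch
import torch.nn as nn

SPECTRUM_BINS = 256     # discretized proton energy spectrum (assumed size)
DIAG_FEATURES = 32      # features from secondary, non-disruptive diagnostics (assumed)
LATENT_DIM = 8          # assumed latent dimension

class SpectrumVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(SPECTRUM_BINS, 128), nn.ReLU())
        self.to_mu = nn.Linear(128, LATENT_DIM)
        self.to_logvar = nn.Linear(128, LATENT_DIM)
        self.decoder = nn.Sequential(
            nn.Linear(LATENT_DIM, 128), nn.ReLU(), nn.Linear(128, SPECTRUM_BINS))

    def encode(self, spectrum):
        h = self.encoder(spectrum)
        return self.to_mu(h), self.to_logvar(h)

    def forward(self, spectrum):
        mu, logvar = self.encode(spectrum)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.decoder(z), mu, logvar

class DiagnosticToLatent(nn.Module):
    """Feed-forward map from secondary diagnostics to the VAE latent space."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(DIAG_FEATURES, 64), nn.ReLU(), nn.Linear(64, LATENT_DIM))

    def forward(self, diagnostics):
        return self.net(diagnostics)

# Inference path: secondary diagnostics -> predicted latent -> decoded spectrum.
vae, predictor = SpectrumVAE(), DiagnosticToLatent()
diagnostics = torch.randn(1, DIAG_FEATURES)            # one shot's features (dummy input)
predicted_spectrum = vae.decoder(predictor(diagnostics))
print(predicted_spectrum.shape)                        # torch.Size([1, 256])
```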
