Related Articles
Clinical validity of fluorescence-based devices versus visual-tactile method in detection of secondary caries around resin composite restorations: diagnostic accuracy study
To assess the validity of light-induced and laser-induced fluorescence devices compared to the visual-tactile method for detecting secondary caries around resin composite restorations.
Vision-based tactile sensor design using physically based rendering
High-resolution tactile sensors help robots with fine-grained perception and manipulation tasks, but designing them is challenging. This is because the designs are based on the compact integration of multiple optical elements, and it is difficult to understand the correlation between the element arrangements and the sensor accuracy by trial and error. In this work, we introduce the digital design of vision-based tactile sensors using a physically accurate light simulator. The framework modularizes the design process, parameterizes the sensor components, and contains an evaluation metric to quantify a sensor’s performance. We quantify the effects of sensor shape, illumination setting, and sensing surface material on tactile sensor performance using our evaluation metric. The proposed optical simulation framework can replicate the tactile image of a real vision-based tactile sensor prototype without any prior sensor-specific data. Using our approach, we substantially improve the design of a fingertip GelSight sensor. This improved design performs approximately 5 times better than the previous state-of-the-art human-expert design at real-world robotic tactile embossed-text detection. Our simulation approach can be used with any vision-based tactile sensor to produce a physically accurate tactile image. Overall, our approach enables the automatic design of sensorized soft robots and opens the door for closed-loop co-optimization of controllers and sensors for dexterous manipulation.
Structural flexible magnetic films for biometric encryption and tactile interaction in wearable devices
Human fingers have fingerprints and mechanoreceptors for biometric information encryption and tactile perception. Ideally, electronic skin (e-skin) integrates identity information and tactile sensing, but this remains challenging. Research on encryption and tactile sensing rarely overlaps. Here, we report using magnetization structures and combinations of magnetic materials to achieve two types of functions: 6^(n×n) invisible secure encryption is achieved through an n × n dipole magnetic array, and multipole magnets are used to achieve decoupling of pressure at various positions and sliding in different directions. The sliding distance ranges from 0 to 2.5 mm, with speeds between 5 and 25 mm/s. This study is based on flexible magnetic films, which have the potential to be used in wearable devices. The magnetic ring and signal detection modules verify the prospects of this fundamental principle in human-computer interaction (HCI) and demonstrate its applications in user identity recognition and tactile interaction.
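The encryption capacity quoted above follows from simple combinatorics; a minimal sketch, assuming each cell of the n × n dipole array independently holds one of six magnetization states (the function name is illustrative, not from the paper):

```python
def encryption_capacity(n: int, states_per_cell: int = 6) -> int:
    """Number of distinct codes an n x n array can encode when each
    cell independently takes one of `states_per_cell` states."""
    return states_per_cell ** (n * n)

# Even a small 3 x 3 array yields 6^9 = 10,077,696 distinct codes.
print(encryption_capacity(3))
```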
A thalamic hub-and-spoke network enables visual perception during action by coordinating visuomotor dynamics
For accurate perception and motor control, an animal must distinguish between sensory experiences elicited by external stimuli and those elicited by its own actions. The diversity of behaviors and their complex influences on the senses make this distinction challenging. Here, we uncover an action–cue hub that coordinates motor commands with visual processing in the brain’s first visual relay. We show that the ventral lateral geniculate nucleus (vLGN) acts as a corollary discharge center, integrating visual translational optic flow signals with motor copies from saccades, locomotion and pupil dynamics. The vLGN relays these signals to correct action-specific visual distortions and to refine perception, as shown for the superior colliculus and in a depth-estimation task. Simultaneously, brain-wide vLGN projections drive corrective actions necessary for accurate visuomotor control. Our results reveal an extended corollary discharge architecture that refines early visual transformations and coordinates actions via a distributed hub-and-spoke network to enable visual perception during action.
Sensory interactive fibers and textiles
Electronic textiles (e-textiles) have gradually emerged as a burgeoning industry, with the advancement of flexible electronic technology and the growing demand for personalization, convenience, and comfort. As a typical representative, sensory interactive e-textiles, which integrate visual, auditory, tactile, and other sensory experiences, have garnered significant attention in the next generation of wearable devices due to their outstanding performance and unique immersive interactive experience. To promote the practical application and better development of sensory interactive e-textiles, this paper reviews the research status of sensory interactive fibers and textiles in recent years, provides a detailed overview of functional fibers capable of achieving sensory interactive functions, categorizes system integration technologies for sensory interactive e-textiles, and summarizes their application scenarios. This review further delineates current design paradigms of e-textiles and proposes a novel design paradigm applicable to sensory interactive e-textiles. Finally, we clarify the challenges facing the future development of sensory interactive e-textiles and suggest vital research directions.