Wearable Human–Machine Interfaces for Silent Communication and 3D Tactile Interaction
Presenter: Ki Jun Yu (Yonsei University)
Principal Investigator: Ki Jun Yu (Yonsei University)
Abstract
Wearable human–machine interfaces (HMIs) convert human intention and physical interaction into machine-readable signals. Practical realization requires soft, mechanically compliant materials and scalable signal-processing strategies robust to dynamic motion. This presentation introduces a wearable silent speech interface enabling nonacoustic communication by sensing facial strain during articulation. Ultrathin crystalline-silicon strain gauges combined with deep learning accurately recognize silently spoken words by directly mapping skin deformation to linguistic commands, overcoming key limitations of electromyography-based approaches.
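As a rough illustration of the strain-to-word mapping described above, the sketch below classifies multichannel facial-strain windows against per-word templates. All shapes, the simulated data, and the nearest-centroid classifier are hypothetical stand-ins; the presented system uses ultrathin crystalline-silicon strain gauges and a deep neural network rather than this simple baseline.

```python
import numpy as np

# Hypothetical setup: 3 word classes, 4 strain channels, 100 time samples.
rng = np.random.default_rng(1)
n_words, n_channels, n_samples = 3, 4, 100

# Simulated per-word strain "signatures" plus 20 noisy repetitions each.
prototypes = rng.standard_normal((n_words, n_channels, n_samples))
train = prototypes[:, None] + 0.1 * rng.standard_normal(
    (n_words, 20, n_channels, n_samples))

# "Training": one centroid (mean strain pattern) per word.
centroids = train.mean(axis=1).reshape(n_words, -1)

def classify(window):
    """Return the index of the closest word centroid."""
    dists = np.linalg.norm(centroids - window.ravel(), axis=1)
    return int(np.argmin(dists))

# Classify a new noisy utterance of word 2.
query = prototypes[2] + 0.1 * rng.standard_normal((n_channels, n_samples))
print(classify(query))  # -> 2
```

In the actual interface, the centroid matching would be replaced by a learned model that maps raw skin-deformation signals directly to linguistic commands.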
We further present a soft and ultrathin electronic skin based on electrical impedance tomography for high-resolution tactile interaction. Using stretchable materials and computational reconstruction, the system enables real-time three-dimensional tactile mapping with a minimal number of electrodes, even under large deformation and damage.
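The computational reconstruction step in electrical impedance tomography can be sketched, under simplifying assumptions, as a linearized one-step Tikhonov inversion: boundary-voltage changes are mapped back to a conductivity (touch) image through a sensitivity matrix. The random sensitivity matrix, grid size, and measurement count below are illustrative only, not the presented system's actual geometry or solver.

```python
import numpy as np

# Hypothetical linearized EIT reconstruction (one-step Tikhonov).
# J: sensitivity (Jacobian) matrix mapping conductivity changes on an
# n_pixels grid to n_meas boundary-voltage measurements, e.g. a
# 16-electrode ring with adjacent drive: 16 x 13 = 208 measurements.
rng = np.random.default_rng(0)
n_meas, n_pixels = 208, 64
J = rng.standard_normal((n_meas, n_pixels))  # stand-in for a real Jacobian

# Ground-truth conductivity perturbation: a single "touch" spot.
d_sigma_true = np.zeros(n_pixels)
d_sigma_true[30:34] = 1.0
d_v = J @ d_sigma_true  # simulated boundary-voltage change

# Tikhonov-regularized inverse: d_sigma = (J^T J + lam*I)^-1 J^T d_v
lam = 1e-2
d_sigma = np.linalg.solve(J.T @ J + lam * np.eye(n_pixels), J.T @ d_v)

print(int(np.argmax(d_sigma)))  # strongest response inside the touch spot
```

The key point the e-skin exploits is that a few boundary electrodes suffice: the inverse problem recovers a full interior tactile map, so spatial resolution comes from computation rather than from a dense electrode array.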