Kenta Hori, Yusuke Uchida, Tsukasa Kan, Maya Minami, Chisako Naito, Tomohiro Kuroda, Hideya Takahashi, Masahiko Ando, Takashi Kawamura, Naoto Kume, Kazuya Okamoto, Tadamasa Takemura, Hiroyuki Yoshihara
2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 4646-4649, 2013, Peer-reviewed
The aim of this research is to develop an information support system for tele-auscultation. In auscultation, a doctor needs to understand the conditions under which the stethoscope is applied, in addition to the auscultatory sounds. The proposed system adds an intuitive navigation system for stethoscope operation to the conventional audio streaming of auscultatory sounds and the conventional video conferencing used for telecommunication. Mixed reality technology is applied for intuitive navigation of the stethoscope: information such as the stethoscope's position, its contact condition, and the patient's breath is overlaid on a view of the patient's chest. The contact condition of the stethoscope is measured by e-textile contact sensors, and the breath is measured by a band-type breath sensor. In a simulated tele-auscultation experiment, the stethoscope with the contact sensors and the breath sensor was evaluated. The results show that the presentation of the contact condition was not understandable enough to navigate the stethoscope handling, whereas the time series of breath phases allowed the remote doctor to understand the patient's breathing condition.
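The paper does not specify how the overlaid navigation information is represented or transmitted; the following is only a minimal sketch, assuming a simple per-sample status record (the class name, fields, and JSON message format are all hypothetical illustrations, not the authors' implementation) that could be streamed to the remote doctor's display alongside the audio and video.

```python
from dataclasses import dataclass
from enum import Enum
import json
import time


class BreathPhase(Enum):
    """Breath phase as reported by a band-type breath sensor (assumed two-state model)."""
    INHALE = "inhale"
    EXHALE = "exhale"


@dataclass
class StethoscopeStatus:
    """One sample of the navigation information overlaid on the chest view (hypothetical schema)."""
    timestamp: float            # seconds since epoch
    position: tuple             # (x, y) position on the chest image, normalized to 0..1
    contact_ok: bool            # whether the e-textile contact sensors report full contact
    breath_phase: BreathPhase   # current breath phase from the breath sensor

    def to_message(self) -> str:
        """Serialize the sample for streaming to the remote doctor's display."""
        return json.dumps({
            "t": self.timestamp,
            "pos": list(self.position),
            "contact": self.contact_ok,
            "breath": self.breath_phase.value,
        })


# Example: one status sample sent alongside the audio/video streams.
sample = StethoscopeStatus(time.time(), (0.42, 0.61), True, BreathPhase.INHALE)
print(sample.to_message())
```

In such a design, the remote side would accumulate the `breath_phase` values into the time series of breath phases that the study found useful, while rendering `position` and `contact_ok` as the mixed-reality overlay on the chest view.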