Institute for Comprehensive Medical Science, Gene Discovery Research
Profile Information
- Affiliation
- Research fellow, School of Medicine, Fujita Health University
- Degree
- Ph.D. (Rehabilitation Science) (Mar 2017, Nagoya University)
- Contact information
- obayashi [at] fujita-hu.ac.jp
- Researcher number
- 00871120
- ORCID ID
- https://orcid.org/0000-0002-6358-5435
- J-GLOBAL ID
- 202001000587640062
- researchmap Member ID
- R000002427
Research Interests
Research Areas
Research History
-
Dec, 2019 - Present
Education
-
Apr, 2008 - Mar, 2012
Papers
-
Frontiers in Behavioral Neuroscience, Aug 9, 2024
Introduction: Smiling during conversation occurs interactively between people and is known to build good interpersonal relationships. However, whether and how much the amount an individual smiles is influenced by the other person's smile has remained unclear. This study aimed to quantify the amount of two individuals' smiles during conversations and investigate the dependency of one's smile amount (i.e., intensity and frequency) on that of the other.
Method: Forty participants (20 females) engaged in three-minute face-to-face conversations as speakers with a listener (male or female) under three conditions, in which the amount of the listener's smiling response was controlled as "less," "moderate," and "greater." The amount of smiling was quantified from facial movements through automated facial expression analysis.
Results: The amount of smiling by the speaker changed significantly depending on the listener's smile amount; when listeners smiled to a greater extent, speakers tended to smile more, especially in same-gender pairs (i.e., male–male and female–female). Further analysis revealed that the smiling intensities of the two individuals changed in a temporally synchronized manner.
Discussion: These results provide quantitative evidence for the dependence of one's smile on the other's smile, and for a differential effect between gender pairs.
-
JMIR Aging, Apr 11, 2024
-
Journal of Head Trauma Rehabilitation, 36(5) E337-E344, Sep, 2021, Peer-reviewed, Lead author
Objective: To investigate whether automatic facial expression analysis can quantify differences in the intensity of facial responses depending on the affective stimuli in a patient in a minimally conscious state (MCS).
Methods: We filmed the facial responses of a patient with MCS during the delivery of three 1-minute auditory stimuli: audio clips of comedy movies, a nurse talking hilariously, and recitation of a novel (comedy, nurse, and recitation conditions, respectively). These measures were repeated at least 13 times per condition on different days over approximately 10 months. The intensity of being "happy" was estimated from the smiling face using the software FaceReader. The intensities across the 5 conditions, including 2 resting conditions (pre- and poststimuli), were compared using the Kruskal-Wallis test with the Dunn-Bonferroni test for multiple comparisons.
Results: Significantly higher "happy" intensity values were found in the comedy and nurse conditions than in the other conditions, with no significant differences between the recitation and pre- or poststimulus conditions. These findings indicate that automated facial expression analysis can quantify differences in context-dependent facial responses in the patient recruited in this study.
Conclusions: This case study demonstrates the feasibility of using automated facial expression analysis to quantitatively evaluate differences in facial expressions and their corresponding emotions in a single patient with MCS.
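The omnibus comparison described in this abstract (a Kruskal-Wallis test across five conditions) can be sketched in pure Python. The synthetic intensity values, function names, and the closed-form chi-square p-value (valid for an even number of degrees of freedom, as with 5 conditions) are illustrative assumptions, not the study's actual data or code:

```python
import math

def kruskal_wallis_h(groups):
    """H statistic of the Kruskal-Wallis test (average ranks for ties, no tie correction)."""
    data = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    n = len(data)
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and data[j + 1][0] == data[i][0]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied block
        for k in range(i, j + 1):
            ranks[k] = avg
        i = j + 1
    rank_sums = [0.0] * len(groups)
    for (_, gi), r in zip(data, ranks):
        rank_sums[gi] += r
    return 12.0 / (n * (n + 1)) * sum(
        rs * rs / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3 * (n + 1)

def chi2_sf_even_df(x, df):
    """Chi-square survival function; closed form, exact only for even df."""
    m = df // 2
    return math.exp(-x / 2) * sum((x / 2) ** i / math.factorial(i) for i in range(m))

# Illustrative "happy" intensities for the 5 conditions (synthetic numbers)
conditions = [
    [0.61, 0.55, 0.70, 0.64],  # comedy
    [0.58, 0.52, 0.66, 0.60],  # nurse
    [0.12, 0.18, 0.10, 0.15],  # recitation
    [0.09, 0.14, 0.11, 0.08],  # pre-stimulus rest
    [0.10, 0.13, 0.09, 0.12],  # post-stimulus rest
]
h = kruskal_wallis_h(conditions)
p = chi2_sf_even_df(h, df=len(conditions) - 1)
print(f"H = {h:.3f}, p = {p:.4f}")
```

In the study the omnibus test was followed by Dunn-Bonferroni pairwise comparisons; a minimal stand-in would be pairwise rank tests evaluated at a Bonferroni-corrected alpha.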
-
Somatosensory & Motor Research, 34(1) 1-8, Mar, 2017, Peer-reviewed
Neural connectivity was measured using magnetoencephalography in nine healthy subjects during motor imagery (MI), motor execution (ME), and at rest. Coherence between sensorimotor areas was lower during ME and MI than at rest, and coherence between the left supplementary motor area and the inferior frontal gyrus was lower during MI than during ME. These findings suggest that the sensorimotor network during MI functions with connectivity similar to that of ME, and that inhibitory activity operates continuously during MI.
Misc.
-
The Japanese Journal of Rehabilitation Medicine, 58(Special Issue) 3-2, May, 2021
Research Projects
-
Grants-in-Aid for Scientific Research (KAKENHI), Japan Society for the Promotion of Science, Apr, 2022 - Mar, 2026
-
Grants-in-Aid for Scientific Research (KAKENHI), Japan Society for the Promotion of Science, Sep, 2020 - Mar, 2023