Keynote Speakers of ICCSE 2025


Keynote Speech I

High-Precision, Full-Color, Real-Time 3D Shape Measurement System with High-Brightness Fringe Projection and the Whole-Space Tabulation Method


Professor Motoharu Fujigaki


Human and Artificial Intelligent Systems, Graduate School of Engineering

University of Fukui, JAPAN

Abstract

We have developed a full-color, real-time 3D shape measurement device based on projected fringe patterns. In recent years, we have developed a high-brightness fringe-pattern projection unit using high-power LEDs and a cylindrical lens array, which has made it possible to realize a compact, high-speed 3D measurement device. We have also proposed the Whole-Space Tabulation Method (WSTM), which eliminates systematic errors such as lens distortion. By combining this method with the device, we have achieved a high-precision, high-speed 3D measurement system. The device can be applied to human body measurement for medical and apparel purposes, appearance inspection in manufacturing, and the inspection of infrastructure by mounting it on drones. In this presentation, we will introduce its principles, design method, and prototype.
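To give a flavor of fringe-projection measurement, the sketch below shows a generic four-step phase-shifting calculation in Python/NumPy followed by a lookup-table conversion from phase to height. It is an illustrative assumption of how such a pipeline can look, not Prof. Fujigaki's actual implementation; in particular, the function phase_to_height and its calibration_table argument are hypothetical stand-ins for the WSTM tables described in the abstract.

    # Minimal, generic sketch of phase-shifting fringe analysis (illustrative only).
    import numpy as np

    def wrapped_phase(i0, i1, i2, i3):
        """Recover the wrapped fringe phase from four images shifted by 90 degrees each."""
        # Standard four-step formula: phi = atan2(I3 - I1, I0 - I2)
        return np.arctan2(i3 - i1, i0 - i2)

    def phase_to_height(phase, calibration_table):
        """Convert phase to height via a tabulated calibration (assumed, WSTM-style idea)."""
        # calibration_table[k] holds the height associated with the k-th reference phase;
        # each pixel's phase is interpolated against those tabulated values.
        ref_phases = np.linspace(-np.pi, np.pi, len(calibration_table))
        return np.interp(phase, ref_phases, calibration_table)

In a tabulation-based approach, the per-pixel calibration absorbs systematic effects such as lens distortion, which is why no explicit distortion model appears in the sketch.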

Speaker Biography

Motoharu Fujigaki is a professor at the University of Fukui. He received his doctoral degree in Engineering from Osaka University in 2001. He is interested in optical metrology using image processing, especially 3D shape and deformation measurement using phase analysis of fringe patterns. His research areas include robot sensing, experimental mechanics, nondestructive inspection, remote sensing, and life mechatronics. He is also President of Kaeru Keisoku Co., Ltd., a university-originated venture company he established in 2022. He is actively engaged in promoting industry-academia collaboration through this company.


Keynote Speech II

Neurotraining and Inclusive Gaming Using a Virtual Brain Switch


Dr. Ryohei P. Hasegawa


National Institute of Advanced Industrial Science and Technology (AIST), JAPAN

Abstract

We have developed a communication system, Neurocommunicator®, based on a brain-machine/computer interface (BMI/BCI) that interprets electroencephalogram (EEG) signals to support individuals with severe motor impairments. This system enables users to express their intentions via a virtual EEG switch, even without speech or movement. Clinical trials revealed reduced brain responsiveness in elderly or bedridden users, prompting the development of Neurotrainer®—a hands-free cognitive training platform that utilizes the same brain-switch mechanism. Initially designed for physically impaired patients, it has since shown promise in broader applications, including dementia prevention and attention training for neurodiverse children. To further enhance engagement and social interaction, we also created bSports (the "b" stands for brain), a competitive activity in which players control either PC games or humanoid robots via the EEG switch. This inclusive framework enables intergenerational participation and equal competition regardless of physical ability. In this talk, we introduce the system's core principles, technical implementation, and its future potential as an EEG-based ecosystem for communication support and cognitive enhancement.
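As a rough illustration of the "virtual EEG switch" idea, the sketch below averages stimulus-locked EEG epochs and thresholds a late-window response, in the spirit of event-related-potential based switches. All names, the time window, and the threshold are assumptions for illustration; this is not the Neurocommunicator algorithm.

    # Minimal sketch of an EEG virtual switch via epoch averaging (illustrative only).
    import numpy as np

    def detect_switch(epochs, fs=256, window=(0.25, 0.50), threshold=2.0):
        """epochs: array of shape (n_trials, n_samples), one EEG channel, stimulus-locked."""
        avg = epochs.mean(axis=0)                    # averaging suppresses background EEG
        start, stop = int(window[0] * fs), int(window[1] * fs)
        score = avg[start:stop].mean() / (epochs.std() + 1e-9)  # crude normalized amplitude
        return score > threshold                     # True -> the "switch" counts as pressed

The same detection output could, in principle, drive a selection on a communication board, a PC game, or a humanoid robot, which is the common mechanism linking the communication, training, and bSports applications described above.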

Speaker Biography

Dr. Ryohei P. Hasegawa is Chief Senior Research Scientist at the Research Institute on Human and Societal Augmentation (RIHSA), National Institute of Advanced Industrial Science and Technology (AIST), Japan. He also holds visiting professorships at the University of Fukui, Nagoya University, and Tokyo University of Science. His work focuses on BMI/BCI technologies, cognitive engineering, and neurotechnology for assistive communication and healthcare applications. His Neurocommunicator, an EEG-based system for users with severe motor disabilities, has earned multiple awards and attracted broad attention from both national and international media for its clinical and societal impact. Dr. Hasegawa leads interdisciplinary initiatives bridging neuroscience, engineering, and inclusive design.