Activities

Group 4: Cooperative control of multiple CAs

Latest Activity

Development of flexible CA control technologies (Takayuki NAGAI)

This project develops technology for the simultaneous tele-operation and coordinated control of multiple CAs. For a single operator to operate multiple CAs at the same time, the CAs must be able to work while inferring the operator's intentions according to the task and environment. For example, in a hospital room, where collaborative tasks such as preparation, examination, treatment, and explanation occur frequently, CAs can accomplish tasks more efficiently by working together as a group rather than by acting alone. In this way, coordination among CAs can provide a variety of services that a single CA cannot. In addition, flexible CA control must take into account the relationships and coordination with the people present in the environment. In this project, we reproduce the living and hospital-room environments in which CAs work together, build prototype CAs that operate in these environments, and develop a system consisting of multiple CAs, including an operation interface. Based on the CA infrastructure protocol to be established in group 5, we will realize a prototype CA infrastructure system in an indoor environment. We will then work with groups 2, 3, 4, and 5 on a demonstration experiment in which CAs actually cooperate.
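
As a rough illustration of the kind of coordination described above, the sketch below greedily assigns room tasks to whichever capable CA is closest; the CA, task, and cost representations are illustrative assumptions, not the project's actual interfaces.

```python
# Minimal sketch of allocating hospital-room tasks among multiple CAs.
from dataclasses import dataclass

@dataclass
class CA:
    name: str
    skills: set          # e.g. {"deliver", "explain"}
    position: tuple      # (x, y) in the room

@dataclass
class Task:
    name: str
    skill: str           # capability required to execute the task
    location: tuple      # where the task takes place

def distance(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def allocate(tasks, cas):
    """Greedily assign each task to the capable CA with the lowest travel cost."""
    assignment = {}
    for task in tasks:
        capable = [ca for ca in cas if task.skill in ca.skills]
        if not capable:
            continue  # leave the task for the operator to handle
        best = min(capable, key=lambda ca: distance(ca.position, task.location))
        assignment[task.name] = best.name
        best.position = task.location  # the CA moves to the task location
    return assignment

cas = [CA("CA-1", {"deliver", "prepare"}, (0.0, 0.0)),
       CA("CA-2", {"explain", "deliver"}, (5.0, 2.0))]
tasks = [Task("bring medication", "deliver", (4.0, 1.0)),
         Task("explain procedure", "explain", (2.0, 3.0))]
print(allocate(tasks, cas))
```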

Hierarchical control for autonomous mobile robots (Tomoaki NAKAMURA)

To realize collaboration among multiple cybernetic avatars (CAs), the CAs must understand and self-organize a hierarchical task structure. For instance, to accomplish domestic tasks efficiently, roles need to be divided appropriately according to each CA's function and body, and each CA must carry out its role autonomously. Moreover, this hierarchical structure should change with the situation so that tasks can be accomplished flexibly. The hierarchy spans multiple levels, from concrete services in the domestic environment to abstract services such as providing meals. In this project, we are developing technology that enables CAs to self-organize this hierarchical structure and collaborate efficiently.
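
The sketch below illustrates one way such a hierarchy could be represented and roles assigned: an abstract service decomposes into concrete subtasks, each matched to a CA with suitable capabilities. All task names, capabilities, and data structures are illustrative assumptions.

```python
# Minimal sketch of a hierarchical task structure with role assignment.
from dataclasses import dataclass, field

@dataclass
class TaskNode:
    name: str
    required: str = ""                    # capability needed at a leaf
    children: list = field(default_factory=list)

def assign_roles(node, cas, assignment=None):
    """Walk the task hierarchy and assign each leaf to a capable CA."""
    if assignment is None:
        assignment = {}
    if not node.children:                 # leaf = concrete subtask
        for ca_name, skills in cas.items():
            if node.required in skills:
                assignment[node.name] = ca_name
                break
    for child in node.children:
        assign_roles(child, cas, assignment)
    return assignment

# Abstract service "provide meal" decomposed into concrete subtasks.
meal = TaskNode("provide meal", children=[
    TaskNode("fetch ingredients", required="mobile_manipulation"),
    TaskNode("set the table", required="mobile_manipulation"),
    TaskNode("announce meal to user", required="dialogue"),
])

cas = {"mobile CA": {"mobile_manipulation"}, "dialogue CA": {"dialogue"}}
print(assign_roles(meal, cas))
```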

Research and development of Embodied Dialogue CAs (Komei SUGIURA)

This project focuses on embodied dialogue technology for situations in which multiple CAs cooperate with humans to provide services in everyday environments. When CAs assist users at home or in hospitals, some tasks, such as medication delivery, require remote human operation and monitoring. In such tasks, routine responses should be automated and optimally assigned to CAs, while the operator monitors them and intervenes remotely in emergencies. Against this background, we will conduct two research topics: (1) embodied dialogue CAs that cooperate with operators in everyday environments, and (2) embodied dialogue CAs that cooperate with others to manipulate objects in dynamic environments.
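
As a rough sketch of this division of labor, the code below routes routine requests to an autonomous CA queue and escalates non-routine or emergency requests to the operator; the request format and task categories are assumptions for illustration only.

```python
# Minimal sketch of splitting work between autonomous CAs and the remote operator.
ROUTINE = {"medication delivery", "meal delivery", "guidance"}

def dispatch(request, ca_queue, operator_queue):
    """Route a request either to an autonomous CA or to the operator."""
    if request["type"] in ROUTINE and not request.get("emergency", False):
        ca_queue.append(request)          # handled autonomously; operator only monitors
    else:
        operator_queue.append(request)    # operator responds remotely

ca_queue, operator_queue = [], []
dispatch({"type": "medication delivery", "room": 101}, ca_queue, operator_queue)
dispatch({"type": "patient fell", "room": 102, "emergency": True}, ca_queue, operator_queue)
print(ca_queue, operator_queue)
```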

Lab website: https://smilab.org/

Research and development of Daily-Physical-Support CAs (Tadahiro TANIGUCHI)

We are developing intelligent technology for Daily-Physical-Support CAs that assist people in home, office, and hospital environments. To provide support in everyday environments, it is important for robots not only to hold dialogues with users but also to perform tasks involving physical behavior, such as carrying objects. To be operated not only through direct physical teleoperation but also through verbal (symbolic) instructions, the CA must be semi-autonomous, quickly modeling and understanding the living environment through direct teleoperation and autonomous exploration. To this end, we are working on research and development of adaptive planning and navigation technologies, cooperative manipulation technologies, and software development frameworks. Some of these technologies were used at the World Robot Summit 2020 (Aichi, Japan), where we won the Grand Prize in the Future Convenience Store Challenge. In the future, we will continue to develop fundamental technologies and integrated systems that enable multiple CAs to expand their range of support through teleoperation and interaction.
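
The sketch below illustrates, under simplified assumptions, how a verbal (symbolic) instruction might be grounded in a semantic map built through teleoperation and exploration and turned into a fetch-and-carry plan; the map contents, instruction format, and plan steps are illustrative and do not reflect the actual framework.

```python
# Minimal sketch of turning a symbolic instruction into a fetch-and-carry plan
# using a semantic map of the living environment.
semantic_map = {
    "cup":     {"location": "kitchen", "pose": (1.2, 0.4)},
    "bedroom": {"location": "bedroom", "pose": (5.0, 3.1)},
}

def plan_from_instruction(obj_name, destination):
    """Plan a fetch-and-carry task: navigate, grasp, navigate, release."""
    obj = semantic_map[obj_name]
    goal = semantic_map[destination]
    return [("navigate", obj["pose"]),
            ("grasp", obj_name),
            ("navigate", goal["pose"]),
            ("release", obj_name)]

# "Bring the cup to the bedroom"
for step in plan_from_instruction("cup", "bedroom"):
    print(step)
```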

Tactile sensing and control for CA manipulation (Yosuke SUZUKI)

We are developing tactile sensors and manipulation techniques to make various physical support tasks with CAs faster and more accurate. When CAs perform tasks at home or in hospitals, human-like hands are often required. Such manipulation is sometimes difficult to perform remotely and requires semi-autonomous operation that incorporates manipulation support based on tactile sensor information. We are therefore developing tactile and proximity sensors tailored to the structure and movement of the hands (manipulators) on various CA platforms. We are also developing technology that improves the efficiency of physical assistance tasks performed by remotely operated CAs by superimposing sensor-based local autonomous adaptive motions onto the operator's motion commands while maintaining consistency between the two. So far, we have developed and integrated sensors for a three-fingered dexterous robot hand, a motion control method that autonomously compensates for errors in approaching the object to be grasped, and a reliable, fast grasp control method that predicts grasp stability before grasping. Currently, we are developing sensors that allow these methods to be used on other CAs, and we are introducing them into teleoperation to verify their effectiveness in speeding up tasks and reducing the operator's burden.
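
The following sketch shows one simple way to superimpose a bounded, sensor-based autonomous correction on the operator's command so that the operator's intent remains dominant; the proximity-error model, gain, and bound are illustrative assumptions rather than the implemented controller.

```python
# Minimal sketch of blending an operator command with a sensor-based correction.
import numpy as np

def blend_command(operator_cmd, sensed_offset, gain=0.5, max_correction=0.02):
    """Blend the operator's velocity command with a bounded autonomous correction.

    operator_cmd  : desired end-effector velocity from teleoperation [m/s]
    sensed_offset : object offset measured by proximity/tactile sensors [m]
    """
    correction = gain * np.asarray(sensed_offset)
    # Keep the correction small so the operator's command stays dominant.
    norm = np.linalg.norm(correction)
    if norm > max_correction:
        correction *= max_correction / norm
    return np.asarray(operator_cmd) + correction

cmd = blend_command(operator_cmd=[0.05, 0.0, 0.0], sensed_offset=[0.0, 0.01, -0.005])
print(cmd)
```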

CA control based on invasive BMIs (Masayuki HIRATA)

We have been developing implantable brain-machine interfaces (BMIs) to support movement and communication for disabled people. At present, however, BMIs alone cannot fully restore lost functions. Here, we aim for a significant improvement by supporting BMI-based voluntary control with CA-based autonomous control. Based on this concept, we have already realized harmonization between BMI-based three-dimensional (3D) voluntary control of a CA's upper arm and autonomous control of the CA's hand. For the autonomous control, the CA automatically recognizes the names and 3D positions of objects and autonomously grasps them once it receives a grasping intention from the BMI. We are currently also developing lower-limb and communication functions. We will introduce an HSR as a CA and implement harmonized BMI-CA control on it. Our final goal is for disabled people with implantable BMIs to regain their social lives by being able to fully operate CAs from their homes.
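
As a conceptual sketch of this harmonized control, the code below passes the BMI-decoded arm command through directly and hands the gripper over to an autonomous grasp routine once a grasping intention is decoded and an object is recognized; the decoder outputs and robot commands are illustrative assumptions.

```python
# Minimal sketch of harmonizing BMI-based arm control with autonomous hand control.
def control_step(bmi_decoded, ca_perception, state):
    """One cycle of harmonized control.

    bmi_decoded   : {"arm_velocity": (vx, vy, vz), "grasp_intent": bool}
    ca_perception : {"object": name, "object_pose": (x, y, z)} or None
    state         : "reach" or "autonomous_grasp"
    """
    commands = {"arm_velocity": bmi_decoded["arm_velocity"]}  # voluntary arm control
    if state == "reach" and bmi_decoded["grasp_intent"] and ca_perception:
        state = "autonomous_grasp"
    if state == "autonomous_grasp":
        # The CA takes over the hand: align with the recognized object and close.
        commands["hand"] = ("grasp", ca_perception["object"], ca_perception["object_pose"])
    return commands, state

cmds, state = control_step(
    {"arm_velocity": (0.02, 0.0, 0.01), "grasp_intent": True},
    {"object": "bottle", "object_pose": (0.4, 0.1, 0.8)},
    "reach")
print(state, cmds)
```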