@phdthesis{oai:sucra.repo.nii.ac.jp:00019051, author = {ALI, SARWAR}, month = {}, note = {xi, 65 p., Technological advances are enabling many new applications in robotics. Smart robotic wheelchair research is one of them: such wheelchairs could provide user-friendly interfaces and/or autonomous functions that meet the needs of severely impaired users in an aging society. Powered wheelchairs have been developed for people who lack muscle control (e.g., due to spinal cord injuries) and have difficulty operating unpowered wheelchairs. However, many people struggle to operate powered wheelchairs and, in practice, are often accompanied by companions. For some wheelchair users, traveling by wheelchair can be difficult, especially in dynamic environments where pedestrians move around in dense public areas such as bus stops, airport terminals, shopping malls, and so forth. In addition, tasks such as boarding buses or climbing steps of a certain height are of great importance. Thus, to provide a better quality of life for the elderly and for physically and mentally disabled people, there is growing demand for advanced smart robotic wheelchairs with independent mobility functions that can navigate, sense information from their environment, and respond in useful ways without caregiver support. The large majority of robotic wheelchair research to date has focused on indoor areas; outdoor environments, which people also frequent, should be considered as well. An advanced Smart Wheelchair could offer many functionalities outdoors, such as detecting terrain to run steadily, detecting static and dynamic obstacles to avoid them, detecting steps to climb up or down, and so forth.
This thesis focuses specifically on navigation in outdoor crowded environments, moving smoothly among people, and on bus and bus door detection for getting on and off buses, while considering practical issues such as sensors and the computational cost of real-time autonomous operation on the robotic wheelchair. To develop such wheelchairs, computer vision techniques are essential for detection and further analysis, and the system must run in real time. The system should therefore have (1) high detection speed, obtained with simple, computationally light algorithms running on a notebook PC for portability, since wheelchairs have a limited power source; (2) high accuracy, which we achieve with a neural network; and (3) high-precision measurement of the locations of relevant objects relative to the wheelchair. To achieve these goals, this thesis first proposes a smart robotic wheelchair system that detects pedestrians and controls the wheelchair's movements to avoid them smoothly, with user comfort in mind. It presents a method for our Smart Wheelchair to maneuver around individual and multiple pedestrians by detecting and analyzing their interactions and predicted intentions with respect to the wheelchair. Our Smart Wheelchair obtains head and body orientations of pedestrians using TensorFlow-based OpenPose. Using a single camera, we infer pedestrians' walking directions or next movements by combining face pose and body posture estimation with our collision avoidance strategy in real time. Experimental results show our approach is promising; we collected data from a train station and from our campus to evaluate the method. Moreover, we determine the relative distance between pedestrians and the wheelchair from how much of the image frame each pedestrian occupies (which we call "coverage"), to maintain a safe distance from pedestrians.
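The coverage heuristic above can be sketched as follows. The 60%-coverage threshold corresponding to roughly 1 m comes from the thesis; representing coverage as the bounding-box height fraction, and the function names, are illustrative assumptions, not the thesis's code:

```python
def pedestrian_coverage(bbox_height: float, frame_height: float) -> float:
    """Fraction of the image frame occupied vertically by a detected pedestrian."""
    return min(bbox_height / frame_height, 1.0)

def should_stop(bbox_height: float, frame_height: float,
                threshold: float = 0.60) -> bool:
    """Stop the wheelchair when coverage reaches ~60%, i.e. roughly 1 m away."""
    return pedestrian_coverage(bbox_height, frame_height) >= threshold

# Example: a pedestrian bounding box 300 px tall in a 480 px frame
# gives coverage 0.625 (62.5%), so the wheelchair would stop and re-plan.
```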
If the distance between a given pedestrian and the wheelchair is about 1 m (corresponding to 60% coverage of the pedestrian in the camera image), the wheelchair stops, then processes subsequent frames and obtains directions until it sees a clear path. As an added layer of safety for the user, we also use a LiDAR sensor to detect obstacles outside the camera's field of view and avoid collisions in advance. The final system provides autonomous navigation that generates wheelchair movements that are safe and comfortable for the wheelchair user and other people in real-time multi-person scenarios. In addition, we propose a bus boarding wheelchair system that precisely detects buses and bus doors (open and closed) using Convolutional Neural Network (CNN)-based image recognition. This extends the vision component (camera processing) of our ongoing work on a bus boarding wheelchair system. For this, the Darknet-based YOLO object detection approach is employed. The system needs to operate in real time for proper detection despite the limited power of our wheelchair platform. We therefore modified Tiny-YOLO, a lightweight YOLO variant, to run at around 10 FPS on our notebook PC. For accurate detection of buses with open and closed doors, we trained the image recognition system with 1,800 different images for each of these classes. The overall precision of the system was measured with the Intersection over Union (IoU) metric, for which we achieved a 70% average detection result. Once the system detects a bus with an open door using our fast, modified Tiny-YOLO, we apply a Hough line transform to obtain accurate and precise localization of the open door's lines. To evaluate the performance of the proposed method, we also compare the accuracy of our modified Tiny-YOLO and of the proposed combined detection method against ground truth.
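Intersection over Union, the metric used above to score detections against ground truth, can be computed for two axis-aligned bounding boxes as in this minimal sketch; the (x1, y1, x2, y2) box format and the function name are assumptions for illustration:

```python
def iou(box_a, box_b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle (degenerate if the boxes are disjoint).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    # Union = sum of areas minus the overlap counted twice.
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

Identical boxes score 1.0, disjoint boxes 0.0; the reported averages (70% for the modified Tiny-YOLO alone, 90% with the combined method) fall on this scale.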
Moreover, we achieved an average 90% IoU, a significant improvement over the modified Tiny-YOLO alone. This information is indispensable for our bus-boarding robotic wheelchair to board buses. In real experiments with a bus, we also confirmed the effectiveness of the bus and bus door detection. Consequently, we propose a Smart Wheelchair that can detect buses and precisely recognize whether bus doors are open or closed, for automated boarding before receiving any boarding commands. Finally, we demonstrated successful operation of the proposed smart robotic wheelchair system using our method, which grants freedom of movement among pedestrians in urban environments.
Contents: Chapter 1 Introduction (1.1 Motivation; 1.2 Objectives; 1.3 Research Contributions; 1.4 Organization of Sections). Chapter 2 Background (2.1 Smart Wheelchair Review; 2.2 Smart Wheelchair Hardware; 2.3 Detection Fundamentals; 2.4 Summary). Chapter 3 Detection and CNN Networks (3.1 Overview; 3.2 Pose Detection (OpenPose); 3.3 Object Detection (YOLO); 3.4 Summary). Chapter 4 Pedestrian Avoidance (4.1 Overview; 4.2 System Outline; 4.3 Methodology; 4.3.1 Pedestrians' Pose Estimation; 4.3.2 Intention and Interaction Evaluation; 4.4 Autonomous Navigation; 4.5 Experimental Results and Discussion; 4.6 Summary). Chapter 5 Bus and Bus Door Detections (5.1 Overview; 5.2 Our BMR Wheelchair System; 5.3 Proposed Methodology; 5.3.1 Bus and Bus Door Detection; 5.3.2 Precise Bounding Box Method; 5.4 Experimental Results and Discussion; 5.5 Summary). Chapter 6 Conclusions and Future Work (6.1 Conclusions; 6.2 Future Work). List of Publications. References.
Principal supervisor: Yoshinori Kuno (久野義徳).}, school = {埼玉大学}, title = {Computer Vision Methods for Advanced Smart Wheelchairs Using DNN}, year = {2019}, yomi = {アリ, サロワロ} }