{"created":"2023-05-15T15:29:08.202862+00:00","id":18687,"links":{},"metadata":{"_buckets":{"deposit":"17e410ce-173c-4ef5-b021-bd1a2824847a"},"_deposit":{"created_by":15,"id":"18687","owners":[15],"pid":{"revision_id":0,"type":"depid","value":"18687"},"status":"published"},"_oai":{"id":"oai:sucra.repo.nii.ac.jp:00018687","sets":["94:429:431:432:942"]},"author_link":[],"item_113_alternative_title_1":{"attribute_name":"タイトル(別言語)","attribute_value_mlt":[{"subitem_alternative_title":"都市環境を走行するロボット車いす"}]},"item_113_biblio_info_9":{"attribute_name":"書誌情報","attribute_value_mlt":[{"bibliographicIssueDates":{"bibliographicIssueDate":"2018","bibliographicIssueDateType":"Issued"}}]},"item_113_date_35":{"attribute_name":"作成日","attribute_value_mlt":[{"subitem_date_issued_datetime":"2019-07-12","subitem_date_issued_type":"Created"}]},"item_113_date_granted_20":{"attribute_name":"学位授与年月日","attribute_value_mlt":[{"subitem_dategranted":"2018-09-21"}]},"item_113_degree_grantor_22":{"attribute_name":"学位授与機関","attribute_value_mlt":[{"subitem_degreegrantor":[{"subitem_degreegrantor_name":"埼玉大学"}],"subitem_degreegrantor_identifier":[{"subitem_degreegrantor_identifier_name":"12401","subitem_degreegrantor_identifier_scheme":"kakenhi"}]}]},"item_113_degree_name_21":{"attribute_name":"学位名","attribute_value_mlt":[{"subitem_degreename":"博士(学術)"}]},"item_113_description_13":{"attribute_name":"形態","attribute_value_mlt":[{"subitem_description":"8, v, 78 p.","subitem_description_type":"Other"}]},"item_113_description_23":{"attribute_name":"抄録","attribute_value_mlt":[{"subitem_description":"In the last decade, several robotic wheelchairs possessing user-friendly interfaces and/or autonomous functions for reaching goals have been proposed to meet the needs of an aging society. The elderly and disabled currently use different kinds of wheelchairs that are either manually controlled or powered by a control system. 
Although wheelchairs make it possible for their users to go out alone, users are in practice often accompanied by companions such as friends, family members, and caregivers. The workload of assistive companions can be reduced with the help of motorized wheelchairs. However, even motorized wheelchairs have shortcomings and are discouraged by some doctors, especially in cases where the elderly need to navigate urban environments and perform tasks such as boarding buses or climbing steps of a certain height. Thus, in designing new wheelchair technologies, it is important to consider how the caregiver's workload can be reduced while overcoming the existing shortcomings of wheelchairs. Autonomous wheelchairs are a potential solution to these issues, and could also grant many elderly wheelchair users the ability to control their wheelchairs on their own. In fact, the review in this thesis found that around 3 million people could benefit from such wheelchairs. However, the large majority of research to date has focused on indoor navigation. Thus, this thesis is specifically concerned with navigation in outdoor urban environments while considering practical issues such as the cost of sensors and the computational cost of the robotic wheelchair. More specifically, this thesis first aims to design and develop a sensing system that can detect outdoor terrain conditions so that the wheelchair runs smoothly. The thesis then focuses on obtaining precise measurements using a bidirectional sensing system with a single 2D laser for boarding and disembarking buses autonomously in urban environments. 
The empirical part of this thesis was conducted throughout the study period and involved developing an autonomous bus boarding wheelchair system that moves independently alongside a companion or caregiver in urban terrain to reach bus stops; the wheelchair is also capable of finding bus doors using precise measurements and then boarding using mechanically adjusted wheels.\nTo achieve these goals, the thesis first proposes a robotic wheelchair system that is able to classify types of outdoor terrain according to their roughness for the comfort of the user, and to control the wheelchair's movements to avoid step edges and watery areas on the road. The study found that terrain surfaces can be classified into four categories: rough surfaces, watery places on the road, indoor plain surfaces, and dirt tracks. These roads may present different conditions that are dangerous for the wheelchair user due to the risk of sudden accidents when driving over them. Suppose a wheelchair follows its caregiver, but the caregiver misses hazards such as bumps, pits, or watery spots. If the wheelchair did not actively look for such hazards, it could fall and injure the occupant. Moreover, the wheelchair's speed should not be the same on different surfaces; it needs to be controlled automatically and with care. For these reasons, detecting the type of surface is a necessity. An artificial neural network based classifier was constructed to classify the patterns and features extracted from the Laser Range Sensor (LRS) intensity and distance data. The overall classification accuracy was found to be 97.24% using features extracted from the intensity and distance data. These classification results can in turn be used to control the motor of the smart wheelchair. 
Moreover, the computed 3D surface elevation map can be used to decide whether to move on, slow down, or avoid watery and muddy areas on the road. Finally, this classification method was implemented on the wheelchair to validate its effectiveness in an experiment in which the proposed wheelchair system detected watery places on the road and avoided them efficiently.\nTerrain classification alone, however, does not provide the smooth navigation through which smart wheelchair systems aim to reduce the caregiver's workload. In this thesis, a novel 3D sensing system is therefore introduced. It consists of a triangular-shaped apparatus with undistorted reflecting mirrors for vertical and horizontal sensing of the floor using a conventional single 2D laser range sensor, which we name the Bidirectional Sensing System (BSS). Conventional 2D laser sensors have a wide-angle horizontal field of view. We use a 2D LRS (HOKUYO) with a horizontal field of view of 270° and a measuring interval angle of 0.25°. Typically, researchers use two or more such laser sensors to obtain vertical and horizontal sensing points for measuring heights. However, a single scan from the LRS captures 1080 values in 25 ms, representing a viewing angle of 270°, and we do not need to scan all 270° of the horizontal field of view. On the other hand, we do need some vertical scan data to detect steps. Thus, we divided the total view angle into three parts of 90° each using three undistorted reflecting mirrors. Outliers were removed from all data acquired by the 2D laser so that we could accurately measure the step size of stairs or the elevation gaps of bus doors. To remove any undesirable artifacts, we use the median values and standard deviation of the data series. Deploying our BSS tilted towards the ground at 45° and mounted 100 cm above it, the slope was calculated from the laser data to determine whether it was positive or negative. 
A positive slope indicates a planar surface, whereas a negative slope indicates steps. In our system, the scanning points are separated in such a way that they yield two horizontal scanning lines and one vertical scanning line. From the vertical positional data, we obtain distributions for every step of a given height, as in the case of stairs or the doorstep of a bus, in the y-direction and the z-direction. Here, y and z indicate the horizontal and vertical axes of the data plot, respectively: the z-values give the height and the y-values give the width. The standard deviation and mean in each direction are used to find outliers. The mode of the distribution of values is then applied to find the peak point and the lowest point, and the Euclidean distance between those two points determines the height of the steps. For finding the mode, we used 1 cm intervals in each direction. This sensing system is therefore able to produce precise measurements of bus doorsteps or staircase steps for climbing. To measure the accuracy of this sensing system, we calculated stair and bus doorstep heights in practical scenarios and were able to measure heights with an error of ±1 cm.\nFinally, the major objective of this thesis is to build an autonomous wheelchair that can board buses. For bus boarding, we need two vertical scanning lines set apart at a distance equal to the wheelchair's width so that collisions with the bus door can be avoided. Therefore, we used two BSSs and a calibrated camera mounted along the same axis on a 78 cm long bar tilted towards the ground. This yields two vertical sensing lines from the far left and right of the sensors and four horizontal sensing lines across the vertical lines. We merge the horizontal data from the top and bottom so that we receive a single distance value from both sensors. Additionally, we mapped these 3D point cloud data onto an image for verification and experimentation. 
The vertical sensing positions of the sensors start from 3 cm in front of the front wheels to check for obstacles, and the sensing area also covers a distance of 1 meter ahead of the wheelchair. The front area of the sensing lines is important for finding other obstacles such as passengers, bus seats, or luggage. Once the wheelchair has been instructed to board the bus, our system senses whether this area is free for boarding; otherwise, it backs away. This feature adds an extra layer of safety to our wheelchair. In addition, before receiving any bus boarding commands, our wheelchair is able to detect the bus door using an image recognition system. For this, a YOLO (Darknet) based object detection approach is employed. For approximate detection of buses and bus doors with doorsteps, we trained our image recognition system with 400 images per class of buses, bus doors, and doorsteps. The overall precision of the system was measured using the Intersection over Union (IoU) method, achieving an overall average of 88%. In real experiments with buses, we confirmed the system's effectiveness in detecting buses and bus doors. Our wheelchair also approached a bus and measured the doorstep height as 14.9 cm, where the actual height was 15 cm.\nIn addition, we made our wheelchair capable of autonomously following its companion on the way to the bus stop, running smoothly over the terrain, and then waiting for the bus. In our setup, once our image recognition system detects the bus, the wheelchair moves to the approximate location of the door. Once the door is detected, the wheelchair is guided by the BSS module to find the bus doorsteps and calculates the height and width of the door for boarding. To achieve these tasks, our wheelchair operates in three modes: (1) Companion Following Mode, (2) Autonomous Mode for boarding, and (3) Manual Mode. The companion following mode is in operation while the wheelchair is going to the bus stop and after getting off
from the bus. The autonomous mode operates while the wheelchair is either boarding or getting off the bus. As space is limited inside the bus, the wheelchair is operated in manual mode, with the companion using a joystick to adjust the wheelchair's position towards the bus door for safe navigation. We conducted several indoor mock-up bus experiments and real bus experiments for bus boarding with a newly trained image recognition server that can detect buses, bus doors, and doorsteps. Based on the experimental results of this research, it can be concluded that the new mechanism of bidirectional sensing using a single 2D laser to obtain the heights of bus doorsteps for autonomous bus boarding was successful. Moreover, we demonstrated successful operation of the proposed robotic wheelchair system using our method, which grants freedom of movement to the wheelchair user in urban environments.","subitem_description_type":"Abstract"}]},"item_113_description_24":{"attribute_name":"目次","attribute_value_mlt":[{"subitem_description":"1 Introduction 1\n1.1 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1\n1.2 Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4\n1.3 Research Contribution . . . . . . . . . . . . . . . . . . . . . . . . . . 5\n1.4 Organization of Sections . . . . . . . . . . . . . . . . . . . . . . . . . 6\n2 Interdisciplinary Background 8\n2.1 Review of Robotic Wheelchairs . . . . . . . . . . . . . . . . . . . . . 8\n2.1.1 Robotic Wheelchair . . . . . . . . . . . . . . . . . . . . . . . . 8\n2.2 Robotic Wheelchair Hardware . . . . . . . . . . . . . . . . . . . . . . 10\n2.2.1 Sensors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10\n2.2.2 Conventional 2D Laser Range Sensor . . . . . . . . . . . . . . 12\n2.3 Mobility Functionality of Robotic Wheelchair . . . . . . . . . . . 14\n2.4 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
15\n3 Terrain Recognition 16\n3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16\n3.2 Surface Classification Methodology . . . . . . . . . . . . . . . . . . . 18\n3.2.1 Conditioning of Raw LRS data . . . . . . . . . . . . . . . . . 19\n3.2.2 Segmentation and Windowing . . . . . . . . . . . . . . . . . . 21\n3.2.3 Surface 3D Map Generation . . . . . . . . . . . . . . . . . . . 21\n3.2.4 Feature Extraction . . . . . . . . . . . . . . . . . . . . . . . . 22\n3.2.5 Classification Model . . . . . . . . . . . . . . . . . . . . . . . 23\n3.3 Experiment and Results . . . . . . . . . . . . . . . . . . . . . . . . . 24\n3.4 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27\n4 Single Laser Bidirectional Sensing 28\n4.1 Introduction and Motivation . . . . . . . . . . . . . . . . . . . . . . . 28\n4.2 Robotic Wheelchair System . . . . . . . . . . . . . . . . . . . . . . . 30\n4.3 Data Acquisition Method . . . . . . . . . . . . . . . . . . . . . . . . . 31\n4.3.1 2D Sensor Using Undistorted Mirrors . . . . . . . . . . . . . . 32\n4.3.2 Coordinate Transformation from Laser Data to Camera Image Plane . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33\n4.3.3 Positioning the Laser Data onto the Camera Image Plane . . . 35\n4.4 Step Measuring Methodology . . . . . . . . . . . . . . . . . . . . . . 36\n4.5 Experiment Result . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37\n4.5.1 Upward Stair Measurement . . . . . . . . . . . . . . . . . . . 37\n4.5.2 Bus Floor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38\n4.6 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40\n5 Bus Boarding Wheelchair and Autonomous Companion Following Mode 41\n5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41\n5.2 System Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43\n5.2.1 BMR Hardware . . . . . . . . . . . . . . . . . . . . . . . . . . 
43\n5.2.2 Detection Software . . . . . . . . . . . . . . . . . . . . . . . . 45\n5.3 Bus, Bus door and doorsteps Detection . . . . . . . . . . . . . . . . . 45\n5.4 Bidirectional Sensing System (BSS) . . . . . . . . . . . . . . . . . . . 49\n5.4.1 Sensor System . . . . . . . . . . . . . . . . . . . . . . . . . . . 49\n5.4.2 Mapping Sensor Data on Camera Image . . . . . . . . . . . . 52\n5.5 Experimental Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53\n5.5.1 Step Detection and Position Correction . . . . . . . . . . . . . 54\n5.5.2 BMR Control . . . . . . . . . . . . . . . . . . . . . . . . . . . 54\n5.6 Experiment Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55\n5.7 Moving Alongside Companion . . . . . . . . . . . . . . . . . . . . . . 57\n5.8 Adjust Wheelchair Position . . . . . . . . . . . . . . . . . . . . . . . 64\n5.9 Chapter Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67\n6 Conclusions and Future Works 68\n6.1 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68\n6.2 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 
69\nBibliography 71","subitem_description_type":"Other"}]},"item_113_description_25":{"attribute_name":"注記","attribute_value_mlt":[{"subitem_description":"主指導教員 : 久野義徳","subitem_description_type":"Other"}]},"item_113_description_33":{"attribute_name":"資源タイプ","attribute_value_mlt":[{"subitem_description":"text","subitem_description_type":"Other"}]},"item_113_description_34":{"attribute_name":"フォーマット","attribute_value_mlt":[{"subitem_description":"application/pdf","subitem_description_type":"Other"}]},"item_113_dissertation_number_19":{"attribute_name":"学位授与番号","attribute_value_mlt":[{"subitem_dissertationnumber":"甲第1101号"}]},"item_113_identifier_registration":{"attribute_name":"ID登録","attribute_value_mlt":[{"subitem_identifier_reg_text":"10.24561/00018656","subitem_identifier_reg_type":"JaLC"}]},"item_113_publisher_11":{"attribute_name":"出版者名","attribute_value_mlt":[{"subitem_publisher":"埼玉大学大学院理工学研究科"}]},"item_113_publisher_12":{"attribute_name":"出版者名(別言語)","attribute_value_mlt":[{"subitem_publisher":"Graduate School of Science and Engineering, Saitama University"}]},"item_113_record_name_8":{"attribute_name":"書誌","attribute_value_mlt":[{"subitem_record_name":"博士論文(埼玉大学大学院理工学研究科(博士後期課程))"}]},"item_113_text_31":{"attribute_name":"版","attribute_value_mlt":[{"subitem_text_value":"[出版社版]"}]},"item_113_text_36":{"attribute_name":"アイテムID","attribute_value_mlt":[{"subitem_text_value":"GD0001021"}]},"item_113_text_4":{"attribute_name":"著者 所属","attribute_value_mlt":[{"subitem_text_value":"埼玉大学大学院理工学研究科(博士後期課程)理工学専攻"}]},"item_113_text_5":{"attribute_name":"著者 所属(別言語)","attribute_value_mlt":[{"subitem_text_value":"Graduate School of Science and Engineering, Saitama University"}]},"item_113_version_type_32":{"attribute_name":"著者版フラグ","attribute_value_mlt":[{"subitem_version_resource":"http://purl.org/coar/version/c_970fb48d4fbd8a85","subitem_version_type":"VoR"}]},"item_access_right":{"attribute_name":"アクセス権","attribute_value_mlt":[{"subitem_access_right":"open 
access","subitem_access_right_uri":"http://purl.org/coar/access_right/c_abf2"}]},"item_creator":{"attribute_name":"著者","attribute_type":"creator","attribute_value_mlt":[{"creatorNames":[{"creatorName":"SHAMIM AL MAMUN","creatorNameLang":"en"},{"creatorName":"シャーミム アロ マムン","creatorNameLang":"ja-Kana"}]}]},"item_files":{"attribute_name":"ファイル情報","attribute_type":"file","attribute_value_mlt":[{"accessrole":"open_date","date":[{"dateType":"Available","dateValue":"2019-07-12"}],"displaytype":"detail","filename":"GD0001021.pdf","filesize":[{"value":"126.6 MB"}],"format":"application/pdf","licensetype":"license_note","mimetype":"application/pdf","url":{"label":"GD0001021.pdf","objectType":"fulltext","url":"https://sucra.repo.nii.ac.jp/record/18687/files/GD0001021.pdf"},"version_id":"fe42698d-235d-413a-84d1-bdbebf788bf1"}]},"item_language":{"attribute_name":"言語","attribute_value_mlt":[{"subitem_language":"eng"}]},"item_resource_type":{"attribute_name":"資源タイプ","attribute_value_mlt":[{"resourcetype":"doctoral thesis","resourceuri":"http://purl.org/coar/resource_type/c_db06"}]},"item_title":"Robotic Wheelchair for Navigating Urban Environments","item_titles":{"attribute_name":"タイトル","attribute_value_mlt":[{"subitem_title":"Robotic Wheelchair for Navigating Urban Environments","subitem_title_language":"en"}]},"item_type_id":"113","owner":"15","path":["942"],"pubdate":{"attribute_name":"PubDate","attribute_value":"2019-07-12"},"publish_date":"2019-07-12","publish_status":"0","recid":"18687","relation_version_is_last":true,"title":["Robotic Wheelchair for Navigating Urban Environments"],"weko_creator_id":"15","weko_shared_id":-1},"updated":"2023-06-23T09:19:58.663416+00:00"}