Detecting internal emotions using video cameras
https://doi.org/10.24561/00019153
File: GD0001219.pdf (2.1 MB)
| Field | Value |
|---|---|
| Item type | 学位論文 / Thesis or Dissertation |
| Release date | 2021-01-26 |
| Title | Detecting internal emotions using video cameras (en) |
| Alternative title | ビデオカメラを用いた内面的感情の認識 |
| Language | eng |
| Resource type | doctoral thesis (http://purl.org/coar/resource_type/c_db06) |
| ID registration | 10.24561/00019153 (JaLC) |
| Access rights | open access (http://purl.org/coar/access_right/c_abf2) |
| Author | KEYA, DAS TILOTTOMA |
| Author affiliation | Graduate School of Science and Engineering (Doctoral Program), Saitama University |
| Series | Doctoral theses, Graduate School of Science and Engineering (Doctoral Program), Saitama University |
| Issue date | 2020 |
| Publisher | Graduate School of Science and Engineering, Saitama University |
| Extent | ix, 58 p. |
| Degree number | 甲第1170号 |
| Degree conferral date | 2020-03-23 |
| Degree name | 博士(学術) (Doctor of Philosophy) |
| Degree grantor | Saitama University (kakenhi institution ID: 12401) |
Abstract:
Automatically recognizing human emotion is an interesting and challenging task with many applications, such as human-robot interaction and movie marketing. Emotion analysis and recognition has become an active research topic in the computer vision community, and making computers more human-like for intelligent user interfaces is one of the central challenges in Human-Computer Interaction (HCI) and Human-Robot Interaction (HRI) today. Emotion, one aspect of user affect, is one of the most important ways people communicate with each other. Given the importance and potential of emotion, interfaces that respond to the human user's emotional state are increasingly desirable in intelligent user interfaces such as human-robot interaction, and there has consequently been much work on systems that identify emotional states. Recognizing human emotion from facial expressions is a common approach, and many researchers work in this direction. However, facial expressions can be faked or hidden: one's "apparent emotions", as read from, say, facial expressions, may not reflect one's genuine "inner emotions". Other modalities, such as physiological responses, should therefore be investigated for detecting and recognizing a person's inner emotions. This thesis aims to build a practical system for detecting such emotions by observing physiological responses. We focus on sensing physiological changes through visual means using only conventional cameras, as this requires no specialized equipment.

As mentioned earlier, the computer vision community has made many advances in apparent emotion recognition through facial expressions. The psychophysiology community, in turn, has conducted several studies on detecting and recognizing internal emotions through different physiological channels, including heart rate changes, eye movements, eye blinks, changes in skin conductance, and changes in skin temperature. Although many different physiological signs have been used, many researchers find cardiac activity particularly useful for emotion recognition, so this thesis explores cardiac activity for that purpose. Most past studies, however, read cardiac activity with electrocardiography (ECG). ECG is effective but limited by the need for attached sensors and its higher cost. To realize a practical system for many real-world settings, we need a method that senses cardiac activity without any wearable attachments or devices. Fortunately, remote photoplethysmography (PPG) algorithms have received attention in recent years. These techniques read cardiac activity, such as heart rate (HR), from conventional cameras, typically by observing small changes in skin color over time. This thesis uses cardiac activity sensed by remote PPG to detect and recognize emotions. Since remote PPG works with conventional cameras, our approach has the benefit that webcams, surveillance cameras, and cellphone cameras could all be used. With the ability to observe cardiac activity without contact sensors, we present a convenient system for detecting internal emotions. As in the psychophysiology literature, we evaluate our approach by recognizing emotional reactions to emotionally stimulating videos such as horror and comedy clips. In the first phase of our work, we showed video content to human subjects and collected HR data with an attached sensor (a Fitbit) for three emotional states (normal resting, funny, and horror). We confirmed that the average HR differs with statistical significance between the normal resting state and the emotionally stimulated states.
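The core remote-PPG idea described above, reading heart rate from small skin-color changes over time, can be sketched as follows. This is a generic illustration rather than the thesis's exact algorithm; the per-frame green-channel averaging, the 30 fps frame rate, and the 40-180 BPM search band are assumptions.

```python
import math

def estimate_hr_bpm(green_means, fps=30.0, lo_bpm=40, hi_bpm=180):
    """Estimate heart rate from the mean green-channel value of a face
    region in each video frame, by finding the dominant frequency in
    the cardiac band with a direct discrete Fourier sum."""
    n = len(green_means)
    mean = sum(green_means) / n
    x = [v - mean for v in green_means]          # remove the DC component
    best_bpm, best_power = lo_bpm, 0.0
    for bpm in range(lo_bpm, hi_bpm + 1):        # scan candidate heart rates
        f = bpm / 60.0                           # candidate frequency in Hz
        re = sum(x[t] * math.cos(2 * math.pi * f * t / fps) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * f * t / fps) for t in range(n))
        power = re * re + im * im
        if power > best_power:
            best_bpm, best_power = bpm, power
    return best_bpm

# Synthetic skin-color signal: a 72 BPM pulse plus slow lighting drift.
fps, seconds = 30.0, 10
signal = [0.5 * math.sin(2 * math.pi * 1.2 * t / fps)   # 1.2 Hz = 72 BPM
          + 0.05 * t / fps                               # slow drift
          for t in range(int(fps * seconds))]
print(estimate_hr_bpm(signal, fps))                      # prints 72
```

In practice, remote-PPG pipelines add face tracking, detrending, and bandpass filtering before the spectral step; the brute-force frequency scan above simply makes the "dominant frequency in the cardiac band" idea explicit without any dependencies.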
The first phase showed that HR can be used to detect changes in emotional state, but it did not address recognizing which emotions were present. In the next phase, we investigated the feasibility of using cardiac pulse signals to recognize different emotional states (joy vs. fear) and compared this with the use of facial expressions. In this thesis, we use a remote, video-based cardiac activity sensing technique to obtain physiological data for identifying emotional states, and we show that emotional states can be differentiated from the remotely sensed cardiac pulse patterns alone. Specifically, we conducted an experimental study on recognizing the emotions of people watching video clips: we recorded volunteers who all watched the same comedy and horror clips, and we estimated their cardiac pulse signals from the video footage. From the cardiac pulse signal alone, we were able to classify whether a subject was watching the comedy or the horror clip. For comparison, we used the OpenFace facial landmark tracker to estimate each subject's average facial action unit intensities over 30-second segments of both the comedy and horror viewings, classified the same task from facial action units, and discuss how the two modalities compare. We also compared the HR method with various wearable sensors and features used to detect physiological signals, such as a Wii Fit balance board, a pulse oximeter, and galvanic skin response (GSR).

In short, the two main contributions of this PhD thesis are sensing emotional changes in humans and determining the effectiveness of emotion recognition using only remotely PPG-sensed data relative to conventional facial expression analysis. To our knowledge, we are among the first to bridge the gap between computer vision and psychophysiology by presenting a promising system for the visual detection of internal emotions.
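The thesis classifies emotional states with PCA followed by an SVM (Chapter 4). As a dependency-free stand-in for that pipeline, the sketch below applies the same overall idea, per-segment features of the estimated pulse signal fed to a classifier, using a nearest-centroid classifier instead of an SVM. The feature choice (mean HR and beat-to-beat variability), the toy training numbers, and all names are illustrative assumptions, not the thesis's data.

```python
import statistics

def pulse_features(hr_series):
    """Per-segment descriptors of a heart-rate time series: mean level
    and short-term variability (the kind of features a PCA+SVM
    pipeline would consume)."""
    diffs = [b - a for a, b in zip(hr_series, hr_series[1:])]
    return (statistics.mean(hr_series), statistics.pstdev(diffs))

def train_centroids(labeled_segments):
    """Average the feature vectors of the training segments per label."""
    by_label = {}
    for label, series in labeled_segments:
        by_label.setdefault(label, []).append(pulse_features(series))
    return {label: tuple(statistics.mean(dim) for dim in zip(*feats))
            for label, feats in by_label.items()}

def classify(centroids, series):
    """Assign the label whose centroid is nearest in feature space."""
    f = pulse_features(series)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(f, centroids[lab])))

# Toy data: horror viewing tends to raise and destabilize heart rate.
train = [("comedy", [68, 69, 70, 69, 68, 70]),
         ("comedy", [71, 70, 72, 71, 70, 71]),
         ("horror", [82, 88, 84, 90, 86, 92]),
         ("horror", [80, 86, 83, 89, 85, 91])]
centroids = train_centroids(train)
print(classify(centroids, [81, 87, 83, 90, 85, 91]))   # prints horror
```

The replacement of the SVM by a nearest-centroid rule is deliberate: it keeps the example runnable without scikit-learn while preserving the structure the thesis describes, namely low-dimensional pulse features compared against patterns learned from labeled comedy/horror segments.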
Table of contents:

1 Introduction (p. 1)
1.1 Internal Emotions (p. 1)
1.2 Internal Emotion Recognition from Heart Rate (p. 2)
1.3 Heart Rate from Video (p. 3)
1.4 Organization of the Thesis (p. 4)
2 Related Work (p. 6)
2.1 Emotion recognition from facial expression (p. 6)
2.2 Emotion recognition from other nonverbal behaviors (p. 8)
2.3 Emotion recognition from physiological measures (p. 9)
2.4 Application of emotion recognition: Human Computer Interaction and Human-Robot Interaction (p. 10)
2.5 Chapter Summary (p. 13)
3 Detecting Internal Emotions from Video Based Heart Rate Sensing (p. 14)
3.1 Estimating Heart Rates from Video (p. 14)
3.2 Video PPG Algorithm for Estimating Heart Rate (p. 16)
3.3 Detect Internal Emotion from Video Vision Based (p. 18)
3.4 Experimental Analysis and Result (p. 20)
3.4.1 Result for Single Subject (p. 22)
3.4.2 Result for Multiple Subjects (p. 23)
3.5 Conclusion (p. 24)
3.6 Chapter Summary (p. 26)
4 Classification of Emotions from Video Based Cardiac Pulse Estimation (p. 28)
4.1 Estimating Heart Pulse from Video (p. 29)
4.2 Classification Method (p. 29)
4.3 Using Support Vector Machine (SVM) and Principal Component Analysis (PCA) for Classification (p. 31)
4.3.1 Using Support Vector Machine (SVM) for emotion classification (p. 31)
4.3.2 Principal component analysis (PCA) (p. 32)
4.3.3 Using Support Vector Machine (SVM) for emotion classification (p. 32)
4.4 Experiment (p. 32)
4.5 Conclusion (p. 35)
4.6 Chapter Summary (p. 36)
5 Hidden Emotion Detection Video method and others Feature comparison (p. 37)
5.1 Introduction (p. 37)
5.2 System Overview (p. 39)
5.3 Experiment Setup (p. 41)
5.4 Experiment Results (p. 43)
5.5 Chapter Summary (p. 46)
6 Conclusions and Future Work (p. 48)
6.1 Conclusions (p. 48)
6.2 Future Work (p. 49)
| Field | Value |
|---|---|
| Note | Supervisor (指導教員): 久野義徳 |
| Edition | [出版社版] (publisher's version) |
| Version type | VoR (http://purl.org/coar/version/c_970fb48d4fbd8a85) |
| Resource type | text |
| Format | application/pdf |
| Created | 2021-01-26 |
| Item ID | GD0001219 |