
dc.contributor.advisor Farina, Dario Prof. Dr. Dr.
dc.contributor.author Markovic, Marko
dc.date.accessioned 2016-07-19T11:11:28Z
dc.date.available 2016-07-20T22:50:06Z
dc.date.issued 2016-07-19
dc.identifier.uri http://hdl.handle.net/11858/00-1735-0000-0028-87CB-C
dc.language.iso eng de
dc.rights.uri http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject.ddc 610 de
dc.title Sensor Fusion for Closed-loop Control of Upper-limb Prostheses de
dc.type doctoralThesis de
dc.contributor.referee Graimann, Bernhard Dr.
dc.date.examination 2016-04-18
dc.description.abstracteng This thesis addresses open challenges in the field of myoelectrically controlled upper-limb prostheses. These challenges include the inherently low bandwidth of the myoelectric control channel, which makes current control interfaces limited and unintuitive for the user, especially when controlling modern multi-function prostheses, as well as the lack of somatosensory feedback that would allow the user to better perceive the state of his/her prosthesis. This thesis addresses these challenges by designing novel man-machine interfaces, based on the latest sensing and automatic control technologies, to provide improved operation and perception of the prosthetic device. To this end, the thesis comprises introductory chapters describing the state of the art in the field, the aim of the thesis, and the methodology used, as well as four peer-reviewed journal publications presenting the novel feedforward and feedback methods. In the first two studies, I proposed and evaluated a novel system for prosthesis control based on sensor fusion. In the classic approach, the user is responsible for generating all the command signals, while the prosthesis controller acts as a decoder, acquiring the signals and decoding the user's intention. In the novel framework proposed here, the prosthesis is enhanced with advanced sensing and autonomous decision-making, thereby becoming an intelligent agent that assists the user. The inspiration for this approach comes from modern autonomous robotic systems, which use a variety of multimodal sensors and data-processing methods to perceive and interpret the environment. In the present work, the prosthetic hand was equipped with computer vision and inertial sensing, and this information was used by the prosthesis controller to provide an additional, artificial-intelligence processing layer. 
This component analyzed the usage context (environment, user, and prosthesis) and, based on this analysis, automatically controlled the hand preshape and orientation, thereby supporting the user in operating the prosthesis functions. The overall control loop is simplified for the user because the sensor-fusion controller takes over part of the inherent control complexity by adjusting the prosthesis parameters automatically. The user only provides high-level commands (e.g., grasp an object) that can be delivered robustly through a simple two-channel myoelectric interface. In the last two studies, I introduced a versatile development framework for evaluating a variety of feedback interfaces. The framework comprises a library of components implementing specific elements of a generic closed-loop prosthesis control system, from the control inputs to the feedback interfaces. The framework operates in real time and allows fast prototyping and testing. It was used to develop and evaluate a novel biofeedback paradigm that closes the loop by feeding the myoelectric control signals (the prosthesis input) back to the user. This is a novel approach with respect to the classic methods in the literature, in which the feedback variables were the prosthesis outputs (e.g., grasping force, joint angle). Because the prosthesis reaction is proportional to the user's myoelectric commands, this paradigm allows for more predictive and robust control than the state-of-the-art approaches. For example, the user can exploit the biofeedback to modulate his/her command to the prosthesis during closing, so that the desired grasping force is generated after contact (predictive force control). Finally, a practical biofeedback implementation that utilizes augmented reality and the wearable see-through display embedded in Google Glass is also presented. 
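To illustrate the semi-autonomous control idea described above, the following is a minimal, purely hypothetical sketch (not the thesis implementation): a rule-based sensor-fusion layer that maps vision- and inertial-based estimates of the usage context to a hand preshape and wrist orientation, so that the user only needs to issue a high-level close command over a two-channel myoelectric interface. All names, thresholds, and mappings here are illustrative assumptions.

```python
# Hypothetical sketch of a context-driven preshape controller.
# Assumed inputs: an object estimate from computer vision and a hand
# orientation estimate from inertial sensing (both invented here).
from dataclasses import dataclass

@dataclass
class Context:
    object_width_cm: float   # assumed vision-based size estimate
    object_shape: str        # assumed shape class: "cylinder", "box", "small"
    hand_roll_deg: float     # assumed inertial orientation estimate

def select_preshape(ctx: Context) -> dict:
    """Choose grasp type, aperture, and wrist rotation from the context."""
    if ctx.object_shape == "small" or ctx.object_width_cm < 2.0:
        grasp = "pinch"
    elif ctx.object_shape == "cylinder":
        grasp = "cylindrical"
    else:
        grasp = "palmar"
    # Open the hand slightly wider than the object (normalized 0..1).
    aperture = min(1.0, ctx.object_width_cm / 10.0 + 0.2)
    # Counter-rotate the wrist so the palm stays aligned with the object.
    wrist = -ctx.hand_roll_deg
    return {"grasp": grasp, "aperture": round(aperture, 2), "wrist_deg": wrist}

# The controller preshapes automatically; the user only triggers "close":
cmd = select_preshape(Context(object_width_cm=6.0,
                              object_shape="cylinder",
                              hand_roll_deg=30.0))
print(cmd)  # {'grasp': 'cylindrical', 'aperture': 0.8, 'wrist_deg': -30.0}
```

The point of the sketch is the division of labor: the automatic layer resolves the many low-level parameters, while the low-bandwidth myoelectric channel carries only the high-level intent.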
In conclusion, by extending both the feedforward and the feedback interfaces with new functionalities, this thesis advances the development of modern man-machine interfaces for prosthesis control. This research can lead to effective and user-friendly methods for the control of advanced modern-day prostheses (e.g., dexterous hands, full arms), which in turn could improve the utility and facilitate wider acceptance of these systems in daily life. de
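The abstract's argument for feeding back the prosthesis *input* rather than its output can be made concrete with a small, hypothetical simulation (not the thesis code): under proportional myoelectric control, the grasping force after contact is determined by the EMG level at the moment of closing, so showing the user that level as discrete biofeedback lets him/her correct it before contact. The mapping, force range, and number of feedback levels below are illustrative assumptions.

```python
# Hypothetical sketch of predictive force control via EMG biofeedback.

def emg_to_force(emg: float, max_force: float = 40.0) -> float:
    """Proportional mapping: normalized EMG command (0..1) -> force in N."""
    return max(0.0, min(1.0, emg)) * max_force

def biofeedback_level(emg: float, levels: int = 5) -> int:
    """Quantize the EMG command into discrete feedback levels, e.g. bars
    rendered on an augmented-reality display."""
    return min(levels - 1, int(max(0.0, min(1.0, emg)) * levels))

# The user starts with an over-strong contraction and, guided only by
# the displayed level, relaxes until the feedback shows level 2 -- all
# BEFORE the hand touches the object (hence "predictive"):
emg = 0.9
while biofeedback_level(emg) > 2:
    emg -= 0.1  # user relaxes one step per feedback update
print(round(emg, 1), round(emg_to_force(emg), 1))  # 0.5 20.0
```

In contrast, feeding back the prosthesis output (force) would only inform the user after contact, when the error has already occurred; the input-side feedback moves the correction ahead of the event.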
dc.contributor.coReferee Sax, Ulrich Prof. Dr.
dc.subject.eng closed-loop control de
dc.subject.eng sensor fusion de
dc.subject.eng upper limb prosthesis de
dc.subject.eng sensory feedback de
dc.subject.eng semi-autonomous de
dc.subject.eng myoelectric prosthesis de
dc.identifier.urn urn:nbn:de:gbv:7-11858/00-1735-0000-0028-87CB-C-8
dc.affiliation.institute Medizinische Fakultät de
dc.subject.gokfull Methoden und Techniken in der Medizin (PPN619875143) de
dc.description.embargoed 2016-07-20
dc.identifier.ppn 863415393