
Sensor Fusion for Closed-loop Control of Upper-limb Prostheses

dc.contributor.advisor: Farina, Dario Prof. Dr. Dr.
dc.contributor.author: Markovic, Marko
dc.date.accessioned: 2016-07-19T11:11:28Z
dc.date.available: 2016-07-20T22:50:06Z
dc.date.issued: 2016-07-19
dc.identifier.uri: http://hdl.handle.net/11858/00-1735-0000-0028-87CB-C
dc.identifier.uri: http://dx.doi.org/10.53846/goediss-5733
dc.language.iso: eng
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject.ddc: 610
dc.title: Sensor Fusion for Closed-loop Control of Upper-limb Prostheses
dc.type: doctoralThesis
dc.contributor.referee: Graimann, Bernhard Dr.
dc.date.examination: 2016-04-18
dc.description.abstracteng: This thesis addresses open challenges in the field of myoelectrically controlled upper-limb prostheses. These challenges include the inherently low bandwidth of the myoelectric control channel, which makes current control interfaces limited and unintuitive for the user, especially when controlling modern multi-function prostheses, as well as the lack of somatosensory feedback that would allow the user to better perceive the state of the prosthesis. This thesis aims at addressing these challenges by designing novel man-machine interfaces, based on the latest sensing and automatic control technologies, to provide improved operation and perception of the prosthetic device. To this end, the thesis comprises introductory chapters that describe the state of the art in the field, the aims of the thesis and the methodology used, as well as four peer-reviewed journal publications presenting the novel feedforward and feedback methods.

In the first two studies, I proposed and evaluated a novel system for prosthesis control based on sensor fusion. In the classic approach, the user has the responsibility of generating all the command signals, while the prosthesis controller operates as a decoder, acquiring the signals and decoding the user's intention. In the novel framework proposed here, the prosthesis is enhanced with advanced sensing and autonomous decision-making, thereby becoming an intelligent agent that assists the user. The inspiration for this approach comes from modern autonomous robotic systems, which utilize a variety of multimodal sensors and data-processing methods to perceive and interpret the environment. In the present work, the prosthetic hand was equipped with computer vision and inertial sensing, and this information was used by the prosthesis controller to provide an additional, artificial-intelligence processing layer. This component analyzed the usage context (environment, user and prosthesis) and, based on this, automatically controlled the hand preshape and orientation, thereby supporting the user in operating the prosthesis functions. The overall control loop is therefore simplified for the user, because the sensor-fusion controller takes over part of the inherent control complexity by adjusting the prosthesis parameters automatically. The user only provides high-level commands (e.g., grasp an object), which can be delivered robustly through a simple two-channel myoelectric interface.

In the last two studies, I introduced a versatile development framework for evaluating a variety of feedback interfaces. The framework comprises a library of components implementing specific elements of a generic closed-loop prosthesis control system, from the control inputs to the feedback interfaces. The framework operates in real time and allows fast prototyping and testing. It was used to develop and evaluate a novel biofeedback paradigm that closes the loop by feeding the myoelectric control signals (the prosthesis input) back to the user. This is a novel approach with respect to the classic methods in the literature, in which the feedback variables were the prosthesis outputs (e.g., grasping force, joint angle). Because the prosthesis reaction is proportional to the user's myoelectric commands, this paradigm allows for more predictive and robust control than the state-of-the-art approaches. For example, the user can exploit the biofeedback to modulate his/her command to the prosthesis during closing, so that the desired grasping force is generated after contact (predictive force control). Finally, a practical biofeedback implementation that utilizes augmented reality and the wearable see-through display embedded in Google Glass is also presented.

In conclusion, by extending both the feedforward and the feedback interfaces with new functionalities, this thesis advances the overall development of modern man-machine interfaces for prosthesis control. This research can lead to effective and user-friendly methods for the control of advanced modern prostheses (e.g., dexterous hands, full arms), which in turn could improve the utility of these systems and facilitate their wider acceptance in daily life.
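The predictive force-control idea described in the abstract can be sketched as follows. This is a minimal illustration only, not the thesis implementation: the linear command-to-force mapping, the five-level feedback display, and all function names are assumptions made for the sketch. The key point it demonstrates is that the feedback variable is the myoelectric command itself (the prosthesis input), so the user can correct the command before contact rather than reacting to the grasp force afterwards.

```python
# Illustrative sketch (assumptions, not the thesis implementation) of
# EMG biofeedback for predictive grasp-force control.

def quantize_feedback(command, n_levels=5):
    """Map a normalized EMG command in [0, 1] to a discrete feedback
    level, mimicking a feedback display with limited resolution."""
    command = min(max(command, 0.0), 1.0)
    return min(int(command * n_levels), n_levels - 1)

def grasp_force_after_contact(command, max_force_n=40.0):
    """Assumed proportional mapping: the steady-state grasping force
    (in newtons) is proportional to the command held at contact."""
    return min(max(command, 0.0), 1.0) * max_force_n

if __name__ == "__main__":
    # While the hand is still closing, the user watches the feedback
    # level and relaxes the contraction until it matches the level
    # associated with the desired force -- predictive, not reactive.
    desired_force_n = 20.0
    target_level = quantize_feedback(desired_force_n / 40.0)
    command = 0.9                     # initial, too-strong command
    while quantize_feedback(command) > target_level:
        command -= 0.05               # user relaxes the contraction
    print("feedback level:", quantize_feedback(command))
    print("force at contact (N):", round(grasp_force_after_contact(command), 1))
```

Because the correction happens before the fingers touch the object, the resulting force is determined by the already-adjusted command, which is what distinguishes this input-side feedback from classic output-side (force or angle) feedback.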
dc.contributor.coReferee: Sax, Ulrich Prof. Dr.
dc.subject.eng: closed-loop control
dc.subject.eng: sensor fusion
dc.subject.eng: upper limb prosthesis
dc.subject.eng: sensory feedback
dc.subject.eng: semi-autonomous
dc.subject.eng: myoelectric prosthesis
dc.identifier.urn: urn:nbn:de:gbv:7-11858/00-1735-0000-0028-87CB-C-8
dc.affiliation.institute: Medizinische Fakultät
dc.subject.gokfull: Methoden und Techniken in der Medizin (PPN619875143)
dc.description.embargoed: 2016-07-20
dc.identifier.ppn: 863415393

