8th Dutch Bio-Medical Engineering Conference
12:30   Motion
Chair: Kenneth Meijer
12:30
15 mins
Portable gait lab: tracking gait kinetics and kinematics using only three inertial measurement units
Mohamed Irfan Mohamed Refai, Bert-Jan F. van Beijnum, Jaap H. Buurke, Peter H. Veltink
Abstract: There have been several efforts to develop minimal inertial measurement unit (IMU) setups for ambulatory gait tracking. IMUs are small and wearable, and can be integrated into the user's shoes or clothing. This has several potential uses, including remote monitoring, or serving as a simple measurement tool to record gait recovery after an impairment. An ideal minimal sensing setup to track the kinematics of the feet and centre of mass (CoM) would place three IMUs: one on the hip and one on each foot. Two major issues that need to be dealt with, among others, are reducing the uncertainty in the distance between segments and avoiding the use of magnetic sensing for estimating the heading direction. As part of AMBITION, an NWO project, we developed solutions for these issues. Our methods allow tracking of 3D ground reaction forces (GRF) and of the kinematics of the feet and CoM during gait using the three IMUs [1]–[3]. The Centroidal Moment Pivot (CMP) theory [1], [4] plays a pivotal role in these solutions. We validated the algorithms for variable over-ground gait against reference systems such as VICON© and ForceShoes™. The walking tasks tested included straight-line walking, walking with turns, slalom walking, and simulated asymmetrical walking. Our average absolute errors across all these tasks in estimating step lengths and step widths were 4.6 ± 1.5 cm and 3.8 ± 1.5 cm respectively, which are slightly larger than the variability of these measures in healthy and stroke populations. Nonetheless, our approach was able to track variable walking and distinguished step-length asymmetry during asymmetrical gait. In addition to these efforts, we introduce the concept of a body-centric reference frame. This frame is defined by combining relative movement information from the three sensors, thereby avoiding the use of magnetic sensing [1]–[3]. Next, we plan to test our algorithms on subjects with gait impairment.
REFERENCES
[1] M. I. Mohamed Refai et al., “Portable Gait Lab: Zero Moment Point for Minimal Sensing of Gait,” in 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2019, pp. 2077–2081.
[2] M. I. Mohamed Refai, B.-J. F. van Beijnum, J. H. Buurke, and P. H. Veltink, “Portable Gait Lab: Estimating 3D GRF Using a Pelvis IMU in a Foot IMU Defined Frame,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 28, no. 6, pp. 1308–1316, Jun. 2020.
[3] M. I. Mohamed Refai, B.-J. F. van Beijnum, J. H. Buurke, and P. H. Veltink, “Portable Gait Lab: Tracking Relative Distances of Feet and CoM Using Three IMUs,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 28, no. 10, pp. 2255–2264, Oct. 2020.
[4] M. B. Popovic, A. Goswami, and H. Herr, “Ground reference points in legged locomotion: Definitions, biological trajectories and control implications,” Int. J. Rob. Res., vol. 24, no. 12, pp. 1013–1032, 2005.
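The abstract's algorithms rest on CMP theory and a body-centric frame; as background, a common building block of foot-worn IMU gait tracking is strapdown integration of the foot's free acceleration with zero-velocity updates (ZUPT) during stance, which resets the drift that otherwise accumulates in the velocity estimate. The sketch below is illustrative only: the stance detector, sample rate, and acceleration profile are invented for the example and are not the authors' method.

```python
import numpy as np

def zupt_velocity(acc, dt, stance):
    """Integrate free (gravity-removed) acceleration to foot velocity,
    zeroing the estimate whenever the foot is detected as stationary."""
    vel = np.zeros_like(acc)
    for k in range(1, len(acc)):
        if stance[k]:
            vel[k] = 0.0                     # zero-velocity update: kill drift
        else:
            vel[k] = vel[k - 1] + acc[k] * dt
    return vel

# Invented swing phase: accelerate, then decelerate the foot forward
dt = 0.01                                    # 100 Hz sampling (hypothetical)
acc = np.zeros((100, 3))
acc[10:40, 0] = 2.0                          # forward acceleration (m/s^2)
acc[40:70, 0] = -2.0                         # forward deceleration
stance = np.zeros(100, dtype=bool)
stance[:10] = stance[70:] = True             # foot flat before and after swing
vel = zupt_velocity(acc, dt, stance)
step_length = float(np.sum(vel[:, 0]) * dt)  # forward foot displacement (m)
```

Integrating the corrected velocity once more gives the foot trajectory over a stride; the hard parts the abstract addresses (inter-segment distance and magnetometer-free heading) sit on top of this basic scheme.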
12:45
15 mins
3D-printed sensing system to assess pathological synergies of the upper extremities
Gerjan Wolterink, Gijs Krijnen, Peter Veltink, Bert-Jan Van Beijnum
Abstract: Sensory information about upper-limb synergies is of great value for applications in the control of prostheses and assistive devices, and for the quantitative assessment of rehabilitation therapy. The goal of this research is to integrate EMG, kinematic, and kinetic sensing data from soft, flexible, personalized sensing structures, to obtain continuous information about the interaction of the human body with the environment. For the kinematic data, existing inertial measurement units (IMUs) are used. For the EMG and kinetic data, new soft and customizable sensors are being developed using FDM 3D-printing [1,4]. The IMU measurement system was tested in a clinical environment on 10 chronic stroke subjects, with inertial sensors placed on the sternum, upper arm, lower arm, hand, and the distal phalanges of the index finger, middle finger, and thumb. This study showed the potential of this measurement system to assess pathological synergies, enabling a quantitative assessment of upper-limb synergies and rehabilitation progress that is nowadays performed by a clinician or physical therapist [1]. Additionally, the interaction forces between the fingertips and the external environment are of great value in upper-extremity assessment. However, force sensors currently used on the fingertips present two main difficulties. Firstly, the sensors are usually made of stiff materials, leading to a loss of touch sensation for the user. Secondly, current sensors do not enable good sensor-to-skin attachment, making the sensor more sensitive to unwanted movements caused by external forces [2]. Using FDM 3D-printing, a flexible, three-material finger sensor was developed. This sensor measures both shear and normal forces via the mechanical deformation of the fingertips caused by those forces.
Therefore, the sensor is soft and can measure the interaction forces between the environment and the fingertips while keeping the loss of touch sensation low [3].
[1] G. Wolterink, P. Dias, R. G. Sanders, F. Muijzer, B. J. van Beijnum, P. Veltink, and G. Krijnen, “Development of Soft sEMG Sensing Structures Using 3D-Printing Technologies,” Sensors, vol. 20, no. 15, p. 4292, 2020.
[2] A. Schwarz, M. Bhagubai, G. Wolterink, J. P. Held, A. R. Luft, and P. H. Veltink, “Assessment of Upper Limb Movement Impairments after Stroke Using Wearable Inertial Sensing,” Sensors, vol. 20, no. 17, p. 4770, 2020.
[3] H. Kortier, “Assessment of hand kinematics and interactions with the environment,” University of Twente, February 2018.
[4] G. Wolterink, R. Sanders, and G. Krijnen, “A flexible, three material, 3D-printed, shear force sensor for use on finger tips,” in 2019 IEEE SENSORS, 2019, pp. 1–4.
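A multi-channel sensor of this kind is typically mapped to normal and shear forces through a calibration against a reference force sensor; the simplest variant is a linear least-squares calibration matrix. The sketch below illustrates that idea only — the channel count, gains, and data are hypothetical and do not describe the printed sensor's actual characteristics.

```python
import numpy as np

def fit_calibration(readings, forces):
    """Least-squares calibration matrix C such that forces ~= readings @ C.
    readings: (N, k) raw sensor channels; forces: (N, 3) reference loads."""
    C, *_ = np.linalg.lstsq(readings, forces, rcond=None)
    return C

# Hypothetical data: 3 piezoresistive channels under known (Fx, Fy, Fn) loads
rng = np.random.default_rng(1)
true_C = np.array([[0.5, 0.1, 2.0],
                   [0.1, 0.6, 1.8],
                   [-0.2, 0.3, 2.2]])             # invented channel gains
readings = rng.uniform(0.0, 1.0, size=(200, 3))
forces = readings @ true_C + 0.01 * rng.standard_normal((200, 3))
C = fit_calibration(readings, forces)             # recovers ~true_C
force_estimate = readings @ C                     # calibrated force output
```

A linear map suffices only if the channels respond linearly over the load range; soft printed structures often need a polynomial or per-axis model instead, fitted the same way.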
13:00
15 mins
A data-driven approach to time series analysis - application of Singular Spectrum Analysis to motor adaptation data
Sander Swart, Rob den Otter, Claudine Lamoth
Abstract: Background: Motor adaptation – a form of motor learning – has been widely studied by imposing a sustained perturbation during walking, reaching, or pointing movements. In walking, for instance, this has been done by imposing a different velocity on each leg using a split-belt treadmill. Although the speed mismatch initially evokes an asymmetry in step lengths and step times, subjects gradually re-establish a near-symmetric gait pattern over time. Hence, such adaptation curves often follow an asymptotic time course, whose global trend (i.e. persistent change) quantifies the adaptation process. Assessment of this global trend can be complicated, since adaptation curves also exhibit transitory changes such as trial-to-trial variations and noise. Although current techniques (e.g. curve fitting, bin- and moving-average filters) may reduce these transitory changes, they require a priori settings that cannot always be inferred beforehand. Therefore, we present Singular Spectrum Analysis (SSA) – a data-driven method – to analyse motor adaptation time series. SSA can discern the global trend from noise and trial-to-trial variations by decomposing the time series into its component parts and subsequently identifying the slowly varying global trend components. Based on simulated adaptation curves, it will be assessed whether SSA extracts the global trend better than conventional techniques. Methods: A curve resembling the adaptation time course was simulated as Y_N = (1 − N)/(a + N) + σ …
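Basic SSA is simple to prototype: embed the series in a trajectory (Hankel) matrix, take an SVD, keep the leading slowly varying components, and map back to a series by anti-diagonal averaging. The sketch below illustrates this pipeline; the window length, number of retained components, and the toy asymptotic curve are illustrative choices, not the study's actual simulation settings.

```python
import numpy as np

def ssa_trend(y, window, n_components=2):
    """Extract the slowly varying trend of a 1-D series with basic SSA:
    embed -> SVD -> keep leading components -> anti-diagonal averaging."""
    N = len(y)
    L = window                       # embedding window length
    K = N - L + 1                    # number of lagged copies
    # Trajectory (Hankel) matrix: column j is y[j:j+L]
    X = np.column_stack([y[j:j + L] for j in range(K)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Rank-truncated reconstruction from the leading components
    X_trend = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    # Anti-diagonal averaging maps the matrix back to a series:
    # entry X[i, j] contributes to series index i + j
    trend = np.zeros(N)
    counts = np.zeros(N)
    for j in range(K):
        trend[j:j + L] += X_trend[:, j]
        counts[j:j + L] += 1.0
    return trend / counts

# Toy asymptotic adaptation curve plus trial-to-trial noise (illustrative)
rng = np.random.default_rng(0)
trials = np.arange(1, 301)
true_trend = 20.0 / (20.0 + trials)   # decays asymptotically toward zero
y = true_trend + 0.05 * rng.standard_normal(trials.size)
est = ssa_trend(y, window=60)
```

The data-driven aspect is that only the window length and a component-grouping rule are chosen; no parametric curve shape is imposed on the trend.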
13:15
15 mins
Multi-camera motion capture system without markers
Jinne Geelen, Mariana P. Branco, Winfred Mugge, Alfred Schouten, Frans van der Helm
Abstract: Motion capture is a crucial technique for various disciplines, such as biomechanics, neuroscience, rehabilitation, and robotics. Current passive motion capture systems use cameras to record reflective markers, which requires attaching and labelling the markers. In an attempt to simplify this process, single-camera images can be enriched with a depth dimension to create 3D images. A well-known example from the gaming industry is the Microsoft Kinect. However, the Kinect is sensitive to data loss due to occlusion. Another way to transform 2D into 3D images is machine learning. One machine-learning strategy, called Lifting, artificially adds a layer of depth based on single-camera images. Despite being a promising technique, this solution does not yet produce accurate 3D results. Another strategy, requiring multiple cameras, is to use machine learning to track virtual markers and automate the labour-intensive process of labelling, as DeepLabCut does. Currently, triangulation from multiple cameras provides the most reliable outcomes, especially for delicate movements that are sensitive to occlusion. We developed and evaluated a multi-camera motion capture system. The system contains several single-board computers (Raspberry Pi). Every camera is coupled to a Pi, and one additional Pi functions as the controller, which is operated from a web application. By connecting all Pis to the same network, we obtained sub-millisecond synchronisation between the camera modules. We evaluated our system by recording hand gestures using six camera modules. Hand gestures are challenging to capture due to their relatively small scale and the occlusion of the fingers. Our PiCam system is a compact, markerless, and modular motion capture solution that allows for the recording of highly occluded movements, such as finger movements. The application of the PiCam system can be extended to movements of other body parts, full-body movement, and animal movement.
The system allows for integration with upcoming machine-learning techniques such as DeepLabCut, providing an alternative for commercial camera-based motion capture while eliminating the need for reflective markers and labelling.
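Once the cameras are calibrated and synchronised, multi-view triangulation of each tracked point reduces to a small linear problem (the direct linear transform, DLT): each view contributes two equations, and the 3D point is the null vector of the stacked system. A minimal two-view sketch follows; the projection matrices and point are toy values, not the PiCam calibration.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two calibrated views.
    P1, P2: (3, 4) projection matrices; x1, x2: (2,) image coordinates."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                        # null-space vector of A
    return X[:3] / X[3]               # de-homogenise

# Toy setup: two unit-focal cameras, 1 m baseline along x (hypothetical)
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.2, 0.1, 2.0, 1.0])           # ground-truth homogeneous point
x1 = (P1 @ point)[:2] / (P1 @ point)[2]          # projection into view 1
x2 = (P2 @ point)[:2] / (P2 @ point)[2]          # projection into view 2
X = triangulate(P1, P2, x1, x2)                  # recovers [0.2, 0.1, 2.0]
```

With more than two views, the same construction simply stacks two rows per camera, which is what makes adding a sixth module cheap once synchronisation is solved.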

