
Proceedings of the 2007 IEEE 10th International Conference on Rehabilitation Robotics, June 12-15, Noordwijk, The Netherlands

Analysis and Synthesis of Human and Machine Motion at UL FE

M. Munih, Member, IEEE, G. Kurillo, M. Veber, J. Perdan, J. Podobnik, U. Mali, J. Cinkelj, M. Mihelj, T. Koritnik, R. Kamnik, T. Bajd, Fellow, IEEE

Abstract—The paper gives insight into the main fields of analysis and synthesis of human motion pursued in recent years, as well as into work that is still under development. After a general introduction, half a page to one page of description is provided for each topic. The simple explanations omit theoretical details and instead use one or two pictures per topic to provide quickly accessible information, similar to a digest structure. The topics range from fingers, hands, and arms to the lower extremities, and from measurement systems to systems using VR and haptics. The conclusion offers a look into possible future activities.

I. INTRODUCTION

The Laboratory of Robotics and Biomedical Engineering at the Faculty of Electrical Engineering, University of Ljubljana (UL FE) has long-standing excellence in the field of man and machine movement analysis and of artificial and natural motor control. The group has wide experience in robotics research, robotic applications, rehabilitation, and clinical work. In the past, the founders of the laboratory played a pioneering role in the development of the surface Functional Electrical Stimulation technique for paraplegic patients, enabling them to stand up, stand, and even walk. Current research is predominantly oriented towards rehabilitation and mobile robotics and their applications. For example, an active supporting frame for training of standing and balancing was built and introduced into clinical practice. Robotic devices for training and evaluation of arm and finger movements are in the process of development. The group pioneered a technique for evaluating the functional status of the upper limb in patients using a haptic interface and a virtual labyrinth environment. An original robot construction was built for facilitating the standing-up manoeuvre of impaired subjects. An inertial sensory system for motion tracking and a novel mobile robot platform are under development.

The group has also been a partner in SENSATIONS, GENTLE/S (Robotic assistance in neuro and motor rehabilitation), and I-Match (A VR based system to allow matching of an optimum interface to a user of assistive technology), and is now a partner in Alladin (Natural language based decision support in neurorehabilitation). The laboratory is in possession of several modern robot manipulators of the brands Adept, Stäubli, ABB, Motoman, Epson, FCS and Phantom, some with low-level access to the controller (RTLinux), various force sensors, and four haptic interfaces. A contact-less motion analysis system Optotrak with a real-time option is available, together with force plates and various force sensors (AMTI, JR3, Schunk, HBM). The personal background, facilities, and existing equipment are suitable for combining recent achievements in robotics research and engineering with existing rehabilitation practice (haptic interface, impedance control, biomechanical modelling, advanced measuring techniques).

II. ASSESSMENT OF GRASPING FORCE

An original tracking system for the assessment and training of grip force control was developed [1,2]. The system consists of two measuring objects enabling assessment of the cylindrical power and lateral precision grip (Fig. 1). It is connected to a personal computer for visual feedback and data acquisition.

The task requires the patient to track the target signal on the screen by applying appropriate force to the grip-measuring device. The target signal is presented as a blue ring moving vertically in the center of the screen. The applied force is indicated by a red spot. When grip force is applied, the red spot moves upwards. The aim of the task is to continuously track the position of the blue ring by dynamically adapting the grip force applied to the measuring object. The complexity of the task is adjusted by selecting the shape of the target signal (e.g. ramp, sinusoidal, rectangular), setting the level of the target force, and changing the dynamic parameters (e.g. frequency, force rate).
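As an illustration of how the shape and difficulty parameters described above can be combined, the following sketch generates a target force profile for the tracking task. The parameter names and values are assumptions for illustration only; the actual implementation is not described in the paper.

```python
import numpy as np

def target_force(t, shape="ramp", level=10.0, frequency=0.2, force_rate=2.0):
    """Generate a target grip-force profile [N] over a time vector t [s].

    shape      -- 'ramp', 'sinusoidal' or 'rectangular' target signal
    level      -- target force level [N]
    frequency  -- oscillation frequency [Hz] for periodic shapes
    force_rate -- slope of the ramp [N/s]
    """
    if shape == "ramp":
        # Force rises linearly and saturates at the target level.
        return np.minimum(force_rate * t, level)
    if shape == "sinusoidal":
        # Force oscillates between 0 and the target level.
        return 0.5 * level * (1.0 + np.sin(2 * np.pi * frequency * t))
    if shape == "rectangular":
        # Force alternates between 0 and the target level.
        return level * (np.sin(2 * np.pi * frequency * t) > 0).astype(float)
    raise ValueError("unknown shape")

t = np.linspace(0.0, 20.0, 2001)          # 20 s trial sampled at 100 Hz
reference = target_force(t, shape="sinusoidal", level=15.0, frequency=0.1)
```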

Fig. 1. Grasping force measuring device enabling assessment of cylindrical and lateral grasp.

Manuscript received February 9, 2007. All authors are with the Faculty of Electrical Engineering, University of Ljubljana, Trzaska 25, 1000 Ljubljana, Slovenia (e-mail: author@robo.fe.uni-lj.si).


The results in healthy subjects showed significant differences in grip force control among different age groups. In a patient after Botulinum toxin treatment, the method revealed noticeable effects of the therapy on the patient's tracking performance. Training with the tracking system showed considerable improvements in grip force control in 8 out of 10 stroke patients. The proposed tracking method is intended to be used in connection with different rehabilitation therapies (e.g. physiotherapy, FES, drug treatment) to follow the influence of the therapy on the patient's muscular strength and grip force control.

III. MULTI-FINGERED GRASPING AND MANIPULATION IN VIRTUAL ENVIRONMENT

Interaction with objects in virtual reality can be enhanced by multi-fingered grasping and manipulation. Realistic simulation of grasping requires accurate modeling of the forces and torques exerted on the virtual object by the fingers in contact with its surface. We designed an isometric input device for multi-fingered grasping in a virtual environment (Fig. 2). The finger device was designed to simultaneously measure the forces applied by the thumb, index, and middle finger. A mathematical model of grasping adapted from the analysis of multi-fingered robot hands was applied to achieve multi-fingered interaction with virtual objects. We used the concept of visual haptic feedback, in which the user is presented with visual cues to acquire haptic information from the virtual environment. The virtual object responded dynamically to the forces and torques applied by the three fingers. The application of the finger device for multi-fingered interaction is demonstrated in four tasks aimed at the rehabilitation of the upper extremities of stroke patients. The tasks include opening a safe, filling and pouring water from a glass, training of muscle strength with an elastic torus, and a force tracking task (Fig. 3). The training tasks were designed to train the patient's grip force coordination and to increase muscle strength through repetitive exercises.
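A minimal sketch of the underlying idea, assuming point contacts with measured fingertip forces (the actual grasping model adapted from the robot-hand literature is more elaborate), is to sum the fingertip contact forces into a net force and torque acting on the virtual object:

```python
import numpy as np

def object_wrench(contact_points, contact_forces, object_center):
    """Net force and torque on a virtual object from fingertip contacts.

    contact_points -- (n, 3) contact positions on the object surface [m]
    contact_forces -- (n, 3) forces applied by the fingers [N]
    object_center  -- (3,) reference point (e.g. center of mass) [m]
    """
    contact_points = np.asarray(contact_points, dtype=float)
    contact_forces = np.asarray(contact_forces, dtype=float)
    net_force = contact_forces.sum(axis=0)
    # Torque of each contact force about the object center.
    lever_arms = contact_points - np.asarray(object_center, dtype=float)
    net_torque = np.cross(lever_arms, contact_forces).sum(axis=0)
    return net_force, net_torque

# Thumb, index and middle finger pressing on a virtual cylinder (illustrative values).
points = [[0.02, 0.0, 0.0], [-0.02, 0.01, 0.0], [-0.02, -0.01, 0.0]]
forces = [[-3.0, 0.0, 0.0], [3.0, 0.0, 1.0], [3.0, 0.0, -1.0]]
F, T = object_wrench(points, forces, object_center=[0.0, 0.0, 0.0])
```

The resulting wrench can then drive the dynamic response of the virtual object that is shown to the user.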

Fig. 3. Various tasks for training of multi-fingered grasping and manipulation in a virtual environment.

IV. ASSESSMENT OF GRASPING IN APPROACHING PHASE

Preshaping of the fingers according to the shape of the object is characteristic of the approaching phase. Three objects were selected for the experiments: a thin plate, a block, and a cylinder [3]. The objects were attached to the endpoint of a robotic manipulator by means of a magnetic contact. The task of the robot was to place the objects in different positions and orientations in the subject's workspace. The robot also randomly introduced perturbations of the object position or orientation.

Fig. 4. Experimental environment for assessment of finger movements during approaching of the hand to the object. The object is held by the robot and positioned in various positions and orientations within the workspace.

Fig. 2. Finger device for isometric input of fingertip forces for multi-fingered grasping in VR.

Five infrared markers were placed on the fingertips, together with an additional three markers attached to the dorsum of the hand (Fig. 4). The movements of the eight markers were assessed by six OPTOTRAK cameras. The preshaping of the fingers was evaluated by defining a pentagon connecting the five fingertips. The surface of the pentagon increases at the beginning of the approaching phase, reaches its maximum in the middle of the movement, and decreases afterwards. Interesting observations can be drawn from the angle between the pentagon normal and the


object normal. The time course of the angle has a saddle shape and can be divided into three phases: a fast turn of the wrist, a transport phase in which the angle remains almost constant, and the final preshaping of the fingers according to the shape of the object.
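A sketch of how the pentagon measures could be computed from the five fingertip marker positions is given below; the triangle-fan approximation for a slightly non-planar polygon is an assumption, not the authors' exact procedure.

```python
import numpy as np

def pentagon_area_and_normal(fingertips):
    """Area and unit normal of the polygon spanned by five fingertip markers.

    fingertips -- (5, 3) array of 3D marker positions (thumb to little finger).
    The polygon is generally non-planar, so the area is approximated by fanning
    triangles from the centroid; the normal is the direction of the summed
    triangle cross products.
    """
    p = np.asarray(fingertips, dtype=float)
    centroid = p.mean(axis=0)
    cross_sum = np.zeros(3)
    for i in range(len(p)):
        a = p[i] - centroid
        b = p[(i + 1) % len(p)] - centroid
        cross_sum += np.cross(a, b)
    area = 0.5 * np.linalg.norm(cross_sum)
    normal = cross_sum / np.linalg.norm(cross_sum)
    return area, normal

def angle_to_object_normal(pentagon_normal, object_normal):
    """Angle [rad] between the pentagon normal and the object surface normal."""
    n1 = np.asarray(pentagon_normal, float) / np.linalg.norm(pentagon_normal)
    n2 = np.asarray(object_normal, float) / np.linalg.norm(object_normal)
    return np.arccos(np.clip(np.dot(n1, n2), -1.0, 1.0))
```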

V. TRAINING OF HAND DEXTERITY IN VIRTUAL ENVIRONMENT

In Fig. 5, a system for studying the kinematics of the lower arm and hand while manipulating objects in a virtual environment (VE) is presented [4]. The VE is used to provide augmented visual feedback while the subject is asked to align a real object with a reference object displayed in the VE.

The poses of the hand segments and of the object are assessed by an optical tracking device. The description of the hand with rigid bodies enables online adaptation of the tasks, taught by one subject acting as a virtual trainer, to the inter-person variability of hand anthropometry. The adaptation assures that all subjects are able to reach the displayed postures.

The system is aimed at the training of patients with neuromuscular impairments. New tasks can be recorded for a group of patients with similar impairments with the help of a therapist guiding the motion of the virtual trainer (patient).

The method enables programming of tracking and step-response tasks that require gradual or abrupt changes of object pose. The root mean square error between the reference and actual position and orientation can be used to evaluate the progress of a patient with respect to previous trials. In addition, parameters of the response dynamics, such as the rise time and the time of completion, can be used to assess the success of the therapy.
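The evaluation measures mentioned above can be written down directly. A minimal sketch, assuming simple array layouts for the recorded data, computes the RMS position error and a 10-90% rise time:

```python
import numpy as np

def rms_position_error(actual, reference):
    """RMS Euclidean distance between actual and reference positions, both (n, 3)."""
    diff = np.asarray(actual, float) - np.asarray(reference, float)
    return np.sqrt(np.mean(np.sum(diff**2, axis=1)))

def rise_time(t, signal, low=0.1, high=0.9):
    """Time for a step response to go from 10% to 90% of its final value.

    Assumes a positive, roughly monotonic response sampled over time vector t.
    """
    s = np.asarray(signal, float)
    final = s[-1]
    t_low = t[np.argmax(s >= low * final)]
    t_high = t[np.argmax(s >= high * final)]
    return t_high - t_low
```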

VI. CALIBRATION OF INSTRUMENTED GLOVE

At the moment, a generally accepted method for the assessment of hand kinematics is not available. Instrumented gloves could manage the task if a reliable method for their calibration were at hand.

In Fig. 6, a method for the calibration of an instrumented glove is presented [5]. It is based on an optical tracking device and an inverse kinematic model of the human hand. It requires one reflective marker to be attached to each finger and three to the dorsal aspect of the hand in order to assess the angles in the finger joints. A further three markers are needed to calculate the angles in the thumb joints. Joint angles assessed through inverse kinematics and with the calibrated glove can be validated against the reference angles assessed while measuring the finger movements with multiple markers. In the fingers, an accuracy of ±7° can be achieved when the model-based method is used to calibrate the glove.
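A minimal sketch of how such a calibration could be framed is to fit a linear map from the raw glove sensor readings to the optically derived reference joint angles. The actual method uses an inverse kinematic hand model; the linear least-squares map below is only illustrative.

```python
import numpy as np

def calibrate_glove(sensor_readings, reference_angles):
    """Least-squares linear map from glove sensor values to joint angles.

    sensor_readings  -- (n_samples, n_sensors) raw glove outputs
    reference_angles -- (n_samples, n_joints) angles from the optical system
    Returns a matrix A and offset b so that angles ~= sensors @ A + b.
    """
    X = np.asarray(sensor_readings, dtype=float)
    Y = np.asarray(reference_angles, dtype=float)
    X1 = np.hstack([X, np.ones((X.shape[0], 1))])        # append bias column
    coeffs, *_ = np.linalg.lstsq(X1, Y, rcond=None)
    return coeffs[:-1], coeffs[-1]                        # A, b

def glove_angles(sensor_readings, A, b):
    """Apply the calibration to new glove readings."""
    return np.asarray(sensor_readings, dtype=float) @ A + b
```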

VII. FES SYSTEM FOR UPPER EXTREMITIES SENSORY-MOTOR ABILITY AUGMENTATION

Injury to the central nervous system can result in the loss of sensory and motor functions in the upper extremities. Because of this impairment, patients have trouble grasping and manipulating objects, or are incapable of doing so. Patients with spastic finger flexors in particular have difficulties with voluntary opening of the hand [6]. They are normally able to hold an object, but are incapable of grasping an object or of releasing an already grasped one. One possibility for restoring the lost sensory-motor ability of the hand is the use of functional electrical stimulation (FES) [7]. In addition, tracking tasks have proved to be a promising approach for training and assessment of hand function [8].

The aim of our research was to develop and evaluate an FES system for augmenting the sensory-motor abilities of the hand. With the system, the patient trains the finger flexors and finger extensors by accomplishing a force tracking task. During training, the patient has to track the target signal representing the desired force as closely as possible by adjusting his grip strength (Fig. 7). The system is designed to allow full voluntary control of hand opening and closing, while FES is added to facilitate the voluntary contributions of the patient. The FES is closed-loop controlled according to the difference between the desired and the actual force. The actual forces are acquired by a specially designed adjustable measurement setup instrumented with two multiaxis force sensors.


The personal computer serves as a platform for data acquisition, reference force generation, stimulation control, and for displaying the visual information about the target and actual force on a computer screen.
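The closed-loop principle described above, in which stimulation assists rather than replaces voluntary effort, might look roughly like the following sketch. The proportional law, gain, and saturation limits are assumptions for illustration and not the authors' published controller.

```python
def fes_pulse_width(desired_force, measured_force, gain=20.0,
                    pw_min=0.0, pw_max=300.0):
    """Map grip-force tracking error to a stimulation pulse width [us].

    The stimulator output grows with the error between the desired and the
    actually produced force, so FES only "tops up" the patient's own effort.
    gain is expressed in us per N of force error (illustrative value).
    """
    error = desired_force - measured_force            # [N]
    pulse_width = gain * error                        # simple proportional law
    return max(pw_min, min(pw_max, pulse_width))      # saturate to a safe range

# One control step: the patient produces 6 N while 10 N is requested.
pw = fes_pulse_width(desired_force=10.0, measured_force=6.0)
```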

Fig. 7. Patient during training with the FES system for upper extremities sensory-motor augmentation.

The system was evaluated in an experimental training study with two incomplete tetraplegic patients (DA, 28 years old, injury C5-C6, 4 years after injury; SA, 15 years old, injury C3-C4, 8 months after injury; both with partially preserved voluntary control over finger flexor and extensor muscles). In addition to the therapeutic treatment, the patients trained with the FES system approximately 45 minutes per day over a period of 4 weeks. After the training period, the results show that both patients strengthened their finger flexor and extensor muscles and that both substantially reduced the tracking error, which implies an improvement of grip force control.

VIII. GRIP AND LOAD FORCE COORDINATION

Holding, transporting, and manipulating objects are common activities in our daily life. These activities combine a number of motor commands of spinal and cortical origin and demand a high level of coordination of grip and load forces [9]. Load forces and moments are the forces and moments exerted on the object to move it or to keep it in a stable equilibrium. The load force tends to cause slippage and loss of contact with the held object. The grip force is limited to the normal force component exerted on the surface of the held object. Grasp-to-load force coordination and grip-to-load force coupling strategies and patterns develop during childhood until the adult configuration is reached at 14-15 years [10]. These strategies and patterns can be impaired due to injury of the central nervous system, hand injury,

or neural or neuromuscular diseases affecting hand function. Understanding the development and control of grip is fundamental for understanding and developing the techniques and technology for rehabilitation [11]. In our investigation of grip control we focused on the control of grip-to-load force coordination in the precision and power grasp.

Haptic interfaces are used for generating haptic virtual environments and load forces [12]. Haptic technology allows programming and generating external load forces, and defining the exact dynamics and behavior of the virtual environments [13]. Use of a graphical representation of the haptic virtual environment augments the haptic interaction. For measuring the grip forces, various grip-measuring handles have been developed. The grip-measuring handles were mounted on the haptic devices to measure the grip force during the haptic interaction.

During quasi-static (ramp task) and dynamic (sinusoidal task) high external load force conditions, the subject employs a strategy of grasp and load force coordination that is adjusted to the dynamics of the load force. As a consequence, this results in distinctive shapes of the grasp versus load force plots: a line in the case of quasi-static load force and a triangle in the case of dynamic load force (Fig. 8). A linear relation between load force FL and grasp force FG was experimentally confirmed for the ramp task. The results show an elevation of the grasp force in the dynamic (sinusoidal) task, and a distinctive triangular shape was observed in the grasp versus load force representation. A decomposition of the grasp force into linear, initial, and dynamic parts is proposed. The elevation of the grasp force is needed in dynamic conditions to assure a stable grasp [14].
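For the ramp task, the reported linear relation between load and grasp force can be recovered with an ordinary least-squares fit. A short sketch on synthetic data (the array names and numbers are illustrative):

```python
import numpy as np

def fit_grasp_vs_load(load_force, grasp_force):
    """Least-squares fit of the linear relation FG = k * FL + FG0.

    Returns the slope k (grasp/load ratio) and the offset FG0, which can be
    read as the 'initial' grasp force component; the residual grasp force in
    dynamic trials would then correspond to the 'dynamic' component.
    """
    FL = np.asarray(load_force, dtype=float)
    FG = np.asarray(grasp_force, dtype=float)
    k, FG0 = np.polyfit(FL, FG, deg=1)
    return k, FG0

# Synthetic ramp-task data: grasp force grows roughly linearly with load.
FL = np.linspace(0.0, 100.0, 50)
FG = 1.4 * FL + 5.0 + np.random.normal(0.0, 1.0, FL.size)
k, FG0 = fit_grasp_vs_load(FL, FG)
```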

In another study, experiments were conducted under haptic virtual environment (HVE) and graphical virtual environment (GVE) experimental conditions. In the HVE condition, the haptic interface produced a programmed, unpredicted increase in load force. In the GVE condition, a falling sphere shown on the graphical display acted as a visual cue and triggered the increase in grip force. The grip force responses had a distinctive skewed bell shape for both the HVE and GVE conditions. The differences lie in the latency, amplitude, and duration of the grip force response in the dynamic loading phase. Though different neural control mechanisms are triggered in the HVE and GVE experiments, both result in a muscular response and increased muscular work. The current investigation clearly shows the differences in the responses that can be expected in healthy adult subjects in haptic versus visual virtual environments. Since haptic plus visual or visual-only environments are common in advanced rehabilitation technology, this investigation shows the potential of the two kinds of virtual environments.
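The response measures compared between the HVE and GVE trials (latency, amplitude, and duration of the grip force response) could be extracted from a force trace along the following lines; the threshold-based onset detection is an assumption, not the authors' exact analysis.

```python
import numpy as np

def grip_response_parameters(t, grip_force, load_onset_time, threshold=0.5):
    """Latency, peak amplitude and duration of a grip force response.

    t               -- time vector [s]
    grip_force      -- measured grip force [N]
    load_onset_time -- time of the load perturbation (HVE) or visual cue (GVE) [s]
    threshold       -- force rise [N] above baseline that counts as a response
    """
    t = np.asarray(t, float)
    f = np.asarray(grip_force, float)
    baseline = np.mean(f[t < load_onset_time])           # pre-perturbation level
    above = (f > baseline + threshold) & (t >= load_onset_time)
    if not above.any():
        return None                                       # no response detected
    onset, offset = t[above][0], t[above][-1]
    latency = onset - load_onset_time
    amplitude = f[above].max() - baseline
    duration = offset - onset
    return latency, amplitude, duration
```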


Fig. 8. Grasp force versus load force plot. A straight line is fitted over the ramp task data (dark line) and a triangle over the sinusoidal task data (light gray).

A combination of a haptic interface with force/torque transducers for measuring the grasp force shows great potential for studying grasp in humans. The general framework allows the use of diverse external load force time series, virtual environments, and human-robot interfaces (different handles for different types of grasps).

IX. HIFE

Because very few haptic interfaces offer sufficient output forces and an appropriate workspace for the fingers, and because the available ones are expensive, a low-cost haptic interface with two active degrees of freedom and a tendon-driven transmission system (Fig. 9) was developed (HIFE - Haptic Interface for Finger Exercise) [15, 16]. The segment lengths of the device were optimized to the envelope of the finger workspace, and the device can provide forces up to 10 N, suitable for finger exercise. The low transmission ratio also makes the mechanism backdrivable. Good backdrivability is essential in the design to sense forces inherently without resorting to destabilizing force-torque sensor strategies. The haptic device is controlled by an application running under the Windows operating system on a personal computer through a custom-designed controller unit. The kinematic and dynamic model equations with a force characteristic analysis, and the virtual reality front end with a complete application for therapists, have been developed. The application database enables patient selection from a list of recorded patients with patient data, setting of the experiment level, and retrieval of the patient workspace recorded last. A number of different experiment types and levels of complexity of the VR tasks for each type were implemented. Moreover, the safety of the system was one of the main concerns and was taken into consideration at every stage of the design. The performance, accuracy, and safety of the device were found to be very good, which makes the device suitable for rehabilitation purposes.
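As an illustration of the kind of kinematic and force analysis mentioned above, a planar two-link sketch maps joint angles to the fingertip position and a desired fingertip force to joint torques via the Jacobian transpose. The link lengths and angles are placeholders, not the actual HIFE parameters.

```python
import numpy as np

def forward_kinematics(q1, q2, l1=0.08, l2=0.08):
    """Fingertip position of a planar 2-DOF linkage with joint angles q1, q2 [rad]."""
    x = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
    y = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    return np.array([x, y])

def joint_torques(q1, q2, fingertip_force, l1=0.08, l2=0.08):
    """Joint torques tau = J^T f needed to exert a force f at the fingertip."""
    J = np.array([
        [-l1 * np.sin(q1) - l2 * np.sin(q1 + q2), -l2 * np.sin(q1 + q2)],
        [ l1 * np.cos(q1) + l2 * np.cos(q1 + q2),  l2 * np.cos(q1 + q2)],
    ])
    return J.T @ np.asarray(fingertip_force, dtype=float)

# 5 N pushing against the finger along x at a mid-workspace configuration.
tau = joint_torques(np.deg2rad(30), np.deg2rad(60), [5.0, 0.0])
```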

The system was evaluated in a group of nine stroke patients during a one-month period of therapy. The selection of exercises was found to be suitable for finger exercise and for assessing functionality in patients with neuromuscular diseases. Our observation, based on quantitative data, is that the progress during therapy of the affected hand in patients is better than that of the non-affected side. As expected, the mean values of the kinematic and static parameters in patients are lower than those of healthy volunteers. We also verified that the progress coefficients and the mean values for either hand of healthy volunteers are very similar. The results of the present study (Fig. 10) were compared to the M-FIM scale and correlate very well. The strong correlation observed with the existing clinical scale makes the application suitable for objective assessment of post-stroke disability. In summary, the haptic device, along with the virtual environment, performed well, while the selected experiments proved to be suitable for the population with neuromuscular diseases.

Fig. 9. Haptic Interface for Finger Exercise with the Virtual Reality front-end application.

Fig. 10. Measurement results: coefficients of regression lines for finger mobility (mean velocities) in the "Jo Ball" experiment, for both groups and both hands.


X. I-MATCH

I-MATCH is a partially EU-funded project which concentrates on the development of techniques to optimize the selection of an interface for a user needing assistive technology, by measuring both the functional characteristics of the device (e.g. joystick, switch, mouse, etc.) and the skills of the disabled user.

The I-MATCH Upper Limb Evaluation Tool is a set of computer-based tests using a haptic interface, assistive technology devices, or standard PC equipment as input devices. These tests are used to provide an objective measurement of the performance of the subject's upper limbs. Each test is conducted in a virtual reality setting and enables online recording of a set of parameters that are used to assess the user's upper limb ability [17]. The tests include: Peg-In-Hole, a test inspired by the Nine Hole Peg Test (NHPT); circular tracking, in which a circle is tracked clockwise and counter-clockwise; linear tracking, in which the user is required to follow a line to its end and back, repeated in six directions; target tracking, in which the patient's task is to move toward the target from the current position in the most natural way and then to hit it; and the labyrinth, in which a complex and movement-demanding virtual environment, representing a labyrinth aligned with the person's frontal plane, is presented to the patient (Fig. 11). The tests can capture finger dexterity and the forearm and shoulder movement abilities. A maximal force task measures the capacity to exert forces in six evenly distributed directions. The I-MATCH Haptic Evaluation Tool generates reports which include a graphical representation of the measured parameters for each test subject. This provides the technology providers with immediate visual information about the ability of the test subject. A cut-down version of the Haptic Evaluation Tool, which does not have the ability to generate reports or log patient data, is freely available on the I-Match website.

The I-MATCH Wheelchair Simulation Software is a software simulation of a powered wheelchair. Accurate kinematic and dynamic models have been developed to provide a realistic simulation of the movement of the powered wheelchair. The powered wheelchair simulation takes place in a virtual environment which represents a familiar home setting (Fig. 12). Modules for supporting standard PC equipment (keyboard, mouse, standard game joystick) as input devices for the simulations have been developed. The software simulation may be parameterised to realistically simulate a range of powered wheelchairs, including variants with front or rear drive and different masses and dimensions. In this way, it is possible to simulate an array of AT equipment and aid users in the training and selection process, as well as serve as many users as possible. The feeding robot Handy and the assistive Manus ARM robot are implemented in a similar way.
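A much-simplified sketch of the kind of kinematic model involved, assuming planar unicycle kinematics driven by joystick commands (the project's actual models also include dynamics and front/rear drive variants):

```python
import math

def simulate_wheelchair(v, omega, dt=0.01, steps=500, x=0.0, y=0.0, theta=0.0):
    """Integrate planar unicycle kinematics of a powered wheelchair.

    v     -- forward speed command [m/s]
    omega -- turning rate command [rad/s]
    Returns the final pose (x, y, theta) after steps * dt seconds.
    """
    for _ in range(steps):
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
    return x, y, theta

# Gentle left turn commanded from a joystick for 5 seconds.
pose = simulate_wheelchair(v=0.8, omega=0.3)
```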


Fig. 11. Trace through the labyrinth as measured in a person with Parkinson's disease.

Fig. 12. I-Match virtual reality house with a wheelchair and the attached Manus ARM. The top-view window improves space awareness.

XI. ALLADIN

Stroke is a leading cause of long-term disability in the Western world. Effective rehabilitation, with the objective of optimizing the recovery of motor performance, minimizing long-term disability, and enabling reintegration and participation in the activities of daily life, is of critical importance [18]. A reliable prediction system that can rapidly discover markers for recovery is needed for a better outcome of the therapy. Force/torque (F/T) measurements were proposed as a means of diagnosis and a patient progress tracking tool in [19]. Some attempts to identify markers of functional recovery were made in [20].

The Alladin project includes developing a mechatronic platform for assessing post-stroke functional recovery. The device is shown in Fig. 13. Eight force/torque sensors measure the performance of activities of daily living (such as drinking a glass of water, taking a spoon, etc.). Both healthy subjects and patients have been measured. All measurements are isometric (the patient cannot move once he is in the ADD);


thus it is actually motion imagination and motion initiation that are measured.

From the measured trajectories, a set of parameters has been defined. One parameter is, for example, the delay between different sensors. A large delay is expected to indicate a lack of synchronization between various body parts (such as the thumb and index finger). The parameters are being analyzed with data mining algorithms. This ongoing work is expected to detect the differences between healthy persons and post-stroke patients. The progress of the patient during the therapy should be detected as a smaller "distance from normality".
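One way the inter-sensor delay parameter could be computed from two recorded force trajectories is via cross-correlation. A sketch, under the assumption of equally sampled signals:

```python
import numpy as np

def sensor_delay(signal_a, signal_b, sample_rate):
    """Estimate the delay [s] of signal_b relative to signal_a.

    Both signals are mean-removed and cross-correlated; the lag with the
    highest correlation is taken as the delay (positive = b lags a).
    """
    a = np.asarray(signal_a, float) - np.mean(signal_a)
    b = np.asarray(signal_b, float) - np.mean(signal_b)
    corr = np.correlate(b, a, mode="full")
    lag = np.argmax(corr) - (len(a) - 1)
    return lag / sample_rate

# Example: index-finger force lags thumb force by 0.05 s at 100 Hz sampling.
t = np.arange(0, 2, 0.01)
thumb = np.sin(2 * np.pi * 1.0 * t)
index = np.sin(2 * np.pi * 1.0 * (t - 0.05))
delay = sensor_delay(thumb, index, sample_rate=100.0)
```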


Fig. 14. The arm is modeled as a 7-degree-of-freedom mechanism. The wrist pose as well as the upper arm acceleration and angular rate are measured.

Fig. 13. Mechatronic platform for measuring the performance of activities of daily living of post-stroke patients.

XII. ARM INVERSE KINEMATICS

A technique for computation of the inverse kinematic model (Fig. 14) of the human arm is proposed [21]. The approach is based on measurements of the hand pose as well as the acceleration and angular rate of the upper arm segment. The shoulder position is fixed in space. A quaternion description of orientation is used to avoid the singularities of representations with Euler angles. A Kalman filter is designed to integrate the sensory data from three different types of sensors. The algorithm (Fig. 15) enables estimation of the human arm posture (three shoulder, one elbow, and three wrist joint angles), which can be used in trajectory planning for rehabilitation robots, in the evaluation of motion of patients with movement disorders, and in the generation of virtual reality environments.
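As a small illustration of one step of such an arm model (not the authors' full Kalman-filter algorithm): with the shoulder fixed and the segment lengths known, the elbow flexion angle follows directly from the shoulder-to-wrist distance by the law of cosines. The segment lengths below are placeholders.

```python
import numpy as np

def elbow_angle(shoulder, wrist, upper_arm_len=0.30, forearm_len=0.27):
    """Elbow flexion angle [rad] from shoulder and wrist positions.

    0 rad corresponds to a fully extended arm. Segment lengths are
    subject-specific and only illustrative here.
    """
    d = np.linalg.norm(np.asarray(wrist, float) - np.asarray(shoulder, float))
    # Law of cosines in the shoulder-elbow-wrist triangle.
    cos_inner = (upper_arm_len**2 + forearm_len**2 - d**2) / (
        2.0 * upper_arm_len * forearm_len)
    inner = np.arccos(np.clip(cos_inner, -1.0, 1.0))  # angle at the elbow joint
    return np.pi - inner                              # flexion measured from full extension

# Wrist 45 cm in front of the shoulder -> roughly 75 degrees of flexion.
angle = np.degrees(elbow_angle([0, 0, 0], [0.45, 0.0, 0.0]))
```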

Fig. 15. Estimation process of the arm attitude quaternion Qs. Inputs are the wrist position W, the static component of the acceleration measurements, and the angular rate of the upper arm.

XIII. VIRTUAL MIRROR

Virtual reality is a powerful tool in the rehabilitation and training of the lower extremities. The virtual mirror [22] is a large screen in front of which the subject performs lower-extremity movements (Fig. 16). In the virtual mirror the subject can see two figures in a 3D virtual environment, from the desired viewing angle. The solid figure represents the training subject, and its movements correspond to those of the subject in real time. The transparent figure represents the virtual instructor. Both figures are superimposed on each other. The movements of the instructor are preprogrammed and are obtained through learning trials with a healthy subject. The task of the training subject is to follow the movements of the virtual instructor as accurately as possible, so that both figures remain closely overlaid throughout the duration of the lower-extremity training.


Fig. 16. Virtual mirror: a large screen showing movements of the subject in real time.

A simplified kinematic model of the human body was developed to visualize the subject's movements in the virtual mirror. We used the vector parameters method for the kinematics computation. The model comprised 8 rigid segments (head-arms-torso (HAT), pelvis, thighs, shanks, and feet), which were connected by spherical and hinge joints, featuring a total of 19 degrees of freedom. In order to obtain the values of the joint variables, 11 active markers were placed on the skin over anatomical landmarks. We aimed to keep the number of markers as low as possible in order to allow a quick setup procedure. The marker positions were measured using the OPTOTRAK system (Northern Digital, Inc.) and sampled at a 70 Hz rate. We used VRML 2.0 and the Matlab computing environment to visualize the movements of the figure at a 35 Hz refresh rate and without noticeable lag, ensuring a convincing real-time performance of the virtual environment.
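A sketch of the kind of joint-angle computation involved, e.g. the knee flexion angle from hip, knee, and ankle markers (the marker names and coordinates are illustrative, not the actual marker set):

```python
import numpy as np

def knee_flexion(hip, knee, ankle):
    """Knee flexion angle [rad] from three marker positions (0 = straight leg)."""
    thigh = np.asarray(hip, float) - np.asarray(knee, float)
    shank = np.asarray(ankle, float) - np.asarray(knee, float)
    cos_a = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    return np.pi - np.arccos(np.clip(cos_a, -1.0, 1.0))

# One frame of marker data (positions in meters), e.g. sampled at 70 Hz.
angle = np.degrees(knee_flexion(hip=[0.0, 0.0, 0.9],
                                knee=[0.05, 0.0, 0.5],
                                ankle=[0.0, 0.0, 0.1]))
```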

A preliminary investigation was conducted by performing stepping-in-place movements in front of the virtual mirror. The virtual instructor was programmed with stepping tasks featuring different hip angles and cadences. The 10 healthy subjects included in the study exhibited a quick adaptation to the movements of the virtual instructor, suggesting the applicability of the virtual mirror as a feasible modality of lower-extremity training. For the virtual mirror to be introduced into a clinical environment, we envisage that motion-tracking techniques using computer vision and accelerometers should be exploited in order to allow a quick and simple setup procedure and measurements.

XIV. ROBOT ASSISTIVE DEVICE FOR STANDING-UP MOTION AUGMENTATION

Rising from a chair is a common, yet demanding, activity of daily living. Impaired persons and the elderly often have difficulty rising to a standing position. A robot assistive device has been developed for standing-up training and performance evaluation [23]. The device is designed as a 3-DOF mechanism driven by an electro-hydraulic servo system with a standard bike seat mounted at the end-effector (Fig. 17). During robot-assisted standing-up the disabled person is supported under the buttocks, while the seat moves in the subject's sagittal plane. The robot configuration allows the subject to actively participate in rising, even using a hand support. In addition, the sensory system implemented in the robot device provides information about the kinematics of the human body motion and the supportive forces, making possible the assessment and evaluation of the human motion performance and of the therapeutic benefits of exercising.

Fig. 17. Standing-up robot assistive device: 1) seat orientation bilateral servo slave cylinder, 2) robot end-effector, 3) force sensor, 4) seat, 5) servo valves, 6) pressure sensors, 7) translational DOF hydraulic actuator, 8) rotational DOF hydraulic actuator, 9) seat orientation bilateral servo master cylinder.

The robot controller allows different modes of operation. The device can operate either in a position control mode that provides tracking of the desired motion trajectory, or in a force control mode that assures application of the desired supportive force to the subject. As an alternative to position control, where high

man/machine interaction forces can occur, a control approach integrating human voluntary activity into the robot control scheme was developed [24]. In this regime, the robot operates in a force control mode, while the force reference is determined according to the rising subject's activity. In this way, the artificial robot controller is integrated into the control actions of the intact neuromuscular system of the subject. Hand and foot support forces are used to characterize the subject's volition and are thus used as feedback to the controller. The basic idea behind the calculation of the robot supportive force is to quantify the deficit in the force and moment equilibrium of the human trunk. We named the approach "Patient-Driven Robot-Assisted Motion Augmentation" (PDRAMA).
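A heavily simplified, vertical-only sketch of the equilibrium-deficit idea follows. The published PDRAMA controller quantifies the full force and moment deficit of the trunk; the function name, gain, and numbers here are assumptions.

```python
def robot_support_force(body_weight, foot_force, hand_force, gain=1.0):
    """Vertical supportive force [N] the robot should provide.

    The deficit between the subject's weight and the vertical forces already
    carried by the feet and hands is what the robot is asked to make up.
    A gain below 1 shifts more of the load back to the patient.
    """
    deficit = body_weight - foot_force - hand_force   # unsupported weight [N]
    return gain * max(0.0, deficit)                   # never pull the subject down

# 80 kg subject; the feet carry 500 N and the hands 100 N while rising.
support = robot_support_force(body_weight=80 * 9.81, foot_force=500.0, hand_force=100.0)
```

Lowering the gain over the course of therapy corresponds to the shift from "robot-driven" to "patient-driven" training described below.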

In this way, a special approach in rehabilitation robotics has been developed: a "patient-driven" control of robot-assisted training. Moreover, the PDRAMA control algorithm enables the alteration of the body-weight-bearing portions between the robot and the subject. Thus, the standing-up


training regime can be varied, depending on the subject's etiology, from more "robot-driven" at the beginning of rehabilitation to increasingly "patient-driven" in accordance with the patient's progress.

XV. CONCLUSION

Future work is planned to cover the existing areas in more depth and to conclude the current projects. Following the interest and the current state of the art in the field, we plan to modify and port the presented topics to new fields of medicine. It is further expected that developments in the computer industry, in governmental fields, in medicine, and in industrial robotics will enable the utilization of existing and new technologies and applications in the target field.

ACKNOWLEDGMENT

The work presented here was partly supported by the EC, FP5: GENTLE/S (Robotic assistance in neuro and motor rehabilitation) and I-Match (A VR based system to allow matching of an optimum interface to a user of assistive technology); and FP6: ALLADIN (Natural Language Based Decision Support in Neuro-rehabilitation). Credit is given to all the respective partners within these consortia contributing in any way to the common work mentioned here and to other parts of these projects. Other studies were in part or completely supported by the Ministry of Higher Education, Science and Technology, Republic of Slovenia.

REFERENCES

[1] G. Kurillo, M. Gregoric, N. Goljar, and T. Bajd, "Grip force tracking system for assessment and rehabilitation of hand function," Technology and Health Care, vol. 13, pp. 137-149, 2005.
[2] G. Kurillo, A. Zupan, and T. Bajd, "Force tracking system for assessment of grip force control in patients with neuromuscular diseases," Clin. Biomech., vol. 19, pp. 1014-1021, 2004.
[3] T. Supuk, T. Kodek, and T. Bajd, "Estimation of hand preshaping during human grasping," Medical Engineering & Physics, vol. 27, pp. 790-797, 2005.
[4] M. Veber, G. Kurillo, T. Bajd, and M. Munih, "Assessment and training of hand dexterity in virtual environment," Journal Europeen des Systemes Automatises, to be published.
[5] M. Veber, T. Bajd, and M. Munih, "Assessing joint angles in human hand via optical tracking device and calibration of instrumented glove," Meccanica, to be published.
[6] A. E. Hines, P. E. Crago, and C. Billian, "Hand opening by electrical stimulation in patients with spastic hemiplegia," IEEE Trans. Biomed. Eng., vol. 3, pp. 193-205, June 1995.
[7] M. R. Popovic, A. Curt, T. Keller, and V. Dietz, "Functional electrical stimulation for grasping and walking: indications and limitations," Spinal Cord, vol. 39, pp. 403-412, August 2001.
[8] G. Kurillo, M. Gregoric, N. Goljar, and T. Bajd, "Grip force tracking system for assessment and rehabilitation of hand function," Technology and Health Care, vol. 13, pp. 137-149, July 2005.
[9] M. N. McDonnell, S. L. Hillier, M. C. Ridding, and T. S. Miles, "Impairments in precision grip correlate with functional measures in adult hemiplegia," Clin. Neurophysiol., vol. 117, pp. 1474-1480, 2006.
[10] A. C. Eliasson, H. Forssberg, K. Ikuta, I. Apel, G. Westling, and R. Johansson, "Development of human precision grip. V. Anticipatory and triggered grip actions during sudden loading," Exp. Brain Res., vol. 106, pp. 425-433, 1995.
[11] G. Kurillo, A. Zupan, and T. Bajd, "Force tracking system for the assessment of grip force control in patients with neuromuscular diseases," Clin. Biomech., vol. 19, pp. 1014-1021, 2004.
[12] J. Podobnik and M. Munih, "Improved haptic interaction control with force filter compensator," in IEEE 9th International Conference on Rehabilitation Robotics, Chicago, USA, 2005, pp. 160-163.
[13] U. Mali, N. Goljar, and M. Munih, "Application of haptic interface for finger exercise," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 14, pp. 352-360, 2006.
[14] J. Podobnik and M. Munih, "Robot-assisted evaluation of coordination between grasp and load forces in a power grasp in humans," Adv. Robot., vol. 20, pp. 933-951, 2006.
[15] U. Mali and M. Munih, "HIFE - haptic interface for finger exercise," IEEE/ASME Trans. Mechatron., vol. 11, no. 1, pp. 93-102, Feb. 2006.
[16] U. Mali, N. Goljar, and M. Munih, "Application of haptic interface for finger exercise," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 14, pp. 352-360, 2006.
[17] A. Bardorfer, M. Munih, A. Zupan, and A. Primozic, "Upper limb motion analysis using haptic interface," IEEE/ASME Trans. Mechatron., vol. 6, pp. 253-260, 2001.
[18] C. Patten, J. Lexell, and E. B. Brown, "Weakness and strength training in persons with poststroke hemiplegia: rationale, method, and efficacy," Journal of Rehabilitation Research & Development, vol. 41, pp. 293-312, May 2004.
[19] J. P. Dewald and R. F. Beer, "Abnormal joint torque patterns in the paretic upper limb of subjects with hemiparesis," Muscle & Nerve, vol. 24, pp. 273-283, Feb. 2001.
[20] T. K. Koo, A. F. Mak, L. Hung, and J. P. Dewald, "Joint position dependence of weakness during maximum isometric voluntary contractions in subjects with hemiparesis," Archives of Physical Medicine and Rehabilitation, vol. 84, pp. 1380-1386, Sep. 2003.
[21] M. Mihelj, "Inverse kinematics of human arm based on multisensor data integration," J. Intell. Robot. Syst., vol. 47, pp. 139-153, 2006.
[22] T. Koritnik, T. Bajd, and M. Munih, "Virtual environment for lower extremities training," Gait and Posture, submitted for publication.
[23] R. Kamnik and T. Bajd, "Standing-up robot: an assistive rehabilitative device for training and assessment," J. Med. Eng. Technol., vol. 28, no. 2, March/April 2004.
[24] R. Kamnik and T. Bajd, "Human voluntary activity integration in the control of a standing-up rehabilitation robot: a simulation study," Med. Eng. Phys., in press.
