
Low-Cost, Intention-Detecting Robot to Assist the Movement of an Impaired Upper Limb


Academic year: 2023


I would also like to thank all the laboratory members of the Robotics and Autonomous Systems Laboratory (RASL) for their assistance and unending kindness. I would especially like to thank Phil Davis for taking the time to teach an inexperienced machinist the ropes.

Background

Approximately two million people worldwide live with a spinal cord injury (SCI), of whom nearly 250,000 live in the United States [6]. Due in part to the physical barriers to basic mobility experienced by those with SCIs, the global unemployment rate of adults with spinal cord injuries is over 60% [8].

Robotic Assistive Devices

Robot-aided Rehabilitation

None of the systems described, however, attempts to bridge the gap between robotic rehabilitation and robotic assistance in the outside world, and each of the designs presented so far is limited in how readily it can be adapted into a real assistive robotic system.

Human Intention Detection

Eye-Gaze Implementation

In their review of eye-tracking technology, Morimoto and Mimica discuss the appeal of remote eye-gaze trackers (REGTs), noting that REGTs are easy to use and quick to set up [65]. Equipping an assistive robotic device with remote eye tracking could effectively enable the system to track the user's gaze and thus detect human intent.

Project Overview

Ideally, this prototype will be adapted to attach to a support above the user and assist in human movement from above, allowing the table to be fully cluttered without interfering with the use of the robotic aid. Potentiometers were used to measure the position of the end effector with sufficient accuracy, eliminating the need for expensive encoders.

Figure 1-1: Conceptual Model of Entire System

Thesis Outline

The second responsibility of the wrist interface is to sense the net force between the system and the user. When the controller is in position mode, the desired position is calculated by the system from eye-tracker data.

Mechanical Design

Mechanical Design Overview

The linear portion of the system rotates around the fixed rotating portion of the system, which is located in the frame on the left of the photo. The red dot indicates the axis around which the linear subsystem rotates, and the green ring represents the resulting workspace of the device.

Design Requirements

It was also desired that the system be able to move the end effector to its side. With a desired acceleration of 2 m/s² and an estimated mass of 5 kg, the system should be able to apply 10 N (about 2.2 lbf) of force in all directions.
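This sizing follows directly from Newton's second law; a trivial sketch of the arithmetic, using only the mass and acceleration estimates stated above:

```python
# Sanity check of the actuation-force requirement: F = m * a,
# using the estimates stated above (5 kg moving mass, 2 m/s^2 acceleration).
MASS_KG = 5.0
ACCEL_M_S2 = 2.0

def required_force_newtons(mass_kg: float, accel_m_s2: float) -> float:
    """Newton's second law: F = m * a."""
    return mass_kg * accel_m_s2

force_n = required_force_newtons(MASS_KG, ACCEL_M_S2)  # 10.0 N
```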

Linear Actuation

  • Motor Selection
  • Spring Selection
  • Potentiometer Integration

The spring selected for the system also had to meet the requirements listed earlier in the chapter. One end of the spring is attached to part C and the other end to the pivoting part B.

Figure 2-3: Spring-Holder Design

Rotational Actuation

  • Motor/Gearbox Selection
  • Torque Transmission

The motor is used with a 64:1 P80 planetary gearbox, which raises the maximum torque to 115 N·m (limited by the gearbox rating) and gives a maximum angular velocity of just over 8 rad/s. To verify the strength of the connection between the bushings and the aluminum plate, the stresses in both the screws and the plate itself must be checked.

Figure 2-7: Design of Rotational Motor Torque Transmission
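As a rough sanity check of the quoted output speed, assuming a nominal CIM free speed of about 5,310 rpm (a datasheet figure, not stated in the text), the 64:1 reduction yields just over 8 rad/s at the output:

```python
import math

# Rough output-speed check for the CIM motor through the 64:1 P80 gearbox.
# The free speed below is a nominal CIM datasheet value (an assumption).
FREE_SPEED_RPM = 5310.0
GEAR_RATIO = 64.0

def output_speed_rad_s(free_speed_rpm: float, ratio: float) -> float:
    """Convert motor free speed through the reduction to output rad/s."""
    output_rpm = free_speed_rpm / ratio
    return output_rpm * 2.0 * math.pi / 60.0

w_out = output_speed_rad_s(FREE_SPEED_RPM, GEAR_RATIO)  # just over 8 rad/s
```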

Wrist Interface Design

  • Transmitting Force
  • Recording Forces
  • Design and Validation

The upper ANSYS figures show the stresses due to a horizontal force acting on the center of the design towards the upper right side. Testing of the final design with real loads confirmed the simulations, with the FSRs only registering forces in the direction of the applied force.

Figure 2-8: Complete Design of Wrist-Interface

Eye-tracker Mount

Mechanical Safety Considerations

Bill of Materials

This chapter then discusses the specifications of the electronic components of the system in more detail. The graph below shows the trajectory of the end effector during the entire system step response.

Electronics and Hardware Architecture

Overall Schematic

Brushed DC Motor for Linear Subsystem

Brushed DC Motor/Gearbox for Rotational Subsystem

The photo above and to the right shows the BaneBots P80 planetary gearbox, with a 64:1 ratio, to which the FIRST CIM motor is mated. Even at an efficiency of 50%, this combination is able to meet the requirements set out in Chapter 2.

RoboClaw Brushed DC Motor Controller

The RoboClaw can also accept analog, serial, or RC control signals, making it a versatile controller that is easily driven by an Arduino, the microcontroller used in this system.

Arduino Uno Microcontroller

XBee Wireless Module

Supply-voltage and standard baud-rate compatibility, together with a 45 mA operating current, make the XBee S2 compatible with the Arduino Uno, while a 250,000 bps data rate and a 90 m range show that the XBee S2 meets all system requirements.

Potentiometer for Linear Subsystem

Potentiometer for Rotational Subsystem

Force-Sensing Resistors (FSRs)

Batteries

Eye-Tracker

Bill of Materials

To evaluate the performance of the position-based controller, the system responses to various position inputs were recorded. Again, the system has a smaller position error for the front coordinates than the rear coordinates, both in eye-gaze accuracy and in the accuracy of the end-effector's motion. Additionally, the eye tracker must be able to track eye gaze data when the user looks beyond the width of the eye tracker.

Software Architecture

Overall Schematic

The diagram below shows the overall structure of the system software, according to the color scheme of the diagram in Chapter 3.

Xbee to Arduino Wireless Communication

In this way there is no true handshaking between the Arduino and the stationary XBee module; instead, the XBee is wired to act as a wireless extension of the Arduino's virtual serial port. During startup, the Arduino issues a command to set the baud rate of its virtual serial port to 57,600 bits per second, the fastest speed allowed for XBee communication. When such a request is received, the Arduino sends a 19-byte request command through its virtual port, which is broadcast wirelessly through the stationary XBee.

Arduino to MATLAB Communication

At startup, MATLAB is set to open a serial port with the Arduino at a baud rate of . The Arduino code receives these sets of messages, determines the message type based on the first byte input, and responds accordingly. The figure on the left shows the results when the Arduino is programmed to respond to a message of [1,0,0] with a single analog value.

Figure 4-2: Times for Requesting One (Left) and Six (Right) Arduino ADC Data Values using an Arduino Library
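The dispatch-on-first-byte scheme can be sketched as follows; the message codes and reply strings here are illustrative placeholders, not the thesis's actual protocol constants:

```python
def handle_message(msg: bytes) -> str:
    """Dispatch an incoming 3-byte message on its first byte.

    The codes below are hypothetical stand-ins for the real protocol:
    0 = initialize, 1 = request analog data, 2 = drive motors.
    """
    if len(msg) != 3:
        return "error: bad length"
    kind = msg[0]
    if kind == 0:
        return "init"
    elif kind == 1:
        return "send analog data"
    elif kind == 2:
        return "drive motors"
    return "error: unknown type"
```

A message of [1,0,0], as in the example above, would then be answered with analog data.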

MATLAB to C# Application Communication

Three message types are used: one to initialize the Arduino, one to request analog data from the Arduino, and one to command the Arduino to send the desired pulses to the RoboClaw motor controller. When MATLAB sends a request, it waits until it receives a response or until 20 milliseconds pass without one, in which case it resends the request. The final configuration allows MATLAB to receive potentiometer data at more than 50 Hz, receive FSR data at more than 10 Hz, and send pulse commands at more than 50 Hz.
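The request/retry behavior described above can be sketched as a small polling loop; the transport callbacks and constants here are placeholders, since the real code runs inside MATLAB's serial interface:

```python
import time

def request_with_retry(send, receive, timeout_s=0.020, max_tries=5):
    """Send a request and wait up to timeout_s for a reply; resend on timeout.

    `send` and `receive` stand in for the serial write/read calls;
    `receive` should return None until a reply is available.
    """
    for _ in range(max_tries):
        send()
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            reply = receive()
            if reply is not None:
                return reply
        # Timed out: fall through and resend the request.
    return None
```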

Eye-Tracker to C# Application Communication

When entering position mode, the desired position of the end effector is updated to the current position, so that adjustments made under force control are not discarded. For the entire system, the accuracy of the desired-position calculation based on eye-gaze data needs to be verified. The graph below shows an example x-y position plot of the user's gaze, the desired position output by the controller, and the actual position of the end effector for one trial.

Controller Design

Overall Controller Structure

The entire controller can be split into two separate sub-controllers, one controlling the system via force inputs and the other via eye-tracker and position inputs. Only one sub-controller is active at any given time, meaning the system is either in force mode or position mode. It will be shown that the system is still able to follow the desired velocity trajectories.
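A minimal sketch of the sub-controller arbitration, with an assumed force threshold (the reset of the desired position mirrors the mode-entry behavior described earlier):

```python
# Minimal sketch of sub-controller arbitration: only one sub-controller
# is active at a time. The threshold is an assumed value for illustration.
FORCE_THRESHOLD_N = 2.0  # illustrative, not from the text

def select_mode(net_force_n, mode, desired_pos, actual_pos):
    """Return the active mode and the (possibly reset) desired position."""
    if abs(net_force_n) > FORCE_THRESHOLD_N:
        return "force", desired_pos
    if mode == "force":
        # On re-entering position mode, adopt the current position as the
        # desired position so adjustments made under force control persist.
        return "position", actual_pos
    return "position", desired_pos
```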

Force-Based Control

  • Desired Characteristics
  • Force-Feedback Structure

A force-based controller must cycle through all these steps in a smooth manner, making the system kinetically invisible to the user. When the system is at rest and the user applies a force, even though the force is averaged with previous negligible forces, the average is still non-zero. To make the system more responsive, the end effector is set to start decelerating when the input force is below a second force threshold.
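The averaging and dual-threshold behavior might look like the following sketch; the window size and both thresholds are assumed values, not the thesis's tuned constants:

```python
from collections import deque

# Sketch of force-input processing: a moving average compared against a
# start threshold and a lower stop/decelerate threshold. All constants
# are assumed for illustration.
START_THRESHOLD_N = 2.0
STOP_THRESHOLD_N = 0.8
WINDOW = 5

class ForceFilter:
    def __init__(self):
        self.samples = deque(maxlen=WINDOW)
        self.moving = False

    def update(self, force_n: float) -> bool:
        """Return True while the end effector should be driven."""
        self.samples.append(force_n)
        avg = sum(self.samples) / len(self.samples)
        if not self.moving and avg > START_THRESHOLD_N:
            self.moving = True   # averaged force has grown large enough
        elif self.moving and avg < STOP_THRESHOLD_N:
            self.moving = False  # begin decelerating below the second threshold
        return self.moving
```

Using a lower second threshold to stop than to start gives the hysteresis described above: the system responds quickly when the input force drops, without chattering near a single threshold.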

Figure 5-2: Schematic Depicting the Process of Choosing which Sub-controller to Implement

Position-Based Control

  • Desired Characteristics
  • Position-Feedback Structure
  • Eye-Tracker to Position Calibration
  • Potentiometer to Position Calibration

Similar to a force-based controller, the requirements for a position sub-controller can be divided into two main components: calculating the desired position from the eye tracker data and driving the end effector to the desired position. The eye tracker was initially evaluated separately to determine the overall characteristics and accuracy of the eye gaze data it can provide. The graph on the next page shows the eye gaze locations as measured by the eye tracker for 50 consecutive fixations between two points 100 mm apart.

Figure 5-5: Eye-Tracker Setup Dimensions
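A calibration of this kind is commonly implemented as a per-axis linear least-squares fit from raw tracker coordinates to table coordinates; a sketch in pure Python, with made-up sample points (the real calibration pairs were measured on the physical setup):

```python
# Sketch of a per-axis linear calibration from raw eye-tracker output to
# workspace coordinates (mm). A least-squares line is fit from paired
# samples; the sample data below is invented for illustration.

def fit_line(raw, actual):
    """Ordinary least-squares fit: actual = a * raw + b, for one axis."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_a = sum(actual) / n
    var = sum((r - mean_r) ** 2 for r in raw)
    cov = sum((r - mean_r) * (a - mean_a) for r, a in zip(raw, actual))
    a = cov / var
    b = mean_a - a * mean_r
    return a, b

# Hypothetical calibration pairs: raw tracker units -> mm on the table.
raw_x = [100.0, 300.0, 500.0]
true_x = [0.0, 100.0, 200.0]
slope, offset = fit_line(raw_x, true_x)

def raw_to_mm(raw_value: float) -> float:
    return slope * raw_value + offset
```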

Conversion of Sub-controller Outputs to Arduino Pulse Outputs

  • Desired Velocity to Pulse-Width Calibration

As the graph shows, the system quickly drives the end effector to a constant speed (in this case -1080 mm/s) and maintains this speed until the end effector travels the length of the linear rail. The graph below shows the angular velocity data when the rotary motor was driven with an analogReference value of 171, the lowest value used by the system. In the future, this value may be calibrated while the system is in operation, as it will likely vary with user fatigue or energy.

Figure 5-9: Velocity vs. analogReference for RoboClaw set to Linear (Upper) and Exponential (Lower) Modes
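The desired-velocity-to-pulse-width mapping can be sketched as interpolation over a measured calibration table; the table entries below are invented placeholders, since the real curves were measured per motor:

```python
import bisect

# Sketch of mapping a desired velocity to a RoboClaw analog/pulse value by
# interpolating a calibration table. The table below is invented; the real
# one would come from the measured velocity-vs-value curves.
CAL_TABLE = [  # (velocity mm/s, analog value), sorted by velocity
    (-1080.0, 64.0),
    (0.0, 128.0),
    (1080.0, 192.0),
]

def velocity_to_value(v_mm_s: float) -> float:
    """Linearly interpolate the calibration table, clamping at the ends."""
    vels = [v for v, _ in CAL_TABLE]
    if v_mm_s <= vels[0]:
        return CAL_TABLE[0][1]
    if v_mm_s >= vels[-1]:
        return CAL_TABLE[-1][1]
    i = bisect.bisect_right(vels, v_mm_s)
    (v0, p0), (v1, p1) = CAL_TABLE[i - 1], CAL_TABLE[i]
    t = (v_mm_s - v0) / (v1 - v0)
    return p0 + t * (p1 - p0)
```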

Control-based Safety Considerations

This allowed each subsystem to be evaluated separately and as part of the entire system. While the previous section examined the accuracy of the eye tracker, this section discusses the accuracy of the end effector when controlled by the eye tracker. The positioning of the eye tracker is the area that needs the most attention during future development.

System Evaluation

System Mechanical Responses

  • Step Response
  • Ramp Response
  • Frequency Response

The following graphs show the responses of the system when the (x,y) components of the desired position were changed from (900,0) to (400,0) and vice versa. The graphs below show the responses of the system when the (x,y) components of the desired position were changed from (700,300) to (700,0) and vice versa. The following graphs show the responses of the system when the (x,y) components of the desired position were changed from (900,0) to (600,250) and vice versa.

Figure 6-1: Position vs. Time Responses of the Linear System to Step Inputs in X-Direction

System Response to Force Inputs

The gain margin can be safely estimated at 10 dB, which supports the explanation of system stability presented in Chapter 5. The graph below shows the extracted start and end points for the motion shown in the graph above, which better conveys the overall accuracy of the force-based controller. One of the 16 movements triggered the emergency stop described in Chapter 5 (due to an out-of-range potentiometer reading).

Figure 6-19: Extracted Begin and End Points for Example Motion During Force-Based Evaluation

Eye-Tracker Evaluation

The third point was the data collected just before the eye tracker successfully tracked the next coordinate. The overall average distance between the eye-tracker output position and the actual desired position was 15.98 mm. The eye tracker was slightly less accurate for gaze targets farther away than for closer ones.

Figure 6-20: Three Data Series showing the Eye-Tracker Position, the Desired Position output by the Controller, and the Measured End-Effector Position during an Eye-Gaze Controlled Trial

System Response to Combined Force and Eye-Tracker Inputs

In an attempt to best evaluate the system, the position error of the end effector was calculated in three different ways. Method 2 measured the distance between the actual desired location and the end-effector location when a stable position was reached or an intentional force was recorded. Method 3 measured the distance between the actual desired location and the end effector position immediately before the eye tracker output a new desired position.

Table 6-1: Average Errors for System under Normal Operation as measured by Three Different Methods

System Cost

Proceedings of the 9th IEEE International Conference on Rehabilitation Robotics; 2005 June 28-July 1; Chicago, USA: IEEE Press; 2005.
Kobayashi, "An Upper Extremity Mobility Assistive Robot," in Proceedings of the 6th IEEE International Conference on Rehabilitation Robotics (ICORR 1999), Stanford, CA, 1999.
Beer, "Development of MACARM—A New Cable Robot for Upper Extremity Neurorehabilitation," in Proceedings of the 9th International Conference on Rehabilitation Robotics, pp.

Contributions and Future Work

Future Work

  • Mechanical Development
  • Electronic/Software Development
  • Experimentation with Intention-Detection

One such improvement would be to automatically adjust, or calibrate, the estimated moment of inertia of the system. A better estimate of the moment of inertia would elicit much improved responses from the rotary motor.

    % Error handling if a complete response is not received
    if (a.BytesAvailable < 2 && a.BytesAvailable > 0)
        fread(a, a.BytesAvailable);
    end

Systematic review of the effect of robot-assisted therapy on the recovery of the hemiparetic arm after stroke.
Masiero, "Nerebot: a wire-based robot for neurorehabilitation," in Proceedings of the 8th IEEE International Conference on Rehabilitation Robotics (ICORR 2003), Daejeon, Republic of Korea, April 2003.

MATLAB Code

Figures

Figure 2-4: Above (Top) and Front (Below) Views of Linear Subsystem
Figure 2-6: Linear Potentiometer Setup
Figure 2-8: Complete Design of Wrist-Interface
Figure 2-10: FSR Holder Design and ANSYS Stress Evaluation. The left column represents the original design and the second column shows the final design

References
