Motorized Headrest for People with Neck Muscle Weakness

Graeham Douglas, Enrico Guld, Mark Hewett, Fraser Macdonald (The University of British Columbia)

ABSTRACT

No device currently on the market provides user-controlled, powered head support for people with severe neck muscle weakness. Further, eye-gaze tracking technology has yet to be widely adopted for controlling real-world devices. We describe our design of a head support system developed for people with late-stage ALS. The device can be controlled by joystick, button array, or eye-gaze tracking; we give particular attention to the last method because it remains usable by people with muscle weakness throughout the body. Together, these elements provide comfortable support and user-controlled head movement for people who cannot achieve this with their own muscles.

INTRODUCTION

In Amyotrophic Lateral Sclerosis (ALS, also called Lou Gehrig’s Disease) and other diseases, the affected person may have severely weakened neck muscles. Rigidly mounted headrests, rigid neck braces, and unpowered dynamic neck supports are available, but these devices are not designed with the needs of people with ALS (PALS) in mind. Rigid neck support leaves the neck stiff and sore and restricts mobility, while PALS may not have sufficient neck strength to use a dynamic head support.

A second objective of this design is to showcase the ability of eye-gaze tracking to control real-world objects. Eye-gaze tracking has been in use for several years, but its application is largely limited to direct interaction with a computer: email, browsing the internet, or word processing. However, eye-gaze tracking is also applicable to real-world devices. If proven robust, eye-gaze control could be applied to powered wheelchairs or even automobiles. Controlling a head support through eye-gaze tracking provides a comparatively low-risk way to prove the technology in new applications.

Eye-gaze tracking as a human-computer interface technology offers a high bandwidth of data on the user’s interest: a person’s gaze is strongly linked to where their attention lies. The advantage of this approach is that it is accessible to people who may not otherwise be able to physically interact with machines. In this project the Mirametrix S1 gaze tracker was selected as a low-cost eye-gaze interface. Its compact design allows flexibility in mounting.

The eye-gaze tracker uses the reflection of infrared light off the user’s pupils to compute the intended point-of-gaze on the screen. Two infrared light sources are located on opposite sides of a centrally positioned camera, which captures 60 images per second to determine the direction of the user’s gaze (1).
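The S1’s internal algorithm is proprietary, but the general reflection-based approach can be sketched: the vector between the tracked eye features in the camera image is mapped to screen coordinates through a polynomial fitted during calibration. The Java sketch below is illustrative only; the class name and coefficient layout are assumptions, and the coefficients would come from a hypothetical calibration routine rather than from the S1 itself.

/**
 * Illustrative sketch of polynomial gaze mapping (not the S1's actual
 * algorithm): a feature vector (dx, dy) measured in the camera image is
 * mapped to screen coordinates using calibration-fitted coefficients.
 */
public class GazeMapper {
    // Hypothetical second-order coefficients for x and y,
    // ordered as: 1, dx, dy, dx*dy, dx^2, dy^2.
    private final double[] cx;
    private final double[] cy;

    public GazeMapper(double[] cx, double[] cy) {
        this.cx = cx;
        this.cy = cy;
    }

    /** Returns the estimated screen point {x, y} in pixels. */
    public double[] toScreen(double dx, double dy) {
        double[] terms = {1.0, dx, dy, dx * dy, dx * dx, dy * dy};
        double x = 0.0, y = 0.0;
        for (int i = 0; i < terms.length; i++) {
            x += cx[i] * terms[i];
            y += cy[i] * terms[i];
        }
        return new double[] {x, y};
    }
}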

APPROACH

The design builds on an earlier powered prototype created at our university. That device was analysed and critiqued by people with ALS, assistive technologists, and occupational therapists who often work with PALS. These criticisms were combined with our engineering analysis of the earlier device’s durability and manufacturability to direct our design (see Figure 1).

Figure 1: Eye-gaze-controlled motorized headrest.

Electrical Design

We have configured the circuitry to accept a wide variety of input methods. Joystick, eye-gaze tracking, and a clinician-customized input are possible. The clinician-customized input uses four headphone jacks as inputs for up, down, left and right, allowing the user to employ any combination of input switches from sip/puff to feather-light buttons in a simple four-button array.

The user’s gaze is translated into controls by a Java application, which runs on a laptop and relays signal commands to the headrest’s microcontroller. The microcontroller determines the desired function and engages the DC motors accordingly. The Java application offers buttons for simple yes (nod) and no (shake) movements in addition to directional arrows for the user.
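As an illustration of this relay, the sketch below assumes a hypothetical single-character protocol (one character per directional step, plus characters for the nod and shake sequences) written to the serial output stream; the actual firmware may expect a different format.

import java.io.IOException;
import java.io.OutputStream;

/**
 * Minimal sketch of the laptop-side command relay. The single-character
 * protocol shown here is an assumption, not the device's actual format.
 */
public class HeadrestLink {
    private final OutputStream serialOut;  // e.g. the USB serial port stream

    public HeadrestLink(OutputStream serialOut) {
        this.serialOut = serialOut;
    }

    public void stepUp()    throws IOException { send('U'); }
    public void stepDown()  throws IOException { send('D'); }
    public void stepLeft()  throws IOException { send('L'); }
    public void stepRight() throws IOException { send('R'); }
    public void nodYes()    throws IOException { send('Y'); }  // yes = nod sequence
    public void shakeNo()   throws IOException { send('N'); }  // no = shake sequence

    private void send(char command) throws IOException {
        serialOut.write(command);
        serialOut.flush();
    }
}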

With mass manufacture in mind, all circuit components are commercially available and are mounted on a custom printed circuit board (PCB); a manufacturer need only solder the components into their allotted places on the board to obtain a complete, mass-producible circuit. Additionally, the circuitry requires a 12 V, 1 A supply, which is within the capabilities of a wheelchair battery.

Mechanical Design

The mechanical design of our device (see Figure 2) allows movement about two axes. We have considered the biomechanics of the neck to ensure that the device does not over-constrain it. The device should be mounted such that the center of the user’s neck is at the center of both axes of spinal rotation. The connection between the forehead strap and the moving structure of the device allows translation, so the neck flexes with minimal intervertebral stresses.

The pitch, or nod, motion of the device is actuated by a DC motor with a self-locking worm drive. The self-locking property allows the head to be supported without electric power.

Due to the proximity of the device to a vulnerable user, safety is paramount in its design. Mechanical stops prevent motion beyond 45 degrees in any direction from center; these limits can be changed by exchanging swappable parts. Electrical limit switches anticipate the mechanical stops, so that in typical use an electrical stop is applied before the mechanical stop is reached, leaving the mechanical stop as a redundant backup. The software commands the device in steps, so that a loss of user signal results only in a small step of movement, not a large, uncontrolled motion. Moving parts and pinch points are covered by a rapid-prototyped shell, which reduces the chance of hair or fingers being caught and protects the mechanism from wear and debris.
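As an illustration of the step-based software safeguard, the sketch below tracks an estimated angle and issues only one bounded step per command, refusing any step that would pass the software limit; the step size and limit bookkeeping are illustrative assumptions, with the limit switches and mechanical stops remaining as deeper backstops.

/**
 * Illustrative sketch of the step-limited motion policy: one bounded
 * step per user command, refused if it would exceed the software limit.
 * STEP_DEGREES and the +/-45 degree bound are assumptions for illustration.
 */
public class StepLimiter {
    private static final double LIMIT_DEGREES = 45.0;
    private static final double STEP_DEGREES = 2.0;

    private double pitch = 0.0;  // estimated current pitch angle, degrees

    /** Requests one pitch step (+1 up, -1 down); returns the step taken, or 0 if refused. */
    public double requestPitchStep(int direction) {
        double next = pitch + direction * STEP_DEGREES;
        if (Math.abs(next) > LIMIT_DEGREES) {
            return 0.0;  // software refuses; electrical and mechanical stops remain as backstops
        }
        pitch = next;
        return direction * STEP_DEGREES;
    }
}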

We have made a custom mounting system for the device, which has six degrees of adjustability. Further adjustability is possible through the device itself. The device is also designed to be used with off-the-shelf wheelchair mounting products.

Figure 2: Assembled model of the motorized headrest.

Software Design

With the expected rise in quality and ubiquity of built-in cameras, an appropriate user interface design may well be more important than the physical tracking technology itself (2). Following the ISO definition of usability, “the extent to which a product can be used by specified users to achieve specified goals … in a specified context of use” (3), the requirement for the user interface was that it be operable entirely by gaze control to issue directional commands, in the context of motorized assistive devices.

These guidelines, together with the desire to use freely available dwell-click software, defined the user interface. Reports have suggested (4) that selection by a combination of dwell and eye-gaze tracking does not necessarily require a long dwell time to be accurate, and can be accelerated by clearly delineating active areas. The unornamented design that was chosen (see Figure 3) features large, simple buttons that provide the necessary delineation. Ample buffer zones also ensure there is a place for the cursor to rest, avoiding the “Midas touch” phenomenon (5), in which a dwell-based gaze interface frustratingly selects nearly every on-screen item that the eye casually passes over. The application’s window is sized to fit a 15″ screen, the ideal size for the S1.
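As a sketch of how dwell-based selection avoids the Midas touch, the code below fires a selection only after the gaze point has remained inside a button’s bounds for a full dwell period. The 800 ms threshold and the class structure are illustrative assumptions, not the behaviour of the dwell-click software actually used.

import java.awt.Rectangle;

/**
 * Sketch of dwell-based selection: the button is "clicked" only after
 * the gaze stays inside its bounds for DWELL_MS. The threshold is an
 * illustrative value.
 */
public class DwellButton {
    private static final long DWELL_MS = 800;

    private final Rectangle bounds;
    private long enteredAt = -1;  // time the gaze entered the button, or -1 if outside

    public DwellButton(Rectangle bounds) {
        this.bounds = bounds;
    }

    /** Feed one gaze sample (screen pixels); returns true when the button is selected. */
    public boolean onGazeSample(int x, int y, long nowMillis) {
        if (!bounds.contains(x, y)) {
            enteredAt = -1;          // gaze left the button: reset the dwell timer
            return false;
        }
        if (enteredAt < 0) {
            enteredAt = nowMillis;   // gaze just entered the button
            return false;
        }
        if (nowMillis - enteredAt >= DWELL_MS) {
            enteredAt = -1;          // fire once, then require re-entry before firing again
            return true;
        }
        return false;
    }
}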

For controlling mechanical movement, the Arduino Nano is an ideal microcontroller platform, in part because of its simple and flexible programming language, and also because it communicates easily over a USB connection.
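Because the Nano appears to the laptop as an ordinary USB serial port, the host side can open it with a standard serial library. The example below uses the jSerialComm library as an assumption (the paper does not name one); the port name and baud rate are placeholders. The returned stream could then be handed to a command relay such as the one sketched earlier.

import com.fazecast.jSerialComm.SerialPort;
import java.io.OutputStream;

/**
 * Hedged example of opening the Nano's USB serial port with the
 * jSerialComm library; the library choice, port name, and baud rate
 * are assumptions, not taken from the paper.
 */
public class NanoConnection {
    public static OutputStream open(String portName) {
        SerialPort port = SerialPort.getCommPort(portName);  // e.g. "COM3" or "/dev/ttyUSB0"
        port.setBaudRate(9600);
        if (!port.openPort()) {
            throw new IllegalStateException("Could not open " + portName);
        }
        return port.getOutputStream();  // stream of command bytes to the firmware
    }
}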

Figure 3: Author in device.

RESULTS AND LIMITATIONS

The device has been tested on users without neck weakness. It can achieve 45 degrees of motion in each direction about both axes, which is comparable to the natural neck. The speed of movement is adjustable; at top speed it is comparable to a controlled motion of the natural neck. It was suggested during testing that the system could also serve as a therapeutic aid to reduce neck stiffness; this suggestion needs to be verified with appropriate clinicians.

Additional redundant safety features should be added to the device, and we plan to begin user testing and validation with PALS. Most technical aspects of the device have been resolved with this prototype; where they have not, the prototype has identified the specific improvements that are needed. For example, the mechanism providing yaw movement must constrain a curved beam against forward translation and rotation, but cannot be tightened too far or it will increase friction on the beam that provides the yaw motion. Shims must be placed by trial and error to find the optimal offset.

Eye-gaze tracking has a number of limitations, including limited accuracy (approximately 40 pixels on a standard computer screen) and difficulty tracking the eyes under certain lighting conditions (6). Care must be taken to compensate for these limitations in the overall design of the system. One major limitation of free-head-motion eye-gaze systems is that the camera should ideally be 65 cm from the user’s eyes and used within the camera’s allowable range of head motion (field of view). This limitation is important when controlling a powered neck brace, as the user’s head will move relative to the camera. Additional testing is also needed to see how eye-gaze tracking performs on a wheelchair driving over rough terrain.

CONCLUSION

Equipping assistive devices, such as the motorized headrest, with eye-gaze control is a simple combination with tremendous potential. The result is a device that demonstrates the use of eye-gaze tracking with real-world devices, pioneering new applications for the technology. Eye-gaze control allows a user with limited peripheral musculature to operate the motorized headrest; people with neck weakness will likely also have peripheral muscle weakness, making joysticks or other manual inputs difficult to use. Both portions of the device were designed with the needs of people with ALS in mind: they are simple to operate and require little help from a caregiver. We believe this device offers significant benefit to people with ALS and other similar diseases.

ACKNOWLEDGEMENTS

The authors thank the professors, teammates, and previous students of the ALS Design Teams at UBC and the ALS Society for their help in facilitating this project, and Mirametrix for lending the S1 gaze tracker. We offer particular thanks to our supervising faculty, Dr. M. Van der Loos, Dr. A. Hodgson, Dr. P. Kruchten, and Dr. C. Hennessey, all of the University of British Columbia.

REFERENCES

1. Hennessey, C., Noureddin, B., & Lawrence, P. (2006). A single camera eye-gaze tracking system with free head motion. New York: ACM.

2. Betke, M., Gips, J., & Fleming, P. (2001). The Camera Mouse: Visual Tracking of Body Features to Provide Computer Access for People with Severe Disabilities. IEEE Transactions on Rehabilitation Engineering, pp. 1-26.

3. ISO. (1998). ISO 9241-11: Guidance on usability. BSI.

4. Ohno, T. (1998). Features of Eye Gaze Interface for Selection Tasks. APCHI, pp. 176-181.

5. Gee, A., & Cipolla, R. (1994). Non-Intrusive Gaze Tracking for Human-Computer Interaction. Proc. Mechatronics and Machine Vision in Practice, pp. 112-117.

6. Stiefelhagen, R., Yang, J., & Waibel, A. Tracking Eyes and Monitoring Eye Gaze. Workshop on Perceptual User Interfaces. Banff: Advanced Research Projects Agency under the Department of the Navy.
