The MRI and CT scan may one day have a robotic cousin capable of following and peering into patients as they move around.
UF mechanical and aerospace engineer Scott Banks’ goal is to augment static images of patients’ bones, muscles and joints with an interior view of these and other parts in action during normal physical activity. Merging such full-motion X-rays with computerised representations will let orthopaedic surgeons make better diagnoses, suggest more appropriate treatments and get a clearer idea of post-operative successes and failures, he said.
“Our goal is to come up with a way to observe and measure how joints are moving when people are actually using them,” Banks said. “We think this will be tremendously powerful, not only for research but in the clinical setting as well.”
Complaints about orthopaedic injuries are among the most common reasons people visit the doctor, according to the American Academy of Orthopaedic Surgeons.
Orthopaedic surgeons have long diagnosed patients by touch or with static X-rays, MRI and CT scans. They also may use X-ray video, but current technologies provide only a tight view of a very limited range of motion in a controlled laboratory setting.
While all of these techniques can be effective, they do not work well with injuries that manifest themselves when a joint is in motion, Banks said. These include, for example, injuries to the patella, or kneecap, and injuries of the shoulder. Surgeons sometimes have to operate to diagnose these and other injuries, which can lead to unnecessary surgeries.
After operations, surgeons have few tools beyond the patient’s experience to tell them whether a procedure worked as intended and whether it will forestall additional joint damage.
Banks hopes his robot (actually a system of two robots: one to shoot the X-ray video, the other to hold the image sensor) will lead to a radical improvement.
So far, he has one working robot. The robot, with its one-metre mechanical arm, is a commercial product normally used in robotically assisted surgery and silicon-chip manufacturing, which Banks and his graduate students have re-engineered. Its hand can shadow a person’s knee, shoulder or other joint as he or she moves.
In its completed form, the hand will hold lightweight equipment capable of shooting X-rays, while a second robot will hold the sensor that captures the images as moving video. Although the robots will be attached to a fixed base, there is room for a person to move around normally within their reach. And in the future, said Banks, “we could put these robots on wheels and they could follow you around.”
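For the two arms to produce a usable image, they must stay opposed: the X-ray source on one side of the joint and the sensor directly across from it, with the beam axis passing through the joint. The article doesn’t describe the control maths, but a minimal sketch of that geometric constraint might look like the following, where the joint position, beam direction and standoff distances are all illustrative assumptions rather than the real system’s parameters:

```python
import numpy as np

def opposed_poses(joint, beam_dir, src_dist=0.5, det_dist=0.5):
    """Place the X-ray source and the image sensor on opposite sides
    of the tracked joint so the beam axis passes through it.
    Distances (metres) are illustrative, not the actual geometry."""
    d = beam_dir / np.linalg.norm(beam_dir)  # unit beam direction
    source_pos = joint - src_dist * d        # emitter on one side of the joint
    detector_pos = joint + det_dist * d      # sensor directly opposite
    return source_pos, detector_pos

# Example: knee tracked at (0.2, 0.0, 1.0) m, beam aimed along +x
src, det = opposed_poses(np.array([0.2, 0.0, 1.0]), np.array([1.0, 0.0, 0.0]))
```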
For now, the single robot holds a standard video camera.
To use it, a patient wears an LED-lit patch on the body part to be targeted. The patch, several cameras placed around the room and a networked computer together command the robots to home in on and track the joint.
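The article doesn’t give implementation details, but a minimal sketch of such a tracking loop, assuming two calibrated cameras and an arm that accepts small Cartesian moves, might look like this; the Robot class, the projection matrices and the proportional gain are hypothetical stand-ins, not the UF system’s actual interface:

```python
import numpy as np

class Robot:
    """Hypothetical stand-in for the commercial arm's control interface."""
    def __init__(self):
        self.pos = np.zeros(3)
    def hand_position(self):
        return self.pos
    def move_by(self, delta):
        self.pos = self.pos + delta

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: recover the LED patch's 3-D position
    from its pixel coordinates in two calibrated cameras.
    P1, P2 are 3x4 projection matrices; uv1, uv2 are (u, v) pixels."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean coordinates

def track_step(robot, P1, P2, uv1, uv2, standoff, gain=0.5):
    """One cycle: locate the marker, then nudge the robot's hand toward
    a point held a fixed standoff vector away from the patch."""
    marker = triangulate(P1, P2, uv1, uv2)
    target = marker + standoff              # keep the imaging head off the skin
    error = target - robot.hand_position()
    robot.move_by(gain * error)             # proportional step toward the target
```

A proportional step like this trades responsiveness for stability; a real controller would also need to compensate for camera latency and filter marker noise, the kind of accuracy problem Banks describes below.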
In a demonstration, using a graduate student instead of a patient, the camera-tipped hand followed the student’s thigh as he walked and otherwise moved normally. While the hand appeared accurate to the untrained eye, Banks said the video image, which included shots of the student’s jeans, showed that it couldn’t yet track the small LED patch with the accuracy needed for video X-ray.
Improving the accuracy is one of several challenges that remain, Banks said. He has applied for a $275,000 grant from the US National Institutes of Health to continue the work. UF has also applied for a patent on the new imaging technique, and Banks said it could eventually become standard equipment in hospitals.