Although the sheep never returned to their preoperative load status, the results of this study are promising: an adequate skin seal was maintained, preventing infection and allowing the animals to resume symmetrical gait patterns. For the first three months following surgery, the PVF distribution was approximately 80% of the preoperative value. Despite bearing less weight, these load levels indicate a good initial fit, demonstrating implant stability and a maintained skin seal. One possible explanation for the altered loading of the implant one month postoperatively is changed proprioception with the artificial implant. The Delrin/polyurethane exoprosthetic limb could have altered the animals' sense of balance and grip on the surface. However, one would expect the animals to acclimate to the shortcomings of the exoprosthetic limb over time. Instead, over the 12-month period, the applied loads decreased to nearly 74% of the preoperative value. We believe that the decrease in loading over time could be due to stress shielding of the bone by the implants, with resorption of the stress-shielded bone and a subsequent decrease in cortical bone strength (). Finite element models have shown that percutaneous, osseointegrated endoprosthetics can significantly change the stress and strain energy density levels in bones due to both the implant and the altered loading conditions, providing another explanation for the decrease over time (). Based upon the radiographic findings, all of the animals exhibited at least some distal resorption of the implanted third metacarpal, and this loss of bone may explain the change in forelimb loading conditions over the 12-month study. This bone loss could be diminished by an implant design that provides a more uniform distal stress distribution along the bone-implant interface (; ; ; ).
Given our present system design, it is unknown what would happen to the prosthetic limb loads after 12 months. A longer-term study would be needed to determine whether loads would continue to decrease, remain the same, or improve over time as the bone continues to remodel around the implant. To compensate for the decreased load on the amputated limb, the animals applied a greater load to the other three limbs, with the greatest shift to the contralateral left forelimb.
This investigation confirmed that the amputation model was load bearing and would allow assessment of percutaneous osseointegrated implant function and testing of the efficacy of the device. One month postoperatively, the animals loaded their prosthetic limb less than they had preoperatively, and their PVF distribution was never restored to pre-amputation loads, disproving Hypothesis 1. The sheep had a physiologically symmetrical stride length and uniform time in stance phase over the 12-month evaluation period, supporting Hypothesis 2.
To facilitate the global approval and clinical use of these osseointegrated percutaneous implants, we believe that a weight-bearing, large animal model needs to be established to study the efficacy of decreasing the infection rates at the skin-implant interface. Previously, the ovine model has shown promise for the translational testing of bone endoprosthetic implants (; ; ). Sheep have a body weight similar to that of most humans, have a bone morphology, structure and size appropriate for the testing of human implants (; ), and have a bone remodeling rate similar to that of humans ().
Gait analysis of amputees has also been performed as an important tool for assessing ambulation and weight bearing to confirm rehabilitation milestones and clinical endpoints (; ). In quadrupeds, the analyses are complex in that animals have been reported to offload their injured limb and apply greater loads to their three other limbs (; ; ). However, most of the quadruped gait literature deals with fracture healing events. Healing around a percutaneous, osseointegrated implant for amputees would most appropriately be compared with appositional bone formation into porous-coated total joint replacements, a process that has been well documented (; ; ; ; ; ; ; ) and takes up to one year to achieve.
Percutaneous osseointegrated prostheses (POP), used as a docking system, are an alternative technology to sockets. These POP devices, also known in the literature as osseointegrated (; ; ; ; ; ; ; ; ; ) and endo-exoprostheses (; ), represent a relatively new technology that is presently being investigated worldwide in an attempt to obviate the multiple shortcomings of socket technology by providing direct skeletal attachment to secure the exoprosthetic limb. Currently, there are three primary groups working with selected human amputee volunteers using this technology: Branemark and co-workers in Sweden, Aschoff and colleagues in Germany, and Blunn and Pendergrass in England. Their patients have reported significant improvements in functionality, as evidenced by improved range of motion (; ), the absence of the skin and residual-limb problems associated with socket prosthetic attachment (), greater overall satisfaction with their prosthetic device function (; ; ) and the previously unrecognized benefit of “osseoperception”, the central sensory feedback from the environment through the implanted bone (; ; ; ; ). Although these clinical populations have seen multiple significant advantages, the patients in these trials have also shown high infection rates at the skin-implant interface, periprosthetic osteomyelitis, implant loosening, and long rehabilitation times associated with some of the existing procedures (; ). The high infection rates are a challenge that could be significantly reduced or eliminated by ensuring that the natural antibacterial skin barrier is maintained at the skin-implant interface. Currently, these devices are not approved for clinical use in the United States and some other parts of the world due to the reported 18-50% infection rates (; ).
In terms of restoring lost sensory function, cochlear implant research has arguably been the most successful and has been translated into a viable therapeutic option for many profoundly deaf patients (Loeb 1990; McDermott 2004; Pena, Bowsher et al. 2004; Bouccara, Avan et al. 2005). Simply explained, a wire electrode bundle is surgically inserted into the inner ear, serving as an electronic replacement for damaged hair cells. A microphone and speech processor pick up and decode sounds from the environment, and the coded signals drive, in turn, the electrodes that stimulate the nerve fibers of the cochlea, creating the sensation of hearing. In conjunction with intense rehabilitation programs tailored to the individual, deaf individuals can learn to comprehend speech and, in some cases, even acquire it. The same hope exists in the field of visual prostheses.
The first and most successful example of a neural stimulation device is the cardiac pacemaker, which has become a standard therapeutic approach to improving cardiac function in millions of patients. Technology developed in part during the development of the pacemaker has been used successfully for the rehabilitation of sensory and/or motor functions in patients with neurological diseases. Thus, deep brain stimulators have been implanted successfully in patients for pain management and for the control of motor disorders such as Parkinson’s disease (Hunter, Yoshino et al. 2004; Pena, Bowsher et al. 2004; Stieglitz, Schuettler et al. 2004), and cochlear implants are being used to restore auditory function (Brors and Bodmer 2004; Chatelin, Kim et al. 2004; Cohen 2004; Balkany, Hodges et al. 2005; Chang 2005). Moreover, advances in artificial limbs and brain-machine interfaces are now providing hope of increased mobility and independence for amputees and paralyzed patients (Donoghue 2002; Nicolelis 2003; Donoghue, Nurmikko et al. 2004; Carmena, Lebedev et al. 2005; Hochberg, Serruya et al. 2006), and there are preliminary data showing that electrophysiological methods can be used to extract neural information about subjects’ volitional intent to move their distal musculature and then translate these signals into models able to control external devices (Barbeau, McCrea et al. 1999; Donoghue 2002; Nicolelis and Chapin 2002; Serruya, Hatsopoulos et al. 2002; Paninski, Fellows et al. 2004; Patil, Carmena et al. 2004; Sanchez, Carmena et al. 2004; Hochberg, Serruya et al. 2006). As more and more patients have benefited from this approach, interest in neural interfaces and visual prostheses has grown significantly.
The selection of a specific person for a visual implant is not straightforward. There are no strict standardized criteria for accepting or rejecting a candidate, nor for the best rehabilitation procedure for every type of blindness. Generally, a choice must be made between different approaches and/or rehabilitation procedures depending on availability, efficacy, or rejection of invasive methods (Veraart, Duret et al. 2004; Dowling 2005; Dagnelie 2006). However, a pre-surgical protocol and improved methods for predicting success with a visual neuroprosthesis still need to be developed (Fernandez, Pelayo et al. 2005; Merabet, Rizzo et al. 2007; Dagnelie 2008).
The greatest impediments to future progress in visual neuroprosthesis approaches are not only the technical, engineering and surgical issues that remain to be solved, but also the development and implementation of strategies designed to interface with the visually deprived brain, tailored specifically to an individual patient’s own needs. This would particularly involve improved patient selection and a “custom-tailoring” of visual prosthetic devices to the subject. A key issue in this context that has often been underestimated is the role of neural plasticity. Thus, these strategies should not only employ standardized methods and current clinical and technological expertise, but also take into account newly emerging developmental and neurophysiological evidence. For example, there is considerable evidence that adaptive and compensatory changes occur within the brain following the loss of sight (Cohen, Celnik et al. 1997; Pascual-Leone, Hamilton et al. 1999; Bavelier and Neville 2002; Fernandez, Pelayo et al. 2005; Merabet, Rizzo et al. 2005; Ptito and Kupers 2005; Bernabeu, Alfaro et al. 2009). These studies have shown that in some patients the occipital regions of the brain that sighted subjects use to process visual information are transformed and utilized to process tactile and auditory stimuli. This plastic change in the brain probably allows blind subjects to extract greater information from touch and hearing, thus improving quality of life and enhancing the integration of the blind into the social and working environment of a sighted society. Understanding and modulating these neuroplastic processes is crucial for the success of any visual neuroprosthesis and can therefore provide the neuroscientific foundation for improved rehabilitation and teaching strategies for the blind.
The results demonstrated that one month following surgery, the sheep loaded their amputated limb to a mean value of nearly 80% of their pre-amputation loading condition; by 12 months, this mean had dropped to approximately 74%. There were no statistical differences between the symmetry of the amputated forelimb and the contralateral forelimb at any time point for the animals’ stride length or the time spent in the stance phase of the gait cycle. Thus, the data showed that while the animals maintained symmetric gait patterns, they did not return to full weight bearing after 12 months. The results of this study showed that a large animal load-bearing model had a symmetric gait and was weight bearing for up to 12 months. While the current investigation utilized an ovine model, these data suggest that osseointegrated implant technology with postoperative follow-up can help human patients return to a symmetric gait and maintain an active lifestyle, leading to an improvement in their quality of life following amputation.