Publication number: USRE43952 E1
Publication type: Grant
Application number: US 09/784,829
PCT number: PCT/FR1990/000714
Publication date: 29 Jan 2013
Filing date: 5 Oct 1990
Priority date: 5 Oct 1989
Also published as: CA2067209A1, CA2067209C, DE69022063D1, DE69022063T2, EP0494943A1, EP0494943B1, US5868675, WO1991004711A1
Inventors: Jean Francois Uhl, Joel Henrion, Michel Scriban, Jean-Baptiste Thiebaut
Original Assignee: Medtronic Navigation, Inc.
Interactive system for local intervention inside a non-homogeneous structure
Abstract
An interactive system for a local intervention inside a region of a non-homogeneous structure, such as the skull of a patient, which is related to the frame of reference (R2) of an operation table and which is connected to a reference structure comprising a plurality of base points. The system creates on a screen a representation of the non-homogeneous structure and of the reference structure connected thereto, provides the coordinates of the images of the base points in the first frame of reference (R1), allows the marking of the coordinates of the base points in R2, and allows the carrying out of the local intervention with an active member such as a trephining tool, a needle, or a radioactive or chemical implant. The system also optimizes the transfer of reference frames between R1 and R2, from the coordinates of the base points in R2 and of their images in R1, by reducing to a minimum the deviations between the coordinates of the images in R1 and the coordinates of the base points in R1 after transfer. The system also establishes real-time bidirectional coupling between (1) an origin and a direction of intervention simulated on the screen and (2) the position of the active member.
Images (14)
Claims (101)
1. An interactive system for local intervention inside a region of a non-homogeneous structure to which is connected a reference structure containing a plurality of base points, the interactive system comprising:
means for dynamically displaying a three-dimensional image of a representation of the non-homogeneous structure and of the reference structure connected to the non-homogeneous structure, wherein the three-dimensional image also includes a plurality of images of the plurality of base points;
means for determining a set of coordinates of the plurality of images of the plurality of base points in a first reference frame;
means for fixing a position of the non-homogeneous structure and of the reference structure with respect to a second reference frame;
means for determining a set of coordinates of the plurality of base points in the second reference frame;
means of intervention comprising an active member whose position is determined with respect to the second reference frame;
means for generating a plurality of reference frame translation tools for translating a plurality of reference frames from the first reference frame to the second reference frame and vice versa, based on the set of coordinates of the plurality of images of the plurality of base points in the first reference frame and of the set of coordinates of the plurality of base points in the second reference frame, in such a way as to reduce to a minimum at least one of a set of deviations between the set of coordinates of the plurality of images of the plurality of base points in the first reference frame and the set of coordinates of the base points, expressed in the first reference frame using the plurality of reference frame translation tools;
means for defining, with respect to the first reference frame, a simulated origin of intervention and a simulated direction of intervention; and,
means for transferring the plurality of reference frames using the plurality of reference frame translation tools to establish a bidirectional coupling between the simulated origin of intervention and the simulated direction of intervention and the position of the active member.
2. The interactive system according to claim 1, wherein the plurality of reference frame translation tools comprise:
means for creating a matrix (M) for transferring between the first reference frame and a first intermediate reference frame based on a set of coordinates of a set of three images of a set of three base points of the reference structure;
means for creating a matrix (N) for transferring between the second reference frame and a second intermediate reference frame based on the set of coordinates of the set of three images of the set of three base points of the reference structure; and,
means for validating matrix (M) and matrix (N) based on the set of three base points and the set of three images, such that at least one deviation between an expression for at least one additional base point in the second intermediate reference frame and an expression for at least one image point of the additional base point in the first intermediate reference frame is reduced to a minimum.
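The construction and validation recited in claim 2 can be illustrated concretely. The sketch below (Python with NumPy; the function names and the particular frame-construction convention are illustrative assumptions, not taken from the patent) builds a transfer matrix as an orthonormal intermediate frame derived from three points, then checks a fourth point's residual deviation between the two intermediate frames:

```python
import numpy as np

def frame_from_points(p0, p1, p2):
    """Build a 4x4 homogeneous matrix for an intermediate frame:
    origin at p0, x-axis toward p1, z-axis normal to the plane of
    the three points. Maps intermediate-frame coordinates into the
    frame the points are expressed in."""
    x = p1 - p0
    x /= np.linalg.norm(x)
    z = np.cross(x, p2 - p0)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p0
    return T

def validate(M, N, extra_image_pt, extra_base_pt):
    """Express an additional image point (first frame) and base
    point (second frame) in their intermediate frames and return
    the residual deviation between the two expressions."""
    a = np.linalg.inv(M) @ np.append(extra_image_pt, 1.0)
    b = np.linalg.inv(N) @ np.append(extra_base_pt, 1.0)
    return np.linalg.norm(a - b)
```

When the image points are an exact rigid transform of the base points, the residual vanishes; measurement noise shows up directly as a nonzero deviation.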
3. The interactive system according to claim 2, wherein the means for transferring the plurality of reference frames using the plurality of reference frame translation tools further comprises:
a first transfer sub-module for transferring a set of representation/non-homogeneous structure coordinates, and
a second transfer sub-module for transferring a set of non-homogeneous structure/representation coordinates.
4. The interactive system according to claim 3, wherein the first transfer sub-module comprises:
means for acquiring a set of coordinates (XM, YM, ZM), expressed in the first reference frame, of a point of the representation of the non-homogeneous structure to be transferred, by selection on the representation;
means for calculating a set of corresponding coordinates (XP, YP, ZP), expressed in the second reference frame, on the non-homogeneous structure through a transformation:
{XP, YP, ZP} = M * N⁻¹ * {XM, YM, ZM}, where M * N⁻¹ represents a product of the matrix (M) and an inverse of the matrix (N), and
means for processing, with the aid of the corresponding coordinates (XP, YP, ZP), to display a corresponding point on a surface of the non-homogeneous structure and to secure the intervention.
5. The interactive system according to claim 3, wherein the second transfer sub-module comprises:
means for acquiring a set of coordinates (XP, YP, ZP), expressed in the second reference frame, of a point of the non-homogeneous structure to be transferred;
means for calculating a set of corresponding coordinates (XM YM, ZM), expressed in the first reference frame, of the representation through a transformation:
{XM, YM, ZM} = N * M⁻¹ * {XP, YP, ZP}, where N * M⁻¹ represents the product of the matrix (N) and an inverse of the matrix (M); and,
means for displaying the representation using the set of corresponding coordinates (XM, YM, ZM).
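The two transfer sub-modules of claims 4 and 5 amount to mutually inverse matrix products applied in homogeneous coordinates. A minimal sketch (Python with NumPy; M and N are assumed to be 4×4 homogeneous transfer matrices as in claim 2, and the function names are illustrative):

```python
import numpy as np

def to_structure(M, N, xm):
    """Claim 4 transfer, representation (first frame) to
    non-homogeneous structure (second frame):
    {XP, YP, ZP} = M * N^-1 * {XM, YM, ZM}."""
    return (M @ np.linalg.inv(N) @ np.append(xm, 1.0))[:3]

def to_representation(M, N, xp):
    """Claim 5 transfer, structure (second frame) back to
    representation (first frame):
    {XM, YM, ZM} = N * M^-1 * {XP, YP, ZP}."""
    return (N @ np.linalg.inv(M) @ np.append(xp, 1.0))[:3]
```

Because the two products are inverses of one another, chaining the sub-modules returns the original point, which is what makes the coupling of claim 1 bidirectional.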
6. The interactive system according to claim 1, wherein the means for generating the plurality of reference frame translation tools also generate, in association with the reference frame translation tools, tools for taking into account a residual uncertainty which is based on the set of deviations between the set of coordinates of the plurality of images of the plurality of base points in the first reference frame and the set of coordinates of the base points, the tools for taking into account the residual uncertainty usable for displaying a set of contours in the representation whilst taking into account the residual uncertainties.
7. The interactive system according to claim 1, wherein the means for dynamically displaying the three-dimensional image comprises:
a file containing digitized data from a set of two-dimensional images constituted by successive non-invasive tomographic sections of the non-homogeneous structure;
means for calculating and reconstructing the three-dimensional image from the set of two-dimensional images; and
a high-resolution display screen.
8. The interactive system according to claim 7, wherein the means for calculating and reconstructing the three-dimensional image from the set of two-dimensional images comprises a program consisting of computer-aided design type software.
9. The interactive system according to claim 1, wherein the means for determining the set of coordinates of the plurality of base points in the second reference frame comprises a three-dimensional probe equipped with a tactile tip for delivering a set of coordinates of the tactile tip in the said second reference frame.
10. The interactive system according to claim 1, wherein the means for determining the set of coordinates of the plurality of base points in the second reference frame comprises at least one of a set of optical sensors and a set of electromagnetic sensors.
11. The interactive system according to claim 1, wherein a portion of the plurality of base points of the reference structure comprises a plurality of marks positioned on a lateral surface of the non-homogeneous structure.
12. The interactive system according to claim 11, wherein the plurality of marks are four in number and are distributed over the lateral surface so as to define a substantially symmetrical tetrahedron.
13. The interactive system according to claim 1, wherein the means of intervention comprises:
a guide arm to secure intervention in the region of the non-homogeneous structure, the guide arm having a position marked with respect to the second reference frame; and,
an active intervention member whose position is marked with respect to the second reference frame.
14. The interactive system according to claim 13, wherein the active intervention member is removable and selected from the group consisting of:
tools for trephining;
needles and implants;
laser and radioisotope emission heads; and, sighting and viewing systems.
15. The interactive system according to claim 1, wherein the means for transferring the plurality of reference frames establishes a coupling between a direction of visualization of the representation of the non-homogeneous structure on the display means and a direction of observation of the non-homogeneous structure and of the reference structure by the active intervention member.
16. The interactive system according to claim 15, further comprising:
a first module for visualizing a representation in a direction given by two points;
a second module for visualizing a representation in a direction given by an angle of elevation and an angle of azimuth.
17. An interactive system for intervention inside a region of a patient, said interactive system comprising:
a device operable to receive image data of the region of the patient, wherein the image data includes image data of a first reference structure to establish an image reference frame for the region of the patient;
a second reference structure positioned relative to the patient to establish a patient reference frame for the region of the patient;
a controller operable to correlate the position of the first reference structure in the image reference frame with the position of the second reference structure in the patient reference frame;
an active member operable to perform the intervention; and
a tracking system operable to determine a position of at least the second reference structure and a position of the active member and configured to transmit the determined positions of the second reference structure and the active member to the controller;
wherein the controller is configured to determine the position of the active member based on the determined position of at least the active member and the correlation of the first reference structure and the second reference structure.
18. The interactive system as defined in claim 17 wherein the first reference structure includes a plurality of base points.
19. The interactive system as defined in claim 18 wherein the second reference structure includes a plurality of tracking markers.
20. The interactive system as defined in claim 19 wherein the plurality of base points are generated from the plurality of tracking markers.
21. The interactive system as defined in claim 18 wherein the plurality of base points are at least one of a plurality of notable points on the patient and marks fixed to the patient.
22. The interactive system as defined in claim 21 wherein the notable points are selected from a group comprising a head, eyebrows, temples, frontal medial point, an apex of a skull, a center of gravity of the orbits of the eyes, and a combination thereof.
23. The interactive system as defined in claim 18 wherein the controller further includes a graphical tool operable to identify the plurality of base points of the first reference structure in the image data of the image data reference frame.
24. The interactive system as defined in claim 23 wherein the graphical tool is a mouse in communication with the controller.
25. The interactive system as defined in claim 17 wherein the second reference structure includes a plurality of tracking markers.
26. The interactive system as defined in claim 25 wherein the plurality of tracking markers are attached to the patient.
27. The interactive system as defined in claim 17 wherein the second reference structure is attached to the patient.
28. The interactive system as defined in claim 17 wherein the first reference structure is attached to the patient.
29. The interactive system as defined in claim 17 wherein the tracking system includes a marker device operable to determine a position of the second reference structure in relation to the patient reference frame.
30. The interactive system as defined in claim 29 wherein the marker device includes a telemetry system operable to determine the position of the second reference structure in the patient reference frame and transmit the determined position to the controller, wherein the controller is operable to perform the correlation at least with the transmitted determined position.
31. The interactive system as defined in claim 30 wherein the telemetry system is an electromagnetic telemetry system.
32. The interactive system as defined in claim 31 wherein the second reference structure includes electromagnetic tracking markers, wherein the electromagnetic telemetry system is operable to determine the position of the electromagnetic tracking markers of the second reference structure in relation to the patient reference frame.
33. The interactive system as defined in claim 32, wherein the electromagnetic tracking markers are transmitters and the electromagnetic telemetry system is an electromagnetic sensor.
34. The interactive system as defined in claim 30 wherein the telemetry system is an optical telemetry system.
35. The interactive system as defined in claim 34 wherein the optical telemetry system includes at least one of a video camera or an infrared camera to image at least the second reference structure and configured to plot points of the second reference structure.
36. The interactive system as defined in claim 34 wherein the second reference structure includes optical tracking markers, wherein the optical telemetry system is operable to determine the position of the optical tracking markers of the second reference structure in relation to the patient reference frame.
37. The interactive system as defined in claim 34 wherein the optical telemetry system utilizes position and shape recognition to identify the second reference structure.
38. The interactive system as defined in claim 29 wherein the marker device includes a three-dimensional probe.
39. The interactive system as defined in claim 38 wherein the three-dimensional probe includes a tactile tip operable to engage the second reference structure.
40. The interactive system as defined in claim 38 wherein the three-dimensional probe is robotically manipulated, such that the instantaneous position of the three-dimensional probe is known.
41. The interactive system as defined in claim 29 wherein the marker device includes a set of cameras operable to determine the position of the second reference structure in relation to the patient reference frame.
42. The interactive system as defined in claim 41 wherein the set of cameras are selected from video and infrared cameras.
43. The interactive system as defined in claim 29 wherein the marker device is a laser beam emission system operable to illuminate the second reference structure to determine a position of the second reference structure in relation to the patient reference frame.
44. The interactive system as defined in claim 17 wherein the first reference structure is generated from the second reference structure.
45. The interactive system as defined in claim 17 wherein the active member is selected from a group comprising a trephining tool, a needle, a laser, a radioisotope emission head, an endoscopic viewing system, a tool used in the intervention, an implant, a sighting system, a microscope, and combinations thereof.
46. The interactive system as defined in claim 17 further comprising a telemetry system operable to determine the position of the active member in the patient reference frame, said telemetry system in communication with the controller.
47. The interactive system as defined in claim 46 wherein the position information of the active member is six degree of freedom information in relation to the patient reference frame.
48. The interactive system as defined in claim 17 wherein the device includes a display operable to display the image data of the region of the patient in relation to the image reference frame.
49. The interactive system as defined in claim 48 wherein the controller is further operable to determine a reference origin of intervention and a direction of intervention and said display is further operable to display the reference origin of intervention and direction of intervention.
50. The interactive system as defined in claim 48 wherein the controller is further operable to model a reference origin of intervention and a direction of intervention and said display is further operable to display the modeled reference origin of intervention and direction of intervention.
51. The interactive system as defined in claim 48 wherein the display is further operable to display the real-time position of the active member in the image reference frame based on the determined position of the active member with the tracking system.
52. The interactive system as defined in claim 48 wherein the display is further operable to display image data relative to a direction of intervention of the active member.
53. The interactive system as defined in claim 52 wherein the image data is displayed perpendicular to a direction of intervention of the active member.
54. The interactive system as defined in claim 48 wherein the controller is further operable to simulate an optimal trajectory of advance of the active member and said display is operable to display the optimal trajectory in the image data relative to the image reference frame.
55. The interactive system as defined in claim 54 wherein movement of the active member is steered to the optimal trajectory to carry out a programmed intervention.
56. The interactive system as defined in claim 17 wherein the active member is robotically controlled.
57. The interactive system as defined in claim 17 wherein the image data is at least one of a magnetic resonance image data, a tomographic image data, a radiographic image data, x-ray image data, and combinations thereof.
58. The interactive system as defined in claim 57 wherein the head set is further fixed to an operating table.
59. The interactive system as defined in claim 17 wherein the device is operable to construct three-dimensional images from captured two-dimensional images.
60. The interactive system as defined in claim 17 wherein the controller is further operable to correlate map data in a map reference frame with the patient reference frame.
61. The interactive system as defined in claim 17 wherein the intervention is at least one of a neurosurgery, orthopedic surgery, cranial surgery, and combinations thereof.
62. The interactive system as defined in claim 17 wherein the second reference structure is fixed to a head set.
63. The interactive system as defined in claim 17 wherein the device further includes memory operable to store the image data.
64. The interactive system as defined in claim 17 wherein the device is a first computer.
65. The interactive system as defined in claim 64 wherein the controller is a second computer.
66. The interactive system as defined in claim 65 wherein the first computer and the second computer are a single work station.
67. An interactive system for intervention inside a region of a patient, said interactive system comprising:
a device operable to receive image data of the region of the patient, wherein the image data includes image data of a first reference structure to establish an image reference frame for the region of the patient;
a second reference structure positioned relative to the patient to establish a patient reference frame for the region of the patient; and
a controller operable to correlate the position of the first reference structure in the image reference frame with the position of the second reference structure in the patient reference frame;
wherein the device is operable to construct three-dimensional images from captured two-dimensional images;
wherein the controller is operable to superimpose two-dimensional image data on the three-dimensional images wherein any change in soft external parts of the patient can be visualized as compared with the image captured by the imaging device.
68. An interactive system for intervention inside a region of a patient, said interactive system comprising:
a device operable to receive image data of the region of the patient, wherein the image data includes image data of a first reference structure to establish an image reference frame for the region of the patient;
a second reference structure positioned relative to the patient to establish a patient reference frame for the region of the patient;
a controller operable to correlate the position of the first reference structure in the image reference frame with the position of the second reference structure in the patient reference frame; and
an active member operable to perform the intervention;
wherein the device includes a display operable to display the image data of the region of the patient in relation to the image reference frame;
wherein the controller is further operable to determine residual uncertainty which is used to represent a contour with dimensions larger than those which would normally be represented and the display is operable to display the residual uncertainty of the contour.
69. The interactive system as defined in claim 68 wherein the contour is a display of an active member and a representation of residual uncertainty in order to reduce the chance of traversing undesired structures.
70. An interactive system for intervention inside a region of a patient, said interactive system comprising:
a device operable to receive image data of the region of the patient, wherein the image data includes image data of a first reference structure to establish an image reference frame for the region of the patient;
a second reference structure positioned relative to the patient to establish a patient reference frame for the region of the patient;
a controller operable to correlate the position of the first reference structure in the image reference frame with the position of the second reference structure in the patient reference frame;
an active member operable to perform the intervention inside the region of the patient;
a tracking system operable to track the position of the active member in relation to the patient reference frame, the tracking system being in communication with the controller to transmit the tracked position of the active member as position information to the controller, wherein the controller is operable to determine the position of the active member relative to the image reference frame; and
a display operable to display the real-time position of the active member in the image reference frame based on the controller determined position of the active member based on the tracked position of the active member from the tracking system, wherein the controller is configured to generate a representation of the active member that is displayed on the display relative to a display of the received image data.
71. The interactive system as defined in claim 70 wherein the active member is selected from a group comprising a trephining tool, a needle, a laser, a radioisotope emission head, an endoscopic viewing system, a tool used in the intervention, an implant, a sighting system, a microscope, and combinations thereof.
72. The interactive system as defined in claim 70 wherein the position information of the active member is six degree of freedom information in relation to the patient reference frame.
73. The interactive system as defined in claim 70 wherein the tracking system that tracks the position of the active member is a telemetry system in communication with the controller.
74. The interactive system as defined in claim 70 wherein the active member is robotically controlled.
75. The interactive system as defined in claim 70 wherein the image data is at least one of a magnetic resonance image data, a tomographic image data, a radiographic image data, x-ray image data, and combinations thereof.
76. The interactive system as defined in claim 70 wherein the controller is further operable to determine a reference origin of intervention and a direction of intervention and said display is further operable to display the reference origin of intervention and direction of intervention.
77. The interactive system as defined in claim 70 wherein the first reference structure includes a plurality of base points.
78. The interactive system as defined in claim 77 wherein the second reference structure includes a plurality of tracking markers.
79. The interactive system as defined in claim 78 wherein the plurality of base points are generated by the plurality of tracking markers.
80. The interactive system as defined in claim 70 wherein the second reference structure is attached to the patient.
81. The interactive system as defined in claim 70 wherein intervention is at least one of a neurosurgery, orthopedic surgery, cranial surgery intervention, and combinations thereof.
82. The interactive system as defined in claim 70 wherein the second reference structure is fixed to a head set.
83. The interactive system as defined in claim 70 wherein the display forms part of the device and wherein the image data received is acquired image data of the region of the patient and is displayed on the display, further wherein the representation of the active member is displayed on the acquired image data of the region of the patient.
84. A method for performing an image guided intervention inside a region of a patient, said method comprising:
accessing a first image data of the region of the patient captured with an imaging system where the first image data includes image data of a first reference structure;
identifying the first reference structure in the first image data to establish an image reference frame;
identifying a second reference structure relative to the patient to establish a patient reference frame;
correlating the position of the first reference structure in the image reference frame in the first image data with the position of the second reference structure in the patient reference frame; and
tracking an active member at least to determine a position of the active member in the patient reference frame to determine a location of the active member based on the tracking of the active member and transmitting the determined position in the patient reference frame for display on a display device relative to the image reference frame of the first image data based at least on the correlation of the first reference structure and the second reference structure.
85. The method as defined in claim 84 further comprising attaching a plurality of tracking markers to the patient where the tracking markers form the second reference structure.
86. The method as defined in claim 85 further comprising identifying the position of the tracking markers in the patient reference frame using a telemetry system.
87. The method as defined in claim 86 further comprising transmitting from the tracking markers a signal and receiving the transmitted signal with an electromagnetic sensor to identify the position of the second reference structure in the patient reference frame.
88. The method as defined in claim 84 wherein identifying the first reference structure includes identifying a plurality of base points visible in the image data.
89. The method as defined in claim 88 wherein identifying the plurality of base points includes identifying at least one of notable points on the patient and marks fixed to the patient representing the plurality of base points.
90. The method as defined in claim 89 wherein the notable points are selected from a group comprising a head, eyebrows, temporal point, frontal medial point, an apex of a skull, a center of gravity of the orbits of the eyes, and a combination thereof.
91. The method as defined in claim 88 wherein the plurality of base points visible in the image data are generated from the plurality of tracking markers attached to the patient.
92. The method as defined in claim 84 further comprising attaching the second reference structure to the patient.
93. The method as defined in claim 92 further comprising attaching the second reference structure to a head set.
94. The method as defined in claim 84 further comprising displaying the image data of the region of the patient, including displaying the first reference structure.
95. The method as defined in claim 94 further comprising:
displaying the position of the active member as a representation of the active member in the accessed first image data that is captured image data that is correlated to the patient based on the correlation and displayed on a display device with the position of the active member being correlated between the patient reference frame defined by the first reference structure fixed to the patient and the image reference frame based on the tracking of the active member.
96. The method as defined in claim 95 further comprising identifying the position of the active member with a telemetry system by transmitting the tracked location of the active member for displaying the representation of the active member.
97. The method as defined in claim 95 further comprising displaying a reference origin of intervention and a direction of intervention in the image data.
98. The method as defined in claim 97 further comprising tracking the position of the active member relative to the reference origin of intervention and the direction of intervention.
99. The method as defined in claim 84 further comprising performing an intervention on the patient with an active member.
100. The method as defined in claim 99 wherein the intervention is selected from at least one of a neurosurgery, orthopedic surgery, cranial surgery, and combinations thereof.
101. The method as defined in claim 84 further comprising converting two-dimensional image data to three-dimensional image data.
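Claims 59 and 101 recite constructing three-dimensional image data from captured two-dimensional sections. A minimal sketch of the stacking step (Python with NumPy; the spacing parameters and function name are illustrative assumptions — real tomographic data carries its own acquisition geometry):

```python
import numpy as np

def stack_slices(slices, slice_spacing=1.0, pixel_spacing=1.0):
    """Stack equally spaced 2D tomographic sections into a 3D
    volume; return the volume and a voxel-to-physical scale
    vector (slice axis first)."""
    volume = np.stack(slices, axis=0)  # shape: (n_slices, rows, cols)
    scale = np.array([slice_spacing, pixel_spacing, pixel_spacing])
    return volume, scale
```

Multiplying voxel indices by the scale vector yields coordinates in the first reference frame, on which the reference-frame transfers above can then operate.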
Description

The invention relates to an interactive system for local intervention inside a region of a nonhomogeneous structure.

The performing of local interventions inside a nonhomogeneous structure, such as intracranial surgical operations or orthopedic surgery, currently poses the problem of optimizing the intervention path or paths so as to secure, on the one hand, total intervention over the region or structure of interest, such as a tumor to be treated or explored and, on the other hand, minimal lesion to the regions neighboring or adjoining the region of interest. This entails localizing and then selecting the regions of the nonhomogeneous structure which are least sensitive to being traversed or least susceptible to damage as regards the integrity of the structure.

Numerous works aimed at providing a solution to the abovementioned problem have hitherto been the subject of publications. Among the latter may be cited the article entitled “Three-Dimensional Digitizer (Neuronavigator): New Equipment for Computed Tomography-Guided Stereotaxic Surgery”, published by Eiju Watanabe, M.D., Takashi Watanabe, M.D., Shinya Manaka, M.D., Yoshiaki Mayanagi, M.D., and Kintomo Takakura, M.D., Department of Neurosurgery, Faculty of Medicine, University of Tokyo, Japan, in Surg. Neurol. 1987; 27: 543-547, Elsevier Science Publishing Co., Inc. Patent WO-A-88 09151 teaches a similar item of equipment.

In the abovementioned publications are described in particular a system and an operational mode on the basis of which a three-dimensional position marking system, of the probe type, makes it possible to mark the three-dimensional position coordinates of a nonhomogeneous structure, such as the head of a patient having to undergo a neurosurgical intervention, and then to put into correspondence, as a function of the relative position of the nonhomogeneous structure, a series of corresponding images consisting of two-dimensional images sectioned along an arbitrary direction and obtained previously with the aid of a medical imaging method of the “scanner” type.

The system and the operational mode mentioned above offer a sure advantage for the intervening surgeon since the latter has available, during the intervention, apart from a direct view of the intervention, at least one two-dimensional sectional view enabling him to be aware, in the sectional plane, of the state of performance of the intervention.

However, and by virtue of the very design of the system and of the operational mode mentioned above, the latter allow neither a precise representation of the state of performance of the intervention, nor partially or totally automated conduct of the intervention in accordance with a program for advance of the instrument determined prior to the intervention.

Such a system and such an operational mode cannot therefore claim to eradicate all man-made risk, since the intervention is still conducted by the surgeon alone.

The objective of the present invention is to remedy the whole of the problem cited earlier, and in particular to propose a system permitting as exact as possible a correlation, at any instant, between an intervention modeling on the screen and the actual intervention, and furthermore the representation from one or more viewing angles, and if appropriate in one or more sectional planes, of the nonhomogeneous structure, the sectional plane or planes possibly being for example perpendicular to the direction of the path of advance of the instrument or of the intervention tool.

Another objective of the present invention is also the implementation of a system permitting simulation of an optimal trajectory of advance of the tool, so as to constitute an assisted or fully programmed intervention.

Finally, an objective of the present invention is to propose a system making it possible, on the basis of the simulated trajectory and of the programmed intervention, to steer the movement of the instrument or tool to the said trajectory so as to carry out the programmed intervention.

The invention proposes to this effect an interactive system for local intervention inside a region of a nonhomogeneous structure to which is tied a reference structure containing a plurality of base points, characterized in that it comprises:

    • means of dynamic display by three-dimensional imaging of a representation of the nonhomogeneous structure and of a reference structure tied to the nonhomogeneous structure, including images of the base points,
    • means of delivering the coordinates of the images of the base points in the first reference frame,
    • means of securing the position of the nonhomogeneous structure and the reference structure with respect to a second reference frame,
    • marker means for delivering the coordinates of the base points in the second reference frame,
    • means of intervention comprising an active member whose position is determined with respect to the second reference frame,
    • means of optimizing the transfer of reference frames from the first reference frame to the second reference frame and vice versa, on the basis of the coordinates of the images of the base points in the first reference frame and of the coordinates of the base points in the second reference frame, in such a way as to reduce to a minimum the deviations between the coordinates of the images of the base points in the first reference frame and the coordinates of the base points, expressed in the said first reference frame with the aid of the said reference frame transfer tools,
    • means for defining with respect to the first reference frame a simulated origin of intervention and a simulated direction of intervention, and
    • reference frame transfer means using the said reference frame transfer tools to establish a bidirectional coupling between the simulated origin of intervention and the simulated direction of intervention and the position of the active member.
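By way of non-limiting illustration, the optimization of the reference frame transfer enumerated above amounts to a least-squares rigid registration of the base-point images in R1 onto the base points measured in R2. The following Python sketch (the function names and the choice of the Kabsch algorithm are illustrative assumptions, not part of the disclosure) shows one way of elaborating such a transfer and of quantifying the residual deviations:

```python
import numpy as np

def fit_rigid_transfer(points_r1, points_r2):
    """Least-squares rigid transform (Kabsch algorithm) mapping the
    base-point images in R1 onto the base points measured in R2."""
    p1 = np.asarray(points_r1, dtype=float)
    p2 = np.asarray(points_r2, dtype=float)
    c1, c2 = p1.mean(axis=0), p2.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (p1 - c1).T @ (p2 - c2)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against an improper (reflected) rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c2 - R @ c1
    return R, t  # p2 ~ R @ p1 + t

def residual_deviation(R, t, points_r1, points_r2):
    """RMS deviation between the transferred R1 images and the R2 points."""
    p1 = np.asarray(points_r1, dtype=float)
    p2 = np.asarray(points_r2, dtype=float)
    mapped = (R @ p1.T).T + t
    return float(np.sqrt(((mapped - p2) ** 2).sum(axis=1).mean()))
```

The residual returned by the second function corresponds to the deviations which, according to the invention, are to be reduced to a minimum.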

A more detailed description of the system of the invention will be given below with reference to the drawings in which:

FIG. 1 represents a general view of an interactive system for local intervention inside a region of a nonhomogeneous structure according to the present invention,

FIG. 2 represents, in the case where the nonhomogeneous structure consists of the head of a patient, and with a view to a neurosurgical intervention, a reference structure tied to the nonhomogeneous structure and enabling a correlation to be established between a “patient” reference frame and a reference frame of images of the patient which were made and stored previously,

FIG. 3 represents an advantageous embodiment of the spatial distribution of the reference structure of FIG. 2,

FIG. 4 represents an advantageous embodiment of the intervention means set up on an operating table in the case of a neurosurgical intervention,

FIGS. 5a and 5b represent a general flow diagram of functional steps implemented by the system,

FIGS. 6 to 8 represent flow diagrams of programs permitting implementation of certain functional steps of FIG. 5b,

FIG. 9a represents a flow diagram of a program permitting implementation of a functional step of FIG. 5a,

FIG. 9b represents a flow diagram of a program permitting implementation of another functional step of FIG. 5a,

FIGS. 10a and 10b represent a general flow diagram of the successive steps of an interactive dialogue between the system of the present invention and the intervening surgeon and

FIG. 10c represents a general flow diagram of the successive functional steps carried out by the system of the invention prior to the intervention, during the intervention and after the intervention.

The interactive system for local intervention according to the invention will firstly be described in connection with FIG. 1.

A nonhomogeneous structure, denoted SNH, on which an intervention is to be performed, consists for example of the head of a patient in which a neurosurgical intervention is to be performed. It is however understood that the system of the invention can be used to carry out any type of intervention in any type of nonhomogeneous structure inside which structural and/or functional elements or units may be in evidence and whose integrity, during the intervention, is to be respected as far as possible.

The system comprises means, denoted 1, of dynamic display by three-dimensional imaging, with respect to a first reference frame R1, of a representation (denoted RSR) of a reference structure SR (described later) tied to the structure SNH, and a representation or modeling of the nonhomogeneous structure, denoted RSNH.

More precisely, the means 1 make it possible to display a plurality of successive three-dimensional images, from different angles, of the representations RSNH and RSR.

The system of the invention also comprises means, denoted 2, of tied positioning, with respect to a second reference frame R2, of the structures SNH and SR.

In the present non-limiting example, the head of the patient, bearing the reference structure SR, is fixed on an operating table TO to which are fixed the means 2 of tied positioning.

Of course, the patient whose head has been placed in the means 2 for tied positioning has previously been subjected to the customary preparations, in order to enable him to undergo the intervention.

The means 2 of the tied positioning with respect to R2 will not be described in detail since they can consist of any means (such as a retaining headset) normally used in the field of surgery or neurosurgery. The reference frame R2 can arbitrarily be defined as a tri-rectangular reference trihedron tied to the operating table TO, as represented in FIG. 1.

Means 3 of marking, with respect to the second reference frame R2, the coordinates, denoted X2, Y2, Z2, of arbitrary points, and in particular of a certain number of base points of the reference structure SR are furthermore provided.

These base points constituting the reference structure SR can consist of certain notable points and/or of marks fixed to the patient, at positions selected by the surgeon and in particular at these notable points.

The system of the invention further comprises computing means 4 receiving from the marker means 3 the coordinates X2, Y2, Z2.

The computing means 4, as will be seen in detail later, are designed to elaborate optimal tools for reference frame transfer using on the one hand the coordinates in R2, measured by the probe 3, of a plurality of base points of the structure SR, and on the other hand the coordinates in R1, determined by graphical tools of the computer MO1 (pointing by mouse, etc.), of the images of the corresponding base points in the representation RSR, so as to secure the best possible correlation between the information modeled in the computer equipment and the corresponding real-world information.

There is furthermore provision for reference frame transfer means 11 designed to use the tools thus elaborated and to secure this correlation in real time.

Moreover, means 40 are provided, as will be seen in detail later, for determining or modeling a reference origin of intervention ORI and a direction of intervention Δ.

With the aid of the means 11, the modeled direction of intervention Δ can, at least prior to the intervention and at the start of the intervention, be materialized through an optical sighting system available to the surgeon, it being possible to steer this sighting system positionally with respect to the second reference frame R2.

The sighting system will be described later.

The system of the present invention finally comprises means 5 of intervention comprising an active member, denoted 50, whose position is specified with respect to the second reference frame R2. The active member can consist of the various tools used in surgical intervention. For example, in the case of an intercranial neurosurgical intervention, the active member could be a trephining tool, a needle, a laser or radioscope emission head, or an endoscopic viewing system.

According to an advantageous characteristic of the invention, by virtue of the reference frame transfer means 11, the position of the active member can be controlled dynamically on the basis of the prior modeling of the origin of intervention ORI and of the direction of intervention Δ.

The means 1 of dynamic display by three-dimensional imaging of the representations RSNH and RSR comprise a file 10 of two-dimensional image data. The file 10 consists for example of digitized data from tomographic sections, from radiographs, from maps of the patient's head, and contained in an appropriate mass memory.

The successive tomographic sections can be produced prior to the intervention in a conventional manner, after the reference structure SR has been put in place on the nonhomogeneous structure SNH.

According to an advantageous feature, the reference structure SR can consist of a plurality of marks or notable points which can be both sensed by the marker means 3 and detected on the two-dimensional images obtained.

Of course, the abovementioned two-dimensional tomographic sections can likewise be produced by any medical imaging means such as a nuclear magnetic resonance system.

In a characteristic and well-known manner, each two-dimensional image corresponding to a tomographic scanner section corresponds to a structural slice thickness of about 2 to 3 mm, the pixels or image elements in the plane of the tomographic section being obtained with a precision of the order of ±1 mm. It is therefore understood that the marks or points constituting the reference structure SR appear on the images with a positional uncertainty, and an important feature of the invention will consist in minimizing these uncertainties as will be described later.

The system also comprises first means 110 for calculating and reconstructing three-dimensional images from the data from the file 10.

It also comprises a high-resolution screen 12 permitting the displaying of one or more three-dimensional or two-dimensional images constituting so many representations RSR and RSNH of the reference structure and of the nonhomogeneous structure.

Advantageously, the calculating means 110, the high-resolution screen and the mass memory containing the file 10 form part of a computer of the workstation type with conventional design and denoted MO1.

Preferably, the first calculating means 110 can consist of a CAD type program installed in the workstation MO1.

By way of non-limiting example, this program can be derived from the software marketed under the tradename “AUTOCAD” by the “AUTODESK” company in the United States of America.

Such software makes it possible, from the various digitized two-dimensional images, to reconstruct three-dimensional images constituting the representations of the structures RSR and RSNH in arbitrary orientations.

Thus, as has furthermore been represented in FIG. 1, the calculating means 4 and 11 can consist of a second computer, denoted MO2 in FIG. 1.

The first and second computers MO1 and MO2 are interconnected by a conventional digital link (bus or network).

As a variant, the computers MO1 and MO2 can be replaced by a single workstation.

The marker means 3 consist of a three-dimensional probe equipped with a tactile tip 30.

This type of three-dimensional probe, known per se and not described in detail, consists of a plurality of hinged arms, marked in terms of position with respect to a base integral with the operating table TO. It makes it possible to ascertain the coordinates of the tactile tip 30 with respect to the origin O2 of the reference frame R2 with a precision better than 1 mm.

The probe is for example equipped with resolvers delivering signals representing the instantaneous position of the abovementioned tactile tip 30. The resolvers are themselves connected to circuits for analog/digital conversion and sampling of the values representing these signals, these sampling circuits being interconnected in conventional manner to the second computer MO2 in order to supply it with the coordinates X2, Y2, Z2 of the tactile tip 30.

As a variant or additionally, and as represented diagrammatically, the marker means 3 can comprise a set of video cameras 31 and 32 (or else infrared cameras) enabling pictures to be taken of the structures SNH and SR.

The set of cameras can act as a stereoscopic system permitting the positional plotting of the base points of the reference structure SR, or of other points of the nonhomogeneous structure SNH, with respect to the second reference frame R2. The positional plotting can be done for example by appending a laser beam emission system making it possible to illuminate successively the points whose coordinates are sought, appropriate software making it possible to then determine the position of these points one by one with respect to R2. This software will not be described since it can consist of position and shape recognition software normally available on the market.
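By way of non-limiting illustration, the positional plotting of a laser-illuminated point by the stereoscopic pair of cameras reduces to triangulating two sighting rays. A minimal Python sketch, assuming the camera centers and the unit direction of each ray toward the illuminated point are already known from calibration (function name and parameterization are illustrative):

```python
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Least-squares 'intersection' of two sighting rays (camera centers
    c1, c2; unit direction vectors d1, d2): the midpoint of the shortest
    segment joining the two lines."""
    c1, d1, c2, d2 = (np.asarray(v, dtype=float) for v in (c1, d1, c2, d2))
    w = c1 - c2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b          # zero only for parallel rays
    s = (b * e - c * d) / denom    # parameter along the first ray
    t = (a * e - b * d) / denom    # parameter along the second ray
    return 0.5 * ((c1 + s * d1) + (c2 + t * d2))
```

For rays that actually meet at the illuminated point, the midpoint coincides with that point; otherwise it gives the best estimate of its position in R2.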

According to another variant, the marker means 3 can comprise a telemetry system.

In this case, the marks of the structure SR can consist of small radiotransmitters implanted for example on the relevant points of the patient's head and designed to be visible on the two-dimensional images, appropriate electromagnetic or optical sensors (not shown) being provided in order to determine the coordinates of the said marks in the reference frame R2 or in a reference frame tied to the latter.

It is important to note here that the general function of the base points of the reference structure is, on the one hand, to be individually localizable on the reference structure, in order to deduce from this the coordinates in R2, and on the other hand, to be visualizable on the two-dimensional images so as to be identified (by their coordinates in R1) and included in the representation RSR on the screen.

It can therefore involve special marks affixed at arbitrary points of the lateral surface of the structure SNH, or else at notable points of the latter, or else, when the notable points can in themselves be localized with high precision both on the structure SNH and on the 2D sections, notable points totally devoid of marks.

FIG. 2 shows a plurality of marks, denoted M1 to Mi. In the case where the nonhomogeneous structure consists of the head of a patient, these marks are localized for example between the eyebrows of the patient, on the latter's temples, and at the apex of the skull at a notable point such as the frontal median point.

More generally, for a substantially ovoid volume constituting the nonhomogeneous structure, there is advantageously provision for four base points at least on the outer surface of the volume.

Thus, as has been represented in FIG. 3, the four marks M1 to M4 of the reference structure are distributed so as preferably to define a more or less symmetric tetrahedron. The symmetry of the tetrahedron, represented in FIG. 3, is materialized by the vertical symmetry plane PV and the horizontal symmetry plane PH.

According to an advantageous characteristic, as will be seen later, the means of elaborating the reference frame transfer tools are designed to select three points of the tetrahedron which will define the “best plane” for the reference frame transfer.

Also, the presence of four or more points enables the additional point(s) to validate a specified selection.

More precisely, the presence of a minimum of four base points on the reference structure makes it possible to search for the minimum distortion between the points captured on the patient by the marker means, consisting for example of the three-dimensional probe, and the images of these points on the representation by three-dimensional imaging, the coordinates of which are calculated during processing. The best plane of the tetrahedron described earlier, that is to say the plane for which the deviation between the points actually captured by the three-dimensional probe and the corresponding points of the representation of the reference structure RSR is minimal, then becomes the reference plane for the reference frame transfer. Thus, the best correlation will be established between a modeled direction of intervention and a modeled origin of intervention, on the one hand, and the action of the member 50, on the other hand. Preferably, the origin of intervention will be placed at the center of the region in which the intervention is to be carried out, that is to say a tumor observed or treated for example.
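By way of non-limiting illustration, the selection of the “best plane” can be sketched as follows in Python: each triple of base points is used to fit a candidate rigid transfer, and the triple whose transfer leaves the smallest deviation on the remaining validation point(s) is retained. Function names and the Kabsch fitting step are illustrative assumptions:

```python
import numpy as np
from itertools import combinations

def _rigid_fit(a, b):
    # Kabsch: rotation R and translation t such that b ~ R @ a + t
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    U, _, Vt = np.linalg.svd((a - ca).T @ (b - cb))
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # avoid a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cb - R @ ca

def best_plane(images_r1, points_r2):
    """Pick the triple of base points whose fitted transfer leaves the
    smallest deviation on the remaining (validation) points."""
    images_r1 = np.asarray(images_r1, dtype=float)
    points_r2 = np.asarray(points_r2, dtype=float)
    n = len(images_r1)
    best = None
    for triple in combinations(range(n), 3):
        rest = [i for i in range(n) if i not in triple]
        R, t = _rigid_fit(images_r1[list(triple)], points_r2[list(triple)])
        mapped = (R @ images_r1[rest].T).T + t
        err = np.linalg.norm(mapped - points_r2[rest], axis=1).max()
        if best is None or err < best[1]:
            best = (triple, err, R, t)
    return best  # (indices of the best plane, residual error, R, t)
```

The residual error returned alongside the retained triple is precisely the uncertainty that the left-out point(s) serve to validate.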

Furthermore, it will be possible to take the noted residual uncertainty into account in order to effect the representation of the model and of the tools on the dynamic display means.

A more detailed description of the means of intervention 5 will now be given in connection with FIG. 4.

Preferably, the means of intervention 5 comprise a carriage 52 which is translationally mobile along the operating table TO, for example on a rack, denoted 54, whilst being driven by a motor, not shown, itself controlled by the computer MO2 for example, via an appropriate link. This movement system will not be described in detail since it corresponds to a conventional movement system available on the market. As a variant, the carriage 52 can be mobile over a distinct path separated from the operating table TO, or immobile with respect to the operating table and then constitute a support.

The support carriage 52 comprises in the first place a sighting member OV, constituting the above-mentioned sighting system, which can consist of a binocular telescope.

The sighting member OV enables the surgeon, prior to the actual intervention, or during the latter, to sight the presumed position of the region in which the intervention is to be carried out.

Furthermore, and in a non-limiting manner, with the sighting member OV can be associated a helium-neon laser emission system, denoted EL, making it possible to secure the aiming of a fine positioning or sighting laser beam on the structure SNH and in particular, as will be seen in detail later, to indicate to the surgeon the position of an entry point PE prior to the intervention, to enable the latter to open the skull at the appropriate location, and likewise to indicate to him what the direction of intervention will be. Additionally, the illuminating of the relevant point of the nonhomogeneous structure or at the very least the lateral surface of the latter enables the video cameras 31 and 32 to carry out, if necessary, a positional plotting.

Preferably, a system for measuring position by telemetry 53 is provided to secure the precise measurement of the position of the support carriage 52 of the sighting member OV and of the laser emission system EL. During the operation, and in order to secure the intervention, the carriage 52 can be moved along the rack 54, the position of the carriage 52 being measured very precisely by means of the system 53. The telemetry system 53 is interconnected with the microcomputer MO2 by an appropriate link.

The means of intervention 5 can advantageously comprise a guide arm 51 for the active member 50.

The guide arm 51 can advantageously consist of several hinged segments, each hinge being equipped with motors and resolvers making it possible to secure control of movement of the end of the support arm and the positional plotting of this same end and therefore of the active member 50 according to six degrees of freedom with respect to the carriage 52. The six degrees of freedom comprise, of course, three translational degrees of freedom with respect to a reference frame tied to the carriage 52 and three rotational degrees of freedom along these same axes.

Thus, the support arm 51 and the member 50 are marked in terms of instantaneous position with respect to the second reference frame R2, on the one hand by way of the positional plot of the mobile carriage 52 and, on the other hand, by way of the resolvers associated with each hinge of the support arm 51.
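By way of non-limiting illustration, the positional plot of the active member in R2 amounts to composing the carriage translation with one homogeneous transform per hinge, each hinge angle being read from its resolver. A minimal Python sketch (the hinge parameterization and function names are illustrative assumptions, not the disclosed mechanism):

```python
import numpy as np

def hinge_transform(axis, angle, offset):
    """4x4 homogeneous transform of one hinge: rotation of `angle` about
    a unit `axis`, followed by the fixed `offset` to the next segment."""
    axis = np.asarray(axis, dtype=float)
    K = np.array([[0, -axis[2], axis[1]],
                  [axis[2], 0, -axis[0]],
                  [-axis[1], axis[0], 0]])
    # Rodrigues' rotation formula
    R = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = offset
    return T

def member_position(carriage_pos, hinge_params):
    """Position of the active member in R2: the carriage translation
    along the table composed with each hinge's transform in turn."""
    T = np.eye(4)
    T[:3, 3] = np.asarray(carriage_pos, dtype=float)
    for axis, angle, offset in hinge_params:
        T = T @ hinge_transform(axis, angle, offset)
    return T[:3, 3]
```

With all hinge angles at zero the member position is simply the carriage position plus the sum of the segment offsets, which gives an easy sanity check of the chain.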

In the case of an intracranial neurosurgical intervention, the active member 50 can be removed and can consist of a trephining tool, a needle, a radioactive or chemical implant, a laser or radioisotope emission head or an endoscopic viewing system. These various members will not be described since they correspond to instruments normally used in neurosurgery.

The materializing of the modeled direction of intervention can be effected by means of the laser emitter EL. This sighting being performed, the guide arm 51 can then be brought manually or in steered manner into superposition with the direction of intervention Δ.

In the case of manual positioning, the resolvers associated with the sighting member OV and the laser emitter EL, if appropriate, make it possible to record the path of the sighting direction, constituting in particular the actual direction of intervention, on the representation of the nonhomogeneous structure in the dynamic display means 1.

Furthermore, as will be described later and in preferential manner, the intervening surgeon will be able firstly to define a simulated intervention path and steer thereto the movements of the active member 50 in the nonhomogeneous structure in order effectively to secure all or part of the intervention.

In this case, the progress of the intervention tool 50 is then steered directly to the simulated path (data ORI, Δ) by involving the reference frame transfer means 11 in order to express the path in the reference frame R2.

A more detailed description of the implementation of the operational mode of the system of the invention will now be described in connection with FIGS. 5a and 5b.

According to FIG. 5a, the first step consists in obtaining and organizing in memory the two-dimensional image data (step 100). Firstly, the nonhomogeneous structure SNH is prepared. In the case of a neurosurgical intervention for example, this means that the patient's head can be equipped with marks constituting the base points of the reference structure SR. These marks can be produced by means of points consisting of a dye partially absorbing the X-rays, such as a radiopaque dye.

The abovementioned marks are implanted by the surgeon on the patient's head at notable points of the latter [sic], and images can then be taken of the nonhomogeneous structure SNH by tomography for example, by means of an apparatus of the X-ray scanner type.

This operation will not be described in detail since it corresponds to conventional operations in the field of medical imaging.

The two-dimensional image data obtained are then constituted as digitized data in the file 10, these data being themselves marked with respect to the reference frame R1 and making it possible, on demand, to restore the two-dimensional images onto the dynamic display means 1, these images representing superimposed sections of the nonhomogeneous structure SNH.

From the digitized image data available to the surgeon, the latter then proceeds, as indicated at 101 in FIG. 5a, to select the structures of interest of the abovementioned images.

The purpose of this step is to facilitate the work of the surgeon by forming three-dimensional images which contain only the contours of the elements of the structure which are essential for geometrical definition and real-time monitoring of the progress of the intervention.

In the case where the nonhomogeneous structure SNH consists of the head of a patient, an analysis of the two-dimensional image data makes it possible, from values of optical density of the corresponding image-points, straight-away to extract the contours of the skull, to check the distance scales, etc.

Preferably, the abovementioned operations are performed on a rectangle of interest for a given two-dimensional image, this making it possible, by moving the rectangle of interest, to cover the whole of the image.

The above analysis is performed by means of suitable software which thus makes it possible to extract and vectorize the contours of the structures which will be modeled in the representations RSNH and RSR.

The structures modeled in the case of a neurosurgical intervention are for example the skull, the cerebral ventricles, the tumor to be observed or treated, the falx cerebri, and the various functional regions.

According to a feature of the interactive system of the invention, the surgeon may have available a digitizing table or other graphics peripheral making it possible, for each displayed two-dimensional image, to rectify or complete the definition of the contour of a particular region of interest.

It will be noted finally that by superimposing the extracted contours on the displayed two-dimensional image, the surgeon will be able to validate the extractions carried out.

The extracted contours are next processed by sampling points to obtain their coordinates in the reference frame R1, it being possible to constitute these coordinates as an ASCII type file. This involves step 102 for generating the three-dimensional data base.

This step is followed by a step 103 of reconstructing the three-dimensional model. This step consists firstly, with the aid of the CAD type software, in carrying out an extrapolation between the various sectional planes on the basis of the contours of the structures of interest constituted as vectorized two-dimensional images.

The abovementioned extrapolation is carried out preferably by means of a “B-spline” type algorithm which seems best suited. This extrapolation transforms a discrete item of information, namely the successive sections obtained by means of the scanner analysis, into a continuous model permitting three-dimensional representation of the volume envelopes of the structures.
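By way of non-limiting illustration, the passage from discrete sections to a continuous envelope can be sketched with a uniform cubic B-spline taking corresponding contour points on successive sections as control points. The sketch below is a simplified stand-in for the extrapolation actually used; the basis-matrix evaluation and function name are illustrative:

```python
import numpy as np

def cubic_bspline_curve(control_points, samples_per_span=10):
    """Uniform cubic B-spline approximating an ordered sequence of
    control points (e.g. corresponding contour points on successive
    tomographic sections), yielding a continuous envelope curve."""
    P = np.asarray(control_points, dtype=float)
    # Uniform cubic B-spline basis matrix (with its 1/6 factor)
    M = np.array([[-1,  3, -3, 1],
                  [ 3, -6,  3, 0],
                  [-3,  0,  3, 0],
                  [ 1,  4,  1, 0]]) / 6.0
    out = []
    for i in range(len(P) - 3):
        G = P[i:i + 4]  # four consecutive control points per span
        for t in np.linspace(0.0, 1.0, samples_per_span, endpoint=False):
            T = np.array([t**3, t**2, t, 1.0])
            out.append(T @ M @ G)
    # Note: a B-spline approximates (smooths) rather than interpolates
    # its control points, consistent with a non-zero section thickness.
    return np.array(out)
```

Evaluating the curve at many parameter values per span yields the dense, continuous model from which the volume envelopes are displayed.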

It should be noted that the reconstruction of the volumes constituting the structures of interest introduces an approximation related in particular to the spacing and non-zero thickness of the acquisition sections. An important characteristic of the invention, as explained in detail elsewhere, is on the one hand to minimize the resulting uncertainties in the patient-model correlation, and on the other hand to take into account the residual uncertainties.

The CAD type software used possesses standard functions which enable the model to be manipulated in space by displaying it from different viewpoints according to a criterion defined by the surgeon (step 104).

The software can also reconstruct sectional representation planes of the nonhomogeneous structure which differ from the planes of the images from the file 10, this making it possible in particular to develop knowledge enhancing the data for the representation by building up a neuro-anatomical map.

The surgeon can next (step 105) determine a model of intervention strategy taking into account the modeled structures of interest, by evaluating the distance and angle ratios on the two- and three-dimensional representations displayed.

This intervention strategy will consist, in actual fact, on the one hand in localizing the tumor and in associating therewith a “target point”, which will subsequently be able to substitute for the origin common to all the objects (real and images) treated by the system, and on the other hand in determining a simulated intervention path respecting to the maximum the integrity of the structures of interest. This step can be carried out “in the office”, involving only the workstation.

Once this operation is performed and prior to the intervention, the following phase consists in implementing the steps required to establish as exact as possible a correlation between the structure SNH (real world) and the representation RSNH (computer world). This involves steps 106 to 109 of FIG. 5b.

Firstly, as represented in FIG. 5b at step 107, marking of the base points of the reference structure SR with respect to the second reference frame is carried out with the aid of the marker means 3, by delivering to the system the coordinates X2, Y2, Z2 of the said base points.

The following step 106 consists in identifying on the representations RSNH and RSR displayed on the screen the images of the base points which have just been marked. More precisely, with the aid of appropriate graphics peripherals, these representations (images) of the base points are selected one by one, the workstation supplying on each occasion (in this instance to the computer MO2) the coordinates of these points represented in the reference frame R1.

Thus the computer MO2 has available a first set of three-dimensional coordinates representing the position of the base points in R2, and a second set of three-dimensional coordinates representing the position of the representations of the base points in R1.

According to an essential feature of the invention, these data will be used to elaborate at 108, 109, tools for reference frame transfer (from R1 to R2 and vice versa) by calling upon an intermediate reference frame determined from the base points and constituting an intermediate reference frame specific to the reconstructed model.

More precisely, the intermediate reference frame is constructed from three base points selected so that, in this reference frame, the coordinates of the other base points after transfer from R2 and the coordinates of the representations of these other base points after transfer from R1 are expressed with the greatest consistency and minimum distortion.

When the step of elaborating the reference frame transfer tools is concluded, these tools can be used by the system to secure optimal coupling between the real world and the computer world (step 1110).

Furthermore, according to a subsidiary feature of the present invention, the system can create on the display means a representation of the nonhomogeneous structure and of the intervention member which takes account of the deviations and distortions remaining after the “best” reference frame transfer tools have been selected (residual uncertainties). More precisely, from these deviations the calculating means can deduce a standard error likely to appear in the mutual positioning between the representation of the nonhomogeneous structure and the representation of elements (tools, sighting axes, etc.) referenced on R2 when the reference frame transfer tools are used. This residual uncertainty, which may in practice be given substance through an error matrix, can be used for example to represent certain contours (tool, structures of interest to be avoided during the intervention, etc.) with dimensions larger than those which would normally be represented starting from the three-dimensional data base or with the aid of coordinates marked in R2, the said larger dimensions being deduced from the “normal” dimensions by involving the error matrix. For example, if the member were normally represented, in transverse section, by a circle of diameter D1, a circle of diameter D2>D1 can be represented instead, with the difference D2−D1 deduced from the standard error value. In this way, when a direction of intervention is selected so as to avoid traversing certain structures of interest, taking into account this “enlarged” size of the intervention tool eliminates any risk of the member accidentally traversing these structures because of the abovementioned errors.
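By way of illustration only (this code forms no part of the patent disclosure), the “enlarged” contour can be sketched as follows; the function name and the symmetric two-sided margin are our assumptions:

```python
# Illustrative sketch (not the patent's implementation): inflate the displayed
# tool diameter by the residual registration error so that collision checks
# against structures of interest are conservative.

def enlarged_diameter(d1: float, standard_error: float) -> float:
    """Return the display diameter D2 > D1, the difference D2 - D1 being
    deduced from the standard error of the reference frame transfer.
    The symmetric margin (error applied on each side of the tool axis)
    is our assumption, not stated in the patent."""
    return d1 + 2.0 * standard_error

# Example: a 2 mm needle with a 0.4 mm standard registration error is
# displayed as an enlarged circle for avoidance checks.
d2 = enlarged_diameter(2.0, 0.4)
```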

Back at step 105, and as will be seen in more detail with reference to FIGS. 9a and 9b, the reference origin of intervention ORI and the direction of intervention Δ, that is to say the simulated intervention path, can be determined according to various procedures.

According to a first procedure, the trajectory can be defined from two points, namely an entry point PE (FIG. 3) and a target point, that is to say substantially the center of the structure of interest consisting of the tumor to be observed or treated. Initially, these two points are localized on the model represented on the screen.

According to a second methodology, the trajectory can be determined from the abovementioned target point and from a direction which takes account of the types of structures of interest and of their positions with a view to optimally respecting their integrity.

After the abovementioned step 108, the surgeon can at step 1110 perform the actual intervention.

The intervention can advantageously be performed by steering the tool or active member over the simulated intervention path, determined in step 1110.

As a variant, given that the support arm 51 for the active member, equipped with its resolvers, continuously delivers the coordinates in R2 of the said active member to the system, it is also possible to perform the operation manually or semi-manually, by monitoring on the screen the position and motions of a representation of the tool and by comparing them with the simulated, displayed intervention path.

It will furthermore be noted that the modeled direction of intervention can be materialized with the aid of the laser beam described earlier, the positioning of the latter (with respect to R2) being likewise carried out by virtue of the reference frame transfer tools.

Certain functional features of the system of the invention will now be described in further detail with reference to FIGS. 6, 7, 8, 9a and 9b.

The module for elaborating the reference frame transfer tools (steps 108, 109 of FIG. 5b) will firstly be described with reference to FIG. 6.

This module comprises a first sub-module 1001 for acquiring three points A, B, C, the images of the base points of SR on the representation RSNH (the coordinates of these points being expressed in the computer reference frame R1), by successive selections of these points on the representation. To this effect, the surgeon is led, by means of a graphics interface such as a “mouse” to point successively at the three selected points A, B, C.

The module for preparing the transfer tools also comprises a second sub-module, denoted 1002, for creating a unit three-dimensional orthogonal matrix M, this matrix being characteristic of a right-handed orthonormal basis represented by three unit vectors {right arrow over (i)}, {right arrow over (j)}, {right arrow over (k)}, which define an intermediate reference frame tied to R1.

The unit vectors {right arrow over (i)}, {right arrow over (j)} and {right arrow over (k)} are given by the relations:

{right arrow over (j)}={right arrow over (AB)}/∥{right arrow over (AB)}∥
{right arrow over (k)}=({right arrow over (BA)} Λ {right arrow over (BC)})/∥{right arrow over (BA)} Λ {right arrow over (BC)}∥
{right arrow over (i)}={right arrow over (j)} Λ {right arrow over (k)}
where ∥ ∥ designates the norm of the relevant vector.

In the above relations, the sign “Λ” designates the vector product of the relevant vectors.
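By way of illustration only (this code forms no part of the patent disclosure), the construction of the matrix M from the three selected points A, B, C can be sketched in pure Python; all function names are ours:

```python
# Illustrative sketch (not the patent's implementation): build the unit
# orthogonal matrix M whose rows are the right-handed orthonormal basis
# (i, j, k) derived from three non-collinear points A, B, C.
import math

def sub(p, q):
    return [p[0] - q[0], p[1] - q[1], p[2] - q[2]]

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def unit(u):
    n = math.sqrt(sum(c * c for c in u))
    return [c / n for c in u]

def basis_matrix(a, b, c):
    """Rows i, j, k with j = AB/||AB||, k = (BA ^ BC)/||BA ^ BC||, i = j ^ k."""
    j = unit(sub(b, a))                     # j = AB / ||AB||
    k = unit(cross(sub(a, b), sub(c, b)))   # k = (BA ^ BC) / ||BA ^ BC||
    i = cross(j, k)                         # i = j ^ k (already unit length)
    return [i, j, k]

# Example with three non-collinear points:
M = basis_matrix([0, 0, 0], [1, 0, 0], [0, 1, 0])
```

The same routine, applied to the sensed points D, E, F, would yield the matrix N of sub-module 1004 described below.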

Similarly, the module for preparing the transfer tools comprises a third sub-module, denoted 1003, for acquiring three base points D, E, F, of the structure SR, these three points being those whose images on the model are the points A, B, C respectively. For this purpose, the surgeon, for example by means of the tactile tip 30, successively senses these three points to obtain their coordinates in R2.

The sub-module 1003 is itself followed, as represented in FIG. 6, by a fourth sub-module 1004 for creating a unit three-dimensional orthogonal matrix N, characteristic of a right-handed orthonormal basis comprising three unit vectors {right arrow over (i)}′, {right arrow over (j)}′, {right arrow over (k)}′ and which is tied to the second reference frame R2 owing to the fact that the nonhomogeneous structure SNH is positionally tied with respect to this reference frame.

The three unit vectors {right arrow over (i)}′, {right arrow over (j)}′, {right arrow over (k)}′ are defined by the relations:

{right arrow over (j)}′={right arrow over (DE)}/∥{right arrow over (DE)}∥
{right arrow over (k)}′=({right arrow over (ED)} Λ {right arrow over (EF)})/∥{right arrow over (ED)} Λ {right arrow over (EF)}∥
{right arrow over (i)}′={right arrow over (j)}′ Λ {right arrow over (k)}′

As indicated above, whereas the base points of the reference structure can be marked in R2 with high precision, their representation in the computer base R1 is marked with a certain margin of error, given on the one hand the non-zero thickness (typically from 2 to 3 mm) of the slices represented by the two-dimensional images from the file 10, and on the other hand (in general to a lesser extent) the definition of each image element or pixel of a section.

According to the invention, once a pair of transfer matrices M, N has been elaborated with selected points A, B, C, D, E, F, it is sought to validate this selection by using one or more additional base points; more precisely, for the or each additional base point, this point is marked in R2 with the aid of the probe 30, the representation of this point is marked in R1 after selection on the screen, and then the matrices N and M are applied respectively to the coordinates obtained, in order to obtain their expressions in the bases ({right arrow over (i)}′, {right arrow over (j)}′, {right arrow over (k)}′) and ({right arrow over (i)}, {right arrow over (j)}, {right arrow over (k)}) respectively. If these expressions are in good agreement, these two bases can be regarded as a single intermediate reference frame, thus securing as exact as possible a mathematical coupling between the computer reference frame R1 tied to the model and the “real” reference frame R2 tied to the patient.

In practice, the module for elaborating the reference frame transfer tools can be designed to perform steps 1001 to 1004 in succession on triples of base points which differ on each occasion (for example, if four base points have been defined, associated with four representations in RSR, there are four possible triples), in order to perform the validation step 1005 for each of these selections and finally in order to choose the triple for which the best validation is obtained, that is to say for which the deviation between the abovementioned expressions is smallest. This triple defines the “best plane” mentioned elsewhere in the description, and results in the “best” transfer matrices M and N.
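The triple-selection loop of steps 1001 to 1005 can be sketched as follows (an illustration under our own naming, not the patented code): each candidate triple yields a pair of bases, the spare base points are expressed in both, and the triple with the smallest deviation is retained.

```python
# Illustrative sketch: choose the "best plane" by trying every triple of
# base points and measuring how well the remaining (spare) points agree
# after transfer into the two intermediate bases. All names are ours.
import itertools
import math

def sub(p, q):
    return [p[i] - q[i] for i in range(3)]

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def unit(u):
    n = math.sqrt(sum(c * c for c in u))
    return [c / n for c in u]

def basis(a, b, c):
    j = unit(sub(b, a))
    k = unit(cross(sub(a, b), sub(c, b)))
    return [cross(j, k), j, k]          # rows i, j, k

def apply_m(m, p):
    return [sum(m[r][c] * p[c] for c in range(3)) for r in range(3)]

def best_triple(points_r1, points_r2):
    """points_r1: images of the base points in R1; points_r2: the same base
    points sensed in R2 (same order). Returns the winning index triple and
    its total deviation over the spare points."""
    best, best_dev = None, float("inf")
    n = len(points_r1)
    for t in itertools.combinations(range(n), 3):
        m1 = basis(*(points_r1[i] for i in t))
        m2 = basis(*(points_r2[i] for i in t))
        spare = [s for s in range(n) if s not in t]
        # Deviation between each spare point expressed in the two bases,
        # relative to the first point of the triple.
        dev = sum(math.dist(apply_m(m1, sub(points_r1[s], points_r1[t[0]])),
                            apply_m(m2, sub(points_r2[s], points_r2[t[0]])))
                  for s in spare)
        if dev < best_dev:
            best, best_dev = t, dev
    return best, best_dev
```

With four base points this examines the four possible triples mentioned above; a perfectly rigid correspondence between the two point sets gives a deviation of zero.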

As a variant, it will be possible for the selection of the best plane to be made at least in part by the surgeon by virtue of his experience.

It should be noted that the reference frame transfer will only be concluded by supplementing the matrix calculation involving the matrices M, N with a transfer of origin, so as to create a new common origin for example at the center of the tumor to be observed or treated (point ORI). This transfer of origin is effected simply by appropriate subtraction of vectors, on the one hand on the coordinates in R1 and on the other hand on the coordinates in R2. These vectors to be subtracted are determined after localization of the center of the tumor on the representation.

Furthermore, the means described above for establishing the coupling between the patient's world and the model's world can also be used to couple the world of map data to that of the model, these data also being stored in the workstation and expressed in a different reference frame denoted R3. In this case, since these data contain no specific visible mark, the elaboration of matrices described earlier is performed by substituting for these marks the positions of notable points of the patient's head. These may be temporal points, the frontal median point, the apex of the skull, the centers of gravity of the orbits of the eyes, etc.

The corresponding points of the model can be obtained either by selection by mouse or graphics tablet on the model, or by sensing on the patient himself and then using the transfer matrices.

The above step of elaborating the reference frame transfer tools, conducted in practice by the calculating means 4, makes it possible subsequently to implement the reference frame transfer means (FIGS. 7 and 8).

With reference to FIG. 7, the first transfer sub-module 201 comprises a procedure denoted 2010 for acquiring the coordinates XM, YM, ZM, expressed in R1, of the point to be transferred, by selecting on the representation.

The procedure 2010 is followed by a procedure 2011 for calculating the coordinates XP, YP, ZP (expressed in R2) of the corresponding real point on the patient through the transformation:

    • {XP, YP, ZP}=M*N⁻¹*{XM, YM, ZM}, where M*N⁻¹ represents the product of the matrix M and the inverse of the matrix N.

The procedure 2011 is followed by a processing procedure 2012 utilizing the calculated coordinates XP, YP, ZP, for example to indicate the corresponding point on the surface of the structure SNH by means of the laser emission system EL, or again to secure the intervention at the relevant point with coordinates XP, YP, ZP (by steering the active member).

Conversely, in order to secure a transfer from SNH to RSNH, the second sub-module 202 comprises (FIG. 8) a procedure denoted 2020 for acquiring on the structure SNH the coordinates XP, YP, ZP (expressed in R2) of a point to be transferred.

These coordinates can be obtained by means of the tactile tip 30 for example. The procedure 2020 is followed by a procedure 2021 for calculating the corresponding coordinates XM, YM, ZM in R1 through the transformation:

{XM, YM, ZM}=N*M⁻¹*{XP, YP, ZP}

A procedure 2022 next makes it possible to effect the displaying of the point with coordinates XM, YM, ZM on the model or again of a straight line or of a plane passing through this point and furthermore meeting other criteria.
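Since M and N are unit orthogonal matrices, their inverses are simply their transposes; the two transformations of procedures 2011 and 2021 can therefore be sketched as follows (rotation part only, the origin translation described later being omitted; all names are ours, not the patent's):

```python
# Illustrative sketch of the two transfer procedures, for the rotation part
# of the mapping between R1 (model) and R2 (patient). For unit orthogonal
# matrices the inverse equals the transpose.

def transpose(m):
    return [[m[c][r] for c in range(3)] for r in range(3)]

def matmul(a, b):
    return [[sum(a[r][k] * b[k][c] for k in range(3)) for c in range(3)]
            for r in range(3)]

def apply_m(m, p):
    return [sum(m[r][c] * p[c] for c in range(3)) for r in range(3)]

def model_to_patient(M, N, xm):
    """{XP} = M * N^-1 * {XM}  (procedure 2011; N^-1 = N transposed)."""
    return apply_m(matmul(M, transpose(N)), xm)

def patient_to_model(M, N, xp):
    """{XM} = N * M^-1 * {XP}  (procedure 2021; M^-1 = M transposed)."""
    return apply_m(matmul(N, transpose(M)), xp)
```

Note that the two mappings are exact inverses of each other, so a point transferred from the model to the patient and back returns to its original coordinates, which is precisely the check performed at any moment via sub-modules 201 and 202.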

It will be noted here that the two sub-modules 201, 202 can be used by the surgeon at any moment for the purpose of checking the valid nature of the transfer tools; in particular, it is possible to check at any time that a real base point, with coordinates known both in R2 and R1 (for example a base point of SR or an arbitrary notable point of the structure SNH visible on the images), correctly relocates with respect to its image after transferring the coordinates in step 2011.

In the event of an excessive difference, a new step of elaboration of the transfer tools is performed.

Furthermore, the sub-modules 201, 202 can be designed also to take into account the residual uncertainty discussed above, so as for example to represent a sensed point on the screen not as a single point, but in the form of a circle or a sphere representing the said uncertainty.

From a simulated intervention path, for example on the representation RSNH, or from any other straight line selected by the surgeon, the invention furthermore enables the model to be represented on the screen from a viewpoint corresponding to this straight line. Thus the third transfer subroutine comprises, as represented in FIGS. 9a and 9b, a first module 301 for visualizing the representation in a direction given by two points and a second module 302 for visualizing the representation in a direction given by an angle of elevation and an angle of azimuth.

The first module 301 for visualizing the representation in a direction given by two points comprises a first sub-module denoted 3010 permitting acquisition of the two relevant points which will define the selected direction. The coordinates of these points are expressed in the reference frame R1, these points having either been acquired previously on the nonhomogeneous structure SNH for example by means of the tactile tip 30 and then subjected to the reference frame transfer, or chosen directly on the representation by means of the graphics interface of the “mouse” type.

The first sub-module 3010 is followed by a second sub-module denoted 3011 permitting the creation of a unit, orthogonal three-dimensional matrix V characteristic of a right-handed orthonormal basis {right arrow over (i)}″, {right arrow over (j)}″, {right arrow over (k)}″ the unit vectors {right arrow over (i)}″, {right arrow over (j)}″, {right arrow over (k)}″, being determined through the relations:

{right arrow over (k)}″={right arrow over (AB)}/∥{right arrow over (AB)}∥; {right arrow over (i)}″·{right arrow over (k)}″=0; {right arrow over (i)}″·{right arrow over (z)}=0; ∥{right arrow over (i)}″∥=1; {right arrow over (j)}″={right arrow over (k)}″ Λ {right arrow over (i)}″
where “Λ” represents the vector product and “.” symbolizes the scalar product.

The sub-module 3011 is followed by a routine 3012 which, for all the points of the entities (structures of interest) of the three-dimensional data base with coordinates XW, YW, ZW in R1, calls a sub-routine 3013 securing the conversion into the orthonormal basis ({right arrow over (i)}″, {right arrow over (j)}″, {right arrow over (k)}″) by the relation:

{XV, YV, ZV}=V*{XW, YW, ZW}

The subroutine 3013 is then followed by a subroutine 3014 for displaying the plane i″, j″, the subroutines 3013 and 3014 being called up for all the points, as symbolized by the arrow returning to block 3012 in FIG. 9a.

When all the points have been processed, an output module 3015 permits return to the general program, which will be described later in the description. It is understood that this module enables two-dimensional images to be reconstructed in planes perpendicular to the direction defined by A and B.
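The viewing basis of sub-module 3011 can be sketched as follows (an illustration under our own naming; our reading of the relations is that {right arrow over (i)}″ is the horizontal unit vector, i.e. orthogonal to both the viewing direction and the z axis):

```python
# Illustrative sketch of module 301: build the viewing basis V from two
# points A, B, then express data-base points in it; the (i'', j'')
# coordinates give the screen plane, the k'' coordinate the depth.
import math

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def unit(u):
    n = math.sqrt(sum(c * c for c in u))
    return [c / n for c in u]

def view_basis(a, b):
    """Rows i'', j'', k'': k'' along AB; i'' unit, orthogonal to k'' and to
    z (horizontal); j'' = k'' ^ i''. Degenerate when AB is parallel to z."""
    k = unit([b[i] - a[i] for i in range(3)])
    i = unit(cross([0.0, 0.0, 1.0], k))
    j = cross(k, i)
    return [i, j, k]

def project(v, p):
    """Coordinates of p in the viewing basis: the first two components lie
    in the (i'', j'') display plane, the third is the depth along k''."""
    return [sum(v[r][c] * p[c] for c in range(3)) for r in range(3)]
```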

In the same way, the second module 302 (FIG. 9b) for visualizing the representation from a viewpoint given by an angle of elevation and an angle of azimuth comprises a first sub-module 3020 for acquiring the two angles in the representation frame of reference.

The selection of the angles of elevation and of azimuth can be made by selecting from a predefined data base, by moving software cursors associated with each view, or else by modification relative to a current direction, such as the modeled direction of intervention. The sub-module 3020 is itself followed by a second sub-module 3021 for creating a unit orthogonal three-dimensional matrix W characteristic of a right-handed orthonormal basis of unit vectors {right arrow over (i)}′″, {right arrow over (j)}′″, {right arrow over (k)}′″. They are defined by the relations:
{right arrow over (i)}″′·{right arrow over (k)}″′=0;
{right arrow over (k)}″′·{right arrow over (z)}=sin(azimuth);
{right arrow over (j)}″′·{right arrow over (z)}=0;
{right arrow over (i)}″′·{right arrow over (y)}=cos(elevation);
{right arrow over (i)}″′·{right arrow over (x)}=sin(elevation);
{right arrow over (j)}″′={right arrow over (k)}″′ Λ {right arrow over (i)}″′

A routine 3022 is then called for all the points of the entities of the three-dimensional data base with coordinates XW, YW, ZW, and enables a first sub-routine 3023 to be called permitting calculation of the coordinates of the relevant point in the right-handed orthonormal basis {right arrow over (i)}′″, {right arrow over (j)}′″, {right arrow over (k)}′″ through the transformation:

{XV, YV, ZV}=W*{XW, YW, ZW}

The sub-routine 3023 is itself followed by a sub-routine 3024 for displaying the plane i′″, j′″, the two sub-routines 3023 and 3024 then being called up for each point as symbolized by the return via the arrow to the block 3022 for calling the abovementioned routine. When all the points have been processed, an output sub-module 3025 permits a return to the general menu.
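One conventional way of realizing sub-module 3021 is sketched below; this is our reading of the elevation/azimuth relations rather than a literal transcription of the patent's formulas: the viewing direction {right arrow over (k)}′″ is derived from the two angles and the basis is then completed exactly as in the two-point case.

```python
# Illustrative sketch (our own reading, not the patent's implementation):
# derive the viewing direction from elevation and azimuth, then complete a
# right-handed orthonormal basis with a horizontal i''' vector.
import math

def cross(u, v):
    return [u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0]]

def unit(u):
    n = math.sqrt(sum(c * c for c in u))
    return [c / n for c in u]

def view_basis_angles(elevation, azimuth):
    """Rows i''', j''', k''' of the matrix W (angles in radians).
    Degenerate when the viewing direction is parallel to z
    (elevation = +/- pi/2)."""
    k = [math.cos(elevation) * math.cos(azimuth),
         math.cos(elevation) * math.sin(azimuth),
         math.sin(elevation)]
    i = unit(cross([0.0, 0.0, 1.0], k))   # horizontal by construction
    j = cross(k, i)
    return [i, j, k]
```

Whichever construction is used, the resulting matrix W is unit orthogonal, so the conversion of routine 3022 is a pure rotation of the data-base points.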

Of course, all of the programs, sub-routines, modules, sub-modules and routines described earlier are managed by a general “menu” type program so as to permit interactive driving of the system by screen dialogue with the intervening surgeon through specific screen pages.

A more specific description of a general flow diagram illustrating this general program will now be given in connection with FIGS. 10a and 10b.

Thus, in FIG. 10a has been represented in succession a screen page 4000 relating to the loading of data from the digitized file 10, followed by a screen page 4001 making it possible to secure the parameterizing of the grey scales of the display on the dynamic display means 1 and to calibrate the image, for example.

The screen page 4001 is followed by a screen page 4002 making it possible to effect the generation of a global view and then a step or screen page 4003 makes it possible to effect an automatic distribution of the sections on the screen of the workstation.

A screen page 4004 makes it possible to effect a manual selection of sections; a screen page 4005 then makes it possible to effect the selection of the strategy (search for the entry points and for the possible directions of intervention, first localizing of the target (tumor, etc.) to be treated, etc.), as defined earlier, and to select the position and horizontal, sagittal and frontal distribution of the sections.

A screen page 4006 also makes it possible to effect a display of the settings of a possible stereotaxic frame.

It will be recalled that the reference structure SR advantageously replaces the stereotaxic frame formerly used to effect the marking of position inside the patient's skull.

There may furthermore be provided a screen page 4007 for choosing strategic sections by three-dimensional viewing, on selection by the surgeon, and then at 4008 the aligning of the references of the peripherals (tool, sighting members, etc.) with the aid of the probe 30.

A screen page 4009 is also provided to effect the search for the base points on the patient with the aid of the said probe, following which the steps of construction of the reference frame transfer tools and of actual reference frame transfer are performed, preferably in a user-transparent manner.

Another screen page 4010 is then provided, so as to effect the localizing of the target on the representation (for example a tumor to be observed or treated in the case of a neurosurgical intervention) in order subsequently to determine a simulated intervention path.

Then a new screen page 4011 makes it possible to effect the setting of the guides for the tool on the basis of this simulated path before opening up the skin and bone flaps on the patient's skull.

Then a new localizing step 4012 makes it possible to check whether the position of the guides corresponds correctly to the simulated intervention path.

The screen page 4012 is followed by a so-called intervention screen page, the intervention being performed in accordance with step 1110 of FIG. 5b.

A more detailed description of the interactive dialogue between the surgeon and the system during a surgical, and in particular a neurosurgical, intervention will follow with reference to FIG. 10c and to all of the preceding description.

The steps of FIG. 10c are also integrated in the general program mentioned earlier; there are undertaken in succession a first phase I (preparation of the intervention), then a second phase II (prior to the actual intervention, the patient being placed in a condition for intervention and the reference structure SR being tied to the second reference frame R2), then a third phase III (intervention) and finally a post-intervention phase IV.

With a view to preparing the intervention, the system requests the surgeon (step 5000) to choose the elementary structures of interest (for example bones of the skull, ventricles, vascular regions, the tumor to be explored or treated, and the images of the marks constituting in the first reference frame the representation RSR).

The choice of the elementary structures of interest is made on the display of the tomographic images, for example, called up from the digitized file 10.

The system next performs, at step 5001, a modeling of the structures of interest, as described earlier. Then, the nonhomogeneous structure having been thus constituted as a three-dimensional model RSNH displayed on the screen, the intervening surgeon is then led to perform a simulation by three-dimensional imaging, at step 5002, with a view to defining the intervention path of the tool 50.

During phase II the patient being placed in a condition for intervention and his head and the reference structure SR being tied to the second reference frame R2, the surgeon performs at step 5003 a search for the position of the marks M1 to M4 constituting base points of the reference structure in the second reference frame R2, and then during a step 5004, performs a search for the position of the sighting systems, visualizing member OV, or of the tools and intervention instruments 50, still in the second reference frame R2, so as, if appropriate, to align these implements with respect to R2.

The system then performs the validation of the intervention/patient spaces and representation by three-dimensional imaging in order to determine next the common origin of intervention ORI. In other words, the matrix reference frame transfer described above is supplemented with the necessary origin translations (origins O1 and O2 aligned on ORI).

This operation is performed as described earlier.

Phase III corresponds to the intervention, during which the system effects at step 5006 a permanent coupling in real time between the direction of aim of the active member 50, and/or of the direction of aim of the sighting member OV (and if appropriate of the laser beam), with the direction of aim (of observation) simulated by three-dimensional imaging on the display means 1, and vice versa.

In the following step 5007, the coupling is effected of the movements and motions of the intervention instrument with their movements simulated by three-dimensional imaging, with automatic or manual conduct of the intervention.

As noted at 5008, the surgeon can be supplied with a permanent display of the original two-dimensional sectional images in planes specified with respect to the origin ORI and to the direction of intervention. Such a display enables the surgeon at any time to follow the progress of the intervention in real time and to be assured that the intervention is proceeding in accordance with the simulated intervention.

In phase IV, which is executed after the intervention, the system effects a saving of the data acquired during the intervention, this saving making it possible subsequently to effect a comparison, in real time or deferred, in the event of successive interventions on the same patient.

Furthermore, the saved data make it possible to effect a playback of the operations carried out with the option of detailing and supplementing the regions traversed by the active member 50.

Thus, a particularly powerful interactive system for local intervention has been described.

Thus, the system which is the subject of the present invention makes it possible to represent a model containing only the essential structures of the nonhomogeneous structure, this facilitating the work of preparation and of monitoring of the intervention by the surgeon.

Moreover, the system, by virtue of the algorithms used and in particular by minimizing the distortion between the real base points and their images in the 2D sections or the maps, makes it possible to establish a two-way coupling between the real world and the computer world through which the transfer errors are minimized, making possible concrete exploitation of the imaging data in order to steer the intervention tool.

To summarize, the system makes possible an interactive medical usage serving not only to create a three-dimensional model of the nonhomogeneous structure but also to permit a marking in real time with respect to the internal structures and to guide the surgeon in the intervention phase.

More generally, the invention makes it possible to end up with a coherent system in respect of:

    • the two-dimensional imaging data (scanner sections, maps, etc.)
    • the three-dimensional data base;
    • the data supplied by the marker means 3 in the reference frame R2;
    • the coordinate data for the sighting systems and intervention tools;
    • the real world of the patient on the operating table.

Accordingly, the options offered by the system are, in a non-limiting manner, the following:

    • the tools and their positions can be represented on the screen;
    • the position of a point on the screen can be materialized on the patient for example with the aid of the laser emission device EL;
    • the orientation and the path of a tool such as a needle can be represented on the screen and materialized on the patient optically (laser emission) or mechanically (positioning of the guide-arm in which the tool is guided in translation);
    • an image of the patient, yielded for example by a system for taking pictures if appropriate in relief, can be superimposed on the three-dimensional representation modeled on the screen; thus, any change in the soft external parts of the patient can be visualized as compared with the capture by the scanner;
    • it being possible for the surgeon's field of view given by a sighting member (such as a surgical microscope) to be referenced with respect to R2, the direction of visualization of the model on the screen can be made identical to the real sight by the sighting member;
    • finally, the three-dimensional images, normally displayed on the screen in the preceding description, may as a variant be introduced into the surgeon's microscope so as to obtain the superposition of the real image and the representation of the model.
US44310057 May 198114 Feb 1984Mccormick Laboratories, Inc.Method of and apparatus for determining very accurately the position of a device inside biological tissue
US448581530 Aug 19824 Dec 1984Kurt AmplatzDevice and method for fluoroscope-monitored percutaneous puncture treatment
US450667610 Sep 198226 Mar 1985Duska Alois ARadiographic localization technique
US454395925 Jan 19851 Oct 1985Instrumentarium OyDiagnosis apparatus and the determination of tissue structure and quality
US454820827 Jun 198422 Oct 1985Medtronic, Inc.Automatic adjusting induction coil treatment device
US457183416 May 198525 Feb 1986Orthotronics Limited PartnershipKnee laxity evaluator and motion module/digitizer arrangement
US457219818 Jun 198425 Feb 1986Varian Associates, Inc.Catheter for use with NMR imaging systems
US45835384 May 198422 Apr 1986Onik Gary MMethod and apparatus for stereotaxic placement of probes in the body utilizing CT scanner localization
US458457717 Oct 198322 Apr 1986Brookes & Gatehouse LimitedAngular position sensor
US460897720 Dec 19822 Sep 1986Brown Russell ASystem using computed tomography as for selective body treatment
US461386613 May 198323 Sep 1986Mcdonnell Douglas CorporationThree dimensional digitizer with electromagnetic coupling
US461792528 Sep 198421 Oct 1986Laitinen Lauri VAdapter for definition of the position of brain structures
US461897821 Oct 198321 Oct 1986Cosman Eric RMeans for localizing target coordinates in a body relative to a guidance system reference frame in any arbitrary plane as viewed by a tomographic image through the body
US46216286 Sep 198411 Nov 1986Ortopedia GmbhApparatus for locating transverse holes of intramedullary implantates
US46257187 Jun 19852 Dec 1986Howmedica International, Inc.Aiming apparatus
US463879810 Sep 198027 Jan 1987Shelden C HunterStereotactic method and apparatus for locating and treating or removing lesions
US464278625 May 198410 Feb 1987Position Orientation Systems, Ltd.Method and apparatus for position and orientation measurement using a magnetic field and retransmission
US46453434 Jun 198424 Feb 1987U.S. Philips CorporationAtomic resonance line source lamps and spectrophotometers for use with such lamps
US464950422 May 198410 Mar 1987Cae Electronics, Ltd.Optical position and orientation measurement techniques
US465173211 Apr 198524 Mar 1987Frederick Philip RThree-dimensional light guidance system for invasive procedures
US46535093 Jul 198531 Mar 1987The United States Of America As Represented By The Secretary Of The Air ForceGuided trephine samples for skeletal bone studies
US465997116 Aug 198521 Apr 1987Seiko Instruments & Electronics Ltd.Robot controlling system
US466097015 Oct 198428 Apr 1987Carl-Zeiss-StiftungMethod and apparatus for the contact-less measuring of objects
US46733522 Jan 198616 Jun 1987Markus HansenDevice for measuring relative jaw positions and movements
US468803719 Apr 198218 Aug 1987Mcdonnell Douglas CorporationElectromagnetic communications and switching system
US470104919 Jun 198420 Oct 1987B.V. Optische Industrie "De Oude Delft"Measuring system employing a measuring method based on the triangulation principle for the non-contact measurement of a distance from the surface of a contoured object to a reference level. _
US47053953 Oct 198410 Nov 1987Diffracto Ltd.Triangulation data integrity
US470540112 Aug 198510 Nov 1987Cyberware Laboratory Inc.Rapid three-dimensional surface digitizer
US470666517 Dec 198417 Nov 1987Gouda Kasim IFrame for stereotactic surgery
US470915627 Nov 198524 Nov 1987Ex-Cell-O CorporationMethod and apparatus for inspecting a surface
US471070823 Jul 19821 Dec 1987DevelcoMethod and apparatus employing received independent magnetic field components of a transmitted alternating magnetic field for determining location
US471941915 Jul 198512 Jan 1988Harris Graphics CorporationApparatus for detecting a rotary position of a shaft
US472205618 Feb 198626 Jan 1988Trustees Of Dartmouth CollegeReference display systems for superimposing a tomagraphic image onto the focal plane of an operating microscope
US472233625 Jan 19852 Feb 1988Michael KimPlacement guide
US47235449 Jul 19869 Feb 1988Moore Robert RHemispherical vectoring needle guide for discolysis
US472756513 Nov 198423 Feb 1988Ericson Bjoern EMethod of localization
US47339698 Sep 198629 Mar 1988Cyberoptics CorporationLaser probe for determining distance
US473703226 Aug 198512 Apr 1988Cyberware Laboratory, Inc.Surface mensuration sensor
US47377949 Dec 198512 Apr 1988Mcdonnell Douglas CorporationMethod and apparatus for determining remote object orientation and position
US47379213 Jun 198512 Apr 1988Dynamic Digital Displays, Inc.Three dimensional medical image display system
US47423569 Dec 19853 May 1988Mcdonnell Douglas CorporationMethod and apparatus for determining remote object orientation and position
US47428152 Jan 198610 May 1988Ninan Champil AComputer monitoring of endoscope
US474377022 Sep 198610 May 1988Mitutoyo Mfg. Co., Ltd.Profile-measuring light probe using a change in reflection factor in the proximity of a critical angle of light
US474377117 Jun 198510 May 1988View Engineering, Inc.Z-axis height measurement system
US474529019 Mar 198717 May 1988David FrankelMethod and apparatus for use in making custom shoes
US475048724 Nov 198614 Jun 1988Zanetti Paul HStereotactic frame
US475352830 Aug 198528 Jun 1988Quantime, Inc.Laser archery distance device
US476107219 Oct 19872 Aug 1988Diffracto Ltd.Electro-optical sensors for manual control
US476401613 Jun 198616 Aug 1988Anders BengtssonInstrument for measuring the topography of a surface
US477178711 Dec 198620 Sep 1988Richard Wolf GmbhUltrasonic scanner and shock wave generator
US47792126 Aug 198618 Oct 1988Levy Nessim IDistance measuring device
US47822391 Apr 19861 Nov 1988Nippon Kogaku K. K.Optical position measuring apparatus
US478848110 Mar 198729 Nov 1988Mitsubishi Denki Kabushiki KaishaNumerical control apparatus
US47919347 Aug 198620 Dec 1988Picker International, Inc.Computer tomography assisted stereotactic surgery system and method
US479335517 Apr 198727 Dec 1988Biomagnetic Technologies, Inc.Apparatus for process for making biomagnetic measurements
US479426225 Nov 198627 Dec 1988Yukio SatoMethod and apparatus for measuring profile of three-dimensional object
US47979077 Aug 198710 Jan 1989Diasonics Inc.Battery enhanced power generation for mobile X-ray machine
US48039768 Apr 198814 Feb 1989SynthesSighting instrument
US480426127 Mar 198714 Feb 1989Kirschen David GAnti-claustrophobic glasses
US48056152 Jul 198521 Feb 1989Carol Mark PMethod and apparatus for performing stereotactic surgery
US480969419 May 19877 Mar 1989Ferrara Vincent LBiopsy guide
US48212002 Apr 198711 Apr 1989Jonkopings Lans LandstingMethod and apparatus for manufacturing a modified, three-dimensional reproduction of a soft, deformable object
US482120623 Oct 198711 Apr 1989Photo Acoustic Technology, Inc.Ultrasonic apparatus for positioning a robot hand
US48217318 Dec 198718 Apr 1989Intra-Sonix, Inc.Acoustic image system and method
US482216326 Jun 198618 Apr 1989Robotic Vision Systems, Inc.Tracking vision sensor
US48250911 Feb 198825 Apr 1989Carl-Zeiss-StiftungOptoelectronic distance sensor with visible pilot beam
US48293733 Aug 19879 May 1989Vexcel CorporationStereo mensuration apparatus
US483677826 May 19876 Jun 1989Vexcel CorporationMandibular motion monitoring system
US483826521 Jan 198813 Jun 1989Cosman Eric RLocalization device for probe placement under CT scanner imaging
US484196730 Jan 198427 Jun 1989Chang Ming ZPositioning device for percutaneous needle insertion
US484577129 Jun 19874 Jul 1989Picker International, Inc.Exposure monitoring in radiation imaging
US48496929 Oct 198618 Jul 1989Ascension Technology CorporationDevice for quantitatively measuring the relative position and orientation of two bodies in the presence of metals utilizing direct current magnetic fields
US486033112 Sep 198822 Aug 1989Williams John FImage marker device
US48628932 Feb 19885 Sep 1989Intra-Sonix, Inc.Ultrasonic transducer
US486924711 Mar 198826 Sep 1989The University Of Virginia Alumni Patents FoundationVideo tumor fighting system
US487516527 Nov 198717 Oct 1989University Of ChicagoMethod for determination of 3-D structure in biplane angiography
US48754785 Apr 198924 Oct 1989Chen Harry HPortable compression grid & needle holder
US488456615 Apr 19885 Dec 1989The University Of MichiganSystem and method for determining orientation of planes of imaging
US488952613 Nov 198726 Dec 1989Magtech Laboratories, Inc.Non-invasive method and apparatus for modulating brain signals through an external magnetic or electric field to reduce pain
US489667315 Jul 198830 Jan 1990Medstone International, Inc.Method and apparatus for stone localization using ultrasound imaging
US490569813 Sep 19886 Mar 1990Pharmacia Deltec Inc.Method and apparatus for catheter location determination
US492345913 Sep 19888 May 1990Kabushiki Kaisha ToshibaStereotactics apparatus
US493105624 Oct 19885 Jun 1990Neurodynamics, Inc.Catheter guide apparatus for perpendicular insertion into a cranium orifice
US494530511 Apr 198931 Jul 1990Ascension Technology CorporationDevice for quantitatively measuring the relative position and orientation of two bodies in the presence of metals utilizing direct current magnetic fields
US494591418 Jul 19887 Aug 1990Allen George SMethod and apparatus for providing related images over time of a portion of the anatomy using at least four fiducial implants
US49516532 Mar 198828 Aug 1990Laboratory Equipment, Corp.Ultrasound brain lesioning system
US495589122 Oct 198711 Sep 1990Ohio Medical Instrument Company, Inc.Method and apparatus for performing stereotactic surgery
US49614222 May 19889 Oct 1990Marchosky J AlexanderMethod and apparatus for volumetric interstitial conductive hyperthermia
US497765518 Apr 198918 Dec 1990Intra-Sonix, Inc.Method of making a transducer
US498960828 Apr 19895 Feb 1991Ratner Adam VDevice construction and method facilitating magnetic resonance imaging of foreign objects in a body
US499157910 Nov 198712 Feb 1991Allen George SMethod and apparatus for providing related images over time of a portion of the anatomy using fiducial implants
US500205818 Apr 198926 Mar 1991Intra-Sonix, Inc.Ultrasonic transducer
US500559227 Oct 19899 Apr 1991Becton Dickinson And CompanyMethod and apparatus for tracking catheters
US50133177 Feb 19907 May 1991Smith & Nephew Richards Inc.Medical drill assembly transparent to X-rays and targeting drill bit
US501663913 Feb 199021 May 1991Allen George SMethod and apparatus for imaging the anatomy
US50171395 Jul 199021 May 1991Mushabac David RMechanical support for hand-held dental/medical instrument
US502781818 May 19892 Jul 1991University Of FloridaDosimetric technique for stereotactic radiosurgery same
US503019615 Jul 19839 Jul 1991Inoue-Japax Research IncorporatedMagnetic treatment device
US50302229 May 19909 Jul 1991James CalandruccioRadiolucent orthopedic chuck
US50312039 Feb 19909 Jul 1991Trecha Randal RCoaxial laser targeting device for use with x-ray equipment and surgical drill equipment during surgical procedures
US504248612 Sep 199027 Aug 1991Siemens AktiengesellschaftCatheter locatable with non-ionizing field and method for locating same
US504703617 Nov 198910 Sep 1991Koutrouvelis Panos GStereotactic device
US50506084 Sep 199024 Sep 1991Medirand, Inc.System for indicating a position to be operated in a patient's body
US505449217 Dec 19908 Oct 1991Cardiovascular Imaging Systems, Inc.Ultrasonic imaging catheter having rotational image correlation
US505709516 Nov 198915 Oct 1991Fabian Carl ESurgical implement detector utilizing a resonant marker
US505978922 Oct 199022 Oct 1991International Business Machines Corp.Optical position and orientation sensor
US507814023 Sep 19867 Jan 1992Kwoh Yik SImaging device - aided robotic stereotaxis system
US50796999 Aug 19897 Jan 1992Picker International, Inc.Quick three-dimensional display
US508640111 May 19904 Feb 1992International Business Machines CorporationImage-directed robotic system for precise robotic surgery including redundant consistency checking
US509424119 Jan 199010 Mar 1992Allen George SApparatus for imaging the anatomy
US509783913 Feb 199024 Mar 1992Allen George SApparatus for imaging the anatomy
US50984266 Feb 198924 Mar 1992Phoenix Laser Systems, Inc.Method and apparatus for precision laser surgery
US509984523 May 199031 Mar 1992Micronix Pty Ltd.Medical instrument location means
US509984623 Dec 198831 Mar 1992Hardy Tyrone LMethod and apparatus for video presentation from a variety of scanner imaging sources
US51058296 May 199121 Apr 1992Fabian Carl ESurgical implement detector utilizing capacitive coupling
US51078394 May 199028 Apr 1992Pavel V. HoudekComputer controlled stereotaxic radiotherapy system and method
US51078435 Apr 199128 Apr 1992Orion-Yhtyma OyMethod and apparatus for thin needle biopsy in connection with mammography
US51078626 May 199128 Apr 1992Fabian Carl ESurgical implement detector utilizing a powered marker
US51091943 Dec 199028 Apr 1992Sextant AvioniqueElectromagnetic position and orientation detector for a pilot's helmet
US511981719 Jan 19909 Jun 1992Allen George SApparatus for imaging the anatomy
US514293029 Mar 19911 Sep 1992Allen George SInteractive image-guided surgical system
US514307611 Jul 19901 Sep 1992Tyrone L. HardyThree-dimensional beam localization microscope apparatus for stereotactic diagnoses or surgery
US51522889 Oct 19916 Oct 1992Siemens AktiengesellschaftApparatus and method for measuring weak, location-dependent and time-dependent magnetic fields
US516033724 Sep 19903 Nov 1992Cosman Eric RCurved-shaped floor stand for use with a linear accelerator in radiosurgery
US516153622 Mar 199110 Nov 1992Catheter TechnologyUltrasonic position indicating apparatus and methods
US517816429 Mar 199112 Jan 1993Allen George SMethod for implanting a fiducial implant into a patient
US517862110 Dec 199112 Jan 1993Zimmer, Inc.Two-piece radio-transparent proximal targeting device for a locking intramedullary nail
US518617429 Jan 199016 Feb 1993G. M. PiaffProcess and device for the reproducible optical representation of a surgical operation
US518747510 Jun 199116 Feb 1993Honeywell Inc.Apparatus for determining the position of an object
US518812625 Mar 199223 Feb 1993Fabian Carl ESurgical implement detector utilizing capacitive coupling
US519005925 Mar 19922 Mar 1993Fabian Carl ESurgical implement detector utilizing a powered marker
US519310628 Aug 19909 Mar 1993Desena DanforthX-ray identification marker
US519747611 Oct 199130 Mar 1993Christopher NowackiLocating target in human body
US519796529 Jul 199230 Mar 1993Codman & Shurtleff, Inc.Skull clamp pin assembly
US519876825 Sep 199030 Mar 1993Elscint, Ltd.Quadrature surface coil array
US519887715 Oct 199030 Mar 1993Pixsys, Inc.Method and apparatus for three-dimensional non-contact shape sensing
US520768831 Oct 19914 May 1993Medco, Inc.Noninvasive head fixation method and apparatus
US521116429 Mar 199118 May 1993Allen George SMethod of locating a target on a portion of anatomy
US52111653 Sep 199118 May 1993General Electric CompanyTracking system to follow the position and orientation of a device with radiofrequency field gradients
US521117629 Nov 199118 May 1993Fuji Photo Optical Co., Ltd.Ultrasound examination system
US521272029 Jan 199218 May 1993Research Foundation-State University Of N.Y.Dual radiation targeting system
US521461524 Sep 199125 May 1993Will BauerThree-dimensional displacement of a body with computer interface
US521935123 Oct 199115 Jun 1993General Electric Cgr S.A.Mammograph provided with an improved needle carrier
US522249926 Mar 199229 Jun 1993Allen George SMethod and apparatus for imaging the anatomy
US522404921 May 199029 Jun 1993Mushabac David RMethod, system and mold assembly for use in preparing a dental prosthesis
US52284423 Dec 199220 Jul 1993Cardiac Pathways CorporationMethod for mapping, ablation, and stimulation using an endocardial catheter
US523033822 Apr 199227 Jul 1993Allen George SInteractive image-guided surgical system for displaying images corresponding to the placement of a surgical tool or the like
US523062310 Dec 199127 Jul 1993Radionics, Inc.Operating pointer with interactive computergraphics
US523399013 Jan 199210 Aug 1993Gideon BarneaMethod and apparatus for diagnostic imaging in radiation therapy
US523799611 Feb 199224 Aug 1993Waldman Lewis KEndocardial electrical mapping catheter
US524958115 Jul 19915 Oct 1993Horbal Mark TPrecision bone alignment
US525112731 Jul 19905 Oct 1993Faro Medical Technologies Inc.Computer-aided surgery apparatus
US52516353 Sep 199112 Oct 1993General Electric CompanyStereoscopic X-ray fluoroscopy system using radiofrequency fields
US525364712 Apr 199119 Oct 1993Olympus Optical Co., Ltd.Insertion position and orientation state pickup for endoscope
US52556803 Sep 199126 Oct 1993General Electric CompanyAutomatic gantry positioning for imaging systems
US52576362 Apr 19912 Nov 1993Steven J. WhiteApparatus for determining position of an endothracheal tube
US525799830 Jun 19922 Nov 1993Mitaka Kohki Co., Ltd.Medical three-dimensional locating apparatus
US52614048 Jul 199116 Nov 1993Mick Peter RThree-dimensional mammal anatomy imaging system and method
US52656103 Sep 199130 Nov 1993General Electric CompanyMulti-planar X-ray fluoroscopy system using radiofrequency fields
US52656116 Jul 199230 Nov 1993Siemens AktiengellschaftApparatus for measuring weak, location-dependent and time-dependent magnetic field
US526975928 Jul 199214 Dec 1993Cordis CorporationMagnetic guidewire coupling for vascular dilatation apparatus
US52714001 Apr 199221 Dec 1993General Electric CompanyTracking system to monitor the position and orientation of a device using magnetic resonance detection of a sample contained within the device
US527302512 Apr 199128 Dec 1993Olympus Optical Co., Ltd.Apparatus for detecting insertion condition of endoscope
US527303917 Nov 199228 Dec 1993Olympus Optical Co., Ltd.Surgical microscope apparatus having a function to display coordinates of observation point
US527455129 Nov 199128 Dec 1993General Electric CompanyMethod and apparatus for real-time navigation assist in interventional radiological procedures
US527930927 Jul 199218 Jan 1994International Business Machines CorporationSignaling device and method for monitoring positions in a surgical operation
US528042727 Nov 199018 Jan 1994Bard International, Inc.Puncture guide for computer tomography
US528578729 Sep 199215 Feb 1994Kabushiki Kaisha ToshibaApparatus for calculating coordinate data of desired point in subject to be examined
US52911996 Jan 19771 Mar 1994Westinghouse Electric Corp.Threat signal detection system
US529188923 May 19918 Mar 1994Vanguard Imaging Ltd.Apparatus and method for spatially positioning images
US529548311 Oct 199122 Mar 1994Christopher NowackiLocating target in human body
US529754923 Sep 199229 Mar 1994Endocardial Therapeutics, Inc.Endocardial mapping system
US529925310 Apr 199229 Mar 1994Akzo N.V.Alignment system to overlay abdominal computer aided tomography and magnetic resonance anatomy with single photon emission tomography
US529925419 Oct 199229 Mar 1994Technomed InternationalMethod and apparatus for determining the position of a target relative to a reference of known co-ordinates and without a priori knowledge of the position of a source of radiation
US529928818 Sep 199129 Mar 1994International Business Machines CorporationImage-directed robotic system for precise robotic surgery including redundant consistency checking
US53000801 Nov 19915 Apr 1994David ClaymanStereotactic instrument guided placement
US53050917 Dec 199219 Apr 1994Oreo Products Inc.Optical coordinate measuring system for large objects
US53052032 Oct 199019 Apr 1994Faro Medical Technologies Inc.Computer-aided surgery apparatus
US53062719 Mar 199226 Apr 1994Izi CorporationRadiation therapy skin markers
US53070729 Jul 199226 Apr 1994Polhemus IncorporatedNon-concentricity compensation in position and orientation measurement systems
US530991330 Nov 199210 May 1994The Cleveland Clinic FoundationFrameless stereotaxy system
US531563010 Mar 199324 May 1994Bodenseewerk Geratetechnik GmbhPositioning device in medical apparatus
US531602424 Dec 199231 May 1994Abbott LaboratoriesTube placement verifier system
US53180251 Apr 19927 Jun 1994General Electric CompanyTracking system to monitor the position and orientation of a device using multiplexed magnetic resonance detection
US532011114 Sep 199214 Jun 1994Livingston Products, Inc.Light beam locator and guide for a biopsy needle
US532572822 Jun 19935 Jul 1994Medtronic, Inc.Electromagnetic flow meter
US532587323 Jul 19925 Jul 1994Abbott LaboratoriesTube placement verifier system
US53299442 Mar 199319 Jul 1994Fabian Carl ESurgical implement detector utilizing an acoustic marker
US533048518 May 199319 Jul 1994Clayman David ACerebral instrument guide frame and procedures utilizing it
US533316829 Jan 199326 Jul 1994Oec Medical Systems, Inc.Time-based attenuation compensation
US535379510 Dec 199211 Oct 1994General Electric CompanyTracking system to monitor the position of a device using multiplexed magnetic resonance detection
US535380011 Dec 199211 Oct 1994Medtronic, Inc.Implantable pressure sensor lead
US53538077 Dec 199211 Oct 1994Demarco Thomas JMagnetically guidable intubation device
US535941715 Oct 199225 Oct 1994Carl-Zeiss-StiftungSurgical microscope for conducting computer-supported stereotactic microsurgery and a method for operating the same
US53680309 Sep 199229 Nov 1994Izi CorporationNon-invasive multi-modality radiographic surface markers
US537177829 Nov 19916 Dec 1994Picker International, Inc.Concurrent display and adjustment of 3D projection, coronal slice, sagittal slice, and transverse slice images
US537559629 Sep 199227 Dec 1994Hdc CorporationMethod and apparatus for determining the position of catheters, tubes, placement guidewires and implantable ports within biological tissue
US537767814 Jul 19933 Jan 1995General Electric CompanyTracking system to follow the position and orientation of a device with radiofrequency fields
US53834542 Jul 199224 Jan 1995St. Louis UniversitySystem for indicating the position of a surgical probe within a head on an image of the head
US53851468 Jan 199331 Jan 1995Goldreyer; Bruce N.Orthogonal sensing for use in clinical electrophysiology
US538514830 Jul 199331 Jan 1995The Regents Of The University Of CaliforniaCardiac imaging and ablation catheter
US538682817 Aug 19937 Feb 1995Sims Deltec, Inc.Guide wire apparatus with location sensing member
US538910121 Apr 199214 Feb 1995University Of UtahApparatus and method for photogrammetric surgical localization
US539119920 Jul 199321 Feb 1995Biosense, Inc.Apparatus and method for treating cardiac arrhythmias
US53944577 Oct 199328 Feb 1995Leibinger GmbhDevice for marking body sites for medical examinations
US539487521 Oct 19937 Mar 1995Lewis; Judith T.Automatic ultrasonic localization of targets implanted in a portion of the anatomy
US539732926 Feb 199314 Mar 1995Allen; George S.Fiducial implant and system of such implants
US539868431 Mar 199221 Mar 1995Hardy; Tyrone L.Method and apparatus for video presentation from scanner imaging sources
US539914613 Dec 199321 Mar 1995Nowacki; ChristopherIsocentric lithotripter
US540038425 Jan 199421 Mar 1995Oec Medical Systems, Inc.Time-based attenuation compensation
US540280128 Apr 19944 Apr 1995International Business Machines CorporationSystem and method for augmentation of surgery
US540840920 Dec 199318 Apr 1995International Business Machines CorporationImage-directed robotic system for precise robotic surgery including redundant consistency checking
US540949712 Mar 199225 Apr 1995Fischer Imaging CorporationOrbital aiming device for mammo biopsy
US541357322 May 19929 May 1995Onesys OyDevice for surgical procedures
US541721027 May 199223 May 1995International Business Machines CorporationSystem and method for augmentation of endoscopic surgery
US541932523 Jun 199430 May 1995General Electric CompanyMagnetic resonance (MR) angiography using a faraday catheter
US54233341 Feb 199313 Jun 1995C. R. Bard, Inc.Implantable medical device characterization system
US54253674 Sep 199120 Jun 1995Navion Biomedical CorporationCatheter depth, position and orientation location system
US542538214 Sep 199320 Jun 1995University Of WashingtonApparatus and method for locating a medical tube in the body of a patient
US542668314 Mar 199420 Jun 1995Oec Medical Systems, Inc.One piece C-arm for X-ray diagnostic equipment
US542668710 Aug 199320 Jun 1995Innovative Care Ltd.Laser targeting device for use with image intensifiers in surgery
US542709710 Dec 199227 Jun 1995Accuray, Inc.Apparatus for and method of carrying out stereotaxic radiosurgery and radiotherapy
US542913223 Aug 19914 Jul 1995Imperial College Of Science Technology And MedicineProbe system
US543319811 Mar 199318 Jul 1995Desai; Jawahar M.Apparatus and method for cardiac ablation
US543727714 Feb 19941 Aug 1995General Electric CompanyInductively coupled RF tracking system for use in invasive imaging of a living body
US54430662 Sep 199422 Aug 1995General Electric CompanyInvasive system employing a radiofrequency tracking system
US544348923 Sep 199422 Aug 1995Biosense, Inc.Apparatus and method for ablation
US544475617 Aug 199422 Aug 1995Minnesota Mining And Manufacturing CompanyX-ray machine, solid state radiation detector and method for reading radiation detection information
US544514416 Dec 199329 Aug 1995Purdue Research FoundationApparatus and method for acoustically guiding, positioning, and monitoring a tube within a body
US544515029 Jan 199329 Aug 1995General Electric CompanyInvasive system employing a radiofrequency tracking system
US54451666 Apr 199429 Aug 1995International Business Machines CorporationSystem for advising a surgeon
US54465488 Oct 199329 Aug 1995Siemens Medical Systems, Inc.Patient positioning and monitoring system
US544715430 Jul 19935 Sep 1995Universite Joseph FourierMethod for determining the position of an organ
US54486104 Feb 19945 Sep 1995Hitachi Medical CorporationDigital X-ray photography device
US54536868 Apr 199326 Sep 1995Polhemus IncorporatedPulsed-DC position and orientation measurement system
US54567183 Jan 199410 Oct 1995Szymaitis; Dennis W.Apparatus for detecting surgical objects within the human body
US545764126 Sep 199410 Oct 1995Sextant AvioniqueMethod and apparatus for determining an orientation associated with a mobile system, especially a line of sight inside a helmet visor
US545871818 Mar 199417 Oct 1995Vip Industries LimitedHeat sealing method for making a luggage case
US546444612 Oct 19937 Nov 1995Medtronic, Inc.Brain lead anchoring system
US546984728 Feb 199428 Nov 1995Izi CorporationRadiographic multi-modality skin markers
US547834123 Dec 199126 Dec 1995Zimmer, Inc.Ratchet lock for an intramedullary nail locking bolt
US547834322 Jul 199426 Dec 1995Howmedica International, Inc.Targeting device for bone nails
US548042223 Sep 19942 Jan 1996Biosense, Inc.Apparatus for treating cardiac arrhythmias
US54804391 Feb 19942 Jan 1996Lunar CorporationMethod for periprosthetic bone mineral density measurement
US548396129 Aug 199416 Jan 1996Kelly; Patrick J.Magnetic field digitizer for stereotactic surgery
US548584931 Jan 199423 Jan 1996Ep Technologies, Inc.System and methods for matching electrical characteristics and propagation velocities in cardiac tissue
US548739128 Jan 199430 Jan 1996Ep Technologies, Inc.Systems and methods for deriving and displaying the propagation velocities of electrical events in the heart
US548772925 Mar 199430 Jan 1996Cordis CorporationMagnetic guidewire coupling for catheter exchange
US548775721 Feb 199530 Jan 1996Medtronic CardiorhythmMulticurve deflectable catheter
US549019618 Mar 19946 Feb 1996Metorex International OyMulti energy system for x-ray imaging applications
US549403415 Jun 199427 Feb 1996Georg SchlondorffProcess and device for the reproducible optical representation of a surgical operation
US550341610 Mar 19942 Apr 1996Oec Medical Systems, Inc.Undercarriage for X-ray diagnostic equipment
US551363721 Dec 19947 May 1996Hdc CorporationMethod and apparatus for determining the position of catheters, tubes, placement guidewires and implantable ports within biological tissue
US55141466 Sep 19947 May 1996Dwl Electronische Systeme GmbhDevice for accomodating at least one sonographic probe
US551516020 Feb 19937 May 1996Aesculap AgMethod and apparatus for representing a work area in a three-dimensional structure
US55179908 Apr 199421 May 1996The Cleveland Clinic FoundationStereotaxy wand and tool guide
US553122728 Jan 19942 Jul 1996Schneider Medical Technologies, Inc.Imaging device and method
US55315201 Sep 19942 Jul 1996Massachusetts Institute Of TechnologySystem and method of registration of three-dimensional data sets including anatomical body data
US554293812 May 19956 Aug 1996Cordis CorporationMagnetic guidewire coupling for catheter exchange
US554395119 Feb 19956 Aug 1996Siemens AktiengesellschaftMethod for receive-side clock supply for video signals digitally transmitted with ATM in fiber/coaxial subscriber line networks
US554694022 Aug 199520 Aug 1996Ep Technologies, Inc.System and method for matching electrical characteristics and propagation velocities in cardiac tissue to locate potential ablation sites
US554694916 Mar 199520 Aug 1996Frazin; LeonMethod and apparatus of logicalizing and determining orientation of an insertion end of a probe within a biotic structure
US554695123 Sep 199420 Aug 1996Biosense, Inc.Method and apparatus for studying cardiac arrhythmias
US55514292 Jun 19953 Sep 1996Fitzpatrick; J. MichaelMethod for relating the data of an image space to physical space
US55580916 Oct 199324 Sep 1996Biosense, Inc.Magnetic determination of position and orientation
US55666812 May 199522 Oct 1996Manwaring; Kim H.Apparatus and method for stabilizing a body part
US556838413 Oct 199222 Oct 1996Mayo Foundation For Medical Education And ResearchBiomedical imaging and analysis
US556880912 Jul 199529 Oct 1996Biosense, Inc.Apparatus and method for intrabody mapping
US557299926 Jan 199512 Nov 1996International Business Machines CorporationRobotic system for positioning a surgical instrument relative to a patient's body
US557353310 Apr 199212 Nov 1996Medtronic CardiorhythmMethod and system for radiofrequency ablation of cardiac tissue
US557579427 Jan 199419 Nov 1996Walus; Richard L.Tool for implanting a fiducial marker
US557579822 Aug 199519 Nov 1996Koutrouvelis; Panos G.Stereotactic device
US558390920 Dec 199410 Dec 1996Oec Medical Systems, Inc.C-arm mounting structure for mobile X-ray imaging system
US558843014 Feb 199531 Dec 1996University Of Florida Research Foundation, Inc.Repeat fixation for frameless stereotactic procedure
US559021521 Apr 199531 Dec 1996Allen; George S.Method for providing medical images
US559293914 Jun 199514 Jan 1997Martinelli; Michael A.Method and system for navigating a catheter probe
US55951936 Jun 199521 Jan 1997Walus; Richard L.Tool for implanting a fiducial marker
US559622828 Jul 199521 Jan 1997Oec Medical Systems, Inc.Apparatus for cooling charge coupled device imaging systems
US560033012 Jul 19944 Feb 1997Ascension Technology CorporationDevice for measuring position and orientation using non-dipole magnetic fields
US560331829 Oct 199318 Feb 1997University Of Utah Research FoundationApparatus and method for photogrammetric surgical localization
US561102523 Nov 199411 Mar 1997General Electric CompanyVirtual internal cavity inspection system
US56174627 Aug 19951 Apr 1997Oec Medical Systems, Inc.Automatic X-ray exposure control system and method of use
US56178576 Jun 19958 Apr 1997Image Guided Technologies, Inc.Imaging system having interactive medical instruments and methods
US561926125 Jul 19948 Apr 1997Oec Medical Systems, Inc.Pixel artifact/blemish filter for use in CCD video camera
US562216914 Sep 199422 Apr 1997University Of WashingtonApparatus and method for locating a medical tube in the body of a patient
US56221704 Oct 199422 Apr 1997Image Guided Technologies, Inc.Apparatus for determining the position and orientation of an invasive portion of a probe inside a three-dimensional body
US56278734 Aug 19956 May 1997Oec Medical Systems, Inc.Mini C-arm assembly for mobile X-ray imaging system
US562831511 Sep 199513 May 1997Brainlab Med. Computersysteme GmbhDevice for detecting the position of radiation target points
US563043111 Oct 199420 May 1997International Business Machines CorporationSystem and method for augmentation of surgery
US563664417 Mar 199510 Jun 1997Applied Medical Resources CorporationMethod and apparatus for endoconduit targeting
US563881929 Aug 199517 Jun 1997Manwaring; Kim H.Method and apparatus for guiding an instrument to a target
US56401705 Jun 199517 Jun 1997Polhemus IncorporatedPosition and orientation measuring system having anti-distortion source configuration
US56423957 Aug 199524 Jun 1997Oec Medical Systems, Inc.Imaging chain with miniaturized C-arm assembly for mobile X-ray imaging system
US564326811 Sep 19951 Jul 1997Brainlab Med. Computersysteme GmbhFixation pin for fixing a reference system to bony structures
US564506511 Apr 19958 Jul 1997Navion Biomedical CorporationCatheter depth, position and orientation location system
US564652416 Jun 19938 Jul 1997Elbit Ltd.Three dimensional tracking system employing a rotating field
US56473611 Mar 199315 Jul 1997Fonar CorporationMagnetic resonance imaging method and apparatus for guiding invasive therapy
US566211116 May 19952 Sep 1997Cosman; Eric R.Process of stereotactic optical navigation
US566400122 Mar 19962 Sep 1997J. Morita Manufacturing CorporationMedical X-ray imaging apparatus
US567429622 Jul 19967 Oct 1997Spinal Dynamics CorporationHuman spinal disc prosthesis
US567667324 Apr 199614 Oct 1997Visualization Technology, Inc.Position tracking and imaging system with error detection for use in medical applications
US56812609 Dec 199428 Oct 1997Olympus Optical Co., Ltd.Guiding apparatus for guiding an insertable body within an inspected object
US568288626 Dec 19954 Nov 1997Musculographics IncComputer-assisted surgical system
US568289026 Jan 19954 Nov 1997Picker International, Inc.Magnetic resonance stereotactic surgery with exoskeleton tissue stabilization
US569010821 Feb 199725 Nov 1997Chakeres; Donald W.Interventional medicine apparatus
US569494523 Oct 19959 Dec 1997Biosense, Inc.Apparatus and method for intrabody mapping
US56955006 Apr 19949 Dec 1997International Business Machines CorporationSystem for manipulating movement of a surgical instrument with computer controlled brake
US569550130 Sep 19949 Dec 1997Ohio Medical Instrument Company, Inc.Apparatus for neurosurgical stereotactic procedures
US569737722 Nov 199516 Dec 1997Medtronic, Inc.Catheter mapping system and method
US570240611 Sep 199530 Dec 1997Brainlab Med. Computersysteme GmbHDevice for noninvasive stereotactic immobilization in reproducible position
US571129926 Jan 199627 Jan 1998Manwaring; Kim H.Surgical guidance method and system for approaching a target within a body
US571394628 Oct 19963 Feb 1998Biosense, Inc.Apparatus and method for intrabody mapping
US571582228 Sep 199510 Feb 1998General Electric CompanyMagnetic resonance devices suitable for both tracking and imaging
US571583615 Feb 199410 Feb 1998Kliegis; UlrichMethod and apparatus for planning and monitoring a surgical operation
US57182417 Jun 199517 Feb 1998Biosense, Inc.Apparatus and method for treating cardiac arrhythmias with no discrete target
US572755211 Jan 199617 Mar 1998Medtronic, Inc.Catheter and electrical lead location system
US57275533 Apr 199617 Mar 1998Saad; Saad A.Catheter with integral electromagnetic location identification device
US57291297 Jun 199517 Mar 1998Biosense, Inc.Magnetic location system with feedback adjustment of magnetic field generator
US57301293 Apr 199524 Mar 1998General Electric CompanyImaging of interventional devices in a non-stationary subject
US57301302 Jun 199524 Mar 1998Johnson & Johnson Professional, Inc.Localization cap for fiducial markers
US573270320 May 199631 Mar 1998The Cleveland Clinic FoundationStereotaxy wand and tool guide
US573527815 Mar 19967 Apr 1998National Research Council Of CanadaSurgical procedure with magnetic resonance imaging
US57380961 Feb 199614 Apr 1998Biosense, Inc.Cardiac electromechanics
US57408028 Dec 199521 Apr 1998General Electric CompanyComputer graphic and live video system for enhancing visualization of body structures during surgery
US574121419 Dec 199421 Apr 1998Terumo Kabushiki KaishaAccessory pathway detecting/cauterizing apparatus
US574239414 Jun 199621 Apr 1998Ascension Technology CorporationOptical 6D measurement system with two fan shaped beams rotating around one axis
US574495329 Aug 199628 Apr 1998Ascension Technology CorporationMagnetic motion tracker with transmitter placed on tracked object
US574876710 Aug 19935 May 1998Faro Technology, Inc.Computer-aided surgery apparatus
US574936226 Jan 199512 May 1998International Business Machines CorporationMethod of creating an image of an anatomical feature where the feature is within a patient's body
US57498359 Jan 199712 May 1998Sims Deltec, Inc.Method and apparatus for location of a catheter tip
US57525137 Jun 199519 May 1998Biosense, Inc.Method and apparatus for determining position of object
US57557256 Sep 199426 May 1998Deemed International, S.A.Computer-assisted microsurgery methods and equipment
US57586675 Jan 19962 Jun 1998Siemens Elema AbDevice for locating a port on a medical implant
US576206423 Jan 19959 Jun 1998Northrop Grumman CorporationMedical magnetic positioning system and method for determining the position of a magnetic probe
US576766914 Jun 199616 Jun 1998Ascension Technology CorporationMagnetic field position and orientation measurement system with dynamic eddy current rejection
US576796014 Jun 199616 Jun 1998Ascension Technology CorporationOptical 6D measurement system with three fan-shaped beams rotating around one axis
US576978922 Apr 199723 Jun 1998George S. AllenAutomatic technique for localizing externally attached fiducial markers in volume images of the head
US576984320 Feb 199623 Jun 1998CormedicaPercutaneous endomyocardial revascularization
US576986112 Sep 199623 Jun 1998Brainlab Med. Computersysteme GmbhMethod and devices for localizing an instrument
US577259416 Oct 199630 Jun 1998Barrick; Earl F.Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
US577532227 Jun 19967 Jul 1998Lucent Medical Systems, Inc.Tracheal tube and methods related thereto
US577606420 May 19967 Jul 1998The Cleveland Clinic FoundationFrameless stereotaxy system for indicating the position and axis of a surgical probe
US578276525 Apr 199621 Jul 1998Medtronic, Inc.Medical positioning system
US578788610 Apr 19954 Aug 1998Compass International IncorporatedMagnetic field digitizer for stereotatic surgery
US579205519 Nov 199611 Aug 1998Schneider (Usa) Inc.Guidewire antenna
US579529430 May 199518 Aug 1998Carl-Zeiss-StiftungProcedure for the correlation of different coordinate systems in computer-supported, stereotactic surgery
US57978497 Mar 199725 Aug 1998Sonometrics CorporationMethod for carrying out a medical procedure using a three-dimensional tracking and imaging system
US579905517 May 199625 Aug 1998Northwestern UniversityApparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
US57990996 Jun 199525 Aug 1998George S. AllenAutomatic technique for localizing externally attached fiducial markers in volume images of the head
US580035224 Apr 19961 Sep 1998Visualization Technology, Inc.Registration system for use with position tracking and imaging system for use in medical applications
US58005351 Nov 19941 Sep 1998The University Of Iowa Research FoundationWireless prosthetic electrode for the brain
US580271926 Feb 19978 Sep 1998Oec Medical Systems, Inc.One piece C-arm for X-ray diagnostic equipment
US580308913 Sep 19958 Sep 1998Visualization Technology, Inc.Position tracking and imaging system for use in medical applications
US580725222 Oct 199615 Sep 1998Aesculap AgMethod and apparatus for determining the position of a body part
US58100083 Dec 199622 Sep 1998Isg Technologies Inc.Apparatus and method for visualizing ultrasonic images
US581072818 Mar 199722 Sep 1998U.S. Philips CorporationMR imaging method and apparatus for guiding a catheter
US58107351 May 199722 Sep 1998Medtronic, Inc.External patient reference sensors
US582055316 Aug 199613 Oct 1998Siemens Medical Systems, Inc.Identification system and method for radiation therapy
US582319231 Jul 199620 Oct 1998University Of Pittsburgh Of The Commonwealth System Of Higher EducationApparatus for automatically positioning a patient for treatment/diagnoses
US582395815 Jun 199420 Oct 1998Truppe; MichaelSystem and method for displaying a structural data image in real-time correlation with moveable body
US582872523 Jun 199727 Oct 1998Eliav Medical Imaging Systems LtdProcessing images for removal of artifacts
US582877020 Feb 199627 Oct 1998Northern Digital Inc.System for determining the spatial position and angular orientation of an object
US582944415 Sep 19943 Nov 1998Visualization Technology, Inc.Position tracking and imaging system for use in medical applications
US583126010 Sep 19963 Nov 1998Ascension Technology CorporationHybrid motion tracker
US583360831 Mar 199710 Nov 1998Biosense, Inc.Magnetic determination of position and orientation
US583475922 May 199710 Nov 1998Glossop; Neil DavidTracking device having emitter groups with different emitting directions
US583695418 Feb 199717 Nov 1998University Of Utah Research FoundationApparatus and method for photogrammetric surgical localization
US584002413 Mar 199724 Nov 1998Olympus Optical Co., Ltd.Endoscope form detecting apparatus in which coil is fixedly mounted by insulating member so that form is not deformed within endoscope
US584002521 Nov 199724 Nov 1998Biosense, Inc.Apparatus and method for treating cardiac arrhythmias
US584307612 Jun 19961 Dec 1998Cordis Webster, Inc.Catheter with an electromagnetic guidance sensor
US58489677 Jun 199515 Dec 1998Cosman; Eric R.Optically coupled frameless stereotactic system and method
US585118316 Oct 199522 Dec 1998St. Louis UniversitySystem for indicating the position of a surgical probe within a head on an image of the head
US586584615 May 19972 Feb 1999Bryan; VincentHuman spinal disc prosthesis
US586867422 Nov 19969 Feb 1999U.S. Philips CorporationMRI-system and catheter for interventional procedures
US586867510 May 19909 Feb 1999Elekta Igs S.A.Interactive system for local intervention inside a nonhomogeneous structure
US58714457 Sep 199516 Feb 1999St. Louis UniversitySystem for indicating the position of a surgical probe within a head on an image of the head
US587145523 Apr 199716 Feb 1999Nikon CorporationOphthalmic apparatus
US587148710 Mar 199716 Feb 1999Cytotherpeutics, Inc.Microdrive for use in stereotactic surgery
US587382224 Apr 199623 Feb 1999Visualization Technology, Inc.Automatic registration system for use with position tracking and imaging system for use in medical applications
US588230427 Oct 199716 Mar 1999Picker Nordstar CorporationMethod and apparatus for determining probe location
US588441017 Dec 199623 Mar 1999Carl-Zeiss-StiftungSensing system for coordinate measuring equipment
US588983427 Sep 199630 Mar 1999Brainlab Med. Computersysteme GmbhBlade collimator for radiation therapy
US58910347 Jun 19956 Apr 1999St. Louis UniversitySystem for indicating the position of a surgical probe within a head on an image of the head
US589115726 Apr 19966 Apr 1999Ohio Medical Instrument Company, Inc.Apparatus for surgical stereotactic procedures
US590469126 Sep 199718 May 1999Picker International, Inc.Trackable guide block
US59073956 Jun 199725 May 1999Image Guided Technologies, Inc.Optical fiber probe for position measurement
US591382016 Aug 199322 Jun 1999British Telecommunications Public Limited CompanyPosition location system
US592039512 Nov 19976 Jul 1999Image Guided Technologies, Inc.System for locating relative positions of objects in three dimensional space
US592199211 Apr 199713 Jul 1999Radionics, Inc.Method and system for frameless tool calibration
US592372730 Sep 199713 Jul 1999Siemens Corporate Research, Inc.Method and apparatus for calibrating an intra-operative X-ray system
US592824825 Feb 199827 Jul 1999Biosense, Inc.Guided deployment of stents
US59386031 Dec 199717 Aug 1999Cordis Webster, Inc.Steerable catheter with electromagnetic sensor
US593869421 Feb 199617 Aug 1999Medtronic CardiorhythmElectrode array catheter
US59479806 Aug 19977 Sep 1999Price Invena ApsDevice for squeezing and cutting an umbilical cord
US594798119 Feb 19977 Sep 1999Cosman; Eric R.Head and neck localizer
US595062928 Apr 199414 Sep 1999International Business Machines CorporationSystem for assisting a surgeon during surgery
US595147525 Sep 199714 Sep 1999International Business Machines CorporationMethods and apparatus for registering CT-scan data to multiple fluoroscopic images
US595157119 Sep 199614 Sep 1999Surgical Navigation Specialist Inc.Method and apparatus for correlating a body with an image of the body
US595464726 Apr 199621 Sep 1999University Of Florida Research Foundation, Inc.Marker system and related stereotactic procedure
US595784428 Nov 199728 Sep 1999Surgical Navigation Specialist Inc.Apparatus and method for visualizing ultrasonic images
US596479625 Aug 199812 Oct 1999Cardiac Pathways CorporationCatheter assembly, catheter and multi-port introducer for use therewith
US596798017 Dec 199619 Oct 1999Visualization Technology, Inc.Position tracking and imaging system for use in medical applications
US59679829 Dec 199719 Oct 1999The Cleveland Clinic FoundationNon-invasive spine and bone registration for frameless stereotaxy
US59680476 Jul 199819 Oct 1999Reed; Thomas MillsFixation devices
US597199720 Feb 199726 Oct 1999Radionics, Inc.Intraoperative recalibration apparatus for stereotactic navigators
US59761562 Nov 19932 Nov 1999International Business Machines CorporationStereotaxic apparatus and method for moving an end effector
US598053523 Sep 19979 Nov 1999Picker International, Inc.Apparatus for anatomical tracking
US59831261 Aug 19979 Nov 1999Medtronic, Inc.Catheter location system and method
US598734918 Apr 199716 Nov 1999Image Guided Technologies, Inc.Method for determining the position and orientation of two moveable objects in three-dimensional space
US598796026 Sep 199723 Nov 1999Picker International, Inc.Tool calibrator
US599983726 Sep 19977 Dec 1999Picker International, Inc.Localizing and orienting probe for view devices
US599984030 Aug 19957 Dec 1999Massachusetts Institute Of TechnologySystem and method of registration of three-dimensional data sets
US60011306 Oct 199714 Dec 1999Bryan; VincentHuman spinal disc prosthesis with hinges
US60061267 Jun 199521 Dec 1999Cosman; Eric R.System and method for stereotactic registration of image scan data
US600612725 Feb 199821 Dec 1999U.S. Philips CorporationImage-guided surgery system
US601308725 Sep 199711 Jan 2000U.S. Philips CorporationImage-guided surgery system
US60145809 Feb 199811 Jan 2000Stereotaxis, Inc.Device and method for specifying magnetic field for surgical applications
US60164399 Oct 199718 Jan 2000Biosense, Inc.Method and apparatus for synthetic viewpoint imaging
US60197257 Mar 19971 Feb 2000Sonometrics CorporationThree-dimensional tracking and imaging system
US60246956 May 199915 Feb 2000International Business Machines CorporationSystem and method for augmentation of surgery
US605072428 Jan 199818 Apr 2000U. S. Philips CorporationMethod of and device for position detection in X-ray imaging
US60597182 Jun 19959 May 2000Olympus Optical Co., Ltd.Endoscope form detecting apparatus in which coil is fixedly mounted by insulating member so that form is not deformed within endoscope
US606302231 Dec 199716 May 2000Biosense, Inc.Conformal catheter
US60712886 Dec 19976 Jun 2000Ohio Medical Instrument Company, Inc.Apparatus and method for surgical stereotactic procedures
US607304322 Dec 19976 Jun 2000Cormedica CorporationMeasuring position and orientation using magnetic fields
US60760083 Feb 199913 Jun 2000St. Louis UniversitySystem for indicating the position of a surgical probe within a head on an image of the head
US609605019 Mar 19991 Aug 2000Surgical Navigation Specialist Inc.Method and apparatus for correlating a body with an image of the body
US610494417 Nov 199715 Aug 2000Martinelli; Michael A.System and method for navigating a multiple electrode catheter
US611884529 Jun 199812 Sep 2000Surgical Navigation Technologies, Inc.System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers
US612253816 Jan 199719 Sep 2000Acuson CorporationMotion--Monitoring method and system for medical devices
US612254110 Dec 199619 Sep 2000Radionics, Inc.Head band for frameless stereotactic registration
US613139626 Sep 199717 Oct 2000Siemens AktiengesellschaftHeat radiation shield, and dewar employing same
US613918316 Oct 199831 Oct 2000Siemens AktiengesellschaftX-ray exposure system for 3D imaging
US614748015 Oct 199814 Nov 2000Biosense, Inc.Detection of metal disturbance
US614959226 Nov 199721 Nov 2000Picker International, Inc.Integrated fluoroscopic projection image data, volumetric image data, and surgical device position data
US615606715 May 19975 Dec 2000Spinal Dynamics CorporationHuman spinal disc prosthesis
US616103222 Mar 199912 Dec 2000Biosense, Inc.Three-axis coil sensor
US616518115 Oct 199826 Dec 2000Sofamor Danek Holdings, Inc.Apparatus and method for photogrammetric surgical localization
US616729630 Sep 199926 Dec 2000The Board Of Trustees Of The Leland Stanford Junior UniversityMethod for volumetric image navigation
US617249929 Oct 19999 Jan 2001Ascension Technology CorporationEddy current error-reduced AC magnetic position measurement system
US617575615 Dec 199816 Jan 2001Visualization Technology Inc.Position tracking and imaging system for use in medical applications
US617834510 May 199923 Jan 2001Brainlab Med. Computersysteme GmbhMethod for detecting the exact contour of targeted treatment areas, in particular, the external contour
US61946391 May 199727 Feb 2001The University Of QueenslandACC synthase genes from pineapple
US620138724 Sep 199813 Mar 2001Biosense, Inc.Miniaturized position sensor having photolithographic coils for tracking a medical probe
US620349721 Apr 199920 Mar 2001Surgical Navigation SpecialistApparatus and method for visualizing ultrasonic images
US621166624 Feb 19973 Apr 2001Biosense, Inc.Object location system and method using field actuation sequences having different field strengths
US62230677 Apr 199824 Apr 2001Brainlab Med. Computersysteme GmbhReferencing device including mouthpiece
US623347618 May 199915 May 2001Mediguide Ltd.Medical positioning system
US624623129 Jul 199912 Jun 2001Ascension Technology CorporationMagnetic field permeable barrier for magnetic position measurement system
US625994225 Sep 199810 Jul 2001Surgical Navigation Specialist Inc.Method and apparatus for recording a three-dimensional image of a body part
US627389621 Apr 199814 Aug 2001Neutar, LlcRemovable frames for stereotactic localization
US628590210 Feb 19994 Sep 2001Surgical Insights, Inc.Computer assisted targeting device for use in orthopaedic surgery
US62982621 May 20012 Oct 2001Neutar, LlcInstrument guidance for stereotactic surgery
US631431022 Jan 19986 Nov 2001Biosense, Inc.X-ray guided surgical location system with extended mapping volume
US633208914 Feb 199718 Dec 2001Biosense, Inc.Medical procedures and apparatus using intrabody probes
US634123111 Oct 200022 Jan 2002Visualization Technology, Inc.Position tracking and imaging system for use in medical applications
US635165928 Aug 199726 Feb 2002Brainlab Med. Computersysteme GmbhNeuro-navigation system
US638148528 Oct 199930 Apr 2002Surgical Navigation Technologies, Inc.Registration of human anatomy integrated for electromagnetic localization
US642485617 May 199923 Jul 2002Brainlab AgMethod for the localization of targeted treatment areas in soft body parts
US642731410 Feb 19986 Aug 2002Biosense, Inc.Magnetic determination of position and orientation
US642854715 Nov 20006 Aug 2002Brainlab AgDetection of the shape of treatment devices
US643441520 Sep 199913 Aug 2002St. Louis UniversitySystem for use in displaying images of a body part
US64375676 Dec 199920 Aug 2002General Electric CompanyRadio frequency coil for open magnetic resonance imaging system
US644594314 Dec 19983 Sep 2002Visualization Technology, Inc.Position tracking and imaging system for use in medical applications
US647020723 Mar 199922 Oct 2002Surgical Navigation Technologies, Inc.Navigational guidance via computer-assisted fluoroscopic imaging
US647434128 Oct 19995 Nov 2002Surgical Navigation Technologies, Inc.Surgical communication and power system
US647880211 Jun 200112 Nov 2002Ge Medical Systems Global Technology Company, LlcMethod and apparatus for display of an image guided drill bit
US648404928 Apr 200019 Nov 2002Ge Medical Systems Global Technology Company, LlcFluoroscopic tracking and visualization system
US649047528 Apr 20003 Dec 2002Ge Medical Systems Global Technology Company, LlcFluoroscopic tracking and visualization system
US64935738 Jun 200010 Dec 2002Winchester Development AssociatesMethod and system for navigating a catheter probe in the presence of field-influencing objects
US649894413 Apr 200024 Dec 2002Biosense, Inc.Intrabody measurement
US649948828 Oct 199931 Dec 2002Winchester Development AssociatesSurgical sensor
US651604622 Sep 20004 Feb 2003Brainlab AgExact patient positioning by compairing reconstructed x-ray images and linac x-ray images
US652744331 Aug 19994 Mar 2003Brainlab AgProcess and apparatus for image guided treatment with an integration of X-ray detection and navigation system
US66090228 Jan 200119 Aug 2003Brainlab AgIntraoperative navigation updating
US661170030 Dec 199926 Aug 2003Brainlab AgMethod and apparatus for positioning a body for radiation using a position sensor
US670117927 Oct 20002 Mar 2004Michael A. MartinelliCoil structures and methods for generating magnetic fields
US200100079188 Jan 200112 Jul 2001Brainlab AgIntraoperative navigation updating
US2002009508126 Feb 200218 Jul 2002Brainlab Med. Computersysteme GmbhNeuro-navigation system
US2004002430930 Apr 20035 Feb 2004Ferre Maurice R.System for monitoring the position of a medical instrument with respect to a patient's body
USRE3261917 Oct 19848 Mar 1988 Apparatus and method for nuclear magnetic resonance scanning and mapping
USRE350257 Jan 199122 Aug 1995Oec Medical SystemsBattery enhanced power generation for mobile X-ray machine
USRE3581630 Mar 19952 Jun 1998Image Guided Technologies Inc.Method and apparatus for three-dimensional non-contact shape sensing
CA964149A24 Oct 197211 Mar 1975William X. HalloranOrthopedic drill guide apparatus
DE3042343A110 Nov 19809 Jun 1982Philips PatentverwaltungSynthetic formation of defect-free images - by superimposed blurring images of defect on reconstituted multiple perspective images
DE3508730A112 Mar 198518 Sep 1986Siemens AgMeasuring device for medical purposes
DE3717871C227 May 198730 Nov 1989Georg Prof. Dr. 5106 Roetgen De SchloendorffTitle not available
DE3831278A114 Sep 198823 Mar 1989Toshiba Kawasaki KkStereotactic device
DE3838011C29 Nov 198810 Sep 1998George S AllenMarker element and method for producing images of the anatomy
DE4213426C223 Apr 199230 Jan 1997Olympus Optical CoMedical device simulating the contact state of a treatment section in the operating unit
DE4225112C130 Jul 19929 Dec 1993Bodenseewerk GeraetetechInstrument position relative to processing object measuring apparatus - has measuring device for measuring position of instrument including inertia sensor unit
DE4233978C18 Oct 199221 Apr 1994Leibinger GmbHDevice for marking body sites for medical examinations
DE19715202B411 Apr 19972 Feb 2006Brainlab AgReferencing device with a mouthpiece
DE19747427C228 Oct 19979 Dec 1999Zeiss Carl FaDevice for bone segment navigation
DE19751761B421 Nov 199722 Jun 2006Brainlab AgSystem and method for currently accurate acquisition of treatment target points
DE19832296A117 Jul 19984 Feb 1999Image Guided Technologies IncOptical tracking system for position and orientation of body in three=dimensional space
EP0062941B124 Mar 198226 Sep 1984Philips Electronics N.V.Contour recording device
EP0119660A115 Mar 198426 Sep 1984Nicolaas Roelof SnijderSystem of examining skeleton parts of a living body, more particularly the vertebral column of the human body
EP0155857A329 Jan 198526 Oct 1988Faro Medical Technologies Inc.Knee laxity evaluator and motion module/digitizer arrangement
EP0319844A11 Dec 198814 Jun 1989Ad-Tech Medical Instrument CorporationElectrical connectors for brain-contact devices
EP0326768A313 Dec 198823 Jan 1991Faro Medical Technologies Inc.Computer-aided surgery apparatus
EP0350996A15 Jul 198917 Jan 1990Philips Electronics N.V.X-ray Examination apparatus comprising a balanced supporting arm
EP0419729A129 Sep 19893 Apr 1991Siemens AktiengesellschaftPosition finding of a catheter by means of non-ionising fields
EP0427358B119 Oct 199027 Mar 1996George S. AllenMechanical arm for and interactive image-guided surgical system
EP0456103A32 May 199129 Apr 1992International Business Machines CorporationImage-directed robotic system for precise surgery
EP0581704B127 Jul 199324 Jan 2001Universite Joseph FourierMethod for determining the position of an organ
EP0651968A116 Aug 199010 May 1995Critikon, Inc.Epidural oxygen sensor
EP0655138B116 Aug 199329 Apr 1998BRITISH TELECOMMUNICATIONS public limited companyPosition location system
EP0894473A224 Jan 19953 Feb 1999Biosense Inc.Medical diagnosis, treatment and imaging systems
EP0908146B16 Oct 199814 Jan 2004General Electric CompanyReal-time image-guided placement of anchor devices
EP0930046A320 Oct 199820 Sep 2000Picker International, Inc.Method of, and apparatus for, imaging
FR2417970B1 Title not available
GB2094590A Title not available
GB2164856B Title not available
WO2001030437A127 Oct 20003 May 2001Winchester Development AssociatesPatient-shielding and coil system
Non-Patent Citations
1. Adams et al., "Orientation Aid for Head and Neck Surgeons," Innov. Tech. Biol. Med., vol. 13, No. 4, 1992, pp. 409-424.
2. Adams et al., Computer-Assisted Surgery, IEEE Computer Graphics & Applications, pp. 43-51 (May 1990).
3. Barrick et al., "Prophylactic Intramedullary Fixation of the Tibia for Stress Fracture in a Professional Athlete," Journal of Orthopaedic Trauma, vol. 6, No. 2, pp. 241-244 (1992).
4. Barrick et al., "Technical Difficulties with the Brooker-Wills Nail in Acute Fractures of the Femur," Journal of Orthopaedic Trauma, vol. 6, No. 2, pp. 144-150 (1990).
5. Barrick, "Distal Locking Screw Insertion Using a Cannulated Drill Bit: Technical Note," Journal of Orthopaedic Trauma, vol. 7, No. 3, 1993, pp. 248-251.
6. Batnitzky et al., "Three-dimensional Computer Reconstruction From Surface Contours for Head CT Examinations", Journal of Comp. Assisted Tomography, vol. 5, No. 1, Feb. 1981, pp. 60-67.
7. Batnitzky, S. et al., Three-Dimensional Computer Reconstruction from Surface Contours for Head CT Examinations.
8. Batnitzky, S. et al., Three-Dimensional Computer Reconstruction from Surface Contours for Head CT Examinations, J. Comp. Asst. Tomogr., vol. 5, No. 1, pp. 60-67, Feb. 1981.
9. Benzel et al., "Magnetic Source Imaging: a Review of the Magnes System of Biomagnetic Technologies Incorporated," Neurosurgery, vol. 33, No. 2 (Aug. 1993), pp. 252-259.
10. Bergstrom et al., Stereotaxic Computed Tomography, Am. J. Roentgenol, vol. 127, pp. 167-170 (1976).
11Bouazza-Marouf et al.; "Robotic-Assisted Internal Fixation of Femoral Fractures", IMECHE., pp. 51-58 (1995).
12Brack et al., "Accurate X-ray Based Navigation in Computer-Assisted Orthopedic Surgery," CAR '98, pp. 716-722.
13Brown, R., M.D., A Stereotactic Head Frame for Use with CT Body Scanners, Investigative Radiology © J.B. Lippincott Company, pp. 300-304 (Jul.-Aug. 1979).
14Bryan, "Bryan Cervical Disc System Single Level Surgical Technique", Spinal Dynamics, 2002, pp. 1-33.
15Bucholz et al., "Variables affecting the accuracy of stereotactic localizationusing computerized tomography," Journal of Neurosurgery, vol. 79, Nov. 1993, pp. 667-673.
16Bucholz, R.D., et al. Image-guided surgical techniques for infections and trauma of the central nervous system, Neurosurg. Clinics of N.A., vol. 7, No. 2, pp. 187-200 (1996).
17Bucholz, R.D., et al., A Comparison of Sonic Digitizers Versus Light Emitting Diode-Based Localization, Interactive Image-Guided Neurosurgery, Chapter 16, pp. 179-200 (1993).
18Bucholz, R.D., et al., Intraoperative localization using a three dimensional optical digitizer, SPIE-The Intl. Soc. for Opt. Eng., vol. 1894, pp. 312-322 (Jan. 17-19, 1993).
19Bucholz, R.D., et al., Intraoperative localization using a three dimensional optical digitizer, SPIE—The Intl. Soc. for Opt. Eng., vol. 1894, pp. 312-322 (Jan. 17-19, 1993).
Bucholz, R.D., et al., Intraoperative Ultrasonic Brain Shift Monitor and Analysis, Stealth Station Marketing Brochure (2 pages) (undated).
Bucholz, R.D., et al., The Correction of Stereotactic Inaccuracy Caused by Brain Shift Using an Intraoperative Ultrasound Device, First Joint Conference, Computer Vision, Virtual Reality and Robotics in Medicine and Medical Robotics and Computer-Assisted Surgery, Grenoble, France, pp. 459-466 (Mar. 19-22, 1997).
Champleboux et al., "Accurate Calibration of Cameras and Range Imaging Sensors: the NPBS Method," IEEE International Conference on Robotics and Automation, Nice, France, May 1992.
Champleboux, "Utilisation de Fonctions Splines pour la Mise au Point d'un Capteur Tridimensionnel sans Contact" (Use of Spline Functions in the Development of a Contactless Three-Dimensional Sensor), Quelques Applications Medicales (Some Medical Applications), Jul. 1991.
Cinquin et al., "Computer Assisted Medical Interventions," IEEE Engineering in Medicine and Biology, May/Jun. 1995, pp. 254-263.
Cinquin et al., "Computer Assisted Medical Interventions," International Advanced Robotics Programme, Sep. 1989, pp. 63-65.
Clarysse et al., "A Computer-Assisted System for 3-D Frameless Localization in Stereotaxic MRI," IEEE Transactions on Medical Imaging, vol. 10, No. 4, Dec. 1991, pp. 523-529.
Cutting, M.D., et al., Optical Tracking of Bone Fragments During Craniofacial Surgery, Second Annual International Symposium on Medical Robotics and Computer Assisted Surgery, pp. 221-225 (Nov. 1995).
Feldmar et al., "3D-2D Projective Registration of Free-Form Curves and Surfaces," Rapport de recherche (Research Report), INRIA Sophia Antipolis, 1994, pp. 1-44.
Foley et al., "Fundamentals of Interactive Computer Graphics," The Systems Programming Series, Chapter 7, Jul. 1984, pp. 245-266.
Foley et al., "Image-guided Intraoperative Spinal Localization," Intraoperative Neuroprotection, Chapter 19, 1996, pp. 325-340.
Foley, "The StealthStation: Three-Dimensional Image-Interactive Guidance for the Spine Surgeon," Spinal Frontiers, Apr. 1996, pp. 7-9.
Friets, E.M., et al., A Frameless Stereotaxic Operating Microscope for Neurosurgery, IEEE Trans. on Biomed. Eng., vol. 36, No. 6, pp. 608-617 (Jul. 1989).
Gallen, C.C., et al., Intracranial Neurosurgery Guided by Functional Imaging, Surg. Neurol., vol. 42, pp. 523-530 (1994).
Galloway, R.L., et al., Interactive Image-Guided Neurosurgery, IEEE Trans. on Biomed. Eng., vol. 39, No. 12, pp. 1226-1231 (1992).
Galloway, R.L., Jr., et al., Optical localization for interactive, image-guided neurosurgery, SPIE, vol. 2164, pp. 137-145 (undated).
Germano, "Instrumentation, Technique and Technology", Neurosurgery, vol. 37, No. 2, Aug. 1995, pp. 348-350.
Gildenberg et al., "Calculation of Stereotactic Coordinates from the Computed Tomographic Scan," Neurosurgery, vol. 10, No. 5, May 1982, pp. 580-586.
Gomez, C.R., et al., Transcranial Doppler Ultrasound Following Closed Head Injury: Vasospasm or Vasoparalysis?, Surg. Neurol., vol. 35, pp. 30-35 (1991).
Gonzalez, "Digital Image Fundamentals," Digital Image Processing, Second Edition, 1987, pp. 52-54.
Gottesfeld Brown et al., "Registration of Planar Film Radiographs with Computed Tomography," Proceedings of MMBIA, Jun. 1996, pp. 42-51.
Grimson, W.E.L., An Automatic Registration Method for Frameless Stereotaxy, Image Guided Surgery, and Enhanced Reality Visualization, IEEE, pp. 430-436 (1994).
Grimson, W.E.L., et al., Virtual-reality technology is giving surgeons the equivalent of x-ray vision, helping them to remove tumors more effectively, to minimize surgical wounds and to avoid damaging critical tissues, Sci. Amer., vol. 280, No. 6, pp. 62-69 (Jun. 1999).
Gueziec et al., "Registration of Computed Tomography Data to a Surgical Robot Using Fluoroscopy: A Feasibility Study," Computer Science/Mathematics, Sep. 27, 1996, 6 pages.
Guthrie, B.L., Graphic-Interactive Cranial Surgery: The Operating Arm System, Handbook of Stereotaxy Using the CRW Apparatus, Chapter 13, pp. 193-211 (undated).
Hamadeh et al., "Automated 3-Dimensional Computed Tomographic and Fluoroscopic Image Registration," Computer Aided Surgery (1998), 3:11-19.
Hamadeh et al., "Towards Automatic Registration Between CT and X-ray Images: Cooperation Between 3D/2D Registration and 2D Edge Detection," MRCAS '95, pp. 39-46.
Hardy, T., M.D., et al., CASS: A Program for Computer Assisted Stereotaxic Surgery, The Fifth Annual Symposium on Computer Applications in Medical Care, Proceedings, Nov. 1-4, 1981, IEEE, pp. 1116-1126 (1981).
Hatch, "Reference-Display System for the Integration of CT Scanning and the Operating Microscope," Thesis, Thayer School of Engineering, Oct. 1984, pp. 1-189.
Hatch, et al., "Reference-Display System for the Integration of CT Scanning and the Operating Microscope", Proceedings of the Eleventh Annual Northeast Bioengineering Conference, Mar. 14-15, 1985, pp. 252-254.
Heilbrun et al., "Preliminary experience with Brown-Roberts-Wells (BRW) computerized tomography stereotaxic guidance system," Journal of Neurosurgery, vol. 59, Aug. 1983, pp. 217-222.
Heilbrun, M.D., Progressive Technology Applications, Neurosurgery for the Third Millennium, Chapter 15, J. Whitaker & Sons, Ltd., Amer. Assoc. of Neurol. Surgeons, pp. 191-198 (1992).
Heilbrun, M.P., Computed Tomography—Guided Stereotactic Systems, Clinical Neurosurgery, Chapter 31, pp. 564-581 (1983).
Heilbrun, M.P., et al., Stereotactic Localization and Guidance Using a Machine Vision Technique, Stereotact. & Funct. Neurosurg., Proceed. of the Mtg. of the Amer. Soc. for Stereot. and Funct. Neurosurg. (Pittsburgh, PA), vol. 58, pp. 94-98 (1992).
Henderson et al., "An Accurate and Ergonomic Method of Registration for Image-guided Neurosurgery," Computerized Medical Imaging and Graphics, vol. 18, No. 4, Jul.-Aug. 1994, pp. 273-277.
Hoerenz, "The Operating Microscope I. Optical Principles, Illumination Systems, and Support Systems," Journal of Microsurgery, vol. 1, 1980, pp. 364-369.
Hofstetter et al., "Fluoroscopy Based Surgical Navigation—Concept and Clinical Applications," Computer Assisted Radiology and Surgery, 1997, pp. 956-960.
Horner et al., "A Comparison of CT-Stereotaxic Brain Biopsy Techniques," Investigative Radiology, Sep.-Oct. 1984, pp. 367-373.
Hounsfield, "Computerized transverse axial scanning (tomography): Part 1. Description of system," British Journal of Radiology, vol. 46, No. 552, Dec. 1973, pp. 1016-1022.
International Search Report, Jan. 1991, France.
J. Comp. Asst. Tomogr., vol. 5, No. 1, pp. 60-67, Feb. 1981.
Jacques et al., "A Computerized Microstereotactic Method to Approach, 3-Dimensionally Reconstruct, Remove and Adjuvantly Treat Small CNS Lesions," Applied Neurophysiology, vol. 43, 1980, pp. 176-182.
Jacques et al., "Computerized three-dimensional stereotaxic removal of small central nervous system lesion in patients," J. Neurosurg., vol. 53, Dec. 1980, pp. 816-820.
Joskowicz et al., "Computer-Aided Image-Guided Bone Fracture Surgery: Concept and Implementation," CAR '98, pp. 710-715.
Kall, B., The Impact of Computer and Imaging Technology on Stereotactic Surgery, Proceedings of the Meeting of the American Society for Stereotactic and Functional Neurosurgery, pp. 10-22 (1987).
Kato, A., et al., A frameless, armless navigational system for computer-assisted neurosurgery, J. Neurosurg., vol. 74, pp. 845-849 (May 1991).
Kelly et al., "Precision Resection of Intra-Axial CNS Lesions by CT-Based Stereotactic Craniotomy and Computer Monitored CO2 Laser," Acta Neurochirurgica, vol. 68, 1983, pp. 1-9.
Kelly, P.J., et al., Computer-assisted Stereotactic Laser Microsurgery for the Treatment of Intracranial Neoplasms, Neurosurgery, vol. 10, No. 3, pp. 324-330, 1982.
Kelly, P.J., Computer Assisted Stereotactic Biopsy and Volumetric Resection of Pediatric Brain Tumors, Brain Tumors in Children, Neurologic Clinics, vol. 9, No. 2, pp. 317-336 (May 1991).
Kelly, P.J., Computer-Directed Stereotactic Resection of Brain Tumors, Neurosurgical Operative Atlas, vol. 1, No. 4, pp. 299-313 (1991).
Kelly, P.J., et al., Results of Computed Tomography-based Computer-assisted Stereotactic Resection of Metastatic Intracranial Tumors, Neurosurgery, vol. 22, No. 1, Part 1, pp. 7-17 (Jan. 1988).
Kelly, P.J., Stereotactic Imaging, Surgical Planning and Computer-Assisted Resection of Intracranial Lesions: Methods and Results, Advances and Technical Standards in Neurosurgery, vol. 17, pp. 78-118 (1990).
Kim, W.S., et al., A Helmet Mounted Display for Telerobotics, IEEE, pp. 543-547 (1988).
Klimek, L., et al., Long-Term Experience with Different Types of Localization Systems in Skull-Base Surgery, Ear, Nose & Throat Surgery, Chapter 51, pp. 635-638 (undated).
Kosugi, Y., et al., An Articulated Neurosurgical Navigation System Using MRI and CT Images, IEEE Trans. on Biomed. Eng., vol. 35, No. 2, pp. 147-152 (Feb. 1988).
Krybus, W., et al., Navigation Support for Surgery by Means of Optical Position Detection, Computer Assisted Radiology: Proceed. of the Intl. Symp. CAR '91, pp. 362-366 (Jul. 3-6, 1991).
Kwoh, Y.S., Ph.D., et al., A New Computerized Tomographic-Aided Robotic Stereotaxis System, Robotics Age, vol. 7, No. 6, pp. 17-22 (Jun. 1985).
Laitinen et al., "An Adapter for Computed Tomography-Guided Stereotaxis," Surg. Neurol., 1985, pp. 559-566.
Laitinen, "Noninvasive multipurpose stereoadapter," Neurological Research, Jun. 1987, pp. 137-141.
Lavallee et al., "Matching 3-D Smooth Surfaces with their 2-D Projections using 3-D Distance Maps," SPIE, vol. 1570, Geometric Methods in Computer Vision, 1991, pp. 322-336.
Lavallee et al., "Computer Assisted Driving of a Needle into the Brain," Proceedings of the International Symposium CAR '89, Computer Assisted Radiology, 1989, pp. 416-420.
Lavallee et al., "Computer Assisted Interventionist Imaging: The Instance of Stereotactic Brain Surgery," North-Holland MEDINFO 89, Part 1, 1989, pp. 613-617.
Lavallee et al., "Image guided operating robot: a clinical application in stereotactic neurosurgery," Proceedings of the 1992 IEEE International Conference on Robotics and Automation, May 1992, pp. 618-624.
Lavallee et al., "Matching of Medical Images for Computed and Robot Assisted Surgery," IEEE EMBS, Orlando, 1991.
Lavallee, S., A New System for Computer Assisted Neurosurgery, IEEE Engineering in Medicine & Biology Society 11th Annual International Conference, vol. 11, pp. 926-927, Nov. 1989.
Lavallee, S., et al., Computer Assisted Knee Anterior Cruciate Ligament Reconstruction: First Clinical Tests, Proceedings of the First International Symposium on Medical Robotics and Computer Assisted Surgery, pp. 11-16 (Sep. 1994).
Lavallee, S., et al., Computer Assisted Medical Interventions, NATO ASI Series, vol. F 60, 3d Imaging in Medic., pp. 301-312 (1990).
Leavitt, D.D., et al., Dynamic Field Shaping to Optimize Stereotactic Radiosurgery, I.J. Rad. Onc. Biol. Phys., vol. 21, pp. 1247-1255 (1991).
Leksell et al., "Stereotaxis and Tomography—A Technical Note," ACTA Neurochirurgica, vol. 52, 1980, pp. 1-7.
Lemieux et al., "A Patient-to-Computed-Tomography Image Registration Method Based on Digitally Reconstructed Radiographs," Med. Phys. 21 (11), Nov. 1994, pp. 1749-1760.
Levin et al., "The Brain: Integrated Three-dimensional Display of MR and PET Images," Radiology, vol. 172, No. 3, Sep. 1989, pp. 783-789.
Maurer, Jr., et al., Registration of Head CT Images to Physical Space Using a Weighted Combination of Points and Surfaces, IEEE Trans. on Med. Imaging, vol. 17, No. 5, pp. 753-761 (Oct. 1998).
Mazier et al., "Computer-Assisted Interventionist Imaging: Application to the Vertebral Column Surgery," Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 12, No. 1, 1990, pp. 0430-0431.
Mazier et al., "Chirurgie de la Colonne Vertebrale Assistee par Ordinateur: Application au Vissage Pediculaire" (Computer-Assisted Surgery of the Vertebral Column: Application to Pedicle Screw Fixation), Innov. Tech. Biol. Med., vol. 11, No. 5, 1990, pp. 559-566.
McGirr, S., M.D., et al., Stereotactic Resection of Juvenile Pilocytic Astrocytomas of the Thalamus and Basal Ganglia, Neurosurgery, vol. 20, No. 3, pp. 447-452 (1987).
Ng, W.S., et al., Robotic Surgery—A First-Hand Experience in Transurethral Resection of the Prostate Surgery, IEEE Eng. in Med. and Biology, pp. 120-125 (Mar. 1993).
Pelizzari et al., "Accurate Three-Dimensional Registration of CT, PET, and/or MR Images of the Brain," Journal of Computer Assisted Tomography, Jan./Feb. 1989, pp. 20-26.
Pelizzari et al., "Interactive 3D Patient-Image Registration," Information Processing in Medical Imaging, 12th International Conference, IPMI '91, Jul. 7-12, pp. 136-141 (A.C.F. Colchester et al. eds. 1991).
Pelizzari et al., No. 528—"Three Dimensional Correlation of PET, CT and MRI Images," The Journal of Nuclear Medicine, vol. 28, No. 4, Apr. 1987, p. 682.
Penn, R.D., et al., Stereotactic Surgery with Image Processing of Computerized Tomographic Scans, Neurosurgery, vol. 3, No. 2, pp. 157-163 (Sep.-Oct. 1978).
Phillips et al., "Image Guided Orthopaedic Surgery Design and Analysis," Trans Inst. MC, vol. 17, No. 5, 1995, pp. 251-264.
Pixsys, 3-D Digitizing Accessories, by Pixsys (marketing brochure) (undated) (2 pages).
Potamianos et al., "Intra-Operative Imaging Guidance for Keyhole Surgery Methodology and Calibration," First International Symposium on Medical Robotics and Computer Assisted Surgery, Sep. 22-24, 1994, pp. 98-104.
Reinhardt et al., "CT-Guided ‘Real Time’ Stereotaxy," ACTA Neurochirurgica, 1989.
Reinhardt, H., et al., A Computer-Assisted Device for Intraoperative CT-Correlated Localization of Brain Tumors, pp. 51-58 (1988).
Reinhardt, H.F., et al., Sonic Stereometry in Microsurgical Procedures for Deep-Seated Brain Tumors and Vascular Malformations, Neurosurgery, vol. 32, No. 1, pp. 51-57 (Jan. 1993).
Reinhardt, H.F., et al., Mikrochirurgische Entfernung tiefliegender Gefäßmißbildungen mit Hilfe der Sonar-Stereometrie (Microsurgical Removal of Deep-Seated Vascular Malformations Using Sonar Stereometry), Ultraschall in Med. 12, pp. 80-83 (1991).
Reinhardt, Hans. F., Neuronavigation: A Ten-Year Review, Neurosurgery, pp. 329-341 (undated).
Roberts, D.W., et al., A Frameless Stereotaxic Integration of Computerized Tomographic Imaging and the Operating Microscope, J. Neurosurg., vol. 65, pp. 545-549, 1986.
Rosenbaum et al., "Computerized Tomography Guided Stereotaxis: A New Approach," Applied Neurophysiology, vol. 43, No. 3-5, 1980, pp. 172-173.
Sautot, "Vissage Pediculaire Assiste par Ordinateur" (Computer-Assisted Pedicle Screw Fixation), Sep. 20, 1994.
Schueler et al., "Correction of Image Intensifier Distortion for Three-Dimensional X-Ray Angiography," SPIE Medical Imaging 1995, vol. 2432, pp. 272-279.
Selvik et al., "A Roentgen Stereophotogrammetric System," Acta Radiologica Diagnosis, 1983, pp. 343-352.
Shelden et al., "Development of a computerized microstereotaxic method for localization and removal of minute CNS lesions under direct 3-D vision," J. Neurosurg., vol. 52, 1980, pp. 21-27.
Simon, D.A., Accuracy Validation in Image-Guided Orthopaedic Surgery, Second Annual Intl. Symp. on Med. Rob. and Comp.-Assisted Surgery, MRCAS '95, pp. 185-192 (undated).
Smith et al., "Computer Methods for Improved Diagnostic Image Display Applied to Stereotactic Neurosurgery," Automedical, vol. 14, 1992, pp. 371-382 (4 unnumbered pages).
Smith et al., "The Neurostation™—A Highly Accurate, Minimally Invasive Solution to Frameless Stereotactic Neurosurgery," Computerized Medical Imaging and Graphics, vol. 18, Jul.-Aug. 1994, pp. 247-256.
Smith, K.R., et al., Multimodality Image Analysis and Display Methods for Improved Tumor Localization in Stereotactic Neurosurgery, Annual Intl. Conf. of the IEEE Eng. in Med. and Biol. Soc., vol. 13, No. 1, p. 210 (1991).
Tan, K., Ph.D., et al., A frameless stereotactic approach to neurosurgical planning based on retrospective patient-image registration, J. Neurosurg., vol. 79, pp. 296-303 (Aug. 1993).
Thompson, et al., A System for Anatomical and Functional Mapping of the Human Thalamus, Computers and Biomedical Research, vol. 10, pp. 9-24 (1977).
Trobaugh, J.W., et al., Frameless Stereotactic Ultrasonography: Method and Applications, Computerized Medical Imaging and Graphics, vol. 18, No. 4, pp. 235-246 (1994).
Viant et al., "A Computer Assisted Orthopaedic System for Distal Locking of Intramedullary Nails," Proc. of MediMEC '95, Bristol, 1995, pp. 86-91.
von Hanwehr et al., Foreword, Computerized Medical Imaging and Graphics, vol. 18, No. 4, pp. 225-228 (Jul.-Aug. 1994).
Wang, M.Y., et al., An Automatic Technique for Finding and Localizing Externally Attached Markers in CT and MR Volume Images of the Head, IEEE Trans. on Biomed. Eng., vol. 43, No. 6, pp. 627-637 (Jun. 1996).
Watanabe et al., Three-Dimensional Digitizer (Neuronavigator): New Equipment for Computed Tomography-Guided Stereotaxic Surgery, Surg. Neurol., vol. 27, pp. 543-547, 1987.
Watanabe, "Neuronavigator," Igaku-no-Ayumi, vol. 137, No. 6, May 10, 1986, pp. 1-4.
Watanabe, E., M.D., et al., Open Surgery Assisted by the Neuronavigator, a Stereotactic, Articulated, Sensitive Arm, Neurosurgery, vol. 28, No. 6, pp. 792-800 (1991).
Classifications
U.S. Classification600/424, 600/427, 600/426, 606/130, 600/425
International ClassificationG06T1/00, A61B19/00
Cooperative ClassificationA61B90/10, A61B34/20, A61B34/70, A61B2090/364, A61B2034/105, A61B90/39, A61B2090/367, A61B2034/2068, A61B2090/363, A61B2034/2072, A61B2034/107