CA2634490A1 - Methods and systems for segmentation and surface matching - Google Patents

Publication number: CA2634490A1
Authority
CA
Canada
Prior art keywords
images
lesion
reference surface
elements
points
Prior art date
Legal status: Granted
Application number
CA002634490A
Other languages
French (fr)
Other versions
CA2634490C (en)
Inventor
Fabienne Lathuiliere
Veronique Audet
Tony Falco
Martin Lachaine
Current Assignee
Resonant Medical Inc
Original Assignee
Resonant Medical Inc.
Fabienne Lathuiliere
Veronique Audet
Tony Falco
Martin Lachaine
Priority date
Filing date
Publication date
Application filed by Resonant Medical Inc., Fabienne Lathuiliere, Veronique Audet, Tony Falco, Martin Lachaine
Publication of CA2634490A1
Application granted
Publication of CA2634490C
Legal status: Active

Classifications

    • G06T7/149 — Segmentation; edge detection involving deformable models, e.g. active contour models
    • G06T7/12 — Edge-based segmentation
    • G06T2200/04 — Image data processing involving 3D image data
    • G06T2207/10081 — Computed x-ray tomography [CT]
    • G06T2207/20104 — Interactive definition of region of interest [ROI]
    • G06T2207/20116 — Active contour; active surface; snakes
    • G06T2207/30096 — Tumor; lesion

Abstract

A contoured surface map of a lesion within a patient is obtained by shifting a reference surface to an estimated location in operational images. The process can be repeated to minimize errors, and the contoured surface map can then be segmented.

Description

METHODS AND SYSTEMS FOR SEGMENTATION AND SURFACE
MATCHING
Technical Field [0001] This invention relates to methods and systems for verifying anatomical features of a patient undergoing radiation therapy and, more particularly, to methods and systems for using reference images to identify surface elements of anatomical elements imaged at a later time.

Background Information [0002] Radiation-emitting devices are used for the treatment of cancerous tumors within patients. The primary goal of treating cancerous tumors with radiation therapy is the complete eradication of the cancerous cells, while the secondary goal is to avoid, to the maximum possible extent, damaging healthy tissue and organs in the vicinity of the tumor.
Typically, a radiation therapy device includes a gantry that can be rotated around a horizontal axis of rotation during the delivery of a therapeutic treatment. A particle linear accelerator ("LINAC") is located within the gantry and generates a high-energy therapeutic radiation beam, such as an electron beam or photon (x-ray) beam. The patient is placed on a treatment table located at the isocenter of the gantry, and the radiation beam is directed toward the tumor or lesion to be treated.
[0003] Radiation therapy typically involves a planning stage and a treatment stage.
Generally, the planning stage involves acquiring images of a lesion (using, for example, an x-ray device) and subsequently using the image(s) to accurately measure the location, size, contour, and number of lesions to be treated. These measurements are used to establish certain treatment plan parameters, such as an isocenter, beam angles, energy, aperture, dose distribution, and other parameters, in an attempt to irradiate the lesion while minimizing damage to surrounding healthy tissue.
[0004] Imaging is often used by oncologists in determining the treatment parameters of radiation therapy plans such that the prescribed radiation is sufficient to eliminate the cancerous cells while conforming the shape of the dose distribution to a target volume to the greatest extent possible, thereby sparing healthy tissue from exposure to potentially harmful doses of radiation. To develop a preferred treatment plan, simulations can be performed to design a set of beams which accomplishes this goal, calculating the dose at each point in the patient resulting from this set of beams. The dose distribution can be represented, for example, as isodose lines or as three-dimensional isodose surfaces within the patient. The treatment goal is to encompass the lesion and an appropriate safety margin within the 100% isodose surface. The treatment plan is then administered, usually at a later date and over a period of weeks, based on the treatment parameters.
One shortcoming of this approach is that the time lapse between treatment planning and treatment delivery allows for changes to the patient's anatomy, thereby potentially rendering the treatment plan sub-optimal. Changes such as lesion movement, growth, organ shifting, or other morphisms can cause healthy tissue to become subject to potentially harmful radiation, and cancerous tissue to extend beyond the boundaries of the original treatment plan.
[0005] Once a treatment plan is determined, the patient receives the radiation treatments during a number of sessions (fractions). Treatment often includes significant time lapses between individual fractions and can also span many weeks (e.g., once a day, five days a week, for four weeks). Because organs can change location and/or shape from the time of planning to the delivery of the initial fraction, as well as from fraction to fraction, the original segmented, contoured image may no longer accurately represent the lesion being treated. As a result, the treatment plan may no longer be optimal. Three-dimensional imaging modalities that are able to discern soft tissues are therefore used in the treatment room in order to detect and compensate for organ motion. However, because of the time constraints imposed during the individual fractions, and the lack of a trained physician during the fractions, it may not be possible to generate an updated segmented or contoured image of the lesion. Thus, methods that provide fast, accurate, and reliable images and patient positioning data without requiring a physician's expertise are of great benefit to a radiation technologist administering the radiation treatment.
[0006] Therefore, a fast technique to segment organs or structures of interest prior to a medical procedure with minimal user guidance is needed.

Summary of the Invention [0007] The present invention provides systems and methods to obtain a segmented, contoured, three-dimensional representation of an anatomical feature or structure of interest of a patient prior to a medical procedure with minimal user guidance, using the information provided by pre-operative contours approved by the physician as an initial estimate of the treatment contour shape. Such techniques facilitate determining patient positioning corrections to compensate for displacement and morphological change, modifying treatment parameters to account for the current location of the lesion, and/or calculating changes in lesion size and shape over the course of a medical treatment. The invention utilizes a surface model derived during the treatment planning phase and images taken during the treatment phase. In general, the invention relates to using a previously contoured image of a tissue volume acquired during the treatment planning phase as an initial estimate of the current surface.
[0008] In a typical embodiment, a user shifts, rotates and/or scales the planning contour (using, for example, a video screen) until the contour fits as closely as possible to the lesion or organ in the treatment image, and then uses this shifted contour as an initial estimate for a local segmentation algorithm. The local segmentation algorithm, in turn, uses this estimate as a starting point to find edges in the current treatment image, resulting in a segmented, contoured surface of the lesion in the current image. The invention is based on an assumption that the volume in the current image to be segmented has translated, rotated and/or scaled from its position when initially imaged, and that the planning contour can serve as a reference surface for the local segmentation process. The surface is moved to the correct location in the current image (either manually and/or automatically) by using iterations of local segmentation and surface-matching techniques to gradually move the planning contour to the correct location in the treatment image, which can then be used as an initial estimate for a final iteration of a local segmentation process.
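The manual adjustment described above — translating, rotating, and scaling the planning contour before it seeds the local segmentation — can be sketched as a simple transform. This is an illustrative sketch, not the patented method; the function name and the choice of rotating about the z-axis through the contour centroid are assumptions.

```python
import numpy as np

def shift_contour(contour, translation=(0.0, 0.0, 0.0), angle_deg=0.0, scale=1.0):
    """Translate, rotate (about the z-axis through the contour centroid),
    and scale a planning contour so it roughly overlays the lesion in the
    treatment image, before handing it to a local segmentation algorithm."""
    pts = np.asarray(contour, dtype=float)
    centroid = pts.mean(axis=0)
    th = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(th), -np.sin(th), 0.0],
                    [np.sin(th),  np.cos(th), 0.0],
                    [0.0,         0.0,        1.0]])
    # rotate and scale about the centroid, then translate
    return (pts - centroid) @ rot.T * scale + centroid + np.asarray(translation, dtype=float)
```

Rotating about the centroid rather than the image origin keeps the contour in place while its orientation is adjusted, mirroring how a user would manipulate it on screen.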
[0009] In the radiotherapy application, for example, adjustments to the treatment parameters and/or patient positioning can be made based on the newly segmented volume image, and thus the invention facilitates rapid and accurate treatment position adjustment just prior to treatment delivery while including important clinical and/or practical concerns not addressed by other conventional methods.
[0010] In one aspect, a method for obtaining a contoured image of a lesion within a patient (for the purposes of administering radiation treatment, for example) includes providing a contoured reference image of the lesion based on images of the lesion taken at a first time (such as during a treatment planning session), generating a set of operational images at a second time (such as during a treatment delivery session), shifting (e.g., translating, rotating, scaling, or any combination thereof) the reference surface to an approximated location in the operational images, segmenting the operational images based on the shifted reference surface (using, for example, a local segmentation technique), and determining a contoured surface of the lesion at the second time based on the segmentation.
[0011] The reference surface and/or operational images can be created from images generated using any suitable tomographic or other imaging modality, e.g., a CT scanner, a three-dimensional ultrasound device, a PET scanner, or an MRI device. The approximated location can be determined manually, or by indicating one or more points on the operational images and using a surface-mapping algorithm to map surface elements on the reference surface to the indicated points. The surface elements on the reference surface can include triangles or other two-dimensional shapes, lines, or points, and the points on the operational images can represent edge boundaries. In some embodiments, two-dimensional cross-sectional cuts through the surfaces and/or images (either radially offset from or parallel to each other) can be used as input into a segmentation algorithm. The process of shifting the reference surface with surface matching and segmentation can be repeated iteratively, each pass using the newly shifted reference surface in place of the original reference surface. A minimization threshold (e.g., minimum squared distance) can be used to determine when the shifted surface is optimally placed.
[0012] Weights can be assigned to the surface elements on the reference and/or treatment surface, and can be based on a degree of certainty that the surface element corresponds to a particular feature of the lesion, which in some cases can be an edge of the lesion; and/or on the clinical importance of an anatomical feature represented by the surface element and, in some embodiments, the proximity of an anatomical feature represented by the surface element to another anatomical structure of the patient. In some embodiments, the weights can be based on the density of the surface elements within a particular area of the image, and/or the area of the surface element itself.

[0013] In another aspect, a system for obtaining a contoured image of a lesion includes a register for storing a reference surface of the lesion based on images acquired at a first time, a second set of images of the lesion taken at a second time, and estimated surface elements on the second set of images. The system also includes a mapping module for mapping surface elements on the reference surface to estimated surface elements on the second set of images, and a processor for shifting (e.g., translating, rotating, and/or scaling) the reference surface based on the matched surface elements.
[0014] The foregoing and other objects, features and advantages of the present invention disclosed herein, as well as the invention itself, will be more fully understood from the following description of preferred embodiments and claims, when read together with the accompanying drawings.

Brief Description of the Drawings [0015] In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the invention.
[0016] FIG. 1 schematically illustrates a segmented mapping of a lesion.
[0017] FIGS. 2a and 2b schematically illustrate obtaining cross-sectional views of the lesion.
[0018] FIG. 3 schematically illustrates using a reference surface as an initial estimate for segmentation.
[0019] FIG. 4 schematically illustrates placing estimated surface elements on cross-sections of the lesion in accordance with one embodiment of the invention.
[0020] FIG. 5a schematically illustrates mapping surface elements of the segmented image to the estimated surface elements of FIG. 4.
[0021] FIG. 5b schematically illustrates shifting the segmented image in accordance with the mapped surface elements of the segmented image and the cross-sectional view of the lesion.
[0022] FIG. 6 schematically illustrates placing estimated surface elements on a cross-section of the lesion in accordance with one embodiment of the invention.
[0023] FIG. 7 is a flow diagram illustrating various embodiments of determining lesion displacement.

[0024] FIG. 8 is a schematic illustration of various embodiments of a system adapted to practice the methods of the present invention.

Detailed Description [0025] Referring to FIG. 1, an image 100 of a lesion is obtained during a treatment planning stage. The image can be an individual two-dimensional image, multiple two-dimensional sectional images or slices that collectively represent a volume and may be combined into a composite image, or three-dimensional images rendered programmatically or manually (or a combination). The image can be obtained using devices such as CT scanners, ultrasound devices, MRI devices, PET scanners, and x-ray devices, as well as other imaging modalities commonly used throughout the art.
The structure of interest may be a tumor or lesion independent of other organs, a cancerous gland (such as the prostate) requiring treatment, a non-cancerous organ of interest, or any identifying landmark within the image. The image 100 may be used, for instance, by an oncologist, physician, or radiation technician to determine the location and shape of the lesion to be treated and to determine the parameters of the radiation treatment plan, such as beam angle, beam shape, the number of beams needed to administer a sufficient dose to eradicate the target lesion, the dose level for each beam, and patient positioning.
However, due to the changes mentioned above, elements in the image at treatment time are not always in the same location or of the same shape as they were during the planning stage. Furthermore, because the technicians that administer the treatment generally lack the training necessary to build a segmented and/or contoured image of the lesion, these changes are not easily incorporated into an updated treatment plan. Therefore, it would be beneficial to be able to use, during the treatment delivery phase, the original segmented image devised by the physician during the treatment planning stage, albeit adjusted for changes in position and shape. The time required to generate a new segmented, contoured image is also a constraint, as there is limited time available to deliver the individual radiotherapy sessions.
[0026] During a treatment planning session, the organ or lesion surface is contoured either manually, semi-automatically, or automatically, and a three-dimensional planning surface image (also referred to herein as a "reference surface") is generated. The reference surface can be described by points, line segments, a regular mesh, an analytic function, a set of triangles or other shapes, or any other surface representation. A second surface image (referred to as a "treatment surface" or "operative surface") is generated at treatment time. In either or both images, some points on the surface may be known with more confidence than others due to, for example, poor edge information in some locations.
In such a case, each surface element can be assigned a weight that indicates how confident either the user or the segmentation algorithm is that the surface element corresponds to a true border. In the extreme case, a weight of zero indicates that a given surface element is completely arbitrary since there is no reliable image data in that region.
[0027] Still referring to FIG. 1, the image 100 of the lesion to be treated using radiation has been segmented into numerous triangular sections, each representing a different surface element. In some embodiments, certain surface elements can be weighted (e.g., elements 105, 110, and 115) so as to be given greater consideration by a surface-matching algorithm used to identify corresponding elements in a later image (described in greater detail below). Elements 120, 125, and 130 may also be considered during a matching process, but to discount any irregularities or compensate for unknowns in those areas, they can be given a weighting lower than those assigned to elements 105, 110, and 115. In some instances, the imaging modality used to obtain the images may not provide complete representations of the lesion. However, incomplete images can still be used as input into a matching algorithm by weighting surface elements in non-imaged areas with a low, or even zero weight and assigning a greater weight to elements in areas accurately characterized in the image. As a result, the algorithm will find elements of the lesion in the segmented image that correspond to those elements that are substantially discernible or of greater importance. In some cases, each surface element can be assigned an equal weighting, if, for example, weighting is not important for the particular application.
[0028] The weighting of the individual elements can be based on the degree of certainty of surface points or elements identified in the image - i.e., parts of a lesion's surface which are known with a greater degree of certainty are assigned higher weights than those about which there is less certainty. The degree of certainty can, for example, represent the accuracy with which the image or portion thereof corresponds to an actual anatomical feature of the patient; the level of confidence that the representation is located accurately with respect to other anatomical features, the patient, fiducial markers, or other positioning devices; or the degree of detail shown (and accurately represented) in the image. The weighting of surface elements can also be determined using gradient information obtained by analyzing the pixels of the image to determine closeness to a binary edge. Other reasons for assigning higher weights to certain surface elements with respect to others may be their clinical importance, such as their proximity to other critical organs or other highly-susceptible non-target tissues. For example, the posterior part of the prostate surface is close to the rectum and thus can be assigned a higher weight than elements on the rest of the prostate surface to ensure the prostate/rectal interface is correctly repositioned.
[0029] In addition to determining the weights based on anatomical features of the patient, the number and arrangement of the surface elements themselves can help determine the assigned weights. For example, the weights can be adjusted for the density of points within a particular region to reduce bias in the case of an unequal distribution of surface elements. Likewise, surface areas with only a few defined elements may be assigned reduced weights to reduce the effect of statistical outliers, to eliminate statistical noise, or to minimize damage to areas that have not been adequately characterized by imaging.
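As a concrete illustration of the density adjustment just described, the sketch below divides each element's certainty weight by the number of neighbouring elements, so that densely sampled patches do not dominate the matching. The neighbourhood radius and the simple neighbour count are assumptions for illustration, not the patent's prescribed formula.

```python
import numpy as np

def density_corrected_weights(points, certainty, radius=1.0):
    """Divide each surface element's certainty weight by the number of
    elements in its neighbourhood (including itself), reducing the bias
    from unevenly distributed surface points."""
    pts = np.asarray(points, dtype=float)
    cert = np.asarray(certainty, dtype=float)
    weights = np.empty(len(pts))
    for i, p in enumerate(pts):
        # count elements within `radius` of this one, itself included
        neighbours = np.sum(np.linalg.norm(pts - p, axis=1) < radius)
        weights[i] = cert[i] / neighbours
    return weights
```

With this correction, three clustered points of certainty 1 each contribute weight 1/3, while an isolated point keeps its full weight of 1, so the cluster and the lone point carry equal total influence.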
[0030] Technicians generally rely on images such as ultrasound and/or CT to detect changes to the lesion between the planning stage and treatment stage (and in some cases between individual treatment sessions). As mentioned above, there may not be sufficient time and/or skill to fully contour the treatment image prior to delivering the radiation dose.
The invention provides methods and systems by which these subsequent images can be used in conjunction with the original segmented reference image to produce a segmented image that has effectively been "error-corrected" for lesion shift, rotation and morphisms.
[0031] Referring to FIG. 2a, three-dimensional planning and/or treatment images 200 can be separated manually or programmatically into a series of two-dimensional cross-sectional image slices for easier manipulation. In one embodiment, images can be the original two-dimensional images taken prior to three-dimensional reconstruction. In another embodiment, the cross-sections 205 are radially offset from each other and taken about a common central axis 210 of the image 200. The degree of offset and number of images can vary according to individual images and lesions, although ten images offset at an angle of 5 degrees from each other provides sufficient imaging for a typical prostate gland. In another embodiment, and referring to FIG. 2b, the cross-sections 215 are parallel to each other and orthogonal to a common axis 220. In either case, the set of two-dimensional slices taken together represents the three-dimensional image dataset.
Segmented structures can also be similarly divided into two-dimensional curves which are associated to each two-dimensional cross-sectional slice.
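The radial slicing of FIG. 2a can be sketched for a surface point cloud: each slice plane contains the central axis (assumed here to be the z-axis) and is rotated by a fixed angular step; a point belongs to a slice if it lies within a tolerance of that plane. The function name, tolerance, and point-cloud representation are assumptions for illustration.

```python
import numpy as np

def radial_slice_points(points, n_slices=10, step_deg=5.0, tol=0.05):
    """Split a 3D point set into radial cross-sections taken about a
    common central axis (the z-axis here), keeping for each slice the
    points within `tol` of that slice plane."""
    pts = np.asarray(points, dtype=float)
    slices = []
    for k in range(n_slices):
        theta = np.deg2rad(k * step_deg)
        # plane containing the z-axis, rotated by theta about it; its normal:
        normal = np.array([-np.sin(theta), np.cos(theta), 0.0])
        distance = np.abs(pts @ normal)
        slices.append(pts[distance < tol])
    return slices
```

The defaults mirror the ten slices at 5-degree offsets suggested for a typical prostate gland; image-grid slicing in a real system would resample voxels along each plane rather than filter points.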
[0032] Referring to FIG. 3a, once the three-dimensional image is acquired and processed into a series of cross-sectional two-dimensional frames (300), the lesion or organ 305 is segmented in the treatment image. In one embodiment, the planning/reference segmentation 310 is superimposed on the treatment/operative image 300 and moved manually to the approximate location of the organ 305. Although only one frame 300 is shown, the user can also move the planning contour in at least one other orthogonal view plane until it is approximately at the correct position in three-dimensional space. If, as may be the case, the organ has changed only in position, then an exact match is possible and the surface can be segmented. If, however, there are morphisms such as changes in volume and/or shape, the shifted segmentation 310 serves as an "initial estimate" of the organ 305.
[0033] Referring to FIG. 3b, a local segmentation technique is used which starts from the initial estimate 310 to deform the contour such that it contours the organ 305 more accurately. Non-limiting examples of local segmentation techniques include those that use image gradients to attract contours to edges starting from an estimate of the contour, and techniques described in "Prostate boundary segmentation from 2D ultrasound images" by Ladak et al. in Med. Phys. 27 (8), p. 1777-1799 (2000) and "Prostate boundary segmentation from 3D ultrasound images" by Hu et al. in Med. Phys. 30 (7), p.

(2003). The invention facilitates the segmentation by using the planning surface, shifted to the correct location, as the initial estimate. The degree of morphism tolerated depends on the capabilities of the segmentation technique(s) used. In some cases, the initial planning surface can be scaled as well as shifted so that it represents the lesion in the treatment image more accurately. Instead of or in addition to scaling and shifting, the initial segmentation can be rotated if the lesion has rotated between planning and treatment times.
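A toy stand-in for such a gradient-attracted refinement (not the Ladak or Hu algorithms themselves) can move each contour point along its outward direction to the strongest nearby intensity gradient. The search range and the centroid-based outward direction are simplifying assumptions.

```python
import numpy as np

def refine_contour(image, contour, search=3):
    """One pass of gradient-seeking refinement: each contour point moves
    along its outward direction (from the contour centroid) to the pixel
    with the largest intensity-gradient magnitude within `search` pixels."""
    gy, gx = np.gradient(image.astype(float))
    grad = np.hypot(gx, gy)
    centre = contour.mean(axis=0)
    refined = []
    for p in contour:
        direction = p - centre
        direction = direction / (np.linalg.norm(direction) + 1e-9)
        best, best_g = p, -1.0
        for t in np.linspace(-search, search, 2 * search + 1):
            q = p + t * direction  # candidate position along the normal
            r, c = int(round(q[0])), int(round(q[1]))
            if 0 <= r < image.shape[0] and 0 <= c < image.shape[1] and grad[r, c] > best_g:
                best, best_g = q, grad[r, c]
        refined.append(best)
    return np.array(refined)
```

Starting from an initial estimate close to the true boundary (here, a circle slightly inside a bright disk), each point is pulled outward to the edge, which is exactly why the shifted planning contour makes a good seed.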

[0034] The local segmentation algorithm used can either be fully three-dimensional, or can be two-dimensional and applied separately on each cross-sectional plane 205 or 215.
In the latter case, the set of two-dimensional curves generated by the segmentation algorithm can be reconstructed into a three-dimensional mesh.
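For parallel slices, reconstructing the three-dimensional mesh can be as simple as joining corresponding vertices of adjacent contours into triangles. The sketch below assumes every contour has the same number of consistently ordered points; resampling contours to a common point count would be needed first in practice.

```python
import numpy as np

def stack_contours_to_mesh(contours):
    """Rebuild a 3D triangle mesh from a stack of parallel 3D contour
    curves (one per cross-sectional slice) by joining corresponding
    vertices of adjacent slices into two triangles per quad."""
    verts = np.vstack(contours)
    n = len(contours[0])
    tris = []
    for s in range(len(contours) - 1):
        a, b = s * n, (s + 1) * n   # vertex offsets of the two slices
        for i in range(n):
            j = (i + 1) % n         # wrap around the closed contour
            tris.append([a + i, a + j, b + i])
            tris.append([a + j, b + j, b + i])
    return verts, np.array(tris)
```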

[0035] In another embodiment, the step of moving the planning contour manually is replaced by an automated process. Once the three-dimensional treatment image is acquired, the user defines a number of points on the surface of the organ to be segmented.
The number of points should be minimal to conserve time, but should be sufficient to constrain the segmentation process in all three dimensions. For instance, referring to FIG. 4, two cross-sectional cuts 400 and 405 through the organ can be presented to the user. The orientation of the cuts is shown by 410 and 415. The user places four points 420 around the cross-section of the organ 425 in the first cut, and two points 430 around the cross-section of the organ 435 in the second cut.
[0036] Referring to FIGS. 5a and 5b, the set of elements 505 from the reference surface can be mapped to corresponding elements 420 and 430 identified on the treatment image, and the "shift" 510 necessary to move the surface image such that the corresponding elements 505 and the points 420 and 430 are aligned is measured.
The process of detecting and measuring the shift can use any point-to-surface matching technique which, for example, minimizes the square distance between the points 420 and 430 and the surface, including, but not limited to the weighted surface-to-surface mapping technique described in currently-pending, co-owned U.S. Patent Application Serial No.
10/894,392, the entire disclosure of which is incorporated by reference herein. The shift 510 can be applied to the surface image so that it is "close" to the organ of interest in the treatment image. This shifted surface 100' can then be used as an initial estimate to a local segmentation algorithm, as previously described, to find the contour of the lesion or organ at time of treatment.
[0037] In some embodiments, such as a clinical setting where only one image may be presented to the technologist administering the treatment, the user need only select elements in a single cut through the three-dimensional treatment image.
Referring to FIG. 6, the cross-sectional slice 600 shows an image of the lesion 605, and four user "hint points" 610 can be indicated by the user on the lesion surface. If the hint points placed by the user in one plane are not sufficient to shift the planning surface to the correct location in the treatment image as previously described (because, for example, there are no points to constrain the shift in the direction perpendicular to the imaging plane), additional estimate points can be automatically identified in other planes using, for example, local segmentation techniques. These techniques can, in some cases, identify a full three-dimensional representation of the organ. It is unlikely, however, that the resulting contoured representation of the lesion will be suitable, because local segmentation techniques require a good initial estimate, which in general cannot be suitably constructed with hint points in only one plane. The new surface, or partial surface, or series of points, generated by the initial local segmentation technique can, however, be used in a conventional point-to-surface or surface-to-surface matching process to bring the planning surface to an approximately correct position in the treatment image. This shifted planning surface can then, as previously described, be used as an initial estimate for another pass of the local segmentation technique to find a better representation of the organ at time of treatment.
[0038] In some embodiments, once the lesion or organ is contoured as previously described, the elements of the surface can once again be used to move the planning surface to a more accurate location in the treatment image. This process of local segmentation, surface matching, and shifting the planning surface can be repeated for a number of iterations until the planning surface is finally at the "best" (i.e., error-minimized) location.
Once this best location is determined, then a final run of the local segmentation algorithm can be used to identify the edges of the organ.
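The iterate-until-error-minimized loop can be sketched with a toy 2D stand-in, where surface matching is a translation toward user hint points and local segmentation snaps vertices to detected edge points; the convergence test uses a mean-squared-distance threshold. All names, and the snap-to-nearest-edge stand-in for local segmentation, are illustrative assumptions.

```python
import numpy as np

def iterate_to_best(reference, edge_points, hints, tol=1e-6, max_iter=20):
    """Alternate surface matching (translate toward hint points) and local
    segmentation (snap each vertex to the nearest detected edge point)
    until the surface stops moving, i.e. the mean squared per-vertex
    displacement between iterations falls below `tol`."""
    surface = np.asarray(reference, dtype=float)
    edges = np.asarray(edge_points, dtype=float)
    for _ in range(max_iter):
        # surface matching: mean residual to each hint's nearest vertex
        residuals = [h - surface[np.argmin(np.linalg.norm(surface - h, axis=1))]
                     for h in np.asarray(hints, dtype=float)]
        shifted = surface + np.mean(residuals, axis=0)
        # local segmentation: snap each shifted vertex to the nearest edge point
        updated = np.array([edges[np.argmin(np.linalg.norm(edges - v, axis=1))]
                            for v in shifted])
        if np.mean(np.sum((updated - surface) ** 2, axis=1)) < tol:
            return updated
        surface = updated
    return surface
```

In a real system the final pass would be a full run of the local segmentation algorithm from the error-minimized position, rather than a nearest-point snap.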
[0039] One non-limiting example of a segmentation technique used to position the reference surface uses landmarks of the lesion that represent strong and/or well-defined edges of the lesion (the base of a prostate gland, for example). The edge points can be identified during an initial run of the segmentation process as well as after the user has identified one or more hint points. These strong edges can be assigned higher weights than other points, and in one embodiment, any points not in this initial set can be assigned a weight of zero. Similarly, the rotations or cross-sections used to identify the two-dimensional slices on which the estimate points are identified can be limited to certain rotations (e.g., at 45-degree angles to each other) to avoid capturing false and/or weak points on the images.

[0040] In some cases, segments of the contour obtained at time of planning, the contour obtained at time of treatment, and/or any intermediary contours obtained during the iterative process of finding the final treatment surface can be assigned a certainty weight c indicating how certain it is that the segmented point corresponds to a true edge.
In some embodiments, the certainty weight c is a real number, or in some cases limited to either 0 or 1 representing uncertain and certain boundaries, respectively.
Other methods to generate certainty weights include, but are not limited to, using the magnitude of the image gradient at the surface point, the closeness of agreement with another edge-detection technique, or manual identification. The result of the segmentation is a surface of the lesion at the time of treatment planning, S1. The surface can consist of, for example, points, lines, or other surface elements such as triangles (e.g., in a mesh). The second image, obtained at treatment time (using, for example, a three-dimensional ultrasound imaging device), S2, is not segmented; instead, a set (e.g., four) of estimated edge points is identified manually for one or more cross-sections of the image. For ease of representation, surface elements are hereafter referred to as "points."
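The density modification of the certainty indices described below (paragraph [0041]) admits many realizations. One hedged sketch (an illustration of the idea, not the patent's method) divides each certainty index by a count of neighboring elements within a radius, so that densely sampled patches do not dominate the match:

```python
import numpy as np

def density_modified_weights(points, certainties, radius=1.0):
    """Return w_i = c_i / (local element count), a simple proxy for
    down-weighting regions where surface elements are densely packed."""
    pts = np.asarray(points, dtype=float)
    c = np.asarray(certainties, dtype=float)
    # Pairwise distances; each point counts itself, so every count >= 1.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    neighbors = (d < radius).sum(axis=1)
    return c / neighbors
```

With this choice, a tight cluster of three points carries the same total weight as one isolated point of equal certainty.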

Exemplary Surface-Matching Technique

[0041] Various embodiments, such as the one described above, use a surface-matching technique to move the planning surface to an approximately correct location in the treatment image. Thus, techniques that map surface elements in a surface S1 to surface elements in a surface S2 are used. (Surface elements can be points, triangles, line segments, etc.) Any such technique can be used; an exemplary technique is as follows.
The set of surface elements and certainty weights on the segmented surface S1 is referred to as {r_j^(1)} = {x_j^(1), y_j^(1), z_j^(1), c_j^(1)}, and the estimated surface points of the set of two-dimensional cross-sections S2 as {r_i^(2)} = {x_i^(2), y_i^(2), z_i^(2), c_i^(2)}.
The index i runs from 1 to the number of points M in S2, and the index j runs from 1 to the number of points N in S1.
The terms x, y, z represent the three-dimensional positional coordinates of each point, and c refers to the certainty index assigned to that point. Either set of points can be down-sampled to improve the speed of computation by removing excess points. The set of certainty indices of S2 (or, in some cases, S1, or both S1 and S2) may be modified to account for the local density of elements in the case of surface points, or the area of the element in the case of a mesh, so that parts of surfaces with a large density of elements are not overly weighted in the algorithm. As a result, the set of elements on S2 is referred to as {r_i^(2)} = {x_i^(2), y_i^(2), z_i^(2), w_i^(2)}, where w_i^(2) is the modified set of certainty indices c_i^(2).

[0042] The shift r_shift which minimizes the least-square error between points on S1 and S2 is found, taking into account the weights of the points on each surface. One method to determine this shift is to minimize the cost function given by

    C(r_shift) = Sum_i W_i || r_i^(2) - r_close^(1)(r_i^(2) - r_shift, S1) - r_shift ||^2,    (1)

where r_close^(1)(r_i^(2) - r_shift, S1) refers to the point on S1 which is closest to the point r_i^(2) - r_shift, and the weights W_i are defined as

    W_i = w_i^(2) c_close^(1)(r_i^(2) - r_shift, S1),    (2)

where c_close^(1)(r_i^(2) - r_shift, S1) refers to the certainty weight of that closest point on S1.

One suitable method of minimizing the cost of Equation (1) includes the following steps:
1) Set r_shift = (0, 0, 0).
2) Calculate the set of closest points r_close^(1)(r_i^(2) - r_shift, S1) in Eq. (1).
3) Calculate the cost C(r_shift) for this particular shift.
4) Is the cost below a threshold? If yes, stop and accept r_shift as the optimal shift. If no, update r_shift to r_shift + ΔR, where ΔR is an incremental update step.
5) Return to step 2.
The shift vector r_shift is updated by the update step ΔR until the cost function reaches a minimum. The result converges rapidly if ΔR is chosen such that

    ΔR = (1/M) Sum_i W_i [ r_i^(2) - r_shift - r_close^(1)(r_i^(2) - r_shift, S1) ].    (3)
Rotations can also be introduced into the technique.
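Treating all surface elements as points, the translation-only minimization of Eqs. (1) and (2) can be sketched as an ICP-style loop. This is an illustrative reimplementation under stated assumptions (brute-force nearest-neighbor search, a weighted-mean-residual update in the spirit of the incremental step ΔR, convergence on cost change), not the patented code:

```python
import numpy as np

def match_translation(s1_points, s1_weights, s2_points, s2_weights,
                      threshold=1e-6, max_iter=100):
    """Find r_shift minimizing the weighted cost of Eq. (1):
    sum_i W_i ||r_i2 - closest(r_i2 - r_shift, S1) - r_shift||^2,
    with W_i = w_i2 * c_close1 as in Eq. (2)."""
    r_shift = np.zeros(3)
    prev_cost = np.inf
    cost = np.inf
    for _ in range(max_iter):
        shifted = s2_points - r_shift                 # r_i^(2) - r_shift
        # Brute-force closest S1 point for each shifted S2 point.
        d = np.linalg.norm(shifted[:, None, :] - s1_points[None, :, :], axis=2)
        j = np.argmin(d, axis=1)
        # Combined weights W_i = w_i^(2) times c^(1) of the closest point.
        W = s2_weights * s1_weights[j]
        residuals = shifted - s1_points[j]
        cost = float(np.sum(W * np.sum(residuals**2, axis=1)))
        if abs(prev_cost - cost) < threshold:         # converged
            break
        prev_cost = cost
        # Incremental update: step by the weighted mean residual.
        r_shift = r_shift + (W[:, None] * residuals).sum(axis=0) / W.sum()
    return r_shift, cost
```

Down-sampling either point set, or replacing the brute-force search with a k-d tree, would speed this up for clinical surface sizes; introducing rotations would extend r_shift to a full rigid transform.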
[0043] FIG. 7 illustrates various embodiments of a method of determining an appropriate adjustment to improve the delivery of radiation treatment to a patient. As described above, the process is typically divided into two phases: a treatment planning stage (corresponding to the steps to the left of the dashed line 700), during which an oncologist or similarly trained physician prepares a treatment plan for the administration of radiation to a cancerous lesion; and a treatment delivery stage (corresponding to the steps to the right of the dashed line 700) during which a radiology technician positions the patient within the gantry, makes any adjustments to the positioning based on lesion morphing or shifting, and administers the radiation according to the treatment plan. The treatment planning phase can occur substantially before the treatment delivery phase, or in some cases immediately preceding the treatment delivery phase, and may take place in the same room, or in some cases different rooms. As the time span increases between the phases, the target lesion has a greater opportunity to grow, change in morphology, and change its positioning with respect to surrounding normal tissue and healthy organs, resulting in a need for positional compensation.
[0044] As an initial step, a first image of the lesion and surrounding area is obtained (step 705) using any of the imaging modalities described above. In some instances, the image may not be a complete representation of the target lesion or surrounding organs, whereas in some instances the image can be a very accurate representation of the target area. From this first image, a segmented and/or contoured image consisting of a set of surface elements is generated (step 710) representing one or more surface elements of the lesion and surrounding organs. The set of surface elements (collectively the "reference surface") may include the entire three-dimensional surface, or in some cases where only a portion of the lesion has been accurately imaged, may only describe a portion of the lesion. In some embodiments, one or more of the surface elements is weighted based on one or in some cases a combination of the factors described above. In some embodiments, the reference surface can be contoured manually by a physician on a series of axial CT
slices, and the parallel contours integrated together to form a surface. The physician may also use one or more registered and fused datasets to show additional information, such as MRI, PET or 3D ultrasound. Automatic segmentation techniques may also aid the physician in segmenting the organ or lesion of interest.
[0045] Subsequent to obtaining the treatment planning image, and in anticipation of a treatment delivery session, a second image of the target area is obtained (step 720), using, for example, a three-dimensional ultrasound imaging device. One or more two-dimensional image planes are obtained by viewing cross-sections of the three-dimensional ultrasound image at various angles or positions relative to a common axis. A
user then indicates estimates of edge points within one or more of the two-dimensional images. Any number of edge points may be identified, but in general, using four points for each two-dimensional image provides sufficient data to initiate the matching process with the reference surface image. Based on the second image, initial estimates of edge points are identified (step 725) and an axis of rotation is created in the plane of the identified points.
Therefore, an approximate initial estimate of the organ surface, or of a subset of points on the organ surface, is generated using minimal user input (e.g., four hint points on only one plane). In some cases, where an automatic knowledge-based technique is used to make assumptions about anatomical landmarks (e.g., the bladder is anterior to the prostate, which is itself anterior to the rectum), it is possible to dispense with hint points altogether. In some embodiments, multiple two-dimensional cross-sections can be obtained by rotating about the axis of rotation identified in the first image, thereby facilitating identification of additional edge points. The edge point estimates are used as input to a segmentation process which generates the surface or surface elements of the "treatment"
surface (step 730). The initial or reference surface can be shifted, rotated and/or warped (step 735) such that the mapped reference and treatment surface elements are then aligned.
[0046] In some cases, and at least for one iteration, the shifted reference surface can be used as an estimate in subsequent iterations for segmentation (step 740). An error can then be computed using, for example, a least-squares fit or other similar convergence calculation approach that measures the misalignment of the reference surface with the new treatment surface (step 745) estimates. If the three-dimensional surface is properly placed (e.g., a convergence decision threshold is met in step 750) the process ends, whereas if the error is unacceptable, the process repeats. The number of iterations can be predefined at a fixed value, which is known to converge for typical clinical images, or can be defined, for example, by comparing the amount by which the reference surface is shifted between each iteration and terminating the process when the shift falls below a given threshold.
[0047] Although the invention is described as relating to a single surface at time of planning and at time of treatment, it may also relate to multiple lesions and/or organs, some of which are to be treated and some of which are to be avoided.
[0048] In step 745, after the three-dimensional treatment surface is found, the plan may be updated to account for the change in location, shape, and rotation of the lesion.
For example, a completely new plan can be developed, involving changing the beam angles, collimator angles, couch position, dose weights, etc. using inverse planning algorithms. In some embodiments, the shift can then be translated into a set of displacements for a patient support device, such as a treatment table of the LINAC, or the patient, just prior to or during treatment delivery. For example, a shift of (-4, +8) may translate into moving the treatment table 4 millimeters to the left and 8 millimeters up with respect to the gantry and radiation beam. The shift can be calculated by finding the centroid of the current and reference surfaces and shifting the treatment couch by that amount and/or by implementing a final surface-matching algorithm, with or without associated weights as described above, to calculate a couch displacement.
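As a hedged sketch of the centroid variant just described (assuming point-cloud surfaces in a common image coordinate system; real couch corrections involve vendor-specific axes and sign conventions):

```python
import numpy as np

def centroid_couch_shift(reference_surface, treatment_surface):
    """Displacement of the lesion centroid between planning and treatment,
    returned as the opposing couch move that restores the planned geometry."""
    ref = np.asarray(reference_surface, dtype=float)
    cur = np.asarray(treatment_surface, dtype=float)
    lesion_shift = cur.mean(axis=0) - ref.mean(axis=0)
    return -lesion_shift   # couch moves opposite to the lesion's motion
```

A final surface-matching pass, with or without certainty weights, could refine this centroid-based estimate as noted above.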
Other shifts may require rotating the gantry, moving the patient, or some combination thereof.

[0049] Alternatively, a technician may use simulation techniques to directly manipulate the patient or patient support device while viewing real-time images of the target area on a screen or monitor. In one embodiment, a technician can adjust the treatment table position with respect to the LINAC, by manipulating an input device such as a joystick, keypad, or other input device, until a desired number of surface elements from the second image overlap corresponding elements in the first image (or vice versa), or until a threshold value is reached indicating sufficient overlap. In another embodiment, the technician manually adjusts the position of the patient on a stationary treatment table until the desired position is reached. In some cases, the technician may employ a combination of both programmatic adjustments based on pre-determined alignment displacements and manual patient positioning techniques.
[0050] FIG. 8 schematically represents a hardware embodiment of the invention realized as a system 800 for determining the changes to a lesion in anticipation of the administration of radiation therapy. The system 800 comprises a register 805, a mapping module 810, and a processor 815, and may optionally include a weighting module 812.
[0051] The register 805, which may be any known organized data storage facility (e.g., partitions in RAM, etc.), receives images from an imager 820 such as an MRI, CT/PET scanner, ultrasound device, or x-ray device. The register 805 receives a first image from the imager 820 during or subsequent to a treatment planning session characterizing the target region at the time the treatment plan is determined.
The register 805 receives a second image from the imager 820 during or just previous to a treatment delivery session characterizing the target region at the time of treatment.
The imaging modalities used during the planning and the treatment stages can, in some embodiments, be different. In some embodiments, the images can be stored on a data storage device separate from the imager (e.g., a database, microfiche, etc.) and sent to the system 800.
The register 805 may receive the images and beam shapes through conventional data ports and may also include circuitry for receiving analog image data and analog-to-digital conversion circuitry for digitizing the image data.
[0052] The register 805 then determines a set of surface elements from the first image either programmatically, or based on some input from the user. The register optionally provides the image to the weighting module 812, which facilitates the assignment of weights to one or more surface elements generated from the first image.
The surface elements and weights can be determined programmatically, manually, or both.

Where manual input and manipulation is used, the system 800 receives instructions from a user via an input device 830 such as a keyboard, a mouse, or other pointing device.
Results of the weighting, manipulations, and alignments can also be viewed using a display device 840 such as a computer display screen or hand-held device. The set of surface elements from the first image, their associated weights, and the second image are then sent to the processor 815 which, based on the proximity of the surface elements (and optionally the assigned weights) of the first image and the estimated edge points identified on the second image, shifts the first image to compensate for the displacement of the lesion as described above. In some embodiments, the processor 815 translates displacements into instructions representing physical movements of a patient support device 850 and sends the instructions to the device 850 in order to adjust the position of the patient in accordance with the alignment calculations.
[0053] In some embodiments, the register 805, mapping module 810, weighting module 812 and processor 815 may implement the functionality of the present invention in hardware or software, or a combination of both on a general-purpose computer.
In addition, such a program may set aside portions of a computer's random access memory to provide control logic that effects one or more of the image manipulation, mapping, alignment, and support device control functions. In such an embodiment, the program may be written in any one of a number of high-level languages, such as FORTRAN, PASCAL, C, C++, C#, Java, Tcl, or BASIC. Further, the program can be written in a script, macro, or functionality embedded in commercially available software, such as EXCEL or VISUAL BASIC. Additionally, the software could be implemented in an assembly language directed to a microprocessor resident on a computer. For example, the software can be implemented in Intel 80x86 assembly language if it is configured to run on an IBM PC or PC clone. The software may be embedded on an article of manufacture including, but not limited to, "computer-readable program means" such as a floppy disk, a hard disk, an optical disk, a magnetic tape, a PROM, an EPROM, or CD-ROM.
[0054] The invention herein is primarily described in relation to external beam radiotherapy, but can be used in any application where a reference surface is generated at one time and a semi-automatic or fully-automatic surface is acquired on the same subject at a later time. For example, in brachytherapy applications a physician often contours a lesion to develop a preplan, deciding on how many radioactive seeds to purchase, etc. In the operating room, the lesion or organ will require a new segmentation to develop a new plan, accounting for the new location and/or shape of the organ. The systems and methods described herein can be used to segment the lesion in the operating room, using the reference surface contoured during the preplanning stage.
[0055] While the invention has been particularly shown and described with reference to specific embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. The scope of the invention is thus indicated by the appended claims, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (38)

1. A method for obtaining a contoured surface of a lesion within a patient, the method comprising the steps of:
(a) providing a contoured reference surface of the lesion based on one or more images acquired at a first time;
(b) generating a set of operational images of the lesion at a second time;
(c) shifting the reference surface to an estimated location in the operational images;
(d) segmenting the operational images based at least in part on the shifted reference surface; and (e) determining a contoured surface of the lesion at the second time based on the segmented images.
2. The method of claim 1 wherein the shifting comprises rotational movement.
3. The method of claim 1 wherein the shifting comprises translational movement.
4. The method of claim 1 wherein the shifting comprises scaling.
5. The method of claim 1 wherein the shifting comprises movement in three dimensions.
6. The method of claim 1 wherein the reference surface is created using one or more CT images, three-dimensional ultrasound images, two-dimensional ultrasound images, PET images and MRI images.
7. The method of claim 1 wherein the operational images are obtained using one of a CT scanner, an ultrasound device, a PET scanner and an MRI.
8. The method of claim 1 further comprising manually locating the reference surface with respect to the operational images.
9. The method of claim 1 further comprising indicating points within the operational images and mapping the points to surface elements on the reference surface to determine the shift.
10. The method of claim 9 wherein the surface elements on the reference surface comprise at least one of the group consisting of points, triangles, and lines.
11. The method of claim 9 further comprising minimizing the distance between the indicated points and mapped surface elements.
12. The method of claim 9 wherein the indicated points are in multiple planes.
13. The method of claim 9 wherein the indicated points are within one two-dimensional plane.
14. The method of claim 13 further comprising obtaining additional points from the contoured surface and repeating steps (c) through (d), substituting the shifted reference surface for the segmented operational images.
15. The method of claim 1 further comprising obtaining two-dimensional cross-sectional curves from the contoured reference surface.
16. The method of claim 15 wherein the two-dimensional cross-sectional curves are obtained from the shifted contoured reference surface.
17. The method of claim 15 wherein the two-dimensional cross-sectional curves are radially offset from each other.
18. The method of claim 15 wherein the two-dimensional cross-sectional curves are parallel to each other.
19. The method of claim 9 wherein the surface elements on the reference surface represent an edge boundary.
20. The method of claim 9 wherein weights are assigned to the surface elements on the reference surface and the weights are based at least in part on a clinical importance of an anatomical feature represented by the one or more surface elements.
21. The method of claim 20 wherein the clinical importance of the one or more surface elements is based at least in part on the proximity of an anatomical feature represented by the surface element to another anatomical structure of the patient.
22. The method of claim 9 wherein weights are assigned to one or more of the surface elements and the weights are based at least in part on a density of the one or more surface elements within an area of the segmented image.
23. The method of claim 9 wherein the mapping is determined by minimizing the mean square distance between surface elements in the reference surface and the indicated points.
24. The method of claim 1 wherein the first time is substantially proximate to a treatment planning session.
25. The method of claim 1 wherein the second time is substantially proximate to a treatment session.
26. A system for obtaining a contoured image of a lesion, the system comprising:
(a) a register for storing (i) a reference surface of the lesion based on one or more images acquired at a first time, (ii) a set of images of the lesion taken at a second time and (iii) estimated surface elements on at least one of the set of images;
(b) a mapping module for mapping surface elements on the reference surface to the estimated surface elements on at least one of the set of images; and (c) a processor for shifting the reference surface based on the matched surface elements.
27. The system of claim 26 wherein the reference surface is based on images obtained using one of a CT scanner, a three-dimensional ultrasound device, a PET
scanner and an MRI.
28. The system of claim 26 wherein the surface elements on the reference surface comprise at least one of the group consisting of points, triangles, and lines.
29. The system of claim 26 wherein the estimated surface elements on one of the set of images comprise at least one of the group consisting of points, triangles, and lines.
30. The system of claim 26 wherein the register further obtains two-dimensional cross-sectional curves from the shifted reference surface.
31. The system of claim 30 wherein the two-dimensional cross-sectional curves are radially offset from each other.
32. The system of claim 30 wherein the two-dimensional cross-sectional curves are parallel to each other.
33. The system of claim 26 wherein the mapping module and processor iteratively map surface elements on the reference surface to surface elements on the shifted reference surface.
34. The system of claim 26 further comprising a weighting module for assigning weights to the surface elements on the segmented image.
35. The system of claim 26 wherein the mapping is determined by minimizing the mean square distance between surface elements on the reference surface and the one or more estimated surface elements on one of the set of images.
36. The system of claim 26 wherein the first time is substantially proximate to a treatment planning session.
37. The system of claim 26 wherein the second time is substantially proximate to a treatment session.
38. An article of manufacture having computer-readable program portions embodied thereon for determining the changes to a lesion for the purpose of administering radiation treatment to the lesion, the article comprising computer-readable instructions for:
(a) providing a contoured surface of the lesion based on one or more images acquired at a first time as a reference surface;
(b) generating a set of images of the lesion at a second time as operational images;
(c) shifting the reference surface to an estimated location in the operational images;
(d) segmenting the operational images based at least in part on the shifted reference surface; and (e) determining a contoured surface map of the lesion at the second time based on the segmented images.
CA2634490A 2005-12-20 2006-12-19 Methods and systems for segmentation and surface matching Active CA2634490C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US11/313,236 2005-12-20
US11/313,236 US8929621B2 (en) 2005-12-20 2005-12-20 Methods and systems for segmentation and surface matching
PCT/CA2006/002099 WO2007071050A1 (en) 2005-12-20 2006-12-19 Methods and systems for segmentation and surface matching

Publications (2)

Publication Number Publication Date
CA2634490A1 true CA2634490A1 (en) 2007-06-28
CA2634490C CA2634490C (en) 2016-12-06

Family

ID=38188228

Family Applications (1)

Application Number Title Priority Date Filing Date
CA2634490A Active CA2634490C (en) 2005-12-20 2006-12-19 Methods and systems for segmentation and surface matching

Country Status (3)

Country Link
US (1) US8929621B2 (en)
CA (1) CA2634490C (en)
WO (1) WO2007071050A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11282204B2 (en) * 2017-08-03 2022-03-22 Shantou Institute Of Ultrasonic Instruments Co., Ltd. Simulated and measured data-based multi-target three-dimensional ultrasound image segmentation method

Families Citing this family (49)

Publication number Priority date Publication date Assignee Title
US20040238732A1 (en) * 2001-10-19 2004-12-02 Andrei State Methods and systems for dynamic virtual convergence and head mountable display
GB2455926B (en) * 2006-01-30 2010-09-01 Axellis Ltd Method of preparing a medical restraint
US20110057930A1 (en) * 2006-07-26 2011-03-10 Inneroptic Technology Inc. System and method of using high-speed, high-resolution depth extraction to provide three-dimensional imagery for endoscopy
US7848592B2 (en) * 2006-07-31 2010-12-07 Carestream Health, Inc. Image fusion for radiation therapy
US7728868B2 (en) 2006-08-02 2010-06-01 Inneroptic Technology, Inc. System and method of providing real-time dynamic imagery of a medical procedure site using multiple modalities
EP2081494B1 (en) * 2006-11-16 2018-07-11 Vanderbilt University System and method of compensating for organ deformation
US20080146914A1 (en) * 2006-12-19 2008-06-19 General Electric Company System, method and apparatus for cancer imaging
US7995813B2 (en) * 2007-04-12 2011-08-09 Varian Medical Systems, Inc. Reducing variation in radiation treatment therapy planning
US20080262524A1 (en) * 2007-04-19 2008-10-23 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Systems and methods for closing of fascia
US20080262390A1 (en) * 2007-04-19 2008-10-23 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Fiducials for placement of tissue closures
US8098909B2 (en) * 2007-08-31 2012-01-17 Computerized Medical Systems, Inc. Method and apparatus for efficient three-dimensional contouring of medical images
US8160345B2 (en) 2008-04-30 2012-04-17 Otismed Corporation System and method for image segmentation in generating computer models of a joint to undergo arthroplasty
US20090324041A1 (en) * 2008-01-23 2009-12-31 Eigen, Llc Apparatus for real-time 3d biopsy
WO2009094646A2 (en) * 2008-01-24 2009-07-30 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for image guided ablation
US8265356B2 (en) * 2008-01-30 2012-09-11 Computerized Medical Systems, Inc. Method and apparatus for efficient automated re-contouring of four-dimensional medical imagery using surface displacement fields
US8340379B2 (en) 2008-03-07 2012-12-25 Inneroptic Technology, Inc. Systems and methods for displaying guidance data based on updated deformable imaging data
EP2260468B1 (en) * 2008-04-04 2015-06-10 Koninklijke Philips N.V. Simultaneous model-based segmentation of objects satisfying pre-defined spatial relationships
CN102099833B (en) * 2008-04-07 2016-08-10 皇家飞利浦电子股份有限公司 Mesh collision is avoided
WO2010005973A2 (en) * 2008-07-07 2010-01-14 The Johns Hopkins University Automated surface-based anatomical analysis based on atlas-based segmentation of medical imaging
US8554307B2 (en) 2010-04-12 2013-10-08 Inneroptic Technology, Inc. Image annotation in image-guided medical procedures
US8641621B2 (en) 2009-02-17 2014-02-04 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
US8690776B2 (en) 2009-02-17 2014-04-08 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image guided surgery
US11464578B2 (en) 2009-02-17 2022-10-11 Inneroptic Technology, Inc. Systems, methods, apparatuses, and computer-readable media for image management in image-guided medical procedures
WO2011014687A2 (en) * 2009-07-31 2011-02-03 Inneroptic Technology, Inc. Dual-tube stereoscope
US20110082351A1 (en) * 2009-10-07 2011-04-07 Inneroptic Technology, Inc. Representing measurement information during a medical procedure
US9282947B2 (en) 2009-12-01 2016-03-15 Inneroptic Technology, Inc. Imager focusing based on intraoperative data
JP2013530028A (en) 2010-05-04 2013-07-25 パスファインダー セラピューティクス,インコーポレイテッド System and method for abdominal surface matching using pseudo features
US20120259224A1 (en) * 2011-04-08 2012-10-11 Mon-Ju Wu Ultrasound Machine for Improved Longitudinal Tissue Analysis
US8798342B2 (en) * 2011-05-10 2014-08-05 General Electric Company Method and system for ultrasound imaging with cross-plane images
US8867806B2 (en) 2011-08-01 2014-10-21 Impac Medical Systems, Inc. Method and apparatus for correction of errors in surfaces
KR101916855B1 (en) * 2011-10-17 2019-01-25 삼성전자주식회사 Apparatus and method for correcting lesion in image frame
WO2013116240A1 (en) 2012-01-30 2013-08-08 Inneroptic Technology, Inc. Multiple medical device guidance
WO2014097103A1 (en) * 2012-12-17 2014-06-26 Koninklijke Philips N.V. Segmentation of breast lesions in 3d ultrasound images
US10314559B2 (en) 2013-03-14 2019-06-11 Inneroptic Technology, Inc. Medical device guidance
US10912571B2 (en) 2013-03-15 2021-02-09 Howmedica Osteonics Corporation Generation of a mating surface model for patient specific cutting guide based on anatomical model segmentation
US9401017B2 (en) * 2014-05-05 2016-07-26 Wisconsin Alumni Research Foundation Systems and methods for semi-automated segmentation of medical images
US9808213B2 (en) * 2014-08-11 2017-11-07 Canon Kabushiki Kaisha Image processing apparatus, image processing method, medical image diagnostic system, and storage medium
US9901406B2 (en) 2014-10-02 2018-02-27 Inneroptic Technology, Inc. Affected region display associated with a medical device
RU2671513C1 (en) * 2014-10-27 2018-11-01 Электа, Инк. Visual guidance for radiation therapy
US10188467B2 (en) 2014-12-12 2019-01-29 Inneroptic Technology, Inc. Surgical guidance intersection display
US9949700B2 (en) 2015-07-22 2018-04-24 Inneroptic Technology, Inc. Medical device approaches
US9675319B1 (en) 2016-02-17 2017-06-13 Inneroptic Technology, Inc. Loupe display
CA3025777A1 (en) * 2016-06-17 2017-12-21 Children's National Medical Center Medical anatomy quantification: computer-aided diagnosis tool
US10278778B2 (en) 2016-10-27 2019-05-07 Inneroptic Technology, Inc. Medical device navigation using a virtual 3D space
EP3398551A1 (en) 2017-05-03 2018-11-07 Stryker European Holdings I, LLC Methods of pose estimation of three-dimensional bone models in surgical planning a total ankle replacement
US11259879B2 (en) 2017-08-01 2022-03-01 Inneroptic Technology, Inc. Selective transparency to assist medical device navigation
US11484365B2 (en) 2018-01-23 2022-11-01 Inneroptic Technology, Inc. Medical image guidance
US10307209B1 (en) * 2018-08-07 2019-06-04 Sony Corporation Boundary localization of an internal organ of a subject for providing assistance during surgery
IT201800009938A1 (en) * 2018-10-31 2020-05-01 Medics Srl METHOD AND APPARATUS FOR THE THREE-DIMENSIONAL REPRODUCTION OF ANATOMICAL ORGANS FOR DIAGNOSTIC AND / OR SURGICAL PURPOSES

Family Cites Families (78)

Publication number Priority date Publication date Assignee Title
US3082322A (en) 1958-11-28 1963-03-19 Westinghouse Electric Corp Therapy unit
US3991310A (en) 1970-08-03 1976-11-09 Morrison Richard A Biplane radiographic localization of target center for radiotherapy
US3777124A (en) 1970-11-27 1973-12-04 Varian Associates Computer assisted radiation therapy machine
US3987281A (en) 1974-07-29 1976-10-19 The United States Of America As Represented By The Department Of Health, Education And Welfare Method of radiation therapy treatment planning
GB1572347A (en) 1976-03-30 1980-07-30 Emi Ltd Radiographic apparatus
US4618978A (en) 1983-10-21 1986-10-21 Cosman Eric R Means for localizing target coordinates in a body relative to a guidance system reference frame in any arbitrary plane as viewed by a tomographic image through the body
DE3844716C2 (en) 1987-08-24 2001-02-22 Mitsubishi Electric Corp Ionised particle beam therapy device
JPS6472736A (en) 1987-09-14 1989-03-17 Toshiba Corp MRI apparatus
EP0319885B1 (en) 1987-12-11 1994-11-02 Varian International AG. Therapy simulator
FR2637189A1 (en) 1988-10-04 1990-04-06 Cgr Mev System and method for measuring and/or verifying the position of a patient in radiotherapy equipment
US5099846A (en) 1988-12-23 1992-03-31 Hardy Tyrone L Method and apparatus for video presentation from a variety of scanner imaging sources
US5117829A (en) 1989-03-31 1992-06-02 Loma Linda University Medical Center Patient alignment system and procedure for radiation treatment
US5222499A (en) 1989-11-15 1993-06-29 Allen George S Method and apparatus for imaging the anatomy
US5107839A (en) 1990-05-04 1992-04-28 Pavel V. Houdek Computer controlled stereotaxic radiotherapy system and method
US5295483A (en) 1990-05-11 1994-03-22 Christopher Nowacki Locating target in human body
US5207223A (en) 1990-10-19 1993-05-04 Accuray, Inc. Apparatus for and method of performing stereotaxic surgery
US6405072B1 (en) 1991-01-28 2002-06-11 Sherwood Services Ag Apparatus and method for determining a location of an anatomical target with reference to a medical apparatus
US5291889A (en) 1991-05-23 1994-03-08 Vanguard Imaging Ltd. Apparatus and method for spatially positioning images
US5715166A (en) 1992-03-02 1998-02-03 General Motors Corporation Apparatus for the registration of three-dimensional shapes
US5317616A (en) 1992-03-19 1994-05-31 Wisconsin Alumni Research Foundation Method and apparatus for radiation therapy
US5301674A (en) 1992-03-27 1994-04-12 Diasonics, Inc. Method and apparatus for focusing transmission and reception of ultrasonic beams
US6122341A (en) 1992-06-12 2000-09-19 Butler; William E. System for determining target positions in the body observed in CT image data
FR2694881B1 (en) 1992-07-31 1996-09-06 Univ Joseph Fourier Method for determining the position of an organ
US5391139A (en) 1992-09-03 1995-02-21 William Beaumont Hospital Real time radiation treatment planning system
US5379642A (en) 1993-07-19 1995-01-10 Diasonics Ultrasound, Inc. Method and apparatus for performing imaging
US5411026A (en) 1993-10-08 1995-05-02 Nomos Corporation Method and apparatus for lesion position verification
US5531227A (en) 1994-01-28 1996-07-02 Schneider Medical Technologies, Inc. Imaging device and method
DE69529857T2 (en) 1994-03-25 2004-01-08 Kabushiki Kaisha Toshiba, Kawasaki Radiotherapy System
US5531520A (en) * 1994-09-01 1996-07-02 Massachusetts Institute Of Technology System and method of registration of three-dimensional data sets including anatomical body data
US5609485A (en) 1994-10-03 1997-03-11 Medsim, Ltd. Medical reproduction system
US6978166B2 (en) 1994-10-07 2005-12-20 Saint Louis University System for use in displaying images of a body part
DE69530355D1 (en) 1994-11-28 2003-05-22 Ohio State University Columbus Medical intervention device
US5511549A (en) 1995-02-13 1996-04-30 Loma Linda Medical Center Normalizing and calibrating therapeutic radiation delivery systems
US6019724A (en) 1995-02-22 2000-02-01 Gronningsaeter; Aage Method for ultrasound guidance during clinical procedures
US6345114B1 (en) 1995-06-14 2002-02-05 Wisconsin Alumni Research Foundation Method and apparatus for calibration of radiation therapy equipment and verification of radiation treatment
US5810007A (en) 1995-07-26 1998-09-22 Associates Of The Joint Center For Radiation Therapy, Inc. Ultrasound localization and image fusion for the treatment of prostate cancer
US6256529B1 (en) 1995-07-26 2001-07-03 Burdette Medical Systems, Inc. Virtual reality 3D visualization for surgical procedures
US5673300A (en) 1996-06-11 1997-09-30 Wisconsin Alumni Research Foundation Method of registering a radiation treatment plan to a patient
US6009212A (en) 1996-07-10 1999-12-28 Washington University Method and apparatus for image registration
CA2288177C (en) 1997-04-25 2006-07-18 Kiwamu Kase Method of determining shape error of free-form surface
WO1999006644A1 (en) 1997-07-28 1999-02-11 Josu Corporation Pty. Ltd. Conduit fitting for termites
JP3054108B2 (en) 1997-08-15 2000-06-19 理化学研究所 Free-form surface measurement data synthesis method
US6636622B2 (en) 1997-10-15 2003-10-21 Wisconsin Alumni Research Foundation Method and apparatus for calibration of radiation therapy equipment and verification of radiation treatment
US6129670A (en) 1997-11-24 2000-10-10 Burdette Medical Systems Real time brachytherapy spatial registration and visualization system
US6201888B1 (en) 1998-02-18 2001-03-13 International Business Machines Corporation System and method for restoring, describing and graphically displaying noise-corrupted boundaries in tomography images
US6012458A (en) * 1998-03-20 2000-01-11 Mo; Larry Y. L. Method and apparatus for tracking scan plane motion in free-hand three-dimensional ultrasound scanning using adaptive speckle correlation
IL141203A0 (en) 1998-08-06 2002-02-10 Wisconsin Alumni Res Found Radiotherapy verification system
US6560311B1 (en) 1998-08-06 2003-05-06 Wisconsin Alumni Research Foundation Method for preparing a radiation therapy plan
ATE324930T1 (en) 1998-08-06 2006-06-15 Wisconsin Alumni Res Found SYSTEM FOR ADJUSTING RADIATION DELIVERY FOR RADIATION THERAPY
US6117081A (en) 1998-10-01 2000-09-12 Atl Ultrasound, Inc. Method for correcting blurring of spatially compounded ultrasonic diagnostic images
US6754374B1 (en) * 1998-12-16 2004-06-22 Surgical Navigation Technologies, Inc. Method and apparatus for processing images with regions representing target objects
US6285805B1 (en) 1999-01-25 2001-09-04 International Business Machines Corp. System and method for finding the distance from a moving query point to the closest point on one or more convex or non-convex shapes
US6591127B1 (en) 1999-03-15 2003-07-08 General Electric Company Integrated multi-modality imaging system and method
CA2377190A1 (en) 1999-07-23 2001-02-01 University Of Florida Ultrasonic guidance of target structures for medical procedures
US6778690B1 (en) * 1999-08-13 2004-08-17 Hanif M. Ladak Prostate boundary segmentation from 2D and 3D ultrasound images
US6379302B1 (en) 1999-10-28 2002-04-30 Surgical Navigation Technologies Inc. Navigation information overlay onto ultrasound imagery
DE19953177A1 (en) 1999-11-04 2001-06-21 Brainlab Ag Method to position patient exactly for radiation therapy or surgery; involves comparing positions in landmarks in X-ray image and reconstructed image date, to determine positioning errors
US6546073B1 (en) 1999-11-05 2003-04-08 Georgia Tech Research Corporation Systems and methods for global optimization of treatment planning for external beam radiation therapy
DE10015815A1 (en) 2000-03-30 2001-10-11 Siemens Ag Image data set generating system for medical diagnostics - superimposes or merges image data obtained from X-ray and ultrasound systems, whose position was determined using navigation system
DE10015826A1 (en) 2000-03-30 2001-10-11 Siemens Ag Image generating system for medical surgery
US6869390B2 (en) 2000-06-05 2005-03-22 Mentor Corporation Automated implantation system for radioisotope seeds
CA2314794A1 (en) 2000-08-01 2002-02-01 Dimitre Hristov Apparatus for lesion or organ localization
US6628983B1 (en) 2000-10-25 2003-09-30 Koninklijke Philips Electronics N.V. Nuclear imaging systems and methods with feature-enhanced transmission imaging
US6909794B2 (en) 2000-11-22 2005-06-21 R2 Technology, Inc. Automated registration of 3-D medical scans of similar anatomical structures
US6844884B2 (en) 2000-12-27 2005-01-18 Ge Medical Systems Global Technology Company, Llc Multi-plane graphic prescription interface and method
US6661870B2 (en) 2001-03-09 2003-12-09 Tomotherapy Incorporated Fluence adjustment for improving delivery to voxels without reoptimization
ES2194809T3 (en) 2001-05-22 2003-12-01 Brainlab Ag Device for recording radiographic images with a medical navigation system
US6535574B1 (en) 2001-11-01 2003-03-18 Siemens Medical Solutions Usa, Inc. Patient positioning system employing surface photogrammetry and portal imaging
WO2003039370A1 (en) 2001-11-05 2003-05-15 Computerized Medical Systems, Inc. Apparatus and method for registration, guidance, and targeting of external beam radiation therapy
US8406844B2 (en) 2002-03-06 2013-03-26 Tomotherapy Incorporated Method for modification of radiotherapy treatment delivery
AUPS205202A0 (en) * 2002-05-02 2002-06-06 Flinders Technologies Pty Ltd A method and system for computer aided detection of cancer
US7333644B2 (en) * 2003-03-11 2008-02-19 Siemens Medical Solutions Usa, Inc. Systems and methods for providing automatic 3D lesion segmentation and measurements
US7343030B2 (en) * 2003-08-05 2008-03-11 Imquant, Inc. Dynamic tumor treatment system
US7103399B2 (en) * 2003-09-08 2006-09-05 Vanderbilt University Apparatus and methods of cortical surface registration and deformation tracking for patient-to-image alignment in relation to image-guided surgery
WO2005106773A2 (en) 2004-04-15 2005-11-10 Edda Technology, Inc. Spatial-temporal lesion detection, segmentation, and diagnostic information extraction system and method
US7430321B2 (en) * 2004-09-09 2008-09-30 Siemens Medical Solutions Usa, Inc. System and method for volumetric tumor segmentation using joint space-intensity likelihood ratio test
US7736313B2 (en) * 2004-11-22 2010-06-15 Carestream Health, Inc. Detecting and classifying lesions in ultrasound images
WO2006114003A1 (en) * 2005-04-27 2006-11-02 The Governors Of The University Of Alberta A method and system for automatic detection and segmentation of tumors and associated edema (swelling) in magnetic resonance (mri) images

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11282204B2 (en) * 2017-08-03 2022-03-22 Shantou Institute Of Ultrasonic Instruments Co., Ltd. Simulated and measured data-based multi-target three-dimensional ultrasound image segmentation method

Also Published As

Publication number Publication date
US8929621B2 (en) 2015-01-06
WO2007071050A1 (en) 2007-06-28
CA2634490C (en) 2016-12-06
US20070167699A1 (en) 2007-07-19

Similar Documents

Publication Publication Date Title
CA2634490C (en) Methods and systems for segmentation and surface matching
US7672705B2 (en) Weighted surface-to-surface mapping
US10300305B2 (en) Image guidance for radiation therapy
US9014446B2 (en) Efficient user interaction with polygonal meshes for medical image segmentation
US11097127B2 (en) Heart tissue surface contour-based radiosurgical treatment planning
EP1778351B1 (en) Verifying lesion characteristics using beam shapes
EP2175931B1 (en) Systems for compensating for changes in anatomy of radiotherapy patients
US8457372B2 (en) Subtraction of a segmented anatomical feature from an acquired image
Xing et al. Computational challenges for image-guided radiation therapy: framework and current research
US20120280135A1 (en) Use of collection of plans to develop new optimization objectives
WO2012069965A1 (en) Interactive deformation map corrections
US8233686B2 (en) Methods and systems for locating objects embedded in a body

Legal Events

Date Code Title Description
EEER Examination request