WO2016201637A1 - Guided ultrasound breast cancer screening system - Google Patents
- Publication number: WO2016201637A1
- Application number: PCT/CN2015/081636
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- patient
- ultrasound
- guidance
- computing device
- scan
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0825—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the breast, e.g. mammography
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
- A61B8/4263—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient using sensors not mounted on the probe, e.g. mounted on an external reference frame
Definitions
- Projector 120 and camera 130 are positioned at a predetermined distance from the surface of table 100.
- optical markers 150 are placed at predetermined locations on the body of the patient, and camera 130 captures images of the position of the body of the patient on table 100.
- Computing device 110 determines a zone of interest to be scanned during the ultrasound breast cancer screening based on the locations of optical markers 150, and controls projector 120 to project guidance onto the body of the patient to guide a clinician during performance of the ultrasound breast cancer screening.
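The zone-of-interest determination described above can be sketched as follows. This is a minimal illustration assuming markers are detected as 2-D pixel coordinates in the camera image and the zone is a simple bounding box with a fixed margin; the function name and margin value are assumptions, not details from the disclosure:

```python
# Illustrative sketch: derive a rectangular zone of interest from the
# detected positions of optical markers 150 (pixel coordinates in the
# camera image). The bounding-box model and margin are assumptions.

def zone_of_interest(markers, margin=40):
    """Bounding box around the markers, expanded by a fixed margin.

    markers: list of (x, y) pixel coordinates.
    Returns (x_min, y_min, x_max, y_max) of the zone to be scanned.
    """
    xs = [x for x, _ in markers]
    ys = [y for _, y in markers]
    return (min(xs) - margin, min(ys) - margin,
            max(xs) + margin, max(ys) + margin)

# e.g. zone_of_interest([(100, 200), (300, 210)]) -> (60, 160, 340, 250)
```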
- Memory 202 includes any non-transitory computer-readable storage media for storing data and/or software that is executable by processor 204 and which controls the operation of computing device 110.
- memory 202 may include one or more solid-state storage devices such as flash memory chips.
- memory 202 may include one or more mass storage devices connected to the processor 204 through a mass storage controller (not shown) and a communications bus (not shown) .
- computer readable storage media includes non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
- computer-readable storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computing device 110.
- Memory 202 may store an application 214.
- Application 214 may, when executed by processor 204, cause display device 206 to display a user interface 216.
- Processor 204 may be a general purpose processor, a specialized graphics processing unit (GPU) configured to perform specific graphics processing tasks while freeing up the general purpose processor to perform other tasks, and/or any number or combination of such processors.
- Display device 206 may be touch sensitive and/or voice activated, enabling display device 206 to serve as both an input and output device.
- Alternatively, a keyboard (not shown), mouse (not shown), or other data input devices may be employed.
- Network interface 208 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN) , a wireless mobile network, a Bluetooth network, and/or the internet.
- computing device 110 may receive image data of the patient from camera 130 during the ultrasound breast cancer screening. Patient data may also be provided to computing device 110 via a removable memory.
- Computing device 110 may receive updates to its software, for example, application 214, via network interface 208.
- Computing device 110 may also display notifications on display 206 that a software update is available.
- Input device 210 may be any device by means of which a user may interact with computing device 110, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface.
- Output module 212 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial buses (USB), or any other similar connectivity port known to those skilled in the art.
- Application 214 may be one or more software programs stored in memory 202 and executed by processor 204 of computing device 110. As will be described in more detail below, application 214 generates images and instructions to guide a clinician during the performance of the ultrasound breast cancer screening. During the screening, application 214 receives ultrasound image data from ultrasound sensor 140. Application 214 analyzes the received ultrasound image data to determine whether all areas of the region of interest have been scanned. If an area of the region of interest was not properly scanned, application 214 generates further guidance to instruct the clinician to scan those omitted areas.
- Application 214 includes a user interface to be displayed on display device 206 via which the clinician may provide input, for example, via user input device 210.
- user interface 216 may generate a graphical user interface (GUI) and output the GUI to display device 206 for viewing by a clinician.
- Computing device 110 is linked to projector 120, thus enabling computing device 110 to control the output of projector 120 along with the output on display device 206.
- the guidance may include images, such as an image of ultrasound sensor 140 depicted at the position at which ultrasound sensor 140 should be placed and an angle at which ultrasound sensor 140 should be held, as well as information indicating the speed at which ultrasound sensor 140 should be moved during the scanning procedure.
- the guidance may further indicate the areas of the patient’s body that should be scanned with ultrasound sensor 140.
- the guidance may be a single depiction covering the entire area to be scanned with ultrasound sensor 140, or a series of lines covering the area, or a path along which ultrasound sensor 140 should be moved.
- the guidance may further include an indicator showing where to start the scanning procedure, as well as indicators showing which areas have been sufficiently scanned.
- projector 120 projects the guidance, including the images and instructions, onto the body of the patient, and particularly, onto the zone of interest.
- the guidance shows an indicator of a path 410 along which ultrasound sensor 140 should be moved to perform an ultrasound scan of the zone of interest.
- the path may be a series of parallel swaths which together will cover the whole breast.
- the parallel swaths may, for example, have some overlap to ensure the entire area is scanned.
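One hypothetical way to generate such a serpentine swath path, with the probe width and overlap as assumed parameters rather than values from the disclosure:

```python
# Illustrative sketch: cover a rectangular zone of interest with
# parallel, slightly overlapping swaths, alternating direction so the
# probe path is continuous. Probe width and overlap are assumptions.

def swath_path(x_min, y_min, x_max, y_max, probe_width=50, overlap=10):
    """Return a list of swaths, each given as (start_point, end_point)."""
    step = probe_width - overlap           # advance between swath centres
    swaths = []
    y = y_min + probe_width / 2            # centre line of the first swath
    left_to_right = True
    while y - probe_width / 2 < y_max:
        start, end = (x_min, y), (x_max, y)
        swaths.append((start, end) if left_to_right else (end, start))
        left_to_right = not left_to_right
        y += step
    return swaths
```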
- the guidance further includes an indicator 402 representing ultrasound sensor 140, showing the position and orientation, such as an angle, at which ultrasound sensor 140 should be positioned to perform the ultrasound scan.
- the clinician moves ultrasound sensor 140 over the body of the patient according to the projected guidance.
- Camera 130 captures images reflecting the position of ultrasound sensor 140 on the body of the patient.
- Computing device 110 uses these images to determine that ultrasound sensor 140 has moved and determines the current location of ultrasound sensor 140. Further, computing device 110 tracks the position, orientation, and angle of ultrasound sensor 140 to identify a plane of imaging to confirm that the entire zone of interest has been properly scanned.
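As a sketch of the orientation-tracking idea, the probe's in-plane angle could be estimated from two feature points tracked on the sensor in the camera image; the two-point model and function below are illustrative assumptions, not the disclosed tracking method:

```python
import math

# Illustrative sketch: estimate the probe's in-plane orientation from
# two tracked feature points on ultrasound sensor 140; the result can
# be compared against the angle prescribed by the guidance.

def probe_angle(tip, tail):
    """Angle, in degrees, of the probe axis relative to the image x-axis."""
    (x1, y1), (x2, y2) = tip, tail
    return math.degrees(math.atan2(y2 - y1, x2 - x1))
```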
- In some areas of the zone of interest, ultrasound sensor 140 may have to be held vertically in relation to camera 130, while in other areas ultrasound sensor 140 may have to be held at an angle in relation to camera 130 to properly scan the tissue underlying the zone of interest.
- indicator 402 representing ultrasound sensor 140 is depicted showing the correct angle at which ultrasound sensor 140 should be held at any given location during the scan.
- computing device 110 generates updated guidance based on the determined current location of ultrasound sensor 140, and projector 120 projects the updated guidance onto the body of the patient, as shown in Figs. 4B and 4C.
- the updated guidance includes indicator 402 representing ultrasound sensor 140, path 410 showing where ultrasound sensor 140 should be moved, and path 412 showing the area that ultrasound sensor 140 has already scanned.
- Path 410 may be depicted differently from path 412. For example, path 410 may be depicted in one color showing that the area under path 410 has yet to be scanned, and path 412 may be depicted in a different color showing that the area under path 412 has already been scanned.
- computing device 110 determines whether the ultrasound sensor has reached the end of the guided path. If the determination is no, processing returns to step 306. If the determination is yes, computing device 110 analyzes the received ultrasound data to determine whether the entire zone of interest has been successfully scanned. This may include whether any area of the zone of interest was omitted during the scan, and/or whether the received ultrasound data for any area of the zone of interest is insufficient.
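The completeness check might, for example, divide the zone of interest into a coarse grid and report cells the tracked probe never visited; the grid model and cell size are assumptions of this sketch:

```python
# Illustrative sketch: coverage analysis over the zone of interest.
# The zone is divided into a coarse grid; a cell counts as covered
# once a tracked probe position falls inside it. Cell size is an
# assumed parameter, not specified in the disclosure.

def coverage_gaps(zone, probe_positions, cell=20):
    """Return the (col, row) grid cells that were never visited."""
    x_min, y_min, x_max, y_max = zone
    cols = int((x_max - x_min) // cell) + 1
    rows = int((y_max - y_min) // cell) + 1
    covered = {(int((x - x_min) // cell), int((y - y_min) // cell))
               for x, y in probe_positions}
    return [(cx, cy) for cx in range(cols) for cy in range(rows)
            if (cx, cy) not in covered]
```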
- updated guidance is generated and projected, at step 314, indicating the areas of the zone of interest not successfully scanned, and thus instructing the clinician to move ultrasound sensor 140 over those areas again.
- a part of the zone of interest may have to be rescanned, and the guidance will be updated to instruct the clinician to scan those areas again.
- the updated guidance may instruct the clinician to restart the entire process and rescan the entire zone of interest. Thereafter, processing returns to step 306.
- When computing device 110 determines that the entire zone of interest was successfully scanned, computing device 110, at step 316, assembles the received ultrasound data into an output format, such as the Digital Imaging and Communications in Medicine (DICOM) image format, allowing the clinician to view and export the ultrasound data.
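The assembly step could be sketched as stacking the tracked 2-D frames into a 3-D array before export; the per-frame position bookkeeping is an assumption, and a real implementation would resample each frame using its tracked probe pose and then write a DICOM series:

```python
import numpy as np

# Illustrative sketch: assemble tracked 2D ultrasound frames into a 3D
# volume for export (e.g. as a DICOM series). Frames are assumed to
# share one pixel grid; `position` (location along the scan path) is
# an assumed bookkeeping field, not a detail from the disclosure.

def assemble_volume(frames_with_positions):
    """frames_with_positions: list of (position, 2D ndarray) pairs."""
    ordered = sorted(frames_with_positions, key=lambda fp: fp[0])
    return np.stack([frame for _, frame in ordered], axis=0)
```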
- Prior to starting the guided ultrasound breast cancer screening, computing device 110 instructs the clinician how to configure ultrasound sensor 140. For example, computing device 110 may require that ultrasound sensor 140 be set to a particular scan depth to perform the guided ultrasound breast cancer screening. The clinician may, during the performance of the guided ultrasound breast cancer screening, interrupt the scanning procedure by deviating from the guidance to perform a more detailed scan of a particular area of the patient’s breast, or to change the depth at which ultrasound sensor 140 is scanning.
- When computing device 110 detects that there has been a deviation from the guidance, computing device 110 suspends the guidance and displays instructions on how to resume the guided ultrasound breast cancer screening.
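In the simplest case, a deviation could be detected when the probe strays beyond a threshold distance from the planned path; the threshold value and point-sampled path are assumptions of this sketch:

```python
import math

# Illustrative sketch: flag a deviation when the tracked probe position
# is farther than a threshold from every sampled point of the planned
# path. The threshold value is an assumption, not from the disclosure.

def deviated(probe_pos, path_points, threshold=30):
    px, py = probe_pos
    nearest = min(math.hypot(px - x, py - y) for x, y in path_points)
    return nearest > threshold
```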
Abstract
Disclosed are devices, systems, and methods for performing a guided ultrasound scan, comprising capturing a position of a patient's body, generating guidance for an ultrasound scan of a portion of the patient's body, wherein the guidance includes one or more of images and instructions for moving an ultrasound sensor over the patient's body, projecting the guidance onto the patient's body, determining a current position of the ultrasound sensor, and iteratively updating the projected guidance based on the determined current location of the ultrasound sensor.
Description
1. Technical Field
The present disclosure relates to a system and method for guiding a physician in performing an ultrasound examination of a patient’s breast to screen for cancer.
2. Discussion of Related Art
Mammography has long been a standard imaging modality for breast cancer screenings. However, the effectiveness of mammography is significantly reduced when scanning dense breast tissue. Ultrasound imaging has therefore been used as an alternative and/or complementary non-invasive subsurface imaging modality for breast cancer screening. Traditional hand-held ultrasound devices present a problem in that the scanning is completely manual and requires a great deal of time and experience from the operator of the ultrasound device to ensure a full scan of the whole breast.
SUMMARY
Systems and methods for performing a guided ultrasound breast cancer screening are disclosed.
Disclosed according to an aspect of the present disclosure, a system for performing a guided ultrasound scan comprises a camera configured to capture a position of a patient’s body, a computing device including a processor operatively coupled to a memory storing instructions which, when executed by the processor, cause the computing device to generate guidance for an ultrasound scan of a portion of the patient’s body, wherein the guidance includes one or more of images and instructions
for moving an ultrasound sensor over the patient’s body, and a projector configured to project the guidance onto the patient’s body, and the instructions further cause the computing device to determine a current position of the ultrasound sensor, and iteratively update the projected guidance based on the determined current location of the ultrasound sensor.
In another aspect of the present disclosure, the computing device receives ultrasound image data from the ultrasound sensor during the ultrasound scan of the portion of the patient’s body.
In yet another aspect of the present disclosure, the instructions further cause the computing device to determine whether the guidance has been sufficiently followed to generate a complete scan of the portion of the patient’s body, and generate updated guidance when it is determined that the scan of the portion of the patient’s body is incomplete.
In a further aspect of the present disclosure, the projector is further configured to project the updated guidance onto the patient’s body.
In another aspect of the present disclosure, the instructions further cause the computing device to analyze the received ultrasound data to determine whether the ultrasound data is sufficient, and when it is determined that the ultrasound data is insufficient, generate updated guidance for the portion of the patient’s body that produced insufficient ultrasound data.
In a further aspect of the present disclosure, the instructions further cause the computing device to assemble the received ultrasound data into an output format volume.
In another aspect of the present disclosure, the portion of the patient’s body includes a breast of the patient.
In yet another aspect of the present disclosure, the system further comprises at least one sensor located on the body of the patient, and the instructions further configure the computing device to determine the position of the body of the patient based on the location of the at least one sensor.
Disclosed according to another aspect of the present disclosure, a method for performing a guided ultrasound scan comprises capturing a position of a patient’s body, generating guidance for an ultrasound scan of a portion of the patient’s body, wherein the guidance includes one or more of images and instructions for moving an ultrasound sensor over the patient’s body, projecting the guidance onto the patient’s body, determining a current position of the ultrasound sensor, and iteratively updating the projected guidance based on the determined current location of the ultrasound sensor.
In a further aspect of the present disclosure, the method further comprises receiving ultrasound image data from the ultrasound sensor during the ultrasound scan of the portion of the patient’s body.
In another aspect of the present disclosure, the method further comprises determining whether the guidance has been sufficiently followed to generate a complete scan of the portion of the patient’s body, and generating updated guidance when it is determined that the scan of the portion of the patient’s body is incomplete.
In yet a further aspect of the present disclosure, the method further comprises projecting the updated guidance onto the patient’s body.
In another aspect of the present disclosure, the method further comprises analyzing the received ultrasound data to determine whether the ultrasound data is sufficient, and when it is determined that the ultrasound data is insufficient, generating updated guidance for the portion of the patient’s body that produced insufficient ultrasound data.
In a further aspect of the present disclosure, the method further comprises assembling the received ultrasound data into an output format volume.
In another aspect of the present disclosure, the portion of the patient’s body includes a breast of the patient.
In yet another aspect of the present disclosure, the position of the patient’s body is determined based on the location of at least one sensor located on the patient’s body.
Disclosed according to yet another aspect of the present disclosure, a non-transitory computer-readable storage medium stores instructions which, when executed by a processor, cause a computing device to determine a position of a patient’s body based on images captured by a camera and at least one sensor located on the patient’s body, generate guidance for an ultrasound scan of a portion of the patient’s body, wherein the guidance includes one or more of images and instructions for moving an ultrasound sensor over the patient’s body, cause a projector to project the guidance onto the patient’s body, determine a current position of the ultrasound sensor, and iteratively update the projected guidance based on the determined current location of the ultrasound sensor.
In a further aspect of the present disclosure, the computing device receives ultrasound image data from the ultrasound sensor during the ultrasound scan of the portion of the patient’s body.
In yet a further aspect of the present disclosure, the non-transitory computer-readable storage medium comprises further instructions which further cause the computing device to determine whether the guidance has been sufficiently followed to generate a complete scan of the portion of the patient’s body, and generate updated guidance when it is determined that the scan of the portion of the patient’s body is incomplete.
In another aspect of the present disclosure, the non-transitory computer-readable storage medium comprises further instructions which further cause the computing device to analyze the received ultrasound data to determine whether the ultrasound data is sufficient, and when it is determined that the ultrasound data is insufficient, generate updated guidance for the portion of the patient’s body that produced insufficient ultrasound data.
Any of the above aspects and embodiments of the present disclosure may be combined without departing from the scope of the present disclosure.
Objects and features of the presently disclosed system and method will become apparent to those of ordinary skill in the art when descriptions of various embodiments thereof are read with reference to the accompanying drawings, of which:
Fig. 1 is a schematic diagram of a guided ultrasound breast cancer screening system in accordance with an illustrative embodiment of the present disclosure;
Fig. 2 is a schematic diagram of a computing device which forms part of the system of Fig. 1 in accordance with an embodiment of the present disclosure;
Fig. 3 is a flow chart illustrating an example method for performing a guided ultrasound breast cancer screening using the system of Fig. 1 in accordance with an embodiment of the present disclosure; and
Figs. 4A-C are illustrations of guidance provided by the guided ultrasound breast cancer screening system of Fig. 1 in accordance with an embodiment of the present disclosure.
The present disclosure provides a system and method for performing a guided ultrasound breast cancer screening. The system presents a clinician with interactive guidance for performing an ultrasound breast cancer screening. By using the system, the clinician can ensure that an ultrasound scan of the whole breast is performed without any omitted areas.
Although the present disclosure will be described in terms of specific illustrative embodiments, it will be readily apparent to those skilled in this art that various modifications, rearrangements and substitutions may be made without departing from the spirit of the present disclosure. The scope of the present disclosure is defined by the claims appended hereto.
Referring now to Fig. 1, the present disclosure is generally directed to a system 10, which includes a table 100, a computing device 110 such as, for example, a laptop computer, desktop computer, tablet computer, or other similar device, a projector
120 configured to project guidance, such as images, instructions, and messages relating to the performance of an ultrasound scan, onto the body of a patient, and a camera 130 which captures the position of the body of the patient, the position of an ultrasound sensor 140, and the position of one or more optical markers 150.
Table 100 may be any table, bed, or other support structure on which a patient may be positioned during an ultrasound breast cancer screening. Projector 120 may be any projector, for example, a microprojector, suitable for projecting images onto a human body. Camera 130 may be any camera suitable for capturing images of the body of a patient during the ultrasound breast cancer screening. Ultrasound sensor 140 may be any ultrasound device, for example, an ultrasound wand, suitable for use during an ultrasound breast cancer screening. Optical markers 150 may, for example, be positioned above the collarbone of the patient.
Turning now to Fig. 2, there is shown a system diagram of computing device 110. Computing device 110 may include memory 202, processor 204, display device 206, network interface 208, input device 210, and/or output module 212.
As used herein, the term “clinician” refers to any medical professional (e.g., doctor, surgeon, nurse, or the like) or other user of system 10 involved in planning, performing, monitoring, and/or supervising a medical procedure involving the use of the embodiments described herein.
Turning now to Fig. 3, there is shown a flowchart of an example method for a guided ultrasound breast cancer screening according to an embodiment of the present disclosure. At step 302, camera 130 captures images identifying the position of the
body of the patient on table 100 and the position of optical markers 150 on the body of the patient. Computing device 110 uses these images to determine a zone of interest on the body of the patient and to generate images and instructions for guidance during performance of the ultrasound breast cancer screening.
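The determination of the zone of interest from the captured marker positions can be sketched as follows. This is a minimal illustration only: the marker coordinates, pixel offsets, and rectangular-zone assumption are hypothetical and not details taken from the disclosure.

```python
# Hypothetical sketch: derive a rectangular zone of interest from two
# optical-marker positions detected in the camera image. Coordinates are
# image pixels; the offsets below are illustrative assumptions.

def zone_of_interest(left_marker, right_marker, depth_px=300):
    """Return (x_min, y_min, x_max, y_max) of the scan zone.

    The markers are assumed to sit above the collarbones, so the zone
    starts just below the line joining them and spans their width.
    """
    (lx, ly), (rx, ry) = left_marker, right_marker
    x_min, x_max = min(lx, rx), max(lx, rx)
    y_top = max(ly, ry) + 20          # small offset below the markers
    return (x_min, y_top, x_max, y_top + depth_px)

zone = zone_of_interest((100, 50), (400, 55))
```

In practice the mapping from marker pixels to body coordinates would also involve camera calibration, which is omitted here for brevity.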
The guidance may include images, such as an image of ultrasound sensor 140 depicted at the position at which ultrasound sensor 140 should be placed and an angle at which ultrasound sensor 140 should be held, as well as information indicating the speed at which ultrasound sensor 140 should be moved during the scanning procedure. The guidance may further indicate the areas of the patient’s body that should be scanned with ultrasound sensor 140. For example, the guidance may be a single depiction covering the entire area to be scanned with ultrasound sensor 140, or a series of lines covering the area, or a path along which ultrasound sensor 140 should be moved. The guidance may further include an indicator showing where to start the scanning procedure, as well as indicators showing which areas have been sufficiently scanned.
Continuing to step 304, projector 120 then projects the guidance, including the images and instructions, onto the body of the patient, and particularly, onto the zone of interest. As shown in Fig. 4A, the guidance shows an indicator of a path 410 along which ultrasound sensor 140 should be moved to perform an ultrasound scan of the zone of interest. Various potential paths are envisioned. For example, as shown in Figs. 4A-C, the path may be a series of parallel swaths which together will cover the whole breast. The parallel swaths may, for example, have some overlap to ensure the entire area is scanned. The guidance further includes an indicator 402 representing
ultrasound sensor 140, showing the position and orientation, such as an angle, at which ultrasound sensor 140 should be positioned to perform the ultrasound scan.
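The parallel-swath path described above can be sketched as follows. The probe footprint width, overlap amount, and vertical-swath convention are illustrative assumptions, not parameters specified by the disclosure.

```python
# Hedged sketch of the parallel-swath guidance path: given the zone of
# interest, a probe footprint width, and a desired overlap, emit one
# vertical swath (a start/end segment) per pass across the zone.

def swath_path(zone, probe_width, overlap):
    """Return (x, y_start, y_end) vertical swaths covering the zone."""
    x_min, y_min, x_max, y_max = zone
    step = probe_width - overlap      # advance less than a full width
    assert step > 0, "overlap must be smaller than the probe width"
    swaths, x = [], x_min + probe_width / 2
    while x - probe_width / 2 < x_max:
        swaths.append((x, y_min, y_max))
        x += step
    return swaths

path = swath_path((0, 0, 100, 300), probe_width=40, overlap=10)
```

Because each swath advances by less than the full probe width, adjacent swaths overlap, matching the overlap described above for ensuring complete coverage.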
At step 306, the clinician moves ultrasound sensor 140 over the body of the patient according to the projected guidance. Camera 130 captures images reflecting the position of ultrasound sensor 140 on the body of the patient. Computing device 110 uses these images to determine that ultrasound sensor 140 has moved and to determine its current location. Further, computing device 110 tracks the position, orientation, and angle of ultrasound sensor 140 to identify the plane of imaging and to confirm that the entire zone of interest has been properly scanned. Depending on the geometry of the zone of interest, in some areas ultrasound sensor 140 may have to be held vertically in relation to camera 130, while in other areas ultrasound sensor 140 may have to be held at an angle in relation to camera 130 to properly scan the tissue underlying the zone of interest. As discussed below, indicator 402 representing ultrasound sensor 140 is depicted showing the correct angle at which ultrasound sensor 140 should be held at any given location during the scan.
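The tracking at this step amounts to comparing the probe's measured pose against the guidance. A minimal sketch, under assumed tolerances and a simplified 2-D pose representation:

```python
# Illustrative pose check: compare the tracked probe position and tilt
# angle against the guidance target. The tolerances and the
# ((x, y), angle) pose representation are assumptions for illustration.

import math

def pose_ok(tracked, target, pos_tol=10.0, angle_tol=5.0):
    """tracked/target are ((x, y), angle_degrees) pairs."""
    (tx, ty), t_ang = tracked
    (gx, gy), g_ang = target
    dist = math.hypot(tx - gx, ty - gy)
    return dist <= pos_tol and abs(t_ang - g_ang) <= angle_tol

ok = pose_ok(((102, 50), 88.0), ((100, 50), 90.0))
off = pose_ok(((130, 50), 90.0), ((100, 50), 90.0))
```

A real system would track a full 3-D orientation to identify the imaging plane; the 2-D angle here only illustrates the comparison.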
Thereafter, at step 308, computing device 110 generates updated guidance based on the determined current location of ultrasound sensor 140, and projector 120 projects the updated guidance onto the body of the patient, as shown in Figs. 4B and 4C. The updated guidance includes indicator 402 representing ultrasound sensor 140, path 410 showing where ultrasound sensor 140 should be moved, and path 412 showing the area that ultrasound sensor 140 has already scanned. Path 410 may be depicted differently from path 412. For example, path 410 may be depicted in one color showing that the area under path 410 has yet to be scanned, and path 412 may be
depicted in a different color showing that the area under path 412 has already been scanned.
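The two-color guidance update can be sketched as a simple split of the guided path into a scanned portion and a remaining portion based on the probe's progress. The waypoint-list representation is an assumption for illustration.

```python
# Sketch of the guidance update: split the guided path into the already-
# scanned portion (rendered like path 412) and the remaining portion
# (rendered like path 410), given the probe's progress along the path.

def split_path(waypoints, current_index):
    """Return (scanned, remaining) slices of the list of waypoints."""
    scanned = waypoints[:current_index + 1]
    remaining = waypoints[current_index + 1:]
    return scanned, remaining

points = [(0, 0), (0, 10), (0, 20), (0, 30)]
done, todo = split_path(points, current_index=1)
```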
At step 310, computing device 110 determines whether ultrasound sensor 140 has reached the end of the guided path. If the determination is no, processing returns to step 306. If the determination is yes, computing device 110 analyzes the received ultrasound data to determine whether the entire zone of interest has been successfully scanned. This analysis may include determining whether any area of the zone of interest was omitted during the scan, and/or whether the received ultrasound data for any area of the zone of interest is insufficient.
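One way to perform the completeness determination is to divide the zone of interest into cells and check that every cell was visited during the scan. The cell size and the visited-set representation below are assumptions, not details from the disclosure.

```python
# Hedged sketch of the completeness check: grid the zone of interest
# into cells and report any cell that was never visited by the probe.

def uncovered_cells(zone, visited, cell=20):
    """Return cells of the zone (as (col, row) indices) never visited."""
    x_min, y_min, x_max, y_max = zone
    cols = range((x_max - x_min) // cell)
    rows = range((y_max - y_min) // cell)
    return [(c, r) for c in cols for r in rows if (c, r) not in visited]

# A 2x2-cell zone in which one cell was missed:
missed = uncovered_cells((0, 0, 40, 40), visited={(0, 0), (0, 1), (1, 0)})
```

The returned cells would then drive the updated guidance generated at step 314.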
If computing device 110 determines that the entire zone of interest was not successfully scanned, updated guidance is generated and projected, at step 314, indicating the areas of the zone of interest that were not successfully scanned and instructing the clinician to move ultrasound sensor 140 over those areas again. In some instances, only a part of the zone of interest may have to be rescanned, and the guidance is updated to instruct the clinician to scan those areas again. In other instances, the updated guidance may instruct the clinician to restart the entire process and rescan the entire zone of interest. Thereafter, processing returns to step 306.
However, if computing device 110 determines that the entire zone of interest was successfully scanned, at step 316, computing device 110 assembles the received ultrasound data into an output format, such as the Digital Imaging and Communications in Medicine (DICOM) image format, allowing the clinician to view and export the ultrasound data.
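The assembly step can be illustrated as ordering the per-position ultrasound frames into a volume ready for export. A real system would write a DICOM series (e.g., via a DICOM library); the sketch below, with assumed names and a sortable position key, shows only the ordering.

```python
# Minimal sketch of the assembly at step 316: order per-position
# ultrasound frames by their scan position to form a 3-D volume.
# The dict-of-frames representation is an illustrative assumption.

def assemble_volume(frames):
    """frames: dict mapping scan position (a sortable key) -> 2-D frame.

    Returns the frames as a list ordered by position, i.e. a 3-D volume.
    """
    return [frames[pos] for pos in sorted(frames)]

volume = assemble_volume({2: [[5]], 0: [[1]], 1: [[3]]})
```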
Prior to starting the guided ultrasound breast cancer screening, computing device 110 instructs the clinician how to configure ultrasound sensor 140. For example, computing device 110 may require that ultrasound sensor 140 be set to a particular scan depth to perform the guided ultrasound breast cancer screening. The clinician may, during the performance of the guided ultrasound breast cancer screening, interrupt the scanning procedure by deviating from the guidance to perform a more detailed scan of a particular area of the patient’s breast, or to change the depth at which ultrasound sensor 140 is scanning.
When computing device 110 detects that there has been a deviation from the guidance, computing device 110 suspends the guidance and displays instructions on how to resume the guided ultrasound breast cancer screening.
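Deviation detection of this kind can be sketched as a distance test between the tracked probe position and the guided path; the threshold and waypoint representation below are assumptions for illustration.

```python
# Illustrative deviation detection: the probe is considered to have
# deviated when it is farther than a threshold from every waypoint of
# the guided path, at which point guidance would be suspended.

import math

def deviated(probe_pos, path_points, threshold=25.0):
    """True when the probe is farther than `threshold` from every point."""
    px, py = probe_pos
    nearest = min(math.hypot(px - x, py - y) for x, y in path_points)
    return nearest > threshold

path = [(0, 0), (0, 50), (0, 100)]
on_path = deviated((5, 48), path)      # close to waypoint (0, 50)
off_path = deviated((80, 48), path)    # far from all waypoints
```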
While the foregoing has been described and set forth with respect to a medical treatment procedure, including planning a route to a target within a patient, the same methodologies and systems may be employed in a planning procedure to identify a target and a route to the target, and to conduct a review of the proposed approach for treatment or servicing in other contexts, including, without limitation, the analysis of piping systems, electronic systems, and other industrial applications where access to a target is limited and internal analyses of the system in question are required to ascertain the most desirable pathway to reach the target.
Although embodiments have been described in detail with reference to the accompanying drawings for the purpose of illustration and description, it is to be understood that the inventive processes and apparatus are not to be construed as limited thereby. It will be apparent to those of ordinary skill in the art that various
modifications to the foregoing embodiments may be made without departing from the scope of the disclosure.
Claims (20)
- A system for performing a guided ultrasound scan, the system comprising:
  a camera configured to capture a position of a patient’s body;
  a computing device including a processor operatively coupled to a memory storing instructions which, when executed by the processor, cause the computing device to generate guidance for an ultrasound scan of a portion of the patient’s body, wherein the guidance includes one or more of images and instructions for moving an ultrasound sensor over the patient’s body; and
  a projector configured to project the guidance onto the patient’s body;
  wherein the instructions further cause the computing device to:
  determine a current position of the ultrasound sensor; and
  iteratively update the projected guidance based on the determined current location of the ultrasound sensor.
- The system according to claim 1, wherein the computing device receives ultrasound image data from the ultrasound sensor during the ultrasound scan of the portion of the patient’s body.
- The system according to claim 1, wherein the instructions further cause the computing device to:
  determine whether the guidance has been sufficiently followed to generate a complete scan of the portion of the patient’s body; and
  generate updated guidance when it is determined that the scan of the portion of the patient’s body is incomplete.
- The system according to claim 3, wherein the projector is further configured to project the updated guidance onto the patient’s body.
- The system according to claim 2, wherein the instructions further cause the computing device to:
  analyze the received ultrasound data to determine whether the ultrasound data is sufficient; and
  when it is determined that the ultrasound data is insufficient, generate updated guidance for the portion of the patient’s body that produced insufficient ultrasound data.
- The system according to claim 2, wherein the instructions further cause the computing device to assemble the received ultrasound data into an output format volume.
- The system according to claim 1, wherein the portion of the patient’s body includes a breast of the patient.
- The system according to claim 1, further comprising:
  at least one sensor located on the body of the patient,
  wherein the instructions further configure the computing device to determine the position of the body of the patient based on the location of the at least one sensor.
- A method for performing a guided ultrasound scan, the method comprising:
  capturing a position of a patient’s body;
  generating guidance for an ultrasound scan of a portion of the patient’s body, wherein the guidance includes one or more of images and instructions for moving an ultrasound sensor over the patient’s body;
  projecting the guidance onto the patient’s body;
  determining a current position of the ultrasound sensor; and
  iteratively updating the projected guidance based on the determined current location of the ultrasound sensor.
- The method according to claim 9, further comprising receiving ultrasound image data from the ultrasound sensor during the ultrasound scan of the portion of the patient’s body.
- The method according to claim 9, further comprising:
  determining whether the guidance has been sufficiently followed to generate a complete scan of the portion of the patient’s body; and
  generating updated guidance when it is determined that the scan of the portion of the patient’s body is incomplete.
- The method according to claim 11, further comprising projecting the updated guidance onto the patient’s body.
- The method according to claim 10, further comprising:
  analyzing the received ultrasound data to determine whether the ultrasound data is sufficient; and
  when it is determined that the ultrasound data is insufficient, generating updated guidance for the portion of the patient’s body that produced insufficient ultrasound data.
- The method according to claim 10, further comprising assembling the received ultrasound data into an output format volume.
- The method according to claim 9, wherein the portion of the patient’s body includes a breast of the patient.
- The method according to claim 9, wherein the position of the patient’s body is determined based on the location of at least one sensor located on the patient’s body.
- A non-transitory computer-readable storage medium storing instructions which, when executed by a processor, cause a computing device to:
  determine a position of a patient’s body based on images captured by a camera and at least one sensor located on the patient’s body;
  generate guidance for an ultrasound scan of a portion of the patient’s body, wherein the guidance includes one or more of images and instructions for moving an ultrasound sensor over the patient’s body;
  cause a projector to project the guidance onto the patient’s body;
  determine a current position of the ultrasound sensor; and
  iteratively update the projected guidance based on the determined current location of the ultrasound sensor.
- The non-transitory computer-readable storage medium according to claim 17, wherein the computing device receives ultrasound image data from the ultrasound sensor during the ultrasound scan of the portion of the patient’s body.
- The non-transitory computer-readable storage medium according to claim 18, comprising further instructions which further cause the computing device to:
  determine whether the guidance has been sufficiently followed to generate a complete scan of the portion of the patient’s body; and
  generate updated guidance when it is determined that the scan of the portion of the patient’s body is incomplete.
- The non-transitory computer-readable storage medium according to claim 18, comprising further instructions which further cause the computing device to:
  analyze the received ultrasound data to determine whether the ultrasound data is sufficient; and
  when it is determined that the ultrasound data is insufficient, generate updated guidance for the portion of the patient’s body that produced insufficient ultrasound data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2015/081636 WO2016201637A1 (en) | 2015-06-17 | 2015-06-17 | Guided ultrasound breast cancer screening system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016201637A1 true WO2016201637A1 (en) | 2016-12-22 |
Family
ID=57544663
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2015/081636 WO2016201637A1 (en) | 2015-06-17 | 2015-06-17 | Guided ultrasound breast cancer screening system |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2016201637A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018109227A1 (en) * | 2016-12-16 | 2018-06-21 | Koninklijke Philips N.V. | System providing images guiding surgery |
US11571180B2 (en) | 2016-12-16 | 2023-02-07 | Koninklijke Philips N.V. | Systems providing images guiding surgery |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1494873A (en) * | 2002-06-12 | 2004-05-12 | 株式会社东芝 | Supersonic diagnostic device, supersonic probe and supersonic image photographic support method |
US20090024030A1 (en) * | 2007-07-20 | 2009-01-22 | Martin Lachaine | Methods and systems for guiding the acquisition of ultrasound images |
CN102133110A (en) * | 2010-01-27 | 2011-07-27 | 株式会社东芝 | ULTRASONIC DIAGNOSTIC APPARATUS and MEDICAL IMAGE DIAGNOSTIC APPARATUS |
US20150051489A1 (en) * | 2011-12-18 | 2015-02-19 | Calin Caluser | Three Dimensional Mapping Display System for Diagnostic Ultrasound Machines |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111655184B (en) | Guidance for placement of surgical ports | |
JP6809888B2 (en) | Mammography equipment | |
US10062166B2 (en) | Trachea marking | |
US20130197355A1 (en) | Method of controlling needle guide apparatus, and ultrasound diagnostic apparatus using the same | |
CN111670018A (en) | Guidance for positioning a patient and a surgical robot | |
JP5883147B2 (en) | Image display device and medical image pickup device | |
KR20150133500A (en) | Method for Measuring Size of Lesion which is shown by Endoscopy, and Computer Readable Recording Medium | |
JP2018518226A (en) | System and method for motion compensation in medical procedures | |
US10852939B2 (en) | Medical image display apparatus and recording medium | |
CN110507337B (en) | Medical equipment control system, medical equipment control method and device | |
JP2022031179A (en) | Device-to-image registration method, apparatus, and storage medium | |
WO2016035312A1 (en) | Assistance apparatus for assisting interpretation report creation and method for controlling the same | |
WO2016201637A1 (en) | Guided ultrasound breast cancer screening system | |
US20150054855A1 (en) | Image processing apparatus, image processing system, image processing method, and program | |
JP2017113390A (en) | Medical image processing apparatus, control method of the same, and program | |
EP2662025A1 (en) | Ultrasonic diagnostic apparatus and control method thereof | |
WO2022012541A1 (en) | Image scanning method and system for medical device | |
US20220409324A1 (en) | Systems and methods for telestration with spatial memory | |
JP7123696B2 (en) | Medical report generation device, medical report generation method, and medical report generation program | |
KR20120036420A (en) | Ultrasonic waves diagnosis method and apparatus for providing user interface on screen | |
KR20160037024A (en) | Apparatus and Method for supporting medical treatment based on personalized checklist | |
US11672494B2 (en) | Imaged-range defining apparatus, medical apparatus, and program | |
JP2018094421A (en) | Medical image display method and medical image display apparatus | |
US20240090866A1 (en) | System and method for displaying ablation zone progression | |
JP2009061156A (en) | Medical image diagnosis support system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15895210 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 15895210 Country of ref document: EP Kind code of ref document: A1 |