CA2313996C - General purpose distributed operating room control system - Google Patents
General purpose distributed operating room control system
- Publication number
- CA2313996C CA2313996A CA002313996A
- Authority
- CA
- Canada
- Prior art keywords
- operating room
- patient
- command
- controller
- control system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/74—Manipulators with manual electric input means
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/30—Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/40—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the management of medical equipment or devices, e.g. scheduling maintenance or upgrades
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H40/00—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
- G16H40/60—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
- G16H40/63—ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00115—Electrical control of surgical instruments with audible or visual output
- A61B2017/00119—Electrical control of surgical instruments with audible or visual output alarm; indicating an abnormal situation
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00203—Electrical control of surgical instruments with speech control or speech recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00973—Surgical instruments, devices or methods, e.g. tourniquets pedal-operated
Abstract
The present invention pertains to control systems and provides a run time configurable control system for selecting and operating one of a plurality of operating room devices from a single input source, the system comprising a master controller having a voice control interface and means for routing control signals. The system additionally may include a plurality of slave controllers to provide expandability of the system. Also, the system includes output means for generating messages to the user relating to the status of the control system in general and to the status of devices connected thereto.
Description
GENERAL PURPOSE DISTRIBUTED OPERATING
ROOM CONTROL SYSTEM
BACKGROUND OF THE INVENTION
1. FIELD OF THE INVENTION
The present invention generally relates to control systems. More particularly, the present invention relates to a control system and apparatus that allows multiple surgical devices to be controlled from one or more input devices. Even more particularly, the present invention provides a run-time configurable control system allowing operating room component connectivity and control.
2. DESCRIPTION OF RELATED ART
Many surgical procedures are performed with multiple instruments. For example, some laparoscopic procedures are performed utilizing a robotic arm system produced by Computer Motion, Inc. of Goleta, California to hold and move an endoscope. The surgeon may also use a laser to cut tissue, an electrocautery device to cauterize the tissue, and lights to illuminate the surgical site.
Each instrument has a unique control interface for its operation. Therefore, the surgeon must independently operate each device. For example, the surgeon must utilize a foot pedal to control the electrocautery device, a separate foot pedal to operate the robotic arm, and yet another interface to operate the laser.
Operating multiple devices may distract the surgeon, thereby reducing the efficiency of performing various procedures. Additionally, it is cumbersome utilizing various devices where each device has a separate user interface. If a new device is introduced into the operating room environment, the doctor must learn how to use the new user interface. Additionally, there is currently no known run time configurable system for operating more than one specific operating room device via voice control. As such, if there are two or more devices in the operating room that are voice controlled, the doctor has to remove the microphone used for one device and replace it with the microphone for the other device. Obviously, this creates many problems associated with productivity. Additionally, the necessity of actually switching between many user interfaces takes a measurable amount of time and, as such, extends the time that a patient is under anesthesia, which may add to the danger of a procedure.
Therefore, what is needed in the art is a general purpose platform for controlling a plurality of devices such that devices can be added to or subtracted from the platform depending upon the environment into which the platform, also known as a control system, is introduced. The system may additionally be automatically configured at start up.
Additionally, what is needed is a system and method for selecting and operating one of the plurality of attached devices, namely operating room devices. It is to the solution of the hereinabove mentioned problems to which the present invention is directed.
SUMMARY OF THE INVENTION
In accordance with the present invention there is provided a control system for selecting from and controlling a plurality of devices in an operating room, the control system comprising:
a master controller, the master controller comprising:
a) means for receiving selection commands from a user wherein each selection command is associated with one specific device in electrical communication with the master controller;
b) means for receiving control commands from a user;
c) means for converting selection commands and control commands into corresponding selection signals and control signals;
d) means for routing control signals to a device specified by a selection command received by the means for receiving selection commands.
In accordance with a first aspect of the present invention, there is provided a master controller for selecting and controlling a plurality of devices. Each of the plurality of devices to be controlled is in electrical communication or in wireless communication with the master controller, either directly or via a slave controller, which will be discussed in more detail hereinbelow with respect to the second aspect of the present invention.
The master controller includes means for receiving selection commands issued by a user. The selection commands available to the user are based upon the devices in electrical communication with the master controller. The master controller may recognize those devices that are in electrical communication therewith upon startup of the master controller. This will be described in detail in the description of the preferred embodiment. Each device in electrical communication with the master controller is represented by a correspondingly available selection command.
The master controller additionally includes means for receiving control commands from the user. Both the means for receiving selection commands and the means for receiving control commands from a user may be included in a voice control interface (VCI) for receiving voice commands. The system may additionally employ a foot pedal, a hand held device, or some other device which receives selection or control commands or inputs indicative of such commands from a user. The VCI provides signals indicative of a user's selection of a specific device and signals indicative of control commands the user wishes to supply to the device specified by a specific selection command. These are known, respectively, as selection signals and control signals. If the user is using a foot pedal, hand controller or some other input device, the VCI is not utilized as the inputs are already in the form of electrical signals as opposed to voice input. Alternatively, a combination of devices may be used to receive selection and control commands and to provide selection and control signals indicative of such commands.
The master controller additionally includes means for routing control signals to a device specified by a selection command. For example, if the user wants to operate the laser, a device used in many surgeries and contemplated as being included as one of the devices that may be operated via the control system of the present invention, then the user may issue a selection command indicating such, i.e., speak the word "laser" or the words "select laser". As such, the name of the device may serve as the selection command, or the selection command may be the combination of two or more words.
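The select-then-route behavior described above can be sketched in a few lines of code. This is an illustrative model only; the class and method names (MasterController, Device, command) are the editor's, not the patent's, and a real controller would route electrical signals rather than strings.

```python
class Device:
    """Stand-in for an operating room device connected to the controller."""

    def __init__(self, name):
        self.name = name
        self.received = []  # control signals delivered to this device

    def handle(self, signal):
        self.received.append(signal)


class MasterController:
    """Routes each control command to the most recently selected device."""

    def __init__(self, devices):
        # Selection commands are derived from the names of connected devices.
        self.devices = {d.name: d for d in devices}
        self.selected = None

    def command(self, utterance):
        # A device name, optionally prefixed with "select", is a selection
        # command; anything else is routed to the selected device.
        word = utterance[len("select "):] if utterance.startswith("select ") else utterance
        if word in self.devices:
            self.selected = self.devices[word]
            return "selected " + word
        if self.selected is None:
            return "error: no device selected"
        self.selected.handle(utterance)
        return "routed to " + self.selected.name
```

With this sketch, speaking "select laser" followed by "power up" delivers "power up" to the laser device, mirroring the example in the text.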
Subsequent to receiving a selection command from the user and converting the selection command into a selection signal, if necessary, the master controller then routes control signals, indicative of control commands received from the user, to the device specified by the preceding selection command. In this exemplary instance, control signals would be routed to the laser.
Preferred structures for both selection commands and control commands are disclosed herein in the detailed description of the preferred embodiment of the present invention.
Additionally, a controller may include means for ensuring that control signals indicative of control commands issued subsequent to the receipt of a selection command are, in fact, valid control signals. This is accomplished via a database of valid control commands and grammars that are either prestored in the master, or are prestored in a slave prior to or at system startup, as described hereinbelow.
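A minimal sketch of such validation against a prestored database of valid commands follows; the device names and command sets are hypothetical examples, not taken from the patent.

```python
# Hypothetical per-device command database; in the patent this would be
# prestored in the master or a slave at or before system startup.
VALID_COMMANDS = {
    "laser": {"power up", "power down", "standby"},
    "lights": {"brighter", "dimmer", "off"},
}


def validate(device, command):
    """Return True only when `command` is a valid control command for the
    currently selected `device`; unknown devices accept nothing."""
    return command in VALID_COMMANDS.get(device, set())
```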
A second aspect of the present invention is at least one slave electrically connected to the master controller. Each slave controller connected to the master controller operates similarly to the master controller for the specific devices electrically connected thereto; additionally, the slave controllers may receive control commands directly from the user if they are to be used as stand-alone units. However, if they are utilized as slaves, then control commands are received at the master controller, converted into control signals, and transmitted from the master controller to the slave controller having connected thereto the device specified by the last selection command received by the master controller. This allows the control system of the present invention to operate with a plurality of different devices without the master controller requiring any knowledge of the devices connected to the slave controllers prior to startup of the control system.
The slave controllers are connected to the master controller just like any other device; however, each slave controller provides the master controller information relating to the specific devices that are connected thereto, so the master controller, generally at startup, is provided information as to exactly what devices are connected to the system.
The selection commands available to the user include all devices connected to each of the slave controllers as well as the devices directly connected to the master controller. By providing an open architecture such as that generally set out hereinabove, and more particularly, a master controller and slave controllers, various devices may be controlled from a single controller, or a plurality of controllers, such that a doctor utilizing the control system will not have to switch between different control systems or interfaces, or at a minimum will have an easier interface to control each of the devices. It is additionally envisioned that the main means for selecting and controlling each of the devices will be a voice recognition system which will be described in detail hereinbelow.
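The startup aggregation of device lists described above might be modeled as follows; the function name and data shapes are illustrative assumptions by the editor.

```python
def build_selection_vocabulary(direct_devices, slaves):
    """At startup the master merges its directly connected devices with the
    device lists reported by each slave, yielding a routing table mapping
    each selection command (device name) to (controller, device)."""
    table = {name: ("master", name) for name in direct_devices}
    for slave_id, devices in slaves.items():
        for name in devices:
            table[name] = (slave_id, name)
    return table
```

The keys of the returned table are exactly the selection commands available to the user: every device on every slave plus every device directly on the master.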
Also, the control system may include audio and video outputs which are capable of alerting the user to errors in selecting or controlling specific devices. The audio and video outputs may additionally be used to alert the user to problems with each of the specific devices, as well as to provide status notices as to which device(s) are available, which devices are active, as well as a host of other device operation information which will be discussed further hereinbelow.
In a further aspect, the present invention provides an operating room control system for use during a medical procedure on a patient, comprising: a plurality of operating room devices; a voice input device capable of receiving a spoken device command that has an associated device qualifier, and a data retrieval command having an associated patient record identifier; a display device; and a controller coupled to the input device and the display device, the controller capable of receiving the data retrieval command and the device command, wherein the controller transmits a command to a server to retrieve patient information in response to the data retrieval command so that desired patient data can be selected in response to the patient record identifier, wherein the controller receives the patient information from the server and displays the patient information on the display device, and wherein the controller transmits a control command to a selected operating room device in response to the device command, the medical operating room device selected from the plurality of operating room devices in response to the device qualifier so as to allow voice manipulation of an operating room environment.
In a still further aspect, the present invention provides a method for accessing medical data from a server while performing a medical procedure, comprising: receiving a plurality of commands from an input device including an instrument command that has an associated instrument qualifier, and a data retrieval command having an associated patient record identifier; transmitting a request to the server for a unit of patient data in response to the data retrieval command; receiving the unit of patient data from the server; and displaying the unit of patient data on a display device.
In a further aspect, the present invention provides an operating room control system for use during a medical procedure on a patient, comprising: a plurality of medical devices, each device having an associated device identifier; a voice input device capable of receiving spoken device commands having associated device qualifiers, and also capable of receiving spoken voice data retrieval commands having an associated data retrieval qualifier; a display device; a master controller coupled to the input device, the master controller capable of receiving the data retrieval commands and the device commands, wherein the master controller receives a command having a qualifier and transmits the command to a network so as to retrieve a unit of information when the qualifier is a data retrieval qualifier, and wherein the master controller transmits the command to a selected medical device when the qualifier is a device qualifier, the selected medical device being selected from the plurality of medical devices and having an associated device identifier matching the device qualifier; and a slave controller coupled to the master controller and the display device, the slave controller capable of transmitting a request for the unit of information to a server, receiving the unit of information from the server, and displaying the unit of information on the display device.
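The qualifier-based dispatch in this aspect can be illustrated schematically: a qualifier matching a device identifier routes the command to that device, while any other qualifier is treated as a data-retrieval qualifier and forwarded toward the server. The function name and return shape are the editor's illustration only.

```python
def route_command(command, qualifier, device_identifiers):
    """Dispatch a spoken command per its qualifier: device qualifiers go to
    the matching medical device; data-retrieval qualifiers go to the
    network/server for patient information."""
    if qualifier in device_identifiers:
        return ("device", qualifier, command)
    return ("server", qualifier, command)
```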
For a more complete understanding of the present invention, reference is made to the following detailed description and accompanying drawings. In the drawings, like reference characters refer to like parts, in which:
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a block diagram of a master controller in electrical communication with both slave controllers and operating room devices in accordance with the present invention;
Figure 2 is a block diagram of the voice control interface in accordance with the present invention;
Figure 3 is a schematic of the voice control interface card in accordance with the present invention;
Figure 4 is a schematic diagram of a master controller in accordance with the present invention;
Figure 5 is an exemplary tree diagram of a grammar for operating a device in accordance with the present invention.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
There is shown in Figure 1 an operating room control system, generally at 10, in accordance with the present invention. The operating room control system, or control system 10, generally comprises a master controller 12, which is preferably attached to at least one slave controller 14. Although the exemplary preferred embodiment is shown as having both a master controller 12 and at least one slave controller 14 in electrical communication therewith, the control system may be implemented with only a master controller 12, as will be described hereinbelow.
The master controller 12 is electrically connected to and in electrical communication with a plurality of devices 16 via a plurality of communication ports 46.
Alternatively, the master controller 12 may be connected to any slave or specific medical device via wireless communications systems such as IR or RF signal transmitters and receivers on each of the master 12, slaves 14, and devices 16. Some of these devices 16 may be at least one slave controller 14, the operation of which will be described hereinbelow. Other devices that are intended to be electrically connected to the master controller 12, either directly or via the at least one slave controller 14, include devices that are commonly found in an operating room environment.
For purposes of non-limiting example, directly connected to the master controller 12 in Figure 1 is an electrocautery device 18. A robotic arm 20 for holding and manipulating an endoscope, such as that produced by Computer Motion of Goleta, California and marketed under the tradename AESOP, is electrically connected with the master controller 12 via one of the at least one slave controllers 14. Also in electrical communication with the master controller 12 via a slave controller are an operating room table 22, an insufflator 24, and an operating room lighting system 26. It is envisioned that any electrically controlled device utilized in an operating room environment may be attached to the master controller 12 either directly or via one of the at least one slave controllers 14.
The master controller 12 is configured to provide a main user interface for each of the devices electrically connected thereto. As such, a doctor can manipulate the operating room environment in a simpler and more direct fashion. Currently, each device in an operating room includes a separate interface. The proximity of the doctor to each interface requires a substantial amount of movement either on the part of the doctor or a nurse to effectuate changes required by the doctor during a medical procedure.
For example, if the doctor needs the lights dimmed slightly, then a nurse currently has to approach the lighting system of the operating room and dim the lights.
It would be highly advantageous for the doctor to be able to control such changes directly, both to keep movement in the operating room to a minimum to increase sterility, and because direct control by the doctor of the operating room environment and the devices he or she is using ensures the highest degree of safety with the smallest amount of error due to miscommunication between people in the operating room. Minimization of movement in an operating room environment is additionally advantageous to reduce the risk of contamination of specific sterile instruments, as well as the operative site itself.
To effectuate such a control system 10, the master controller 12 generally comprises a voice control interface (VCI) 32. The VCI 32 includes means 28 for receiving selection commands from a user wherein each selection command is associated with one specific device in electrical communication with the master controller 12. This is accomplished by providing the master controller 12 a list of the devices that are in electrical communication therewith upon start-up of the control system 10. The process and hardware for providing the master controller 12 with such a list will be described hereinbelow.
As shown in Figure 2, the VCI 32 additionally comprises means 30 for receiving control commands from a user. In the preferred embodiment, both the means 28 for receiving selection commands and the means 30 for receiving control commands may coexist in the VCI 32 as a microphone 34, for receiving the actual speech of the user, an analog to digital converter 36 for converting the analog speech into a digital representation thereof, a feature extractor 38 for converting the digital representation to a digital representation that is suited for decoding, and a decoder 40 for comparing the features of the transformed digital representation of the speech to a set of presaved user models 41 to determine whether the speech received at the microphone 34 was a selection command, a control command, or some other speech to be ignored by the master controller 12. Such "other speech" would include extraneous noise, speech between the doctor and another person in the operating suite, as well as speech of other people in the operating suite in general.
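Schematically, this pipeline reduces to feature extraction followed by decoding and classification of the decoded token. The sketch below abstracts the extractor and decoder as pluggable functions; it is the editor's illustration, not the patent's implementation.

```python
def recognize(samples, extract, decode, selection_words, control_words):
    """Pipeline sketch mirroring Figure 2: digitized samples -> feature
    extraction -> decoding against user models -> classification of the
    result as a selection command, a control command, or ignored speech
    (noise, conversation, etc.)."""
    token = decode(extract(samples))
    if token in selection_words:
        return ("selection", token)
    if token in control_words:
        return ("control", token)
    return ("ignore", token)
```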
Feature extractors, such as the one employed in the present invention, are well known in the art of voice recognition. Feature vectors are preferably generated by the feature extractor 38 utilizing techniques such as Mel-Cepstrum, or linear prediction. It is to be appreciated that such techniques are well-known and are employed in the feature extractor 38 to develop feature vectors that represent speech received by the VCI 32.
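As a toy stand-in for the feature extractor, the sketch below computes one log-energy value per overlapping frame of the digitized signal; a production extractor would compute Mel-Cepstrum or linear-prediction coefficients instead, but the framing structure is the same.

```python
import math


def frame_energies(signal, frame_len=256, hop=128):
    """Toy feature extractor: one log-energy value per overlapping frame.
    Real systems emit a multi-dimensional feature vector per frame (e.g.
    Mel-Cepstrum); this keeps only the framing and log-compression steps."""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energy = sum(s * s for s in frame) / frame_len
        # Small floor avoids log(0) on silent frames.
        feats.append(math.log(energy + 1e-12))
    return feats
```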
Additionally, voice software is also available that provides extractors and decoders such as the ones set out in the present application. As such, although a specific implementation is presented herein for voice recognition, it may be carried out by the inclusion of a pre-made voice recognition system that is purchased from a vendor such as Creative Labs under the tradename VOICE BLASTER, Dragon Dictate produced by Dragon Systems, or VOICE PAD produced by Kurzweil AI of Massachusetts; each of these companies produces front-end voice recognition systems.
The decoder 40 utilizes the information produced by the feature extractor 38 by matching the stored user models 41 to the output of the feature extractor 38 utilizing a well-known method, such as Hidden Markov Modeling. One Hidden Markov Model (HMM) is created for each phoneme. The HMMs are trained to identify their respective phonemes given the Mel-Cepstrum output from the feature extractor 38. The use of Hidden Markov Models for voice recognition is generally well known.
The stored user models 41 used by the decoder 40 may be placed in a memory 44 associated with the VCI itself. As depicted in Figure 3, such a memory 44 may be incorporated onto a VCI board 46 as an EPROM, a PROM or some other programmable memory storage device. However, it is preferable to store the models on a transportable memory device 45, such as a disk, transportable storage medium or the like. It is even more preferable that the transportable memory device be a PCMCIA format card 48 as data transfer times are reduced and the ruggedness of the system is increased.
PCMCIA format cards retain data better than floppy disks. Additionally, the configuration of currently produced PCMCIA cards allows for additional program data to be stored on the PCMCIA format card and downloaded into the master controller 12 when system changes are made (i.e., upgrades to the system software, etc.). Therefore, the use of such a PCMCIA form factor card is preferable in the control system 10 of the present invention.
Figure 3 depicts, in more detail, the VCI 32. Once the user's speech has been digitized at the A/D converter 36, it is fed to the feature extractor 38. The feature extractor 38 functions as set out hereinabove. In more detail, the feature extractor 38 converts the digitized signal into a representation that is suitable for decoding (e.g., Mel-Cepstrum). This representation is then passed to the decoder 40, which compares the representations produced at the feature extractor 38 to the models stored in a memory 44 which contains the user models 41. The memory 44 may be supplied the models 41 via a downloading process from the transportable memory device 45. The models stored in the memory 44 constitute a lexicon, which is the entire set of valid pronunciations, or all of the valid words, that the master 12 is to recognize. Because the lexicon is stored on a transportable data storage medium 45, the lexicon may be added to or subtracted from depending upon the devices that are to be connected to the master controller 12. In this fashion, if new equipment is purchased at a date subsequent to the purchase of the master controller 12, then new words may be added to the lexicon through a well-known data acquisition technique, wherein the user speaks the words that are to be added to the lexicon and they are used to update the user models 41 on the transportable memory 45.
Most preferably, in the implementation of the present system 10, there is provided one master controller 12 and at least one slave controller 14. In such a configuration, which will be discussed in more detail hereinbelow, once the master controller or master 12 receives a selection command, all speech received at the VCI 32 of the master 12 that is not a new selection command is fed to the feature extractor of the appropriately attached slave 14. In this way, a plurality of devices may be attached to several different controllers and the lexicon stored in each controller does not have to be downloaded into the master 12. The master 12 only contains the lexicon of all the devices that may be connected to the system 10, as well as the lexicon for the commands of those devices that are directly attached to the master 12, as opposed to being attached to a slave 14 which is, in turn, attached to the master 12.
All the other controllers, which for purposes herein are referred to as slaves 14, include the lexicon for the devices that are directly connected thereto. For example, in Figure 1, one slave includes the lexicon for the control commands and the select commands for a robotic arm and an operating table. This way, that controller can have a microphone plugged into the VCI which is included in the unit and it may serve as a solo unit. Or, depending upon the configuration of the control system 10, it may actually serve as a master. The entire system 10 is configurable at startup and as such is expandable. Every controller preferably includes a VCI.
The decoder 40 additionally contains a language model. This term is well known in the art and will be explained further hereinbelow. In essence, certain words may validly be said in certain orders. The language model is implemented by developing a network representing all the valid possibilities of word combinations and decoding the extracted vectors along each path in the network. Whichever path has the highest probability of matching the incoming speech, the information associated with that path is selected by the decoder 40. It is additionally to be appreciated that, to carry out the present invention, a silence path is available and an unrecognized command path is provided as well. As such, even though a user speaks, if valid commands are not given, the system 10 will not respond.
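The best-path selection with fallback paths described above can be sketched as follows. The path names and probability values are stand-in assumptions; a real decoder would score each network path against the incoming feature vectors rather than take precomputed numbers.

```python
# Hypothetical sketch of language-model path selection: valid word orders
# form paths in a network, and the decoder picks the highest-probability
# path. Silence and unrecognized paths ensure invalid input does nothing.
def best_path(candidate_scores):
    """candidate_scores: {path_name: match probability}."""
    scores = dict(candidate_scores)
    scores.setdefault("<silence>", 0.05)        # stand-in fallback scores
    scores.setdefault("<unrecognized>", 0.10)
    return max(scores, key=scores.get)

# A clear command wins over the fallback paths...
print(best_path({"move arm up": 0.85, "move arm down": 0.02}))  # move arm up
# ...but ambiguous speech falls through to <unrecognized>, so the
# system 10 does not respond.
print(best_path({"move arm up": 0.04, "move arm down": 0.03}))  # <unrecognized>
```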
Figure 5 sets out one exemplary language model for the proper operation of the robotic arm 20. Such language models are developed for each device in electrical communication with the master controller 12. Once again, a device may be in wireless communication with the master controller 12. It is preferable to store the language models for each device in their respective controller. For example, if a device is directly connected to a slave 14, then the control language model (that language model containing the language used to control the device) for the device is stored in the slave VCI. If the device is directly connected to the master 12, then the control language model is included in the VCI of the master 12. It is to be appreciated that the select language model must be stored in the master 12 for all the possible devices that may be directly connected to the master 12, as opposed to being connected to a slave. As such, depending upon what devices are connected to the system at any given time, a user may select from any of the connected devices. If a device is not connected, the system will recognize this upon startup and will not attempt to access the device as it is not there. This will be discussed in more detail hereinbelow.
If a device is connected directly to the master controller 12, then it is preferable to store the language model for controlling the device either in the VCI itself, or in the transportable memory 45. The advantages of this configuration are set out hereinbelow with respect to the startup of the control system 10.
If a select command is given for a device that is directly connected to the master 12, then the information is passed to the decoder in the master 12 and the decoder 40 generates a packet 52 of information. The packet includes the address of the device to be operated, a code representing the specific operation, and a checksum to ensure that, as the packet 52 is transferred over various busses, the data does not become corrupted. Such information packaging is well known, although the specific package set out hereinabove has heretofore not been utilized to control one of a plurality of medical devices. Data checking using a checksum is also well known in the art.
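The packet layout described above can be sketched as follows. The one-byte additive checksum is an assumption for illustration; the patent does not specify the checksum algorithm, and the example addresses and operation codes are invented.

```python
# Illustrative sketch of the information packet: device address,
# operation code, and a checksum to detect corruption on the busses.
def make_packet(device_address: int, operation_code: int) -> bytes:
    body = bytes([device_address, operation_code])
    checksum = sum(body) % 256          # assumed: simple additive checksum
    return body + bytes([checksum])

def packet_is_valid(packet: bytes) -> bool:
    return sum(packet[:-1]) % 256 == packet[-1]

pkt = make_packet(0x12, 0x07)           # e.g. device 0x12, operation 0x07
print(packet_is_valid(pkt))             # True
corrupted = bytes([pkt[0] ^ 0xFF]) + pkt[1:]   # flip bits in the address
print(packet_is_valid(corrupted))       # False
```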
The decoder 40, upon decoding a valid selection command, activates the address of the device, which has been stored in a lookup table and is related to the device. This is accomplished as follows. At startup, every controller, whether the master 12 or a slave 14, knows the addresses of its communication ports. It sends a query to each communication port to see if a device is connected thereto. If so, an adapter connected to the device specifies the name of the device and an indication that it is functioning properly. Such adapters are well known in the electrical arts and as such will not be further discussed herein. Every slave controller establishes a lookup table of addresses and associated device codes or names. The device codes or names are transmitted to the master 12, which includes all the devices and the corresponding address of the port to which the associated slave controller is connected to the master 12.
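The startup discovery just described can be sketched as follows. The `query_port` callback and its reply format are simulated stand-ins for the real adapters; only devices whose adapters report that they are functioning properly enter the lookup table.

```python
# Hypothetical sketch of startup discovery: each controller queries its
# communication ports and builds a lookup table of device names to ports.
def build_lookup_table(ports, query_port):
    table = {}
    for port in ports:
        reply = query_port(port)        # None if nothing is connected
        if reply and reply.get("ok"):   # adapter reports name and health
            table[reply["name"]] = port
    return table

# Simulated bus: adapters answer on ports 1 and 3, nothing on port 2.
responses = {1: {"name": "electrocautery", "ok": True},
             3: {"name": "operating table", "ok": True}}
table = build_lookup_table([1, 2, 3], responses.get)
print(table)  # {'electrocautery': 1, 'operating table': 3}
```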
The addresses of all devices available are initially stored in a memory associated with the VCI such that a multiplexer may be used to activate a specific address or make that address accessible. In this fashion, once the master 12 receives a valid selection command, which it is able to identify, it then routes all the control commands to the VCI
of the appropriate slave controller in the case where the device selected is connected to a slave controller. If the selected device is connected directly to the master 12, then the control commands are fed through the decoder 40 of the master 12 and the control information packet is produced and sent to the device via the central processor 44 of the master 12. In this fashion, the VCI of a slave is fed control signals and processes those signals as though they were received from the A/D converter, which is where the input to the slave is routed. Every slave can be attached to one master, and that master can, in turn, be attached to another master, thus providing a daisy chain of slaves, all of which are connected to one master having a microphone attached thereto.
In addition to the VCI 32, the master controller 12 comprises means 42 for routing control signals to a device specified by a selection command received at the VCI 32.
Figure 4 depicts the master controller 12 having one slave controller 14 and two medical devices in electrical communication therewith. The master controller includes the VCI
32 as well as the means 42 for routing control signals. Once the speech has been extracted and decoded into either a selection command or a control command, the specific command is transmitted to the central processor 44 of the master controller 12.
In the preferred embodiment, the means 42 for routing control signals is incorporated into the central processor 44 of the master controller 12. The means 42 for routing is essentially an addressable multiplexer and has a memory of the addresses for each device and their associated one of the plurality of communication ports 46 to which they are connected. If the addresses are stored in the decoder 40, then the central processor 44 will be in communication with that memory.
The means 42 for routing takes the packet 50 of information, or the control signal if the information is to be sent to a slave 14, checks which of the plurality of communication ports 46 it is to direct the information to, and then directs the information to the desired one of the plurality 46 of ports.
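The routing means can be sketched as follows. The port numbers, the first-byte-is-address convention, and the `send` callback are illustrative assumptions matching the packet sketch earlier in this description.

```python
# Minimal sketch of the means 42 for routing: look up which communication
# port a packet's device address sits behind and forward the packet there.
class Router:
    def __init__(self, address_to_port, send):
        self.address_to_port = address_to_port  # lookup table built at startup
        self.send = send                        # writes to a physical port

    def route(self, packet: bytes):
        port = self.address_to_port[packet[0]]  # first byte = device address
        self.send(port, packet)
        return port

sent = []
router = Router({0x12: 3, 0x2A: 1},
                lambda port, pkt: sent.append((port, pkt)))
print(router.route(bytes([0x12, 0x07, 0x19])))  # 3
```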
The addresses and their associated ports are uploaded into the master 12 upon startup of the system. This procedure is embedded in the software, and such a procedure is well known in the art.
For example, in Figure 4, an electrocautery device 18 transmits an address to the master controller 12. The address is received at one of a plurality of communication ports 46, and the address is saved in the memory along with the associated communication port number. It is to be appreciated that the valid selection commands are stored on the transportable memory. For devices directly connected to the master, the language model may be stored in a memory in the master 12 or in the transportable memory.
Language models are stored in associated slaves for devices that are directly connected to a slave 14. In this fashion, upon startup, the master 12 knows all devices that are connected to the system, as each slave sends to the master the addresses of each device and the name (i.e., the coded phonemes that constitute the device name) of the device. The names of the devices are uploaded into the master so that the validation of selection commands may take place in the master 12. However, language models for the validation of control commands are not transmitted to the master 12, as this would take much time and slow the control system 10 down. Therefore, the master controller 12 actually contains a subset of the grammar necessary to operate the devices in connection therewith, but that language model is limited to only the device names. The information regarding valid sequences of control commands (i.e., their control language model) is stored on each slave controller to which they are connected. Of course, if the device is directly connected to the master, then the language model is stored at the master 12 as described hereinabove.
The control system 10 in accordance with the present invention provides a way to configure and reconfigure an operating room in a very simple fashion. Additionally, it is to be appreciated that the system 10 provides an intuitive interface whereby a user can select a device to control and then subsequently control that device. The system checks to ensure that control commands received for a specific device are valid.
Additionally, the system 10 requires the inclusion of adapters 52 placed intermediate a specific one of the plurality of devices 16 and a slave or the master 12.
The adapters 52 transmit signals to their respective slave 14 or master 12 indicative of the address of the device, and translate control signals sent from the controller to which they are connected to signals understood by the particular device for which they are intended.
Such adapters are easily constructed and are well-known in the art.
Additionally, such adapters may be included either in the respective slave 14 or master 12, or attached to the particular one of the plurality of devices 16 itself. There is substantial advantage to attaching the adapters 52 to the devices 16, as then the devices may be attached to any port, whereas, if the adapters are attached interior to the controller 12, 14, the specific device for which they were designed must be attached to the specific one of the plurality of communication ports 46.
If new devices are added to the system, or if improvements or upgrades are made to the system software, such changes may be incorporated into a PCMCIA format card, such as the card that stores the user voice models. The card may be inserted into the same interface, and the system software may be uploaded into the master to make the upgrade without having to disassemble the master. This is accomplished by incorporating a serial interface on the PCMCIA format card. As such, the central processor 44 additionally checks upon startup whether there is a system upgrade to be made by checking the data being supplied by the PCMCIA format card. Checking the activity of a serial interface is well known; however, it has heretofore not been known to incorporate a serial interface on a single PCMCIA format card. Therefore, the combination is seen to be novel. Additionally, it has heretofore not been known to store voice models on such a PCMCIA format card.
Each of the at least one slave 14 is substantially similar to the master controller 12. And, each of the plurality of slaves 14 may include the full VCI so that each slave 14 can operate as a master. Alternatively, although not preferred, the slaves may not include the feature extractor, and may only contain a subset of the language model (i.e., the control commands) relating to the operation of each specific device. This is all that may be necessary in the slave because the slave receives from the master controller the specific address to which a command is to be sent and an indication that it is in fact a command. Therefore, the slave only needs to check to ensure that it is a valid command for the specific device. In this fashion, devices may be directly connected to the master, or they may be connected to a slave which is in communication with the master 12.
The system 10 may include output means including a video monitor 86 and a speaker 88. The speaker may be incorporated into the VCI 32 via a D/A converter 90 such that the system may communicate to the user any errors committed by the user in operating or selecting a specific device. Additionally, the output means may communicate system errors or the malfunction of a specific device. Such information is included in each specific adapter and is specific to the device attached to the adapter. It is to be appreciated that such communications would be transmitted to the master, where they would be either audibly or visually presented. The system and controller in accordance with the invention may additionally include a foot controller, a hand controller, or other well-known controllers. Each of these controllers may be used to control any of the devices connected to the master or a slave, as is described in the patent application incorporated herein by reference. As such, the VCI may be used only to select certain devices, and once selected, the device may be controlled via one of the well-known controllers. Ultimately, the flexibility of such a system can reduce costs and increase the safety of surgical procedures.
Finally, the system 10 may include a connection to a hospital computer network via a network gateway 500. Hospital networks are implemented in substantially all hospitals and provide for electronic storage of patient records as well as scheduling and financial information.
The network gateway 500 is preferably a personal computer, such as an IBM compatible or some other well-known personal computer, running web browsing software such as Microsoft Internet Explorer, Netscape Communicator, or any other known web browsing software.
By connecting to the hospital network, patient information that is available at computer terminals in the hospital would also be made available in the operating room.
As such, a vocabulary for accessing patient data must be provided to be used with the control system. Examples of such vocabulary include the commands "get", "load"
and "display". The data that may be provided includes, but is not limited to, x-rays, patient history, MRIs, angiography, and CAT scans.
Through the use of a web browser, the patient data may be sent to the gateway 500 in a format to be displayed by either a monitor 510 connected to the gateway or directly to the monitor. This would be accomplished through electrical connections already disclosed hereinabove.
To effectuate the display of patient data in a web browsable format, essentially HTML or some other well-known web format, the data must be provided to the gateway 500 in such a format. An access port, essentially consisting of a URL, provides a location for the web browser to obtain patient information. This URL provides an interface into the hospital network.
Accessing patient information is well known; however, the reformatting of such information into an HTML document is new. Because HTML formatting is well known, the specifics of such formatting will not be disclosed herein. In this fashion, patient data may be accessed via voice commands and displayed on a monitor or a display coupled to the gateway 500.
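The reformatting step described above can be sketched as wrapping retrieved patient data in a minimal HTML document that the browser on the gateway 500 can display. The field names and page layout are illustrative assumptions; real record formats would come from the hospital network.

```python
# Hedged sketch: wrap a patient record in a minimal HTML document
# for display by the web browser running on the gateway 500.
def patient_record_to_html(record: dict) -> str:
    rows = "".join(f"<tr><td>{k}</td><td>{v}</td></tr>"
                   for k, v in record.items())
    return ("<html><body><h1>Patient Record</h1>"
            f"<table>{rows}</table></body></html>")

html = patient_record_to_html({"Name": "J. Doe", "Study": "CAT scan"})
print("<h1>Patient Record</h1>" in html)  # True
```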
While certain exemplary embodiments of the present invention have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of, and not restrictive on, the broad invention, and that this invention is not to be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.
Preferred structures for both selection commands and control commands are disclosed herein in the detailed description of the preferred embodiment of the present invention.
Additionally, a controller may include means for ensuring that control signals indicative of control commands issued subsequent to the receipt of a selection command are, in fact, valid control signals. This is accomplished via a database of valid control commands and grammars that are either prestored in the master, or are prestored in a slave prior to or at system startup, which is described hereinbelow.
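The validity check against a prestored database can be sketched as follows. The device names and command sets are invented examples; a real controller would hold the grammar downloaded at startup rather than a hard-coded table.

```python
# Illustrative sketch of the validity-checking means: after a device is
# selected, each control command is checked against that device's
# prestored set of valid commands before any control signal is issued.
VALID_COMMANDS = {
    "lights": {"dim", "brighten", "off"},
    "operating table": {"raise", "lower", "tilt left", "tilt right"},
}

def is_valid_control(selected_device: str, command: str) -> bool:
    return command in VALID_COMMANDS.get(selected_device, set())

print(is_valid_control("lights", "dim"))    # True
print(is_valid_control("lights", "raise"))  # False: belongs to the table
```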
A second aspect of the present invention is at least one slave electrically connected to the master controller. Each slave controller connected to the master controller operates similarly to the master controller for specific devices electrically connected thereto; additionally, the slave controllers may receive control commands directly from the user if they are to be used as a stand-alone unit. However, if they are utilized as slaves, then control commands are received at the master controller, converted into control signals, and transmitted from the master controller to the slave controller that has connected thereto the device specified by the last selection command received by the master controller. This allows the control system of the present invention to operate with a plurality of different devices without the master controller requiring any knowledge of the devices connected to the slave controllers prior to startup of the control system.
The slave controllers are connected to the master controller just like any other device;
however, each slave controller provides the master controller information relating to the specific devices that are connected thereto, so the master controller, generally at startup, is provided information as to exactly what devices are connected to the system.
The selection commands available to the user include all devices connected to each of the slave controllers as well as the devices directly connected to the master controller. By providing an open architecture such as that generally set out hereinabove, and more particularly, a master controller and slave controllers, various devices may be controlled from a single controller, or a plurality of controllers, such that a doctor utilizing the control system will not have to switch between different control systems or interfaces, or at a minimum will have an easier interface to control each of the devices. It is additionally envisioned that the main means for selecting and controlling each of the devices will be a voice recognition system which will be described in detail hereinbelow.
Also, the control system may include audio and video outputs which are capable of alerting the user to errors in selecting or controlling specific devices. The audio and video outputs may additionally be used to alert the user to problems with each of the specific devices, as well as to provide status notices as to which device(s) are available and which devices are active, as well as a host of other device operation information which will be discussed further hereinbelow.
In a further aspect, the present invention provides an operating room control system for use during a medical procedure on a patient, comprising: a plurality of operating room devices; a voice input device capable of receiving a spoken device command that has an associated device qualifier, and a data retrieval command having an associated patient record identifier; a display device; and a controller coupled to the input device and the display device, the controller capable of receiving the data retrieval command and the device command, wherein the controller transmits a command to a server to retrieve patient information in response to the data retrieval command so that desired patient data can be selected in response to the patient record identifier, wherein the controller receives the patient information from the server and displays the patient information on the display device, and wherein the controller transmits a control command to a selected operating room device in response to the device command, the medical operating room device selected from the plurality of operating room devices in response to the device qualifier so as to allow voice manipulation of an operating room environment.
In a still further aspect, the present invention provides a method for accessing medical data from a server while performing a medical procedure, comprising: receiving a plurality of commands from an input device including an instrument command that has an associated instrument qualifier, and a data retrieval command having an associated patient record identifier; transmitting a request to the server for the unit of patient data in response to the data retrieval command; receiving the unit of patient data from the server; and displaying the unit of patient data on a display device.
In a further aspect, the present invention provides an operating room control system for use during a medical procedure on a patient, comprising: a plurality of medical devices, each device having an associated device identifier; a voice input device capable of receiving spoken device commands having associated device qualifiers, and also capable of receiving spoken voice data retrieval commands having an associated data retrieval qualifier; a display device; a master controller coupled to the input device, the master controller capable of receiving the data retrieval commands and the device commands, wherein the master controller receives a command having a qualifier and transmits the command to a network so as to retrieve a unit of information when the qualifier is a data retrieval qualifier, and wherein the master controller transmits the command to a selected medical device when the qualifier is a device qualifier, the selected medical device being selected from the plurality of medical devices and having an associated device identifier matching the device qualifier; and a slave controller coupled to the master controller and the display device, the slave controller capable of transmitting a request for the unit of information to a server, receiving the unit of information from the server, and displaying the unit of information on the display device.
For a more complete understanding of the present invention, reference is made to the following detailed description and accompanying drawings. In the drawings, like reference characters refer to like parts, in which:
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a block diagram of a master controller in electrical communication with both slave controllers and operating room devices in accordance with the present invention;
Figure 2 is a block diagram of the voice control interface in accordance with the present invention;
Figure 3 is a schematic of the voice control interface card in accordance with the present invention;
Figure 4 is a schematic diagram of a master controller in accordance with the present invention;
Figure 5 is an exemplary tree diagram of a grammar for operating a device in accordance with the present invention.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
In accordance with the present invention, there is shown in Figure 1 an operating room control system, generally at 10. The operating room control system, or control system 10, generally comprises a master controller 12, which is preferably attached to at least one slave controller 14. Although the exemplary preferred embodiment is shown as having both a master controller 12 and at least one slave controller 14 in electrical communication therewith, the control system may be implemented with only a master controller 12, as will be described hereinbelow.
The master controller 12 is electrically connected to and in electrical communication with a plurality of devices 16 via a plurality of communication ports 46.
Alternatively, the master controller 12 may be connected to any slave or specific medical device via wireless communications systems, such as IR or RF signal transmitters and receivers on each of the master 12, slaves 14, and devices 16. Some of these devices 16 may be at least one slave controller 14, the operation of which will be described hereinbelow. Other devices that are intended to be electrically connected to the master controller 12, either directly or via the at least one slave controller 14, include devices that are commonly found in an operating room environment.
For purposes of non-limiting example, directly connected to the master controller 12 in Figure 1 is an electrocautery device 18. A robotic arm 20 for holding and manipulating an endoscope, such as that produced by Computer Motion of Goleta, California and marketed under the tradename AESOP, is electrically connected with the master controller 12 via one of the at least one slave controllers 14. Also in electrical communication with the master controller 12 via a slave controller are an operating room table 22, an insufflator 24, and an operating room lighting system 26. It is envisioned that any electrically controlled device utilized in an operating room environment may be attached to the master controller 12 either directly or via one of the at least one slave controllers 14.
The master controller 12 is configured to provide a main user interface for each of the devices electrically connected thereto. As such, a doctor can manipulate the operating room environment in a simpler and more direct fashion. Currently, each device in an operating room includes a separate interface. The proximity of the doctor to each interface requires a substantial amount of movement either on the part of the doctor or a nurse to effectuate changes required by the doctor during a medical procedure.
For example, if the doctor needs the lights dimmed slightly, then a nurse currently has to approach the lighting system of the operating room and dim the lights.
It would be highly advantageous for the doctor to be able to control such changes directly, to keep movement in the operating room to a minimum to increase sterility, and because direct control by the doctor of the operating room environment and the devices he or she is using ensures the highest degree of safety with the smallest amount of error due to miscommunication between people in the operating room. Minimization of movement in an operating room environment is additionally advantageous to reduce the risk of contamination of specific sterile instruments, as well as the operative site itself.
To effectuate such a control system 10, the master controller 12 generally comprises a voice control interface (VCI) 32. The VCI 32 includes means 28 for receiving selection commands from a user, wherein each selection command is associated with one specific device in electrical communication with the master controller 12. This is accomplished by providing the master controller 12 a list of the devices that are in electrical communication therewith upon start-up of the control system 10. The process and hardware for providing the master controller 12 with such a list will be described hereinbelow.
As shown in Figure 2, The VCI 32 additionally comprises means 30 for receiving control commands from a user. In the preferred embodiment, both the means 28 for receiving selection commands and the means 30 for receiving control commands may coexist in the VCI 32 as a microphone 34, for receiving the actual speech of the user, an analog to digital converter 36 for converting the analog speech into a digital representation thereof, a feature extractor 38 for converting the digital representation to a digital representation that is suited for decoding, and a decoder 40 for comparing the features of the transformed digital representation of the speech to a set of presaved user-models 41 to determine whether the speech received at the microphone 34 was a selection command, a control command, or some other speech to be ignored by the master controller 12. Such "other speech" would include extraneous noise, speech between the l0 doctor and another person in the operating suite, as well as speech of other people in the operating suite in general.
Feature extractors, such as the one employed in the present invention, are well known in the art of voice recognition. Feature vectors are preferably generated by the feature extractor 38 utilizing techniques such as Mel-Cepstrum, or linear prediction. It is 15 to be appreciated that such techniques are well-known and are employed in the feature extractor 38 to develop feature vectors that represent speech received by the VCI 32.
Additionally, voice software is also available that provides extractors and decoders such as the ones set out in the present application. As such, although a specific implementation is presented herein for voice recognition, it may be carried out by the inclusion of a pre-made voice recognition system purchased from a vendor, such as Creative Labs under the tradename VOICE BLASTER, Dragon Dictate produced by Dragon Systems, or VOICE PAD produced by Kurzweil AI of Massachusetts; each of these companies produces front-end voice recognition systems.
The decoder 40 utilizes the information produced by the feature extractor 38 by matching the stored user models 41 to the output of the feature extractor 38 utilizing a well-known method, such as Hidden Markov Modeling. One Hidden Markov Model (HMM) is created for each phoneme. The HMMs are trained to identify their respective phonemes given the Mel-Cepstrum output from the feature extractor 38. The use of Hidden Markov Models for voice recognition is generally well known.
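The matching step can be illustrated with the standard HMM forward algorithm: each phoneme model is scored against the incoming feature frames and the best-scoring model wins. This is a toy sketch with one-dimensional Gaussian emissions, not the patent's trained models; the model parameters below are invented for the example.

```python
import numpy as np

def logsumexp(x, axis):
    """Numerically stable log(sum(exp(x))) along an axis."""
    m = np.max(x, axis=axis, keepdims=True)
    return np.squeeze(m, axis=axis) + np.log(np.sum(np.exp(x - m), axis=axis))

def gauss_logpdf(x, mu, var=1.0):
    """Per-state Gaussian emission log-probability of one feature value."""
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

def log_forward(log_pi, log_A, log_B):
    """Forward algorithm: log-likelihood of a frame sequence under one HMM.
    log_pi: (S,) initial state log-probs; log_A: (S, S) transition log-probs;
    log_B: (T, S) per-frame emission log-probs."""
    alpha = log_pi + log_B[0]
    for t in range(1, len(log_B)):
        alpha = logsumexp(alpha[:, None] + log_A, axis=0) + log_B[t]
    return logsumexp(alpha, axis=0)

def recognize(features, models):
    """Score a 1-D feature sequence against each phoneme model; return the best."""
    scores = {}
    for name, (pi, A, means) in models.items():
        log_B = np.stack([gauss_logpdf(f, means) for f in features])
        scores[name] = log_forward(np.log(pi + 1e-12), np.log(A + 1e-12), log_B)
    return max(scores, key=scores.get)

# Two toy two-state left-to-right phoneme models (invented parameters).
MODELS = {
    "ah": (np.array([1.0, 0.0]), np.array([[0.6, 0.4], [0.0, 1.0]]),
           np.array([0.0, 1.0])),
    "ee": (np.array([1.0, 0.0]), np.array([[0.6, 0.4], [0.0, 1.0]]),
           np.array([5.0, 6.0])),
}
```

A feature sequence near the "ee" model's state means is scored highest by that model, which is the decision the decoder makes per phoneme.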
The stored user models 41 used by the decoder 40 may be placed in a memory 44 associated with the VCI itself. As depicted in Figure 3, such a memory 44 may be incorporated onto a VCI board 46 as an EPROM, a PROM or some other programmable memory storage device. However, it is preferable to store the models on a transportable memory device 45, such as a disk, transportable storage medium or the like. It is even more preferable that the transportable memory device be a PCMCIA format card 48 as data transfer times are reduced and the ruggedness of the system is increased.
PCMCIA format cards retain data better than floppy disks. Additionally, the configuration of currently produced PCMCIA cards allows for additional program data to be stored on the PCMCIA format card and downloaded into the master controller 12 when system changes are made (i.e. upgrades to the system software etc.). Therefore, the use of such a PCMCIA form factor card is preferable in the control system 10 of the present invention.
Figure 3 depicts, in more detail, the VCI 32. Once the user's speech has been digitized at the A/D converter 36, it is fed to the feature extractor 38. The feature extractor 38 functions as set out hereinabove. In more detail, the feature extractor 38 converts the digitized signal into a representation that is suitable for decoding (e.g. Mel-Cepstrum). This representation is then passed to the decoder 40, which compares the representations produced at the feature extractor 38 to the models stored on a memory 44 which contains the user models 41. The memory 44 may be supplied the models 41 via a downloading process from the transportable memory device 45. The models stored in the memory 44 constitute a lexicon, which is the entire set of valid pronunciations, or all of the valid words that the master 12 is to recognize. Because the lexicon is stored on a transportable data storage medium 45, the lexicon may be added to or subtracted from depending upon the devices that are to be connected to the master controller 12. In this fashion, if new equipment is purchased at a date subsequent to the purchase of the master controller 12, then new words may be added to the lexicon through a well-known data acquisition technique, wherein the user speaks the words that are to be added to the lexicon and they are used to update the user models 41 on the transportable memory 45.
Most preferable to the implementation of the present system 10, there is provided one master controller 12 and at least one slave controller 14. In such a configuration, which will be discussed in more detail hereinbelow, once the master controller or master 12 receives a selection command, all speech received at the VCI 32 of the master 12 that is not a new selection command is fed to the feature extractor of the appropriately attached slave 14. In this way, a plurality of devices may be attached to several different controllers and the lexicon stored in each controller does not have to be downloaded into the master 12. The master 12 only contains the lexicon of all the devices that may be connected to the system 10, as well as the lexicon for the commands of those devices that are directly attached to the master 12 as opposed to being attached to a slave 14 which is, in turn, attached to the master 12.
All the other controllers, which for purposes herein are referred to as slaves 14, include the lexicon for the devices that are directly connected thereto. For example, in Figure 1, one slave includes the lexicon for the control commands and the select commands for a robotic arm and an operating table. This way, that controller can have a microphone plugged into the VCI which is included in the unit, and it may serve as a solo unit. Or, depending upon the configuration of the control system 10, it may actually serve as a master. The entire system 10 is configurable at startup and as such is expandable. Every controller preferably includes a VCI.
The decoder 40 additionally contains a language model. This term is well known in the art and will be explained further hereinbelow. In essence, certain words may be validly said in certain orders. The language model is implemented by developing a network representing all the valid possibilities of word combinations and decoding the extracted vectors along each path in the network. Whichever path has the highest probability of matching the incoming speech, the information associated with that path is selected by the decoder 40. It is additionally to be appreciated that, to carry out the present invention, a silence path is available and an unrecognized command path is provided as well. As such, even though a user speaks, if valid commands are not given, the system 10 will not respond.
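The idea of a command network with silence and reject paths can be sketched as follows. The grammar entries and score threshold below are invented for illustration; in the real decoder the paths would be scored acoustically, but the decision structure is the same: the best-scoring valid path wins, and anything below the reject threshold produces no action.

```python
# Each key is a valid word sequence (one path through the language-model
# network); the value is the action it triggers. These commands are
# hypothetical examples, not the patent's actual vocabulary.
GRAMMAR = {
    ("arm", "move", "up"): "ARM_UP",
    ("arm", "move", "down"): "ARM_DOWN",
    ("table", "tilt", "left"): "TABLE_TILT_LEFT",
}

def decode(word_scores, reject_threshold=-5.0):
    """word_scores maps each candidate word to its acoustic log-score for this
    utterance (a stand-in for the per-path scores a real decoder computes).
    Returns the action of the best valid path, or None for silence /
    unrecognized speech, so invalid input leaves the system idle."""
    best_action, best_score = None, reject_threshold
    for path, action in GRAMMAR.items():
        # A path's score is the sum of its words' scores; a word that was
        # never heard contributes -inf, eliminating that path.
        score = sum(word_scores.get(w, float("-inf")) for w in path)
        if score > best_score:
            best_action, best_score = action, score
    return best_action
```

A clearly spoken "arm move up" selects the corresponding path, while unrelated conversation matches no path and is ignored, mirroring the silence and unrecognized-command paths described above.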
Figure 5 sets out one exemplary language model for the proper operation of the robotic arm 20. Such language models are developed for each device in electrical communication with the master controller 12. Once again, a device may be in wireless communication with the master controller 12. It is preferable to store the language models for each device in their respective controller. For example, if a device is directly connected to a slave 14, then the control language model (that language model containing the language used to control the device) for the device is stored in the slave VCI. If the device is directly connected to the master 12, then the control language model is included in the VCI of the master 12. It is to be appreciated that the select language model must be stored in the master 12 for all the possible devices that may be directly connected to the master 12, as opposed to being connected to a slave. As such, depending upon what devices are connected to the system at any given time, a user may select from any of the connected devices. If a device is not connected, the system will recognize this upon startup and will not attempt to access the device, as it is not there. This will be discussed in more detail hereinbelow.
If a device is connected directly to the master controller 12, then it is preferable to store the language model for controlling the device either in the VCI itself, or in the transportable memory 45. The advantages of this configuration are set out hereinbelow with respect to the startup of the control system 10.
If a select command is given for a device that is directly connected to the master 12, then the information is passed to the decoder in the master 12 and the decoder 40 generates a packet 52 of information. The packet includes the address of the device to be operated, a code representing the specific operation, and a checksum to ensure that, as the packet 52 is transferred over various busses, the data does not become corrupted. Such information packaging is well known, although the specific package set out hereinabove has heretofore not been utilized to control one of a plurality of medical devices. Data checking using a checksum is also well known in the art.
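The packet layout just described (device address, operation code, checksum) can be sketched as follows. The single-byte fields and the simple additive checksum are assumptions for illustration; the patent does not specify field widths or the checksum algorithm.

```python
def build_packet(device_addr, op_code, payload=b""):
    """Frame a command: 1-byte device address, 1-byte operation code,
    optional payload, then a 1-byte additive checksum over the body."""
    body = bytes([device_addr, op_code]) + payload
    checksum = sum(body) & 0xFF
    return body + bytes([checksum])

def verify_packet(packet):
    """Return (address, op_code, payload) if the checksum matches,
    or None if the packet was corrupted while crossing a bus."""
    body, checksum = packet[:-1], packet[-1]
    if sum(body) & 0xFF != checksum:
        return None
    return body[0], body[1], body[2:]
```

A receiving controller recomputes the checksum and discards any packet that fails the check, which is the data-integrity role the checksum plays above.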
The decoder 40, upon decoding a valid selection command, activates the address of the device, which has been stored in a lookup table and is related to the device. This is accomplished as follows. At startup, every controller, whether the master 12 or a slave 14, knows the addresses of its communication ports. It sends a query to each communication port to see if a device is connected thereto. If so, an adapter connected to the device specifies the name of the device and an indication that it is functioning properly. Such adapters are well known in the electrical arts and as such will not be further discussed herein. Every slave controller establishes a lookup table of addresses and associated device codes or names. The device codes or names are transmitted to the master 12, which thus knows all the devices and the corresponding address of the port by which the associated slave controller is connected to the master 12.
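The startup query-and-lookup procedure can be sketched as follows. The `Adapter` class and its `identify` method are illustrative stand-ins for the hardware adapters described above, not the patent's actual interfaces.

```python
class Adapter:
    """Stand-in for the hardware adapter attached to each device: it reports
    the device's name and whether the device is functioning properly."""
    def __init__(self, name, ok):
        self.name, self.ok = name, ok

    def identify(self):
        return self.name, self.ok

class Controller:
    """At startup, query every communication port; build a lookup table of
    device names to port addresses from the adapters' replies."""
    def __init__(self, ports):
        self.ports = ports   # port address -> Adapter, or None if empty
        self.lookup = {}     # device name -> port address

    def discover(self):
        for addr, adapter in self.ports.items():
            if adapter is None:
                continue  # nothing plugged into this port
            name, ok = adapter.identify()
            if ok:        # only register devices that report healthy
                self.lookup[name] = addr
        return self.lookup
```

A slave would then transmit its lookup table to the master, so the master knows every device reachable through that slave.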
The addresses of all devices available are initially stored in a memory associated with the VCI such that a multiplexer may be used to activate a specific address or make that address accessible. In this fashion, once the master 12 receives a valid selection command, which it is able to identify, it then routes all the control commands to the VCI
of the appropriate slave controller in the case where the device selected is connected to a slave controller. If the selected device is connected directly to the master 12, then the control commands are fed through the decoder 40 of the master 12 and the control information packet is produced and sent to the device via the central processor 44 of the master 12. In this fashion, the VCI of a slave is fed control signals and processes those signals as though they were received from the A/D converter, which is where the input to the slave is routed. Every slave can be attached to one master, and that master can, in turn, be attached to another master, thus providing a daisy chain of slaves all of which are connected to one master having a microphone attached thereto.
In addition to the VCI 32, the master controller 12 comprises means 42 for routing control signals to a device specified by a selection command received at the VCI 32.
Figure 4 depicts the master controller 12 having one slave controller 14 and two medical devices in electrical communication therewith. The master controller includes the VCI
32 as well as the means 42 for routing control signals. Once the speech has been extracted and decoded into either a selection command or a control command, the specific command is transmitted to the central processor 44 of the master controller 12.
In the preferred embodiment, the means 42 for routing control signals is incorporated into the central processor 44 of the master controller 12. The means 42 for routing is essentially an addressable multiplexer and has a memory of the addresses for each device and their associated one of the plurality of communication ports 46 to which they are connected. If the addresses are stored in the decoder 40, then the central processor 44 will be in communication with that memory.
The means 42 for routing takes the packet 50 of information, or the control signal if the information is to be sent to a slave 14, checks which of the plurality of communication ports 46 it is to direct the information to, and then directs the information to the desired one of the plurality 46 of ports.
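The addressable-multiplexer behavior of the routing means can be sketched as follows, assuming (as in the packet description above) that the first byte of a packet carries the device address; the port map and the `sent` log are illustrative stand-ins for the hardware bus.

```python
class Router:
    """Sketch of an addressable multiplexer: a memory of device addresses
    and the communication port each device is reachable through."""
    def __init__(self, port_of_addr):
        self.port_of_addr = port_of_addr  # device address -> port number
        self.sent = []                    # (port, packet) log, standing in
                                          # for actual transmission hardware

    def route(self, packet):
        """Direct a packet to the port associated with its device address."""
        port = self.port_of_addr.get(packet[0])
        if port is None:
            return False  # unknown device: drop rather than misroute
        self.sent.append((port, bytes(packet)))
        return True
```

A packet addressed to an unregistered device is simply dropped, matching the behavior that the system does not attempt to access a device that was not found at startup.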
The addresses and their associated ports are uploaded into the master 12 upon startup of the system. This procedure is embedded in the software and such a procedure is well known in the art.
For example, in Figure 4, an electrocautery device 18 transmits an address to the master controller 12. The address is received at one of a plurality of communication ports 46, and the address is saved in the memory along with the associated communication port number. It is to be appreciated that the valid selection commands are stored on the transportable memory. For devices directly connected to the master, the language model may be stored in a memory in the master 12 or in the transportable memory.
Language models are stored in associated slaves for devices that are directly connected to a slave 14. In this fashion, upon startup, the master 12 knows all devices that are connected to the system, as each slave sends to the master the addresses of each device and the name (i.e. coded phonemes that constitute the device) of the device. The names of the devices are uploaded into the master so that the validation of selection commands may take place in the master 12. However, language models for the validation of control commands are not transmitted to the master 12, as this would take much time and slow the control system 10 down. Therefore, the master controller 12 actually contains a subset of the grammar necessary to operate the devices in connection therewith, but that language model is limited to only the device names. The information regarding valid sequences of control commands (i.e. their control language model) is stored on each slave controller to which they are connected. Of course, if the device 14 is directly connected to the master, then the language model is stored at the master 12 as described hereinabove.
The control system 10 in accordance with the present invention provides a way to configure and reconfigure an operating room in a very simple fashion.
Additionally, it is to be appreciated that the system 10 provides an intuitive interface whereby a user can select a device to control and then subsequently control that device. The system checks to ensure that control commands received for a specific device are valid.
Additionally, the system 10 requires the inclusion of adapters 52 placed intermediate a specific one of the plurality of devices 16 and a slave or the master 12.
The adapters 52 transmit signals to their respective slave 14 or master 12 indicative of the address of the device, and translate control signals sent from the controller to which they are connected to signals understood by the particular device for which they are intended.
Such adapters are easily constructed and are well-known in the art.
Additionally, such adapters may be included either in the respective slave 14 or master 12, or attached to the particular one of the plurality of devices 16 itself. There is substantial advantage to attaching the adapters 52 to the devices 16, as then the devices may be attached to any port; whereas, if the adapters are attached inside the controller 12, 14, the specific device for which they were designed must be attached to the specific one of the plurality of communication ports 46.
If new devices are added to the system, or if improvements or upgrades are made to the system software, such changes may be incorporated into a PCMCIA format card, such as the card that stores the user voice models. The card may be inserted into the same interface, and system software may be uploaded into the master to make the upgrade without having to disassemble the master. This is accomplished by incorporating a serial interface on the PCMCIA format card. As such, the central processor 44 additionally checks upon startup whether there is a system upgrade to be made by checking the data being supplied by the PCMCIA format card. Checking the activity of a serial interface is well known; however, it has not heretofore been known to incorporate a serial interface on a single PCMCIA format card. Therefore, the combination is seen to be novel. Additionally, it has heretofore not been known to incorporate voice models on such a PCMCIA format card.
Each of the at least one slave 14 is substantially similar to the master controller 12. And, each of the plurality of slaves 14 may include the full VCI so that each slave 14 can operate as a master. Alternatively, although not preferred, the slaves may not include the feature extractor, and only contain a subset of the language model (i.e.
control commands) relating to the operation of each specific device. This is all that may be necessary in the slave because the slave receives from the master controller the specific address to which a command is to be sent, and an indication that it is in fact a command. Therefore, the slave only needs to check to ensure that it is a valid command for the specific device. In this fashion, devices may be directly connected to the master, or they may be connected to a slave which is in communication with the master 12.
The system 10 may include output means including a video monitor 86 and a speaker 88. The speaker may be incorporated into the VCI 32 via a D/A
converter 90 such that the system may communicate to the user any errors committed by the user in operating or selecting a specific device. Additionally, the output means may communicate system errors or the malfunction of a specific device. Such information is included in each specific adapter and is specific to the device attached to the adapter. It is to be appreciated that such communications would be transmitted to the master, where they would be either audibly or visually presented. The system and controller in accordance with the invention may additionally include a foot controller, a hand controller or other well-known controllers. Each of these controllers may be used to control any of the devices connected to the master or a slave, as is described in the patent application incorporated herein by reference. As such, the VCI may only be used to select certain devices, and once selected, the device may be controlled via one of the well-known controllers. Ultimately, the flexibility of such a system can reduce costs and increase the safety of surgical procedures.
Finally, the system 10 may include a connection to a hospital computer network via a network gateway 500. Hospital networks are implemented in substantially all hospitals and provide for electronic storage of patient records as well as scheduling and financial information.
l0 The network gateway 500 is preferably a personal computer such as an IBM
compatible, or some other well known personal computer running web browsing software such as Microsoft Internet Explorer, Netscape Communicator or any other known web browsing software.
By connecting to the hospital network, patient information that is available at computer terminals in the hospital would also be made available in the operating room.
As such, a vocabulary for accessing patient data must be provided to be used with the control system. Examples of such vocabulary include the commands "get", "load"
and "display". The data that may be provided includes, but is not limited to x-rays, patient history, MRIs, angiography and CAT scans.
Through the use of a web browser, the patient data may be sent to the gateway 500 in a format to be displayed either by a monitor 510 connected to the gateway or directly by the monitor. This would be accomplished through electrical connections already disclosed hereinabove.
To effectuate the display of patient data in a web browsable format, essentially HTML or some other well-known web format, the data must be provided to the gateway 500 in such a format. An access port, essentially consisting of a URL, provides a location for the web browser to obtain patient information. This URL provides an interface into the hospital network.
Accessing of patient information is well known; however, the reformatting of such information into an HTML document is new. Because HTML formatting is well known, the specifics of such formatting will not be disclosed herein. In this fashion, patient data may be accessed via voice commands and displayed on a monitor or a display coupled to the gateway 500.
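The voice-to-URL mapping described above can be sketched as follows. The base URL, record layout, and `fetch` callable are all hypothetical; the patent specifies only that a URL serves as the access port and that commands such as "get", "load" and "display" form the retrieval vocabulary.

```python
# Hypothetical access port into the hospital network; the real URL scheme
# is not specified by the patent.
BASE_URL = "http://hospital.example/records"

RETRIEVAL_VOCABULARY = ("get", "load", "display")

def patient_data_url(patient_id, record_type):
    """Build the URL the gateway's web browser would request for one
    HTML-formatted record (e.g. an x-ray or patient history page)."""
    return f"{BASE_URL}/{patient_id}/{record_type}.html"

def handle_command(verb, patient_id, record_type, fetch):
    """Map a spoken retrieval command onto a fetch of the HTML record.
    `fetch` stands in for the gateway's HTTP client; anything outside the
    retrieval vocabulary is ignored, returning None."""
    if verb not in RETRIEVAL_VOCABULARY:
        return None
    return fetch(patient_data_url(patient_id, record_type))
```

The returned HTML would then be rendered by the browser on the monitor 510, while non-retrieval speech falls through to the device-control path described earlier.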
While certain exemplary embodiments of the present invention have been described and shown on the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that this invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.
Claims (24)
1. An operating room control system for use during a medical procedure on a patient, comprising:
a plurality of operating room devices;
a voice input device capable of receiving a spoken device command that has an associated device qualifier, and a data retrieval command having an associated patient record identifier;
a display device; and a controller coupled to the input device and the display device, the controller capable of receiving the data retrieval command and the device command, wherein the controller transmits a command to a server to retrieve patient information in response to the data retrieval command so that desired patient data can be selected in response to the patient record identifier, wherein the controller receives the patient information from the server and displays the patient information on the display device, and wherein the controller transmits a control command to a selected operating room device in response to the device command, the operating room device selected from the plurality of operating room devices in response to the device qualifier so as to allow voice manipulation of an operating room environment.
2. The operating room control system of claim 1 wherein the server is located outside the operating room.
3. The operating room control system of claim 1 wherein the patient information includes one or more of the following: an MRI, CT, X-ray, still-frame video, moving video, ultrasound image, patient records, patient notes, and patient schedule.
4. The operating room control system of claim 1 wherein the server is coupled to the controller via one or more of the following: a local area network, wide area network, Internet, and Intranet.
5. The operating room control system of claim 1 wherein the controller includes a memory containing an application program, and a processor, said processor to execute the application program to cause the patient information to be displayed on the display device in a web browseable format.
6. The operating room control system of claim 1 wherein the input device comprises one or more of the following: a microphone, keyboard, mouse, foot pedal, and hand-held device.
7. The operating room control system of claim 1, wherein the controller compares the device command with a list of stored control commands associated with the selected operating room device, and transmits a control signal to a medical instrument for controlling the operating room device if there is a match, wherein the operating room devices comprise one or more of the following: a robotic arm, electrocautery device, operating room table, operating room lights, insufflator, and camera.
8. The operating room control system of claim 1 wherein the controller captures one or more units of patient data, transmits the one or more units of data and a request to a second server located outside the operating room to store the one or more units of patient data.
9. The operating room control system of claim 8 wherein the second server is coupled to the controller via one or more of the following: a local area network, wide area network, Internet, and Intranet.
10. The operating room control system of claim 8 wherein the one or more units of patient data comprises one or more of the following: an image captured by an image capturing device coupled to the controller, a moving video captured by the image capturing device coupled to the controller, and notes captured via the input device regarding the patient.
11. A method for accessing medical data from a server while performing a medical procedure, comprising:
receiving a plurality of commands from an input device including an instrument command that has an associated instrument qualifier, and a data retrieval command having an associated patient record identifier;
transmitting a request to the server for the unit of patient data in response to the data retrieval command;
receiving the unit of patient data from the server; and displaying the unit of patient data on a display device.
12. The method of claim 11 wherein the server is located remotely from a location where the medical procedure is performed.
13. The method of claim 11 wherein the instrument command, associated instrument qualifier, data retrieval command and associated data retrieval qualifier each comprise an audible command.
14. The method of claim 11 wherein transmitting the request to the server comprises transmitting a URL of the server for the unit of patient data.
15. The method of claim 11 wherein the unit of patient data includes one or more of the following: an MRI, CT, X-ray, still-frame video, moving video, ultrasound image, patient records, patient notes, and patient schedule.
16. The method of claim 11 further comprising:
capturing a second unit of patient data during the medical procedure;
transmitting the second unit of patient data to a second server with a request to store the second unit of patient data.
17. An operating room control system for use during a medical procedure on a patient, comprising:
a plurality of medical devices, each device having an associated device identifier;
a voice input device capable of receiving spoken device commands having associated device qualifiers, and also capable of receiving spoken voice data retrieval commands having an associated data retrieval qualifier;
a display device;
a master controller coupled to the input device, the master controller capable of receiving the data retrieval commands and the device commands, wherein the master controller receives a command having a qualifier and transmits the command to a network so as to retrieve a unit of information when the qualifier is a data retrieval qualifier, and wherein the master controller transmits the command to a selected medical device when the qualifier is a device qualifier, the selected medical device being selected from the plurality of medical devices and having an associated device identifier matching the device qualifier; and a slave controller coupled to the master controller and the display device, the slave controller capable of transmitting a request for the unit of information to a server, receiving the unit of information from the server, and displaying the unit of information on the display device.
18. The operating room control system of claim 17 wherein the unit of information includes one or more of the following: an MRI, CT, X-ray, still-frame video, moving video, ultrasound image, patient records, patient notes, and patient schedule.
19. The operating room control system of claim 17 wherein the server is coupled to a slave controller via one or more of the following: a local area network, wide area network, Internet, and Intranet.
20. The operating room control system of claim 17 wherein the request for the unit of information includes a patient identification number.
21. The operating room control system of claim 17 wherein the server is located remotely from the slave controller.
22. The method of claim 11, further comprising transmitting a control signal to one or more medical instruments in response to the instrument command.
23. The method of claim 22, wherein transmitting a control signal comprises comparing the instrument command with a list of one or more stored control commands associated with the one or more medical instruments, and transmits one or more control signals to a medical instrument for controlling the medical instrument if there is a match.
24. The method of claim 23, wherein the medical instruments comprise one or more of the following: a robotic arm, electrocautery device, insufflator, and camera.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/354944 | 1999-07-15 | ||
US09/354,944 US6496099B2 (en) | 1996-06-24 | 1999-07-15 | General purpose distributed operating room control system |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2313996A1 CA2313996A1 (en) | 2001-01-15 |
CA2313996C true CA2313996C (en) | 2007-04-03 |
Family
ID=23395551
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002313996A Expired - Lifetime CA2313996C (en) | 1999-07-15 | 2000-07-13 | General purpose distributed operating room control system |
Country Status (4)
Country | Link |
---|---|
US (3) | US6496099B2 (en) |
EP (1) | EP1068837A1 (en) |
JP (1) | JP2001104336A (en) |
CA (1) | CA2313996C (en) |
US20030135204A1 (en) | 2001-02-15 | 2003-07-17 | Endo Via Medical, Inc. | Robotically controlled medical instrument with a flexible section |
US7699835B2 (en) | 2001-02-15 | 2010-04-20 | Hansen Medical, Inc. | Robotically controlled surgical instruments |
US7766894B2 (en) | 2001-02-15 | 2010-08-03 | Hansen Medical, Inc. | Coaxial catheter system |
US8414505B1 (en) | 2001-02-15 | 2013-04-09 | Hansen Medical, Inc. | Catheter driver system |
KR100438838B1 (en) * | 2002-01-29 | 2004-07-05 | 삼성전자주식회사 | A voice command interpreter with dialogue focus tracking function and method thereof |
WO2003086714A2 (en) * | 2002-04-05 | 2003-10-23 | The Trustees Of Columbia University In The City Of New York | Robotic scrub nurse |
KR20040000920A (en) * | 2002-06-26 | 2004-01-07 | 텔원정보통신 주식회사 | Audio control apparatus of home automation system and audio control method of home automation system |
US20040162637A1 (en) * | 2002-07-25 | 2004-08-19 | Yulun Wang | Medical tele-robotic system with a master remote station with an arbitrator |
US7593030B2 (en) | 2002-07-25 | 2009-09-22 | Intouch Technologies, Inc. | Tele-robotic videoconferencing in a corporate environment |
US6925357B2 (en) * | 2002-07-25 | 2005-08-02 | Intouch Health, Inc. | Medical tele-robotic system |
WO2004014244A2 (en) | 2002-08-13 | 2004-02-19 | Microbotics Corporation | Microsurgical robot system |
US20040176751A1 (en) | 2002-08-14 | 2004-09-09 | Endovia Medical, Inc. | Robotic medical instrument system |
US7331967B2 (en) | 2002-09-09 | 2008-02-19 | Hansen Medical, Inc. | Surgical instrument coupling mechanism |
JP4096687B2 (en) * | 2002-10-09 | 2008-06-04 | 株式会社デンソー | EEPROM and method of manufacturing the same |
US7252633B2 (en) * | 2002-10-18 | 2007-08-07 | Olympus Corporation | Remote controllable endoscope system |
US7246192B1 (en) | 2003-01-10 | 2007-07-17 | Marvell International Ltd. | Serial/parallel ATA controller and converter |
US7158859B2 (en) * | 2003-01-15 | 2007-01-02 | Intouch Technologies, Inc. | 5 degrees of freedom mobile robot |
US7844657B2 (en) * | 2003-01-17 | 2010-11-30 | Storz Endoskop Produktions Gmbh | System for controlling medical devices |
US7171286B2 (en) * | 2003-02-24 | 2007-01-30 | Intouch Technologies, Inc. | Healthcare tele-robotic system with a robot that also functions as a remote station |
US7158860B2 (en) | 2003-02-24 | 2007-01-02 | Intouch Technologies, Inc. | Healthcare tele-robotic system which allows parallel remote station observation |
US7262573B2 (en) | 2003-03-06 | 2007-08-28 | Intouch Technologies, Inc. | Medical tele-robotic system with a head worn device |
US7591783B2 (en) | 2003-04-01 | 2009-09-22 | Boston Scientific Scimed, Inc. | Articulation joint for video endoscope |
US8118732B2 (en) | 2003-04-01 | 2012-02-21 | Boston Scientific Scimed, Inc. | Force feedback control system for video endoscope |
US7578786B2 (en) | 2003-04-01 | 2009-08-25 | Boston Scientific Scimed, Inc. | Video endoscope |
US20050245789A1 (en) | 2003-04-01 | 2005-11-03 | Boston Scientific Scimed, Inc. | Fluid manifold for endoscope system |
US20040199052A1 (en) | 2003-04-01 | 2004-10-07 | Scimed Life Systems, Inc. | Endoscopic imaging system |
US7361171B2 (en) | 2003-05-20 | 2008-04-22 | Raydiance, Inc. | Man-portable optical ablation system |
JP2004351533A (en) * | 2003-05-27 | 2004-12-16 | Fanuc Ltd | Robot system |
US8007511B2 (en) | 2003-06-06 | 2011-08-30 | Hansen Medical, Inc. | Surgical instrument design |
US6888333B2 (en) * | 2003-07-02 | 2005-05-03 | Intouch Health, Inc. | Holonomic platform for a robot |
US7960935B2 (en) | 2003-07-08 | 2011-06-14 | The Board Of Regents Of The University Of Nebraska | Robotic devices with agent delivery components and related methods |
US7042184B2 (en) | 2003-07-08 | 2006-05-09 | Board Of Regents Of The University Of Nebraska | Microrobot for surgical applications |
US20080058989A1 (en) * | 2006-04-13 | 2008-03-06 | Board Of Regents Of The University Of Nebraska | Surgical camera robot |
US20050177143A1 (en) * | 2003-08-11 | 2005-08-11 | Jeff Bullington | Remotely-controlled ablation of surfaces |
US9022037B2 (en) | 2003-08-11 | 2015-05-05 | Raydiance, Inc. | Laser ablation method and apparatus having a feedback loop and control unit |
US8921733B2 (en) | 2003-08-11 | 2014-12-30 | Raydiance, Inc. | Methods and systems for trimming circuits |
US8173929B1 (en) | 2003-08-11 | 2012-05-08 | Raydiance, Inc. | Methods and systems for trimming circuits |
US8930583B1 (en) | 2003-09-18 | 2015-01-06 | Marvell Israel (M.I.S.L) Ltd. | Method and apparatus for controlling data transfer in a serial-ATA system |
JP2005111085A (en) * | 2003-10-09 | 2005-04-28 | Olympus Corp | Operation supporting system |
DE10351199B3 (en) * | 2003-11-03 | 2005-06-30 | Erbe Elektromedizin Gmbh | Control device for controlling electromedical devices |
US7161322B2 (en) * | 2003-11-18 | 2007-01-09 | Intouch Technologies, Inc. | Robot with a manipulator arm |
US7292912B2 (en) * | 2003-12-05 | 2007-11-06 | Intouch Technologies, Inc. | Door knocker control system for a remote controlled teleconferencing robot |
US7813836B2 (en) | 2003-12-09 | 2010-10-12 | Intouch Technologies, Inc. | Protocol for a remotely controlled videoconferencing robot |
US8065418B1 (en) * | 2004-02-02 | 2011-11-22 | Apple Inc. | NAT traversal for media conferencing |
US20050204438A1 (en) | 2004-02-26 | 2005-09-15 | Yulun Wang | Graphical interface for a remote presence system |
US8077963B2 (en) | 2004-07-13 | 2011-12-13 | Yulun Wang | Mobile robot with a head-based movement mapping scheme |
EP1799096A2 (en) | 2004-09-30 | 2007-06-27 | Boston Scientific Scimed, Inc. | System and method of obstruction removal |
US7479106B2 (en) | 2004-09-30 | 2009-01-20 | Boston Scientific Scimed, Inc. | Automated control of irrigation and aspiration in a single-use endoscope |
EP1799095A2 (en) | 2004-09-30 | 2007-06-27 | Boston Scientific Scimed, Inc. | Adapter for use with digital imaging medical device |
CA2581079A1 (en) | 2004-09-30 | 2006-04-13 | Boston Scientific Scimed, Inc. | Multi-functional endoscopic system for use in electrosurgical applications |
US8083671B2 (en) | 2004-09-30 | 2011-12-27 | Boston Scientific Scimed, Inc. | Fluid delivery system for use with an endoscope |
US7241263B2 (en) | 2004-09-30 | 2007-07-10 | Scimed Life Systems, Inc. | Selectively rotatable shaft coupler |
EP1835871B1 (en) * | 2004-12-22 | 2013-05-22 | Bracco Diagnostics Inc. | System, imaging suite, and method for using an electro-pneumatic insufflator for magnetic resonance imaging |
US7222000B2 (en) * | 2005-01-18 | 2007-05-22 | Intouch Technologies, Inc. | Mobile videoconferencing platform with automatic shut-off features |
US20060224766A1 (en) * | 2005-03-31 | 2006-10-05 | Malackowski Donald W | Operating room communication bus and method |
US7846107B2 (en) | 2005-05-13 | 2010-12-07 | Boston Scientific Scimed, Inc. | Endoscopic apparatus with integrated multiple biopsy device |
US8097003B2 (en) | 2005-05-13 | 2012-01-17 | Boston Scientific Scimed, Inc. | Endoscopic apparatus with integrated variceal ligation device |
US20070015999A1 (en) * | 2005-07-15 | 2007-01-18 | Heldreth Mark A | System and method for providing orthopaedic surgical information to a surgeon |
US8135050B1 (en) | 2005-07-19 | 2012-03-13 | Raydiance, Inc. | Automated polarization correction |
US8052597B2 (en) | 2005-08-30 | 2011-11-08 | Boston Scientific Scimed, Inc. | Method for forming an endoscope articulation joint |
US9198728B2 (en) | 2005-09-30 | 2015-12-01 | Intouch Technologies, Inc. | Multi-camera mobile teleconferencing platform |
US20070078678A1 (en) * | 2005-09-30 | 2007-04-05 | Disilvestro Mark R | System and method for performing a computer assisted orthopaedic surgical procedure |
US7620553B2 (en) * | 2005-12-20 | 2009-11-17 | Storz Endoskop Produktions Gmbh | Simultaneous support of isolated and connected phrase command recognition in automatic speech recognition systems |
US7967759B2 (en) | 2006-01-19 | 2011-06-28 | Boston Scientific Scimed, Inc. | Endoscopic system with integrated patient respiratory status indicator |
US8232687B2 (en) | 2006-04-26 | 2012-07-31 | Raydiance, Inc. | Intelligent laser interlock system |
US7444049B1 (en) | 2006-01-23 | 2008-10-28 | Raydiance, Inc. | Pulse stretcher and compressor including a multi-pass Bragg grating |
US9130344B2 (en) | 2006-01-23 | 2015-09-08 | Raydiance, Inc. | Automated laser tuning |
US7769492B2 (en) | 2006-02-22 | 2010-08-03 | Intouch Technologies, Inc. | Graphical interface for a remote presence system |
US8888684B2 (en) | 2006-03-27 | 2014-11-18 | Boston Scientific Scimed, Inc. | Medical devices with local drug delivery capabilities |
US7822347B1 (en) | 2006-03-28 | 2010-10-26 | Raydiance, Inc. | Active tuning of temporal dispersion in an ultrashort pulse laser system |
US7955255B2 (en) | 2006-04-20 | 2011-06-07 | Boston Scientific Scimed, Inc. | Imaging assembly with transparent distal cap |
US8202265B2 (en) | 2006-04-20 | 2012-06-19 | Boston Scientific Scimed, Inc. | Multiple lumen assembly for use in endoscopes or other medical devices |
US8635082B2 (en) | 2006-05-25 | 2014-01-21 | DePuy Synthes Products, LLC | Method and system for managing inventories of orthopaedic implants |
US8849679B2 (en) | 2006-06-15 | 2014-09-30 | Intouch Technologies, Inc. | Remote controlled robot system that provides medical images |
US8015014B2 (en) * | 2006-06-16 | 2011-09-06 | Storz Endoskop Produktions Gmbh | Speech recognition system with user profiles management component |
US9579088B2 (en) | 2007-02-20 | 2017-02-28 | Board Of Regents Of The University Of Nebraska | Methods, systems, and devices for surgical visualization and device manipulation |
CA2991346C (en) | 2006-06-22 | 2020-03-10 | Board Of Regents Of The University Of Nebraska | Magnetically coupleable robotic devices and related methods |
US8679096B2 (en) | 2007-06-21 | 2014-03-25 | Board Of Regents Of The University Of Nebraska | Multifunctional operational component for robotic devices |
US7921017B2 (en) * | 2006-07-20 | 2011-04-05 | Abbott Medical Optics Inc | Systems and methods for voice control of a medical device |
US7772962B2 (en) * | 2006-08-02 | 2010-08-10 | Maciej Labowicz | Multiple lock security system for cargo trailers |
US8502876B2 (en) * | 2006-09-12 | 2013-08-06 | Storz Endoskop Produktions Gmbh | Audio, visual and device data capturing system with real-time speech recognition command and control system |
US9514746B2 (en) * | 2006-09-26 | 2016-12-06 | Storz Endoskop Produktions Gmbh | System and method for hazard mitigation in voice-driven control applications |
US7925511B2 (en) * | 2006-09-29 | 2011-04-12 | Nellcor Puritan Bennett Llc | System and method for secure voice identification in a medical device |
US7761185B2 (en) | 2006-10-03 | 2010-07-20 | Intouch Technologies, Inc. | Remote presence display through remotely controlled robot |
US20080091432A1 (en) * | 2006-10-17 | 2008-04-17 | Donald Dalton | System and method for voice control of electrically powered devices |
US8037179B2 (en) * | 2006-11-02 | 2011-10-11 | Storz Endoskop Produktions Gmbh | Device control system employing extensible markup language for defining information resources |
US8265793B2 (en) | 2007-03-20 | 2012-09-11 | Irobot Corporation | Mobile robot for telecommunication |
US9160783B2 (en) | 2007-05-09 | 2015-10-13 | Intouch Technologies, Inc. | Robot system that operates through a network firewall |
US20080289660A1 (en) * | 2007-05-23 | 2008-11-27 | Air Products And Chemicals, Inc. | Semiconductor Manufacture Employing Isopropanol Drying |
EP2170564A4 (en) | 2007-07-12 | 2015-10-07 | Univ Nebraska | Methods and systems of actuation in robotic devices |
CA2695619C (en) | 2007-08-15 | 2015-11-24 | Board Of Regents Of The University Of Nebraska | Modular and cooperative medical devices and related systems and methods |
JP2010536435A (en) | 2007-08-15 | 2010-12-02 | ボード オブ リージェンツ オブ ザ ユニバーシティ オブ ネブラスカ | Medical inflation, attachment and delivery devices and associated methods |
US8116910B2 (en) | 2007-08-23 | 2012-02-14 | Intouch Technologies, Inc. | Telepresence robot with a printer |
US8265949B2 (en) | 2007-09-27 | 2012-09-11 | Depuy Products, Inc. | Customized patient surgical plan |
ES2733937T3 (en) | 2007-09-30 | 2019-12-03 | Depuy Products Inc | Specific patient-specific orthopedic surgical instrument |
EP2060986B1 (en) | 2007-11-13 | 2019-01-02 | Karl Storz SE & Co. KG | System and method for management of processes in a hospital and/or in an operating room |
US7903326B2 (en) | 2007-11-30 | 2011-03-08 | Raydiance, Inc. | Static phase mask for high-order spectral phase control in a hybrid chirped pulse amplifier system |
US8633975B2 (en) | 2008-01-16 | 2014-01-21 | Karl Storz Imaging, Inc. | Network based endoscopic surgical system |
US10875182B2 (en) | 2008-03-20 | 2020-12-29 | Teladoc Health, Inc. | Remote presence system mounted to operating room hardware |
US8179418B2 (en) | 2008-04-14 | 2012-05-15 | Intouch Technologies, Inc. | Robotic based health care system |
US8170241B2 (en) | 2008-04-17 | 2012-05-01 | Intouch Technologies, Inc. | Mobile tele-presence system with a microphone system |
US20090276515A1 (en) * | 2008-05-02 | 2009-11-05 | Boston Scientific Scimed, Inc. | Multi-modality network for improved workflow |
WO2009137688A2 (en) * | 2008-05-07 | 2009-11-12 | Carrot Medical Llc | Integration system for medical instruments with remote control |
US9193065B2 (en) | 2008-07-10 | 2015-11-24 | Intouch Technologies, Inc. | Docking system for a tele-presence robot |
US9842192B2 (en) | 2008-07-11 | 2017-12-12 | Intouch Technologies, Inc. | Tele-presence robot system with multi-cast features |
WO2010018907A1 (en) * | 2008-08-14 | 2010-02-18 | (주)미래컴퍼니 | Robot system for performing surgery using a client server method |
US8340819B2 (en) | 2008-09-18 | 2012-12-25 | Intouch Technologies, Inc. | Mobile videoconferencing robot system with network adaptive driving |
US9259274B2 (en) | 2008-09-30 | 2016-02-16 | Intuitive Surgical Operations, Inc. | Passive preload and capstan drive for surgical instruments |
US9339342B2 (en) | 2008-09-30 | 2016-05-17 | Intuitive Surgical Operations, Inc. | Instrument interface |
US8996165B2 (en) | 2008-10-21 | 2015-03-31 | Intouch Technologies, Inc. | Telepresence robot with a camera boom |
US8605885B1 (en) * | 2008-10-23 | 2013-12-10 | Next It Corporation | Automated assistant for customer service representatives |
US8498538B2 (en) | 2008-11-14 | 2013-07-30 | Raydiance, Inc. | Compact monolithic dispersion compensator |
US9138891B2 (en) | 2008-11-25 | 2015-09-22 | Intouch Technologies, Inc. | Server connectivity control for tele-presence robot |
US20100131280A1 (en) * | 2008-11-25 | 2010-05-27 | General Electric Company | Voice recognition system for medical devices |
US8463435B2 (en) | 2008-11-25 | 2013-06-11 | Intouch Technologies, Inc. | Server connectivity control for tele-presence robot |
US8849680B2 (en) | 2009-01-29 | 2014-09-30 | Intouch Technologies, Inc. | Documentation through a remote presence robot |
US8897920B2 (en) | 2009-04-17 | 2014-11-25 | Intouch Technologies, Inc. | Tele-presence robot system with software modularity, projector and laser pointer |
US8384755B2 (en) | 2009-08-26 | 2013-02-26 | Intouch Technologies, Inc. | Portable remote presence robot |
US11399153B2 (en) | 2009-08-26 | 2022-07-26 | Teladoc Health, Inc. | Portable telepresence apparatus |
MX2009010902A (en) * | 2009-10-08 | 2011-04-20 | Magno Alcantara Talavera | Voice control system and method. |
EP2512754A4 (en) | 2009-12-17 | 2016-11-30 | Univ Nebraska | Modular and cooperative medical devices and related systems and methods |
US11154981B2 (en) | 2010-02-04 | 2021-10-26 | Teladoc Health, Inc. | Robot user interface for telepresence robot system |
US8670017B2 (en) | 2010-03-04 | 2014-03-11 | Intouch Technologies, Inc. | Remote presence system including a cart that supports a robot face and an overhead camera |
US8935005B2 (en) | 2010-05-20 | 2015-01-13 | Irobot Corporation | Operating a mobile robot |
US8918213B2 (en) | 2010-05-20 | 2014-12-23 | Irobot Corporation | Mobile human interface robot |
US9014848B2 (en) | 2010-05-20 | 2015-04-21 | Irobot Corporation | Mobile robot system |
US10343283B2 (en) | 2010-05-24 | 2019-07-09 | Intouch Technologies, Inc. | Telepresence robot system that can be accessed by a cellular phone |
US10808882B2 (en) | 2010-05-26 | 2020-10-20 | Intouch Technologies, Inc. | Tele-robotic system with a robot face placed on a chair |
WO2013022423A1 (en) | 2010-08-06 | 2013-02-14 | Board Of Regents Of The University Of Nebraska | Methods and systems for handling or delivering materials for natural orifice surgery |
WO2012021748A1 (en) | 2010-08-12 | 2012-02-16 | Raydiance, Inc. | Polymer tubing laser micromachining |
WO2012037468A1 (en) | 2010-09-16 | 2012-03-22 | Raydiance, Inc. | Singulation of layered materials using selectively variable laser output |
US9264664B2 (en) | 2010-12-03 | 2016-02-16 | Intouch Technologies, Inc. | Systems and methods for dynamic bandwidth allocation |
US8930019B2 (en) | 2010-12-30 | 2015-01-06 | Irobot Corporation | Mobile human interface robot |
KR102018763B1 (en) | 2011-01-28 | 2019-09-05 | 인터치 테크놀로지스 인코퍼레이티드 | Interfacing with a mobile telepresence robot |
US9323250B2 (en) | 2011-01-28 | 2016-04-26 | Intouch Technologies, Inc. | Time-dependent navigation of telepresence robots |
US20120252367A1 (en) * | 2011-04-04 | 2012-10-04 | Meditalk Devices, Llc | Auditory Speech Module For Medical Devices |
US10769739B2 (en) | 2011-04-25 | 2020-09-08 | Intouch Technologies, Inc. | Systems and methods for management of information among medical providers and facilities |
US20140139616A1 (en) | 2012-01-27 | 2014-05-22 | Intouch Technologies, Inc. | Enhanced Diagnostics for a Telepresence Robot |
US9098611B2 (en) | 2012-11-26 | 2015-08-04 | Intouch Technologies, Inc. | Enhanced video interaction for a user interface of a telepresence network |
CA2838637C (en) | 2011-06-10 | 2020-11-17 | Board Of Regents Of The University Of Nebraska | Methods, systems, and devices relating to surgical end effectors |
JP6106169B2 (en) | 2011-07-11 | 2017-03-29 | ボード オブ リージェンツ オブ ザ ユニバーシティ オブ ネブラスカ | Surgical robot system |
US20130041662A1 (en) * | 2011-08-08 | 2013-02-14 | Sony Corporation | System and method of controlling services on a device using voice data |
US10239160B2 (en) | 2011-09-21 | 2019-03-26 | Coherent, Inc. | Systems and processes that singulate materials |
US8836751B2 (en) | 2011-11-08 | 2014-09-16 | Intouch Technologies, Inc. | Tele-presence system with a user interface that displays different communication links |
CA3098065C (en) | 2012-01-10 | 2023-10-31 | Board Of Regents Of The University Of Nebraska | Methods, systems, and devices for surgical access and insertion |
US8902278B2 (en) | 2012-04-11 | 2014-12-02 | Intouch Technologies, Inc. | Systems and methods for visualizing and managing telepresence devices in healthcare networks |
US9251313B2 (en) | 2012-04-11 | 2016-02-02 | Intouch Technologies, Inc. | Systems and methods for visualizing and managing telepresence devices in healthcare networks |
EP2844181B1 (en) | 2012-05-01 | 2021-03-10 | Board of Regents of the University of Nebraska | Single site robotic device and related systems |
WO2013176760A1 (en) | 2012-05-22 | 2013-11-28 | Intouch Technologies, Inc. | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
US9361021B2 (en) | 2012-05-22 | 2016-06-07 | Irobot Corporation | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
WO2013186794A2 (en) * | 2012-06-15 | 2013-12-19 | Suresh DESHPANDE | A voice controlled operation theater automation system |
EP3943255B1 (en) | 2012-06-22 | 2023-06-14 | Board of Regents of the University of Nebraska | Local control robotic surgical devices |
US9770305B2 (en) | 2012-08-08 | 2017-09-26 | Board Of Regents Of The University Of Nebraska | Robotic surgical devices, systems, and related methods |
WO2014025399A1 (en) | 2012-08-08 | 2014-02-13 | Board Of Regents Of The University Of Nebraska | Robotic surgical devices, systems, and related methods |
US9060674B2 (en) | 2012-10-11 | 2015-06-23 | Karl Storz Imaging, Inc. | Auto zoom for video camera |
US9539155B2 (en) | 2012-10-26 | 2017-01-10 | Hill-Rom Services, Inc. | Control system for patient support apparatus |
US9414740B2 (en) | 2013-03-14 | 2016-08-16 | Arthrex, Inc. | Endoscopic imaging system and method for adapting to remote stimulus |
US9743987B2 (en) | 2013-03-14 | 2017-08-29 | Board Of Regents Of The University Of Nebraska | Methods, systems, and devices relating to robotic surgical devices, end effectors, and controllers |
US9888966B2 (en) | 2013-03-14 | 2018-02-13 | Board Of Regents Of The University Of Nebraska | Methods, systems, and devices relating to force control surgical systems |
JP2016513556A (en) | 2013-03-15 | 2016-05-16 | ボード オブ リージェンツ オブ ザ ユニバーシティ オブ ネブラスカ | Robotic surgical devices, systems, and related methods |
US9498291B2 (en) | 2013-03-15 | 2016-11-22 | Hansen Medical, Inc. | Touch-free catheter user interface controller |
US10966700B2 (en) | 2013-07-17 | 2021-04-06 | Virtual Incision Corporation | Robotic surgical devices, systems and related methods |
WO2015143067A1 (en) * | 2014-03-19 | 2015-09-24 | Intuitive Surgical Operations, Inc. | Medical devices, systems, and methods using eye gaze tracking |
CN106659541B (en) | 2014-03-19 | 2019-08-16 | 直观外科手术操作公司 | Integrated eyeball stares medical device, the system and method that tracking is used for stereoscopic viewer |
EP3191009B1 (en) | 2014-09-12 | 2021-03-31 | Board of Regents of the University of Nebraska | Quick-release end effectors and related systems |
WO2016077478A1 (en) | 2014-11-11 | 2016-05-19 | Board Of Regents Of The University Of Nebraska | Robotic device with compact joint design and related systems and methods |
JP6501217B2 (en) * | 2015-02-16 | 2019-04-17 | アルパイン株式会社 | Information terminal system |
WO2017024081A1 (en) | 2015-08-03 | 2017-02-09 | Board Of Regents Of The University Of Nebraska | Robotic surgical devices systems and related methods |
JP7176757B2 (en) | 2016-05-18 | 2022-11-22 | バーチャル インシジョン コーポレイション | ROBOTIC SURGICAL DEVICES, SYSTEMS AND RELATED METHODS |
CN110248614B (en) | 2016-08-25 | 2023-04-18 | 内布拉斯加大学董事会 | Quick release tool couplers and related systems and methods |
JP7090615B2 (en) | 2016-08-30 | 2022-06-24 | ボード オブ リージェンツ オブ ザ ユニバーシティ オブ ネブラスカ | Robot device |
EP3539117A4 (en) * | 2016-11-10 | 2020-03-25 | Think Surgical, Inc. | Remote mentoring station |
CA3044674A1 (en) | 2016-11-22 | 2018-05-31 | Board Of Regents Of The University Of Nebraska | Improved gross positioning device and related systems and methods |
CN115553922A (en) | 2016-11-29 | 2023-01-03 | 虚拟切割有限公司 | User controller with user presence detection and related systems and methods |
US10722319B2 (en) | 2016-12-14 | 2020-07-28 | Virtual Incision Corporation | Releasable attachment device for coupling to medical devices and related systems and methods |
US10028794B2 (en) | 2016-12-19 | 2018-07-24 | Ethicon Llc | Surgical system with voice control |
US11862302B2 (en) | 2017-04-24 | 2024-01-02 | Teladoc Health, Inc. | Automated transcription and documentation of tele-health encounters |
US20200152190A1 (en) * | 2017-06-06 | 2020-05-14 | Intuitive Surgical Operations, Inc. | Systems and methods for state-based speech recognition in a teleoperational system |
US10483007B2 (en) | 2017-07-25 | 2019-11-19 | Intouch Technologies, Inc. | Modular telehealth cart with thermal imaging and touch screen user interface |
US11636944B2 (en) | 2017-08-25 | 2023-04-25 | Teladoc Health, Inc. | Connectivity infrastructure for a telehealth platform |
CN111417333B (en) | 2017-09-27 | 2023-08-29 | 虚拟切割有限公司 | Robotic surgical device with tracking camera technology and related systems and methods |
US11013564B2 (en) | 2018-01-05 | 2021-05-25 | Board Of Regents Of The University Of Nebraska | Single-arm robotic device with compact joint design and related systems and methods |
US10617299B2 (en) | 2018-04-27 | 2020-04-14 | Intouch Technologies, Inc. | Telehealth cart that supports a removable tablet with seamless audio/video switching |
EP3793780A4 (en) * | 2018-05-18 | 2022-10-05 | Corindus, Inc. | Remote communications and control system for robotic interventional procedures |
US11051829B2 (en) | 2018-06-26 | 2021-07-06 | DePuy Synthes Products, Inc. | Customized patient-specific orthopaedic surgical instrument |
CA3125742A1 (en) | 2019-01-07 | 2020-07-16 | Virtual Incision Corporation | Robotically assisted surgical system and related devices and methods |
US11717457B2 (en) * | 2019-02-18 | 2023-08-08 | Liko Research & Development Ab | Lift system with a stowable support assembly |
US20210196302A1 (en) * | 2019-12-30 | 2021-07-01 | Ethicon Llc | Method for operating a surgical instrument |
EP4287953A1 (en) * | 2021-02-05 | 2023-12-13 | Alcon Inc. | Voice-controlled surgical system |
Family Cites Families (174)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US977825A (en) | 1910-01-08 | 1910-12-06 | George N Murphy | Surgical instrument. |
GB955005A (en) | 1961-07-21 | 1964-04-08 | Molins Machine Co Ltd | Apparatus for gripping and lifting articles |
US3280991A (en) | 1964-04-28 | 1966-10-25 | Programmed & Remote Syst Corp | Position control manipulator |
US5196688A (en) | 1975-02-04 | 1993-03-23 | Telefunken Systemtechnik Gmbh | Apparatus for recognizing and following a target |
GB1569450A (en) | 1976-05-27 | 1980-06-18 | Nippon Electric Co | Speech recognition system |
US4128880A (en) | 1976-06-30 | 1978-12-05 | Cray Research, Inc. | Computer vector register processing |
US4058001A (en) | 1976-08-02 | 1977-11-15 | G. D. Searle & Co. | Ultrasound imaging system with improved scan conversion |
US4216462A (en) | 1978-03-06 | 1980-08-05 | General Electric Company | Patient monitoring and data processing system |
US4207959A (en) | 1978-06-02 | 1980-06-17 | New York University | Wheelchair mounted control apparatus |
US4221997A (en) | 1978-12-18 | 1980-09-09 | Western Electric Company, Incorporated | Articulated robot arm and method of moving same |
DE3045295A1 (en) | 1979-05-21 | 1982-02-18 | American Cystoscope Makers Inc | Surgical instrument for an endoscope |
US4367998A (en) | 1979-09-13 | 1983-01-11 | United Kingdom Atomic Energy Authority | Manipulators |
FR2482508A1 (en) | 1980-05-14 | 1981-11-20 | Commissariat Energie Atomique | MANIPULATOR AND MOTORIZED ORIENTATION BRACKET FOR SUCH A MANIPULATOR |
FR2492304A1 (en) | 1980-10-17 | 1982-04-23 | Commissariat Energie Atomique | TELEMANIPULATION ASSEMBLY MOUNTED ON A MOBILE PLATFORM AND COMPRISING A RETRACTABLE TELESCOPIC CARRIER ASSEMBLY WITHIN A SEALED HOOD, AND METHOD FOR SETTING UP ON AN ENCLOSURE |
JPS57118299A (en) | 1981-01-14 | 1982-07-23 | Nissan Motor | Voice load driver |
JPS58130393A (en) | 1982-01-29 | 1983-08-03 | 株式会社東芝 | Voice recognition equipment |
JPS58134357A (en) | 1982-02-03 | 1983-08-10 | Hitachi Ltd | Array processor |
US4456961A (en) | 1982-03-05 | 1984-06-26 | Texas Instruments Incorporated | Apparatus for teaching and transforming noncoincident coordinate systems |
US4503654A (en) * | 1982-09-24 | 1985-03-12 | Edward Cosentino | Method and apparatus for laying tile |
US4491135A (en) | 1982-11-03 | 1985-01-01 | Klein Harvey A | Surgical needle holder |
US4517963A (en) | 1983-01-04 | 1985-05-21 | Harold Unger | Image-erecting barrel rotator for articulated optical arm |
CS235792B1 (en) | 1983-04-06 | 1985-05-15 | Pavel Zahalka | Connection for direct heating of ionic melt especially of glass by alternating current passage with lower frequency than 50 hz |
US4503854A (en) | 1983-06-16 | 1985-03-12 | Jako Geza J | Laser surgery |
US4641292A (en) | 1983-06-20 | 1987-02-03 | George Tunnell | Voice controlled welding system |
US4604016A (en) | 1983-08-03 | 1986-08-05 | Joyce Stephen A | Multi-dimensional force-torque hand controller having force feedback |
US4586398A (en) | 1983-09-29 | 1986-05-06 | Hamilton Industries | Foot control assembly for power-operated tables and the like |
US4807723A (en) | 1983-10-17 | 1989-02-28 | Otis Elevator Company | Elevator roping arrangement |
US4635292A (en) | 1983-12-19 | 1987-01-06 | Matsushita Electric Industrial Co., Ltd. | Image processor |
US4616637A (en) | 1984-09-14 | 1986-10-14 | Precision Surgical Instruments, Inc. | Shoulder traction apparatus |
US4676243A (en) | 1984-10-31 | 1987-06-30 | Aldebaran Xiii Consulting Company | Automated anterior capsulectomy instrument |
JPH055529Y2 (en) | 1985-03-25 | 1993-02-15 | ||
US4624022A (en) * | 1985-05-10 | 1986-11-25 | Dolan Donald G | Device for holding a sheet or cover in position on the surface of a water bed mattress |
JPS61279491A (en) | 1985-05-31 | 1986-12-10 | 株式会社安川電機 | Visual apparatus holder |
US4672963A (en) | 1985-06-07 | 1987-06-16 | Israel Barken | Apparatus and method for computer controlled laser surgery |
US4945479A (en) | 1985-07-31 | 1990-07-31 | Unisys Corporation | Tightly coupled scientific processing system |
EP0223058B1 (en) * | 1985-10-25 | 1989-10-11 | INTERATOM Gesellschaft mit beschränkter Haftung | Method for soldering metallic catalyst support articles |
US4776016A (en) | 1985-11-21 | 1988-10-04 | Position Orientation Systems, Inc. | Voice control system |
US4817050A (en) * | 1985-11-22 | 1989-03-28 | Kabushiki Kaisha Toshiba | Database system |
US4750136A (en) | 1986-01-10 | 1988-06-07 | American Telephone And Telegraph, At&T Information Systems Inc. | Communication system having automatic circuit board initialization capability |
JPH085018B2 (en) | 1986-02-26 | 1996-01-24 | 株式会社日立製作所 | Remote manipulation method and apparatus |
US5078140A (en) | 1986-05-08 | 1992-01-07 | Kwoh Yik S | Imaging device - aided robotic stereotaxis system |
US4791934A (en) | 1986-08-07 | 1988-12-20 | Picker International, Inc. | Computer tomography assisted stereotactic surgery system and method |
SE464855B (en) | 1986-09-29 | 1991-06-24 | Asea Ab | Procedure of an industrial robot for calibration of a sensor |
DE3636678A1 (en) | 1986-10-28 | 1988-05-11 | Siemens Ag | X-ray diagnostic device |
US4854301A (en) | 1986-11-13 | 1989-08-08 | Olympus Optical Co., Ltd. | Endoscope apparatus having a chair with a switch |
JPH0829509B2 (en) | 1986-12-12 | 1996-03-27 | 株式会社日立製作所 | Control device for manipulator |
US4791940A (en) * | 1987-02-02 | 1988-12-20 | Florida Probe Corporation | Electronic periodontal probe with a constant force applier |
DE3889681T2 (en) | 1987-02-09 | 1994-09-08 | Sumitomo Electric Industries | Device for bending an elongated body. |
US4860215A (en) | 1987-04-06 | 1989-08-22 | California Institute Of Technology | Method and apparatus for adaptive force and position control of manipulators |
US5065741A (en) | 1987-04-16 | 1991-11-19 | Olympus Optical Co. Ltd. | Extracorporeal ultrasonic lithotripter with a variable focus |
US4863133A (en) | 1987-05-26 | 1989-09-05 | Leonard Medical | Arm device for adjustable positioning of a medical instrument or the like |
US4762455A (en) | 1987-06-01 | 1988-08-09 | Remote Technology Corporation | Remote manipulator |
US4852083A (en) | 1987-06-22 | 1989-07-25 | Texas Instruments Incorporated | Digital crossbar switch |
JPH088933B2 (en) | 1987-07-10 | 1996-01-31 | 日本ゼオン株式会社 | Catheter |
US4794912A (en) | 1987-08-17 | 1989-01-03 | Welch Allyn, Inc. | Borescope or endoscope with fluid dynamic muscle |
JP2602240B2 (en) | 1987-08-28 | 1997-04-23 | 株式会社日立製作所 | Multi-processor system |
US4991579A (en) | 1987-11-10 | 1991-02-12 | Allen George S | Method and apparatus for providing related images over time of a portion of the anatomy using fiducial implants |
US5303148A (en) * | 1987-11-27 | 1994-04-12 | Picker International, Inc. | Voice actuated volume image controller and display controller |
US4815450A (en) | 1988-02-01 | 1989-03-28 | Patel Jayendra I | Endoscope having variable flexibility |
US5251127A (en) | 1988-02-01 | 1993-10-05 | Faro Medical Technologies Inc. | Computer-aided surgery apparatus |
EP0326768A3 (en) * | 1988-02-01 | 1991-01-23 | Faro Medical Technologies Inc. | Computer-aided surgery apparatus |
US4964062A (en) | 1988-02-16 | 1990-10-16 | Ubhayakar Shivadev K | Robotic arm systems |
US4930494A (en) | 1988-03-09 | 1990-06-05 | Olympus Optical Co., Ltd. | Apparatus for bending an insertion section of an endoscope using a shape memory alloy |
US4949717A (en) | 1988-03-17 | 1990-08-21 | Shaw Edward L | Surgical instrument with suture cutter |
US5019968A (en) | 1988-03-29 | 1991-05-28 | Yulan Wang | Three-dimensional vector processor |
US4989253A (en) | 1988-04-15 | 1991-01-29 | The Montefiore Hospital Association Of Western Pennsylvania | Voice activated microscope |
US4979949A (en) * | 1988-04-26 | 1990-12-25 | The Board Of Regents Of The University Of Washington | Robot-aided system for surgery |
US4979933A (en) | 1988-04-27 | 1990-12-25 | Kraft, Inc. | Reclosable bag |
US4883400A (en) | 1988-08-24 | 1989-11-28 | Martin Marietta Energy Systems, Inc. | Dual arm master controller for a bilateral servo-manipulator |
JPH079606B2 (en) | 1988-09-19 | 1995-02-01 | 豊田工機株式会社 | Robot controller |
CA2000818C (en) | 1988-10-19 | 1994-02-01 | Akira Tsuchihashi | Master slave manipulator system |
EP0369055A1 (en) * | 1988-11-17 | 1990-05-23 | Siemens Aktiengesellschaft | Noise signal compensation circuit |
US5300928A (en) * | 1988-12-27 | 1994-04-05 | Semiconductor Energy Laboratory Co., Ltd. | Liquid crystal color display device |
US5123095A (en) | 1989-01-17 | 1992-06-16 | Ergo Computing, Inc. | Integrated scalar and vector processors with vector addressing by the scalar processor |
US5098426A (en) | 1989-02-06 | 1992-03-24 | Phoenix Laser Systems, Inc. | Method and apparatus for precision laser surgery |
FR2642882B1 (en) | 1989-02-07 | 1991-08-02 | Ripoll Jean Louis | Speech processing apparatus |
US4965417A (en) | 1989-03-27 | 1990-10-23 | Massie Philip E | Foot-operated control |
JPH034831A (en) | 1989-06-01 | 1991-01-10 | Toshiba Corp | Endoscope device |
US5105357A (en) * | 1989-07-24 | 1992-04-14 | Eaton Corporation | Shift implementation control system and method for mechanical transmission system |
US4980626A (en) | 1989-08-10 | 1990-12-25 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Method and apparatus for positioning a robotic end effector |
US5271384A (en) | 1989-09-01 | 1993-12-21 | Mcewen James A | Powered surgical retractor |
US5201325A (en) | 1989-09-01 | 1993-04-13 | Andronic Devices Ltd. | Advanced surgical retractor |
US5182557A (en) | 1989-09-20 | 1993-01-26 | Semborg Recrob, Corp. | Motorized joystick |
US4930484A (en) | 1989-10-26 | 1990-06-05 | Binkley Steven M | Fuel and air mixture expanding and preheating system |
US5091656A (en) | 1989-10-27 | 1992-02-25 | Storz Instrument Company | Footswitch assembly with electrically engaged detents |
US5249121A (en) | 1989-10-27 | 1993-09-28 | American Cyanamid Company | Remote control console for surgical control system |
EP0647428A3 (en) | 1989-11-08 | 1995-07-12 | George S Allen | Interactive image-guided surgical system. |
DE4102196C2 (en) * | 1990-01-26 | 2002-08-01 | Olympus Optical Co | Imaging device for tracking an object |
JP2964518B2 (en) | 1990-01-30 | 1999-10-18 | 日本電気株式会社 | Voice control method |
US5175694A (en) | 1990-02-08 | 1992-12-29 | The United States Of America As Represented By The Secretary Of The Navy | Centroid target tracking system utilizing parallel processing of digital data patterns |
US5097829A (en) | 1990-03-19 | 1992-03-24 | Tony Quisenberry | Temperature controlled cooling system |
FR2660852A1 (en) * | 1990-04-17 | 1991-10-18 | Cheval Freres Sa | Laser beam dental instrument |
DK0455852T3 (en) * | 1990-05-09 | 1994-12-12 | Siemens Ag | Medical, especially dental equipment |
US5086401A (en) | 1990-05-11 | 1992-02-04 | International Business Machines Corporation | Image-directed robotic system for precise robotic surgery including redundant consistency checking |
JPH0771288B2 (en) | 1990-08-24 | 1995-07-31 | 神田通信工業株式会社 | Automatic view adjustment method and device |
US5131105A (en) | 1990-11-21 | 1992-07-21 | Diasonics, Inc. | Patient support table |
US5088401A (en) * | 1990-11-30 | 1992-02-18 | Kabushiki Kaisha Shinkawa | Marking method and apparatus |
US5145227A (en) | 1990-12-31 | 1992-09-08 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Electromagnetic attachment mechanism |
US5228429A (en) | 1991-01-14 | 1993-07-20 | Tadashi Hatano | Position measuring device for endoscope |
US5085741A (en) * | 1991-01-23 | 1992-02-04 | Phillips Petroleum Company | Extractive distillation of low boiling alkene/alkane mixtures |
US5217003A (en) | 1991-03-18 | 1993-06-08 | Wilk Peter J | Automated surgical system and apparatus |
US5166513A (en) | 1991-05-06 | 1992-11-24 | Coherent, Inc. | Dual actuation photoelectric foot switch |
US5313306A (en) * | 1991-05-13 | 1994-05-17 | Telerobotics International, Inc. | Omniview motionless camera endoscopy system |
JP3173042B2 (en) * | 1991-05-21 | 2001-06-04 | ソニー株式会社 | Robot numerical controller |
US5279309A (en) * | 1991-06-13 | 1994-01-18 | International Business Machines Corporation | Signaling device and method for monitoring positions in a surgical operation |
US5417210A (en) * | 1992-05-27 | 1995-05-23 | International Business Machines Corporation | System and method for augmentation of endoscopic surgery |
US5182641A (en) | 1991-06-17 | 1993-01-26 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Composite video and graphics display for camera viewing systems in robotics and teleoperation |
US5735290A (en) * | 1993-02-22 | 1998-04-07 | Heartport, Inc. | Methods and systems for performing thoracoscopic coronary bypass and other procedures |
US5184601A (en) | 1991-08-05 | 1993-02-09 | Putman John M | Endoscope stabilizer |
US5335313A (en) | 1991-12-03 | 1994-08-02 | Douglas Terry L | Voice-actuated, speaker-dependent control system for hospital bed |
US5230623A (en) | 1991-12-10 | 1993-07-27 | Radionics, Inc. | Operating pointer with interactive computer graphics |
US5289365A (en) | 1991-12-23 | 1994-02-22 | Donnelly Corporation | Modular network control system |
US5631973A (en) * | 1994-05-05 | 1997-05-20 | Sri International | Method for telemanipulation with telepresence |
US6963792B1 (en) * | 1992-01-21 | 2005-11-08 | Sri International | Surgical method |
US5345538A (en) * | 1992-01-27 | 1994-09-06 | Krishna Narayannan | Voice activated control apparatus |
US5282806A (en) | 1992-08-21 | 1994-02-01 | Habley Medical Technology Corporation | Endoscopic surgical instrument having a removable, rotatable, end effector assembly |
US5201743A (en) | 1992-05-05 | 1993-04-13 | Habley Medical Technology Corp. | Axially extendable endoscopic surgical instrument |
JP3199130B2 (en) * | 1992-03-31 | 2001-08-13 | パイオニア株式会社 | 3D coordinate input device |
US5221283A (en) | 1992-05-15 | 1993-06-22 | General Electric Company | Apparatus and method for stereotactic surgery |
DE4306466A1 (en) | 1992-05-18 | 1993-11-25 | Ulrich Dr Kurze | Method and device for the patient-appropriate positioning of a patient |
US5274862A (en) | 1992-05-18 | 1994-01-04 | Palmer Jr John M | Patient turning device and method for lateral traveling transfer system |
US5257999A (en) | 1992-06-04 | 1993-11-02 | Slanetz Jr Charles A | Self-oriented laparoscopic needle holder for curved needles |
US5372147A (en) | 1992-06-16 | 1994-12-13 | Origin Medsystems, Inc. | Peritoneal distension robotic arm |
GR930100244A (en) * | 1992-06-30 | 1994-02-28 | Ethicon Inc | Flexible endoscopic surgical port |
US5524180A (en) | 1992-08-10 | 1996-06-04 | Computer Motion, Inc. | Automated endoscope system for optimal positioning |
US5515478A (en) * | 1992-08-10 | 1996-05-07 | Computer Motion, Inc. | Automated endoscope system for optimal positioning |
US5657429A (en) * | 1992-08-10 | 1997-08-12 | Computer Motion, Inc. | Automated endoscope system for optimal positioning |
US5609560A (en) * | 1992-08-19 | 1997-03-11 | Olympus Optical Co., Ltd. | Medical operation device control system for controlling operation devices accessed respectively by ID codes |
US5397323A (en) * | 1992-10-30 | 1995-03-14 | International Business Machines Corporation | Remote center-of-motion robot for surgery |
US5304185A (en) * | 1992-11-04 | 1994-04-19 | Unisurge, Inc. | Needle holder |
US5788688A (en) | 1992-11-05 | 1998-08-04 | Bauer Laboratories, Inc. | Surgeon's command and control |
FI95427C (en) * | 1992-12-23 | 1996-01-25 | Instrumentarium Oy | data transmission system |
US5309717A (en) * | 1993-03-22 | 1994-05-10 | Minch Richard B | Rapid shape memory effect micro-actuators |
JP3477781B2 (en) * | 1993-03-23 | 2003-12-10 | セイコーエプソン株式会社 | IC card |
US5417701A (en) * | 1993-03-30 | 1995-05-23 | Holmed Corporation | Surgical instrument with magnetic needle holder |
ATE225964T1 (en) | 1993-03-31 | 2002-10-15 | Luma Corp | INFORMATION MANAGEMENT IN AN ENDOSCOPY SYSTEM |
US5410638A (en) * | 1993-05-03 | 1995-04-25 | Northwestern University | System for positioning a medical instrument within a biotic structure using a micromanipulator |
US5395369A (en) * | 1993-06-10 | 1995-03-07 | Symbiosis Corporation | Endoscopic bipolar electrocautery instruments |
US5382885A (en) * | 1993-08-09 | 1995-01-17 | The University Of British Columbia | Motion scaling tele-operating system with force feedback suitable for microsurgery |
EP0647931B1 (en) * | 1993-08-13 | 1999-03-10 | Sun Microsystems, Inc. | High speed method and apparatus for generating animation by means of a three-region frame buffer and associated region pointers |
US5566272A (en) | 1993-10-27 | 1996-10-15 | Lucent Technologies Inc. | Automatic speech recognition (ASR) processing using confidence measures |
US5876325A (en) * | 1993-11-02 | 1999-03-02 | Olympus Optical Co., Ltd. | Surgical manipulation system |
US5715548A (en) | 1994-01-25 | 1998-02-10 | Hill-Rom, Inc. | Chair bed |
US5511256A (en) | 1994-07-05 | 1996-04-30 | Capaldi; Guido | Patient lift mechanism |
US6463361B1 (en) | 1994-09-22 | 2002-10-08 | Computer Motion, Inc. | Speech interface for an automated endoscopic system |
US5737711A (en) * | 1994-11-09 | 1998-04-07 | Fuji Jukogyo Kabushiki Kaisha | Diagnosis system for motor vehicle |
US5437300A (en) * | 1994-11-14 | 1995-08-01 | R. W. Lyall & Company, Inc. | Apparatus for changing out gas meters |
US5530622A (en) * | 1994-12-23 | 1996-06-25 | National Semiconductor Corporation | Electronic assembly for connecting to an electronic system and method of manufacture thereof |
US5640953A (en) * | 1995-03-09 | 1997-06-24 | Siemens Medical Systems, Inc. | Portable patient monitor reconfiguration system |
US5882206A (en) * | 1995-03-29 | 1999-03-16 | Gillio; Robert G. | Virtual surgery system |
US5887121A (en) * | 1995-04-21 | 1999-03-23 | International Business Machines Corporation | Method of constrained Cartesian control of robotic mechanisms with active and passive joints |
US5544654A (en) * | 1995-06-06 | 1996-08-13 | Acuson Corporation | Voice control of a medical ultrasound scanning machine |
US5729659A (en) | 1995-06-06 | 1998-03-17 | Potter; Jerry L. | Method and apparatus for controlling a digital computer using oral input |
US5771511A (en) | 1995-08-04 | 1998-06-30 | Hill-Rom, Inc. | Communication network for a hospital bed |
JP3083465B2 (en) * | 1995-09-06 | 2000-09-04 | フクダ電子株式会社 | Patient information analysis management system and method |
US5860995A (en) * | 1995-09-22 | 1999-01-19 | Misener Medical Co. Inc. | Laparoscopic endoscopic surgical instrument |
US5970457A (en) | 1995-10-25 | 1999-10-19 | Johns Hopkins University | Voice command and control medical care system |
JPH09148341A (en) * | 1995-11-17 | 1997-06-06 | Stanley Electric Co Ltd | Heat treating method of II-VI compound semiconductor crystal |
US5855583A (en) * | 1996-02-20 | 1999-01-05 | Computer Motion, Inc. | Method and apparatus for performing minimally invasive cardiac procedures |
US5727569A (en) * | 1996-02-20 | 1998-03-17 | Cardiothoracic Systems, Inc. | Surgical devices for imposing a negative pressure to fix the position of cardiac tissue during surgery |
GB9603605D0 (en) * | 1996-02-21 | 1996-04-17 | Kodak Ltd | Improvements in or relating to photographic processing apparatus |
US5715823A (en) * | 1996-02-27 | 1998-02-10 | Atlantis Diagnostics International, L.L.C. | Ultrasonic diagnostic imaging system with universal access to diagnostic information and images |
US5809591A (en) | 1996-03-19 | 1998-09-22 | Lift Aid, Inc. | Patient lift mechanism |
JPH09262066A (en) * | 1996-03-27 | 1997-10-07 | Nagase & Co Ltd | Food product and medicinal composition for inhibiting hyperlipemia |
US5858934A (en) * | 1996-05-08 | 1999-01-12 | The Lubrizol Corporation | Enhanced biodegradable vegetable oil grease |
US5895461A (en) * | 1996-07-30 | 1999-04-20 | Telaric, Inc. | Method and system for automated data storage and retrieval with uniform addressing scheme |
US5897498A (en) | 1996-09-25 | 1999-04-27 | Atl Ultrasound, Inc. | Ultrasonic diagnostic imaging system with electronic message communications capability |
US5924074A (en) | 1996-09-27 | 1999-07-13 | Azron Incorporated | Electronic medical records system |
US5812978A (en) | 1996-12-09 | 1998-09-22 | Tracer Round Associates, Ltd. | Wheelchair voice control apparatus |
US6393431B1 (en) | 1997-04-04 | 2002-05-21 | Welch Allyn, Inc. | Compact imaging instrument system |
US5857967A (en) * | 1997-07-09 | 1999-01-12 | Hewlett-Packard Company | Universally accessible healthcare devices with on the fly generation of HTML files |
WO1999021165A1 (en) * | 1997-10-20 | 1999-04-29 | Computer Motion Inc. | General purpose distributed operating room control system |
EP0917859A1 (en) * | 1997-11-14 | 1999-05-26 | Medsys S.A. | Apparatus for operating surgical instruments |
US6224542B1 (en) | 1999-01-04 | 2001-05-01 | Stryker Corporation | Endoscopic camera system with non-mechanical zoom |
1999
- 1999-07-15 US US09/354,944 patent/US6496099B2/en not_active Expired - Lifetime

2000
- 2000-07-11 EP EP00305831A patent/EP1068837A1/en not_active Withdrawn
- 2000-07-13 CA CA002313996A patent/CA2313996C/en not_active Expired - Lifetime
- 2000-07-17 JP JP2000216451A patent/JP2001104336A/en active Pending

2002
- 2002-12-09 US US10/315,893 patent/US6943663B2/en not_active Expired - Lifetime

2005
- 2005-06-23 US US11/166,677 patent/US7259652B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
US6496099B2 (en) | 2002-12-17 |
US6943663B2 (en) | 2005-09-13 |
EP1068837A1 (en) | 2001-01-17 |
US20010040496A1 (en) | 2001-11-15 |
JP2001104336A (en) | 2001-04-17 |
US7259652B2 (en) | 2007-08-21 |
US20030197590A1 (en) | 2003-10-23 |
CA2313996A1 (en) | 2001-01-15 |
US20050242919A1 (en) | 2005-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2313996C (en) | General purpose distributed operating room control system | |
US6646541B1 (en) | General purpose distributed operating room control system | |
US7053752B2 (en) | General purpose distributed operating room control system | |
US6642836B1 (en) | General purpose distributed operating room control system | |
US6911916B1 (en) | Method and apparatus for accessing medical data over a network | |
EP1172064A2 (en) | Method and apparatus for accessing medical data over a network | |
EP1031137B1 (en) | General purpose distributed operating room control system | |
EP1676540B1 (en) | Apparatus for performing a voice-assisted orthopaedic surgical procedure | |
US5425128A (en) | Automatic management system for speech recognition processes | |
US11176945B2 (en) | Healthcare systems and methods using voice inputs | |
US5970457A (en) | Voice command and control medical care system | |
JP2007334301A (en) | Speech recognition system with user profile management component | |
US7588534B2 (en) | Endoscope system for operating medical device by voice | |
US20080294458A1 (en) | Device And Method For The Central Control Of Devices Used During An Operation | |
US8734160B2 (en) | Operating room educational television “OReduTV” | |
JP2006221583A (en) | Medical treatment support system | |
JP2008161706A (en) | General-purpose distribution type operation room control system | |
Roe et al. | A voice-controlled network for universal control of devices in the OR | |
JP2019101715A (en) | Medical information management device, medical information management method and program | |
Aguiar et al. | Integrated Interfacing System for Video Laparoscopy Procedures Based on Voice Recognition |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request | ||
MKEX | Expiry | Effective date: 20200713 |