US20160366328A1 - Medical image acquisition system and medical imaging device - Google Patents
- Publication number
- US20160366328A1
- Authority
- US
- United States
- Prior art keywords
- imaging
- signal
- unit
- camera head
- lens
- Prior art date
- Legal status
- Abandoned
Classifications
- H—ELECTRICITY > H04—ELECTRIC COMMUNICATION TECHNIQUE > H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION > H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules > H04N23/67—Focus control based on electronic image sensor signals
- H04N23/50—Constructional details > H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N23/60—Control of cameras or camera modules > H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices > H04N23/663—Remote control of cameras or camera parts for controlling interchangeable camera parts based on electronic image sensor signals
- H04N23/50—Constructional details > H04N23/555—Constructional details for picking-up images in sites inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes
- H04N5/23212
- H04N5/2254
Abstract
A medical image acquisition system includes an imaging device and an image processing device. The imaging device includes: an imaging unit configured to receive light and convert the light into an electric signal so as to generate an imaging signal; an optical unit including a focus mechanism that moves one or a plurality of lenses so as to adjust a focal point position, the optical unit being configured to form an optical image on the imaging unit; a memory configured to store therein unique information of the imaging device; and an auto focus controller configured to perform overall control of the imaging device. The image processing device includes an auto focus evaluation unit configured to perform focusing evaluation based on the imaging signal, and the auto focus controller controls driving of the focus mechanism by referring to the unique information in accordance with an evaluation result by the auto focus evaluation unit.
Description
- The present application claims priority to and incorporates by reference the entire contents of Japanese Patent Application No. 2015-119731 filed in Japan on Jun. 12, 2015.
- The present disclosure relates to a medical image acquisition system and a medical imaging device.
- In the medical field, medical image acquisition systems that image a subject using an imaging element so as to observe the subject have been known (for example, see Japanese Patent Application Laid-open No. 2006-25913).
- An endoscope system as disclosed in Japanese Patent Application Laid-open No. 2006-25913 is a medical image acquisition system and includes an imaging device having a camera head with the imaging element and a camera cord as a signal transmitter electrically connected to the camera head, and an image processing device that processes an imaging signal received from the camera cord so as to generate an image signal based on the imaging signal. The camera head as disclosed in Japanese Patent Application Laid-open No. 2006-25913 includes a focal point position adjusting mechanism that adjusts the focal point position.
- The focal point position adjusting mechanism includes a lens frame that holds one or a plurality of lenses and is movable in the optical axis direction, and a focus ring that is rotatable about an optical axis and inputs a movement amount of the lens frame based on a rotation amount thereof. A user moves the lens frame by rotating the focus ring so as to adjust the focal point position.
- When what is called manual focus, that is, adjusting the focal point position in accordance with a user operation, is performed with the above-mentioned focus ring, detailed operations such as minute adjustment of the focal point position may be needed. For example, the depth of field may become shallow as the pixel count of the imaging element is increased in order to provide a high-definition observation image. In such a case, manual adjustment of the focal point position is frequently required, resulting in a cumbersome focus operation and an increase in the time taken for adjusting the focal point position.
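The effect described above can be made concrete with the standard thin-lens depth-of-field approximation: a higher pixel count tightens the acceptable circle of confusion, which narrows the range that appears in focus. The sketch below is illustrative only; the focal length, f-number, and circle-of-confusion values are invented for the example and do not come from the application.

```python
def depth_of_field_mm(focal_mm, f_number, coc_mm, subject_mm):
    """Thin-lens depth-of-field approximation (all distances in mm)."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    if subject_mm >= hyperfocal:
        # Beyond the hyperfocal distance everything to infinity is in focus.
        return float("inf")
    near = subject_mm * (hyperfocal - focal_mm) / (hyperfocal + subject_mm - 2 * focal_mm)
    far = subject_mm * (hyperfocal - focal_mm) / (hyperfocal - subject_mm)
    return far - near

# Halving the circle of confusion (finer pixels) roughly halves the depth of field.
coarse = depth_of_field_mm(focal_mm=15, f_number=4, coc_mm=0.02, subject_mm=100)
fine = depth_of_field_mm(focal_mm=15, f_number=4, coc_mm=0.01, subject_mm=100)
```

With these example numbers, `fine` comes out to roughly half of `coarse`, which is why a higher-resolution sensor makes manual focusing noticeably more fiddly.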
- In order to perform the adjustment of the focal point position smoothly, auto focus (AF), which adjusts the focal point position automatically, can be employed. When camera heads of different model types can be mounted on an image processing device, incorporating the AF configuration requires detailed characteristic information specific to the individual camera head: the optical performance of the lens, the driving performance of the lens (lens frame), and the performance of the imaging element. If the image processing device itself were made to hold this characteristic information for every camera head that can be mounted on it, the image processing device would require a version upgrade, and maintenance, whenever the amount of information increases, a camera head of a new model type is released, or the camera head is version-upgraded. This increases the load on the user.
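One way to picture the head-specific data at issue: the conversion from a desired lens displacement into driver commands depends on constants that differ between camera-head models and even between individual units. The record and function below are purely hypothetical (all field names and numbers are invented for illustration); they only show why such data naturally belongs with the camera head rather than in the image processing device.

```python
# Hypothetical per-head calibration record of the kind described above.
AF_PERFORMANCE = {
    "um_per_step": 2.5,     # lens travel per driver step (varies per unit)
    "backlash_steps": 4,    # extra steps consumed when the drive direction reverses
}

def steps_for_move(delta_um, direction_reversed, perf=AF_PERFORMANCE):
    """Convert a requested lens displacement (micrometres) into driver steps,
    compensating the unit-specific backlash on a direction reversal."""
    steps = round(abs(delta_um) / perf["um_per_step"])
    if direction_reversed:
        steps += perf["backlash_steps"]
    return steps
```

If the image processing device had to ship tables like this for every head it might ever drive, each new head or firmware revision would force an update; keeping the record on the head side sidesteps that maintenance burden.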
- According to one aspect of the present disclosure, there is provided a medical image acquisition system including: an imaging device configured to image a subject so as to generate an imaging signal; and an image processing device electrically connected to the imaging device detachably and configured to process the received imaging signal so as to generate an image signal corresponding to the imaging signal. The imaging device includes: an imaging unit configured to receive light and convert the light into an electric signal so as to generate the imaging signal; an optical unit including a focus mechanism that moves one or a plurality of lenses so as to adjust a focal point position, the optical unit being configured to form an optical image on the imaging unit; a memory configured to store therein unique information of the imaging device; and an auto focus controller configured to perform overall control of the imaging device and control driving of the focus mechanism by referring to the memory. The image processing device includes an auto focus evaluation unit configured to perform focusing evaluation based on the imaging signal, and the auto focus controller controls driving of the focus mechanism by referring to the unique information in accordance with an evaluation result by the auto focus evaluation unit.
- According to another aspect of the present disclosure, there is provided a medical imaging device adapted to image a subject so as to generate an imaging signal, the medical imaging device including: an imaging unit configured to receive light and convert the light into an electric signal so as to generate the imaging signal; an optical unit including a focus mechanism that moves one or a plurality of lenses so as to adjust a focal point position, the optical unit being configured to form an optical image on the imaging unit; a memory configured to store therein unique information of the medical imaging device; and an auto focus controller configured to perform overall control of the medical imaging device and control driving of the focus mechanism by referring to the unique information in accordance with a focusing evaluation result of an image received from an external device.
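Taken together, the two aspects describe a closed loop: the image processing device scores each frame for focus quality, and the device-side controller moves the lens according to that score and its own unique information. The application does not specify the evaluation metric; the sketch below uses a common contrast measure (sum of squared horizontal pixel differences) purely as an illustration of how frames captured at different lens positions could be compared.

```python
def af_evaluation(frame):
    """Score one frame's sharpness as the sum of squared horizontal
    pixel differences (more local contrast suggests better focus)."""
    return sum((row[x + 1] - row[x]) ** 2
               for row in frame for x in range(len(row) - 1))

def select_focus_position(frames_by_lens_position):
    """Return the lens position whose captured frame scores highest."""
    return max(frames_by_lens_position,
               key=lambda pos: af_evaluation(frames_by_lens_position[pos]))

# A sharp frame (hard edges) should beat a soft, low-contrast one.
frames = {
    10: [[0, 255, 0, 255]],      # high-contrast frame captured at position 10
    20: [[100, 120, 110, 115]],  # low-contrast frame captured at position 20
}
best = select_focus_position(frames)   # → 10
```

In the claimed system this split is the point: the scoring half runs in the image processing device, while the selection result drives a focus mechanism whose unit-specific parameters only the imaging device knows.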
- FIG. 1 is a view illustrating the schematic configuration of an endoscope device according to a first embodiment of the disclosure;
- FIG. 2 is a block diagram illustrating the configurations of a camera head and an image processing device as illustrated in FIG. 1;
- FIG. 3 is a schematic plan view for explaining a focus mechanism of a lens unit in the first embodiment of the disclosure;
- FIG. 4 is a schematic plan view for explaining a focus mechanism of a lens unit according to a first modification of the first embodiment of the disclosure;
- FIG. 5 is a schematic plan view for explaining a focus mechanism of a lens unit according to a second modification of the first embodiment of the disclosure;
- FIG. 6 is a block diagram illustrating the configurations of a camera head and an image processing device according to a second embodiment of the disclosure;
- FIG. 7 is a block diagram illustrating the configurations of a camera head and an image processing device according to a third embodiment of the disclosure;
- FIG. 8 is a block diagram illustrating the configurations of a camera head and an image processing device according to a fourth embodiment of the disclosure;
- FIG. 9 is a block diagram illustrating the configurations of a camera head and an image processing device according to a fifth embodiment of the disclosure; and
- FIG. 10 is a block diagram illustrating the configurations of a camera head and an image processing device according to a sixth embodiment of the disclosure.
- Hereinafter, embodiments of the present disclosure will be described. In the embodiments, medical endoscope devices that image an inner portion of a subject such as a patient and display an image are described as examples of a medical image acquisition system including a medical imaging device according to the disclosure. The embodiments do not limit the disclosure. In the accompanying drawings, the same reference numerals and symbols denote the same components.
- FIG. 1 is a view illustrating the schematic configuration of an endoscope device 1 according to the first embodiment of the disclosure. The endoscope device 1 is a device that is used in the medical field in order to observe a subject in (a living body of) an observation target such as a person. The endoscope device 1 includes an endoscope (insertion portion) 8, an imaging device 2 (medical imaging device), a display device 3, an image processing device 4, and a light source device 5, as illustrated in FIG. 1. The imaging device 2 and the image processing device 4 configure a medical image acquisition system. Although an endoscope device using a rigid scope as the endoscope 8 is described in the first embodiment, the endoscope device is not limited thereto and may use a flexible scope (not illustrated) as the endoscope 8.
- An end of a light guide 6 is connected to the endoscope 8, and the light source device 5 supplies light for illuminating an inner portion of the living body to the other end of the light guide 6. One end of the light guide 6 is connected to the light source device 5 detachably and the other end thereof is connected to the insertion portion 8 detachably. The light guide 6 transfers the light supplied from the light source device 5 from one end to the other end and supplies the light to the endoscope 8.
- The imaging device 2 images a subject image from the endoscope 8 and outputs the imaging result. The imaging device 2 includes a transmission cable 7 as a signal transmitter and a camera head 9, as illustrated in FIG. 1. In the first embodiment, the transmission cable 7 and the camera head 9 configure the medical imaging device.
- The endoscope 8 is rigid, has an elongated shape, and is inserted into the living body. The endoscope 8 includes therein an optical system configured by one or a plurality of lenses and collecting the subject image. The endoscope 8 emits, from a tip thereof, the light supplied through the light guide 6 and irradiates the inner portion of the living body with the light. The optical system (a lens unit 91) in the endoscope 8 collects the light (subject image) with which the inner portion of the living body is irradiated.
- The camera head 9 is connected to a base end of the endoscope 8 detachably. The camera head 9 images the subject image collected by the endoscope 8 and outputs an imaging signal generated by the imaging under control by the image processing device 4. The detailed configuration of the camera head 9 will be described later.
- The transmission cable 7 has a first connector unit 7A at one end and is connected to the image processing device 4 detachably with the first connector unit 7A interposed therebetween. The transmission cable 7 has a second connector unit 7B at the other end and is connected to the camera head 9 detachably with the second connector unit 7B interposed therebetween. To be specific, the transmission cable 7 is a cable in which a plurality of electric wirings (not illustrated) are arranged at the inner side of an outer cover as an outermost layer. The electric wirings transmit the imaging signal output from the camera head 9 as well as a control signal, a synchronization signal, clocks, and electric power output from the image processing device 4 to the camera head 9. Although the camera head 9 and the second connector unit 7B are connected detachably in the embodiment, the configuration is not limited thereto and the camera head 9 and the second connector unit 7B may be integrally fixed and connected.
- The display device 3 displays an image generated by the image processing device 4 under control by the image processing device 4.
- The image processing device 4 processes the imaging signal input from the camera head 9 through the transmission cable 7 and outputs an image signal to the display device 3. In addition, the image processing device 4 performs overall control of the operations of the camera head 9 and the display device 3. The detailed configuration of the image processing device 4 will be described later.
- Next, the configurations of the
imaging device 2 and the image processing device 4 will be described. FIG. 2 is a block diagram illustrating the configurations of the imaging device 2 and the image processing device 4. It should be noted that FIG. 2 omits illustration of the connector (second connector unit 7B) capable of connecting the camera head 9 and the transmission cable 7 detachably.
- The configuration of the image processing device 4, the configuration of the first connector unit 7A, and the configuration of the camera head 9 will be described in this order below. A primary part of the disclosure is mainly described as the configuration of the image processing device 4. As illustrated in FIG. 2, the image processing device 4 includes a signal processor 41, an image generator 42, a communication module 43, an input unit 44, an image processing controller 45, and a memory 46. The image processing device 4 may include a power supply unit (not illustrated) or the like that generates a power supply voltage for driving the image processing device 4 and the camera head 9, supplies it to the individual parts of the image processing device 4, and supplies it to the camera head 9 through the transmission cable 7.
- The signal processor 41 performs signal processing such as noise removal and analog-to-digital (A/D) conversion, if necessary, on the imaging signal output from the camera head 9 and outputs the digitized imaging signal (pulse signal) to the image generator 42.
- The signal processor 41 generates synchronization signals and clocks for the imaging device 2 and the image processing device 4. The synchronization signal (for example, a synchronization signal instructing an imaging timing of the camera head 9) and the clock (for example, a clock for serial communication) for the imaging device 2 are sent to the imaging device 2 through a line (not illustrated). The imaging device 2 is driven based on the synchronization signal and the clock.
- The image generator 42 generates the image signal for display that the display device 3 displays, based on the imaging signal input from the signal processor 41. The image generator 42 executes predetermined signal processing on the imaging signal so as to generate the image signal for display including the subject image. The image processing includes various types of processing such as color correction, color enhancement, and contour enhancement. The image generator 42 outputs the generated image signal to the display device 3.
- The communication module 43 outputs signals from the image processing device 4, including a control signal transmitted from the image processing controller 45 (described later), to the imaging device 2. The communication module 43 also outputs signals from the imaging device 2 to the image processing device 4. That is to say, the communication module 43 is a relay device that collects the signals to be output to the imaging device 2 from the individual parts of the image processing device 4 by parallel-to-serial conversion or the like and outputs them, and allocates the signals input from the imaging device 2 by serial-to-parallel conversion or the like and outputs them to the corresponding parts of the image processing device 4.
- The input unit 44 is configured by a user interface such as a keyboard, a mouse, and a touch panel, and receives input of various types of information.
- The image processing controller 45 performs driving control of the individual constituent components including the image processing device 4 and the camera head 9, input/output control of information to the corresponding constituent components, and the like. The image processing controller 45 generates a control signal containing a result of AF operation processing (described later) by referring to communication information data (for example, communication format information) recorded in the memory 46, and transmits the generated control signal to the imaging device 2 (first connector unit 7A) through the communication module 43.
- The image processing controller 45 outputs the control signal to the camera head 9 through the transmission cable 7.
- The memory 46 is configured by a semiconductor memory such as a flash memory or a dynamic random access memory (DRAM) and records therein the communication information data (for example, communication format information). The memory 46 may also record therein various types of programs that the image processing controller 45 executes.
- The
signal processor 41 includes an AF processor 41a. The AF processor 41a outputs a predetermined AF evaluation value for each frame based on the input imaging signal of the frame.
- The image processing controller 45 includes an AF operation unit 45a. The AF operation unit 45a performs AF operation processing of selecting a frame or a focus lens position that is optimum as a focusing position based on the AF evaluation values of the respective frames from the AF processor 41a.
- The result of the AF operation processing is output to an AF controller 712 (described later) through the communication module 43.
- Although the AF processor 41a is provided in the signal processor 41 and the AF operation unit 45a is provided in the image processing controller 45 in the embodiment, the configuration is not limited thereto. Alternatively, both the AF processor 41a and the AF operation unit 45a may be provided together in the signal processor 41 or the image processing controller 45, or they may be provided as different devices.
- The
signal processor 41, the image generator 42, the communication module 43, and the image processing controller 45 as described above are made to operate by a general processor such as a central processing unit (CPU) having an internal memory (not illustrated) with programs recorded therein, or by dedicated processors such as various types of operation circuits executing specific functions, like an application specific integrated circuit (ASIC). They may also be configured by a field programmable gate array (FPGA) (not illustrated), one type of programmable integrated circuit. When they are configured by an FPGA, a memory storing configuration data may be provided, and the FPGA as the programmable integrated circuit may be configured by the configuration data read from the memory.
- Subsequently, a primary part of the disclosure is mainly described as the configuration of the
transmission cable 7. As illustrated in FIG. 2, the first connector unit 7A includes a communication module 711, the AF controller 712, and a memory 713.
- The communication module 711 outputs, to the AF controller 712, signals transmitted from the image processing device 4, such as the control signal containing the result of the AF operation processing, and signals transmitted from the camera head. The communication module 711 outputs signals transmitted from the AF controller 712, which contain an AF driving signal (described later), to the camera head 9 and the image processing device 4. That is to say, the communication module 711 is a relay device that collects the signals to be output to the camera head 9 and the image processing device 4 from the individual parts of the transmission cable 7 including the AF controller 712 by parallel-to-serial conversion or the like and outputs them, and allocates the signals input from the camera head 9 and the image processing device 4 by serial-to-parallel conversion or the like and outputs them to the corresponding parts of the transmission cable 7 including the AF controller 712.
- The AF controller 712 controls focus driving by a driving unit 93. The AF controller 712 generates an AF driving signal by referring to AF performance data (for example, reading timing and lens driving) 713a for AF control that is recorded in the memory 713. To be specific, the AF controller 712 generates the AF driving signal in accordance with the result of the AF operation processing that is received from the AF operation unit 45a of the image processing device 4 through the communication module 711. Furthermore, the AF controller 712 transmits the generated AF driving signal to the camera head 9 through the communication module 711 and the predetermined electric wiring included in the transmission cable 7.
- The memory 713 is configured by a semiconductor memory such as a flash memory or a dynamic random access memory (DRAM) and records therein various types of programs and the like that the AF controller 712 executes. The memory 713 stores therein the AF performance data 713a, related to the AF performance of the camera head 9, as unique information. The AF performance data 713a includes performance data related to AF driving, such as information on the movement distance of the lens between frames (inter-frame distance) for which imaging is performed in the AF processing, setting information of the driver of the driving unit 93, information on the lens movement amount for an input signal to a focus mechanism 900, and individual variation data of the driving unit 93 including a detector 93a and of the lens unit 91 including the focus mechanism 900.
- The communication module 711 and the AF controller 712 as described above are made to operate by a general processor such as a central processing unit (CPU) having an internal memory (not illustrated) with programs recorded therein, or by dedicated processors such as various types of operation circuits executing specific functions, like an application specific integrated circuit (ASIC). They may also be configured by a field programmable gate array (FPGA) (not illustrated), one type of programmable integrated circuit. When they are configured by an FPGA, a memory storing configuration data may be provided, and the FPGA as the programmable integrated circuit may be configured by the configuration data read from the memory.
- Although the AF controller 712 and the memory 713 are provided in the first connector unit 7A in the embodiment, the configuration is not limited thereto, and at least one of them may be provided in the second connector unit 7B or in another portion of the transmission cable 7.
- Then, a primary part of the disclosure is mainly described as the configuration of the camera head 9. As illustrated in
FIG. 2, the camera head 9 includes the lens unit 91, an imaging unit 92, the driving unit 93, a camera head controller 94, and a communication module 95.
- The lens unit 91 is configured by one or a plurality of lenses and forms the subject image collected by the insertion portion 8 on an imaging surface of an imaging element (not illustrated) forming the imaging unit 92. The one or plurality of lenses are configured to be movable along an optical axis. The lens unit 91 includes an optical zoom mechanism (not illustrated) that moves the one or plurality of lenses so as to change the angle of view, and the focus mechanism 900, which changes the focal point. The lens unit 91 may include, in addition to the optical zoom mechanism and the focus mechanism 900, a diaphragm mechanism and an optical filter (for example, a filter for cutting infrared light) capable of being detachably inserted on the optical axis.
- The imaging unit 92 images the subject under control by the camera head controller 94. The imaging unit 92 is configured by a sensor chip provided by integrally forming an imaging element (not illustrated) such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) that receives the subject image formed by the lens unit 91 and converts it into an electric signal. In the case of the CCD, for example, a signal processor (not illustrated) that performs signal processing (A/D conversion or the like) on the electric signal (analog signal) from the imaging element and outputs the imaging signal is mounted on the sensor chip or the like. In the case of the CMOS, for example, a signal processor that performs signal processing (A/D conversion or the like) on the electric signal (analog signal) converted from the light and outputs the imaging signal is included in the imaging element. The imaging unit 92 converts the generated imaging signal into an imaging signal in accordance with a predetermined transmission system and outputs it to the image processing device 4 without passing through the communication module 95. In the first embodiment, the imaging unit 92 outputs RAW data, for example.
- The driving unit 93 has the driver that causes the optical zoom mechanism and the focus mechanism 900 to operate so as to change the angle of view and the focal point position of the lens unit 91 under control by the AF controller 712. The driving unit 93 includes the detector 93a, which receives a detection signal of a lens position (reference position) in the lens unit 91 and outputs it to the camera head controller 94.
- The camera head controller 94 controls the operations of the entire camera head 9 in accordance with the driving signal input from the first connector unit 7A through the transmission cable 7, an instruction signal output from an operation unit such as a switch provided in an exposed manner on the outer surface of the camera head 9 when a user operates the operation unit, and the like. The camera head controller 94 outputs information related to the current state of the camera head 9 to the image processing device 4 through the transmission cable 7.
- The communication module 95 outputs the signals transmitted from the transmission cable 7, which contain the AF driving signal, and the signals transmitted from the image processing device 4 to the corresponding parts in the camera head 9, such as the camera head controller 94. The communication module 95 converts the information related to the current state of the camera head 9, and the like, into a signal format in accordance with the predetermined transmission system and outputs the converted signal to the transmission cable 7 and to the image processing device 4 through the transmission cable 7. That is to say, the communication module 95 is a relay device that allocates the signals input from the image processing device 4 and the transmission cable 7 by serial-to-parallel conversion or the like and outputs them to the corresponding parts of the camera head 9, and collects the signals to be output to the image processing device 4 and the transmission cable 7 from the individual parts of the camera head 9 by parallel-to-serial conversion or the like and outputs them.
- The driving unit 93, the camera head controller 94, and the communication module 95 as described above are made to operate by a general processor such as a central processing unit (CPU) having an internal memory (not illustrated) with programs recorded therein, or by dedicated processors such as various types of operation circuits executing specific functions, like an application specific integrated circuit (ASIC). They may also be configured by a field programmable gate array (FPGA) (not illustrated), one type of programmable integrated circuit. When they are configured by an FPGA, a memory storing configuration data may be provided, and the FPGA as the programmable integrated circuit may be configured by the configuration data read from the memory.
- A signal processor that performs signal processing on the imaging signal generated by the imaging unit 92 may be configured in the camera head 9 or the transmission cable 7. An imaging clock for driving the imaging unit 92 and a driving clock for driving the driving unit 93 may be generated based on a reference clock generated by an oscillator provided in the camera head 9 and may be output to the imaging unit 92 and the driving unit 93, respectively. Furthermore, timing signals for various types of processing in the imaging unit 92, the driving unit 93, and the camera head controller 94 may be generated based on the synchronization signals input from the image processing device 4 and the first connector unit 7A through the transmission cable 7 and may be output to the imaging unit 92, the driving unit 93, and the camera head controller 94, respectively.
- The focus mechanism of the
lens unit 91 will be described with reference toFIG. 3 .FIG. 3 is a schematic plan view for explaining the focus mechanism of the lens unit in the first embodiment. Thefocus mechanism 900 as illustrated inFIG. 3 includes alens group 911 formed by a plurality of lenses (lenses 911A to 911C), afirst lens frame 912A, asecond lens frame 912B, a first supportingshaft 913A, a second supportingshaft 913B, arotating shaft 914, a motor M, and alens position detector 915. - The
lens group 911 is held by the lens frames (thefirst lens frame 912A and thesecond lens frame 912B: movable optical members), and is provided so as to be movable along the axial direction of therotating shaft 914. In the first embodiment, thefirst lens frame 912A holding thelens 911A and thesecond lens frame 912B holding thelenses lenses 911A to 911C to move in the optical axis direction. The lens group in thefocus mechanism 900 may be formed by one lens or two or equal to or more than four lenses instead of the lens group formed by three lenses as illustrated inFIG. 3 . - The
first lens frame 912A holds thelens 911A. Thefirst lens frame 912A includes a transfer unit having anut 9120 screwed with therotating shaft 914 and converting rotating force of therotating shaft 914 into driving force in the optical axis direction and atransfer unit 9121 transferring the driving force provided by the conversion by thenut 9120 to thefirst lens frame 912A. The configuration of the lens frame is not limited thereto as long as the lens frame holds the lens and is movable in the optical axis direction. - The
second lens frame 912B holds the lenses 911B and 911C. The second lens frame 912B includes a transfer unit having the nut 9120 screwed with the rotating shaft 914 and converting the rotating force of the rotating shaft 914 into driving force in the optical axis direction, and the transfer unit 9121 transferring the driving force provided by the conversion by the nut 9120 to the second lens frame 912B. - The first supporting
shaft 913A and the second supporting shaft 913B extend in the optical axis direction. The first supporting shaft 913A and the second supporting shaft 913B hold the first lens frame 912A and the second lens frame 912B such that the individual lenses of the lens group 911 do not incline with respect to the optical axis and the individual lenses of the lens group 911 (lens frames) are movable in the optical axis direction. A through-hole 913 a through which the transfer units 9121 are inserted is formed in the second supporting shaft 913B. - The rotating shaft 914 is connected to the motor M and rotates about its lengthwise axis in accordance with the rotating force from the motor M. For example, a spiral groove is formed in the rotating shaft 914 and the nuts 9120 are engaged with the groove so as to convert the rotation of the rotating shaft 914 into driving force in the axial direction. - In the
focus mechanism 900, rotation of the motor M causes the rotating shaft 914 to rotate under control by the driving unit 93. The rotation of the rotating shaft 914 causes the first lens frame 912A and the second lens frame 912B to move along the axial direction through the first transfer unit 9121 and the second transfer unit 9121, respectively. With this, the lenses 911A to 911C held by the corresponding lens frames may be moved in the axial direction. - The lens position detector 915 detects the distances of the first lens frame 912A and the second lens frame 912B from the reference positions. The lens position detector 915 emits infrared rays, for example, and receives the light returned from the lens frames. Then, the lens position detector 915 outputs, to the detector 93 a, detection signals (light detection signals) related to the positions (distances) of the first lens frame 912A and the second lens frame 912B relative to the reference positions. The lens position detector 915 may use a photo interrupter or the like instead of the distance measurement with infrared rays as described above. - Subsequently, the AF processing by the
endoscope device 1 is described with reference back to FIG. 2. When the AF processor 41 a receives input of imaging signals of a plurality of frames, it outputs AF evaluation values for the respective frames. Thereafter, the AF operation unit 45 a provided in the image processing controller 45 selects the frame that is optimum as the focusing position based on the AF evaluation values and generates optimum frame information (focusing evaluation) as information of the frame optimum for focusing. Then, the AF operation unit 45 a outputs an AF control signal containing the optimum frame information to the AF controller 712 through the communication modules 43 and 711. The AF processor 41 a and the AF operation unit 45 a constitute an AF evaluation unit. The AF control signal may contain information of the lens movement direction (direction toward or away from the subject). - The above-mentioned selection of the frame may be made using a well-known AF method such as contrast AF, phase difference AF, or AF using a space recognition technique. The AF processor 41 a outputs well-known AF evaluation values in accordance with the employed AF method, such as contrast values for the respective frames, and the AF operation unit 45 a selects a frame based on those AF evaluation values, such as the frame having the largest contrast value. - When the
AF controller 712 receives the AF control signal, it generates an AF driving signal for moving the lens group 911 (the first lens frame 912A and the second lens frame 912B) to the positions corresponding to the optimum frame information, with the movement direction and the movement distance (for example, the movement distance from the current positions to the positions corresponding to the optimum frame information) determined by referring to the performance data 713 a. Then, the AF controller 712 outputs the generated AF driving signal to the camera head controller 94 through the communication modules 711 and 95. The camera head controller 94 controls the driving unit 93 based on the received driving signal so as to move the lens group 911 (the first lens frame 912A and the second lens frame 912B). In this case, the driving unit 93 causes the rotating shaft 914 to rotate in accordance with the optimum frame information from the current position while checking the detection result by the detector 93 a so as to move the lens group 911 (the first lens frame 912A and the second lens frame 912B). - With the above-mentioned first embodiment, the
image processing device 4 outputs, to the imaging device 2, the information of the frame optimum for focusing in the imaging signal captured by the camera head 9, and this information does not depend on the individual difference or model type difference of the camera head 9. The imaging device 2 performs the focus driving based on the information of the frame optimum for focusing from the image processing device 4 with reference to the pieces of information on the individual difference and the model type difference of the camera head 9 that are related to AF in particular. The movement amount of the lenses is therefore set in accordance with the performance of the connected imaging device 2 even when a different imaging device 2 is connected to the image processing device 4, thereby performing the AF processing in view of the performance of each imaging device 2. The first embodiment may reduce burden on the user regardless of the characteristics of the camera head. - Furthermore, in the above-mentioned first embodiment, the user grips the camera head 9 of the imaging device 2 in order to image a desired observation site while adjusting the position of the camera head 9 relative to the subject. When the user grips the camera head 9, heat tends to accumulate in the camera head 9. When the temperature inside the camera head 9 rises, the quality of the output imaging signal may deteriorate as the temperature of the imaging element increases. The camera head 9 that is gripped is desired to be as small and light as possible, but an increase in the number of built-in components may inhibit the reduction in size and weight. It is preferable that the built-in components of the camera head 9 be minimized and, in particular, that no heat generator such as a CPU be provided in the camera head 9 if circumstances allow. Increase of the camera head in size and weight and accumulation of heat may be prevented by providing the optical system (lens unit 91) and the imaging element (imaging unit 92) as the minimum necessary components in the camera head 9 and providing the other components in the transmission cable 7 if circumstances allow. For example, in order to transmit the signals between the camera head 9 and the image processing device 4 efficiently and reduce the number of signal lines, a parallel-to-serial element or a serial-to-parallel element for converting a parallel signal from the camera head 9 into a serial signal or converting a serial signal to the camera head 9 into a parallel signal may be provided in the camera head 9. In the first embodiment, a processor element for AF control, in particular, is preferably provided in the transmission cable 7. Although the memory 713 is provided in the transmission cable 7, it may be provided in the camera head 9 when it is reduced in size. Furthermore, when the AF controller 712 generates less heat and is reduced in size, the AF controller 712 may be provided in the camera head 9. - With the above-mentioned first embodiment, the
first connector unit 7A on the side away from the camera head 9 includes the AF controller 712. This configuration enables the CPU and other components that generate heat when driven to be placed as far from the camera head 9 as possible, thereby reducing the influence of heat on the camera head 9. - Subsequently, a first modification of the first embodiment of the disclosure will be described.
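Before turning to the modification, the contrast-AF evaluation and frame selection described above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the function names and the sum-of-squared-differences contrast metric are assumptions standing in for the AF processor 41 a and the AF operation unit 45 a.

```python
def af_evaluation(frame):
    """Contrast evaluation value for one frame: the sum of squared
    horizontal pixel differences (a common contrast-AF figure of merit).
    A sharper frame yields a larger value."""
    total = 0.0
    for row in frame:
        for a, b in zip(row, row[1:]):
            total += (b - a) ** 2
    return total

def select_optimum_frame(frames):
    """Return the index of the frame with the largest contrast value,
    i.e. the frame reported as optimum for the focusing position."""
    values = [af_evaluation(f) for f in frames]
    return max(range(len(values)), key=values.__getitem__)
```

In use, the selected index would be packaged into the optimum frame information carried by the AF control signal; phase-difference AF would replace `af_evaluation` with a different evaluation value.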
FIG. 4 is a schematic plan view for explaining a focus mechanism of a lens unit according to a first modification of the first embodiment of the disclosure. Although the lens position detector 915 measures the distances of the lens frames from the reference positions based on the optical signals acquired with the light (infrared rays) in the above-mentioned first embodiment, the positions of the lens frames are detected using magnets in the first modification. - A
focus mechanism 910 in the first modification includes the lens group 911 formed by the plurality of lenses (lenses 911A to 911C), the first lens frame 912A, the second lens frame 912B, the first supporting shaft 913A, the second supporting shaft 913B, and the rotating shaft 914 as described above, and a lens position detector 916. - The lens position detector 916 detects the positions of the first lens frame 912A and the second lens frame 912B. To be specific, the lens position detector 916 includes a first permanent magnet 916A, a second permanent magnet 916B, and a Hall element holding unit 916C. - The first permanent magnet 916A is provided in the first lens frame 912A. The second permanent magnet 916B is provided in the second lens frame 912B. - The Hall element holding unit 916C extends in parallel with the second supporting shaft 913B and has a plurality of Hall elements 9160 arranged along the extension direction. The Hall elements 9160 detect magnetic fields using the Hall effect and convert the detected magnetic fields (magnetism) into electric signals. The individual Hall elements 9160 output the electric signals provided by the conversion as detection signals to the detector 93 a. - When the
detector 93 a receives the detection signals, it identifies the Hall elements 9160 having the largest voltage values and takes the positions of those Hall elements 9160 as the positions of the lens frames. To be specific, the detector 93 a identifies two Hall elements: the Hall element corresponding to the position of the first permanent magnet 916A and the Hall element corresponding to the position of the second permanent magnet 916B. Thus, the current positions of the lens frames may be detected. - Although the first permanent magnet 916A and the second permanent magnet 916B are provided in the respective lens frames in the above-described first modification, only one of the permanent magnets may be provided so as to detect the position of the lens frame on which that permanent magnet is arranged. - Subsequently, a second modification of the first embodiment of the disclosure will be described.
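The detection step of the first modification — finding the Hall elements with the largest readings — can be sketched as below. The helper name, element count, and voltage values are hypothetical; only the argmax-over-voltages idea comes from the description above.

```python
def detect_lens_frame_positions(hall_voltages):
    """Given one voltage per Hall element 9160, return the indices of
    the two largest readings, taken as the positions of the two lens
    frames (lower index, i.e. nearer Hall element, first)."""
    ranked = sorted(range(len(hall_voltages)),
                    key=hall_voltages.__getitem__, reverse=True)
    return tuple(sorted(ranked[:2]))
```

With a single magnet, as in the variant just mentioned, only `ranked[0]` would be used.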
FIG. 5 is a schematic plan view for explaining a focus mechanism of a lens unit according to a second modification of the first embodiment of the disclosure. Although the lens position detector 915 measures the distances of the lens frames from the reference positions based on the optical signals acquired with the light (infrared rays) in the above-mentioned first embodiment, the positions of the lens frames are detected by detecting a rotation amount of the motor in the second modification. - A focus mechanism 920 in the second modification includes the lens group 911 formed by the plurality of lenses (lenses 911A to 911C), the first lens frame 912A, the second lens frame 912B, the first supporting shaft 913A, the second supporting shaft 913B, and the rotating shaft 914 as described above, and a lens position detector 917. - The
lens position detector 917 detects the rotation amount of the motor M. To be specific, the lens position detector 917 is configured by a rotary encoder, for example. The lens position detector 917 outputs the detected rotation amount (displacement with rotation) of the motor M as a detection signal to the detector 93 a. - When the detector 93 a receives the rotation amount (displacement with rotation) of the motor M from the detection signal, it converts the rotation amount into a movement amount of the lens frame and detects the converted movement amount as the position of the lens frame. In this case, the memory 713 or the like stores therein the previous position of the lens frame, and the position of the lens frame is determined by adding the movement amount to the previous position. With this determination, the current position of the lens frame may be detected. It should be noted that, as the position of the lens frame, the respective positions of the first lens frame 912A and the second lens frame 912B may be determined, or only one of the positions may be detected. Information (for example, a conversion coefficient) related to the conversion of the rotation amount (displacement with rotation) of the motor M into the movement amount of the lens frame is stored in advance as part of the performance data 713 a. - Next, a second embodiment of the disclosure will be described.
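The bookkeeping of the second modification just described — converting an encoder rotation amount into lens-frame travel and adding it to the previously stored position — can be sketched as follows. The counts-per-revolution and screw-lead values are assumptions; in the device they would come from the conversion coefficient stored as the performance data 713 a.

```python
def rotation_to_movement(counts, counts_per_rev, lead_mm):
    """Convert an encoder count delta into lens-frame travel along the
    screw: (counts / counts_per_rev) revolutions times the thread lead
    per revolution."""
    return counts / counts_per_rev * lead_mm

class LensPositionTracker:
    """Tracks the current frame position by adding each converted
    movement amount to the previously stored position, as the detector
    93 a is described as doing."""
    def __init__(self, initial_mm=0.0):
        self.position_mm = initial_mm

    def update(self, counts, counts_per_rev=360, lead_mm=0.5):
        self.position_mm += rotation_to_movement(counts, counts_per_rev, lead_mm)
        return self.position_mm
```

A full forward rotation (360 counts here) advances the frame by one thread lead; a reverse half rotation retracts it by half a lead.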
FIG. 6 is a block diagram illustrating the configurations of a camera head and an image processing device in the second embodiment. The same reference numerals and symbols denote the same configurations as the above-mentioned configurations. Although the imaging signal is transmitted as an electric signal between the imaging unit 92 and the signal processor 41 through the transmission cable 7 with the electric wirings arranged therein in the above-mentioned first embodiment, the imaging signal is transmitted as an optical signal in the second embodiment. - An endoscope device 1 a in the second embodiment includes the endoscope 8, the imaging device 2, and the display device 3 as described above, and an image processing device 4 a. In the second embodiment, the imaging device 2 includes a camera head 9 a instead of the camera head 9. - As illustrated in FIG. 6, the image processing device 4 a includes the signal processor 41, the image generator 42, the communication module 43, the input unit 44, the image processing controller 45, and the memory 46 as described above, and an optical-to-electrical (O/E) converter 47. - As illustrated in FIG. 6, the camera head 9 a includes the lens unit 91, the imaging unit 92, the driving unit 93, the camera head controller 94, and the communication module 95 as described above, and an electrical-to-optical (E/O) converter 96. - The E/O converter 96 performs electro-optical conversion processing on the imaging signal as the electric signal input from the imaging unit 92 so as to convert it into an optical signal, and outputs the imaging signal as the optical signal to the image processing device 4 a. - The O/E converter 47 receives the optical signal from the camera head 9 a (E/O converter 96), performs photoelectric conversion processing on the received optical signal so as to convert it into an electric signal, and outputs the converted imaging signal as the electric signal to the signal processor 41. After the imaging signal is input to the signal processor 41, the image generator 42 generates the image signal as described above. - The second embodiment may provide the effects provided in the above-mentioned first embodiment. In addition, transmission of the imaging signal between the
camera head 9 a and the image processing device 4 a is performed with an optical signal, so that even when the transmission path is long like the transmission cable 7, more information may be transmitted at a time and at higher speed, with less attenuation than an electric signal. - Next, a third embodiment of the disclosure will be described.
FIG. 7 is a block diagram illustrating the configurations of a camera head and an image processing device in the third embodiment. The same reference numerals and symbols denote the same configurations as the above-mentioned configurations. Although the lens frames are moved based on the performance data 713 a in the above-mentioned first embodiment, the lens frames are moved after temperature information is additionally acquired in the third embodiment. - An endoscope device 1 b in the third embodiment includes the endoscope 8, the imaging device 2, the display device 3, and the image processing device 4 as described above. In the third embodiment, the imaging device 2 includes a first connector unit 7C and a camera head 9 b instead of the first connector unit 7A and the camera head 9. - As illustrated in FIG. 7, the camera head 9 b includes the lens unit 91, the imaging unit 92, the driving unit 93, the camera head controller 94, and the communication module 95 as described above, and a temperature sensor 97. - The
temperature sensor 97 is provided at a front end of the camera head near the lens unit 91. To be specific, the temperature sensor 97 is provided near the lens frames (the first lens frame 912A and the second lens frame 912B) of the focus mechanism 900, for example. The temperature sensor 97 is configured by a thermocouple, a resistance thermometer, a thermistor, or the like, and measures the temperature near the lens frames (the first lens frame 912A and the second lens frame 912B). The temperature sensor 97 outputs a detection signal containing the temperature measurement result to the detector 93 a. - As illustrated in
FIG. 7, the first connector unit 7C includes the communication module 711, the AF controller 712, and the memory 713. The memory 713 stores therein a correction amount-temperature table 713 b in addition to the above-mentioned performance data 713 a. - The correction amount-temperature table 713 b is a table indicating a relation between the temperature detected by the
temperature sensor 97 and a correction amount of the lens frame movement amount. To be specific, for example, the lens frames expand under a high-temperature environment, whereas the viscosity of the lubricant for sliding increases under a low-temperature environment. Under these environments, the friction force in sliding increases, and the sliding characteristics along the first supporting shaft 913A and the second supporting shaft 913B change. The correction amount-temperature table 713 b is a table for correcting the inter-frame distance stored as the performance data 713 a in accordance with the change in the movement amount caused by thermal expansion of the lens frames, for example, or for correcting the driving force or driving speed of the lens frames in accordance with the change in the viscosity of the lubricant for sliding. For example, the correction amount-temperature table 713 b is data provided by performing linear interpolation, quadratic curve interpolation, or the like on data measured at 5-degree intervals. - The AF controller 712 determines settings related to the driving control of the lens frames based on the detection result by the temperature sensor 97, the correction amount-temperature table 713 b, and the optimum frame information. With these settings, the driving of the lens frames is controlled in consideration of movement conditions that change with the temperature, so that the driving of the lens frames is controlled more precisely in accordance with the characteristics of the imaging device 2. - The third embodiment may provide the effects provided in the above-mentioned first embodiment. In addition, the imaging device 2 controls the driving of the lens frames based on the detection result by the temperature sensor 97, so that the lenses (lens frames) may be moved with higher accuracy. In particular, the imaging unit 92 (imaging element) and the communication module 95 tend to generate heat when energized. When the heat is transferred to the lens frames and the like, the movement amount, the driving force, the driving speed, and the like change in some cases. The lenses (lens frames) may be moved with high accuracy by changing the driving control in consideration of the temperature as in the third embodiment. - Next, a fourth embodiment of the disclosure will be described.
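The correction amount-temperature lookup of the third embodiment can be sketched as a linear interpolation over points measured at 5-degree intervals, as described above. The table values below are illustrative only; real values would be measured for each camera head model and stored with the performance data.

```python
def correction_amount(temp_c, table):
    """Linearly interpolate a movement-amount correction from a table
    of (temperature_C, correction) points, clamping outside the
    measured range."""
    pts = sorted(table)
    if temp_c <= pts[0][0]:
        return pts[0][1]
    if temp_c >= pts[-1][0]:
        return pts[-1][1]
    for (t0, c0), (t1, c1) in zip(pts, pts[1:]):
        if t0 <= temp_c <= t1:
            # Straight-line interpolation between the two bracketing points.
            return c0 + (c1 - c0) * (temp_c - t0) / (t1 - t0)
```

The AF controller would add the interpolated correction to the movement amount derived from the performance data before issuing the driving signal; quadratic-curve interpolation would replace the straight-line step.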
FIG. 8 is a block diagram illustrating the configurations of a camera head and an image processing device in the fourth embodiment. The same reference numerals and symbols denote the same configurations as the above-mentioned configurations. In the fourth embodiment, in addition to the configuration of the above-mentioned first embodiment, the camera head further includes an input unit 98 receiving input of a lens driving instruction. - An
endoscope device 1 c in the fourth embodiment includes the endoscope 8, the imaging device 2, the display device 3, and the image processing device 4 as described above. In the fourth embodiment, the imaging device 2 includes a first connector unit 7D and a camera head 9 c instead of the first connector unit 7A and the camera head 9. - As illustrated in FIG. 8, the camera head 9 c includes the lens unit 91, the imaging unit 92, the driving unit 93, the camera head controller 94, and the communication module 95 as described above, and the input unit 98. - The
input unit 98 is configured by a user interface such as a key, a dial-type input unit (including a sensor detecting rotation of the dial), or a lever, and receives input of information related to the movement of the lens group 911 (the first lens frame 912A and the second lens frame 912B) of the focus mechanism. The information related to the movement includes the movement amount, the movement direction, the movement speed, and the like of the lens group 911. For example, when the dial-type input unit is used, signals instructing the movement direction, the movement amount, and the movement speed are input in accordance with the rotation direction, the rotation amount, and the rotation speed of the dial, respectively. The input unit 98 further receives input of an instruction signal to execute the AF processing by a key operation. - As illustrated in
FIG. 8, the first connector unit 7D includes the communication module 711, the AF controller 712, and the memory 713. The memory 713 stores therein lens driving information 713 c in addition to the above-mentioned performance data 713 a. - The
lens driving information 713 c is driving information of the lenses (lens frames) in accordance with the instruction signals received by the input unit 98. To be specific, when the instruction signals are input using the above-mentioned dial-type input unit, the lens driving information 713 c contains information correlating the rotation direction with one of the directions along the optical axis (the advancement or retreat direction with respect to the imaging unit 92), information correlating the rotation amount with the movement amount of the lenses (lens frames) in the optical axis direction, and information correlating the rotation speed with the movement speed of the lenses (lens frames). It should be noted that the rotation speed and the movement amount may instead be correlated with each other. - When the instruction signals are input through the input unit 98, the AF controller 712 generates a driving signal related to the movement of the lenses (lens frames) in accordance with the instruction signals by referring to the lens driving information 713 c and outputs it to the camera head controller 94. The camera head controller 94 outputs the driving signal to the driving unit 93, and the lenses (lens frames) are moved in accordance with the driving signal under control by the driving unit 93. - In the fourth embodiment, the user inputs the instruction signals through the input unit 98 so as to move the lenses manually (manual focusing (MF)). With this, the MF processing in the fourth embodiment and the AF processing as described in the first embodiment may be used in combination. The MF processing in the fourth embodiment may be processing in which the user adjusts the focal point position while checking an image without performing the AF processing, or processing in which the user finely adjusts the focal point position while checking the image after the AF processing. The MF processing and the AF processing may be switched by a key operation on the input unit 98. - The fourth embodiment may provide the effects provided in the above-mentioned first embodiment. In addition, the lenses are moved manually (MF) by input of the instruction signals through the
input unit 98 by the user, so that the focal point position may be adjusted by moving the lenses (lens frames) with a higher degree of freedom. - Next, a fifth embodiment of the disclosure will be described.
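The dial mapping of the fourth embodiment — rotation direction to movement direction, rotation amount to movement amount, rotation speed to movement speed — can be sketched as below. The scale factors are assumptions standing in for the correlations stored as the lens driving information 713 c.

```python
def dial_to_drive_command(rotation_deg, rotation_speed_dps,
                          mm_per_deg=0.01, speed_scale=0.01):
    """Map a dial rotation to a lens drive command: the sign of the
    rotation gives the direction along the optical axis, its magnitude
    gives the movement amount, and the rotation speed gives the
    movement speed."""
    direction = "advance" if rotation_deg >= 0 else "retreat"
    return {
        "direction": direction,
        "movement_mm": abs(rotation_deg) * mm_per_deg,
        "speed_mm_per_s": rotation_speed_dps * speed_scale,
    }
```

The variant noted above, correlating rotation speed with movement amount instead, would simply swap which input feeds the `movement_mm` field.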
FIG. 9 is a block diagram illustrating the configurations of a camera head and an image processing device in the fifth embodiment. The same reference numerals and symbols denote the same configurations as the above-mentioned configurations. In the fifth embodiment, the lens driving control by temperature detection and the MF processing by the user may both be performed by combining the above-mentioned configurations of the third and fourth embodiments. - An
endoscope device 1 d in the fifth embodiment includes the endoscope 8, the imaging device 2, the display device 3, and the image processing device 4 as described above. In the fifth embodiment, the imaging device 2 includes a first connector unit 7E and a camera head 9 d instead of the first connector unit 7A and the camera head 9. - As illustrated in FIG. 9, the camera head 9 d includes the lens unit 91, the imaging unit 92, the driving unit 93, the camera head controller 94, the communication module 95, the temperature sensor 97, and the input unit 98 as described above. - As illustrated in FIG. 9, the first connector unit 7E includes the communication module 711, the AF controller 712, and the memory 713. The memory 713 stores therein the performance data 713 a, the correction amount-temperature table 713 b, and the lens driving information 713 c as described above. - The
AF controller 712 determines the settings related to the driving control of the lens frames based on the detection result by the temperature sensor 97, the correction amount-temperature table 713 b, and the optimum frame information as described above. When the instruction signals are input through the input unit 98, the AF controller 712 generates a driving signal related to the movement of the lenses (lens frames) in accordance with the instruction signals by referring to the lens driving information 713 c and outputs it to the camera head controller 94. - The fifth embodiment may move the lenses (lens frames) with higher accuracy and adjust the focal point position by moving the lenses (lens frames) with a higher degree of freedom as in the above-mentioned third and fourth embodiments.
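The combined control of the fifth embodiment can be sketched as a single dispatch: in AF mode the target is the optimum-frame position plus the temperature correction, while in MF mode the dial step nudges the current position. This is a hypothetical sketch; the function name, mode labels, and millimeter units are assumptions.

```python
def resolve_drive_target(mode, af_target_mm, temp_correction_mm,
                         current_mm, manual_step_mm=0.0):
    """Resolve the next lens-frame target position.

    AF: drive to the temperature-corrected optimum-frame position.
    MF: offset the current position by the manual (dial) step."""
    if mode == "AF":
        return af_target_mm + temp_correction_mm
    if mode == "MF":
        return current_mm + manual_step_mm
    raise ValueError("mode must be 'AF' or 'MF'")
```

The key-operated AF/MF switch described in the fourth embodiment would select which branch is taken.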
- Furthermore, in the fifth embodiment, the imaging signal may be transmitted as an optical signal as in the above-mentioned second embodiment.
- Next, a sixth embodiment of the disclosure will be described.
FIG. 10 is a block diagram illustrating the configurations of a camera head and an image processing device in the sixth embodiment. The same reference numerals and symbols denote the same configurations as the above-mentioned configurations. In the sixth embodiment, the camera head adjusts the gain of the imaging signal. - An
endoscope device 1 e in the sixth embodiment includes the endoscope 8, the imaging device 2, the display device 3, and the image processing device 4 as described above. In the sixth embodiment, the imaging device 2 includes a camera head 9 e instead of the camera head 9. The camera head 9 e includes an imaging unit 92A instead of the above-mentioned imaging unit 92 of the camera head 9, as illustrated in FIG. 10. - The
imaging unit 92A includes a light receiver 921 receiving the light of an imaging target through the lens unit 91 and an analog front end unit (AFE unit) 922 performing gain adjustment processing and A/D conversion processing on the imaging signal as the electric signal input from the light receiver 921. The light receiver 921 corresponds to the photodiodes of a CCD or CMOS sensor. The AFE unit 922 performs the gain adjustment processing of amplifying the gain of the imaging signal by a predetermined amplification amount before the A/D conversion under control by the camera head controller 94. It should be noted that the light receiver 921 and the AFE unit 922 may be formed separately or integrally. For example, the AFE unit 922 may be provided integrally on one CMOS imaging element provided with the light receiver 921. - The gain adjustment is described. When the signal processor 41 receives the imaging signal from the imaging unit 92A, it performs detection processing on the imaging signal and outputs the detection result to the image processing controller 45. The image processing controller 45 sets, based on the detection result, an amplification amount by which the AFE unit 922 performs the amplification, and outputs it as a control signal to the AF controller 712. The AF controller 712 sets a gain adjustment amount based on the control signal and outputs it as a driving signal to the camera head controller 94. The camera head controller 94 causes the AFE unit 922 to amplify the gain of the imaging signal in accordance with the gain adjustment amount indicated by the received driving signal. - The sixth embodiment may provide the effects provided in the above-mentioned first embodiment. In addition, the imaging signal detection processing is performed, and the gain adjustment in accordance with the detection processing is performed on the imaging signal output from the camera head 9 e before the A/D conversion. Noise in the transmitted imaging signal may therefore be reduced because the amplification processing is performed while the signal is still in an analog state. - Although the embodiments of the disclosure have been described hereinbefore, the disclosure should not be limited by the above-mentioned embodiments. Although the
AF controller 712 generates the driving signal and so on in the above-mentioned embodiments, the camera head controller 94 may generate the driving signal. - Although the AF controller 712 and the memory 713 are provided in the first connector unit in the above-mentioned embodiments, the memory 713 may be provided in the camera head, or the camera head controller 94 may generate the driving signal by referring to the memory 713 provided in the camera head. - Although the
communication modules are provided as relay devices among the camera head 9, the transmission cable 7, and the image processing device 4 in the above-mentioned embodiments, the configuration is not limited thereto; at least any of the relay devices may be omitted and direct communication may be made. - Although the medical image acquisition system is used for the endoscope system as an example in the above-mentioned embodiments, it is not limited to the endoscope system as long as it is a medical image acquisition system imaging an observation site. For example, the medical image acquisition system may be applied to a medical microscope system. The medical microscope system is a medical image acquisition system observing a predetermined site of a subject while enlarging it. The medical microscope system includes a camera head enlarging and imaging the subject and a transmission cable transmitting an imaging signal from the camera head. Furthermore, the medical microscope system includes a movable arm portion to which the camera head is connected detachably and that holds the camera head, and an image processing device to which the transmission cable of a signal transmitter is connected. The medical microscope system may move and fix the position and posture of the camera head relative to the subject when the user grips the camera head portion and moves the camera head while deforming the arm. In the case of the medical microscope system, the transmission cable and the image processing device may be connected detachably or may be fixed and connected integrally. The medical image acquisition system according to the disclosure is useful for the above-mentioned medical microscope system, for example.
- As described above, the medical image acquisition system and the medical imaging device of the disclosure are useful for reducing the burden on the user regardless of the characteristics of the camera head.
- It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
Claims (8)
1. A medical image acquisition system comprising:
an imaging device configured to image a subject so as to generate an imaging signal; and
an image processing device electrically connected to the imaging device detachably and configured to process the received imaging signal so as to generate an image signal corresponding to the imaging signal, wherein
the imaging device includes:
an imaging unit configured to receive light and convert the light into an electric signal so as to generate the imaging signal;
an optical unit including a focus mechanism moving one or a plurality of lenses so as to adjust a focal point position, and configured to form an optical image on the imaging unit;
a memory configured to store therein unique information of the imaging device; and
an auto focus controller configured to comprehensively control the imaging device and control driving of the focus mechanism by referring to the memory,
the image processing device includes an auto focus evaluation unit configured to perform focusing evaluation based on the imaging signal, and
the auto focus controller controls driving of the focus mechanism by referring to the unique information in accordance with an evaluation result by the auto focus evaluation unit.
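The control split recited in claim 1, in which the image processing device performs the focusing evaluation and the imaging device's auto focus controller drives the lens based on that result, can be illustrated with a minimal sketch. All class, function, and key names below are hypothetical illustrations, not taken from the patent.

```python
# Minimal sketch of the claim-1 control split: the image processing
# device scores focus, and the imaging device's AF controller moves
# the lens in response. Names are illustrative, not from the patent.

def evaluate_focus(frame):
    """Image-processing-device side: higher score = sharper frame."""
    # Contrast proxy: mean absolute difference of neighboring pixels.
    return sum(abs(a - b) for a, b in zip(frame, frame[1:])) / max(len(frame) - 1, 1)

class AutoFocusController:
    """Imaging-device side: hill-climb the lens position using scores."""

    def __init__(self, unique_info):
        # Per-device "unique information" (e.g. a base drive step).
        self.step = unique_info["base_step"]
        self.position = 0
        self.prev_score = None

    def update(self, score):
        # Reverse direction and shrink the step when sharpness drops.
        if self.prev_score is not None and score < self.prev_score:
            self.step = -self.step // 2 or -1
        self.prev_score = score
        self.position += self.step
        return self.position
```

The sketch shows why the memory matters in the claim: the controller's drive step is read from device-specific stored information rather than being hard-coded in the image processing device.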
2. The medical image acquisition system according to claim 1 , wherein
the imaging device further includes:
a camera head including the imaging unit and the optical unit; and
a signal transmitter including a transmission cable transmitting the imaging signal from the imaging unit to the image processing device, and
the auto focus controller is arranged in the signal transmitter.
3. The medical image acquisition system according to claim 2 , wherein
the signal transmitter includes a first connection unit as a connection unit to the image processing device, a second connection unit as a connection unit to the camera head, and a cable unit connecting the first connection unit and the second connection unit, and
the auto focus controller is provided in the first connection unit.
4. The medical image acquisition system according to claim 1 , wherein
the focus mechanism includes a movable optical member that holds the one or plurality of lenses and is movable in an optical axis direction,
the imaging device further includes a temperature sensor configured to detect a temperature near the movable optical member,
the memory stores therein a relational table indicating a relation between the temperature and a correction amount related to movement of the movable optical member, and
the auto focus controller determines the correction amount based on a detection result by the temperature sensor and the relational table, and sets a movement amount of the one or plurality of lenses of the focus mechanism based on the determined correction amount and the movement amount of the movable optical member that is stored as the unique information.
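The temperature compensation in claim 4, where a relational table stored in memory maps a sensed temperature to a correction amount that is combined with the stored movement amount, can be sketched as a table lookup with interpolation. The table values and function names are invented for illustration only.

```python
# Sketch of the claim-4 temperature compensation: a relational table
# held in memory maps temperature to a focus correction amount.
# Table values and names are invented for illustration.
import bisect

# Relational table: (temperature in degrees C, correction in drive steps).
RELATIONAL_TABLE = [(10, -3), (20, 0), (30, 2), (40, 5)]

def correction_for(temp_c):
    """Linearly interpolate the correction amount for a sensed temperature."""
    temps = [t for t, _ in RELATIONAL_TABLE]
    i = bisect.bisect_left(temps, temp_c)
    if i == 0:
        return RELATIONAL_TABLE[0][1]       # clamp below the table range
    if i == len(temps):
        return RELATIONAL_TABLE[-1][1]      # clamp above the table range
    (t0, c0), (t1, c1) = RELATIONAL_TABLE[i - 1], RELATIONAL_TABLE[i]
    return c0 + (c1 - c0) * (temp_c - t0) / (t1 - t0)

def movement_amount(stored_amount, temp_c):
    """Claim-4 rule: stored (unique-information) amount plus correction."""
    return stored_amount + correction_for(temp_c)
```

The design point is that thermal drift of the lens barrel shifts the in-focus position, so the stored movement amount alone would mis-focus at temperature extremes; the table correction restores accuracy without re-calibrating the device.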
5. The medical image acquisition system according to claim 1 , wherein
the imaging device further includes an input unit configured to receive input of an instruction signal related to movement of the one or plurality of lenses, and
the auto focus controller sets a movement amount of the one or plurality of lenses in accordance with the instruction signal.
6. The medical image acquisition system according to claim 1 , wherein the auto focus evaluation unit computes a contrast of an image in accordance with the imaging signal, performs the focusing evaluation for each frame, and generates an evaluation result correlated with the frame.
7. The medical image acquisition system according to claim 1 , further comprising:
an electrical-to-optical converter provided in the imaging unit and configured to convert the imaging signal generated by the imaging unit into an optical signal; and
an optical-to-electrical converter provided in the image processing device and configured to convert the optical signal into an electric signal.
8. A medical imaging device adapted to image a subject so as to generate an imaging signal, the medical imaging device comprising:
an imaging unit configured to receive light and convert the light into an electric signal so as to generate the imaging signal;
an optical unit including a focus mechanism moving one or a plurality of lenses so as to adjust a focal point position and configured to form an optical image on the imaging unit;
a memory configured to store therein unique information of the medical imaging device; and
an auto focus controller configured to comprehensively control the medical imaging device and control driving of the focus mechanism by referring to the unique information in accordance with a focusing evaluation result of an image received from an external device.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/231,009 US10484594B2 (en) | 2015-06-12 | 2018-12-21 | Medical image acquisition system and medical imaging device |
US16/583,262 US11102391B2 (en) | 2015-06-12 | 2019-09-26 | Medical image acquisition system and medical imaging device |
US17/351,241 US11516376B2 (en) | 2015-06-12 | 2021-06-18 | Medical image acquisition system and medical imaging device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015119731A JP6892087B2 (en) | 2015-06-12 | 2015-06-12 | Medical image acquisition system and medical imaging device |
JP2015-119731 | 2015-06-12 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/231,009 Continuation US10484594B2 (en) | 2015-06-12 | 2018-12-21 | Medical image acquisition system and medical imaging device |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160366328A1 true US20160366328A1 (en) | 2016-12-15 |
Family
ID=57515996
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/155,197 Abandoned US20160366328A1 (en) | 2015-06-12 | 2016-05-16 | Medical image acquisition system and medical imaging device |
US16/231,009 Active US10484594B2 (en) | 2015-06-12 | 2018-12-21 | Medical image acquisition system and medical imaging device |
US16/583,262 Active 2036-07-24 US11102391B2 (en) | 2015-06-12 | 2019-09-26 | Medical image acquisition system and medical imaging device |
US17/351,241 Active US11516376B2 (en) | 2015-06-12 | 2021-06-18 | Medical image acquisition system and medical imaging device |
Family Applications After (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/231,009 Active US10484594B2 (en) | 2015-06-12 | 2018-12-21 | Medical image acquisition system and medical imaging device |
US16/583,262 Active 2036-07-24 US11102391B2 (en) | 2015-06-12 | 2019-09-26 | Medical image acquisition system and medical imaging device |
US17/351,241 Active US11516376B2 (en) | 2015-06-12 | 2021-06-18 | Medical image acquisition system and medical imaging device |
Country Status (2)
Country | Link |
---|---|
US (4) | US20160366328A1 (en) |
JP (1) | JP6892087B2 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200152416A1 (en) * | 2018-11-12 | 2020-05-14 | Fei Company | Charged particle microscope for examining a specimen, and method of determining an aberration of said charged particle microscope |
DE102018133443A1 (en) * | 2018-12-21 | 2020-06-25 | Leica Microsystems Cms Gmbh | Pulse detector |
CN111867439A (en) * | 2018-03-20 | 2020-10-30 | 索尼公司 | System with endoscope and image sensor and method for processing medical images |
US20210212551A1 (en) * | 2018-10-03 | 2021-07-15 | Olympus Corporation | Power supply apparatus for endoscope |
WO2023024780A1 (en) * | 2021-08-25 | 2023-03-02 | 深圳杰泰科技有限公司 | Endoscope structure with adjustable depth of field, and endoscope |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107427199B (en) * | 2016-03-07 | 2019-10-25 | 奥林巴斯株式会社 | Endoscopic system and endoscope |
EP3586717B1 (en) | 2017-03-27 | 2023-07-05 | Sony Olympus Medical Solutions Inc. | Control device and endoscope system |
WO2018179979A1 (en) | 2017-03-29 | 2018-10-04 | ソニー・オリンパスメディカルソリューションズ株式会社 | Control device, external device, medical observation system, control method, display method, and program |
US11906527B2 (en) * | 2020-10-29 | 2024-02-20 | Ambergen, Inc. | Photocleavable mass-tags for multiplexed mass spectrometric imaging of tissues using biomolecular probes |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030025789A1 (en) * | 1998-11-26 | 2003-02-06 | Olympus Optical Co., Ltd. | Image processing unit for expanding endoscope image signal processing capability |
US20030040659A1 (en) * | 2001-08-23 | 2003-02-27 | Yuri Kazakevich | Autofocusing endoscopic system |
US20030060681A1 (en) * | 2001-08-31 | 2003-03-27 | Olympus Optical Co., Ltd. | Instrumentation endoscope apparatus |
US20050063694A1 (en) * | 2003-09-18 | 2005-03-24 | Fuji Photo Film Co., Ltd. | Auto focus apparatus |
US20100261961A1 (en) * | 2006-12-21 | 2010-10-14 | Intuitive Surgical Operations, Inc. | Hermetically sealed distal sensor endoscope |
US20110021873A1 (en) * | 2009-07-23 | 2011-01-27 | Olympus Corporation | Endoscope apparatus and measurement method |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02304413A (en) * | 1989-05-18 | 1990-12-18 | Olympus Optical Co Ltd | Endoscope |
JP4648577B2 (en) * | 2001-06-15 | 2011-03-09 | Hoya株式会社 | Electronic endoscope device |
EP1645219B1 (en) * | 2004-02-16 | 2016-11-09 | Olympus Corporation | Endoscope system |
JP4464858B2 (en) * | 2005-04-05 | 2010-05-19 | オリンパスメディカルシステムズ株式会社 | Electronic endoscope |
JP2006025913A (en) | 2004-07-13 | 2006-02-02 | Olympus Corp | Medical equipment |
US9459415B2 (en) * | 2008-11-18 | 2016-10-04 | Stryker Corporation | Endoscopic LED light source having a feedback control system |
JP5714931B2 (en) * | 2011-02-15 | 2015-05-07 | オリンパス株式会社 | Imaging device |
JP5767036B2 (en) * | 2011-06-20 | 2015-08-19 | オリンパス株式会社 | Electronic endoscope device |
JP6013020B2 (en) * | 2012-05-02 | 2016-10-25 | オリンパス株式会社 | Endoscope apparatus and method for operating endoscope apparatus |
EP2901912B1 (en) * | 2012-09-27 | 2019-02-20 | Olympus Corporation | Rotation unit, insertion device, insertion body, insertion device wherein insertion body is positioned, and insertion system having insertion body and insertion device |
US9760048B2 (en) | 2013-04-25 | 2017-09-12 | Xerox Corporation | Surface coating and fuser member |
JP6109695B2 (en) * | 2013-09-27 | 2017-04-05 | 富士フイルム株式会社 | Endoscope system, processor device, operation method, and distance measuring device |
2015
- 2015-06-12 JP JP2015119731A patent/JP6892087B2/en active Active
2016
- 2016-05-16 US US15/155,197 patent/US20160366328A1/en not_active Abandoned
2018
- 2018-12-21 US US16/231,009 patent/US10484594B2/en active Active
2019
- 2019-09-26 US US16/583,262 patent/US11102391B2/en active Active
2021
- 2021-06-18 US US17/351,241 patent/US11516376B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP2017000568A (en) | 2017-01-05 |
US20210314482A1 (en) | 2021-10-07 |
JP6892087B2 (en) | 2021-06-18 |
US20200021732A1 (en) | 2020-01-16 |
US11516376B2 (en) | 2022-11-29 |
US10484594B2 (en) | 2019-11-19 |
US11102391B2 (en) | 2021-08-24 |
US20190246033A1 (en) | 2019-08-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11516376B2 (en) | Medical image acquisition system and medical imaging device | |
US8326139B2 (en) | Camera system | |
JP5704699B2 (en) | Rear focus adjustment system for infrared camera and rear focus adjustment method for infrared camera | |
CN105635564A (en) | Multiple camera apparatus and method for synchronized autofocus | |
JP6434227B2 (en) | Infrared camera | |
US11418700B2 (en) | Control device, endoscope system, processing method, and program | |
US6141498A (en) | Shake corrector for use in an optical apparatus, a photographing apparatus provided with a shake corrector, and a shake correcting method | |
JP5010868B2 (en) | Endoscope device | |
JP5970965B2 (en) | Imaging device | |
TW201536049A (en) | System and method for continuous auto focus within camera module | |
WO2018092269A1 (en) | Solid-state image pickup device and image pickup apparatus | |
US11036025B2 (en) | Medical observation apparatus and medical observation system | |
US20150124074A1 (en) | Image acquisition apparatus and image acquisition method | |
KR102054783B1 (en) | X-ray imaging system using intraoral sensor | |
US20230185163A1 (en) | Interchangeable lens device, imaging device, imaging system, method, and program | |
KR101761156B1 (en) | Machine vision system using interline type charge-coupled device | |
JP2011154111A (en) | Vibration isolation device | |
JP2010046220A (en) | Endoscope apparatus | |
JP2015181586A (en) | Endoscope apparatus, camera head, and control apparatus | |
KR200361649Y1 (en) | X-ray view box and digital x-ray picture translator with cmos image sensor | |
JP2021118504A5 (en) | ||
JP2022076742A (en) | Optical device, camera device, processing device, system, processing method, and program | |
JP2012203248A (en) | Focus detection device, control method for the same and imaging device with focus detection device | |
JP2015111170A (en) | Focus control device, camera system, camera body, conversion lens unit, and imaging device | |
Shi et al. | Research and design of thermal infrared camera based on gigabit ethernet |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SONY OLYMPUS MEDICAL SOLUTIONS INC., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAMAMOTO, TAKAHIRO;REEL/FRAME:038707/0880 Effective date: 20160509 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |