US20110221877A1 - Endoscope apparatus and measurement method - Google Patents

Endoscope apparatus and measurement method

Info

Publication number
US20110221877A1
Authority
US
United States
Prior art keywords
base
point
base point
points
measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US13/022,010
Other versions
US8913110B2 (en)
Inventor
Fumio Hori
Yuusuke Kuwa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Evident Corp
Original Assignee
Olympus Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Corp filed Critical Olympus Corp
Assigned to OLYMPUS CORPORATION reassignment OLYMPUS CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: HORI, FUMIO, KUWA, YUUSUKE
Publication of US20110221877A1 publication Critical patent/US20110221877A1/en
Application granted granted Critical
Publication of US8913110B2 publication Critical patent/US8913110B2/en
Assigned to OLYMPUS CORPORATION reassignment OLYMPUS CORPORATION CHANGE OF ADDRESS Assignors: OLYMPUS CORPORATION
Assigned to EVIDENT CORPORATION reassignment EVIDENT CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: OLYMPUS CORPORATION
Legal status: Expired - Fee Related (adjusted expiration)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source

Definitions

  • the present invention relates to an endoscope apparatus with a measurement function. Furthermore, the present invention relates to a method of measuring a subject.
  • An endoscope apparatus with a measurement function is used for inspecting blades.
  • the endoscope apparatus measures the defect based on an image where the defect is imaged (hereinafter, referred to as a measurement image) and displays a measurement result. A user checks the measurement result, and determines whether or not the blade needs replacing.
  • a function of a plane-based measurement is known as one of functions provided in an endoscope apparatus (refer to, for example, Japanese Unexamined Patent Application, First Publication No. H2-296209).
  • the base plane is a plane at the defect position that approximates a surface of the measurement target when no defect exists.
  • An endoscope apparatus includes: an imaging unit that captures a subject to acquire an image of the subject; a base point setting section that sets a first base point and a second base point on the image based on an instruction input via an input device; a base line setting section that sets a base line on the image based on the first base point and the second base point; a point setting section that sets at least three points on the image based on the base line; a base plane setting section that sets a base plane in a space based on the at least three points; a distance calculation section that calculates a distance between the base plane and a point corresponding to the first base point; and a display that displays the image.
  • FIG. 1 shows a configuration of an endoscope apparatus according to the present embodiment.
  • As shown in FIG. 1 , an endoscope apparatus 1 includes: an endoscope 2 ; a main unit 3 ; a remote control 4 (input device); a liquid crystal monitor 5 ; optical adapters 7 a , 7 b , and 7 c ; an endoscope unit 8 ; a camera control unit 9 ; and a control unit 10 .
  • the endoscope 2 (videoscope), which captures an image of a measurement target to generate its image signal, includes a long and thin insertion portion 20 .
  • the insertion portion 20 includes: a rigid distal portion 21 ; a bent portion 22 capable of being bent, for example, in the vertical and horizontal directions; and a flexible tube portion 23 , which are coupled in this order from the distal side.
  • the proximal portion of the insertion portion 20 is connected to the endoscope unit 8 .
  • Various optical adapters, such as the optical adapter 7 a or 7 b for stereo having two observation fields of view (hereinafter, referred to as stereo optical adapter) or the normal observation optical adapter 7 c having only one observation field of view, can be attached to the distal portion 21 in a freely detachable manner by, for example, threading.
  • the main unit 3 includes the endoscope unit 8 ; the camera control unit (hereinafter, referred to as CCU) 9 as an image processing device; and the control unit 10 as a control device.
  • the endoscope unit 8 includes: a light source apparatus for supplying necessary illumination light at the time of observation; and a bending apparatus for bending the bent portion 22 that constitutes the insertion portion 20 .
  • the CCU 9 receives an image signal output from a solid-state imaging device 2 a built in the distal portion 21 of the insertion portion 20 , converts the image signal into a video signal such as an NTSC signal, and supplies it to the control unit 10 .
  • the solid-state imaging device 2 a generates an image signal by performing photoelectric conversion on a subject image that has been formed through the optical adapter.
  • the control unit 10 includes: an audio signal processing circuit 11 ; a video signal processing circuit 12 ; a ROM 13 ; a RAM 14 ; a PC card interface (hereinafter, referred to as PC card I/F) 15 ; a USB interface (hereinafter, referred to as USB I/F) 16 ; an RS-232C interface (hereinafter, referred to as RS-232C I/F) 17 ; and a measurement processing portion 18 .
  • An audio signal generated by collecting sound with the microphone 34 or an audio signal obtained by playing a recording medium such as a memory card is supplied to the audio signal processing circuit 11 .
  • To display a synthesized image obtained by synthesizing the endoscope image supplied from the CCU 9 with a graphical operation menu, the video signal processing circuit 12 performs processing of synthesizing the video signal from the CCU 9 with a graphic image signal such as an operation menu generated through the control by the measurement processing portion 18 .
  • the video signal processing circuit 12 subjects the video signal after the synthesis to predetermined processing, and supplies it to the liquid crystal monitor 5 .
  • the video signal processing circuit 12 outputs image data, which is based on the video signal from the CCU 9 , to the measurement processing portion 18 .
  • At the time of measurement, a stereo optical adapter is attached to the distal portion 21 , and a plurality of subject images relating to the same subject as a measurement target are included in the image based on the image data from the video signal processing circuit 12 .
  • In the present embodiment, a pair of left and right subject images is included, by way of example.
  • a memory card such as a PCMCIA memory card 32 or a flash memory card 33 is freely attached to or detached from the PC card I/F 15 .
  • Control processing information, image information, optical data, and the like can be read from the memory card or stored in the memory card in accordance with the control of the measurement processing portion 18 .
  • the USB I/F 16 is an interface which electrically connects the main unit 3 and a personal computer (PC) 31 to each other.
  • Via the USB I/F 16 , it is possible to perform various kinds of instructions and controls, such as an instruction to display an endoscope image or to perform image processing during measurement, from the personal computer 31 side.
  • the RS-232C I/F 17 is connected to the CCU 9 , the endoscope unit 8 , and the remote control 4 which performs control and operation instructions of the CCU 9 , the endoscope unit 8 , and the like.
  • When a user operates the remote control 4 , communication required for controlling the CCU 9 and the endoscope unit 8 is performed based on the user's operation.
  • the measurement processing portion 18 executes a program stored in the ROM 13 , to thereby take in the image data from the video signal processing circuit 12 and perform measurement processing based on the image data.
  • the RAM 14 is used by the measurement processing portion 18 as a work area for temporarily storing data.
  • FIG. 2 shows a configuration of the measurement processing portion 18 .
  • the measurement processing portion 18 includes: a control section 18 a ; a base point specification section 18 b ; a base line calculation section 18 c ; a point calculation section 18 d ; a spatial-coordinate calculation section 18 e ; a base plane calculation section 18 f ; a distance calculation section 18 g ; and a storage section 18 h.
  • the control section 18 a controls the various sections of the measurement processing portion 18 . Furthermore, the control section 18 a has a function of generating a graphic image signal for displaying the measurement result, the operation menu, and the like on the liquid crystal monitor 5 , and of outputting the graphic image signal to the video signal processing circuit 12 .
  • the base point specification section 18 b specifies a base point on the measurement target based on a signal input from the remote control 4 or the PC 31 (input portion).
  • The user inputs a base point by operating the remote control 4 while looking at the image of the measurement target displayed on the liquid crystal monitor 5 . When the user specifies a base point, its coordinates are calculated by the base point specification section 18 b . In the present embodiment, two base points are set on the image.
  • the base line calculation section 18 c sets a base line whose shape or size is determined based on the two base points specified by the base point specification section 18 b , and calculates image coordinates of the base line (or the equation of the base line used to determine its image coordinates).
  • a base circle is used as the base line.
  • the base circle is set so as to surround a part of or all of a feature region (a burned portion in the present embodiment).
  • Hereinafter, two-dimensional coordinates on an image displayed on the liquid crystal monitor 5 are described as “image coordinates”, and three-dimensional coordinates in the actual space are described as “spatial coordinates”.
  • the point calculation section 18 d sets three or four base plane composing points that constitute a base plane with reference to the position of the base circle, and calculates their image coordinates.
  • the base plane composing points are set on the base circle.
  • the spatial-coordinate calculation section 18 e calculates spatial coordinates which correspond to the image coordinates.
  • the base plane calculation section 18 f sets a base plane based on the three or four spatial coordinates which correspond to the base plane composing points, and calculates spatial coordinates of the base plane (or the equation of the base plane used to determine its spatial coordinates).
  • the base plane is a plane, at the defect position, which approximates a surface of the measurement target in the case in which no defect exists.
  • the distance calculation section 18 g calculates a spatial distance between the base plane and a point in the space corresponding to one of the two base points specified by the base point specification section 18 b . This spatial distance corresponds to the depth of a concave portion or the height of a convex portion (the depth of the burned portion in the present embodiment) which exists on the surface of the measurement target.
  • the storage section 18 h stores various pieces of information that are processed in the measurement processing portion 18 . The various pieces of information stored in the storage section 18 h are appropriately read by the control section 18 a and are then output to the appropriate sections.
  • As shown in FIG. 22 , three-dimensional coordinates (X, Y, Z) of a measurement target point 60 are calculated by the triangulation method using the following Equations (a) to (c); a standard formulation is sketched after this passage.
  • the three-dimensional coordinates of the measurement target point 60 are found using the parameters D and F.
  • Once the three-dimensional coordinates are found, various measurements, such as a point-to-point distance, the distance between a line connecting two points and one point, a surface area, a depth, and a surface shape, become possible.
  • In order to perform the stereo measurement, optical data that shows the characteristics of the optical system including the distal portion 21 and the stereo optical adapter is required. Note that the details of the optical data are disclosed, for example, in Japanese Unexamined Patent Application, First Publication No. 2004-49638, so an explanation thereof will be omitted here.
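  • For reference, a standard parallel-stereo triangulation takes the following form; this is a sketch of the general technique, not a reproduction of the patent's Equations (a) to (c). Here (x_l, y) and (x_r, y) are the image coordinates of the matched points in the left and right images measured from each optical axis, D is the distance between the left and right optical centers, F is the focal length, and the origin is assumed midway between the two optical centers.

```latex
X = \frac{D}{2}\cdot\frac{x_l + x_r}{x_l - x_r}, \qquad
Y = \frac{D\,y}{x_l - x_r}, \qquad
Z = \frac{D\,F}{x_l - x_r}
```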
  • measurement of a defect is performed by using the stereo measurement.
  • a measurement target is imaged in a state with the stereo optical adapter attached to the distal portion 21 of the endoscope 2 . Therefore, a pair of left and right images of the measurement target is displayed on the measurement screen.
  • measurement of a defect may be performed by using a measurement method other than the stereo measurement.
  • FIG. 3 shows a measurement screen before the start of measurement.
  • a left image of the measurement target is displayed on a left screen 700
  • a right image of the measurement target is displayed on a right screen 710 .
  • Optical adapter name information 720 , time information 721 , message information 722 , icons 723 a , 723 b , 723 c , and 723 d , and a zoom window 724 are displayed on the measurement screen in a region outside the left screen 700 and the right screen 710 , as other pieces of measurement information.
  • the optical adapter name information 720 and the time information 721 are pieces of information showing measurement conditions.
  • the optical adapter name information 720 is textual information showing the name of the optical adapter in current use.
  • the time information 721 is textual information showing the current date and time.
  • the message information 722 includes: textual information showing an operational instruction for the user; and textual information showing coordinates of a base point, which is one of the measurement conditions.
  • the icons 723 a to 723 d constitute an operation menu for the user to input operational instructions such as switching measurement modes and clearing a measurement result.
  • When the user operates the remote control 4 to move a cursor 725 onto any of the icons 723 a to 723 d and performs an operation such as a click in this state, a signal corresponding to the operation is input to the measurement processing portion 18 .
  • Based on the signal, the control section 18 a recognizes the operational instruction from the user, and controls the measurement processing.
  • an enlarged image of the measurement target located around the cursor 725 is displayed on the zoom window 724 .
  • FIG. 4 shows a procedure of measurement.
  • In Step SA, the user operates the remote control 4 to specify a first base point on the measurement screen displayed on the liquid crystal monitor 5 .
  • As shown in FIG. 5A , the user moves a cursor 500 displayed on the left screen of the measurement screen and performs an operation such as a click to specify a first base point 510 .
  • the first base point becomes the center position of a base circle, and the depth of a concave portion or the height of a convex portion is measured at the first base point. It is preferable for the user to specify the first base point at a point located nearly in the center of the inside of the burned portion.
  • In Step SA, when the user inputs an instruction of specifying the first base point, the base point specification section 18 b recognizes image coordinates at the current cursor position as image coordinates of the first base point. Details of Step SA will be described later.
  • In Step SB, the user operates the remote control 4 to specify a second base point on the measurement screen displayed on the liquid crystal monitor 5 .
  • As shown in FIG. 5B , when the user moves the cursor 500 , a base circle 520 having a size in accordance with the position of the cursor 500 is displayed, and three base plane composing points 530 are displayed on the base circle 520 .
  • a second base point is temporarily specified at the position of the cursor.
  • the base circle has a diameter twice as long as the distance between the first base point and the second base point.
  • the three base plane composing points are set on the base circle in an evenly spaced manner, and one of the three base plane composing points is at the same position as the cursor.
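  • As a concrete illustration of this construction, the following sketch places the composing points evenly on the base circle with one point at the cursor, as in FIGS. 5B and 6B; the function name and coordinate conventions are illustrative assumptions, not the patent's implementation.

```python
import math

def base_plane_composing_points(base1, cursor, n=3):
    """Place n points evenly on the base circle centered at the first
    base point, whose radius equals the base-point-to-cursor distance,
    with one point coinciding with the cursor."""
    cx, cy = base1
    r = math.hypot(cursor[0] - cx, cursor[1] - cy)   # radius = half the diameter
    t0 = math.atan2(cursor[1] - cy, cursor[0] - cx)  # angle of the cursor point
    return [(cx + r * math.cos(t0 + 2 * math.pi * k / n),
             cy + r * math.sin(t0 + 2 * math.pi * k / n)) for k in range(n)]

# Example: first base point at (100, 120), cursor at (140, 120), three points.
print(base_plane_composing_points((100, 120), (140, 120), n=3))
```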
  • In Step SB, when the user inputs an instruction of specifying (fixing) the second base point, the base point specification section 18 b recognizes image coordinates at the current cursor position as image coordinates of the second base point. Details of Step SB will be described later.
  • In the example described above, the number of base plane composing points is three; however, the number of base plane composing points may be four or more.
  • When the number of base plane composing points is four, as shown in FIG. 6A , the user moves a cursor 600 and performs an operation such as a click to specify a first base point 610 .
  • As shown in FIG. 6B , when the user moves the cursor 600 , a base circle 620 is displayed and four base plane composing points 630 are displayed on the base circle 620 .
  • a second base point is temporarily specified at the position of the cursor.
  • the four base plane composing points are set on the base circle in an evenly spaced manner, and one of the four base plane composing points is at the same position as the cursor.
  • the position of the temporarily-specified second base point changes in accordance with the movement of the cursor 600 , and the base circle 620 and the base plane composing points 630 also change.
  • the user performs an operation such as a click to specify (fix) the second base point in a state where the base circle 620 is located slightly outside the burned portion (i.e., in a state where the base circle 620 nearly surrounds the burned portion).
  • In Step SC, the base plane composing points and the base plane are calculated, and a spatial distance (depth or height; the depth of the burned portion in the present embodiment) between the base plane and a point in the space corresponding to the first base point is calculated. Details of Step SC will be described later.
  • In Step SD, the control section 18 a generates a graphic image signal for displaying the base circle, the base plane composing points, and the first base point, and outputs it to the video signal processing circuit 12 .
  • the base circle, the base plane composing points, and the first base point are displayed on the left screen.
  • the control section 18 a generates a graphic image signal for displaying the spatial distance between the base plane and the point in the space corresponding to the first base point, and outputs it to the video signal processing circuit 12 .
  • the spatial distance is displayed on the left screen.
  • FIGS. 7A and 7B show measurement screens when the measurement result of the burned portion is displayed.
  • FIG. 7A shows a measurement screen in the case in which the number of base plane composing points is three.
  • Base plane composing points 800 and a first base point 810 are displayed on the left screen. Specifically, the base plane composing points 800 are displayed as unfilled diamond marks, and they are connected with each other. The first base point 810 is displayed as an “x” mark. Further, an intersection point 820 (i.e., foot of a perpendicular) between the base plane and a line which is perpendicular to the base plane and passes through the point in the space corresponding to the first base point 810 is displayed as a small filled square.
  • a result window 830 is displayed on the right screen.
  • the image of the measurement target is displayed in the upper portion of the result window 830
  • the spatial distance in text is displayed in the lower portion of the result window 830 .
  • D denotes the spatial distance (i.e., the depth of the burned portion).
  • FIG. 7B shows a measurement screen in the case in which the number of base plane composing points is four.
  • FIG. 7B is similar to FIG. 7A except that four base plane composing points 800 are displayed.
  • Next, a procedure of Step SA will be described with reference to FIG. 8 .
  • a measurement screen is displayed and a cursor is displayed on the measurement screen.
  • In Step SA 1 , a signal indicating a movement amount of the cursor by the user's operation of the remote control 4 is input into the measurement processing portion 18 .
  • In Step SA 2 , the base point specification section 18 b calculates image coordinates of the cursor at the next time by calculating the movement amount of the cursor based on the signal from the remote control 4 , and adding the calculated movement amount to the position of the cursor at the current time.
  • In Step SA 3 , the control section 18 a generates a graphic image signal for displaying the cursor at the image coordinates calculated by the base point specification section 18 b , and outputs it to the video signal processing circuit 12 .
  • As a result, the cursor is displayed at the position the user specifies.
  • In Step SA 4 , the control section 18 a determines, based on the signal from the remote control 4 , whether or not an operation, such as a click, of specifying a first base point has been input.
  • When an operation of specifying a first base point has not been input, the processing returns to Step SA 1 .
  • When such an operation has been input, the processing proceeds to Step SA 5 , and the base point specification section 18 b recognizes the image coordinates calculated in Step SA 2 as image coordinates of the first base point. As a result, the first base point is fixed.
  • In Step SA 6 , the control section 18 a generates a graphic image signal for displaying the first base point at the above image coordinates, and outputs it to the video signal processing circuit 12 .
  • the first base point is displayed at the same position as the cursor. Then, the processing proceeds to Step SB.
  • Next, a procedure of Step SB will be described with reference to FIG. 9 . In Step SB 1 , a signal indicating a movement amount of the cursor by the user's operation of the remote control 4 is input into the measurement processing portion 18 .
  • In Step SB 2 , the base point specification section 18 b calculates image coordinates of the cursor at the next time by calculating the movement amount of the cursor based on the signal from the remote control 4 , and adding the calculated movement amount to the position of the cursor at the current time. Further, the base point specification section 18 b recognizes the above image coordinates as image coordinates of the second base point which is temporarily specified by the user.
  • In Step SB 3 , the control section 18 a generates a graphic image signal for displaying the cursor at the image coordinates calculated by the base point specification section 18 b , and outputs it to the video signal processing circuit 12 .
  • In Step SB 4 , the base line calculation section 18 c calculates image coordinates of the base circle (or the equation of the base circle used to determine its image coordinates) based on the image coordinates of the first base point and the image coordinates of the cursor calculated in Step SB 2 (i.e., the image coordinates of the temporarily-specified second base point).
  • the center of the base circle is at the first base point, and the radius of the base circle is equal to the distance between the image coordinates of the first base point and the image coordinates of the cursor.
  • In Step SB 5 , the point calculation section 18 d calculates image coordinates of three or four base plane composing points that constitute a base plane based on the image coordinates of the base circle.
  • In Step SB 6 , the control section 18 a generates a graphic image signal for displaying the base circle and the base plane composing points, and outputs it to the video signal processing circuit 12 . As a result, the base circle and the base plane composing points are displayed.
  • In Step SB 7 , the control section 18 a determines, based on the signal from the remote control 4 , whether or not an operation, such as a click, of specifying (fixing) a second base point has been input.
  • When an operation of specifying (fixing) a second base point has not been input, the processing returns to Step SB 1 .
  • When such an operation has been input, the processing proceeds to Step SB 8 , and the base point specification section 18 b recognizes the image coordinates calculated in Step SB 2 as image coordinates of the fixed second base point. Then, the processing proceeds to Step SC.
  • Next, a procedure of Step SC will be described with reference to FIG. 10 . In Step SC 1 , the base line calculation section 18 c calculates image coordinates of the base circle (or the equation of the base circle used to determine its image coordinates) based on the image coordinates of the first base point and the image coordinates of the second base point.
  • the center of the base circle is at the first base point, and the radius of the base circle is equal to the distance between the image coordinates of the first base point and the image coordinates of the second base point.
  • In Step SC 2 , the point calculation section 18 d calculates image coordinates of three or four base plane composing points which constitute a base plane based on the image coordinates of the base circle.
  • Note that the base circle and the base plane composing points which were calculated in the previous Steps SB 4 and SB 5 may also be used in the subsequent processing, instead of calculating the base circle and the base plane composing points in Steps SC 1 and SC 2 .
  • In Step SC 3 , the point calculation section 18 d performs matching processing in which image coordinates of corresponding points (matching points) on the right image which correspond to the image coordinates of the first base point and the base plane composing points on the left image are calculated by pattern-matching.
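  • The text does not detail the pattern-matching itself; the following is a minimal sketch of one conventional approach (normalized cross-correlation along a horizontal search range). The window size, the search range, and the horizontal-only search are assumptions made for illustration.

```python
import numpy as np

def match_point(left, right, pt, half=3, search=40):
    """Find the point on the right image corresponding to image point
    `pt` on the left image by normalized cross-correlation of a small
    window. `left` and `right` are 2-D grayscale arrays."""
    x, y = pt
    t = left[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    t -= t.mean()
    best_score, best_x = -np.inf, x
    for xr in range(max(half, x - search), min(right.shape[1] - half - 1, x + search) + 1):
        w = right[y - half:y + half + 1, xr - half:xr + half + 1].astype(float)
        w -= w.mean()
        denom = np.sqrt((t * t).sum() * (w * w).sum())
        score = (t * w).sum() / denom if denom > 0 else -np.inf
        if score > best_score:
            best_score, best_x = score, xr
    return best_x, y  # image coordinates of the matching point
```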
  • In Step SC 4 , the spatial-coordinate calculation section 18 e calculates spatial coordinates of a point in the space which corresponds to the first base point, based on the image coordinates of the first base point and the image coordinates of its matching point. Further, the spatial-coordinate calculation section 18 e calculates spatial coordinates of three or four points in the space which correspond to the base plane composing points, based on the image coordinates of the base plane composing points and the image coordinates of their matching points.
  • In Step SC 5 , the base plane calculation section 18 f sets a base plane based on the spatial coordinates of the three or four points calculated in Step SC 4 , and calculates spatial coordinates of the base plane (or the equation of the base plane used to determine its spatial coordinates). Details of the calculation method of the base plane will be described later.
  • In Step SC 6 , the distance calculation section 18 g calculates a spatial distance between the base plane and the spatial coordinates of the point in the space corresponding to the first base point. This spatial distance corresponds to the depth of the burned portion in the present embodiment. Details of the calculation method of the spatial distance will be described later. Then, the processing proceeds to Step SD.
  • Details of the calculation method of the base plane in the case in which the number of base plane composing points is three will be described with reference to FIGS. 11A and 11B .
  • As shown in FIG. 11A , it is assumed that the spatial points of the three base plane composing points are P 1 , P 2 , and P 3 , respectively, and that their spatial coordinates are expressed as follows.
  • the values of A, B, and C can be calculated using the Equations (2) to (4).
  • the base plane can be calculated.
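  • A minimal sketch of this three-point construction follows. Since Equations (1) to (4) are not reproduced in this text, the plane is written in the generic form n · x + d = 0; that notation is an assumption, not the patent's.

```python
import numpy as np

def plane_from_three_points(p1, p2, p3):
    """Set a plane through three spatial points: the normal is the cross
    product of two edge vectors, and the plane passes through p1.
    Returns (n, d) with the plane written as n . x + d = 0."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)   # normal vector of the base plane
    d = -np.dot(n, p1)               # the plane passes through p1
    return n, d
```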
  • Details of the calculation method of the base plane in the case in which the number of base plane composing points is four will be described with reference to FIGS. 12A to 12D .
  • As shown in FIG. 12A , it is assumed that the spatial points of the four base plane composing points are P 1 , P 2 , P 3 , and P 4 , respectively, and that their spatial coordinates are expressed as follows.
  • directional vectors V and W of the lines L 1 and L 2 are expressed by the following Equations (8) and (9), respectively.
  • Since the base plane passes through the gravity point P 0 , the following Equation (11) can be obtained.
  • the values of A, B, and C can be calculated using the Equations (11) to (13).
  • the base plane can be obtained.
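  • The four-point case can be sketched in the same style. Identifying L 1 with the line through P 1 and P 3 , L 2 with the line through P 2 and P 4 , and the gravity point P 0 with the centroid of the four points are assumptions consistent with the description above, not statements taken from the patent text.

```python
import numpy as np

def plane_from_four_points(p1, p2, p3, p4):
    """Set an approximating plane for four spatial points: the normal is
    V x W for the two diagonal directions, and the plane passes through
    the gravity point P0. Returns (n, d) for the form n . x + d = 0."""
    pts = np.array([p1, p2, p3, p4], dtype=float)
    v = pts[2] - pts[0]              # directional vector V of line L1
    w = pts[3] - pts[1]              # directional vector W of line L2
    n = np.cross(v, w)               # normal vector of the base plane
    p0 = pts.mean(axis=0)            # gravity point P0 (assumed centroid)
    d = -np.dot(n, p0)               # the plane passes through P0
    return n, d
```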
  • Next, the calculation method of the spatial distance (depth or height) in Step SC 6 will be described with reference to FIGS. 13A to 13C .
  • It is assumed that the spatial point which corresponds to the first base point is Pb, and that its spatial coordinates are expressed as follows.
  • A spatial line L 3 which passes through the spatial point Pb and whose directional vector is parallel to the normal vector I of the base plane is expressed by the following Equation (14).
  • the spatial distance D between the spatial point Pb and the intersection point Pf is equal to the distance between the spatial point Pb and the base plane.
  • the spatial distance D is expressed by the following Equation (16).
  • When the spatial coordinates of the spatial point Pb are compared with those of the intersection point Pf and the spatial point Pb is positioned further toward the positive side in the z-direction than the intersection point Pf, it is assumed that the spatial point Pb is at a position lower than the base plane (i.e., in a concave portion of the measurement target), and −D is regarded as the spatial distance (depth).
  • Conversely, when the spatial point Pb is positioned further toward the negative side in the z-direction than the intersection point Pf, it is assumed that the spatial point Pb is at a position higher than the base plane (i.e., in a convex portion of the measurement target), and +D is regarded as the spatial distance (height).
  • In this way, the spatial distance (depth or height) can be calculated.
  • the calculation method of the spatial distance in the case in which the number of base plane composing points is four is the same as is described above.
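  • The signed distance of Step SC 6 can be sketched as follows, using the sign convention described above (negative for a concave portion, positive for a convex portion). The (n, d) plane representation matches the sketches above and is not taken from Equations (14) to (16).

```python
import numpy as np

def signed_depth(pb, n, d):
    """Distance between spatial point Pb and the base plane n . x + d = 0,
    signed negative (depth) when Pb lies on the positive z side of the
    foot of the perpendicular Pf, and positive (height) otherwise."""
    pb, n = np.asarray(pb, dtype=float), np.asarray(n, dtype=float)
    t = (np.dot(n, pb) + d) / np.dot(n, n)
    pf = pb - t * n                    # foot of the perpendicular (Pf)
    dist = np.linalg.norm(pb - pf)     # unsigned spatial distance D
    return -dist if pb[2] > pf[2] else dist
```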
  • Although the base plane is calculated by using the spatial coordinates of three or four base plane composing points in the above-described methods, the base plane may also be calculated by using spatial coordinates of five or more base plane composing points.
  • Although the spatial distance is calculated based on the base plane in the above-described methods, the spatial distance between the gravity point P 0 and the spatial point Pb may be regarded as a spatial distance which corresponds to the depth or height, without obtaining the base plane.
  • In the above description, the second base point is set after the first base point is set; however, the second base point may be set before the first base point is set.
  • In this case, the user may specify (fix) the second base point such that the second base point is located slightly outside the burned portion by performing an operation such as a click, and then the user may move the cursor 500 and perform an operation such as a click to specify the first base point 510 .
  • As described above, once the first base point and the second base point are specified, a spatial distance between the base plane and a point in the space corresponding to the first base point can be calculated. Therefore, it is possible to reduce the burden of operation at the time of the plane-based measurement and to improve the operability of the apparatus.
  • Further, by displaying the base circle and the base plane composing points, the user can confirm the accuracy of the plane-based measurement. For example, the user can presume that a certain level of accuracy is obtained when the base circle and all of the base plane composing points surround the burned portion, and that the accuracy degrades when a portion of the base circle passes outside the burned portion or when some of the base plane composing points are located outside the burned portion.
  • In Step SB, until an operation of specifying (fixing) a second base point is performed, the procedure does not proceed to the processing of calculating a base plane and a spatial distance, and the base circle and the base plane composing points on the measurement screen are updated in accordance with the movement of the cursor. As a result, the user can confirm the base circle and the base plane composing points before performing measurement.
  • Next, a second embodiment of the present invention will be described. A configuration of an endoscope apparatus according to the present embodiment is the same as the configuration of the endoscope apparatus of the first embodiment.
  • the present embodiment makes it possible to specify a second base point more flexibly.
  • A base ellipse is an ellipse, as shown in FIG. 14A , that is set and displayed on the measurement screen when the user specifies a second base point.
  • a base ellipse 900 has one diameter whose length is twice as long as the distance in the horizontal direction between a first base point 910 and a cursor 920 , and the other diameter whose length is twice as long as the distance in the vertical direction between the first base point 910 and the cursor 920 .
  • A base rectangle is a rectangle, as shown in FIG. 14B , that is set and displayed on the measurement screen when the user specifies a second base point.
  • a base rectangle 930 has one side whose length is twice as long as the distance in the horizontal direction between a first base point 910 and a cursor 920 , and the other side whose length is twice as long as the distance in the vertical direction between the first base point 910 and the cursor 920 .
  • A base triangle is a triangle, as shown in FIG. 14C , that is set and displayed on the measurement screen when the user specifies a second base point.
  • a base triangle 940 has a base whose length is twice as long as the distance in the horizontal direction between a first base point 910 and a cursor 920 , and a height which is twice as long as the distance in the vertical direction between the first base point 910 and the cursor 920 .
  • the method in the case of using the base ellipse is as follows. As shown in FIG. 15A , the user moves a cursor 1000 displayed on the left screen of the measurement screen and performs an operation such as a click to specify a first base point 1010 . Then, as shown in FIGS. 15B and 15C , when the user moves the cursor 1000 , a base ellipse 1020 and base plane composing points 1030 are displayed. At this time, a second base point is temporarily specified at the position of the cursor 1000 .
  • the base ellipse 1020 has one diameter whose length is twice as long as the distance in the horizontal direction between the first base point 1010 and the cursor 1000 , and the other diameter whose length is twice as long as the distance in the vertical direction between the first base point 1010 and the cursor 1000 .
  • the four base plane composing points 1030 are set on the base ellipse 1020 at both end points of the major and minor diameters.
  • the user performs an operation such as a click to specify (fix) the second base point in a state where the four base plane composing points 1030 on the base ellipse 1020 are located slightly outside the burned portion (i.e., in a state where the base ellipse 1020 nearly surrounds the burned portion).
  • the method in the case of using the base rectangle is as follows. As shown in FIG. 16A , the user moves a cursor 1000 displayed on the left screen of the measurement screen and performs an operation such as a click to specify a first base point 1010 . Then, as shown in FIGS. 16B and 16C , when the user moves the cursor 1000 , a base rectangle 1040 and base plane composing points 1050 are displayed. At this time, a second base point is temporarily specified at the position of the cursor 1000 .
  • the base rectangle 1040 has one side whose length is twice as long as the distance in the horizontal direction between the first base point 1010 and the cursor 1000 , and the other side whose length is twice as long as the distance in the vertical direction between the first base point 1010 and the cursor 1000 .
  • the four base plane composing points 1050 are set at the corners of the base rectangle 1040 .
  • the user performs an operation such as a click to specify (fix) the second base point in a state where the four base plane composing points 1050 on the base rectangle 1040 are located slightly outside the burned portion (i.e., in a state where the base rectangle 1040 nearly surrounds the burned portion).
  • the method in the case of using the base triangle is as follows. As shown in FIG. 17A , the user moves a cursor 1000 displayed on the left screen of the measurement screen and performs an operation such as a click to specify a first base point 1010 . Then, as shown in FIGS. 17B and 17C , when the user moves the cursor 1000 , a base triangle 1060 and base plane composing points 1070 are displayed. At this time, a second base point is temporarily specified at the position of the cursor 1000 .
  • the base triangle 1060 has a base whose length is twice as long as the distance in the horizontal direction between the first base point 1010 and the cursor 1000 , and a height which is twice as long as the distance in the vertical direction between the first base point 1010 and the cursor 1000 .
  • the three base plane composing points 1070 are set on the base triangle 1060 at the two edge points of the base and at the apex.
  • Similarly, the user performs an operation such as a click to specify (fix) the second base point in a state where the three base plane composing points 1070 on the base triangle 1060 are located slightly outside the burned portion; a sketch of the point placements for all three shapes follows below.
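  • The composing-point placements for the three base line shapes can be sketched as follows. The ellipse and rectangle placements follow the description above; the triangle layout (apex above and base below, with the first base point at the center of the bounding box) is only a plausible reading of FIG. 14C, since the text fixes the base length and the height but not the orientation.

```python
def composing_points(base1, cursor, shape="ellipse"):
    """Return the base plane composing points for the second embodiment.
    dx and dy are the horizontal and vertical distances between the
    first base point and the cursor."""
    cx, cy = base1
    dx, dy = abs(cursor[0] - cx), abs(cursor[1] - cy)
    if shape == "ellipse":    # end points of the major and minor diameters
        return [(cx + dx, cy), (cx - dx, cy), (cx, cy + dy), (cx, cy - dy)]
    if shape == "rectangle":  # the four corners of the base rectangle
        return [(cx - dx, cy - dy), (cx + dx, cy - dy),
                (cx + dx, cy + dy), (cx - dx, cy + dy)]
    if shape == "triangle":   # two base end points and the apex (assumed layout)
        return [(cx - dx, cy + dy), (cx + dx, cy + dy), (cx, cy - dy)]
    raise ValueError("unknown shape: " + shape)
```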
  • In FIG. 18 , the case in which a concave portion 121 as a measurement target exists on an image 120 obtained by capturing the surface of the measurement target is illustrated.
  • the lower portion in FIG. 18 shows a cross-sectional surface of the measurement target.
  • The first base point, which specifies the position at which the depth of the concave portion 121 is measured, and the base plane composing points, which determine a base plane approximating a surface 122 of the measurement target, are set in accordance with the user's operation.
  • It is preferable that the base plane composing points be set at positions where errors in the measurement result become small.
  • According to the present embodiment, the base line can be flexibly set by using a base ellipse, a base rectangle, or a base triangle in accordance with the shape and the size of the burned portion. Therefore, it is possible to reduce errors in the measurement result.
  • Next, a third embodiment of the present invention will be described. A configuration of an endoscope apparatus according to the present embodiment is the same as the configuration of the endoscope apparatus of the first embodiment.
  • The present embodiment makes it possible to give the user an indication of the expected measurement accuracy in advance.
  • the measurement accuracy depends on the object distance which is a distance from the distal end of the endoscope 2 to the measurement target. It is possible to have some awareness of the level of measurement accuracy by calculating the object distance. Generally, the smaller the object distance is, the better the measurement accuracy is.
  • FIG. 19 shows a procedure of Step SA.
  • the procedure shown in FIG. 19 is different from the procedure shown in FIG. 8 in that Steps SA 10 and SA 11 are inserted between Steps SA 5 and SA 6 .
  • In Step SA 10 , the point calculation section 18 d calculates image coordinates of a corresponding point (matching point) on the right image which corresponds to the image coordinates of the first base point on the left image by the matching processing.
  • In Step SA 11 , the spatial-coordinate calculation section 18 e calculates spatial coordinates of a point in the space corresponding to the first base point based on the image coordinates of the first base point and the image coordinates of its matching point.
  • the z-coordinate of the spatial coordinates is the object distance.
  • In Step SA 6 , the control section 18 a generates a graphic image signal for displaying the first base point at the calculated image coordinates, and outputs it to the video signal processing circuit 12 .
  • the control section 18 a sets the color of the first base point in accordance with its object distance.
  • As a result, the first base point is displayed, at the same position as the cursor, in a color in accordance with its object distance.
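  • A sketch of such a color coding is given below; the thresholds and the colors are entirely hypothetical, since the text does not specify concrete values.

```python
def point_color(object_distance_mm):
    """Map an object distance to a display color as a rough accuracy
    guide: the smaller the object distance, the better the expected
    accuracy."""
    if object_distance_mm <= 15.0:
        return "green"   # close: good expected accuracy
    if object_distance_mm <= 30.0:
        return "yellow"  # intermediate
    return "red"         # far: accuracy likely degraded
```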
  • FIG. 20 shows a procedure of Step SB.
  • the procedure shown in FIG. 20 is different from the procedure shown in FIG. 9 in that Steps SB 10 and SB 11 are inserted between Steps SB 5 and SB 6 .
  • In Step SB 10 , the point calculation section 18 d calculates image coordinates of three or four corresponding points (matching points) on the right image which correspond to the image coordinates of the three or four base plane composing points on the left image by the matching processing.
  • In Step SB 11 , the spatial-coordinate calculation section 18 e calculates spatial coordinates of points in the space which correspond to the base plane composing points based on the image coordinates of the base plane composing points and the image coordinates of their matching points.
  • the z-coordinate of the spatial coordinates is the object distance.
  • In Step SB 6 , the control section 18 a generates a graphic image signal for displaying the base circle and the base plane composing points, and outputs it to the video signal processing circuit 12 .
  • the control section 18 a sets the colors of the base plane composing points in accordance with their respective object distances.
  • As a result, the base circle is displayed, and the base plane composing points are displayed in colors in accordance with their respective object distances.
  • FIGS. 21A to 21D show a state in which the first and second base points are specified.
  • As shown in FIG. 21A , the user moves a cursor 1100 displayed on the left screen of the measurement screen and performs an operation such as a click to specify a first base point 1110 .
  • the first base point 1110 is displayed in the color in accordance with its object distance.
  • As shown in FIG. 21B , when the user moves the cursor 1100 , a base circle 1120 is displayed and three base plane composing points 1130 are displayed on the base circle 1120 .
  • a second base point is temporarily specified at the position of the cursor 1100 .
  • the base plane composing points 1130 are displayed in the color in accordance with their respective object distances.
  • the position of the temporarily-specified second base point changes in accordance with the movement of the cursor 1100 , and the base circle 1120 and the base plane composing points 1130 also change.
  • the base plane composing points 1130 are displayed in the color in accordance with their respective object distances.
  • the user performs an operation such as a click to specify (fix) the second base point in a state where the base circle 1120 is located slightly outside the burned portion (i.e., in a state where the base circle 1120 nearly surrounds the burned portion).
  • the user can know the object distances from the colors of the first base point and the base plane composing points, and use the object distances as a guide of measurement accuracy.
  • In the above description, the colors of the first base point and the base plane composing points are changed in accordance with their respective object distances; however, other display attributes, such as the size, may be changed instead.
  • Further, the object distance of the second base point may be calculated, and the second base point may be displayed in a color in accordance with its object distance.
  • In Step SA 11 or Step SB 11 , the control section 18 a may determine whether or not each of the object distances is equal to or less than a predetermined value, and the specification (fixing) of the first base point or the second base point may be prohibited when any of the object distances exceeds the predetermined value. That is, the first base point or the second base point may be specified (fixed) only when each of the object distances is equal to or less than the predetermined value. In this case, the processing proceeds to Step SC only when all of the object distances become equal to or less than the predetermined value. Further, a warning which indicates to the user that the base plane may not be appropriately set may be displayed when the variability of the object distances of the base plane composing points calculated in Step SB 11 is significant.
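  • The gating and warning logic just described might be sketched as follows; both threshold values are hypothetical.

```python
def may_fix_points(object_distances, limit=30.0, spread_warn=10.0):
    """Allow fixing a base point only when every object distance is at or
    below `limit`; warn when the distances of the base plane composing
    points vary strongly (base plane possibly not set appropriately)."""
    ok = all(z <= limit for z in object_distances)
    warn = (max(object_distances) - min(object_distances)) > spread_warn
    return ok, warn

# Example: composing points at 12, 14, and 41 mm -> not fixable, with warning.
print(may_fix_points([12.0, 14.0, 41.0]))
```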
  • According to the present embodiment, it is possible for the user to confirm the measurement accuracy using the object distance when specifying the first base point and the second base point. As a result, it is possible to reduce measurement error in cases where a base plane would otherwise be set at a position unsuitable for measurement. Further, by prohibiting the processing from proceeding to the calculation of the base plane and the spatial distance in Step SC until all of the object distances become equal to or less than the predetermined value, such measurement error can be reduced even more reliably.

Abstract

An endoscope apparatus includes: an imaging unit that captures a subject to acquire an image of the subject; a base point setting section that sets a first base point and a second base point on the image based on an instruction input via an input device; a base line setting section that sets a base line on the image based on the first base point and the second base point; a point setting section that sets at least three points on the image based on the base line; a base plane setting section that sets a base plane in a space based on the at least three points; a distance calculation section that calculates a distance between the base plane and a point corresponding to the first base point; and a display that displays the image.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to an endoscope apparatus with a measurement function. Furthermore, the present invention relates to a method of measuring a subject.
  • Priority is claimed on Japanese Patent Application No. 2010-051917, filed Mar. 9, 2010, the content of which is incorporated herein by reference.
  • 2. Description of Related Art
  • In gas turbines, which are mainly used in aircraft, internal portions reach a high temperature. This sometimes results in the production of a defect (burned portion) such as a burn or tarnish on a surface of a turbine blade. The size of the defect is one of the indices for determining whether to replace the blade or not, so the inspection of the defect is extremely important. An endoscope apparatus with a measurement function is used for inspecting blades. In the inspection of blades, the endoscope apparatus measures the defect based on an image where the defect is imaged (hereinafter, referred to as a measurement image) and displays a measurement result. A user checks the measurement result, and determines whether or not the blade needs replacing.
  • A function of a plane-based measurement is known as one of the functions provided in an endoscope apparatus (refer to, for example, Japanese Unexamined Patent Application, First Publication No. H2-296209). In the plane-based measurement, a spatial distance (i.e., three-dimensional distance) between a virtual plane (i.e., base plane) determined by spatial coordinates of three points which are specified on a measurement screen by a user, and spatial coordinates of one point which is specified on the measurement screen by the user, is measured. The base plane is a plane at the defect position that approximates a surface of the measurement target when no defect exists. By performing the plane-based measurement, it is possible to obtain the depth of a concave portion, the height of a convex portion, or the like which exists on the surface of a measurement target.
  • SUMMARY OF THE INVENTION
  • An endoscope apparatus according to an aspect of the present invention includes: an imaging unit that captures a subject to acquire an image of the subject; a base point setting section that sets a first base point and a second base point on the image based on an instruction input via an input device; a base line setting section that sets a base line on the image based on the first base point and the second base point; a point setting section that sets at least three points on the image based on the base line; a base plane setting section that sets a base plane in a space based on the at least three points; a distance calculation section that calculates a distance between the base plane and a point corresponding to the first base point; and a display that displays the image.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram showing a configuration of an endoscope apparatus according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing a configuration of a measurement processing portion provided in the endoscope apparatus according to the first embodiment of the present invention.
  • FIG. 3 is a reference diagram showing a measurement screen according to the first embodiment of the present invention.
  • FIG. 4 is a flow chart showing a procedure of measurement according to the first embodiment of the present invention.
  • FIGS. 5A to 5D are reference diagrams showing a state in which a first base point and a second base point are specified according to the first embodiment of the present invention.
  • FIGS. 6A to 6D are reference diagrams showing a state in which a first base point and a second base point are specified according to the first embodiment of the present invention.
  • FIGS. 7A and 7B are reference diagrams showing a measurement screen according to the first embodiment of the present invention.
  • FIG. 8 is a flow chart showing a procedure of measurement according to the first embodiment of the present invention.
  • FIG. 9 is a flow chart showing a procedure of measurement according to the first embodiment of the present invention.
  • FIG. 10 is a flow chart showing a procedure of measurement according to the first embodiment of the present invention.
  • FIGS. 11A and 11B are reference diagrams showing a calculation method of a base plane according to the first embodiment of the present invention.
  • FIGS. 12A to 12D are reference diagrams showing a calculation method of a base plane according to the first embodiment of the present invention.
  • FIGS. 13A to 13C are reference diagrams showing a calculation method of a spatial distance according to the first embodiment of the present invention.
  • FIGS. 14A to 14C are reference diagrams showing a base ellipse, a base rectangle, and a base triangle, respectively, according to a second embodiment of the present invention.
  • FIGS. 15A to 15C are reference diagrams showing a state in which a first base point and a second base point are specified according to the second embodiment of the present invention.
  • FIGS. 16A to 16C are reference diagrams showing a state in which a first base point and a second base point are specified according to the second embodiment of the present invention.
  • FIGS. 17A to 17C are reference diagrams showing a state in which a first base point and a second base point are specified according to the second embodiment of the present invention.
  • FIG. 18 is a reference diagram for explaining effects according to the second embodiment of the present invention.
  • FIG. 19 is a flow chart showing a procedure of measurement according to a third embodiment of the present invention.
  • FIG. 20 is a flow chart showing a procedure of measurement according to the third embodiment of the present invention.
  • FIGS. 21A to 21D are reference diagrams showing a state in which a first base point and a second base point are specified according to the third embodiment of the present invention.
  • FIG. 22 is a reference diagram for explaining a method of finding three-dimensional coordinates of a measurement point using the stereo measurement.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Hereunder is a description of embodiments of the present invention with reference to the drawings. By way of example, a measurement function for a defect is described for the case where a burned portion of a turbine blade is the measurement target.
  • First Embodiment
  • First, a first embodiment of the present invention will be described. FIG. 1 shows a configuration of an endoscope apparatus according to the present embodiment.
  • As shown in FIG. 1, an endoscope apparatus 1 includes: an endoscope 2; a main unit 3; a remote control 4 (input device); a liquid crystal monitor 5; optical adapters 7 a, 7 b, and 7 c; an endoscope unit 8; a camera control unit 9; and a control unit 10.
  • The endoscope 2 (videoscope), which captures an image of a measurement target to generate its image signal, includes a long and thin insertion portion 20. The insertion portion 20 includes: a rigid distal portion 21; a bent portion 22 capable of being bent, for example, in the vertical and horizontal directions; and a flexible tube portion 23, which are coupled in this order from the distal side. The proximal portion of the insertion portion 20 is connected to the endoscope unit 8. Various optical adapters, such as the optical adapter 7 a or 7 b for stereo having two observation fields of view (hereinafter, referred to as a stereo optical adapter) or the normal observation optical adapter 7 c having only one observation field of view, can be attached to the distal portion 21 in a freely detachable manner by, for example, threading.
  • The main unit 3 includes the endoscope unit 8; the camera control unit (hereinafter, referred to as CCU) 9 as an image processing device; and the control unit 10 as a control device. The endoscope unit 8 includes: a light source apparatus for supplying necessary illumination light at the time of observation; and a bending apparatus for bending the bent portion 22 that constitutes the insertion portion 20. The CCU 9 receives an image signal output from a solid-state imaging device 2 a built in the distal portion 21 of the insertion portion 20, converts the image signal into a video signal such as an NTSC signal, and supplies it to the control unit 10. The solid-state imaging device 2 a generates an image signal by performing photoelectric conversion on a subject image that has been formed through the optical adapter.
  • The control unit 10 includes: an audio signal processing circuit 11; a video signal processing circuit 12; a ROM 13; a RAM 14; a PC card interface (hereinafter, referred to as PC card I/F) 15; a USB interface (hereinafter, referred to as USB I/F) 16; an RS-232C interface (hereinafter, referred to as RS-232C I/F) 17; and a measurement processing portion 18.
  • An audio signal generated by collecting sound with the microphone 34 or an audio signal obtained by playing a recording medium such as a memory card is supplied to the audio signal processing circuit 11. To display a synthesized image obtained by synthesizing the endoscope image supplied from the CCU 9 with a graphical operation menu, the video signal processing circuit 12 performs processing of synthesizing the video signal from the CCU 9 with a graphic image signal such as an operation menu generated through the control by the measurement processing portion 18. In addition, to display a video on the screen of the liquid crystal monitor 5, the video signal processing circuit 12 subjects the video signal after the synthesis to predetermined processing, and supplies it to the liquid crystal monitor 5.
  • The video signal processing circuit 12 outputs image data, which is based on the video signal from the CCU 9, to the measurement processing portion 18. At the time of measurement, a stereo optical adapter is attached to the distal portion 21, and a plurality of subject images relating to the same subject as a measurement target are included in the image based on the image data from the video signal processing circuit 12. In the present embodiment, a pair of left and right subject images is included, by way of example.
  • A memory card (recording medium) such as a PCMCIA memory card 32 or a flash memory card 33 is freely attached to or detached from the PC card I/F 15. When the memory card is attached to the PC card I/F 15, control processing information, image information, optical data, or the like that is stored in the memory card can be taken in, or control processing information, image information, optical data, or the like can be stored in the memory card, in accordance with the control of the measurement processing portion 18.
  • The USB I/F 16 is an interface which electrically connects the main unit 3 and a personal computer (PC) 31 to each other. When the main unit 3 and the personal computer 31 are connected to each other through the USB I/F 16, it is possible to perform various kinds of instructions and controls at the personal computer 31 side, such as an instruction to display an endoscope image or to perform image processing during measurement. In addition, it is possible to input and output various pieces of processing information, data, and the like between the main unit 3 and the personal computer 31.
  • The RS-232C I/F 17 is connected to the CCU 9, the endoscope unit 8, and the remote control 4 which performs control and operation instructions of the CCU 9, the endoscope unit 8, and the like. When a user operates the remote control 4, a communication required for controlling the CCU 9 and the endoscope unit 8 is performed based on the user's operation of the remote control 4.
  • The measurement processing portion 18 executes a program stored in the ROM 13, to thereby take in the image data from the video signal processing circuit 12 and perform measurement processing based on the image data. The RAM 14 is used by the measurement processing portion 18 as a work area for temporarily storing data. FIG. 2 shows a configuration of the measurement processing portion 18. As shown in FIG. 2, the measurement processing portion 18 includes: a control section 18 a; a base point specification section 18 b; a base line calculation section 18 c; a point calculation section 18 d; a spatial-coordinate calculation section 18 e; a base plane calculation section 18 f; a distance calculation section 18 g; and a storage section 18 h.
  • The control section 18 a controls the various sections of the measurement processing portion 18. Furthermore, the control section 18 a has a function of generating a graphic image signal for displaying the measurement result, the operation menu, and the like on the liquid crystal monitor 5, and of outputting the graphic image signal to the video signal processing circuit 12.
  • The base point specification section 18 b specifies a base point on the measurement target based on a signal input from the remote control 4 or the PC 31 (input portion). When the user inputs a base point while looking at the image of the measurement target displayed on the liquid crystal monitor 5, its coordinates are calculated by the base point specification section 18 b. In the following description, it is assumed that the user operates the remote control 4. However, the same applies to the case where the user operates the PC 31. In the present embodiment, two base points are set on the image.
  • The base line calculation section 18 c sets a base line whose shape or size is determined based on the two base points specified by the base point specification section 18 b, and calculates image coordinates of the base line (or the equation of the base line used to determine its image coordinates). In the present embodiment, a base circle is used as the base line. The base circle is set so as to surround a part of or all of a feature region (a burned portion in the present embodiment). In the present specification, two-dimensional coordinates on an image displayed on the liquid crystal monitor 5 are described as “image coordinates”, and three-dimensional coordinates in the actual space are described as “spatial coordinates”. The point calculation section 18 d sets three or four base plane composing points that constitute a base plane with reference to the position of the base circle, and calculates their image coordinates. In the present embodiment, the base plane composing points are set on the base circle. The spatial-coordinate calculation section 18 e calculates spatial coordinates which correspond to the image coordinates.
  • The base plane calculation section 18 f sets a base plane based on the three or four spatial coordinates which correspond to the base plane composing points, and calculates spatial coordinates of the base plane (or the equation of the base plane used to determine its spatial coordinates). The base plane is a plane, at the defect position, which approximates a surface of the measurement target in the case in which no defect exists. The distance calculation section 18 g calculates a spatial distance between the base plane and a point in the space corresponding to one of the two base points specified by the base point specification section 18 b. This spatial distance corresponds to the depth of a concave portion or the height of a convex portion (the depth of the burned portion in the present embodiment) which exists on the surface of the measurement target. The storage section 18 h stores various pieces of information that are processed in the measurement processing portion 18. The various pieces of information stored in the storage section 18 h are appropriately read by the control section 18 a and are then output to the appropriate sections.
  • Next, the way of calculating three-dimensional coordinates of a measurement point by the stereo measurement will be described with reference to FIG. 22. For images that are imaged by the left side and right side optical systems, three-dimensional coordinates (X, Y, Z) of a measurement target point 60 are calculated by the triangulation method using the following Equations (a) to (c). Note that it is assumed that the coordinates of a measurement point 61 and a corresponding point 62 (a point on the right image that corresponds to the measurement point 61 on the left image) on the left and right images that have been subjected to distortion correction are (XL, YL) and (XR, YR), respectively, the distance between optical centers 63 and 64 on the left side and the right side is D, the focal length is F, and t=D/(XL−XR).

  • X = t × XR + D/2  (a)

  • Y = t × YR  (b)

  • Z = t × F  (c)
  • When the coordinates of the measurement point 61 and the corresponding point 62 are determined in the aforementioned manner, the three-dimensional coordinates of the measurement target point 60 are found using the parameters D and F. By calculating the three-dimensional coordinates of a number of points, various measurements such as a point-to-point distance, the distance between a line connecting two points and one point, surface area, depth, and surface shape are possible. Furthermore, it is possible to calculate the distance (object distance) from the left-side optical center 63 or the right-side optical center 64 to the subject. In order to carry out the aforementioned stereo measurement, optical data that shows the characteristics of the optical system including the distal portion 21 and the stereo optical adapter is required. Note that the details of the optical data are disclosed, for example, in Japanese Unexamined Patent Application, First Publication No. 2004-49638, so an explanation thereof will be omitted here.
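  • For reference, the triangulation of Equations (a) to (c) is compact enough to express in a few lines of code. The following Python sketch is illustrative only (the function name and argument order are not from the patent); it assumes distortion-corrected image coordinates and known stereo parameters D and F:

    def triangulate(xl, yl, xr, yr, d, f):
        # Three-dimensional coordinates of a target point from distortion-corrected
        # left/right image coordinates, per Equations (a) to (c).
        # d: distance between the left and right optical centers; f: focal length.
        t = d / (xl - xr)      # scale factor; assumes nonzero disparity XL - XR
        x = t * xr + d / 2.0   # Equation (a)
        y = t * yr             # Equation (b)
        z = t * f              # Equation (c)
        return x, y, z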
  • Next is a description of a measurement screen in the present embodiment. In the present embodiment, measurement of a defect is performed by using the stereo measurement. In the stereo measurement, a measurement target is imaged in a state with the stereo optical adapter attached to the distal portion 21 of the endoscope 2. Therefore, a pair of left and right images of the measurement target is displayed on the measurement screen. Note that measurement of a defect may be performed by using a measurement method other than the stereo measurement.
  • FIG. 3 shows a measurement screen before the start of measurement. As measurement information, a left image of the measurement target is displayed on a left screen 700, and a right image of the measurement target is displayed on a right screen 710. Optical adapter name information 720, time information 721, message information 722, icons 723 a, 723 b, 723 c, and 723 d, and a zoom window 724 are displayed on the measurement screen in a region outside the left screen 700 and the right screen 710, as other pieces of measurement information.
  • The optical adapter name information 720 and the time information 721 are pieces of information showing measurement conditions. The optical adapter name information 720 is textual information showing the name of the optical adapter in current use. The time information 721 is textual information showing the current date and time. The message information 722 includes: textual information showing an operational instruction for the user; and textual information showing coordinates of a base point, which is one of the measurement conditions.
  • The icons 723 a to 723 d constitute an operation menu for the user to input operational instructions such as switching measurement modes and clearing a measurement result. When the user operates the remote control 4 to move a cursor 725 onto any of the icons 723 a to 723 d and performs an operation such as a click in this state, a signal corresponding to the operation is input to the measurement processing portion 18. Based on the signal, the control section 18 a recognizes the operational instruction from the user, and controls the measurement processing. In addition, an enlarged image of the measurement target located around the cursor 725 is displayed on the zoom window 724.
  • Next, a procedure of measurement in the present embodiment will be described. FIG. 4 shows a procedure of measurement. In Step SA, the user operates the remote control 4 to specify a first base point on the measurement screen displayed on the liquid crystal monitor 5. As shown in FIG. 5A, the user moves a cursor 500 displayed on the left screen of the measurement screen and performs an operation such as a click to specify a first base point 510. The first base point becomes the center position of a base circle, and the depth of a concave portion or the height of a convex portion is measured at the first base point. It is preferable for the user to specify the first base point at a point located nearly in the center of the inside of the burned portion. In Step SA, when the user inputs an instruction of specifying the first base point, the base point specification section 18 b recognizes image coordinates at the current cursor position as image coordinates of the first base point. Details of Step SA will be described later.
  • In Step SB, the user operates the remote control 4 to specify a second base point on the measurement screen displayed on the liquid crystal monitor 5. As shown in FIG. 5B, when the user moves the cursor 500, a base circle 520 having a size in accordance with the position of the cursor 500 is displayed, and three base plane composing points 530 are displayed on the base circle 520. At this time, a second base point is temporarily specified at the position of the cursor. The base circle has a diameter twice as long as the distance between the first base point and the second base point. The three base plane composing points are set on the base circle in an evenly spaced manner, and one of the three base plane composing points is at the same position as the cursor.
  • As shown in FIG. 5C, when the user further moves the cursor 500, the position of the temporarily-specified second base point changes in accordance with the movement of the cursor 500, and the base circle 520 and the base plane composing points 530 also change. Then, as shown in FIG. 5D, the user performs an operation such as a click to specify (fix) the second base point in a state where the base circle 520 is located slightly outside the burned portion (i.e., in a state where the base circle 520 nearly surrounds the burned portion). The base circle may be set in a state where the base circle does not completely surround the burned portion. However, it is preferable that all of the base plane composing points are located outside the burned portion in order to enhance the measurement accuracy. In Step SB, when the user inputs an instruction of specifying (fixing) the second base point, the base point specification section 18 b recognizes image coordinates at the current cursor position as image coordinates of the second base point. Details of Step SB will be described later.
  • In the example shown in FIGS. 5A to 5D, the number of base plane composing points is three. However, the number of base plane composing points may be four or more. In the case in which the number of base plane composing points is four, as shown in FIG. 6A, the user moves a cursor 600 and performs an operation such as a click to specify a first base point 610. Then, when the user further moves the cursor 600, as shown in FIG. 6B, a base circle 620 is displayed and four base plane composing points 630 are displayed on the base circle 620. At this time, a second base point is temporarily specified at the position of the cursor. The four base plane composing points are set on the base circle in an evenly spaced manner, and one of the four base plane composing points is at the same position as the cursor.
  • As shown in FIG. 6C, when the user further moves the cursor 600, the position of the temporarily-specified second base point changes in accordance with the movement of the cursor 600, and the base circle 620 and the base plane composing points 630 also change. Then, as shown in FIG. 6D, the user performs an operation such as a click to specify (fix) the second base point in a state where the base circle 620 is located slightly outside the burned portion (i.e., in a state where the base circle 620 nearly surrounds the burned portion).
  • In Step SC, the base plane composing points and the base plane are calculated, and a spatial distance (depth or height: the depth of the burned portion in the present embodiment) between the base plane and a point in the space corresponding to the first base point is calculated. Details of Step SC will be described later.
  • In Step SD, the control section 18 a generates a graphic image signal for displaying the base circle, the base plane composing points, and the first base point, and outputs it to the video signal processing circuit 12. As a result, the base circle, the base plane composing points, and the first base point are displayed on the left screen. In Step SE, the control section 18 a generates a graphic image signal for displaying the spatial distance between the base plane and the point in the space corresponding to the first base point, and outputs it to the video signal processing circuit 12. As a result, the spatial distance is displayed on the left screen.
  • FIGS. 7A and 7B show measurement screens when the measurement result of the burned portion is displayed. FIG. 7A shows a measurement screen in the case in which the number of base plane composing points is three. Base plane composing points 800 and a first base point 810 are displayed on the left screen. Specifically, the base plane composing points 800 are displayed as unfilled diamond marks, and they are connected with each other. The first base point 810 is displayed as an “x” mark. Further, an intersection point 820 (i.e., foot of a perpendicular) between the base plane and a line which is perpendicular to the base plane and passes through the point in the space corresponding to the first base point 810 is displayed as a small filled square.
  • A result window 830 is displayed on the right screen. The image of the measurement target is displayed in the upper portion of the result window 830, and the spatial distance in text is displayed in the lower portion of the result window 830. D denotes the spatial distance (i.e., the depth of the burned portion). FIG. 7B shows a measurement screen in the case in which the number of base plane composing points is four. FIG. 7B is similar to FIG. 7A except that four base plane composing points 800 are displayed.
  • Next, a procedure of Step SA (i.e., processing of specifying a first base point) will be described. FIG. 8 shows a procedure of Step SA. At the start time of Step SA, a measurement screen is displayed and a cursor is displayed on the measurement screen. In Step SA1, a signal indicating a movement amount of the cursor by the user's operation of the remote control 4 is input into the measurement processing portion 18. In Step SA2, the base point specification section 18 b calculates image coordinates of the cursor at the next time by calculating the movement amount of the cursor based on the signal from the remote control 4, and adding the calculated movement amount to the position of the cursor at the current time.
  • In Step SA3, the control section 18 a generates a graphic image signal for displaying the cursor at the image coordinates calculated by the base point specification section 18 b, and outputs it to the video signal processing circuit 12. As a result, the cursor is displayed at the position the user specifies.
  • In Step SA4, the control section 18 a determines, based on the signal from the remote control 4, whether or not an operation, such as a click, of specifying a first base point has been input. When an operation of specifying a first base point has not been input, the processing returns to Step SA1. When an operation of specifying a first base point has been input, the processing proceeds to Step SA5, and the base point specification section 18 b recognizes the image coordinates calculated in Step SA2 as image coordinates of the first base point. As a result, the first base point is fixed.
  • In Step SA6, the control section 18 a generates a graphic image signal for displaying the first base point at the above image coordinates, and outputs it to the video signal processing circuit 12. As a result, the first base point is displayed at the same position as the cursor. Then, the processing proceeds to Step SB.
  • Next, a procedure of Step SB (i.e., processing of specifying a second base point) will be described. FIG. 9 shows a procedure of Step SB. In Step SB1, a signal indicating a movement amount of the cursor by the user's operation of the remote control 4 is input into the measurement processing portion 18. In Step SB2, the base point specification section 18 b calculates image coordinates of the cursor at the next time by calculating the movement amount of the cursor based on the signal from the remote control 4, and adding the calculated movement amount to the position of the cursor at the current time. Further, the base point specification section 18 b recognizes the above image coordinates as image coordinates of the second base point which is temporarily specified by the user.
  • In Step SB3, the control section 18 a generates a graphic image signal for displaying the cursor at the image coordinates calculated by the base point specification section 18 b, and outputs it to the video signal processing circuit 12. As a result, the cursor is displayed at the position the user specifies. In Step SB4, the base line calculation section 18 c calculates image coordinates of the base circle (or the equation of the base circle used to determine its image coordinates) based on the image coordinates of the first base point and the image coordinates of the cursor calculated in Step SB2 (i.e., the image coordinates of the temporarily-specified second base point). The center of the base circle is at the first base point, and the radius of the base circle is equal to the distance between the image coordinates of the first base point and the image coordinates of the cursor.
  • In Step SB5, the point calculation section 18 d calculates image coordinates of three or four base plane composing points that constitute a base plane based on the image coordinates of the base circle. As described above, although the base plane composing points are set on the base circle in an evenly spaced manner in the present embodiment, they may be set unevenly. In Step SB6, the control section 18 a generates a graphic image signal for displaying the base circle and the base plane composing points, and outputs it to the video signal processing circuit 12. As a result, the base circle and the base plane composing points are displayed.
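  • As a concrete illustration of Steps SB4 and SB5, the base circle and the evenly spaced base plane composing points can be computed from the first base point and the cursor position as follows. This is a sketch of the geometry described above, not code from the patent (names are illustrative; one composing point coincides with the cursor):

    import math

    def base_circle(first_base, cursor):
        # Center is the first base point; radius is the distance to the cursor (Step SB4).
        cx, cy = first_base
        r = math.hypot(cursor[0] - cx, cursor[1] - cy)
        return (cx, cy), r

    def composing_points(first_base, cursor, n=3):
        # n points evenly spaced on the base circle, starting at the cursor (Step SB5).
        (cx, cy), r = base_circle(first_base, cursor)
        start = math.atan2(cursor[1] - cy, cursor[0] - cx)
        return [(cx + r * math.cos(start + 2.0 * math.pi * k / n),
                 cy + r * math.sin(start + 2.0 * math.pi * k / n))
                for k in range(n)]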
  • In Step SB7, the control section 18 a determines, based on the signal from the remote control 4, whether or not an operation, such as a click, of specifying (fixing) a second base point has been input. When an operation of specifying (fixing) a second base point has not been input, the processing returns to Step SB1. When an operation of specifying (fixing) a second base point has been input, the processing proceeds to Step SB8, and the base point specification section 18 b recognizes the image coordinates calculated in Step SB2 as image coordinates of the fixed second base point. Then, the processing proceeds to Step SC.
  • Next, a procedure of Step SC (i.e., calculation of the depth or height) will be described. FIG. 10 shows a procedure of Step SC. In Step SC1, the base line calculation section 18 c calculates image coordinates of the base circle (or the equation of the base circle used to determine its image coordinates) based on the image coordinates of the first base point and the image coordinates of the second base point. The center of the base circle is at the first base point, and the radius of the base circle is equal to the distance between the image coordinates of the first base point and the image coordinates of the second base point.
  • In Step SC2, the point calculation section 18 d calculates image coordinates of three or four base plane composing points which constitute a base plane based on the image coordinates of the base circle. The base circle and the base plane composing points which were calculated in the previous Steps SB4 and SB5 may also be used in the subsequent processing instead of calculating the base circle and the base plane composing points in Steps SC1 and SC2.
  • In Step SC3, the point calculation section 18 d performs matching processing in which image coordinates of corresponding points (matching points) on the right image which correspond to the image coordinates of the first base point and the base plane composing points on the left image are calculated by pattern-matching. In Step SC4, the spatial-coordinate calculation section 18 e calculates spatial coordinates of a point in the space which corresponds to the first base point, based on the image coordinates of the first base point and the image coordinates of its matching point. Further, the spatial-coordinate calculation section 18 e calculates spatial coordinates of three or four points in the space which correspond to the base plane composing points, based on the image coordinates of the base plane composing points and the image coordinates of their matching points.
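  • The patent does not specify the pattern-matching algorithm used in Step SC3. One common choice for rectified stereo pairs is normalized cross-correlation along the corresponding scan line; the sketch below is an assumption in that spirit (window size, search range, and search direction are illustrative, and bounds checks and subpixel refinement are omitted):

    import numpy as np

    def match_point(left, right, x, y, win=7, search=60):
        # Find the best-matching x on the right image for the template
        # centered at (x, y) on the left image, by normalized cross-correlation.
        h = win // 2
        tpl = left[y - h:y + h + 1, x - h:x + h + 1].astype(float)
        tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-9)
        best_score, best_x = -np.inf, x
        for xr in range(max(h, x - search), x + 1):   # disparity assumed non-negative
            cand = right[y - h:y + h + 1, xr - h:xr + h + 1].astype(float)
            cand = (cand - cand.mean()) / (cand.std() + 1e-9)
            score = float((tpl * cand).mean())
            if score > best_score:
                best_score, best_x = score, xr
        return best_x, y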
  • In Step SC5, the base plane calculation section 18 f sets a base plane based on the spatial coordinates of the three or four points calculated in Step SC4, and calculates spatial coordinates of the base plane (or the equation of the base plane used to determine its spatial coordinates). Details of the calculation method of the base plane will be described later. In Step SC6, the distance calculation section 18 g calculates a spatial distance between the base plane and the spatial coordinates of the point in the space corresponding to the first base point. This spatial distance corresponds to the depth of the burned portion in the present embodiment. Details of the calculation method of the spatial distance will be described later. Then, the processing proceeds to Step SD.
  • Next, the calculation method of the base plane will be described. First, details of the calculation method of the base plane in the case in which the number of base plane composing points is three will be described with reference to FIGS. 11A and 11B. As shown in FIG. 11A, it is assumed that spatial points of the three base plane composing points are P1, P2, and P3, respectively, and their spatial coordinates are expressed as follows.

  • P1:(x1,y1,z1)

  • P2:(x2,y2,z2)

  • P3:(x3,y3,z3)
  • As shown in FIG. 11B, when a plane passing through the points P1 to P3 is defined as a base plane, the equation of the base plane is obtained as follows. The base plane is defined by the following Equation (1).

  • Ax + By + Cz = 1  (1)
  • Since the base plane passes through the points P1 to P3, the following Equations (2) to (4) can be obtained.

  • Ax1 + By1 + Cz1 = 1  (2)

  • Ax2 + By2 + Cz2 = 1  (3)

  • Ax3 + By3 + Cz3 = 1  (4)
  • The values of A, B, and C can be calculated using the Equations (2) to (4). By the above-described method, the base plane can be calculated.
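  • In code, Equations (2) to (4) are a 3×3 linear system in the unknowns A, B, and C. A minimal sketch (assuming the three spatial points are not collinear and the plane does not pass through the origin, which the form Ax + By + Cz = 1 requires):

    import numpy as np

    def base_plane_3pts(p1, p2, p3):
        # Solve Equations (2)-(4): each row (xi, yi, zi) dotted with (A, B, C) equals 1.
        m = np.array([p1, p2, p3], dtype=float)
        a, b, c = np.linalg.solve(m, np.ones(3))
        return a, b, c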
  • Next, details of the calculation method of the base plane in the case in which the number of base plane composing points is four will be described with reference to FIGS. 12A to 12D. As shown in FIG. 12A, it is assumed that spatial points of the four base plane composing points are P1, P2, P3, and P4, respectively, and their spatial coordinates are expressed as follows.

  • P1:(x1,y1,z1)

  • P2:(x2,y2,z2)

  • P3:(x3,y3,z3)

  • P4:(x4,y4,z4)
  • When it is assumed that the gravity point of the points P1 to P4 is P0, its spatial coordinates are expressed by the following Equation (5).
  • P0: (x0, y0, z0) = ((x1 + x2 + x3 + x4)/4, (y1 + y2 + y3 + y4)/4, (z1 + z2 + z3 + z4)/4)  (5)
  • As shown in FIG. 12B, when it is assumed that a line in the space passing through the points P1 and P2 is L1 and a line in the space passing through the points P3 and P4 is L2, the lines L1 and L2 are expressed by the following Equations (6) and (7), respectively.
  • L1: (x − x1)/(x1 − x2) = (y − y1)/(y1 − y2) = (z − z1)/(z1 − z2)  (6)
  • L2: (x − x3)/(x3 − x4) = (y − y3)/(y3 − y4) = (z − z3)/(z3 − z4)  (7)
  • As shown in FIG. 12C, directional vectors V and W of the lines L1 and L2 are expressed by the following Equations (8) and (9), respectively.

  • V = (vx, vy, vz) = (x1 − x2, y1 − y2, z1 − z2)  (8)

  • W = (wx, wy, wz) = (x3 − x4, y3 − y4, z3 − z4)  (9)
  • As shown in FIG. 12D, when a plane which passes through the gravity point P0 and whose normal vector is perpendicular to each of the directional vectors of the lines L1 and L2 is defined as a base plane, the equation of the base plane is obtained as follows. The base plane is defined by the following Equation (10).

  • Ax + By + Cz = 1  (10)
  • Since the base plane passes through the gravity point P0, the following Equation (11) can be obtained.

  • Ax0 + By0 + Cz0 = 1  (11)
  • Further, the normal vector I of the base plane is expressed by I = (A, B, C). Since the normal vector I of the base plane is perpendicular to each of the directional vectors V and W of the lines L1 and L2, the inner product between the directional vector V and the normal vector I becomes 0, and the inner product between the directional vector W and the normal vector I also becomes 0, as shown in the following Equations (12) and (13).

  • Avx + Bvy + Cvz = 0  (12)

  • Awx + Bwy + Cwz = 0  (13)
  • The values of A, B, and C can be calculated using the Equations (11) to (13). By the above-described method, the base plane can be obtained.
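  • Equations (11) to (13) likewise form a 3×3 linear system: one row for the gravity point P0 (right-hand side 1) and one row for each direction vector (right-hand side 0). A sketch under the same assumptions as above:

    import numpy as np

    def base_plane_4pts(p1, p2, p3, p4):
        pts = np.array([p1, p2, p3, p4], dtype=float)
        p0 = pts.mean(axis=0)     # Equation (5): gravity point
        v = pts[0] - pts[1]       # Equation (8): direction vector of line L1
        w = pts[2] - pts[3]       # Equation (9): direction vector of line L2
        m = np.array([p0, v, w])
        a, b, c = np.linalg.solve(m, np.array([1.0, 0.0, 0.0]))  # Equations (11)-(13)
        return a, b, c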
  • Next, the calculation method of the spatial distance (depth or height) in Step SC6 will be described with reference to FIGS. 13A to 13C. As shown in FIG. 13A, it is assumed that a spatial point which corresponds to the first base point is Pb, and its spatial coordinates are expressed as follows.

  • Pb:(xb,yb,zb)
  • As shown in FIG. 13B, a spatial line L3 which passes through the spatial point Pb and whose directional vector is parallel to the normal vector I of the base plane is expressed by the following Equation (14).
  • L3: (x − xb)/A = (y − yb)/B = (z − zb)/C  (14)
  • As shown in FIG. 13C, when it is assumed that an intersection point between the line L3 and the base plane is Pf, its spatial coordinates are expressed by the following Equation (15) using the Equations of the line L3 and the base plane. The intersection point Pf is the same as a foot of a perpendicular with respect to the base plane from the spatial point Pb.
  • Pf: (xb − A(Axb + Byb + Czb − 1)/(A² + B² + C²), yb − B(Axb + Byb + Czb − 1)/(A² + B² + C²), zb − C(Axb + Byb + Czb − 1)/(A² + B² + C²))  (15)
  • Accordingly, the spatial distance D between the spatial point Pb and the intersection point Pf is equal to the distance between the spatial point Pb and the base plane. The spatial distance D is expressed by the following Equation (16).
  • D = |Axb + Byb + Czb − 1| / √(A² + B² + C²)  (16)
  • When the spatial coordinates of the spatial point Pb are compared with the spatial coordinates of the intersection point Pf and the spatial point Pb is positioned further toward the positive side in the z-direction than the intersection point Pf, the spatial point Pb is regarded as being lower than the base plane (i.e., in a concave portion of the measurement target), and −D is taken as the spatial distance (depth). On the other hand, when the spatial point Pb is positioned further toward the negative side in the z-direction than the intersection point Pf, the spatial point Pb is regarded as being higher than the base plane (i.e., in a convex portion of the measurement target), and +D is taken as the spatial distance (height). By the above-described method, the spatial distance (depth or height) can be calculated. The calculation method of the spatial distance in the case in which the number of base plane composing points is four is the same as described above.
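  • Putting Equations (14) to (16) and the sign convention together, the signed spatial distance can be computed directly from the plane coefficients; the sketch below follows the z-direction rule described above (function and variable names are illustrative):

    import math

    def signed_distance(a, b, c, pb):
        # Distance between spatial point Pb and the base plane Ax + By + Cz = 1,
        # negative for a concave portion (depth), positive for a convex portion (height).
        xb, yb, zb = pb
        s = a * xb + b * yb + c * zb - 1.0
        n2 = a * a + b * b + c * c
        d = abs(s) / math.sqrt(n2)     # Equation (16)
        zf = zb - c * s / n2           # z-coordinate of the foot Pf, from Equation (15)
        return -d if zb > zf else d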
  • Although the base plane is calculated by using the spatial coordinates of the three or four base plane composing points in the above-described methods, the base plane may be calculated by using spatial coordinates of five or more base plane composing points. Further, although the spatial distance is calculated based on the base plane in the above-described methods, the spatial distance between the gravity point P0 and the spatial point Pb may be regarded as a spatial distance which corresponds to the depth or height without obtaining the base plane.
  • In the present embodiment, the second base point is set after the first base point is set. However, the second base point may be set before the first base point is set. For example, in FIGS. 5A to 5D, the user may specify (fix) the second base point such that the second base point is located slightly outside the burned portion by performing an operation such as a click, and then the user may move the cursor 500 and perform an operation such as a click to specify the first base point 510.
  • As described above, according to the present embodiment, when two points of the first base point and the second base point are set, a spatial distance between the base plane and a point in the space corresponding to the first base point can be calculated. Therefore, it is possible to reduce the burden of operation at the time of the plane-based measurement and improve the operability of the apparatus.
  • Further, the user can confirm the accuracy of the plane-based measurement by displaying the base circle and the base plane composing points. For example, the user can presume that a certain level of accuracy is obtained when the base circle and all of the base plane composing points surround the burned portion, and that the accuracy degrades when a portion of the base circle passes inside the burned portion or some of the base plane composing points are located inside the burned portion.
  • Further, in Step SB, the procedure does not proceed to the processing of calculating a base plane and a spatial distance until an operation of specifying (fixing) the second base point is performed, and the base circle and the base plane composing points on the measurement screen are updated in accordance with the movement of the cursor. As a result, the user can confirm the base circle and the base plane composing points before the measurement is performed.
  • Second Embodiment
  • Next, a second embodiment of the present invention will be described. A configuration of an endoscope apparatus according to the present embodiment is the same as the configuration of the endoscope apparatus of the first embodiment. The present embodiment makes it possible to specify a second base point more flexibly.
  • Next is a description of the terms used in the present embodiment. The term “base ellipse” is an ellipse as shown in FIG. 14A that is set and displayed on the measurement screen when the user specifies a second base point. A base ellipse 900 has one diameter whose length is twice as long as the distance in the horizontal direction between a first base point 910 and a cursor 920, and the other diameter whose length is twice as long as the distance in the vertical direction between the first base point 910 and the cursor 920.
  • The term “base rectangle” is a rectangle as shown in FIG. 14B that is set and displayed on the measurement screen when the user specifies a second base point. A base rectangle 930 has one side whose length is twice as long as the distance in the horizontal direction between a first base point 910 and a cursor 920, and the other side whose length is twice as long as the distance in the vertical direction between the first base point 910 and the cursor 920.
  • The term “base triangle” is a triangle as shown in FIG. 14C that is set and displayed on the measurement screen when the user specifies a second base point. A base triangle 940 has a base whose length is twice as long as the distance in the horizontal direction between a first base point 910 and a cursor 920, and a height which is twice as long as the distance in the vertical direction between the first base point 910 and the cursor 920.
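  • For illustration, the composing points implied by the three base line shapes can be derived from the first base point and the cursor as in the sketch below. The horizontal and vertical offsets dx and dy follow the definitions above; the exact placement of the base triangle relative to the first base point is not fully specified in the text, so the apex-up placement here is an assumption:

    def shape_points(first_base, cursor, shape="ellipse"):
        x1, y1 = first_base
        dx = abs(cursor[0] - x1)
        dy = abs(cursor[1] - y1)
        if shape == "ellipse":      # end points of the two diameters (FIG. 14A)
            return [(x1 - dx, y1), (x1 + dx, y1), (x1, y1 - dy), (x1, y1 + dy)]
        if shape == "rectangle":    # the four corners (FIG. 14B)
            return [(x1 - dx, y1 - dy), (x1 + dx, y1 - dy),
                    (x1 + dx, y1 + dy), (x1 - dx, y1 + dy)]
        if shape == "triangle":     # base end points and apex (FIG. 14C; placement assumed)
            return [(x1 - dx, y1 + dy), (x1 + dx, y1 + dy), (x1, y1 - dy)]
        raise ValueError("unknown shape: " + shape)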
  • Next, the method of specifying first and second base points will be described with reference to FIGS. 15A to 17C.
  • The method in the case of using the base ellipse is as follows. As shown in FIG. 15A, the user moves a cursor 1000 displayed on the left screen of the measurement screen and performs an operation such as a click to specify a first base point 1010. Then, as shown in FIGS. 15B and 15C, when the user moves the cursor 1000, a base ellipse 1020 and base plane composing points 1030 are displayed. At this time, a second base point is temporarily specified at the position of the cursor 1000.
  • The base ellipse 1020 has one diameter whose length is twice as long as the distance in the horizontal direction between the first base point 1010 and the cursor 1000, and the other diameter whose length is twice as long as the distance in the vertical direction between the first base point 1010 and the cursor 1000. The four base plane composing points 1030 are set on the base ellipse 1020 at the end points of the major and minor diameters. The user performs an operation such as a click to specify (fix) the second base point in a state where the four base plane composing points 1030 on the base ellipse 1020 are located slightly outside the burned portion (i.e., in a state where the base ellipse 1020 nearly surrounds the burned portion).
  • The method in the case of using the base rectangle is as follows. As shown in FIG. 16A, the user moves a cursor 1000 displayed on the left screen of the measurement screen and performs an operation such as a click to specify a first base point 1010. Then, as shown in FIGS. 16B and 16C, when the user moves the cursor 1000, a base rectangle 1040 and base plane composing points 1050 are displayed. At this time, a second base point is temporarily specified at the position of the cursor 1000.
  • The base rectangle 1040 has one side whose length is twice as long as the distance in the horizontal direction between the first base point 1010 and the cursor 1000, and the other side whose length is twice as long as the distance in the vertical direction between the first base point 1010 and the cursor 1000. The four base plane composing points 1050 are set at the corners of the base rectangle 1040. The user performs an operation such as a click to specify (fix) the second base point in a state where the four base plane composing points 1050 on the base rectangle 1040 are located slightly outside the burned portion (i.e., in a state where the base rectangle 1040 nearly surrounds the burned portion).
  • The method in the case of using the base triangle is as follows. As shown in FIG. 17A, the user moves a cursor 1000 displayed on the left screen of the measurement screen and performs an operation such as a click to specify a first base point 1010. Then, as shown in FIGS. 17B and 17C, when the user moves the cursor 1000, a base triangle 1060 and base plane composing points 1070 are displayed. At this time, a second base point is temporarily specified at the position of the cursor 1000.
  • The base triangle 1060 has a base whose length is twice as long as the distance in the horizontal direction between the first base point 1010 and the cursor 1000, and a height which is twice as long as the distance in the vertical direction between the first base point 1010 and the cursor 1000. The three base plane composing points 1070 are set on the base triangle 1060 at the two edge points of the base and at the apex. The user performs an operation such as a click to specify (fix) the second base point in a state where the three base plane composing points 1070 on the base triangle 1060 are located slightly outside the burned portion.
  • As described above, according to the present embodiment, it is possible to flexibly set a base line (i.e., base ellipse, base rectangle, or base triangle) with various shapes and sizes. Further, it is possible to obtain the following effects.
  • As shown in FIG. 18, the case in which a concave portion 121 as a measurement target exists on an image 120 obtained by capturing the surface of the measurement target will be illustrated. The lower portion in FIG. 18 shows a cross-sectional surface of the measurement target. As described above, the first base point, which specifies the position at which the depth of the concave portion 121 is measured, and the base plane composing points, which determine a base plane approximating a surface 122 of the measurement target, are set in accordance with the user's operation.
  • For example, when points A, B1, and C1 are set as base plane composing points and a point D is set as a first base point, the depth of the concave portion 121 becomes d1. On the other hand, when points A, B2, and C2 are specified as base plane composing points and the point D is set as a first base point, the depth of the concave portion 121 becomes d2.
  • When the points A, B1, and C1 are set as base plane composing points, a base plane S1 passing through these points approximates the surface 122 inside a pipe with relatively high accuracy. Therefore, an error of the depth d1 as the measurement result with respect to the actual depth is small. However, when the points A, B2, and C2 are set as base plane composing points, the degree of approximation of a base plane S2 passing through these points with respect to the surface 122 inside the pipe is low compared with the base plane S1. Therefore, an error of the depth d2 as the measurement result with respect to the actual depth becomes large.
  • As described above, errors in the measurement result may occur depending on the positions of the base plane composing points. Therefore, it is desirable that the base plane composing points be set at positions where errors in the measurement result become small. In the present embodiment, the base line can be flexibly set by using a base ellipse, a base rectangle, or a base triangle in accordance with the shape and the size of the burned portion. Therefore, it is possible to reduce errors in the measurement result.
  • Third Embodiment
  • Next, a third embodiment of the present invention will be described. A configuration of an endoscope apparatus according to the present embodiment is the same as the configuration of the endoscope apparatus of the first embodiment. The present embodiment makes it possible to give the user an advance indication of measurement accuracy. The measurement accuracy depends on the object distance, which is the distance from the distal end of the endoscope 2 to the measurement target. By calculating the object distance, it is possible to get a sense of the level of measurement accuracy. Generally, the smaller the object distance is, the better the measurement accuracy is.
  • Hereinafter, only the differences from the first embodiment will be explained. The measurement of the present embodiment is different from that of the first embodiment in the processing of Steps SA and SB in FIG. 4. FIG. 19 shows a procedure of Step SA. The procedure shown in FIG. 19 is different from the procedure shown in FIG. 8 in that Steps SA10 and SA11 are inserted between Steps SA5 and SA6.
  • In Step SA10, the point calculation section 18 d calculates image coordinates of a corresponding point (matching point) on the right image which correspond to the image coordinates of the first base point on the left image by the matching processing. In Step SA11, the spatial-coordinate calculation section 18 e calculates spatial coordinates of a point in the space corresponding to the first base point based on the image coordinates of the first base point and the image coordinates of its matching point. The z-coordinate of the spatial coordinates is the object distance.
  • In Step SA6, the control section 18 a generates a graphic image signal for displaying the first base point at the calculated image coordinates, and outputs it to the video signal processing circuit 12. At this time, the control section 18 a sets the color of the first base point in accordance with its object distance. As a result, the first base point is displayed in the color in accordance with its object distance at the same position as the cursor.
  • FIG. 20 shows a procedure of Step SB. The procedure shown in FIG. 20 is different from the procedure shown in FIG. 9 in that Steps SB10 and SB11 are inserted between Steps SB5 and SB6. In Step SB10, the point calculation section 18 d calculates image coordinates of three or four corresponding points (matching points) on the right image which correspond to the image coordinates of the three or four base plane composing points on the left image by the matching processing. In Step SB11, the spatial-coordinate calculation section 18 e calculates spatial coordinates of points in the space which correspond to the base plane composing points based on the image coordinates of the base plane composing points and the image coordinates of their matching points. The z-coordinate of the spatial coordinates is the object distance.
  • In Step SB6, the control section 18 a generates a graphic image signal for displaying the base circle and the base plane composing points, and outputs it to the video signal processing circuit 12. At this time, the control section 18 a sets the colors of the base plane composing points in accordance with their respective object distances. As a result, the base circle is displayed and the base plane composing points are displayed in the color in accordance with their respective object distances.
  • FIGS. 21A to 21D show a state in which first and second base points are specified. As shown in FIG. 21A, the user moves a cursor 1100 displayed on the left screen of the measurement screen and performs an operation such as a click to specify a first base point 1110. At this time, the first base point 1110 is displayed in the color in accordance with its object distance. As shown in FIG. 21B, when the user moves the cursor 1100, a base circle 1120 is displayed and three base plane composing points 1130 are displayed on the base circle 1120. At this time, a second base point is temporarily specified at the position of the cursor 1100. Further, the base plane composing points 1130 are displayed in the color in accordance with their respective object distances.
  • As shown in FIGS. 21C and 21D, when the user further moves the cursor 1100, the position of the temporarily-specified second base point changes in accordance with the movement of the cursor 1100, and the base circle 1120 and the base plane composing points 1130 also change. At this time, the base plane composing points 1130 are displayed in the color in accordance with their respective object distances. Then, the user performs an operation such as a click to specify (fix) the second base point in a state where the base circle 1120 is located slightly outside the burned portion (i.e., in a state where the base circle 1120 nearly surrounds the burned portion).
  • The user can know the object distances from the colors of the first base point and the base plane composing points, and use the object distances as a guide to measurement accuracy. In the above-described embodiment, the colors of the first base point and the base plane composing points are changed in accordance with their respective object distances. However, other display attributes, such as the size, may be changed instead. Further, the object distance of the second base point may be calculated, and the second base point may be displayed in a color in accordance with the object distance.
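  • The mapping from object distance to display color is not specified in the text; a simple thresholded mapping such as the following sketch is one plausible realization (the threshold values and colors are assumptions):

    def color_for_object_distance(z):
        # Smaller object distance generally means better measurement accuracy.
        if z <= 15.0:
            return "green"     # close: good accuracy expected
        if z <= 30.0:
            return "yellow"    # marginal
        return "red"           # far: accuracy expected to degrade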
  • In Step SA11 or Step SB11, the control section 18 a may determine whether or not each of the object distances is equal to or less than a predetermined value, and the specification (fixation) of the first base point or the second base point may be prohibited when there is an object distance which exceeds the predetermined value. That is, the first base point or the second base point may be specified (fixed) only when each of the object distances is equal to or less than the predetermined value. In this case, the processing proceeds to Step SC only when all of the object distances are equal to or less than the predetermined value. Further, a warning which indicates that there is a possibility that the base plane is not appropriately set may be displayed for the user when the variability of the object distances of the base plane composing points calculated in Step SB11 is large.
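  • The gating and warning described here might be realized as in the following sketch, where the threshold and the variability measure (population standard deviation) are assumptions, not values from the patent:

    import statistics

    def check_object_distances(distances, limit=30.0, spread_limit=5.0):
        # Fixation of a base point is allowed only when every object distance
        # is at or below the limit; a warning is raised when the distances of
        # the base plane composing points vary widely, since the base plane
        # may then be set inappropriately.
        allowed = all(d <= limit for d in distances)
        spread = statistics.pstdev(distances) if len(distances) > 1 else 0.0
        return allowed, spread > spread_limit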
  • As described above, according to the present embodiment, it is possible for the user to confirm the measurement accuracy using the object distance when specifying the first base point and the second base point. As a result, it is possible to reduce measurement error in cases where a base plane is set at a position unsuitable for measurement. Further, by prohibiting the processing from proceeding to the calculation of the base plane and the spatial distance in Step SC until the object distances become equal to or less than a predetermined value, it is possible to further reduce such measurement error.
  • While preferred embodiments of the invention have been described and illustrated above, it should be understood that these are exemplary of the invention and are not to be considered as limiting. Additions, omissions, substitutions, and other modifications can be made without departing from the scope of the present invention. Accordingly, the invention is not to be considered as being limited by the foregoing description, and is only limited by the scope of the appended claims.

Claims (11)

1. An endoscope apparatus, comprising:
an imaging unit that captures a subject to acquire an image of the subject;
a base point setting section that sets a first base point and a second base point on the image based on an instruction input via an input device;
a base line setting section that sets a base line on the image based on the first base point and the second base point;
a point setting section that sets at least three points on the image based on the base line;
a base plane setting section that sets a base plane in a space based on the at least three points;
a distance calculation section that calculates a distance between the base plane and a point corresponding to the first base point; and
a display that displays the image.
2. The endoscope apparatus according to claim 1, wherein the display further displays the base line or the at least three points.
3. The endoscope apparatus according to claim 2, wherein:
the display further displays a cursor;
after setting the first base point based on an instruction input via the input device, the base point setting section detects a position of the cursor based on an instruction input via the input device and sets the second base point based on the position of the cursor; and
the base line or the at least three points displayed by the display are updated in accordance with the movement of the cursor.
4. The endoscope apparatus according to claim 3, wherein the base plane setting section sets the base plane when an instruction of fixing the second base point is input via the input device.
5. The endoscope apparatus according to claim 3, wherein the distance calculation section calculates the distance when an instruction of fixing the second base point is input via the input device.
6. The endoscope apparatus according to claim 1, further comprising an object distance calculation section which calculates an object distance based on the first base point, the second base point, or the at least three points.
7. The endoscope apparatus according to claim 6, wherein the base plane setting section sets the base plane when the object distance is equal to or less than a predetermined value.
8. The endoscope apparatus according to claim 6, wherein the distance calculation section calculates the distance when the object distance is equal to or less than a predetermined value.
9. The endoscope apparatus according to claim 7, wherein the display further displays the object distance.
10. The endoscope apparatus according to claim 8, wherein the display further displays the object distance.
11. A measurement method comprising the following steps of:
acquiring an image of a subject;
setting a first base point and a second base point on the image based on an instruction input via an input device;
setting a base line on the image based on the first base point and the second base point;
setting at least three points on the image based on the base line;
setting a base plane in a space based on the at least three points; and
calculating a distance between the base plane and a point in the space corresponding to the first base point.
US13/022,010 2010-03-09 2011-02-07 Endoscope apparatus and measurement method Expired - Fee Related US8913110B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JPP2010-051917 2010-03-09
JP2010051917A JP5530225B2 (en) 2010-03-09 2010-03-09 Endoscope apparatus and program

Publications (2)

Publication Number Publication Date
US20110221877A1 true US20110221877A1 (en) 2011-09-15
US8913110B2 US8913110B2 (en) 2014-12-16

Family

ID=44559600

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/022,010 Expired - Fee Related US8913110B2 (en) 2010-03-09 2011-02-07 Endoscope apparatus and measurement method

Country Status (2)

Country Link
US (1) US8913110B2 (en)
JP (1) JP5530225B2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6432240B2 (en) * 2014-09-19 2018-12-05 日立化成株式会社 Conductive particle shape evaluation apparatus and conductive particle shape evaluation method
JP6446251B2 (en) * 2014-10-13 2018-12-26 ゼネラル・エレクトリック・カンパニイ Method and device for automatically identifying points of interest on anomalous surfaces
JP7098271B2 (en) * 2016-02-08 2022-07-11 ゼネラル・エレクトリック・カンパニイ How to automatically identify points of interest on a visible object
CA3009798A1 (en) 2017-07-12 2019-01-12 General Electric Company Graphic overlay for measuring dimensions of features using a video inspection device
JP6804417B2 (en) 2017-09-26 2020-12-23 オリンパス株式会社 Measuring device, measuring system, measuring device operating method, and program

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2778739B2 (en) * 1989-05-10 1998-07-23 オリンパス光学工業株式会社 Measurement endoscope device
JPH07281105A (en) * 1994-02-21 1995-10-27 Olympus Optical Co Ltd Endoscope device
JP4445622B2 (en) 1999-12-08 2010-04-07 オリンパス株式会社 Three-dimensional measurement endoscope device, subject measurement display device, subject measurement display method of three-dimensional measurement endoscope, and subject measurement display method
JP2004049638A (en) * 2002-07-22 2004-02-19 Olympus Corp Endoscope apparatus
JP4790223B2 (en) 2004-01-20 2011-10-12 オリンパス株式会社 Endoscope device for measurement
JP5113990B2 (en) 2005-05-24 2013-01-09 オリンパス株式会社 Endoscope device for measurement
JP2008136706A (en) 2006-12-04 2008-06-19 Matsushita Electric Ind Co Ltd Sauna device
JP5073384B2 (en) 2007-07-03 2012-11-14 オリンパス株式会社 Endoscope device for measurement
JP5127302B2 (en) * 2007-05-29 2013-01-23 オリンパス株式会社 Endoscope device for measurement and program
JP5186286B2 (en) 2007-06-04 2013-04-17 オリンパス株式会社 Endoscope device for measurement and program
JP5231173B2 (en) * 2007-12-27 2013-07-10 オリンパス株式会社 Endoscope device for measurement and program
JP5307407B2 (en) 2008-01-11 2013-10-02 オリンパス株式会社 Endoscope apparatus and program
JP5361246B2 (en) 2008-05-23 2013-12-04 オリンパス株式会社 Endoscope apparatus and program
JP5186314B2 (en) * 2008-05-26 2013-04-17 オリンパス株式会社 Endoscope apparatus and program
JP5242335B2 (en) 2008-10-23 2013-07-24 オリンパス株式会社 Image processing apparatus, endoscope apparatus, endoscope system, and program
JP4750197B2 (en) 2009-04-20 2011-08-17 オリンパス株式会社 Endoscope device

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6937268B2 (en) * 1999-09-01 2005-08-30 Olympus Corporation Endoscope apparatus
US20020191074A1 (en) * 2001-05-30 2002-12-19 Olympus Optical Co., Ltd. Image processing measuring apparatus and measuring endoscope apparatus
US6914623B2 (en) * 2001-05-30 2005-07-05 Olympus Corporation Image processing measuring apparatus and measuring endoscope apparatus
US7564626B2 (en) * 2002-01-25 2009-07-21 Ge Inspection Technologies Lp Stereo-measurement borescope with 3-D viewing
US7443488B2 (en) * 2005-05-24 2008-10-28 Olympus Corporation Endoscope apparatus, method of operating the endoscope apparatus, and program to be executed to implement the method
US7679041B2 (en) * 2006-02-13 2010-03-16 Ge Inspection Technologies, Lp Electronic imaging device with photosensor arrays
US20090043161A1 (en) * 2006-11-14 2009-02-12 Olympus Corporation Measuring endoscope apparatus, program and recording medium
US7782453B2 (en) * 2006-12-28 2010-08-24 Ge Inspection Technologies, Lp Method for measuring missing corner dimensions
US20080240491A1 (en) * 2007-01-31 2008-10-02 Olympus Corporation Instrumentation endoscope apparatus
US20090092278A1 (en) * 2007-01-31 2009-04-09 Olympus Corporation Endoscope apparatus and program

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9984474B2 (en) 2011-03-04 2018-05-29 General Electric Company Method and device for measuring features on or near an object
US11514643B2 (en) * 2011-03-04 2022-11-29 Baker Hughes, A Ge Company, Llc Method and device for displaying a two-dimensional image of a viewed object simultaneously with an image depicting the three-dimensional geometry of the viewed object
US10586341B2 (en) 2011-03-04 2020-03-10 General Electric Company Method and device for measuring features on or near an object
US10157495B2 (en) 2011-03-04 2018-12-18 General Electric Company Method and device for displaying a two-dimensional image of a viewed object simultaneously with an image depicting the three-dimensional geometry of the viewed object
US10019812B2 (en) 2011-03-04 2018-07-10 General Electric Company Graphic overlay for measuring dimensions of features using a video inspection device
US10699149B2 (en) 2013-12-17 2020-06-30 General Electric Company Method and device for automatically identifying a point of interest in a depth measurement on a viewed object
US20150170352A1 (en) * 2013-12-17 2015-06-18 General Electric Company Method and device for automatically identifying the deepest point on the surface of an anomaly
US9818039B2 (en) * 2013-12-17 2017-11-14 General Electric Company Method and device for automatically identifying a point of interest in a depth measurement on a viewed object
US9842430B2 (en) 2013-12-17 2017-12-12 General Electric Company Method and device for automatically identifying a point of interest on a viewed object
US9875574B2 (en) * 2013-12-17 2018-01-23 General Electric Company Method and device for automatically identifying the deepest point on the surface of an anomaly
US11308343B2 (en) * 2013-12-17 2022-04-19 Baker Hughes, A Ge Company, Llc Method and device for automatically identifying a point of interest in a depth measurement on a viewed object
US20150170412A1 (en) * 2013-12-17 2015-06-18 General Electric Company Method and device for automatically identifying a point of interest on the surface of an anomaly
US9600928B2 (en) * 2013-12-17 2017-03-21 General Electric Company Method and device for automatically identifying a point of interest on the surface of an anomaly
US10217016B2 (en) 2013-12-17 2019-02-26 General Electric Company Method and device for automatically identifying a point of interest in a depth measurement on a viewed object
US20160171705A1 (en) * 2013-12-17 2016-06-16 General Electric Company Method and device for automatically identifying a point of interest in a depth measurement on a viewed object
US9690427B2 (en) 2014-09-03 2017-06-27 Panasonic Intellectual Property Management Co., Ltd. User interface device, and projector device
WO2017053505A2 (en) * 2015-09-25 2017-03-30 General Electric Company Method and device for measuring features on or near an object
WO2017053505A3 (en) * 2015-09-25 2017-05-11 General Electric Company Method and device for measuring features on or near an object
EP3835218A1 (en) * 2019-12-10 2021-06-16 Rolls-Royce plc Methods and apparatus for inspecting an engine
US11579046B2 (en) 2019-12-10 2023-02-14 Rolls-Royce Plc Methods and apparatus for inspecting an engine
EP4151535A1 (en) * 2019-12-10 2023-03-22 Rolls-Royce plc Methods and apparatus for inspecting an engine
US11761856B2 (en) * 2019-12-10 2023-09-19 Rolls-Royce Plc Methods and apparatus for inspecting an engine

Also Published As

Publication number Publication date
US8913110B2 (en) 2014-12-16
JP2011182977A (en) 2011-09-22
JP5530225B2 (en) 2014-06-25

Similar Documents

Publication Publication Date Title
US8913110B2 (en) Endoscope apparatus and measurement method
US20110187824A1 (en) Endoscope apparatus and measurement method
US8004560B2 (en) Endoscope apparatus
JP5073564B2 (en) Endoscope device for measurement and program
JP5137033B2 (en) Surgery support information display device, surgery support information display method, and surgery support information display program
US8248465B2 (en) Measuring endoscope apparatus and program
JP4873794B2 (en) Image processing measuring apparatus and measuring endoscope apparatus
US8708890B2 (en) Endoscope apparatus and method of measuring subject
US20110021873A1 (en) Endoscope apparatus and measurement method
JP5113990B2 (en) Endoscope device for measurement
JP5231173B2 (en) Endoscope device for measurement and program
JP5307407B2 (en) Endoscope apparatus and program
JP2005204724A (en) Endoscope apparatus for measurement
JP5354494B2 (en) 3D image generation apparatus, 3D image generation method, and program
US9113806B2 (en) Endoscope apparatus for measuring a spatial characteristic
JP4674093B2 (en) Endoscope apparatus and program
CN110858397A (en) Measuring device, method for operating measuring device, and storage medium
JP6400767B2 (en) Measuring endoscope device
US20100128115A1 (en) Endoscope apparatus and method
US20120100512A1 (en) Inspection apparatus and inspection method
JPWO2017199657A1 (en) Endoscope apparatus, measurement method, and program
JP2011161019A (en) Endoscope apparatus and program
US20120107780A1 (en) Inspection apparatus and inspection method
JP5042550B2 (en) Endoscope device
JP2010256247A (en) Endoscope apparatus and measurement method

Legal Events

Date Code Title Description
AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HORI, FUMIO;KUWA, YUUSUKE;REEL/FRAME:025752/0821

Effective date: 20101203

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: OLYMPUS CORPORATION, JAPAN

Free format text: CHANGE OF ADDRESS;ASSIGNOR:OLYMPUS CORPORATION;REEL/FRAME:039344/0502

Effective date: 20160401

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

AS Assignment

Owner name: EVIDENT CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:OLYMPUS CORPORATION;REEL/FRAME:062492/0267

Effective date: 20221024

FP Lapsed due to failure to pay maintenance fee

Effective date: 20221216