WO2005109341A1 - Method and apparatus for use in the image analysis of biological specimens

Info

Publication number
WO2005109341A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
mask
intensity threshold
illumination intensity
areas
Prior art date
Application number
PCT/GB2005/001750
Other languages
French (fr)
Inventor
John R. Maddison
Original Assignee
Fairfield Imaging Limited
Priority date
Filing date
Publication date
Application filed by Fairfield Imaging Limited filed Critical Fairfield Imaging Limited
Publication of WO2005109341A1 publication Critical patent/WO2005109341A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/155 Segmentation; Edge detection involving morphological operators
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/60 Type of objects
    • G06V 20/69 Microscopic objects, e.g. biological cells or cellular parts
    • G06V 20/695 Preprocessing, e.g. image segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10056 Microscopic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20036 Morphological image processing
    • G06T 2207/20044 Skeletonization; Medial axis transform
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30024 Cell structures in vitro; Tissue sections in vitro

Abstract

The present invention provides a method for determining an optimal intensity threshold for use in image analysis of a stained biological specimen, the method comprising: (a) receiving a digital image of the stained biological specimen; (b) determining, using a first illumination intensity threshold value, a skeleton mask for the digital image; (c) calculating a fitness value for the skeleton mask, the fitness value dependent on at least the number of pixels represented by the mask; (d) storing the first illumination intensity threshold value and fitness value; (e) correspondingly repeating steps (b) to (d) for a plurality of further predefined illumination intensity threshold values; and (f) determining the optimal illumination intensity threshold as the illumination intensity threshold value having the highest fitness value stored in step (d).

Description

METHOD AND APPARATUS FOR USE IN THE IMAGE ANALYSIS OF BIOLOGICAL SPECIMENS
The present invention relates to digital image analysis, and more particularly to a method, and associated apparatus, for use in the analysis of digital images of biological specimens in which the boundaries of features or objects within the image are delineated.
In the field of biological specimen testing, numerous assay tests are known, and under continual development, the results of which rely on the inspection or analysis of a particular part of the biological tissue. For example, a specimen may be tested with one or more markers, such as antibodies, which attach to a certain part of a diseased or affected tissue, for example, a protein in the membrane, cytoplasm or nucleus of affected cells (or a combination thereof). A detection technique may be applied to a specimen under test, whereby a "stain" chemically attaches itself to the marker (e.g. chemically reacts with the marker to produce a coloured reaction product), to visually reveal affected areas of the specimen. In this way, the affected areas of the specimen can be seen by the human eye under a microscope. Conventionally, a pathologist or other expert is able to review slides of tested biological specimens under the microscope to manually assess the degree of staining, and thus the extent to which the cells are affected by the disease or condition under test. One such example is the test for the Epidermal Growth Factor Receptor (EGFR), which resides in the cell membrane and can be indicative of increased cell division and thus malignancy, leading to cancer.
More recently software has been developed to analyse digital images of stained, biological specimens under test, in order to quantify the level or degree of staining of the relevant areas of the tissue. The processing involves the "segmentation" of areas of the image of the specimen coloured by the stain (thereby excluding areas of the specimen image not coloured by the stain), and analysing the intensity of the defined colour in the particular areas of the specimen that are relevant to the test (e.g. membrane, cytoplasm, nucleus).
However, such techniques require an expert user to manually define intensity thresholds, which identify the boundaries of the relevant, stained areas of the specimen. The selection of this threshold may significantly affect the results of the image analysis. This is because when a biological specimen under test is stained, the stain may not be taken up uniformly by the marker. In addition, optical effects and/or the nature of the specimen may lead to the appearance of the stain in parts of the cell to which the marker is not attached. Thus, for example, if a test is concerned with analysing the intensity of a stain attached to cell membranes (such as the aforementioned EGFR test), if the threshold is too low, the areas of the specimen having intensity values above the threshold may include parts of the cytoplasm and adjacent cells, which are not relevant to the test but may nevertheless appear lightly stained in the image. Conversely, if the threshold is too high, it may be that not all of the cell membranes will be analysed, because some membranes may not have been properly stained.
Whilst the use of computerised image analysis in such assay tests has led to advances in the field, since the determination of the intensity threshold is performed manually, it involves subjective assessment, which can lead to errors, or inconsistencies between results from different specimens.
Accordingly, it would be desirable to provide a fully automated technique of image analysis for such tests, which is entirely objective, and thus leads to more consistent test results.
The present invention provides aspects and embodiments as provided in the claims. The present invention provides a method for automatically determining the boundaries of anatomical parts of cells within a stained biological specimen image. The boundaries may represent the boundary of cells (i.e. enclosing cell membranes) or the boundaries of the cell nuclei with the cytoplasm (i.e. enclosing cell nuclei), depending on the type of image analysis. Determination of the boundaries thus delineates the cells (or cell nuclei) within the image. This delineation may then be used to determine the relevant areas of the image to be considered in the image analysis. In one embodiment, a mask is generated based on the determined boundaries.
The mask defines the relevant areas of the image, which are to be considered in the image analysis. Such image analysis may determine an intensity value, representing the degree or extent of staining of the specimen. Further features and advantages of the present invention will be apparent from the following description and accompanying claims.
Embodiments of the present invention will now be described, with reference to the accompanying drawings, in which: Figure 1 is a flow diagram showing a method for use in the analysis of an image of a stained, biological specimen under test, according to an embodiment of the present invention; Figure 2 is an exemplary, stained biological specimen image; Figures 3a, 3b and 3c each show skeleton masks, determined using different intensity thresholds for the stain of the specimen image of Figure 2, for use in analysis thereof; and Figure 4 illustrates the specimen image of Figure 2, overlaid with a mask having the best selected threshold, as determined in accordance with a preferred embodiment of the present invention. Figure 1 is a flow diagram illustrating a method performed in the analysis of stained, biological specimen images in accordance with an embodiment of the present invention. The method is preferably implemented in the form of a computer program. The computer program may be stored on a computer readable medium such as a magnetic or optical disc, or may be downloaded to a computer from a remote site over a network, such as the Internet, and thus embodied on a carrier wave. It will be appreciated that the present invention may be implemented in these and other forms.
Generally speaking, the method of Figure 1 automatically determines the intensity threshold to be used for identifying or delineating areas of a stained biological specimen image to be included in image analysis. In particular, the method determines the best intensity threshold for segmented areas of a stained biological specimen image (such as the image of Figure 2), from which a mask, defining the areas of the image for image analysis, may be generated. The illustrated method may be performed as part of such an image analysis method in the context of an assay test, although it will be appreciated that the illustrated method may be carried out separately from, but nevertheless prior to, the analysis of the intensity of the stained areas of the specimen image. The method of Figure 1 will be described with reference to the stained, biological specimen image of Figure 2. The biological specimen depicted in Figure 2 relates to the aforementioned EGFR test, which involves attachment of a brown stain to the membranes of cells, which is indicative of the tested condition (i.e. where the EGFR resides). Thus, the method of Figure 1 is concerned with identifying stained cell membranes. It will be appreciated that for other tests, the stain or colour may be associated with other parts of cells, such as the nuclei, and the method identifies the boundaries of such other features or objects within the cells.
Thus, as illustrated in Figure 2, the darkest areas of staining are associated with the membranes of cells, but some staining appears in the cytoplasm of the cells, as well as in adjacent areas of the specimen where cells are not clearly and distinctly visible. Thus, in this example, it is desirable to delineate the membranes of cells from other portions of the tissue, in order to identify areas of the image which represent cell membranes, whilst excluding other tissue areas. Referring to Figure 1, the method starts at step 100, by obtaining a digital image of a stained, biological specimen under test, such as the image of Figure 2. The image is typically retrieved from memory, where it has been stored following capture of the image of the specimen on a slide, using a microscope and digital camera, in accordance with conventional techniques. At this stage, parameters specific to, or associated with, the specimen may be manually or automatically set. For example, the colour of the stain to be identified in the image analysis is set for image segmentation. The colour may be automatically, digitally defined with reference to the test or stain associated with the image (i.e. a fixed colour corresponding to the stain is associated with a given test). In addition, a threshold illumination intensity range, and a threshold illumination intensity increment within that range, are set, as explained in more detail below. In addition, the image may be assigned an identifier for the purposes of result storage for different threshold values, again as described in more detail below. To set the colour of the stain, the apparatus is arranged to identify the colours associated with a particular hue, for example the hue associated with brown; a hue range may be used, and colour saturation may be used in conjunction with hue. This process allows the apparatus to focus on the brown-stained areas and ignore non-brown areas.
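By way of illustration only, the hue-based selection of the stained areas described above might be sketched as follows in Python with NumPy and scikit-image (libraries not mentioned in the patent). The function name, the brown hue window and the saturation cut-off are assumptions chosen for the example, not values taken from the disclosure.

    import numpy as np
    from skimage.color import rgb2hsv

    def stain_region_mask(rgb_image, hue_range=(0.02, 0.12), min_saturation=0.15):
        """Illustrative sketch: boolean mask of pixels whose hue (and saturation)
        suggest the brown stain. The hue window and saturation cut-off are
        assumed example values, not taken from the patent."""
        hsv = rgb2hsv(rgb_image)               # hue, saturation, value, each in [0, 1]
        hue, sat = hsv[..., 0], hsv[..., 1]
        in_hue = (hue >= hue_range[0]) & (hue <= hue_range[1])
        return in_hue & (sat >= min_saturation)

In practice the hue (and optional saturation) window would be fixed per test, as the text describes, rather than passed in each time.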
At step 110, the method selects an illumination intensity value for the relevant colour, which is typically the lowest value within the threshold range set at step 100. The threshold range may be a default range, for example 0 to 255 (representing all digital illumination intensity values), or, more preferably, 90 to 160. The illumination intensity threshold range is preferably set at step 100 based on previous results of best threshold values for specimens undergoing the same test, thus refining the threshold range.
At step 120, the method generates a "skeleton mask" based on the illumination intensity threshold set at step 110. The skeleton mask generation involves two main process steps.
Firstly, the method determines all the areas of the segmented image having an intensity value greater than the set threshold. For example, referring to Figure 2, if the intensity threshold set is 90, then this will reveal all areas of the specimen with an intensity of brown above the intensity value of 90. This will include areas of the cytoplasm within the cells surrounding the membranes, as discussed above.
Secondly, using the identified areas meeting the threshold, the method performs a "skeletonisation" procedure which defines a median line through the identified object areas. As the skilled person will appreciate, the "skeletonisation" of an object is typically performed by iteratively removing all pixels at the outer edge of the object, until a line of one pixel thickness is achieved, this line constituting the skeleton mask. The pixels at the edge of an object are determined by comparing pixels of the object with adjacent pixels; all pixels at the edge of an object will be adjacent to pixels which do not form part of the object.
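A minimal sketch of step 120, assuming the per-pixel stain intensity is available as a 2-D array (for example, 0-255 values within the hue-selected regions), might look like the following; skimage.morphology.skeletonize is used here as a stand-in for the skeletonisation procedure described above, and the helper name is hypothetical.

    from skimage.morphology import skeletonize

    def skeleton_mask(stain_intensity, threshold):
        """Boolean skeleton mask of all areas whose stain intensity exceeds the threshold."""
        above = stain_intensity > threshold   # first sub-step: segment by intensity threshold
        return skeletonize(above)             # second sub-step: thin to a one-pixel median line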
Thus, step 120 results in a "skeleton mask" of one pixel thickness, defining the cell boundaries (membranes), based on the particular selected threshold. The lines of the skeleton mask may or may not correspond to the correct location of the cell membranes, since the threshold value may take into account areas of the cytoplasm.
This is discussed below in more detail with reference to Figures 3a to 3c.
At step 130, the skeleton mask is analysed to determine a "fitness value" according to predefined criteria indicative of the likelihood that the skeleton mask closely corresponds to the relevant anatomical boundaries, in the present example the cell membranes. One example criterion suitable for biological specimens, in particular cell membranes or nuclei, is the number of "closed loops" within the skeleton mask. Such closed loops represent the number of complete cells (or cell nuclei). It will be appreciated that the number of closed loops is functionally related to the number of pixels in the skeleton mask. Thus, the number of pixels in the skeleton mask may be used to determine a "fitness value" (the larger the number of pixels, the better the fitness value). Alternative criteria will be apparent to the skilled person. For example, the fitness value may be determined by looking for a predetermined shape or size of closed loops in the skeleton mask (again, representing cells/cell nuclei) or a predetermined distance between lines (representing spacing between cells/nuclei).
Thus, step 130 uses an algorithm based on one or more of the discussed criteria, to determine a "fitness value" of the skeleton mask. The fitness value may be a number within a predetermined range, for example 0-1 or 0-10.
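As an illustration of step 130, a crude fitness function based on the two criteria mentioned above (the number of skeleton pixels and the number of closed loops) might be sketched as follows. Closed loops are counted here as background regions completely enclosed by skeleton lines, and the way the two criteria are combined into a single score is an arbitrary assumption for the example.

    import numpy as np
    from scipy import ndimage as ndi

    def fitness(skel):
        """Illustrative fitness score for a boolean skeleton mask."""
        n_pixels = int(skel.sum())

        # Label the non-skeleton background; regions that never touch the image
        # border must be enclosed by a closed loop of skeleton lines.
        labels, n_regions = ndi.label(~skel)
        border_labels = np.unique(np.concatenate(
            [labels[0, :], labels[-1, :], labels[:, 0], labels[:, -1]]))
        n_loops = n_regions - np.count_nonzero(border_labels)

        # Arbitrary combination for illustration: loops dominate, the pixel
        # count breaks ties between masks with equal numbers of loops.
        return n_loops + n_pixels / skel.size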
At step 140, the method stores the fitness value together with the threshold value in memory associated with the image, using the image identifier. At step 150, the method considers whether the fitness values for all thresholds (according to the threshold range and threshold increment set at step 100) have been determined. If the fitness values for all thresholds have not been determined, the method returns to step 110, and sets a new threshold value by adding the threshold increment to the previous threshold value. The method then continues with steps 120-150 using the new threshold, and is repeated until all thresholds within the set threshold range and increment have been considered.
It will be appreciated that rather than starting at a low threshold (e.g. 90) and incrementally increasing the value, it would be possible to start at a high value and incrementally decrease the threshold value. In such cases, it will be appreciated that the threshold increment would be a negative value. When step 150 determines that all the thresholds have been considered, the method continues with step 160 by identifying the best fitness value within the stored results, and thus the corresponding best threshold for use in identifying the relevant boundaries.
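Putting the above together, the scan over candidate thresholds (steps 110 to 160) might be sketched as below, reusing the hypothetical skeleton_mask() and fitness() helpers from the earlier sketches. The 90 to 160 range is the preferred range mentioned above; the step of 10 grey levels is an assumed example increment.

    def best_threshold(stain_intensity, t_min=90, t_max=160, step=10):
        """Return the threshold with the highest fitness, plus all stored results."""
        results = {}                                   # step 140: store fitness per threshold
        for t in range(t_min, t_max + 1, step):        # steps 110/150: walk the threshold range
            results[t] = fitness(skeleton_mask(stain_intensity, t))   # steps 120-130
        return max(results, key=results.get), results  # step 160: highest fitness wins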
The threshold determined at step 160 may then be used to determine a mask for image analysis using conventional techniques, as described below. Figures 3a to 3c depict skeleton masks generated by step 120 of the method of
Figure 1, based on the stained, biological specimen image of Figure 2. In these diagrams, the mask is shown in white on a black background. The skeleton masks can be considered as binary pictures. In Figure 3a, the skeleton mask has been generated using a very low intensity threshold. In this case, wide areas surrounding the cell membranes of the specimen of Figure 2 are identified and skeletonised in step 120 of Figure 1, resulting in a mask having no interconnecting lines. Thus, when analysing this mask using step 130 of Figure 1 as described above, a low "fitness value" will be determined. In particular, the number of pixels in the mask is relatively low, and there are no closed loops representing cells of appropriate size. Accordingly, a low fitness value is determined for this skeleton mask.
Figure 3c illustrates a skeleton mask generated at step 120 of Figure 1 in which the threshold is set very high. In this case, only very highly stained areas of the specimen will be identified and skeletonised. Thus, the skeleton mask does not identify some cell membranes, which have absorbed the stain to a lesser extent and so have low intensity staining. Thus, as shown in Figure 3c, the skeleton mask includes some closed loops which enclose more than one cell nucleus, when compared with Figure 2. When the fitness value of this mask is determined at step 130 of Figure 1, whilst the fitness value will be greater than that for the mask of Figure 3a, the number of closed loops and the number of pixels in the skeleton mask will be lower than for a best-fit mask.
Finally, Figure 3b illustrates a skeleton mask generated using the correct (best) threshold. Thus, the skeleton mask of Figure 3b has a high number of closed loops, representing cell membrane boundaries, and thus at step 130 of Figure 1, a high fitness value will be generated.
Accordingly, in this example, step 160 of Figure 1 will determine that the threshold used to generate the mask of Figure 3b is the most appropriate threshold to use for the image analysis of the biological specimen image of Figure 2.
In accordance with an image analysis technique, as described above, the skeleton mask is used to generate a mask for the areas of the image to be used in image analysis. In particular, in the case of a test involving cell membranes, as in the
Figure 2 test, the skeleton mask of Figure 3b is widened to an appropriate pixel width, representative of the anatomical size of the cell membranes. For example, a cell membrane may be 10 pixels wide in the image of Figure 2. Other pixel widths may be appropriate, depending upon the magnification of the objective lens used to acquire the image, the specification of the digital camera, etc.
The generated mask, 10 pixels thick, is then overlaid on the image, as shown in white in Figure 4, and all areas not covered by the mask are removed. The remaining areas of the image are then analysed. For example, the intensity of each and every pixel underlying the mask is determined, and a histogram generated to show the intensity distribution, corresponding to the extent of the staining, across the membranes of the cells. Alternatively, or in addition, a mean intensity value for all of the pixels may be determined, by adding together the intensity values for all of the pixels, and dividing the sum by the number of pixels. The results can then be analysed by an expert, such as a pathologist. It will be appreciated that the present invention, by automatically scanning through intensity thresholds at increments within an appropriate intensity threshold range, is able to objectively determine the best threshold to use in determining the mask. The technique is accordingly highly tolerant. Since the generation of a mask for each threshold is time consuming (typically about 2 minutes), it would be desirable to increase the increment between selected thresholds when a wide threshold range is specified.
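A sketch of the final mask generation and measurement might look like the following, again using scikit-image and NumPy as assumed tools. Binary dilation with a disk of radius 5 widens the one-pixel skeleton to roughly 11 pixels, standing in for the 10-pixel membrane width given in the example, and intensities are assumed to lie in the 0-255 range.

    import numpy as np
    from skimage.morphology import binary_dilation, disk

    def measure_membrane_staining(stain_intensity, skel, half_width=5, bins=256):
        """Widen the skeleton, keep only pixels under the mask, and summarise them."""
        mask = binary_dilation(skel, disk(half_width))   # widen the 1-pixel skeleton
        values = stain_intensity[mask]                   # pixels underlying the mask only
        hist, _ = np.histogram(values, bins=bins, range=(0, 255))
        return hist, float(values.mean())                # intensity distribution and mean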
Thus, the above-described example of the present invention provides a method for automatic selection of a correct membrane/cytoplasm threshold to obtain best membrane specification in EGFR testing.
A specimen image is iteratively processed by creating an image that defines a line representing cytoplasm connectivity. A threshold at which this line most accurately represents the characteristics of the cytoplasm boundary is used to specify the intensity boundary between cytoplasm and membrane stain. The line representing cytoplasm connectivity is determined using a skeleton process which reduces stained structures, according to the threshold, to a one pixel thick line through the centre of the object. Thus, the process is concerned only with the structure of an object, rather than with how thick the object is. By reducing the structure to its skeleton, the subjective effects of selecting different manual thresholds are obviated. Furthermore, the change that is seen in the final mask used for quantification is minimised. Although the starting point can differ depending upon the threshold selected (the thresholded membrane regions are fatter or thinner), the structure remains the same. In tests, not only was the change minimised, but there was also no difference at all in the mask generated, even though the threshold selected was changed by 20 grey levels.
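The stability described above could be checked, for example, by generating the widened masks at two thresholds 20 grey levels apart and comparing them with a Dice overlap score. This sketch is a way of testing the behaviour described, not part of the patented method, and it reuses the hypothetical helpers from the earlier sketches.

    import numpy as np
    from skimage.morphology import binary_dilation, disk

    def mask_similarity(stain_intensity, t1, t2, half_width=5):
        """Dice overlap between the widened masks generated at two thresholds."""
        m1 = binary_dilation(skeleton_mask(stain_intensity, t1), disk(half_width))
        m2 = binary_dilation(skeleton_mask(stain_intensity, t2), disk(half_width))
        return 2.0 * np.logical_and(m1, m2).sum() / (m1.sum() + m2.sum())

A Dice score of 1.0 would correspond to the "no difference at all" observation reported above.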
From these results, the method has been shown to detect the correct threshold for a variety of stains from different sections, independently of the intensity of the stain. It will be appreciated that the present invention is applicable to other tests involving biological specimen images.
Various modifications and changes may be made to the described embodiments. The invention should be interpreted to embrace all such modifications, changes and equivalents, which fall within the spirit and scope of the present invention. All various combinations of embodiments of the invention are within the scope of the present invention.

Claims

CLAIMS:
1. A method for determining an optimal intensity threshold for use in image analysis of a stained biological specimen, the method comprising: (a) receiving a digital image of the stained biological specimen; (b) determining, using a first illumination intensity threshold value, a skeleton mask for the digital image; (c) calculating a fitness value for the skeleton mask, the fitness value dependent on at least the number of pixels represented by the mask; (d) storing the first illumination intensity threshold value and fitness value; (e) correspondingly repeating steps (b) to (d) for a plurality of further predefined illumination intensity threshold values, and (f) determining the optimal illumination intensity threshold as the illumination intensity threshold value having the highest fitness value stored in step (d).
2. A method as claimed in claim 1, wherein the skeleton mask is one pixel in thickness.
3. A method as claimed in claim 1 or claim 2, wherein step (b) comprises: identifying all areas of the digital image having an illumination intensity value for the stain above the intensity threshold value, and skeletonising the identified areas to define a median line through the identified areas.
4. A method as claimed in claim 1, 2 or 3, wherein the staining of the biological specimen is associated with a boundary of a cellular or tissue structure and wherein the fitness value is dependent upon one or more of: the number of pixels in the skeleton mask; the number of closed loops in the skeleton mask; the deviation in the size and/or shape of the closed loops within the skeleton mask relative to an optimal size and/or shape of the cellular or tissue structure, and the deviation in spacing of adjacent lines of the skeleton mask relative to an optimal spacing for the cellular or tissue structure.
5. A method as claimed in claim 1, comprising pre-selecting a hue associated with the colour of the stain, and using the illumination intensity values on regions of the image associated with the selected hue.
6. A method for determining an optimal illumination intensity threshold for use in image analysis of a stained biological specimen substantially as hereinbefore described with reference to, and as illustrated in, Figures 1 to 3 of the accompanying drawings.
7. A method for delineating areas of an image of a stained biological specimen for image analysis, the method comprising: determining an optimal illumination intensity threshold for the image using the method as claimed in any one of claims 1 to 6; generating a skeleton mask for the image using the determined optimal illumination intensity threshold; expanding the mask by a predetermined number of pixels, and using the expanded mask to delineate the areas of the image to undergo image analysis.
8. A method as claimed in claim 7, comprising overlaying the expanded mask over the image, removing all areas of the image not covered by the expanded mask, and performing image analysis on the remaining areas of the image underlying the expanded mask.
9. A method for delineating areas of an image of a stained biological specimen for image analysis substantially as hereinbefore described with reference to, and as illustrated in, Figures 1 to 4 of the accompanying drawings.
10. A computer readable medium having a computer program for carrying out a method as claimed in any one of claims 1 to 9.
11. Apparatus for determining an optimal illumination intensity threshold for use in image analysis of a stained biological specimen, the apparatus comprising: (a) an input to receive a digital image of the stained biological specimen; (b) one or more processors arranged to determine, using a first illumination intensity threshold value, a skeleton mask for the digital image; (c) one or more processors arranged to calculate a fitness value for the skeleton mask, the fitness value dependent on at least the number of pixels represented by the mask; (d) one or more stores to store the first illumination intensity threshold value and fitness value; (e) one or more processors arranged to correspondingly repeat steps (b) to (d) for a plurality of further predefined illumination intensity threshold values, and arranged to determine the optimal illumination intensity threshold as the illumination intensity threshold value having the highest fitness value stored in step (d).
12. Apparatus as claimed in claim 11, wherein the one or more processors of features (b), (c) and (e) are the same.
13. Apparatus as claimed in claim 11, wherein the apparatus is arranged to comprise an input for pre-selecting a hue associated with the colour of the stain, and arranged to use the illumination intensity values on regions of the image associated with the selected hue.
14. Apparatus for delineating areas of an image of a stained biological specimen for image analysis, the apparatus comprising: an input for receiving a determined optimal illumination intensity threshold for the image using the apparatus of claim 11; and wherein the delineating apparatus comprises one or more processors arranged to generate a skeleton mask for the image using the determined optimal illumination intensity threshold; one or more processors arranged to expand the mask by a predetermined number of pixels, and one or more processors arranged to use the expanded mask to delineate the areas of the image to undergo image analysis.
15. Apparatus as claimed in claim 14, wherein the apparatus is arranged to overlay the expanded mask over the image, remove all areas of the image not covered by the expanded mask, and perform image analysis on the remaining areas of the image underlying the expanded mask.
PCT/GB2005/001750 2004-05-11 2005-05-09 Method and apparatus for use in the image analysis of biological specimens WO2005109341A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GBGB0410499.8A GB0410499D0 (en) 2004-05-11 2004-05-11 Method and apparatus for use in the image analysis of biological specimens
GB0410499.8 2004-05-11

Publications (1)

Publication Number Publication Date
WO2005109341A1 true WO2005109341A1 (en) 2005-11-17

Family

ID=32526833

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2005/001750 WO2005109341A1 (en) 2004-05-11 2005-05-09 Method and apparatus for use in the image analysis of biological specimens

Country Status (2)

Country Link
GB (2) GB0410499D0 (en)
WO (1) WO2005109341A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0502511D0 (en) * 2005-02-08 2005-03-16 Medical Solutions Plc Apparatus and method for image processing of specimen images for use in computer analysis thereof
JP2016541039A (en) * 2013-10-07 2016-12-28 ベンタナ メディカル システムズ, インコーポレイテッド System and method for comprehensive multi-assay tissue analysis

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5231580A (en) * 1991-04-01 1993-07-27 The United States Of America As Represented By The Secretary Of The Department Of Health And Human Services Automated method and apparatus for determining characteristics of nerve fibers
US20020154798A1 (en) * 2001-02-20 2002-10-24 Ge Cong Extracting shape information contained in cell images
US20030048931A1 (en) * 2001-03-23 2003-03-13 Peter Johnson Quantification and differentiation of tissue based upon quantitative image analysis

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0610916A3 (en) * 1993-02-09 1994-10-12 Cedars Sinai Medical Center Method and apparatus for providing preferentially segmented digital images.
FR2708165B1 (en) * 1993-07-22 1995-09-29 Philips Laboratoire Electroniq Method for processing digitized images in X-ray imaging to detect the edge of a region masked by a field flap.
US6424732B1 (en) * 1998-12-01 2002-07-23 The Board Of Trustees Of The Leland Stanford Junior University Object segregation in images
AUPQ849200A0 (en) * 2000-06-30 2000-07-27 Cea Technologies Inc. Unsupervised scene segmentation

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5231580A (en) * 1991-04-01 1993-07-27 The United States Of America As Represented By The Secretary Of The Department Of Health And Human Services Automated method and apparatus for determining characteristics of nerve fibers
US20020154798A1 (en) * 2001-02-20 2002-10-24 Ge Cong Extracting shape information contained in cell images
US20030048931A1 (en) * 2001-03-23 2003-03-13 Peter Johnson Quantification and differentiation of tissue based upon quantitative image analysis

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SAHOO P K ET AL: "SURVEY OF THRESHOLDING TECHNIQUES", COMPUTER VISION GRAPHICS AND IMAGE PROCESSING, ACADEMIC PRESS, DULUTH, MA, US, vol. 41, no. 2, 1 February 1988 (1988-02-01), pages 233 - 260, XP000000250 *
SANCHEZ-MARIN F J: "Automatic segmentation of contours of corneal cells", COMPUTERS IN BIOLOGY AND MEDICINE, NEW YORK, NY, US, vol. 29, no. 4, July 1999 (1999-07-01), pages 243 - 258, XP004532398, ISSN: 0010-4825 *

Also Published As

Publication number Publication date
GB2414074A (en) 2005-11-16
GB0509355D0 (en) 2005-06-15
GB0410499D0 (en) 2004-06-16

Similar Documents

Publication Publication Date Title
JP7422825B2 (en) Focus-weighted machine learning classifier error prediction for microscope slide images
Pohle et al. Segmentation of medical images using adaptive region growing
CN107180421B (en) Fundus image lesion detection method and device
JP4504203B2 (en) Scoring of estrogen and progesterone expressions based on image analysis
US9547801B2 (en) Methods of chromogen separation-based image analysis
EP3343440A1 (en) Identifying and excluding blurred areas of images of stained tissue to improve cancer scoring
JP2021503666A (en) Systems and methods for single-channel whole-cell segmentation
CN106650794B (en) A kind of image bloom removing method influenced by body surface high light reflectivity and system
JP6342810B2 (en) Image processing
CN110310291A (en) A kind of rice blast hierarchy system and its method
CN113793301B (en) Training method of fundus image analysis model based on dense convolution network model
Skounakis et al. ATD: A multiplatform for semiautomatic 3-D detection of kidneys and their pathology in real time
JP2006507579A (en) Histological evaluation of nuclear polymorphism
CN110021019B (en) AI-assisted hair thickness distribution analysis method for AGA clinical image
WO2005109341A1 (en) Method and apparatus for use in the image analysis of biological specimens
Pollatou An automated method for removal of striping artifacts in fluorescent whole-slide microscopy
KR101245923B1 (en) Method of analyzing cell structures and their components
JP4452624B2 (en) Automatic histological categorization of tubules
Pohle et al. Self-learning model-based segmentation of medical images
CN109919924B (en) Method suitable for cell digital processing of large-batch HE staining pictures
Aubreville et al. Field of Interest Proposal for Augmented Mitotic Cell Count: Comparison of Two Convolutional Networks.
Ajemba et al. Stability-based validation of cellular segmentation algorithms
CN110458853B (en) Ankle ligament separation method and system in medical image
CN110378949B (en) Starch granule distribution analysis device and method thereof
Sarve et al. Quantification of bone remodeling in the proximity of implants

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase