WO2001010132A2 - Motion estimation - Google Patents

Motion estimation Download PDF

Info

Publication number
WO2001010132A2
WO2001010132A2 PCT/EP2000/006973 EP0006973W WO0110132A2 WO 2001010132 A2 WO2001010132 A2 WO 2001010132A2 EP 0006973 W EP0006973 W EP 0006973W WO 0110132 A2 WO0110132 A2 WO 0110132A2
Authority
WO
WIPO (PCT)
Prior art keywords
motion vector
block
motion
vectors
global
Prior art date
Application number
PCT/EP2000/006973
Other languages
French (fr)
Other versions
WO2001010132A3 (en
Inventor
Rob A. Beuker
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed litigation Critical https://patents.darts-ip.com/?family=8240515&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=WO2001010132(A2) "Global patent litigation dataset” by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Priority to JP2001513899A priority Critical patent/JP2003512749A/en
Priority to EP00956232A priority patent/EP1145560A2/en
Publication of WO2001010132A2 publication Critical patent/WO2001010132A2/en
Publication of WO2001010132A3 publication Critical patent/WO2001010132A3/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/176Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/513Processing of motion vectors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/527Global motion vector estimation

Abstract

In a motion vector estimation method, a block-based motion vector estimation process (BME) that involves comparing a plurality of candidate vectors is carried out to determine block-based motion vectors, at least a most frequently occurring block-based motion vector (MFMV) is determined, a global motion vector estimation process (GME) using at least the most frequently occurring block-based motion vector (MFMV) is carried out to obtain a global motion vector (GMV), and the global motion vector (GMV) is applied as a candidate vector to the block-based motion vector estimation process (BME).

Description

Motion estimation.
The invention relates to a method and device for motion estimation, a motion-compensated picture signal processing device comprising such a motion estimation device, and a picture display apparatus comprising such a motion-compensated picture signal processing device.
A prior art motion estimation technique, called "3-D Recursive Search", has been described by Gerard de Haan and Robert Jan Schutten, "Real-time 2-3 pull-down elimination applying motion estimation/compensation in a programmable device", IEEE Transactions on Consumer Electronics, Vol. 44, No. 3, August 1998, pp. 930-938. 3-D Recursive Search falls in the class of pixel- or block-recursive motion estimators. The algorithm is based on the following assumptions: motion does not change much in time, i.e. from frame to frame, so the algorithm maintains a motion field and tries to update this field only when necessary; and the motion field is usually similar for a relatively large region, i.e. for an object, so the motion vectors in the neighborhood of a location are good candidates for the motion at that location. Video consists of a sequence of frames. Each frame is divided into blocks, e.g. of 16x16 pixels, and a motion vector is associated with each block. The motion vector holds the displacement of the block in the current frame relative to the previous frame. Suppose that we want to update the motion vector of block (x,y) in the current frame. 3-D Recursive Search uses only a limited number of candidate vectors, say five, for the estimation, viz. some vectors from the previous frame (temporal vectors), some vectors from the current frame (spatial vectors), and an update of a spatial vector. For each candidate the motion estimation error is calculated, and the candidate with the lowest motion estimation error is chosen as the best motion vector for that block. The algorithm goes through the blocks in the normal raster scan order.
WO-A-97/46,022 discloses a method of estimating motion vectors, in which motion parameters are determined for a given field of a video signal, and motion vectors for a subsequent field of the video signal are determined in dependence upon at least one predetermined motion vector (i.e. a motion vector already estimated for a spatio-temporally neighboring block) and at least one additional motion vector derived from the motion parameters. The motion parameters for the given field may be derived from motion vectors determined for the given field, e.g. by applying a two-dimensional histogram operation on the motion vectors determined for the given field.
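For concreteness, the per-block selection at the heart of 3-D Recursive Search (evaluate a handful of candidate vectors and keep the one with the lowest matching error) can be sketched as follows. This is a minimal sketch under assumed conventions (NumPy luminance frames, a 16x16 block size, SAD as the error measure), not the estimator of the cited reference.

```python
import numpy as np

BLOCK = 16  # assumed 16x16 block size, as in the description

def sad(cur, prev, bx, by, mv):
    """Sum of absolute differences between block (bx, by) of the current frame
    and the block of the previous frame displaced by mv = (MV_x, MV_y).
    The caller is assumed to have clipped mv so the displaced block stays
    inside the frame."""
    x0, y0 = bx * BLOCK, by * BLOCK
    cur_blk = cur[y0:y0 + BLOCK, x0:x0 + BLOCK].astype(np.int32)
    prev_blk = prev[y0 + mv[1]:y0 + mv[1] + BLOCK,
                    x0 + mv[0]:x0 + mv[0] + BLOCK].astype(np.int32)
    return int(np.abs(cur_blk - prev_blk).sum())

def best_candidate(cur, prev, bx, by, candidates):
    """Pick the candidate vector with the lowest matching error for one block."""
    return min(candidates, key=lambda mv: sad(cur, prev, bx, by, mv))

# Toy usage: a frame panned two pixels to the left yields the vector (2, 0).
cur = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
prev = np.roll(cur, 2, axis=1)  # previous frame: current content shifted right by 2
print(best_candidate(cur, prev, 1, 1, [(0, 0), (2, 0), (-2, 0), (0, 2)]))
```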
EP-A-0,652,678 discloses a method and apparatus for improving a block-based motion compensation in digital video coding. The location of the search window within a reference frame is defined using the global motion of the frame. In one embodiment, the global motion vector is generated utilizing the motion vector occurring with the most repetition within a plurality of previously stored motion vectors.
It is, inter alia, an object of the invention to provide an improved motion estimation technique. To this end, the invention provides a motion estimation method and device, a motion-compensated picture signal processing apparatus, and a picture display apparatus as defined in the independent claims. Advantageous embodiments are defined in the dependent claims.
In a motion vector estimation method in accordance with a primary aspect of the present invention, a block-based motion vector estimation process that involves comparing a plurality of candidate vectors is carried out to determine block-based motion vectors, at least a most frequently occurring block-based motion vector is determined, a global motion vector estimation process using at least the most frequently occurring block-based motion vector is carried out to obtain a global motion vector, and the global motion vector is applied as a candidate vector to the block-based motion vector estimation process.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter.
The drawing shows a functional block diagram of an embodiment of a motion-compensated picture display apparatus in accordance with the present invention.
When using 3-D Recursive Search for global motion estimation, the main task is: how do we obtain one motion vector from all the block motion vectors? Our approach is based on the following steps:
1. The global motion vector is defined as the most used motion vector in appropriate blocks. We also use the second-most used motion vector. A block is appropriate if the motion estimation error is small enough and the block contains enough detail.
2. Make the motion field smooth by introducing a global motion vector candidate.
In fact, we use two motion estimators: a normal motion estimator that uses the global motion vector, and a global motion estimator. First we will describe how we extract a global motion vector from the motion field. Then we will describe the properties of each motion estimator. Finally, it is described how both motion fields are used to build the global motion estimator.
Why do we need two motion estimators? We use the normal motion estimator to track the changes. From the associated motion field we cannot obtain the correct global motion directly, but only candidate global motion vectors. We use a global motion estimator, still based on the 3-D Recursive Search concept, for selecting the best global motion vector. We cannot use this global motion estimator alone, because it is not capable of tracking changes.
The global motion is extracted from the motion field in two steps: count for all "appropriate" blocks the number of times that a motion vector is used, and obtain from these counts the most and second-most used motion vectors. A block is "appropriate" if the motion estimation error is small enough (average SAD smaller than 30) and the block contains enough activity (activity larger than 50), where the activity per block is defined as: activity = max_(i,j) y(i,j) - min_(i,j) y(i,j). We remove the blocks with a low activity, because the motion estimation is not reliable for blocks without detail. Currently, we use the Sum-of-Absolute-Difference (SAD) measure for the displacement error. Let y[i,j] and y_prev[i,j] denote the pixel values of the current frame and the previous frame, respectively. The Sum-of-Absolute-Difference measure is calculated by:
SAD = ∑ | y(i,j) - y_prev(i + MV_x, j + MV_y) |,
where (MV_x, MV_y) is the candidate motion vector and the summation is over the block. The SAD is set to "0" if it is smaller than a threshold, to remove the influence of fixed pattern noise. We also use the second-most used motion vector to improve the robustness of the algorithm. We found that sometimes the algorithm will favor the zero-motion vector even though there is some camera panning. Supplying the global motion estimator with both vectors solves this situation.
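The two-step extraction and the measures it relies on can be sketched as follows. The thresholds (average SAD below 30, activity above 50) are taken from the text; the per-block record format, the per-pixel interpretation of "average SAD", the function names and the noise-threshold default are illustrative assumptions.

```python
import numpy as np
from collections import Counter

SAD_THRESHOLD = 30       # "average SAD smaller than 30" (per-pixel average assumed)
ACTIVITY_THRESHOLD = 50  # "activity larger than 50"

def activity(block):
    """Per-block activity: maximum pixel value minus minimum pixel value."""
    return int(block.max()) - int(block.min())

def block_sad(cur_blk, prev_blk, noise_threshold=0):
    """SAD over one block pair; results below noise_threshold are forced to 0 to
    suppress fixed pattern noise (the threshold value itself is not given in the
    text and is an assumption here)."""
    s = int(np.abs(cur_blk.astype(np.int32) - prev_blk.astype(np.int32)).sum())
    return 0 if s < noise_threshold else s

def extract_global_candidates(blocks):
    """blocks: iterable of (motion_vector, average_sad, activity) records, one
    per estimated block. Returns the most used and second-most used motion
    vectors among the 'appropriate' blocks."""
    counts = Counter(mv for mv, avg_sad, act in blocks
                     if avg_sad < SAD_THRESHOLD and act > ACTIVITY_THRESHOLD)
    ranked = [mv for mv, _ in counts.most_common(2)]
    most_used = ranked[0] if ranked else (0, 0)
    second_most = ranked[1] if len(ranked) > 1 else most_used
    return most_used, second_most

# Example: two appropriate blocks agree on (3, 0); the unreliable block is ignored.
field = [((3, 0), 12.0, 80), ((3, 0), 9.5, 120), ((7, 7), 60.0, 200)]
print(extract_global_candidates(field))  # -> ((3, 0), (3, 0))
```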
The 3-D Recursive Search estimator uses the following 6 candidates:
1. The most-used global motion vector (which is also used as best global motion vector).
2. The spatial vector of block (x-1, y-1) (upper left).
3. The spatial vector of block (x+1, y-1) (upper right).
4. The temporal vector of the current block.
5. The temporal vector of block (x, y+1) (lower).
6. An update of the spatial vector of block (x-1, y-1) if x is even, and of block (x+1, y-1) if x is odd.
The update is obtained as follows. The update vector is the sum of the spatial vector and a delta vector. The delta vector (dx, dy) is read from a list of sixteen possible delta vectors, given in the next table.
[Table: the sixteen possible delta vectors; rendered as an image (imgf000005_0001) in the original and not reproduced here.]
The table shows that the maximum update per vector is 12 pixels horizontally and 8 pixels vertically.
How do we select a delta vector? We simply use the next delta vector in the list for the next block, starting with delta vector 0. Suppose that element j is used for block (x,y); then we use element j+1 for the next block (i.e. block (x+1,y)), and we wrap around to element "0" when j+1 equals sixteen.
Each candidate is checked to see whether the resulting address is valid, i.e. points to an area within the frame. If not, the vector is clipped to the nearest valid motion vector.
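A sketch of how the six candidates might be assembled for block (x, y), including the cyclic update and the clipping of invalid candidates, is given below. The delta list is a placeholder that only respects the stated bounds, since the actual table is an image that is not reproduced here, and the clipping is a simple per-component clamp, one possible reading of "clipped to the nearest valid motion vector".

```python
# PLACEHOLDER delta list: not the patent's table, only respecting the stated
# bounds of at most 12 pixels horizontally and 8 pixels vertically.
DELTAS = [(0, 1), (0, -1), (1, 0), (-1, 0), (0, 2), (0, -2), (2, 0), (-2, 0),
          (0, 4), (0, -4), (4, 0), (-4, 0), (0, 8), (0, -8), (12, 0), (-12, 0)]

def clip_vector(mv, x, y, block, frame_w, frame_h):
    """Clamp mv so that block (x, y) displaced by mv stays inside the frame."""
    dx = max(-x * block, min(mv[0], frame_w - (x + 1) * block))
    dy = max(-y * block, min(mv[1], frame_h - (y + 1) * block))
    return (dx, dy)

def candidates_3drs(x, y, spatial, temporal, global_mv, block_index,
                    block=16, frame_w=720, frame_h=576):
    """Assemble the six candidates for block (x, y). 'spatial' and 'temporal'
    map (x, y) to vectors of the current and previous vector fields; missing
    neighbours default to the zero vector. block_index cycles the delta list."""
    zero = (0, 0)
    sp_left = spatial.get((x - 1, y - 1), zero)    # upper-left spatial vector
    sp_right = spatial.get((x + 1, y - 1), zero)   # upper-right spatial vector
    base = sp_left if x % 2 == 0 else sp_right     # vector that receives the update
    dx, dy = DELTAS[block_index % len(DELTAS)]     # next delta for the next block
    cands = [global_mv,                            # 1. most-used global vector
             sp_left,                              # 2. spatial, block (x-1, y-1)
             sp_right,                             # 3. spatial, block (x+1, y-1)
             temporal.get((x, y), zero),           # 4. temporal, current block
             temporal.get((x, y + 1), zero),       # 5. temporal, block (x, y+1)
             (base[0] + dx, base[1] + dy)]         # 6. update of a spatial vector
    return [clip_vector(mv, x, y, block, frame_w, frame_h) for mv in cands]

# Example: candidates for block (2, 3) with an otherwise empty vector field.
print(candidates_3drs(2, 3, {}, {}, global_mv=(4, 0), block_index=0))
```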
The next table shows the penalties for each vector type. Penalties are added to the block matching errors (SAD) to favor certain candidate motion vectors over others and thus smooth the motion field.
[Table: penalties per candidate vector type; rendered as an image (imgf000005_0002) in the original and not reproduced here.]
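The penalty mechanism can be illustrated as follows: each candidate's SAD is increased by a type-dependent penalty before the minimum is taken. The penalty values below are placeholders, since the actual table is not reproduced.

```python
# Placeholder penalties per candidate type; the real values come from the table
# that is an image in the original, so these numbers are assumptions only.
PENALTY = {"global": 0, "spatial": 0, "temporal": 2, "update": 6}

def pick_vector(typed_candidates, match_error):
    """typed_candidates: list of (type, motion_vector) pairs; match_error:
    callable returning the SAD of a vector for the current block. The penalty
    biases the choice towards preferred candidate types, smoothing the field."""
    best_mv, best_cost = None, None
    for ctype, mv in typed_candidates:
        cost = match_error(mv) + PENALTY[ctype]
        if best_cost is None or cost < best_cost:
            best_mv, best_cost = mv, cost
    return best_mv

# Example: a slightly better update candidate still loses after its penalty.
cands = [("spatial", (2, 0)), ("update", (3, 0))]
print(pick_vector(cands, lambda mv: {(2, 0): 40, (3, 0): 37}[mv]))  # -> (2, 0)
```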
The global motion estimator uses the following four candidates:
1. The most-used motion vector obtained by the block-based 3-D Recursive Search estimator.
2. The second-most used motion vector obtained by the block-based 3-D Recursive Search estimator.
3. Cyclically varying updates of the motion vector mentioned under 1 (see above).
4. Cyclically varying updates of the motion vector mentioned under 2 (see above).
On a block basis, the global estimator determines which of the four candidates is the best one. From these best candidates determined on a block basis, the most-frequently occurring one is retained.
The penalties for each type are:
[Table: penalties per candidate type for the global motion estimator; rendered as an image (imgf000006_0001) in the original and not reproduced here.]
The penalty for the global motion candidate is 1 if the motion vector is zero and 0 otherwise. This may be simplified to "0" only without losing accuracy.
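A sketch of the four-candidate global estimator as described: per block, the best of the four candidates is determined, and the best candidate occurring most often across blocks is retained. The update deltas and the error callback are assumptions, and only the stated zero-vector penalty rule is modelled.

```python
from collections import Counter

# Placeholder update deltas for the global estimator (the patent's table is not
# reproduced; these values are assumptions).
GLOBAL_DELTAS = [(1, 0), (-1, 0), (0, 1), (0, -1), (2, 0), (-2, 0), (0, 2), (0, -2)]

def global_candidates(most_used, second_most, block_index):
    """The four candidates: the two vectors from the block-based estimator plus
    a cyclically varying update of each."""
    dx, dy = GLOBAL_DELTAS[block_index % len(GLOBAL_DELTAS)]
    return [("global", most_used),
            ("global", second_most),
            ("update", (most_used[0] + dx, most_used[1] + dy)),
            ("update", (second_most[0] + dx, second_most[1] + dy))]

def penalized_cost(ctype, mv, err):
    """Only the stated rule is implemented: a global candidate gets penalty 1
    when it is the zero vector. Further per-type penalties come from the table
    that is not reproduced here."""
    return err + (1 if ctype == "global" and mv == (0, 0) else 0)

def estimate_global_vector(block_coords, most_used, second_most, match_error):
    """block_coords: iterable of block positions; match_error(block, mv) returns
    the SAD of mv for that block. The best candidate is determined per block and
    the most frequently occurring best candidate is retained."""
    best_per_block = []
    for idx, blk in enumerate(block_coords):
        cands = global_candidates(most_used, second_most, idx)
        best = min(cands,
                   key=lambda c: penalized_cost(c[0], c[1], match_error(blk, c[1])))
        best_per_block.append(best[1])
    return Counter(best_per_block).most_common(1)[0][0]

# Example with a trivial error model that prefers the vector (3, 0) everywhere.
err = lambda blk, mv: abs(mv[0] - 3) + abs(mv[1])
print(estimate_global_vector([(0, 0), (1, 0), (2, 0)], (3, 0), (0, 0), err))  # -> (3, 0)
```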
The full motion estimator in accordance with the present invention uses the following steps per frame:
1. Get the best global motion vector from the global motion estimation GME.
2. Use this vector for the 6-candidate 3-D Recursive Search motion estimation BME.
3. Extract the most used and second-most used global motion vectors from the resulting motion field.
4. Use these motion vectors in the global motion estimation GME, i.e. in the four-candidate motion estimation.
5. The global motion is extracted from the resulting motion field and used in step 1.
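Putting the steps together, one frame of the full estimator might be driven as sketched below. The three callables stand in for the block-based estimator, the candidate extraction and the global estimator sketched earlier; they are assumptions, not the patent's interfaces.

```python
def estimate_frame(cur, prev, state, run_bme, extract_candidates, run_gme):
    """One iteration of the full estimator.
    state              : dict carrying the global motion vector between frames.
    run_bme            : (cur, prev, global_mv) -> block motion field   (step 2)
    extract_candidates : field -> (most_used, second_most)              (step 3)
    run_gme            : (cur, prev, most_used, second_most) -> vector  (steps 4-5)"""
    global_mv = state.get("global_mv", (0, 0))                         # step 1
    field = run_bme(cur, prev, global_mv)                              # step 2
    most_used, second_most = extract_candidates(field)                 # step 3
    state["global_mv"] = run_gme(cur, prev, most_used, second_most)    # steps 4-5
    return field, state["global_mv"]

# Example with stub estimators, just to show the data flow over two frames.
state = {}
stub_bme = lambda cur, prev, gmv: {"field_seeded_with": gmv}
stub_extract = lambda field: ((2, 0), (0, 0))
stub_gme = lambda cur, prev, mu, smu: mu
for frame in ["frame1", "frame2"]:
    print(estimate_frame(frame, None, state, stub_bme, stub_extract, stub_gme))
```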
To reduce the CPU load of the algorithm, the number of motion estimation blocks is reduced by sub-sampling. Since we only require one motion vector per frame, the global motion vector, we do not need to calculate a motion vector for each block, so the number of used blocks can be sub-sampled. We currently use a sub-sampling factor of two horizontally and two vertically. Note that we may be able to use a factor of four for the global motion estimation, if necessary. The sub-sampling factor is limited for the following reasons: too high a sub-sampling factor reduces the probability that there are "appropriate" blocks (blocks with a small motion estimation error and a sufficiently high activity), and using too few blocks will reduce the smoothness of the motion field. In addition, it is possible to apply sub-sampling within a block to reduce the number of pixels.
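For illustration, sub-sampling the block grid by a factor of two in each direction simply means visiting every other block, and pixels within a block can be skipped in the same way. The constants and frame size below are assumptions.

```python
import numpy as np

BLOCK = 16

def sampled_block_grid(frame_w, frame_h, step=2):
    """Coordinates of the blocks actually visited: sub-sampled by 'step' in both
    directions (step=2 is the factor currently used; 4 may suffice for the
    global motion estimation)."""
    return [(x, y)
            for y in range(0, frame_h // BLOCK, step)
            for x in range(0, frame_w // BLOCK, step)]

def subsampled_block_sad(cur_blk, prev_blk, pixel_step=2):
    """Optional sub-sampling within the block: SAD over every pixel_step-th
    pixel (pixel_step is an assumed value, not given in the text)."""
    c = cur_blk[::pixel_step, ::pixel_step].astype(np.int32)
    p = prev_blk[::pixel_step, ::pixel_step].astype(np.int32)
    return int(np.abs(c - p).sum())

# For a 720x576 frame: 23 x 18 = 414 blocks instead of the full 45 x 36 = 1620.
print(len(sampled_block_grid(720, 576)))
```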
Using processor-specific features, such as MMX, also helps in speeding up the computation. Also, the time spent in the SAD calculation can in principle be reduced by using cross-correlation. To improve the global motion estimation algorithm, the following measures are possible. Retain not only the most used and second-most used global motion vectors, but also less frequently used motion vectors. Use only the central part of the current frame for motion estimation, e.g. a quarter of the frame: if there is some rotation (with the middle of the frame as center of rotation), the blocks in the outer area of the frame will contain more displacement than those in the central part. Note that this latter measure will also reduce the computational load.
The drawing shows a functional block diagram of an embodiment of a motion-compensated picture display apparatus in accordance with the present invention. A picture signal is applied to a block-based motion vector estimator BME and to a global motion-vector estimator GME that operate as set out above. The block-based motion vector estimator BME applies a most frequently used motion vector MFMV and a second-most frequently used motion vector SMFMV to the global motion-vector estimator GME. The global motion-vector estimator GME applies a global motion vector GMV as a candidate vector to the block-based motion vector estimator BME. The picture signal is also applied to a motion-compensated processor MCP for carrying out, e.g., a motion-compensated interpolation (say, a 100 Hz conversion) or a motion-compensated stitching of images obtained by a scanner or video camera. The motion-compensated processor is controlled by either block-based motion vectors supplied by the block-based motion vector estimator BME or global motion vectors supplied by the global motion estimator GME. A switch S symbolically indicates this choice. In practice, depending on the application, there is no switch S and the appropriate type of motion vectors is used. Global vectors will e.g. be used for stitching scanned images, while block-based vectors will e.g. be used for 100 Hz conversion. The output of the motion-compensated processor MCP is applied to a display device DD. In other applications of the invention, such as in a scanner, the output of the motion-compensated processor MCP will be printed on paper.
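In code form, the choice symbolized by the switch S amounts to selecting which vector set drives the motion-compensated processor; the application names and the processor interface below are purely illustrative assumptions.

```python
def drive_mcp(mcp, block_vectors, global_vector, application):
    """Select which vectors control the motion-compensated processor MCP:
    global vectors e.g. for stitching scanned images, block-based vectors
    e.g. for 100 Hz conversion."""
    if application == "stitching":
        return mcp(global_vector)
    if application == "100hz_conversion":
        return mcp(block_vectors)
    raise ValueError("unknown application: " + application)

# Example with a stub processor that just reports what it was given.
print(drive_mcp(lambda v: ("MCP driven by", v), {}, (2, 0), "stitching"))
```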
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. Where in the above- described examples only a most-used and a second-most used vector are used, it is an obvious generalization clearly falling within the scope of the claims to use the N most-used vectors. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps other than those listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims

CLAIMS:
1. A motion vector estimation method, comprising the steps of: carrying out a block-based motion vector estimation process (BME) that involves comparing a plurality of candidate vectors to determine block-based motion vectors; determining at least a most frequently occurring block-based motion vector (MFMV); carrying out a global motion vector estimation process (GME) using at least the most frequently occurring block-based motion vector (MFMV) to obtain a global motion vector (GMV); and applying the global motion vector (GMV) as a candidate vector to the block- based motion vector estimation process (BME).
2. A method as claimed in claim 1, wherein the determining step includes making a selection among block-based motion vectors having a corresponding motion error below a given motion error threshold.
3. A method as claimed in claim 1, wherein the determining step includes making a selection among block-based motion vectors estimated for blocks having a difference between maximum and minimum pixel values above a given activity threshold.
4. A method as claimed in claim 1, wherein both the most frequently occurring block-based motion vector (MFMV) and a second-most frequently occurring block-based motion vector (SMFMV) are determined and used in the global motion vector estimation process (GME).
5. A method as claimed in claim 1, wherein said global motion vector estimation process (GME) includes the steps of: comparing, on a block basis, a plurality of candidate vectors including the most frequently occurring block-based motion vector (MFMV) to obtain best vectors determined per block; outputting a most-frequently occurring best vector determined per block as the global motion vector (GMV).
6. A motion vector estimation device, comprising: block-based motion vector estimation means (BME) for determining block- based motion vectors based on a comparison of a plurality of candidate vectors; means for determining at least a most frequently occurring block-based motion vector (MFMV, SMFMV); means (GME) for carrying out a global motion vector estimation process using at least the most frequently occurring block-based motion vector (MFMV, SMFMV) to obtain a global motion vector; and means for applying the global motion vector (GMV) as a candidate vector to the block-based motion vector estimation means (BME).
7. A motion-compensated picture signal processing apparatus, comprising: a motion vector estimation device as claimed in claim 6 for generating motion vectors; and a motion-compensated processor (MCP) for processing a picture signal in dependence on the motion vectors.
8. A picture display apparatus, comprising: a motion-compensated picture signal processing apparatus as claimed in claim 7 to obtain a processed picture signal; and a display device for displaying the processed picture signal.
PCT/EP2000/006973 1999-08-02 2000-07-20 Motion estimation WO2001010132A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2001513899A JP2003512749A (en) 1999-08-02 2000-07-20 Motion estimation method and apparatus
EP00956232A EP1145560A2 (en) 1999-08-02 2000-07-20 Motion estimation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP99202532A EP1075147A1 (en) 1999-08-02 1999-08-02 Motion estimation
EP99202532.0 1999-08-02

Publications (2)

Publication Number Publication Date
WO2001010132A2 true WO2001010132A2 (en) 2001-02-08
WO2001010132A3 WO2001010132A3 (en) 2002-10-03

Family

ID=8240515

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2000/006973 WO2001010132A2 (en) 1999-08-02 2000-07-20 Motion estimation

Country Status (6)

Country Link
US (1) US6996177B1 (en)
EP (2) EP1075147A1 (en)
JP (1) JP2003512749A (en)
KR (1) KR100727795B1 (en)
TW (1) TW474105B (en)
WO (1) WO2001010132A2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007089068A1 (en) * 2006-02-02 2007-08-09 Samsung Electronics Co., Ltd. Method and apparatus for block-based motion estimation
WO2007148907A1 (en) * 2006-06-19 2007-12-27 Lg Electronics, Inc. Method and apparatus for processing a vedeo signal
US8170108B2 (en) 2006-03-30 2012-05-01 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8532181B2 (en) 2006-08-25 2013-09-10 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
EP3217668A1 (en) * 2005-08-29 2017-09-13 Samsung Electronics Co., Ltd. Motion estimation method, video encoding method and apparatus using the same

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7602847B1 (en) * 2001-03-27 2009-10-13 Vixs Systems, Inc. Device and method for compression of a video stream
US7675972B1 (en) 2001-07-30 2010-03-09 Vixs Systems, Inc. System and method for multiple channel video transcoding
KR100441509B1 (en) * 2002-02-25 2004-07-23 삼성전자주식회사 Apparatus and method for transformation of scanning format
US7421129B2 (en) * 2002-09-04 2008-09-02 Microsoft Corporation Image compression and synthesis for video effects
US7408986B2 (en) * 2003-06-13 2008-08-05 Microsoft Corporation Increasing motion smoothness using frame interpolation with motion analysis
US7558320B2 (en) * 2003-06-13 2009-07-07 Microsoft Corporation Quality control in frame interpolation with motion analysis
KR100987765B1 (en) * 2003-09-30 2010-10-13 삼성전자주식회사 Prediction method and apparatus in video encoder
US7840085B2 (en) * 2006-04-06 2010-11-23 Qualcomm Incorporated Electronic video image stabilization
US8050324B2 (en) * 2006-11-29 2011-11-01 General Instrument Corporation Method and apparatus for selecting a reference frame for motion estimation in video encoding
US8111750B2 (en) * 2007-03-20 2012-02-07 Himax Technologies Limited System and method for 3-D recursive search motion estimation
US8559518B2 (en) * 2008-06-30 2013-10-15 Samsung Electronics Co., Ltd. System and method for motion estimation of digital video using multiple recursion rules
TWI491248B (en) * 2011-12-30 2015-07-01 Chung Shan Inst Of Science Global motion vector estimation method
CN103838795A (en) * 2012-11-27 2014-06-04 大连灵动科技发展有限公司 Template correlation matching method
CA3173525A1 (en) 2017-04-21 2018-10-25 Zenimax Media Inc. Systems and methods for game-generated motion vectors
US10523961B2 (en) 2017-08-03 2019-12-31 Samsung Electronics Co., Ltd. Motion estimation method and apparatus for plurality of frames

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0410826A1 (en) * 1989-06-27 1991-01-30 Thomson-Csf Iterative motion estimation process, between a reference image and a current image, and device for carrying out the process
EP0652678A2 (en) * 1993-11-04 1995-05-10 AT&T Corp. Method and apparatus for improving motion compensation in digital video coding

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2231749B (en) * 1989-04-27 1993-09-29 Sony Corp Motion dependent video signal processing
DE4112235A1 (en) 1991-04-15 1992-11-05 Bundesrep Deutschland Separating zoom, rotational and translational motor parameters in image sequences - dividing motion vector data and storing in matrix memory, forming histograms and determining global parameters
JPH0686149A (en) * 1992-08-31 1994-03-25 Sony Corp Motion vector detector and video camera
JP3165296B2 (en) * 1992-12-25 2001-05-14 三菱電機株式会社 Inter-frame coding processing method, inter-frame coding processing method, and coding control method
JP3308617B2 (en) * 1992-12-28 2002-07-29 キヤノン株式会社 Apparatus and method for detecting motion vector
GB2312806B (en) * 1993-04-08 1998-01-07 Sony Uk Ltd Motion compensated video signal processing
JPH07135663A (en) 1993-09-17 1995-05-23 Oki Electric Ind Co Ltd Method and device for detecting movement vector
US5575286A (en) 1995-03-31 1996-11-19 Siemens Medical Systems, Inc. Method and apparatus for generating large compound ultrasound image
JP3639640B2 (en) * 1995-06-20 2005-04-20 キヤノン株式会社 Motion vector detection device
JP3745425B2 (en) * 1995-11-15 2006-02-15 日本放送協会 Motion vector detection method and adaptive switching prefilter for motion vector detection
DE69710413T2 (en) * 1996-05-24 2002-10-02 Koninkl Philips Electronics Nv MOTION ESTIMATE
FR2749411B1 (en) 1996-06-04 1998-09-04 Grados Christian ATTACHING AN ACCESSORY FOR A WRISTBAND ASSEMBLY AND WRIST CASE
US6462791B1 (en) * 1997-06-30 2002-10-08 Intel Corporation Constrained motion estimation and compensation for packet loss resiliency in standard based codec
EP0897247A3 (en) 1997-08-14 2001-02-07 Philips Patentverwaltung GmbH Method for computing motion vectors
KR100582856B1 (en) * 1997-09-23 2006-05-24 코닌클리케 필립스 일렉트로닉스 엔.브이. Motion estimation and motion-compensated interpolation

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0410826A1 (en) * 1989-06-27 1991-01-30 Thomson-Csf Iterative motion estimation process, between a reference image and a current image, and device for carrying out the process
EP0652678A2 (en) * 1993-11-04 1995-05-10 AT&T Corp. Method and apparatus for improving motion compensation in digital video coding

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HAAN DE G ET AL: "AN EFFICIENT TRUE-MOTION ESTIMATOR USING CANDIDATE VECTORS FROM A PARAMETRIC MOTION MODEL" IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY,US,IEEE INC. NEW YORK, vol. 8, no. 1, 1 February 1998 (1998-02-01), page 85-91 XP000737028 ISSN: 1051-8215 *
KAMIKURA K ET AL: "GLOBAL MOTION COMPENSATION IN VIDEO CODING" ELECTRONICS & COMMUNICATIONS IN JAPAN, PART I - COMMUNICATIONS,US,SCRIPTA TECHNICA. NEW YORK, vol. 78, no. 4, page 91-101 XP000523910 ISSN: 8756-6621 *
See also references of EP1145560A2 *

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3217668A1 (en) * 2005-08-29 2017-09-13 Samsung Electronics Co., Ltd. Motion estimation method, video encoding method and apparatus using the same
WO2007089068A1 (en) * 2006-02-02 2007-08-09 Samsung Electronics Co., Ltd. Method and apparatus for block-based motion estimation
US8559505B2 (en) 2006-03-30 2013-10-15 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8576920B2 (en) 2006-03-30 2013-11-05 Lg Electronics, Inc. Method and apparatus for decoding/encoding a video signal
US8170108B2 (en) 2006-03-30 2012-05-01 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8559523B2 (en) 2006-03-30 2013-10-15 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8363732B2 (en) 2006-03-30 2013-01-29 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8411744B2 (en) 2006-03-30 2013-04-02 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8565303B2 (en) 2006-03-30 2013-10-22 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8432972B2 (en) 2006-03-30 2013-04-30 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8457207B2 (en) 2006-03-30 2013-06-04 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8472519B2 (en) 2006-03-30 2013-06-25 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8483273B2 (en) 2006-03-30 2013-07-09 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8526504B2 (en) 2006-03-30 2013-09-03 Lg Electronics Inc. Method and apparatus for decoding/encoding a multi-view video signal with inter-view reference picture list management
US8428130B2 (en) 2006-03-30 2013-04-23 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8634475B2 (en) 2006-03-30 2014-01-21 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal using a reference picture list for inter-view prediction
US8611419B2 (en) 2006-03-30 2013-12-17 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8611427B2 (en) 2006-03-30 2013-12-17 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8565319B2 (en) 2006-03-30 2013-10-22 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
US8571113B2 (en) 2006-03-30 2013-10-29 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal
WO2007148907A1 (en) * 2006-06-19 2007-12-27 Lg Electronics, Inc. Method and apparatus for processing a vedeo signal
WO2007148906A1 (en) * 2006-06-19 2007-12-27 Lg Electronics, Inc. Method and apparatus for processing a vedeo signal
WO2007148909A1 (en) * 2006-06-19 2007-12-27 Lg Electronics, Inc. Method and apparatus for processing a vedeo signal
US8325814B2 (en) 2006-06-19 2012-12-04 Lg Electronics Inc. Method and apparatus for processing a video signal
US8718136B2 (en) 2006-08-25 2014-05-06 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8532180B2 (en) 2006-08-25 2013-09-10 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8559508B2 (en) 2006-08-25 2013-10-15 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8532183B2 (en) 2006-08-25 2013-09-10 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8532184B2 (en) 2006-08-25 2013-09-10 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8532178B2 (en) 2006-08-25 2013-09-10 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8630344B2 (en) 2006-08-25 2014-01-14 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8532182B2 (en) 2006-08-25 2013-09-10 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8649433B2 (en) 2006-08-25 2014-02-11 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8660179B2 (en) 2006-08-25 2014-02-25 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8681863B2 (en) 2006-08-25 2014-03-25 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8711932B2 (en) 2006-08-25 2014-04-29 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8559507B2 (en) 2006-08-25 2013-10-15 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8724700B2 (en) 2006-08-25 2014-05-13 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8761255B2 (en) 2006-08-25 2014-06-24 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8767827B2 (en) 2006-08-25 2014-07-01 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8855200B2 (en) 2006-08-25 2014-10-07 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction
US8532181B2 (en) 2006-08-25 2013-09-10 Lg Electronics Inc. Method and apparatus for decoding/encoding a video signal with inter-view reference picture list construction

Also Published As

Publication number Publication date
WO2001010132A3 (en) 2002-10-03
EP1145560A2 (en) 2001-10-17
TW474105B (en) 2002-01-21
KR20010075507A (en) 2001-08-09
JP2003512749A (en) 2003-04-02
EP1075147A1 (en) 2001-02-07
US6996177B1 (en) 2006-02-07
KR100727795B1 (en) 2007-06-14

Similar Documents

Publication Publication Date Title
US6996177B1 (en) Motion estimation
US6782054B2 (en) Method and apparatus for motion vector estimation
JP4472986B2 (en) Motion estimation and / or compensation
US8130835B2 (en) Method and apparatus for generating motion vector in hierarchical motion estimation
US8625673B2 (en) Method and apparatus for determining motion between video images
EP2011342A1 (en) Motion estimation at image borders
JP2004518341A (en) Recognition of film and video objects occurring in parallel in a single television signal field
Kaviani et al. Frame rate upconversion using optical flow and patch-based reconstruction
EP1514242A2 (en) Unit for and method of estimating a motion vector
WO2003102872A2 (en) Unit for and method of estimating a motion vector
US20060098886A1 (en) Efficient predictive image parameter estimation
US6996175B1 (en) Motion vector estimation
KR100649654B1 (en) Method of motion estimation for transmission cost reduction of motion vectors
WO2003073757A1 (en) Method and apparatus for field rate up-conversion
US20080144716A1 (en) Method For Motion Vector Determination
EP1606952A1 (en) Method for motion vector determination
EP1128678A1 (en) Motion estimation apparatus and method
Braspenning et al. Efficient motion estimation with content-adaptive resolution
JPH08242454A (en) Method for detecting global motion parameter
Al-Mualla et al. Motion field interpolation for improved motion compensation and frame-rate conversion
WO2000072590A2 (en) Block matching

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): JP KR

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

WWE Wipo information: entry into national phase

Ref document number: 2000956232

Country of ref document: EP

ENP Entry into the national phase

Ref country code: JP

Ref document number: 2001 513899

Kind code of ref document: A

Format of ref document f/p: F

WWE Wipo information: entry into national phase

Ref document number: 1020017004124

Country of ref document: KR

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWP Wipo information: published in national office

Ref document number: 1020017004124

Country of ref document: KR

WWP Wipo information: published in national office

Ref document number: 2000956232

Country of ref document: EP

AK Designated states

Kind code of ref document: A3

Designated state(s): JP KR

AL Designated countries for regional patents

Kind code of ref document: A3

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE

WWG Wipo information: grant in national office

Ref document number: 1020017004124

Country of ref document: KR