CN103686107A - Processing method and device based on projected image


Info

Publication number: CN103686107A
Application number: CN201310687010.2A (filed by Huawei Technologies Co., Ltd.)
Grant publication: CN103686107B (Chinese)
Related PCT application: PCT/CN2014/093811 (WO2015085956A1)
Authority: CN (China)
Inventors: 许春景, 黎伟, 刘健庄
Current and original assignee: Huawei Technologies Co., Ltd.
Legal status: Granted; Active
Prior art keywords: image, identification point, value, motion compensation, reference picture

Classifications

    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] (H04N: Pictorial communication, e.g. television; H04N 9/00: Details of colour television systems; H04N 9/12: Picture reproducers)
    • H04N 9/3191: Testing thereof; H04N 9/3194: Testing thereof including sensor feedback
    • H04N 9/3179: Video signal processing therefor; H04N 9/3185: Geometric adjustment, e.g. keystone or convergence

Abstract

The invention belongs to the technical field of image processing, and provides a processing method and device based on a projected image. The method comprises: obtaining a secondary imaging image of the projected image on a projection surface, where the projected image is formed by projecting a preset reference image onto the projection surface and reference identification points are preset in the reference image; obtaining the secondary image identification points in the secondary imaging image; obtaining a distortion factor from the change of the position parameters of the secondary image identification points in the secondary imaging image relative to the position parameters of the reference identification points in the reference image; and correcting an image to be projected according to the distortion factor. The processed projected image thereby exhibits reduced or even no distortion, improving the environmental adaptability of image projection equipment.

Description

Processing method and device based on a projected image
Technical field
Embodiments of the present invention relate to the field of image processing, and in particular to a processing method and device based on a projected image.
Background technology
Projection display is a display mode widely used in daily work and life. It is popular for its many advantages: the size of the projected image is adjustable, the projection surface can be chosen freely and flexibly, and the projection equipment is portable. Traditional projection display generally adopts a fixed ceiling-mounted or table-top installation with a fixed projection surface. With the development of technology, projectors are becoming smaller and can be set up freely and flexibly; a wearable projection device, for example, can be worn on the body. This introduces uncertainty into the projection environment: changes in the angle between the projection device and the projection surface, as well as unevenness of the projection surface, can all distort the projected image, so handling this image distortion becomes particularly important. The prior art provides a projection device comprising a projector, a projection surface and a camera. The projector projects an interactive interface onto some medium, the user operates the interface with specific gestures, the camera captures the gestures and passes them to a computing device for recognition, and the device is driven to complete the corresponding action according to the recognition result. However, because the exact position of the projection surface is not known in advance and the angle between the projector and the projection surface may change frequently, this complex projection environment can cause large projection deformation in the projected image and distort the displayed picture.
Summary of the invention
Embodiments of the present invention provide a processing method and device based on a projected image, so that when an unstable projection environment distorts the projected image, the resulting image distortion can be corrected.
In a first aspect, the present invention provides a processing method based on a projected image, comprising:
obtaining a secondary imaging image, formed in an imaging device, of a projected image on a projection surface, where the projected image is the image formed by a projection device projecting a preset reference image onto the projection surface; reference identification points are preset in the reference image and, after projection, form primary image identification points in the projected image; the image points that the primary image identification points form in the secondary imaging image are the secondary image identification points;
obtaining the secondary image identification points in the secondary imaging image;
obtaining a distortion factor according to the change of the position parameters of the secondary image identification points relative to the position parameters of the reference identification points, where the position parameters of the secondary image identification points are their positions in the secondary imaging image and the position parameters of the reference identification points are their positions in the reference image;
performing distortion correction on an image to be projected according to the distortion factor.
With reference to the first aspect, in a first possible implementation of the first aspect, the preset reference image is a reference infrared grid, and the reference identification points are the intersections of grid lines of different directions in the reference infrared grid; correspondingly, the projection device is an infrared projection device and the imaging device is an infrared-filter camera;
obtaining the secondary imaging image of the projected image on the projection surface comprises:
obtaining the secondary infrared grid formed from the projected image on the projection surface through the infrared-filter camera;
obtaining the secondary image identification points in the secondary imaging image comprises:
obtaining the intersections of grid lines of different directions in the secondary infrared grid, these intersections being the secondary image identification points.
With reference to the first aspect or the first possible implementation of the first aspect, in a second possible implementation of the first aspect, obtaining the distortion factor according to the change of the position parameters of the secondary image identification points in the secondary imaging image relative to the position parameters of the reference identification points in the reference image specifically comprises obtaining the distortion factor

$$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix}$$

by the following formulas:

$$\begin{bmatrix} r_{11} & r_{12} \\ r_{21} & r_{22} \end{bmatrix} \begin{bmatrix} x_i \\ y_i \end{bmatrix} + \begin{bmatrix} r_{13} \\ r_{23} \end{bmatrix} = \begin{bmatrix} x_i' \\ y_i' \end{bmatrix}, \quad i = 1, 2, \ldots, n \tag{1}$$

$$\begin{cases} x_i r_{11} + y_i r_{12} + r_{13} = x_i' \\ x_i r_{21} + y_i r_{22} + r_{23} = y_i' \end{cases} \quad i = 1, 2, \ldots, n \tag{2}$$

where R is the distortion factor, the 2×3 matrix is the matrix of the affine transformation of the reference image, and r11, r12, r13, r21, r22, r23 are the affine parameters; x_i and y_i are the abscissa and ordinate of a secondary image identification point in the secondary imaging image, (x_i, y_i) being the coordinate of that point; x_i' and y_i' are the abscissa and ordinate of the corresponding reference identification point in the reference image, (x_i', y_i') being the coordinate of that point. The affine transformation applies the linear part r11, r12, r21, r22 to the secondary image identification point (x_i, y_i) and then the translation r13, r23, mapping it to the reference identification point (x_i', y_i'). Formula (2) is the system of linear equations obtained by expanding formula (1); the affine distortion factor R is computed from formula (2).
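As an illustrative aside (not part of the claim language), formula (2) is an ordinary linear least-squares problem once three or more non-collinear point correspondences are available. A minimal Python sketch, with all names chosen here for illustration and numpy assumed:

```python
import numpy as np

def estimate_distortion_factor(secondary_pts, reference_pts):
    """Solve formula (2) for the 2x3 affine distortion factor R.

    secondary_pts, reference_pts: (n, 2) arrays of corresponding
    secondary image / reference identification point coordinates,
    with n >= 3 and the points not collinear.
    """
    S = np.asarray(secondary_pts, dtype=float)
    T = np.asarray(reference_pts, dtype=float)
    # Homogeneous coordinates [x_i, y_i, 1] so the translation
    # (r13, r23) is solved together with the linear part.
    A = np.hstack([S, np.ones((len(S), 1))])
    # Least-squares solution of formula (2); exact when n == 3.
    R, *_ = np.linalg.lstsq(A, T, rcond=None)
    return R.T  # [[r11, r12, r13], [r21, r22, r23]]
```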
With reference to the second possible implementation of the first aspect, in a third possible implementation of the first aspect, performing distortion correction on the image to be projected according to the distortion factor comprises:
performing distortion correction on the image to be projected by the following formula:

$$\begin{bmatrix} x_i^{R} \\ y_i^{R} \end{bmatrix} = R \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix} \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix}, \quad i = 1, 2, \ldots, n \tag{3}$$

where R is the distortion factor, x_i^R and y_i^R are the abscissa and ordinate of the corrected image pixel coordinate point (x_i^R, y_i^R) obtained by applying the distortion factor R to the image to be projected, and x_i and y_i are the abscissa and ordinate of the original pixel coordinate point (x_i, y_i) of the image to be projected.
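For illustration only: applying the 2×3 matrix of formula (3) to every pixel is exactly an affine warp, so a sketch using OpenCV (an assumed dependency, not named by the patent) could look like:

```python
import cv2
import numpy as np

def correct_image(image, R):
    """Pre-correct an image to be projected with the distortion
    factor R of formula (3). With default flags, cv2.warpAffine
    applies the same forward mapping [x, y, 1] -> R @ [x, y, 1]."""
    h, w = image.shape[:2]
    return cv2.warpAffine(image, np.asarray(R, dtype=np.float32), (w, h))
```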
With reference to the first aspect or any one of the first to third possible implementations of the first aspect, in a fourth possible implementation of the first aspect, after obtaining the distortion factor according to the change of the position parameters of the secondary image identification points relative to the position parameters of the reference identification points, the method further comprises: correcting the reference image according to the distortion factor;
specifically, performing distortion correction on the reference image by the following formula:

$$\begin{bmatrix} x_k^{R} \\ y_k^{R} \end{bmatrix} = R \begin{bmatrix} x_k \\ y_k \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix} \begin{bmatrix} x_k \\ y_k \\ 1 \end{bmatrix}, \quad k = 1, 2, \ldots, n \tag{4}$$

where R is the distortion factor, x_k^R and y_k^R are the abscissa and ordinate of the corrected image pixel coordinate point (x_k^R, y_k^R) obtained by applying the distortion factor R to the reference image, and x_k and y_k are the abscissa and ordinate of the original pixel coordinate point (x_k, y_k) of the reference image.
With reference to the fourth possible implementation of the first aspect, in a fifth possible implementation of the first aspect, after correcting the reference image according to the distortion factor, the method further comprises:
obtaining the motion parameters of the projection device at the time the reference image is projected onto the projection surface to form the projected image;
obtaining motion compensation parameters according to the motion parameters;
performing counter-motion compensation on the distortion-corrected image according to the motion compensation parameters, where the distortion-corrected image comprises the corrected image to be projected, or the corrected reference image.
With reference to the fifth possible implementation of the first aspect, in a sixth possible implementation of the first aspect, before obtaining the motion compensation parameters according to the motion parameters, the method further comprises:
performing motion estimation on the secondary imaging image to obtain a preliminary motion-compensated value;
and obtaining the motion compensation parameters according to the motion parameters comprises:
obtaining the motion compensation parameters by combining the motion parameters with the preliminary motion-compensated value.
With reference to the fifth possible implementation of the first aspect, in a seventh possible implementation of the first aspect, obtaining the motion compensation parameters according to the motion parameters comprises:
performing motion estimation on the secondary imaging image in combination with the motion parameters to obtain the motion compensation parameters.
With reference to the sixth possible implementation of the first aspect, in an eighth possible implementation of the first aspect, obtaining the motion compensation parameters by combining the motion parameters with the preliminary motion-compensated value comprises:
the motion parameter is a first motion-compensated value obtained directly from an acceleration sensor;
performing motion estimation on the secondary imaging image to obtain the preliminary motion-compensated value;
computing a weighted average of the first motion-compensated value and the preliminary motion-compensated value to obtain the motion compensation parameter, specifically by the following formula:

$$M = \alpha \cdot M_1 + (1 - \alpha) \cdot M_2 \tag{5}$$

where M is the motion compensation parameter, M_1 is the preliminary motion-compensated value and M_2 is the first motion-compensated value, with

$$M_1 = \begin{bmatrix} t_x^1 \\ t_y^1 \end{bmatrix}, \quad M_2 = \begin{bmatrix} t_x^2 \\ t_y^2 \end{bmatrix}, \quad t_x^1 = \frac{1}{N}\sum_{i=1}^{N} tx_i, \quad t_y^1 = \frac{1}{N}\sum_{i=1}^{N} ty_i, \quad t_x^2 = \frac{1}{H}\sum_{j=1}^{H} tx_j, \quad t_y^2 = \frac{1}{H}\sum_{j=1}^{H} ty_j,$$

where tx_i and ty_i are the position differences, between two adjacent frames, of the abscissa and ordinate of a secondary image identification point in the secondary imaging image, and N is the number of secondary image identification points per frame; tx_j and ty_j are the position differences, between two adjacent frames, of the abscissa and ordinate of a reference identification point in the reference image, and H is the number of reference identification points per frame; α is an empirical value weighting M_1 against M_2, with 0 ≤ α ≤ 1.
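A small numerical sketch of formula (5), for illustration only (numpy assumed; the function names are ours, not the patent's):

```python
import numpy as np

def mean_displacement(prev_pts, curr_pts):
    """Per-frame mean displacement of identification points between
    two adjacent frames: the averaged (tx, ty) behind M1 and M2."""
    return np.mean(np.asarray(curr_pts, float) - np.asarray(prev_pts, float), axis=0)

def fuse_motion_compensation(M1, M2, alpha):
    """Formula (5): weighted average of the preliminary value M1
    (motion estimation on the secondary imaging image) and the first
    value M2 (acceleration sensor), with empirical weight alpha."""
    assert 0.0 <= alpha <= 1.0
    return alpha * np.asarray(M1, float) + (1.0 - alpha) * np.asarray(M2, float)
```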
With reference to the seventh possible implementation of the first aspect, in a ninth possible implementation of the first aspect, the motion parameters are the component values u and v of the displacement of the projection device along the X-axis and Y-axis directions as sensed by an acceleration sensor, and performing motion estimation on the secondary imaging image in combination with the motion parameters to obtain the motion compensation parameter M is realized specifically by the following formulas:

$$M = \begin{bmatrix} t_x \\ t_y \end{bmatrix}$$

$$t_x = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \cot\!\left(\beta \cdot \arctan\frac{ty_i}{tx_i} + (1 - \beta) \cdot \arctan\frac{v}{u}\right) \tag{6}$$

$$t_y = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \tan\!\left(\beta \cdot \arctan\frac{ty_i}{tx_i} + (1 - \beta) \cdot \arctan\frac{v}{u}\right) \tag{7}$$

where tx_i and ty_i are the position differences, between two adjacent frames, of the abscissa and ordinate of a secondary image identification point in the secondary imaging image, N is the number of secondary image identification points taken per frame of the secondary imaging image, u and v are the motion parameters, i.e. the displacement components of the projection device in the X and Y directions sensed by the acceleration sensor, and β is an empirical value weighting arctan(ty_i/tx_i) against arctan(v/u), with 0 ≤ β ≤ 1.
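An illustrative transcription of formulas (6)-(7) into Python (a sketch under our own naming; arctan2 is used instead of the bare arctan of the formulas for quadrant robustness):

```python
import numpy as np

def fused_motion_compensation(tx, ty, u, v, beta):
    """Formulas (6)-(7) as printed: blend the direction angle of each
    point displacement (tx_i, ty_i) with the accelerometer angle
    arctan(v/u) using weight beta, then sum the cot/tan projections."""
    tx = np.asarray(tx, float)
    ty = np.asarray(ty, float)
    mag = np.hypot(tx, ty)  # sqrt(tx_i^2 + ty_i^2)
    angle = beta * np.arctan2(ty, tx) + (1.0 - beta) * np.arctan2(v, u)
    t_x = np.sum(mag / np.tan(angle))  # cot = 1 / tan
    t_y = np.sum(mag * np.tan(angle))
    return np.array([t_x, t_y])        # M = (t_x, t_y)
```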
With reference to any one of the fifth to ninth possible implementations of the first aspect, in a tenth possible implementation of the first aspect, performing motion compensation on the distortion-corrected image to be projected according to the motion compensation parameters is realized specifically by the following formula:

$$\begin{bmatrix} x_i^{RM} \\ y_i^{RM} \end{bmatrix} = \begin{bmatrix} x_i^{R} \\ y_i^{R} \end{bmatrix} + \begin{bmatrix} t_x \\ t_y \end{bmatrix}, \quad i = 1, 2, \ldots, n \tag{8}$$

where x_i^R and y_i^R are the abscissa and ordinate of a pixel coordinate of the image to be projected after correction by the distortion factor R, x_i^{RM} and y_i^{RM} are the abscissa and ordinate of that pixel coordinate after counter-motion compensation of the distortion-corrected pixel, and (t_x, t_y) is the motion compensation parameter M.
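Formula (8) is a plain per-point translation; a one-line illustrative helper (ours, not the patent's):

```python
import numpy as np

def counter_motion_compensate(corrected_pts, M):
    """Formula (8): shift the distortion-corrected coordinates
    (x_i^R, y_i^R) by the motion compensation parameter M = (t_x, t_y)."""
    return np.asarray(corrected_pts, float) + np.asarray(M, float)
```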
With reference to any one of the fifth to tenth possible implementations of the first aspect, in an eleventh possible implementation of the first aspect, performing motion compensation on the distortion-corrected reference image according to the motion compensation parameter M is realized specifically by the following formula:

$$\begin{bmatrix} x_k^{RM} \\ y_k^{RM} \end{bmatrix} = \begin{bmatrix} x_k^{R} \\ y_k^{R} \end{bmatrix} + \begin{bmatrix} t_x \\ t_y \end{bmatrix}, \quad k = 1, 2, \ldots, n \tag{9}$$

where x_k^R and y_k^R are the abscissa and ordinate of a pixel coordinate of the reference image after correction by the distortion factor R, x_k^{RM} and y_k^{RM} are the abscissa and ordinate of that pixel coordinate after counter-motion compensation of the distortion-corrected pixel, and (t_x, t_y) is the motion compensation parameter M.
With reference to the first aspect or any one of the fifth to eleventh possible implementations of the first aspect, in a twelfth possible implementation of the first aspect, there are at least two imaging devices, forming a stereo vision system;
obtaining the secondary image identification points in the secondary imaging image specifically comprises:
obtaining the three-dimensional parameters in the projected image through the stereo vision system; fitting a plane to the three-dimensional parameters and mapping the three-dimensional parameters perpendicularly onto the plane to obtain two-dimensional coordinate points, these two-dimensional coordinate points being the secondary image identification points.
With reference to the twelfth possible implementation of the first aspect, in a thirteenth possible implementation of the first aspect, after performing distortion correction on the image to be projected according to the distortion factor, the method further comprises: adjusting the focal length of the projection device according to the distance D from the projection device to the projection surface, realized specifically by the following formula:

$$f = \frac{D \times s_0}{s_1} \tag{10}$$

where s_0 is the LCD panel size of the projection device, s_1 is an empirical value of the image size, and D is obtained by taking the z values of the three-dimensional coordinates of the N secondary image identification points and averaging them.
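An illustrative sketch of formula (10), under our own naming (points_3d holds the triangulated identification points):

```python
import numpy as np

def adjust_focal_length(points_3d, s0, s1):
    """Formula (10): f = D * s0 / s1, where D is the mean z value of
    the N secondary image identification points, s0 the LCD panel size
    of the projection device and s1 an empirical image size."""
    D = float(np.mean(np.asarray(points_3d, float)[:, 2]))
    return D * s0 / s1
```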
In a second aspect, the present invention provides a processing device based on a projected image, the device comprising:
an imaging acquisition unit, configured to obtain a secondary imaging image, formed in an imaging device, of a projected image on a projection surface, where the projected image is the image formed by a projection device projecting a preset reference image onto the projection surface; reference identification points are preset in the reference image and, after projection, form primary image identification points in the projected image; the image points that the primary image identification points form in the secondary imaging image are the secondary image identification points;
an image identification point acquisition unit, configured to obtain the secondary image identification points in the secondary imaging image;
a distortion factor acquisition unit, configured to obtain a distortion factor according to the change of the position parameters of the secondary image identification points relative to the position parameters of the reference identification points, where the position parameters of the secondary image identification points are their positions in the secondary imaging image and the position parameters of the reference identification points are their positions in the reference image;
an image-to-be-projected correction unit, configured to perform distortion correction on the image to be projected according to the distortion factor R.
With reference to the second aspect, in a first possible implementation of the second aspect, the preset reference image is a reference infrared grid, and the reference identification points are the intersections of grid lines of different directions in the reference infrared grid;
the imaging acquisition unit is configured to obtain the secondary infrared grid formed from the projected image on the projection surface through the infrared-filter camera;
the image identification point acquisition unit is configured to obtain the intersections of grid lines of different directions in the secondary infrared grid, these intersections being the secondary image identification points.
With reference to the second aspect or the first possible implementation of the second aspect, in a second possible implementation of the second aspect, the distortion factor acquisition unit is configured to calculate the distortion factor

$$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix}$$

by the following formulas:

$$\begin{bmatrix} r_{11} & r_{12} \\ r_{21} & r_{22} \end{bmatrix} \begin{bmatrix} x_i \\ y_i \end{bmatrix} + \begin{bmatrix} r_{13} \\ r_{23} \end{bmatrix} = \begin{bmatrix} x_i' \\ y_i' \end{bmatrix}, \quad i = 1, 2, \ldots, n \tag{1}$$

$$\begin{cases} x_i r_{11} + y_i r_{12} + r_{13} = x_i' \\ x_i r_{21} + y_i r_{22} + r_{23} = y_i' \end{cases} \quad i = 1, 2, \ldots, n \tag{2}$$

where R is the distortion factor, the 2×3 matrix is the matrix of the affine transformation of the reference image, and r11, r12, r13, r21, r22, r23 are the affine parameters; x_i and y_i are the abscissa and ordinate of a secondary image identification point in the secondary imaging image, (x_i, y_i) being the coordinate of that point; x_i' and y_i' are the abscissa and ordinate of the corresponding reference identification point in the reference image, (x_i', y_i') being the coordinate of that point. The affine transformation applies the linear part r11, r12, r21, r22 to the secondary image identification point (x_i, y_i) and then the translation r13, r23, mapping it to the reference identification point (x_i', y_i'). Formula (2) is the system of linear equations obtained by expanding formula (1); the affine distortion factor R is computed from formula (2).
With reference to the second possible implementation of the second aspect, in a third possible implementation of the second aspect, the image-to-be-projected correction unit is configured to perform distortion correction on the image to be projected by the following formula:

$$\begin{bmatrix} x_i^{R} \\ y_i^{R} \end{bmatrix} = R \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix} \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix}, \quad i = 1, 2, \ldots, n \tag{3}$$

where R is the distortion factor, x_i^R and y_i^R are the abscissa and ordinate of the corrected image pixel coordinate point (x_i^R, y_i^R) obtained by applying the distortion factor R to the image to be projected, and x_i and y_i are the abscissa and ordinate of the original pixel coordinate point (x_i, y_i) of the image to be projected.
With reference to the second aspect or any one of the first to third possible implementations of the second aspect, in a fourth possible implementation of the second aspect, the device further comprises: a reference image correction unit, configured to correct the reference image according to the distortion factor R; the reference image correction unit is configured to perform distortion correction on the reference image by the following formula:

$$\begin{bmatrix} x_k^{R} \\ y_k^{R} \end{bmatrix} = R \begin{bmatrix} x_k \\ y_k \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix} \begin{bmatrix} x_k \\ y_k \\ 1 \end{bmatrix}, \quad k = 1, 2, \ldots, n \tag{4}$$

where R is the distortion factor, x_k^R and y_k^R are the abscissa and ordinate of the corrected image pixel coordinate point (x_k^R, y_k^R) obtained by applying the distortion factor R to the reference image, and x_k and y_k are the abscissa and ordinate of the original pixel coordinate point (x_k, y_k) of the reference image.
With reference to the fourth possible implementation of the second aspect, in a fifth possible implementation of the second aspect, the device further comprises:
a motion parameter acquisition unit, configured to obtain the motion parameters of the projection device at the time the reference image is projected onto the projection surface to form the projected image;
a compensation parameter acquisition unit, configured to obtain motion compensation parameters according to the motion parameters;
a motion compensation unit, configured to perform counter-motion compensation on the distortion-corrected image according to the motion compensation parameters, where the distortion-corrected image comprises the corrected image to be projected, or the corrected reference image.
With reference to the fifth possible implementation of the second aspect, in a sixth possible implementation of the second aspect, the device comprises:
a motion compensation estimation unit, configured to perform motion estimation on the secondary imaging image to obtain a preliminary motion-compensated value;
the compensation parameter acquisition unit comprises a third acquisition subunit, configured to obtain the motion compensation parameters by combining the motion parameters obtained by the motion parameter acquisition unit with the preliminary motion-compensated value obtained by the motion compensation estimation unit.
With reference to the fifth possible implementation of the second aspect, in a seventh possible implementation of the second aspect, the compensation parameter acquisition unit comprises: a fourth acquisition subunit, configured to perform motion estimation on the secondary imaging image in combination with the motion parameters obtained by the motion parameter acquisition unit to obtain the motion compensation parameters.
With reference to the sixth possible implementation of the second aspect, in an eighth possible implementation of the second aspect, the motion parameter acquisition unit is configured to obtain a first motion-compensated value directly from an acceleration sensor, this first motion-compensated value being used as the motion parameter;
the motion compensation estimation unit is configured to perform motion estimation on the secondary imaging image to obtain the preliminary motion-compensated value, and the first motion-compensated value obtained by the motion parameter acquisition unit and the acquired preliminary motion-compensated value are weighted and averaged to obtain the motion compensation parameter, calculated specifically by the following formula:

$$M = \alpha \cdot M_1 + (1 - \alpha) \cdot M_2 \tag{5}$$

where M is the motion compensation parameter, M_1 is the preliminary motion-compensated value and M_2 is the first motion-compensated value, with

$$M_1 = \begin{bmatrix} t_x^1 \\ t_y^1 \end{bmatrix}, \quad M_2 = \begin{bmatrix} t_x^2 \\ t_y^2 \end{bmatrix}, \quad t_x^1 = \frac{1}{N}\sum_{i=1}^{N} tx_i, \quad t_y^1 = \frac{1}{N}\sum_{i=1}^{N} ty_i, \quad t_x^2 = \frac{1}{H}\sum_{j=1}^{H} tx_j, \quad t_y^2 = \frac{1}{H}\sum_{j=1}^{H} ty_j,$$

where tx_i and ty_i are the position differences, between two adjacent frames, of the abscissa and ordinate of a secondary image identification point in the secondary imaging image, and N is the number of secondary image identification points per frame; tx_j and ty_j are the position differences, between two adjacent frames, of the abscissa and ordinate of a reference identification point in the reference image, and H is the number of reference identification points per frame; α is an empirical value weighting M_1 against M_2, with 0 ≤ α ≤ 1.
With reference to the seventh possible implementation of the second aspect, in a ninth possible implementation of the second aspect, the motion parameter acquisition unit is configured to obtain the component values u and v, sensed by the acceleration sensor, of the displacement of the projection device along the X-axis and Y-axis directions, u and v being used as the motion parameters;
the compensation parameter acquisition unit is configured to perform motion estimation on the secondary imaging image in combination with the motion parameters obtained by the motion parameter acquisition unit to obtain the motion compensation parameter M, calculated specifically by the following formulas:

$$M = \begin{bmatrix} t_x \\ t_y \end{bmatrix}$$

$$t_x = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \cot\!\left(\beta \cdot \arctan\frac{ty_i}{tx_i} + (1 - \beta) \cdot \arctan\frac{v}{u}\right) \tag{6}$$

$$t_y = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \tan\!\left(\beta \cdot \arctan\frac{ty_i}{tx_i} + (1 - \beta) \cdot \arctan\frac{v}{u}\right) \tag{7}$$

where tx_i and ty_i are the position differences, between two adjacent frames, of the abscissa and ordinate of a secondary image identification point in the secondary imaging image, N is the number of secondary image identification points taken per frame, u and v are the motion parameters, i.e. the displacement components of the projection device in the X and Y directions sensed by the acceleration sensor, and β is an empirical value weighting arctan(ty_i/tx_i) against arctan(v/u), with 0 ≤ β ≤ 1.
With reference to any one of the fifth to ninth possible implementations of the second aspect, in a tenth possible implementation of the second aspect, the motion compensation unit is configured to perform motion compensation on the distortion-corrected image to be projected by the following formula:

$$\begin{bmatrix} x_i^{RM} \\ y_i^{RM} \end{bmatrix} = \begin{bmatrix} x_i^{R} \\ y_i^{R} \end{bmatrix} + \begin{bmatrix} t_x \\ t_y \end{bmatrix}, \quad i = 1, 2, \ldots, n \tag{8}$$

where x_i^R and y_i^R are the abscissa and ordinate of a pixel coordinate of the image to be projected after correction by the distortion factor R, x_i^{RM} and y_i^{RM} are the abscissa and ordinate of that pixel coordinate after counter-motion compensation of the distortion-corrected pixel, and (t_x, t_y) is the motion compensation parameter M.
With reference to any one of the fifth to tenth possible implementations of the second aspect, in an eleventh possible implementation of the second aspect, the motion compensation unit is configured to perform motion compensation on the distortion-corrected reference image by the following formula:

$$\begin{bmatrix} x_k^{RM} \\ y_k^{RM} \end{bmatrix} = \begin{bmatrix} x_k^{R} \\ y_k^{R} \end{bmatrix} + \begin{bmatrix} t_x \\ t_y \end{bmatrix}, \quad k = 1, 2, \ldots, n \tag{9}$$

where x_k^R and y_k^R are the abscissa and ordinate of a pixel coordinate of the reference image after correction by the distortion factor R, x_k^{RM} and y_k^{RM} are the abscissa and ordinate of that pixel coordinate after counter-motion compensation of the distortion-corrected pixel, and (t_x, t_y) is the motion compensation parameter M.
With reference to the second aspect or any one of the fifth to eleventh possible implementations of the second aspect, in a twelfth possible implementation of the second aspect, the image identification point acquisition unit further comprises:
a stereo vision subunit, configured to obtain the three-dimensional parameters in the projected image;
a ninth acquisition subunit, configured to fit a plane to the three-dimensional parameters obtained by the stereo vision subunit and to map the three-dimensional parameters perpendicularly onto the plane to obtain two-dimensional coordinate points, these two-dimensional coordinate points being the secondary image identification points.
With reference to the twelfth possible implementation of the second aspect, in a thirteenth possible implementation of the second aspect, the device further comprises:
a focus adjustment unit, configured to adjust the focal length of the projection device according to the distance D from the projection device to the projection surface, realized specifically by the following formula:

$$f = \frac{D \times s_0}{s_1} \tag{10}$$

where s_0 is the LCD panel size of the projection device, s_1 is an empirical value of the image size, and D is obtained by taking the z values of the three-dimensional coordinates of the N secondary image identification points and averaging them.
In the embodiments of the present invention, the distortion factor is obtained from the position parameters of the secondary image identification points in the secondary imaging image and the position parameters of the reference identification points in the reference image, and the image to be corrected is then corrected with this distortion factor, so that the processed projected image exhibits reduced or even no distortion, thereby improving the environmental adaptability of image projection equipment.
Brief description of the drawings
In order to describe the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present invention; those of ordinary skill in the art may further obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a projected-image-based processing method provided by Embodiment 1 of the present invention;
Fig. 2 is a flowchart of a projected-image-based processing method provided by Embodiment 2 of the present invention;
Fig. 3 is a schematic diagram of implementation details of a projected-image-based processing method in Embodiment 2 of the present invention;
Fig. 4 is a supplementary flowchart of one implementation of the projected-image-based processing method provided by Embodiment 2 of the present invention;
Fig. 5 is a supplementary flowchart of another implementation of the projected-image-based processing method provided by Embodiment 2 of the present invention;
Fig. 6 is a structural block diagram of a projected-image-based processing device provided by Embodiment 3 of the present invention;
Fig. 7 is a detailed block diagram of the projected-image-based processing device provided by Embodiment 3 of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Specific implementations of the present invention are described in detail below with reference to specific embodiments:
As shown in Fig. 1, an embodiment of the present invention provides a projected image processing method comprising the following steps:
S101: obtain a secondary imaging image, formed in an imaging device, of a projected image on a projection surface, where the projected image is the image formed by a projection device projecting a preset reference image onto the projection surface; reference identification points are preset in the reference image and, after projection, form primary image identification points in the projected image; the image points that the primary image identification points form in the secondary imaging image are the secondary image identification points.
In this embodiment of the present invention, the projection device projects the preset reference image onto the projection surface to form the projected image, the imaging device captures the projected image to form the secondary imaging image, and a computing device obtains the secondary imaging image from the imaging device; the preset reference image is stored in advance in the computing device. Reference identification points are preset in the reference image; projected onto the projection surface by the projection device, they form the primary image identification points. In a specific implementation, the reference identification points may be chosen at regular or irregular spacing, and the pattern formed by all the identification points may be a square grid, a rhombic grid, or another regular or irregular shape, depending on the positional relationship between the projection device and the projection surface and on how even the projection surface is. In a specific implementation, a reference identification point is represented by its two-dimensional coordinate value.
S102: obtain the secondary image identification points in the secondary imaging image.
In this embodiment of the present invention, after the secondary imaging image is acquired, the secondary image identification points are extracted from it; a secondary image identification point is the image point formed in the secondary imaging image by a primary image identification point of the projected image. In a specific implementation, what is obtained is the two-dimensional coordinate value of each secondary image identification point.
S103: obtain the distortion factor R according to the change of the position parameters of the secondary image identification points in the secondary imaging image relative to the position parameters of the reference identification points in the reference image.
In this embodiment of the present invention, the computing device takes out the preset reference image stored in advance, extracts the two-dimensional coordinate values of the preset reference identification points and derives their position parameters in the reference image from these coordinate values, and derives the position parameters of the secondary image identification points in the secondary imaging image from the acquired two-dimensional coordinate values of those points. Comparing the position parameters of the reference identification points in the reference image with those of the secondary image identification points in the secondary imaging image yields the deformation coefficient, i.e. the distortion factor R. The deformation here is in fact an affine transformation, so R can be expressed as a 2×3 matrix:

$$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix}$$

Optionally, in a specific implementation, R may be obtained in this embodiment of the present application by the following formulas:

$$\begin{bmatrix} r_{11} & r_{12} \\ r_{21} & r_{22} \end{bmatrix} \begin{bmatrix} x_i \\ y_i \end{bmatrix} + \begin{bmatrix} r_{13} \\ r_{23} \end{bmatrix} = \begin{bmatrix} x_i' \\ y_i' \end{bmatrix}, \quad i = 1, 2, \ldots, n \tag{1}$$

$$\begin{cases} x_i r_{11} + y_i r_{12} + r_{13} = x_i' \\ x_i r_{21} + y_i r_{22} + r_{23} = y_i' \end{cases} \quad i = 1, 2, \ldots, n \tag{2}$$

The two-dimensional coordinate value (x, y) of each secondary image identification point obtained from the secondary imaging image can be transformed by the affine transformation R into the two-dimensional coordinate value (x', y') of the corresponding reference identification point in the preset reference image. In formulas (1) and (2), r11, r12, r13, r21, r22, r23 are six unknowns; (x', y') is known, being a reference identification point in the reference image, and (x, y) is also known, being a secondary image identification point in the secondary imaging image obtained through steps S101 and S102. Taking three pairs of corresponding reference identification points and secondary image identification points through steps S101 and S102 therefore yields six equations; solving these six simultaneous equations gives the six unknowns r11, r12, r13, r21, r22, r23, i.e. the value of R.
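For illustration (hypothetical coordinates; OpenCV and numpy assumed, neither named by the patent): with exactly three non-collinear correspondences the six unknowns are determined uniquely, and cv2.getAffineTransform solves this very system:

```python
import cv2
import numpy as np

# Hypothetical example coordinates: three secondary image
# identification points and their corresponding reference
# identification points.
secondary = np.float32([[120, 80], [410, 95], [130, 300]])
reference = np.float32([[100, 100], [400, 100], [100, 300]])

# Solves the six equations of formula (2) for r11..r23.
R = cv2.getAffineTransform(secondary, reference)
print(R)  # 2x3 matrix [[r11, r12, r13], [r21, r22, r23]]
```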
S104: perform distortion correction on the image to be projected according to the distortion factor R.
In this embodiment of the present invention, the obtained distortion factor R is applied to the image to be projected to correct its distortion. In a specific implementation, n pixels may be taken from the image to be corrected, the coordinate of each pixel being (x_i, y_i); the corresponding pixel in the corrected image is (x_i^R, y_i^R).
Optionally, in a specific implementation, the correction process in this embodiment of the present application can be expressed by formula (3):

$$\begin{bmatrix} x_i^{R} \\ y_i^{R} \end{bmatrix} = R \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix} \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix}, \quad i = 1, 2, \ldots, n \tag{3}$$

where R is now known, (x_i, y_i) is any known coordinate value taken on the image to be projected, and (x_i^R, y_i^R) is the corrected pixel coordinate corresponding to the known coordinate (x_i, y_i).
Optionally, on the basis of the above implementation, this embodiment may further correct the reference image according to the obtained distortion factor R, specifically performing distortion correction on the reference image by the following formula:

$$\begin{bmatrix} x_k^{R} \\ y_k^{R} \end{bmatrix} = R \begin{bmatrix} x_k \\ y_k \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix} \begin{bmatrix} x_k \\ y_k \\ 1 \end{bmatrix}, \quad k = 1, 2, \ldots, n \tag{4}$$

where (x_k^R, y_k^R) is the image pixel coordinate of the reference image after correction by the distortion factor R, and (x_k, y_k) is an original pixel coordinate of the reference image.
In this embodiment of the present invention, the secondary imaging image is obtained, the secondary image identification points are then extracted from it, the position parameters of the secondary image identification points in the secondary imaging image are compared with the position parameters of the reference identification points in the reference image to obtain the distortion factor R, and R is applied to the image to be corrected; pre-correcting the image to be corrected makes the output image distortion-free.
In another embodiment of the present invention, the preset reference image is a reference infrared grid. Using an infrared grid as the reference image allows the projection device to project the reference image onto the projection surface continuously and in real time without disturbing visible-light imaging, so the correction of the reference image and of the image to be projected can also proceed continuously and in real time; the projected image perceived by the human eye is therefore distortion-free throughout.
As shown in Fig. 2, an embodiment of the present invention provides an implementation flow of an image processing method comprising the following steps:
S201: obtain the secondary imaging image, formed in an infrared-filter camera, of the projected image on the projection surface; this secondary imaging image is the secondary infrared grid. The projected image is the image formed by an infrared projection device projecting the preset reference infrared grid onto the projection surface, and is the primary infrared grid. Reference identification points are preset in the reference image; here a reference identification point is specifically an intersection of grid lines of different directions in the reference infrared grid. The reference identification points form primary image identification points in the projected image.
Specifically, the infrared projection device projects the reference infrared grid onto the projection surface to form the projected image, the infrared-filter camera captures the projected image to form the secondary infrared grid, and the computing device obtains the secondary infrared grid from the infrared-filter camera; the preset reference infrared grid is stored in advance in the computing device. The intersections of grid lines of different directions in the reference infrared grid are the reference identification points. In a specific implementation, the shape of the reference infrared grid may be a square grid, a rhombic grid, or another regular or irregular shape, depending on the positional relationship between the infrared projection device and the projection surface and on how even the projection surface is. In a specific implementation, a reference identification point is represented by its two-dimensional coordinate value.
S202: obtain the secondary image identification points in the secondary infrared grid; a secondary image identification point is the image point formed in the acquired secondary infrared grid by a primary image identification point of the projected image. The intersections of grid lines of different directions in the secondary infrared grid of the secondary imaging image are the secondary image identification points. In a specific implementation, what is obtained is the two-dimensional coordinate value of each secondary image identification point.
S203: obtain the distortion factor R according to the change of the position parameters of the secondary image identification points in the secondary imaging image relative to the position parameters of the reference identification points in the reference image.
In this embodiment of the present invention, the computing device takes out the preset reference infrared grid stored in advance and extracts the two-dimensional coordinate values of the preset reference identification points, i.e. the intersections of grid lines of different directions in the reference infrared grid; from these coordinate values it derives their position parameters in the reference infrared grid, and from the acquired two-dimensional coordinate values of the secondary image identification points it derives their position parameters in the secondary infrared grid. Comparing the position parameters of the reference identification points in the reference infrared grid with those of the secondary image identification points in the secondary infrared grid yields the deformation coefficient, i.e. the distortion factor R. The deformation here is in fact an affine transformation, and R can be expressed as a 2×3 matrix:

$$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix}$$

Optionally, in a specific implementation, R may be obtained in this embodiment of the present application by the following formulas:

$$\begin{bmatrix} r_{11} & r_{12} \\ r_{21} & r_{22} \end{bmatrix} \begin{bmatrix} x_i \\ y_i \end{bmatrix} + \begin{bmatrix} r_{13} \\ r_{23} \end{bmatrix} = \begin{bmatrix} x_i' \\ y_i' \end{bmatrix}, \quad i = 1, 2, \ldots, n \tag{5}$$

$$\begin{cases} x_i r_{11} + y_i r_{12} + r_{13} = x_i' \\ x_i r_{21} + y_i r_{22} + r_{23} = y_i' \end{cases} \quad i = 1, 2, \ldots, n \tag{6}$$

The two-dimensional coordinate value (x, y) of each secondary image identification point obtained from the secondary infrared grid can be transformed by the affine transformation R into the two-dimensional coordinate value (x', y') of the corresponding reference identification point in the preset reference infrared grid. In formulas (5) and (6), r11, r12, r13, r21, r22, r23 are six unknowns; (x', y') is known, being a reference identification point in the reference infrared grid, and (x, y) is also known, being a secondary image identification point in the secondary infrared grid obtained through steps S201 and S202. Taking three pairs of corresponding reference identification points and secondary image identification points through steps S201 and S202 therefore yields six equations; solving these six simultaneous equations gives the six unknowns r11, r12, r13, r21, r22, r23, i.e. the value of R.
S204: correct the image to be projected according to the distortion factor R.
In this embodiment of the present invention, the obtained distortion factor R is applied to the image to be projected to correct its distortion. In a specific implementation, n pixels may be taken from the image to be corrected, the coordinate of each pixel being (x_i, y_i); the corresponding pixel in the corrected image is (x_i^R, y_i^R).
Optionally, in a specific implementation, the correction process in this embodiment of the present application can be expressed by formula (7):

$$\begin{bmatrix} x_i^{R} \\ y_i^{R} \end{bmatrix} = R \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix} \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix}, \quad i = 1, 2, \ldots, n \tag{7}$$

where R is now known, (x_i, y_i) is any known coordinate value taken on the known reference infrared grid or on the image to be projected, and (x_i^R, y_i^R) is the corrected pixel coordinate corresponding to the known coordinate (x_i, y_i).
Optionally, on the basis of the above implementation, this embodiment may further correct the reference image according to the obtained distortion factor R, specifically performing distortion correction on the reference image by the following formula:

$$\begin{bmatrix} x_k^{R} \\ y_k^{R} \end{bmatrix} = R \begin{bmatrix} x_k \\ y_k \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix} \begin{bmatrix} x_k \\ y_k \\ 1 \end{bmatrix}, \quad k = 1, 2, \ldots, n \tag{8}$$

where (x_k^R, y_k^R) is the image pixel coordinate of the reference image after correction by the distortion factor R, and (x_k, y_k) is an original pixel coordinate of the reference image.
In this embodiment of the present invention, after the secondary infrared grid is acquired, the secondary image identification points are extracted from it; in a specific implementation, what is obtained may be the two-dimensional coordinate values of the secondary image identification points.
In this embodiment of the present invention, an infrared grid is used as the preset reference image, and the intersections of grid lines of different directions in the infrared grid are taken as the reference identification points. This lets the infrared projection device project the reference image continuously and in real time without disturbing visible-light imaging, so the reference image and the image to be projected can be corrected continuously and in real time; even in a very complex projection environment, the displayed picture can thus be presented without distortion.
Optionally, on the basis of the above embodiments, the method may further comprise:
at least two imaging devices, forming a stereo vision system. As shown in Fig. 3, in this embodiment of the present invention there may be two imaging devices forming a binocular stereo vision system. The secondary imaging images that the two imaging devices form from the captured projected image are passed to the computing device, which processes them by a binocular stereo vision method. In a specific implementation, this processing step is: extract the three-dimensional parameters of the secondary image identification points in the secondary imaging images, fit a plane to the three-dimensional parameters by least squares, and map the three-dimensional parameters perpendicularly onto the fitted plane to obtain the two-dimensional coordinate points of the secondary image identification points on the projection surface, i.e. the secondary image identification points described in this embodiment of the present invention.
Further, the binocular stereo vision system may be a parallel-optical-axis binocular stereo vision system with two cameras for capturing the projected image. The computing device obtains the secondary imaging images from the cameras and extracts the three-dimensional parameters of the secondary image identification points, i.e. their three-dimensional coordinate values, as follows. Let the distance between the two camera lens centres be e and the focal length of both cameras be f. Take the midpoint of the line joining the two lens centres as the origin, the direction from the lens centre of camera 1 to the lens centre of camera 2 as the X axis, and the direction parallel to the camera optical axes as the Z axis, with the XYZ axes obeying the right-hand rule. In this coordinate system the coordinate value of a point (x, y, z) can be calculated by the following formulas:

$$z = \frac{e \cdot f}{x_2 - x_1} \tag{9}$$

$$x = \frac{e}{2} \cdot \frac{x_1 + x_2}{x_1 - x_2} \tag{10}$$

$$y = \frac{e}{2} \cdot \frac{y_1 + y_2}{y_1 - y_2} \tag{11}$$

where (x_1, y_1) and (x_2, y_2) are the image positions of (x, y, z) on the imaging planes of camera 1 and camera 2 respectively. In this way the positions of all secondary image identification points in three-dimensional space can be obtained; supposing there are N secondary image identification points, their three-dimensional coordinate values are {(x_1, y_1, z_1), (x_2, y_2, z_2), ..., (x_N, y_N, z_N)}.
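A direct transcription of formulas (9)-(11) as printed, for illustration (our naming):

```python
import numpy as np

def triangulate(p1, p2, e, f):
    """Formulas (9)-(11): recover (x, y, z) of an identification point
    from its image positions (x1, y1) in camera 1 and (x2, y2) in
    camera 2 of the parallel-optical-axis binocular rig.
    e: distance between the lens centres; f: common focal length."""
    (x1, y1), (x2, y2) = p1, p2
    z = e * f / (x2 - x1)
    x = 0.5 * e * (x1 + x2) / (x1 - x2)
    y = 0.5 * e * (y1 + y2) / (y1 - y2)
    return np.array([x, y, z])
```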
A plane can then be fitted to $\{(x_1, y_1, z_1), (x_2, y_2, z_2), \ldots, (x_N, y_N, z_N)\}$ by least squares:
$Ax + By + Cz + D = 0$ (12)
where $C \neq 0$: since the parallel optical axes define the Z-axis, the projection surface cannot be parallel to the Z-axis. Writing $a_0 = -\frac{A}{C}$, $a_1 = -\frac{B}{C}$, $a_2 = -\frac{D}{C}$, formula (12) can be rewritten as:
$z = a_0 x + a_1 y + a_2$ (13)
where $a_0$, $a_1$, $a_2$ are the parameters to be solved. Least squares requires the value of the following expression to be minimal:
$S = \sum_{i=1}^{N} (a_0 x_i + a_1 y_i + a_2 - z_i)^2$ (14)
For S to be minimal, the parameters must satisfy:
$\frac{\partial S}{\partial a_k} = 0, \quad k = 0, 1, 2$ (15)
That is:
$\sum_{i=1}^{N} 2(a_0 x_i + a_1 y_i + a_2 - z_i) x_i = 0$, $\sum_{i=1}^{N} 2(a_0 x_i + a_1 y_i + a_2 - z_i) y_i = 0$, $\sum_{i=1}^{N} 2(a_0 x_i + a_1 y_i + a_2 - z_i) = 0$ (16)
Rearranging the above gives:
$\begin{pmatrix} a_0 \\ a_1 \\ a_2 \end{pmatrix} = \begin{pmatrix} \sum_{i=1}^{N} x_i^2 & \sum_{i=1}^{N} x_i y_i & \sum_{i=1}^{N} x_i \\ \sum_{i=1}^{N} x_i y_i & \sum_{i=1}^{N} y_i^2 & \sum_{i=1}^{N} y_i \\ \sum_{i=1}^{N} x_i & \sum_{i=1}^{N} y_i & N \end{pmatrix}^{-1} \begin{pmatrix} \sum_{i=1}^{N} x_i z_i \\ \sum_{i=1}^{N} y_i z_i \\ \sum_{i=1}^{N} z_i \end{pmatrix}$ (17)
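Solving the normal equations of formula (17) is a few lines in practice; the sketch below is a hedged illustration (NumPy and the function name are assumptions):

```python
import numpy as np

def fit_plane_least_squares(points):
    """Fit z = a0*x + a1*y + a2 to N points given as an (N, 3) array,
    by solving the normal equations of formula (17)."""
    pts = np.asarray(points, dtype=float)
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    # (A^T A) a = A^T z is exactly the 3x3 system of formula (17)
    a0, a1, a2 = np.linalg.solve(A.T @ A, A.T @ z)
    return a0, a1, a2
```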
In this way the space plane expressed by formula (12) is obtained. Projecting $\{(x_1, y_1, z_1), (x_2, y_2, z_2), \ldots, (x_N, y_N, z_N)\}$ perpendicularly onto this fitted plane then yields their two-dimensional coordinates in the plane, which are the coordinate values of the secondary image identification points in the secondary imaging image. In a specific implementation, obtaining the two-dimensional coordinate values after perpendicular projection of the above three-dimensional coordinate values can be realized by the following computation:
Since the fitted plane $Ax + By + Cz + D = 0$ is known from the formulas above, the three-dimensional coordinates $\{(x_t^1, y_t^1, z_t^1), (x_t^2, y_t^2, z_t^2), \ldots, (x_t^N, y_t^N, z_t^N)\}$ of the perpendicular projections onto this plane can be obtained:
$x_t^i = x_i - A\delta, \quad y_t^i = y_i - B\delta, \quad z_t^i = z_i - C\delta$ (18)
where $\delta = \frac{A x_i + B y_i + C z_i + D}{A^2 + B^2 + C^2}$.
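A hedged sketch of the perpendicular projection of formula (18); the function name and NumPy usage are assumptions:

```python
import numpy as np

def project_onto_plane(points, A, B, C, D):
    """Perpendicular projection of (N, 3) points onto the plane
    Ax + By + Cz + D = 0, per formula (18)."""
    pts = np.asarray(points, dtype=float)
    n = np.array([A, B, C], dtype=float)
    delta = (pts @ n + D) / (n @ n)      # per-point delta of formula (18)
    return pts - delta[:, None] * n      # (x - A*delta, y - B*delta, z - C*delta)
```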
After the $(x_t^i, y_t^i, z_t^i)$ are obtained, mapping them to two-dimensional coordinates requires determining the coordinate origin and the X-axis direction (the Y-axis passes through the origin perpendicular to the X-axis). Optionally, in a specific implementation, the origin and the X-axis direction can be obtained as follows:
Specific markers are placed in the preset reference image so that the marker points are easy to detect. In a specific implementation, taking the case where the reference image is a grid image, two asterisk-shaped ("米"-shaped) markers are placed in the grid image, as shown in Fig. 6, and the grid image with these markers is projected; the two marked grid points are easy to detect in the image. In subsequent calculations the middle "米"-shaped marker point serves as the origin of the coordinate system, and the direction from this point toward the other "米"-shaped marker point serves as the X-axis direction.
Suppose the two "米"-shaped marker points, after computation through formulas (9)-(11) and (18), have known three-dimensional coordinate values. Take the middle marker point as the origin, the direction from it toward the other marker point as the X-axis, and the Y-axis through the origin perpendicular to the X-axis, with the unit length the same as in the original three-dimensional coordinate system; this establishes the two-dimensional coordinate system. The points $(x_t^i, y_t^i, z_t^i)$ are then mapped into this two-dimensional coordinate system to obtain the coordinates $(x_i', y_i')$:
$x_i' = l_1 \cdot \cos\theta, \quad y_i' = l_1 \cdot \sin\theta, \quad i = 1, 2, \ldots, N$ (19)
where:
$\cos\theta = \frac{l_1^2 + l_2^2 - l_3^2}{2 \, l_1 l_2}$ (20)
$l_1$ is the distance from the three-dimensional coordinate point $(x_t^i, y_t^i, z_t^i)$ in the fitted plane to the origin of the established two-dimensional coordinate system;
$l_2$ is the distance from the other "米"-shaped marker point, the one outside the origin of the established two-dimensional coordinate system, to that origin;
$l_3$ is the distance from the three-dimensional coordinate point $(x_t^i, y_t^i, z_t^i)$ in the fitted plane to that other "米"-shaped marker point;
and $\theta$ is the angle, with the origin as vertex, between the straight line from the origin to $(x_t^i, y_t^i, z_t^i)$ and the straight line from the origin to the other "米"-shaped marker point.
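A hedged sketch of this mapping, under the law-of-cosines reading of formulas (19)-(20) given above; note that an arccos yields $\theta$ in $[0, \pi]$, so the sign of $y_i'$ for points below the X-axis is not determined by $l_1$, $l_2$, $l_3$ alone. Function name and NumPy are assumptions:

```python
import numpy as np

def to_plane_coordinates(p, origin, x_marker):
    """Map a point p lying in the fitted plane into the 2D system whose
    origin is the middle marker and whose X-axis points toward the
    second marker, per formulas (19)-(20)."""
    l1 = np.linalg.norm(p - origin)          # point to origin
    l2 = np.linalg.norm(x_marker - origin)   # second marker to origin
    l3 = np.linalg.norm(p - x_marker)        # point to second marker
    cos_theta = (l1**2 + l2**2 - l3**2) / (2 * l1 * l2)   # formula (20)
    theta = np.arccos(np.clip(cos_theta, -1.0, 1.0))
    # arccos gives theta in [0, pi]; the sign of y' is not recoverable
    # from the three distances alone
    return l1 * np.cos(theta), l1 * np.sin(theta)         # formula (19)
```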
In the embodiment of the present invention, considering the complexity of the projection environment, the projection surface may be uneven and thus present a three-dimensional state. In that case the projection surface presented in a three-dimensional state must be fitted to a two-dimensional plane before the final two-dimensional coordinate values of the secondary image identification points can be obtained; the distortion factor R extracted on this basis is used to correct the reference image and the image to be projected, so that the projected image remains undistorted even in a complicated projection environment.
Optionally, in the embodiment of the present invention, the at least two imaging devices may also be at least two camera heads fitted with infrared filters.
On the basis of all the above embodiments of the present invention, optionally, as shown in Fig. 4, the scheme may further comprise the following implementation, with concrete steps as follows:
S301: obtain the kinematic parameters of the projection device at the time the reference image is projected onto the projection surface to form the projected image.
In a specific implementation, the kinematic parameters can be obtained by an acceleration sensor; this acceleration sensor should be adjacent and rigidly connected to the projection device, so that it accurately captures the motion of the projection device and yields the most accurate kinematic parameters. In a specific implementation, the kinematic parameter may be a motion-compensated value obtained directly from the acceleration sensor, or motion direction information obtained from the acceleration sensor.
S302: obtain the motion compensation parameters according to the kinematic parameters; this can be realized by method S302a or S302b, as shown in Fig. 5:
S302a: perform motion estimation on the secondary imaging image to obtain a preliminary motion-compensated value, and combine the kinematic parameter with the preliminary motion-compensated value to obtain the motion compensation parameter M.
When the kinematic parameter is the first motion-compensated value $M_2$ obtained directly from the acceleration sensor, motion estimation is performed on the secondary imaging image as follows: take two adjacent frames of the secondary imaging image as the standard frame and the current frame, take the coordinate values of corresponding pixel points in the standard frame and the current frame, compute the coordinate position differences of these corresponding points, and finally average all the position differences to obtain the preliminary motion-compensated value $M_1$; the kinematic parameter $M_2$ is then combined with the preliminary motion-compensated value $M_1$ to obtain the motion compensation parameter M.
Optionally, in the embodiment of the present invention, the method of combining the kinematic parameter $M_2$ with the preliminary motion-compensated value $M_1$ to obtain the motion compensation parameter M may be: take the weighted average of the first motion-compensated value $M_2$ and the preliminary motion-compensated value $M_1$, realized by the following formula:
$M = \alpha \cdot M_1 + (1 - \alpha) \cdot M_2$ (21)
where $M_1 = \begin{pmatrix} t_{x1} \\ t_{y1} \end{pmatrix}$ and $M_2 = \begin{pmatrix} t_{x2} \\ t_{y2} \end{pmatrix}$, with $t_{x1} = \frac{1}{N} \sum_{i=1}^{N} tx_i$ and $t_{y1} = \frac{1}{N} \sum_{i=1}^{N} ty_i$, where $(tx_i, ty_i)$ is the position difference of a secondary image identification point in the secondary imaging image between two adjacent frames and N is the number of secondary image identification points taken in each frame; and $t_{x2} = \frac{1}{H} \sum_{j=1}^{H} tx_j$ and $t_{y2} = \frac{1}{H} \sum_{j=1}^{H} ty_j$, where $(tx_j, ty_j)$ is the position difference of a reference identification point in the reference image between two adjacent frames and H is the number of reference identification points taken in each frame; α is an empirical value weighting $M_1$ against $M_2$, with range 0 ≤ α ≤ 1.
Optionally, the embodiment of the present invention may also proceed as follows:
S302b: perform motion estimation on the secondary imaging image in combination with the kinematic parameter to obtain the motion compensation parameter M.
When the kinematic parameter is the motion direction information obtained from the acceleration sensor, in a specific implementation this motion direction information is the component values (u, v) of the displacement of the projection device in the X and Y directions sensed by the acceleration sensor; motion estimation is performed on the secondary imaging image in combination with this motion direction information to obtain the motion compensation parameter M.
Optionally, in the embodiment of the present invention, performing motion estimation on the secondary imaging image in combination with this motion direction information to obtain the motion compensation parameter M can be realized by the following formulas.
Take the component values of the acceleration sensor in the X and Y directions, denoted (u, v); the angle between this displacement and the X direction is:
$\theta_{uv} = \arctan\left(\frac{v}{u}\right)$ (22)
Suppose there are N grid points in the secondary imaging image, i.e. N secondary image identification points, and the displacement of each identification point, obtained from its position difference between two adjacent frames, is $(tx_i, ty_i)$, $i = 1, 2, \ldots, N$. The angle $\theta_i$ between the displacement of each identification point and the X direction, and its magnitude $L_i$, are respectively:
$\theta_i = \arctan\left(\frac{ty_i}{tx_i}\right)$ (23)
$L_i = \sqrt{(tx_i)^2 + (ty_i)^2}$ (24)
The displacement direction of each grid point corrected by the acceleration sensor is:
$\theta_i' = \alpha \cdot \theta_i + (1 - \alpha) \cdot \theta_{uv}$ (25)
where α is an empirical value, for example α = 0.5.
The components of the final motion compensation M in the X and Y directions are:
$t_x = \sum_{i=1}^{N} L_i \cdot \cot(\theta_i')$ (26)
$t_y = \sum_{i=1}^{N} L_i \cdot \tan(\theta_i')$ (27)
That is:
$t_x = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \cot\left(\alpha \cdot \arctan\left(\frac{ty_i}{tx_i}\right) + (1 - \alpha) \cdot \arctan\left(\frac{v}{u}\right)\right)$ (28)
$t_y = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \tan\left(\alpha \cdot \arctan\left(\frac{ty_i}{tx_i}\right) + (1 - \alpha) \cdot \arctan\left(\frac{v}{u}\right)\right)$ (29)
$M = \begin{pmatrix} t_x \\ t_y \end{pmatrix}$
where $(tx_i, ty_i)$ is the position difference of a secondary image identification point in the secondary imaging image between two adjacent frames, N is the number of secondary image identification points taken in each frame, (u, v) is the kinematic parameter, i.e. the component values of the displacement of the projection device in the X and Y directions sensed by the acceleration sensor, and α is an empirical value weighting $\arctan\left(\frac{ty_i}{tx_i}\right)$ against $\arctan\left(\frac{v}{u}\right)$, with range 0 ≤ α ≤ 1.
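The following sketch evaluates formulas (22)-(29) as printed, with cot and tan; the conventional decomposition of a displacement of magnitude $L_i$ at angle $\theta_i'$ would use cos and sin instead. NumPy and the function name are assumptions:

```python
import numpy as np

def direction_corrected_compensation(diffs, u, v, alpha=0.5):
    """Motion compensation of formulas (22)-(29).

    diffs : (N, 2) frame-to-frame displacements (tx_i, ty_i) of the
            secondary image identification points
    u, v  : X/Y displacement components from the acceleration sensor
    Assumes u != 0 and tx_i != 0, as the printed arctan form does.
    """
    d = np.asarray(diffs, dtype=float)
    theta_uv = np.arctan(v / u)                         # formula (22)
    tx, ty = d[:, 0], d[:, 1]
    theta = np.arctan(ty / tx)                          # formula (23)
    L = np.sqrt(tx**2 + ty**2)                          # formula (24)
    theta_c = alpha * theta + (1 - alpha) * theta_uv    # formula (25)
    t_x = np.sum(L / np.tan(theta_c))                   # formula (26), cot as printed
    t_y = np.sum(L * np.tan(theta_c))                   # formula (27)
    return np.array([t_x, t_y])                         # M
```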
S303: perform counter motion compensation on the distortion-corrected image according to the motion compensation parameter M; the distortion-corrected image comprises the corrected image to be projected, or the corrected reference image.
Optionally, in the embodiment of the present invention, counter motion compensation of the distortion-corrected image to be projected can be realized by the following formula:
$\begin{pmatrix} x_i^{RM} \\ y_i^{RM} \end{pmatrix} = \begin{pmatrix} x_i^{R} \\ y_i^{R} \end{pmatrix} + \begin{pmatrix} t_x \\ t_y \end{pmatrix}, \quad i = 1, 2, \ldots, n$ (30)
where $(x_i^{R}, y_i^{R})$ are the corrected pixel coordinates of the image to be projected and $(x_i^{RM}, y_i^{RM})$ are the image pixel coordinates after motion compensation.
Optionally, in the embodiment of the present invention, counter motion compensation of the distortion-corrected reference image can be realized by the following formula:
$\begin{pmatrix} x_k^{RM} \\ y_k^{RM} \end{pmatrix} = \begin{pmatrix} x_k^{R} \\ y_k^{R} \end{pmatrix} + \begin{pmatrix} t_x \\ t_y \end{pmatrix}, \quad k = 1, 2, \ldots, n$ (31)
where $(x_k^{R}, y_k^{R})$ are the image pixel coordinates of the reference image after correction according to the distortion factor R, $(x_k^{RM}, y_k^{RM})$ are the image pixel coordinates after counter motion compensation of the distortion-corrected pixels, and $(t_x, t_y)$ is the motion compensation parameter M.
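Formulas (30)-(31) amount to a single translation of every corrected pixel coordinate; a hedged one-line sketch (names are assumptions):

```python
import numpy as np

def counter_motion_compensate(coords_R, M):
    """Shift (n, 2) distortion-corrected pixel coordinates by the
    motion compensation parameters (t_x, t_y), per formulas (30)-(31)."""
    return np.asarray(coords_R) + np.asarray(M)
```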
In the embodiment of the present invention, considering the complexity of the projection environment, the projection device may move or shake, causing the projected picture to jitter. The embodiment of the present invention combines the acceleration sensor with motion estimation to obtain the most accurate motion compensation parameter M and performs counter motion compensation on the distortion-corrected reference image and image to be projected, thereby guaranteeing that the output image does not jitter even in a complicated projection environment.
On the basis of all the above embodiments of the present invention, optionally, the embodiment of the present invention may further comprise:
projecting the distortion-corrected image, or the distortion-corrected and counter-motion-compensated image, onto the projection surface; this comprises adjusting the focal length of the projection device according to the distance D from the projection device to the projection surface, so that the size of the image the projection device projects onto the projection surface meets optimal visual reception, realized by the following formula:
$f = \frac{D \times s_0}{s_1}$ (32)
where $s_0$ is the LCD panel size of the projection device, $s_1$ is an empirical value of the image size, and the distance D is obtained by taking the z values in the three-dimensional coordinate values of the N secondary image identification points and averaging them.
The three-dimensional coordinate values of the secondary image identification points are obtained by the stereo vision system.
In a specific implementation there are at least two imaging devices forming a stereo vision system; as shown in Fig. 3, in the embodiment of the present invention two imaging devices may form a binocular stereo vision system. The two imaging devices pass the secondary imaging images formed from the captured projected image to a computing device, which processes the secondary imaging images from the two imaging devices by a binocular stereo vision method; in a specific implementation the processing step is: extract the three-dimensional parameters of the secondary image identification points in the secondary imaging image.
Further, the binocular stereo vision system may be one with parallel optical axes, containing two cameras for capturing the projected image. The computing device obtains the secondary imaging images from the cameras and extracts the three-dimensional parameters of the secondary image identification points, i.e. their three-dimensional coordinate values, as follows. Let the distance between the two lens centres be e and the focal length of both cameras be f; take the midpoint of the line joining the two lens centres as the origin, the direction from the lens centre of camera 1 toward that of camera 2 as the X-axis, and the parallel optical axis direction as the Z-axis, with the XYZ axes satisfying the right-hand rule. In this coordinate system the coordinate value of a point (x, y, z) can be calculated by the formulas below:
$z = \frac{e \cdot f}{x_2 - x_1}$ (33)
$x = \frac{e}{2} \cdot \frac{x_1 + x_2}{x_1 - x_2}$ (34)
$y = \frac{e}{2} \cdot \frac{y_1 + y_2}{y_1 - y_2}$ (35)
where $(x_1, y_1)$ and $(x_2, y_2)$ are the image positions of (x, y, z) on the imaging planes of camera 1 and camera 2 in the stereo vision subunit 302a respectively. In this way the positions of all secondary image identification points in three-dimensional space can be obtained; suppose there are N secondary image identification points with coordinates $\{(x_1, y_1, z_1), (x_2, y_2, z_2), \ldots, (x_N, y_N, z_N)\}$.
From the Z coordinates of these points the approximate distance from the projection surface to the projector can be estimated:
$d = \frac{1}{N} \sum_{i=1}^{N} z_i$ (36)
The focal length of the visible-light projector is adjusted automatically according to the distance d, so that the size of the image the visible-light projector projects onto the projection surface is suitable for viewing. The following table gives empirically obtained reference values of projection distance versus projected image size.
Table 1. Reference values of projection distance and projected image size
Given the projection distance d (unit: metres), the focal length f that the projector needs to set (unit: metres) can be obtained from the formula below:
$f = \frac{d \times s_0}{s_1}$ (37)
where $s_0$ is the LCD chip size of the projector (unit: inches) and $s_1$ is the projected image size obtained from table 1 according to the distance d (unit: inches). The value of f is limited by the hardware and must lie within $[f_{min}, f_{max}]$; the actual value is $\max(\min(f, f_{max}), f_{min})$. To avoid the focal length changing constantly with varying distance, it can be stipulated that a new focal length is calculated and set only when the absolute value of the difference between the current distance and the distance at the previous focal adjustment exceeds a given threshold (for example 0.5 metres).
In the embodiment of the present invention, on the basis of the embodiments above, the Z coordinate values of the secondary image identification points in three-dimensional space are averaged to obtain the distance d between the projection surface and the imaging devices, and the focal length of the projection device is adjusted automatically according to d, so that the size of the image projected onto the projection surface achieves the best viewing effect.
Embodiment 2
The embodiment of the present invention provides a processing device based on a projected image; as shown in Fig. 6, the device comprises:
an imaging acquiring unit 301, configured to obtain the secondary imaging image formed in the imaging device by the projected image on the projection surface, the projected image being the image formed by the projection device projecting the preset reference image onto the projection surface; reference identification points are set in the preset reference image, and after projection the reference identification points form primary image identification points in the projected image; the picture points formed by the primary image identification points in the secondary imaging image are the secondary image identification points;
an image identification point acquiring unit 302, configured to obtain the secondary image identification points in the secondary imaging image;
a distortion factor acquiring unit 303, configured to obtain the distortion factor according to the variation of the location parameters of the secondary image identification points relative to the location parameters of the reference identification points; the location parameters of the secondary image identification points are their location parameters in the secondary imaging image, and the location parameters of the reference identification points are their location parameters in the reference image;
an image-to-be-projected amending unit 304, configured to perform distortion correction on the image to be projected according to the distortion factor R.
In the embodiment of the present invention, the projection device projects the preset reference image onto the projection surface to form the projected image, the imaging device captures the projected image to form the secondary imaging image, and the imaging acquiring unit 301 obtains the secondary imaging image from the imaging device; the preset reference image is stored in advance in the imaging acquiring unit 301. Reference identification points are set in the preset reference image, and the projection device projects them onto the projection surface to form primary image identification points. In a specific implementation, the reference identification points may be chosen at a regular spacing or not; the pattern formed by all the identification points may be a square grid, a rhombic grid, or another regular or irregular shape, determined by the positional relationship between the projection device and the projection surface and by how even the projection surface is. In a specific implementation, the reference identification points are given by their two-dimensional coordinate values.
After the imaging acquiring unit 301 obtains the secondary imaging image, the image identification point acquiring unit 302 obtains the secondary image identification points from it; the secondary image identification points are the picture points formed in the secondary imaging image by the primary image identification points in the projected image. In a specific implementation, what is obtained is the two-dimensional coordinate values of the secondary image identification points.
The distortion factor acquiring unit 303 takes the pre-stored preset reference image, extracts the two-dimensional coordinate values of the preset reference identification points, and obtains their location parameters in the reference image from these coordinate values; from the two-dimensional coordinate values of the secondary image identification points obtained by the image identification point acquiring unit 302 it obtains their location parameters in the secondary imaging image. The distortion factor acquiring unit 303 compares the location parameters of the reference identification points in the reference image with those of the secondary image identification points in the secondary imaging image, thereby obtaining the deformation coefficient, i.e. the distortion factor R.
Specifically, the distortion factor acquiring unit 303 can be configured to calculate the distortion factor by the following formulas:
$R = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{pmatrix}$
$\begin{pmatrix} r_{11} & r_{12} \\ r_{21} & r_{22} \end{pmatrix} \begin{pmatrix} x_i \\ y_i \end{pmatrix} + \begin{pmatrix} r_{13} \\ r_{23} \end{pmatrix} = \begin{pmatrix} x_i' \\ y_i' \end{pmatrix}, \quad i = 1, 2, \ldots, n$ (1)
$\begin{cases} x_i r_{11} + y_i r_{12} + r_{13} = x_i' \\ x_i r_{21} + y_i r_{22} + r_{23} = y_i' \end{cases}, \quad i = 1, 2, \ldots, n$ (2)
where R is the distortion factor and $\begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{pmatrix}$ is the matrix the reference image forms in the affine transformation, with $r_{11}, r_{12}, r_{13}, r_{21}, r_{22}, r_{23}$ the parameters of the affine transformation; $x_i$ and $y_i$ are the abscissa and ordinate values of a secondary image identification point in the secondary imaging image, $(x_i, y_i)$ being the coordinate value of the secondary image identification point; $x_i'$ and $y_i'$ are the abscissa and ordinate values of the corresponding reference identification point in the reference image, $(x_i', y_i')$ being the coordinate value of the reference identification point. The affine transformation applies the linear transformation given by $r_{11}, r_{12}, r_{21}, r_{22}$ to the secondary image identification point $(x_i, y_i)$ and then the translation parameters $r_{13}, r_{23}$, transforming it to the reference identification point $(x_i', y_i')$. Formula (2) is the system of linear equations obtained by rewriting formula (1); the affine transformation distortion factor R is calculated from formula (2).
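In practice the over-determined system (2) is solved over all n point pairs; the following NumPy sketch is a hedged illustration (the function name and least-squares solver choice are assumptions):

```python
import numpy as np

def estimate_distortion_factor(sec_pts, ref_pts):
    """Solve system (2) for r11..r23, yielding the distortion factor R.

    sec_pts : (n, 2) secondary image identification points (x_i, y_i)
    ref_pts : (n, 2) matching reference identification points (x_i', y_i')
    """
    sec = np.asarray(sec_pts, dtype=float)
    ref = np.asarray(ref_pts, dtype=float)
    n = len(sec)
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = sec
    A[0::2, 2] = 1.0              # x_i*r11 + y_i*r12 + r13 = x_i'
    A[1::2, 3:5] = sec
    A[1::2, 5] = 1.0              # x_i*r21 + y_i*r22 + r23 = y_i'
    b = ref.reshape(-1)           # (x_1', y_1', x_2', y_2', ...)
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params.reshape(2, 3)   # R = [[r11, r12, r13], [r21, r22, r23]]
```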
Correspondingly, the image-to-be-projected amending unit 304 can be configured to perform distortion correction on the image to be projected by the following formula:
$\begin{pmatrix} x_i^R \\ y_i^R \end{pmatrix} = R \begin{pmatrix} x_i \\ y_i \\ 1 \end{pmatrix} = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{pmatrix} \begin{pmatrix} x_i \\ y_i \\ 1 \end{pmatrix}, \quad i = 1, 2, \ldots, n$ (3)
where R is the distortion factor, $x_i^R$ and $y_i^R$ are the abscissa and ordinate values of the corrected image pixel coordinate point $(x_i^R, y_i^R)$ of the image to be projected according to the distortion factor R, and $x_i$ and $y_i$ are the abscissa and ordinate values of the original pixel coordinate point $(x_i, y_i)$ of the image to be projected.
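Applying formula (3) to all pixel coordinates at once can be written as a hedged sketch (names are assumptions):

```python
import numpy as np

def apply_distortion_correction(R, pts):
    """Map (n, 2) original pixel coordinates through formula (3):
    [x_R, y_R]^T = R * [x, y, 1]^T."""
    pts = np.asarray(pts, dtype=float)
    homog = np.column_stack([pts, np.ones(len(pts))])   # (n, 3)
    return homog @ R.T                                  # (n, 2) corrected
```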
On the basis of the device in the above embodiment, the device may further comprise a reference image amending unit 305, configured to correct the reference image according to the distortion factor R; the reference image amending unit is configured to perform distortion correction on the reference image by the following formula:
$\begin{pmatrix} x_k^R \\ y_k^R \end{pmatrix} = R \begin{pmatrix} x_k \\ y_k \\ 1 \end{pmatrix} = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{pmatrix} \begin{pmatrix} x_k \\ y_k \\ 1 \end{pmatrix}, \quad k = 1, 2, \ldots, n$ (4)
where R is the distortion factor, $x_k^R$ and $y_k^R$ are the abscissa and ordinate values of the corrected image pixel coordinate point $(x_k^R, y_k^R)$ of the reference image according to the distortion factor R, and $x_k$ and $y_k$ are the abscissa and ordinate values of the original pixel coordinate point $(x_k, y_k)$ of the reference image.
As an optional implementation, the imaging acquiring unit 301 is configured to obtain the secondary infrared grid lines formed in the infrared filter camera by the projected image on the projection surface;
the image identification point acquiring unit 302 is configured to obtain the intersections of grid lines running in different directions in the secondary infrared grid lines, these intersections being the secondary image identification points.
Because in a concrete application the image-to-be-projected amending unit 304 and the reference image amending unit may need to correct the image continuously in real time, in the device of the embodiment of the present invention the preset reference image is reference infrared grid lines and the preset reference identification points are the intersections of grid lines running in different directions in the reference infrared grid lines. Using infrared grid lines as the reference image allows the projection device to project the reference image onto the projection surface continuously in real time without disturbing visible-light imaging; this guarantees that the correction of the reference image and the image to be projected is also carried out continuously in real time, so that the projected image recognized by the human eye is a distortion-free image throughout.
Since in a real scene the projection plane may be uneven, the secondary image identification points obtained by the image identification point acquiring unit 302 may then be inaccurate. Therefore, as shown in Fig. 7, the image identification point acquiring unit 302 may further include: a stereo vision subunit 302a, configured to obtain the three-dimensional parameters of the projected image;
an image identification point acquiring subunit 302b, configured to fit a plane to the three-dimensional parameters obtained by the stereo vision subunit 302a and map the three-dimensional parameters perpendicularly onto the plane to obtain two-dimensional coordinate points, the two-dimensional coordinate points being the secondary image identification points.
The stereo vision subunit 302a contains at least two imaging devices; as shown in Fig. 3, in the embodiment of the present invention two imaging devices may form a binocular stereo vision system. The imaging acquiring unit 301 passes the secondary imaging images formed from the captured projected image to the image identification point acquiring unit 302, and the stereo vision subunit 302a processes the secondary imaging images from the two imaging devices by a binocular stereo vision method. In a specific implementation, the processing steps are: extract the three-dimensional parameters of the secondary image identification points in the secondary imaging image, fit a plane to the three-dimensional parameters by least squares, and map the three-dimensional parameters perpendicularly onto the fitted plane; the resulting points, with two-dimensional coordinates in the projection surface, are the secondary image identification points described in the embodiment of the present invention.
Further, the binocular stereo vision system may be one with parallel optical axes, containing two cameras for capturing the projected image. The computing device obtains the secondary imaging images from the cameras and extracts the three-dimensional parameters of the secondary image identification points, i.e. their three-dimensional coordinate values, as follows. Let the distance between the two lens centres be e and the focal length of both cameras be f; take the midpoint of the line joining the two lens centres as the origin, the direction from the lens centre of camera 1 toward that of camera 2 as the X-axis, and the parallel optical axis direction as the Z-axis, with the XYZ axes satisfying the right-hand rule. In this coordinate system the coordinate value of a point (x, y, z) can be calculated by the formulas below:
$z = \frac{e \cdot f}{x_2 - x_1}$ (5)
$x = \frac{e}{2} \cdot \frac{x_1 + x_2}{x_1 - x_2}$ (6)
$y = \frac{e}{2} \cdot \frac{y_1 + y_2}{y_1 - y_2}$ (7)
where $(x_1, y_1)$ and $(x_2, y_2)$ are the image positions of (x, y, z) on the imaging planes of camera 1 and camera 2 respectively. In this way the positions of all secondary image identification points in three-dimensional space can be obtained; suppose there are N secondary image identification points with three-dimensional coordinate values $\{(x_1, y_1, z_1), (x_2, y_2, z_2), \ldots, (x_N, y_N, z_N)\}$.
Then a plane can be fitted to $\{(x_1, y_1, z_1), (x_2, y_2, z_2), \ldots, (x_N, y_N, z_N)\}$ by least squares:
$Ax + By + Cz + D = 0$ (8)
where $C \neq 0$: since the parallel optical axes define the Z-axis, the projection surface cannot be parallel to the Z-axis. Writing $a_0 = -\frac{A}{C}$, $a_1 = -\frac{B}{C}$, $a_2 = -\frac{D}{C}$, formula (8) can be rewritten as:
$z = a_0 x + a_1 y + a_2$ (9)
where $a_0$, $a_1$, $a_2$ are the parameters to be solved. Least squares requires the value of the following expression to be minimal:
$S = \sum_{i=1}^{N} (a_0 x_i + a_1 y_i + a_2 - z_i)^2$ (10)
For S to be minimal, the parameters must satisfy:
$\frac{\partial S}{\partial a_k} = 0, \quad k = 0, 1, 2$ (11)
That is:
$\sum_{i=1}^{N} 2(a_0 x_i + a_1 y_i + a_2 - z_i) x_i = 0$, $\sum_{i=1}^{N} 2(a_0 x_i + a_1 y_i + a_2 - z_i) y_i = 0$, $\sum_{i=1}^{N} 2(a_0 x_i + a_1 y_i + a_2 - z_i) = 0$ (12)
Rearranging the above gives:
$\begin{pmatrix} a_0 \\ a_1 \\ a_2 \end{pmatrix} = \begin{pmatrix} \sum_{i=1}^{N} x_i^2 & \sum_{i=1}^{N} x_i y_i & \sum_{i=1}^{N} x_i \\ \sum_{i=1}^{N} x_i y_i & \sum_{i=1}^{N} y_i^2 & \sum_{i=1}^{N} y_i \\ \sum_{i=1}^{N} x_i & \sum_{i=1}^{N} y_i & N \end{pmatrix}^{-1} \begin{pmatrix} \sum_{i=1}^{N} x_i z_i \\ \sum_{i=1}^{N} y_i z_i \\ \sum_{i=1}^{N} z_i \end{pmatrix}$ (13)
In this way the space plane expressed by formula (9) is obtained. Projecting $\{(x_1, y_1, z_1), (x_2, y_2, z_2), \ldots, (x_N, y_N, z_N)\}$ perpendicularly onto this fitted plane then yields their two-dimensional coordinates in the plane, which are the coordinate values of the secondary image identification points in the secondary imaging image. In a specific implementation, obtaining the two-dimensional coordinate values after perpendicular projection of the above three-dimensional coordinate values can be realized by the following computation:
Since the fitted plane $Ax + By + Cz + D = 0$ is known from the formulas above, the three-dimensional coordinates $\{(x_t^1, y_t^1, z_t^1), (x_t^2, y_t^2, z_t^2), \ldots, (x_t^N, y_t^N, z_t^N)\}$ of the perpendicular projections onto this plane can be obtained:
$x_t^i = x_i - A\delta, \quad y_t^i = y_i - B\delta, \quad z_t^i = z_i - C\delta$ (14)
where $\delta = \frac{A x_i + B y_i + C z_i + D}{A^2 + B^2 + C^2}$.
After the $(x_t^i, y_t^i, z_t^i)$ are obtained, mapping them to two-dimensional coordinates requires determining the coordinate origin and the X-axis direction (the Y-axis passes through the origin perpendicular to the X-axis). Optionally, in a specific implementation, the origin and the X-axis direction can be obtained as follows:
Specific markers are placed in the preset reference image so that the marker points are easy to detect. In a specific implementation, taking the case where the reference image is a grid image, two asterisk-shaped ("米"-shaped) markers are placed in the grid image, as shown in Fig. 6, and the grid image with these markers is projected; the two marked grid points are easy to detect in the image. In subsequent calculations the middle "米"-shaped marker point serves as the origin of the coordinate system, and the direction from this point toward the other "米"-shaped marker point serves as the X-axis direction.
If two " rice " font gauge points are through formula z = e · f x 2 - x 1 - - - ( 5 ) ; x = e 2 · x 1 + x 2 x 1 - x 2 - - - ( 6 ) ;
y = e 2 · y 1 + y 2 y 1 - y 2 - - - ( 7 ) ; x t i = x i - Aδ ; y t i = y i - Bδ ; z t i = z i - Cδ ; - - - ( 14 ) D coordinates value after calculating is
Figure BDA0000437125490000359
with
Figure BDA00004371254900003510
with
Figure BDA00004371254900003511
for initial point, with
Figure BDA00004371254900003512
point to
Figure BDA00004371254900003513
for X-axis, Y-axis is vertical with X-axis through the origin of coordinates, and unit length is identical with former three-dimensional system of coordinate, sets up thus two-dimensional coordinate system.Will
Figure BDA00004371254900003514
be mapped to this two-dimensional coordinate system, obtain
Figure BDA00004371254900003515
x i ' = l 1 · cos θ , y i ' = l 1 · sin θ , I=1 wherein, 2, L, N (15)
where:
$\cos\theta = \frac{l_1^2 + l_2^2 - l_3^2}{2 \, l_1 l_2}$ (16)
$l_1$ is the distance from the three-dimensional coordinate point $(x_t^i, y_t^i, z_t^i)$ in the fitted plane to the origin of the established two-dimensional coordinate system;
$l_2$ is the distance from the other "米"-shaped marker point, the one outside the origin of the established two-dimensional coordinate system, to that origin;
$l_3$ is the distance from the three-dimensional coordinate point $(x_t^i, y_t^i, z_t^i)$ in the fitted plane to that other "米"-shaped marker point;
and $\theta$ is the angle, with the origin as vertex, between the straight line from the origin to $(x_t^i, y_t^i, z_t^i)$ and the straight line from the origin to the other "米"-shaped marker point.
In this embodiment, considering the complexity of the projection environment, the projection surface may be uneven and thus present a three-dimensional state. In that case the projection surface presented in a three-dimensional state must be fitted to a two-dimensional plane before the final two-dimensional coordinate values of the secondary image identification points can be obtained; the distortion factor R extracted on this basis is used to correct the reference image and the image to be projected, so that the projected image remains undistorted even in a complicated projection environment.
All the above embodiments perform distortion correction on the projected image, so that the image projected after distortion correction is no longer distorted. In practical applications, however, with the growing popularity of wearable devices, wearable projection devices have also emerged. When a wearable projection device works, the movement of the human body may make the projected image jitter, and even slight shaking of the body can adversely affect the projection effect of a wearable projector. Therefore, on the basis of the above embodiments and considering image de-jittering, the embodiment of the present invention may further comprise the following devices:
a kinematic parameter acquiring unit 306, configured to obtain the kinematic parameters of the projection device at the time the reference image is projected onto the projection surface to form the projected image;
a compensating parameter acquiring unit 307, configured to obtain the motion compensation parameters according to the kinematic parameters;
a motion compensation unit 308, configured to perform counter motion compensation on the distortion-corrected image according to the motion compensation parameters; the distortion-corrected image comprises the corrected image to be projected, or the corrected reference image.
The kinematic parameter acquiring unit 306 can be further configured to obtain the first motion-compensated value directly from the acceleration sensor and use this first motion-compensated value as the kinematic parameter; the kinematic parameter acquiring unit 306 can also be configured to obtain the component values u and v of the displacement of the projection device in the X-axis and Y-axis directions sensed by the acceleration sensor, and use u and v as the kinematic parameters.
The embodiment of the present invention may further comprise a motion estimation unit, configured to perform motion estimation on the secondary imaging image to obtain a preliminary motion-compensated value. As an optional implementation, the compensating parameter acquiring unit 307 can be configured to combine the kinematic parameter obtained by the kinematic parameter acquiring unit 306, i.e. the first motion-compensated value, with the preliminary motion-compensated value obtained by the motion estimation unit to obtain the motion compensation parameter.
Specifically, the motion estimation unit can be configured to perform motion estimation on the secondary imaging image to obtain the preliminary motion-compensated value;
the compensating parameter acquiring unit 307 can be configured to take the weighted average of the first motion-compensated value and the preliminary motion-compensated value obtained by the motion estimation unit to obtain the motion compensation parameter, calculated by the following formula:
$M = \alpha \cdot M_1 + (1 - \alpha) \cdot M_2$ (17)
where M is the motion compensation parameter, $M_1$ is the preliminary motion-compensated value and $M_2$ is the first motion-compensated value, with $M_1 = \begin{pmatrix} t_{x1} \\ t_{y1} \end{pmatrix}$ and $M_2 = \begin{pmatrix} t_{x2} \\ t_{y2} \end{pmatrix}$; $t_{x1} = \frac{1}{N} \sum_{i=1}^{N} tx_i$ and $t_{y1} = \frac{1}{N} \sum_{i=1}^{N} ty_i$, where $tx_i$ and $ty_i$ are the position differences of the abscissa and ordinate of a secondary image identification point in the secondary imaging image between two adjacent frames and N is the number of secondary image identification points in each frame; $t_{x2} = \frac{1}{H} \sum_{j=1}^{H} tx_j$ and $t_{y2} = \frac{1}{H} \sum_{j=1}^{H} ty_j$, where $tx_j$ and $ty_j$ are the position differences of the abscissa and ordinate of a reference identification point in the reference image between two adjacent frames and H is the number of reference identification points in each frame; α is an empirical value weighting $M_1$ against $M_2$, with range 0 ≤ α ≤ 1.
As another optional implementation, the compensating parameter acquiring unit 307 can be further configured to perform motion estimation on the secondary imaging image in combination with the kinematic parameters u and v obtained by the kinematic parameter acquiring unit 306 to obtain the motion compensation parameter.
Specifically, the compensating parameter acquiring unit 307 can be further configured to perform motion estimation on the secondary imaging image in combination with the kinematic parameters obtained by the kinematic parameter acquiring unit 306 to obtain the motion compensation parameter M, calculated by the following formulas:
$M = \begin{pmatrix} t_x \\ t_y \end{pmatrix}$
$t_x = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \cot\left(\beta \cdot \arctan\left(\frac{ty_i}{tx_i}\right) + (1 - \beta) \cdot \arctan\left(\frac{v}{u}\right)\right)$ (18)
$t_y = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \tan\left(\beta \cdot \arctan\left(\frac{ty_i}{tx_i}\right) + (1 - \beta) \cdot \arctan\left(\frac{v}{u}\right)\right)$ (19)
where $tx_i$ and $ty_i$ are the position differences of the abscissa and ordinate of a secondary image identification point in the secondary imaging image between two adjacent frames, N is the number of secondary image identification points taken in each frame, u and v are the kinematic parameters, i.e. the component values of the displacement of the projection device in the X and Y directions sensed by the acceleration sensor, and β is an empirical value weighting $\arctan\left(\frac{ty_i}{tx_i}\right)$ against $\arctan\left(\frac{v}{u}\right)$, with range 0 ≤ β ≤ 1.
With any of the above implementations, the motion compensation unit 308 in the device may comprise a first motion compensation subunit, configured to perform motion compensation on the distortion-corrected image to be projected by the following formula:
$\begin{pmatrix} x_i^{RM} \\ y_i^{RM} \end{pmatrix} = \begin{pmatrix} x_i^{R} \\ y_i^{R} \end{pmatrix} + \begin{pmatrix} t_x \\ t_y \end{pmatrix}, \quad i = 1, 2, \ldots, n$ (20)
where $x_i^{R}$ and $y_i^{R}$ are the abscissa and ordinate values of the image pixel coordinate points of the image to be projected after correction according to the distortion factor R, $x_i^{RM}$ and $y_i^{RM}$ are the abscissa and ordinate values of the image pixel coordinate points after counter motion compensation of the distortion-corrected pixels, and $\begin{pmatrix} t_x \\ t_y \end{pmatrix}$ is the motion compensation parameter M.
Optionally, the motion compensation unit 308 in the device may also comprise a second motion compensation subunit, configured to perform motion compensation on the distortion-corrected reference image by the following formula:
$\begin{pmatrix} x_k^{RM} \\ y_k^{RM} \end{pmatrix} = \begin{pmatrix} x_k^{R} \\ y_k^{R} \end{pmatrix} + \begin{pmatrix} t_x \\ t_y \end{pmatrix}, \quad k = 1, 2, \ldots, n$ (21)
where $x_k^{R}$ and $y_k^{R}$ are the abscissa and ordinate values of the image pixel coordinate points of the reference image after correction according to the distortion factor R, $x_k^{RM}$ and $y_k^{RM}$ are the abscissa and ordinate values of the image pixel coordinate points after counter motion compensation of the distortion-corrected pixels, and $\begin{pmatrix} t_x \\ t_y \end{pmatrix}$ is the motion compensation parameter M.
In the embodiment of the present invention, considering the complexity of the projection environment, the projection device may move or shake, causing the projected picture to jitter. The embodiment of the present invention combines the acceleration sensor with motion estimation to obtain the most accurate motion compensation parameter M and performs counter motion compensation on the distortion-corrected reference image and image to be projected, thereby guaranteeing that the output image does not jitter even in a complicated projection environment.
On the basis of the above embodiments, the projected image cast out after distortion correction and de-jitter treatment has neither distortion nor jitter and is suitable for human viewing. However, the projection device may also move perpendicular to the projection plane. Again taking a wearable projection device as an example, when the human body carrying it moves perpendicular to the projection plane, the image, although free of distortion and jitter, will appear too small to the human eye when the body is close to the projection plane and too large when the body is far from it. Therefore, on the basis of the above distortion correction and de-jitter embodiments, the image identification point acquiring unit 302 can also obtain the distance D from the projection device to the projection surface, and the device of the embodiment of the present invention may further comprise: a focus adjustment unit 309, configured to adjust the focal length of the projection device according to the distance D from the projection device to the projection surface obtained by the image identification point acquiring unit, realized by the following formula:
$f = \frac{D \times s_0}{s_1}$ (22)
where $s_0$ is the LCD panel size of the projection device, $s_1$ is an empirical value of the image size, and the distance D is obtained by taking the z values in the three-dimensional coordinate values of the N secondary image identification points and averaging them. The Z values in the three-dimensional coordinate values of the secondary image identification points are obtained by the stereo vision subunit 302a.
Specifically, in the embodiment of the present invention the stereo vision subunit 302a may consist of two imaging devices forming a binocular stereo vision system; the two imaging devices pass the secondary imaging images formed from the captured projected image to a computing device, which processes the secondary imaging images from the two imaging devices by a binocular stereo vision method; in a specific implementation the processing step is: extract the three-dimensional parameters of the secondary image identification points in the secondary imaging image.
Further, the binocular stereo vision system may be one with parallel optical axes, containing two cameras for capturing the projected image. The computing device obtains the secondary imaging images from the cameras and extracts the three-dimensional parameters of the secondary image identification points, i.e. their three-dimensional coordinate values, as follows. Let the distance between the two lens centres be e and the focal length of both cameras be f; take the midpoint of the line joining the two lens centres as the origin, the direction from the lens centre of camera 1 toward that of camera 2 as the X-axis, and the parallel optical axis direction as the Z-axis, with the XYZ axes satisfying the right-hand rule. In this coordinate system the coordinate value of a point (x, y, z) can be calculated by the formulas below:
$z = \frac{e \cdot f}{x_2 - x_1}$ (23)
$x = \frac{e}{2} \cdot \frac{x_1 + x_2}{x_1 - x_2}$ (24)
$y = \frac{e}{2} \cdot \frac{y_1 + y_2}{y_1 - y_2}$ (25)
where $(x_1, y_1)$ and $(x_2, y_2)$ are the image positions of (x, y, z) on the imaging planes of camera 1 and camera 2 in the stereo vision system respectively. In this way the positions of all secondary image identification points in three-dimensional space can be obtained; suppose there are N secondary image identification points with coordinates $\{(x_1, y_1, z_1), (x_2, y_2, z_2), \ldots, (x_N, y_N, z_N)\}$.
From the Z coordinates of these points the approximate distance from the projection surface to the projector can be estimated:
$d = \frac{1}{N} \sum_{i=1}^{N} z_i$ (26)
The focal length of the visible-light projector is adjusted automatically according to the distance d, so that the size of the image the visible-light projector projects onto the projection surface is suitable for viewing. The following table gives empirically obtained reference values of projection distance versus projected image size.
Table 1. Reference values of projection distance and projected image size
Given the projection distance d (unit: metres), the focal length f that the projector needs to set (unit: metres) can be obtained from the formula below:
$f = \frac{d \times s_0}{s_1}$ (27)
where $s_0$ is the LCD chip size of the projector (unit: inches) and $s_1$ is the projected image size obtained from table 1 according to the distance d (unit: inches). The value of f is limited by the hardware and must lie within $[f_{min}, f_{max}]$; the actual value is $\max(\min(f, f_{max}), f_{min})$. To avoid the focal length changing constantly with varying distance, it can be stipulated that a new focal length is calculated and set only when the absolute value of the difference between the current distance and the distance at the previous focal adjustment exceeds a given threshold (for example 0.5 metres).
In the embodiment of the present invention, on the basis of the embodiments above, the Z coordinate values of the secondary image identification points in three-dimensional space are averaged to obtain the distance d between the projection surface and the imaging devices, and the focal length of the projection device is adjusted automatically according to d; the size of the projected image is thus scaled correspondingly as the human body moves perpendicular to the projection plane, and the scaled projected image size is suitable for human viewing, enhancing the user experience.

Claims (28)

1. A processing method based on a projected image, characterized by comprising:
obtaining the secondary imaging image formed in an imaging device by a projected image on a projection surface, the projected image being the image formed by a projection device projecting a preset reference image onto the projection surface; reference identification points being set in the preset reference image, the reference identification points forming primary image identification points in the projected image after projection; the picture points formed by the primary image identification points in the secondary imaging image being secondary image identification points;
obtaining the secondary image identification points in the secondary imaging image;
obtaining a distortion factor according to the variation of the location parameters of the secondary image identification points relative to the location parameters of the reference identification points; the location parameters of the secondary image identification points being their location parameters in the secondary imaging image, and the location parameters of the reference identification points being their location parameters in the reference image;
performing distortion correction on an image to be projected according to the distortion factor.
2. The method according to claim 1, characterized in that the preset reference image is reference infrared grid lines and the reference identification points are the intersections of grid lines running in different directions in the reference infrared grid lines; correspondingly, the projection device is an infrared projection device and the imaging device is an infrared filter camera head;
obtaining the secondary imaging image formed in the imaging device by the projected image on the projection surface comprises:
obtaining the secondary infrared grid lines formed by the infrared filter camera from the projected image on the projection surface;
obtaining the secondary image identification points in the secondary imaging image comprises:
obtaining the intersections of grid lines running in different directions in the secondary infrared grid lines, these intersections being the secondary image identification points.
3. The method according to claim 1 or 2, characterized in that obtaining the distortion factor according to the variation of the location parameters of the secondary image identification points in the secondary imaging image relative to the location parameters of the reference identification points in the reference image specifically comprises: obtaining the distortion factor $R = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{pmatrix}$ by the following formulas:
$\begin{pmatrix} r_{11} & r_{12} \\ r_{21} & r_{22} \end{pmatrix} \begin{pmatrix} x_i \\ y_i \end{pmatrix} + \begin{pmatrix} r_{13} \\ r_{23} \end{pmatrix} = \begin{pmatrix} x_i' \\ y_i' \end{pmatrix}, \quad i = 1, 2, \ldots, n$ (1)
$\begin{cases} x_i r_{11} + y_i r_{12} + r_{13} = x_i' \\ x_i r_{21} + y_i r_{22} + r_{23} = y_i' \end{cases}, \quad i = 1, 2, \ldots, n$ (2)
where R is the distortion factor and $\begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{pmatrix}$ is the matrix the reference image forms in the affine transformation, with $r_{11}, r_{12}, r_{13}, r_{21}, r_{22}, r_{23}$ the parameters of the affine transformation; $x_i$ and $y_i$ are the abscissa and ordinate values of a secondary image identification point in the secondary imaging image, $(x_i, y_i)$ being the coordinate value of the secondary image identification point; $x_i'$ and $y_i'$ are the abscissa and ordinate values of the corresponding reference identification point in the reference image, $(x_i', y_i')$ being the coordinate value of the reference identification point; the affine transformation applies the linear transformation given by $r_{11}, r_{12}, r_{21}, r_{22}$ to the secondary image identification point $(x_i, y_i)$ and then the translation parameters $r_{13}, r_{23}$, transforming it to the reference identification point $(x_i', y_i')$; formula (2) is the system of linear equations obtained by rewriting formula (1), and the values of the affine transformation parameters $r_{11}, r_{12}, r_{13}, r_{21}, r_{22}, r_{23}$ are calculated from formula (2), thereby obtaining the affine transformation distortion factor R.
4. The method according to claim 3, characterized in that correcting the image to be projected according to the distortion factor comprises:
performing distortion correction on the image to be projected by the following formula:
$\begin{pmatrix} x_i^R \\ y_i^R \end{pmatrix} = R \begin{pmatrix} x_i \\ y_i \\ 1 \end{pmatrix} = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{pmatrix} \begin{pmatrix} x_i \\ y_i \\ 1 \end{pmatrix}, \quad i = 1, 2, \ldots, n$ (3)
where R is the distortion factor, $x_i^R$ and $y_i^R$ are the abscissa and ordinate values of the corrected image pixel coordinate point $(x_i^R, y_i^R)$ of the image to be projected according to the distortion factor R, and $x_i$ and $y_i$ are the abscissa and ordinate values of the original pixel coordinate point $(x_i, y_i)$ of the image to be projected.
5. The method according to any one of claims 1 to 4, characterized in that after obtaining the distortion factor according to the variation of the location parameters of the secondary image identification points relative to the location parameters of the reference identification points, the method further comprises: correcting the reference image according to the distortion factor;
specifically, performing distortion correction on the reference image by the following formula:
$\begin{pmatrix} x_k^R \\ y_k^R \end{pmatrix} = R \begin{pmatrix} x_k \\ y_k \\ 1 \end{pmatrix} = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{pmatrix} \begin{pmatrix} x_k \\ y_k \\ 1 \end{pmatrix}, \quad k = 1, 2, \ldots, n$ (4)
where R is the distortion factor, $x_k^R$ and $y_k^R$ are the abscissa and ordinate values of the corrected image pixel coordinate point $(x_k^R, y_k^R)$ of the reference image according to the distortion factor R, and $x_k$ and $y_k$ are the abscissa and ordinate values of the original pixel coordinate point $(x_k, y_k)$ of the reference image.
6. The method according to claim 5, characterized in that after correcting the reference image according to the distortion factor, the method further comprises:
obtaining the kinematic parameters of the projection device at the time the reference image is projected onto the projection surface to form the projected image;
obtaining motion compensation parameters according to the kinematic parameters;
performing counter motion compensation on the distortion-corrected image according to the motion compensation parameters; the distortion-corrected image comprising the corrected image to be projected, or the corrected reference image.
7. The method according to claim 6, characterized in that, before obtaining the motion compensation parameters according to the motion parameters, the method further comprises:

performing motion estimation on the secondary imaged image to obtain preliminary motion compensation values;

and obtaining the motion compensation parameters according to the motion parameters comprises:

combining the motion parameters and the preliminary motion compensation values to obtain the motion compensation parameters.
8. The method according to claim 6, characterized in that obtaining the motion compensation parameters according to the motion parameters comprises:

performing motion estimation on the secondary imaged image in combination with the motion parameters to obtain the motion compensation parameters.
9. The method according to claim 7, characterized in that combining the motion parameters and the preliminary motion compensation values to obtain the motion compensation parameters comprises:

taking as the motion parameters the first motion compensation values obtained directly from the acceleration sensor;

performing motion estimation on the secondary imaged image to obtain the preliminary motion compensation values;

computing the weighted average of the first motion compensation values and the preliminary motion compensation values to obtain the motion compensation parameters, specifically by the following formula:

$$M = \alpha M_1 + (1 - \alpha) M_2 \qquad (5)$$

wherein M is the motion compensation parameter, $M_1$ is the preliminary motion compensation value and $M_2$ is the first motion compensation value, with

$$M_1 = \begin{bmatrix} t_x^1 \\ t_y^1 \end{bmatrix}, \quad M_2 = \begin{bmatrix} t_x^2 \\ t_y^2 \end{bmatrix}, \quad t_x^1 = \frac{1}{N}\sum_{i=1}^{N} tx_i, \quad t_y^1 = \frac{1}{N}\sum_{i=1}^{N} ty_i,$$

where $tx_i$ and $ty_i$ are the position differences of the abscissa and ordinate of the secondary image identification points in the secondary imaged image between two adjacent frames and N is the number of secondary image identification points in each frame; and

$$t_x^2 = \frac{1}{H}\sum_{j=1}^{H} tx_j, \quad t_y^2 = \frac{1}{H}\sum_{j=1}^{H} ty_j,$$

where $tx_j$ and $ty_j$ are the position differences of the abscissa and ordinate of the reference identification points in the reference image between two adjacent frames and H is the number of reference identification points in each frame; $\alpha$ is an empirical value weighting $M_1$ against $M_2$, with $0 \le \alpha \le 1$.
10. The method according to claim 8, characterized in that the motion parameters are the component values u and v of the displacement of the projection device along the X-axis and Y-axis directions as sensed by the acceleration sensor, and performing motion estimation on the secondary imaged image in combination with the motion parameters to obtain the motion compensation parameter M is realized specifically by the following formulas:

$$M = \begin{bmatrix} t_x \\ t_y \end{bmatrix}$$

$$t_x = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \cot\left(\beta \cdot \arctan\frac{ty_i}{tx_i} + (1 - \beta) \cdot \arctan\frac{v}{u}\right) \qquad (6)$$

$$t_y = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \tan\left(\beta \cdot \arctan\frac{ty_i}{tx_i} + (1 - \beta) \cdot \arctan\frac{v}{u}\right) \qquad (7)$$

wherein $tx_i$ and $ty_i$ are the position differences of the abscissa and ordinate of the secondary image identification points in the secondary imaged image between two adjacent frames, N is the number of secondary image identification points taken in each frame of the secondary imaged image, u and v are the motion parameters, i.e. the component values of the X- and Y-direction displacement of the projection device sensed by the acceleration sensor, and $\beta$ is an empirical value weighting $\arctan\frac{ty_i}{tx_i}$ against $\arctan\frac{v}{u}$, with $0 \le \beta \le 1$.
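The following sketch of formulas (6) and (7) fuses each point's motion direction with the accelerometer direction; arctan2 is substituted for the arctan written in the claim so that the quadrant is preserved, and all names are illustrative.

```python
# A minimal sketch of formulas (6) and (7), assuming NumPy.
import numpy as np

def fused_compensation(tx, ty, u, v, beta):
    """tx, ty: length-N arrays of per-point displacements between adjacent
    frames; (u, v): accelerometer X/Y displacement components; 0 <= beta <= 1.
    Degenerate angles (tan = 0) are not handled in this sketch."""
    tx = np.asarray(tx, float)
    ty = np.asarray(ty, float)
    magnitude = np.sqrt(tx ** 2 + ty ** 2)
    # Blend the per-point direction with the sensed direction by beta.
    angle = beta * np.arctan2(ty, tx) + (1.0 - beta) * np.arctan2(v, u)
    t_x = np.sum(magnitude / np.tan(angle))  # cot(angle) = 1 / tan(angle)
    t_y = np.sum(magnitude * np.tan(angle))
    return t_x, t_y
```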
11. The method according to any one of claims 6 to 10, characterized in that performing motion compensation on the distortion-corrected image to be projected according to the motion compensation parameters is realized specifically by the following formula:

$$\begin{bmatrix} x_i^{RM} \\ y_i^{RM} \end{bmatrix} = \begin{bmatrix} x_i^R \\ y_i^R \end{bmatrix} + \begin{bmatrix} t_x \\ t_y \end{bmatrix}, \quad i = 1, 2, \ldots, n \qquad (8)$$

wherein $x_i^R$ and $y_i^R$ are the abscissa and ordinate values of the pixel coordinate points of the image to be projected after correction according to the distortion factor R, $x_i^{RM}$ and $y_i^{RM}$ are the abscissa and ordinate values of the pixel coordinate points after reverse motion compensation of the distortion-corrected pixels, and $\begin{bmatrix} t_x \\ t_y \end{bmatrix}$ is the motion compensation parameter M.
12. The method according to any one of claims 6 to 11, characterized in that performing motion compensation on the distortion-corrected reference image according to the motion compensation parameter M is realized specifically by the following formula:

$$\begin{bmatrix} x_k^{RM} \\ y_k^{RM} \end{bmatrix} = \begin{bmatrix} x_k^R \\ y_k^R \end{bmatrix} + \begin{bmatrix} t_x \\ t_y \end{bmatrix}, \quad k = 1, 2, \ldots, n \qquad (9)$$

wherein $x_k^R$ and $y_k^R$ are the abscissa and ordinate values of the pixel coordinate points of the reference image after correction according to the distortion factor R, $x_k^{RM}$ and $y_k^{RM}$ are the abscissa and ordinate values of the pixel coordinate points after reverse motion compensation of the distortion-corrected pixels, and $\begin{bmatrix} t_x \\ t_y \end{bmatrix}$ is the motion compensation parameter M.
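Formulas (8) and (9) are the same translation applied to different coordinate sets; a sketch (names illustrative, NumPy assumed):

```python
# A minimal sketch of formulas (8) and (9): shift distortion-corrected
# coordinates by the motion compensation parameters.
import numpy as np

def apply_motion_compensation(corrected_pts, M):
    """corrected_pts: n x 2 array of (x^R, y^R) values; M: length-2 vector
    (t_x, t_y). Returns the n x 2 array of (x^RM, y^RM) coordinates."""
    return np.asarray(corrected_pts, float) + np.asarray(M, float)
```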
13. The method according to any one of claims 1 to 12, characterized in that there are at least two imaging devices, forming a stereo vision system;

and obtaining the secondary image identification points in the secondary imaged image specifically comprises:

obtaining the three-dimensional parameters in the projected image by the stereo vision system; fitting a plane to the three-dimensional parameters, mapping the three-dimensional parameters perpendicularly onto the plane, and obtaining two-dimensional coordinate points, the two-dimensional coordinate points being the secondary image identification points.
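Claim 13 names the fit-and-project step but not an algorithm; one common choice, shown below under that assumption, is a least-squares plane via SVD, with the two leading right-singular vectors serving as an in-plane basis.

```python
# A minimal sketch, assuming NumPy: fit a least-squares plane to the stereo
# system's 3D points and project them perpendicularly onto it. Names are
# illustrative assumptions, not from the patent.
import numpy as np

def project_to_best_fit_plane(points_3d):
    """points_3d: n x 3 array. Returns the n x 2 in-plane coordinates of the
    points after orthogonal projection onto their least-squares plane."""
    P = np.asarray(points_3d, dtype=float)
    centroid = P.mean(axis=0)
    # The last right-singular vector of the centred points is the plane
    # normal; the first two span the plane itself.
    _, _, Vt = np.linalg.svd(P - centroid)
    basis = Vt[:2].T                  # 3 x 2 in-plane basis
    return (P - centroid) @ basis     # 2D identification point coordinates
```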
14. The method according to claim 13, characterized in that, after performing distortion correction on the image to be projected according to the distortion factor, the method further comprises: adjusting the focal length of the projection device according to the distance D from the projection device to the projection surface, specifically by the following formula:

$$f = \frac{D \times s_0}{s_1} \qquad (10)$$

wherein $s_0$ is the LCD panel size of the projection device, $s_1$ is the empirical value of the image size, and the distance D is obtained by taking the z values of the three-dimensional coordinates of N secondary image identification points and averaging them.
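A sketch of formula (10), under the assumption that the stereo system supplies a full 3D coordinate per identification point; names are illustrative.

```python
# A minimal sketch of formula (10): D is the mean z value of the N
# identification points, and f = D * s0 / s1.
import numpy as np

def adjust_focal_length(points_3d, s0, s1):
    """points_3d: n x 3 array from the stereo system; s0: projector LCD panel
    size; s1: empirical image size value. Returns the focal length f."""
    D = float(np.mean(np.asarray(points_3d, float)[:, 2]))
    return D * s0 / s1
```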
15. A processing device based on a projected image, characterized in that the device comprises:

an imaging acquisition unit, configured to obtain the secondary imaged image formed in an imaging device by a projected image on a projection surface, the projected image being the image formed by a projection device projecting a preset reference image onto the projection surface, reference identification points being provided in the preset reference image, the reference identification points forming primary image identification points in the projected image after projection, and the image points formed by the primary image identification points in the secondary imaged image being the secondary image identification points;

an image identification point acquisition unit, configured to obtain the secondary image identification points in the secondary imaged image;

a distortion factor acquisition unit, configured to obtain the distortion factor from the change of the position parameters of the secondary image identification points relative to the position parameters of the reference identification points, the position parameters of the secondary image identification points being their position parameters in the secondary imaged image, and the position parameters of the reference identification points being their position parameters in the reference image;

a to-be-projected image correction unit, configured to perform distortion correction on the image to be projected according to the distortion factor R.
16. The device according to claim 15, characterized in that the preset reference image is a reference infrared grid pattern, and the reference identification points are the intersection points of grid lines of different directions in the reference infrared grid pattern;

the imaging acquisition unit is configured to obtain the secondary infrared grid pattern formed by the projected image on the projection surface through an infrared filter camera;

the image identification point acquisition unit is configured to obtain the intersection points of the grid lines of different directions in the secondary infrared grid pattern, the intersection points of the grid lines of different directions being the secondary image identification points.
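Claim 16 specifies intersections of differently oriented grid lines as identification points but no detection method; the sketch below, assuming OpenCV and an 8-bit grayscale infrared frame, uses Hough line detection as one illustrative possibility.

```python
# A minimal sketch (illustrative, not the patent's method) of locating grid
# line intersections in the captured infrared grid image with OpenCV.
import cv2
import numpy as np

def grid_intersections(ir_image, angle_gap=np.pi / 4):
    """ir_image: 8-bit grayscale frame. Intersect pairs of detected lines
    whose orientations differ by more than angle_gap radians."""
    edges = cv2.Canny(ir_image, 50, 150)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, 120)
    if lines is None:
        return []
    lines = lines[:, 0]                        # each entry is (rho, theta)
    points = []
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            (r1, t1), (r2, t2) = lines[i], lines[j]
            d = abs(t1 - t2)
            if min(d, np.pi - d) < angle_gap:  # nearly parallel: skip
                continue
            # Each line satisfies x*cos(theta) + y*sin(theta) = rho.
            A = np.array([[np.cos(t1), np.sin(t1)],
                          [np.cos(t2), np.sin(t2)]])
            points.append(tuple(np.linalg.solve(A, np.array([r1, r2]))))
    return points
```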
17. The device according to claim 15 or 16, characterized in that the distortion factor acquisition unit is configured to calculate the distortion factor

$$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix}$$

by the following formulas:

$$\begin{bmatrix} r_{11} & r_{12} \\ r_{21} & r_{22} \end{bmatrix} \begin{bmatrix} x_i \\ y_i \end{bmatrix} + \begin{bmatrix} r_{13} \\ r_{23} \end{bmatrix} = \begin{bmatrix} x_i' \\ y_i' \end{bmatrix}, \quad i = 1, 2, \ldots, n \qquad (1)$$

$$\begin{cases} x_i r_{11} + y_i r_{12} + r_{13} = x_i' \\ x_i r_{21} + y_i r_{22} + r_{23} = y_i' \end{cases} \quad i = 1, 2, \ldots, n \qquad (2)$$

wherein R is the distortion factor and $\begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix}$ is the matrix formed in the affine transformation of the reference image, with $r_{11}, r_{12}, r_{13}, r_{21}, r_{22}, r_{23}$ being the affine transformation parameters; $x_i$ and $y_i$ are the abscissa and ordinate values of the secondary image identification points in the secondary imaged image, so that $(x_i, y_i)$ is the coordinate of a secondary image identification point; $x_i'$ and $y_i'$ are the abscissa and ordinate values of the reference identification points in the reference image, so that $(x_i', y_i')$ is the coordinate of a reference identification point; the affine transformation applies the linear part $r_{11}, r_{12}, r_{21}, r_{22}$ to a secondary image identification point $(x_i, y_i)$ and then adds the translation parameters $r_{13}, r_{23}$ to obtain the reference identification point $(x_i', y_i')$; formula (2) is the system of linear equations obtained by expanding formula (1), and solving formula (2) for the affine transformation parameters yields the affine distortion factor R.
18. The device according to claim 17, characterized in that the to-be-projected image correction unit is configured to perform distortion correction on the image to be projected by the following formula:

$$\begin{bmatrix} x_i^R \\ y_i^R \end{bmatrix} = R \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix} \begin{bmatrix} x_i \\ y_i \\ 1 \end{bmatrix}, \quad i = 1, 2, \ldots, n \qquad (3)$$

wherein R is the distortion factor, $x_i^R$ and $y_i^R$ are the abscissa and ordinate values of the pixel coordinate points of the image to be projected after correction according to the distortion factor R, and $x_i$ and $y_i$ are the abscissa and ordinate values of the original pixel coordinate points $(x_i, y_i)$ of the image to be projected.
19. The device according to any one of claims 15 to 18, characterized in that the device further comprises: a reference image correction unit, configured to correct the reference image according to the distortion factor R by performing distortion correction on the reference image by the following formula:

$$\begin{bmatrix} x_k^R \\ y_k^R \end{bmatrix} = R \begin{bmatrix} x_k \\ y_k \\ 1 \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \end{bmatrix} \begin{bmatrix} x_k \\ y_k \\ 1 \end{bmatrix}, \quad k = 1, 2, \ldots, n \qquad (4)$$

wherein R is the distortion factor, $x_k^R$ and $y_k^R$ are the abscissa and ordinate values of the pixel coordinate points of the reference image after correction according to the distortion factor R, and $x_k$ and $y_k$ are the abscissa and ordinate values of the original pixel coordinate points $(x_k, y_k)$ of the reference image.
20. The device according to claim 19, characterized in that the device further comprises:

a motion parameter acquisition unit, configured to obtain the motion parameters of the projection device while the reference image is projected onto the projection surface to form the projected image;

a compensation parameter acquisition unit, configured to obtain the motion compensation parameters according to the motion parameters;

a motion compensation unit, configured to perform reverse motion compensation on the distortion-corrected image according to the motion compensation parameters, the distortion-corrected image comprising the corrected image to be projected or the corrected reference image.
21. The device according to claim 20, characterized in that the device further comprises:

a motion compensation estimation unit, configured to perform motion estimation on the secondary imaged image to obtain preliminary motion compensation values;

the compensation parameter acquisition unit being configured to combine the motion parameters obtained by the motion parameter acquisition unit and the preliminary motion compensation values obtained by the motion compensation estimation unit to obtain the motion compensation parameters.
22. The device according to claim 20, characterized in that the compensation parameter acquisition unit is configured to perform motion estimation on the secondary imaged image in combination with the motion parameters obtained by the motion parameter acquisition unit to obtain the motion compensation parameters.
23. The device according to claim 21, characterized in that the motion parameter acquisition unit is configured to obtain the first motion compensation values directly from the acceleration sensor and to use the first motion compensation values as the motion parameters;

the motion compensation estimation unit is configured to perform motion estimation on the secondary imaged image to obtain the preliminary motion compensation values, and to compute the weighted average of the first motion compensation values obtained by the motion parameter acquisition unit and the preliminary motion compensation values, thereby obtaining the motion compensation parameters, specifically by the following formula:

$$M = \alpha M_1 + (1 - \alpha) M_2 \qquad (5)$$

wherein M is the motion compensation parameter, $M_1$ is the preliminary motion compensation value and $M_2$ is the first motion compensation value, with

$$M_1 = \begin{bmatrix} t_x^1 \\ t_y^1 \end{bmatrix}, \quad M_2 = \begin{bmatrix} t_x^2 \\ t_y^2 \end{bmatrix}, \quad t_x^1 = \frac{1}{N}\sum_{i=1}^{N} tx_i, \quad t_y^1 = \frac{1}{N}\sum_{i=1}^{N} ty_i,$$

where $tx_i$ and $ty_i$ are the position differences of the abscissa and ordinate of the secondary image identification points in the secondary imaged image between two adjacent frames and N is the number of secondary image identification points in each frame; and

$$t_x^2 = \frac{1}{H}\sum_{j=1}^{H} tx_j, \quad t_y^2 = \frac{1}{H}\sum_{j=1}^{H} ty_j,$$

where $tx_j$ and $ty_j$ are the position differences of the abscissa and ordinate of the reference identification points in the reference image between two adjacent frames and H is the number of reference identification points in each frame; $\alpha$ is an empirical value weighting $M_1$ against $M_2$, with $0 \le \alpha \le 1$.
24. The device according to claim 22, characterized in that the motion parameter acquisition unit is configured to obtain the component values u and v of the displacement of the projection device along the X-axis and Y-axis directions as sensed by the acceleration sensor, and to use u and v as the motion parameters;

the compensation parameter acquisition unit is configured to perform motion estimation on the secondary imaged image in combination with the motion parameters obtained by the motion parameter acquisition unit to obtain the motion compensation parameter M, specifically by calculating:

$$M = \begin{bmatrix} t_x \\ t_y \end{bmatrix}$$

$$t_x = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \cot\left(\beta \cdot \arctan\frac{ty_i}{tx_i} + (1 - \beta) \cdot \arctan\frac{v}{u}\right) \qquad (6)$$

$$t_y = \sum_{i=1}^{N} \sqrt{(tx_i)^2 + (ty_i)^2} \cdot \tan\left(\beta \cdot \arctan\frac{ty_i}{tx_i} + (1 - \beta) \cdot \arctan\frac{v}{u}\right) \qquad (7)$$

wherein $tx_i$ and $ty_i$ are the position differences of the abscissa and ordinate of the secondary image identification points in the secondary imaged image between two adjacent frames, N is the number of secondary image identification points taken in each frame of the secondary imaged image, u and v are the motion parameters, i.e. the component values of the X- and Y-direction displacement of the projection device sensed by the acceleration sensor, and $\beta$ is an empirical value weighting $\arctan\frac{ty_i}{tx_i}$ against $\arctan\frac{v}{u}$, with $0 \le \beta \le 1$.
25. The device according to any one of claims 20 to 24, characterized in that the motion compensation unit is configured to perform motion compensation on the distortion-corrected image to be projected by the following formula:

$$\begin{bmatrix} x_i^{RM} \\ y_i^{RM} \end{bmatrix} = \begin{bmatrix} x_i^R \\ y_i^R \end{bmatrix} + \begin{bmatrix} t_x \\ t_y \end{bmatrix}, \quad i = 1, 2, \ldots, n \qquad (8)$$

wherein $x_i^R$ and $y_i^R$ are the abscissa and ordinate values of the pixel coordinate points of the image to be projected after correction according to the distortion factor R, $x_i^{RM}$ and $y_i^{RM}$ are the abscissa and ordinate values of the pixel coordinate points after reverse motion compensation of the distortion-corrected pixels, and $\begin{bmatrix} t_x \\ t_y \end{bmatrix}$ is the motion compensation parameter M.
26. The device according to any one of claims 20 to 25, characterized in that the motion compensation unit is further configured to perform motion compensation on the distortion-corrected reference image by the following formula:

$$\begin{bmatrix} x_k^{RM} \\ y_k^{RM} \end{bmatrix} = \begin{bmatrix} x_k^R \\ y_k^R \end{bmatrix} + \begin{bmatrix} t_x \\ t_y \end{bmatrix}, \quad k = 1, 2, \ldots, n \qquad (9)$$

wherein $x_k^R$ and $y_k^R$ are the abscissa and ordinate values of the pixel coordinate points of the reference image after correction according to the distortion factor R, $x_k^{RM}$ and $y_k^{RM}$ are the abscissa and ordinate values of the pixel coordinate points after reverse motion compensation of the distortion-corrected pixels, and $\begin{bmatrix} t_x \\ t_y \end{bmatrix}$ is the motion compensation parameter M.
27. The device according to any one of claims 15 to 26, characterized in that the image identification point acquisition unit further comprises:

a stereo vision subunit, configured to obtain the three-dimensional parameters in the projected image;

an image identification point acquisition subunit, configured to fit a plane to the three-dimensional parameters obtained by the stereo vision subunit, map the three-dimensional parameters perpendicularly onto the plane, and obtain two-dimensional coordinate points, the two-dimensional coordinate points being the secondary image identification points.
28. The device according to claim 27, characterized in that the device further comprises:

a focus adjustment unit, configured to adjust the focal length of the projection device according to the distance D from the projection device to the projection surface, specifically by the following formula:

$$f = \frac{D \times s_0}{s_1} \qquad (10)$$

wherein $s_0$ is the LCD panel size of the projection device, $s_1$ is the empirical value of the image size, and the distance D is obtained by taking the z values of the three-dimensional coordinates of N secondary image identification points and averaging them.
CN201310687010.2A 2013-12-13 2013-12-13 Processing method and device based on projected image Active CN103686107B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201310687010.2A CN103686107B (en) 2013-12-13 2013-12-13 Processing method and device based on projected image
PCT/CN2014/093811 WO2015085956A1 (en) 2013-12-13 2014-12-15 Processing method and device based on projection image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310687010.2A CN103686107B (en) 2013-12-13 2013-12-13 Processing method and device based on projected image

Publications (2)

Publication Number Publication Date
CN103686107A true CN103686107A (en) 2014-03-26
CN103686107B CN103686107B (en) 2017-01-25

Family

ID=50322220

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310687010.2A Active CN103686107B (en) 2013-12-13 2013-12-13 Processing method and device based on projected image

Country Status (2)

Country Link
CN (1) CN103686107B (en)
WO (1) WO2015085956A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112637575B (en) * 2020-12-14 2022-02-01 四川长虹电器股份有限公司 Automatic image correction system and method for ultra-short-focus laser projector
CN117351077A (en) * 2023-09-14 2024-01-05 广东凯普科技智造有限公司 Visual correction method for dynamic prediction of sample application instrument

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005033703A (en) * 2003-07-11 2005-02-03 Seiko Epson Corp System and method for processing image, projector, program, and information storage medium
JP4729999B2 (en) * 2005-06-28 2011-07-20 富士ゼロックス株式会社 Information processing system, information processing apparatus, information processing method, and computer program
CN101815188A (en) * 2009-11-30 2010-08-25 四川川大智胜软件股份有限公司 Irregular smooth curve surface display wall multi-projector image frame correction method
US9456172B2 (en) * 2012-06-02 2016-09-27 Maradin Technologies Ltd. System and method for correcting optical distortions when projecting 2D images onto 2D surfaces
CN103686107B (en) * 2013-12-13 2017-01-25 华为技术有限公司 Processing method and device based on projected image

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1308465A (en) * 1999-12-18 2001-08-15 Lg电子株式会社 Image distortion correcting deivce and method and image display
US20040085256A1 (en) * 2002-10-30 2004-05-06 The University Of Chicago Methods and measurement engine for aligning multi-projector display systems
CN1517780A (en) * 2003-01-21 2004-08-04 惠普开发有限公司 Correcting of projection image based on reflection image
CN101166288A (en) * 2006-10-17 2008-04-23 精工爱普生株式会社 Calibration technique for heads up display system
CN102104732A (en) * 2009-12-21 2011-06-22 索尼公司 Image processing apparatus and method, and program
US20130222776A1 (en) * 2012-02-23 2013-08-29 Masaaki Ishikawa Image projector, method of image projection, and computer-readable storage medium storing program for causing computer to execute image projection

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015085956A1 (en) * 2013-12-13 2015-06-18 华为技术有限公司 Processing method and device based on projection image
CN105376640A (en) * 2014-08-06 2016-03-02 腾讯科技(北京)有限公司 Filter processing method, filter processing device and electronic equipment
CN105376640B (en) * 2014-08-06 2019-12-03 腾讯科技(北京)有限公司 Filter processing method, device and electronic equipment
TWI726105B (en) * 2016-06-01 2021-05-01 美商克萊譚克公司 System, method and computer program product for automatically generating a wafer image to design coordinate mapping
CN109196499B (en) * 2016-06-01 2021-05-28 科磊股份有限公司 System, method and computer program product for automatically generating a coordinate mapping of a wafer image to a design
CN109196499A (en) * 2016-06-01 2019-01-11 科磊股份有限公司 For automatically generating system, method and the computer program product of the coordinate mapping of wafer images to design
CN107465901A (en) * 2016-06-03 2017-12-12 上海顺久电子科技有限公司 A kind of projection correction's method, apparatus and laser television
CN107465901B (en) * 2016-06-03 2019-07-30 上海顺久电子科技有限公司 A kind of projection correction's method, apparatus and laser television
CN106251309A (en) * 2016-08-04 2016-12-21 北京尚水信息技术股份有限公司 For eliminating the processing method of camera shake
CN106331631A (en) * 2016-08-30 2017-01-11 山东惠工电气股份有限公司 Superposition method of two paths of videos
CN106331631B (en) * 2016-08-30 2019-10-25 山东惠工电气股份有限公司 A kind of two-path video coincidence method
CN107277380A (en) * 2017-08-16 2017-10-20 成都市极米科技有限公司 A kind of Zooming method and device
CN108509025A (en) * 2018-01-26 2018-09-07 吉林大学 A kind of crane intelligent Lift-on/Lift-off System based on limb action identification
CN108769636B (en) * 2018-03-30 2022-07-01 京东方科技集团股份有限公司 Projection method and device and electronic equipment
CN108769636A (en) * 2018-03-30 2018-11-06 京东方科技集团股份有限公司 Projecting method and device, electronic equipment
CN110871735A (en) * 2018-08-31 2020-03-10 比亚迪股份有限公司 Vehicle-mounted sensor and vehicle with same
CN112004072A (en) * 2020-07-31 2020-11-27 海尔优家智能科技(北京)有限公司 Projection image detection method and device
WO2022052921A1 (en) * 2020-09-08 2022-03-17 青岛海信激光显示股份有限公司 Projection system and projected image correction method
CN112135111A (en) * 2020-09-22 2020-12-25 鲁迅美术学院艺术工程总公司 Projection picture correction method
CN111866481A (en) * 2020-09-22 2020-10-30 歌尔股份有限公司 Method for detecting contamination of projection device, detection device and readable storage medium
CN111866481B (en) * 2020-09-22 2020-12-08 歌尔股份有限公司 Method for detecting contamination of projection device, detection device and readable storage medium
CN114697623A (en) * 2020-12-29 2022-07-01 成都极米科技股份有限公司 Projection surface selection and projection image correction method and device, projector and medium
WO2022142139A1 (en) * 2020-12-29 2022-07-07 成都极米科技股份有限公司 Projection plane selection and projection image correction methods, device, projector and medium
CN114697623B (en) * 2020-12-29 2023-08-15 极米科技股份有限公司 Projection plane selection and projection image correction method, device, projector and medium
CN114615478A (en) * 2022-02-28 2022-06-10 青岛信芯微电子科技股份有限公司 Projection picture correction method, projection picture correction system, projection device, and storage medium
CN114615478B (en) * 2022-02-28 2023-12-01 青岛信芯微电子科技股份有限公司 Projection screen correction method, projection screen correction system, projection apparatus, and storage medium

Also Published As

Publication number Publication date
WO2015085956A1 (en) 2015-06-18
CN103686107B (en) 2017-01-25

Similar Documents

Publication Publication Date Title
CN103686107A (en) Processing method and device based on projected image
CN106791784B (en) A kind of the augmented reality display methods and device of actual situation coincidence
US20180332222A1 (en) Method and apparatus for obtaining binocular panoramic image, and storage medium
CN106780389B (en) Fisheye image correction method and device based on coordinate transformation
TWI507729B (en) Eye-accommodation-aware head mounted visual assistant system and imaging method thereof
CN107665483B (en) Calibration-free convenient monocular head fisheye image distortion correction method
CN105989577B (en) Image correction method and device
CN109040728B (en) Ultra-short-focus projection equipment with double-camera trapezoidal correction and method thereof
CN104657982A (en) Calibration method for projector
CN102620713A (en) Method for measuring distance and positioning by utilizing dual camera
CN103247020A (en) Fisheye image spread method based on radial characteristics
US10250802B2 (en) Apparatus and method for processing wide viewing angle image
CN101795375A (en) Device for displaying projection and method for controlling projection
CN106713894B (en) A kind of tracking mode stereo display method and equipment
CN105488807A (en) Method for calibrating and rectifying telecentric lens
CN104093013A (en) Method for automatically regulating image parallax in stereoscopic vision three-dimensional visualization system
KR101545633B1 (en) Method and System for Vehicle Stereo Camera Calibration
CN103253193A (en) Method and system of calibration of panoramic parking based on touch screen operation
CN109087253A (en) A kind of method for correcting image and device
KR20090108822A (en) Camera image correction method and apparatus
CN114663495A (en) Calibration method and apparatus, head-mounted display device, and computer-readable storage medium
KR101670328B1 (en) The appratus and method of immersive media display and image control recognition using real-time image acquisition cameras
US11902492B2 (en) Image processing method and apparatus for stereoscopic images of nearby object in binocular camera system of parallel axis type
CN101950247A (en) Method for correcting deformed picture
CN105303166A (en) Camera device and method acquiring iris images through camera device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant