CN100568975C - Decoding device and decoding method - Google Patents

Decoding device and decoding method

Info

Publication number
CN100568975C
CN100568975C, CN200610074396A, CN200610074396
Authority
CN
China
Prior art keywords
image
decoding
mentioned
frame
coding
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN 200610074396
Other languages
Chinese (zh)
Other versions
CN1913641A
Inventor
荻窪纯一
柴田三代子
志潟太郎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Publication of CN1913641A
Application granted
Publication of CN100568975C
Expired - Fee Related (current)
Anticipated expiration


Abstract

The program and decoding device of the present invention realize scramble playback without repeatedly decoding reference images. A decoding control unit receives an instruction specifying the frames (B13) and (B3) to be output for playback, together with the playback speed, playback mode and so on, and, on the basis of an index table and information about the baseband image data stored in a baseband image data memory, decides the decoder and the decoding order to be used for decoding and sequentially supplies the data to a decoding processing unit. The decoder decodes the supplied data starting from the (I2) picture and stores the generated baseband image data in the baseband image data memory; when decoding (P5), (P8), (P11) and (P14), it receives the reference images from the baseband image data held in that memory and decodes on the basis of them. The present invention is applicable to a playback device or a personal computer.

Description

Decoding device and decoding method
Technical field
The present invention relates to a program, a decoding device, a decoding method and a recording medium, and in particular to a program, decoding device, decoding method and recording medium suitable for use when random playback is performed.
Background technology
When image data is encoded in the MPEG format, there is a known technique that not only generates the compressed data but also generates subdata such as index data, and uses the subdata when decoding the compressed data for playback, so that special playback starting from a specified picture can be performed (for example, Patent Document 1).
Patent Document 1: Japanese Unexamined Patent Application Publication No. Hei 11-341437
A conventional playback device 1 that can decode stream data encoded in the MPEG-2 Long GOP format by using index data is described with reference to Fig. 1.
The operation input acquisition unit 21 acquires the user's operation input. It accepts from the user the selection of the stream to be decoded and the designation of the playback start position, and supplies them to the stream decoding position detection processing unit 22. Based on the selection of the stream to be decoded and the designation of the playback start position supplied from the operation input acquisition unit 21, the stream decoding position detection processing unit 22 obtains the stream number (Stream No.) of the stream to be played back and the frame number (Frame No.) of the playback start position, and supplies them to the stream supply control unit 23.
Based on the stream number of the stream to be played back and the frame number of the playback start position, the stream supply control unit 23 obtains the various pieces of information needed to decode the corresponding stream that are stored in the index management unit 24, and detects the reference picture frames needed to decode the frame at the designated playback start position. The stream supply control unit 23 then requests the stream data needed for decoding from the stream memory control unit 25, and supplies the stream data read from the stream memory 26 via the stream memory control unit 25 to the decoder 28. In addition, the stream supply control unit 23 supplies the frame number of the frame to be output to the baseband signal processing unit 32 described later.
The stream memory control unit 25 controls storage into the stream memory 26, reads from the stream memory 26 the stream data requested by the stream supply control unit 23, and outputs it to the stream supply control unit 23.
The stream memory 26 stores each piece of stream data and outputs the specified stream data under the control of the stream memory control unit 25.
The decoder 28 has a memory control unit 41 inside it. The memory control unit 41 controls the reference baseband image memory 29; the decoder 28 decodes the stream data supplied from the stream supply control unit 23 and supplies the decoded baseband image data to the baseband image data memory control unit 30.
Specifically, when the supplied compressed stream has been compressed using inter-frame references, the decoder 28 causes, through the processing of the memory control unit 41, the reference images needed for subsequent decoding to be held in the reference baseband image memory 29, and decodes the compressed stream.
The baseband image data memory control unit 30 supplies the decoded baseband image data provided by the decoder 28 to the baseband image data memory 31, and at the same time reads from the baseband image data memory 31 the baseband image data of the frame indicated by the stream number and frame number requested by the baseband signal processing unit 32 and supplies them to the baseband signal processing unit 32.
The baseband signal processing unit 32 applies various corrections to the supplied baseband image data, for example color correction, size correction and field control during slow playback, so that the decoded images are played back correctly, and outputs the generated output baseband image data.
Next, the conventional playback processing performed in the playback device 1 of Fig. 1 is described with reference to the flowchart of Fig. 2.
In step S1, the stream decoding position detection processing unit 22 receives, via the operation input acquisition unit 21, the input designating the playback position (stream number, frame number), and supplies it to the stream supply control unit 23.
In step S2, the stream supply control unit 23 obtains from the index management unit 24 the index table corresponding to the supplied stream number.
In step S3, the stream supply control unit 23 extracts from the index table the information needed for decoding, for example the picture type, information about the reference images, and the data length.
In step S4, the stream supply control unit 23 obtains the frame numbers of the frames needed as reference images for decoding the frame with the designated frame number (if the frame designated for decoding is not an I picture, this includes at least one frame up to and including the preceding I picture), and uses this information for decoding.
In step S5, the stream supply control unit 23 requests from the stream memory control unit 25 the frames needed as reference images and the frame to be decoded. The stream memory control unit 25 reads those frames from the stream memory 26 in order and supplies them to the stream supply control unit 23.
In step S6, the stream supply control unit 23 supplies the provided frame data to the decoder 28, and the decoder 28 decodes them. Specifically, when decoding a reference image needed for subsequent decoding, the decoder 28 causes, through the processing of the memory control unit 41, the baseband image data generated by decoding to be held in the reference baseband image memory 29, and decodes the compressed stream by using it as a reference image. At this time, when an unneeded B frame is supplied (a supplied B picture that is neither to be decoded nor needed by any frame to be decoded), the decoder 28 skips the decoding of that B picture.
Then, in step S7, the decoder 28 supplies the decoded baseband image data to the baseband image data memory control unit 30. The baseband image data memory control unit 30 supplies the decoded baseband image data provided by the decoder 28 to the baseband image data memory 31, and at the same time reads from the baseband image data memory 31 the baseband image data of the frame indicated by the stream number and frame number requested by the baseband signal processing unit 32 and supplies them to the baseband signal processing unit 32. The baseband signal processing unit 32 applies various corrections, for example color correction, size correction and field control during slow playback, to the supplied baseband image data so that the decoded images are played back correctly, and outputs the generated output baseband image data.
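A minimal sketch of the conventional flow of steps S4 to S7 may help to see where the cost lies: every reference frame back to the preceding I picture is decoded into the decoder's internal reference memory before the requested frame can be decoded. The index table, picture types and reference rules below are simplified assumptions for illustration, not the patent's data structures.
```python
# Minimal sketch of the conventional flow of Fig. 2 (steps S4-S7).
# One Long GOP in display order: frame number -> (picture type, frames it references).
INDEX_TABLE = {
    2: ("I", []), 3: ("B", [2, 5]), 5: ("P", [2]), 8: ("P", [5]),
    11: ("P", [8]), 13: ("B", [11, 14]), 14: ("P", [11]),
}

def reference_chain(frame_no):
    """Frames that must be decoded first, back to the preceding I picture (step S4)."""
    pending, chain = list(INDEX_TABLE[frame_no][1]), []
    while pending:
        ref = pending.pop()
        chain.append(ref)
        pending.extend(INDEX_TABLE[ref][1])
    return sorted(set(chain))

def conventional_decode(frame_no):
    reference_memory_29 = {}                      # decoder-internal reference memory
    for ref in reference_chain(frame_no):         # steps S5/S6: decode every reference first
        reference_memory_29[ref] = f"baseband({ref})"
    # decode the target frame using the internal references, then copy it out (step S7)
    return f"baseband({frame_no}) via refs {sorted(reference_memory_29)}"

print(conventional_decode(13))   # requires decoding I2, P5, P8, P11 and P14 each time
print(conventional_decode(3))    # I2 and P5 must be decoded again from scratch
```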
Scramble playback (also called random playback), in which the frames desired by the user are played back in a designated order, is described with reference to Fig. 3, taking as an example the case in which the frames B13 and B3 are output in that order in the playback device 1 described with Fig. 1 and Fig. 2.
The stream decoding position detection processing unit 22 receives, via the operation input acquisition unit 21, the stream number of the stream to be decoded and the frame numbers of B13 and B3, the frames designated for playback output, and supplies them to the stream supply control unit 23.
The stream supply control unit 23 first requests from the stream memory control unit 25 a stream in time order containing I2, P5, P8, P11 and P14, the frames needed as reference images for decoding B13 (the first frame to be output), and B13 itself, the frame to be output for playback, and supplies it to the decoder 28.
The decoder 28 decodes the supplied data in order (skipping the decoding of B pictures where possible), holds P11 and P14 in the reference baseband image memory 29 through the processing of the memory control unit 41, and decodes B13 using them as reference images. The decoder 28 supplies the baseband image data corresponding to the decoded B13 to the baseband image data memory control unit 30, which stores them in the baseband image data memory 31 (a memory copy). The baseband signal processing unit 32 then applies various corrections (baseband processing) and outputs the output baseband image data.
Next, the stream supply control unit 23 requests from the stream memory control unit 25 a stream in time order containing I2 and P5, the frames needed as reference images for decoding B3 (the second frame to be output), and B3 itself, the frame to be output for playback, and supplies it to the decoder 28.
The decoder 28 decodes the supplied data in order, holds I2 and P5 in the reference baseband image memory 29 through the processing of the memory control unit 41, and decodes B3 using them as reference images. The decoder 28 supplies the baseband image data corresponding to the decoded B3 to the baseband image data memory control unit 30, which stores them in the baseband image data memory 31 (a memory copy). The baseband signal processing unit 32 then applies various corrections (baseband processing) and outputs the output baseband image data.
Through such processing, scramble playback is performed in the conventional playback device 1.
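A short illustration of the cost of this procedure for the playback order B13, B3 of Fig. 3: because the decoder's internal reference memory is cleared between requests, every reference is decoded again for each output frame, so I2 and P5 are decoded twice. The reference lists below are the ones from the example; the counting itself is an illustrative assumption.
```python
# Counting decode operations for the scramble playback order of Fig. 3 (B13 then B3)
# in the conventional device. Reference lists follow the example above.
REFS = {13: [2, 5, 8, 11, 14], 3: [2, 5]}   # frames needed as reference images

conventional_decodes = sum(len(REFS[f]) + 1 for f in [13, 3])
print(conventional_decodes)                  # 9 decodes; I2 and P5 are decoded twice
```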
In the playback device 1 described with Fig. 1, the compressed stream is supplied to the decoder 28 in time order and decoded, and the baseband image data generated by decoding are output. When the supplied compressed stream has been compressed using inter-frame references, the decoder 28 decodes the compressed stream by causing, through the processing of the memory control unit 41, the reference images needed for subsequent decoding to be held in the reference baseband image memory 29.
Specifically, when the supplied compressed stream has been encoded in the MPEG-2 Long GOP format, the decoder 28 decodes an I picture and holds it in the reference baseband image memory 29, decodes a P picture with reference to that I picture and holds it in the reference baseband image memory 29, and decodes B pictures with reference to the held I picture or P pictures. When the decoding of the frames that refer to them has finished, the reference image data held in the reference baseband image memory 29 are discarded.
In other words, the reference baseband image memory 29, controlled by the memory control unit 41, is used in the decoding processing of the decoder 28 only for temporarily storing the reference image data when the decoder 28 decodes compressed data that uses inter-frame references.
With such a scheme, it is difficult to decode temporally random images at high speed. For example, when an arbitrary previously decoded frame is to be used as a reference image, if the image that the decoder 28 can refer to is not present in the reference baseband image memory 29, the baseband picture frame needed as a reference image must be supplied to the decoder 28 and decoded once again and placed in the reference baseband image memory 29. Only then can the decoder 28 decode the baseband picture frame to be output for playback, so it is difficult to decode temporally random images at high speed.
Therefore, with the conventional decoding method, the playback output cannot follow an instruction for scramble playback (random playback) at high speed. Cases in which it is difficult to play back frames at random positions at high speed include, for example, jog playback in which the number of frames advanced is proportional to the amount of operation (for example, the amount of rotation when the operation input device is a jog dial), shuttle playback in which playback or search proceeds at a speed proportional to the amount of operation, and various other kinds of playback such as reverse playback, pause, frame-by-frame advance and frame-by-frame reverse; for variable-speed trick playback, reverse playback and other such instructions, it is difficult for the playback output to follow at high speed.
Summary of the invention
The present invention has been made in view of such circumstances, and makes it possible for the playback output to follow an instruction for scramble playback (random playback) at high speed.
A decoding method according to one aspect of the present invention is characterized by including: a first deciding step of deciding, from among a plurality of decoding methods, the decoding method for an encoded stream in accordance with playback information representing the playback mode of the encoded stream; a reference image information obtaining step of obtaining, in accordance with the playback information, reference image information representing the reference images needed to decode the object image of the encoded stream; a second deciding step of deciding the decoding method for the encoded stream in accordance with the number of past-side reference frames included in the reference image information and the number of frames that can be decoded within the display time of one frame; a reference image decoding step of, when a reference image indicated by the reference image information obtained in the obtaining step is not stored in a prescribed storage area, obtaining from the encoded stream the reference image that is not stored in the storage area, decoding it in accordance with the decoding method decided in the first deciding step, and storing it in the storage area; and a decoding step of decoding the object image of the encoded stream in accordance with the decoding method decided in the first deciding step, using the reference images stored in the storage area, thereby generating image data.
Decoding may be performed by a plurality of decoding methods, and a first deciding step may further be included that decides the decoding order and decoding method for the encoded stream in accordance with the playback information input in the processing of a playback information input step; in that case, the processing of the reference image decoding step and the decoding step is performed in accordance with the decoding order and decoding method decided in the processing of the first deciding step.
In the processing of the determining step, it may be determined whether the reference image has been decoded by the decoding method decided in the processing of the first deciding step and is stored in the storage area.
In the processing of the reference image decoding step and the decoding step, any one of a plurality of decoding processes of different resolutions may be performed in accordance with the decision made in the processing of the first deciding step.
In the processing of the reference image decoding step and the decoding step, any one of decoding processes corresponding to a plurality of encoding systems may be performed in accordance with the decision made in the first deciding step.
The encoded stream may contain I pictures, P pictures and B pictures, and in the processing of the decoding step, when the object image of the encoded stream is a B picture, the I picture or P picture that is temporally near the object image may be decoded.
The encoded stream may contain intra-coded images and inter-predictive-coded images, and in the processing of the decoding step, when the object image of the encoded stream is an inter-predictive-coded image, the intra-coded image that is temporally near the object image may be decoded.
The method may further include a second deciding step of deciding the decoding method for the encoded stream in accordance with the number of past-side reference frames included in the reference image information and the number of frames that can be decoded within the display time of one frame.
The method may further include a calculation step of calculating the number of frames that can be decoded within the display time of one frame.
The method may further include a calculation step of calculating the number of past-side reference frames included in the reference image information.
A decoding device according to one aspect of the present invention includes: a control unit that decides, from among a plurality of decoding methods, a first decoding method for an encoded stream in accordance with playback information representing the playback mode of the encoded stream, obtains, in accordance with the playback information, reference image information representing the reference images needed to decode the object image of the encoded stream, and decides a second decoding method for the encoded stream in accordance with the number of past-side reference frames included in the reference image information and the number of frames that can be decoded within the display time of one frame; and a decoding unit that, when a reference image indicated by the reference image information is not stored in a prescribed storage area, obtains from the encoded stream the reference image that is not stored in the storage area, decodes it in accordance with the first decoding method and stores it in the storage area, and decodes the object image of the encoded stream using the reference images stored in the storage area.
A decoding method according to one aspect of the present invention includes: a playback information input step of accepting input of playback information representing the playback mode of an encoded stream; a reference image information obtaining step of obtaining, in accordance with the playback information input in the processing of the playback information input step, reference image information representing the reference images needed to decode the object image of the encoded stream; a determining step of determining, in accordance with the reference image information obtained in the processing of the reference image information obtaining step, whether the reference image is stored in a storage area; a reference image decoding step of, when it is determined in the processing of the determining step that the reference image is not stored in the storage area, obtaining from the encoded stream the reference image that is not stored in the storage area, decoding it, and controlling so that it is stored in the storage area; and a decoding step of generating image data by decoding the object image of the encoded stream using the reference images stored in the storage area, and controlling so that the image data are stored in the storage area.
In one aspect of the present invention, input of playback information representing the playback mode of an encoded stream is accepted; reference image information representing the reference images needed for the image to be decoded is obtained in accordance with the playback information; on the basis of a determination of whether the reference images are stored in a prescribed storage area, the encoded images corresponding to the reference images are obtained from the encoded stream and decoded as needed, and the generated reference images are stored in the prescribed storage area; at the same time, the image to be decoded is decoded in accordance with the reference images stored in the prescribed storage area and is also recorded in the prescribed storage area.
According to one aspect of the present invention, the decoded stream image data, in particular the reference frames and the decoded frames for playback output, are stored in the same storage area, so scramble playback can be performed at high speed.
Description of drawings
Fig. 1 is a block diagram showing the structure of a conventional playback device.
Fig. 2 is a flowchart for explaining conventional decoding processing.
Fig. 3 is a diagram for explaining scramble playback in the conventional playback device.
Fig. 4 is a block diagram showing the structure of a playback device to which the present invention is applied.
Fig. 5 is a diagram for explaining an editing screen.
Fig. 6 is a diagram for explaining an index table.
Fig. 7 is a diagram for explaining scramble playback in the playback device of Fig. 4.
Fig. 8 is a diagram for explaining the information managed by the baseband memory management unit of Fig. 4.
Fig. 9 is a block diagram showing the structure of a personal computer.
Fig. 10 is a functional block diagram for explaining the functions realized on the personal computer of Fig. 9.
Fig. 11 is a flowchart for explaining decoding processing 1.
Fig. 12 is a flowchart for explaining depth information calculation processing.
Fig. 13 is a flowchart for explaining decoding processing 2.
Fig. 14 is a flowchart for explaining decoding switching decision processing.
Fig. 15 is a flowchart for explaining decodable frame number calculation processing.
Fig. 16 is a block diagram showing the structure of a conversion device.
Fig. 17 is a diagram for explaining an index file.
Fig. 18 is a diagram for explaining the case where the bit rate of I pictures is fixed.
Fig. 19 is a diagram for explaining the case where the bit rate of I pictures is varied.
Fig. 20 is a diagram for explaining random playback of a P picture.
Fig. 21 is a diagram for explaining random playback of a B picture.
Fig. 22 is a functional block diagram for explaining the functions realized on the personal computer of Fig. 9.
Fig. 23 is a flowchart for explaining proxy file generation processing 1.
Fig. 24 is a diagram for explaining the case where the PI conversion number is changed.
Fig. 25 is a diagram for explaining the minimum number of decodes.
Fig. 26 is a diagram for explaining the minimum number of decodes.
Fig. 27 is a diagram for explaining the minimum number of decodes.
Fig. 28 is a diagram for explaining the minimum number of decodes.
Fig. 29 is a diagram for explaining the minimum number of decodes.
Fig. 30 is a flowchart for explaining setting processing for the P pictures to be converted.
Fig. 31 is a diagram for explaining the setting of the P pictures to be converted.
Fig. 32 is a flowchart for explaining proxy file generation processing 2.
Embodiment
Embodiments of the present invention are described below. The correspondence between the structural elements of the present invention and the embodiments described in the specification or the drawings is exemplified as follows. This description is for confirming that embodiments supporting the present invention are described in the specification or the drawings. Therefore, even if there is an embodiment that is described in the specification or the drawings but is not described here as corresponding to a structural element of the present invention, that does not mean that the embodiment does not correspond to that structural element. Conversely, even if an embodiment is described here as corresponding to a certain structural element, that does not mean that the embodiment does not correspond to structural elements other than that one.
A program or decoding method according to one aspect of the present invention causes a computer to perform processing including the following steps: a playback information input step (for example, the processing of step S41 of Fig. 11) of accepting input of playback information (for example, the designation of the frames to be played back, the playback speed or the playback mode) representing the playback mode of an encoded stream; a reference image information obtaining step (for example, the processing of step S46 of Fig. 11) of obtaining, in accordance with the playback information input in the processing of the playback information input step, reference image information representing the reference images needed to decode the object image of the encoded stream; a determining step (for example, the processing of step S47 of Fig. 11) of determining, in accordance with the reference image information obtained in the processing of the reference image information obtaining step, whether the reference image is stored in a prescribed storage area (for example, the baseband image data memory 223 in the RAM of Fig. 10); a reference image decoding step (for example, the processing of step S48 of Fig. 11) of, when it is determined in the processing of the determining step that the reference image is not stored in the storage area, obtaining from the encoded stream the reference image that is not stored in the storage area, decoding it, and controlling so that it is stored in the storage area; and a decoding step (for example, the processing of step S49 of Fig. 11) of generating image data by decoding the object image of the encoded stream using the reference images stored in the storage area, and controlling so that the image data are stored in the storage area.
Decoding may be performed by a plurality of decoding methods, and a first deciding step (for example, the processing of step S44 of Fig. 11) may further be included that decides the decoding order and decoding method for the encoded stream in accordance with the playback information input in the processing of the playback information input step; in that case, the processing of the reference image decoding step and the decoding step can be performed in accordance with the decoding order and decoding method decided in the processing of the first deciding step.
In the processing of the reference image decoding step and the decoding step, any one of decoding processes corresponding to a plurality of encoding systems (for example, a decoding process dedicated to a prescribed picture type, or a decoding process for a format other than MPEG) may be performed in accordance with the decision made in the first deciding step.
The encoded stream may contain intra-coded images (for example, I pictures) and inter-predictive-coded images (for example, P pictures or B pictures), and in the processing of the decoding step, when the object image of the encoded stream is an inter-predictive-coded image, the intra-coded image that is temporally near the object image may be decoded.
The method may further include a second deciding step (for example, the processing of step S127 of Fig. 13, that is, the processing shown in Fig. 14) of deciding the decoding method for the encoded stream in accordance with the number of past-side reference frames included in the reference image information (for example, a value corresponding to the depth information) and the number of frames that can be decoded within the display time of one frame (for example, the decodable frame number).
The method may further include a calculation step (for example, the processing of step S164 of Fig. 14) of calculating the number of frames (for example, the decodable frame number) that can be decoded within the display time of one frame.
The method may further include a calculation step (for example, the processing described with Fig. 12) of calculating the number of past-side reference frames (for example, a value corresponding to the depth information) included in the reference image information.
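A minimal sketch of the comparison that the second deciding step describes may clarify it: the number of past-side reference frames (the depth) is weighed against how many frames can be decoded within the display time of one frame, and a cheaper decoding method is chosen when the reference chain is too deep. The threshold logic, timing figures and names below are illustrative assumptions, not values or steps from the flowcharts of Figs. 12 to 15.
```python
# Illustrative sketch of the second deciding step: compare the reference depth of the
# requested frame with the number of frames decodable within one display period.
def decodable_frames_per_display_period(decode_time_per_frame_ms, frame_rate=29.97):
    display_period_ms = 1000.0 / frame_rate
    return int(display_period_ms // decode_time_per_frame_ms)

def choose_decoding_method(depth_of_past_frames, decode_time_per_frame_ms):
    budget = decodable_frames_per_display_period(decode_time_per_frame_ms)
    if depth_of_past_frames + 1 <= budget:
        return "decode full reference chain"        # latency stays within one frame time
    if budget >= 2:
        return "decode nearby I/P picture instead"  # cheaper substitute output
    return "decode nearby I picture only"

print(choose_decoding_method(depth_of_past_frames=5, decode_time_per_frame_ms=4.0))
print(choose_decoding_method(depth_of_past_frames=5, decode_time_per_frame_ms=12.0))
```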
A decoding device according to one aspect of the present invention (for example, the playback device of Fig. 4 or the personal computer 201 of Fig. 9) includes: a playback information input unit (for example, the operation input acquisition unit 21 of Fig. 4) that accepts input of playback information (for example, the designation of the frames to be played back, the playback speed or the playback mode) representing the playback mode of an encoded stream; a reference image information management unit (for example, the index management unit 24 of Fig. 4) that manages reference image information representing the reference images needed to decode the object image of the encoded stream; a decoding unit (for example, the decoding processing unit 76 of Fig. 4) that decodes the encoded stream and generates image data; a storage unit (for example, the baseband image data memory 31 of Fig. 4) that stores the image data generated by the decoding unit; a memory management unit (for example, the baseband memory management unit 73 of Fig. 4) that manages the storage state of the image data stored in the storage unit; and a decoding control unit (for example, the decoding control unit 72 of Fig. 4) that controls the decoding processing of the decoding unit by referring to the reference image information managed by the reference image information management unit and the storage state managed by the memory management unit, wherein the decoding unit decodes the encoded stream using any of the image data stored in the storage unit as reference images, and supplies the decoded object image to the storage unit to be stored.
Embodiments of the present invention are described below with reference to the drawings.
Fig. 4 is a block diagram showing a configuration example of a playback device 61 to which the present invention has been applied.
Parts corresponding to those of the prior art are given the same reference numerals, and their description is omitted where appropriate.
That is, the playback device 61 of Fig. 4 is provided with a stream supply control unit 74 in place of the stream supply control unit 23, a decoding processing unit 76 in place of the decoder 28, and a baseband image data memory control unit 77 in place of the baseband image data memory control unit 30; the reference baseband image memory 29 is omitted, and a GUI display control unit 71, a decoding control unit 72, a baseband memory management unit 73 and a reference image designation unit 75 are newly provided. Apart from this, it has basically the same structure as the playback device 1 described with Fig. 1.
The GUI display control unit 71 provides a GUI (graphical user interface) that prompts the user for operation input, and controls the display of screens used to show output results, for example the display of an editing screen such as that shown in Fig. 5.
The editing screen 101 shown in Fig. 5 provides various GUIs for performing processing such as, for example, playing back a plurality of pieces of stream data at random to find desired edit points, and joining different pieces of stream data together at the desired edit points.
The stream data selected by the user are decoded, for example, in the decoding processing unit 76, stored in the baseband image data memory 31 via the baseband image data memory control unit 77, subjected to prescribed processing in the baseband signal processing unit 32, and then displayed in the playback window 111 through the processing of the GUI display control unit 71. That is, the output baseband image data for playback processed in the baseband signal processing unit 32 are supplied to the GUI display control unit 71 and displayed in the playback window 111 of the editing screen 101 whose display is controlled by the GUI display control unit 71.
The playback position of the stream data is set by moving, in accordance with the user's operation input, the pointer 122 of the scramble bar 121 provided at the bottom of the playback window 111. In addition, the timeline window 112 is provided with a timeline 131 that, for example, notifies the user of the playback position on the time axis of the video stream or audio stream and allows the user to designate a playback position; the playback position can also be set by moving the pointer 132 of the timeline 131 in accordance with the user's operation input.
The operation input acquisition unit 21 acquires the operation input that the user performs with reference to the editing screen 101 described with Fig. 5. It accepts from the user the selection of the stream to be decoded and the designation of the playback start position, that is, operation of the pointer 122 of the scramble bar 121 or the pointer 132 of the timeline 131, and at the same time accepts instructions such as the playback mode and playback speed, and supplies them to the stream decoding position detection processing unit 22. Based on the selection of the stream to be decoded and the designation of the playback start position supplied from the operation input acquisition unit 21, the stream decoding position detection processing unit 22 obtains the stream number of the stream to be played back and the frame number of the playback start position, and supplies them to the decoding control unit 72 together with the instructions such as the playback speed.
The playback modes include, for example, normal playback starting from a prescribed playback position, fast forward, rewind, frame-by-frame playback in either direction, and display of a still image; the speed of fast forward or rewind, the frame interval of frame-by-frame playback, and so on can also be set individually.
In accordance with the stream number of the stream to be played back, the frame number of the playback start position and the instruction contents such as the playback speed and playback mode supplied from the stream decoding position detection processing unit 22, the decoding control unit 72 obtains the index table (Fig. 6), stored in the index management unit 24, in which the various pieces of information needed to decode the corresponding stream are recorded, and further obtains, from the baseband memory management unit 73, information about the baseband image data stored in the baseband image data memory 31. Based on this information, the decoding control unit 72 decides the decoder to be used for decoding and the decoding order.
Fig. 6 shows an example of the index table stored in the index management unit 24.
In the index table shown in Fig. 6 are recorded: the order of the pictures in display order (Display Order), the order of the pictures in stream order (Stream Order), and the offset (Temporal Offset) used to convert the picture arrangement from display order to stream order. The index table also records information indicating the position of the sequence header (Sequence Header) in stream order (in the figure, 1 is recorded where a sequence header exists and 0 where it does not), and information indicating whether there are forward or backward reference images (Forward/Backward Prediction (Picture Type)).
In the present example, for the information indicating whether there are forward or backward reference images, when the supplied bit stream is in the Open GOP format, a forward predictive coded frame (a P picture) is recorded as "10", a bidirectionally predictive coded frame (a B picture) as "11", and an intra-coded frame (an I picture) as "00". When the supplied bit stream is in the Closed GOP format, the two B pictures following intra-coded data (an I picture) refer only to the backward direction, so they are recorded as "01".
The index table also records, for each picture, the picture size (Picture Size) and the address in the stream or in the recording area (Address). The index table further records information (Key Frame Offset (decimal number)) indicating the distance between the picture and the picture that is the key frame (here, the I picture produced by intra-coding), and the Vbv delay (VBV: Video Buffer Verifier), which expresses the occupancy of the decoder's virtual input buffer as a time measured with a 90 kHz clock. In addition, the index table records depth information (Depth of Past Frames for Decoding) indicating the number of forward frames needed for decoding the picture, that is, the depth to the frame farthest from the playback frame.
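A minimal sketch of one index-table entry with the fields named above (display order, stream order, temporal offset, sequence header flag, prediction flags, picture size, address, key frame offset, VBV delay and depth). The field types and the example values are assumptions for illustration only and do not reproduce the concrete layout of Fig. 6.
```python
# Sketch of one index table entry with the fields described for Fig. 6.
from dataclasses import dataclass

@dataclass
class IndexEntry:
    display_order: int
    stream_order: int
    temporal_offset: int         # converts display order to stream order
    sequence_header: bool        # True if a sequence header precedes this picture
    prediction_flags: str        # "00"=I, "10"=P, "11"=B, "01"=backward-only B
    picture_size: int            # bytes
    address: int                 # position of the picture in the stream / recording area
    key_frame_offset: int        # distance to the key frame (the I picture)
    vbv_delay: int               # 90 kHz clock ticks
    depth_of_past_frames: int    # forward frames needed to decode this picture

entry_b13 = IndexEntry(display_order=13, stream_order=15, temporal_offset=2,
                       sequence_header=False, prediction_flags="11",
                       picture_size=45_000, address=0x3A_0000,
                       key_frame_offset=11, vbv_delay=12_000,
                       depth_of_past_frames=5)
```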
That is, the decoding control unit 72 retrieves, in accordance with the index table, the position of the I frame (key frame) needed to decode the frame at the designated playback start position, the address of the stream stored in the stream memory 26, the picture size and so on. In addition, the decoding control unit 72 decides the quality needed for decoding in accordance with information such as the playback speed indicated by the user.
Specifically, the decoding control unit 72 controls the decoding order by any of the methods shown in (1) to (3) below, in accordance with, for example, the processing speed of each part of the playback device 61 and the data rate of the stream being processed, so as to maintain the display quality and the display frame count of the finally output display screen.
(1) Decode sequentially so that the latency until the frame designated for playback is minimized. (2) When the picture type of the frame designated for playback is a B picture, decode and output a nearby I picture or P picture instead. (3) Decode and output only the I picture near the frame designated for playback.
Specifically, the decoding control unit 72 may, for example, adopt the method of (1) when the decoding order can be controlled by the method of (1); when fast forward or rewind beyond a certain degree has been instructed and the display frame count of the finally output display screen cannot be secured by the method of (1), control the decoding order by the method of (2); and when the display frame count of the finally output display screen cannot be secured even by the method of (2), control the decoding order by the method of (3).
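A minimal sketch of this fallback between the three decoding-order policies follows. The capability test (whether a policy can sustain the requested display frame count) is reduced to a boolean callback, and all names are illustrative assumptions rather than interfaces defined by the patent.
```python
# Sketch of the fallback between decoding-order policies (1)-(3).
# `can_sustain` stands in for the device's check that a policy keeps up with the
# requested display frame count; its implementation is an assumption.
def choose_decoding_order_policy(picture_type, can_sustain):
    if can_sustain("sequential_min_latency"):
        return "(1) decode sequentially, minimum latency"
    if picture_type == "B" and can_sustain("substitute_nearby_I_or_P"):
        return "(2) decode and output a nearby I or P picture"
    return "(3) decode and output only the nearby I picture"

# Example: during fast forward only the cheaper substitutions keep up.
print(choose_decoding_order_policy("B", lambda policy: policy != "sequential_min_latency"))
```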
In addition, in accordance with the instructed playback mode, the processing speed of each part of the playback device 61, the data rate of the stream being handled and so on, the decoding control unit 72 can select the most suitable decoder from the plurality of decoders included in the decoding processing unit 76 as the decoder that performs the decoding processing, so as to maintain the display quality and the display frame count of the finally output display screen.
Specifically, when the decoding order is controlled by the method of (1) above, the decoding control unit 72 selects, for example, a high-resolution decoder when the display speed is slowed down beyond a certain degree or a still image is displayed, that is, in a playback mode in which the user can fully perceive the resolution of the displayed image; conversely, when the decoding order is controlled by the method of (1) but the display speed is raised beyond a certain degree or the like, so that it is difficult for the user to perceive the resolution of the displayed image, it selects a low-resolution decoder.
Furthermore, even when playback at the same speed has been instructed, the decoding control unit 72 selects a low-resolution decoder when, for example, the decoding order is controlled by the method of (1), so that the display frame count of the finally output display screen can be secured; on the other hand, when the decoding order is controlled by the method of (2), the display frame count of the finally output display screen can be sufficiently secured, so a high-resolution decoder can be selected.
Moreover, when the decoding order is controlled by the method of (3) or the like, the decoding control unit 72 may select a decoder dedicated to a specific picture type (here, a decoder dedicated to I pictures).
The decoding control unit 72 may also have available a decoder that can decode streams compressed by compression methods other than MPEG; for example, when a compressed stream compressed by a compression method other than MPEG is supplied, it selects a decoder that can decode that compressed stream and causes it to perform the decoding processing.
The decoding control unit 72 may switch the decoding order and the selection of the decoder for each stream or each frame.
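A minimal sketch of the decoder selection described above (high-resolution, low-resolution, I-picture-only, or a non-MPEG decoder), keyed on the compression format, the decoding-order policy and the playback speed. The thresholds and decoder labels are illustrative assumptions, not parameters taken from the patent.
```python
# Sketch of selecting a decoder from the decoding processing unit 76.
def select_decoder(compression, order_policy, playback_speed):
    if compression != "MPEG":
        return "non-MPEG decoder"
    if order_policy == "(3)":
        return "I-picture-only decoder"
    if order_policy == "(2)":
        return "high-resolution decoder"   # frame count is already secured by substitution
    # policy (1): resolution only matters when the user can actually perceive it
    if abs(playback_speed) <= 1.0:         # slow playback, normal speed or still image
        return "high-resolution decoder"
    return "low-resolution decoder"

print(select_decoder("MPEG", "(1)", playback_speed=0.0))   # still image -> high resolution
print(select_decoder("MPEG", "(1)", playback_speed=8.0))   # fast forward -> low resolution
```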
In addition, the decoding control unit 72 decides, in accordance with the index data, the reference frames needed for the frame designated for decoding, and determines, in accordance with the information obtained from the baseband memory management unit 73 about the baseband image data stored in the baseband image data memory 31, whether the necessary reference frames have already been decoded. Then, in accordance with this determination result, the decoding control unit 72 decides the decoding order of the frames that need to be decoded and the decoder that is to perform the decoding. The decoding control unit 72 then supplies the stream number of the stream to be decoded and the frame numbers of the frames to be decoded to the stream supply control unit 74, supplies the stream number of the stream to be decoded and the frame numbers of the frames to be read from the baseband image data memory 31 as reference images (0 to 2 frames per playback frame) to the reference image designation unit 75, and supplies to the decoding processing unit 76 a control signal for controlling the decoding processing, including the result of selecting the decoder to be used for the processing.
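A minimal sketch of this core decision: references already present in the baseband image data memory 31 are reused, and only the missing ones are scheduled for decoding before the target frame. The memory is modelled as a plain dictionary and the scheduling output as a simple list; both are assumptions for illustration.
```python
# Sketch of deciding what still has to be decoded for one requested frame.
# `baseband_memory_31` models the shared baseband image data memory; `index_refs`
# maps each frame to the reference frames it needs. Both are illustrative assumptions.
def plan_decoding(frame_no, index_refs, baseband_memory_31):
    to_decode = []
    for ref in index_refs[frame_no]:                 # reference frames from the index data
        if ref not in baseband_memory_31:            # state reported by management unit 73
            to_decode.append(ref)
    to_decode.append(frame_no)                       # the playback frame itself
    return to_decode

index_refs = {13: [2, 5, 8, 11, 14], 3: [2, 5]}
memory = {}                                          # nothing decoded yet
print(plan_decoding(13, index_refs, memory))         # -> [2, 5, 8, 11, 14, 13]
memory.update({2: "I2", 5: "P5", 8: "P8", 11: "P11", 14: "P14", 13: "B13"})
print(plan_decoding(3, index_refs, memory))          # -> [3]; I2 and P5 are reused
```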
The baseband memory management unit 73 monitors, via the baseband image data memory control unit 77, the baseband image data held in the baseband image data memory 31, and supplies frame information about the baseband image data held in the baseband image data memory 31 to the decoding control unit 72.
In accordance with the stream number of the stream to be played back and the frame numbers of the frames to be decoded supplied from the decoding control unit 72, the stream supply control unit 74 requests the stream data needed for decoding from the stream memory control unit 25, and supplies the stream data provided by the stream memory control unit 25 to the decoding processing unit 76.
The stream memory control unit 25 controls the storage of stream data in the stream memory 26, reads from the stream memory 26 the stream data requested by the stream supply control unit 74, and outputs them to the stream supply control unit 74.
The stream memory 26 stores each piece of stream data and outputs the specified stream data under the control of the stream memory control unit 25.
In accordance with the stream number of the stream to be decoded and the frame numbers of the necessary frames to be read from the baseband image data memory 31 as reference images, supplied from the decoding control unit 72, the reference image designation unit 75 requests the baseband image data memory control unit 77 to read from the baseband image data memory 31 the baseband image data of the necessary frames used as reference images and to supply them to the decoding processing unit 76 as the reference images.
The decoding processing unit 76 is composed of a plurality of decoders (decoder 0 to decoder n); under the control of the decoding control unit 72, it decodes the stream data supplied from the stream supply control unit 74 with a suitable decoder, and supplies the decoded baseband image data to the baseband image data memory control unit 77.
Specifically, the decoding processing unit 76 includes, for example, a high-resolution decoder, a low-resolution decoder, a decoder dedicated to specific pictures, and a decoder that can decode stream data compressed by compression methods other than MPEG; with the decoder selected under the control of the decoding control unit 72, it decodes the compressed stream data supplied from the stream supply control unit 74, referring as needed to the baseband image data supplied as reference images from the baseband image data memory control unit 77.
The baseband image data memory control unit 77 supplies the decoded baseband image data provided by the decoding processing unit 76 to the baseband image data memory 31; at the same time, it accepts from the reference image designation unit 75 the designation of the necessary frames to be read from the baseband image data memory 31 as reference images, reads the designated reference images from the baseband image data memory 31 and supplies them to the decoding processing unit 76.
In addition, the baseband image data memory control unit 77 reads from the baseband image data memory 31 the baseband image data of the frame indicated by the stream number and frame number requested by the baseband signal processing unit 32 and supplies them to the baseband signal processing unit 32. When, for example, variable-speed playback such as jog playback has been instructed, the baseband image data memory control unit 77 can also read from the baseband image data memory 31 the frames specified by the user's operation input and supply them to the baseband signal processing unit 32 for playback output.
The baseband signal processing unit 32 applies various corrections, for example color correction, size correction and field control during slow playback, to the supplied baseband image data so that the decoded images are played back correctly, and outputs the generated output baseband image data to the GUI display control unit 71.
As described above, in the playback device 61 to which the present invention is applied, the baseband image data decoded by any of the decoders in the decoding processing unit 76 are all stored in the baseband image data memory 31, and the reference images are also read from the baseband image data memory 31. In the playback device 1 described with Fig. 1, the reference baseband image memory 29 is controlled by the memory control unit 41 of the decoder 28 and exchanges data only with the decoder 28, and the baseband image data memory 31 supplies the baseband image data provided by the decoder 28 to the baseband signal processing unit 32 under the control of the baseband image data memory control unit 30. In contrast, in the playback device 61 to which the present invention is applied, the baseband image data memory 31 exchanges reference image data with the decoding processing unit 76 under the control of the baseband image data memory control unit 77, and at the same time supplies baseband image data to the baseband signal processing unit 32 in the same way as in the prior art.
In the playback device 61 to which the present invention is applied, when baseband image data that can be used as reference image data are stored in the baseband image data memory 31, those stored baseband image data are used as the reference images, so new frames for generating reference images are not supplied to the decoding processing unit 76. That is, the frames supplied to the decoding processing unit 76 need not be in the usual frame order.
Furthermore, in the playback device 61 to which the present invention is applied, when baseband image data that can be used as reference image data are stored in the baseband image data memory 31, those stored baseband image data are used as the reference images; the decoding processing unit 76 may therefore also decode the bit stream supplied in time order out of that order (when a reference image is already available, the corresponding frame is skipped without being decoded).
For this reason, in the playback device 61 to which the present invention is applied, a baseband memory management unit 73 that monitors the baseband image data stored in the baseband image data memory 31 is provided, a decoding control unit 72 that selects the decoding order and the decoder in accordance with the state of the baseband memory is provided, and a reference image designation unit 75 that instructs the baseband image data memory control unit 77 which reference images to supply to the decoding processing unit 76 is also provided.
Next, scramble playback of the frames desired by the user is described with reference to Fig. 7, taking as an example the case in which the B13 frame and the B3 frame are output in that order.
The stream decoding position detection processing unit 22 receives, via the operation input acquisition unit 21, the stream number of the stream to be decoded and the frame numbers of B13 and B3, the frames designated for playback output, together with instructions such as the playback speed and playback mode, and supplies them to the decoding control unit 72.
In accordance with the stream number of the stream to be played back, the frame numbers of B13 and B3 (the frames designated for playback output) and the instruction contents such as the playback speed and playback mode supplied from the stream decoding position detection processing unit 22, the decoding control unit 72 obtains the index table described with Fig. 6 that is stored in the index management unit 24, obtains the various pieces of information needed to decode the corresponding stream, and further obtains, from the baseband memory management unit 73, information about the baseband image data stored in the baseband image data memory 31. Here, the case is described in which the frames corresponding to the reference images needed to decode B13 and B3, the frames designated for playback output, are not held in the baseband image data memory 31.
Based on this information, the decoding control unit 72 decides the decoder to be used for decoding and the decoding order. That is, when, for example, the playback mode is to play back these two frames successively as still images, the decoding control unit 72 decides to select a high-resolution decoder from the decoding processing unit 76, to decode in order I2, P5, P8, P11 and P14, the frames needed as reference images in order to decode B13, to decode B13 using the reference images P11 and P14, and then, in order to decode B3, to decode B3 using the already decoded I2 and P5 held in the baseband image data memory 31.
The decoding control unit 72 supplies to the stream supply control unit 74 the stream number and the frame numbers of the frames to be supplied to the decoding processing unit 76, namely I2, P5, P8, P11 and P14, the frames needed as reference images for decoding B13 (the first frame to be output), and B13 and B3, the frames to be output for playback.
The stream supply control unit 74 first requests from the stream memory control unit 25 I2, P5, P8, P11 and P14, the frames needed as reference images for decoding B13 (the first frame to be output), and B13 and B3, the frames to be output for playback, and supplies them in order to the decoding processing unit 76.
Under the control of the decoding control unit 72, the decoder selected by the decoding control unit 72 in the decoding processing unit 76 decodes the supplied data starting from I2, supplies the baseband image data generated by decoding to the baseband image data memory control unit 77 and stores them in the baseband image data memory 31; then, when decoding P5, P8, P11 and P14, it receives, via the baseband image data memory control unit 77, the baseband image data of the frames corresponding to the reference images from among the baseband image data held in the baseband image data memory 31, and decodes on the basis of them.
Then, when decoding B13 under the control of the decoding control unit 72, the decoder selected by the decoding control unit 72 in the decoding processing unit 76 receives the baseband image data corresponding to P11 and P14, decodes B13 using them as reference images, stores the baseband image data corresponding to the decoded B13 in the baseband image data memory 31 via the baseband image data memory control unit 77, and then supplies them to the baseband signal processing unit 32. The baseband signal processing unit 32 then applies various corrections (baseband processing) and outputs the output baseband image data.
Then, when next decoding B3, the decoder selected by the decoding control unit 72 in the decoding processing unit 76 receives, under the control of the decoding control unit 72, the baseband image signals corresponding to I2 and P5 stored in the baseband image data memory 31, decodes B3 using them as reference images, stores the baseband image data corresponding to the decoded B3 in the baseband image data memory 31 via the baseband image data memory control unit 77, and then supplies them to the baseband signal processing unit 32. The baseband signal processing unit 32 then applies various corrections (baseband processing) and outputs the output baseband image data.
In this way, the storage of the decoded baseband image data and the storage of the reference images are performed in a single memory, so no memory copy operation is needed for output processing; moreover, in scramble playback there is no need to decode the same frame repeatedly in order to generate reference images, so the playback frames can be decoded at high speed.
In addition, the memory that carries out with reference to the preservation of image is separated with decoder, part beyond decoder is controlled, therefore increased the degree of freedom of the control of decoding order, can switch inlet flow to each frame in addition, to each frame switching encoding/decoding device, therefore can corresponding various decoding algorithms.
For the processing of carrying out having illustrated with Fig. 7,73 pairs of information as shown in Figure 8 of base band memory management unit manage.Promptly, whether each frame of 73 pairs of base band memory management units expression is present in the baseband images data with the sign in the memory 31 (Exist:1 is present in the baseband images data with in the memory 31 after representing to decode) and represent that these decoded frames are the value (decoder type (Decoder Type): for example under situation shown in Figure 8 by which decoder decode in the decoding processing parts 76 after decoding, highresolution decoder is 1, and low resolution decoder is 2 etc.) manage.
Decoding control section part 72 is using high-resolution decoder to carry out under the situation of decoding, in order to prevent the deterioration of image, even under low resolution, carried out decoding and be kept at the baseband images data with in the memory 31 to being used as with reference to the baseband images data of image, do not use this reference image data yet, and can decode to the compressed image frame of correspondence and as with reference to image with high-resolution decoder.
In addition, base band memory management unit 73 also can be to each frame managerial skills size (Horizontal Size), vertical size (Vertical Size), chroma format (Chroma Format) etc., base band signal process parts 32 are according to these information, and, the baseband images data that provided are adjusted size according to output format.
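A minimal sketch of such a per-frame management record is shown below; the field names follow the table of Fig. 8, but the class itself and the helper are assumptions made only for illustration.

```python
from dataclasses import dataclass

# Illustrative record kept per frame by the baseband memory management unit.
# Field names follow Fig. 8; the structure itself is hypothetical.
@dataclass
class BasebandFrameInfo:
    exist: bool          # True (1): decoded data is present in the baseband memory
    decoder_type: int    # which decoder produced it, e.g. 1 = high-res, 2 = low-res
    horizontal_size: int # Horizontal Size of the decoded frame
    vertical_size: int   # Vertical Size of the decoded frame
    chroma_format: str   # Chroma Format, e.g. "4:2:2" or "4:2:0"

# Example: P11 decoded by the high-resolution decoder and held in memory.
table = {"P11": BasebandFrameInfo(True, 1, 1920, 1080, "4:2:2")}

def usable_as_reference(name, table, need_high_resolution=True):
    """A frame can serve as a reference only if it exists and, when high-resolution
    decoding is requested, was itself decoded at high resolution."""
    info = table.get(name)
    if info is None or not info.exist:
        return False
    return (not need_high_resolution) or info.decoder_type == 1
```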
The series of processing described above can be executed by hardware or by software. In the latter case, the playback device 61 is implemented, for example, by a personal computer 201 as shown in Fig. 9.
In Fig. 9, a CPU (central processing unit) 221 executes various processing according to a program stored in a ROM (read-only memory) 222 or a program loaded into a RAM (random access memory) 223 from an HDD 226. The RAM 223 also stores, as appropriate, data needed by the CPU 221 to execute the various processing.
The CPU 221, the ROM 222 and the RAM 223 are interconnected via a bus 224. Also connected to the bus 224 are interfaces (I/F) 225-1 to 225-3, the HDD (hard disk drive) 226, an image special-effect/audio mixing processing unit 227 and a signal processing unit 228.
The interface 225-1 is connected to input devices such as a keyboard 202 and a mouse 203. The interface 225-2 is connected to a storage device 204 and can exchange information with it. The interface 225-3 is connected to external image recording/playback devices 205-1 to 205-m and can exchange information with them. The HDD 226 drives a hard disk and can store various information.
The image special-effect/audio mixing processing unit 227 is also connected to the signal processing unit 228, the storage device 204 and the external image recording/playback devices 205-1 to 205-m; it applies special effects to, or mixes the audio of, a video signal supplied from the storage device 204, any of the external image recording/playback devices 205-1 to 205-m, or the HDD 226 via the bus 224, and supplies the result to the signal processing unit 228 for output, or supplies it to the storage device 204 or one of the external image recording/playback devices 205-1 to 205-m for storage.
The signal processing unit 228 is also connected to a display 229 and a speaker 230; for example, it supplies the video signal provided by the image special-effect/audio mixing processing unit 227 or the like to the display 229 for display, and supplies the audio signal to the speaker 230 for audio output.
The display 229 is composed of, for example, a CRT (cathode ray tube) or an LCD (liquid crystal display) and displays the image supplied from the signal processing unit 228. The speaker 230 reproduces the sound supplied from the signal processing unit 228.
A drive 231 is also connected to the bus 224 as needed; a removable medium 206 such as a magnetic disk, optical disc, magneto-optical disc or semiconductor memory is mounted on it as appropriate, and a computer program read from the medium is installed on the HDD 226 as needed.
Fig. 10 is a functional block diagram showing an example of the functions provided when the processing to which the present invention is applied is executed by software in the personal computer 201 described with Fig. 9.
Parts corresponding to those in Fig. 4 or Fig. 9 are given the same reference numerals, and their description is omitted where appropriate.
By the CPU 221 executing a predetermined program, the personal computer 201 has: a GUI control unit 251 having essentially the same functions as the operation input acquisition unit 21 and the GUI display control unit 71 of Fig. 4; a stream supply control unit 252 having essentially the same functions as the stream supply control unit 74 and the stream memory control unit 25 of Fig. 4; a reference image indicator 253 having essentially the same functions as the reference image indicator 75 and the baseband image data memory control unit 77 of Fig. 4; and units having the same functions as the stream decoding position detection processing unit 22, the decoding control unit 72, the decoding processing unit 76 and the baseband signal processing unit 32 described with Fig. 4.
In accordance with a user operation entered through an input device such as the keyboard 202 or the mouse 203, the CPU 221 refers to the index table recorded in the storage device 204, the external image recording/playback device 205 or the HDD 226, uses a predetermined storage area of the RAM 223 as the baseband image data memory 31, refers to the baseband image data stored in that baseband image data memory 31, and determines the decoding order and the decoding method for the frames, designated by the user, of a stream included in the stream data stored in the storage device 204, the external image recording/playback device 205 or the HDD 226.
Since the CPU 221 uses the predetermined storage area of the RAM 223 as the baseband image data memory 31, it supplies the decoded baseband image data to the RAM 223 for storage, and uses the baseband image data stored in the RAM 223 as reference images for decoding. The CPU 221 also reads the baseband image data stored in the RAM 223, applies various corrections such as color correction, size correction and field control for slow playback so that the decoded image is played back correctly, and supplies the generated output baseband image data to the display 229 for display.
That is, the CPU 221 uses the predetermined storage area of the RAM 223 as the baseband image data memory 31, and stores the decoded baseband image data without distinguishing between reference-image use and output-signal use, so that the same baseband image data are used both as reference images and as the output signal.
Next, decoding process 1, executed by the playback device 61 described with Fig. 4 or by the personal computer 201 described with Fig. 9 and Fig. 10, is described with reference to the flowchart of Fig. 11.
In step S41, the stream decoding position detection processing unit 22 (the stream decoding position detection processing unit 22 of the CPU 221) receives the designation of the playback position (stream number, frame number) via the operation input acquisition unit 21 (the GUI control unit 251 of the CPU 221), and supplies it to the decoding control unit 72 (the decoding control unit 72 of the CPU 221).
In step S42, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) obtains the index table corresponding to the supplied stream number from the index management unit 24 (the storage device 204, the external image recording/playback device 205 or the HDD 226).
In step S43, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) extracts from the index table the information needed for decoding, such as the picture type, information on the reference images, the data length and the address in memory.
In step S44, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) determines, according to the instruction from the user, the decoding order and the decoder (decoding method) for the frames to be output for playback.
Specifically, when low-speed playback is instructed by the user, for example, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) can have the frames instructed for playback decoded in order with the high-resolution decoder so that the waiting time is shortest. However, when fast-forward or rewind at a somewhat high speed is instructed, decoding the playback frames in order with the high-resolution decoder cannot secure the number of display frames required for the final output, so the decoding control unit can instead have the nearby I picture or P picture of the frame instructed for playback decoded and output, or have the decoding performed with the low-resolution decoder. Furthermore, in a playback mode at still higher speed where image quality is not required, it can have only the nearby I picture of the frame instructed for playback decoded and output with an I-picture-dedicated decoder.
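The following small sketch illustrates this kind of selection; the mode names and speed thresholds are assumptions made for the example, not values taken from the description.

```python
# Illustrative selection of a decoding strategy from the playback instruction.
# Mode names and speed thresholds are assumed for this example only.
def choose_strategy(mode, speed):
    if mode == "slow" or speed <= 1.0:
        return "decode every playback frame with the high-resolution decoder"
    if speed < 4.0:
        return "decode the nearby I or P picture, or use the low-resolution decoder"
    return "decode only the nearby I picture with an I-picture-dedicated decoder"

for mode, speed in [("slow", 0.5), ("fast-forward", 2.0), ("fast-forward", 8.0)]:
    print(mode, speed, "->", choose_strategy(mode, speed))
```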
In step S45, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) refers to the information recorded in the index table and retrieves the number of frames (depth: Depth of Past Frames for Decoding) between the playback frame and the farthest preceding frame needed as a reference image for decoding.
In step S46, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) refers to the information recorded in the index table and obtains the frame numbers of the frames needed as reference images to decode the frame of the designated frame number (when the frame designated for decoding is not an I picture, at least one frame including the preceding I picture). For example, as described with Fig. 7, when playback output of the B13 frame is instructed, the decoding control unit obtains the fact that I2, P5, P8, P11 and P14 are needed as reference images, and when playback output of the B3 frame is instructed, that I2 and P5 are needed as reference images.
In step S47, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) refers to the baseband image data stored in the baseband image data memory 31 and managed by the baseband memory management unit 73 (that is, refers to the baseband image data in the storage area of the RAM 223 used as the baseband image data memory 31), and determines whether all necessary reference images are present in the baseband image data memory 31 at the required quality.
When it is determined in step S47 that not all the necessary reference images are present in the baseband image data memory 31 at the required quality, in other words, when among the necessary reference images there is one that does not exist in the baseband image data memory 31 at the required quality, in step S48 the decoding control unit 72 (the decoding control unit 72 of the CPU 221) has the reference images that are not present in the baseband image data memory 31 decoded with the corresponding decoder (decoding method).
That is, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) has the stream supply control unit 74 read, via the stream memory control unit 25, the compressed image frames corresponding to the reference images not present in the baseband image data memory 31 from the stream memory 26, supply them to the decoding processing unit 76 for decoding, and supply the result via the baseband image data memory control unit 77 to the baseband image data memory 31 for storage as baseband image data (it has the stream supply control unit 252 of the CPU 221 read the stream data stored in the storage device 204, the external image recording/playback device 205 or the HDD 226, supply it to the decoding processing unit 76 for decoding, and supply the result for storage to the storage area of the RAM 223 used as the baseband image data memory 31); the process then returns to step S47 and the subsequent processing is repeated.
When it is determined in step S47 that all the necessary reference images are present in the baseband image data memory 31 at the required quality, in step S49 the decoding processing unit 76 (the decoding processing unit 76 of the CPU 221), under the control of the decoding control unit 72, decodes the playback frame with the corresponding decoder (decoding method) using the reference images stored in the baseband image data memory 31 (the storage area of the RAM 223 used as the baseband image data memory 31), and supplies the result to the baseband image data memory 31 (the storage area of the RAM 223 used as the baseband image data memory 31).
In step S50, the baseband signal processing unit 32 (the baseband signal processing unit 32 of the CPU 221) obtains the baseband image data decoded under the control of the decoding control unit 72 and stored in the baseband image data memory 31 (the storage area of the RAM 223 used as the baseband image data memory 31), applies various corrections (baseband processing), and outputs the generated output baseband image data frame, which is displayed by the processing of the GUI display control unit 71 (output to and displayed on the display 229 by the processing of the GUI control unit 251 of the CPU 221), and the process ends.
With such processing, in shuffle playback the same frame does not have to be decoded repeatedly to generate reference images, so the playback frames can be decoded at high speed.
That is, as described above, by controlling the decoding order it is possible to shorten, for example, the waiting time of playback that starts from a specific position, such as shuffle playback (random playback), in other words of so-called special-effect playback, and to play back and output the desired frames with the shortest delay.
Furthermore, since the memory for reference images and the memory for output images are unified into one, the total number of memories in the device can be reduced and the number of memory copies until the decoded data is output can be reduced, so the waiting time can be shortened.
In addition, the memory holding the reference images is separated from the decoders and controlled by a unit other than the decoders, which increases the freedom in controlling the decoding order; moreover, the input stream or the decoder can be switched for each frame, so various decoding algorithms can be supported.
With such a configuration, the decoder and the decoding method can be selected in accordance with the processing capability of the device, the playback speed or the required image quality. For example, whereas high-resolution decoding is normally performed in playback output order, low-resolution decoding may be performed as needed, or B pictures may be skipped and only I pictures or P pictures decoded, or only specific picture types such as I pictures (intra-frame compressed) decoded, in accordance with the playback output speed or the required image quality.
Therefore, even for complicated playback instructions such as shuffle playback, which a conventional playback device cannot follow at high speed, an optimal decoding algorithm and decoder (decoding method) can be selected in consideration of the balance between the image quality of the played-back images and the decoding time, and decoding with the shortest delay and without waste can be realized.
Furthermore, with such a configuration, a decoder capable of decoding streams compressed by the MPEG scheme and a decoder capable of decoding streams compressed by another scheme can be provided separately and switched as appropriate for decoding.
When the present invention is applied, baseband image data decoded by a plurality of decoders or decoding methods are stored without distinguishing between reference-image use and output use; therefore, information on which decoding method decoded each item of baseband image data is managed. This information is, for example, the decoder type (Decoder Type) flag information described with Fig. 8.
For example, when high-resolution decoding is performed, as in low-speed playback or still-image display, the information on which decoding method decoded each item of baseband image data is referred to, and, to prevent image degradation, baseband image data decoded at low resolution are not used as reference image data; only baseband image data decoded at high resolution are used as reference images.
Although the case of MPEG2 Long GOP composed of I pictures, P pictures and B pictures has mainly been described here, the present invention is also applicable to the playback of decoded data that have been compression-coded using inter-frame reference.
In this way, at the time of shuffle playback or the like, low-resolution decoding may be performed as needed, or B pictures may be skipped and only I pictures or P pictures decoded, or only specific picture types such as I pictures (intra-frame compressed) decoded, in accordance with the playback output speed, the required image quality and so on; the optimal decoding algorithm and decoder (decoding method) are thus selected in consideration of the balance between the image quality of the played-back images and the decoding time, and decoding with the shortest delay and without waste can be realized.
In addition, by using the decoding depth information (Depth of Past Frames for Decoding) of the index table described with Fig. 6, decoding processing with the shortest delay and the highest possible image quality can be performed.
Next, the method of using the decoding depth information (Depth of Past Frames for Decoding) of the index table in the control of the decoding processing performed by the decoding control unit 72 of Fig. 4 or Fig. 10 is described concretely.
As described above, the decoding depth information (Depth of Past Frames for Decoding) of the index table described with Fig. 6 expresses the number of decoding operations needed to obtain the past-side reference frames. Only B pictures refer to frames on the future side, and the number of future-side reference frames of a B picture is one. Therefore the number of decoding operations needed to obtain all the reference frames of a B picture can be set to "the past-side decoding count + 1", and it suffices for the index table to hold only the past decoding depth.
That is, since no past-side reference frame is needed to decode the leading I picture of a GOP, its depth value is 0. The depth value of each P picture is a value incremented by 1 each time a P picture appears, counting from the head of the GOP. Since no past-side reference frame is needed to decode the first two B pictures at the head of the stream or of a closed GOP, their depth value is 0, whereas the first two B pictures following the I picture of an open GOP that is not at the head of the stream have a depth value equal to the depth value of the P picture preceding that I picture plus 1, and every other B picture has a depth value equal to that of the preceding P picture.
The decoding control unit 72 of Fig. 4 or Fig. 10 can control the decoding processing using the decoding depth information (Depth of Past Frames for Decoding) of the index table described with Fig. 6.
The depth information calculation process, which computes the depth information while scanning the stream from its head, is described with reference to the flowchart of Fig. 12.
When playback processing is performed in the playback device 61 described with Fig. 4, the depth information calculation process may be executed in the index management unit 24 that stores the index table, or may be executed in a device other than the playback device 61 (for example a general-purpose personal computer, specifically a computer having the functions shown in the functional block diagram described with Fig. 10, such as the personal computer 201), with the generated index table supplied to the playback device 61 and stored in the index management unit 24.
When playback processing is performed in the personal computer 201 described with Fig. 9 and Fig. 10, the depth information calculation process may be executed in the CPU 221 (in which case the processing is executed by a function not shown in the functional block diagram described with Fig. 10), or may be executed in a device other than the personal computer 201 (for example a general-purpose personal computer, specifically a computer having the functions shown in the functional block diagram described with Fig. 10), with the generated index table supplied to the personal computer 201 and stored in the HDD 226; alternatively, the generated index table may be stored in the storage device 204 or the external image recording/playback device 205 and supplied to the personal computer 201.
The flowchart of Fig. 12 describes the case where the depth information calculation process is executed in the CPU 221 of the personal computer 201 (by a function not shown in the functional block diagram described with Fig. 10), but as described above it may also be executed elsewhere than the CPU 221 of the personal computer 201.
In step S81, the CPU 221 of the personal computer 201 accepts input, in stream order (Stream Order), of the stream for which the index table is to be generated.
In step S82, the CPU 221 determines whether the picture type of the next picture is an I picture. When it is determined in step S82 that the picture type of the next picture is not an I picture, the process advances to step S85 described later.
When it is determined in step S82 that the picture type of the next picture is an I picture, in step S83 the CPU 221 sets depth = depth + 1, where depth is the value of a first counter used to calculate the decoding depth, and then sets prev_depth = depth, where prev_depth is the value of a second counter used to calculate the decoding depth.
Here, depth, the first counter used to calculate the decoding depth, is the counter used to calculate the decoding depth of the P pictures in a GOP and of the B pictures other than the first two B pictures of the GOP, and prev_depth, the second counter, is the counter used to calculate the decoding depth of the first two B pictures of the GOP.
In step S84, the CPU 221 sets depth = 0 for the first counter, and the process advances to step S89 described later.
When it is determined in step S82 that the picture type of the next picture is not an I picture, in step S85 the CPU 221 determines whether the picture type of the next picture is a P picture. When it is determined in step S85 that it is not a P picture, the process advances to step S88 described later.
When it is determined in step S85 that the picture type of the next picture is a P picture, in step S86 the CPU 221 resets the value of prev_depth, the second counter used to calculate the decoding depth, setting prev_depth = 0.
In step S87, the CPU 221 sets depth = depth + 1 for the first counter used to calculate the decoding depth, and the process advances to step S89 described later.
When it is determined in step S85 that the picture type of the next picture is not a P picture, that is, that it is a B picture, in step S88 the CPU 221 determines whether depth = 0, in other words, whether the picture is one of the first two B pictures of the GOP.
After the processing of step S84 or step S87 ends, or when it is determined in step S88 that depth = 0 does not hold, in step S89 the CPU 221 sets the decoding depth (Depth of Past Frames for Decoding) of the picture to decoding depth = depth, and the process advances to step S93 described later.
Specifically, when the picture type is an I picture, depth = 0 is set in step S84, so the decoding depth (Depth of Past Frames for Decoding) of an I picture is 0; when the picture type is a P picture, the first counter depth is incremented by 1 in step S87, so the decoding depth of a P picture is a value incremented by 1 each time a P picture appears, counting from the head of the GOP (the I picture); and when the picture type is a B picture other than the first two B pictures of the GOP, the decoding depth is equal to that of the preceding P picture.
When it is determined in step S88 that depth = 0, that is, that the picture is one of the first two B pictures of the GOP, in step S90 the CPU 221 determines whether this B picture has no past reference image, in other words, whether this B picture is one of the first two B pictures at the head of the stream or of a closed GOP.
When it is determined in step S90 that there is no past reference image, that is, that this B picture is one of the first two B pictures at the head of the stream or of a closed GOP, in step S91 the CPU 221 sets the decoding depth (Depth of Past Frames for Decoding) to decoding depth = 0, and the process advances to step S93.
When it is determined in step S90 that a past reference image exists, that is, that this B picture is one of the first two B pictures of a GOP that is neither at the head of the stream nor a closed GOP, in step S92 the CPU 221 sets the decoding depth (Depth of Past Frames for Decoding) to decoding depth = prev_depth, and the process advances to step S93 described later.
Specifically, the value of prev_depth is reset each time a P picture appears (step S86) and is set to depth + 1 each time an I picture appears (step S83); therefore, for the first two B pictures of a GOP that is neither at the head of the stream nor a closed GOP, the decoding depth becomes the decoding depth (Depth of Past Frames for Decoding) of the preceding P picture plus 1.
After the processing of step S89, step S91 or step S92 ends, in step S93 the CPU 221 determines whether the stream has ended. When it is determined in step S93 that the stream has not yet ended, the process returns to step S82 and the subsequent processing is repeated. When it is determined in step S93 that the stream has ended, the process ends.
By such processing, the values of the decoding depth (Depth of Past Frames for Decoding) in the index table described with Fig. 6 are calculated and are used in the control of the decoding processing.
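The following Python sketch follows the flow of Fig. 12 over a stream given in stream order; the picture-type list and the per-picture "no past reference" flags are hypothetical inputs prepared only for illustration.

```python
# Sketch of the depth calculation of Fig. 12 (steps S81-S93).
# Input: the stream in stream order as (picture_type, no_past_reference) pairs,
# where no_past_reference is True for the first two B pictures of the stream
# head or of a closed GOP. The sample stream below is illustrative only.

def compute_decoding_depth(stream):
    depth = 0        # first counter: depth of P pictures and ordinary B pictures
    prev_depth = 0   # second counter: depth of the first two B pictures of a GOP
    depths = []
    for picture_type, no_past_reference in stream:
        if picture_type == "I":                 # steps S83, S84
            depth += 1
            prev_depth = depth
            depth = 0
            depths.append(depth)                # step S89: I picture -> 0
        elif picture_type == "P":               # steps S86, S87
            prev_depth = 0
            depth += 1
            depths.append(depth)                # step S89
        else:                                   # B picture, step S88
            if depth != 0:
                depths.append(depth)            # step S89: same as preceding P
            elif no_past_reference:
                depths.append(0)                # step S91: stream head / closed GOP
            else:
                depths.append(prev_depth)       # step S92: open-GOP leading B pictures
    return depths

# One GOP at the stream head, in stream order: I2 B0 B1 P5 B3 B4 P8 B6 B7
sample = [("I", False), ("B", True), ("B", True),
          ("P", False), ("B", False), ("B", False),
          ("P", False), ("B", False), ("B", False)]
print(compute_decoding_depth(sample))  # [0, 0, 0, 1, 1, 1, 2, 2, 2]
```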
Next, decoding process 2, which uses the decoding depth (Depth of Past Frames for Decoding) values calculated by the depth information calculation process of Fig. 12 and is executed by the playback device 61 described with Fig. 4 or by the personal computer 201 described with Fig. 9 and Fig. 10, is described with reference to the flowchart of Fig. 13.
In steps S121 to S126, essentially the same processing as in steps S41 to S46 of Fig. 11 is performed.
That is, the stream decoding position detection processing unit 22 (the stream decoding position detection processing unit 22 of the CPU 221) receives the designation of the playback position (stream number, frame number) via the operation input acquisition unit 21 (the GUI control unit 251 of the CPU 221) and supplies it to the decoding control unit 72 (the decoding control unit 72 of the CPU 221). The decoding control unit 72 (the decoding control unit 72 of the CPU 221) obtains the index table corresponding to the supplied stream number from the index management unit 24 (the storage device 204, the external image recording/playback device 205 or the HDD 226).
The decoding control unit 72 (the decoding control unit 72 of the CPU 221) extracts from the index table the information needed for decoding, such as the picture type, information on the reference images, the data length and the address in memory, and determines, according to the instruction from the user, the decoding order and the decoder (decoding method) for the frames to be output for playback.
Then, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) refers to the information recorded in the index table to retrieve the number of frames (depth: Depth of Past Frames for Decoding) between the playback frame and the farthest preceding frame needed as a reference image for decoding, and, again referring to the information recorded in the index table, obtains the frame numbers of the frames needed as reference images to decode the frame of the designated frame number (when the frame designated for decoding is not an I picture, at least one frame including the preceding I picture).
Then, in step S127, the decoding switching decision process described later with Fig. 14 is executed.
That is, in the case of the normal playback mode, or even in shuffle playback when reference frames are already present in the baseband image data memory 31, the subsequent decoding processing can be predicted to be short, so the processing using the depth information is not performed and normal decoding processing (the processing of steps S130 to S133) is performed. On the other hand, in the shuffle mode when there is no past reference image, that is, when the surrounding frames have not been decoded during random playback, the depth information recorded in the index table is used to determine whether there is enough time to perform normal decoding, and the decoding processing is thereby decided to be either normal decoding processing (the processing of steps S130 to S133) or substitute processing (the processing of step S129).
In step S128, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) determines whether the decision made in the decoding switching decision process of step S127 is to perform substitute display.
When it is determined in step S128 that the decision is to perform substitute display, in step S129 the decoding control unit 72 (the decoding control unit 72 of the CPU 221) performs substitute display using a low-resolution image or a nearby image, and the process ends.
Specifically, as substitute methods, for example, B pictures may be skipped and only I pictures or P pictures decoded, or only specific picture types such as I pictures (intra-frame compressed) decoded, or low-resolution decoding performed with the aim of speeding up the decoding processing; the optimal decoding algorithm and decoder (decoding method) are thus selected as needed in consideration of the balance between the image quality of the played-back images and the decoding time.
As a further substitute method, for example, a proxy file may be prepared in advance in which at least the P pictures, or the P pictures and B pictures, have been decoded and separately re-encoded as I pictures, and the image displayed using these I pictures. The proxy file will be described in detail later.
When it is determined in step S128 that the decision is not to perform substitute display, the same processing as in steps S47 to S50 of Fig. 11 is performed in steps S130 to S133, and the process ends.
That is, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) refers to the baseband image data stored in the baseband image data memory 31 and managed by the baseband memory management unit 73 (that is, refers to the baseband image data in the storage area of the RAM 223 used as the baseband image data memory 31), and determines whether all necessary reference images are present in the baseband image data memory 31 at the required quality.
When it is determined that not all the necessary reference images are present in the baseband image data memory 31 at the required quality, in other words, when among the necessary reference images there is one that does not exist at the required quality, the reference images not present in the baseband image data memory 31 are decoded with the corresponding decoder (decoding method), the process returns to step S130, and the subsequent processing is repeated.
When it is determined that all the necessary reference images are present in the baseband image data memory 31 at the required quality, the playback frame is decoded, under the control of the decoding control unit 72, with the corresponding decoder (decoding method) using the reference images stored in the baseband image data memory 31 (the storage area of the RAM 223 used as the baseband image data memory 31), and the result is supplied to the baseband image data memory 31 (the storage area of the RAM 223 used as the baseband image data memory 31).
Then, the baseband signal processing unit 32 (the baseband signal processing unit 32 of the CPU 221) obtains the baseband image data decoded under the control of the decoding control unit 72 and stored in the baseband image data memory 31 (the storage area of the RAM 223 used as the baseband image data memory 31), applies various corrections (baseband processing), and outputs the generated output baseband image data frame, which is displayed by the processing of the GUI display control unit 71 (output to and displayed on the display 229 by the processing of the GUI control unit 251 of the CPU 221), and the process ends.
With such processing, in shuffle playback the same frame does not have to be decoded repeatedly to generate reference images, so the playback frames can be decoded at high speed; at the same time, by the decoding switching process described later, when for example there is no past reference image in the shuffle mode, that is, when the surrounding frames have not been decoded during random playback, the depth information recorded in the index table is used to determine whether there is enough time to perform normal decoding and whether to switch the decoding and perform substitute display, and substitute display is performed as needed.
That is, as described above, by controlling the decoding order it is possible to shorten the waiting time of playback that starts from a specific position, such as shuffle playback (random playback), in other words of so-called special-effect playback, and to play back and output the desired frames with the shortest delay; moreover, when there is not enough time to perform normal decoding, the decoding is switched and substitute display is performed.
Next, the decoding switching decision process executed in step S127 of Fig. 13 is described with reference to the flowchart of Fig. 14.
In step S161, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) determines whether the current playback mode is the shuffle mode. When it is determined in step S161 that it is not the shuffle mode, the process advances to step S170 described later.
When it is determined in step S161 that it is the shuffle mode, in step S162 the decoding control unit 72 (the decoding control unit 72 of the CPU 221) determines whether past reference frames are present in the baseband image data memory 31. When it is determined in step S162 that past reference frames are present, the process advances to step S170 described later.
When it is determined in step S162 that there is no past reference frame, in step S163 the decoding control unit 72 (the decoding control unit 72 of the CPU 221) determines whether the decodable frame count needs to be calculated, based on whether the decodable frame count has been set in advance, whether conditions have been designated by a higher-level application, and so on.
Specifically, the decodable frame count may be set in advance (by directly designating the number of frames), or it may be designated by a higher-level application. It may also, for example, be set according to the number of decoders used for decoding processing, or the number or clock frequency of the CPUs. In such cases there is no need to calculate the decodable frame count; the designated decodable frame count can simply be obtained.
In contrast, when, for example, only the time allocation T that can be given to decoding processing (for example 20 ms) is designated by the higher-level application, the decodable frame count needs to be calculated.
When it is determined in step S163 that the decodable frame count needs to be calculated, in step S164 the decodable frame count calculation process described later with Fig. 15 is executed.
When it is determined in step S163 that the decodable frame count does not need to be calculated, in step S165 the decoding control unit 72 (the decoding control unit 72 of the CPU 221) obtains the decodable frame count designated by the higher-level application or set in advance.
After the processing of step S164 or step S165 ends, in step S166 the decoding control unit 72 (the decoding control unit 72 of the CPU 221) determines whether the frame being processed is a B picture.
When it is determined in step S166 that it is a B picture, in step S167 the decoding control unit 72 (the decoding control unit 72 of the CPU 221) refers to the depth information (Depth of Past Frames for Decoding) in the index table and determines whether past reference frame count (depth) + 2 > decodable frame count.
Here, past reference frame count (depth) + 2 is the number of decoding operations needed to decode the B picture in question when no reference picture frame exists at all. Specifically, the number of decoding operations needed to decode the B picture is the total of the number of past reference images (the value of Depth of Past Frames for Decoding), the one future-side reference frame (the frame that is later in time), and the one decoding of the picture itself.
When it is determined in step S167 that past reference frame count (depth) + 2 > decodable frame count, the process advances to step S169 described later; when it is determined that this does not hold, the process advances to step S170 described later.
When it is determined in step S166 that the frame is not a B picture, in step S168 the decoding control unit 72 (the decoding control unit 72 of the CPU 221) refers to the depth information (the value of Depth of Past Frames for Decoding) in the index table and determines whether past reference frame count (depth) + 1 > decodable frame count.
Here, past reference frame count (depth) + 1 is the number of decoding operations needed to decode the P picture or I picture in question when no reference picture frame exists at all. Specifically, the number of decoding operations needed to decode the P picture or I picture is the number of past reference images (the value of Depth of Past Frames for Decoding) plus the one decoding of the picture itself. Since the past reference frame count (depth) of an I picture is 0, the number of decoding operations needed to decode an I picture is, of course, 1.
When it is determined in step S168 that past reference frame count (depth) + 1 > decodable frame count, the process advances to step S169 described later; when it is determined that this does not hold, the process advances to step S170 described later.
When it is determined in step S167 that past reference frame count (depth) + 2 > decodable frame count, or in step S168 that past reference frame count (depth) + 1 > decodable frame count, in step S169 the decoding control unit 72 (the decoding control unit 72 of the CPU 221) decides on substitute display, the process returns to step S127 of Fig. 13 and advances to step S128.
Here, substitute display specifically means, as described above, for example skipping B pictures and decoding only I pictures or P pictures, decoding only specific picture types such as I pictures (intra-frame compressed), or performing low-resolution decoding with the aim of speeding up the decoding processing; in addition, by preparing in advance, as a proxy file, images in which at least the P pictures, or the P and B pictures, have been decoded and separately re-encoded as I pictures, and displaying the image using these I pictures, the desired image can be displayed faster than when substitute display is not performed. Details of the proxy file will be given later. Besides these display methods, substitute display also allows the optimal decoding algorithm and decoder (decoding method) to be selected as needed in consideration of the balance between the image quality of the played-back images and the decoding time.
When it is determined in step S167 that past reference frame count (depth) + 2 > decodable frame count does not hold, or in step S168 that past reference frame count (depth) + 1 > decodable frame count does not hold, in step S170 the decoding control unit 72 (the decoding control unit 72 of the CPU 221) decides not to perform substitute display, the process returns to step S127 of Fig. 13 and advances to step S128.
With such processing, the depth information recorded in the index file is used to determine whether there is enough time to perform normal decoding and whether to switch the decoding and perform substitute display, and substitute display is performed as needed.
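Expressed as code, the check of steps S166 to S170 might look like the following sketch; the helper name and the example inputs are assumptions made for illustration.

```python
# Sketch of the substitute-display decision of Fig. 14 (steps S166-S170).
# depth is the Depth of Past Frames for Decoding value read from the index table.
def needs_substitute_display(picture_type, depth, decodable_frames):
    # Decoding a B picture from scratch needs its past references (depth),
    # one future-side reference and the picture itself: depth + 2 decodes.
    # An I or P picture needs its past references and itself: depth + 1 decodes.
    needed = depth + 2 if picture_type == "B" else depth + 1
    return needed > decodable_frames

print(needs_substitute_display("B", depth=4, decodable_frames=4))  # True  -> substitute display
print(needs_substitute_display("P", depth=3, decodable_frames=4))  # False -> normal decoding
print(needs_substitute_display("I", depth=0, decodable_frames=1))  # False -> one decode suffices
```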
Next, the decodable frame count calculation process executed in step S164 of Fig. 14 is described with reference to the flowchart of Fig. 15.
In step S201, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) sets the number of frames to be displayed per unit time, for example 30 frames per second or 15 frames per second.
In step S202, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) subtracts the time used for other processing from the display period of one frame, and sets the time T available for decoding processing per displayed frame.
Specifically, if for example 13 ms per frame is spent on processing other than decoding, then for display at 30 frames per second, 1/30 s = 33 ms and 33 - 13 = 20 ms, so 20 ms is set as the time T available for decoding processing per displayed frame; for display at 15 frames per second, 1/15 s = 66 ms and 66 - 13 = 53 ms, so 53 ms is set as the time T.
In step S203, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) decodes at least one GOP of the stream to be decoded and measures the average decoding time A per frame.
In step S204, the decoding control unit 72 (the decoding control unit 72 of the CPU 221) calculates T ÷ A, takes the largest integer not exceeding T ÷ A as the decodable frame count X, and the process returns to step S164 of Fig. 14 and advances to step S166.
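A small sketch of this calculation follows, using the figures given above; the 13 ms overhead is the example value from the text, while the function name and the assumed average decode time A of 5 ms are illustrative assumptions.

```python
import math

# Sketch of the decodable frame count calculation of Fig. 15 (steps S201-S204).
def decodable_frame_count(display_fps, other_processing_ms, avg_decode_ms):
    frame_period_ms = 1000.0 / display_fps          # step S201: display period per frame
    t = frame_period_ms - other_processing_ms       # step S202: time T left for decoding
    return max(0, math.floor(t / avg_decode_ms))    # step S204: X = floor(T / A)

# Example: 13 ms of non-decoding work per frame, assumed average decode time A = 5 ms.
print(decodable_frame_count(30, 13, 5))  # T = 33 - 13 = 20 ms -> X = 4
print(decodable_frame_count(15, 13, 5))  # T = 66 - 13 = 53 ms -> X = 10
```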
With such processing, the decodable frame count X is calculated; by comparing it with the depth information recorded in the index file, it is determined whether there is enough time to perform normal decoding and whether to switch the decoding and perform substitute display, and substitute display is performed as needed.
Thus, by applying the present invention, the depth information can be used to decide whether to switch the decoding. That is, in the normal playback mode, or even in shuffle playback when decoded reference frames already exist, the subsequent decoding processing can be predicted to be short, so the processing using the depth information is not performed and normal decoding processing is performed. On the other hand, in the shuffle mode when there is no past reference image, that is, when the surrounding frames have not been decoded during random playback, it must be predicted whether there will not be enough time to perform normal decoding and display will fail; therefore the decodable frame count X is obtained or calculated, and whether to switch the decoding is decided by comparing it with the depth information recorded in the index table.
The decodable frame count may be given as a designated number of frames, or a maximum decoding time may be designated and the count calculated from it. When a number of frames is designated, the maximum number of decodable frames may be designated directly by the higher-level application, or may be determined by combining several element parameters according to the processing capability of the CPU or the like (information quantifying the estimated processing speed, processing clock speed and so on) (for example, 5 frames are decodable with a 3.6 GHz dual-core CPU, 2 frames with a 2 GHz single-core CPU, and so on). When a time is designated, on the other hand, the value T/A is obtained from the time T that the application can give to decoding processing and the average decoding time A per frame, and the decodable frame count is calculated.
The decodable frame count can also be used effectively when, for example, substitute display performs decoding processing using a nearby I picture and P pictures, or only the I picture. That is, it can be used to limit the selection of the nearby picture frames to be decoded as substitutes. For example, when the decodable frame count is 4 and one GOP is 15 frames (in display order B0, B1, I2, B3, B4, P5, B6, B7, P8, B9, ..., P14), I2, P5, P8 and P11 are decoded and P14, even though it is a P picture, is excluded from the decoding targets, so decoding processing for shuffle playback and the like can be performed at high speed.
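A sketch of that restriction, using the 15-frame GOP of the example, is shown below; the selection rule of keeping the I picture and the earliest P pictures within the budget follows the worked example and is stated here as an assumption.

```python
# Sketch of limiting the substitute-display frames to the decode budget.
# Keep the leading I picture and as many following P pictures as fit;
# this matches the worked example (budget 4 -> I2, P5, P8, P11; P14 dropped).
def select_substitute_frames(gop_display_order, budget):
    candidates = [f for f in gop_display_order if f[0] in ("I", "P")]
    return candidates[:budget]

gop = ["B0", "B1", "I2", "B3", "B4", "P5", "B6", "B7",
       "P8", "B9", "B10", "P11", "B12", "B13", "P14"]
print(select_substitute_frames(gop, 4))  # ['I2', 'P5', 'P8', 'P11']
```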
Thus, when the present invention is applied, in special playback such as shuffle playback the decoding method is switched automatically in accordance with the difficulty of decoding, so that even frames in the latter half of a GOP can be displayed smoothly. Moreover, since the decoding scheme can be switched flexibly in accordance with the decoding capability, high shuffle display performance can be maintained even when a decoder of low capability is used.
When substitute display is performed, as described above, B pictures may be skipped and only I pictures or P pictures decoded, only specific picture types such as I pictures (intra-frame compressed) decoded, or low-resolution decoding performed with the aim of speeding up the decoding processing; in addition, by preparing in advance, as a proxy file, images in which at least the P pictures, or the P and B pictures, have been decoded and separately re-encoded as I pictures, and displaying the image using these I pictures, the optimal decoding algorithm and decoder (decoding method) can be selected as needed in consideration of the balance between the image quality of the played-back images and the decoding time.
Next, the above-mentioned proxy file, used in one example of substitute display, is described.
The proxy file may be generated by the playback device 61 or the personal computer 201, or by a conversion device 311 configured as a separate device.
The conversion device 311 receives, for example, the supply of a bitstream coded with inter-frame reference, such as an MPEG2 Long GOP stream (a bitstream composed of I pictures (intra-coded frames), P pictures (forward predictive-coded frames) and B pictures (bidirectionally predictive-coded frames)), and, by encoding the P pictures in the supplied bitstream into I pictures, can generate the proxy file used for substitute display in step S129 of Fig. 13.
Fig. 16 is a block diagram showing a configuration example of the conversion device 311.
A bitstream acquisition unit 331 receives the supply of a bitstream containing I pictures, P pictures and B pictures, and supplies it to a bitstream analysis unit 332.
The bitstream analysis unit 332 receives the supply of the bitstream containing I pictures, P pictures and B pictures, and supplies the I pictures and P pictures therein to a decoder 334. The bitstream analysis unit 332 also controls the decoder 334, an encoder 336 and a proxy file storage unit 337.
Furthermore, the bitstream analysis unit 332 can analyze the supplied bitstream and the results of the processing performed by the encoder 336, generate the index file described with Fig. 6, and supply it to an index file storage unit 333.
Suitably, the index file records, based on the information stored in a proxy index storage unit 338 described later, not only the information described with Fig. 6 but also, as shown for example in Fig. 17, coding information on the I pictures encoded by the encoder 336 that are included in the proxy file and correspond to the P pictures in the primary stream, namely the picture size (Proxy File Picture Size) and the address (Proxy File Address) of these frames. This information is stored by the encoder 336 in the proxy index storage unit 338 described later.
The index file storage unit 333 stores the index file described with Fig. 17 supplied from the bitstream analysis unit 332.
The decoder 334 decodes the I pictures and P pictures supplied from the bitstream analysis unit 332 to generate uncompressed baseband images, and supplies the uncompressed baseband image data corresponding to the P pictures to the encoder 336. At this time, when an uncompressed baseband image generated by the decoder 334 is needed as a reference image for decoding a subsequent frame, the generated uncompressed baseband image is also supplied to and stored in a reference image memory 335, and when a P picture is decoded, the reference images stored in the reference image memory 335 are referred to as appropriate.
The encoder 336 encodes the uncompressed baseband image data corresponding to the supplied P pictures into I pictures by intra-frame coding, and supplies the generated I pictures to the proxy file storage unit 337. As a parameter of the compression processing (when generating the I pictures), the encoder 336 can set, for example, the compression ratio and the like.
The case where the bit rate of the I pictures output from the encoder 336 is set to be fixed is described with reference to Fig. 18.
When the bit rate of the output I pictures is fixed, the encoder 336 generates the I pictures so that they have, for example, a predetermined bit rate corresponding to the storage capacity of the proxy file storage unit 337 and the like. In general, the amount of information of a P picture, which uses forward reference, is smaller than that of an I picture, so it is suitable to set the bit rate used when re-compressing a P picture into an I picture to a bit rate larger than that of the P picture before conversion, as this makes it easier to preserve image quality. This bit rate may also be set by the user.
Next, the case where the bit rate of the I pictures output from the encoder 336 is set to vary is described with reference to Fig. 19.
The encoder 336 obtains from the bitstream analysis unit 332 or the decoder 334 the rate of the I picture of the original bitstream (the leading I picture of each GOP), and generates the I pictures so that they have a bit rate corresponding to that value. In general, the bit rate of the I picture of the original compressed stream reflects the complexity of the images in that GOP. Therefore, by varying the bit rate of the I pictures generated by conversion from the P pictures in accordance with the bit rate of the leading I picture, which reflects the complexity of the images in the GOP, degradation of image quality can be prevented.
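The two bit-rate policies might be sketched as follows; the multipliers are assumptions made purely for illustration, not values from the description.

```python
# Sketch of the two ways of choosing the bit rate of a proxy I picture.
# The multipliers are illustrative assumptions.
def fixed_proxy_bitrate(p_picture_bitrate, margin=1.5):
    # Fixed policy: one predetermined rate, here taken somewhat above the
    # P-picture rate so that re-encoding to an I picture keeps image quality.
    return p_picture_bitrate * margin

def adaptive_proxy_bitrate(leading_i_bitrate, scale=0.8):
    # Variable policy: track the leading I picture of the GOP, whose rate
    # reflects the complexity of the images in that GOP.
    return leading_i_bitrate * scale

print(fixed_proxy_bitrate(2_000_000))      # e.g. 3.0 Mbit/s for every proxy frame
print(adaptive_proxy_bitrate(6_000_000))   # e.g. 4.8 Mbit/s for a complex GOP
```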
In this way, the proxy file storage unit 337 generates and stores a proxy file (a proxy data file) from the data of the I images that the encoder 336 generates and supplies as described with reference to Fig. 18 or Fig. 19.
Each time the encoder 336 encodes, by intra-frame coding, the uncompressed baseband image data corresponding to a supplied P image into an I image, the proxy index storage unit 338 obtains from the encoder 336 and stores, as coded information about the generated I image, the picture size (Proxy File Picture Size) and the address (Proxy File Address) of the frame, and supplies them to the bitstream analysis unit 332 as the proxy index.
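As a concrete illustration, one proxy index entry might hold the fields sketched below; the record layout and the example values are assumptions, and only the two field names follow Fig. 17.

    # One assumed proxy index entry (field names follow Fig. 17; values are examples only):
    proxy_index_entry = {
        "frame": "P8",                      # frame that is a P image in the original bitstream
        "proxy_picture_size": 73456,        # Proxy File Picture Size, in bytes
        "proxy_file_address": 0x00012A00,   # Proxy File Address: offset of the converted I image in the proxy file
    }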
Next, with reference to Fig. 20, the case where a P image of the original bitstream is decoded will be described.
For example, when the P11 frame of an MPEG2 Long GOP bitstream is to be played back and output, in the prior art, as shown in Fig. 20A, the I2 frame, which is the leading I image of the GOP containing P11, is decoded first; then, after the P5 and P8 frames have been decoded to generate the reference frame image data, the target P11 frame is decoded.
In contrast, when the P11 frame is played back and output using the proxy file, as shown in Fig. 20B, the I11 frame, which is the intra-coded I image corresponding to the P11 frame, is extracted from the proxy file, decoded, and played back and output.
Therefore, when the P11 frame is played back and output using the proxy file, the number of decoding operations is one, compared with four in the prior art, so the designated frame can be played back and output at high speed.
Next, with reference to Fig. 21, the case where a B image of the original bitstream is decoded will be described.
For example, when the B12 frame of an MPEG2 Long GOP bitstream is to be played back and output, in the prior art, as shown in Fig. 21A, the I2 frame, which is the leading I image of the GOP containing B12, is decoded first; then, after the P5, P8, P11 and P14 frames have been decoded to generate the reference frame image data, the target B12 frame is decoded.
In contrast, when the B12 frame is played back and output using the proxy file, as shown in Fig. 21B, the I11 frame and the I14 frame, the intra-coded I images corresponding to the P11 and P14 frames required as reference images for the B12 frame, are decoded, and the target B12 frame is then decoded using them as reference images and played back and output.
Therefore, when the B12 frame is played back and output using the proxy file, the number of decoding operations is three, compared with six in the prior art, so the designated frame can be played back and output at high speed.
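The decode counts quoted above can be reproduced with the following sketch of the two decode chains for the 15-frame GOP of Figs. 20 and 21; the list representation of the GOP and the helper functions are assumptions made here for illustration.

    # Display-order GOP of Figs. 20 and 21: B0 B1 I2 B3 B4 P5 B6 B7 P8 B9 B10 P11 B12 B13 P14
    GOP = ['B0', 'B1', 'I2', 'B3', 'B4', 'P5', 'B6', 'B7', 'P8', 'B9', 'B10', 'P11', 'B12', 'B13', 'P14']
    ANCHORS = ['I2', 'P5', 'P8', 'P11', 'P14']          # the I and P images of this GOP

    def chain_without_proxy(target):
        """Frames decoded when only the original bitstream is used.
        B0 and B1, which also reference the previous GOP, are outside this sketch."""
        if not target.startswith('B'):
            return ANCHORS[:ANCHORS.index(target) + 1]   # P11 -> I2, P5, P8, P11 (4 decodes)
        pos = GOP.index(target)
        right = next(f for f in GOP[pos + 1:] if not f.startswith('B'))
        return ANCHORS[:ANCHORS.index(right) + 1] + [target]  # B12 -> I2, P5, P8, P11, P14, B12 (6 decodes)

    def chain_with_proxy(target):
        """Frames decoded when the proxy file holds the P images re-coded as I images.
        B0 and B1, which also reference the previous GOP, are outside this sketch."""
        if not target.startswith('B'):
            return [target]                               # P11 -> I11 of the proxy (1 decode)
        pos = GOP.index(target)
        left = next(f for f in reversed(GOP[:pos]) if not f.startswith('B'))
        right = next(f for f in GOP[pos + 1:] if not f.startswith('B'))
        return [left, right, target]                      # B12 -> I11, I14, B12 (3 decodes)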
The series of processing described above can be executed by hardware or by software. In that case, for example, the same functions as those of the conversion device 311 can be realized by the personal computer 201 described with reference to Fig. 9.
Fig. 22 is a functional block diagram showing an example of the functions in the case where the same functions as those of the conversion device 311 are realized by software in the personal computer 201 described with reference to Fig. 9.
In addition, parts corresponding to those in the case of Fig. 16 or Fig. 9 are denoted by the same reference numerals, and their description is omitted as appropriate.
By executing a predetermined program with the CPU 221, the personal computer 201 has the same functions as the bitstream analysis unit 332, the decoder 334 and the encoder 336 of Fig. 16.
In accordance with a user operation input through an input device such as the mouse 202 or the keyboard 203, the CPU 221 uses an area of the RAM 223 corresponding to the reference image memory 335 to decode the I images and P images of a bitstream recorded in the bitstream storage unit 281 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226, and encodes only the P images as I images, thereby generating a proxy file, which it stores in the proxy file storage unit 337 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226.
Furthermore, each time the CPU 221 encodes, by intra-frame coding, the uncompressed baseband image data corresponding to a P image into an I image, it stores, as coded information about the generated I image, the picture size (Proxy File Picture Size) and the address (Proxy File Address) of the frame as the proxy index in an area of the RAM 223 corresponding to the proxy index storage unit 338; at the same time, it analyzes the bitstream, obtains the proxy index stored in the proxy index storage unit 338, generates the index file, and stores it in the index file storage unit 333 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226.
The decoding control unit 72 of the playback device 61 obtains, via a predetermined transmission medium, the index file recorded in the index file storage unit 333 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226. In addition, when the decoding control unit 72 or the decoding processing unit 76 performs the substitute display using the proxy file in step S129 of Fig. 13, it obtains, via a predetermined transmission medium, the proxy file stored in the proxy file storage unit 337 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226, and performs the decoding processing.
In addition, when, for example, the personal computer 201 described with reference to Fig. 9 has, in addition to the functions shown in Fig. 22, the function of executing the decoding processing described with reference to Fig. 10, the decoding control unit 72 of the personal computer 201 of Fig. 10 obtains the index file from the index file storage unit 333 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226. In addition, when the decoding control unit 72 or the decoding processing unit 76 performs the substitute display using the proxy file in step S129 of Fig. 13, it obtains the proxy file from the proxy file storage unit 337 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226, and performs the decoding processing.
Next, with reference to the flowchart of Fig. 23, proxy file generation processing 1, which is executed in the conversion device 311 described with reference to Fig. 16 or in the CPU 221 of the personal computer described with reference to Fig. 9 and Fig. 22, will be described.
In step S221, the bitstream acquisition unit 331 (the CPU 221) obtains the original bitstream and supplies it to the bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221).
In step S222, the bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221) reads in one image of the supplied original bitstream.
In step S223, the bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221) analyzes the image that has been read in. That is, the bitstream analysis unit 332 obtains the information of the corresponding image in the index file described with reference to Fig. 17.
In step S224, the bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221) judges whether the image that has been read in is an I image or a P image. If it is judged in step S224 that it is neither an I image nor a P image, that is, it is a B image, the processing advances to step S230 described later.
If it is judged in step S224 that it is an I image or a P image, in step S225 the bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221) supplies the I image or P image that has been read in to the decoder 334. The decoder 334 decodes the supplied I image or P image and stores it in the reference image memory 335.
In step S226, the decoder 334 (the decoder 334 of the CPU 221) judges whether the decoded image is a P image. If it is judged in step S226 that the decoded image is not a P image, that is, it is an I image, the processing advances to step S230 described later.
If it is judged in step S226 that the decoded image is a P image, in step S227 the decoder 334 (the decoder 334 of the CPU 221) supplies the uncompressed image frame corresponding to the decoded P image to the encoder 336 (the encoder 336 of the CPU 221). The encoder 336 encodes the supplied uncompressed image frame as an I image and supplies it to the proxy file storage unit 337 (the proxy file storage unit 337 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226).
In step S228, the proxy file storage unit 337 (the proxy file storage unit 337 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226) stores the proxy file composed of the I images generated by the encoding.
In step S229, the encoder 336 supplies, as coded information about the generated I image, index information consisting of the picture size (Proxy File Picture Size) and the address (Proxy File Address) of the frame, that is, the proxy index, to the proxy index storage unit 338. The proxy index storage unit 338 stores the proxy index of the image.
If it is judged in step S224 that the image that has been read in is neither an I image nor a P image, that is, it is a B image, if it is judged in step S226 that the decoded image is not a P image, that is, it is an I image, or after the processing of step S229 is finished, in step S230 the bitstream analysis unit 332 (the CPU 221) judges whether the processing of all the images is finished. If it is judged in step S230 that the processing of all the images is not yet finished, the processing returns to step S222 and the subsequent processing is repeated.
If it is judged in step S230 that the processing of all the images is finished, in step S231 the bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221) generates the index file described with reference to Fig. 17 on the basis of the analysis result of each image and of the picture size (Proxy File Picture Size) and address (Proxy File Address), stored in the proxy index storage unit 338, of the frames encoded as I images by the encoder 336, that is, of the frames converted from P images into I images, supplies it to the index file storage unit 333 (the index file storage unit 333 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226) for storage, and the processing ends.
By this processing, the P images in the original bitstream are decoded, and then a proxy file is generated that records the I images produced by re-encoding them as I images, together with an index file containing information about the I images included in the original bitstream and in the proxy file (the latter being P images in the original bitstream).
In this way, the conversion device 311 receives the supply of a compressed bitstream, analyzes it to generate the index file, and at the same time converts the P images into I images to generate the proxy file.
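A minimal sketch of proxy file generation processing 1 is given below; the picture representation, the decode and encode_intra callables standing in for the decoder 334 and the encoder 336, and the way the byte offset is computed are all assumptions made for illustration.

    def generate_proxy_file_1(pictures, decode, encode_intra):
        """pictures: list of (frame_name, picture_type) pairs in stream order."""
        proxy_file = []    # contents of the proxy file storage unit 337
        proxy_index = []   # contents of the proxy index storage unit 338
        index_file = []    # per-picture information of the index file (Fig. 17)
        for frame, ptype in pictures:                    # S222: read one picture at a time
            index_file.append({"frame": frame, "type": ptype})  # S223: analysis result
            if ptype == 'B':                             # S224: B images are skipped
                continue
            baseband = decode(frame)                     # S225: decode I and P images
            if ptype != 'P':                             # S226: only P images are re-encoded
                continue
            i_picture = encode_intra(baseband)           # S227: intra-code the frame as an I image
            address = sum(len(p) for p in proxy_file)    # assumed: byte offset within the proxy file
            proxy_file.append(i_picture)                 # S228: store in the proxy file
            proxy_index.append({"frame": frame,          # S229: proxy index entry
                                "proxy_picture_size": len(i_picture),
                                "proxy_file_address": address})
        for entry in index_file:                         # S231: merge the proxy index into the index file
            for p in proxy_index:
                if p["frame"] == entry["frame"]:
                    entry.update(p)
        return proxy_file, index_file

    # e.g. generate_proxy_file_1([('I2', 'I'), ('P5', 'P')], decode=lambda f: f, encode_intra=str.encode)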
In addition, the personal computer 201 to which the present invention is applied can have the same functions as the conversion device 311 by executing a predetermined program.
In the conversion device 311, only the I images and P images of the supplied original bitstream are decoded, and the frames corresponding to the P images are encoded again as I images, whereby the proxy file is generated.
At this time, the data rate of the I images of the generated proxy file may be a fixed rate, or may be varied (variable bit rate) in accordance with the bit rate of the leading I image of the corresponding GOP of the original bitstream.
In addition, the index file generated in the conversion device 311 contains not only the information needed for decoding the original bitstream, but also information on the picture size, the address and so on of the I images included in the proxy file, that is, of the frames that are P images in the original bitstream and have been encoded as I images.
By using the proxy file in the decoding of the stream, the decoding time at the time of random access can be shortened.
In addition, as described above, by performing PI conversion, in which P images are converted into I images, and by using the pre-conversion stream and the converted part for the decoding processing while switching between them, the decode time can be shortened and the random access performance can be improved. However, the processing of converting the P image portion of the original stream into I images before editing or playback takes time. If, for example, one GOP consists of 15 frames and the stream contains 5 P images per GOP, time is needed to decode all 5 of these frames and convert them into I images (decode and re-encode them).
Therefore, when performing the PI conversion, the bitstream analysis unit 332 of Fig. 16 may also control the decoder 334, the encoder 336 and the proxy file storage unit 337 so that not all P images are converted into I images, but only a part of the P images is converted into I images as needed to generate and store the proxy file.
The processing time of the PI conversion depends on the number of P images converted into I images (decoded and re-encoded), so reducing the number of converted images shortens the processing time (the generation time of the proxy file). It is desirable for the bitstream analysis unit 332 to use, as the criterion for determining the number of images to convert, whether the largest number of frames needed to decode the converted stream exceeds the capability of the decoder. This decoder capability (hereinafter also referred to as the decodable frame number) differs depending on the playback mode, for example random playback or normal playback.
Specifically, the decodable frame number may be set in advance (the frame number may be specified directly), or it may be specified by an upper application program. For example, the decodable frame number may be set according to the number of decoders used for the decoding processing or the clock frequency of the CPU. In such cases, the bitstream analysis unit 332 does not need to calculate the decodable frame number and can simply obtain the specified decodable frame number.
In contrast, when, for example, only the time allocation T (for example 20 ms) that can be given to the decoding processing is specified by the upper application program, the bitstream analysis unit 332 must calculate the decodable frame number.
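One simple way to derive the decodable frame number from such a time allocation is sketched below; the per-frame decoding time and its value are assumptions introduced for illustration only.

    def decodable_frame_count(time_budget_ms, decode_time_per_frame_ms):
        """Assumed calculation: number of frames that fit into the time allocation T."""
        return int(time_budget_ms // decode_time_per_frame_ms)

    # e.g. with an assumed 4 ms needed per frame, a 20 ms allocation gives X = 5
    X = decodable_frame_count(20, 4)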
In addition, at this time, if the bitstream analysis unit 332 selects the P images to be converted into I images so that the P images remaining after conversion are as non-consecutive as possible, the number of P images that must be converted into I images for a given decodable frame number becomes small, which is suitable.
With reference to Fig. 24, variations of the PI conversion will be described for the case where one GOP consists of 15 frames and the number of P images is N=4 (IBBPBBPBBPBBPBB).
For example, when no PI conversion is performed, the effective sequence (the picture types of the 5 I and P frames, excluding the B images) is IPPPP, and the longest decode time is the decode time of 7 frames, which occurs when, in a GOP that is an Open GOP and is not the leading GOP of the stream, the 2 B images following the I image in stream order (for example B0 and B1 in Figs. 20 and 21) are decoded.
In addition, as described above, when all 4 P images undergo PI conversion, the sequence is IIIII, and the longest decode time is the decode time of 3 frames, which applies when any B image is decoded.
In contrast, if not all of the P images are converted but the number of conversion targets is reduced to a certain number, the longest decode time for random decoding changes as shown in Fig. 24. That is, when only 1 of the 4 P images undergoes PI conversion and the P image to be converted is selected so that the remaining P images are as non-consecutive as possible, the sequence becomes IPIPP or IPPIP and the longest decode time becomes the decode time of 5 frames. Likewise, when only 2 of the 4 P images undergo PI conversion and the P images to be converted are selected so that the remaining P images are as non-consecutive as possible, the sequence becomes IPIIP, IIPIP or IPIPI and the longest decode time becomes the decode time of 4 frames.
In addition, both when only 1 of the 4 P images undergoes PI conversion and when only 2 of them do, the longest decode time is shortened only when the P images to be converted are selected so that the remaining P images are as non-consecutive as possible, that is, only for the sequences shown in Fig. 24: when 2 images undergo PI conversion, the P images to convert are selected so that the 2 remaining P images in the converted sequence are not consecutive, and when 1 image undergoes PI conversion, the P image to convert is selected so that the 3 remaining P images in the converted sequence do not all become consecutive.
Next, with reference to Figs. 25 to 29, the concrete decoding processing will be described for the case where not all of the P images subject to conversion are converted but the number of conversion targets is reduced to a certain number.
With reference to Fig. 25, the decoding processing will be described for the case where only 1 of the 4 P images undergoes PI conversion and the sequence is IPIPP.
As shown in Fig. 25A, when, in display order, the P8 image among the 4 P images included in the 15 frames arranged as B0, B1, I2, B3, B4, P5, B6, B7, P8, B9, ... is converted into an I image and prepared in the proxy file as the I8 image, the decode time is longest when B0 or B1 is decoded; in that case, as shown in Fig. 25B, 5 images must be decoded: the reference images I8, P11, P14 and I2, plus B0 or B1 (shown as B0 in Fig. 25B).
Next, with reference to Fig. 26, the decoding processing will be described for the case where, likewise, only 1 of the 4 P images undergoes PI conversion and the sequence is IPPIP.
As shown in Fig. 26A, when, in display order, the P11 image among the 4 P images included in the 15 frames arranged as B0, B1, I2, B3, B4, P5, B6, B7, P8, B9, ... is converted into an I image and prepared in the proxy file as the I11 image, the decode time is longest when B9 or B10 is decoded; in that case, as shown in Fig. 26B, 5 images must be decoded: the reference images I2, P5, P8 and I11, plus B9 or B10 (shown as B9 in Fig. 26B).
In addition, the longest decode time is not 5 in every case where only 1 of the 4 P images undergoes PI conversion. That is, the longest decode time is 5 only when, as shown in Figs. 25 and 26, the 3 P images remaining after conversion are not all consecutive. In the other cases, for example when the sequence after conversion is IIPPP or IPPPI, a longest decode time of 6 results, and the effect of the PI conversion is reduced.
Next, with reference to Fig. 27, the decoding processing will be described for the case where 2 of the 4 P images undergo PI conversion and the sequence is IPIIP.
As shown in Fig. 27A, when, in display order, the P8 image and the P11 image among the 4 P images included in the 15 frames arranged as B0, B1, I2, B3, B4, P5, B6, B7, P8, B9, ... are converted into I images and prepared in the proxy file as the I8 image and the I11 image, the decode time is longest when B6 or B7, or B0 or B1, is decoded; in that case, as shown in Fig. 27B, 4 images must be decoded: the reference images I2, P5 and I8 plus B6 or B7 (shown as B6 in Fig. 27B), or the reference images I11, P14 and I2 plus B0 or B1 (shown as B0 in Fig. 27B).
Likewise, Fig. 28 illustrates the decoding processing when 2 of the 4 P images undergo PI conversion and the sequence is IIPIP, and Fig. 29 illustrates the decoding processing when 2 of the 4 P images undergo PI conversion and the sequence is IPIPI.
As shown in Fig. 28A, when, in display order, the P5 image and the P11 image among the 4 P images included in the 15 frames arranged as B0, B1, I2, B3, B4, P5, B6, B7, P8, B9, ... are converted into I images and prepared in the proxy file as the I5 image and the I11 image, the decode time is longest when B9 or B10, or B0 or B1, is decoded; in that case, as shown in Fig. 28B, 4 images must be decoded: the reference images I5, P8 and I11 plus B9 or B10 (shown as B9 in Fig. 28B), or the reference images I11, P14 and I2 plus B0 or B1 (shown as B0 in Fig. 28B).
In addition, as shown in Fig. 29A, when, in display order, the P8 image and the P14 image among the 4 P images included in the 15 frames arranged as B0, B1, I2, B3, B4, P5, B6, B7, P8, B9, ... are converted into I images and prepared in the proxy file as the I8 image and the I14 image, the decode time is longest when B6 or B7, or B12 or B13, is decoded; in that case, as shown in Fig. 29B, 4 images must be decoded: the reference images I2, P5 and I8 plus B6 or B7 (shown as B6 in Fig. 29B), or the reference images I8, P11 and I14 plus B12 or B13 (shown as B12 in Fig. 29B).
In this case as well, the longest decode time is not 4 in every case where 2 of the 4 P images undergo PI conversion. That is, the longest decode time is 4 only when, as shown in Figs. 27 to 29, the 2 P images remaining after conversion are not consecutive. In the other cases, for example when the sequence after conversion is IIPPI or IPPII, the effect of the PI conversion is reduced.
In addition, Figs. 25 to 29 illustrate the case where the number of B images between the I or P images is 2 in each instance, but even when the number of B images between the I or P images is other than 2, the number of reference images needed to decode consecutive B images is the same, so the longest decode time in each case is of course the same.
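The longest decode times quoted for Figs. 24 to 29 can be checked with the following sketch, which counts, for each B image of a GOP, the anchor images that must be decoded before it; treating the GOP pattern as repeating (so that the leading B images reference the previous GOP) is an assumption made here for illustration.

    def worst_case_decodes(anchor_types):
        """anchor_types: picture types of the 5 anchor frames of one GOP in display
        order, e.g. "IPIPP"; an IBBPBB... structure with the same pattern in the
        previous GOP is assumed."""
        n = len(anchor_types)

        def chain_len(i):
            # anchors decoded to reconstruct anchor i: walk back to the nearest I image
            length = 1
            while anchor_types[i % n] != 'I':
                i -= 1
                length += 1
            return length

        worst = 0
        for k in range(n):  # B images just before anchor k (k = 0: B images referencing the previous GOP)
            if anchor_types[k] == 'I':
                cost = chain_len((k - 1) % n) + 2   # left-anchor chain + the I image + the B image
            else:
                cost = chain_len(k) + 1             # the P-image chain already contains the left anchor
            worst = max(worst, cost)
        return worst

    # Values of Fig. 24: "IPPPP" -> 7, "IPIPP" -> 5, "IPIIP" -> 4, "IIIII" -> 3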
Next, with reference to the flowchart of Fig. 30, the processing of setting the P images to be converted will be described.
In step S281, a decodable frame number calculation similar to the one described with reference to Fig. 15 is performed to obtain the decodable frame number.
In addition, when the decodable frame number is determined in advance or is specified by an upper application program, for example, the bitstream analysis unit 332 of the conversion device 311 does not perform the decodable frame number calculation in step S281 and can simply obtain the predetermined decodable frame number or the decodable frame number specified by the upper application program.
Assuming that the number of P images in one GOP of the bitstream obtained by the bitstream acquisition unit 331 is N and that the decodable frame number is X, in step S282 the bitstream analysis unit 332 of the conversion device 311 judges whether N+3>X.
If it is judged in step S282 that N+3>X does not hold, in step S283 the bitstream analysis unit 332 does not perform the PI conversion, and the processing ends.
When N+3≤X, the time needed for the decoding processing of the frame with the longest decode time in the bitstream on which no PI conversion has been performed is shorter than the time needed to decode the decodable frame number X of frames. That is, in such a case, the PI conversion does not need to be performed.
If it is judged in step S282 that N+3>X, in other words, if the time needed for the decoding processing of the frame with the longest decode time in the bitstream on which no PI conversion has been performed is longer than the time needed to decode the decodable frame number X of frames, then, in step S284, the bitstream analysis unit 332 provisionally sets the PI skip number S, which is the maximum number of consecutive P images in the sequence after PI conversion for which there is enough time to perform the decoding processing, to X-3.
In step S285, the bitstream analysis unit 332 judges whether N/S>1 holds for S=X-3.
If it is judged in step S285 that N/S>1, in step S286 the bitstream analysis unit 332 sets the PI skip number S to X-3.
If it is judged in step S285 that N/S>1 does not hold, in step S287 the bitstream analysis unit 332 sets the PI skip number S to N/2 (an integer value close to N/2 when this is not an integer).
After the processing of step S286 or step S287 is finished, in step S288 the bitstream analysis unit 332 sets the P images to be converted according to the PI skip number S, and the processing ends.
By this processing, the PI skip number, that is, the maximum number of consecutive P images in the sequence after PI conversion for which there is enough time to perform the decoding processing, is obtained from the decodable frame number and the number of P images in one GOP, and the P images to be converted into I images are set accordingly. In this way, by reducing the number of P images subjected to PI conversion as far as possible, the time needed to generate the proxy file can be reduced compared with the case where all the P images are converted into I images.
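A minimal sketch of the skip-number setting of Fig. 30 follows; the use of integer arithmetic is an assumption, chosen here so that the values of Fig. 31 for N=11 are reproduced.

    def set_pi_skip_number(N, X):
        """N: number of P images in one GOP, X: decodable frame number (X >= 4 assumed).
        Returns the PI skip number S, or None when no PI conversion is needed."""
        if not N + 3 > X:      # S282/S283: the unconverted worst case already fits
            return None
        S = X - 3              # S284: allowed run of consecutive P images after conversion
        if N // S > 1:         # S285 (assumed integer division)
            return S           # S286
        return N // 2          # S287: otherwise cap S at about N/2

    # For N = 11 (Fig. 31): X = 4..8 -> S = X-3 (1..5); X = 9..13 -> S = 5; X >= 14 -> no conversion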
Using Fig. 31, a concrete example of setting the P images to be converted into I images will be described, namely the relation among the decodable frame number X, the PI skip number S and the P images converted into I images when the number N of P images included in one GOP is N=11.
For example, when X=4 and S=1, the P images remaining after PI conversion must not be consecutive, so the sequence after PI conversion is IPIPIPIPIPIP. When X=5 and S=2, at most 2 P images may remain consecutive after PI conversion, so the sequence after PI conversion is IPPIPPIPPIPP. When X=6 and S=3, at most 3 P images may remain consecutive after PI conversion, so the sequence after PI conversion is IPPPIPPPIPPP.
In addition, when X=7 and S=4, at most 4 P images may remain consecutive after PI conversion, so the sequence after PI conversion may simply be IPPPPIPPPPIP; as long as the number of consecutive P images is at most 4 and the number of images converted into I images is kept at 2, another sequence (for example IPPPPIPPPIPP) is also possible. However, it is desirable that the number of consecutive P images be as small as possible, so when X=7 and S=4 it is suitable to use the sequence IPPPIPPPIPPP, the same as when X=6 and S=3, which keeps the number of PI-converted images the same while further shortening the longest decode time.
In addition, when X=8 and S=5, at most 5 P images may remain consecutive after PI conversion, so the sequence after PI conversion is IPPPPPIPPPPP. Furthermore, when X=9, it is judged in step S285 above that N/S>1 does not hold, so the bitstream analysis unit 332 sets the PI skip number S to S=5 in accordance with N/2 (an integer value close to N/2 when this is not an integer). Likewise, when 14 (=N+3)>X≥10, it is judged in step S285 that N/S>1 does not hold, so the bitstream analysis unit 332 sets the PI skip number S to S=5. In this case the sequence after PI conversion is IPPPPPIPPPPP.
In addition, when X≥14 (=N+3), it is judged that the PI conversion does not need to be performed.
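One simple placement rule that reproduces the sequences of Fig. 31 from the skip number S is sketched below; it converts a P image whenever S unconverted P images have accumulated, and is an assumption made here for illustration (as noted above, other placements with the same number of converted images are possible).

    def converted_sequence(N, S):
        """Anchor sequence of one GOP after PI conversion: the leading I image
        followed by the N P images, of which some are converted so that at most
        S consecutive P images remain."""
        sequence, run = ['I'], 0
        for _ in range(N):
            if run == S:
                sequence.append('I')   # convert this P image into an I image
                run = 0
            else:
                sequence.append('P')
                run += 1
        return ''.join(sequence)

    # For N = 11: S=1 -> IPIPIPIPIPIP, S=2 -> IPPIPPIPPIPP, S=3 -> IPPPIPPPIPPP, S=5 -> IPPPPPIPPPPP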
Next, with reference to the flowchart of Fig. 32, proxy file generation processing 2, which is executed in the conversion device 311 described with reference to Fig. 16 or in the CPU 221 of the personal computer described with reference to Fig. 9 and Fig. 22, will be described.
In steps S331 to S336, processing similar to that of steps S221 to S226 described with reference to Fig. 23 is performed.
That is, the bitstream acquisition unit 331 (the CPU 221) obtains the original bitstream and supplies it to the bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221). The bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221) reads in one image of the supplied original bitstream and analyzes it. That is, the bitstream analysis unit 332 obtains the information of the corresponding image in the index file described with reference to Fig. 17.
Next, the bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221) judges whether the image that has been read in is an I image or a P image. If it is judged that it is neither an I image nor a P image, that is, it is a B image, the processing advances to step S341 described later.
If it is judged that it is an I image or a P image, the bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221) supplies the I image or P image that has been read in to the decoder 334. The decoder 334 decodes the supplied I image or P image and stores it in the reference image memory 335.
Next, the decoder 334 (the decoder 334 of the CPU 221) judges whether the decoded image is a P image. If it is judged that the decoded image is not a P image, that is, it is an I image, the processing advances to step S341 described later.
If it is judged in step S336 that the decoded image is a P image, in step S337 the decoder 334 (the decoder 334 of the CPU 221) judges, under the control of the bitstream analysis unit 332, whether this P image has been set, in the processing of setting the P images to be converted described above, as a P image that needs to be converted into an I image. If it is judged in step S337 that it is not a P image to be converted, the processing advances to step S341 described later.
If it is judged in step S337 that it is a P image that needs to be converted, in step S338 the decoder 334 (the decoder 334 of the CPU 221) supplies the uncompressed image frame corresponding to the decoded P image to the encoder 336 (the encoder 336 of the CPU 221). The encoder 336 encodes the supplied uncompressed image frame as an I image and supplies it to the proxy file storage unit 337 (the proxy file storage unit 337 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226).
In step S339, the proxy file storage unit 337 (the proxy file storage unit 337 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226) stores the proxy file composed of the I images generated by the encoding.
In step S340, the encoder 336 supplies, as coded information about the generated I image, index information consisting of the picture size (Proxy File Picture Size) and the address (Proxy File Address) of the frame, that is, the proxy index, to the proxy index storage unit 338. The proxy index storage unit 338 stores the proxy index of the image.
If it is judged in step S334 that the image that has been read in is neither an I image nor a P image, that is, it is a B image, if it is judged in step S336 that the decoded image is not a P image, that is, it is an I image, if it is judged in step S337 that it is not a P image to be converted, or after the processing of step S340 is finished, in step S341 the bitstream analysis unit 332 (the CPU 221) judges whether the processing of all the images is finished. If it is judged in step S341 that the processing of all the images is not yet finished, the processing returns to step S332 and the subsequent processing is repeated.
If it is judged in step S341 that the processing of all the images is finished, in step S342 the bitstream analysis unit 332 (the bitstream analysis unit 332 of the CPU 221) generates the index file described with reference to Fig. 17 on the basis of the analysis result of each image and of the picture size (Proxy File Picture Size) and address (Proxy File Address), stored in the proxy index storage unit 338, of the frames encoded as I images by the encoder 336, that is, of the frames converted from P images into I images, supplies it to the index file storage unit 333 (the index file storage unit 333 corresponding to the storage device 204, the external image recording/reproducing device 205 or an arbitrary area of the HDD 226) for storage, and the processing ends.
By this processing, the P images that are set, among the P images in the original bitstream, to be converted into I images are decoded, and then a proxy file is generated that records the I images produced by re-encoding them as I images, together with an index file containing information about the I images included in the original bitstream and in the proxy file (the latter being P images in the original bitstream).
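For comparison with processing 1, a minimal sketch of processing 2 is given below; as before, the picture representation and the decode and encode_intra callables are assumptions made for illustration, and conversion_targets stands for the P images selected by the skip-number setting.

    def generate_proxy_file_2(pictures, conversion_targets, decode, encode_intra):
        """Like processing 1, but only the P images selected for conversion are re-encoded."""
        proxy_file, proxy_index = [], []
        for frame, ptype in pictures:
            if ptype == 'B':                                      # S334: B images are skipped
                continue
            baseband = decode(frame)                              # S335: I and P images are still decoded
            if ptype == 'P' and frame in conversion_targets:      # S337: conversion target?
                i_picture = encode_intra(baseband)                # S338: intra-code as an I image
                proxy_index.append({"frame": frame,               # S340: proxy index entry
                                    "proxy_picture_size": len(i_picture),
                                    "proxy_file_address": sum(len(p) for p in proxy_file)})
                proxy_file.append(i_picture)                      # S339: store in the proxy file
        return proxy_file, proxy_index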
In addition, the proxy file can be used when the substitute display is performed in the processing of step S129 described with reference to Fig. 13, but it can of course also be used when, for example, it is decided in the processing of step S44 of Fig. 11 or step S124 of Fig. 13 to perform the decoding processing using a nearby I image and P image or using only an I image. Furthermore, even when the substitute display is not performed and the reference images are decoded so that the desired frame is played back and output, using the proxy file appropriately shortens the decoding time and allows high-speed decoding processing during random playback and the like.
In addition, the description here has mainly dealt with the case of MPEG2 Long GOP data composed of I images, P images and B images, but the present invention is also applicable to the playback of compression-coded data that is decoded by inter-frame reference.
As mentioned above, the series of processing described above can be executed by hardware or by software.
When the series of processing is executed by software, the program constituting the software is installed from a network or a recording medium into, for example, a computer built into dedicated hardware or a general-purpose personal computer capable of executing various functions when various programs are installed in it.
As shown in Fig. 9, this recording medium is constituted not only by the removable medium 206, which is distributed separately from the device body in order to provide the program to the user and on which the program is recorded (a magnetic disk (including a floppy disk), an optical disc (including a CD-ROM and a DVD (Digital Versatile Disc)), a magneto-optical disc (including an MD (Mini-Disc)) or a semiconductor memory), but also by the ROM 222 on which the program is recorded and the hard disk included in the HDD 226, which are provided to the user in a state of being installed in the device body in advance.
In addition, in this specification, the steps describing the program recorded on the recording medium include not only processing performed in time series in the described order, but of course also processing that is executed in parallel or individually rather than in time series.
In addition, in this specification, a system means the whole of an apparatus constituted by a plurality of devices.
In addition, embodiments of the present invention are not limited to the embodiments described above, and various modifications can be made without departing from the gist of the present invention.

Claims (12)

1. A decoding method, characterized by comprising:
a first decision step of deciding, from among a plurality of decoding methods, the decoding method for an encoded stream in accordance with playback information representing the playback mode of the encoded stream;
a reference image information obtaining step of obtaining, in accordance with the playback information, reference image information representing the reference images needed for performing decoding processing on a target image of the encoded stream;
a second decision step of deciding the decoding method for the encoded stream in accordance with the number of past-side reference frames included in the reference image information and the number of frames on which decoding processing can be performed within the display time of the image corresponding to one frame;
a reference image decoding step of, when a reference image indicated by the reference image information obtained in the reference image information obtaining step is not stored in a predetermined storage area, obtaining the reference image that is not stored in the storage area from the encoded stream, performing decoding processing on it in accordance with the decoding method decided in the first decision step, and storing it in the storage area; and
a decoding step of performing decoding processing on the target image of the encoded stream using the reference images stored in the storage area, in accordance with the decoding method decided in the first decision step, thereby generating image data.
2. The decoding method according to claim 1, characterized in that:
in the decoding step, the generated image data is stored in the storage area.
3. The decoding method according to claim 1, characterized by further comprising:
a judgment step of judging whether the reference image used in the decoding method decided in the first decision step is stored in the storage area.
4. The decoding method according to claim 1, characterized in that:
in the reference image decoding step and the decoding step, any one of a plurality of decoding processes of different resolutions can be performed in accordance with the decision of the first decision step.
5. The decoding method according to claim 1, characterized in that:
in the reference image decoding step and the decoding step, any one of decoding processes corresponding to a plurality of coding schemes can be performed in accordance with the decision of the first decision step.
6. The decoding method according to claim 1, characterized in that:
the encoded stream includes I images, P images and B images, and
in the decoding step, when the target image of the encoded stream is a B image, an I image or a P image present temporally in the vicinity of the target image is decoded.
7. The decoding method according to claim 1, characterized in that:
the encoded stream includes images that have been intra-frame coded and images that have been inter-frame predictively coded, and
in the decoding step, when the target image of the encoded stream is an image that has been inter-frame predictively coded, an intra-frame coded image present temporally in the vicinity of the target image is decoded.
8. The decoding method according to claim 1, characterized by further comprising:
a calculation step of calculating the number of frames on which decoding processing can be performed within the display time of the image corresponding to one frame.
9. The decoding method according to claim 8, characterized in that:
in the second decision step, when the target image is a B image and the number obtained by adding 2 to the number of past-side reference frames of the target B image is larger than the number of frames calculated in the calculation step, or when the target image is an I image or a P image and the number obtained by adding 1 to the number of past-side reference frames of the target I image or P image is larger than the number of frames calculated in the calculation step, it is decided that the decoding processing of the encoded stream is performed by a decoding method that replaces the decoding method decided in the first decision step.
10. The decoding method according to claim 9, characterized in that:
in the reference image decoding step and the decoding step, instead of the decoding method decided in the first decision step, processing is performed in which, by decoding processing of the encoded stream based on the decoding method decided in the second decision step, an I image or a P image is decoded and displayed, or an image of lower resolution is generated by decoding and displayed.
11. The decoding method according to claim 1, characterized by further comprising:
a calculation step of calculating the number of past-side reference frames included in the reference image information.
12. A decoding device, characterized by comprising:
a control unit that decides, from among a plurality of decoding methods, a first decoding method for an encoded stream in accordance with playback information representing the playback mode of the encoded stream, obtains, in accordance with the playback information, reference image information representing the reference images needed for performing decoding processing on a target image of the encoded stream, and decides a second decoding method for the encoded stream in accordance with the number of past-side reference frames included in the reference image information and the number of frames on which decoding processing can be performed within the display time of the image corresponding to one frame; and
a decoding unit that, when a reference image indicated by the reference image information is not stored in a predetermined storage area, obtains the reference image that is not stored in the storage area from the encoded stream, performs decoding processing on it in accordance with the first decoding method, stores it in the storage area, and performs decoding processing on the target image of the encoded stream using the reference images stored in the storage area.
CN 200610074396 2005-04-15 2006-04-14 Decoding device, coding/decoding method Expired - Fee Related CN100568975C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2005119047 2005-04-15
JP2005119046 2005-04-15
JP2005119046 2005-04-15

Publications (2)

Publication Number Publication Date
CN1913641A CN1913641A (en) 2007-02-14
CN100568975C true CN100568975C (en) 2009-12-09

Family

ID=37722379

Family Applications (1)

Application Number Title Priority Date Filing Date
CN 200610074396 Expired - Fee Related CN100568975C (en) 2005-04-15 2006-04-14 Decoding device, coding/decoding method

Country Status (1)

Country Link
CN (1) CN100568975C (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102077900B1 (en) * 2013-04-17 2020-02-14 노키아 테크놀로지스 오와이 An apparatus, a method and a computer program for video coding and decoding
CN108960384B (en) * 2018-06-07 2020-04-28 阿里巴巴集团控股有限公司 Decoding method of graphic code and client

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5742344A (en) * 1991-05-31 1998-04-21 Kabushiki Kaisha Toshiba Motion compensated video decoding method and system for decoding a coded video signal using spatial and temporal filtering
CN1557098A (en) * 2002-07-15 2004-12-22 松下电器产业株式会社 Moving picture encoding device and moving picture decoding device

Also Published As

Publication number Publication date
CN1913641A (en) 2007-02-14

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20091209

Termination date: 20150414

EXPY Termination of patent right or utility model