US20030095790A1 - Methods and apparatus for generating navigation information on the fly - Google Patents

Methods and apparatus for generating navigation information on the fly

Info

Publication number
US20030095790A1
US20030095790A1
Authority
US
United States
Prior art keywords
interest
information
content
navigation
navigation information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/335,112
Inventor
Ajit Joshi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tahoe Research Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US10/335,112
Assigned to INTEL CORPORATION reassignment INTEL CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: JOSHI, AJIT P.
Publication of US20030095790A1
Assigned to TAHOE RESEARCH, LTD. reassignment TAHOE RESEARCH, LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: INTEL CORPORATION
Legal status: Abandoned

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/804 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
    • H04N9/8042 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23 Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234 Processing of video elementary streams, e.g. splicing of video streams, manipulating MPEG-4 scene graphs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B27/034 Electronic editing of digitised analogue information signals, e.g. audio or video signals on discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/102 Programmed access in sequence to addressed parts of tracks of operating record carriers
    • G11B27/105 Programmed access in sequence to addressed parts of tracks of operating record carriers of operating discs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/19 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier
    • G11B27/28 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording
    • G11B27/30 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording
    • G11B27/3027 Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information detectable on the record carrier by using information signals recorded by the same method as the main recording on the same track as the main recording used signal is digitally coded
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00 Record carriers by type
    • G11B2220/20 Disc-shaped record carriers
    • G11B2220/25 Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537 Optical discs
    • G11B2220/2545 CDs
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B2220/00 Record carriers by type
    • G11B2220/20 Disc-shaped record carriers
    • G11B2220/25 Disc-shaped record carriers characterised in that the disc is based on a specific recording technology
    • G11B2220/2537 Optical discs
    • G11B2220/2562 DVDs [digital versatile discs]; Digital video discs; MMCDs; HDCDs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/84 Television signal recording using optical recording
    • H04N5/85 Television signal recording using optical recording on discs or drums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/79 Processing of colour television signals in connection with recording
    • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N9/82 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only
    • H04N9/8205 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback the individual colour picture signal components being recorded simultaneously only involving the multiplexing of an additional signal and the colour video signal

Definitions

  • the present disclosure is directed to video presentation transmission and, more particularly, to methods and apparatus for generating navigation information on the fly.
  • MPEG: Moving Pictures Experts Group
  • Various versions of these protocols have been developed, and are referred to as MPEG-1, MPEG-2, etc.
  • compressed video and audio data is packetized into elementary streams wrapped inside packet headers that contain information necessary to decompress the individual streams during playback.
  • These individual audio and video elementary streams can be further assembled, or multiplexed, into a single stream with timing information in the packet headers that identifies the time at which the contents of each packet should be presented. In this way, video packets can be synchronized with audio packets during playback.
  • MPEG systems use two basic types of multiplexed streams, namely, Program Streams (PS) and Transport Streams (TS).
  • Program Streams are targeted primarily for storage media.
  • Transport Streams are targeted primarily for transmission.
  • Transport Streams have a potentially higher error rate associated with data transmission.
  • audio and video are individually compressed and packetized.
  • a multiplexer then combines the individual packets into a PS or TS.
  • the packets are retrieved from the stream by a demultiplexer, individual packets are depacketized and decompressed, and synchronization between audio and video is achieved by using the appropriate fields in the PS or TS headers.
  • Decoding is typically performed on the fly (i.e., dynamically) as the audio/video is played back.
  • the packets are time-stamped, allowing the playback to be manipulated to perform functions such as: moving directly to specified portions of the audio and/or video presentation, pausing, playing only audio or only video, playing audio in different languages, etc., while maintaining proper synchronization.
  • These and similar functions are collectively referred to as navigation.
  • Navigation data for an MPEG stream is conventionally generated during the encoding operation and placed into the MPEG stream in the form of navigation packets.
  • particularly relevant points in the MPEG data stream were selected to be navigation points or links that were presented to a user on a display screen, such as a television that was also displaying decoded program content. The viewer could then select one of the navigation points from the display and have the audio and video associated with the selected navigation point presented to the viewer.
  • the navigation points were generated prior to broadcast of the MPEG stream and were transmitted at substantially the same time as the MPEG stream. Accordingly, in prior systems, program content had always been stored and previewed before it was broadcast so that the navigation points could be generated and broadcast with the MPEG stream including the program content.
  • FIGS. 1 and 2 are block diagrams of example communication systems.
  • FIG. 3 is a block diagram of the example encoders of FIGS. 1 and 2.
  • FIG. 4 is a block diagram of the example navigation generator of FIG. 1.
  • FIG. 5 is a block diagram of the example decoder/players of FIGS. 1 and 2.
  • FIG. 6 is a block diagram of the example playback stack of FIG. 5.
  • FIG. 7 is a block diagram of the example seek point controller of FIG. 6.
  • FIG. 8 is an example display view including navigation seek points presented to a user of the decoder/player of FIGS. 1 and 2.
  • FIG. 9 is a flow diagram of an example compose navigation database process.
  • FIG. 10 is a flow diagram of an example playback process.
  • FIG. 1 shows a block diagram of an example end-to-end system 1 including an encoder 10 coupled to a navigation generator 20 that is further coupled to a decoder/player 30 .
  • the system 1 is used to create a navigation database and an audio/video datastream, such as, for example, an MPEG stream including audio and video information.
  • the encoder 10 receives audio and video data, compresses the received data to reduce the storage space and bandwidth required by that data, packetizes the compressed data into audio and video packets, and multiplexes the packets together into an MPEG stream that is coupled to the navigation generator 20 .
  • the navigation generator 20 receives information from a composer (e.g., a user who manually enters navigation data while viewing a video presentation and who is typically located at a broadcast operation center (BOC)), examines the MPEG stream received from the encoder 10, and produces an associated navigation database containing navigation metadata for performing navigation functions on the MPEG stream.
  • BOC: broadcast operation center
  • the encoder 10 provides information to an automatic navigation generator 40 .
  • the automatic navigation generator 40 automatically generates the navigation database by analyzing the content of the MPEG stream and identifying desired points in the MPEG stream based on predefined criteria.
  • the navigation generator 40 produces a navigational database using an automated process rather than relying on direct user inputs to specify which parts of the MPEG stream are identified in the navigational database.
  • the output of either the generator 20 or the generator 40 is coupled to the decoder/player 30 to allow a user to select various portions of the MPEG stream for playback based on the data in the navigation database.
  • the selected portions are demultiplexed, depacketized, and decompressed by the decoder/player 30 to produce the desired video and/or audio outputs (e.g., for playing, for viewing, for listening and/or recording).
  • the navigation data in the navigation database can be used to produce these outputs and/or to create special effects such as pause/resume, freeze-frame, fast playback, and slow playback, as well as to provide a user the ability to jump to a particular location in the programming.
  • the functions of decoding and playing are integrated into the decoder/player 30 .
  • the decoder and player are separate units, and the decoded data developed by the decoder is stored and/or transmitted to the player by a transmitter (not shown) for presentation.
  • the functions of the encoder 10 , the generator 20 , and the decoder/player 30 are performed at different times and on different platforms, and one or more storage media are used to hold the data stream and/or navigation database until the next stage is ready to receive that data.
  • the encoder 10 may store the encoded data stream before passing the encoded data stream to the generator 20.
  • the generator 20 may hold, or store, navigation database information until such information is passed to the decoder/player 30 .
  • FIG. 3 illustrates an example encoder 10 .
  • the encoder 10 could be located at a BOC for a Video-on-Demand network, for example.
  • a raw video signal is provided to a video encoder 11 and a raw audio signal is provided to an audio encoder 12 .
  • the video and audio signals are digitized before presentation to the encoders 11 , 12 .
  • the encoders 11 , 12 could include analog-to-digital converters and, therefore, could accommodate the input of analog signals.
  • the video data is compressed through any known video compression algorithm by the video encoder 11 .
  • the audio data is compressed through any known audio compression algorithm by the audio encoder 12 .
  • the encoders could perform MPEG compression or any other suitable type of compression.
  • the compressed video data from the video encoder 11 is segmented into packets with predefined sizes, formats, and protocols by a video packetizer 13, while the compressed audio data from the audio encoder 12 is segmented into packets with predefined sizes, formats, and protocols by an audio packetizer 14.
  • Each packet developed by the video packetizer 13 and/or the audio packetizer 14 may be referred to as an audio/video packetized elementary stream (A/V PES) and contains timing information, such as, for example, a presentation time stamp (PTS) that identifies where in the playback presentation the data in the packet should be placed.
  • PTS: presentation time stamp
  • the playback operation later synchronizes the video and audio packets in the proper timing relationships by matching up timing information from various packets.
  • the audio data may contain multiple audio tracks, such as voice tracks in different languages for a movie.
  • Each audio track uses the same relative timing data and, therefore, each packet is identified by a sequence number or other identifier.
  • the navigation data uses these sequence numbers to identify particular packets for playback thereby permitting selection of the audio packets for the desired audio track.
  • the navigation data can also use the packet identifiers to permit mixing specified video and specified audio in other predetermined ways during playback.
  • the packets produced by the packetizers 13 , 14 are combined into a single data stream by a multiplexer 15 .
  • the multiplexer 15 which receives a system time clock input, may be, for example, a program stream multiplexer or a transport stream multiplexer.
  • the multiplexed data may contain additional information related to timing and/or contents, and may follow an MPEG-2 transport protocol.
  • the MPEG stream is stored in an optional storage device 16 before being provided to the navigation generator 20 .
  • the storage device 16 may be internal or external to the encoder 10 , and may utilize a portable medium such as, but not limited to, a compact disk, read only memory (CD-ROM) or a digital versatile disk (DVD).
  • the data stream may be written directly into storage device 16 , or may be transmitted through a transmission medium before being stored.
  • the MPEG stream may be read out of the storage device 16 for transmission to the navigation generator 20 .
  • the optional storage device 16 may be completely eliminated and, therefore, the MPEG stream may not be stored at all.
  • An example navigation generator 20 is illustrated in FIG. 4.
  • the navigation generator 20 includes an authoring tool 21 .
  • the authoring tool 21 is used to examine the MPEG stream and to generate navigation information identifying content of interest in the MPEG data stream.
  • the authoring tool 21 may be implemented in hardware, software and/or firmware or in any other suitable combination thereof.
  • the authoring tool 21 is responsive to navigation configuration information 24 , which includes a file of desired seek points in the MPEG stream as defined by composer inputs 23 .
  • the composer is typically a person who views the MPEG stream content and manually records the seek points in the file according to a predefined format, which is discussed below.
  • the composer specifies the composer inputs through a keyboard, by pointing to icons on a screen with a mouse or other pointing device, or by any other data input device.
  • the authoring tool 21 may be programmed to search through the MPEG stream in various ways to locate and define desired seek points.
  • the authoring tool examines the timing information of the packets.
  • the authoring tool 21 may be interactive, thereby allowing personnel at the BOC to view the MPEG encoded programming as it is broadcast to viewers.
  • the authoring tool 21 may be embodied in a workstation or any other suitable computer installation that allows personnel at the BOC to identify portions of the broadcast programming that are of interest as the MPEG stream is broadcast. Accordingly, personnel at the BOC view the programming at substantially the same time as the viewers and the personnel at the BOC surmise the events in the programming that the viewers will find interesting and will want to review at a later time. For example, considering a baseball game, personnel at the BOC watch the MPEG stream delivering the baseball game programming content at approximately the same time the viewers view the content.
  • as the BOC personnel and the viewers watch the game, a player from the home team hits a home run. At that point, the BOC personnel control the authoring tool 21 to make an indication that an event of interest has occurred.
  • the indication of the interesting event is translated into a navigation file that is broadcast to the viewer's location to identify the content of interest.
  • the viewers may select a link that causes their receiver/decoder to present the audio and video of the home run. To permit this on-the-fly insertion of navigation seek points, there may be a delay (e.g., 3 minutes) between the time the BOC personnel are provided with the MPEG stream and the broadcast time of that MPEG stream.
  • the MPEG stream may include a video presentation such as a digitized movie or other video sequence.
  • the composer-inputted criteria may be seek points that are located specified amounts of time after the movie starts, or may be points that divide the movie into a specified number of equal time segments.
  • the authoring tool 21 locates a video intraframe (I-frame) that is closest to each specified part of the MPEG presentation for which a seek point is desired, and identifies that I-frame, or the packet containing that I-frame, as the requested point in the MPEG sequence.
  • I-frames are used as reference points because, unlike predicted frames (P-frames) or bi-directional frames (B-frames), they are self-contained video images that do not depend on previous or subsequent frames for their reconstruction.
  • the navigation data generated by the authoring tool 21 is placed into one or more navigation files 22 .
  • the navigation file(s) 22 of FIG. 4 are separate files from the files holding the MPEG stream and are broadcast after the MPEG stream is broadcast. Both the navigation file(s) 22 and the associated MPEG stream may be stored at the BOC until needed for playback, although they may remain as separate files.
  • both the navigation file(s) 22 and the MPEG stream are stored in the storage device 26 (e.g., a hard drive).
  • the storage device 26 may be internal or external to the navigation generator 20 , and may include a fixed or portable medium.
  • the navigation files 22 and the associated MPEG stream may be broadcast “on-the-fly” to client devices by a transmitter (not shown) over a transmission medium.
  • the navigation information comprises two files.
  • the first file is an Extensible Markup Language (XML) file or some other pre-defined file format.
  • the first file contains chapter times, positions and labels, and audio/video stream packet ID's and labels. It may also include timing information, titles for scenes in the media presentation, bitmap files for the scenes, and I-frame location in the stream.
  • the second file is a binary file referred to as an I-frame index file.
  • the second file contains the presentation time and file offset of the packet corresponding to each video I-frame.
  • the I-frame index file is used for video trick modes, such as fast forward and fast reverse.
  • the I-frame index may be used to provide a user the ability to jump to particular locations in the presentation and may also be used as a quick-scan source for locating specific time points in the presentation.
  • the navigation files (including seek points), and the MPEG stream, are read out and broadcast (or simply broadcast in the case of “on-the-fly” broadcast during play of the media presentation) to client devices.
  • FIG. 5 shows an example client device 30 for playing all or a portion of the MPEG stream.
  • a playback stack 31 reads the navigation files 22 and presents navigation options to the display 34 . Such options may include chapters available, chapter labels, etc.
  • a playback control 33 is adapted to provide input data to the playback stack 31 to identify the portions or segments of the MPEG stream that are to be selected by a viewer for presentation.
  • the playback stack 31 responds to input data from the playback control 33 by reading the navigation files to identify where in the MPEG stream requested segments of the presentation are located.
  • the selected MPEG segments are then read from the playback stack 31 and presented to the decoder 32 , where they are decoded and played on one or more media players.
  • the illustrated example shows media players such as a display 34 for presenting video data and a speaker 35 for presenting audio data.
  • the example playback stack 31 includes an intelligent reception system 54 , an internal mass storage device 56 for storing the MPEG data stream and the navigation files and a seek point controller 58 .
  • the intelligent reception system 54 which may be implemented as software, hardware and/or firmware, controls whether to store navigation file(s) and whether to associate the file with the MPEG stream content.
  • the intelligent reception system 54 may be used, for example, to allow only subscribers who want seek point functionality to have access to navigation information. Additionally or alternatively, the intelligent reception system 54 may be used to allow only viewers of a particular program or viewers who have expressed interest in a particular program to receive and view the navigation files associated with that program.
  • the mass storage device 56 may be a hard drive, an optical disk and its associated optical drive, semiconductor memory or any other memory device.
  • the navigation file(s) may be encrypted by the encoder 10 and decrypted by the intelligent reception system 54 of the paying subscriber. Alternatively, the navigation files may be unencrypted.
  • the seek point controller 58 enables a user of the client device 30 to use the navigation seek points to navigate within the MPEG media presentation.
  • the seek point controller 58 may be located within the playback stack 31 , as shown, or may be located external to the playback stack 31 , and may be implemented as software, firmware and/or hardware or any suitable combination thereof. When implemented as software or firmware, for example, the seek point controller 58 could be an additional application stack added to an existing playback device.
  • the seek point controller 58 includes a controller/selector 62 that receives an input of I-frame data and navigation file information from the intelligent reception system 54 .
  • the navigation file information includes the navigation seek points.
  • the controller/selector 62 reads or parses the file(s) 22, which include stored seek points or locations, and outputs the navigation points to a user's media playing device (e.g., the visual display 34, the audio speaker 35, or both) via a seek point output device 68.
  • the seek point output device 68 formats the navigation seek points for visual display on a user's visual display 34 . Additionally, audio data concerning the seek points may also be formatted and output for audio indication of the navigation seek points to the audio speaker 35 , for example.
  • the control output block 64 of the seek point controller 58 also directs the mass storage device 56 to control the flow of information from the mass storage device 56 to the decoder 32. For example, seek point information and MPEG stream portions may be output from the mass storage device 56 to the decoder 32 for presentation on the display 34 (FIG. 5).
  • the receiver 66 of the seek point controller 58 receives information from the playback control 33 , such as, for example, a remote control, and passes the information on to the controller/selector 62 .
  • the information received by the receiver 66 may include indications of links that a viewer desires to select. Accordingly, the information passed to the receiver 66 is used to control, via the control output 64 , the MPEG stream information passed from the mass storage device 56 to the decoder 32 .
  • An example of a navigation display presented to a user is illustrated in FIG. 8.
  • the display 34 includes a program viewing area 70 in which programming information such as television shows may be presented.
  • the display 34 also includes a number of sections of navigation seek point information, as shown at reference numerals 72-78. While the navigation seek point information and the programming information are shown as occupying two separate areas of the display 34, those having ordinary skill in the art will readily recognize that this arrangement is just one example because the information may be presented together, such as by overlaying one type of information over the other.
  • the navigation seek point information displayed at reference numerals 72-78 may be textual, but may also include, for example, image and audio information.
  • the navigation seek point information is not limited to display in the locations denoted with reference numerals 72-78.
  • the locations denoted with reference numerals 72-78 are merely examples of locations at which the navigation seek point information may be displayed.
  • the user may input a selection of a seek point.
  • a selection is sent from the playback control 33 to the receiver 66 within the seek point controller 58 of the playback stack 31 .
  • the receiver 66 may also be configured to accept other data from the playback control 33, such as commands to move graphical indications, such as cursors, on the display to indicate which navigation seek point is desired. In this way, the user is presented with a graphical user interface to graphically move between different seek points to facilitate selection of a particular seek point.
  • the controller/selector 62 controls the control output 64 to relay the selection information, including the starting I-frame, to the playback stack 31 .
  • the playback stack 31 accesses the MPEG data from the mass storage device 56 corresponding to the selected portion of the media presentation and serves up the MPEG data to the display 34 and/or speaker 35 through the decoder 32 .
  • the disclosed playback system permits a client/user to jump within an MPEG media presentation to portions of the presentation that the user wants to play back through visually or audibly presented navigation seek points.
  • FIG. 9 is a flow chart of an example process 80 for composing a navigation database.
  • the process 80 of FIG. 9 is, for example, implemented in the authoring tool 21 (FIG. 4), at which BOC personnel (e.g., an operator) can view the MPEG stream to display the programming content of the MPEG stream (block 82 ).
  • the operator waits for content of interest to be displayed (block 84 ). If no content of interest is displayed (block 84 ), the operator continues to view the programming (block 82 ). If, however, content of interest is presented to the operator (block 84 ), the operator makes an identification of the content of interest (block 86 ).
  • the identification could include the operator selecting a link provided on the authoring tool 21 , depressing a button provided on the authoring tool 21 or making a time notation by hand and later keying the time information into the authoring tool 21 . Both the start and end time of the content of interest may be indicated.
  • the authoring tool selects an MPEG I-frame having a timestamp close to the identified time (block 88 ).
  • the I-frame is selected because I-frames include all visual information necessary to generate an image (i.e., I-frames are not relative frames dependent on other frames for visual content information).
  • the I-frame is stored in a navigation file (block 90), as are the I-frame file offset and the presentation time stamp (PTS).
  • the navigation file is then transmitted to viewers watching the programming (block 92 ).
  • An indication of the end of the identified content of interest may also be stored in the navigation information.
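  • By way of illustration only, blocks 86-90 of process 80 can be sketched in Python for a single marked event, assuming the operator supplies start and end marks as presentation times; the function name and entry fields below are hypothetical, not part of the patent.

      def compose_entry(mark_start, mark_end, iframe_pts, label):
          # Snap the operator's start mark to the nearest I-frame (block 88)
          # and record the labeled segment in a navigation entry (block 90).
          start = min(iframe_pts, key=lambda pts: abs(pts - mark_start))
          return {"label": label, "start_pts": start, "end_pts": mark_end}

      iframes = [0, 45045, 90090]                    # 90 kHz ticks, assumed
      entry = compose_entry(44000, 88000, iframes, "Home run")
      assert entry["start_pts"] == 45045             # snapped to an I-frame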
  • the navigation file is broadcast after viewers have had the opportunity to watch the programming content in the MPEG stream.
  • the navigation file may be transmitted to all receiver/decoders so that the receiver/decoders can store the navigation file in the event that a viewer later tunes to programming related to the navigation file.
  • the navigation file in the disclosed example is separate from any file containing all or part of the MPEG stream itself, and may even be stored in a separate medium from the MPEG stream. Additionally, the metadata in the navigation file may also contain one or more of timing information, titles for scenes of the MPEG media presentation, bitmap or MPEG data for displaying images corresponding to particular scenes of the MPEG presentation and I-frame location.
  • the authoring tool 21 continues to display programming (block 82 ).
  • the process 80 may end execution or may return control to any routine that called the process 80 .
  • an operator at the BOC was described as selecting content of interest.
  • content of interest could be identified at locations other than the BOC, and the content of interest could be identified in an automated manner without operator interaction.
  • software or hardware could be adapted to identify content of interest by monitoring video information in the MPEG stream.
  • the authoring tool 21 could be automated to detect instant replays contained in sporting event broadcasts and could, therefore, identify the instant replays as content of interest.
  • FIG. 10 illustrates a process 100 for playing back portions of the MPEG stream associated with the navigation information.
  • the process 100 may be implemented by the playback stack 31 (FIG. 5).
  • the navigation information is displayed to the viewer watching the programming to which the navigation information corresponds (block 102 ).
  • the navigation information may be displayed in areas denoted with the reference numerals 72-78. Additionally or alternatively, navigation information related to other programming content may be stored for later use.
  • the process 100 determines if a viewer has selected any of the navigation information presented on the display (block 104). If the viewer has not selected any of the navigation information (block 104), real time programming continues to be displayed (block 106). If, however, a viewer has selected the navigation information (block 104), real time programming is interrupted (block 108) and the frames associated with the selected navigation information are recalled (block 110). The program content beginning with the information represented by the I-frame associated with the selected navigation information is then presented to the viewer (block 112) until the end of the programming content associated with the selected navigation information is reached (block 114). When the end of the programming associated with the selected navigation information is reached (block 114), real time programming is again displayed (block 106).
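  • Condensed into a sketch, the flow of process 100 is a loop that shows real time programming until a viewer selects a navigation entry, plays the recorded segment from its starting I-frame, and then returns to real time programming. The function names and feed representation are assumptions for the example, not the patent's implementation.

      def playback_loop(get_selection, live, recorded):
          # Blocks 104-114 of process 100, condensed: poll for a selection,
          # show live frames while there is none, otherwise play the
          # recorded segment and fall back to live when it ends.
          while True:
              selection = get_selection()            # block 104
              if selection is None:
                  yield next(live)                   # block 106: real time
              else:                                  # blocks 108-114
                  for frame in recorded(selection):
                      yield frame

      picks = iter([None, "Home run", None])
      live_feed = iter(["live0", "live1", "live2"])
      segment = lambda name: ["replay0", "replay1"]
      player = playback_loop(lambda: next(picks), live_feed, segment)
      print([next(player) for _ in range(4)])        # live0, replay0, replay1, live1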
  • the disclosed example may be implemented in circuitry or as a method.
  • the disclosed example may also be implemented as information or instructions stored on media, which may be read and executed by at least one processor to perform the functions described herein.
  • a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer).
  • a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.), and others.
  • the disclosed examples permit navigational data to be generated from an encoded data stream, thus allowing creation of a navigation database after the MPEG data has been compressed, packetized, and multiplexed.
  • the data stream contains video and/or audio data that has been compressed and packetized according to any of various known formats and protocols such as any of the MPEG formats.
  • the navigation data permits selective retrieval of portions of the data stream for playback by identifying packets or other portions of the data stream that are associated with navigation points (i.e., points in a presentation carried in the data stream that the user may wish to access quickly and begin playing).
  • Navigation data may also include data that enables special effects or trick modes, such as fast forward or fast reverse.
  • the navigation information provides the viewer the ability to jump to a particular point in the data stream.
  • the configuration information in the navigation database includes information on the particular points identified in the navigation database. These points and/or the associated information may be specified by a user. Alternatively, the configuration data in the navigation database may be generated automatically.
  • the data in the navigation database, which contains data about other data, may be referred to as navigation metadata.
  • the navigation database may be kept separately from the MPEG stream.

Abstract

Methods and apparatus for utilizing navigation seek points in a media presentation are disclosed. A stream of media presentation data including program content of interest is received and decoded, and the content of interest is made available to a user. After the program content of interest has been made available to the user for viewing, navigation information is received and communicated to the user.

Description

    RELATED APPLICATIONS
  • This patent application is a continuing patent application of U.S. patent application Ser. No. 09/932,806, filed Aug. 17, 2001.[0001]
  • TECHNICAL FIELD
  • The present disclosure is directed to video presentation transmission and, more particularly, to methods and apparatus for generating navigation information on the fly. [0002]
  • BACKGROUND
  • Many digitized moving picture systems use the well-known protocols and formats developed by the Moving Pictures Experts Group, generically referred to as MPEG. Various versions of these protocols have been developed, and are referred to as MPEG-1, MPEG-2, etc. In an MPEG system, compressed video and audio data is packetized into elementary streams wrapped inside packet headers that contain information necessary to decompress the individual streams during playback. These individual audio and video elementary streams can be further assembled, or multiplexed, into a single stream with timing information in the packet headers that identifies the time at which the contents of each packet should be presented. In this way, video packets can be synchronized with audio packets during playback. MPEG systems use two basic types of multiplexed streams, namely, Program Streams (PS) and Transport Streams (TS). Program Streams are targeted primarily for storage media. Transport Streams are targeted primarily for transmission. Transport Streams have a potentially higher error rate associated with data transmission. [0003]
  • In encoders of MPEG systems, audio and video are individually compressed and packetized. A multiplexer then combines the individual packets into a PS or TS. On the decoder side, the packets are retrieved from the stream by a demultiplexer, individual packets are depacketized and decompressed, and synchronization between audio and video is achieved by using the appropriate fields in the PS or TS headers. Decoding is typically performed on the fly (i.e., dynamically) as the audio/video is played back. The packets are time-stamped, allowing the playback to be manipulated to perform functions such as: moving directly to specified portions of the audio and/or video presentation, pausing, playing only audio or only video, playing audio in different languages, etc., while maintaining proper synchronization. These and similar functions are collectively referred to as navigation. Navigation data for an MPEG stream is conventionally generated during the encoding operation and placed into the MPEG stream in the form of navigation packets. [0004]
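  • For illustration only, the synchronization described above can be reduced to a Python sketch: packets carry a presentation time stamp, a demultiplexer separates the elementary streams, and a packet is rendered once the playback clock reaches its time stamp. The Packet structure and 90 kHz tick values are simplifications assumed for the example and do not reproduce the actual MPEG packet layout.

      from dataclasses import dataclass

      @dataclass
      class Packet:
          stream: str       # "video" or "audio"
          pts: int          # presentation time stamp, in 90 kHz ticks
          payload: bytes

      def demux(stream, packets):
          # Pull one elementary stream back out of the multiplexed sequence.
          return [p for p in packets if p.stream == stream]

      def presentable(packets, clock):
          # A packet is handed to the renderer once the clock reaches its PTS,
          # which is what keeps audio and video lined up during playback.
          return [p for p in packets if p.pts <= clock]

      mux = [Packet("video", 0, b"v0"), Packet("audio", 0, b"a0"),
             Packet("video", 3003, b"v1"), Packet("audio", 3003, b"a1")]
      video, audio = demux("video", mux), demux("audio", mux)
      assert [p.pts for p in presentable(video, 3003)] == [0, 3003]
      assert len(audio) == 2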
  • Previously, particularly relevant points in the MPEG data stream were selected to be navigation points or links that were presented to a user on a display screen, such as a television that was also displaying decoded program content. The viewer could then select one of the navigation points from the display and have the audio and video associated with the selected navigation point presented to the viewer. However, conventionally the navigation points were generated prior to broadcast of the MPEG stream and were transmitted at substantially the same time as the MPEG stream. Accordingly, in prior systems, program content had always been stored and previewed before it was broadcast so that the navigation points could be generated and broadcast with the MPEG stream including the program content. [0005]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1 and 2 are block diagrams of example communication systems. [0006]
  • FIG. 3 is a block diagram of the example encoders of FIGS. 1 and 2. [0007]
  • FIG. 4 is a block diagram of the example navigation generator of FIG. 1. [0008]
  • FIG. 5 is a block diagram of the example decoder/players of FIGS. 1 and 2. [0009]
  • FIG. 6 is a block diagram of the example playback stack of FIG. 5. [0010]
  • FIG. 7 is a block diagram of the example seek point controller of FIG. 6. [0011]
  • FIG. 8 is an example display view including navigation seek points presented to a user of the decoder/player of FIGS. 1 and 2. [0012]
  • FIG. 9 is a flow diagram of an example compose navigation database process. [0013]
  • FIG. 10 is a flow diagram of an example playback process. [0014]
  • DETAILED DESCRIPTION
  • FIG. 1 shows a block diagram of an example end-to-end system 1 including an encoder 10 coupled to a navigation generator 20 that is further coupled to a decoder/player 30. The system 1 is used to create a navigation database and an audio/video datastream, such as, for example, an MPEG stream including audio and video information. In the disclosed example, the encoder 10 receives audio and video data, compresses the received data to reduce the storage space and bandwidth required by that data, packetizes the compressed data into audio and video packets, and multiplexes the packets together into an MPEG stream that is coupled to the navigation generator 20. [0015]
  • The navigation generator 20, as described below in conjunction with FIG. 4, receives information from a composer (e.g., a user who manually enters navigation data while viewing a video presentation and who is typically located at a broadcast operation center (BOC)), examines the MPEG stream received from the encoder 10, and produces an associated navigation database containing navigation metadata for performing navigation functions on the MPEG stream. The particular navigation functions that are performed are specified by the composer inputs. [0016]
  • Alternatively, as shown in the example alternative end-to-end system 2 of FIG. 2, the encoder 10 provides information to an automatic navigation generator 40. The automatic navigation generator 40 automatically generates the navigation database by analyzing the content of the MPEG stream and identifying desired points in the MPEG stream based on predefined criteria. In this alternative example, the navigation generator 40 produces a navigational database using an automated process rather than relying on direct user inputs to specify which parts of the MPEG stream are identified in the navigational database. [0017]
  • The output of either the generator 20 or the generator 40 is coupled to the decoder/player 30 to allow a user to select various portions of the MPEG stream for playback based on the data in the navigation database. The selected portions are demultiplexed, depacketized, and decompressed by the decoder/player 30 to produce the desired video and/or audio outputs (e.g., for playing, for viewing, for listening and/or recording). The navigation data in the navigation database can be used to produce these outputs and/or to create special effects such as pause/resume, freeze-frame, fast playback, and slow playback, as well as to provide a user the ability to jump to a particular location in the programming. [0018]
  • In the examples of FIGS. 1 and 2, the functions of decoding and playing are integrated into the decoder/player 30. In another example, the decoder and player are separate units, and the decoded data developed by the decoder is stored and/or transmitted to the player by a transmitter (not shown) for presentation. In an example, the functions of the encoder 10, the generator 20, and the decoder/player 30 are performed at different times and on different platforms, and one or more storage media are used to hold the data stream and/or navigation database until the next stage is ready to receive that data. For example, the encoder 10 may store the encoded data stream before passing the encoded data stream to the generator 20. Additionally or alternatively, the generator 20 may hold, or store, navigation database information until such information is passed to the decoder/player 30. [0019]
  • FIG. 3 illustrates an example encoder 10. The encoder 10 could be located at a BOC for a Video-on-Demand network, for example. In the example of FIG. 3, a raw video signal is provided to a video encoder 11 and a raw audio signal is provided to an audio encoder 12. According to the disclosed example, the video and audio signals are digitized before presentation to the encoders 11, 12. However, in another example, the encoders 11, 12 could include analog-to-digital converters and, therefore, could accommodate the input of analog signals. The video data is compressed through any known video compression algorithm by the video encoder 11. Similarly, the audio data is compressed through any known audio compression algorithm by the audio encoder 12. For example, the encoders could perform MPEG compression or any other suitable type of compression. [0020]
  • The compressed video data from the video encoder 11 is segmented into packets with predefined sizes, formats, and protocols by a video packetizer 13, while the compressed audio data from the audio encoder 12 is segmented into packets with predefined sizes, formats, and protocols by an audio packetizer 14. Each packet developed by the video packetizer 13 and/or the audio packetizer 14 may be referred to as an audio/video packetized elementary stream (A/V PES) and contains timing information, such as, for example, a presentation time stamp (PTS) that identifies where in the playback presentation the data in the packet should be placed. The playback operation later synchronizes the video and audio packets in the proper timing relationships by matching up timing information from various packets. [0021]
  • In the foregoing examples, the audio data may contain multiple audio tracks, such as voice tracks in different languages for a movie. Each audio track uses the same relative timing data and, therefore, each packet is identified by a sequence number or other identifier. As described below in detail, the navigation data uses these sequence numbers to identify particular packets for playback, thereby permitting selection of the audio packets for the desired audio track. Persons of ordinary skill in the art will readily appreciate that the navigation data can also use the packet identifiers to permit mixing specified video and specified audio in other predetermined ways during playback. [0022]
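  • As an illustration of selection by packet identifier, the sketch below keeps the video packets plus only the audio packets of one language track. The PID values and field names are hypothetical; real identifiers are assigned in the stream's own tables.

      from collections import namedtuple

      TsPacket = namedtuple("TsPacket", "pid pts payload")

      AUDIO_PIDS = {"en": 0x1100, "fr": 0x1101}      # hypothetical identifiers

      def select_track(packets, language, video_pid=0x1011):
          # Keep video packets plus the audio packets whose identifier
          # matches the requested language track.
          wanted = {video_pid, AUDIO_PIDS[language]}
          return [p for p in packets if p.pid in wanted]

      stream = [TsPacket(0x1011, 0, b"v"), TsPacket(0x1100, 0, b"en"),
                TsPacket(0x1101, 0, b"fr")]
      assert [p.payload for p in select_track(stream, "fr")] == [b"v", b"fr"]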
  • In the example of FIG. 3, the packets produced by the packetizers 13, 14 are combined into a single data stream by a multiplexer 15. The multiplexer 15, which receives a system time clock input, may be, for example, a program stream multiplexer or a transport stream multiplexer. The multiplexed data may contain additional information related to timing and/or contents, and may follow an MPEG-2 transport protocol. [0023]
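  • The multiplexing step can be approximated, for illustration, as a merge of the two packetized elementary streams in presentation-time order. This sketch deliberately omits the pacing against the system time clock and the pack/system headers that a real program stream or transport stream multiplexer adds.

      import heapq

      def multiplex(video_pes, audio_pes):
          # Interleave the two elementary streams into a single stream,
          # ordered by presentation time stamp (the first tuple element).
          return list(heapq.merge(video_pes, audio_pes, key=lambda p: p[0]))

      video = [(0, "V0"), (3003, "V1")]              # (pts, payload) pairs
      audio = [(0, "A0"), (1500, "A1")]
      print(multiplex(video, audio))                 # packets in PTS order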
  • In the illustrated example, the MPEG stream is stored in an optional storage device 16 before being provided to the navigation generator 20. The storage device 16 may be internal or external to the encoder 10, and may utilize a portable medium such as, but not limited to, a compact disk, read only memory (CD-ROM) or a digital versatile disk (DVD). The data stream may be written directly into storage device 16, or may be transmitted through a transmission medium before being stored. Regardless of the particular configuration of the storage device 16, the MPEG stream may be read out of the storage device 16 for transmission to the navigation generator 20. Alternatively, the optional storage device 16 may be completely eliminated and, therefore, the MPEG stream may not be stored at all. [0024]
  • An example navigation generator 20 is illustrated in FIG. 4. In the illustrated example, the navigation generator 20 includes an authoring tool 21. The authoring tool 21 is used to examine the MPEG stream and to generate navigation information identifying content of interest in the MPEG data stream. The authoring tool 21 may be implemented in hardware, software and/or firmware or in any other suitable combination thereof. The authoring tool 21 is responsive to navigation configuration information 24, which includes a file of desired seek points in the MPEG stream as defined by composer inputs 23. The composer is typically a person who views the MPEG stream content and manually records the seek points in the file according to a predefined format, which is discussed below. In an example, the composer specifies the composer inputs through a keyboard, by pointing to icons on a screen with a mouse or other pointing device, or by any other data input device. Additionally or alternatively, the authoring tool 21 may be programmed to search through the MPEG stream in various ways to locate and define desired seek points. In an example, the authoring tool examines the timing information of the packets. [0025]
  • In some examples, the authoring tool 21 may be interactive, thereby allowing personnel at the BOC to view the MPEG encoded programming as it is broadcast to viewers. For example, the authoring tool 21 may be embodied in a workstation or any other suitable computer installation that allows personnel at the BOC to identify portions of the broadcast programming that are of interest as the MPEG stream is broadcast. Accordingly, personnel at the BOC view the programming at substantially the same time as the viewers and the personnel at the BOC surmise the events in the programming that the viewers will find interesting and will want to review at a later time. For example, considering a baseball game, personnel at the BOC watch the MPEG stream delivering the baseball game programming content at approximately the same time the viewers view the content. As the BOC personnel and the viewers watch the game, a player from the home team hits a home run. At that point, the BOC personnel control the authoring tool 21 to make an indication that an event of interest has occurred. The indication of the interesting event is translated into a navigation file that is broadcast to the viewer's location to identify the content of interest. At some later time (e.g., between innings of the baseball game), the viewers may select a link that causes their receiver/decoder to present the audio and video of the home run. To permit this on-the-fly insertion of navigation seek points, there may be a delay (e.g., 3 minutes) between the time the BOC personnel are provided with the MPEG stream and the broadcast time of that MPEG stream. [0026]
  • While the foregoing example is pertinent to a baseball game in particular and, in general, sporting events, those having ordinary skill in the art will readily recognize that other programming events could be analyzed at the BOC as they are watched by viewers and, subsequently, navigation information could be provided to each user. Additionally, it will be readily appreciated by those having ordinary skill in the art that the application of the disclosed system is not limited only to live events such as sporting events. To the contrary, the disclosed system could be used for previously-recorded programs stored at the BOC that have not had navigation information generated by the BOC until the program is broadcast. However, the application to live broadcasts is particularly advantageous. [0027]
  • In another example, the MPEG stream may include a video presentation such as a digitized movie or other video sequence. In such an example, the composer-inputted criteria may be seek points that are located specified amounts of time after the movie starts, or may be points that divide the movie into a specified number of equal time segments. [0028]
  • After the content of interest has been identified, the authoring tool 21 locates a video intraframe (I-frame) that is closest to each specified part of the MPEG presentation for which a seek point is desired, and identifies that I-frame, or the packet containing that I-frame, as the requested point in the MPEG sequence. The identified points and corresponding I-frames divide the data stream into labeled segments. In this example, I-frames are used as reference points because, unlike predicted frames (P-frames) or bi-directional frames (B-frames), they are self-contained video images that do not depend on previous or subsequent frames for their reconstruction. [0029]
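  • For illustration, the nearest-I-frame lookup can be sketched as a binary search over the sorted presentation times of the stream's I-frames. The tick values assume the common 90 kHz clock, and the function name is illustrative.

      from bisect import bisect_left

      def nearest_iframe(iframe_pts, target_pts):
          # iframe_pts is sorted; compare the two I-frames that bracket the
          # requested seek time and return the closer one. I-frames are used
          # because they decode without reference to neighboring frames.
          i = bisect_left(iframe_pts, target_pts)
          candidates = iframe_pts[max(i - 1, 0):i + 1]
          return min(candidates, key=lambda pts: abs(pts - target_pts))

      iframes = [0, 45045, 90090, 135135]            # one I-frame per half second
      assert nearest_iframe(iframes, 50000) == 45045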
  • In the example of FIG. 4, the navigation data generated by the authoring tool 21 is placed into one or more navigation files 22. Unlike conventional systems that encode navigation data into the MPEG stream, the navigation file(s) 22 of FIG. 4 are separate files from the files holding the MPEG stream and are broadcast after the MPEG stream is broadcast. Both the navigation file(s) 22 and the associated MPEG stream may be stored at the BOC until needed for playback, although they may remain as separate files. In the example illustrated in FIG. 4, both the navigation file(s) 22 and the MPEG stream are stored in the storage device 26 (e.g., a hard drive). The storage device 26 may be internal or external to the navigation generator 20, and may include a fixed or portable medium. Alternatively, the navigation files 22 and the associated MPEG stream may be broadcast “on-the-fly” to client devices by a transmitter (not shown) over a transmission medium. [0030]
  • In an example, the navigation information comprises two files. The first file is an Extensible Markup Language (XML) file or some other pre-defined file format. The first file contains chapter times, positions and labels, and audio/video stream packet IDs and labels. It may also include timing information, titles for scenes in the media presentation, bitmap files for the scenes, and I-frame locations in the stream. The second file is a binary file referred to as an I-frame index file. The second file contains the presentation time and file offset of the packet corresponding to each video I-frame. The I-frame index file is used for video trick modes, such as fast forward and fast reverse. Additionally, the I-frame index may be used to provide a user the ability to jump to particular locations in the presentation and may also be used as a quick-scan source for locating specific time points in the presentation. The navigation files (including seek points), and the MPEG stream, are read out and broadcast (or simply broadcast in the case of “on-the-fly” broadcast during play of the media presentation) to client devices. [0031]
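  • The sketch below illustrates one possible shape for these two files. The XML element names and the fixed 16-byte index record (presentation time and file offset, both 64-bit) are assumptions for the example; the text does not fix an exact schema.

      import struct
      from io import BytesIO

      # First file: chapter metadata in XML (element names are illustrative).
      NAV_XML = """<navigation>
        <chapter label="Home run" time="00:42:10" position="3"/>
        <chapter label="Final out" time="02:55:03" position="9"/>
      </navigation>"""

      # Second file: the binary I-frame index, one fixed-size record per
      # I-frame holding its presentation time and its file offset.
      RECORD = struct.Struct("<qq")

      def write_index(entries, fp):
          for pts, offset in entries:
              fp.write(RECORD.pack(pts, offset))

      def read_index(fp):
          data = fp.read()
          return [RECORD.unpack_from(data, i)
                  for i in range(0, len(data), RECORD.size)]

      buf = BytesIO()
      write_index([(0, 0), (45045, 96256)], buf)
      buf.seek(0)
      assert read_index(buf) == [(0, 0), (45045, 96256)]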
  • FIG. 5 shows an example client device 30 for playing all or a portion of the MPEG stream. A playback stack 31 reads the navigation files 22 and presents navigation options to the display 34. Such options may include chapters available, chapter labels, etc. A playback control 33 is adapted to provide input data to the playback stack 31 to identify the portions or segments of the MPEG stream that are to be selected by a viewer for presentation. The playback stack 31 responds to input data from the playback control 33 by reading the navigation files to identify where in the MPEG stream requested segments of the presentation are located. The selected MPEG segments are then read from the playback stack 31 and presented to the decoder 32, where they are decoded and played on one or more media players. The illustrated example shows media players such as a display 34 for presenting video data and a speaker 35 for presenting audio data. [0032]
  • [0033] As shown in FIG. 6, the example playback stack 31 includes an intelligent reception system 54, an internal mass storage device 56 for storing the MPEG data stream and the navigation files, and a seek point controller 58. The intelligent reception system 54, which may be implemented as software, hardware and/or firmware, controls whether to store navigation file(s) and whether to associate the file(s) with the MPEG stream content. The intelligent reception system 54 may be used, for example, to allow only those subscribers who want seek point functionality to have access to navigation information. Additionally or alternatively, the intelligent reception system 54 may be used to allow only viewers of a particular program, or viewers who have expressed interest in a particular program, to receive and view the navigation files associated with that program. The mass storage device 56 may be a hard drive, an optical disk and its associated optical drive, semiconductor memory, or any other memory device. In any of the foregoing examples, the navigation file(s) may be encrypted by the encoder 10 and decrypted by the intelligent reception system 54 of the paying subscriber. Alternatively, the navigation files may be unencrypted.
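The storage decision described above might be sketched as follows; the subscriber fields (seek_points_enabled, tuned_program, interests) are hypothetical names used only to illustrate the kinds of checks the intelligent reception system 54 could apply.

```python
# Illustrative sketch of the intelligent reception system's storage decision:
# keep a navigation file only for entitled subscribers, and only when it
# relates to a program the viewer is watching or has expressed interest in.
def should_store(nav_file, subscriber):
    if not subscriber.get("seek_points_enabled", False):
        return False  # seek point functionality not part of the subscription
    program_id = nav_file["program_id"]
    return (program_id == subscriber.get("tuned_program")
            or program_id in subscriber.get("interests", ()))
```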
  • [0034] The seek point controller 58 enables a user of the client device 30 to use the navigation seek points to navigate within the MPEG media presentation. The seek point controller 58 may be located within the playback stack 31, as shown, or external to the playback stack 31, and may be implemented as software, firmware, hardware, or any suitable combination thereof. When implemented as software or firmware, for example, the seek point controller 58 could be an additional application stack added to an existing playback device.
  • [0035] An example seek point controller 58 is illustrated in detail in FIG. 7. In this example, the seek point controller 58 includes a controller/selector 62 that receives I-frame data and navigation file information, including the navigation seek points, from the intelligent reception system 54. The controller/selector 62 reads or parses the file(s) 22, which include the stored seek points or locations, and outputs the navigation points to a user's media playing device (e.g., the visual display 34, the audio speaker 35, or both) via a seek point output device 68.
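For illustration, parsing the XML navigation file described earlier might look like the sketch below; the element and attribute names (navigation, chapter, time, label) are assumptions, since the disclosure specifies the file's contents but not its schema.

```python
# Illustrative sketch: extract (time, label) seek points from an XML
# navigation file. The schema shown in `sample` is assumed.
import xml.etree.ElementTree as ET

def parse_seek_points(xml_text):
    root = ET.fromstring(xml_text)
    return [(float(c.get("time")), c.get("label")) for c in root.iter("chapter")]

sample = """<navigation>
  <chapter time="0.0" label="Kickoff"/>
  <chapter time="612.4" label="First goal"/>
</navigation>"""
print(parse_seek_points(sample))  # -> [(0.0, 'Kickoff'), (612.4, 'First goal')]
```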
  • [0036] The seek point output device 68 formats the navigation seek points for presentation on a user's visual display 34. Audio data concerning the seek points may also be formatted and output to the audio speaker 35, for example, to indicate the navigation seek points audibly. The control output block 64 of the seek point controller 58 also directs the mass storage device 56 to control the flow of information from the mass storage device 56 to the decoder 32. For example, seek point information and MPEG stream portions may be output from the mass storage device 56 to the decoder 32 for presentation on the display 34 (FIG. 5).
  • [0037] The receiver 66 of the seek point controller 58 receives information from the playback control 33 (e.g., a remote control) and passes the information on to the controller/selector 62. The information received by the receiver 66 may include indications of links that a viewer desires to select. Accordingly, the information passed to the receiver 66 is used to control, via the control output 64, the MPEG stream information passed from the mass storage device 56 to the decoder 32.
  • [0038] An example of a navigation display presented to a user is illustrated in FIG. 8. As shown, the display 34 includes a program viewing area 70 in which programming information such as television shows may be presented. The display 34 also includes a number of sections of navigation seek point information, as shown at reference numerals 72-78. While the navigation seek point information and the programming information are shown as occupying two separate areas of the display 34, those having ordinary skill in the art will readily recognize that this arrangement is merely one example; the two types of information may instead be presented together, such as by overlaying one type of information over the other. The navigation seek point information displayed at reference numerals 72-78 may be textual, but may also include, for example, image and audio information. Of course, the navigation seek point information is not limited to display at the locations denoted with reference numerals 72-78; to the contrary, those locations are merely examples of where the navigation seek point information may be displayed.
  • [0039] After the navigation seek point data is displayed, the user may input a selection of a seek point. Returning to FIGS. 6 and 7, such a selection is sent from the playback control 33 to the receiver 66 within the seek point controller 58 of the playback stack 31. The receiver 66 may also be configured to accept other data from the playback control 33, such as commands to move graphical indications (e.g., cursors) on the display to indicate which navigation seek point is desired. In this way, the user is presented with a graphical user interface for moving between different seek points to facilitate selection of a particular seek point.
  • [0040] Once a navigation seek point selection is received by the receiver 66 from the playback control 33, the selection is communicated to the controller/selector 62 to effect presentation of the desired portion of the MPEG media presentation. The controller/selector 62 correlates the selected seek point with the particular I-frame carrying the corresponding portion of the media selection by addressing the navigation database. The controller/selector 62 then commands the control output 64 to relay the selection information, including the starting I-frame, to the playback stack 31. In response, the playback stack 31 accesses the MPEG data corresponding to the selected portion of the media presentation from the mass storage device 56 and serves up the MPEG data to the display 34 and/or speaker 35 through the decoder 32. Thus, the disclosed playback system permits a client/user to jump within an MPEG media presentation, via visually or audibly presented navigation seek points, to the portions of the presentation that the user wants to play back.
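A sketch of that final step, under the assumption of an MPEG-2 transport stream with 188-byte packets and a decoder exposed as a per-packet callback, might read as follows; the function and parameter names are illustrative only.

```python
# Illustrative sketch: play a selected segment by seeking the stored stream
# to the starting I-frame's byte offset and feeding packets to the decoder.
PACKET_SIZE = 188  # MPEG-2 transport stream packet length (assumed container)

def play_segment(stream_path, start_offset, end_offset, decode_packet):
    with open(stream_path, "rb") as f:
        f.seek(start_offset)                 # jump to the selected I-frame
        while f.tell() < end_offset:
            packet = f.read(PACKET_SIZE)
            if len(packet) < PACKET_SIZE:
                break                        # ran off the end of the stored stream
            decode_packet(packet)            # decoder 32 renders to display/speaker
```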
  • [0041] FIG. 9 is a flow chart of an example process 80 for composing a navigation database. The process 80 of FIG. 9 is, for example, implemented in the authoring tool 21 (FIG. 4), at which BOC personnel (e.g., an operator) can view the programming content of the MPEG stream (block 82). As the operator views the programming, the operator waits for content of interest to be displayed (block 84). If no content of interest is displayed (block 84), the operator continues to view the programming (block 82). If, however, content of interest is presented to the operator (block 84), the operator identifies the content of interest (block 86). The identification could include the operator selecting a link provided on the authoring tool 21, depressing a button provided on the authoring tool 21, or making a time notation by hand and later keying the time information into the authoring tool 21. Both the start and end times of the content of interest may be indicated.
  • [0042] After the operator has identified the time of the content of interest, the authoring tool selects an MPEG I-frame having a timestamp close to the identified time (block 88). An I-frame is selected because I-frames include all of the visual information necessary to generate an image (i.e., I-frames are not relative frames that depend on other frames for visual content information). After the I-frame is identified (block 88), it is stored in a navigation file (block 90), as are the I-frame's file offset and presentation time stamp (PTS). The navigation file is then transmitted to viewers watching the programming (block 92). An indication of the end of the identified content of interest may also be stored in the navigation information. In the disclosed example, the navigation file is broadcast after viewers have had the opportunity to watch the programming content in the MPEG stream. Alternatively or additionally, the navigation file may be transmitted to all receiver/decoders so that the receiver/decoders can store the navigation file in the event that a viewer later tunes to programming related to the navigation file.
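Combining the earlier sketches, this authoring step could be approximated as below; nearest_iframe() and RECORD are the illustrative helpers defined above, and the 90 kHz PTS units remain an assumption.

```python
# Illustrative sketch of blocks 88-90: map the operator's marked time to the
# nearest I-frame and append its PTS and file offset to the binary index.
def record_seek_point(index_path, iframe_index, marked_time):
    pts_seconds, offset = nearest_iframe(iframe_index, marked_time)
    with open(index_path, "ab") as f:
        f.write(RECORD.pack(int(pts_seconds * 90_000), offset))
```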
  • [0043] The navigation file in the disclosed example is separate from any file containing all or part of the MPEG stream itself, and may even be stored on a separate medium from the MPEG stream. Additionally, the metadata in the navigation file may contain one or more of: timing information; titles for scenes of the MPEG media presentation; bitmap or MPEG data for displaying images corresponding to particular scenes of the MPEG presentation; and I-frame locations.
  • [0044] If there is more content to view (block 94), the authoring tool 21 continues to display programming (block 82). If, alternatively, there is no more programming content to view (block 94), the process 80 may end execution or may return control to any routine that called the process 80.
  • [0045] In the foregoing example, an operator at the BOC was described as selecting content of interest. However, persons of ordinary skill in the art will recognize that content of interest could be identified at locations other than the BOC, and could be identified in an automated manner without operator interaction. For example, software or hardware could be adapted to identify content of interest by monitoring video information in the MPEG stream; the authoring tool 21 could be automated to detect instant replays contained in sporting event broadcasts and, therefore, identify the instant replays as content of interest.
  • [0046] Having described examples of how navigation information is identified and broadcast, attention is now turned to a process 100 (FIG. 10) for playing back portions of the MPEG stream associated with the navigation information. The process 100 may be implemented by the playback stack 31 (FIG. 5). During the playback process 100, the navigation information is displayed to the viewer watching the programming to which the navigation information corresponds (block 102). For example, as shown in FIG. 8, the navigation information may be displayed in the areas denoted with reference numerals 72-78. Additionally or alternatively, navigation information related to other programming content may be stored for later use.
  • [0047] After the navigation information is displayed (block 102), the process 100 determines whether the viewer has selected any of the navigation information presented on the display (block 104). If the viewer has not selected any of the navigation information (block 104), real time programming continues to be displayed (block 106). If, however, the viewer has selected navigation information (block 104), real time programming is interrupted (block 108) and the frames associated with the selected navigation information are recalled (block 110). The program content, beginning with the information represented by the I-frame associated with the selected navigation information, is then presented to the viewer (block 112) until the end of the programming content associated with the selected navigation information is reached (block 114). When the end of that programming is reached (block 114), real time programming is again displayed (block 106).
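In outline, the loop of FIG. 10 might be expressed as the sketch below; the callbacks (get_selection, show_live, play_segment) are placeholders standing in for the playback stack's actual interfaces, which the disclosure does not define at this level.

```python
# Illustrative sketch of the FIG. 10 playback loop (blocks 102-114).
def playback_loop(get_selection, show_live, play_segment, segments):
    while True:
        selection = get_selection()      # block 104: has a seek point been chosen?
        if selection == "quit":
            break
        if selection is None:
            show_live()                  # block 106: keep showing real-time feed
            continue
        start, end = segments[selection] # block 110: recall the stored frames
        play_segment(start, end)         # blocks 112-114: play to segment's end
        show_live()                      # block 106: resume real-time programming
```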
  • [0048] Although the foregoing describes real time programming as being displayed (block 106), it will be readily appreciated by those having ordinary skill in the art that other pre-recorded programming could be displayed instead. For example, if a user watching an hour-long program paused at minute 30 to re-watch content from the first 30 minutes, and that review took 10 minutes, the user could either resume watching the real time programming, which would then be at minute 40, or resume watching the program from minute 30 from a recording thereof. Accordingly, returning to real time programming is merely one example of content that may be displayed after a user navigates to view stored information.
  • [0049] The disclosed example may be implemented in circuitry or as a method. The disclosed example may also be implemented as information or instructions stored on media, which may be read and executed by at least one processor to perform the functions described herein. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; electrical, optical, acoustical or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.); and others.
  • [0050] From the foregoing, persons of ordinary skill in the art will appreciate that the disclosed examples permit navigational data to be generated from an encoded data stream, thus allowing creation of a navigation database after the MPEG data has been compressed, packetized, and multiplexed. The data stream contains video and/or audio data that has been compressed and packetized according to any of various known formats and protocols, such as any of the MPEG formats. The navigation data permits selective retrieval of portions of the data stream for playback by identifying packets or other portions of the data stream that are associated with navigation points (i.e., points in a presentation carried in the data stream that the user may wish to access quickly and begin playing). Navigation data may also include data that enables special effects or trick modes, such as fast forward or fast reverse. In addition, the navigation information provides the viewer the ability to jump to a particular point in the data stream.
  • [0051] The configuration information in the navigation database includes information on the particular points identified in the navigation database. These points and/or the associated information may be specified by a user. Alternatively, the configuration data in the navigation database may be generated automatically. The data in the navigation database, which is data about other data, may be referred to as navigation metadata. The navigation database may be kept separately from the MPEG stream.
  • [0052] Although certain apparatus and methods have been described herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all embodiments of the teachings of the invention fairly falling within the scope of the appended claims, either literally or under the doctrine of equivalents.

Claims (59)

What is claimed is:
1. A method for utilizing navigation information in a media presentation, the method comprising:
receiving a stream of media presentation data including program content of interest;
decoding the stream of media presentation data and making the program content of interest available to a user;
receiving navigation information representative of the program content of interest after the program content of interest has been made available to the user; and
communicating the navigation information to the user.
2. A method as defined by claim 1, wherein the navigation information comprises a pointer to a video frame.
3. A method as defined by claim 2, wherein the video frame comprises a Moving Picture Experts Group (MPEG) I-frame.
4. A method as defined by claim 1, further comprising receiving a selection input from the user, wherein the selection input corresponds to selected navigation information.
5. A method as defined by claim 4, further comprising making the program content of interest available to the user in response to the selection of the navigation information.
6. A method as defined in claim 4, further comprising:
identifying a starting point in the stream of media presentation data associated with the selected navigation information;
processing the stream of media presentation data beginning at the starting point; and
presenting to the user at least a portion of the processed stream of media presentation data.
7. A method as defined by claim 1, further comprising storing the navigation information.
8. A method as defined by claim 7, wherein the navigation information is stored if a program including the program content of interest is being viewed by the user when the navigation information is received.
9. A method as defined by claim 7, wherein the navigation information is stored if a program including the program content of interest has been designated by the user.
10. A receiving apparatus comprising:
a playback stack configured to receive a stream of media presentation data including program content of interest;
a decoder coupled to the playback stack and the display device, the decoder being configured to decode the stream of media presentation data and to pass the decoded information to the display device to display the program content of interest; and
a seek point controller configured to receive navigation information representative of the program content of interest after the decoder has passed the decoded information to a display device and to pass the navigation information to the decoder.
11. A receiving apparatus as defined by claim 10, further comprising a storage device configured to store the stream of media presentation data and the navigation information.
12. A receiving apparatus as defined by claim 10, wherein the navigation information comprises a pointer to a video frame.
13. A receiving apparatus as defined by claim 12, wherein the video frame comprises a Moving Picture Experts Group (MPEG) I-frame.
14. A receiving apparatus as defined by claim 10, wherein the seek point controller is further configured to receive a selection input from the user, wherein the selection input corresponds to selected navigation information.
15. A receiving apparatus as defined by claim 14, wherein the seek point controller is further configured to make the program content of interest available to the user in response to the selection of the navigation information.
16. A receiving apparatus as defined in claim 14, wherein the seek point controller is further configured to:
identify a starting point in the stream of media presentation data associated with the selected navigation information; and
provide to the decoder the stream of media presentation data beginning at the starting point.
17. A receiving apparatus as defined by claim 10, wherein the playback stack is further configured to store the navigation information.
18. A receiving apparatus as defined by claim 17, wherein the playback stack is configured to store the navigation information if a program including the program content of interest is being viewed by the user when the navigation information is received.
19. A receiving apparatus as defined by claim 18, wherein the playback stack is configured to store the navigation information if a program including the program content of interest has been designated by the user.
20. A storage medium storing information configured to cause an apparatus to:
receive a stream of audio/video presentation data including program content of interest;
decode the stream of audio/video presentation data and make the program content of interest available to a user;
receive navigation data representative of the program content of interest after the program content of interest has been made available to the user; and
communicate the navigation data to the user.
21. A storage medium as defined by claim 20, wherein the navigation data comprises a pointer to a video frame.
22. A storage medium as defined by claim 21, wherein the video frame comprises a Moving Picture Experts Group (MPEG) I-frame.
23. A storage medium as defined by claim 20, wherein the information is further configured to cause the apparatus to receive a selection input from the user, wherein the selection input corresponds to selected navigation data.
24. A storage medium as defined by claim 23, wherein the information is further configured to cause the apparatus to make the program content of interest available to the user in response to the selection of the navigation data.
25. A storage medium as defined in claim 23, wherein the information is further configured to cause the apparatus to:
identify a starting point in the stream of audio/video presentation data associated with the selected navigation data;
process the stream of audio/video presentation data beginning at the starting point; and
present to the user at least a portion of the processed stream of audio/video presentation data.
26. A storage medium as defined by claim 20, wherein the information is further configured to cause the apparatus to store the navigation data.
27. A storage medium as defined by claim 26, wherein the information is further configured to cause the apparatus to store the navigation data if a program including the program content of interest is being viewed by the user when the navigation data is provided to the apparatus.
28. A storage medium as defined by claim 26, wherein the information is further configured to cause the apparatus to store the navigation data if a program including the program content of interest has been designated by the user.
29. A method of delivering information comprising:
producing an encoded video signal including content of interest;
producing an encoded audio signal;
receiving the encoded video and audio signals as the encoded video and audio signals are distributed for presentation to users;
generating navigation information in response to the content of interest in the encoded video and audio signals; and
distributing the navigation information to users after the encoded video and audio signals are distributed for presentation to users.
30. A method as defined by claim 29, wherein the navigation information comprises a formatted file.
31. A method as defined by claim 30, wherein the formatted file is an Extensible Markup Language file.
32. A method as defined by claim 30, wherein the navigation information comprises video information including at least one of text data of a listing of a reference to the content of interest, images indicative of the content of interest and audio information indicative of the content of interest.
33. A method as defined by claim 29, further including presenting content of the encoded video and audio signals to personnel.
34. A method as defined by claim 33, further including allowing the personnel to indicate when the content of interest is manifest to the personnel.
35. An information delivery system comprising:
a video encoder configured to receive a video signal and to produce an encoded video signal;
an audio encoder configured to receive an audio signal and to produce an encoded audio signal;
an authoring tool configured to receive the encoded video and audio signals as the encoded video and audio signals are distributed for presentation to users, the authoring tool being further configured to generate navigation information in response to content of interest in the encoded video and audio signals and to distribute the navigation information to users.
36. An information delivery system as defined by claim 35, wherein the navigation information comprises a formatted file.
37. An information delivery system as defined by claim 36, wherein the formatted file is an Extensible Markup Language file.
38. An information delivery system as defined by claim 36, wherein the navigation information comprises video information including at least one of text data of a listing of a reference to the content of interest, images indicative of the content of interest and audio information indicative of the content of interest.
39. An information delivery system as defined by claim 35, wherein the authoring tool comprises an audio and video display device that manifests content of the encoded video and audio signals to personnel.
40. An information delivery system as defined by claim 39, wherein the authoring tool further comprises an input device allowing the personnel to indicate when the content of interest is manifest to the personnel.
41. An information delivery system as defined by claim 40, wherein the authoring tool comprises a computing device and wherein the input device comprises a keyboard.
42. An information delivery system as defined by claim 40, wherein the authoring tool comprises a computing device and wherein the input device comprises a mouse.
43. A storage medium storing information configured to cause an apparatus to:
receive encoded video and audio signals as the encoded video and audio signals are distributed for presentation to users;
generate navigation information in response to content of interest in the encoded video and audio signals; and
distribute the navigation information to users after the encoded video and audio signals are distributed for presentation to users.
44. A storage medium as defined by claim 43, wherein the navigation information comprises a formatted file.
45. A storage medium as defined by claim 44, wherein the formatted file is an Extensible Markup Language file.
46. A storage medium as defined by claim 44, wherein the navigation information comprises video information including at least one of text data of a listing of a reference to the content of interest, images indicative of the content of interest and audio information indicative of the content of interest.
47. A storage medium as defined by claim 43, wherein the instructions are further configured to cause the apparatus to manifest content of the encoded video and audio signals to personnel.
48. A storage medium as defined by claim 47, wherein the instructions are further configured to cause the apparatus to receive from the personnel an indication when the content of interest is manifest to the personnel.
49. A method of delivering information comprising:
producing an encoded video signal of a live event including content of interest;
producing an encoded audio signal of the live event;
delaying the transmission of the encoded video and audio signals of the live event while generating navigation information in response to the content of interest in the encoded video and audio signals; and
distributing the navigation information to users.
50. A method as defined by claim 49, further comprising distributing the navigation information to users as the encoded video and audio signals are distributed to users.
51. A method as defined by claim 49, wherein the navigation information comprises a formatted file.
52. A method as defined by claim 51, wherein the formatted file is an Extensible Markup Language file.
53. A method as defined by claim 51, wherein the navigation information comprises video information including at least one of text data of a listing of a reference to the content of interest, images indicative of the content of interest and audio information indicative of the content of interest.
54. A method as defined by claim 49, further including presenting content of the encoded video and audio signals to personnel.
55. A method as defined by claim 54, further including allowing the personnel to indicate when the content of interest is manifest to the personnel.
56. An information distribution system comprising:
a video encoder configured to receive a video signal including content of interest and to produce an encoded video signal;
an audio encoder configured to receive an audio signal and to produce an encoded audio signal;
a transmission component configured to distribute for presentation to a user the encoded audio and video signals;
an authoring tool configured to receive the encoded video and audio signals as the encoded video and audio signals are distributed for presentation to users, the authoring tool being further configured to generate navigation information in response to content of interest in the encoded video and audio signals and to distribute the navigation information to the user via the transmission component;
a playback stack configured to receive the encoded video signal including the content of interest and to receive the encoded audio signal;
a decoder coupled to the playback stack and the display device, the decoder being configured to decode the encoded video and audio signals and to pass the decoded information to the display device to display the program content of interest; and
a seek point controller configured to receive the navigation information representative of the program content of interest after the decoder has passed the decoded information to the display device and to pass the navigation information to the decoder.
57. A system as defined by claim 56, wherein the playback stack is further configured to store the navigation data.
58. A system as defined by claim 57, wherein the playback stack is configured to store the navigation data if a program including the program content of interest is being viewed by the user when the navigation information is provided to the playback stack.
59. A system as defined by claim 57, wherein the playback stack is configured to store the navigation data if a program including the program content of interest has been designated by the user.
US10/335,112 2001-08-17 2002-12-31 Methods and apparatus for generating navigation information on the fly Abandoned US20030095790A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/335,112 US20030095790A1 (en) 2001-08-17 2002-12-31 Methods and apparatus for generating navigation information on the fly

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/932,806 US7139470B2 (en) 2001-08-17 2001-08-17 Navigation for MPEG streams
US10/335,112 US20030095790A1 (en) 2001-08-17 2002-12-31 Methods and apparatus for generating navigation information on the fly

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/932,806 Continuation US7139470B2 (en) 2001-08-17 2001-08-17 Navigation for MPEG streams

Publications (1)

Publication Number Publication Date
US20030095790A1 true US20030095790A1 (en) 2003-05-22

Family

ID=25462968

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/932,806 Active 2024-10-11 US7139470B2 (en) 2001-08-17 2001-08-17 Navigation for MPEG streams
US10/335,112 Abandoned US20030095790A1 (en) 2001-08-17 2002-12-31 Methods and apparatus for generating navigation information on the fly

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/932,806 Active 2024-10-11 US7139470B2 (en) 2001-08-17 2001-08-17 Navigation for MPEG streams

Country Status (4)

Country Link
US (2) US7139470B2 (en)
EP (1) EP1417839A2 (en)
KR (1) KR100618473B1 (en)
WO (1) WO2003017671A2 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003101114A1 (en) * 2002-05-28 2003-12-04 Matsushita Electric Industrial Co., Ltd. Moving picture data reproducing device with improved random access
EP1522024A4 (en) * 2002-06-04 2007-03-28 Qualcomm Inc System for multimedia rendering in a portable device
US20040083015A1 (en) * 2002-06-04 2004-04-29 Srinivas Patwari System for multimedia rendering in a portable device
US7236960B2 (en) * 2002-06-25 2007-06-26 Eastman Kodak Company Software and system for customizing a presentation of digital images
KR100607949B1 (en) * 2002-09-11 2006-08-03 삼성전자주식회사 Apparatus for recording or reproducing multimedia data using hierarchical infromation structure and information storage medium thereof
DE602004032584D1 (en) * 2003-02-19 2011-06-16 Panasonic Corp Recording medium, reproducing apparatus and recording method
EP1469476A1 (en) * 2003-04-16 2004-10-20 Accenture Global Services GmbH Controlled multi-media program review
US7519274B2 (en) 2003-12-08 2009-04-14 Divx, Inc. File format for multiple track digital data
US8472792B2 (en) 2003-12-08 2013-06-25 Divx, Llc Multimedia distribution system
KR20060053425A (en) * 2004-11-15 2006-05-22 엘지전자 주식회사 Method and apparatus for writing information on picture data sections in a data stream and for using the information
JP4712812B2 (en) * 2005-10-21 2011-06-29 パナソニック株式会社 Recording / playback device
US8340507B2 (en) * 2007-05-31 2012-12-25 Panasonic Corporation Recording medium, playback apparatus, recording method, program, and playback method
US20090133054A1 (en) * 2007-11-16 2009-05-21 Matthew Thomas Boggie Presentation of auxiliary content via a content presentation device
WO2009065137A1 (en) 2007-11-16 2009-05-22 Divx, Inc. Hierarchical and reduced index structures for multimedia files
KR100958176B1 (en) * 2008-11-26 2010-05-14 주식회사 코아로직 Multimedia data processing device, multimedia system, and recording method of multimedia data
US20120198492A1 (en) * 2011-01-31 2012-08-02 Cbs Interactive, Inc. Stitching Advertisements Into A Manifest File For Streaming Video
EP3047607B1 (en) * 2013-09-20 2017-09-06 Telefonaktiebolaget LM Ericsson (publ) In band control channels of a communication network

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5168356A (en) * 1991-02-27 1992-12-01 General Electric Company Apparatus for segmenting encoded video signal for transmission
JP3377677B2 (en) * 1996-05-30 2003-02-17 日本電信電話株式会社 Video editing device
US6236395B1 (en) * 1999-02-01 2001-05-22 Sharp Laboratories Of America, Inc. Audiovisual information management system
KR100608454B1 (en) 1999-10-19 2006-08-02 삼성전자주식회사 A moving picture recording and/or reproduction apparatus using key frame
IL132859A (en) 1999-11-10 2008-07-08 Nds Ltd System for data stream processing

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5334188A (en) * 1987-12-07 1994-08-02 Nissho Corporation Connector with injection site
US5190534A (en) * 1990-12-07 1993-03-02 Delmed, Inc. Prefilled sterilant fluid releasable coupling connector apparatus for catheter applications
US5195957A (en) * 1991-02-01 1993-03-23 Tollini Dennis R Sterilant cartridge-cap and associated connection
US6118927A (en) * 1995-01-30 2000-09-12 Kabushiki Kaisha Toshiba Method and apparatus for reproducing a data according to navigation data
US5792120A (en) * 1995-02-17 1998-08-11 Menyhay; Steve Z. Method of using a sterile medical injection port and cover
US5745645A (en) * 1995-09-29 1998-04-28 Matsushita Electric Industrial Co., Ltd. Method and an apparatus for encoding telecine-converted video data for seamless connection
US5870754A (en) * 1996-04-25 1999-02-09 Philips Electronics North America Corporation Video retrieval of MPEG compressed sequences using DC and motion signatures
US5956458A (en) * 1996-11-07 1999-09-21 Sharp Laboratories Of America, Inc. System and method for determining representative frames of video captured by a video camera
US6185363B1 (en) * 1997-06-02 2001-02-06 Philips Electronics North America Corporation Visual indexing system
US6360057B1 (en) * 1999-05-12 2002-03-19 Kabushiki Kaisha Toshiba Digital video recording/playback system with entry point processing function

Cited By (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060280437A1 (en) * 1999-01-27 2006-12-14 Gotuit Media Corp Methods and apparatus for vending and delivering the content of disk recordings
US8230343B2 (en) 1999-03-29 2012-07-24 Digitalsmiths, Inc. Audio and video program recording, editing and playback systems using metadata
US20070300258A1 (en) * 2001-01-29 2007-12-27 O'connor Daniel Methods and systems for providing media assets over a network
US20080059989A1 (en) * 2001-01-29 2008-03-06 O'connor Dan Methods and systems for providing media assets over a network
US20080060001A1 (en) * 2001-06-08 2008-03-06 Logan James D Methods and apparatus for recording and replaying sports broadcasts
US8091111B2 (en) 2001-06-08 2012-01-03 Digitalsmiths, Inc. Methods and apparatus for recording and replaying sports broadcasts
US20070113250A1 (en) * 2002-01-29 2007-05-17 Logan James D On demand fantasy sports systems and methods
US20050005308A1 (en) * 2002-01-29 2005-01-06 Gotuit Video, Inc. Methods and apparatus for recording and replaying sports broadcasts
US20060294212A1 (en) * 2003-03-27 2006-12-28 Norifumi Kikkawa Information processing apparatus, information processing method, and computer program
US8782170B2 (en) * 2003-03-27 2014-07-15 Sony Corporation Information processing apparatus, information processing method, and computer program
WO2005043910A1 (en) * 2003-10-22 2005-05-12 Video Networks Ltd. Non-linear interactive video navigation
US9055352B2 (en) 2003-10-22 2015-06-09 Video Networks Ip Holdings Limited Non-linear interactive video navigation
US20050188408A1 (en) * 2003-10-22 2005-08-25 Wallis Emily Claire L. Non-linear interactive video navigation
US7293278B2 (en) * 2004-01-13 2007-11-06 Comcast Cable Holdings, Llc On-demand digital asset management and distribution method and system
US20050198341A1 (en) * 2004-01-13 2005-09-08 Michel Walter F. On-demand digital asset management and distribution method and system
US20050169150A1 (en) * 2004-01-14 2005-08-04 Jonathan Resnick Method to display visual information
US20050172009A1 (en) * 2004-01-29 2005-08-04 Lg Electronics Inc., Server system for performing communication over wireless network
US8170782B2 (en) * 2004-07-16 2012-05-01 Sony Corporation Information processing system, information processing apparatus and method, recording medium, and program
US20060015912A1 (en) * 2004-07-16 2006-01-19 Sony Corporation Information processing system, information processing apparatus and method, recording medium, and program
US8483946B2 (en) 2004-07-16 2013-07-09 Sony Corporation Information processing system, information processing apparatus and method, recording medium, and program
US7444664B2 (en) * 2004-07-27 2008-10-28 Microsoft Corp. Multi-view video format
US20060026646A1 (en) * 2004-07-27 2006-02-02 Microsoft Corporation Multi-view video format
US7783653B1 (en) * 2005-06-30 2010-08-24 Adobe Systems Incorporated Fast seek in streaming media
US9077860B2 (en) 2005-07-26 2015-07-07 Activevideo Networks, Inc. System and method for providing video content associated with a source image to a television in a communication network
US7735101B2 (en) 2006-03-28 2010-06-08 Cisco Technology, Inc. System allowing users to embed comments at specific points in time into media presentation
US8332886B2 (en) 2006-03-28 2012-12-11 Michael Lanza System allowing users to embed comments at specific points in time into media presentation
US20080036917A1 (en) * 2006-04-07 2008-02-14 Mark Pascarella Methods and systems for generating and delivering navigatable composite videos
US20080155627A1 (en) * 2006-12-04 2008-06-26 O'connor Daniel Systems and methods of searching for and presenting video and audio
US9826197B2 (en) 2007-01-12 2017-11-21 Activevideo Networks, Inc. Providing television broadcasts over a managed network and interactive content over an unmanaged network to a client device
US9355681B2 (en) * 2007-01-12 2016-05-31 Activevideo Networks, Inc. MPEG objects and systems and methods for using MPEG objects
US20080170622A1 (en) * 2007-01-12 2008-07-17 Ictv, Inc. Interactive encoded content system including object models for viewing on a remote device
US9042454B2 (en) 2007-01-12 2015-05-26 Activevideo Networks, Inc. Interactive encoded content system including object models for viewing on a remote device
US20100158109A1 (en) * 2007-01-12 2010-06-24 Activevideo Networks, Inc. Providing Television Broadcasts over a Managed Network and Interactive Content over an Unmanaged Network to a Client Device
US20090049186A1 (en) * 2007-08-16 2009-02-19 Sony Corporation, A Japanese Corporation Method to facilitate trick-modes for streaming video
US20120027379A1 (en) * 2010-01-29 2012-02-02 Raymond Thompson Video processing methods and systems
US8670648B2 (en) * 2010-01-29 2014-03-11 Xos Technologies, Inc. Video processing methods and systems
US9021541B2 (en) 2010-10-14 2015-04-28 Activevideo Networks, Inc. Streaming digital video between video devices using a cable television system
US9204203B2 (en) 2011-04-07 2015-12-01 Activevideo Networks, Inc. Reduction of latency in video distribution networks using adaptive bit rates
US10409445B2 (en) 2012-01-09 2019-09-10 Activevideo Networks, Inc. Rendering of an interactive lean-backward user interface on a television
US9123084B2 (en) 2012-04-12 2015-09-01 Activevideo Networks, Inc. Graphical application integration with MPEG objects
US10456687B1 (en) 2012-06-15 2019-10-29 Gregory S. Off System and method for interactive digital content generation
US10894210B2 (en) 2012-06-15 2021-01-19 Gregory S. Off System and method for interactive digital content generation
US8856650B1 (en) 2012-06-15 2014-10-07 Gregory S. Off System and method for interactive digital content generation
US9393495B1 (en) 2012-06-15 2016-07-19 Gregory S. Off System and method for interactive digital content generation
US10275128B2 (en) 2013-03-15 2019-04-30 Activevideo Networks, Inc. Multiple-mode system and method for providing user selectable video content
US11073969B2 (en) 2013-03-15 2021-07-27 Activevideo Networks, Inc. Multiple-mode system and method for providing user selectable video content
US9326047B2 (en) 2013-06-06 2016-04-26 Activevideo Networks, Inc. Overlay rendering of user interface onto source video
US10200744B2 (en) 2013-06-06 2019-02-05 Activevideo Networks, Inc. Overlay rendering of user interface onto source video
US9294785B2 (en) 2013-06-06 2016-03-22 Activevideo Networks, Inc. System and method for exploiting scene graph information in construction of an encoded video sequence
US9219922B2 (en) 2013-06-06 2015-12-22 Activevideo Networks, Inc. System and method for exploiting scene graph information in construction of an encoded video sequence
CN104717571A (en) * 2013-12-13 2015-06-17 中国移动通信集团公司 Key playing time point determination method, video playing method and related device
US10002644B1 (en) * 2014-07-01 2018-06-19 Amazon Technologies, Inc. Restructuring video streams to support random access playback
US9218848B1 (en) * 2014-07-01 2015-12-22 Amazon Technologies, Inc. Restructuring video streams to support random access playback

Also Published As

Publication number Publication date
KR100618473B1 (en) 2006-08-31
WO2003017671A3 (en) 2003-09-04
US7139470B2 (en) 2006-11-21
US20030035648A1 (en) 2003-02-20
KR20040030969A (en) 2004-04-09
EP1417839A2 (en) 2004-05-12
WO2003017671A2 (en) 2003-02-27

Similar Documents

Publication Publication Date Title
US20030095790A1 (en) Methods and apparatus for generating navigation information on the fly
US10869102B2 (en) Systems and methods for providing a multi-perspective video display
US8015584B2 (en) Delivering interactive content to a remote subscriber
US20170221520A1 (en) Systems and methods to play secondary media content
US7849487B1 (en) Review speed adjustment marker
US20160142768A1 (en) Closed caption tagging system
US20090178092A1 (en) Video picture information delivering apparatus and receiving apparatus
US9456243B1 (en) Methods and apparatus for processing time-based content
KR101426241B1 (en) Apparatus and method for processing recording contents for personal liking
EP1266521B1 (en) System and method for providing multi-perspective instant replay
AU2001266732A1 (en) System and method for providing multi-perspective instant replay
JP2002077820A (en) Accumulating/reproducing device and digital broadcast transmitting device
KR101033558B1 (en) Private Video Recorder and Method for Highlight Reproduction of Private Video Recorder
JP2005204233A (en) Digital broadcast receiver and transmitter, receiving method, program, recording medium, and video recording and reproducing apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTEL CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:JOSHI, AJIT P.;REEL/FRAME:013755/0365

Effective date: 20021231

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: TAHOE RESEARCH, LTD., IRELAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTEL CORPORATION;REEL/FRAME:061827/0686

Effective date: 20220718