Publication number: US 20040249807 A1
Publication type: Application
Application number: US 10/886,809
Publication date: 9 Dec 2004
Filing date: 8 Jul 2004
Priority date: 16 Dec 1999
Also published as: US6928655, US7305384, US7565440, US20050076378, US20050080847
Inventors: Nosakhare Omoigui
Original Assignee: Microsoft Corporation
Live presentation searching
US 20040249807 A1
Abstract
In a networked client/server system, live presentations can be streamed from an encoder or other server to a client computer. Additionally, information describing the presentation is registered with a search server. This information is made available for user searches only for as long as the information properly describes the live presentation. When the information no longer describes a current live presentation, the information is no longer available for searching.
Claims(48)
1. A system comprising:
a search server;
an encoder;
a client computer;
wherein the encoder is to provide an indication of a currently available live presentation to the search server;
wherein the client computer is to submit a request with search criteria to the search server;
wherein the search server is to,
determine whether the currently available live presentation from the encoder matches the search criteria, and
transmit an identifier of the encoder to the client computer if the currently available live presentation matches the search criteria; and
wherein the encoder is to provide the live presentation to the client computer.
2. A system as recited in claim 1, wherein the encoder is further to provide a subsequent indication to the search server indicating that the live presentation is over.
3. A system as recited in claim 1, wherein the encoder further provides to the search server, during the live presentation, information identifying current characteristics of the live presentation.
4. A system as recited in claim 3, wherein:
the search server is further to transmit the information identifying current characteristics of the live presentation to the client computer; and
the client computer is further to display the information identifying current characteristics of the live presentation.
5. A system as recited in claim 3, wherein:
the information identifying current characteristics comprises a topic description; and
the encoder provides a characteristics-over indication to the search server when the topic identified by the topic description is no longer being presented.
6. A system as recited in claim 3, wherein the information identifying the current characteristics comprises text corresponding to the live presentation.
7. A system as recited in claim 1, wherein the live presentation comprises an audio/video streaming media presentation.
8. A system as recited in claim 1, wherein the search server is further to:
maintain a record of user search requests; and
notify the corresponding user when a new live presentation becomes available that satisfies a search request.
9. A system as recited in claim 1, wherein the encoder is configured to send a message to the search server identifying duration of the live presentation and is further configured to send, to the search server, a further indication to change the duration when a presentation that is scheduled for a particular duration is to be extended.
10. A method comprising:
sending, to a search server, information identifying a live presentation available over a network at the beginning of the live presentation; and
identifying, to the search server, when the live presentation is no longer available over the network.
11. A method as recited in claim 10, wherein the identifying comprises sending, to the search server, an indication of the duration of the live presentation.
12. A method as recited in claim 10, wherein the identifying comprises sending, to the search server, an indication of when the live presentation has ended.
13. A method as recited in claim 10, further comprising sending, to the search server, an identifier of an encoder from which the live presentation can be obtained.
14. A method as recited in claim 13, wherein the sending the identifier comprises sending, as the identifier, a uniform resource locator (URL).
15. A method as recited in claim 10, further comprising identifying, to the search server, information indicating characteristics of a part of the live presentation currently being presented.
16. A method as recited in claim 15, wherein the identifying information includes sending, to the search server, an indication of the duration of the characteristics.
17. A method as recited in claim 16, wherein the identifying information comprises:
sending, to the search server, an indication of the characteristics when the current characteristics begin to describe the live presentation; and
sending, to the search server, a characteristics-over indication when the current characteristics no longer describe the live presentation.
18. A method as recited in claim 10, further comprising generating the information identifying the live presentation as the live presentation is presented over the network.
19. A method as recited in claim 18, wherein the generating comprises identifying key words as the live presentation is presented.
20. A method as recited in claim 10, further comprising using closed captioning data as the information identifying the live presentation.
21. A method as recited in claim 10, wherein the live presentation comprises a composite media stream having an audio stream and a video stream.
22. A method as recited in claim 10, wherein the identifying comprises sending, to the search server, an indication of duration of the live presentation; and further comprising, sending, to the search server, a further indication to change the duration when a presentation that is scheduled for a particular duration is to be extended.
23. One or more computer-readable memories containing a computer program that is executable by a processor to perform acts of:
sending, to a search server, information identifying a live presentation available over a network at the beginning of the live presentation; and
identifying, to the search server, when the live presentation is no longer available over the network.
24. An apparatus comprising:
a bus;
a processor coupled to the bus; and
a memory, coupled to the bus, to store a plurality of instructions that are executed by the processor, wherein the plurality of instructions, when executed, cause the processor to,
receive information identifying live content,
maintain the information for as long as the live content is available, and
use the information to respond to searches from a plurality of client computers.
25. An apparatus as recited in claim 24, wherein the instructions to receive information identifying live content are to receive information identifying live content available from an encoder at the time the information is received.
26. An apparatus as recited in claim 24, further comprising a nonvolatile storage device, coupled to the bus, to record the information identifying live content.
27. An apparatus as recited in claim 24, wherein the plurality of instructions, when executed, further cause the processor to store the information identifying live content in the memory.
28. An apparatus as recited in claim 24, wherein the information identifying live content includes a set of descriptive words and an indicator of a server from which the live content is available.
29. An apparatus as recited in claim 28, wherein the indicator of the server comprises a uniform resource locator (URL).
30. An apparatus as recited in claim 24, wherein the plurality of instructions, when executed, further cause the processor to:
receive information identifying current characteristics of the live content;
maintain the information identifying the current characteristics for as long as the characteristics describe the live content; and
use the information identifying the current characteristics to respond to searches from the plurality of client computers.
31. An apparatus as recited in claim 24, wherein the plurality of instructions, when executed, further cause the processor to:
receive information identifying current topic information identifying a topic currently being presented as part of the live content;
receive an indication that the topic is no longer being presented;
maintain the topic information for a period of time after receiving the indication that the topic is no longer being presented; and
use the current topic information to respond to searches from the plurality of client computers during the period of time.
32. An apparatus as recited in claim 24, wherein the plurality of instructions, when executed, further cause the processor to generate, based on the information identifying live content, descriptive information to be added to a database of live content.
33. A system comprising:
a search server;
an encoder;
a client computer;
wherein the encoder is to provide an indication of a currently available live presentation to the search server;
wherein the client computer is to submit a request with search criteria to the search server;
wherein the search server is to,
determine whether the currently available live presentation from the encoder matches the search criteria, and
transmit an identifier of the encoder to the client computer if the currently available live presentation matches the search criteria;
wherein the encoder is configured to provide the live presentation to the client computer; and
wherein the encoder is further configured to provide to the search server, during the live presentation, information identifying current characteristics of the live presentation, wherein the information is chosen from a group consisting of: one or more key words describing the presentation, a summary or abstract of the presentation, and a textual transcript of the presentation.
34. Computer-readable storage media containing computer-readable instructions executable by a processor to cause the processor to:
send, to a search server, information identifying a live presentation available over a network at the beginning of the live presentation; and
identify, to the search server, when the live presentation is no longer available over the network.
35. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions comprise instructions configured to cause the processor to send, to the search server, an indication of the duration of the live presentation.
36. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions configured to cause the processor to identify comprise instructions configured to cause the processor to send, to the search server, an indication of when the live presentation has ended.
37. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions comprise instructions configured to cause the processor to send, to the search server, an identifier of an encoder from which the live presentation can be obtained.
38. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions comprise instructions configured to cause the processor to send, to the search server, an identifier of an encoder from which the live presentation can be obtained, wherein the identifier comprises a uniform resource locator (URL).
39. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions comprise instructions configured to cause the processor to identify, to the search server, information indicating characteristics of a part of the live presentation currently being presented.
40. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions comprise instructions configured to cause the processor to identify, to the search server, information indicating characteristics of a part of the live presentation currently being presented, including an indication of a duration of the characteristics.
41. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions comprise instructions configured to cause the processor to identify, to the search server, information indicating current characteristics of a part of the live presentation currently being presented, wherein the instructions are further configured to cause the processor to:
send, to the search server, an indication of the characteristics when the current characteristics begin to describe the live presentation; and
send, to the search server, a characteristics-over indication when the current characteristics no longer describe the live presentation.
42. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions comprise instructions configured to cause the processor to generate the information identifying the live presentation as the live presentation is presented over the network.
43. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions comprise instructions configured to cause the processor to generate the information identifying the live presentation as the live presentation is presented over the network, wherein the information comprises key words identified as the live presentation is presented.
44. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions comprise instructions configured to cause the processor to use closed captioning data as the information identifying the live presentation.
45. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions configured to cause the processor to send information identifying a live presentation comprise instructions configured to cause the processor to send information identifying a composite media stream having an audio stream and a video stream.
46. Computer-readable storage media as recited in claim 34, wherein the computer-readable instructions configured to cause the processor to identify comprise instructions configured to cause the processor to send, to the search server, an indication of when the live presentation has ended and to prevent any subsequent user search requests from being satisfied using the information describing that presentation by deleting the information describing that presentation.
47. A system comprising:
an encoder configured to provide an indication of a currently available live presentation;
a client computer;
a search server including:
a content database configured to maintain descriptive information regarding current live content available from the encoder;
a notification database configured to maintain information regarding users of client computers registered to be notified when particular live content is available; and
a scheduled presentations database configured to maintain information regarding future live presentations that have been registered with the search server;
wherein the client computer is configured to submit a request with search criteria to the search server;
wherein the search server is configured to:
determine whether the currently available live presentation from the encoder matches the search criteria, and
transmit an identifier of the encoder to the client computer if the currently available live presentation matches the search criteria; and
wherein the encoder is configured to provide the live presentation to the client computer.
48. The system of claim 47, wherein the search server includes:
a query interface configured to allow the client computer to communicate with the search server to enter search criteria for live content;
a registration interface configured to allow the encoder to provide descriptive information regarding the live content to be provided to the client computer and to be added to the content database to be used for searches;
a search engine configured to access the content database to search the descriptive information for live content that matches the search criteria upon receipt of a search request via the query interface; and
a database controller configured to manage the content, notification and scheduled presentations databases, including adding entries to and removing entries from the content, notification and scheduled presentations databases.
Description
    RELATED APPLICATIONS
  • [0001]
    This application is a divisional of U.S. patent application Ser. No. 09/465,547, filed on Dec. 16, 1999, entitled “Live Presentation Searching” and naming Nosakhare D. Omoigui as inventor, the disclosure of which is hereby incorporated herein by reference.
  • TECHNICAL FIELD
  • [0002]
    This invention relates to networked client/server systems and to methods of delivering and rendering live content in such systems. More particularly, the invention relates to searching for live presentations.
  • BACKGROUND OF THE INVENTION
  • [0003]
    The advent of computers and their continued technological advancement has revolutionized the manner in which people work and live. Information that used to be available only in written or verbal form is becoming increasingly available in electronic form. Furthermore, presentations which used to be available only on particular recording media (e.g., film or tape) or via television broadcasts are now available in digital form (e.g., over the Internet).
  • [0004]
    One problem users encounter when faced with this continually increasing mass of digital information is locating the particular information they are interested in. For example, trying to locate a particular presentation can be difficult and cumbersome. Various search mechanisms exist for pre-recorded “on-demand” presentations (e.g., various world wide web search engines). On-demand presentations are fairly easy to search because the underlying data of the presentation is already known. In the case of live presentations, however, such underlying data is not known because, as the presentation is live, the underlying data is not yet available.
  • [0005]
    Some systems do exist that allow a user to identify scheduled live presentations. For example, a television programming guide may be available over the Internet that allows a user to search for television programs that are scheduled to be broadcast (e.g., via cable, satellite system, or typical television broadcast frequencies, such as UHF or VHF) and their associated broadcast times. However, such programming guides typically do not provide the flexibility to allow non-scheduled programs to be identified to the user. Furthermore, such programming guides are typically limited to television broadcasts and do not allow users to identify presentations from any of the wide variety of alternate sources (such as via the Internet).
  • [0006]
    The invention described below addresses these disadvantages, providing a way to search for live presentations.
  • SUMMARY OF THE INVENTION
  • [0007]
    In a networked client/server system, live presentations can be streamed from an encoder or other server to a client computer. Additionally, information describing the presentation is registered with a search server. This information is made available for user searches only for as long as the information properly describes the live presentation. When the information no longer describes a current live presentation, the information is no longer available for searching.
  • [0008]
    According to one aspect of the invention, the information describes the entire presentation. The information is available in the search server for user searches for the duration of the presentation. Once the presentation is over, the information is deleted from the search server, preventing any subsequent user search requests from being satisfied using the information describing that presentation.
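The patent does not specify an implementation, but the registration/deletion lifecycle described above can be illustrated with a minimal Python sketch (all class, method, and URL names here are hypothetical, chosen only for illustration):

```python
class SearchServer:
    """Toy in-memory search server for live presentation metadata."""

    def __init__(self):
        # Maps an encoder identifier (e.g., its URL) to the descriptive
        # words registered for its currently live presentation.
        self._content = {}

    def register(self, encoder_url, descriptive_words):
        # Called by an encoder at the start of a live presentation.
        self._content[encoder_url] = {w.lower() for w in descriptive_words}

    def presentation_over(self, encoder_url):
        # Deleting the entry prevents any subsequent search request from
        # being satisfied using the information describing that presentation.
        self._content.pop(encoder_url, None)

    def search(self, criteria):
        # Return identifiers of encoders whose descriptions contain
        # every search term.
        terms = {w.lower() for w in criteria}
        return [url for url, words in self._content.items() if terms <= words]


server = SearchServer()
server.register("http://encoder.example/live1", ["keynote", "streaming"])
while_live = server.search(["streaming"])   # matches while the presentation is live
server.presentation_over("http://encoder.example/live1")
after_end = server.search(["streaming"])    # entry deleted, no match
```

The key property is that searchability is bounded by the presentation's lifetime: the same query succeeds before the "presentation over" indication and fails after it.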
  • [0009]
    According to another aspect of the invention, the information describes a particular characteristic(s) of the presentation (e.g., the current topic). The information for a characteristic is available in the search server for user searches for as long as that characteristic describes the portion of the presentation currently being presented. Once that characteristic no longer describes the portion currently being presented, the information describing that characteristic is deleted from the search server, preventing any subsequent user search requests from being satisfied using the information describing that characteristic.
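The per-characteristic lifecycle works the same way at a finer granularity. A minimal sketch, with hypothetical names, of tracking the current topic of each live stream:

```python
class TopicTracker:
    """Tracks the topic currently being presented on each live stream."""

    def __init__(self):
        self._topics = {}  # encoder identifier -> current topic description

    def topic_begins(self, encoder_url, topic):
        # The topic becomes searchable once it starts describing the stream.
        self._topics[encoder_url] = topic.lower()

    def topic_over(self, encoder_url):
        # A characteristics-over indication: the topic no longer describes
        # the stream, so it is removed from the searchable set.
        self._topics.pop(encoder_url, None)

    def search(self, term):
        term = term.lower()
        return [url for url, topic in self._topics.items() if term in topic]


tracker = TopicTracker()
tracker.topic_begins("http://enc.example/news", "local election results")
during = tracker.search("election")   # topic currently describes the stream
tracker.topic_over("http://enc.example/news")
after = tracker.search("election")    # topic deleted, no match
```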
  • [0010]
    According to another aspect of the invention, a user can register a notification request with the search server. The notification request identifies a set of search criteria as well as a manner in which the user should be notified in the event a live presentation matches the search criteria. The search server continues to compare new information regarding available live presentations to the search criteria. If a match is found, the search server notifies the user in whatever manner the user requested.
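The notification aspect can be sketched as a registry of saved searches that is consulted whenever a new live presentation is registered. This is an illustrative sketch only; the registry name, callback shape, and URLs are assumptions, not part of the patent:

```python
class NotificationRegistry:
    """Toy registry matching new live presentations against saved searches."""

    def __init__(self):
        self._requests = []  # (user, criteria terms, notify callback)

    def register(self, user, criteria, notify):
        # Save the user's search criteria and the manner of notification.
        self._requests.append((user, {w.lower() for w in criteria}, notify))

    def presentation_available(self, encoder_url, descriptive_words):
        # Compare each saved search against the new presentation's words
        # and notify matching users in the manner they requested.
        words = {w.lower() for w in descriptive_words}
        for user, criteria, notify in self._requests:
            if criteria <= words:
                notify(user, encoder_url)


notified = []
registry = NotificationRegistry()
registry.register("alice", ["jazz"], lambda u, url: notified.append((u, url)))
registry.presentation_available("http://enc.example/a", ["live", "jazz", "concert"])
```

Here the callback simply records the match; a real system would deliver the notification via whatever mechanism the user selected (e-mail, pop-up, etc.).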
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0011]
    FIG. 1 shows a client/server network system and environment in accordance with one embodiment of the invention.
  • [0012]
    FIG. 2 shows a general example of a computer that can be used in accordance with the invention.
  • [0013]
    FIG. 3 illustrates an exemplary search server in more detail.
  • [0014]
    FIG. 4 illustrates entries of an exemplary content database in more detail.
  • [0015]
    FIG. 5 shows exemplary steps in a process for allowing searching of live presentations.
  • [0016]
    FIG. 6 shows exemplary steps in a process for allowing searching of current characteristics information in live presentations.
  • DETAILED DESCRIPTION
  • [0017]
    General Network Structure
  • [0018]
    FIG. 1 shows a client/server network system and environment in accordance with one embodiment of the invention. Generally, the system includes multiple (n) network client computers 102, multiple (m) encoders 104, and a search server 106. The computers 102, encoders 104, and server 106 communicate with each other over a data communications network. The communications network in FIG. 1 is a public network 108, such as the Internet. The data communications network might also include local-area networks and/or private wide-area networks, and can include both wired and wireless sections. Client computers 102, encoders 104, and server 106 can communicate with one another via any of a wide variety of known protocols, such as the Hypertext Transfer Protocol (HTTP).
  • [0019]
    Encoders 104 receive live content or presentations in the form of different media streams 110. Encoders 104 can be dedicated media servers, or alternatively other more general-purpose computer systems. These media streams 110 can be individual media streams (e.g., audio, video, graphical, etc.), or alternatively can be composite media streams including two or more of such individual streams. The media streams 110 are provided to encoders on a “live” basis from other data source components through dedicated communications channels or through the Internet itself. Encoders 104 coordinate the streaming of the live content to other components on the network 108 that request the content, such as client computers 102. It is to be appreciated that although the media streams are referred to as being “live”, there may be a delay (e.g., between one second and thirty seconds) between the time of the actual event and the time the media streams reach the encoder(s).
  • [0020]
    There are various standards for streaming media content and composite media streams. “Advanced Streaming Format” (ASF) is an example of such a standard, including both accepted versions of the standard and proposed standards for future adoption. ASF specifies the way in which multimedia content is stored, streamed, and presented by the tools, servers, and clients of various multimedia vendors. ASF provides benefits such as local and network playback, extensible media types, component download, scalable media types, prioritization of streams, multiple language support, environment independence, rich inter-stream relationships, and expandability. Further details about ASF are available from Microsoft Corporation of Redmond, Wash.
  • [0021]
    Encoders 104 can transmit any type of presentation over the network 108. Examples of such presentations include audio/video presentations (e.g., television broadcasts or presentations from a “NetShow™” server (available from Microsoft Corp. of Redmond, Wash.)), video-only presentations, audio-only presentations, graphical or animated presentations, etc.
  • [0022]
    Search server 106 maintains a content database 112, a notification database 114, and a scheduled presentations database 116. In content database 112, server 106 maintains descriptive information regarding the current live content available from encoders 104. A user of a client computer 102 can access search server 106 to search for particular live content. In notification database 114, server 106 maintains information regarding users of client computers 102 that have registered to be notified when particular live content is available. In scheduled presentations database 116, server 106 maintains information regarding future live presentations that have been registered with server 106.
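The three databases described above suggest simple record shapes. The following Python dataclasses are a hypothetical sketch of what one row of each database might hold (field names are assumptions):

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ContentEntry:
    """One row of the content database: a currently live presentation."""
    encoder_url: str
    descriptive_words: List[str]


@dataclass
class NotificationEntry:
    """One row of the notification database: a user's saved search."""
    user: str
    criteria: List[str]
    notify_via: str  # e.g. "email" (hypothetical delivery mechanism)


@dataclass
class ScheduledEntry:
    """One row of the scheduled presentations database."""
    encoder_url: str
    start_time: str  # e.g. an ISO-8601 timestamp for the future presentation
    descriptive_words: List[str]


entry = ContentEntry("http://enc.example/live", ["baseball", "live"])
```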
  • [0023]
    Exemplary Computer Environment
  • [0024]
    In the discussion below, the invention will be described in the general context of computer-executable instructions, such as program modules, being executed by one or more conventional personal computers. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the invention may be practiced with other computer system configurations, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. In a distributed computer environment, program modules may be located in both local and remote memory storage devices.
  • [0025]
    Alternatively, the invention could be implemented in hardware or a combination of hardware, software, and/or firmware. For example, the invention could be implemented in one or more application specific integrated circuits (ASICs).
  • [0026]
    FIG. 2 shows a general example of a computer 142 that can be used in accordance with the invention. Computer 142 is shown as an example of a computer that can perform the functions of any of client computers 102, encoders 104, or server 106 of FIG. 1.
  • [0027]
    Computer 142 includes one or more processors or processing units 144, a system memory 124, and a system bus 148 that couples various system components including the system memory 124 to processors 144.
  • [0028]
    The bus 148 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. The system memory includes read only memory (ROM) 150 and random access memory (RAM) 152. A basic input/output system (BIOS) 154, containing the basic routines that help to transfer information between elements within computer 142, such as during start-up, is stored in ROM 150. Computer 142 further includes a hard disk drive 156 for reading from and writing to a hard disk (not shown), a magnetic disk drive 158 for reading from and writing to a removable magnetic disk 160, and an optical disk drive 162 for reading from or writing to a removable optical disk 164 such as a CD ROM or other optical media. The hard disk drive 156, magnetic disk drive 158, and optical disk drive 162 are connected to the system bus 148 by an SCSI interface 166 or some other appropriate interface. The drives and their associated computer-readable media provide nonvolatile storage of computer readable instructions, data structures, program modules and other data for computer 142. Although the exemplary environment described herein employs a hard disk, a removable magnetic disk 160 and a removable optical disk 164, it should be appreciated by those skilled in the art that other types of computer readable media which can store data that is accessible by a computer, such as magnetic cassettes, flash memory cards, digital video disks, random access memories (RAMs), read only memories (ROMs), and the like, may also be used in the exemplary operating environment.
  • [0029]
    A number of program modules may be stored on the hard disk, magnetic disk 160, optical disk 164, ROM 150, or RAM 152, including an operating system 170, one or more application programs 172, other program modules 174, and program data 176. A user may enter commands and information into computer 142 through input devices such as keyboard 178 and pointing device 180. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are connected to the processing unit 144 through an interface 182 that is coupled to the system bus. A monitor 184 or other type of display device is also connected to the system bus 148 via an interface, such as a video adapter 186. In addition to the monitor, personal computers typically include other peripheral output devices (not shown) such as speakers and printers.
  • [0030]
    Computer 142 operates in a networked environment using logical connections to one or more remote computers, such as a remote computer 188. The remote computer 188 may be another personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to computer 142, although only a memory storage device 190 has been illustrated in FIG. 2. The logical connections depicted in FIG. 2 include a local area network (LAN) 192 and a wide area network (WAN) 194. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet. In the described embodiment of the invention, remote computer 188 executes an Internet Web browser program such as the “Internet Explorer” Web browser manufactured and distributed by Microsoft Corporation of Redmond, Wash.
  • [0031]
    When used in a LAN networking environment, computer 142 is connected to the local network 192 through a network interface or adapter 196. When used in a WAN networking environment, computer 142 typically includes a modem 198 or other means for establishing communications over the wide area network 194, such as the Internet. The modem 198, which may be internal or external, is connected to the system bus 148 via a serial port interface 168. In a networked environment, program modules depicted relative to the personal computer 142, or portions thereof, may be stored in the remote memory storage device. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
  • [0032]
    Generally, the data processors of computer 142 are programmed by means of instructions stored at different times in the various computer-readable storage media of the computer. Programs and operating systems are typically distributed, for example, on floppy disks or CD-ROMs. From there, they are installed or loaded into the secondary memory of a computer. At execution, they are loaded at least partially into the computer's primary electronic memory. The invention described herein includes these and other various types of computer-readable storage media when such media contain instructions or programs for implementing the steps described below in conjunction with a microprocessor or other data processor. The invention also includes the computer itself when programmed according to the methods and techniques described below. Furthermore, certain sub-components of the computer may be programmed to perform the functions and steps described below. The invention includes such sub-components when they are programmed as described. In addition, the invention described herein includes data structures, described below, as embodied on various types of memory media.
  • [0033]
    For purposes of illustration, programs and other executable program components such as the operating system are illustrated herein as discrete blocks, although it is recognized that such programs and components reside at various times in different storage components of the computer, and are executed by the data processor(s) of the computer.
  • [0034]
    Search Server
  • [0035]
    FIG. 3 illustrates an exemplary search server in more detail. Search server 106 includes a query interface 210, a registration interface 212, a search engine 214, and a database controller 216. Client computers 102 (FIG. 1) communicate with search server 106 via query interface 210. Query interface 210 allows users of client computers 102 to enter search criteria for live content. Upon receipt of a search request via query interface 210, search engine 214 accesses content database 112 to search for live content that matches the search criteria.
  • [0036]
    Encoders 104 (FIG. 1) communicate with search server 106 via registration interface 212. Registration interface 212 allows encoders 104 to provide descriptive information regarding the live content that they can stream to client computers. This descriptive information can then be added to content database 112 and used for searches by search engine 214. Descriptive information can be maintained by server 106 for any live content that can be provided by encoders 104.
  • [0037]
    Database controller 216 manages the databases 112, 114, and 116. This management includes both adding entries to and removing entries from databases 112, 114, and 116.
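The relationship among the components of FIG. 3 can be sketched in code. This is a minimal illustrative model, not an implementation from the patent; all class and method names are assumptions, and the databases are reduced to in-memory dictionaries of descriptive text.

```python
# Illustrative sketch of search server 106: query interface, registration
# interface, search engine 214, and database controller 216. Names assumed.

class ContentDatabase:
    """In-memory stand-in for content database 112."""
    def __init__(self):
        self.entries = {}  # presentation id -> descriptive text

class DatabaseController:
    """Adds entries to and removes entries from the database (controller 216)."""
    def __init__(self, content_db):
        self.content_db = content_db

    def add_entry(self, pres_id, entry):
        self.content_db.entries[pres_id] = entry

    def remove_entry(self, pres_id):
        self.content_db.entries.pop(pres_id, None)

class SearchEngine:
    """Matches user search criteria against database entries (engine 214)."""
    def __init__(self, content_db):
        self.content_db = content_db

    def search(self, criteria):
        words = set(criteria.lower().split())
        return [pid for pid, text in self.content_db.entries.items()
                if words & set(text.lower().split())]

class SearchServer:
    """Ties the pieces together: encoders register, clients query."""
    def __init__(self):
        self.content_db = ContentDatabase()
        self.controller = DatabaseController(self.content_db)
        self.engine = SearchEngine(self.content_db)

    def register(self, pres_id, description):   # registration interface 212
        self.controller.add_entry(pres_id, description)

    def query(self, criteria):                  # query interface 210
        return self.engine.search(criteria)
```

For example, after `server.register("p1", "live keynote about streaming")`, a query for "keynote" returns the identifier of that presentation for as long as its entry remains in the database.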
  • [0038]
    Live content or presentations available from encoders 104 can be either pre-scheduled or non-scheduled. Pre-scheduled presentations refer to presentations that have been registered with search server 106 as occurring in the future (e.g., not already in progress and not starting within the next five minutes). Descriptive information regarding pre-scheduled presentations (e.g., presentation title, key words describing the content of the presentation, and encoder(s) from which the presentation will be available) can be provided to server 106 from one of the encoders 104 or some other source, either via the network 108 (FIG. 1) or alternatively some other delivery mechanism (e.g., a magnetic or optical disk).
  • [0039]
    Non-scheduled presentations refer to presentations that have not been registered with search server 106 as occurring in the future. A non-scheduled presentation is registered with server 106 as the presentation is about to begin (e.g., within the next five minutes) or shortly after it has begun. The presentation can be registered by an encoder 104 (e.g., the encoder 104 that will be streaming the live content), or alternatively some other source. As part of the registration process, server 106 is provided with descriptive information regarding the presentation.
  • [0040]
    FIG. 4 illustrates entries of an exemplary content database in more detail. FIG. 4 is described with additional reference to components in FIG. 3. Each entry in content database 112 includes data for one or more of the following fields: title 230, source 232, duration 234, current characteristic(s) 236, and descriptive information 238. Content database 112 can be stored in volatile memory (e.g., RAM), non-volatile memory (e.g., a magnetic disk drive), or a combination thereof.
  • [0041]
    Title field 230 includes a descriptive title of the presentation. Source field 232 identifies the encoder(s) 104 from which the presentation can be obtained. Duration field 234 identifies, for some entries, the duration of the presentation; in other entries, the duration data is not included. Characteristics field 236 optionally identifies the current characteristics for the presentation (i.e., one or more characteristics describing the portion of the presentation currently being presented or about to be presented). Any of a wide variety of characteristics can be included for the presentation, such as the current topic (as illustrated in FIG. 4), the name of the current speaker, the gender of the current speaker, the color of the current speaker's clothing, etc. Descriptive information field 238 provides various descriptive information that describes the content of the presentation.
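The field layout just described maps naturally onto a record type. The sketch below is an assumed shape for one entry (the class name and example values are illustrative, not from the patent); the optional fields mirror the text's statement that duration and current characteristics may be absent.

```python
# Assumed shape of one content-database entry (fields 230-238 of FIG. 4).
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class PresentationEntry:
    title: str                                  # title field 230
    source: List[str]                           # source field 232: encoder(s)
    duration: Optional[str] = None              # duration field 234 (optional)
    characteristics: List[str] = field(default_factory=list)   # field 236
    descriptive_info: List[str] = field(default_factory=list)  # field 238

# Hypothetical example entry.
entry = PresentationEntry(
    title="Shareholder Meeting",
    source=["mms://encoder1.example.com/meeting"],
    duration="2:00 p.m. - 4:00 p.m.",
    characteristics=["current topic: finances"],
    descriptive_info=["shareholder", "annual", "meeting"],
)
```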
  • [0042]
    Content database 112 maintains information identifying each of the currently available live presentations that is registered with server 106. Information regarding pre-scheduled presentations that are not currently available (and not about to be available) from an encoder 104 is maintained in scheduled database 116. Alternatively, such information could be included in content database 112 and simply marked as “invalid” until the presentation is available from an encoder 104.
  • [0043]
    In the illustrated example, each current live presentation has an associated entry in database 112. When a pre-scheduled presentation is about to begin (e.g., it is scheduled to begin shortly, such as within two minutes), database controller 216 (FIG. 3) loads descriptive information corresponding to the presentation into an entry of database 112. In the case of pre-scheduled presentations, this descriptive information is loaded from pre-scheduled presentations database 116. In the case of non-scheduled presentations, this descriptive information is received directly from registration interface 212.
  • [0044]
    When a presentation is over, the entry in database 112 corresponding to the presentation is deleted. Server 106 is provided with an indication of the duration of a live presentation from the encoder or other device that registered the presentation with server 106. In one implementation, this indication of the duration is a time period or “run time” for the presentation. For example, an encoder may indicate that a particular live presentation is going to be available between 2:00 p.m. and 4:00 p.m. on Jan. 1, 2000, or that a particular live presentation is going to last for 45 minutes. Database controller 216 monitors content database 112 for presentations whose time period or “run time” has passed, and deletes the corresponding entries from database 112. According to another implementation, this indication of the duration of the presentation is simply a “presentation over” message or similar indicator. For example, an encoder may register for a current live presentation, and then send a “presentation over” message to the server 106 when the presentation has completed. Upon receipt of the “presentation over” message, database controller 216 deletes the entry corresponding to the presentation from database 112.
  • [0045]
    In one implementation of the invention, database controller 216 also includes a timeout control that monitors the length of time that entries have been in database 112. If a “presentation over” message is not received for a live presentation within a default period of time, then controller 216 assumes that a “presentation over” message was mistakenly not sent (or was lost in transit) and removes the entry corresponding to the live presentation from database 112.
  • [0046]
    Additionally, in another implementation of the invention a presentation that is scheduled for a particular duration (e.g., 90 minutes, or from 2:00 p.m. to 4:00 p.m.) may be extended. The duration can be extended by the encoder 104 (or other device) sending a message to server 106 to change the duration in the corresponding entry of database 112. For example, the message may indicate to change “90 minutes” to “110 minutes”, or to change “2:00 p.m. to 4:00 p.m.” to “2:00 p.m. to 5:00 p.m.”. Alternatively, the duration may be extended by the encoder 104 (or other device) sending a message to server 106 indicating that server 106 is to ignore the previously identified duration and that a “presentation over” message or similar indicator will be transmitted to server 106 when the presentation is over.
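The entry-lifetime rules of the preceding paragraphs (a registered run time that expires, an explicit "presentation over" message, a timeout fallback, and duration extension) can be captured in a small sketch. Times are plain seconds and every name, including the default timeout value, is an assumption for illustration.

```python
# Sketch of entry lifetime in content database 112: run-time expiry,
# "presentation over" messages, the timeout control, and duration extension.
# All names and the DEFAULT_TIMEOUT value are illustrative assumptions.

DEFAULT_TIMEOUT = 8 * 3600  # assumed fallback period for a lost "over" message

class LiveEntry:
    def __init__(self, registered_at, run_time=None):
        self.registered_at = registered_at
        self.run_time = run_time   # None => wait for "presentation over"
        self.over = False

    def presentation_over(self):
        """Explicit 'presentation over' indication from the encoder."""
        self.over = True

    def extend(self, new_run_time):
        """Encoder message changing the registered duration (e.g., 90 -> 110 min)."""
        self.run_time = new_run_time

    def ignore_duration(self):
        """Encoder asks the server to ignore the duration and await 'over'."""
        self.run_time = None

    def expired(self, now):
        if self.over:
            return True
        if self.run_time is not None and now >= self.registered_at + self.run_time:
            return True
        # Timeout control: assume the "presentation over" message was lost.
        return now >= self.registered_at + DEFAULT_TIMEOUT

def purge(entries, now):
    """Database-controller pass deleting entries whose presentations are over."""
    return {pid: e for pid, e in entries.items() if not e.expired(now)}
```

An entry registered with a 90-minute run time expires at the 90-minute mark unless an extension message first changes its run time; an entry with no run time persists until a "presentation over" message arrives or the timeout elapses.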
  • [0047]
    The descriptive information field 238 of an entry includes data that describes the content of the corresponding presentation. In the illustrated example of FIG. 4, the data includes a set of one or more key words describing the presentation. Alternatively, the data could include a summary or abstract of the presentation, or a textual transcript of the presentation.
  • [0048]
    The data for descriptive information field 238 can be generated manually or automatically. Manual generation refers to an individual (e.g., the presentation author) creating the data. For example, the author may write a summary or a list of key words for the presentation and provide them to server 106 (either directly or via an encoder 104).
  • [0049]
    Automatic generation refers to one of the components, such as an encoder 104 or server 106, using any of a variety of mechanisms to generate data describing the presentation as the presentation occurs. For example, conventional key word generation processes may be employed to identify key words from the presentation. This may be carried out by an encoder 104, server 106, or some other component coupled to network 108. By way of another example, closed captioning information may be used as the data, or conventional speech-to-text conversion techniques may be used to convert audio data into text data.
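A minimal version of the automatic key-word generation described above might take a transcript (obtained, for instance, from closed captioning or speech-to-text conversion) and keep the most frequent non-trivial words. The stop-word list, length filter, and function name below are assumptions, not part of the patent.

```python
# Minimal key-word generation from a presentation transcript. The stop-word
# set and thresholds are illustrative assumptions.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are"}

def extract_keywords(transcript, top_n=5):
    """Return up to top_n frequent words, skipping stop words and short words."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS and len(w) > 2)
    return [w for w, _ in counts.most_common(top_n)]
```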
  • [0050]
    The information maintained in content database 112 is used by search engine 214 to respond to search requests received from users of a client computer 102 (FIG. 1). A user provides, as part of his or her search request, a set of search criteria and which fields the search criteria should be applied to. The user can provide search requests via any of a wide variety of conventional input mechanisms, such as a graphical user interface (GUI). In the illustrated example, the user is able to search any of the fields in content database 112. Search engine 214 compares the user-provided search criteria to each entry in the database 112 to determine whether the presentation corresponding to the entry satisfies the search request. Any of a variety of conventional searching algorithms and methodologies can be used. For example, any entry with at least one word matching one of the search criteria may satisfy the search request, an entry may be required to include every word in the search criteria in order to satisfy the search request, etc.
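The two example matching rules mentioned above can be written as predicates over an entry's searchable text. The function names are illustrative; real deployments would typically use a conventional inverted-index search rather than per-entry scans.

```python
# The two example matching rules: "any word matches" vs. "every word matches".
# Function names are assumptions for illustration.

def matches_any(entry_text, criteria_words):
    """Entry satisfies the request if at least one criterion word appears."""
    words = set(entry_text.lower().split())
    return any(c.lower() in words for c in criteria_words)

def matches_all(entry_text, criteria_words):
    """Entry must include every word in the search criteria."""
    words = set(entry_text.lower().split())
    return all(c.lower() in words for c in criteria_words)
```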
  • [0051]
    Information regarding presentations that satisfy a search request is provided to the client computer 102 of the user that placed the request. Such information may be the entire entry from database 112, or alternatively a selected portion (e.g., the title field 230 and source field 232 for the entry). The source field 232 is provided to the client computer to allow the user to subsequently request the presentation, via the client computer, from the appropriate encoder 104. In one implementation, source field 232 includes a uniform resource locator (URL) that identifies a particular presentation available from a particular encoder.
  • [0052]
    Information from each entry that satisfies the search criteria is provided to the user and, if multiple entries satisfy the search criteria, then the user can select one or more presentations based on this information. Alternatively, server 106 may rank the entries based on how well they match the search criteria and return information for only the highest ranking entry (or entries) to the user.
  • [0053]
    In addition to information describing the overall content of the presentation, current “characteristic” information is also (optionally) included in database 112. Characteristic information describing one or more current characteristics of the presentation is registered with search server 106 by the encoder 104. When one or more of the current characteristics changes, the encoder 104 registers the new current characteristic(s) with server 106. Server 106, in response, changes the entry in content database 112 corresponding to the presentation to identify the new current characteristics (e.g., by replacing one or more of the current characteristics or by adding a new characteristic(s)). By continually updating the current characteristics, a user can search for particular characteristics without regard for which actual presentation includes the characteristics. For example, a user may be interested in discussions of Microsoft Corporation and can search for the characteristics “Microsoft” or “Bill Gates” across multiple presentations registered with server 106.
  • [0054]
    A current characteristic has a duration analogous to that of the presentation discussed above. Each characteristic may have its own duration, or multiple characteristics for a presentation may have the same duration. The duration of the characteristics can be identified explicitly (e.g., the author may indicate that Microsoft Corporation will be discussed from 2:07 p.m. to 2:12 p.m., or that the current characteristic of Microsoft Corporation will be accurate for the next seven minutes, or a “characteristic over” indicator (such as a “cancel characteristic” message) may be transmitted to server 106 from encoder 104). Alternatively, the duration of the characteristics can be identified implicitly (e.g., the previous current characteristics are over when new current characteristics information is received).
  • [0055]
    Current characteristics data can also be generated either manually or automatically, analogous to the generation of data for descriptive information field 238 discussed above. For example, an algorithm may use closed captioning data or a speech-to-text conversion algorithm to obtain a textual version of the presentation. Key words can then be identified from the textual version and if their frequency is high enough (e.g., the word “Microsoft” occurs at least a threshold number of times, such as ten, within a period of time, such as sixty seconds or every 500 words), then those key words are identified as the current topic data.
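The frequency rule just described (a word occurring at least a threshold number of times within a recent window of the transcript) can be sketched as a sliding window over the word stream. The parameter values follow the example in the text; the function itself is an assumption.

```python
# Sliding-window frequency test for current-topic candidates: a word becomes
# a candidate when it occurs at least `threshold` times within the last
# `window` words (e.g., ten times within 500 words, per the example above).
from collections import Counter

def current_topics(words, window=500, threshold=10):
    """Return current-topic candidate words from the tail of the transcript."""
    recent = words[-window:]
    counts = Counter(w.lower() for w in recent)
    return sorted(w for w, n in counts.items() if n >= threshold)
```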
  • [0056]
    Current characteristics information can be deleted from database 112 in an immediate manner. That is, as soon as new current characteristics data is received, the previous current characteristics data is deleted. Alternatively, the current characteristic information may be “aged out” of database 112 gradually. For example, if new current characteristics are identified and the key words that caused the identification of the previous current characteristics are not detected within a threshold amount of time (e.g., ten minutes), then the previous current characteristics are deleted from database 112. This aging out can be implemented by server 106, or alternatively can be used by encoder 104 in determining when to transmit a “characteristic over” indicator to server 106.
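The two deletion policies above (immediate replacement versus gradual aging out) can be contrasted in a small sketch. The class, the time representation, and the ten-minute constant (taken from the example) are assumptions for illustration.

```python
# Immediate vs. "aged out" deletion of current characteristics in database 112.
# Names and the time representation (plain seconds) are assumptions.

AGE_OUT = 10 * 60  # ten-minute threshold from the example above

class CharacteristicStore:
    def __init__(self, immediate=True):
        self.immediate = immediate
        self.current = {}  # characteristic -> time its key words were last seen

    def register(self, characteristic, now):
        if self.immediate:
            # New data deletes the previous characteristics at once.
            self.current = {characteristic: now}
        else:
            self.current[characteristic] = now
            # Age out characteristics whose key words stopped appearing.
            self.current = {c: t for c, t in self.current.items()
                            if now - t < AGE_OUT}
```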
  • [0057]
    Thus, using characteristics, it can be seen that the results of a search request can vary depending on when during the presentation the search request is made.
  • [0058]
    Alternatively, the current characteristics for a presentation can be displayed to the user rather than used for searching. For example, a user may submit a search request that results in multiple live presentations with descriptive information 238 satisfying the search criteria. Search server 106 transmits the current characteristics for each of these matching live presentations (as well as other information, such as title 230) to client 102 for display to the user. Search server 106 also transmits any changes in the current characteristics for these matching live presentations to client 102. Thus, client 102 presents to the user a continually updating display of the current characteristics of the live presentations that satisfy his or her search request.
  • [0059]
    Database controller 216 also maintains notification database 114. A user can register a “notification request” with server 106 that includes a search request and a notification type. The search request includes the user's search criteria and the notification type identifies how the user wants to be notified in the event a live presentation begins that matches the search criteria. In one implementation, a user can register an email address, a pager number, a cellular phone (or other telephone) number, etc.
  • [0060]
    Database controller 216 receives the notification request and places the search criteria and notification type in notification database 114. Database controller 216 also invokes search engine 214 to determine whether any current entry in content database 112 satisfies the search criteria. If a match is found, then the user is notified in a manner according to the notification type. The notification request may then be removed from notification database 114, or alternatively left in notification database 114 to detect subsequent matches.
  • [0061]
    If a match is not immediately found, then database controller 216 continues to invoke search engine 214 each time new information is placed in content database 112. Once invoked, search engine 214 determines whether the new information results in an entry that matches any of the search criteria of notification requests in notification database 114. This search may be compared to all entries in content database 112, or alternatively only to the entries in database 112 that include the new information.
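The notification flow of the last three paragraphs can be sketched as a store of (search criteria, notification type) pairs that is consulted each time new information enters the content database. The class and its any-word matching rule are assumptions; a real system would dispatch to email, pager, or telephone gateways rather than return a list.

```python
# Sketch of notification database 114: registered notification requests are
# checked whenever new information is placed in content database 112.
# Names and the returned-list "dispatch" are illustrative assumptions.

class NotificationDatabase:
    def __init__(self):
        self.requests = []  # (criteria_words, notification_type, address)

    def register(self, criteria_words, notification_type, address):
        """Store a user's notification request (search criteria + type)."""
        self.requests.append((criteria_words, notification_type, address))

    def on_new_entry(self, entry_text):
        """Invoked when new information enters the content database.
        Returns the (type, address) notifications that should be sent."""
        words = set(entry_text.lower().split())
        return [(ntype, addr)
                for criteria, ntype, addr in self.requests
                if any(c.lower() in words for c in criteria)]
```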
  • [0062]
    FIG. 5 shows exemplary steps in a process for allowing searching of live presentations. Steps on the left side of dashed line 250 are carried out by an encoder 104 of FIG. 1, and steps on the right side of dashed line 250 are carried out by search server 106 of FIG. 1. These steps may be performed in software. FIG. 5 is described with additional reference to components in FIG. 1.
  • [0063]
    Initially, encoder 104 sends identifying information for a current live presentation to search server 106 (step 252). This identifying information is received by server 106 (step 254), which records the information and makes the information available for user searches (step 256). The identifying information is used by server 106 in responding to any subsequent search requests it receives (step 258).
  • [0064]
    While server 106 is performing steps 254-258, encoder 104 continues to stream the live presentation to any of the client computers 102 that request it until the presentation is over (steps 260 and 262). When the presentation is over, encoder 104 stops streaming the presentation to client computers 102 and sends a “presentation over” indication to server 106 (step 264).
  • [0065]
    Server 106 receives the “presentation over” indication from encoder 104 (step 266) and deletes its record of the identifying information regarding the presentation (step 268). Thus, any subsequent search requests will not be compared to the identifying information for that presentation, as that presentation is over.
  • [0066]
    Alternatively, rather than relying on a “presentation over” indication in step 266, server 106 may be informed of the end of the presentation in other manners (such as a pre-programmed duration).
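The encoder side of the process just described (register the presentation, stream until it is over, then signal completion) can be collapsed into a generator of the messages an encoder would send. The message vocabulary and function name are assumptions, not a protocol defined by the patent.

```python
# Encoder-side message sequence for one live presentation, following the
# steps described above. The message tuples are illustrative assumptions.

def encoder_session(pres_id, description, chunks):
    """Yield the ordered messages an encoder sends for one presentation."""
    yield ("register", pres_id, description)      # identifying information (step 252)
    for chunk in chunks:                          # streaming loop (steps 260, 262)
        yield ("stream", pres_id, chunk)
    yield ("presentation_over", pres_id)          # completion indication (step 264)
```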
  • [0067]
    FIG. 6 shows exemplary steps in a process for allowing searching of current characteristic information in live presentations. Steps on the left side of dashed line 280 are carried out by an encoder 104 of FIG. 1, and steps on the right side of dashed line 280 are carried out by search server 106 of FIG. 1. These steps may be performed in software. FIG. 6 is described with additional reference to components in FIG. 1.
  • [0068]
    Initially encoder 104 sends, to search server 106, current characteristic(s) information for the portion of a live presentation currently being presented (step 282). Search server 106 in turn receives the current characteristic(s) information (step 284). Server 106 records the current characteristic(s) information and makes the information available for searching (step 286). The characteristic(s) information is used by server 106 in responding to any subsequent search requests it receives (step 288).
  • [0069]
    While server 106 is performing steps 284-288, encoder 104 continues to stream the live presentation to any of the client computers 102 that request it (step 290). Encoder 104 also checks whether the current characteristic(s) are over (step 292). When the current characteristic(s) are over (e.g., they no longer describe the portion of the live presentation currently being presented), encoder 104 sends a “characteristic(s) over” indication to server 106 (step 294).
  • [0070]
    Server 106 receives the characteristic(s) over indication from encoder 104 (step 296) and deletes its record of the characteristic(s) information (step 298). Thus, any subsequent search requests will not be compared to the characteristic(s) information for that presentation, as those characteristic(s) are over.
  • CONCLUSION
  • [0071]
    The invention allows for the searching of live presentations. An encoder providing a live presentation registers with a search server, advantageously making information identifying the presentation available for searching only for the duration of the presentation. Additionally, characteristic information identifying current characteristic(s) of the presentation can be registered with the search server only for the duration of that characteristic(s). Thus, the characteristic information is advantageously made available for only as long as that characteristic(s) describes the current portion of the live presentation.
  • [0072]
    Although the invention has been described in language specific to structural features and/or methodological steps, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or steps described. Rather, the specific features and steps are disclosed as preferred forms of implementing the claimed invention.
US6173317 *14 Mar 19979 Jan 2001Microsoft CorporationStreaming and displaying a video stream with synchronized annotations over a computer network
US6173329 *19 Feb 19989 Jan 2001Nippon Telegraph And Telephone CorporationDistributed multimedia server device and distributed multimedia server data access method
US6184878 *23 Dec 19976 Feb 2001Sarnoff CorporationInteractive world wide web access using a set top terminal in a video on demand system
US6184996 *18 Jun 19976 Feb 2001Hewlett-Packard CompanyNetwork printer with remote print queue control procedure
US6201536 *2 Dec 199413 Mar 2001Discovery Communications, Inc.Network manager for cable television system headends
US6204840 *8 Apr 199820 Mar 2001Mgi Software CorporationNon-timeline, non-linear digital multimedia composition method and system
US6215910 *28 Mar 199610 Apr 2001Microsoft CorporationTable-based compression with embedded coding
US6230172 *3 Sep 19998 May 2001Microsoft CorporationProduction of a video stream with synchronized annotations over a computer network
US6230205 *16 Nov 19998 May 2001Mci Communications CorporationMethod and apparatus for managing delivery of multimedia content in a communications system
US6233389 *30 Jul 199815 May 2001Tivo, Inc.Multimedia time warping system
US6263371 *10 Jun 199917 Jul 2001Cacheflow, Inc.Method and apparatus for seaming of streaming content
US6279040 *27 Apr 199921 Aug 2001Industrial Technology Research InstituteScalable architecture for media-on demand servers
US6366914 *7 Aug 19982 Apr 2002Qorvis Media Group, Inc.Audiovisual content distribution system
US6397275 *28 Apr 200028 May 2002Viseon, Inc.Peripheral video conferencing system
US6418557 *8 Feb 19999 Jul 2002Nec CorporationOn-demand system enabling control of power-on/off of on-demand server
US6434621 *31 Mar 199913 Aug 2002Hannaway & AssociatesApparatus and method of using the same for internet and intranet broadcast channel creation and management
US6463462 *2 Feb 19998 Oct 2002Dialogic Communications CorporationAutomated system and method for delivery of messages and processing of message responses
US6810526 *14 Aug 199726 Oct 2004March Networks CorporationCentralized broadcast channel real-time search system
US6928655 *16 Dec 19999 Aug 2005Microsoft CorporationLive presentation searching
US7343614 *2 Apr 199911 Mar 2008Sedna Patent Services, LlcProgram delivery system for VOD
US20030050784 *5 Nov 200213 Mar 2003Hoffberg Mark B.Content-driven speech- or audio-browser
Referenced by
US20110314053 * | Filed: 13 Jun 2010 | Published: 22 Dec 2011 | Konica Minolta Business Technologies, Inc. | Presentation support device and computer readable medium
Classifications
U.S. Classification: 1/1, 707/E17.117, 707/E17.009, 707/999.003
International Classification: G06F17/30
Cooperative Classification: Y10S707/99945, Y10S707/914, Y10S707/916, Y10S707/99933, G06F17/30056, G06F17/30893
European Classification: G06F17/30E4P1, G06F17/30W7L
Legal Events
Date: 15 Jan 2015 | Code: AS | Event: Assignment
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034766/0001
Effective date: 20141014