|Publication number||WO2006014745 A2|
|Publication date||9 Feb 2006|
|Filing date||21 Jul 2005|
|Priority date||22 Jul 2004|
|Also published as||CA2574357A1, CN101027663A, EP1782264A2, US7621814, US20060019751, WO2006014745A3|
|Application number||PCT/US2005/025756|
|Applicant||Scientific Games Royalty Corporation|
MEDIA ENHANCED GAMING SYSTEM
CROSS-REFERENCE TO RELATED APPLICATION  This application claims the benefit of US Provisional Application No. 60/590,255, filed July 22, 2004, the entirety of which is hereby fully incorporated herein by this reference.
BACKGROUND OF THE INVENTION  1. Field of the Invention
 The invention relates generally to gaming and lottery systems. More particularly, the invention relates to systems, processes and controls that allow for the use of modern video and audio compression processes along with high- bandwidth communications circuits to bring media-rich services to the gaming and lottery environment.
 2. Description of the Related Art
 Traditionally, graphics and other media presented to the operators, players, and other persons present at a gaming establishment have been either pre-generated (canned) or message-based content. An example of such a gaming system is a Keno game implemented by a state lottery authority. The graphic content resides on the gaming terminal and is presented through various interfaces. This content is either downloaded from the central data center(s) during off-hours or via background downloads during operational hours. Message-based content is pushed out to the gaming terminals from a centralized console and presented, usually via a dot-matrix type display. The security required to maintain system integrity typically prevents the addition of advanced computer features to the real-time play of the game because of the need to protect the data flow of the game.
 These methods, relatively crude by today's standards, place limits on both the quality of the content and the quantity of unique content that can be presented. These deficiencies cause players to lose interest in the games quickly, which results in lowered sales and/or participation. To attract players, increase their interest, and provide general information, the gaming industry has traditionally relied upon these rudimentary graphics and printed products. What is needed, therefore, is a media-rich method for attracting and informing players of secure game offerings in a real-time environment.
SUMMARY OF THE INVENTION
 The present invention provides an improved gaming system which overcomes some of the deficiencies of the known art. In one embodiment, the system is comprised of several hardware and software components which embody and enable core functionality. It is this core design that integrates known encoding schemes with new software and processes to enable ground-breaking media-rich delivery from a central site to remote gaming venues.
 In one embodiment, the invention is a system for providing media to users at secure remote gaming locations that includes one or more secure gaming terminals located at remote locations on a communication network, with the one or more secure gaming terminals each allowing a user to play and wager in a game of chance. The system includes at least one media server on the communication network that determines the usable media for the one or more secure gaming terminals, such as multimedia, live video, etc. One or more media feeds in the system then selectively feed media to the media server, and the media server selectively distributes the appropriate media content from the one or more media feeds to the one or more secure gaming terminals, preferably during game play. The system can include an assistance server, such as a telephone call center, to help the players and others at the remote terminals.
 In one embodiment, the invention is a method for providing media to users at secure remote gaming locations that includes the steps of hosting a game of chance at the one or more secure gaming terminals located at remote locations on a communication network, with the one or more terminals each allowing a user to play and wager in the game of chance, then feeding media content from one or more media feeds to a media server, with the media server determining the usable media for the one or more secure terminals. The method then includes the step of distributing the appropriate media content from the media server to the one or more secure gaming terminals at least during the game of chance.
 The present invention therefore provides a media-rich environment at the secure gaming terminal that can both attract and inform players of secure game offerings, even in a real-time environment. Such function is advantageous because it increases player interest and can provide a simplified delivery of general information and instruction.
 Other objects, features and advantages of the present application will become apparent after review of the hereinafter set forth Brief Description of the Drawings, Detailed Description of the Invention, and the Claims.
BRIEF DESCRIPTION OF THE DRAWINGS
 Fig. 1 is a schematic illustration of an embodiment of a media gaming system of the invention.
 Fig. 2 is a schematic illustration of an embodiment of a media server of the invention.
 Fig. 3 is an illustration of a video call center for use with the invention.
 Fig. 4 is an illustration of a discrete terminal system for use with the invention.
 Fig. 5 is an illustration of an integrated terminal system for use with the invention.
 Fig. 6 is a flowchart of media server operations.
 Fig. 7 is a flowchart of main conferencing operations.
 Fig. 8 is a flowchart of call center operations.
DETAILED DESCRIPTION OF THE INVENTION
 Referring now to the drawings, in which like reference numbers indicate like parts throughout the several views, and in particular to Fig. 1, the main content delivery system 10 is based upon a Media Server/Sequencer System 12, which is responsible for controlling content type, mix, and delivery. The uniqueness of this core device is found in the software and system interfaces driving its operation. The server will accept various types of media input via industry-standard hardware interfaces such as composite, component, and S-video ports. Additional content is available via encoded media stored locally on a mass storage device 14 or over the communications network 26.
 Standard raw media content is passed through the aforementioned standard hardware ports and encoded using well-known and available encoding algorithms. The various types of media that can be processed by the system could be third-party video feeds 16, computer generated graphics 18, and live broadcast content 20. It is the availability of this real-time media and the ability to deliver this content that differentiates this system from those traditionally used and currently available within the industry.
 Once this media is available, the sequencing and control logic within the server provides a method to distribute the content to the desired gaming devices over the communications network 26. This distribution can entail a single remote device, a group of these devices, or the entire installed base of devices. The specialized software within the Media Server/Sequencer System 12 controls this distribution via standard Internet Protocol (IP) unicast, multicast, and broadcast methods.
 For the far-end gaming terminal locations, two methods of providing media functionality can be utilized. In discrete system locations 28, the existing terminal device 36 is not capable of handling the media content. This could be due to either the terminal being a third-party device or not having the processing power/interfaces to accommodate this feature. In these instances, a separate Media Processor 32 with corresponding media interface devices 34 would be installed to permit delivery of content.
 The integrated method is utilized where the terminal device is controlled by the system licensee and has the ability to handle the media processing tasks. In this scenario, at an integrated system remote location 30, the terminal with integrated media capabilities 38 contains the necessary software and interfaces to provide for the delivery of content. These interfaces handle the connections to the various media interface devices 40.
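The choice among unicast, multicast, and broadcast distribution described above can be sketched as follows. This is purely an illustrative sketch and not part of the specification; the function and all names are hypothetical, and the decision rule (one target, all targets, or a subset) is inferred from the text.

```python
def choose_distribution(targets, installed_base):
    """Pick an IP delivery method for a content push.

    A single remote device gets unicast, the entire installed base gets
    broadcast, and any intermediate group gets multicast.
    """
    if len(targets) == 1:
        return "unicast"
    if set(targets) == set(installed_base):
        return "broadcast"
    return "multicast"

# Example: three terminals installed, pushing to two of them
print(choose_distribution(["t1", "t2"], ["t1", "t2", "t3"]))  # multicast
```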
 Due to their media-rich capabilities, the remote device locations now lend themselves easily to serving as a source of media input. Already containing a method of displaying video and producing audio output, the incorporation of readily available video camera and microphone technology provides the capability for the remote location to send video and audio back to the Media Server/Sequencing System 12. This capability enables video conferencing features that can be utilized by the Call Center Media Controller/Queuing System 22.
 The Call Center Media Controller/Queuing System 22 is designed to function as an add-on system as well as a standalone offering to customers. Designed around the same core processes and functionality of the Media Server/Sequencer System 12, this system provides for real-time video conferencing contact between the remote device locations and a call center/help desk service.
 The Call Center Media Controller/Queuing System 22 receives the encoded media streams from the remote locations through the same functionality that allows it to accept raw media input like its counterpart, the Media Server/Sequencer System 12. How it handles this media differently is a function of additional specialized programming. As in traditional call center telecommunications systems, there are times when all personnel are already assisting callers. This situation is handled by the queuing feature of the system.
 Requests for conferencing sessions from remote locations route to the Call Center Media Controller/Queuing System 22. If there is an available call center technician, the session is routed through to the selected media-enabled workstation 26, where the technician answers the request. This action begins the two-way video conferencing session. If a technician is not available to immediately handle the session, the queuing controls process the session until the situation changes.
 While in queue, the remote location can be controlled to display various informational messages. This entails a display that no technicians are available, the anticipated wait time, and possibly a logo or promotional graphic. Depending on bandwidth availability over the communications network 26, video-based promotional, technical, or informational content could be displayed. This content is pushed to the remote location from the In-queue Media Pool 24, which resides on a storage device within the server or other like device on the network. Once a call center technician becomes available, the remote session is passed through to the corresponding workstation 26.
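The routing-or-queuing behavior described above can be sketched as a simple FIFO structure. This is an illustrative sketch only; the class and method names are hypothetical and the real system's queuing controls are not specified at this level of detail.

```python
from collections import deque

class ConferenceQueue:
    """Route a conference request to a free technician, or hold it in
    first-in, first-out order until one becomes available."""

    def __init__(self, technicians):
        self.free = set(technicians)
        self.waiting = deque()

    def request(self, remote):
        if self.free:
            tech = self.free.pop()
            return ("connected", tech)
        self.waiting.append(remote)
        # queue position stands in for the anticipated-wait display
        return ("queued", len(self.waiting))

    def technician_free(self, tech):
        """Called when a technician ends a session."""
        if self.waiting:
            return ("connected", self.waiting.popleft(), tech)
        self.free.add(tech)
        return None
```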
 Additional functionality is incorporated into this system through more specialized software features. The design features include tracking media and bandwidth capabilities of each individual remote location, real-time bandwidth monitoring of the network, current media sessions, and scheduled media events. These features enable the various functions provided by the system to remain in check and adjust their operation accordingly.
 Due to the design of the communications network tying the remote locations back to where the system is housed, varying bandwidth capabilities may exist across the installation base. In order to account for this very possible design constraint, the per-location bandwidth available should be incorporated into the system so that it may adjust media content.
 Since media capabilities and/or desires may vary by remote location, this fact should also be considered. Certain groups of remote locations may be members of a chain or corporate structure and thereby have unique needs or restrictions for content. There may also exist a need to provide content based upon regional areas. This capability would be very important should the system be utilized to broadcast weather alerts.
 To take these factors into account and act accordingly, both the Media Server/Sequencer System 12 and the Call Center Media Controller/Queuing System 22 maintain a database containing pertinent information. Before establishing a stream or terminating a video conferencing session, respectively, these systems will perform a call to the database to determine the best configuration or capability to carry the session. This database also contains information that provides the system with the configuration of the backbone communications network so that it can adjust system-wide aggregate bandwidth utilization accordingly. When both systems are installed concurrently, one system can be designated to hold the primary database and the other the backup. Changes in information to the primary database are migrated to the backup database by a system process. Each system has the capability to utilize the other's database if corruption or other failure renders its own database unusable.
 Similar to the database redundancy and failover capability, both systems are designed with the ability to be deployed in redundant sets. When this method is employed, either strictly for redundancy or for accommodating large installations of remotes, one system will be designated primary and the others as backup units. Inter-machine processes on each server monitor the status and eligibility of other servers within the group and react accordingly should a failure occur.
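The primary/backup database arrangement with migration and failover can be sketched as below. This is a minimal in-memory illustration, not the patent's implementation; the class name, the dictionary stores, and the `primary_ok` flag are all hypothetical.

```python
class ReplicatedConfigDB:
    """Primary/backup store for per-location configuration.

    Writes land on the primary and are migrated to the backup by a
    'system process' (here, immediately). Reads fall back to the backup
    copy if the primary is flagged as corrupt or failed.
    """

    def __init__(self):
        self.primary = {}
        self.backup = {}
        self.primary_ok = True

    def write(self, location, config):
        self.primary[location] = dict(config)
        self.backup[location] = dict(config)   # migration step

    def read(self, location):
        store = self.primary if self.primary_ok else self.backup
        return store.get(location)
```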
 A media scheduling process is contained within both the Media Server/Sequencer System 12 and the Call Center Media Controller/Queuing System 22. In the former instance, this process controls media content and distribution based upon information contained within a separate scheduling database. In the latter, it provides the ability to push out scheduled notices and informational content such as maintenance downtime and impairment releases. The database utilized is structured to control content distribution based on both time of day and remote location affected. An example using this feature would be the distribution of a corporate announcement at a particular time and only to those locations belonging to that corporate entity.
 The functional components of the Media Server/Sequencer System are shown in Fig. 2. At the heart of the system is the media server engine 50, which is tasked with distributing content based upon control input and automated operational monitoring sub-processes. Content control allows for multiple simultaneous streams of media based upon distribution commands from the server side or on-demand requests from remote terminal locations. Terminal, as used herein, refers to a terminal or a device adapted for gaming use, traditionally defined as a purpose-built unit that accepts and processes wagering transactions and also provides a wagering system interface to the user/operator.
 This content control is provided for by the sequencing and control logic 64 process. Programming enables input from various sources to dictate content distribution. Additional inputs from the media server engine 50 and communications interface 72 provide for monitoring of system and network communications operational parameters. This feedback is an essential component of the system and provides for proper operation and utilization of resources.
 Explained individually, the first input is provided for by the media schedule 68. This component is comprised of a database and an interface process to the sequencing and control logic 64. Entries into this database control the scheduled distribution of content and to which location(s) this content is directed. The data is maintained by interaction via the operator workstation 70. Date and time information as well as content and intended destination(s) is input into the database. At the prescribed moment, the proper content is pushed out to the intended recipient(s).
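The time-of-day and destination-scoped scheduling behavior of the media schedule 68 can be sketched as a simple lookup. This is an illustrative sketch with hypothetical names; the actual database schema is not disclosed.

```python
def due_events(schedule, now, location):
    """Return content due at time `now` ("HH:MM") for `location`.

    Each schedule row is (time, content, locations); a locations value
    of None means the entire installed base is the intended recipient.
    """
    return [content for when, content, locs in schedule
            if when == now and (locs is None or location in locs)]

# Example: a corporate announcement scoped to one chain's terminals
sched = [
    ("09:00", "promo-spot", None),
    ("09:00", "corp-announcement", {"chainA-1", "chainA-2"}),
    ("10:00", "maintenance-notice", None),
]
print(due_events(sched, "09:00", "chainA-1"))
```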
 The second method for controlling the distribution of content is via commands entered directly into the system from the operator workstation 70. Content selection and recipient information is input via the GUI interface and passed to the sequencing and control logic 64 through the media server control interface 66. This latter process handles the human-machine I/O interface requirements and provides a method to adapt and present a standardized interface to the operator.
 Other than providing a universal interface to the communications network media, the communications interface 72 provides feedback to the sequencing and control logic 64 on communications functioning as related to bandwidth utilization and impairments to the communication network 74.
 To make content available for distribution, the media server engine 50 has several sources from which to draw. First is a raw media interface 62 that is the gateway for pre-encoded external real-time media. Another source for pre-encoded media is media storage 52.
 For interfacing with traditional video signals, the media server contains a process dedicated to encoding video signals utilizing well-known compression algorithms. The encoder 54 performs this function. It accepts these traditional signals through industry-standard hardware interface adapters installed in the server. Media sources can consist of third-party feeds 56, computer generated graphics 58 input, and live media 60 such as from a broadcast studio. Besides providing real-time content sources, additional processes provide the ability to take these encoded inputs and buffer and/or store them to media storage 52 for later delivery.
 Designed around the same core concept as the Media Server/Sequencer System is the Call Center Media Controller/Queuing System, detailed next and illustrated in Fig. 3, which illustrates video call center detail. Being such, these two systems share many of the same components and logic. Because of the modular architecture of the systems, they are designed to allow deployment individually or as an integrated solution.
 Once again, the media server engine 80 is responsible for controlling the flow of media streams to and from the system. Unique to this system is that it is designed to handle the routing of real-time two-way video conferencing traffic. This capability is provided for by the sequencing and control logic 86 process, which listens for conferencing requests from stations, queues and routes these requests, and also oversees established conferences by way of a monitoring process through the communications interface 94.
 The feedback received via the communications interface 94 allows the sequencing and control logic 86 to monitor communications network 96 utilization and adjust the operation of the system to prevent degradation to other activities that rely on the network.
 System operation is controlled and monitored via the video call center control interface 88 from the master workstation 90. The design of the system allows for the master workstation 90 to be physically connected to the system or located elsewhere on the network. When the workstation is located on the network, no specialized client software is required, and this allows control of the system to be easily relocated to another workstation, as a shift change at the call center may dictate.
 The video call center control interface 88 maintains a database of the media capabilities and other operational constraints for each remote location and the call center workstations 92. In the case of remotes, limitations in the communication network 96 may reduce or preclude the capability for video conferencing, and the system must tailor operation accordingly. For the call center workstations, the system must know which workstations are staffed, in conference, and available for service. Additionally, the video call center control interface 88 tracks in-progress conferences to calculate hold times for queued conference requests. This conference volume and hold time information is displayed on each call center workstation 92 and the master workstation 90, and can be pushed down to queued remotes.
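One crude way to calculate a hold time from in-progress conferences, as described above, is to estimate when sessions will free up based on an average session length. The patent does not disclose the actual formula; this heuristic, the function, and its parameters are entirely hypothetical.

```python
def estimated_hold_time(active_elapsed, avg_session, queued_ahead):
    """Estimate hold time (same time units throughout).

    active_elapsed: elapsed time of each in-progress conference.
    avg_session:    assumed average conference duration.
    queued_ahead:   number of requests ahead of this one in the queue.

    Returns the estimated time until a slot frees for this request.
    """
    remaining = sorted(max(0, avg_session - e) for e in active_elapsed)
    if not remaining:
        return 0  # a technician slot is already open
    idx = min(queued_ahead, len(remaining) - 1)
    return remaining[idx]
```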
 Similar to modern voice-only call center software, the system provides the capability to determine the source of conference requests and perform a lookup within a database of location information. Basic details of in-process and queued conference sessions are displayed on each call center workstation 92 and the master workstation 90. The availability of this information alerts supervisors and technicians to session volume and location detail, which allows them to recognize common denominators amongst the sessions that may indicate problems in the associated gaming system. When flagged for assignment of a new conference session, remote location detail and history information is displayed on the call center workstation 92 to enhance service and reduce conference times. This last function is very similar to the Computer Telephony Interface (CTI) utilized in standard call center software.
 The video call center control interface 88 can either utilize its integrated database for remote location detail or interface to an external database via standard Structured Query Language (SQL) calls. This capability allows for tight integration with an existing gaming system database and precludes the requirement to duplicate location information and associated updates across multiple independent databases.
 In order to enhance system functionality, it incorporates a capability to push notices and other informational messages out to remote locations, either preconceived or real-time. This delivery is controlled via the master workstation 90 and pulls content from media storage 82, the raw media interface 84, or via the communications network 96. The system is also designed to permit call center workstations 92 to place conference requests to remote locations. This feature allows technicians to proactively contact remote locations, perform follow-up/courtesy calls, and establishes a basis to enable telemarketing functions with the system.
 To enable this media capability at remote locations, two methods can be utilized. Depending upon circumstances, on a per-remote location basis, either an integrated or discrete media processing system can be installed. The first method discussed will be that of a discrete configuration as detailed and referenced in Fig. 4, which illustrates a discrete terminal system.
 The discrete method is utilized primarily when the existing remote device either cannot be touched or is incapable of providing the required hardware and software integration. In this instance, a separate processor unit is installed and handles all media-related activities. This method could also be used to provide stand-alone media capabilities within a gaming establishment where media capabilities on a per-terminal basis are either not required or not desired.
 At the core of the discrete terminal system is the media engine 100, which directly controls and processes the various media streams traversing the unit. Under command from the sequencing and control logic 122, the media engine 100 may establish, route, terminate, and otherwise control content flow. Content may be processed either across the communications network 130 via the communications interface 128, from local media storage 102, or from local external sources.
 In the case of external sources, basic video conferencing media capability is provided for by means of a camera 108 and monitor 110 through the video interface 106, and also a speaker 114 and microphone 116 via the audio interface 112. The external monitor 110 and speaker 114 would be utilized in the case of pushed or streamed media to the remote location. Also available is an external interface 118 which provides a means to establish connectivity to external audio/video devices 120. This external interface 118 allows connection to an existing or otherwise available media distribution system that may exist within the remote location. The signals traversing these various interfaces are processed by the encoder/decoder 104 module utilizing well-known compression/decompression (codec) algorithms.
 The sequencing and control logic 122 also monitors real-time communications properties via a hook into the communications interface 128. This allows the sequencing and control logic 122 to be aware of communications network 130 utilization, current media sessions, and pending media requests. Video conferencing and on-demand media control is primarily handled by the sequencing and control logic 122 through user commands entered via integrated keyboard or touch-screen methods. To allow for interfacing with existing external systems 126, an adaptive machine interface 124 provides a common-ground capability. The media system may need to interface with traditional Point of Sale (POS) or other terminal devices.
Programming contained within the adaptive machine interface 124 allows the system to accept and provide information to external systems 126 through a separate software module. This module can be modified to present a standard interface to the systems on both sides of the interface without necessarily requiring unique modifications to the systems themselves. The result is a highly adaptable system that is capable of enabling rich media functions integrated with basic and/or legacy terminal devices.
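The adaptive machine interface's role, presenting a standard interface on both sides without modifying either system, is essentially the adapter pattern, sketched below. The class names, the `display` method, and the `SHOW` command string are all made up for illustration; the patent does not specify any such API.

```python
class StandardMediaInterface:
    """Hypothetical standard interface the media core expects."""
    def display(self, stream_id):
        raise NotImplementedError

class LegacyPosAdapter(StandardMediaInterface):
    """Adapter module: translates the standard call into a (made-up)
    legacy POS terminal command, leaving both sides unmodified."""

    def __init__(self, send):
        # `send` is whatever function talks to the legacy device
        self.send = send

    def display(self, stream_id):
        return self.send("SHOW %s" % stream_id)

# Usage: swap the adapter, not the media core or the legacy terminal
sent = []
adapter = LegacyPosAdapter(lambda cmd: sent.append(cmd) or cmd)
adapter.display("stream-42")
```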
 The integrated terminal system, as diagrammed in Fig. 5, is utilized in instances where the remote terminal or system has the capability to accommodate the required hardware interfaces and software modules. The components and design of this integrated system are not much different from those of the discrete implementation (Fig. 4) and vary only in the means by which it interfaces with the pre-existing terminal application.
 Once again, at the core of the integrated terminal system is the media engine 100 which directly controls and processes various media streams traversing the unit. Under command from the sequencing and control logic 122, the media engine 100 may establish, route, terminate, and otherwise control content flow. Content may be processed either across the communications network 126 via the communications interface 124, from local media storage 102, or from local external sources.
 In the case of external sources, basic video conferencing media capability is provided once again by means of a camera 108 and monitor 110 through the video interface 106 and also a speaker 114 and microphone 116 via the audio interface 112. The external monitor 110 and speaker 114 would be utilized in the case of pushed or streamed media to the remote location. Also available is an external interface 118 which provides a means to establish connectivity to external audio/video devices 120. This external interface 118 allows connection to an existing or otherwise available media distribution system that may exist within the remote location.
 The sequencing and control logic 122 also monitors real-time communications properties via a hook into the communications interface 124. This allows the sequencing and control logic 122 to be aware of communications network 126 utilization, current media sessions, and pending media requests. Video conferencing and on-demand media control is primarily handled by the sequencing and control logic 122 through user commands entered via integrated keyboard or touch-screen methods. To allow for interfacing with existing external systems 132, like that of the discrete terminal system, an adaptive machine interface 120 provides a common-ground capability. The media system may need to interface with traditional Point of Sale (POS) or other terminal devices and this capability provides that functionality.
Programming contained within the adaptive machine interface 130 allows the system to accept and provide information to external systems 132 through a separate software module. This module can be modified to present a standard interface to the systems on both sides of the interface without necessarily requiring unique modifications to the systems themselves. The result is a highly adaptable system that is capable of enabling rich media functions integrated with basic and/or legacy terminal devices.
 Likewise, the terminal application interface 128 allows this same functionality and ease of adaptability to take place with the pre-existing terminal application. In some instances, the licensee will be installing the system on a third-party terminal device that is up to the task of handling the required media content and control. The terminal application interface 128 allows programming a discrete interface software module to allow for seamless interaction without requiring code changes to either the host application or the media system core. In the case that the licensee installs the system on their own terminal device, the terminal application interface 128 can be written to provide a standard interface to the application software. In many instances, when a vendor offers multiple models of terminal devices, they will provide standard interface specifications to external applications. The capability of this system to do likewise allows for portability of the media system across their compatible product line.
 From an end-to-end viewpoint, the two systems described herein function along the same basic principles. However, the following text and diagrams will detail the overall interaction between the centralized server systems and remote terminal devices independently due to the distinct properties of each. The flow of processes within the Media Server/Sequencer System is detailed as shown in Fig. 6.
 Media can be streamed to remotes utilizing several methods. The first is manually via the operator workstation 140, the next is with a prompt from the schedule 144, and lastly, from an on-demand request from a remote terminal 142. Prompts for these media triggers are validated for conflicts related to the timing of the media session with sessions either imminent or already in progress that may be of higher priority, as shown by decision 148. If there is a conflict, the system will adjust according to schedule and notify the operator via the workstation 140 interface.
 If a conflict does not exist, the sequencing and control process 150 queries the database for the remote's capability 152 to ensure that it is indeed capable of receiving the media feed. If the remote is flagged in the database as having a bandwidth limitation, the media feed is checked to see if it can be scaled back to fit within the available bandwidth. If the feed is valid (decision 154), the sequencing and control process 156 checks that this bandwidth (decision 160) is available on the communications network by interfacing with the communications interface monitoring process 158.
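The scale-back check described above can be sketched as follows. The function, its parameters, and the kilobit-per-second units are illustrative assumptions; the patent does not state how a feed is scaled or what the minimum usable rate is.

```python
def fit_feed(feed_kbps, remote_limit_kbps, min_kbps):
    """Decide whether a media feed fits a remote's bandwidth cap.

    Returns the usable rate: the requested rate if it fits, the cap if
    the feed can be scaled back to it, or None if even the minimum
    acceptable rate exceeds the cap (feed is invalid for this remote).
    """
    if remote_limit_kbps is None or feed_kbps <= remote_limit_kbps:
        return feed_kbps                  # fits as requested
    if remote_limit_kbps >= min_kbps:
        return remote_limit_kbps          # scaled back to the cap
    return None                           # cannot be scaled to fit
```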
 With bandwidth available, the sequencing and control process 162 sends a command to the media server engine 170 to start the proper media feed. It also informs the communications interface monitor process 158 that the media feed request has been placed. The sequencing and control process continues to monitor 164 the status and bandwidth 166 of the feed via interfaces with the communications interface monitor process 158 and the media server engine 170. If bandwidth must be reduced or the feed must be stopped, the sequencing and control process 168 sends the appropriate commands to the media server engine 170.
 As part of the command to the media server engine 170 to start the feed, a direction is included as to what media and/or source is to be utilized to supply the given feed. The media server engine 170 selects the proper input from either third party media 172, computer generated media 174, live media 176, or media storage 178. If the media is not available (decision 180), the media server engine 170 notifies the sequencing and control monitor process 164, and the error is displayed on the operator workstation 140.
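The input selection among sources 172–178 is essentially a keyed lookup with an error path back to the monitor process. A sketch, with hypothetical source names (the specification numbers the sources but does not name keys):

```python
def select_input(source_kind, inputs):
    """Pick the input named in the start-feed command (sources 172-178).
    Returns (input, None) on success, or (None, error) so the engine can
    report 'media not available' (decision 180) up to the operator."""
    src = inputs.get(source_kind)
    if src is None:
        return None, f"media not available: {source_kind!r}"
    return src, None
```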
 If the media is available, the media server engine 182 streams the video to the specified remote(s) via the communications interface 184. The media server engine 182 constantly listens for commands to end or otherwise terminate (step 188) the feed. Once the feed has ended or is terminated (decision 186), the media server engine 182 informs the sequencing and control monitor process 164.
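The streaming loop with its non-blocking listen for terminate commands (decision 186 / step 188) can be sketched with a command queue. Chunk and command representations here are assumptions for illustration:

```python
import queue

def stream_feed(chunks, commands):
    """Send chunks until the feed ends on its own or a terminate command
    arrives (decision 186 / step 188). The returned outcome stands in for
    informing the sequencing and control monitor process."""
    sent = 0
    for _chunk in chunks:
        try:
            cmd = commands.get_nowait()   # poll for commands without blocking
        except queue.Empty:
            cmd = None
        if cmd == "terminate":
            return "terminated", sent
        sent += 1                         # chunk delivered to the remote(s)
    return "ended", sent
```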
 The process flow for the setup and teardown of video conferencing sessions pertaining to the call center media controller/queuing system is detailed in Fig. 7. The sequencing and control monitor process 204 continually monitors sessions and network utilization via the communications interface monitor process 206. It also utilizes the communications interface to listen for conference requests 208 from call center workstations 210 and remote terminals 212. Continuous control and monitoring is available to the master workstation 200 via the video call center control interface 202.
 Because the call center workstations 210 are all capable of full conference features, the sequencing and control process 214 checks the remote's capability 216 via a database query. If the request is not valid (decision 218), sequencing and control 214 handles the issue and sends a notice to the master workstation 200. If the request is valid, the sequencing and control process 214 next checks whether the destination is available (decision 220).
 If the destination is not available, the sequencing and control process 222 queues the request, notes the situation, and sends a request to the media server engine 224 to stream a hold-time message to the destination 228 via the communications interface 226. If the destination is available at decision 220, the sequencing and control process 230 continues to process the connection.

 The sequencing and control process 230 checks whether bandwidth is available (decision 234) for the conference through the communications interface monitor process 232. If not, it notifies the initiator (if a call center workstation 210) that there is a bandwidth conflict and offers the option to queue the call or drop the request. If the initiator is a remote terminal 212, the sequencing and control process 230 sends a message advising of a busy status and queues the request.
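The hold-queue behavior of process 222 is a first-in, first-out queue with a side effect of triggering the hold-time message. A minimal sketch (class and method names are hypothetical):

```python
from collections import deque

class ConferenceQueue:
    """Sketch of process 222: callers to a busy destination are queued,
    and a hold-time message is streamed to them until the destination
    becomes available."""
    def __init__(self):
        self._waiting = deque()

    def request(self, caller, destination_busy):
        if destination_busy:
            self._waiting.append(caller)
            return "hold"       # caller receives the hold-time message stream
        return "connect"        # proceed to conference setup

    def next_caller(self):
        """Pop the longest-waiting caller once the destination frees up."""
        return self._waiting.popleft() if self._waiting else None
```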
 With bandwidth available (decision 234), the sequencing and control process 236 brokers the call between the call center workstation 242 and the remote terminal 244 via the communications interface 238 and communications network 240. The communications interface setup conference process 238 is where the proper setup commands and addressing are specified to the conferencing endpoints. The communications interface monitor process 246 continuously monitors the conference for activity (decision 248) and bandwidth availability (decision 252). If the call is not active at decision 248, the sequencing and control process tears down any remaining conference components (step 250). If bandwidth is a problem at decision 252, the sequencing and control process 254 throttles the bandwidth of the conference accordingly.

 Once a conference is in session, the call center operator may want to stream media to the remote, such as a help video or another means of assisting the remote conference caller. This process flow is depicted and annotated in the flowchart of Fig. 8.
 The sequencing and control monitoring process 262 actively handles a conference in session 260 and is aware of media and other traffic on the communications network through the communications interface monitor process 264. A media push request is received from a call center workstation 268 through the communications interface 266. The first step is for the sequencing and control process 270 to perform a remote capability query 272 against the database. This allows the system to validate (decision 274) the remote device's ability to handle the required media stream.

 Through the communications interface monitor process 278, the sequencing and control process 276 then checks for bandwidth capacity (decision 280) on the network. If bandwidth is not available at that time, the call center workstation 268 is notified of the situation and offered the opportunity to wait, to cancel, or to push the media to the remote terminal in a near-real-time fashion. In the latter instance, the media feed is pushed out to the remote as bandwidth permits and is buffered on the remote's storage device.
 If bandwidth is available, the sequencing and control process 282 sends a command to the media server engine 284 to send the media stream to the remote terminal 288 via the communications interface 286. The sequencing and control process 282 continues to monitor the feed through the communications interface 286. If bandwidth continues to be available (decision 290), the feed continues unchanged. If bandwidth utilization on the communications network changes and cannot continue to support the media feed at the current rate, the sequencing and control process 292 throttles down the rate and/or buffers the media stream on the remote terminal 288 to minimize the bandwidth impact.
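The throttle-or-buffer decision at 290/292 can be sketched as follows, under the same assumed kbps representation as above; whether a remote can buffer would come from the capability query, and the choice between throttling and buffering is an illustrative policy, not one the specification prescribes:

```python
def adjust_push(available_kbps, feed_kbps, remote_can_buffer):
    """Decision 290 sketch: continue the feed at full rate, fall back to a
    buffered near-real-time push on the remote's storage, or throttle the
    live rate down to what the network can support."""
    if available_kbps >= feed_kbps:
        return ("continue", feed_kbps)
    if remote_can_buffer:
        return ("buffer", available_kbps)    # push as bandwidth permits, buffer on remote
    return ("throttle", available_kbps)      # reduce the live rate
```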
 Although several preferred embodiments of the invention have been disclosed in the foregoing specification, it is understood that many modifications and other embodiments of the invention will come to mind to those skilled in the art to which the invention pertains, having the benefit of the teachings presented in the foregoing description and associated drawings. It is thus understood that the invention is not limited to the specific embodiments disclosed herein, and that many modifications and other embodiments of the invention are intended to be included within the scope of the appended claims. Moreover, although specific terms are employed herein, as well as in the claims, they are used in a generic and descriptive sense only, and not for the purpose of limiting the described invention or the claims which follow below.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5971271 *||24 Jun 1997||26 Oct 1999||Mirage Resorts, Incorporated||Gaming device communications and service system|
|US20030216185 *||6 Jan 2003||20 Nov 2003||Varley John A.||Method and system for providing an environment for the delivery of interactive gaming services|
|US20040106454 *||4 Sep 2003||3 Jun 2004||Walker Jay S.||Method and apparatus for providing a complimentary service to a player|
|US20050113173 *||15 Sep 2004||26 May 2005||Waters David B.||System and method for enhancing amusement machines|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US9501907||6 Dec 2013||22 Nov 2016||Patent Investment & Licensing Company||Method and apparatus for generating a virtual win|
|US9619973||15 Jan 2015||11 Apr 2017||Patent Investment & Licensing Company||Outcome determination method for gaming device|
|US9633528||15 Jan 2016||25 Apr 2017||Patent Investment & Licensing Company||Method for configuring casino operations|
|US9659429||5 Oct 2015||23 May 2017||Patent Investment & Licensing Company||Gaming device having advance game information analyzer|
|US9666015||31 Jan 2012||30 May 2017||Patent Investment & Licensing Company||Generating a score related to play on gaming devices|
|US9728043||29 Dec 2010||8 Aug 2017||Patent Investment & Licensing Company||Means for enhancing game play of gaming device|
|Cooperative Classification||G07F17/3223, G07F17/3227, G07F17/323, G07F17/3288, G07F17/32|
|European Classification||G07F17/32, G07F17/32P2, G07F17/32C6, G07F17/32E2, G07F17/32E4|
|9 Feb 2006||AK||Designated states|
Kind code of ref document: A2
Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW
|9 Feb 2006||AL||Designated countries for regional patents|
Kind code of ref document: A2
Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG
|9 Aug 2006||121||Ep: the epo has been informed by wipo that ep was designated in this application|
|19 Jan 2007||WWE||Wipo information: entry into national phase|
Ref document number: MX/a/2007/000802
Country of ref document: MX
Ref document number: 2574357
Country of ref document: CA
Ref document number: 2005269640
Country of ref document: AU
|22 Jan 2007||WWE||Wipo information: entry into national phase|
Ref document number: 2005773586
Country of ref document: EP
Ref document number: 2007522705
Country of ref document: JP
Ref document number: 1020077001546
Country of ref document: KR
|23 Jan 2007||NENP||Non-entry into the national phase in:|
Ref country code: DE
|15 Feb 2007||WWP||Wipo information: published in national office|
Ref document number: 2005269640
Country of ref document: AU
|15 Feb 2007||ENP||Entry into the national phase in:|
Ref document number: 2005269640
Country of ref document: AU
Date of ref document: 20050721
Kind code of ref document: A
|19 Mar 2007||WWE||Wipo information: entry into national phase|
Ref document number: 200580031476.0
Country of ref document: CN
|9 May 2007||WWP||Wipo information: published in national office|
Ref document number: 2005773586
Country of ref document: EP
|6 May 2008||ENP||Entry into the national phase in:|
Ref document number: PI0513574
Country of ref document: BR