WO2012092706A1 - Hybrid operating system media integration - Google Patents

Hybrid operating system media integration

Info

Publication number
WO2012092706A1
Authority
WO
WIPO (PCT)
Prior art keywords
media
runtime environment
data
hardware
service
Application number
PCT/CN2011/070018
Other languages
French (fr)
Inventor
Liang Zhao
Bo Huang
Hai-Tao Lin
Lin-lin YU
Original Assignee
Motorola Mobility, Inc.
Application filed by Motorola Mobility, Inc.
Priority to PCT/CN2011/070018
Publication of WO2012092706A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/455 - Emulation; Interpretation; Software simulation, e.g. virtualisation or emulation of application or operating system execution engines

Definitions

  • FIG. 5 illustrates various components of an example electronic device 500 that can be implemented as any device described with reference to any of the previous FIGs. 1-4.
  • the electronic device may be implemented as any one or combination of a fixed or mobile device, in any form of a consumer, computer, portable, user, communication, phone, navigation, gaming, media playback, and/or electronic device.
  • the electronic device 500 includes communication transceivers 502 that enable wired and/or wireless communication of device data 504, such as received data, data that is being received, data scheduled for broadcast, data packets of the data, etc.
  • Example transceivers include wireless personal area network (WPAN) radios compliant with various IEEE 802.15 (sometimes referred to as Bluetooth™) standards, wireless local area network (WLAN) radios compliant with any of the various IEEE 802.11 (sometimes referred to as WiFi™) standards, wireless wide area network (WWAN) radios for cellular telephony, wireless metropolitan area network (WMAN) radios compliant with various IEEE 802.16 (sometimes referred to as WiMAX™) standards, and wired local area network (LAN) Ethernet transceivers.
  • the electronic device 500 may also include one or more data input ports 506 via which any type of user data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source.
  • the data input ports may include USB ports, coaxial cable ports, and other serial or parallel connectors (including internal connectors) for flash memory, DVDs, CDs, and the like. These data input ports may be used to couple the electronic device 500 to components, peripherals, or accessories such as microphones or cameras.
  • the electronic device 500 includes one or more processors 508 (e.g., any of microprocessors, controllers, and the like), which process computer-executable instructions to control operation of the device.
  • the electronic device can be implemented with any one or combination of software, hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits, which are generally identified at 510.
  • the electronic device can include a system bus or data transfer system that couples the various components within the device.
  • a system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
  • the electronic device 500 also includes one or more memory devices 512 that enable data storage, examples of which include random access memory (RAM), non-volatile memory (e.g., read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device.
  • a disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable disc, any type of a digital versatile disc (DVD), and the like.
  • the electronic device may also include a mass storage media device.
  • a memory device 512 provides data storage mechanisms to store the device data 504, other types of information and/or data, and various device applications 514 (e.g., software applications).
  • an operating system 516 can be maintained as software instructions within a memory device and executed on processors 508.
  • the device applications may also include a device manager, such as any form of a control application, software application, signal-processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on.
  • the operating system 516 can establish one or more runtime environments 518 as described herein.
  • a first runtime environment includes a media control service 520, and a second runtime environment includes a data routing service 522.
  • the electronic device 500 also includes an audio and/or video processing system 524 that generates audio data for an audio system 526 and/or generates display data for a display system 528.
  • the audio system and/or the display system may include any devices that process, display, and/or otherwise render audio, video, display, and/or image data.
  • Display data and audio signals can be communicated to an audio component and/or to a display component via an RF (radio frequency) link, S-video link, HDMI (high-definition multimedia interface), composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link, such as media data port 530.
  • in some implementations, the audio system and/or the display system are external components of the electronic device; in other implementations, they are integrated components of the example electronic device, such as an integrated touch-screen.
  • although hybrid operating system media integration has been described in language specific to features and/or methods, the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations of hybrid operating system media integration.

Abstract

In embodiments of hybrid operating system media integration, a media control service in a first runtime environment manages the distribution of a media data file from the first runtime environment to media hardware that renders the media data. A data routing service in a second runtime environment routes an additional media data file and hardware control data from the second runtime environment to the media control service in the first runtime environment. The media control service further manages distribution of the additional media data file to the media hardware.

Description

HYBRID OPERATING SYSTEM MEDIA INTEGRATION
BACKGROUND
[0001] Mobile devices, such as portable computers and mobile phones, may be implemented with a hybrid operating system that includes multiple runtime environments operating with one operating system kernel. Unlike a virtual machine that emulates a computing device within another controlling operating system, the runtime environments of a hybrid operating system can each access and control device hardware. For example, in a Linux™ hybrid operating system, an Android™ application in a first runtime environment and an Ubuntu™ application in a second runtime environment may both access and control audio hardware with independent hardware control signals via the Linux Kernel. This may result in an audio conflict, such as in a mobile phone when audio for an incoming phone call in the Android environment overlaps or conflicts with audio for a multimedia video file in the Ubuntu environment.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Embodiments of hybrid operating system media integration are described with reference to the following drawings. The same numbers are used throughout the drawings to reference like features and components:
FIG. 1 illustrates an example system in which embodiments of hybrid operating system media integration can be implemented.
FIG. 2 further illustrates components of the example system in embodiments of hybrid operating system media integration.
FIG. 3 illustrates an example electronic device in which embodiments of hybrid operating system media integration can be implemented.
FIG. 4 illustrates example method(s) of hybrid operating system media integration in accordance with one or more embodiments.
FIG. 5 illustrates various components of an example device that can implement embodiments of hybrid operating system media integration.
DETAILED DESCRIPTION
[0003] In embodiments of hybrid operating system media integration, an electronic device can be implemented with a hybrid operating system, such as a GNU/Linux operating system that includes a Linux Kernel, an Android™ runtime environment, and a separate GNU/Linux runtime environment. The operating system kernel communicates and/or transfers data between the runtime environments and the memory, processors, and media hardware of the device. A media control service in the Android runtime environment manages and controls the distribution of media data, such as audio data and/or video data, from applications in the Android runtime environment to media hardware that renders the media data. The media hardware may include speakers to render audio and/or a display component (or monitor) to display video. A data routing service in the GNU/Linux runtime environment routes additional media data files and hardware control data from the GNU/Linux runtime environment to the media control service in the Android runtime environment. The media control service in the Android runtime environment then manages and controls the distribution of the additional media data to the media hardware.
[0004] In an example, an Ubuntu MP3 player (e.g., an application in the GNU/Linux environment) may be playing an audio file at a background audio level. If the GNU/Linux runtime environment has access to control the audio hardware in an electronic device (e.g., a mobile phone) as described above in the Background section, then an audio controller in the GNU/Linux runtime environment can send control signals to turn a device speaker on and set the volume to the background audio level to play back the audio file. When an incoming phone call is received by a call application (e.g., an application in the Android runtime environment), a user of the device may initiate turning off the Ubuntu MP3 player to take the phone call. However, turning off the Ubuntu MP3 player may trigger a control signal to turn the device speaker off, while the Android call application attempts to initiate turning the device speaker on and setting a particular volume for an audible phone call ring alert. The embodiments of hybrid operating system media integration, as described herein, preempt this type of audio conflict as well as other media hardware conflicts in which two or more software applications in different runtime environments attempt to control the media hardware in an electronic device to render or play back media data.
[0005] While features and concepts of hybrid operating system media integration can be implemented in any number of different devices, systems, environments, and/or configurations, embodiments of hybrid operating system media integration are described in the context of the following example devices, systems, and methods.
[0006] FIG. 1 illustrates an example system 100 in which embodiments of hybrid operating system media integration can be implemented. The example system 100 may be implemented in any type of fixed or mobile device, in a variety of mobile or non-mobile environments. For example, a dual runtime environment may be implemented as a gaming and computing device, as an electronic book (eBook) and television media device, as a mobile phone and computing device, or with any other set of two or more runtime environments. Additionally, the example system can be implemented as an electronic device with any combination of differing components as further described with reference to the example electronic device shown in FIG. 5.
[0007] The example system 100 may be generally described with reference to layers of abstraction, such as in an electronic and/or computing device. The system includes media hardware 110 that renders and/or captures media data, such as audio data and/or video data. The media hardware may include an audio speaker that renders audio when input with audio data, and may include a display component that displays video when input with video data. The media hardware can include various audio speakers, such as a private speaker, a hands-free speaker, a wireless earpiece speaker with a Bluetooth transceiver, a headphone jack, and the like. The media hardware may also include a microphone for capturing audio signals, a camera for capturing video or still picture signals, vibration devices for capturing vibrational signals, haptic feedback devices for generating vibrations when input with haptic data, visual outputs, and/or any other media receiving and/or rendering device.
[0008] In this example, the system also includes memory 112 and shared memory 114, each implemented to store or otherwise maintain various data. For example, the shared memory 114 stores media data files 116, such as audio data files and/or video data files. The memory 112 stores media distribution policies 118 that can be used to manage distribution of the media data files to the media hardware. The memory and shared memory can be implemented as any type of memory and/or suitable electronic data storage, including memory units removable from the electronic device, such as a flash memory data storage device (e.g., a USB flash drive), a compact disc (CD), or a digital versatile disc (DVD).
[0009] The example system 100 includes elements of a hybrid operating system, such as an operating system kernel 120 and a kernel binder 122. The system also includes a first runtime environment 130 and a second runtime environment 132. In implementations, a Linux™ hybrid operating system may be implemented as a GNU/Linux operating system that includes a Linux Kernel (e.g., the operating system kernel), an Android™ runtime environment (e.g., the first runtime environment), and a GNU/Linux runtime environment (e.g., the second runtime environment). The operating system kernel 120 communicates and/or transfers data between the runtime environments and the memory, processors, and media hardware of the example system. Embodiments of a hybrid operating system are not limited to Linux-based systems and environments. Other operating systems that may be used to implement hybrid operating system media integration could include Disk Operating System (DOS), Microsoft Windows™ operating systems, and Apple Mac OS™ operating systems. Additionally, alternate implementations can include more than two runtime environments of a hybrid operating system.
[0010] The first runtime environment 130 includes a media control service 140 that can be implemented as computer-executable instructions, such as a software application, and executed by one or more processors to implement the various embodiments described herein. In embodiments, the media control service can be implemented as an Android Audio Flinger service (e.g., in an Android runtime environment) for audio management and distribution of audio data files to the media hardware. The second runtime environment 132 includes a data routing service 142 that can also be implemented as computer-executable instructions, such as a software application, and executed by one or more processors to implement the various embodiments described herein. In embodiments, the data routing service can be implemented as an Advanced Linux Sound Architecture (ALSA) service (e.g., in a GNU/Linux runtime environment).
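As a rough illustration of this division of responsibility, the following C++ sketch models a media control service that alone drives the media hardware and a data routing service that only forwards requests to it. All type and function names here are invented for illustration; they are not the Audio Flinger or ALSA APIs and do not appear in the patent description.

    #include <iostream>
    #include <string>

    // Hardware control data associated with a media data file (illustrative).
    struct HardwareControl {
        bool        powerOn = false;
        int         volume  = 0;        // 0-100 gain, if audio
        std::string device;             // e.g., "speaker", "display"
    };

    // Reference to a media data file held in shared memory (illustrative).
    struct MediaDataRef {
        std::string sharedMemoryKey;
        std::string format;             // e.g., "pcm16", "h264"
    };

    // Media control service in the first runtime environment: the only
    // component that drives the media hardware through the operating system kernel.
    class MediaControlService {
    public:
        void submitLocal(const MediaDataRef& m, const HardwareControl& c) {
            std::cout << "render " << m.sharedMemoryKey << " on " << c.device << "\n";
        }
        void submitRouted(const MediaDataRef& m, const HardwareControl& c) {
            // Routed requests are subject to the media distribution policies.
            submitLocal(m, c);
        }
    };

    // Data routing service in the second runtime environment: never touches the
    // hardware, it only forwards media and control data to the first environment.
    class DataRoutingService {
    public:
        explicit DataRoutingService(MediaControlService& mcs) : mcs_(mcs) {}
        void route(const MediaDataRef& m, const HardwareControl& c) {
            mcs_.submitRouted(m, c);    // via shared memory and the kernel binder
        }
    private:
        MediaControlService& mcs_;
    };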
[0011] The first runtime environment 130 also includes software applications 150, such as Android applications, and the second runtime environment 132 also includes software applications 152, such as Ubuntu™ applications. The software applications in their respective runtime environments may include, but are not limited to, any one or combination of a call/dialer application, camera (video or still images), audio recorder, MP3 player, WAVE audio file player, TIFF still image viewer, AVI audio-video player, Flash audio-video player, QuickTime audio-video player, or MP4 audio-video player. Depending on specific implementations and functions of the various software applications, the applications generally retrieve and/or generate media data, as well as retrieve and/or generate hardware control data that controls the media hardware 110 to render the corresponding media data, such as audio data and/or video data. The media data that is retrieved and/or generated by the software applications can be stored or otherwise maintained in the shared memory 114 as the media data files 116. The hardware control data may include data signals to turn the media hardware on and off, set a volume level or gain, set a brightness level, mix multiple media data files, turn muting on and off, and any other type of hardware control data signals.
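The kinds of hardware control data listed above could be modeled as a small tagged command, as in the hypothetical encoding below; the operation names mirror the examples in the text, but the enum, struct, and field names are assumptions rather than a defined format.

    #include <cstdint>

    // Hypothetical encoding of hardware control data; not a real API.
    enum class HardwareControlOp : uint8_t {
        PowerOn,        // turn the media hardware on
        PowerOff,       // turn the media hardware off
        SetVolume,      // set a volume level or gain
        SetBrightness,  // set a display brightness level
        Mix,            // mix multiple media data files
        MuteOn,
        MuteOff,
    };

    struct HardwareControlSignal {
        HardwareControlOp op;
        int32_t           value;     // volume, gain, or brightness, if applicable
        uint32_t          deviceId;  // which speaker or display the signal targets
    };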
[0012] The media control service 140 in the first runtime environment 130 manages the distribution of media data files from the software applications 150 in the first runtime environment to the media hardware 110. Media data from the media data files 116 is routed for input to the media hardware at 190, and the media control service communicates hardware control signals 192 through the operating system kernel 120 to initiate and control the media hardware that renders the media data. The data routing service 142 in the second runtime environment 132 is implemented to preempt direct distribution of media data files and hardware control data from the software applications 152 in the second runtime environment to the media hardware at 194. Note that, although a playback example is shown here, hardware control signals 192 can also be used to control media hardware 110 to capture media data and store it in the shared memory 114.
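A minimal sketch of that playback path, under assumed names: the media control service pulls frames that were placed in shared memory and sends control signals to a kernel-facing device before rendering; the capture case would run the same path in the opposite direction. The KernelAudioDevice type and its methods are stand-ins, not a real kernel interface.

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // Stand-in for the kernel-facing driver path used by the media control service.
    struct KernelAudioDevice {
        void control(int op, int value) { (void)op; (void)value; }       // control signals
        void writeFrames(const int16_t* frames, std::size_t count) {     // media input
            (void)frames; (void)count;
        }
    };

    // Pull decoded frames (already placed in shared memory) and render them.
    void renderFromSharedMemory(KernelAudioDevice& dev,
                                const std::vector<int16_t>& frames,
                                int volume) {
        dev.control(/*op: power*/ 1, /*on*/ 1);
        dev.control(/*op: volume*/ 2, volume);
        dev.writeFrames(frames.data(), frames.size());
    }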
[0013] In embodiments, the data routing service 142 is implemented to route additional media data files from the second runtime environment 132 to the media control service 140 in the first runtime environment 130 where the media control service also manages the distribution of the additional media data files to the media hardware 110. This prevents a media hardware control conflict and/or a media rendering conflict if both the first runtime environment 130 and the second runtime environment 132 were to initiate the same media hardware for media playback, or if one runtime environment 130 were to turn off the media hardware when the other runtime environment 132 is using or initiates use of the media hardware. Not only can playback conflicts be avoided, but recording conflicts can also be avoided.
[0014] The data routing service 142 in the second runtime environment 132 utilizes the shared memory 114 to store the additional media data files from the applications 152, and the additional media data files are then accessible to the operating system (e.g., the operating system kernel 120) from the shared memory. The data routing service 142 is also implemented to route hardware control data 196 that is associated with the additional media data files from the second runtime environment to the media control service 140 in the first runtime environment. The hardware control data is routed via the kernel binder 122 between the second runtime environment and the first runtime environment. The media control service 140 can then selectively initiate the hardware control signals 192 from the hardware control data 196 to manage the distribution of the media data files 116 that are routed from the second runtime environment by the data routing service.
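The two routes described above, media bytes through shared memory and control data through the kernel binder, might look like the following sketch. SharedMemoryStore and BinderChannel are placeholder types invented here; a real implementation would use the platform's shared-memory and binder IPC facilities.

    #include <cstdint>
    #include <map>
    #include <string>
    #include <vector>

    // Stand-in for shared memory reachable from both runtime environments.
    struct SharedMemoryStore {
        std::map<std::string, std::vector<uint8_t>> files;
        void put(const std::string& key, std::vector<uint8_t> bytes) {
            files[key] = std::move(bytes);
        }
    };

    // Stand-in for the kernel binder between the two runtime environments.
    struct BinderChannel {
        std::vector<std::string> outbox;
        void send(const std::string& controlMessage) { outbox.push_back(controlMessage); }
    };

    // Data routing service: media bytes go to shared memory, the associated
    // hardware control data goes over the binder to the media control service.
    void routeFromSecondEnvironment(SharedMemoryStore& shm, BinderChannel& binder,
                                    const std::string& fileKey,
                                    std::vector<uint8_t> mediaBytes, int volume) {
        shm.put(fileKey, std::move(mediaBytes));
        binder.send("play key=" + fileKey + " volume=" + std::to_string(volume));
    }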
[0015] FIG. 2 further illustrates components 200 of the example system 100 as described with reference to FIG. 1. The media control service 140 includes code libraries 230. As described above, the media control service may be implemented as an Android Audio Flinger service (e.g., in an Android runtime environment). In an example implementation of an Android runtime environment, the code libraries 230 include "libmedia" 232, "libcutils" 233, "libbinder" 234, "liblog" 235, and "libutils" 236. The data routing service 142 can be implemented with one or more of the code libraries of the media control service at 238. As described above, the data routing service may be implemented as an Advanced Linux Sound Architecture (ALSA) service (e.g., in a GNU/Linux runtime environment). The Android Audio Flinger libraries are ported to the Linux/Ubuntu environment, and a new ALSA PCM plug-in (pulse code modulation for audio data) and a new ALSA CTL plug-in (media hardware control) can be built from the Android code libraries.
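The sketch below only gestures at the shape of such a forwarding plug-in: a callback table whose PCM transfer and control entry points redirect data toward the media control service instead of the sound hardware. It deliberately does not use the real ALSA external-plugin API or the Android libraries named above; every identifier here is invented.

    #include <cstddef>
    #include <cstdint>

    // Invented callback table; the actual port builds ALSA PCM and CTL plug-ins
    // from the Android code libraries, which is not reproduced here.
    struct ForwardingPcmPlugin {
        std::size_t (*transfer)(const int16_t* frames, std::size_t count);
        int (*control)(int op, int value);
    };

    // Instead of writing to the sound card, forward frames toward shared memory.
    static std::size_t forwardFrames(const int16_t* frames, std::size_t count) {
        (void)frames;
        return count;   // report all frames as consumed
    }

    // Instead of touching the mixer, forward the request over the kernel binder.
    static int forwardControl(int op, int value) {
        (void)op; (void)value;
        return 0;
    }

    static const ForwardingPcmPlugin kForwardingPlugin = { forwardFrames, forwardControl };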
[0016] FIG. 3 illustrates an example of an electronic device 300 in which embodiments of hybrid operating system media integration can be implemented. The electronic device 300 can be implemented as any type of fixed or mobile device, and can be implemented with any combination of differing components as further described with reference to the example electronic device shown in FIG. 5. Additionally, the electronic device 300 includes multiple runtime environments, such as described with reference to FIG. 1.
[0017] For example, the electronic device 300 includes media hardware 310 that renders media data, such as audio data and/or video data. The media hardware may include an audio speaker that renders audio when input with audio data, and may include a display component that displays video when input with video data. Note that media hardware 310 may be integrated into the electronic device 300 or may be a peripheral connected through a port or through a wireless transceiver. The media hardware in the electronic device can include various audio speakers, such as a private speaker, a hands-free speaker, a wireless earpiece speaker with a Bluetooth transceiver, a headphone jack output, and the like. The media hardware may also include a microphone, a camera, vibration devices, haptic feedback devices, visual outputs, and/or any other media receiving and/or rendering device.
[0018] The electronic device 300 also includes memory 312 and shared memory 314, each implemented to store or otherwise maintain various data. For example, the shared memory 314 stores media data files 316, such as audio data files and/or video data files. The memory 312 stores media distribution policies 318 that can be used to manage distribution of the media data files to the media hardware. The memory and shared memory can be implemented as any type of memory and/or suitable electronic data storage as described with reference to FIG. 1.
[0019] The electronic device 300 includes elements of a hybrid operating system 320, such as an operating system kernel 322 and a kernel binder 324. The hybrid operating system includes a first runtime environment 330 and a second runtime environment 332. In implementations, the hybrid operating system may be implemented as a GNU/Linux operating system that includes a Linux Kernel (e.g., the operating system kernel), an Android™ runtime environment (e.g., the first runtime environment), and a GNU/Linux runtime environment (e.g., the second runtime environment). The operating system kernel 322 communicates and/or transfers data between the runtime environments and the memory, processors, and media hardware of the electronic device. As previously mentioned, embodiments of a hybrid operating system are not limited to Linux-based systems and environments. Additionally, alternate implementations can include more than two runtime environments of a hybrid operating system.
[0020] The first runtime environment 330 includes a media control service 340, and the second runtime environment 332 includes a data routing service 342. The media control service and the data routing service can each be implemented as computer-executable instructions, such as software applications, and executed by one or more processors to implement the various embodiments described herein. In embodiments, the media control service is implemented as an Android Audio Flinger service (e.g., in an Android runtime environment) for audio management and distribution of audio data files to the media hardware. Additionally, the data routing service is implemented as an Advanced Linux Sound Architecture (ALSA) service (e.g., in a GNU/Linux runtime environment).
[0021] The runtime environments also include software applications, such as Android applications 350 in the first runtime environment 330 and Ubuntu applications 352 in the second runtime environment 332. The software applications in the respective runtime environments may include any one or combination of the software applications described with reference to FIG. 1. The media data that is retrieved and/or generated by the applications can be stored or otherwise maintained in the shared memory 314 as the media data files 316.
[0022] The media control service 340 in the first runtime environment 330 manages the distribution of media data files from the Android applications 350 to the media hardware 310. Media data from the media data files 316 is routed for input to the media hardware, and the media control service communicates hardware control signals through the operating system kernel 322 to initiate and control the media hardware that renders the media data. The data routing service 342 in the second runtime environment 332 is implemented to route additional media data files from the Ubuntu applications 352 to the media control service 340 in the first runtime environment 330 where the media control service also manages the distribution of the additional media data files to the media hardware 310.
[0023] The data routing service 342 in the second runtime environment 332 utilizes the shared memory 314 to store the additional media data files from the Ubuntu applications 352, and the additional media data files are then accessible to the operating system (e.g., the operating system kernel 322) from the shared memory. The data routing service 342 is also implemented to route hardware control data 390 that is associated with the additional media data files from the second runtime environment to the media control service 340 in the first runtime environment. The hardware control data is routed via the kernel binder 324 between the second runtime environment and the first runtime environment. The media control service 340 can then initiate (or ignore) hardware control signals from the hardware control data 390 to manage the distribution of the media data files 316 that are routed from the second runtime environment by the data routing service. The media control service 340 checks the distribution policies 318 to determine whether to ignore or pass hardware control data 390 from the second runtime environment to the media hardware 310.
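A sketch of that policy gate, with invented names: the media control service consults the media distribution policies before deciding whether control data routed from the second runtime environment reaches the media hardware. The example rule inside allowRoutedControl is an assumption chosen to match the call scenario used elsewhere in this description, not a policy defined by the patent.

    #include <string>

    // Stand-in for the media distribution policies held in device memory.
    struct DistributionPolicies {
        bool callActive = false;
        bool allowRoutedControl(const std::string& targetDevice) const {
            // Example rule: while a call is active, routed requests may not take
            // over the loudspeaker, but other hardware (e.g., the display) may pass.
            return !(callActive && targetDevice == "speaker");
        }
    };

    // The media control service gates routed hardware control data:
    // pass it through to the media hardware, or ignore it.
    bool shouldApplyRoutedControl(const DistributionPolicies& policies,
                                  const std::string& targetDevice) {
        return policies.allowRoutedControl(targetDevice);
    }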
[0024] As described with reference to FIG. 2, the media control service 340 may be implemented as an Android Audio Flinger service (e.g., in an Android runtime environment), and includes code libraries that may also be used to implement the data routing service 342, which may be implemented as an Advanced Linux Sound Architecture (ALSA) service (e.g., in a GNU/Linux runtime environment). The Android Audio Flinger libraries are ported to the Linux/Ubuntu environment, and a new ALSA PCM plug-in (pulse code modulation for audio data) and a new ALSA CTL plug-in (media hardware control) can be built from the Android code libraries.
[0025] Example method 400 is described with reference to FIG. 4 in accordance with one or more embodiments of hybrid operating system media integration. Generally, any of the services, functions, methods, procedures, components, and modules described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or any combination thereof. A software implementation represents program code that performs specified tasks when executed by a computer processor. The example methods may be described in the general context of computer-executable instructions, which can include software, applications, routines, programs, objects, components, data structures, procedures, modules, functions, and the like. The program code can be stored in one or more computer-readable storage media devices, both local and/or remote to a computer processor. The methods may also be practiced in a distributed computing environment by multiple computer devices. Further, the features described herein are platform-independent and can be implemented on a variety of computing platforms having a variety of processors.
[0026] FIG. 4 illustrates example method(s) 400 of hybrid operating system media integration. The order in which the method blocks are described is not intended to be construed as a limitation, and any number or combination of the described method blocks can be combined in any order to implement a method, or an alternate method.
[0027] At block 402, a first media data file and first hardware control data for media hardware are received at a first runtime environment. For example, the first runtime environment 130 (FIG. 1) receives a media data file and hardware control data from a software application 150 in the first runtime environment. In a specific example, an Android call application 350 (FIG. 3) of the first runtime environment 330 receives an incoming call signal that points to a ringer alert audio file and a specific volume control level.
[0028] At block 404, a second media data file and second hardware control data for the media hardware are received at a second runtime environment. For example, the second runtime environment 132 receives a media data file and hardware control data from a software application 152 in the second runtime environment. In a specific example, an Ubuntu Internet browser application 352 of the second runtime environment 332 receives streaming movie video and audio data with user-selectable audio control settings.
[0029] At block 406, the second media data file is routed from the second runtime environment to the first runtime environment. For example, the data routing service 142 in the second runtime environment 132 routes a media data file 116 from the second runtime environment to the first runtime environment 130 via the shared memory 114, which is accessible by the operating system kernel 120. In the specific example, the data routing service 342 in the second runtime environment 332 routes the video and audio data files based on instructions from the Ubuntu Internet browser application 352.
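A simplified sketch of this data path (for example, the PCM data path that the ALSA PCM plug-in of paragraph [0024] would provide) is shown below. SharedAudioBuffer and PcmRoutingPlugin are hypothetical stand-ins; the sketch does not use the real ALSA plug-in SDK or Audio Flinger interfaces.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical stand-in for a region of the shared memory that the operating
// system kernel and the media control service in the first runtime can read.
struct SharedAudioBuffer {
    std::vector<int16_t> frames;
    void append(const int16_t* data, std::size_t frameCount) {
        frames.insert(frames.end(), data, data + frameCount);
    }
};

// Sketch of the PCM data path of a data routing service: an application in the
// second runtime environment writes audio frames, and the plug-in copies them
// into shared memory instead of driving the audio hardware directly.
class PcmRoutingPlugin {
public:
    explicit PcmRoutingPlugin(SharedAudioBuffer& shared) : shared_(shared) {}

    // Called with interleaved PCM frames produced by, e.g., a browser application.
    std::size_t transfer(const int16_t* frames, std::size_t frameCount) {
        shared_.append(frames, frameCount);
        return frameCount;  // report all frames as consumed
    }

private:
    SharedAudioBuffer& shared_;
};

int main() {
    SharedAudioBuffer shared;
    PcmRoutingPlugin plugin(shared);
    int16_t silence[64] = {0};
    plugin.transfer(silence, 64);  // 64 frames are now visible to the first runtime
    return shared.frames.size() == 64 ? 0 : 1;
}
```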
[0030] At block 408, the second hardware control data is routed from the second runtime environment to the first runtime environment. For example, the data routing service 142 in the second runtime environment 132 routes the hardware control data 196 from the second runtime environment to the first runtime environment 130 via the kernel binder 122 between the second runtime environment and the first runtime environment. In the continuing example, the data routing service 342 sends speaker and volume setting control signals through the kernel binder 324 from the second runtime environment 332 to the first runtime environment 330.
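The control path at block 408 can be sketched in a similarly simplified form: hardware control settings from the second runtime environment are serialized and handed to a binder-like channel rather than applied to the audio driver locally. The BinderChannel callback below is an assumed stand-in for the kernel binder transport, not the actual Android Binder API.

```cpp
#include <functional>
#include <iostream>
#include <string>

// Simplified stand-in for the kernel binder transport between runtimes:
// the real mechanism is a kernel IPC, modeled here as a callback.
using BinderChannel = std::function<void(const std::string& control)>;

// Sketch of the CTL side of a data routing service: hardware control settings
// from the second runtime are serialized and handed to the binder channel
// rather than written to the audio driver.
class CtlRoutingPlugin {
public:
    explicit CtlRoutingPlugin(BinderChannel channel) : channel_(std::move(channel)) {}

    void setVolume(int level) {
        channel_("volume=" + std::to_string(level));
    }
    void selectSpeaker(const std::string& speaker) {
        channel_("speaker=" + speaker);
    }

private:
    BinderChannel channel_;
};

int main() {
    // The media control service in the first runtime receives the control data.
    CtlRoutingPlugin ctl([](const std::string& control) {
        std::cout << "media control service received: " << control << std::endl;
    });
    ctl.selectSpeaker("loudspeaker");
    ctl.setVolume(70);
    return 0;
}
```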
[0031] At block 410, the media hardware is managed using the first hardware control data and/or the second hardware control data in compliance with media distribution policies. For example, the media control service 140 in the first runtime environment 130 manages the media hardware 110 using the hardware control signals 192 (e.g., derived from the first hardware control data and/or the second hardware control data). In implementations, the media control service 140 manages the media hardware in compliance with the media distribution policies 118 by selecting the first hardware control data and suppressing the second hardware control data, by selecting the second hardware control data and suppressing the first hardware control data, or by combining the first hardware control data and the second hardware control data. In the continuing example described with reference to the electronic device 300 shown in FIG. 3, the distribution policies 318 prioritize incoming call alerts over all other audio data. Then, the media control service 340 would select the first hardware control data from the Android call application 350 for the incoming call to turn on a particular audio speaker at a particular volume; suppress the second hardware control data from the Ubuntu Internet browser application 352 for the streaming movie audio; and couple the ringer alert audio data file to the media hardware 310.
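The management at block 410 can be sketched as a single arbitration function that applies one of the three policy outcomes named above. The ControlData fields and the particular Combine rule are illustrative assumptions rather than the described implementation.

```cpp
#include <algorithm>
#include <iostream>

// Hypothetical control settings derived from one runtime environment.
struct ControlData {
    bool speakerOn;
    int volume;
};

enum class PolicyAction { SelectFirst, SelectSecond, Combine };

// Apply the media distribution policy to the two sets of hardware control
// data and produce the settings actually sent to the media hardware.
ControlData arbitrate(const ControlData& first, const ControlData& second,
                      PolicyAction action) {
    switch (action) {
        case PolicyAction::SelectFirst:  return first;   // e.g., an incoming call wins
        case PolicyAction::SelectSecond: return second;  // e.g., no call in progress
        case PolicyAction::Combine:
            // One possible combination: the louder of the two requests, with the
            // speaker on if either source asks for it.
            return {first.speakerOn || second.speakerOn,
                    std::max(first.volume, second.volume)};
    }
    return first;
}

int main() {
    ControlData call{true, 100};  // ringer alert from the Android call application
    ControlData movie{true, 60};  // streaming movie audio from the browser
    ControlData out = arbitrate(call, movie, PolicyAction::SelectFirst);
    std::cout << "speaker=" << out.speakerOn << " volume=" << out.volume << std::endl;
    return 0;
}
```

With the incoming-call policy of the continuing example, SelectFirst is chosen, so the ringer control data drives the speaker and the movie audio settings are suppressed.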
[0032] The media distribution policies 118 can be quite complex and may prioritize audio data and loudspeaker hardware control data from particular software applications or specific runtime environments over other audio data and loudspeaker hardware control data. The prioritization of audio information may be completely different from the prioritization of video information. For example, the video data and display hardware control data may continue to be sent from the Ubuntu Internet browser application 352 in the second runtime environment 332 while the audio data and loudspeaker hardware control data from the second runtime environment 332 is suppressed.
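One possible, purely illustrative shape for such policies is a per-media-type table that maps each media type to the runtime environment whose data and control signals should win. With the assumed entries below, call audio from the Android environment suppresses browser audio while browser video continues to render.

```cpp
#include <iostream>
#include <map>
#include <string>

int main() {
    // Hypothetical per-media-type policy table: the winning runtime environment
    // may differ for audio and video, as in the call-during-movie example.
    std::map<std::string, std::string> preferredSource = {
        {"audio", "android"},  // incoming-call audio suppresses browser audio
        {"video", "ubuntu"}    // the movie video keeps rendering on the display
    };
    for (const auto& entry : preferredSource) {
        std::cout << entry.first << " -> " << entry.second << std::endl;
    }
    return 0;
}
```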
[0033] FIG. 5 illustrates various components of an example electronic device 500 that can be implemented as any device described with reference to any of the previous FIGs. 1-4. The electronic device may be implemented as any one or combination of a fixed or mobile device, in any form of a consumer, computer, portable, user, communication, phone, navigation, gaming, media playback, and/or electronic device.
[0034] The electronic device 500 includes communication transceivers 502 that enable wired and/or wireless communication of device data 504, such as received data, data that is being received, data scheduled for broadcast, data packets of the data, etc. Example transceivers include wireless personal area network (WPAN) radios compliant with various IEEE 802.15 (sometimes referred to as Bluetooth™) standards, wireless local area network (WLAN) radios compliant with any of the various IEEE 802.11 (sometimes referred to as WiFi™) standards, wireless wide area network (WWAN) radios for cellular telephony, wireless metropolitan area network (WMAN) radios compliant with various IEEE 802.16 (sometimes referred to as WiMAX™) standards, and wired local area network (LAN) Ethernet transceivers.

[0035] The electronic device 500 may also include one or more data input ports 506 via which any type of user data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source. The data input ports may include USB ports, coaxial cable ports, and other serial or parallel connectors (including internal connectors) for flash memory, DVDs, CDs, and the like. These data input ports may be used to couple the electronic device 500 to components, peripherals, or accessories such as microphones or cameras.
[0036] The electronic device 500 includes one or more processors 508 (e.g., any of microprocessors, controllers, and the like), which process computer-executable instructions to control operation of the device. Alternatively or in addition, the electronic device can be implemented with any one or combination of software, hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits, which are generally identified at 510. Although not shown, the electronic device can include a system bus or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
[0037] The electronic device 500 also includes one or more memory devices 512 that enable data storage, examples of which include random access memory (RAM), non-volatile memory (e.g., read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. A disk storage device may be implemented as any type of magnetic or optical storage device, such as a hard disk drive, a recordable and/or rewriteable disc, any type of a digital versatile disc (DVD), and the like. The electronic device may also include a mass storage media device.
[0038] A memory device 512 provides data storage mechanisms to store the device data 504, other types of information and/or data, and various device applications 514 (e.g., software applications). For example, an operating system 516 can be maintained as software instructions within a memory device and executed on processors 508. The device applications may also include a device manager, such as any form of a control application, software application, signal-processing and control module, code that is native to a particular device, a hardware abstraction layer for a particular device, and so on. In embodiments of hybrid operating system media integration, the operating system 516 can establish one or more runtime environments 518 as described herein. For example, a first runtime environment includes a media control service 520, and a second runtime environment includes a data routing service 522.
[0039] The electronic device 500 also includes an audio and/or video processing system 524 that generates audio data for an audio system 526 and/or generates display data for a display system 528. The audio system and/or the display system may include any devices that process, display, and/or otherwise render audio, video, display, and/or image data. Display data and audio signals can be communicated to an audio component and/or to a display component via an RF (radio frequency) link, S-video link, HDMI (high-definition multimedia interface), composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link, such as media data port 530. In implementations, the audio system and/or the display system are external components to the electronic device. Alternatively, the audio system and/or the display system are integrated components of the example electronic device, such as an integrated touch-screen.
[0040] Although embodiments of hybrid operating system media integration have been described in language specific to features and/or methods, the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations of hybrid operating system media integration.

Claims

1. A media system, comprising:
media hardware configured to render media data;
a media control service in a first runtime environment, the media control service configured to manage distribution of a media data file from the first runtime environment to the media hardware; and
a data routing service in a second runtime environment, the data routing service configured to route an additional media data file from the second runtime environment to the media control service in the first runtime environment, the media control service further configured to manage distribution of the additional media data file to the media hardware.
2. The media system as recited in claim 1, further comprising:
an operating system configured for establishing the first runtime environment and the second runtime environment.
3. The media system as recited in claim 2, further comprising:
a shared memory configured to store the media data file from the first runtime environment and the additional media data file from the second runtime environment, where the media data file and the additional media data file are accessible to the operating system from the shared memory.
4. The media system as recited in claim 1, wherein the data routing service is further configured to route hardware control data that is associated with the additional media data file from the second runtime environment to the media control service in the first runtime environment.
5. The media system as recited in claim 4, further comprising:
a kernel binder configured for transferring the hardware control data from the second runtime environment to the first runtime environment.
6. The media system as recited in claim 1, wherein the media hardware includes an audio speaker.
7. The media system as recited in claim 1, wherein the media hardware includes at least one of a microphone, a display component, or a camera.
8. The media system as recited in claim 1, wherein the media control service complies with media distribution policies to manage rendering of the media data file and the additional media data file.
9. The media system as recited in claim 1, wherein the media control service includes code libraries, and wherein the data routing service includes one or more of the code libraries of the media control service.
10. An electronic device, comprising:
a first runtime environment configured for execution with an operating system kernel;
a media control service in the first runtime environment, the media control service configured to manage distribution of a media data file with at least audio data or video data from the first runtime environment;
a second runtime environment configured for execution with the operating system kernel; and
a data routing service in the second runtime environment, the data routing service configured to route an additional media data file to the media control service in the first runtime environment.
11. The electronic device as recited in claim 10, further comprising:
a shared memory configured to store the additional media data file from the second runtime environment, and wherein the additional media data file is routed to the media control service via the shared memory.
12. The electronic device as recited in claim 10, wherein the data routing service is further configured to route hardware control data that is associated with the additional media data file from the second runtime environment to the media control service in the first runtime environment.
13. The electronic device as recited in claim 12, wherein the hardware control data is routed from the second runtime environment to the first runtime environment via a kernel binder.
14. The electronic device as recited in claim 12, wherein the media control service is further configured to comply with media distribution policies to suppress the hardware control data that is associated with the additional media data file.
15. The electronic device as recited in claim 10, further comprising:
an operating system configured for establishing the first runtime environment and the second runtime environment.
16. The electronic device as recited in claim 10, wherein the media control service includes code libraries, and wherein the data routing service includes one or more of the code libraries of the media control service.
17. A method, comprising:
receiving a first media data file and first hardware control data for media hardware at a first runtime environment;
receiving a second media data file and second hardware control data for the media hardware at a second runtime environment;
routing the second media data file from the second runtime environment to the first runtime environment;
routing the second hardware control data from the second runtime environment to the first runtime environment; and
managing the media hardware at the first runtime environment using the first hardware control data or the second hardware control data.
18. The method as recited in claim 17, wherein managing the media hardware includes complying with media distribution policies to perform at least one of:
selecting the first hardware control data and suppressing the second hardware control data;
selecting the second hardware control data and suppressing the first hardware control data; or
combining the first hardware control data and the second hardware control data.
19. The method as recited in claim 17, wherein the second media data file is routed via a shared memory from the second runtime environment to the first runtime environment.
20. The method as recited in claim 17, wherein the second hardware control data is routed via a kernel binder from the second runtime environment to the first runtime environment.