WO2014163621A1 - Managing use of resources in mobile devices - Google Patents


Info

Publication number
WO2014163621A1
WO2014163621A1 (PCT/US2013/034946)
Authority
WO
WIPO (PCT)
Prior art keywords
data
orientation
mobile device
sensor
processing
Prior art date
Application number
PCT/US2013/034946
Other languages
French (fr)
Inventor
James Toga
Original Assignee
Vivox, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivox, Inc. filed Critical Vivox, Inc.
Priority to PCT/US2013/034946 priority Critical patent/WO2014163621A1/en
Publication of WO2014163621A1 publication Critical patent/WO2014163621A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/26Power supply means, e.g. regulation thereof
    • G06F1/32Means for saving power
    • G06F1/3203Power management, i.e. event-based initiation of a power-saving mode
    • G06F1/3234Power saving characterised by the action undertaken
    • G06F1/325Power saving in peripheral device
    • G06F1/3265Power saving in display device
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/12Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Definitions

  • FIG. 3, FIG. 4, and FIG. 5 illustrate in flowcharts representative forms of preferred method embodiments.
  • FIG. 3 describes a form of a preferred embodiment in which the circumstance determined from data of an orientation sensor is that the mobile device is in motion, or its orientation is changing at a rate that makes details of a video image (either being accepted or being rendered) less perceivable.
  • In some forms the sensor is an accelerometer, from which the direction of gravity ("down"), and hence the orientation of the device with respect to "down", can be determined; orientation or changes in orientation with respect to other axes can also be determined, for example by integrating accelerometer measurements to determine a relative change in orientation.
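
As a hedged illustration of the accelerometer case, the tilt of the device relative to "down" can be computed from a three-axis accelerometer sample. The function name and the assumption that the reading is expressed in the touchscreen axes (as in FIG. 2) are illustrative, not taken from the patent:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Angle in degrees between the device's screen normal (z axis)
    and the measured gravity vector ("down"). 0 = face-up and level,
    90 = on edge, 180 = face-down."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("no acceleration measured")
    # Clamp for numeric safety before acos.
    cos_z = max(-1.0, min(1.0, az / g))
    return math.degrees(math.acos(cos_z))
```

A tilt near 180 degrees would correspond to the face-down circumstance discussed for FIG. 4.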
  • the sensor is a gyroscopic sensor.
  • the steps of the process may be performed by a processor of a mobile device according to programming instructions stored in a memory of the device.
  • The process starts at the step 310 as a next video image (or frame) is available from a camera of the device: the next video image is accepted from the camera at 320.
  • the processor accepts sensor data from a sensor responsive to orientation (accelerometer, gyroscopic sensor, etc.).
  • the processor determines the orientation of the device, such as with respect to the three axes defined by the plane of a touchscreen of the device (see discussion for FIG. 2).
  • the processor fetches a stored value from a memory of the processor for a prior orientation of the device, and compares it with the current orientation determined in step 340.
  • the processor determines whether the orientation of the device has changed greater than a predetermined threshold. If not, the method continues to 380, where the video data is processed "normally". If so, the method continues to 370, where the video data is processed in a fashion that uses a different amount of a resource: for example, the data may not be transmitted, or some video frames or images may be skipped, or data may be encoded at a lesser resolution.
  • In some forms, the predetermined threshold may be determined at least in part dynamically, for example based on a history of how the device has been moved or oriented before, based on an input from another sensor, or based on the device having held a previous orientation for a period of time.
  • the processor updates the stored values for the prior orientation of the device with data of the current orientation, and completes and may continue to other operations at 395.
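
The FIG. 3 loop (compare orientations, pick a processing path, update the stored orientation) can be sketched as follows. This is a minimal illustration assuming a one-dimensional orientation value and caller-supplied processing callbacks; all names are hypothetical, not from the patent:

```python
def process_frame(frame, orientation, prior_orientation, threshold,
                  process_normally, process_reduced):
    """One pass of the FIG. 3 loop: compare the current orientation with
    the stored prior orientation and choose a processing path.
    Returns (result, new_prior_orientation)."""
    change = abs(orientation - prior_orientation)  # 1-D stand-in for a 3-axis comparison
    if change > threshold:
        # Step 370: consume a different amount of a resource, e.g. skip
        # frames or encode at lesser resolution.
        result = process_reduced(frame)
    else:
        # Step 380: process the video data "normally".
        result = process_normally(frame)
    # Step 390: the current orientation becomes the stored prior value.
    return result, orientation
```

In practice the orientation would be a vector or rotation, and the callbacks would be encoder configurations rather than functions, but the control flow is the same.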
  • FIG. 4 describes a form of a preferred embodiment in which the circumstance determined is that the device is oriented such that the desired image is not visible, or would be less visible.
  • the sensor is a proximity sensor, in others it may be a touch sensor such as a touch surface of the mobile device, in others without limitation the sensor is an accelerometer, or may be a combination of sensors.
  • the embodiment starts at 410 as a next video image (or frame) is available from a camera of the mobile device: the next video image is accepted from the camera as shown at 420.
  • the processor accepts sensor data from a sensor.
  • the processor of the device determines the orientation of the device, and at 450 determines whether the orientation corresponds to an orientation such that the image is less visible, or not visible to a user.
  • the orientation is that the device is face-down, and presumed to be on a surface that blocks the camera or the user's view of the display.
  • the orientation is determined by an accelerometer, in others the orientation is determined both by an accelerometer and by proximity or contact data with the side of the device with the camera or the display. In others, an orientation in which the camera or display are blocked is determined by contact or proximity sensor data that indicates that the device may be being held up to the side of a user's head, as in a voice chat or telephone chat.
  • the processor determines whether the image is less visible than a threshold. If not, the image data is processed normally, as indicated at 480. If so, the image or video data is processed in a fashion that consumes less of a resource, as indicated at 470. Subsequent to either of steps 470 or 480, the method completes and processing may continue to further operations as shown at 495.
  • In some forms, the processing at more optimal consumption of a resource as indicated at 470 may be that video information is processed or encoded at lower temporal resolution (e.g. not all frames are sent), or at less image detail (e.g. lower image resolution, lower color resolution).
  • Such techniques, and others may be applied in the step as a matter of design choice. It will readily be apparent that the invention is not limited in this or similar steps to processing techniques that may or may not be generally known in the art today.
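
A minimal sketch of the FIG. 4 decision, with frame-skipping shown as one example of processing that consumes less of a resource. The predicates and helper names are illustrative assumptions, not terms from the patent:

```python
def choose_processing(face_down, proximity_covered, reduced, normal):
    """FIG. 4 decision (steps 450-480): if sensor data indicates the
    camera or display is blocked or the image is less visible than a
    threshold, select the reduced-resource processing path."""
    if face_down or proximity_covered:
        return reduced
    return normal

def drop_frames(frames, keep_every):
    """One illustrative reduced-resource path: lower temporal resolution
    by keeping only every keep_every-th frame."""
    return frames[::keep_every]
```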
  • FIG. 5 shows one form of a preferred embodiment in which the mobile device is a rendering device for video data. Details readily apparent, or readily apparent from the discussion of other figures, are omitted for brevity.
  • the embodiment starts at 510: at step 520 the processor of the device receives data representing a next video image: in many forms the data is received via a network connection, or read from a data storage device.
  • the processor accepts sensor data from a sensor that may be used to determine an orientation of the device.
  • the processor of the device determines the orientation of the device from the sensor data of 530.
  • the processor compares the current orientation as determined at 540 with a prior orientation of the device.
  • the processor determines whether the change in orientation (e.g. the motion) of the device is greater than a threshold. If not, the video data is rendered to display an image normally, as indicated at 580. If so, the image or video data is processed in a fashion that consumes less of a resource, as indicated at 570. Subsequent to either of steps 570 or 580, the method continues as shown at 590 to update the information about prior orientation, and the method then completes and processing may continue to further operations as shown at 595.
  • the above exemplary embodiments apply both to a device which is originating data, such as a device accepting images from a camera of the device, and processing that information, such as for storage or transmission, and also to a device that is receiving data, such as video data originating from another device or from storage, and processing that information, such as for display or local storage.
  • the step of determining whether a change in orientation is greater than a threshold may be implemented in multiple forms, such as by determining a derivative, an integral, or another measure of change in orientation, and the threshold may be either a minimum or maximum threshold, or a combination of multiple factors, such as psycho-perceptual criteria. Thresholds used are a matter of design choice, and may be determined experimentally. There may be multiple thresholds, and multiple forms of processing data at different rates of consumption, and the thresholds may be adaptable (such as being adapted in response to the mobile device having been in motion for a period of time), or settable.
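
One way such a measure and multiple thresholds might look, sketched under the assumption of scalar orientation samples. The specific measure (mean absolute frame-to-frame delta) and the level scheme are illustrative design choices, not mandated by the text:

```python
def motion_measure(samples):
    """A simple measure of change in orientation: the mean absolute
    frame-to-frame delta over a window of scalar orientation samples."""
    if len(samples) < 2:
        return 0.0
    deltas = [abs(b - a) for a, b in zip(samples, samples[1:])]
    return sum(deltas) / len(deltas)

def pick_processing_level(measure, thresholds):
    """Map a motion measure onto one of several processing levels using
    ascending thresholds: level 0 = normal processing, higher levels =
    progressively cheaper (lower-resource) processing."""
    level = 0
    for t in sorted(thresholds):
        if measure > t:
            level += 1
    return level
```

The thresholds list could itself be adapted over time, e.g. relaxed after the device has been in sustained motion, matching the adaptability noted above.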
  • Steps may also be performed in multiple components, for example the steps of processing video data depending on a determination of change in orientation may be performed on a separate processor or server.
  • the representative embodiment forms of FIG. 3, FIG. 4, and FIG. 5 differ from an embodiment of merely transforming an image to compensate for motion of a video camera (a form of image stabilization) by detecting motion of the video camera and transforming the image to reduce the resulting apparent motion of the image. It will be further apparent that they differ from an embodiment of merely using information of a sensor of motion in determining how to encode for motion in encoding or decoding of video information to maintain a desired level of detail for an image.

Abstract

Methods and apparatus, including computer program products, for managing use of resources in mobile devices. A method includes, in a mobile device comprising at least a display, a processor and a memory, reading data of a sensor of the device, determining from the sensor data that the device is in an operating circumstance for processing the data in a fashion that consumes a different amount of a resource, and processing the data in a fashion that consumes a different amount of the resource.

Description

MANAGING USE OF RESOURCES IN MOBILE DEVICES
BACKGROUND OF THE INVENTION
[001] The present invention generally relates to resource management, and particularly to managing use of resources in mobile devices.
[002] In recent years, mobile devices have become increasingly capable. For example, smartphones are used not just for telephone communications, but are increasingly used for applications that include navigation, playing games, accessing the Internet, scheduling, entertainment, transmitting, receiving, or watching videos, and so forth. There are a number of applications on mobile devices for VOIP (Voice-Over-IP) audio communications over wireless connections to the Internet, and for real-time "video-call" communications over networks such as the Internet, including bi-directional calls. Examples of mobile devices are smartphones, audio and video players and recorders, laptops, netbooks, portable computation devices, electronic pocket organizers, and so forth.
[003] Mobile devices, while increasingly powerful, by their nature have limited resources. Examples include electrical power (power may be available for periods of time only from an internal battery); computation (the processor or computer is only able to perform a finite amount of computation in a given amount of time, and further the amount of the resource available may vary due to other factors, such as the clock rate being raised or lowered as a matter of simple power management); and bandwidth (the device may only be able to transmit or receive data at up to a limited maximum speed, which may also vary). Further, the amount of a given resource that is available may at times be different than at other times, or may be consumed at a different rate, depending on other functions of the device or environmental factors.
[004] Media processing, such as video processing, is often particularly costly in consuming electrical power, computational resources, and communications bandwidth.
[005] Generally, video at higher detail consumes computation and communication resources at a higher rate than at less detail.
[006] Similarly, encoding or decoding video information when the video image is changing (e.g. because either the subject or the video camera is moving) generally consumes computation at a higher rate than when the video images are changing less, dependent on the particular form of video data and how it may be encoded.
[007] Thus, there is a continuing need for techniques for managing resource consumption in mobile devices more optimally.
SUMMARY OF THE INVENTION
[008] The following presents a simplified summary of the innovation in order to provide a basic understanding of some aspects of the invention. This summary is not an extensive overview of the invention. It is intended to neither identify key or critical elements of the invention nor delineate the scope of the invention. Its sole purpose is to present some concepts of the invention in a simplified form as a prelude to the more detailed description that is presented later.
[009] One aspect of the present invention is a method of determining that it is appropriate to process data in a fashion that consumes a different amount of a resource in a mobile device, by reading data of a sensor of the device, determining from the sensor data that the device is in an operating circumstance for processing the data in a fashion that consumes a different amount of a resource, and processing the data in a fashion that consumes a different amount of the resource.
[0010] In another aspect, the data may be multimedia data such as video data. In further aspects, the circumstance may be that the device is in motion, the circumstance may be that the device is in a particular orientation, or the circumstance may be that there is insufficient ambient light for a camera of the device to produce usable images. The sensor may be any of a number of kinds of sensors.
[0011] In further aspects, quantity, form, or type of data processed or encoded in an originating device may be determined based on a sensor reading, and the amount of a resource consumed by a receiving or a rendering device of the data optimized. In aspects of some embodiments, images may be blurred or encoded at a different level of image detail when the circumstance is determined that an originating device is in motion. In other aspects, the circumstance may be an orientation where the device is an accepting device of the data and will produce video information of different value, such as when the device is held next to the user's head, or is face-down on a flat surface. The orientation may be an orientation where the device is a displaying device of the data and video information will not readily be seen by a user, such as when the device is held against the user's ear, or the device is positioned inside a pocket of the user.
[0012] The circumstance may also be for a device that is an accepting device in which the information is video information, that the video sensor (camera) is blocked by an object detected by means of a touch or proximity sensor. The circumstance may also be for a device that is a rendering device in which the information is video information, that the video display is blocked by an object or the device is in motion.
[0013] The processing may include computational or communications processing of media data such as video, and the resource may be a limited resource such as battery power, bandwidth, or computation, for processing and/or communicating real-time audio and/or video information. Further aspects of the invention are directed to using other kinds of sensor data, such as touch data, and other kinds of processing.
[0014] These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are explanatory only and are not restrictive of aspects as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] The invention will be more fully understood by reference to the detailed description, in conjunction with the following figures, wherein:
[0016] FIG. 1 illustrates an exemplary mobile device.
[0017] FIG. 2 shows an oblique view of an exemplary mobile device.
[0018] FIG. 3 is a flow diagram.
[0019] FIG. 4 is a flow diagram.
[0020] FIG. 5 is a flow diagram.
DETAILED DESCRIPTION
[0021] The subject innovation is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It may be evident, however, that the present invention may be practiced without these specific details. In other instances, well- known structures and devices are shown in block diagram form in order to facilitate describing the present invention.
[0022] As used in this application, the terms "component," "system," "platform," and the like can refer to a computer-related entity or an entity related to an operational machine with one or more specific functionalities. The entities disclosed herein can be either hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one computer and/or distributed between two or more computers. Also, these components can execute from various computer readable media having various data structures stored thereon. The components may communicate via local and/or remote processes such as in accordance with a signal having one or more data packets (e.g., data from one component interacting with another component in a local system, distributed system, and/or across a network such as the Internet with other systems via the signal).
[0023] In addition, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or." That is, unless specified otherwise, or clear from context, "X employs A or B" is intended to mean any of the natural inclusive permutations. That is, if X employs A, X employs B, or X employs both A and B, then "X employs A or B" is satisfied under any of the foregoing instances. Moreover, articles "a" and "an" as used in the subject specification and annexed drawings should generally be construed to mean "one or more" unless specified otherwise or clear from context to be directed to a singular form.
[0024] FIG. 1 illustrates an exemplary mobile device, such as a smart phone. Smart phones in this context include devices using operating software of the iPhone®, the Android® operating system (also referred to as Droid®), the Symbian® operating system, software based on Linux®, and others. Mobile devices in the present context need not have phone capabilities, nor need they include video camera capabilities. Further representative examples of mobile devices include without limitation digital cameras, video recording devices, mobile computing devices with a keyboard, touchscreen, image recognizer, voice input, or other user input means, audio and video players, and so forth.
[0025] 100 in Fig. 1 shows a front view of the exemplary mobile device; 150 shows a rear view. As illustrated, 140 is a touch screen for touch input and display, showing representative icons or tiles such as 145. 130 is an audio output transducer that may be used by appropriate applications for audio chats when the device is held against a user's ear in "telephone use". 110 is the location of a microphone transducer for audio chats. Also visible is connector 105 for external connection or docking. Power button 120 may be used to turn the device on or off, or optionally put it into a "sleep" mode.
[0026] The device may include one or more sensors such as a vibration sensor, an accelerometer, one or more magnetic sensors, a gyroscopic sensor, additional audio sensors or transducers, and so forth.
[0027] 150 shows a rear view of the exemplary device; visible are connector 105 and power button 120. Also visible are exemplary volume and camera-control buttons 180. Video camera 170 is seen adjacent to ambient light sensor 160. In some embodiments, ambient light may be detected by video camera 170.
[0028] FIG. 2 shows an oblique view of an exemplary mobile device. Shown are connector 205 and touch screen 240. 228 indicates the rear surface, like that shown in 150 of Fig. 1. 231 illustrates the three axes, relative to touchscreen 240, for the data of an internal accelerometer. [0029] One representative form of a preferred embodiment is a method performed by a processor of a mobile device. The mobile device may be running a version of the Android® operating software, which is readily understood in the present context. There are further kinds of mobile devices, such as iPhones®, iPods®, Symbian® smartphones, Web tablets, and Android® tablets. For example, devices running a version of the Android® operating software may support multiple kinds of sensors that sense orientation or change of orientation in the devices.
[0030] These sensors of mobile devices can include accelerometers, gyroscopic sensors, magnetic field sensors, and so forth. The sensors may provide data for up to three axes or more, and may report data in forms such as directional vectors, rotation vectors, or changes in orientation, inclination, or position. Information from multiple sensors can be combined, such as combining higher-frequency data from a gyroscopic sensor with lower-frequency data from an accelerometer for a more accurate determination of orientation. Orientation and motion can be determined using various other kinds of sensors, such as touch sensors to determine that a smart phone is being held in contact with a user's head, or proximity sensors (for example, electrostatic or acoustic proximity sensors) to determine that an object is obscuring a display due to the device's orientation with respect to the object.
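One common way to combine higher-frequency gyroscope data with lower-frequency accelerometer data, as described in paragraph [0030], is a complementary filter. The Python sketch below is illustrative only; the function name, blending coefficient, and sample values are assumptions, not taken from the specification:

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyroscope rate (responsive but drift-prone)
    with an accelerometer-derived angle (noisy but drift-free)."""
    gyro_angle = angle_prev + gyro_rate * dt      # integrate angular velocity
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle

# Example: starting level, gyro reports 10 deg/s, accelerometer reads 0.5 deg,
# sampled at 100 Hz for 10 samples.
angle = 0.0
for _ in range(10):
    angle = complementary_filter(angle, 10.0, 0.5, 0.01)
```

Because the accelerometer term is weighted only 0.02 per sample, the short-term estimate follows the gyroscope while the accelerometer slowly corrects long-term drift.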
[0031] Orientation or motion may of course be determined with respect to various references, such as the direction of gravity, a presumed surface such as a table surface, or a proximate object, for example a side of a user's head or the inside of a pocket or container.
[0032] FIG. 3, FIG. 4, and FIG. 5 illustrate in flowcharts representative forms of preferred method embodiments.
[0033] FIG. 3 describes a form of a preferred embodiment in which the circumstance determined from data of an orientation sensor is that the mobile device is in motion, or its orientation is changing at a rate that makes details of a video image (either being accepted or being rendered) less perceivable. In some forms, the sensor is an accelerometer, and the direction of gravity ("down") can be determined, as can the orientation of the device with respect to "down"; orientation or changes in orientation with respect to other axes can also be determined, for example by integrating accelerometer measurements to determine a relative change in orientation. In other forms, the sensor is a gyroscopic sensor. [0034] In the representative form of a preferred embodiment of Fig. 3, the steps of the process may be performed by a processor of a mobile device according to programming instructions stored in a memory of the device. The process starts at step 310 as a next video image (or frame) becomes available from a camera of the device; the next video image is accepted from the camera at 320. At 330 the processor accepts sensor data from a sensor responsive to orientation (accelerometer, gyroscopic sensor, etc.). Next, at 340, the processor determines the orientation of the device, such as with respect to the three axes defined by the plane of a touchscreen of the device (see discussion for FIG. 2).
Subsequently, at 350, the processor fetches from memory a stored value for a prior orientation of the device, and compares it with the current orientation determined in step 340.
[0035] At 360, the processor determines whether the orientation of the device has changed by more than a predetermined threshold. If not, the method continues to 380, where the video data is processed "normally". If so, the method continues to 370, where the video data is processed in a fashion that uses a different amount of a resource: for example, the data may not be transmitted, or some video frames or images may be skipped, or data may be encoded at a lesser resolution. In some embodiments, the predetermined threshold may be determined at least in part dynamically, for example based on a history of how the device has been moved or oriented before, or based on an input from another sensor, or based on the device having held a previous orientation for a period of time.
[0036] Subsequent to either of steps 370 or 380, the processor updates the stored values for the prior orientation of the device with data of the current orientation; the method then completes and may continue to other operations at 395.
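The decision logic of steps 340 through 390 of FIG. 3 can be sketched as follows. This is a minimal illustration: the threshold value, the per-axis change measure, and the two processing functions are assumptions for the sketch, not part of the specification:

```python
ORIENTATION_THRESHOLD = 15.0  # degrees; illustrative value only

def orientation_change(current, prior):
    """Largest per-axis orientation change, in degrees."""
    return max(abs(c - p) for c, p in zip(current, prior))

def process_normally(frame):
    return ("normal", frame)            # step 380

def process_reduced(frame):
    return ("reduced", frame)           # step 370: e.g. skip or re-encode

def process_frame(frame, current_orientation, prior_orientation):
    """Compare current vs. stored orientation and choose a processing
    path; return the result and the orientation to store for next time."""
    if orientation_change(current_orientation, prior_orientation) > ORIENTATION_THRESHOLD:
        result = process_reduced(frame)
    else:
        result = process_normally(frame)
    return result, current_orientation  # update stored prior orientation
```

Here `process_reduced` merely tags the frame; an actual implementation might drop it, transmit nothing, or encode at lower resolution, as the specification describes.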
[0037] In alternative forms of the embodiment, Figure 4 describes a form of a preferred embodiment in which the circumstance determined is that the device is oriented such that the desired image is not visible, or would be less visible. In some forms the sensor is a proximity sensor; in others it may be a touch sensor such as a touch surface of the mobile device; in others, without limitation, the sensor is an accelerometer, or may be a combination of sensors. The embodiment starts at 410 as a next video image (or frame) becomes available from a camera of the mobile device; the next video image is accepted from the camera as shown at 420. At 430 the processor accepts sensor data from a sensor. At 440 the processor of the device determines the orientation of the device, and at 450 determines whether the orientation corresponds to an orientation such that the image is less visible, or not visible, to a user.
[0038] In some forms, the orientation is that the device is face-down, and presumed to be on a surface that blocks the camera or the user's view of the display. In some embodiments the orientation is determined by an accelerometer; in others the orientation is determined both by an accelerometer and by proximity or contact data for the side of the device bearing the camera or the display. In others, an orientation in which the camera or display is blocked is determined by contact or proximity sensor data indicating that the device may be held up to the side of a user's head, as in a voice chat or telephone chat.
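A face-down orientation of the kind described in paragraph [0038] can be inferred from the accelerometer alone: when the device lies flat with its screen toward a surface, gravity appears as roughly -9.8 m/s² on the axis normal to the screen. The sketch below assumes a z-axis pointing out of the display and an illustrative tolerance; both are assumptions, not values from the specification:

```python
GRAVITY = 9.81  # m/s^2

def is_face_down(accel_xyz, tolerance=1.5):
    """True when the z-axis (normal to the screen, positive out of the
    display) reads close to -g and the other axes are near zero, i.e.
    the device is presumed to be lying screen-down on a surface."""
    ax, ay, az = accel_xyz
    return (abs(az + GRAVITY) < tolerance
            and abs(ax) < tolerance
            and abs(ay) < tolerance)
```

A combined determination, as the paragraph suggests, would AND this accelerometer test with a proximity or contact reading for the display side of the device.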
[0039] At 460, the processor determines whether the image is less visible than a threshold. If not, the image data is processed normally, as indicated at 480. If so, the image or video data is processed in a fashion that consumes less of a resource, as indicated at 470. Subsequent to either of steps 470 or 480, the method completes and processing may continue to further operations as shown at 495.
[0040] In some forms of a preferred embodiment, the processing at more optimal consumption of a resource as indicated at 470 may be that video information is processed or encoded at lower temporal resolution (e.g. not all frames are sent), or with less image detail (e.g. lower image resolution, lower color resolution). Such techniques, and others, may be applied in this step as a matter of design choice. It will readily be apparent that the invention is not limited in this or similar steps to processing techniques that may or may not be generally known in the art today.
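The lower temporal resolution mentioned in paragraph [0040] can be as simple as forwarding only every Nth frame for encoding or transmission. A minimal sketch, where the function name and decimation factor are illustrative choices rather than part of the specification:

```python
def decimate_frames(frames, keep_every=3):
    """Reduce temporal resolution by keeping only every Nth frame,
    so fewer frames are encoded or transmitted."""
    return [f for i, f in enumerate(frames) if i % keep_every == 0]

reduced = decimate_frames(list(range(10)), keep_every=3)
# keeps frames 0, 3, 6, 9
```

Reducing image detail instead (lower spatial or color resolution) would typically be delegated to the encoder's rate-control settings rather than done per-frame like this.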
[0041] Fig. 5 shows one form of a preferred embodiment in which the mobile device is a rendering device for video data. Details that are readily apparent, or readily apparent from the discussion of other figures, are omitted for brevity.
[0042] The embodiment starts at 510; at step 520 the processor of the device receives data representing a next video image; in many forms the data is received via a network connection, or read from a data storage device. At 530 the processor accepts sensor data from a sensor that may be used to determine an orientation of the device. At 540 the processor of the device determines the orientation of the device from the sensor data of 530. At 550 the processor compares the current orientation as determined at 540 with a prior orientation of the device.
[0043] At 560, the processor determines whether the change in orientation (e.g. the motion) of the device is greater than a threshold. If not, the video data is rendered to display an image normally, as indicated at 580. If so, the image or video data is processed in a fashion that consumes less of a resource, as indicated at 570. Subsequent to either of steps 570 or 580, the method continues as shown at 590 to update the information about the prior orientation, and the method then completes and processing may continue to further operations as shown at 595.
[0044] As will be easily appreciated, the above exemplary embodiments apply both to a device which is originating data, such as a device accepting images from a camera of the device, and processing that information, such as for storage or transmission, and also to a device that is receiving data, such as video data originating from another device or from storage, and processing that information, such as for display or local storage.
[0045] It is readily apparent that there are many variations on the steps and the ordering of the steps of the exemplary embodiments above, and techniques of the invention may be applied to other kinds of information than video information, and to other kinds and combinations of sensors and sensor data. Further, the step of determining whether a change in orientation is greater than a threshold may be implemented in multiple forms, such as by determining a derivative, an integral, or another measure of change in orientation, and the threshold may be either a minimum or maximum threshold, or a combination of multiple factors, such as psycho-perceptual criteria. Thresholds used are a matter of design choice, and may be determined experimentally. There may be multiple thresholds, and multiple forms of processing data at different rates of consumption, and the thresholds may be adaptable (such as being adapted in response to the mobile device having been in motion for a period of time), or settable.
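The "derivative" measure of change and the adaptable threshold described in paragraph [0045] can be sketched together: the angular rate is a finite difference between successive samples, and the threshold relaxes while motion has been sustained, so continuous movement does not keep toggling the processing mode. The window length and adaptation rule below are design-choice assumptions, not part of the specification:

```python
from collections import deque

class MotionGate:
    """Per-sample decision of whether orientation is changing faster
    than a threshold, with a threshold that adapts under sustained motion."""

    def __init__(self, base_threshold=15.0, window=5):
        self.base = base_threshold          # deg/s; illustrative value
        self.rates = deque(maxlen=window)   # recent angular rates

    def exceeds(self, angle_now, angle_prev, dt):
        rate = abs(angle_now - angle_prev) / dt   # finite-difference derivative
        self.rates.append(rate)
        # Adaptive threshold: if every recent sample was already above the
        # base rate, double the threshold so sustained motion settles into
        # one processing mode instead of oscillating.
        threshold = self.base
        if len(self.rates) == self.rates.maxlen and min(self.rates) > self.base:
            threshold = 2.0 * self.base
        return rate > threshold
```

An integral-style measure would instead accumulate `rate * dt` over the window; either fits the "multiple forms" the paragraph contemplates.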
[0046] Steps may also be performed in multiple components, for example the steps of processing video data depending on a determination of change in orientation may be performed on a separate processor or server. [0047] As is readily apparent, the representative embodiment forms of FIG. 3, FIG. 4, and FIG. 5 differ from an embodiment of merely transforming an image to compensate for motion of a video camera (a form of image stabilization) by detecting motion of the video camera and transforming the image to reduce the resulting apparent motion of the image. It will be further apparent that they differ from an embodiment of merely using information of a sensor of motion in determining how to encode for motion in encoding or decoding of video information to maintain a desired level of detail for an image. It will also be readily appreciated that they differ from embodiments of power management that merely disable power to components or subsystems, or place components or subsystems into a nonfunctional "sleep" mode, in response to a period of inactivity, in response to sensing that available battery or other power is low, or in response to a specific input.
[0048] The above specification, examples and data provide a complete description of the manufacture and use of the composition of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.
[0049] What is claimed is:

Claims

1. A method comprising:
in a mobile device comprising at least a display, a processor and a memory, reading data of a sensor of the device;
determining from the sensor data that the device is in an operating circumstance for processing the data in a fashion that consumes a different amount of a resource; and
processing the data in a fashion that consumes a different amount of the resource.
2. The method of claim 1 wherein the data that is read is multimedia data.
3. The method of claim 1 wherein the operating circumstance is that the mobile device is in an orientation.
4. The method of claim 3 wherein the orientation is an orientation in which the mobile device is displaying the data and video information is not readily seen by a user.
5. The method of claim 1 wherein the operating circumstance is that there is insufficient ambient light for a camera of the mobile device to produce usable images.
6. The method of claim 1 wherein a quantity, a form, or a type of data processed or encoded is determined based on the sensor reading.
7. The method of claim 1 wherein processing the data comprises computational or communications processing of media data.
8. The method of claim 1 wherein the resource is a limited resource.
9. The method of claim 8 wherein the limited resource is battery power.
10. The method of claim 8 wherein the limited resource is bandwidth.
11. The method of claim 8 wherein the limited resource is processing for real time audio and/or video data.
12. A method comprising:
in a mobile device comprising at least a display, a processor and a memory, receiving video data;
receiving sensor data from an orientation sensor in the mobile device;
determining a current orientation of the mobile device from the received sensor data; comparing the current orientation of the mobile device with a prior orientation of the mobile device;
if the current change in orientation exceeds a threshold orientation, processing the received video data at a reduced rate of consumption; and
if the current change in orientation does not exceed the threshold orientation, processing the received video data at a normal rate of consumption.
PCT/US2013/034946 2013-04-02 2013-04-02 Managing use of resources in mobile devices WO2014163621A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2013/034946 WO2014163621A1 (en) 2013-04-02 2013-04-02 Managing use of resources in mobile devices

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2013/034946 WO2014163621A1 (en) 2013-04-02 2013-04-02 Managing use of resources in mobile devices

Publications (1)

Publication Number Publication Date
WO2014163621A1 true WO2014163621A1 (en) 2014-10-09

Family

ID=51658751

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/034946 WO2014163621A1 (en) 2013-04-02 2013-04-02 Managing use of resources in mobile devices

Country Status (1)

Country Link
WO (1) WO2014163621A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107222612A (en) * 2017-04-18 2017-09-29 广东小天才科技有限公司 The background application method for closing and device of a kind of mobile terminal
CN107306445A (en) * 2016-04-17 2017-10-31 联发科技股份有限公司 Poewr control method and its device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070075127A1 (en) * 2005-12-21 2007-04-05 Outland Research, Llc Orientation-based power conservation for portable media devices
US20070257928A1 (en) * 2006-05-04 2007-11-08 Richard Marks Bandwidth Management Through Lighting Control of a User Environment via a Display Device
US20130040710A1 (en) * 2011-02-16 2013-02-14 Michael Lockwood Mobile device display management

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070075127A1 (en) * 2005-12-21 2007-04-05 Outland Research, Llc Orientation-based power conservation for portable media devices
US20070257928A1 (en) * 2006-05-04 2007-11-08 Richard Marks Bandwidth Management Through Lighting Control of a User Environment via a Display Device
US20130040710A1 (en) * 2011-02-16 2013-02-14 Michael Lockwood Mobile device display management

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PHONES4U.: "How To Use Quick Mute On The Samsung Galaxy S3 (S III) - Phones 4u.", 2012, Retrieved from the Internet <URL:http://www.youtube.com/watch?feature=player_embedded&v=_NoyXRO60Ms> *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107306445A (en) * 2016-04-17 2017-10-31 联发科技股份有限公司 Poewr control method and its device
CN107222612A (en) * 2017-04-18 2017-09-29 广东小天才科技有限公司 The background application method for closing and device of a kind of mobile terminal
CN107222612B (en) * 2017-04-18 2020-03-17 广东小天才科技有限公司 Background application program closing method and device of mobile terminal

Similar Documents

Publication Publication Date Title
US11388403B2 (en) Video encoding method and apparatus, storage medium, and device
CN110022489B (en) Video playing method, device and storage medium
CN108427630B (en) Performance information acquisition method, device, terminal and computer readable storage medium
WO2015035870A1 (en) Multiple cpu scheduling method and device
CN108257104B (en) Image processing method and mobile terminal
US11843652B2 (en) Data processing method and electronic device
CN109302563B (en) Anti-shake processing method and device, storage medium and mobile terminal
US9807646B1 (en) Determining noise levels in electronic environments
CN114095437A (en) Method and device for sending data packet, electronic equipment and storage medium
CN111158815B (en) Dynamic wallpaper blurring method, terminal and computer readable storage medium
US20140292998A1 (en) Managing Use of Resources in Mobile Devices
CN107888975B (en) Video playing method, device and storage medium
CN108965042B (en) Network delay obtaining method and device, terminal equipment and storage medium
WO2014163621A1 (en) Managing use of resources in mobile devices
CN108965701B (en) Jitter correction method and terminal equipment
CN108536272B (en) Method for adjusting frame rate of application program and mobile terminal
CN110543403A (en) power consumption evaluation method and device
US20220174356A1 (en) Method for determining bandwidth, terminal, and storage medium
CN112533065B (en) Method and device for publishing video, electronic equipment and storage medium
CN114071224B (en) Video data processing method, device, computer equipment and storage medium
CN111698512B (en) Video processing method, device, equipment and storage medium
CN111083162B (en) Multimedia stream pause detection method and device
CN114339294A (en) Network jitter confirmation method, device, equipment and storage medium
CN110109813B (en) Information determination method and device for GPU (graphics processing Unit) performance, terminal and storage medium
CN107864294B (en) Do not disturb mode starting method and mobile terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13880829

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13880829

Country of ref document: EP

Kind code of ref document: A1