US9485421B2 - Method and apparatus for operating camera function in portable terminal - Google Patents

Method and apparatus for operating camera function in portable terminal

Info

Publication number
US9485421B2
US9485421B2 (U.S. application Ser. No. 14/104,146)
Authority
US
United States
Prior art keywords
image
touch screen
screen display
input
portable terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US14/104,146
Other versions
US20140098257A1
Inventor
Cheol-Ho CHEONG
Dong-Hoon Kim
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Priority to US14/104,146
Publication of US20140098257A1
Priority to US15/288,361
Application granted
Publication of US9485421B2
Legal status: Active (current)
Anticipated expiration

Classifications

    • H04N5/23229
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B1/00 Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40 Circuits
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04M1/72522
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/62 Control of parameters via user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N5/2258
    • H04N5/232
    • H04N5/23245
    • H04N5/23293
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/52 Details of telephonic subscriber devices including functional features of a camera

Definitions

  • the present disclosure relates to a method and an apparatus for operating a camera function in a portable terminal based on a sensor.
  • cameras 101, 103, and 105 are typically provided in a folder inner portion, a front side, and a rear side, respectively.
  • the camera 101 mounted in the folder inner portion is used for video communication
  • the camera 103 mounted in the front side is used for capturing images of the user
  • the camera 105 mounted in the rear side is used for capturing a different object.
  • the orientation of the captured image may change when the user rotates the portable terminal.
  • as illustrated in FIGS. 2A to 2D, when a user rotates a portable terminal mounting a camera unit therein by 90 degrees, 180 degrees, or 270 degrees and captures an image, the direction of the image displayed on a lens or a Liquid Crystal Display (LCD) screen and the direction of the image actually stored in the portable terminal are different from each other. That is, even when the direction of the portable terminal changes, the user cannot recognize the changed direction through the screen or lens, but the captured result image is rotated according to the direction of the portable terminal and stored or transferred to a different apparatus.
  • since most portable terminals are configured for right-handed users, left-handed users find them inconvenient to use.
  • a portable terminal typically places its shutter button at a position that is easy to press with the right hand; accordingly, a left-handed user must either press a shutter button that is difficult to reach with the left hand or press it with the unfamiliar right hand.
  • An aspect of the present invention is to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention is to provide a method and an apparatus for operating a camera function based on a sensor in a portable terminal.
  • Another aspect of the present invention is to provide a method and an apparatus for operating a multi-camera according to state information of a terminal based on a sensor in the portable terminal.
  • Still another aspect of the present invention is to provide a method and an apparatus for a user interface according to state information of a terminal based on a sensor in the portable terminal.
  • Yet another aspect of the present invention is to provide a method and an apparatus for displaying and storing a captured image according to state information of a terminal based on a sensor in the portable terminal.
  • Still another aspect of the present invention is to provide a method and an apparatus for providing various capturing techniques using a multi-camera in a portable terminal.
  • a method for operating a camera function in a portable terminal includes executing a camera application, obtaining state information representing at least one of a direction and a rotation state of the portable terminal, determining at least one of a sequence of reading an image from a camera sensor and a user interface depending on the state information, and performing the camera function depending on the determined sequence or user interface.
  • an apparatus for operating a camera function in a portable terminal includes a state information recognizer for, when a camera application is executed, obtaining state information representing at least one of a direction and a rotation state of the portable terminal, and a controller for determining at least one of a sequence of reading an image from a camera sensor and a user interface depending on the state information, and controlling the camera function depending on the determined sequence or user interface.
  • FIG. 1 is a view illustrating a portable terminal mounting a multi-camera therein;
  • FIG. 2A, FIG. 2B, FIG. 2C and FIG. 2D are views illustrating result images captured according to a rotation state of a terminal in the conventional portable terminal;
  • FIG. 3 is a block diagram illustrating a portable terminal according to an exemplary embodiment of the present invention.
  • FIG. 4 is a flowchart illustrating a procedure for performing a camera function depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention;
  • FIG. 5 is a flowchart illustrating a procedure for displaying an image depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention;
  • FIG. 6 is a flowchart illustrating a procedure for operating a multi-camera and capturing an image in a portable terminal according to an exemplary embodiment of the present invention;
  • FIG. 7A, FIG. 7B, FIG. 7C and FIG. 7D are views illustrating result images captured according to a rotation state of a terminal in the portable terminal according to an exemplary embodiment of the present invention;
  • FIG. 8A and FIG. 8B are views illustrating a user interface depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention.
  • the state information, i.e., the direction or rotation state of the portable terminal, denotes the direction or rotation state of the camera mounted in the portable terminal.
  • FIG. 3 is a block diagram illustrating a portable terminal according to an exemplary embodiment of the present invention.
  • the controller 300 controls an overall operation of the portable terminal. More particularly, the controller 300 determines a capture camera depending on the state information of the portable terminal, determines and changes a user interface for controlling a camera function, determines an output sequence of an image signal input from a camera sensor to control and process a function for displaying and storing an image in the normal direction.
  • the state information of the portable terminal denotes the direction or rotation state of the portable terminal.
  • the controller 300 may determine a camera to be used for capturing among the two or more cameras depending on the rotation state of the portable terminal.
  • the controller 300 may determine the camera to be used depending on the type of an application executed by a user.
  • the controller 300 controls a function for changing a user interface (for example, a shutter key/icon, a zoom control key/icon, and a capture key/icon) depending on the rotation state of the portable terminal.
  • the controller 300 determines an output sequence of an image signal input from a camera sensor depending on the rotation state of the portable terminal and displays an input image on the screen in the normal direction as illustrated in FIGS. 7A to 7D, and simultaneously, controls and processes a function for storing the input image in the normal direction.
  • the output sequence of the input image signal denotes a sequence in which the controller 300 reads an image from the camera sensor.
  • the controller 300 controls a function for displaying an image such that the captured image stored in the storage unit 320 is displayed in the normal direction when the user views the image.
  • the controller 300 controls a function for capturing an image in a Picture In Picture (PIP) mode using a plurality of cameras.
  • the controller 300 controls and processes various functions for performing operations illustrated in FIGS. 4 to 6 .
  • the camera unit 310 includes at least one camera, drives a camera for capturing under control of the controller 300 , and provides an image signal input via a sensor of the driven camera to the controller 300 .
  • the camera unit 310 provides an image signal input via the sensor to the controller 300 according to an image signal output sequence determined by the controller 300. That is, when the output sequence determined by the controller 300 is a sequence corresponding to the normal direction, the camera unit 310 provides pixel information input from the camera sensor in the sequence of (0, 0), (0, 1), (0, 2), . . . , (0, n−1), (1, 0), (1, 1), . . . , (m−1, n−2), (m−1, n−1) to the controller 300 as illustrated in FIG. 9A.
  • in contrast, when the determined output sequence corresponds to the 180-degree direction, the camera unit 310 provides pixel information input from the camera sensor in the sequence of (m−1, n−1), (m−1, n−2), (m−1, n−3), . . . , (m−2, n−1), (m−2, n−2), . . . , (0, 2), (0, 1), (0, 0) to the controller 300 as illustrated in FIG. 9C.
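The two readout sequences described above can be sketched in code. This is an illustrative sketch, not part of the patent disclosure; the function name and the 2×3 example sensor are hypothetical, and (row, col) indices follow the patent's (m, n) notation.

```python
def readout_order(m, n, rotated_180=False):
    """Yield (row, col) pixel coordinates of an m x n sensor in the chosen sequence."""
    if not rotated_180:
        # Normal direction: (0,0), (0,1), ..., (0,n-1), (1,0), ..., (m-1,n-1)
        for row in range(m):
            for col in range(n):
                yield (row, col)
    else:
        # 180-degree direction: (m-1,n-1), (m-1,n-2), ..., (0,1), (0,0)
        for row in range(m - 1, -1, -1):
            for col in range(n - 1, -1, -1):
                yield (row, col)

# Example with a tiny 2 x 3 sensor:
print(list(readout_order(2, 3)))
# [(0, 0), (0, 1), (0, 2), (1, 0), (1, 1), (1, 2)]
print(list(readout_order(2, 3, rotated_180=True)))
# [(1, 2), (1, 1), (1, 0), (0, 2), (0, 1), (0, 0)]
```

Reading the pixels in the reversed order is equivalent to rotating the frame by 180 degrees, which is why the stored image stays upright without a separate rotation pass.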
  • the storage unit 320 stores various programs and data for an overall operation of the portable terminal, and stores a captured image under control of the controller 300 .
  • the size of a memory map for storing the captured image in the storage unit 320 may correspond to the maximum number of horizontal pixels of the camera.
  • the memory map of the camera has a size of m×m for preventing an angle of view from being lost depending on the rotation state of the portable terminal.
  • if the size of the memory map storing the captured image were fixed according to the horizontal orientation of the portable terminal, the horizontal-to-vertical ratio would differ when the portable terminal performs a vertical capturing operation; the square m×m memory map prevents the angle of view from being lost in this case.
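The square memory-map idea can be illustrated as follows. This is a hedged sketch under the assumption that m is the larger (horizontal) pixel count; the 1600×1200 sensor and the helper names are invented for the example, not taken from the patent.

```python
def memory_map_side(horizontal_pixels, vertical_pixels):
    """Side length of the square (m x m) memory map: the larger pixel count."""
    return max(horizontal_pixels, vertical_pixels)

def frame_fits(width, height, side):
    """A frame fits in the square map if neither dimension exceeds the side."""
    return width <= side and height <= side

# Hypothetical sensor: m = 1600 horizontal, n = 1200 vertical pixels.
side = memory_map_side(1600, 1200)
print(side)                          # 1600 -> an m x m map
print(frame_fits(1600, 1200, side))  # True: landscape (0/180-degree) capture fits
print(frame_fits(1200, 1600, side))  # True: portrait (90/270-degree) capture also fits
```

A map sized only 1600×1200 would crop a 90-degree capture to 1200 rows; the square map avoids that loss of angle of view at the cost of some unused memory.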
  • the state information recognizer 330 detects state information representing the direction or rotation state of the portable terminal, and provides the state information to the controller 300 .
  • the state information recognizer 330 may detect the state information of the portable terminal using one of an accelerometer sensor, a gyroscope sensor, a horizontal angle measuring sensor, a touch sensor, and a user interface. That is, the state information recognizer 330 may detect the direction or rotation state of the portable terminal by measuring an angular velocity when the portable terminal rotates from a reference position using the gyroscope sensor. In addition, the state information recognizer 330 may detect the rotation state by measuring a horizontal angle of the portable terminal using the horizontal angle measuring sensor.
  • the state information recognizer 330 may detect the direction or rotation state of the portable terminal by recognizing applied pressure or static electricity using the touch sensor mounted on the surface of the portable terminal when a user of the portable terminal grips the portable terminal and analyzing the pattern of the surface contacted by the user's hand.
  • the state information recognizer 330 may obtain state information of the portable terminal through an icon provided as a user interface or direction information selected or input by the user.
  • the state information recognizer 330 may use the above-described two or more methods in order to accurately detect the state information of the portable terminal.
  • the display unit 340 displays various state information, numbers, letters, and images occurring during an operation of the portable terminal. More particularly, the display unit 340 displays an image signal input from the camera unit 310 on a screen in real-time and displays a captured image stored in advance on the screen in the normal direction under control of the controller 300 .
  • the display unit 340 may display information regarding a change in a user interface according to the state information of the portable terminal under control of the controller 300. That is, when the portable terminal performs an image capturing operation using a key, the display unit 340 may indicate that the function of each key has changed. When the portable terminal uses a touch screen, the display unit 340 may change and display capturing functions displayed on the screen depending on the rotation state of the portable terminal. For example, as illustrated in FIG. 8A, while the terminal does not rotate, the display unit 340 may display that a key A performs a shutter function and a key B performs a zoom function on a screen.
  • the display unit 340 may display that the key B performs the shutter function and the key A performs the zoom function on the screen.
  • the display unit 340 may display a shutter icon on the right upper end of the screen.
  • the display unit 340 may display the shutter icon on the left upper end of the screen.
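The key and icon remapping described above can be sketched as a simple lookup. This is an illustrative sketch, not the patent's implementation; the key names "A"/"B" follow the figures, and only the 0- and 180-degree cases from the text are shown.

```python
def key_layout(rotation_deg):
    """Return the UI assignment for the given rotation state (0 or 180 degrees)."""
    state = rotation_deg % 360
    if state == 0:
        # Terminal not rotated: A = shutter, B = zoom, icon on the right upper end.
        return {"A": "shutter", "B": "zoom", "shutter_icon": "upper-right"}
    if state == 180:
        # Terminal upside down: functions swap, icon moves to the left upper end.
        return {"A": "zoom", "B": "shutter", "shutter_icon": "upper-left"}
    raise ValueError("only 0 and 180 degrees are illustrated here")

print(key_layout(0))
print(key_layout(180))
```

Swapping the assignments rather than the physical keys keeps the shutter under the same finger whichever way the terminal is held, which is also what makes the layout usable for left-handed users.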
  • the display unit 340 may display an image in a Picture In Picture (PIP) mode under control of the controller 300 as illustrated in FIGS. 10A and 10B .
  • the input unit 350 has at least one function key or touch sensor to provide data input by a user to the controller 300 . That is, the input unit 350 provides data corresponding to a key pressed by a user to the controller 300 or provides the coordinate of a position touched by the user to the controller 300 .
  • the communication unit 360 transmits/receives a signal to/from an external apparatus via a wired line or wirelessly under control of the controller 300 .
  • the communication unit 360 may transmit a captured image to a different portable terminal connected wirelessly and may transmit a captured image to a computer system connected via a wired line under control of the controller 300 .
  • FIG. 4 is a flowchart illustrating the process for performing a camera function depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention.
  • the portable terminal determines whether an application that requires a camera operation is executed, and if so, the portable terminal obtains the state information of the portable terminal, i.e., information representing the direction or rotation state through the state information recognizer 330 in step 403 .
  • the rotation state of the portable terminal may be divided into a 0-degree rotation state, a 90-degree rotation state, a 180-degree rotation state, and a 270-degree rotation state. Accordingly, each rotation state spans ±45 degrees. That is, when the portable terminal has a rotation state between 45 degrees and 135 degrees, the rotation state of the portable terminal may be determined as the 90-degree rotation state.
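The ±45-degree bucketing above amounts to snapping a measured angle to the nearest quarter turn. A minimal sketch, with the function name invented for illustration:

```python
def rotation_state(angle_deg):
    """Map a measured orientation angle (degrees) to the nearest of 0/90/180/270."""
    return (round(angle_deg / 90) * 90) % 360

print(rotation_state(10))    # 0
print(rotation_state(60))    # 90 (between 45 and 135 degrees)
print(rotation_state(134))   # 90
print(rotation_state(200))   # 180
print(rotation_state(300))   # 270
```

In practice a small hysteresis band is often added around the 45-degree boundaries so the UI does not flicker between states; the patent text does not specify this detail.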
  • description is made on the assumption that the case illustrated in FIG. 7A is a normal state of the portable terminal, i.e., a 0-degree rotation state.
  • the portable terminal proceeds to step 405 to determine a camera to be used for executing the application.
  • for example, when the portable terminal includes a horizontal mode camera and a vertical mode camera, and the portable terminal rotates by 0 degrees or 180 degrees, the portable terminal determines the horizontal mode camera. In contrast, when the portable terminal rotates by 90 degrees or 270 degrees, the portable terminal may determine the vertical mode camera.
  • the horizontal mode camera has more horizontal pixels than vertical pixels, and is therefore more suitable for images whose horizontal length is greater than their vertical length.
  • the vertical mode camera has more vertical pixels than horizontal pixels, and is therefore more suitable for images whose vertical length is greater than their horizontal length.
  • the portable terminal may determine the camera with consideration of whether the folder or slide of the portable terminal is opened and the executed application, and may simply determine the camera selected by the user through a key button or an icon provided to the portable terminal in step 405 .
  • the portable terminal determines a user interface corresponding to state information of the portable terminal and an output sequence of an input image in step 407. That is, as illustrated in FIGS. 8A and 8B, the portable terminal determines the user interface used for capturing depending on the rotation state of the portable terminal. For example, as illustrated in FIG. 8A, while the portable terminal rotates by 0 degrees, the portable terminal determines the user interface such that the key A performs the shutter function and the key B performs the zoom function. In contrast, when the portable terminal rotates by 180 degrees, the portable terminal determines the user interface such that the key B performs the shutter function and the key A performs the zoom function. In addition, as illustrated in FIG. 8B, the portable terminal determines the user interface such that a shutter icon exists on the right upper end of the screen. In contrast, when the portable terminal rotates by 180 degrees, the portable terminal may determine the user interface such that the shutter icon exists on the left upper end of the screen. Further, the portable terminal determines a sequence in which an input image from the camera sensor is output on the screen. For example, when the portable terminal rotates by 0 degrees, the portable terminal determines a normal direction sequence so that pixel information input from the camera sensor is output on the screen in the sequence of (0, 0), (0, 1), (0, 2), . . . , (0, n−1), (1, 0), (1, 1), . . . , (m−1, n−2), (m−1, n−1) as illustrated in FIG. 9A.
  • the portable terminal determines a 180-degree direction sequence so that pixel information input from the camera sensor is output on the screen in the sequence of (m−1, n−1), (m−1, n−2), (m−1, n−3), . . . , (m−2, n−1), (m−2, n−2), . . . , (0, 2), (0, 1), (0, 0) as illustrated in FIG. 9C.
  • the portable terminal drives the determined camera in step 409 , and displays an input image on the screen according to the determined output sequence and displays the determined user interface on the screen in step 411 .
  • the portable terminal determines whether the state information of the portable terminal changes in step 413 . That is, the portable terminal determines whether the portable terminal rotates according to the user's motion while the portable terminal drives the camera and performs a screen display operation. When the state information of the portable terminal changes, the portable terminal proceeds to step 415 to determine a user interface and an input image output sequence depending on the changed state information and returns to step 411 .
  • the portable terminal proceeds to step 417 to determine whether an image capture event occurs. For example, the portable terminal determines whether a shutter key or a shutter icon is input, or whether a predetermined automatic capture condition is met. When the image capture event does not occur, the portable terminal returns to step 413 to re-perform subsequent steps.
  • the portable terminal proceeds to step 419 to store a relevant image according to the determined input image output sequence.
  • the portable terminal stores a relevant image in the storage unit 320 according to the image output sequence determined depending on the rotation state of the portable terminal. Therefore, as illustrated in FIGS. 7A to 7D, the portable terminal may store a captured image such that the image is always stored in the normal direction even when the portable terminal captures the image in a rotated state.
  • FIG. 5 is a flowchart illustrating the process for displaying an image depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention.
  • the portable terminal determines whether an event for displaying a captured image occurs in step 501 . For example, the portable terminal determines whether a view menu for viewing a captured image stored in the portable terminal is selected.
  • the portable terminal obtains the state information of the portable terminal, i.e., information representing the direction or rotation state in step 503 .
  • the portable terminal determines an output sequence of a relevant captured image depending on the state information in step 505 .
  • when the state information represents the 0-degree rotation state, the portable terminal determines output on the screen in the sequence of (0, 0), (0, 1), (0, 2), . . . , (0, n−1), (1, 0), (1, 1), . . . , (m−1, n−2), (m−1, n−1) as illustrated in FIG. 9A.
  • when the state information represents the 180-degree rotation state, the portable terminal determines output on the screen in the sequence of (m−1, n−1), (m−1, n−2), (m−1, n−3), . . . , (0, 2), (0, 1), (0, 0) as illustrated in FIG. 9C.
  • the portable terminal reads the relevant captured image from the storage unit 320 and displays the same on the screen according to the determined output sequence in step 507 . Accordingly, the user may view a captured image displayed in the normal direction even when the portable terminal rotates.
  • FIG. 6 is a flowchart illustrating the process for operating a multi-camera and capturing an image in a portable terminal according to an exemplary embodiment of the present invention.
  • the portable terminal determines whether an application that requires operations of two cameras is executed in step 601 . For example, the portable terminal determines whether a PIP mode capture menu is selected.
  • the portable terminal determines two cameras to be used for the application in step 603 .
  • the portable terminal may determine the two cameras with consideration of the state information of the portable terminal, whether a folder or a slide is opened, and the executed application.
  • the portable terminal may simply determine the camera(s) selected by a user through a key button or an icon provided to the portable terminal.
  • the portable terminal determines the purposes of the two cameras in step 605 .
  • the portable terminal may determine one of the two cameras for the purpose of a background image capturing, and may determine the other camera for the purpose of a sub image capturing.
  • Such purposes of the cameras may be automatically determined depending on a camera position mounted on the portable terminal, or may be manually selected by a user. That is, the portable terminal may determine a camera mounted in the rear side for the purpose of a background image capturing, and may determine a camera mounted in the front side for the purpose of a sub image capturing.
  • the portable terminal determines a first camera image display region and a second camera image display region in step 607 .
  • the two camera image display regions may be automatically determined according to a predetermined scheme, and may be manually determined by a user.
  • the portable terminal drives the first camera to display an image input from the first camera on the first camera image display region, and drives the second camera to display an image input from the second camera on the second camera image display region in step 609 .
  • the portable terminal determines a camera mounted in the front side of the portable terminal as a camera for a sub capturing, determines a camera mounted in the rear side of the portable terminal as a camera for a background capturing, displays an image input from the camera for the background capturing as a background image of the screen, and displays an image input from the camera for the sub capturing on a predetermined region of the background image as illustrated in FIG. 10A .
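The PIP composition described above can be sketched as overlaying the sub (front-camera) frame onto a region of the background (rear-camera) frame. An illustrative sketch only, assuming frames are nested lists of pixel values; the function name and frame sizes are invented:

```python
def composite_pip(background, sub, top=0, left=0):
    """Return a copy of `background` with `sub` pasted at row `top`, column `left`."""
    out = [row[:] for row in background]          # copy so the background frame is untouched
    for r, row in enumerate(sub):
        for c, pixel in enumerate(row):
            out[top + r][left + c] = pixel
    return out

bg = [["B"] * 4 for _ in range(3)]    # 3 x 4 rear-camera (background) frame
sub = [["S"] * 2 for _ in range(2)]   # 2 x 2 front-camera (sub) frame
print(composite_pip(bg, sub, top=1, left=2))
```

Moving or resizing the sub region in the edit mode mentioned below then simply changes the `top`/`left` offsets (and the sub frame's scale) before compositing.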
  • the display icon is changed to an edit mode (shown by the highlighted box) so that the size of the display region can be altered by dragging the box to a desired position.
  • the display portion for sub capturing may be configured and altered.
  • the portable terminal may configure a user interface depending on the state information of the portable terminal, and may determine an output sequence of an image signal input from each camera sensor depending on the state information of the portable terminal.
  • the portable terminal determines whether an image capture event occurs in step 611 . For example, the portable terminal determines whether a shutter key or a shutter icon is input, or whether an automatic capture condition set in advance is met. When the image capture event does not occur, the portable terminal determines whether an event for changing an image display region occurs according to user manipulation in step 615 . When the display region change event does not occur, the portable terminal returns to step 611 . When the display region change event occurs, the portable terminal proceeds to step 617 to change the first camera image display region and the second camera image display region and display an image input from each camera according to the user manipulation, and then returns to step 611 .
  • the portable terminal proceeds to step 613 to store a relevant image according to the determined input image output sequence, and ends the algorithm according to an exemplary embodiment of the present invention.
  • the above-described methods according to the present invention can be realized in hardware or as software or computer code that can be stored in a recording medium such as a CD-ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or downloaded over a network, so that the methods described herein can be executed by such software using a general purpose computer, a special processor, or programmable or dedicated hardware, such as an ASIC or FPGA.
  • the computer, the processor, or the programmable hardware includes memory components, e.g., RAM, ROM, Flash, etc., that may store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the processing methods described herein.
  • Exemplary embodiments of the present invention detect the rotation state of a camera and automatically change the camera image to the normal direction for display and storage, thus providing a more convenient interface to the user depending on the rotation state of the camera. This is achieved by operating the camera function depending on sensor-based state information of the portable terminal.
  • exemplary embodiments may also be useful when a camera of the portable terminal is used for a web cam function, and when synthesizing a screen using a plurality of cameras.
  • the portable terminal may be used in any orientation, so that usability improves.

Abstract

A method and an apparatus for operating a camera function in a portable terminal are provided. State information representing at least one of a direction and a rotation state of the portable terminal is obtained, and at least one of a sequence of reading an image from a camera sensor and an appropriate user interface is determined depending on the state information. Thus, the camera function is performed depending on the determined sequence or user interface.

Description

CROSS REFERENCE TO RELATED APPLICATIONS
The present application is a Continuation of U.S. patent application Ser. No. 13/111,227 filed on May 19, 2011, which claims the benefit under 35 U.S.C. §119 of a Korean patent application filed in the Korean Intellectual Property Office on Jun. 7, 2010 and assigned Serial No. 10-2010-0053223, the entire disclosure of which is hereby incorporated by reference.
BACKGROUND
1. Technical Field
The present disclosure relates to a method and an apparatus for operating a camera function in a portable terminal based on a sensor.
2. Description of the Related Art
Recently, as users' interest in digital cameras rapidly increases, portable terminals equipped with a plurality of cameras are being provided. These cameras are used for capturing images at different angles or for different purposes. For example, as illustrated in FIG. 1, cameras 101, 103, and 105 are typically provided in a folder inner portion, a front side, and a rear side, respectively. The camera 101 mounted in the folder inner portion is used for video communication, the camera 103 mounted in the front side is used for capturing images of the user, and the camera 105 mounted in the rear side is used for capturing a different object.
Most cameras mounted in conventional portable terminals are fixed at a specific position and thus have some drawbacks. In operation, the orientation of a captured image may change due to the user's motion of rotating the portable terminal. For example, as illustrated in FIGS. 2A to 2D, when a user rotates a portable terminal equipped with a camera unit by 90 degrees, 180 degrees, or 270 degrees and captures an image, the direction of the image displayed on a lens or a Liquid Crystal Display (LCD) screen and the direction of the image actually stored in the portable terminal differ from each other. That is, even when the direction of the portable terminal changes, the user cannot recognize the changed direction through the screen or lens, yet the captured result image is rotated according to the direction of the portable terminal and stored or transferred to a different apparatus.
Meanwhile, since most portable terminals are configured for right-handed users, left-handed users find them inconvenient. For example, a portable terminal has a shutter button placed at a position that is easy to press with the right hand; a left-handed user must therefore either press a shutter button that is difficult to reach with the left hand or press it with the unfamiliar right hand.
SUMMARY
An aspect of the present invention is to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present invention is to provide a method and an apparatus for operating a camera function based on a sensor in a portable terminal.
Another aspect of the present invention is to provide a method and an apparatus for operating a multi-camera according to state information of a terminal based on a sensor in the portable terminal.
Still another aspect of the present invention is to provide a method and an apparatus for a user interface according to state information of a terminal based on a sensor in the portable terminal.
Yet another aspect of the present invention is to provide a method and an apparatus for displaying and storing a captured image according to state information of a terminal based on a sensor in the portable terminal.
Further yet another aspect of the present invention is to provide a method and an apparatus for providing various capturing techniques using a multi-camera in a portable terminal.
In accordance with an aspect of the present invention, a method for operating a camera function in a portable terminal includes executing a camera application, obtaining state information representing at least one of a direction and a rotation state of the portable terminal, determining at least one of a sequence of reading an image from a camera sensor and a user interface depending on the state information, and performing the camera function depending on the determined sequence or user interface.
In accordance with another aspect of the present invention, an apparatus for operating a camera function in a portable terminal includes a state information recognizer for, when a camera application is executed, obtaining state information representing at least one of a direction and a rotation state of the portable terminal, and a controller for determining at least one of a sequence of reading an image from a camera sensor and a user interface depending on the state information, and controlling the camera function depending on the determined sequence or user interface.
Other aspects, advantages and salient features of the invention will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the invention.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other aspects, features and advantages of certain exemplary embodiments of the present invention will be more apparent from the following description taken in conjunction with the accompanying drawings in which:
FIG. 1 is a view illustrating a portable terminal mounting a multi-camera therein;
FIG. 2A, FIG. 2B, FIG. 2C and FIG. 2D are views illustrating result images captured according to a rotation state of a terminal in the conventional portable terminal;
FIG. 3 is a block diagram illustrating a portable terminal according to an exemplary embodiment of the present invention;
FIG. 4 is a flowchart illustrating a procedure for performing a camera function depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention;
FIG. 5 is a flowchart illustrating a procedure for displaying an image depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention;
FIG. 6 is a flowchart illustrating a procedure for operating a multi-camera and capturing an image in a portable terminal according to an exemplary embodiment of the present invention;
FIG. 7A, FIG. 7B, FIG. 7C and FIG. 7D are views illustrating result images captured according to a rotation state of a terminal in the portable terminal according to an exemplary embodiment of the present invention;
FIG. 8A and FIG. 8B are views illustrating a user interface depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention;
FIG. 9A, FIG. 9B, FIG. 9C and FIG. 9D are views illustrating a sequence of reading an image depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention; and
FIG. 10A and FIG. 10B are views illustrating an example of operating a multi-camera and capturing an image in a portable terminal according to an exemplary embodiment of the present invention.
Throughout the drawings, like reference numerals will be understood to refer to like parts, components and structures.
DETAILED DESCRIPTION
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the invention as defined by the claims and their equivalents. Also, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
Hereinafter, a case where the camera is fixedly mounted on a predetermined position of a portable terminal is exemplarily described. Accordingly, in the following description, the state information, i.e., the direction or rotation state of the portable terminal denotes the direction or rotation state of the camera mounted in the portable terminal.
FIG. 3 is a block diagram illustrating a portable terminal according to an exemplary embodiment of the present invention.
Referring to FIG. 3, the portable terminal includes a controller 300, a camera unit 310, a storage unit 320, a state information recognizer 330, a display unit 340, an input unit 350, and a communication unit 360.
The controller 300 controls an overall operation of the portable terminal. More particularly, the controller 300 determines a capture camera depending on the state information of the portable terminal, determines and changes a user interface for controlling a camera function, determines an output sequence of an image signal input from a camera sensor to control and process a function for displaying and storing an image in the normal direction. Here, the state information of the portable terminal denotes the direction or rotation state of the portable terminal.
In operation, when the portable terminal has two or more cameras, the controller 300 may determine a camera to be used for capturing among the two or more cameras depending on the rotation state of the portable terminal. The controller 300 may determine the camera to be used depending on the type of an application executed by a user. In addition, the controller 300 controls a function for changing a user interface (for example, a shutter key/icon, a zoom control key/icon, and a capture key/icon) depending on the rotation state of the portable terminal. Moreover, the controller 300 determines an output sequence of an image signal input from a camera sensor depending on the rotation state of the portable terminal and displays an input image on the screen in the normal direction as illustrated in FIGS. 7A to 7D, and simultaneously, controls and processes a function for storing the input image in the normal direction. Here, the output sequence of the input image signal denotes a sequence in which the controller 300 reads an image from the camera sensor.
Furthermore, the controller 300 controls a function for displaying an image such that the captured image stored in the storage unit 320 is displayed in the normal direction when the user views the image. In addition, the controller 300 controls a function for capturing an image in a Picture In Picture (PIP) mode using a plurality of cameras.
The controller 300 controls and processes various functions for performing operations illustrated in FIGS. 4 to 6.
The camera unit 310 includes at least one camera, drives a camera for capturing under control of the controller 300, and provides an image signal input via a sensor of the driven camera to the controller 300. The camera unit 310 provides an image signal input via the sensor to the controller 300 according to an image signal output sequence determined by the controller 300. That is, when the output sequence determined by the controller 300 is a sequence corresponding to the normal direction, the camera unit 310 provides pixel information input from the camera sensor in the sequence of (0, 0), (0, 1), (0, 2), . . . , (0, n−1), (1, 0), (1, 1), . . . , (m−1, n−2), (m−1, n−1) to the controller 300 as illustrated in FIG. 9A. When the output sequence determined by the controller 300 is a sequence corresponding to a 180-degree direction, the camera unit 310 provides pixel information input from the camera sensor in the sequence of (m−1, n−1), (m−1, n−2), (m−1, n−3), . . . , (m−2, n−1), (m−2, n−2), . . . , (0, 2), (0, 1), (0, 0) to the controller 300 as illustrated in FIG. 9C.
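The two read-out orders can be illustrated with a short sketch (Python; the function name is hypothetical and not part of the disclosure). For an m×n sensor, the normal-direction sequence visits pixels row by row starting at (0, 0), while the 180-degree sequence visits the same pixels in exact reverse, beginning at (m−1, n−1):

```python
def read_order(m, n, rotated_180=False):
    """Yield (row, col) sensor coordinates in the order the controller
    reads them: row-major for the normal direction (FIG. 9A), fully
    reversed for the 180-degree direction (FIG. 9C)."""
    coords = [(r, c) for r in range(m) for c in range(n)]
    return list(reversed(coords)) if rotated_180 else coords

# A 2x3 sensor: the normal order starts at (0, 0) and ends at
# (m-1, n-1); the 180-degree order is the exact reverse.
assert read_order(2, 3)[0] == (0, 0)
assert read_order(2, 3)[-1] == (1, 2)
assert read_order(2, 3, rotated_180=True) == list(reversed(read_order(2, 3)))
```

Reading the sensor in reversed order is equivalent to rotating the frame by 180 degrees, which is why the stored image stays in the normal direction.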
The storage unit 320 stores various programs and data for an overall operation of the portable terminal, and stores a captured image under control of the controller 300. Here, the size of a memory map for storing the captured image in the storage unit 320 may correspond to the maximum number of horizontal pixels of the camera. For example, when the maximum number of horizontal pixels of the camera is m, the memory map of the camera has a size of m×m, which prevents the angle of view from being lost regardless of the rotation state of the portable terminal. In contrast, when the size of the memory map is fixed according to the horizontal direction of the portable terminal, a vertical capturing operation changes the horizontal-to-vertical ratio of the frame and the angle of view is lost. That is, when a camera memory map is fixed to 320×240 according to the horizontal direction of the portable terminal and the portable terminal performs a vertical capturing operation, the horizontal-to-vertical ratio of the captured image changes to 240×320, and consequently the captured image is reduced to 240×160 to fit the fixed memory map. The m×m configuration of the memory map prevents this loss of the angle of view.
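The sizing rule can be checked with simple arithmetic (an illustrative Python sketch; names are hypothetical). A square m×m map accommodates the frame in any rotation state, while a map fixed to the landscape orientation cannot hold a rotated frame without reduction:

```python
def fits(frame_w, frame_h, map_size):
    """True when a frame of the given dimensions fits in the memory
    map without being reduced."""
    map_w, map_h = map_size
    return frame_w <= map_w and frame_h <= map_h

m = 320                  # maximum horizontal pixel count of the camera
square_map = (m, m)      # the m x m map from the description
assert fits(320, 240, square_map)        # 0- or 180-degree capture
assert fits(240, 320, square_map)        # 90- or 270-degree capture
assert not fits(240, 320, (320, 240))    # fixed landscape map: angle of view lost
```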
The state information recognizer 330 detects state information representing the direction or rotation state of the portable terminal, and provides the state information to the controller 300. Here, the state information recognizer 330 may detect the state information of the portable terminal using one of an accelerometer sensor, a gyroscope sensor, a horizontal angle measuring sensor, a touch sensor, and a user interface. That is, the state information recognizer 330 may detect the direction or rotation state of the portable terminal by measuring an angular velocity when the portable terminal rotates from a reference position using the accelerometer sensor. In addition, the state information recognizer 330 may detect the rotation state by measuring a horizontal angle of the portable terminal using the horizontal angle measuring sensor. In addition, the state information recognizer 330 may detect the direction or rotation state of the portable terminal by recognizing applied pressure or static electricity, using the touch sensor mounted on the surface of the portable terminal when a user grips the portable terminal, and analyzing the pattern of the surface contacted by the user's hand. In addition, the state information recognizer 330 may obtain state information of the portable terminal through an icon provided as a user interface or through direction information selected or input by the user. Here, the state information recognizer 330 may use two or more of the above-described methods in order to accurately detect the state information of the portable terminal.
The display unit 340 displays various state information, numbers, letters, and images occurring during an operation of the portable terminal. More particularly, the display unit 340 displays an image signal input from the camera unit 310 on a screen in real-time and displays a captured image stored in advance on the screen in the normal direction under control of the controller 300.
In addition, the display unit 340 may display information regarding a change in a user interface according to the state information of the portable terminal under control of the controller 300. That is, when the portable terminal performs an image capturing operation using a key, the display unit 340 may display an indication that the function of each key has changed. When the portable terminal uses a touch screen, the display unit 340 may change the capturing functions displayed on the screen depending on the rotation state of the portable terminal. For example, as illustrated in FIG. 8A, while the terminal does not rotate, the display unit 340 may display that a key A performs a shutter function and a key B performs a zoom function on a screen. When the terminal rotates by 180 degrees, the display unit 340 may display that the key B performs the shutter function and the key A performs the zoom function on the screen. In addition, as illustrated in FIG. 8B, while the portable terminal does not rotate, the display unit 340 may display a shutter icon on the right upper end of the screen. When the portable terminal rotates by 180 degrees, the display unit 340 may display the shutter icon on the left upper end of the screen. Further, the display unit 340 may display an image in a Picture In Picture (PIP) mode under control of the controller 300 as illustrated in FIGS. 10A and 10B.
The input unit 350 has at least one function key or touch sensor to provide data input by a user to the controller 300. That is, the input unit 350 provides data corresponding to a key pressed by a user to the controller 300 or provides the coordinate of a position touched by the user to the controller 300.
The communication unit 360 transmits/receives a signal to/from an external apparatus via a wired line or wirelessly under control of the controller 300. For example, the communication unit 360 may transmit a captured image to a different portable terminal connected wirelessly and may transmit a captured image to a computer system connected via a wired line under control of the controller 300.
FIG. 4 is a flowchart illustrating the process for performing a camera function depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention.
Referring to FIG. 4, in step 401, the portable terminal determines whether an application that requires a camera operation is executed, and if so, the portable terminal obtains the state information of the portable terminal, i.e., information representing the direction or rotation state, through the state information recognizer 330 in step 403. Here, the rotation state of the portable terminal may be divided into a 0-degree rotation state, a 90-degree rotation state, a 180-degree rotation state, and a 270-degree rotation state, with each rotation state covering ±45 degrees. That is, when the portable terminal has a rotation state between 45 degrees and 135 degrees, the rotation state of the portable terminal may be determined as 90 degrees. According to an exemplary embodiment of the present invention, description is made on the assumption that the case illustrated in FIG. 7A is a normal state of the portable terminal, i.e., a 0-degree rotation state.
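The ±45-degree bucketing described above can be sketched as follows (illustrative Python; the function name is hypothetical and not part of the disclosure):

```python
def quantize_rotation(angle_degrees):
    """Map a measured rotation angle to one of the four rotation
    states (0, 90, 180, 270), each covering +/-45 degrees."""
    return (round(angle_degrees / 90) * 90) % 360

assert quantize_rotation(10) == 0
assert quantize_rotation(100) == 90    # between 45 and 135 -> 90 degrees
assert quantize_rotation(200) == 180
assert quantize_rotation(350) == 0     # wraps around to the normal state
```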
When the state information of the portable terminal is obtained, the portable terminal proceeds to step 405 to determine a camera to be used for executing the application. For example, when the portable terminal includes a horizontal mode camera and a vertical mode camera and the portable terminal rotates by 0 degrees or 180 degrees, the portable terminal determines the horizontal mode camera. In contrast, when the portable terminal rotates by 90 degrees or 270 degrees, the portable terminal may determine the vertical mode camera. The horizontal mode camera has more horizontal pixels than vertical pixels and is therefore more suitable for images whose horizontal length is greater than their vertical length; the vertical mode camera has more vertical pixels than horizontal pixels and is therefore more suitable for images whose vertical length is greater than their horizontal length. In addition, the portable terminal may determine the camera with consideration of whether the folder or slide of the portable terminal is opened and of the executed application, or may simply determine the camera selected by the user through a key button or an icon provided on the portable terminal in step 405.
When the camera is determined, the portable terminal determines a user interface corresponding to the state information of the portable terminal and an output sequence of an input image in step 407. That is, as illustrated in FIGS. 8A and 8B, the portable terminal determines the user interface used for capturing depending on the rotation state of the portable terminal. For example, as illustrated in FIG. 8A, while the portable terminal rotates by 0 degrees, the portable terminal determines the user interface such that the key A performs the shutter function and the key B performs the zoom function. In contrast, when the portable terminal rotates by 180 degrees, the portable terminal determines the user interface such that the key B performs the shutter function and the key A performs the zoom function. In addition, as illustrated in FIG. 8B, while the portable terminal rotates by 0 degrees, the portable terminal determines the user interface such that a shutter icon exists on the right upper end of the screen. In contrast, when the portable terminal rotates by 180 degrees, the portable terminal may determine the user interface such that the shutter icon exists on the left upper end of the screen. Further, the portable terminal determines a sequence in which an input image from the camera sensor is output on the screen. For example, when the portable terminal rotates by 0 degrees, the portable terminal determines a normal direction sequence so that pixel information input from the camera sensor is output on the screen in the sequence of (0, 0), (0, 1), (0, 2), . . . , (0, n−1), (1, 0), (1, 1), . . . , (m−1, n−2), (m−1, n−1) as illustrated in FIG. 9A. When the portable terminal rotates by 180 degrees, the portable terminal determines a 180-degree direction sequence so that pixel information input from the camera sensor is output on the screen in the sequence of (m−1, n−1), (m−1, n−2), (m−1, n−3), . . . , (m−2, n−1), (m−2, n−2), . . . , (0, 2), (0, 1), (0, 0) as illustrated in FIG. 9C.
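The key and icon remapping of FIGS. 8A and 8B can be summarized in a small table-driven sketch (illustrative Python; the names are hypothetical, and only the two rotation states discussed here are covered):

```python
def ui_layout(rotation):
    """Return the key functions and shutter-icon position for a
    rotation state, per the 0- and 180-degree cases of FIGS. 8A/8B."""
    if rotation == 0:
        return {"key_a": "shutter", "key_b": "zoom",
                "shutter_icon": "upper-right"}
    if rotation == 180:
        return {"key_a": "zoom", "key_b": "shutter",
                "shutter_icon": "upper-left"}
    raise ValueError("only the 0- and 180-degree states are illustrated")

# Key A and key B swap roles, and the shutter icon mirrors position,
# when the terminal is rotated by 180 degrees.
assert ui_layout(0)["key_a"] == "shutter"
assert ui_layout(180)["key_b"] == "shutter"
assert ui_layout(180)["shutter_icon"] == "upper-left"
```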
The portable terminal drives the determined camera in step 409, and displays an input image on the screen according to the determined output sequence and displays the determined user interface on the screen in step 411.
The portable terminal determines whether the state information of the portable terminal changes in step 413. That is, the portable terminal determines whether the portable terminal rotates according to the user's motion while the portable terminal drives the camera and performs a screen display operation. When the state information of the portable terminal changes, the portable terminal proceeds to step 415 to determine a user interface and an input image output sequence depending on the changed state information and returns to step 411.
In contrast, when the state information of the portable terminal does not change, the portable terminal proceeds to step 417 to determine whether an image capture event occurs. For example, the portable terminal determines whether a shutter key or a shutter icon is input, or whether a predetermined automatic capture condition is met. When the image capture event does not occur, the portable terminal returns to step 413 to re-perform subsequent steps.
When the image capture event occurs, the portable terminal proceeds to step 419 to store a relevant image according to the determined input image output sequence. For example, as illustrated in FIGS. 9A to 9D, the portable terminal stores a relevant image in the storage unit 320 according to the image output sequence determined depending on the rotation state of the portable terminal. Therefore, as illustrated in FIGS. 7A to 7D, the portable terminal may store a captured image such that the image is always stored in the normal direction even when the portable terminal captures the image in a rotated state.
FIG. 5 is a flowchart illustrating the process for displaying an image depending on a rotation state in a portable terminal according to an exemplary embodiment of the present invention.
Referring to FIG. 5, the portable terminal determines whether an event for displaying a captured image occurs in step 501. For example, the portable terminal determines whether a view menu for viewing a captured image stored in the portable terminal is selected.
When the event for displaying the captured image occurs, the portable terminal obtains the state information of the portable terminal, i.e., information representing the direction or rotation state in step 503.
The portable terminal determines an output sequence of a relevant captured image depending on the state information in step 505. For example, when the state information represents a 0-degree rotation state, the portable terminal determines that the image is output on the screen in the sequence of (0, 0), (0, 1), (0, 2), . . . , (0, n−1), (1, 0), (1, 1), . . . , (m−1, n−2), (m−1, n−1) as illustrated in FIG. 9A. When the state information represents a 180-degree rotation state, the portable terminal determines that the image is output on the screen in the sequence of (m−1, n−1), (m−1, n−2), (m−1, n−3), . . . , (m−2, n−1), (m−2, n−2), . . . , (0, 2), (0, 1), (0, 0) as illustrated in FIG. 9C. The portable terminal then reads the relevant captured image from the storage unit 320 and displays it on the screen according to the determined output sequence in step 507. Accordingly, the user may view a captured image displayed in the normal direction even when the portable terminal rotates.
FIG. 6 is a flowchart illustrating the process for operating a multi-camera and capturing an image in a portable terminal according to an exemplary embodiment of the present invention.
Referring to FIG. 6, the portable terminal determines whether an application that requires operations of two cameras is executed in step 601. For example, the portable terminal determines whether a PIP mode capture menu is selected.
When the application that requires the operations of the two cameras is executed, the portable terminal determines two cameras to be used for the application in step 603. Here, the portable terminal may determine the two cameras with consideration of the state information of the portable terminal, whether a folder or a slide is opened, and the executed application. Alternatively, the portable terminal may simply determine a camera(s) selected by a user through a key button or an icon provided on the portable terminal.
Thereafter, the portable terminal determines the purposes of the two cameras in step 605. For example, as illustrated in FIG. 10B, the portable terminal may determine one of the two cameras for the purpose of a background image capturing, and may determine the other camera for the purpose of a sub image capturing. Such purposes of the cameras may be automatically determined depending on a camera position mounted on the portable terminal, or may be manually selected by a user. That is, the portable terminal may determine a camera mounted in the rear side for the purpose of a background image capturing, and may determine a camera mounted in the front side for the purpose of a sub image capturing.
The portable terminal determines a first camera image display region and a second camera image display region in step 607. Here, the two camera image display regions may be automatically determined according to a predetermined scheme, or may be manually determined by a user. The portable terminal drives the first camera to display an image input from the first camera on the first camera image display region, and drives the second camera to display an image input from the second camera on the second camera image display region in step 609. For example, the portable terminal determines a camera mounted in the front side of the portable terminal as a camera for a sub capturing, determines a camera mounted in the rear side of the portable terminal as a camera for a background capturing, displays an image input from the camera for the background capturing as a background image of the screen, and displays an image input from the camera for the sub capturing on a predetermined region of the background image as illustrated in FIG. 10A. As illustrated in FIG. 10B, the display icon is changed to an edit mode (shown by the highlighted box) so that the size of the display screen can be altered by dragging the box to a desired position. Further, as illustrated in FIG. 10B, the display portion for sub capturing may be configured and altered. Here, as illustrated in FIG. 4, the portable terminal may configure a user interface depending on the state information of the portable terminal, and may determine an output sequence of an image signal input from each camera sensor depending on the state information of the portable terminal.
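The PIP composition of step 609 can be sketched as a simple paste of the sub image onto the background image (illustrative Python on nested pixel lists; the names and pixel representation are hypothetical, not part of the disclosure):

```python
def compose_pip(background, sub, top, left):
    """Overlay the sub-camera image on the background-camera image
    at the given display region, returning the composite frame."""
    out = [row[:] for row in background]      # copy; background stays intact
    for r, row in enumerate(sub):
        for c, pixel in enumerate(row):
            out[top + r][left + c] = pixel    # paste the sub image region
    return out

bg = [[0] * 4 for _ in range(4)]              # rear camera: background image
sub = [[1, 1], [1, 1]]                        # front camera: sub image
composite = compose_pip(bg, sub, top=1, left=2)
assert composite[1][2] == 1                   # sub image region overlaid
assert composite[0][0] == 0                   # background preserved elsewhere
```

Moving the sub-image display region in the edit mode of FIG. 10B corresponds to recomputing the composite with different `top`/`left` values.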
The portable terminal determines whether an image capture event occurs in step 611. For example, the portable terminal determines whether a shutter key or a shutter icon is input, or whether an automatic capture condition set in advance is met. When the image capture event does not occur, the portable terminal determines whether an event for changing an image display region occurs according to user manipulation in step 615. When the display region change event does not occur, the portable terminal returns to step 611. When the display region change event occurs, the portable terminal proceeds to step 617 to change the first camera image display region and the second camera image display region and display an image input from each camera according to the user manipulation, and then returns to step 611.
When the image capture event occurs, the portable terminal proceeds to step 613 to store a relevant image according to the determined input image output sequence, and ends the algorithm according to an exemplary embodiment of the present invention.
In the above, a method for maintaining an image in the normal direction by controlling the sequence of reading an image from a camera sensor depending on the rotation state of the portable terminal has been described using, as an example, a case where a camera is fixedly mounted in the portable terminal. However, when the camera mounted in the portable terminal is not fixed, a predetermined pendulum may be mounted such that the center of mass is directed toward the lower portion of the camera module, and the camera module may be made rotatable. In this way, the camera always maintains a horizontal state and thus maintains an image in the normal direction even when the sequence of reading an image from the camera sensor is not controlled.
Note that the above-described methods according to the present invention can be realized in hardware or as software or computer code that can be stored in a recording medium such as a CD ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or downloaded over a network, so that the methods described herein can be executed by such software using a general purpose computer, a special processor, or programmable or dedicated hardware, such as an ASIC or FPGA. As would be understood in the art, the computer, the processor, or the programmable hardware includes memory components, e.g., RAM, ROM, Flash, etc., that may store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the processing methods described herein.
Exemplary embodiments of the present invention detect the rotation state of a camera and automatically change a camera image to the normal direction to provide it to a user and store it, thus providing a more convenient interface to the user depending on the rotation state of the camera. This is achieved by operating a camera function depending on the state information of the sensor-based portable terminal. In addition, exemplary embodiments may be usefully utilized even when a webcam function that uses a camera of the portable terminal is employed, and when a screen is synthesized using a plurality of cameras. Further, even when a cable, a memory card, or a USB device is mounted in the portable terminal and the installation direction of the camera is therefore limited due to space constraints, the direction of the portable terminal may be changed and used, so that usability improves.
Although the invention has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims and their equivalents. Therefore, the scope of the present invention should not be limited to the above-described embodiments but should be determined by not only the appended claims but also the equivalents thereof.

Claims (30)

What is claimed is:
1. A portable device, comprising:
a touch screen display;
at least first and second image sensors; and
a controller configured to perform operations comprising:
obtaining a first image and a second image in real-time via the first and second image sensors, respectively, wherein the first image and the second image from the first and second image sensors are displayed in real-time;
displaying, via the touch screen display, the second image over at least a portion of the first image;
thereafter, changing a displayed location of all regions of the second image and thereby displacing the second image in response to a user input, so as to display a real time composite image;
thereafter, detecting a capture event in respect of the displayed real time composite image; and
responsive to the detected capture event, storing the composite image.
2. The portable device of claim 1, wherein:
the input is a touch related input manipulated by a user.
3. The portable device of claim 1, wherein the operations further comprise:
presenting a menu for a user to determine a shape corresponding to the second image, the menu including one or more selectable shape images.
4. The portable device of claim 3, wherein presenting the menu comprises:
acquiring an input corresponding to a shape image of the one or more selectable shape images; and
presenting the second image in the shape image of the one or more selectable shape images.
5. The portable device of claim 1, wherein the operations further comprise:
obtaining state information representing at least one of a direction or a rotation state of the touch screen display via at least one of an accelerometer sensor, a gyroscope sensor, a horizontal angle measuring sensor, a touch sensor, or a user interface.
6. The portable device of claim 5, wherein the operations further comprise:
acquiring another input via at least one user interface key;
determining a function related to the other input based on state information of the portable device.
7. The portable device of claim 5, wherein the operations further comprise:
determining a region of the touch screen display via which to present a shutter icon, based on state information of the portable device.
8. The portable device of claim 1, wherein the capture event comprises a shutter key input, a shutter icon input or the occurrence of an automatic capture condition input.
9. An apparatus comprising:
a touch screen display;
at least first and second image sensors; and
a controller configured to perform operations comprising:
obtaining a first image and a second image in real-time via the first and second image sensors, respectively, wherein the first image and the second image from the first and second image sensors are displayed in real-time;
presenting the first image via a first region of the touch screen display;
editing, responsive to user input, a size of a particular object delineated within the first image relative to other portions of the first image;
presenting the second image via a second region of the touch screen display, the second region being within a boundary of the first region;
thereafter, displacing all regions of the second image responsive to user manipulation so that the second image appears integrated within the object of the first image, so as to display a real time composite image;
thereafter, detecting a capture event in respect of the displayed real time composite image; and
responsive to the detected capture event, storing the composite image.
10. The apparatus of claim 9, wherein presenting the second image comprises:
determining a shape based on at least one of an input from a user or a specified scheme.
11. The apparatus of claim 9, wherein presenting the second image comprises:
presenting the second image via a third region of the touch screen display based on another input to change a region corresponding to the second image.
12. The apparatus of claim 9, wherein the operations further comprise:
obtaining state information representing at least one of a direction or a rotation state of the touch screen display.
13. The apparatus of claim 12, wherein the operations further comprise:
determining a region of the touch screen display via which to present a shutter icon, based on the state information.
14. The apparatus of claim 12, wherein the operations further comprise:
acquiring another input via at least one user interface key;
determining a function related to the other input based on the state information.
15. The apparatus of claim 14, wherein the function comprises at least one of a capturing function or a zooming function.
16. A method comprising:
obtaining a first image and a second image in real-time via at least one of a plurality of image sensors in response to a first user input, wherein the first image and the second image from the plurality of image sensors are displayed in real-time;
displaying, on a touch screen display, the second image over at least a portion of the first image;
thereafter, changing a displayed location of all regions of the second image and thereby displacing the second image in response to a second user input, so as to display a real time composite image;
thereafter, detecting a capture event in respect of the displayed real time composite image; and
responsive to the detected capture event, storing the composite image.
17. The method of claim 16, wherein displaying the second image comprises:
presenting a menu for a user to determine a shape corresponding to the second image;
acquiring a third input to determine the shape corresponding to the second image; and
presenting the second image in the shape determined by the user.
18. The method of claim 16, further comprising:
obtaining state information representing at least one of a direction or a rotation state of the touch screen display;
presenting a shutter icon on a region of the touch screen display based on a first state information; and
presenting the shutter icon on another region of the touch screen display based on a second state information.
19. The method of claim 16, further comprising:
obtaining state information representing at least one of a direction or a rotation state of the touch screen display; and
determining a function related to a user interface key based on the state information.
20. An apparatus comprising:
a touch screen display;
at least first and second image sensors; and
a controller configured to perform operations comprising:
obtaining a first image and a second image in real-time via the first and second image sensors, respectively, wherein the first image and the second image from the first and second image sensors are displayed in real-time;
editing, responsive to user input, a size of a particular object delineated within the first image relative to other portions of the first image;
displaying, via the touch screen display, the second image over a portion of the first image, and thereafter displacing all regions of the second image so that the second image appears integrated within the object;
displaying, via the touch screen display, the first image in a first region over a portion of the second image in response to a user input, so as to display a real time composite image;
thereafter, detecting a capture event in respect of the displayed real time composite image; and
responsive to the detected capture event, storing the composite image.
21. The apparatus of claim 20, wherein:
the input is a touch related input manipulated by a user.
22. The apparatus of claim 20, wherein the operations further comprise:
presenting a menu for a user to determine a shape corresponding to the second image.
23. The apparatus of claim 22, wherein presenting the menu comprises:
acquiring another input to determine the shape corresponding to the second image; and
presenting the second image in the shape determined by the user.
24. The apparatus of claim 20, wherein the operations further comprise:
obtaining state information representing at least one of a direction or a rotation state of the touch screen display via at least one of an accelerometer sensor, a gyroscope sensor, a horizontal angle measuring sensor, a touch sensor, or a user interface.
25. The apparatus of claim 24, wherein the operations further comprise:
determining a region of the touch screen display via which to present a shutter icon, based on the state information.
26. A method in a portable device having at least first and second image sensors and a touch screen display, comprising:
obtaining a first image and a second image in real-time via the first and second image sensors, respectively, wherein the first image and the second image from the first and second image sensors are displayed in real-time;
displaying the first image on the touch screen display and editing, responsive to user input, a size of a particular object delineated within the first image relative to other portions of the first image;
displaying the second image on the touch screen display over a portion of the first image, and thereafter, displacing all regions of the second image responsive to a user manipulation so that the second image appears integrated within the object of the first image, so as to display a real time composite image;
thereafter, detecting a capture event in respect of the displayed real time composite image; and
responsive to the detected capture event, storing the composite image.
27. A portable terminal comprising:
a touch screen display;
a first camera mounted in a front side of the portable terminal for sub capturing;
a second camera mounted in a rear side of the portable terminal for background capturing; and
a controller configured to perform operations comprising:
obtaining a background image and a sub image in real-time via the first and second cameras, respectively, wherein the background image and the sub image from the first and second cameras are displayed in real-time;
displaying the background image as a background on the touch screen display;
displaying the sub image on a predetermined region of the background image;
thereafter, changing a displayed location of all regions of the sub image and thereby displacing the sub image in response to an input associated with the sub image, so as to display a real time composite image;
thereafter, detecting a capture event in respect of the displayed real time composite image; and
responsive to the detected capture event, storing the composite image.
28. A portable device, comprising:
a touch screen display;
a first image sensor and a second image sensor; and
a controller configured to:
obtain a first image in real-time via the first image sensor, and a second image in real-time via the second image sensor, wherein the first image and the second image from the first and second image sensors are displayed in real-time;
present, via the touch screen display, a real time third image generated using at least one portion of the first image overlapping at least one portion of the second image;
identify an input to capture the real time third image; and
store the third image in a memory operatively coupled with the controller in response to the input,
wherein the third image is acquired by changing a displayed location of all regions of the second image in response to a user input.
29. The portable device of claim 28, wherein the controller is further configured to control the touch screen display to display the second image over a portion of the first image, and thereafter, display the first image over a portion of the second image in response to a user input.
30. A portable device, comprising:
a touch screen display;
a first image sensor and a second image sensor; and
a controller configured to:
obtain a first image in real-time via the first image sensor, and a second image in real-time via the second image sensor, wherein the first image and the second image from the first and second image sensors are displayed in real-time;
present, via the touch screen display, a real time third image generated using at least one portion of the first image overlapping at least one portion of the second image;
identify an input to capture the real time third image; and
store the third image in a memory operatively coupled with the controller in response to the input,
wherein the third image is acquired by displacing all regions of the second image so that the second image appears integrated within a particular object delineated within the first image relative to other portions of the first image.
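The workflow the claims recite — obtain two live images, display the second over a region of the first, displace the second image in response to a user input, then store the displayed composite when a capture event occurs — can be sketched as follows (a minimal illustration using numpy arrays as frames; the function name, the rectangular inset, and the coordinate convention are assumptions, not the patented implementation):

```python
import numpy as np

def compose_pip(background: np.ndarray, inset: np.ndarray,
                top: int, left: int) -> np.ndarray:
    """Overlay the inset (second) image onto the background (first)
    image at the location the user has displaced it to, producing the
    composite preview that is stored when a capture event occurs."""
    h, w = inset.shape[:2]
    if (top < 0 or left < 0
            or top + h > background.shape[0]
            or left + w > background.shape[1]):
        raise ValueError("inset would fall outside the background")
    composite = background.copy()
    composite[top:top + h, left:left + w] = inset
    return composite

background = np.zeros((480, 640, 3), dtype=np.uint8)   # e.g., rear-camera frame
inset = np.full((120, 160, 3), 255, dtype=np.uint8)    # e.g., front-camera frame
# The user drags the inset; its displaced location becomes (top, left).
preview = compose_pip(background, inset, top=40, left=60)
print(preview[40, 60])  # [255 255 255]
```

In a live preview this compositing would run per frame for both camera streams, and a shutter key input, shutter icon input, or automatic capture condition would trigger storing the current composite.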
US14/104,146 2010-06-07 2013-12-12 Method and apparatus for operating camera function in portable terminal Active US9485421B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US14/104,146 US9485421B2 (en) 2010-06-07 2013-12-12 Method and apparatus for operating camera function in portable terminal
US15/288,361 US9712745B2 (en) 2010-06-07 2016-10-07 Method and apparatus for operating camera function in portable terminal

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2010-0053223 2010-06-07
KR1020100053223A KR101674011B1 (en) 2010-06-07 2010-06-07 Method and apparatus for operating camera function in portable terminal
US13/111,227 US8625020B2 (en) 2010-06-07 2011-05-19 Method and apparatus for operating camera function in portable terminal
US14/104,146 US9485421B2 (en) 2010-06-07 2013-12-12 Method and apparatus for operating camera function in portable terminal

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/111,227 Continuation US8625020B2 (en) 2010-06-07 2011-05-19 Method and apparatus for operating camera function in portable terminal

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/288,361 Continuation US9712745B2 (en) 2010-06-07 2016-10-07 Method and apparatus for operating camera function in portable terminal

Publications (2)

Publication Number Publication Date
US20140098257A1 US20140098257A1 (en) 2014-04-10
US9485421B2 true US9485421B2 (en) 2016-11-01

Family

ID=45053343

Family Applications (3)

Application Number Title Priority Date Filing Date
US13/111,227 Active 2031-12-09 US8625020B2 (en) 2010-06-07 2011-05-19 Method and apparatus for operating camera function in portable terminal
US14/104,146 Active US9485421B2 (en) 2010-06-07 2013-12-12 Method and apparatus for operating camera function in portable terminal
US15/288,361 Expired - Fee Related US9712745B2 (en) 2010-06-07 2016-10-07 Method and apparatus for operating camera function in portable terminal

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/111,227 Active 2031-12-09 US8625020B2 (en) 2010-06-07 2011-05-19 Method and apparatus for operating camera function in portable terminal

Family Applications After (1)

Application Number Title Priority Date Filing Date
US15/288,361 Expired - Fee Related US9712745B2 (en) 2010-06-07 2016-10-07 Method and apparatus for operating camera function in portable terminal

Country Status (3)

Country Link
US (3) US8625020B2 (en)
KR (1) KR101674011B1 (en)
CN (2) CN104284089B (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5574162B2 (en) * 2010-05-27 2014-08-20 株式会社リコー POSITION INFORMATION RECORDING DEVICE, IMAGING DEVICE PROVIDED WITH THE DEVICE, AND POSITION INFORMATION RECORDING METHOD
US8520080B2 (en) 2011-01-31 2013-08-27 Hand Held Products, Inc. Apparatus, system, and method of use of imaging assembly on mobile terminal
JP5970937B2 (en) 2012-04-25 2016-08-17 ソニー株式会社 Display control apparatus and display control method
KR101995258B1 (en) 2012-05-14 2019-07-02 삼성전자 주식회사 Apparatus and method for recording a moving picture of wireless terminal having a camera
JP5988860B2 (en) * 2012-12-21 2016-09-07 キヤノン株式会社 IMAGING DEVICE AND IMAGING DEVICE CONTROL METHOD
CN104049859A (en) * 2013-03-15 2014-09-17 青岛海尔电子有限公司 Intelligent terminal and display control method of soft control interface thereof
KR20150104302A (en) * 2014-03-05 2015-09-15 삼성전자주식회사 User input detection method of electronic apparatus and electronic apparatus thereof
CN104052928A (en) * 2014-06-16 2014-09-17 广东欧珀移动通信有限公司 Method and electronic equipment for controlling camera to rotate automatically
KR102176365B1 (en) * 2014-07-14 2020-11-09 엘지전자 주식회사 Mobile terminal and control method for the mobile terminal
KR20160144179A (en) * 2015-06-08 2016-12-16 임재현 Method for providing user interface in user terminal including camera
US11284003B2 (en) 2015-07-29 2022-03-22 Samsung Electronics Co., Ltd. User terminal apparatus and control method thereof
US9936138B2 (en) 2015-07-29 2018-04-03 Samsung Electronics Co., Ltd. User terminal apparatus and control method thereof
JP6579899B2 (en) * 2015-10-09 2019-09-25 キヤノン株式会社 IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, PROGRAM, AND STORAGE MEDIUM
CN105635572B (en) * 2015-12-28 2019-07-26 联想(北京)有限公司 Image processing method, image processing apparatus and electronic equipment
US10560565B2 (en) * 2017-02-15 2020-02-11 Samsung Electronics Co., Ltd. Electronic device and operating method thereof
CN107668811A (en) * 2017-09-20 2018-02-09 深圳市盛世华服信息有限公司 A kind of photographic method based on amount body
CN108712527A (en) * 2018-05-14 2018-10-26 宇龙计算机通信科技(深圳)有限公司 Mobile terminal and its control method
CN108650357A (en) * 2018-05-28 2018-10-12 信利光电股份有限公司 A kind of method of communication terminal and control sensor module rotation
CN109088996A (en) * 2018-10-19 2018-12-25 北京小米移动软件有限公司 The starting method, apparatus and readable storage medium storing program for executing of magnifying function

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030098847A1 (en) 2001-11-27 2003-05-29 Yuji Yamamoto Information display apparatus
US20040012702A1 (en) * 2002-07-08 2004-01-22 Casio Computer Co., Ltd. Camera apparatus, photographing method and a storage medium that records method of photographing
US20050036044A1 (en) * 2003-08-14 2005-02-17 Fuji Photo Film Co., Ltd. Image pickup device and image synthesizing method
KR20050087941A (en) 2004-02-27 2005-09-01 삼성전자주식회사 Portable electronic device for changing menu display state according to rotating degree and method thereof
US6987535B1 (en) * 1998-11-09 2006-01-17 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US20060017832A1 (en) 2004-07-21 2006-01-26 Nokia Corporation Portable electronic devices with picture in picture capability
US20060103734A1 (en) 2004-11-16 2006-05-18 Samsung Techwin Co., Ltd. Apparatus and method for rotating image in digital camera
KR20060046955A (en) 2004-11-12 2006-05-18 삼성전자주식회사 Apparatus and method for displaying plurality of pictures simultaneously in portable wireless communication terminal
US20070004451A1 (en) 2005-06-30 2007-01-04 C Anderson Eric Controlling functions of a handheld multifunction device
US20070070184A1 (en) 2003-12-26 2007-03-29 Lg Electronics Inc. Mobile Communication Device With Enhanced Image Communication Capability
US20070222863A1 (en) * 2006-03-22 2007-09-27 Samsung Electronics Co., Ltd. Digital broadcasting reception apparatus for simultaneously capturing camera image and digital broadcasting image and method for the same
US20080002963A1 (en) 2006-06-28 2008-01-03 Media Tek Inc. Systems and methods for capturing images of objects
US20080036876A1 (en) 1999-12-28 2008-02-14 Sony Corporation Electronic apparatus having a display function
KR20080021377A (en) 2006-09-04 2008-03-07 엘지전자 주식회사 Image processing device and method for storing an image using automatical rotating processing
CN101312483A (en) 2008-07-14 2008-11-26 深圳华为通信技术有限公司 Image coding method, apparatus and terminal
US7551211B2 (en) * 2003-08-28 2009-06-23 Kabushiki Kaisha Toshiba Apparatus and method for processing a photographic image using a stencil
CN101577783A (en) 2008-05-09 2009-11-11 宏达国际电子股份有限公司 Method for processing images and electronic device thereof
US20100073487A1 (en) * 2006-10-04 2010-03-25 Nikon Corporation Electronic apparatus and electronic camera
US20100149398A1 (en) 2008-12-15 2010-06-17 Robbyn Gayer Camera
US20110102671A1 (en) * 2009-10-29 2011-05-05 Apple Inc. Picture in Picture Video Display for Low Power Device
JP2011192641A (en) 2010-02-17 2011-09-29 Hitachi Cable Ltd Radiation resistant electric wire/cable

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100708178B1 (en) 2005-09-01 2007-04-16 삼성전자주식회사 Method for image processing, apparatus and information storage medium storing image information therefor
KR100672338B1 (en) * 2005-09-09 2007-01-24 엘지전자 주식회사 Mobile communication terminal having dual display equipment and method of taking picture using same
KR101144423B1 (en) * 2006-11-16 2012-05-10 엘지전자 주식회사 Mobile phone and display method of the same
JP2010032839A (en) * 2008-07-30 2010-02-12 Kyocera Corp Image reading device and electronic apparatus
KR101500741B1 (en) 2008-09-12 2015-03-09 옵티스 셀룰러 테크놀로지, 엘엘씨 Mobile terminal having a camera and method for photographing picture thereof

Patent Citations (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6987535B1 (en) * 1998-11-09 2006-01-17 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
US20080036876A1 (en) 1999-12-28 2008-02-14 Sony Corporation Electronic apparatus having a display function
US20030098847A1 (en) 2001-11-27 2003-05-29 Yuji Yamamoto Information display apparatus
CN1421737A (en) 2001-11-27 2003-06-04 佳能株式会社 Information display device
US20040012702A1 (en) * 2002-07-08 2004-01-22 Casio Computer Co., Ltd. Camera apparatus, photographing method and a storage medium that records method of photographing
US20050036044A1 (en) * 2003-08-14 2005-02-17 Fuji Photo Film Co., Ltd. Image pickup device and image synthesizing method
US7551211B2 (en) * 2003-08-28 2009-06-23 Kabushiki Kaisha Toshiba Apparatus and method for processing a photographic image using a stencil
US20070070184A1 (en) 2003-12-26 2007-03-29 Lg Electronics Inc. Mobile Communication Device With Enhanced Image Communication Capability
CN101052185A (en) 2003-12-26 2007-10-10 Lg电子株式会社 Mobile communication device with enhanced image communication capability
KR20050087941A (en) 2004-02-27 2005-09-01 삼성전자주식회사 Portable electronic device for changing menu display state according to rotating degree and method thereof
US20050190281A1 (en) 2004-02-27 2005-09-01 Samsung Electronics Co., Ltd. Portable electronic device for changing menu display state according to rotating degree and method thereof
US20060017832A1 (en) 2004-07-21 2006-01-26 Nokia Corporation Portable electronic devices with picture in picture capability
CN101023655A (en) 2004-07-21 2007-08-22 诺基亚公司 Portable electronic devices with picture in picture capability
KR20060046955A (en) 2004-11-12 2006-05-18 삼성전자주식회사 Apparatus and method for displaying plurality of pictures simultaneously in portable wireless communication terminal
CN1777226A (en) 2004-11-16 2006-05-24 三星Techwin株式会社 Apparatus and method for rotating image in digital camera
US20060103734A1 (en) 2004-11-16 2006-05-18 Samsung Techwin Co., Ltd. Apparatus and method for rotating image in digital camera
US20070004451A1 (en) 2005-06-30 2007-01-04 C Anderson Eric Controlling functions of a handheld multifunction device
US20070222863A1 (en) * 2006-03-22 2007-09-27 Samsung Electronics Co., Ltd. Digital broadcasting reception apparatus for simultaneously capturing camera image and digital broadcasting image and method for the same
US20080002963A1 (en) 2006-06-28 2008-01-03 Media Tek Inc. Systems and methods for capturing images of objects
KR20080021377A (en) 2006-09-04 2008-03-07 엘지전자 주식회사 Image processing device and method for storing an image using automatical rotating processing
US20100073487A1 (en) * 2006-10-04 2010-03-25 Nikon Corporation Electronic apparatus and electronic camera
CN101577783A (en) 2008-05-09 2009-11-11 宏达国际电子股份有限公司 Method for processing images and electronic device thereof
CN101312483A (en) 2008-07-14 2008-11-26 深圳华为通信技术有限公司 Image coding method, apparatus and terminal
US20100149398A1 (en) 2008-12-15 2010-06-17 Robbyn Gayer Camera
US20110102671A1 (en) * 2009-10-29 2011-05-05 Apple Inc. Picture in Picture Video Display for Low Power Device
JP2011192641A (en) 2010-02-17 2011-09-29 Hitachi Cable Ltd Radiation resistant electric wire/cable

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Korean Notice of Patent Grant dated Aug. 29, 2016.

Also Published As

Publication number Publication date
CN104284089B (en) 2017-09-15
CN104284089A (en) 2015-01-14
CN102271184A (en) 2011-12-07
US20170026569A1 (en) 2017-01-26
US20110298940A1 (en) 2011-12-08
US20140098257A1 (en) 2014-04-10
KR20110133698A (en) 2011-12-14
KR101674011B1 (en) 2016-11-08
US9712745B2 (en) 2017-07-18
US8625020B2 (en) 2014-01-07

Similar Documents

Publication Publication Date Title
US9712745B2 (en) Method and apparatus for operating camera function in portable terminal
CN111034181B (en) Image capturing apparatus, image display system, and operation method
CN107026973B (en) Image processing device, image processing method and photographic auxiliary equipment
JP4098808B2 (en) Remote video display method, video acquisition device, method thereof, and program thereof
KR101343220B1 (en) Real-time preview for panoramic images
JP6205067B2 (en) Pan / tilt operating device, camera system, pan / tilt operating program, and pan / tilt operating method
JP4864295B2 (en) Image display system, image display apparatus, and program
CN106716985B (en) Video camera controller, camera shooting control method and camera system
CN106133794B (en) Information processing method, information processing apparatus, and program
CN104902170A (en) Photo-taking method and terminal
KR20140106333A (en) Image display positioning using image sensor location
US20170111574A1 (en) Imaging apparatus and imaging method
US20210084228A1 (en) Tracking shot method and device, and storage medium
CN113473004A (en) Shooting method and device
CN110941375A (en) Method and device for locally amplifying image and storage medium
CN112805995A (en) Information processing apparatus
CN114125268A (en) Focusing method and device
JP2005025268A (en) Electronic device and method for controlling display
US8194147B2 (en) Image presentation angle adjustment method and camera device using the same
EP4195647A1 (en) Image processing method, mobile terminal, and storage medium
WO2022061541A1 (en) Control method, handheld gimbal, system, and computer-readable storage medium
JP5631065B2 (en) Video distribution system, control terminal, network camera, control method and program
CN110709813A (en) Image display method, equipment, holder and storage medium
WO2022041013A1 (en) Control method, handheld gimbal, system, and computer readable storage medium
JP2002094867A (en) Image pickup controller, control method of image pickup controller, memory medium and remote image pickup system

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8