WO2016114475A1 - Method of providing preset service by bending mobile device according to user input of bending mobile device and mobile device performing the same - Google Patents


Info

Publication number
WO2016114475A1
Authority
WO
WIPO (PCT)
Prior art keywords
mobile device
user input
service
object image
bending
Prior art date
Application number
PCT/KR2015/011382
Other languages
French (fr)
Inventor
Hee-Seok Jeong
Soo-Jeong Kim
Shi-Yun Cho
Jeong-Hwan Kim
Original Assignee
Samsung Electronics Co., Ltd.
Seoul National University R&Db Foundation
Priority date
Filing date
Publication date
Application filed by Samsung Electronics Co., Ltd., Seoul National University R&Db Foundation filed Critical Samsung Electronics Co., Ltd.
Publication of WO2016114475A1


Classifications

    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G01C21/20 Instruments for performing navigational calculations
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06Q50/10 Services
    • G06V20/20 Scenes; scene-specific elements in augmented reality scenes
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • One or more exemplary embodiments relate to a method of receiving a user input of bending a mobile device and providing a preset service according to the received user input, and a mobile device performing the same.
  • A mobile device may employ a touchscreen panel (TSP) so that a user may conveniently input necessary information to the mobile device.
  • The user may input information to the mobile device by using the TSP more conveniently than by inputting information via a key input to a keypad.
  • Functions provided by the mobile device have become increasingly diverse.
  • A process of searching an application list displayed on the TSP and selecting an application to be executed is necessary in order to execute various functions by using the mobile device. It may be inconvenient for a user of the mobile device to undergo such a process to execute functions of the mobile device. Accordingly, it may be necessary to intuitively execute functions of the mobile device according to various situations, such as a location, a space, or a relation with objects in the periphery of the user of the mobile device.
  • One or more exemplary embodiments include a method of intuitively executing a function according to a situation of a mobile device, so as to execute a function provided by the mobile device.
  • A method of automatically executing and providing a function in correspondence with a situation, according to a user input of bending the mobile device, is an example of a method of intuitively executing a function according to a situation of the mobile device.
  • One or more exemplary embodiments include a mobile device that may perform a function in correspondence with a situation according to a user input of bending the mobile device.
  • FIG. 1 illustrates a diagram showing a mobile device for photographing an actual space and displaying the actual space on a screen of the mobile device as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment
  • FIG. 2 illustrates a flowchart showing a method of receiving a user input of bending the mobile device and providing a preset service, which are performed by the mobile device, according to an exemplary embodiment
  • FIGS. 3A and 3B illustrate diagrams for explaining a process of receiving a user input of bending the mobile device so that the mobile device is bent, the receiving being performed by the mobile device, according to an exemplary embodiment
  • FIG. 4 illustrates a flowchart of a method of providing a service on each area of a divided screen as the mobile device receives a user input of bending the mobile device, the method being performed by the mobile device according to an exemplary embodiment
  • FIG. 5 illustrates a diagram showing an example of providing a service on each area of a divided screen as the mobile device receives a user input of bending the mobile device, the method being performed by the mobile device, according to an exemplary embodiment
  • FIG. 6 illustrates a flowchart showing a method of determining a location of an actual space and providing a service by using information about the determined location as the mobile device receives a user input of bending the mobile device, the method being performed by the mobile device according to an exemplary embodiment
  • FIG. 7 illustrates a diagram for explaining an example of determining a location of an actual space and providing a navigation service by using the determined location information as the mobile device receives a user input of bending the mobile device, the determining and the providing being performed by the mobile device, according to an exemplary embodiment
  • FIG. 8 illustrates a flowchart of a method of storing location information of an actual space and providing information about a route to a location of which location information is stored, the method being performed by the mobile device, according to an exemplary embodiment
  • FIGS. 9A and 9B illustrate diagrams for explaining an example of storing location information and providing information about a route to a stored location as the mobile device receives a user input of bending the mobile device, the storing and providing being performed by the mobile device according to an exemplary embodiment
  • FIG. 10 illustrates a flowchart of a method of recognizing an object in an actual space and providing a service related to an object image of the recognized object as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment
  • FIG. 11 illustrates a diagram for explaining an example of recognizing an object in an actual space and providing a search service related to an image of the recognized object as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment
  • FIG. 12 illustrates a diagram for explaining an example of recognizing an object in an actual space and providing a text recognition and translation service related to an image of the recognized object as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment
  • FIG. 13 illustrates a flowchart showing a method of recognizing a peripheral device and providing a service related to the recognized peripheral device as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment
  • FIG. 14 illustrates a diagram for explaining an example of recognizing a peripheral device and providing a service for connecting to the recognized peripheral device as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment
  • FIGS. 15 and 16 illustrate diagrams for explaining an example of recognizing a wearable device and providing a service related to the recognized wearable device as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment
  • FIG. 17 illustrates a diagram for explaining an example of recognizing a peripheral device and sharing content with the peripheral device as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment
  • FIG. 18 illustrates a flowchart of a method of providing a service from among services related to an application that is being executed as the mobile device receives user inputs of bending the mobile device at different angles from each other, according to an exemplary embodiment
  • FIG. 19 illustrates a diagram for explaining an example in which the mobile device is bent, as the mobile device receives user inputs of bending the mobile device respectively in correspondence with a first threshold value and a second threshold value, which are different from each other;
  • FIGS. 20A and 20B illustrate diagrams for explaining an example of providing a service as the mobile device receives user inputs of bending the mobile device respectively in correspondence with a first threshold value and a second threshold value, which are different from each other;
  • FIG. 21 illustrates a flowchart of a method of providing a service as the mobile device receives user inputs of bending the mobile device respectively in correspondence with a first threshold value and a second threshold value, which are different from each other, according to an exemplary embodiment
  • FIGS. 22A and 22B illustrate diagrams for explaining an example of providing a service as the mobile device receives user inputs of bending the mobile device respectively in correspondence with a first threshold value and a second threshold value;
  • FIG. 23 illustrates a flowchart of a method of providing a service as the mobile device sequentially receives a user input of designating a particular location and a user input of bending the mobile device, according to an exemplary embodiment
  • FIG. 24 is a block diagram of the mobile device according to an exemplary embodiment.
  • a method of providing a preset service according to a user input of bending a mobile device which is performed by the mobile device includes: receiving a first user input of bending the mobile device in correspondence with a first threshold value; photographing an object in an actual space by using a camera included in the mobile device, as the first user input is received; displaying an object image of the photographed object on a screen of the mobile device; and providing a service related to the object image of the photographed object, from among services of an application installed on the mobile device.
  • the method may further include determining a location of the mobile device by using the object image of the photographed object, wherein the providing of the service includes providing a service related to the object image, based on the determined location.
  • the application may be a navigation application, and the providing of the service may include providing route information in which the determined location is a departure location.
  • the application may be a navigation application, the method further including storing location information about the determined location, and if the mobile device is moved, the providing of the service may include providing route information from a current location of the mobile device to a location corresponding to the stored location information.
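As a rough illustration of the stored-location route service described above, the following sketch stores a location on one gesture and later answers a route query back to it. The class name, the coordinate format, and the use of great-circle distance as a stand-in for real route information are all hypothetical and not part of the disclosure.

```python
import math

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

class StoredLocationNavigator:
    """Stores a location on a bend gesture; later answers a route query back to it."""
    def __init__(self):
        self.stored = None

    def store(self, location):
        # e.g. a parking spot determined from the photographed object image
        self.stored = location

    def route_back(self, current):
        """Route information from the current location to the stored location."""
        if self.stored is None:
            return None
        return {"from": current, "to": self.stored,
                "distance_km": haversine_km(current, self.stored)}
```

A real implementation would query a routing service rather than compute straight-line distance; the sketch only shows the store-then-route-back flow of the claim.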
  • the method may further include recognizing an object in the object image of the photographed object, wherein the providing of the service includes providing information related to the recognized object.
  • the recognized object may include at least one selected from the group consisting of text and an image included in the object image.
  • the providing of the service may include providing information that is found by using the at least one selected from the group consisting of the text and the image as an input value.
  • the recognized object may include text in the object image, and the providing of the service may include translating the text into a preset language and providing the translated text.
  • the recognized object may be a peripheral device that is located near the mobile device, the method including obtaining device information of the peripheral device by using the object image, and the providing of the service may include establishing communication between the mobile device and the peripheral device.
  • the providing of the service may include sharing content in the mobile device with the peripheral device or sharing content in the peripheral device with the mobile device.
  • the providing of the service may include mirroring at least a part of the screen of the mobile device on the peripheral device.
  • the displaying of the object image may include displaying the object image of the photographed object on a first area from among areas of the screen of the mobile device, wherein the areas of the screen of the mobile device are obtained when the screen of the mobile device is divided as the mobile device is bent.
  • the providing of the service may include displaying service information of the service on a second area from among the areas of the screen of the mobile device.
  • the method may further include: receiving a second user input of bending the mobile device in correspondence with a second threshold value before the first user input is received; and displaying detailed information regarding a location of the mobile device and an application that is being executed by the mobile device.
  • the method may further include: receiving a second user input of bending the mobile device in correspondence with a second threshold value before the first user input is received; and displaying additional information regarding an object included in the object image on a second area from among the areas of the screen of the mobile device.
  • the second threshold value may represent an angle having a smaller value than the first threshold value.
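Taken together, the two thresholds describe a two-stage gesture: a shallower bend (the second threshold) shows additional or detailed information, and a deeper bend (the first threshold) triggers photographing and a related service. A minimal sketch of that classification, with the 30° and 60° values chosen only for illustration:

```python
SECOND_THRESHOLD_DEG = 30  # shallower bend: show additional/detailed information
FIRST_THRESHOLD_DEG = 60   # deeper bend: photograph the space and provide a service

def classify_bend(angle_deg):
    """Map a measured bend angle to the stage of service described in the claims."""
    if angle_deg >= FIRST_THRESHOLD_DEG:
        return "photograph-and-service"  # corresponds to the first user input
    if angle_deg >= SECOND_THRESHOLD_DEG:
        return "additional-information"  # corresponds to the second user input
    return "none"  # too shallow to be recognized as a bending gesture
```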
  • the control unit may determine a location of the mobile device, by using the object image of the photographed object, and provide a service related to the object image, based on the determined location.
  • the control unit may recognize an object in the object image of the photographed object and provide information that is found by using at least one selected from the group consisting of text and an image, included in the recognized object, as an input value.
  • the control unit may translate the text included in the photographed object into a preset language and provide the translated text.
  • the control unit may recognize a peripheral device included in the object image of the photographed object, and obtain device information of the recognized peripheral device.
  • the mobile device may further include a communication unit for establishing communication between the recognized peripheral device and the mobile device.
  • the control unit may share content in the mobile device or content in the peripheral device with the peripheral device.
  • the mobile device may further include an angle sensor configured to recognize an angle of a first threshold value or a second threshold value at which the mobile device is bent.
  • the control unit may provide a service of recognizing location information of the mobile device and searching for detailed information regarding the recognized location information, if the angle sensor senses that the mobile device is bent in correspondence with the second threshold value that is smaller than the first threshold value.
  • a non-transitory computer-readable recording storage medium having stored thereon a computer program, which when executed by a computer, performs the method.
  • An object image may be an image, a video clip, or text which is obtained by photographing an actual space in which a mobile device is located by using a camera included in the mobile device, and displayed on a screen of the mobile device.
  • the object image may include, for example, a user interface, a virtual reality interface, a result of executing a navigation application, a result of executing content, or a list of applications being executed, but is not limited thereto.
  • FIG. 1 illustrates a diagram showing a mobile device 1000 for photographing an actual space and displaying the actual space on a display screen of the mobile device 1000 as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
  • As the mobile device 1000 receives a user input of bending the mobile device 1000, the whole body and the whole display screen of the mobile device 1000 may be bent.
  • the display screen of the mobile device 1000 may be divided into a first area 110 and a second area 120.
  • the mobile device 1000 includes the display screen, and the display screen is divided into a plurality of areas that includes the first area 110 and the second area 120.
  • Each of the plurality of areas may be bent, folded, or unfolded at a preset angle.
  • Each area may be formed of a touchscreen.
  • the display screen of the mobile device 1000 may include a touchscreen as an input unit for receiving an input of a signal, but is not limited to the touchscreen.
  • the mobile device 1000 may provide a preset service. For example, as the mobile device 1000 receives a user input of bending the mobile device 1000, the mobile device 1000 may photograph an actual space 2000 by using a camera and display the actual space 2000 on the first area 110. Additionally, the mobile device 1000 may provide a service related to a location of the photographed actual space 2000, from among services that may be provided by applications installed and executed on the mobile device 1000. The provided service may be displayed on the second area 120 in the form of an application.
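The overall flow just described (bend input received, actual space photographed, object image shown on the first area 110, related service shown on the second area 120) might be summarized in code as follows. `photograph_space` and `select_service` are hypothetical callbacks standing in for the camera module and the application-matching logic; they are not names from the disclosure.

```python
FIRST_THRESHOLD_DEG = 60  # exemplary first threshold value from the embodiment

def handle_bend(angle_deg, photograph_space, select_service):
    """Dispatch a bend gesture as in the exemplary embodiment.

    `photograph_space` and `select_service` are hypothetical callbacks standing
    in for the camera module and the application-matching logic, respectively.
    """
    if angle_deg < FIRST_THRESHOLD_DEG:
        return None  # the bend does not reach the first threshold; no service
    object_image = photograph_space()       # displayed on the first area 110
    service = select_service(object_image)  # displayed on the second area 120
    return {"first_area": object_image, "second_area": service}
```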
  • FIG. 2 illustrates a flowchart showing a method of receiving a user input of bending the mobile device 1000 and providing a preset service, which are performed by the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value.
  • the mobile device 1000 may include a display screen, and as the user input of bending the mobile device 1000 is received, the display screen of the mobile device 1000 may be divided into the first area 110 and the second area 120. However, the display screen is not completely split into the first area 110 and the second area 120.
  • the first area 110 and the second area 120 are formed as one area and bent to be differentiated from each other. Since services respectively provided by applications displayed on the first area 110 and the second area 120 are different from each other, the first area 110 and the second area 120 may be recognized as separate areas of the display screen.
  • the user input of bending the mobile device 1000 may be made so that a body and the display screen of the mobile device 1000 are bent at a preset threshold angle from the ground.
  • the first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from the ground. According to an exemplary embodiment, the first threshold value may be 60°.
  • the mobile device 1000 may include an angle sensor.
  • the angle sensor may determine an angle at which the mobile device 1000 is bent from the ground or bent when a user is holding the mobile device 1000.
  • the angle sensor may include, for example, at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
  • the angle sensor may determine a degree to which the mobile device is bent, by using the at least one selected from the group consisting of the terrestrial magnetic sensor, the gravity sensor, the gyro sensor, and the acceleration sensor.
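One software realization of such an angle sensor (an assumption; the disclosure only lists candidate sensor types) is to compare the gravity vectors reported by accelerometers on the two halves of the device:

```python
import math

def bend_angle_deg(g_first, g_second):
    """Angle between the gravity vectors measured on the two halves of the device.

    Each argument is an (x, y, z) accelerometer reading. When the device is flat
    the two vectors coincide and the angle is 0; bending one half raises it.
    """
    dot = sum(a * b for a, b in zip(g_first, g_second))
    norm = math.sqrt(sum(a * a for a in g_first)) * math.sqrt(sum(b * b for b in g_second))
    cosine = max(-1.0, min(1.0, dot / norm))  # clamp against rounding error
    return math.degrees(math.acos(cosine))
```

The result could then be compared against the first or second threshold value to recognize the bending gesture.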
  • the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000.
  • the actual space may refer to a surrounding space of the mobile device 1000.
  • the camera may be equipped or installed in the mobile device 1000 so as to photograph an actual space toward which the eyesight of a user of the mobile device 1000 is directed.
  • the camera may include a camera sensor for photographing an object included in the actual space in a periphery of the mobile device 1000 and converting an object image of the photographed object into an electrical signal, and a signal processing unit for converting an analog image signal obtained by photographing the object by using the camera sensor into digital data.
  • the camera sensor may be a complementary metal-oxide-semiconductor (CMOS) image sensor or a charge-coupled device (CCD) sensor.
  • the signal processing unit may be implemented as a digital signal processor (DSP).
  • the camera sensor included in the camera may photograph an actual space in a periphery of the mobile device 1000, and the signal processing unit may obtain an object image of the actual space.
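The conversion performed by the signal processing unit, from the camera sensor's analog image signal to digital data, amounts to clipping and quantizing sampled intensities. A minimal sketch (the actual pipeline, bit depth, and color handling are not specified in the disclosure):

```python
def quantize_frame(analog_samples, bits=8):
    """Quantize analog intensity samples in [0.0, 1.0] to n-bit digital values,
    as the signal processing unit would for the camera sensor's output."""
    levels = (1 << bits) - 1
    digital = []
    for s in analog_samples:
        s = max(0.0, min(1.0, s))          # clip out-of-range sensor noise
        digital.append(round(s * levels))  # round to the nearest digital level
    return digital
```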
  • the mobile device 1000 may recognize a space in which the mobile device 1000 is located, an object located in a periphery of the mobile device 1000, or a peripheral device located in the periphery of the mobile device 1000.
  • the object image of the photographed object may be displayed on the first area 110 of the screen of the mobile device 1000.
  • the mobile device 1000 may receive in real time the object image of the object in the actual space which is photographed by the camera.
  • the object image of the object in the actual space, received in real time by the mobile device 1000 may be displayed on the first area 110 of the screen in the form such as a video clip.
  • the mobile device 1000 provides a service related to the object image of the photographed object, from among services provided by applications installed on the mobile device 1000.
  • the mobile device 1000 may include at least one application that is already installed on the mobile device 1000.
  • the mobile device 1000 may include at least one selected from the group consisting of a web browsing application, a search application, a mobile shopping application, a navigation application, a text recognition application, a translation application, and a video playback application.
  • the mobile device 1000 may be in a state of executing at least one application selected from the group described above.
  • the mobile device 1000 may recognize the object image which is obtained by photographing the actual space and displayed on the first area 110, and thus, execute an application related to the object image from among the at least one application so as to provide a service.
  • the mobile device 1000 may not only recognize the object image, but also obtain space and location information related to the object image by using a location sensor, and execute the related application based on the obtained space and location information so as to provide a service.
  • the mobile device 1000 may recognize a particular object in the object image and execute a search application for searching for detailed information about the recognized object in a web page so as to provide a search service.
  • the mobile device 1000 may recognize text in the object image and execute a text recognition application or a translation application so as to provide a detailed information search service or a translation service related to the text. A detailed description thereof is provided later.
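Across these examples, the application selection behaves like a dispatch on what was recognized in the object image. A hypothetical sketch (the recognition output format and the service names are illustrative only, not part of the disclosure):

```python
def select_application(recognized):
    """Pick a service for the recognized content of an object image.

    `recognized` is a hypothetical dict such as {"type": "text", "value": "..."}
    standing in for the output of the recognition step.
    """
    kind = recognized.get("type")
    if kind == "text":
        return "translation"       # translate the text into a preset language
    if kind == "object":
        return "search"            # search for detailed information about the object
    if kind == "peripheral_device":
        return "connection"        # establish communication with the device
    if kind == "location":
        return "navigation"        # route information from the determined location
    return None                    # nothing recognizable: no service is triggered
```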
  • FIGS. 3A and 3B illustrate diagrams for explaining a process of receiving a user input of bending the mobile device 1000 so that the mobile device 1000 is bent, the receiving being performed by the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 may include a display screen 100.
  • the mobile device 1000 shown in FIG. 3A is in a state of not receiving a user input of bending the mobile device 1000, and thus, is not in a bent state but in a flat state.
  • the display screen 100 may be formed of a touchscreen.
  • the display screen 100 may output a screen from among a booting screen, a lock screen, a menu screen, a home screen, and an application activation screen.
  • the display screen 100 may be formed of a flexible material that may be bent.
  • the display screen 100 may be formed of a flexible active matrix organic light-emitting diode (F-AMOLED) or a flexible liquid-crystal display (FLCD).
  • a main body of the mobile device 1000 may also be formed of a flexible material.
  • the mobile device 1000 may include a frame 130 in which the display screen 100 is formed and which supports the display screen 100.
  • the frame 130 may provide a space in which a printed circuit board (PCB) is mounted in addition to a space of the display screen 100, wherein the PCB includes a control unit, a storage unit, an audio input/output unit, and a communication unit which are described with reference to FIG. 24.
  • the frame 130 may be formed of a sash material or a flexible plastic material that may be bent according to an intention of a designer.
  • the mobile device 1000 may receive a user input of bending the mobile device 1000, and thus, be bent at a preset angle.
  • the mobile device 1000 may be bent so that a bent part of the mobile device 1000 forms a first threshold angle θ with an unbent part of the mobile device 1000, according to the user input.
  • the first threshold angle θ may be a particular angle that is preset to 0°, 30°, 60°, 90°, or the like. According to an exemplary embodiment, the first threshold angle θ may be 60°.
  • the display screen 100 of the mobile device 1000 may be divided into the first area 110 and the second area 120. It may be understood that the first area 110 and the second area 120 are not separate from each other, but are bent as one area. According to an exemplary embodiment, a size of the first area 110 may correspond to 1/2 of a size of the second area 120. However, the sizes of the first and second areas 110 and 120 are not limited thereto.
  • the display screen 100 may be divided into the first area 110 and the second area 120.
  • the frame 130 may also be bent and divided into two areas.
  • the frame 130 may be divided into a first frame 132 that surrounds and supports the first area 110 and a second frame 134 that surrounds and supports the second area 120.
  • a first extension line 132l that is a line extending from the first frame 132 may form the first threshold angle θ with a second extension line 134l that is a line extending from the second frame 134.
  • it may be convenient for a user of the mobile device 1000 to view, on the first area 110, the object image obtained by photographing the object in the actual space.
  • the first threshold angle θ may be 60°.
  • the mobile device 1000 may provide a virtual reality navigation service on the first area 110. Since the first area 110 may be bent toward the user, the user may walk in an actual space with his/her eyes directed forward instead of downward. Thus, it may be convenient for the user to view the actual space.
  • FIG. 4 illustrates a flowchart of a method of providing a service on each area of a divided display screen as the mobile device 1000 receives a user input of bending the mobile device 1000, the method being performed by the mobile device 1000 according to an exemplary embodiment.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value.
  • the first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the display screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
  • the first threshold value may be 60°.
  • the mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device is bent, by using at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
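The angle-sensing step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes each physical sensor (terrestrial magnetic, gravity, gyro, acceleration) yields an angle estimate in degrees, and a hypothetical `BendDetector` fuses them by simple averaging before comparing against the preset first threshold.

```python
FIRST_THRESHOLD = 60  # degrees; one of the preset angles (0, 30, 60, 90, ...)

class BendDetector:
    """Illustrative stand-in for the angle sensor of the mobile device 1000."""

    def __init__(self, threshold=FIRST_THRESHOLD, tolerance=5):
        self.threshold = threshold
        self.tolerance = tolerance  # allowed deviation, an assumption

    def bend_angle(self, sensor_readings):
        """Fuse per-sensor angle estimates (here: a plain average)."""
        return sum(sensor_readings) / len(sensor_readings)

    def matches_threshold(self, sensor_readings):
        """True when the fused bend angle corresponds to the first threshold."""
        return abs(self.bend_angle(sensor_readings) - self.threshold) <= self.tolerance

detector = BendDetector()
# Readings near 60 degrees from, e.g., gravity, gyro, and acceleration sensors:
bent = detector.matches_threshold([59.0, 61.5, 60.2])
```

A real device would fuse rotation-vector or accelerometer data rather than averaging pre-computed angles; the tolerance value is likewise an illustrative assumption.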
  • an area of a display screen of the mobile device 1000 is divided into the first area 110 shown in FIG. 3 and the second area 120 shown in FIG. 3. It may be understood that the dividing of the area of the display screen into the first area 110 and the second area 120 refers to bending of the display screen as one area, instead of completely splitting the area of the display screen into the first area 110 and the second area 120. According to an exemplary embodiment, a size of the first area 110 may correspond to 1/2 of a size of the second area 120.
  • the mobile device 1000 photographs an object in an actual space by using a camera included in the mobile device 1000, and displays an object image of the photographed object.
  • the camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed.
  • the object image of the photographed object in the actual space may be transmitted to the mobile device 1000 and displayed on the first area 110 of the display screen of the mobile device 1000 in the form of a video clip in real time.
  • the mobile device 1000 provides a service of providing information related to the object image which is displayed on the first area 110, from among services provided by applications installed on the mobile device 1000.
  • the mobile device 1000 may recognize a particular object in the object image displayed on the first area 110, and execute a search application for searching for detailed information about the recognized object in web pages so as to provide a search service.
  • An application for providing the search service may be installed on or included in the mobile device 1000.
  • the mobile device 1000 displays information related to the recognized particular object in the object image on the second area 120.
  • the search application for providing a service of searching for detailed information about the particular object recognized in operation S430 may be executed, and an execution screen of the search application may be displayed on the second area 120.
  • the object image of the photographed object in the actual space which is photographed in operation S420 is displayed on the first area 110
  • the search application for searching for the detailed information about the recognized particular object in the image of the object is displayed on the second area 120.
  • a lot of information may be viewed at the same time by dividing the screen into the two parts.
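The FIG. 4 flow (operations S410 through S440) can be condensed into a short sketch. The `recognizer` and `search` callables below are hypothetical stand-ins for the image sensor and the search application; the fixed 60° comparison assumes the exemplary first threshold value.

```python
def provide_divided_screen_service(bend_angle, frame, recognizer, search):
    """Sketch of FIG. 4: bend input -> divided screen -> photograph -> search."""
    if bend_angle != 60:              # S410: user input matching the first threshold
        return None
    first_area = frame                # S420: live object image shown on the first area
    obj = recognizer(frame)           # S430: recognize a particular object in the image
    second_area = search(obj)         # S440: related information shown on the second area
    return {"first_area": first_area, "second_area": second_area}

result = provide_divided_screen_service(
    60, "frame:shoes",
    recognizer=lambda f: f.split(":")[1],
    search=lambda o: f"results for {o}")
```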
  • FIG. 5 illustrates a diagram showing an example of the flowchart of the method of providing a service on each area of a divided screen as the mobile device 1000 receives a user input of bending the mobile device 1000, the method being performed by the mobile device 1000, as described with reference to FIG. 4 according to an exemplary embodiment.
  • a user of the mobile device 1000 may photograph an object 2010 in an actual space according to a user input of bending the mobile device 1000.
  • the photographed object 2010 in the actual space may be displayed on the first area 110 of a screen of the mobile device 1000.
  • An execution screen of a search application for searching for detailed information about the object 2010 photographed and displayed on the first area 110 may be displayed on the second area 120.
  • the user of the mobile device 1000 may move to a location in which the user may purchase shoes 2010, such as a department store, and perform a user input of bending the mobile device 1000 so as to photograph the shoes 2010.
  • an object 2010 in the actual space may be the shoes 2010.
  • An object image of the photographed shoes 2010 may be displayed on the first area 110 of the mobile device 1000 in real time.
  • the mobile device 1000 may recognize the shoes 2010 in the object image of the photographed shoes 2010, and execute an application related to the shoes 2010 from among search applications.
  • the mobile device 1000 may select either a search application or a mobile shopping application as the application related to the shoes 2010.
  • the mobile device 1000 may recognize that the user is currently located at the department store by recognizing location information of the mobile device 1000, and select and execute the mobile shopping application.
  • An execution screen of the mobile shopping application may be displayed on the second area 120.
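The application-selection logic of the FIG. 5 example — choosing the mobile shopping application over the generic search application when the user is at a shopping venue — might look like the sketch below. The object categories and the `"department_store"` location label are illustrative assumptions, not values from the patent.

```python
def select_application(recognized_object, location_category):
    """Pick an application for the recognized object, biased by location.

    If the object is a purchasable item and the device is at a shopping
    venue, prefer the mobile shopping application; otherwise fall back
    to the search application.
    """
    purchasable = {"shoes", "tumbler", "bag"}  # illustrative category set
    if recognized_object in purchasable and location_category == "department_store":
        return "mobile_shopping_application"
    return "search_application"

chosen = select_application("shoes", "department_store")
```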
  • FIG. 6 illustrates a flowchart showing a method of determining a location of an actual space and providing a service by using information about the determined location as the mobile device 1000 receives a user input of bending the mobile device 1000, the method being performed by the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value.
  • the first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
  • the first threshold value may be 60°.
  • the mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device is bent, by using at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
  • the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000.
  • the camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed.
  • An object image of the photographed object in the actual space may be transmitted to the mobile device 1000 in real time.
  • the mobile device 1000 determines a current location of the mobile device 1000 by using the object image of the object in the actual space which is photographed by using the camera.
  • the mobile device 1000 may include an image sensor.
  • the mobile device 1000 may recognize a particular object in the object image by sensing and analyzing the object image of the object by using the image sensor, collect location information related to the particular object, and thus, determine the current location of the mobile device 1000.
  • the mobile device 1000 may include at least one selected from the group consisting of a location information sensor, a global positioning system (GPS) sensor, and a positioning sensor, and determine a current location of the mobile device 1000 by combining location information obtained by using the at least one selected from the group consisting of the location information sensor, the GPS sensor, and the positioning sensor with space information obtained from the object image of the photographed object.
  • the mobile device 1000 may include a communication unit, and determine a current location of the mobile device 1000 via communication with an access point (AP) of a communication network service provider.
  • the mobile device 1000 provides a service related to the location determined in operation S620, from among services of applications installed on the mobile device 1000.
  • a map application, a route guide application, and a navigation application may be installed on the mobile device 1000, and the mobile device 1000 may execute the navigation application for providing route information in which the determined current location is a departure location.
  • the navigation application may add an image of a virtual route guide to the object image of the object in the actual space which is photographed in operation S610, and display the image of the virtual route guide 200 on the first area 110 of the display screen which is shown in FIG. 1. A detailed description thereof is described with reference to FIG. 7.
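Operations S620 and S630 — determining the current location from the object image (optionally combined with a GPS fix) and starting route guidance from that location — can be sketched as below. The landmark table is a hypothetical stand-in for the location information collected for a recognized object.

```python
def determine_location(image_landmark, gps_fix):
    """S620 sketch: prefer a location derived from a recognized landmark
    in the object image; fall back to the GPS fix otherwise."""
    landmarks = {"subway_station_sign": "Myeongdong station"}  # illustrative
    return landmarks.get(image_landmark, gps_fix)

def start_navigation(current_location, destination):
    """S630 sketch: route with the determined location as the departure."""
    return {"departure": current_location, "destination": destination}

route = start_navigation(
    determine_location("subway_station_sign", "raw GPS fix"),
    "preset destination")
```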
  • FIG. 7 illustrates a diagram for explaining an example of determining a location of an actual space and providing a navigation service by using the determined location information as the mobile device 1000 receives a user input of bending the mobile device 1000, the determining and the providing being performed by the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 photographs an actual space and displays an object image obtained by photographing the actual space. Then, the mobile device 1000 may determine a location of the mobile device 1000 by using the object image, and then, execute an application related to the determined location. According to an exemplary embodiment, as the mobile device 1000 receives a user input of bending the mobile device 1000, the mobile device 1000 may photograph an actual space in a periphery of the mobile device 1000, for example, a subway station 2020, and display an image of the subway station 2020 on the first area 110.
  • the mobile device 1000 may execute a search application by using the image of the subway station 2020, that is, an object image of the subway station 2020, or determine a name of the subway station 2020 near which the mobile device 1000 is located by activating a location sensor or a GPS sensor. For example, if a current location of the mobile device 1000 is determined as being in a periphery of Myeongdong station, the mobile device 1000 may execute a navigation application for guiding a user through a route from the periphery of Myeongdong station that is determined as a departure location to a destination preset by the user.
  • a service provided by the navigation application may display the object image of the photographed subway station 2020 and a virtual route guide 200 for guiding the user to the preset destination on the first area 110 of the screen.
  • the navigation application may be an application for providing a virtual reality service.
  • the service provided by the navigation application may photograph an actual moving route, and display the photographed actual moving route and the virtual route guide 200 for guiding the user to the preset destination on the first area 110 in real time.
  • FIG. 8 illustrates a flowchart of a method of storing location information of an actual space and providing information about a route to a location of which location information is stored, the method being performed by the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value.
  • the first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
  • the first threshold value may be 60°.
  • the mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device is bent, by using at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
  • the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000, and determines a current location of the mobile device 1000 by using an object image of the photographed object.
  • the camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed.
  • the object image of the photographed object may be transmitted to the mobile device 1000 in real time, and displayed on the first area 110 of the screen of the mobile device 1000 in the form of a streaming video in real time.
  • the mobile device 1000 may determine a current location of the mobile device 1000 in correspondence with the object image of the photographed object, by using an image sensor, a location sensor, or a GPS sensor.
  • the mobile device 1000 stores an information value with respect to the location determined in operation S810.
  • the mobile device 1000 may include at least one selected from the group consisting of a location information sensor, a GPS sensor, and a positioning sensor.
  • the mobile device 1000 may determine an information value that includes a coordinate value of the current location of the mobile device 1000 by combining location information obtained by using the at least one selected from the group consisting of the location information sensor, the GPS sensor, and the positioning sensor with space information obtained from the object image, and store the information value as a location coordinate value.
  • the mobile device 1000 may include a communication unit, and determine a current location of the mobile device 1000 via communication with an AP, which is disposed in the actual space and provided by a communication network service provider.
  • the mobile device 1000 may store information about the determined current location as a numerical value.
  • a service for providing information about a route from a location to which the mobile device 1000 has moved to the location of which the information value is stored in operation S820 is provided. If a user of the mobile device 1000 has moved from a first location that is the location of which the information value is stored in operation S820 to a second location that is a current location, the mobile device 1000 may provide a navigation service for guiding the user through a route by recognizing the second location that is the current location and designating the first location as a destination.
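The FIG. 8 flow — storing a coordinate value for the first location in operation S820 and later providing a route back to it in operation S830 — reduces to a small sketch. Coordinates are shown as illustrative latitude/longitude tuples; the class name is an assumption.

```python
class LocationMemo:
    """Sketch of FIG. 8: remember the first location, route back later."""

    def __init__(self):
        self.stored = None

    def store(self, coordinate):
        """S820: store the information value of the determined location."""
        self.stored = coordinate

    def route_back(self, current):
        """S830: route from the current (second) location to the stored
        (first) location; nothing to guide to if nothing was stored."""
        if self.stored is None:
            return None
        return {"departure": current, "destination": self.stored}

memo = LocationMemo()
memo.store((37.5, 127.0))               # first location, e.g. the parking spot
route = memo.route_back((37.6, 127.1))  # second (current) location
```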
  • FIGS. 9A and 9B illustrate diagrams for explaining a navigation service for storing location information about a first location and, as the mobile device 1000 moves to a second location, providing information about a route from the second location to the first location, the storing and the providing being performed by the mobile device 1000 as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 may recognize a first location that is a surrounding space 2030, and store coordinate location information about the recognized first location.
  • the first location that is the surrounding space 2030 may be a space named “B-24” in the fourth basement floor of a parking lot where a vehicle of the user is parked
  • the mobile device 1000 may photograph the surrounding space 2030 of B-24 in the fourth basement floor in which the vehicle of the user is parked, and display an object image of the photographed surrounding space 2030 on the first area 110.
  • the mobile device 1000 may recognize the text, “B-24 in the fourth basement floor” shown in the object image, or recognize and store location information of the surrounding space 2030 of B-24 in the fourth basement floor in the parking lot by using a GPS sensor or a positioning sensor.
  • a notification screen showing that current location information, that is, information of “B-24 in the fourth basement floor” in the parking lot is stored may be displayed on the second area 120 of a screen of the mobile device 1000.
  • a user 10 holding and using the mobile device 1000 may move from the first location 2030 that is the surrounding space 2030 shown in FIG. 9B to a second location. If the user 10 moves to the second location and performs a user input of bending the mobile device 1000, the mobile device 1000 may photograph a surrounding space of the second location by using a camera and display an image obtained by photographing the surrounding space on the first area 110. Additionally, the mobile device 1000 may determine a current location based on the object image of the surrounding space of the second location displayed on the first area 110, and display a virtual route guide 210 for providing route information, in which a current location is a departure location and the first location is a destination, on the first area 110.
  • the mobile device 1000 may receive a user input of bending the mobile device 1000 and provide a route guide service for moving to the surrounding space 2030 of “B-24 in the fourth basement floor” by using the stored location information about “B-24 in the fourth basement floor”.
  • the route guide service may be a virtual reality application for displaying the virtual route guide 210 on the first area 110 of the screen of the mobile device 1000.
  • a map application may be executed on the second area 120 of the screen of the mobile device 1000, and the map application may display a current location of the user 10.
  • the map application may determine a location of the mobile device 1000 by using the GPS sensor included in the mobile device 1000.
  • FIG. 10 illustrates a flowchart of a method of recognizing an object image in an actual space and providing a service related to the recognized object image as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value.
  • the first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
  • the first threshold value may be 60°.
  • the mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device is bent, by using at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
  • the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000.
  • the camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed.
  • An object image of the photographed object may be transmitted to the mobile device 1000 in real time.
  • the mobile device 1000 recognizes a particular object in the object image of the actual space which is photographed by the camera.
  • the mobile device 1000 may include an image sensor, and recognize the particular object in the object image by sensing and analyzing the object image by using the image sensor.
  • the mobile device 1000 provides a service for providing information related to the recognized object image.
  • the mobile device 1000 may execute an application related to the recognized particular object, from among at least one application installed on the mobile device 1000, and thus, display detailed information about the particular object on the second area 120 of the mobile device 1000.
  • a search application for providing a service for searching for the detailed information about the particular object, recognized in operation S1020 may be executed, and an execution screen of the search application may be displayed on the second area 120.
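Operations S1020 and S1030 — recognizing a particular object in the photographed image and handing it to an installed application that can supply related detail — can be sketched as a small dispatch. The `label` field and the `detail_sources` mapping are illustrative assumptions standing in for the image sensor's output and the installed applications.

```python
def recognize_object(object_image):
    """S1020 stand-in: assume the analyzed image carries a detectable label."""
    return object_image.get("label")

def related_information_service(object_image, detail_sources):
    """S1030 sketch: pick the application that can provide detail about
    the recognized object; its output would go to the second area 120."""
    obj = recognize_object(object_image)
    provider = detail_sources.get(obj)
    return provider(obj) if provider else None

info = related_information_service(
    {"label": "tumbler"},
    {"tumbler": lambda o: f"web search results for {o}"})
```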
  • FIG. 11 illustrates a diagram for explaining an example of recognizing an object image in an actual space and providing a search service related to the recognized object image as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 may display an object image obtained by photographing an actual space on the first area 110, recognize a particular object in the object image of the photographed actual space, and thus, provide a search service for searching for detailed information about the particular object.
  • the mobile device 1000 may photograph an actual space that includes a tumbler 2040, and display the actual space on the first area 110 of the screen. Additionally, the mobile device 1000 may recognize the tumbler 2040 that is a particular object in the photographed actual space.
  • the mobile device 1000 may include an image sensor, and the image sensor may recognize a shape of the tumbler 2040 by analyzing an object image of the actual space and transmit object information about the recognized shape of the tumbler 2040 to the mobile device 1000.
  • the mobile device 1000 may execute an application for searching for detailed information about the tumbler 2040 that is the particular object.
  • the search application may be an application through which a “tumbler” is searched for in web pages and detailed information thereof is obtained.
  • An execution screen of the search application may be displayed on the second area 120.
  • FIG. 12 illustrates a diagram for explaining an example of recognizing an object image of an actual space and providing a text recognition and translation service related to the recognized object image as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 may display an object image obtained by photographing an actual space on the first area 110, recognize a particular object in the object image of the photographed actual space, and thus, provide a search service or a translation service related to the recognized text.
  • the mobile device 1000 may photograph an object 2050 in an actual space which includes text “Se-jong Dae Wang”, and display the object 2050 on the first area 110.
  • the mobile device 1000 may recognize the text “Se-jong Dae Wang” in an object image of the object 2050 displayed on the first area 110.
  • the mobile device 1000 may include an image sensor, and the image sensor may recognize the text “Se-jong Dae Wang” by analyzing the object image of the object 2050, and transmit information about the recognized text to the mobile device 1000.
  • the mobile device 1000 may provide a service of translating the text “Se-jong Dae Wang” into a language preset by a user. For example, the mobile device 1000 may provide a service of translating the text “Se-jong Dae Wang” into English “King Se-jong”.
  • a translation result 220 obtained by translating the text “Se-jong Dae Wang” into the language preset by the user may be converted into a virtual image and displayed on the first area 110 of the screen.
  • the translation result 220 may be displayed on the first area 110 to overlap with the object 2050 in the actual space.
  • the mobile device 1000 may provide a search service by executing an application for searching for information about the translation result 220.
  • a screen of the application for performing a detailed search for information about the translation result 220 may be displayed on the second area 120.
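The FIG. 12 service — recognized text translated into the user's preset language and rendered as a virtual image overlapping the object on the first area 110 — might be sketched as below. The dictionary is an illustrative stand-in for the translation service; the returned descriptor fields are assumptions.

```python
def translate_and_overlay(recognized_text, dictionary, target="en"):
    """Translate recognized text and describe the overlay to display.

    Falls back to the original text when no translation is available,
    mirroring a conservative default (an assumption, not the patent's rule).
    """
    translation = dictionary.get((recognized_text, target), recognized_text)
    return {
        "overlay_text": translation,    # virtual image content (e.g. result 220)
        "area": "first_area",           # displayed on the first area 110
        "overlaps_object": True,        # drawn over the object 2050
    }

result = translate_and_overlay(
    "Se-jong Dae Wang",
    {("Se-jong Dae Wang", "en"): "King Se-jong"})
```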
  • FIG. 13 illustrates a flowchart showing a method of recognizing a peripheral device and providing a service related to the recognized peripheral device, as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value.
  • the first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
  • the first threshold value may be 60°.
  • the mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device is bent, by using at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
  • the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000.
  • the camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed.
  • An object image of the photographed object may be transmitted to the mobile device 1000 in real time.
  • the mobile device 1000 recognizes a peripheral device in the object image of the object in the actual space which is photographed by the camera.
  • the mobile device 1000 may include an image sensor, and recognize the peripheral device in the object image by sensing and analyzing the object image of the photographed object by using the image sensor.
  • the mobile device 1000 obtains device information of the recognized peripheral device.
  • the mobile device 1000 may obtain the device information that includes a service set identifier (SSID), a device model number, or the like of the peripheral device recognized in operation S1320.
  • the mobile device 1000 provides a service related to the recognized peripheral device by using the obtained device information.
  • the mobile device 1000 may determine whether the mobile device 1000 may communicate with the recognized peripheral device, and establish communication with the recognized peripheral device.
  • the mobile device 1000 may share content such as a photograph, music, or a video clip with the recognized peripheral device.
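The FIG. 13 flow (operations S1310 through S1340) — recognize a peripheral device in the object image, obtain its device information such as an SSID and a model number, and share content once communication is established — can be sketched as follows. The registry mapping recognized shapes to device information is an illustrative assumption.

```python
class PeripheralService:
    """Sketch of FIG. 13: recognize, obtain device info, connect, share."""

    def __init__(self, registry):
        self.registry = registry  # recognized shape -> device information
        self.connected = None

    def recognize_and_connect(self, object_image_label):
        """S1320-S1340: look up device info for the recognized shape and
        establish communication when the device is communicable."""
        info = self.registry.get(object_image_label)
        if info and info.get("communicable"):
            self.connected = info["ssid"]
        return self.connected

    def share(self, content):
        """Share content (a photograph, music, a video clip) if connected."""
        return {"to": self.connected, "content": content} if self.connected else None

svc = PeripheralService(
    {"tv": {"ssid": "LivingRoomTV", "model": "X-100", "communicable": True}})
svc.recognize_and_connect("tv")
shared = svc.share("video clip")
```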
  • FIG. 14 illustrates a diagram for explaining an example of recognizing a peripheral device 3000 and providing a service for connecting to the recognized peripheral device 3000 as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 may photograph an object 3000 in an actual space by using a camera, and display an object image of the photographed object 3000 on the first area 110 of a screen of the mobile device 1000.
  • the mobile device 1000 may recognize a peripheral device 3000 that is included in the object image of the photographed object 3000.
  • the peripheral device 3000 that may be recognized may be, for example, a TV, a tablet PC, a refrigerator, or a gas stove.
  • the mobile device 1000 may recognize the TV 3000 included in the object image.
  • the mobile device 1000 may establish communication with the TV 3000 recognized in the object image, and thus, be connected to the TV 3000 via a communication network.
  • the mobile device 1000 may include a communication unit, and establish communication with the peripheral device 3000 and thus, be connected to the peripheral device 3000.
  • the mobile device 1000 may execute an application through which a video clip is viewed, on the second area 120 of the screen.
  • the video playback application may include a video playback screen 132 and a video information providing screen 124 that provides information about a video clip that is being played. If the video playback application is being executed, the mobile device 1000 may establish communication with the TV 3000 and receive a separate user input, and thus, share video content with the TV 3000. In other words, if the mobile device 1000 receives a separate user input, for example, a swipe input, the mobile device 1000 may share video content that is being played on the video playback screen 132 with the TV 3000 that has established communication with the mobile device 1000.
  • the shared video content, that is, the video content that is being played on the video playback screen 132 of the mobile device 1000, may be mirrored, displayed, and played on the TV 3000.
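The swipe-to-mirror behavior of FIG. 14 reduces to a small sketch: a separate user input (a swipe, in the example) while video content is playing mirrors that content to the TV with which communication is already established. Input-type labels are illustrative assumptions.

```python
def handle_user_input(input_type, playing_content, connected_tv):
    """Mirror the playing content to the connected TV on a swipe input;
    any other input, or no established connection, does nothing."""
    if input_type == "swipe" and connected_tv is not None:
        return {"mirrored_on": connected_tv, "content": playing_content}
    return None

mirrored = handle_user_input("swipe", "video clip", "TV 3000")
```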
  • FIGS. 15 and 16 illustrate diagrams for explaining an example of recognizing a wearable device 3010 and providing a service related to the recognized wearable device as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 may obtain device information of the wearable device 3010 located near the mobile device 1000 and establish communication with the wearable device 3010 to be connected to the wearable device 3010.
  • the mobile device 1000 may recognize the wearable device 3010 by obtaining the device information that includes an SSID, a device model number, or the like of the wearable device 3010.
  • the wearable device 3010 may be glasses that have a communication function and a data processing function.
  • the wearable device 3010 may be worn by a user, and photograph an object in an actual space by using a camera that is directed toward the front of the user.
  • the mobile device 1000 may display a list of services, provided by an application that may be executed by the wearable device 3010, on the second area 120.
  • the user may select a service from the list of the services displayed on the second area 120, and thus, be provided with the service of using the wearable device 3010.
  • when the mobile device 1000 connected to the wearable device 3010 receives the user input of bending the mobile device 1000, the mobile device 1000 may transmit an object image of an actual space 2060, which is located in a periphery of the mobile device 1000 and photographed by a camera, to the wearable device 3010.
  • the wearable device 3010 may display virtual signs 2072 and 2074 on the transmitted object image of the actual space 2060, and provide a service related to the actual space 2060.
  • the wearable device 3010 may display a virtual image 2070 which is obtained by adding the virtual signs 2072 and 2074 to the transmitted object image of the actual space 2060.
  • the wearable device 3010 may provide a navigation service.
  • the navigation service may be a service of displaying the virtual signs 2072 and 2074 for guiding a user to move to a place preset by the user on a display unit 3012.
  • FIG. 17 illustrates a diagram for explaining an example of recognizing a peripheral device 3020 and sharing content with the peripheral device 3020 as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 may recognize the peripheral device 3020 in an object image of an object photographed by a camera, and share content stored in the mobile device 1000 with the peripheral device 3020.
  • the mobile device 1000 may include an image sensor, and recognize the peripheral device 3020 in the object image by sensing and analyzing the object image of the photographed object by using the image sensor.
  • the mobile device 1000 may obtain device identification information of the peripheral device 3020.
  • the obtained device information may include an SSID, a device model number, or the like of the peripheral device 3020.
  • the mobile device 1000 may establish communication with the peripheral device 3020 based on obtained identification information of the peripheral device 3020, and display a user interface for determining whether to share content with the peripheral device 3020 on the second area 120.
  • the mobile device 1000 may be connected to the peripheral device 3020, for example, a mobile phone 3020 via a short-range wireless communication.
  • the short-range wireless communication via which the mobile device 1000 is connected to the mobile phone 3020 may include a Beacon communication, near-field communication (NFC), a ZigBee communication, a radio frequency identification (RFID) communication, an ultra-wide band (UWB) communication, or a Bluetooth communication, but is not limited thereto.
  • the mobile device 1000 may obtain an SSID of the mobile phone 3020 or user certification information of the mobile phone 3020 (for example, communication network subscription information, a user identification (ID), or the like), and determine whether to share content with the mobile phone 3020 based on the obtained user certification information.
  • the mobile device 1000 may display contents to be shared with the mobile phone 3020, on the second area 120 of the screen. A user may select and share content that the user wants to share with the mobile phone 3020, from among the contents displayed on the second area 120 of the mobile device 1000.
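The sharing flow described above — recognize the peripheral device in the object image, obtain its identification and user certification information, then decide whether to offer the sharing user interface — can be sketched as follows. The field names (`ssid`, `model`, `user_id`) and the certified-user set are illustrative assumptions, not part of the disclosed embodiment.

```python
# Sketch of the content-sharing decision described above. The field
# names and the certified-user set are illustrative assumptions.

CERTIFIED_USERS = {"alice", "bob"}  # hypothetical subscription records

def obtain_device_info(object_image):
    """Stand-in for image-sensor recognition of a peripheral device."""
    # A real implementation would analyze the photographed object image.
    return {"ssid": "Phone-3020", "model": "SM-XXXX",
            "user_id": object_image.get("owner")}

def may_share_content(device_info):
    """Decide whether to display the sharing user interface."""
    return device_info.get("user_id") in CERTIFIED_USERS

info = obtain_device_info({"owner": "alice"})
print(may_share_content(info))  # a certified user -> True
```

Only after this check succeeds would the mobile device 1000 list sharable contents on the second area 120 for the user to select.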
  • FIG. 18 illustrates a flowchart of a method of providing a service from among services related to an application that is being executed as the mobile device 1000 receives user inputs of bending the mobile device 1000 at different angles from each other, according to an exemplary embodiment.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value.
  • the first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
  • the first threshold value may be 60°.
  • the mobile device 1000 displays detailed information regarding a location of the mobile device 1000 and the application that is being executed on the first area 110 of a screen of the mobile device 1000 which is shown in FIG. 1.
  • the mobile device 1000 may include at least one selected from the group consisting of a location information sensor, a GPS and a positioning sensor, and determine information about a current location of the mobile device 1000 by using the at least one selected from the group consisting of the location information sensor, the GPS and the positioning sensor.
  • the mobile device 1000 may include a communication unit, and determine a current location of the mobile device 1000 via communication with an AP of a communication network service provider which is disposed in an actual space.
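The two location-determination paths in the bullets above — a fix from the location information sensor, GPS, or positioning sensor, or else the access point (AP) of a communication network service provider — can be sketched as a simple fallback. The coordinate values and data shapes are illustrative assumptions.

```python
# Sketch of the location fallback described above: prefer a sensor fix,
# otherwise use the AP-derived location. Values are illustrative.

def current_location(gps_fix, ap_location):
    """Prefer the GPS/positioning-sensor fix; fall back to the AP location."""
    return gps_fix if gps_fix is not None else ap_location

print(current_location((37.46, 126.71), (37.45, 126.70)))  # sensor fix wins
print(current_location(None, (37.45, 126.70)))             # AP fallback
```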
  • At least one application may be installed on the mobile device 1000, and the mobile device 1000 may be executing the installed at least one application.
  • the mobile device 1000 may execute an application related to current location information about the mobile device 1000, from among the at least one application that is being executed, and thus, provide a service related to the current location information.
  • the related service may be a navigation service in which an application for searching for location information, that is, a map application is executed.
  • the related service may be a search service in which an application for searching for product information or a product cost is executed.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a second threshold value.
  • the second threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
  • the second threshold value may be 60°.
  • the second threshold value for bending the mobile device 1000 may be a value corresponding to a user input for bending the mobile device at a greater angle than the first threshold value for bending the mobile device 1000 in operation S1800.
  • the first threshold value may be 30°
  • the second threshold value may be 60°.
  • the user input of bending the mobile device 1000 in correspondence with the second threshold value may be received successively in a time interval after the user input of bending the mobile device 1000 in correspondence with the first threshold value.
  • the mobile device 1000 is bent in correspondence with the first threshold value in operation S1800, and then, successively bent in correspondence with the second threshold value in operation S1820 without being unbent and becoming flat.
  • the bending of the mobile device 1000 is not limited thereto.
  • the mobile device 1000 may be bent in correspondence with the first threshold value in operation S1800, unbent, and then, bent in correspondence with the second threshold value in operation S1820.
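The successive threshold inputs described above — a bend to the first threshold, then a further bend to the second, with or without an intermediate return to flat — can be sketched as an event detector over a stream of measured bend angles. The threshold values 30° and 60° follow the examples in the text; the sampling interface is an assumption.

```python
# Minimal sketch of detecting the two successive bend inputs described
# above from a stream of bend-angle samples (in degrees).

FIRST_THRESHOLD = 30
SECOND_THRESHOLD = 60

def detect_bend_events(angle_samples):
    """Return the ordered list of threshold events ('first', 'second')."""
    events = []
    armed_first, armed_second = True, True
    for angle in angle_samples:
        if armed_first and angle >= FIRST_THRESHOLD:
            events.append("first")
            armed_first = False
        if armed_second and angle >= SECOND_THRESHOLD:
            events.append("second")
            armed_second = False
    return events

# Bent past 30°, then further past 60° without flattening in between:
print(detect_bend_events([0, 15, 32, 45, 61]))  # ['first', 'second']
# Bent past 30°, returned flat, then bent past 60°:
print(detect_bend_events([0, 35, 5, 0, 62]))    # ['first', 'second']
```

Both sequences yield the same two events, matching the text's note that the device may or may not be unbent between the two inputs.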
  • the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000.
  • the camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed.
  • An object image of the photographed object may be transmitted to the mobile device 1000 in real time.
  • the mobile device 1000 provides a service related to the object image of the photographed object, from among services of the application that is being executed.
  • the mobile device 1000 may obtain current location information about the mobile device 1000 from the object image of the photographed object.
  • the mobile device 1000 may execute a service related to the location information which is executed in operation S1810, based on the obtained location information so as to provide the service related to the location information.
  • the service related to the location information may be a service for guiding a user through a route by executing a navigation application or a map application.
  • the service related to the location information may be a service related to virtual reality for combining a virtual image with the first area 110 of the mobile device 1000, and thus, display the virtual image on the first area 110 of the mobile device 1000.
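The selection of a location-related service from among the applications that are being executed, as described above, can be sketched as a lookup from a location context to a running application. The application names and the context-to-service mapping are illustrative assumptions, not part of the disclosure.

```python
# Sketch of choosing a location-related service from among running
# applications, as described above. Names and mappings are illustrative.

RUNNING_APPS = {
    "map_app": "navigation",   # guides a route (navigation service)
    "shopping_app": "search",  # looks up product information (search service)
    "music_app": None,         # unrelated to location information
}

def pick_service(location_context, running_apps):
    """Pick the running app whose service matches the location context,
    e.g. a street location maps to navigation, a store to product search."""
    wanted = {"street": "navigation", "store": "search"}[location_context]
    for app, service in running_apps.items():
        if service == wanted:
            return app
    return None

print(pick_service("street", RUNNING_APPS))  # map_app
```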
  • FIG. 19 illustrates a diagram for explaining an example in which the mobile device 1000 is bent, as the mobile device 1000 receives user inputs of bending the mobile device 1000 respectively in correspondence with a first threshold value and a second threshold value, which are different from each other.
  • the mobile device 1000 may include a display area that is divided into a plurality of parts that includes the first area 110 and the second area 120.
  • the display area is divided into the first area 110 and the second area 120.
  • respective areas included in the display area may contact each other, and be bent or unbent, or folded or unfolded at a preset angle.
  • Each area may be formed of a touchscreen.
  • a frame that supports the display screen may also be bent and divided into two parts.
  • the frame may be divided into a first frame 132 that surrounds and supports the first area 110 and a second frame 134 that surrounds and supports the second area 120.
  • An angle between the first frame 132 and the second frame 134 may vary with a degree to which the mobile device 1000 is bent. If the second frame 134 is bent to a location 134-1 in correspondence with the first threshold value, the first area 110 and the second area 120 may also be bent in correspondence with the first threshold value.
  • If the second frame 134 is bent to a location 134-2 in correspondence with the second threshold value, the first area 110 and the second area 120 may also be bent in correspondence with the second threshold value.
  • the first threshold value may be 30°
  • the second threshold value may be 60°.
  • FIGS. 20A and 20B illustrate diagrams for explaining an example of providing a service as the mobile device 1000 receives user inputs of bending the mobile device 1000 respectively in correspondence with a first threshold value and a second threshold value, which are different from each other.
  • the mobile device 1000 may provide a service of determining a current location of the mobile device 1000, searching for detailed information related to the determined location, and thus, displaying the detailed information on the first area 110 of the screen.
  • the mobile device 1000 may determine a location of the entrance 2080 of the airport, and store a value of location information.
  • the mobile device 1000 may determine a current location of the mobile device 1000, by determining the location information of the entrance 2080 of the airport by using a GPS sensor or a location sensor. Additionally, according to an exemplary embodiment, the mobile device 1000 may determine a current location of the mobile device 1000 based on AP usage information provided by a communication network provider, which is obtained by a communication unit included in the mobile device 1000. If the mobile device 1000 is executing, for example, an airport information application, the mobile device 1000 may display detailed information about a current location, that is, the entrance 2080 of the airport, on the first area 110. The detailed information may include a name of the airport, a name of a flight departing from the airport, or information about a departure time of the flight. Such information may be found on web pages.
  • the mobile device 1000 may photograph an object in an actual space by using a camera, and provide a service related to an object image of the photographed object from among services that may be provided by an application that is being executed.
  • the mobile device 1000 may obtain an object image by photographing the entrance 2080 of the airport, and display the object image on the first area 110.
  • the mobile device 1000 may provide a service related to the object image of the entrance 2080 of the airport which is displayed on the first area 110, from among services that may be provided by the airport information application described with reference to FIG.
  • the mobile device 1000 may provide a navigation service of setting a place preset by a user, for example, a place for check-in before flight departure as a destination.
  • the navigation service may be a virtual reality service for guiding a user by displaying a virtual guide sign 230 on the object image of the entrance 2080 of the airport which is displayed on the first area 110.
  • the mobile device 1000 may display a map application for displaying a current location of the mobile device 1000 on the second area of the screen.
  • FIG. 21 illustrates a flowchart of a method of providing a service as the mobile device 1000 receives user inputs of bending the mobile device 1000 respectively in correspondence with a first threshold value and a second threshold value, which are different from each other.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with the first threshold value.
  • the first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
  • the first threshold value may be 60°.
  • the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000.
  • the camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed.
  • An object image of the photographed object may be transmitted to the mobile device 1000 in real time.
  • the mobile device 1000 provides a service related to the object image of the photographed object, from among services of the application that is being executed.
  • the mobile device 1000 may recognize a particular object in the object image of the actual space photographed by using the camera in operation S2110, and execute an application related to the recognized particular object.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with the second threshold value.
  • the second threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
  • the second threshold value may be 30°.
  • the mobile device 1000 may receive a user input of bending the mobile device at 30° sequentially after the mobile device 1000 is bent at 60° in operation S2100.
  • the mobile device 1000 displays additional information related to a particular object included in the object image on the second area 120 of the screen.
  • the mobile device 1000 may include an image sensor, and recognize the particular object in the object image by sensing and analyzing the object image of the photographed object by using the image sensor.
  • the mobile device 1000 may execute an application related to the recognized particular object from among at least one application installed on the mobile device 1000, and thus, display detailed information about the particular object on the second area 120 of the mobile device 1000.
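The flow above — recognize a particular object in the captured object image, then launch the related installed application to display detailed information on the second area 120 — can be sketched as a registry lookup. The object labels and application names are illustrative assumptions.

```python
# Sketch of mapping a recognized object to a related installed
# application, as described above. Registry contents are illustrative.

APP_REGISTRY = {
    "running_shoes": "shopping_search_app",
    "airport_entrance": "airport_info_app",
}

def app_for_object(recognized_label):
    """Return the installed application related to the recognized object,
    or None if no related application is installed."""
    return APP_REGISTRY.get(recognized_label)

print(app_for_object("running_shoes"))  # shopping_search_app
```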
  • FIGS. 22A and 22B illustrate diagrams for explaining an example of providing a service as the mobile device 1000 receives user inputs of bending the mobile device 1000 respectively in correspondence with a first threshold value and a second threshold value, which are different from each other, the providing being performed by the mobile device 1000 according to an exemplary embodiment.
  • the mobile device 1000 may photograph the running shoes 2090.
  • the photographed running shoes 2090 may be displayed on the first area 110 of the mobile device 1000 in real time.
  • the mobile device 1000 may recognize the running shoes 2090 in an object image of the photographed running shoes 2090, execute an application related to the running shoes 2090 from among search applications installed on the mobile device 1000, and thus, display the application on the second area 120 of the screen.
  • the mobile device 1000 may display either a search application or a mobile shopping application as an application related to the running shoes 2090 on the second area 120 of the screen.
  • the mobile device 1000 may recognize the running shoes 2090, which are an object located in the actual space, and thus, execute a search application for searching for detailed information about the running shoes 2090.
  • the mobile device 1000 may include an image sensor, and recognize a type, a manufacturer, or the like of the running shoes 2090 by sensing and analyzing an object image of the photographed running shoes 2090 by using the image sensor.
  • the search application may be an application for searching for and displaying information in web pages, wherein the information includes a size, quantity of stocks, a cost, a discount, and evaluations by consumers with respect to the running shoes 2090.
  • An execution screen of the search application may be displayed on the second area 120.
  • FIG. 23 illustrates a flowchart of a method of providing a service as the mobile device 1000 sequentially receives a user input of designating a particular location and a user input of bending the mobile device 1000, according to an exemplary embodiment.
  • the mobile device 1000 receives a user input of designating a particular location, and stores an information value of the particular location.
  • the mobile device 1000 may include a location sensor, a GPS sensor, or a positioning sensor, and store the information value of the particular location according to the user input as a numerical value.
  • the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value.
  • the first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
  • the first threshold value may be 60°.
  • the mobile device 1000 determines whether the mobile device 1000 is placed in the particular location. According to an exemplary embodiment, the mobile device 1000 may compare the particular location that is stored according to the user input received in operation S2300 to a coordinate value of a current location of the mobile device 1000, and determine whether the current location is identical to the stored particular location.
  • the mobile device 1000 uploads particular data designated by a user to a server, or downloads particular data from the server. If the mobile device 1000 sequentially receives a user input of designating a particular location, for example, location information of a lecture room, and a user input of bending the mobile device 1000 in correspondence with the first threshold value, the mobile device 1000 may upload prestored information about homework to the server. Alternatively, if the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with the first threshold value and recognizes that the particular location is, for example, a lecture room, the mobile device 1000 may download the information about homework stored in the server.
  • the mobile device 1000 may receive a user input of unbending the mobile device 1000.
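The method of FIG. 23 — compare the location stored in operation S2300 with the current coordinate value and, on a match combined with the bend input, upload or download the designated data — can be sketched as follows. The coordinate tolerance and the upload/download decision flag are illustrative assumptions.

```python
# Sketch of the FIG. 23 flow described above: a stored particular
# location is matched against the current coordinates before syncing.
# The tolerance and decision flag are illustrative assumptions.

TOLERANCE_DEG = 0.0005  # assumed matching radius (roughly tens of meters)

def at_particular_location(stored, current, tol=TOLERANCE_DEG):
    """True if the current coordinate matches the stored location."""
    return all(abs(s - c) <= tol for s, c in zip(stored, current))

def sync_on_bend(stored, current, bend_received, has_local_data):
    """Return the action to take: 'upload', 'download', or None."""
    if not (bend_received and at_particular_location(stored, current)):
        return None
    return "upload" if has_local_data else "download"

lecture_room = (37.4563, 126.7052)
print(sync_on_bend(lecture_room, (37.4563, 126.7052), True, True))  # upload
print(sync_on_bend(lecture_room, (37.5000, 126.7052), True, True))  # None
```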
  • FIG. 24 is a block diagram of the mobile device 1000 according to an exemplary embodiment.
  • the mobile device 1000 may include a user input unit 1100, an output unit 1200, a camera unit 1300, a communication unit 1400, a memory 1500, and a control unit 1600.
  • the user input unit 1100 receives a user input.
  • the user input unit 1100 may receive a user input of bending the mobile device 1000.
  • the user input unit 1100 may include a sensor unit 1110 and a sound input unit 1120.
  • the user input unit 1100 may detect a user input by using the sensor unit 1110 included in the mobile device 1000.
  • the sensor unit 1110 may include, for example, a vibration sensor, an angle sensor, a gyro sensor, an acceleration sensor, a pressure sensor, a gravity sensor, or a touch sensor.
  • the sensor unit 1110 may calculate an angle at which the frame 130 of the mobile device 1000 shown in FIG. 3 and the display screen 100 shown in FIG. 3 are bent, and transmit a value of the calculated angle to the control unit 1600.
  • the sensor unit 1110 may determine whether the mobile device 1000 is bent in correspondence with a first threshold value or a second threshold value, by using a terrestrial magnetic sensor, the gyro sensor, or the acceleration sensor which may detect an angle at which the mobile device 1000 is bent with reference to a ground surface. Additionally, the sensor unit 1110 may determine a state when the mobile device 1000 is unbent. For example, the user input unit 1100 may detect a user input by using the sound input unit 1120.
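The sensor-unit logic above — deciding from a measured bend angle whether the device is flat, bent to the first threshold, or bent to the second threshold — can be sketched as a simple classifier. The 30°/60° values follow the examples in the text; the measurement tolerance is an illustrative assumption.

```python
# Sketch of classifying a measured bend angle (degrees from the
# supporting surface) into a device state, per the description above.

FIRST_THRESHOLD = 30   # degrees, per the examples in the text
SECOND_THRESHOLD = 60
ANGLE_TOLERANCE = 5    # assumed measurement tolerance

def classify_bend(angle):
    """Classify a measured bend angle into a device state."""
    if abs(angle - SECOND_THRESHOLD) <= ANGLE_TOLERANCE:
        return "second_threshold"
    if abs(angle - FIRST_THRESHOLD) <= ANGLE_TOLERANCE:
        return "first_threshold"
    if angle <= ANGLE_TOLERANCE:
        return "flat"
    return "unknown"

print(classify_bend(0))   # flat
print(classify_bend(31))  # first_threshold
print(classify_bend(58))  # second_threshold
```

The classified state would then be passed to the control unit 1600, which selects the corresponding service.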
  • the output unit 1200 outputs data of the mobile device 1000.
  • the output unit 1200 may display an object that includes at least one selected from the group consisting of an image, a video clip, and text. Additionally, the output unit 1200 may output voice.
  • the display unit 1210 displays data of the mobile device 1000.
  • the display unit 1210 may include the display screen 100 and the first area 110 and the second area 120 of the display screen 100, which are shown in FIGS. 1 through 23.
  • the display unit 1210 may display an object that includes at least one selected from the group consisting of an image, a video clip, and text.
  • the display unit 1210 may be formed of a touchscreen.
  • the display unit 1210 may output a screen from among a booting screen, a lock screen, a menu screen, a home screen, and an application activation screen.
  • the display screen 100 may be formed of a flexible material that may be bent.
  • the display unit 1210 may be formed of an F-AMOLED display or an FLCD.
  • the sound output unit 1220 outputs voice played by the mobile device 1000.
  • the sound output unit 1220 may include a speaker (SPK) for playing audio data transceived during a phone call using the mobile device 1000.
  • the sound output unit 1220 may play and output an audio signal for an application that is executed when the mobile device 1000 receives a user input of bending the mobile device 1000.
  • the camera unit 1300 may be located at a front surface or a rear surface of the mobile device 1000, and obtain an object image by photographing an actual space that is in a periphery of the mobile device 1000.
  • the camera unit 1300 may include a camera sensor 1310, a signal processing unit 1320, and an image sensor 1330.
  • the camera sensor 1310 may photograph an object included in the actual space that is in the periphery of the mobile device 1000 and convert an object image of the photographed object into an electrical signal.
  • the signal processing unit 1320 may convert an analog image signal obtained by photographing of the object by using the camera sensor into digital data.
  • the camera sensor 1310 may be a CMOS or a CCD sensor.
  • the signal processing unit 1320 may be implemented as a digital signal processor (DSP).
  • the image sensor 1330 may determine a current location of the mobile device 1000 by recognizing a particular object in the object image photographed by using the camera sensor 1310, or by collecting location information related to the particular object, and thus, transmit the current location of the mobile device 1000 to the control unit 1600.
  • the communication unit 1400 may include a mobile communication unit 1410 and a short-range communication unit 1420.
  • the mobile device 1000 may establish communication with a peripheral device or a server (not shown) by using the mobile communication unit 1410 or the short-range communication unit 1420.
  • the mobile communication unit 1410 may enable the mobile device 1000 to communicate with an AP provided by a communication network service provider, and determine a current location of the mobile device 1000 by using location information of the AP.
  • the short-range communication unit 1420 may enable the mobile device 1000 to communicate or share content with a peripheral device by using a Beacon communication, an NFC, a ZigBee communication, an RFID communication, a UWB communication, or a Bluetooth communication.
  • the memory 1500 may store information about a particular location of the mobile device 1000 as a numerical value, or store information about a recognized object. As such, the memory 1500 may store information needed when the mobile device 1000 executes an application to provide a service.
  • the control unit 1600 controls all operations of the user input unit 1100, the output unit 1200, the camera unit 1300, the communication unit 1400, and the memory 1500, so that the mobile device 1000 may provide a preset service as the mobile device 1000 receives a user input of bending the mobile device 1000.
  • the control unit 1600 provides a service related to an object image captured by using the camera unit 1300, from among services of applications installed on the mobile device 1000. Additionally, the control unit 1600 may determine a location of the mobile device 1000 by using the captured object image, and provide a service related to an object included in the object image based on the determined location.
  • the control unit 1600 may recognize a particular object in the object image captured by the camera unit 1300, and provide information that is found by using at least one selected from the group consisting of text and an image included in the recognized particular object as an input value.
  • the control unit 1600 may translate the text included in the recognized particular object into a preset language and provide the translated text.
  • the control unit 1600 may determine a degree to which the mobile device 1000 is bent, by using the sensor unit 1110 included in the mobile device 1000.
  • the control unit 1600 may provide a service when the mobile device 1000 is bent in correspondence with a second threshold value that is less than the first threshold value, which differs from the service provided when the mobile device 1000 is bent in correspondence with the first threshold value. For example, if the control unit 1600 recognizes that the mobile device 1000 is bent in correspondence with the second threshold value, the mobile device 1000 may recognize location information of the mobile device 1000 and provide a service for searching for detailed information related to the recognized location information.
  • the control unit 1600 may recognize a peripheral device included in the object image captured by using the camera unit 1300, and obtain device information about the recognized peripheral device.
  • the control unit 1600 may establish communication between the mobile device 1000 and a peripheral device located in a periphery of the mobile device 1000, based on a user input of bending the mobile device 1000, so that the mobile device 1000 may share content stored in the memory 1500 of the mobile device 1000, or content stored in the peripheral device, with the peripheral device.
  • exemplary embodiments can also be implemented through computer readable code/instructions in/on a medium, e.g., a computer readable medium, to control at least one processing element to implement any above described exemplary embodiment.
  • the medium can correspond to any medium/media permitting the storage and/or transmission of the computer readable code.
  • the computer readable code can be recorded/transferred on a medium in a variety of ways, with examples of the medium including recording media, such as magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs, or DVDs), and transmission media such as Internet transmission media.
  • the medium may be such a defined and measurable structure including or carrying a signal or information, such as a device carrying a bitstream according to one or more exemplary embodiments.
  • the media may also be a distributed network, so that the computer readable code is stored/transferred and executed in a distributed fashion.
  • the processing element could include a processor or a computer processor, and processing elements may be distributed and/or included in a single device.
  • each component described in singular form may be executed in a distributed form.
  • components described in a distributed form may be executed in a combined form.

Abstract

Provided are a method of providing a service according to a user input of bending a mobile device, and a mobile device performing the method. The method includes: receiving a first user input of bending the mobile device in correspondence with a first threshold value; photographing an object in an actual space by using a camera included in the mobile device, as the first user input is received; displaying an object image of the photographed object on a screen of the mobile device; and providing a service related to the object image of the photographed object, from among services of an application installed on the mobile device.

Description

METHOD OF PROVIDING PRESET SERVICE BY BENDING MOBILE DEVICE ACCORDING TO USER INPUT OF BENDING MOBILE DEVICE AND MOBILE DEVICE PERFORMING THE SAME
One or more exemplary embodiments relate to a method of receiving a user input of bending a mobile device and providing a preset service according to the received user input, and a mobile device performing the same.
Recently, mobile devices have been configured to employ a touchscreen panel (TSP) so that a user may conveniently input necessary information to the mobile device. Thus, the user may input information to the mobile device by using the TSP more conveniently than by inputting information via a key input to a keypad.
Functions provided by mobile devices have become increasingly diverse. A process of searching an application list displayed on a TSP and selecting an application to be executed is necessary to execute various functions by using the mobile device. It may be inconvenient for a user of the mobile device to undergo such a process so as to execute functions of the mobile device. Accordingly, it may be necessary to intuitively execute functions of the mobile device according to various situations, such as a location, a space, or a relation with peripheral objects of the user of the mobile device.
One or more exemplary embodiments include a method of intuitively executing a function according to a situation of a mobile device, so as to execute a function provided by the mobile device. In other words, according to an exemplary embodiment, a function corresponding to a situation may be automatically executed and provided according to a user input of bending the mobile device.
One or more exemplary embodiments include a mobile device that may perform a function in correspondence with a situation according to a user input of bending the mobile device.
Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented exemplary embodiments.
These and/or other aspects will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings in which:
FIG. 1 illustrates a diagram showing a mobile device for photographing an actual space and displaying the actual space on a screen of the mobile device as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment;
FIG. 2 illustrates a flowchart showing a method of receiving a user input of bending the mobile device and providing a preset service, which are performed by the mobile device, according to an exemplary embodiment;
FIGS. 3A and 3B illustrate diagrams for explaining a process of receiving a user input of bending the mobile device so that the mobile device is bent, the receiving being performed by the mobile device, according to an exemplary embodiment;
FIG. 4 illustrates a flowchart of a method of providing a service on each area of a divided screen as the mobile device receives a user input of bending the mobile device, the method being performed by the mobile device according to an exemplary embodiment;
FIG. 5 illustrates a diagram showing an example of providing a service on each area of a divided screen as the mobile device receives a user input of bending the mobile device, the method being performed by the mobile device, according to an exemplary embodiment;
FIG. 6 illustrates a flowchart showing a method of determining a location of an actual space and providing a service by using information about the determined location as the mobile device receives a user input of bending the mobile device, the method being performed by the mobile device according to an exemplary embodiment;
FIG. 7 illustrates a diagram for explaining an example of determining a location of an actual space and providing a navigation service by using the determined location information as the mobile device receives a user input of bending the mobile device, the determining and the providing being performed by the mobile device, according to an exemplary embodiment;
FIG. 8 illustrates a flowchart of a method of storing location information of an actual space and providing information about a route to a location of which location information is stored, the method being performed by the mobile device, according to an exemplary embodiment;
FIGS. 9A and 9B illustrate diagrams for explaining an example of storing location information and providing information about a route to a stored location as the mobile device receives a user input of bending the mobile device, the storing and providing being performed by the mobile device according to an exemplary embodiment;
FIG. 10 illustrates a flowchart of a method of recognizing an object in an actual space and providing a service related to an object image of the recognized object as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment;
FIG. 11 illustrates a diagram for explaining an example of recognizing an object in an actual space and providing a search service related to an image of the recognized object as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment;
FIG. 12 illustrates a diagram for explaining an example of recognizing an object in an actual space and providing a text recognition and translation service related to an image of the recognized object as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment;
FIG. 13 illustrates a flowchart showing a method of recognizing a peripheral device and providing a service related to the recognized peripheral device as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment;
FIG. 14 illustrates a diagram for explaining an example of recognizing a peripheral device and providing a service for connecting to the recognized peripheral device as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment;
FIGS. 15 and 16 illustrate diagrams for explaining an example of recognizing a wearable device and providing a service related to the recognized wearable device as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment;
FIG. 17 illustrates a diagram for explaining an example of recognizing a peripheral device and sharing content with the peripheral device as the mobile device receives a user input of bending the mobile device, according to an exemplary embodiment;
FIG. 18 illustrates a flowchart of a method of providing a service from among services related to an application that is being executed as the mobile device receives user inputs of bending the mobile device at different angles from each other, according to an exemplary embodiment;
FIG. 19 illustrates a diagram for explaining an example in which the mobile device is bent, as the mobile device receives user inputs of bending the mobile device respectively in correspondence with a first threshold value and a second threshold value, which are different from each other;
FIGS. 20A and 20B illustrate diagrams for explaining an example of providing a service as the mobile device receives user inputs of bending the mobile device respectively in correspondence with a first threshold value and a second threshold value, which are different from each other;
FIG. 21 illustrates a flowchart of a method of providing a service as the mobile device receives user inputs of bending the mobile device respectively in correspondence with a first threshold value and a second threshold value, which are different from each other, according to an exemplary embodiment;
FIGS. 22A and 22B illustrate diagrams for explaining an example of providing a service as the mobile device receives user inputs of bending the mobile device respectively in correspondence with a first threshold value and a second threshold value, which are different from each other;
FIG. 23 illustrates a flowchart of a method of providing a service as the mobile device sequentially receives a user input of designating a particular location and a user input of bending the mobile device, according to an exemplary embodiment; and
FIG. 24 is a block diagram of the mobile device according to an exemplary embodiment.
According to one or more exemplary embodiments, a method of providing a preset service according to a user input of bending a mobile device, which is performed by the mobile device includes: receiving a first user input of bending the mobile device in correspondence with a first threshold value; photographing an object in an actual space by using a camera included in the mobile device, as the first user input is received; displaying an object image of the photographed object on a screen of the mobile device; and providing a service related to the object image of the photographed object, from among services of an application installed on the mobile device.
The method may further include determining a location of the mobile device by using the object image of the photographed object, wherein the providing of the service includes providing a service related to the object image, based on the determined location.
The application may be a navigation application, and the providing of the service may include providing route information in which the determined location is a departure location.
The application may be a navigation application, the method further including storing location information about the determined location, and if the mobile device is moved, the providing of the service may include providing route information from a current location of the mobile device to a location of the stored location information.
The method may further include recognizing an object in the object image of the photographed object, wherein the providing of the service includes providing information related to the recognized object.
The recognized object may include at least one selected from the group consisting of text and an image included in the object image, and the providing of the service may include providing information that is found by using the at least one selected from the group consisting of the text and the image as an input value.
The recognized object may include text in the object image, and the providing of the service may include translating the text into a preset language and providing the translated text.
The recognized object may be a peripheral device that is located near the mobile device, the method further including obtaining device information of the peripheral device by using the object image, and the providing of the service may include establishing communication between the mobile device and the peripheral device.
The providing of the service may include sharing content in the mobile device with the peripheral device or sharing content in the peripheral device with the mobile device.
The providing of the service may include mirroring at least a part of the screen of the mobile device on the peripheral device.
The displaying of the object image may include displaying the object image of the photographed object on a first area from among areas of the screen of the mobile device, wherein the areas of the screen of the mobile device are obtained when the screen of the mobile device is divided as the mobile device is bent.
The providing of the service may include displaying service information of the service on a second area from among the areas of the screen of the mobile device.
The method may further include: receiving a second user input of bending the mobile device in correspondence with a second threshold value before the first user input is received; and displaying detailed information regarding a location of the mobile device and an application that is being executed by the mobile device.
The method may further include: receiving a second user input of bending the mobile device in correspondence with a second threshold value before the first user input is received; and displaying additional information regarding an object included in the object image on a second area from among the areas of the screen of the mobile device.
The second threshold value may represent an angle having a smaller value than the first threshold value.
According to one or more exemplary embodiments, a mobile device configured to provide a preset service according to a user input includes: a user input unit configured to receive a first user input of bending the mobile device in correspondence with a first threshold value; a camera unit configured to photograph an object in an actual space, as the first user input is received; a display unit configured to display an object image of the photographed object on a screen; and a control unit configured to provide a service related to the object image of the photographed object, from among services of an application installed on the mobile device.
The control unit may determine a location of the mobile device, by using the object image of the photographed object, and provide a service related to the object image, based on the determined location.
The control unit may recognize an object in the object image of the photographed object and provide information that is found by using at least one selected from the group consisting of text and an image, included in the recognized object, as an input value.
The control unit may translate the text included in the photographed object into a preset language and provide the translated text.
The control unit may recognize a peripheral device included in the object image of the photographed object, and obtain device information of the recognized peripheral device.
The mobile device may further include a communication unit for establishing communication between the recognized peripheral device and the mobile device.
The control unit may share content in the mobile device or content in the peripheral device with the peripheral device.
The mobile device may further include an angle sensor configured to recognize an angle of a first threshold value or a second threshold value at which the mobile device is bent.
The control unit may provide a service of recognizing location information of the mobile device and searching for detailed information regarding the recognized location information, if the angle sensor senses that the mobile device is bent in correspondence with the second threshold value that is smaller than the first threshold value.
According to one or more exemplary embodiments, a non-transitory computer-readable recording medium has stored thereon a computer program which, when executed by a computer, performs the method.
This application claims the benefit of Korean Patent Application No. 10-2015-0004459, filed on January 12, 2015, in the Korean Intellectual Property Office, the disclosure of which is incorporated herein in its entirety by reference.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. In this regard, the present exemplary embodiments may have different forms and should not be construed as being limited to the descriptions set forth herein. Accordingly, the exemplary embodiments are merely described below, by referring to the figures, to explain aspects of the present description. In the description of the inventive concept, certain detailed explanations of the related art are omitted when it is deemed that they may unnecessarily obscure the essence of the inventive concept. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. Expressions such as "at least one of," when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
It will be understood that when an element is referred to as being "connected to" or "coupled to" another element, it may be "directly connected or coupled" to the other element, or "electrically connected to" the other element with intervening elements therebetween. It will be further understood that the terms "comprises", "comprising", "includes", and/or "including" when used herein, specify the presence of components, but do not preclude the presence or addition of one or more other components, unless otherwise specified.
An object image, described herein, may be an image, a video clip, or text which is obtained by photographing an actual space in which a mobile device is located by using a camera included in the mobile device, and displayed on a screen of the mobile device. The object image may include, for example, a user interface, a virtual reality interface, a result of executing a navigation application, a result of executing content, or a list of applications being executed, but is not limited thereto.
Hereinafter, the inventive concept will be described in detail by explaining embodiments of the inventive concept with reference to the attached drawings.
FIG. 1 illustrates a diagram showing a mobile device 1000 for photographing an actual space and displaying the actual space on a display screen of the mobile device 1000 as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
As shown in FIG. 1, according to an exemplary embodiment, as the mobile device 1000 receives a user input of bending the mobile device 1000, the whole body and the whole display screen of the mobile device 1000 may be bent. As the user input of bending the mobile device 1000 is received, the display screen of the mobile device 1000 may be divided into a first area 110 and a second area 120. The mobile device 1000 includes the display screen, and the display screen is divided into a plurality of areas that includes the first area 110 and the second area 120. The plurality of areas may respectively be bent, folded, or unfolded at a preset angle. Each area may be formed of a touchscreen. The display screen of the mobile device 1000 may include a touchscreen as an input unit for receiving an input of a signal, but is not limited to the touchscreen.
As the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a preset threshold value, the mobile device 1000 may provide a preset service. For example, as the mobile device 1000 receives a user input of bending the mobile device 1000, the mobile device 1000 may photograph an actual space 2000 by using a camera and display the actual space 2000 on the first area 110. Additionally, the mobile device 1000 may provide a service related to a location of the photographed actual space 2000, from among services that may be provided by applications installed and executed on the mobile device 1000. The provided service may be displayed on the second area 120 in the form of an application.
FIG. 2 illustrates a flowchart showing a method of receiving a user input of bending the mobile device 1000 and providing a preset service, which are performed by the mobile device 1000, according to an exemplary embodiment.
In operation S200, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value. The mobile device 1000 may include a display screen, and as the user input of bending the mobile device 1000 is received, the display screen of the mobile device 1000 may be divided into the first area 110 and the second area 120. However, the display screen is not completely split into the first area 110 and the second area 120. The first area 110 and the second area 120 are formed as one area and bent to be differentiated from each other. Since services respectively provided by applications displayed on the first area 110 and the second area 120 are different from each other, the first area 110 and the second area 120 may be recognized as separate areas of the display screen.
The user input of bending the mobile device 1000 may be made so that a body and the display screen of the mobile device 1000 are bent at a preset threshold angle from the ground. The first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from the ground. According to an exemplary embodiment, the first threshold value may be 60°.
The mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device 1000 is bent from the ground or bent when a user is holding the mobile device 1000. The angle sensor may include, for example, at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor. The angle sensor may determine a degree to which the mobile device is bent, by using the at least one selected from the group consisting of the terrestrial magnetic sensor, the gravity sensor, the gyro sensor, and the acceleration sensor.
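The threshold comparison described above can be sketched as follows. This is a minimal illustration only; the function name and the concrete second threshold value are assumptions, not part of the disclosed device.

```python
# Illustrative sketch of classifying a sensed bend angle against the
# preset threshold values; the 30-degree second threshold is assumed.

FIRST_THRESHOLD = 60.0   # degrees; the exemplary first threshold value
SECOND_THRESHOLD = 30.0  # degrees; an assumed smaller second threshold value

def classify_bend(angle_degrees):
    """Map a sensed bend angle to the user input it corresponds to."""
    if angle_degrees >= FIRST_THRESHOLD:
        return "first_user_input"   # triggers photographing and the service
    if angle_degrees >= SECOND_THRESHOLD:
        return "second_user_input"  # triggers additional/detailed information
    return "no_input"
```

For example, a fused sensor reading of 45° would be treated as the second user input, while 60° or more would be treated as the first user input.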
In operation S210, the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000. The actual space may refer to a surrounding space of the mobile device 1000. The camera may be equipped or installed in the mobile device 1000 so as to photograph an actual space toward which the eyesight of a user of the mobile device 1000 is directed.
The camera may include a camera sensor for photographing an object included in the actual space in a periphery of the mobile device 1000 and converting an object image of the photographed object into an electrical signal, and a signal processing unit for converting an analog image signal obtained by photographing the object by using the camera sensor into digital data. The camera sensor may be a complementary metal-oxide semiconductor image sensor (CMOS) or a charge-coupled device (CCD) sensor. The signal processing unit may be implemented as a digital signal processor (DSP). The camera sensor included in the camera may photograph an actual space in a periphery of the mobile device 1000, and the signal processing unit may obtain an object image of the actual space. According to an exemplary embodiment, the mobile device 1000 may recognize a space in which the mobile device 1000 is located, an object located in a periphery of the mobile device 1000, or a peripheral device located in the periphery of the mobile device 1000.
In operation S220, the object image of the photographed object may be displayed on the first area 110 of the screen of the mobile device 1000. The mobile device 1000 may receive in real time the object image of the object in the actual space which is photographed by the camera. The object image of the object in the actual space, received in real time by the mobile device 1000, may be displayed on the first area 110 of the screen in a form such as a video clip.
In operation S230, the mobile device 1000 provides a service related to the object image of the photographed object, from among services provided by applications installed on the mobile device 1000.
The mobile device 1000 may include at least one application that is already installed on the mobile device 1000. For example, the mobile device 1000 may include at least one selected from the group consisting of a web browsing application, a search application, a mobile shopping application, a navigation application, a text recognition application, a translation application, and a video playback application. The mobile device 1000 may be in a state of executing the at least one application selected from the group described above.
The mobile device 1000 may recognize the object image which is obtained by photographing the actual space and displayed on the first area 110, and thus, execute an application related to the object image from among the at least one application so as to provide a service. According to an exemplary embodiment, the mobile device 1000 may not only recognize the object image, but also obtain space and location information related to the object image by using a location sensor, and execute the related application based on the obtained space and location information so as to provide a service. Additionally, according to an exemplary embodiment, the mobile device 1000 may recognize a particular object in the object image and execute a search application for searching for detailed information about the recognized object in a web page so as to provide a search service. Additionally, according to an exemplary embodiment, the mobile device 1000 may recognize text in the object image and execute a text recognition application or a translation application so as to provide a detailed information search service or a translation service related to the text. A detailed description thereof is provided below.
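The selection of a related application can be pictured as a simple lookup from a recognized object category to a service, mirroring the examples above (text recognition/translation, object search, navigation). The category names and application identifiers here are assumptions for the sketch, not names used by the disclosure.

```python
# Illustrative mapping from a recognized object category to the
# application that provides the related service; falls back to a
# general search application for unrecognized categories.

SERVICE_BY_CATEGORY = {
    "text": "translation_application",
    "product": "mobile_shopping_application",
    "place": "navigation_application",
    "peripheral_device": "device_connection_service",
}

def select_service(category):
    """Return the application related to the recognized object category."""
    return SERVICE_BY_CATEGORY.get(category, "search_application")
```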
FIGS. 3A and 3B illustrate diagrams for explaining a process of receiving of a user input of bending the mobile device 1000 so that the mobile device 1000 is bent, the receiving being performed by the mobile device 1000, according to an exemplary embodiment.
Referring to FIG. 3A, the mobile device 1000 may include a display screen 100. The mobile device 1000 shown in FIG. 3A is in a state of not receiving a user input of bending the mobile device 1000, and thus, is not in a bent state but in a flat state. The display screen 100 may be formed of a touchscreen. The display screen 100 may output a screen from among a booting screen, a lock screen, a menu screen, a home screen, and an application activation screen. The display screen 100 may be formed of a flexible material that may be bent. For example, the display screen 100 may be formed of a flexible active matrix organic light-emitting diode (F-AMOLED) or a flexible liquid-crystal display (FLCD). A main body of the mobile device 1000, other than the display screen 100, may also be formed of a flexible material. The mobile device 1000 may include a frame 130 in which the display screen 100 is formed and which supports the display screen 100. The frame 130 may provide a space in which a printed circuit board (PCB) is mounted in addition to a space of the display screen 100, wherein the PCB includes a control unit, a storage unit, an audio input/output unit, and a communication unit which are described with reference to FIG. 24. The frame 130 may be formed of a sash material or a flexible plastic material that may be bent according to an intention of a designer.
Referring to FIG. 3B, the mobile device 1000 may receive a user input of bending the mobile device 1000, and thus, be bent at a preset angle. The mobile device 1000 may be bent so that a bent part of the mobile device 1000 forms a first threshold angle θ with an unbent part of the mobile device 1000, according to the user input. The first threshold angle θ may be a particular angle that is preset to 0°, 30°, 60°, 90°, or the like. According to an exemplary embodiment, the first threshold angle θ may be 60°.
As the mobile device 1000 receives the user input, the display screen 100 of the mobile device 1000 may be divided into the first area 110 and the second area 120. It may be understood that the first area 110 and the second area 120 are not separate from each other, but are bent as one area. According to an exemplary embodiment, a size of the first area 110 may correspond to 1/2 of a size of the second area 120. However, the sizes of the first and second areas 110 and 120 are not limited thereto.
If the mobile device 1000 receives a user input of bending the mobile device 1000 at the first threshold angle θ, the display screen 100, shown in FIG. 3A, may be divided into the first area 110 and the second area 120. As the display screen 100 is bent and divided into the two areas, the frame 130 may also be bent and divided into two areas. The frame 130 may be divided into a first frame 132 that surrounds and supports the first area 110 and a second frame 134 that surrounds and supports the second area 120. A first extension line 132l that is a line extending from the first frame 132 may form the first threshold angle θ with a second extension line 134l that is a line extending from the second frame 134.
As the mobile device 1000 receives the user input of bending the mobile device 1000, it may be convenient for a user to view the object image obtained by photographing the object in the actual space on the first area 110. In other words, according to an exemplary embodiment, the first threshold angle θ may be 60°. In this case, since the first area 110 is bent toward the user, the first area 110 may be located near an eye level of the user and be appropriate for a viewing angle of the user, so that it may be convenient for the user to view the object image on the first area 110. Additionally, according to an exemplary embodiment, the mobile device 1000 may provide a virtual reality navigation service on the first area 110. Since the first area 110 may be bent toward a user, the user may walk in an actual space with his/her eyesight directed forward instead of downward. Thus, it may be convenient for the user to view the actual space.
FIG. 4 illustrates a flowchart of a method of providing a service on each area of a divided display screen as the mobile device 1000 receives a user input of bending the mobile device 1000, the method being performed by the mobile device 1000 according to an exemplary embodiment.
In operation S400, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value. The first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the display screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000. According to an exemplary embodiment, the first threshold value may be 60°. The mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device is bent, by using at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
In operation S410, an area of a display screen of the mobile device 1000 is divided into the first area 110 shown in FIG. 3B and the second area 120 shown in FIG. 3B. It may be understood that the dividing of the area of the display screen into the first area 110 and the second area 120 refers to bending of the display screen as one area, instead of completely splitting the area of the display screen into the first area 110 and the second area 120. According to an exemplary embodiment, a size of the first area 110 may correspond to 1/2 of a size of the second area 120.
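The exemplary 1:2 proportion can be expressed as a short computation; since the first area is half the size of the second area, it occupies one third of the total screen height. The function name is illustrative only.

```python
def divide_screen(total_height):
    """Split a screen height into a first area and a second area in the
    exemplary proportion: the first area is 1/2 the size of the second
    area, i.e. 1/3 of the total height."""
    first_area = total_height / 3
    second_area = total_height - first_area
    return first_area, second_area
```

For a 900-pixel-tall screen this yields a 300-pixel first area and a 600-pixel second area.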
In operation S420, the mobile device 1000 photographs an object in an actual space by using a camera included in the mobile device 1000, and displays an object image of the photographed object. The camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed. The object image of the photographed object in the actual space may be transmitted to the mobile device 1000 and displayed on the first area 110 of the display screen of the mobile device 1000 in the form of a video clip in real time.
In operation S430, the mobile device 1000 provides a service of providing information related to the object image which is displayed on the first area 110, from among services provided by applications installed on the mobile device 1000. According to an exemplary embodiment, the mobile device 1000 may recognize a particular object in the object image displayed on the first area 110, and execute a search application for searching for detailed information about the recognized object in web pages so as to provide a search service. An application for providing the search service may be installed on or included in the mobile device 1000.
In operation S440, the mobile device 1000 displays information related to the recognized particular object in the object image on the second area 120. According to an exemplary embodiment, the search application for searching for detailed information about the particular object recognized in operation S430 may be executed, and an execution screen of the search application may be displayed on the second area 120. As such, the object image of the object in the actual space, which is photographed in operation S420, is displayed on the first area 110, and the search application for searching for the detailed information about the particular object recognized in the object image is displayed on the second area 120. Thus, a lot of information may be viewed at the same time by dividing the screen into the two parts.
FIG. 5 illustrates a diagram showing an example of the method of providing a service on each area of a divided screen as the mobile device 1000 receives a user input of bending the mobile device 1000, the method being performed by the mobile device 1000, as described with reference to FIG. 4, according to an exemplary embodiment.
Referring to FIG. 5, a user of the mobile device 1000 may photograph an object 2010 in an actual space according to a user input of bending the mobile device 1000. The photographed object 2010 in the actual space may be displayed on the first area 110 of a screen of the mobile device 1000. An execution screen of a search application for searching for detailed information about the object 2010 photographed and displayed on the first area 110 may be displayed on the second area 120.
According to an exemplary embodiment described with reference to FIG. 5, the user of the mobile device 1000 may move to a location in which the user may purchase shoes 2010, such as a department store, and perform a user input of bending the mobile device 1000 so as to photograph the shoes 2010. In this case, the object 2010 in the actual space may be the shoes 2010. An object image of the photographed shoes 2010 may be displayed on the first area 110 of the mobile device 1000 in real time. The mobile device 1000 may recognize the shoes 2010 in the object image of the photographed shoes 2010, and execute an application related to the shoes 2010 from among search applications. For example, the mobile device 1000 may select either a search application or a mobile shopping application as the application related to the shoes 2010. According to an exemplary embodiment, the mobile device 1000 may recognize that the user is currently located at the department store by recognizing location information of the mobile device 1000, and select and execute the mobile shopping application. An execution screen of the mobile shopping application may be displayed on the second area 120.
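The department-store example above combines the recognized object with the device's location when choosing between the two candidate applications. A hedged sketch of that decision, with assumed category and venue labels:

```python
def select_shopping_service(recognized_object, venue):
    """Pick between a general search application and a mobile shopping
    application, using the venue recognized from location information,
    as in the department-store example."""
    if recognized_object == "product" and venue == "department_store":
        return "mobile_shopping_application"
    return "search_application"
```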
FIG. 6 illustrates a flowchart showing a method of determining a location of an actual space and providing a service by using information about the determined location as the mobile device 1000 receives a user input of bending the mobile device 1000, the method being performed by the mobile device 1000, according to an exemplary embodiment.
In operation S600, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value. The first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000. According to an exemplary embodiment, the first threshold value may be 60°. The mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device is bent, by using at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
In operation S610, the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000. The camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed. An object image of the photographed object in the actual space may be transmitted to the mobile device 1000 in real time.
In operation S620, the mobile device 1000 determines a current location of the mobile device 1000 by using the object image of the object in the actual space which is photographed by using the camera. According to an exemplary embodiment, the mobile device 1000 may include an image sensor. The mobile device 1000 may recognize a particular object in the object image by sensing and analyzing the object image of the object by using the image sensor, collect location information related to the particular object, and thus, determine the current location of the mobile device 1000. Additionally, according to an exemplary embodiment, the mobile device 1000 may include at least one selected from the group consisting of a location information sensor, a global positioning system (GPS) sensor, and a positioning sensor, and determine a current location of the mobile device 1000 by combining location information obtained by using the at least one selected from the group consisting of the location information sensor, the GPS sensor, and the positioning sensor with space information obtained from the object image of the photographed object. According to another exemplary embodiment, the mobile device 1000 may include a communication unit, and determine a current location of the mobile device 1000 via communication with an access point (AP) of a communication network service provider.
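The combining step can be sketched as a tiny fusion function over coordinate pairs. The averaging rule below is purely illustrative; the disclosure does not specify how image-derived and sensor-derived locations are combined.

```python
def determine_location(image_location, sensor_location):
    """Combine a location inferred from the object image with a GPS or
    positioning-sensor fix, each given as (latitude, longitude).
    Averaging the two fixes is an assumed, illustrative fusion rule."""
    if image_location is None:
        return sensor_location
    if sensor_location is None:
        return image_location
    return tuple((a + b) / 2 for a, b in zip(image_location, sensor_location))
```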
In operation S630, the mobile device 1000 provides a service related to the location determined in operation S620, from among services of applications installed on the mobile device 1000. According to an exemplary embodiment, a map application, a route guide application, and a navigation application may be installed on the mobile device 1000, and the mobile device 1000 may execute the navigation application for providing route information in which the determined current location is a departure location. According to an exemplary embodiment, the navigation application may add an image of a virtual route guide to the object image of the object in the actual space which is photographed in operation S610, and display the image of the virtual route guide 200 on the first area 110 of the display screen which is shown in FIG. 1. A detailed description thereof is provided with reference to FIG. 7.
FIG. 7 illustrates a diagram for explaining an example of determining a location of an actual space and providing a navigation service by using the determined location information as the mobile device 1000 receives a user input of bending the mobile device 1000, the determining and the providing being performed by the mobile device 1000, according to an exemplary embodiment.
Referring to FIG. 7, as the mobile device 1000 receives a user input of bending the mobile device 1000, the mobile device 1000 photographs an actual space and displays an object image obtained by photographing the actual space. Then, the mobile device 1000 may determine a location of the mobile device 1000 by using the object image, and then, execute an application related to the determined location. According to an exemplary embodiment, as the mobile device 1000 receives a user input of bending the mobile device 1000, the mobile device 1000 may photograph an actual space in a periphery of the mobile device 1000, for example, a subway station 2020, and display an image of the subway station 2020 on the first area 110. Additionally, the mobile device 1000 may execute a search application by using the image of the subway station 2020, that is, an object image of the subway station 2020, or determine a name of the subway station 2020 near which the mobile device 1000 is located by activating a location sensor or a GPS sensor. For example, if a current location of the mobile device 1000 is determined as being in a periphery of Myeongdong station, the mobile device 1000 may execute a navigation application for guiding a user through a route from the periphery of Myeongdong station that is determined as a departure location to a destination preset by the user. According to an exemplary embodiment, a service provided by the navigation application may display the object image of the photographed subway station 2020 and a virtual route guide 200 for guiding the user to the preset destination on the first area 110 of the screen. In other words, according to an exemplary embodiment, the navigation application may be an application for providing a virtual reality service.
The service provided by the navigation application may photograph an actual moving route, and display the photographed actual moving route and the virtual route guide 200 for guiding the user to the preset destination on the first area 110 in real time.
FIG. 8 illustrates a flowchart of a method of storing location information of an actual space and providing information about a route to a location of which location information is stored, the method being performed by the mobile device 1000, according to an exemplary embodiment.
In operation S800, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value. The first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000. According to an exemplary embodiment, the first threshold value may be 60°. The mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device is bent, by using at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
In operation S810, the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000, and determines a current location of the mobile device 1000 by using an object image of the photographed object. The camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed. The object image of the photographed object may be transmitted to the mobile device 1000 in real time, and displayed on the first area 110 of the screen of the mobile device 1000 in the form of a streaming video in real time. The mobile device 1000 may determine a current location of the mobile device 1000 in correspondence with the object image of the photographed object, by using an image sensor, a location sensor, or a GPS sensor.
In operation S820, the mobile device 1000 stores an information value with respect to the location determined in operation S810. The mobile device 1000 may include at least one selected from the group consisting of a location information sensor, a GPS sensor, and a positioning sensor. The mobile device 1000 may determine an information value that includes a coordinate value of the current location of the mobile device 1000 by combining location information obtained by using the at least one selected from the group consisting of the location information sensor, the GPS sensor, and the positioning sensor with space information obtained from the object image, and store the information value as a location coordinate value. According to another exemplary embodiment, the mobile device 1000 may include a communication unit, and determine a current location of the mobile device 1000 via communication with an AP, which is disposed in the actual space and provided by a communication network service provider. Thus, the mobile device 1000 may store information about the determined current location as a numerical value.
In operation S830, as the mobile device 1000 moves, the mobile device 1000 provides a service for providing information about a route from a location to which the mobile device 1000 has moved to the location of which the information value is stored in operation S820. If a user of the mobile device 1000 has moved from a first location, that is, the location of which the information value is stored in operation S820, to a second location, that is, a current location, the mobile device 1000 may provide a navigation service for guiding the user through a route by recognizing the second location as the current location and designating the first location as a destination.
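The store-and-route-back flow of operations S820 and S830 may be sketched as follows. The class name, the label format, and the route-request structure are illustrative assumptions introduced only for explanation.

```python
# Minimal sketch of operations S820-S830: storing the first location
# (e.g. a parking spot) on a bend input and, after the device moves,
# building a route request whose destination is the stored location.
class LocationMemo:
    def __init__(self):
        self._stored = None

    def store(self, label, coordinate):
        """Store the determined current location as a named coordinate
        (operation S820)."""
        self._stored = (label, coordinate)

    def route_back_from(self, current_coordinate):
        """Build a navigation request from the current (second) location
        to the stored (first) location (operation S830)."""
        if self._stored is None:
            raise RuntimeError("no stored location")
        label, destination = self._stored
        return {"departure": current_coordinate,
                "destination": destination,
                "destination_label": label}
```

In the parking-lot example of FIGS. 9A and 9B, the stored label would be the recognized text of the parking spot, and the returned request would drive the virtual route guide.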
FIGS. 9A and 9B illustrate diagrams for explaining a navigation service for storing location information about a first location and, as the mobile device 1000 moves to a second location, providing information about a route from the second location to the first location, the storing and the providing being performed by the mobile device 1000 as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
Referring to FIG. 9A, as the mobile device 1000 receives a user input of bending the mobile device 1000, the mobile device 1000 may recognize a first location that is a surrounding space 2030, and store coordinate location information about the recognized first location. According to an exemplary embodiment, the first location that is the surrounding space 2030 may be the surrounding space 2030, named “B-24”, in the fourth basement floor of a parking lot where a vehicle of the user is parked. As the mobile device 1000 receives a user input of bending the mobile device 1000, the mobile device 1000 may photograph the surrounding space 2030 of B-24 in the fourth basement floor in which the vehicle of the user is parked, and display an object image of the photographed surrounding space 2030 on the first area 110. The mobile device 1000 may recognize the text, “B-24 in the fourth basement floor”, shown in the object image, or recognize and store location information of the surrounding space 2030 of B-24 in the fourth basement floor in the parking lot by using a GPS sensor or a positioning sensor. A notification screen showing that current location information, that is, information of “B-24 in the fourth basement floor” in the parking lot is stored may be displayed on the second area 120 of a screen of the mobile device 1000.
Referring to FIG. 9B, a user 10 holding and using the mobile device 1000 may move from the first location, that is, the surrounding space 2030 shown in FIG. 9A, to a second location. If the user 10 moves to the second location and performs a user input of bending the mobile device 1000, the mobile device 1000 may photograph a surrounding space of the second location by using a camera and display an image obtained by photographing the surrounding space on the first area 110. Additionally, the mobile device 1000 may determine a current location based on the object image of the surrounding space of the second location displayed on the first area 110, and display a virtual route guide 210 for providing route information, in which the current location is a departure location and the first location is a destination, on the first area 110.
According to an exemplary embodiment, after the user 10 finishes shopping at a department store, if the user 10 is to move to a first location that is the surrounding space 2030 in which a vehicle of the user 10 is parked, for example, “the surrounding space 2030 of B-24 in the fourth basement floor” in the parking lot, the mobile device 1000 may receive a user input of bending the mobile device 1000 and provide a route guide service for moving to the surrounding space 2030 of “B-24 in the fourth basement floor” by using the stored location information about “B-24 in the fourth basement floor”. The route guide service may be a virtual reality application for displaying the virtual route guide 210 on the first area 110 of the screen of the mobile device 1000. According to an exemplary embodiment, a map application may be executed on the second area 120 of the screen of the mobile device 1000, and the map application may display a current location of the user 10. The map application may determine a location of the mobile device 1000 by using the GPS sensor included in the mobile device 1000.
FIG. 10 illustrates a flowchart of a method of recognizing an object image in an actual space and providing a service related to the recognized object image as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
In operation S1000, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value. The first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000. According to an exemplary embodiment, the first threshold value may be 60°. The mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device is bent, by using at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
In operation S1010, the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000. The camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed. An object image of the photographed object may be transmitted to the mobile device 1000 in real time.
In operation S1020, the mobile device 1000 recognizes a particular object in the object image of the actual space which is photographed by the camera. The mobile device 1000 may include an image sensor, and recognize the particular object in the object image by sensing and analyzing the object image by using the image sensor.
In operation S1030, the mobile device 1000 provides a service for providing information related to the recognized object image. According to an exemplary embodiment, the mobile device 1000 may execute an application related to the recognized particular object, from among at least one application installed on the mobile device 1000, and thus, display detailed information about the particular object on the second area 120 of the mobile device 1000. According to an exemplary embodiment, a search application for providing a service for searching for the detailed information about the particular object, recognized in operation S1020, may be executed, and an execution screen of the search application may be displayed on the second area 120.
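The dispatch from a recognized object to a related service in operation S1030 may be illustrated as follows. The object classes and the service table are assumptions chosen to mirror the examples in FIGS. 11, 12, and 14; they are not part of the disclosed embodiment.

```python
# Hedged sketch of operation S1030: selecting which installed
# application's service to execute for a recognized object class.
SERVICE_TABLE = {
    "product": "search",      # e.g. a tumbler -> web search for details
    "text": "translation",    # printed text -> translation service
    "device": "connection",   # a TV or phone -> communication service
}


def service_for(recognized_class, default="search"):
    """Return the name of the service to execute on the second area 120
    of the screen for the given recognized object class."""
    return SERVICE_TABLE.get(recognized_class, default)
```

Under this sketch, recognizing a product would launch the search application, recognizing text would launch the translation service, and an unknown class would fall back to a default search.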
FIG. 11 illustrates a diagram for explaining an example of recognizing an object image in an actual space and providing a search service related to the recognized object image as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
Referring to FIG. 11, as the mobile device 1000 receives the user input of bending the mobile device 1000, the mobile device 1000 may display an object image obtained by photographing an actual space on the first area 110, recognize a particular object in the object image of the photographed actual space, and thus, provide a search service for searching for detailed information about the particular object. According to an exemplary embodiment, as the mobile device 1000 receives a user input of bending the mobile device 1000, the mobile device 1000 may photograph an actual space that includes a tumbler 2040, and display an image of the actual space on the first area 110 of the screen. Additionally, the mobile device 1000 may recognize the tumbler 2040 that is a particular object in the photographed actual space. The mobile device 1000 may include an image sensor, and the image sensor may recognize a shape of the tumbler 2040 by analyzing an object image of the actual space and transmit object information about the recognized shape of the tumbler 2040 to the mobile device 1000. The mobile device 1000 may execute an application for searching for detailed information about the tumbler 2040 that is the particular object. The search application may be an application through which a “tumbler” is searched for in web pages and detailed information thereof is obtained. An execution screen of the search application may be displayed on the second area 120.
FIG. 12 illustrates a diagram for explaining an example of recognizing an object image of an actual space and providing a text recognition and translation service related to the recognized object image as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
Referring to FIG. 12, as the mobile device 1000 receives the user input of bending the mobile device 1000, the mobile device 1000 may display an object image obtained by photographing an actual space on the first area 110, recognize text in the object image of the photographed actual space, and thus, provide a search service or a translation service related to the recognized text. According to an exemplary embodiment, as the mobile device 1000 receives a user input of bending the mobile device 1000, the mobile device 1000 may photograph an object 2050 in an actual space which includes text “Se-jong Dae Wang”, and display the object 2050 on the first area 110. Additionally, the mobile device 1000 may recognize the text “Se-jong Dae Wang” in an object image of the object 2050 displayed on the first area 110. The mobile device 1000 may include an image sensor, and the image sensor may recognize the text “Se-jong Dae Wang” by analyzing the object image of the object 2050, and transmit information about the recognized text to the mobile device 1000. The mobile device 1000 may provide a service of translating the text “Se-jong Dae Wang” into a language preset by a user. For example, the mobile device 1000 may provide a service of translating the text “Se-jong Dae Wang” into English “King Se-jong”. A translation result 220 obtained by translating the text “Se-jong Dae Wang” into the language preset by the user may be converted into a virtual image and displayed on the first area 110 of the screen. In other words, the translation result 220 may be displayed on the first area 110 to overlap with the object 2050 in the actual space. Additionally, according to an exemplary embodiment, the mobile device 1000 may provide a search service by executing an application for searching for information about the translation result 220.
A screen of the application for performing a detailed search for information about the translation result 220 may be displayed on the second area 120.
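The recognize-and-overlay translation service of FIG. 12 may be sketched as follows. The translation table is a stand-in for a real OCR and translation pipeline, and the overlay structure is a hypothetical data shape introduced for illustration.

```python
# Illustrative sketch of the translation service in FIG. 12: translate
# recognized text into the user's preset language and produce a virtual
# overlay for the first area of the screen.
TRANSLATIONS = {
    # (source text, target language) -> translated text; one known entry
    ("Se-jong Dae Wang", "en"): "King Se-jong",
}


def translate_overlay(recognized_text, target_language="en"):
    """Return the overlay text to draw over the photographed object,
    falling back to the original text when no translation is known."""
    translated = TRANSLATIONS.get((recognized_text, target_language),
                                  recognized_text)
    return {"overlay": translated, "source": recognized_text}
```

The returned overlay text corresponds to the translation result 220, which the device would render over the object 2050 on the first area 110.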
FIG. 13 illustrates a flowchart showing a method of recognizing a peripheral device and providing a service related to the recognized peripheral device, as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
In operation S1300, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value. The first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000. According to an exemplary embodiment, the first threshold value may be 60°. The mobile device 1000 may include an angle sensor. The angle sensor may determine an angle at which the mobile device is bent, by using at least one selected from the group consisting of a terrestrial magnetic sensor, a gravity sensor, a gyro sensor, and an acceleration sensor.
In operation S1310, the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000. The camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed. An object image of the photographed object may be transmitted to the mobile device 1000 in real time.
In operation S1320, the mobile device 1000 recognizes a peripheral device in the object image of the object in the actual space which is photographed by the camera. The mobile device 1000 may include an image sensor, and recognize the peripheral device in the object image by sensing and analyzing the object image of the photographed object by using the image sensor.
In operation S1330, the mobile device 1000 obtains device information of the recognized peripheral device. The mobile device 1000 may obtain the device information that includes a service set identifier (SSID), a device model number, or the like of the peripheral device recognized in operation S1320.
In operation S1340, the mobile device 1000 provides a service related to the recognized peripheral device by using the obtained device information. According to an exemplary embodiment, the mobile device 1000 may determine whether the mobile device 1000 may communicate with the recognized peripheral device, and establish communication with the recognized peripheral device. According to an exemplary embodiment, the mobile device 1000 may share content such as a photograph, music, or a video clip with the recognized peripheral device.
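The use of the obtained device information in operations S1330 and S1340 may be illustrated as follows. The data shapes, the connectability rule, and the share-action format are assumptions for explanation; no real wireless API is used here.

```python
# Sketch of operations S1330-S1340: the device information (SSID, device
# model number) obtained from the recognized peripheral drives the
# decision to establish communication and share content.
def can_connect(device_info, known_models):
    """A peripheral is treated as connectable when it advertises an SSID
    and its model number is one the mobile device supports."""
    return bool(device_info.get("ssid")) and \
        device_info.get("model") in known_models


def share_content(device_info, content, known_models):
    """Return a share action for a connectable peripheral, or None."""
    if not can_connect(device_info, known_models):
        return None
    return {"target": device_info["ssid"], "content": content}
```

In the example of FIG. 14, the recognized TV's device information would satisfy the connectability check, and the share action would correspond to mirroring the video content being played.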
FIG. 14 illustrates a diagram for explaining an example of recognizing a peripheral device 3000 and providing a service for connecting to the recognized peripheral device 3000 as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
Referring to FIG. 14, as the mobile device 1000 receives the user input of bending the mobile device 1000, the mobile device 1000 may photograph an object 3000 in an actual space by using a camera, and display an object image of the photographed object 3000 on the first area 110 of a screen of the mobile device 1000. The mobile device 1000 may recognize a peripheral device 3000 that is included in the object image of the photographed object 3000. The peripheral device 3000 that may be recognized may be, for example, a TV, a tablet PC, a refrigerator, or a gas stove. According to an exemplary embodiment, the mobile device 1000 may recognize the TV 3000 included in the object image. The mobile device 1000 may establish communication with the TV 3000 recognized in the object image, and thus, be connected to the TV 3000 via a communication network. The mobile device 1000 may include a communication unit, and establish communication with the peripheral device 3000 and thus, be connected to the peripheral device 3000.
According to an exemplary embodiment, the mobile device 1000 may execute an application through which a video clip is viewed, on the second area 120 of the screen. The video playback application may include a video playback screen 132 and a video information providing screen 124 that provides information about a video clip that is being played. If the video playback application is being executed, the mobile device 1000 may establish communication with the TV 3000 and receive a separate user input, and thus, share video content with the TV 3000. In other words, if the mobile device 1000 receives a separate user input, for example, a swipe input, the mobile device 1000 may share video content that is being played on the video playback screen 132 with the TV 3000 that has established a communication with the mobile device 1000. The shared video content, that is, the video content that is being played on the video playback screen 132 of the mobile device 1000 may be mirrored, displayed, and played on the TV 3000.
FIGS. 15 and 16 illustrate diagrams for explaining an example of recognizing a wearable device 3010 and providing a service related to the recognized wearable device as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
Referring to FIG. 15, as the mobile device 1000 receives the user input of bending the mobile device 1000, the mobile device 1000 may obtain device information of the wearable device 3010 located near the mobile device 1000 and establish communication with the wearable device 3010 to be connected to the wearable device 3010.
The mobile device 1000 may recognize the wearable device 3010 by obtaining the device information that includes an SSID, a device model number, or the like of the wearable device 3010. The wearable device 3010 may be glasses that have a communication function and a data processing function. The wearable device 3010 may be worn by a user, and photograph an object in an actual space by using a camera that is directed toward the front of the user.
The mobile device 1000 may display a list of services, provided by an application that may be executed by the wearable device 3010, on the second area 120. The user may select a service from the list of the services displayed on the second area 120, and thus, be provided with the service of using the wearable device 3010.
Referring to FIG. 16, as the mobile device 1000 connected to the wearable device 3010 receives the user input of bending the mobile device 1000, the mobile device 1000 may transmit an object image of an actual space 2060, which is located in a periphery of the mobile device 1000 and photographed by a camera, to the wearable device 3010. The wearable device 3010 may display virtual signs 2072 and 2074 on the transmitted object image of the actual space 2060, and provide a service related to the actual space 2060. The wearable device 3010 may display a virtual image 2070 which is obtained by adding the virtual signs 2072 and 2074 to the transmitted object image of the actual space 2060. According to an exemplary embodiment, the wearable device 3010 may provide a navigation service, and the navigation service may be a service of displaying the virtual signs 2072 and 2074 for guiding a user to move to a place preset by the user on a display unit 3012.
FIG. 17 illustrates a diagram for explaining an example of recognizing a peripheral device 3020 and sharing content with the peripheral device 3020 as the mobile device 1000 receives a user input of bending the mobile device 1000, according to an exemplary embodiment.
Referring to FIG. 17, as the mobile device 1000 receives the user input of bending the mobile device 1000, the mobile device 1000 may recognize the peripheral device 3020 in an object image of an object photographed by a camera, and share content stored in the mobile device 1000 with the peripheral device 3020. The mobile device 1000 may include an image sensor, and recognize the peripheral device 3020 in the object image by sensing and analyzing the object image of the photographed object by using the image sensor. Thus, the mobile device 1000 may obtain device identification information of the peripheral device 3020. The obtained device information may include an SSID, a device model number, or the like of the peripheral device 3020. The mobile device 1000 may establish communication with the peripheral device 3020 based on the obtained identification information of the peripheral device 3020, and display a user interface for determining whether to share content with the peripheral device 3020 on the second area 120.
According to an exemplary embodiment, as the mobile device 1000 receives the user input of bending the mobile device 1000, the mobile device 1000 may be connected to the peripheral device 3020, for example, a mobile phone 3020 via a short-range wireless communication. The short-range wireless communication via which the mobile device 1000 is connected to the mobile phone 3020 may include a Beacon communication, near-field communication (NFC), a ZigBee communication, a radio frequency identification (RFID) communication, an ultra-wide band (UWB) communication, or a Bluetooth communication, but is not limited thereto. The mobile device 1000 may obtain an SSID of the mobile phone 3020 or user certification information of the mobile phone 3020 (for example, communication network subscription information, a user identification (ID), or the like), and determine whether to share content with the mobile phone 3020 based on the obtained user certification information. The mobile device 1000 may display contents to be shared with the mobile phone 3020, on the second area 120 of the screen. A user may select and share content that the user wants to share with the mobile phone 3020, from among the contents displayed on the second area 120 of the mobile device 1000.
FIG. 18 illustrates a flowchart of a method of providing a service from among services related to an application that is being executed as the mobile device 1000 receives user inputs of bending the mobile device 1000 at different angles from each other, according to an exemplary embodiment.
In operation S1800, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value. The first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
According to an exemplary embodiment, the first threshold value may be 60°.
In operation S1810, the mobile device 1000 displays detailed information regarding a location of the mobile device 1000 and the application that is being executed on the first area 110 of a screen of the mobile device 1000 which is shown in FIG. 1.
According to an exemplary embodiment, the mobile device 1000 may include at least one selected from the group consisting of a location information sensor, a GPS sensor, and a positioning sensor, and determine information about a current location of the mobile device 1000 by using the at least one selected from the group consisting of the location information sensor, the GPS sensor, and the positioning sensor. According to another exemplary embodiment, the mobile device 1000 may include a communication unit, and determine a current location of the mobile device 1000 via communication with an AP of a communication network service provider which is disposed in an actual space.
At least one application may be installed on the mobile device 1000, and the mobile device 1000 may be executing the installed at least one application. The mobile device 1000 may execute an application related to current location information about the mobile device 1000, from among the at least one application that is being executed, and thus, provide a service related to the current location information. According to an exemplary embodiment, if the mobile device 1000 is located in a subway station or a street, the related service may be a navigation service in which an application for searching for location information, that is, a map application is executed. If the mobile device 1000 is located in a shopping place such as a department store, the related service may be a search service in which an application for searching for product information or a product cost is executed.
In operation S1820, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a second threshold value. The second threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000, and may differ from the first threshold value.
According to an exemplary embodiment, the second threshold value for bending the mobile device 1000 may be a value corresponding to a user input for bending the mobile device at a greater angle than the first threshold value for bending the mobile device 1000 in operation S1800. For example, the first threshold value may be 30°, and the second threshold value may be 60°.
In operation S1820, the user input of bending the mobile device 1000 in correspondence with the second threshold value may be received successively within a time interval after the user input of bending the mobile device 1000 in correspondence with the first threshold value. In other words, according to an exemplary embodiment, the mobile device 1000 is bent in correspondence with the first threshold value in operation S1800, and then, successively bent in correspondence with the second threshold value in operation S1820 without being unbent and becoming flat. However, the bending of the mobile device 1000 is not limited thereto. The mobile device 1000 may be bent in correspondence with the first threshold value in operation S1800, unbent, and then, bent in correspondence with the second threshold value in operation S1820.
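The successive-threshold behavior of operations S1800 through S1820 may be sketched as a small state machine. The stage names and the specific threshold values follow the 30°/60° example given in the text; the function itself is an illustrative assumption.

```python
# Hypothetical state machine for the successive bend inputs of FIG. 18:
# crossing the first threshold triggers the first service (location and
# application details), and crossing the second threshold (while bent)
# triggers the camera-based service.
FIRST_DEG, SECOND_DEG = 30, 60  # per the example in the text


def bend_stage(angle_history):
    """Given the sequence of measured bend angles, return the stage the
    device ends in: 'flat', 'first', or 'second'."""
    stage = "flat"
    for angle in angle_history:
        if angle >= SECOND_DEG:
            stage = "second"
        elif angle >= FIRST_DEG:
            # between the thresholds the device stays in the first stage
            stage = "first"
        else:
            stage = "flat"
    return stage
```

For instance, bending past 30° and then past 60° without flattening ends in the second stage, while unbending below 30° returns the device to the flat stage.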
In operation S1830, the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000. The camera may photograph the actual space toward which the eyesight of a user of the mobile device 1000 is directed. An object image of the photographed object may be transmitted to the mobile device 1000 in real time.
In operation S1840, the mobile device 1000 provides a service related to the object image of the photographed object, from among services of the application that is being executed. According to an exemplary embodiment, the mobile device 1000 may obtain current location information about the mobile device 1000 from the object image of the photographed object. The mobile device 1000 may execute the service related to the location information which is executed in operation S1810, based on the obtained location information, so as to provide the service related to the location information. For example, the service related to the location information may be a service for guiding a user through a route by executing a navigation application or a map application. According to an exemplary embodiment, the service related to the location information may be a virtual reality service that combines a virtual image with the object image and thus displays the virtual image on the first area 110 of the mobile device 1000.
FIG. 19 illustrates a diagram for explaining an example in which the mobile device 1000 is bent, as the mobile device 1000 receives user inputs of bending the mobile device 1000 respectively in correspondence with a first threshold value α and a second threshold value β, which are different from each other.
Referring to FIG. 19, the mobile device 1000 may include a display area that is divided into a plurality of parts including the first area 110 and the second area 120. The respective areas included in the display area may contact each other, and may be bent or unbent, or folded or unfolded at a preset angle. Each area may be formed of a touchscreen.
As the display screen is divided into the first area 110 and the second area 120, a frame that supports the display screen may also be bent and divided into two parts. The frame may be divided into a first frame 132 that surrounds and supports the first area 110 and a second frame 134 that surrounds and supports the second area 120. An angle between the first frame 132 and the second frame 134 may vary with the degree to which the mobile device 1000 is bent. If the second frame 134 is bent to a location 134-1 in correspondence with the first threshold value α, the first area 110 and the second area 120 may also be bent in correspondence with the first threshold value α. Likewise, if the second frame 134 is bent to a location 134-2 in correspondence with the second threshold value β, the first area 110 and the second area 120 may also be bent in correspondence with the second threshold value β. According to an exemplary embodiment, the first threshold value α may be 30°, and the second threshold value β may be 60°.
FIGS. 20A and 20B illustrate diagrams for explaining an example of providing a service as the mobile device 1000 receives user inputs of bending the mobile device 1000 respectively in correspondence with a first threshold value α and a second threshold value β, which are different from each other.
Referring to FIG. 20A, as the mobile device 1000 receives a user input of bending the mobile device 1000 at a first threshold angle, the mobile device 1000 may provide a service of determining a current location of the mobile device 1000, searching for detailed information related to the determined location, and displaying the detailed information on the first area 110 of the screen. According to an exemplary embodiment, if the mobile device 1000 is located at an entrance 2080 of an airport and receives a user input of bending the mobile device 1000 at an angle of 30°, the mobile device 1000 may determine a location of the entrance 2080 of the airport and store a value of the location information. The mobile device 1000 may determine its current location by determining the location information of the entrance 2080 of the airport by using a GPS sensor or a location sensor. Additionally, according to an exemplary embodiment, the mobile device 1000 may determine its current location based on access point (AP) usage information provided by a communication network provider, which is obtained by a communication unit included in the mobile device 1000. If the mobile device 1000 is executing, for example, an airport information application, the mobile device 1000 may display detailed information about the current location, that is, the entrance 2080 of the airport, on the first area 110. The detailed information may include a name of the airport, a name of an air flight departing from the airport, or information about a departure time of the air flight. Such information may be found on web pages.
Referring to FIG. 20B, as the mobile device 1000 receives a user input of bending the mobile device 1000 at a second threshold angle, the mobile device 1000 may photograph an object in an actual space by using a camera, and provide a service related to an object image of the photographed object from among services that may be provided by an application that is being executed. According to an exemplary embodiment, as the mobile device 1000 receives a user input of bending the mobile device 1000 at an angle of 60°, the mobile device 1000 may obtain an object image by photographing the entrance 2080 of the airport, and display the object image on the first area 110. The mobile device 1000 may provide a service related to the object image of the entrance 2080 of the airport which is displayed on the first area 110, from among services that may be provided by the airport information application described with reference to FIG. 20A. The mobile device 1000 may provide a navigation service of setting a place preset by a user, for example, a place for check-in before flight departure as a destination. The navigation service may be a virtual reality service for guiding a user by displaying a virtual guide sign 230 on the object image of the entrance 2080 of the airport which is displayed on the first area 110.
According to an exemplary embodiment, the mobile device 1000 may display a map application for displaying a current location of the mobile device 1000 on the second area 120 of the screen.
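The two-threshold behavior of FIGS. 20A and 20B may be sketched as follows, where a 30° bend triggers the location-information service and a 60° bend triggers the camera-based service. The dispatch logic and the service names are assumptions for illustration only.

```python
# Illustrative dispatch of the two bend thresholds in FIGS. 20A-20B.
# At the first threshold (30 degrees), the device looks up details about
# its current location; at the second threshold (60 degrees), it
# photographs the actual space and provides a camera-based (AR) service.
# Names and exact threshold handling are assumptions for this sketch.

def on_bend(angle):
    if angle >= 60:        # second threshold: photograph and guide via AR
        return "camera_service"
    elif angle >= 30:      # first threshold: display current-location details
        return "location_service"
    return "no_service"

print(on_bend(30))  # location_service
print(on_bend(60))  # camera_service
```

A real device would receive `angle` from its bend sensor rather than as a function argument, but the mapping from threshold to service is the same.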
FIG. 21 illustrates a flowchart of a method of providing a service as the mobile device 1000 receives user inputs of bending the mobile device 1000 respectively in correspondence with a first threshold value α and a second threshold value β, which are different from each other.
In operation S2100, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with the first threshold value α. The first threshold value α may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000.
According to an exemplary embodiment, the first threshold value may be 60°.
In operation S2110, the mobile device 1000 photographs an object in an actual space, by using a camera included in the mobile device 1000. The camera may photograph the actual space toward which the line of sight of a user of the mobile device 1000 is directed. An object image of the photographed object may be transmitted to the mobile device 1000 in real time.
In operation S2120, the mobile device 1000 provides a service related to the object image of the photographed object, from among services of the application that is being executed. The mobile device 1000 may recognize a particular object in the object image of the actual space photographed by using the camera in operation S2110, and execute an application related to the recognized particular object.
In operation S2130, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with the second threshold value β. The second threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000. According to an exemplary embodiment, the second threshold value may be 30°. In other words, the mobile device 1000 may receive a user input of bending the mobile device at 30° sequentially after the mobile device 1000 is bent at 60° in operation S2100.
In operation S2140, the mobile device 1000 displays additional information related to a particular object included in the object image on the second area 120 of the screen. The mobile device 1000 may include an image sensor, and recognize the particular object in the object image by sensing and analyzing the object image of the photographed object by using the image sensor. The mobile device 1000 may execute an application related to the recognized particular object from among at least one application installed on the mobile device 1000, and thus, display detailed information about the particular object on the second area 120 of the mobile device 1000.
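Operation S2140, in which a recognized object is mapped to a related installed application whose details are shown on the second area 120, may be sketched as follows. The recognizer stand-in, the application registry, and all names here are assumptions for illustration, not the disclosed implementation.

```python
# Hypothetical sketch of operation S2140: the particular object recognized
# in the camera image is mapped to a related installed application, and the
# application's detail view is directed to the second display area.

RELATED_APPS = {
    "running_shoes": "mobile_shopping_app",
    "airport_entrance": "airport_information_app",
}

def recognize_object(object_image):
    # Stand-in for image-sensor analysis; a real device would run a
    # classifier over the captured frame to label the object.
    return object_image.get("label")

def show_additional_info(object_image, installed_apps):
    """Return what the second area would display for the recognized object."""
    label = recognize_object(object_image)
    app = RELATED_APPS.get(label)
    if app in installed_apps:
        return f"second_area: {app} details for {label}"
    return "second_area: no related application"

frame = {"label": "running_shoes"}
print(show_additional_info(frame, {"mobile_shopping_app", "map_app"}))
```

The registry plays the role of "an application related to the recognized particular object from among at least one application installed on the mobile device 1000".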
FIGS. 22A and 22B illustrate diagrams for explaining an example of providing a service as the mobile device 1000 receives user inputs of bending the mobile device 1000 respectively in correspondence with a first threshold value α and a second threshold value β, which are different from each other, the providing being performed by the mobile device 1000 according to an exemplary embodiment.
Referring to FIG. 22A, as a user moves to a location where the user may buy running shoes such as a department store or a shoe store and the mobile device 1000 receives a user input of bending the mobile device 1000 at an angle of 60°, the mobile device 1000 may photograph the running shoes 2090. The photographed running shoes 2090 may be displayed on the first area 110 of the mobile device 1000 in real time. The mobile device 1000 may recognize the running shoes 2090 in an object image of the photographed running shoes 2090, execute an application related to the running shoes 2090 from among search applications installed on the mobile device 1000, and thus, display the application on the second area 120 of the screen. For example, the mobile device 1000 may display either a search application or a mobile shopping application as an application related to the running shoes 2090 on the second area 120 of the screen.
Referring to FIG. 22B, when the mobile device 1000 is bent at 60° and then receives a user input of bending the mobile device 1000 at 30°, the mobile device 1000 may recognize the running shoes 2090, that is, an object located in the actual space, and thus execute a search application for searching for detailed information about the running shoes 2090. The mobile device 1000 may include an image sensor, and may recognize a type, a manufacturer, or the like of the running shoes 2090 by sensing and analyzing an object image of the photographed running shoes 2090 by using the image sensor. The search application may be an application for searching for and displaying information found on web pages, wherein the information includes a size, a stock quantity, a cost, a discount, and evaluations by consumers with respect to the running shoes 2090. An execution screen of the search application may be displayed on the second area 120.
FIG. 23 illustrates a flowchart of a method of providing a service as the mobile device 1000 sequentially receives a user input of designating a particular location and a user input of bending the mobile device 1000, according to an exemplary embodiment.
In operation S2300, the mobile device 1000 receives a user input of designating a particular location, and stores an information value of the particular location. The mobile device 1000 may include a location sensor, a GPS sensor, or a positioning sensor, and store the information value of the particular location according to the user input as a numerical value.
In operation S2310, the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with a first threshold value. The first threshold value may represent a particular angle such as 0°, 30°, 60°, 90°, or the like at which the screen of the mobile device 1000 is preset to be bent from a supporting surface of the mobile device 1000. According to an exemplary embodiment, the first threshold value may be 60°.
In operation S2320, the mobile device 1000 determines whether the mobile device 1000 is placed in the particular location. According to an exemplary embodiment, the mobile device 1000 may compare the particular location that is stored according to the user input received in operation S2300 to a coordinate value of a current location of the mobile device 1000, and determine whether the current location is identical to the stored particular location.
In operation S2330, if it is determined that the current location of the mobile device 1000 is identical to the particular location received in operation S2300, the mobile device 1000 uploads particular data designated by a user to a server, or downloads particular data from the server. If the mobile device 1000 sequentially receives a user input of designating a particular location, for example, location information of a lecture room, and a user input of bending the mobile device 1000 in correspondence with the first threshold value, the mobile device 1000 may upload prestored information about homework to the server. Alternatively, if the mobile device 1000 receives a user input of bending the mobile device 1000 in correspondence with the first threshold value and recognizes that the particular location is, for example, a lecture room, the mobile device 1000 may download the information about homework stored in the server.
In operation S2340, if it is determined that the current location of the mobile device 1000 is not identical to the particular location received in operation S2300, the mobile device 1000 may receive a user input of unbending the mobile device 1000.
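The location comparison and conditional upload of operations S2300 through S2340 may be sketched as follows. The coordinate tolerance, the server stand-in, and the function names are assumptions for illustration.

```python
# Hypothetical sketch of operations S2300-S2340: a designated location is
# stored as a coordinate value; when a bend input arrives, the current
# coordinates are compared against it, and designated data is uploaded to
# a server on a match. On a mismatch, the device awaits an unbending input.

import math

def within(loc_a, loc_b, tol_deg=0.001):
    """Treat two lat/lon pairs as identical within a small tolerance."""
    return math.dist(loc_a, loc_b) <= tol_deg

def on_bend_at_location(stored, current, data, server):
    if within(stored, current):
        server.append(data)      # S2330: upload the designated data
        return "uploaded"
    return "await_unbend"        # S2340: wait for an unbending input

lecture_room = (37.4563, 126.7052)   # example stored coordinate value
server = []
print(on_bend_at_location(lecture_room, (37.4563, 126.7052), "hw1", server))
# uploaded
```

A tolerance is used because two sensed coordinate readings for the same place rarely match exactly; the patent text's "identical" comparison would need some such margin in practice.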
FIG. 24 is a block diagram of the mobile device 1000 according to an exemplary embodiment.
Referring to FIG. 24, according to an exemplary embodiment, the mobile device 1000 may include a user input unit 1100, an output unit 1200, a camera unit 1300, a communication unit 1400, a memory 1500, and a control unit 1600.
The user input unit 1100 receives a user input. The user input unit 1100 may receive a user input of bending the mobile device 1000. The user input unit 1100 may include a sensor unit 1110 and a sound input unit 1120. The user input unit 1100 may detect a user input by using the sensor unit 1110 included in the mobile device 1000. The sensor unit 1110 may include, for example, a vibration sensor, an angle sensor, a gyro sensor, an acceleration sensor, a pressure sensor, a gravity sensor, or a touch sensor. The sensor unit 1110 may calculate an angle at which the frame 130 of the mobile device 1000 shown in FIG. 3 and the display screen 100 shown in FIG. 3 are bent, and transmit a value of the calculated angle to the control unit 1600. The sensor unit 1110 may determine whether the mobile device 1000 is bent in correspondence with a first threshold value or a second threshold value, by using a terrestrial magnetic sensor, the gyro sensor, or the acceleration sensor, which may detect an angle at which the mobile device 1000 is bent with reference to a ground surface. Additionally, the sensor unit 1110 may determine a state in which the mobile device 1000 is unbent. For example, the user input unit 1100 may also detect a user input by using the sound input unit 1120.
The output unit 1200 outputs data of the mobile device 1000. The output unit 1200 may display an object that includes at least one selected from the group consisting of an image, a video clip, and text. Additionally, the output unit 1200 may output voice. The display unit 1210 displays data of the mobile device 1000. The display unit 1210 may include the display screen 100 and the first area 110 and the second area 120 of the display screen 100, which are shown in FIGS. 1 through 23. The display unit 1210 may display an object that includes at least one selected from the group consisting of an image, a video clip, and text. The display unit 1210 may be formed of a touchscreen. The display unit 1210 may output a screen from among a booting screen, a lock screen, a menu screen, a home screen, and an application activation screen. The display screen 100 may be formed of a flexible material that may be bent. For example, the display unit 1210 may be formed of a flexible AMOLED (F-AMOLED) display or a flexible LCD (FLCD).
The sound output unit 1220 outputs voice played by the mobile device 1000. The sound output unit 1220 may include a speaker (SPK) for playing audio data transceived during a phone call using the mobile device 1000. The sound output unit 1220 may play and output an audio signal for an application that is executed when the mobile device 1000 receives a user input of bending the mobile device 1000.
The camera unit 1300 may be located at a front surface or a rear surface of the mobile device 1000, and obtain an object image by photographing an actual space that is in a periphery of the mobile device 1000. The camera unit 1300 may include a camera sensor 1310, a signal processing unit 1320, and an image sensor 1330. The camera sensor 1310 may photograph an object included in the actual space that is in the periphery of the mobile device 1000, and convert an object image of the photographed object into an electrical signal. The signal processing unit 1320 may convert an analog image signal, obtained by photographing the object by using the camera sensor 1310, into digital data. The camera sensor 1310 may be a CMOS sensor or a CCD sensor. The signal processing unit 1320 may be implemented as a digital signal processor (DSP). The image sensor 1330 may determine a current location of the mobile device 1000 by recognizing a particular object in the object image, by sensing and analyzing the object image of the particular object photographed by using the camera sensor 1310, or by collecting location information related to the particular object, and may transmit the current location of the mobile device 1000 to the control unit 1600.
The communication unit 1400 may include a mobile communication unit 1410 and a short-range communication unit 1420. The mobile device 1000 may establish communication with a peripheral device or a server (not shown) by using the mobile communication unit 1410 or the short-range communication unit 1420. In detail, the mobile communication unit 1410 may enable the mobile device 1000 to communicate with an AP provided by a communication network service provider, and determine a current location of the mobile device 1000 by transmitting location information of the AP. The short-range communication unit 1420 may enable the mobile device 1000 to communicate or share content with a peripheral device by using a Beacon communication, an NFC, a ZigBee communication, an RFID communication, a UWB communication, or a Bluetooth communication.
The memory 1500 stores information about a particular location of the mobile device 1000 as a numerical value, or stores information about a recognized object. As such, the memory 1500 may store information needed when the mobile device 1000 executes an application to provide a service.
The control unit 1600 controls all operations of the user input unit 1100, the output unit 1200, the camera unit 1300, the communication unit 1400, and the memory 1500, so that the mobile device 1000 may provide a preset service as the mobile device 1000 receives a user input of bending the mobile device 1000.
The control unit 1600 provides a service related to an object image captured by using the camera unit 1300, from among services of applications installed on the mobile device 1000. Additionally, the control unit 1600 may determine a location of the mobile device 1000 by using the captured object image, and provide a service related to an object included in the object image based on the determined location.
Additionally, the control unit 1600 may recognize a particular object in the object image captured by the camera unit 1300, and provide information that is found by using at least one selected from the group consisting of text and an image included in the recognized particular object as an input value. The control unit 1600 may translate the text included in the recognized particular object into a preset language and provide the translated text.
Additionally, the control unit 1600 may determine the degree to which the mobile device 1000 is bent, by using the sensor unit 1110 included in the mobile device 1000. The control unit 1600 may provide a different service when the mobile device 1000 is bent in correspondence with a second threshold value that is less than the first threshold value from when the mobile device 1000 is bent in correspondence with the first threshold value. For example, if the control unit 1600 recognizes that the mobile device 1000 is bent in correspondence with the second threshold value, the mobile device 1000 may recognize location information of the mobile device 1000 and provide a service for searching for detailed information related to the recognized location information.
Additionally, the control unit 1600 may recognize a peripheral device included in the object image captured by using the camera unit 1300, and obtain device information about the recognized peripheral device.
Additionally, the control unit 1600 may establish communication between the mobile device 1000 and a peripheral device located in a periphery of the mobile device 1000, based on a user input of bending the mobile device 1000, so that the mobile device 1000 may share content stored in the memory 1500 of the mobile device 1000, or content stored in the peripheral device, with the peripheral device.
In addition, other exemplary embodiments can also be implemented through computer readable code/instructions in/on a medium, e.g., a computer readable medium, to control at least one processing element to implement any above described exemplary embodiment. The medium can correspond to any medium/media permitting the storage and/or transmission of the computer readable code.
The computer readable code can be recorded/transferred on a medium in a variety of ways, with examples of the medium including recording media, such as magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs, or DVDs), and transmission media such as Internet transmission media. Thus, the medium may be such a defined and measurable structure including or carrying a signal or information, such as a device carrying a bitstream according to one or more exemplary embodiments. The media may also be a distributed network, so that the computer readable code is stored/transferred and executed in a distributed fashion. Furthermore, the processing element could include a processor or a computer processor, and processing elements may be distributed and/or included in a single device.
It should be understood that the exemplary embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects within each exemplary embodiment should typically be considered as available for other similar features or aspects in other exemplary embodiments. For example, each component described in singular form may be executed in a distributed form. Likewise, components described in a distributed form may be executed in a combined form.
While one or more exemplary embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims.

Claims (15)

  1. A method of providing a preset service according to a user input of bending a mobile device, which is performed by the mobile device, the method comprising:
    receiving a first user input of bending the mobile device in correspondence with a first threshold value;
    photographing an object in an actual space by using a camera included in the mobile device, as the first user input is received;
    displaying an object image of the photographed object on a screen of the mobile device; and
    providing a service related to the object image of the photographed object, from among services of an application installed on the mobile device.
  2. The method of claim 1, further comprising determining a location of the mobile device, by using the object image of the photographed object,
    wherein the providing of the service comprises providing a service related to the object image, based on the determined location.
  3. The method of claim 1, further comprising recognizing an object in the object image of the photographed object,
    wherein the providing of the service comprises providing information related to the recognized object.
  4. The method of claim 3, wherein the recognized object is a peripheral device that is located near the mobile device,
    the method comprising obtaining device information of the peripheral device by using the object image, and
    the providing of the service comprises establishing communication between the mobile device and the peripheral device.
  5. The method of claim 4, wherein the providing of the service comprises sharing content in the mobile device with the peripheral device or sharing content in the peripheral device with the mobile device.
  6. The method of claim 1, wherein the displaying of the object image comprises displaying the object image of the photographed object on a first area from among areas of the screen of the mobile device, wherein the areas of the screen of the mobile device are obtained when the screen of the mobile device is divided as the mobile device is bent.
  7. The method of claim 6, wherein the providing of the service comprises displaying service information of the service on a second area from among the areas of the screen of the mobile device.
  8. The method of claim 1, further comprising:
    receiving a second user input of bending the mobile device in correspondence with a second threshold value before the first user input is received; and
    displaying additional information regarding an object included in the object image on a second area from among the areas of the screen of the mobile device.
  9. The method of claim 7 or claim 8, wherein the second threshold value represents an angle having a smaller value than the first threshold value.
  10. A mobile device configured to provide a preset service according to a user input, the mobile device comprising:
    a user input unit configured to receive a first user input of bending the mobile device in correspondence with a first threshold value;
    a camera unit configured to photograph an object in an actual space, as the first user input is received;
    a display unit configured to display an object image of the photographed object on a screen; and
    a control unit configured to provide a service related to the object image of the photographed object, from among services of an application installed on the mobile device.
  11. The mobile device of claim 10, wherein the control unit determines a location of the mobile device, by using the object image of the photographed object, and
    provides a service related to the object image, based on the determined location.
  12. The mobile device of claim 10, wherein the control unit recognizes a peripheral device included in the object image of the photographed object, and obtains device information of the recognized peripheral device.
  13. The mobile device of claim 10, further comprising an angle sensor configured to recognize an angle of a first threshold value or a second threshold value at which the mobile device is bent.
  14. The mobile device of claim 13, wherein the control unit provides a service of recognizing location information of the mobile device and searching for detailed information regarding the recognized location information, if the angle sensor senses that the mobile device is bent in correspondence with the second threshold value that is smaller than the first threshold value.
  15. A non-transitory computer-readable recording storage medium having stored thereon a computer program, which when executed by a computer, performs the method of claim 1.
PCT/KR2015/011382 2015-01-12 2015-10-27 Method of providing preset service by bending mobile device according to user input of bending mobile device and mobile device performing the same WO2016114475A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2015-0004459 2015-01-12
KR1020150004459A KR102332524B1 (en) 2015-01-12 2015-01-12 Method for providing service by bending mobile device and mobile device

Publications (1)

Publication Number Publication Date
WO2016114475A1 true WO2016114475A1 (en) 2016-07-21

Family

ID=56405999

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/011382 WO2016114475A1 (en) 2015-01-12 2015-10-27 Method of providing preset service by bending mobile device according to user input of bending mobile device and mobile device performing the same

Country Status (2)

Country Link
KR (1) KR102332524B1 (en)
WO (1) WO2016114475A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102402096B1 (en) 2017-03-27 2022-05-26 삼성전자주식회사 Device for providing information related to an object in an image
KR102620702B1 (en) 2018-10-12 2024-01-04 삼성전자주식회사 A mobile apparatus and a method for controlling the mobile apparatus
KR102620363B1 (en) * 2018-10-12 2024-01-04 삼성전자주식회사 A mobile apparatus and a method for controlling the mobile apparatus
WO2022119058A1 (en) * 2020-12-03 2022-06-09 삼성전자주식회사 Application execution processor and electronic device comprising same

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100011291A1 (en) * 2008-07-10 2010-01-14 Nokia Corporation User interface, device and method for a physically flexible device
WO2013169080A2 (en) * 2012-05-11 2013-11-14 Ahn Kang Seok Method for providing source information of object by photographing object, and server and portable terminal for method
US20140015745A1 (en) * 2012-07-13 2014-01-16 Samsung Electronics Co., Ltd. Apparatus and method for detecting and handling flexion states of flexible display
US20140035869A1 (en) * 2012-08-01 2014-02-06 Samsung Electronics Co. Ltd. Flexible display device and method for controlling the same
US20140295887A1 (en) * 2013-03-28 2014-10-02 Linkedln Corporation Navigating with a camera device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101606134B1 (en) * 2009-08-28 2016-03-25 삼성전자주식회사 Apparatus and method for connecting device using the image recognition in portable terminal


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109005286A (en) * 2018-07-13 2018-12-14 维沃移动通信有限公司 A kind of display control method and Folding screen terminal
CN109005286B (en) * 2018-07-13 2021-06-04 维沃移动通信有限公司 Display control method and folding screen terminal

Also Published As

Publication number Publication date
KR102332524B1 (en) 2021-11-29
KR20160086717A (en) 2016-07-20

Similar Documents

Publication Publication Date Title
WO2016114475A1 (en) Method of providing preset service by bending mobile device according to user input of bending mobile device and mobile device performing the same
WO2014157886A1 (en) Method and device for executing application
WO2017131335A1 (en) User terminal device and control method therefor
WO2017003222A1 (en) Notification apparatus and object position notification method thereof
WO2014038902A1 (en) Method and device for executing application
WO2014106977A1 (en) Head mounted display and method for controlling the same
WO2014038916A1 (en) System and method of controlling external apparatus connected with device
WO2014007545A1 (en) Method and apparatus for connecting service between user devices using voice
WO2014204239A1 (en) Electronic device for displaying lock screen and method of controlling the same
WO2016064132A1 (en) Wearable device and method of transmitting content
EP3304942A1 (en) Method and apparatus for sharing application
WO2017135522A1 (en) Mirror type display device and method for controlling same
WO2014157903A1 (en) Method and device for displaying service page for executing application
WO2016200018A1 (en) Method and apparatus for sharing application
WO2014104656A1 (en) Method and system for communication between devices
WO2020149689A1 (en) Image processing method, and electronic device supporting same
EP3906553A1 (en) Electronic device for providing graphic data based on voice and operating method thereof
WO2015115698A1 (en) Portable device and method of controlling therefor
WO2016089047A1 (en) Method and device for providing content
WO2018124842A1 (en) Method and device for providing information on content
WO2020153772A1 (en) Electronic device and method of providing content therefor
WO2016093633A1 (en) Method and device for displaying content
WO2016080662A1 (en) Method and device for inputting korean characters based on motion of fingers of user
WO2020171558A1 (en) Method of providing augmented reality contents and electronic device therefor
WO2015005718A1 (en) Method of controlling operation mode and electronic device therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15878119

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15878119

Country of ref document: EP

Kind code of ref document: A1