CN104662588A - Display device, control system, and control program - Google Patents

Display device, control system, and control program

Info

Publication number
CN104662588A
CN104662588A (application CN201380050117.4A)
Authority
CN
China
Prior art keywords
page
display
display device
finger
dimensional object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201380050117.4A
Other languages
Chinese (zh)
Other versions
CN104662588B (en)
Inventor
上野泰弘 (Yasuhiro Ueno)
田边茂辉 (Shigeki Tanabe)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Kyocera Corp
Original Assignee
Kyocera Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corp filed Critical Kyocera Corp
Publication of CN104662588A publication Critical patent/CN104662588A/en
Application granted granted Critical
Publication of CN104662588B publication Critical patent/CN104662588B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F1/163 Wearable computers, e.g. on a belt
    • G02B27/017 Head-up displays, head mounted
    • G02B2027/014 Head-up displays comprising information/image processing systems
    • G02B2027/0178 Head-mounted displays of eyeglass type
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0483 Interaction with page-structured environments, e.g. book metaphor
    • G06F3/0485 Scrolling or panning
    • G06F3/0488 GUI interaction techniques using a touch-screen or digitiser
    • G06F3/04883 Touch-screen input of data by handwriting, e.g. gesture or text
    • G06F2203/04808 Gestures using several simultaneous contacts, e.g. several fingers or a combination of fingers and pen
    • H04N13/302 Image reproducers for viewing without the aid of special glasses, i.e. autostereoscopic displays
    • H04N13/31 Autostereoscopic displays using parallax barriers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/339 HMD displays using spatial multiplexing
    • H04N13/344 HMDs with head-mounted left-right displays

Abstract

A display device (1) comprises: display units (32a, 32b) which, when the device is worn, display an electronic publication by showing an image corresponding to each of the user's eyes; a detection unit (44) which detects a plurality of bodies performing an operation of turning a page of the publication; and a control unit (22) which, according to the detection result of the detection unit (44), causes the display unit (32a) to display a newly displayed page among the pages of the publication. The display unit (32a) may stereoscopically display the publication in a display space. In that case, the detection unit (44) detects the positions of the plurality of bodies in the display space.

Description

Display device, control system and control program
Technical field
The present invention relates to a display device, a control system, and a control program.
Background Art
In display devices having a display unit, such as mobile phone terminals, images and the like can be displayed stereoscopically (see, for example, Patent Document 1). Stereoscopic display is realized by exploiting the parallax between the two eyes.
Prior Art Documents
Patent Documents
Patent Document 1: Japanese Laid-Open Patent Publication No. 2011-95547
Summary of the Invention
Technical Problem
Although stereoscopic display is a display format that users accept readily, in existing display devices stereoscopic display is used only for viewing content and not for improving the convenience of operation. An object of the present invention is to provide a display device, a control system, and a control program that can offer the user a highly convenient method of operation.
Solution to Problem
A display device according to the present invention comprises: a display unit which, when the display device is worn, displays an electronic publication by showing an image corresponding to each of the user's eyes; a detection unit which detects an object performing a page-turning operation on a page of the publication; and a control unit which, according to the detection result of the detection unit, causes the display unit to display a newly displayed page among the pages of the electronic publication.
A control system according to the present invention comprises a terminal and a control unit that controls the terminal. The terminal comprises: a display unit which, when the terminal is worn, displays an electronic publication by showing an image corresponding to each of the user's eyes; and a detection unit which detects a plurality of objects performing a page-turning operation on a page of the publication. The control unit, according to the detection result of the detection unit, causes the display unit to display a newly displayed page among the pages of the publication.
A control program according to the present invention causes a display device having a display unit and a detection unit to execute the following steps: displaying a publication on the display unit by showing, when the display device is worn, an image corresponding to each of the user's eyes; detecting, by the detection unit, an object performing a page-turning operation on a page of the publication; and causing the display unit, according to the detection result of the detection unit, to display a newly displayed page among the pages of the publication.
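As a rough illustration of the claimed sequence of steps (display a page, detect a page-turn operation, display the newly selected page), the following sketch shows one possible page-selection logic. All names and the clamping behavior are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch of the claimed steps: display a page, detect a
# page-turn operation, then display the newly selected page.
def apply_page_turn(current_page, turned_pages, total_pages):
    """Return the index of the newly displayed page, clamped to the book."""
    return max(0, min(total_pages - 1, current_page + turned_pages))

def reading_step(pages, current, detect_turn):
    """One iteration: detect a turn (signed page count) and redisplay."""
    delta = detect_turn()                 # e.g. +2 = two pages forward
    new_page = apply_page_turn(current, delta, len(pages))
    return new_page, pages[new_page]      # index and content to display
```

For example, `reading_step(["p1", "p2", "p3"], 0, lambda: 2)` would jump two pages forward and return the third page.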
Advantageous Effects of Invention
The present invention has the effect of being able to provide the user with a highly convenient method of operation.
Brief Description of the Drawings
Fig. 1 is a perspective view of the display device.
Fig. 2 is a front view of the display device as worn by the user.
Fig. 3 is a diagram illustrating a variation of the display device.
Fig. 4 is a diagram illustrating another variation of the display device.
Fig. 5 is a diagram illustrating yet another variation of the display device.
Fig. 6 is a block diagram of the display device.
Fig. 7 is a diagram illustrating an example of control performed according to the functions provided by the control program.
Fig. 8 is a diagram illustrating an example of the information stored in the object data.
Fig. 9 is a diagram illustrating an example of the information stored in the action data.
Fig. 10 is a flowchart illustrating the basic processing procedure for realizing the book-reading function.
Fig. 11 is a diagram for explaining detection of an operation performed by grasping a three-dimensional object.
Fig. 12 is a diagram for explaining detection of an operation performed by grasping a three-dimensional object.
Fig. 13 is a flowchart illustrating the processing procedure of the selection detection process for a three-dimensional object.
Fig. 14 is a flowchart illustrating the processing procedure of the grasping operation detection process.
Fig. 15 is a diagram illustrating an example of a closed book.
Fig. 16 is a diagram illustrating an example of page-turning control.
Fig. 17 is a diagram illustrating another example of page-turning control.
Fig. 18 is a diagram illustrating yet another example of page-turning control.
Fig. 19 is a diagram illustrating the relation between the number of pages turned and the movement distance of the objects.
Fig. 20 is a diagram illustrating an example of presenting the selected page range to the user.
Fig. 21 is a diagram illustrating an example of displaying page contents in order to present the selected page range to the user.
Fig. 22 is a diagram illustrating an example of an operation of putting a mark on a page.
Fig. 23 is a diagram illustrating an example of a method of displaying a fold line.
Fig. 24 is a flowchart illustrating an example of the processing procedure for adjusting the selected page range.
Fig. 25 is a diagram illustrating an example of an operation of inserting a bookmark.
Fig. 26 is a diagram illustrating an example of an operation of tearing off a page.
Fig. 27 is a diagram illustrating another example of an operation of tearing off a page.
Fig. 28 is a diagram illustrating an example of an operation of tearing off part of a page.
Fig. 29 is a diagram illustrating an example of control when multiple books are displayed stereoscopically.
Fig. 30 is a flowchart illustrating the processing procedure for displaying another object associated with a page.
Fig. 31 is a diagram illustrating an example of displaying an object associated with a page.
Fig. 32 is a diagram illustrating an example of displaying an object associated with a page.
Fig. 33 is a diagram illustrating an example of displaying an object associated with a page.
Fig. 34 is a diagram illustrating an example of displaying an object associated with a page.
Fig. 35 is a diagram illustrating an example of displaying objects associated with the front and back of a page.
Fig. 36 is a diagram illustrating an example of displaying objects associated with the front and back of a page.
Fig. 37 is a diagram illustrating an example of displaying an object associated with multiple pages.
Fig. 38 is a diagram illustrating an example of displaying an object associated with multiple pages.
Embodiments
Hereinafter, the present invention will be described in detail with reference to the accompanying drawings. The present invention is not limited to the description below. The constituent elements in the description below include elements that a person skilled in the art could easily conceive of, elements that are substantially identical, and elements within the so-called range of equivalents.
Embodiment
First, the overall configuration of a display device 1 according to a first embodiment will be described with reference to Figs. 1 and 2. Fig. 1 is a perspective view of the display device 1. Fig. 2 is a front view of the display device 1 as worn by the user. As shown in Figs. 1 and 2, the display device 1 is a head-mounted device worn on the user's head.
The display device 1 has a front portion 1a, a side portion 1b, and a side portion 1c. When worn, the front portion 1a is placed in front of the user so as to cover both of the user's eyes. The side portion 1b is connected to one end of the front portion 1a, and the side portion 1c is connected to the other end. When worn, the side portions 1b and 1c are supported by the user's ears like the temples of eyeglasses, stabilizing the display device 1. The side portions 1b and 1c may also be connected to each other at the back of the user's head when worn.
The front portion 1a has a display unit 32a and a display unit 32b on the surface facing the user's eyes when worn. The display unit 32a is placed at a position facing the user's right eye when worn, and the display unit 32b at a position facing the user's left eye. The display unit 32a displays a right-eye image, and the display unit 32b displays a left-eye image. In this way, by having display units 32a and 32b that show an image corresponding to each of the user's eyes when worn, the display device 1 can realize three-dimensional display exploiting binocular parallax.
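The binocular parallax mentioned above can be illustrated with a small sketch: a point rendered at a given depth is shifted horizontally by a different amount for each eye, and the nearer the point, the larger the shift. The interpupillary distance and focal length below are assumed example values, not figures from the patent:

```python
# Illustrative sketch of binocular parallax; ipd_m and focal_px are
# assumed example values (63 mm eye spacing, 800 px focal length).
def screen_disparity(depth_m, ipd_m=0.063, focal_px=800.0):
    """Horizontal offset (pixels) between the left- and right-eye
    renderings of a point at the given depth."""
    return focal_px * ipd_m / depth_m

def per_eye_x(x_px, depth_m):
    """Split a point's screen x into per-eye positions: nearer points
    get a larger disparity, which the viewer perceives as closeness."""
    d = screen_disparity(depth_m)
    return x_px - d / 2.0, x_px + d / 2.0   # (left-eye x, right-eye x)
```

With these assumed values, a point 1 m away gets about 50 pixels of disparity, while a point 2 m away gets about 25.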
If different images can be provided independently to the user's right and left eyes, the display units 32a and 32b may be constructed from a single display device. For example, a single display device may provide different images to each eye by rapidly switching a shutter so that only one eye can see the displayed image at a time. The front portion 1a may cover the user's eyes so that outside light does not reach them when worn.
On the surface opposite to the one bearing the display units 32a and 32b, the front portion 1a has an imaging unit 40 and an imaging unit 42. The imaging unit 40 is placed near one end of the front portion 1a (the right-eye side when worn), and the imaging unit 42 near the other end (the left-eye side when worn). The imaging unit 40 acquires an image corresponding to the field of view of the user's right eye, and the imaging unit 42 acquires an image corresponding to the field of view of the user's left eye. The field of view here means, for example, what the user sees when looking straight ahead.
The display device 1 displays the image captured by the imaging unit 40 on the display unit 32a as the right-eye image, and the image captured by the imaging unit 42 on the display unit 32b as the left-eye image. Therefore, even though the field of view is blocked by the front portion 1a, the display device 1 can provide the wearing user with the same sight as when the display device 1 is not worn.
In addition to the function of providing the real sight to the user as described above, the display device 1 has a function of displaying virtual information three-dimensionally and allowing the user to operate on that virtual information. Through the display device 1, virtual information is displayed overlapping the real sight as if it actually existed. The user can then operate on it as if actually touching it with a hand, for example moving, rotating, or deforming it. In this way, the display device 1 can provide an intuitive and highly convenient method of operating on virtual information. In the following description, virtual information displayed three-dimensionally by the display device 1 is sometimes called a "three-dimensional object".
The display device 1 provides the user with a field of view as wide as when the device is not worn, and it can place a three-dimensional object of arbitrary size at an arbitrary position within this wide field of view. Thus, without being constrained by the size of a display panel, the display device 1 can display three-dimensional objects of various sizes at various positions in a broad space.
Figs. 1 and 2 show an example in which the display device 1 has an eyeglass (goggle) shape, but the shape of the display device 1 is not limited to this. For example, like the display device 2 shown in Fig. 3, it may have a helmet shape covering roughly the upper half of the user's head. Or, like the display device 3 shown in Fig. 4, it may have a mask shape covering roughly the whole of the user's face. Like the display device 4 shown in Fig. 5, it may also be configured to be connected by wire or wirelessly to an external device 4d such as an information processing device or a battery device.
Next, the functional configuration of the display device 1 will be described with reference to Fig. 6, which is a block diagram of the display device 1. As shown in Fig. 6, the display device 1 has an operating unit 13, a control unit 22, a storage unit 24, the display units 32a and 32b, the imaging units 40 and 42, a detection unit 44, and a distance measuring unit 46. The operating unit 13 receives basic operations such as starting and stopping the display device 1 and changing its mode.
The display units 32a and 32b have display devices such as liquid crystal displays (Liquid Crystal Display) or organic EL (Organic Electro-Luminescence) panels, and display various information according to control signals input from the control unit 22. The display units 32a and 32b may also be projection devices that project images onto the user's retinas using a light source such as a laser beam.
The imaging units 40 and 42 electronically capture images using image sensors such as CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensors, convert the captured images into signals, and output them to the control unit 22.
The detection unit 44 detects real objects present in the imaging range of the imaging units 40 and 42. For example, among the real objects present in the imaging range, it detects an object that matches a pre-registered shape (e.g., the shape of a human hand). The detection unit 44 may also be configured to detect the extent (shape and size) of real objects in the image, even for objects not registered in advance, based on edges in pixel brightness, saturation, hue, and so on.
The distance measuring unit 46 measures the distance to real objects present in the imaging range of the imaging units 40 and 42. The distance to a real object is measured, for each eye, with the position of that eye of the user wearing the display device 1 as the reference. Therefore, when the reference position at which the distance measuring unit 46 measures deviates from the position of an eye, the measured value is corrected according to the deviation so as to represent the distance from the eye position.
In the present embodiment, the imaging units 40 and 42 double as the detection unit 44 and the distance measuring unit 46. That is, objects in the imaging range are detected by analyzing the images captured by the imaging units 40 and 42, and the distance to an object is measured (calculated) by comparing the object included in the image captured by the imaging unit 40 with the same object included in the image captured by the imaging unit 42.
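Measuring distance by comparing the two captured images in this way is standard stereo triangulation: the depth of a point is inversely proportional to its horizontal shift (disparity) between the two camera images. A hedged sketch, in which the camera baseline and focal length are assumed example values rather than figures from the patent:

```python
# Illustrative stereo-triangulation sketch; baseline_m and focal_px are
# assumed example values (10 cm camera spacing, 800 px focal length).
def depth_from_disparity(x_left_px, x_right_px, baseline_m=0.10, focal_px=800.0):
    """Depth (m) of a point seen at x_left_px in the left camera image
    and x_right_px in the right one. A point in front of the cameras
    appears further right in the left image, so disparity is positive."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must shift between the two images")
    return focal_px * baseline_m / disparity
```

For example, a hand whose fingertip appears 40 pixels apart between the two images would be computed as 2 m away under these assumed parameters.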
The display device 1 may also have a detection unit 44 separate from the imaging units 40 and 42. The detection unit 44 may, for example, be a sensor that detects real objects present in the imaging range using at least one of visible light, infrared light, ultraviolet light, radio waves, sound waves, magnetism, or electrostatic capacitance. Likewise, the display device 1 may have a distance measuring unit 46 separate from the imaging units 40 and 42; the distance measuring unit 46 may, for example, be a sensor that measures the distance to real objects present in the imaging range using at least one of the same phenomena. The display device 1 may also have a sensor that can double as both the detection unit 44 and the distance measuring unit 46, such as a sensor using the TOF (Time-of-Flight) method.
The control unit 22 has a CPU (Central Processing Unit) as an arithmetic unit and a memory as a storage device, and realizes various functions by executing programs using these hardware resources. Specifically, the control unit 22 reads the programs and data stored in the storage unit 24, loads them into memory, and causes the CPU to execute the instructions contained in the loaded program. According to the execution results of the instructions executed by the CPU, the control unit 22 reads and writes data to the memory and the storage unit 24 and controls the operation of the display unit 32a and so on. When the CPU executes instructions, the data loaded into memory and the operations detected by the detection unit 44 are used as parameters or as part of the judgment conditions.
The storage unit 24 is composed of a nonvolatile storage device such as flash memory and stores various programs and data. The programs stored in the storage unit 24 include a control program 24a. The data stored in the storage unit 24 include object data 24b, action data 24c, and virtual space data 24d. The storage unit 24 may also be composed of a combination of a portable storage medium such as a memory card and a reader/writer that reads from and writes to the storage medium. In this case, the control program 24a, object data 24b, action data 24c, and virtual space data 24d may be stored in the storage medium. They may also be acquired from another device, such as a server, by wireless or wired communication.
The control program 24a provides functions related to the various controls for operating the display device 1. The functions provided by the control program 24a include a function of superimposing a three-dimensional object on the images acquired by the imaging units 40 and 42 and displaying the result on the display units 32a and 32b, a function of detecting operations on the three-dimensional object, and a function of changing the three-dimensional object according to the detected operation. By controlling the display of three-dimensional objects and detecting operations on them in this way, the control program 24a lets the user enjoy the pleasure of reading an electronic publication, as described later.
The control program 24a includes a detection processing unit 25, a display object control unit 26, and an image compositing unit 27. The detection processing unit 25 provides a function for detecting real objects present in the imaging range of the imaging units 40 and 42, including a function of measuring the distance to each detected object.
Display object control part 26 is provided for management and configures which type of three dimensional object in imaginary space, and each three dimensional object is in the function of which type of state.The function that display object control part 26 provides comprises the operation of motion detection for three dimensional object of the real-world object that basis is detected by the function in check processing portion 25, and according to the function that the operation detected makes three dimensional object change.
Images uniting portion 27 is provided for, by the synthesis image of real space and the image in imaginary space, being created on the function of the image shown in display part 32a and the image shown in display part 32b.The function that Images uniting portion 27 provides comprise according to by measured by the function in check processing portion 25 apart from real-world object distance with from the viewpoint imaginary space to the distance of three dimensional object, judge the context of real-world object and three dimensional object and the function of adjustment overlap.
Object data 24b comprises the information of shape about three dimensional object and character.Object data 24b is for showing three dimensional object.Worked upon data 24c comprises the information about how to act on three dimensional object to the operation of the three dimensional object shown.When the operation to the three dimensional object of display being detected, worked upon data 24c is used for judgement makes three dimensional object how change.Change mentioned here comprises movement, rotation, distortion, disappearance etc.Imagination spatial data 24d keeps the information relevant to the state of the three dimensional object be configured in imaginary space.The state of three dimensional object comprises the situations such as such as position, posture, distortion.
Next, an embodiment of the control performed according to the functions provided by the control program 24a is described with reference to Fig. 7. Image P1a is an image acquired by the photographing part 40, that is, an image corresponding to the scene of the real space as seen by the right eye. Image P1a shows the table T1 and the hand H1 of the user. The display device 1 also acquires an image of the same scene taken by the photographing part 42, that is, an image corresponding to the scene of the real space as seen by the left eye.
Image P2a is an image for the right eye generated from the virtual space data 24d and the object data 24b. In the present embodiment, the virtual space data 24d holds information on the state of a block-shaped three-dimensional object BL1 present in the virtual space, and the object data 24b holds information on the shape and properties of the three-dimensional object BL1. The display device 1 reproduces the virtual space from this information and generates the image P2a of the reproduced virtual space as seen from the viewpoint of the right eye. The position of the right eye (viewpoint) in the virtual space is determined according to a predetermined rule. Similarly, the display device 1 also generates an image of the reproduced virtual space as seen from the viewpoint of the left eye. That is, the display device 1 generates an image that, in combination with image P2a, displays the three-dimensional object BL1 three-dimensionally.
In step S1 shown in Fig. 7, the display device 1 synthesizes image P1a and image P2a to generate image P3a. Image P3a is the image displayed on the display part 32a as the image for the right eye. At this time, the display device 1 judges, with the position of the user's right eye as a reference, the front-back relationship between the real objects in the photographing range of the photographing part 40 and the three-dimensional object present in the virtual space. When a real object and the three-dimensional object overlap, the overlap is adjusted so that the object closer to the user's right eye is visible in front.
The above-described adjustment of the overlap is performed for each range of a predetermined size (for example, each pixel) in the region of the image where the real object and the three-dimensional object overlap. Therefore, the distance from the viewpoint in the real space to the real object is measured for each range of the predetermined size on the image. Furthermore, taking the position, shape, posture, etc. of the three-dimensional object into consideration, the distance from the viewpoint in the virtual space to the three-dimensional object is calculated for each range of the predetermined size on the image.
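The per-range adjustment described above amounts to a depth comparison: for each pixel-sized range, whichever of the real object and the three-dimensional object is nearer to the viewpoint is shown in front. The publication gives no code; the following is a minimal sketch, assuming measured real-object depths and computed virtual-object depths are already available per pixel (all names are illustrative):

```python
def composite(real_pixels, virtual_pixels, real_depth, virtual_depth):
    """Per-pixel overlap adjustment: show whichever layer is nearer the viewpoint.

    real_pixels / virtual_pixels: pixel values for one scanline.
    real_depth: measured distance from the real-space viewpoint to the real
    object at each pixel (None where nothing was measured).
    virtual_depth: calculated distance from the virtual-space viewpoint to the
    three-dimensional object (None where no object covers the pixel).
    """
    out = []
    for rp, vp, rd, vd in zip(real_pixels, virtual_pixels, real_depth, virtual_depth):
        if vd is None:                 # no three-dimensional object here
            out.append(rp)
        elif rd is None or vd <= rd:   # virtual object is nearer the viewpoint
            out.append(vp)
        else:                          # real object (e.g. the user's thumb) is nearer
            out.append(rp)
    return out

# The thumb (depth 0.4) occludes the block (0.5); the table (0.6) lies behind it.
print(composite(["thumb", "table"], ["block", "block"], [0.4, 0.6], [0.5, 0.5]))
```

This is only the compositing decision; producing the per-pixel real depths from the stereo cameras and the virtual depths from the object geometry is assumed to happen elsewhere.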
In the scene of step S1 shown in Fig. 7, the three-dimensional object BL1 is arranged in the virtual space at a position corresponding to a point directly above where the table T1 exists in the real space. Furthermore, in the scene of step S1 shown in Fig. 7, the hand H1 of the user and the three-dimensional object BL1 are present at roughly the same distance in roughly the same direction, with the position of the user's right eye as a reference. Therefore, by adjusting the overlap for each range of the predetermined size, in the synthesized image P3a, within the region where the hand H1 and the three-dimensional object BL1 overlap, the hand H1 appears in front at the part corresponding to the thumb of the hand H1, and the three-dimensional object BL1 appears in front at the other parts. In the region where the table T1 and the three-dimensional object BL1 overlap, the three-dimensional object BL1 appears in front.
By the above-described adjustment of the overlap, in step S1 shown in Fig. 7, an image P3a is obtained in which the three-dimensional object BL1 appears to be placed on the table T1 and the user appears to be grasping the three-dimensional object BL1 with the hand H1. By the same processing, the display device 1 synthesizes the image taken by the photographing part 42 with the image of the virtual space as seen from the viewpoint of the left eye, and generates the image to be displayed on the display part 32b as the image for the left eye. When generating the image for the left eye, the overlap of the real objects and the three-dimensional object is adjusted with the position of the user's left eye as a reference.
The display device 1 displays the composite images generated as described above on the display parts 32a and 32b. As a result, the user can see a scene as if the three-dimensional object BL1 were placed on the table T1 and grasped by the user's own hand H1.
In the scene of step S1 shown in Fig. 7, the user moves the hand H1 in the direction of arrow A1. In this case, in the scene of step S2 shown in Fig. 7, the image acquired by the photographing part 40 changes to an image P1b in which the position of the hand H1 has moved to the right. The display device 1 judges the motion of the hand H1 to be an operation of moving to the right while grasping the three-dimensional object, and, in accordance with the operation, moves the position of the three-dimensional object in the virtual space to the right. The movement of the three-dimensional object in the virtual space is reflected in the virtual space data 24d. As a result, the image for the right eye generated from the virtual space data 24d and the object data 24b changes to an image P2b in which the position of the three-dimensional object BL1 has moved to the right. The detection of operations by the display device 1 is described in detail later.
The display device 1 synthesizes image P1b and image P2b to generate an image P3b for the right eye. Compared with image P3a, image P3b is an image in which the user appears to be grasping the three-dimensional object BL1 at a position further to the right on the table T1. The display device 1 similarly generates a composite image for the left eye. The display device 1 then displays the composite images generated as described above on the display parts 32a and 32b. As a result, the user can see a scene as if the user had grasped the three-dimensional object BL1 with the hand H1 and moved it to the right.
The composite images displayed as described above are updated at a frequency equal to a common video frame rate (for example, 30 times per second). As a result, the images displayed by the display device 1 reflect the changes of the three-dimensional object BL1 according to the user's operations in substantially real time, and the user can operate the three-dimensional object BL1 as if it actually existed. Furthermore, in the configuration of the present embodiment, the hand H1 of the user operating the three-dimensional object BL1 does not need to be located between the user's eyes and the display parts 32a and 32b, so the user can operate without worrying that the display of the three-dimensional object will be blocked by the hand.
Next, the object data 24b and the action data 24c shown in Fig. 6 are described in more detail with reference to Fig. 8. Fig. 8 is a figure showing an embodiment of the information stored in the object data 24b. The embodiment shown in Fig. 8 is an embodiment of information about a three-dimensional object displayed as a book. As shown in Fig. 8, the three-dimensional object displayed as a book includes a plurality of three-dimensional objects: a front cover, a back cover, a spine, and a plurality of pages. That is, the three-dimensional object displayed as a book is an aggregate of three-dimensional objects. In the explanation hereafter, for convenience, the three-dimensional object displayed as a book is sometimes simply called a "book". Similarly, the three-dimensional objects corresponding to the front cover, back cover, spine, and pages are sometimes simply called the "front cover", "back cover", "spine", and "page", respectively.
Information for determining the appearance and properties, such as thickness, width, height, and color, is preset for the front cover, back cover, and spine. In addition, text strings, images, and the like to be displayed on the surfaces of these three-dimensional objects are set as content in a predetermined format.
Information for determining the appearance and properties, such as thickness, width, height, and color, is preset in common for the plurality of pages. In addition, text, images, and the like to be displayed on each page are set as content for each of the plurality of pages in a predetermined format. Information specific to a particular page, such as "<fold/>" and "<bookmark/>", is sometimes added. "<fold/>" indicates that a part of the corresponding page is folded. "<bookmark/>" indicates that a bookmark is inserted at the corresponding page.
Fig. 8 shows an embodiment in which the object data 24b is described in XML (Extensible Markup Language) format, but the format of the object data 24b is not limited thereto. For example, the format of the object data 24b may be a specially designed format. In addition, the structure of the three-dimensional object displayed as a book is not limited to the embodiment shown in Fig. 8. For example, the three-dimensional object displayed as a book need not include information for determining the shapes and properties of the front cover, back cover, and spine. In this case, the front covers, back covers, and spines of all books may have a common shape and properties according to settings made in advance.
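Fig. 8 itself is not reproduced in this text. Purely as an illustration of what an XML description of a book, as an aggregate of cover, spine, and page objects with per-page content and page-specific markers, might look like, here is a hypothetical sketch parsed with Python's standard library (element and attribute names are invented, not taken from the publication):

```python
import xml.etree.ElementTree as ET

# Hypothetical object data for one book: covers, spine, and pages share preset
# appearance attributes; each page carries its own content, and page-specific
# markers such as <fold/> and <bookmark/> may be added.
OBJECT_DATA = """
<book thickness="20" width="150" height="210" color="brown">
  <frontcover>My Title</frontcover>
  <spine>My Title</spine>
  <page number="1">Once upon a time ...</page>
  <page number="2"><fold/>... the story continues.</page>
  <page number="3"><bookmark/>The end.</page>
  <backcover/>
</book>
"""

book = ET.fromstring(OBJECT_DATA)
pages = book.findall("page")
print(len(pages))  # 3 pages in this sketch
print([p.get("number") for p in pages if p.find("bookmark") is not None])
```

A specially designed (non-XML) format, as the text allows, would carry the same information: common appearance settings plus per-page content and markers.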
Fig. 9 is the figure of the embodiment that the information be stored in worked upon data 24c is shown.Embodiment shown in Fig. 9 illustrates how to act on these pages to the operation of the page included by book.In addition, in the present embodiment, the operation of page is envisioned for such as catches a part for page and the operation carried out with finger etc.
As shown in Figure 9, the effect of the operation of page is changed according to conditions such as situation, moving direction, moving range, translational speed and hardness.Situation represents when being release, at the end of namely catching operation, or mobile in, which when namely catching operation.Moving direction is the moving direction of the finger catching page etc.Moving range is the moving range of the finger catching page etc.Translational speed is the translational speed of the finger catching page etc.Hardness represents the hardness of page.Hardness is determined according to the thickness of page.
When setting situation as release, the moving direction catching the finger etc. of page to be about to is the opening and closing direction of book, namely with the direction that the connecting portion of page rotates for turning axle.In this case, if the moving range of finger etc. exceedes the connecting portion of page, if the connecting portion that namely finger etc. cross page moves, then display device 1 makes the page of release be changed by page turning.In addition, even if the moving range of finger etc. does not exceed the connecting portion of page, but when translational speed is higher than threshold value, and during hardness ratio threshold value height, display device 1 makes the page of release be changed by page turning.In addition, when the moving range to point etc. does not exceed the connecting portion of page, if translational speed is lower or hardness ratio threshold value is low than threshold value, then display device 1 makes the page of release change with gravity.Change case with gravity falls to gravity direction as shown as.
In addition, the situation that sets is as in movement, and the moving direction catching the finger of page etc. is the direction changed with the distance of the connecting portion of page.In this case, if the hardness of page is high, then display device 1 changes the position be booked.In addition, when the hardness ratio threshold value of page is low, if the distance of the part be booked and the connecting portion of page is below initial distance, then display device 1 coordinates the action of finger etc. to change the page be booked.In addition, when the hardness ratio threshold value of page is low, and when the part be booked is larger than initial distance with the distance of the connecting portion of page, display device 1 makes the page be booked be changed to be removed.That is, display device 1 makes the page be booked be separated with book.
In addition, when the moving direction of the finger catching page etc. is the direction orthogonal with the opening and closing direction of book, and during the hardness ratio threshold value height of page, display device 1 changes the position be booked.In addition, when the moving direction of the finger catching page etc. is the direction orthogonal with the opening and closing direction of book, and when the hardness ratio threshold value of page is low, display device 1 makes the page be booked be changed to be removed.In addition, when catching the opening and closing direction of moving direction book of finger etc. of page, display device 1 coordinates the action of finger etc. to change the page be booked.
Like this, set information in worked upon data 24c, changes according to operation to make page in the same manner as real page.About front cover and back cover, also in worked upon data 24c, carry out the setting identical with page.In addition, the structure of worked upon data 24c and content are not limited to the embodiment shown in Fig. 9.Such as, worked upon data 24c also can comprise the condition beyond shown in Fig. 9.In addition, the effect defined in worked upon data 24c also can be different from the embodiment shown in Fig. 9.
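The release-time rules of Fig. 9 can be read as a small decision function. A sketch under the conditions stated above, for release after movement in the opening/closing direction of the book (the threshold values and return labels are illustrative only):

```python
def released_page_change(crossed_binding, speed, stiffness,
                         speed_threshold=1.0, stiffness_threshold=0.5):
    """Effect on a page when the grasp is released after moving in the
    opening/closing direction of the book, per the Fig. 9 rules.

    crossed_binding: whether the fingers moved across the binding of the pages.
    """
    if crossed_binding:
        return "turn"      # moving range exceeded the binding: page is turned
    if speed > speed_threshold and stiffness > stiffness_threshold:
        return "turn"      # flicked fast enough, and the page is stiff enough
    return "fall"          # otherwise the released page falls under gravity

print(released_page_change(True, 0.1, 0.2))   # crossed the binding: turn
print(released_page_change(False, 0.2, 0.9))  # slow, no crossing: fall
```

The in-motion rules (follow the fingers, shift the grasped position, or tear the page off, depending on stiffness and direction) would be a second function of the same shape.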
Next, the operations performed by the display device 1 to realize the reading function for a book are described with reference to Figs. 10 to 14. In the explanation hereafter, the space viewed by the user wearing the display device 1 is sometimes called the display space. Fig. 10 is a flowchart showing the basic processing sequence performed by the display device 1 to realize the reading function for a book. Figs. 11 and 12 are figures for explaining the detection of an operation performed by grasping a three-dimensional object. Fig. 13 is a flowchart showing the processing sequence of the selection detection processing for a three-dimensional object. Fig. 14 is a flowchart showing the processing sequence of the grasping operation detection processing.
The processing sequence shown in Fig. 10 is realized by the control part 22 executing the control program 24a. As shown in Fig. 10, in order to realize the reading function for a book, first, as step S101, the control part 22 synthesizes a virtual space image including a book with a real space image and displays the result on the display parts 32a and 32b. The appearance and content of the displayed book are determined according to the object data 24b.
Then, as step S102, the control part 22 judges whether an operation on the book has been detected. An operation on the book is detected from the images taken by the photographing parts 40 and 42. When an operation on the book is detected (Yes at step S102), as step S103, the control part 22 changes the displayed book according to the detected operation. The method of changing the book in response to the detected operation is determined according to the action data 24c. When no operation on the book is detected (No at step S102), the displayed book remains unchanged.
Then, as step S104, the control part 22 judges whether to end the processing. For example, when the user has performed an operation instructing the end of the reading function for the book, the control part 22 judges to end the processing. When ending the processing (Yes at step S104), the control part 22 ends the processing sequence shown in Fig. 10. When not ending the processing (No at step S104), the control part 22 executes again from step S102.
As described above, in the present embodiment, an operation on the book is assumed to be, for example, an operation performed while grasping a page with fingers or the like. That is, in step S102 shown in Fig. 10, an operation performed by grasping a page is detected, and in step S103, processing corresponding to the operation performed by grasping the page is carried out.
The control related to operations performed by grasping is described in detail below with reference to Figs. 11 to 14. In step SA1 shown in Fig. 11, a three-dimensional object OB1 is displayed three-dimensionally in the display space by the display parts 32a and 32b. To select the three-dimensional object OB1, the user moves finger F1 and finger F2 so that the three-dimensional object OB1 is positioned between finger F1 and finger F2.
When the display device 1 detects two objects in the display space and the three-dimensional object OB1 is positioned between these two objects, it monitors the change in the distance D1 between the two objects. If the distance D1 remains roughly fixed for a predetermined time or longer, the display device 1 judges that the three-dimensional object OB1 has been selected and sets the three-dimensional object OB1 to the selected state. The display device 1 then changes the display mode or the like of the three-dimensional object OB1 to notify the user that the three-dimensional object OB1 has entered the selected state.
The user may be notified that the three-dimensional object OB1 has entered the selected state by, for example, changing the color or brightness near the parts of the surface of the three-dimensional object OB1 that intersect the straight line connecting the two detected objects. Instead of, or in addition to, such a visual notification, a notification by sound or vibration may be performed.
While the display device 1 is monitoring the distance D1 between the two objects, the two objects do not need to stay at the position grasping the three-dimensional object OB1. That is, after the user has moved finger F1 and finger F2 so that the three-dimensional object OB1 is positioned between them as shown in step SA1, the user may move finger F1 and finger F2 to other positions without maintaining that state. For example, after moving finger F1 and finger F2 to a position pinching the page to be grasped, the user may start the action of turning the grasped page before being notified that the grasped page has entered the selected state.
As shown in step SA2, the user moves finger F1 and finger F2 from the state of step SA1 while keeping the distance D1 between finger F1 and finger F2 roughly fixed. In this case, the display device 1 applies changes such as movement and rotation to the three-dimensional object OB1 according to the motion of finger F1 and finger F2, starting from the stage at which the display of the three-dimensional object OB1 between finger F1 and finger F2 is detected, that is, from the stage of step SA1. Then, as shown in step SA3, at the stage at which the distance D1 between finger F1 and finger F2 has been kept roughly fixed for the predetermined time or longer, the display device 1 sets the three-dimensional object OB1 to the selected state.
As shown in steps SB1 to SB3 of Fig. 12, when the distance D1 between finger F1 and finger F2 increases before the predetermined time has elapsed, the display device 1 applies to the three-dimensional object OB1 a change opposite to the changes applied up to that point. That is, when the user does not intend to perform an operation on the three-dimensional object OB1, the three-dimensional object OB1 returns to its original state. As a result, the three-dimensional object OB1 is displayed at the same position and in the same state as at the stage of step SB1. The speed at which the reverse change is applied to the three-dimensional object OB1 may be faster than the speed at which the changes were applied up to that point. That is, the three-dimensional object OB1 may be reverse-changed as if played back in fast reverse.
In this way, by applying changes to a three-dimensional object from the stage at which the display of the three-dimensional object between two objects is detected, the user can recognize that the three-dimensional object is being selected before the selection is confirmed. As a result, the user can know in advance whether the intended three-dimensional object has been selected. Until the state in which the distance between the two objects remains roughly fixed has continued for the predetermined time or longer, the three-dimensional object being changed may be displayed in a mode different from both the normal state and the selected state (for example, translucently), so that the user can easily judge the state of the three-dimensional object.
Alternatively, instead of changing the three-dimensional object OB1 according to the motion of finger F1 and finger F2 from the stage of step SA1, the change of the three-dimensional object OB1 may be started only after the three-dimensional object OB1 enters the selected state. Alternatively, the three-dimensional object OB1 may be set to the selected state only after the state in which the three-dimensional object OB1 is positioned between finger F1 and finger F2, as in step SA1, has continued for the predetermined time.
Fig. 11 shows an embodiment in which one three-dimensional object displayed between two objects is selected, but the number of three-dimensional objects to be selected is not limited to one. When the display device 1 detects that a plurality of three-dimensional objects is displayed between two objects, it selects these three-dimensional objects together. That is, the display device 1 allows the user to select a plurality of pages and operate them.
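Whether one or several stacked three-dimensional objects "are displayed between" the two detected objects can be approximated geometrically: test each object's position against the segment joining the two fingertips. This is only one plausible realization, not stated in the publication; all coordinates and the tolerance are hypothetical:

```python
import math

def objects_between(p1, p2, objects, tolerance=5.0):
    """Return the objects whose centers lie near the segment from fingertip p1
    to fingertip p2 -- stacked pages between two fingers are selected together.

    objects: list of (name, center) with 3-D coordinates.
    """
    v = [b - a for a, b in zip(p1, p2)]
    length_sq = sum(c * c for c in v)
    hits = []
    for name, center in objects:
        w = [b - a for a, b in zip(p1, center)]
        t = sum(a * b for a, b in zip(v, w)) / length_sq  # projection parameter
        if 0.0 <= t <= 1.0:                               # between the fingertips
            closest = [a + t * c for a, c in zip(p1, v)]
            if math.dist(closest, center) <= tolerance:   # close to the segment
                hits.append(name)
    return hits

# Fingers at z=0 and z=10 pinching pages stacked at z=3, 5; a distant page is excluded.
print(objects_between((0, 0, 0), (0, 0, 10),
                      [("page1", (0, 1, 3)), ("page2", (0, 0, 5)), ("far", (50, 0, 5))]))
```

The actual device works from stereo images taken by the two photographing parts, so the fingertip and object positions here stand in for whatever the detection processing part produces.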
Fig. 13 is a flowchart showing the processing sequence of the selection detection processing for a three-dimensional object. The processing sequence shown in Fig. 13 is realized by the control part 22 executing the control program 24a. As shown in Fig. 13, first, as step S201, the control part 22 judges whether a first object and a second object have been detected by the detection part 44, that is, by the photographing parts 40 and 42. The first object and the second object are, for example, fingers of the user.
When the first object and the second object are detected (Yes at step S201), as step S202, the control part 22 looks, among the displayed three-dimensional objects, for a three-dimensional object displayed between the first object and the second object.
When a three-dimensional object displayed between the first object and the second object is found (Yes at step S203), as step S204, the control part 22 sets the three-dimensional object displayed between the first object and the second object to a provisional selected state. When a plurality of three-dimensional objects is displayed between the first object and the second object, the control part 22 sets all of them to the provisional selected state. Then, as step S205, the control part 22 calculates the distance between the first object and the second object. Then, as step S206, the control part 22 performs the operation detection processing shown in Fig. 14, in which the three-dimensional object in the selected state is changed according to the detected operation.
When no three-dimensional object displayed between the first object and the second object is found (No at step S203), steps S204 to S206 are not executed.
Thereafter, as step S207, the control part 22 judges whether to end the processing. When ending the processing (Yes at step S207), the control part 22 ends the processing sequence. When not ending the processing (No at step S207), the control part 22 executes again from step S201.
When the first object and the second object are not detected (No at step S201), the control part 22 proceeds to step S207.
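One pass of the Fig. 13 flow (steps S201 through S205) can be sketched as a single function; the detector and the "displayed between" predicate are hypothetical placeholders for the functions of the detection processing part:

```python
import math

def selection_detection_step(first, second, displayed_objects, between):
    """One pass of the Fig. 13 selection detection (steps S201-S205).

    first/second: detected object positions, or None when not detected (S201).
    between(first, second, obj): predicate for "obj is displayed between them" (S202).
    Returns the objects put into the provisional selected state (S204) and the
    measured distance between the two objects (S205), or ([], None).
    """
    if first is None or second is None:           # S201 "No"
        return [], None
    candidates = [o for o in displayed_objects if between(first, second, o)]
    if not candidates:                            # S203 "No": skip S204-S206
        return [], None
    for obj in candidates:                        # S204: provisional selected state
        obj["state"] = "provisional"
    distance = math.dist(first, second)           # S205
    return candidates, distance                   # S206 (Fig. 14) would follow

page = {"name": "page1", "state": "idle"}
selected, d = selection_detection_step((0, 0, 0), (0, 0, 2), [page], lambda a, b, o: True)
print(page["state"], d)
```

The surrounding loop (S207, and re-entry at S201) is omitted; in the device it runs until the reading function is ended.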
Fig. 14 is a flowchart showing the processing sequence of the grasping operation detection processing. The processing sequence shown in Fig. 14 is realized by the control part 22 executing the control program 24a. As shown in Fig. 14, first, as step S301, the control part 22 calculates the distance between the first object and the second object. Then, as step S302, the control part 22 judges whether the difference between the distance at the start of the grasping operation detection processing, that is, at the time of the selection of the three-dimensional object, and the distance measured in step S301 is larger than a threshold. The threshold used here is a value for judging whether the distance between the first object and the second object is roughly the same as the distance at the time of the selection of the three-dimensional object.
When the difference between the distances is less than the threshold (No at step S302), as step S303, the control part 22 judges whether the predetermined time has elapsed since the start of the grasping operation detection processing. When the predetermined time has elapsed (Yes at step S303), as step S304, if there is a three-dimensional object in the provisional selected state, the control part 22 sets this three-dimensional object to the selected state. When the predetermined time has not elapsed (No at step S303), step S304 is not performed. The predetermined time may be a very short time such as, for example, 0.1 seconds.
Then, as step S305, the control part 22 changes the three-dimensional object in the selected state or the provisional selected state according to the detected motion of the first object and the second object. The method of changing the three-dimensional object is determined according to the action data 24c. For example, the control part 22 changes the page in the selected state or the provisional selected state so as to follow the motion of the first object and the second object lifting it. Then, the control part 22 executes again from step S301.
When the difference between the distances is larger than the threshold (Yes at step S302), as step S306, the control part 22 judges whether the three-dimensional object displayed between the first object and the second object is in the provisional selected state.
When the three-dimensional object is in the provisional selected state (Yes at step S306), as step S307, the control part 22 releases the provisional selected state of the three-dimensional object. Then, as step S308, the control part 22 applies a reverse change to the three-dimensional object to return it to its original state, and then ends the grasping operation detection processing.
When the three-dimensional object is not in the provisional selected state, that is, when it is in the selected state (No at step S306), as step S309, the control part 22 judges whether the selection range of the three-dimensional object can be maintained or changed in accordance with the change in the distance between the first object and the second object.
When the distance between the first object and the second object has shortened, the selection range of the three-dimensional object is maintained or reduced. Specifically, when there is one three-dimensional object in the selected state, the three-dimensional object remains in the selected state even if the distance between the first object and the second object shortens. When there is a plurality of three-dimensional objects in the selected state, the number of three-dimensional objects in the selected state decreases as the distance between the first object and the second object shortens, but at least one three-dimensional object remains in the selected state. For example, when pages are grasped with fingers, the control part 22 reduces the number of grasped pages as the fingers come closer together, but at least one page remains grasped.
On the other hand, when the distance between the first object and the second object increases, it may not be possible to maintain or change the selection range of the three-dimensional object. Specifically, when the increase in the distance between the first object and the second object brings a three-dimensional object in the unselected state between the first object and the second object, the selection range expands. In this case, the three-dimensional object in the unselected state enters the selected state. When there is no three-dimensional object in the unselected state between the first object and the second object after the distance has increased, and the gap between the first object or the second object and the three-dimensional objects in the selected state is larger than a predetermined size, the selected state cannot be maintained or changed. When it is judged that the selected state cannot be maintained or changed, the three-dimensional objects in the selected state are released.
For example, when pages are grasped with fingers, the control part 22 increases the number of grasped pages as the fingers move apart. Then, when there are no pages left ungrasped and the gap between the grasped pages and either finger becomes larger than the predetermined size, the control part 22 judges that the selected state cannot be maintained or changed.
When the selection range of the three-dimensional object can be maintained or changed in accordance with the change in the distance between the first object and the second object (Yes at step S309), as step S310, the control part 22 maintains or changes the selection range of the three-dimensional object in accordance with the change in the distance between the first object and the second object. Then, the control part 22 executes again from step S301.
When the selection range of the three-dimensional object cannot be maintained or changed in accordance with the change in the distance between the first object and the second object (No at step S309), as step S311, the control part 22 releases the selected state of the three-dimensional object. In addition, as step S312, the control part 22 changes the three-dimensional object according to the conditions at the time of release. The method of changing the three-dimensional object is determined according to the action data 24c. For example, the control part 22 changes the page in the selected state so that it turns under gravity. Then, the control part 22 ends the grasping operation detection processing.
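The per-frame branching of Fig. 14 can be compressed into one decision function. A much-simplified sketch for a single object: the maintain-or-change branch of steps S309/S310 is omitted for brevity, and the state names, threshold, and hold time are illustrative:

```python
def grasp_detection_step(obj, start_distance, current_distance,
                         elapsed, threshold=1.0, hold_time=0.1):
    """One pass of the Fig. 14 grasping operation detection for one object.

    obj: dict with a "state" of "provisional" or "selected".
    start_distance: finger distance at the time of selection (S301/S302 reference).
    Returns the action the display device would take this frame.
    """
    if abs(current_distance - start_distance) <= threshold:  # S302 "No"
        if elapsed >= hold_time and obj["state"] == "provisional":
            obj["state"] = "selected"                        # S303/S304
        return "follow-fingers"                              # S305
    if obj["state"] == "provisional":                        # S306 "Yes"
        obj["state"] = "unselected"                          # S307
        return "reverse-change"                              # S308: restore original
    obj["state"] = "unselected"                              # S311: release selection
    return "release-under-gravity"                           # S312: e.g. page turns

page = {"state": "provisional"}
print(grasp_detection_step(page, 10.0, 10.2, 0.2), page["state"])
```

In the device this decision runs once per update of the composite images, so a page tracks the fingers while pinched and falls or snaps back as soon as the pinch opens.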
Then, the specific embodiment of the control in the reading function of book is described with reference to Figure 15 to Figure 29.In addition, in order to make explanation simple, the explanation of supposition selection mode is hereafter omitted.
Figure 15 is an embodiment of the book closed.In fig .15, display device 1 is three-dimensionally presented at the book 50 on desk T1.In the present embodiment, book 50 is closed.The outward appearance of book 50 is determined according to object data 24b.In addition, if show book 50 at faithful to object data 24b, the Thickness Ratio predetermined value of book 50 is little, display device 1 also can revise the thickness of page to make the thickness of book 50 more than predetermined value.By increasing the thickness of book 50, user is easy to carry out the operation to book 50.
Figure 16 is a diagram illustrating an example of control for turning pages. In step SC1 shown in Figure 16, the user moves a finger F1 and a finger F2 so that the front cover and some pages of the book 50 are located between the finger F1 and the finger F2. When the display device 1 detects that the front cover and pages are located between the finger F1 and the finger F2, it sets the front cover and the pages between the finger F1 and the finger F2 to the selected state.
Thereafter, in step SC2, the user moves the finger F1 and the finger F2 in the opening/closing direction of the book 50, beyond the binding of the pages, while keeping the distance between the finger F1 and the finger F2 roughly constant. When the display device 1 detects this motion of the finger F1 and the finger F2, it changes the front cover and pages in the selected state according to the action data 24c. Specifically, the display device 1 changes the angle of the front cover and pages in the selected state in accordance with the movement of the finger F1 and the finger F2.
When the user then widens the distance between the finger F1 and the finger F2 in this state, or moves the finger F1 and the finger F2 away from the binding so that the front cover and pages are no longer between the finger F1 and the finger F2, the front cover and pages in the selected state are released. As a result, the display device 1 changes the book 50 according to the action data 24c. Specifically, as shown in step SC3, the book 50 is changed so that, among the pages that were in the selected state, the innermost page comes to the top. The display device 1 displays the text and images corresponding to these pages on the surfaces of the pages at which the book 50 is opened.
Figure 17 is a diagram illustrating another example of control for turning pages. In step SD1 shown in Figure 17, the book 50 is displayed in an opened state as a result of the control shown in Figure 16. The user then moves the finger F1 and the finger F2 so that pages, including the opened pages, are located between the finger F1 and the finger F2. When the display device 1 detects that pages are located between the finger F1 and the finger F2, it sets the pages between the finger F1 and the finger F2 to the selected state.
Thereafter, in step SD2, the user moves the finger F1 and the finger F2 in the opening/closing direction of the book 50, beyond the binding of the pages, while keeping the distance between the finger F1 and the finger F2 roughly constant. When the display device 1 detects this motion of the finger F1 and the finger F2, it changes the pages in the selected state according to the action data 24c. Specifically, the display device 1 changes the angle of the pages in the selected state in accordance with the movement of the finger F1 and the finger F2.
At this time, the display device 1 may change how the pages in the selected state are deformed according to the thickness (stiffness) of the pages. For example, when a page is thicker than a threshold (when its stiffness is high), the display device 1 may change its angle without bending it. Furthermore, when a page is thicker than the threshold, the display device 1 may restrict the change of the page so that its angle changes only when the object grasping the page in the selected state moves as if drawing an arc about the binding as the rotation axis. When a page is thinner than the threshold (when its stiffness is low), the display device 1 may bend the page in accordance with the motion of the grasping object and with gravity.
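The thickness-dependent behaviour above can be sketched as a simple selection between deformation modes. The threshold value and the mode labels below are illustrative assumptions, not values taken from the patent.

```python
def page_deformation(thickness, threshold=0.5):
    """Choose how a grasped page deforms, based on its thickness (stiffness)."""
    if thickness > threshold:
        # Stiff page: rotate rigidly about the binding without bending, so
        # the grasping fingers must trace an arc around the rotation axis.
        return "rigid-rotation"
    # Thin page: bend under the finger motion and under gravity.
    return "bend"

assert page_deformation(0.8) == "rigid-rotation"
assert page_deformation(0.2) == "bend"
```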
When the user widens the distance between the finger F1 and the finger F2 in the state of step SD2, or moves the finger F1 and the finger F2 away from the binding so that the pages are no longer between the finger F1 and the finger F2, the pages in the selected state are released. As a result, the display device 1 changes the book 50 according to the action data 24c. Specifically, as shown in step SD3, the book 50 is changed so that, among the pages that were in the selected state, the innermost page comes to the top. The display device 1 displays the text and images corresponding to these pages on the surfaces of the pages at which the book 50 is opened.
Figure 18 is a diagram illustrating still another example of control for turning pages. In step SE1 shown in Figure 18, multiple pages have been turned by the control shown in Figure 17. The user then moves a finger F3 and a finger F4 so that some of the pages in the selected state are located between the finger F3 and the finger F4. When the display device 1 detects that pages are located between the finger F3 and the finger F4, it associates the pages between the finger F3 and the finger F4 with the finger F3 and the finger F4.
Thereafter, in step SE2, the user moves the finger F3 and the finger F4 in the opening/closing direction of the book 50, beyond the binding of the pages, while keeping the distance between the finger F3 and the finger F4 roughly constant. When the display device 1 detects this motion of the finger F3 and the finger F4, it changes the pages in the selected state according to the action data 24c. Specifically, the display device 1 changes, in accordance with the movement of the finger F3 and the finger F4, the angle of the pages that are associated with the finger F3 and the finger F4 among the pages in the selected state.
When the user widens the distance between the finger F1 and the finger F2 in the state of step SE2, or moves the finger F1 and the finger F2 away from the binding so that the pages are no longer between the finger F1 and the finger F2, the pages between the finger F1 and the finger F2 are released. Likewise, when the user widens the distance between the finger F3 and the finger F4 in the state of step SE2, or moves the finger F3 and the finger F4 away from the binding so that the pages are no longer between the finger F3 and the finger F4, the pages between the finger F3 and the finger F4 are released. As a result, the display device 1 changes the book 50 according to the action data 24c. Specifically, as shown in step SE3, the book 50 is changed so that the page at the boundary between the pages that were between the finger F1 and the finger F2 and the pages that were between the finger F3 and the finger F4 comes to the top. The display device 1 displays the text and images corresponding to these pages on the surfaces of the pages at which the book 50 is opened.
As shown in Figure 16 to Figure 18, the display device 1 allows the user to open the book by grasping its pages. As a result, the user can not only read every page from the beginning, but can also easily find a desired part of an electronic publication with the same operations used on a real book.
The number of pages turned is determined according to the distance between the objects selecting the pages. Figure 19 is a diagram illustrating the relation between the number of pages turned and the distance between the objects. As shown in Figure 19, when the distance D1 between the finger F1 and the finger F2 selecting the pages is shorter than a distance Dx, fewer pages are turned than when the distance D1 equals the distance Dx. Conversely, when the distance D1 between the finger F1 and the finger F2 selecting the pages is longer than the distance Dx, more pages are turned than when the distance D1 equals the distance Dx. In this way, as long as there is no gap between the fingers and the book 50, the longer the distance D1, the more pages the display device 1 turns.
Because the display device 1 thus changes the number of pages turned in accordance with the distance D1, the user can turn any desired number of pages.
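The relation of Figure 19 can be sketched as a monotonic mapping from finger distance to page count. The linear mapping, the reference distance `dx`, and the page count at `dx` below are illustrative assumptions; the patent only requires that a longer D1 turn more pages.

```python
def pages_for_distance(d1, dx, pages_at_dx, total_pages):
    """Scale the number of selected pages with finger distance D1,
    capped at the number of pages in the book."""
    pages = round(pages_at_dx * d1 / dx)
    return max(1, min(pages, total_pages))

assert pages_for_distance(1.0, 2.0, 10, 100) == 5    # D1 < Dx: fewer pages
assert pages_for_distance(4.0, 2.0, 10, 100) == 20   # D1 > Dx: more pages
assert pages_for_distance(50.0, 2.0, 10, 100) == 100 # capped at the book
```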
After the pages of the number corresponding to the distance D1 are set to the selected state, when the distance D1 changes within the range in which the selection range can be changed, the display device 1 changes the number of pages in the selected state according to the changed distance D1. Preferably, the display device 1 indicates the range of the selected pages to the user.
Figure 20 is a diagram illustrating an example of indicating the range of the selected pages to the user. In step SF1 shown in Figure 20, the user moves the finger F1 and the finger F2 so that pages, including the opened pages, are located between the finger F1 and the finger F2. When the display device 1 detects that pages are located between the finger F1 and the finger F2, it sets the pages between the finger F1 and the finger F2 to the selected state. The display device 1 also displays, among the pages in the selected state, the page number (87) of the page at the end opposite the opened pages. In the example shown in Figure 20, the page number is displayed on an opened page, but it may be displayed at any position the user can see.
Thereafter, in step SF2, the user widens the distance between the finger F1 and the finger F2 without leaving a gap from the book 50. As a result, the display device 1 increases the number of pages in the selected state, and again displays the page number (125) indicating the extent of the pages in the selected state. By having the display device 1 indicate the range of the selected pages in this way, the user can easily adjust the range of pages to be opened.
Figure 20 shows an example in which page numbers are displayed to indicate the range of the selected pages to the user, but the number of pages in the selected state or the contents of the pages may be displayed instead of, or together with, the page numbers. Figure 21 is a diagram illustrating an example in which page contents are displayed to indicate the range of the selected pages to the user.
In step SG1 shown in Figure 21, the text and images corresponding to the opened pages of the book 50 are displayed on those pages. Then, in step SG2, the user moves the finger F1 and the finger F2 so that pages, including the opened pages, are located between the finger F1 and the finger F2. When the display device 1 detects that pages are located between the finger F1 and the finger F2, it sets the pages between the finger F1 and the finger F2 to the selected state. The display device 1 also displays the text and images of the page that would be shown if the pages in the selected state were turned. In the example shown in Figure 21, the contents of the opened pages are replaced, but the page contents may be displayed at any position and size the user can see.
Thereafter, in step SG3, the user widens the distance between the finger F1 and the finger F2 without leaving a gap from the book 50. As a result, the display device 1 increases the number of pages in the selected state, and again displays the text and images of the page that would be shown if the pages in the selected state were turned. By having the display device 1 present the page contents to the user in this way, the user can easily grasp which page will be seen after turning.
To adjust the number of pages in the selected state, not only the distance between the finger F1 and the finger F2 but also the movement speed of the finger F1 and the finger F2 may be used. Specifically, when the movement speed of the finger F1 and the finger F2 is faster than a threshold, the change in the number of selected pages relative to the change in distance is increased. Conversely, when the movement speed of the finger F1 and the finger F2 is slower than the threshold, the change in the number of selected pages relative to the change in distance is decreased. Using the movement speed of the finger F1 and the finger F2 in this way makes it easy for the user to adjust the number of pages in the selected state to the desired value. As the movement speed of the finger F1 and the finger F2 mentioned here, the faster of the two finger speeds is preferably used.
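The speed-sensitive adjustment above can be sketched as a gain that switches with finger speed. The gains and the speed threshold below are illustrative assumptions.

```python
def page_delta(distance_change, speed, speed_threshold=1.0,
               fast_gain=20, slow_gain=5):
    """Pages added or removed for a given change in finger distance,
    with a larger gain when the fingers move fast."""
    gain = fast_gain if speed > speed_threshold else slow_gain
    return int(distance_change * gain)

# The faster of the two finger speeds is used, as the text suggests.
speed = max(0.4, 1.6)
assert page_delta(0.5, speed) == 10   # fast motion: coarse adjustment
assert page_delta(0.5, 0.4) == 2      # slow motion: fine adjustment
```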
The above explanation concerns the operation of turning multiple pages together, but the display device 1 also accepts an operation of turning pages one by one. For example, when the display device 1 detects a motion in which a finger or the like touching one of the opened pages moves toward the other opened page, it may turn only the single touched page. This operation imitates turning a real sheet of thin paper.
The display device 1 may also accept operations other than page turning as operations related to pages. For example, the display device 1 may accept an operation of marking a page as an operation related to pages. Figure 22 is a diagram illustrating an example of an operation for marking a page. In step SH1 shown in Figure 22, the user grasps the corner of one page with the finger F1 and the finger F2. Then, in step SH2, the user moves the finger F1 and the finger F2 so as to fold the grasped part.
When the display device 1 detects this motion of folding part of a corner, it treats that part as a crease 50a and keeps it folded. The display device 1 then records the page provided with the crease 50a in the object data 24b. Preferably, as shown in Figure 23, the display device 1 displays the crease 50a in a manner different from the other parts, for example with a different color or brightness, so that the user can locate the crease 50a even while the book 50 is closed. By setting a folded part according to the user's operation in this way, the user can mark in advance a page to be read again later.
Multiple creases 50a may be set in one book 50. Furthermore, when a page is thicker than the threshold (when its stiffness is high), the display device 1 may refrain from setting a crease even if it detects the motion of folding part of a corner.
Preferably, the display device 1 adjusts the range of the selected pages so that the user can easily open a page on which a crease 50a is set. Figure 24 is a flowchart of an example of the processing sequence for adjusting the range of the selected pages. In step S401, when the control part 22 of the display device 1 detects that pages are located between a first object and a second object, it calculates the distance between the first object and the second object. Then, in step S402, the control part 22 calculates the number of pages to select according to the calculated distance.
Then, in step S403, the control part 22 determines whether there is a crease on any page within a predetermined number of pages before and after the last page selected. When there is a crease (Yes at step S404), the control part 22 corrects the number of selected pages in step S405 so that the selection extends to the creased page. When there are multiple creased pages within the predetermined number of pages before and after the last selected page, the control part 22 corrects the number of selected pages so that, for example, the selection extends to the creased page closest to the grasped last page.
When there is no crease within the predetermined number of pages before and after the last selected page (No at step S404), the control part 22 selects pages according to the number of pages calculated in step S402.
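The range-adjustment flow of Figure 24 (steps S403 to S405) can be sketched as snapping the selection toward a page marked with a fold (crease). The window size and function names below are illustrative assumptions.

```python
def adjust_selection(computed_last, creased_pages, window=3):
    """Return the last selected page, corrected toward a nearby crease."""
    nearby = [p for p in creased_pages
              if abs(p - computed_last) <= window]
    if not nearby:
        # Step S404 "No": keep the page count computed from the distance.
        return computed_last
    # Step S405: extend/shrink the selection to the closest creased page.
    return min(nearby, key=lambda p: abs(p - computed_last))

assert adjust_selection(50, [48, 60]) == 48   # snap to the nearby crease
assert adjust_selection(50, [60]) == 50       # no crease within the window
assert adjust_selection(50, [49, 52]) == 49   # nearest of several creases
```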
The selection-range adjustment shown in Figure 24 may be performed only when the operation of grasping pages is performed near a corner provided with a crease 50a (such as the corner 50b shown in Figure 23). That is, when the operation of grasping pages is performed near a corner where no crease 50a is set (such as the corner 50c shown in Figure 23), the selection range need not be adjusted. Because the display device 1 thus suppresses the adjustment of the selection range depending on the position at which the pages are grasped, the user can also easily refer to the pages within the predetermined number of pages before and after a page provided with a crease 50a.
Preferably, when the display device 1 detects an operation of inserting a bookmark 60 into the book as shown in Figure 25, or an operation of placing a bookmark ribbon (spin) in the book, it likewise adjusts the range of the selected pages in the same manner as when a crease is set.
The display device 1 may also treat an operation of removing a page as an operation related to pages. Figure 26 is a diagram illustrating an example of an operation for removing a page. In step SI1 shown in Figure 26, the user grasps the end of a page 50d with the finger F1 and the finger F2, and moves the finger F1 and the finger F2 so that their distance from the binding of the page increases. When the display device 1 detects this motion of the finger F1 and the finger F2, it changes the page 50d according to the action data 24c. Specifically, as shown in step SI2, the display device 1 tears the page 50d off and separates it from the book 50.
Figure 27 is a diagram illustrating another example of an operation for removing a page. In step SJ1 shown in Figure 27, the user grasps the corner of the page 50d with the finger F1 and the finger F2. Then, in step SJ2, the user moves the finger F1 and the finger F2 in a direction orthogonal to the opening/closing direction of the book 50. When the display device 1 detects this motion of the finger F1 and the finger F2, it changes the page 50d according to the action data 24c. Specifically, the display device 1 tears the page 50d in accordance with the movement of the finger F1 and the finger F2. As a result, as shown in step SJ3, when the user continues moving the finger F1 and the finger F2, the page 50d is torn off and separated from the book 50.
Figure 28 is a diagram illustrating an example of an operation for removing part of a page. In step SK1 shown in Figure 28, the user grasps a page 50e with the finger F1 and the finger F2, and forms a scissors shape with a finger F4 and a finger F5 of the other hand. Then, in step SK2, the user moves the finger F4 and the finger F5 so as to cut across the page 50e. When the display device 1 detects this motion of the finger F4 and the finger F5, it adds a cut line along the part crossed by the finger F4 and the finger F5. As a result, as shown in step SK3, a piece 50f, which is part of the page 50e, is cut off along the movement path of the finger F4 and the finger F5 and separated from the book 50. The operation for removing part of a page is not limited to the above motion of crossing the page with fingers forming scissors. For example, when the display device 1 detects a motion of tracing the page with a finger, it may cut off a piece along the traced path.
As shown in Figure 26 to Figure 28, since the display device 1 removes the whole or part of a page, the user can manage pages containing text of interest and the like in various document forms separate from the book. When a page is thicker than the threshold (when its stiffness is high), the display device 1 may be set so that the page cannot be removed. Furthermore, when the display device 1 detects an operation of removing a page, it may separate a copy of the page from the book instead of tearing out the page itself. A page that is not torn out may be, for example, a page for which copying is prohibited under copyright management.
The display device 1 may also stereoscopically display multiple books. Figure 29 is a diagram illustrating an example of control when multiple books are stereoscopically displayed. In step SL1 shown in Figure 29, the display device 1 stereoscopically displays three books 51 to 53 on the table T1. Then, in step SL2, the user performs an operation for opening the book 53 with the finger F1 and the finger F2.
When the display device 1 detects the operation of opening the book 53, it opens the book 53 according to the detected operation. At this time, in step SL3, the display device 1 displays the opened book 53 enlarged over roughly the entire surface of the table T1. Enlarging the opened book 53 in this way makes it easy for the user to read its pages.
The display device 1 may also display other objects associated with pages. The display of other objects associated with pages is described with reference to Figure 30 to Figure 38. Figure 30 is a flowchart of the processing sequence for displaying an object associated with a page.
The processing sequence shown in Figure 30 is realized by the control part 22 executing the control program 24a. As shown in Figure 30, first, in step S501, the control part 22 composites a virtual-space image including a book with a real-space image and displays the result on the display part 32a and the display part 32b. The appearance and contents of the displayed book are determined according to the object data 24b.
Then, in step S502, the control part 22 determines whether an operation on the book has been detected. Operations on the book are detected from the images captured by the photographing unit 40 and the photographing unit 42. When an operation on the book is detected (Yes at step S502), the control part 22 changes the displayed book according to the detected operation in step S503. How the book is changed in response to the detected operation is determined according to the action data 24c.
Then, in step S504, the control part 22 determines whether the pages whose contents are displayed have changed. When the pages have changed (Yes at step S504), the control part 22 displays, in step S505, the objects associated with the newly displayed pages in a manner corresponding to those pages.
When no operation on the book is detected (No at step S502), the displayed book is left as it is. When the pages have not changed (No at step S504), step S505 is not performed.
Then, in step S506, the control part 22 determines whether to end the process. The control part 22 determines to end the process, for example, when the user performs a predetermined operation instructing the end of the reading function. When ending the process (Yes at step S506), the control part 22 ends the processing sequence shown in Figure 30. Otherwise (No at step S506), the control part 22 executes the process from step S502 again.
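The loop of Figure 30 (steps S501 to S506) can be sketched as follows. All names here are illustrative assumptions; operations are modeled simply as functions that map the current page to a new page.

```python
def reading_loop(operations, page_objects, start_page=0):
    """Process detected operations; refresh the page-linked objects
    whenever the displayed page changes. Returns (page, shown objects)."""
    page = start_page
    shown = page_objects.get(page, [])           # step S501: initial display
    for op in operations:                        # step S502: operation detected
        new_page = op(page)                      # step S503: change the book
        if new_page != page:                     # step S504: page changed?
            page = new_page
            shown = page_objects.get(page, [])   # step S505: linked objects
    return page, shown

turn = lambda p: p + 2                           # turning to the next spread
page, shown = reading_loop([turn, turn], {0: [], 2: ["orca"], 4: ["reef"]})
assert (page, shown) == (4, ["reef"])
```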
In this way, when displaying other objects associated with pages, the display device 1 changes the display of the objects in accordance with changes of the pages. Specific examples of displaying other objects in correspondence with pages are described below.
Figure 31 to Figure 34 show examples of stereoscopically displaying marine life in correspondence with pages. In the example shown in Figure 31, a three-dimensional object 55a of a killer whale is associated with the page with page number 51 of a book 55, and three-dimensional objects 55b and 55c of tropical fish are associated with the page with page number 50. When the pages with page numbers 50 and 51 are displayed, the three-dimensional objects 55a to 55c are displayed so as to pop out from these pages. By displaying the three-dimensional objects associated with book pages so that they pop out, information with a greater sense of reality than the photographs or illustrations inserted in real books can be provided to the user.
The user may also freely change the association between pages and three-dimensional objects. For example, as shown in Figure 32, the user turns one page with the finger F1 and the finger F2 while grasping the three-dimensional object 55a with a finger F3 and a finger F4. When the display device 1 detects that a page is turned while a three-dimensional object is grasped in this way, it associates the grasped three-dimensional object with the newly displayed page.
As a result, as shown in Figure 33, the three-dimensional object 55a is associated with the page with page number 53. Furthermore, as shown in Figure 34, when the user turns the pages back and the page with page number 51 is displayed again, the three-dimensional object 55a is not displayed because its association with that page has been changed.
In Figure 31 to Figure 34, the other objects are associated with one face of a page, but another object may also be associated with the front and back of a page. In this case, the display device 1 changes how the object is displayed according to the angle of the page.
Figure 35 is a diagram illustrating an example of stereoscopically displaying marine life in correspondence with the front and back of a page. In the example shown in Figure 35, the front and back of a page 56a of a book 56 are associated with a three-dimensional object 56b of a killer whale. When one face of the page 56a is displayed, the display device 1 stereoscopically displays the three-dimensional object 56b so that the upper body of the killer whale pops out from the page 56a.
Then, when the user starts the operation of turning the page 56a, the display device 1 increases the displayed portion of the three-dimensional object 56b in accordance with the angle of the page 56a. When the page 56a is vertical, the entire three-dimensional object 56b is displayed. When the user continues the operation of turning the page 56a, the display device 1 decreases the displayed portion of the three-dimensional object 56b in accordance with the angle of the page 56a. When the page 56a has been turned completely, the display device 1 stereoscopically displays the three-dimensional object 56b so that the lower body of the killer whale pops out from the page 56a.
When the user turns the page 56a back in the opposite direction, the display device 1 changes the three-dimensional object 56b in the reverse manner to the above.
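The behaviour of Figure 35 can be sketched as a function of the page angle: the visible fraction of the object grows toward a fully vertical page and shrinks again as the page is laid flat. The linear mapping and the half-visible endpoints are illustrative assumptions.

```python
def visible_fraction(page_angle_deg):
    """Fraction of the page-linked object shown for a given page angle."""
    a = max(0.0, min(180.0, page_angle_deg))
    # 0 deg: page flat, half the object (the upper body) pops out;
    # 90 deg: page vertical, the whole object is shown.
    if a <= 90.0:
        return 0.5 + 0.5 * (a / 90.0)
    # Past vertical the shown portion shrinks toward the lower body.
    return 0.5 + 0.5 * ((180.0 - a) / 90.0)

assert visible_fraction(0) == 0.5      # upper body emerges from the page
assert visible_fraction(90) == 1.0     # entire object displayed
assert visible_fraction(180) == 0.5    # lower body after a full turn
```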
Figure 36 is a diagram illustrating another example of stereoscopically displaying marine life in correspondence with the front and back of a page. In the example shown in Figure 36, the front and back of a page 57a of a book 57 are associated with a three-dimensional object 57b of a killer whale. When one face of the page 57a is displayed, the display device 1 stereoscopically displays the three-dimensional object 57b with the dorsal fin of the killer whale facing upward.
Then, when the user starts the operation of turning the page 57a, the display device 1 rotates the three-dimensional object 57b sideways in accordance with the angle of the page 57a. When the page 57a has been turned completely, the display device 1 stereoscopically displays the three-dimensional object 57b with the belly of the killer whale facing upward. When the user turns the page 57a in the opposite direction, the display device 1 changes the three-dimensional object 57b in the reverse manner to the above.
Because the display device 1 thus changes the object in conjunction with page turning, the user can change the object as desired with the familiar operation of turning pages. That is, even a user who is not good at operating information devices can perform otherwise complex processes, such as rotating a three-dimensional object, simply by turning pages.
The display device 1 may also associate objects with multiple pages. Figure 37 is a diagram illustrating an example in which objects are associated with multiple pages and marine life is displayed stereoscopically. In the example shown in Figure 37, a three-dimensional object 58e of tropical fish and a three-dimensional object 58f of a killer whale are associated with four pages 58a to 58d of a book 58.
When the pages 58a and 58b are displayed, the display device 1 displays the three-dimensional object 58e and the three-dimensional object 58f at the same scale. Because the difference in size between the tropical fish and the killer whale is large, the tail portion of the killer whale in the three-dimensional object 58f extends beyond the display area, and the whole body is not displayed while the pages 58a and 58b are shown. The portion of the three-dimensional object 58f extending beyond the display area is shown by turning the pages to display the pages 58c and 58d.
Displaying multiple creatures at the same scale in this way makes it easy for the user to grasp their difference in size. Furthermore, the user can view the portion that extends beyond the display area with the familiar operation of turning pages.
In the example shown in Figure 38, a three-dimensional object 59e of a house is associated with four pages 59a to 59d of a book 59. When the pages 59a and 59b are displayed, the display device 1 displays the entire three-dimensional object 59e. When the pages 59c and 59d are displayed, the display device 1 displays the three-dimensional object 59e with only one storey of the house shown.
In this way, the display device 1 may set a cut plane according to the number of pages turned and display the object in a state cut at the set plane. This control can be applied, for example, to displaying a floor plan of a building floor by floor according to the number of pages turned, or to displaying cross sections of a human body according to the number of pages turned.
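The Figure 38 behaviour can be sketched as mapping the number of pages turned to a cut plane through the object, here counted as remaining storeys of the house. The page-to-storey mapping below is an illustrative assumption.

```python
def floors_shown(pages_turned, floors_per_spread=1, total_floors=3):
    """How many storeys of the house model remain visible after the
    cut plane set by the number of pages turned."""
    cut = pages_turned * floors_per_spread
    return max(1, total_floors - cut)

assert floors_shown(0) == 3   # pages 59a/59b: the whole house is shown
assert floors_shown(2) == 1   # pages 59c/59d: only one storey remains
```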
The aspects of the present invention described in the above embodiments can be modified arbitrarily without departing from the gist of the present invention. For example, the control program 24a described in the above embodiments may be divided into multiple modules, or may be integrated with other programs. Furthermore, in the above embodiments, fingers are used to operate on three-dimensional objects, but a stick or the like may be used instead of fingers.
In the above embodiments, three-dimensional objects are described as examples of the objects displayed in association with pages, but the objects displayed in association with pages are not limited to three-dimensional objects. For example, a moving image may be displayed in association with pages. When the display device 1 displays a moving image in association with pages, it may play a different chapter each time one page is turned.
In the above embodiments, the display device alone detects operations on three-dimensional objects, but the display device and a server device may detect operations on three-dimensional objects cooperatively. In this case, the display device sequentially transmits the information detected by the detection unit to the server device, and the server device detects the operation and notifies the display device of the detection result. Such a configuration can reduce the load on the display device.
The display device 1 may also limit the space in which operations on three-dimensional objects are detected to the range that the hands of the user wearing the display device 1 can reach. Limiting the space in which operations are detected in this way can reduce the load of the computational processing that the display device 1 performs to detect operations.
In the above embodiments, examples were described in which the present invention is used to realize a reading function for books, but what is read using the present invention is not limited to books. The present invention can be used to realize reading functions for various electronic publications, including pamphlets and newspapers.
Operations on three-dimensional objects that can be realized according to the present invention are not limited to those shown in the above embodiments. The control according to the present invention can also realize, for example, an operation of selecting and taking a book from a bookshelf, an operation of folding a newspaper, and an operation of writing on a book or the like with stationery.
Description of reference numerals
1 to 4 display device
1a front portion
1b, 1c side portions
4d external device
13 operation unit
22 control part
24 storage unit
24a control program
24b object data
24c action data
24d virtual space data
25 monitoring processing unit
26 display object control part
27 image compositing unit
32a, 32b display parts
40, 42 photographing units
44 detection unit
46 distance measuring unit

Claims (8)

1. A display device, characterized by comprising:
a display part configured to, when the display device is worn, display an electronic publication by displaying images corresponding to each of both eyes of a user;
a detection unit configured to detect an object that performs a page-turning operation on pages of the publication; and
a control part configured to cause, according to a detection result of the detection unit, the display part to display a newly displayed page among the pages of the publication.
2. display device as claimed in claim 1, is characterized in that,
Described display part is publication described in display space Stereo display,
Described test section detects the position of the multiple described object in described display space,
The object that described control part makes described display part stereo display corresponding with the page shown.
3. display device as claimed in claim 2, it is characterized in that, described control part changes the display mode of described object according to the angle of the page corresponding with described object.
4. display device as claimed in claim 3, it is characterized in that, described control part rotates described object according to the angle of the page corresponding with described object.
5. display device as claimed in claim 3, it is characterized in that, described control part makes the described object cut off by cross section show, and wherein, described cross section corresponds to the angle of the page corresponding with described object.
6. display device as claimed in claim 3, it is characterized in that, described control part makes a part for described object show, and a described part for described object corresponds to the angle of the page corresponding with described object.
7. a control system, comprise the control part of terminal and the described terminal of control, wherein, described terminal comprises: display part, when described terminal is worn, shows electronic publication by display and each self-corresponding image of user's two eyes; And test section, detect the multiple objects page of described publication being carried out to page turn over operation,
It is characterized in that, described control part, according to the testing result of described test section, makes described display part show the page of new display in the page of described publication.
8. a control program, make the display device comprising display part and test section carry out following step, described step comprises:
When display device is worn, by display and each self-corresponding image of user's two eyes, in described display part, show electronic publication;
The object page of described publication being carried out to page turn over operation is detected by described test section; And
According to the testing result of described test section, described display part is made to show the page of new display in the page of described publication.
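Claims 3 and 4 describe rotating the stereoscopically displayed object to follow the angle of the page being turned. A minimal sketch of that rotation, assuming the page spine lies along the y axis; the function name and vertex layout are illustrative assumptions, not taken from the patent:

```python
import math

def rotate_page_vertex(vertex, angle_deg):
    """Rotate a page vertex (x, y, z) about the spine (the y axis)
    by the current page-turn angle, so the displayed object tracks
    the turning motion of the detected operating object."""
    a = math.radians(angle_deg)
    x, y, z = vertex
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))
```

Applying this rotation to every vertex of the page's mesh each frame keeps the three-dimensional object aligned with the page angle inferred from the detected finger positions; claims 5 and 6 instead clip or partially display the object according to the same angle.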
CN201380050117.4A 2012-09-27 2013-09-26 Display device, control system and control method Active CN104662588B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2012-214956 2012-09-27
JP2012214956A JP5841033B2 (en) 2012-09-27 2012-09-27 Display device, control system, and control program
PCT/JP2013/076065 WO2014050967A1 (en) 2012-09-27 2013-09-26 Display device, control system, and control program

Publications (2)

Publication Number Publication Date
CN104662588A true CN104662588A (en) 2015-05-27
CN104662588B CN104662588B (en) 2018-07-06

Family

ID=50388362

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201380050117.4A Active CN104662588B (en) 2012-09-27 2013-09-26 Display device, control system and control method

Country Status (5)

Country Link
US (1) US20150264338A1 (en)
EP (1) EP2905745A4 (en)
JP (1) JP5841033B2 (en)
CN (1) CN104662588B (en)
WO (1) WO2014050967A1 (en)


Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9606992B2 (en) * 2011-09-30 2017-03-28 Microsoft Technology Licensing, Llc Personal audio/visual apparatus providing resource management
WO2016079960A1 (en) * 2014-11-18 2016-05-26 Seiko Epson Corporation Image processing apparatus, control method for image processing apparatus, and computer program
JP6357412B2 (en) * 2014-12-15 2018-07-11 キヤノンマーケティングジャパン株式会社 Information processing apparatus, information processing system, information processing method, and program
WO2017033777A1 (en) 2015-08-27 2017-03-02 株式会社コロプラ Program for controlling head-mounted display system
JP6126667B2 (en) * 2015-11-12 2017-05-10 京セラ株式会社 Display device, control system, and control program
JP6597277B2 (en) * 2015-12-18 2019-10-30 富士通株式会社 Projection apparatus, projection method, and computer program for projection
JP6439953B1 (en) * 2018-03-11 2018-12-19 求 藤川 Determination apparatus and control method of determination apparatus
EP3693834A1 (en) * 2019-02-11 2020-08-12 Siemens Aktiengesellschaft Method and system for viewing virtual elements
US20240069642A1 (en) * 2022-08-31 2024-02-29 Youjean Cho Scissor hand gesture for a collaborative object

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101053010A (en) * 2004-02-05 2007-10-10 电子图书系统有限公司 Method, system, apparatus, and computer program product for controlling and browsing virtual book
US20100091096A1 (en) * 2008-10-10 2010-04-15 Canon Kabushiki Kaisha Image processing apparatus and image processing method
JP2010146481A (en) * 2008-12-22 2010-07-01 Brother Ind Ltd Head-mounted display
CN101923435A (en) * 2010-08-24 2010-12-22 福州瑞芯微电子有限公司 Method for simulating real page turning effect for electronic book
US20110093778A1 (en) * 2009-10-20 2011-04-21 Lg Electronics Inc. Mobile terminal and controlling method thereof

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06282371A (en) * 1993-03-26 1994-10-07 Kodo Eizo Gijutsu Kenkyusho:Kk Virtual space desk top device
US7667703B2 (en) * 2003-12-19 2010-02-23 Palo Alto Research Center Incorporated Systems and method for turning pages in a three-dimensional electronic document
ES1057360Y (en) * 2004-04-07 2004-11-01 Bunamir S L BOOK WITH THREE-DIMENSIONAL REASON.
US7898541B2 (en) * 2004-12-17 2011-03-01 Palo Alto Research Center Incorporated Systems and methods for turning pages in a three-dimensional electronic document
JP2011095547A (en) 2009-10-30 2011-05-12 Sharp Corp Display device
US20110181497A1 (en) * 2010-01-26 2011-07-28 Roni Raviv Object related augmented reality play system
US8964298B2 (en) * 2010-02-28 2015-02-24 Microsoft Corporation Video display modification based on sensor input for a see-through near-to-eye display
KR101651135B1 (en) * 2010-07-12 2016-08-25 엘지전자 주식회사 Mobile terminal and method for controlling the same
WO2012049795A1 (en) * 2010-10-12 2012-04-19 パナソニック株式会社 Display processing device, display method and program
JP2012114816A (en) * 2010-11-26 2012-06-14 Sony Corp Image processing device, image processing method, and image processing program
WO2012147702A1 (en) * 2011-04-28 2012-11-01 シャープ株式会社 Head-mounted display
JP5922349B2 (en) * 2011-07-27 2016-05-24 京セラ株式会社 Display device, control system and control program
JP5756704B2 (en) * 2011-07-27 2015-07-29 京セラ株式会社 Display device and control program
US20130147686A1 (en) * 2011-12-12 2013-06-13 John Clavin Connecting Head Mounted Displays To External Displays And Other Communication Networks
US20130181975A1 (en) * 2012-01-18 2013-07-18 Standard Nine Inc. (dba Inkling) Systems and methods for objects associated with a three-dimensional model


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107329257A (en) * 2016-04-29 2017-11-07 深圳市掌网科技股份有限公司 A kind of full frame driving display methods of virtual implementing helmet and its virtual implementing helmet
CN112463000A (en) * 2020-11-10 2021-03-09 赵鹤茗 Interaction method, device, system, electronic equipment and vehicle
CN112463000B (en) * 2020-11-10 2022-11-08 赵鹤茗 Interaction method, device, system, electronic equipment and vehicle

Also Published As

Publication number Publication date
JP2014071498A (en) 2014-04-21
WO2014050967A1 (en) 2014-04-03
JP5841033B2 (en) 2016-01-06
EP2905745A1 (en) 2015-08-12
EP2905745A4 (en) 2016-04-27
US20150264338A1 (en) 2015-09-17
CN104662588B (en) 2018-07-06

Similar Documents

Publication Publication Date Title
CN104662588A (en) Display device, control system, and control program
US10013793B2 (en) Focus guidance within a three-dimensional interface
US9632314B2 (en) Head mounted display device displaying thumbnail image and method of controlling the same
CN107209570B (en) Dynamic adaptive virtual list
JP5638896B2 (en) Display control program, display control device, display control system, and display control method
US9594399B2 (en) Computer-readable storage medium, display control apparatus, display control method and display control system for controlling displayed virtual objects with symbol images
JP5732218B2 (en) Display control program, display control device, display control system, and display control method
US8648924B2 (en) Computer-readable storage medium having stored thereon image generation program, capturing apparatus, capturing system, and image generation method for generating a combination image on a display of the capturing apparatus
ES2688643T3 (en) Apparatus and augmented reality method
CN104115096A (en) Display apparatus and method of changing screen mode using the same
KR101248736B1 (en) Augmented reality book station based augmented reality system and method, augmented reality processing apparatus for realizing the same
JP5514637B2 (en) Information processing program, information processing apparatus, information processing system, and information processing method
JP5756704B2 (en) Display device and control program
US20130057574A1 (en) Storage medium recorded with program, information processing apparatus, information processing system, and information processing method
JP5572532B2 (en) Display control program, display control device, display control method, and display control system
JP6022709B1 (en) Program, recording medium, content providing apparatus, and control method
US11182950B2 (en) Information processing device and information processing method
JP6126667B2 (en) Display device, control system, and control program
JP4040436B2 (en) Portable device
JP5922349B2 (en) Display device, control system and control program
JP5777332B2 (en) GAME DEVICE, GAME PROGRAM, GAME SYSTEM, AND GAME METHOD
JP7407409B1 (en) Landscape goods
US20170052684A1 (en) Display control apparatus, display control method, and program
JP2016186727A (en) Video display device, video display method, and program
JP2023065528A (en) Head-mounted information processing apparatus and head-mounted display system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant