US20120124509A1 - Information processor, processing method and program - Google Patents

Information processor, processing method and program

Info

Publication number
US20120124509A1
US20120124509A1
Authority
US
United States
Prior art keywords
display
information
section
cursor
display section
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US13/383,511
Other versions
US8751969B2 (en)
Inventor
Kouichi Matsuda
Masaki Fukuchi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION. Assignment of assignors' interest (see document for details). Assignors: FUKUCHI, MASAKI; MATSUDA, KOUICHI
Publication of US20120124509A1 publication Critical patent/US20120124509A1/en
Application granted granted Critical
Publication of US8751969B2 publication Critical patent/US8751969B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G02B27/017: Head-up displays; head mounted
    • G02B2027/014: Head-up displays characterised by optical features comprising information/image processing systems
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T2219/2004: Indexing scheme for editing of 3D models; aligning objects, relative positioning of parts
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; cameras specially adapted for the electronic generation of special effects

Definitions

  • the present invention relates to an information processor, processing method and program.
  • the present invention relates more specifically to an information processor, processing method and program for processing data using a mixed reality (MR) technique that merges real-world real objects and electronic display.
  • processing the data only on the display section i.e., a limited area, involves the following problem.
  • the mouse cursor appearing on the display section of a PC or other device stops at the edge of the screen area of the display section. This makes it impossible to move an object or window in the display area to an area outside the screen area of the display section using the mouse cursor.
  • the former (a) requires not only a cost of adding a display but also a space.
  • the latter (b) requires the user to enter a command or manipulate an icon appearing, for example, on the tray to access an area other than that which is actually displayed on the display section.
  • Patent Document 1 Japanese Patent Laid-Open No. 2008-304268
  • Patent Document 2 Japanese Patent Laid-Open No. 2008-304269
  • a first mode of the present invention is an information processor including a coordinate processing module, camera, three-dimensional information analysis section, second display section and virtual object management section.
  • the coordinate processing module determines whether the position of a cursor, i.e., a position indicator displayed on a first display section, is located in or outside the area of the first display section and outputs cursor position information to the virtual object management section if the cursor is located outside the area of the first display section.
  • the camera captures an image of a real object including the first display section.
  • the three-dimensional information analysis section analyzes the three-dimensional position of the real object included in a camera-captured image.
  • the second display section displays the camera-captured image.
  • the virtual object management section generates a virtual object different from the real object included in the camera-captured image and generates a composite image including the generated virtual object and the real object so as to display the composite image on the second display section.
  • the virtual object management section calculates the three-dimensional position of the cursor based on the cursor position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the cursor is placed at the calculated position as a virtual object.
  • the information processor includes an application execution section adapted to process a specified object specified by the position indicator.
  • the application execution section determines whether the specified object is located in or outside the area of the first display section and outputs object position information to the virtual object management section if the specified object is located outside the area of the first display section.
  • the virtual object management section calculates the three-dimensional position of the object based on the object position information supplied from the application execution section so as to display, on the second display section, a composite image in which the object is placed at the calculated position as a virtual object.
  • the virtual object management section displays, on the second display section, a composite image from which the portion of the object image that overlaps the display area of the first display section has been deleted.
  • the information processor further includes an object information acquisition section.
  • the object information acquisition section acquires image data of a real object specified by the cursor placed as the virtual object and searches data based on the acquired image data so as to acquire object information.
  • the object information acquisition section outputs the acquired object information to the first display section as display data.
  • the object information acquisition section accesses a database in which real object image data and object information are associated with each other or a server so as to acquire object information through a search based on the real object image data.
  • the virtual object management section calculates a plane including the display surface of the first display section based on three-dimensional position information of components making up the first display section included in the camera-captured image and calculates the three-dimensional position of the cursor so that the cursor position is placed on the plane.
  • the cursor is a mouse cursor that moves by mouse operation.
  • the coordinate processing module receives mouse cursor displacement information resulting from the mouse operation and determines whether the mouse cursor is located in or outside the area of the first display section.
  • a second mode of the present invention is an information processing method performed by an information processor.
  • the information processing method includes a coordinate processing step of a coordinate processing module determining whether the position of a cursor, i.e., a position indicator displayed on a first display section, is located in or outside the area of the first display section and outputting cursor position information to a virtual object management section if the cursor is located outside the area of the first display section.
  • the information processing method further includes an image capture step of a camera capturing an image of a real object including the first display section.
  • the information processing method further includes a three-dimensional information analysis step of a three-dimensional information analysis section analyzing the three-dimensional position of the real object included in a camera-captured image.
  • the information processing method further includes a virtual object management step of a virtual object management section generating a virtual object different from the real object included in the camera-captured image and generating a composite image including the generated virtual object and the real object so as to display the composite image on the second display section.
  • the virtual object management step is a step of calculating the three-dimensional position of a cursor based on cursor position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the cursor is placed at the calculated position as a virtual object.
  • a third mode of the present invention is a program causing an information processor to process information.
  • the program includes a coordinate processing step of causing a coordinate processing module to determine whether the position of a cursor, i.e., a position indicator displayed on a first display section, is located in or outside the area of the first display section and output cursor position information to a virtual object management section if the cursor is located outside the area of the first display section.
  • the program further includes an image capture step of causing a camera to capture an image of a real object including the first display section.
  • the program still further includes a three-dimensional information analysis step of causing a three-dimensional information analysis section to analyze the three-dimensional position of the real object included in a camera-captured image.
  • the program still further includes a virtual object management step of causing a virtual object management section to generate a virtual object different from the real object included in the camera-captured image and generate a composite image including the generated virtual object and the real object so as to display the composite image on the second display section.
  • the virtual object management step is a step of causing the virtual object management section to calculate the three-dimensional position of a cursor based on cursor position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the cursor is placed at the calculated position as a virtual object.
  • the program according to the present invention can be supplied, for example, via a recording medium or communication medium adapted to supply, in a computer-readable form, a program to an image processor or computer system adapted to execute a variety of program codes. If such a program is supplied in a computer-readable form, the processes appropriate to the program are implemented in the image processor or computer system.
  • The term "system" in the present specification refers to a logical collection of a plurality of devices; the constituent devices are not necessarily provided in the same enclosure.
  • a cursor or object lying in an area outside the area of the display section of a PC or other device is displayed as a virtual object.
  • the display of goggles worn by the user displays a display device such as a PC and the area outside the display device.
  • the three-dimensional position of the cursor or object that has been moved in response to user operation is calculated, after which the cursor or object is displayed as a virtual object at the calculated position. Further, object information for the object specified by the cursor is acquired and presented.
  • the present configuration makes it possible to constantly observe and verify data that has moved outside the display section, thus providing improved data processing efficiency.
  • FIG. 1 is a set of diagrams illustrating an example of a process performed by an information processor according to the present invention.
  • FIG. 2 is a set of diagrams illustrating an example of a process performed by the information processor according to the present invention.
  • FIG. 3 is a set of diagrams illustrating an example of a process performed by the information processor according to the present invention.
  • FIG. 4 is a diagram describing a configuration example of the information processor according to the present invention.
  • FIG. 5 is a diagram describing an example of display data displayed on a display of goggles worn by the user as a result of a process performed by the information processor according to the present invention.
  • FIG. 6 is a diagram illustrating a flowchart that describes a process sequence performed by the information processor according to the present invention.
  • FIG. 7 is a diagram describing a specific example of a process performed by the information processor according to the present invention.
  • FIG. 8 is a diagram describing an example of display data displayed on the display of the goggles worn by the user as a result of a process performed by the information processor according to the present invention.
  • FIG. 9 is a diagram illustrating a flowchart that describes a process sequence performed by the information processor according to the present invention.
  • FIG. 10 is a diagram describing a specific example of a process performed by the information processor according to the present invention.
  • FIG. 11 is a diagram describing a specific example of a process performed by the information processor according to the present invention.
  • FIG. 12 is a diagram describing a configuration example of the information processor according to the present invention.
  • FIG. 13 is a diagram describing an example of display data displayed on the display of the goggles worn by the user as a result of a process performed by the information processor according to the present invention.
  • FIG. 14 is a diagram illustrating a flowchart that describes a process sequence performed by the information processor according to the present invention.
  • the present invention is designed to effectively use, for data processing, a space area other than the display section (display) of a PC or other device, by means of mixed reality (MR)-based data processing.
  • FIG. 1 is a set of diagrams illustrating an example of a process performed by the information processor according to the present invention.
  • FIG. 1 shows a display section 10 of a PC or other device operated by the user. It should be noted that although a detailed configuration will be described later, the user is operating the PC with goggles on.
  • the goggles have a display adapted to display an image generated by a mixed reality (MR) generator.
  • the goggles have a camera adapted to capture an image of the surrounding environment.
  • the display of the goggles displays a composite image composed of the camera-captured image and a virtual object generated by the mixed reality (MR) generator.
  • the user is preparing a document by displaying the document on the display section 10 as illustrated, for example, in FIG. 1( a ).
  • This process is an ordinary PC operation.
  • the display section 10 illustrated in FIG. 1( a ) displays a mouse cursor 11 a as a position indicator adapted to move in response to the movement of the mouse operated by the user.
  • the user can move the mouse cursor 11 a by operating the mouse.
  • In ordinary PC operation, the mouse cursor moves only within the display area of the display section 10 .
  • In the configuration according to the present invention, however, the movement of the mouse cursor is not limited to the display area of the display section 10 .
  • the mouse cursor can be moved to a space outside the display section 10 as shown in FIG. 1( b ).
  • the mouse cursor 11 b shown in FIG. 1( b ) is a virtual object generated by the mixed reality (MR) generator.
  • the user observes the mouse cursor 11 b which is a virtual object displayed on the display of the goggles worn by the user.
  • the configuration according to the present invention allows for the mouse cursor 11 to be moved at will inside or outside the display section 10 .
  • FIG. 2 is also a set of diagrams illustrating an example of a process performed by the information processor according to the present invention.
  • FIG. 2 also shows the display section 10 of a PC or other device operated by the user.
  • the user is wearing goggles having a display adapted to display an image generated by a mixed reality (MR) generator.
  • FIGS. 2( a ) and 2( b ) show images appearing on the display of the goggles worn by the user for observation.
  • the display section 10 shown in FIG. 2( a ) displays the mouse cursor 11 a and an object 21 a specified by the mouse cursor 11 a .
  • the object 21 a is an object displayed on the display section 10 as a result of the execution of a clock display application in the PC.
  • the user moves the mouse cursor 11 a onto the object 21 a by operating the mouse, specifies the object by operating the mouse and further moves the mouse cursor 11 a along a movement line 22 shown in FIG. 2( a ).
  • the object 21 moved in this manner appears as an object 21 b shown in FIG. 2( b ).
  • the object 21 b shown in FIG. 2( b ) is a virtual object generated by the mixed reality (MR) generator.
  • the user observes the object 21 b displayed on the display of the goggles worn by the user.
  • the configuration according to the present invention allows for not only the mouse cursor but also an object displayed on the display section 10 to be moved at will.
  • FIG. 3 is also a set of diagrams illustrating an example of a process performed by the information processor according to the present invention.
  • FIG. 3 also shows the display section 10 of a PC or other device operated by the user.
  • the user is wearing goggles having a display adapted to display an image generated by a mixed reality (MR) generator.
  • FIGS. 3( a ) and 3( b ) show images appearing on the display of the goggles worn by the user for observation.
  • FIG. 3( a ) shows the mouse cursor 11 a placed outside the display section 10 by the operation described earlier with reference to FIG. 1 and a real object 31 a specified by the mouse cursor 11 a .
  • the object 31 a is a real object actually existing in a space.
  • the object 31 a is a photograph of a CD jacket, i.e., a disk that stores music data.
  • the user places the mouse cursor 11 a on the object 31 a by mouse operation and specifies the object by mouse operation.
  • Information about the specified object, i.e., object information, is acquired, and the acquired object information is displayed on the display section 10 .
  • The object image 31 b and object information 31 c shown in FIG. 3( b ) are examples of such object information.
  • the configuration according to the present invention makes it possible to specify a variety of real objects in a real space with a mouse cursor, i.e., a virtual object, acquire information related to the specified object, load the acquired data into the information processor such as a PC for processing and display the acquired data on the display section 10 as a result of the execution of an application in the PC.
  • embodiment 1 is a configuration example in which the mouse cursor is moved to a space outside the display section 10 as illustrated in FIG. 1( b ) by moving the mouse cursor 11 a along the movement line 12 shown in FIG. 1( a ) as a result of mouse operation by the user.
  • FIG. 4 is a diagram illustrating the configuration of the information processor according to an embodiment of the present invention adapted to perform the above process.
  • a user 100 processes a variety of data by operating a PC (personal computer) 120 .
  • the PC 120 includes a mouse driver 121 , mouse coordinate processing module 122 , GUI section 123 , communication section 124 , application execution section 125 , control section 126 , memory 127 and display section 128 as illustrated in FIG. 4 .
  • the PC 120 further includes a mouse 129 illustrated at the top in FIG. 4 .
  • the mouse driver 121 receives position information and operation information from the mouse 129 as input information.
  • the mouse coordinate processing module 122 determines the display position of the mouse cursor according to the position information of the mouse 129 received via the mouse driver 121 . It should be noted that the display position of the mouse cursor is not limited to the display area of the display section 128 in the configuration according to the present invention.
  • the GUI section 123 is a user interface adapted, for example, to process information received from the user and output information to the user.
  • the communication section 124 communicates with a mixed reality (MR) generator 130 .
  • the application execution section 125 executes an application appropriate to data processing performed by the PC 120 .
  • the control section 126 exercises control over the processes performed by the PC 120 .
  • the memory 127 includes RAM, ROM and other storage devices adapted to store, for example, programs and data processing parameters.
  • the display section 128 is a display section which includes, for example, an LCD.
  • the user 100 wears goggles 141 having a display adapted to display virtual objects.
  • the goggles have a camera 142 adapted to capture an image of the surrounding environment.
  • the goggles 141 and camera 142 are connected to the mixed reality (MR) generator 130 .
  • the user 100 performs his or her tasks while observing an image appearing on the display provided on the goggles 141 .
  • the display of the goggles 141 displays a real-world image, i.e., an image captured by the camera 142 .
  • the display of the goggles 141 further displays a virtual object, generated by the mixed reality (MR) generator 130 , together with the real-world image.
  • The user 100 is operating the PC (personal computer) 120 , and the camera 142 is capturing an image of the PC (personal computer) 120 operated by the user 100 . Therefore, the display of the goggles 141 displays, as a real-world image, an image including, for example, the display (display section 128 ) of the PC (personal computer) 120 operated by the user 100 and a variety of real objects around the display of the PC 120 . Further, a virtual object, generated by the mixed reality (MR) generator 130 , appears superimposed on the real-world image. The orientation of the camera 142 is changed according to the movement of the user 100 .
  • display data 150 as illustrated, for example, in FIG. 5 appears on the display of the goggles 141 worn by the user 100 .
  • the display data 150 illustrated in FIG. 5 is a composite image including real and virtual objects.
  • the mixed reality (MR) generator 130 includes a three-dimensional information analysis section 131 , virtual object management module 132 , memory 133 and communication section 134 as illustrated in FIG. 4 .
  • the three-dimensional information analysis section 131 receives an image captured by the camera 142 worn by the user and analyzes the three-dimensional positions of the objects included in the captured image. This three-dimensional position analysis is performed, for example, using SLAM (simultaneous localization and mapping).
  • SLAM is designed to select feature points from a variety of real objects included in the camera-captured image and detect the positions of the selected feature points and the position and posture of the camera.
  • Patent Document 1 Japanese Patent Laid-Open No. 2008-304268
  • Patent Document 2 Japanese Patent Laid-Open No. 2008-304269
  • the three-dimensional information analysis section 131 calculates the three-dimensional positions of the real objects included in the image captured by the camera 142 worn by the user using, for example, SLAM described above. It should be noted, however, that the three-dimensional information analysis section 131 may find the three-dimensional positions of the real objects included in the camera-captured image by using a method other than SLAM described above.
  • the virtual object management module 132 manages the virtual objects appearing on the display of the goggles 141 worn by the user.
  • the virtual objects are data stored in the memory 133 . More specifically, the display of the goggles 141 worn by the user displays, for example, the display data 150 illustrated in FIG. 5 .
  • a PC image 151 included in the display data 150 is a real image (real object) captured by the camera 142 .
  • When the mouse cursor 152 a appearing in the PC image 151 shown in FIG. 5 moves outside the PC image 151 , a mouse cursor 152 b is displayed as a virtual object.
  • the user 100 shown in FIG. 4 can observe a composite image including, for example, the real and virtual objects shown in FIG. 5 on the display of the goggles 141 .
  • the PC image 151 shown in FIG. 5 is a real object captured by the camera 142 .
  • the mouse cursor 152 a in the PC image 151 is also information, i.e., a real object, actually appearing in the PC image 151 . It should be noted that an object that exists in a real world captured by the camera 142 and whose image can be captured by the camera is described here as a real object.
  • the mouse cursor 152 b outside the PC image 151 shown in FIG. 5 is not a real-world object (real object).
  • the mouse cursor 152 b is a virtual object generated by the mixed reality (MR) generator 130 .
  • the mouse cursor 152 b is an object that does not exist in a real world but appears on the display of the goggles 141 worn by the user.
  • The process steps from step S 101 to step S 105 in the flowchart shown in FIG. 6 are performed by the PC 120 shown in FIG. 4 , and the process steps from step S 106 to step S 109 are performed by the mixed reality (MR) generator 130 shown in FIG. 4 .
  • In step S 101 , the mouse coordinate processing module 122 of the PC 120 receives mouse displacement (dX, dY) information from the mouse driver 121 .
  • In step S 102 , the mouse coordinate processing module 122 calculates an updated mouse cursor position (XQ, YQ) from the previous mouse cursor position (XP, YP) and the mouse displacement (dX, dY) information.
  • In step S 103 , the mouse coordinate processing module 122 determines whether or not the updated mouse cursor position (XQ, YQ) is outside the display section. The process proceeds to step S 104 when the updated mouse cursor position (XQ, YQ) is inside the area of the display section. In this case, the PC updates the display of the mouse cursor as is normally done. The process proceeds to step S 105 if the updated mouse cursor position (XQ, YQ) is outside the area of the display section.
  • In step S 105 , the mouse cursor position information (XQ, YQ) stored in the memory is transferred to the mixed reality (MR) generator 130 via the communication section.
  • It should be noted that the position information transferred from the PC 120 to the mixed reality (MR) generator 130 is only the position information of the mouse cursor, and that the mixed reality (MR) generator 130 acknowledges in advance that the transferred position information is that of the mouse cursor.
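  • The PC-side steps S 101 to S 105 amount to updating the cursor coordinates and testing them against the display area. The following is a minimal sketch of that logic; the class and method names (MouseCoordinateProcessor, send_to_mr_generator) are illustrative assumptions and do not appear in the patent.

```python
# Sketch of the PC-side cursor handling (steps S101-S105), assuming a display
# area of display_width x display_height units with its origin at the top-left
# corner. Names are illustrative, not taken from the patent.

class MouseCoordinateProcessor:
    def __init__(self, display_width, display_height, comm):
        self.w = display_width
        self.h = display_height
        self.comm = comm          # communication section to the MR generator
        self.x, self.y = 0, 0     # previous cursor position (XP, YP)

    def on_mouse_moved(self, dx, dy):
        # S101/S102: apply the displacement reported by the mouse driver.
        xq, yq = self.x + dx, self.y + dy
        self.x, self.y = xq, yq

        # S103: is the updated position (XQ, YQ) inside the display area?
        inside = 0 <= xq < self.w and 0 <= yq < self.h
        if inside:
            self.draw_cursor(xq, yq)   # S104: normal cursor update on the PC display
        else:
            # S105: hand the position over so the cursor can be drawn
            # as a virtual object outside the physical display.
            self.comm.send_to_mr_generator({"cursor": (xq, yq)})

    def draw_cursor(self, x, y):
        pass  # delegated to the GUI section in a real implementation
```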
  • The process steps from step S 106 onward are performed by the mixed reality (MR) generator 130 .
  • In step S 106 , the mixed reality (MR) generator 130 stores the mouse cursor position information (XQ, YQ) transferred from the PC 120 in its memory 133 . If the mixed reality (MR) generator 130 receives non-display data (mouse cursor drawing data) or its identifier from the PC 120 , the received data is also stored in the memory 133 of the mixed reality (MR) generator 130 .
  • In step S 107 , the virtual object management module 132 of the mixed reality (MR) generator 130 acquires the data stored in the memory 133 , i.e., the non-display data (mouse cursor drawing data) and position information (XQ, YQ).
  • In step S 108 , the virtual object management module 132 converts the position information of the non-display data (mouse cursor) into the camera coordinate system for the camera-captured image acquired from the three-dimensional information analysis section 131 .
  • the three-dimensional information analysis section 131 has already acquired three-dimensional position information of markers 201 a to 201 d at the four corners of a display section 200 of the PC included in the camera-captured image. As illustrated in FIG. 7 , the following pieces of position information have been acquired:
  • Marker 201 a (xa, ya, za)
  • Marker 201 b (xb, yb, zb)
  • Marker 201 c (xc, yc, zc)
  • Marker 201 d (xd, yd, zd)
  • the mouse cursor position information (XQ, YQ) received from the PC 120 is in the PC display section plane coordinate system.
  • The virtual object management module 132 calculates a plane of the display section in the camera coordinate system based on the three-dimensional position information of the markers 201 a to 201 d , and determines the position where the non-display data (mouse cursor drawing data) acquired from the PC 120 is to be placed on the calculated plane.
  • the position information (XQ, YQ) represented in the display section plane coordinate system (X, Y) acquired from the PC 120 is converted, thus calculating the display position (xq, yq, zq) of a mouse cursor 211 q in the camera coordinate system (x, y, z).
  • the display position (xq, yq, zq) of the mouse cursor 211 q is set on the plane of the display surface formed by the markers 201 a to 201 d at the four corners of the display section 200 shown in FIG. 7 .
  • the display surface formed by the markers 201 a to 201 d at the four corners of the display section 200 is found.
  • This display surface can be defined by using arbitrary three of the four coordinates of the markers 201 a to 201 d at the four corners of the display section 200 .
  • the display surface can be defined by using the coordinates of the following three points:
  • Marker 201 a (xa, ya, za)
  • Marker 201 b (xb, yb, zb)
  • Marker 201 c (xc, yc, zc)
  • An xyz plane (a plane in the camera coordinate system (x, y, z)) passing through the display surface can be expressed, by using the coordinates of the above three points, as Equation 1 shown below.
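  • Equation 1 itself is not reproduced in this text extraction. A standard way to express the plane through the three marker points, which is presumably what Equation 1 states, is the determinant form:

```latex
\begin{vmatrix}
x - x_a & y - y_a & z - z_a \\
x_b - x_a & y_b - y_a & z_b - z_a \\
x_c - x_a & y_c - y_a & z_c - z_a
\end{vmatrix} = 0
```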
  • the virtual object management module 132 converts the position information (XQ, YQ) represented in the display section plane coordinate system (X, Y) acquired from the PC 120 into position coordinates (xq, yq, zq) on the xyz plane in the camera coordinate system (x, y, z).
  • the coordinates to be found are a coordinate position (xq, yq, zq) in the camera coordinate system (x, y, z) of the mouse cursor 211 q shown in FIG. 7 .
  • The three-dimensional positions of the markers in the camera coordinate system are Marker 201 a (xa, ya, za), Marker 201 b (xb, yb, zb) and Marker 201 c (xc, yc, zc). The positional relationship between the coordinates of these markers in the display section plane coordinate system (X, Y), for example Marker 201 a at the origin (0, 0) and Marker 201 b at (XB, 0), and the position of the mouse cursor 211 q (XQ, YQ) is the same as that between the corresponding coordinates in the camera coordinate system (x, y, z), namely Marker 201 a (xa, ya, za), Marker 201 b (xb, yb, zb), Marker 201 c (xc, yc, zc) and the cursor position (xq, yq, zq) to be found.
  • In step S 108 shown in the flow of FIG. 6 , the virtual object management module 132 converts the position information (XQ, YQ) of the non-display data acquired from the memory 133 into the position (xq, yq, zq) in the camera coordinate system for the camera-captured image acquired from the three-dimensional information analysis section 131 as described above.
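  • A minimal sketch of this plane-coordinate to camera-coordinate conversion is given below. It assumes that marker 201 a sits at the plane-coordinate origin (0, 0), marker 201 b at (XB, 0) and marker 201 c at (0, YC); the exact corner assignment and the function name are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def plane_to_camera(XQ, YQ, XB, YC, pa, pb, pc):
    """Map a point (XQ, YQ) in the display section plane coordinate system
    (X, Y) to the camera coordinate system (x, y, z).

    pa, pb, pc are the camera-coordinate positions of markers 201a, 201b and
    201c, assumed to lie at plane coordinates (0, 0), (XB, 0) and (0, YC)."""
    pa, pb, pc = map(np.asarray, (pa, pb, pc))
    u = (pb - pa) / XB            # camera-space direction of the display's X edge
    v = (pc - pa) / YC            # camera-space direction of the display's Y edge
    # The result lies on the display plane even when XQ or YQ falls outside
    # the physical display, which is exactly the out-of-screen cursor case.
    return pa + XQ * u + YQ * v   # (xq, yq, zq)

# Example: a cursor 200 plane units to the right of a 1920 x 1080 display.
xq, yq, zq = plane_to_camera(2120, 500, 1920, 1080,
                             pa=(0.0, 0.0, 1.0),
                             pb=(0.5, 0.0, 1.0),
                             pc=(0.0, -0.3, 1.0))
```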
  • In step S 109 , the virtual object management module 132 displays the mouse cursor at the generated coordinate position (xq, yq, zq) in the camera coordinate system.
  • That is, the non-display data (mouse cursor drawing data) transferred from the PC 120 and stored in the memory 133 is displayed at the generated coordinate position (xq, yq, zq) in the camera coordinate system.
  • the display data 150 shown in FIG. 5 appears on the display of the goggles 141 worn by the user 100 .
  • the display data 150 shown in FIG. 5 is a composite image showing the PC image 151 as a real object together with the mouse cursor 152 b as a virtual object.
  • the virtual object management module 132 sets the display position of the virtual object in the space outside the PC display section as illustrated in FIG. 5 . This display process allows the user to move the mouse cursor outside of the PC display section rather than only inside the PC display section, thus making it possible to use a larger work area for data processing.
  • the process described with reference to the flowchart shown in FIG. 6 is performed each time the mouse cursor position changes as a result of the user operating the mouse 129 of the PC 120 .
  • the mouse coordinate processing module 122 transmits updated data to the mixed reality (MR) generator 130 each time the mouse cursor position changes.
  • the mixed reality (MR) generator 130 changes the display position of the virtual object (mouse cursor) based on the updated data as a realtime process.
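  • On the mixed reality (MR) generator side, this realtime update can be pictured as a small event handler that re-runs steps S 106 to S 109 for every reported cursor position. The sketch below is an assumption about how such a handler could be organized; the class, message and method names are illustrative and not from the patent, and plane_to_camera is the helper from the earlier sketch.

```python
# Hypothetical MR-generator-side handler: every time the PC reports a new
# cursor position, the cursor virtual object is re-placed on the display plane.
class VirtualObjectManager:
    def __init__(self, memory, analyzer, renderer):
        self.memory = memory        # plays the role of memory 133
        self.analyzer = analyzer    # three-dimensional information analysis section 131
        self.renderer = renderer    # draws virtual objects on the goggles' display

    def on_cursor_update(self, message):
        # Step S106: store the received plane coordinates (XQ, YQ).
        XQ, YQ = message["cursor"]
        self.memory["cursor_plane_pos"] = (XQ, YQ)

        # Steps S107-S109: convert to camera coordinates and redraw the cursor.
        # display_geometry() is assumed to return the display size in plane
        # units and the camera-space marker positions (see the earlier sketch).
        XB, YC, pa, pb, pc = self.analyzer.display_geometry()
        xq, yq, zq = plane_to_camera(XQ, YQ, XB, YC, pa, pb, pc)
        self.renderer.draw_virtual_cursor((xq, yq, zq))
```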
  • embodiment 2 is a configuration example in which the object 21 is moved to a space outside the display section 10 as illustrated in FIG. 2( b ) by specifying the object 21 and moving the mouse cursor 11 a along the movement line 22 shown in FIGS. 2( a ) and 2 ( b ) as a result of mouse operation by the user.
  • the present embodiment is performed by the devices configured as shown in FIG. 4 as with the first embodiment.
  • The user 100 is operating the PC (personal computer) 120 , and the camera 142 is capturing an image of the PC (personal computer) 120 operated by the user 100 . Therefore, the display of the goggles 141 worn by the user 100 displays, as a real-world image, an image including, for example, the display (display section 128 ) of the PC (personal computer) 120 operated by the user 100 and a variety of real objects around the display of the PC 120 . Further, a virtual object, generated by the mixed reality (MR) generator 130 , appears superimposed on the real-world image. The orientation of the camera 142 is changed according to the movement of the user 100 .
  • display data 250 as illustrated, for example, in FIG. 8 appears on the display of the goggles 141 worn by the user 100 .
  • the display data 250 illustrated in FIG. 8 is a composite image including real and virtual objects.
  • a PC image 251 included in the display data 250 is a real image (real object) captured by the camera 142 .
  • FIG. 8 shows the process in which the user moves the mouse 129 of the PC 120 shown in FIG. 4 . If an object 252 a appearing in the PC image 251 shown in FIG. 8 is moved outside of the PC image 251 after having been specified by a mouse cursor 271 a , an object 252 and mouse cursor 271 move together. If the object 252 and mouse cursor 271 continue to move, an object 252 b and mouse cursor 271 b are displayed outside the PC image 251 as virtual objects.
  • the user 100 shown in FIG. 4 can observe a composite image including, for example, the real and virtual objects shown in FIG. 8 on the display of the goggles 141 .
  • the PC image 251 shown in FIG. 8 is a real object captured by the camera 142 .
  • Both the object 252 a in the PC image 251 and the mouse cursor 271 a are information actually displayed in the PC image 251 , i.e., real objects.
  • the object 252 b and mouse cursor 271 b outside the PC image 251 shown in FIG. 8 are not real-world objects (real objects).
  • the object 252 b and mouse cursor 271 b are virtual objects generated by the mixed reality (MR) generator 130 .
  • the object 252 b and mouse cursor 271 b are objects that do not exist in a real world but appear on the display of the goggles 141 worn by the user.
  • the mouse cursor 271 b is displayed as a virtual object in the present embodiment as in the first embodiment.
  • the sequence adapted to display the mouse cursor 271 b is performed in the same manner as the sequence described with reference to FIG. 6 .
  • the process adapted to display the object 252 specified by the mouse is further added.
  • the flowchart shown in FIG. 9 is a flow that describes only the sequence adapted to display this mouse-specified object. That is, if the display data 250 shown in FIG. 8 is generated and displayed, two processes, one according to the flow shown in FIG. 6 and another according to the flow shown in FIG. 9 , are performed together.
  • The process steps from step S 201 to step S 204 in the flowchart shown in FIG. 9 are performed by the PC 120 shown in FIG. 4 , and the process steps from step S 205 to step S 208 are performed by the mixed reality (MR) generator 130 shown in FIG. 4 .
  • In step S 201 , information about the object specified by the mouse 129 of the PC 120 is stored in the memory 127 of the PC 120 .
  • object information stored in the memory 127 includes drawing data and position information of the object.
  • Position information is, for example, the coordinates of the center position serving as a reference of the object or a plurality of pieces of position information defining the outline.
  • coordinate information of each of four apexes P, Q, R and S is stored in the memory as elements making up object information.
  • position information need only be that which allows an object to be drawn at a specific position. Therefore, coordinate information of only one point, i.e., P, of all the four apexes P, Q, R and S, may be stored in the memory.
  • Object drawing data representing the shape of the object is also stored in the memory. Therefore, even if coordinate information of only one point, i.e., P, is stored in the memory as position information, it is possible to draw (display) the object using P as a starting point.
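  • In other words, the PC memory only needs to hold the drawing data together with enough position information to anchor it. A minimal sketch of such a record is shown below; the dataclass and field names are illustrative assumptions, not structures defined in the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float]   # a position in the display section plane coordinate system

@dataclass
class MouseSpecifiedObject:
    """Object information stored in step S201: drawing data plus position data."""
    drawing_data: bytes        # bitmap or vector data describing the object's shape
    apexes: List[Point]        # plane coordinates of the apexes P, Q, R and S

    @property
    def origin(self) -> Point:
        # Because the drawing data fixes the shape, apex P alone is enough
        # to draw (display) the object at a specific position.
        return self.apexes[0]
```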
  • In step S 202 , it is determined whether or not an out-of-display-section area has been produced in the mouse-specified object area as a result of the movement of the mouse 129 of the PC 120 by user operation.
  • The application execution section 125 of the PC 120 makes this determination based on the new mouse cursor position and object shape acquired from the mouse coordinate processing module 122 .
  • If the determination in step S 202 is No, that is, if no out-of-display-section area has been produced in the mouse-specified object area, the process proceeds to step S 203 , where the application execution section 125 of the PC 120 displays the mouse-specified object in the display section.
  • The determination in step S 202 is Yes if, for example, the object is moved by user operation to the position of an object 301 b shown in FIG. 10 or to the position of an object 301 c shown in FIG. 11 ; in this case, the process proceeds to step S 204 .
  • FIGS. 10 and 11 illustrate examples in which the objects 301 b and 301 c appear at least partly as virtual objects on the display of the goggles worn by the user.
  • In step S 204 , the data (non-display data (object drawing data)) and position information stored in the memory are transmitted to the mixed reality (MR) generator 130 .
  • In this example, the drawing data of the clock, i.e., the object 301 b , and the coordinate data of each of the four apexes P, Q, R and S of the object 301 b are acquired from the memory 127 of the PC 120 and transmitted to the mixed reality (MR) generator 130 .
  • That is, coordinate information of the following four apexes is transferred: P (XP, YP), Q (XQ, YQ), R (XR, YR) and S (XS, YS).
  • The process steps from step S 205 onward are performed by the mixed reality (MR) generator 130 .
  • In step S 205 , the mixed reality (MR) generator 130 stores the data received from the PC 120 , i.e., the non-display data (object drawing data) and position information (P, Q, R and S coordinate information), in the memory 133 of the mixed reality (MR) generator 130 .
  • In step S 206 , the virtual object management module 132 of the mixed reality (MR) generator 130 acquires the data stored in the memory 133 , i.e., the non-display data (object drawing data) and position information (P, Q, R and S coordinate information).
  • In step S 207 , the virtual object management module 132 converts the position information of points P, Q, R and S acquired from the memory 133 into positions in the camera coordinate system for the camera-captured image acquired from the three-dimensional information analysis section 131 .
  • This coordinate conversion is performed in the same manner as the coordinate conversion of the mouse cursor described in the first embodiment. Therefore, the detailed description thereof is omitted.
  • That is, in step S 207 shown in the flow of FIG. 9 , the virtual object management module 132 converts the position information of the non-display data acquired from the memory 133 into positions (xq, yq, zq) in the camera coordinate system for the camera-captured image acquired from the three-dimensional information analysis section 131 .
  • In step S 208 , the virtual object management module 132 acquires the non-display data (object drawing data) contained in the data stored in the memory 133 , drawing or displaying the objects at the generated coordinate positions in the camera coordinate system as illustrated in FIG. 10 .
  • the display data 250 shown in FIG. 8 appears on the display of the goggles 141 worn by the user 100 .
  • the display data 250 shown in FIG. 8 is a composite image showing the PC image 251 as a real object together with the object 252 b and mouse cursor 271 b as virtual objects.
  • the virtual object management module 132 sets the display positions of the virtual objects in the space outside the PC display section as illustrated in FIG. 8 .
  • This display process allows the user to display a variety of objects in a space outside the PC display section rather than only inside the PC display section, thus making it possible to use a larger work area for data processing.
  • the application execution section 125 transmits updated data to the mixed reality (MR) generator 130 each time the mouse cursor position changes.
  • the mixed reality (MR) generator 130 changes the display position of the virtual object (clock) based on the updated data as a realtime process.
  • The case in which the determination in step S 202 in the flowchart shown in FIG. 9 is Yes also occurs, for example, at the position of the object 301 c shown in FIG. 11 . That is, the determination in step S 202 is Yes if even only part of the mouse-specified object is located outside the display area of the PC display section.
  • the position information of points P, Q, R and S shown in FIG. 11 is also transferred from the PC 120 to the mixed reality (MR) generator 130 as the position information of the object 301 c .
  • the mixed reality (MR) generator 130 need only display the object 301 c in such a manner that part of the object 301 c appears superimposed on the display section of the PC.
  • the virtual object management module 132 of the mixed reality (MR) generator 130 generates virtual object display data, made up only of the data of the portion enclosed by points U, Q, R and V as shown in FIG. 11 , and displays this display data during the virtual object display process. That is, the data of the portion enclosed by points P, U, V and S of the object drawing data received from the PC is made to appear transparent.
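  • A minimal sketch of this clipping step is shown below. It assumes an axis-aligned object rectangle, expressed in display section plane coordinates, that sticks out past the right edge of the display as in FIG. 11 ; the function name and argument layout are illustrative assumptions.

```python
def visible_virtual_part(p, q, r, s, display_width):
    """Return the apexes (U, Q, R, V) of the portion of the object lying
    outside the display area, or None if the object is entirely inside.

    p, q, r, s: (X, Y) apexes of the object rectangle; p and s are the
    left-hand apexes, q and r the right-hand ones. Only the right display
    edge is handled here; the other edges work analogously."""
    (px, py), (qx, qy), (rx, ry), (sx, sy) = p, q, r, s
    if qx <= display_width:
        return None                       # nothing sticks out, so no virtual object
    u = (max(px, display_width), py)      # U: where the top edge crosses the display border
    v = (max(sx, display_width), sy)      # V: where the bottom edge crosses the display border
    # The portion enclosed by P, U, V and S overlaps the display and is left
    # transparent; only the portion U, Q, R, V is drawn as a virtual object.
    return u, (qx, qy), (rx, ry), v
```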
  • embodiment 3 is a configuration example in which the object information 31 c is displayed as illustrated in FIG. 3( b ) by specifying an object 31 in a real space outside the PC display section as a result of mouse operation by the user.
  • the present embodiment is performed by the devices configured as shown in FIG. 12 .
  • the PC 120 is configured in the same manner as described earlier with reference to FIG. 4 in embodiment 1.
  • the mixed reality (MR) generator 130 includes the three-dimensional information analysis section 131 , virtual object management module 132 , memory 133 , communication section 134 , an object information acquisition section 135 and object information database 136 .
  • the object information database 136 need not essentially be provided in the mixed reality (MR) generator 130 .
  • the object information database 136 need only be, for example, a network-connectable database that can be accessed via the communication section of the mixed reality (MR) generator 130 .
  • the three-dimensional information analysis section 131 , virtual object management module 132 , memory 133 and communication section 134 are configured in the same manner as described earlier with reference to FIG. 4 in the first embodiment. It should be noted, however, that the communication section 134 communicates with an external server 140 or the object information database 136 via a network.
  • the object information acquisition section 135 acquires a variety of real object images from the image captured by the camera 142 worn by the user 100 and compares the images with the data stored in the object information database 136 , selecting similar images and acquiring object information associated with the selected images.
  • the object information is a variety of information such as song title and genre of the CD, artist and price. These pieces of object information are associated with the object image and stored in the object information database 136 .
  • the server 140 also holds the same information as that stored in the object information database 136 .
  • the mixed reality (MR) generator 130 transmits the image captured by the camera 142 worn by the user 100 or a real object image (e.g., CD jacket image) selected from the captured image to the server via the communication section 134 .
  • the server extracts corresponding object information from the received image, supplying the object information to the mixed reality (MR) generator 130 .
  • the mixed reality (MR) generator 130 acquires object information from the object information database 136 or server 140 and supplies the acquired information to the PC 120 together with the data of the real object image captured by the camera 142 .
  • the PC 120 displays the acquired object information on its display section.
  • display data 450 as shown, for example, in FIG. 13 appears on the display of the goggles 141 worn by the user 100 .
  • a PC image 451 included in the display data 450 is a real image (real object) captured by the camera 142 .
  • An object 471 a outside the PC image 451 is also a real object.
  • a mouse cursor 480 a is a virtual object.
  • An object image 471 b and object information 471 c appearing in the PC image 451 are data displayed on the display section 128 by the application execution section 125 of the PC 120 .
  • the information other than the mouse cursor 480 a is the image appearing on the display of the goggles 141 worn by the user 100 . This image can also be observed by those users not wearing any goggles.
  • the object image 471 b and object information 471 c appearing in the PC image 451 are display data on the display section of the PC 120 which can be observed by anybody.
  • the mouse cursor 480 a is displayed as a virtual object in the present embodiment as in the first and second embodiments.
  • the sequence adapted to display the mouse cursor 480 a is performed in the same manner as the sequence described with reference to FIG. 6 .
  • the flowchart shown in FIG. 14 is a flow that describes only the sequence for this mouse-specified object. That is, if the display data 450 shown in FIG. 13 is generated and displayed, two processes, one according to the flow shown in FIG. 6 and another according to the flow shown in FIG. 14 , are performed together.
  • the process step in step S 301 in the flowchart shown in FIG. 14 is performed by both the PC 120 and mixed reality (MR) generator 130 shown in FIG. 12 .
  • the process steps from step S 302 to step S 309 are performed by the mixed reality (MR) generator 130 shown in FIG. 12 .
  • the process step in step S 310 is performed by the PC 120 shown in FIG. 12 .
  • Prior to the process step in step S 301 , the process according to the flow shown in FIG. 6 described in the first embodiment is performed, thus placing the mouse cursor in an out-of-display-section area.
  • For example, the mouse cursor is located at the position of the mouse cursor 480 a shown in FIG. 13 .
  • In step S 301 , it is determined whether or not a real object has been specified by mouse operation.
  • If a real object has been specified, the process proceeds to step S 302 . If a real object has not been specified, the process is terminated.
  • the following process is performed when a real object has been specified.
  • The mouse clicking information is supplied to the application execution section 125 via the mouse driver 121 of the PC 120 , and the application execution section 125 notifies the mouse operation (clicking) information to the mixed reality (MR) generator 130 via the communication section 124 .
  • the mixed reality (MR) generator 130 receives the mouse operation information via the communication section 134 and notifies the same information to the virtual object management module 132 .
  • In step S 302 , the virtual object management module 132 determines whether or not an out-of-PC-display-section area is included in the object area of the specified real object and located in the imaging range of the camera.
  • Here, the camera is the camera 142 worn by the user 100 . If the determination in step S 302 is No, the process is terminated. When the determination in step S 302 is Yes, the process proceeds to step S 303 .
  • In step S 303 , an image including the mouse-specified object is captured by the camera 142 worn by the user 100 , and the captured image is stored in the memory. This process is performed under control of the virtual object management module 132 .
  • The process steps from step S 304 to S 306 are designed to acquire object information from the object information database 136 .
  • Those from step S 307 to S 308 are designed to acquire object information from the server 140 . Either of these processes may be performed. Alternatively, both of them may be performed.
  • A description will be given first of the process steps from step S 304 to S 306 adapted to acquire object information from the object information database 136 .
  • In step S 304 , the object information database (DB) 136 is searched using the mouse-specified object image stored in the memory as a search key. This process is performed by the object information acquisition section 135 .
  • Image data of a variety of real objects and the object information associated with that image data are registered in the object information database (DB) 136 .
  • Examples of such object information are photographs of CD jackets and the song titles and prices of the CDs.
  • In step S 305 , the object information acquisition section 135 searches the object information database (DB) 136 . That is, the same section 135 determines whether or not any image data registered in the object information database (DB) 136 matches or is similar to the mouse-specified object image. The process is terminated if no matching or similar registered image is extracted. The process proceeds to step S 306 when a matching or similar registered image is extracted.
  • In step S 306 , the object information acquisition section 135 acquires, from the object information database (DB) 136 , the registered data for the registered image matching or similar to the mouse-specified object image, i.e., the object image and object information.
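  • A minimal sketch of this lookup (steps S 304 to S 306 ) is given below. The similarity measure used by the object information acquisition section is not specified in this excerpt, so a caller-supplied similarity function and threshold are assumed for illustration.

```python
def lookup_object_information(query_image, database, similarity, threshold=0.8):
    """Search the object information database with the mouse-specified object
    image as the key and return the best matching or similar entry.

    database: iterable of (registered_image, object_information) pairs, e.g.
    CD jacket photographs paired with song titles and prices.
    similarity: function returning a score in [0, 1] for two images.
    Returns (registered_image, object_information), or None if nothing matches
    or is sufficiently similar (the process is then terminated)."""
    best_entry, best_score = None, threshold
    for registered_image, object_information in database:
        score = similarity(query_image, registered_image)
        if score >= best_score:
            best_entry, best_score = (registered_image, object_information), score
    return best_entry
```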
  • In step S 307 , the object information acquisition section 135 transmits the mouse-specified object image stored in the memory to the server 140 via the communication section 134 .
  • In step S 308 , the object information acquisition section 135 acquires, from the server 140 , the object image and object information selected based on the information registered in the server.
  • The server 140 performs the same process as the object information acquisition section 135 , searching the database of the server 140 using the mouse-specified object image as a search key and extracting the object information. It should be noted that an error message is returned if the object information cannot be extracted.
  • In step S 309 , the mixed reality (MR) generator 130 transmits, to the PC 120 , the object information and object image data acquired from the server or database.
  • the object image data may be that acquired from the server or database or the image captured by the camera 142 .
  • The process step in the final step, S310, is performed by the PC 120.
  • In step S310, the data acquired from the mixed reality (MR) generator 130 is displayed on the PC display section by the application running in the PC.
  • As a result, the display data 450 shown in FIG. 13 appears on the display of the goggles 141 worn by the user 100.
  • The object image 471 b and object information 471 c appearing in the PC image 451 are the data displayed on the display section 128 by the application execution section 125 of the PC 120. Therefore, of the display data 450 shown in FIG. 13, everything other than the mouse cursor 480 a is information that can also be observed by users not wearing any goggles.
  • It should be noted that the series of processes described in the specification may be performed by hardware, by software, or by a combination of both. If the series of processes is performed by software, the program containing the process sequence is installed for execution either into the memory of a computer incorporated in dedicated hardware or into a general-purpose personal computer capable of performing various processes.
  • For example, the program can be stored on a recording medium in advance. Alternatively, the program can be received via a network such as a LAN (Local Area Network) or the Internet and installed onto a recording medium such as a built-in hard disk.
  • It should be noted that the term “system” in the present specification refers to a logical collection of a plurality of devices, and that the constituent devices are not necessarily provided in the same enclosure.
  • In a configuration according to an embodiment of the present invention, a cursor or object lying in an area outside the area of the display section of a PC or other device is displayed as a virtual object.
  • For example, the display of goggles worn by the user displays a display device such as a PC and the area outside the display device.
  • The three-dimensional position of the cursor or object that is presumed to have moved in response to user operation is calculated, after which the cursor or object is displayed as a virtual object at the calculated position. Further, object information for the object specified by the cursor is acquired and presented.
  • The present configuration makes it possible to constantly observe and verify data that has moved outside the display section, thus providing improved data processing efficiency.

Abstract

The present invention provides a configuration that allows for a cursor or other object that has moved outside the display section to be displayed as a virtual object and observed. A cursor or object lying in an area outside the area of the display section of a PC or other device is displayed as a virtual object. For example, the display of goggles worn by the user displays a display device such as a PC and the area outside the display device. The three-dimensional position of the cursor or object that is presumed to have moved in response to user operation is calculated, after which the cursor or object is displayed as a virtual object at the calculated position. Further, object information for the object specified by the cursor is acquired and presented.

Description

    TECHNICAL FIELD
  • The present invention relates to an information processor, processing method and program. The present invention relates more specifically to an information processor, processing method and program for processing data using a mixed reality (MR) technique that merges real objects in the real world with electronic display data.
  • BACKGROUND ART
  • For example, if the user processes data using a PC (Personal Computer), he or she processes the display data on the display section (display) of the PC. However, processing the data only on the display section, i.e., a limited area, involves the following problems.
  • (1) When operated with a mouse, the mouse cursor appearing on the display section of a PC or other device stops at the edge of the screen area of the display section. This makes it impossible to move an object or window in the display area to an area outside the screen area of the display section using the mouse cursor.
  • (2) If windows and other objects used by the user fill up the screen area of the display section of a PC or other device, many windows are displayed to overlap each other. In order to view an underlying window, the user must, for example, select the underlying window to display it on top or change the overlapping windows into icons. However, the former process makes it impossible to view the windows that have been moved behind other windows. On the other hand, the latter process makes it impossible to view the details of the iconized windows.
  • The problems described above exist with conventional data processing. It should be noted that the following processes can be performed to use an area larger than the size of the display section of a PC or other device.
  • (a) Have ready a new physical display and connect the display to the computer operated by the user so that a plurality of displays are available for use.
    (b) Set up a virtual desktop on the single display section.
  • However, the former process (a) requires not only the cost of an additional display but also space for it. On the other hand, the latter process (b) requires the user to enter a command or manipulate an icon appearing, for example, on the tray to access an area other than that which is actually displayed on the display section.
  • The present invention is intended to solve these problems by using, for example, mixed reality (MR)-based data processing. It should be noted that Patent Document 1 (Japanese Patent Laid-Open No. 2008-304268) and Patent Document 2 (Japanese Patent Laid-Open No. 2008-304269) are examples of the prior art describing mixed reality. These documents describe the process adapted to prepare a three-dimensional map of the real world using images captured by a camera.
  • PRIOR ART DOCUMENTS Patent Documents
    • Patent Document 1: Japanese Patent Laid-Open No. 2008-304268
    • Patent Document 2: Japanese Patent Laid-Open No. 2008-304269
    DISCLOSURE OF INVENTION Technical Problem
  • It is an object of the present invention to provide an information processor, information processing method and program that effectively use a space area other than the display section (display) of a PC or other device rather than only inside the display section for data processing thanks to mixed reality (MR)-based data processing.
  • Technical Solution
  • A first mode of the present invention is an information processor including a coordinate processing module, camera, three-dimensional information analysis section, second display section and virtual object management section. The coordinate processing module determines whether the position of a cursor, i.e., a position indicator displayed on a first display section, is located in or outside the area of the first display section and outputs cursor position information to the virtual object management section if the cursor is located outside the area of the first display section. The camera captures an image of a real object including the first display section. The three-dimensional information analysis section analyzes the three-dimensional position of the real object included in a camera-captured image. The second display section displays the camera-captured image. The virtual object management section generates a virtual object different from the real object included in the camera-captured image and generates a composite image including the generated virtual object and the real object so as to display the composite image on the second display section. The virtual object management section calculates the three-dimensional position of the cursor based on the cursor position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the cursor is placed at the calculated position as a virtual object.
  • Further, in an embodiment of the information processor according to the present invention, the information processor includes an application execution section adapted to process a specified object specified by the position indicator. The application execution section determines whether the specified object is located in or outside the area of the first display section and outputs object position information to the virtual object management section if the specified object is located outside the area of the first display section. The virtual object management section calculates the three-dimensional position of the object based on the object position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the object is placed at the calculated position as a virtual object.
  • Further, in another embodiment of the information processor according to the present invention, if the three-dimensional position of the object calculated based on the object position information supplied from the coordinate processing module includes the display area of the first display section, the virtual object management section displays, on the second display section, a composite image with an object area image overlapping the display area of the first display section deleted.
  • Further, in still another embodiment of the information processor according to the present invention, the information processor further includes an object information acquisition section. The object information acquisition section acquires image data of a real object specified by the cursor placed as the virtual object and searches data based on the acquired image data so as to acquire object information. The object information acquisition section outputs the acquired object information to the first display section as display data.
  • Further, in still another embodiment of the information processor according to the present invention, the object information acquisition section accesses a database in which real object image data and object information are associated with each other or a server so as to acquire object information through a search based on the real object image data.
  • Further, in still another embodiment of the information processor according to the present invention, the virtual object management section calculates a plane including the display surface of the first display section based on three-dimensional position information of components making up the first display section included in the camera-captured image and calculates the three-dimensional position of the cursor so that the cursor position is placed on the plane.
  • Further, in still another embodiment of the information processor according to the present invention, the cursor is a mouse cursor that moves by mouse operation. The coordinate processing module receives mouse cursor displacement information resulting from the mouse operation and determines whether the mouse cursor is located in or outside the area of the first display section.
  • Further, a second mode of the present invention is an information processing method performed by an information processor. The information processing method includes a coordinate processing step of a coordinate processing module determining whether the position of a cursor, i.e., a position indicator displayed on a first display section, is located in or outside the area of the first display section and outputting cursor position information to a virtual object management section if the cursor is located outside the area of the first display section. The information processing method further includes an image capture step of a camera capturing an image of a real object including the first display section. The information processing method further includes a three-dimensional information analysis step of a three-dimensional information analysis section analyzing the three-dimensional position of the real object included in a camera-captured image. The information processing method further includes a virtual object management step of a virtual object management section generating a virtual object different from the real object included in the camera-captured image and generating a composite image including the generated virtual object and the real object so as to display the composite image on the second display section.
  • The virtual object management step is a step of calculating the three-dimensional position of a cursor based on cursor position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the cursor is placed at the calculated position as a virtual object.
  • Further, a third mode of the present invention is a program causing an information processor to process information. The program includes a coordinate processing step of causing a coordinate processing module to determine whether the position of a cursor, i.e., a position indicator displayed on a first display section, is located in or outside the area of the first display section and output cursor position information to a virtual object management section if the cursor is located outside the area of the first display section. The program further includes an image capture step of causing a camera to capture an image of a real object including the first display section. The program still further includes a three-dimensional information analysis step of causing a three-dimensional information analysis section to analyze the three-dimensional position of the real object included in a camera-captured image. The program still further includes a virtual object management step of causing a virtual object management section to generate a virtual object different from the real object included in the camera-captured image and generate a composite image including the generated virtual object and the real object so as to display the composite image on the second display section.
  • The virtual object management step is a step of causing the virtual object management section to calculate the three-dimensional position of a cursor based on cursor position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the cursor is placed at the calculated position as a virtual object.
  • It should be noted that the program according to the present invention can be supplied, for example, via a recording or communication media adapted to supply, in a computer-readable form, a program to an image processor or computer system adapted to execute a variety of program codes. If such a program is supplied in a computer-readable form, the processes appropriate to the program are implemented in the image processor or computer system.
  • Other objects, features and advantages of the present invention will be apparent from a detailed description based on the embodiments which will be described later and the accompanying drawings. It should be noted that the term “system” in the present specification refers to a logical collection of a plurality of devices, and that the constituent devices are not necessarily provided in the same enclosure.
  • Advantageous Effect
  • In a configuration according to an embodiment of the present invention, a cursor or object lying in an area outside the area of the display section of a PC or other device is displayed as a virtual object. For example, the display of goggles worn by the user displays a display device such as a PC and the area outside the display device. The three-dimensional position of the cursor or object that is presumed to have moved in response to user operation is calculated, after which the cursor or object is displayed as a virtual object at the calculated position. Further, object information for the object specified by the cursor is acquired and presented. The present configuration makes it possible to constantly observe and verify data that has moved outside the display section, thus providing improved data processing efficiency.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a set of diagrams illustrating an example of a process performed by an information processor according to the present invention.
  • FIG. 2 is a set of diagrams illustrating an example of a process performed by the information processor according to the present invention.
  • FIG. 3 is a set of diagrams illustrating an example of a process performed by the information processor according to the present invention.
  • FIG. 4 is a diagram describing a configuration example of the information processor according to the present invention.
  • FIG. 5 is a diagram describing an example of display data displayed on a display of goggles worn by the user as a result of a process performed by the information processor according to the present invention.
  • FIG. 6 is a diagram illustrating a flowchart that describes a process sequence performed by the information processor according to the present invention.
  • FIG. 7 is a diagram describing a specific example of a process illustrating an example of a process performed by the information processor according to the present invention.
  • FIG. 8 is a diagram describing an example of display data displayed on the display of the goggles worn by the user as a result of a process performed by the information processor according to the present invention.
  • FIG. 9 is a diagram illustrating a flowchart that describes a process sequence performed by the information processor according to the present invention.
  • FIG. 10 is a diagram describing a specific example of a process illustrating an example of a process performed by the information processor according to the present invention.
  • FIG. 11 is a diagram describing a specific example of a process illustrating an example of a process performed by the information processor according to the present invention.
  • FIG. 12 is a diagram describing a configuration example of the information processor according to the present invention.
  • FIG. 13 is a diagram describing an example of display data displayed on the display of the goggles worn by the user as a result of a process performed by the information processor according to the present invention.
  • FIG. 14 is a diagram illustrating a flowchart that describes a process sequence performed by the information processor according to the present invention.
  • MODE FOR CARRYING OUT THE INVENTION
  • A detailed description will be given below of the information processor, processing method and program according to the present invention with reference to the accompanying drawings.
  • The present invention will be described with respect to the following items in sequence:
  • 1. Outline of the processes performed by the information processor according to the present invention
    2. Configuration of and processes performed by the information processor according to a first embodiment of the present invention
    3. Configuration of and processes performed by the information processor according to a second embodiment of the present invention
    4. Configuration of and processes performed by the information processor according to a third embodiment of the present invention
  • [1. Outline of the Processes Performed by the Information Processor According to the Present Invention]
  • A description will be given first of the outline of processes performed by the information processor according to the present invention with reference to FIGS. 1 to 3. The present invention is designed to effectively use a space area other than the display section (display) of a PC or other device for data processing thanks to mixed reality (MR)-based data processing.
  • FIG. 1 is a set of diagrams illustrating an example of a process performed by the information processor according to the present invention. FIG. 1 shows a display section 10 of a PC or other device operated by the user. It should be noted that although a detailed configuration will be described later, the user is operating the PC with goggles on. The goggles have a display adapted to display an image generated by a mixed reality (MR) generator.
  • The goggles have a camera adapted to capture an image of the surrounding environment. The display of the goggles displays a composite image composed of the camera-captured image and a virtual object generated by the mixed reality (MR) generator. Each of FIGS. 1( a) and (b) shows an image appearing on the display of the goggles worn by the user for observation.
  • The user is preparing a document by displaying the document on the display section 10 as illustrated, for example, in FIG. 1( a). This process is an ordinary PC operation. The display section 10 illustrated in FIG. 1( a) displays a mouse cursor 11 a as a position indicator adapted to move in response to the movement of the mouse operated by the user.
  • The user can move the mouse cursor 11 a by operating the mouse. In a conventional information processor, the mouse cursor moves within the display area of the display section 10. However, when the present invention is applied, the movement of the mouse cursor is not limited to within the display area of the display section 10.
  • For example, if moved along a movement line 12 shown in FIG. 1( a) by user operation, the mouse cursor can be moved to a space outside the display section 10 as shown in FIG. 1( b). This is a mouse cursor 11 b shown in FIG. 1( b). The mouse cursor 11 b shown in FIG. 1( b) is a virtual object generated by the mixed reality (MR) generator. The user observes the mouse cursor 11 b which is a virtual object displayed on the display of the goggles worn by the user. As described above, the configuration according to the present invention allows for the mouse cursor 11 to be moved at will inside or outside the display section 10.
  • FIG. 2 is also a set of diagrams illustrating an example of a process performed by the information processor according to the present invention. As with FIG. 1, FIG. 2 also shows the display section 10 of a PC or other device operated by the user. The user is wearing goggles having a display adapted to display an image generated by a mixed reality (MR) generator. Each of FIGS. 2( a) and 2(b) shows an image appearing on the display of the goggles worn by the user for observation.
  • The display section 10 shown in FIG. 2( a) displays the mouse cursor 11 a and an object 21 a specified by the mouse cursor 11 a. In this example, the object 21 a is an object displayed on the display section 10 as a result of the execution of a clock display application in the PC.
  • The user moves the mouse cursor 11 a onto the object 21 a by operating the mouse, specifies the object by operating the mouse and further moves the mouse cursor 11 a along a movement line 22 shown in FIG. 2( a).
  • This process allows for the mouse cursor and specified object 21 to be moved to a space outside the display section 10. The object 21 is an object 21 b shown in FIG. 2( b). The object 21 b shown in FIG. 2( b) is a virtual object generated by the mixed reality (MR) generator. The user observes the object 21 b displayed on the display of the goggles worn by the user. As described above, the configuration according to the present invention allows for not only the mouse cursor but also an object displayed on the display section 10 to be moved at will.
  • FIG. 3 is also a set of diagrams illustrating an example of a process performed by the information processor according to the present invention. As with FIG. 1 and FIG. 2, FIG. 3 also shows the display section 10 of a PC or other device operated by the user. The user is wearing goggles having a display adapted to display an image generated by a mixed reality (MR) generator. Each of FIGS. 3( a) and 3(b) shows an image appearing on the display of the goggles worn by the user for observation.
  • FIG. 3( a) shows the mouse cursor 11 a placed outside the display section 10 by the operation described earlier with reference to FIG. 1 and a real object 31 a specified by the mouse cursor 11 a. In this example, the object 31 a is a real object actually existing in a space. In this example, the object 31 a is a photograph of a CD jacket, i.e., a disk that stores music data.
  • The user places the mouse cursor 11 a on the object 31 a by mouse operation and specifies the object by mouse operation. Information about the specified object, i.e., object information, is acquired from a database or server by specifying the object. The acquired object information is displayed on the display section 10. The object image 31 b and object information 31 c shown in FIG. 3( b) are this acquired object information.
  • As described above, the configuration according to the present invention makes it possible to specify a variety of real objects in a real space with a mouse cursor, i.e., a virtual object, acquire information related to the specified object, load the acquired data into the information processor such as a PC for processing and display the acquired data on the display section 10 as a result of the execution of an application in the PC.
  • [2. Configuration of and Processes Performed by the Information Processor According to a First Embodiment of the Present Invention]
  • A detailed description will be given next of the configuration of and processes performed by the information processor adapted to perform the process described earlier with reference to FIG. 1 as a first embodiment of the present invention. As shown in FIG. 1, embodiment 1 is a configuration example in which the mouse cursor is moved to a space outside the display section 10 as illustrated in FIG. 1( b) by moving the mouse cursor 11 a along the movement line 12 shown in FIG. 1( a) as a result of mouse operation by the user.
  • FIG. 4 is a diagram illustrating the configuration of the information processor according to an embodiment of the present invention adapted to perform the above process. A user 100 processes a variety of data by operating a PC (personal computer) 120. The PC 120 includes a mouse driver 121, mouse coordinate processing module 122, GUI section 123, communication section 124, application execution section 125, control section 126, memory 127 and display section 128 as illustrated in FIG. 4. The PC 120 further includes a mouse 129 illustrated at the top in FIG. 4.
  • The mouse driver 121 receives position information and operation information from the mouse 129 as input information. The mouse coordinate processing module 122 determines the display position of the mouse cursor according to the position information of the mouse 129 received via the mouse driver 121. It should be noted that the display position of the mouse cursor is not limited to the display area of the display section 128 in the configuration according to the present invention.
  • The GUI section 123 is a user interface adapted, for example, to process information received from the user and output information to the user. The communication section 124 communicates with a mixed reality (MR) generator 130.
  • The application execution section 125 executes an application appropriate to data processing performed by the PC 120. The control section 126 exercises control over the processes performed by the PC 120. The memory 127 includes RAM, ROM and other storage devices adapted to store, for example, programs and data processing parameters. The display section 128 is a display section which includes, for example, an LCD.
  • The user 100 wears goggles 141 having a display adapted to display virtual objects. The goggles have a camera 142 adapted to capture an image of the surrounding environment. The goggles 141 and camera 142 are connected to the mixed reality (MR) generator 130. The user 100 performs his or her tasks while observing an image appearing on the display provided on the goggles 141.
  • The display of the goggles 141 displays a real-world image, i.e., an image captured by the camera 142. The display of the goggles 141 further displays a virtual object, generated by the mixed reality (MR) generator 130, together with the real-world image.
  • In the example shown in FIG. 4, the user 100 is operating the PC (personal computer) 120, and the camera 142 is capturing an image of the PC (personal computer) 120 operated by the user 100. Therefore, the display of the goggles 141 displays, as a real-world image, an image including, for example, the display (display section 128) of the PC (personal computer) 120 operated by the user 100 and a variety of real objects around the display of the PC 120. Further, a virtual object, generated by the mixed reality (MR) generator 130, appears superimposed on the real-world image. The orientation of the camera 142 is changed according to the movement of the user 100.
  • If the user 100 faces the screen of the PC 120 while performing his or her tasks, the camera 142 captures an image centering around the image on the screen of the PC 120. As a result, display data 150 as illustrated, for example, in FIG. 5 appears on the display of the goggles 141 worn by the user 100. The display data 150 illustrated in FIG. 5 is a composite image including real and virtual objects.
  • A description will be given next of the configuration of the mixed reality (MR) generator 130 shown in FIG. 4. The mixed reality (MR) generator 130 includes a three-dimensional information analysis section 131, virtual object management module 132, memory 133 and communication section 134 as illustrated in FIG. 4.
  • The three-dimensional information analysis section 131 receives an image captured by the camera 142 worn by the user and analyzes the three-dimensional positions of the objects included in the captured image. This three-dimensional position analysis is performed, for example, using SLAM (simultaneous localization and mapping). SLAM is designed to select feature points from a variety of real objects included in the camera-captured image and detect the positions of the selected feature points and the position and posture of the camera. It should be noted that SLAM is described in Patent Document 1 (Japanese Patent Laid-Open No. 2008-304268) and Patent Document 2 (Japanese Patent Laid-Open No. 2008-304269) mentioned earlier. It should be noted that the basic process of SLAM is described in a paper titled [Andrew J. Davison, “Real-time simultaneous localisation and mapping with a single camera,” Proceedings of the 9th International Conference on Computer Vision (2003)].
  • The three-dimensional information analysis section 131 calculates the three-dimensional positions of the real objects included in the image captured by the camera 142 worn by the user using, for example, SLAM described above. It should be noted, however, that the three-dimensional information analysis section 131 may find the three-dimensional positions of the real objects included in the camera-captured image by using a method other than SLAM described above.
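SLAM itself is beyond the scope of a short example, but the kind of result the three-dimensional information analysis section 131 produces, namely the camera position and posture relative to known feature points, can be sketched with OpenCV's pose estimation. The marker coordinates, pixel positions and camera intrinsics below are placeholder values, not data from the document.

```python
# Not SLAM itself: a minimal illustration of recovering the camera pose from
# correspondences between known 3D feature points and their projections in the
# camera-captured image. All numerical inputs are placeholders.
import numpy as np
import cv2

object_points = np.array([[0, 0, 0], [0.4, 0, 0], [0.4, 0.3, 0], [0, 0.3, 0]],
                         dtype=np.float32)           # e.g. display corners, in metres
image_points = np.array([[320, 240], [520, 245], [515, 400], [318, 395]],
                        dtype=np.float32)            # their pixel positions in the image
camera_matrix = np.array([[800, 0, 320],
                          [0, 800, 240],
                          [0, 0, 1]], dtype=np.float32)
dist_coeffs = np.zeros(4)                            # assume no lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
# rvec/tvec describe the transform from the marker (display) frame to the camera
# frame, from which the camera's position and posture can be derived; a full SLAM
# system additionally builds the 3D map of feature points while tracking.
```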
  • The virtual object management module 132 manages the virtual objects appearing on the display of the goggles 141 worn by the user. The virtual objects are data stored in the memory 133. More specifically, the display of the goggles 141 worn by the user displays, for example, the display data 150 illustrated in FIG. 5. A PC image 151 included in the display data 150 is a real image (real object) captured by the camera 142.
  • For example, if the user moves the mouse 129 of the PC 120, a mouse cursor 152 a appearing in the PC image 151 shown in FIG. 5 moves outside the PC image 151, thus displaying a mouse cursor 152 b as a virtual object.
  • The user 100 shown in FIG. 4 can observe a composite image including, for example, the real and virtual objects shown in FIG. 5 on the display of the goggles 141. The PC image 151 shown in FIG. 5 is a real object captured by the camera 142. The mouse cursor 152 a in the PC image 151 is also information, i.e., a real object, actually appearing in the PC image 151. It should be noted that an object that exists in a real world captured by the camera 142 and whose image can be captured by the camera is described here as a real object.
  • The mouse cursor 152 b outside the PC image 151 shown in FIG. 5 is not a real-world object (real object). The mouse cursor 152 b is a virtual object generated by the mixed reality (MR) generator 130. The mouse cursor 152 b is an object that does not exist in a real world but appears on the display of the goggles 141 worn by the user.
  • A description will be given below of the process sequence adapted to display the mouse cursors as described above with reference to the flowchart shown in FIG. 6. It should be noted that we assume that the user is operating the mouse 129 connected to the PC 120 shown in FIG. 4. When operation information is supplied to the mouse driver 121, the process starting from step S101 in the flowchart shown in FIG. 6 is performed.
  • The process steps from step S101 to step S105 in the flowchart shown in FIG. 6 are performed by the PC 120 shown in FIG. 4.
  • The process steps from step S106 to step S109 are performed by the mixed reality (MR) generator 130 shown in FIG. 4.
  • In step S101, the mouse coordinate processing module 122 of the PC 120 receives mouse displacement (dX, dY) information from the mouse driver 121.
  • In step S102, the mouse coordinate processing module 122 calculates an updated mouse cursor position (XQ, YQ) from the previous mouse cursor position (XP, YP) and the mouse displacement (dX, dY) information.
  • In step S103, the mouse coordinate processing module 122 determines whether or not the updated mouse cursor position (XQ, YQ) is outside the display section. The process proceeds to step S104 when the updated mouse cursor position (XQ, YQ) is inside the area of the display section. In this case, the PC updates the display of the mouse cursor as is normally done. The process proceeds to step S105 if the updated mouse cursor position (XQ, YQ) is outside the area of the display section.
  • In step S105, the mouse cursor position information (XQ, YQ) stored in the memory is transferred to the mixed reality (MR) generator 130 via the communication section. It should be noted that, in the present embodiment, the position information transferred from the PC 120 to the mixed reality (MR) generator 130 is only position information of the mouse cursor and that the mixed reality (MR) generator 130 recognizes in advance that the transferred position information is that of the mouse cursor. In order to transfer position information or other information of other objects, it is necessary to transfer identification information of each object or object drawing data as well.
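The PC-side steps S101 to S105 amount to updating the cursor position from the mouse displacement and branching on whether the updated position is still inside the display section. A minimal sketch follows; the class name, the display dimensions and the send interface of the communication section are assumptions for illustration.

```python
# Illustrative sketch of steps S101-S105 (PC 120 side). Names and the transport
# used by the communication object are assumptions.

class MouseCoordinateModule:
    def __init__(self, display_width, display_height, comm):
        self.width = display_width      # width of display section 128 in pixels
        self.height = display_height    # height of display section 128 in pixels
        self.comm = comm                # communication section 124 (assumed interface)
        self.x, self.y = 0, 0           # previous cursor position (XP, YP)

    def on_mouse_moved(self, dx, dy):
        # S101/S102: update the cursor position from the displacement (dX, dY)
        self.x += dx
        self.y += dy

        # S103: is the updated position (XQ, YQ) inside the display section?
        inside = 0 <= self.x < self.width and 0 <= self.y < self.height
        if inside:
            # S104: the PC redraws the cursor on its own display as usual (not shown)
            return "draw_on_display"

        # S105: transfer the cursor position to the mixed reality (MR) generator 130
        self.comm.send({"type": "mouse_cursor", "pos": (self.x, self.y)})
        return "sent_to_mr_generator"
```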
  • The process steps from step S106 onward are performed by the mixed reality (MR) generator 130.
  • First, in step S106, the mixed reality (MR) generator 130 stores the mouse cursor position information (XQ, YQ) transferred from the PC 120 in its memory 133. If the mixed reality (MR) generator 130 receives non-display data (mouse cursor drawing data) or its identifier from the PC 120, the received data is also stored in the memory 133 of the mixed reality (MR) generator 130.
  • Next, in step S107, the virtual object management module 132 of the mixed reality (MR) generator 130 acquires the data stored in the memory 133, i.e., the non-display data (mouse cursor drawing data) and position information (XQ, YQ).
  • In step S108, the virtual object management module 132 converts the position information of the non-display data (mouse cursor) into the camera coordinate system for the camera-captured image acquired from the three-dimensional information analysis section 131.
  • A description will be given of this process with reference to FIG. 7. The three-dimensional information analysis section 131 has already acquired three-dimensional position information of markers 201 a to 201 d at the four corners of a display section 200 of the PC included in the camera-captured image. As illustrated in FIG. 7, the following pieces of position information have been acquired:
  • Marker 201 a=(xa, ya, za)
  • Marker 201 b=(xb, yb, zb)
  • Marker 201 c=(xc, yc, zc)
  • Marker 201 d=(xd, yd, zd)
  • It should be noted that these pieces of position information are in the camera coordinate system (x, y, z).
  • On the other hand, the mouse cursor position information (XQ, YQ) received from the PC 120 is in the PC display section plane coordinate system. As illustrated in FIG. 7, the mouse cursor position information is that having, for example, the top left corner of the display section as its origin (X, Y)=(0, 0) with the horizontal direction denoted by X and the vertical direction denoted by Y.
  • The virtual object management module 132 calculates a plane of the display section in the camera coordinate system based on the three-dimensional position information of the markers 201 a to 201 d and determines the position where the non-display data (mouse cursor drawing data) acquired from the PC 120 is to be placed on the calculated plane. In order to perform this process, the position information (XQ, YQ) represented in the display section plane coordinate system (X, Y) acquired from the PC 120 is converted, thus calculating the display position (xq, yq, zq) of a mouse cursor 211 q in the camera coordinate system (x, y, z).
  • The display position (xq, yq, zq) of the mouse cursor 211 q is set on the plane of the display surface formed by the markers 201 a to 201 d at the four corners of the display section 200 shown in FIG. 7. First, the display surface formed by the markers 201 a to 201 d at the four corners of the display section 200 is found.
  • This display surface can be defined by using arbitrary three of the four coordinates of the markers 201 a to 201 d at the four corners of the display section 200. For example, the display surface can be defined by using the coordinates of the following three points:
  • Marker 201 a=(xa, ya, za)
  • Marker 201 b=(xb, yb, zb)
  • Marker 201 c=(xc, yc, zc)
  • An xyz plane (a plane in the camera coordinate system (x, y, z)) passing through the display surface can be expressed as illustrated in Equation 1 below by using the coordinates of the above three points.

  • (x−xa)(yb−ya)(zc−za)+(xb−xa)(yc−ya)(z−za)+(xc−xa)(y−ya)(zb−za)−(xc−xa)(yb−ya)(z−za)−(xb−xa)(y−ya)(zc−za)−(x−xa)(yc−ya)(zb−za)=0  (Equation 1)
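For reference, Equation 1 is the expansion of the standard coplanarity condition for the point (x, y, z) and the three marker positions; written compactly as a determinant (a restatement added here for clarity, not part of the original text) it reads:

```latex
\begin{vmatrix}
x - x_a & y - y_a & z - z_a \\
x_b - x_a & y_b - y_a & z_b - z_a \\
x_c - x_a & y_c - y_a & z_c - z_a
\end{vmatrix} = 0
```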
  • The virtual object management module 132 converts the position information (XQ, YQ) represented in the display section plane coordinate system (X, Y) acquired from the PC 120 into position coordinates (xq, yq, zq) on the xyz plane in the camera coordinate system (x, y, z).
  • We assume that the coordinates to be found are the coordinate position (xq, yq, zq), in the camera coordinate system (x, y, z), of the mouse cursor 211 q shown in FIG. 7, and that the positions of the following three markers in the camera coordinate system are:
  • Marker 201 a=(xa, ya, za)
  • Marker 201 b=(xb, yb, zb)
  • Marker 201 c=(xc, yc, zc)
  • Further, the positions of the above three points in the display section plane coordinate system (X, Y) are respectively assumed to be as follows:
  • Marker 201 a=(0, 0)
  • Marker 201 b=(XB, 0)
  • Marker 201 c=(0, YC)
  • The positional relationship between the following sets of coordinates in the display section plane coordinate system (X, Y), namely,
  • Marker 201 a=(0, 0)
  • Marker 201 b=(XB, 0)
  • Marker 201 c=(0, YC)
  • Position of a mouse cursor 211 p (XP, YP)
  • Position of the mouse cursor 211 q (XQ, YQ) is the same as that between the following sets of coordinates in the camera coordinate system (x, y, z), namely,
  • Marker 201 a=(xa, ya, za)
  • Marker 201 b=(xb, yb, zb)
  • Marker 201 c=(xc, yc, zc)
  • Position of the mouse cursor 211 p (xp, yp, zp)
  • Position of the mouse cursor 211 q (xq, yq, zq).
  • Hence, the following equations hold:

  • (0−XQ)/(0−XB)=(xa−xq)/(xa−xb)

  • (0−YQ)/(0−YC)=(ya−yq)/(ya−yc)
  • From the above, the following relational equations (Equations 2 and 3) can be derived:

  • xq=xa−XQ(xa−xb)/XB  (Equation 2)

  • yq=ya−YQ(ya−yc)/YC  (Equation 3)
  • By substituting the above relational equations (Equations 2 and 3) into the equation (Equation 1) described earlier, zq can be derived. The position (xq, yq, zq) of the mouse cursor 211 q is calculated as described above.
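Putting Equations 1 to 3 together, the conversion from the display-plane position (XQ, YQ) to the camera-coordinate position (xq, yq, zq) can be sketched as follows. The function name and argument layout are assumptions; the arithmetic follows the equations above (xq and yq from Equations 2 and 3, zq by substituting them into Equation 1 and solving for z).

```python
def cursor_to_camera_coords(XQ, YQ, XB, YC, A, B, C):
    """Convert a display-plane position (XQ, YQ) into camera coordinates.

    A, B, C are the camera-coordinate positions (x, y, z) of markers 201a,
    201b and 201c; XB and YC are the display-plane X and Y coordinates of
    markers 201b and 201c. All names are illustrative assumptions."""
    xa, ya, za = A
    xb, yb, zb = B
    xc, yc, zc = C

    # Equation 2 / Equation 3: interpolate along the display edges
    xq = xa - XQ * (xa - xb) / XB
    yq = ya - YQ * (ya - yc) / YC

    # Substitute (xq, yq) into the plane equation (Equation 1) and solve for zq
    denom = (xb - xa) * (yc - ya) - (xc - xa) * (yb - ya)
    numer = ((xq - xa) * (yb - ya) * (zc - za)
             + (xc - xa) * (yq - ya) * (zb - za)
             - (xb - xa) * (yq - ya) * (zc - za)
             - (xq - xa) * (yc - ya) * (zb - za))
    zq = za - numer / denom
    return xq, yq, zq
```

For example, with markers 201 a, 201 b and 201 c at (0, 0, 0), (1, 0, 0) and (0, 1, 0) in camera coordinates and XB = YC = 1, a cursor at (XQ, YQ) = (1.5, 0.5) maps to (1.5, 0.5, 0), i.e., a point on the extension of the display plane to the right of the screen.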
  • In step S108 shown in the flow of FIG. 6, the virtual object management module 132 converts the position information (XQ, YQ) of the non-display data acquired from the memory 133 into the position (xq, yq, zq) in the camera coordinate system for the camera-captured image acquired from the three-dimensional information analysis section 131 as described above.
  • Next, in step S109, the virtual object management module 132 displays the mouse cursor at the generated coordinate position (xq, yq, zq) in the camera coordinate system. It should be noted that if mouse cursor drawing data has been received from the PC 120 and is stored in the memory 133, the non-display data included in the data stored in the memory 133, i.e., the non-display data (mouse cursor drawing data) transferred from the PC 120, is displayed at the generated coordinate position (xq, yq, zq) in the camera coordinate system.
  • As a result of this process, the display data 150 shown in FIG. 5 appears on the display of the goggles 141 worn by the user 100. The display data 150 shown in FIG. 5 is a composite image showing the PC image 151 as a real object together with the mouse cursor 152 b as a virtual object. The virtual object management module 132 sets the display position of the virtual object in the space outside the PC display section as illustrated in FIG. 5. This display process allows for the user to move the mouse cursor outside of the PC display section rather than only inside the PC display section, thus making it possible to use a larger work area for data processing.
  • It should be noted that the process described with reference to the flowchart shown in FIG. 6 is performed each time the mouse cursor position changes as a result of the user operating the mouse 129 of the PC 120. The mouse coordinate processing module 122 transmits updated data to the mixed reality (MR) generator 130 each time the mouse cursor position changes. The mixed reality (MR) generator 130 changes the display position of the virtual object (mouse cursor) based on the updated data as a realtime process.
  • [3. Configuration of and Processes Performed by the Information Processor According to a Second Embodiment of the Present Invention]
  • A detailed description will be given next of the configuration of and processes performed by the information processor adapted to perform the process described earlier as a second embodiment of the present invention with reference to FIG. 2. As described with reference to FIG. 2, embodiment 2 is a configuration example in which the object 21 is moved to a space outside the display section 10 as illustrated in FIG. 2( b) by specifying the object 21 and moving the mouse cursor 11 a along the movement line 22 shown in FIGS. 2( a) and 2(b) as a result of mouse operation by the user.
  • The present embodiment is performed by the devices configured as shown in FIG. 4 as with the first embodiment. In the example shown in FIG. 4, the user 100 is operating the PC (personal computer) 120, and the camera 142 is capturing an image of the PC (personal computer) 120 operated by the user 100. Therefore, the display of the goggles 141 worn by the user 100 displays, as a real-world image, an image including, for example, the display (display section 128) of the PC (personal computer) 120 operated by the user 100 and a variety of real objects around the display of the PC 120. Further, a virtual object, generated by the mixed reality (MR) generator 130, appears superimposed on the real-world image. The orientation of the camera 142 is changed according to the movement of the user 100.
  • If the user 100 faces the screen of the PC 120 while performing his or her tasks, the camera 142 captures an image centering around the image on the screen of the PC 120. As a result, display data 250 as illustrated, for example, in FIG. 8 appears on the display of the goggles 141 worn by the user 100. The display data 250 illustrated in FIG. 8 is a composite image including real and virtual objects.
  • A PC image 251 included in the display data 250 is a real image (real object) captured by the camera 142. For example, FIG. 8 shows the process in which the user moves the mouse 129 of the PC 120 shown in FIG. 4. If an object 252 a appearing in the PC image 251 shown in FIG. 8 is moved outside of the PC image 251 after having been specified by a mouse cursor 271 a, an object 252 and mouse cursor 271 move together. If the object 252 and mouse cursor 271 continue to move, an object 252 b and mouse cursor 271 b are displayed outside the PC image 251 as virtual objects.
  • The user 100 shown in FIG. 4 can observe a composite image including, for example, the real and virtual objects shown in FIG. 8 on the display of the goggles 141. The PC image 251 shown in FIG. 8 is a real object captured by the camera 142. Both the object 252 a in the PC image 251 and the mouse cursor 271 a are information and real objects actually displayed in the PC image 251. On the other hand, the object 252 b and mouse cursor 271 b outside the PC image 251 shown in FIG. 8 are not real-world objects (real objects). The object 252 b and mouse cursor 271 b are virtual objects generated by the mixed reality (MR) generator 130. The object 252 b and mouse cursor 271 b are objects that do not exist in a real world but appear on the display of the goggles 141 worn by the user.
  • A description will be given below of the process sequence adapted to display virtual objects as described above with reference to the flowchart shown in FIG. 9. It should be noted that we assume that the user is operating the mouse 129 connected to the PC 120 shown in FIG. 4.
  • It should be noted that, as can be understood from the display data 250 shown in FIG. 8, the mouse cursor 271 b is displayed as a virtual object in the present embodiment as in the first embodiment. The sequence adapted to display the mouse cursor 271 b is performed in the same manner as the sequence described with reference to FIG. 6.
  • In the present embodiment 2, the process adapted to display the object 252 specified by the mouse is further added. The flowchart shown in FIG. 9 is a flow that describes only the sequence adapted to display this mouse-specified object. That is, if the display data 250 shown in FIG. 8 is generated and displayed, two processes, one according to the flow shown in FIG. 6 and another according to the flow shown in FIG. 9, are performed together.
  • The process steps from step S201 to step S204 in the flowchart shown in FIG. 9 are performed by the PC 120 shown in FIG. 4.
  • The process steps from step S205 to step S208 are performed by the mixed reality (MR) generator 130 shown in FIG. 4.
  • In step S201, object information specified by the mouse 129 of the PC 120 is stored in the memory 127 of the PC 120. It should be noted that object information stored in the memory 127 includes drawing data and position information of the object. Position information is, for example, the coordinates of the center position serving as a reference of the object or a plurality of pieces of position information defining the outline.
  • In the case of rectangular data such as an object 301 a shown in FIG. 10, for example, coordinate information of each of four apexes P, Q, R and S is stored in the memory as elements making up object information. It should be noted that position information need only be that which allows an object to be drawn at a specific position. Therefore, coordinate information of only one point, i.e., P, of all the four apexes P, Q, R and S, may be stored in the memory. Object drawing data representing the shape of the object is also stored in the memory. Therefore, even if coordinate information of only one point, i.e., P, is stored in the memory as position information, it is possible to draw (display) the object using P as a starting point.
  • In step S202, it is determined whether or not an out-of-display-section area has been produced in the mouse-specified object area as a result of the movement of the mouse 129 of the PC 120 by user operation. In this process, the application execution section 125 of the PC 120 makes this determination based on the new mouse cursor position and object shape acquired from the mouse coordinate processing module 122.
  • If the determination in step S202 is No, that is, if no out-of-display-section area has been produced in the mouse-specified object area, the process proceeds to step S203 where the application execution section 125 of the PC 120 displays the mouse-specified object in the display section.
  • On the other hand, when the determination in step S202 is Yes, that is, when an out-of-display-section area has been produced in the mouse-specified object area, the process proceeds to step S204. In this case, for example, the object is moved to the position of an object 301 b shown in FIG. 10 or to the position of an object 301 c shown in FIG. 11 by user operation. It should be noted that FIGS. 10 and 11 illustrate examples in which the objects 301 b and 301 c shown in FIGS. 10 and 11 appear at least partly as virtual objects on the display of the goggles worn by the user.
  • In step S204, the data (non-display data (object drawing data)) and position information stored in the memory are transmitted to the mixed reality (MR) generator 130. In the example shown in FIG. 10, for example, the drawing data of the clock, i.e., the object 301 b, and the coordinate data of each of the four apexes P, Q, R and S of the object 301 b, are acquired from the memory 127 of the PC 120 and transmitted to the mixed reality (MR) generator 130.
  • It should be noted that the transferred position information is that in the PC display section plane coordinate system as with embodiment 1 which is position information having, for example, the top left corner of the display section as its origin (X, Y)=(0, 0) with the horizontal direction denoted by X and the vertical direction denoted by Y as illustrated in FIG. 10. In the example shown in FIG. 10, coordinate information of the following four apexes is transferred:
  • P=(XP, YP)
  • Q=(XQ, YQ)
  • R=(XR, YR)
  • S=(XS, YS)
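The PC-side steps S201 to S204 can be sketched in the same style as the cursor handling in FIG. 6. The record layout for the object and the send interface below are assumptions for illustration; the essential point is the bounding check of the four apexes P, Q, R and S against the display section.

```python
# Illustrative sketch of steps S201-S204 (PC 120 side). The object record layout
# and the communication interface are assumptions.

def handle_object_drag(obj, display_width, display_height, comm):
    """obj is assumed to carry drawing data and the four apex coordinates
    P, Q, R, S in the display-plane coordinate system (origin at the top left)."""
    # S201: object information (drawing data + apex positions) is kept in memory
    apexes = obj["apexes"]              # [(XP, YP), (XQ, YQ), (XR, YR), (XS, YS)]

    # S202: has any part of the object area left the display section?
    outside = any(not (0 <= x < display_width and 0 <= y < display_height)
                  for x, y in apexes)
    if not outside:
        # S203: the application simply redraws the object on the display section
        return "draw_on_display"

    # S204: transmit drawing data and position information to the MR generator 130
    comm.send({"type": "object",
               "drawing_data": obj["drawing_data"],
               "apexes": apexes})
    return "sent_to_mr_generator"
```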
  • The process steps from step S205 onward are performed by the mixed reality (MR) generator 130.
  • First, in step S205, the mixed reality (MR) generator 130 stores the data received from the PC 120, i.e., the non-display data (object drawing data) and position information (P, Q, R and S coordinate information), in the memory 133 of the mixed reality (MR) generator 130.
  • Next, in step S206, the virtual object management module 132 of the mixed reality (MR) generator 130 acquires the data stored in the memory 133, i.e., the non-display data (object drawing data) and position information (P, Q, R and S coordinate information).
  • Next, in step S207, the virtual object management module 132 converts the position information of points P, Q, R and S acquired from the memory 133 into positions in the camera coordinate system for the camera-captured image acquired from the three-dimensional information analysis section 131.
  • This coordinate conversion is performed in the same manner as the coordinate conversion of the mouse cursor described in the first embodiment. Therefore, the detailed description thereof is omitted.
  • In the example shown in FIG. 10, for example, the coordinates of each of points P, Q, R and S of the object 301 b in the display section plane coordinate system (X, Y) are converted into the following coordinates in the camera coordinate system (x, y, z):
  • P=(XP, YP)->(xp, yp, zp)
  • Q=(XQ, YQ)->(xq, yq, zq)
  • R=(XR, YR)->(xr, yr, zr)
  • S=(XS, YS)->(xs, ys, zs)
  • As described above, the virtual object management module 132 converts, in step S207 shown in the flow of FIG. 9, the position information of the non-display data acquired from the memory 133 into positions in the camera coordinate system (x, y, z) for the camera-captured image acquired from the three-dimensional information analysis section 131.
  • Next, in step S208, the virtual object management module 132 acquires the non-display data (object drawing data) contained in the data stored in the memory 133, drawing or displaying the objects at the generated coordinate positions in the camera coordinate system as illustrated in FIG. 10.
  • As a result of this process, the display data 250 shown in FIG. 8 appears on the display of the goggles 141 worn by the user 100. The display data 250 shown in FIG. 8 is a composite image showing the PC image 251 as a real object together with the object 252 b and mouse cursor 271 b as virtual objects. The virtual object management module 132 sets the display positions of the virtual objects in the space outside the PC display section as illustrated in FIG. 8. This display process allows for the user to display a variety of objects in a space outside the PC display section rather than only inside the PC display section, thus making it possible to use a larger work area for data processing.
  • It should be noted that the process described with reference to the flowchart shown in FIG. 9 is performed each time the mouse cursor position changes as a result of the user operating the mouse 129 of the PC 120. The application execution section 125 transmits updated data to the mixed reality (MR) generator 130 each time the mouse cursor position changes. The mixed reality (MR) generator 130 changes the display position of the virtual object (clock) based on the updated data as a realtime process.
  • It should be noted that the case in which the determination in step S202 is Yes in the flowchart shown in FIG. 9 also occurs, for example, at the position of the object 301 c shown in FIG. 11. That is, the determination in step S202 is Yes if even only part of the mouse-specified object is located outside the display area of the PC display section.
  • In this case, the position information of points P, Q, R and S shown in FIG. 11 is also transferred from the PC 120 to the mixed reality (MR) generator 130 as the position information of the object 301 c. The mixed reality (MR) generator 130 need only display the object 301 c in such a manner that part of the object 301 c appears superimposed on the display section of the PC.
  • It should be noted that, in this case, only an out-of-display-section area of the object 301 c shown in FIG. 11 enclosed by points U, Q, R and V may appear as a virtual object and that, as for the portion of the object enclosed by points P, U, V and S, not a virtual object but the real object appearing on the display section of the PC, i.e., the camera-captured image itself, may be displayed on the display of the goggles worn by the user.
  • In order to perform this process, the virtual object management module 132 of the mixed reality (MR) generator 130 generates virtual object display data, made up only of the data of the portion enclosed by points U, Q, R and V as shown in FIG. 11, and displays this display data during the virtual object display process. That is, the data of the portion enclosed by points P, U, V and S of the object drawing data received from the PC is made to appear transparent.
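A minimal sketch of this partial display follows, assuming the object drawing data is an RGBA pixel buffer and that the display section's area has already been expressed in the same local pixel coordinates; both assumptions, and the use of NumPy, are illustrative only. The pixels overlapping the display section are made transparent so that only the out-of-display-section portion (the area enclosed by points U, Q, R and V in FIG. 11) is drawn as a virtual object.

```python
# Illustrative sketch: make the part of the object that overlaps the PC display
# transparent so only the out-of-display portion is rendered as a virtual object.
import numpy as np

def mask_in_display_portion(drawing_rgba, display_rect):
    """drawing_rgba: H x W x 4 array (object drawing data received from the PC).
    display_rect: (left, top, right, bottom) of the display area expressed in
    the same local pixel coordinates as drawing_rgba."""
    left, top, right, bottom = display_rect
    h, w = drawing_rgba.shape[:2]
    # Clamp the display rectangle to the extent of the drawing data
    l, t = max(0, left), max(0, top)
    r, b = min(w, right), min(h, bottom)
    out = drawing_rgba.copy()
    if l < r and t < b:
        out[t:b, l:r, 3] = 0   # alpha = 0: this part stays the real on-screen image
    return out
```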
  • [4. Configuration of and Processes Performed by the Information Processor According to a Third Embodiment of the Present Invention]
  • A detailed description will be given next of the configuration of and processes performed by the information processor adapted to perform the process described earlier as a third embodiment of the present invention with reference to FIG. 3. As described with reference to FIG. 3, embodiment 3 is a configuration example in which the object information 31 c is displayed as illustrated in FIG. 3( b) by specifying an object 31 in a real space outside the PC display section as a result of mouse operation by the user.
  • The present embodiment is performed by the devices configured as shown in FIG. 12. The PC 120 is configured in the same manner as described earlier with reference to FIG. 4 in embodiment 1. The mixed reality (MR) generator 130 includes the three-dimensional information analysis section 131, virtual object management module 132, memory 133, communication section 134, an object information acquisition section 135 and an object information database 136. It should be noted that the object information database 136 need not necessarily be provided in the mixed reality (MR) generator 130. The object information database 136 need only be, for example, a network-connectable database that can be accessed via the communication section of the mixed reality (MR) generator 130.
  • The three-dimensional information analysis section 131, virtual object management module 132, memory 133 and communication section 134 are configured in the same manner as described earlier with reference to FIG. 4 in the first embodiment. It should be noted, however, that the communication section 134 communicates with an external server 140 or the object information database 136 via a network.
  • The object information acquisition section 135 acquires a variety of real object images from the image captured by the camera 142 worn by the user 100 and compares the images with the data stored in the object information database 136, selecting similar images and acquiring object information associated with the selected images.
  • For example, if the real object image is a photograph of a CD jacket, the object information is a variety of information such as the song titles, genre, artist and price of the CD. These pieces of object information are associated with the object image and stored in the object information database 136.
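  • A simplified sketch of such a lookup is shown below. The similarity measure (a coarse color histogram) and the database layout are assumptions made for illustration only; the text does not specify how image matching is performed.

```python
# Toy illustration of matching a captured real-object image against registered
# images and returning the associated object information (e.g. CD title, price).
import numpy as np

def descriptor(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """Coarse RGB color histogram used here as a stand-in image descriptor."""
    hist, _ = np.histogramdd(image.reshape(-1, 3).astype(float),
                             bins=(bins, bins, bins), range=((0, 256),) * 3)
    return hist.ravel() / max(hist.sum(), 1.0)

def lookup_object_info(query_image, database, threshold=0.5):
    """Return the object information of the most similar registered image,
    or None if nothing is similar enough (the search misses)."""
    q = descriptor(query_image)
    best_info, best_dist = None, float("inf")
    for entry in database:                 # entry: {"image": ndarray, "info": dict}
        dist = float(np.linalg.norm(q - descriptor(entry["image"])))
        if dist < best_dist:
            best_info, best_dist = entry["info"], dist
    return best_info if best_dist < threshold else None

# A database entry mirroring the CD-jacket example in the text:
# {"image": jacket_pixels, "info": {"title": "...", "artist": "...", "price": "..."}}
```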
  • It should be noted that the server 140 also holds the same information as that stored in the object information database 136. The mixed reality (MR) generator 130 transmits the image captured by the camera 142 worn by the user 100 or a real object image (e.g., CD jacket image) selected from the captured image to the server via the communication section 134. The server extracts corresponding object information from the received image, supplying the object information to the mixed reality (MR) generator 130.
  • As described above, the mixed reality (MR) generator 130 acquires object information from the object information database 136 or the server 140 and supplies the acquired information to the PC 120 together with the data of the real object image captured by the camera 142. The PC 120 displays the acquired information on its display section.
  • As a result, display data 450 as shown, for example, in FIG. 13 appears on the display of the goggles 141 worn by the user 100. A PC image 451 included in the display data 450 is a real image (real object) captured by the camera 142. An object 471 a outside the PC image 451 is also a real object. A mouse cursor 480 a is a virtual object.
  • An object image 471 b and object information 471 c appearing in the PC image 451 are data displayed on the display section 128 by the application execution section 125 of the PC 120. Of the display data 450 shown in FIG. 13, therefore, the information other than the mouse cursor 480 a is real display content that merely appears, as part of the camera-captured image, on the display of the goggles 141 worn by the user 100; this content can also be observed by those users not wearing any goggles.
  • That is, the object image 471 b and object information 471 c appearing in the PC image 451 are display data on the display section of the PC 120 which can be observed by anybody.
  • A description will be given below, with reference to the flowchart shown in FIG. 14, of the process sequence adapted to generate and display the data described above. It should be noted that we assume that the user is operating the mouse 129 connected to the PC 120 shown in FIG. 12.
  • It should be noted that, as can be understood from the display data 450 shown in FIG. 13, the mouse cursor 480 a is displayed as a virtual object in the present embodiment as in the first and second embodiments. The sequence adapted to display the mouse cursor 480 a is performed in the same manner as the sequence described with reference to FIG. 6.
  • In the present third embodiment, a process for the real object specified by the mouse is further added. The flowchart shown in FIG. 14 describes only the sequence for this mouse-specified object. That is, if the display data 450 shown in FIG. 13 is generated and displayed, two processes, one according to the flow shown in FIG. 6 and another according to the flow shown in FIG. 14, are performed together.
  • The process step in step S301 in the flowchart shown in FIG. 14 is performed by both the PC 120 and mixed reality (MR) generator 130 shown in FIG. 12. The process steps from step S302 to step S309 are performed by the mixed reality (MR) generator 130 shown in FIG. 12. The process step in step S310 is performed by the PC 120 shown in FIG. 12.
  • Prior to the process step in step S301, the process according to the flow shown in FIG. 6 described in the first embodiment is performed, thus placing the mouse cursor in an out-of-display-section area. We assume, for example, that the mouse cursor is located at the position of the mouse cursor 480 a shown in FIG. 13.
  • In step S301, it is determined whether or not a real object has been specified by mouse operation. When a real object has been specified, the process proceeds to step S302. If a real object has not been specified, the process is terminated. The following process is performed when a real object has been specified. First, when mouse clicking information is supplied to the application execution section 125 via the mouse driver 121 of the PC 120, the application execution section 125 sends the mouse operation (clicking) information to the mixed reality (MR) generator 130 via the communication section 124. The mixed reality (MR) generator 130 receives the mouse operation information via the communication section 134 and passes the same information to the virtual object management module 132.
  • In step S302, the virtual object management module 132 determines whether or not the object area of the specified real object includes an area outside the PC display section and whether that area is located within the imaging range of the camera. The camera here is the camera 142 worn by the user 100. If the determination in step S302 is No, the process is terminated. When the determination in step S302 is Yes, the process proceeds to step S303.
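  • A compact sketch of this determination is given below. The rectangle representation and the common display-plane coordinate system are assumptions made for illustration; the patent does not prescribe a particular geometric test.

```python
# Illustrative test for step S302: proceed only if part of the specified object's
# area lies outside the PC display section and the object is within the imaging
# range of the camera 142. Rectangles are (left, top, right, bottom) tuples.
def step_s302_yes(obj, display, camera_view):
    def contains(outer, inner):
        return (outer[0] <= inner[0] and outer[1] <= inner[1] and
                inner[2] <= outer[2] and inner[3] <= outer[3])

    def overlaps(a, b):
        return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

    partly_outside = not contains(display, obj)   # some of the object is off-screen
    in_camera_range = overlaps(camera_view, obj)  # the object appears in the captured image
    return partly_outside and in_camera_range     # Yes -> go to step S303
```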
  • In step S303, an image including the mouse-specified object is captured by the camera 142 worn by the user 100, and the captured image is stored in the memory. This process is performed under control of the virtual object management module 132.
  • The process steps from step S304 to step S306 are designed to acquire object information from the object information database 136, while those from step S307 to step S308 are designed to acquire object information from the server 140. Either one or both of these processes may be performed.
  • A description will be given first of the process steps from step S304 to S306 adapted to acquire object information from the object information database 136.
  • In step S304, the object information database (DB) 136 is searched using the mouse-specified object image stored in the memory as a search key. This process is performed by the object information acquisition section 135.
  • Image data of a variety of real objects and the object information associated with each piece of image data are registered in the object information database (DB) 136. For example, a photograph of a CD jacket is registered as image data together with object information such as the song titles and price of that CD.
  • In step S305, the object information acquisition section 135 searches the object information database (DB) 136. That is, the object information acquisition section 135 determines whether or not any image data registered in the object information database (DB) 136 matches or is similar to the mouse-specified object image. The process is terminated if no matching or similar registered image is extracted. The process proceeds to step S306 when a matching or similar registered image is extracted.
  • In step S306, the object information acquisition section 135 acquires, from the object information database (DB) 136, the registered data for the registered image matching or similar to the mouse-specified object image, i.e., the object image and object information.
  • A description will be given next of the process steps from step S307 to step S308 using the server 140. In step S307, the object information acquisition section 135 transmits the mouse-specified object image stored in the memory to the server 140 via the communication section 134.
  • In step S308, the object information acquisition section 135 acquires, from the server 140, the object image and object information selected based on the information registered in the server. The server 140 performs the same process as the object information acquisition section 135, searching the database of the server 140 using the mouse-specified object image as a search key and extracting the object information. It should be noted that an error message is returned if the object information cannot be extracted.
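  • The server branch can be pictured as a simple request/response exchange. The JSON-over-HTTP interface, the URL and the field names below are assumptions; the text does not specify the protocol between the mixed reality (MR) generator 130 and the server 140.

```python
# Hypothetical sketch of steps S307-S308: send the mouse-specified object image
# to the server and receive either object information or an error notification.
import base64
import json
import urllib.request

def query_object_info(server_url: str, image_bytes: bytes) -> dict:
    payload = json.dumps({"image": base64.b64encode(image_bytes).decode()}).encode()
    request = urllib.request.Request(server_url, data=payload,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        result = json.loads(response.read())
    if "error" in result:                       # server could not extract object information
        raise RuntimeError(result["error"])
    return result                               # e.g. {"title": ..., "artist": ..., "price": ...}
```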
  • In step S309, the mixed reality (MR) generator 130 transmits, to the PC 120, the object information and object image data acquired from the server or database. It should be noted that the object image data may be that acquired from the server or database or the image captured by the camera 142.
  • The process step in the final step S310 is performed by the PC 120. In step S310, the data acquired from the mixed reality (MR) generator 130 is displayed on the PC display section by the application running on the PC 120.
  • As a result, the display data 450 shown in FIG. 13 appears on the display of the goggles 141 worn by the user 100. As described earlier, the object image 471 b and object information 471 c appearing in the PC image 451 are the data displayed on the display section 128 by the application execution section 125 of the PC 120. Therefore, the display data 450 other than the mouse cursor 480 a shown in FIG. 13 is information that can also be observed by those users not wearing any goggles.
  • The present invention has been described above in detail with reference to the particular embodiments. However, it is obvious that those skilled in the art can modify or substitute the embodiments without departing from the gist of the present invention. That is, the present invention has been disclosed by way of illustration and should not be interpreted in a limited manner. The appended claims should be taken into consideration for evaluation of the gist of the present invention.
  • On the other hand, the series of processes described in the specification may be performed by hardware, by software, or by a combination of both. If the series of processes is performed by software, the program containing the process sequence is installed for execution into the memory of a computer incorporated in dedicated hardware or into a general-purpose personal computer capable of performing various processes. For example, the program can be stored on a recording medium in advance. In addition to being installed into a computer from a recording medium, the program can be received via a network such as a LAN (Local Area Network) or the Internet and installed onto a recording medium such as a built-in hard disk.
  • It should be noted that each of the processes described in the specification may be performed not only chronologically according to the description but also in parallel or individually according to the processing capability of the device performing the process or as necessary. On the other hand, the term “system” in the present specification refers to a logical collection of a plurality of devices, and the constituent devices are not necessarily provided in the same enclosure.
  • INDUSTRIAL APPLICABILITY
  • As described above, in the configuration according to the embodiment of the present invention, a cursor or object lying in an area outside the area of the display section of a PC or other device is displayed as a virtual object. For example, the display of goggles worn by the user displays a display device such as a PC and the area outside the display device. The three-dimensional position of the cursor or object that has moved in response to user operation is calculated, after which the cursor or object is displayed as a virtual object at the calculated position. Further, object information for the object specified by the cursor is acquired and presented. The present configuration makes it possible to constantly observe and verify data that has moved outside the display section, thus providing improved data processing efficiency.
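  • As a concrete illustration of the placement step summarized above, the sketch below derives the plane of the display surface from the three-dimensional positions of three of its corners and extends the cursor's display coordinates onto that plane, so that a cursor driven off-screen still receives a well-defined three-dimensional position. The corner-based parameterization and the variable names are assumptions for illustration, not the disclosed implementation.

```python
# Illustrative computation: place a (possibly off-screen) cursor on the plane
# containing the display surface, using three corner positions obtained from
# three-dimensional analysis of the camera-captured image.
import numpy as np

def cursor_3d_position(top_left, top_right, bottom_left, cursor_xy,
                       screen_w=1920, screen_h=1080):
    tl, tr, bl = (np.asarray(p, dtype=float) for p in (top_left, top_right, bottom_left))
    x_axis = (tr - tl) / screen_w        # 3D displacement per horizontal pixel
    y_axis = (bl - tl) / screen_h        # 3D displacement per vertical pixel
    u, v = cursor_xy                     # may exceed the screen size (off-screen cursor)
    return tl + u * x_axis + v * y_axis  # lies on the display plane, possibly outside it

# Example: cursor driven 300 px past the right edge of a 1920x1080 display.
position = cursor_3d_position((0.0, 1.0, 2.0), (1.0, 1.0, 2.0), (0.0, 0.4, 2.0),
                              cursor_xy=(2220, 500))
```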
  • DESCRIPTION OF THE REFERENCE SYMBOLS
    • 10 Display section
    • 11 Mouse cursor
    • 12 Movement line
    • 21 Object
    • 22 Movement line
    • 31 a, 31 b Objects
    • 31 c Object information
    • 100 User
    • 120 PC (personal computer)
    • 121 Mouse driver
    • 122 Mouse coordinate processing module
    • 123 GUI section
    • 124 Communication section
    • 125 Application execution section
    • 126 Control section
    • 127 Memory
    • 128 Display section
    • 129 Mouse
    • 130 Mixed reality (MR) generator
    • 131 Three-dimensional information analysis section
    • 132 Virtual object management module
    • 133 Memory
    • 134 Communication section
    • 135 Object information acquisition section
    • 136 Object information database
    • 140 Server
    • 141 Goggles
    • 142 Camera
    • 150 Display data
    • 151 PC image
    • 152 Mouse cursor
    • 200 Display section
    • 201 Markers
    • 211 Mouse cursor
    • 250 Display data
    • 251 PC image
    • 252 Object
    • 271 Mouse cursor
    • 301 Object
    • 450 Display data
    • 451 PC image
    • 471 a, 471 b Objects
    • 471 c Object information

Claims (9)

1. An information processor comprising:
a coordinate processing module adapted to determine whether the position of a cursor, i.e., a position indicator displayed on a first display section, is located in or outside the area of the first display section and output cursor position information to a virtual object management section if the cursor is located outside the area of the first display section;
a camera adapted to capture an image made up of a real object including the first display section;
a three-dimensional information analysis section adapted to analyze the three-dimensional position of the real object included in a camera-captured image;
a second display section adapted to display the camera-captured image; and
a virtual object management section adapted to generate a virtual object different from the real object included in the camera-captured image and generate a composite image including the generated virtual object and the real object so as to display the composite image on the second display section, wherein
the virtual object management section calculates the three-dimensional position of the cursor based on the cursor position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the cursor is placed at the calculated position as a virtual object.
2. The information processor of claim 1 comprising:
an application execution section adapted to process a specified object specified by the position indicator, wherein
the application execution section determines whether the specified object is located in or outside the area of the first display section and outputs object position information to the virtual object management section if the specified object is located outside the area of the first display section, and wherein
the virtual object management section calculates the three-dimensional position of the object based on the object position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the object is placed at the calculated position as a virtual object.
3. The information processor of claim 2, wherein
if the three-dimensional position of the object calculated based on the object position information supplied from the coordinate processing module includes the display area of the first display section, the virtual object management section displays, on the second display section, a composite image with an object area image overlapping the display area of the first display section deleted.
4. The information processor of claim 1 further comprising:
an object information acquisition section adapted to acquire image data of a real object specified by the cursor placed as the virtual object and search data based on the acquired image data so as to acquire object information, wherein
the object information acquisition section outputs the acquired object information to the first display section as display data.
5. The information processor of claim 4, wherein
the object information acquisition section accesses a database in which real object image data and object information are associated with each other or a server so as to acquire object information through a search based on the real object image data.
6. The information processor of claim 1, wherein
the virtual object management section calculates a plane including the display surface of the first display section based on three-dimensional position information of components making up the first display section included in the camera-captured image and calculates the three-dimensional position of the cursor so that the cursor position is placed on the plane.
7. The information processor of claim 1, wherein
the cursor is a mouse cursor that moves by mouse operation, and
the coordinate processing module receives mouse cursor displacement information resulting from the mouse operation and determines whether the mouse cursor is located in or outside the area of the first display section.
8. An information processing method performed by an information processor, the information processing method comprising:
a coordinate processing step of a coordinate processing module determining whether the position of a cursor, i.e., a position indicator displayed on a first display section, is located in or outside the area of the first display section and outputting cursor position information to a virtual object management section if the cursor is located outside the area of the first display section;
an image capture step of a camera capturing an image of a real object including the first display section;
a three-dimensional information analysis step of a three-dimensional information analysis section analyzing the three-dimensional position of the real object included in a camera-captured image; and
a virtual object management step of a virtual object management section generating a virtual object different from the real object included in the camera-captured image and generating a composite image including the generated virtual object and the real object so as to display the composite image on the second display section, wherein
the virtual object management step is a step of calculating the three-dimensional position of a cursor based on cursor position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the cursor is placed at the calculated position as a virtual object.
9. A program causing an information processor to process information, the program comprising:
a coordinate processing step of causing a coordinate processing module to determine whether the position of a cursor, i.e., a position indicator displayed on a first display section, is located in or outside the area of the first display section and output cursor position information to a virtual object management section if the cursor is located outside the area of the first display section;
an image capture step of causing a camera to capture an image of a real object including the first display section;
a three-dimensional information analysis step of causing a three-dimensional information analysis section to analyze the three-dimensional position of the real object included in a camera-captured image; and
a virtual object management step of causing a virtual object management section to generate a virtual object different from the real object included in the camera-captured image and generate a composite image including the generated virtual object and the real object so as to display the composite image on the second display section, wherein
the virtual object management step is a step of causing the virtual object management section to calculate the three-dimensional position of a cursor based on cursor position information supplied from the coordinate processing module so as to display, on the second display section, a composite image in which the cursor is placed at the calculated position as a virtual object.
US13/383,511 2009-07-21 2010-06-30 Information processor, processing method and program for displaying a virtual image Active 2031-01-23 US8751969B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2009-170118 2009-07-21
JP2009170118A JP5263049B2 (en) 2009-07-21 2009-07-21 Information processing apparatus, information processing method, and program
PCT/JP2010/061161 WO2011010533A1 (en) 2009-07-21 2010-06-30 Information processing device, information processing method, and program

Publications (2)

Publication Number Publication Date
US20120124509A1 true US20120124509A1 (en) 2012-05-17
US8751969B2 US8751969B2 (en) 2014-06-10

Family

ID=43499009

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/383,511 Active 2031-01-23 US8751969B2 (en) 2009-07-21 2010-06-30 Information processor, processing method and program for displaying a virtual image

Country Status (9)

Country Link
US (1) US8751969B2 (en)
EP (1) EP2458486A1 (en)
JP (1) JP5263049B2 (en)
KR (1) KR20120069654A (en)
CN (1) CN102473068B (en)
BR (1) BR112012000913A2 (en)
RU (1) RU2524836C2 (en)
TW (1) TW201108037A (en)
WO (1) WO2011010533A1 (en)

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130290876A1 (en) * 2011-12-20 2013-10-31 Glen J. Anderson Augmented reality representations across multiple devices
US20130328928A1 (en) * 2012-06-12 2013-12-12 Sony Computer Entertainment Inc. Obstacle avoidance apparatus and obstacle avoidance method
US20140053090A1 (en) * 2008-12-19 2014-02-20 Microsoft Corporation Interactive virtual display system
US9182827B2 (en) 2011-03-31 2015-11-10 Sony Corporation Information processing apparatus, image display apparatus, and information processing method
US9215439B2 (en) 2011-07-06 2015-12-15 Sony Corporation Apparatus and method for arranging emails in depth positions for display
US20160227050A1 (en) * 2015-01-30 2016-08-04 Konica Minolta, Inc. Data input system, data input apparatus, data input method, and non-transitory computer-readable recording medium encoded with data input program
US10275112B2 (en) * 2014-04-28 2019-04-30 Fujitsu Component Limited Computer readable medium, relay device and information processing device
US10466835B2 (en) 2015-03-27 2019-11-05 Fujitsu Limited Display method and display control apparatus
CN110663011A (en) * 2017-05-23 2020-01-07 Pcms控股公司 System and method for prioritizing AR information based on persistence of real-life objects in a user view
US20200027278A1 (en) * 2011-10-27 2020-01-23 Sony Corporation Image processing apparatus, image processing method, and program
US20210019911A1 (en) * 2017-12-04 2021-01-21 Sony Corporation Information processing device, information processing method, and recording medium
WO2021061351A1 (en) * 2019-09-26 2021-04-01 Apple Inc. Wearable electronic device presenting a computer-generated reality environment
US10983663B2 (en) * 2017-09-29 2021-04-20 Apple Inc. Displaying applications
US11049322B2 (en) * 2018-06-18 2021-06-29 Ptc Inc. Transferring graphic objects between non-augmented reality and augmented reality media domains
US11055918B2 (en) * 2019-03-15 2021-07-06 Sony Interactive Entertainment Inc. Virtual character inter-reality crossover
US11366514B2 (en) 2018-09-28 2022-06-21 Apple Inc. Application placement based on head position
US20220197394A1 (en) * 2020-12-17 2022-06-23 Samsung Electronics Co., Ltd. Electronic device and operation method thereof
US20230103022A1 (en) * 2023-08-28 2023-03-30 International Business Machines Corporation Mobile computing device projected visualization interaction
US20230140701A1 (en) * 2013-06-26 2023-05-04 Touchcast, Inc. System and method for providing and interacting with coordinated presentations
US11644940B1 (en) 2019-01-31 2023-05-09 Splunk Inc. Data visualization in an extended reality environment
US11800059B2 (en) 2019-09-27 2023-10-24 Apple Inc. Environment for remote communication
US11853533B1 (en) * 2019-01-31 2023-12-26 Splunk Inc. Data visualization workspace in an extended reality environment
US11960641B2 (en) 2022-06-21 2024-04-16 Apple Inc. Application placement based on head position

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6044079B2 (en) 2012-02-06 2016-12-14 ソニー株式会社 Information processing apparatus, information processing method, and program
JP2013174642A (en) * 2012-02-23 2013-09-05 Toshiba Corp Image display device
CN104683683A (en) * 2013-11-29 2015-06-03 英业达科技有限公司 System for shooting images and method thereof
JP6357843B2 (en) * 2014-04-10 2018-07-18 凸版印刷株式会社 Application inspection system, application inspection apparatus, and application inspection program
US10311638B2 (en) 2014-07-25 2019-06-04 Microsoft Technology Licensing, Llc Anti-trip when immersed in a virtual reality environment
US10416760B2 (en) 2014-07-25 2019-09-17 Microsoft Technology Licensing, Llc Gaze-based object placement within a virtual reality environment
US9865089B2 (en) 2014-07-25 2018-01-09 Microsoft Technology Licensing, Llc Virtual reality environment with real world objects
US9766460B2 (en) 2014-07-25 2017-09-19 Microsoft Technology Licensing, Llc Ground plane adjustment in a virtual reality environment
US9858720B2 (en) 2014-07-25 2018-01-02 Microsoft Technology Licensing, Llc Three-dimensional mixed-reality viewport
US10451875B2 (en) 2014-07-25 2019-10-22 Microsoft Technology Licensing, Llc Smart transparency for virtual objects
US9904055B2 (en) 2014-07-25 2018-02-27 Microsoft Technology Licensing, Llc Smart placement of virtual objects to stay in the field of view of a head mounted display
US20160027214A1 (en) * 2014-07-25 2016-01-28 Robert Memmott Mouse sharing between a desktop and a virtual world
US10393312B2 (en) 2016-12-23 2019-08-27 Realwear, Inc. Articulating components for a head-mounted display
US10437070B2 (en) 2016-12-23 2019-10-08 Realwear, Inc. Interchangeable optics for a head-mounted display
US11099716B2 (en) 2016-12-23 2021-08-24 Realwear, Inc. Context based content navigation for wearable display
US11507216B2 (en) 2016-12-23 2022-11-22 Realwear, Inc. Customizing user interfaces of binary applications
US10620910B2 (en) 2016-12-23 2020-04-14 Realwear, Inc. Hands-free navigation of touch-based operating systems
US10936872B2 (en) 2016-12-23 2021-03-02 Realwear, Inc. Hands-free contextually aware object interaction for wearable display
JP2021140085A (en) 2020-03-06 2021-09-16 富士フイルムビジネスイノベーション株式会社 Information processing device and program
JP2022098268A (en) 2020-12-21 2022-07-01 富士フイルムビジネスイノベーション株式会社 Information processing device and program
EP4295314A1 (en) 2021-02-08 2023-12-27 Sightful Computers Ltd Content sharing in extended reality
JP2024509722A (en) 2021-02-08 2024-03-05 サイトフル コンピューターズ リミテッド User interaction in extended reality
EP4288856A1 (en) 2021-02-08 2023-12-13 Sightful Computers Ltd Extended reality for productivity
WO2023009580A2 (en) 2021-07-28 2023-02-02 Multinarity Ltd Using an extended reality appliance for productivity
WO2023028571A1 (en) * 2021-08-27 2023-03-02 Chinook Labs Llc System and method of augmented representation of an electronic device
US20230334795A1 (en) 2022-01-25 2023-10-19 Multinarity Ltd Dual mode presentation of user interface elements
US11948263B1 (en) 2023-03-14 2024-04-02 Sightful Computers Ltd Recording the complete physical and extended reality environments of a user

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5305435A (en) * 1990-07-17 1994-04-19 Hewlett-Packard Company Computer windows management system and method for simulating off-screen document storage and retrieval
US6297804B1 (en) * 1998-08-13 2001-10-02 Nec Corporation Pointing apparatus
US7103841B2 (en) * 2001-05-08 2006-09-05 Nokia Corporation Method and arrangement for providing an expanded desktop
US20070052672A1 (en) * 2005-09-08 2007-03-08 Swisscom Mobile Ag Communication device, system and method
JP2008304268A (en) * 2007-06-06 2008-12-18 Sony Corp Information processor, information processing method, and computer program
WO2009072504A1 (en) * 2007-12-07 2009-06-11 Sony Corporation Control device, input device, control system, control method, and hand-held device
US20090221368A1 (en) * 2007-11-28 2009-09-03 Ailive Inc., Method and system for creating a shared game space for a networked game
US20090237564A1 (en) * 2008-03-18 2009-09-24 Invism, Inc. Interactive immersive virtual reality and simulation
US20100164990A1 (en) * 2005-08-15 2010-07-01 Koninklijke Philips Electronics, N.V. System, apparatus, and method for augmented reality glasses for end-user programming
US20110167379A1 (en) * 1999-04-06 2011-07-07 Microsoft Corporation Method and apparatus for providing a three-dimensional task gallery computer interface
US8176434B2 (en) * 2008-05-12 2012-05-08 Microsoft Corporation Virtual desktop view scrolling
US8310445B2 (en) * 2004-11-18 2012-11-13 Canon Kabushiki Kaisha Remote-control system, remote-control apparatus, apparatus to be controlled, remote-control method, computer program, and storage medium
US20120329558A1 (en) * 2006-09-14 2012-12-27 Nintendo Co., Ltd. Method and apparatus for using a common pointing input to control 3d viewpoint and object targeting

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4178697B2 (en) * 1999-11-18 2008-11-12 ソニー株式会社 Portable information processing terminal, information input / output system, and information input / output method
FI117488B (en) * 2001-05-16 2006-10-31 Myorigo Sarl Browsing information on screen
US7369102B2 (en) * 2003-03-04 2008-05-06 Microsoft Corporation System and method for navigating a graphical user interface on a smaller display
JP2006154902A (en) * 2004-11-25 2006-06-15 Olympus Corp Hand-written image display system, and portable information terminal for space hand-writing
US7843470B2 (en) * 2005-01-31 2010-11-30 Canon Kabushiki Kaisha System, image processing apparatus, and information processing method
SE0601216L (en) * 2006-05-31 2007-12-01 Abb Technology Ltd Virtual workplace
JP2008304269A (en) 2007-06-06 2008-12-18 Sony Corp Information processor, information processing method, and computer program
JP4909851B2 (en) * 2007-09-25 2012-04-04 日立アプライアンス株式会社 Washing and drying machine

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5305435A (en) * 1990-07-17 1994-04-19 Hewlett-Packard Company Computer windows management system and method for simulating off-screen document storage and retrieval
US6297804B1 (en) * 1998-08-13 2001-10-02 Nec Corporation Pointing apparatus
US20110167379A1 (en) * 1999-04-06 2011-07-07 Microsoft Corporation Method and apparatus for providing a three-dimensional task gallery computer interface
US7103841B2 (en) * 2001-05-08 2006-09-05 Nokia Corporation Method and arrangement for providing an expanded desktop
US8310445B2 (en) * 2004-11-18 2012-11-13 Canon Kabushiki Kaisha Remote-control system, remote-control apparatus, apparatus to be controlled, remote-control method, computer program, and storage medium
US20100164990A1 (en) * 2005-08-15 2010-07-01 Koninklijke Philips Electronics, N.V. System, apparatus, and method for augmented reality glasses for end-user programming
US20070052672A1 (en) * 2005-09-08 2007-03-08 Swisscom Mobile Ag Communication device, system and method
US20120329558A1 (en) * 2006-09-14 2012-12-27 Nintendo Co., Ltd. Method and apparatus for using a common pointing input to control 3d viewpoint and object targeting
JP2008304268A (en) * 2007-06-06 2008-12-18 Sony Corp Information processor, information processing method, and computer program
US20090221368A1 (en) * 2007-11-28 2009-09-03 Ailive Inc., Method and system for creating a shared game space for a networked game
WO2009072504A1 (en) * 2007-12-07 2009-06-11 Sony Corporation Control device, input device, control system, control method, and hand-held device
US20090237564A1 (en) * 2008-03-18 2009-09-24 Invism, Inc. Interactive immersive virtual reality and simulation
US8176434B2 (en) * 2008-05-12 2012-05-08 Microsoft Corporation Virtual desktop view scrolling

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140053090A1 (en) * 2008-12-19 2014-02-20 Microsoft Corporation Interactive virtual display system
US9323429B2 (en) * 2008-12-19 2016-04-26 Microsoft Technology Licensing, Llc Interactive virtual display system
US9182827B2 (en) 2011-03-31 2015-11-10 Sony Corporation Information processing apparatus, image display apparatus, and information processing method
US9215439B2 (en) 2011-07-06 2015-12-15 Sony Corporation Apparatus and method for arranging emails in depth positions for display
US11468647B2 (en) * 2011-10-27 2022-10-11 Sony Corporation Image processing apparatus, image processing method, and program
US20200027278A1 (en) * 2011-10-27 2020-01-23 Sony Corporation Image processing apparatus, image processing method, and program
US10902682B2 (en) * 2011-10-27 2021-01-26 Sony Corporation Image processing apparatus, image processing method, and program
US11941766B2 (en) 2011-10-27 2024-03-26 Sony Group Corporation Image processing apparatus, image processing method, and program
EP2795893A4 (en) * 2011-12-20 2015-08-19 Intel Corp Augmented reality representations across multiple devices
US20130290876A1 (en) * 2011-12-20 2013-10-31 Glen J. Anderson Augmented reality representations across multiple devices
US9952820B2 (en) * 2011-12-20 2018-04-24 Intel Corporation Augmented reality representations across multiple devices
US9599818B2 (en) * 2012-06-12 2017-03-21 Sony Corporation Obstacle avoidance apparatus and obstacle avoidance method
US20130328928A1 (en) * 2012-06-12 2013-12-12 Sony Computer Entertainment Inc. Obstacle avoidance apparatus and obstacle avoidance method
US20230140701A1 (en) * 2013-06-26 2023-05-04 Touchcast, Inc. System and method for providing and interacting with coordinated presentations
US10275112B2 (en) * 2014-04-28 2019-04-30 Fujitsu Component Limited Computer readable medium, relay device and information processing device
US9860395B2 (en) * 2015-01-30 2018-01-02 Konica Minolta, Inc. Data input system, data input apparatus, data input method, and non-transitory computer-readable recording medium encoded with data input program
US20160227050A1 (en) * 2015-01-30 2016-08-04 Konica Minolta, Inc. Data input system, data input apparatus, data input method, and non-transitory computer-readable recording medium encoded with data input program
US10466835B2 (en) 2015-03-27 2019-11-05 Fujitsu Limited Display method and display control apparatus
CN110663011A (en) * 2017-05-23 2020-01-07 Pcms控股公司 System and method for prioritizing AR information based on persistence of real-life objects in a user view
US10983663B2 (en) * 2017-09-29 2021-04-20 Apple Inc. Displaying applications
US20210019911A1 (en) * 2017-12-04 2021-01-21 Sony Corporation Information processing device, information processing method, and recording medium
US11049322B2 (en) * 2018-06-18 2021-06-29 Ptc Inc. Transferring graphic objects between non-augmented reality and augmented reality media domains
US11562544B2 (en) 2018-06-18 2023-01-24 Ptc Inc. Transferring graphic objects between non-augmented reality and augmented reality media domains
US11366514B2 (en) 2018-09-28 2022-06-21 Apple Inc. Application placement based on head position
US11853533B1 (en) * 2019-01-31 2023-12-26 Splunk Inc. Data visualization workspace in an extended reality environment
US11644940B1 (en) 2019-01-31 2023-05-09 Splunk Inc. Data visualization in an extended reality environment
US11055918B2 (en) * 2019-03-15 2021-07-06 Sony Interactive Entertainment Inc. Virtual character inter-reality crossover
US11521581B2 (en) 2019-09-26 2022-12-06 Apple Inc. Controlling displays
WO2021061351A1 (en) * 2019-09-26 2021-04-01 Apple Inc. Wearable electronic device presenting a computer-generated reality environment
US11893964B2 (en) 2019-09-26 2024-02-06 Apple Inc. Controlling displays
US11800059B2 (en) 2019-09-27 2023-10-24 Apple Inc. Environment for remote communication
US20220197394A1 (en) * 2020-12-17 2022-06-23 Samsung Electronics Co., Ltd. Electronic device and operation method thereof
US11822728B2 (en) * 2020-12-17 2023-11-21 Samsung Electronics Co., Ltd. Electronic device and operation method thereof
US11960641B2 (en) 2022-06-21 2024-04-16 Apple Inc. Application placement based on head position
US20230103022A1 (en) * 2023-08-28 2023-03-30 International Business Machines Corporation Mobile computing device projected visualization interaction
US11822941B2 (en) * 2023-08-28 2023-11-21 International Business Machines Corporation Mobile computing device projected visualization interaction

Also Published As

Publication number Publication date
BR112012000913A2 (en) 2016-03-01
CN102473068B (en) 2013-12-25
KR20120069654A (en) 2012-06-28
CN102473068A (en) 2012-05-23
US8751969B2 (en) 2014-06-10
RU2524836C2 (en) 2014-08-10
EP2458486A1 (en) 2012-05-30
JP2011028309A (en) 2011-02-10
RU2012101245A (en) 2013-07-27
TW201108037A (en) 2011-03-01
WO2011010533A1 (en) 2011-01-27
JP5263049B2 (en) 2013-08-14

Similar Documents

Publication Publication Date Title
US8751969B2 (en) Information processor, processing method and program for displaying a virtual image
US9639988B2 (en) Information processing apparatus and computer program product for processing a virtual object
Kato et al. Marker tracking and hmd calibration for a video-based augmented reality conferencing system
US9396215B2 (en) Search device, search method, recording medium, and program
EP2490182A1 (en) authoring of augmented reality
Andersen et al. Virtual annotations of the surgical field through an augmented reality transparent display
US10185394B2 (en) Gaze direction mapping
JP7026825B2 (en) Image processing methods and devices, electronic devices and storage media
CN104081307A (en) Image processing apparatus, image processing method, and program
JP2013164697A (en) Image processing device, image processing method, program and image processing system
CN108027655A (en) Information processing system, information processing equipment, control method and program
CN108430032B (en) Method and equipment for realizing position sharing of VR/AR equipment
CN112667179A (en) Remote synchronous collaboration system based on mixed reality
JP2018010599A (en) Information processor, panoramic image display method, panoramic image display program
JP2016122392A (en) Information processing apparatus, information processing system, control method and program of the same
WO2024012268A1 (en) Virtual operation method and apparatus, electronic device, and readable storage medium
JP2015184986A (en) Compound sense of reality sharing device
JP5998952B2 (en) Sign image placement support apparatus and program
JP6304305B2 (en) Image processing apparatus, image processing method, and program
JP5472509B2 (en) Information processing apparatus, information processing method, and information recording medium
US20200192472A1 (en) Gaze Direction Mapping
CN117453037A (en) Interactive method, head display device, electronic device and readable storage medium
JP3392080B2 (en) User interface method, user interface device, space drawing device, workstation device, and program storage medium
Kang Design and Implementation of Digital Map Products Contributing GIS Perspective based on Cloud Computing

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MATSUDA, KOUICHI;FUKUCHI, MASAKI;SIGNING DATES FROM 20111030 TO 20111203;REEL/FRAME:027521/0856

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8