WO2014071333A2 - Constructing augmented reality environment with pre-computed lighting - Google Patents

Constructing augmented reality environment with pre-computed lighting

Info

Publication number
WO2014071333A2
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
virtual structure
lighting
local environment
physical
Application number
PCT/US2013/068363
Other languages
French (fr)
Other versions
WO2014071333A3 (en)
Inventor
Jonathan Steed
Aaron Krauss
Mike Scavezze
Wei Zhang
Arthur Tomlin
Tony Ambrus
Brian Mount
Stephen Latta
Ryan Hastings
Original Assignee
Jonathan Steed
Aaron Krauss
Mike Scavezze
Wei Zhang
Arthur Tomlin
Tony Ambrus
Brian Mount
Stephen Latta
Ryan Hastings
Application filed by Jonathan Steed, Aaron Krauss, Mike Scavezze, Wei Zhang, Arthur Tomlin, Tony Ambrus, Brian Mount, Stephen Latta, and Ryan Hastings
Publication of WO2014071333A2
Publication of WO2014071333A3

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/50: Lighting effects
    • G06T2215/00: Indexing scheme for image rendering
    • G06T2215/16: Using real world measurements to influence rendering

Definitions

  • Figure 8 schematically shows a non-limiting embodiment of a computing system 800 that can perform one or more of the methods and processes described above.
  • Computing system 800 is shown in simplified form, and as mentioned above may represent any suitable device and/or combination of devices, including but not limited to those described above with reference to Figures 1-7.
  • Computing system 800 includes a logic subsystem 802 and a storage subsystem 804.
  • Computing system 800 may optionally include a display subsystem 806, input device subsystem 808, communication subsystem 810, and/or other components not shown in Figure 8.
  • Computing system 800 may also optionally include or interface with one or more user input devices such as the above-described eye tracking system, as well as a keyboard, mouse, game controller, camera (depth and/or two-dimensional), microphone, and/or touch screen, for example.
  • User-input devices may form part of input device subsystem 808 or may interface with input device subsystem 808.
  • Logic subsystem 802 includes one or more physical devices configured to execute instructions.
  • The logic subsystem may be configured to execute machine-readable instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, or otherwise arrive at a desired result.
  • Logic subsystem 802 may include one or more processors configured to execute software instructions. Additionally or alternatively, logic subsystem 802 may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The processors of logic subsystem 802 may be single-core or multi-core, and the programs executed thereon may be configured for sequential, parallel or distributed processing. Logic subsystem 802 may optionally include individual components that are distributed among two or more devices, which can be remotely located and/or configured for coordinated processing. Aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud-computing configuration.
  • Storage subsystem 804 includes one or more physical, non-transitory, computer-readable storage devices configured to hold data and/or instructions executable by the logic subsystem to implement the herein-described methods and processes. When such methods and processes are implemented, the state of storage subsystem 804 may be transformed (e.g., to hold different data).
  • Storage subsystem 804 may include removable media and/or built-in devices.
  • Storage subsystem 804 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others.
  • Storage subsystem 804 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices.
  • Logic subsystem 802 and storage subsystem 804 may be integrated into one or more unitary devices, such as an application-specific integrated circuit (ASIC), or a system-on-a-chip.
  • Storage subsystem 804 includes one or more physical, non-transitory devices. In contrast, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
  • The term "program" may be used to describe an aspect of computing system 800 implemented to perform a particular function. A program may be instantiated via logic subsystem 802 executing instructions held by storage subsystem 804. It will be understood that different programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The term "program" may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
  • A "service", as used herein, is an application program executable across multiple user sessions. A service may be available to one or more system components, programs, and/or other services. In some implementations, a service may run on one or more server-computing devices.
  • Display subsystem 806 may be used to present a visual representation of data held by storage subsystem 804. This visual representation may take the form of a graphical user interface (GUI). The state of display subsystem 806 may likewise be transformed to visually represent changes in the underlying data.
  • Display subsystem 806 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 802 and/or storage subsystem 804 in a shared enclosure, or such display devices may be peripheral display devices.
  • Communication subsystem 810 may be configured to communicatively couple computing system 800 with one or more other computing devices. Communication subsystem 810 may include wired and/or wireless communication devices compatible with one or more different communication protocols. The communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. The communication subsystem may allow computing system 800 to send and/or receive messages to and/or from other devices via a network such as the Internet.

Abstract

Embodiments related to efficiently constructing an augmented reality environment with global illumination effects are disclosed. For example, one disclosed embodiment provides a method of displaying an augmented reality image via a display device. The method includes receiving image data capturing an image of a local environment of the display device, and identifying a physical feature of the local environment via the image data. The method further includes constructing an augmented reality image of a virtual structure for display over the physical feature in spatial registration with the physical feature from a viewpoint of a user, the augmented reality image comprising a plurality of modular virtual structure segments arranged in adjacent locations to form the virtual structure, each modular virtual structure segment comprising a pre-computed global illumination effect, and outputting the augmented reality image to the display device.

Description

CONSTRUCTING AUGMENTED REALITY ENVIRONMENT WITH
PRE-COMPUTED LIGHTING
BACKGROUND
[0001] The addition of realistic lighting and shadows to virtual environments, such as a virtual video game environment, may be computationally expensive. As such, rendering times for lighting effects may be unacceptably long for use during video game play. For example, the creation of texture maps that encode realistic lighting (e.g. global illumination) and shadows ("light maps") on a virtual environment may take hours, or even days, to compute. Thus, such lighting effects are generally pre-computed for a virtual environment during development of the virtual environment, rather than being calculated in real-time during game play.
[0002] Dynamic lighting and shadowing may be computed more quickly. However, the visual quality of dynamic lighting may be much lower than that of pre-computed lighting effects. Further, dynamic lighting may utilize significant resources at run-time.
SUMMARY
[0003] Various embodiments are disclosed that relate to efficiently constructing an augmented reality environment with global illumination effects. For example, one disclosed embodiment provides a method of displaying an augmented reality image via a display device. The method includes receiving image data capturing an image of a local environment of the display device, and identifying a physical feature of the local environment via the image data. The method further includes constructing an augmented reality image of a virtual structure for display over the physical feature in spatial registration with the physical feature from a viewpoint of a user, the augmented reality image comprising a plurality of modular virtual structure segments arranged in adjacent locations to form the virtual structure, each modular virtual structure segment comprising a pre-computed lighting effect, and outputting the augmented reality image to the display device.
[0004] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] Figure 1 shows an example embodiment of a see-through display device in an example use environment.
[0006] Figure 2 shows an embodiment of an augmented reality image in the example use environment of Figure 1.
[0007] Figure 3 shows an example embodiment of a set of modular virtual structure segments.
[0008] Figure 4 shows a schematic depiction of a pre-computed lighting effect being applied to a portion of a modular virtual structure segment.
[0009] Figure 5A illustrates an addition of a dynamic point light effect to the augmented reality image of Figure 2, and Figure 5B illustrates an example of the effect of the dynamic point light effect on a portion of a modular virtual structure segment having a pre-computed lighting effect.
[0010] Figure 6 shows a flow diagram depicting an embodiment of a method for constructing a virtual environment fit to a detected physical environment.
[0011] Figure 7 shows a block diagram of an example embodiment of a see-through display device.
[0012] Figure 8 shows a block diagram of an example embodiment of a computing system.
DETAILED DESCRIPTION
[0013] As mentioned above, realistic lighting effects for a virtual environment are generally pre-computed after the virtual environment has been constructed, and then stored, for example, as light maps for the virtual environment. Such virtual environments are commonly built with fixed geometries that do not adapt to a user's surroundings.
[0014] In contrast, an augmented reality display system may be configured to adapt a virtual image to a user's surroundings. For example, an augmented reality video game may fit virtual structures in the game to corresponding physical structures of a physical environment of the user. Thus, the geometries of the augmented reality image objects may change based upon the physical environment of the user.
[0015] Because the fitting of the augmented reality environment to the physical environment occurs during real-time use, any lighting effects applied after building the environment would also have to be computed at that time. If high quality lighting effects were computed for an augmented reality environment after fitting augmented reality imagery to the physical environment, a user might have to wait from hours to days before playing an augmented reality experience, depending upon the particular computing system used to compute the lighting effects, due to the computational expense of applying realistic lighting effects. This may result in an unacceptably slow user experience. Further, the appearance of the physical environment may change during such a long delay, resulting in a mismatch between the real world and the virtual world that may significantly impact an augmented reality experience.
[0016] As one potential solution, dynamic lighting may be used in place of pre-computed lighting effects for an augmented reality environment. However, as mentioned above, dynamic lighting may be of lower quality than pre-computed lighting, and therefore may not provide as good of a user experience. Further, dynamic lighting may be computationally expensive at run-time, which may decrease computation budgets for other aspects of the experience, such as other visuals and game play.
[0017] Therefore, embodiments are disclosed herein that relate to the efficient construction of an augmented reality environment, with high quality pre-computed lighting effects, that is fit to the geometry of a local physical environment. Briefly, the disclosed embodiments utilize modular virtual structure segments that may be arranged adjacent to one another to form a virtual structure for an augmented reality image, wherein the modular virtual structure segments comprise high-quality pre-computed lighting effects. As the lighting effects are pre-computed for each modular virtual structure segment, the lighting effects will be included in a virtual structure constructed via the modular virtual structure segments. Further, in some embodiments, local lighting characteristics may be detected, and used to modulate an appearance of the modular virtual structure segments. Examples of such local lighting characteristics may include, but are not limited to, color characteristics and locations of light sources in the local physical environment.
[0018] Figure 1 shows an example embodiment of a use environment 100 for an augmented reality display system, in the form of a living room. A user 102 is shown viewing the living room through a see-through display device 104. Figure 1 also depicts a field of view 103 of the user, which represents a portion of the environment viewable through the see-through display device 104, and thus the portion of the environment that may be augmented with images displayed via the see-through display device 104. In some embodiments, the user's field of view 103 may be substantially coextensive with the user's actual field of vision, while in other embodiments the user's field of view 103 may occupy a lesser portion of the user's actual field of vision.

[0019] As will be described in greater detail below, see-through display device 104 may comprise one or more outwardly facing image sensors (e.g., two-dimensional cameras and/or depth cameras) configured to acquire image data (e.g. color/grayscale images, depth images/point cloud data/mesh data, etc.) representing use environment 100 as the user navigates the environment. This image data may be used to obtain information regarding the layout of the environment and structural features thereof, such as ceiling 106 and walls 108, as well as other features.
[0020] See-through display device 104 further is configured to overlay displayed virtual objects over physical objects viewable through the device to create an augmented reality image. For example, referring to Figure 2, an example augmented reality image is depicted in which virtual room framing structures 200, such as virtual wall studs 202, headers 204, etc., are displayed as an overlay on the user's wall. Infrastructure images, such as pipes 206, conduit/cables, etc., also may be displayed, as well as any other suitable virtual structures. Similar structures (not shown) may be displayed for the ceiling as well. Additionally, imagery may be displayed corresponding to furniture or other non-structural objects in a room, and as occupying an empty space within a room. It will be appreciated that, in order to illustrate the augmented reality environment more completely, the augmented reality image depicted in Figure 2 is not limited to the user's field of view shown in Figure 1.
[0021] The virtual wall framing structures of Figure 2 are geometrically fit to the underlying physical structures (e.g. walls 108). As the local physical environment of each user will likely differ, the overall virtual structures for each player's local physical environment are constructed upon acquiring image data (e.g. stereo depth image data, structured light image data, time-of-flight image data, or other depth image data) of the local physical environment, rather than being pre-designed. As such, if global illumination effects were to be applied to the virtual structures after constructing the virtual structures, a player may be forced to wait an undesirably long time before the activity could be played, and may be restricted from changing environments (e.g. walking into a different room) during a game, as building and applying lighting to the new environment may take an unsuitably long period of time.
[0022] Thus, as mentioned above, virtual structures 200 are assembled from sets of modular virtual structure segments with pre-computed lighting effects, wherein instances of the modular virtual structure segments may be arranged adjacent to each other and processed (e.g. rotated, scaled, etc.) to form the appearance of a unitary virtual structure. Figure 3 shows an example embodiment of a set 300 of modular virtual structure segments comprising a wall stud segment 302, a wall stud segment with a jointed pipe 304, a wall stud segment with a horizontal pipe 306, a pair of door frame segments 308, 310, and a pair of window frame segments 312, 314. Referring to Figure 2, it will be noted that virtual structures 200 may be constructed entirely from instances of virtual wall segments selected from set 300 as scaled, rotated, clipped, morphed, and/or otherwise appropriately processed based upon the placement of each particular instance.
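By way of illustration, the following Python sketch (not part of the patent disclosure; the segment names, the 0.6 m nominal width, and the scale-the-last-piece policy are all assumptions made for the example) shows one way instances of a modular segment could be tiled along a measured wall, with the final instance scaled so that the tiling exactly fits:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class SegmentInstance:
        kind: str             # e.g. "stud", "stud_pipe", "door_left" (illustrative)
        offset_m: float       # placement along the wall, in meters
        scale_x: float = 1.0  # horizontal scale applied to this instance

    SEGMENT_WIDTH_M = 0.6  # assumed nominal width of one modular piece

    def tile_wall(wall_length_m: float, kind: str = "stud") -> List[SegmentInstance]:
        """Fill a wall of the given length with modular segment instances.

        Whole segments are placed side by side; the last instance is scaled
        (clipping or morphing would be alternatives) to fill the remainder.
        """
        instances = []
        n_whole = int(wall_length_m // SEGMENT_WIDTH_M)
        for i in range(n_whole):
            instances.append(SegmentInstance(kind, offset_m=i * SEGMENT_WIDTH_M))
        remainder = wall_length_m - n_whole * SEGMENT_WIDTH_M
        if remainder > 1e-6:  # a partial piece is needed at the end of the wall
            instances.append(SegmentInstance(kind,
                                             offset_m=n_whole * SEGMENT_WIDTH_M,
                                             scale_x=remainder / SEGMENT_WIDTH_M))
        return instances

    # A 3.1 m wall -> five whole 0.6 m studs plus one instance scaled to 0.1 m.
    print(tile_wall(3.1))

Because each instance is only a transform of a pre-lit piece, the baked lighting travels with it; the tiling itself triggers no per-room lighting computation.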
[0023] While a relatively simple set of modular virtual structure segments is depicted in Figure 3, it will be understood that a set of modular virtual structure segments may have any suitable number and selection of segments of any desired complexity. Further, it will be understood that sets of modular virtual structure segments with pre-computed lighting may be used to build any suitable virtual structure fit to physical features other than walls, including but not limited to virtual structures that are fit to ceilings, as well as non-structural features such as furniture, wall hangings, plants, exterior objects, counter tops, bar tops, etc. For example, pre-lit modular virtual structure segments may include pre-lit virtual sofas, desks, televisions, mirrors, and other objects that are commonly found in physical environments, but that may have different shapes and/or appearances in different physical environments. It will be understood that virtual structure segments for such objects in some instances may comprise a single "segment," such that a single virtual structure element is resized, rotated, and otherwise processed to fit a desired physical structure without necessarily being combined with adjacent segments. Additionally, it will be understood that an empty space within a physical environment may be considered a physical feature of the environment, and that modular virtual structure segments may be arranged to construct a virtual object in an unoccupied portion of space within a room or other use environment.
[0024] Some modular virtual structure segments may comprise connectivity constraints that restrict a set of other segments that may be joined to the segment. For example, in Figure 3, each window segment and door segment may be joined on one side (e.g. the window/door side) to another window or door segment, and not a wall stud segment 302 on that side, or the segments would mate incorrectly. Further, jointed pipe 304 and horizontal pipe 306 segments may be restricted to being joined to segments with complementary pipe sections. It will be understood that these connectivity constraints are described for the purpose of example, and that any other suitable connectivity constraints may be used.
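Such constraints lend themselves to a simple adjacency table. The sketch below (illustrative only; the segment type names and the specific whitelist are invented for the example, not taken from the patent) checks a proposed left-to-right arrangement:

    # Hypothetical table: for each segment type, the set of types that may be
    # placed immediately to its right. Door/window halves must pair up, and
    # pipe segments must continue into complementary pipe sections.
    ALLOWED_RIGHT = {
        "stud":         {"stud", "door_left", "window_left", "pipe_elbow"},
        "door_left":    {"door_right"},
        "door_right":   {"stud", "door_left", "window_left"},
        "window_left":  {"window_right"},
        "window_right": {"stud", "door_left", "window_left"},
        "pipe_elbow":   {"pipe_run"},
        "pipe_run":     {"pipe_run", "pipe_elbow", "stud"},
    }

    def valid_arrangement(kinds):
        """True if every adjacent pair satisfies the connectivity constraints."""
        return all(right in ALLOWED_RIGHT[left]
                   for left, right in zip(kinds, kinds[1:]))

    assert valid_arrangement(["stud", "door_left", "door_right", "stud"])
    assert not valid_arrangement(["stud", "door_left", "stud"])  # unmated door half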
[0025] Any suitable pre-computed lighting effect may be applied to the modular virtual structure segments. For example, in some embodiments, a set of modular virtual structure segments may be intended for use in any local lighting environment, without reference to locations of physical lights in the environment. In such embodiments, a directional light effect may be utilized. An example of this is shown in Figure 4 as directional light incident on a portion 400 of a virtual wall stud segment. The virtual lighting applied may have any suitable direction. For example, for horizontally tiled pieces, the light may be perpendicular to the horizontal axis, while for vertically tiled pieces, the light may be perpendicular to the vertical axis. Additionally, for modular pieces that are tiled both horizontally and vertically, the light may be perpendicular to both axes. This may help to ensure that the shadows and lighting pre-computed for the segments have common lighting characteristics for each segment, and thus may help to prevent the appearance of parallax and/or other discontinuities where adjacent segments meet. In the embodiment of Figure 4, the directional light is shown as being applied at an approximately forty-five degree angle relative to a vertical plane, but it will be understood that this is presented for the purpose of example, and that any other suitable angle may be used.
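As a concrete, heavily simplified stand-in for the offline bake (the patent does not prescribe an algorithm; the array shapes and the light direction here are assumptions), the sketch below computes a single diffuse term for one segment under a directional light roughly forty-five degrees off vertical. A production bake would instead compute full global illumination (occlusion, bounces) into the same kind of per-texel map:

    import numpy as np

    def bake_directional_light_map(normals: np.ndarray,
                                   light_dir=(0.0, -1.0, -1.0)) -> np.ndarray:
        """Pre-compute a diffuse light map for one modular segment.

        normals:   (H, W, 3) per-texel surface normals for the segment.
        light_dir: direction the light travels; (0, -1, -1) is about 45 degrees
                   off vertical, loosely matching the angle shown in Figure 4.
        Returns an (H, W) intensity map stored alongside the segment, so every
        instance of the segment shares identical baked lighting and adjacent
        instances meet without lighting discontinuities.
        """
        to_light = -np.asarray(light_dir, dtype=float)
        to_light /= np.linalg.norm(to_light)
        return np.clip(normals @ to_light, 0.0, 1.0)  # Lambert N.L per texel

    # A flat segment facing +z receives cos(45 degrees) ~= 0.707 everywhere.
    flat = np.zeros((2, 2, 3)); flat[..., 2] = 1.0
    print(bake_directional_light_map(flat).round(3))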
[0026] In other embodiments, a set of modular virtual structure segments may be configured for use with specific lighting characteristics - e.g. a single overhead point light source, a lamp adjacent to a wall, etc. In such embodiments, any suitable type of virtual lighting may be used to pre-compute lighting effects. In any case, after pre-computing the lighting effects, the computed light map may be saved with the associated modular virtual structure segment at a high level of information so that images of virtual structures assembled with the segment have realistic lighting effects.
[0027] Any suitable type of lighting information may be stored for a modular virtual structure segment. For example, pre-computed lighting effects may be stored as light maps, cube maps, spherical harmonics (e.g. pre-computed radiance transfer functions), and/or in any other suitable form. The use of pre-computed radiance transfer functions may allow realistic lighting and shadows on virtual objects to be generated based upon detected physical light locations in a use environment, for example, by applying virtual point lighting at the locations of physical lights, as depicted via virtual point light source 500 in Figure 5A. Figure 5B shows an example of how the appearance of the wall stud portion shown in Figure 4 may be modulated based upon the virtual point light source of Figure 5A. Additionally, procedural or dynamic lighting also may be applied in real time (e.g. light arising from dynamic virtual objects displayed in the augmented reality image).
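The run-time cost of the radiance-transfer approach is small: once a segment's transfer coefficients are baked, relighting reduces to a dot product per vertex (or texel). The following sketch assumes nine-coefficient (second-order) spherical harmonics and invented toy data; projecting the environment's light into SH coefficients is not shown:

    import numpy as np

    def shade_with_prt(transfer: np.ndarray, light_sh: np.ndarray) -> np.ndarray:
        """Evaluate pre-computed radiance transfer at run time.

        transfer: (V, 9) per-vertex transfer coefficients baked offline for a
                  modular segment (9 = second-order spherical harmonics).
        light_sh: (9,) SH projection of the current lighting environment.
                  Moving a virtual point light to a detected physical light's
                  location changes only this vector, never the baked data.
        Returns per-vertex outgoing radiance.
        """
        return transfer @ light_sh

    rng = np.random.default_rng(0)
    transfer = rng.uniform(0.0, 0.1, size=(4, 9))  # toy bake for 4 vertices
    light_sh = rng.uniform(0.0, 1.0, size=9)       # toy lighting environment
    print(shade_with_prt(transfer, light_sh))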
[0028] Local physical lighting characteristics also may be used to modulate an appearance of modular virtual structure segments in other ways. For example, the pre-computed lighting effects for the modular virtual structure segments may be computed based upon the application of white light. Then, when building a virtual image for the particular physical environment, color characteristics of the physical lighting in the physical environment may be analyzed from image data acquired by the see-through display device, and the determined color characteristics (e.g. hue, saturation, albedo) may be applied to the virtual lighting effects so that the virtual lighting matches the local physical lighting more closely. In this way, displayed instances of pre-lit virtual wall/ceiling segments, pre-lit virtual furniture, and any other suitable pre-lit virtual objects may be matched to an appearance of the physical environment more closely.
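A minimal sketch of that color modulation step follows (illustrative; how the light color is estimated from camera frames is outside the snippet, and a real renderer would work in linear color space):

    import numpy as np

    def tint_light_map(baked_white: np.ndarray, light_color) -> np.ndarray:
        """Modulate a light map that was baked under white light.

        baked_white: (H, W) intensity map pre-computed with white light.
        light_color: (r, g, b) estimate of the physical lighting's color,
                     e.g. averaged from image data of the room.
        Returns an (H, W, 3) tinted map matching the local lighting.
        """
        color = np.asarray(light_color, dtype=float)
        return baked_white[..., None] * color

    bake = np.full((2, 2), 0.8)                    # toy white-light bake
    print(tint_light_map(bake, (1.0, 0.85, 0.6)))  # warm, lamp-like tint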
[0029] Figure 6 shows an embodiment of a method 600 for constructing an augmented reality environment by applying instances of modular virtual structure segments to a detected physical structure. Method 600 comprises, at 602, receiving image data capturing an image of a local environment of a see-through display device. The image data may comprise any suitable data, including but not limited to depth images 604 and/or two-dimensional images, and may be received from image sensors on the see-through display device, or external to the see-through display device. The depth image data may be received from any suitable depth imaging system, including but not limited to stereo imaging systems, time-of-flight imaging systems, and structured light imaging systems.
[0030] Method 600 next comprises, at 606, identifying a physical feature of the local environment from the image data. The physical feature may be identified in any suitable manner. For example, in some embodiments, a mesh representation of the physical environment is determined from depth image data, and mesh analysis is performed, at 608, to identify, at 610, major surfaces in the physical environment. Examples include, but are not limited to, walls 612 and ceilings 614, as well as features of the walls and ceilings, such as doors, windows, skylights, columns, other protrusions/cutouts in the room, etc. Additionally, open spaces in the geometry may be identified, for example, to allow a desired virtual structure to be fit into the identified open space.
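One common way to find such major surfaces, offered here as an illustrative stand-in for the mesh analysis step (the patent does not specify an algorithm), is to fit dominant planes to the depth data with RANSAC and repeat after removing inliers:

    import numpy as np

    def ransac_plane(points: np.ndarray, iters=200, tol=0.02, rng=None):
        """Find one dominant plane (e.g. a wall or ceiling) in a point cloud.

        points: (N, 3) positions from the depth camera's point cloud/mesh.
        Returns (normal, d, inlier_mask) for the plane n.x + d = 0 having the
        most points within `tol` meters. Removing the inliers and re-running
        yields the next major surface, and so on.
        """
        rng = rng or np.random.default_rng()
        best = (None, None, np.zeros(len(points), dtype=bool))
        for _ in range(iters):
            p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
            n = np.cross(p1 - p0, p2 - p0)
            if np.linalg.norm(n) < 1e-9:
                continue  # degenerate (near-collinear) sample
            n /= np.linalg.norm(n)
            d = -n @ p0
            inliers = np.abs(points @ n + d) < tol
            if inliers.sum() > best[2].sum():
                best = (n, d, inliers)
        return best

    # Toy cloud: a noisy vertical wall at x = 2 m plus random clutter.
    rng = np.random.default_rng(1)
    wall = np.column_stack([2.0 + rng.normal(0, 0.005, 500),
                            rng.uniform(0, 4, 500),
                            rng.uniform(0, 2.5, 500)])
    clutter = rng.uniform(0, 4, size=(100, 3))
    n, d, mask = ransac_plane(np.vstack([wall, clutter]), rng=rng)
    print(n.round(2), round(float(d), 2), int(mask.sum()))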
[0031] Method 600 also may include, at 616, identifying one or more local lighting characteristics of the physical environment. Examples of local lighting characteristics may include, but are not limited to, color characteristics 618 and locations of local light sources 620.
[0032] Method 600 further comprises, at 622, constructing an augmented reality image comprising a virtual structure feature for display over a detected physical feature in spatial registration with the physical feature. As mentioned above and indicated at 624, the virtual structure may be constructed by arranging a plurality of modular virtual structure segments that each comprises a pre-computed lighting effect. The virtual structure segments may be arranged in any suitable manner, including but not limited to by rotating, scaling, morphing, clipping, etc. pieces to fit the physical geometry of interest. Likewise, the modular virtual structure segments may comprise any suitable pre-calculated information regarding the pre-computed lighting effect. Examples include, but are not limited to, light maps 626 and/or radiance transfer functions 628. Further, as described above, connectivity constraints that restrict a set of other modular virtual structure segments that can be joined to a selected modular virtual structure segment may be applied, at 630, when selecting and arranging the modular virtual structure segments to ensure that complementary features are appropriately joined on adjacent segments.
[0033] Additionally, as mentioned above, local lighting characteristics may be utilized in constructing the augmented reality image. For example, as indicated at 632, in some embodiments an appearance of the modular virtual structure segments may be modulated based upon local lighting characteristics. The appearance may be modulated in any suitable manner. For example, a color of the local lighting environment may be imparted to the pre- computed lighting effect, as indicated at 634. Likewise, a virtual light source, such as a virtual point light source, may be applied at a location of a physical light source in the environment, as indicated at 636.
[0034] In other embodiments, instead of modulating the appearance of the modular virtual structure segments, a plurality of different sets of virtual modular structure segments having different lighting characteristics may be available. For example, one set of modular virtual structure segments may comprise pre-computed light effects corresponding to a point light source overhead, while another may comprise pre-computed light effects corresponding to directional light coming in from a side window. In this instance, as indicated at 638, local lighting characteristics may be utilized to select a set of modular virtual structure segments having corresponding lighting characteristics, so that the resulting virtual structure may have similar lighting characteristics as the physical light in the environment.
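Selection could be as simple as matching the room's dominant light direction against a tag stored with each baked set, as in this sketch (the catalog, its tags, and the cosine-similarity criterion are all invented for illustration):

    import numpy as np

    # Hypothetical catalog: each pre-lit segment set is tagged with the
    # dominant light direction it was baked under (approx. unit vectors).
    SEGMENT_SETS = {
        "overhead_point": np.array([0.0, -1.0, 0.0]),
        "side_window":    np.array([-0.98, -0.20, 0.0]),
    }

    def pick_segment_set(detected_light_dir) -> str:
        """Choose the baked set whose lighting best matches the detected
        dominant light direction, by cosine similarity."""
        d = np.asarray(detected_light_dir, dtype=float)
        d /= np.linalg.norm(d)
        return max(SEGMENT_SETS, key=lambda name: SEGMENT_SETS[name] @ d)

    print(pick_segment_set([-0.9, -0.4, 0.1]))  # -> "side_window"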
[0035] Upon constructing the augmented reality image, method 600 comprises outputting the augmented reality image to a see-through display device, as indicated at 640. Sensor data from the see-through display device (e.g. inward and outward image sensors) may be used to detect the user's eye positions and gaze directions, and also to detect physical objects in the field of view of the user, and to display the virtual structure over a corresponding physical feature in spatial registration with the physical feature to give the user an augmented reality view of the physical environment.
[0036] As mentioned above, the methods described above may be performed via any suitable display device. Examples include, but are not limited to, see-through display devices such as head-mounted see-through display device 104 of Figure 1, and other display devices having one or more image sensors, such as smart phones and notepad computers. Figure 7 shows a block diagram of an example configuration of see-through display device 104.
[0037] See-through display device 104 may comprise one or more lenses 702 that form a part of a near-eye see-through display subsystem 704. See-through display device 104 may further comprise one or more outward facing image sensors 706 configured to acquire images of a background scene being viewed by a user, and may include one or more microphones 708 configured to detect sounds, such as voice commands from a user. Outward facing image sensors 706 may include one or more depth sensors (including but not limited to stereo depth imaging arrangements) and/or one or more two-dimensional image sensors.
[0038] See-through display device 104 further comprises a gaze detection subsystem 710 configured to detect a direction of gaze of each eye of a user, as described above. The gaze detection subsystem 710 may be configured to determine gaze directions of each of a user's eyes in any suitable manner. For example, in the depicted embodiment, the gaze detection subsystem 710 comprises one or more glint sources 712, such as infrared light sources, configured to cause a glint of light to reflect from the cornea of each eye of a user, and one or more image sensors 714 configured to capture an image of one or more eyes of the user. Images of the glints and of the pupils as determined from image data gathered via image sensor(s) 714 may be used to determine an optical axis of each eye. It will be understood that the gaze detection subsystem 710 may have any suitable number and arrangement of light sources and image sensors.
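As a rough illustration of the pupil-center/corneal-reflection principle behind such subsystems (a generic simplification, not the device's actual method; all names and numbers are invented), a per-user calibration can fit a map from pupil-minus-glint image vectors to gaze points:

    import numpy as np

    def fit_gaze_map(pg_vectors: np.ndarray, targets: np.ndarray):
        """Fit an affine map from pupil-minus-glint vectors to gaze points.

        pg_vectors: (N, 2) pupil center minus glint center, in image pixels,
                    measured while the user fixates known calibration targets.
        targets:    (N, 2) the corresponding known gaze points.
        Real gaze trackers fit per-eye optical-axis models with more terms;
        this least-squares affine fit only conveys the basic idea.
        """
        A = np.column_stack([pg_vectors, np.ones(len(pg_vectors))])
        coeffs, *_ = np.linalg.lstsq(A, targets, rcond=None)
        return lambda v: np.array([v[0], v[1], 1.0]) @ coeffs

    # Four-point calibration, then estimate gaze for a new measurement.
    pg = np.array([[-1.0, -1.0], [1.0, -1.0], [-1.0, 1.0], [1.0, 1.0]])
    targets = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
    gaze = fit_gaze_map(pg, targets)
    print(gaze([0.0, 0.0]))  # center of the calibration grid -> [5. 5.]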
[0039] See-through display device 104 may further comprise additional sensors. For example, see-through display device 104 may comprise a global positioning system (GPS) subsystem 716 to allow a location of see-through display device 104 to be determined.
[0040] See-through display device 104 further may include one or more motion sensors 718 to detect movements of a user's head when the user is wearing see-through display device 104. Motion data may be used, for example, for image stabilization to help correct for blur in images from the outward-facing image sensor(s) 706. Likewise, the motion sensors 718, as well as the microphone(s) 708 and the gaze detection subsystem 710, also may be employed as user input devices, such that a user may interact with see-through display subsystem 704 via gestures of the eye, neck and/or head, as well as via verbal commands. It will be understood that the sensors illustrated in Figure 7 are shown for the purpose of example and are not intended to be limiting in any manner, as any other suitable sensors and/or combination of sensors may be utilized.
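A rough sketch of such motion-data-driven stabilization, assuming a small-angle model in which pixel shift is approximately focal length times the integrated rotation (the function name and all values are illustrative assumptions):

    import numpy as np

    def stabilization_shift(gyro_rad_s, exposure_s, focal_px):
        # gyro_rad_s: (yaw_rate, pitch_rate) in rad/s during the exposure.
        d_angle = np.asarray(gyro_rad_s) * exposure_s  # integrated rotation
        # Small-angle approximation: pixels ~ focal length * angle.
        return focal_px * d_angle

    # A 0.5 rad/s yaw over a 10 ms exposure with an 800 px focal length
    # suggests shifting the frame by about 4 px to compensate.
    print(stabilization_shift((0.5, 0.0), 0.010, 800.0))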
[0041] See-through display device 104 further comprises a computing device 720 having a logic subsystem 722 and a storage subsystem 724 in communication with the sensors, the gaze detection subsystem 710, and the see-through display subsystem 704. Storage subsystem 724 comprises instructions stored thereon that are executable by logic subsystem 722, for example, to receive image data from outward-facing image sensors 706 capturing an image of a local environment of the see-through display device, and to identify a physical feature of the local environment via the image data. The instructions also may be executable to construct an augmented reality image of a virtual structure by arranging a plurality of modular virtual structure segments in adjacent locations, each modular virtual structure segment comprising a pre-computed global illumination effect, and to display the augmented reality image over the physical feature in spatial registration with the physical feature from a viewpoint of a user via the see-through display subsystem 704. The instructions may further be executable to detect a local lighting characteristic and to modulate the augmented reality image based upon the local lighting characteristic.
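Tying the pieces together, the stored instructions might be organized roughly as follows. This end-to-end sketch (Segment, build_structure, display_structure, and all sizes and colors) is hypothetical and greatly simplified:

    from dataclasses import dataclass

    @dataclass
    class Segment:
        position: tuple        # where this modular piece sits on the feature
        baked_lighting: tuple  # pre-computed global illumination (RGB)

    def build_structure(feature_extent_m, segment_size_m=0.5):
        # Tile the detected physical feature with modular segments.
        count = max(1, int(feature_extent_m / segment_size_m))
        return [Segment((i * segment_size_m, 0.0), (0.5, 0.5, 0.5))
                for i in range(count)]

    def display_structure(segments, local_light_rgb):
        for seg in segments:
            tinted = tuple(b * l for b, l in zip(seg.baked_lighting, local_light_rgb))
            print(f"draw segment at {seg.position} with lighting {tinted}")

    # e.g. a 2 m wall section under warm room light:
    display_structure(build_structure(2.0), (1.0, 0.9, 0.7))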
[0042] Further information regarding example hardware for the logic subsystem 722, storage subsystem 724, and other above-mentioned components is described below with reference to Figure 8.
[0043] It will be appreciated that the depicted see-through display device 104 is provided by way of example, and thus is not meant to be limiting. Therefore it is to be understood that a display device may include sensors, cameras, microphones, input devices, output devices, etc. in addition to or in place of those shown without departing from the scope of this disclosure. The physical configuration of a display device and its various sensors and subcomponents may take a variety of different forms without departing from the scope of this disclosure.
[0044] Further, it will be understood that a computing system configured to display augmented reality imagery via a see-through display device may take any suitable form other than a head-mounted display device, including but not limited to a mainframe computer, server computer, desktop computer, laptop computer, tablet computer, home-entertainment computer, network computing device, gaming device, mobile computing device, mobile communication device (e.g., smart phone), other wearable computer, etc. It will further be understood that the methods and processes described above may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer program product.
[0045] Figure 8 schematically shows a non-limiting embodiment of a computing system 800 that can perform one or more of the methods and processes described above. Computing system 800 is shown in simplified form and, as mentioned above, may represent any suitable device and/or combination of devices, including but not limited to those described above with reference to Figures 1-7.
[0046] Computing system 800 includes a logic subsystem 802 and a storage subsystem 804. Computing system 800 may optionally include a display subsystem 806, input device subsystem 808, communication subsystem 810, and/or other components not shown in Figure 8. Computing system 800 may also optionally include or interface with one or more user input devices such as the above-described eye tracking system, as well as a keyboard, mouse, game controller, camera (depth and/or two-dimensional), microphone, and/or touch screen, for example. Such user-input devices may form part of input device subsystem 808 or may interface with input device subsystem 808.
[0047] Logic subsystem 802 includes one or more physical devices configured to execute instructions. For example, the logic subsystem may be configured to execute machine-readable instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, or otherwise arrive at a desired result.
[0048] Logic subsystem 802 may include one or more processors configured to execute software instructions. Additionally or alternatively, logic subsystem 802 may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The processors of logic subsystem 802 may be single-core or multi-core, and the programs executed thereon may be configured for sequential, parallel or distributed processing. Logic subsystem 802 may optionally include individual components that are distributed among two or more devices, which can be remotely located and/or configured for coordinated processing. Aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud-computing configuration.

[0049] Storage subsystem 804 includes one or more physical, non-transitory, computer-readable storage devices configured to hold data and/or instructions executable by the logic subsystem to implement the herein-described methods and processes. When such methods and processes are implemented, the state of storage subsystem 804 may be transformed, e.g., to hold different data.
[0050] Storage subsystem 804 may include removable media and/or built-in devices. Storage subsystem 804 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage subsystem 804 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. In some embodiments, logic subsystem 802 and storage subsystem 804 may be integrated into one or more unitary devices, such as an application-specific integrated circuit (ASIC), or a system-on-a-chip.
[0051] It will be appreciated that storage subsystem 804 includes one or more physical, non-transitory devices. However, in some embodiments, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
[0052] The term "program" may be used to describe an aspect of computing system 800 implemented to perform a particular function. In some cases, a program may be instantiated via logic subsystem 802 executing instructions held by storage subsystem 804. It will be understood that different programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The term "program" may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
[0053] It will be appreciated that a "service", as used herein, is an application program executable across multiple user sessions. A service may be available to one or more system components, programs, and/or other services. In some implementations, a service may run on one or more server-computing devices.
[0054] When included, display subsystem 806 may be used to present a visual representation of data held by storage subsystem 804. This visual representation may take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the storage subsystem, and thus transform the state of the storage subsystem, the state of display subsystem 806 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 806 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 802 and/or storage subsystem 804 in a shared enclosure, or such display devices may be peripheral display devices.
[0055] When included, communication subsystem 810 may be configured to communicatively couple computing system 800 with one or more other computing devices. Communication subsystem 810 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, the communication subsystem may allow computing system 800 to send and/or receive messages to and/or from other devices via a network such as the Internet.
[0056] It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
[0057] The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.

Claims

1. In a display device, a method of displaying an augmented reality image comprising lighting effects, the method comprising:
receiving image data, the image data capturing an image of a local environment of the display device;
identifying a physical feature of the local environment via the image data;
constructing an augmented reality image of a virtual structure for display over the physical feature in spatial registration with the physical feature from a viewpoint of a user, the augmented reality image comprising a plurality of modular virtual structure segments arranged in adjacent locations to form the virtual structure, each modular virtual structure segment comprising a pre-computed global illumination effect; and
outputting the augmented reality image to the display device.
2. The method of claim 1, wherein identifying a physical feature of the local environment comprises performing a mesh analysis of the local environment.
3. The method of claim 1, wherein the physical feature comprises one or more of a wall and a ceiling.
4. The method of claim 1, wherein the physical feature comprises a non-structural object within the local environment.
5. The method of claim 1, wherein the physical feature comprises an empty space within the local environment.
6. The method of claim 1, wherein the pre-computed global illumination effect comprises a pre-computed directional lighting effect.
7. The method of claim 1, wherein the pre-computed global illumination effect comprises a pre-computed radiance transfer function.
8. The method of claim 1, further comprising identifying a lighting characteristic of the local environment via the image data, and modulating an appearance of the plurality of modular virtual structure segments based upon the lighting characteristic of the local environment.
9. The method of claim 8, wherein the lighting characteristic comprises a color characteristic of the local environment, and wherein modulating the appearance of the plurality of modular virtual structure segments comprises imparting the color characteristic to the modular virtual structure segments.
10. The method of claim 8, wherein the lighting characteristic comprises a location of a physical light in the local environment, and wherein modulating the appearance of the plurality of modular virtual structure segments comprises computing a light effect arising from a virtual point light at the location of the physical light.
PCT/US2013/068363 2012-11-05 2013-11-05 Constructing augmented reality environment with pre-computed lighting WO2014071333A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/668,953 2012-11-05
US13/668,953 US9524585B2 (en) 2012-11-05 2012-11-05 Constructing augmented reality environment with pre-computed lighting

Publications (2)

Publication Number Publication Date
WO2014071333A2 true WO2014071333A2 (en) 2014-05-08
WO2014071333A3 WO2014071333A3 (en) 2015-06-18

Family

ID=49640172

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/068363 WO2014071333A2 (en) 2012-11-05 2013-11-05 Constructing augmented reality environment with pre-computed lighting

Country Status (2)

Country Link
US (4) US9524585B2 (en)
WO (1) WO2014071333A2 (en)


Families Citing this family (103)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
JP6479654B2 (en) * 2012-06-11 2019-03-06 フィリップス ライティング ホールディング ビー ヴィ Method and apparatus for configuring a luminaire in a virtual environment
US11270498B2 (en) * 2012-11-12 2022-03-08 Sony Interactive Entertainment Inc. Real world acoustic and lighting modeling for improved immersion in virtual reality and augmented reality environments
US20140253540A1 (en) * 2013-03-07 2014-09-11 Yoav DORI Method and system of incorporating real world objects into a virtual environment
US10713838B2 (en) * 2013-05-03 2020-07-14 Nvidia Corporation Image illumination rendering system and method
US10262462B2 (en) 2014-04-18 2019-04-16 Magic Leap, Inc. Systems and methods for augmented and virtual reality
US20150089453A1 (en) * 2013-09-25 2015-03-26 Aquifi, Inc. Systems and Methods for Interacting with a Projected User Interface
US20150145887A1 (en) * 2013-11-25 2015-05-28 Qualcomm Incorporated Persistent head-mounted content display
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US20160019715A1 (en) 2014-07-15 2016-01-21 Osterhout Group, Inc. Content presentation in head worn computing
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US20150228119A1 (en) 2014-02-11 2015-08-13 Osterhout Group, Inc. Spatial location presentation in head worn computing
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US9523856B2 (en) 2014-01-21 2016-12-20 Osterhout Group, Inc. See-through computer display systems
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9532715B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US20150205135A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. See-through computer display systems
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9854176B2 (en) * 2014-01-24 2017-12-26 Lucasfilm Entertainment Company Ltd. Dynamic lighting capture and reconstruction
US9852545B2 (en) 2014-02-11 2017-12-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US20150241963A1 (en) 2014-02-11 2015-08-27 Osterhout Group, Inc. Eye imaging in head worn computing
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US10083541B2 (en) * 2014-03-13 2018-09-25 Imagination Technologies Limited Object illumination in hybrid rasterization and ray traced 3-D rendering
EP3123449B1 (en) 2014-03-25 2020-01-01 Apple Inc. Method and system for representing a virtual object in a view of a real environment
US20160187651A1 (en) 2014-03-28 2016-06-30 Osterhout Group, Inc. Safety for a vehicle operator with an hmd
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
WO2015175730A1 (en) * 2014-05-13 2015-11-19 Nant Vision, Inc. Augmented reality content rendering via albedo models, systems and methods
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
DE102014217675A1 (en) * 2014-09-04 2016-03-24 Zumtobel Lighting Gmbh Augmented reality-based lighting system and procedure
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
US20160239985A1 (en) * 2015-02-17 2016-08-18 Osterhout Group, Inc. See-through computer display systems
US10878775B2 (en) 2015-02-17 2020-12-29 Mentor Acquisition One, Llc See-through computer display systems
CN106162117B (en) * 2015-04-10 2019-04-16 北京智谷睿拓技术服务有限公司 Display control method and device
CN106293557B (en) 2015-05-04 2019-12-03 北京智谷睿拓技术服务有限公司 Display control method and device
US20160350967A1 (en) * 2015-06-01 2016-12-01 Cable Television Laboratories, Inc. Dynamic adjustments for augmented, mixed and virtual reality presentations
US10354449B2 (en) * 2015-06-12 2019-07-16 Hand Held Products, Inc. Augmented reality lighting effects
US10037627B2 (en) * 2015-08-14 2018-07-31 Argis Technologies Llc Augmented visualization system for hidden structures
US9652896B1 (en) 2015-10-30 2017-05-16 Snap Inc. Image based tracking in augmented reality systems
US9984499B1 (en) 2015-11-30 2018-05-29 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US10089681B2 (en) 2015-12-04 2018-10-02 Nimbus Visulization, Inc. Augmented reality commercial platform and method
WO2017108806A1 (en) * 2015-12-21 2017-06-29 Thomson Licensing Key lights direction detection
US10607081B2 (en) 2016-01-06 2020-03-31 Orcam Technologies Ltd. Collaboration facilitator for wearable devices
US10591728B2 (en) 2016-03-02 2020-03-17 Mentor Acquisition One, Llc Optical systems for head-worn computers
US10274737B2 (en) 2016-02-29 2019-04-30 Microsoft Technology Licensing, Llc Selecting portions of vehicle-captured video to use for display
US10667981B2 (en) 2016-02-29 2020-06-02 Mentor Acquisition One, Llc Reading assistance system for visually impaired
US10019831B2 (en) * 2016-10-20 2018-07-10 Zspace, Inc. Integrating real world conditions into virtual imagery
US20180182160A1 (en) * 2016-12-23 2018-06-28 Michael G. Boulton Virtual object lighting
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US10074381B1 (en) 2017-02-20 2018-09-11 Snap Inc. Augmented reality speech balloon system
US10387730B1 (en) 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US10210664B1 (en) * 2017-05-03 2019-02-19 A9.Com, Inc. Capture and apply light information for augmented reality
EP3422145B1 (en) * 2017-06-28 2020-01-29 Nokia Technologies Oy Provision of virtual reality content
US10922878B2 (en) * 2017-10-04 2021-02-16 Google Llc Lighting for inserted content
WO2019111167A1 (en) * 2017-12-05 2019-06-13 Dvdperplay Sa Method of construction of a computer-generated image and a virtual environment
US10636200B2 (en) * 2018-01-19 2020-04-28 Htc Corporation Electronic device, method for displaying an augmented reality scene and non-transitory computer-readable medium
US10600239B2 (en) * 2018-01-22 2020-03-24 Adobe Inc. Realistically illuminated virtual objects embedded within immersive environments
US10559121B1 (en) 2018-03-16 2020-02-11 Amazon Technologies, Inc. Infrared reflectivity determinations for augmented reality rendering
US10777010B1 (en) * 2018-03-16 2020-09-15 Amazon Technologies, Inc. Dynamic environment mapping for augmented reality
US10607567B1 (en) 2018-03-16 2020-03-31 Amazon Technologies, Inc. Color variant environment mapping for augmented reality
FR3080935B1 (en) * 2018-05-02 2020-05-22 Argo METHOD AND SYSTEM FOR ON-THE-FLY GENERATION OF AUGMENTED REALITY CONTENT ON A USER DEVICE.
GB2573571B (en) * 2018-05-11 2022-08-31 Signaturize Holdings Ltd Generating virtual representations
GB2574795B (en) 2018-05-04 2022-10-05 Signaturize Holdings Ltd Generating virtual representations
DK180640B1 (en) 2018-05-07 2021-11-09 Apple Inc Devices and methods of measurement using augmented reality
US10950031B2 (en) * 2018-05-14 2021-03-16 Apple Inc. Techniques for locating virtual objects relative to real physical objects
US11694392B2 (en) * 2018-05-22 2023-07-04 Apple Inc. Environment synthesis for lighting an object
US10818093B2 (en) * 2018-05-25 2020-10-27 Tiff's Treats Holdings, Inc. Apparatus, method, and system for presentation of multimedia content including augmented reality content
WO2019228969A1 (en) * 2018-06-01 2019-12-05 Signify Holding B.V. Displaying a virtual dynamic light effect
GB2574882B (en) * 2018-06-22 2020-08-12 Sony Interactive Entertainment Inc Method and system for displaying a virtual object
US11302067B2 (en) 2018-08-31 2022-04-12 Edx Technologies, Inc. Systems and method for realistic augmented reality (AR) lighting effects
US10785413B2 (en) 2018-09-29 2020-09-22 Apple Inc. Devices, methods, and graphical user interfaces for depth-based annotation
DE102019102252A1 (en) * 2019-01-30 2020-07-30 Trilux Gmbh & Co. Kg Procedure for supporting the installation of a sensor or a luminaire in lighting systems
US10762697B1 (en) * 2019-02-27 2020-09-01 Verizon Patent And Licensing Inc. Directional occlusion methods and systems for shading a virtual object rendered in a three-dimensional scene
US11216920B2 (en) * 2019-05-31 2022-01-04 Apple Inc. Enhanced local contrast
US11189061B2 (en) 2019-06-25 2021-11-30 Universal City Studios Llc Systems and methods for virtual feature development
US11494953B2 (en) * 2019-07-01 2022-11-08 Microsoft Technology Licensing, Llc Adaptive user interface palette for augmented reality
US11087529B2 (en) * 2019-09-27 2021-08-10 Disney Enterprises, Inc. Introducing real-time lighting effects to illuminate real-world physical objects in see-through augmented reality displays
US11227446B2 (en) * 2019-09-27 2022-01-18 Apple Inc. Systems, methods, and graphical user interfaces for modeling, measuring, and drawing using augmented reality
US11003308B1 (en) 2020-02-03 2021-05-11 Apple Inc. Systems, methods, and graphical user interfaces for annotating, measuring, and modeling environments
EP3865982A1 (en) * 2020-02-17 2021-08-18 Hexagon Technology Center GmbH Augmented viewing of a scenery and subsurface infrastructure
US11727650B2 (en) 2020-03-17 2023-08-15 Apple Inc. Systems, methods, and graphical user interfaces for displaying and manipulating virtual objects in augmented reality environments
US11941764B2 (en) 2021-04-18 2024-03-26 Apple Inc. Systems, methods, and graphical user interfaces for adding effects in augmented reality environments
US11640698B2 (en) * 2021-05-27 2023-05-02 International Business Machines Corporation Mapping physical locations to fit virtualized AR and VR environments

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0561962A (en) 1991-08-30 1993-03-12 Hitachi Ltd Environmental model generating device for product evaluation
MX9207339A (en) * 1991-12-17 1993-07-01 Intelliswitch Inc LIGHTING REGULATOR APPARATUS FOR GAS DISCHARGE LAMPS.
US6124864A (en) * 1997-04-07 2000-09-26 Synapix, Inc. Adaptive modeling and segmentation of visual image streams
JP3577202B2 (en) 1997-11-10 2004-10-13 三菱電機株式会社 Video production equipment
GB2336057B (en) * 1998-04-02 2002-05-08 Discreet Logic Inc Producing image data in a virtual set
US6389375B1 (en) * 1999-01-22 2002-05-14 Interlego Ag Virtual reality modelling
CN1326080C (en) 1999-08-03 2007-07-11 二宫健一 Furniture design support system, design support method and medium storing design support program
US7154504B2 (en) 2001-01-26 2006-12-26 Microsoft Corporation System and method for fast, smooth rendering of lit, textured spheres
US7301547B2 (en) * 2002-03-22 2007-11-27 Intel Corporation Augmented reality system
US7227548B2 (en) * 2004-05-07 2007-06-05 Valve Corporation Method and system for determining illumination of models using an ambient cube
US7249005B2 (en) * 2004-08-17 2007-07-24 Dirtt Environmental Solutions Ltd. Design software incorporating efficient 3-D rendering
US7884818B2 (en) * 2004-12-17 2011-02-08 Kenichi Ninomiya Article design support system and method of controlling same
US7457730B2 (en) * 2005-12-15 2008-11-25 Degnan Donald A Method and system for virtual decoration
US8730156B2 (en) 2010-03-05 2014-05-20 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
US8139059B2 (en) 2006-03-31 2012-03-20 Microsoft Corporation Object illumination in a virtual environment
US20080071559A1 (en) * 2006-09-19 2008-03-20 Juha Arrasvuori Augmented reality assisted shopping
DE102007045835B4 (en) * 2007-09-25 2012-12-20 Metaio Gmbh Method and device for displaying a virtual object in a real environment
CN100594519C (en) 2008-03-03 2010-03-17 北京航空航天大学 Method for real-time generating reinforced reality surroundings by spherical surface panoramic camera
US8797321B1 (en) 2009-04-01 2014-08-05 Microsoft Corporation Augmented lighting environments
US8558837B2 (en) 2010-01-18 2013-10-15 Disney Enterprises, Inc. Modular radiance transfer
US20110234631A1 (en) 2010-03-25 2011-09-29 Bizmodeline Co., Ltd. Augmented reality systems
US8405680B1 (en) 2010-04-19 2013-03-26 YDreams S.A., A Public Limited Liability Company Various methods and apparatuses for achieving augmented reality
US8721337B2 (en) * 2011-03-08 2014-05-13 Bank Of America Corporation Real-time video image analysis for providing virtual landscaping
US8668498B2 (en) * 2011-03-08 2014-03-11 Bank Of America Corporation Real-time video image analysis for providing virtual interior design
US10972680B2 (en) 2011-03-10 2021-04-06 Microsoft Technology Licensing, Llc Theme-based augmentation of photorepresentative view
CN103493106B (en) * 2011-03-29 2017-11-07 高通股份有限公司 Come hand is optionally covered to the method and apparatus on the virtual projection on physical surface using bone tracking
WO2013049889A1 (en) 2011-10-04 2013-04-11 Furniture Pty Ltd Luxmy Furniture design system and method
US9578226B2 (en) 2012-04-12 2017-02-21 Qualcomm Incorporated Photometric registration from arbitrary geometry for augmented reality


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112562051A (en) * 2020-11-30 2021-03-26 腾讯科技(深圳)有限公司 Virtual object display method, device, equipment and storage medium
CN112562051B (en) * 2020-11-30 2023-06-27 腾讯科技(深圳)有限公司 Virtual object display method, device, equipment and storage medium

Also Published As

Publication number Publication date
US9892562B2 (en) 2018-02-13
US10803670B2 (en) 2020-10-13
US20140125668A1 (en) 2014-05-08
US20180075663A1 (en) 2018-03-15
WO2014071333A3 (en) 2015-06-18
US20190244430A1 (en) 2019-08-08
US20170039773A1 (en) 2017-02-09
US9524585B2 (en) 2016-12-20
US10229544B2 (en) 2019-03-12

Similar Documents

Publication Publication Date Title
US10803670B2 (en) Constructing augmented reality environment with pre-computed lighting
US10102678B2 (en) Virtual place-located anchor
US10127725B2 (en) Augmented-reality imaging
EP3137982B1 (en) Transitions between body-locked and world-locked augmented reality
US10962780B2 (en) Remote rendering for virtual images
US10373392B2 (en) Transitioning views of a virtual model
KR102257255B1 (en) Mixed reality spotlight
US10559130B2 (en) Displaying image data behind surfaces
US20200225737A1 (en) Method, apparatus and system providing alternative reality environment
KR20140014160A (en) Immersive display experience
US20180182160A1 (en) Virtual object lighting
CA2946582A1 (en) World-locked display quality feedback
US20120120071A1 (en) Shading graphical objects based on face images
CN103761763A (en) Method for constructing reinforced reality environment by utilizing pre-calculation
CN111670465A (en) Displaying modified stereoscopic content
US10296080B2 (en) Systems and methods to simulate user presence in a real-world three-dimensional space
KR102197504B1 (en) Constructing augmented reality environment with pre-computed lighting
JP6272687B2 (en) Construction of augmented reality environment with pre-calculated lighting
EP2887321B1 (en) Constructing augmented reality environment with pre-computed lighting
US10964056B1 (en) Dense-based object tracking using multiple reference images
US20230281933A1 (en) Spatial video capture and replay

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13795351

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13795351

Country of ref document: EP

Kind code of ref document: A2