WO2014049200A1 - Method and apparatus for providing an indication regarding content presented to another user

Method and apparatus for providing an indication regarding content presented to another user

Info

Publication number
WO2014049200A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
presented
content
indication
head mounted
Application number
PCT/FI2013/050887
Other languages
French (fr)
Inventor
Daniel Ashbrook
David Nguyen
Original Assignee
Nokia Corporation
Application filed by Nokia Corporation
Priority to EP13776838.8A (EP2900433B1)
Publication of WO2014049200A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G06F3/1462Digital output to display device ; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay with means for detecting differences between the image stored in the host and the images displayed on the remote displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • G06Q10/101Collaborative creation, e.g. joint development of products or services
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted

Definitions

  • Figure 5 is a perspective view of the second user and the head mounted display worn by the second user with an indication of the content presented to the second user being presented to the first user with the indication appearing to float proximate to the second user in accordance with an example embodiment of the present invention.
  • Figure 6 is a perspective view of the head mounted display of the second user with an indication of the content presented to the second user being presented to the first user in the form of a ring about one lens of the head mounted display of the second user in accordance with an example embodiment of the present invention.
  • circuitry refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present.
  • This definition of 'circuitry' applies to all uses of this term herein, including in any claims.
  • the term 'circuitry' also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware.
  • the term 'circuitry' as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
  • a method, apparatus and computer program product are provided in accordance with at least one embodiment of the present invention in order to provide an indication to a first user relating to the content presented to a second user by the head mounted display worn by the second user. Based upon the indication that is presented to the first user, such as by the head mounted display of the first user, relating to the content presented to the second user by the head mounted display of the second user, collaboration between the first and second users may be facilitated since the first user will be aware of the content that has been reviewed by the second user including the content that is currently being reviewed by the second user.
  • the first and second users need not review the same content on the same computer monitor in order to be aware of the content that has been or is currently being reviewed by the other user and, instead, the method, apparatus and computer program product of an example embodiment provide a mechanism for determining the content that is being presented to another user, while, in at least one embodiment, permitting the content to remain private or to at least limit disclosure of the content.
  • first and second users 10, 14 are illustrated.
  • the users may be collaborating with one another on a project or the first user may simply have an interest in the content being reviewed by the second user, even in the absence of collaboration therebetween.
  • each of the first and second users wears a head mounted display 12, 16.
  • a head mounted display permits a user to optically view a scene external to the head mounted display.
  • a head mounted display may be in the form of a pair of glasses having a pair of lenses and a pair of side stems configured to support the glasses upon the user's ears.
  • the glasses may be worn by the user such that the user may view a scene, e.g., a field of view, through the lenses of the glasses.
  • the glasses may also be configured to present a visual representation of other information so as to augment or supplement the user's view of the scene through the lenses of the glasses.
  • the information presented by the head mounted display may augment the objects in the scene viewed through the head mounted display, such as by identifying or otherwise providing more information regarding one or more of the objects viewed through the head mounted display.
  • the information presented by the head mounted display may be unrelated to the objects in the scene viewed through the head mounted display, but may otherwise provide information that may be of interest to the user, such as content that may be relevant to a project on which the first and second users are collaborating.
  • a head mounted display as exemplified by the glasses may support augmented reality and other applications.
  • augmented reality glasses are one example of a head mounted display
  • a head mounted display may be embodied in a number of different manners with a variety of form factors, each of which may permit a user to optically see through the display so as to view the user's surroundings and each of which may benefit from the method, apparatus and computer program product of an example embodiment of the present invention as described below.
  • the head mounted display may be in the form of a head mounted visor or a helmet mounted display.
  • the head mounted display may be in the form of a helmet worn by a motorcyclist, a pilot or the like.
  • the content presented to a user by a respective head mounted display is generally difficult, if not impossible, for another user to view.
  • content presented by the head mounted display 16 of the second user 14 may be difficult, if not impossible, for the first user 10 to see.
  • the method, apparatus and computer program product of an example embodiment of the present invention facilitates the provision of an indication to the first user of the content presented to the second user by the head mounted display of the second user.
  • an apparatus 20 may be provided and may be specifically configured in accordance with an example embodiment of the present invention.
  • the apparatus may be associated with the head mounted display 12 of the first user in order to receive information regarding the content presented by the head mounted display of the second user and to provide an indication to the first user identifying the content presented by the head mounted display of the second user.
  • the apparatus may be embodied by the head mounted display of the first user.
  • the apparatus may be embodied by a computing device that is remote from the head mounted display of the first user, but that is in communication therewith, such as via wireless communication, e.g., via Bluetooth communication, Wi-Fi or another wireless network, or via wired communication.
  • a computing device such as a personal digital assistant (PDA), mobile telephone, smartphone, pager, mobile television, gaming device, laptop computer, camera, tablet computer, touch surface, video recorder, audio/video player, radio, electronic book, positioning device (e.g., global positioning system (GPS) device), or any combination of the aforementioned, and other types of voice and text communications systems, may embody the apparatus of at least one embodiment and be in communication with the head mounted displays of the first and second users.
  • the computing device that embodies the apparatus may then provide direction to another computing device, such as the head mounted display of the first user, to direct the presentation of an indication of the content presented by the head mounted display of the second user.
  • the apparatus 20 may include or otherwise be in communication with a processor 22, a memory device 24 and a communication interface 26.
  • FIG. 2 illustrates one example of a configuration of an apparatus that may be specifically configured in accordance with an embodiment of the present invention, numerous other configurations may also be used to implement other embodiments.
  • although devices or elements are shown as being in communication with each other, hereinafter such devices or elements should be considered to be capable of being embodied within the same device or element and, thus, devices or elements shown in communication should be understood to alternatively be portions of the same device or element.
  • the processor 22 may be in communication with the memory device 24 via a bus for passing information among components of the apparatus.
  • the memory device may include, for example, one or more volatile and/or non-volatile memories.
  • the memory device may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device like the processor).
  • the memory device may be configured to store information, data, content, applications, instructions, or the like for enabling the apparatus 20 to carry out various functions in accordance with an example embodiment of the present invention.
  • the memory device could be configured to buffer input data for processing by the processor. Additionally or alternatively, the memory device could be configured to store instructions for execution by the processor.
  • the apparatus 20 may be embodied by a computing device, such as a head mounted display 12 of the first user 10 or a computing device in communication therewith.
  • the apparatus may be embodied as a chip or chip set.
  • the apparatus may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard).
  • the structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon.
  • the apparatus may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single "system on a chip.”
  • a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
  • the processor 22 may be embodied in a number of different ways.
  • the processor may be embodied as one or more of various hardware processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array) or the like.
  • the processor may include one or more processing cores configured to perform independently.
  • a multi-core processor may enable multiprocessing within a single physical package.
  • the processor may include one or more processors configured in tandem via the bus to enable independent execution of instructions, pipelining and/or multithreading.
  • the processor 22 may be configured to execute instructions stored in the memory device 24 or otherwise accessible to the processor.
  • the processor may be configured to execute hard coded functionality.
  • the processor may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to an embodiment of the present invention while configured accordingly.
  • the processor when the processor is embodied as an ASIC, FPGA or the like, the processor may be specifically configured hardware for conducting the operations described herein.
  • the processor when the processor is embodied as an executor of software instructions, the instructions may specifically configure the processor to perform the algorithms and/or operations described herein when the instructions are executed.
  • the processor may be a processor of a specific device (e.g., a head mounted display) configured to employ an embodiment of the present invention by further configuration of the processor by instructions for performing the algorithms and/or operations described herein.
  • the processor may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor.
  • the communication interface 26 may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a head mounted display 12 of the first user 10, such as a computing device embodied by the head mounted display of the first user, in an embodiment in which the apparatus 20 is remote from the head mounted display.
  • the communication interface may also be configured to communicate with the head mounted display 16 of the second user 14, such as via wireless communication, e.g., Bluetooth, Wi-Fi or another wireless network, or via wireline.
  • the head mounted display of the first user may include a display 28 and the communication interface may be configured to direct the presentation of information upon the display.
  • the communication interface may be configured to communicate with other components of the computing device in an instance in which the apparatus is embodied by a computing device embodied by the head mounted display of the first user or with a remote computing device in an instance in which the apparatus is separate from the head mounted display of the first user.
  • the communication interface 26 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications wirelessly. Additionally or alternatively, the communication interface may include the circuitry for interacting with the antenna(s) to cause transmission of signals via the antenna(s) or to handle receipt of signals received via the antenna(s).
  • the communications interface may be configured to communicate wirelessly with the sensor(s) 18, such as via Wi-Fi, Bluetooth or other wireless communications techniques.
  • the communication interface may alternatively or also support wired communication.
  • the communication interface may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
  • the communication interface may be configured to communicate via wired communication with other components of the computing device including, for example, other components of the head mounted display 12 of the first user 10 in an embodiment in which the apparatus 20 is embodied by the head mounted display of the first user.
  • content may be presented to the second user 14 by the head mounted display 16 of the second user, such as by causing the display of content on the head mounted display of the second user.
  • the content may be presented by the head mounted display of the second user such that the content appears in the field of view of the second user and appears to float within the field of view of the second user.
  • the content may be presented by the head mounted display of the second user so as to be associated with and positionally fixed to an object within the field of view of the second user, such as a table or a wall.
  • the second user may readily view the content while concurrently viewing their surroundings.
  • the first user 10 may be unable to view the content presented by the head mounted display of the second user, at least not clearly.
  • an apparatus 20 of at least one embodiment may be configured to receive information relating to the content presented to the second user 14 by the head mounted display 16 of the second user. See block 32.
  • the head mounted display of the second user or a computing device associated with the head mounted display of the second user may be configured to transmit information relating to the content presented to the second user by the head mounted display of the second user.
  • the apparatus therefore includes means, such as the processor 22, the communication interface 26 or the like, for receiving the information relating to content presented to the second user by the head mounted display of the second user.
  • the information may be the actual content or an abstraction of the content.
  • a summary of the content or a blurred representation may be received instead of the actual content itself.
  • the abstraction of the content that is received may include only the title of the content, such as the name of a document, the subject of an electronic mail message or the like, or certain predefined field(s) of the content instead of the content itself.
  • the abstraction of the content may include metadata associated with the content instead of the content itself.
  • in an instance in which the content includes an image, the image may be provided to and received by the apparatus, or an abstraction of the image, such as an image in which the colors of the pixels have been averaged or otherwise modified, may be provided to and received by the apparatus.
  • the abstraction of the content may be, for example, a pulsing light with the pulsations synchronized with the content.
  • the apparatus 20 may be configured to cause an indication 40 to be presented to the first user 10 relating to the content presented to the second user 14 by the head mounted display 16 of the second user. See, for example, Figures 4 and 5.
  • the indication that is presented to the first user identifying the content presented to the second user by the head mounted display of the second user is presented to the first user by the head mounted display 12 of the first user.
  • the apparatus may include means, such as the processor 22, the communication interface 26 or the like, for causing an indication to be presented to the first user relating to the content presented by the head mounted display of the second user, such as by sending instructions to the display 28 of the head mounted display worn by the first user.
  • the apparatus 20, such as the processor 22, the communication interface 26 or the like, may be configured to cause various types of indications 40 to be presented to the first user 10 via the head mounted display 12 of the first user.
  • the indication may be the content itself.
  • the indication may be a representation of the content which provides information regarding the content without disclosing the content itself.
  • the representation of the content may be an abstraction of the content as described above.
  • the abstraction may be provided by the head mounted display 16 of the second user 14 as described above or may be generated by the apparatus, such as the processor, following receipt of the content itself from the head mounted display 16 of the second user 14.
  • the apparatus 20, such as the processor 22, the communication interface 26 or the like, may be configured to cause the indication 40 to be presented to the first user 10 by the head mounted display 12 of the first user such that the indication is presented at a location that is defined in relation to the second user 14, such as at a predefined position relative to the second user.
  • the first user may view the second user within the field of view seen through the head mounted display worn by the first user.
  • the indication may be presented by the head mounted display of the first user such that the first user is able to see both the second user and the indication identifying the content presented to the second user by the head mounted display of the second user.
  • the apparatus 20, such as the processor 22, the communication interface 26 or the like, of at least one embodiment may be configured to cause the indication to be presented to the first user 10 by the head mounted display 12 of the first user such that the indication appears to be superimposed upon the second user and/or an object associated with the second user.
  • the apparatus such as the processor, the communication interface or the like, causes the indication to be presented in a manner that at least partially overlays the second user and/or an object associated with the second user.
  • the indication that is superimposed upon the second user and/or an object associated with the second user is translucent such that the portion of the second user and/or the portion of the object associated with the second user that is overlaid by the indication can be seen through the indication.
  • the indication that is superimposed upon the second user and/or an object associated with the second user may be opaque such that the portion of the second user and/or the portion of the object associated with the second user that underlies the indication is not visible.
  • the apparatus 20, such as the processor 22, may be configured to cause the indication 40 to be superimposed upon the same portion of the second user 14 and/or the same portion of an object associated with the second user even as there is relative movement between the head mounted display 12 of the first user 10 and the second user, such as may be attributable to the second user moving within the field of view of the first user and/or as the head of the first user and, therefore, the head mounted display worn by the first user is moved relative to the second user.
  • the apparatus such as the processor, may be configured to track the location of the portion of the second user and/or the portion of an object associated with the second user upon which the indication is to be superimposed.
  • an image of the field of view of the first user through the head mounted display of the first user may be captured, such as by a camera incorporated within or associated with the head mounted display of the first user.
  • the apparatus such as the processor, may be configured to analyze the image and to determine the location within the field of view of the head mounted display of the first user of the portion of the second user and/or the portion of the object associated with the second user upon which the indication is to be superimposed.
  • the apparatus such as the processor, may thereafter cause the indication to be superimposed upon the location of the portion of the second user and/or the portion of the object associated with the second user that was determined from the analysis of the image, such as by sending instructions to the display 28 of the head mounted display worn by the first user.
  • the apparatus such as the processor, may cause the indication to be superimposed upon the same portion of the second user and/or the same portion of an object associated with the second user even as there is relative movement between the head mounted display of the first user and the second user.
  • Various objects may be associated with the second user 14.
  • objects that are worn or carried by the second user may be considered to be associated with the second user.
  • clothing worn by the user or a hat worn by the user may be considered to be associated with the second user.
  • the head mounted display may be considered to be associated with the second user 14.
  • the apparatus 20, such as the processor 22, the communication interface 26 or the like, may be configured to cause the indication to be presented to the first user 10 by the head mounted display 12 of the first user such that the indication appears to be proximate to the second user.
  • the indication that is caused to be presented to the first user appears to be positioned at least partially, if not entirely, beyond the second user, such as by being spaced apart from the second user.
  • the indication that is caused to be presented to the first user in this embodiment may be relatively near the second user, such as by being within a predefined distance of the second user, in order to maintain the proximal relationship between the second user and the indication.
  • the apparatus 20, such as the processor 22, the communication interface 26 or the like, may be configured to cause the indication 40 to be presented to the first user 10 by the head mounted display 12 of the first user such that the indication appears to float proximate to the second user as shown, for example, in Figure 5.
  • the indication may be caused to be presented to the first user in various forms, such as a voice bubble, a thought bubble or the like.
  • the apparatus, such as the processor, may be configured to track the position of the second user within the field of view of the first user through the head mounted display worn by the first user, such as by repeatedly capturing images of the field of view of the first user and analyzing the images to identify the position of the second user therein.
  • the apparatus, such as the processor, may be configured to cause the indication to be presented in the same relative position with respect to the second user, even as relative movement between the second user and the head mounted display of the first user occurs. As such, the indication that is presented to the first user appears to float relative to the second user (a brief per-frame tracking sketch illustrating this capture-and-re-anchor cycle follows at the end of this list).
  • the apparatus 20, such as the processor 22, may be additionally configured to cause the indication 40 that is presented to the first user 10 by the head mounted display 12 of the first user to include a visual association 42 with the second user 14.
  • the visual association may have various forms, such as a line, an arrow, a pointer or the like, and serves to visibly link or otherwise associate the indication with the second user and/or an object associated with the second user.
  • various objects may be associated with the second user, including objects, e.g., a hat, clothing, etc., that are worn or carried by the second user.
  • the head mounted display 16, such as a pair of augmented reality glasses, may be considered to be associated with the second user.
  • the visual association has the form of a link that associates the indication with the second user.
  • the apparatus 20 of at least one embodiment may also be configured to cause content to be presented to the first user 10 by the head mounted display 12 of the first user.
  • the apparatus may include means, such as the processor 22, the communication interface 26 or the like, for causing content to be presented via the head mounted display of the first user.
  • the first user may therefore not only see the field of view through the head mounted display including any indication that is presented relative to the second user 14, but may also see the content that is presented by the head mounted display of the first user.
  • the apparatus 20 may be configured to compare the information relating to the content presented to the second user 14 that was received as shown, for example, in block 32 with information relating to the content presented to the first user 10 as shown in block 30.
  • the apparatus may include means, such as the processor 22, the communication interface 26 or the like, for making such a comparison.
  • the apparatus such as the processor, may determine if the same content is being concurrently presented to both the first and second users via the respective head mounted displays or whether different content is being presented to the first and second users via the respective head mounted displays.
  • the indication that is caused to be presented to the first user 10 may be based upon the comparison.
  • the indication that is caused to be presented to the first user may provide information regarding whether the content presented to the second user 14 is identical to the content presented to the first user.
  • This indication of the commonality of the content presented to both the first and second users may be provided in various manners, but is provided in the embodiment of Figure 6 by the presentation of an indication 44, such as by the head mounted display 12 of the first user, that appears to the first user as the illumination or coloring of a predefined region, such as a ring about one of the lenses of the head mounted display 16 worn by the second user.
  • this indication may be presented, such as by the head mounted display of the first user, so as to appear to the first user as a halo positioned above the head of the second user, as a star presented upon the chest of the second user or otherwise.
  • the first user may quickly determine whether the second user is reviewing the same content as the first user by determining whether the indication that is representative of the commonality of the content presented to both the first and second users is visible.
  • the indication that is caused to be presented to the first user 10 may provide information regarding whether the first and second users are currently viewing different portions, e.g., different pages, of the same content.
  • the indication that is caused to be presented to the first user may provide information regarding whether the first and second users are currently viewing different versions of the same content.
  • the indication that is caused to be presented to the first user may provide information regarding whether the first and second users are currently viewing content that is substantially similar, such as content that deviates by no more than a predefined amount or percent.
  • the content that is presented to the second user 14 by the head mounted display 16 of the second user may include at least portions that are confidential, private or the like.
  • Content may be indicated by the head mounted display of the second user to be private in various manners. For example, all content may be considered private, content that is to be considered private may be marked, content that is to be considered private may be indicated by the second user, content associated with one or more respective
  • the privacy of the content presented to the second user by the head mounted display of the second user may be maintained by not causing the content itself to be displayed for the first user, but, instead, causing a summary or an abstraction of the content to be presented to the first user 10 or causing a visible indication representative of the commonality of the content presented to both the first and second users to be presented to the first user.
  • the apparatus 20, such as the processor 22, may be configured to generate a representation of the content presented to the second user 14 and, in the course of generating the representation, may obscure at least portions of the content presented to the second user.
  • the portions of the content that are obscured may be defined in various manners including the obscuration of the content within predefined fields, the obscuration of content in a predefined pattern that effectively masks the content, the obscuration of content identified by the second user or by a third party, or otherwise.
  • the apparatus, such as the processor, the communication interface 26 or the like, may be configured to cause the representation of the content, including those portions that have been obscured, to be presented to the first user 10 via the head mounted display 12 of the first user.
  • the first user may be able to identify the content that is currently being reviewed by the second user so as to determine whether the second user is reviewing the same content as is currently being reviewed by the first user, but the privacy of the content may be maintained.
  • the first user may readily identify the content that is being reviewed by the second user and, in one instance, may determine whether the second user is reviewing the same content as is currently being reviewed by the first user.
  • the first and second users may collaborate and, in the course of such collaboration, may review the same documentation or other content as evidenced by the indications that are caused to be presented in accordance with an embodiment of the present invention.
  • Figure 3 is a flowchart of an apparatus, method, and computer program product according to example embodiments of the invention. It will be understood that each block of the flowchart, and combinations of blocks in the flowchart, may be implemented by various means, such as hardware, firmware, processor, circuitry, and/or other devices associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above may be stored by a memory device 24 of an apparatus 20 employing an embodiment of the present invention and executed by a processor 22 of the apparatus. As will be appreciated, any such computer program instructions may be loaded onto a computer or other programmable apparatus (e.g., hardware) to produce a machine, such that the resulting computer or other programmable apparatus implements the functions specified in the flowchart blocks.
  • These computer program instructions may also be stored in a computer-readable memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture the execution of which implements the function specified in the flowchart blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the flowchart blocks.
  • blocks of the flowcharts support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowcharts, and combinations of blocks in the flowcharts, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
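The capture-and-re-anchor cycle described in the tracking-related items above (repeatedly capturing images of the first user's field of view, locating the second user and re-positioning the indication as either user moves) could look roughly like the following Python sketch. The camera, detector and display objects are hypothetical stand-ins for the head mounted display's camera, a person/object detector and its rendering API; none of these interfaces are defined by the document.

```python
def track_and_render(camera, detector, display, indication_text: str,
                     offset_px: int = 40) -> None:
    """Per-frame loop keeping the indication anchored to the second user.

    camera.read(), detector.locate() and display.* are assumed interfaces,
    not APIs taken from the disclosure."""
    while True:
        frame = camera.read()           # image of the first user's field of view
        box = detector.locate(frame)    # region where the second user appears, or None
        if box is None:
            display.clear()             # second user not visible: hide the indication
            continue
        # Re-anchor the indication so it appears fixed relative to the second
        # user even as the first user's head or the second user moves.
        x = box.x + box.w + offset_px
        y = box.y - offset_px
        display.draw_at(x, y, indication_text)
```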

Abstract

A method, apparatus and computer program product provide an indication to a first user as to the content presented to another user by the head mounted display of the other user. In the context of a method, information is received relating to content presented to a second user by a head mounted display of the second user. Based upon the information, the method also includes causing, with a processor, an indication to be presented to a first user identifying the content presented to the second user by the head mounted display of the second user.

Description

METHOD AND APPARATUS FOR PROVIDING AN INDICATION REGARDING CONTENT PRESENTED TO ANOTHER USER
TECHNOLOGICAL FIELD
An example embodiment of the present invention relates generally to content presented upon a head mounted display and, more particularly, to the provision of an indication relating to the content presented to another user by the head mounted display of the other user.
BACKGROUND
Multiple participants frequently collaborate on projects. During such collaboration, the participants may review a variety of content, such as documents, drawings, spreadsheets, electronic mail messages or the like. In one common scenario, one of the participants may present the content on a display, such as a computer monitor, while the other participants gather around the computer monitor so as to concurrently review the same content. As a result of the sharing of the content, such as may be presented by the computer monitor of one of the participants, each of the participants may have a relatively common frame of reference and may be aware that the other participants have reviewed the same content. The collaboration between the participants may therefore be facilitated by making reference to portions of the content that have been reviewed by the participants.
In contrast to the presentation of content upon a computer monitor that may be shared in a public manner, content may be reviewed by a user in a private manner. For example, a user may wear a head mounted display upon which content may be presented. While a head mounted display may facilitate the review of the content by a first user who wears the head mounted display, it is generally quite difficult, if not impossible, for other participants to view the content presented by the head mounted display of the first user. Thus, the first user who wears a head mounted display on which content is presented may have some difficulty in collaborating with other participants since the other participants may not be able to readily determine the content that is currently being reviewed by the first user who wears the head mounted display. As such, the participants may not be able to make reference to particular portions of the content during their collaboration and be confident that all of the other participants are concurrently reviewing the same content.
BRIEF SUMMARY
A method, apparatus and computer program product are provided in accordance with an example embodiment in order to provide an indication to a first user as to the content presented to another user by the head mounted display of the other user. Based upon the indication, the first user may be aware of the content that is currently being reviewed by the other user. Accordingly, the method, apparatus and computer program product of an example embodiment facilitate collaboration between the users by increasing the awareness of the first user as to the content that has been reviewed or is currently being reviewed by the second user, even in instances in which the content itself remains private relative to others in proximity to the second user.
In at least one embodiment, a method is provided that includes receiving information relating to content presented to a second user by a head mounted display of the second user. Based upon the information, the method also includes causing, with a processor, an indication to be presented to a first user identifying the content presented to the second user by the head mounted display of the second user.
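As a rough illustration of this receive-and-indicate flow, the following Python sketch assumes a simple message format (ContentInfo) and a stand-in rendering interface (FirstUserDisplay.show_indication). These names and fields are illustrative assumptions, not taken from the disclosure, which only requires that some information relating to the presented content be received and that an indication identifying it be caused to be presented.

```python
from dataclasses import dataclass


@dataclass
class ContentInfo:
    """Assumed information received about the second user's displayed content."""
    content_id: str           # e.g. a document identifier or hash
    title: str                # a human-readable label for the content
    is_private: bool = False  # whether the content itself may be revealed


class FirstUserDisplay:
    """Stand-in for the first user's head mounted display rendering API."""
    def show_indication(self, text: str) -> None:
        print(f"[HMD indication] {text}")


def handle_remote_content(info: ContentInfo, display: FirstUserDisplay) -> None:
    """Receive the information and cause an indication identifying the second
    user's content to be presented to the first user."""
    if info.is_private:
        # Identify the content without disclosing it (an abstraction).
        display.show_indication(f"Viewing: {info.title}")
    else:
        display.show_indication(f"Viewing: {info.title} [{info.content_id}]")


handle_remote_content(ContentInfo("doc-42", "Q3 design review"), FirstUserDisplay())
```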
The method of at least one embodiment may also include causing content to be presented to the first user by a head mounted display of the first user. The method of this embodiment may also compare the information relating to content presented to the second user with information relating to content presented to the first user. The indication that is caused to be presented to the first user may be based upon the comparison. In at least one embodiment, the indication that is caused to be presented to the first user may provide information regarding whether the content presented to the second user is identical to the content presented to the first user.
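One way such a comparison could be realized is sketched below, under the assumption that each head mounted display reports a fingerprint (hash) of the content it is currently rendering; the fingerprinting scheme is an assumption for illustration, since the text only requires comparing information relating to the content presented to each user.

```python
import hashlib


def content_fingerprint(content: bytes) -> str:
    """One possible form of 'information relating to content': a digest that
    lets the two displays be compared without exchanging the content itself."""
    return hashlib.sha256(content).hexdigest()


def commonality_indication(local_fp: str, remote_fp: str) -> str:
    """Compare the information for the first and second users and return the
    indication to present (here simply a text label)."""
    return "same content" if local_fp == remote_fp else "different content"


local = content_fingerprint(b"Design review, page 3")
remote = content_fingerprint(b"Design review, page 3")
print(commonality_indication(local, remote))  # -> same content
```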
The method of at least one embodiment may cause the indication to be presented to the first user by causing a representation of the content presented to the second user to be presented to the first user. In this embodiment, the method may also generate the representation of the content presented to the second user. For example, the representation of the content may include an abstraction of the content presented to the second user. In at least one embodiment, the method may generate the representation of the content presented to the second user by obscuring at least portions of the content presented to the second user.
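A minimal sketch of generating such a representation by obscuring portions of text content follows; the 'field: value' convention and the chosen field names are assumptions made for illustration, since the text above only requires that at least portions of the content be obscured.

```python
import re


def obscure(text: str, private_fields: tuple = ("salary", "account")) -> str:
    """Return a representation of the content with selected portions obscured."""
    for field in private_fields:
        # Mask the value that follows an assumed 'field: value' pattern.
        text = re.sub(rf"({field}\s*:\s*)[^\s,]+", r"\1*****", text,
                      flags=re.IGNORECASE)
    return text


print(obscure("Budget draft - owner: Anna, salary: 72000, account: FI2112345600000785"))
# -> Budget draft - owner: Anna, salary: *****, account: *****
```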
The method of at least one embodiment may cause the indication to be presented to the first user by causing the indication to be presented to the first user by a head mounted display of the first user such that the indication is presented at a location defined in relation to the second user. In this regard, the method may cause the indication to be presented to the first user by causing the indication to be presented to the first user by the head mounted display of the first user such that the indication appears to be superimposed upon at least one of the second user or an object associated with the second user. Alternatively, the method may cause the indication to be presented to the first user by causing the indication to be presented to the first user by the head mounted display of the first user such that the indication appears to be proximate to the second user.
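To make the two placement options concrete, the sketch below assumes the first user's device has already detected the region of its display in which the second user appears (the Box) and computes either a superimposed or a proximate ("floating") position for the indication; the anchor points and offset are assumptions, and the detection step itself is outside the sketch.

```python
from dataclasses import dataclass


@dataclass
class Box:
    """Region of the first user's display where the second user was detected."""
    x: int
    y: int
    w: int
    h: int


def indication_position(user_box: Box, mode: str = "proximate",
                        offset_px: int = 40) -> tuple:
    """Pick display coordinates for the indication relative to the second user.

    'superimposed' overlays the indication on the detected person (or an object
    worn by that person); 'proximate' floats it nearby, like a voice or thought
    bubble."""
    if mode == "superimposed":
        return (user_box.x + user_box.w // 2, user_box.y + user_box.h // 2)
    # proximate: place the indication above and to the side of the detected user
    return (user_box.x + user_box.w + offset_px, user_box.y - offset_px)


print(indication_position(Box(x=300, y=120, w=90, h=260), mode="proximate"))
# -> (430, 80)
```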
In another embodiment, an apparatus is provided that includes at least one processor and at least one memory including computer program code with the at least one memory and the computer program code configured to, with the processor, cause the apparatus to at least receive information relating to content presented to a second user by a head mounted display of the second user. Based upon the information, the at least one memory and the computer program code are also configured to, with the processor, cause the apparatus to cause an indication to be presented to a first user identifying the content presented to the second user by the head mounted display of the second user.
The at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus of at least one embodiment to cause content to be presented to the first user by a head mounted display of the first user. The at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus of this embodiment to compare the information relating to content presented to the second user with information relating to content presented to the first user. The indication that is caused to be presented to the first user may be based upon the comparison. In at least one embodiment, the indication that is caused to be presented to the first user may provide information regarding whether the content presented to the second user is identical to the content presented to the first user.
The at least one memory and the computer program code may be configured to, with the processor, cause the apparatus of at least one embodiment to cause the indication to be presented to the first user by causing a representation of the content presented to the second user to be presented to the first user. In this embodiment, the at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus to generate the representation of the content presented to the second user. For example, the representation of the content may include an abstraction of the content presented to the second user. In at least one embodiment, the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to generate the representation of the content presented to the second user by obscuring at least portions of the content presented to the second user.
In a further embodiment, a computer program product is provided that includes at least one non-transitory computer-readable storage medium having computer-executable program code portions stored therein with the computer-executable program code portions including program code instructions for receiving information relating to content presented to a second user by a head mounted display of the second user. Based upon the information, the computer-executable program code portions also include program code instructions for causing an indication to be presented to a first user identifying the content presented to the second user by the head mounted display of the second user.
The computer-executable program code portions of at least one embodiment may also include program instructions for causing content to be presented to the first user by a head mounted display of the first user. The computer-executable program code portions of this embodiment may also include program instructions for comparing the information relating to content presented to the second user with information relating to content presented to the first user. The indication that is caused to be presented to the first user may be based upon the comparison. In at least one embodiment, the indication that is caused to be presented to the first user may provide information regarding whether the content presented to the second user is identical to the content presented to the first user.
The computer-executable program code portions of at least one embodiment may also include program instructions for generating a representation of the content presented to the second user. In this embodiment, the program code instructions for causing the indication to be presented to the first user may include program code instructions for causing a representation of the content presented to the second user to be presented to the first user. The representation of the content may include an abstraction of the content presented to the second user. In at least one embodiment, the program instructions for generating the representation may include program code instructions for obscuring at least portions of the content presented to the second user.
In yet another embodiment, an apparatus is provided that includes means for receiving information relating to content presented to a second user by a head mounted display of the second user. Based upon the information, the apparatus also includes means for causing an indication to be presented to a first user identifying the content presented to the second user by the head mounted display of the second user.

BRIEF DESCRIPTION OF THE DRAWINGS
Having thus described certain embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Figure 1 is a perspective view illustrating first and second users wearing respective head mounted displays;
Figure 2 is a block diagram of an apparatus that may be specifically configured in accordance with an example embodiment of the present invention;
Figure 3 is a flow chart illustrating operations performed, such as by the apparatus of Figure 2, in accordance with an example embodiment of the present invention;
Figure 4 is a perspective view of the head mounted display of the second user in which an indication of the content presented to the second user is presented to the first user in a manner in which the indication is superimposed upon the head mounted display of the second user in accordance with an example embodiment of the present invention;
Figure 5 is a perspective view of the second user and the head mounted display worn by the second user with an indication of the content presented to the second user being presented to the first user with the indication appearing to float proximate to the second user in accordance with an example embodiment of the present invention; and
Figure 6 is a perspective view of the head mounted display of the second user with an indication of the content presented to the second user being presented to the first user in the form of a ring about one lens of the head mounted display of the second user in accordance with an example embodiment of the present invention.
DETAILED DESCRIPTION
Some embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. Indeed, various embodiments of the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. As used herein, the terms "data," "content," "information," and similar terms may be used interchangeably to refer to data capable of being transmitted, received and/or stored in accordance with embodiments of the present invention. Thus, use of any such terms should not be taken to limit the spirit and scope of embodiments of the present invention.
Additionally, as used herein, the term 'circuitry' refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present. This definition of 'circuitry' applies to all uses of this term herein, including in any claims. As a further example, as used herein, the term 'circuitry' also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware. As another example, the term 'circuitry' as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
As defined herein, a "computer-readable storage medium," which refers to a non-transitory physical storage medium (e.g., volatile or non-volatile memory device), can be differentiated from a "computer-readable transmission medium," which refers to an electromagnetic signal.
A method, apparatus and computer program product are provided in accordance with at least one embodiment of the present invention in order to provide an indication to a first user relating to the content presented to a second user by the head mounted display worn by the second user. Based upon the indication that is presented to the first user, such as by the head mounted display of the first user, relating to the content presented to the second user by the head mounted display of the second user, collaboration between the first and second users may be facilitated since the first user will be aware of the content that has been reviewed by the second user including the content that is currently being reviewed by the second user. Consequently, the first and second users need not review the same content on the same computer monitor in order to be aware of the content that has been or is currently being reviewed by the other user and, instead, the method, apparatus and computer program product of an example embodiment provide a mechanism for determining the content that is being presented to another user, while, in at least one embodiment, permitting the content to remain private or at least to limit disclosure of the content.
Referring now to Figure 1, first and second users 10, 14 are illustrated. The users may be collaborating with one another on a project or the first user may simply have an interest in the content being reviewed by the second user, even in the absence of collaboration therebetween. As shown in Figure 1 , each of the first and second users wears a head mounted display 12, 16. A head mounted display permits a user to optically view a scene external to the head mounted display. With reference to Figure 1 by way of example, a head mounted display may be in the form of a pair of glasses having a pair of lenses and a pair of side stems configured to support the glasses upon the user's ears. The glasses may be worn by the user such that the user may view a scene, e.g., a field of view, through the lenses of the glasses. However, the glasses may also be configured to present a visual representation of other information so as to augment or supplement the user's view of the scene through the lenses of the glasses. The information presented by the head mounted display may augment the objects in the scene viewed through the head mounted display, such as by identifying or otherwise providing more information regarding one or more of the objects viewed through the head mounted display. Alternatively, the information presented by the head mounted display may be unrelated to the objects in the scene viewed through the head mounted display, but may otherwise provide information that may be of interest to the user, such as content that may be relevant to a project on which the first and second users are collaborating. Regardless of the type of information presented by the head mounted display, a head mounted display as exemplified by the glasses may support augmented reality and other applications.
While augmented reality glasses are one example of a head mounted display 12, 16, a head mounted display may be embodied in a number of different manners with a variety of form factors, each of which may permit a user to optically see through the display so as to view the user's surroundings and each of which may benefit from the method, apparatus and computer program product of an example embodiment of the present invention as described below. For example, the head mounted display may be in the form of a head mounted visor or a helmet mounted display, such as a helmet worn by a motorcyclist, a pilot or the like.
The content presented to a user by a respective head mounted display is generally difficult, if not impossible, for another user to view. For example, content presented by the head mounted display 16 of the second user 14 may be difficult, if not impossible, for the first user 10 to see. However, the method, apparatus and computer program product of an example embodiment of the present invention facilitate the provision of an indication to the first user of the content presented to the second user by the head mounted display of the second user. In this regard, an apparatus 20 may be provided and may be specifically configured in accordance with an example embodiment of the present invention. In accordance with an example embodiment of the present invention, the apparatus may be associated with the head mounted display 12 of the first user in order to receive information regarding the content presented by the head mounted display of the second user and to provide an indication to the first user identifying the content presented by the head mounted display of the second user. In at least one embodiment, the apparatus may be embodied by the head mounted display of the first user. Alternatively, the apparatus may be embodied by a computing device that is remote from the head mounted display of the first user, but that is in communication therewith, such as via wireless communication, e.g., via Bluetooth communication, Wi-Fi or another wireless network, or via wired communication. For example, a computing device, such as a personal digital assistant (PDA), mobile telephone, smartphone, pager, mobile television, gaming device, laptop computer, camera, tablet computer, touch surface, video recorder, audio/video player, radio, electronic book, positioning device (e.g., global positioning system (GPS) device), or any combination of the aforementioned, and other types of voice and text communications systems, may embody the apparatus of at least one embodiment and be in communication with the head mounted displays of the first and second users. In this embodiment, the computing device that embodies the apparatus may then provide direction to another computing device, such as the head mounted display of the first user, to direct the presentation of an indication of the content presented by the head mounted display of the second user.
Regardless of its implementation, the apparatus 20 may include or otherwise be in communication with a processor 22, a memory device 24 and a communication interface 26. It should be noted that while Figure 2 illustrates one example of a configuration of an apparatus that may be specifically configured in accordance with an embodiment of the present invention, numerous other configurations may also be used to implement other embodiments. As such, in some embodiments, although devices or elements are shown as being in communication with each other, hereinafter such devices or elements should be considered to be capable of being embodied within the same device or element and thus, devices or elements shown in communication should be understood to alternatively be portions of the same device or element.
In some embodiments, the processor 22 (and/or co-processors or any other processing circuitry assisting or otherwise associated with the processor) may be in communication with the memory device 24 via a bus for passing information among components of the apparatus. The memory device may include, for example, one or more volatile and/or non-volatile memories. In other words, for example, the memory device may be an electronic storage device (e.g., a computer readable storage medium) comprising gates configured to store data (e.g., bits) that may be retrievable by a machine (e.g., a computing device like the processor). The memory device may be configured to store information, data, content, applications, instructions, or the like for enabling the apparatus 20 to carry out various functions in accordance with an example embodiment of the present invention. For example, the memory device could be configured to buffer input data for processing by the processor. Additionally or alternatively, the memory device could be configured to store instructions for execution by the processor.
As noted above, the apparatus 20 may be embodied by a computing device, such as a head mounted display 12 of the first user 10 or a computing device in communication with the head mounted display of the first user, configured to employ an example embodiment of the present invention. However, in some embodiments, the apparatus may be embodied as a chip or chip set. In other words, the apparatus may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The apparatus may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single "system on a chip." As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
The processor 22 may be embodied in a number of different ways. For example, the processor may be embodied as one or more of various hardware processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. As such, in some embodiments, the processor may include one or more processing cores configured to perform independently. A multi-core processor may enable multiprocessing within a single physical package. Additionally or alternatively, the processor may include one or more processors configured in tandem via the bus to enable independent execution of instructions, pipelining and/or multithreading.
In an example embodiment, the processor 22 may be configured to execute instructions stored in the memory device 24 or otherwise accessible to the processor. Alternatively or additionally, the processor may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to an embodiment of the present invention while configured accordingly. Thus, for example, when the processor is embodied as an ASIC, FPGA or the like, the processor may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor is embodied as an executor of software instructions, the instructions may specifically configure the processor to perform the algorithms and/or operations described herein when the instructions are executed. However, in some cases, the processor may be a processor of a specific device (e.g., a head mounted display) configured to employ an embodiment of the present invention by further configuration of the processor by instructions for performing the algorithms and/or operations described herein. The processor may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor.
Meanwhile, the communication interface 26 may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a head mounted display 12 of the first user 10, such as a computing device embodied by the head mounted display of the first user, in an embodiment in which the apparatus 20 is remote from the head mounted display. As shown in Figure 2, the communication interface may also be configured to communicate with the head mounted display 16 of the second user 14, such as via wireless communication, e.g., Bluetooth, Wi-Fi or another wireless network, or via wireline. As a further example, the head mounted display of the first user may include a display 28 and the communication interface may be configured to direct the presentation of information upon the display. Additionally, the communication interface may be configured to communicate with other components of the computing device in an instance in which the apparatus is embodied by a computing device embodied by the head mounted display of the first user or with a remote computing device in an instance in which the apparatus is separate from the head mounted display of the first user.
In this regard, the communication interface 26 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications wirelessly. Additionally or alternatively, the communication interface may include the circuitry for interacting with the antenna(s) to cause transmission of signals via the antenna(s) or to handle receipt of signals received via the antenna(s). For example, the communications interface may be configured to communicate wirelessly with the sensor(s) 18, such as via Wi-Fi, Bluetooth or other wireless communications techniques. In some instances, the communication interface may alternatively or also support wired communication. As such, for example, the communication interface may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms. For example, the communication interface may be configured to communicate via wired communication with other components of the computing device including, for example, other components of the head mounted display 12 of the first user 10 in an embodiment in which the apparatus 20 is embodied by the head mounted display of the first user.
Referring now to Figure 3, the operations performed, such as by the apparatus 20 of Figure 2, in accordance with an example embodiment of the present invention are illustrated. In this regard, content may be presented to the second user 12 by the head mounted display 16 of the second user, such as by causing the display of content on the head mounted display of the second user. The content may be presented by the head mounted display of the second user such that the content appears in the field of view of the second user and appears to float within the field of view of the second user. Alternatively, the content may be presented by the head mounted display of the second user so as to be associated with and positionally fixed to an object within the field of view of the second user, such as a table or a wall. Once the content is presented by the head mounted display of the second user, the second user may readily view the content while concurrently viewing their surroundings. However, the first user 10 may be unable to view the content presented by the head mounted display of the second user, at least not clearly.
As shown in Figure 3, however, an apparatus 20 of at least one embodiment may be configured to receive information relating to the content presented to the second user 12 by the head mounted display 16 of the second user. See block 32. In this regard, the head mounted display of the second user or a computing device associated with the head mounted display of the second user may be configured to transmit information relating to the content presented to the second user by the head mounted display of the second user. The apparatus therefore includes means, such as the processor 22, the communication interface 26 or the like, for receiving the information relating to content presented to the second user by the head mounted display of the second user.
Various types of information relating to the content that is presented by the head mounted display 16 of the second user 12 may be provided and, in turn, received by the apparatus 20. For example, the information may be the actual content or an abstraction of the content. In regards to the abstraction of the content, a summary of the content or a blurred representation may be received instead of the actual content itself. As another example, the abstraction of the content that is received may include only the title of the content, such as the name of a document, the subject of an electronic mail message or the like, or certain predefined field(s) of the content instead of the content itself. As yet another example, the abstraction of the content may include metadata associated with the content instead of the content itself. In an instance in which the content is an image that is presented to the second user by the head mounted display of the second user, the image may be provided to and received by the apparatus or an abstraction of the image, such as an image in which the colors of the pixels have been averaged or otherwise modified, may be provided and received by the apparatus. In an instance in which the content is moving, such as a movie or an animation, the abstraction of the content may be, for example, a pulsing light with the pulsations synchronized with the content.
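By way of a non-limiting illustration, the sketch below shows one way in which such an abstraction might be generated before the information is transmitted. The use of the Pillow imaging library and the particular field names are assumptions of the illustration rather than features of any embodiment.

```python
# Hypothetical sketch: build an abstraction of the content presented to the
# second user so that the first user learns what is shown without seeing it.
from PIL import Image, ImageFilter  # Pillow is an assumed dependency


def abstract_image(path: str) -> Image.Image:
    """Return a coarse version of an image: pixel colors averaged over blocks, then blurred."""
    img = Image.open(path).convert("RGB")
    w, h = img.size
    # Downscale and upscale to average the pixel colors, then apply a blur.
    coarse = img.resize((max(1, w // 16), max(1, h // 16))).resize((w, h))
    return coarse.filter(ImageFilter.GaussianBlur(radius=8))


def abstract_document(content: dict) -> dict:
    """Return only the title and metadata of a document, not its body."""
    return {"title": content.get("title"), "metadata": content.get("metadata")}
```

For moving content, the transmitted abstraction could instead be a timing signal, such as frame timestamps, from which a pulsing light synchronized with the content may be driven.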
Referring now to block 36 of Figure 3, the apparatus 20 may be configured to cause an indication 40 to be presented to the first user 10 relating to the content presented to the second user 12 by the head mounted display 16 of the second user. See, for example, Figures 4 and 5. In at least one embodiment, the indication that is presented to the first user identifying the content presented to the second user by the head mounted display of the second user is presented to the first user by the head mounted display 14 of the first user. Thus, the apparatus may include means, such as the processor 22, the communication interface 26 or the like, for causing an indication to be presented to the first user relating to the content presented by the head mounted display of the second user, such as by sending instructions to the display 28 of the head mounted display worn by the first user. As described below, the apparatus 20, such as the processor 22, the communication interface 26 or the like, may be configured to cause various types of indications 40 to be presented to the first user 10 via the head mounted display 14 of the first user. For example, the indication may be the content itself. Alternatively, the indication may be a representation of the content which provides information regarding the content without disclosing the content itself. For example, the representation of the content may be an abstraction of the content as described above. In this regard, the abstraction may be provided by the head mounted display 16 of the second user 12 as described above or may be generated by the apparatus, such as the processor, following receipt of the content itself from the head mounted display 16 of the second user 12.
The apparatus 20, such as the processor 22, the communication interface 26 or the like, may be configured to cause the indication 40 to be presented to the first user 10 by the head mounted display 14 of the first user such that the indication is presented at a location that is defined in relation to the second user 12, such as at a predefined position relative to the second user. As such, the first user may view the second user within the field of view seen through the head mounted display worn by the first user. At a location that is defined in relation to the second user, the indication may be presented by the head mounted display of the first user such that the first user is able to see both the second user and the indication identifying the content presented to the second user by the head mounted display of the second user.
In regards to the presentation of the indication 40 at a location defined in relation to the second user 12, the apparatus 20, such as the processor 22, the communication interface 26 or the like, of at least one embodiment may be configured to cause the indication to be presented to the first user 10 by the head mounted display 14 of the first user such that the indication appears to be superimposed upon the second user and/or an object associated with the second user. By being superimposed upon the second user and/or an object associated with the second user, the apparatus, such as the processor, the communication interface or the like, causes the indication to be presented in a manner that at least partially overlays the second user and/or an object associated with the second user. In some embodiments, the indication that is superimposed upon the second user and/or an object associated with the second user is translucent such that the portion of the second user and/or the portion of the object associated with the second user that is overlaid by the indication can be seen through the indication. In other embodiments, the indication that is superimposed upon the second user and/or an object associated with the second user may be opaque such that the portion of the second user and/or the portion of the object associated with the second user that underlies the indication is not visible.
In at least one embodiment, the apparatus 20, such as the processor 22, may be configured to cause the indication 40 to be superimposed upon the same portion of the second user 12 and/or the same portion of an object associated with the second user even as there is relative movement between the head mounted display 14 of the first user 10 and the second user, such as may be attributable to the second user moving within the field of view of the first user and/or as the head of the first user and, therefore, the head mounted display worn by the first user is moved relative to the second user. In this embodiment, the apparatus, such as the processor, may be configured to track the location of the portion of the second user and/or the portion of an object associated with the second user upon which the indication is to be superimposed. In this regard, an image of the field of view of the first user through the head mounted display of the first user may be captured, such as by a camera incorporated within or associated with the head mounted display of the first user. The apparatus, such as the processor, may be configured to analyze the image and to determine the location within the field of view of the head mounted display of the first user of the portion of the second user and/or the portion of the object associated with the second user upon which the indication is to be superimposed. The apparatus, such as the processor, may thereafter cause the indication to be superimposed upon the location of the portion of the second user and/or the portion of the object associated with the second user that was determined from the analysis of the image, such as by sending instructions to the display 28 of the head mounted display worn by the first user. By repeating this process of image capture and analysis, the apparatus, such as the processor, may cause the indication to be superimposed upon the same portion of the second user and/or the same portion of an object associated with the second user even as there is relative movement between the head mounted display of the first user and the second user.
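A minimal sketch of such a capture-analyze-render loop is given below. The callables passed to the function stand in for whatever camera, detection and display interfaces a particular head mounted display exposes; they are placeholders of the illustration, not part of any embodiment.

```python
import time
from typing import Callable, Optional, Tuple

Region = Tuple[int, int, int, int]  # x, y, width, height in display coordinates


def track_and_superimpose(
    capture_frame: Callable[[], object],                 # returns the current field-of-view image
    locate_target: Callable[[object], Optional[Region]], # finds the second user, or None if absent
    draw_overlay: Callable[[str, Region], None],         # renders the indication over the region
    indication: str,
    frames: int = 300,
    fps: float = 30.0,
) -> None:
    """Keep the indication superimposed on the same portion of the second user
    as the first user's display and the second user move relative to each other."""
    period = 1.0 / fps
    for _ in range(frames):
        frame = capture_frame()
        region = locate_target(frame)
        if region is not None:
            draw_overlay(indication, region)
        time.sleep(period)
```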
Various objects may be associated with the second user 12. For example, objects that are worn or carried by the second user may be considered to be associated with the second user. In this regard, clothing worn by the user or a hat worn by the user may be considered to be associated with the second user. Additionally, the head mounted display 16, such as a pair of augmented reality glasses, may be considered to be associated with the second user. As such, the apparatus 20 of at least one embodiment, such as the processor 22, the communication interface 26 or the like, may be configured to cause the indication 40 to be presented to the first user 10 by the head mounted display 14 of the first user such that the indication appears to be superimposed upon the head mounted display of the second user as shown in Figure 4.
As an alternative approach to the presentation of the indication at a location defined in relation to the second user 12, the apparatus 20, such as the processor 22, the communication interface 26 or the like, may be configured to cause the indication to be presented to the first user 10 by the head mounted display 14 of the first user such that the indication appears to be proximate to the second user. By being proximate to the second user, the indication that is caused to be presented to the first user appears to be positioned at least partially, if not entirely, beyond the second user, such as by being spaced apart from the second user. However, the indication that is caused to be presented to the first user in this embodiment may be relatively near the second user, such as by being within a predefined distance of the second user, in order to maintain the proximal relationship between the second user and the indication.
In this embodiment, the apparatus 20, such as the processor 22, the communication interface 26 or the like, may be configured to cause the indication 40 to be presented to the first user 10 by the head mounted display 14 of the first user such that the indication appears to float proximate to the second user as shown, for example, in Figure 5. In this regard, the indication may be caused to be presented to the first user in various forms, such as a voice bubble, a thought bubble or the like. As described above, the apparatus, such as a processor, may be configured to track the position of the second user within the field of view of the first user through the head mounted display worn by the first user, such as by repeatedly capturing images of the field of view of the first user and analyzing the images to identify the position of the second user therein. As such, the apparatus, such as the processor, may be configured to cause the indication to be presented in the same relative position with respect to the second user, even as relative movement between the second user and the head mounted display of the first user occurs. As such, the indication that is presented to the first user appears to float relative to the second user.
As shown in Figure 5, the apparatus 20, such as the processor 22, may be additionally configured to cause the indication 40 that is presented to the first user 10 by the head mounted display 14 of the first user to include a visual association 42 with the second user 12. The visual association may have various forms, such as a line, an arrow, a pointer or the like, and serves to visibly link or otherwise associate the indication with the second user and/or an object associated with the second user. As noted above, various objects may be associated with the second user, including objects, e.g., a hat, clothing, etc., that are worn or carried by the second user. Additionally, the head mounted display 16, such as a pair of augmented reality glasses, may be considered to be associated with the second user. In the embodiment illustrated in Figure 5, for example, the visual association has the form of a link that associates the indication with the second user.
As shown in block 30 of Figure 3, the apparatus 20 of at least one embodiment may also be configured to cause content to be presented to the first user 10 by the head mounted display 14 of the first user. Thus, the apparatus may include means, such as the processor 22, the communication interface 26 or the like, for causing content to be presented via the head mounted display of the first user. In this embodiment, the first user may therefore not only see the field of view through the head mounted display including any indication that is presented relative to the second user 12, but may also see the content that is presented by the head mounted display of the first user.
In this embodiment and as shown in block 34 of Figure 3, the apparatus 20 may be configured to compare the information relating to the content presented to the second user 12 that was received as shown, for example, in block 32 with information relating to the content presented by the first user 10 as shown in block 30. Thus, the apparatus may include means, such as the processor 22, the communication interface 26 or the like, for making such a comparison. As a result of this comparison, the apparatus, such as the processor, may determine if the same content is being concurrently presented to both the first and second users via the respective head mounted displays or whether different content is being presented to the first and second users via the respective head mounted displays.
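One possible way to perform such a comparison without exchanging the content itself is to compare fingerprints of the information describing the content presented to each user, as in the sketch below. The field names and the use of a hash are illustrative assumptions rather than requirements of any embodiment.

```python
import hashlib


def content_fingerprint(info: dict) -> str:
    """Fingerprint the information relating to presented content so that two
    displays can be compared without sharing the underlying content."""
    basis = "|".join(str(info.get(k, "")) for k in ("document_id", "version", "page"))
    return hashlib.sha256(basis.encode("utf-8")).hexdigest()


def compare_presented_content(first_info: dict, second_info: dict) -> str:
    """Classify how the content presented to the two users relates."""
    if content_fingerprint(first_info) == content_fingerprint(second_info):
        return "identical"  # could drive, e.g., the illuminated ring of Figure 6
    if first_info.get("document_id") == second_info.get("document_id"):
        return "same content, different portion or version"
    return "different content"
```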
In this embodiment, the indication that is caused to be presented to the first user 10 may be based upon the comparison. For example, the indication that is caused to be presented to the first user may provide information regarding whether the content presented to the second user 12 is identical to the content presented to the first user. This indication of the commonality of the content presented to both the first and second users may be provided in various manners, but is provided in the embodiment of Figure 6 by the presentation of an indication 44, such as by the head mounted display 14 of the first user, that appears to the first user as the illumination or coloring of a predefined region, such as a ring about one of the lenses of the head mounted display 16 worn by the second user. Alternatively, this indication may be presented, such as by the head mounted display of the first user, so as to appear to the first user as a halo positioned above the head of the second user, as a star presented upon the chest of the second user or otherwise. As such, the first user may quickly determine whether the second user is reviewing the same content as the first user by determining whether the indication that is representative of the commonality of the content presented to both the first and second users is visible.
Other types of comparisons may be made in other embodiments. For example, the indication that is caused to be presented to the first user 10 may provide information regarding whether the first and second users are currently viewing different portions, e.g., different pages, of the same content. As another example, the indication that is caused to be presented to the first user may provide information regarding whether the first and second users are currently viewing different versions of the same content. In a further example, the indication that is caused to be presented to the first user may provide information regarding whether the first and second users are currently viewing content that is substantially similar, such as content that deviates by no more than a predefined amount or percent.
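As one illustration of a "substantially similar" test, the sketch below treats two pieces of textual content as similar when the fraction of differing words stays within a predefined percentage. The word-level diff and the 10% threshold are assumptions of the illustration; other embodiments might compare versions, page ranges or structural metadata instead.

```python
def substantially_similar(first_text: str, second_text: str, max_deviation: float = 0.10) -> bool:
    """Return True if the two texts deviate by no more than max_deviation (a sketch only)."""
    first_words, second_words = first_text.split(), second_text.split()
    if not first_words and not second_words:
        return True
    # Count positions that differ, plus any length mismatch.
    differing = sum(a != b for a, b in zip(first_words, second_words))
    differing += abs(len(first_words) - len(second_words))
    deviation = differing / max(len(first_words), len(second_words))
    return deviation <= max_deviation
```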
The content that is presented to the second user 12 by the head mounted display 16 of the second user may include at least portions that are confidential, private or the like. Content may be indicated by the head mounted display of the second user to be private in various manners. For example, all content may be considered private, content that is to be considered private may be marked, content that is to be considered private may be indicated by the second user, content associated with one or more respective applications may be considered private while content associated with other applications may not be considered private, etc. The privacy of the content presented to the second user by the head mounted display of the second user may be maintained by not causing the content itself to be displayed for the first user, but, instead, causing a summary or an abstraction of the content to be presented to the first user 10 or causing a visible indication representative of the commonality of the content presented to both the first and second users to be presented to the first user.
In at least one embodiment, however, the apparatus 20, such as the processor 22, may be configured to generate a representation of the content presented to the second user 12 and, in the course of generating the representation, may obscure at least portions of the content presented to the second user. The portions of the content that are obscured may be defined in various manners including the obscuration of the content within predefined fields, the obscuration of content in a predefined pattern that effectively masks the content, the obscuration of content identified by the second user or by a third party or otherwise. Following the obscuration of portions of the content that is presented to the second user, the apparatus, such as the processor, the communication interface 26 or the like, may be configured to cause the representation of the content, including those portions that have been obscured, to be presented to the first user 10 via the head mounted display 14 of the first user. As a result of the obscuration of at least portions of the content, the first user may be able to identify the content that is currently being reviewed by the second user so as to determine whether the second user is reviewing the same content as is currently being reviewed by the first user, but the privacy of the content may be maintained.
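A minimal sketch of obscuring predefined fields is shown below; the particular field names and the block-character masking are illustrative assumptions, and the same idea could be applied to regions of an image rather than fields of a document.

```python
def obscure_fields(content: dict, private_fields=("body", "recipients", "figures")) -> dict:
    """Return a representation of the content in which predefined fields are
    obscured so the first user can identify the content without reading it."""
    representation = {}
    for key, value in content.items():
        if key in private_fields:
            representation[key] = "\u2588" * min(len(str(value)), 20)  # mask with block characters
        else:
            representation[key] = value
    return representation
```

For example, obscure_fields({"title": "Q3 budget", "body": "..."}) would leave the title visible while masking the body, so the first user can tell which document is being reviewed without seeing its contents.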
By causing an indication to be presented to the first user 10 identifying the content presented to the second user 12 by the head mounted display 16 of the second user, the first user may readily identify the content that is being reviewed by the second user and, in one instance, may determine whether the second user is reviewing the same content as is currently being reviewed by the first user. As such, the first and second users may collaborate and, in the course of such collaboration, may review the same documentation or other content as evidenced by the indications that are caused to be presented in accordance with an embodiment of the present invention.
As described above, Figure 3 is a flowchart of an apparatus, method, and computer program product according to example embodiments of the invention. It will be understood that each block of the flowchart, and combinations of blocks in the flowchart, may be implemented by various means, such as hardware, firmware, processor, circuitry, and/or other devices associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above may be stored by a memory device 24 of an apparatus 20 employing an embodiment of the present invention and executed by a processor 22 of the apparatus. As will be appreciated, any such computer program instructions may be loaded onto a computer or other programmable apparatus (e.g., hardware) to produce a machine, such that the resulting computer or other programmable apparatus implements the functions specified in the flowchart blocks.
These computer program instructions may also be stored in a computer-readable memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture the execution of which implements the function specified in the flowchart blocks. The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the flowchart blocks.
Accordingly, blocks of the flowcharts support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowcharts, and combinations of blocks in the flowcharts, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims

THAT WHICH IS CLAIMED:
1. A method comprising:
receiving information relating to content presented to a second user by a head mounted display of the second user; and
based upon the information, causing, with a processor, an indication to be presented to a first user identifying the content presented to the second user by the head mounted display of the second user.
2. A method according to Claim 1 further comprising:
causing content to be presented to the first user by a head mounted display of the first user; and
comparing the information relating to content presented to the second user with information relating to content presented to the first user,
wherein the indication that is caused to be presented to the first user is based upon the comparison.
3. A method according to Claim 2 wherein the indication that is caused to be presented to the first user provides information regarding whether the content presented to the second user is identical to the content presented to the first user.
4. A method according to Claim 1 wherein causing the indication to be presented to the first user comprises causing a representation of the content presented to the second user to be presented to the first user.
5. A method according to Claim 4 further comprising generating the representation of the content presented to the second user, wherein the representation of the content comprises an abstraction of the content presented to the second user.
6. A method according to Claim 4 further comprising generating the representation of the content presented to the second user, wherein generating the representation comprises obscuring at least portions of the content presented to the second user.
7. A method according to Claim 1 wherein causing the indication to be presented to the first user comprises causing the indication to be presented to the first user by a head mounted display of the first user such that the indication is presented at a location defined in relation to the second user.
8. A method according to Claim 7 wherein causing the indication to be presented to the first user further comprises causing the indication to be presented to the first user by the head mounted display of the first user such that the indication appears to be superimposed upon at least one of the second user or an object associated with the second user.
9. A method according to Claim 7 wherein causing the indication to be presented to the first user further comprises causing the indication to be presented to the first user by the head mounted display of the first user such that the indication appears to be proximate to the second user.
10. An apparatus comprising at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the processor, cause the apparatus to at least:
receive information relating to content presented to a second user by a head mounted display of the second user; and
based upon the information, cause an indication to be presented to a first user identifying the content presented to the second user by the head mounted display of the second user.
11. An apparatus according to Claim 10 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to:
cause content to be presented to the first user by a head mounted display of the first user; and
compare the information relating to content presented to the second user with information relating to content presented to the first user,
wherein the indication that is caused to be presented to the first user is based upon the comparison.
12. An apparatus according to Claim 11 wherein the indication that is caused to be presented to the first user provides information regarding whether the content presented to the second user is identical to the content presented to the first user.
13. An apparatus according to Claim 10 wherein the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to cause the indication to be presented to the first user by causing a representation of the content presented to the second user to be presented to the first user.
14. An apparatus according to Claim 13 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to generate the representation of the content presented to the second user, wherein the representation of the content comprises an abstraction of the content presented to the second user.
15. An apparatus according to Claim 13 wherein the at least one memory and the computer program code are further configured to, with the processor, cause the apparatus to generate the representation of the content presented to the second user, wherein the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to generate the representation by obscuring at least portions of the content presented to the second user.
16. A computer program product comprising at least one non-transitory computer-readable storage medium having computer-executable program code portions stored therein, the computer-executable program code portions comprising program code instructions for:
receiving information relating to content presented to a second user by a head mounted display of the second user; and
based upon the information, causing an indication to be presented to a first user identifying the content presented to the second user by the head mounted display of the second user.
17. A computer program product according to Claim 16 wherein the computer-executable program code portions further comprise program code instructions for:
causing content to be presented to the first user by a head mounted display of the first user; and
comparing the information relating to content presented to the second user with information relating to content presented to the first user,
wherein the indication that is caused to be presented to the first user is based upon the comparison.
18. A computer program product according to Claim 17 wherein the indication that is caused to be presented to the first user provides information regarding whether the content presented to the second user is identical to the content presented to the first user.
19. A computer program product according to Claim 16 wherein the computer-executable program code portions further comprise program code instructions for generating a representation of the content presented to the second user, wherein the program code instructions for causing the indication to be presented to the first user comprise program code instructions for causing a representation of the content presented to the second user to be presented to the first user, and wherein the representation of the content comprises an abstraction of the content presented to the second user.
20. A computer program product according to Claim 16 wherein the computer-executable program code portions further comprise program code instructions for generating a representation of the content presented to the second user, wherein the program code instructions for causing the indication to be presented to the first user comprise program code instructions for causing a representation of the content presented to the second user to be presented to the first user, and wherein the program code instructions for generating the representation comprise program code instructions for obscuring at least portions of the content presented to the second user.
Families Citing this family (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
US20150205111A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. Optical configurations for head worn computing
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US20150277118A1 (en) 2014-03-28 2015-10-01 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US9299194B2 (en) * 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US20160019715A1 (en) 2014-07-15 2016-01-21 Osterhout Group, Inc. Content presentation in head worn computing
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9651788B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9532714B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US20150205135A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. See-through computer display systems
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9811159B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US20150241963A1 (en) 2014-02-11 2015-08-27 Osterhout Group, Inc. Eye imaging in head worn computing
US20160187651A1 (en) 2014-03-28 2016-06-30 Osterhout Group, Inc. Safety for a vehicle operator with an hmd
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
US20160239985A1 (en) 2015-02-17 2016-08-18 Osterhout Group, Inc. See-through computer display systems
US10878775B2 (en) 2015-02-17 2020-12-29 Mentor Acquisition One, Llc See-through computer display systems
US9652035B2 (en) * 2015-02-23 2017-05-16 International Business Machines Corporation Interfacing via heads-up display using eye contact
US10235808B2 (en) * 2015-08-20 2019-03-19 Microsoft Technology Licensing, Llc Communication system
US10169917B2 (en) 2015-08-20 2019-01-01 Microsoft Technology Licensing, Llc Augmented reality
US11609427B2 (en) 2015-10-16 2023-03-21 Ostendo Technologies, Inc. Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays
US11106273B2 (en) 2015-10-30 2021-08-31 Ostendo Technologies, Inc. System and methods for on-body gestural interfaces and projection displays
CN105491416B (en) * 2015-11-25 2020-03-03 Tencent Technology (Shenzhen) Co., Ltd. Augmented reality information transmission method and device
US10345594B2 (en) 2015-12-18 2019-07-09 Ostendo Technologies, Inc. Systems and methods for augmented near-eye wearable displays
US10578882B2 (en) 2015-12-28 2020-03-03 Ostendo Technologies, Inc. Non-telecentric emissive micro-pixel array light modulators and methods of fabrication thereof
US10353203B2 (en) 2016-04-05 2019-07-16 Ostendo Technologies, Inc. Augmented/virtual reality near-eye displays with edge imaging lens comprising a plurality of display devices
US10453431B2 (en) 2016-04-28 2019-10-22 Ostendo Technologies, Inc. Integrated near-far light field display systems
US10522106B2 (en) 2016-05-05 2019-12-31 Ostendo Technologies, Inc. Methods and apparatus for active transparency modulation
US10282908B2 (en) * 2016-12-16 2019-05-07 Lenovo (Singapore) Pte. Ltd. Systems and methods for presenting indication(s) of whether virtual object presented at first device is also presented at second device
US11782669B2 (en) 2017-04-28 2023-10-10 Microsoft Technology Licensing, Llc Intuitive augmented reality collaboration on visual data
US10102659B1 (en) 2017-09-18 2018-10-16 Nicholas T. Hariton Systems and methods for utilizing a device as a marker for augmented reality content
US10105601B1 (en) 2017-10-27 2018-10-23 Nicholas T. Hariton Systems and methods for rendering a virtual content object in an augmented reality environment
US10636188B2 (en) 2018-02-09 2020-04-28 Nicholas T. Hariton Systems and methods for utilizing a living entity as a marker for augmented reality content
US10198871B1 (en) 2018-04-27 2019-02-05 Nicholas T. Hariton Systems and methods for generating and facilitating access to a personalized augmented rendering of a user
US10712901B2 (en) * 2018-06-27 2020-07-14 Facebook Technologies, Llc Gesture-based content sharing in artificial reality environments
US10635895B2 (en) 2018-06-27 2020-04-28 Facebook Technologies, Llc Gesture-based casting and manipulation of virtual content in artificial-reality environments
US10783712B2 (en) 2018-06-27 2020-09-22 Facebook Technologies, Llc Visual flairs for emphasizing gestures in artificial-reality environments
US10586396B1 (en) 2019-04-30 2020-03-10 Nicholas T. Hariton Systems, methods, and storage media for conveying virtual content in an augmented reality environment
GB2586148A (en) * 2019-08-07 2021-02-10 Sony Interactive Entertainment Inc Content generation system and method

Family Cites Families (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6388654B1 (en) * 1997-10-03 2002-05-14 Tegrity, Inc. Method and apparatus for processing, displaying and communicating images
US6686933B1 (en) * 2000-01-07 2004-02-03 Sun Microsystems, Inc. Lightweight indicator of divergence of views for collaboratively shared user interface elements
US8707185B2 (en) * 2000-10-10 2014-04-22 Addnclick, Inc. Dynamic information management system and method for content delivery and sharing in content-, metadata- and viewer-based, live social networking among users concurrently engaged in the same and/or similar content
US7532230B2 (en) 2004-01-29 2009-05-12 Hewlett-Packard Development Company, L.P. Method and system for communicating gaze in an immersive virtual environment
US7843470B2 (en) * 2005-01-31 2010-11-30 Canon Kabushiki Kaisha System, image processing apparatus, and information processing method
US8239453B2 (en) * 2005-02-25 2012-08-07 Microsoft Corporation System and method for providing one class of users of an application a view of what another class of users of the application is visually experiencing
US20060284791A1 (en) 2005-06-21 2006-12-21 National Applied Research Laboratories National Center For High-Performance Computing Augmented reality system and method with mobile and interactive function for multiple users
US9270976B2 (en) 2005-11-02 2016-02-23 Exelis Inc. Multi-user stereoscopic 3-D panoramic vision system and method
US8730156B2 (en) * 2010-03-05 2014-05-20 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
IL175835A0 (en) * 2006-05-22 2007-07-04 Rafael Armament Dev Authority Methods and systems for communicating and displaying points-of-interest
US7849322B2 (en) * 2006-07-21 2010-12-07 E-On Software Method for exchanging a 3D view between a first and a second user
JP5228307B2 (en) * 2006-10-16 2013-07-03 ソニー株式会社 Display device and display method
US20080159601A1 (en) * 2006-12-31 2008-07-03 Motorola, Inc. Face Recognition System and Corresponding Method
WO2009002567A1 (en) * 2007-06-27 2008-12-31 The University Of Hawaii Virtual reality overlay
US8334902B2 (en) * 2009-03-31 2012-12-18 Fuji Xerox Co., Ltd. System and method for facilitating the use of whiteboards
US8689293B2 (en) * 2009-07-16 2014-04-01 Panasonic Corporation Access control device, access control method, program, storage medium, and integrated circuit
US8867780B2 (en) * 2010-02-25 2014-10-21 Apple Inc. Obfuscating the display of information and removing the obfuscation using a filter
US9097891B2 (en) * 2010-02-28 2015-08-04 Microsoft Technology Licensing, Llc See-through near-eye display glasses including an auto-brightness control for the display brightness based on the brightness in the environment
EP2539759A1 (en) * 2010-02-28 2013-01-02 Osterhout Group, Inc. Local advertising content on an interactive head-mounted eyepiece
US9081632B2 (en) * 2010-04-21 2015-07-14 Lexmark International Technology Sa Collaboration methods for non-programmatic integration systems
US9298070B2 (en) 2010-04-29 2016-03-29 Hewlett-Packard Development Company, L.P. Participant collaboration on a displayed version of an object
US8743145B1 (en) * 2010-08-26 2014-06-03 Amazon Technologies, Inc. Visual overlay for augmenting reality
KR101788598B1 (en) * 2010-09-01 2017-11-15 LG Electronics Inc. Mobile terminal and information security setting method thereof
US9348141B2 (en) * 2010-10-27 2016-05-24 Microsoft Technology Licensing, Llc Low-latency fusing of virtual and real content
US9329469B2 (en) * 2011-02-17 2016-05-03 Microsoft Technology Licensing, Llc Providing an interactive experience using a 3D depth camera and a 3D projector
EP2528266A1 (en) * 2011-05-24 2012-11-28 Thomson Licensing Method and device for 3D object protection by transformation of its points
US20120299962A1 (en) * 2011-05-27 2012-11-29 Nokia Corporation Method and apparatus for collaborative augmented reality displays
US20130007895A1 (en) * 2011-06-29 2013-01-03 International Business Machines Corporation Managing access control for a screen sharing session
US8934015B1 (en) * 2011-07-20 2015-01-13 Google Inc. Experience sharing
US9497249B2 (en) * 2011-08-08 2016-11-15 Sony Corporation Information processing apparatus, information processing method, program, and information processing system
WO2013028813A1 (en) * 2011-08-23 2013-02-28 Microsoft Corporation Implicit sharing and privacy control through physical behaviors using sensor-rich devices
JP6180072B2 (en) * 2011-08-24 2017-08-16 Saturn Licensing LLC Display device, display system, and display method
WO2013028908A1 (en) * 2011-08-24 2013-02-28 Microsoft Corporation Touch and social cues as inputs into a computer
US9286711B2 (en) * 2011-09-30 2016-03-15 Microsoft Technology Licensing, Llc Representing a location at a previous time period using an augmented reality display
US9498720B2 (en) * 2011-09-30 2016-11-22 Microsoft Technology Licensing, Llc Sharing games using personal audio/visual apparatus
US20130141419A1 (en) * 2011-12-01 2013-06-06 Brian Mount Augmented reality with realistic occlusion
US9256600B2 (en) * 2012-04-13 2016-02-09 D2L Corporation Method and system for electronic content locking
US9179021B2 (en) * 2012-04-25 2015-11-03 Microsoft Technology Licensing, Llc Proximity and connection based photo sharing
US9519640B2 (en) * 2012-05-04 2016-12-13 Microsoft Technology Licensing, Llc Intelligent translations in personal see through display
US9122321B2 (en) * 2012-05-04 2015-09-01 Microsoft Technology Licensing, Llc Collaboration environment using see through displays
US8655389B1 (en) * 2012-08-10 2014-02-18 Google Inc. Method and system for enabling a user to obfuscate location coordinates by generating a blur level, and applying it to the location coordinates in a wireless communication networks
US9058813B1 (en) * 2012-09-21 2015-06-16 Rawles Llc Automated removal of personally identifiable information
US9996221B2 (en) * 2013-12-01 2018-06-12 Upskill, Inc. Systems and methods for look-initiated communication

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1435737A1 (en) * 2002-12-30 2004-07-07 Abb Research Ltd. An augmented reality system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SZALAVARI, Z. et al.: "Studierstube: An Environment for Collaboration in Augmented Reality", Virtual Reality, Virtual Press, Waltham Cross, GB, vol. 3, no. 1, 1 January 1998 (1998-01-01), pages 37-48, XP008011892, ISSN: 1359-4338, DOI: 10.1007/BF01409796 *

Also Published As

Publication number Publication date
US20140091984A1 (en) 2014-04-03
EP2900433B1 (en) 2021-06-02
US10620902B2 (en) 2020-04-14
EP2900433A1 (en) 2015-08-05

Similar Documents

Publication Publication Date Title
US10620902B2 (en) Method and apparatus for providing an indication regarding content presented to another user
US10019221B2 (en) Method and apparatus for concurrently presenting different representations of the same information on multiple displays
US11799652B2 (en) Encryption and decryption of visible codes for real time augmented reality views
CN110830786B (en) Detection and display of mixed 2D/3D content
US9201625B2 (en) Method and apparatus for augmenting an index generated by a near eye display
US9298970B2 (en) Method and apparatus for facilitating interaction with an object viewable via a display
US20150062158A1 (en) Integration of head mounted displays with public display devices
ES2778935T3 (en) Rendering a notification on a head-mounted display
US9360670B2 (en) Display method and display device for augmented reality
US8963807B1 (en) Head mounted display and method for controlling the same
US20170153698A1 (en) Method and apparatus for providing a view window within a virtual reality scene
KR20160015972A (en) The Apparatus and Method for Wearable Device
WO2014128751A1 (en) Head mount display apparatus, head mount display program, and head mount display method
EP3038061A1 (en) Apparatus and method to display augmented reality data
US9355534B2 (en) Causing display of a notification on a wrist worn apparatus
JP2017084100A (en) Information communication terminal, sharing management device, information sharing method, and computer program
JPWO2014128750A1 (en) I/O device, I/O program, and I/O method
CN107592520B (en) Imaging device and imaging method of AR equipment
TWI603225B (en) Viewing angle adjusting method and apparatus of liquid crystal display
CN104932101A (en) Head-mounted display device
Mann et al. FreeGlass for developers, "haccessibility", and Digital Eye Glass + Lifeglogging research in a (sur/sous)veillance society
US10679587B2 (en) Display of supplemental information
JP2014096057A (en) Image processing apparatus
US11269183B2 (en) Display information on a head-mountable apparatus corresponding to data of a computing device
US20140095109A1 (en) Method and apparatus for determining the emotional response of individuals within a group

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 13776838

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2013776838

Country of ref document: EP