US20140101608A1 - User Interfaces for Head-Mountable Devices


Info

Publication number
US20140101608A1
Authority
US
United States
Prior art keywords
card
cards
hmd
linear arrangement
displaying
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/840,016
Inventor
Robert Allen Ryskamp
Max Benjamin Braun
Nirmal Patel
Chris McKenzie
Hayes Solos Raffle
Antonio Bernardo Monteiro Costa
Richard The
Alexander Hanbing Chen
Michael J. Lebeau
Alexander Faaborg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Google LLC
Priority to US13/840,016
Assigned to GOOGLE INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: COSTA, ANTONIO BERNARDO MONTEIRO; BRAUN, MAX BENJAMIN; CHEN, ALEXANDER HANBING; FAABORG, ALEXANDER; LEBEAU, MICHAEL J.; MCKENZIE, CHRISTOPHER DANIEL; PATEL, NIRMAL; RAFFLE, HAYES SOLOS; RYSKAMP, ROBERT ALLEN; THE, Richard
Priority to PCT/US2013/063578 (published as WO2014055948A2)
Publication of US20140101608A1
Assigned to GOOGLE LLC. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: GOOGLE INC.
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/103Formatting, i.e. changing of presentation of documents
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/002D [Two Dimensional] image generation
    • G06T11/20Drawing from basic elements, e.g. lines or circles
    • G06T11/206Drawing of charts or graphs
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0141Head-up displays characterised by optical features characterised by the informative content of the display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04806Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • Computing systems such as personal computers, laptop computers, tablet computers, cellular phones, and countless types of Internet-capable devices are prevalent in numerous aspects of modern life. Over time, the manner in which these devices are providing information to users is becoming more intelligent, more efficient, more intuitive, and/or less obtrusive.
  • The trend toward miniaturization of computing hardware and peripherals, as well as of sensors, detectors, and image and audio processors, among other technologies, has helped open up a field sometimes referred to as “wearable computing.”
  • In the area of image and visual processing and production in particular, it has become possible to consider wearable displays that place a very small image display element close enough to a wearer's (or user's) eye(s) such that the displayed image fills or nearly fills the field of view and appears as a normal-sized image, such as might be displayed on a traditional image display device.
  • the relevant technology can be referred to as “near-eye displays.”
  • Near-eye displays are fundamental components of wearable displays, also sometimes called “head-mounted displays” (HMDs).
  • a head-mounted display places a graphic display or displays close to one or both eyes of a wearer.
  • a computer processing system can be used to generate the images on a display.
  • Such displays can occupy part or all of a wearer's field of view.
  • head-mounted displays can be as small as a pair of glasses or as large as a helmet.
  • a method is provided.
  • In the method, a head-mountable device (HMD) displays a home card of an ordered plurality of cards. While displaying the home card, the HMD receives a first input.
  • the first input is associated with a first input type.
  • the possible input types include a choose-next input type and a choose-previous input type.
  • In response to the first input type being the choose-next input type, a next card of the ordered plurality of cards is obtained, where the next card is subsequent to the home card in the ordered plurality of cards, and the HMD displays the next card.
  • In response to the first input type being the choose-previous input type, a previous card of the ordered plurality of cards is obtained, where the previous card is prior to the home card in the ordered plurality of cards, and the HMD displays the previous card.
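  • The following is a minimal Python sketch of the navigation just described, assuming the ordered plurality of cards is modeled as a list; the name CardTimeline and the input-type strings are illustrative, not terms from the disclosure.

        class CardTimeline:
            """An ordered plurality of cards with a movable display position."""

            def __init__(self, cards, home_index):
                self.cards = cards        # ordered plurality of cards
                self.index = home_index   # start by displaying the home card

            def displayed_card(self):
                return self.cards[self.index]

            def handle_input(self, input_type):
                # choose-next: obtain and display the card subsequent to this one.
                if input_type == "choose-next" and self.index + 1 < len(self.cards):
                    self.index += 1
                # choose-previous: obtain and display the card prior to this one.
                elif input_type == "choose-previous" and self.index > 0:
                    self.index -= 1
                return self.displayed_card()

        timeline = CardTimeline(["photo card", "home card", "weather card"], home_index=1)
        assert timeline.handle_input("choose-next") == "weather card"
        assert timeline.handle_input("choose-previous") == "home card"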
  • a method is provided.
  • a home card is displayed.
  • the HMD includes a user-interface (UI) state.
  • the UI state is in a home UI state.
  • a first UI of the HMD receives a first input.
  • the first input is associated with a first type of input.
  • In response to the first type of input being a choose-next type of input, the HMD displays a next card of an ordered plurality of cards, where the ordered plurality of cards additionally includes the home card and the next card differs from the home card, and sets the UI state of the HMD to a timeline-next state.
  • In response to the first type of input being a choose-previous type of input, the HMD displays a previous card of the ordered plurality of cards, where the choose-previous type of input differs from the choose-next type of input and the previous card differs from both the next card and the home card, and sets the UI state of the HMD to a timeline-previous state.
  • In response to the first type of input being a tap type of input, the HMD activates a second UI of the HMD, where the first UI of the HMD is a touch-based UI and the second UI is a voice-based UI, and sets the UI state of the HMD to a voice-home state.
  • In response to the first type of input being a speech type of input, the HMD determines whether text associated with the first input matches a predetermined text and, in response to determining that the text associated with the first input matches the predetermined text, activates the second UI and sets the UI state of the HMD to the voice-home state.
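  • A sketch of these UI-state transitions, with the states and input types modeled as strings; the function name and the string constants are assumptions for illustration.

        def next_ui_state(state, input_type, spoken_text=None, hotword="ok glass"):
            """Return the HMD's new UI state for a first input received while
            the UI state is the home UI state; other inputs leave it unchanged."""
            if state != "home":
                return state
            if input_type == "choose-next":
                return "timeline-next"       # a next card is displayed
            if input_type == "choose-previous":
                return "timeline-previous"   # a previous card is displayed
            if input_type == "tap":
                return "voice-home"          # tap activates the voice-based UI
            if input_type == "speech" and spoken_text == hotword:
                return "voice-home"          # matching text activates the voice UI
            return state

        assert next_ui_state("home", "speech", spoken_text="ok glass") == "voice-home"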
  • In another aspect, a computing device is provided. The computing device includes a processor and a non-transitory computer-readable medium configured to store program instructions that, when executed by the processor, cause the computing device to carry out functions.
  • the functions include: displaying at least a portion of a first linear arrangement of cards, where the first linear arrangement includes an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type, and where each first card corresponds to a group of cards; displaying a selection region that is moveable with respect to the first linear arrangement, where a given card is selected when the selection region is aligned with the given card; in response to selection of a given first card by the selection region, displaying at least a portion of a second linear arrangement of cards, where the second linear arrangement includes an ordered plurality of the group of cards that correspond to the given first card; and in response to selection of a given second card by the selection region, displaying at least a portion of a third linear arrangement of cards, where the third linear arrangement includes one or more third cards of a third card-type, and where each third card is selectable to perform an action based on the given second card.
  • In another aspect, a non-transitory computer-readable medium is provided, configured to store program instructions that, when executed by a processor of a computing device, cause the computing device to carry out functions.
  • the functions include: displaying at least a portion of a first linear arrangement of cards, where the first linear arrangement includes an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type, and where each first card corresponds to a group of cards; displaying a selection region that is moveable with respect to the first linear arrangement, where a given card is selected when the selection region is aligned with the given card; in response to selection of a given first card by the selection region, displaying at least a portion of a second linear arrangement of cards, where the second linear arrangement includes an ordered plurality of the group of cards that correspond to the given first card; and in response to selection of a given second card by the selection region, displaying at least a portion of a third linear arrangement of cards, where the third linear arrangement includes one or more third cards of a third card-type, and where each third card is selectable to perform an action based on the given second card.
  • a computing device displays at least a portion of a first linear arrangement of cards.
  • the first linear arrangement includes an ordered plurality of cards.
  • the ordered plurality of cards includes one or more first cards of a first card-type and one or more second cards of a second card-type. Each first card corresponds to a group of cards.
  • the computing device displays a selection region that is moveable with respect to the first linear arrangement, where a given card is selected when the selection region is aligned with the given card.
  • the computing device displays at least a portion of a second linear arrangement of cards, where the second linear arrangement includes an ordered plurality of the group of cards corresponding to the given first card.
  • In response to selection of a given second card by the selection region, the computing device displays at least a portion of a third linear arrangement of cards, where the third linear arrangement includes one or more third cards of a third card-type, and where each third card is selectable to perform an action based on the given second card.
  • In another aspect, a device is provided. The device includes: means for displaying at least a portion of a first linear arrangement of cards, where the first linear arrangement includes an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type, and where each first card corresponds to a group of cards; means for displaying a selection region that is moveable with respect to the first linear arrangement, where a given card is selected when the selection region is aligned with the given card; means for, in response to selection of a given first card by the selection region, displaying at least a portion of a second linear arrangement of cards, where the second linear arrangement includes an ordered plurality of the group of cards that correspond to the given first card; and means for, in response to selection of a given second card by the selection region, displaying at least a portion of a third linear arrangement of cards, where the third linear arrangement includes one or more third cards of a third card-type, and where each third card is selectable to perform an action based on the given second card.
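  • A sketch of the three linear arrangements, with cards modeled as dictionaries; the card-type names and fields are assumptions for illustration.

        first_arrangement = [
            {"type": "bundle", "group": ["photo 1", "photo 2"]},     # first card-type
            {"type": "content", "value": "message from a contact"},  # second card-type
        ]

        def on_select(card):
            """Return the linear arrangement to display when the selection
            region is aligned with, and thereby selects, the given card."""
            if card["type"] == "bundle":
                # Second linear arrangement: the group of cards behind the bundle.
                return [{"type": "content", "value": v} for v in card["group"]]
            if card["type"] == "content":
                # Third linear arrangement: action cards, each selectable to
                # perform an action based on the selected second card.
                return [{"type": "action", "action": a, "target": card["value"]}
                        for a in ("share", "delete", "reply")]

        assert len(on_select(first_arrangement[0])) == 2   # the bundled group
        assert len(on_select(first_arrangement[1])) == 3   # the action cards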
  • FIG. 1A illustrates a wearable computing system according to an example embodiment.
  • FIG. 1B illustrates an alternate view of the wearable computing device illustrated in FIG. 1A .
  • FIG. 1C illustrates another wearable computing system according to an example embodiment.
  • FIG. 1D illustrates another wearable computing system according to an example embodiment.
  • FIGS. 1E to 1G are simplified illustrations of the wearable computing system shown in FIG. 1D , being worn by a wearer.
  • FIG. 2A illustrates a schematic drawing of a computing device according to an example embodiment.
  • FIG. 2B shows an example projection of an image by an example head-mountable device (HMD), according to an example embodiment.
  • FIG. 3 shows an example home card of an example user interface for an HMD, according to an example embodiment.
  • FIG. 4 shows example operations of a multi-tiered user model for a user interface for a head-mountable device (HMD), according to an example embodiment.
  • FIG. 5A shows a scenario of example timeline interactions, according to an example embodiment.
  • FIG. 5B shows a scenario of example timeline interactions including splicing a new card into a timeline, according to an example embodiment.
  • FIG. 5C shows a scenario for using a multi-timeline display, according to an example embodiment.
  • FIG. 6A shows an example of using a two-fingered swipe on a touch-based UI of an HMD for zoomed scrolling, according to an example embodiment.
  • FIG. 6B shows a scenario for using a clutch operation to generate a multi-card display, according to an example embodiment.
  • FIG. 6C shows a scenario for using a clutch operation to generate a multi-timeline display, according to an example embodiment.
  • FIG. 6D shows a scenario for using head movements to navigate a multi-timeline display, according to an example embodiment.
  • FIG. 7 shows a user-interface scenario including contextual menus, according to an example embodiment.
  • FIG. 8 shows a user-interface scenario including a people chooser, according to an example embodiment.
  • FIG. 9 shows a user-interface scenario with camera interactions, according to an example embodiment.
  • FIG. 10A shows a user-interface scenario with photo bundles, according to an example embodiment.
  • FIG. 10B shows a user-interface scenario with message bundles, according to an example embodiment.
  • FIG. 11 shows a user-interface scenario with a timeline having settings cards, according to an example embodiment.
  • FIG. 12 shows a user-interface scenario related to WiFi settings, according to an example embodiment.
  • FIG. 13 shows a user-interface scenario related to Bluetooth settings, according to an example embodiment.
  • FIG. 14A shows an example visual stack, according to an example embodiment.
  • FIG. 14B shows another example visual stack, according to an example embodiment.
  • FIG. 15 shows a user-interface scenario related to voice interactions, according to an example embodiment.
  • FIG. 16A is a flow chart illustrating a method, according to an example embodiment.
  • FIG. 16B is a flow chart illustrating another method, according to an example embodiment.
  • FIG. 17 is a flow chart illustrating another method, according to an example embodiment.
  • Example methods and systems are described herein. It should be understood that the words “example” and “exemplary” are used herein to mean “serving as an example, instance, or illustration.” Any embodiment or feature described herein as being an “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or features.
  • a UI for a computing device can include a timeline feature that allows the wearer to navigate through a sequence of ordered screens.
  • each screen can be referred to as a “card.”
  • Of the sequence of cards, one or more cards can be displayed, and of the displayed card(s), one card can be “focused on” for possible selection.
  • the timeline can present one card for display at a time, in which case the card being displayed is also the card being focused on.
  • when a card is selected, the card can be displayed using a single-card view that occupies substantially all of the viewing area of the display.
  • the computing device utilizing the herein-disclosed UI can be configured as an HMD, wearable computer, tablet computer, laptop computer, desktop computer, mobile telephone, and/or other computing device.
  • computing device 210 and/or remote device 230 discussed below in the context of FIG. 2A can be configured to utilize the herein-disclosed UI.
  • Each card can be associated with a certain application, object, or operation.
  • the cards can be ordered by a time associated with the card, application, object, or operation represented by the card. For example, if a card shows a photo captured by a wearer of the HMD at 2:57 PM, the time associated with the card is the time of the underlying photo object: 2:57 PM.
  • a card representing a weather application can continuously update temperature, forecast, wind, and other weather-related information, and as such, the time associated with the weather application can be the current time.
  • a card representing a calendar application can show a next appointment in 2 hours from now, and so the time associated with the card can be a time corresponding to the displayed next appointment, or 2 hours in the future.
  • the timeline feature can allow the wearer to navigate through the cards according to their associated times. For example, a wearer could move their head to the left to navigate to cards with times prior to a time associated with the focused-on card, and to the right to navigate to cards with times after the time associated with the focused-on card.
  • the wearer can use a touch pad or similar device as part of a touch-based UI to make a swiping motion in one direction on the touch-based UI to navigate to cards with times prior to the time associated with the focused-on card, and make a swiping motion in another direction to navigate to cards with times after the time associated with the focused-on card.
  • the HMD can display a “home card”, also referred to as a home screen.
  • the home card can be associated with a time of “now” or the current time. In some cases, the home card can display a clock to reinforce the association between the home card and now. Cards associated with times before now can then be viewed in the timeline prior to the home card, and cards associated with times equal to or after now can be viewed in the timeline subsequent to the home card. A sketch of this ordering follows.
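  • The sketch below assumes each non-home card carries a timestamp and the home card is pinned to the current time; the field names are illustrative.

        import time

        def ordered_timeline(cards):
            """Sort cards by associated time; the home card's time is 'now',
            so past cards fall before it and future cards after it."""
            now = time.time()
            return sorted(cards,
                          key=lambda c: now if c["kind"] == "home" else c["time"])

        now = time.time()
        cards = [
            {"kind": "calendar", "time": now + 7200},  # appointment in 2 hours
            {"kind": "photo", "time": now - 3600},     # captured an hour ago
            {"kind": "home"},                          # pinned to now
        ]
        assert [c["kind"] for c in ordered_timeline(cards)] == ["photo", "home", "calendar"]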
  • the wearer can choose to interact with some cards.
  • the wearer can tap on the touch-based UI, also referred to as performing a “tap operation”, to select the focused-on card for interaction.
  • a “contextual menu” can be used to interact with the selected card. For example, if the selected focused-on card shows a photo or an image captured by the HMD, the contextual menu can provide one or more options or operations for interacting with the selected photo, such as sharing the image with one or more people, or deleting the photo.
  • a contextual object for a contact or representation of information about a person can have options or operations such as call the contact, send a message to the contact, delete the contact, or review/update contact details such as telephone numbers, e-mail addresses, display names, etc.
  • Lists of some objects can be arranged in an order other than the time-based order used by the timeline.
  • a list of contacts can be arranged by frequency of contact; e.g., a contact for the person most-communicated-with using the HMD can be displayed first in a list of contacts, the second-most-communicated-with contact can be displayed second in the list, and so on.
  • Other orderings are possible as well.
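  • For example, a frequency-based ordering might look like the following sketch; the contact data and field names are invented for illustration.

        contacts = [
            {"name": "Mom", "times_contacted": 42},
            {"name": "Bob", "times_contacted": 7},
            {"name": "Alice", "times_contacted": 19},
        ]

        # Most-communicated-with contact first, second-most second, and so on.
        people_chooser = sorted(contacts, key=lambda c: c["times_contacted"],
                                reverse=True)
        assert [c["name"] for c in people_chooser] == ["Mom", "Alice", "Bob"]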
  • Groups of cards that share a relationship can be collected into a “bundle”, “stack”, or “deck” of cards.
  • the terms bundle of cards, stack of cards, and deck of cards are used interchangeably herein.
  • a bundle of cards can include any cards that can be considered related for a certain purpose, based on one or more criteria and/or a combination of criteria. For example, a collection of photos captured within a certain span of time can be represented as a photo bundle.
  • a collection of messages, e.g., an instant messaging session, SMS/text-message exchange, or e-mail chain, can be represented as a message bundle.
  • a bundle card can be constructed for display on the timeline that represents the bundle and, in some cases, summarizes the bundle; e.g., shows thumbnails of the photos in a photo bundle.
  • data related to the card can be used to track relationship(s) used to create bundles, e.g., a location associated with a card, an indication that the card is a photo, message, or other kind of card, a name of an application that created the card, etc.
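  • A sketch of one such relationship: grouping photo cards captured within a certain span of time into bundles. The 30-minute window and the field names are assumptions, not values from the disclosure.

        def bundle_photos(photo_cards, span_seconds=30 * 60):
            """Group time-sorted photo cards so that consecutive photos captured
            within span_seconds of each other share one bundle."""
            bundles, current = [], []
            for card in sorted(photo_cards, key=lambda c: c["time"]):
                if current and card["time"] - current[-1]["time"] > span_seconds:
                    bundles.append(current)   # gap too large: close this bundle
                    current = []
                current.append(card)
            if current:
                bundles.append(current)
            return bundles

        photos = [{"time": t} for t in (0, 60, 120, 7200)]   # capture times in seconds
        assert [len(b) for b in bundle_photos(photos)] == [3, 1]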
  • cards can be classified according to activities taken upon selection. For example, upon selection of a bundle card, the bundle card can be replaced by one or more of the cards the bundle card represents.
  • An “actionable” card can be a non-bundle card for which the HMD can perform one or more related actions.
  • a photo related to an actionable card can be shared, deleted, named, or stored by the HMD.
  • a message represented by an actionable card can be accepted, rejected, or transferred by the HMD.
  • the user interface can generate and/or use “action” cards to represent actions that can be performed by the HMD related to the actionable card.
  • the HMD can also use a speech or voice-based UI that can include one or more microphones to capture audible input, such as speech from the wearer.
  • the HMD can use speakers or a BCT to present audible output to the wearer.
  • the HMD can attempt to recognize the input as a speech command and process the command accordingly; for example, by converting the audible input to text and operating on the text.
  • the speech input can represent commands to the HMD, such as commands to search, navigate, take photos, record videos, send messages, make telephone calls, etc.
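  • A sketch of dispatching recognized speech, assuming the audible input has already been converted to text; the command table is invented for illustration.

        def handle_speech(text):
            """Map recognized speech text to an HMD command, if any."""
            commands = {
                "take a photo": lambda: "capturing photo",
                "record a video": lambda: "recording video",
                "send a message": lambda: "composing message",
            }
            action = commands.get(text.lower().strip())
            return action() if action else "unrecognized command"

        assert handle_speech("Take a photo") == "capturing photo"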
  • the UI can provide a relatively simple interface to a large collection of possible data sources. Further, by enabling operation on a collection of cards arranged in a natural fashion—according to time in one example—the wearer can readily locate and then utilize cards stored by the HMD.
  • an example system can be implemented in or can take the form of a wearable computer (also referred to as a wearable computing device).
  • a wearable computer takes the form of or includes a head-mountable device (HMD).
  • An example system can also be implemented in or take the form of other devices, such as a mobile phone, among other possibilities. Further, an example system can take the form of non-transitory computer readable medium, which has program instructions stored thereon that are executable by a processor to provide the functionality described herein. An example system can also take the form of a device such as a wearable computer or mobile phone, or a subsystem of such a device, which includes such a non-transitory computer readable medium having such program instructions stored thereon.
  • An HMD can generally be any display device that is capable of being worn on the head and places a display in front of one or both eyes of the wearer.
  • An HMD can take various forms such as a helmet or eyeglasses.
  • references to “eyeglasses” or a “glasses-style” HMD should be understood to refer to an HMD that has a glasses-like frame so that it can be worn on the head.
  • example embodiments can be implemented by or in association with an HMD with a single display or with two displays, which can be referred to as a “monocular” HMD or a “binocular” HMD, respectively.
  • FIG. 1A illustrates a wearable computing system according to an example embodiment.
  • the wearable computing system takes the form of a head-mountable device (HMD) 102 (which can also be referred to as a head-mounted display).
  • the HMD 102 includes frame elements including lens-frames 104 , 106 and a center frame support 108 , lens elements 110 , 112 , and extending side-arms 114 , 116 .
  • the center frame support 108 and the extending side-arms 114 , 116 are configured to secure the HMD 102 to a user's face via a user's nose and ears, respectively.
  • Each of the frame elements 104 , 106 , and 108 and the extending side-arms 114 , 116 can be formed of a solid structure of plastic and/or metal, or can be formed of a hollow structure of similar material so as to allow wiring and component interconnects to be internally routed through the HMD 102 . Other materials can be possible as well.
  • each of the lens elements 110 , 112 can be formed of any material that can suitably display a projected image or graphic.
  • Each of the lens elements 110 , 112 can also be sufficiently transparent to allow a user to see through the lens element. Combining these two features of the lens elements can facilitate an augmented reality or heads-up display where the projected image or graphic is superimposed over a real-world view as perceived by the user through the lens elements.
  • the extending side-arms 114 , 116 can each be projections that extend away from the lens-frames 104 , 106 , respectively, and can be positioned behind a user's ears to secure the HMD 102 to the user.
  • the extending side-arms 114 , 116 can further secure the HMD 102 to the user by extending around a rear portion of the user's head. Additionally or alternatively, for example, the HMD 102 can connect to or be affixed within a head-mounted helmet structure. Other configurations for an HMD are also possible.
  • the HMD 102 can also include an on-board computing system 118 , an image capture device 120 , a sensor 122 , and a finger-operable touch pad 124 .
  • the on-board computing system 118 is shown to be positioned on the extending side-arm 114 of the HMD 102 ; however, the on-board computing system 118 can be provided on other parts of the HMD 102 or can be remotely positioned from the HMD 102 (e.g., the on-board computing system 118 could be wire- or wirelessly-connected to the HMD 102 ).
  • the on-board computing system 118 can include a processor and memory, for example.
  • the on-board computing system 118 can be configured to receive and analyze data from the image capture device 120 and the finger-operable touch pad 124 (and possibly from other sensory devices, user interfaces, or both) and generate images for output by the lens elements 110 and 112 .
  • the image capture device 120 can be, for example, a camera that is configured to capture still images and/or to capture video. In the illustrated configuration, image capture device 120 is positioned on the extending side-arm 114 of the HMD 102 ; however, the image capture device 120 can be provided on other parts of the HMD 102 .
  • the image capture device 120 can be configured to capture images at various resolutions or at different frame rates. Many image capture devices with a small form-factor, such as the cameras used in mobile phones or webcams, for example, can be incorporated into an example of the HMD 102 .
  • While FIG. 1A illustrates one image capture device 120, more image capture devices can be used, and each can be configured to capture the same view, or to capture different views.
  • the image capture device 120 can be forward facing to capture at least a portion of the real-world view perceived by the user. This forward facing image captured by the image capture device 120 can then be used to generate an augmented reality where computer generated images appear to interact with or overlay the real-world view perceived by the user.
  • the sensor 122 is shown on the extending side-arm 116 of the HMD 102 ; however, the sensor 122 can be positioned on other parts of the HMD 102 .
  • the HMD 102 can include multiple sensors.
  • an HMD 102 can include sensors such as one or more gyroscopes, one or more accelerometers, one or more magnetometers, one or more light sensors, one or more infrared sensors, and/or one or more microphones.
  • Other sensing devices can be included in addition or in the alternative to the sensors that are specifically identified herein.
  • the finger-operable touch pad 124 is shown on the extending side-arm 114 of the HMD 102 . However, the finger-operable touch pad 124 can be positioned on other parts of the HMD 102 . Also, more than one finger-operable touch pad can be present on the HMD 102 .
  • the finger-operable touch pad 124 can be used by a user to input commands.
  • the finger-operable touch pad 124 can sense at least one of a pressure, position and/or a movement of one or more fingers via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities.
  • the finger-operable touch pad 124 can be capable of sensing movement of one or more fingers simultaneously, in addition to sensing movement in a direction parallel or planar to the pad surface, in a direction normal to the pad surface, or both, and can also be capable of sensing a level of pressure applied to the touch pad surface.
  • the finger-operable touch pad 124 can be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers. Edges of the finger-operable touch pad 124 can be formed to have a raised, indented, or roughened surface, so as to provide tactile feedback to a user when the user's finger reaches the edge, or other area, of the finger-operable touch pad 124 . If more than one finger-operable touch pad is present, each finger-operable touch pad can be operated independently, and can provide a different function.
  • HMD 102 can be configured to receive user input in various ways, in addition or in the alternative to user input received via finger-operable touch pad 124 .
  • on-board computing system 118 can implement a speech-to-text process and utilize a syntax that maps certain spoken commands to certain actions.
  • HMD 102 can include one or more microphones via which a wearer's speech can be captured. Configured as such, HMD 102 can be operable to detect spoken commands and carry out various computing functions that correspond to the spoken commands.
  • HMD 102 can interpret certain head-movements as user input. For example, when HMD 102 is worn, HMD 102 can use one or more gyroscopes and/or one or more accelerometers to detect head movement. The HMD 102 can then interpret certain head-movements as being user input, such as nodding, or looking up, down, left, or right. An HMD 102 could also pan or scroll through graphics in a display according to movement. Other types of actions can also be mapped to head movement.
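  • A sketch of mapping gyroscope readings to the head movements named above; the axis convention and the threshold are assumptions for illustration.

        def detect_head_gesture(pitch_rate, yaw_rate, threshold=1.0):
            """Map angular rates (rad/s) to a coarse head-movement input;
            return None when neither rate exceeds the threshold."""
            if abs(pitch_rate) >= abs(yaw_rate):
                if abs(pitch_rate) > threshold:
                    return "look up" if pitch_rate > 0 else "look down"
            elif abs(yaw_rate) > threshold:
                return "look left" if yaw_rate > 0 else "look right"
            return None

        assert detect_head_gesture(1.5, 0.2) == "look up"
        assert detect_head_gesture(0.1, -1.4) == "look right"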
  • HMD 102 can interpret certain gestures (e.g., by a wearer's hand or hands) as user input. For example, HMD 102 can capture hand movements by analyzing image data from image capture device 120 , and initiate actions that are defined as corresponding to certain hand movements.
  • HMD 102 can interpret eye movement as user input.
  • HMD 102 can include one or more inward-facing image capture devices and/or one or more other inward-facing sensors (not shown) that can be used to track eye movements and/or determine the direction of a wearer's gaze.
  • certain eye movements can be mapped to certain actions.
  • certain actions can be defined as corresponding to movement of the eye in a certain direction, a blink, and/or a wink, among other possibilities.
  • HMD 102 also includes a speaker 125 for generating audio output.
  • the speaker could be in the form of a bone conduction speaker, also referred to as a bone conduction transducer (BCT).
  • Speaker 125 can be, for example, a vibration transducer or an electroacoustic transducer that produces sound in response to an electrical audio signal input.
  • the frame of HMD 102 can be designed such that when a user wears HMD 102 , the speaker 125 contacts the wearer.
  • speaker 125 can be embedded within the frame of HMD 102 and positioned such that, when the HMD 102 is worn, speaker 125 vibrates a portion of the frame that contacts the wearer.
  • HMD 102 can be configured to send an audio signal to speaker 125 , so that vibration of the speaker can be directly or indirectly transferred to the bone structure of the wearer.
  • the wearer can interpret the vibrations provided by BCT 125 as sounds.
  • Various types of bone-conduction transducers can be implemented, depending upon the particular implementation.
  • any component that is arranged to vibrate the HMD 102 can be incorporated as a vibration transducer.
  • an HMD 102 can include a single speaker 125 or multiple speakers.
  • the location(s) of speaker(s) on the HMD can vary, depending upon the implementation. For example, a speaker can be located proximate to a wearer's temple (as shown), behind the wearer's ear, proximate to the wearer's nose, and/or at any other location where the speaker 125 can vibrate the wearer's bone structure.
  • FIG. 1B illustrates an alternate view of the wearable computing device illustrated in FIG. 1A .
  • the lens elements 110 , 112 can act as display elements.
  • the HMD 102 can include a first projector 128 coupled to an inside surface of the extending side-arm 116 and configured to project a display 130 onto an inside surface of the lens element 112 .
  • a second projector 132 can be coupled to an inside surface of the extending side-arm 114 and configured to project a display 134 onto an inside surface of the lens element 110 .
  • the lens elements 110 , 112 can act as a combiner in a light projection system and can include a coating that reflects the light projected onto them from the projectors 128 , 132 .
  • a reflective coating may not be used (e.g., when the projectors 128 , 132 are scanning laser devices).
  • the lens elements 110 , 112 themselves can include: a transparent or semi-transparent matrix display, such as an electroluminescent display or a liquid crystal display, one or more waveguides for delivering an image to the user's eyes, or other optical elements capable of delivering an in focus near-to-eye image to the user.
  • a corresponding display driver can be disposed within the frame elements 104 , 106 for driving such a matrix display.
  • a laser or LED source and scanning system could be used to draw a raster display directly onto the retina of one or more of the user's eyes. Other possibilities exist as well.
  • FIG. 1C illustrates another wearable computing system according to an example embodiment, which takes the form of an HMD 152 .
  • the HMD 152 can include frame elements and side-arms such as those described with respect to FIGS. 1A and 1B .
  • the HMD 152 can additionally include an on-board computing system 154 and an image capture device 156 , such as those described with respect to FIGS. 1A and 1B .
  • the image capture device 156 is shown mounted on a frame of the HMD 152 . However, the image capture device 156 can be mounted at other positions as well.
  • the HMD 152 can include a single display 158 which can be coupled to the device.
  • the display 158 can be formed on one of the lens elements of the HMD 152 , such as a lens element described with respect to FIGS. 1A and 1B , and can be configured to overlay computer-generated graphics in the user's view of the physical world.
  • the display 158 is shown to be provided in a center of a lens of the HMD 152 , however, the display 158 can be provided in other positions, such as for example towards either the upper or lower portions of the wearer's field of view.
  • the display 158 is controllable via the computing system 154 that is coupled to the display 158 via an optical waveguide 160 .
  • FIG. 1D illustrates another wearable computing system according to an example embodiment, which takes the form of a monocular HMD 172 .
  • the HMD 172 can include side-arms 173 , a center frame support 174 , and a bridge portion with nosepiece 175 . In the example shown in FIG. 1D , the center frame support 174 connects the side-arms 173 .
  • the HMD 172 does not include lens-frames containing lens elements.
  • the HMD 172 can additionally include a component housing 176 , which can include an on-board computing system (not shown), an image capture device 178 , and a button 179 for operating the image capture device 178 (and/or usable for other purposes).
  • Component housing 176 can also include other electrical components and/or can be electrically connected to electrical components at other locations within or on the HMD.
  • HMD 172 also includes a BCT 186 .
  • the HMD 172 can include a single display 180 , which can be coupled to one of the side-arms 173 via the component housing 176 .
  • the display 180 can be a see-through display, which is made of glass and/or another transparent or translucent material, such that the wearer can see their environment through the display 180 .
  • the component housing 176 can include the light sources (not shown) for the display 180 and/or optical elements (not shown) to direct light from the light sources to the display 180 .
  • display 180 can include optical features that direct light that is generated by such light sources towards the wearer's eye, when HMD 172 is being worn.
  • HMD 172 can include a sliding feature 184 , which can be used to adjust the length of the side-arms 173 .
  • sliding feature 184 can be used to adjust the fit of HMD 172 .
  • an HMD can include other features that allow a wearer to adjust the fit of the HMD, without departing from the scope of the invention.
  • FIGS. 1E to 1G are simplified illustrations of the HMD 172 shown in FIG. 1D , being worn by a wearer 190 .
  • BCT 186 is arranged such that when HMD 172 is worn, BCT 186 is located behind the wearer's ear. As such, BCT 186 is not visible from the perspective shown in FIG. 1E .
  • the display 180 can be arranged such that, when HMD 172 is worn, display 180 is positioned in front of or proximate to the wearer's eye.
  • display 180 can be positioned below the center frame support and above the center of the wearer's eye, as shown in FIG. 1E .
  • display 180 can be offset from the center of the wearer's eye (e.g., so that the center of display 180 is positioned to the right and above of the center of the wearer's eye, from the wearer's perspective).
  • display 180 can be located in the periphery of the field of view of the wearer 190 , when HMD 172 is worn.
  • As shown in FIG. 1F , when the wearer 190 looks forward, the wearer 190 can see the display 180 with their peripheral vision.
  • display 180 can be outside the central portion of the wearer's field of view when their eye is facing forward, as it commonly is for many day-to-day activities. Such positioning can facilitate unobstructed eye-to-eye conversations with others, as well as generally providing unobstructed viewing and perception of the world within the central portion of the wearer's field of view.
  • the wearer 190 can view the display 180 by, e.g., looking up with their eyes only (possibly without moving their head). This is illustrated as shown in FIG. 1G , where the wearer has moved their eyes to look up and align their line of sight with display 180 . A wearer might also use the display by tilting their head down and aligning their eye with the display 180 .
  • FIG. 2A illustrates a schematic drawing of a computing device 210 according to an example embodiment.
  • device 210 communicates using a communication link 220 (e.g., a wired or wireless connection) to a remote device 230 .
  • the device 210 can be any type of device that can receive data and display information corresponding to or associated with the data.
  • the device 210 can be a heads-up display system, such as the head-mounted devices 102 , 152 , or 172 described with reference to FIGS. 1A to 1G .
  • the device 210 can include a display system 212 comprising a processor 214 and a display 216 .
  • the display 216 can be, for example, an optical see-through display, an optical see-around display, or a video see-through display.
  • the processor 214 can receive data from the remote device 230 , and configure the data for display on the display 216 .
  • the processor 214 can be any type of processor, such as a micro-processor or a digital signal processor, for example.
  • the device 210 can further include on-board data storage, such as memory 218 coupled to the processor 214 .
  • the memory 218 can store software that can be accessed and executed by the processor 214 , for example.
  • the remote device 230 can be any type of computing device or transmitter including a laptop computer, a mobile telephone, or tablet computing device, etc., that is configured to transmit data to the device 210 .
  • the remote device 230 and the device 210 can contain hardware to enable the communication link 220 , such as processors, transmitters, receivers, antennas, etc.
  • remote device 230 can take the form of or be implemented in a computing system that is in communication with and configured to perform functions on behalf of client device, such as computing device 210 .
  • Such a remote device 230 can receive data from another computing device 210 (e.g., an HMD 102 , 152 , or 172 or a mobile phone), perform certain processing functions on behalf of the device 210 , and then send the resulting data back to device 210 .
  • This functionality can be referred to as “cloud” computing.
  • the communication link 220 is illustrated as a wireless connection; however, wired connections can also be used.
  • the communication link 220 can be a wired serial bus such as a universal serial bus or a parallel bus.
  • a wired connection can be a proprietary connection as well.
  • the communication link 220 can also be a wireless connection using, e.g., Bluetooth® radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), Cellular technology (such as GSM, CDMA, UMTS, EV-DO, WiMAX, or LTE), or Zigbee® technology, among other possibilities.
  • the remote device 230 can be accessible via the Internet and can include a computing cluster associated with a particular web service (e.g., social-networking, photo sharing, address book, etc.).
  • FIG. 2B shows an example projection of UI elements described herein via an image 280 by an example head-mountable device (HMD) 252 , according to an example embodiment.
  • Other configurations of an HMD can also be used to present the UI described herein via image 280 .
  • FIG. 2B shows wearer 254 of HMD 252 looking at an eye of person 256 . As such, wearer 254 's gaze, or direction of viewing, is along gaze vector 260 .
  • a horizontal plane, such as horizontal gaze plane 264 can then be used to divide space into three portions: space above horizontal gaze plane 264 , space in horizontal gaze plane 264 , and space below horizontal gaze plane 264 .
  • horizontal gaze plane 264 appears as a line that divides projection plane 276 into a subplane above the line of horizontal gaze plane 264 , a subplane below the line of horizontal gaze plane 264 , and the line where horizontal gaze plane 264 intersects projection plane 276 .
  • horizontal gaze plane 264 is shown using dotted lines.
  • a dividing plane indicated using dividing line 274 can be drawn to separate space into three other portions: space to the left of the dividing plane, space on the dividing plane, and space to the right of the dividing plane.
  • the dividing plane intersects projection plane 276 at dividing line 274 .
  • the dividing plane divides projection plane into: a subplane to the left of dividing line 274 , a subplane to the right of dividing line 274 , and dividing line 274 .
  • dividing line 274 is shown as a solid line.
  • FIG. 2B shows the upper visual plane 270 as the uppermost plane that wearer 254 can see while gazing along gaze vector 260 , and shows lower visual plane 272 as the lowermost plane that wearer 254 can see while gazing along gaze vector 260 .
  • upper visual plane 270 and lower visual plane 272 are shown using dashed lines.
  • the HMD can project an image for view by wearer 254 at some apparent distance 262 along display line 282 , which is shown as a dotted and dashed line in FIG. 2B .
  • apparent distance 262 can be 1 meter, four feet, infinity, or some other distance.
  • HMD 252 can generate a display, such as image 280 , which appears to be at the apparent distance 262 from the eye of wearer 254 and in projection plane 276 .
  • image 280 is shown between horizontal gaze plane 264 and upper visual plane 270 ; that is, image 280 is projected above gaze vector 260 .
  • image 280 is also projected to the right of dividing line 274 .
  • wearer 254 can look at person 256 without image 280 obscuring their general view.
  • the display element of the HMD 252 is translucent when not active (i.e. when image 280 is not being displayed), and so the wearer 254 can perceive objects in the real world along the vector of display line 282 .
  • image 280 can be projected above horizontal gaze plane 264 near and/or just above upper visual plane 270 to keep image 280 from obscuring most of wearer 254 's view. Then, when wearer 254 wants to view image 280 , wearer 254 can move their eyes such that their gaze is directly toward image 280 .
  • FIGS. 3 through 15 collectively describe aspects of an example user interface for an HMD such as discussed above at least in the context of FIGS. 1A through 2 .
  • the HMD can be configured with a user interface (UI) controller receiving inputs from at least two user interfaces: a touch-based UI and a voice-based UI.
  • the touch-based UI can include a touch pad and a button, configured to receive various touches, such as one-finger swipes in various directions, two-finger or multi-finger swipes in various directions, taps, button presses of various durations, and button releases.
  • the touch-based UI can report the touch, e.g., a “swipe forward” or “tap”, to the HMD, or in some cases, to a component of the HMD such as a UI controller.
  • the HMD can act as the UI controller.
  • the HMD includes any necessary components, such as but not limited to one or more UI controllers, which are configured to perform and control the UI operations described herein.
  • the voice-based UI can include a microphone configured to receive various words, including commands, and to report the received words; e.g., “Call Mom”, to the HMD.
  • the HMD can include a gaze-based UI that is configured to detect duration and/or direction of one or more gazes of a wearer of the HMD.
  • the gaze-based UI can be configured to detect “dwell time” or how long the wearer gazes in a fixed direction, the direction of the gaze, a rate of change of the gaze, and additional information related to wearer gazes.
  • the HMD can generate audible outputs; e.g., tones, words, songs, etc., that can be heard by the wearer via headphones, speakers, or bone conduction devices of the HMD.
  • the HMD can generate “cards”, also referred to as screens or images, which are capable of occupying the full display of the HMD when selected.
  • One card is a home card that is the first card displayed when the UI is activated, for example shortly after the HMD powers up or when the HMD wakes from a sleep or power-saving mode.
  • FIG. 3 shows an example home card 300 of an example user interface, according to an example embodiment.
  • Home card 300 includes application status indicators 310 , device status indicators 312 , hint 314 , and a clock shown in large numerals indicating the current time in the center of home card 300 .
  • Application status indicators 310 can indicate which application(s) are operating on the HMD.
  • application status indicators 310 include camera and Y-shaped road icons to respectively indicate operation of a camera application and a navigation application. Such indicators can remind the wearer what applications or processes are presently running and/or consuming power and/or processor resources of the HMD.
  • Device status indicators 312 can indicate which device(s) are operating on the HMD and HMD status. As shown in FIG. 3 , device status indicators 312 include icons for a wireless network and a Bluetooth network, respectively, that indicate the HMD is presently configured for communication via a wireless network and/or a Bluetooth network. In one embodiment, the HMD may not present device status indicators 312 on home card 300 .
  • Hint 314 is shown in FIG. 3 as “ok glass”. Hint 314 is shown in quotes to indicate that the hint is related to the voice-based UI of the HMD. In some embodiments, hint 314 can be related to the touch-based UI of the HMD.
  • the words in hint 314 illustrated as “ok glass” indicate that a wearer should say the words “ok glass” to activate the voice-based UI of the HMD.
  • “ok glass” in this instance is a word (that can also be referred to as “a hotword”) that triggers activation of a voice-based UI. Other hotwords can also be used.
  • In some embodiments, the HMD can remove hint 314 from being displayed on home card 300 .
  • Later, the HMD can add hint 314 back to home card 300 to remind the wearer about the specific words, e.g., ok glass, used to activate the voice-based UI.
  • the hotword presented as hint 314 on home card 300 can be updated to make the user aware of other functionality of the HMD, or to suggest queries or actions based on the HMD's current geographic location or situational context.
  • the UI can accept as inputs certain operations performed using the touch-based UI.
  • the UI can receive these operations and responsively perform actions to enable the wearer to interact with the HMD. These operations can be organized into tiers.
  • FIG. 4 lists example operations of a multi-tiered user model 400 for a user interface for a head-mountable device (HMD), according to an example embodiment.
  • multi-tiered user model 400 has three tiers: basic, intermediate, and advanced.
  • the basic tier provides the smallest number of operations of any tier of multi-tiered user model 400 .
  • the intermediate tier includes all operations provided by the basic tier, along with additional operations not provided by the basic tier.
  • the advanced tier includes all operations provided by the basic and intermediate tiers, along with additional operations not provided by either the basic tier or intermediate tier.
  • FIG. 4 shows that the basic tier of multi-tiered user model 400 provides tap, swipe forward, swipe backward, voice, and camera button press operations.
  • a tap operation can involve a single physical tap—that is, one quick, slight strike with one or more fingers on the touch pad of the touch-based UI.
  • a swipe forward operation, sometimes termed a swipe right, can involve a movement forward by one or more fingers touching the touch pad, where forward is the general direction from the wearer's ear toward the wearer's eye when the wearer has the HMD on.
  • a swipe backward operation can involve a movement backward by one or more fingers touching the touch pad, where backward is the general direction from the wearer's eye toward the wearer's ear when the wearer has the HMD on.
  • a “swipe down” operation can involve a downward movement by one or more fingers touching the touch pad, where downward is the general direction from the top of the wearer's head toward the wearer's neck when the wearer has the HMD on.
  • the physical actions used by the wearer to perform some or all of the herein-described operations can be customized; e.g., by the wearer and/or other entity associated with the HMD.
  • the wearer and/or other entity could configure the HMD to recognize a double-tap as a tap operation, such as by training or setting the HMD to associate the double-tap with the tap operation.
  • As another example, suppose the wearer would like to interchange the physical operations used to perform the swipe forward and swipe backward operations; e.g., the swipe forward operation would be performed using the physical action described above as a swipe left, and the swipe backward operation would be performed using the physical action described above as a swipe right.
  • the wearer could configure the HMD to recognize a physical swipe left as a swipe forward operation and physical swipe right as a swipe backward operation.
  • Other customizations are possible as well; e.g., using a sequence of swipes to carry out the tap operation.
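  • A sketch of such customization as a remappable table from physical touch actions to UI operations; the key and value names are illustrative.

        gesture_map = {
            "tap": "select",
            "physical-swipe-right": "swipe-forward",   # default binding
            "physical-swipe-left": "swipe-backward",   # default binding
        }

        def interchange_swipes(mapping):
            """Swap the physical swipe directions, as in the example above."""
            mapping["physical-swipe-right"], mapping["physical-swipe-left"] = (
                mapping["physical-swipe-left"], mapping["physical-swipe-right"])

        interchange_swipes(gesture_map)
        assert gesture_map["physical-swipe-left"] == "swipe-forward"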
  • the tap operation can select a currently visible card.
  • the swipe forward operation can remove the currently visible card from display and select a next card for display.
  • the swipe backward operation can remove the currently visible card from display and select a previous card for display.
  • the swipe down operation can, depending on context, act to go back, go home, or sleep. Going back can remove the currently visible card from display and display a previously-visible card for display.
  • the previously-visible card can be the card that was most recently viewed; e.g., if card A is currently visible and card B is the previously-viewed card, then the swipe down operation can remove card A from visibility and display card B. Going home can remove the currently visible card from display and display the home card. Sleeping can cause part of the HMD, e.g., the display, or all of the HMD to be deactivated. A sketch of this context-dependent behavior follows.
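  • The sketch below models the viewed cards as a simple stack; the stack structure and return values are assumptions for illustration.

        def swipe_down(view_stack, home_card="home"):
            """Go back if there is a previously viewed card, go home from any
            other card, and otherwise sleep."""
            if len(view_stack) > 1:
                view_stack.pop()             # go back: card A removed, card B shown
                return view_stack[-1]
            if view_stack and view_stack[-1] != home_card:
                view_stack[-1] = home_card   # go home
                return home_card
            return "sleep"                   # deactivate the display

        stack = ["home", "card B", "card A"]
        assert swipe_down(stack) == "card B"   # go back
        assert swipe_down(stack) == "home"     # back again, to the home card
        assert swipe_down(stack) == "sleep"    # from the home card: sleep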
  • a voice operation can provide access to a voice menu of operations. Voice interactions with the UI are discussed below in more detail in the context of FIG. 15 .
  • a camera button press can instruct the HMD to take a photo using a camera associated with and/or part of the HMD.
  • FIG. 4 shows that the intermediate tier of multi-tiered user model 400 provides tap, swipe forward, swipe backward, voice, and camera button press operations as described above in the context of the basic tier. Also, the intermediate tier provides camera button long press, two finger swipe forward, two finger swipe backward, and two finger swipe down operations.
  • the camera button long press operation can instruct the HMD to provide a capture menu for display and use.
  • the capture menu can provide one or more operations for using the camera associated with HMD. The capture menu is discussed below in more detail in the context of FIG. 7 .
  • the two finger swipe forward operation removes the currently visible card from display and selects a next card for display using a “zoomed scroll”.
  • the two finger swipe backward operation removes the currently visible card from display and selects the previous card for display using a zoomed scroll. Zoomed scrolls are discussed in more detail in the context of at least FIG. 6A .
  • the two finger swipe down causes the HMD to sleep at this position in a timeline.
  • FIG. 4 shows that the advanced tier of multi-tiered user model 400 provides tap, swipe forward, swipe backward, voice, and camera button press operations as described above in the context of the basic tier, as well as camera button long press, two finger swipe forward, two finger swipe backward, and two finger swipe down operations described above in the context of the intermediate tier.
  • the advanced tier also provides one-finger press-and-hold, two-finger press-and-hold, and nudge operations.
  • the one-finger press-and-hold operation zooms, or expands, the display of the current card, or content related to the current card, starting when the wearer presses on the touch-based UI, and continuing to zoom as long as the wearer “holds” or keeps pressing on the touch-based UI.
  • the two-finger press-and-hold can provide a “clutch” operation, which can be performed by pressing on the touch-based UI in two separate spots using two fingers and holding the fingers in their respective positions on the touch-based UI. After the fingers are held in position on the touch-based UI, the clutch operation is engaged. In some embodiments, the HMD recognizes the clutch operation only after the fingers are held for at least a threshold period of time; e.g., one second. The clutch operation will stay engaged as long as the two fingers remain on the touch based UI. Clutch operations are discussed in more detail below in the context of at least FIGS. 6B and 6C .
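  • A minimal sketch of this threshold-based clutch recognition, assuming an event stream that reports the current finger count (the class name, threshold default, and interface are illustrative, not from the patent):

```python
import time

class ClutchDetector:
    """Recognize a two-finger press-and-hold ("clutch") on a touch pad.

    The clutch engages once two fingers have been held for at least
    hold_threshold seconds and stays engaged until a finger lifts.
    """

    def __init__(self, hold_threshold=1.0):
        self.hold_threshold = hold_threshold
        self.touch_start = None
        self.engaged = False

    def on_touch(self, finger_count, now=None):
        now = time.monotonic() if now is None else now
        if finger_count == 2:
            if self.touch_start is None:
                self.touch_start = now          # two-finger press begins
            elif now - self.touch_start >= self.hold_threshold:
                self.engaged = True             # held long enough: engage
        else:
            self.touch_start = None             # a finger lifted: release
            self.engaged = False
        return self.engaged
```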
  • the nudge operation can be performed using a short, slight nod of the wearer's head.
  • the HMD can be configured with accelerometers or other motion detectors that can detect the nudge and provide an indication of the nudge to the HMD.
  • in response to a nudge, the HMD can toggle an activation state of the HMD. That is, if the HMD is active (e.g., displaying a card on the activated display) before the nudge, the HMD can deactivate itself (e.g., turn off the display) in response.
  • if the HMD is inactive before the nudge but is active enough to detect nudges; e.g., within two or a few seconds of notification of message arrival, the HMD can activate itself in response.
  • for example, suppose the HMD is powered on with the display inactive.
  • upon arrival of a text message, an audible chime can be emitted by the HMD.
  • if the wearer then nudges within the detection interval, the HMD can activate and present a card with the content of the text message. If, from the activated state, the user nudges again, the display will deactivate.
  • the user can interact with the device in a completely hands-free manner.
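  • As a sketch of the nudge-toggle behavior described above (class and method names assumed for illustration), the display's activation state simply flips on each detected nudge:

```python
class HMDDisplay:
    """Toggle display activation in response to a detected nudge
    (a short, slight head nod reported by the motion detectors)."""

    def __init__(self):
        self.active = False   # powered on, display inactive

    def on_nudge(self):
        self.active = not self.active
        return self.active

display = HMDDisplay()
display.on_nudge()   # e.g., after a notification chime: display activates
display.on_nudge()   # a second nudge deactivates the display again
```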
  • FIG. 5A shows a scenario 500 of example timeline interactions, according to an example embodiment.
  • Scenario 500 begins with home card 502 being displayed by an HMD worn by a wearer.
  • Home card 502 and cards 520 a - 520 f can be arranged as a “timeline” or ordered sequence of cards.
  • each card in timeline 510 has a specific time associated with the card.
  • the timeline can be ordered based on the specific time associated with each card.
  • the specific time can be “now” or the current time.
  • home card 502 can be associated with the specific time of now.
  • the time can be a time associated with an event leading to the card.
  • FIG. 5A shows that card 520 a represents a photo taken 2 hours ago. Then, card 520 a can be associated with the specific time of 1:28, which is 2 hours before the current time of 3:28 shown on home card 502 .
  • Cards 520 b - 520 f represent either current cards; i.e., cards associated with the specific time of now, or upcoming cards; i.e., cards associated with a future time.
  • card 520 b is a current card that includes an image currently generated by a camera associated with the HMD
  • card 520 c is a current card that includes an image of a “hangout” or video conference call currently in-progress generated by an application of the HMD
  • card 520 d is a current card that includes an image and text currently generated by a navigation application/process presently running on the HMD
  • card 520 e is a current card that includes images and text currently generated by a weather application of the HMD
  • card 520 f is an upcoming card that includes images and text generated by a calendar application of the HMD indicating an appointment for “Lunch with Monica Kim” in “2 hours”.
  • the HMD can enable navigation of the time line using swipe operations. For example, starting at home card 502 , a swipe backward operation can cause the HMD to select and display a previous card, such as card 520 a , and a swipe forward operation can cause the HMD to select and display a next card, such as card 520 b .
  • the swipe forward operation can cause the HMD to select and display the previous card, which is home card 502
  • the swipe backward operation can cause the HMD to select and display the next card, which is card 520 c.
  • during scenario 500 , there are no cards in timeline 510 that are previous to card 520 a .
  • the timeline is represented as a circular timeline.
  • in response to a swipe backward operation on card 520 a requesting a previous card for display, the HMD can select card 520 f for (re)display, as there are no cards in timeline 510 that are previous to card 520 a during scenario 500 .
  • in response to a swipe forward operation on card 520 f requesting a next card for display, the HMD can select card 520 a for (re)display, as there are no cards in timeline 510 that are after card 520 f during scenario 500 .
  • a notification is generated to indicate to the user that there are no additional cards to navigate to in the instructed direction.
  • notifications could include any of or a combination of the following: a visual effect, an audible effect, a glowing effect on the edge of the card, a three dimensional animation twisting the edge of the card, a sound (e.g. a click), a textual or audible message indicating that the end of the timeline has been reached (e.g. “there are no cards older than this”).
  • alternatively, an attempt by the user to navigate past a card in a direction where there are no additional cards could result in no effect; i.e., swiping backward on card 520 a , which has no previous card, results in no perceptible change to the display of card 520 a.
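  • The two end-of-timeline policies described above, circular wrap-around versus a clamped timeline with an optional end-of-timeline notification, can be sketched as follows (the list layout and callback are illustrative):

```python
def navigate(cards, index, step, circular=True, on_end=None):
    """Return the new card index after a swipe of step (+1 forward, -1 backward)."""
    new_index = index + step
    if circular:
        return new_index % len(cards)        # e.g., backward from 520a wraps to 520f
    if 0 <= new_index < len(cards):
        return new_index
    if on_end is not None:
        on_end()                             # e.g., glowing edge, click sound
    return index                             # no perceptible change

timeline_510 = ["520a", "520b", "520c", "520d", "520e", "520f"]
assert navigate(timeline_510, 0, -1) == 5                   # circular: wraps to 520f
assert navigate(timeline_510, 0, -1, circular=False) == 0   # clamped: stays on 520a
```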
  • a wearer of the HMD can recite or utter a hotword, for example the words “ok glass” to activate the voice-based interface of the HMD.
  • the HMD can display card 530 that lists some of the commands that can be uttered by the wearer to interact with the voice-based interface.
  • FIG. 5A shows example commands as “Google” to perform a search query, “navigate to” to find directions to a location, “take a photo” to capture an image using a camera associated with the HMD, “record a video” to capture a sequence of images and/or associated sounds, using a camera and/or a microphone associated with the HMD, and “send a message” to generate and send an e-mail, SMS message, instant message, or some other type of message.
  • the wearer can utter something in response, which can lead to voice interactions with the UI, such as those discussed below with respect to FIG. 15 .
  • the commands capable of triggering voice interactions are not necessarily limited to those presented on card 530 at the time the utterance is received. For example, as the user dwells on card 530 , additional commands can be presented for other features. Further, such commands presented on card 530 can change over time through further use of the HMD, or can be remotely updated to surface additional features or content of the HMD. Still further, similar to the frequent contact aspects described herein, commands for frequently used functions of the HMD can be presented on card 530 . As such, these commands can change over time based on use of the HMD by the wearer.
  • timelines can become lengthy.
  • the UI provides operations for speedy use of the UI, such as two-fingered swipes and clutches, although other gestures to invoke such navigation operations are possible.
  • FIG. 6A shows an example of using a two-fingered swipe on a touch-based UI of an HMD for zoomed scrolling, according to an example embodiment.
  • FIG. 5B shows scenario 540 of example timeline interactions including splicing a new card into timeline 550 , according to an example embodiment.
  • Scenario 540 begins with a wearer of an HMD using the HMD to observe timeline 550 , focusing in on card 550 b , which is the home card for timeline 550 .
  • FIG. 5B shows the focused-on card, card 550 b , of timeline 550 using a dotted-line border.
  • card 550 b is displayed by the HMD using a single-card view.
  • Scenario 540 continues by the HMD receiving an incoming telephone call from a contact, Kelly Young.
  • the HMD can be configured with one or more transceivers configured to establish, maintain, and tear down communication links, such as communication link 220 discussed above in the context of FIG. 2A , that utilize one of a number of cellular and/or other technologies to originate and terminate wireless telephone calls.
  • upon receiving the phone call, the HMD can generate, retrieve, and/or determine card 560 representing the calling party, Kelly Young, of the telephone call. Once available, the HMD can display card 560 using a single-card view.
  • the wearer of the HMD would like to answer the call from Kelly Young.
  • the wearer can perform a tap operation to bring up a contextual menu suitable for the context of a telephone call.
  • This contextual menu can have options such as, but not limited to, answering the telephone call, routing/forwarding the telephone call to another number (e.g., another phone, voice mail), ignoring/rejecting the telephone call, putting the calling party on hold, bridging the calling party into a three-way or multi-way call, bridging the calling party into a video conference call, such as a hangout, and saving contact information related to the telephone call.
  • the first option of the contextual menu for the telephone call is an answer option.
  • the options of the contextual menu can be displayed as an overlay on top of card 560 representing the telephone call.
  • card 570 can be generated by (a) displaying text and/or graphics related to the contextual menu item overlaying (b) a dimmed version of card 560 . Card 570 can then be focused on and displayed using a single-card view.
  • the wearer can answer the call by performing a tap operation while the answer option is active; e.g., while card 570 is focused on.
  • the HMD can generate display 580 by determining where in the timeline a new card representing the telephone call would be displayed.
  • a card representing the telephone call; e.g., card 560 , would be adjacent to and on the future/now side of the timeline. That is, for timeline 550 shown at the top of FIG. 5B , card 560 would be “spliced into”, or inserted or placed into the middle of, timeline 550 between home card 550 b and card 550 c.
  • the HMD can be configured to animate this splicing operation by showing room being made for the to-be-spliced-in card in the timeline and then showing the to-be-spliced-in card placed into the timeline.
  • the HMD can show the spliced card in single-card view as the focused-on card.
  • in response to the tap operation performed while card 570 is displayed, the HMD can switch to a zoomed-out, or multi-card, display of the timeline, as shown in display 580 , showing part or all of cards 550 a - 550 d of timeline 550 .
  • the HMD can show the cards on each side of the to-be-spliced-in card; e.g., cards 550 b and 550 c , moving away from the center of the zoomed-out display as indicated in display 580 .
  • the to-be-spliced-in card can be shown in the display between the cards on each side.
  • Display 582 shows a stage of this insertion animation after cards 550 b and 550 c have moved far enough apart to permit splicing in card 560 —the to-be-spliced-in card.
  • Timeline 550 at the bottom of FIG. 5B shows the result of the splicing operation.
  • Card 560 (the former to-be-spliced-in card) is shown between cards 550 b and 550 c in timeline 550 , and is indicated as being focused on by the HMD. As card 560 is focused on by the HMD, card 560 can be shown in single-card mode.
  • Scenario 540 can conclude with the HMD answering the telephone call before, during, or after the animation of the splicing operation, and the telephone call between Kelly Young and the wearer entering the talking state.
  • the splicing operation can be performed in reverse when a card is to be removed from a timeline; that is, a “reverse splice” can be performed.
  • a “reverse splice” can be performed after the call with Kelly Young is completed.
  • for example, card 560 could be removed from timeline 550 .
  • an animation that is substantially in the reverse of the splicing process described above is used in conjunction with removing card 560 from the timeline 550 .
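  • The splice and reverse-splice operations amount to an ordered insertion next to a neighbor card and a later removal; a minimal sketch under that reading (the list representation is illustrative):

```python
def splice(timeline, new_card, after_card):
    """Insert new_card immediately after after_card in the timeline."""
    i = timeline.index(after_card)
    return timeline[: i + 1] + [new_card] + timeline[i + 1 :]

def reverse_splice(timeline, card):
    """Remove card from the timeline once its event (e.g., a call) ends."""
    return [c for c in timeline if c != card]

timeline_550 = ["550a", "550b", "550c", "550d"]       # 550b is the home card
timeline_550 = splice(timeline_550, "560", "550b")    # call card spliced in
assert timeline_550 == ["550a", "550b", "560", "550c", "550d"]
timeline_550 = reverse_splice(timeline_550, "560")    # call completed
```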
  • FIG. 5C shows scenario 584 using a multi-timeline display, according to an example embodiment.
  • Scenario 584 begins with a wearer of an HMD using View A, shown at the top of FIG. 5C , that can be generated by the HMD to observe home card 588 a displayed in single-card view 586 .
  • a wearer of an HMD can switch from single-card view 586 into a multi-timeline view using a clutch operation, as discussed in detail below in the context of FIG. 6D .
  • a different operation or operations than a clutch can be performed to switch into the multi-timeline view.
  • FIG. 5C illustrates a multi-timeline view and shows three cards 588 a , 588 b , and 588 c of main timeline 588 in a linear arrangement.
  • Card 588 a is a home card for main timeline 588
  • card 588 b is a card representing an “Email” from “LunchPal” that arrived “5 min ago”
  • card 588 c is a bundle card that shows a number of thumbnail images related to a bundle of contacts called “Friends”.
  • card 588 a was shown while in single-card view 586 and in an initial multi-timeline view.
  • the initial multi-timeline view can be centered on the card shown in a previous single-card view; e.g., home card 588 a .
  • multiple timelines can be displayed as part of the initial multi-timeline view; for example, main timeline 588 can be accompanied by one or more timelines showing cards representing one or more contacts, photos, previous events, future events, and/or other cards.
  • the wearer of the HMD can select a card for use by controlling a selection region; e.g., focus 688 a shown in FIG. 6D .
  • the selection region can be aligned with a given card, such as card 588 b , when the selection region is placed over the given card in the display, the selection region substantially overlaps the given card in the display, and/or a UI action (e.g., a tap of a touchpad, a click of a mouse, a key press) is performed while the selection region overlaps the given card in the display.
  • the selection region substantially overlaps the given card when at least 50% of the selection region overlaps the given card in the display.
  • the HMD can be configured to detect head movements and the selection region can be moved using the head movements.
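  • One plausible reading of the “substantially overlaps” test is an area-overlap threshold; the sketch below checks whether at least 50% of the selection region covers a card, with rectangles represented as (x, y, width, height) for illustration:

```python
def overlap_fraction(region, card):
    """Fraction of the selection region's area overlapping the card."""
    rx, ry, rw, rh = region
    cx, cy, cw, ch = card
    dx = max(0, min(rx + rw, cx + cw) - max(rx, cx))
    dy = max(0, min(ry + rh, cy + ch) - max(ry, cy))
    return (dx * dy) / float(rw * rh)

def is_aligned(region, card, threshold=0.5):
    return overlap_fraction(region, card) >= threshold

assert is_aligned((10, 10, 20, 20), (0, 0, 40, 40))       # region fully inside card
assert not is_aligned((35, 35, 20, 20), (0, 0, 40, 40))   # only a corner overlaps
```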
  • the wearer of the HMD selects card 588 b and, after the selection of card 588 b , View C can be generated, which is shown below and to the left of View B in FIG. 5C .
  • View C shows card 588 b of main timeline 588 and a linear arrangement of three action cards 590 a , 590 b , and 590 c shown above card 588 b ; that is, View C shows multiple linear arrangements simultaneously.
  • the linear arrangement of action cards starts with card 590 a that is directly above selected card 588 b , and the linear arrangement of action cards is adjacent to, above, and parallel to main timeline 588 .
  • Card 588 a is shown in View C as greyed out to indicate that card 588 a is not selected.
  • upon selection of action card 590 a to “View All”, the wearer can view the e-mail represented by card 588 b .
  • Selection of action card 590 b to “Share” can enable the wearer to share; e.g., reply to, forward, post to a website, etc., the e-mail represented by card 588 b .
  • Selection of action card 590 c to “Delete” can permit the wearer to delete the e-mail represented by card 588 b.
  • the wearer selects card 590 a to view all of the e-mail represented by card 588 b .
  • the content of the e-mail is shown using three content cards 592 a , 592 b , and 592 c shown in View D as adjacent to and above selected card 590 a .
  • View D is shown directly to the right of View C in FIG. 5C .
  • View D also shows that the linear arrangement of content cards begins with card 592 a , which is shown directly above selected card 590 a .
  • View D does not show unselected action cards 590 b and 590 c ; in some embodiments, unselected cards can be displayed.
  • an unselected but displayed card can be displayed in a visually distinct manner to indicate non-selection; e.g., shown with a grey background as for card 588 a in View C.
  • Scenario 584 continues with the wearer of the HMD manipulating the selection region to return to the main timeline 588 and select card 588 c as shown in View E.
  • FIG. 5C shows View E below and to the left of View D.
  • card 588 c is a bundle card representing a group of related cards; in this example, a group of contact cards.
  • Each contact card can have an indication that the card is a contact card.
  • each card represented by bundle card 588 c can have an indication that the card is in the “Friends” bundle of cards/contacts.
  • the HMD can determine cards in the “Friends” bundle by searching for each card having an indication that the card is in the “Friends” group of cards.
  • the HMD can generate View F, which shows contact cards 594 a and 594 b of the “Friends” bundle displayed in the linear arrangement with main timeline 588 .
  • View F is shown in FIG. 5C directly below View E.
  • Bundle card 588 c is shown by View F as remaining in the linear arrangement with main timeline 588 .
  • contact cards 594 a and 594 b , as well as additional cards in the “Friends” bundle can be shown in a linear arrangement adjacent to the linear arrangement showing a selected bundle card; e.g., card 588 c .
  • bundle card 588 c is no longer displayed; rather, the bundle card can be considered to be replaced by the content of the bundle.
  • the splicing operation can utilize cards generated by other applications. For example, suppose a card representing a navigation application/process is displayed on a timeline, and the wearer uses a tap operation to activate the navigation application/process to provide directions to a destination. To show the directions to the destination, the navigation application/process can generate a results card that includes one or more directions. When first generated, the results card can be spliced into the timeline, using the splicing operation described above. When the wearer arrives at the destination, the results card can be removed using the reverse splice operation described above. In some scenarios, multiple cards can be spliced in and/or reverse spliced out of a timeline simultaneously or substantially so, such as when being added to or leaving a multi-party hangout, telephone call, or other communication.
  • a wearer can swipe forward with two fingers, as shown in FIG. 6A , to perform a zoomed scroll to a next card.
  • a wearer can swipe backward with two fingers, as also shown in FIG. 6A , to perform a zoomed scroll to a previous card.
  • a reduced-size view of cards can be displayed in the resulting timeline 610 . That is, as shown in FIG. 6A , multiple cards can be shown in example display 612 generated by the HMD.
  • a swipe or drag operation associated with the zoomed scroll can move content faster, e.g., 4 times faster, than when performing a regular swipe or drag operation.
  • Inertial free scrolling can be performed as part of zoomed scrolling.
  • the focus for the UI is on card 614 of timeline 610 .
  • FIG. 6A shows card 614 outlined using a thick dashed line in the center of display 612 .
  • a timeline that has been released after the zoomed scroll can stay zoomed out, or can continue with reduced image views, until a minimum velocity threshold for the timeline is reached.
  • display 612 can be instructed to zoom to the card that is closest to the center of display 612 ; e.g., display 612 can zoom to card 614 . That is, the HMD can show card 614 as large as possible within display 612 .
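  • Putting those pieces together, a zoomed scroll can be sketched as a drag multiplier plus a snap-to-nearest-card rule once inertial motion slows; the 4x multiplier comes from the description above, while the settle threshold is an assumed constant:

```python
ZOOM_MULTIPLIER = 4.0     # zoomed scroll moves content 4x a regular drag
SETTLE_VELOCITY = 0.05    # cards/frame; below this, snap to a card (assumed)

def zoomed_scroll(position, drag_delta):
    """Advance a fractional timeline position during a two-finger drag."""
    return position + ZOOM_MULTIPLIER * drag_delta

def settle(position, velocity):
    """Once inertial scrolling slows below the threshold, zoom to the card
    closest to the center of the display (e.g., card 614)."""
    if abs(velocity) < SETTLE_VELOCITY:
        return round(position)    # index of the nearest card
    return None                   # keep scrolling zoomed out

assert zoomed_scroll(2.0, 0.25) == 3.0   # four times a regular drag
assert settle(3.4, 0.01) == 3            # snaps to the nearest card
```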
  • a clutch operation can lead to generation and display of a multi-card display, such as shown in FIG. 6B , or a multi-timeline display, such as shown in FIG. 6C .
  • Navigation within the multi-card display and/or multi-timeline display can, in some embodiments, be performed using head movements.
  • the multi-card display or multi-timeline display in toto can be focused on, or displayed by the HMD.
  • a sub-focus can be implemented to highlight a card or a timeline within a multi-card or multi-timeline display.
  • FIG. 6B shows a scenario 620 for using clutch operation 642 to generate a multi-card display 634 a , according to an example embodiment.
  • Scenario 620 begins with an HMD having timeline 630 with cards 630 a through 630 g , and with a focus on card 630 d .
  • the HMD displays cards in the timeline using a single-card view, while solely displaying a focused-upon card.
  • as the focus is on card 630 d , which FIG. 6B shows as a photo of a woman's face, the HMD displays a single-card view of card 630 d.
  • Scenario 620 continues with a wearer of the HMD performing clutch operation 642 using the touch-based UI of the HMD.
  • a clutch operation can involve pressing on the touch-based UI of the HMD using two fingers and holding the two-finger press until the HMD recognizes the clutch operation 642 has been performed.
  • Other gestures, techniques, inputs or time thresholds can be used to trigger the clutch operation. For example, in certain embodiments, a three-finger gesture or a voice-action could be used to engage and/or disengage the clutch operation.
  • the HMD can generate and display multi-card display 634 a , which is shown in an expanded view as multi-card display 634 b .
  • the HMD can focus on the entire multi-card display 634 a using focus 636 .
  • the HMD can focus a subset of cards, such as but not limited to, a single card, a row of cards, a column of cards, a block of cards, or some other selection of cards, within multi-card display 634 a using sub-focus 638 .
  • the HMD is configured to display sub-focus 638 on a single card.
  • the sub-focus can remain on one or more cards at or near the center of the display.
  • the multi-card display shows nine cards: cards 630 a through 630 g of timeline 630 and two other cards 640 a and 640 b not shown as part of timeline 630 .
  • the wearer of the HMD can navigate around multi-card display 634 a , 634 b using head movements, such as moving the wearer's head up, down, left, and/or right.
  • gaze tracking can be used in place of or in addition to head movements for navigating around multi-card display 634 a , 634 b and/or multi-timeline display 664 a , 664 b.
  • suppose “wrap-around” movements, or moving off the end of a row or column to the respective other end of the row or column, are enabled. Then, in response to respective movements upward, downward, leftward, or rightward by the head of the wearer, the sub-focus 638 can move from card 630 d , as shown in FIG. 6B , to respective cards 630 a , 630 g , 630 f , or 630 e . In particular embodiments, wrap-around can be inhibited, so moving the wearer's head leftward will not move sub-focus 638 from card 630 d to card 630 f , but rather sub-focus 638 will stay at the left-end of the middle row on card 630 d.
  • in response to respective movements diagonally up-and-left, up-and-right, down-and-left, and down-and-right by the head of the wearer, the sub-focus 638 can move from card 630 d , as shown in FIG. 6B , to respective cards 630 c , 630 b , 640 b , or 640 a .
  • other head movements and/or UI operations can be used as well or instead with multi-card display 634 a , 634 b , including but not limited to head movements and/or UI operations that move the focus faster than and/or slower than one card at a time, zooming in and out, reshaping sub-focus 638 , selecting card(s), and deselecting card(s).
  • sub-focus 638 may not be used.
  • a leftward head movement can move each of cards 630 b , 630 c , 630 e , 630 f , 640 a , and 640 b to the left by one card and bring in new cards to the “right” of these cards (new cards not shown in FIG. 6B ) on to multi-card displays 634 a and 634 b .
  • the new cards can be displayed in the respective positions of cards 630 c , 630 f , and 640 b , and cards 630 a , 630 d , and 630 g can be removed from multi-card displays 634 a and 634 b .
  • a rightward head movement can move each of cards 630 a , 630 b , 630 d , 630 e , 630 g , 640 a to the right by one card and bring in new cards to the “left” of these cards (new cards not shown in FIG. 6B ) on to multi-card displays 634 a and 634 b .
  • the new cards can be displayed in the respective positions of cards 630 a , 630 d , and 630 g , and cards 630 c , 630 f , and 640 b can be removed from multi-card displays 634 a and 634 b.
  • an upward head movement can: (1) bring in a new row of cards considered to be “above” the top row of cards; e.g., cards in the positions of cards 630 a , 630 b , 630 c of multi-card displays 634 a and 634 b , (2) display the new row of cards on the top row of multi-card displays 634 a and 634 b , (3) move the top row of cards down to be displayed as the middle row of cards; e.g., display cards 630 a , 630 b , and 630 c in the positions of cards 630 d , 630 e , and 630 f of multi-card displays 634 a and 634 b , (4) move the middle row of cards down to the bottom row; e.g., display cards 630 d , 630 e , and 630 f in the positions of cards 630 g , 640 a , and 640 b , and (5) remove the former bottom row of cards 630 g , 640 a , and 640 b from display.
  • a downward head movement can: (1) bring a new row of cards considered to be “below” the bottom row of cards of multi-card displays 634 a and 634 b , (2) display the new row of cards on the bottom row of multi-card displays 634 a and 634 b , (3) move the bottom row of cards up to be displayed as the middle row of cards; e.g., display cards 630 g , 640 a , and 640 b in the positions of cards 630 d , 630 e , and 630 f of multi-card displays 634 a and 634 b , (4) move the middle row of cards up to the top row of cards; e.g., display cards 630 d , 630 e , and 630 f in the positions of cards 630 a , 630 b , and 630 c , and (5) remove the former top row of cards 630 a , 630 b , and 630 c from display.
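  • The row/column movements above reduce to grid index arithmetic with wrap-around either enabled or inhibited; a sketch over the 3x3 layout of FIG. 6B (the function and move names are illustrative):

```python
# Grid mirrors multi-card display 634b: rows of cards as laid out in FIG. 6B.
GRID = [["630a", "630b", "630c"],
        ["630d", "630e", "630f"],
        ["630g", "640a", "640b"]]

MOVES = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}

def move_subfocus(row, col, direction, wrap=True):
    dr, dc = MOVES[direction]
    r, c = row + dr, col + dc
    if wrap:
        return r % len(GRID), c % len(GRID[0])
    if 0 <= r < len(GRID) and 0 <= c < len(GRID[0]):
        return r, c
    return row, col   # wrap inhibited: sub-focus stays at the edge

r, c = move_subfocus(1, 0, "left")                        # from 630d, wrap on
assert GRID[r][c] == "630f"
assert move_subfocus(1, 0, "left", wrap=False) == (1, 0)  # wrap inhibited
```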
  • Scenario 620 continues with clutch 642 being released while sub-focus 638 is on card 630 g .
  • Clutch 642 can be released by the wearer removing one or both of their fingers from the touch-based UI of the HMD. After clutch 642 is released, the HMD can use a single-card view to display either (a) card 630 d , as the card being focused on before clutch operation 642 began, or (b) card 630 g , as the card focused on using sub-focus 638 just prior to release of clutch 642 . In response to clutch 642 being released for HMD embodiments not using sub-focus 638 , the HMD can use a single-card view to display card 630 d.
  • FIG. 6C shows a scenario 650 for using clutch operation 680 to generate a multi-timeline display 664 a , according to an example embodiment.
  • Scenario 650 begins with an HMD displaying main timeline 660 with a focus on card 660 a .
  • the HMD displays cards in main timeline 660 using a single-card view, displaying a focused-upon card. As the focus is on card 660 a , the HMD displays a single-card view of card 660 a.
  • Scenario 650 continues with a wearer of the HMD performing clutch operation 680 .
  • the HMD can generate and display multi-timeline display 664 a , which is shown in an expanded view as multi-timeline display 664 b .
  • the HMD can focus on the entire multi-timeline display 664 a using focus 666 .
  • the HMD can focus a subset of cards and/or timelines, such as, but not limited to, a single card, one, some, or all cards on a timeline, a column of cards across one or more timelines, a block of cards across multiple timelines, a single timeline, a group of timelines, or some other selection of cards and/or timelines, within multi-card display 664 a using sub-focus 668 .
  • the multi-timeline displays five timelines (TLs): timelines 670 , 672 , 674 , 676 , and 678 .
  • the multi-timeline display displays five cards for each of displayed timelines 670 , 672 , 674 , 676 , and 678 .
  • the timelines can be selected for display based on a type of object displayed in a card; e.g., a timeline having only photos, only photo bundles, only messages, only message bundles, only cards representing active applications.
  • Additional criteria can be used to further select items for a timeline; e.g., for photo objects, some criteria can be: only photos taken before (or after) a predetermined date, within a date range, at a location, as part of a photo bundle, photos that were shared, photos that were shared and with one or more messages received in response, etc.
  • Other criteria for photo objects and/or other types of objects are possible as well for selection in a timeline. For example, in scenario 650 , all of the cards in timeline 670 represent photos in a photo bundle, all of the cards in timeline 672 represent photos taken in a given city location, and all of the cards in timeline 678 represent contacts that do not have associated photos/images.
  • the additional timelines presented can represent different user accounts associated with the HMD, for example, a first timeline could be cards generated by a user's work account, e.g. photos, events, contacts, email, messages, sent to or received by his/her work account, e.g. user@google.com.
  • the HMD could be configured to allow access to multiple user accounts, such as the user's personal account, e.g. user@gmail.com; such that a second timeline accessible from the grid view could be cards generated by the user's personal account, e.g. photos, events, contacts, email, messages, sent to or received by his/her personal account.
  • a first timeline could be cards generated by a user's work account, e.g. photos, events, contacts, email, messages, sent to or received by his/her work account, e.g. user@google.com.
  • the HMD could be configured to allow access to multiple user accounts, such as the user's personal account, e.g. user@gmail.com;
  • the timelines can be selected to be part or all of the main timeline; for example, FIG. 6C shows that timeline 674 includes five cards selected from main timeline 660 .
  • Cards can be selected from main timeline 660 randomly, based on focus 662 , based on a type of object represented on the main timeline; e.g., select only cards representing active applications visible from the main timeline, and/or based on other criteria.
  • timeline 674 includes card 660 a , which was the focused-on card prior to clutch 680 , and the two cards on each side of card 660 a in main timeline 660 .
  • Other criteria for selecting cards from a main timeline are possible as well.
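  • Timeline selection by object type plus additional criteria can be sketched as predicate filtering over the card set; the card fields below are assumed for illustration:

```python
def build_timeline(cards, card_type, *criteria):
    """Select cards of one type that satisfy every additional criterion."""
    selected = [c for c in cards if c["type"] == card_type]
    for criterion in criteria:
        selected = [c for c in selected if criterion(c)]
    return selected

cards = [
    {"type": "photo", "location": "New York", "shared": True},
    {"type": "photo", "location": "Boston", "shared": False},
    {"type": "message", "location": None, "shared": False},
]

# e.g., a timeline of only photos taken in a given city that were shared:
shared_ny_photos = build_timeline(
    cards, "photo",
    lambda c: c["location"] == "New York",
    lambda c: c["shared"],
)
assert len(shared_ny_photos) == 1
```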
  • One or more timelines can act as contextual menu(s) for multi-timeline display 664 a , including possible operations that can be performed from multi-timeline display 664 a , operations on multi-timeline display 664 a , and/or other operations.
  • timeline 678 includes a menu of operations including navigate, take a video, take a photo, remove a timeline option, and add a timeline. Other operations are possible as well.
  • the multi-timeline display 664 a could present a contextual menu of operations that could be executed based off of the presently selected card 660 a , e.g. share this card, delete the card, remove from timeline, add to bundle, etc.
  • the wearer of the HMD can navigate around multi-timeline display 664 a , 664 b using head movements.
  • the HMD is configured to display sub-focus 668 , shown as a dotted line on both multi-timeline displays 664 a and 664 b , focusing on a single timeline; e.g., timeline 674 .
  • “wrap-around” movements or moving off the end of a row or column to the respective other end of the row or column, are enabled. Then, in response to respective movements upward, downward, leftward, or rightward by the head of the wearer, the sub-focus 668 can move from timeline 674 , as shown in FIG. 6C , to respective timelines 672 , 676 , 672 , or 676 .
  • wrap-around can be inhibited, so moving the head of the wearer leftward will not move sub-focus 668 from timeline 674 to timeline 672 and moving the head of the wearer rightward will not move sub-focus 668 from timeline 674 to timeline 676 , but rather sub-focus 668 will stay on timeline 674 in response to either the leftward or the rightward movement.
  • in response to respective movements diagonally up-and-left, up-and-right, down-and-left, and down-and-right by the head of the wearer with wrap-around enabled, the sub-focus 668 can move from timeline 674 , as shown in FIG. 6C , to respective timelines 672 , 672 , 676 , and 676 .
  • wrap-around can be inhibited, but as each of the diagonal movements has an up or down component, movement to a respective timeline will succeed when sub-focus 668 is on timeline 674 .
  • sub-focus 668 may not be used.
  • a leftward head movement can move each of timelines 670 , 672 , 674 , 676 , 678 to the left on multi-timeline display 664 a , 664 b by one or more cards and a rightward head movement can move each of timelines 670 , 672 , 674 , 676 , 678 to the right on multi-timeline display 664 a , 664 b by one or more cards.
  • an upward head movement can bring a timeline “above” timeline 670 (not shown in FIG. 6C ) into view as a top-most timeline on multi-timeline displays 664 a and 664 b , move down each of timelines 670 , 672 , 674 , 676 by one timeline on multi-timeline displays 664 a and 664 b , and remove timeline 678 from view.
  • a downward head movement can bring a timeline “below” timeline 678 (not shown in FIG. 6C ) into view as a bottom-most timeline on multi-timeline displays 664 a and 664 b , move up each of timelines 672 , 674 , 676 , 678 by one timeline on multi-timeline displays 664 a and 664 b , and remove timeline 670 from view.
  • other head movements and/or UI operations can be used as well or instead with multi-timeline display 664 a , 664 b , including but not limited to head movements and/or UI operations that move the focus faster than and/or slower than one timeline at a time, enable navigation of cards within a timeline, which can include some or all of the navigation techniques discussed above regarding multi-card displays 634 a and 634 b , zooming in and out, reshaping sub-focus 668 , selecting card(s)/timeline(s), and deselecting card(s)/timeline(s).
  • Scenario 650 continues with clutch 680 being released while sub-focus 668 is on timeline 670 .
  • the HMD can use a single-card view to display a card on selected timeline 670 .
  • FIG. 6D shows scenario 682 for using head movements to navigate a multi-timeline display, according to an example embodiment.
  • Scenario 682 begins with the HMD displaying a single-card view 684 of a contact named “George Farley” participating in a hangout, as shown at the upper-left hand corner of FIG. 6D .
  • a hangout can be indicated by the HMD using icon 684 a of a camera inside of a speech balloon.
  • Scenario 682 continues with the wearer of the HMD performing a clutch operation, or pressing two fingers on the touch-based UI of the HMD for at least one second.
  • after determining a clutch operation was performed, the HMD can generate multi-timeline display 686 a , shown in the upper-right-hand corner of FIG. 6D as a rectangle with thick lines. Multi-timeline display 686 a is shown displaying a focus 688 a and parts of three timelines, including timeline (TL) 690 a .
  • in scenario 682 , focus 688 a , shown in FIG. 6D as a circular arrangement of gray trapezoids, rests or focuses on card 684 , as card 684 was the card previously being displayed in a single-card view. In one embodiment, the focus 688 a element may not be presented.
  • head movements can be used to target items and move between levels of navigation.
  • Each level of navigation can be represented in a multi-timeline display as one or more cards on a timeline.
  • multi-timeline display 686 a shows that if the wearer made a leftward head movement, card 692 a on timeline 690 a , representing a navigation application/process would be centered on by focus 688 a .
  • Multi-timeline display 686 a also shows that if the wearer made a rightward head movement, card 692 b on timeline 690 a representing a weather application would be centered on by focus 688 a .
  • multi-timeline display 686 a shows that if the wearer made respective upward or downward head movements, respective cards 692 c or 692 d would be centered on by focus 688 a.
  • Scenario 682 continues with the wearer making a downward head tilt.
  • the HMD can move focus 688 a downward onto card 692 d with text of “expand”.
  • the HMD can generate multi-timeline display 686 b with focus 688 b on card 692 d , as shown in the center-left portion of FIG. 6D .
  • Multi-timeline display 686 b shows that card 692 d is part of timeline 690 b.
  • Timeline 690 b represents a contextual menu for the hangout, which includes card 692 d to expand, or show other members in the hangout, invite to request other people join the hangout, end the hangout, and mute sound from one or more persons at the hangout.
  • a card 694 a representing an attendee of the hangout is shown, in part to represent the next level of navigation if the wearer were to decide to make another downward head motion.
  • Scenario 682 continues with the wearer of the HMD making another downward head motion. After determining a downward head movement was performed, the HMD can move focus 688 b downward onto card 694 a , which represents George Farley as a hangout attendee.
  • the HMD can generate multi-timeline display 686 c with focus 688 c on card 694 a , as shown in the center-right portion of FIG. 6D .
  • Multi-timeline display 686 c shows that card 694 a is part of timeline 690 c , which represents attendees of the hangout.
  • FIG. 6D shows that there are three other attendees at the hangout beyond the wearer: Pieter Vrijman represented by card 694 b , George Farley represented by card 694 a , and Richard The, who is represented by card 694 c .
  • Below card 694 a is card 696 a with text of “mute”, representing a contextual menu of operations regarding attendees of hangouts. Card 696 a also represents the next level of navigation if the wearer were to decide to make another downward head motion.
  • Scenario 682 continues with the wearer releasing his/her fingers from the touch-based UI of the HMD, thereby ending the clutch operation.
  • the HMD can revert to a single-card view as shown at the lower right hand corner of FIG. 6D .
  • the single-card view can display the last-focused card during multi-timeline display.
  • the last focus; e.g., focus 688 d during multi-timeline display was on card 694 c representing Richard The.
  • the HMD can display last-focused card 694 c in a single-card view to end scenario 682 .
  • the user interface can use contextual menus to designate operations for specific objects, applications, and/or cards.
  • FIG. 7 shows user-interface scenario 700 including contextual menus, according to an example embodiment.
  • a contextual menu is a menu of operations or other possible selections that are based on a card.
  • for a card representing a video, a contextual menu can include operations such as sharing the video, editing the video, watching the video, deleting the video, adding the video to a “video bundle” or collection of videos, annotating the video, adding, deleting and/or editing sound associated with the video, and/or other operations related to the video; the contextual menu can include more or fewer options.
  • Scenario 700 begins with the HMD receiving a tap while displaying image 710 .
  • image 710 is part of a timeline.
  • the HMD can select operations for a contextual menu, such as sharing and deleting the photo, based on the displayed card; e.g., image 710 .
  • the HMD can then display card 720 to indicate that a share operation can be performed on image 710 .
  • Card 720 also shows two dots to indicate that the current contextual menu has two options, with the leftmost dot being black and the rightmost dot being white to indicate that the current Share option is the first option of the two options.
  • a wearer can perform a swipe operation while card 720 is displayed.
  • card 722 can be displayed, where card 722 is associated with a delete operation for image 710 .
  • card 722 shows two dots to indicate that the current contextual menu has two options, with the leftmost dot being white and the rightmost dot being black to indicate that the current Delete option is the second option of the two options.
  • a swipe operation while displaying card 722 causes (re)display of card 720 .
  • the HMD can interpret the tap operation as selection of the Share option of the contextual menu.
  • a “people chooser” can be used to select a first person for sharing.
  • the people chooser can display card 730 , which includes an image and a name of a first contact.
  • FIG. 7 shows that card 730 indicates the first person as “Jane Smith”.
  • the wearer can instruct the people chooser to show other possible recipients of photo 710 via swiping through a list of contacts.
  • the list of contacts can be represented by cards that include: card 732 a showing “Another Person”, card 732 b showing “Friends”, and card 732 c indicating other person(s), circle(s), and/or social network(s) for sharing photos. People choosers are also discussed in more detail at least in the context of FIG. 8 .
  • FIG. 7 shows that swiping left while card 732 c is displayed to request a next possible recipient can lead to re-displaying card 730 associated with Jane Smith. Similarly, FIG. 7 shows that swiping right while card 730 is displayed to request a previous possible recipient can lead to card 732 c.
  • the wearer taps on the touch-based UI while card 730 is displayed, indicating that the wearer wishes to share image 710 with Jane Smith.
  • card 734 is displayed, which includes the word “Sending” and a progress bar.
  • the HMD is configured to wait for a “grace period”, such as one or a few second(s), before carrying out sending or deleting images, to give the wearer a brief interval to cancel sending or deleting the image.
  • the progress bar on card 734 can show the passing of the time of the grace period for sending image 710 .
  • the HMD can send image 710 , e.g., via e-mail or multi-media message, to Jane Smith. If image 710 is sent successfully, the HMD can display card 736 with text of “Sent” to indicate that image 710 was indeed successfully sent to Jane Smith. After displaying card 736 , the HMD can return to a timeline display, such as discussed above in the context of at least FIG. 5A .
  • if image 710 is not sent successfully, the HMD can display card 738 to indicate to the wearer that the HMD was unsuccessful in sending image 710 to Jane Smith. After displaying card 738 , the HMD can return to a timeline display, such as discussed above in the context of at least FIG. 5A .
  • the HMD can interpret the tap operation as selection of the Delete option of the contextual menu.
  • the HMD can display card 740 with text of “Deleting” and a progress bar for a grace period that has to expire before the HMD will delete image 710 .
  • the HMD can delete image 710 .
  • the HMD can display card 742 to indicate to the wearer that image 710 was indeed deleted. After displaying card 742 , the HMD can return to a timeline display, such as discussed above in the context of at least FIG. 5A .
  • FIG. 7 also shows that at any time while displaying cards 720 , 722 , 730 , 732 a - 732 c , 734 , 736 , 740 , and 742 , a swipe down operation can be performed.
  • the HMD can stop the current operation; e.g., send or delete, and return to displaying image 710 .
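  • The grace period is, in effect, a cancellable deferred action; one possible realization (timer-based, with illustrative names) is:

```python
import threading

class GracePeriodAction:
    """Run an irreversible action (send/delete) only if the grace period
    expires without a swipe-down cancellation."""

    def __init__(self, action, grace_seconds=1.0):
        self.timer = threading.Timer(grace_seconds, action)

    def start(self):
        self.timer.start()    # e.g., while the progress bar fills

    def cancel(self):
        self.timer.cancel()   # swipe down received: abort the pending action

pending = GracePeriodAction(lambda: print("image sent"), grace_seconds=1.0)
pending.start()
pending.cancel()   # wearer swiped down in time; nothing is sent
```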
  • FIG. 8 shows a user-interface scenario 800 including a people chooser, according to an example embodiment.
  • In scenario 800 , two techniques are shown for invoking the people chooser. While card 810 is displayed, a wearer of an HMD can use a voice interface that requests that the wearer “Speak a name from your contacts.” Also or instead, at 812 , the HMD can be in a contextual menu with a “Share” option that is selected.
  • the people chooser is invoked to permit selection of a person or “contact” as a destination for sharing, being called, looked up in a contact directory, or some other activity.
  • the people chooser can sort contacts by frequency of use, rather than by time of use; e.g., recency, making it a useful alternative to the timeline.
  • If “Jane Smith” is chosen while card 820 is displayed, the HMD can then take action 822 with the choice. If a swipe is received while card 820 is displayed, then another card can be displayed for a next-most frequently used contact; e.g., card 824 for “Another Person”. To select “Another Person” for the action while card 824 is displayed, a wearer can either tap the HMD using the touch-based UI or say the person's name, e.g., “Another Person”, using the voice-based interface. If “Another Person” is selected, the HMD can carry out the action with “Another Person”.
  • “Another Person” is not selected. Then, the wearer can swipe again, and another card can be displayed for a group of contacts, such as card 826 for “Friends”. To select a “Friend” for the action while card 826 is displayed, a wearer can either tap the HMD using the touch-based UI or say the person's name, e.g., “Friend”, using the voice-based interface. If the “Friends” group is selected, the HMD can provide cards in the “Friends” group in response to swipe actions until either a contact in the “Friends” group is selected or the “Friends” group is exhausted without the wearer making a selection.
  • Each item in the “Friends” group, or friend can be a contact or other representation of a person, organization, group, family, etc. that the wearer has designated as a friend.
  • the “Friends” group can be a bundle or folder that enables access to the items or friends within the bundle or folder.
  • the “Friends” group can be a group of friends ordered based on time of friend designation, most recent access, or by some other criteria.
  • Scenario 800 can continue with swipes that show contacts until either a contact is selected or until all contacts have been displayed. If all contacts have been displayed, after displaying the last contact, the HMD can “wrap around” or return to the first displayed card; e.g., card 820 representing “Jane Smith”.
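  • A frequency-ordered chooser with wrap-around can be sketched as a sorted list plus a modular index; the contact records and usage counts below are illustrative:

```python
class PeopleChooser:
    """Present contacts ordered by frequency of use, wrapping at the end."""

    def __init__(self, contacts):
        self.contacts = sorted(contacts, key=lambda c: c["uses"], reverse=True)
        self.index = 0

    def current(self):
        return self.contacts[self.index]["name"]

    def swipe(self):
        """Advance to the next contact, returning to the first at the end."""
        self.index = (self.index + 1) % len(self.contacts)
        return self.current()

chooser = PeopleChooser([
    {"name": "Another Person", "uses": 3},
    {"name": "Jane Smith", "uses": 10},
    {"name": "Friends", "uses": 5},
])
assert chooser.current() == "Jane Smith"   # most frequently used first
assert chooser.swipe() == "Friends"        # swiping advances through the list
```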
  • FIG. 9 shows a user-interface scenario 900 with camera interactions, according to an example embodiment.
  • Scenario 900 can begin by displaying card 910 or card 930 for an HMD configured with one or more cameras that can perform at least the activities described herein.
  • the camera button; e.g., button 179 of HMD 172 shown in FIG. 1D , can be pressed for either a short time; e.g., less than one second, or a long time; e.g., longer than the short time.
  • if the camera button is pressed for a short time, also referred to as a “short press” of the camera button, scenario 900 continues by displaying card 920 .
  • if the camera button is pressed for the long time, also referred to as a “long press” of the camera button, scenario 900 continues by displaying card 934 .
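  • Classifying the press by its duration against the one-second boundary suggested above can be sketched as follows (the handler and return strings are illustrative):

```python
SHORT_PRESS_MAX = 1.0   # seconds; presses shorter than this are "short"

def classify_press(duration_seconds):
    return "short" if duration_seconds < SHORT_PRESS_MAX else "long"

def on_camera_button(duration_seconds):
    if classify_press(duration_seconds) == "short":
        return "card 920"   # short press: capture a photo
    return "card 934"       # long press branch of scenario 900

assert on_camera_button(0.3) == "card 920"
assert on_camera_button(1.5) == "card 934"
```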
  • In response to the short press of the camera button, a photo or still image is captured using the camera; an example image capture is shown as card 920 . If, after capturing the photo, a tap is received, scenario 900 continues by displaying card 922 ; otherwise, if either a swipe down is received or no interaction with the touch-based UI is recorded during a wait interval; e.g., one second, scenario 900 continues by displaying card 924 .
  • Card 922 is part of a contextual menu with options for operating on the captured photo.
  • the contextual menu can include options such as a share option for the captured photo; e.g., as indicated by the “Share” card shown at 922 , a delete option for the captured photo, and other options for the captured photo (e.g., editing the photo).
  • Card 924 shows the captured photo as “animated out”; that is, the image of the captured photo is replaced with a blank card shown as card 926 via an animated transition.
  • the HMD can return to a previous state; e.g., a position in the timeline being displayed at 910 before receiving the short press of the camera button.
  • a tap can be received via the touch-based UI.
  • the HMD can display a “Capture” card, such as card 930 .
  • scenario 900 can continue with a display of card 932 .
  • Card 932 is shown in FIG. 9 as a “Photo” card, indicating to the wearer that a photo or still image can be captured using the camera. If a swipe is received while displaying card 932 , scenario 900 can continue by displaying card 934 ; otherwise, scenario 900 can continue at 950.
  • the HMD can determine whether a new video session is to be started to capture the requested video or if a pending video session is to be rejoined. If the new video session is to be started, the HMD can trigger the camera to start recording images (if not already recording) and scenario 900 can continue by displaying card 950 . If the pending video session is to be rejoined, the HMD can redirect to, or request display of, an already-existing card for the pending video session and scenario 900 can continue by displaying a card for the pending video session, shown in FIG. 9 as card 952 .
  • Card 936 is shown in FIG. 9 as a “Timelapse” card to indicate to the wearer that a timelapse image can be captured using the camera. If a swipe is received while displaying card 936 , scenario 900 can continue by displaying card 932 .
  • the HMD can determine whether a new timelapse session is to be started to capture the requested timelapse image or if a pending timelapse session is to be rejoined. If the new timelapse session is to be started, the HMD can trigger a timelapse card to start displaying a timelapse image being captured by the camera (if not already recording) and scenario 900 can continue by displaying card 960 . If the pending timelapse session is to be rejoined, the HMD can redirect to an already-existing card for the pending timelapse session and scenario 900 can continue by displaying a card for the pending timelapse session, shown in FIG. 9 as card 962 .
  • the HMD can launch a temporary view finder and instruct the camera to begin capturing images.
  • the HMD can display the image. While displaying the image, the wearer can either (a) provide a tap to the HMD and scenario 900 can continue by displaying card 942 or (b) provide a swipe down using the HMD and scenario 900 can continue by displaying card 944 .
  • the HMD can capture an image using the camera. Once captured, the HMD can display the captured image for a short period of time; e.g., one or a few seconds. After displaying the captured image for the short period, scenario 900 can proceed to display card 940 .
  • after animating out any image for possible capture; e.g., card 940 , the camera can be deactivated if no other application; e.g., video, is using the camera.
  • the HMD can return to a previous state; e.g., a position in the timeline being displayed at 910 before reaching 944 .
  • Card 950 can be a card representing the new video session. While the video session is active, the HMD can capture images and, in some embodiments, sound, and store the captured video. Upon capturing each image for the video session, the HMD can display the captured image using card 950 , which represents the new video session. While displaying the images for the video session using card 950 , the wearer can either (a) provide a tap to the HMD and scenario 900 can continue by displaying card 954 or (b) provide a swipe down using the HMD and scenario 900 can continue by displaying card 956 .
  • Card 952 can be a card representing the pending video session. While the video session is active, the HMD can capture images, and in some embodiments, sound, and store the captured video. Upon capturing each image for the video session, the HMD can display the captured image using the card 952 , which represents the pending video session. While displaying the images for the video session using card 952 , the wearer can either (a) provide a tap to the HMD and scenario 900 can continue by displaying card 954 or (b) provide a swipe down using the HMD and scenario 900 can continue by displaying card 956 .
  • Card 954 can represent a contextual menu with options for the captured video.
  • the contextual menu can include options for the captured video, such as a stop recording option, restart recording option, delete video option, and other options.
  • Card 956 can be a blank card indicating to the wearer that the video session has terminated.
  • in some embodiments, the captured video can be deleted after the video session is stopped, while in other embodiments, the captured video and/or audio can remain in storage after the video session is stopped.
  • the camera can be deactivated if no other application; e.g., a timelapse photo capture, is using the camera.
  • after displaying the blank card, the HMD can return to a previous state; e.g., a position in the timeline being displayed using card 910 before card 956 was displayed.
  • Card 960 can represent the new timelapse session. While the new timelapse session is active, the HMD can capture images for addition to the timelapse image. Upon capturing each image for the timelapse session, the HMD can display image(s) related to the new timelapse session using card 960 . While displaying card 960 , the wearer can either (a) provide a tap to the HMD and scenario 900 can continue by displaying card 964 or (b) provide a swipe down using the HMD and scenario 900 can continue by displaying card 966 .
  • Card 962 can represent the pending timelapse session. While the pending timelapse session is active, the HMD can capture images for addition to the timelapse image. Upon capturing each image for the timelapse session, the HMD can display image(s) related to the pending timelapse session using card 962 . While displaying card 962 , the wearer can either (a) provide a tap to the HMD and scenario 900 can continue by displaying card 964 or (b) provide a swipe down using the HMD and scenario 900 can continue by displaying card 966 .
  • Card 964 can represent a contextual menu with options for the captured timelapse image.
  • the contextual menu can include options for the captured timelapse image, such as a stop timelapse option, a timelapse frequency option, a restart timelapse option, and other options.
  • Card 966 can be a blank card that indicates to the wearer that the timelapse session has terminated.
  • In some embodiments, the captured timelapse image can be deleted after the timelapse session is stopped, while in other embodiments, the captured timelapse image can remain in storage after the timelapse session is stopped.
  • The camera can be deactivated if no other application (e.g., a video session) is using the camera.
  • After displaying the blank card, the HMD can return to a previous state; e.g., the position in the timeline displayed using card 910 before card 966 was displayed.
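  • To make the shared gesture handling of cards 950-966 concrete, the following is a minimal Python sketch (all names, such as CaptureSession and handle_gesture, are hypothetical illustrations and not from the patent) of how a capture-session card might route a tap to its contextual menu and a swipe down to session termination:

      # Minimal sketch of the tap/swipe-down routing described above.
      # All identifiers are hypothetical, not the patent's code.
      CONTEXTUAL_MENU_CARD = "contextual menu"   # e.g., card 954 or card 964
      BLANK_CARD = "blank"                       # e.g., card 956 or card 966

      class CaptureSession:
          def __init__(self, session_card):
              self.session_card = session_card   # e.g., card 950, 952, 960, or 962
              self.active = True

          def handle_gesture(self, gesture):
              """Return the card to display next in response to a gesture."""
              if gesture == "tap":
                  # Tap: show the contextual menu (stop/restart/delete options).
                  return CONTEXTUAL_MENU_CARD
              if gesture == "swipe down":
                  # Swipe down: terminate the session and show the blank card.
                  self.active = False
                  return BLANK_CARD
              return self.session_card           # otherwise keep showing the session

      session = CaptureSession("video session")  # e.g., card 950
      assert session.handle_gesture("tap") == CONTEXTUAL_MENU_CARD
      assert session.handle_gesture("swipe down") == BLANK_CARD
      assert not session.active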
  • FIG. 10A shows user-interface scenario 1000 with photo bundles, according to an example embodiment.
  • Scenario 1000 begins with an HMD displaying photo bundle card (PBC) 1010 in a timeline.
  • Photo bundle card 1010 includes photo bundle indicator (PBI) 1010a, example photo 1010b, and thumbnails 1010c.
  • Photo bundle indicator 1010a, shown in FIG. 10A as a page with a turned-down corner, indicates that a “photo bundle” or collection of photos is associated with photo bundle card 1010.
  • Example photo 1010b, shown in FIG. 10A as occupying roughly one-half of photo bundle card 1010, provides a relatively large image of an example photo in the photo bundle.
  • Thumbnails 1010c, shown in FIG. 10A as collectively occupying roughly one-half of photo bundle card 1010, provide four relatively small images of four example photos in the photo bundle.
  • While displaying photo bundle card 1010, the wearer of the HMD can tap on a touch-based UI to instruct the HMD to display the photos in the photo bundle.
  • The HMD can receive the tap and subsequently display a card with photo 1012.
  • Each individual item within a bundle functions the same with respect to the user interface as it would if the item were displayed on the timeline. For example, in the case of a photo, such as photo 1012 , tapping on the touch-based UI would enter a contextual menu for the photo, and swiping down while in the contextual menu would return to photo 1012 .
  • While displaying photo 1012, the HMD can receive a swipe forward to display the next photo in the bundle or a swipe backward to display the previous photo in the bundle.
  • The next photo can be photo 1014.
  • As photo 1012 is the first photo in the bundle, the previous photo wraps around to the last photo in the bundle, or photo 1018.
  • In scenario 1000, the HMD receives a swipe backward while displaying photo 1012.
  • In response, the HMD can display photo 1018 as discussed above.
  • Scenario 1000 continues with the HMD receiving two more swipes backward.
  • The HMD can first display photo 1016, which is the previous photo to photo 1018, and, after receiving the second swipe backward, display photo 1014, which is the previous photo to photo 1016, as shown in FIG. 10A.
  • While displaying photo 1014, the HMD can receive a tap. In response to the tap, the HMD can display photo bundle card 1010 and scenario 1000 can end.
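  • The wraparound behavior in scenario 1000, where swiping backward from the first photo lands on the last photo in the bundle, is naturally modeled with modular arithmetic over the photo index. A minimal sketch, with hypothetical names:

      # Sketch of wraparound navigation within a bundle (hypothetical names).
      class Bundle:
          def __init__(self, items):
              self.items = items
              self.index = 0                     # start at the first item

          def swipe_backward(self):
              self.index = (self.index - 1) % len(self.items)
              return self.items[self.index]

          def swipe_forward(self):
              self.index = (self.index + 1) % len(self.items)
              return self.items[self.index]

      photos = Bundle(["photo 1012", "photo 1014", "photo 1016", "photo 1018"])
      assert photos.swipe_backward() == "photo 1018"   # wraps to the last photo
      assert photos.swipe_backward() == "photo 1016"   # two more swipes backward...
      assert photos.swipe_backward() == "photo 1014"   # ...reach photo 1014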
  • FIG. 10B shows user-interface scenario 1050 with message bundles, according to an example embodiment.
  • Scenario 1050 begins with an HMD displaying message bundle card (MBC) 1060 in a timeline.
  • Message bundle card 1060 includes message bundle indicator (MBI) 1060a and a most-recent message in the message bundle, which includes image 1060b and message 1060c.
  • Message bundle indicator 1060a, shown in FIG. 10B as a page with a turned-down corner, indicates that a “message bundle” or collection of messages is associated with message bundle card 1060.
  • Image 1060 b can be an image associated with the sender of the most-recent message in the message bundle.
  • Message 1060c can include text and, in some embodiments, other type(s) of data, sent with the most-recent message in the message bundle. As shown in FIG. 10B, image 1060b, which occupies roughly one-third of message bundle card 1060, is an image of “Joe W.,” who sent message 1060c, which occupies roughly two-thirds of message bundle card 1060. Message 1060c includes text that says “Sounds great. See you there,” and was sent three minutes ago.
  • In scenario 1050, while displaying message bundle card 1060, the wearer of the HMD can tap on a touch-based UI. Some bundles have additional functionality, specific to the bundle, associated with a tap.
  • For a message bundle, a contextual menu can be displayed in response to the tap.
  • FIG. 10B shows two options in the contextual menu: a reply option associated with card 1070 and a read-all option associated with card 1072 .
  • While displaying card 1070, the HMD can receive a tap.
  • The HMD can interpret the tap as a selection to reply to the most recently displayed message card.
  • While displaying card 1072, the HMD can receive a tap, which can be interpreted as a request to read the messages in the message bundle, starting with the most recent. In one embodiment, the HMD can start with the first message in the message bundle rather than the most recent.
  • The HMD can select message bundle card 1060 for display.
  • Each individual item within a bundle functions the same with respect to the user interface as it would if the item were displayed on the timeline. For example, in the case of a message, such as message 1062 , tapping on the touch-based UI would enter a contextual menu for the message, and swiping down while in the contextual menu for the message would return to message 1062 .
  • While displaying message 1062, the HMD can receive a swipe forward to display the next message in the bundle or a swipe backward to display the previous message in the bundle.
  • The previous message can be message 1064.
  • As message 1062 is the first message in the bundle, there is no “next” message, so the last message in the bundle, or message 1066, can be displayed instead.
  • In scenario 1050, the HMD receives a swipe forward while displaying message 1062.
  • In response, the HMD can display message 1066 as discussed above.
  • Scenario 1050 continues with the HMD receiving two more swipes forward.
  • The HMD can first display message 1064, which is the next message to message 1066, and, after receiving the second swipe forward, display message 1062, which is the next message to message 1064, as shown in FIG. 10B.
  • While displaying message 1062, the HMD can then receive a tap.
  • In response, the HMD can enter a contextual menu for message 1062 and scenario 1050 can end.
  • FIG. 11 shows user-interface scenario 1100 with timeline 1110 including settings cards 1120 , 1130 , according to an example embodiment.
  • Timeline 1110 has two settings cards 1120 and 1130 at the now/future end of the timeline.
  • Both cards 1120 and 1130 permit interaction with various “settings”, e.g., controls, preferences, data, and/or other information, in response to a tap input via the touch-based user interface.
  • Card 1120 is related to wireless network (“WiFi”) settings, which can be settings related to wireless networks operating using one or more protocols, such as IEEE 802.11 protocols, which are discussed in more detail below in the context of FIG. 12 .
  • Card 1130 is related to Bluetooth settings, which can be settings related to short range wireless networks operating using one or more Bluetooth protocols, which are discussed in more detail below in the context of FIG. 13 .
  • FIG. 12 shows user-interface scenario 1200 related to WiFi settings, according to an example embodiment.
  • Scenario 1200 begins with an HMD displaying card 1210 .
  • Card 1210 indicates that the HMD is connected via WiFi to a network of computers called “GGuest.”
  • In scenario 1200, a wearer of the HMD taps the touch-based UI of the HMD.
  • In response, the HMD displays card 1220, which indicates that the HMD is connected to GGuest and shows a map of the general area around the HMD.
  • From card 1220, the wearer can swipe next through cards 1230 and 1240, which indicate available networks for connection, card 1250 to begin the process of adding another WiFi network, and card 1260 to turn off the WiFi functionality of the HMD.
  • Swiping next after displaying card 1260 leads to display of card 1220, and swiping previous after displaying card 1220 leads to display of card 1260.
  • In response to tapping while displaying card 1220, the HMD displays card 1222 with text of “Forget”.
  • The wearer can use the touch-based UI of the HMD to either (a) tap to instruct the HMD to begin a process of forgetting, e.g., deleting stored information, about the currently connected WiFi network, or (b) swipe to bring up card 1232 with text of “Disconnect” to begin a process of disconnecting from the currently connected WiFi network.
  • In this case, the currently connected WiFi network would be GGuest, as card 1220 was reached after tapping card 1210, and card 1210 is associated with the GGuest WiFi network.
  • In scenario 1200, the wearer taps on the touch-based UI of the HMD while card 1222 is displayed to instruct the HMD to forget about the GGuest network.
  • The process of forgetting about a WiFi network is associated with a grace period to permit the wearer to reconsider.
  • The HMD can display card 1224 with text of “Forgetting” and progress bar 1224a.
  • Progress bar 1224a can take a length of time, such as equal to or greater than the grace period, to complete display. After progress bar 1224a is completely displayed, the grace period is deemed to have expired.
  • The HMD can then delete stored information about the currently connected WiFi network and display card 1226 indicating the currently connected WiFi network is now forgotten. After displaying card 1226, the HMD can return to the settings context menu.
  • In scenario 1200, the wearer taps on the touch-based UI of the HMD while card 1232 is displayed to instruct the HMD to disconnect from the GGuest network.
  • The process of disconnecting from a WiFi network is associated with a grace period to permit the wearer to reconsider.
  • The HMD can display card 1234 with text of “Disconnecting” and progress bar 1234a.
  • Progress bar 1234a can take a length of time, such as equal to or greater than the grace period, to complete display. After progress bar 1234a is completely displayed, the grace period is deemed to have expired.
  • The HMD can then disconnect from the currently connected WiFi network and display card 1236 indicating that the HMD is now disconnected from the previously-connected WiFi network. After displaying card 1236, the HMD can return to the settings context menu.
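  • The “Forgetting” and “Disconnecting” cards can be read as a cancellable countdown: the progress bar animates for at least the grace period, and the destructive action runs only if the wearer does not intervene first. A minimal sketch under that assumption (the function and parameter names are hypothetical):

      import time

      def run_with_grace_period(action, grace_seconds, cancelled, steps=10):
          """Animate a progress bar for at least grace_seconds, then run
          action() unless cancelled() became true during the countdown."""
          for step in range(steps):
              time.sleep(grace_seconds / steps)      # draw one progress-bar segment
              print("[" + "#" * (step + 1) + "-" * (steps - step - 1) + "]")
              if cancelled():
                  return False                       # wearer reconsidered; abort
          action()                                   # grace period expired
          return True

      # Example: forget the currently connected WiFi network after 3 seconds.
      run_with_grace_period(
          action=lambda: print("Forgot network GGuest"),
          grace_seconds=3.0,
          cancelled=lambda: False,
      )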
  • Card 1230 displays information about a nearby WiFi network named “GA Ntwk” including the network's use of Wired Equivalent Privacy or “WEP”, and a map with location information about “GA Ntwk.”
  • In response to tapping while displaying card 1230, the HMD attempts to connect to the “GA Ntwk” network and displays card 1244 with text of “Connecting”.
  • After displaying card 1244, if the HMD is able to successfully connect to the WiFi network, the HMD will display card 1246 with text of “Connected” and return to the settings context menu. If the HMD is unable to successfully connect to the WiFi network; e.g., the network is not open access and requires authentication for access, the HMD will display card 1248 with text of “Failed” and return to the previous card; e.g., card 1230, to request additional input related to the “GA Ntwk.” In some embodiments, the HMD can automatically attempt WiFi reconnection upon (initial) failure. In particular embodiments, the HMD will automatically attempt WiFi reconnection for a fixed number of attempts before indicating failure. If the HMD automatically reattempts WiFi connection upon failure, the HMD can display card 1244 as the “previous” card.
  • Card 1240 displays information about a nearby WiFi network named “Coffee Shop” including a map with location information about “Coffee Shop”. In response to tapping while displaying card 1240 , the HMD can determine that the “Coffee Shop” network is secured and display card 1242 .
  • Card 1242 displays an icon of a Quick Response (QR) code, text to “Enter Password”, and a hint of “Generate QR code at <Example URL>.”
  • The QR code is then provided to the HMD.
  • The QR code can be on a sticker, poster, paper, or otherwise displayed at the wearer's location; e.g., the “Coffee Shop” location.
  • Alternatively, the QR code can be generated via a website in which the user entered the credentials for access to the network. Once a suitable QR code is located, the wearer can capture the QR code by pointing the HMD's camera at it.
  • In some embodiments, other techniques besides a QR code can be used to enter network credentials, such as the wearer speaking the password for access to a network.
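  • The patent does not specify how network credentials are encoded in the QR code, but one common convention (popularized by the ZXing project and recognized by many mobile devices) is a payload of the form WIFI:T:WPA;S:&lt;ssid&gt;;P:&lt;password&gt;;;. The sketch below builds such a payload and is purely illustrative:

      def wifi_qr_payload(ssid, password, security="WPA"):
          """Build a ZXing-style WiFi QR payload (illustrative only; the
          patent does not specify the encoding the HMD expects)."""
          def escape(text):
              for ch in "\\;,:\"":               # escape the backslash first
                  text = text.replace(ch, "\\" + ch)
              return text
          return f"WIFI:T:{security};S:{escape(ssid)};P:{escape(password)};;"

      print(wifi_qr_payload("Coffee Shop", "latte123"))
      # WIFI:T:WPA;S:Coffee Shop;P:latte123;;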
  • the HMD can display card 1244 .
  • the HMD After displaying card 1244 , if the HMD is able to successfully connect to the WiFi network, the HMD will display card 1246 with text of “Connected” and return to the setting context menu. If the HMD is unable to successfully connect to the WiFi network, the HMD will display card 1248 with text of “Connected” and return to the previous card; e.g., card 1230 to request additional input related to the “Coffee Shop” network.
  • the HMD can automatically reattempt WiFi connection upon (initial) failure.
  • the HMD will automatically attempt WiFi reconnection for a fixed number of attempts before indicating failure. If the HMD automatically reattempts WiFi connection upon failure, the HMD can display card 1244 as the “previous” card.
  • Card 1250 displays a QR code encoding information about a WiFi network.
  • The wearer can obtain a QR code, and the HMD's camera can be utilized to capture the QR code as discussed above.
  • In some embodiments, other techniques besides a QR code can be used to enter network credentials, such as the wearer speaking the password for access to a network.
  • After capturing the QR code, the HMD can display card 1244.
  • After displaying card 1244, if the HMD is able to successfully connect to the WiFi network, the HMD will display card 1246 with text of “Connected” and return to the settings context menu. If the HMD is unable to successfully connect to the WiFi network, the HMD will display card 1248 with text of “Failed” and return to the previous card; e.g., card 1250, to request additional input related to the WiFi network to be added using card 1250.
  • In some embodiments, the HMD can automatically reattempt WiFi connection upon (initial) failure. In particular embodiments, the HMD will automatically attempt WiFi reconnection for a fixed number of attempts before indicating failure. If the HMD automatically reattempts WiFi connection upon failure, the HMD can display card 1244 as the “previous” card.
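  • The automatic-reconnection behavior, retrying a fixed number of times before showing the “Failed” card, can be sketched as a bounded retry loop; try_connect and the printed card labels are hypothetical stand-ins:

      def connect_with_retries(try_connect, max_attempts=3):
          """Show 'Connecting' (card 1244) per attempt, then 'Connected'
          (card 1246) on success or 'Failed' (card 1248) after exhausting
          max_attempts (the attempt count is an assumed parameter)."""
          for attempt in range(1, max_attempts + 1):
              print(f"card 1244: Connecting (attempt {attempt})")
              if try_connect():
                  print("card 1246: Connected")
                  return True
          print("card 1248: Failed")
          return False

      # Example: a network that only accepts the second connection attempt.
      attempts = iter([False, True])
      connect_with_retries(lambda: next(attempts))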
  • In response to tapping while displaying card 1260, the HMD begins a process of “turning off” or deactivating WiFi functionality for the HMD.
  • The process of deactivating WiFi functionality is associated with a grace period to permit the wearer to cancel or abort the WiFi deactivation.
  • The HMD can display card 1262 with text of “Turning off” and progress bar 1262a.
  • Progress bar 1262a can take a length of time, such as equal to or greater than the grace period, to complete display. After progress bar 1262a is completely displayed, the grace period is deemed to have expired.
  • The HMD can then deactivate WiFi functionality for the HMD and display card 1264 indicating the WiFi functionality for the HMD is off. After displaying card 1264, the HMD can return to the settings context menu.
  • In other scenarios, card 1210 could indicate, as shown using card 1212 of FIG. 12, that the HMD is not connected to a WiFi network or, as shown using card 1214 of FIG. 12, that WiFi functionality of the HMD is turned off.
  • In these cases, card 1220 is not used, and tapping either card 1212 or 1214 leads to display of card 1230.
  • Also in these cases, card 1260 displays “Turn On” or similar text, and tapping card 1260 while the WiFi functionality is initially off leads to activation of the HMD's WiFi functionality.
  • FIG. 13 shows user-interface scenario 1300 related to Bluetooth settings, according to an example embodiment.
  • Scenario 1300 begins with the HMD displaying card 1310 in a timeline.
  • Card 1310 includes a Bluetooth logo and text indicating the HMD is “Connected to Galaxy Nexus [and] Home-PC.”
  • In scenario 1300, in response to viewing card 1310, the wearer performs a tap operation using the touch-based UI of the HMD.
  • In response, the HMD can display card 1320.
  • Card 1320 shows an image of a mobile device and text of “Connected to Galaxy Nexus.”
  • From card 1320, the wearer can swipe next through card 1330, which indicates connection to a Home-PC, and card 1340 to begin the process to “pair with” or connect to another device using Bluetooth.
  • Swiping next after displaying card 1340 leads to display of card 1320, and swiping previous after displaying card 1320 leads to display of card 1340.
  • While card 1320 is displayed, the wearer can perform a tap operation using the touch-based UI of the HMD.
  • In response, card 1332 is displayed with text of “Disconnect” to indicate a disconnect operation to be performed on the current Bluetooth connection.
  • While card 1332 is displayed, the wearer can use the touch-based UI to perform a swipe operation.
  • In response, the HMD can display card 1322 with text of “Forget” to indicate a forget operation for the current Bluetooth connection.
  • The wearer can use the touch-based UI of the HMD to either (a) tap to instruct the HMD to begin a process of forgetting about the current Bluetooth connection, or (b) swipe to re-view card 1332.
  • In this case, the Bluetooth connection would be a connection between the HMD and “Galaxy Nexus”, as card 1322 was reached after tapping card 1320, and card 1320 is associated with the HMD/Galaxy Nexus Bluetooth connection.
  • In scenario 1300, the wearer taps on the touch-based UI of the HMD while card 1322 is displayed to instruct the HMD to forget about the HMD/Galaxy Nexus Bluetooth connection.
  • The process of forgetting about a Bluetooth connection is associated with a grace period to permit the wearer to reconsider.
  • The HMD can display card 1324 with text of “Forgetting” and a progress bar.
  • The progress bar can take a length of time, such as equal to or greater than the grace period, to complete display. After the progress bar is completely displayed, the grace period is deemed to have expired.
  • The HMD can then delete stored information about the current Bluetooth connection and display card 1326 indicating the current Bluetooth connection is now forgotten. After displaying card 1326, the HMD can return to the home card context menu.
  • In scenario 1300, the wearer taps on the touch-based UI of the HMD while card 1332 is displayed to instruct the HMD to disconnect from the Galaxy Nexus.
  • The process of disconnecting a Bluetooth connection is associated with a grace period to permit the wearer to reconsider.
  • The HMD can display card 1334 with text of “Disconnecting” and a progress bar that can take a length of time, such as equal to or greater than the grace period, to complete display. After the progress bar is completely displayed, the grace period is deemed to have expired.
  • The HMD can then disconnect from the current Bluetooth connection and display card 1336 indicating that the HMD is now disconnected from the previously-connected Bluetooth connection. After displaying card 1336, the HMD can return to the home card context menu.
  • The wearer can also use the touch-based UI of the HMD to perform a tap operation while card 1330 is displayed.
  • Card 1330 shows an image of a computer display and has text of “Connected to Home-PC” to indicate a Bluetooth connection between the HMD and a device named “Home-PC”.
  • In response, the HMD can display card 1332 for disconnecting the HMD/Home-PC connection, or, after receiving a swipe operation, the HMD can display card 1322 for forgetting the HMD/Home-PC connection.
  • While card 1322 or card 1332 is displayed, the wearer can use the touch-based UI to perform a tap operation.
  • In response, the HMD can respectively perform the forgetting (after card 1322 display) or disconnecting (after card 1332 display) operations for Bluetooth connections, using the HMD/Home-PC connection as the current Bluetooth connection, as the tap for card 1322/1332 was received after most recently displaying card 1330 representing the HMD/Home-PC connection.
  • To pair with a new device, the HMD can be brought into, or already be in, proximity of some other device configured to pair with the HMD.
  • In scenario 1300, the other device is a mobile phone identified, e.g., as “Galaxy Nexus.”
  • In other scenarios, the HMD can attempt to pair with a device other than the Galaxy Nexus. If the other device attempts to pair with the HMD (or vice versa), card 1342 can be displayed in response. As shown in FIG. 13, card 1342 includes an image of a mobile device and text of “Pair with Galaxy Nexus? Tap if Galaxy Nexus displays 186403.”
  • While card 1342 is displayed, the wearer can use the touch-based UI to perform a tap operation, and so instruct the HMD to pair with the other device; e.g., the Galaxy Nexus.
  • In response, the HMD can display card 1344 with text of “Pairing” to indicate that the HMD is attempting to pair with the Galaxy Nexus.
  • Upon successful pairing, the HMD can display card 1346 with text of “Paired” and can return to the main timeline after “splicing” or adding a card for the new device to the timeline.
  • If pairing fails, the HMD can display card 1348 with text of “Failed” and can return to card 1340 (“Pair with new device”) to possibly reattempt pairing with the Galaxy Nexus and/or pair with a different device.
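  • Card 1342's prompt (“Tap if Galaxy Nexus displays 186403”) follows the numeric-comparison pattern used in Bluetooth Secure Simple Pairing, in which both devices show the same six-digit code and the user confirms that the codes match. A minimal sketch of that confirmation flow (hypothetical names; not an actual Bluetooth stack):

      def pairing_flow(code, wearer_confirms, do_pair):
          """Card 1342 asks the wearer to confirm the code; card 1344 shows
          'Pairing'; the flow ends at card 1346 ('Paired') or 1348 ('Failed')."""
          print(f"card 1342: Pair with Galaxy Nexus? Tap if it displays {code}")
          if not wearer_confirms():
              return "card 1340"                 # back to 'Pair with new device'
          print("card 1344: Pairing")
          return "card 1346" if do_pair() else "card 1348"

      result = pairing_flow("186403",
                            wearer_confirms=lambda: True,
                            do_pair=lambda: True)
      assert result == "card 1346"               # 'Paired'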
  • The HMD can arrange portions of a card in a “visual stack” in order to generate the visual rendering of the card.
  • FIG. 14A shows example visual stack 1400 , according to an example embodiment.
  • Visual stack 1400 is used to generate or render a card from a set of overlaid images.
  • Visual stack 1400 is the collection of images viewed looking down viewport 1410 via a “rectangular tube”, shown with dashed lines in FIG. 14A, to main timeline 1430.
  • The collection of images can be part of timelines, menus, etc., each of which can be considered to run independently at different levels perpendicular to the rectangular tube.
  • The wearer can then perceive content on the display of the HMD through viewport 1410 to see the portions of the timelines, menus, etc. that are within the rectangular tube.
  • FIG. 14A shows that three items are in visual stack 1400 between viewport 1410 and main timeline 1430 : submenu 1420 , contextual menu 1422 , and overlay 1424 .
  • Submenu 1420 includes three images: an image of “Jane Smith”, an image of “Another Person”, and an image associated with “Friends”, with the image of “Another Person” inside the rectangular tube.
  • Contextual menu 1422 includes two options: a “Share” option and a “Delete” option, with the “Share” option inside the rectangular tube.
  • Thus, visual stack 1400 shows contextual menu 1422 for a photo bundle card shown on main timeline 1430, with the “Share” option selected from contextual menu 1422 and a sharing destination of “Another Person” selected from submenu 1420.
  • FIG. 14B shows example visual stack 1450 , according to an example embodiment.
  • Visual stack 1450 is the collection of images viewed looking down viewport 1460 via a rectangular tube, shown with dashed lines in FIG. 14B, to main timeline 1480.
  • FIG. 14B shows that two items are in visual stack 1450 between viewport 1460 and main timeline 1480: action notification 1470 and overlay 1472.
  • Action notification 1470 shows a “Send” notification.
  • Thus, visual stack 1450 shows a “Send” notification for a photo bundle card shown on main timeline 1480.
  • In some embodiments, overlay 1472 is completely opaque with respect to main timeline 1480. In these embodiments, the wearer viewing visual stack 1450 sees action notification 1470 and overlay 1472. In other embodiments, overlay 1472 is partially or completely transparent or translucent with respect to main timeline 1480. In these embodiments, the wearer viewing visual stack 1450 sees action notification 1470 and overlay 1472, with some portion(s) of the photo bundle card shown on main timeline 1480 visible, depending on the visibility of an image on main timeline 1480 through overlay 1472.
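  • The visual stacks of FIGS. 14A and 14B can be thought of as back-to-front alpha compositing: the rendered card is the main-timeline image blended with each higher layer according to that layer's opacity. A minimal per-pixel sketch (the data layout is assumed; an actual HMD would render this on a GPU):

      def composite(layers):
          """Blend layers back-to-front. Each layer is ((r, g, b), alpha),
          with the main timeline first and the layer nearest the viewport last."""
          r, g, b = layers[0][0]                 # main timeline, fully visible
          for (lr, lg, lb), alpha in layers[1:]:
              r = lr * alpha + r * (1 - alpha)
              g = lg * alpha + g * (1 - alpha)
              b = lb * alpha + b * (1 - alpha)
          return (r, g, b)

      timeline_pixel = ((200, 180, 120), 1.0)    # photo bundle card on the timeline
      overlay_1472 = ((0, 0, 0), 0.6)            # translucent overlay dims the timeline
      print(composite([timeline_pixel, overlay_1472]))      # timeline shows through
      print(composite([timeline_pixel, ((0, 0, 0), 1.0)]))  # opaque overlay hides it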
  • FIG. 15 shows a user-interface scenario 1500 related to voice interactions, according to an example embodiment.
  • Scenario 1500 begins with a wearer of the HMD reciting and/or uttering the phrase “ok glass”, which can be prompted by a hint provided on a home card such as discussed above in the context of FIG. 3 .
  • The HMD can receive the utterance “ok glass” via the voice-based interface and display card 1510.
  • Card 1510 shows the input command “ok glass” to confirm the input received at the voice-based UI of the HMD and a list of available voice commands including “Google”, “navigate to”, “take a photo”, “record a video”, “send a message to”, and “make a call to.”
  • The wearer of the HMD can tilt his/her head up or down to respectively scroll up or down through lists, such as the list of available voice commands.
  • Card 1512 shows the result of scrolling down the list of possible voice commands shown in card 1510, indicating the removal of the previously-visible available voice command “Google” and the addition of the available voice command “hangout with.”
  • The HMD can use tilt sensors, accelerometers, motion detectors, and/or other devices/sensors to determine whether the wearer tilted their head and, if so, whether the head was tilted up or down.
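  • Head-tilt scrolling of the command list (card 1510 to card 1512) can be sketched as mapping the pitch reported by the HMD's orientation sensors to a list offset, with a dead zone so that small head movements do not scroll. Every name and threshold below is an assumption for illustration:

      def scroll_from_pitch(pitch_deg, top_index, n_items, visible=6,
                            dead_zone_deg=5.0, deg_per_item=10.0):
          """Map head pitch (positive = tilted down) to a new top-of-list index."""
          if abs(pitch_deg) < dead_zone_deg:
              return top_index                    # ignore small tilts
          step = int((abs(pitch_deg) - dead_zone_deg) // deg_per_item) + 1
          if pitch_deg > 0:                       # tilt down -> scroll down
              return min(top_index + step, n_items - visible)
          return max(top_index - step, 0)         # tilt up -> scroll up

      commands = ["Google", "navigate to", "take a photo", "record a video",
                  "send a message to", "make a call to", "hangout with"]
      top = scroll_from_pitch(12.0, top_index=0, n_items=len(commands))
      print(commands[top:top + 6])  # "Google" scrolls off; "hangout with" appears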
  • While displaying card 1510, the wearer utters the word “Google”, which causes card 1520 to be displayed. Card 1520 can also be displayed in response to the wearer uttering “OK Google” and the voice-based UI of the HMD recognizing the “OK Google” phrase. Card 1520, as shown in FIG. 15, includes a hint of “ask a question” along with an icon of a microphone that can act as a reminder to the wearer that they are using the voice-based interface.
  • The wearer can then utter “How tall is the Eiffel Tower?”
  • The HMD can then display card 1522 showing that the HMD is processing the input utterance and, once processed, display card 1524.
  • Card 1524 “echo-prints” or repeats the input utterance of “How tall is the Eiffel Tower?” and also prints an indicator of “searching” to inform the wearer that a search is ongoing based on their input.
  • Once the search completes, the HMD can display card 1526, which includes an image of the Eiffel Tower and an answer of “1,063 feet (324 m)” to the question asked by the wearer.
  • In scenario 1500, in response to card 1512, the wearer can utter “Navigate to”.
  • The voice-based UI of the HMD can capture the utterance, determine that the utterance is “Navigate to”, and display card 1530 showing an echo-print of the “Navigate to” utterance.
  • Scenario 1500 includes two examples of a destination of the “Navigate to” command.
  • In the first example, the wearer utters “Restaurant 1” as the destination, as indicated via card 1532.
  • Card 1532 includes an echo-print of the “ok glass, navigate to Restaurant 1” command and an indication that the HMD is searching for a location of Restaurant 1.
  • In this example, a single location result is returned for “Restaurant 1”.
  • Card 1534 includes a map to the single location that occupies about one-third of the card. The remainder of card 1534 shows that the HMD is “navigating to Restaurant 1”, which is on “13th Street”, and that it will take “16 minutes” to get to the restaurant.
  • In the second example, the wearer utters “Popular Pueblo” as the destination, as indicated via card 1536.
  • Card 1536 includes an echo-print of the “ok glass, navigate to Popular Pueblo” command and an indication that the HMD is searching for a location of Popular Pueblo.
  • A search for a location of “Popular Pueblo” returned multiple locations, as indicated by location cards 1538.
  • The wearer can use the touch-based UI to swipe through the multiple location cards 1538 to view each location card individually, and to perform a tap operation to select a particular Popular Pueblo location while the desired Popular Pueblo location card is displayed. After the desired Popular Pueblo location card is displayed and the tap operation is completed, the desired location result is shown in card 1540 of FIG. 15.
  • Card 1540 includes a map to the desired location that occupies about one-third of the card. The remainder of card 1540 shows that the HMD is “navigating to Popular Pueblo”, which is on “14th Street”, and that it will take “5 minutes” to get to the desired Popular Pueblo.
  • In scenario 1500, in response to card 1512, the wearer can utter “Send message to”.
  • The voice-based UI of the HMD can capture the utterance, determine that the utterance is “Send message to”, and display card 1550 showing an echo-print of the “Send message to” utterance, along with a list of potential recipients of the message.
  • FIG. 15 shows that the list of potential recipients includes “Sarah Johnson”, “Steve Johnson”, and “Julie Dennis”.
  • Scenario 1500 continues with the wearer uttering “Sarah Johnson” to the HMD.
  • The voice-based UI of the HMD can capture the utterance, determine that the utterance is “Sarah Johnson”, and display card 1554 showing an echo-print of the “send message to” utterance, along with “Sarah Johnson” as a recipient of the message.
  • The HMD can wait for a period of time, e.g., one second, for the wearer to provide additional recipients. If the wearer does not provide additional recipients in that period of time, the HMD can display a card, such as card 1556, for composing and echo-printing a message.
  • Card 1556 shows echo-printed utterances including “hi sarah I'm on my way out will be a few” and blocks. The blocks indicate that the HMD is in the process of recognizing the utterances provided by the wearer and translating those utterances into text.
  • Speech can be translated to text using one or more automatic speech recognition (ASR) techniques.
  • The wearer then stops uttering content for the message.
  • In scenario 1500, after uttering the content of the message, the wearer decides to send the message to the recipient, Sarah Johnson.
  • To send the message, the wearer can either perform a tap operation using the touch-based UI or stop uttering for a period of time; e.g., one second.
  • The HMD can then display a card such as card 1558 indicating that the message is in the process of being sent. After the message is sent, the sent message is spliced into the timeline.
  • FIG. 16A is a flow chart illustrating a method 1600 , according to an example embodiment.
  • Method 1600 is described by way of example as being carried out by a computing device, such as a wearable computer, and possibly a wearable computer that includes a head-mounted display (HMD).
  • Example methods can be carried out by a wearable computer without the wearable computer being worn.
  • For example, such methods can be carried out by simply holding the wearable computer using the wearer's hands.
  • Other possibilities can also exist.
  • Example methods can also be carried out by devices other than a wearable computer, and/or can be carried out by sub-systems in a wearable computer or in other devices.
  • For example, an example method can alternatively be carried out by a device such as a mobile phone, which is programmed to simultaneously display a graphic object in a graphic display and also provide a point-of-view video feed in a physical-world window.
  • Other examples are also possible.
  • Method 1600 begins at block 1610, where an HMD can display a home card of an ordered plurality of cards.
  • The ordered plurality of cards can be ordered based on time.
  • Each card in the ordered plurality of cards is associated with a specific time.
  • The choose-next input type can be associated with going forward in time.
  • Then, the next card can be associated with a specific time that is equal to or later than the specific time associated with the home card.
  • The choose-previous input type can be associated with going backward in time.
  • Then, the previous card can be associated with a specific time that is equal to or earlier than the specific time associated with the home card.
  • While displaying the home card, the HMD can receive a first input.
  • The first input can be associated with a first input type.
  • The first input type can include a choose-next input type and a choose-previous input type.
  • The HMD can include a touch-based UI, such as a touch pad, via which the first input can be received.
  • The user input can be received via other user interfaces as well.
  • In response to the first input type being the choose-next input type, the HMD can: (a) obtain a next card of the ordered plurality of cards, where the next card is subsequent to the home card in the ordered plurality of cards, and (b) display the next card using the HMD.
  • In response to the first input type being the choose-previous input type, the HMD can: (a) obtain a previous card of the ordered plurality of cards, where the previous card is prior to the home card in the ordered plurality of cards, and (b) display the previous card using the HMD.
  • Method 1600 can further involve the HMD receiving a next input while displaying the next card.
  • The next input can be associated with a next input type.
  • The next input type can include the choose-next input type and the choose-previous input type.
  • In response to the next input type being the choose-next input type, the HMD can obtain a second-next card of the plurality of cards, where the second-next card is subsequent to the next card in the ordered plurality of cards.
  • The HMD can display the second-next card.
  • In response to the next input type being the choose-previous input type, the HMD can obtain the home card and display the home card.
  • Method 1600 can additionally include that the HMD can, while displaying the previous card, receive a previous input.
  • The previous input can be associated with a previous input type.
  • The previous input type can include the choose-next input type and the choose-previous input type.
  • In response to the previous input type being the choose-next input type, the HMD can obtain the home card and display the home card.
  • In response to the previous input type being the choose-previous input type, the HMD can obtain a second-previous card of the plurality of cards, where the second-previous card is prior to the previous card in the ordered plurality of cards.
  • The HMD can display the second-previous card.
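  • The navigation just described amounts to moving a cursor along the time-ordered card list, with the home card as the starting position. A minimal sketch (the class and method names are hypothetical, and the end-of-timeline behavior, clamping here, is an assumption; the patent describes behavior, not an implementation):

      class Timeline:
          """Cards ordered by time; the cursor starts on the home card."""
          def __init__(self, cards, home_index):
              self.cards = cards
              self.cursor = home_index

          def handle(self, input_type):
              if input_type == "choose-next":        # go forward in time
                  self.cursor = min(self.cursor + 1, len(self.cards) - 1)
              elif input_type == "choose-previous":  # go backward in time
                  self.cursor = max(self.cursor - 1, 0)
              return self.cards[self.cursor]         # card to display

      tl = Timeline(["previous", "home", "next", "second-next"], home_index=1)
      assert tl.handle("choose-next") == "next"
      assert tl.handle("choose-next") == "second-next"
      assert tl.handle("choose-previous") == "next"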
  • The second-previous card can include a bundle card.
  • the bundle card can represent a collection of cards and can include a bundle card indicator.
  • Method 1600 can further include receiving a bundle-card input of a bundle-card type at the HMD, while displaying the bundle card.
  • A first card of the collection of cards can be displayed in response to the bundle-card type of input being a tap.
  • While displaying the first card, the HMD can receive a first-card input associated with a first-card type.
  • The HMD can select a second card in the collection of cards, where the second card is subsequent to the first card, and display the second card.
  • Alternatively, the HMD can select a third card in the collection of cards, where the third card is prior to the first card, and display the third card.
  • The first input type can additionally include a fast-choose-next input type and a fast-choose-previous input type.
  • Each of the choose-next input type and the choose-previous input type can be associated with a first card rate, and each of the fast-choose-next input type and the fast-choose-previous input type can be associated with a second card rate.
  • The second card rate can exceed the first card rate.
  • Method 1600 can additionally include, in response to the first input type being the choose-next input type: (i) simulating movement at the first card rate through the ordered plurality of cards subsequent to the home card, and (ii) obtaining the next card based on the simulated movement subsequent to the home card at the first card rate.
  • In response to the first input type being the choose-previous input type, method 1600 can include: (iii) simulating movement at the first card rate through the ordered plurality of cards prior to the home card, and (iv) obtaining the previous card based on the simulated movement prior to the home card at the first card rate.
  • In response to the first input type being the fast-choose-next input type, method 1600 can include: (v) simulating movement at the second card rate through the ordered plurality of cards subsequent to the home card, and (vi) obtaining a fast-next card based on the simulated movement subsequent to the home card at the second card rate.
  • In response to the first input type being the fast-choose-previous input type, method 1600 can additionally include: (vii) simulating movement at the second card rate through the ordered plurality of cards prior to the home card, and (viii) obtaining a fast-previous card based on the simulated movement prior to the home card at the second card rate.
  • Each of the choose-next input type and the choose-previous input type can be associated with a swipe made using a first number of fingers, and each of the fast-choose-next input type and the fast-choose-previous input type can be associated with a swipe made using a second number of fingers. Then, the first number of fingers can differ from the second number of fingers.
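  • The choose/fast-choose distinction can be captured by mapping the swipe's finger count to a card rate and simulating movement at that rate. A sketch under the assumption that a one-finger swipe moves one card and a two-finger swipe moves several (the specific rates are illustrative):

      # Hypothetical mapping from finger count to cards traversed per swipe.
      CARD_RATE = {1: 1, 2: 5}          # the second card rate exceeds the first

      def simulate_movement(cards, start, direction, fingers):
          """direction: +1 for (fast-)choose-next, -1 for (fast-)choose-previous."""
          rate = CARD_RATE[fingers]
          target = max(0, min(start + direction * rate, len(cards) - 1))
          return cards[target]

      cards = [f"card {i}" for i in range(20)]
      assert simulate_movement(cards, start=10, direction=+1, fingers=1) == "card 11"
      assert simulate_movement(cards, start=10, direction=+1, fingers=2) == "card 15"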
  • FIG. 16B is a flow chart illustrating a method 1650 , according to an example embodiment.
  • Method 1650 is described by way of example as being carried out by a computing device, such as a wearable computer, and possibly a wearable computer that includes an HMD, but other techniques and/or devices can be used to carry out method 1650, such as discussed above in the context of method 1600.
  • Method 1650 begins at block 1660, where a home card can be displayed by a head-mountable device (HMD).
  • The HMD can include a user-interface (UI) state, where the UI state is in a home UI state.
  • Displaying the home card can include displaying a hint for using a UI of the HMD on the home card.
  • The hint can include a hint for the voice-based UI.
  • The hint can also include a hint for the touch-based UI.
  • Displaying the hint can include determining whether the number of times the hint has been used successfully meets or exceeds a threshold number of times. In response to determining that the number of times does not meet or exceed the threshold, the hint can be displayed on the home card. In response to determining that the number of times does meet or exceed the threshold, the hint can be inhibited from display on the home card.
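  • The hint-suppression rule reduces to a counter check: display the hint until it has been used successfully a threshold number of times. A one-function sketch (the threshold value is an assumption):

      def should_show_hint(successful_uses, threshold=3):
          """Show the hint on the home card until the wearer has used it
          successfully at least `threshold` times (threshold is assumed)."""
          return successful_uses < threshold

      assert should_show_hint(0) is True     # new wearer still sees the hint
      assert should_show_hint(3) is False    # hint inhibited after enough successes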
  • While in the home UI state, a first UI of the HMD can receive a first input.
  • The first input can be associated with a first type of input.
  • In response to the first type of input being a choose-next type of input, the HMD can: display a next card of an ordered plurality of cards, where the ordered plurality of cards also includes the home card, and where the next card can differ from the home card, and set the UI state to a timeline-next state.
  • The HMD can, in response to the first type of input being a choose-previous type of input: display a previous card of the ordered plurality of cards, where the choose-previous type of input differs from the choose-next type of input, and where the previous card differs from both the next card and the home card, and set the UI state to a timeline-previous state.
  • The HMD can, in response to the first type of input being a tap type of input: activate a second UI of the HMD, where the first UI of the HMD is a touch-based UI and where the second UI is a voice-based UI, and set the UI state of the HMD to a voice-home state.
  • The HMD can, in response to the first type of input being a speech-type of input, determine whether text associated with the first input matches a predetermined text. In response to determining that the text associated with the first input matches the predetermined text, the HMD can activate the second UI and set the UI state to the voice-home state.
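  • The home-state branching of method 1650 forms a small state machine keyed on the current UI state and the input type. A table-driven sketch (the state and input names follow the text; the dispatch structure and the “ok glass” hotword comparison are assumptions):

      # (current state, input type) -> new UI state, per the branches above.
      TRANSITIONS = {
          ("home", "choose-next"): "timeline-next",
          ("home", "choose-previous"): "timeline-previous",
          ("home", "tap"): "voice-home",      # tap also activates the voice-based UI
      }

      def next_state(state, input_type, speech_text=None, hotword="ok glass"):
          """Return the new UI state; speech enters voice-home only when the
          utterance matches the predetermined text (assumed to be 'ok glass')."""
          if state == "home" and input_type == "speech":
              return "voice-home" if speech_text == hotword else "home"
          return TRANSITIONS.get((state, input_type), state)

      assert next_state("home", "choose-next") == "timeline-next"
      assert next_state("home", "speech", "ok glass") == "voice-home"
      assert next_state("home", "speech", "hello") == "home"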
  • Method 1650 can additionally include: in response to the first input being a sleep-type of input, deactivating at least a portion of the HMD and setting the UI state of the HMD to a deactivated state.
  • Method 1650 can additionally include: receiving a second input using the first UI while in the timeline-previous state.
  • The second input can be associated with a second type of input.
  • In response to the second type of input being the tap type of input: (i) one or more operations can be selected based on the previous card, (ii) a menu of operation cards can be generated based on the selected one or more operations, where each operation card in the menu of operation cards can correspond to an operation of the one or more operations, and (iii) at least one operation card of the menu of operation cards can be displayed.
  • Displaying the at least one operation card of the menu of operation cards can include displaying text associated with the operation that overlays the display of the previous card.
  • The one or more operations can include an operation associated with a grace period of time.
  • Method 1650 can additionally include: while displaying at least one operation card of the menu of operation cards, receiving an operation input using the first UI, where the operation input has an operation type.
  • In response to the operation input, an operation associated with the displayed at least one operation card can be determined.
  • A determination can be made whether a grace period of time is associated with the associated operation. If the grace period of time is associated with the associated operation, a card can be displayed that is configured to graphically indicate the grace period of time, where the displaying takes at least the grace period of time. After displaying the card configured to graphically indicate the grace period of time, the HMD can perform the associated operation.
  • Method 1650 can additionally include: while in the timeline-next state, receiving a second input using the first UI.
  • The second input can be associated with a second type of input.
  • In response to the second type of input being the tap type of input: (a) one or more operations can be selected based on the next card, (b) a menu of operations can be generated based on the selected one or more operations, and (c) at least one menu operation of the menu of operations can be displayed.
  • Displaying the at least one menu operation of the menu of operations can include displaying text associated with the at least one menu operation that overlays the display of the next card.
  • Method 1650 can additionally include: in response to the UI state of the HMD being in the voice-home state, generating a menu card to display a menu of operations for using the voice-based UI.
  • The HMD can display the menu card.
  • A head-related input related to a head movement associated with the HMD can be received.
  • The menu card can be modified based on the head-related input.
  • The modified menu card can be displayed using the HMD.
  • FIG. 17 is a flow chart illustrating a method 1700 , according to an example embodiment.
  • Method 1700 is described by way of example as being carried out by a computing device, such as a wearable computer and possibly a wearable computer that includes an HMD, but other techniques and/or devices can be used to carry out method 1700, such as discussed above in the context of method 1600.
  • Method 1700 can begin at block 1710 .
  • At block 1710, a computing device can display at least a portion of a first linear arrangement of cards.
  • The first linear arrangement can include an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type. Each first card corresponds to a group of cards. Aspects of the first linear arrangement are discussed above in the context of at least FIGS. 5C, 6D, 7, 8, 10A, 10B, 14A, and 14B.
  • The first linear arrangement can include a timeline, and each card of the first linear arrangement can be associated with a specific time, such as discussed above in the context of at least FIGS. 5A-15.
  • Each card of the ordered plurality of cards can include a relationship-related parameter, such as a type as discussed above in the context of at least FIGS. 5C, 6C, 6D, 10A, and 10B, or other kind(s) of relationship-related parameter(s).
  • Each card in the group of cards can be related to a same relationship-related parameter, such as discussed above in the context of at least FIGS. 5C, 6C, 6D, 10A, and 10B.
  • The computing device can display a selection region that is moveable with respect to the first linear arrangement, where a given card is selected when the selection region is aligned with the given card. Alignment of the selection region and the given card is discussed above in more detail in the context of FIG. 5C. Additional aspects of the selection region are discussed above at least in the context of FIGS. 5C and 6A-6D.
  • The HMD can be configured to detect head movements.
  • Displaying the selection region that is moveable with respect to the first linear arrangement can include moving the selection region with respect to the first linear arrangement based on the head movements, such as discussed above at least in the context of FIGS. 5C and 6D.
  • In response to selection of a given first card by the selection region, the computing device can display at least a portion of a second linear arrangement of cards, where the second linear arrangement can include an ordered plurality of the group of cards that corresponds to the given first card. Aspects of the given first card are discussed above at least in the context of FIGS. 5C, 6D, 7, 8, 10A, 10B, 14A, and 14B. In some embodiments, the second linear arrangement can also include the given first card.
  • In response to selection of a given second card by the selection region, the computing device can display at least a portion of a third linear arrangement of cards, where the third linear arrangement includes one or more third cards of a third card-type, and where each third card is selectable to perform an action based on the given second card.
  • The selected second card can be related to a first relationship-related parameter, and displaying at least the portion of the third linear arrangement can include determining the one or more third cards based on the first relationship-related parameter, such as discussed above in the context of at least FIGS. 5C and 6D.
  • The second linear arrangement can include the bundle card.
  • Method 1700 can further include: initially displaying a single card from the first linear arrangement using a single-card view; while displaying the single card, receiving a first input via the touchpad; and, in response to the first input, switching to a multi-timeline view and displaying, in the multi-timeline view, the at least a portion of the first linear arrangement of cards, where the at least the portion of the first linear arrangement of cards comprises the single card.
  • Method 1700 can further include: after displaying the second linear arrangement of cards, selecting a card other than the selected first card, and ceasing display of the second linear arrangement.
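  • The three linear arrangements of method 1700 can be sketched as nested lists plus a selection handler: selecting a first-type (bundle) card exposes its group as the second arrangement, and selecting a second-type (actionable) card exposes action cards as the third arrangement. All names below are hypothetical:

      class Card:
          def __init__(self, kind, label, group=None, actions=None):
              self.kind = kind              # "bundle" (first card-type) or "actionable"
              self.label = label
              self.group = group or []      # cards behind a bundle card
              self.actions = actions or []  # action cards for an actionable card

      def select(card):
          """Return the linear arrangement to display after selecting `card`."""
          if card.kind == "bundle":
              return card.group             # second linear arrangement
          return card.actions               # third linear arrangement

      photo = Card("actionable", "photo", actions=[Card("action", "Share"),
                                                   Card("action", "Delete")])
      bundle = Card("bundle", "photo bundle", group=[photo])
      assert [c.label for c in select(bundle)] == ["photo"]
      assert [c.label for c in select(photo)] == ["Share", "Delete"]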
  • Each block and/or communication can represent a processing of information and/or a transmission of information in accordance with example embodiments.
  • Alternative embodiments are included within the scope of these example embodiments.
  • Functions described as blocks, transmissions, communications, requests, responses, and/or messages can be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved.
  • More or fewer blocks and/or functions can be used with any of the ladder diagrams, scenarios, and flow charts discussed herein, and these ladder diagrams, scenarios, and flow charts can be combined with one another, in part or in whole.
  • A block that represents a processing of information can correspond to circuitry that can be configured to perform the specific logical functions of a herein-described method or technique.
  • A block that represents a processing of information can correspond to a module, a segment, or a portion of program code (including related data).
  • The program code can include one or more instructions executable by a processor for implementing specific logical functions or actions in the method or technique.
  • The program code and/or related data can be stored on any type of computer readable medium, such as a storage device including a disk or hard drive or other storage medium.
  • The computer readable medium can also include non-transitory computer readable media such as computer-readable media that stores data for short periods of time like register memory, processor cache, and random access memory (RAM).
  • The computer readable media can also include non-transitory computer readable media that stores program code and/or data for longer periods of time, such as secondary or persistent long-term storage, like read-only memory (ROM), optical or magnetic disks, or compact-disc read-only memory (CD-ROM), for example.
  • The computer readable media can also be any other volatile or non-volatile storage systems.
  • A computer readable medium can be considered a computer readable storage medium, for example, or a tangible storage device.
  • A block that represents one or more information transmissions can correspond to information transmissions between software and/or hardware modules in the same physical device. However, other information transmissions can be between software modules and/or hardware modules in different physical devices.

Abstract

Methods, apparatus, and computer-readable media are described herein related to a user interface (UI) for a head-mountable device (HMD). A computing device, such as an HMD, can display at least a portion of a first linear arrangement of cards. The first linear arrangement can include an ordered plurality of cards that can include an actionable card and a bundle card that can correspond to a group of cards. A moveable selection region can be displayed. A given card can be selected by aligning the selection region with the given card. After selection of a bundle card, the computing device can display a second linear arrangement of cards that includes a portion of the corresponding group of cards. After selection of an actionable card, the computing device can display a third linear arrangement of cards that includes action card(s) selectable to perform action(s) based on the actionable card.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to U.S. Patent App. No. 61/710,543, entitled “User Interfaces for Head-Mountable Devices”, filed on Oct. 5, 2012, the contents of which are fully incorporated by reference herein for all purposes.
  • BACKGROUND
  • Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
  • Computing systems such as personal computers, laptop computers, tablet computers, cellular phones, and countless types of Internet-capable devices are prevalent in numerous aspects of modern life. Over time, the manner in which these devices are providing information to users is becoming more intelligent, more efficient, more intuitive, and/or less obtrusive.
  • The trend toward miniaturization of computing hardware, peripherals, as well as of sensors, detectors, and image and audio processors, among other technologies, has helped open up a field sometimes referred to as “wearable computing.” In the area of image and visual processing and production, in particular, it has become possible to consider wearable displays that place a very small image display element close enough to a wearer's (or user's) eye(s) such that the displayed image fills or nearly fills the field of view, and appears as a normal sized image, such as might be displayed on a traditional image display device. The relevant technology can be referred to as “near-eye displays.”
  • Near-eye displays are fundamental components of wearable displays, also sometimes called “head-mounted displays” (HMDs). A head-mounted display places a graphic display or displays close to one or both eyes of a wearer. To generate the images on a display, a computer processing system can be used. Such displays can occupy part or all of a wearer's field of view. Further, head-mounted displays can be as small as a pair of glasses or as large as a helmet.
  • SUMMARY
  • In one aspect, a method is provided. At a head-mountable device (HMD), displaying a home card of an ordered plurality of cards. While displaying the home card, receiving a first input at the HMD. The first input is associated with a first input type. The first input type includes a choose-next input type and a choose-previous input type. In response to the first input type being the choose-next input type: a next card of the ordered plurality of cards is obtained, the next card being subsequent to the home card in the ordered plurality of cards, and the HMD displays the next card. In response to the first input type being the choose-previous input type: a previous card of the ordered plurality of cards is obtained, where the previous card is prior to the home card in the ordered plurality of cards, and the HMD displays the previous card.
  • In another aspect, a method is provided. At an HMD, a home card is displayed. The HMD includes a user-interface (UI) state. The UI state is in a home UI state. While in the home UI state, a first UI of the HMD receives a first input. The first input is associated with a first type of input. In response to the first type of input being a choose-next type of input: the HMD displaying a next card of an ordered plurality of cards, where the ordered plurality of cards additionally includes the home card, where the next card differs from the home card, and setting the UI state of the HMD to a timeline-next state. In response to the first type of input being a choose-previous type of input, the HMD displaying a previous card of the ordered plurality of cards, where the choose-previous type of input differs from the choose-next type of input, and where the previous card differs from both the next card and the home card, and setting the UI state of the HMD to a timeline-previous state. In response to the first type of input being a tap type of input: activating a second UI of the HMD, where the first UI of the HMD is a touch-based UI and where the second UI is a voice-based UI, and setting the UI state of the HMD to a voice-home state. In response to the first type of input being a speech-type of input: determining whether text associated with the first input matches a predetermined text, and, in response to determining that the text associated with the first input matches the predetermined text, activating the second UI and setting the UI state of the HMD to the voice-home state.
  • In another aspect, a computing device is provided. The computing device includes a processor and a non-transitory computer-readable medium that is configured to store program instructions that, when executed by the processor, cause the computing device to carry out functions. The functions include: displaying at least a portion of a first linear arrangement of cards, where the first linear arrangement includes an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type, and where each first card corresponds to a group of cards; displaying a selection region that is moveable with respect to the first linear arrangement, where a given card is selected when the selection region is aligned with the given card; in response to selection of a given first card by the selection region, displaying at least a portion of a second linear arrangement of cards, where the second linear arrangement includes an ordered plurality of the group of cards that correspond to the given first card; and in response to selection of a given second card by the selection region, displaying at least a portion of a third linear arrangement of cards, where the third linear arrangement includes one or more third cards of a third card-type, where each third card is selectable to perform an action based on the given second card.
  • In another aspect, a non-transitory computer readable medium is provided. The non-transitory computer-readable medium is configured to store program instructions that, when executed by a processor of a computing device, cause the computing device to carry out functions. The functions include: displaying at least a portion of a first linear arrangement of cards, where the first linear arrangement includes an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type, and where each first card corresponds to a group of cards; displaying a selection region that is moveable with respect to the first linear arrangement, where a given card is selected when the selection region is aligned with the given card; in response to selection of a given first card by the selection region, displaying at least a portion of a second linear arrangement of cards, where the second linear arrangement includes an ordered plurality of the group of cards that correspond to the given first card; and in response to selection of a given second card by the selection region, displaying at least a portion of a third linear arrangement of cards, where the third linear arrangement includes one or more third cards of a third card-type, where each third card is selectable to perform an action based on the given second card.
  • In another aspect, a method is provided. A computing device displays at least a portion of a first linear arrangement of cards. The first linear arrangement includes an ordered plurality of cards. The ordered plurality of cards includes one or more first cards of a first card-type and one or more second cards of a second card-type. Each first card corresponds to a group of cards. The computing device displays a selection region that is moveable with respect to the first linear arrangement, where a given card is selected when the selection region is aligned with the given card. In response to selection of a given first card by the selection region, the computing device displays at least a portion of a second linear arrangement of cards, where the second linear arrangement includes an ordered plurality of the group of cards corresponding to the given first card. In response to selection of a given second card by the selection region, the computing device displays at least a portion of a third linear arrangement of cards, where the third linear arrangement includes one or more third cards of a third card-type, where each third card is selectable to perform an action based on the given second card.
  • In another aspect, a device is provided. The device includes: means for displaying at least a portion of a first linear arrangement of cards, where the first linear arrangement includes an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type, and where each first card corresponds to a group of cards; means for displaying a selection region that is moveable with respect to the first linear arrangement, where a given card is selected when the selection region is aligned with the given card; means for, in response to selection of a given first card by the selection region, displaying at least a portion of a second linear arrangement of cards, where the second linear arrangement includes an ordered plurality of the group of cards that correspond to the given first card; and means for, in response to selection of a given second card by the selection region, displaying at least a portion of a third linear arrangement of cards, where the third linear arrangement includes one or more third cards of a third card-type, where each third card is selectable to perform an action based on the given second card.
  • These as well as other aspects, advantages, and alternatives will become apparent to those of ordinary skill in the art by reading the following detailed description, with reference where appropriate to the accompanying drawings. Further, it should be understood that this summary and other descriptions and figures provided herein are intended to describe illustrative embodiments by way of example only and, as such, that numerous variations are possible. For instance, structural elements and process steps can be rearranged, combined, distributed, eliminated, or otherwise changed, while remaining within the scope of the embodiments as claimed.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A illustrates a wearable computing system according to an example embodiment.
  • FIG. 1B illustrates an alternate view of the wearable computing device illustrated in FIG. 1A.
  • FIG. 1C illustrates another wearable computing system according to an example embodiment.
  • FIG. 1D illustrates another wearable computing system according to an example embodiment.
  • FIGS. 1E to 1G are simplified illustrations of the wearable computing system shown in FIG. 1D, being worn by a wearer.
  • FIG. 2A illustrates a schematic drawing of a computing device according to an example embodiment.
  • FIG. 2B shows an example projection of an image by an example head-mountable device (HMD), according to an example embodiment.
  • FIG. 3 shows an example home card of an example user interface for an HMD, according to an example embodiment.
  • FIG. 4 shows example operations of a multi-tiered user model for a user interface for a head-mountable device (HMD), according to an example embodiment.
  • FIG. 5A shows a scenario of example timeline interactions, according to an example embodiment.
  • FIG. 5B shows a scenario of example timeline interactions including splicing a new card into a timeline, according to an example embodiment.
  • FIG. 5C shows a scenario for using a multi-timeline display, according to an example embodiment.
  • FIG. 6A shows an example of using a two-fingered swipe on a touch-based UI of an HMD for zoomed scrolling, according to an example embodiment.
  • FIG. 6B shows a scenario for using a clutch operation to generate a multi-card display, according to an example embodiment.
  • FIG. 6C shows a scenario for using a clutch operation to generate a multi-timeline display, according to an example embodiment.
  • FIG. 6D shows a scenario for using head movements to navigate a multi-timeline display, according to an example embodiment.
  • FIG. 7 shows a user-interface scenario including contextual menus, according to an example embodiment.
  • FIG. 8 shows a user-interface scenario including a people chooser, according to an example embodiment.
  • FIG. 9 shows a user-interface scenario with camera interactions, according to an example embodiment.
  • FIG. 10A shows a user-interface scenario with photo bundles, according to an example embodiment.
  • FIG. 10B shows a user-interface scenario with message bundles, according to an example embodiment.
  • FIG. 11 shows a user-interface scenario with a timeline having settings cards, according to an example embodiment.
  • FIG. 12 shows a user-interface scenario related to WiFi settings, according to an example embodiment.
  • FIG. 13 shows a user-interface scenario related to Bluetooth settings, according to an example embodiment.
  • FIG. 14A shows an example visual stack, according to an example embodiment.
  • FIG. 14B shows another example visual stack, according to an example embodiment.
  • FIG. 15 shows a user-interface scenario related to voice interactions, according to an example embodiment.
  • FIG. 16A is a flow chart illustrating a method, according to an example embodiment.
  • FIG. 16B is a flow chart illustrating another method, according to an example embodiment.
  • FIG. 17 is a flow chart illustrating another method, according to an example embodiment.
  • DETAILED DESCRIPTION
  • Example methods and systems are described herein. It should be understood that the words “example” and “exemplary” are used herein to mean “serving as an example, instance, or illustration.” Any embodiment or feature described herein as being an “example” or “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or features. In the following detailed description, reference is made to the accompanying figures, which form a part hereof. In the figures, similar symbols typically identify similar components, unless context dictates otherwise. Other embodiments can be utilized, and other changes can be made, without departing from the spirit or scope of the subject matter presented herein.
  • The example embodiments described herein are not meant to be limiting. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations, all of which are explicitly contemplated herein.
  • A. OVERVIEW
  • In an example embodiment, a UI for a computing device can include a timeline feature that allows the wearer to navigate through a sequence of ordered screens. In the context of such a timeline feature, each screen can be referred to as a “card.” Among the sequence of cards, one or more cards can be displayed, and of the displayed card(s), one card can be “focused on” for possible selection. For example, the timeline can present one card for display at a time, and the card being displayed is also the card being focused on. In one embodiment, when a card is selected, the card can be displayed using a single-card view that occupies substantially all of the viewing area of the display. In some embodiments, the computing device utilizing the herein-disclosed UI can be configured as an HMD, wearable computer, tablet computer, laptop computer, desktop computer, mobile telephone, and/or other computing device. In particular embodiments, computing device 210 and/or remote device 230 discussed below in the context of FIG. 2A can be configured to utilize the herein-disclosed UI.
  • Each card can be associated with a certain application, object, or operation. The cards can be ordered by a time associated with the card, application, object, or operation represented by the card. For example, if a card shows a photo captured by a wearer of the HMD at 2:57 PM, the time associated with the card is the time associated with the underlying photo object, or 2:57 PM. As another example, a card representing a weather application can continuously update temperature, forecast, wind, and other weather-related information, and as such, the time associated with the weather application can be the current time. As an additional example, a card representing a calendar application can show a next appointment 2 hours from now, and so the time associated with the card can be a time corresponding to the displayed next appointment, or 2 hours in the future.
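  • By way of a non-limiting sketch, this time-based ordering can be modeled by giving each card an associated timestamp and keeping the timeline sorted on that timestamp. The Card and Timeline types below, and the use of java.time, are illustrative assumptions rather than a definitive implementation:

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

/** Illustrative card: one screen with an associated time (assumed model). */
record Card(String description, Instant associatedTime) {}

class Timeline {
    private final List<Card> cards = new ArrayList<>();

    /** Insert a card and keep the timeline ordered by each card's associated time. */
    void add(Card card) {
        cards.add(card);
        cards.sort(Comparator.comparing(Card::associatedTime));
    }

    List<Card> cards() { return cards; }

    public static void main(String[] args) {
        Instant now = Instant.now();
        Timeline timeline = new Timeline();
        // The three examples from the text: a past photo, a live weather card, a future appointment.
        timeline.add(new Card("photo", now.minus(2, ChronoUnit.HOURS)));
        timeline.add(new Card("weather", now));
        timeline.add(new Card("lunch appointment", now.plus(2, ChronoUnit.HOURS)));
        timeline.cards().forEach(c -> System.out.println(c.description() + " @ " + c.associatedTime()));
    }
}
```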
  • The timeline feature can allow the wearer to navigate through the cards according to their associated times. For example, a wearer could move their head to the left to navigate to cards with times prior to a time associated with the focused-on card, and to the right to navigate to cards with times after the time associated with the focused-on card. As another example, the wearer can use a touch pad or similar device as part of a touch-based UI to make a swiping motion in one direction on the touch-based UI to navigate to cards with times prior to the time associated with the focused-on card, and make a swiping motion in another direction to navigate to cards with times after the time associated with the focused-on card.
  • Upon power up, the HMD can display a “home card”, also referred to as a home screen. The home card can be associated with a time of “now” or a current time. In some cases, the home card can display a clock to reinforce the association between the home card and now. Then, cards associated with times before now can be viewed in the timeline as prior to the home card, and cards associated with times equal to or after now can be viewed in the timeline subsequent to the home card.
  • After viewing cards on the timeline, the wearer can choose to interact with some cards. To select the focused-on card for interaction, the wearer can tap on the touch-based UI, also referred to as performing a “tap operation”. In some cases, a “contextual menu” can be used to interact with the selected card. For example, if the selected focused-on card shows a photo or an image captured by the HMD, the contextual menu can provide one or more options or operations for interacting with the selected photo, such as sharing the image with one or more people, or deleting the photo.
  • Different contextual menus can be used for different objects. For example, a contextual menu for a contact, or representation of information about a person, can have options or operations such as calling the contact, sending a message to the contact, deleting the contact, or reviewing/updating contact details such as telephone numbers, e-mail addresses, display names, etc.
  • Lists of some objects can be arranged in an order other than the time-based order used by the timeline. For example, a list of contacts can be arranged by frequency of contact; e.g., a contact for the person most-communicated-with using the HMD can be displayed first in the list of contacts, the second-most-communicated-with contact can be displayed second in the list, and so on. Other orderings are possible as well.
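  • As a non-limiting sketch of this frequency-based ordering, contacts can simply be sorted on a per-contact communication count kept by the HMD; the Contact type and its fields are assumptions made for illustration only:

```java
import java.util.Comparator;
import java.util.List;

/** Illustrative contact with an assumed count of communications made via the HMD. */
record Contact(String name, int timesCommunicatedWith) {}

class ContactOrdering {
    /** Most-communicated-with contact first; ties broken alphabetically. */
    static List<Contact> orderByFrequency(List<Contact> contacts) {
        return contacts.stream()
                .sorted(Comparator.comparingInt(Contact::timesCommunicatedWith).reversed()
                        .thenComparing(Contact::name))
                .toList();
    }
}
```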
  • Groups of cards that share a relationship can be collected into a “bundle”, “stack”, or “deck” of cards. The terms bundle of cards, stack of cards, and deck of cards are used interchangeably herein. A bundle of cards can include any cards that can be considered to be related for a certain purpose; e.g., related based on one criterion or a combination of criteria. For example, a collection of photos captured within a certain span of time can be represented as a photo bundle. As another example, a collection of messages (e.g., an instant messaging session, SMS/text-message exchange, or e-mail chain) can be represented as a message bundle. A bundle card can be constructed for display on the timeline that represents the bundle and, in some cases, summarizes the bundle; e.g., shows thumbnail photos of photos in a photo bundle. In some cases, data related to the card can be used to track relationship(s) used to create bundles, e.g., a location associated with a card, an indication that the card is a photo, message, or other kind of card, a name of an application that created the card, etc.
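  • As a sketch of how a time-span bundling criterion might be computed, consecutive photos can be grouped greedily whenever the gap between capture times stays under a maximum. The Photo type and the ten-minute gap below are assumptions chosen purely for illustration:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

/** Illustrative photo card with its capture time. */
record Photo(String name, Instant capturedAt) {}

class PhotoBundler {
    /** Assumed criterion: a photo within this gap of the previous photo joins the same bundle. */
    static final Duration MAX_GAP = Duration.ofMinutes(10);

    /** Expects photos sorted by capture time; returns bundles of related photos. */
    static List<List<Photo>> bundle(List<Photo> photosByTime) {
        List<List<Photo>> bundles = new ArrayList<>();
        List<Photo> current = new ArrayList<>();
        for (Photo p : photosByTime) {
            if (!current.isEmpty()) {
                Photo last = current.get(current.size() - 1);
                if (Duration.between(last.capturedAt(), p.capturedAt()).compareTo(MAX_GAP) > 0) {
                    bundles.add(current);      // gap too large: close the current bundle
                    current = new ArrayList<>();
                }
            }
            current.add(p);
        }
        if (!current.isEmpty()) {
            bundles.add(current);
        }
        return bundles;
    }
}
```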
  • In some embodiments, cards can be classified according to activities taken upon selection. For example, upon selection of a bundle card, the bundle card can be replaced by one or more of the cards the bundle card represents. An “actionable” card can be a non-bundle card for which the HMD can perform one or more related actions. In some example scenarios, a photo related to an actionable card can be shared, deleted, named, or stored by the HMD. In some other example scenarios, a message represented by an actionable card can be accepted, rejected, or transferred by the HMD. The user interface can generate and/or use “action” cards to represent actions that can be performed by the HMD related to the actionable card.
  • The HMD can also use a speech or voice-based UI that can include one or more microphones to capture audible input, such as speech from the wearer. The HMD can use speakers or a BCT to present audible output to the wearer. Upon receiving audible input, the HMD can attempt to recognize the input as a speech command and process the command accordingly; for example, by converting the audible input to text and operating on the text. The speech input can represent commands to the HMD, such as commands to search, navigate, take photos, record videos, send messages, make telephone calls, etc.
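  • A minimal sketch of such command handling, assuming the recognizer has already converted the audible input to text; the command strings and the placeholder actions are illustrative only:

```java
import java.util.Locale;
import java.util.Map;

/** Illustrative dispatch from recognized speech text to a UI action. */
class VoiceCommandDispatcher {
    private final Map<String, Runnable> commands = Map.of(
            "take a photo", () -> System.out.println("capturing photo"),
            "record a video", () -> System.out.println("recording video"),
            "send a message", () -> System.out.println("composing message"));

    /** Returns true if the recognized text matched a known command. */
    boolean dispatch(String recognizedText) {
        Runnable action = commands.get(recognizedText.toLowerCase(Locale.ROOT).trim());
        if (action == null) {
            return false; // unrecognized input; could instead fall back to a search query
        }
        action.run();
        return true;
    }
}
```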
  • By organizing objects, applications, and operations into cards, the UI can provide a relatively simple interface to a large collection of possible data sources. Further, by enabling operation on a collection of cards arranged in a natural fashion—according to time in one example—the wearer can readily locate and then utilize cards stored by the HMD.
  • B. EXAMPLE WEARABLE COMPUTING DEVICES
  • Systems and devices in which example embodiments can be implemented will now be described in greater detail. In general, an example system can be implemented in or can take the form of a wearable computer (also referred to as a wearable computing device). In an example embodiment, a wearable computer takes the form of or includes a head-mountable device (HMD).
  • An example system can also be implemented in or take the form of other devices, such as a mobile phone, among other possibilities. Further, an example system can take the form of non-transitory computer readable medium, which has program instructions stored thereon that are executable by a processor to provide the functionality described herein. An example system can also take the form of a device such as a wearable computer or mobile phone, or a subsystem of such a device, which includes such a non-transitory computer readable medium having such program instructions stored thereon.
  • An HMD can generally be any display device that is capable of being worn on the head and places a display in front of one or both eyes of the wearer. An HMD can take various forms such as a helmet or eyeglasses. As such, references to “eyeglasses” or a “glasses-style” HMD should be understood to refer to an HMD that has a glasses-like frame so that it can be worn on the head. Further, example embodiments can be implemented by or in association with an HMD with a single display or with two displays, which can be referred to as a “monocular” HMD or a “binocular” HMD, respectively.
  • FIG. 1A illustrates a wearable computing system according to an example embodiment. In FIG. 1A, the wearable computing system takes the form of a head-mountable device (HMD) 102 (which can also be referred to as a head-mounted display). It should be understood, however, that example systems and devices can take the form of or be implemented within or in association with other types of devices, without departing from the scope of the invention. As illustrated in FIG. 1A, the HMD 102 includes frame elements including lens-frames 104, 106 and a center frame support 108, lens elements 110, 112, and extending side-arms 114, 116. The center frame support 108 and the extending side-arms 114, 116 are configured to secure the HMD 102 to a user's face via a user's nose and ears, respectively.
  • Each of the frame elements 104, 106, and 108 and the extending side-arms 114, 116 can be formed of a solid structure of plastic and/or metal, or can be formed of a hollow structure of similar material so as to allow wiring and component interconnects to be internally routed through the HMD 102. Other materials can be possible as well.
  • One or more of each of the lens elements 110, 112 can be formed of any material that can suitably display a projected image or graphic. Each of the lens elements 110, 112 can also be sufficiently transparent to allow a user to see through the lens element. Combining these two features of the lens elements can facilitate an augmented reality or heads-up display where the projected image or graphic is superimposed over a real-world view as perceived by the user through the lens elements.
  • The extending side-arms 114, 116 can each be projections that extend away from the lens-frames 104, 106, respectively, and can be positioned behind a user's ears to secure the HMD 102 to the user. The extending side-arms 114, 116 can further secure the HMD 102 to the user by extending around a rear portion of the user's head. Additionally or alternatively, for example, the HMD 102 can connect to or be affixed within a head-mounted helmet structure. Other configurations for an HMD are also possible.
  • The HMD 102 can also include an on-board computing system 118, an image capture device 120, a sensor 122, and a finger-operable touch pad 124. The on-board computing system 118 is shown to be positioned on the extending side-arm 114 of the HMD 102; however, the on-board computing system 118 can be provided on other parts of the HMD 102 or can be remotely positioned from the HMD 102 (e.g., the on-board computing system 118 could be wire- or wirelessly-connected to the HMD 102). The on-board computing system 118 can include a processor and memory, for example. The on-board computing system 118 can be configured to receive and analyze data from the image capture device 120 and the finger-operable touch pad 124 (and possibly from other sensory devices, user interfaces, or both) and generate images for output by the lens elements 110 and 112.
  • The image capture device 120 can be, for example, a camera that is configured to capture still images and/or to capture video. In the illustrated configuration, image capture device 120 is positioned on the extending side-arm 114 of the HMD 102; however, the image capture device 120 can be provided on other parts of the HMD 102. The image capture device 120 can be configured to capture images at various resolutions or at different frame rates. Many image capture devices with a small form-factor, such as the cameras used in mobile phones or webcams, for example, can be incorporated into an example of the HMD 102.
  • Further, although FIG. 1A illustrates one image capture device 120, more image capture devices can be used, and each can be configured to capture the same view, or to capture different views. For example, the image capture device 120 can be forward facing to capture at least a portion of the real-world view perceived by the user. This forward facing image captured by the image capture device 120 can then be used to generate an augmented reality where computer generated images appear to interact with or overlay the real-world view perceived by the user.
  • The sensor 122 is shown on the extending side-arm 116 of the HMD 102; however, the sensor 122 can be positioned on other parts of the HMD 102. For illustrative purposes, only one sensor 122 is shown. However, in an example embodiment, the HMD 102 can include multiple sensors. For example, an HMD 102 can include sensors such as one or more gyroscopes, one or more accelerometers, one or more magnetometers, one or more light sensors, one or more infrared sensors, and/or one or more microphones. Other sensing devices can be included in addition or in the alternative to the sensors that are specifically identified herein.
  • The finger-operable touch pad 124 is shown on the extending side-arm 114 of the HMD 102. However, the finger-operable touch pad 124 can be positioned on other parts of the HMD 102. Also, more than one finger-operable touch pad can be present on the HMD 102. The finger-operable touch pad 124 can be used by a user to input commands. The finger-operable touch pad 124 can sense at least one of a pressure, position and/or a movement of one or more fingers via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities. The finger-operable touch pad 124 can be capable of sensing movement of one or more fingers simultaneously, in addition to sensing movement in a direction parallel or planar to the pad surface, in a direction normal to the pad surface, or both, and can also be capable of sensing a level of pressure applied to the touch pad surface. In some embodiments, the finger-operable touch pad 124 can be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers. Edges of the finger-operable touch pad 124 can be formed to have a raised, indented, or roughened surface, so as to provide tactile feedback to a user when the user's finger reaches the edge, or other area, of the finger-operable touch pad 124. If more than one finger-operable touch pad is present, each finger-operable touch pad can be operated independently, and can provide a different function.
  • In a further aspect, HMD 102 can be configured to receive user input in various ways, in addition or in the alternative to user input received via finger-operable touch pad 124. For example, on-board computing system 118 can implement a speech-to-text process and utilize a syntax that maps certain spoken commands to certain actions. In addition, HMD 102 can include one or more microphones via which a wearer's speech can be captured. Configured as such, HMD 102 can be operable to detect spoken commands and carry out various computing functions that correspond to the spoken commands.
  • As another example, HMD 102 can interpret certain head-movements as user input. For example, when HMD 102 is worn, HMD 102 can use one or more gyroscopes and/or one or more accelerometers to detect head movement. The HMD 102 can then interpret certain head-movements as being user input, such as nodding, or looking up, down, left, or right. An HMD 102 could also pan or scroll through graphics in a display according to movement. Other types of actions can also be mapped to head movement.
  • As yet another example, HMD 102 can interpret certain gestures (e.g., by a wearer's hand or hands) as user input. For example, HMD 102 can capture hand movements by analyzing image data from image capture device 120, and initiate actions that are defined as corresponding to certain hand movements.
  • As a further example, HMD 102 can interpret eye movement as user input. In particular, HMD 102 can include one or more inward-facing image capture devices and/or one or more other inward-facing sensors (not shown) that can be used to track eye movements and/or determine the direction of a wearer's gaze. As such, certain eye movements can be mapped to certain actions. For example, certain actions can be defined as corresponding to movement of the eye in a certain direction, a blink, and/or a wink, among other possibilities.
  • HMD 102 also includes a speaker 125 for generating audio output. In one example, the speaker could be in the form of a bone conduction speaker, also referred to as a bone conduction transducer (BCT). Speaker 125 can be, for example, a vibration transducer or an electroacoustic transducer that produces sound in response to an electrical audio signal input. The frame of HMD 102 can be designed such that when a user wears HMD 102, the speaker 125 contacts the wearer. Alternatively, speaker 125 can be embedded within the frame of HMD 102 and positioned such that, when the HMD 102 is worn, speaker 125 vibrates a portion of the frame that contacts the wearer. In either case, HMD 102 can be configured to send an audio signal to speaker 125, so that vibration of the speaker can be directly or indirectly transferred to the bone structure of the wearer. When the vibrations travel through the bone structure to the bones in the middle ear of the wearer, the wearer can interpret the vibrations provided by BCT 125 as sounds.
  • Various types of bone-conduction transducers (BCTs) can be implemented, depending upon the particular implementation. Generally, any component that is arranged to vibrate the HMD 102 can be incorporated as a vibration transducer. Yet further it should be understood that an HMD 102 can include a single speaker 125 or multiple speakers. In addition, the location(s) of speaker(s) on the HMD can vary, depending upon the implementation. For example, a speaker can be located proximate to a wearer's temple (as shown), behind the wearer's ear, proximate to the wearer's nose, and/or at any other location where the speaker 125 can vibrate the wearer's bone structure.
  • FIG. 1B illustrates an alternate view of the wearable computing device illustrated in FIG. 1A. As shown in FIG. 1B, the lens elements 110, 112 can act as display elements. The HMD 102 can include a first projector 128 coupled to an inside surface of the extending side-arm 116 and configured to project a display 130 onto an inside surface of the lens element 112. Additionally or alternatively, a second projector 132 can be coupled to an inside surface of the extending side-arm 114 and configured to project a display 134 onto an inside surface of the lens element 110.
  • The lens elements 110, 112 can act as a combiner in a light projection system and can include a coating that reflects the light projected onto them from the projectors 128, 132. In some embodiments, a reflective coating may not be used (e.g., when the projectors 128, 132 are scanning laser devices).
  • In alternative embodiments, other types of display elements can also be used. For example, the lens elements 110, 112 themselves can include: a transparent or semi-transparent matrix display, such as an electroluminescent display or a liquid crystal display, one or more waveguides for delivering an image to the user's eyes, or other optical elements capable of delivering an in-focus near-to-eye image to the user. A corresponding display driver can be disposed within the frame elements 104, 106 for driving such a matrix display. Alternatively or additionally, a laser or LED source and scanning system could be used to draw a raster display directly onto the retina of one or more of the user's eyes. Other possibilities exist as well.
  • FIG. 1C illustrates another wearable computing system according to an example embodiment, which takes the form of an HMD 152. The HMD 152 can include frame elements and side-arms such as those described with respect to FIGS. 1A and 1B. The HMD 152 can additionally include an on-board computing system 154 and an image capture device 156, such as those described with respect to FIGS. 1A and 1B. The image capture device 156 is shown mounted on a frame of the HMD 152. However, the image capture device 156 can be mounted at other positions as well.
  • As shown in FIG. 1C, the HMD 152 can include a single display 158 which can be coupled to the device. The display 158 can be formed on one of the lens elements of the HMD 152, such as a lens element described with respect to FIGS. 1A and 1B, and can be configured to overlay computer-generated graphics in the user's view of the physical world. The display 158 is shown to be provided in a center of a lens of the HMD 152; however, the display 158 can be provided in other positions, for example, towards either the upper or lower portions of the wearer's field of view. The display 158 is controllable via the computing system 154 that is coupled to the display 158 via an optical waveguide 160.
  • FIG. 1D illustrates another wearable computing system according to an example embodiment, which takes the form of a monocular HMD 172. The HMD 172 can include side-arms 173, a center frame support 174, and a bridge portion with nosepiece 175. In the example shown in FIG. 1D, the center frame support 174 connects the side-arms 173. The HMD 172 does not include lens-frames containing lens elements. The HMD 172 can additionally include a component housing 176, which can include an on-board computing system (not shown), an image capture device 178, and a button 179 for operating the image capture device 178 (and/or usable for other purposes). Component housing 176 can also include other electrical components and/or can be electrically connected to electrical components at other locations within or on the HMD. HMD 172 also includes a BCT 186.
  • The HMD 172 can include a single display 180, which can be coupled to one of the side-arms 173 via the component housing 176. In an example embodiment, the display 180 can be a see-through display, which is made of glass and/or another transparent or translucent material, such that the wearer can see their environment through the display 180. Further, the component housing 176 can include the light sources (not shown) for the display 180 and/or optical elements (not shown) to direct light from the light sources to the display 180. As such, display 180 can include optical features that direct light that is generated by such light sources towards the wearer's eye, when HMD 172 is being worn.
  • In a further aspect, HMD 172 can include a sliding feature 184, which can be used to adjust the length of the side-arms 173. Thus, sliding feature 184 can be used to adjust the fit of HMD 172. Further, an HMD can include other features that allow a wearer to adjust the fit of the HMD, without departing from the scope of the invention.
  • FIGS. 1E to 1G are simplified illustrations of the HMD 172 shown in FIG. 1D, being worn by a wearer 190. As shown in FIG. 1F, when HMD 172 is worn, BCT 186 is arranged such that BCT 186 is located behind the wearer's ear. As such, BCT 186 is not visible from the perspective shown in FIG. 1E.
  • In the illustrated example, the display 180 can be arranged such that display 180 is positioned in front of or proximate to a user's eye when the HMD 172 is worn by a user. For example, display 180 can be positioned below the center frame support and above the center of the wearer's eye, as shown in FIG. 1E. Further, in the illustrated configuration, display 180 can be offset from the center of the wearer's eye (e.g., so that the center of display 180 is positioned above and to the right of the center of the wearer's eye, from the wearer's perspective).
  • Configured as shown in FIGS. 1E to 1G, display 180 can be located in the periphery of the field of view of the wearer 190, when HMD 172 is worn. Thus, as shown by FIG. 1F, when the wearer 190 looks forward, the wearer 190 can see the display 180 with their peripheral vision. As a result, display 180 can be outside the central portion of the wearer's field of view when their eye is facing forward, as it commonly is for many day-to-day activities. Such positioning can facilitate unobstructed eye-to-eye conversations with others, as well as generally providing unobstructed viewing and perception of the world within the central portion of the wearer's field of view. Further, when the display 180 is located as shown, the wearer 190 can view the display 180 by, e.g., looking up with their eyes only (possibly without moving their head). This is illustrated in FIG. 1G, where the wearer has moved their eyes to look up and align their line of sight with display 180. A wearer might also use the display by tilting their head down and aligning their eye with the display 180.
  • FIG. 2A illustrates a schematic drawing of a computing device 210 according to an example embodiment. In an example embodiment, device 210 communicates using a communication link 220 (e.g., a wired or wireless connection) to a remote device 230. The device 210 can be any type of device that can receive data and display information corresponding to or associated with the data. For example, the device 210 can be a heads-up display system, such as the head-mounted devices 102, 152, or 172 described with reference to FIGS. 1A to 1G.
  • Thus, the device 210 can include a display system 212 comprising a processor 214 and a display 216. The display 216 can be, for example, an optical see-through display, an optical see-around display, or a video see-through display. The processor 214 can receive data from the remote device 230, and configure the data for display on the display 216. The processor 214 can be any type of processor, such as a micro-processor or a digital signal processor, for example.
  • The device 210 can further include on-board data storage, such as memory 218 coupled to the processor 214. The memory 218 can store software that can be accessed and executed by the processor 214, for example.
  • The remote device 230 can be any type of computing device or transmitter including a laptop computer, a mobile telephone, or tablet computing device, etc., that is configured to transmit data to the device 210. The remote device 230 and the device 210 can contain hardware to enable the communication link 220, such as processors, transmitters, receivers, antennas, etc.
  • Further, remote device 230 can take the form of or be implemented in a computing system that is in communication with and configured to perform functions on behalf of a client device, such as computing device 210. Such a remote device 230 can receive data from another computing device 210 (e.g., an HMD 102, 152, or 172 or a mobile phone), perform certain processing functions on behalf of the device 210, and then send the resulting data back to device 210. This functionality can be referred to as “cloud” computing.
  • In FIG. 2A, the communication link 220 is illustrated as a wireless connection; however, wired connections can also be used. For example, the communication link 220 can be a wired serial bus such as a universal serial bus or a parallel bus. A wired connection can be a proprietary connection as well. The communication link 220 can also be a wireless connection using, e.g., Bluetooth® radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), Cellular technology (such as GSM, CDMA, UMTS, EV-DO, WiMAX, or LTE), or Zigbee® technology, among other possibilities. The remote device 230 can be accessible via the Internet and can include a computing cluster associated with a particular web service (e.g., social-networking, photo sharing, address book, etc.).
  • C. EXAMPLE IMAGE PROJECTION
  • FIG. 2B shows an example projection of UI elements described herein via an image 280 by an example head-mountable device (HMD) 252, according to an example embodiment. Other configurations of an HMD can also be used to present the UI described herein via image 280. FIG. 2B shows wearer 254 of HMD 252 looking at an eye of person 256. As such, wearer 254's gaze, or direction of viewing, is along gaze vector 260. A horizontal plane, such as horizontal gaze plane 264, can then be used to divide space into three portions: space above horizontal gaze plane 264, space in horizontal gaze plane 264, and space below horizontal gaze plane 264. In the context of projection plane 276, horizontal gaze plane 264 appears as a line that divides projection plane 276 into a subplane above the line of horizontal gaze plane 264, a subplane below the line of horizontal gaze plane 264, and the line where horizontal gaze plane 264 intersects projection plane 276. In FIG. 2B, horizontal gaze plane 264 is shown using dotted lines.
  • Additionally, a dividing plane, indicated using dividing line 274, can be drawn to separate space into three other portions: space to the left of the dividing plane, space on the dividing plane, and space to the right of the dividing plane. In the context of projection plane 276, the dividing plane intersects projection plane 276 at dividing line 274. Thus, the dividing plane divides projection plane 276 into: a subplane to the left of dividing line 274, a subplane to the right of dividing line 274, and dividing line 274. In FIG. 2B, dividing line 274 is shown as a solid line.
  • Humans, such as wearer 254, when gazing in a gaze direction, can have limits on what objects can be seen above and below the gaze direction. FIG. 2B shows the upper visual plane 270 as the uppermost plane that wearer 254 can see while gazing along gaze vector 260, and shows lower visual plane 272 as the lowermost plane that wearer 254 can see while gazing along gaze vector 260. In FIG. 2B, upper visual plane 270 and lower visual plane 272 are shown using dashed lines.
  • The HMD can project an image for view by wearer 254 at some apparent distance 262 along display line 282, which is shown as a dotted and dashed line in FIG. 2B. For example, apparent distance 262 can be 1 meter, four feet, infinity, or some other distance. That is, HMD 252 can generate a display, such as image 280, which appears to be at the apparent distance 262 from the eye of wearer 254 and in projection plane 276. In this example, image 280 is shown between horizontal gaze plane 264 and upper visual plane 270; that is, image 280 is projected above gaze vector 260. In this example, image 280 is also projected to the right of dividing line 274. As image 280 is projected above and to the right of gaze vector 260, wearer 254 can look at person 256 without image 280 obscuring their general view. In one example, the display element of the HMD 252 is translucent when not active (i.e., when image 280 is not being displayed), and so the wearer 254 can perceive objects in the real world along the vector of display line 282.
  • Other example locations for displaying image 280 can be used to permit wearer 254 to look along gaze vector 260 without obscuring the view of objects along the gaze vector. For example, in some embodiments, image 280 can be projected above horizontal gaze plane 264 near and/or just above upper visual plane 270 to keep image 280 from obscuring most of wearer 254's view. Then, when wearer 254 wants to view image 280, wearer 254 can move their eyes such that their gaze is directly toward image 280.
  • D. AN EXAMPLE USER INTERFACE FOR AN HMD
  • FIGS. 3 through 15 collectively describe aspects of an example user interface for an HMD such as discussed above at least in the context of FIGS. 1A through 2B. The HMD can be configured with a user interface (UI) controller receiving inputs from at least two user interfaces: a touch-based UI and a voice-based UI. The touch-based UI can include a touch pad and a button, configured to receive various touches, such as one-finger swipes in various directions, two-finger or multi-finger swipes in various directions, taps, button presses of various durations, and button releases.
  • Once a touch is received, the touch-based UI can report the touch; e.g., a “swipe forward” or “tap” to the HMD, or in some cases, to a component of the HMD such as a UI controller. In other embodiments, the HMD can act as the UI controller. As described herein, the HMD includes any necessary components, such as but not limited to one or more UI controllers, which are configured to perform and control the UI operations described herein.
  • The voice-based UI can include a microphone configured to receive various words, including commands, and to report the received words; e.g., “Call Mom”, to the HMD. In some embodiments, the HMD can include a gaze-based UI that is configured to detect duration and/or direction of one or more gazes of a wearer of the HMD. For example, the gaze-based UI can be configured to detect “dwell time” or how long the wearer gazes in a fixed direction, the direction of the gaze, a rate of change of the gaze, and additional information related to wearer gazes. In some cases, the HMD can generate audible outputs; e.g., tones, words, songs, etc., that can be heard by the wearer via headphones, speakers, or bone conduction devices of the HMD.
  • The HMD can generate “cards”, also referred to as screens or images, which are capable of occupying the full display of the HMD when selected. One card is a home card that is the first card displayed when the UI is activated, for example shortly after the HMD powers up or when the HMD wakes from a sleep or power-saving mode. FIG. 3 shows an example home card 300 of an example user interface, according to an example embodiment. Home card 300 includes application status indicators 310, device status indicators 312, hint 314, and a clock shown in large numerals indicating the current time in the center of home card 300. Application status indicators 310 can indicate which application(s) are operating on the HMD. As shown in FIG. 3, application status indicators 310 include camera and Y-shaped road icons to respectively indicate operation of a camera application and a navigation application. Such indicators can remind the wearer what applications or processes are presently running and/or consuming power and/or processor resources of the HMD.
  • Device status indicators 312 can indicate which device(s) are operating on the HMD and HMD status. As shown in FIG. 3, device status indicators 312 include icons for a wireless network and a Bluetooth network, respectively, that indicate the HMD is presently configured for communication via a wireless network and/or a Bluetooth network. In one embodiment, the HMD may not present device status indicators 312 on home card 300.
  • Hint 314 is shown in FIG. 3 as “ok glass”. Hint 314 is shown in quotes to indicate that the hint is related to the voice-based UI of the HMD. In some embodiments, hint 314 can be related to the touch-based UI of the HMD. The words in hint 314 illustrated as “ok glass” indicate that a wearer should say the words “ok glass” to activate the voice-based UI of the HMD. In other words, “ok glass” in this instance is a phrase (that can also be referred to as a “hotword”) that triggers activation of a voice-based UI. Other hotwords can also be used.
  • As also indicated in the lower portion of FIG. 3, if hint 314 is used successfully a number, e.g., 5, of times, the HMD can remove hint 314 from being displayed on home card 300. However, if the HMD has a gaze-based UI and detects that a dwell time of the wearer on the home card exceeds a threshold, such as a 30-second threshold, the HMD can add hint 314 back to home card 300 to remind the wearer about the specific words, e.g., ok glass, used to activate the voice-based UI. In one embodiment, the hotword presented as hint 314 on home card 300 can be updated to make the user aware of other functionality of the HMD, or to suggest queries or actions based on the HMD's current geographic location or situational context.
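  • A minimal sketch of this hint policy, using the 5-use and 30-second values from the example above; the class and method names are assumptions for illustration:

```java
/** Illustrative hint policy: hide the hotword hint after enough successful uses,
 *  and re-show it when the wearer dwells on the home card too long. */
class HintPolicy {
    static final int USES_BEFORE_HIDING = 5;       // example value from the text
    static final long DWELL_THRESHOLD_MS = 30_000; // 30-second threshold

    private int successfulUses = 0;
    private boolean hintVisible = true;

    void onHotwordUsedSuccessfully() {
        successfulUses++;
        if (successfulUses >= USES_BEFORE_HIDING) {
            hintVisible = false; // the wearer has learned the hotword
        }
    }

    void onHomeCardDwell(long dwellMs) {
        if (dwellMs > DWELL_THRESHOLD_MS) {
            hintVisible = true; // remind the wearer about the activation words
        }
    }

    boolean isHintVisible() {
        return hintVisible;
    }
}
```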
  • The UI can accept as inputs certain operations performed using the touch-based UI. The UI can receive these operations and responsively perform actions to enable the wearer to interact with the HMD. These operations can be organized into tiers. FIG. 4 lists example operations of a multi-tiered user model 400 for a user interface for a head-mountable device (HMD), according to an example embodiment.
  • As shown in FIG. 4, multi-tiered user model 400 has three tiers: basic, intermediate, and advanced. The basic tier provides the smallest number of operations of any tier of multi-tiered user model 400. The intermediate tier includes all operations provided by the basic tier, along with additional operations not provided by the basic tier. Similarly, the advanced tier includes all operations provided by the basic and intermediate tiers, along with additional operations not provided by either the basic tier or intermediate tier.
  • FIG. 4 shows that the basic tier of multi-tiered user model 400 provides tap, swipe forward, swipe backward, swipe down, voice, and camera button press operations. A tap operation can involve a single physical tap—that is, one quick, slight strike with one or more fingers on the touch pad of the touch-based UI. A swipe forward operation, sometimes termed a “swipe right”, can involve a movement forward by one or more fingers touching the touch pad, where forward is the general direction from the wearer's ear toward the wearer's eye when the wearer has the HMD on. A swipe backward operation, sometimes termed a “swipe left”, can involve a movement backward by one or more fingers touching the touch pad, where backward is the general direction from the wearer's eye toward the wearer's ear when the wearer has the HMD on. A “swipe down” operation can involve a downward movement by one or more fingers touching the touch pad, where downward is the general direction from the top of the wearer's head toward the wearer's neck when the wearer has the HMD on.
  • While example embodiments in this description make reference to particular directions of touchpad input such as up, down, left, right, it should be understood that these are exemplary and that embodiments where certain operations can be triggered via different input directions are contemplated.
  • In one embodiment, the physical actions used by the wearer to perform some or all of the herein-described operations can be customized; e.g., by the wearer and/or other entity associated with the HMD. For example, suppose the wearer prefers to perform a physical action of a “double-tap”—that is, one physical tap quickly followed by a second physical tap—rather than the above-mentioned single physical tap, to perform a tap operation. In this embodiment, the wearer and/or other entity could configure the HMD to recognize a double-tap as a tap operation, such as by training or setting the HMD to associate the double-tap with the tap operation. As another example, suppose that the wearer would like to interchange the physical operations to perform swipe forward and backward operations; e.g., the swipe forward operation would be performed using a physical action described above as a swipe left and the swipe backward operation would be performed using a physical action described above as a swipe right. In this embodiment, the wearer could configure the HMD to recognize a physical swipe left as a swipe forward operation and physical swipe right as a swipe backward operation. Other customizations are possible as well; e.g., using a sequence of swipes to carry out the tap operation.
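  • One non-limiting way to realize such customization is a remappable table from physical actions to UI operations; the enums and default bindings below are assumptions for illustration:

```java
import java.util.HashMap;
import java.util.Map;

enum PhysicalAction { TAP, DOUBLE_TAP, SWIPE_LEFT, SWIPE_RIGHT, SWIPE_DOWN }
enum UiOperation { TAP, SWIPE_FORWARD, SWIPE_BACKWARD, SWIPE_DOWN }

/** Illustrative remappable gesture table with assumed defaults. */
class GestureMap {
    private final Map<PhysicalAction, UiOperation> bindings = new HashMap<>(Map.of(
            PhysicalAction.TAP, UiOperation.TAP,
            PhysicalAction.SWIPE_RIGHT, UiOperation.SWIPE_FORWARD,
            PhysicalAction.SWIPE_LEFT, UiOperation.SWIPE_BACKWARD,
            PhysicalAction.SWIPE_DOWN, UiOperation.SWIPE_DOWN));

    /** Rebind a physical action to a different UI operation. */
    void remap(PhysicalAction action, UiOperation operation) {
        bindings.put(action, operation);
    }

    UiOperation resolve(PhysicalAction action) {
        return bindings.get(action);
    }
}
```

  • With such a table, the double-tap preference described above becomes a single call, e.g., gestureMap.remap(PhysicalAction.DOUBLE_TAP, UiOperation.TAP), and interchanging the swipe directions is two further remap calls.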
  • The tap operation can select a currently visible card. The swipe forward operation can remove the currently visible card from display and select a next card for display. The swipe backward operation can remove the currently visible card from display and select a previous card for display.
  • The swipe down operation can, depending on context, act to go back, go home, or sleep. Going back can remove the currently visible card from display and display a previously-visible card. For example, the previously-visible card can be the card that was most recently viewed; e.g., if card A is currently visible and card B is the previously-viewed card, then the swipe down operation can remove card A from visibility and display card B. Going home can remove the currently visible card from display and display the home card. Sleeping can cause part of the HMD, e.g., the display, or all of the HMD to be deactivated.
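  • This context-dependent behavior can be sketched with a simple back stack; the card representation and the rule ordering (go back if there is history, go home from a non-home card, otherwise sleep) are illustrative assumptions:

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Illustrative swipe-down handling: back, then home, then sleep. */
class SwipeDownHandler {
    private final Deque<String> backStack = new ArrayDeque<>();
    private String currentCard = "home";

    /** Display a new card, remembering the previously visible one. */
    void show(String card) {
        backStack.push(currentCard);
        currentCard = card;
    }

    void onSwipeDown() {
        if (!backStack.isEmpty()) {
            currentCard = backStack.pop(); // go back to the previously visible card
        } else if (!"home".equals(currentCard)) {
            currentCard = "home";          // go home
        } else {
            sleep();                       // from the home card, deactivate
        }
    }

    private void sleep() {
        System.out.println("display deactivated");
    }
}
```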
  • A voice operation can provide access to a voice menu of operations. Voice interactions with the UI are discussed below in more detail in the context of FIG. 15. A camera button press can instruct the HMD to take a photo using a camera associated with and/or part of the HMD.
  • FIG. 4 shows that the intermediate tier of multi-tiered user model 400 provides tap, swipe forward, swipe backward, swipe down, voice, and camera button press operations as described above in the context of the basic tier. Also, the intermediate tier provides camera button long press, two finger swipe forward, two finger swipe backward, and two finger swipe down operations.
  • The camera button long press operation can instruct the HMD to provide a capture menu for display and use. The capture menu can provide one or more operations for using the camera associated with HMD. The capture menu is discussed below in more detail in the context of FIG. 7.
  • The two finger swipe forward operation removes the currently visible card from display and selects a next card for display using a “zoomed scroll”. The two finger swipe backward operation removes the currently visible card from display and selects a previous card for display using a zoomed scroll. Zoomed scrolls are discussed in more detail in the context of at least FIG. 6A. The two finger swipe down causes the HMD to sleep at this position in the timeline.
  • FIG. 4 shows that the advanced tier of multi-tiered user model 400 provides tap, swipe forward, swipe backward, swipe down, voice, and camera button press operations as described above in the context of the basic tier, as well as camera button long press, two finger swipe forward, two finger swipe backward, and two finger swipe down operations described above in the context of the intermediate tier. The advanced tier also provides one-finger press-and-hold, two-finger press-and-hold, and nudge operations.
  • The one-finger press-and-hold operation zooms, or expands, the display of the current card, or content related to the current card, starting when the wearer presses on the touch-based UI and continuing to zoom as long as the wearer “holds” or keeps pressing on the touch-based UI.
  • The two-finger press-and-hold can provide a “clutch” operation, which can be performed by pressing on the touch-based UI in two separate spots using two fingers and holding the fingers in their respective positions on the touch-based UI. After the fingers are held in position on the touch-based UI, the clutch operation is engaged. In some embodiments, the HMD recognizes the clutch operation only after the fingers are held for at least a threshold period of time; e.g., one second. The clutch operation will stay engaged as long as the two fingers remain on the touch-based UI. Clutch operations are discussed in more detail below in the context of at least FIGS. 6B and 6C.
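  • A minimal sketch of clutch recognition using the one-second threshold mentioned above; timestamps are supplied by the caller, and the names are assumptions for illustration:

```java
/** Illustrative clutch detection: two fingers held for at least a threshold time. */
class ClutchDetector {
    static final long HOLD_THRESHOLD_MS = 1_000; // e.g., one second

    private int fingersDown = 0;
    private long twoFingersSinceMs = -1;

    void onFingerDown(long nowMs) {
        fingersDown++;
        if (fingersDown == 2) {
            twoFingersSinceMs = nowMs; // start timing the hold
        }
    }

    void onFingerUp() {
        fingersDown--;
        if (fingersDown < 2) {
            twoFingersSinceMs = -1; // the clutch disengages when a finger lifts
        }
    }

    boolean isClutchEngaged(long nowMs) {
        return twoFingersSinceMs >= 0 && nowMs - twoFingersSinceMs >= HOLD_THRESHOLD_MS;
    }
}
```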
  • The nudge operation can be performed using a short, slight nod of the wearer's head. For example, the HMD can be configured with accelerometers or other motion detectors that can detect the nudge and provide an indication of the nudge to the HMD. Upon receiving indication of a nudge, the HMD can toggle an activation state of the HMD. That is, if the HMD is active (e.g., displaying a card on the activated display) before the nudge, the HMD can deactivate itself (e.g., turn off the display) in response. Alternatively, if the HMD is inactive before the nudge but is active enough to detect nudges, e.g., within a few seconds of a notification of message arrival, the HMD can activate itself in response.
  • By way of further example, in one scenario, the HMD is powered on with the display inactive. In response to the HMD receiving a new text message, an audible chime can be emitted by the HMD. Then, if the wearer nudges within a few seconds of the chime, the HMD can activate and present a card with the content of the text message. If, from the activated state, the user nudges again, the display will deactivate. Thus, in this example, the user can interact with the device in a completely hands-free manner.
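  • The hands-free scenario above might be sketched as follows; the three-second wake window is an assumed stand-in for “within a few seconds of the chime”:

```java
/** Illustrative nudge handling: a nudge toggles the display, and can also
 *  wake the HMD shortly after a notification chime. */
class NudgeHandler {
    static final long WAKE_WINDOW_MS = 3_000; // assumed "a few seconds"

    private boolean displayActive = false;
    private long lastChimeMs = Long.MIN_VALUE / 2; // effectively "no chime yet"

    void onNotificationChime(long nowMs) {
        lastChimeMs = nowMs;
    }

    void onNudge(long nowMs) {
        if (displayActive) {
            displayActive = false;                         // nudge while active: deactivate
        } else if (nowMs - lastChimeMs <= WAKE_WINDOW_MS) {
            displayActive = true;                          // nudge soon after chime: show the message card
        }
    }

    boolean isDisplayActive() {
        return displayActive;
    }
}
```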
  • As mentioned above, the UI maintains a timeline or ordered sequence of cards that can be operated on using the operations described in FIG. 4 immediately above. FIG. 5A shows a scenario 500 of example timeline interactions, according to an example embodiment.
  • Scenario 500 begins with home card 502 being displayed by an HMD worn by a wearer. Home card 502 and cards 520 a-520 f can be arranged as a “timeline” or ordered sequence of cards. In the example shown in FIG. 5A, each card in timeline 510 has a specific time associated with the card. The timeline can be ordered based on the specific time associated with each card. In some cases, the specific time can be “now” or the current time. For example, home card 502 can be associated with the specific time of now. In other cases, the time can be a time associated with an event leading to the card. For example, FIG. 5A shows that card 520 a represents a photo taken 2 hours ago. Then, card 520 a can be associated with the specific time of 1:28, which is 2 hours before the current time of 3:28 shown on home card 502.
  • Cards 520 b-520 f represent current cards, or cards associated with the specific time of now, or upcoming cards, or cards associated with a future time. For example, card 520 b is a current card that includes an image currently generated by a camera associated with the HMD, card 520 c is a current card that includes an image of a “hangout” or video conference call currently in-progress generated by an application of the HMD, card 520 d is a current card that includes an image and text currently generated by a navigation application/process presently running on the HMD, card 520 e is a current card that includes images and text currently generated by a weather application of the HMD, and card 520 f is an upcoming card that includes images and text generated by a calendar application of the HMD indicating an appointment for “Lunch with Monica Kim” in “2 hours”.
  • In scenario 500, the HMD can enable navigation of the timeline using swipe operations. For example, starting at home card 502, a swipe backward operation can cause the HMD to select and display a previous card, such as card 520 a, and a swipe forward operation can cause the HMD to select and display a next card, such as card 520 b. Upon displaying card 520 b, the swipe backward operation can cause the HMD to select and display the previous card, which is home card 502, and the swipe forward operation can cause the HMD to select and display the next card, which is card 520 c.
  • In scenario 500, there are no cards in timeline 510 that are previous to card 520 a. In one embodiment, the timeline is represented as a circular timeline. For example, in response to a swipe backward operation on card 520 a requesting a previous card for display, the HMD can select card 520 f for (re)display, as there are no cards in timeline 510 previous to card 520 a during scenario 500. Similarly, in response to a swipe forward operation on card 520 f requesting a next card for display, the HMD can select card 520 a for (re)display, as there are no cards in timeline 510 after card 520 f during scenario 500.
  • In another embodiment, instead of a circular representation of the timeline, when the user navigates to the end of the timeline, a notification is generated to indicate to the user that there are no additional cards to navigate to in the instructed direction. Examples of such notifications could include any of, or a combination of, the following: a visual effect, an audible effect, a glowing effect on the edge of the card, a three-dimensional animation twisting the edge of the card, a sound (e.g., a click), or a textual or audible message indicating that the end of the timeline has been reached (e.g., “there are no cards older than this”). Alternatively, in one embodiment, an attempt by the user to navigate past a card in a direction where there are no additional cards could result in no effect; i.e., swiping right on card 520 a results in no perceptible change to the display of card 520 a.
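  • Both end-of-timeline behaviors can be sketched in a single navigator; treating the choice as a constructor flag, and representing cards as strings, are simplifications made for illustration:

```java
import java.util.List;

/** Illustrative timeline navigation with either circular wraparound
 *  or an end-of-timeline notification. */
class TimelineNavigator {
    private final List<String> cards; // assumed non-empty, ordered by time
    private final boolean circular;
    private int index = 0;

    TimelineNavigator(List<String> cards, boolean circular) {
        this.cards = cards;
        this.circular = circular;
    }

    String swipeForward()  { return moveBy(1);  } // next card
    String swipeBackward() { return moveBy(-1); } // previous card

    private String moveBy(int delta) {
        int next = index + delta;
        if (circular) {
            index = Math.floorMod(next, cards.size()); // wrap around the ends
        } else if (next < 0 || next >= cards.size()) {
            notifyEndOfTimeline(); // e.g., a glow, a click, or a spoken message
        } else {
            index = next;
        }
        return cards.get(index);
    }

    private void notifyEndOfTimeline() {
        System.out.println("end of timeline reached");
    }
}
```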
  • While displaying home card 502, a wearer of the HMD can recite or utter a hotword, for example the words “ok glass” to activate the voice-based interface of the HMD. In response, the HMD can display card 530 that lists some of the commands that can be uttered by the wearer to interact with the voice-based interface. FIG. 5A shows example commands as “Google” to perform a search query, “navigate to” to find directions to a location, “take a photo” to capture an image using a camera associated with the HMD, “record a video” to capture a sequence of images and/or associated sounds, using a camera and/or a microphone associated with the HMD, and “send a message” to generate and send an e-mail, SMS message, instant message, or some other type of message.
  • While displaying card 530, the wearer can utter something in response, which can lead to voice interactions with the UI, such as those discussed below with respect to FIG. 15. The commands capable of triggering voice interactions are not necessarily limited to those presented on card 530 at the time the utterance is received. For example, as the user dwells on card 530, additional commands can be presented for other features. Further, such commands presented on card 530 can change over time through further use of the HMD, or can be remotely updated to surface additional features or content of the HMD. Still further, similar to the frequent contact aspects described herein, commands for frequently used functions of the HMD can be presented on card 530. As such, these commands can change over time based on use of the HMD by the wearer.
  • In some examples, timelines can become lengthy. The UI provides operations for speedy navigation of such timelines, such as two-fingered swipes and clutches, although other gestures to invoke such navigation operations are possible. FIG. 6A shows an example of using a two-fingered swipe on a touch-based UI of an HMD for zoomed scrolling, according to an example embodiment.
  • FIG. 5B shows scenario 540 of example timeline interactions including splicing a new card into timeline 550, according to an example embodiment. Scenario 540 begins with a wearer of an HMD using the HMD to observe timeline 550, focusing in on card 550 b, which is the home card for timeline 550. FIG. 5B shows the focused-on card, card 550 b, of timeline 550 using a dotted-line border. At this point of scenario 540, card 550 b is displayed by the HMD using a single-card view.
  • Scenario 540 continues by the HMD receiving an incoming telephone call from a contact, Kelly Young. For example, the HMD can be configured with one or more transceivers configured to establish, maintain, and tear down communication links, such as communication link 220 discussed above in the context of FIG. 2A, that utilize one of a number of cellular and/or other technologies to originate and terminate wireless telephone calls.
  • Upon receiving the phone call, the HMD can generate, retrieve, and/or determine card 560 representing the calling party, Kelly Young, of the telephone call. Once available, the HMD can display card 560 using a single-card view.
  • In scenario 540, the wearer of the HMD would like to answer the call from Kelly Young. To accomplish this, the wearer can perform a tap operation to bring up a contextual menu suitable for the context of a telephone call. This contextual menu can have options such as, but not limited to, answering the telephone call, routing/forwarding the telephone call to another number (e.g., another phone, voice mail), ignoring/rejecting the telephone call, putting the calling party on hold, bridging the calling party into a three-way or multi-way call, bridging the calling party into a video conference call, such as a hangout, and saving contact information related to the telephone call.
  • In scenario 540, the first option of the contextual menu for the telephone call is an answer option. The options of the contextual menu can be displayed as an overlay on top of card 560 representing the telephone call. In some embodiments, such as shown in FIG. 5B, card 570 can be generated by (a) displaying text and/or graphics related to the contextual menu item overlaying (b) a dimmed version of card 560. Card 570 can then be focused on and displayed using a single-card view.
  • The wearer can answer the call by performing a tap operation while the answer option is active; e.g., while card 570 is focused on. In response, the HMD can generate display 580 by determining where in the timeline a new card representing the telephone call would be displayed. As the telephone call is a current and most recent event for the HMD, a card representing the telephone call; e.g. card 560, would be adjacent to and on the future/now side of a timeline. That is, for timeline 550 shown at the top of FIG. 5B, card 560 would be “spliced into”, or inserted or placed into the middle of, timeline 550 between home card 550 b and card 550 c.
  • The HMD can be configured to animate this splicing operation by showing room being made for the to-be-spliced-in card in the timeline and then showing the to-be-spliced-in card placed into the timeline. Once spliced in, the HMD can show the spliced card in single-card view as the focused-on card. For example, in response to the tap operation performed while card 570 is displayed, the HMD can switch to a zoomed-out, or multi-card, display of the timeline as shown in display 580 showing part or all of cards 550 a-550 d of timeline 550. Then, the HMD can show the cards on each side of the to-be-spliced-in card; e.g., cards 550 b and 550 c, moving away from the center of the zoomed-out display as indicated in display 580.
  • Then, once cards on each side of the to-be-spliced-in card have each moved far enough away from the center of the display to permit insertion of a new card, the to-be-spliced-in card can be shown in the display between the cards on each side. Display 582 shows a stage of this insertion animation after cards 550 b and 550 c have moved far enough apart to permit splicing in card 560—the to-be-spliced-in card.
  • Timeline 550 at the bottom of FIG. 5B shows the result of the splicing operation. Card 560 (the former to-be-spliced-in card) is shown between cards 550 b and 550 c in timeline 550, and is indicated as being focused on by the HMD. As card 560 is focused on by the HMD, card 560 can be shown in single-card mode.
  • Scenario 540 can conclude with the HMD answering the telephone call before, during, or after the animation of the splicing operation, and the telephone call between Kelly Young and the wearer entering the talking state.
  • The splicing operation can be performed in reverse when a card is to be removed from a timeline; that is, a “reverse splice” can be performed. For example, after the call with Kelly Young is completed, card 560 could be removed from timeline 550. In an embodiment, an animation that is substantially the reverse of the splicing process described above is used in conjunction with removing card 560 from timeline 550.
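  • As a rough illustration of the splice and reverse-splice operations, the sketch below models the timeline as a Python list; the animation hooks and all names are hypothetical, and an actual HMD would interleave these steps with the zoomed-out animation described above rather than run them atomically.

```python
def splice_in(timeline, new_card, position, animation=None):
    """Insert new_card at `position`, e.g., between a home card and its
    future-side neighbor, optionally animating the neighbors apart."""
    if animation:
        animation.open_gap(position)        # neighbors move away from center
    timeline.insert(position, new_card)
    if animation:
        animation.place_card(new_card)      # card appears in the opened gap
    return new_card                         # spliced card becomes focused

def reverse_splice(timeline, card, animation=None):
    """Remove `card` from the timeline, e.g., after a telephone call ends."""
    position = timeline.index(card)
    timeline.remove(card)
    if animation:
        animation.close_gap(position)       # neighbors move back together

# e.g., splice_in(cards, call_card, cards.index(home_card) + 1)
```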
  • FIG. 5C shows scenario 584 using a multi-timeline display, according to an example embodiment. Scenario 584 begins with a wearer of an HMD using View A, shown at the top of FIG. 5C, which can be generated by the HMD to display home card 588 a in single-card view 586. In scenario 584, a wearer of an HMD can switch from single-card view 586 into a multi-timeline view using a clutch operation, as discussed in detail below in the context of FIG. 6D. In other scenarios, a different operation or operations than a clutch can be performed to switch into the multi-timeline view.
  • In scenario 584, multiple cards of main timeline 588 can be displayed simultaneously upon entering the multi-timeline view. View B of FIG. 5C, shown just below View A, illustrates a multi-timeline view and shows three cards 588 a, 588 b, and 588 c of main timeline 588 in a linear arrangement. Card 588 a is a home card for main timeline 588, card 588 b is a card representing an “Email” from “LunchPal” that arrived “5 min ago”, and card 588 c is a bundle card that shows a number of thumbnail images related to a bundle of contacts called “Friends”.
  • In scenario 584, card 588 a was shown while in single-card view 586 and in an initial multi-timeline view. In some scenarios, the initial multi-timeline view can be centered on the card shown in a previous single-card view; e.g., home card 588 a. In other scenarios, multiple timelines can be displayed as part of the initial multi-timeline view; for example, main timeline 588 can be accompanied by one or more timelines showing cards representing one or more contacts, photos, previous events, future events, and/or other cards.
  • In scenario 584, the wearer of the HMD can select a card for use by controlling a selection region; e.g., focus 688 a shown in FIG. 6D. A given card, such as card 588 b, can be selected when the selection region is aligned with the given card. In this context, the selection region can be aligned with a given card in a display when the selection region is placed over the given card in the display, the selection region substantially overlaps the given card in the display, and/or a UI action (e.g., a tap of a touchpad, a click of a mouse, a key press) is performed when the selection region overlaps the given card in the display. Other techniques for aligning a selection region and a given card are possible as well. In some embodiments, the selection region substantially overlaps the given card when at least 50% of the selection region overlaps the given card in the display. In some embodiments, the HMD can be configured to detect head movements and the selection region can be moved using the head movements.
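  • The “substantially overlaps” test above can be sketched as a simple area computation, assuming axis-aligned rectangles given as (x, y, width, height); the 50% threshold follows the embodiment above, while the function names are illustrative only.

```python
def overlap_fraction(selection, card):
    """Fraction of the selection region's area lying over the card."""
    sx, sy, sw, sh = selection
    cx, cy, cw, ch = card
    ix = max(0.0, min(sx + sw, cx + cw) - max(sx, cx))  # intersection width
    iy = max(0.0, min(sy + sh, cy + ch) - max(sy, cy))  # intersection height
    return (ix * iy) / (sw * sh)

def is_aligned(selection, card, threshold=0.5):
    """True when the selection region substantially overlaps the card."""
    return overlap_fraction(selection, card) >= threshold
```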
  • In scenario 584, the wearer of the HMD selects card 588 b and, after the selection of card 588 b, View C can be generated, which is shown below and to the left of View B in FIG. 5C. View C shows card 588 b of main timeline 588 and a linear arrangement of three action cards 590 a, 590 b, and 590 c shown above card 588 b; that is, View C shows multiple linear arrangements simultaneously. As shown in View C, the linear arrangement of action cards starts with card 590 a that is directly above selected card 588 b, and the linear arrangement of action cards is adjacent to, above, and parallel to main timeline 588. Card 588 a is shown in View C as greyed out to indicate that card 588 a is not selected.
  • Upon selection of action card 590 a to “View All”, the wearer can view the e-mail represented by card 588 b. Selection of action card 590 b to “Share” can enable the wearer to share; e.g., reply to, forward, post to a website, etc., the e-mail represented by card 588 b. Selection of action card 590 c to “Delete” can permit the wearer to delete the e-mail represented by card 588 b.
  • In scenario 584, the wearer selects card 590 a to view all of the e-mail represented by card 588 b. After selection of card 590 a, the content of the e-mail is shown using three content cards 592 a, 592 b, and 592 c shown in View D as adjacent to and above selected card 590 a. View D is shown directly to the right of View C in FIG. 5C.
  • View D also shows that the linear arrangement of content cards begins with card 592 a, which is shown directly above selected card 590 a. View D does not show unselected action cards 590 b and 590 c; in some embodiments, unselected cards can be displayed. In particular scenarios, unselected but displayed cards can be displayed in a visually distinct manner to indicate non-selection; e.g., shown with a grey background as for card 588 a in View C.
  • Scenario 584 continues with the wearer of the HMD manipulating the selection region to return to the main timeline 588 and select card 588 c as shown in View E. FIG. 5C shows View E below and to the left of View D. As mentioned above, card 588 c is a bundle card representing a group of related cards; in this example, a group of contact cards. Each contact card can have an indication that the card is a contact card. In some embodiments, cards represented by bundle card 588 c can have an indication that the card is in the “Friends” bundle of cards/contacts. As such, the HMD can determine the cards in the “Friends” bundle by searching for each card having an indication that the card is in the “Friends” group of cards, as in the sketch below.
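  • Determining a bundle's members by scanning for a bundle indication could look like the following; the per-card `bundles` attribute is a hypothetical name for whatever indication each card carries.

```python
def cards_in_bundle(all_cards, bundle_name):
    """Collect every card carrying an indication of the named bundle."""
    return [card for card in all_cards
            if bundle_name in getattr(card, "bundles", ())]

# e.g., friends_cards = cards_in_bundle(timeline_cards, "Friends")
```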
  • Upon selection of card 588 c, the HMD can generate View F, which shows contact cards 594 a and 594 b of the “Friends” bundle displayed in a linear arrangement with main timeline 588. View F is shown in FIG. 5C directly below View E. Bundle card 588 c is shown by View F as remaining in the linear arrangement with main timeline 588. In some scenarios, contact cards 594 a and 594 b, as well as additional cards in the “Friends” bundle, can be shown in a linear arrangement adjacent to the linear arrangement showing a selected bundle card; e.g., card 588 c. In other scenarios, upon selection of bundle card 588 c, bundle card 588 c is no longer displayed; rather, the bundle card can be considered to be replaced by the content of the bundle.
  • Also, the splicing operation can utilize cards generated by other applications. For example, suppose a card representing a navigation application/process is displayed on a timeline, and the wearer uses a tap operation to activate the navigation application/process to provide directions to a destination. To show the directions to the destination, the navigation application/process can generate a results card that includes one or more directions. When first generated, the results card can be spliced into the timeline, using the splicing operation described immediately above. When the wearer arrives at the destination, the results card can be removed using the reverse splice operation described above. In some scenarios, multiple cards can be spliced in and/or reverse spliced out of a timeline simultaneously or substantially so, such as when being added to or leaving a multi-party hangout, telephone call, or other communication.
  • To speed movement in selecting next card(s) in the timeline, a wearer can swipe forward with two fingers, as shown in FIG. 6A, to perform a zoomed scroll to a next card. Similarly, to speed movement in selecting previous card(s) in the timeline, a wearer can swipe backward with two fingers, as also shown in FIG. 6A, to perform a zoomed scroll to a previous card.
  • Upon receiving a UI operation for a zoomed scroll, for example, a two-fingered swipe forward, a reduced-size view of cards can be displayed in the resulting timeline 610. That is, as shown in FIG. 6A, multiple cards can be shown in example display 612 generated by the HMD. A swipe or drag operation associated with the zoomed scroll can move content faster, e.g., 4 times faster, than when performing a regular swipe or drag operation. Inertial free scrolling can be performed as part of zoomed scrolling. After the zoomed scroll completes, the focus for the UI is on card 614 of timeline 610. FIG. 6A shows card 614 outlined using a thick dashed line in the center of display 612.
  • A timeline that has been released after the zoomed scroll can stay zoomed out, or can continue with reduced-size card views, until the scrolling slows to a minimum velocity threshold for the timeline. After the minimum velocity threshold is reached, display 612 can be instructed to zoom to the card that is closest to the center of display 612; e.g., display 612 can zoom to card 614. That is, the HMD can show card 614 as large as possible within display 612.
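  • A sketch of zoomed scrolling under these assumptions follows: a two-fingered swipe advances the scroll offset at a multiple of the normal rate (4x in the example above), and once the scroll velocity falls to a threshold the view snaps to the card nearest the display center. The constants and function names are illustrative, not from the disclosure.

```python
ZOOM_SCROLL_MULTIPLIER = 4.0   # per the example above
MIN_VELOCITY = 50.0            # pixels/second; hypothetical threshold

def apply_scroll(offset, delta, two_fingered):
    """Advance the scroll offset; two-fingered swipes move content faster."""
    return offset + delta * (ZOOM_SCROLL_MULTIPLIER if two_fingered else 1.0)

def maybe_snap(offset, velocity, card_width):
    """Stay zoomed out while fast; snap to the centered card once slow."""
    if abs(velocity) >= MIN_VELOCITY:
        return offset, False                 # keep scrolling, zoomed out
    nearest = round(offset / card_width)     # card closest to display center
    return nearest * card_width, True        # snap; caller zooms to this card
```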
  • Additional techniques for rapid movement within a timeline and between timelines can be provided by the UI. For example, a clutch operation can lead to generation and display of a multi-card display, such as shown in FIG. 6B, or a multi-timeline display, such as shown in FIG. 6C. Navigation within the multi-card display and/or multi-timeline display can, in some embodiments, be performed using head movements. In other embodiments, the multi-card display or multi-timeline display in toto can be focused on, or displayed by the HMD. Thus, to aid navigation, a sub-focus can be implemented to highlight a card or a timeline within a multi-card or multi-timeline display.
  • FIG. 6B shows a scenario 620 for using clutch operation 642 to generate a multi-card display 634 a, according to an example embodiment. Scenario 620 begins with an HMD having timeline 630 with cards 630 a through 630 g, and with a focus on card 630 d. During scenario 620, prior to clutch 642, the HMD displays cards in the timeline using a single-card view, solely displaying the focused-upon card. As the focus is on card 630 d, which FIG. 6B shows as a photo of a woman's face, the HMD displays a single-card view of card 630 d.
  • Scenario 620 continues with a wearer of the HMD performing clutch operation 642 using the touch-based UI of the HMD. A clutch operation can involve pressing on the touch-based UI of the HMD using two fingers and holding the two-finger press until the HMD recognizes the clutch operation 642 has been performed. Other gestures, techniques, inputs or time thresholds can be used to trigger the clutch operation. For example, in certain embodiments, a three-finger gesture or a voice-action could be used to engage and/or disengage the clutch operation.
  • Upon recognition of clutch operation 642, in scenario 620, the HMD can generate and display multi-card display 634 a, which is shown in an expanded view as multi-card display 634 b. In some embodiments, the HMD can focus on the entire multi-card display 634 a using focus 636. In other embodiments, the HMD can focus on a subset of cards, such as, but not limited to, a single card, a row of cards, a column of cards, a block of cards, or some other selection of cards, within multi-card display 634 a using sub-focus 638. For example, in scenario 620, the HMD is configured to display sub-focus 638 on a single card. In some embodiments, the sub-focus can remain on one or more cards at or near the center of the display.
  • As shown in FIG. 6B using expanded multi-card display 634 b, the multi-card display shows nine cards: cards 630 a through 630 g of timeline 630 and two other cards 640 a and 640 b not shown as part of timeline 630. The wearer of the HMD can navigate around multi-card display 634 a, 634 b using head movements, such as moving the wearer's head up, down, left, and/or right. In some embodiments, gaze tracking can be used in place of or in addition to head movements for navigating around multi-card display 634 a, 634 b and/or multi-timeline display 664 a, 664 b.
  • In scenario 620, “wrap-around” movements, or moving off the end of a row or column to the respective other end of the row or column, are enabled. Then, in response to respective movements upward, downward, leftward, or rightward by the head of the wearer, the sub-focus 638 can move from card 630 d, as shown in FIG. 6B, to respective cards 630 a, 630 g, 630 f, or 630 e. In particular embodiments, wrap-around can be inhibited, so moving the wearer's head leftward will not move sub-focus 638 from card 630 d to card 630 f; rather, sub-focus 638 will stay at the left end of the middle row on card 630 d.
  • In some embodiments, in response to respective movements diagonally up-and-left, up-and-right, down-and-left, and down-and-right by the head of the wearer, the sub-focus 638 can move from card 630 d, as shown in FIG. 6B, to respective cards 630 c, 630 b, 640 b, or 640 a. Other types of head movements and/or UI operations can be used as well or instead with multi-card display 634 a, 634 b, including but not limited to head movements and/or UI operations that move the focus faster than and/or slower than one card at a time, zooming in and out, reshaping sub-focus 638, selecting card(s), and deselecting card(s).
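  • These sub-focus movements reduce to grid arithmetic over the 3x3 card layout, with wrap-around selectable as in the embodiments above. A minimal sketch follows; the direction table and names are assumptions.

```python
MOVES = {
    "up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1),
    "up-left": (-1, -1), "up-right": (-1, 1),
    "down-left": (1, -1), "down-right": (1, 1),
}

def move_sub_focus(row, col, direction, rows=3, cols=3, wrap=True):
    """Return the new (row, col) of the sub-focus after a head movement."""
    dr, dc = MOVES[direction]
    if wrap:
        return (row + dr) % rows, (col + dc) % cols
    # Wrap-around inhibited: clamp so the sub-focus stays at the edge.
    return (max(0, min(rows - 1, row + dr)),
            max(0, min(cols - 1, col + dc)))

# e.g., from card 630d at (1, 0): move_sub_focus(1, 0, "left") wraps to
# (1, 2), card 630f; with wrap=False it stays at (1, 0), card 630d.
```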
  • In some embodiments, sub-focus 638 may not be used. For example, in these embodiments, a leftward head movement can move each of cards 630 b, 630 c, 630 e, 630 f, 640 a, and 640 b to the left by one card and bring new cards in from the “right” of these cards (new cards not shown in FIG. 6B) onto multi-card displays 634 a and 634 b. The new cards can be displayed in the respective positions of cards 630 c, 630 f, and 640 b, and cards 630 a, 630 d, and 630 g can be removed from multi-card displays 634 a and 634 b. Also, a rightward head movement can move each of cards 630 a, 630 b, 630 d, 630 e, 630 g, and 640 a to the right by one card and bring new cards in from the “left” of these cards (not shown in FIG. 6B) onto multi-card displays 634 a and 634 b. The new cards can be displayed in the respective positions of cards 630 a, 630 d, and 630 g, and cards 630 c, 630 f, and 640 b can be removed from multi-card displays 634 a and 634 b.
  • In these embodiments, an upward head movement can: (1) bring in a new row of cards considered to be “above” the top row of cards (i.e., the row occupied by cards 630 a, 630 b, and 630 c of multi-card displays 634 a and 634 b), (2) display the new row of cards as the top row of multi-card displays 634 a and 634 b, (3) move the former top row of cards down to be displayed as the middle row of cards; e.g., display cards 630 a, 630 b, and 630 c in the positions of cards 630 d, 630 e, and 630 f, and (4) move the former middle row of cards down to the bottom row; e.g., display cards 630 d, 630 e, and 630 f in the positions of cards 630 g, 640 a, and 640 b, thus removing the former bottom row of cards (cards 630 g, 640 a, and 640 b) from view on multi-card displays 634 a and 634 b.
  • In these embodiments, a downward head movement can: (1) bring in a new row of cards considered to be “below” the bottom row of cards of multi-card displays 634 a and 634 b, (2) display the new row of cards as the bottom row of multi-card displays 634 a and 634 b, (3) move the former bottom row of cards up to be displayed as the middle row of cards; e.g., display cards 630 g, 640 a, and 640 b in the positions of cards 630 d, 630 e, and 630 f, and (4) move the former middle row of cards up to the top row; e.g., display cards 630 d, 630 e, and 630 f in the positions of cards 630 a, 630 b, and 630 c, thus removing the former top row of cards (cards 630 a, 630 b, and 630 c) from view on multi-card displays 634 a and 634 b.
  • Scenario 620 continues with clutch 642 being released while sub-focus 638 is on card 630 g. Clutch 642 can be released by the wearer removing one or both of their fingers from the touch-based UI of the HMD. After clutch 642 is released, the HMD can use a single-card view to display either (a) card 630 d, as the card focused on before clutch operation 642 began, or (b) card 630 g, as the card focused on using sub-focus 638 just prior to release of clutch 642. In response to clutch 642 being released for HMD embodiments not using sub-focus 638, the HMD can use a single-card view to display card 630 d.
  • FIG. 6C shows a scenario 650 for using clutch operation 680 to generate a multi-timeline display 664 a, according to an example embodiment. Scenario 650 begins with an HMD displaying main timeline 660 with a focus on card 660 a. During scenario 650 prior to clutch 680, the HMD displays cards in main timeline 660 using a single-card view, displaying a focused-upon card. As the focus is on card 660 a, the HMD displays a single-card view of card 660 a.
  • Scenario 650 continues with a wearer of the HMD performing clutch operation 680. Upon recognition of clutch operation 680, in scenario 650, the HMD can generate and display multi-timeline display 664 a, which is shown in an expanded view as multi-timeline display 664 b. In some embodiments, the HMD can focus on the entire multi-timeline display 664 a using focus 666. In other embodiments, the HMD can focus on a subset of cards and/or timelines, such as, but not limited to, a single card, one, some, or all cards on a timeline, a column of cards across one or more timelines, a block of cards across multiple timelines, a single timeline, a group of timelines, or some other selection of cards and/or timelines, within multi-timeline display 664 a using sub-focus 668.
  • As shown in FIG. 6C using expanded multi-timeline display 664 b, the multi-timeline display shows five timelines (TLs): timelines 670, 672, 674, 676, and 678, with five cards displayed for each of timelines 670, 672, 674, 676, and 678. The timelines can be selected for display based on a type of object displayed in a card; e.g., a timeline having only photos, only photo bundles, only messages, only message bundles, or only cards representing active applications. Additional criteria can be used to further select items for a timeline; e.g., for photo objects, some criteria can be: only photos taken before (or after) a predetermined date, within a date range, at a location, as part of a photo bundle, photos that were shared, photos that were shared with one or more messages received in response, etc. Other criteria for photo objects and/or other types of objects are possible as well for selection in a timeline. For example, in scenario 650, all of the cards in timeline 670 represent photos in a photo bundle, all of the cards in timeline 672 represent photos taken in a given city location, and all of the cards in timeline 678 represent contacts that do not have associated photos/images.
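  • Selecting timelines by object type plus additional criteria, as above, can be sketched as a map from named predicates to card lists; the predicates and card attributes here are hypothetical stand-ins, not names from the disclosure.

```python
def build_timelines(all_cards, criteria, cards_per_timeline=5):
    """Map each named selection criterion to the first N matching cards."""
    return {name: [c for c in all_cards if predicate(c)][:cards_per_timeline]
            for name, predicate in criteria.items()}

# e.g., build_timelines(cards, {
#     "photo bundle": lambda c: c.kind == "photo" and c.bundle is not None,
#     "city photos":  lambda c: c.kind == "photo" and c.location == "Anytown",
#     "contacts":     lambda c: c.kind == "contact" and c.image is None,
# })
```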
  • The additional timelines presented can represent different user accounts associated with the HMD, for example, a first timeline could be cards generated by a user's work account, e.g. photos, events, contacts, email, messages, sent to or received by his/her work account, e.g. user@google.com. In this example, the HMD could be configured to allow access to multiple user accounts, such as the user's personal account, e.g. user@gmail.com; such that a second timeline accessible from the grid view could be cards generated by the user's personal account, e.g. photos, events, contacts, email, messages, sent to or received by his/her personal account. This way, the user can easily interact with the HMD via different profiles or personas, such as work or personal.
  • The timelines can be selected to be part or all of the main timeline; for example, FIG. 6C shows that timeline 674 includes five cards selected from main timeline 660. Cards can be selected from main timeline 660 randomly, based on focus 662, based on a type of object represented on the main timeline; e.g., select only cards representing active applications visible from the main timeline, and/or based on other criteria. For example, in scenario 650, timeline 674 includes card 660 a, which was the focused-on card prior to clutch 680, and the two cards on each side of card 660 a in main timeline 660. Other criteria for selecting cards from a main timeline are possible as well.
  • One or more timelines can act as contextual menu(s) for multi-timeline display 664 a, including possible operations that can be performed from multi-timeline display 664 a, operations on multi-timeline display 664 a, and/or other operations. For example, timeline 678 includes a menu of operations including navigate, take a video, take a photo, remove a timeline option, and add a timeline. Other operations are possible as well. For example, if clutch is engaged from card 660 a in main timeline 660, the multi-timeline display 664 a could present a contextual menu of operations that could be executed based off of the presently selected card 660 a, e.g. share this card, delete the card, remove from timeline, add to bundle, etc.
  • In one embodiment, the wearer of the HMD can navigate around multi-timeline display 664 a, 664 b using head movements. For example, in scenario 650, the HMD is configured to display sub-focus 668, shown as a dotted line on both multi-timeline displays 664 a and 664 b, focusing on a single timeline; e.g., timeline 674.
  • In one example of scenario 650, “wrap-around” movements, or moving off the end of a row or column to the respective other end of the row or column, are enabled. Then, in response to respective movements upward, downward, leftward, or rightward by the head of the wearer, the sub-focus 668 can move from timeline 674, as shown in FIG. 6C, to respective timelines 672, 676, 672, or 676. In particular embodiments, wrap-around can be inhibited, so moving the head of the wearer leftward will not move sub-focus 668 from timeline 674 to timeline 672, and moving the head of the wearer rightward will not move sub-focus 668 from timeline 674 to timeline 676; rather, sub-focus 668 will stay on timeline 674 in response to either the leftward or the rightward movement.
  • In some embodiments, in response to respective movements diagonally up-and-left, up-and-right, down-and-left, and down-and-right by the head of the wearer with wrap-around enabled, the sub-focus 668 can move from timeline 674, as shown in FIG. 6C, to respective timelines 672, 672, 676, and 676. In particular embodiments, wrap-around can be inhibited, but as each of the diagonal movements has an up or down component, movement to a respective timeline will succeed when sub-focus 668 is on timeline 674.
  • In some embodiments, sub-focus 668 may not be used. For example, in these embodiments, a leftward head movement can move each of timelines 670, 672, 674, 676, and 678 to the left on multi-timeline display 664 a, 664 b by one or more cards, and a rightward head movement can move each of timelines 670, 672, 674, 676, and 678 to the right on multi-timeline display 664 a, 664 b by one or more cards. Also in these embodiments, an upward head movement can bring a timeline “above” timeline 670 (not shown in FIG. 6C) into view as the top-most timeline on multi-timeline displays 664 a and 664 b, move down each of timelines 670, 672, 674, and 676 by one timeline on multi-timeline displays 664 a and 664 b, and remove timeline 678 from view. Further, a downward head movement can bring a timeline “below” timeline 678 (not shown in FIG. 6C) into view as the bottom-most timeline on multi-timeline displays 664 a and 664 b, move up each of timelines 672, 674, 676, and 678 by one timeline on multi-timeline displays 664 a and 664 b, and remove timeline 670 from view.
  • Other types of head movements and/or UI operations can be used as well or instead with multi-timeline display 664 a, 664 b, including but not limited to head movements and/or UI operations that move the focus faster than and/or slower than one timeline at a time, enable navigation of cards within a timeline, which can include some or all of the navigation techniques discussed above regarding multi-card displays 634 a and 634 b, zooming in and out, reshaping sub-focus 668, selecting card(s)/timeline(s), and deselecting card(s)/timeline(s).
  • Scenario 650 continues with clutch 680 being released while sub-focus 668 is on timeline 670. After clutch 680 is released, the HMD can use a single-card view to display a card on selected timeline 670.
  • FIG. 6D shows scenario 682 for using head movements to navigate a multi-timeline display, according to an example embodiment. Scenario 682 begins with the HMD displaying a single-card view 684 of a contact named “George Farley” participating in a hangout, as shown at the upper-left hand corner of FIG. 6D. A hangout can be indicated by the HMD using icon 684 a of a camera inside of a speech balloon. Scenario 682 continues with the wearer of the HMD performing a clutch operation, or pressing two fingers on the touch-based UI of the HMD for at least one second.
  • After determining a clutch operation was performed, the HMD can generate multi-timeline display 686 a, shown in the upper-right-hand corner of FIG. 6D as a rectangle with thick lines. Multi-timeline display 686 a is shown displaying a focus 688 a and parts of three timelines, including timeline (TL) 690 a. In scenario 682, focus 688 a, shown in FIG. 6D as a circular arrangement of gray trapezoids, rests or focuses on card 684. Focus 688 a rests on card 684, as card 684 was the card previously being displayed in a single-card view. In one embodiment, the focus 688 a element may not be presented.
  • During scenario 682, head movements can be used to target items and move between levels of navigation. Each level of navigation can be represented in a multi-timeline display as one or more cards on a timeline. For example, multi-timeline display 686 a shows that if the wearer made a leftward head movement, card 692 a on timeline 690 a, representing a navigation application/process, would be centered on by focus 688 a. Multi-timeline display 686 a also shows that if the wearer made a rightward head movement, card 692 b on timeline 690 a, representing a weather application, would be centered on by focus 688 a. Similarly, multi-timeline display 686 a shows that if the wearer made respective upward or downward head movements, respective cards 692 c or 692 d would be centered on by focus 688 a.
  • Scenario 682 continues with the wearer making a downward head tilt. After determining a downward head movement was performed, the HMD can move focus 688 a downward onto card 692 d with text of “expand”. The HMD can generate multi-timeline display 686 b with focus 688 b on card 692 d, as shown in the center-left portion of FIG. 6D. Multi-timeline display 686 b shows that card 692 d is part of timeline 690 b.
  • Timeline 690 b represents a contextual menu for the hangout, which includes card 692 d to expand, or show other members in, the hangout, as well as cards to invite (request that other people join the hangout), end the hangout, and mute sound from one or more persons at the hangout. Below timeline 690 b, a card 694 a representing an attendee of the hangout is shown, in part to represent the next level of navigation if the wearer were to decide to make another downward head motion.
  • Scenario 682 continues with the wearer of the HMD making another downward head motion. After determining a downward head movement was performed, the HMD can move focus 688 b downward onto card 694 a, which represents George Farley as a hangout attendee.
  • The HMD can generate multi-timeline display 686 c with focus 688 c on card 694 a, as shown in the center-right portion of FIG. 6D. Multi-timeline display 686 c shows that card 694 a is part of timeline 690 c, which represents attendees of the hangout. FIG. 6D shows that there are three other attendees at the hangout beyond the wearer: Pieter Vrijman represented by card 694 b, George Farley represented by card 694 a, and Richard The, who is represented by card 694 c. Below card 694 a is card 696 a with text of “mute”, representing a contextual menu of operations regarding attendees of hangouts. Card 696 a also represents the next level of navigation if the wearer were to decide to make another downward head motion.
  • Scenario 682 continues with the wearer of the HMD making a rightward head motion. After determining a rightward head movement was performed, the HMD can move focus 688 c rightward onto card 694 c, which represents Richard The. The HMD can generate multi-timeline display 686 d with focus 688 d on card 694 c, as shown in the lower-left corner of FIG. 6D. Below card 694 c is card 696 b with text of “mute”, representing a contextual menu of operations regarding attendees of hangouts and the next level of navigation corresponding to downward head movements.
  • Scenario 682 continues with the wearer releasing his/her fingers from the touch-based UI of the HMD, thereby ending the clutch operation. After determining the clutch operation has completed, the HMD can revert to a single-card view as shown at the lower right-hand corner of FIG. 6D. In some embodiments, the single-card view can display the last-focused card from the multi-timeline display. For example, the last focus; e.g., focus 688 d, during multi-timeline display was on card 694 c representing Richard The. Then, the HMD can display last-focused card 694 c in a single-card view to end scenario 682.
  • The user interface can use contextual menus to designate operations for specific objects, applications, and/or cards. FIG. 7 shows user-interface scenario 700 including contextual menus, according to an example embodiment. A contextual menu is a menu of operations or other possible selections that are based on a card. For example, if the card is a card representing a video, a contextual menu can include more or fewer operations, such as sharing the video, editing the video, watching the video, deleting the video, adding the video to a “video bundle” or collection of videos, annotating the video, adding, deleting, and/or editing sound associated with the video, and/or other operations related to the video.
  • Scenario 700 begins with the HMD receiving a tap while displaying image 710. In some embodiments, image 710 is part of a timeline. In response to the tap, the HMD can select operations for a contextual menu, such as sharing and deleting the photo, based on the displayed card; e.g., image 710. To display the contextual menu, the HMD can then display card 720 to indicate that a share operation can be performed on image 710. Card 720 also shows two dots to indicate that the current contextual menu has two options, with the leftmost dot being black and the rightmost dot being white to indicate that the current Share option is the first option of the two options.
  • To select the other option in the contextual menu, a wearer can perform a swipe operation while card 720 is displayed. In response to the swipe operation, card 722 can be displayed, where card 722 is associated with a delete operation for image 710. As with card 720, card 722 shows two dots to indicate that the current contextual menu has two options, with the leftmost dot being white and the rightmost dot being black to indicate that the current Delete option is the second option of the two options. A swipe operation while displaying card 722 causes (re)display of card 720.
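  • The two-option menu and its dot indicator can be sketched as follows: a swipe cycles between options, and the filled dot tracks the current one. The class and rendering below are illustrative only.

```python
class ContextualMenu:
    def __init__(self, options):
        self.options = options     # e.g., ["Share", "Delete"]
        self.current = 0

    def swipe(self):
        """Advance to the next option, wrapping past the last one."""
        self.current = (self.current + 1) % len(self.options)
        return self.options[self.current]

    def dots(self):
        """One filled dot for the current option, hollow for the others."""
        return "".join("●" if i == self.current else "○"
                       for i in range(len(self.options)))

menu = ContextualMenu(["Share", "Delete"])
print(menu.dots())   # "●○" while Share is current; after menu.swipe(), "○●"
```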
  • If a tap operation is received while displaying card 720, the HMD can interpret the tap operation as selection of the Share option of the contextual menu. In response, a “people chooser” can be used to select a first person for sharing.
  • The people chooser can display card 730, which includes an image and a name of a first contact. FIG. 7 shows that card 730 indicates the first person as “Jane Smith”. In response to viewing card 730, the wearer can instruct the people chooser to show other possible recipients of photo 710 via swiping through a list of contacts. In scenario 700, the list of contacts can be represented by cards that include: card 732 a showing “Another Person”, card 732 b showing “Friends”, and card 732 c indicating other person(s), circle(s), and/or social network(s) for sharing photos. People choosers are also discussed in more detail at least in the context of FIG. 8.
  • FIG. 7 shows that swiping left while card 732 c is displayed to request a next possible recipient can lead to re-displaying card 730 associated with Jane Smith. Similarly, FIG. 7 shows that swiping right while card 730 is displayed to request a previous possible recipient can lead to card 732 c.
  • In scenario 700, the wearer taps on the touch-based UI while card 730 is displayed, indicating that the wearer wishes to share image 710 with Jane Smith. In response to this tap, card 734 is displayed, which includes the word “Sending” and a progress bar. In scenario 700, the HMD is configured to wait for a “grace period”, such as one or a few second(s), before sending or deleting images, to give the wearer a brief interval to cancel sending or deleting the image.
  • The progress bar on card 734 can show the passing of the time of the grace period for sending image 710. Once the grace period expires or a tap is received, the HMD can send image 710, e.g., via e-mail or multi-media message, to Jane Smith. If image 710 is sent successfully, the HMD can display card 736 with text of “Sent” to indicate that image 710 was indeed successfully sent to Jane Smith. After displaying card 736, the HMD can return to a timeline display, such as discussed above in the context of at least FIG. 5A.
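  • The grace period amounts to a cancellable timer: the action fires when the timer expires or a tap is received, and a swipe down cancels it. The sketch below uses Python's threading.Timer purely for illustration; the action callback and all names are assumptions.

```python
import threading

class GracePeriod:
    """Delay an action briefly so the wearer can cancel or hurry it."""

    def __init__(self, action, seconds=1.0):
        self.action = action
        self.timer = threading.Timer(seconds, action)
        self.timer.start()       # a progress bar would run alongside this

    def tap(self):
        """Skip the rest of the grace period and act immediately."""
        self.timer.cancel()
        self.action()

    def swipe_down(self):
        """Cancel the pending send/delete before the timer expires."""
        self.timer.cancel()

# e.g., pending = GracePeriod(lambda: send_image(image_710, jane_smith)),
# where send_image, image_710, and jane_smith are hypothetical stand-ins.
```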
  • If image 710 is not sent successfully or the send was cancelled, such as by the wearer performing a swipe down operation during the grace period, the HMD can display card 738 to indicate to the wearer that the HMD was unsuccessful in sending image 710 to Jane Smith. After displaying card 738, the HMD can return to a timeline display, such as discussed above in the context of at least FIG. 5A.
  • If a tap operation is received while displaying card 722, which FIG. 7 shows is the “Delete” card, the HMD can interpret the tap operation as selection of the Delete option of the contextual menu. In response to this tap, the HMD can display card 740 with text of “Deleting” and a progress bar for a grace period that has to expire before the HMD will delete image 710. Once the grace period expires or a tap is received, the HMD can delete image 710. Once image 710 is deleted, the HMD can display card 742 to indicate to the wearer that image 710 was indeed deleted. After displaying card 742, the HMD can return to a timeline display, such as discussed above in the context of at least FIG. 5A.
  • FIG. 7 also shows that at any time while displaying cards 720, 722, 730, 732 a-732 c, 734, 736, 740, and 742, a swipe down operation can be performed. In response, the HMD can stop the current operation; e.g., send or delete, and return to displaying image 710.
  • The UI can utilize “people choosers”, or software configured to help a wearer find a person from among the wearer's contacts, such as when the wearer wants to contact the person. FIG. 8 shows a user-interface scenario 800 including a people chooser, according to an example embodiment. In scenario 800, two techniques are shown for invoking the people chooser. While card 810 is displayed, a wearer of an HMD can use a voice interface that requests that the wearer “Speak a name from your contacts.” Also or instead, at 812, the HMD can be in a contextual menu with a “Share” option that is selected.
  • After either card 810 or 812 is displayed, the people chooser is invoked to permit selection of a person or “contact” as a destination for sharing, a party to be called, an entry to be looked up in a contact directory, or a target of some other activity. The people chooser sorts contacts by frequency of use, rather than by time of use (e.g., recency), making it a useful alternative to the timeline.
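  • The frequency ordering could be sketched as below, assuming each contact carries a hypothetical usage counter; ties are broken by name for a stable display order.

```python
def chooser_order(contacts):
    """Most frequently used contacts first; ties broken alphabetically."""
    return sorted(contacts, key=lambda c: (-c.use_count, c.name))

# The first card displayed (e.g., card 820 for "Jane Smith" in FIG. 8)
# would be chooser_order(contacts)[0]; each swipe advances through the
# list, wrapping back to the start once every contact has been shown.
```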
  • FIG. 8 shows that card 820 is selected for display by the people chooser. Card 820 represents “Jane Smith”. In scenario 800, Jane Smith is the most frequently used contact. Card 820 includes the contact's name, Jane Smith, and an image related to the contact, e.g., a picture of Jane Smith. After reviewing the card shown at 820, the wearer of the HMD can either tap the touch-based UI to select “Jane Smith” as the person for the activity (e.g., sharing, calling, etc.) that led to invocation of the people chooser, or swipe the touch-based UI to view other contacts.
  • If a tap is received while card 820 is shown, the HMD can then take action 822 with the choice. If a swipe is received while card 820 is displayed, then another card can be displayed for the next-most frequently used contact; e.g., card 824 for “Another Person”. To select “Another Person” for the action while card 824 is displayed, a wearer can either tap the HMD using the touch-based UI or say the person's name, e.g., “Another Person”, using the voice-based interface. If “Another Person” is selected, the HMD can carry out the action with “Another Person”.
  • Otherwise, “Another Person” is not selected. Then, the wearer can swipe again, and another card can be displayed for a group of contacts, such as card 826 for “Friends”. To select the “Friends” group for the action while card 826 is displayed, a wearer can either tap the HMD using the touch-based UI or say the group's name, e.g., “Friends”, using the voice-based interface. If the “Friends” group is selected, the HMD can provide cards in the “Friends” group in response to swipe actions until either a contact in the “Friends” group is selected or the “Friends” group is exhausted without the wearer making a selection. Each item in the “Friends” group, or friend, can be a contact or other representation of a person, organization, group, family, etc. that the wearer has designated as a friend. In one embodiment, the “Friends” group can be a bundle or folder that enables access to the items or friends within the bundle or folder. In one embodiment, the “Friends” group can be a group of friends ordered based on time of friend designation, most recent access, or some other criteria.
  • Otherwise, “Friends” is not selected. Then, the wearer can swipe while card 826 is displayed to bring up card 828, representing another contact frequently called by the wearer. Scenario 800 can continue with swipes that show contacts until either a contact is selected or all contacts have been displayed. If all contacts have been displayed, after displaying the last contact, the HMD can “wrap around”, or return to, the first displayed card; e.g., card 820 representing “Jane Smith”.
  • As mentioned above, the HMD can be configured with a camera, and the UI can aid wearer interaction with the camera. FIG. 9 shows a user-interface scenario 900 with camera interactions, according to an example embodiment. Scenario 900 can begin by displaying card 910 or card 930 for an HMD configured with one or more cameras that can perform at least the activities described herein.
  • While displaying card 910, at any point while utilizing the UI of the HMD, the camera button; e.g., button 179 of HMD 172 shown in FIG. 1D, can be pressed for either a short time; e.g., less than one second, or a long time; e.g., longer than the short time. If the camera button is pressed for the short time, also referred to as a “short press” of the camera button, scenario 900 continues by displaying card 920. Otherwise, if the camera button is pressed for the long time, also referred to as a “long press” of the camera button, scenario 900 continues by displaying card 934.
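  • Classifying the camera-button press reduces to comparing the press duration against the one-second example boundary above; the timestamps and names in this sketch are illustrative.

```python
SHORT_PRESS_MAX = 1.0   # seconds; the example boundary given above

def classify_press(down_time, up_time):
    """'short' routes to still-photo capture; 'long' routes toward video."""
    return "short" if (up_time - down_time) < SHORT_PRESS_MAX else "long"
```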
  • In response to the short press of the camera button, a photo or still image is captured using the camera—an example image capture is shown as card 920. If, after capturing the photo, a tap is received, scenario 900 continues by displaying card 922; otherwise, if either a swipe down is received or no interaction with the touch-based UI is recorded during a wait interval; e.g., one second, scenario 900 continues by displaying card 924.
  • Card 922 is part of a contextual menu with options for operating on the captured photo. The contextual menu can include options such as a share option for the captured photo; e.g., as indicated by the “Share” card shown at 922, a delete option for the captured photo, and other options for the captured photo (e.g., editing the photo).
  • Card 924 shows the captured photo as “animated out”; that is, the image of the captured photo is replaced with a blank card shown as card 926 via an animated transition. After displaying card 926, the HMD can return to a previous state; e.g., a position in the timeline being displayed at 910 before receiving the short press of the camera button.
  • After displaying a home card, such as card 300 shown in FIG. 3, a tap can be received via the touch-based UI. In response to the tap, the HMD can display a “Capture” card, such as card 930. After displaying card 930, scenario 900 can continue with a display of card 932.
  • Card 932 is shown in FIG. 9 as a “Photo” card, indicating to the wearer that a photo or still image can be captured using the camera. If a swipe is received while displaying card 932, scenario 900 can continue by displaying card 934; otherwise, scenario 900 can continue at card 940.
  • Card 934 is shown in FIG. 9 as a “Video” card to indicate to the wearer that a video can be captured using the camera. If a swipe is received while displaying card 934, scenario 900 can continue by displaying card 936. In one embodiment, multiple camera operations can occur simultaneously; e.g., the HMD can perform some or all of recording video, capturing still images, capturing timelapse images, and conducting video conferencing at the same time. In more particular embodiments, the HMD can perform multiple camera operations and/or multiple telephone operations simultaneously; e.g., the HMD can, while performing multiple camera operations, conduct one or more two-party or multi-party voice calls, dial one or more parties, have one or more voice calls on hold, forward one or more voice calls, and perform other telephone operations.
  • Otherwise, the HMD can determine whether a new video session is to be started to capture the requested video or if a pending video session is to be rejoined. If the new video session is to be started, the HMD can trigger the camera to start recording images (if not already recording) and scenario 900 can continue by displaying card 950. If the pending video session is to be rejoined, the HMD can redirect to, or request display of, an already-existing card for the pending video session and scenario 900 can continue by displaying a card for the pending video session, shown in FIG. 9 as card 952.
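  • The new-versus-pending decision is essentially a get-or-create lookup: redirect to an existing session's card if one is live, otherwise start recording and create a fresh card. The sketch below uses hypothetical Session, camera, and make_card names.

```python
from dataclasses import dataclass

@dataclass
class Session:
    card: object
    active: bool = True

def video_session_card(sessions, camera, make_card):
    """Return the card to display for a requested video capture."""
    pending = next((s for s in sessions if s.active), None)
    if pending:
        return pending.card            # rejoin: redirect to existing card
    if not camera.recording:
        camera.start_recording()       # trigger capture for the new session
    card = make_card()
    sessions.append(Session(card=card))
    return card
```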
  • Card 936 is shown in FIG. 9 as a “Timelapse” card to indicate to the wearer that a timelapse image can be captured using the camera. If a swipe is received while displaying card 936, scenario 900 can continue by displaying card 932.
  • Otherwise, the HMD can determine whether a new timelapse session is to be started to capture the requested timelapse image or if a pending timelapse session is to be rejoined. If the new timelapse session is to be started, the HMD can trigger a timelapse card to start displaying a timelapse image being captured by the camera (if not already recording) and scenario 900 can continue by displaying card 960. If the pending timelapse session is to be rejoined, the HMD can redirect to an already-existing card for the pending timelapse session and scenario 900 can continue by displaying a card for the pending timelapse session, shown in FIG. 9 as card 962.
  • Upon displaying card 940, the HMD can launch a temporary view finder and instruct the camera to begin capturing images. Upon capturing each image, the HMD can display the image. While displaying the image, the wearer can either (a) provide a tap to the HMD and scenario 900 can continue by displaying card 942 or (b) provide a swipe down using the HMD and scenario 900 can continue by displaying card 944.
  • Upon displaying card 942, the HMD can capture an image using the camera. Once captured, the HMD can display the captured image for a short period of time; e.g., one or a few seconds. After displaying the captured image for the short period, scenario 900 can proceed to display card 940.
  • Upon displaying card 944, which is a blank card, any image for possible capture, e.g., card 940, animates out. In some embodiments, the camera can be deactivated after animating out the image, if no other application; e.g., video, is using the camera. After displaying card 944, the HMD can return to a previous state; e.g., a position in the timeline being displayed at 910 before reaching 944.
  • Card 950 can be a card representing the new video session. While the video session is active, the HMD can capture images and, in some embodiments, sound, and store the captured video. Upon capturing each image for the video session, the HMD can display the captured image using card 950, which represents the new video session. While displaying the images for the video session using card 950, the wearer can either (a) provide a tap to the HMD and scenario 900 can continue by displaying card 954 or (b) provide a swipe down using the HMD and scenario 900 can continue by displaying card 956.
  • Card 952 can be a card representing the pending video session. While the video session is active, the HMD can capture images, and in some embodiments, sound, and store the captured video. Upon capturing each image for the video session, the HMD can display the captured image using the card 952, which represents the pending video session. While displaying the images for the video session using card 952, the wearer can either (a) provide a tap to the HMD and scenario 900 can continue by displaying card 954 or (b) provide a swipe down using the HMD and scenario 900 can continue by displaying card 956.
  • Card 954 can represent a contextual menu with options for the captured video. The contextual menu can include options for the captured video, such as a stop recording option, restart recording option, delete video option, and other options.
  • Card 956 can be a blank card indicating to the wearer that the video session has terminated. In some embodiments, the captured video can be deleted after the video session is stopped, while in other embodiments, the captured video and any associated audio can remain in storage after the video session is stopped. In some embodiments, the camera can be deactivated if no other application; e.g., a timelapse photo capture, is using the camera. In other embodiments, after displaying the blank card, the HMD can return to a previous state; e.g., a position in the timeline being displayed using card 910 before card 956 was ever displayed.
  • Card 960 can represent the new timelapse session. While the new timelapse session is active, the HMD can capture images for addition to the timelapse image. Upon capturing each image for the timelapse session, the HMD can display image(s) related to the new timelapse session using card 960. While displaying card 960, the wearer can either (a) provide a tap to the HMD and scenario 900 can continue by displaying card 964 or (b) provide a swipe down using the HMD and scenario 900 can continue by displaying card 966.
  • Card 962 can represent the pending timelapse session. While the pending timelapse session is active, the HMD can capture images for addition to the timelapse image. Upon capturing each image for the timelapse session, the HMD can display image(s) related to the pending timelapse session using card 962. While displaying card 962, the wearer can either (a) provide a tap to the HMD and scenario 900 can continue by displaying card 964 or (b) provide a swipe down using the HMD and scenario 900 can continue by displaying card 966.
  • Card 964 can represent a contextual menu with options for the captured timelapse image. The contextual menu can include options for the captured timelapse image, such as a stop timelapse option, a timelapse frequency option, a restart timelapse option, and other options.
  • Card 966 can be a blank card that indicates to the wearer that the timelapse session has terminated. In some embodiments, the captured timelapse image can be deleted after the timelapse session is stopped, while in other embodiments, the captured timelapse image can remain in storage after the timelapse session is stopped. In some embodiments, the camera can be deactivated if no other application; e.g., video is using the camera. In other embodiments, after displaying the blank card, the HMD can return to a previous state; e.g., a position in the timeline being displayed using card 910 before card 966 was ever displayed.
  • Objects, such as photos and messages, can be grouped or “bundled” by the UI to simplify interactions with these bundles. FIG. 10A shows user-interface scenario 1000 with photo bundles, according to an example embodiment. Scenario 1000 begins with an HMD displaying photo bundle card (PBC) 1010 in a timeline. Photo bundle card 1010 includes photo bundle indicator (PBI) 1010 a, example photo 1010 b, and thumbnails 1010 c. Photo bundle indicator 1010 a, shown in FIG. 10A as a page with a turned-down corner, indicates that a “photo bundle” or collection of photos is associated with photo bundle card 1010. Example photo 1010 b, shown in FIG. 10A as occupying roughly one-half of photo bundle card 1010, provides a relatively large image of an example photo in the photo bundle. Thumbnails 1010 c, shown in FIG. 10A as collectively occupying roughly one-half of photo bundle card 1010, provide four relatively small images of four example photos in the photo bundle.
  • While displaying photo bundle card 1010, the wearer of the HMD can tap on a touch-based UI to instruct the HMD to display the photos in the photo bundle. During scenario 1000, while displaying photo bundle card 1010, the HMD can receive a tap and subsequently display a card with photo 1012.
  • Each individual item within a bundle, e.g., a photo within a photo bundle, functions the same with respect to the user interface as it would if the item were displayed on the timeline. For example, in the case of a photo, such as photo 1012, tapping on the touch-based UI would enter a contextual menu for the photo, and swiping down while in the contextual menu would return to photo 1012.
  • While displaying photo 1012, the HMD can receive a swipe forward to display the next photo in the bundle or a swipe backward to display the previous photo in the bundle. In scenario 1000 as shown in FIG. 10A, the next photo can be photo 1014. As photo 1012 is the first photo in the bundle, the previous photo is the last photo in the bundle, or photo 1018.
  • During scenario 1000, the HMD receives a swipe backward while displaying photo 1012. In response to the swipe backward, the HMD can display photo 1018 as discussed above. Scenario 1000 continues with the HMD receiving two more swipes backwards. In response, the HMD can first display photo 1016 which is the previous photo to photo 1018, and, after receiving the second swipe backward, display photo 1014 which is the previous photo to photo 1016 as shown in FIG. 10A.
  • While displaying photo 1014, the HMD can receive a tap. In response to the tap, the HMD can display photo bundle card 1010 and scenario 1000 can end.
  • FIG. 10B shows user-interface scenario 1050 with message bundles, according to an example embodiment. Scenario 1050 begins with an HMD displaying message bundle card (MBC) 1060 in a timeline. Message bundle card 1060 includes message bundle indicator (MBI) 1060 a and a most-recent message in the message bundle, which includes image 1060 b and message 1060 c. Message bundle indicator 1060 a, shown in FIG. 10B as a page with a turned-down corner, indicates that a “message bundle” or collection of messages is associated with message bundle card 1060. Image 1060 b can be an image associated with the sender of the most-recent message in the message bundle. Message 1060 c can include text, and in some embodiments, other type(s) of data, that is sent with the most-recent message in the message bundle. As shown in FIG. 10B, image 1060 b, which occupies roughly one-third of message bundle card 1060, is an image of “Joe W.”, who sent message 1060 c; message 1060 c occupies roughly two-thirds of message bundle card 1060, includes text that says “Sounds great. See you there,” and was sent three minutes ago.
  • In scenario 1050, while displaying message bundle card 1060, the wearer of the HMD can tap on a touch-based UI. Some bundles have additional functionality, specific to the bundle, associated with a tap. In the example of the message bundle, a contextual menu can be displayed in response to the tap. FIG. 10B shows two options in the contextual menu: a reply option associated with card 1070 and a read-all option associated with card 1072.
  • While card 1070 associated with the reply option is displayed, the HMD can receive a tap. In response, the HMD can interpret the tap as a selection to reply to the most recently displayed message card. While card 1072 associated with the read-all option is displayed, the HMD can receive a tap, which can be interpreted as an instruction to read the messages in the message bundle, starting with the most recent. In one embodiment, the HMD can start with the first message in the message bundle rather than the most recent. In response to receiving a swipe down while in the contextual menu for message bundles, the HMD can select message bundle card 1060 for display.
  • Each individual item within a bundle, e.g., a message within a message bundle, functions the same with respect to the user interface as it would if the item were displayed on the timeline. For example, in the case of a message, such as message 1062, tapping on the touch-based UI would enter a contextual menu for the message, and swiping down while in the contextual menu for the message would return to message 1062.
  • While displaying message 1062, the HMD can receive a swipe forward to display the next message in the bundle or a swipe backward to display the previous message in the bundle. In scenario 1050 as shown in FIG. 10B, the previous message can be message 1064. As message 1062 is the first message in the bundle, there is no “next” message, so the last message in the bundle, or message 1066, can be displayed instead.
  • During scenario 1050, the HMD receives a swipe forward while displaying message 1062. In response to the swipe forward, the HMD can display message 1066 as discussed above. Scenario 1050 continues with the HMD receiving two more swipes forward. In response, the HMD can first display message 1064, which is the next message to message 1066, and, after receiving the second swipe forward, display message 1062, which is the next message to message 1064 as shown in FIG. 10B.
  • While displaying message 1062, the HMD can receive a tap. In response to the tap, the HMD can enter a contextual menu for message 1062 and scenario 1050 can end.
  • The HMD has various settings, including settings for networks such as WiFi and Bluetooth networks. FIG. 11 shows user-interface scenario 1100 with timeline 1110 including settings cards 1120, 1130, according to an example embodiment. As shown in FIG. 11, timeline 1110 has two settings cards 1120 and 1130 at the now/future end of the timeline. As shown in FIG. 11, both cards 1120 and 1130 permit interaction with various “settings”, e.g., controls, preferences, data, and/or other information, in response to a tap input of the touch-based user interface.
  • Card 1120 is related to wireless network (“WiFi”) settings, which can be settings related to wireless networks operating using one or more protocols, such as IEEE 802.11 protocols, which are discussed in more detail below in the context of FIG. 12. Card 1130 is related to Bluetooth settings, which can be settings related to short range wireless networks operating using one or more Bluetooth protocols, which are discussed in more detail below in the context of FIG. 13.
  • FIG. 12 shows user-interface scenario 1200 related to WiFi settings, according to an example embodiment. Scenario 1200 begins with an HMD displaying card 1210. Card 1210 indicates that the HMD is connected via WiFi to a wireless network named “GGuest.”
  • During scenario 1200, in response to viewing card 1210, a wearer of the HMD taps the touch-based UI of the HMD. In response, the HMD displays card 1220, indicating both that the HMD is connected to GGuest and a map of the general area around the HMD.
  • After viewing card 1220, the wearer can swipe next through cards 1230 and 1240, which indicate available WiFi networks, card 1250 to begin the process to add another WiFi network, and card 1260 to turn off the WiFi functionality of the HMD. In some embodiments, swiping next after displaying card 1260 leads to display of card 1220. In other embodiments, swiping previous after displaying card 1220 leads to display of card 1260.
  • In response to tapping while displaying card 1220, the HMD displays card 1222 with text of “Forget”. After viewing card 1222 during scenario 1200, the wearer can use the touch-based UI of the HMD to either (a) tap to instruct the HMD to begin a process of forgetting, e.g., deleting stored information, about the currently connected WiFi network, or (b) swipe to bring up card 1232 with text of “Disconnect” to begin a process of disconnecting from the currently connected WiFi network. In scenario 1200, the currently connected WiFi network would be GGuest, as card 1220 was reached after tapping card 1210, and card 1210 is associated with the GGuest WiFi network.
  • During one aspect of scenario 1200, the wearer taps on the touch-based UI of the HMD while card 1222 is displayed to instruct the HMD to forget about the GGuest network. The process of forgetting about a WiFi network is associated with a grace period to permit the wearer to reconsider. In response to the tap operation, the HMD can display card 1224 with text of “Forgetting” and progress bar 1224 a. Progress bar 1224 a can take a length of time, such as equal to or greater than the grace period, to complete display. After progress bar 1224 a is completely displayed, the grace period is deemed to have expired.
  • Once the grace period expires or a tap is received during display of card 1224, the HMD can delete stored information about the currently connected WiFi network and display card 1226 indicating the currently connected WiFi network is now forgotten. After displaying card 1226, the HMD can return to the settings context menu.
  • During another aspect of scenario 1200, the wearer taps on the touch-based UI of the HMD while card 1232 is displayed to instruct the HMD to disconnect from the GGuest network. The process of disconnecting from a WiFi network is associated with a grace period to permit the wearer to reconsider. In response to the tap operation, the HMD can display card 1234 with text of “Disconnecting” and progress bar 1234 a. Progress bar 1234 a can take a length of time, such as equal to or greater than the grace period, to complete display. After progress bar 1234 a is completely displayed, the grace period is deemed to have expired.
  • Once the grace period expires or a tap is received during display of card 1234, the HMD can disconnect from the currently connected WiFi network and display card 1236 indicating that the HMD is now disconnected from the previously-connected WiFi network. After displaying card 1236, the HMD can return to the settings context menu.
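  • The forget and disconnect flows above, and the turn-off flow described later in scenario 1200, share one pattern: a progress bar is displayed for at least the grace period, and the operation commits on expiry or on an early tap. The following is a minimal Python sketch of that pattern; the three-second grace period and the polling-style input check are illustrative assumptions, not values from the text:

```python
import time

def run_with_grace_period(action, show_progress, tap_received,
                          grace_period_s=3.0):
    """Fill a progress bar over at least the grace period; perform the
    operation once the period expires, or early if the wearer taps.
    Cancellation paths (e.g., swiping away) are omitted for brevity."""
    start = time.monotonic()
    while True:
        elapsed = time.monotonic() - start
        show_progress(min(elapsed / grace_period_s, 1.0))  # fill 0.0 .. 1.0
        if elapsed >= grace_period_s or tap_received():
            break                    # grace period expired or tap received
        time.sleep(0.05)             # illustrative polling interval
    action()                         # e.g., delete stored credentials

# Example: "forget" the GGuest network with no early tap.
run_with_grace_period(action=lambda: print("GGuest forgotten"),
                      show_progress=lambda fraction: None,
                      tap_received=lambda: False)
```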
  • Card 1230 displays information about a nearby WiFi network named “GA Ntwk”, including the network's use of Wired Equivalent Privacy or “WEP”, and a map with location information about “GA Ntwk.” In response to tapping while displaying card 1230, the HMD attempts to connect to the “GA Ntwk” network and displays card 1244 with text of “Connecting.”
  • After displaying card 1244, if the HMD is able to successfully connect to the WiFi network, the HMD will display card 1246 with text of “Connected” and return to the settings context menu. If the HMD is unable to successfully connect to the WiFi network, e.g., the network is not open access and requires authentication for access, the HMD will display card 1248 with text of “Failed” and return to the previous card, e.g., card 1230, to request additional input related to the “GA Ntwk” network. In some embodiments, the HMD can automatically attempt WiFi reconnection upon (initial) failure. In particular embodiments, the HMD will automatically attempt WiFi reconnection for a fixed number of attempts before indicating failure. If the HMD automatically reattempts WiFi connection upon failure, the HMD can display card 1244 as the “previous” card.
  • Card 1240 displays information about a nearby WiFi network named “Coffee Shop” including a map with location information about “Coffee Shop”. In response to tapping while displaying card 1240, the HMD can determine that the “Coffee Shop” network is secured and display card 1242. Card 1242 displays an icon of a Quick Response (QR) code, text to “Enter Password”, and a hint of “Generate QR code at <Example URL>.”
  • In scenario 1200, the QR code is provided to the HMD. For example, the QR code can be on a sticker, poster, paper, or otherwise displayed at the wearer's location; e.g., the “Coffee Shop” location. As another example, the QR code can be generated via a website in which the user entered the credentials for access to the network. Once a suitable QR code is located, the wearer can capture the QR code by pointing the HMD's camera at it. In other embodiments, other techniques besides a QR code can be used to enter network credentials, such as the wearer speaking the password for access to a network.
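  • The text does not specify how network credentials are encoded in the QR code. Purely as an assumption for illustration, the sketch below parses the widely used “WIFI:T:&lt;security&gt;;S:&lt;ssid&gt;;P:&lt;password&gt;;;” convention; escaping of special characters inside field values is omitted for brevity:

```python
def parse_wifi_qr(payload):
    """Parse WiFi credentials from a QR payload, assuming the common
    "WIFI:" convention (an assumption; the text does not specify a
    format). Raises ValueError for non-WiFi payloads."""
    if not payload.startswith("WIFI:"):
        raise ValueError("not a WiFi configuration QR code")
    fields = {}
    for part in payload[len("WIFI:"):].split(";"):
        if ":" in part:
            key, _, value = part.partition(":")
            fields[key] = value
    return {
        "ssid": fields.get("S"),
        "security": fields.get("T", "nopass"),
        "password": fields.get("P"),
    }

creds = parse_wifi_qr("WIFI:T:WPA;S:Coffee Shop;P:espresso123;;")
assert creds["ssid"] == "Coffee Shop"
```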
  • In response to the HMD successfully capturing the QR code or otherwise obtaining the password for the “Coffee Shop” network, the HMD can display card 1244. After displaying card 1244, if the HMD is able to successfully connect to the WiFi network, the HMD will display card 1246 with text of “Connected” and return to the settings context menu. If the HMD is unable to successfully connect to the WiFi network, the HMD will display card 1248 with text of “Failed” and return to the previous card, e.g., card 1242, to request additional input related to the “Coffee Shop” network.
  • In some embodiments, the HMD can automatically reattempt WiFi connection upon (initial) failure. In particular embodiments, the HMD will automatically attempt WiFi reconnection for a fixed number of attempts before indicating failure. If the HMD automatically reattempts WiFi connection upon failure, the HMD can display card 1244 as the “previous” card.
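  • The automatic-reattempt behavior amounts to a bounded retry loop around the “Connecting”/“Connected”/“Failed” cards. In the sketch below, attempt_connect is a hypothetical callable returning True on success, and the attempt limit is an illustrative value:

```python
def show_card(text):
    print(f"[card] {text}")          # stand-in for rendering an HMD card

def connect_with_retries(attempt_connect, max_attempts=3):
    """Retry a fixed number of times before indicating failure,
    stepping through the Connecting/Connected/Failed cards of FIG. 12."""
    for _ in range(max_attempts):
        show_card("Connecting")      # card 1244
        if attempt_connect():
            show_card("Connected")   # card 1246
            return True
    show_card("Failed")              # card 1248
    return False

connect_with_retries(lambda: False)  # always fails: three attempts, then Failed
```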
  • Card 1250 displays a QR code encoding information about a WiFi network. In response to tapping while displaying card 1250, the wearer can obtain a QR code and the HMD's camera can be utilized to capture the QR code as discussed above. In other embodiments, other techniques besides a QR code can be used to enter network credentials, such as the wearer speaking the password for access to a network. In response to the HMD obtaining the QR code or otherwise obtaining the password for the WiFi network to be added, the HMD can display card 1244.
  • After displaying card 1244, if the HMD is able to successfully connect to the WiFi network, the HMD will display card 1246 with text of “Connected” and return to the settings context menu. If the HMD is unable to successfully connect to the WiFi network, the HMD will display card 1248 with text of “Failed” and return to the previous card, e.g., card 1250, to request additional input related to the WiFi network to be added. In some embodiments, the HMD can automatically reattempt WiFi connection upon (initial) failure. In particular embodiments, the HMD will automatically attempt WiFi reconnection for a fixed number of attempts before indicating failure. If the HMD automatically reattempts WiFi connection upon failure, the HMD can display card 1244 as the “previous” card.
  • In response to tapping card 1260, the HMD begins a process of “turning off” or deactivating WiFi functionality for the HMD. In scenario 1200, the process of deactivating WiFi functionality is associated with a grace period to permit the wearer to cancel or abort the WiFi deactivation. In response to the tap operation, the HMD can display card 1262 with text of “Turning off” and progress bar 1262 a. Progress bar 1262 a can take a length of time, such as equal to or greater than the grace period, to complete display. After progress bar 1262 a is completely displayed, the grace period is deemed to have expired.
  • Once the grace period expires or a tap is received during display of card 1262, the HMD can deactivate WiFi functionality for the HMD, and display card 1264 indicating the WiFi functionality for the HMD is off. After displaying card 1264, the HMD can return to the settings context menu.
  • In other example scenarios, card 1210 could indicate, as shown using card 1212 of FIG. 12, that the HMD is not connected to a WiFi network or, as shown using card 1214 of FIG. 12, that WiFi functionality of the HMD is turned off. In those examples, card 1220 is not used, and tapping either card 1212 or 1214 leads to display of card 1230.
  • If the WiFi functionality is off, e.g., when card 1214 is displayed, card 1260 displays “Turn On” or similar text, and tapping card 1260 while the WiFi functionality is off leads to activation of the HMD's WiFi functionality.
  • FIG. 13 shows user-interface scenario 1300 related to Bluetooth settings, according to an example embodiment. Scenario 1300 begins with the HMD displaying card 1310 in a timeline. As shown in FIG. 13, card 1310 includes a Bluetooth logo and text indicating the HMD is “Connected to Galaxy Nexus [and] Home-PC.” During scenario 1300, in response to viewing card 1310, the wearer performs a tap operation using the touch-based UI of the HMD.
  • In response to the tap operation, the HMD can display card 1320. Card 1320 shows an image of a mobile device and text of “Connected to Galaxy Nexus.” After viewing card 1320, the wearer can swipe next through card 1330, which indicates a connection to a Home-PC, and card 1340 to begin the process to “pair with” or connect to another device using Bluetooth. In some embodiments, swiping next after displaying card 1340 leads to display of card 1320. In other embodiments, swiping previous after displaying card 1320 leads to display of card 1340.
  • After viewing card 1320, the wearer can perform a tap operation using the touch-based UI of the HMD. In response to this tap operation, card 1332 is displayed with text of “Disconnect” to indicate a disconnect operation to be performed on the current Bluetooth connection. After viewing card 1332, the wearer can use the touch-based UI to perform a swipe operation. In response to the swipe, the HMD can display card 1322 with text of “Forget” to indicate a forget operation for the current Bluetooth connection.
  • After viewing card 1322 during scenario 1300, the wearer can use the touch-based UI of the HMD to either (a) tap to instruct the HMD to begin a process of forgetting about the current Bluetooth connection, or (b) swipe to re-view card 1332. In scenario 1300, the Bluetooth connection would be a connection between the HMD and “Galaxy Nexus”, as card 1322 was reached after tapping card 1320, and card 1320 is associated with the HMD/Galaxy Nexus Bluetooth connection.
  • During one aspect of scenario 1300, the wearer taps on the touch-based UI of the HMD while card 1322 is displayed to instruct the HMD to forget about the HMD/Galaxy Nexus Bluetooth connection. The process of forgetting about a Bluetooth connection is associated with a grace period to permit the wearer to reconsider. In response to the tap operation, the HMD can display card 1324 with text of “Forgetting” and a progress bar. The progress bar can take a length of time, such as equal to or greater than the grace period, to complete display. After the progress bar is completely displayed, the grace period is deemed to have expired.
  • Once the grace period expires or a tap is received during display of card 1324, the HMD can delete stored information about the current Bluetooth connection and display card 1326 indicating the current Bluetooth connection is now forgotten. After displaying card 1326, the HMD can return to the home card context menu.
  • In another aspect of scenario 1300, the wearer taps on the touch-based UI of the HMD while card 1332 is displayed to instruct the HMD to disconnect from the Galaxy Nexus. The process of disconnecting a Bluetooth connection is associated with a grace period to permit the wearer to reconsider. In response to the tap operation, the HMD can display card 1334 with text of “Disconnecting” and a progress bar that can take a length of time, such as equal to or greater than the grace period, to complete display. After the progress bar is completely displayed, the grace period is deemed to have expired.
  • Once the grace period expires or a tap is received during display of card 1334, the HMD can disconnect from the current Bluetooth connection and display card 1336 indicating that the HMD is now disconnected from the previously-connected Bluetooth connection. After displaying card 1336, the HMD can return to the home card context menu.
  • In another example of scenario 1300, the wearer can use the touch-based UI of the HMD to perform a tap operation while card 1330 is displayed. Card 1330 shows an image of a computer display and has text of “Connected to Home-PC” to indicate a Bluetooth connection between the HMD and a device named “Home-PC”. In response to this tap operation, the HMD can display card 1332 for disconnecting the HMD/Home-PC connection, or after receiving a swipe operation, the HMD can display card 1322 for forgetting the HMD/Home-PC connection. After either card 1322 or 1332 is displayed, the wearer can use the touch-based UI to perform a tap operation. In response to the tap, the HMD can respectively perform the forgetting (after card 1322 display) or disconnecting (after card 1332 display) operations for Bluetooth connections, using the HMD/Home-PC connection as the current Bluetooth connection, as the tap for card 1322/1332 was received after most recently displaying card 1330 representing the HMD/Home-PC connection.
  • After displaying card 1340, the HMD can be brought into, or already be in, proximity of some other device configured to pair with the HMD. In scenario 1300, the other device is a mobile phone identified, e.g., as “Galaxy Nexus.” In other scenarios, the HMD can attempt to pair with a device other than the Galaxy Nexus. If the other device attempts to pair with the HMD (or vice versa), card 1342 can be displayed in response. As shown in FIG. 13, card 1342 includes an image of a mobile device and text of “Pair with Galaxy Nexus? Tap if Galaxy Nexus displays 186403.”
  • In response to the display of card 1342, the wearer can use the touch-based UI to perform a tap operation, and so instruct the HMD to pair with the other device; e.g., the Galaxy Nexus. After receiving the tap, the HMD can display card 1344 with text of “Pairing” to indicate that the HMD is attempting to pair with the Galaxy Nexus.
  • If the pairing operation between the HMD and the Galaxy Nexus is successful, the HMD can display card 1346 with text of “Paired” and can return to the main timeline after “splicing” or adding a card for the new device to the timeline. On the other hand, if the pairing operation between the HMD and the Galaxy Nexus is unsuccessful, the HMD can display card 1348 with text of “Failed” and can return to card 1340 (“Pair with new device”) to possibly reattempt pairing with the Galaxy Nexus and/or pair with a different device.
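  • The pairing flow of FIG. 13 can be sketched as: show the numeric confirmation card, proceed only on a confirming tap, then report success or failure. In the sketch below, wearer_tapped and try_pair are hypothetical stand-ins for the touch-based UI and the Bluetooth stack; they are not APIs named in the text:

```python
def show_card(text):
    print(f"[card] {text}")          # stand-in for rendering an HMD card

def confirm_pairing(device_name, code, wearer_tapped, try_pair):
    """Pair only after the wearer taps to confirm that both devices
    display the same numeric code, then report success or failure."""
    show_card(f"Pair with {device_name}? "
              f"Tap if {device_name} displays {code}.")   # card 1342
    if not wearer_tapped():
        return False                 # no confirmation; remain unpaired
    show_card("Pairing")             # card 1344
    if try_pair(device_name):
        show_card("Paired")          # success; splice new-device card in
        return True
    show_card("Failed")              # card 1348; return to card 1340
    return False

confirm_pairing("Galaxy Nexus", 186403,
                wearer_tapped=lambda: True, try_pair=lambda device: True)
```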
  • The HMD can arrange portions of a card in a “visual stack” in order to generate the visual rendering of the card. FIG. 14A shows example visual stack 1400, according to an example embodiment. Visual stack 1400 is used to generate or render a card from a set of overlaid images.
  • From the wearer's perspective, visual stack 1400 is the collection of images viewed looking down viewport 1410 via a “rectangular tube”, shown with dashed lines in FIG. 14A, to main timeline 1430. The collection of images can be part of timelines, menus, etc., each of which can be considered to run independently at different levels perpendicular to the rectangular tube. The wearer can then perceive content on the display of the HMD through the viewport 1410 to see the portions of the timelines, menus, etc. that are within the rectangular tube.
  • FIG. 14A shows that three items are in visual stack 1400 between viewport 1410 and main timeline 1430: submenu 1420, contextual menu 1422, and overlay 1424. Submenu 1420 includes three images: an image of “Jane Smith”, an image of “Another Person”, and an image associated with “Friends”, with the image of “Another Person” inside the rectangular tube. Contextual menu 1422 includes two options: a “Share” option and a “Delete” option, with the “Share” option inside the rectangular tube. Thus, visual stack 1400 shows contextual menu 1422 for a photo bundle card shown on main timeline 1430 with a “Share” option selected from the contextual menu 1422, and a sharing destination of “Another Person” selected from submenu 1420.
  • FIG. 14B shows example visual stack 1450, according to an example embodiment. From the wearer's perspective, visual stack 1450 is the collection of images viewed looking down viewport 1460, via a rectangular tube shown with dashed lines in FIG. 14B, to main timeline 1480. FIG. 14B shows that two items are in visual stack 1450 between viewport 1460 and main timeline 1480: action notification 1470 and overlay 1472. Action notification 1470 shows a “Send” notification. Thus, visual stack 1450 shows a “Send” notification for a photo bundle card shown on main timeline 1480.
  • In some embodiments, overlay 1472 is completely opaque with respect to main timeline 1480. In these embodiments, the wearer viewing visual stack 1450 sees action notification 1470 and overlay 1472. In other embodiments, overlay 1472 is partially or completely transparent or translucent with respect to main timeline 1480. In these embodiments, the wearer viewing visual stack 1450 sees action notification 1470 and overlay 1472 with some portion(s) of the photo bundle card shown on main timeline 1480 visible, depending on the visibility of an image on main timeline 1480 through overlay 1472.
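  • The visibility behavior just described is standard back-to-front alpha compositing: the overlay's opacity determines how much of the main timeline shows through. The following single-pixel, single-channel sketch is illustrative only; the gray values and alphas are assumptions, not values from the figures:

```python
def composite(layers):
    """Back-to-front alpha compositing over a visual stack. Each layer
    is (gray, alpha) for one pixel, ordered from the main timeline
    (bottom) up toward the viewport; single-channel for brevity."""
    out = 0.0
    for gray, alpha in layers:
        out = alpha * gray + (1.0 - alpha) * out
    return out

# A pixel covered by the timeline card and overlay 1472 only:
print(composite([(0.8, 1.0), (0.2, 0.5)]))  # 0.5 -> timeline shows through
print(composite([(0.8, 1.0), (0.2, 1.0)]))  # 0.2 -> timeline fully hidden
```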
  • Along with the touch-based UI detailed above, the HMD can utilize a voice-based interface. FIG. 15 shows a user-interface scenario 1500 related to voice interactions, according to an example embodiment. Scenario 1500 begins with a wearer of the HMD reciting and/or uttering the phrase “ok glass”, which can be prompted by a hint provided on a home card such as discussed above in the context of FIG. 3.
  • In response to the HMD recognizing “ok glass”, the HMD can receive the utterance “ok glass” via the voice-based interface and display card 1510. Card 1510 shows the input command “ok glass” to confirm the input received at the voice-based UI of the HMD and a list of available voice commands including “Google”, “navigate to”, “take a photo”, “record a video”, “send a message to”, and “make a call to.”
  • The wearer of the HMD can tilt his/her head up or down to respectively scroll up or down through lists, such as the list of available voice commands. For example, card 1512 shows the result of scrolling down the list of possible voice commands shown in card 1510, indicating the removal of the previously-visible available voice command “Google” and the addition of the available voice command “hangout with.” The HMD can use tilt sensors, accelerometers, motion detectors, and/or other devices/sensors to determine if the wearer tilted their head and/or whether the wearer tilted their head up or down.
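  • Head-tilt scrolling can be sketched as mapping sensed pitch to a scroll step, with a dead zone so that small head movements are ignored. The dead-zone width, sign convention, and six-item visible window below are illustrative assumptions, not values from the text:

```python
def scroll_from_pitch(pitch_deg, dead_zone_deg=10.0):
    """Tilt up scrolls up, tilt down scrolls down; tilts inside the
    dead zone produce no scrolling."""
    if pitch_deg > dead_zone_deg:
        return -1                    # head tilted up: scroll up
    if pitch_deg < -dead_zone_deg:
        return +1                    # head tilted down: scroll down
    return 0

commands = ["Google", "navigate to", "take a photo", "record a video",
            "send a message to", "make a call to", "hangout with"]
top = 0                              # index of the first visible command
top = max(0, min(top + scroll_from_pitch(-15.0), len(commands) - 6))
assert commands[top] == "navigate to"   # "Google" scrolled off, as in card 1512
```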
  • While card 1510 is displayed, the wearer utters the word “Google”, which causes card 1520 to be displayed. Card 1520 can also be displayed in response to the wearer uttering “OK Google” and the voice-based UI of the HMD recognizing the “OK Google” phrase. Card 1520, as shown in FIG. 15, includes a hint of “ask a question” along with an icon of a microphone that can act as a reminder to the wearer that they are using the voice-based interface.
  • In response to the display of card 1520, the wearer can utter “How tall is the Eiffel Tower?” The HMD can then display card 1522 showing that the HMD is processing the input utterance, and once processed, display card 1524. Card 1524 “echo-prints” or repeats the input utterance of “How tall is the Eiffel Tower?” and also prints an indicator of “searching” to inform the wearer that a search is ongoing based on their input. After the search is complete, the HMD can display card 1526, which includes an image of the Eiffel Tower and an answer of “1,063 feet (324 m)” to the question asked by the wearer.
  • In another aspect of scenario 1500, in response to card 1512, the wearer can utter “Navigate to”. In response, the voice-based UI of the HMD can capture the utterance, determine that the utterance is “Navigate to”, and display card 1530 showing an echo-print of the “Navigate to” utterance.
  • Scenario 1500 includes two examples of a destination of the “Navigate to” command. In one example, the wearer utters “Restaurant 1” as the destination, indicated via card 1532. Card 1532 includes an echo-print of the “ok glass, navigate to Restaurant 1” command and an indication that the HMD is searching for a location of Restaurant 1. In this example, a single location result is returned for “Restaurant 1”. Card 1534 includes a map to the single location that occupies about one-third of the card. The remainder of card 1534 shows that the HMD is “navigating to Restaurant 1” which is on “13th Street” and will take “16 minutes” to get to the restaurant.
  • In another navigation example, the wearer utters “Popular Pueblo” as the destination, as indicated via card 1536. Card 1536 includes an echo-print of the “ok glass, navigate to Popular Pueblo” command and an indication that the HMD is searching for a location of Popular Pueblo. In this example, a search for a location of “Popular Pueblo” returned multiple locations, as indicated by location cards 1538. The wearer can use the touch-based UI to swipe through the multiple location cards 1538 to view each location card individually, and to perform a tap operation to select a particular Popular Pueblo location while the desired Popular Pueblo location card is displayed. After the desired Popular Pueblo location card is displayed and the tap operation is completed, the desired location result is shown in card 1540 of FIG. 15. Card 1540 includes a map to the desired location that occupies about one-third of the card. The remainder of card 1540 shows that the HMD is “navigating to Popular Pueblo” which is on “14th Street” and will take “5 minutes” to get to the desired Popular Pueblo.
  • In another aspect of scenario 1500, in response to card 1512, the wearer can utter “Send message to”. In response, the voice-based UI of the HMD can capture the utterance, determine that the utterance is “Send message to”, and display card 1550 showing an echo-print of the “Send message to” utterance, along with a list of potential recipients of the message. FIG. 15 shows that the list of potential recipients includes “Sarah Johnson”, “Steve Johnson”, and “Julie Dennis”.
  • To navigate through the list of potential recipients, the wearer can tilt their head up or down to respectively scroll up or down through the list as indicated by cards 1550 and 1552. Scenario 1500 continues with the wearer uttering “Sarah Johnson” to the HMD. The voice-based UI of the HMD can capture the utterance, determine that the utterance is “Sarah Johnson” and display card 1554 showing an echo-print of the “send message to” utterance, along with “Sarah Johnson” as a recipient of the message.
  • The HMD can wait for a period of time, e.g., one second, for the wearer to provide additional recipients. If the wearer does not provide additional recipients in that period of time, the HMD can display a card, such as card 1556, for composing and echo-printing a message. Card 1556 shows echo-printed utterances including “hi sarah I'm on my way out will be a few” and blocks. The blocks indicate that the HMD is in the process of recognizing the utterances provided by the wearer and translating those utterances into text. In some embodiments, speech can be translated to text using one or more automatic speech recognition (ASR) techniques.
  • After some time, the wearer stops uttering content for the message. In scenario 1500, after uttering the content of the message, the wearer decides to send the message to the recipient, Sarah Johnson. To send the message, the wearer can either perform a tap operation using the touch-based UI, or stop uttering for a period of time, e.g., one second. In response, the HMD can display a card such as card 1558 indicating that the message is in the process of being sent. After the message is sent, the sent message is spliced into the timeline.
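  • The send-on-silence behavior is a simple end-of-utterance timeout: keep appending recognized text, and send once no new utterance arrives within the window. In this sketch, next_utterance is a hypothetical non-blocking poll returning recognized text or None; the one-second window follows the example in the text:

```python
import time

def collect_message(next_utterance, silence_timeout_s=1.0):
    """Append recognized utterances until the wearer stops speaking
    for the timeout, then return the message for sending."""
    words, last_heard = [], time.monotonic()
    while time.monotonic() - last_heard < silence_timeout_s:
        utterance = next_utterance()
        if utterance:
            words.append(utterance)
            last_heard = time.monotonic()
        time.sleep(0.05)             # illustrative polling interval
    return " ".join(words)           # ready to send and splice into timeline

# Demo: two recognized fragments, then silence.
fragments = iter(["hi sarah", "on my way"])
print(collect_message(lambda: next(fragments, None)))  # "hi sarah on my way"
```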
  • E. EXAMPLE METHODS OF OPERATION
  • FIG. 16A is a flow chart illustrating a method 1600, according to an example embodiment. In FIG. 16A, method 1600 is described by way of example as being carried out by a computing device, such as a wearable computer, and possibly a wearable computer that includes a head-mounted display (HMD). However, it should be understood that example methods, such as method 1600, can be carried out without the wearable computer being worn; for example, such methods can be carried out while the user simply holds the wearable computer in his or her hands. Other possibilities can also exist.
  • Further, example methods, such as method 1600, can be carried out by devices other than a wearable computer, and/or can be carried out by sub-systems in a wearable computer or in other devices. For example, an example method can alternatively be carried out by a device such as a mobile phone, which is programmed to simultaneously display a graphic object in a graphic display and also provide a point-of-view video feed in a physical-world window. Other examples are also possible.
  • As shown in FIG. 16A, method 1600 begins at block 1610, where an HMD can display a home card of an ordered plurality of cards. In some embodiments, the ordered plurality of cards can be ordered based on time, with each card in the ordered plurality of cards associated with a specific time. In particular of these embodiments, the choose-next input type can be associated with going forward in time, and the next card can be associated with a specific time that is equal to or later than the specific time associated with the home card. The choose-previous input type can be associated with going backward in time, and the previous card can be associated with a specific time that is equal to or earlier than the specific time associated with the home card.
  • At block 1620, while displaying the home card, the HMD can receive a first input. The first input can be associated with a first input type. The first input type can include a choose-next input type and a choose-previous input type. In some embodiments, the HMD can include a touch-based UI, such as a touch pad, via which the first input can be received. The user-input can be received via other user-interfaces as well.
  • At block 1630, in response to the first input type being the choose-next input type, the HMD can: (a) obtain a next card of the ordered plurality of cards, where the next card is subsequent to the home card in the ordered plurality of cards, and (b) display the next card using the HMD.
  • At block 1640, in response to the first input type being the choose-previous input type, the HMD can: (a) obtain a previous card of the ordered plurality of cards, where the previous card is prior to the home card in the ordered plurality of cards, and (b) display the previous card using the HMD.
  • In some embodiments, method 1600 can further involve the HMD receiving a next input while displaying the next card. The next input can be associated with a next input type. The next input type can include the choose-next input type and the choose-previous input type. In response to the next input type being the choose-next input type, the HMD can obtain a second-next card of the plurality of cards, where the second-next card is subsequent to the next card in the ordered plurality of cards. The HMD can display the second-next card. In response to the next input type being the choose-previous input type, the HMD can obtain the home card and display the home card.
  • In other embodiments, method 1600 can additionally include the HMD receiving a previous input while displaying the previous card. The previous input can be associated with a previous input type. The previous input type can include the choose-next input type and the choose-previous input type. In response to the previous input type being the choose-next input type, the HMD can obtain the home card and display the home card. In response to the previous input type being the choose-previous input type, the HMD can obtain a second-previous card of the plurality of cards, where the second-previous card is prior to the previous card in the ordered plurality of cards. The HMD can display the second-previous card.
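  • Blocks 1610-1640 and the next/previous embodiments above reduce to index arithmetic over the ordered plurality of cards. The following sketch uses placeholder card contents and re-traces a choose-next input followed by two choose-previous inputs:

```python
class Timeline:
    """Sketch of method 1600's navigation, starting from the home card."""

    def __init__(self, cards, home_index):
        self.cards = cards
        self.index = home_index      # display the home card first

    def handle(self, input_type):
        if input_type == "choose-next" and self.index + 1 < len(self.cards):
            self.index += 1          # obtain the card subsequent to this one
        elif input_type == "choose-previous" and self.index > 0:
            self.index -= 1          # obtain the card prior to this one
        return self.cards[self.index]

timeline = Timeline(["past-2", "past-1", "home", "future-1"], home_index=2)
assert timeline.handle("choose-next") == "future-1"
assert timeline.handle("choose-previous") == "home"
assert timeline.handle("choose-previous") == "past-1"
```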
  • In particular of the other embodiments, the second-previous card can include a bundle card. The bundle card can represent a collection of cards and can include a bundle card indicator. Then, method 1600 can further include receiving a bundle-card input of a bundle-card type at the HMD, while displaying the bundle card. A first card of the collection of cards can be displayed in response to the bundle-card type of input being a tap. While displaying the first card of the collection of cards, the HMD can receive a first-card input associated with a first-card type. In response to the first-card input being the choose-next type of input, the HMD can select a second card in the collection of cards, where the second card is subsequent to the first card, and display the second card. In response to the first-card input being the choose-previous type of input, the HMD can select a third card in the collection of cards, where the third card is prior to the first card, and display the third card.
  • In still other embodiments, the first input type can additionally include a fast-choose-next input type and a fast-choose-previous input type. Each of the choose-next input type and the choose-previous input type can be associated with a first card rate, and each of the fast-choose-next input type and the fast-choose-previous input type can be associated with a second card rate. The second card rate can exceed the first card rate.
  • In these embodiments, method 1600 can additionally include: in response to the first input type being the choose-next input type: (i) simulating movement at the first card rate through the ordered plurality of cards subsequent to the home card, and (ii) obtaining the next card based on the simulated movement subsequent to the home card at the first card rate. In response to the first input type being the choose-previous input type, method 1600 can include: (iii) simulating movement at the first card rate through the ordered plurality of cards prior to the home card, and (iv) obtaining the previous card based on the simulated movement prior to the home card at the first card rate. In response to the first input type being the fast-choose-next input type, method 1600 can include: (v) simulating movement at the second card rate through the ordered plurality of cards subsequent to the home card, and (vi) obtaining a fast-next card based on the simulated movement subsequent to the home card at the second card rate. In response to the first input type being the fast-choose-previous input type, method 1600 can additionally include: (vii) simulating movement at the second card rate through the ordered plurality of cards prior to the home card, and (viii) obtaining a fast-previous card based on the simulated movement prior to the home card at the second card rate.
  • In particular of these embodiments, each of the choose-next input type and the choose-previous input type can be associated with a swipe made using a first number of fingers and each of the fast-choose-next input type and the fast-choose-previous input type can be associated with a swipe made using a second number of fingers. Then, the first number of fingers can differ from the second number of fingers.
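  • Steps (i)-(viii) can be sketched as: the number of fingers in the swipe selects a card rate, and the landing card is determined by simulating movement at that rate. The rates and finger counts below are illustrative assumptions, not values from the text:

```python
CARD_RATES = {1: 1, 2: 5}            # fingers -> cards traversed per swipe

def swipe(cards, index, direction, fingers):
    """direction: +1 for (fast-)choose-next, -1 for (fast-)choose-previous."""
    rate = CARD_RATES[fingers]
    new_index = max(0, min(index + direction * rate, len(cards) - 1))
    return new_index, cards[new_index]

cards = [f"card-{i}" for i in range(20)]
assert swipe(cards, 10, +1, 1) == (11, "card-11")    # choose-next
assert swipe(cards, 10, +1, 2) == (15, "card-15")    # fast-choose-next
assert swipe(cards, 10, -1, 2) == (5, "card-5")      # fast-choose-previous
```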
  • FIG. 16B is a flow chart illustrating a method 1650, according to an example embodiment. In FIG. 16B, method 1650 is described by way of example as being carried out by a computing device, such as a wearable computer, and possibly a wearable computer that includes an HMD, but other techniques and/or devices can be used to carry out method 1650, such as discussed above in the context of method 1600.
  • As shown in FIG. 16B, method 1650 begins at block 1660, where a home card can be displayed by a head-mountable device (HMD). The HMD can include a user-interface (UI) state, where the UI state is in a home UI state.
  • In some embodiments, displaying the home card can include displaying a hint for using a UI of the HMD on the home card. In particular embodiments, the hint can include a hint for the voice-based UI. In other particular embodiments, the hint can include a hint for the touch-based UI. In still other particular embodiments, displaying the hint can include determining whether a number of times the hint has been used successfully meets or exceeds a threshold number of times. In response to determining that the number of times the hint has been used successfully does not meet or exceed the threshold number of times, the hint can be displayed on the home card. In response to determining that the number of times the hint has been used successfully does meet or exceed the threshold number of times, the hint can be inhibited from display on the home card.
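  • The hint-suppression logic in the last of these embodiments is a simple threshold test; a sketch, with an illustrative threshold of five successful uses:

```python
def should_show_hint(successful_uses, threshold=5):
    """Show the hint until it has been used successfully a threshold
    number of times; inhibit it afterwards."""
    return successful_uses < threshold

assert should_show_hint(0)           # new wearer: display the hint
assert not should_show_hint(5)       # threshold met: inhibit the hint
```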
  • At block 1670, while in the home UI state, a first UI of the HMD can receive a first input. The first input can be associated with a first type of input.
  • At block 1680, in response to the first type of input being a choose-next type of input, the HMD can: display a next card of an ordered plurality of cards, where the ordered plurality of cards also includes the home card, and where the next card can differ from the home card, and set the UI state to a timeline-next state.
  • At block 1682, the HMD can, in response to the first type of input being a choose-previous type of input: display a previous card of the ordered plurality of cards, where the choose-previous type of input differs from the choose-next type of input, and where the previous card differs from both the next card and the home card, and set the UI state to a timeline-previous state.
  • At block 1684, the HMD can, in response to the first type of input being a tap type of input: activate a second UI of the HMD, where the first UI of the HMD is a touch-based UI and where the second UI is a voice-based UI, and set the UI state of the HMD to a voice-home state.
  • At block 1686, the HMD can, in response to the first type of input being a speech-type of input, determine whether text associated with the first input matches a predetermined text. In response to determining that the text associated with the first input matches the predetermined text, the HMD can activate the second UI and set the UI state to the voice-home state.
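  • Blocks 1680-1686 describe a dispatch on the input type received in the home UI state. The sketch below models that dispatch as a function returning the new UI state; the state names follow the text, and the “ok glass” hotword stands in for the predetermined text of block 1686:

```python
def handle_home_input(input_type, utterance=None, hotword="ok glass"):
    """Dispatch an input received in the home UI state and return the
    resulting UI state (blocks 1680-1686)."""
    if input_type == "choose-next":
        return "timeline-next"       # display the next card
    if input_type == "choose-previous":
        return "timeline-previous"   # display the previous card
    if input_type == "tap":
        return "voice-home"          # activate the voice-based UI
    if input_type == "speech" and utterance == hotword:
        return "voice-home"          # hotword matched the predetermined text
    return "home"                    # unrecognized input: stay put

assert handle_home_input("speech", "ok glass") == "voice-home"
assert handle_home_input("choose-next") == "timeline-next"
```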
  • In some embodiments, method 1650 can additionally include: in response to the first input being a sleep-type of input, deactivating at least a portion of the HMD and setting the UI state of the HMD to a deactivated state.
  • In other embodiments, method 1650 can additionally include: receiving a second input using the first UI while in the timeline-previous state. The second input can be associated with a second type of input. In response to the second type of input being the tap type of input: (i) one or more operations can be selected based on the previous card, (ii) a menu of operation cards can be generated based on the selected one or more operations, where each operation card in the menu of operation cards can correspond to an operation of the one or more operations, and (iii) at least one operation card of the menu of operation cards can be displayed.
  • In particular of the other embodiments, displaying the at least one operation card of the menu of operation cards can include displaying text associated with the operation that overlays the display of the previous card.
  • In other particular of the other embodiments, the one or more operations can include an operation associated with a grace period of time. Then, method 1650 can additionally include: while displaying at least one operation card of the menu of operation cards, receiving an operation input using the first UI where the operation input has an operation type. In response to the operation type of input being the tap type of input, an operation associated with the displayed at least one operation card can be determined. A determination can be made whether a grace period of time is associated with the associated operation. If the grace period of time is associated with the associated operation, a card can be displayed that is configured to graphically indicate the grace period of time, where the displaying takes at least the grace period of time. After displaying the card configured to graphically indicate the grace period of time, the HMD can perform the associated operation.
  • In still other embodiments, method 1650 can additionally include: while in the timeline-next state, receiving a second input using the first UI. The second input can be associated with a second type of input. In response to the second type of input being the tap type of input: (a) one or more operations can be selected based on the next card, (b) a menu of operations can be generated based on the selected one or more operations, and (c) at least one menu operation of the menu of operations can be displayed. In particular of the still other embodiments, displaying the at least one menu operation of the menu of operations can include displaying text associated with the at least one menu operation that overlays the display of the next card.
  • In even other embodiments, method 1650 can additionally include: in response to the UI state of the HMD being in the voice-home state, generating a menu card to display a menu of operations for using the voice-based UI. The HMD can display the menu card. After displaying the menu card, a head-related input related to a head movement associated with the HMD can be received. The menu card can be modified based on the head-related input. The modified menu card can be displayed using the HMD.
  • FIG. 17 is a flow chart illustrating a method 1700, according to an example embodiment. In FIG. 17, method 1700 is described by way of example as being carried out by a computing device, such as a wearable computer and possibly a wearable computer that includes an HMD, but other techniques and/or devices can be used to carry out method 1700, such as discussed above in the context of method 1600.
  • Method 1700 can begin at block 1710. At block 1710, a computing device can display at least a portion of a first linear arrangement of cards. The first linear arrangement can include an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type. Each first card corresponds to a group of cards. Aspects of the first linear arrangement are discussed above in the context of at least FIGS. 5C, 6D, 7, 8, 10A, 10B, 14A, and 14B.
  • In some embodiments, the first linear arrangement can include a timeline and each card of the first linear arrangement can be associated with a specific time, such as discussed above in the context of at least FIGS. 5A-15.
  • In other embodiments, each card of the ordered plurality of cards can include a relationship-related parameter, such as a type as discussed above in the context of at least FIGS. 5C, 6C, 6D, 10A, and 10B, or other kind(s) of relationship-related parameter(s). In particular embodiments, each card in the group of cards can be related to a same relationship-related parameter, such as discussed above in the context of at least FIGS. 5C, 6C, 6D, 10A, and 10B.
  • At block 1720, the computing device can display a selection region that is moveable with respect to the first linear arrangement, where a given card is selected when the selection region is aligned with the given card. Alignment of the selection region and the given card is discussed above in more detail in the context of FIG. 5C. Additional aspects of the selection region are discussed above at least in the context of FIGS. 5C and 6A-6D.
  • In some embodiments, the HMD can be configured to detect head movements. In these embodiments, displaying the selection region that is moveable with respect to the first linear arrangement can include moving the selection region with respect to the first linear arrangement based on the head movements, such as discussed above at least in the context of FIGS. 5C and 6D.
  • At block 1730, in response to selection of a given first card by the selection region, the computing device can display at least a portion of a second linear arrangement of cards, where the second linear arrangement can include an ordered plurality of the group of cards that corresponds to the given first card. Aspects of the given first card are discussed above at least in the context of FIGS. 5C, 6D, 7, 8, 10A, 10B, 14A, and 14B. In some embodiments, the second linear arrangement can also include the given first card.
  • At block 1740, in response to selection of a given second card by the selection region, the computing device can display at least a portion of a third linear arrangement of cards, where the third linear arrangement includes one or more third cards of a third card-type, where each third card is selectable to perform an action based on the given second card. Aspects of actionable cards are discussed above at least in the context of FIGS. 5A-15.
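  • Blocks 1730 and 1740 can be sketched as a selection handler that opens the second linear arrangement for a first (bundle) card and the third linear arrangement of action cards for a second (actionable) card. The groups and actions_for lookups below are hypothetical, introduced only for illustration:

```python
def on_select(card, groups, actions_for):
    """Open the second linear arrangement for a first (bundle) card, or
    the third linear arrangement of action cards for a second
    (actionable) card."""
    if card["type"] == "first":
        return {"arrangement": "second", "cards": groups[card["id"]]}
    if card["type"] == "second":
        return {"arrangement": "third", "cards": actions_for(card)}
    return None

groups = {"photos": ["photo-1", "photo-2"]}
assert on_select({"type": "first", "id": "photos"}, groups,
                 lambda c: ["share", "delete"]) == \
    {"arrangement": "second", "cards": ["photo-1", "photo-2"]}
assert on_select({"type": "second", "id": "msg"}, groups,
                 lambda c: ["share", "delete"]) == \
    {"arrangement": "third", "cards": ["share", "delete"]}
```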
  • In some embodiments, the selected second card can be related to a first relationship-related parameter, and displaying at least the portion of the third linear arrangement can include determining the one or more third cards based on the first relationship-related parameter, such as discussed above in the context of at least FIGS. 5C and 6D. In other embodiments, the second linear arrangement can include the bundle card.
  • In even other embodiments, the HMD can be configured with a touchpad. In these embodiments, such as discussed above in the context of at least FIGS. 5C and 6A-6D, method 1700 can further include: initially displaying a single card from the first linear arrangement using a single-card view; while displaying the single card, receiving a first input via the touchpad; and in response to the first input: switching to a multi-timeline view and displaying, in the multi-timeline view, the at least a portion of the first linear arrangement of cards, wherein the at least the portion of the first linear arrangement of cards comprises the single card.
  • In still other embodiments, method 1700 can further include: after displaying the second linear arrangement of cards, selecting a card other than the selected first card; and ceasing display of the second linear arrangement.
  • F. CONCLUSION
  • The present disclosure is not to be limited in terms of the particular embodiments described in this application, which are intended as illustrations of various aspects. Many modifications and variations can be made without departing from its spirit and scope, as will be apparent to those skilled in the art. Functionally equivalent methods and apparatuses within the scope of the disclosure, in addition to those enumerated herein, will be apparent to those skilled in the art from the foregoing descriptions. Such modifications and variations are intended to fall within the scope of the appended claims.
  • The above detailed description describes various features and functions of the disclosed systems, devices, and methods with reference to the accompanying figures. In the figures, similar symbols typically identify similar components, unless context dictates otherwise. The example embodiments described herein and in the figures are not meant to be limiting. Other embodiments can be utilized, and other changes can be made, without departing from the spirit or scope of the subject matter presented herein. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations, all of which are explicitly contemplated herein.
  • With respect to any or all of the ladder diagrams, scenarios, and flow charts in the figures and as discussed herein, each block and/or communication can represent a processing of information and/or a transmission of information in accordance with example embodiments. Alternative embodiments are included within the scope of these example embodiments. In these alternative embodiments, for example, functions described as blocks, transmissions, communications, requests, responses, and/or messages can be executed out of order from that shown or discussed, including substantially concurrent or in reverse order, depending on the functionality involved. Further, more or fewer blocks and/or functions can be used with any of the ladder diagrams, scenarios, and flow charts discussed herein, and these ladder diagrams, scenarios, and flow charts can be combined with one another, in part or in whole.
  • A block that represents a processing of information can correspond to circuitry that can be configured to perform the specific logical functions of a herein-described method or technique. Alternatively or additionally, a block that represents a processing of information can correspond to a module, a segment, or a portion of program code (including related data). The program code can include one or more instructions executable by a processor for implementing specific logical functions or actions in the method or technique. The program code and/or related data can be stored on any type of computer readable medium such as a storage device including a disk or hard drive or other storage medium.
  • The computer readable medium can also include non-transitory computer readable media such as computer-readable media that stores data for short periods of time like register memory, processor cache, and random access memory (RAM). The computer readable media can also include non-transitory computer readable media that stores program code and/or data for longer periods of time, such as secondary or persistent long term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM), for example. The computer readable media can also be any other volatile or non-volatile storage systems. A computer readable medium can be considered a computer readable storage medium, for example, or a tangible storage device.
  • Moreover, a block that represents one or more information transmissions can correspond to information transmissions between software and/or hardware modules in the same physical device. However, other information transmissions can be between software modules and/or hardware modules in different physical devices.
  • The particular arrangements shown in the figures should not be viewed as limiting. It should be understood that other embodiments can include more or less of each element shown in a given figure. Further, some of the illustrated elements can be combined or omitted. Yet further, an example embodiment can include elements that are not illustrated in the figures.
  • While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.

Claims (22)

We claim:
1. A computing device, comprising:
a processor; and
a non-transitory computer-readable medium configured to store program instructions that, when executed by the processor, cause the computing device to carry out functions comprising:
displaying at least a portion of a first linear arrangement of cards, wherein the first linear arrangement comprises an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type, and wherein each first card corresponds to a group of cards;
displaying a selection region that is moveable with respect to the first linear arrangement, wherein a given card is selected when the selection region is aligned with the given card;
in response to selection of a given first card by the selection region, displaying at least a portion of a second linear arrangement of cards, wherein the second linear arrangement comprises an ordered plurality of the group of cards that corresponds to the given first card; and
in response to selection of a given second card by the selection region, displaying at least a portion of a third linear arrangement of cards, wherein the third linear arrangement comprises one or more third cards of a third type, wherein each third card is selectable to perform an action based on the given second card.
2. The computing device of claim 1, wherein each first card is a bundle card, wherein each second card is an actionable card, and wherein each third card is an action card.
3. The computing device of claim 1, wherein the first linear arrangement comprises a timeline, and wherein each card of the first linear arrangement is associated with a specific time.
4. The computing device of claim 1, wherein each card of the first linear arrangement comprises a relationship-related parameter.
5. The computing device of claim 4, wherein each card in the group of cards comprises a same relationship-related parameter.
6. The computing device of claim 4, wherein the selected second card is related to a first relationship-related parameter, and wherein displaying at least the portion of the third linear arrangement comprises determining the one or more third cards based on the first relationship-related parameter.
7. The computing device of claim 1, wherein the computing device is further configured with a touchpad, and wherein the functions further comprise:
initially displaying a single card from the first linear arrangement using a single-card view;
while displaying the single card, receiving a first input via the touchpad; and
in response to the first input:
switching to a multi-timeline view; and
displaying, in the multi-timeline view, the at least a portion of the first linear arrangement of cards, wherein the at least the portion of the first linear arrangement of cards comprises the single card.
8. The computing device of claim 1, wherein the computing device is configured to detect head movements, and wherein displaying the selection region that is moveable with respect to the first linear arrangement comprises moving the selection region with respect to the first linear arrangement based on the head movements.
9. The computing device of claim 1, wherein displaying the at least a portion of the third linear arrangement of cards comprises displaying the third linear arrangement adjacent to and parallel to the first linear arrangement, and wherein the third linear arrangement begins with a third card aligned with and adjacent to the selected second card.
10. The computing device of claim 1, wherein the functions further comprise:
after displaying the second linear arrangement of cards, selecting a card other than the selected first card; and
ceasing display of the second linear arrangement.
11. The computing device of claim 1, wherein the second linear arrangement further comprises the bundle card.
12. The computing device of claim 1, wherein the computing device is configured as a head-mountable device.
13. A non-transitory computer-readable medium configured to store program instructions that, when executed by a processor of a computing device, cause the computing device to carry out functions comprising:
displaying at least a portion of a first linear arrangement of cards, wherein the first linear arrangement comprises an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type, and wherein each first card corresponds to a group of cards;
displaying a selection region that is moveable with respect to the first linear arrangement, wherein a given card is selected when the selection region is aligned with the given card;
in response to selection of a given first card by the selection region, displaying at least a portion of a second linear arrangement of cards, wherein the second linear arrangement comprises an ordered plurality of the group of cards that corresponds to the given first card; and
in response to selection of a given second card by the selection region, displaying at least a portion of a third linear arrangement of cards, wherein the third linear arrangement comprises one or more third cards of a third type, and wherein each third card is selectable to perform an action based on the given second card.
14. The non-transitory computer-readable medium of claim 13, wherein each first card is a bundle card, wherein each second card is an actionable card, and wherein each third card is an action card.
15. The non-transitory computer-readable medium of claim 13, wherein the first linear arrangement comprises a timeline, and wherein each card of the first linear arrangement is associated with a specific time.
16. The non-transitory computer-readable medium of claim 13, wherein each card of the first linear arrangement comprises a relationship-related parameter.
17. The non-transitory computer-readable medium of claim 16, wherein each card in the group of cards comprises a same relationship-related parameter.
18. The non-transitory computer-readable medium of claim 16, wherein the selected second card is related to a first relationship-related parameter, and wherein displaying at least the portion of the third linear arrangement comprises determining the one or more third cards based on the first relationship-related parameter.
19. The non-transitory computer-readable medium of claim 13, wherein the computing device is associated with a touchpad, and wherein the functions further comprise:
initially displaying a single card of the plurality of cards using a single-card view;
while displaying the single card, receiving a first input via the touchpad; and
in response to the first input:
switching to a multi-timeline view; and
displaying, in the multi-timeline view, the at least a portion of the first linear arrangement of cards, wherein the at least the portion of the first linear arrangement of cards comprises the single card.
20. The non-transitory computer-readable medium of claim 13, wherein displaying the at least a portion of the third linear arrangement of cards comprises displaying the third linear arrangement adjacent to and parallel to the first linear arrangement, and wherein the third linear arrangement begins with a third card aligned with and adjacent to the selected second card.
21. The non-transitory computer-readable medium of claim 13, wherein the computing device is configured as a head-mountable device.
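Claims 7 and 19 recite the same single-card to multi-timeline transition. A minimal sketch of that state change, assuming a simple windowing policy around the displayed card (the view names, touchpad callback, and window size are invented details), might look like this:

```kotlin
// Hypothetical sketch of the single-card to multi-timeline transition of
// claims 7 and 19; view names and the touchpad callback are illustrative.
sealed interface ViewMode
data class SingleCardView(val cardId: String) : ViewMode
data class MultiTimelineView(val visibleCardIds: List<String>) : ViewMode

class ViewController(private val firstArrangement: List<String>) {
    var mode: ViewMode = SingleCardView(firstArrangement.first())
        private set

    // A first touchpad input while a single card is displayed switches to the
    // multi-timeline view; the displayed portion of the first arrangement
    // still contains the card that was shown (claims 7 and 19).
    fun onTouchpadInput() {
        val current = mode
        if (current is SingleCardView) {
            val i = firstArrangement.indexOf(current.cardId)
            val from = maxOf(0, i - 1)
            val to = minOf(firstArrangement.size, i + 2)
            mode = MultiTimelineView(firstArrangement.subList(from, to))
        }
    }
}
```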
22. A method, comprising:
displaying at least a portion of a first linear arrangement of cards using a computing device, wherein the first linear arrangement comprises an ordered plurality of cards that includes one or more first cards of a first card-type and one or more second cards of a second card-type, and wherein each first card corresponds to a group of cards;
displaying a selection region that is moveable with respect to the first linear arrangement using the computing device, wherein a given card is selected when the selection region is aligned with the given card;
in response to selection of a given first card by the selection region, displaying at least a portion of a second linear arrangement of cards using the computing device, wherein the second linear arrangement comprises an ordered plurality of the group of cards that corresponds to the given first card; and
in response to selection of a given second card by the selection region, displaying at least a portion of a third linear arrangement of cards using the computing device, wherein the third linear arrangement comprises one or more third cards of a third type, and wherein each third card is selectable to perform an action based on the given second card.
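Claim 8 ties the selection region to detected head movements. One plausible mapping, sketched under the assumption that the device reports a relative yaw angle from an orientation sensor (the sensitivity constant and re-centering behavior are invented for illustration):

```kotlin
// Hypothetical sketch of claim 8: moving the selection region with head
// movements. The yaw-to-index mapping and sensitivity are assumptions.
class HeadDrivenSelection(
    private val cardCount: Int,
    private val degreesPerCard: Float = 6f   // assumed sensitivity, not from the patent
) {
    private var baseIndex = 0

    // Map a relative head yaw (in degrees, e.g., from a gyroscope) to the
    // index of the card the selection region is aligned with.
    fun selectionFor(yawDegrees: Float): Int =
        (baseIndex + (yawDegrees / degreesPerCard).toInt()).coerceIn(0, cardCount - 1)

    // Re-anchor on the current card, e.g., after the user taps to select.
    fun recenter(index: Int) {
        baseIndex = index.coerceIn(0, cardCount - 1)
    }
}
```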
US13/840,016 2012-10-05 2013-03-15 User Interfaces for Head-Mountable Devices Abandoned US20140101608A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/840,016 US20140101608A1 (en) 2012-10-05 2013-03-15 User Interfaces for Head-Mountable Devices
PCT/US2013/063578 WO2014055948A2 (en) 2012-10-05 2013-10-04 User interfaces for head-mountable devices

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261710543P 2012-10-05 2012-10-05
US13/840,016 US20140101608A1 (en) 2012-10-05 2013-03-15 User Interfaces for Head-Mountable Devices

Publications (1)

Publication Number Publication Date
US20140101608A1 true US20140101608A1 (en) 2014-04-10

Family

ID=50432335

Family Applications (5)

Application Number Title Priority Date Filing Date
US13/840,016 Abandoned US20140101608A1 (en) 2012-10-05 2013-03-15 User Interfaces for Head-Mountable Devices
US13/861,217 Active 2034-01-22 US9250769B2 (en) 2012-10-05 2013-04-11 Grouping of cards by time periods and content types
US13/890,049 Active 2034-05-24 US9454288B2 (en) 2012-10-05 2013-05-08 One-dimensional to two-dimensional list navigation
US14/975,668 Expired - Fee Related US10254923B2 (en) 2012-10-05 2015-12-18 Grouping of cards by time periods and content types
US16/276,435 Abandoned US20190179497A1 (en) 2012-10-05 2019-02-14 Grouping of Cards by Time Periods and Content Types

Family Applications After (4)

Application Number Title Priority Date Filing Date
US13/861,217 Active 2034-01-22 US9250769B2 (en) 2012-10-05 2013-04-11 Grouping of cards by time periods and content types
US13/890,049 Active 2034-05-24 US9454288B2 (en) 2012-10-05 2013-05-08 One-dimensional to two-dimensional list navigation
US14/975,668 Expired - Fee Related US10254923B2 (en) 2012-10-05 2015-12-18 Grouping of cards by time periods and content types
US16/276,435 Abandoned US20190179497A1 (en) 2012-10-05 2019-02-14 Grouping of Cards by Time Periods and Content Types

Country Status (2)

Country Link
US (5) US20140101608A1 (en)
WO (3) WO2014055929A1 (en)

Cited By (103)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140198129A1 (en) * 2013-01-13 2014-07-17 Qualcomm Incorporated Apparatus and method for controlling an augmented reality device
US20140215380A1 (en) * 2013-01-31 2014-07-31 Hyungsuk Kang Image display apparatus and method for operating the same
KR101430614B1 (en) * 2014-05-30 2014-08-18 주식회사 모리아타운 Display device using wearable eyeglasses and method for operating the same
US20140317522A1 (en) * 2013-04-17 2014-10-23 Nokia Corporation Method and Apparatus for Causing Display of Notification Content
US20140344627A1 (en) * 2013-05-16 2014-11-20 Advantest Corporation Voice recognition virtual test engineering assistant
US20140373123A1 (en) * 2013-06-18 2014-12-18 Samsung Electronics Co., Ltd. Service providing method and electronic device using the same
US20150007114A1 (en) * 2013-06-28 2015-01-01 Adam G. Poulos Web-like hierarchical menu display configuration for a near-eye display
US20150138417A1 (en) * 2013-11-18 2015-05-21 Joshua J. Ratcliff Viewfinder wearable, at least in part, by human operator
US20150138081A1 (en) * 2013-02-22 2015-05-21 Sony Corporation Head-mounted display system, head-mounted display, and head-mounted display control program
US20150212324A1 (en) * 2014-01-24 2015-07-30 Osterhout Group, Inc. Peripheral lighting for head worn computing
US9116545B1 (en) * 2012-03-21 2015-08-25 Hayes Solos Raffle Input detection
US20150241957A1 (en) * 2014-02-21 2015-08-27 Sony Corporation Control apparatus, information processing apparatus, control method, information processing method, information processing system and wearable device
US20150256674A1 (en) * 2014-03-10 2015-09-10 Qualcomm Incorporated Devices and methods for facilitating wireless communications based on implicit user cues
US20150302855A1 (en) * 2014-04-21 2015-10-22 Qualcomm Incorporated Method and apparatus for activating application by speech input
US9176582B1 (en) * 2013-04-10 2015-11-03 Google Inc. Input system
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
US20160011420A1 (en) * 2014-07-08 2016-01-14 Lg Electronics Inc. Glasses-type terminal and method for controlling the same
US20160018901A1 (en) * 2014-07-18 2016-01-21 Richard D. Woolley Enabling data tracking without requiring direct contact
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
WO2016043820A1 (en) * 2014-09-15 2016-03-24 Google Inc. Managing information display
US9298002B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Optical configurations for head worn computing
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
US9310610B2 (en) 2014-01-21 2016-04-12 Osterhout Group, Inc. See-through computer display systems
US9316833B2 (en) 2014-01-21 2016-04-19 Osterhout Group, Inc. Optical configurations for head worn computing
US20160110046A1 (en) * 2013-03-29 2016-04-21 Hewlett-Packard Development Company, L.P. Adjustable timeline user interface
US9329387B2 (en) 2014-01-21 2016-05-03 Osterhout Group, Inc. See-through computer display systems
US9366868B2 (en) 2014-09-26 2016-06-14 Osterhout Group, Inc. See-through computer display systems
US9366867B2 (en) 2014-07-08 2016-06-14 Osterhout Group, Inc. Optical systems for see-through displays
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9423612B2 (en) 2014-03-28 2016-08-23 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9507481B2 (en) 2013-04-17 2016-11-29 Nokia Technologies Oy Method and apparatus for determining an invocation input based on cognitive load
US9523856B2 (en) 2014-01-21 2016-12-20 Osterhout Group, Inc. See-through computer display systems
US9529192B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. Eye imaging in head worn computing
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9532715B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US9547465B2 (en) 2014-02-14 2017-01-17 Osterhout Group, Inc. Object shadowing in head worn computing
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US9575563B1 (en) * 2013-12-30 2017-02-21 X Development Llc Tap to initiate a next action for user requests
US9584915B2 (en) 2015-01-19 2017-02-28 Microsoft Technology Licensing, Llc Spatial audio with remote speakers
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9671626B2 (en) 2016-05-19 2017-06-06 Maximilian Ralph Peter von und zu Liechtenstein Apparatus and method for augmenting human vision by means of adaptive polarization filter grids
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9720234B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US20170242479A1 (en) * 2014-01-25 2017-08-24 Sony Interactive Entertainment America Llc Menu navigation in a head-mounted display
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US9784973B2 (en) 2014-02-11 2017-10-10 Osterhout Group, Inc. Micro doppler presentations in head worn computing
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US9811152B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9826299B1 (en) 2016-08-22 2017-11-21 Osterhout Group, Inc. Speaker systems for head-worn computer systems
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US20170351918A1 (en) * 2015-02-13 2017-12-07 Halliburton Energy Services, Inc. Distributing information using role-specific augmented reality devices
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
US9880441B1 (en) 2016-09-08 2018-01-30 Osterhout Group, Inc. Electrochromic systems for head-worn computer systems
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US10027606B2 (en) 2013-04-17 2018-07-17 Nokia Technologies Oy Method and apparatus for determining a notification representation indicative of a cognitive load
US10062182B2 (en) 2015-02-17 2018-08-28 Osterhout Group, Inc. See-through computer display systems
US10121063B2 (en) 2015-01-12 2018-11-06 BMT Business Meets Technology Holding AG Wink gesture based control system
US10168766B2 (en) 2013-04-17 2019-01-01 Nokia Technologies Oy Method and apparatus for a textural representation of a guidance
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US10320946B2 (en) 2013-03-15 2019-06-11 Sony Interactive Entertainment America Llc Virtual reality universe representation changes viewing based upon client side parameters
US10356215B1 (en) 2013-03-15 2019-07-16 Sony Interactive Entertainment America Llc Crowd and cloud enabled virtual reality distributed location network
US20190243598A1 (en) * 2012-10-10 2019-08-08 Samsung Electronics Co., Ltd. Head mounted display apparatus and method for displaying a content
US10474711B1 (en) 2013-03-15 2019-11-12 Sony Interactive Entertainment America Llc System and methods for effective virtual reality visitor interface
US10565249B1 (en) 2013-03-15 2020-02-18 Sony Interactive Entertainment America Llc Real time unified communications interaction of a predefined location in a virtual reality location
US10599707B1 (en) 2013-03-15 2020-03-24 Sony Interactive Entertainment America Llc Virtual reality enhanced through browser connections
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
USD885410S1 (en) * 2018-10-05 2020-05-26 Google Llc Display screen with animated graphical user interface
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
TWI719483B (en) * 2019-05-17 2021-02-21 雅得近顯股份有限公司 Convenient memo operating system
US10936163B2 (en) * 2018-07-17 2021-03-02 Methodical Mind, Llc. Graphical user interface system
US10949054B1 (en) 2013-03-15 2021-03-16 Sony Interactive Entertainment America Llc Personal digital assistance and virtual reality
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US11104272B2 (en) 2014-03-28 2021-08-31 Mentor Acquisition One, Llc System for assisted operator safety using an HMD
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US11269182B2 (en) 2014-07-15 2022-03-08 Mentor Acquisition One, Llc Content presentation in head worn computing
US11288355B2 (en) * 2020-05-05 2022-03-29 International Business Machines Corporation Detector for online user verification
US11381903B2 (en) 2014-02-14 2022-07-05 Sonic Blocks Inc. Modular quick-connect A/V system and methods thereof
US11392636B2 (en) 2013-10-17 2022-07-19 Nant Holdings Ip, Llc Augmented reality position-based service, methods, and systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11537269B2 (en) 2019-12-27 2022-12-27 Methodical Mind, Llc. Graphical user interface system
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11776255B2 (en) 2021-10-22 2023-10-03 Kyndryl, Inc. Dynamic input system for smart glasses based on user availability states
US11854153B2 (en) 2011-04-08 2023-12-26 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US11967034B2 (en) 2023-10-31 2024-04-23 Nant Holdings Ip, Llc Augmented reality object management system

Families Citing this family (66)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140068510A1 (en) * 2012-09-05 2014-03-06 Sap Ag Matrix menu
US9591339B1 (en) 2012-11-27 2017-03-07 Apple Inc. Agnostic media delivery system
US9774917B1 (en) 2012-12-10 2017-09-26 Apple Inc. Channel bar user interface
US10200761B1 (en) 2012-12-13 2019-02-05 Apple Inc. TV side bar user interface
KR102081797B1 (en) * 2012-12-13 2020-04-14 삼성전자주식회사 Glass apparatus and Method for controlling glass apparatus, Audio apparatus and Method for providing audio signal and Display apparatus
US9532111B1 (en) 2012-12-18 2016-12-27 Apple Inc. Devices and method for providing remote control hints on a display
US10521188B1 (en) 2012-12-31 2019-12-31 Apple Inc. Multi-user TV user interface
US20140195918A1 (en) * 2013-01-07 2014-07-10 Steven Friedlander Eye tracking user interface
KR101439250B1 (en) * 2013-06-27 2014-09-11 한국과학기술연구원 Transparent display device and method for providing user interface thereof
KR102184269B1 (en) * 2013-09-02 2020-11-30 삼성전자 주식회사 Display apparatus, portable apparatus and method for displaying a screen thereof
US10545657B2 (en) 2013-09-03 2020-01-28 Apple Inc. User interface for manipulating user interface objects
US9336189B2 (en) * 2013-11-15 2016-05-10 Glu Mobile Inc. Systems and methods for providing fused images to remote recipients for descrambling and interpretation
US9436577B2 (en) * 2013-11-22 2016-09-06 Nintendo Co., Ltd. System and method for generating a code execution timeline from an executing program
JP2015158753A (en) * 2014-02-21 2015-09-03 ソニー株式会社 Wearable device and control apparatus
CN106415475A (en) 2014-06-24 2017-02-15 苹果公司 Column interface for navigating in a user interface
CN111104040B (en) 2014-06-24 2023-10-24 苹果公司 Input device and user interface interactions
KR20160026323A (en) * 2014-08-29 2016-03-09 삼성전자주식회사 method and apparatus for controlling the notification information based on movement
CN112199000A (en) 2014-09-02 2021-01-08 苹果公司 Multi-dimensional object rearrangement
US10073590B2 (en) 2014-09-02 2018-09-11 Apple Inc. Reduced size user interface
US9984505B2 (en) * 2014-09-30 2018-05-29 Sony Interactive Entertainment Inc. Display of text information on a head-mounted display
US10691317B2 (en) 2014-10-24 2020-06-23 Flow Labs, Inc. Target-directed movement in a user interface
US9804669B2 (en) 2014-11-07 2017-10-31 Eye Labs, Inc. High resolution perception of content in a wide field of view of a head-mounted display
KR102183212B1 (en) 2014-11-18 2020-11-25 삼성전자주식회사 Method for controlling display and an electronic device thereof
US20160147304A1 (en) * 2014-11-24 2016-05-26 General Electric Company Haptic feedback on the density of virtual 3d objects
CN104570354A (en) * 2015-01-09 2015-04-29 京东方科技集团股份有限公司 Interactive glasses and visitor guide system
US10317215B2 (en) * 2015-01-09 2019-06-11 Boe Technology Group Co., Ltd. Interactive glasses and navigation system
US20180088785A1 (en) * 2015-02-26 2018-03-29 Flow Labs, Inc. Navigating a set of selectable items in a user interface
CN107430483B (en) * 2015-03-27 2021-03-23 谷歌有限责任公司 Navigation event information
JP6426525B2 (en) * 2015-04-20 2018-11-21 ファナック株式会社 Display system
EP3112986B1 (en) * 2015-07-03 2020-02-26 Nokia Technologies Oy Content browsing
WO2017033777A1 (en) * 2015-08-27 2017-03-02 株式会社コロプラ Program for controlling head-mounted display system
WO2017036953A1 (en) * 2015-09-02 2017-03-09 Thomson Licensing Method, apparatus and system for facilitating navigation in an extended scene
US20170316064A1 (en) * 2016-04-27 2017-11-02 Inthinc Technology Solutions, Inc. Critical event assistant
DK201670595A1 (en) 2016-06-11 2018-01-22 Apple Inc Configuring context-specific user interfaces
DK201670581A1 (en) 2016-06-12 2018-01-08 Apple Inc Device-level authorization for viewing content
DK201670582A1 (en) 2016-06-12 2018-01-02 Apple Inc Identifying applications on which content is available
JP6440910B2 (en) * 2016-07-29 2018-12-19 三菱電機株式会社 Display device, display control device, and display control method
EP4044613A1 (en) 2016-10-26 2022-08-17 Apple Inc. User interfaces for browsing content from multiple content applications on an electronic device
US10437343B2 (en) * 2017-01-06 2019-10-08 Samsung Electronics Co., Ltd. Augmented reality control of internet of things devices
CN107632856A (en) * 2017-09-18 2018-01-26 联想(北京)有限公司 Object methods of exhibiting and system
FR3071639B1 (en) * 2017-09-22 2020-01-31 Lithium Media METHOD OF OPERATING A COMPUTER DEVICE AND COMPUTER DEVICE USING THE SAME
US10891803B2 (en) * 2017-10-16 2021-01-12 Comcast Cable Communications, Llc User interface and functions for virtual reality and augmented reality
JP2019086916A (en) * 2017-11-02 2019-06-06 オリンパス株式会社 Work support device, work support method, and work support program
USD900838S1 (en) 2018-02-13 2020-11-03 Zap Surgical Systems, Inc. Display screen or portion thereof with graphical user interface for a radiation treatment
DK201870354A1 (en) 2018-06-03 2019-12-20 Apple Inc. Setup procedures for an electronic device
CN110049187B (en) * 2019-03-22 2021-08-06 维沃移动通信(深圳)有限公司 Display method and terminal equipment
CN113906419A (en) 2019-03-24 2022-01-07 苹果公司 User interface for media browsing application
US11683565B2 (en) 2019-03-24 2023-06-20 Apple Inc. User interfaces for interacting with channels that provide content that plays in a media browsing application
CN113940088A (en) 2019-03-24 2022-01-14 苹果公司 User interface for viewing and accessing content on an electronic device
CN114115676A (en) 2019-03-24 2022-03-01 苹果公司 User interface including selectable representations of content items
US11657298B2 (en) 2019-04-19 2023-05-23 T-Mobile Usa, Inc. Card engine for producing dynamically configured content
US11194717B2 (en) 2019-04-19 2021-12-07 T-Mobile Usa, Inc. Facts control and evaluating card definitions using cached facts
EP3977245A1 (en) 2019-05-31 2022-04-06 Apple Inc. User interfaces for a podcast browsing and playback application
US11863837B2 (en) 2019-05-31 2024-01-02 Apple Inc. Notification of augmented reality content on an electronic device
US11016303B1 (en) * 2020-01-09 2021-05-25 Facebook Technologies, Llc Camera mute indication for headset user
US11483155B2 (en) 2020-01-22 2022-10-25 T-Mobile Usa, Inc. Access control using proof-of-possession token
US11675773B2 (en) 2020-01-22 2023-06-13 T-Mobile Usa, Inc. Content management
US11481196B2 (en) 2020-01-22 2022-10-25 T-Mobile Usa, Inc. User interface for accessing and modifying development area content
US11843838B2 (en) 2020-03-24 2023-12-12 Apple Inc. User interfaces for accessing episodes of a content series
US11899895B2 (en) 2020-06-21 2024-02-13 Apple Inc. User interfaces for setting up an electronic device
US11720229B2 (en) 2020-12-07 2023-08-08 Apple Inc. User interfaces for browsing and presenting content
US11276221B1 (en) * 2021-01-27 2022-03-15 International Business Machines Corporation Creating an animated pictogram
US11934640B2 (en) 2021-01-29 2024-03-19 Apple Inc. User interfaces for record labels
CN115407949A (en) * 2021-05-11 2022-11-29 中强光电股份有限公司 Display image adjusting method and augmented reality display device
US11893212B2 (en) 2021-06-06 2024-02-06 Apple Inc. User interfaces for managing application widgets
US11921812B2 (en) * 2022-05-19 2024-03-05 Dropbox, Inc. Content creative web browser

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020126066A1 (en) * 1993-08-12 2002-09-12 Seiko Epson Corporation Head-mounted image display device and data processing apparatus including the same
US20040150668A1 (en) * 2003-01-31 2004-08-05 Xerox Corporation Secondary touch contextual sub-menu navigation for touch screen interface
US20060026535A1 (en) * 2004-07-30 2006-02-02 Apple Computer Inc. Mode-based graphical user interfaces for touch sensitive input devices
US20060156228A1 (en) * 2004-11-16 2006-07-13 Vizible Corporation Spatially driven content presentation in a cellular environment
US20060271867A1 (en) * 2005-05-27 2006-11-30 Wang Kong Q Mobile communications terminal and method therefore
US20080276196A1 (en) * 2007-05-04 2008-11-06 Apple Inc. Automatically adjusting media display in a personal display system
US20080295016A1 (en) * 2007-05-25 2008-11-27 Mathieu Audet Timescale for representing information
US20090150775A1 (en) * 2007-12-07 2009-06-11 Sony Corporation Information display terminal, information display method and program
US20100060666A1 (en) * 2008-09-09 2010-03-11 Microsoft Corporation Zooming graphical user interface
US20110296351A1 (en) * 2010-05-26 2011-12-01 T-Mobile Usa, Inc. User Interface with Z-axis Interaction and Multiple Stacks
US20130141324A1 (en) * 2011-12-02 2013-06-06 Microsoft Corporation User interface control based on head orientation
US8473860B2 (en) * 2010-02-12 2013-06-25 Microsoft Corporation Multi-layer user interface with flexible parallel and orthogonal movement

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6768999B2 (en) 1996-06-28 2004-07-27 Mirror Worlds Technologies, Inc. Enterprise, stream-based, information management system
US8843850B2 (en) * 1999-07-22 2014-09-23 Tavusi Data Solutions Llc Graphic-information flow for visually analyzing patterns and relationships
AU2001238311A1 (en) 2000-02-14 2001-08-27 Geophoenix, Inc. System and method for graphical programming
EP1346337A4 (en) * 2000-11-20 2006-06-14 Displaytech Inc Dual mode near-eye and projection display system
US8099680B1 (en) 2002-03-12 2012-01-17 Arris Group, Inc. System and method of contextual pre-tuning
US20050102634A1 (en) 2003-11-10 2005-05-12 Sloo David H. Understandable navigation of an information array
US7353466B2 (en) * 2004-05-28 2008-04-01 Microsoft Corporation System and method for generating message notification objects on dynamically scaled timeline
US8464176B2 (en) * 2005-01-19 2013-06-11 Microsoft Corporation Dynamic stacking and expansion of visual items
EP1755051A1 (en) * 2005-08-15 2007-02-21 Mitsubishi Electric Information Technology Centre Europe B.V. Method and apparatus for accessing data using a symbolic representation space
US8160400B2 (en) 2005-11-17 2012-04-17 Microsoft Corporation Navigating images using image based geometric alignment and object based controls
US8683362B2 (en) * 2008-05-23 2014-03-25 Qualcomm Incorporated Card metaphor for activities in a computing device
WO2008002999A2 (en) * 2006-06-27 2008-01-03 Metabeam Corporation Digital content playback
CA2666016C (en) 2008-05-15 2014-07-22 Mathieu Audet Method for building a search algorithm and method for linking documents with an object
US20100293105A1 (en) * 2009-05-15 2010-11-18 Microsoft Corporation Social networking updates for image display devices
US20110078626A1 (en) * 2009-09-28 2011-03-31 William Bachman Contextual Presentation of Digital Media Asset Collections
US8584047B2 (en) * 2010-05-18 2013-11-12 Microsoft Corporation Orbital representation of hierarchical navigation
US9128960B2 (en) * 2011-01-14 2015-09-08 Apple Inc. Assisted image selection
JP5977922B2 (en) 2011-02-24 2016-08-24 セイコーエプソン株式会社 Information processing apparatus, information processing apparatus control method, and transmissive head-mounted display apparatus
US9524651B2 (en) * 2011-07-25 2016-12-20 Raymond Fix System and method for electronic communication using a voiceover in combination with user interaction events on a selected background
US20130136411A1 (en) * 2011-11-28 2013-05-30 Microsoft Corporation Time-shifted content channels
US20130147686A1 (en) * 2011-12-12 2013-06-13 John Clavin Connecting Head Mounted Displays To External Displays And Other Communication Networks
WO2014028074A1 (en) * 2012-08-17 2014-02-20 Flextronics Ap, Llc Intelligent television

Cited By (232)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US11869160B2 (en) 2011-04-08 2024-01-09 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US11854153B2 (en) 2011-04-08 2023-12-26 Nant Holdings Ip, Llc Interference based augmented reality hosting platforms
US9116545B1 (en) * 2012-03-21 2015-08-25 Hayes Solos Raffle Input detection
US20190243598A1 (en) * 2012-10-10 2019-08-08 Samsung Electronics Co., Ltd. Head mounted display apparatus and method for displaying a content
US11360728B2 (en) * 2012-10-10 2022-06-14 Samsung Electronics Co., Ltd. Head mounted display apparatus and method for displaying a content
US10359841B2 (en) * 2013-01-13 2019-07-23 Qualcomm Incorporated Apparatus and method for controlling an augmented reality device
US20140198129A1 (en) * 2013-01-13 2014-07-17 Qualcomm Incorporated Apparatus and method for controlling an augmented reality device
US11366515B2 (en) 2013-01-13 2022-06-21 Qualcomm Incorporated Apparatus and method for controlling an augmented reality device
US9182890B2 (en) * 2013-01-31 2015-11-10 Lg Electronics Inc. Image display apparatus and method for operating the same
US20140215380A1 (en) * 2013-01-31 2014-07-31 Hyungsuk Kang Image display apparatus and method for operating the same
US20160021415A1 (en) * 2013-01-31 2016-01-21 Lg Electronics Inc. Image display apparatus and method for operating the same
US20150138081A1 (en) * 2013-02-22 2015-05-21 Sony Corporation Head-mounted display system, head-mounted display, and head-mounted display control program
US9829997B2 (en) * 2013-02-22 2017-11-28 Sony Corporation Head-mounted display system, head-mounted display, and head-mounted display control program
US10320946B2 (en) 2013-03-15 2019-06-11 Sony Interactive Entertainment America Llc Virtual reality universe representation changes viewing based upon client side parameters
US10949054B1 (en) 2013-03-15 2021-03-16 Sony Interactive Entertainment America Llc Personal digital assistance and virtual reality
US10356215B1 (en) 2013-03-15 2019-07-16 Sony Interactive Entertainment America Llc Crowd and cloud enabled virtual reality distributed location network
US10474711B1 (en) 2013-03-15 2019-11-12 Sony Interactive Entertainment America Llc System and methods for effective virtual reality visitor interface
US10565249B1 (en) 2013-03-15 2020-02-18 Sony Interactive Entertainment America Llc Real time unified communications interaction of a predefined location in a virtual reality location
US11809679B2 (en) 2013-03-15 2023-11-07 Sony Interactive Entertainment LLC Personal digital assistance and virtual reality
US11272039B2 (en) 2013-03-15 2022-03-08 Sony Interactive Entertainment LLC Real time unified communications interaction of a predefined location in a virtual reality location
US10599707B1 (en) 2013-03-15 2020-03-24 Sony Interactive Entertainment America Llc Virtual reality enhanced through browser connections
US10938958B2 (en) 2013-03-15 2021-03-02 Sony Interactive Entertainment LLC Virtual reality universe representation changes viewing based upon client side parameters
US11064050B2 (en) 2013-03-15 2021-07-13 Sony Interactive Entertainment LLC Crowd and cloud enabled virtual reality distributed location network
US20160110046A1 (en) * 2013-03-29 2016-04-21 Hewlett-Packard Development Company, L.P. Adjustable timeline user interface
US9176582B1 (en) * 2013-04-10 2015-11-03 Google Inc. Input system
US10359835B2 (en) * 2013-04-17 2019-07-23 Nokia Technologies Oy Method and apparatus for causing display of notification content
US10168766B2 (en) 2013-04-17 2019-01-01 Nokia Technologies Oy Method and apparatus for a textural representation of a guidance
US10936069B2 (en) 2013-04-17 2021-03-02 Nokia Technologies Oy Method and apparatus for a textural representation of a guidance
US10027606B2 (en) 2013-04-17 2018-07-17 Nokia Technologies Oy Method and apparatus for determining a notification representation indicative of a cognitive load
US9507481B2 (en) 2013-04-17 2016-11-29 Nokia Technologies Oy Method and apparatus for determining an invocation input based on cognitive load
US20140317522A1 (en) * 2013-04-17 2014-10-23 Nokia Corporation Method and Apparatus for Causing Display of Notification Content
US20140344627A1 (en) * 2013-05-16 2014-11-20 Advantest Corporation Voice recognition virtual test engineering assistant
US9495266B2 (en) * 2013-05-16 2016-11-15 Advantest Corporation Voice recognition virtual test engineering assistant
US20140373123A1 (en) * 2013-06-18 2014-12-18 Samsung Electronics Co., Ltd. Service providing method and electronic device using the same
US20150007114A1 (en) * 2013-06-28 2015-01-01 Adam G. Poulos Web-like hierarchical menu display configuration for a near-eye display
US9563331B2 (en) * 2013-06-28 2017-02-07 Microsoft Technology Licensing, Llc Web-like hierarchical menu display configuration for a near-eye display
US11392636B2 (en) 2013-10-17 2022-07-19 Nant Holdings Ip, Llc Augmented reality position-based service, methods, and systems
US20150138417A1 (en) * 2013-11-18 2015-05-21 Joshua J. Ratcliff Viewfinder wearable, at least in part, by human operator
US9491365B2 (en) * 2013-11-18 2016-11-08 Intel Corporation Viewfinder wearable, at least in part, by human operator
US9798517B2 (en) * 2013-12-30 2017-10-24 X Development Llc Tap to initiate a next action for user requests
US20170139672A1 (en) * 2013-12-30 2017-05-18 X Development Llc Tap to Initiate a Next Action for User Requests
US9575563B1 (en) * 2013-12-30 2017-02-21 X Development Llc Tap to initiate a next action for user requests
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US11169623B2 (en) 2014-01-17 2021-11-09 Mentor Acquisition One, Llc External user interface for head worn computing
US11231817B2 (en) 2014-01-17 2022-01-25 Mentor Acquisition One, Llc External user interface for head worn computing
US11782529B2 (en) 2014-01-17 2023-10-10 Mentor Acquisition One, Llc External user interface for head worn computing
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US11507208B2 (en) 2014-01-17 2022-11-22 Mentor Acquisition One, Llc External user interface for head worn computing
US11126003B2 (en) 2014-01-21 2021-09-21 Mentor Acquisition One, Llc See-through computer display systems
US9329387B2 (en) 2014-01-21 2016-05-03 Osterhout Group, Inc. See-through computer display systems
US9310610B2 (en) 2014-01-21 2016-04-12 Osterhout Group, Inc. See-through computer display systems
US9538915B2 (en) 2014-01-21 2017-01-10 Osterhout Group, Inc. Eye imaging in head worn computing
US11947126B2 (en) 2014-01-21 2024-04-02 Mentor Acquisition One, Llc See-through computer display systems
US9532714B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US11892644B2 (en) 2014-01-21 2024-02-06 Mentor Acquisition One, Llc See-through computer display systems
US10866420B2 (en) 2014-01-21 2020-12-15 Mentor Acquisition One, Llc See-through computer display systems
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US9615742B2 (en) 2014-01-21 2017-04-11 Osterhout Group, Inc. Eye imaging in head worn computing
US9651788B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9436006B2 (en) 2014-01-21 2016-09-06 Osterhout Group, Inc. See-through computer display systems
US9651789B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-Through computer display systems
US9651783B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US9532715B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US9658458B2 (en) 2014-01-21 2017-05-23 Osterhout Group, Inc. See-through computer display systems
US9658457B2 (en) 2014-01-21 2017-05-23 Osterhout Group, Inc. See-through computer display systems
US9377625B2 (en) 2014-01-21 2016-06-28 Osterhout Group, Inc. Optical configurations for head worn computing
US11353957B2 (en) 2014-01-21 2022-06-07 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11054902B2 (en) 2014-01-21 2021-07-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9684171B2 (en) 2014-01-21 2017-06-20 Osterhout Group, Inc. See-through computer display systems
US9684165B2 (en) 2014-01-21 2017-06-20 Osterhout Group, Inc. Eye imaging in head worn computing
US9298001B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Optical configurations for head worn computing
US11796805B2 (en) 2014-01-21 2023-10-24 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9720235B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9720227B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US9720234B2 (en) 2014-01-21 2017-08-01 Osterhout Group, Inc. See-through computer display systems
US10705339B2 (en) 2014-01-21 2020-07-07 Mentor Acquisition One, Llc Suppression of stray light in head worn computing
US10139632B2 (en) 2014-01-21 2018-11-27 Osterhout Group, Inc. See-through computer display systems
US9740012B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. See-through computer display systems
US9740280B2 (en) 2014-01-21 2017-08-22 Osterhout Group, Inc. Eye imaging in head worn computing
US10579140B2 (en) 2014-01-21 2020-03-03 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9746676B2 (en) 2014-01-21 2017-08-29 Osterhout Group, Inc. See-through computer display systems
US11099380B2 (en) 2014-01-21 2021-08-24 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US9772492B2 (en) 2014-01-21 2017-09-26 Osterhout Group, Inc. Eye imaging in head worn computing
US10698223B2 (en) 2014-01-21 2020-06-30 Mentor Acquisition One, Llc See-through computer display systems
US9529199B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9811152B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9811159B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US11103132B2 (en) 2014-01-21 2021-08-31 Mentor Acquisition One, Llc Eye imaging in head worn computing
US11719934B2 (en) 2014-01-21 2023-08-08 Mentor Acquisition One, Llc Suppression of stray light in head worn computing
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9829703B2 (en) 2014-01-21 2017-11-28 Osterhout Group, Inc. Eye imaging in head worn computing
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US10073266B2 (en) 2014-01-21 2018-09-11 Osterhout Group, Inc. See-through computer display systems
US9298002B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Optical configurations for head worn computing
US11619820B2 (en) 2014-01-21 2023-04-04 Mentor Acquisition One, Llc See-through computer display systems
US9316833B2 (en) 2014-01-21 2016-04-19 Osterhout Group, Inc. Optical configurations for head worn computing
US11622426B2 (en) 2014-01-21 2023-04-04 Mentor Acquisition One, Llc See-through computer display systems
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9885868B2 (en) 2014-01-21 2018-02-06 Osterhout Group, Inc. Eye imaging in head worn computing
US9927612B2 (en) 2014-01-21 2018-03-27 Osterhout Group, Inc. See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9933622B2 (en) 2014-01-21 2018-04-03 Osterhout Group, Inc. See-through computer display systems
US9529192B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. Eye imaging in head worn computing
US10001644B2 (en) 2014-01-21 2018-06-19 Osterhout Group, Inc. See-through computer display systems
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9958674B2 (en) 2014-01-21 2018-05-01 Osterhout Group, Inc. Eye imaging in head worn computing
US9523856B2 (en) 2014-01-21 2016-12-20 Osterhout Group, Inc. See-through computer display systems
US10558050B2 (en) 2014-01-24 2020-02-11 Mentor Acquisition One, Llc Haptic systems for head-worn computers
US9400390B2 (en) * 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US9846308B2 (en) 2014-01-24 2017-12-19 Osterhout Group, Inc. Haptic systems for head-worn computers
US11782274B2 (en) 2014-01-24 2023-10-10 Mentor Acquisition One, Llc Stray light suppression for head worn computing
US10578874B2 (en) 2014-01-24 2020-03-03 Mentor Acquisition One, Llc Stray light suppression for head worn computing
US11822090B2 (en) 2014-01-24 2023-11-21 Mentor Acquisition One, Llc Haptic systems for head-worn computers
US9939646B2 (en) 2014-01-24 2018-04-10 Osterhout Group, Inc. Stray light suppression for head worn computing
US20150212324A1 (en) * 2014-01-24 2015-07-30 Osterhout Group, Inc. Peripheral lighting for head worn computing
US20210357028A1 (en) * 2014-01-25 2021-11-18 Sony Interactive Entertainment LLC Menu navigation in a head-mounted display
US11693476B2 (en) * 2014-01-25 2023-07-04 Sony Interactive Entertainment LLC Menu navigation in a head-mounted display
US20170242479A1 (en) * 2014-01-25 2017-08-24 Sony Interactive Entertainment America Llc Menu navigation in a head-mounted display
US11036292B2 (en) 2014-01-25 2021-06-15 Sony Interactive Entertainment LLC Menu navigation in a head-mounted display
US10809798B2 (en) * 2014-01-25 2020-10-20 Sony Interactive Entertainment LLC Menu navigation in a head-mounted display
US9841602B2 (en) 2014-02-11 2017-12-12 Osterhout Group, Inc. Location indicating avatar in head worn computing
US9843093B2 (en) 2014-02-11 2017-12-12 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9784973B2 (en) 2014-02-11 2017-10-10 Osterhout Group, Inc. Micro doppler presentations in head worn computing
US9547465B2 (en) 2014-02-14 2017-01-17 Osterhout Group, Inc. Object shadowing in head worn computing
US11381903B2 (en) 2014-02-14 2022-07-05 Sonic Blocks Inc. Modular quick-connect A/V system and methods thereof
US10140079B2 (en) 2014-02-14 2018-11-27 Osterhout Group, Inc. Object shadowing in head worn computing
US9928019B2 (en) 2014-02-14 2018-03-27 Osterhout Group, Inc. Object shadowing in head worn computing
US20150241957A1 (en) * 2014-02-21 2015-08-27 Sony Corporation Control apparatus, information processing apparatus, control method, information processing method, information processing system and wearable device
US10394330B2 (en) * 2014-03-10 2019-08-27 Qualcomm Incorporated Devices and methods for facilitating wireless communications based on implicit user cues
US20150256674A1 (en) * 2014-03-10 2015-09-10 Qualcomm Incorporated Devices and methods for facilitating wireless communications based on implicit user cues
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9423612B2 (en) 2014-03-28 2016-08-23 Osterhout Group, Inc. Sensor dependent content position in head worn computing
US11104272B2 (en) 2014-03-28 2021-08-31 Mentor Acquisition One, Llc System for assisted operator safety using an HMD
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US10770075B2 (en) * 2014-04-21 2020-09-08 Qualcomm Incorporated Method and apparatus for activating application by speech input
US20150302855A1 (en) * 2014-04-21 2015-10-22 Qualcomm Incorporated Method and apparatus for activating application by speech input
US11727223B2 (en) 2014-04-25 2023-08-15 Mentor Acquisition One, Llc Language translation with head-worn computing
US10634922B2 (en) 2014-04-25 2020-04-28 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US11474360B2 (en) 2014-04-25 2022-10-18 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US10853589B2 (en) 2014-04-25 2020-12-01 Mentor Acquisition One, Llc Language translation with head-worn computing
US11880041B2 (en) 2014-04-25 2024-01-23 Mentor Acquisition One, Llc Speaker assembly for headworn computer
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
KR101430614B1 (en) * 2014-05-30 2014-08-18 주식회사 모리아타운 Display device using wearable eyeglasses and method for operating the same
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
US11402639B2 (en) 2014-06-05 2022-08-02 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US10877270B2 (en) 2014-06-05 2020-12-29 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US11960089B2 (en) 2014-06-05 2024-04-16 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US11327323B2 (en) 2014-06-09 2022-05-10 Mentor Acquisition One, Llc Content presentation in head worn computing
US11360318B2 (en) 2014-06-09 2022-06-14 Mentor Acquisition One, Llc Content presentation in head worn computing
US10139635B2 (en) 2014-06-09 2018-11-27 Osterhout Group, Inc. Content presentation in head worn computing
US11663794B2 (en) 2014-06-09 2023-05-30 Mentor Acquisition One, Llc Content presentation in head worn computing
US9720241B2 (en) 2014-06-09 2017-08-01 Osterhout Group, Inc. Content presentation in head worn computing
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
US10976559B2 (en) 2014-06-09 2021-04-13 Mentor Acquisition One, Llc Content presentation in head worn computing
US11022810B2 (en) 2014-06-09 2021-06-01 Mentor Acquisition One, Llc Content presentation in head worn computing
US11790617B2 (en) 2014-06-09 2023-10-17 Mentor Acquisition One, Llc Content presentation in head worn computing
US10649220B2 (en) 2014-06-09 2020-05-12 Mentor Acquisition One, Llc Content presentation in head worn computing
US10663740B2 (en) 2014-06-09 2020-05-26 Mentor Acquisition One, Llc Content presentation in head worn computing
US11887265B2 (en) 2014-06-09 2024-01-30 Mentor Acquisition One, Llc Content presentation in head worn computing
US11294180B2 (en) 2014-06-17 2022-04-05 Mentor Acquisition One, Llc External user interface for head worn computing
US10698212B2 (en) 2014-06-17 2020-06-30 Mentor Acquisition One, Llc External user interface for head worn computing
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
US11789267B2 (en) 2014-06-17 2023-10-17 Mentor Acquisition One, Llc External user interface for head worn computing
US11054645B2 (en) 2014-06-17 2021-07-06 Mentor Acquisition One, Llc External user interface for head worn computing
US11940629B2 (en) 2014-07-08 2024-03-26 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US10031337B2 (en) * 2014-07-08 2018-07-24 Lg Electronics Inc. Glasses-type terminal and method for controlling the same
US11409110B2 (en) 2014-07-08 2022-08-09 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
KR102227087B1 (en) 2014-07-08 2021-03-12 엘지전자 주식회사 Wearable glass-type device and control method of the wearable glass-type device
US20160011420A1 (en) * 2014-07-08 2016-01-14 Lg Electronics Inc. Glasses-type terminal and method for controlling the same
US10775630B2 (en) 2014-07-08 2020-09-15 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
KR20160006053A (en) * 2014-07-08 2016-01-18 엘지전자 주식회사 Wearable glass-type device and control method of the wearable glass-type device
US10564426B2 (en) 2014-07-08 2020-02-18 Mentor Acquisition One, Llc Optical configurations for head-worn see-through displays
US9366867B2 (en) 2014-07-08 2016-06-14 Osterhout Group, Inc. Optical systems for see-through displays
US11786105B2 (en) 2014-07-15 2023-10-17 Mentor Acquisition One, Llc Content presentation in head worn computing
US11269182B2 (en) 2014-07-15 2022-03-08 Mentor Acquisition One, Llc Content presentation in head worn computing
US11103122B2 (en) 2014-07-15 2021-08-31 Mentor Acquisition One, Llc Content presentation in head worn computing
US20160018901A1 (en) * 2014-07-18 2016-01-21 Richard D. Woolley Enabling data tracking without requiring direct contact
US11360314B2 (en) 2014-08-12 2022-06-14 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US10908422B2 (en) 2014-08-12 2021-02-02 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
US11630315B2 (en) 2014-08-12 2023-04-18 Mentor Acquisition One, Llc Measuring content brightness in head worn computing
WO2016043820A1 (en) * 2014-09-15 2016-03-24 Google Inc. Managing information display
GB2544904A (en) * 2014-09-15 2017-05-31 Google Inc Managing information display
CN106471419A (en) * 2014-09-15 2017-03-01 谷歌公司 Management information shows
GB2544904B (en) * 2014-09-15 2021-07-21 Google Llc Managing information display
US9547365B2 (en) 2014-09-15 2017-01-17 Google Inc. Managing information display
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US9366868B2 (en) 2014-09-26 2016-06-14 Osterhout Group, Inc. See-through computer display systems
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US11809628B2 (en) 2014-12-03 2023-11-07 Mentor Acquisition One, Llc See-through computer display systems
US11262846B2 (en) 2014-12-03 2022-03-01 Mentor Acquisition One, Llc See-through computer display systems
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
US10684687B2 (en) 2014-12-03 2020-06-16 Mentor Acquisition One, Llc See-through computer display systems
USD743963S1 (en) 2014-12-22 2015-11-24 Osterhout Group, Inc. Air mouse
USD792400S1 (en) 2014-12-31 2017-07-18 Osterhout Group, Inc. Computer glasses
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
USD794637S1 (en) 2015-01-05 2017-08-15 Osterhout Group, Inc. Air mouse
US10121063B2 (en) 2015-01-12 2018-11-06 BMT Business Meets Technology Holding AG Wink gesture based control system
US9584915B2 (en) 2015-01-19 2017-02-28 Microsoft Technology Licensing, Llc Spatial audio with remote speakers
US10146998B2 (en) * 2015-02-13 2018-12-04 Halliburton Energy Services, Inc. Distributing information using role-specific augmented reality devices
US20170351918A1 (en) * 2015-02-13 2017-12-07 Halliburton Energy Services, Inc. Distributing information using role-specific augmented reality devices
US10062182B2 (en) 2015-02-17 2018-08-28 Osterhout Group, Inc. See-through computer display systems
US9671626B2 (en) 2016-05-19 2017-06-06 Maximilian Ralph Peter von und zu Liechtenstein Apparatus and method for augmenting human vision by means of adaptive polarization filter grids
US9826299B1 (en) 2016-08-22 2017-11-21 Osterhout Group, Inc. Speaker systems for head-worn computer systems
US11825257B2 (en) 2016-08-22 2023-11-21 Mentor Acquisition One, Llc Speaker systems for head-worn computer systems
US11350196B2 (en) 2016-08-22 2022-05-31 Mentor Acquisition One, Llc Speaker systems for head-worn computer systems
US10757495B2 (en) 2016-08-22 2020-08-25 Mentor Acquisition One, Llc Speaker systems for head-worn computer systems
US10768500B2 (en) 2016-09-08 2020-09-08 Mentor Acquisition One, Llc Electrochromic systems for head-worn computer systems
US11768417B2 (en) 2016-09-08 2023-09-26 Mentor Acquisition One, Llc Electrochromic systems for head-worn computer systems
US9880441B1 (en) 2016-09-08 2018-01-30 Osterhout Group, Inc. Electrochromic systems for head-worn computer systems
US11415856B2 (en) 2016-09-08 2022-08-16 Mentor Acquisition One, Llc Electrochromic systems for head-worn computer systems
US11861145B2 (en) * 2018-07-17 2024-01-02 Methodical Mind, Llc Graphical user interface system
US11372523B2 (en) * 2018-07-17 2022-06-28 Meso Scale Technologies, Llc. Graphical user interface system
US10936163B2 (en) * 2018-07-17 2021-03-02 Methodical Mind, Llc. Graphical user interface system
USD885410S1 (en) * 2018-10-05 2020-05-26 Google Llc Display screen with animated graphical user interface
TWI719483B (en) * 2019-05-17 2021-02-21 雅得近顯股份有限公司 Convenient memo operating system
US11537269B2 (en) 2019-12-27 2022-12-27 Methodical Mind, Llc. Graphical user interface system
US11288355B2 (en) * 2020-05-05 2022-03-29 International Business Machines Corporation Detector for online user verification
US11776255B2 (en) 2021-10-22 2023-10-03 Kyndryl, Inc. Dynamic input system for smart glasses based on user availability states
US11967034B2 (en) 2023-10-31 2024-04-23 Nant Holdings Ip, Llc Augmented reality object management system

Also Published As

Publication number Publication date
WO2014055952A2 (en) 2014-04-10
US9454288B2 (en) 2016-09-27
WO2014055929A1 (en) 2014-04-10
WO2014055952A3 (en) 2014-05-30
WO2014055948A2 (en) 2014-04-10
US10254923B2 (en) 2019-04-09
US20140101592A1 (en) 2014-04-10
WO2014055929A9 (en) 2014-05-22
US20190179497A1 (en) 2019-06-13
WO2014055948A3 (en) 2014-05-30
US20160188536A1 (en) 2016-06-30
US9250769B2 (en) 2016-02-02
US20140098102A1 (en) 2014-04-10

Similar Documents

Publication Publication Date Title
US20140101608A1 (en) User Interfaces for Head-Mountable Devices
US9507426B2 (en) Using the Z-axis in user interfaces for head mountable displays
US20220131819A1 (en) Message Suggestions
US9811154B2 (en) Methods to pan, zoom, crop, and proportionally move on a head mountable display
US9798517B2 (en) Tap to initiate a next action for user requests
US20150193098A1 (en) Yes or No User-Interface
DK201870362A1 (en) Multi-participant live communication user interface
US20170236330A1 (en) Novel dual hmd and vr device with novel control methods and software
JP6609361B2 (en) Multi-participant live communication user interface
CN110720085A (en) Voice communication method
US10559024B1 (en) Voice initiated purchase request
US20160299641A1 (en) User Interface for Social Interactions on a Head-Mountable Display
US10194121B1 (en) Content capture
AU2019100499A4 (en) Multi-participant live communication user interface
US9727716B1 (en) Shared workspace associated with a voice-request account
AU2019266225B2 (en) Multi-participant live communication user interface
US20240134492A1 (en) Digital assistant interactions in extended reality
WO2022182668A1 (en) Digital assistant interactions in extended reality
KR20180057034A (en) Mobile terminal and method for controlling the same

Legal Events

Date Code Title Description
AS Assignment

Owner name: GOOGLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RYSKAMP, ROBERT ALLEN;BRAUN, MAX BENJAMIN;PATEL, NIRMAL;AND OTHERS;SIGNING DATES FROM 20130315 TO 20130401;REEL/FRAME:030155/0941

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: GOOGLE LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044695/0115

Effective date: 20170929