US20120242609A1 - Interacting With Physical and Digital Objects Via a Multi-Touch Device - Google Patents


Info

Publication number
US20120242609A1
Authority
US
United States
Prior art keywords
physical object
recited
display
digital
physical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/493,497
Inventor
Shahram Izadi
Abigail J. Sellen
Richard M. Banks
Stuart Taylor
Stephen E. Hodges
Alex Butler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp
Priority to US13/493,497
Publication of US20120242609A1
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC. Assignment of assignors interest (see document for details). Assignors: MICROSOFT CORPORATION
Current legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/10: Office automation; Time management
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/40: Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F 16/43: Querying
    • G06F 16/438: Presentation of query results
    • G06F 16/4387: Presentation of query results by the use of playlists
    • G06F 16/4393: Multimedia presentations, e.g. slide shows, multimedia albums
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management

Detailed Description (FIGS. 2 to 12)

  • FIG. 2 is a schematic side view of another example archiving system. Here, a display 201 is provided above a drawer 202 and a personal computer 203. These items need not be integrated into a table as described with reference to FIG. 1; rather, they may be provided in any suitable form, such as a stand-alone unit, or be integrated into a work surface. An image capture device 200, such as a camera, is positioned such that its field of view comprises at least part of the display. The archiving system is in communication with a server 204 to enable back-up of digital media from the archive. The apparatus of FIG. 2 may be used in a similar manner to that described above with reference to FIG. 1.
  • FIG. 3 is a schematic side view of an archiving system having a bowl 303 provided alongside a display 301 and arranged to hold hand-held digital media storage devices. A personal computer 302 is provided below the display and bowl 303. Data synchronization functionality, and optionally power charging functionality, may be embedded in the bowl or other receptacle in a similar manner as for the drawer 202, 102 of the earlier examples. The apparatus of FIG. 3 may be used in a similar manner to that described above with reference to FIG. 1.
  • In some embodiments the receptacle comprises power charging apparatus, integral with, attached to, or embedded in the receptacle itself. The power charging apparatus is suitable for charging hand-held digital media storage devices such as mobile telephones, digital cameras, personal digital assistants and the like. For example, this may comprise an inductive charging mat or apparatus such as those currently commercially available from Splashpower Limited (trade mark).
  • FIG. 4 is a schematic cross-section through a drawer 402 of an archiving system such as that of FIG. 1 or FIG. 2. A display 405 is shown above the drawer and a personal computer 406 is in communication with the archiving system for back-up purposes. The drawer is optionally provided with a shield to electro-magnetic radiation 400, 401, such as a metal layer provided around the drawer so that when the drawer is closed it forms a Faraday cage. Electronic equipment stored inside the drawer is then protected from electrostatic discharges, and electro-magnetic radiation from items in the drawer and from the power charging apparatus is prevented from leaking from the archiving system into the environment. The base of the drawer may comprise an inductive power charging mat 407 such as that described above; similar inductive power charging apparatus may be provided in the walls of the drawer and/or on the surface above the drawer. In some examples, antennas such as Bluetooth (trade mark) antennas 403 with reduced range are attached to the inside walls of the drawer. These provide means for data transmission between any hand-held digital media storage devices in the drawer and a personal computer 203 (FIG. 2) integrated into the archiving system. However, it is not essential to use such antennas 403 for data transmission; in cases where power charging apparatus is provided, this may itself provide data transmission functionality as described above.
  • In other examples, the data transmission apparatus is wired, such that hand-held digital media storage devices placed in the receptacle are physically connected to the data transmission apparatus. For example, cradles for the media storage devices may be provided in the receptacle for this purpose. USB connections, or any other suitable type of connections for data transmission, may also be provided in the receptacle.
  • The drawer may comprise one or more image capture devices 404, which may be cameras of any suitable type. These image capture devices 404 are used to capture images of physical objects placed in the drawer which it is required to archive, and they may be used instead of, or in addition to, the image capture device 103, 200 of FIGS. 1 and 2. In some examples, the antennas 403 and the image capture devices 404 are used to obtain information about the position and outline of any devices in the drawer 402; this is described in more detail later with reference to FIG. 10. In some embodiments the depth of the drawer 402 is sized relative to standard media storage devices such that those devices are forced to lie flat in the drawer in order for the drawer to close. This promotes increased areas of physical contact between the media storage devices and the drawer base, so that inductive charging and/or data transmission is enhanced.
  • Any suitable type of display may be used, as mentioned above. In the case of touch panel displays, these may use resistive touch panels, in which touching the screen causes layers normally separated by a small gap to come into contact, or capacitive touch panels, in which contact with a conductive object changes the capacitance. Another type of touch screen technology uses optical sensors (e.g. an optical sensor array) to detect when a screen is touched. Any of these types of touch panel may be used in the archiving system described herein; these are intended as a non-exhaustive list of examples. In some embodiments the touch panel display is a multi-touch panel display; for example, a multi-touch panel display is provided as now described with reference to FIG. 5.
  • In this example, the multi-touch panel 500 comprises a liquid crystal display (LCD) having retro-reflective opto sensors embedded behind it. This multi-touch panel 500 is placed over a drawer 501 and personal computer 502, or used in any of the arrangements described above with reference to FIGS. 1, 2, 3 and 4.
  • As shown in FIG. 6, a touch panel display 500 comprises a touch panel 602 that has several infrared (IR) sensors 604 integrated therein. Objects above a touchable surface 606 include an object 608A that is in contact with touchable surface 606 and an object 608B that is close to but not in actual contact with ("adjacent") touchable surface 606. Infrared sensors 604 are distributed throughout touch panel 602 parallel to touchable surface 606. One of infrared sensors 604 may detect infrared radiation reflected from objects 608A and 608B, as indicated by arrows 610. In FIG. 6, touchable surface 606 is horizontal, but in a different embodiment, generated by rotating system 500 clockwise by 90 degrees, touchable surface 606 could be vertical; in that embodiment, the objects from which reflected IR radiation is detected are to the side of touchable surface 606. The term "above" is intended to be applicable to all such orientations. Touch panel 602 may comprise filters 612 that absorb visible light and transmit infrared radiation; these are located between touchable surface 606 and IR sensors 604 in order to shield IR sensors 604 from visible light 614 incident on touchable surface 606, in the case where IR sensors 604 are sensitive to a broader range of wavelengths than purely infrared wavelengths.
  • Touch panel 602 may comprise a display that is configured to display images that are viewable via touchable surface 606. An eye 615 indicates a possible direction from which the images are viewed. The display may be, for example, an LCD, an organic light emitting diode (OLED) display, a flexible display such as electronic paper, or any other suitable display in which an IR sensor can be integrated. System 500 may comprise a backlight 616 for the display. Backlight 616 may comprise at least one IR source 618 that is configured to illuminate objects in contact with or adjacent to touchable surface 606 with infrared radiation through touchable surface 606, as indicated by arrows 620.
  • IR sensors 604 are only sensitive to radiation incident from above, so IR radiation traveling directly from backlight 616 to IR sensors 604 is not detected. The output of IR sensors 604 may be processed to identify a detected infrared image. The IR radiation reflected from the objects may be reflected from reflective ink patterns on the objects, metal designs on the objects or any other suitable reflector. For example, white paper reflects IR radiation and black ink absorbs IR radiation, so a conventional bar code on a surface of an object may be detected by an infrared-sensing device according to the described technology. Fingers are estimated to reflect about 10% of the near IR, which is sufficient to detect that a finger or hand is located at a particular location on or adjacent to the touchable surface. A higher resolution of IR sensors may be used to scan objects for applications such as document scanning and fingerprint recognition; fingerprint recognition, for example, generally requires a resolution of more than 200 dots per inch (dpi).
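  • Where such a detected infrared image is available, candidate touch points can be extracted with very little machinery. The following sketch is a hypothetical illustration (the patent does not specify an implementation): it thresholds a normalized IR frame at roughly the 10% reflectance level mentioned above and returns the centroid of each connected bright region. The threshold, minimum area and the `ir_frame` input are all assumptions.

```python
import numpy as np
from collections import deque

def detect_touch_blobs(ir_frame: np.ndarray, reflectance_threshold: float = 0.10,
                       min_area: int = 4):
    """Find candidate touch points in a normalized IR frame (values in [0, 1]).

    Fingers are taken to reflect roughly 10% of the near-IR illumination, so
    pixels at or above the threshold are treated as candidate contact regions.
    Returns a list of (row, col) centroids, one per connected bright region.
    """
    mask = ir_frame >= reflectance_threshold
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # Breadth-first search over the 4-connected bright region.
                queue, pixels = deque([(r, c)]), []
                visited[r, c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and not visited[ny, nx]):
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                if len(pixels) >= min_area:  # ignore single-pixel noise
                    ys, xs = zip(*pixels)
                    centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids
```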
  • FIG. 6 illustrates just one exemplary touch panel system. In other examples, the backlight may not comprise any IR sources and the touch panel may instead include a frontlight which comprises at least one IR source. In that case, the touchable surface of the system is a surface of the frontlight and not of the touch panel; the frontlight may comprise a light guide, so that IR radiation emitted from the IR source travels through the light guide and is directed towards the touchable surface and any objects in contact with or adjacent to it. In other examples, both the backlight and the frontlight comprise IR sources, or there is no backlight and the frontlight comprises both IR sources and visible light sources. Alternatively, the system may comprise neither a frontlight nor a backlight, with the IR sources instead integrated within the touch panel; for example, the touch panel may comprise an OLED display which comprises IR OLED emitters and IR-sensitive organic photosensors (which may comprise reverse-biased OLEDs). The IR source of the touch panel system may be turned on only if the touchable surface is touched; alternatively, the IR source may be turned on regardless of whether the touchable surface is touched, with detection of whether actual contact occurred processed along with the output of the IR sensor. Actual contact between the touchable surface and the object may be detected by any suitable means, including, for example, by a vibration sensor or microphone coupled to the touch panel. A non-exhaustive list of sensors to detect contact includes pressure-based mechanisms, micro-machined accelerometers, piezoelectric devices, capacitive sensors, resistive sensors, inductive sensors, laser vibrometers, and LED vibrometers.
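  • As a small illustration of the contact-versus-adjacent distinction, the sketch below simply gates IR-detected positions with a separate contact signal, such as the vibration sensor or microphone mentioned above. Both inputs are hypothetical interfaces; the patent leaves the fusion of the two signals open.

```python
def classify_touch_events(ir_centroids, contact_sensor_triggered: bool):
    """Label IR-detected object positions as 'contact' or 'adjacent'.

    The IR image alone shows where objects are; a separate contact sensor
    (e.g. a vibration sensor coupled to the touch panel) indicates whether
    any object actually touched the surface during the frame.
    """
    label = "contact" if contact_sensor_triggered else "adjacent"
    return [(position, label) for position in ir_centroids]
```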
  • IR sensors 604 may comprise suitable infrared-sensitive semiconductor elements. Semiconductor material that is infrared-sensitive includes polycrystalline silicon, monocrystalline silicon, microcrystalline silicon, nanocrystalline silicon, plastic semiconductors and other non-silicon based semiconductors. Devices based on polycrystalline, microcrystalline, monocrystalline or nanocrystalline silicon may have better stability than amorphous silicon devices, and TFTs based on these materials may have higher field mobility than amorphous silicon TFTs.
  • In another embodiment, a multi-touch panel display is provided as now described with reference to FIG. 7. A stylus-enabled digital liquid crystal display 704 is provided having a resistive touch overlay 703 and a polarizing filter 702 over the resistive touch overlay 703. The display 704 is positioned over a drawer 705 and a personal computer 706, or used in any of the other arrangements described above with reference to FIGS. 1 to 4. An image capture device such as a video camera 700 is used, having a field of view which comprises at least part of the display 704. The image capture device has a polarizing filter 701 which is crossed with respect to the polarizing filter 702 at the display.
  • Polarized light emitted from the liquid crystal display passes through the polarizing filter 702 and is thus polarized, whereas ambient light in the environment is not polarized, or is only partially polarized. Polarized light emitted from the liquid crystal display is blocked at the camera 700 by polarizing filter 701, which is substantially crossed with respect to the polarizing filter at the liquid crystal display. This means that, in an image received at the camera 700, image regions corresponding to the liquid crystal display are dark, whereas image regions corresponding to any objects between the liquid crystal display and the camera have a higher intensity. Ambient light from light sources in the environment which is reflected from any objects between the liquid crystal display and the camera is captured by the camera because it is not polarized. In addition, light from the liquid crystal display which reflects or scatters from any objects between the display and the camera may not be substantially polarized as a result of the reflection or scattering process; this light produces an image of those objects because at least some of it is able to pass through the polarizing filter 701 into the camera. Image segmentation has thus been achieved, because those regions of the image corresponding to objects between the liquid crystal display and the camera have a much higher intensity than those regions corresponding to the display itself.
  • A thresholding operation may optionally be carried out on the image to discard image elements with an intensity lower than a specified threshold, and a feathered mask may then be applied to smooth the edges of the remaining segmented image regions. The thresholding operation and the masking operation may be carried out at the computer 706 in the archiving system. This image segmentation process may be used as part of processes enabling functionality of the touch panel display; it may also be used as part of processes to capture and archive images of physical objects.
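  • A minimal sketch of that thresholding and feathering step, assuming the camera frame arrives as a grayscale array normalized to [0, 1] (the threshold value and blur radius are illustrative choices, not figures from the patent):

```python
import numpy as np

def segment_foreground(camera_frame: np.ndarray, threshold: float = 0.2,
                       feather_radius: int = 3) -> np.ndarray:
    """Segment objects above the display in a crossed-polarizer camera image.

    With crossed polarizing filters, display regions appear dark and objects
    between the display and the camera appear bright, so an intensity
    threshold separates them. A box-blurred ('feathered') mask then smooths
    the edges of the retained regions.
    """
    mask = (camera_frame >= threshold).astype(float)
    # Feather the binary mask with a separable box blur.
    kernel = np.ones(2 * feather_radius + 1) / (2 * feather_radius + 1)
    for axis in (0, 1):
        mask = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), axis, mask)
    return camera_frame * mask  # dark display regions fall away; edges fade smoothly
```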
  • A resistive touch overlay 703 is used. Resistive touch overlays are widely known and are typically composed of layers of material which, when touched, cause a change in electrical current which is registered as a touch event and sent to a controller for processing. A user is able to operate the multi-touch panel display provided by the liquid crystal display, resistive touch overlay 703 and polarizing filter 702 of FIG. 7 by placing his or her hands or digits on or just above the display and making hand gestures and movements, which may be bi-manual. The multi-touch panel display provides a user interface whereby such gestures and movements are used to control software applications provided on a computer 706 at the archiving system. For example, the gestures and movements may be used to determine any one or more of translation, rotation and zooming of a digital object, as sketched below. In addition, the user may make inputs using a stylus on the liquid crystal display.
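  • For instance, translation, rotation and zooming can be derived from two tracked contact points in the usual way. The sketch below assumes the overlay reports two (x, y) contact positions in consecutive frames; the function and its interface are illustrative rather than anything prescribed by the patent.

```python
import math

def two_touch_transform(p0, p1, q0, q1):
    """Derive (dx, dy, dtheta, scale) from two touches tracked across frames.

    (p0, p1) are the two contact positions in the previous frame and
    (q0, q1) the same contacts in the current frame, as (x, y) tuples.
    """
    # Translation: movement of the midpoint between the two touches.
    mid_before = ((p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2)
    mid_after = ((q0[0] + q1[0]) / 2, (q0[1] + q1[1]) / 2)
    dx, dy = mid_after[0] - mid_before[0], mid_after[1] - mid_before[1]
    # Rotation: change in angle of the line joining the two touches.
    angle_before = math.atan2(p1[1] - p0[1], p1[0] - p0[0])
    angle_after = math.atan2(q1[1] - q0[1], q1[0] - q0[0])
    # Zoom: change in distance between the two touches.
    dist_before = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
    dist_after = math.hypot(q1[0] - q0[0], q1[1] - q0[1])
    return dx, dy, angle_after - angle_before, dist_after / max(dist_before, 1e-9)
```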
  • FIG. 8 illustrates an example method of processing user input at the archiving system. User input may comprise hand gestures made on or above the touch panel (block 800). Using image segmentation as described above, an image of a user's hand or hands is obtained, together with images of any other physical objects such as pens, pieces of paper etc. that are on or above the multi-touch panel display (blocks 801 and 802). The segmented image is processed at the computer 706 to detect which regions in the image correspond to the user's hand or hands (block 803). This is achieved in any suitable manner, for example using stored templates of hand images and comparing the segmented image regions with those stored templates. Once a region corresponding to a hand is identified, parameters describing this region are computed (block 804). Any suitable parameters may be used, such as centre of mass, principal axis and bounding area; for example, the centre of mass, bounding box and principal axis of an individual connected component can be used to respectively translate, scale and rotate a virtual object. This process is then repeated for images in a sequence (block 805) captured by the camera, and differences are computed for corresponding parameters between images in the sequence (block 806). These difference values are then used, together with information from the resistive touch overlay, to control display of information on the touch panel display. In some examples, optical flow techniques are used to enable translation, scaling and rotation of items presented on the display via hand gestures and movements; such techniques are described in "PlayAnywhere: A Compact Interactive Tabletop Projection-Vision System", Andrew D. Wilson, ACM UIST 2005.
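  • A sketch of the parameter computation of block 804, assuming the segmented hand region arrives as a boolean mask (NumPy is used for illustration; the patent does not name an implementation):

```python
import numpy as np

def component_parameters(mask: np.ndarray):
    """Centre of mass, bounding box and principal-axis angle of one component.

    `mask` is a boolean image in which True marks a single connected
    component, e.g. a segmented hand. Centre of mass drives translation,
    bounding-box size drives scaling, and the principal axis (from the
    covariance of the pixel coordinates) drives rotation of a virtual object.
    """
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()                       # centre of mass
    bbox = (ys.min(), xs.min(), ys.max(), xs.max())     # bounding box
    cov = np.cov(np.stack([xs - cx, ys - cy]))
    eigenvalues, eigenvectors = np.linalg.eigh(cov)
    major = eigenvectors[:, np.argmax(eigenvalues)]
    angle = float(np.arctan2(major[1], major[0]))       # principal-axis angle
    return (cy, cx), bbox, angle

# Blocks 805-806: repeat per frame and difference the parameters, e.g.
#   d_translation = com_t - com_{t-1}
#   d_scale       = bbox_area_t / bbox_area_{t-1}
#   d_rotation    = angle_t - angle_{t-1}
```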
  • FIG. 9 illustrates an example method of archiving images of a physical object, such as a child's first pair of shoes or a printed photograph. A user places the physical object to be recorded on the display, and one or more images of that object are then captured against a background (block 900). The display may be arranged to present a uniform background color during this process, or any other suitable background; alternatively, the display may have a polarizing filter which is crossed with respect to a polarizing filter at a camera, as described above with reference to FIG. 7. The captured images of the object may include the user's hand, for example if the user is holding that object in position. The images are captured using any suitable image capture device (or combination of such devices) provided in the archiving system; for example, this may be an over-table camera such as camera 103 in FIG. 1. Alternatively, the display may itself be capable of capturing one or more images of the object; for example, a touch panel display may have scanning functionality.
  • The captured images are sent from the camera to a processor and associated memory provided in the archiving system. An optional image segmentation process (block 901) is then carried out to segment the background from the image of the object. Any suitable image segmentation process may be used; for example, if a known background was presented on the display, information about this known background may be used to carry out image segmentation, or alternatively polarization information may be used as described above with reference to FIG. 7. The captured image of the object, which has optionally been segmented, may then be presented on the display as a digital imprint of the object itself. That is, when a user removes the object from the display, the captured image of that object is presented in the place where the object had been (block 902). This provides an intuitive way for a user to view the results of the image capture process.
  • In some embodiments the image capture device comprises a camera having a range sensor which enables a 3D map of the surface of an object to be detected. Any suitable such camera may be used; a non-exhaustive list of examples includes those currently commercially available and those described in the following publications: "A CMOS 3D camera with millimetric depth resolution", Niclass et al., IEEE Custom Integrated Circuits Conference, pp. 705-708, October 2004; "A time-of-flight depth sensor—system description, issues and solutions", Gokturk et al., Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop, p. 35, 2004. In other embodiments the image capture device comprises a 3D laser or infra-red scanner which enables a 3D map of the surface of an object to be detected.
  • The user is then able to make input which is sensed by the sensing apparatus (block 903) in order to organize, annotate and store the captured image as required (block 904). Optionally, a back-up process may be carried out to back up the captured image to a location remote from the archiving system (block 905).
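  • Putting the blocks of FIG. 9 together, the whole flow might be wired up as in the sketch below. `camera`, `display`, `archive` and `segment` are hypothetical interfaces standing in for the system's hardware and storage; the patent defines the method steps, not this API.

```python
def archive_physical_object(camera, display, archive, segment=None):
    """Sketch of the FIG. 9 flow for archiving an image of a physical object."""
    image = camera.capture()                    # block 900: capture image(s)
    if segment is not None:
        image = segment(image)                  # block 901: optional segmentation
    position = display.last_object_position()
    display.show(image, at=position)            # block 902: digital imprint
    annotations = display.collect_user_input()  # block 903: sensed user input
    archive.store(image, annotations)           # block 904: organize, annotate, store
    archive.backup_remote(image)                # block 905: optional remote back-up
```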
  • As mentioned above, the receptacle 402 may comprise antennas 403 and cameras 404, and these are used to obtain information about the position and/or outline of any devices in the drawer 402. This is now described in more detail with reference to FIG. 10. A position of a media storage device in the receptacle of the archiving system is detected (block 1000); for example, a triangulation process is carried out using the antenna input signals in order to detect the location of a media storage device in the receptacle. Information about the characteristics of the media storage device, such as its identity, power charging status and data synchronization status, may also be provided to the processor of the archiving system, for example transferred using the antennas 403 or via the power charging mechanism.
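  • The triangulation of block 1000 could be realized as a standard linearized trilateration over range estimates from three or more antennas, as sketched below. The antenna positions, the ranging method (for example, received signal strength) and the least-squares formulation are all assumptions; the patent states only that a triangulation process is carried out.

```python
import numpy as np

def locate_device(antenna_positions, distances):
    """Estimate a device's (x, y) position in the drawer from range estimates.

    `antenna_positions` holds known (x, y) antenna locations on the drawer
    walls (three or more) and `distances` the corresponding range estimates.
    Subtracting the first circle equation from the others linearizes the
    system, which is then solved by least squares.
    """
    (x0, y0), d0 = antenna_positions[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(antenna_positions[1:], distances[1:]):
        rows.append([2 * (xi - x0), 2 * (yi - y0)])
        rhs.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    solution, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return tuple(solution)
```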
  • An image of the media storage device is then presented on the display. The image may be an icon or other image representing the particular media storage device, or it may comprise an outline, silhouette or other image of the actual media storage device as obtained from cameras 404 in the receptacle. The position information may be used to influence the location at which the image of the media storage device is presented on the display; for example, in the case that the receptacle is a drawer under the display, the image or outline may be presented immediately above the media storage device in the drawer, and in the case that the receptacle is a tray beside the touch panel display, the image may be presented in a representation of the tray on the display, using the position information. The image of the media storage device may incorporate information about the power charging status and/or data synchronization status of that device (block 1002); this information may be represented using colors or any other suitable markers.
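  • For illustration, the indicator of block 1002 might be assembled as follows. The colour scheme, the status values and the fields of `device` are invented for the sketch; the patent says only that status may be represented using colours or other suitable markers.

```python
# Hypothetical mapping from (charge status, sync status) to an indicator colour.
STATUS_COLOURS = {
    ("charging", "syncing"): "amber",
    ("charging", "synced"): "green",
    ("charged", "syncing"): "blue",
    ("charged", "synced"): "white",
}

def device_indicator(device, drawer_position):
    """Build the overlay drawn on the display for a device detected in the drawer."""
    return {
        "icon": device.outline or device.icon,  # silhouette from drawer cameras, else an icon
        "position": drawer_position,            # e.g. immediately above the device in the drawer
        "colour": STATUS_COLOURS.get(
            (device.charge_status, device.sync_status), "grey"),
    }
```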
  • In some examples, the physical objects to be archived have bar codes displayed on them. These bar codes may be of any suitable type able to store information about the related physical object; for example, they may be simple one-dimensional bar codes that are visible to the human eye, or bar codes presented in a manner invisible to the human eye but which can be detected using infra-red light sources and detectors. As illustrated in FIG. 11, an image of a physical object is captured as described above (block 1100). A bar code in the image of the object is detected (block 1101) using pattern recognition or other image processing techniques, and information from the bar code is extracted (block 1102). For example, this information may indicate the type or function of the object, such as that it is a stapler. The operation of the graphical user interface at the display is then influenced (block 1103). For example, if a pile of images is presented on the display and the user places a stapler on the display (that stapler being sensed by the sensing apparatus), this may initiate an action to group or attach the images in the pile together.
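  • The dispatch from a decoded bar code to interface behaviour (block 1103) could be as simple as a lookup table, as in the sketch below. Apart from the stapler example above, the payload strings and actions are hypothetical, and `ui` stands in for the archiving system's interface layer.

```python
# Hypothetical mapping from decoded bar-code payloads to interface actions.
OBJECT_ACTIONS = {
    "stapler": "group_selected_items",   # placing a stapler groups the pile of images
    "eraser": "delete_selected_items",
    "envelope": "email_selected_items",
}

def handle_tagged_object(barcode_payload: str, ui):
    """Trigger a user-interface action for a bar-coded physical object."""
    action = OBJECT_ACTIONS.get(barcode_payload)
    if action is not None:
        getattr(ui, action)()   # e.g. ui.group_selected_items()
```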
  • FIG. 12 illustrates various components of an exemplary computing-based device 1200 which may be implemented as any form of a computing and/or electronic device, and in which embodiments of an archiving system may be implemented. The computing-based device 1200 comprises one or more inputs 1201 which are of any suitable type for receiving images captured by an image capture device such as a camera. The device also comprises a communication interface 1202 which is arranged to transmit data during synchronization of the computing-based device 1200 and one or more media storage devices such as mobile telephones, digital cameras, personal digital assistants and the like.
  • Computing-based device 1200 also comprises one or more processors 1203 which may be microprocessors, controllers or any other suitable type of processors for processing computer-executable instructions to control the operation of the device in order to provide an archiving system. Platform software comprising an operating system 1204 or any other suitable platform software may be provided at the computing-based device to enable application software 1205 to be executed on the device. The computer-executable instructions may be provided using any computer-readable media, such as memory 1207. The memory is of any suitable type, such as random access memory (RAM) or a disk storage device of any type such as a magnetic or optical storage device, a hard disk drive, or a CD, DVD or other disc drive; flash memory, EPROM or EEPROM may also be used. An interface 1206 is provided to a touch panel display, which may be a multi-touch panel display, and an interface 1208 to a microphone and loudspeaker may optionally be provided.
  • The term ‘computer’ is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices, and the term ‘computer’ therefore includes PCs, servers, mobile telephones, personal digital assistants and many other devices. The methods described herein may be performed by software in machine-readable form on a storage medium. The software can be suitable for execution on a parallel processor or a serial processor, such that the method steps may be carried out in any suitable order, or simultaneously. Alternatively, a remote computer may store an example of the process described as software, and a local or terminal computer may access the remote computer and download a part or all of the software to run the program. The local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). All or a portion of the method steps may alternatively be carried out by a dedicated circuit, such as a DSP, programmable logic array, or the like.

Abstract

Existing tools for organizing family memories offer few possibilities for easily integrating both physical and digital materials in order to produce a single archive for a family (or other group of users). This also applies to archiving of physical objects and digital media in general (even for applications outside the field of family use). An archiving system is described which incorporates at least one image capture device, a display, a sensing apparatus arranged to detect user input associated with the display, a processor and memory, and a receptacle for holding digital media storage devices such as mobile telephones, digital cameras, personal digital assistants and the like. The image capture device is operable to capture digital images of physical objects for archiving. The receptacle comprises a data transmission apparatus for automatically transferring data with the digital media storage devices and optionally also a power charging apparatus.

Description

  • This application is a continuation of, and claims priority to, commonly assigned co-pending U.S. patent application Ser. No. 11/746,397, entitled “Archive for Physical and Digital Objects,” filed on May 9, 2007, the entire disclosure of which is incorporated by reference herein in its entirety.
  • BACKGROUND
  • Groups of individuals such as families, sports teams, school house groups, or other groups of individuals who work or collaborate with one another often collect a great deal of material in order to capture and preserve group memories or for other purposes such as education and knowledge sharing. This material may be in physical form, such as printed photographs, sports trophies, mascots, art work, birthday cards, theatre tickets etc. It may also be in digital form such as digital photographs and home videos. For example, there is a burgeoning amount of digital media which families capture and collect using a range of devices including camcorders, digital cameras, and, increasingly, mobile phones.
  • Existing tools for organizing family memories offer few possibilities for easily integrating both physical and digital materials in order to produce a single archive for a family (or other group of users). Furthermore, most families feel a great deal of guilt about the fact that these materials all exist in different places, and are often collected but never properly organized. At the same time, many households report that, if there were a fire in the house, aside from rescuing loved ones and pets, family memorabilia such as photos would be the next things that would be rescued. These materials are therefore very valuable to households and families, yet we have no good coherent systems to allow us to easily archive and organize such family physical and digital media in a manner that is simple to use. This also applies to archiving of physical objects and digital media in general (even for applications outside the field of family use).
  • Existing archiving systems often have complex user interfaces and detailed systems for annotating and labeling items with key words in order to organize the items. This leads to problems for novice users who find it difficult to operate complex systems for archiving and accessing items from the archive.
  • It will be understood that the invention is not limited to implementations that solve any or all of the above noted disadvantages.
  • SUMMARY
  • The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not an extensive overview of the disclosure and it does not identify key/critical elements of the invention or delineate the scope of the invention. Its sole purpose is to present some concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
  • Existing tools for organizing family memories offer few possibilities for easily integrating both physical and digital materials in order to produce a single archive for a family (or other group of users). This also applies to archiving of physical objects and digital media in general (even for applications outside the field of family use). An archiving system is described which incorporates at least one image capture device, a display, a sensing apparatus arranged to detect user input associated with the display, a processor and memory, and a receptacle for holding digital media storage devices such as mobile telephones, digital cameras, personal digital assistants and the like. The image capture device is operable to capture digital images of physical objects for archiving. The receptacle comprises a data transmission apparatus for automatically transferring data with the digital media storage devices and optionally also a power charging apparatus.
  • Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
  • DESCRIPTION OF THE DRAWINGS
  • The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein:
  • FIG. 1 is a schematic diagram of an example archiving system;
  • FIG. 2 is a schematic side view of another example archiving system;
  • FIG. 3 is a schematic side view of another example archiving system;
  • FIG. 4 is a schematic cross-section through an example archiving system;
  • FIG. 5 is a schematic side view of an example archiving system using a display having retro-reflective opto sensors;
  • FIG. 6 illustrates a cross-section through an exemplary touch panel display;
  • FIG. 7 is a schematic side view of an example archiving system using polarizing filters at a display and a camera;
  • FIG. 8 is a block diagram of an example method of processing user input at an archiving system;
  • FIG. 9 is a block diagram of an example method of archiving images of a physical object at an archiving system;
  • FIG. 10 is a block diagram of an example method of displaying indicators of charging and/or synchronizing status at a display of an archiving system;
  • FIG. 11 is a block diagram of an example method of using bar code information;
  • FIG. 12 illustrates an exemplary computing-based device in which embodiments of an archiving system may be implemented.
  • Like reference numerals are used to designate like parts in the accompanying drawings.
  • DETAILED DESCRIPTION
  • The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present example may be constructed or utilized. The description sets forth the functions of the example and the sequence of steps for constructing and operating the example. However, the same or equivalent functions and sequences may be accomplished by different examples.
  • Although the present examples are described and illustrated herein as being implemented in a family archive system, the system described is provided as an example and not a limitation. As those skilled in the art will appreciate, the present examples are suitable for application in a variety of different types of archives for physical objects and digital media, for use by single individuals or groups of individuals who may or may not be family groups.
  • The term “touch panel display” is used to refer to a surface arranged to display digital images electronically and where the surface is also arranged to detect a physical object (such as a stylus, human digit, playing piece, or tagged object such as a puck) which either makes contact with the surface or which is close to but not in actual contact with the surface. The digital images displayed may be of any suitable type such as video, still images, electronic drawings, graphical user interface features, or any other type of digital images. Some examples of touch panel displays have functionality to enable discrimination between events where objects come into contact with the touch panel and events where objects merely come into close adjacency with the touch panel.
  • The term “multi-touch panel display” is used to refer to a touch panel display which is able to both detect two or more co-occurring physical objects (or parts of the same object) which either make contact with the surface or come close to but do not make actual contact with the surface and discriminate between those two or more physical contacts.
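  • To make that discrimination requirement concrete, the sketch below shows the minimal bookkeeping such a panel needs: each newly sensed contact inherits the identity of the nearest contact from the previous frame, so two co-occurring touches keep distinct identities over time. The matching strategy and threshold are illustrative, not taken from the patent.

```python
import math

def assign_contact_ids(previous, current, max_jump=50.0):
    """Match newly sensed contacts to the previous frame's contacts.

    `previous` maps contact id -> (x, y); `current` is a list of newly
    sensed (x, y) positions. Each new position takes the id of the nearest
    unclaimed old contact within `max_jump` units, else it gets a fresh id.
    """
    assignments, used = {}, set()
    next_id = max(previous, default=-1) + 1
    for position in current:
        best_id, best_dist = None, max_jump
        for cid, old_position in previous.items():
            distance = math.hypot(position[0] - old_position[0],
                                  position[1] - old_position[1])
            if cid not in used and distance < best_dist:
                best_id, best_dist = cid, distance
        if best_id is None:                 # a new touch has appeared
            best_id, next_id = next_id, next_id + 1
        used.add(best_id)
        assignments[best_id] = position
    return assignments
```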
  • An Exemplary Apparatus
  • FIG. 1 is a schematic diagram of an example archiving system which is provided in the form of a table 100 with a drawer 102. For example, the table may be for domestic use such as in a family living room. The table has a display 101 at the table top and a camera 103 supported on an articulated arm so that its field of view comprises at least part of the display. The table optionally has a microphone and loudspeaker embedded or integrated into it.
  • The display 101 may be any suitable type of display for presenting digital images. A non-exhaustive list of examples is: a touch panel display, a multi-touch panel display, an area onto which information is projected either using front or rear projection, a liquid crystal display, a plasma screen. The display is suitable for presenting digital images such as videos, digital photographs, scanned documents, 3D images of physical objects and the like. It is also suitable for presenting graphical user interface items such as menus, dialog boxes and the like.
  • In some embodiments the display 101 may be integrated into the table top. For example, in the case of a touch panel display, multi-touch panel display or liquid crystal display. In other embodiments the display 101 comprises the table top itself. For example, in the case of front projection such as where a micro-projector may be integrated with the camera 103.
  • The archiving system also comprises a sensing apparatus, associated with the display 101. The sensing apparatus is arranged to detect user input to the archiving system comprising the position of one or more physical objects on or above the display. For example, the sensing apparatus may comprise an image capture device such as a camera positioned over or below the table. In the case of an over-table camera, camera 103 may be used as at least part of the sensing apparatus. The sensing apparatus may also be integral with the display itself in some cases, such as in the case of a touch panel display using retro-reflective opto sensors. In some examples, the sensing apparatus is arranged to sense not only the position of physical objects (such as a user's hand, digit, playing piece, puck or the like) above the screen, but is also able to detect movement of such objects in relation to information presented on the display.
  • The drawer 102 is sized and shaped to hold one or more digital hand-held media storage devices such as mobile telephones, personal digital assistants (PDAs), digital cameras, and the like. The drawer also optionally comprises a power charging apparatus (not shown in FIG. 1) which provides automatic power charging of media storage devices in the drawer. For example, inductive pads for achieving this automated power charging may be embedded in the drawer base and/or walls. This is described in more detail below.
  • The drawer also comprises a data transmission apparatus arranged at least to receive data from any media storage devices in the drawer. This data transmission apparatus may operate using wired communications or may provide physical connectionless data transmission with the media storage devices.
  • The table 100 also comprises a processor having an associated memory both of which may be incorporated into the table itself and not visible to the user. For example, the processor and memory are provided using a personal computer (PC) which may be a tablet PC. The processor comprises a data communications link to a server or communications network. The camera 103 is connected to the processor such that images captured by the camera 103 may be transferred to the processor and its memory. The sensing apparatus is also in communication with the processor such that output from the sensing apparatus may be transferred to the processor. Also, the processor is arranged to control the display.
  • The archiving system may be used by one or more users 105 at the same time. Physical objects 104 such as printed photographs, printed paper items, a child's first pair of shoes or other objects may be placed on the display during use.
  • The memory of the archiving system may be used to archive digital media items of any suitable type. A non-exhaustive list of examples is: music files, short message service (SMS) messages, email messages, voice mail messages, digital photographs, digital videos, text documents, ringtones, multimedia messages, web pages, calendar entries.
  • Exemplary Method
  • For example, the archiving system 100 is used to capture images of a physical object that it is required to archive. In the example of FIG. 1, physical objects 104 may be placed in the field of view of the camera 103 by placing them appropriately on the display 101. The camera 103 position may be adjusted by the users 105 if necessary using the articulated arm supporting the camera. A simple user interface is used to instruct the camera 103 to capture one or more images of the physical object 104. The user interface may be provided at any suitable location such as in the table top, as part of the display 101, or on the camera itself. For example, the camera may be a video camera and the user interface may comprise a single button which, when activated, causes the camera to record. This button may be a large, physical button provided to the side of the display 101. However, this is not essential; any type of user interface may be used. The captured images are then displayed at the display 101 and are stored on the basis of user input received at the sensing apparatus. In this way, images of physical objects may be captured and stored in an archive in a simple and effective manner. More detail about the process of capturing the images and storing them is given with reference to FIG. 8.
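  • The single-button interaction just described could be wired together as in the following sketch. All of the device interfaces (`button`, `camera`, `display`, `sensing`, `archive`) and the 'keep' gesture are hypothetical; the patent describes the interaction, not an API.

```python
def capture_loop(button, camera, display, sensing, archive):
    """Sketch of the capture interaction: press the button, review, store."""
    while True:
        button.wait_for_press()           # large physical button beside the display
        image = camera.capture()          # photograph the object on the table top
        display.show(image)               # captured image appears on the display
        gesture = sensing.next_user_input()
        if gesture == "keep":             # e.g. a tap on a 'store' control
            archive.store(image)          # stored on the basis of sensed user input
        else:
            display.dismiss(image)
```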
  • The archiving system 100 may also be used to capture sounds using the microphone. For example, human speech may be recorded and stored in the archive associated with another digital media item such as an image. This enables users to store speech, for example, explaining facts about a physical object whose image is stored in the archive. A loudspeaker or other transducer for audio playback may be provided to enable sound recordings stored in the archive to be played back.
  • The archiving system 100 is also able to receive digital media items that have already been created or captured using other devices, for example, hand-held digital media storage devices such as mobile telephones, personal digital assistants (PDAs), digital music storage devices, digital cameras, and the like. These digital media storage devices may be placed in the drawer 102 in order to enable digital data from those items to be uploaded onto the archiving system 100. Similarly, digital data from the archiving system may be transferred to the hand-held digital media storage devices. For example, the drawer 102 may have embedded data transmission devices which optionally also provide power charging functionality. The data transmission devices and power charging functionality are described in more detail below. Thus the drawer 102 not only provides a safe and secure storage area for hand-held digital media storage devices, out of sight of small children, but also provides functionality for data synchronization between those devices and the archiving system as well as (optionally) for safe, simple, and cost effective power charging.
  • Once a digital media storage device such as a mobile telephone is placed in the drawer, and the drawer is closed, a user may be presented with a display at the display 101 indicating the presence of the mobile telephone and showing its current power charge status and data synchronization status. Data synchronization may proceed automatically without input from the user or may occur as a result of specific user input. In the case that power charging is also provided this may also proceed automatically or as a result of specific user input.
  • Digital media items received from items in the drawer may be represented on the display, for example, as a pile of unsorted images or in any other suitable manner. A user is then able to sort through, view and organize those media items using the display 101 and sensing apparatus. The display and associated sensing apparatus provide a simple and intuitive user interface and in some examples comprise a multi-touch panel display. User input at the display and associated sensing apparatus (for example, two-handed user input) allows the triaging, editing and organizing of the media items, be they captured images from the image capture device or media items uploaded from the hand-held digital media storage devices. The media items may be annotated, for example, using a stylus pen input device or in any other suitable manner. Key words may be associated with the media items, either automatically using suitable image processing software or by receiving user input. Album making software may be provided at the archiving system to enable users to create albums of digital media items using the display. The media items are stored at the memory incorporated in the archiving system 100 and may be backed up automatically to another storage location via a communications network to which the archiving system is connected.
  • The display 101 and sensing apparatus may also be used to provide a user interface which enables a user to access and display items from the archive. For example, digital media items may be retrieved on the basis of stored time information associated with those items or on the basis of stored event information associated with those items. The interface may also be arranged to provide browsing of the archived digital media items. The interface may provide additional functionality such as enabling items from the archive to be emailed or transferred in any suitable manner to another location via the communications network. It may also be used to post items to a web site, to print items, to edit items and to carry out other operations on the media items.
  • Another Exemplary Apparatus
  • FIG. 2 is a schematic side view of another example archiving system. A display 201 is provided above a drawer 202 and a personal computer 203. These items need not be integrated into a table as described with reference to FIG. 1. Rather, they may be provided in any suitable form, such as a stand-alone unit, or be integrated into a work surface. An image capture device 200, such as a camera, is provided. It is positioned such that its field of view comprises at least part of the display. The archiving system is in communication with a server 204 to enable back-up of digital media from the archive. The apparatus of FIG. 2 may be used in a similar manner to that described above with reference to FIG. 1.
  • Another Exemplary Apparatus
  • It is not essential to use a drawer to hold the hand-held digital media storage devices as described above with reference to FIGS. 1 and 2. Another option is to use a bowl, tray or any other suitable receptacle which is sized and shaped to hold two or more hand-held digital media storage devices. FIG. 3 is a schematic side view of an archiving system having a bowl 303 provided alongside a display 301 and arranged to hold hand-held digital media storage devices. A personal computer 302 is provided below the display and bowl 303. Data synchronization functionality, and optionally power charging functionality, may be embedded in the bowl or other receptacle in a similar manner to the drawer 202, 102 of the earlier examples. The apparatus of FIG. 3 may be used in a similar manner to that described above with reference to FIG. 1.
  • More detail about the data synchronization functionality and optional power charging functionality is now given.
  • In some embodiments the receptacle comprises power charging apparatus, integral with, attached to, or embedded in the receptacle itself. However, it is not essential to provide power charging apparatus. The power charging apparatus is suitable for charging hand-held digital media storage devices such as mobile telephones, digital cameras, personal digital assistants and the like. For example, this may comprise an inductive charging mat or apparatus such as those currently commercially available from Splashpower Limited (trade mark). This technology enables a portable device to be recharged without an electrical contact and also to transmit, receive or synchronize data with another unit. The device to be recharged may or may not require retrofitting of a recharging unit depending on its design. This type of technology is described in detail in UK Patent Application GB 2394843, US Patent Application US 2006/0205381A1, US Patent Application US 2005/0116683 and WO 2005/024865 A2. Inductively coupled power charging technology for power and data transmission is also available from Fulton Innovations, LLC under the brand eCoupled technology (trade mark).
  • FIG. 4 is a schematic cross-section through a drawer 402 of an archiving system such as that of FIG. 1 or FIG. 2. A display 405 is shown above the drawer and a personal computer 406 is in communication with the archiving system for back-up purposes. The drawer is optionally provided with a shield against electromagnetic radiation 400, 401, such as a metal layer provided around the drawer so that when the drawer is closed it forms a Faraday cage. Electronic equipment stored inside the drawer is then protected from electrostatic discharges. Also, electromagnetic radiation from items in the drawer and from the power charging apparatus is prevented from leaking from the archiving system into the environment.
  • The base of the drawer may comprise an inductive power charging mat 407 such as that described above. Similar inductive power charging apparatus may be provided in the walls of the drawer and/or on the surface above the drawer.
  • In some embodiments antennas such as Bluetooth (trade mark) antennas 403 with reduced range are attached to the inside walls of the drawer. These provide means for data transmission between any hand-held digital media storage devices in the drawer and a personal computer 203 (FIG. 2) integrated into the archiving system. However, it is not essential to use such antennas 403 for data transmission. In cases where power charging apparatus is provided, this may itself provide data transmission functionality as described above.
  • In some embodiments the data transmission apparatus is wired such that hand-held digital media storage devices placed in the receptacle are physically connected to the data transmission apparatus. For example, cradles for the media storage devices may be provided in the receptacle for this purpose. USB connections may also be provided in the receptacle or any other suitable type of connections for data transmission.
  • In some embodiments the drawer may comprise one or more image capture devices 404 which may be cameras of any suitable type. For example, these image capture devices 404 are used to capture images of physical objects that have been placed in the drawer and are to be archived. These image capture devices may be used instead of, or in addition to, the image capture device 103, 200 of FIGS. 1 and 2.
  • In some embodiments the antennas 403 and the image capture devices 404 are used to obtain information about the position and outline of any devices in the drawer 402. This is described in more detail later with reference to FIG. 9.
  • In some embodiments the depth of the drawer 402 is sized relative to standard media storage devices such that those media storage devices are forced to lie flat in the drawer in order for the drawer to close. This promotes increased areas of physical contact between the media storage devices and the drawer base so that inductive charging and/or data transmission is enhanced.
  • More detail about the display is now given.
  • Any suitable type of display may be used as mentioned above. For example, touch panel displays may use resistive touch panels, in which touching the screen causes layers that are normally separated by a small gap to come into contact, or capacitive touch panels, in which contact with a conductive object changes the capacitance. Another type of touch screen technology uses optical sensors (e.g. an optical sensor array) to detect when a screen is touched. Any of these types of touch panel may be used in the archiving system described herein; this is intended as a non-exhaustive list of examples.
  • In some embodiments the touch panel display is a multi-touch panel display.
  • In one example a multi-touch panel display is provided as now described with reference to FIG. 5. The multi-touch panel 500 comprises a liquid crystal display (LCD) having retro-reflective opto sensors embedded behind it. This multi-touch panel 500 is placed over a drawer 501 and personal computer 502 or used in any of the arrangements described above with reference to FIGS. 1, 2, 3 and 4. In this embodiment it is not essential to use an image capture device over the touch panel display. Instead, image capture devices may be provided in the receptacle or the touch panel itself may be used to capture images of physical objects to be archived.
  • Details about the multi-touch panel 500 are now described with reference to FIG. 6, which illustrates a cross-section of an exemplary touch panel display. A touch panel display 500 comprises a touch panel 602 that has several infrared (IR) sensors 604 integrated therein. Objects above a touchable surface 606 include an object 608A that is in contact with touchable surface 606 and an object 608B that is close to but not in actual contact with (“adjacent”) touchable surface 606. Infrared sensors 604 are distributed throughout touch panel 602 parallel to touchable surface 606. One of infrared sensors 604 may detect infrared radiation reflected from objects 608A and 608B, as indicated by arrows 610. Although the term “above” is used in this description, it should be understood that the orientation of the touch panel system is irrelevant. As shown in FIG. 6, touchable surface 606 is horizontal, but in a different embodiment generated by rotating system 500 clockwise by 90 degrees, touchable surface 606 could be vertical. In that embodiment, the objects from which reflected IR radiation is detected are to the side of touchable surface 606. The term “above” is intended to be applicable to all such orientations.
  • Touch panel 602 may comprise filters 612 that absorb visible light and transmit infrared radiation. These are located between touchable surface 606 and IR sensors 604 in order to shield IR sensors 604 from visible light 614 incident on touchable surface 606, in the case where IR sensors 604 are sensitive to a broader range of wavelengths than purely infrared.
  • Touch panel 602 may comprise a display that is configured to display images that are viewable via touchable surface 606. An eye 615 indicates a possible direction from which the images are viewed. The display may be, for example, an LCD, an organic light emitting diode (OLED) display, a flexible display such as electronic paper, or any other suitable display in which an IR sensor can be integrated.
  • System 500 may comprise a backlight 616 for the display. Backlight 616 may comprise at least one IR source 618 that is configured to illuminate objects in contact with or adjacent to touchable surface 606 with infrared radiation through touchable surface 606, as indicated by arrows 620. IR sensors 604 are only sensitive to radiation incident from above, so IR radiation traveling directly from backlight 616 to IR sensors 604 is not detected.
  • The output of IR sensors 604 may be processed to identify a detected infrared image. The IR radiation reflected from the objects may be reflected from reflective ink patterns on the objects, metal designs on the objects or any other suitable reflector. For example, white paper reflects IR radiation and black ink absorbs IR radiation, so a conventional bar code on a surface of an object may be detected by an infrared-sensing device according to the described technology. Fingers are estimated to reflect about 10% of the near IR, which is sufficient to detect that a finger or hand is located at a particular location on or adjacent the touchable surface. A higher resolution of IR sensors may be used to scan objects for applications such as document scanning and fingerprint recognition. For example, fingerprint recognition generally requires a resolution of more than 200 dots per inch (dpi).
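  • By way of illustration, the following Python sketch shows one way the IR sensor output described above might be processed to locate finger-sized reflective blobs. The reflectance threshold (fingers reflect roughly 10% of near IR) and the blob-size limits are illustrative assumptions, not values from this description.

```python
# A minimal sketch of locating finger-sized reflective blobs in a frame of
# IR sensor output. Threshold and size limits are illustrative assumptions.
import numpy as np
from scipy import ndimage

def find_finger_blobs(ir_frame, threshold=0.08, min_area=20, max_area=400):
    """ir_frame: 2D array of normalized IR intensities in [0, 1]."""
    mask = ir_frame > threshold              # keep pixels bright enough to be a reflector
    labels, count = ndimage.label(mask)      # connected-component labelling
    blobs = []
    for i in range(1, count + 1):
        component = labels == i
        area = int(component.sum())
        if min_area <= area <= max_area:     # discard sensor noise and large objects
            row, col = ndimage.center_of_mass(component)
            blobs.append((col, row, area))   # (x, y, size) of a candidate finger
    return blobs
```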
  • FIG. 6 provides just one example of an exemplary touch panel system. In other examples, the backlight may not comprise any IR sources and the touch panel may include a frontlight which comprises at least one IR source. In such an example, the touchable surface of the system is a surface of the frontlight and not of the touch panel. The frontlight may comprise a light guide, so that IR radiation emitted from IR source travels through the light guide and is directed towards touchable surface and any objects in contact with or adjacent to it. In other touch panel systems, both the backlight and frontlight may comprise IR sources. In yet other touch panel systems, there is no backlight and the frontlight comprises both IR sources and visible light sources. In further examples, the system may not comprise a frontlight or a backlight, but instead the IR sources may be integrated within the touch panel. In an implementation, the touch panel may comprise an OLED display which comprises IR OLED emitters and IR-sensitive organic photosensors (which may comprise reverse-biased OLEDs).
  • For some applications, it may be desirable to detect an object only if it is in actual contact with the touchable surface of the touch panel system. The IR source of the touch panel system may be turned on only if the touchable surface is touched. Alternatively, the IR source may be turned on regardless of whether the touchable surface is touched, and detection of whether actual contact between the touchable surface and the object occurred is processed along with the output of the IR sensor. Actual contact between the touchable surface and the object may be detected by any suitable means, including, for example, by a vibration sensor or microphone coupled to the touch panel. A non-exhaustive list of examples for sensors to detect contact includes pressure-based mechanisms, micro-machined accelerometers, piezoelectric devices, capacitive sensors, resistive sensors, inductive sensors, laser vibrometers, and LED vibrometers.
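  • A minimal sketch of these two gating policies is given below; the IR source, IR sensor and contact sensor interfaces are hypothetical placeholders standing in for whatever hardware is used.

```python
# A sketch of the two contact-gating policies described above. The device
# interfaces (ir_source, ir_sensor, contact_sensor) are hypothetical.
class GatedTouchSensor:
    def __init__(self, ir_source, ir_sensor, contact_sensor, gate_ir=True):
        self.ir_source = ir_source            # IR emitter that can be switched on/off
        self.ir_sensor = ir_sensor            # returns a 2D IR frame via capture()
        self.contact_sensor = contact_sensor  # e.g. vibration sensor or microphone
        self.gate_ir = gate_ir                # True: energize IR only while touched

    def read(self):
        touched = self.contact_sensor.is_touched()
        if self.gate_ir:
            # Policy 1: turn the IR source on only if the surface is touched.
            if not touched:
                return None
            self.ir_source.on()
            frame = self.ir_sensor.capture()
            self.ir_source.off()
            return frame
        # Policy 2: IR source stays on; the contact signal is processed
        # together with the IR sensor output.
        frame = self.ir_sensor.capture()
        return frame if touched else None
```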
  • IR sensors 604 may comprise suitable infrared-sensitive semiconductor elements. A non-exhaustive list of examples of semiconductor material that is infrared-sensitive includes polycrystalline silicon, monocrystalline silicon, microcrystalline silicon, nanocrystalline silicon, plastic semiconductors and other non-silicon based semiconductors. Devices based on polycrystalline, microcrystalline, monocrystalline or nanocrystalline silicon may have better stability than amorphous silicon devices. TFTs based on polycrystalline, microcrystalline, monocrystalline or nanocrystalline silicon may have higher field mobility than amorphous silicon TFTs.
  • In another example a multi-touch panel display is provided as now described with reference to FIG. 7. A stylus enabled digital liquid crystal display 704 is provided having a resistive touch overlay 703 and a polarizing filter 702 over the resistive touch overlay 703. The display 704 is positioned over a drawer 705 and a personal computer 706 or used in any of the other arrangements described above with reference to FIGS. 1 to 4. In this embodiment an image capture device such as a video camera 700 is used having a field of view which comprises at least part of the display 704. The image capture device has a polarizing filter 701 which is crossed with respect to the polarizing filter 702 at the display.
  • Light emitted from the liquid crystal display passes through the polarizing filter 702 and is thus polarized. In contrast, ambient light in the environment is not polarized or only partially polarized. Polarized light emitted from the liquid crystal display is blocked at the camera 700 by polarizing filter 701 which is substantially crossed with respect to the polarizing filter at the liquid crystal display. This means that, in an image received at the camera 700, image regions corresponding to the liquid crystal display are dark. However, image regions corresponding to any objects between the liquid crystal display and the camera have a higher intensity. Ambient light (from light sources in the environment) which is reflected from any objects between the liquid crystal display and the camera is captured by the camera because it is not polarized. Light from the liquid crystal display which reflects or scatters from any objects between the liquid crystal display and the camera and is received by the camera may not be substantially polarized as a result of the reflection or scattering process. This light produces an image of any objects between the liquid crystal display and the camera because at least some of this unpolarized light is able to pass through the polarizing filter 701 into the camera.
  • Image segmentation has thus been achieved because those regions of the image corresponding to objects between the liquid crystal display and the camera have a much higher intensity than those regions of the image corresponding to the display itself. A thresholding operation may optionally be carried out on the image to discard image elements with an intensity lower than a specified threshold. A feathered mask may then be applied to smooth the edges of the remaining segmented image regions. For example, the thresholding operation and the masking operation may be carried out at the computer 706 in the archiving system.
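  • The following Python sketch illustrates the thresholding and feathered-mask operations just described, assuming the camera frame is a grayscale array in which display regions are dark (their polarized light having been blocked) and object regions are brighter; the threshold value and blur radius are illustrative assumptions.

```python
# A minimal sketch of threshold-then-feather segmentation for frames from
# the over-display camera. Threshold and feather radius are assumptions.
import numpy as np
from scipy import ndimage

def segment_objects(frame, threshold=0.2, feather_sigma=3.0):
    """frame: 2D array of intensities in [0, 1]; display pixels appear dark."""
    mask = (frame > threshold).astype(float)                  # discard dim display pixels
    feathered = ndimage.gaussian_filter(mask, feather_sigma)  # smooth the mask edges
    return frame * feathered                                  # segmented image
```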
  • This image segmentation process, using the polarizing filters, may be used as part of processes enabling functionality of the touch panel display. It may also be used as part of processes to capture and archive images of physical objects.
  • In order to discriminate between situations in which an object, such as a user's hand, is touching the touch panel display as opposed to hovering just above it, the resistive touch overlay 703 is used. Resistive touch overlays are widely known and are typically composed of layers of material which, when touched, cause a change in electrical current that is registered as a touch event and sent to a controller for processing.
  • A user is able to operate the multi-touch panel display provided using the liquid crystal display, resistive touch overlay 703 and polarizing filter 702, of FIG. 7 by placing his or her hands or digits on or just above the display and making hand gestures and movements which may be bi-manual. The multi-touch panel display provides a user interface whereby such gestures and movements are used to control software applications provided on a computer 706 at the archiving system. For example, the gestures and movements may be used to determine any one or more of translation, rotation and zooming of a digital object. In addition, the user may make inputs using a stylus on the liquid crystal display.
  • Referring to FIG. 8, one or more users make input which is sensed by the sensing apparatus. In this example of a multi-touch panel display, user input may comprise hand gestures made on or above the touch panel (block 800). Using image segmentation as described above, an image of a user's hand or hands is obtained together with images of any other physical objects such as pens, pieces of paper etc. that are on or above the multi-touch panel display (blocks 801 and 802). The segmented image is processed at the computer 706 to detect which regions in the image correspond to the user's hand or hands (block 803). This is achieved in any suitable manner, for example, using stored templates of hand images and comparing the segmented image regions with those stored templates.
  • Once a hand image region has been detected, parameters describing this region are computed (block 804). Any suitable parameters may be used such as centre of mass, principal axis and bounding area. For example, centre of mass, bounding box and principal axis of an individual connected component can be used to respectively translate, scale and rotate a virtual object.
  • This process is then repeated for images in a sequence (block 805) captured by the camera, and differences are computed for corresponding parameters between images in the sequence (block 806). These difference values are then used, together with information from the resistive touch overlay, to control the display of information on the touch panel display.
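  • A minimal sketch of this parameter computation and frame differencing is given below, assuming each hand region arrives as a boolean mask for one connected component; mapping pixel count to scale is an illustrative simplification.

```python
# A sketch of computing per-region parameters (centre of mass, size and
# principal axis) and differencing them between frames to derive a
# translate / scale / rotate update. The region is a boolean pixel mask.
import numpy as np

def region_parameters(mask):
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()                   # centre of mass
    size = float(len(xs))                           # pixel count as a size proxy
    cov = np.cov(np.stack([xs, ys]))                # 2x2 covariance of coordinates
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]          # principal axis direction
    angle = np.arctan2(major[1], major[0])
    return cx, cy, size, angle

def frame_delta(prev, curr):
    """Map parameter differences between two frames to a transform update."""
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]   # translation
    scale = np.sqrt(curr[2] / prev[2])              # uniform scale from size ratio
    dtheta = curr[3] - prev[3]                      # rotation
    return dx, dy, scale, dtheta
```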
  • In other embodiments optical flow techniques are used to enable translation, scaling and rotation of items presented on the display via hand gestures and movements. These optical flow techniques are described in “PlayAnywhere: A Compact Interactive Tabletop Projection-Vision System”, Andrew D. Wilson, ACM UIST 2005.
  • The process of capturing images of physical objects in order to make a record of such physical objects for archiving is now described in more detail with reference to FIG. 9. For example, the physical objects may be a child's first pair of shoes, or a printed photograph.
  • A user places the physical object to be recorded on the display. One or more images of that object are then captured against a background (block 900). For example, the display may be arranged to present a uniform background color during this process, or any other suitable background. Alternatively, the display may have a polarizing filter which is crossed with respect to a polarizing filter at a camera as described above with reference to FIG. 7. The captured images of the object may include the user's hand, for example, if the user is holding the object in position. The images are captured using any suitable image capture device (or combination of such devices) provided in the archiving system. For example, this may be an over-table camera such as camera 103 in FIG. 1. Alternatively, the display may itself be capable of capturing one or more images of the object. For example, a touch panel display may have scanning functionality.
  • The captured images are sent from the camera to a processor and associated memory provided in the archiving system. An optional image segmentation process (block 901) is then carried out to segment the background from the image of the object. Any suitable image segmentation process may be used. For example, if a known background was presented on the display, information about this known background may be used to carry out image segmentation. Alternatively, polarization information may be used as described above with reference to FIG. 7.
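  • As one illustration of segmentation against a known background, the sketch below differences the captured frame with the background image that was presented on the display; the difference threshold is an illustrative assumption.

```python
# A minimal sketch of known-background segmentation: pixels that deviate
# from the expected background are kept as object pixels.
import numpy as np

def segment_against_background(frame, background, threshold=0.1):
    """frame, background: equally sized 2D grayscale arrays in [0, 1]."""
    diff = np.abs(frame - background)       # deviation from the known background
    mask = diff > threshold
    segmented = np.where(mask, frame, 0.0)  # zero out background pixels
    return segmented, mask
```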
  • The captured image of the object, which has optionally been segmented, may then be presented on the display as a digital imprint of the object itself. That is, when a user removes the object from the display, the captured image of that object is presented in the place where the object had been (block 902). This provides an intuitive way for a user to view the results of the image capture process.
  • In some embodiments, the image capture device comprises a camera having a range sensor which enables a 3D map of the surface of an object to be detected. Any suitable such camera may be used; a non-exhaustive list of examples includes those currently commercially available and those described in the following publications: "A CMOS 3D camera with millimetric depth resolution", Niclass et al., IEEE Custom Integrated Circuits Conference, pp. 705-708, October 2004; "A time-of-flight depth sensor—system description, issues and solutions", Gokturk et al., Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop, p. 35, 2004. In other embodiments the image capture device comprises a 3D laser or infra-red scanner which enables a 3D map of the surface of an object to be detected.
  • The user is then able to make input which is sensed by the sensing apparatus (block 903) in order to organize, annotate and store the captured image as required (block 904). A back-up process may be carried out to back up the captured image to a location remote from the archiving system (block 905).
  • As mentioned above with reference to FIG. 4, the receptacle 402 may comprise antennas 403 and cameras 404. In some embodiments the antennas 403 and the image capture devices 404 are used to obtain information about the position and/or outline of any devices in the drawer 402. This is now described in more detail with reference to FIG. 10. A position of a media storage device in the receptacle of the archiving system is detected (block 1000). For example, a triangulation process is carried out using the antenna input signals in order to detect the location of a media storage device in the receptacle. Information about the characteristics of the media storage device, such as its identity, power charging status, data synchronization status and so on, may also be provided to the processor of the archiving system. For example, this information is transferred using the antennas 403 or via the power charging mechanism.
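  • A minimal sketch of such a triangulation step is given below, assuming range estimates to a device are available at three antennas (for example, derived from received signal strength); the antenna layout and the strength-to-range model are illustrative assumptions.

```python
# A sketch of locating a device in the receptacle by trilateration from
# three range estimates. Solves the standard linearized circle equations.
import numpy as np

def trilaterate(anchors, ranges):
    """anchors: three (x, y) antenna positions; ranges: three distances."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = ranges
    A = np.array([[2 * (x2 - x1), 2 * (y2 - y1)],
                  [2 * (x3 - x1), 2 * (y3 - y1)]])
    b = np.array([r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2,
                  r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2])
    return np.linalg.solve(A, b)  # estimated (x, y) of the device
```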
  • An image of the media storage device is then displayed (block 1001) at the display. The image may be an icon or other image representing the particular media storage device. For example, it may be a pre-configured image of a particular media storage device. Alternatively, the image of the media storage device may comprise an outline, silhouette or other image of the actual media storage device as obtained from cameras 404 in the receptacle. The position information may be used to influence where the image of the media storage device is presented on the display. For example, in the case that the receptacle is a drawer under the display, the image or outline may be presented immediately above the media storage device in the drawer. In the case that the receptacle is a tray beside the touch panel display, the image may be presented in a representation of the tray on the display, using the position information.
  • It is also possible for the image of the media storage device to incorporate information about the power charging status and/or data synchronization status of that device (block 1002). For example, this information may be represented using colors or any other suitable markers.
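  • The sketch below illustrates one way the sensed drawer position and device status described above might drive the on-screen representation; the proportional coordinate mapping and the status colours are illustrative assumptions.

```python
# A sketch of placing a device's on-screen image from its drawer position
# (block 1001) and marking charge / sync status with a colour (block 1002).
def device_display_position(device_xy, drawer_size, display_size):
    """Map drawer coordinates to display coordinates, assuming the drawer
    lies directly beneath the display."""
    dx, dy = device_xy
    dw, dh = drawer_size
    sw, sh = display_size
    return dx / dw * sw, dy / dh * sh  # simple proportional mapping

# Hypothetical colour markers for power charging and data synchronization status.
STATUS_COLOURS = {"charging": "amber", "charged": "green",
                  "syncing": "blue", "synchronized": "green", "error": "red"}
```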
  • In some embodiments physical objects are used which have bar codes displayed on them. These bar codes may be of any suitable type able to store information about the related physical object. For example, the bar codes may be simple one-dimensional bar codes that are visible to the human eye. They may also be bar codes presented in a manner invisible to the human eye but detectable using infra-red light sources and detectors.
  • With reference to FIG. 11, an image of a physical object is captured as described above (block 1100). A bar code in the image of the object is detected (block 1101) using pattern recognition or other image processing techniques. Information from the bar code is extracted (block 1102). For example, this information may indicate the type or function of the object, such as that it is a stapler. Using this extracted information, the operation of the graphical user interface at the display is then influenced (block 1103). For example, if a pile of images is presented on the display and the user places a stapler on the display (that stapler being sensed by the sensing apparatus), this may initiate an action to group or attach the images in the pile together.
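  • The following sketch shows one way extracted bar code information might be dispatched to influence the user interface; the payload format, the action table and the user-interface methods are all hypothetical.

```python
# A sketch of blocks 1102-1103: decode an object type from a bar code
# payload and dispatch a matching user-interface action. All names here
# (payload format, ACTIONS table, ui methods) are hypothetical.
def extract_object_type(payload):
    """payload: decoded bar code string, e.g. 'type=stapler;id=42'."""
    fields = dict(item.split("=", 1) for item in payload.split(";"))
    return fields.get("type")

ACTIONS = {
    "stapler": lambda ui: ui.group_selected_images(),  # attach a pile of images
    "pen":     lambda ui: ui.enter_annotation_mode(),
}

def handle_detected_object(payload, ui):
    action = ACTIONS.get(extract_object_type(payload))
    if action is not None:
        action(ui)  # influence the graphical user interface (block 1103)
```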
  • Exemplary Computing-Based Device
  • FIG. 12 illustrates various components of an exemplary computing-based device 1200 which may be implemented as any form of a computing and/or electronic device, and in which embodiments of an archiving system may be implemented.
  • The computing-based device 1200 comprises one or more inputs 1201 which are of any suitable type for receiving images captured by an image capture device such as a camera. The device also comprises communication interface 1202 which is arranged to transmit data during synchronization of the computing-based device 1200 and one or more media storage devices such as mobile telephones, digital cameras, personal digital assistants and the like.
  • Computing-based device 1200 also comprises one or more processors 1203 which may be microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control the operation of the device in order to provide an archiving system. Platform software comprising an operating system 1204 or any other suitable platform software may be provided at the computing-based device to enable application software 1205 to be executed on the device.
  • The computer executable instructions may be provided using any computer-readable media, such as memory 1207. The memory is of any suitable type such as random access memory (RAM), a disk storage device of any type such as a magnetic or optical storage device, a hard disk drive, or a CD, DVD or other disc drive. Flash memory, EPROM or EEPROM may also be used.
  • An interface 1206 is provided to a touch panel display which may be a multi-touch panel display.
  • An interface 1208 to a microphone and loudspeaker may optionally be provided.
  • The term ‘computer’ is used herein to refer to any device with processing capability such that it can execute instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the term ‘computer’ includes PCs, servers, mobile telephones, personal digital assistants and many other devices.
  • The methods described herein may be performed by software in machine readable form on a storage medium. The software can be suitable for execution on a parallel processor or a serial processor such that the method steps may be carried out in any suitable order, or simultaneously.
  • This acknowledges that software can be a valuable, separately tradable commodity. It is intended to encompass software, which runs on or controls “dumb” or standard hardware, to carry out the desired functions. It is also intended to encompass software which “describes” or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
  • Those skilled in the art will realize that storage devices utilized to store program instructions can be distributed across a network. For example, a remote computer may store an example of the process described as software. A local or terminal computer may access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that, by utilizing conventional techniques, all or a portion of the software instructions may be carried out by a dedicated circuit, such as a DSP, programmable logic array, or the like.
  • Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person.
  • It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. It will further be understood that reference to ‘an’ item refers to one or more of those items.
  • The steps of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the spirit and scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
  • It will be understood that the above description of a preferred embodiment is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments of the invention. Although various embodiments of the invention have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the spirit or scope of this invention.

Claims (20)

1. A method comprising:
detecting a physical object in relation to a component of a multi-touch device;
receiving input through interaction of the physical object with the component of the multi-touch device; and
responsive to the interaction, downloading data by another component of the multi-touch device.
2. A method as recited in claim 1, the physical object including one of a plurality of physical objects.
3. A method as recited in claim 2, the plurality of physical objects comprising at least two diverse objects.
4. A method as recited in claim 1, further comprising synchronizing the data between the physical object and a memory component associated with the multi-touch device.
5. A method as recited in claim 1, further comprising creating an image to represent the physical object in a user interface.
6. A computer-readable storage medium having computer executable instructions recorded thereon that, upon execution, configure a device to perform a method as recited in claim 1.
7. A method comprising:
detecting a physical object in relation to a component of a multi-touch device;
receiving input through interaction of the physical object with the component of the multi-touch device;
uploading data associated with the physical object by the component of the multi-touch device; and
storing the data being uploaded.
8. A method as recited in claim 7, the data associated with the physical object comprising an audio file.
9. A method as recited in claim 7, the uploading data associated with the physical object comprising uploading data from the physical object.
10. A method as recited in claim 7, further comprising relating the data associated with the physical object with other digital content.
11. A method as recited in claim 7, further comprising, via a user interface, representing the data associated with the physical object as being stored in a virtual box.
12. A method as recited in claim 11, the user interface being associated with the multi-touch device.
13. A method as recited in claim 11, further comprising accepting input via the user interface, the input comprising at least one of:
a pinch gesture to close the virtual box;
an action to label the virtual box;
an open the virtual box action;
a spill the virtual box action; and
a break the virtual box action.
14. A method as recited in claim 7, further comprising representing the data associated with the physical object on a timeline.
15. A method as recited in claim 7, further comprising creating a digital scrapbook of the data associated with the physical object.
16. A method as recited in claim 7, further comprising capturing an image of the physical object, the physical object being on or near the component of the multi-touch device.
17. A computer-readable storage medium having computer executable instructions recorded thereon that, upon execution, configure a device to perform a method as recited in claim 7.
18. A system comprising:
a computer-readable storage medium as recited in claim 17; and
a processor configured to execute the computer executable instructions.
19. A user interface comprising:
a multi-touch surface;
a component to detect a physical object in proximity to the multi-touch surface; and
a representation of the physical object.
20. A user interface as recited in claim 19 configured to accept input comprising at least one of:
a gesture to place the representation of the physical object in a virtual box;
a pinch gesture to close the virtual box;
an action to label the virtual box;
an open the virtual box action; or
a spill the virtual box action.
US13/493,497 2007-05-09 2012-06-11 Interacting With Physical and Digital Objects Via a Multi-Touch Device Abandoned US20120242609A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/493,497 US20120242609A1 (en) 2007-05-09 2012-06-11 Interacting With Physical and Digital Objects Via a Multi-Touch Device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/746,397 US8199117B2 (en) 2007-05-09 2007-05-09 Archive for physical and digital objects
US13/493,497 US20120242609A1 (en) 2007-05-09 2012-06-11 Interacting With Physical and Digital Objects Via a Multi-Touch Device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/746,397 Continuation US8199117B2 (en) 2007-05-09 2007-05-09 Archive for physical and digital objects

Publications (1)

Publication Number Publication Date
US20120242609A1 true US20120242609A1 (en) 2012-09-27

Family

ID=39970477

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/746,397 Expired - Fee Related US8199117B2 (en) 2007-05-09 2007-05-09 Archive for physical and digital objects
US13/493,497 Abandoned US20120242609A1 (en) 2007-05-09 2012-06-11 Interacting With Physical and Digital Objects Via a Multi-Touch Device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/746,397 Expired - Fee Related US8199117B2 (en) 2007-05-09 2007-05-09 Archive for physical and digital objects

Country Status (1)

Country Link
US (2) US8199117B2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120026414A1 (en) * 2010-07-30 2012-02-02 Po-Wen Hsiao Projection electronic book
US20120293537A1 (en) * 2010-02-26 2012-11-22 Rakuten, Inc. Data-generating device, data-generating method, data-generating program and recording medium
US20170287351A1 (en) * 1999-06-11 2017-10-05 Sydney Hyman Compositions and image making media
US20180357463A1 (en) * 2014-04-10 2018-12-13 Kuo-Ching Chiang Portable Device with Fingerprint Pattern Recognition Module
GB2591722A (en) * 2019-10-12 2021-08-11 Isaac Julia Martinez Interactive table

Families Citing this family (74)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7970870B2 (en) 2005-06-24 2011-06-28 Microsoft Corporation Extending digital artifacts through an interactive surface
JP4636064B2 (en) 2007-09-18 2011-02-23 ソニー株式会社 Image processing apparatus, image processing method, and program
US20090094561A1 (en) * 2007-10-05 2009-04-09 International Business Machines Corporation Displaying Personalized Documents To Users Of A Surface Computer
US9134904B2 (en) * 2007-10-06 2015-09-15 International Business Machines Corporation Displaying documents to a plurality of users of a surface computer
US8139036B2 (en) * 2007-10-07 2012-03-20 International Business Machines Corporation Non-intrusive capture and display of objects based on contact locality
US8024185B2 (en) * 2007-10-10 2011-09-20 International Business Machines Corporation Vocal command directives to compose dynamic display text
KR20100035043A (en) * 2008-09-25 2010-04-02 삼성전자주식회사 Method and apparatus for contents management
US8704822B2 (en) 2008-12-17 2014-04-22 Microsoft Corporation Volumetric display system enabling user interaction
US8650634B2 (en) 2009-01-14 2014-02-11 International Business Machines Corporation Enabling access to a subset of data
KR101617645B1 (en) * 2009-02-24 2016-05-04 삼성전자주식회사 Method for controlling display and apparatus using the same
US20110057891A1 (en) * 2009-09-10 2011-03-10 Qualcomm Incorporated Wireless power display device
US8610924B2 (en) 2009-11-24 2013-12-17 International Business Machines Corporation Scanning and capturing digital images using layer detection
US8441702B2 (en) * 2009-11-24 2013-05-14 International Business Machines Corporation Scanning and capturing digital images using residue detection
CN101776836B (en) * 2009-12-28 2013-08-07 武汉全真光电科技有限公司 Projection display system and desktop computer
CN102822770B (en) * 2010-03-26 2016-08-17 惠普发展公司,有限责任合伙企业 Associated with
US9262015B2 (en) * 2010-06-28 2016-02-16 Intel Corporation System for portable tangible interaction
US9152277B1 (en) * 2010-06-30 2015-10-06 Amazon Technologies, Inc. Touchable projection surface system
US9250745B2 (en) 2011-01-18 2016-02-02 Hewlett-Packard Development Company, L.P. Determine the characteristics of an input relative to a projected image
US9858552B2 (en) * 2011-06-15 2018-01-02 Sap Ag Systems and methods for augmenting physical media from multiple locations
US9161026B2 (en) 2011-06-23 2015-10-13 Hewlett-Packard Development Company, L.P. Systems and methods for calibrating an imager
BR112014002186B1 (en) 2011-07-29 2020-12-29 Hewlett-Packard Development Company, L.P capture projection system, executable means of processing and method of collaboration in the workspace
US20130176263A1 (en) * 2012-01-09 2013-07-11 Harris Corporation Display system for tactical environment
US9297942B2 (en) 2012-10-13 2016-03-29 Hewlett-Packard Development Company, L.P. Imaging with polarization removal
US9143696B2 (en) 2012-10-13 2015-09-22 Hewlett-Packard Development Company, L.P. Imaging using offsetting accumulations
GB2509517B (en) * 2013-01-04 2021-03-10 Vertegaal Roel Computing apparatus
US20140245181A1 (en) * 2013-02-25 2014-08-28 Sharp Laboratories Of America, Inc. Methods and systems for interacting with an information display panel
JP2016528647A (en) 2013-08-22 2016-09-15 ヒューレット−パッカード デベロップメント カンパニー エル.ピー.Hewlett‐Packard Development Company, L.P. Projective computing system
WO2015030795A1 (en) 2013-08-30 2015-03-05 Hewlett Packard Development Company, L.P. Touch input association
EP3049899A4 (en) 2013-09-24 2017-07-05 Hewlett-Packard Development Company, L.P. Identifying a target touch region of a touch-sensitive surface based on an image
WO2015047225A1 (en) 2013-09-24 2015-04-02 Hewlett-Packard Development Company, L.P. Determining a segmentation boundary based on images representing an object
US10114512B2 (en) 2013-09-30 2018-10-30 Hewlett-Packard Development Company, L.P. Projection system manager
US10003777B2 (en) 2013-11-21 2018-06-19 Hewlett-Packard Development Company, L.P. Projection screen for specularly reflecting light
US9372543B2 (en) * 2013-12-16 2016-06-21 Dell Products, L.P. Presentation interface in a virtual collaboration session
EP3111299A4 (en) 2014-02-28 2017-11-22 Hewlett-Packard Development Company, L.P. Calibration of sensors and projector
US10607502B2 (en) 2014-06-04 2020-03-31 Square Panda Inc. Phonics exploration toy
US10825352B2 (en) * 2014-06-04 2020-11-03 Square Panda Inc. Letter manipulative identification board
WO2016007167A1 (en) 2014-07-11 2016-01-14 Hewlett-Packard Development Company, L.P. Corner generation in a projector display area
WO2016018232A1 (en) 2014-07-28 2016-02-04 Hewlett-Packard Development Company, L.P. Image background removal using multi-touch surface input
EP3175368A4 (en) 2014-07-29 2018-03-14 Hewlett-Packard Development Company, L.P. Default calibrated sensor module settings
CN106797420B (en) 2014-07-31 2020-06-12 惠普发展公司,有限责任合伙企业 Processing data representing an image
WO2016018395A1 (en) 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Document region detection
US10623649B2 (en) 2014-07-31 2020-04-14 Hewlett-Packard Development Company, L.P. Camera alignment based on an image captured by the camera that contains a reference marker
WO2016018409A1 (en) 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Process image according to mat characteristic
EP3175292B1 (en) 2014-07-31 2019-12-11 Hewlett-Packard Development Company, L.P. Projector as light source for an image capturing device
WO2016018413A1 (en) 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Object capture and illumination
EP3175328B1 (en) 2014-07-31 2021-01-27 Hewlett-Packard Development Company, L.P. Stylus
WO2016018411A1 (en) 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Measuring and correcting optical misalignment
EP3175200A4 (en) 2014-07-31 2018-04-04 Hewlett-Packard Development Company, L.P. Three dimensional scanning system and framework
EP3175514B1 (en) 2014-07-31 2022-08-31 Hewlett-Packard Development Company, L.P. Dock connector
WO2016018378A1 (en) 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Data storage
US10664090B2 (en) 2014-07-31 2020-05-26 Hewlett-Packard Development Company, L.P. Touch region projection onto touch-sensitive surface
CN106797458B (en) 2014-07-31 2019-03-08 惠普发展公司,有限责任合伙企业 The virtual change of real object
US10257424B2 (en) 2014-07-31 2019-04-09 Hewlett-Packard Development Company, L.P. Augmenting functionality of a computing device
US10664100B2 (en) 2014-07-31 2020-05-26 Hewlett-Packard Development Company, L.P. Misalignment detection
WO2016018416A1 (en) 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Determining the location of a user input device
WO2016022097A1 (en) 2014-08-05 2016-02-11 Hewlett-Packard Development Company, L.P. Determining a position of an input object
WO2016032501A1 (en) 2014-08-29 2016-03-03 Hewlett-Packard Development Company, L.P. Multi-device collaboration
WO2016036352A1 (en) 2014-09-03 2016-03-10 Hewlett-Packard Development Company, L.P. Presentation of a digital image of an object
WO2016036370A1 (en) 2014-09-04 2016-03-10 Hewlett-Packard Development Company, L.P. Projection alignment
US10318077B2 (en) 2014-09-05 2019-06-11 Hewlett-Packard Development Company, L.P. Coherent illumination for touch point identification
CN107431792B (en) 2014-09-09 2019-06-28 惠普发展公司,有限责任合伙企业 Color calibration
EP3191918B1 (en) 2014-09-12 2020-03-18 Hewlett-Packard Development Company, L.P. Developing contextual information from an image
US10216075B2 (en) 2014-09-15 2019-02-26 Hewlett-Packard Development Company, L.P. Digital light projector having invisible light channel
CN107003717B (en) 2014-09-24 2020-04-10 惠普发展公司,有限责任合伙企业 Transforming received touch input
WO2016053269A1 (en) 2014-09-30 2016-04-07 Hewlett-Packard Development Company, L. P. Displaying an object indicator
WO2016053271A1 (en) 2014-09-30 2016-04-07 Hewlett-Packard Development Company L. P. Identification of an object on a touch-sensitive surface
CN107077235B (en) 2014-09-30 2021-01-12 惠普发展公司,有限责任合伙企业 Determining unintended touch rejection
EP3201724A4 (en) 2014-09-30 2018-05-16 Hewlett-Packard Development Company, L.P. Gesture based manipulation of three-dimensional images
WO2016053281A1 (en) 2014-09-30 2016-04-07 Hewlett-Packard Development Company, L.P. Unintended touch rejection
EP3213504B1 (en) 2014-10-28 2022-04-13 Hewlett-Packard Development Company, L.P. Image data segmentation
CN107079126A (en) * 2014-11-13 2017-08-18 惠普发展公司,有限责任合伙企业 Image projection
US20170123622A1 (en) * 2015-10-28 2017-05-04 Microsoft Technology Licensing, Llc Computing device having user-input accessory
US10142436B2 (en) 2015-11-19 2018-11-27 Microsoft Technology Licensing, Llc Enhanced mode control of cached data
FR3061596B1 (en) * 2016-12-30 2020-03-06 Societe Anonyme Des Eaux Minerales D'evian Et En Abrege "S.A.E.M.E" PROCESS FOR PRESENTING A BOTTLE

Family Cites Families (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5517434A (en) * 1989-01-31 1996-05-14 Norand Corporation Data capture system with communicating and recharging docking apparatus and hand-held data terminal means cooperable therewith
GB9521072D0 (en) 1995-10-14 1995-12-20 Rank Xerox Ltd Calibration of an interactive desktop system
US6340978B1 (en) 1997-01-31 2002-01-22 Making Everlasting Memories, Ltd. Method and apparatus for recording and presenting life stories
US6886047B2 (en) 1998-11-13 2005-04-26 Jp Morgan Chase Bank System and method for managing information retrievals for integrated digital and analog archives on a global basis
US6760884B1 (en) 1999-08-09 2004-07-06 Internal Research Corporation Interactive memory archive
SE515495C2 (en) * 1999-12-30 2001-08-13 Ericsson Telefon Ab L M ESD protection device for a memory card holder
JP2001209480A (en) * 2000-01-28 2001-08-03 Alps Electric Co Ltd Transmitter-receiver
US7327376B2 (en) * 2000-08-29 2008-02-05 Mitsubishi Electric Research Laboratories, Inc. Multi-user collaborative graphical user interfaces
US6894703B2 (en) * 2000-08-29 2005-05-17 Mitsubishi Electric Research Laboratories, Inc. Multi-user collaborative circular graphical user interfaces
US6791530B2 (en) * 2000-08-29 2004-09-14 Mitsubishi Electric Research Laboratories, Inc. Circular graphical user interfaces
US6545660B1 (en) * 2000-08-29 2003-04-08 Mitsubishi Electric Research Laboratory, Inc. Multi-user interactive picture presentation system and method
US6597255B1 (en) * 2001-05-30 2003-07-22 Nortel Networks Limited Power transmission system for a Faraday cage power source
US6710754B2 (en) * 2001-11-29 2004-03-23 Palm, Inc. Moveable output device
US7194002B2 (en) * 2002-02-01 2007-03-20 Microsoft Corporation Peer-to-peer based network performance measurement and analysis system and method for large scale networks
GB2388716B (en) 2002-05-13 2004-10-20 Splashpower Ltd Improvements relating to contact-less power transfer
US20050122308A1 (en) * 2002-05-28 2005-06-09 Matthew Bell Self-contained interactive video display system
GB2394843A (en) 2002-10-28 2004-05-05 Zap Wireless Technologies Ltd Charge and data transfer by the same means
GB0229141D0 (en) 2002-12-16 2003-01-15 Splashpower Ltd Improvements relating to contact-less power transfer
US20050052156A1 (en) * 2003-09-04 2005-03-10 Frank Liebenow Wireless charging mat with integrated interface connection
GB0320960D0 (en) 2003-09-08 2003-10-08 Splashpower Ltd Improvements relating to improving flux patterns of inductive charging pads
US20050183035A1 (en) * 2003-11-20 2005-08-18 Ringel Meredith J. Conflict resolution for graphic multi-user interface
US7356475B2 (en) * 2004-01-05 2008-04-08 Sbc Knowledge Ventures, L.P. System and method for providing access to an interactive service offering
US20050181839A1 (en) * 2004-02-17 2005-08-18 Nokia Corporation Devices and methods for simultaneous battery charging and data transmission in a mobile terminal
US20050219204A1 (en) * 2004-04-05 2005-10-06 Wyatt Huddleston Interactive display system
US7394459B2 (en) * 2004-04-29 2008-07-01 Microsoft Corporation Interaction between objects and a virtual environment display
US7397464B1 (en) * 2004-04-30 2008-07-08 Microsoft Corporation Associating application states with a physical object
US7134756B2 (en) * 2004-05-04 2006-11-14 Microsoft Corporation Selectable projector and imaging modes of display table
US7467380B2 (en) * 2004-05-05 2008-12-16 Microsoft Corporation Invoking applications with virtual objects on an interactive display
US7358962B2 (en) * 2004-06-15 2008-04-15 Microsoft Corporation Manipulating association of data with a physical object
US7254665B2 (en) * 2004-06-16 2007-08-07 Microsoft Corporation Method and system for reducing latency in transferring captured image data by utilizing burst transfer after threshold is reached
US7466308B2 (en) * 2004-06-28 2008-12-16 Microsoft Corporation Disposing identifying codes on a user's hand to provide input to an interactive display application
US7511703B2 (en) * 2004-06-28 2009-03-31 Microsoft Corporation Using size and shape of a physical object to manipulate output in an interactive display application
US7379047B2 (en) * 2004-06-30 2008-05-27 Microsoft Corporation Using a physical object to control an attribute of an interactive display application
US7743348B2 (en) * 2004-06-30 2010-06-22 Microsoft Corporation Using physical objects to adjust attributes of an interactive display application
US7463792B2 (en) 2004-08-17 2008-12-09 Peterschmidt Eric T System and method of archiving family history
US7639231B2 (en) * 2005-03-29 2009-12-29 Hewlett-Packard Development Company, L.P. Display of a user interface
US20060267952A1 (en) 2005-05-26 2006-11-30 Steve Alcorn Interactive display table top
US7525538B2 (en) * 2005-06-28 2009-04-28 Microsoft Corporation Using same optics to image, illuminate, and project
US20070030254A1 (en) 2005-07-21 2007-02-08 Robrecht Michael J Integration of touch sensors with directly mounted electronic components
US7627831B2 (en) * 2006-05-19 2009-12-01 Fuji Xerox Co., Ltd. Interactive techniques for organizing and retrieving thumbnails and notes on large displays
US7686682B2 (en) * 2007-01-10 2010-03-30 Fuji Xerox Co., Ltd. Video game for tagging photos

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5060135A (en) * 1988-09-16 1991-10-22 Wang Laboratories, Inc. Apparatus for manipulating documents in a data processing system utilizing reduced images of sheets of information which are movable
US6243724B1 (en) * 1992-04-30 2001-06-05 Apple Computer, Inc. Method and apparatus for organizing information in a computer system
US5732227A (en) * 1994-07-05 1998-03-24 Hitachi, Ltd. Interactive information processing system responsive to user manipulation of physical objects and displayed images
US6725427B2 (en) * 1996-06-28 2004-04-20 Mirror Worlds Technologies, Inc. Document stream operating system with document organizing and display facilities
US5969720A (en) * 1997-03-07 1999-10-19 International Business Machines Corporation Data processing system and method for implementing an informative container for a file system
US20040098721A1 (en) * 1998-11-13 2004-05-20 Alverson Gail A. Restricting access to memory in a multithreaded environment
US20020087588A1 (en) * 1999-04-14 2002-07-04 Mcbride Stephen Larry Method and apparatus for automatically synchronizing data from a host computer to two or more backup data storage locations
US20030069874A1 (en) * 1999-05-05 2003-04-10 Eyal Hertzog Method and system to automate the updating of personal information within a personal information management application and to synchronize such updated personal information management applications
US6874037B1 (en) * 2000-06-19 2005-03-29 Sony Corporation Method and apparatus for synchronizing device information
US20050108485A1 (en) * 2003-11-18 2005-05-19 Perego Robert M. Data set level mirroring to accomplish a volume merge/migrate in a digital data storage system
US7657846B2 (en) * 2004-04-23 2010-02-02 Microsoft Corporation System and method for displaying stack icons
US20060119541A1 (en) * 2004-12-02 2006-06-08 Blythe Michael M Display system
US20060294247A1 (en) * 2005-06-24 2006-12-28 Microsoft Corporation Extending digital artifacts through an interactive surface
US20070185938A1 (en) * 2005-12-19 2007-08-09 Anand Prahlad Systems and methods for performing data replication

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Definition of synchronization, downloaded from http://www.thefreedictionary.com/synchronize on September 30, 2017, 3 pages *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170287351A1 (en) * 1999-06-11 2017-10-05 Sydney Hyman Compositions and image making media
US11341863B2 (en) * 1999-06-11 2022-05-24 Sydney Hyman Compositions and image making media
US20120293537A1 (en) * 2010-02-26 2012-11-22 Rakuten, Inc. Data-generating device, data-generating method, data-generating program and recording medium
US20120026414A1 (en) * 2010-07-30 2012-02-02 Po-Wen Hsiao Projection electronic book
US8786787B2 (en) * 2010-07-30 2014-07-22 E Ink Holdings Inc. Projection electronic book
US20180357463A1 (en) * 2014-04-10 2018-12-13 Kuo-Ching Chiang Portable Device with Fingerprint Pattern Recognition Module
GB2591722A (en) * 2019-10-12 2021-08-11 Isaac Julia Martinez Interactive table

Also Published As

Publication number Publication date
US20080281851A1 (en) 2008-11-13
US8199117B2 (en) 2012-06-12

Similar Documents

Publication Publication Date Title
US8199117B2 (en) Archive for physical and digital objects
US7358962B2 (en) Manipulating association of data with a physical object
US10599923B2 (en) Mobile device utilizing multiple cameras
JP6428641B2 (en) Display control apparatus, display control method, and program
CN105144037B (en) Device, method, and graphical user interface for entering characters
US20170269793A1 (en) User Interface For Collaborative Efforts
CN104756065B (en) Information processing apparatus, information processing method, and computer-readable recording medium storing a program
JP5470051B2 (en) Note capture device
US20100149096A1 (en) Network management using interaction with display surface
US20100103136A1 (en) Image display device, image display method, and program product
US11934640B2 (en) User interfaces for record labels
CN102754071A (en) Apparatus and method having multiple application display modes including mode with display resolution of another apparatus
CN102754061A (en) Device, Method, And Graphical User Interface For Changing Pages In An Electronic Document
CN105144069A (en) Semantic zoom-based navigation of displayed content
CN102763128A (en) Device, method, and graphical user interface for attachment viewing and editing
JP2010250464A (en) Information processing apparatus, information processing method, and program
CN103069378A (en) Device, method, and graphical user interface for user interface screen navigation
CN103052935A (en) Device, method, and graphical user interface for reordering the front-to-back positions of objects
US11671696B2 (en) User interfaces for managing visual content in media
JP2012195018A (en) Method for determining authenticity of object, method for acquiring authenticity determination image data, object authenticity determination system, authenticity determination image data acquiring system, and authenticity determining program
KR20150019370A (en) Method for navigating pages in a three-dimensional manner in a mobile device and the mobile device therefor
US20050275635A1 (en) Manipulating association of data with a physical object
Dippon et al. Seamless integration of mobile devices into interactive surface environments
US20180125605A1 (en) Method and system for correlating anatomy using an electronic mobile device transparent display screen
KR102031283B1 (en) Method for managing images and an electronic device thereof

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034544/0541

Effective date: 20141014

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION