US20100194781A1 - System and method for cropping and annotating images on a touch sensitive display device - Google Patents


Info

Publication number
US20100194781A1
Authority
US
United States
Prior art keywords
image
point
input
rectangle
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/767,077
Inventor
Christopher Tossing
Marc Siegel
Albert Ho
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SECURENET SOLUTIONS GROUP LLC
Original Assignee
Christopher Tossing
Marc Siegel
Albert Ho
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Christopher Tossing, Marc Siegel, and Albert Ho
Priority to US12/767,077
Publication of US20100194781A1
Assigned to KD SECURE LLC (Assignors: HO, ALBERT; TOSSING, CHRISTOPHER; SIEGEL, MARC)
Assigned to SECURENET SOLUTIONS GROUP, LLC (Assignor: KD SECURE, LLC)
Current legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/06: Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 10/063: Operations research, analysis or management
    • G06Q 10/0637: Strategic management or analysis, e.g. setting a goal or target of an organisation; Planning actions based on goals; Analysis or evaluation of effectiveness of goals
    • G06Q 10/08: Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q 10/087: Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • G06Q 10/10: Office automation; Time management
    • G06Q 10/103: Workflow collaboration or project management
    • G06Q 30/00: Commerce
    • G06Q 30/018: Certifying business or products
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 80/00: Climate change mitigation technologies for sector-wide applications
    • Y02P 80/20: Climate change mitigation technologies for sector-wide applications using renewable energy

Definitions

  • The present invention is generally related to user-interfaces for touch-sensitive displays. More specifically, the instant invention relates to a system and method for capturing, cropping, and annotating images on a touch sensitive display device or other handheld device.
  • Cropping and annotating images is important in graphic user interfaces (GUIs), both for manipulating images in graphics applications such as Adobe Photoshop®, Microsoft Paint®, and the like, and also for cropping and annotating images for insertion into textual documents such as Adobe Portable Document Format (PDF)®, Microsoft Word®, and the like.
  • GUIs: graphic user interfaces
  • PDF: Adobe Portable Document Format
  • One prior method of image cropping and annotating is shown in FIG. 1.
  • Window 1 (101) shows an image of a plant 108, which could be any image or photograph, previously stored in memory or taken live right before the cropping/annotating operation.
  • A mouse 103 is used to click at a point on the screen 102 (shown as dashed cross hair 102), then to perform a drag operation 110 (shown as dashed line 110) while holding down the mouse button to another point on the screen 106 (shown as solid cross hair 106), and then to release the mouse button at point 106.
  • Points 102 and 106 represent the lower-left-hand (LLH) corner and the upper-right-hand (URH) corner, respectively, of the rectangular bounding box representing the crop area.
  • A user hits, selects, or otherwise operates a menu bar, selecting the crop/annotate operation, which then either crops or annotates the image using the LLH 102 and URH 106 points to define the bounding box.
  • This operation is cumbersome, requires multiple mouse operations, and is generally only usable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use in a handheld or other field device, or in locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable.
  • The present invention solves these problems of the prior art.
  • Applications of the present invention include digital cameras, digital video cameras, phones with built-in cameras, phones with built-in display devices, such as the Apple iPhone®, and the like.
  • The present invention may be used to provide a simple and convenient method to crop and annotate images in situations and locations where such ease is important and/or necessary.
  • One concrete application of the present invention is supplying a convenient user interface for a handheld device used for industrial inspection and maintenance compliance systems, as described in related U.S. Ser. No. 12/489,313.
  • The present invention provides an easy mechanism for on-site inspectors to quickly crop and annotate images in the field to substantiate problems found during an inspection.
  • The present invention is a system and method for cropping and annotating images on a touch sensitive display device or other handheld device.
  • One embodiment of the present invention is a method for cropping images, including the steps of (a) displaying an image of the image file to be cropped; (b) receiving a first input from a user designating a first point in the image defining a corner of a crop rectangle, e.g., a lower left hand corner; (c) receiving a second input from the user designating a second point in the image defining the opposite corner of the crop rectangle, e.g., an upper right hand corner, wherein the first input is released before the second input is initiated; and (d) cropping the image to the crop rectangle defined by the two corners when the second input is released.
  • Another embodiment of the present invention is the method described above also including the step of displaying on the image a location of the first point.
  • Another embodiment of the present invention is the method described above also including the step of displaying a rectangle corresponding to the crop rectangle of the image before cropping the image.
  • Another embodiment of the present invention is the method described above where, if a user does not immediately release the second input, the user is allowed to drag the second point to visually edit a shape and a size of the crop rectangle.
  • Another embodiment of the present invention is the method described above where, if a user drags the second point near an edge of the displayed image and the image is larger than the displayed portion, the displayed portion is scrolled to show the area of the image in the direction of the dragged point.
  • Another embodiment of the present invention is the method described above also including the step of displaying the cropped image in the display area in place of the original image.
  • Another embodiment of the present invention is the method described above also including the step of scaling the cropped image to fill the entire display area.
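The two-tap crop in the embodiments above can be sketched as follows. This is a minimal illustration, not the patent's stated implementation: the function names, the (x, y) tap tuples, and the row-major pixel-list image model are all assumptions.

```python
def normalize_rect(p1, p2):
    """Return (left, top, right, bottom) from two opposite corners,
    tapped in any order, so the user need not tap lower-left first."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))

def crop(image, p1, p2):
    """Crop a row-major 2D pixel list (image[y][x]) to the rectangle
    defined by two taps; applied when the second input is released."""
    left, top, right, bottom = normalize_rect(p1, p2)
    return [row[left:right] for row in image[top:bottom]]
```

For example, cropping a 4x4 image with taps at (1, 1) and (3, 3), in either order, yields the 2x2 center region; scaling the result to fill the display would then be a separate presentation step.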
  • Yet another embodiment of the present invention is a method of annotating an image (where annotating an image includes superimposing one or more geometrical shapes on top of the image), the method including the steps of (a) displaying an image of the image file to be annotated; (b) receiving a first input from a user designating a first point in the image defining a corner of an annotation rectangle, e.g., the lower left hand corner; (c) receiving a second input from the user designating an opposite corner of the annotation rectangle, e.g., the upper right hand corner, wherein the first input is released before the second input is initiated; and (d) annotating the image in the annotation rectangle defined by the two corners when the second input is released.
  • Another embodiment of the present invention is the method described above also including the step of displaying on the image a location of the first point.
  • Another embodiment of the present invention is the method described above also including the step of displaying a shape corresponding to the annotation of the image before annotating the image.
  • Another embodiment of the present invention is the method described above where, if a user does not immediately release the second input, the user is allowed to drag the second point to visually show a shape and a size of the annotation area.
  • Another embodiment of the present invention is the method described above where, if a user drags the second point near an edge of the displayed image and the image is larger than the displayed portion, the displayed portion is scrolled to show the area of the image in the direction of the dragged point.
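The edge-scrolling embodiment above can be sketched as follows. The function, the viewport tuple layout, and the margin/step values are assumptions chosen for illustration, not values from the specification.

```python
def edge_scroll(drag_point, viewport, image_size, margin=10, step=5):
    """If the dragged point is within `margin` px of a viewport edge and
    more image lies beyond that edge, return a new viewport origin
    scrolled by `step` px in that direction; otherwise return it as-is."""
    (px, py) = drag_point
    (vx, vy, vw, vh) = viewport          # origin + size within the image
    (iw, ih) = image_size
    if px - vx < margin and vx > 0:
        vx = max(0, vx - step)           # near left edge: scroll left
    elif (vx + vw) - px < margin and vx + vw < iw:
        vx = min(iw - vw, vx + step)     # near right edge: scroll right
    if py - vy < margin and vy > 0:
        vy = max(0, vy - step)           # near top edge: scroll up
    elif (vy + vh) - py < margin and vy + vh < ih:
        vy = min(ih - vh, vy + step)     # near bottom edge: scroll down
    return (vx, vy)
```

Calling this repeatedly while the drag is held produces the continuous scroll described above, and it stops automatically once the viewport reaches the image boundary.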
  • Another embodiment of the present invention is the method described above also including the step of displaying the annotated image in the display area in place of the original image.
  • Another embodiment of the present invention is the method described above also including the step of receiving a third input representing a type of shape and a characteristic of the shape corresponding to the annotation.
  • Another embodiment of the present invention is the method described above where the shape may be, but is not limited to, a line, a rectangle, an ellipse, or a circle.
  • Another embodiment of the present invention is the method described above where the characteristics of the shape include, but are not limited to, a line type, a line width, and a line color.
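One plausible way to record an annotation rectangle together with the "third input" described above (shape type plus line characteristics) is a small record type. The field names and default values here are assumptions, not the patent's schema.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    """One overlay shape, stored by its two tapped corners plus the
    shape type and line characteristics supplied by a third input."""
    corner1: tuple              # first tap, e.g. lower left hand corner
    corner2: tuple              # second tap, the opposite corner
    shape: str                  # "line", "rectangle", "ellipse", or "circle"
    line_type: str = "solid"
    line_width: int = 2
    line_color: str = "#ff0000"
```

Storing annotations this way, rather than painting them into the pixels, keeps them individually undoable, which matters for the workflow described later in FIG. 7.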
  • The present invention also includes a related system by which the method of capturing, cropping, and annotating an image could be carried out.
  • Such a system could be implemented as a computer system embodied in a handheld device.
  • The system may include integrated or separate hardware components for taking media samples and means for receiving touch input.
  • FIG. 1 shows a prior art method of cropping an image in a desktop or other like environment using a mouse or other like peripheral device.
  • FIG. 2 shows a flowchart for cropping an image using a simplified process on a handheld device, in accordance with one embodiment of the present invention.
  • FIG. 3A shows a first step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention.
  • FIG. 3B shows a second step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention.
  • FIG. 3C shows a third step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention.
  • FIG. 4 shows a prior art method of annotating an image in a desktop or other like environment using a mouse or other like peripheral device.
  • FIG. 5 shows a flowchart for annotating an image using a simplified process on a handheld device, in accordance with another embodiment of the present invention.
  • FIG. 6A shows a first step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention.
  • FIG. 6B shows a second step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention.
  • FIG. 6C shows a third step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention.
  • FIG. 7 shows a flowchart of another process of another embodiment of the present invention, showing one aspect of one possible workflow using the principles of the present invention.
  • FIG. 8 is an illustration of a multi-functional handheld device, in which some of the software and hardware components of the present invention could reside, in accordance with yet another embodiment of the present invention.
  • FIG. 9 is an illustration of one of many possible use-cases of the present invention in relation to carrying out an industrial inspection operation on a wind-farm or other energy project or other like site.
  • FIG. 10 is an illustration of yet another possible use-case of the present invention in relation to a hand-held device with a camera and a touch-sensitive display, such as an Apple iPhone® or other like device.
  • FIG. 11 is an illustration of yet another possible use-case of the present invention in relation to a hand-held camera with a touch-sensitive display.
  • FIG. 12 is a block diagram of a system in accordance with yet another embodiment of the present invention.
  • The present invention generally pertains to a system and method for capturing, cropping, and annotating images on a touch sensitive display or other handheld device.
  • The interface could have, but is not limited to, the following components. Any subset of the following components is also within the scope of this invention.
  • The user can choose to crop the image as follows:
  • The user can choose to annotate the image as follows:
  • The invention may be used in an industrial inspection compliance system with which various methods can be carried out to assist in an inspection and provide the means for compliance verification of a proper inspection.
  • An inspection may represent the process of checking a physical component for safety, security, or business reasons, doing the same for compliance with industrial standards and guidelines, or a maintenance operation on a physical component for those same reasons.
  • These methods can generally be best executed by a multi-function handheld device, carried to and used in the physical proximity of an inspection component by the inspector. Examples of multi-function handheld devices include the Apple iPhone®, the Psion Teklogix Workabout Pro®, the Motorola MC-75®, and the like, but the present invention is not limited to the devices shown or described here.
  • One embodiment of the inspection compliance method includes the steps of scanning unique machine-readable tags deployed at logical inspection points defined by the inspector, and assigning a timestamp to the scanning operation; taking media samples of logical inspection points defined by the inspector, and assigning a timestamp to the media sample capturing operation; reporting sub-optimal conditions of the unique machine-readable tags deployed at logical inspection points if their condition warrants such a declaration; associating a media sample with a corresponding scan of a unique machine-readable tag; and annotating a media sample in such ways that substantiate statements of an industrial component passing inspection, or in such ways that substantiate statements of problems found with the industrial component. See U.S. Ser. No. 12/489,313 for more details of an example of an industrial inspection compliance system to which the present invention may be applied.
  • FIG. 1 shows a prior art method of cropping an image in a desktop or other like environment using a mouse or other like peripheral device.
  • Window 1 (101) shows an image of a plant 108, which could be any image or photograph, previously stored in memory or taken live right before the cropping operation.
  • A mouse 103 is used to click at a point on the screen 102 (shown as dashed cross hair 102), then to perform a drag operation 110 (shown as dashed line 110) while holding down the mouse button to another point on the screen 106 (shown as solid cross hair 106; this is also known as a "sweeping motion"), and then to release the mouse button at point 106.
  • Points 102 and 106 represent the lower-left-hand (LLH) corner and the upper-right-hand (URH) corner, respectively, of the rectangular bounding box representing the crop area.
  • A user hits, selects, or otherwise operates a menu bar, selecting the crop operation, which then crops the image using the LLH 102 and URH 106 points to define the bounding box.
  • This operation is cumbersome, requires multiple mouse operations, and is generally only usable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use in a handheld or other field device, or in locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable.
  • FIG. 2 shows a flowchart for cropping an image using a simplified process on a handheld device, in accordance with one embodiment of the present invention.
  • Process 200 begins at step 202 , where an image is either retrieved from memory, captured in real-time via an image capture device, or in some other way provided to the process 200 .
  • The image is displayed on the touch sensitive display or other display of the handheld device.
  • The user may click or tap (using a finger, a stylus, a mouse, or other device) at an LLH location where the crop is to begin.
  • In step 208, the user may click or tap (using the finger, the stylus, the mouse, or other device) at a URH location where the crop is to end.
  • In step 210, the image is cropped between the LLH location and the URH location.
  • In step 212, the cropped image is displayed for the user's confirmation.
  • The user may cancel, undo, or accept the crop.
  • The user releases the first input before activating the second input. The process ends in step 214.
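The tap sequence of process 200 (first tap sets one corner, second tap sets the opposite corner, and the crop fires on release of the second input) can be sketched as a small state holder. The class and method names are illustrative assumptions; the returned rectangle is normalized so the two taps may arrive in either corner order.

```python
class TwoTapCropper:
    """Minimal state machine for the two-tap crop of process 200."""

    def __init__(self):
        self.first = None   # first corner, pending until the second tap

    def on_tap(self, point):
        """Record a tap. Returns the crop rectangle (left, top, right,
        bottom) once both corners are set, else None."""
        if self.first is None:
            self.first = point          # first input: one corner
            return None
        x1, y1 = self.first
        x2, y2 = point                  # second input: opposite corner
        self.first = None               # reset, ready for the next crop
        return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
```

Because the first input is released before the second is initiated, each tap arrives as a discrete event, which is what makes this two-state handler sufficient (no drag tracking is required for the basic crop).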
  • FIG. 3A shows the first step in the process for cropping the image on the handheld device, in accordance with the embodiment described in relation to FIG. 2 .
  • Screen 302 shows a screen or touch-sensitive area of a handheld device.
  • Window 2 ( 304 ) shows one of many windows showing the image the user desires to crop.
  • The user uses his or her hand 308 (or a stylus, mouse, or other device) to click or tap at point 306 (shown as solid cross hair 306).
  • The position of the click or tap 306 represents an LLH corner of the crop boundary.
  • FIG. 3B shows the second step in the process for cropping the image on the handheld device.
  • Point 306 is now shown as a dashed cross hair.
  • The user may move his or her hand 308 (or stylus, mouse, or other device), shown as dashed motion line 310, to another location 312 (shown as solid cross hair 312) and click or tap a second time.
  • The second tap at location 312 represents a URH corner of the crop boundary.
  • The user releases the first input before activating the second input.
  • FIG. 3C shows the third and final step in the process for cropping the image on the handheld device.
  • The crop operation is performed in the background, and an updated or cropped image is displayed for the user's confirmation.
  • Point 306 is shown as a dashed cross hair.
  • Point 312 is now also shown as a dashed cross hair.
  • A user of the present invention may implement a crop operation with very little hand motion and very little input into the device, which is highly desirable, or even mandatory, when operating in the field, for example, during an inspection operation.
  • FIG. 4 shows a prior art method of annotating an image in a desktop or other like environment using a mouse or other like peripheral device.
  • Window 1 (401) shows an image of a plant 408, which could be any image or photograph, previously stored in memory or taken live right before the cropping/annotating operation.
  • A mouse 403 is used to click at a point on the screen 402 (shown as dashed cross hair 402), then to perform a drag operation 410 (shown as dashed line 410) while holding down the mouse button to another point on the screen 406 (shown as solid cross hair 406; this is also known as a "sweeping motion"), and finally to release the mouse button at point 406.
  • Points 402 and 406 represent the lower-left-hand (LLH) corner and the upper-right-hand (URH) corner, respectively, of the rectangular bounding box representing the annotation area.
  • A user hits, selects, or otherwise operates a menu bar, selecting the proper annotate operation (circle, oval, rectangle, arrow, line, etc.), which then annotates the image using the LLH 402 and URH 406 points to define the bounding box.
  • This operation is cumbersome, requires multiple mouse operations, and is generally only usable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use in handheld or other field devices, or in locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable.
  • FIG. 5 shows a flowchart for annotating an image using a simplified process on a handheld device, in accordance with another embodiment of the present invention.
  • Process 500 begins at step 502 , where an image is either retrieved from memory, captured in real-time via an image capture device, or in some other way provided to the process 500 .
  • The image is displayed on the touch sensitive display or other display of the handheld device.
  • The user may click or tap (using a finger, a stylus, a mouse, or other device) at an LLH location where the annotation is to begin.
  • In step 508, the user may click or tap (using the finger, the stylus, the mouse, or other device) at a URH location where the annotation is to end.
  • In step 510, the image is annotated between the LLH location and the URH location.
  • In step 512, the annotated image is displayed for the user's confirmation.
  • The user may cancel, undo, or accept the annotation.
  • The user releases the first input before activating the second input. The process ends in step 514.
  • FIG. 6A shows the first step in the process for annotating the image on the handheld device, in accordance with the embodiment described in relation to FIG. 5 .
  • Screen 602 shows a screen or touch-sensitive area of a handheld device.
  • Window 2 ( 604 ) shows one of many windows showing the image the user desires to annotate.
  • The user uses his or her hand 608 (or a stylus, mouse, or other device) to click or tap at point 606 (shown as solid cross hair 606).
  • The position of the click or tap 606 represents an LLH corner of the annotation boundary.
  • FIG. 6B shows the second step in the process for annotating the image on the handheld device.
  • Point 606 is now shown as a dashed cross hair.
  • The user may move his or her hand 608 (or stylus, mouse, or other device), shown as dashed motion line 610, to another location 612 (shown as solid cross hair 612) and click or tap a second time.
  • The second tap at location 612 represents a URH corner of the annotation boundary.
  • The user releases the first input before activating the second input.
  • FIG. 6C shows the third and final step in the process for annotating the image on the handheld device.
  • An updated or annotated image is displayed for the user's confirmation.
  • Point 606 is shown as a dashed cross hair.
  • Point 612 is now also shown as a dashed cross hair.
  • A user of the present invention may implement an annotation operation with very little hand motion and very little input into the device, which is highly desirable, or even mandatory, when operating in the field, for example, during an inspection operation.
  • FIG. 7 shows a flowchart of another process of another embodiment of the present invention showing one aspect of one possible workflow using the principles of the present invention.
  • Process 700 begins at step 702 , where the user of a handheld device edits an image on the device.
  • The user opens the image for viewing, and then at step 706 the user makes an annotation on the image in the spirit of process 500 shown in FIG. 5.
  • The user then proceeds to step 708, where he or she crops the image using the steps described in process 200 shown in FIG. 2.
  • The user now sees only a sub-area of the original image on the screen, as a result of step 212 of FIG. 2, whereby the cropped area is displayed to take up the full screen area of the device.
  • The user then decides that what he or she annotated in step 706 was not correct, so he or she reverses that annotation at the click of an UNDO button in step 710.
  • In step 712, the user reverses the crop he or she made in step 708, at which point the image shown to the user looks exactly as it did in step 704.
  • In step 714, the user crops the image in a different fashion from the crop previously made in step 708, but once again using the steps described in process 200 shown in FIG. 2.
  • In steps 716 and 718, the user makes two consecutive annotations in the spirit of process 500 shown in FIG. 5.
  • The user is then satisfied with the edits he or she has made and ends the process at step 720.
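The undo/redo behavior in the FIG. 7 workflow can be sketched with a pair of stacks. This is an assumption about a natural implementation, not the patent's stated design; each pushed "edit" would be a crop rectangle or an annotation record.

```python
class EditHistory:
    """Undo/redo stacks for a sequence of crop and annotation edits,
    as exercised by the workflow of FIG. 7."""

    def __init__(self):
        self.done = []      # edits currently applied, oldest first
        self.undone = []    # edits reversed by UNDO, available for redo

    def apply(self, edit):
        self.done.append(edit)
        self.undone.clear()             # a new edit invalidates redo

    def undo(self):
        if self.done:
            self.undone.append(self.done.pop())

    def redo(self):
        if self.undone:
            self.done.append(self.undone.pop())
```

Replaying FIG. 7 against this sketch: annotate (706), crop (708), undo twice (710, 712) returns the history to the state of step 704, then crop (714) and two annotations (716, 718) leave three applied edits.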
  • The result of the series of actions illustrated in FIG. 7 may be stored by storing a reference to the original image, storing the final crop rectangle by reference to an LLH and a URH corner, and storing a list of annotations, each also stored by reference to an LLH and a URH corner along with type information, such as annotation type, annotation color, etc., to overlay on the cropped image.
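The non-destructive storage scheme described above (original-image reference, final crop rectangle, annotation list) might be serialized as in the sketch below. The JSON schema, key names, and annotation tuple layout are assumptions for illustration only.

```python
import json

def serialize_edits(image_ref, crop_llh, crop_urh, annotations):
    """Store the FIG. 7 result non-destructively: a reference to the
    original image, the final crop rectangle by its LLH and URH corners,
    and the annotations to overlay on the cropped image."""
    return json.dumps({
        "image": image_ref,
        "crop": {"llh": crop_llh, "urh": crop_urh},
        "annotations": [
            {"llh": a[0], "urh": a[1], "type": a[2], "color": a[3]}
            for a in annotations
        ],
    })
```

Because only references and rectangles are stored, the original image is never modified, and any edit can be reversed later by rewriting this record.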
  • FIG. 8 is an illustration of a multi-functional handheld device 800 , in which some of the software and hardware components of the present invention could reside, in accordance with yet another embodiment of the present invention.
  • The handheld device 800 contains a screen or display 802, which may be a touch-sensitive display, for displaying an image to be cropped and/or annotated with overlaid objects.
  • The handheld device 800 also contains a toolbar 806 that contains iconographic buttons for each function that a user may execute during the process of taking and editing an image.
  • Some possible selectable actions include, but are not limited to, from top to bottom and left to right, “take a picture” 804 (first row, far left), undo, redo, zoom-in, zoom-out (first row, far right), delete/cancel (second row, far left), annotate with an arrow, annotate with a circle, annotate with a rectangle, annotate with a line, and crop (second row, far right).
  • The user interface 800 is but one of many possible illustrative embodiments of the present invention.
  • One of ordinary skill in the art would appreciate that any other configuration of objects in a user interface, as well as any possible extensions to the set of functions presented in the user interface 800 , are all within the spirit and scope of the present invention.
  • FIG. 9 is an illustration of one of many possible use-cases of the present invention in relation to carrying out an industrial inspection operation on a wind-farm or other energy project or other like site.
  • FIG. 9 shows an inspector carrying out an inspection of wind turbine 902 and wind turbine 904 .
  • the inspector 906 is standing next to the tower and foundation sections of wind turbine 904 .
  • the inspector 906 is using an industrial inspection handheld device 908 .
  • Inspector 906 is more specifically in the process of using the industrial inspection handheld device 908 , even more specifically having an embedded RFID reader, to scan RFID tag 912 on tower section of wind turbine 904 , via radio frequency communication channel 910 .
  • the inspector may take a picture of the potential problem area, and then proceed to crop and annotate the problem area using the methods described in the present application. Since the inspector is in the field, the present invention is particularly suitable for helping the inspector complete the inspection in a timely, accurate, and cost-effective manner.
  • FIG. 9 is but one of many possible illustrative embodiments of the usage of the present invention.
  • The present invention may also be used in relation to renewable energy systems and distributed energy systems, including wind turbines, solar photovoltaic plants, solar thermal plants, co-generation plants, biomass-fueled power plants, carbon sequestration projects, enhanced oil recovery systems, and the like.
  • FIG. 10 is an illustration of yet another possible use-case of the present invention in relation to a hand-held device with a camera and a touch-sensitive display, such as an Apple iPhone® 1000 or other like device.
  • Users of an Apple iPhone® 1000 may wish to crop and/or annotate an image either taken by the iPhone® 1000 or received from another user, or in some other way obtained on the iPhone® 1000.
  • the present invention is particularly suitable for use with an iPhone®, since an iPhone® as currently practiced does not contain a useful or easy mechanism for cropping or annotating images.
  • FIG. 11 is an illustration of yet another possible use-case of the present invention in relation to a hand-held camera with a touch-sensitive display 1100 .
  • Users of a digital camera 1100 may wish to crop and/or annotate an image taken by the digital camera 1100 .
  • the present invention is particularly suitable for use with a digital camera, especially a digital camera as shown in FIG. 11 with a touch-sensitive display device 1100 , since digital cameras as currently practiced do not contain a useful or easy mechanism for cropping or annotating images on-site and instead require uploading the images to a computer for further desktop processing to crop and annotate the images.
  • FIG. 12 is a block diagram of an exemplary computer system 1200 , in accordance with one embodiment of the present invention.
  • the computer system 1200 may correspond to a personal computer system, such as a desktop, laptop, tablet, or handheld computer.
  • the computer system may also correspond to a computing device, such as a cell phone, PDA, dedicated media player, consumer electronic device, and the like.
  • the exemplary computer system 1200 shown in FIG. 12 includes a processor 1208 configured to execute instructions and to carry out operations associated with the computer system 1200 .
  • the processor 1208 may control the reception and manipulation of input and output data between components of the computing system 1200 .
  • the processor 1208 can be implemented on a single chip, on multiple chips, or with multiple electrical components.
  • various architectures can be used for the processor 1208 , including dedicated or embedded processor, single purpose processor, controller, ASIC, and so forth.
  • the processor 1208 together with an operating system operates to execute computer code and produce and use data.
  • Operating systems are generally well known and will not be described in greater detail.
  • the operating system may correspond to OS/2, Apple OS/X, Apple iPhone® OS, Google Android® OS, DOS, UNIX, Linux, Palm® OS, Windows, Windows Mobile®, Windows CE®, and the like.
  • the operating system can also be a special purpose operating system, such as may be used for limited purpose appliance-type computing devices.
  • the operating system, other computer code and data may reside within a memory block 1214 that is operatively coupled to the processor 1208 .
  • Memory block 1214 generally provides a place to store computer code and data that are used by the computer system 1200 .
  • the memory block 1214 may include Read-Only Memory (ROM), Random-Access Memory (RAM), hard disk drive and the like.
  • the information could also reside on a removable storage medium and be loaded or installed onto the computer system 1200 when needed.
  • Removable storage mediums include, for example, CD-ROM, PC-CARD, memory card, floppy disk, magnetic tape, and a network component.
  • the computer system 1200 also includes a display device 1210 that is operatively coupled to the processor 1208 .
  • the display device 1210 may be a liquid crystal display (LCD) (e.g., active matrix, passive matrix and the like).
  • the display device 1210 may be a monitor such as a monochrome display, color graphics adapter (CGA) display, enhanced graphics adapter (EGA) display, variable-graphics-array (VGA) display, super VGA display, cathode ray tube (CRT), and the like.
  • the display device may also correspond to a plasma display or a display implemented with electronic inks.
  • the display device 1210 is generally configured to display a graphical user interface (GUI) that provides an easy to use interface between a user of the computer system and the operating system or application running thereon.
  • the graphical images may include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, etc. Such images may be arranged in predefined layouts, or may be created dynamically to serve the specific actions being taken by a user.
  • the user can select and activate various graphical images in order to initiate functions and tasks associated therewith.
  • a user may select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular program.
  • the GUI can additionally or alternatively display information, such as non interactive text and graphics, for the user on the display device 1210 .
  • the computer system 1200 also includes a tag scanning input device 1202 that is operatively coupled to the processor 1208 .
  • the tag scanning input device 1202 is configured to transfer data from the outside world into the computer system 1200 .
  • the input device 1202 is used to scan unique machine-readable tag 1204 .
  • the unique machine-readable tag 1204 may be a barcode sticker, a high-frequency (HF) radio-frequency identification (RFID) tag, an ultra-high-frequency (UHF) RFID tag, or any other tag or the like that serves as a unique identifier.
  • the scanning of the tag may be done by a corresponding tag scanning input device 1202 either embedded in the inspector's handheld device, or embodied in a separate dedicated device, implemented in whichever way is necessary to read the corresponding tag, whether by way of visual identification, radio frequency identification, or the like, and store a record of the scanning operation.
  • Various other techniques for choosing the type of unique machine-readable tag and for scanning it are within the abilities of one of ordinary skill in the art.
  • the computer system 1200 also includes a media sample input device 1206 that is operatively coupled to the processor 1208 .
  • the media sample input device 1206 is configured to transfer data from the outside world into the computer system 1200 .
  • the input device 1206 is used to capture a media sample and may include cameras of any sort, video camcorders with audio input, video camcorders without audio input, infrared imagers, ultrasonic imagers, or any other type of mechanical, chemical or electromagnetic imager that can obtain visual media. This visual media could be a view of an inspected component 1203 .
  • the taking of a media sample may be done by media sample input device 1206 either embedded in the handheld device, or embodied in a separate dedicated device, implemented in whichever way is necessary to take and store the media sample.
  • the computer system 1200 also includes capabilities for coupling to one or more I/O devices 1220 .
  • the I/O devices 1220 may correspond to keyboards, printers, scanners, cameras, speakers, and/or the like.
  • the I/O devices 1220 may be integrated with the computer system 1200 or they may be separate components (e.g. peripheral devices).
  • the I/O devices 1220 may be connected to the computer system 1200 through wired connections (e.g. cables/ports).
  • the I/O devices 1220 may be connected to the computer system 1200 through wireless connections.
  • the data link may correspond to PS/2, USB, IR, RF, Bluetooth or the like.
  • the memory block 1214 may include a tag scanning operational program 1216 , which may be part of the operating system or a separate application.
  • the tag scanning operational program 1216 generally includes a set of instructions that recognizes the occurrence of a tag scan operation on unique machine-readable tag 1204 and informs one or more software agents of the presence of unique machine-readable tag 1204 and/or what action(s) to take in response to the unique machine-readable tag 1204 .
  • the memory block 1214 may also include a media sample capturing program 1218 , which may be part of the operating system or a separate application.
  • the media sample capturing program 1218 generally includes a set of instructions that recognizes the occurrence of a media sample capture operation on the view of inspected component 1203 and informs one or more software agents of media obtained and/or what action(s) to take in response to the media obtained.
  • the system 1200 may also allow an inspector to annotate the media samples in such ways that substantiate inspector statements of problems with inspected components found during inspection, or to substantiate inspector statements of any inspected components passing inspection, using the principles taught in the present invention.
  • the memory block 1214 may be used to store program code to execute any of the processes of the present invention, including the processes shown in FIGS. 2 and 5 .

Abstract

The present invention is a system and method for cropping and annotating images on a touch sensitive display device or other handheld device, including the following steps: (a) displaying an image of the image file to be cropped/annotated; (b) receiving a first input from a user designating a first point in the image defining a corner of a crop/annotation rectangle; (c) receiving a second input from the user designating a second point in the image defining an opposite corner of the crop/annotation rectangle; and (d) cropping and/or annotating the image from the first point to the second point of the crop/annotation rectangle. The present invention may be used in digital cameras, Apple iPhones®, hand-held devices that inspectors may use to annotate photographs taken to substantiate statements of problems found during industrial inspections, and for other purposes.

Description

    REFERENCE TO RELATED APPLICATIONS
  • This application is a CONTINUATION application of U.S. Ser. No. 12/507,039 filed on Jul. 21, 2009 entitled “SYSTEM AND METHOD FOR CROPPING AND ANNOTATING IMAGES ON A TOUCH SENSITIVE DISPLAY DEVICE,” which claims priority from provisional application Ser. No. 61/122,632, filed on Dec. 15, 2008, and entitled “A system, method and apparatus for inspections and compliance verification of industrial equipment using a handheld device,” the entirety of which are both hereby incorporated by reference herein. This application is related to co-pending application Ser. No. 12/758,134, filed on Apr. 12, 2010, and entitled “A system and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device,” the entirety of which is hereby incorporated by reference herein.
  • FIELD OF THE INVENTION
  • The present invention is generally related to user-interfaces for touch-sensitive displays. More specifically, the instant invention relates to a system and method for capturing, cropping, and annotating images on a touch sensitive display device or other handheld device.
  • BACKGROUND OF THE INVENTION
  • Cropping and annotating images is important in graphic user interfaces (GUIs), both for manipulating images in graphics applications such as Adobe Photoshop®, Microsoft Paint®, and the like, and also for cropping and annotating images for insertion into textual documents such as Adobe Portable Document Format (PDF)®, Microsoft Word®, and the like.
  • Multiple end-use applications require cropping and annotating images, including reference manuals, encyclopedias, educational texts, inspection reports, and the like. For example, U.S. Ser. No. 12/489,313, filed on Jun. 22, 2009, entitled “A system and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device,” describes a method for carrying out an inspection on a piece of industrial equipment and generating inspection reports in the field. An inspector out in the field carrying out an inspection operation needs a convenient, quick, and accurate method to crop and annotate images taken in the field.
  • One prior method of image cropping and annotating is shown in FIG. 1. Window 1 (101) shows an image of a plant 108, which could be any image or photograph, previously stored in memory or taken live right before the cropping/annotating operation. A mouse 103 is used to click at a point on the screen 102 (shown as dashed cross hair 102), then performing a drag operation 110 (shown as dashed line 110) while holding down the mouse button to another point on the screen 106 (shown as solid cross hair 106), and then releasing the mouse button at the point 106. Points 102 and 106 represent the lower-left-hand (LLH) corner and the upper-right-hand (URH) corner, respectively, of the rectangular bounding box representing the crop area. In the prior art method, after the crop area has been selected, a user hits, selects, or otherwise operates a menu bar, selecting the crop/annotate operation, which then either crops or annotates the image using the LLH 102 and URH 106 points to define the bounding box. This operation is cumbersome and requires multiple mouse operations, and furthermore is generally only useable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use in a handheld or other field device, or locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable. The present invention solves these problems of the prior art.
  • Applications of the present invention include digital cameras, digital video cameras, phones with built-in cameras, phones with built-in display devices, such as the Apple iPhone®, and the like. In general, the present invention may be used to provide a simple and convenient method to crop and annotate images in situations and locations where such ease is important and/or necessary.
  • For example, one concrete application of the present invention is related to supplying a convenient user interface for a handheld device used for industrial inspection and maintenance compliance systems, as described in related U.S. Ser. No. 12/489,313. The present invention allows an easy mechanism for on-site inspectors to quickly crop and annotate images in the field to substantiate problems found during an inspection.
  • One of ordinary skill in the art will find many useful applications of the present invention in which a convenient and easy way is needed to either crop or annotate images on a touch-sensitive display or other hand-held device.
  • It is against this background that various embodiments of the present invention were developed.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention is a system and method for cropping and annotating images on a touch sensitive display device or other handheld device.
  • One embodiment of the present invention is a method for cropping images, including the steps of (a) displaying an image of the image file to be cropped; (b) receiving a first input from a user designating a first point in the image defining a corner of a crop rectangle, e.g., a lower left hand corner; (c) receiving a second input from the user designating a second point in the image defining the opposite corner of the crop rectangle, e.g. an upper right hand corner, wherein the first input is released before the second input is initiated; and (d) cropping the image to the crop rectangle defined by the two corners when the second input is released.
  • Another embodiment of the present invention is the method described above also including the step of displaying on the image a location of the first point.
  • Another embodiment of the present invention is the method described above also including the step of displaying a rectangle corresponding to the crop rectangle of the image before cropping the image.
  • Another embodiment of the present invention is the method described above where if a user does not immediately release the second input, allowing the user to drag the second point to visually edit a shape and a size of the crop rectangle.
  • Another embodiment of the present invention is the method described above where if a user drags the second point near an edge of the displayed image, and the image is larger than the displayed portion, then scrolling the displayed portion to show the area of the image in a direction of the dragged point.
  • Another embodiment of the present invention is the method described above also including the step of displaying the cropped image in the display area in place of the original image.
  • Another embodiment of the present invention is the method described above also including the step of scaling the cropped image to fill the entire display area.
  • Yet another embodiment of the present invention is a method of annotating an image (where annotating an image includes superimposing one or more geometrical shapes on top of the image), the method including the steps of (a) displaying an image of the image file to be annotated; (b) receiving a first input from a user designating a first point in the image defining a corner of an annotation rectangle, e.g. the lower left hand corner; (c) receiving a second input from the user designating an opposite corner of the annotation rectangle, e.g., the upper right hand corner, wherein the first input is released before the second input is initiated; and (d) annotating the image in the annotation rectangle defined by the two corners when the second input is released.
  • Another embodiment of the present invention is the method described above also including the step of displaying on the image a location of the first point.
  • Another embodiment of the present invention is the method described above also including the step of displaying a shape corresponding to the annotation of the image before annotating the image.
  • Another embodiment of the present invention is the method described above where if a user does not immediately release the second input, allowing the user to drag the second point to visually show a shape and a size of the annotation area.
  • Another embodiment of the present invention is the method described above where if a user drags the second point near an edge of the displayed image, and the image is larger than the displayed portion, then scrolling the displayed portion to show the area of the image in a direction of the dragged point.
  • Another embodiment of the present invention is the method described above also including the step of displaying the annotated image in the display area in place of the original image.
  • Another embodiment of the present invention is the method described above also including the step of receiving a third input representing a type of shape and a characteristic of the shape corresponding to the annotation.
  • Another embodiment of the present invention is the method described above where the shape is, but is not limited to, a line, a rectangle, an ellipse, or a circle. Another embodiment of the present invention is the method described above where the characteristics of the shape include, but are not limited to, a line type, a line width, and a line color.
  • The present invention also includes a related system by which the method of capturing, cropping, and annotating an image could be carried out. Such a system could be implemented as a computer system, embodied in a handheld device. The system may include integrated or separate hardware components for taking of media samples and means for receiving touch input.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The invention will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
  • FIG. 1 shows a prior art method of cropping an image in a desktop or other like environment using a mouse or other like peripheral device;
  • FIG. 2 shows a flowchart for cropping an image using a simplified process on a handheld device, in accordance with one embodiment of the present invention;
  • FIG. 3A shows a first step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention;
  • FIG. 3B shows a second step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention;
  • FIG. 3C shows a third step in the process for cropping the image on the handheld device, in accordance with one embodiment of the present invention;
  • FIG. 4 shows a prior art method of annotating an image in a desktop or other like environment using a mouse or other like peripheral device;
  • FIG. 5 shows a flowchart for annotating an image using a simplified process on a handheld device, in accordance with another embodiment of the present invention;
  • FIG. 6A shows a first step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention;
  • FIG. 6B shows a second step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention;
  • FIG. 6C shows a third step in the process for annotating the image on the handheld device, in accordance with one embodiment of the present invention;
  • FIG. 7 shows a flowchart of another process of another embodiment of the present invention showing one aspect of one possible workflow using the principles of the present invention;
  • FIG. 8 is an illustration of a multi-functional handheld device, in which some of the software and hardware components of the present invention could reside, in accordance with yet another embodiment of the present invention;
  • FIG. 9 is an illustration of one of many possible use-cases of the present invention in relation to carrying out an industrial inspection operation on a wind-farm or other energy project or other like site;
  • FIG. 10 is an illustration of yet another possible use-case of the present invention in relation to a hand-held device with a camera and a touch-sensitive display, such as an Apple iPhone® or other like device;
  • FIG. 11 is an illustration of yet another possible use-case of the present invention in relation to a hand-held camera with a touch-sensitive display; and
  • FIG. 12 is a block diagram of a system in accordance with yet another embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • The present invention generally pertains to a system and method for capturing, cropping, and annotating images on a touch sensitive display or other handheld device.
  • The interface according to the principles of the present invention could have, but is not limited to, the following components. Any subsets of the following components are also within the scope of this invention. After a user captures an initial image, it is stored and displayed. No actions of the user will modify the initial image, allowing all edits to be undone or re-applied against the original image until finalized.
  • The user can choose to crop the image as follows:
      • 1. The user can click once on a point in the image, displaying a point where the click occurred, and then click again at another point in the image;
      • 2. A rectangle with the two clicks at opposite corners is displayed. When the user releases the second click, including immediately releasing it, this rectangle becomes the new crop rectangle;
      • 3. If the user does not immediately release the second click, they can drag the point to visually edit the shape and size of the rectangle;
      • 4. If the point is dragged near the edge of the displayed image and the image is larger than the displayed portion, then the displayed portion will scroll to show the areas in the direction of the dragged point; and
      • 5. Once selected, the new crop rectangle becomes the area displayed. The image within the selected rectangle can be scaled to the size of the viewport.
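The rectangle geometry implied by the steps above can be sketched in a few lines of code. This is an illustrative sketch only, not the implementation described in the patent; the names `crop_rect` and `scale_to_viewport` are hypothetical. It normalizes the two clicks (which may land at any pair of opposite corners) into a well-formed bounding rectangle, and computes the uniform scale factor that fits the selected area to the viewport:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Point:
    x: int
    y: int


def crop_rect(p1: Point, p2: Point) -> tuple:
    """Normalize two opposite-corner clicks into (left, top, width, height).

    The clicks may arrive in any order (e.g. lower-left then upper-right,
    or upper-right then lower-left), so min/max recovers a well-formed
    rectangle either way.
    """
    left, right = min(p1.x, p2.x), max(p1.x, p2.x)
    top, bottom = min(p1.y, p2.y), max(p1.y, p2.y)
    return (left, top, right - left, bottom - top)


def scale_to_viewport(rect_w: int, rect_h: int, view_w: int, view_h: int) -> float:
    """Uniform scale factor that fits the cropped area into the viewport
    without distorting its aspect ratio."""
    return min(view_w / rect_w, view_h / rect_h)
```

Because the two corners are normalized with min/max, the same routine serves whichever corner the user happens to tap first.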
  • The user can choose to annotate the image as follows:
      • 1. The user can choose a type of an annotation shape (e.g., line, rectangle, ellipse), and characteristics of the annotation shape such as line type (e.g., dashed), line width, and line color, etc.;
      • 2. The user can click once on a point in the image, displaying a point where the click occurred, and then click again at another point in the image;
      • 3. An annotation shape of the appropriate type is displayed over the image with the two clicks at opposite corners of the shape's bounding rectangle. When the user releases the second click, including immediately releasing it, this shape and its location on the image are saved;
      • 4. If the user does not immediately release the second click, they can drag the point to visually edit the shape and its size; and
      • 5. If the point is dragged near the edge of the displayed image and the image is larger than the displayed portion, then the displayed portion will scroll to show the areas in the direction of the dragged point, and the portion of the shape in the displayed area will be shown.
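The edge-scrolling behavior in the last step above can be sketched as a clamped offset update. This is an illustrative sketch under assumed conventions (pixel coordinates with the origin at the top-left of the viewport); the name `scroll_delta` and the `margin`/`step` parameters are hypothetical, not values from the patent:

```python
def scroll_delta(drag_x, drag_y, view_w, view_h, offset_x, offset_y,
                 img_w, img_h, margin=20, step=10):
    """Scroll the displayed portion when the dragged point nears a viewport
    edge, clamped so the view never moves past the image bounds.

    Returns the new (offset_x, offset_y) of the displayed portion within
    the full image.
    """
    dx = dy = 0
    if drag_x < margin:              # near left edge: scroll left
        dx = -step
    elif drag_x > view_w - margin:   # near right edge: scroll right
        dx = step
    if drag_y < margin:              # near top edge: scroll up
        dy = -step
    elif drag_y > view_h - margin:   # near bottom edge: scroll down
        dy = step
    new_x = max(0, min(offset_x + dx, img_w - view_w))
    new_y = max(0, min(offset_y + dy, img_h - view_h))
    return new_x, new_y
```

Calling this once per drag event while the second input is held produces the continuous scrolling described, with the annotation shape redrawn against the newly displayed portion.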
  • The invention may be used in an industrial inspection compliance system with which various methods can be carried out to the effect of assisting in an inspection and providing the means for compliance verification of a proper inspection. For the purposes of the text describing this invention, an inspection may represent the process of checking a physical component for safety, security or business reasons, doing the same for compliance with industrial standards and guidelines, or a maintenance operation on a physical component for those same reasons. These methods can generally be best executed by a multi-function handheld device, carried to and used in the physical proximity of an inspection component by the inspector. Examples of multi-function handheld devices include the Apple iPhone®, the Psion Teklogix Workabout Pro®, the Motorola MC-75®, and the like, but the present invention is not limited to such devices as shown or described here. One embodiment of the inspection compliance method includes the steps of scanning unique machine-readable tags deployed at logical inspection points defined by the inspector, and assigning a timestamp to the scanning operation; taking media samples of logical inspection points defined by the inspector, and assigning a timestamp to the media sample capturing operation; reporting of sub-optimal conditions of the unique machine-readable tags deployed at logical inspection points if its condition warrants such a declaration; associating a media sample with a corresponding scan of a unique machine-readable tag; and annotating a media sample in such ways that substantiate statements of an industrial component passing inspection, or in such ways that substantiate statements of problems found with the industrial component. See U.S. Ser. No. 12/489,313 for more details of an example of an industrial inspection compliance system to which the present invention may be applied.
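The compliance workflow above (timestamped tag scans, timestamped media samples, and the association between them) could be represented with simple records. The following is a hypothetical sketch for illustration; the names `TagScan` and `MediaSample` are assumptions, not structures from U.S. Ser. No. 12/489,313:

```python
import time
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class TagScan:
    """One scan of a unique machine-readable tag at a logical inspection
    point, with a timestamp assigned to the scanning operation."""
    tag_id: str
    timestamp: float = field(default_factory=time.time)
    condition_report: Optional[str] = None  # e.g. a sub-optimal tag condition


@dataclass
class MediaSample:
    """One captured media sample, timestamped and optionally associated
    with the corresponding tag scan, plus any crop/annotation edits."""
    path: str
    timestamp: float = field(default_factory=time.time)
    associated_scan: Optional[TagScan] = None
    annotations: List = field(default_factory=list)
```

Keeping the annotation list separate from the stored image mirrors the non-destructive editing model described earlier, in which edits can be undone against the original image until finalized.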
  • The invention is discussed below with reference to FIGS. 1-11. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.
  • FIG. 1 shows a prior art method of cropping an image in a desktop or other like environment using a mouse or other like peripheral device. Window 1 (101) shows an image of a plant 108, which could be any image or photograph, previously stored in memory or taken live right before the cropping operation. A mouse 103 is used to click at a point on the screen 102 (shown as dashed cross hair 102), then performing a drag operation 110 (shown as dashed line 110) while holding down the mouse button to another point on the screen 106 (shown as solid cross hair 106, also known as a “sweeping motion”), and then releasing the mouse button at the point 106. Points 102 and 106 represent the lower-left-hand (LLH) corner and the upper-right-hand (URH) corner, respectively, of the rectangular bounding box representing the crop area. In the prior art method, after the crop area has been selected, a user hits, selects, or otherwise operates a menu bar, selecting the crop operation, which then crops the images using the LLH 102 and URH 106 points to define the bounding box. This operation is cumbersome and requires multiple mouse operations, and furthermore is generally only useable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use in a handheld or other field device, or locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable.
  • In order to solve the inherent limitations in the prior art method described in FIG. 1, the inventors have invented a novel method, system, and apparatus to facilitate on-site image cropping. FIG. 2 shows a flowchart for cropping an image using a simplified process on a handheld device, in accordance with one embodiment of the present invention. Process 200 begins at step 202, where an image is either retrieved from memory, captured in real-time via an image capture device, or in some other way provided to the process 200. In step 204, the image is displayed on the touch sensitive display or other display of the handheld device. In step 206, the user may click or tap (using a finger, a stylus, a mouse, or other device) at a LLH location where the crop is to begin. In step 208, the user may click or tap (using the finger, the stylus, the mouse, or other device) at a URH location where the crop is to end. In step 210, the image is cropped between the LLH location and the URH location. Finally, in step 212, the cropped image is displayed for the user's confirmation. At this point (not shown), the user may cancel, undo, or accept the crop. Unlike a conventional “sweeping motion” for cropping, the user releases the first input before activating the second input. The process ends in step 214.
  • The process described in FIG. 2 is more particularly illustrated in relation to FIGS. 3A-3C. FIG. 3A shows the first step in the process for cropping the image on the handheld device, in accordance with the embodiment described in relation to FIG. 2. Screen 302 shows a screen or touch-sensitive area of a handheld device. Window 2 (304) shows one of many windows showing the image the user desires to crop. The user uses his or her hand 308 (or stylus, mouse, or other device) to click or tap at point 306 (shown as solid cross hair 306). The position of the click or tap 306 represents a LLH corner of the crop boundary.
  • FIG. 3B shows the second step in the process for cropping the image on the handheld device. After clicking or tapping on point 306 (now shown as a dashed cross hair 306), the user may move his or her hand 308 (or stylus, mouse, or other device) shown as dashed motion line 310, to another location 312 (shown as solid cross hair 312) and click or tap a second time. The second tap at location 312 represents an URH corner of the crop boundary. Unlike a conventional “sweeping motion” for cropping, the user releases the first input before activating the second input.
  • FIG. 3C shows the third and final step in the process for cropping the image on the handheld device. After indicating a completion of a crop operation, such as by removing hand 308, the crop operation is performed in the background, and an updated or cropped image is displayed for the user's confirmation. Point 306 (shown as a dashed cross hair 306) and point 312 (now also shown as dashed cross hair 312) represent a LLH corner and an URH corner, respectively, of the crop boundary.
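The crop computation implied by FIGS. 3A-3C can be sketched as a slice of a row-major pixel grid. The embodiments do not fix a coordinate convention, so this sketch assumes the bottom-up y-axis implied by the "lower-left-hand"/"upper-right-hand" corner naming; the function name is hypothetical.

```python
def crop_pixels(pixels, llh, urh):
    """Crop a row-major pixel grid between a lower-left-hand (llh) and an
    upper-right-hand (urh) corner, each given as (x, y) with y measured
    upward from the bottom row. Illustrative sketch only."""
    height = len(pixels)
    x0, y0 = llh
    x1, y1 = urh
    # Convert the bottom-up y coordinates into top-down row indices.
    top = height - 1 - y1
    bottom = height - 1 - y0
    # Keep the rows and columns inside the bounding box, inclusive.
    return [row[x0:x1 + 1] for row in pixels[top:bottom + 1]]
```

A real device would perform the equivalent operation on the image buffer through its graphics or imaging API rather than on Python lists.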
  • Therefore, as shown in FIGS. 2 and 3A-3C, a user of the present invention may implement a crop operation with very little hand motion and very little input into the device, which is highly desirable, or even mandatory, when operating in the field, for example, during an inspection operation.
  • Now turning to annotation of images, FIG. 4 shows a prior art method of annotating an image in a desktop or other like environment using a mouse or other like peripheral device. Window 1 (401) shows an image of a plant 408, which could be any image or photograph, previously stored in memory or taken live right before the cropping/annotating operation. A mouse 403 is used to click at a point 402 on the screen (shown as dashed cross hair 402), then to perform a drag operation 410 (shown as dashed line 410) while holding down the mouse button to another point 406 on the screen (shown as solid cross hair 406), in what is also known as a “sweeping motion,” and finally to release the mouse button at point 406. Points 402 and 406 represent the lower-left-hand (LLH) corner and the upper-right-hand (URH) corner, respectively, of the rectangular bounding box representing the annotation area. In the prior art method, after the annotation area has been selected, a user hits, selects, or otherwise operates a menu bar, selecting the proper annotate operation (circle, oval, rectangle, arrow, line, etc.), which then annotates the image using the LLH 402 and URH 406 points to define the bounding box. This operation is cumbersome and requires multiple mouse operations, and furthermore is generally only useable in the desktop environment with the availability of a peripheral device such as a mouse. It is generally not suitable for use on handheld or other field devices, or in locations or situations where greater flexibility and fewer interactions with the GUI are highly desirable.
  • In order to solve the inherent limitations in the prior art method described in FIG. 4, the inventors have invented a novel method, system, and apparatus to facilitate on-site image annotation. FIG. 5 shows a flowchart for annotating an image using a simplified process on a handheld device, in accordance with another embodiment of the present invention. Process 500 begins at step 502, where an image is either retrieved from memory, captured in real-time via an image capture device, or in some other way provided to the process 500. In step 504, the image is displayed on the touch sensitive display or other display of the handheld device. In step 506, the user may click or tap (using a finger, a stylus, a mouse, or other device) at a LLH location where the annotation is to begin. In step 508, the user may click or tap (using the finger, the stylus, the mouse, or other device) at an URH location where the annotation is to end. In step 510, the image is annotated between the LLH location and the URH location. Finally, in step 512, the annotated image is displayed for the user's confirmation. At this point (not shown), the user may cancel, undo, or accept the annotation. Unlike a conventional “sweeping motion” for annotations, the user releases the first input before activating the second input. The process ends in step 514.
  • The process described in FIG. 5 is more particularly illustrated in relation to FIGS. 6A-6C. FIG. 6A shows the first step in the process for annotating the image on the handheld device, in accordance with the embodiment described in relation to FIG. 5. Screen 602 shows a screen or touch-sensitive area of a handheld device. Window 2 (604) shows one of many windows showing the image the user desires to annotate. The user uses his or her hand 608 (or stylus, mouse, or other device) to click or tap at point 606 (shown as solid cross hair 606). The position of the click or tap 606 represents the LLH corner of the annotation boundary.
  • FIG. 6B shows the second step in the process for annotating the image on the handheld device. After clicking or tapping on point 606 (now shown as a dashed cross hair 606), the user may move his or her hand 608 (or stylus, mouse, or other device), shown as dashed motion line 610, to another location 612 (shown as solid cross hair 612) and click or tap a second time. The second tap at location 612 represents the URH corner of the annotation boundary. Unlike a conventional “sweeping motion” for annotations, the user releases the first input before activating the second input.
  • FIG. 6C shows the third and final step in the process for annotating the image on the handheld device. After indicating a completion of an annotation operation, such as by removing hand 608, an updated or annotated image is displayed for the user's confirmation. Point 606 (shown as a dashed cross hair 606) and point 612 (now also shown as dashed cross hair 612) represent a LLH corner and an URH corner, respectively, of the annotation boundary.
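For the rectangle annotation type, the overlay between the two tapped corners can be sketched as below. This is an illustrative sketch only: the function name is hypothetical, the sketch assumes simple top-down (column, row) coordinates, and a real device would draw the shape (rectangle, circle, arrow, or line, per the toolbar of FIG. 8) through its platform graphics API rather than by writing pixels directly.

```python
def annotate_rectangle(pixels, p1, p2, color):
    """Overlay a rectangle outline between two tapped corners p1 and p2,
    each given as (col, row). Corners may arrive in any order, matching
    the two-tap input scheme. Illustrative sketch only."""
    (c1, r1), (c2, r2) = p1, p2
    left, right = min(c1, c2), max(c1, c2)
    top, bottom = min(r1, r2), max(r1, r2)
    for c in range(left, right + 1):        # top and bottom edges
        pixels[top][c] = color
        pixels[bottom][c] = color
    for r in range(top, bottom + 1):        # left and right edges
        pixels[r][left] = color
        pixels[r][right] = color
    return pixels
```

Note that only the outline is written; the interior of the annotated region is left untouched so the underlying image remains visible.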
  • Therefore, as shown in FIGS. 5 and 6A-6C, a user of the present invention may implement an annotation operation with very little hand motion and very little input into the device, which is highly desirable, or even mandatory, when operating in the field, for example, during an inspection operation.
  • FIG. 7 shows a flowchart of another process of another embodiment of the present invention, showing one aspect of one possible workflow using the principles of the present invention. Process 700 begins at step 702, where the user of a handheld device edits an image on the device. First, at step 704 the user opens the image for viewing, and then at step 706 the user makes an annotation on the image in the spirit of process 500 shown in FIG. 5. The user then proceeds to step 708, where he or she crops the image using the steps described in process 200 shown in FIG. 2. After cropping the image, the user sees only a sub-area of the original image on the screen, as a result of step 212 of FIG. 2, whereby the cropped area is displayed to take up the full screen area of the device. At this point, the user decides that the annotation made in step 706 was not correct, so he or she reverses that annotation at the click of an UNDO button in step 710. Then in step 712, the user reverses the crop made in step 708, by which point the image shown to the user looks exactly as it did in step 704. Finally, the user crops the image in step 714 in a different fashion from the crop made previously in step 708, once again using the steps described in process 200 shown in FIG. 2. Then in steps 716 and 718, the user makes two consecutive annotations in the spirit of process 500 shown in FIG. 5. The user is then satisfied with the edits he or she has made and ends the process at step 720.
  • The result of the series of actions illustrated in FIG. 7 may be stored by storing a reference to the original image, storing a final crop rectangle by reference to a LLH and an URH corner, and storing a list of annotations which are also stored by reference to a LLH and an URH corner along with type information, such as annotation type, annotation color, etc. to overlay on the cropped image.
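The storage-by-reference scheme described above (original image reference, final crop rectangle as LLH/URH corners, and a list of typed annotations to overlay) might be represented as follows. This is a hedged sketch; the field names and annotation kinds are illustrative assumptions, not a disclosed format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[int, int]  # (x, y) corner coordinate

@dataclass
class Annotation:
    llh: Point            # lower-left-hand corner of the annotation rectangle
    urh: Point            # upper-right-hand corner
    kind: str             # e.g. "rectangle", "circle", "arrow", "line"
    color: str = "red"    # example characteristic; line type/width also possible

@dataclass
class EditedImage:
    """Result of an editing session, stored by reference rather than by
    re-encoding pixels. Illustrative sketch only; names are hypothetical."""
    original_ref: str                 # path or identifier of the source image
    crop_llh: Point                   # final crop rectangle, LLH corner
    crop_urh: Point                   # final crop rectangle, URH corner
    annotations: List[Annotation] = field(default_factory=list)
```

Storing edits this way keeps the original image intact, which is one way the undo operations of FIG. 7 could be made cheap: reversing an edit only removes an entry from this record.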
  • FIG. 8 is an illustration of a multi-functional handheld device 800, in which some of the software and hardware components of the present invention could reside, in accordance with yet another embodiment of the present invention. The handheld device 800 contains a screen or display 802, which may be a touch-sensitive display, for displaying an image to be cropped and/or annotated with overlaid objects. The handheld device 800 also contains a toolbar 806 that contains iconographic buttons for each function that a user may execute during a process of taking and editing an image. Some possible selectable actions include, but are not limited to, from top to bottom and left to right, “take a picture” 804 (first row, far left), undo, redo, zoom-in, zoom-out (first row, far right), delete/cancel (second row, far left), annotate with an arrow, annotate with a circle, annotate with a rectangle, annotate with a line, and crop (second row, far right). For example, if button 804 is pressed, the software activates the handheld device's digital camera and places the captured image in display screen 802.
  • The illustrative user interface 800 is but one of many possible illustrative embodiments of the present invention. One of ordinary skill in the art would appreciate that any other configuration of objects in a user interface, as well as any possible extensions to the set of functions presented in the user interface 800, are all within the spirit and scope of the present invention.
  • FIG. 9 is an illustration of one of many possible use-cases of the present invention in relation to carrying out an industrial inspection operation on a wind-farm or other energy project or other like site. FIG. 9 shows an inspector carrying out an inspection of wind turbine 902 and wind turbine 904. The inspector 906 is standing next to the tower and foundation sections of wind turbine 904. The inspector 906 is using an industrial inspection handheld device 908. More specifically, inspector 906 is using the industrial inspection handheld device 908, which has an embedded RFID reader, to scan RFID tag 912 on the tower section of wind turbine 904 via radio frequency communication channel 910. Since inspector 906 is within proximity of the inspected component, he is able to successfully scan the RFID tag 912 because it is within the range of radio frequency communication channel 910. If the inspector recognizes a potential problem with the foundation section of the wind turbine 904, the inspector may take a picture of the potential problem area, and then proceed to crop and annotate the problem area using the methods described in the present application. Since the inspector is in the field, the present invention is particularly suitable for helping the inspector complete the inspection in a timely, accurate, and cost effective manner.
  • The illustration shown in FIG. 9 is but one of many possible illustrative embodiments of the usage of the present invention. One of ordinary skill in the art would appreciate that many possible uses of the present invention are all within the spirit and scope of the present invention, including, but not limited to, renewable energy systems and distributed energy systems, including wind turbines, solar photovoltaic, solar thermal plants, co-generation plants, biomass-fueled power plants, carbon sequestration projects, enhanced oil recovery systems, and the like.
  • FIG. 10 is an illustration of yet another possible use-case of the present invention in relation to a hand-held device with a camera and a touch-sensitive display, such as an Apple iPhone® 1000 or other like device. Users of an Apple iPhone® 1000 may wish to crop and/or annotate an image either taken by the iPhone® 1000 or received from another user, or in some other way obtained on the iPhone® 1000. The present invention is particularly suitable for use with an iPhone®, since an iPhone® as currently practiced does not contain a useful or easy mechanism for cropping or annotating images.
  • FIG. 11 is an illustration of yet another possible use-case of the present invention in relation to a hand-held camera with a touch-sensitive display 1100. Users of a digital camera 1100 may wish to crop and/or annotate an image taken by the digital camera 1100. The present invention is particularly suitable for use with a digital camera, especially a digital camera as shown in FIG. 11 with a touch-sensitive display device 1100, since digital cameras as currently practiced do not contain a useful or easy mechanism for cropping or annotating images on-site and instead require uploading the images to a computer for further desktop processing to crop and annotate the images.
  • FIG. 12 is a block diagram of an exemplary computer system 1200, in accordance with one embodiment of the present invention. The computer system 1200 may correspond to a personal computer system, such as a desktop, laptop, tablet, or handheld computer. The computer system may also correspond to a computing device, such as a cell phone, PDA, dedicated media player, consumer electronic device, and the like.
  • The exemplary computer system 1200 shown in FIG. 12 includes a processor 1208 configured to execute instructions and to carry out operations associated with the computer system 1200. For example, using instructions retrieved from memory 1214, the processor 1208 may control the reception and manipulation of input and output data between components of the computing system 1200. The processor 1208 can be implemented on a single chip, multiple chips, or multiple electrical components. For example, various architectures can be used for the processor 1208, including a dedicated or embedded processor, a single-purpose processor, a controller, an ASIC, and so forth.
  • In most cases, the processor 1208 together with an operating system operates to execute computer code and produce and use data. Operating systems are generally well known and will not be described in greater detail. By way of example, the operating system may correspond to OS/2, Apple OS/X, Apple iPhone® OS, Google Android® OS, DOS, UNIX, Linux, Palm® OS, Windows, Windows Mobile®, Windows CE®, and the like. The operating system can also be a special purpose operating system, such as may be used for limited purpose appliance-type computing devices. The operating system, other computer code and data may reside within a memory block 1214 that is operatively coupled to the processor 1208. Memory block 1214 generally provides a place to store computer code and data that are used by the computer system 1200. By way of example, the memory block 1214 may include Read-Only Memory (ROM), Random-Access Memory (RAM), a hard disk drive, and the like. The information could also reside on a removable storage medium and be loaded or installed onto the computer system 1200 when needed. Removable storage media include, for example, CD-ROM, PC-CARD, memory card, floppy disk, magnetic tape, and a network component.
  • The computer system 1200 also includes a display device 1210 that is operatively coupled to the processor 1208. The display device 1210 may be a liquid crystal display (LCD) (e.g., active matrix, passive matrix and the like). Alternatively, the display device 1210 may be a monitor such as a monochrome display, color graphics adapter (CGA) display, enhanced graphics adapter (EGA) display, variable-graphics-array (VGA) display, super VGA display, cathode ray tube (CRT), and the like. The display device may also correspond to a plasma display or a display implemented with electronic inks.
  • The display device 1210 is generally configured to display a graphical user interface (GUI) that provides an easy to use interface between a user of the computer system and the operating system or application running thereon. Generally speaking, the GUI represents programs, files and operational options with graphical images. The graphical images may include windows, fields, dialog boxes, menus, icons, buttons, cursors, scroll bars, etc. Such images may be arranged in predefined layouts, or may be created dynamically to serve the specific actions being taken by a user. During operation, the user can select and activate various graphical images in order to initiate functions and tasks associated therewith. By way of example, a user may select a button that opens, closes, minimizes, or maximizes a window, or an icon that launches a particular program. The GUI can additionally or alternatively display information, such as non-interactive text and graphics, for the user on the display device 1210.
  • The computer system 1200 also includes a tag scanning input device 1202 that is operatively coupled to the processor 1208. The tag scanning input device 1202 is configured to transfer data from the outside world into the computer system 1200. The input device 1202 is used to scan unique machine-readable tag 1204. The unique machine-readable tag 1204 may be a barcode sticker, a high-frequency (HF) radio-frequency identification (RFID) tag, an ultra-high-frequency (UHF) RFID tag, or any other tag or the like that serves as a unique identifier. The scanning of the tag may be done by a corresponding tag scanning input device 1202 either embedded in the inspector's handheld device, or embodied in a separate dedicated device, implemented in whichever way is necessary to read the corresponding tag, whether by way of visual identification, radio frequency identification, or the like, and store a record of the scanning operation. Various other techniques for choosing the type of unique machine-readable tag and for scanning it are within the ability of one of ordinary skill in the art.
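The "record of the scanning operation" mentioned above might, for example, capture the tag identifier, the tag type, and a timestamp. The following sketch is purely illustrative; the function and field names are hypothetical assumptions, not part of the disclosed system.

```python
import time

def record_tag_scan(tag_id, tag_type, scans):
    """Append a record of one tag-scan operation (as performed by a tag
    scanning input device such as 1202). Illustrative sketch only."""
    record = {
        "tag_id": tag_id,          # unique identifier read from the tag
        "tag_type": tag_type,      # e.g. "barcode", "HF RFID", "UHF RFID"
        "timestamp": time.time(),  # when the scan occurred
    }
    scans.append(record)
    return record
```

Such records could later be associated with the cropped and annotated media samples taken for the same inspected component.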
  • The computer system 1200 also includes a media sample input device 1206 that is operatively coupled to the processor 1208. The media sample input device 1206 is configured to transfer data from the outside world into the computer system 1200. The input device 1206 is used to capture a media sample and may include cameras of any sort, video camcorders with audio input, video camcorders without audio input, infrared imagers, ultrasonic imagers, or any other type of mechanical, chemical or electromagnetic imager that can obtain visual media. This visual media could be a view of an inspected component 1203. The taking of a media sample may be done by media sample input device 1206 either embedded in the handheld device, or embodied in a separate dedicated device, implemented in whichever way is necessary to take and store the media sample.
  • The computer system 1200 also includes capabilities for coupling to one or more I/O devices 1220. By way of example, the I/O devices 1220 may correspond to keyboards, printers, scanners, cameras, speakers, and/or the like. The I/O devices 1220 may be integrated with the computer system 1200 or they may be separate components (e.g. peripheral devices). In some cases, the I/O devices 1220 may be connected to the computer system 1200 through wired connections (e.g. cables/ports). In other cases, the I/O devices 1220 may be connected to the computer system 1200 through wireless connections. By way of example, the data link may correspond to PS/2, USB, IR, RF, Bluetooth or the like.
  • The memory block 1214 may include a tag scanning operational program 1216, which may be part of the operating system or a separate application. The tag scanning operational program 1216 generally includes a set of instructions that recognizes the occurrence of a tag scan operation on unique machine-readable tag 1204 and informs one or more software agents of the presence of unique machine-readable tag 1204 and/or what action(s) to take in response to the unique machine-readable tag 1204.
  • The memory block 1214 may also include a media sample capturing program 1218, which may be part of the operating system or a separate application. The media sample capturing program 1218 generally includes a set of instructions that recognizes the occurrence of a media sample capture operation on the view of inspected component 1203 and informs one or more software agents of media obtained and/or what action(s) to take in response to the media obtained.
  • In one embodiment, the system 1200 may also allow an inspector to annotate the media samples in such ways that substantiate inspector statements of problems with inspected components found during inspection, or to substantiate inspector statements of any inspected components passing inspection, using the principles taught in the present invention. In general, the memory block 1214 may be used to store program code to execute any of the processes of the present invention, including the processes shown in FIGS. 2 and 5.
  • While the methods disclosed herein have been described and shown with reference to particular operations performed in a particular order, it will be understood that these operations may be combined, sub-divided, or re-ordered to form equivalent methods without departing from the teachings of the present invention. Accordingly, unless specifically indicated herein, the order and grouping of the operations is not a limitation of the present invention.
  • While the invention has been particularly shown and described with reference to embodiments thereof, it will be understood by those skilled in the art that various other changes in the form and details may be made without departing from the spirit and scope of the invention.

Claims (20)

1. A method for cropping an image file, comprising the steps of:
displaying an image of the image file to be cropped in a display area;
receiving a first input from a user designating a first point in the image defining a corner of a crop rectangle;
receiving a second input from the user designating a second point in the image defining an opposite corner of the crop rectangle, wherein the first input is released before the second input is initiated; and
cropping the image to the crop rectangle defined by the first point and the second point to create a cropped image when the second input is released.
2. The method as recited in claim 1, further comprising:
displaying on the image a location of the first point.
3. The method as recited in claim 1, further comprising:
displaying a rectangle overlaid over the image corresponding to the crop rectangle before cropping the image.
4. The method as recited in claim 1, wherein if the user does not immediately release the second input, allowing the user to drag the second point to visually edit a shape and a size of the crop rectangle.
5. The method as recited in claim 1, wherein if the user drags the second point near an edge of a displayed portion and the image is larger than the displayed portion, then scrolling the displayed portion to show a portion of the image in a direction of the dragged point.
6. The method as recited in claim 1, further comprising:
displaying the cropped image in the display area in place of the original image.
7. The method as recited in claim 6, further comprising:
scaling the cropped image to fill the entire display area.
8. A method of annotating an image file, comprising the steps of:
displaying an image of the image file to be annotated in a display area;
receiving a first input from a user designating a first point in the image defining a corner of an annotation rectangle;
receiving a second input from the user designating a second point in the image defining an opposite corner of the annotation rectangle, wherein the first input is released before the second input is initiated; and
annotating the image from the first point to the second point of the annotation rectangle to create an annotated image when the second input is released.
9. The method as recited in claim 8, further comprising:
displaying on the image a location of the first point.
10. The method as recited in claim 8, further comprising:
displaying a shape corresponding to the annotation rectangle of the image before annotating the image.
11. The method as recited in claim 8, wherein if the user does not immediately release the second input, allowing the user to drag the second point to visually show a shape and a size of the annotation rectangle.
12. The method as recited in claim 8, wherein if the user drags the second point near an edge of a displayed portion and the image is larger than the displayed portion, then scrolling the displayed portion to show a portion of the image in a direction of the dragged point.
13. The method as recited in claim 8, further comprising:
displaying the annotated image in the display area in place of the original image.
14. The method as recited in claim 8, further comprising:
receiving a third input representing a type of shape and a characteristic of the shape corresponding to the annotation rectangle.
15. The method as recited in claim 14, wherein the shape is selected from the group consisting of a line, a rectangle, an ellipse, and a circle.
16. The method as recited in claim 14, wherein the characteristic of the shape is selected from the group consisting of a line type, a line width, and a line color.
17. A touch-sensitive hand-held system having a capability of cropping and annotating an image file, comprising:
at least one processor; and
at least one or more memories, operatively coupled to the processor, and containing program code, which when executed causes the processor to execute a process comprising the steps of:
displaying an image of the image file to be cropped or annotated in a display area;
receiving a first input from a user designating a first point in the image defining a corner of a crop rectangle;
receiving a second input from the user designating a second point in the image defining an opposite corner of the crop rectangle, wherein the first input is released before the second input is initiated;
cropping the image from the first point to the second point of the crop rectangle when the second input is released to form a cropped image;
displaying the cropped image in the display area in place of the original image;
receiving a third input from the user designating a third point in the cropped image defining a corner of an annotation rectangle;
receiving a fourth input from the user designating a fourth point in the cropped image defining an opposite corner of the annotation rectangle, wherein the third input is released before the fourth input is initiated; and
annotating the cropped image from the third point to the fourth point of the annotation rectangle when the fourth input is released to form an annotated cropped image.
18. The system as recited in claim 17, further containing program code, which when executed causes the processor to execute a process further comprising the step of:
displaying on the image a location of the first point and a location of the third point.
19. The system as recited in claim 17, further containing program code, which when executed causes the processor to execute a process further comprising the steps of:
displaying a rectangle overlaid over the image corresponding to the crop rectangle before cropping the image; and
displaying a shape overlaid over the image corresponding to the annotation rectangle before annotating the image.
20. The system as recited in claim 17, wherein if the user drags the second point near an edge of a displayed portion and the image is larger than the displayed portion, then scrolling the displayed portion to show a portion of the image in a direction of the dragged point.
US12/767,077 2008-12-15 2010-04-26 System and method for cropping and annotating images on a touch sensitive display device Abandoned US20100194781A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/767,077 US20100194781A1 (en) 2008-12-15 2010-04-26 System and method for cropping and annotating images on a touch sensitive display device

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US12263208P 2008-12-15 2008-12-15
US12/507,039 US20100149211A1 (en) 2008-12-15 2009-07-21 System and method for cropping and annotating images on a touch sensitive display device
US12/767,077 US20100194781A1 (en) 2008-12-15 2010-04-26 System and method for cropping and annotating images on a touch sensitive display device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US12/507,039 Continuation US20100149211A1 (en) 2008-12-15 2009-07-21 System and method for cropping and annotating images on a touch sensitive display device

Publications (1)

Publication Number Publication Date
US20100194781A1 true US20100194781A1 (en) 2010-08-05

Family

ID=42239966

Family Applications (6)

Application Number Title Priority Date Filing Date
US12/489,313 Abandoned US20100153168A1 (en) 2008-12-15 2009-06-22 System and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device
US12/507,039 Abandoned US20100149211A1 (en) 2008-12-15 2009-07-21 System and method for cropping and annotating images on a touch sensitive display device
US12/507,071 Expired - Fee Related US8032830B2 (en) 2008-12-15 2009-07-22 System and method for generating quotations from a reference document on a touch sensitive display device
US12/758,134 Abandoned US20100185549A1 (en) 2008-12-15 2010-04-12 System and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device
US12/767,077 Abandoned US20100194781A1 (en) 2008-12-15 2010-04-26 System and method for cropping and annotating images on a touch sensitive display device
US12/832,903 Expired - Fee Related US7971140B2 (en) 2008-12-15 2010-07-08 System and method for generating quotations from a reference document on a touch sensitive display device

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US12/489,313 Abandoned US20100153168A1 (en) 2008-12-15 2009-06-22 System and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device
US12/507,039 Abandoned US20100149211A1 (en) 2008-12-15 2009-07-21 System and method for cropping and annotating images on a touch sensitive display device
US12/507,071 Expired - Fee Related US8032830B2 (en) 2008-12-15 2009-07-22 System and method for generating quotations from a reference document on a touch sensitive display device
US12/758,134 Abandoned US20100185549A1 (en) 2008-12-15 2010-04-12 System and method for carrying out an inspection or maintenance operation with compliance tracking using a handheld device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/832,903 Expired - Fee Related US7971140B2 (en) 2008-12-15 2010-07-08 System and method for generating quotations from a reference document on a touch sensitive display device

Country Status (1)

Country Link
US (6) US20100153168A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120210200A1 (en) * 2011-02-10 2012-08-16 Kelly Berger System, method, and touch screen graphical user interface for managing photos and creating photo books
US20130218464A1 (en) * 2012-02-17 2013-08-22 Chun-Ming Chen Method for generating split screen according to a touch gesture
US10417763B2 (en) 2014-07-25 2019-09-17 Samsung Electronics Co., Ltd. Image processing apparatus, image processing method, x-ray imaging apparatus and control method thereof
WO2020027813A1 (en) * 2018-07-31 2020-02-06 Hewlett-Packard Development Company, L.P. Cropping portions of images
US11294556B1 (en) * 2021-01-05 2022-04-05 Adobe Inc. Editing digital images using multi-panel graphical user interfaces

Families Citing this family (65)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8106856B2 (en) 2006-09-06 2012-01-31 Apple Inc. Portable electronic device for photo management
US20090048949A1 (en) * 2007-08-16 2009-02-19 Facility Audit Solutions, Llc System and method for managing photographs from site audits of facilities
WO2009045218A1 (en) 2007-10-04 2009-04-09 Donovan John J A video surveillance, storage, and alerting system having network management, hierarchical data storage, video tip processing, and vehicle plate analysis
US8013738B2 (en) 2007-10-04 2011-09-06 Kd Secure, Llc Hierarchical storage manager (HSM) for intelligent storage of large volumes of data
US8957865B2 (en) * 2009-01-05 2015-02-17 Apple Inc. Device, method, and graphical user interface for manipulating a user interface object
TWI497357B (en) * 2009-04-23 2015-08-21 Waltop Int Corp Multi-touch pad control method
JP5607726B2 (en) * 2009-08-21 2014-10-15 トムソン ライセンシング Method, apparatus, and program for adjusting parameters on user interface screen
EP3855297A3 (en) 2009-09-22 2021-10-27 Apple Inc. Device, method and graphical user interface for manipulating user interface objects
US8780069B2 (en) 2009-09-25 2014-07-15 Apple Inc. Device, method, and graphical user interface for manipulating user interface objects
US8799775B2 (en) * 2009-09-25 2014-08-05 Apple Inc. Device, method, and graphical user interface for displaying emphasis animations for an electronic document in a presentation mode
JP5656585B2 (en) * 2010-02-17 2015-01-21 キヤノン株式会社 Document creation support apparatus, document creation support method, and program
US7793850B1 (en) * 2010-03-14 2010-09-14 Kd Secure Llc System and method used for configuration of an inspection compliance tool with machine readable tags and their associations to inspected components
JP5459031B2 (en) * 2010-04-13 2014-04-02 ソニー株式会社 Information processing apparatus, information processing method, and program
CN102918828B (en) * 2010-05-31 2015-11-25 株式会社Pfu Overhead scanner device and image processing method
US9064290B2 (en) 2010-07-23 2015-06-23 Jkads Llc Method for inspecting a physical asset
US9139240B1 (en) * 2010-08-13 2015-09-22 Kodiak Innovations, LLC Apparatus for decreasing aerodynamic drag, improving stability, and reducing road spray of a transport vehicle
JP5685928B2 (en) * 2010-12-24 2015-03-18 ソニー株式会社 Information processing apparatus, image data optimization method, and program
US20120185787A1 (en) * 2011-01-13 2012-07-19 Microsoft Corporation User interface interaction behavior based on insertion point
US8902222B2 (en) 2012-01-16 2014-12-02 Autodesk, Inc. Three dimensional contriver tool for modeling with multi-touch devices
US8947429B2 (en) 2011-04-12 2015-02-03 Autodesk, Inc. Gestures and tools for creating and editing solid models
US9182882B2 (en) 2011-04-12 2015-11-10 Autodesk, Inc. Dynamic creation and modeling of solid models
US8860726B2 (en) 2011-04-12 2014-10-14 Autodesk, Inc. Transform manipulator control
US20120308969A1 (en) * 2011-06-06 2012-12-06 Paramit Corporation Training ensurance method and system for computer directed assembly and manufacturing
US20130009785A1 (en) * 2011-07-07 2013-01-10 Finn Clayton L Visual and Audio Warning System Including Test Ledger for Automated Door
US8860675B2 (en) 2011-07-12 2014-10-14 Autodesk, Inc. Drawing aid system for multi-touch devices
KR101859100B1 (en) * 2011-07-19 2018-05-17 엘지전자 주식회사 Mobile device and control method for the same
US20130021138A1 (en) * 2011-07-20 2013-01-24 GM Global Technology Operations LLC Method of evaluating structural integrity of a vehicle component with radio frequency identification tags and system for same
US20130063369A1 (en) * 2011-09-14 2013-03-14 Verizon Patent And Licensing Inc. Method and apparatus for media rendering services using gesture and/or voice control
US20130086933A1 (en) * 2011-10-07 2013-04-11 Colleen M. Holtkamp Controller for a medical products storage system
WO2013109246A1 (en) * 2012-01-16 2013-07-25 Autodesk, Inc. Gestures and tools for creating and editing solid models
WO2013109244A1 (en) * 2012-01-16 2013-07-25 Autodesk, Inc. Three dimensional contriver tool for modeling with multi-touch devices
WO2013109245A1 (en) * 2012-01-16 2013-07-25 Autodesk, Inc. Dynamic creation and modeling of solid models
US20130227457A1 (en) * 2012-02-24 2013-08-29 Samsung Electronics Co. Ltd. Method and device for generating captured image for display windows
KR102304700B1 (en) * 2012-02-24 2021-09-28 삼성전자주식회사 Method and device for generating capture image for display windows
US9299019B2 (en) * 2012-03-14 2016-03-29 Trimble Navigation Limited Systems for data collection
CN102662525A (en) * 2012-04-27 2012-09-12 上海量明科技发展有限公司 Method and terminal for carrying out screenshot operation through touch screen
US8671361B2 (en) * 2012-05-24 2014-03-11 Blackberry Limited Presentation of image on display screen with combination crop and rotation and with auto-resizing of crop field
US20220335385A1 (en) * 2012-06-07 2022-10-20 Procore Technologies, Inc. System and method for systematic presentation and ordering of documents based on triggers
US20170024695A1 (en) * 2013-12-24 2017-01-26 Scott Gerard Wolfe, Jr. System and method for systematic presentation and ordering of documents based on triggers
US20140172684A1 (en) * 2012-12-14 2014-06-19 Scott Gerard Wolfe, Jr. System and Method to Utilize Presumptions, Database Information, and/or User Defaults to Calculate Construction Lien, Notice, Bond Claim, and Other Construction Document Deadlines and Requirements
US20170186105A1 (en) * 2012-08-30 2017-06-29 Smith and Turner Development LLC Method to inspect equipment
US9705835B2 (en) * 2012-11-02 2017-07-11 Pandexio, Inc. Collaboration management systems
US10325298B2 (en) * 2013-01-22 2019-06-18 General Electric Company Systems and methods for a non-destructive testing ecosystem
US20140281895A1 (en) * 2013-03-15 2014-09-18 Kah Seng Tay Techniques for embedding quotes of content
US20140344077A1 (en) * 2013-03-15 2014-11-20 Contact Marketing Services, Inc. Used industrial equipment sales application suites, systems, and related apparatus and methods
CN105432075B (en) 2013-03-20 2019-09-06 生活时间品牌公司 System for moving mass control inspection
US20140330731A1 (en) * 2013-05-06 2014-11-06 RDH Environmental Services, LLC System and method for managing backflow prevention assembly test data
US9652460B1 (en) 2013-05-10 2017-05-16 FotoIN Mobile Corporation Mobile media information capture and management methods and systems
CN104729714A (en) * 2013-12-18 2015-06-24 上海宝钢工业技术服务有限公司 Spot inspection instrument device based on Android phone platform
US10672089B2 (en) * 2014-08-19 2020-06-02 Bert L. Howe & Associates, Inc. Inspection system and related methods
US10671275B2 (en) 2014-09-04 2020-06-02 Apple Inc. User interfaces for improving single-handed operation of devices
WO2016176188A1 (en) * 2015-04-27 2016-11-03 First Advantage Corporation Device and method for performing validation and authentication of a physical structure or physical object
US9329762B1 (en) * 2015-06-02 2016-05-03 Interactive Memories, Inc. Methods and systems for reversing editing operations in media-rich projects
CN106325663B (en) * 2015-06-27 2019-09-17 南昌欧菲光科技有限公司 Mobile terminal and its screenshot method
CN105151436B (en) * 2015-09-03 2017-03-22 温州智信机电科技有限公司 Spark plug sheath sleeving machine with function of material detection, and reliable working
US11805170B2 (en) * 2015-10-10 2023-10-31 David Sean Capps Fire service and equipment inspection test and maintenance system
US20220188955A1 (en) * 2015-10-10 2022-06-16 David Sean Capps Fire Service and Equipment Inspection Test and Maintenance System and Method
US11287407B2 (en) * 2015-10-19 2022-03-29 University Of North Texas Dynamic reverse gas stack model for portable chemical detection devices to locate threat and point-of-source from effluent streams
DE102015221313A1 (en) * 2015-10-30 2017-05-04 Siemens Aktiengesellschaft System and procedure for the maintenance of a plant
GB201611152D0 (en) * 2016-06-27 2016-08-10 Moletest Ltd Image processing
US11095502B2 (en) * 2017-11-03 2021-08-17 Otis Elevator Company Adhoc protocol for commissioning connected devices in the field
CN108205412B (en) * 2017-11-09 2019-10-11 中兴通讯股份有限公司 A kind of method and apparatus for realizing screenshots
US10977874B2 (en) * 2018-06-11 2021-04-13 International Business Machines Corporation Cognitive learning for vehicle sensor monitoring and problem detection
DK201970535A1 (en) 2019-05-06 2020-12-21 Apple Inc Media browsing user interface with intelligently selected representative media items
DK202070616A1 (en) 2020-02-14 2022-01-14 Apple Inc User interfaces for workout content

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6304271B1 (en) * 1999-02-05 2001-10-16 Sony Corporation Apparatus and method for cropping an image in a zooming graphical user interface
US20020172498A1 (en) * 2001-05-18 2002-11-21 Pentax Precision Instrument Corp. Computer-based video recording and management system for medical diagnostic equipment
US20040056869A1 (en) * 2002-07-16 2004-03-25 Zeenat Jetha Using detail-in-context lenses for accurate digital image cropping and measurement
US20040095375A1 (en) * 2002-05-10 2004-05-20 Burmester Christopher Paul Method of and apparatus for interactive specification of manufactured products customized with digital media
US20040128613A1 (en) * 2002-10-21 2004-07-01 Sinisi John P. System and method for mobile data collection
US20060107302A1 (en) * 2004-11-12 2006-05-18 Opentv, Inc. Communicating primary content streams and secondary content streams including targeted advertising to a remote unit
US20070097089A1 (en) * 2005-10-31 2007-05-03 Battles Amy E Imaging device control using touch pad
US20090051946A1 (en) * 2007-08-23 2009-02-26 Canon Kabushiki Kaisha Image area selecting method
US7743348B2 (en) * 2004-06-30 2010-06-22 Microsoft Corporation Using physical objects to adjust attributes of an interactive display application
US20100171826A1 (en) * 2006-04-12 2010-07-08 Store Eyes, Inc. Method for measuring retail display and compliance
US20100211902A1 (en) * 2006-10-10 2010-08-19 Promethean Limited Interactive display system

Family Cites Families (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0457924B1 (en) * 1990-01-11 1994-10-26 Kabushiki Kaisha Toshiba Apparatus for supporting inspection of plant
US6262732B1 (en) * 1993-10-25 2001-07-17 Scansoft, Inc. Method and apparatus for managing and navigating within stacks of document pages
US7028044B2 (en) 1994-12-22 2006-04-11 University Of Utah Research Foundation Highlighting quoted passages in a hypertext system
US5732230A (en) * 1995-05-19 1998-03-24 Ricoh Company Ltd. Computer user interface for manipulating image fragments using drag, drop and merge operations
US6665490B2 (en) * 1998-04-01 2003-12-16 Xerox Corporation Obtaining and using data associating annotating activities with portions of recordings
US6283759B1 (en) * 1998-11-13 2001-09-04 R. J. Price System for demonstrating compliance with standards
US20020023109A1 (en) * 1999-12-30 2002-02-21 Lederer Donald A. System and method for ensuring compliance with regulations
US20010047283A1 (en) * 2000-02-01 2001-11-29 Melick Bruce D. Electronic system for identification, recording, storing, and retrieving material handling equipment records and certifications
US20020025085A1 (en) * 2000-04-19 2002-02-28 Ipads.Com, Inc. Computer-controlled system and method for generating a customized imprinted item
US7038714B1 (en) * 2000-05-16 2006-05-02 Eastman Kodak Company Printing system and method having a digital printer that uses a digital camera image display
WO2002021340A1 (en) * 2000-09-07 2002-03-14 Praeses Corporation System and method for an online jurisdiction manager
US8198986B2 (en) * 2001-11-13 2012-06-12 Ron Craik System and method for storing and retrieving equipment inspection and maintenance data
US9760235B2 (en) * 2001-06-12 2017-09-12 Callahan Cellular L.L.C. Lens-defined adjustment of displays
US6772098B1 (en) * 2001-07-11 2004-08-03 General Electric Company Systems and methods for managing inspections
US6587768B2 (en) * 2001-08-08 2003-07-01 Meritor Heavy Vehicle Technology, Llc Vehicle inspection and maintenance system
US6671646B2 (en) * 2001-09-11 2003-12-30 Zonar Compliance Systems, Llc System and process to ensure performance of mandated safety and maintenance inspections
US8400296B2 (en) * 2001-09-11 2013-03-19 Zonar Systems, Inc. Method and apparatus to automate data collection during a mandatory inspection
US7557696B2 (en) * 2001-09-11 2009-07-07 Zonar Systems, Inc. System and process to record inspection compliance data
US20030069716A1 (en) * 2001-10-09 2003-04-10 Martinez David Frederick System & method for performing field inspection
US20030131011A1 (en) * 2002-01-04 2003-07-10 Argent Regulatory Services, L.L.C. Online regulatory compliance system and method for facilitating compliance
CA2394268A1 (en) * 2002-02-14 2003-08-14 Beyond Compliance Inc. A compliance management system
US20050228688A1 (en) * 2002-02-14 2005-10-13 Beyond Compliance Inc. A compliance management system
US20030229858A1 (en) * 2002-06-06 2003-12-11 International Business Machines Corporation Method and apparatus for providing source information from an object originating from a first document and inserted into a second document
US8120624B2 (en) * 2002-07-16 2012-02-21 Noregin Assets N.V. L.L.C. Detail-in-context lenses for digital image cropping, measurement and online maps
US6859757B2 (en) * 2002-07-31 2005-02-22 Sap Aktiengesellschaft Complex article tagging with maintenance related information
US20040177326A1 (en) * 2002-10-21 2004-09-09 Bibko Peter N. Internet/intranet software system to audit and manage compliance
US7356393B1 (en) * 2002-11-18 2008-04-08 Turfcentric, Inc. Integrated system for routine maintenance of mechanized equipment
US20050065842A1 (en) * 2003-07-28 2005-03-24 Richard Summers System and method for coordinating product inspection, repair and product maintenance
US20050111699A1 (en) * 2003-11-24 2005-05-26 Emil Gran Suite of parking regulation control systems
US7536278B2 (en) * 2004-05-27 2009-05-19 International Electronic Machines Corporation Inspection method, system, and program product
US7454050B2 (en) * 2004-06-18 2008-11-18 Csi Technology, Inc. Method of automating a thermographic inspection process
US20060132291A1 (en) * 2004-11-17 2006-06-22 Dourney Charles Jr Automated vehicle check-in inspection method and system with digital image archiving
US20060132836A1 (en) * 2004-12-21 2006-06-22 Coyne Christopher R Method and apparatus for re-sizing image data
US7834876B2 (en) * 2004-12-28 2010-11-16 The Mathworks, Inc. Providing graphic generating capabilities for a model based development process
US20060218492A1 (en) * 2005-03-22 2006-09-28 Andrade Jose O Copy and paste with citation attributes
US7869944B2 (en) * 2005-04-18 2011-01-11 Roof Express, Llc Systems and methods for recording and reporting data collected from a remote location
US20060235741A1 (en) * 2005-04-18 2006-10-19 Dataforensics, Llc Systems and methods for monitoring and reporting
US7324905B2 (en) * 2005-05-11 2008-01-29 Robert James Droubie Apparatus, system and method for automating an interactive inspection process
US7357571B2 (en) * 2005-07-01 2008-04-15 Predictive Service, Llc Infrared inspection and reporting process
US20070027704A1 (en) * 2005-07-28 2007-02-01 Simplikate Systems, L.L.C. System and method for community association violation tracking and processing
US20070050177A1 (en) * 2005-08-25 2007-03-01 Kirkland James R Jr Methods and Systems for Accessing Code Information
US7851758B1 (en) * 2005-09-29 2010-12-14 Flir Systems, Inc. Portable multi-function inspection systems and methods
US8310533B2 (en) * 2006-03-27 2012-11-13 GE Sensing & Inspection Technologies, LP Inspection apparatus for inspecting articles
WO2008054847A2 (en) * 2006-04-03 2008-05-08 3M Innovative Properties Company Vehicle inspection using radio frequency identification (rfid)
JP4781883B2 (en) * 2006-04-04 2011-09-28 株式会社日立製作所 Information management method and information management system
US8230362B2 (en) * 2006-05-31 2012-07-24 Manheim Investments, Inc. Computer-assisted and/or enabled systems, methods, techniques, services and user interfaces for conducting motor vehicle and other inspections
US20070288859A1 (en) * 2006-06-07 2007-12-13 Siemens Communications, Inc. Method and apparatus for selective forwarding of e-mail and document content
US20080021717A1 (en) * 2006-06-08 2008-01-24 Db Industries, Inc. Method of Facilitating Controlled Flow of Information for Safety Equipment Items and Database Related Thereto
US20080021905A1 (en) * 2006-06-08 2008-01-24 Db Industries, Inc. Direct Data Input for Database for Safety Equipment Items and Method
US20080021718A1 (en) * 2006-06-08 2008-01-24 Db Industries, Inc. Centralized Database of Information Related to Inspection of Safety Equipment Items Inspection and Method
US20080052377A1 (en) * 2006-07-11 2008-02-28 Robert Light Web-Based User-Dependent Customer Service Interaction with Co-Browsing
US8106856B2 (en) * 2006-09-06 2012-01-31 Apple Inc. Portable electronic device for photo management
US8564544B2 (en) * 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US8396280B2 (en) * 2006-11-29 2013-03-12 Honeywell International Inc. Apparatus and method for inspecting assets in a processing or other environment
US7865872B2 (en) * 2006-12-01 2011-01-04 Murex S.A.S. Producer graph oriented programming framework with undo, redo, and abort execution support
US7860268B2 (en) * 2006-12-13 2010-12-28 Graphic Security Systems Corporation Object authentication using encoded images digitally stored on the object
US11126966B2 (en) * 2007-01-23 2021-09-21 Tegris, Inc. Systems and methods for a web based inspection compliance registry and communication tool
KR101420419B1 (en) * 2007-04-20 2014-07-30 엘지전자 주식회사 Electronic Device And Method Of Editing Data Using the Same And Mobile Communication Terminal
US20080275714A1 (en) * 2007-05-01 2008-11-06 David Frederick Martinez Computerized requirement management system
JP2009104268A (en) * 2007-10-22 2009-05-14 Hitachi Displays Ltd Coordinate detection device and operation method using touch panel
US8224020B2 (en) * 2007-11-29 2012-07-17 Kabushiki Kaisha Toshiba Appearance inspection apparatus, appearance inspection system, and appearance inspection method
US8571747B2 (en) * 2007-12-06 2013-10-29 The Boeing Company System and method for managing aircraft maintenance
TW200935278A (en) * 2008-02-04 2009-08-16 E Lead Electronic Co Ltd A cursor control system and method thereof
WO2009117419A2 (en) * 2008-03-17 2009-09-24 Worcester Polytechnic Institute Virtual interactive system for ultrasound training
CA2722417A1 (en) * 2008-04-25 2009-10-29 Btsafety Llc System and method of providing product quality and safety
US8526767B2 (en) * 2008-05-01 2013-09-03 Atmel Corporation Gesture recognition
US8866698B2 (en) * 2008-10-01 2014-10-21 Pleiades Publishing Ltd. Multi-display handheld device and supporting system


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120210200A1 (en) * 2011-02-10 2012-08-16 Kelly Berger System, method, and touch screen graphical user interface for managing photos and creating photo books
US20130218464A1 (en) * 2012-02-17 2013-08-22 Chun-Ming Chen Method for generating split screen according to a touch gesture
US9228839B2 (en) * 2012-02-17 2016-01-05 Mitac International Corp. Method for generating split screen according to a touch gesture
US10417763B2 (en) 2014-07-25 2019-09-17 Samsung Electronics Co., Ltd. Image processing apparatus, image processing method, x-ray imaging apparatus and control method thereof
WO2020027813A1 (en) * 2018-07-31 2020-02-06 Hewlett-Packard Development Company, L.P. Cropping portions of images
US11294556B1 (en) * 2021-01-05 2022-04-05 Adobe Inc. Editing digital images using multi-panel graphical user interfaces

Also Published As

Publication number Publication date
US20100153168A1 (en) 2010-06-17
US20100185549A1 (en) 2010-07-22
US8032830B2 (en) 2011-10-04
US20100149211A1 (en) 2010-06-17
US20100269029A1 (en) 2010-10-21
US7971140B2 (en) 2011-06-28
US20100153833A1 (en) 2010-06-17

Similar Documents

Publication Publication Date Title
US20100194781A1 (en) System and method for cropping and annotating images on a touch sensitive display device
US11550993B2 (en) Ink experience for images
US9805486B2 (en) Image-drawing processing system, server, user terminal, image-drawing processing method, program, and storage medium
JP2014052873A (en) Electronic apparatus and handwritten document processing method
JP2007304669A (en) Method and program for controlling electronic equipment
CN102034081B (en) Use image as the calculator device of Data Source
US10108312B2 (en) Apparatus and method for processing information list in terminal device
US9117125B2 (en) Electronic device and handwritten document processing method
CN102799384A (en) Method, client and system for outdoor scene screenshot
KR20110074166A (en) Method for generating digital contents
CN113079316A (en) Image processing method, image processing device and electronic equipment
CN112882643A (en) Control method of touch pen, control method of electronic equipment and touch pen
US20160117548A1 (en) Electronic apparatus, method and storage medium
WO2014122794A1 (en) Electronic apparatus and handwritten-document processing method
US20150098653A1 (en) Method, electronic device and storage medium
CN105027053A (en) Electronic device, display method, and program
WO2017143575A1 (en) Method for retrieving content of image, portable electronic device, and graphical user interface
WO2023284640A1 (en) Picture processing method and electronic device
US20150149894A1 (en) Electronic device, method and storage medium
CN113783770B (en) Image sharing method, image sharing device and electronic equipment
CN113163256B (en) Method and device for generating operation flow file based on video
CN112765500A (en) Information searching method and device
CN113407144A (en) Display control method and device
CN113342222A (en) Application classification method and device and electronic equipment
CN112162681A (en) Text operation execution method and device and electronic equipment

Legal Events

Date Code Title Description
AS Assignment

Owner name: KD SECURE LLC, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HO, ALBERT, MR;SIEGEL, MARC, MR;TOSSING, CHRISTOPHER, MR;SIGNING DATES FROM 20110111 TO 20110119;REEL/FRAME:025875/0588

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: SECURENET SOLUTIONS GROUP, LLC, FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KD SECURE, LLC;REEL/FRAME:043815/0074

Effective date: 20171009