US20040064266A1 - Computerized system and method for simultaneously representing and recording dynamic judgments - Google Patents

Computerized system and method for simultaneously representing and recording dynamic judgments

Info

Publication number
US20040064266A1
Authority
US
United States
Prior art keywords
user
judgment
representation
concept
judgments
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/662,568
Inventor
John Baird
Marek Chawarski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US10/662,568 priority Critical patent/US20040064266A1/en
Publication of US20040064266A1 publication Critical patent/US20040064266A1/en
Priority to US11/553,051 priority patent/US20070055481A1/en
Abandoned legal-status Critical Current

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes, for medicine
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/4824 Touch or pain perception evaluation


Abstract

The computerized system and method represents judgments of a user and records the judgments and the judgment making process. In one embodiment, the computerized method displays multiple concept representations simultaneously, receives a user-manipulated adjustment to one or more of the concept representations to create a judgment representation, and records the judgment representation(s) and user-manipulated adjustment(s). The judgment representations and adjustments are preferably recorded continuously so that the judgment process can be reviewed and evaluated. The concept representations can be displayed relative to other concept representations and/or relative to a rating scale. In another embodiment, the computerized method displays a physical context representation and the judgments are represented and recorded as user designated locations in the physical context. One example of an application for the system and method is to record and evaluate a patient's judgments with respect to pain.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application claims the benefit of U.S. Provisional Application Serial No. 60/270,854, filed Feb. 23, 2001, and U.S. Provisional Application Serial No. 60/292,115, filed May 18, 2001, both of which are fully incorporated herein by reference.[0001]
    STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • [0002] This invention was made with Government support under SBIR grant Nos. 1 R43 MH62833-01, 1 R43 NS42387-01, 1 R43 HL/MH68493-01 awarded by the National Institutes of Health. The Government has certain rights in the invention.
    TECHNICAL FIELD
  • The present invention relates to methods for representing and recording personal judgments and more particularly relates to a computerized system and method for representing and recording dynamic, relative judgments of physical or non-physical concepts in one or two dimensions. [0003]
    BACKGROUND INFORMATION
  • Studies have been performed using cognitive mapping methods to assess a person's conception of the perceived or ideal distances between actual or hypothetical physical objects, such as buildings on a campus or in a town, or the perceived glossiness of images in a photograph. These studies have been done both by physical manipulation of objects (photographic prints) and by using a computer system to record the location of objects placed by an individual in a grid appearing on a computer monitor. These studies and methods are described in various publications, all of which are incorporated herein by reference. [0004]
  • One problem with the methods described in these publications is that they have only been used to scale judgments of objects that are naturally situated in a metric space (buildings) or of physical stimuli that are directly perceived by an observer (photographic prints). These methods are also limited in that they do not provide a precise measure of the rating assigned to each item, because the location of the item along the scale has an error bar equal to the width of the pictorial word or icon. These methods are further limited in that they either do not allow for dynamic changes of judgments over time (e.g., in the case of prints) or have not recorded such changes (e.g., in the case of computer methods). [0005]
  • Accordingly, a computerized system and method is needed that represents and records the scale values resulting from the dynamic adjustment of the location of multiple concepts in one or two dimensions. A computerized system and method is also needed that allows the user's judgment decisions to be evaluated continuously by recording the changes made in the user's judgments over time. [0006]
    SUMMARY
  • To address the needs described above, a computerized system and method is provided for representing judgments of a user, for recording relative judgments in one or two dimensions, and for recording the judgment making process. In general, the computerized method displays multiple concept representations simultaneously, receives a user-manipulated adjustment to one or more of the concept representations to create a judgment representation, and records the judgment representation(s) and user-manipulated adjustment(s). The judgment representations and adjustments are preferably recorded continuously so that the judgment process can be reviewed and evaluated. [0007]
  • In accordance with one aspect of the present invention, the computerized method represents and records relative judgments within a physical context. [0008]
  • In accordance with another aspect of the present invention, the computerized method represents and records relative judgments along a one-dimensional scale. [0009]
  • In accordance with a further aspect of the present invention, the computerized method represents and records relative judgments along a two-dimensional scale. [0010]
  • In accordance with a further aspect of the present invention, the computerized method represents and records relative judgments using a polar coordinate scale. [0011]
  • In accordance with a further aspect of the present invention, the computerized method represents and records relative judgments by associating concepts without any physical context or scale. [0012]
  • In accordance with yet another aspect of the present invention, the computerized method represents and records relative judgments using the above methods together with a fixed resource technique. [0013]
  • The computerized system preferably implements the methods defined above using software and a computing device.[0014]
    BRIEF DESCRIPTION OF THE DRAWINGS
  • FIGS. 1 and 2 are schematic block diagrams of the computerized system for representing and recording judgments, according to different embodiments of the present invention; [0015]
  • FIG. 3 is a flow chart illustrating a method for representing and recording judgments in a physical context, according to one embodiment of the present invention; [0016]
  • FIGS. 4, 4A and 5 are illustrations of a human form for indicating pain locations, according to one example of the method for representing and recording judgments in a physical context; [0017]
  • FIG. 6 is a flow chart illustrating a method for representing and recording judgments in relation to a rating scale, according to another embodiment of the present invention; [0018]
  • FIG. 7 is a graphical illustration of a vertical rating scale with unrated concept representations corresponding to various types of pain, according to one example of the method for representing and recording judgments in relation to a rating scale; [0019]
  • FIG. 8 is a graphical illustration of the vertical rating scale shown in FIG. 7 with the concept representations positioned relative to one another and rated based on the location relative to the rating scale; [0020]
  • FIGS. 9 and 10 are graphical illustrations showing a vertical scale with concept representations corresponding to emotional feelings, according to another example; [0021]
  • FIGS. 11-14 are graphical illustrations showing a horizontal rating scale with concept representations, according to further examples; [0022]
  • FIG. 15 is a graphical illustration of a horizontal rating scale with each of the words in a different row, according to yet another embodiment of the present invention; [0023]
  • FIGS. 16-17 are graphical illustrations of a two dimensional scale with concept representations, according to yet another example; [0024]
  • FIG. 18 is a graphical illustration of a two dimensional polar coordinate scale with concept representations, according to yet another example; [0025]
  • FIG. 19 is a flow chart illustrating a method for representing and recording judgments by associating concept representations in space, according to a further embodiment of the present invention; and [0026]
  • FIG. 20 is a graphical illustration of concept representations associated in space, according to one example. [0027]
  • The concept representations can be displayed, for example, as words, pictures or some other icon (such as a solid geometric figure). The concepts can be any physical item (e.g., food) or non-physical concept (e.g., feelings or issues) about which a user can express judgment. Using the user input 16 (e.g., by depressing the mouse button), the user represents one or more relative judgments by locating concept representations in the space relative to other concept representations, a physical context, and/or a scale. The system can receive user-manipulated adjustments of the concept representations relative to each other, the physical context, and/or the scale. In response to the user's manipulation of the concept representations, the system draws the concept representation at its user-designated location, such as occurs when icons are moved across the screen in computer operating systems. [0028]
  • The system thus allows the users to dynamically express and/or modify their relative judgments, for example, by positioning the concept representations relative to one another, relative to a scale, and/or relative to a physical context on the computer screen. The sequential order and value of each manipulation and adjustment is recorded, together with the time required by the user to make the adjustment. The user continues locating concept representations on the screen or continues making adjustments of all concept representations until satisfied with the judgments represented. When the user is satisfied, the user signals (e.g., by pressing any key on the computer keyboard), and the system then records the final values of the judgment representations. Various methods of the present invention are described in greater detail below. [0029]
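For illustration only, the sketch below (not part of the original disclosure) shows one way such continuous recording could be structured in software: each user-manipulated adjustment is logged with its sequential order, the new location, and the time elapsed since the previous adjustment, and the final judgment representations are captured when the user signals completion. All names and data structures here are assumptions.

```python
import time
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class Adjustment:
    order: int                 # sequential order of the manipulation
    concept: str               # which concept representation was moved
    location: Tuple[int, int]  # user-designated (x, y) screen location
    elapsed: float             # seconds since the previous adjustment

@dataclass
class JudgmentRecorder:
    adjustments: List[Adjustment] = field(default_factory=list)
    _last_time: float = field(default_factory=time.monotonic)

    def record_adjustment(self, concept: str, location: Tuple[int, int]) -> None:
        """Log one user-manipulated adjustment as it happens."""
        now = time.monotonic()
        self.adjustments.append(
            Adjustment(len(self.adjustments) + 1, concept, location, now - self._last_time))
        self._last_time = now

    def finalize(self) -> Dict[str, Tuple[int, int]]:
        """Called when the user signals completion (e.g., presses a key);
        returns the final location recorded for each concept representation."""
        final: Dict[str, Tuple[int, int]] = {}
        for adj in self.adjustments:   # later moves overwrite earlier ones
            final[adj.concept] = adj.location
        return final

# Example: the user drags two pain adjectives, adjusts one, then signals completion.
recorder = JudgmentRecorder()
recorder.record_adjustment("throbbing", (120, 40))
recorder.record_adjustment("tender", (120, 300))
recorder.record_adjustment("throbbing", (120, 25))
print(recorder.finalize())
```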
  • One method of representing and recording judgments in relation to a physical context is illustrated in FIG. 3. According to this method, at least one physical context representation is displayed, step 112. The physical context representation represents the physical context (e.g., the user's body) in which the user is making judgments, and the user is asked to make a judgment by designating locations (e.g., a pain location) in the physical context. The system then receives the user input judgment associated with at least one location in the physical context, step 116. The location representing the user's judgment is then displayed in relation to the physical context representation, step 120. User input information (e.g., each user designation and the time between designations) is recorded as each of the locations is designated by the user, step 124. The user can adjust or modify the judgment representations, step 128, for example, by designating new locations and/or erasing existing designations. The user input information for these adjustments is also recorded. When the user is finished, the final judgment representations are recorded, step 130. [0030]
  • Referring to FIG. 4, one example of the method of representing and recording judgments in relation to a physical context is described in greater detail. According to this exemplary method, the user is a patient experiencing a sensory symptom such as pain or itchiness, the physical context is the patient's body, and judgments pertaining to the location of the symptom are recorded. The system displays outline drawings 30 of a human head and body, and the user locates the cursor at one or more locations 32a, 32b on the figure to indicate the pain or itchiness or some other sensory symptom. [0031]
  • Depressing the mouse button results in the appearance of a solid figure (square, circle, or some other geometric figure) at that location. The size of the figure can be adjusted to accommodate the size of the entire drawing as it appears on the computer screen. By holding down the mouse button and moving the cursor, the user can fill in a region on the drawing or indicate the exact pattern of locations on the body where the symptom is experienced. The system preferably only places points that do not overlap with adjacent points so that the system does not have multiple records of the same (or almost the same) placement location. A library of “legal” points (i.e., those falling within the confines of the figure) can be stored separately, and checked by the software before displaying a point indicated by the user. The user can also erase any inadvertent designations. Different colors or types of geometric figures can be used to represent different types of sensory symptoms (e.g., different types or intensities of pain) in a physical context. In one example shown in FIG. 4A, patients can record their symptoms at different intensities on the body picture using different colors to represent the different intensities (as indicated by the scale), thereby providing a symptom scanning technique. [0032]
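A minimal sketch of the point-placement logic just described, assuming a precomputed set of "legal" pixel coordinates inside the body outline and a minimum spacing between recorded markers; the function and variable names are illustrative, not taken from the disclosure.

```python
import math

def place_point(x, y, legal_points, placed_points, min_spacing=5.0):
    """Place a symptom marker only if it falls inside the body outline
    and does not overlap a previously placed marker."""
    if (x, y) not in legal_points:            # outside the figure: ignore
        return False
    for px, py in placed_points:              # suppress near-duplicate placements
        if math.hypot(x - px, y - py) < min_spacing:
            return False
    placed_points.append((x, y))
    return True

# Example with a small 20x20 "legal" region standing in for the body outline.
legal = {(x, y) for x in range(20) for y in range(20)}
placed = []
print(place_point(5, 5, legal, placed))    # True: first marker recorded
print(place_point(6, 5, legal, placed))    # False: too close to an existing marker
print(place_point(50, 50, legal, placed))  # False: outside the outline
```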
  • The system records the order of each point's placement on the drawing, for example, by recording the x,y coordinates of each point placed on the drawing. The system also records the times between each designation of a point on the drawing. This data allows an investigator to exactly reproduce the judgment process employed by the user in marking locations on the figure. The recorded judgment data and judgment process data can thus be used to evaluate the patient's condition. In one example, an animated graphical representation showing the judgment process can be replayed (e.g., as a movie) to visualize the exact manner in which the user made each judgment. In another example, the data can be compared to previously recorded data for other patients, which has been stored in a library of data, to give a likely diagnosis for consideration by the physician. [0033]
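The replay of the judgment process could be sketched as follows, stepping through recorded (x, y, interval) designations and re-drawing each point after the same delay the user took; the drawing callback is a placeholder, and the recorded values are hypothetical.

```python
import time

def replay_judgment_process(recorded_points, draw_point, speed=1.0):
    """Re-create the user's judgment process by replaying each recorded
    designation with the same inter-designation timing (scaled by `speed`)."""
    for x, y, interval in recorded_points:
        time.sleep(interval / speed)   # wait as long as the user did
        draw_point(x, y)               # re-draw the point on the body outline

# Example: replay three recorded designations at double speed, printing instead of drawing.
recorded = [(110, 85, 0.0), (112, 90, 1.4), (230, 310, 3.2)]
replay_judgment_process(recorded, lambda x, y: print(f"point at ({x}, {y})"), speed=2.0)
```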
  • According to one variation of this method for representing and recording judgments of sensory symptoms, as shown in FIG. 5, multidimensional judgments pertaining to the symptoms at each user-designated location can be represented and recorded. For example, a graphical representation 34 associated with a user-designated location can be displayed to allow the user to make the multidimensional judgments further characterizing the symptoms. Examples of methods for representing and recording multi-dimensional judgment representations (e.g., using a fixed resource technique) are described in greater detail in co-pending provisional application Serial No. 60/270,854 (Attorney Docket No. BAIRD-001PR) and application Ser. No. 09/950,126 (Attorney Docket No. BAIRD-001XX), both of which are incorporated herein by reference. Other methods for representing and recording judgments to further characterize the symptoms include the methods described in greater detail below. [0034]
  • One method of representing and recording judgments in relation to a rating scale is illustrated in FIG. 6. According to this method, multiple unrated concept representations are displayed (e.g., using words, pictures or icons), step 222. The concepts can be physical or non-physical and can include anything about which a user can express a judgment. One or more rating scales are also displayed, step 226. The rating scale(s) provide a range of possible judgments applicable to the concepts (e.g., degrees of pain). The user is asked to make a judgment rating each of the concepts in relation to the rating scale(s) and in relation to one another, for example, by manipulating and locating the concept representations along the rating scale(s). When the user input rating of a concept is received, step 230, the concept representation is displayed in relation to the scale, step 234. User input information is recorded as the user rates (or adjusts the rating of) each of the concepts, step 238. If the user wants to rate another concept or adjust a rating, step 242, these steps are repeated. When the user is satisfied, the final ratings are recorded as the user's judgment representation, step 246. [0035]
  • Referring to FIGS. 7-17, examples of the method of representing and recording judgments in relation to a rating scale are described in greater detail. As shown by example in FIG. 7, words initially appear in a vertical list 40 on the screen and a single linear scale 42 appears on the screen with numerical values (e.g., integers 1 to 10) and tick marks. The scale 42 can be oriented either vertically (FIGS. 7-10) or horizontally (FIGS. 11-15). The user moves the words (i.e., the concept representations) to positions along the scale 42 to indicate an amount or degree along the dimension, thereby representing the user's judgment by rating the concept. In one example, the movement is accomplished by positioning the cursor on the word, clicking on the mouse, and moving the cursor. The system automatically erases the old representation of the word and draws it in the new location. This occurs continuously as the cursor is moved. [0036]
  • When a concept representation is manipulated, an indication is preferably displayed on the rating scale indicating the user input rating. For example, the movement of a single word along the scale 42 leads to a corresponding change in the position of an arrowhead 46 that slides along the scale 42 and points to the exact rating at each instant in time. The words can appear at any position along the dimension of the computer screen that is orthogonal to the orientation of the measurement scale 42, thus allowing different items to receive the same ratings. For example, the words can be located in separate rows above the horizontal scale 42′, as shown in FIG. 15, so that more than one concept can be given the same rating without the words overlapping. In the case of a vertical scale, the words can be located adjacent to each other, within the limits of the screen size. [0037]
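The mapping from a dragged word's screen position to the rating indicated by the arrowhead can be illustrated with a small sketch; the linear interpolation between the pixel endpoints of the scale and the 1-to-10 range are assumptions consistent with the description above, and the pixel values are illustrative.

```python
def rating_from_position(y_pixel, scale_top=50, scale_bottom=450,
                         min_rating=1.0, max_rating=10.0):
    """Map the vertical pixel position of a dragged word to a rating on the
    scale; the arrowhead would be drawn at the same position on the scale."""
    y = min(max(y_pixel, scale_top), scale_bottom)          # clamp to the scale
    fraction = (scale_bottom - y) / (scale_bottom - scale_top)
    return min_rating + fraction * (max_rating - min_rating)

# A word dragged to the middle of the scale receives the midpoint rating.
print(rating_from_position(250))   # 5.5
print(rating_from_position(50))    # 10.0 (top of the scale)
```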
  • As additional words are added to the scale 42, the user rates the words with respect to the scale and relative to the other words already rated. The method allows the user to continue manipulating the positions of the words until the user is satisfied with all the ratings. The user input information recorded includes each move, the order for each move, and the time required for each move. In one example, the user input information is stored as an animated graphical representation, which can be replayed to visualize the exact process of making each judgment. [0038]
  • In the example illustrated in FIGS. 8 and 12, a patient in chronic pain adjusted the adjectives to indicate the appropriateness (Least to Best) of each adjective for describing the character of the patient's pain. The adjective “throbbing” was rated as the most appropriate and the adjective “tender” was rated as the least appropriate. The advantage of this method over the standard means of obtaining ratings for each adjective in isolation is that judgments are made within a “context” of other adjectives, thus encouraging the user to make distinctions among the adjectives in terms of their ratings. In the standard method when adjectives are rated in isolation in the clinic, chronic pain patients tend to choose high ratings of appropriateness or intensity for all the adjectives. In another example illustrated in FIGS. 10 and 14, the patient used the adjectives to describe the patient's feelings. [0039]
  • According to another example, shown in FIGS. 16-17, concept representations are located by the user along two dimensions simultaneously. As shown by example in FIG. 16, words initially appear in a vertical list 40 on the screen and a two-dimensional, orthogonal coordinate system (e.g., vertical rating scale 42 and horizontal rating scale 42′) is shown with tick marks and integers designating different levels of the attribute being judged. The user moves the words (i.e., concept representations) to positions within the two-dimensional space relative to the vertical scale 42 and the horizontal scale 42′. This movement of the representation is linked in a linear fashion to movement of one arrowhead 46a along the vertical scale 42 and of another arrowhead 46b along the horizontal scale 42′. The system allows the user to continue manipulating the positions of the items in two dimensions until the user is satisfied with all the ratings along both scales 42, 42′. The system records the concept ratings and movements on both scales 42, 42′, the order for each move, and the time required for each move. [0040]
  • In one example, the user may not assign exactly the same ratings (x and y coordinates) for two or more items because this requires that the two words be placed on top of each other, thereby making them unreadable. According to another example, a three-dimensional scale can be used with each of the words located and movable in its own plane above the ground plane including the two-dimensional scale. When each of the words is moved, the x, y location is displayed on the ground plane relative to the two-dimensional scale. This three-dimensional example allows multiple words to have the same rating (i.e., the same x, y location) without the words having to be placed on top of one another. [0041]
  • In the example illustrated in FIG. 17, a hypothetical user expressed their preference (vertical scale) and perceived nutritional value (horizontal scale) of different foods. The item “pasta” was rated as well liked and of high nutrition; the item “water” was rated as well liked and low in nutrition; the item “broccoli” was rated as poorly liked and high in nutrition; and the item “lettuce” was rated as poorly liked and low in nutrition. [0042]
  • Alternatively, the two dimensional scale can be represented as a polar coordinate system 50, as shown in FIG. 18. The user moves the concept representations 52 (e.g., words, letters, pictures or icons) within one or more circles 54 to different locations in the space. The value measured along one dimension (e.g., preference) is the distance of the item from the center 56 of the circle(s) 54. The value along the second dimension (e.g., nutritional value) is the angle of a vector extending from the center 56 of the circle(s) 54. [0043]
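A sketch of how a placed item's screen position could be read off such a polar coordinate scale, with the radial distance giving one value and the vector angle giving the second; the center, radius, and normalization to unit ranges are illustrative assumptions.

```python
import math

def polar_ratings(x, y, center=(300, 300), radius=250):
    """Convert the screen position of a concept representation inside the
    circle into (distance-based, angle-based) values."""
    dx, dy = x - center[0], y - center[1]
    r = math.hypot(dx, dy)
    theta = math.degrees(math.atan2(dy, dx)) % 360   # angle of the vector from the center
    distance_value = min(r / radius, 1.0)            # 0 at the center, 1 at the rim
    angle_value = theta / 360.0                      # fraction of a full turn
    return distance_value, angle_value

# An item placed halfway out along the 0-degree direction.
print(polar_ratings(425, 300))   # (0.5, 0.0)
```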
  • According to another method of representing and recording judgments, as shown in FIG. 19, concept representations are positioned relative to one another in two dimensions without any physical context or rating scale. Multiple concept representations (e.g., words, pictures, or icons) are displayed, step 322. The user is asked to make judgments by manipulating the concept representations and moving the concept representations in relation to one another. The user manipulations of the concept representations are received, step 326, and the concept representations are displayed at the user-designated location, step 330. User input information is recorded as each concept representation is manipulated, step 334. These steps can be repeated to adjust or modify the user's judgment, step 338. When the user is finished, the final positions of the concept representations representing the user's judgments are recorded, step 342. [0044]
  • One example of this method of positioning concept representations 60 in two-dimensional space is used with food items, as shown in FIG. 20. The user locates the food items in two dimensions on the computer screen by moving words to positions relative to each other. No scales or units of measure are shown and the user is not told what the two dimensions of the screen represent. The user is instructed to adjust the items such that those items that go together (specific attributes can be specified) are close together in space and those items that do not go together are far apart in space. The inter-item distances can later be analyzed so that items are placed in either clustered groupings (non-metric) or in a two-dimensional coordinate system (metric). The advantage of this method over previous manual and computer applications is that one can store a running record of every keystroke made by the user in rendering judgments as items are moved about the screen. [0045]
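The inter-item distance analysis mentioned above could proceed along the lines of the following sketch, which computes pairwise Euclidean distances between final item positions and forms simple threshold-based groupings as a stand-in for the non-metric clustering; the threshold and item positions are illustrative assumptions.

```python
import math
from itertools import combinations

def inter_item_distances(positions):
    """Pairwise Euclidean distances between final item positions."""
    return {(a, b): math.dist(positions[a], positions[b])
            for a, b in combinations(positions, 2)}

def threshold_clusters(positions, threshold):
    """Group items whose distance to some member of a cluster is below
    `threshold` (a simple stand-in for the clustered, non-metric analysis)."""
    clusters = []
    for item, pos in positions.items():
        for cluster in clusters:
            if any(math.dist(pos, positions[other]) < threshold for other in cluster):
                cluster.append(item)
                break
        else:
            clusters.append([item])
    return clusters

foods = {"pasta": (80, 60), "bread": (95, 70), "broccoli": (300, 250), "lettuce": (310, 240)}
print(inter_item_distances(foods))
print(threshold_clusters(foods, threshold=50))   # [['pasta', 'bread'], ['broccoli', 'lettuce']]
```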
  • According to further embodiments of the present invention, any of the methods described above can incorporate the fixed resource technique, as described in greater detail in co-pending provisional application Serial No. 60/270,854 (Attorney Docket No. BAIRD-001PR) and application Ser. No. 09/950,126 (Attorney Docket No. BAIRD-001XX), both of which are incorporated herein by reference. For example, a horizontal scale 42′ (for example, as shown in FIG. 15) can be used with each word (or other type of concept representation) located in its own row above the scale 42′. As one word is moved horizontally in relation to the scale 42′, one or more of the other words are able to move automatically without interfering with one another in accordance with the fixed resource technique. The three-dimensional scale described above can also be used according to this embodiment to provide the fixed resources in two dimensions. [0046]
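A minimal sketch of a fixed-resource adjustment consistent with the constraint recited in claim 17 (the sum of the ratings remains constant): when one rating is changed, the remaining ratings are rescaled so the total is preserved. The proportional rescaling rule, the total of 100, and all names are assumptions for illustration; the co-pending applications define the actual technique.

```python
def fixed_resource_adjust(ratings, moved_item, new_value, total=100.0):
    """Set `moved_item` to `new_value` and rescale the remaining ratings so
    that the sum of all ratings stays equal to `total`."""
    others = [k for k in ratings if k != moved_item]
    remaining = total - new_value
    old_sum = sum(ratings[k] for k in others)
    adjusted = {moved_item: new_value}
    for k in others:
        # distribute the remaining resource in proportion to the old ratings
        share = ratings[k] / old_sum if old_sum else 1.0 / len(others)
        adjusted[k] = remaining * share
    return adjusted

ratings = {"throbbing": 40.0, "burning": 35.0, "tender": 25.0}
print(fixed_resource_adjust(ratings, "throbbing", 60.0))
# {'throbbing': 60.0, 'burning': 23.33..., 'tender': 16.66...}  (sum stays 100)
```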
  • Accordingly, the system and method of the present invention is able to dynamically represent relative judgments while also recording the judgment process. Modifications and substitutions by one of ordinary skill in the art are considered to be within the scope of the present invention, which is not to be limited except by the following claims. [0047]

Claims (32)

The invention claimed is:
1. A computerized method of representing judgments of a user and recording a judgment making process of said user, said method comprising:
displaying multiple concept representations to the user, wherein said concept representations represent concepts about which the user is asked to make a judgment by positioning said concepts together with other concepts that go together;
receiving user manipulations of selected concept representations resulting in user-designated locations of said concept representations relative to one another based on the judgment of the user as to said concepts that go together;
displaying said concept representations at said user-designated locations;
continuously recording user input information as the user manipulates said selected concept representations; and
recording said concept representations at final user-designated locations to provide a final judgment representation, wherein said user input information and said final judgment representation can be used to evaluate the judgments and the judgment making process of the user.
2. The method of claim 1 wherein said user input information includes said user manipulations and the timing of said user manipulations.
3. The method of claim 1 further including replaying the judgment making process of the user using said user input information.
4. The method of claim 1 further including analyzing the distances between said concept representations at said final user-designated locations.
5. The method of claim 4 further including placing said concept representations in clustered groupings based on said distances.
6. The method of claim 1 wherein said concept representations are words.
7. The method of claim 1 wherein said concept representations represent at least one of physical items and abstract ideas.
8. A computerized method of representing and recording judgments of a user in relation to a rating scale, said method comprising:
displaying multiple concept representations to the user, wherein each of said concept representations represents a concept about which the user is asked to make a judgment;
displaying at least one rating scale, wherein said rating scale provides a range of possible judgments applicable to said concepts;
receiving user manipulations of selected concept representations locating said concept representations relative to said rating scale and relative to other concept representations, thereby providing user input ratings of selected concepts;
displaying said selected concept representations relative to said rating scale and relative to said other concept representations in accordance with said user input ratings; and
recording at least a final location of said concept representations relative to said rating scale to provide a final judgment to be evaluated.
9. The method of claim 8 further including displaying an indication on said rating scale of said user input rating of a concept representation when said concept representation is being manipulated by the user.
10. The method of claim 8 wherein said at least one rating scale is two-dimensional including a horizontal rating scale and a vertical rating scale, for representing two different types of judgments.
11. The method of claim 8 wherein said at least one scale is represented as a two-dimensional polar coordinate system, for representing two different types of judgments.
12. The method of claim 8 wherein said concept representations are words.
13. The method of claim 8 wherein said rating scale includes numerical values for indicating said user input ratings.
14. The method of claim 8 further including continuously recording user input information as the user manipulates said concept representations, wherein said user input information allows an evaluation of a judgment making process of the user.
15. The method of claim 14 further including replaying said judgment making process of the user based upon said user input information.
16. The method of claim 8 further including dynamically adjusting a location of at least one other concept representation relative to said rating scale when each said selected concept representation is located relative to said rating scale such that a user can observe dynamically how a judgment with respect to one concept representation affects a judgment with respect to another concept representation.
17. The method of claim 16 wherein said concept representations are dynamically adjusted according to a fixed resource technique such that a sum of user input ratings for each of said selected concepts remains constant.
18. The method of claim 16 wherein said concept representations are positioned relative to said rating scale such that said concept representations can be dynamically adjusted without interfering with other said concept representations.
19. The method of claim 8 wherein said concepts include sensory symptoms, and wherein said judgment of the user is based on a degree to which the user experiences said sensory symptoms.
20. A computerized method of representing and recording judgments and a judgment making process of a user in relation to a physical context, said method comprising:
displaying at least one physical context representation, wherein said physical context representation represents a physical context in which the user is asked to make judgments by designating locations in said physical context;
receiving a user input judgment associated with at least one location in said physical context;
displaying a location representation at said location in said physical context representation;
continuously recording user input information as the user designates said locations; and
recording each said location representation in said physical context representation to provide a final judgment representation, wherein said user input information and said final judgment representation can be used to evaluate the judgments and the judgment making process of the user.
21. The method of claim 20 further comprising:
displaying multidimensional judgments pertaining to at least one of said location representations; and
receiving at least one user manipulation of at least one of said multidimensional judgments, for further characterizing said user input judgment.
22. The method of claim 21 wherein said multi-dimensional judgments are dynamically adjusted in response to each said user manipulation.
23. The method of claim 22 wherein said multi-dimensional judgments are dynamically adjustable according to a fixed resource technique.
24. The method of claim 20 further comprising replaying said judgment making process of said user using said user input information.
25. A computerized method of representing and recording judgments of a user pertaining to sensory symptoms, said method comprising:
displaying at least one body representation, wherein said body representation represents at least a portion of the body of the user in which the user is asked to make judgments by designating locations of said sensory symptoms;
receiving a user input judgment associated with at least one location of a sensory symptom in said body;
displaying a location representation at said location in said body representation; and
recording said location representations in said body representation to provide a final judgment representation, wherein said final judgment representation can be used to evaluate the judgments of the user.
26. The method of claim 25 further comprising continuously recording user input information as the user designates said locations, wherein said user input information allows an evaluation of a judgment making process of the user.
27. The method of claim 26 further comprising replaying said judgment making process of the user using said user input information.
28. The method of claim 25 wherein different types of location representations are used for different types of said sensory symptoms.
29. The method of claim 28 wherein said sensory symptoms include pain symptoms, and wherein different colors are used to represent different intensities of pain.
30. The method of claim 25 further including comparing said final judgment representation to a library of data to determine a diagnosis.
31. A computer program product, stored on a storage medium, for representing and recording judgments of a user in relation to a rating scale, said computer program product comprising:
code for displaying multiple concept representations to the user, wherein each of said concept representations represents a concept about which the user is asked to make a judgment;
code for displaying at least one rating scale, wherein said rating scale provides a range of possible judgments applicable to said concepts;
code for receiving user manipulations of selected concept representations locating said concept representations relative to said rating scale and relative to other concept representations, thereby providing user input ratings of selected concepts;
code for displaying said selected concept representations relative to said rating scale and relative to said other concept representations in accordance with said user input ratings; and
code for recording at least a final location of said concept representations relative to said rating scale to provide a final judgment to be evaluated.
32. A computer program product, stored on a storage medium, for representing and recording judgments and a judgment making process of a user in relation to a physical context, said computer program product comprising:
code for displaying at least one physical context representation, wherein said physical context representation represents a physical context in which the user is asked to make judgments by designating locations in said physical context;
code for receiving a user input judgment associated with at least one location in said physical context;
code for displaying a location representation at said location in said physical context representation;
code for continuously recording user input information as the user designates said locations; and
code for recording each said location representation in said physical context representation to provide a final judgment representation, wherein said user input information and said final judgment representation can be used to evaluate the judgments and the judgment making process of the user.
US10/662,568 2001-02-23 2003-09-15 Computerized system and method for simultaneously representing and recording dynamic judgments Abandoned US20040064266A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/662,568 US20040064266A1 (en) 2001-02-23 2003-09-15 Computerized system and method for simultaneously representing and recording dynamic judgments
US11/553,051 US20070055481A1 (en) 2001-02-23 2006-10-26 System and method for mapping symptom location and intensity

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US27085401P 2001-02-23 2001-02-23
US29211501P 2001-05-18 2001-05-18
US10/016,623 US6619961B2 (en) 2001-02-23 2001-12-10 Computerized system and method for simultaneously representing and recording dynamic judgments
US10/662,568 US20040064266A1 (en) 2001-02-23 2003-09-15 Computerized system and method for simultaneously representing and recording dynamic judgments

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/016,623 Division US6619961B2 (en) 2001-02-23 2001-12-10 Computerized system and method for simultaneously representing and recording dynamic judgments

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/553,051 Continuation-In-Part US20070055481A1 (en) 2001-02-23 2006-10-26 System and method for mapping symptom location and intensity

Publications (1)

Publication Number Publication Date
US20040064266A1 true US20040064266A1 (en) 2004-04-01

Family

ID=27360611

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/016,623 Expired - Fee Related US6619961B2 (en) 2001-02-23 2001-12-10 Computerized system and method for simultaneously representing and recording dynamic judgments
US10/662,568 Abandoned US20040064266A1 (en) 2001-02-23 2003-09-15 Computerized system and method for simultaneously representing and recording dynamic judgments

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/016,623 Expired - Fee Related US6619961B2 (en) 2001-02-23 2001-12-10 Computerized system and method for simultaneously representing and recording dynamic judgments

Country Status (1)

Country Link
US (2) US6619961B2 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7576756B1 (en) * 2002-02-21 2009-08-18 Xerox Corporation System and method for interaction of graphical objects on a computer controlled system
US20040116778A1 (en) * 2002-12-13 2004-06-17 Mauldin John Tyler Systems and methods for assessment of gross cultural assimilation
US20070224584A1 (en) * 2006-03-02 2007-09-27 Michelle Hokanson Apparatus for use in assessing pain level
US8528905B2 (en) * 2009-06-25 2013-09-10 Ronald Bianco Electronic puzzle with problem-solution features for proper placement of puzzle pieces
US20130323692A1 (en) * 2012-05-29 2013-12-05 Nerdcoach, Llc Education Game Systems and Methods
US11532396B2 (en) 2019-06-12 2022-12-20 Mind Medicine, Inc. System and method for patient monitoring of gastrointestinal function using automated stool classifications

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4671772A (en) * 1985-10-22 1987-06-09 Keilty, Goldsmith & Boone Performance appraisal and training system and method of utilizing same
US5882203A (en) * 1995-05-31 1999-03-16 Correa; Elsa I. Method of detecting depression
US5745718A (en) * 1995-07-31 1998-04-28 International Business Machines Corporation Folder bar widget
US6007340A (en) * 1996-04-01 1999-12-28 Electronic Data Systems Corporation Method and system for measuring leadership effectiveness
US5938607A (en) * 1996-09-25 1999-08-17 Atl Ultrasound, Inc. Ultrasonic diagnostic imaging system with access to reference image library
US5720502A (en) * 1996-11-08 1998-02-24 Cain; John R. Pain location and intensity communication apparatus and method
US6405159B2 (en) * 1998-06-03 2002-06-11 Sbc Technology Resources, Inc. Method for categorizing, describing and modeling types of system users

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070055481A1 (en) * 2001-02-23 2007-03-08 Psychological Applications Llc System and method for mapping symptom location and intensity
US20040162751A1 (en) * 2003-02-13 2004-08-19 Igor Tsyganskiy Self-balancing of idea ratings
US7536315B2 (en) * 2003-02-13 2009-05-19 Sap Aktiengesellschaft Self-balancing of idea ratings

Also Published As

Publication number Publication date
US6619961B2 (en) 2003-09-16
US20020119431A1 (en) 2002-08-29

Similar Documents

Publication Publication Date Title
US8021298B2 (en) System and method for mapping pain depth
ES2363759T3 (en) Methods for taking images of the skin and analyzing the skin
US10702216B2 (en) Method and system for body scanning and display of biometric data
US20070055481A1 (en) System and method for mapping symptom location and intensity
Tory et al. Visualization task performance with 2D, 3D, and combination displays
AU775248B2 (en) Computer system and method for displaying data in a shared interactive sectorial graph
US20160088284A1 (en) Method and system for determining biometrics from body surface imaging technology
Perin et al. Interactive horizon graphs: Improving the compact visualization of multiple time series
US9024952B2 (en) Discovering and configuring representations of data via an insight taxonomy
McCleary An Effective Graphic 'Vocabulary'
Ulinski et al. Two handed selection techniques for volumetric data
US20040064266A1 (en) Computerized system and method for simultaneously representing and recording dynamic judgments
AU2017260525B2 (en) Method and system for body scanning and display of biometric data
US20210233330A1 (en) Virtual or Augmented Reality Aided 3D Visualization and Marking System
McNamara et al. Perceptually-motivated graphics, visualization and 3D displays
US11294922B2 (en) Method, apparatus, and computer readable medium for modeling relationships between query responses in a data set
Cecotti et al. Serious game for medical imaging in fully immersive virtual reality
Lee et al. Spatial low pass filters for pin actuated tactile displays
KR19990045435A (en) Method using Pseudo Device Selector and Neural Network
JP2001299545A (en) Advice system for choosing a pillow
EP3893091A1 (en) Controlling a data inspection by a user
KR101079218B1 (en) Method and apparatus for displaying visualized data
Carswell et al. Visuospatial displays: Design problems and principles.
Gustafson Visualizing off-screen locations on small mobile displays
GB2402830A (en) Colour coding of data dependant on human perception of colour

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION