WO1995016949A1 - Automatic gathering and graphical display of useability test data - Google Patents

Automatic gathering and graphical display of useability test data

Info

Publication number
WO1995016949A1
Authority
WO
WIPO (PCT)
Prior art keywords
cursor
data
user
software
useability
Prior art date
Application number
PCT/US1994/013853
Other languages
French (fr)
Inventor
Jeffrey Charles Conniff
Original Assignee
Software Publishing Corporation
Priority date
Filing date
Publication date
Application filed by Software Publishing Corporation filed Critical Software Publishing Corporation
Priority to EP95905852A priority Critical patent/EP0685084A1/en
Priority to CA002156058A priority patent/CA2156058A1/en
Publication of WO1995016949A1 publication Critical patent/WO1995016949A1/en

Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06F: Electric digital data processing
    • G06F 11/00: Error detection; error correction; monitoring
    • G06F 11/30: Monitoring
    • G06F 11/34: Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; recording or statistical evaluation of user activity, e.g. usability assessment
    • G06F 11/3409: Recording or statistical evaluation of computer activity for performance assessment
    • G06F 11/3414: Workload generation, e.g. scripts, playback
    • G06F 11/3419: Performance assessment by assessing time
    • G06F 11/3438: Monitoring of user actions
    • G06F 11/3466: Performance evaluation by tracing or monitoring


Abstract

An apparatus for recording software useability includes a central processing unit (CPU, 12), video display (14), keyboard (16), clock register (27), program memory (28), test memory (30), and mass storage (34). The test memory (30) includes a path array (36) for keeping track of the pointing device (18), a click register (38) for locating the cursor when a selection is made, and a statistics register (40) for such statistics as an identification number (ID) for successive users, elapsed time (42) indicating the time taken to perform a function, user name (44), and familiarity data (48) holding answers to questions regarding the user's level of experience using like software. A user is asked (57) to perform a software function; the system traces the cursor's path (61) and determines the elapsed time. Data is stored in the test memory (30) and the program is ended, or a new user or task is selected (65). Analysis is performed by replaying the paths of selected users on the video screen (14).

Description

AUTOMATIC GATHERING AND GRAPHICAL DISPLAY OF USEABILITY TEST DATA
BACKGROUND OF THE INVENTION 1. Field of the Invention This invention relates to computer software testing, and more particularly to a system for developing and testing computer software which includes the automatic tracking and graphical display of the software useability.
2. Description of the Background Art An important aspect of the development of new computer software is program useability. Program useability describes the ease with which an untrained or modestly trained user can maneuver about within the software program and operate the various program functions. This useability includes such considerations as the logical arrangement of functions and menus, the layout of the various graphical features on the computer screen, and the hierarchical arrangement in which various functions are ordered. The traditional method of testing useability of computer software is to videotape the actual usage of the software. This generally entails setting up a testing station in which a number of users sequentially perform specific tasks on the software-under-test. The various keystrokes and user inputs are recorded on videotape. The results of these testing sessions are then reviewed by a software test engineer and various relevant data is recorded relating to the useability and use patterns contained in the videotape. Measuring software useability in this way is inconvenient in a number of respects. First of all, since the user must be videotaped, maintaining more than one station can be an expensive process as there must be one taping machine per station. Secondly, having a video camera, and especially having a cameraman present during the software testing session, can skew the useability results. Further, evaluation of the videotape following the test sessions can be an extremely burdensome process, since each of the video sessions must be reviewed by the software test engineer. Finally, because the test sessions are videotaped by user (i.e. a first user is taped performing multiple tasks, followed by a second user performing multiple tasks), they do not lend themselves to being conveniently reviewed by task. What is needed is a method to automatically record software utilization in such a way that not only the results achieved by the software user can be measured, but also the process followed by the user in achieving the software results.
SUMMARY OF THE INVENTION In accordance with the present invention, an apparatus and process are described for automatically recording software useability in a computer system. The system consists of programmed computer instructions implemented on a general purpose computer comprising a central processing unit for performing the programmed instructions, a video display for displaying stored data, a keyboard and pointing device for enabling user input, a clock generator for timing each function tested, and memory for data storage. The memory includes program memory, test memory, mass storage and software-under-test memory. The test memory contains a number of registers including a path array for keeping track of the movements of a pointing device, a click register for locating the position of the pointing device when a specific graphical interface selection is made, and a statistics register for keeping track of various user statistics such as total test time elapsed, user ID number, user name and familiarity data. The method of the present invention is initiated when the user is asked to perform a specific function in the software under test. Upon initiating the task, a user ID is established and a clock generator begins to track the time of the user function. The system then traces the path followed by the user in manipulating the pointing device or keyboard. When the user has completed the path, the total time elapsed is determined, the user ID and path data are stored in the statistics register, and a new user or new task is selected. Analysis of the test data is performed by replaying the paths of selected users and displaying the paths and click entry locations in a playback mode on the video screen. A single user's test data may be analyzed as the traced path is replayed in real time, or by displaying a composite graph comprising a plurality of charted lines shown for comparison of the test data of selected users.
BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is a block diagram showing the system of the present invention; FIG. 2(a) is a block diagram showing the various components of the test memory of FIG. 1; FIG. 2(b) is a block diagram showing the various components of the statistics register of FIG. 2(a); FIG. 3 is a flow diagram showing the various method steps of the present invention; FIG. 4 shows an example of the video display screen used in collecting information in the method of the present invention; FIG. 5 shows an example of the video display screen showing data collected for a single user following execution of the example of FIG. 4; and FIG. 6 shows an example of the video display screen showing various statistics collected for multiple users following execution of the example of FIG. 4.
DESCRIPTION OF THE PREFERRED EMBODIMENT Referring now to FIG. 1, a system 10 is shown for implementing the automatic gathering and graphical display of useability test data of the present invention. The preferred embodiment of the present invention consists of programmed computer instructions implemented on a general purpose computer such as an IBM PC, Intel 80486-based machine. Execution of the programmed instructions is performed in a Central Processing Unit (CPU) 12. Connected to the CPU 12 is a video display 14 for displaying stored data to the user. User input is enabled through a keyboard 16 and a pointing device 18 which are both connected to the CPU 12. The CPU 12 can optionally be connected to a printer 20 for generating hard copies of data manipulated by the CPU 12, a Local Area Network (LAN) 22 for communicating with other computers, and a modem 24 for receiving and transmitting data through a telephone network. Also connected to the CPU 12 is a data bus 26 for connecting the CPU 12 to other peripheral devices, such as a clock register 27, and to various computer memories, including a program memory 28, test memory 30, software-under-test memory 32 and mass storage 34. The program memory 28 stores the programmed instructions for controlling the operation of the CPU 12 in executing the method steps of the present invention. Following the instructions stored in the program memory 28, the CPU 12 prompts the user via the video display 14 to perform specific tasks on the software being tested, using the keyboard 16 and the pointing device 18. The user's response to these prompts is tracked and stored in the test memory 30. The software-under-test memory 32 is used to store the software components being analyzed by the instructions stored in program memory 28. A typical software component being analyzed would be a dialog box in which the user is asked to read a set of instructions and pick one or several choices from a plurality of possible selections. Statistical data, such as the amount of time required for the user to complete the task displayed in the dialog box, is recorded and stored in the test memory 30. A mass storage 34, such as a hard disk or the like, is included in the system 10 for long term storage of a multiplicity of test results. The clock register 27 provides means for determining the time that a user spends performing a specific function as identified by instructions stored in the software-under-test memory 32. Increments of time are generated by the internal clock of the CPU 12, and the CPU periodically increments the contents of the clock register 27. The periodic updating of the clock register 27 preferably occurs in tenths of a second increments, although other increments may be equivalently useful. The clock register 27 is used by the CPU 12 for measuring and tracking functions as required by the programmed instructions of the present invention. The total elapsed time from start to finish, for every function that each user performs, is calculated by the CPU 12 and stored in the test memory 30. In an alternative embodiment, the clock register 27 is associated with a discrete clock generator circuit which times the user performances independently of the CPU 12, and calculates the time elapsed for each function.
Referring now to FIG. 2(a), a block diagram shows the components comprising the test memory 30. The data bus 26 connects a path array 36, a click register 38 and a statistics register 40 to the CPU 12. The path array 36 selectively stores the location of the cursor as the user performs each function. In the preferred embodiment, the X and Y coordinates corresponding to a cursor position are saved in synchronization with CPU 12 time increments provided to the clock register 27. The CPU 12 ascertains the X and Y coordinates corresponding to the cursor's position at each predetermined increment of time, and saves the data to the path array 36. Alternatively, the coordinates are saved to the path array 36 only if the cursor's position actually changes. This selective storing of data eliminates the storage of redundant data when the cursor is idle. Other equivalent selective storage schemes are also possible. For instance, storage of the cursor's position to the path array 36 could occur asynchronously with the CPU 12, using an interrupt mechanism to store time and position data whenever cursor movement is detected. Further, storage could occur only when movement has exceeded some threshold. From the time increment versus cursor position data, the path upon which the user moves the cursor during the test function can be regenerated and later recreated for study and comparison to the paths of other users tested. Each user is assigned an identification number by the CPU 12 that is used to identify the user's test data for a specific function. The identification numbers are stored in the statistics register 40. Although the user is prompted at the beginning of the test to type in his or her name, the CPU 12 utilizes the identification number it generates, and not the user's name, to store and retrieve the test data for that user. As the user operates the software-under-test stored in the memory 32, the CPU 12 periodically requests the user to make selections from among various menu items. The user makes these selections by moving the cursor to the area of the video display 14 containing the desired information and either pressing a predetermined keyboard key or by pressing an enter key on a pointing device 18. These entries are called "clicks", and the position of the cursor at the time of each click is defined as a point having X and Y coordinates, and is stored in a click register 38. A statistics register 40 stores a variety of statistical data relating to the user's interaction with the software-under-test. This stored data includes the elapsed time calculated by the CPU 12 for performing specific functions, the user name entered at the beginning of the test, the user ID generated by the CPU 12 for identifying specific test data, and user familiarity data. User familiarity data are answers to multiple choice questions prompted by the test program to ascertain the level of familiarity the user has with the software-under-test or like products.
Referring now to FIG. 2(b), a block diagram shows the components comprising the statistics register 40, which include an elapsed time memory 42, a user name memory 44, a user ID number memory 46 and a familiarity data memory 48. The elapsed time memory 42 stores the total time that a user spends performing a specific function as determined by the clock register 27 and by the CPU 12. The user name memory 44 stores the name that the user input in response to a prompt by the CPU 12 at the beginning of a test cycle. The user ID number memory 46 stores the CPU generated ID number for each user. The familiarity data memory 48 stores answers to multiple choice questions prompted by the CPU 12 at the beginning of the test cycle, where the user is asked specific questions to gauge the level of familiarity the user has with the software-under-test.
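For illustration only, the register layout of FIGS. 2(a) and 2(b) and the selective path sampling described above can be modeled in a few lines of modern code. The following is a minimal Python sketch and not part of the patent; the names TestMemory, StatisticsRecord and sample_cursor, and the get_cursor_pos hook, are assumptions introduced here.

```python
import time
from dataclasses import dataclass, field

@dataclass
class StatisticsRecord:
    """One user's entry in the statistics register 40 (illustrative model)."""
    user_id: int                  # CPU-generated ID (user ID number memory 46)
    user_name: str                # name typed at the prompt (user name memory 44)
    familiarity: list = field(default_factory=list)   # answers (familiarity data 48)
    elapsed: dict = field(default_factory=dict)       # per-function times (42)

@dataclass
class TestMemory:
    """Models the test memory 30: path array 36, click register 38, statistics 40."""
    path_array: list = field(default_factory=list)      # (time, x, y) samples
    click_register: list = field(default_factory=list)  # (x, y) of each click
    stats: dict = field(default_factory=dict)           # user_id -> StatisticsRecord

def sample_cursor(memory, get_cursor_pos):
    """Record the cursor position, skipping the sample if the cursor is idle."""
    x, y = get_cursor_pos()       # platform hook, assumed to return (x, y)
    if not memory.path_array or memory.path_array[-1][1:] != (x, y):
        memory.path_array.append((time.monotonic(), x, y))
```

Storing a sample only when the coordinates change mirrors the idle-cursor optimization described above; the interrupt-driven and movement-threshold variants would simply swap the guard condition.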
Referring now to FIG. 3, a flow diagram shows the various steps comprising the method 50 of automatically gathering test data in accordance with the present invention. The method starts in step 51 with a prompt from the CPU 12 requesting that the name of the user be entered via the keyboard 16. Preferably the user's name is used; however, any other identification system may also be used. In response to the prompt, the user's name is entered 53 and is stored in the user name memory 44. The CPU 12 generates a separate ID number for each user name input and stores the ID numbers in the user ID number memory 46. The CPU 12 utilizes the user ID number as the primary ID source for purposes of identifying data, but preferably only the user's name is displayed on the video display 14. Following entry of the user's name 53, familiarity data is entered 55 by the user in response to multiple choice questions that gauge the user's familiarity with like software products. For example, an appropriate multiple choice question might ask the user to choose whether he is more familiar with IBM-type personal computers or Apple Macintosh-type computers. Another question might ask the user if he has previously used a particular software package. The familiarity level of the user is important in the proper analysis of the software under test, as heightened familiarity may skew the test results and cause inaccurate analysis. A computer literate user will most likely have an easier time working with new programs than a computer novice. The familiarity data entries 55 are stored in the familiarity data memory 48. The useability evaluation of the software-under-test is initiated by displaying 57 a function task, comprising a written set of instructions presented on the video display 14. The instructions provide the user with directions about the task to be performed. The instructions also direct the user to acknowledge that the directions have been read and to initiate the evaluation, preferably by pressing a certain key or by entering a specific combination of keystrokes. In an alternative embodiment, the user positions the cursor on an acknowledgment button displayed on the video display 14 and initiates the evaluation by pressing the enter key on the pointing device 18. As the user responds to the instructions and starts the evaluation in accordance with step 57, the test screen associated with the first function is displayed on the video display 14 and incrementing of the clock register 27 is started 59. The clock register 27 continues to increment until a final input from the user signals the completion of the specific function being evaluated. The movement of the cursor is traced 61 and the cursor position at the time of each click is stored. The X and Y coordinates corresponding to the position of the cursor are ascertained by the CPU 12 at periodic time increments and are stored in the path array memory 36. From this stored cursor data, the path 61 of the cursor as it is moved about the video display 14 by the user can be later retraced. During the trace path step 61, the position of the cursor each time the mouse is clicked is also stored by its corresponding X and Y coordinates. The position data and quantity of clicks are stored in the click register 38. When the specific function being tested is completed, the user indicates 63 the completion by entering a "done" command using either the keyboard 16 or the pointing device 18. The CPU 12 tests for a done indication in step 63.
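Continuing the sketch above, the flow of method 50 (name entry 53, familiarity questions 55, timed function tasks 57 through 63) might look as follows. The prompt, task_done and get_cursor_pos callables, and the example FAMILIARITY_QUESTIONS, are hypothetical stand-ins for the keyboard 16, the "done" command and the multiple choice questions; none of them is defined by the patent.

```python
import itertools
import time

FAMILIARITY_QUESTIONS = [          # assumed examples of step 55 questions
    "Are you more familiar with IBM-type or Macintosh-type computers?",
    "Have you used this software package before?",
]
_ids = itertools.count(1)          # stand-in for the CPU-generated user IDs

def run_session(memory, tasks, prompt, task_done, get_cursor_pos):
    """One pass through method 50 for a single user."""
    name = prompt("Enter your name: ")                               # step 53
    record = StatisticsRecord(user_id=next(_ids), user_name=name)
    record.familiarity = [prompt(q) for q in FAMILIARITY_QUESTIONS]  # step 55
    for code, instructions in tasks:
        prompt(instructions)                                 # step 57: show the task
        start = time.monotonic()                             # step 59: clock starts
        while not task_done():                               # step 63: "done" check
            sample_cursor(memory, get_cursor_pos)            # step 61: trace path
            time.sleep(0.1)                                  # tenth-second increments
        record.elapsed[code] = time.monotonic() - start      # elapsed time memory 42
    memory.stats[record.user_id] = record
```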
When the user has indicated that the function is done, the time stored in the clock register 27 is read by the CPU 12, and the total time elapsed for the function tested is determined and stored in the elapsed time memory 42. A software useability test typically comprises a plurality of functions. Where a second function is to be tested, the method repeats by displaying 57 a new set of instructions. As with the instructions for the first function, the user is asked to start the function timer by entering a predetermined combination of keystrokes. The total elapsed time is determined by the CPU 12 for each function performed. Each function is separately coded for identification purposes and the test data is preferably stored primarily by user ID number and secondarily by function code in the statistics register 40. When all of the test functions for a particular user have been completed, the CPU 12 asks 65 the user if a new user is to be tested. If the answer is 'yes', the user's name is entered 53 and processing iterates once again through the function evaluation steps as discussed. If no further users are found in step 65, the method ends 67.
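As a small illustration of that storage scheme (again a sketch, not the patent's implementation), keying records primarily by user ID and secondarily by function code is what allows the data to be reviewed by task rather than by user:

```python
def elapsed_for_function(memory, function_code):
    """Gather every user's elapsed time for one function code."""
    return {uid: rec.elapsed[function_code]
            for uid, rec in memory.stats.items()
            if function_code in rec.elapsed}
```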
Referring now to FIG. 4, a test mode video display screen 70 is shown concomitant with the starting 59 of the clock register incrementation. The test memory 30 is now prepared to receive information from the user via the CPU 12. The user is prompted to begin the test by responding to a first instruction 78. The test screen 70 includes an instruction box 72 in the upper right-hand corner of the test screen 70 and a test box 74 in the middle left-hand side of the test screen 70, entitled "new" in the example shown. The instruction box 72 provides direction to the user, and prompts the user to begin testing by entering a combination of initiating keystrokes 80. When the user enters the initiating keystrokes 80 (Ctrl+Z in this example), the incrementing of the clock register 27 is started 59, the path array 36 and click register 38 tracing 61 is begun, and a test box 74 appears. The test box 74 is a video display window that contains the graphical components being evaluated for useability by the present invention. Preferably the test box 74 appears much bigger and partially covers the initiating keystrokes 80 prompt, but its size has been reduced in FIG. 4 for illustration purposes. By using a mouse as the preferred pointing device 18, the user moves a cursor 76 about the test screen 70 in response to the instructions displayed in the instruction box 72. In the example shown, the instructions direct the user to "show the titles and left justify them on the page." To exemplify a task being performed, the cursor 76 would first be moved to the box marked "show titles" 82. When the cursor 76 is within the frame of the show titles box 82, the mouse is clicked to select "show titles". The cursor 76 is then moved to the box marked "align text" 84, centered within the "left" bubble 86, and the mouse is clicked to select left alignment. Next, the cursor 76 is moved to the box marked "OK" 88 and the mouse is clicked to complete the test function and stop the incrementing clock register 27. From the time of the initiating keystrokes 80 (Ctrl+Z) until the time the function is completed, the path of the cursor 76 is determined by the location of the cursor 76 at selected time intervals. Each user's path is stored in the path array memory 36 and the location of each cursor click is stored in the click register 38. The user ID, user name, familiarity data and the elapsed function time for the series of functions comprising an event are stored in the statistics register 40. The data stored in the path array 36, the click register 38, and the statistics register 40 can be recalled to recreate a test function performed by a user, in order to analyze the useability of the software-under-test. The cursor path and cursor click location data for any user can be recalled by the CPU 12 and displayed on the test screen 70 as a charted line representing the cursor path and a series of bullets denoting the locations of the cursor clicks. To create the charted line, the time increment/cursor location data from the path array 36 is recalled to the test screen 70. Because the data is stored as a series of points having X and Y coordinates, drawing a line between subsequent points to connect them creates the charted line. Next, the X and Y coordinates relating to the cursor click locations are recalled to the test screen 70 and bullets or other such demarcation is added to the display at each location. Alternatively, both the charted line and bullets can be recreated simultaneously where the "real time" (i.e.
exact path including pauses) taken by the chosen user is desired for the analysis of the software-under-test.
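A playback routine matching this description only has to connect subsequent path points and pause for the recorded intervals. The sketch below is illustrative, not the patent's code; it assumes draw_line and draw_bullet display callbacks, and its speed argument anticipates the speed box 110 discussed next.

```python
import time

def replay(path_array, clicks, draw_line, draw_bullet, speed=1.0):
    """Redraw a stored cursor path as a charted line with click bullets.

    speed=1.0 reproduces the user's own timing, pauses included;
    larger values draw faster, smaller values draw in slow motion.
    """
    for (t0, x0, y0), (t1, x1, y1) in zip(path_array, path_array[1:]):
        time.sleep((t1 - t0) / speed)      # reproduce the pause between samples
        draw_line((x0, y0), (x1, y1))      # connect subsequent points
    for x, y in clicks:
        draw_bullet((x, y))                # bullet at each click location
```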
FIG. 5 illustrates a playback mode video display screen 89, showing the path array and cursor click data collected for a single user following the execution of the function displayed in FIG. 4. The playback screen 89 includes the instruction box 72 and test box 74 as seen by the user during the test, and further includes an analysis box 90 shown below the test box 74. The same test environment is presented in the playback mode as the environment of the test mode so that an accurate examination and analysis of the test results is made. In this illustration, a user named Andy 91 utilized a total time of 58 one-tenth second increments 92 (or 5.8 seconds) to complete the example task. In response to two multiple choice familiarity data questions, Andy answered by choosing the first answer to the first question 94 and the first answer to the second question 96. Next, Andy's path 98 is drawn in graphic form and click locations 100a through 100c are shown as bullets. Andy started at the upper left-hand corner 102 of the test box 74, moved to the show titles box 82, clicked the mouse at location 100a, moved toward the right, dipping slightly at point 104, until the cursor was within the align text left bubble 86, clicked the mouse at location 100b, and moved to the 'OK' box 88 and clicked the mouse at location 100c to finalize the task. To analyze a specific set of test data, all existing paths are first cleared by pressing the clear path button 106. Next, the name of the user whose test is to be analyzed is input 108. Although the user identification box 109 is shown as comprising a single row of data, in an alternative embodiment a plurality of user identifications are listed and a list of users may be scrolled in order to select one or more users. The speed at which the charted line is to be drawn is changed by moving the speed box 110 to the right for faster line creation, or to the left for slower line creation. Preferably the chart is drawn in real time to simulate the exact path, including pauses, of the chosen user. Alternatively, the draw speed could be increased for instantaneous line creation or decreased for slow motion line creation, as desired. The "dialogid box" 111 indicates which function of an event is being recreated. As the various function tasks comprising the software test are created, each function is coded with a dialogid number 113. Test data is stored in the statistics register 40, preferably primarily by the user ID number, and secondarily by dialogid number 113.
Referring now to FIG. 6, a plurality of users are simultaneously analyzed in an example of a playback screen 89 showing the paths 112 and cursor click data 114 for multiple users. A plurality of sets of preliminary data 116 (user name, time elapsed and familiarity data answers) are shown for multiple users in generally tabular form. Alternatively, only one set of preliminary data 116 can be shown at a time. The preliminary data 116 can be sorted by user name, total time elapsed or by the sequence of answers to the familiarity data. The paths of selected users are preferably drawn sequentially in overlapping fashion, with the line that is in the process of being drawn highlighted, shown as a darker line 118, or in a different color. Alternatively, a composite of the charted lines for a multiplicity of users can be displayed simultaneously for comparison. For instance, to create a composite of the charted lines for only the five fastest users, the preliminary data is sorted by total elapsed time and the paths taken by the first five users are selected to be instantaneously drawn. Further, an analysis can be limited to only the cursor clicks by choosing to regenerate the corresponding data from the click register and having the click locations drawn on the playback screen. The automatic gathering and graphical display system of the present invention provides a software useability analyzer with a means for viewing the paths and click locations of one or more users simultaneously to assist in the analysis of the particular software- under-test. By considering the elapsed time data and gauging the familiarity level for each user in addition to examining the traced paths, the analysis of the tested software's useability is accurately determined.
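Selecting, say, the five fastest users for such a composite is a simple sort over the statistics records. This closing sketch carries the same caveats as the others (illustrative only, reusing elapsed_for_function and replay from the earlier sketches; the per-user path and click mappings in the comment are hypothetical):

```python
def fastest_users(memory, function_code, n=5):
    """Return the n users with the lowest elapsed time on one function."""
    timed = elapsed_for_function(memory, function_code)
    return sorted(timed, key=timed.get)[:n]

# Example: overlay the five fastest paths instantaneously (no pauses),
# assuming paths_by_user and clicks_by_user map user IDs to their data.
# for uid in fastest_users(memory, "dialogid_113"):
#     replay(paths_by_user[uid], clicks_by_user[uid],
#            draw_line, draw_bullet, speed=float("inf"))
```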
The invention has now been explained with reference to specific embodiments. Other embodiments will be apparent to those of ordinary skill in the art in light of this disclosure. Therefore it is not intended that this invention be limited, except as indicated by the appended claims.

Claims

What is claimed is:
1. A system for automatically gathering and graphically displaying useability data, the system comprising: a processor for reading, processing and storing data in response to programmed instructions; a display device connected to the processor for gathering and graphically displaying useability information; an input device connected to the processor for providing user input to the processor in response to information displayed on the display device; and a memory connected to the processor for storing data, the stored data including programmed instructions for testing software useability, a software program undergoing testing, and test data relating to the useability of the software program undergoing testing.
2. The system as recited in claim 1 wherein the processor is further connected to means for regenerating cursor movements, the cursor movements being performed by the user during the gathering of useability data.
3. The system as recited in claim 2 wherein the means for regenerating comprises: means for clocking the duration of intervals of data collection; means for determining the movement of a cursor upon the display device at a selected time interval; and means for determining the location of a cursor upon the display device at the time an entry is made.
4. The system as recited in claim 3 wherein the means for clocking comprises a clock register connected to the processing unit for providing increments of time.
5. The system as recited in claim 3 wherein the means for determining the cursor movement comprises a path array register for storing the X and Y coordinates correlating to the position of a cursor upon the display device, the coordinates being saved in the path array register at selective time increments provided by the means for clocking.
6. The system as recited in claim 3 wherein the means for determining the cursor location at the time an entry is made comprises a click register for storing the X and Y coordinates correlating to the position of the cursor upon the display device at the point in time where a key entry is made.
7. The system as recited in claim 3 wherein the means for regenerating further comprises a statistics register for storing data input in response to selective programming instructions.
8. The system as recited in claim 7 wherein the statistics register comprises: an elapsed time memory for storing the total time elapsed during the performance of a function directed by the programmed instructions; a user name memory for storing an identification input from the input device in response to a programmed instruction; a user identification number memory for storing an identification number generated by the processing unit for each user name input into memory; and a familiarity data memory for storing the data input in response to programmed instructions regarding prior use of software programs similar to the software program undergoing testing.
9. A means for automatically gathering and graphically displaying useability data for analyzing the ease of useability of a software-under-test program in a computer comprising: means for displaying a first programmed test instruction providing direction for manipulating a cursor in response to the instruction; means for starting a clock which indicates time intervals; means for determining the location of the cursor at selected time intervals; means for storing the cursor location at the selected time intervals; means for indicating the completion of the cursor manipulations; and means for stopping the clock when completion has been indicated.
10. The means for automatically gathering and graphically displaying useability data as described in claim 9 further comprising: means for determining whether a subsequent test instruction is programmed following the stopping of the clock.
11. The means for automatically gathering and graphically displaying useability data as described in claim 10 further comprising: means for displaying a new first programmed test instruction if a new user is to analyze the software-under-test program.
12. The means for automatically gathering and graphically displaying useability data as described in claim 9 further comprising: means for retracing the path of the cursor during the manipulations for subsequent analysis of the useability of the software-under-test program, by displaying the stored cursor locations in a graphically charted form.
13. A method for automatically gathering and graphically displaying useability data for analyzing the ease of useability of a software-under-test program in a computer, the method comprising the steps of: displaying a first programmed test instruction providing direction for manipulating a cursor in response to the instruction; starting a clock which indicates the time intervals; determining the location of the cursor at selected time intervals; storing the cursor location at the selected time intervals; indicating the completion of the cursor manipulations; and stopping the clock when completion has been indicated.
14. The method as described in claim 13 further comprising the step of: determining whether a subsequent test instruction is programmed following the stopping of the clock.
15. The method as described in claim 14 further comprising the step of: displaying the subsequent programmed test instruction if a subsequent test instruction is programmed.
16. The method as described in claim 13 further comprising a step following the stopping step comprising: displaying a new first programmed test instruction if a new user is to analyze the software-under-test program.
17. The method as described in claim 13 further comprising the step of: ending the program if no new users will be analyzing the software-under-test program.
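Claims 14 through 17 add the surrounding control flow: after each step, advance to the next programmed instruction if one exists, restart at the first instruction for a new user, or end the program. A rough sketch, reusing run_test_step from the sketch following claim 9 together with a hypothetical more_users() hook:

    def run_session(instructions, get_cursor_position, task_completed, more_users):
        results = []
        while True:
            for instruction in instructions:   # each subsequent instruction
                elapsed, path = run_test_step(
                    instruction, get_cursor_position, task_completed)
                results.append((instruction, elapsed, path))
            if not more_users():               # no new users: end the program
                return results
            # Otherwise loop again, redisplaying the first programmed
            # instruction for the new user.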
18. The method as described in claim 13 further comprising the initial steps of:
entering an identification term;
designating an identification number to correlate with the identification term;
entering responses to queries regarding familiarity with the software-under-test program; and
storing the data entered in a statistics register.
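The initial steps of claim 18 map onto a short intake routine: accept an identification term, designate a correlated number, pose the familiarity queries, and file the answers in the statistics register. An illustrative sketch reusing the StatisticsRegister layout shown after claim 8, with console input() standing in for the input device:

    import itertools

    _next_id = itertools.count(1)   # identification numbers, one per user name

    def intake(questions):
        name = input("Enter your name: ")             # the identification term
        answers = {q: input(q + " ") for q in questions}
        # Store the entered data in the statistics register; elapsed time
        # is filled in later, once the test steps have run.
        return StatisticsRegister(user_name=name,
                                  user_id=next(_next_id),
                                  familiarity=answers)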
19. The method as described in claim 13 further comprising the final step of: retracing the path of the cursor during the manipulations for subsequent analysis of the useability of the software-under-test program, by displaying the stored cursor locations in a graphically charted form.
PCT/US1994/013853 1993-12-17 1994-12-06 Automatic gathering and graphical display of useability test data WO1995016949A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP95905852A EP0685084A1 (en) 1993-12-17 1994-12-06 Automatic gathering and graphical display of useability test data
CA002156058A CA2156058A1 (en) 1993-12-17 1994-12-06 Automatic gathering and graphical display of useability test data

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US16950693A 1993-12-17 1993-12-17
US08/169,506 1993-12-17

Publications (1)

Publication Number Publication Date
WO1995016949A1 (en) 1995-06-22

Family

ID=22615984

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1994/013853 WO1995016949A1 (en) 1993-12-17 1994-12-06 Automatic gathering and graphical display of useability test data

Country Status (3)

Country Link
EP (1) EP0685084A1 (en)
CA (1) CA2156058A1 (en)
WO (1) WO1995016949A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107193984A (en) * 2017-05-25 2017-09-22 上海喆之信息科技有限公司 A kind of high-quality user recommendation system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4637797A (en) * 1985-01-11 1987-01-20 Access Learning Technology Corporation Software training system
US4845665A (en) * 1985-08-26 1989-07-04 International Business Machines Corp. Simulation of computer program external interfaces
US4772206A (en) * 1986-03-10 1988-09-20 International Business Machines Corporation Multi-mode teaching simulator
US5086393A (en) * 1986-03-10 1992-02-04 International Business Machines Corp. System for testing human factors and performance of a system program
US4941829A (en) * 1987-12-23 1990-07-17 International Business Machines Corporation Method for providing a dynamic tutorial display
US5211564A (en) * 1989-07-19 1993-05-18 Educational Testing Service Computerized figural response testing system and method

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6064381A (en) * 1996-12-03 2000-05-16 Ergolight Ltd. Apparatus and methods for analyzing software systems
US6118447A (en) * 1996-12-03 2000-09-12 Ergolight Ltd. Apparatus and methods for analyzing software systems
US6384843B1 (en) 1996-12-03 2002-05-07 Ergolight Ltd. Apparatus and methods for analyzing software systems
WO1998044418A1 (en) * 1997-04-01 1998-10-08 Landmark Systems Corporation Client-based system for monitoring the performance of application programs
US6526526B1 (en) 1999-11-09 2003-02-25 International Business Machines Corporation Method, system and program for performing remote usability testing
WO2002008903A2 (en) * 2000-07-25 2002-01-31 Vividence Corporation Automated task-context web page effectiveness analyzer
WO2002008903A3 (en) * 2000-07-25 2003-07-17 Vividence Corp Automated task-context web page effectiveness analyzer
DE10306304B4 (en) * 2002-09-09 2007-06-06 Fuji Xerox Co., Ltd. Device for supporting user-friendliness evaluation
US7152016B2 (en) 2002-09-19 2006-12-19 Fuji Xerox Co., Ltd. Usability evaluation support apparatus and method
US7490319B2 (en) 2003-11-04 2009-02-10 Kimberly-Clark Worldwide, Inc. Testing tool comprising an automated multidimensional traceability matrix for implementing and validating complex software systems

Also Published As

Publication number Publication date
CA2156058A1 (en) 1995-06-22
EP0685084A1 (en) 1995-12-06

Similar Documents

Publication Publication Date Title
US4696003A (en) System for testing interactive software
US9342437B2 (en) Backward post-execution software debugger
US5086393A (en) System for testing human factors and performance of a system program
US8914777B2 (en) Forward post-execution software debugger
US4772206A (en) Multi-mode teaching simulator
US7653899B1 (en) Post-execution software debugger with performance display
Kieras Model-based evaluation
Kieras A guide to GOMS model usability evaluation using NGOMSL
US8584097B2 (en) Post-execution software debugger with event display
US5220658A (en) System for testing a performance of user interactive-commands using an emulator-overlay for determining the progress of the user timing response
US20110283247A1 (en) Method of recording and replaying call frames for the testbench
EP0463732A2 (en) Method and system for animating the execution of a computer program
US8015552B1 (en) Post-execution software debugger with coverage display
Jovic et al. Automating performance testing of interactive java applications
US5513316A (en) Method and apparatus for exercising an integrated software system
EP0685084A1 (en) Automatic gathering and graphical display of useability test data
Al-Qaimari et al. KALDI: a computer-aided usability engineering tool for supporting testing and analysis of human-computer interaction
US8078590B2 (en) Data processing system
Schulman Hardware measurement device for IBM system/360 time sharing evaluation
CN114897296A (en) RPA flow labeling method, execution process playback method and storage medium
EP0236746B1 (en) System and method for testing human factors
Tabatabai Investigation of decision making process: A hypermedia approach
CA1293057C (en) System for testing human factors and performance of a system program
JP2530841B2 (en) Program performance measurement method
Dauphin et al. PEPP: Performance Evaluation of Parallel Programs

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA JP

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

WWE Wipo information: entry into national phase

Ref document number: 2156058

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 1995905852

Country of ref document: EP

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWW Wipo information: withdrawn in national office

Ref document number: 1995905852

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1995905852

Country of ref document: EP