US20040185424A1 - Method for scoring and delivering to a reader test answer images for open-ended questions - Google Patents

Method for scoring and delivering to a reader test answer images for open-ended questions

Info

Publication number
US20040185424A1
Authority
US
United States
Prior art keywords
answer
scoring
method recited
page
reader
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/765,749
Inventor
Bernard Kucinski
Jose Gonzalez
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harcourt Assessment Inc
Original Assignee
Harcourt Assessment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harcourt Assessment Inc filed Critical Harcourt Assessment Inc
Priority to US10/765,749
Publication of US20040185424A1
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06KGRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K17/00Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K17/0032Apparatus for automatic testing and analysing marked record carriers, used for examinations of the multiple choice answer type

Abstract

A method for scoring an answer page containing an answer to an open-ended question includes viewing a first visual image of a first portion of an answer page. The first portion contains an answer space in which an answer to an open-ended question is expected to reside. If the first portion of the answer page contains a complete answer, the answer is electronically scored. If the first portion of the answer page does not encompass a complete answer, a second visual image of a second portion of the answer page is accessed and viewed. The second portion contains a sector of the answer page outside the answer space. A method is also provided for delivering an answer page to a reader for scoring.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of and incorporates by reference co-pending application Ser. No. 10/113,035, filed Apr. 1, 2002, now U.S. Pat. No. 6,684,052, which itself is a continuation of application Ser. No. 09/707,252, filed Nov. 6, 2000, now U.S. Pat. No. 6,366,760, issued Apr. 2, 2002, which itself is a divisional application of application Ser. No. 08/903,646, filed Jul. 31, 1997, now U.S. Pat. No. 6,173,154, issued Jan. 9, 2001, which are commonly owned with the present invention and which are incorporated herein by reference.[0001]
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0002]
  • The present invention relates to systems and methods for imaging test answer sheets and, more particularly, to automated systems and methods for processing and storing test answer sheet images that include answers to open-ended questions. [0003]
  • 2. Description of Related Art [0004]
  • The automation of test scoring is a complex problem that has generated a great deal of interest, owing to significant economic pressure to optimize efficiency and accuracy and to minimize human involvement. Optical mark reading (OMR) systems are well known in the art, for example, systems for scanning forms having pencil marks within preprinted areas such as ovals. OMR systems generally sense data recorded within the preprinted areas by detecting light absorbed in the near infrared, which is referred to as NIR scanning. This method permits the differentiation of the pencil marks from the preprinted information, which is provided in a pigment that does not absorb in the NIR. OMR systems thus permit a gathering of data that is easily converted into digital form, scored against an answer database, and saved without consuming excessive storage space. [0005]
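  • To make the NIR detection idea concrete, the sketch below classifies a bubble as filled when the mean darkness of its region in an NIR-filtered image exceeds a threshold. This is a minimal illustration only; the array layout, region size, and threshold are assumptions introduced here, not details from any OMR product or from this application.

```python
# Minimal illustration of OMR-style bubble detection on an NIR-filtered image.
# The array layout, threshold value, and region size are assumptions for
# illustration only, not details taken from the patent.
import numpy as np

def bubble_is_filled(nir_image: np.ndarray, row: int, col: int,
                     size: int = 12, darkness_threshold: float = 0.45) -> bool:
    """Return True if the mean darkness inside a bubble region exceeds a threshold.

    nir_image: 2-D array of reflectance values in [0, 1]; pencil marks absorb
    NIR, so marked bubbles reflect less light than the dropout-ink preprint.
    """
    region = nir_image[row:row + size, col:col + size]
    darkness = 1.0 - float(region.mean())   # low reflectance -> high darkness
    return darkness > darkness_threshold

# Example: a synthetic page that is bright everywhere except one filled bubble.
page = np.ones((200, 200))
page[50:62, 80:92] = 0.2                    # simulated pencil-filled bubble
print(bubble_is_filled(page, 50, 80))       # True
print(bubble_is_filled(page, 120, 120))     # False
```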
  • An additional level of complexity is added, however, with the inclusion of open-ended or essay-type questions. These questions must typically be scored by a human reader, and thus either the physical test form or a visible image thereof must be available for at least the time required for scoring. A digitally stored visible image can be obtained by an image processing apparatus, for example. [0006]
  • A multiplicity of systems and methods for addressing the scoring of test answer sheets have been disclosed in the art. For example, Poor (U.S. Pat. No. 5,452,379), Keogh et al. (U.S. Pat. No. 5,134,669), and Clark and Clark et al. (U.S. Pat. Nos. 5,321,611; 5,433,615; 5,437,554; 5,458,493; 5,466,159; and 5,558,521) disclose systems and methods for combining OMR and image processing wherein only a predefined area of a document (an “area of interest”) is captured and stored. [0007]
  • Another aspect of the problem of processing test answer sheets having both multiple-choice and open-ended questions involves the scanning apparatus used to convert a written document into digital data. The use of combined OMR and image capture devices is disclosed by Poor '379, Keogh et al. '669, and Clark et al. '554. [0008]
  • SUMMARY OF THE INVENTION
  • It is therefore an object of the present invention to provide a system and method for processing and scoring test answer sheets having both multiple-choice and open-ended questions. [0009]
  • It is another object to provide such a system and method that retains a full image of a test form so that it is retrievable by a scorer. [0010]
  • It is an additional object to provide such a system and method that captures OMR and image data in a unitary device. [0011]
  • It is a further object to provide such a system and method that obviates the need for trigger or timing marks on a test form. [0012]
  • It is yet another object to provide such a system and method that distributes answers for scoring to a qualified reader. [0013]
  • It is yet an additional object to provide a flexible system architecture for imaging test answer sheets, storing the images, and distributing the images to a qualified reader for scoring. [0014]
  • It is yet a further object to provide such a system and method that includes a tool for performing a geometric measurement upon a displayed image of an answer sheet. [0015]
  • These and other objects are provided by the imaging and scoring system and method of the present invention. The system includes integrated hardware elements and software processes for capturing optical mark and full visual images of an answer page, for storing the images, for retrieving the images, for distributing the visual images to a reader for scoring, for assisting the reader in scoring, and for monitoring the reader's performance. [0016]
  • The scanning system comprises means for sequentially advancing each page of a plurality of answer pages along a predetermined path. Positioned along the path are mark imaging means (OMR, optical mark recognition; OCR, optical character recognition) for capturing a location of an optical mark on each answer page and visual imaging means for capturing a full visual image of each answer page. A forms database in a server is provided that contains data on the physical location and type (e.g., multiple-choice or open-ended) of each answer on each page. Software means resident in the server operate with the forms database to determine whether the captured image contains an answer to an open-ended question. If such an open-ended answer is supposed to be found on the page being imaged, the full visual image of the page is stored. [0017]
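  • As a rough illustration of the store-or-skip decision described above, the sketch below keys a captured page to a forms database and keeps the full visual image only when the form expects an open-ended answer on that page. The data layout and names are assumptions for illustration, not the application's actual software.

```python
# Sketch of the store-or-skip decision driven by a forms database.
# The dictionary layout and helper names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class AnswerField:
    question_id: str
    kind: str              # "multiple-choice" or "open-ended"
    sector: tuple          # (x, y, width, height) of the answer space on the page

# Hypothetical forms database: form id -> page number -> answer fields on that page.
FORMS_DB = {
    "LIT-6": {1: [AnswerField("Q1", "multiple-choice", (50, 100, 300, 40)),
                  AnswerField("Q2", "open-ended", (50, 200, 500, 350))]},
}

def should_store_full_image(form_id: str, page_number: int) -> bool:
    """Store the full visual image only if the form expects an open-ended answer here."""
    fields = FORMS_DB.get(form_id, {}).get(page_number, [])
    return any(f.kind == "open-ended" for f in fields)

def process_page(form_id: str, page_number: int, visual_image: bytes, store) -> None:
    if should_store_full_image(form_id, page_number):
        store(form_id, page_number, visual_image)   # e.g., write to the image server
    # Otherwise only the OMR data are kept and the visual image is discarded.
```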
  • In a particular embodiment the scanner further comprises means for aligning the page image without the use of timing or tracking marks. The aligning means comprises means for detecting a page edge, which is sufficient for pages having only open-ended answers. [0018]
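  • One way to align a page image from a detected edge, rather than from printed timing marks, is to estimate the skew of that edge and rotate the image to cancel it. The sketch below assumes the edge has already been located as sampled (x, y) points and that it should end up vertical; it illustrates the idea only and is not the scanner's actual algorithm.

```python
# Sketch: deskew a page image using a detected page edge instead of printed
# timing marks. The edge is assumed to be given as sampled (x, y) points, and
# the sign convention assumes row indices increase downward.
import math
import numpy as np
from PIL import Image   # assumed available; any raster library would do

def skew_angle_from_edge(edge_points: list[tuple[float, float]]) -> float:
    """Fit a line to the detected edge and return its deviation from vertical, in degrees."""
    xs = np.array([p[0] for p in edge_points])
    ys = np.array([p[1] for p in edge_points])
    slope, _ = np.polyfit(ys, xs, 1)         # x as a function of y for a near-vertical edge
    return math.degrees(math.atan(slope))

def align_page(image: Image.Image, edge_points: list[tuple[float, float]]) -> Image.Image:
    """Rotate the scanned image so the detected page edge becomes vertical."""
    angle = skew_angle_from_edge(edge_points)
    return image.rotate(angle, expand=True, fillcolor="white")
```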
  • The present invention further includes a system and method for distributing one of a batch of answer images to a reader for scoring. The answer images typically comprise open-ended answers such as are obtained from the scanning system and method as described above. Preferably, each batch of answer images is from a common test, although this is not intended as a limitation. [0019]
  • The method comprises the steps of fetching a batch of answers to a test question from a storage device and placing them in a temporary cache. These fetching and temporary storing steps are preferably under the control of a server. This server contains a database associating each answer batch with a qualification required of a reader. Another database resident therein contains a list of qualifications possessed by each reader. [0020]
  • A reader who is in electronic communication with the cache indicates a readiness for scoring, and that reader's qualifications, which are resident in the server, permit the routing to the reader of one of an available batch of answers based upon predetermined criteria such as priority associated with a test to be scored. An answer image from an appropriate answer batch is electronically delivered to the reader's workstation for scoring. Once the scoring of that answer is complete, the server will distribute additional answer images to that reader until the batch is completely scored or the reader exits the system. Typically, a similarly qualified group of readers score answer images from the same batch. [0021]
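  • The distribution logic can be pictured as follows: batches sit in a cache, each batch carries a required qualification and a priority, and a reader who signals readiness receives the next image from the highest-priority batch he or she is qualified to score. The sketch below is a schematic under assumed names and structures, not the system's code.

```python
# Sketch of qualification-based distribution of answer images to readers.
# Batch fields, qualification strings, and priorities are illustrative assumptions.
from collections import deque
from dataclasses import dataclass, field

@dataclass(order=True)
class AnswerBatch:
    priority: int                              # lower value = scored first
    required_qualification: str = field(compare=False)
    images: deque = field(compare=False, default_factory=deque)

class Distributor:
    def __init__(self):
        self.cache: list[AnswerBatch] = []     # batches prefetched from long-term storage
        self.reader_qualifications: dict[str, set[str]] = {}

    def register_reader(self, reader_id: str, qualifications: set[str]) -> None:
        self.reader_qualifications[reader_id] = qualifications

    def next_image(self, reader_id: str):
        """Return the next answer image this reader is qualified to score, or None."""
        quals = self.reader_qualifications.get(reader_id, set())
        for batch in sorted(self.cache):       # highest-priority batch first
            if batch.required_qualification in quals and batch.images:
                return batch.images.popleft()
        return None

# Usage: a reader qualified for "grade-6 math" keeps receiving images from that
# batch until it is exhausted or the reader logs off.
```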
  • The present invention additionally includes a system and method for displaying a test answer page to a reader for scoring. In this aspect, the page number for a particular test is used to access a forms layout database, which contains a location of the sector on which the open-ended question is expected to be found. The page image is then formatted to display that answer sector to the reader. Means are also provided for permitting access to the remainder of the page, such as by scrolling on a workstation screen, or to additional pages if the item answer covers multiple pages. [0022]
  • Formatting also comprises providing a scoring protocol for the answer and displaying commensurate indicia to the reader to assist in scoring. For example, a button bar can be displayed on a screen, an item of which can be selected for entering a score. [0023]
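  • To make the formatting step concrete, the sketch below uses a forms-layout record to crop the expected answer sector from the page image and to derive the score-entry choices shown to the reader; the full page is retained so the reader can still scroll beyond the crop. Field names and the cropping call are assumptions for illustration.

```python
# Sketch: format an answer page for display using a forms-layout record.
# The record fields and the cropping call are illustrative assumptions.
from dataclasses import dataclass
from PIL import Image

@dataclass
class ItemLayout:
    page_number: int
    answer_sector: tuple[int, int, int, int]   # (left, top, right, bottom) in pixels
    score_range: tuple[int, int]               # e.g., (1, 5)
    invalid_categories: tuple[str, ...] = ("blank", "foreign language", "off-topic")

def format_for_reader(page_image: Image.Image, layout: ItemLayout):
    """Return the cropped answer sector plus the button-bar labels for this item."""
    sector_view = page_image.crop(layout.answer_sector)
    low, high = layout.score_range
    buttons = [str(s) for s in range(low, high + 1)] + list(layout.invalid_categories)
    return sector_view, buttons

# The full page_image is kept available so the workstation can scroll beyond the
# cropped sector if the student wrote outside the answer space.
```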
  • Another scoring facilitator available to the reader comprises a geometric measurement tool that can be superimposed on an answer and manipulated to provide an indication of how close to an “ideal” answer the student has come. [0024]
  • Scoring is also assisted by an electronic querying system and method, whereby a query is electronically transmitted to successively higher levels of supervisors until an answer can be obtained. The answer is then electronically relayed back through the same levels so that all intermediate personnel can benefit from the knowledge. [0025]
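  • The escalation-and-relay behavior amounts to walking up a supervisor chain until someone can answer, then replaying the answer back down through every level visited and finally to the originating reader. The sketch below models that path with assumed callback names; it is not the system's actual messaging code.

```python
# Sketch of the query escalation and relay-back path through supervisor levels.
# The chain structure and callback names are illustrative assumptions.
from typing import Callable, Optional

def route_query(reader_id: str,
                question: str,
                supervisor_chain: list[str],
                can_answer: Callable[[str, str], Optional[str]],
                notify: Callable[[str, str], None]) -> Optional[str]:
    """Send a query up the chain; relay the answer back through every level visited."""
    visited: list[str] = []
    answer: Optional[str] = None
    for supervisor in supervisor_chain:        # next level up, then the next, ...
        visited.append(supervisor)
        answer = can_answer(supervisor, question)
        if answer is not None:
            break
    if answer is not None:
        for level in reversed(visited[:-1]):   # intermediate supervisors see the answer
            notify(level, answer)
        notify(reader_id, answer)              # finally relayed to the originating reader
    return answer                              # None if no level could answer
    # While the query is in flight, the reader can keep scoring other answers.
```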
  • In order to monitor the scoring effectiveness of a reader, means are provided for transmitting a calibration answer for scoring. The reader is unaware that this is not another answer in the regular workflow queue. The score granted by the reader can be compared against a target score to judge that reader's effectiveness. In addition, scoring time can be tracked to obtain a measure of scoring speed. Similarly, the calibration answer can be given to a plurality of readers for obtaining effectiveness and speed statistics for a group of readers. [0026]
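  • The effectiveness and speed checks reduce to comparing the reader's entered score and elapsed time against targets attached to the calibration answer, as in the sketch below; the tolerance and time values are purely illustrative assumptions.

```python
# Sketch of reader calibration: compare the entered score and scoring time
# against the targets attached to a calibration answer. Thresholds are
# illustrative assumptions, not values from the patent.
from dataclasses import dataclass

@dataclass
class CalibrationResult:
    score_matches_target: bool
    within_target_time: bool

def evaluate_calibration(entered_score: int, elapsed_seconds: float,
                         target_score: int, target_seconds: float,
                         tolerance: int = 0) -> CalibrationResult:
    return CalibrationResult(
        score_matches_target=abs(entered_score - target_score) <= tolerance,
        within_target_time=elapsed_seconds <= target_seconds,
    )

# The same calibration answer can be sent to every reader in a group to build
# effectiveness and speed statistics for the group as a whole.
print(evaluate_calibration(entered_score=4, elapsed_seconds=95.0,
                           target_score=4, target_seconds=120.0))
```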
  • The features that characterize the invention, both as to organization and method of operation, together with further objects and advantages thereof, will be better understood from the following description used in conjunction with the accompanying drawing. It is to be expressly understood that the drawing is for the purpose of illustration and description and is not intended as a definition of the limits of the invention. These and other objects attained, and advantages offered, by the present invention will become more fully apparent as the description that now follows is read in conjunction with the accompanying drawing. [0027]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic of a hardware configuration of a preferred embodiment of the scoring system. [0028]
  • FIG. 2 is a schematic of the data processing functions and applications of the scoring system. [0029]
  • FIG. 3 is a schematic of a network architecture useful in the scoring system. [0030]
  • FIG. 4 is a flowchart of representative image processing and storing steps in the method of the present invention. [0031]
  • FIG. 5 is a flowchart of a representative process for distributing an answer to a reader for scoring in the method of this invention. [0032]
  • FIG. 6 is a flowchart of representative steps in the scoring process of the present invention following the distribution of an answer to a reader. [0033]
  • FIG. 7A illustrates an exemplary page of a literature test having one multiple-choice question and one open-ended question. [0034]
  • FIG. 7B illustrates a display of the image processed from the page of FIG. 7A as displayed to a reader for scoring. [0035]
  • FIG. 8A illustrates an exemplary page of a geometry test having one multiple-choice question and one question requiring the student to draw a diagram. [0036]
  • FIG. 8B illustrates a display of the image processed from the page of FIG. 8A as displayed to a reader for scoring. [0037]
  • FIG. 9 is a flowchart of representative steps in the reader calibration process of the present invention for tracking scoring efficiency and effectiveness. [0038]
  • FIG. 10 illustrates an exemplary header sheet for a batch of test booklets. [0039]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • A description of the preferred embodiments of the present invention will now be presented with reference to FIGS. 1-10. [0040]
  • The Image Capturing and Storage System and Method [0041]
  • A schematic of a hardware configuration of a preferred embodiment of the present invention is illustrated in FIG. 1, which includes the imaging and image storing elements, and in FIG. 3, which includes the network architecture. Software application elements are included in the data processing flow diagram of FIG. 2. A flowchart of representative image processing and storing steps is given in FIG. 4, and two exemplary answer pages are illustrated in FIGS. 7A and 8A. The imaging and scoring system 10 hardware elements include a scanner 20 for imaging answer pages. A preferred embodiment of the scanner 20 comprises a modified Scan-Optics 9000 unit, rated for 120 pages/min. [0042]
  • Standardized tests are typically given in batches to students belonging to a particular group, for example, a plurality of sixth-grade students from different schools and different classrooms in a particular geographical region. Each student receives a coded booklet comprising a plurality of pages, and, following test administration, all the test booklets are delivered to a scoring center for processing. A header page 13 (FIG. 10) provides alphanumeric character and OMR-readable data for tracking the booklets. Header page 13 includes, for example, such information as teacher name 131 (“Mrs. Smith”), grade level 133 (“6”), and school code 132 (134274), the latter two having an associated “bubble” filled in for each number. This configuration is exemplary and is not intended as a limitation. One or more of such batches may together form an “order,” and a number is also assigned to track this (e.g., all Grade 6 classes in Greenwich, Conn.). Another tracking means comprises “cart number,” which indicates a physical location of the booklets. Each test booklet is entered, for example, via bar code, for later demographic correlation with scores, and is cut apart into individual, usually two-sided pages (FIG. 4, step 899). [0043]
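  • For orientation, the tracking hierarchy just described (booklet, batch, order, and cart) can be pictured as a small set of linked records, as in the sketch below; the field names are illustrative assumptions, not the system's actual schema.

```python
# Sketch of the booklet-tracking hierarchy: booklets belong to a batch (e.g., one
# classroom), batches roll up into an order, and a cart number records where the
# physical booklets sit. Field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class HeaderPage:
    teacher_name: str       # e.g., "Mrs. Smith"
    grade_level: str        # e.g., "6", also bubbled for OMR reading
    school_code: str        # e.g., "134274", also bubbled for OMR reading

@dataclass
class Booklet:
    barcode: str            # scanned for later demographic correlation with scores
    batch_id: str

@dataclass
class Batch:
    batch_id: str
    header: HeaderPage
    order_number: str       # e.g., all Grade 6 classes in one district
    cart_number: str        # physical location of the booklets
```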
  • The test booklet pages are stacked sequentially into an entrance hopper 201 of a scanner 20, and each page 12 is fed by methods well known in the art onto a belt 21 for advancing the page 12 along a predetermined path (FIG. 4, step 900). The belt 21 has a substantially transparent portion for permitting the page 12 to be imaged on both sides simultaneously by two sets of cameras. [0044]
  • A first set of cameras includes an upper 22 and a lower 23 camera, each filtered for infrared wavelengths. This set 22, 23 is for optical mark recognition (OMR), used to detect the location of pencil marks, for example, filled-in bubbles such as are common in multiple-choice answers, on both sides of the page 12 (step 903). Alternatively, OCR marks are detected and processed (step 903). [0045]
  • The OMR scan data are greyscale processed by means 42 known in the art for detection of corrections and erasures. The data are then routed to a long-term storage device (step 906), such as magnetic tape 41, for later scoring and further processing in a mainframe computer 40. [0046]
  • A second set of cameras includes an upper 24 and a lower 25 camera, each substantially unfiltered. This set 24, 25 is for capturing a full visual image of both sides of the page 12 (step 907). [0047]
  • The page 12 continues along the path on the belt 21 and is collected in sequence with previously scanned pages in an exit hopper 202. [0048]
  • The scanner 20 is under the control of a first server 26, such as a Novell server, which performs a plurality of quality-control functions interspersed with the imaging functions. Software means 261 resident in the first server 26 determine that each page being scanned is in sequence (step 904) from preprinted marks on the page indicating page number. If it is not, the operator must correct the sequence before being allowed to continue scanning (step 905). [0049]
  • The first server 26 also has software means 262 for determining whether the page 12 is scannable (step 901). Pages containing OMR data contain timing tracks 125, as are known in the art (see FIG. 7A), for orienting the page with respect to optical mark position. A page missing these tracks is not scannable, and a substitute page marked “unscannable” is placed into the document, indicating to the reader that a request for a hard copy must be made before this page can be scored (step 902). [0050]
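  • A minimal version of these pre-scan quality checks is sketched below: verify that the page number follows the previous one, and require timing tracks when the page carries OMR data, flagging the page otherwise. The flag values and helper names are assumptions for illustration.

```python
# Sketch of the per-page quality-control checks performed during scanning.
# Flag names and the detection inputs are illustrative assumptions.
def check_page(page_number: int, expected_page_number: int,
               has_omr_data: bool, timing_tracks_found: bool) -> str:
    """Return 'ok', 'out-of-sequence', or 'unscannable' for a page entering the scanner."""
    if page_number != expected_page_number:
        return "out-of-sequence"          # operator must correct order before continuing
    if has_omr_data and not timing_tracks_found:
        return "unscannable"              # substitute page inserted; hard copy needed to score
    return "ok"

print(check_page(3, 3, has_omr_data=True, timing_tracks_found=True))    # ok
print(check_page(5, 4, has_omr_data=True, timing_tracks_found=True))    # out-of-sequence
print(check_page(4, 4, has_omr_data=True, timing_tracks_found=False))   # unscannable
```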
  • In addition, a screen 27 is in communication with the first server 26 that displays to the operator a preselected number of visual images (step 911). For example, the operator may choose to view every nth page scanned. Should the quality be deemed insufficient (step 912), the scanner 20 is stopped (step 913), maintenance functions or repairs are performed (step 914), and the affected group of pages is rescanned (step 900). This is a custom-designed function, a scanning activity monitor, that automatically searches the output files looking for the latest cart-stack combination and then displays the latest images from the cameras 24, 25 for operator review. [0051]
  • The first server 26 further contains a forms database 265 of answer pages that comprises data on the physical location and type of each answer for each page in the answer booklet. The answer type may be, for example, an answer to an open-ended question or a multiple-choice question. FIG. 7A illustrates a sample page 12 from a literature test, wherein Question #1 71 is multiple-choice and Question #2 72 is open-ended, with an answer space 73 provided for writing an answer 74. Likewise, FIG. 8A illustrates a sample page 12′ from a geometry test, wherein Question #1 81 is multiple-choice and Question #2 82 is open-ended, with an answer space 83 provided for drawing a diagram 84. A correlation is performed between the page number and the forms database (step 908) to determine whether the page 12, 12′ contains an open-ended answer. If so (step 909), the page image is prepared for storage (step 910); if not, the page image is not saved. [0052]
  • The first server 26 also contains means for detecting an edge, preferably an uncut edge 120, of the imaged page. Edge detection is utilized to align the visual image for answer pages containing only open-ended answers. This is beneficial for several reasons:
  • (1) the answer booklets are more economical to produce, since tracks do not need to be printed and printing accuracy is less important; (2) there is less chance of tampering; and (3) the booklets have greater aesthetic appeal. [0054]
  • A page image that is to be saved is stored temporarily in a second server, comprising a fast storage server 28 (step 915) that has a response time sufficiently fast to keep pace with the visual image scanning step 907. Such a second server 28 may comprise, for example, a Novell 4.x, 32-MB RAM processor with a 3-GB disk capacity. Means are provided here for ensuring that the OMR and image data are in synchrony (step 916). If they are not, data may have to be reconstructed or images rescanned (step 917). [0055]
  • The data are transferred at predetermined intervals to a third server 30 having software means 302 resident therein for performing a high-performance image indexing (HPII) on the visual image (step 918). This is for processing the data for optical storage and retrieval (OSAR). Third server 30 may comprise, for example, a UNIX 256-MB RAM processor with a 10-GB disk capacity having 3.2.1 FileNet and custom OSAR software resident thereon. [0056]
  • The answer images are finally transferred to a long-term storage unit 34 (step 919) for later retrieval. Such a unit 34 may comprise, for example, one or more optical jukeboxes, each comprising one or more optical platters. Preferably two copies are written, each copy to a different platter, for data backup. [0057]
  • Next the transaction log data are transferred to a fourth server 32. Fourth server 32 may comprise, for example, a UNIX 64-MB RAM processor having Oracle and FileNet software resident thereon. [0058]
  • The Distribution and Queue Monitoring System and Method [0059]
  • Once a complete batch of answer pages (a “batch” comprising, for example, all the test booklets from a particular grade level at a particular school) has been imaged and stored, scoring can commence. FIG. 5 is a flowchart of an exemplary distribution process of the present invention, wherein a first step 950 comprises determining an answer batch from a queue to be scored during a particular time period. [0060]
  • In a preferred embodiment, a determination is made prior to the start of a scoring session as to which batches of answers are desired to be scored during that session. This determination may be based, for example, on predetermined criteria including an assigned priority, project number, order number, and number and type of readers available, and is entered into a fifth server 36, which provides a communication link between the fourth server 32, the cache 38, reader workstations 50, and the mainframe 40, as will be discussed in the following (FIG. 1). Fifth server 36 comprises, in an exemplary embodiment, a DEC-Alpha server having 512 MB RAM and a 12-GB disk capacity, with 3.2c UNIX and 7.2.2.3 Oracle resident therein. [0061]
  • The desired batches are prefetched (step 951) from the long-term storage unit 34 and temporarily stored (step 952) in a cache 38, as directed by the OSAR system 322 in the fourth server 32 under the control of the fifth server 36. These prefetching and temporary storage steps 951, 952 confer a speed advantage over having readers access the long-term storage unit 34 directly, which is comparatively slow, whereas the cache 38 response time is rapid. An exemplary cache 38 for use in the system comprises a FileNet residing on the OSAR server and contains 12 GB of magnetic storage for this transient database. [0062]
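  • The speed argument here is that the cache answers quickly while the optical jukebox is comparatively slow, so every batch planned for the session is staged ahead of time. The sketch below is a schematic of that staging; the storage interfaces are assumptions that stand in for the jukebox and the FileNet-based cache.

```python
# Sketch of prefetching answer batches from slow long-term storage into a fast
# cache before a scoring session. The storage interfaces are assumptions; the
# real system used optical jukeboxes and a FileNet-based magnetic cache.
class LongTermStore:
    def fetch_batch(self, batch_id: str) -> list[bytes]:
        # Placeholder for a slow optical-jukebox retrieval.
        return [b"image-bytes" for _ in range(3)]

class SessionCache:
    def __init__(self):
        self._batches: dict[str, list[bytes]] = {}

    def load(self, batch_id: str, images: list[bytes]) -> None:
        self._batches[batch_id] = images           # fast magnetic storage

    def get(self, batch_id: str) -> list[bytes]:
        return self._batches.get(batch_id, [])

def prefetch(session_batches: list[str], store: LongTermStore, cache: SessionCache) -> None:
    """Stage every batch planned for the session so readers never wait on the jukebox."""
    for batch_id in session_batches:
        cache.load(batch_id, store.fetch_batch(batch_id))

cache = SessionCache()
prefetch(["grade6-math-001", "grade6-math-002"], LongTermStore(), cache)
print(len(cache.get("grade6-math-001")))   # 3
```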
  • The fifth server 36 contains a first database 362 associating each answer batch with a qualification required of a reader (e.g., sixth-grade math, New York State test). A second database 364 resident therein contains a list of qualifications possessed by each reader. A third database 366 resident therein contains the form data for each answer, including the number of questions and pages in the test, how each answer is to be scored, and in what form the answer image is to be presented to a reader. For example, information on the page in FIG. 7A would include the location of the answer blank 73 to Question #2 and the answer scale to be used in scoring that question (e.g., a score of 1-5). [0063]
  • After the answer batch is lodged in the cache 38, the question qualification 362 and forms 366 databases are referenced (steps 953 and 954), and a work queue is established, which is selected by a supervisor managing a group of readers (step 955). [0064]
  • When a reader logs onto a workstation 50, his or her qualifications will have been checked by the supervisor. The reader receives an answer from the chosen batch for scoring (step 957). The answer image is formatted for display (step 958) and delivered to the reader's workstation 50 (step 959). [0065]
  • The formatting step 958 comprises accessing the forms database 366 to determine how the answer image and scoring protocol are to be displayed to the reader. For example, an area of interest 73 (FIG. 7A) or 83 (FIG. 8A), which comprises the space left for writing in an answer, is delineated on each page image, and it is this area that initially appears on the reader's workstation screen 51 (FIGS. 7B and 8B). An important feature of the present invention is that the reader can also access the remainder of the image if desired, which can be necessary if the student has written outside the area provided for that particular question (see FIG. 6, steps 988, 989); the answer may even spill over onto another page. Such access is typically provided by a scroll bar 510 such as is known in the art in Windows®-type applications (FIGS. 7B and 8B). This feature provides an advantage over other systems known in the art in which the visual image is clipped to include only a predetermined area of interest, in which case this extra display information is lost. [0066]
  • Once the reader has finished with an answer, a score is entered into the workstation 50 (step 960) and is delivered to and stored at the fifth server 36 (step 962). Next, the reader receives another answer to score from the same batch if there are additional answers to the same test question remaining in the queue (step 962). If that queue is empty, the supervisor selects another answer batch from the queue (step 955). Once the batch is completely scored, the scores are assembled and transmitted by the fifth server 36 to the mainframe 40 (step 965), where all the individual answer scores are correlated for each booklet and a total test score is calculated. This step typically occurs once per day. [0067]
  • The progress and speed of any particular reader or the status of a particular queue are monitored by accessing the fifth server 36, which maintains statistics (step 963) and a table of workflow queues (step 964). Access to this information may be limited, for example, to supervisory or managerial personnel by means known in the art. [0068]
  • The Scoring and Reader Monitoring System and Method [0069]
  • One aspect of the scoring system and method of the present invention is illustrated in the flowchart of FIG. 6, which provides further details of the steps occurring between step 957, the delivery of an answer to a reader for scoring, and step 960, the entry of a score, in FIG. 5. [0070]
  • As indicated above, the answer, prior to delivery (step 957), is formatted for electronically selecting an area of interest 73 or 83 for displaying to the reader, along with a scroll bar 75, 85 for permitting the reader to access the remainder of the page 12, 12′ (FIGS. 7A, 8A). The answer is also formatted for scoring protocol, and, as illustrated in FIGS. 7B and 8B, a score button bar 76, 86 is provided that corresponds to the scoring range for that question. In FIG. 7B, the scores are given on a scale of 1 to 5; in FIG. 8B, 1 to 4. Answers that cannot be given a numeric grade are considered invalid and are scored in a separate category (e.g., blank, foreign language, off-topic). [0071]
  • Scoring facilities such as are known in the art generally comprise groups of readers having similar qualifications who are assigned to types of questions to score. Such groups may be further subdivided into smaller groups, and a commensurate management tree structure created. Preferably this tree structure is mirrored in the hardware architecture (FIG. 3), wherein, for example, a supervisor has access to all reader workstations 50 in that group. [0072]
  • To proceed with scoring, the formatted answer and score button bar 76, 86 are displayed to the reader (step 980). If the reader has a question regarding the scoring protocol (step 981), a query is sent electronically upline to the reader's next-level supervisor (step 982). If that supervisor can answer the question (step 983), a response is transmitted electronically to the reader (step 984); if that supervisor cannot answer the question (step 983), the query is transmitted upline to the next-level supervisor (step 982), looping through as many levels of supervisors as are present until the query can be addressed. When the query is answered, the answer is relayed to the reader through all intermediate query relayers (step 984) so that all levels of personnel can view the answer to the query. While the query is being routed, the reader can continue scoring another answer. [0073]
  • Once the query is answered, or if there was no query, the reader can continue scoring that answer. If the test is in geometry or some other discipline wherein an answer can comprise the drawing of a diagram, a software tool is made available to the reader to assist in scoring (step 985). If needed, the geometric tool is fetched (step 986) and utilized to score the answer. In the example shown in FIG. 8B, a right triangle was drawn, and thus a floating protractor 87 can be used to measure the right angle 840. Also available are screen-manipulable tools for measuring areas, lines, and circles. This software in the preferred embodiment comprises a custom-designed package. [0074]
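  • As an illustration of the measurement behind such a tool, the sketch below computes the angle formed at a vertex by two segments a reader might position over the student's drawing, which is the kind of check a floating protractor supports. The interaction layer is omitted and all names are assumptions.

```python
# Sketch of the measurement behind a floating protractor: the angle at a vertex
# between two segments the reader positions over the student's drawing.
# The on-screen dragging of the tool is omitted.
import math

def angle_at_vertex(vertex: tuple[float, float],
                    point_a: tuple[float, float],
                    point_b: tuple[float, float]) -> float:
    """Return the angle in degrees formed at `vertex` by rays toward point_a and point_b."""
    ax, ay = point_a[0] - vertex[0], point_a[1] - vertex[1]
    bx, by = point_b[0] - vertex[0], point_b[1] - vertex[1]
    dot = ax * bx + ay * by
    norm = math.hypot(ax, ay) * math.hypot(bx, by)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# A reader checking whether a drawn triangle really has a right angle:
print(round(angle_at_vertex((0, 0), (5, 0), (0, 3)), 1))   # 90.0
```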
  • The reader then determines if the image display is sufficient for scoring the answer (step 987). If so, the reader can score the answer (step 960); if not, the reader can use the scroll bar 510 to access another area of the page, or an area on another page, to view additional parts of the visual image (step 988). [0075]
  • Another aspect of the present invention includes a system and method for monitoring the scoring effectiveness of a reader, the steps for which are included in the flowchart of FIG. 9. A group supervisor, for example, sends a calibration answer having a predetermined target answer to a reader (step 990). This answer is interspersed with “real” student answers and is substantially identical in form thereto, which permits the calibration to be performed transparently. [0076]
  • A score entered by the reader (step [0077] 991) is collected (step 992) and electronically compared with the target score (step 993) for providing an indication of effectiveness (step 994). At the same time, the scoring time can be collected (step 992) and compared with a target scoring time (step 993) for a calculation of scoring efficiency (step 994).
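  • A hedged sketch of the calibration comparison of steps 991-994 follows; the tolerance parameter and the ratio used for efficiency are assumptions chosen only to make the calculation concrete.

```python
def calibration_check(entered_score: int, target_score: int,
                      scoring_time: float, target_time: float,
                      tolerance: int = 0) -> dict:
    """Compare a reader's score and time on a calibration answer against targets."""
    effective = abs(entered_score - target_score) <= tolerance       # steps 993-994
    efficiency = target_time / scoring_time if scoring_time > 0 else 0.0
    return {"effective": effective,
            "score_deviation": entered_score - target_score,
            "efficiency": round(efficiency, 2)}    # > 1.0 means faster than target

print(calibration_check(entered_score=3, target_score=4,
                        scoring_time=95.0, target_time=120.0))
```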
  • Another check is performed by an inconsistency application ([0078] 970, FIG. 2), which compares the holistic and analytic scores given to an answer. If these scores differ too widely, the scores are rechecked to ensure that an error was not made.
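  • The consistency check might be expressed as simply as the sketch below; the spread threshold is an assumed value.

```python
def needs_recheck(holistic_score: int, analytic_score: int, max_spread: int = 1) -> bool:
    """Flag an answer for rechecking when its holistic and analytic scores diverge too widely."""
    return abs(holistic_score - analytic_score) > max_spread
```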
  • As mentioned, scoring is typically performed by electronically linked groups of readers having similar qualifications. Thus the method illustrated in FIG. 9 can also be expanded to monitor the effectiveness and efficiency of the entire group of readers (steps [0079] 991-991″) substantially simultaneously if desired.
  • Statistics can also be amassed at the system level on scoring progress for each workflow queue, broken down into scoring groups or by individual readers. Because these statistics are collected continuously, the system provides enormous flexibility in optimizing scoring effort. [0080]
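  • For illustration, the sketch below aggregates scoring progress by workflow queue, scoring group, and individual reader; the record layout is hypothetical.

```python
from collections import defaultdict

def progress_statistics(score_records):
    """score_records: iterable of dicts with 'queue', 'group', and 'reader' keys."""
    by_queue, by_group, by_reader = defaultdict(int), defaultdict(int), defaultdict(int)
    for rec in score_records:
        by_queue[rec["queue"]] += 1
        by_group[rec["group"]] += 1
        by_reader[rec["reader"]] += 1
    return {"per_queue": dict(by_queue),
            "per_group": dict(by_group),
            "per_reader": dict(by_reader)}
```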
  • System Architecture and Software System Flow [0081]
  • An exemplary architecture for a preferred embodiment of the [0082] present system 10 is schematically illustrated in FIG. 3 and comprises a fiber distributed data interface (FDDI) 61 having a throughput of 100 Mbits/s. In this embodiment a 100-Mbit/s fiber backbone links the subsystems.
  • Connected to the [0083] FDDI 61 are the Novell server 28 and the UNIX servers 30 and 36. The cache 38 and the jukebox 34 are connected through the server 30. A first hub 62 is connected to the FDDI 61 and, via 10-Mbit/s lines, to the scanners 20, which output to magnetic tape 41, as shown in FIG. 1, and thence to mainframe 40. A second hub 63 is connected to the FDDI 61 and, via 10-Mbit/s lines, to the reader workstations 50. The second hub 63 acts as a concentrator and receives 100 Mbits/s from the FDDI 61. Each workstation 50 has a 10-Mbit/s Ethernet connection.
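  • The link topology and nominal speeds described above can be summarized as data; the scanner and workstation counts in this sketch are assumed for illustration only.

```python
# Nominal link speeds in Mbit/s; node names follow the reference numerals above.
topology = {
    "FDDI 61": {"uplink_mbps": 100, "connects": ["Novell server 28", "UNIX server 30",
                                                 "UNIX server 36", "hub 62", "hub 63"]},
    "hub 62":  {"uplink_mbps": 100, "downlink_mbps": 10,
                "connects": [f"scanner 20-{n}" for n in range(1, 4)]},     # count assumed
    "hub 63":  {"uplink_mbps": 100, "downlink_mbps": 10,
                "connects": [f"workstation 50-{n}" for n in range(1, 21)]},  # count assumed
}

def fan_in(hub: str) -> float:
    """Ratio of aggregate downlink capacity to the hub's uplink into the FDDI backbone."""
    node = topology[hub]
    return len(node["connects"]) * node["downlink_mbps"] / node["uplink_mbps"]

print(f"hub 63 fan-in: {fan_in('hub 63'):.1f}x")   # 20 x 10 Mbit/s into one 100-Mbit/s uplink
```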
  • It is believed that this architecture confers advantages over systems previously known in the art, which employ token rings having limited throughput and one server per group. The present system comprises central servers supporting all readers, which permits improved flexibility both in hardware and in software implementation. This architecture further permits the adaptation to remote scoring sites. [0084]
  • The software system flow is illustrated in FIG. 2, wherein each “scoring work unit” (SCO WRK UN), shown as [0085] 74 in FIG. 7A, comprises an answer image. The applications bear the same reference numbers as the steps they perform in the flowcharts. In addition, various caches are maintained between applications, including: transaction data 971 from the scanning operation 907; rescanned 972 and new booklet 973 information from HPII document committal; image quality work units 974 acted upon by the image quality application 912, the distributor application 957, the question application 981, and the scoring application 960; regular holistic and analytical scores 975 from the scoring 960, route 965, and question 981 applications; domain item questions 976, wherein pending questions are held until they are resolved; pending scores 977 for holding incomplete scores; calibration work units 978; and inconsistency work units 979.
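  • A simplified sketch of these inter-application caches as named queues follows; the queue API is illustrative, and only the cache names track the reference numerals above.

```python
from collections import deque

caches = {name: deque() for name in [
    "transaction_data_971", "rescanned_972", "new_booklet_973",
    "image_quality_work_units_974", "regular_scores_975",
    "domain_item_questions_976", "pending_scores_977",
    "calibration_work_units_978", "inconsistency_work_units_979",
]}

def hand_off(producer_cache: str, consumer_cache: str) -> None:
    """Move one work unit from the cache an upstream application filled
    to the cache the downstream application reads."""
    if caches[producer_cache]:
        caches[consumer_cache].append(caches[producer_cache].popleft())

# e.g. an image-quality work unit becomes a scored work unit downstream:
caches["image_quality_work_units_974"].append({"scoring_work_unit": 74})
hand_off("image_quality_work_units_974", "regular_scores_975")
```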
  • New Form Definition [0086]
  • The system of the present invention further comprises a table-driven system for entering new project configurations, including teams, forms, domains, and orders. This allows the scoring to be customized for each project without any recoding. [0087]
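  • A minimal sketch of such a table-driven configuration follows; the table categories (teams, forms, domains, orders) mirror those named above, while the field names and the validation step are assumptions.

```python
new_project = {
    "project": "Spring Writing Assessment",
    "teams":   [{"name": "Team A", "supervisor": "S1", "readers": ["R1", "R2", "R3"]}],
    "forms":   [{"form_id": "F-101", "pages": 4}],
    "domains": [{"domain": "essay", "score_range": (1, 5)},
                {"domain": "geometry", "score_range": (1, 4)}],
    "orders":  [{"district": "D-17", "forms": ["F-101"], "quantity": 12000}],
}

def validate_project(cfg: dict) -> bool:
    """Check that every table the scoring engine expects is present and non-empty,
    so a new project can be brought up from data alone, without recoding."""
    return all(cfg.get(table) for table in ("teams", "forms", "domains", "orders"))

assert validate_project(new_project)
```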
  • It may be appreciated by one skilled in the art that additional embodiments may be contemplated, including analogous systems and methods for processing questionnaires. [0088]
  • In the foregoing description, certain terms have been used for brevity, clarity, and understanding, but no unnecessary limitations are to be implied therefrom beyond the requirements of the prior art, because such words are used for description purposes herein and are intended to be broadly construed. Moreover, the embodiments of the apparatus illustrated and described herein are by way of example, and the scope of the invention is not limited to the exact details of construction. [0089]
  • Having now described the invention, the construction, the operation and use of preferred embodiment thereof, and the advantageous new and useful results obtained thereby, the new and useful constructions, and reasonable mechanical equivalents thereof obvious to those skilled in the art, are set forth in the appended claims. [0090]

Claims (25)

What is claimed is:
1. A method for scoring an answer page containing an answer to an open-ended question, the method comprising the steps of:
viewing a first visual image of a first portion of an answer page, the first portion comprising an answer space in which an answer to an open-ended question is expected to reside;
if the first portion of the answer page contains a complete answer, electronically scoring the answer;
if the first portion of the answer page does not encompass a complete answer, accessing and viewing a second visual image of a second portion of the answer page, the second portion comprising a sector of the answer page outside the answer space, and electronically scoring the answer.
2. The method recited in claim 1, wherein the viewing step comprises receiving the first and the second visual image through a processor onto a display device.
3. The method recited in claim 1, further comprising the steps of:
entering an electronic scoring system;
requesting to view an answer to score; and
receiving the first visual image from a queue comprising a plurality of answer page images.
4. The method recited in claim 1, wherein the electronically scoring step comprises selecting a numerical score for the answer from a score sector on a display.
5. The method recited in claim 4, wherein the score sector comprises a score button bar displayed on a common display with the first visual image.
6. The method recited in claim 1, further comprising the steps, if the first and the second visual image do not encompass a complete answer, of repeating the accessing and viewing steps until substantially the entire answer page is viewed.
7. The method recited in claim 6, wherein, if the entire answer page does not encompass a complete answer, viewing a first visual image of a second answer page to search for a complete answer.
8. The method recited in claim 1, wherein the viewing steps comprise receiving the first and the second visual image through a processor onto a display device, and the accessing step comprises electronically manipulating a scroll bar on the display device.
9. The method recited in claim 1, further comprising the step of, if a question occurs during the scoring step, electronically transmitting a query to a supervisor.
10. The method recited in claim 1, wherein the answer comprises an answer in verbal form.
11. The method recited in claim 1, wherein the answer comprises a geometric diagram.
12. The method recited in claim 11, further comprising the step of accessing an electronically manipulable display of a geometric tool for assessing the geometric diagram.
13. The method recited in claim 1, wherein the answer comprises a calibration answer, and further comprising the steps of receiving a score and comparing the received score with a target score.
14. The method recited in claim 13, further comprising the step of calculating a reader effectiveness from the comparing step.
15. The method recited in claim 13, further comprising the steps of calculating a time span between the first visual image viewing step and the scoring step, and comparing the time span with a target scoring time.
16. The method recited in claim 15, further comprising the step of calculating a reader efficiency from the time-span comparing step.
17. A method for delivering an answer page containing an answer to an open-ended question to a reader for scoring, the method comprising the steps of:
retrieving from a storage device a visual image of an answer page containing an answer to an open-ended question;
formatting a first display screen comprising a first visual image of a first portion of an answer page, the first portion comprising an answer space in which an answer to an open-ended question is expected to reside;
transmitting to a reader display device the first display screen; and
permitting reader visual access on the display device to a second portion of the answer page at least partially outside the first portion.
18. The method recited in claim 17, wherein the retrieving step comprises the steps of:
determining a batch comprising a plurality of answers to be scored, the plurality of answers comprising answers to a unitary test question;
fetching answer page images corresponding to the determined batch from a storage device; and
holding the fetched answer page images in a cache.
19. The method recited in claim 18, further comprising the steps, prior to the formatting and transmitting steps, of retrieving a scoring and a display protocol for answers to the test question.
20. The method recited in claim 19, wherein the scoring protocol comprises a numerical score range for answers to the test question.
21. The method recited in claim 20, wherein the first visual image comprises a score selection element, and the formatting step comprises including the score selection element as dictated by the display protocol.
22. The method recited in claim 21, wherein the score selection element comprises a score button bar.
23. The method recited in claim 17, wherein the first display screen further comprises a score selection element.
24. The method recited in claim 17, wherein, if the answer page does not encompass a complete answer, repeating the retrieving, formatting, transmitting, and permitting steps on a second answer page.
25. The method recited in claim 17, wherein the first display screen further comprises an electronically manipulable scroll bar, and wherein the permitting step comprises receiving a signal indicative of a manipulation of the scroll bar and transmitting to the reader display device a second display screen comprising the second portion of the answer page corresponding to the signal.
US10/765,749 1997-07-31 2004-01-27 Method for scoring and delivering to a reader test answer images for open-ended questions Abandoned US20040185424A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/765,749 US20040185424A1 (en) 1997-07-31 2004-01-27 Method for scoring and delivering to a reader test answer images for open-ended questions

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US08/903,646 US6173154B1 (en) 1997-07-31 1997-07-31 System and method for imaging test answer sheets having open-ended questions
US09/707,252 US6366760B1 (en) 1997-07-31 2000-11-06 Method for imaging test answer sheets having open-ended questions
US10/113,035 US6684052B2 (en) 1997-07-31 2002-04-01 Scanning system for imaging and storing the images of test answer sheets having open-ended questions
US10/765,749 US20040185424A1 (en) 1997-07-31 2004-01-27 Method for scoring and delivering to a reader test answer images for open-ended questions

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US10/113,035 Continuation US6684052B2 (en) 1997-07-31 2002-04-01 Scanning system for imaging and storing the images of test answer sheets having open-ended questions

Publications (1)

Publication Number Publication Date
US20040185424A1 true US20040185424A1 (en) 2004-09-23

Family

ID=25417866

Family Applications (4)

Application Number Title Priority Date Filing Date
US08/903,646 Expired - Lifetime US6173154B1 (en) 1997-07-31 1997-07-31 System and method for imaging test answer sheets having open-ended questions
US09/707,252 Expired - Lifetime US6366760B1 (en) 1997-07-31 2000-11-06 Method for imaging test answer sheets having open-ended questions
US10/113,035 Expired - Lifetime US6684052B2 (en) 1997-07-31 2002-04-01 Scanning system for imaging and storing the images of test answer sheets having open-ended questions
US10/765,749 Abandoned US20040185424A1 (en) 1997-07-31 2004-01-27 Method for scoring and delivering to a reader test answer images for open-ended questions

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US08/903,646 Expired - Lifetime US6173154B1 (en) 1997-07-31 1997-07-31 System and method for imaging test answer sheets having open-ended questions
US09/707,252 Expired - Lifetime US6366760B1 (en) 1997-07-31 2000-11-06 Method for imaging test answer sheets having open-ended questions
US10/113,035 Expired - Lifetime US6684052B2 (en) 1997-07-31 2002-04-01 Scanning system for imaging and storing the images of test answer sheets having open-ended questions

Country Status (5)

Country Link
US (4) US6173154B1 (en)
AU (1) AU750650B2 (en)
CA (1) CA2301007C (en)
GB (1) GB2343775B (en)
WO (1) WO1999006945A1 (en)

Families Citing this family (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6173154B1 (en) * 1997-07-31 2001-01-09 The Psychological Corporation System and method for imaging test answer sheets having open-ended questions
US6267601B1 (en) 1997-12-05 2001-07-31 The Psychological Corporation Computerized system and method for teaching and assessing the holistic scoring of open-ended questions
US6553369B1 (en) 1999-03-11 2003-04-22 Oracle Corporation Approach for performing administrative functions in information systems
US6341959B1 (en) * 2000-03-23 2002-01-29 Inventec Besta Co. Ltd. Method and system for learning a language
US6988895B1 (en) * 2001-01-12 2006-01-24 Ncs Pearson, Inc. Electronic test item display as an image with overlay controls
US6751351B2 (en) * 2001-03-05 2004-06-15 Nsc Pearson, Inc. Test question response verification system
US6675133B2 (en) * 2001-03-05 2004-01-06 Ncs Pearsons, Inc. Pre-data-collection applications test processing system
US20020123029A1 (en) * 2001-03-05 2002-09-05 Kristian Knowles Multiple server test processing workflow system
US6810232B2 (en) 2001-03-05 2004-10-26 Ncs Pearson, Inc. Test processing workflow tracking system
US6961482B2 (en) * 2001-03-05 2005-11-01 Ncs Pearson, Inc. System for archiving electronic images of test question responses
US6927872B2 (en) * 2001-07-25 2005-08-09 Hewlett-Packard Development Company, L.P. Data acquisition system and method using answer forms
JP3687785B2 (en) * 2001-08-15 2005-08-24 株式会社日本統計事務センター Scoring processing method and scoring processing system
US7077313B2 (en) * 2001-10-01 2006-07-18 Avante International Technology, Inc. Electronic voting method for optically scanned ballot
US7828215B2 (en) * 2001-10-01 2010-11-09 Avante International Technology, Inc. Reader for an optically readable ballot
US7635087B1 (en) 2001-10-01 2009-12-22 Avante International Technology, Inc. Method for processing a machine readable ballot and ballot therefor
US10347145B1 (en) 2001-10-05 2019-07-09 Vision Works Ip Corporation Method and apparatus for periodically questioning a user using a computer system or other device to facilitate memorization and learning of information
US6895213B1 (en) 2001-12-03 2005-05-17 Einstruction Corporation System and method for communicating with students in an education environment
JP2005520588A (en) * 2002-03-20 2005-07-14 ノヴァダク テクノロジーズ インコーポレイテッド System and method for visualizing fluid flow in a tube
US20030207246A1 (en) * 2002-05-01 2003-11-06 Scott Moulthrop Assessment and monitoring system and method for scoring holistic questions
US8892895B1 (en) 2002-05-07 2014-11-18 Data Recognition Corporation Integrated system for electronic tracking and control of documents
US6772081B1 (en) 2002-05-21 2004-08-03 Data Recognition Corporation Priority system and method for processing standardized tests
AU2003239936A1 (en) * 2002-05-31 2003-12-19 Vsc Technologies, Llc System for scoring scanned test answer sheets
US20040194036A1 (en) * 2002-11-14 2004-09-30 Magdalena Wolska Automated evaluation of overly repetitive word use in an essay
US6918768B2 (en) * 2003-01-31 2005-07-19 Enablearning, Inc. Computerized system and method for visually based education
US8385811B1 (en) 2003-02-11 2013-02-26 Data Recognition Corporation System and method for processing forms using color
US20040202992A1 (en) * 2003-04-14 2004-10-14 Scott Moulthrop Electronic test answer record quality assurance system and method
US7020435B2 (en) * 2003-06-12 2006-03-28 Harcourt Assessment, Inc. Electronic test answer record image quality improvement system and method
US20050154590A1 (en) * 2004-01-09 2005-07-14 Coffey Mark A. Method for assisting the grading and recording of educational tasks in an electronic gradebook using voice recognition
US7298901B2 (en) * 2004-04-07 2007-11-20 Scantron Corporation Scannable form and system
US20050237580A1 (en) * 2004-04-16 2005-10-27 Dave Coleman Scanner read head for images and optical mark recognition
US20050238260A1 (en) * 2004-04-16 2005-10-27 Dave Coleman Image and optical mark scanner with encryption
US20060003306A1 (en) * 2004-07-02 2006-01-05 Mcginley Michael P Unified web-based system for the delivery, scoring, and reporting of on-line and paper-based assessments
US20060035204A1 (en) * 2004-08-11 2006-02-16 Lamarche Wesley E Method of processing non-responsive data items
EP1650945A1 (en) * 2004-10-21 2006-04-26 Océ-Technologies B.V. Apparatus and method for automatically analysing a filled in questonnaire
JP4872214B2 (en) * 2005-01-19 2012-02-08 富士ゼロックス株式会社 Automatic scoring device
US20060166179A1 (en) * 2005-01-24 2006-07-27 Wiig Elisabeth H System and method for assessment of basic concepts
JP4807489B2 (en) * 2005-02-28 2011-11-02 富士ゼロックス株式会社 Teaching material processing apparatus, teaching material processing method, and teaching material processing program
US8170466B2 (en) * 2005-05-27 2012-05-01 Ctb/Mcgraw-Hill System and method for automated assessment of constrained constructed responses
US8608477B2 (en) * 2006-04-06 2013-12-17 Vantage Technologies Knowledge Assessment, L.L.C. Selective writing assessment with tutoring
US20070298404A1 (en) * 2006-06-09 2007-12-27 Training Masters, Inc. Interactive presentation system and method
US8358964B2 (en) * 2007-04-25 2013-01-22 Scantron Corporation Methods and systems for collecting responses
US20080280280A1 (en) * 2007-05-11 2008-11-13 Aplia, Inc. Method of capturing workflow
US8100694B2 (en) * 2007-06-11 2012-01-24 The United States Of America As Represented By The Secretary Of The Navy Infrared aimpoint detection system
US8738659B1 (en) 2007-10-22 2014-05-27 Data Recognition Corporation Method and apparatus for managing priority in standardized test and survey imaging
US8526055B1 (en) 2007-10-22 2013-09-03 Data Recognition Corporation Standardized test and survey imaging system
US8488220B1 (en) 2007-10-22 2013-07-16 Data Recognition Corporation Method and apparatus for calibrating imaging equipment
US8649601B1 (en) 2007-10-22 2014-02-11 Data Recognition Corporation Method and apparatus for verifying answer document images
US9195875B1 (en) 2007-10-22 2015-11-24 Data Recognition Corporation Method and apparatus for defining fields in standardized test imaging
US8066184B2 (en) * 2008-04-30 2011-11-29 Avante International Technology, Inc. Optically readable marking sheet and reading apparatus and method therefor
US8639177B2 (en) * 2008-05-08 2014-01-28 Microsoft Corporation Learning assessment and programmatic remediation
US20100047757A1 (en) * 2008-08-22 2010-02-25 Mccurry Douglas System and method for using interim-assessment data for instructional decision-making
US8261985B2 (en) 2009-04-07 2012-09-11 Avante Corporation Limited Manual recount process using digitally imaged ballots
US8261986B2 (en) * 2009-10-21 2012-09-11 Kevin Kwong-Tai Chung System and method for decoding an optically readable markable sheet and markable sheet therefor
WO2011094214A1 (en) * 2010-01-29 2011-08-04 Scantron Corporation Data collection and transfer techniques for scannable forms
US9099007B1 (en) 2011-05-15 2015-08-04 Quaest, Inc. Computerized processing of pictorial responses in evaluations
US9530329B2 (en) 2014-04-10 2016-12-27 Laurence RUDOLPH System and method for conducting multi-layer user selectable electronic testing
JP2016212284A (en) * 2015-05-11 2016-12-15 富士通株式会社 Point rating rule application object specification program, erroneous judgment rule setting program, point rating rule application object specification method, erroneous judgment rule setting method, application object specification device, and erroneous judgment rule setting unit
US10366083B2 (en) 2015-07-29 2019-07-30 Oracle International Corporation Materializing internal computations in-memory to improve query performance
US10204135B2 (en) 2015-07-29 2019-02-12 Oracle International Corporation Materializing expressions within in-memory virtual column units to accelerate analytic queries
US10063727B2 (en) * 2015-12-29 2018-08-28 Kabushiki Kaisha Toshiba Marking apparatus and decoloring apparatus
US10531131B2 (en) * 2016-05-23 2020-01-07 Time Warner Cable Enterprises Llc Distribution and management of content from a multi-tier content distribution system
JP6634993B2 (en) * 2016-09-30 2020-01-22 京セラドキュメントソリューションズ株式会社 Image forming device
JP2018141846A (en) * 2017-02-27 2018-09-13 株式会社リコー Information processing device, program, information processing system and information processing method
US10516525B2 (en) 2017-08-24 2019-12-24 International Business Machines Corporation System and method for detecting anomalies in examinations
US11226955B2 (en) 2018-06-28 2022-01-18 Oracle International Corporation Techniques for enabling and integrating in-memory semi-structured data and text document searches with in-memory columnar query processing
JP7234706B2 (en) * 2019-03-11 2023-03-08 富士フイルムビジネスイノベーション株式会社 Information processing system, information processing device and program
JP2021157737A (en) * 2020-03-30 2021-10-07 ブラザー工業株式会社 Relay server and computer program therefor
CN111507251B (en) * 2020-04-16 2022-10-21 北京世纪好未来教育科技有限公司 Method and device for positioning answering area in test question image, electronic equipment and computer storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6115509A (en) * 1994-03-10 2000-09-05 International Business Machines Corp High volume document image archive system and method

Patent Citations (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4205780A (en) * 1977-03-21 1980-06-03 Teknekron, Inc. Document processing system and method
US4741047A (en) * 1986-03-20 1988-04-26 Computer Entry Systems Corporation Information storage, retrieval and display system
US4760246A (en) * 1987-04-20 1988-07-26 Cognitronics Corporation Mark-reading apparatus for use with answer sheets
US5017763A (en) * 1987-04-20 1991-05-21 Cognitronics Corp. Scanning apparatus storing both processed and unprocessed scan data signals for separate read-out and method of operating same
US5054096A (en) * 1988-10-24 1991-10-01 Empire Blue Cross/Blue Shield Method and apparatus for converting documents into electronic data for transaction processing
US4958284A (en) * 1988-12-06 1990-09-18 Npd Group, Inc. Open ended question analysis system and method
US5002491A (en) * 1989-04-28 1991-03-26 Comtek Electronic classroom system enabling interactive self-paced learning
US5102341A (en) * 1989-05-05 1992-04-07 Touchstone Applied Science Associates, Inc. Test answer and score sheet device
US4978305A (en) * 1989-06-06 1990-12-18 Educational Testing Service Free response test grading method
US5011413A (en) * 1989-07-19 1991-04-30 Educational Testing Service Machine-interpretable figural response testing
US5101447A (en) * 1989-09-28 1992-03-31 Automated Tabulation Inc. Method and apparatus for optically reading pre-printed survey pages
US5038392A (en) * 1990-02-12 1991-08-06 International Business Machines Corporation Method and apparatus for adaptive image processing by recognizing a characterizing indicium in a captured image of a document
US5103490A (en) * 1990-06-13 1992-04-07 National Computer Systems, Inc. Method and apparatus for storing and merging multiple optically scanned images
US5134669A (en) * 1990-06-13 1992-07-28 National Computer Systems Image processing system for documentary data
US5258855A (en) * 1991-03-20 1993-11-02 System X, L. P. Information processing methodology
US5235433A (en) * 1991-04-30 1993-08-10 International Business Machines Corporation System and method for automatically indexing facsimile transmissions received in a computerized image management system
US5452379A (en) * 1991-09-13 1995-09-19 Meadowbrook Industries, Ltd. Image capture and storage techniques in association with optical mark reading
US5410494A (en) * 1992-04-08 1995-04-25 Sharp Kabushiki Kaisha Electronic measuring apparatus for measuring objects of a figure or on a map
US6466683B1 (en) * 1992-07-08 2002-10-15 Ncs Pearson, Inc. System and method of distribution of digitized materials and control of scoring for open-ended assessments
US5672060A (en) * 1992-07-08 1997-09-30 Meadowbrook Industries, Ltd. Apparatus and method for scoring nonobjective assessment materials through the application and use of captured images
US5433615A (en) * 1993-02-05 1995-07-18 National Computer Systems, Inc. Categorized test item reporting system
US5466159A (en) * 1993-02-05 1995-11-14 National Computer Systems, Inc. Collaborative and quality control scoring system
US5558521A (en) * 1993-02-05 1996-09-24 National Computer Systems, Inc. System for preventing bias in test answer scoring
US5321611A (en) * 1993-02-05 1994-06-14 National Computer Systems, Inc. Multiple test scoring system
US5458493A (en) * 1993-02-05 1995-10-17 National Computer Systems, Inc. Dynamic on-line scoring guide
US5437554A (en) * 1993-02-05 1995-08-01 National Computer Systems, Inc. System for providing performance feedback to test resolvers
US5690497A (en) * 1993-02-05 1997-11-25 National Computer Systems, Inc. Dynamic on-line scoring method
US5709551A (en) * 1993-02-05 1998-01-20 National Computer Systems, Inc. Multiple test item scoring method
US5716213A (en) * 1993-02-05 1998-02-10 National Computer Systems, Inc. Method for preventing bias in test answer scoring
US5798752A (en) * 1993-07-21 1998-08-25 Xerox Corporation User interface having simultaneously movable tools and cursor
US5597311A (en) * 1993-12-30 1997-01-28 Ricoh Company, Ltd. System for making examination papers and having an automatic marking function
US5775918A (en) * 1993-12-30 1998-07-07 Ricoh Company, Ltd. System for making examination papers and having an automatic marking function
US5729741A (en) * 1995-04-10 1998-03-17 Golden Enterprises, Inc. System for storage and retrieval of diverse types of information obtained from different media sources which includes video, audio, and text transcriptions
US5584699A (en) * 1996-02-22 1996-12-17 Silver; Judith A. Computerized system for teaching geometry proofs
US5991595A (en) * 1997-03-21 1999-11-23 Educational Testing Service Computerized system for scoring constructed responses and methods for training, monitoring, and evaluating human rater's scoring of constructed responses
US6120299A (en) * 1997-06-06 2000-09-19 Educational Testing Service System and method for interactive scoring of standardized test responses
US6657642B1 (en) * 1997-07-03 2003-12-02 International Business Machines Corporation User interactive display interfaces with means for interactive formation of combination display objects representative of combined interactive functions
US6366760B1 (en) * 1997-07-31 2002-04-02 The Psychological Corporation Method for imaging test answer sheets having open-ended questions
US6751351B2 (en) * 2001-03-05 2004-06-15 Nsc Pearson, Inc. Test question response verification system

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040126036A1 (en) * 2000-08-11 2004-07-01 Poor David D.S. Method and apparatus for selective processing of captured images
US20040131279A1 (en) * 2000-08-11 2004-07-08 Poor David S Enhanced data capture from imaged documents
US7417774B2 (en) 2000-08-11 2008-08-26 Ctb/Mcgraw-Hill Llc Method and apparatus for selective processing of captured images
US7911660B2 (en) 2000-08-11 2011-03-22 Ctb/Mcgraw-Hill Llc Method and apparatus for data capture from imaged documents
US7573616B2 (en) 2000-08-11 2009-08-11 Ctb/Mcgraw-Hill Llc Enhanced data capture from imaged documents
US20030180703A1 (en) * 2002-01-28 2003-09-25 Edusoft Student assessment system
US20040091847A1 (en) * 2002-11-06 2004-05-13 Ctb/Mcgraw-Hill Paper-based adaptive testing
US7574047B2 (en) 2004-01-20 2009-08-11 Educational Testing Service Method and system for performing image mark recognition
US7298902B2 (en) * 2004-01-20 2007-11-20 Educational Testing Service Method and system for performing image mark recognition
US20080253658A1 (en) * 2004-01-20 2008-10-16 Robert Cichielo Method and system for performing image mark recognition
US20050157930A1 (en) * 2004-01-20 2005-07-21 Robert Cichielo Method and system for performing image mark recognition
US20050284944A1 (en) * 2004-06-28 2005-12-29 Wei Ming Color barcode producing, reading and/or reproducing method and apparatus
US8640955B2 (en) 2004-06-28 2014-02-04 Konica Minolta Laboratory U.S.A., Inc. Color barcode producing, reading and/or reproducing method and apparatus
US20080210758A1 (en) * 2004-06-28 2008-09-04 Konica Minolta Systems Laboratory, Inc. Color Barcode Producing, Reading and/or Reproducing Method and Apparatus
US20080210764A1 (en) * 2004-06-28 2008-09-04 Konica Minolta Systems Laboratory, Inc. Color Barcode Producing, Reading and/or Reproducing Method and Apparatus
US7823797B2 (en) 2004-06-28 2010-11-02 Konica Minolta Systems Laboratory, Inc. Color barcode producing, reading and/or reproducing method and apparatus
US8215556B2 (en) 2004-06-28 2012-07-10 Konica Minolta Laboratory U.S.A., Inc. Color barcode producing, reading and/or reproducing method and apparatus
US8038064B2 (en) 2004-08-09 2011-10-18 Konica Minolta Systems Laboratory, Inc. Color barcode producing method and apparatus, color barcode reading method and apparatus and color barcode reproducing method and apparatus
US20090194592A1 (en) * 2004-08-09 2009-08-06 Konica Minolta Systems Laboratory, Inc. Color Barcode Producing Method and Apparatus, Color Barcode Reading Method and Apparatus and Color Barcode Reproducing Method and Apparatus
US20060120605A1 (en) * 2004-12-08 2006-06-08 Ctb/Mcgraw-Hill Data extraction from temporal image data
US7606421B2 (en) 2004-12-08 2009-10-20 Ctb/Mcgraw-Hill Llc Data extraction from temporal image data
US20060213993A1 (en) * 2005-03-28 2006-09-28 Konica Minolta Systems Laboratory, Inc. Systems and methods for preserving and maintaining document integrity
US20080265015A1 (en) * 2005-03-28 2008-10-30 Konica Minolta Systems Laboratory, Inc. Systems and methods for preserving and maintaining document integrity
US7775435B2 (en) 2005-03-28 2010-08-17 Konica Minolta Systems Laboratory, Inc. Systems and methods for preserving and maintaining document integrity
US20080265042A1 (en) * 2005-03-28 2008-10-30 Konica Minolta Systems Laboratory, Inc. Systems and Methods for Preserving and Maintaining Document Integrity
US8070066B2 (en) 2005-03-28 2011-12-06 Konica Minolta Laboratory U.S.A., Inc. Systems and methods for preserving and maintaining document integrity
US8074886B2 (en) 2005-03-28 2011-12-13 Konica Minolta Laboratory U.S.A., Inc. Systems and methods for preserving and maintaining document integrity
US7669769B2 (en) 2005-03-28 2010-03-02 Konica Minolta Systems Laboratory, Inc. Systems and methods for preserving and maintaining document integrity
US7791756B2 (en) 2005-05-03 2010-09-07 Lexmark International, Inc. Methods for identifying marks using a digital master document and scanned image enhancement
US20060252023A1 (en) * 2005-05-03 2006-11-09 Lexmark International, Inc. Methods for automatically identifying user selected answers on a test sheet
US20070048698A1 (en) * 2005-08-30 2007-03-01 Wang Chien J Literacy and Language Assessment and Associated Methods
US7479011B2 (en) 2005-08-30 2009-01-20 Chien Ju Wang Literacy and language assessment and associated methods
US20070232885A1 (en) * 2006-03-03 2007-10-04 Thomas Cook Medical imaging examination review and quality assurance system and method
US20080080777A1 (en) * 2006-09-29 2008-04-03 Hiroshi Tomita Barcode and decreased-resolution reproduction of a document image
US7628330B2 (en) 2006-09-29 2009-12-08 Konica Minolta Systems Laboratory, Inc. Barcode and decreased-resolution reproduction of a document image
US20080078836A1 (en) * 2006-09-29 2008-04-03 Hiroshi Tomita Barcode for two-way verification of a document
US7766241B2 (en) 2006-09-29 2010-08-03 Konica Minolta Systems Laboratory, Inc. Barcode for two-way verification of a document
US7831195B2 (en) * 2006-12-11 2010-11-09 Sharp Laboratories Of America, Inc. Integrated paper and computer-based testing administration system
US20080140865A1 (en) * 2006-12-11 2008-06-12 Sharp Laboratories Of America, Inc. Integrated paper and computer-based testing administration system
US20080227075A1 (en) * 2007-03-15 2008-09-18 Ctb/Mcgraw-Hill, Llc Method and system for redundant data capture from scanned documents
US9792828B2 (en) 2007-03-15 2017-10-17 Mcgraw-Hill School Education Holdings Llc Use of a resource allocation engine in processing student responses to assessment items
US20090015875A1 (en) * 2007-06-20 2009-01-15 Ctb/Mcgraw-Hill Companies, Inc. Image manipulation of digitized images of documents
US8526766B2 (en) 2007-10-31 2013-09-03 Ctb/Mcgraw-Hill Llc Use of composite bitmapped images in conjunction with display of captured data
US20090282009A1 (en) * 2008-05-09 2009-11-12 Tags Ltd System, method, and program product for automated grading
CN102339352A (en) * 2010-07-20 2012-02-01 上海海鞠电子科技有限公司 Electronic paper marking method
WO2017105518A1 (en) * 2015-12-18 2017-06-22 Hewlett-Packard Development Company, L.P. Question assessment

Also Published As

Publication number Publication date
AU750650B2 (en) 2002-07-25
CA2301007C (en) 2003-10-14
GB0003001D0 (en) 2000-03-29
CA2301007A1 (en) 1999-02-11
US6173154B1 (en) 2001-01-09
US20020110798A1 (en) 2002-08-15
US6684052B2 (en) 2004-01-27
WO1999006945A1 (en) 1999-02-11
GB2343775A (en) 2000-05-17
AU8395298A (en) 1999-02-22
GB2343775B (en) 2001-05-09
US6366760B1 (en) 2002-04-02

Similar Documents

Publication Publication Date Title
US6366760B1 (en) Method for imaging test answer sheets having open-ended questions
US6311040B1 (en) System and method for scoring test answer sheets having open-ended questions
US6751351B2 (en) Test question response verification system
US6675133B2 (en) Pre-data-collection applications test processing system
US6810232B2 (en) Test processing workflow tracking system
US6961482B2 (en) System for archiving electronic images of test question responses
US20060003306A1 (en) Unified web-based system for the delivery, scoring, and reporting of on-line and paper-based assessments
US7020435B2 (en) Electronic test answer record image quality improvement system and method
US20020123029A1 (en) Multiple server test processing workflow system
US20040202992A1 (en) Electronic test answer record quality assurance system and method

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION