WO2004046979A2 - Risk data analysis system - Google Patents


Info

Publication number
WO2004046979A2
WO2004046979A2 (application PCT/EP2003/013019)
Authority
WO
WIPO (PCT)
Prior art keywords
risk
risk assessment
files
quantitative
data analysis
Application number
PCT/EP2003/013019
Other languages
French (fr)
Other versions
WO2004046979A8 (en)
Inventor
Nancy J. Davis
Steven Ira Kauderer
Gail E. Mcgiffin
Rose Mary Ciraulo
Kathleen Ziegler
Anthony G. Tempesta
Original Assignee
Accenture Global Services Gmbh
Priority date
Application filed by Accenture Global Services Gmbh
Priority to AU2003288134A (patent AU2003288134B8)
Priority to CA002506520A (patent CA2506520A1)
Priority to EP03780011A (patent EP1563430A2)
Publication of WO2004046979A2
Publication of WO2004046979A8


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/08Insurance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0635Risk analysis of enterprise or organisation activities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0637Strategic management or analysis, e.g. setting a goal or target of an organisation; Planning actions based on goals; Analysis or evaluation of effectiveness of goals
    • G06Q10/06375Prediction of business process outcome or impact based on a proposed change
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06395Quality analysis or management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0639Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06398Performance of employee with respect to a job function
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/18Legal services; Handling legal documents
    • G06Q50/188Electronic negotiation

Definitions

  • risk assessment: For entities (including businesses, companies and individuals), such as those that issue insurance or grant loans and/or leases, assuming risk is an unavoidable part of doing business. An absolute necessity for such entities is the ability to effectively and efficiently assess risk ("risk assessment"). Risk assessment involves not only determining the nature and extent of the risk but also the potential for profit gained by assuming the risk. For entities involved in any type of risk assessment, it is crucial to have well-trained people to make the risk assessments ("risk assessors"), and to monitor, analyze and update any processes used by the risk assessors for risk assessment ("risk assessment processes").
  • Audits are generally conducted either manually, or by using a system that can only determine basic information including the number of risk assessments made, the risk assessors that performed the risk assessment, the amount charged to assume the risk (such as a premium, interest or fee) and the time frame for the risk assumption.
  • This basic information is not sufficient to determine whether risks are being assessed in an efficient and cost-effective manner.
  • risk assessors: There are limited means to determine whether an entity is effectively assessing risk; however, these limited means tend to be costly and time-consuming. Further, it is difficult to determine whether risk assessors are using the best methods for streamlining the risk assessment processes. To compound this problem, risk assessors generally receive only on-the-job training and perhaps a limited amount of formal introductory training.
  • This training generally focuses on the processes and methods to be used and the regulations to be followed rather than on obtaining a financially viable outcome.
  • systems that can review risk assessment processes more completely in order to provide information that can be used to determine which processes are beneficial, whether the consideration received for assuming the risk is appropriate, and whether risks are being handled efficiently and effectively.
  • technology-based solutions that enable and support such systems by providing automated and customizable data storage, retrieval and analysis.
  • a system takes an outcome-oriented approach to analyzing risk assessment, training risk assessors as to best practices for assessing risk, and enabling a quality assurance process (the "Risk Data Analysis System" or "RDA System").
  • the RDA System generally includes a system that implements the RDA System in a timely and efficient manner (a "Risk Analysis System").
  • a "Risk Analysis System": Each aspect of the RDA System, including performing the RDAP, conducting a best practices training, and enabling a quality assurance process (shown in FIG. 2), may be implemented or supported by the Risk Analysis System.
  • the Risk Analysis System generally includes a risk analysis unit and may also include an interface unit.
  • the risk analysis unit may include a database that can be custom-developed or developed within the framework of existing database software such as Microsoft Access.
  • the database is configured to store and efficiently retrieve the information used and produced by the Risk Analysis System.
  • the database may include an analyzing portion, which may include modules for developing questionnaires, developing databases, selecting files, performing the quantitative analyses, synthesizing the quantitative results, generating reports, conducting the scoring processes, and any subset of the foregoing.
  • the automation and customization provided by these modules improves the speed, accuracy and comprehensiveness of the RDA System.
  • the RDA System may determine performance measures, such as economic gain opportunities in risk assessments, identify in which phase of the risk assessment life-cycle these performance measures are the greatest, focus improvement and training efforts on these risk assessment phases using a hands-on approach, and use the performance measures to monitor past, current and future compliance with the best practices associated with these risk assessment phases.
  • the RDA System includes a group of methods, methodologies, questionnaires, software, hardware and analyses that analyze risk assessment processes by providing a view of the current state of the risk assessment processes, developing or improving best practices, providing training as to the best practices and enabling the monitoring of compliance with the best practices.
  • the RDA System may also include one or more methods such as performing a risk data analysis procedure ("RDAP"), conducting best practices training, and enabling a quality assurance process.
  • the RDAP uses information relating to risk assessment processes and evaluates this information in terms of the life cycle or phases of risk assessment.
  • the RDAP includes preparing for the RDAP, conducting the RDAP to generate quantitative analyses and qualitative results, and generating reports that include recommendations and suggestions for improvement based on the quantitative analyses and the qualitative results. As previously discussed, much of the RDAP may be implemented in a Risk Analysis System.
  • best practices training is conducted.
  • Conducting the best practices training includes an outcome-focused learning approach used to instill a results orientation for risk assessment. More specifically, conducting the best practices training includes providing hands-on training for best practices; providing content expert presentations; providing networking opportunities; providing feedback and improvement mechanisms; and enabling the determination of best practices.
  • Enabling a quality assurance process allows the monitoring of compliance with the best practices after the RDAP has been completed and includes developing a framework, developing a quality assurance database, and conducting a scoring process. As previously discussed, much of the enabling a quality assurance process may be implemented in a Risk Analysis System.
  • FIG. 1 is a block diagram of a Risk Analysis System
  • FIG. 2 is a flow chart of a method for improving risk assessment processes and outcomes
  • FIG. 3 is a flow chart of a method for performing a risk data analysis procedure ("RDAP");
  • FIG. 4 is a flow chart of a method for preparing for the RDAP
  • FIG. 5 is a flow chart of a method for conducting the RDAP
  • FIG. 6 is a flow chart of a method for conducting the quantitative portion of the RDAP
  • FIG. 7 is a flow chart of a method for generating the reports of the RDAP
  • FIG. 8 is a flow chart of a method for providing training for best practices
  • FIG. 9 is a flow chart of a quality assurance process.
  • FIG. 10 is a flow chart of a method for developing a framework.
  • a system for improving processes and outcomes in risk assessment includes a group of methods, methodologies, questionnaires, software and analyses for analyzing risk assessment processes in order to improve them by providing a view of their current state, developing or improving best practices, providing training as to best practices, and enabling the monitoring of compliance with the best practices.
  • the RDA System takes an outcome-oriented approach to analyzing risk assessment processes.
  • the RDA System may identify economic gain opportunities ("EGO") in risk assessment, identify in which phase of the risk assessment life-cycle (“risk assessment phase” or “phase”) EGO exists, and focus improvement, training and monitoring efforts on these risk assessment phases.
  • EGO: economic gain opportunities
  • "risk assessment phase" or "phase"
  • EGO is a measure of leakage determined for each risk assessment phase, and represents the revenues lost and losses incurred in the phases due to inefficient practices. Therefore, the risk assessment phases that have a higher EGO may be identified and targeted for improvement during a best practices training, and future monitoring by a quality assurance process.
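The per-phase EGO computation described above is not given in code in the source; the following Python sketch illustrates one plausible reading, in which leakage records are summed per phase and the high-EGO phases are flagged for training. The field names (`lost_revenue`, `losses_incurred`), phase labels, and threshold are assumptions for illustration.

```python
# Sketch of per-phase EGO: sum the revenues lost and losses incurred in
# each risk assessment phase, then rank the phases exceeding a threshold.
# Field and phase names are illustrative, not taken from the patent.

def ego_by_phase(leakage_records):
    """Aggregate leakage (lost revenue + incurred losses) per phase."""
    totals = {}
    for rec in leakage_records:
        totals[rec["phase"]] = (totals.get(rec["phase"], 0.0)
                                + rec["lost_revenue"] + rec["losses_incurred"])
    return totals

def priority_phases(ego, threshold):
    """Phases whose EGO exceeds the threshold, highest first: the
    candidates for best practices training and future monitoring."""
    return sorted((p for p, v in ego.items() if v > threshold),
                  key=lambda p: -ego[p])

records = [
    {"phase": "identify exposures", "lost_revenue": 120.0, "losses_incurred": 30.0},
    {"phase": "evaluate exposures", "lost_revenue": 10.0, "losses_incurred": 5.0},
    {"phase": "identify exposures", "lost_revenue": 40.0, "losses_incurred": 0.0},
]
ego = ego_by_phase(records)
targets = priority_phases(ego, threshold=100.0)
```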
  • the RDA System can be considered a "toolkit" that includes a collection of "tools" or components that can be used in many combinations to analyze risk assessment processes. It is convenient to group these tools as follows: a risk data analysis procedure ("RDAP"); the recommendations; the best practices training; the quality assurance process; the quality assurance database; and the Risk Analysis System.
  • the Risk Analysis System and the quality assurance database implement, support and automate the remaining tools.
  • the Risk Analysis System may include risk analysis software ("RAS"), score generating software ("SGS"), and quality assurance software ("QAS"), which provide the technical support for the remainder of the Risk Data Analysis System.
  • the RDAP uses information relating to risk assessment processes and evaluates the information in terms of the risk assessment life cycle.
  • the RDAP may include qualitative and quantitative portions, which, together with the Risk Analysis System, generate the recommendations.
  • the recommendations may be used in the best practices training to improve the risk assessment processes.
  • the quality assurance process, along with the Risk Analysis System and the quality assurance database, helps to sustain the recommendations.
  • the tools embody a method for improving risk assessment processes and outcomes 100.
  • the method for improving risk assessment processes and outcomes 100 generally includes performing a risk data analysis procedure or RDAP 102, conducting a best practices training 104, and enabling a quality assurance process 106.
  • risk assessment processes tend to coincide with the risk assessment phases
  • the RDA System may analyze risk assessment processes in terms of each risk assessment phase in order to identify opportunities for improving the risk assessment processes. This analysis enables an evaluation of how well the individual risk assessment processes yield profitable outcomes and how well they are executed.
  • the process of risk assessment goes through at least six risk assessment phases.
  • the first phase involves identifying exposures. This is the phase in which all the risks that might be encountered in a particular case are identified.
  • the second phase, evaluating exposures, involves determining the likelihood that the exposures will become actual losses and the potential extent of the actual losses should they occur. In some cases, the second phase may be combined with the first into a single phase.
  • the third phase involves making the risk decision. In this phase, a decision is made based on the results of the second phase regarding whether to assume the risk. If, during the third phase, it is decided that the risk will not be assumed, the life-cycle of the risk assessment process stops at the end of the third phase.
  • the life-cycle continues with the fourth phase, setting the terms and conditions.
  • Setting the terms and conditions involves determining the details under which the risk will be assumed. These details may include the duration of the risk assumption, the duties of the entity from which the risk will be assumed, the precise definition and scope of the risk assumption, and any other term except price. Setting the price for which the risk will be assumed is generally done separately in the fifth phase because the price depends, in part, on the particular terms and conditions under which the risk will be assumed.
  • the next phase is negotiation.
  • In this sixth phase, the set terms and conditions and/or price may be adjusted in order to make them acceptable so that a risk relationship is established.
  • a seventh phase is also included. This seventh phase, setting the service program, involves determining the effectiveness of any service resources that were used. Service resources are internal and external services that may be used to evaluate, mitigate or otherwise respond to different aspects of a risk and generally include loss control, claims and premium audits.
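The life-cycle just described can be summarized as an ordered enumeration. This Python sketch paraphrases the phase names from the text (the seventh, service-program phase being optional) and encodes the stated rule that a declined risk ends the cycle after the third phase:

```python
from enum import Enum

class RiskPhase(Enum):
    """Risk assessment life-cycle phases, as described in the text."""
    IDENTIFY_EXPOSURES = 1
    EVALUATE_EXPOSURES = 2
    MAKE_RISK_DECISION = 3
    SET_TERMS_AND_CONDITIONS = 4
    SET_PRICE = 5
    NEGOTIATE = 6
    SET_SERVICE_PROGRAM = 7  # optional seventh phase

def phases_traversed(risk_assumed, include_service_program=True):
    """If the third-phase decision is not to assume the risk, the
    life-cycle stops at the end of the third phase."""
    if not risk_assumed:
        return [p for p in RiskPhase if p.value <= 3]
    phases = list(RiskPhase)
    if not include_service_program:
        phases.remove(RiskPhase.SET_SERVICE_PROGRAM)
    return phases
```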
  • Loss control audits involve assessing risks and exposures before and/or after a risk has been assumed.
  • Premium audits, which may also include interest or fee audits, generally involve either a voluntary or a physical audit of a business that is or will be the subject of a risk policy (such as the insured operation covered by or to be covered by an insurance policy, or the collateral for a loan) to see if the premium, interest and/or fees charged were the proper amount based on the assumptions made by the risk assessor.
  • risk-assessment-specific service resources such as claim audits and default audits. For example, if the risk assessment is underwriting, an additional service resource may be claim audits.
  • Claim audits generally involve special handling of claims made under an insurance policy, which is a costly method.
  • an additional service resource may be default audits.
  • Default audits generally include determining the circumstances under which there is a default on a loan repayment.
  • the effectiveness of any service resources that are used is evaluated by determining the over- or under-utilization of the service resources. Over-utilization of a service resource occurs when that service resource is used beyond the point at which its use produces a monetary savings or measurable improvement. Under-utilization occurs when a service resource is not used in situations where its use would have produced a monetary savings or loss cost impact.
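The over/under-utilization test described above might be sketched as follows. The inputs `break_even_uses` (the point beyond which further use yields no savings) and `savings_if_used` are assumed quantities for illustration, not terms from the patent:

```python
# Sketch of the over/under-utilization test for a single service resource.

def classify_utilization(times_used, break_even_uses, savings_if_used):
    if times_used > break_even_uses:
        # used beyond the point of monetary savings or measurable improvement
        return "over-utilized"
    if times_used == 0 and savings_if_used > 0:
        # not used where use would have produced savings or loss cost impact
        return "under-utilized"
    return "appropriate"
```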
  • analysis dimensions include line of business, geographic location, type of risk assumed, branch, division, office or any other dimension along which analysis is desired. For example, an entity may want to compare the performances of each of its branch offices in terms of the type of risk they assume. In this case, the RDA System would analyze risk assessment in terms of the office and type of risk by determining the EGO for each branch office for each of the types of risk they assess.
  • a Risk Analysis System: In order to implement and support the RDA System in an efficient and time-effective manner, a Risk Analysis System has been developed. Each aspect of the RDA System, including performing the RDAP, conducting a best practices training, and enabling a quality assurance process (shown in FIG. 2), may be implemented or supported by the Risk Analysis System. An example of a Risk Analysis System is shown in FIG. 1, and indicated by reference number 1000.
  • System 1000 generally includes a risk analysis unit 1002 and may also include an interface unit 1004.
  • the interface unit 1004 generally includes an input device 1014 and an output device 1016.
  • the output device 1016 may include any type of visual, manual, audio, electronic or electromagnetic device capable of communicating information from a processor, memory device or other computer readable storage medium to a person, processor, memory device or other computer readable storage medium. Examples of output devices include, but are not limited to, monitors, speakers, liquid crystal displays, networks, buses, and interfaces.
  • the output device 1016 may receive the communication from the risk analysis unit or other computer readable storage medium via an output signal 1012.
  • the input device 1014 may be any type of visual, manual, mechanical, audio, electronic, or electromagnetic device capable of communicating information from a person, processor, memory device or other computer readable storage medium to any of the foregoing. Examples of input devices include keyboards, microphones, voice recognition systems, trackballs, mice, networks, buses, and interfaces. Alternatively, the input and output devices 1014 and 1016, respectively, may be included in a single device such as a touch screen, computer, processor or memory coupled to the processor via a network.
  • the interface unit 1004 may include a plurality of input and output devices (not shown) to enable a plurality of risk assessors or the group of premier risk assessors to enter quantitative analyses directly into the input device. The interface unit 1004 may communicate with the risk analysis unit 1002 via an input signal 1010.
  • the risk analysis unit 1002 basically includes a processor 1020 coupled to a memory device 1018.
  • the memory device 1018 may be any type of fixed or removable digital storage device, and (if needed) a device for reading the digital storage device, including floppy disks and floppy drives, CD-ROM disks and drives, optical disks and drives, hard drives, RAM, ROM and other devices for storing digital information.
  • the processor 1020 may be any type of apparatus used to process digital information.
  • the memory device 1018 may communicate with the processor 1020 via a memory signal 1024 and a processor signal 1022.
  • the memory device may also receive communication from the input device 1014 of the interface unit 1004 either directly via the input signal 1010 (not shown) or through the processor 1020 via the input signal 1010 and the processor signal 1022.
  • the memory device may similarly communicate with the output device 1016 of the interface unit 1004 directly via the memory signal 1024 (not shown), or indirectly via the memory signal 1024 and the output signal 1012.
  • the memory device 1018 may include a database 1029 that can be custom-developed or developed within the framework of existing database software such as Microsoft Access.
  • the database is configured to store and efficiently retrieve the information used and produced by the Risk Analysis System.
  • the database 1029 may include a storing portion 1030.
  • the storing portion 1030 may store questionnaires, quantitative analyses, qualitative results and quantitative results of an RDAP, customized and/or industry best practices, recommendations, reports, questionnaires, and any other information or data.
  • the memory device 1018 may also include a quality assurance database (discussed subsequently) as part of the database 1029 or as a separate database.
  • the database may also include an analyzing portion 1032, which may include modules for developing questionnaires, developing databases, selecting files, performing the quantitative analyses, synthesizing the quantitative results, generating reports, conducting the scoring processes, and any subset of the foregoing.
  • the modules may be stored in the memory device 1018, the processor 1020, other computer readable storage media, or a combination of the foregoing. Alternatively, the modules may be encoded in a computer readable electromagnetic signal. When implemented as software code, the modules may be object code or any other code describing or controlling their functionality.
  • the Risk Analysis System may be used to prepare for and conduct the RDAP, and generate reports as a result of the RDAP.
  • the module for developing questionnaires may be used to develop and store, in electronic form, a questionnaire that is designed to elicit and capture information needed by the RDAP. While enabling the quality assurance process, the module for developing questionnaires may be used to adapt the questionnaire used during the RDAP for use as an audit tool.
  • the module for developing databases may be used in preparing for the RDAP to customize the database 1029 so that it can store the particular information elicited in a particular RDAP.
  • the module for performing the RDAP may then, together with the interface unit 1004, capture and/or store the information elicited by the questionnaire in the memory device 1018 or other computer readable storage medium.
  • the module for file selection may perform some or all of the steps of the file selection process during which the files upon which the RDAP will be performed are selected.
  • File selection may include generating a performance report, performing an account run, performing a calibration step, and designating certain files as selected files.
  • the Risk Analysis System can generate a performance report from information stored in the database 1029 that provides a summary of the profits and losses along a desired dimension or dimensions.
  • the Risk Analysis System can prepare an inventory of accounts and/or policies along the dimensions for which problem areas were made evident in the performance report (perform an account run).
  • the Risk Analysis System may select individual files for analysis using one of a number of search routines (perform a calibration) and designate the files chosen as "selected files.”
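The three file-selection steps above (performance report, account run, calibration) could plausibly be sketched in Python as follows. The record fields, the "negative profit" criterion for a problem area, and the random-sampling calibration step are assumptions for illustration, not details from the patent:

```python
# Sketch of file selection: summarize profit/loss along a dimension
# (performance report), inventory the files in the problem segments
# (account run), then sample files for analysis (calibration).
import random

def performance_report(files, dimension):
    """Total profit/loss per value of the chosen analysis dimension."""
    summary = {}
    for f in files:
        summary[f[dimension]] = summary.get(f[dimension], 0.0) + f["profit"]
    return summary

def account_run(files, dimension, report):
    """Inventory of files in the dimension values showing a net loss."""
    problem_keys = {k for k, v in report.items() if v < 0}
    return [f for f in files if f[dimension] in problem_keys]

def calibrate(inventory, sample_size, seed=0):
    """Designate a reproducible sample of the inventory as selected files."""
    rng = random.Random(seed)
    return rng.sample(inventory, min(sample_size, len(inventory)))

files = [
    {"id": 1, "office": "east", "profit": -50.0},
    {"id": 2, "office": "west", "profit": 80.0},
    {"id": 3, "office": "east", "profit": -10.0},
]
report = performance_report(files, "office")
selected = calibrate(account_run(files, "office", report), sample_size=2)
```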
  • the module for generating reports may include a module for synthesizing the quantitative results.
  • the module for synthesizing quantitative results may include the risk analysis software ("RAS").
  • the RAS generally communicates the quantitative analyses, qualitative results, customized and/or industry best practices, and instructions stored in the storing portion 1030 or other computer readable storage device to the processor 1020, according to which the processor 1020 generates quantitative results.
  • the RAS, together with the processor 1020, may synthesize the quantitative results by aggregating the values captured by the questionnaire.
  • the RAS may include the quantitative analyses themselves as part of the quantitative results.
  • the quantitative results may then be communicated to the storing portion 1030 or other computer readable storage device for storage and/or to the interface unit 1004 for display.
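The aggregation step just described ("synthesize the quantitative results by aggregating the values captured by the questionnaire") might look like the following Python sketch; the metric names are hypothetical, and totals/averages are one assumed form of aggregation:

```python
# Sketch: synthesize quantitative results by aggregating the per-file
# values captured by the questionnaire into totals and averages.

def synthesize(quant_analyses):
    grouped = {}
    for analysis in quant_analyses:
        for metric, value in analysis.items():
            grouped.setdefault(metric, []).append(value)
    return {m: {"total": sum(vals), "average": sum(vals) / len(vals)}
            for m, vals in grouped.items()}

per_file = [
    {"premium_leakage": 100.0, "missed_exposures": 2},
    {"premium_leakage": 50.0, "missed_exposures": 0},
]
quantitative_results = synthesize(per_file)
```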
  • the module for generating reports may also include a recommendation generator.
  • the recommendation generator generates recommendations based on the quantitative results.
  • the memory device 1018 or other computer readable storage medium communicates the recommendation generator to the processor 1020 via a memory signal 1024 upon receiving the relevant request from the processor 1020 made via a processor signal 1022. Further, the module for generating reports may compile the recommendations and the qualitative results into at least one report. The report may then be used for best practices training.
  • the module for conducting the scoring processes may perform an audit and generate scores for performance metrics based on the information captured during the audit process.
  • the module for conducting the scoring process may include score generating software ("SGS").
  • SGS: score generating software
  • the SGS generally instructs the processor 1020 to evaluate files in terms of how well they follow the best practices and recommendations, and to generate scores for the files that reflect the evaluation.
  • the SGS may additionally or alternatively generate audit recommendations based on the scores.
  • the SGS may be implemented independently or together with the RAS.
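The SGS behavior described above (score a file against best practices, then derive audit recommendations from the failed checks) could be sketched as follows; the check names, the yes/no answer model, and the percentage score are illustrative assumptions:

```python
# Sketch of the scoring process: evaluate a file's questionnaire answers
# against best-practice checks, produce a compliance score, and generate
# audit recommendations for the checks that failed.

def score_file(answers, best_practices):
    """best_practices: mapping of check name -> expected answer (assumed)."""
    passed = [c for c, expected in best_practices.items()
              if answers.get(c) == expected]
    score = 100.0 * len(passed) / len(best_practices)
    recommendations = [f"Review practice: {c}"
                       for c in best_practices if c not in passed]
    return score, recommendations

best_practices = {"exposures_identified": True, "price_documented": True}
score, recs = score_file(
    {"exposures_identified": True, "price_documented": False},
    best_practices,
)
```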
  • the quality assurance database is configured to store information within the scope and dimensions, and for the performance metrics defined for the quality assurance process. In addition, it may be configured to store an audit questionnaire.
  • the quality assurance database is generally developed from the database used during the RDAP.
  • the quality assurance database may be developed from and/or stored in the database of the Risk Analysis System, as previously discussed. However, it can be developed from almost any database structure.
  • the quality assurance database may be implemented in the database of the Risk Analysis System.
  • the quality assurance database may be implemented in any memory device or other computer readable storage device.
  • the risk data analysis procedure uses information relating to risk assessment processes and evaluates this information in terms of the risk assessment life-cycle.
  • the Risk Analysis System including the RAS, can be used to implement the RDAP.
  • the results of the RDAP provide a clear scorecard regarding the quality of risk assessment processes in terms of the accepted standards used by the risk assessment industry (collectively the "industry standards") and identify and prioritize opportunities for improvement.
  • information is obtained from documents in files relating to individual risk assessment cases, such as the loan-related documents in a loan file for a specific loan for a particular customer or the underwriting-related documents in an underwriting file for a specific insurance policy for a particular client.
  • the industry standards used are those that apply generally to the risk assessment industry and can be customized to include the industry best practices relating specifically to the type of risk assessment being analyzed (the "customized industry best practices").
  • the RDAP 102 is shown generally in FIG. 3 and includes preparing for the RDAP 200, conducting the RDAP to generate quantitative analyses and qualitative results 202, and generating reports from the quantitative analyses and the qualitative results 204.
  • Preparing for the RDAP 200 is shown in more detail in FIG. 4 and generally includes defining the analysis dimensions for the RDAP 300, developing a questionnaire 302, developing a database 303, and selecting files from which the information is to be obtained (the "selected files”) 304.
  • the analysis dimensions are used to categorize the information used for the RDAP and the results generated by the RDAP into groups that provide insights into the desired segments of a risk-assessing entity.
  • While the analysis dimensions may be predefined and static from one RDAP to another, it is more useful to define the analysis dimensions for each RDAP so that the results are customized for the particular entity involved. Additionally, the analysis dimensions may be further broken down into subgroups. Using subgroups helps to demonstrate specific problem areas within each of the analysis dimensions. Examples of analysis dimension subgroups include geographic area of the risk, policy or loan duration, degree of risk involved, resources used in risk assessment, types of liability, external and internal data sources used in risk assessment, number of claims or defaults made, uniformity of the information used, and overlooked exposures.
  • This questionnaire is used to elicit information from the selected files so this information can be used to create quantitative analyses and to determine compliance with the industry best practices and/or the customized best practices during the RDAP.
  • the questionnaire may be a form questionnaire, a customized questionnaire or a partially customized questionnaire.
  • a form questionnaire generally includes standard questions applicable to most risk analysis situations, including questions designed to elicit information used to determine compliance with industry best practices, and is not altered for any particular RDAP.
  • a customized questionnaire is developed for a particular RDAP and includes questions that use the particular terms and language of, and elicit information that is particular to, the entity for which the RDAP is to be performed. Additionally, the customized questionnaire includes questions designed to elicit information used to determine compliance with the customized best practices.
  • the questionnaires may also be partially customized in that, although they use standard questions for each RDAP, the language is altered so that entity-specific jargon or terminology is used and questions are added to elicit information specific to the entity being analyzed and to determine compliance with the customized and/or industry best practices. For entities outside the United States, significant amounts of customization are generally required because the types of relevant information are most likely going to be different.
  • the questionnaire is designed to elicit information relating to the risk assessment phases.
  • Table 1 shows an example of questions that may be included in the questionnaire according to the risk assessment phase from which they are designed to elicit information.
  • Table 1. Some of these questions are used to determine compliance with the customized and/or industry best practices related to each of the phases. In addition to the questions that directly ask if there was compliance with best practices, questions such as "Have all exposures been identified?" and "If not, what exposures were missed?" may also be used to determine compliance with best practices.
  • the questionnaire may generally include questions designed to elicit a determination of performance measures for each of the phases.
  • the questionnaire may provide space for answers to be written in or it may provide a group of possible answers from which a choice can be made.
  • the questionnaire can be in written, oral, digital, analog or any other form capable of eliciting the needed information.
  • a Risk Analysis System, such as that shown in FIG. 1, may store a digital questionnaire and provide the means for capturing the answers.
  • the next step in preparing for the RDAP 200 is developing a database 303 to store the information elicited using the questionnaire.
  • the database 1029 may reside in the memory device 1018 of the Risk Analysis System 1000.
  • the database 1029 may be designed to store the RAS, SGS, quality assurance database, QAS, customized and/or industry best practices, recommendation generator, recommendations, the questionnaire, and/or any subset of the foregoing.
  • the database may be developed prior to the RDAP process and modified after or at the same time that the questionnaire is developed or the files are selected so that it will be able to store any information elicited by customized questions. Alternatively, the database may be developed entirely at any of these times.
  • the structure of the database can be totally custom-developed or developed within the framework of existing database software such as Microsoft Access.
  • the database will generally have two portions, a storing portion and an analyzing portion.
  • the storing portion will store the reports, the quantitative analyses, the qualitative results, the recommendations and any other needed information or data. It may further store the quality assurance database.
  • the analyzing portion may include the RAS and/or the SGS. As part of the RAS, the analyzing portion may include a recommendation generator that generates recommendations based on the quantitative results.
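As a minimal illustrative sketch of the two-portion database described above (all table and column names are our assumptions, not taken from the patent; SQLite stands in for Microsoft Access or any other database software):

```python
import sqlite3

# Illustrative sketch only: one possible layout for the two-portion database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Storing portion: raw questionnaire answers and generated outputs.
    CREATE TABLE questionnaire_answers (
        file_id  TEXT,
        phase    INTEGER,  -- risk assessment phase (1 through 7)
        question TEXT,
        answer   TEXT
    );
    CREATE TABLE reports (report_name TEXT, body TEXT);
    -- Analyzing portion: per-file performance-measure values that a
    -- recommendation generator could read.
    CREATE TABLE performance_measures (
        file_id TEXT,
        phase   INTEGER,
        measure TEXT,  -- e.g. 'LCA', 'expenses', 'premium'
        value   REAL
    );
""")
conn.execute("INSERT INTO performance_measures VALUES ('F1', 1, 'LCA', 1200.0)")
lca_total = conn.execute(
    "SELECT SUM(value) FROM performance_measures WHERE measure = 'LCA'"
).fetchone()[0]
```

The split matters because the analyzing portion only ever reads structured, per-file measure values, while the storing portion holds free-form answers and generated reports.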
  • Defining the analysis dimensions for the RDAP 300 and developing the questionnaire 302 generally defines the information needed to conduct the RDAP.
  • the files from which the needed information is to be obtained in order to produce the analyses are selected 304.
  • the number of files selected must be a number sufficient to yield statistically significant data. In general, to fulfill this requirement, at least 8-10% of the relevant files in an entity's books are selected. Generally, files are selected that are representative of all the files in an entity's books. However, files representing exceptional activities may also or alternatively be chosen.
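As a hedged sketch of this sampling guideline (the record fields, the stratified approach and the fixed seed are assumptions for illustration, not the patent's procedure), selecting at least 10% of files per analysis-dimension group might look like:

```python
import random

def select_files(all_files, fraction=0.10, by_dimension=None, seed=0):
    # Sketch only: pull at least `fraction` of the files, optionally
    # stratified by an analysis dimension so the sample stays
    # representative of the entity's books.
    rng = random.Random(seed)
    if by_dimension is None:
        k = max(1, round(len(all_files) * fraction))
        return rng.sample(all_files, k)
    groups = {}
    for f in all_files:
        groups.setdefault(by_dimension(f), []).append(f)
    selected = []
    for members in groups.values():
        k = max(1, round(len(members) * fraction))
        selected.extend(rng.sample(members, k))
    return selected
```

Stratifying by a dimension such as division keeps each segment of the books represented at roughly the same 8-10% rate, rather than letting a large division dominate the sample.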
  • Selecting the files 304 may involve a four (4) step process. In the first step, a performance report is generated to provide a summary of the profits and losses of the entity along a desired dimension or dimensions.
  • the performance report may be generated by division to provide a macro view of the relative performance of the different divisions.
  • an account run is performed.
  • the account run includes preparing an inventory of accounts and/or policies in the dimensions for which problem areas were made evident in the first step.
  • the files corresponding to some of the accounts and/or policies listed in the inventory are reviewed manually to identify groups of files with the desired properties or information. This helps to direct the file selection 304 to files that represent areas that are typical or of particular interest.
  • the results of this step may be summarized in a report of the books.
  • One possible result may be that certain file types are identified for selection. Generally, the types of files that are identified for selection depend, in part, on the type of risk assessment involved.
  • the third step of selecting the files from which information is to be obtained 304 is the calibration step. This is the step in which the individual files are selected or "pulled.” This step, as well as the first step, generally needs to be done manually because the files are primarily in hard copy form.
  • the files may be reviewed and calibrated electronically by the Risk Analysis System using any one of a number of search routines. Once selected, the files are designated as "selected files" in the fourth step of selecting the files and are then used to conduct the RDAP.
  • Conducting the RDAP 202 generally includes conducting a quantitative portion of the RDAP to produce quantitative analyses 400 and conducting a qualitative portion of the RDAP to produce qualitative results 402.
  • the quantitative and the qualitative portions may be conducted in parallel. Alternately, the qualitative portion may also be conducted in parallel with and after the quantitative portion (as shown in FIG. 5) so as to take into account the quantitative analyses produced in the quantitative portion.
  • conducting the quantitative portion of the RDAP 400 includes training a premier group of risk assessors 500 and having the group of premier risk assessors analyze the selected files to produce quantitative analyses 502.
  • Training a group of premier risk assessors 500 includes choosing the group of premier risk assessors and training and synchronizing the group of premier risk assessors.
  • the group of premier risk assessors generally includes the best or top risk assessors (the "premier assessors") working for the entity for which the risk analysis is being performed.
  • the premier assessors are chosen to analyze the selected files because they are the individuals that know their entity's risk assessment processes best.
  • using risk assessors from within the entity being analyzed as part of the RDAP helps to initiate the best practices training.
  • Training a premier group of risk assessors 500 also includes training and synchronizing the group of premier risk assessors.
  • Training and synchronizing includes training the entire group of premier risk assessors as to how to analyze the selected files while all the premier risk assessors are in a single group and then having the premier risk assessors evaluate at least one example file.
  • the example files used in training and synchronization may be chosen from the selected files, from the entity's books or may be pre-developed files.
  • the group of premier risk assessors is then broken down into at least two subgroups and the training and synchronization method is repeated as before except that each subgroup evaluates at least one additional example file. The analyses performed by each subgroup are then compared with each other.
  • If the analyses are not consistent with each other, the training is repeated until they are. Once the analyses are consistent, the subgroups are broken down into progressively smaller subgroups of the premier group. The training and synchronization continues until the risk assessors have been trained individually and the results of their individual analyses are consistent throughout the group of premier risk assessors.
  • the quantitative portion of the RDAP 400 continues with having the group of premier risk assessors analyze the selected files to produce quantitative analyses 502. All the risk assessors in the premier group individually analyze at least one selected file from the point of view of a risk assessor by completing the questionnaire for the selected files. This involves eliciting general information from the selected files and judging what was done in the file in terms of certain performance measures for the first through the sixth risk assessment phases. As previously discussed, the Risk Analysis System (shown in FIG.1 ) may store, display and elicit response to the questionnaire.
  • the premier assessors will generally determine a value for at least one performance measure for each phase and provide a reason for the particular value obtained.
  • the performance measures are generally types of EGO, which include loss cost avoidance, expenses, premium and price differential.
  • the questionnaire will include questions that ask the premier risk assessors to determine the corresponding performance measure for each phase. Additionally, the questionnaire will ask the premier risk assessor to evaluate the value obtained for the performance measure. Table 1 shows one example of performance measures that may be used for each of the risk assessment phases and how these performance measures may be determined:
  • LCA is a measure of leakage due to the loss of a cost avoidance opportunity.
  • Phase 1: associated with every exposure that is identified is the opportunity to reduce costs due to that exposure. For example, if the risk assessment is lending, one possible exposure is that the collateral for the loan may be easily destroyable. If this exposure is identified, the lender has the opportunity to require the borrower to take out insurance on that collateral. However, for each exposure that is not identified, the opportunity to reduce costs associated with the unidentified exposure is lost.
  • LCA is generally defined by the equation:
  • Loss cost avoidance = (actual losses incurred) + IBNR (1)
  • IBNR (or incurred but not reported) represents incurred and anticipated losses that have not been recorded.
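Equation (1) reduces to a one-line computation; the function below is a minimal sketch (the function and parameter names are ours, not the patent's):

```python
def loss_cost_avoidance(actual_losses_incurred, ibnr):
    # Equation (1): LCA = (actual losses incurred) + IBNR, where IBNR is
    # the incurred-but-not-reported (and anticipated) losses.
    return actual_losses_incurred + ibnr
```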
  • the LCA measures the ability to avoid a claim or a default due to improperly assuming a risk.
  • the LCA is an exposure-specific performance measure and is determined using Equation (1).
  • LCA represents the loss of a cost avoidance opportunity due to improperly assuming or rejecting the assumption of a risk.
  • LCA for Phase 3 includes all such losses, independent of exposure, and is determined using Equation (1).
  • Phase 4: the performance measure for Phase 4 is premium.
  • Premium may be determined in terms of gross leakage or net leakage.
  • Gross leakage is a measure of loss that results from charging an insufficient premium, fee and/or interest rate. Gross leakage is generally defined according to the equation:
  • the propensity factor is a value less than or equal to one that represents the likelihood that the sufficient premium, fee or interest rate would have been obtained.
  • expenses is a measure of leakage that represents the costs incurred by an entity on a risk that should not have been assumed.
  • premium is a measure of leakage that represents the fees, interest and/or premiums that were not obtained due to an incorrect assessment.
  • losses in Phase 7, setting the service program, may also be determined.
  • the loss measures in this phase are a function of the under or over utilization of any service resources. When any service resources are over utilized, the performance measure used is expenses because the amount of money spent on service resources that had no monetary impact on the risk account represents an unnecessary expense.
  • When any service resources are underutilized, the performance measure is loss cost avoidance because the EGO in this situation will be due to exposures that most likely would have been discovered or mitigated if the service resources were properly utilized. If this phase is included, the questionnaire will include questions designed to elicit information relating to service resources such as: "In accordance with best practices, should loss control have been ordered to better identify exposures?"
  • Conducting the qualitative portion of the RDAP 402 is done to help identify particular problem areas in the risk assessment methods used by the entity and to help develop best practices that will improve these risk assessment methods.
  • conducting the qualitative portion of the RDAP 402 includes interviewing risk assessors and holding focus groups for risk assessors.
  • the interviews are generally conducted with risk assessors that may or may not belong to the group of premier risk assessors.
  • the interviews ask the risk assessors questions to determine what the risk assessors think they do, and the resources and thought processes they use, in terms of EGO.
  • the focus groups are facilitated interviews with groups of risk assessors during which suggestions for best practices are elicited and developed by the group.
  • This qualitative portion of the RDAP 402 is generally done in parallel with the quantitative portion 400. However, for even better results, the qualitative portion can be done again after the quantitative portion so that the interviews can be targeted towards the risk assessment phases or the analysis dimensions of concern as indicated by the quantitative portion.
  • the results of the quantitative and qualitative portions of the RDAP are used to generate reports 204.
  • This may be accomplished by the Risk Analysis System as previously described.
  • the step of generating reports 204 is shown in more detail in FIG. 7 and generally includes assembling the quantitative analyses and the qualitative results in a database 600; synthesizing quantitative results in terms of the risk assessment phases 602; and generating reports 604.
  • Assembling the quantitative analyses and the qualitative results in a database 600 includes entering the quantitative analyses elicited through use of the questionnaire and the qualitative results into the database of the Risk Analysis System.
  • the database may store this information according to category, such as the identity of the premier risk assessor that analyzed the file and the risk type, so that the data may be retrieved in any number of ways.
  • Entering the quantitative analyses and the qualitative results into the database may be accomplished in any number of ways using the input device (see 1016 in FIG. 1) of the Risk Analysis System, including typing the information on a keyboard and scanning the information and using character recognition software to convert the scanned images into text-based documents.
  • the step of assembling the quantitative analyses and the qualitative results in a database 600 may be done simultaneously with the step of having the group of premier risk assessors analyze the selected files 502 (shown in FIG. 6) by having the group of premier risk assessors input their quantitative analyses directly into the database of the Risk Analysis System.
  • the group of premier risk assessors input their quantitative analyses directly into the database of the Risk Analysis System by providing responses to the questionnaire, which is also implemented in the database of the Risk Analysis System.
  • Synthesizing quantitative results in terms of risk assessment phases 602 includes aggregating the values for the performance measures for each analysis dimension and for the entity as a whole. These aggregate values, along with the quantitative analyses make up the quantitative results. This step may be automated by the Risk Analysis System, as previously discussed. Once the quantitative results are generated, they are stored in the database.
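The aggregation step described above can be sketched as follows; the per-file record layout (a dimension field plus a dictionary of performance-measure values) is an assumption for illustration, not the patent's data model:

```python
from collections import defaultdict

def aggregate_measures(analyses, dimension):
    # Sketch of the synthesis step: sum each performance measure's value
    # per analysis-dimension group and for the entity as a whole.
    by_group = defaultdict(lambda: defaultdict(float))
    overall = defaultdict(float)
    for a in analyses:
        group = a[dimension]
        for measure, value in a["measures"].items():
            by_group[group][measure] += value
            overall[measure] += value
    return {g: dict(m) for g, m in by_group.items()}, dict(overall)
```

The per-group aggregates and the entity-wide totals together correspond to the quantitative results that are stored back into the database.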
  • the values of the quantitative results may then be evaluated by the Risk Analysis System according to a recommendation generator, as previously described, to determine if the values are within the range of any of the recommendations.
  • the recommendation generator then generates the appropriate recommendation for each quantitative result that has a value that falls within the scope of that recommendation.
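A recommendation generator of the kind described might, as one assumed implementation (the rule structure and thresholds below are illustrative, not from the patent), pair each recommendation with a value range for a performance measure:

```python
def generate_recommendations(quantitative_results, rules):
    # Sketch only: each rule is (measure, (low, high), recommendation text).
    # A recommendation fires when the measure's value falls in the range.
    recommendations = []
    for measure, value in quantitative_results.items():
        for rule_measure, (low, high), text in rules:
            if measure == rule_measure and low <= value <= high:
                recommendations.append(text)
    return recommendations
```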
  • the quantitative results and the recommendations are then, together with the results of the qualitative portion of the RDAP (the qualitative results), compiled by the Risk Analysis System into at least one report.
  • Generating reports 604 may be performed by the Risk Analysis System, and includes presenting various aspects of the quantitative and qualitative results and the recommendations in a manner that is easier to understand than the quantitative and qualitative results themselves.
  • the Risk Analysis System may present the data included in the reports in graphical form, but may also present the data as a textual listing.
  • the Risk Analysis System may assemble the information contained in these reports in any manner and in any combination, and may also include an analysis of the quantitative and qualitative results.
  • the following represents merely a sample of the possible reports that may be generated using the quantitative results and the results of the qualitative analysis: an Executive Summary; a Risk Data Analysis Report; and Final Recommendations.
  • An Executive Summary generally contains an overview of the quantitative results in terms of the risk assessment phases, the qualitative results, the major problems identified and some recommendations as to solutions to these problems.
  • the quantitative results to be included in the overview are generally selected manually and include the quantitative results that best represent the current state and illustrate the problems of the risk assessment processes that were analyzed.
  • the quantitative results are not only presented in terms of the risk assessment phases, but also along the selected analysis dimensions.
  • the qualitative results are included.
  • the qualitative results are used to interpret the quantitative results to help identify the problems and suggest the solutions that are likely and/or unlikely to be successful in solving the problems.
  • the qualitative results selected for inclusion in the Executive Summary may include those which represent the current risk assessment methods that, when interpreted together with the quantitative results, are shown to be effective and should be maintained.
  • the recommendations may be prioritized based on the phases that will show the greatest rate of return if the results were improved. Additionally, the recommendations may be quantified in terms of the dollar amounts that could be saved if the recommendations were implemented. For example, one recommendation may be to stop assuming certain types of risks in certain geographic locations due to irreparable losses.
  • the Risk Data Analysis Report includes a summary of the selected files reviewed during the RDAP and the results of the RDAP. Also included in this report are various aspects of the quantitative results presented along the analysis dimensions and in terms of the risk assessment phases. For example, the Risk Data Analysis Report may present the data in terms of the EGO or the EGO per share of the fee or premium received. However, the quantitative results may also or alternatively be presented in terms of LCA and/or Price.
  • the Final Recommendations is a report that includes a prioritized list of the recommendations and a timeline for implementing these recommendations.
  • the recommendations are listed in terms of which recommendation has the greatest potential to increase performance measures. Recommendations are determined based on the findings of the RDAP and may be generated manually or automatically. Examples of recommendations include pricing training, producer management training, industry or line of business training, what reports to order for information gathering and the best practice tools to use to assess exposures. In some cases, recommendations are made that can be quickly implemented and that will generate a sizable return.
  • best practices training is conducted. This best practices training extends the training received by the group of premier risk assessors in the RDAP to other risk assessors in the entity.
  • Conducting the best practices training 104 shown in FIG. 2 is shown in more detail in FIG. 8 and includes an outcome focused learning approach to instill a results orientation for risk assessment. Instead of simply training risk assessors to blindly follow standards, the risk assessors are taught the manner in which their decisions and actions have a real and tangible effect on the profitability of their entity and are enabled to help define the best practices for risk assessment that their entity will adopt.
  • the remaining risk assessors are trained with regard to how the RDAP was conducted and the results of the RDAP including the recommendations. Then using the recommendations as a starting point, the remaining risk assessors participate in preparing best practices for risk assessment that their entity will adopt.
  • the training may be focused specifically on the risk assessment phases determined during the RDAP to have the greatest EGO.
  • Conducting the best practices training 104 includes providing hands-on training for best practices 700; providing content expert presentations 702; providing networking opportunities 704; providing feedback and improvement mechanisms 706; and enabling the determination of best practices 708. All of these portions of the best practices training may be conducted in any order. In general, learning is better facilitated when these portions are intermixed and spread out over a number of days.
  • the hands-on training is a type of experiential learning provided to improve the risk assessment processes and outcomes by training the risk assessors as to the best methods for rapidly changing behaviors. This type of training enables higher retention with regard to the RDAP and the RDAP results and shorter time to proficiency with regard to risk assessment.
  • Training files may include some of the selected files and/or composite files.
  • Composite files may be created by combining facts from some of the selected files and/or may be totally or partially fabricated.
  • Specific training files are chosen or created because they contain fact patterns that will emphasize the best practices, particularly those that, if implemented or improved, will yield the greatest gain in the performance measures.
  • the remaining analyzers review the training files using a process similar to that used by the group of premier risk assessors. Alternately, in one embodiment, the remaining analyzers do not go through the calibration process but instead review the training files in teams each consisting of a subgroup of the remaining analyzers.
  • Each team identifies the positive and negative actions that took place for each risk assessment phase of the risk assessment in terms of the best practices.
  • conferencing allows the teams to learn from each other. Conferencing involves open communication among all the remaining risk assessors from all the teams regarding the facts of some of the training files and the possible opportunities for improving the outcome of the training files.
  • Each team presents their analyses, followed by a discussion among the remaining risk assessors. This discussion may be facilitated by a facilitator. Conferencing ends when all the teams come to a consensus with regard to the analyses.
  • Providing content expert presentations 702 includes having experts in risk assessment make formal presentations on various risk assessment related topics. Generally, the risk assessment related topics will be presented within the context of the risk assessment phases. The experts may come from within the entity or from other sources.
  • Providing networking opportunities 704 includes hosting team building activities, performing checkpoint exercises and hosting social events. These activities all help to build relationships among the risk assessors to provide for a lasting resource within the entity regarding risk assessment and improving risk assessment processes.
  • Providing feedback and improvement mechanisms 706 includes providing surveys and question and answer sessions on a periodic basis throughout the best practices training. This enables the best practices training to be constantly improved and updated as it is performed.
  • Enabling the determination of best practices 708 allows the remaining risk assessors to be part of the process whereby the best practices are determined and adopted. Generally, the best practices are determined through discussion among the remaining risk assessors, which continues until a consensus is reached regarding which practices will be adopted as the best practices. Using a discussion and consensus approach to determining best practices helps instill in the remaining risk assessors a sense of ownership that will help to ensure that the best practices will be integrated into the way their entity conducts business, thereby promoting compliance with the best practices.
  • Although determining the best practices may be enabled at any time during the best practices training, it is beneficial to at least begin the determination after the remaining risk assessors have had some training regarding the RDAP and its results so that they have a clear picture of the current state of risk assessment in their entity.
  • the remaining risk assessors may develop the best practices from existing entity practices, industry standard practices or entirely from scratch. If the remaining risk assessors develop the best practices from current entity or industry practices, it is beneficial for them to receive some training regarding the qualitative results and the recommendations so that they may use these as a starting point.
  • a quality assurance process should be enabled so that the quality of the risk assessment of files not analyzed during the RDAP and files reviewed during the RDAP that were subsequently updated can be monitored.
  • enabling a quality assurance process 106 includes developing a method for monitoring risk assessment that allows the risk assessors or others within an entity to monitor their own risk assessment processes.
  • the quality assurance process allows the entity to review files and arrive at a quantitative scoring of the quality of the risk assessment, which in turn allows an entity to review the risk assessment at a granular or macro level for quick identification of where in the risk assessment processes there may be a lack of understanding or a problem with the best practices.
  • Enabling a quality assurance process 106 includes developing a framework 800; developing a quality assurance database 802; and conducting a scoring process 804.
  • Developing a framework for the quality assurance process 800 is shown in more detail in FIG. 10 and includes developing an approach 900; evaluating the current risk assessment review processes 902; outlining the detailed requirements and developing materials 904; determining key metrics 906; and beginning implementation and ongoing training 908.
  • Developing an approach 900 generally includes developing the plan for enabling the quality assurance process. This plan is developed from the current state of the entity's risk assessment processes and the desired scope of the quality assurance process. In order to develop the plan, the current state of the entity's risk assessment processes, as determined in the RDAP, is confirmed.
  • the scope of the quality assurance process is defined.
  • the scope may be defined to include all types of files, or only those that were identified as not conforming to best practices during the RDAP. Alternately, the scope may be defined in terms of line of business or in terms of any of the other dimensions for which the RDAP was carried out.
  • the files or file types included in the scope will be the files or file types monitored by the quality assurance process.
  • a determination is made as to the resources that will be needed for the quality assurance process. These resources include the entity or other personnel needed to help perform the quality assurance process.
  • Evaluating the current risk assessment review processes 902 provides a view of the entity's current audit processes and uses the current audit processes as a baseline by which new audit processes are developed.
  • the entity's current audit processes are the processes currently in use by the entity to determine the quality of its risk assessment; in some cases, current audit processes may not exist. Evaluating these processes includes reviewing and documenting the current audit processes, including any forms, databases, questionnaires and other tools used by the entity in its current risk assessment review process.
  • a new risk assessment review process is developed.
  • the new risk assessment review process may include all or only a portion of the recommendations developed during the RDAP.
  • Outlining the detailed requirements and developing materials 904 includes adapting the questionnaire used during the RDAP for use as an audit tool.
  • This "audit questionnaire" is generally revised to reflect the scope of the new risk assessment review process and the results of the RDAP. For example, the questionnaire may be changed so that it only asks questions related to the first two phases of risk assessment for a particular geographic location.
  • outlining the detailed requirements may include determining the specific resources needed to perform the new risk assessment process such as the level and areas of expertise required of the personnel involved in this process.
  • Determining key metrics 906 includes determining the metrics by which files analyzed by the new risk assessment process are to be judged. Additionally, determining the key metrics 906 includes establishing baseline and target numbers for the key metrics.
  • the baseline is the state of the risk assessment processes at the time of the RDAP or at some other defined time in the past.
  • the state of the risk assessment processes is generally given in terms of the performance measures used in the RDAP.
  • the target numbers are also generally given in terms of the performance measures and represent the desired state of the risk assessment processes.
  • determining key metrics may include establishing a reward or incentive program to reward risk assessors that help the entity meet its target for the key metrics.
  • Performing ongoing training 908 includes training the resources to perform or help perform the quality assurance process.
  • the quality assurance database is generally developed from the database used during the RDAP.
  • the quality assurance database may be developed from and/or stored in the database of the Risk Analysis System, as previously discussed. However, it can be developed from almost any database structure.
  • the quality assurance database is set up to store information within the scope and dimensions and for the performance metrics defined for the quality assurance process.
  • the quality assurance database may also store the audit questionnaire.
  • Conducting the scoring process 804 is similar to conducting the quantitative phase of the RDAP. Questions from the audit questionnaire are answered by the resources and the answers are ultimately input into the quality assurance database. The audit process then uses the answers to compute one or more scores for the defined performance metrics.
  • the audit may be performed manually.
  • the audit may be performed by the Risk Analysis System according to the score generating software (SGS).
  • each file audited is evaluated in terms of how well it follows the best practices and recommendations established during the best practices training and the RDAP, respectively.
  • the score gives a numerical measure of how well the best practices and recommendations were followed.
  • the score may be weighted, so that the scores generated for the performance metrics along defined dimensions may have greater weight (multiplied by a number greater than one) or a lesser weight (multiplied by a number less than one) than the scores along other non-weighted dimensions.
  • the audit process also generates audit recommendations based on the scores for the performance metrics.
  • the resource may then review the scores and recommendations and may add comments and other recommendations.
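The weighted scoring described in the bullets above can be illustrated with a short sketch. All metric names, weights, and score values below are hypothetical examples chosen for illustration, not values taken from the disclosure.

```python
def weighted_file_score(metric_scores, weights):
    """Combine per-metric audit scores into a single file score.

    metric_scores: performance-metric name -> raw score (0-100).
    weights: metric name -> multiplier; values above one give a dimension
             greater weight, values below one lesser weight, and metrics
             absent from the mapping default to a weight of one.
    """
    total_weight = sum(weights.get(m, 1.0) for m in metric_scores)
    weighted_sum = sum(s * weights.get(m, 1.0) for m, s in metric_scores.items())
    return weighted_sum / total_weight

# Hypothetical audit of one file: the pricing metric is weighted twice
# as heavily as the other (unweighted) dimensions.
scores = {"exposure_identification": 80, "pricing": 60, "negotiation": 90}
print(weighted_file_score(scores, {"pricing": 2.0}))  # 72.5
```

A weight of 2.0 on the pricing metric pulls the combined score toward the pricing result, mirroring the "greater weight" behavior described above.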

Description

RISK DATA ANALYSIS SYSTEM
BACKGROUND
For entities (including businesses, companies and individuals), such as those that issue insurance or grant loans and/or leases, assuming risk is an unavoidable part of doing business. An absolute necessity for such entities is the ability to effectively and efficiently assess risk ("risk assessment"). Risk assessment involves not only determining the nature and extent of the risk but also the potential for profit gained by assuming the risk. For entities involved in any type of risk assessment, it is crucial to have well-trained people to make the risk assessments ("risk assessors"), and to monitor, analyze and update any processes used by the risk assessors for risk assessment ("risk assessment processes"). Audits are generally conducted either manually or by using a system that can only determine basic information including the number of risk assessments made, the risk assessors that performed the risk assessment, the amount charged to assume the risk (such as a premium, interest or fee) and the time frame for the risk assumption. Unfortunately, this basic information is not sufficient to determine whether risks are being assessed in an efficient and cost-effective manner. There are limited means to determine whether an entity is effectively assessing risk; however, these limited means tend to be costly and time-consuming. Further, it is difficult to determine whether risk assessors are using the best methods for streamlining the risk assessment processes. To compound this problem, risk assessors generally receive only on-the-job training and perhaps a limited amount of formal introductory training. Furthermore, this training generally focuses on the processes and methods to be used and regulations to be followed rather than on obtaining a financially viable outcome.
As such, there is a need for systems that can review risk assessment processes more completely in order to provide information that can be used to determine which processes are beneficial, whether the consideration received for assuming the risk is appropriate, and whether risks are being handled efficiently and effectively. In addition, there is a need for technology-based solutions that enable and support such systems by providing automated and customizable data storage, retrieval and analysis.
SUMMARY
A system is presented that takes an outcome-oriented approach to analyzing risk assessment, training risk assessors as to best practices for assessing risk, and enabling a quality assurance process (the "Risk Data Analysis System" or "RDA System").
The RDA System generally includes a system that implements the RDA System in a timely and efficient manner (a "Risk Analysis System"). Each aspect of the RDA System, including performing the RDAP, conducting a best practices training, and enabling a quality assurance process (shown in FIG. 2), may be implemented or supported by the Risk Analysis System. The Risk Analysis System generally includes a risk analysis unit and may also include an interface unit. The risk analysis unit may include a database that can be custom-developed or developed within the framework of existing database software such as Microsoft Access. The database is configured to store and efficiently retrieve the information used and produced by the Risk Analysis System. The database may include an analyzing portion, which may include modules for developing questionnaires, developing databases, selecting files, performing the quantitative analyses, synthesizing the quantitative results, generating reports, conducting the scoring processes, and any subset of the foregoing. The automation and customization provided by these modules improves the speed, accuracy and comprehensiveness of the RDA System.
The RDA System may determine performance measures, such as economic gain opportunities in risk assessments, identify in which phase of the risk assessment life-cycle these performance measures are the greatest, focus improvement and training efforts on these risk assessment phases using a hands-on approach, and use the performance measures to monitor past, current and future compliance with the best practices associated with these risk assessment phases. The RDA System includes a group of methods, methodologies, questionnaires, software, hardware and analyses that analyze risk assessment processes by providing a view of the current state of the risk assessment processes, developing or improving best practices, providing training as to the best practices and enabling the monitoring of compliance with the best practices.
The RDA System may also include one or more methods such as, performing a risk data analysis procedure ("RDAP"), conducting best practices training, and enabling a quality assurance process. The RDAP uses information relating to risk assessment processes and evaluates this information in terms of the life cycle or phases of risk assessment. The RDAP includes preparing for the RDAP, conducting the RDAP to generate quantitative analyses and qualitative results, and generating reports that include recommendations and suggestions for improvement based on the quantitative analyses and the qualitative results. As previously discussed, much of the RDAP may be implemented in a Risk Analysis System.
To begin the implementation of the improvements and recommendations suggested by the RDAP, best practices training is conducted. Conducting the best practices training includes an outcome-focused learning approach used to instill a results orientation for risk assessment. More specifically, conducting the best practices training includes providing hands-on training for best practices; providing content expert presentations; providing networking opportunities; providing feedback and improvement mechanisms; and enabling the determination of best practices. Enabling a quality assurance process allows the monitoring of compliance with the best practices after the RDAP has been completed and includes developing a framework, developing a quality assurance database, and conducting a scoring process. As previously discussed, much of the quality assurance process may be implemented in a Risk Analysis System.
BRIEF DESCRIPTION OF THE DRAWINGS
Described herein are numerous embodiments, which will be understood by those skilled in the art, based on the present disclosure. Some of these are described below and are represented in the drawings by several figures, in which:
FIG. 1 is a block diagram of a Risk Analysis System;
FIG. 2 is a flow chart of a method for improving risk assessment processes and outcomes;
FIG. 3 is a flow chart of a method for performing a risk data analysis procedure ("RDAP");
FIG. 4 is a flow chart of a method for preparing for the RDAP;
FIG. 5 is a flow chart of a method for conducting the RDAP;
FIG. 6 is a flow chart of a method for conducting the quantitative portion of the RDAP;
FIG. 7 is a flow chart of a method for generating the reports of the RDAP;
FIG. 8 is a flow chart of a method for providing training for best practices;
FIG. 9 is a flow chart of a quality assurance process; and
FIG. 10 is a flow chart of a method for developing a framework.
DETAILED DESCRIPTION
A system for improving processes and outcomes in risk assessment (the "Risk Data Analysis System" or the "RDA System") includes a group of methods, methodologies, questionnaires, software and analyses for analyzing risk assessment processes in order to improve them by providing a view of their current state, developing or improving best practices, providing training as to best practices, and enabling the monitoring of compliance with the best practices.
The RDA System takes an outcome-oriented approach to analyzing risk assessment processes. The RDA System may identify economic gain opportunities ("EGO") in risk assessment, identify in which phase of the risk assessment life-cycle ("risk assessment phase" or "phase") EGO exists, and focus improvement, training and monitoring efforts on these risk assessment phases. In general, EGO is a measure of leakage determined for each risk assessment phase, and represents the revenues lost and losses incurred in the phases due to inefficient practices. Therefore, the risk assessment phases that have a higher EGO may be identified and targeted for improvement during best practices training and for future monitoring by a quality assurance process.
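As a rough illustration of this prioritization, EGO for each phase might be computed as the sum of the revenues lost and losses incurred in that phase, with the highest-EGO phase targeted first. The phase names and amounts below are invented for the example and are not figures from the disclosure.

```python
# Hypothetical leakage per phase: (revenues lost, losses incurred).
leakage = {
    "identify_exposures": (120_000, 40_000),
    "evaluate_exposures": (300_000, 150_000),
    "set_price":          (500_000, 90_000),
}

# EGO per phase = revenues lost + losses incurred in that phase.
ego = {phase: lost + incurred for phase, (lost, incurred) in leakage.items()}

# Phases with the highest EGO are targeted first for training and monitoring.
priority = sorted(ego, key=ego.get, reverse=True)
print(priority[0])  # set_price
```

In this invented data set, the pricing phase leaks the most and would therefore be the first focus of the best practices training.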
In one sense, the RDA System can be considered a "toolkit" that includes a collection of "tools" or components that can be used in many combinations to analyze risk assessment processes. It is convenient to group these tools as follows: a risk data analysis procedure ("RDAP"); the recommendations; the best practices training; the quality assurance process; the quality assurance database; and the Risk Analysis System. The Risk Analysis System and the quality assurance database implement, support and automate the remaining tools. The Risk Analysis System may include risk analysis software ("RAS"), score generating software ("SGS"), and quality assurance software ("QAS"), which provide the technical support for the remainder of the Risk Data Analysis System. The RDAP uses information relating to risk assessment processes and evaluates the information in terms of the risk assessment life cycle. The RDAP may include qualitative and quantitative portions, which, together with the Risk Analysis System, generate the recommendations. The recommendations may be used in the best practices training to improve the risk assessment processes. The quality assurance process, along with the Risk Analysis System and the quality assurance database, helps to sustain the recommendations.
These groups of tools can be used separately and the individual tools within these groups can also be used separately. However, when used in combination with each other, a highly effective analysis can be obtained. One example of the tools included in the RDA System as they are used in combination is shown generally in FIG. 2. In this combination, the tools embody a method for improving risk assessment processes and outcomes 100. The method for improving risk assessment processes and outcomes 100 generally includes performing a risk data analysis procedure or RDAP 102, conducting a best practices training 104, and enabling a quality assurance process 106. Because risk assessment processes tend to coincide with the risk assessment phases, the RDA System may analyze risk assessment processes in terms of each risk assessment phase in order to identify opportunities for improving the risk assessment processes. This analysis enables an evaluation of how well the individual risk assessment processes yield profitable outcomes and how well they are executed.
In general, the process of risk assessment goes through at least six (6) risk assessment phases. The first phase involves identifying exposures. This is the phase in which all the risks that might be encountered in a particular case are identified. The second phase, evaluating exposure, involves determining the likelihood that the exposures will become actual losses and the potential extent of the actual losses should they occur. In some cases, the second phase may be combined with the first into a single phase. The third phase involves making the risk decision. In this phase, a decision is made based on the results of the second phase regarding whether to assume the risk. If, during the third phase, it is decided that the risk will not be assumed, the life-cycle of the risk assessment process stops at the end of the third phase. However, if the risk is assumed in the third phase, the life-cycle continues with the fourth phase, setting the terms and conditions. Setting the terms and conditions involves determining the details under which the risk will be assumed. These details may include the duration of the risk assumption, the duties of the entity from which the risk will be assumed, the precise definition and scope of the risk assumption and any other term except price. Setting the price for which the risk will be assumed is generally done separately in the fifth phase because the price depends, in part, on the particular terms and conditions under which the risk will be assumed. The next phase is negotiation. In this sixth phase, the set terms and conditions and/or price may be adjusted in order to make them acceptable and so that a risk relationship is established. Additionally, in some cases, a seventh phase is also included. This seventh phase, setting the service program, involves determining the effectiveness of any service resources that were used.
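The life-cycle just described, including the early stop when a risk is declined in the third phase, can be sketched as a simple walk over the phases. This is only a simplified illustration of the phase ordering, not the patented method itself.

```python
PHASES = [
    "identify exposures",        # 1
    "evaluate exposures",        # 2
    "make the risk decision",    # 3
    "set terms and conditions",  # 4
    "set the price",             # 5
    "negotiate",                 # 6
    "set the service program",   # 7 (included in some cases)
]

def run_life_cycle(assume_risk):
    """Walk the phases in order; stop after phase 3 if the risk is declined."""
    completed = []
    for number, phase in enumerate(PHASES, start=1):
        completed.append(phase)
        if number == 3 and not assume_risk:
            break  # declined: the life-cycle ends with the risk decision
    return completed

print(len(run_life_cycle(assume_risk=False)))  # 3
print(len(run_life_cycle(assume_risk=True)))   # 7
```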
Service resources are internal and external services that may be used to evaluate, mitigate or otherwise respond to different aspects of a risk and generally include loss control, claims and premium audits.
Loss control audits involve assessing risks and exposures before and/or after a risk has been assumed. Premium audits, which may also include interest or fee audits, generally involve either a voluntary or a physical audit of a business that is or will be the subject of a risk policy (such as the insured operation covered by or to be covered by an insurance policy and the collateral for a loan) to see if the premium, interest and/or fees charged were the proper amount based on the assumptions made by the risk assessor. Additionally, there may be risk assessment specific service resources such as claim audits and default audits. For example, if the risk assessment is underwriting, an additional service resource may be claim audits. Claim audits generally involve special handling of claims made under an insurance policy, which can be costly. In another example, if the risk assessment is lending, an additional service resource may be default audits. Default audits generally include determining the circumstances under which there is a default on a loan repayment. The effectiveness of any service resources that are used is evaluated by determining the over or under utilization of the service resources. Over utilization of a service resource occurs when that service resource is used beyond the point at which its use produces a monetary savings or measurable improvement. Under utilization occurs when a service resource is not used in situations where its use would have produced a monetary savings or loss cost impact. Although this phase is the seventh phase, it applies to the use of service resources before and/or after a risk is assumed.
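A minimal sketch of the utilization test described above follows. The text states the test qualitatively, so the function simply compares the number of actual uses against the number of uses that would have produced a savings or measurable improvement; quantifying the test this way is an assumption made for the illustration.

```python
def classify_utilization(times_used, beneficial_uses):
    """Classify a service resource's utilization on one file.

    times_used: how often the resource (e.g. a loss control audit) was used.
    beneficial_uses: how many uses would have produced a monetary savings
        or measurable improvement (a hypothetical way to quantify the test).
    """
    if times_used > beneficial_uses:
        return "over-utilized"
    if times_used < beneficial_uses:
        return "under-utilized"
    return "appropriately utilized"

print(classify_utilization(3, 1))  # over-utilized
print(classify_utilization(0, 2))  # under-utilized
```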
Because many entities involved in risk assessment have large and complicated organizations, the RDA System can analyze risk assessment along various dimensions ("analysis dimensions"). These analysis dimensions include line of business, geographic location, type of risk assumed, branch, division, office or any other dimension along which analysis is desired. For example, an entity may want to compare the performances of each of its branch offices in terms of the type of risk they assume. In this case, the RDA System would analyze risk assessment in terms of the office and type of risk by determining the EGO for each branch office for each of the types of risk they assess.
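The branch-office example above amounts to aggregating EGO along a pair of analysis dimensions. A sketch of that aggregation follows; the offices, risk types, and amounts are invented for the example.

```python
from collections import defaultdict

# Each record: (branch office, type of risk, EGO found in that file).
files = [
    ("Boston", "property", 10_000),
    ("Boston", "casualty",  4_000),
    ("Denver", "property",  7_500),
    ("Boston", "property",  2_500),
]

# Total EGO along the (office, type of risk) analysis dimensions.
ego_by_dimension = defaultdict(int)
for branch, risk_type, ego in files:
    ego_by_dimension[(branch, risk_type)] += ego

print(ego_by_dimension[("Boston", "property")])  # 12500
```

The same loop works for any other dimension pair (line of business, geographic location, division, and so on) by changing the grouping key.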
Risk Analysis System
In order to implement and support the RDA System in an efficient and time-effective manner, a Risk Analysis System has been developed. Each aspect of the RDA System, including performing the RDAP, conducting a best practices training, and enabling a quality assurance process (shown in FIG. 2), may be implemented or supported by the Risk Analysis System. An example of a Risk Analysis System is shown in FIG. 1, and indicated by reference number 1000. The Risk Analysis
System 1000 generally includes a risk analysis unit 1002 and may also include an interface unit 1004. The interface unit 1004 generally includes an input device 1014 and an output device 1016. The output device 1016 may include any type of visual, manual, audio, electronic or electromagnetic device capable of communicating information from a processor, memory device or other computer readable storage medium to a person, processor, memory device or other computer readable storage medium. Examples of output devices include, but are not limited to, monitors, speakers, liquid crystal displays, networks, buses, and interfaces. The output device 1016 may receive the communication from the risk analysis unit or other computer readable storage medium via an output signal 1012. The input device 1014 may be any type of visual, manual, mechanical, audio, electronic, or electromagnetic device capable of communicating information from a person, processor, memory device or other computer readable storage medium to any of the foregoing. Examples of input devices include keyboards, microphones, voice recognition systems, trackballs, mice, networks, buses, and interfaces. Alternatively, the input and output devices 1014 and 1016, respectively, may be included in a single device such as a touch screen, computer, processor or memory coupled to the processor via a network. The interface unit 1004 may include a plurality of input and output devices (not shown) to enable a plurality of risk assessors or the group of premier risk assessors to enter quantitative analyses directly into the input device. The interface unit 1004 may communicate with the risk analysis unit 1002 via an input signal 1010.
The risk analysis unit 1002 basically includes a processor 1020 coupled to a memory device 1018. The memory device 1018 may be any type of fixed or removable digital storage device, and (if needed) a device for reading the digital storage device, including floppy disks and floppy drives, CD-ROM disks and drives, optical disks and drives, hard-drives, RAM, ROM and other devices for storing digital information. The processor 1020 may be any type of apparatus used to process digital information. The memory device 1018 may communicate with the processor 1020 via a memory signal 1024 and a processor signal 1022. The memory device may also receive communication from the input device 1014 of the interface unit 1004 either directly via the input signal 1010 (not shown) or through the processor 1020 via the input signal and the processor signal 1022. The memory device may similarly communicate with the output device 1016 of the interface unit 1004 directly via the memory signal 1024 (not shown), or indirectly via the memory signal 1024 and the output signal 1012.
The memory device 1018 may include a database 1029 that can be custom-developed or developed within the framework of existing database software such as Microsoft Access. The database is configured to store and efficiently retrieve the information used and produced by the Risk Analysis System. The database 1029 may include a storing portion 1030. The storing portion 1030 may store questionnaires, quantitative analyses, qualitative results and quantitative results of an RDAP, customized and/or industry best practices, recommendations, reports, questionnaires, and any other information or data. The memory device 1018 may also include a quality assurance database (discussed subsequently) as part of the database 1029 or as a separate database. The database may also include an analyzing portion 1032, which may include modules for developing questionnaires, developing databases, selecting files, performing the quantitative analyses, synthesizing the quantitative results, generating reports, conducting the scoring processes, and any subset of the foregoing. The modules may be stored in the memory device 1018, the processor 1020, other computer readable storage media, or a combination of the foregoing. Alternatively, the modules may be encoded in a computer readable electromagnetic signal. When implemented as software code, the modules may be object code or any other code describing or controlling their functionality.
The automation provided by these modules improves the speed, accuracy and comprehensiveness of the RDA System. During the RDAP, the Risk Analysis System may be used to prepare for and conduct the RDAP, and generate reports as a result of the RDAP. In preparing for the RDAP, the module for developing questionnaires may be used to develop and store, in electronic form, a questionnaire that is designed to elicit and capture information needed by the RDAP. While enabling the quality assurance process, the module for developing questionnaires may be used to adapt the questionnaire used during the RDAP for use as an audit tool. The module for developing databases may be used in preparing for the RDAP to customize the database 1029 so that it can store the particular information elicited in a particular RDAP. The module for performing the RDAP may then, together with the interface unit 1004, capture and/or store the information elicited by the questionnaire in the memory device 1018 or other computer readable storage medium.
Additionally, the module for file selection may perform some or all of the steps of the file selection process during which the files upon which the RDAP will be performed are selected. File selection may include generating a performance report, performing an account run, performing a calibration step, and designating certain files as selected files. The Risk Analysis System can generate a performance report from information stored in the database 1029 that provides a summary of the profits and losses along a desired dimension or dimensions. In addition, the Risk Analysis System can prepare an inventory of accounts and/or policies along the dimensions for which problem areas were made evident in the performance report (perform an account run). The Risk Analysis System may select individual files for analysis using one of a number of search routines (perform a calibration) and designate the files chosen as "selected files."
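The four file-selection steps can be sketched as a small pipeline. The record layout, the field names, and the calibration rule (pick the worst-performing files first) are assumptions made for illustration; the disclosure leaves the specific search routines open.

```python
def select_files(db_records, dimension, loss_threshold):
    """Sketch of the file-selection process over hypothetical database records."""
    # 1. Performance report: summarize profit/loss along the chosen dimension.
    report = {}
    for rec in db_records:
        key = rec[dimension]
        report[key] = report.get(key, 0) + rec["profit"]

    # 2. Account run: inventory files in segments the report flags as problems.
    problem_segments = {k for k, profit in report.items() if profit < loss_threshold}
    inventory = [rec for rec in db_records if rec[dimension] in problem_segments]

    # 3. Calibration: one possible search routine — worst performers first.
    inventory.sort(key=lambda rec: rec["profit"])

    # 4. Designate the chosen files as "selected files".
    return [rec["file_id"] for rec in inventory]

records = [
    {"file_id": "A1", "office": "east", "profit": -5000},
    {"file_id": "B2", "office": "west", "profit":  9000},
    {"file_id": "C3", "office": "east", "profit":  1000},
]
print(select_files(records, "office", loss_threshold=0))  # ['A1', 'C3']
```

Here the east office shows a net loss, so both of its files become selected files, ordered worst first, while the profitable west office is excluded.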
The module for generating reports may include a module for synthesizing the quantitative results. The module for synthesizing quantitative results may include risk analysis software ("RAS"). The RAS generally communicates the quantitative analyses, qualitative results, customized and/or industry best practices, and instructions stored in the storing portion 1030 or other computer readable storage device to the processor 1020, according to which the processor 1020 generates quantitative results. The RAS, together with the processor 1020, may synthesize the quantitative results by aggregating the values captured by the questionnaire. The RAS may include the quantitative analyses themselves as part of the quantitative results. The quantitative results may then be communicated to the storing portion 1030 or other computer readable storage device for storage and/or to the interface unit 1004 for display. The module for generating reports may also include a recommendation generator. The recommendation generator generates recommendations based on the quantitative results. The memory device 1018 or other computer readable storage medium communicates the recommendation generator to the processor 1020 via a memory signal 1024 upon receiving the relevant request from the processor 1020 made via a processor signal 1022. Further, the module for generating reports may compile the recommendations and the qualitative results into at least one report. The report may then be used for best practices training.
The module for conducting the scoring processes may perform an audit and generate scores for performance metrics based on the information captured during the audit process. For performing the audit process, the module for conducting the scoring process may include score generating software ("SGS"). During an audit process, the SGS generally instructs the processor 1020 to evaluate files in terms of how well they follow the best practices and recommendations, and to generate scores for the files that reflect the evaluation. The SGS may additionally or alternatively generate audit recommendations based on the scores. The SGS may be implemented independently or together with the RAS.
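One way an SGS-style audit might turn best-practice compliance into a score and audit recommendations is sketched below. The practice list, the scoring rule (percentage of practices followed), and the recommendation wording are all invented for illustration and are not taken from the disclosure.

```python
# Hypothetical checklist of best practices established during training/RDAP.
BEST_PRACTICES = [
    "all exposures identified",
    "price matches terms and conditions",
    "service resources reviewed",
]

def audit_file(answers):
    """answers: best practice -> True if the file followed it, else False.

    Returns a numerical score reflecting how well the best practices were
    followed, plus audit recommendations for the practices that were missed.
    """
    followed = sum(1 for bp in BEST_PRACTICES if answers.get(bp))
    score = 100.0 * followed / len(BEST_PRACTICES)
    recommendations = ["Review practice: " + bp
                       for bp in BEST_PRACTICES if not answers.get(bp)]
    return score, recommendations

score, recs = audit_file({
    "all exposures identified": True,
    "price matches terms and conditions": False,
    "service resources reviewed": True,
})
print(round(score, 1), recs)
```

A resource could then review this machine-generated score and recommendation list and append comments, as the description of the scoring process contemplates.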
Quality Assurance Database
The quality assurance database is configured to store information within the scope and dimensions, and for the performance metrics defined for the quality assurance process. In addition, it may be configured to store an audit questionnaire. The quality assurance database is generally developed from the database used during the RDAP. For example, the quality assurance database may be developed from and/or stored in the database of the Risk Analysis System, as previously discussed. However, it can be developed from almost any database structure. As previously discussed, the quality assurance database may be implemented in the database of the Risk Analysis System. However, the quality assurance database may be implemented in any memory device or other computer readable storage device.
Risk data analysis procedure
The risk data analysis procedure ("RDAP") uses information relating to risk assessment processes and evaluates this information in terms of the risk assessment life-cycle. To optimize the effectiveness of the RDAP, the Risk Analysis System, including the RAS, can be used to implement the RDAP. The results of the RDAP provide a clear scorecard regarding the quality of risk assessment processes in terms of the accepted standards used by the risk assessment industry (collectively the "industry standards") and identify and prioritize opportunities for improvement. In general, information is obtained from documents in files relating to individual risk assessment cases, such as the loan-related documents in a loan file for a specific loan for a particular customer or the underwriting-related documents in an underwriting file for a specific insurance policy for a particular client. The industry standards used are those that apply generally to the risk assessment industry and can be customized to include the industry best practices relating specifically to the type of risk assessment being analyzed (the "customized industry best practices"). The RDAP 102 is shown generally in FIG. 3 and includes preparing for the RDAP 200, conducting the RDAP to generate quantitative analyses and qualitative results 202, and generating reports from the quantitative analyses and the qualitative results 204. Preparing for the RDAP 200 is shown in more detail in FIG. 4 and generally includes defining the analysis dimensions for the RDAP 300, developing a questionnaire 302, developing a database 303, and selecting files from which the information is to be obtained (the "selected files") 304. As previously discussed, the analysis dimensions are used to categorize the information used for the RDAP and the results generated by the RDAP into groups that provide insights into the desired segments of a risk-assessing entity.
Although the analysis dimensions may be predefined and static from one RDAP to another, it is more useful to define the analysis dimensions for each RDAP so that the results are customized for the particular entity involved. Additionally, the analysis dimensions may be further broken down into subgroups. Using subgroups helps to demonstrate specific problem areas within each of the analysis dimensions. Examples of analysis dimension subgroups include geographic area of the risk, policy or loan duration, degree of risk involved, resources used in risk assessment, types of liability, external and internal data sources used in risk assessment, number of claims or defaults made, uniformity of the information used and overlooked exposures. Once the dimensions for the RDAP are defined, a questionnaire is developed 302. This questionnaire is used to elicit information from the selected files so this information can be used to create quantitative analyses and to determine compliance with the industry best practices and/or the customized best practices during the RDAP. The questionnaire may be a form questionnaire, a customized questionnaire or a partially customized questionnaire. A form questionnaire generally includes standard questions applicable to most risk analysis situations including questions designed to elicit information used to determine compliance with industry best practices and is not altered for any particular RDAP. A customized questionnaire is developed for a particular RDAP and includes questions that use the particular terms and language of and elicit information that is particular to the entity for which the RDAP is to be performed. Additionally, the customized questionnaire includes questions designed to elicit information used to determine compliance with the custom best practices. However, the questionnaires may also be partially customized in that, although they use standard questions for each RDAP, the language is altered so that entity-specific jargon or terminology is used and questions are added to elicit information specific to the entity being analyzed and to determine compliance with the customized and/or industry best practices. For entities outside the United States, significant amounts of customization are generally required because the types of relevant information are most likely going to be different.
Whether developed as a form questionnaire, a customized questionnaire, or a partially customized questionnaire, the questionnaire is designed to elicit information relating to the risk assessment phases. Table 1 shows an example of questions that may be included in the questionnaire according to the risk assessment phase from which they are designed to elicit information.
[Table 1, listing example questions organized by the risk assessment phase from which they are designed to elicit information, is not reproduced in this text.]
Some of these questions are used to determine compliance with the customized and/or industry best practices related to each of the phases. In addition to the questions that directly ask if there was compliance with best practices, questions such as: "Have all exposures been identified?" and "If not, what exposures were missed?" may also be used to determine compliance with best practices.
Additionally, the questionnaire may generally include questions designed to elicit a determination of performance measures for each of the phases. The questionnaire may provide space for answers to be written in or it may provide a group of possible answers from which a choice can be made. The questionnaire can be in written, oral, digital, analog or any other form capable of eliciting the needed information.
One example of an embodiment of a digital questionnaire and a system for capturing answers, including data storage, analysis and reporting capabilities, is described by Costonis et al. in the commonly-owned co-pending U.S. Patent Application No. 09/559,725, filed April 28, 2000, entitled "Claim Data Analysis Toolkit," which is hereby incorporated by reference in its entirety. In another example, a Risk Analysis System, such as that shown in FIG. 1, may store a digital questionnaire and provide the means for capturing the answers.
The next step in preparing for the RDAP 200 (FIG. 4) is developing a database 303 to store the information elicited using the questionnaire. As shown in FIG. 1, the database 1029 may reside in the memory device 1018 of the Risk Analysis System 1000. The database 1029 may be designed to store the RAS, SGS, quality assurance database, QAS, customized and/or industry best practices, recommendation generator, recommendations, the questionnaire, and/or any subset of the foregoing. The database may be developed prior to the RDAP process and modified after, or at the same time that, the questionnaire is developed or the files are selected, so that it will be able to store any information elicited by customized questions. Alternatively, the database may be developed entirely at any of these times. The structure of the database can be totally custom-developed or developed within the framework of existing database software such as Microsoft Access. As previously discussed, the database will generally have two portions, a storing portion and an analyzing portion. The storing portion will store the reports, the quantitative analyses, the qualitative results, the recommendations and any other needed information or data. It may further store the quality assurance database. The analyzing portion may include the RAS and/or the SAS. As part of the RAS, the analyzing portion may include a recommendation generator that generates recommendations based on the quantitative results.
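The specification does not prescribe a schema for the two-portion database; the following sketch is one hypothetical layout of the storing portion using SQLite (all table and column names are illustrative, not taken from the specification):

```python
import sqlite3

# Illustrative schema for the storing portion of the database; every table
# and column name here is a hypothetical stand-in, not from the patent.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE selected_files (
    file_id  INTEGER PRIMARY KEY,
    risk_type TEXT,
    assessor  TEXT
);
CREATE TABLE questionnaire_answers (
    file_id  INTEGER REFERENCES selected_files(file_id),
    phase    INTEGER,   -- risk assessment phase (1-6)
    question TEXT,
    answer   TEXT
);
CREATE TABLE quantitative_results (
    file_id             INTEGER REFERENCES selected_files(file_id),
    phase               INTEGER,
    performance_measure TEXT,   -- e.g. 'LCA', 'premium', 'expenses'
    value               REAL
);
""")
conn.execute("INSERT INTO selected_files VALUES (1, 'underwriting', 'assessor_a')")
conn.execute("INSERT INTO quantitative_results VALUES (1, 1, 'LCA', 12500.0)")

# The analyzing portion can then aggregate stored values, as described above.
total = conn.execute(
    "SELECT SUM(value) FROM quantitative_results WHERE performance_measure='LCA'"
).fetchone()[0]
print(total)  # 12500.0
```

A custom-developed database or one built in software such as Microsoft Access would follow the same storing/analyzing split; SQLite is used here only because it is self-contained.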
Defining the analysis dimensions for the RDAP 300 and developing the questionnaire 302 generally defines the information needed to conduct the RDAP. The files from which this information is to be obtained in order to produce the analyses are then selected 304. The number of files selected must be sufficient to yield statistically significant data. In general, to fulfill this requirement, at least 8-10% of the relevant files in an entity's books are selected. Generally, files are selected that are representative of all the files in an entity's books. However, files representing exceptional activities may also or alternatively be chosen. Selecting the files 304 may involve a four (4) step process. In the first step, a performance report is generated to provide a summary of the profits and losses of the entity along a desired dimension or dimensions. For example, the performance report may be generated by division to provide a macro view of the relative performance of the different divisions. In the second step, an account run is performed. The account run includes preparing an inventory of accounts and/or policies in the dimensions for which problem areas were made evident in the first step. Using this inventory of accounts, the files corresponding to some of the accounts and/or policies listed in the inventory are reviewed manually to identify groups of files with the desired properties or information. This helps to direct the file selection 304 to files that represent areas that are typical or of particular interest. The results of this step may be summarized in a report of the books. One possible result may be that certain file types are identified for selection. Generally, the types of files that are identified for selection depend, in part, on the type of risk assessment involved.
For example, when the risk assessment involves underwriting for an insurance policy, new or renewal underwriting files and their associated claim histories may be identified because they can be used to analyze the current underwriting effectiveness and efficiency. Submissions for which coverage was applied for and a quote given, but no agreement reached, may also be selected for the same purpose. Additionally, submissions for which coverage was applied for and denied, and submissions which were submitted to competitors, may be selected because they can be used to assess alignment with producers and to identify potential growth areas. The third step of selecting the files from which information is to be obtained 304 is the calibration step. This is the step in which the individual files are selected or "pulled." This step, as well as the first step, generally needs to be done manually because the files are primarily in hard copy form. However, if the files are in an electronic format, they may be reviewed and calibrated electronically by the Risk Analysis System using any one of a number of search routines. Once selected, the files are designated as "selected files" in the fourth step of selecting the files and are then used to conduct the RDAP.
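When the files are in electronic form, the 8-10% sampling requirement above could be met programmatically. The following is a minimal sketch, assuming a simple random sample stands in for the manual calibration step; the file identifiers and fraction are illustrative:

```python
import random

def select_files(files, fraction=0.10, seed=0):
    """Select a sample of at least `fraction` of the files.

    `files` is a list of file identifiers. A seeded simple random sample
    is used here as a hypothetical stand-in for the manual calibration
    step; in practice selection may be directed toward typical files or
    files of particular interest.
    """
    rng = random.Random(seed)
    n = max(1, round(len(files) * fraction))
    return rng.sample(files, n)

# Hypothetical book of 500 relevant files; 10% yields 50 selected files.
book = [f"file_{i:04d}" for i in range(500)]
selected = select_files(book)
print(len(selected))  # 50
```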
The step of conducting the RDAP to generate quantitative analyses and quantitative results 202 is shown in more detail in FIG. 5. Conducting the RDAP 202 generally includes conducting a quantitative portion of the RDAP to produce quantitative analyses 400 and conducting a qualitative portion of the RDAP to produce qualitative results 402. The quantitative and the qualitative portions may be conducted in parallel. Alternately, the qualitative portion may be conducted both in parallel with and after the quantitative portion (as shown in FIG. 5) so as to take into account the quantitative analyses produced in the quantitative portion.
As shown in FIG. 6, conducting the quantitative portion of the RDAP 400 includes training a premier group of risk assessors 500 and having the group of premier risk assessors analyze the selected files to produce quantitative analyses 502. Training a group of premier risk assessors 500 includes choosing the group of premier risk assessors and then training and synchronizing them. The group of premier risk assessors generally includes the best or top risk assessors (the "premier assessors") working for the entity for which the risk analysis is being performed. The premier assessors are chosen to analyze the selected files because they are the individuals that know their entity's risk assessment processes best. Furthermore, using risk assessors from within the entity being analyzed as part of the RDAP helps to initiate the best practices training. Training and synchronizing includes training the entire group of premier risk assessors as to how to analyze the selected files while all the premier risk assessors are in a single group and then having the premier risk assessors evaluate at least one example file. The example files used in training and synchronization may be chosen from the selected files, from the entity's books or may be pre-developed files. The group of premier risk assessors is then broken down into at least two subgroups and the training and synchronization method is repeated as before except that each subgroup evaluates at least one additional example file. The analyses performed by each subgroup are then compared with each other. If the analyses are not consistent with each other, the training is repeated until the analyses are consistent. If the analyses are consistent with each other, the subgroups are broken down into progressively smaller subgroups of the premier group.
The training and synchronization continues until the risk assessors have been trained individually and the results of their individual analyses are consistent throughout the group of premier risk assessors. The quantitative portion of the RDAP 400 continues with having the group of premier risk assessors analyze the selected files to produce quantitative analyses 502. All the risk assessors in the premier group individually analyze at least one selected file from the point of view of a risk assessor by completing the questionnaire for the selected files. This involves eliciting general information from the selected files and judging what was done in the file in terms of certain performance measures for the first through the sixth risk assessment phases. As previously discussed, the Risk Analysis System (shown in FIG. 1) may store, display and elicit responses to the questionnaire.
In completing the questions in the questionnaire, the premier assessors will generally determine a value for at least one performance measure for each phase and provide a reason for the particular value obtained. The performance measures are generally types of EGO, which include loss cost avoidance, expenses, premium and price differential. Generally, the questionnaire will include questions that ask the premier risk assessors to determine the corresponding performance measure for each phase. Additionally, the questionnaire will ask the premier risk assessor to evaluate the value obtained for the performance measure. Table 2 shows one example of performance measures that may be used for each of the risk assessment phases and how these performance measures may be determined:
[Table 2 is reproduced as images in the original publication.]
Table 2
LCA is a measure of leakage due to the loss of a cost avoidance opportunity. In Phase 1, associated with every exposure that is identified is the opportunity to reduce costs due to that exposure. For example, if the risk assessment involves lending, one possible exposure is that the collateral for the loan may be easily destroyable. If this exposure is identified, the lender has the opportunity to require the borrower to take out insurance on that collateral. However, for each exposure that is not identified, the opportunity to reduce costs associated with the unidentified exposure is lost. LCA is generally defined by the equation:
Loss Cost Avoidance = (actual losses incurred) + IBNR (1)
Wherein IBNR (incurred but not reported) represents incurred and anticipated losses that have not yet been recorded. With regard to Phase 2, the LCA measures the ability to avoid a claim or a default due to improperly assuming a risk. In both Phases 1 and 2, the LCA is an exposure-specific performance measure and is determined using Equation (1). For Phase 3, LCA represents the loss of a cost avoidance opportunity due to improperly assuming or rejecting the assumption of a risk. LCA for Phase 3 includes all such losses, independent of exposure, and is determined using Equation (1).
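Equation (1) can be expressed directly in code; the figures below are hypothetical and serve only to show the calculation:

```python
def loss_cost_avoidance(actual_losses_incurred, ibnr):
    """Equation (1): LCA = actual losses incurred + IBNR.

    IBNR ("incurred but not reported") represents incurred and
    anticipated losses that have not yet been recorded.
    """
    return actual_losses_incurred + ibnr

# Hypothetical figures: 80,000 in recorded losses plus 15,000 IBNR.
print(loss_cost_avoidance(80000.0, 15000.0))  # 95000.0
```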
In contrast, the performance measure for Phase 4 is premium. Premium may be determined in terms of gross leakage or net leakage. In Phase 4, gross leakage is a measure of loss that results from charging an insufficient premium, fee and/or interest rate. Gross leakage is generally defined according to the equation:
Gross Leakage = What should have been charged - what was charged (2)
Net leakage is also a measure of this loss. However, it takes into account a propensity factor "p":

Net Leakage = Gross Leakage × p (3)
The propensity factor is a value less than or equal to one that represents the likelihood that the sufficient premium, fee or interest rate would have been obtained. Also determined for Phase 4 is expenses, which is a measure of leakage that represents the costs incurred by an entity on a risk that should not have been assumed. Finally, premium is a measure of leakage that represents the fees, interest and/or premiums that were not obtained due to an incorrect assessment. Additionally, losses in Phase 7, setting the service program, may also be determined. The loss measures in this phase are a function of the under- or over-utilization of any service resources. When service resources are over-utilized, the performance measure used is expenses, because the amount of money spent on service resources that had no monetary impact on the risk account represents an unnecessary expense. In contrast, when service resources are under-utilized, the performance measure is loss cost avoidance, because the EGO in this situation will be due to exposures that most likely would have been discovered or mitigated if the service resources were properly utilized. If this phase is included, the questionnaire will include questions designed to elicit information relating to service resources such as: "In accordance with best practices, should loss control have been ordered to better identify exposures?"
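Equations (2) and (3) translate directly into code; the premium figures and propensity value below are hypothetical examples:

```python
def gross_leakage(should_have_charged, was_charged):
    """Equation (2): what should have been charged minus what was charged."""
    return should_have_charged - was_charged

def net_leakage(should_have_charged, was_charged, propensity):
    """Equation (3): gross leakage scaled by the propensity factor p.

    p (0 < p <= 1) represents the likelihood that the sufficient
    premium, fee or interest rate would actually have been obtained.
    """
    assert 0.0 < propensity <= 1.0
    return gross_leakage(should_have_charged, was_charged) * propensity

# Hypothetical example: a 1,200 premium should have been charged but only
# 1,000 was, with a 75% likelihood the full premium would have been obtained.
print(gross_leakage(1200.0, 1000.0))      # 200.0
print(net_leakage(1200.0, 1000.0, 0.75))  # 150.0
```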
Conducting the qualitative portion of the RDAP 402 (shown in FIG. 5) is done to help identify particular problem areas in the risk assessment methods used by the entity and to help develop best practices that will improve these risk assessment methods. Generally, conducting the qualitative portion of the RDAP 402 includes interviewing risk assessors and holding focus groups for risk assessors. The interviews are generally conducted with risk assessors that may or may not belong to the group of premier risk assessors. The interviews ask the risk assessors questions to determine what they think they do, and the resources and thought processes they use, in terms of EGO. The focus groups are facilitated interviews with groups of risk assessors during which suggestions for best practices are elicited and developed by the group. This qualitative portion of the RDAP 402 is generally done in parallel with the quantitative portion 400. However, for even better results, the qualitative portion can be done again after the quantitative portion so that the interviews can be targeted towards the risk assessment phases or the analysis dimensions of concern as indicated by the quantitative portion.
As shown in FIG. 3, the results of the quantitative and qualitative portions of the RDAP are used to generate reports 204. This may be accomplished by the Risk Analysis System as previously described. The step of generating reports 204 is shown in more detail in FIG. 7 and generally includes assembling the qualitative analyses and the quantitative results in a database 600; synthesizing the quantitative results in terms of the risk assessment phases 602; and generating reports 604. Assembling the qualitative analyses and the quantitative results in a database 600 includes entering the qualitative analyses elicited through use of the questionnaire, and the quantitative results, into the database of the Risk Analysis System. The database may store this information according to category, such as the identity of the premier risk assessor that analyzed the file and the risk type, so that the data may be retrieved in any number of ways. Entering the quantitative results and the qualitative analyses into the database may be accomplished in any number of ways using the input device (see 1016 in FIG. 1) of the Risk Analysis System, including typing the information on a keyboard, or scanning the information and using character recognition software to convert the scanned images into text-based documents.
Alternatively, in FIG. 7, the step of assembling the qualitative analyses and the quantitative results in a database 600 may be done simultaneously with the step of having the group of premier risk assessors analyze the selected files 502 (shown in FIG. 6) by having the group of premier risk assessors input their quantitative analyses directly into the database of the Risk Analysis System. In another embodiment, the group of premier risk assessors input their quantitative analyses by providing responses to the questionnaire, which is also implemented in the database of the Risk Analysis
System. When the questionnaire is implemented in the database, it can be implemented so that only the questions relevant to a particular RDAP are shown to the premier risk assessors, thereby automatically skipping irrelevant questions. Furthermore, the Risk Analysis System can perform a feasibility check of the responses to the questions and not allow the premier risk assessor to go on to the next question if an error in the current response is discovered. This helps to ensure the consistency and accuracy of the quantitative analyses. Synthesizing the quantitative results in terms of the risk assessment phases 602 includes aggregating the values for the performance measures for each analysis dimension and for the entity as a whole. These aggregate values, along with the quantitative analyses, make up the quantitative results. This step may be automated by the Risk Analysis System, as previously discussed. Once the quantitative results are generated, they are stored in the database. The values of the quantitative results may then be evaluated by the Risk Analysis System according to a recommendation generator, as previously described, to determine if the values fall within the range of any of the recommendations. The recommendation generator then generates the appropriate recommendation for each quantitative result that has a value that falls within the scope of that recommendation. Once the quantitative results and the recommendations have been generated, they are then, together with the results of the qualitative portion of the RDAP (the qualitative results), compiled by the Risk Analysis System into at least one report. Generating reports 604 may be performed by the Risk Analysis System, and includes presenting various aspects of the quantitative and qualitative results and the recommendations in a manner that is easier to understand than the quantitative and qualitative results themselves.
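The synthesis and recommendation-generation steps can be sketched as follows. The dimension names, threshold ranges and recommendation wording are all hypothetical; the specification only describes aggregating performance-measure values and matching them against recommendation ranges:

```python
from collections import defaultdict

# Hypothetical per-file quantitative analyses (one row per selected file).
analyses = [
    {"dimension": "division_a", "measure": "gross_leakage", "value": 150.0},
    {"dimension": "division_a", "measure": "gross_leakage", "value": 250.0},
    {"dimension": "division_b", "measure": "gross_leakage", "value": 40.0},
]

# Aggregate the performance-measure values per analysis dimension.
totals = defaultdict(float)
for row in analyses:
    totals[(row["dimension"], row["measure"])] += row["value"]

# Each recommendation applies when an aggregate value falls within its
# range; both thresholds and texts are illustrative.
recommendations = [
    ((300.0, float("inf")), "Provide pricing training for this dimension."),
    ((0.0, 100.0), "Current pricing practices appear effective; maintain them."),
]

advice = {}
for key, total in totals.items():
    for (low, high), text in recommendations:
        if low <= total < high:
            advice[key] = text

print(advice[("division_a", "gross_leakage")])
# Provide pricing training for this dimension.
```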
The Risk Analysis System may present the data included in the reports in graphical form, but may also present the data as a textual listing. The Risk Analysis System may assemble the information contained in these reports in any manner and in any combination, and may also include an analysis of the quantitative and qualitative results. The following represents merely a sample of the possible reports that may be generated using the quantitative results and the results of the qualitative analysis: an Executive Summary, a Risk Data Analysis Report and Final Recommendations.
An Executive Summary generally contains an overview of the quantitative results in terms of the risk assessment phases, the qualitative results, the major problems identified and some recommendations as to solutions to these problems. The quantitative results to be included in the overview are generally selected manually and include the quantitative results that best represent the current state and illustrate the problems of the risk assessment processes that were analyzed. The quantitative results are presented not only in terms of the risk assessment phases, but also along the selected analysis dimensions. Additionally, the qualitative results are included. The qualitative results are used to interpret the quantitative results to help identify the problems and suggest the solutions that are likely and/or unlikely to be successful in solving the problems. Additionally or alternatively, the quantitative results selected for inclusion in the Executive Summary may include those which represent current risk assessment methods that are effective and which, when interpreted using the qualitative results, suggest methods that should be maintained. The recommendations may be prioritized based on the phases that will show the greatest rate of return if the results were improved. Additionally, the recommendations may be quantified in terms of the dollar amounts that could be saved if the recommendations were implemented. For example, one recommendation may be to stop assuming certain types of risks in certain geographic locations due to irreparable losses. The Risk Data Analysis Report includes a summary of the selected files reviewed during the RDAP and the results of the RDAP. Also included in this report are various aspects of the quantitative results presented along the analysis dimensions and in terms of the risk assessment phases.
For example, the Risk Data Analysis Report may present the data in terms of the EGO or the EGO per share of the fee or premium received. However, the quantitative results may also or alternatively be presented in terms of LCA and/or price. Which of the quantitative results are presented, and how they are presented, is generally determined manually from an inspection of the quantitative results along the various analysis dimensions. A sample of some of the analysis dimensions by which the EGO and EGO per share of fee or premium received may be presented include: geographic area of the risk, policy or loan duration, degree of risk involved, resources used in risk assessment, types of liability, external and internal data sources used in risk assessment, number of claims or defaults made, uniformity of the information used and overlooked exposures. In addition to presenting the quantitative results in terms of EGO or
EGO per share of fee or premium received, general information relating to the selected files may also be presented, such as: which of the selected files were missing information or had erroneous information; competitor information; whether the pricing is aligned with existing guidelines; prices charged; number of quotes given; and whether terms and conditions were refined during negotiation. The quantitative results may be presented in any number of different ways, limited only by the analysis dimensions chosen and the information elicited from the selected files. The Final Recommendations is a report that includes a prioritized list of the recommendations and a timeline for implementing these recommendations. The recommendations are listed in terms of which recommendation has the greatest potential to increase the performance measures. Recommendations are determined based on the findings of the RDAP and may be generated manually or automatically. Examples of recommendations include pricing training, producer management training, industry or line of business training, what reports to order for information gathering and the best practice tools to use to assess exposures. In some cases, recommendations are made that can be quickly implemented and that will generate a good-sized return.
Best Practices Training
To begin the implementation of the results and recommendations suggested by the RDAP, best practices training is conducted. This best practices training extends the training received by the group of premier risk assessors in the RDAP to other risk assessors in the entity. Conducting the best practices training 104 shown in FIG. 2 is shown in more detail in FIG. 8 and includes an outcome focused learning approach to instill a results orientation for risk assessment. Instead of simply training risk assessors to blindly follow standards, the risk assessors are taught the manner in which their decisions and actions have a real and tangible effect on the profitability of their entity and are enabled to help define the best practices for risk assessment that their entity will adopt. During the best practices training, all or most of an entity's risk assessors that were not part of the group of premier risk assessors (the "remaining risk assessors") are trained with regard to how the RDAP was conducted and the results of the RDAP including the recommendations. Then using the recommendations as a starting point, the remaining risk assessors participate in preparing best practices for risk assessment that their entity will adopt. The training may be focused specifically on the risk assessment phases determined during the RDAP to have the greatest EGO.
Conducting the best practices training 104 includes providing hands-on training for best practices 700; providing content expert presentations 702; providing networking opportunities 704; providing feedback and improvement mechanisms 706; and enabling the determination of best practices 708. All of these portions of the best practices training may be conducted in any order. In general, learning is better facilitated when these portions are intermixed and spread out over a number of days.
The hands-on training is a type of experiential learning provided to improve the risk assessment processes and outcomes by training the risk assessors as to the best methods for rapidly changing behaviors. This type of training enables higher retention with regard to the RDAP and the RDAP results and shorter time to proficiency with regard to risk assessment.
Providing hands-on training 700 generally includes having the remaining risk assessors analyze training files and participate in conferencing. Training files may include some of the selected files and/or composite files. Composite files may be created by combining facts from some of the selected files and/or may be totally or partially fabricated. Specific training files are chosen or created because they contain fact patterns that will emphasize the best practices, particularly those that, if implemented or improved, will yield the greatest gain in the performance measures. The remaining risk assessors review the training files using a process similar to that used by the group of premier risk assessors. Alternately, in one embodiment, the remaining risk assessors do not go through the calibration process but instead review the training files in teams, each consisting of a subgroup of the remaining risk assessors. Each team identifies the positive and negative actions that took place for each risk assessment phase in terms of the best practices. After the teams have completed their analyses, conferencing allows the teams to learn from each other. Conferencing involves open communication among all the remaining risk assessors from all the teams regarding the facts of some of the training files and the possible opportunities for improving the outcome of the training files. Each team presents its analyses, followed by a discussion among the remaining risk assessors. This discussion may be facilitated by a facilitator. Conferencing ends when all the teams come to a consensus with regard to the analyses.
Providing content expert presentations 702, providing networking opportunities 704 and providing feedback mechanisms 706 are all done to reinforce the hands-on training and to build the relationships that will enable the training to continue long after the hands-on training has ended. Providing content expert presentations 702 includes having experts in risk assessment make formal presentations on various risk assessment related topics. Generally, the risk assessment related topics will be presented within the context of the risk assessment phases. The experts may come from within the entity or from other sources. Providing networking opportunities 704 includes hosting team building activities, performing checkpoint exercises and hosting social events. These activities all help to build relationships among the risk assessors to provide for a lasting resource within the entity regarding risk assessment and improving risk assessment processes. Providing feedback and improvement opportunities includes providing surveys and question and answer sessions on a periodic basis throughout the best practices training. This enables the best practices training to be constantly improved and updated as it is performed.
Enabling the determination of best practices 708 allows the remaining risk assessors to be part of the process whereby the best practices are determined and adopted. Generally, the best practices are determined through discussion among the remaining risk assessors, which continues until a consensus is reached regarding which practices will be adopted as the best practices. Using a discussion and consensus approach to determining best practices helps instill in the remaining risk assessors a sense of ownership that will help to ensure that the best practices will be integrated into the way their entity conducts business, thereby promoting compliance with the best practices. Although determining the best practices may be enabled at any time during the best practices training, it is beneficial to at least begin the determination after the remaining risk assessors have had some training regarding the RDAP and its results so that they have a clear picture of the current state of risk assessment in their entity. The remaining risk assessors may develop the best practices from existing entity practices, industry standard practices or entirely from scratch. If the remaining risk assessors develop the best practices from current entity or industry practices, it is beneficial for them to receive some training regarding the qualitative results and the recommendations so that they may use these as a starting point.
Quality Assurance Process
To help ensure that the best practices are followed even after the RDAP and best practices training is concluded, a quality assurance process should be enabled so that the quality of the risk assessment of files not analyzed during the RDAP and files reviewed during the RDAP that were subsequently updated can be monitored. In general, enabling a quality assurance process 106 (FIG. 2) includes developing a method for monitoring risk assessment that allows the risk assessors or others within an entity to monitor their own risk assessment processes. The quality assurance process allows the entity to review files and arrive at a quantitative scoring of the quality of the risk assessment, which in turn allows an entity to review the risk assessment at a granular or macro level for quick identification of where in the risk assessment processes there may be a lack of understanding or a problem with the best practices.
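The specification describes a quantitative quality score without prescribing a formula; one illustrative approach (the per-phase scores and weights below are hypothetical) is a weighted average of per-phase compliance scores for each audited file, which can then be rolled up to a granular or macro level:

```python
def score_file(phase_scores, weights=None):
    """Hypothetical quality score for one audited file.

    `phase_scores` maps a risk assessment phase number to a compliance
    score in [0, 1]; `weights` optionally emphasizes particular phases.
    Returns the weighted average across the scored phases.
    """
    if weights is None:
        weights = {phase: 1.0 for phase in phase_scores}
    total_weight = sum(weights[p] for p in phase_scores)
    return sum(phase_scores[p] * weights[p] for p in phase_scores) / total_weight

# Hypothetical file: strong on phases 1-3, weak on phase 4 (pricing).
file_scores = {1: 0.9, 2: 0.8, 3: 1.0, 4: 0.5}
print(round(score_file(file_scores), 2))  # 0.8
```

Averaging such per-file scores over a line of business, or inspecting the per-phase scores of a single file, gives the macro- and granular-level views described above.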
Enabling a quality assurance process 106, as shown in FIG. 9, includes developing a framework 800; developing a quality assurance database 802; and conducting a scoring process 804. Developing a framework for the quality assurance process 800 is shown in more detail in FIG. 10 and includes developing an approach 900; evaluating the current risk assessment review processes 902; outlining the detailed requirements and developing materials 904; determining key metrics 906; and beginning implementation and ongoing training 908. Developing an approach 900 generally includes developing the plan for enabling the quality assurance process. This plan is developed from the current state of the entity's risk assessment processes and the desired scope of the quality assurance process. In order to develop the plan, the current state of the entity's risk assessment processes, as determined in the RDAP, is confirmed. This helps define where the entity is in terms of its risk assessment, which defines the baseline for improvement. Additionally, to implement the plan, the scope of the quality assurance process is defined. The scope may be defined to include all types of files, or only those that were identified as not conforming to best practices during the RDAP. Alternately, the scope may be defined in terms of line of business or in terms of any of the other dimensions for which the RDAP was carried out. The files or file types included in the scope will be the files or file types monitored by the quality assurance process. To complete the plan, a determination is made as to the resources that will be needed for the quality assurance process. These resources include the entity or other personnel needed to help perform the quality assurance process.
Evaluating the current risk assessment review processes 902 provides a view of the entity's current audit processes and uses them as a baseline from which new audit processes are developed. The entity's current audit processes are the processes currently in use by the entity to determine the quality of its risk assessment. In some cases, no current audit processes may exist. Evaluating these processes includes reviewing and documenting the current audit processes, including any forms, databases, questionnaires and other tools used by the entity in its current risk assessment review process. Using the entity's current risk assessment review process and the results of the RDAP (or only the results of the RDAP if no current risk assessment review process exists), a new risk assessment review process is developed. The new risk assessment review process may include all or only a portion of the recommendations developed during the RDAP.
Outlining the detailed requirements and developing materials 904 includes adapting the questionnaire used during the RDAP for use as an audit tool. This "audit questionnaire" is generally revised to reflect the scope of the new risk assessment review process and the results of the RDAP. For example, the questionnaire may be changed so that it only asks questions related to the first two phases of risk assessment for a particular geographic location. Additionally, outlining the detailed requirements may include determining the specific resources needed to perform the new risk assessment process such as the level and areas of expertise required of the personnel involved in this process. Determining key metrics 906 includes determining the metrics by which files analyzed by the new risk assessment process are to be judged. Additionally, determining the key metrics 906 includes establishing baseline and target numbers for the key metrics. The baseline is the state of the risk assessment processes at the time of the RDAP or at some other defined time in the past. The state of the risk assessment processes is generally given in terms of the performance measures used in the RDAP. The target numbers are also generally given in terms of the performance measures and represent the desired state of the risk assessment processes. Further, determining key metrics may include establishing a reward or incentive program to reward risk assessors that help the entity meet its target for the key metrics. Performing ongoing training 908 includes training the resources to perform or help perform the quality assurance process.
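The baseline and target numbers described above can be tracked as a simple progress measure per key metric. The following sketch is illustrative only; the metric values and the progress formula are assumptions, not part of the disclosed method.

```python
# Hypothetical sketch of tracking a key metric against its baseline and target
# numbers. All values are illustrative.

def metric_progress(baseline, target, current):
    """Return the fraction of the baseline-to-target gap closed so far."""
    gap = target - baseline
    if gap == 0:
        return 1.0
    return (current - baseline) / gap

# Example: a performance measure whose baseline was set during the RDAP.
baseline = 0.60   # state of the risk assessment processes at the time of the RDAP
target = 0.90     # desired state, in the same performance-measure terms
current = 0.75    # value observed by the new risk assessment review process

progress = metric_progress(baseline, target, current)
print(round(progress, 2))  # 0.5: half of the gap to the target has been closed
```

A reward or incentive program, as mentioned above, could be keyed to this kind of progress figure, though the disclosure does not prescribe any particular formula.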
Referring to FIG. 9, once the framework is developed 800, a database is developed for the quality assurance process (the "quality assurance database") 802. The quality assurance database is generally developed from the database used during the RDAP. For example, the quality assurance database may be developed from and/or stored in the database of the Risk Analysis System, as previously discussed. However, it can be developed from almost any database structure. The quality assurance database is set up to store information within the scope and dimensions, and for the performance metrics, defined for the quality assurance process. The quality assurance database may also store the audit questionnaire. Conducting the scoring process 804 is similar to conducting the quantitative phase of the RDAP. Questions from the audit questionnaire are answered by the resources, and the answers are ultimately input into the quality assurance database. The audit process then uses the answers to compute one or more scores for the defined performance metrics. The audit may be performed manually. Alternatively, the audit may be performed by the Risk Analysis System using the score generating software (SGS). During the audit process, each file audited is evaluated in terms of how well it follows the best practices and recommendations established during the best practices training and the RDAP, respectively. The score gives a numerical measure of how well the best practices and recommendations were followed. In another embodiment, the score may be weighted, so that the scores generated for the performance metrics along defined dimensions may have a greater weight (multiplied by a number greater than one) or a lesser weight (multiplied by a number less than one) than the scores along other, non-weighted dimensions. The audit process also generates audit recommendations based on the scores for the performance metrics.
The resource may then review the scores and recommendations and may add comments and other recommendations.
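The weighted scoring embodiment described above can be sketched as follows. The dimension names, raw scores, and weight values are hypothetical; the disclosure specifies only that scores along defined dimensions are multiplied by a number greater than one or less than one, with non-weighted dimensions left unchanged.

```python
# Minimal sketch of the weighted scoring step. Dimensions without an explicit
# weight default to 1.0 (non-weighted), per the description above.

def weighted_scores(raw_scores, weights=None):
    """Apply per-dimension weights to raw audit scores."""
    weights = weights or {}
    return {dim: score * weights.get(dim, 1.0) for dim, score in raw_scores.items()}

# Scores computed from the audit questionnaire answers, one per dimension.
raw = {"line_of_business": 80.0, "geographic_area": 70.0, "type_of_risk": 90.0}
# Weight one defined dimension up and one down; the rest remain unweighted.
w = {"line_of_business": 1.5, "geographic_area": 0.5}

scored = weighted_scores(raw, w)
print(scored)  # {'line_of_business': 120.0, 'geographic_area': 35.0, 'type_of_risk': 90.0}
```

The audit recommendations would then be generated from these weighted scores, by whatever rules the entity defines for its key metrics.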
Although the methods and apparatuses disclosed herein have been described in terms of specific embodiments and applications, persons skilled in the art can, in light of this teaching, generate additional embodiments without exceeding the scope or departing from the spirit of the claimed invention. Accordingly, it is to be understood that the drawings and descriptions in this disclosure are proffered to facilitate comprehension of the invention and should not be construed to limit the scope thereof.

Claims

1. A computer readable storage medium storing computer readable program code for synthesizing at least one quantitative result for a plurality of selected files, the computer readable program code comprising: data encoding at least one quantitative analysis of each of the plurality of files; and a computer code implementing a procedure for synthesizing the at least one quantitative result in response to an input of at least one quantitative analysis of each of the plurality of files, wherein the procedure for synthesizing the at least one quantitative result aggregates the at least one value for at least one performance measure in terms of at least one risk assessment phase, each of a plurality of risk dimensions, or the at least one risk assessment phase and each of a plurality of risk dimensions.
2. A computer readable storage medium storing computer readable program code for generating reports from at least one quantitative analysis and at least one qualitative result, the computer readable program code comprising: data encoding at least one quantitative analysis of each of the plurality of files and at least one qualitative result of each of the plurality of files; and a computer code implementing a procedure for: synthesizing the at least one quantitative result in response to an input of at least one quantitative analysis of each of the plurality of files, wherein the procedure for synthesizing the at least one quantitative result aggregates the at least one value for at least one performance measure in terms of at least one risk assessment phase, each of a plurality of risk dimensions, or the at least one risk assessment phase and each of a plurality of risk dimensions; generating at least one recommendation based on the at least one quantitative result; and generating the at least one report based on the at least one quantitative analysis, the at least one recommendation and the at least one qualitative result.
3. A risk analysis system, comprising: an interface unit for receiving at least one quantitative analysis of each of a plurality of files; and a risk analysis unit, comprising: a memory device, including a storing portion and an analyzing portion; wherein the storing portion stores the at least one quantitative analysis including at least one value for at least one performance measure as a function of at least one risk assessment phase for each of a plurality of analysis dimensions, and the analyzing portion stores an algorithm for determining quantitative results; and a processor coupled to the memory device; wherein the processor, using the at least one quantitative analysis and the algorithm for determining quantitative results communicated to it by the memory device, synthesizes the at least one quantitative result from the at least one quantitative analysis by aggregating one or more performance measures for a plurality of risk assessment phases, each of the plurality of analysis dimensions, or the at least one risk assessment phase and each of the plurality of analysis dimensions; wherein the processor further communicates the at least one quantitative result to the memory device, to the interface unit, or to the memory device and the interface unit.
4. A risk analysis system, as claimed in Claim 3, wherein the memory device further stores a recommendation generator; and the processor further, using the recommendation generator and the at least one quantitative result, generates at least one recommendation; wherein the processor communicates the at least one recommendation to the memory unit, the interface unit, or the memory unit and the interface unit.
5. A risk analysis system, as claimed in Claim 4, wherein the interface unit further receives at least one qualitative result; the memory device further stores the at least one qualitative result; and the processor, using the at least one recommendation, the at least one qualitative result and the at least one quantitative result, generates at least one report; wherein the processor communicates the at least one report to the memory unit, the interface unit, or the memory unit and the interface unit.
6. A method for improving processes and outcomes in risk assessment comprising: performing a risk data analysis procedure with a group of premier risk analyzers to generate at least one report; conducting a best practices training to extend training received by the group of premier risk analyzers in the risk data analysis procedure to at least one remaining risk assessor and to prepare at least one best practice; and enabling a quality assurance process to monitor compliance with the at least one best practice.
7. The method for improving processes and outcomes in risk assessment, as claimed in Claim 6, wherein the risk assessment is underwriting.
8. The method for improving processes and outcomes in risk assessment, as claimed in Claim 6, wherein the risk assessment is lending.
9. The method for improving processes and outcomes in risk assessment, as claimed in Claim 6, wherein performing the risk data analysis procedure to generate at least one report comprises: preparing for the risk data analysis procedure; conducting the risk data analysis procedure to generate at least one qualitative result and at least one quantitative analysis; and generating the at least one report from the at least one quantitative analysis and the at least one qualitative result.
10. The method for improving processes and outcomes in risk assessment, as claimed in Claim 9, wherein preparing for the risk data analysis procedure comprises: defining at least one analysis dimension for the risk data analysis procedure; developing a questionnaire to elicit the at least one quantitative analysis; developing a database to store the at least one quantitative analysis and the at least one qualitative result; and selecting a plurality of files from which to conduct the risk data analysis procedure, wherein the plurality of files are defined as a plurality of selected files.
11. The method for improving processes and outcomes in risk assessment, as claimed in Claim 10, wherein the at least one analysis dimension comprises one or more analysis dimensions chosen from an analysis dimension group comprising: type of risk, office, geographic area, resources used in the risk assessment, types of liability, degree of risk, external resources used in the risk assessment, number of claims or defaults made, uniformity of information used, and overlooked exposures.
12. The method for improving processes and outcomes in risk assessment, as claimed in Claim 10, wherein developing the questionnaire to elicit at least one quantitative analysis comprises developing a standard questionnaire comprising a plurality of questions designed to elicit at least one quantitative analysis from each of the plurality of selected files, wherein the at least one quantitative analysis may be used to evaluate at least one performance measure in at least one risk assessment phase.
13. The method for improving processes and outcomes in risk assessment, as claimed in Claim 10, wherein developing a questionnaire to elicit at least one quantitative analysis comprises developing a customized questionnaire comprising customizing a plurality of standard questions designed to elicit at least one quantitative analysis from each of the plurality of selected files, wherein the at least one quantitative analysis may be used to evaluate at least one performance measure in at least one risk assessment phase.
14. The method for improving processes and outcomes in risk assessment, as claimed in Claim 12, wherein the at least one risk assessment phase comprises one or more risk assessment phases chosen from an assessment phase group comprising: identifying and evaluating exposure; making a risk decision; setting terms and conditions; setting price and premium; and negotiating.
15. The method for improving processes and outcomes in risk assessment, as claimed in Claim 14, wherein the assessment phase group further comprises setting a service program.
16. The method for improving processes and outcomes in risk assessment, as claimed in Claim 10, wherein the database comprises: a storing portion, wherein the storing portion stores the at least one quantitative analysis, the at least one quantitative result and the at least one qualitative result; and an analyzing portion for synthesizing the at least one qualitative result.
17. The method for improving processes and outcomes in risk assessment, as claimed in Claim 10, wherein selecting the plurality of files from which to conduct the risk data analysis procedure comprises: reviewing a plurality of files to identify groups of files with desired properties; and performing a calibration step to obtain the selected files.
18. The method for improving processes and outcomes in risk assessment, as claimed in Claim 17, wherein reviewing a plurality of files to identify groups of files with desired properties further comprises summarizing groups of files identified with the desired properties in a file group report.
19. The method for improving processes and outcomes in risk assessment, as claimed in Claim 17, wherein the risk assessment is underwriting and wherein reviewing a plurality of files to identify groups of files with desired properties comprises identifying groups of files chosen from among a file group comprising: new files, renewal files, files including submissions for coverage wherein the coverage was applied for and quoted but no agreement reached, files including submissions for the coverage wherein the coverage was applied for and denied; and files including submissions to competitors.
20. The method for improving processes and outcomes in risk assessment, as claimed in Claim 9, wherein conducting the risk data analysis procedure to generate the at least one quantitative analysis and the at least one qualitative result comprises: conducting a quantitative portion of the risk data analysis procedure, wherein the quantitative portion produces the at least one quantitative analysis; and conducting a qualitative portion of the risk data analysis procedure, wherein the qualitative portion produces the at least one qualitative result.
21. The method for improving processes and outcomes in risk assessment, as claimed in Claim 20, wherein conducting the quantitative portion of the risk data analysis procedure comprises: training and synchronizing the group of premier risk assessors to analyze a plurality of selected files, wherein the group of premier risk assessors comprises a plurality of members and is a subset of a group of risk assessors; and having each of the plurality of members of the group of premier risk assessors analyze a subset of the plurality of selected files to produce at least one quantitative analysis.
22. The method for improving processes and outcomes in risk assessment, as claimed in Claim 21, wherein training and synchronizing further comprises:
(A) training the group of premier risk assessors as a whole to analyze the plurality of selected files;
(B) having the group of premier risk assessors as a whole analyze at least one example file;
(C) breaking the group of premier risk assessors into at least two subgroups;
(D) having each of the at least two subgroups analyze a new example file, wherein each of the at least two subgroups prepares an analysis;
(E) comparing the analyses of all of the at least two subgroups to determine if the analyses of all of the at least two subgroups are consistent; wherein if the analyses of all of the at least two subgroups are not consistent, repeating steps (D) and (E) until the analyses of all the plurality of subgroups are consistent; and
(F) wherein if the analyses of each of the plurality of subgroups are consistent, performing an iteration until each subgroup includes a risk assessor and analyses of all the risk assessors are consistent, wherein steps (C), (D), (E) and (F) are defined as an iteration and with each iteration the subgroups are broken down into progressively smaller subgroups, until each of the progressively smaller subgroups contains one member.
23. The method for improving processes and outcomes in risk assessment, as claimed in Claim 21, wherein having each of the plurality of members of the group of premier risk assessors analyze the subset of the plurality of selected files to produce the at least one quantitative analysis, comprises: eliciting general information from the subset of the plurality of selected files; and determining a value for at least one performance measure for each of a plurality of analysis dimensions for each selected file in the subset of the plurality of selected files.
24. The method for improving processes and outcomes in risk assessment, as claimed in Claim 23, wherein the at least one performance measure is a type of economic gain opportunity.
25. The method for improving processes and outcomes in risk assessment, as claimed in Claim 24, wherein the type of economic gain opportunity is selected from an EGO group comprising: loss cost avoidance, expenses, premium and price differential.
26. The method for improving processes and outcomes in risk assessment, as claimed in Claim 20, wherein conducting the qualitative portion of the risk data analysis procedure, comprises: conducting interviews with more than one premier risk assessor of the group of premier risk assessors; and conducting focus groups with at least one portion of the group of premier risk assessors, wherein conducting the interviews and focus groups produces at least one qualitative result.
27. The method for improving processes and outcomes in risk assessment, as claimed in Claim 9, wherein generating reports from the at least one quantitative analysis and the at least one qualitative result comprises: assembling the at least one quantitative analysis and the at least one qualitative result in a database, wherein the at least one quantitative analysis includes at least one value for at least one performance measure as a function of at least one risk assessment phase for each of a plurality of analysis dimensions; synthesizing the at least one quantitative result from the at least one quantitative analysis by aggregating the at least one value for the at least one performance measure in terms of the at least one risk assessment phase, each of the plurality of analysis dimensions, or the at least one risk assessment phase and each of the plurality of analysis dimensions; generating the at least one recommendation based on the at least one quantitative result; and generating the at least one report based on the at least one quantitative analysis, the at least one recommendation and the at least one qualitative result.
28. The method for improving processes and outcomes in risk assessment, as claimed in Claim 27, wherein the at least one risk assessment phase is selected from a risk assessment phase group comprising: identifying exposures, evaluating exposures, making a risk assessment decision, setting terms and conditions, setting a price and negotiation.
29. The method for improving processes and outcomes in risk assessment, as claimed in Claim 28, wherein the risk assessment phase group further comprises: setting a service program.
30. The method for improving processes and outcomes in risk assessment, as claimed in Claim 6, wherein conducting the best practices training to communicate the risk data analysis procedure and the at least one best practice, comprises steps of: providing hands-on training for best practices for at least one remaining risk assessor, wherein providing the hands-on training includes having the at least one remaining risk assessor analyze at least one training file; providing content expert presentations, wherein the content expert presentations reinforce the hands-on training; providing networking opportunities, wherein the networking opportunities reinforce the hands-on training; providing feedback and improvement mechanisms, wherein the feedback and improvement mechanisms reinforce the hands-on training; and enabling a determination of at least one best practices, wherein the determination of the at least one best practices is made with participation of the at least one remaining risk assessor.
31. The method for improving processes and outcomes in risk assessment, as claimed in Claim 30, wherein the steps of conducting the best practices training are performed in any order.
32. The method for improving processes and outcomes in risk assessment, as claimed in Claim 30, wherein the steps of conducting the best practices training are performed approximately simultaneously.
33. The method for improving processes and outcomes in risk assessment, as claimed in Claim 30, wherein the steps of conducting the best practices training are performed in an intermixed manner.
34. The method for improving processes and outcomes in risk assessment, as claimed in Claim 30, wherein the at least one training file includes at least one selected file.
35. The method for improving processes and outcomes in risk assessment, as claimed in Claim 30, wherein the at least one training file includes at least one composite file.
36. The method for improving processes and outcomes in risk assessment, as claimed in Claim 6, wherein enabling the quality assurance process that provides monitoring of compliance with the at least one best practice, comprises: developing a framework, wherein the framework includes an audit questionnaire for eliciting audit questionnaire answers and a new risk assessment review process which includes at least one key metric; developing a quality assurance database to store the audit questionnaire and the audit questionnaire answers; and conducting a scoring process wherein the new risk assessment review process uses the audit questionnaire answers to create at least one score, wherein the score reflects the compliance with the at least one best practice for each of the at least one key metric.
37. The method for improving processes and outcomes in risk assessment, as claimed in Claim 36, wherein developing a framework, includes: developing a plan for enabling the quality assurance process from a current state of a risk assessment review process and a desired scope of the quality assurance process, wherein the current state of the risk assessment review process is determined by the risk data analysis procedure; evaluating the current risk assessment review process to define a new risk assessment review process; outlining detailed requirements and developing materials wherein the materials include the audit questionnaire; determining the at least one key metric to be used by the new risk assessment review process; and beginning an implementation of the new risk assessment review process.
38. The method for improving processes and outcomes in risk assessment, as claimed in Claim 37, wherein developing the plan includes: confirming the current state of the risk assessment review process to define a baseline for creating the new risk assessment process; defining the scope; and determining any resources needed for the new risk assessment review process.
39. The method for improving processes and outcomes in risk assessment, as claimed in Claim 37, wherein evaluating the current risk assessment review process to define the new risk assessment review process includes: identifying and reviewing any current audit processes; and developing the new risk assessment review process using any recommendations produced by the risk data analysis procedure, any current audit processes and the at least one key metric.
40. The method for improving processes and outcomes in risk assessment, as claimed in Claim 37, wherein the audit questionnaire is a questionnaire used by the risk data analysis procedure adapted to reflect the scope.
41. The method for improving processes and outcomes in risk assessment, as claimed in Claim 37, wherein outlining detailed requirements and developing materials further includes determining specific resources needed for the implementation of the new risk assessment review process.
42. The method for improving processes and outcomes in risk assessment, as claimed in Claim 37, wherein determining the at least one key metric includes establishing baseline and target numbers for the at least one key metric.
43. The method for improving processes and outcomes in risk assessment, as claimed in Claim 37, wherein determining the at least one key metric includes establishing a reward program, an incentive program or both the reward program and the incentive program.
44. The method for improving processes and outcomes in risk assessment, as claimed in Claim 36, wherein developing the quality assurance database includes adapting a risk data analysis database to reflect the scope.
45. The method for improving processes and outcomes in risk assessment, as claimed in Claim 36, wherein the at least one score is weighted according to at least one defined dimension.
46. A risk analysis system for improving processes and outcomes in risk assessment, comprising: a risk data analysis preparation procedure for developing a questionnaire to elicit at least one quantitative analysis, developing a database to store the at least one quantitative analysis, and selecting a plurality of files from which to conduct a risk data analysis procedure; a risk analysis device including a memory for storing the database and a processor for generating at least one quantitative result, at least one recommendation and the at least one report; a method for conducting a risk data analysis procedure with a group of premier risk analyzers for generating at least one qualitative result, at least one quantitative result, at least one recommendation and at least one report, wherein the method for conducting a risk data analysis procedure uses the risk analysis device to perform a risk data analysis process to generate the at least one qualitative result, the at least one recommendation and the at least one report; a method for conducting a best practices training to extend training received by the group of premier risk analyzers in the risk data analysis procedure to at least one remaining risk assessor and to prepare at least one best practice; and a method for enabling a quality assurance process to monitor compliance with the at least one best practice.
47. A risk data analysis procedure for generating at least one report, comprising: preparing for the risk data analysis procedure; conducting the risk data analysis procedure to generate at least one qualitative result and at least one quantitative analysis; and generating the at least one report from the at least one quantitative analysis and the at least one qualitative result.
48. The risk data analysis procedure, as claimed in Claim 47, wherein preparing for the risk data analysis procedure comprises: defining at least one analysis dimension for the risk data analysis procedure; developing a questionnaire to elicit the at least one quantitative analysis; developing a database to store the at least one quantitative analysis and the at least one qualitative result; and selecting a plurality of files from which to conduct the risk data analysis procedure, wherein the plurality of files are defined as a plurality of selected files.
49. The risk data analysis procedure, as claimed in Claim 48, wherein the at least one analysis dimension comprises one or more analysis dimensions chosen from an analysis dimension group comprising: type of risk, office, geographic area, resources used in the risk assessment, types of liability, degree of risk, external resources used in the risk assessment, number of claims or defaults made, uniformity of information used, and overlooked exposures.
50. The risk data analysis procedure, as claimed in Claim 48, wherein developing the questionnaire to elicit the at least one quantitative analysis comprises developing a standard questionnaire comprising a plurality of questions designed to elicit the at least one quantitative analysis from each of the plurality of selected files, wherein the at least one quantitative analysis may be used to evaluate at least one performance measure in at least one risk assessment phase.
51. The risk data analysis procedure, as claimed in Claim 48, wherein developing a questionnaire to elicit the at least one quantitative analysis comprises developing a customized questionnaire comprising customizing a plurality of standard questions designed to elicit the at least one quantitative analysis from each of the plurality of selected files, wherein the at least one quantitative analysis may be used to evaluate at least one performance measure in at least one risk assessment phase.
52. The risk data analysis procedure, as claimed in Claim 50, wherein the at least one risk assessment phase comprises one or more risk assessment phases chosen from an assessment phase group comprising: identifying and evaluating exposure; making a risk decision; setting terms and conditions; setting price and premium; and negotiating.
53. The risk data analysis procedure, as claimed in Claim 52, wherein the assessment phase group further comprises setting a service program.
54. The risk data analysis procedure, as claimed in Claim 48, wherein the database comprises: a storing portion, wherein the storing portion stores the at least one quantitative analysis, the at least one quantitative result and the at least one qualitative result; and an analyzing portion for synthesizing the at least one qualitative result.
55. The risk data analysis procedure, as claimed in Claim 48, wherein selecting the plurality of files from which to conduct the risk data analysis procedure comprises: reviewing a plurality of files to identify groups of files with desired properties; and performing a calibration step to obtain the selected files.
56. The risk data analysis procedure, as claimed in Claim 55, wherein reviewing a plurality of files to identify groups of files with desired properties further comprises summarizing groups of files identified with the desired properties in a file group report.
57. The risk data analysis procedure, as claimed in Claim 55, wherein the risk assessment is underwriting and wherein reviewing a plurality of files to identify groups of files with desired properties comprises identifying groups of files chosen from among a file group comprising: new files, renewal files, files including submissions for coverage wherein the coverage was applied for and quoted but no agreement reached, files including submissions for the coverage wherein the coverage was applied for and denied; and files including submissions to competitors.
58. The risk data analysis procedure, as claimed in Claim 47, wherein conducting the risk data analysis procedure to generate the at least one quantitative analysis and the at least one qualitative result comprises: conducting a quantitative portion of the risk data analysis procedure, wherein the quantitative portion produces the at least one quantitative analysis; and conducting a qualitative portion of the risk data analysis procedure, wherein the qualitative portion produces the at least one qualitative result.
59. The risk data analysis procedure, as claimed in Claim 58, wherein conducting the quantitative portion of the risk data analysis procedure comprises: training and synchronizing a group of premier risk assessors to analyze a plurality of selected files, wherein the group of premier risk assessors comprises a plurality of members and is a subset of a group of risk assessors; and having each of the plurality of members of the group of premier risk assessors analyze a subset of the plurality of selected files to produce at least one quantitative analysis.
60. The risk data analysis procedure, as claimed in Claim 59, wherein training and synchronizing further comprises:
(A) training the group of premier risk assessors as a whole to analyze the plurality of selected files;
(B) having the group of premier risk assessors as a whole analyze at least one example file;
(C) breaking the group of premier risk assessors into at least two subgroups;
(D) having each of the at least two subgroups analyze a new example file, wherein each of the at least two subgroups prepares an analysis;
(E) comparing the analyses of all of the at least two subgroups to determine if the analyses of all of the at least two subgroups are consistent; wherein if the analyses of all of the at least two subgroups are not consistent, repeating steps (D) and (E) until the analyses of all the plurality of subgroups are consistent; and
(F) wherein if the analyses of each of the plurality of subgroups are consistent, performing an iteration until each subgroup includes a risk assessor and analyses of all the risk assessors are consistent, wherein steps (C), (D), (E) and (F) are defined as an iteration and with each iteration the subgroups are broken down into progressively smaller subgroups, until each of the progressively smaller subgroups contains one member.
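The iterative calibration of steps (A) through (F) in claim 60 can be sketched in code as follows. This is a minimal illustration only: the function names, the halving strategy, and the `analyze`/`consistent` callables are assumptions standing in for procedures the claim describes in prose, not part of the claimed method.

```python
def calibrate(assessors, example_files, analyze, consistent):
    """Iteratively split a group of risk assessors into ever-smaller
    subgroups (steps C-F), having each subgroup analyze a fresh example
    file (step D) and comparing the analyses (step E), until every
    subgroup holds a single assessor and all analyses agree."""
    subgroups = [assessors]          # the whole group, after steps (A)-(B)
    files = iter(example_files)
    while any(len(g) > 1 for g in subgroups):
        # step (C): break each subgroup into two smaller subgroups;
        # singleton subgroups pass through unchanged
        subgroups = [half
                     for g in subgroups
                     for half in (g[:len(g) // 2 or 1], g[len(g) // 2 or 1:])
                     if half]
        # steps (D)-(E): analyze a new example file and compare results,
        # repeating with further files until the analyses are consistent
        f = next(files)
        analyses = [analyze(g, f) for g in subgroups]
        while not consistent(analyses):
            f = next(files)
            analyses = [analyze(g, f) for g in subgroups]
    return subgroups  # each subgroup now contains one calibrated assessor
```

In practice `analyze` would capture a subgroup's file assessment and `consistent` the comparison rule; both are left abstract here because the claim does not specify them.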
61. The risk data analysis procedure, as claimed in Claim 59, wherein having each of the plurality of members of the group of premier risk assessors analyze the subset of the plurality of selected files to produce the at least one quantitative analysis, comprises: eliciting general information from the subset of the plurality of selected files; and determining a value for at least one performance measure for each of a plurality of analysis dimensions for each selected file in the subset of the plurality of selected files.
62. The risk data analysis procedure, as claimed in Claim 61, wherein the at least one performance measure is a type of economic gain opportunity.
63. The risk data analysis procedure, as claimed in Claim 62, wherein the type of economic gain opportunity is selected from an EGO group comprising: loss cost avoidance, expenses, premium and price differential.
64. The risk data analysis procedure, as claimed in Claim 58, wherein conducting the qualitative portion of the risk data analysis procedure, comprises: conducting interviews with more than one premier risk assessor of the group of premier risk assessors; and conducting focus groups with at least one portion of the group of premier risk assessors, wherein conducting the interviews and focus groups produces at least one qualitative result.
65. The risk data analysis procedure, as claimed in Claim 47, wherein generating reports from the at least one quantitative analysis and the at least one qualitative result comprises: assembling the at least one quantitative analysis and the at least one qualitative result in a database, wherein the at least one quantitative analysis includes at least one value for at least one performance measure as a function of at least one risk assessment phase for each of a plurality of analysis dimensions; synthesizing the at least one quantitative result from the at least one quantitative analysis by aggregating the at least one value for the at least one performance measure in terms of the at least one risk assessment phase, each of the plurality of analysis dimensions, or the at least one risk assessment phase and each of the plurality of analysis dimensions; generating at least one recommendation based on the at least one quantitative result; and generating the at least one report based on the at least one quantitative analysis, the at least one recommendation and the at least one qualitative result.
66. The risk data analysis procedure, as claimed in Claim 65, wherein the at least one risk assessment phase is selected from a risk assessment phase group comprising: identifying exposures, evaluating exposures, making a risk assessment decision, setting terms and conditions, setting a price and negotiation.
67. The risk data analysis procedure, as claimed in Claim 66, wherein the risk assessment phase group further comprises: setting a service program.
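The aggregation described in claims 65 through 67 — rolling performance-measure values up by risk assessment phase, by analysis dimension, and by both together — can be sketched as below. The tuple layout of `values` and the use of simple summation are assumptions; the claims leave the aggregation function and data layout unspecified.

```python
from collections import defaultdict

def synthesize(values):
    """Aggregate performance-measure values three ways, as claim 65
    describes: per risk assessment phase, per analysis dimension, and
    per (phase, dimension) pair. `values` is a hypothetical iterable of
    (phase, dimension, value) tuples drawn from the database."""
    by_phase = defaultdict(float)
    by_dimension = defaultdict(float)
    by_both = defaultdict(float)
    for phase, dimension, value in values:
        by_phase[phase] += value
        by_dimension[dimension] += value
        by_both[(phase, dimension)] += value
    return by_phase, by_dimension, by_both
```

The phase keys would be drawn from the group claim 66 recites (identifying exposures, evaluating exposures, making a risk assessment decision, setting terms and conditions, setting a price, negotiation).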
68. A method for conducting a best practices training to extend training received by a group of premier risk assessors in a risk data analysis procedure to at least one remaining risk assessor, wherein the risk data analysis procedure includes at least one result and may include at least one recommendation, comprising: training the at least one remaining risk assessor as to how the risk data analysis procedure was performed; training the at least one remaining risk assessor about the at least one result and any at least one recommendation; and preparing at least one best practice with participation from the at least one remaining risk assessor and using the at least one recommendation, if any, as a starting point.
69. The method for conducting a best practices training, as claimed in Claim 68, further comprising using an outcome focused learning approach.
70. A method for conducting a best practices training to begin implementation of results and recommendations suggested by a risk data analysis procedure, comprising: providing a hands-on training for best practices for at least one remaining risk assessor, wherein providing the hands-on training includes having the at least one remaining risk assessor analyze at least one training file and participate in conferencing; providing content expert presentations, wherein the content expert presentations reinforce the hands-on training; providing networking opportunities, wherein the networking opportunities reinforce the hands-on training; providing feedback and improvement mechanisms, wherein feedback and improvement mechanisms reinforce the hands-on training; and enabling a determination of at least one best practice, wherein the determination of the at least one best practice is made with participation of the at least one remaining risk assessor.
71. The method for conducting a best practices training, as claimed in Claim 70, wherein the steps of the method are performed in any order.
72. The method for conducting a best practices training, as claimed in Claim 70, wherein the steps of the method are performed approximately simultaneously.
73. The method for a best practices training, as claimed in Claim 70, wherein the steps of the method are performed in an intermixed manner.
74. The method for conducting a best practices training, as claimed in Claim 70, wherein the at least one training file includes at least one selected file.
75. The method for conducting a best practices training, as claimed in Claim 70, wherein the at least one training file includes at least one composite file.
76. A method for enabling a quality assurance process to monitor compliance with at least one best practice and at least one recommendation, wherein the at least one best practice and the at least one recommendation were created during a risk data analysis procedure, comprising: developing a framework, wherein the framework includes an audit questionnaire for eliciting audit questionnaire answers and a new risk assessment review process which includes at least one key metric; developing a quality assurance database to store the audit questionnaire and the audit questionnaire answers; and conducting a scoring process wherein the new risk assessment review process uses the audit questionnaire answers to create at least one score, wherein the score reflects the compliance with the at least one best practice and the at least one recommendation for each of the at least one key metric.
77. The method for enabling a quality assurance process, as claimed in Claim 76, wherein developing a framework, includes: developing a plan for enabling the quality assurance process from a current state of a risk assessment review process and a desired scope of the quality assurance process, wherein the current state of the risk assessment review process is determined by a risk data analysis procedure; evaluating the current risk assessment review process to define a new risk assessment review process; outlining detailed requirements and developing materials wherein the materials include the audit questionnaire; determining the at least one key metric to be used by the new risk assessment review process; and beginning an implementation of the new risk assessment review process.
78. The method for enabling a quality assurance process, as claimed in Claim 77, wherein developing the plan includes: confirming the current state of the risk assessment review process to define a baseline for creating the new risk assessment process; defining the scope; and determining any resources needed for the new risk assessment review process.
79. The method for enabling a quality assurance process, as claimed in Claim 77, wherein evaluating the current risk assessment review process to define the new risk assessment review process includes: identifying and reviewing any current audit processes; and developing the new risk assessment review process using any recommendations produced by the risk data analysis procedure, any current audit processes and the at least one key metric.
80. The method for enabling a quality assurance process, as claimed in Claim 77, wherein the audit questionnaire is a questionnaire used by the risk data analysis procedure adapted to reflect the scope.
81. The method for enabling a quality assurance process, as claimed in Claim 77, wherein outlining detailed requirements and developing materials further includes determining specific resources needed for the implementation of the new risk assessment review process.
82. The method for enabling a quality assurance process, as claimed in Claim 77, wherein determining the at least one key metric includes establishing baseline and target numbers for the at least one key metric.
83. The method for enabling a quality assurance process, as claimed in Claim 77, wherein determining the at least one key metric includes establishing a reward program, an incentive program or both the reward program and the incentive program.
84. The method for enabling a quality assurance process, as claimed in Claim 76, wherein developing the quality assurance database includes adapting a risk data analysis database to reflect the scope.
85. The method for enabling a quality assurance process, as claimed in Claim 76, wherein the at least one score is weighted according to at least one defined dimension.
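The scoring process of claim 76, with the per-dimension weighting of claim 85, can be sketched as follows. The question-to-metric mapping and the boolean answer encoding are assumptions for illustration; the claims do not specify how questionnaire answers are represented.

```python
def score_compliance(answers, key_metrics):
    """Turn audit-questionnaire answers into one compliance score per key
    metric, as the scoring process of claim 76 describes. `answers` is a
    hypothetical mapping of question id -> (metric, compliant?), where
    compliant? is True when the answer complies with the best practice
    or recommendation being audited."""
    totals = {m: [0, 0] for m in key_metrics}  # metric -> [compliant, asked]
    for _qid, (metric, compliant) in answers.items():
        if metric in totals:
            totals[metric][1] += 1
            if compliant:
                totals[metric][0] += 1
    # score = fraction of compliant answers for each metric (0.0 if unasked)
    return {m: (c / n if n else 0.0) for m, (c, n) in totals.items()}
```

A weighted variant per claim 85 would multiply each metric's score by a weight assigned to its defined dimension before reporting.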
PCT/EP2003/013019 2002-11-18 2003-11-18 Risk data analysis system WO2004046979A2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
AU2003288134A AU2003288134B8 (en) 2002-11-18 2003-11-18 Risk data analysis system
CA002506520A CA2506520A1 (en) 2002-11-18 2003-11-18 Risk data analysis system
EP03780011A EP1563430A2 (en) 2002-11-18 2003-11-18 Risk data analysis system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/299,960 2002-11-18
US10/299,960 US20040172317A1 (en) 2002-11-18 2002-11-18 System for improving processes and outcomes in risk assessment

Publications (2)

Publication Number Publication Date
WO2004046979A2 true WO2004046979A2 (en) 2004-06-03
WO2004046979A8 WO2004046979A8 (en) 2004-09-02

Family

ID=32324385

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2003/013019 WO2004046979A2 (en) 2002-11-18 2003-11-18 Risk data analysis system

Country Status (5)

Country Link
US (1) US20040172317A1 (en)
EP (1) EP1563430A2 (en)
AU (1) AU2003288134B8 (en)
CA (1) CA2506520A1 (en)
WO (1) WO2004046979A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010006345A1 (en) * 2008-07-11 2010-01-14 Jeremy Esekow Entrepreneurial behavioural risk assessment in determining the suitability of a candidate for risk associated products

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040103309A1 (en) * 2002-11-27 2004-05-27 Tracy Richard P. Enhanced system, method and medium for certifying and accrediting requirements compliance utilizing threat vulnerability feed
US7739141B2 (en) * 2003-07-10 2010-06-15 International Business Machines Corporation Consulting assessment environment
US6935948B2 (en) * 2004-01-27 2005-08-30 Integrated Group Assets, Inc. Multiple pricing shared single jackpot in a lottery
US7635303B2 (en) * 2004-01-27 2009-12-22 Integrated Group Assets Inc. Lottery ticket dispensing machine for multiple priced tickets based on variable ratios
WO2005113084A1 (en) * 2004-01-27 2005-12-01 Integrated Group Assets Inc. Virtual lottery
US20050164767A1 (en) * 2004-01-27 2005-07-28 Wright Robert J. System and method of providing a guarantee in a lottery
US7635304B2 (en) * 2004-01-27 2009-12-22 Integrated Group Assets Inc. Multiple levels of participation in a lottery jackpot
US20070106599A1 (en) * 2005-11-07 2007-05-10 Prolify Ltd. Method and apparatus for dynamic risk assessment
US20070198401A1 (en) * 2006-01-18 2007-08-23 Reto Kunz System and method for automatic evaluation of credit requests
WO2009023321A2 (en) 2007-05-14 2009-02-19 Joseph Hidler Body- weight support system and method of using the same
US7877323B2 (en) * 2008-03-28 2011-01-25 American Express Travel Related Services Company, Inc. Consumer behaviors at lender level
US20090248572A1 (en) * 2008-03-28 2009-10-01 American Express Travel Related Services Company, Inc. Consumer behaviors at lender level
US7882027B2 (en) * 2008-03-28 2011-02-01 American Express Travel Related Services Company, Inc. Consumer behaviors at lender level
US20090248569A1 (en) * 2008-03-28 2009-10-01 American Express Travel Related Services Company, Inc. Consumer behaviors at lender level
US20090248573A1 (en) * 2008-03-28 2009-10-01 American Express Travel Related Services Company, Inc. Consumer behaviors at lender level
US7844544B2 (en) * 2008-03-28 2010-11-30 American Express Travel Related Services Company, Inc. Consumer behaviors at lender level
US7805363B2 (en) * 2008-03-28 2010-09-28 American Express Travel Related Services Company, Inc. Consumer behaviors at lender level
US20110313818A1 (en) * 2010-06-16 2011-12-22 Lulinski Grzybowski Darice M Web-Based Data Analysis and Reporting System for Advising a Health Care Provider
US20140188575A1 (en) * 2012-12-31 2014-07-03 Laureate Education, Inc. Collaborative quality assurance system and method
US10249212B1 (en) * 2015-05-08 2019-04-02 Vernon Douglas Hines User attribute analysis system
CN109214474B (en) * 2017-06-30 2022-05-24 阿里巴巴集团控股有限公司 Behavior analysis and information coding risk analysis method and device based on information coding
US11227246B2 (en) * 2017-09-29 2022-01-18 Tom Albert Systems and methods for identifying, profiling and generating a graphical user interface displaying cyber, operational, and geographic risk
WO2020154265A1 (en) 2019-01-22 2020-07-30 Joseph Hidler Gait training via perturbations provided by body-weight support system
US11574150B1 (en) 2019-11-18 2023-02-07 Wells Fargo Bank, N.A. Data interpretation analysis

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5809478A (en) * 1995-12-08 1998-09-15 Allstate Insurance Company Method for accessing and evaluating information for processing an application for insurance
US5873066A (en) * 1997-02-10 1999-02-16 Insurance Company Of North America System for electronically managing and documenting the underwriting of an excess casualty insurance policy
US6236955B1 (en) * 1998-07-31 2001-05-22 Gary J. Summers Management training simulation method and system
US6125358A (en) * 1998-12-22 2000-09-26 Ac Properties B.V. System, method and article of manufacture for a simulation system for goal based education of a plurality of students
US6375466B1 (en) * 1999-04-23 2002-04-23 Milan Juranovic Method for teaching economics, management and accounting
US20020094927A1 (en) * 1999-09-03 2002-07-18 Baxter International Inc. Blood separation systems and methods with umbilicus-driven blood separation chambers
US7231327B1 (en) * 1999-12-03 2007-06-12 Digital Sandbox Method and apparatus for risk management
AU2001280966A1 (en) * 2000-08-01 2002-02-13 Adam Burczyk System and method of trading monetized results of risk factor populations withinfinancial exposures
AU2002345937A1 (en) * 2001-06-29 2003-03-03 Humanr System and method for interactive on-line performance assessment and appraisal
US20030126049A1 (en) * 2001-12-31 2003-07-03 Nagan Douglas A. Programmed assessment of technological, legal and management risks
US20080015871A1 (en) * 2002-04-18 2008-01-17 Jeff Scott Eder Varr system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
No Search *
See references of EP1563434A1 *


Also Published As

Publication number Publication date
AU2003288134A1 (en) 2004-06-15
CA2506520A1 (en) 2004-06-03
US20040172317A1 (en) 2004-09-02
AU2003288134B8 (en) 2010-11-04
EP1563430A2 (en) 2005-08-17
WO2004046979A8 (en) 2004-09-02
AU2003288134B2 (en) 2010-10-21

Similar Documents

Publication Publication Date Title
AU2003288134B2 (en) Risk data analysis system
US7856367B2 (en) Workers compensation management and quality control
Sunder M et al. Lean Six Sigma in consumer banking–an empirical inquiry
Sedera et al. A balanced scorecard approach to enterprise systems performance measurement
CN114600136A (en) System and method for automated operation of due diligence analysis to objectively quantify risk factors
US20020091558A1 (en) System and method for determining and implementing best practice in a distributed workforce
US20130246126A1 (en) System and method for customer value creation
Naji et al. The effect of change-order management factors on construction project success: a structural equation modeling approach
Byrnes et al. Software Capability Evaluation, Version 3.0: Method Description
US20070250359A1 (en) Systems and methods for providing documentation having succinct communication with scalability
Phillips et al. ROI fundamentals: Why and when to measure return on investment
US20130282442A1 (en) System and method for customer value creation
US11625388B2 (en) System with task analysis framework display to facilitate update of electronic record information
Shrestha et al. Building a software tool for transparent and efficient process assessments in IT service management
Negahban Utilization of enterprise resource planning tools by small to medium size construction organizations: A decision-making model
Kwak A systematic approach to evaluate quantitative impacts of project management (PM)
MZENGIA Assessing the Practices and Challenges of Project Monitoring and Evaluation System of Local Ngos in Addis Ababa
PMP The Enterprise Business Analyst: Developing Creative Solutions to Complex Business Problems
Todd Evaluation of the Use of Data Analytics by University Research Administration Offices to Monitor Financial Compliance
Aqila SUPPLIER SUSTAINABILITY ASSESSMENT AND IMPROVEMENT
Mankge A Critical Analysing of the Pricing Process for the Corporate and Commercial Segments of Bank XYZ in South Africa
Lehmann et al. Performance Evaluation of Public Health Laboratories in Kenya
Maddumasooriya Potential of implementing" window delay analysis" for road projects in Sri Lanka-claims consultants' perspectives
Bujnowski Optimal Capability Maturity in Startup Software Companies
Gaillard A value assessment approach towards effective use of Information Technology in organizations

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
D17 Declaration under article 17(2)a
WWE Wipo information: entry into national phase

Ref document number: 448/MUMNP/2005

Country of ref document: IN

WWE Wipo information: entry into national phase

Ref document number: 2506520

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2003780011

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2003288134

Country of ref document: AU

WWP Wipo information: published in national office

Ref document number: 2003780011

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Ref document number: JP