In-App Help was last updated in 8.6.4 (released 10/23/2019). If you are looking for help with a feature that has been added or enhanced since 8.6.4, check Online Help from the Help menu.
The Acuity Interrater Reliability Analysis by Assessor Report
You can run this report to measure interrater reliability by assessor, summarizing all audited assessments completed during the selected period. The report distinguishes absolute and near agreement from non-agreement at the outcome level for each assessor, and includes a comparison of the acuity level calculated by the assessor and by the auditor.
Running the Acuity Interrater Reliability Analysis by Assessor Report
Complete the following steps to run the Acuity Interrater Reliability Analysis by Assessor report:
- From the Reports menu, select Acuity > Acuity Interrater Reliability Analysis by Assessor. This opens the selection criteria page.
- On the Selection Criteria tab, select one or more facilities from the Facilities section. Use CTRL+click to select multiple items. Use SHIFT+click to select a range of items. Staff Manager automatically limits the facilities to those set up for Acuity.
- Select one or more profiles from the Profiles section. Staff Manager automatically limits the profiles to those set up for Acuity.
- Enter a start date in the box or click the calendar icon to use the date menu.
- Enter an end date in the box or click the calendar icon to use the date menu.
- Select an Export Type.
- If you select PDF, Staff Manager opens the report results as a PDF.
- If you select Microsoft Office Excel, Staff Manager exports the report results as an Excel spreadsheet. You must have Microsoft Office Excel or Excel Viewer installed on your workstation to use this option.
- Click Run Report. If you selected the PDF export type, the report is displayed in the Report Output pane. If you selected the Excel export type, an alert window opens to let you select whether to open the report or to save it as an Excel spreadsheet. Selecting Save is recommended so that you can review the report in Excel.
Be aware that report formatting is not retained when you export a report to Excel.
Viewing the Acuity Interrater Reliability Analysis by Assessor Report
The first section of the report includes the following information:
- Report Title: Acuity Interrater Reliability Analysis by Assessor Report
- Date Range: As selected, such as 6/30/2011 - 7/4/2011
- Profile: As selected, such as 5W MedSurg
The second section provides the following information for each assessor included in the report:
- RN Assessor: The user who completed the acuity assessment in Last Name, First Name format.
- RN Auditor: The user who completed the audit on this acuity assessment in Last Name, First Name format.
- Audit Date: The date the user completed the audit.
- Assessor/Auditor Acuity Level: The acuity levels as calculated first by the assessment and second by the audit. Results are displayed as two numbers separated by a slash (#/#), with the first number being the assessment acuity level and the second number being the audit acuity level.
- % Outcome Agreement: The percentage of outcomes with identical Likert ratings from the assessor and the auditor. For example, an outcome agreement of 89% on 19 outcomes means that 17 of the 19 outcomes were rated identically by the assessor and the auditor (see the worked sketch after this list).
- # Outcomes w/ >1 Likert Variance: The number of outcomes where the difference between the assessor's Likert rating and the auditor's Likert rating is greater than one. A score of 3, for example, means that there were 3 outcomes where the assessment and audit ratings varied by at least two Likert levels. A value other than 0 indicates low interrater reliability, and assessors and auditors are expected to review such variations to improve reliability. Any audited assessment with a value other than 0 should not count toward the number of audits required for an RN to achieve RN Acuity Competency.
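The two agreement measures above lend themselves to a small worked example. The following Python sketch shows one way the % Outcome Agreement and the # Outcomes w/ >1 Likert Variance could be derived from paired assessor/auditor ratings; the function name and sample data are illustrative only and are not part of Staff Manager, which performs these calculations internally.

```python
# Illustrative sketch only: computes the two agreement measures shown on the
# report from hypothetical paired Likert ratings, one pair per outcome.

def outcome_agreement_stats(paired_ratings):
    """paired_ratings: list of (assessor_rating, auditor_rating) tuples,
    one tuple per outcome, each rating on the profile's Likert scale."""
    total = len(paired_ratings)
    # Outcomes rated identically by assessor and auditor.
    identical = sum(1 for a, b in paired_ratings if a == b)
    # Outcomes where the ratings differ by more than one Likert level.
    large_variance = sum(1 for a, b in paired_ratings if abs(a - b) > 1)

    pct_agreement = round(100 * identical / total) if total else 0
    return pct_agreement, large_variance

# Example: 19 outcomes, 17 rated identically, one differing by one level,
# and one differing by two levels.
ratings = [(3, 3)] * 17 + [(2, 3), (4, 2)]
pct, variance_count = outcome_agreement_stats(ratings)
print(f"% Outcome Agreement: {pct}%")                         # 89%
print(f"# Outcomes w/ >1 Likert Variance: {variance_count}")  # 1
```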
Sample Report