Managing High-Impact Areas

Once the Courseware has been implemented, AIS can provide ongoing management related to the RAI and Medicare Process.

  • The RAIQ test and virtual facility provide a quantifiable assessment of the knowledge base of potential new hires or newly hired individuals.

  • The Education Dashboard and Education Reports can help with ongoing management of high-impact areas of the RAI and RUG IV processes.

  • The optional modules can help with program development.


AIS RAIQ and Use of Virtual Facility for Pre-Testing

AIS has developed a virtual facility for use in testing candidates or new hires. The intent of this new functionality is to assess the current knowledge base of the candidate or the new hire.

  • A virtual facility will be established for each organization and will be located in the same region as the corporate office.

  • The individual completing the test would use it in a proctored environment (on site at the client office).

Courseware Content

  • The primary test to be used for evaluating candidates for an MDS Coordinator (or other related positions) is the RAIQ test.

  • This comprehensive 40-question test is composed of select questions from the RUG IV and RAI 3.0 tests.

  • The questions are not pooled, which means that the test is identical for anyone who takes the test.

  • The pass rate is set to 0%.

  • Although the primary test is the RAIQ test, the full suite of AIS eLearning solutions will also be accessible in the virtual HR facility. This will allow AIS clients to use any of the other tests (pre-tests would be used) should they wish to do so.

Registration

  • AIS will provide a manual registration code to be used for registering candidates into the virtual HR facility.

  • A staff person (typically a Human Resources staff member) will register the candidate in the virtual HR facility.

  • It is recommended that the staff person also launch the desired test for the candidate to avoid any unnecessary confusion for the candidate.

Results

  • The test results can be viewed by individual in the reports section of AIS Central.

  • To prevent the results from diluting the organization's roll-up data, each individual's test results will be kept separate; they will not be integrated into the Roll-up Reports and will not be available in the Dashboard.


AIS Education Dashboard and Reports – Overview

This section provides suggestions for using the information in the Dashboard and reports when conducting root cause analysis following audits, surveys, and analytics reports; in compliance programs; and when enhancing clinical programs.

The Education Dashboard has three categories that can be used by the Educators to manage region, facility, and learner performance, and to help in root cause analysis:

  • Competency Issues
    Includes evaluation questions answered incorrectly 30% or more of the time. In addition to an overall review, these are further subdivided into questions related to RUGs (Clinical Reimbursement), QI/QMs, Regulatory (Survey), and CAAs (Care Plans).

  • Evaluation Performance
    Includes number of attempts (tests taken), grades, and completion of required curriculum.

  • eTraining Usage
    Includes number of hours of module usage by location and by section.
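As a minimal sketch, the 30% Competency Issues rule above could be expressed as follows. The data shape and the question identifiers are hypothetical illustrations, not the actual AIS export format:

```python
# Sketch of the Competency Issues rule: flag any evaluation question
# answered incorrectly 30% or more of the time.
# question_stats maps a question ID to its incorrect-response rate.
def competency_issues(question_stats: dict, threshold: float = 0.30) -> list:
    """Return the question IDs whose incorrect-response rate meets the threshold."""
    return [q for q, rate in question_stats.items() if rate >= threshold]

# Hypothetical MDS-style question IDs with incorrect-response rates.
stats = {"G0110-1": 0.42, "O0400-3": 0.12, "D0200-2": 0.30}
print(competency_issues(stats))  # ['G0110-1', 'D0200-2']
```

Questions flagged this way can then be grouped by their tags (Clinical Reimbursement, QI/QMs, Survey, Care Plans) for the subdivided views described above.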


Educator Reports can be accessed via AIS Central and provide information by individual learner. The reports can be downloaded into Excel, then sorted and distributed to the organization's team leaders. These reports include:

  • Curriculum
    Lists the learners who have/have not yet completed the required items.

  • Results of Evaluation
    Provides an overview of each evaluation attempt (test), including date, grade, and level of completion.

  • Response by Evaluation
    Provides the percentage of incorrect responses given for each question on a selected evaluation (with learner names).

  • Response by Individual Learner
    Provides an overview of all responses given, either for the selected evaluation or for the selected learner.


Note: For detailed information on navigating through the available reports, refer to the Quick Reference Guide to AIS Dashboard and Reports.


Using the Dashboard and Reports to Determine Organizational Competency

A primary use of the Dashboard and reports is to determine overall competency with coding the MDS, completing assessments, and managing the clinical component of the Medicare program.

  • Dashboard: The Evaluation Summary section of the Dashboard provides the average grade by evaluation (test), the number of attempts, and curriculum usage. These summaries can be further divided by module, pre-test vs. post-test, and region or facility. It is recommended that the evaluation scores be reviewed at the organizational level first, then at the regional level (when applicable), and finally at the facility level. These graphs can be downloaded and distributed as needed.
    Note: The Dashboard can provide detail down to the facility level, but not by individual learner.

  • Reports: The reports provide a detailed review of pre-tests and post-tests by individual learner. The Results of Evaluation report gives a detailed list of results for each learner and evaluation selected. When downloaded into Excel, these reports can be sorted and filtered, then distributed as needed.

This information is useful for the following areas:

  • Implementation Follow-Up
    A review of the facility/organization test (evaluation) scores both pre-implementation and post-implementation will indicate the improvement in competency for each section and module.

  • Determining Refresher Course Requirements
    As discussed in previous sections, regular review and testing of the modules is recommended. Information from the Dashboard and reports indicates current competency levels and provides a comparison to previous evaluations, allowing the organization to determine whether further education is required and, if so, in which sections.

  • Following Changes in Regulations
    When needed, CMS typically issues an RAI update following the posting of the Fiscal Year Final Rule in July. These updates, when applicable, become effective October 1. On occasion, there are further updates, typically in April. When CMS changes the coding requirements significantly, AIS updates the content of the modules accordingly and refreshes the testing. In these cases, review and retesting of the updated sections or modules is recommended. Information from the Dashboard and reports indicates current competency levels and provides a comparison to previous evaluations, allowing the organization to determine whether further Educator-led training is required and, if so, in which sections.


Using the Dashboard and Reports to Analyze Audit/Survey Results

When audits or surveys indicate an issue related to MDS coding, Care Plans, Assessment Completion, or RUG issues, the first step in a root cause analysis is to determine if there is an issue with understanding basic RAI and RUG principles. Competency in these areas can be quickly determined by using the Dashboard for a facility-level view, and using the reports to determine the specific areas of concern and an individual learner’s scores. Once overall competency is determined, drilling down by tags, section, and response can help determine specific competency issues.

The results of this analysis can be used to develop a plan of action for education. Once it is determined whether staff understand how and when to code the MDS for both clinical and reimbursement areas, and how and when to complete assessments, a further review of organizational processes or regulatory compliance can be initiated.

The following examples are provided to explain the process. They are not comprehensive, but are intended to show how to use the reports to manage the organization’s RAI process.

Example 1: Internal Audit (Missed COT Assessment)

An internal audit reveals that Change of Therapy (COT) assessments are not being completed based on the guidelines for PPS assessments.

Scenario: The 14-Day assessment had an ARD of 03/05/11, with a RUG level of RUB. Therapy continued during the 7 days of Mar 6–12. However, on Wed, Mar 7, the resident was sick and missed OT, and the missed therapy minutes were not made up during the week. The COT review should have shown that fewer than 720 minutes were provided, so a COT assessment was required; however, the COT due on Mar 12 was missed. The following week, the resident's RUB RUG continued for payment. The audit identified this issue before the resident was discharged from Medicare Part A.
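The COT check in this scenario can be sketched as below. This is a simplified illustration based only on the 720-minute threshold stated in the scenario; the actual COT review considers additional criteria, so treat it as a teaching aid, not a coding rule:

```python
# Simplified COT review check, assuming the 720-minute threshold from
# the scenario above. Real COT rules cover more factors than minutes.
ULTRA_HIGH_MIN_MINUTES = 720

def cot_required(weekly_therapy_minutes: int) -> bool:
    """Return True when the 7-day COT review shows fewer therapy minutes
    than the current RUG level requires, so a COT assessment is due."""
    return weekly_therapy_minutes < ULTRA_HIGH_MIN_MINUTES

# The resident missed one OT session and the minutes were not made up:
print(cot_required(650))  # True -> a COT assessment was required
```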

Financial Impact:

  RVB x 7 days = $3,027
  AAA x 7 days = $1,287
  Loss         = $1,740


Review: To begin root cause analysis, determine whether the therapists providing the treatment and the MDS Coordinator coding the MDS assessments and managing the process understand the rules for completing a COT assessment.

  1. Dashboard:
    Look at the overall score of the RUG IV module, then drill down to response analysis and issues by location. If this is a large facility, it may be beneficial to review by discipline. If scores are below 90%, look at the time spent reviewing each module before testing.

  2. Reports:
    For the individual therapists and MDS Coordinator, examine the grades and answers to the questions related to coding OMRA assessments. If a staff member does not know the rules of coding these assessments, the action plan might include a self-directed or Educator-directed review of the RUG IV module followed by retesting, or a review of the specific issues (in this case the OMRAs) with testing by the Educator.

Example 2: External Audit (ADL Coding)

An external audit reveals that Late Loss ADLs were not coded based on documentation provided.


Scenario: A probe review by the FI (Fiscal Intermediary) indicated that documents received for 6 claims did not support the ADL coding on the MDS.

Financial Impact:

  30 Day MDS RVC ($583.77) x 30 days = $17,513 per claim
  30 Day MDS RVA ($496.30) x 30 days = $14,889 per claim
  Loss = $2,624 per claim; $2,624 x 6 claims = $15,744 total
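The loss arithmetic above can be verified with a short calculation; the per-diem rates and claim count are taken from the example, and the variable names are illustrative:

```python
# Verifying the ADL coding example: the claims were billed at RVC but
# the documentation supported only RVA.
rvc_rate, rva_rate = 583.77, 496.30  # per-diem rates from the example
days, claims = 30, 6

loss_per_claim = round((rvc_rate - rva_rate) * days)  # rounded to whole dollars
total_loss = loss_per_claim * claims
print(loss_per_claim, total_loss)  # 2624 15744
```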


Review: To begin root cause analysis, determine whether the MDS Coordinator coding the MDS assessments understands the rules for coding ADLs, including the Rule of Three and the documentation used to support ADL coding.

  1. Dashboard:
    Look at the overall score of the Section G module, then drill down to response analysis and issues by location. If scores are below 90%, look at the time spent reviewing each module before testing.

  2. Reports:
    Examine the MDS Coordinator's grades and answers to the questions related to coding ADLs. If the MDS Coordinator does not know the rules of coding these assessments, the action plan might include a self-directed or Educator-directed review of the Section G and ADL modules followed by retesting. If the MDS Coordinator understands the rules of coding but the CNAs, nursing staff, or therapists do not understand how to code, the action plan can include a review of the ADL module followed by retesting.

Example 3: Survey Management

Scenario: During an annual survey, it was determined that two residents experienced a decline in function related to their ADL status and pressure ulcers. The facility staff failed to identify and complete a Significant Change in Status assessment for either resident per the RAI requirements. The facility received a tag for the pressure ulcer issue and also received a tag for failure to complete a Significant Change in Status assessment (F274) and failure to use the assessment to revise the current care plan to incorporate the change (F279).

Impact: Depending on the scope and severity, these tags could result in additional CMPs (Civil Money Penalties).

Review: It is important to know if the staff responsible for coding the MDS, and the nursing team responsible for documenting the care of the resident, understand the rules for coding significant change status, CAAs, and Care Plans. Part of the POC (Plan of Correction) can be education on Assessment Types, CAAs, and Care Plan. This action plan can include timeframes for completion, and monitoring using Curriculum Management and reports.

Using the Dashboard and Reports for Data Analytics

When analyzing data, whether from an internal or external source, identifying the root causes of issues can be challenging. As with the information from an audit or survey, identifying issues with understanding the principles in question is a key first step. A competency review can be narrowed to just the coding items in question. AIS has tagged each of the questions related to RUGs, QI/QM, Regulatory, and CAAs (Care Planning). These subtopics can be reviewed for overall competency, and specific questions related to these areas can be reviewed by the percentage of incorrect responses and the frequency of the answers provided. This can be viewed in the Dashboard by region/facility or in the reports by individual user, helping the Educator determine where to focus education efforts. The following example is one of many issues that could be identified by your data analytics programs.

Example: Depression Indicators

Scenario: The data analytics reports have identified potential under-coding for a resident. The resident has diagnoses of Depression and COPD and is treated with oxygen therapy as needed. His PHQ-9 score, based on the resident interview, is 1, and his 30-day RUG score is HC1. Upon review, his medical record indicates he becomes agitated with attempts to lower the head of the bed at night, stating he is short of breath. There is documentation of daily episodes of weeping, poor appetite, and difficulty falling asleep at night. If the depression indicators were coded as observed, the RUG score would be HC2. Is the resident interview truly reflecting how the resident is feeling?

Financial Impact:

  HC1 ($335.63) x 30 days = $10,068.90
  HC2 ($401.48) x 30 days = $12,044.40
  Loss per 30 days        = $1,975.50
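A short calculation confirms the per-30-day loss shown above; the rates are taken from the table, and the variable names are illustrative:

```python
# Loss per 30-day period when depression indicators are under-coded
# (HC1 billed instead of HC2); rates come from the example above.
hc1_rate, hc2_rate = 335.63, 401.48
days = 30

loss = round((hc2_rate - hc1_rate) * days, 2)
print(loss)  # 1975.5
```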


Review: The Depression Indicators increase the rate in each of the Nursing RUG categories to account for the increased staff time needed to supervise and care for residents who exhibit these depressive indicators. Knowing how to interview residents effectively and understanding the techniques that ensure an accurate interview are critical to accurate MDS coding for all interviews. The sections to review are the Interview module and Section D – Mood.

  1. In the reports, review the Results of Evaluation for each learner responsible for completing the interview and for the learner responsible for coding this section of the MDS.

  2. Review the Response by Individual Learner and the Response by Evaluation for a more in-depth understanding of the issues related to competency. This will identify specific issues of understanding the coding for this critical piece of the RAI process.

  3. Part of the action plan for issues with the interview process can be education on the Interview and Section D modules, with monitoring using Curriculum Management and reports.


Additional AIS Modules for Ongoing Education and Competency Assessment

Effective program development involves ensuring competency in the specific program being implemented. This includes both competency in treating the condition or problem appropriately, and understanding how to document and measure effectiveness through the RAI process. Included here are programs that are commonly implemented in a SNF and the corresponding RAI Process and AIS Courseware modules that can be used to help develop competency. Completion and demonstration of competency in these areas are designed to complement, not replace, the current education programs being offered by the SNF.

Examples of programs where AIS Courseware education will be beneficial:

  • Respiratory Program

Respiratory Module: This module includes MDS coding and RUG IV implications.

  • Discharge Planning

Section Q to understand the importance of Goal Setting and Community Referral in the Discharge Planning process

CAAs and Care Plan with emphasis on CAA #20 Community Referral

  • ADL/Restorative

ADL Module to understand documenting ADLs

Section G to understand MDS coding of ADLs, including the Rule of 3

Section O to understand MDS coding of Restorative Programs

CAAs and Care Plan with emphasis on #5 ADL, #11 Falls, #18 Restraints

  • Interviews

Interview Module

  • Skin Management

Section M to understand how to code skin issues

ADL Module to understand how to document ADLs to effectively manage problems related to turning and repositioning and protecting skin integrity

CAAs and Care Plan with emphasis on #16 Pressure Ulcers

  • Rehab Staff

A shortened RUG module has been developed for use by treating therapists and other rehabilitation staff involved in the RUG/PPS process.

The module includes pertinent information on the types of Medicare assessments, including the unscheduled assessments.

  • Additional modules are developed based on customer and industry needs and changes.