
How to Conduct Item Analysis through YouTestMe GetCertified

Article verified for Release 14.1 on March 29, 2025.

This article provides a comprehensive guide on conducting Item analysis using YouTestMe GetCertified. Item analysis allows you to assess the reliability and effectiveness of test questions.

There are four ways to conduct item analysis:

  1. Predefined Test Report – You can utilize the item reliability report provided by YouTestMe GetCertified. This report offers insights into the performance of each test question, allowing you to assess the reliability and effectiveness of the items.
  2. Report Builder – The platform also offers a report builder feature that enables you to download all the answers to the questions. With this downloaded data, you can perform item analysis independently, gaining more flexibility and control over the analysis process.
  3. Statistics on Question Level – Within YouTestMe GetCertified, you have access to detailed statistics at the question level. This feature allows you to evaluate the performance, difficulty, and effectiveness of each question in a test.
  4. Statistics on Answer Level – You can also analyze statistics at the answer level to understand how candidates interact with each possible answer. This provides deeper insights into the quality and clarity of the answer choices, helping to refine test questions further.

    Using predefined test reports

    To access the Item Reliability report, follow these steps:

    1. Click the Reporting option in the main menu and select Predefined Reports.
    2. Click on Test Reports.
    3. Choose the Item reliability report.
    4. Click the Details button to preview each question’s success ratio and item reliability.

     The displayed screen shows the following metrics:

    • Item reliability is based on the point-biserial correlation and tells you how each item correlates with the test as a whole, allowing you to identify questions that do not belong in the test (see the calculation sketch after this list). To determine the correlation between an item and the test as a whole, the system correlates the responses on that item (dichotomous variable – correctly or incorrectly answered) with the test’s outcome (continuous variable – the score on the test). The values can range between -1 and 1. Negative values usually mean that:
      • Candidates who got lower scores on the exam answered this question correctly.
      • Candidates who got higher scores on the exam answered this question incorrectly.

    In this case, the question does not correlate with the exam, and you may want to exclude it from the next exam versions or substitute it. Positive values usually mean that:

      • Candidates who got higher scores on the exam answered this question correctly.
      • Candidates who got lower scores on the exam answered this question incorrectly.

    In this case, the question correlates with the exam and should be included in the next exam versions.

    • The success ratio is the percentage of candidates who answered the question correctly.
    • Standard deviation measures the variability of the success ratios across individual questions, helping you identify inconsistent performance and how question difficulty relates to overall test scores.
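
    To make these metrics concrete, here is a minimal sketch of how they can be computed outside the platform. All response data and scores below are hypothetical, and the report’s exact figures may differ depending on how YouTestMe GetCertified handles partial credit and unanswered questions.

```python
import numpy as np

# Hypothetical data for one question: 1 = answered correctly, 0 = incorrectly,
# plus each candidate's total score on the test.
item_correct = np.array([1, 1, 0, 1, 0, 1, 1, 0])
total_scores = np.array([85, 90, 40, 75, 55, 95, 80, 35])

# Success ratio: percentage of candidates who answered the question correctly.
success_ratio = item_correct.mean() * 100

# Item reliability: point-biserial correlation between the dichotomous item
# result and the continuous total score. Values range from -1 to 1.
p = item_correct.mean()                      # proportion who answered correctly
q = 1 - p
mean_correct = total_scores[item_correct == 1].mean()
mean_incorrect = total_scores[item_correct == 0].mean()
item_reliability = (mean_correct - mean_incorrect) / total_scores.std() * np.sqrt(p * q)

# Standard deviation of success ratios across several questions
# (hypothetical per-question percentages).
success_ratios = np.array([72.0, 68.0, 35.0, 90.0])
spread = success_ratios.std()

print(f"Success ratio: {success_ratio:.1f}%")
print(f"Item reliability: {item_reliability:.2f}")
print(f"Std. dev. of success ratios: {spread:.1f}")
```

    In this made-up example the candidates who answered the item correctly also scored higher overall, so the point-biserial value comes out strongly positive, which is the pattern you want for a question that stays in the test.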

    Note: The item reliability report shows manually created tests.

    Using report builder

    1. To access the report builder, select the Reporting tab in the main left-side menu, then select the Report Builder tab.
    2. Select the test name from the Name column to display a Report builder for a specific test.

    3. On the Report Builder page, you can either load existing report templates or create a new custom report by selecting items from the list.
    4. Once you’ve selected the desired items, click the Display Report button.

    • Each selected item will appear as a column in the report table.
    • To return to the Report Builder and change which columns appear, click Back to Report Builder.

    For more instructions on how to use the report builder, please see this video.
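
    Once you have downloaded the answer data through the report builder, you can run your own item analysis on the exported file. Below is a minimal sketch assuming a CSV export; the file name and column names ("Question", "Correct") are assumptions, so adjust them to match the items you actually selected.

```python
import pandas as pd

# "report_builder_export.csv" and the column names are assumptions;
# replace them with your exported file and the columns you selected.
df = pd.read_csv("report_builder_export.csv")

# Success ratio per question, assuming "Correct" holds 1/0 or True/False values.
success_ratio = df.groupby("Question")["Correct"].mean() * 100

# Questions with the lowest success ratios appear first.
print(success_ratio.sort_values())
```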

    Statistics on Question Level

    1. Navigate to Test and select Manage Test.
    2. Open the desired test.
    3. Go to the Questions tab.
    4. In the list of test versions, click on the eye icon from the Action tab.

    5. Scroll down to view the statistics for each question in the test.

    You can compare questions by success ratio, time to answer, and question occurrence:

    Success ratio: the percentage of users who answered the question correctly.

    Time to answer: the time spent on the question before an answer was provided.

    Question occurrence: the percentage of test versions that contain the question.
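
    As an illustration of how these three metrics relate to the raw data, here is a small sketch using hypothetical per-attempt records; none of the names or numbers come from the platform.

```python
from statistics import mean

# Hypothetical attempts on a single question: whether it was answered
# correctly and how many seconds the candidate spent on it.
attempts = [
    {"correct": True,  "seconds": 42},
    {"correct": False, "seconds": 95},
    {"correct": True,  "seconds": 60},
    {"correct": True,  "seconds": 38},
]

success_ratio = mean(a["correct"] for a in attempts) * 100   # % answered correctly
avg_time_to_answer = mean(a["seconds"] for a in attempts)    # average seconds spent

# Question occurrence: share of generated test versions that include the question.
versions_with_question, total_versions = 7, 10
question_occurrence = versions_with_question / total_versions * 100

print(f"Success ratio: {success_ratio:.0f}%")
print(f"Time to answer: {avg_time_to_answer:.0f}s")
print(f"Question occurrence: {question_occurrence:.0f}%")
```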

    Statistics on Answer Level

    To see statistics at the answer level for each question, follow these steps:

    1. From the list of questions, choose the desired question.
    2. Click on the eye icon in the Action tab.

    3. Scroll down to view the statistics at the answer level.
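
    To see why answer-level statistics are useful, here is a minimal sketch that tallies how often each option was chosen, using made-up selections; the platform displays an equivalent breakdown per answer. A distractor chosen more often than the correct answer can signal an unclear or misleading question.

```python
from collections import Counter

# Hypothetical answer options selected by candidates for one question.
selected = ["A", "C", "A", "B", "A", "C", "A", "D"]

counts = Counter(selected)
for option in sorted(counts):
    share = counts[option] / len(selected) * 100
    print(f"Option {option}: {share:.0f}% of candidates")
```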

    For more useful instructional materials, please visit:

     

     
