Sul & Associates International works with teams of stakeholders and psychometricians to design and develop culturally responsive, psychometrically valid assessment systems. Whether plans call for a system-wide large-scale assessment or a smaller local assessment, Sul & Associates can help your organization adopt the practices these systems require.

Sul & Associates International seeks to provide clients with the tools and knowledge to develop self-sustaining assessment processes and expertise. With this perspective in mind, an objective for our assessment projects is to establish a process for identifying, capturing, and reporting on assessment data in a consistently meaningful manner.

Methodology

The assessment development process consists of six phases. It begins with a review of the existing assessment landscape and moves to a formal definition of the construct the assessments are intended to measure. That definition, in turn, guides the development of the assessments themselves. Systems for data gathering, analysis, and reporting are then addressed, with recommendations provided for each. Each phase is described below.

1. Assessment Landscape

The assessment process begins with a comprehensive review of current and prior assessment initiatives and practices as they relate to the development of the assessment program. This review identifies the learning framework, learning expectations, curriculum model, instructional program, existing assessment practices, proficiency levels, and assessment item types.

2. Construct Definition and Development

2a. Construct Definition. The assessment development phase begins with confirmation of the assessment construct and its relationship to the overall expectations for learning. Confirming the learning goals targeted by each assessment helps establish the construct and sets the stage for assessment development.

2b. Construct Development. Areas of emphasis during the construct development phase include the identification of the overall learning dimensions, components within dimensions, range of coverage within dimensions, range of ability levels to be addressed, and dimensionality.
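
One way to make the construct definition concrete is to record it in a simple, machine-readable blueprint listing each dimension, its components, the intended item coverage, and the ability levels it spans. The sketch below is purely illustrative; the field names, dimensions, and coverage targets are assumptions, not a prescribed format.

```python
# A minimal, hypothetical construct blueprint. All names and values are
# illustrative; the actual dimensions and coverage come from the expert panel.
construct_blueprint = {
    "construct": "Cultural and language proficiency",
    "dimensions": [
        {
            "name": "Listening comprehension",
            "components": ["everyday vocabulary", "narrative structure"],
            "coverage": {"min_items": 8, "max_items": 12},
            "ability_levels": ["novice", "developing", "proficient"],
        },
        {
            "name": "Cultural knowledge",
            "components": ["community practices", "oral tradition"],
            "coverage": {"min_items": 6, "max_items": 10},
            "ability_levels": ["novice", "developing", "proficient"],
        },
    ],
}

def blueprint_summary(blueprint):
    """Print the item coverage each dimension calls for."""
    for dim in blueprint["dimensions"]:
        cov = dim["coverage"]
        print(f'{dim["name"]}: {cov["min_items"]}-{cov["max_items"]} items, '
              f'{len(dim["components"])} components')

blueprint_summary(construct_blueprint)
```

A blueprint of this kind keeps the construct definition visible during item writing and later serves as a checklist when reviewing dimension coverage.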

3. Assessment Design

In this phase, an assessment design appropriate for measurement within the construct dimensions is developed. The arrangement and interaction of multiple assessments are considered so that scores on one assessment can be linked to scores on adjacent assessments.

Successful assessment linking will result in greater clarity of score meanings and easier comparisons between results on different assessments.
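
As a simplified illustration of one linking approach, the sketch below uses mean-sigma linear linking to express scores from one form on the scale of an adjacent form, assuming the same group of examinees took both. The form names and scores are hypothetical; an operational design would more likely rely on common items or an IRT-based linkage.

```python
import statistics

def linear_link(scores_from, scores_to):
    """Mean-sigma linear linking: returns a function y = A * x + B that maps
    scores on the first form onto the scale of the second form."""
    a = statistics.stdev(scores_to) / statistics.stdev(scores_from)
    b = statistics.mean(scores_to) - a * statistics.mean(scores_from)
    return lambda x: a * x + b

# Hypothetical raw scores from the same group on two adjacent assessments.
form_a = [12, 15, 18, 20, 22, 25, 28, 30]
form_b = [20, 24, 26, 29, 31, 35, 38, 40]

to_form_b = linear_link(form_a, form_b)
print(round(to_form_b(21), 1))  # a Form A score of 21 expressed on the Form B scale
```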

4. Data Gathering

An assessment system requires comprehensive structures for gathering, processing, and storing assessment data. In this phase, the technical requirements for a data system to collect, process, and report on assessment data are defined. Options include the use of optical mark reading (OMR) software and the integration of tablet-based data gathering and storage systems.
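
As a rough sketch of what captured responses might look like once exported from OMR or tablet software (commonly as CSV), the example below parses a small, hypothetical export into per-student records. The column names and 0/1 scoring codes are assumptions, not a required schema.

```python
import csv
import io

# Hypothetical CSV export from an OMR scan or tablet capture tool.
raw_export = """student_id,school,item_01,item_02,item_03
S001,North,1,0,1
S002,North,1,1,1
S003,South,0,0,1
"""

def load_responses(text):
    """Parse scored item responses (1 = correct, 0 = incorrect) keyed by student."""
    records = []
    for row in csv.DictReader(io.StringIO(text)):
        items = {k: int(v) for k, v in row.items() if k.startswith("item_")}
        records.append({"student_id": row["student_id"],
                        "school": row["school"],
                        "responses": items})
    return records

for rec in load_responses(raw_export):
    print(rec["student_id"], sum(rec["responses"].values()), "items correct")
```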

5. Data Analysis

This phase focuses on preparing summary information about assessment item performance. The emphasis is on item-level statistics that are summarized overall and disaggregated by relevant groups.
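
For example, classical item difficulty (the proportion of examinees answering each item correctly) can be reported overall and broken out by a grouping variable. The data and group labels below are hypothetical; the actual disaggregation categories would be set by the program's reporting needs.

```python
from collections import defaultdict

# Hypothetical scored responses: 1 = correct, 0 = incorrect.
responses = [
    {"group": "School A", "items": [1, 0, 1, 1]},
    {"group": "School A", "items": [1, 1, 1, 0]},
    {"group": "School B", "items": [0, 0, 1, 1]},
    {"group": "School B", "items": [1, 0, 0, 1]},
]

def item_difficulty(rows):
    """Proportion correct for each item (the classical p-value)."""
    n_items = len(rows[0]["items"])
    return [round(sum(r["items"][i] for r in rows) / len(rows), 2)
            for i in range(n_items)]

print("Overall:", item_difficulty(responses))

by_group = defaultdict(list)
for r in responses:
    by_group[r["group"]].append(r)
for group, rows in by_group.items():
    print(group + ":", item_difficulty(rows))
```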

The second stage of the data analysis focuses on the suitability of individual items. Here, decisions are made about whether each item should be retained, revised, or excluded from the assessment.
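
One simple screening rule, sketched below with assumed thresholds, flags items whose difficulty falls outside a target range or whose corrected item-total correlation (a basic discrimination index) is low. The cutoff values are illustrative only and would be set with the review panel; flagged items are candidates for review, not automatic removal.

```python
import statistics

# Hypothetical scored response matrix: rows = students, columns = items (1/0).
matrix = [
    [1, 0, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
]

def flag_items(matrix, p_range=(0.2, 0.9), min_discrimination=0.2):
    """Flag items whose difficulty or corrected item-total correlation
    falls outside the assumed (illustrative) thresholds."""
    totals = [sum(row) for row in matrix]
    flagged = []
    for i in range(len(matrix[0])):
        item = [row[i] for row in matrix]
        p = sum(item) / len(item)
        rest = [t - x for t, x in zip(totals, item)]   # total score excluding this item
        disc = statistics.correlation(item, rest)      # point-biserial (Python 3.10+)
        if not (p_range[0] <= p <= p_range[1]) or disc < min_discrimination:
            flagged.append({"item": i + 1, "difficulty": round(p, 2),
                            "discrimination": round(disc, 2)})
    return flagged

print(flag_items(matrix))  # items recommended for panel review
```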

6. Reporting Analysis Findings

Driven by the results of the data analysis phase, work in this phase focuses on compiling the assessment data and related findings needed to answer questions about the development of the assessment.

Validity

Cultural Validity

Working collaboratively with an expert panel helps ensure the cultural validity of the assessment. Panel members assist in defining the assessment constructs and identify appropriate ways of demonstrating proficiency in the assessment domain. The assessments aim to capture the progression from novice to expert so that cultural knowledge can be measured along an established learning continuum.

Content Validity

Content experts relate the knowledge domains to the learning environment and setting to ensure content validity of the assessment. Panel members seek to determine whether the critical elements of the curriculum framework are being addressed and represented properly within the assessment.

Psychometric Validity

Once the cultural aspects of the assessment are confirmed, validation of the instrument requires a measurement model that incorporates the cultural framework and integrates the knowledge domains as either independent or dependent constructs. Sul & Associates International promotes the use of Item Response Theory (IRT) to establish interval-level scaled scores for both formative and summative assessments.
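
As a minimal sketch of how IRT produces interval-level scaled scores, the Rasch model below converts one student's scored responses into an ability estimate (theta, in logits) and then onto an illustrative reporting scale. The item difficulties, the 500/50 scaling constants, and the assumption of a single dimension are all hypothetical; an operational calibration would estimate the item difficulties from the full response data.

```python
import math

def rasch_probability(theta, difficulty):
    """Rasch model: probability of a correct response given ability theta."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def estimate_theta(responses, difficulties, iterations=20):
    """Maximum-likelihood ability estimate via Newton-Raphson, given scored
    responses (1/0) and assumed item difficulties (in logits). Not defined
    for perfect or zero raw scores."""
    theta = 0.0
    for _ in range(iterations):
        p = [rasch_probability(theta, b) for b in difficulties]
        gradient = sum(x - pi for x, pi in zip(responses, p))
        information = sum(pi * (1.0 - pi) for pi in p)
        theta += gradient / information
    return theta

# Hypothetical item difficulties and one student's scored responses.
difficulties = [-1.0, -0.5, 0.0, 0.5, 1.0]
responses = [1, 1, 1, 0, 0]

theta = estimate_theta(responses, difficulties)
scaled_score = 500 + 50 * theta   # illustrative linear transformation to a reporting scale
print(round(theta, 2), round(scaled_score))
```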