Online Readiness Tools and Research

Compiled by the CATIE QA Task Force:

Members

  • Lisa Bunkowski (Central Texas) – Chair
  • Pat Abrego (International)
  • Julia Allen (Texarkana)
  • Aleyda Cantu-Lee (Corpus Christi)
  • Jeremy Gamez (Commerce)
  • Alexandra Janney (Corpus Christi)
  • Gloria Sanchez (International)
  • Linda Scott (Texarkana)
  • Mike Smith (Commerce)
  • Diane Sudman (Tarleton)

Tools

SmarterMeasure

The primary third-party tool is SmarterMeasure, a comprehensive but costly system.

Several of the studies the vendor summarizes:

  • Middlesex Community College: conducted a correlation study examining the relationship between SmarterMeasure scores and students’ grades.
  • Argosy University: designed a four-part research project to Compare, Explore, Trend, and Apply findings from an analysis of SmarterMeasure data.
  • Sargeant Reynolds Community College: conducted an analysis to determine the relationship between SmarterMeasure sub-scale scores and students’ grades.
  • North Central Michigan College: used SmarterMeasure to measure levels of online student readiness and the COMPASS exam to measure incoming students’ skills in reading, writing, and math. They calculated correlations between scores on the two exams to determine the degree of relationship between measures of online learner readiness and measures of academic readiness. This study also includes demographic data on trends in student performance, and a review of the literature.
  • Institutions that use SmarterMeasure in Texas
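Several of the studies above are simple correlation analyses between readiness scores and course grades. The calculation itself is straightforward; a minimal sketch in Python, using entirely hypothetical score data:

```python
from statistics import mean, pstdev

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical data: readiness sub-scale scores (0-100) paired with course grades (0-4.0)
readiness = [62, 75, 81, 58, 90, 70, 85, 66]
grades = [2.1, 2.8, 3.4, 1.9, 3.8, 2.5, 3.5, 2.3]

r = pearson_r(readiness, grades)  # values near +1 indicate a strong positive relationship
```

A study like North Central Michigan College’s would compute such coefficients between SmarterMeasure sub-scale scores and COMPASS scores; a significance test on r would still be needed before drawing conclusions from any particular sample.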

Online Readiness Assessment

This resource, created by Vicki Williams and The Pennsylvania State University, is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. Permissions beyond the scope of this license may be available by emailing vqw@psu.edu.

Version 1; Version 2

Online Readiness Self-Assessment

This resource was developed by Glen Pillsbury at Stanislaus State University and is published freely under a Creative Commons Attribution 4.0 license.

See: Online Readiness Self-Assessment

Michigan Virtual Online Learning Orientation Tool

This resource is freely available and self-paced, designed to help students understand the skills and knowledge necessary for successful online learning. See: Online Learning Orientation Tool.

Used in conjunction with the Online Learner Readiness Rubric, OLOT can help students better understand what to expect and gauge their level of preparedness. See: Online Learner Readiness Rubric

TOOLS: Test of Online Learning Success

This resource was developed by Marcel S. Kerr and Marcus C. Kerr of Texas Wesleyan University, and Kimberly Rynearson of Tarleton State University. Researchers do not need permission from the authors to use the instrument for data collection.

See: Test of Online Learning Success

For examples of institutions that use the TOOLS, see the following:

  • The self-assessment test used by the University of Arkansas Online was based on the TOOLS: Test of Online Learning Success.
  • The online learning self-assessment survey used at Kirtland Community College: Student Online Self-Assessment Survey was also based on the TOOLS: Test of Online Learning Success.

Merlot Bookmark Collection: Student Readiness for Online Learning

A collection of online resources with examples of self-administered online-readiness assessments, created in 2013 by Gerry Hanley.

Rutgers list of online readiness assessments

A list of readiness-assessment sites. Although somewhat dated, with some broken links, it includes useful resources.

Copied below are those not already referenced above.

WCET Discussion Thread: Online Learning Student Assessment

  • At Portland Community College, The Virtual Backpack: The Start Guide for Online Learning gives students an overview of what online learning entails, an introduction to academic integrity, an overview of required technology (and proficiency), and a description of available student support. They also incorporate a non-cognitive assessment licensed through SmarterMeasure. Students must complete it before they can register for their first online class, but they do not need to achieve a specific score.

They worked with their IR department to determine the efficacy of the Virtual Backpack in the initial Spring term and a subsequent Fall term. They saw a 5% increase in pass rates, an equally impressive improvement in GPA for first-time online students, and, generally, no difference in success between first-time and returning online students.

They released the contents of the Virtual Backpack under the CC-By license, and developed a project summary document.

  • The California Community College Online Education Initiative (OEI) has developed, and makes available through Creative Commons, a comprehensive online student readiness program called “Quest.” The OEI also provides faculty support.

The Quest Program, which focuses on the cognitive and non-cognitive factors contributing to online student success, has been developed in a Canvas course shell.

There are three components and two pathways, one for the novice online student and another for the experienced online student. The program incorporates (1) the vendor product SmarterMeasure, a diagnostic assessment of online readiness; (2) customized interactive, skill-building multimedia tutorials, quizzes, and supplemental materials; and (3) a self-check on learning activities.

Research

Annotated Bibliography

Bernard, R. M., Brauer, A., Abrami, P. C., & Surkes, M. (2004).

The development of a questionnaire for predicting online learning achievement. Distance Education, 25(1), 31-47. doi: 10.1080/0158791042000212440

This study addresses the researchers’ efforts to create and evaluate an instrument to predict online learning success, measured in the study by course grade. The instrument focused on four factors: beliefs about distance education, confidence, self-direction, and desire for interaction. However, the final results indicated that cumulative GPA was the best predictor of course grade.

Blankenship, R., & Atkinson, J. K. (2010).

Undergraduate student online learning readiness. International Journal of Education Research, 5 (2), 44-54.

Blankenship and Atkinson (2010) replicated an Australian study (Smith, 2003, 2005) on the effectiveness of the McVay Online Readiness Survey (2000, 2001). The aim of their paper was to help validate the accuracy of the instrument.

Farid, A., Plaisent, M., Bernard, P., & Chitu, O. (2014).

Student online readiness assessment tools: A systematic review approach. Electronic Journal of E-Learning, 12(4), 375-382. Retrieved from http://www.ejel.org

This systematic review of more than 5,000 published and unpublished papers on online readiness assessment tools, published between 1990 and 2010, aims to identify those that allow assessment of students’ preparation for online learning and to determine which tools had been validated.

Key Results:

  • No standard tool exists
  • Few tools (developed commercially or in-house) demonstrated good psychometric quality
  • Universities prefer to develop instruments that fit their online programs

Gascoigne, C., & Parnell, J. (2014).

Distance education readiness assessments: An overview and application. Online Journal of Distance Learning Administration, 17(4), 1-7. Retrieved from http://www.westga.edu/~distance/ojdla/

As background for their empirical study, Gascoigne and Parnell (2014) provide a very helpful review of a series of online readiness assessment tools, including those developed commercially and in-house. They note that while commercially developed instruments have the research to back them up, they generally come with high prices attached.

Geiger, L. A., Morris, D., Suboez, S. L., Shattuck, K., & Viterito, A. (2014).

Effect of student readiness on student success in online courses. Internet Learning Journal, 3(1), 73-84. doi: 10.18278/il.3.1.7

This study identifies readiness assessment (via SmarterMeasure) as an important component of student success (retention and course grade) in online learning.

Hall, M. C. (2009).

A factor analysis of the distance education surveys “Is Online Learning Right for Me?” and “What Technical Skills Do I Need?” Quarterly Review of Distance Education, 10(4), 339-345.

Hall notes that assumptions about the predictive validity of online readiness surveys are not well founded. He specifically looks at two different tools, employing quantitative methods to compare different populations. He concludes that as online courses grow in popularity, institutions will continue to look to such surveys to bolster the viability of their online programs; the lack of validated tools is problematic for these efforts. (RMS)

Hall, M. (2011).

A predictive validity study of the Revised McVay Readiness for Online Learning Questionnaire. Online Journal of Distance Learning Administration, 14(3). Retrieved from http://www.westga.edu/~distance/ojdla/

Considers the predictive validity of the McVay Readiness survey. Concludes there may be merit to the tool, although the study’s small sample size gives reason for caution. Hall makes several recommendations, including additional testing of the instrument using larger sample sizes. (RMS 2-13-19)

Moser, J., Philipson, D., & Reed, E. (2018, April 9).

Using a course start-up message to improve student outcomes. Educause Review. Retrieved from https://er.educause.edu/

This article discusses the efforts of the Plymouth State University Online Reflective Practice Group to enhance student success in online learning. The article includes an overview of the needs and characteristics of adult learners, and a chart of key skills for online learning success. Their recommendations include a robust course start-up message.

Pillay, H., Irving, K., & Tones, M. (2007).

Validation of the diagnostic tool for assessing tertiary students’ readiness for online learning. Higher Education Research & Development, 26(2), 217-234. doi: 10.1080/07294360701310821

The authors of this study continue efforts to validate the assessment instrument, Tertiary Students’ Readiness for Online Learning (TSROL). This instrument includes four categories: technical skills, computer self-efficacy, learner preferences, and attitudes towards computers. The authors recommend updating the language and approach of the instrument, particularly with regard to the “learner preferences” and “attitudes” categories. This study also provides comparisons with other instruments, including the COLQ (Smith, 2005), the Osborn scale (2001), Muse (2003), and others.

Doe, R., Castillo, M. S., & Musyoka, M. M. (2017).

Assessing online readiness of students. Online Journal of Distance Learning Administration, 20(1). Retrieved from http://www.westga.edu/~distance/ojdla/

This study addresses the researchers’ effort to create an instrument to measure the online readiness of undergraduate students. The instrument focuses on digital engagement, motivation, self-efficacy, and learner characteristics. The study examines the development and assessment of the 14-item instrument, which is included in the article.

Smith, P. J., Murphy, K. L., & Mahoney, S. E. (2003).

Towards identifying factors underlying readiness for online learning: An exploratory study. Distance Education, 24(1), 57-67. doi: 10.1080/01587910303043

This study assesses the value and reliability of the Readiness for Online Learning instrument (McVay, 2000). The authors recommend enhancing the questionnaire, and continued effort to establish validity.

Wladis, C., Conway, K. M., & Hachey, A. C. (2016).

Assessing readiness for online education—research models for identifying students at risk. Online Learning, 20(3), 97-109. Retrieved from https://olj.onlinelearningconsortium.org/index.php/olj/index

This article provides an informative study of student characteristics that may jeopardize their success in online learning. While the study does not directly address online readiness survey tools, it might prove helpful to those looking to develop an in-house readiness instrument.

Wladis, C., & Samuels, J. (2016).

Do online readiness surveys do what they claim? Validity, reliability, and subsequent student enrollment decisions. Computers & Education, 98, 39-56. doi:10.1016/j.compedu.2016.03.001

This detailed study tested the validity and reliability of an online readiness survey, and recommends extreme caution regarding the implementation of such instruments.

Key results:

  • Predictive validity should be tested before implementation
  • They may unnecessarily discourage students from enrolling in online courses
  • Consider alternatives, such as student characteristics that identify students at-risk online (see Wladis, Conway, & Hachey, 2016)
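Testing predictive validity before implementation, as the authors recommend, amounts to checking how well survey scores actually predict subsequent outcomes. A minimal sketch of one common approach, simple least-squares regression of course outcome on survey score, with entirely hypothetical data:

```python
from statistics import mean

def fit_line(xs, ys):
    """Least-squares slope, intercept, and R^2 for regressing ys on xs."""
    mx, my = mean(xs), mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical data: readiness survey scores and subsequent course grades
scores = [55, 60, 68, 72, 80, 85, 91]
grades = [1.8, 2.0, 2.6, 2.4, 3.1, 3.3, 3.7]

slope, intercept, r2 = fit_line(scores, grades)
# A low R^2 would suggest the survey adds little predictive value
```

If the survey explains little variance in outcomes, requiring it before enrollment risks discouraging students without any compensating predictive benefit, which is the study’s central caution.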

Xiong, J., So, H., & Toh, Y. (2015).

Assessing learners’ perceived readiness for Computer-Supported Collaborative Learning (CSCL): A study on initial development and validation. Journal of Computing in Higher Education, 27(3), 215-239. doi: 10.1007/s12528-015-9102-9

This study addresses the researchers’ efforts to create an instrument to measure Student Readiness for Computer-Supported Collaborative Learning (SR-CSCL). The resulting SR-CSCL includes a three-part framework focused on collaborative/online learning: motivation, collaboration behavior, and online learning aptitude. This instrument addresses social and technological aspects of online learning.

Yu, T., & Richardson, J. C. (2015).

An exploratory factor analysis and reliability analysis of the Student Online Learning Readiness (SOLR) Instrument. Online Learning, 19(5), 120-141. Retrieved from https://olj.onlinelearningconsortium.org/index.php/olj/index

This study addresses the researchers’ effort to create the Student Online Learning Readiness (SOLR) instrument. The SOLR addresses social competencies with the instructor and with classmates, communication competencies, and technical competencies. The theoretical framework for the study is the Student Integration Model (Tinto, 1975). This study provides a useful emphasis on the social aspects of online learning, to balance the technical and computer skills emphasis often seen in other readiness instruments.
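Yu and Richardson’s reliability analysis is the kind of check any in-house instrument should undergo. A common internal-consistency statistic is Cronbach’s alpha; a minimal sketch with hypothetical Likert-scale responses (the item values here are invented for illustration):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of items, each a list of one score per respondent."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total score
    item_var = sum(pvariance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

# Hypothetical 1-5 Likert responses: three items, five respondents
items = [
    [3, 4, 5, 2, 4],
    [3, 5, 5, 2, 3],
    [2, 4, 5, 3, 4],
]
alpha = cronbach_alpha(items)  # values above ~0.7 are conventionally considered acceptable
```

Alpha measures how consistently a set of items taps a single underlying construct; a multi-factor instrument like the SOLR would typically report it per subscale (social, communication, technical) rather than for the instrument as a whole.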