
Research

2010 Student Readiness Report

SmarterServices, LLC, the provider of the SmarterMeasure Learning Readiness Indicator, annually analyzes aggregate SmarterMeasure data from all students who took the assessment during the prior year. No data specific to individual students or individual schools is made publicly available. Data in the 2010 report was drawn from 209,025 unique students at 274 higher education institutions who took the SmarterMeasure assessment between July 1, 2009 and June 30, 2010. Highlights in the report include the following statistically significant differences between the means of the variables of gender, ethnicity, institution type, age range, and number of prior online courses taken as they relate to student readiness for online learning:

  • Gender: Females had significantly higher means on the constructs of individual attributes and typing accuracy. Males had significantly higher means on the constructs of reading rate and technical knowledge.
  • Ethnicity: Statistically significant differences in means were reported in five of the eight constructs based on ethnicity. Students identifying as Caucasian/White had the highest means for technical knowledge, typing speed, and reading recall. Asian or Pacific Islander students had the highest mean for typing accuracy. American Indian students had the highest mean for individual attributes.
  • Age Range: Significant differences existed in five of the eight constructs measured. Generally speaking, age matters: for constructs related to personal maturity, older students had the highest means, while for constructs related to technical matters, younger students had the highest means.
  • Number of Courses: The results demonstrated that experience matters with online learning. In all eight constructs measured, persons who reported having taken five or more prior online courses had the highest mean, and the differences in means were statistically significant in four of the eight constructs. The greatest difference in means between students with no prior online course experience and those who had taken five or more courses was in the area of technical knowledge. This suggests that, with experience, students can learn to use the technology required for online courses.
  • Institution Type: An analysis of variance (ANOVA) was conducted to determine whether differences exist among students at different types of institutions. Significant differences existed on four of the eight constructs measured. Baccalaureate institutions had significantly higher means on the constructs of learning styles and individual attributes, while special focus institutions had the highest means for reading recall and technical knowledge.
A full copy of the report is available here.

Assessment Details

All six components of SmarterMeasure are grounded in theoretical research and practice. The six components, each described in its own section below, are learning styles, individual attributes, life factors, on-screen reading rate and recall, technical competency, and typing speed and accuracy.

The providers of SmarterMeasure encourage schools to conduct research with SmarterMeasure data regarding their own students. When schools plan an analysis of their SmarterMeasure data, they often plan first to correlate SmarterMeasure scores with students' grades in the course. This is a welcome analysis and typically yields statistically significant findings. A 2008 study conducted by Atanda Research analyzed the SmarterMeasure scores of 2,622 randomly selected students representing over 300 schools. Correlations significant at the .05 level or higher were found between 11 of the 15 SmarterMeasure score variables and students' grades. However, this analysis is not the most appropriate way to measure the validity of SmarterMeasure scores, because students' grades are affected by a myriad of variables (prior academic experiences, IQ, etc.). SmarterMeasure is not designed to be an indicator of academic success; several tools, such as the ACT, SAT, and GRE, serve that purpose. SmarterMeasure does not measure content knowledge in areas such as math, science, or history, so using SmarterMeasure solely as a predictor of academic success is not an appropriate application.

In addition to correlating SmarterMeasure scores with grades, we recommend three other types of analysis that more validly measure the applicability of SmarterMeasure:

  • Persistence comparison. Identify students who dropped out of their courses and compare the means of their SmarterMeasure scores to the means of the scores of students who persisted. The intent of SmarterMeasure is to identify students who are "at risk" of not being a good fit for distance or technology-rich learning, and it is these students who are more likely to drop out. The real benefit of SmarterMeasure comes when schools identify at-risk students and then provide the encouragement, remediation, and support those students need to remain in the course and succeed. A minimal sketch of this comparison appears after this list.
  • Goodness-of-fit survey. After students have completed their first online, hybrid, or technology-rich course, survey them about the goodness of fit: how well they kept up with the volume of reading, the degree to which they could find time to participate in course activities, the level of frustration they had with their computer and the Internet, and so on. Then correlate their responses with their SmarterMeasure scores. This type of study is very appropriate because SmarterMeasure is intended to predict the goodness of fit of distance or technology-rich education. In the 2008 Atanda Research study, 63 of the 90 correlations calculated between measures of goodness of fit and SmarterMeasure scores were statistically significant at the .05 level or higher.
  • Qualitative study. Interview, individually or in focus groups, students who persisted in online or technology-rich courses and students who withdrew. Compare the factors that influenced their decisions to remain or withdraw against the means of your students' SmarterMeasure scores.
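
As an illustration of the first recommended analysis, here is a minimal Python sketch using the independent-samples t-test from SciPy. All score values below are hypothetical; a real study would export actual scores from the SmarterMeasure administrative interface.

    # Sketch of recommendation (1): comparing mean SmarterMeasure scores
    # of students who withdrew against those who persisted.
    # All data below is hypothetical, for illustration only.
    from scipy import stats

    # Hypothetical individual-attributes scores (0-100 scale assumed)
    persisted = [78, 85, 72, 90, 66, 81, 74, 88, 79, 83]
    withdrew  = [61, 70, 58, 75, 64, 52, 69, 60]

    # Welch's independent-samples t-test on the two group means
    t_stat, p_value = stats.ttest_ind(persisted, withdrew, equal_var=False)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    if p_value < .05:
        print("Difference in means is significant at the .05 level")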

Not only does the provider of SmarterMeasure support additional analysis of SmarterMeasure data, but we will also support you in the effort. If your school would like to conduct such a study, contact Dr. Mac Adkins for assistance in designing the study, exporting the correct data, and conducting the statistical analysis.

Three major studies have been conducted by external research agencies to measure the reliability and validity of SmarterMeasure. Information about these studies is presented below.

Construct Validity

Construct validity refers to whether an assessment measures a theorized psychological construct. In the case of SmarterMeasure, construct validity is the degree to which SmarterMeasure indicates a learner's level of readiness for studying in an online or technology-rich environment. Results from the two studies described below indicate that SmarterMeasure has strong construct validity as an indicator of the goodness of fit of distance learning, as evidenced by multiple correlations that are statistically significant at the .01 level.

It should be noted that SmarterMeasure is not designed to be a predictor of academic success. A myriad of variables affect academic success in online courses, ranging from the student's intelligence to the level of interactivity of the online faculty member. SmarterMeasure is an indicator of the degree to which online, hybrid, and technology-rich courses are a good fit for a student. SmarterMeasure does not make a value judgment indicating that a student should or should not take such courses. Rather, it informs students of their strengths and opportunities for growth in areas related to taking these types of courses. If a student is found to be deficient in a certain area and the school provides appropriate remediation and/or support, SmarterMeasure can serve as a retention tool by helping students succeed as they learn in online or technology-rich courses.

In 2007 an external research firm (Atanda Research, Alexandria, VA) was commissioned to analyze data gathered during a study of the relationship between SmarterMeasure scores and measures of academic success and goodness of fit of distance education, as a measure of construct validity. The major finding of this report was forty-two statistically significant correlations between SmarterMeasure variables and measures of academic success and goodness of fit. Of the five constructs measured by SmarterMeasure, the construct most strongly correlated with academic success and goodness of fit was individual attributes: participants' individual attributes scores correlated with all measures of academic success and goodness of fit at the .001 level of significance. The strongest single correlation in the study was the relationship between grade point average and reading comprehension. Click here to view a copy of this report.

In 2008 the Atanda Research study was replicated as part of a doctoral learner's dissertation research involving 2,622 students, representing over 300 schools, who had taken SmarterMeasure. The replication yielded even stronger results than the original study: of the 105 possible correlations measured, 74 were statistically significant. The factor measured by SmarterMeasure with the strongest correlations to measures of goodness of fit and academic success was individual attributes, which yielded correlations significant at the .01 level in each of the seven categories. This mirrors the 2007 finding that individual attributes were the strongest indicator of goodness of fit of distance education.

The following correlation matrix presents the results of the statistical analysis from this study:


Correlations between student success variables (SmarterMeasure scores) and measures of goodness of fit for online learning and measures of academic success.

[Correlation matrix table]

*  Correlation is significant at the .05 level
** Correlation is significant at the .01 level


Item Reliability

In statistics, reliability is the consistency of a set of measurements used in an assessment: a measure of whether the items of an instrument give, or are likely to give, the same result across repeated attempts.

In 2008 Applied Measurement Associates of Tuscaloosa, Alabama was commissioned to calculate reliability coefficients for the questions in SmarterMeasure. Cronbach's alpha values in the range of .70 to .95 are generally expected to indicate a reliable assessment.
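
For reference, Cronbach's alpha can be computed from an item-response matrix as sketched below. This is a generic implementation of the standard formula, not Applied Measurement Associates' actual procedure, and the response data is invented.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents x n_items) matrix:
        alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
        k = items.shape[1]                          # number of items
        item_vars = items.var(axis=0, ddof=1)       # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)   # variance of summed scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical responses from 5 students on 4 dichotomous (0/1) items;
    # the low variability of 0/1 scales is what depresses alpha, as noted
    # in the comment on scale type below.
    responses = np.array([
        [1, 1, 1, 0],
        [1, 0, 1, 1],
        [0, 1, 0, 0],
        [1, 1, 1, 1],
        [0, 0, 1, 0],
    ])
    print(f"alpha = {cronbach_alpha(responses):.3f}")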

[Reliability coefficient table]

It should be noted that the areas of SmarterMeasure showing lower reliability coefficients used a dichotomous (0, 1) scale type. This scale type results in lower variability among the possible answers, which reduces the measured reliability.

One of the useful features of SmarterMeasure is that school leaders (faculty and/or administrators) can view SmarterMeasure scores through a dashboard which allows them to identify at a glance students who might be at risk of not doing well in an online or technology-rich course, based on their SmarterMeasure scores. The school can then provide remediation and support as appropriate. This serves as a valuable student service which can increase retention rates among online learners. Because each school's student population is unique, schools can set grading thresholds to determine what level of SmarterMeasure scores should classify their students as "failed," "questionable," or "passed" (see the sketch after this paragraph). In July 2008 an analysis was conducted based on the 108,423 students who had taken SmarterMeasure in the prior twelve months. Based on this analysis, recommendations were made regarding the setting of the grading threshold values in the administrative dashboard of SmarterMeasure. Click here to view a copy of this report.
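
Here is a minimal sketch of how configurable grading thresholds of this kind might work. The cutoff values and the function are hypothetical illustrations, not SmarterMeasure's actual settings.

    # Hypothetical school-configurable thresholds for classifying a score.
    # The cutoffs below are invented; each school sets its own values.
    THRESHOLDS = {"questionable": 60, "passed": 80}

    def classify(score: float) -> str:
        """Map a 0-100 SmarterMeasure score onto the three categories."""
        if score >= THRESHOLDS["passed"]:
            return "passed"
        if score >= THRESHOLDS["questionable"]:
            return "questionable"
        return "failed"

    for s in (45, 67, 92):
        print(s, "->", classify(s))   # 45 -> failed, 67 -> questionable, ...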

This analysis revealed the following distributions of SmarterMeasure scores:

[Score distribution charts: Individual Attributes, Technical Knowledge, Reading Comprehension, Overall Technical Competency]


In November 2010 an analysis was conducted using only data from secondary-level students to determine the appropriate readiness range settings for the secondary version of SmarterMeasure. Click here to view a copy of this report.


Learning Styles

The learning styles instrument embedded in SmarterMeasure is based on the multiple intelligences approach to identifying a person's dominant learning style(s). The theory of multiple intelligences, proposed by Dr. Howard Gardner in 1983, describes the levels of the following types of intelligence in an individual: visual-spatial, verbal-linguistic, logical-mathematical, bodily-kinesthetic, musical-rhythmic, interpersonal, and intrapersonal. Additional categories of intelligence have also been recognized in the literature on this subject.

Read more about learning styles ...

Individual Attributes

The component of SmarterMeasure which measures individual attributes is based on the dissertation research of Dr. Julia Hartman. Dr. Hartman served as the Manager of the Alabama Online High School. In 2001 she received her Ph.D. from the University of Alabama in Instructional Leadership with emphasis in Instructional Technology (minors in Educational Research and Educational Computer Technology). Her dissertation was titled ATFY-R: Psychometric properties and predictive value for academic performance in online learning.

In her dissertation she identified the individual attributes which are significant predictors of success in an online learning environment. These are variables such as motivation, procrastination, time availability, and willingness to seek help. The individual attributes section of SmarterMeasure measures these variables which are indicators of success in an online course environment.

SmarterMeasure has partnered with Pearson Education, the world's leading learning company, to provide student success resources at a discounted rate. By using these resources, students can learn how to enhance their opportunities for success in higher education.

Life Factors

Formal and informal feedback was submitted by faculty and administrators of several schools which use SmarterMeasure. Based on their suggestions and a review of the literature on situation-in-life variables which influence learner retention, a first draft of the instrument was created and then revised by a professional psychometrician and statistician. Reliability and validity measurements will be calculated after several months of usage. The Life Factors construct in SmarterMeasure has five subscales (time, place, reason, resources, and skills). Each subscale has a maximum value of 20 points, for a total construct score of 100; a score near 100 indicates a person whose situation in life is very conducive to online learning. The subscale scores for time, place, reason, resources, and skills were used in a confirmatory factor model to create a single scaled latent variable score termed "life factor."
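
To make the scoring concrete, here is a small hypothetical illustration of the subscale arithmetic described above; the values are invented.

    # Hypothetical Life Factors scoring: five subscales, each worth up to
    # 20 points, summing to a 0-100 total as described above.
    subscales = {"time": 16, "place": 18, "reason": 20,
                 "resources": 14, "skills": 17}

    total = sum(subscales.values())   # maximum possible is 5 * 20 = 100
    print(f"Life Factors total: {total}/100")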

On-Screen Reading Rate and Recall

The on-screen reading rate and recall section of SmarterMeasure was developed by an expert panel of educators representing institutions which are clients of SmarterServices, in cooperation with LiteracyWorks.org, a project of the National Institute for Literacy.

Both reading rate and recall are measured in SmarterMeasure because students should realize that they must not read on-screen course content too rapidly, since they may be assessed on that content in their courses.

SmarterMeasure is used by secondary schools, technical colleges, community colleges, universities, and corporations. To best fit the needs of each organization's learners, several reading passages are available, and institutions may select, for each login group, the passage that is most developmentally appropriate for that group of learners. The following passages are available:

[Flesch-Kincaid readability table of available passages]

The Flesch and Flesch-Kincaid readability tests are designed to indicate how difficult a passage of contemporary academic English is to understand. The two tests are the Flesch-Kincaid Grade Level and the Flesch Reading Ease. Although both use the same core measures (word length and sentence length), they have different weighting factors, so the results of the two tests correlate imperfectly: a text with a comparatively high score on the Reading Ease test may have a lower score on the Grade Level test. The Reading Ease test was devised by Rudolf Flesch; the Grade Level formula was developed later by J. Peter Kincaid.

The "Flesch-Kincaid Grade Level Formula" translates the 0-100 score to a U.S. grade level, making it easier for teachers, parents, librarians, and others to judge the readability level of various books and texts. It can also mean the number of years of education generally required to understand this text. The result is a number that corresponds with a grade level. For example, a score of 8.2 would indicate that the text is expected to be understandable by an average student in 8th grade (usually aged 13-14 in the U.S.).

In the Flesch Reading Ease test, higher scores indicate material that is easier to read; lower scores mark passages that are more difficult. For comparison, the Reading Ease score of Reader's Digest is about 65, Time magazine is about 52, and the Harvard Law Review is in the low 30s.
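
Both formulas are standard and can be computed directly from word, sentence, and syllable counts, as in this sketch; the passage statistics in the example are invented for illustration.

    def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
        """Flesch Reading Ease: higher scores mean easier text (roughly 0-100)."""
        return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

    def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
        """Flesch-Kincaid Grade Level: the result maps to a U.S. school grade."""
        return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

    # Hypothetical passage statistics: 130 words, 8 sentences, 195 syllables.
    print(f"Reading Ease: {flesch_reading_ease(130, 8, 195):.1f}")   # ~63.4
    print(f"Grade Level:  {flesch_kincaid_grade(130, 8, 195):.1f}")  # ~8.4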

The degree to which the learner can recall the information in these passages is measured by ten questions. There are two of each of the following types of questions: sequence of events, factual, inferential, cloze, and main idea.

Participants are not allowed to view the reading passages while taking the quiz. As such, SmarterMeasure provides an assessment of reading recall, not reading comprehension. The intention of this component is to measure the degree to which a person can read academic information on-screen and then recall that information on a quiz, a task frequently replicated in online and technology-rich courses.

It should be noted that the reading rate and recall section of SmarterMeasure should not be used as an exhaustive reading skills inventory. Rather, it should be used as a screening device to identify learners who may be having difficulty recalling what they have read on-screen. If a learner is identified as having opportunities for growth in this area, the school can then inform the student about the resources for remediation and support which they provide. Communicating these resources can be automated through the feedback mechanisms of SmarterMeasure.


Technical Competency

The technical competency and typing components of SmarterMeasure were developed by Dr. Mac Adkins. Dr. Adkins holds an Ed.D. from Auburn University in Educational Leadership with an emphasis on instructional technology. He was one of the authors of the Alabama Course of Study in Technology used by all public schools in Alabama, and a participating writer for the National Education Technology Standards (NETS) for Teachers document published by the International Society for Technology in Education. Dr. Adkins also teaches Administration and Leadership of Distance Learning Programs online for Capella University.

The premise of the technical competency section is that if students do not possess basic technical competencies, they will quickly become frustrated and may drop out of the online course. The tasks measured in the technical competency section are basic technology skills which a learner should possess to begin studying online.

Typing Speed and Accuracy

Average typing speeds of persons who type regularly in their occupations range from 50 to 70 words per minute; average typing speeds for the general public are considered to be around 30 words per minute. Between July 1, 2009 and June 30, 2010, a total of 152,130 students completed the typing section of the SmarterMeasure assessment. Their average adjusted typing speed was 27.64 words per minute. This slower average rate of typing is a factor that should be considered by schools as they design online courses and by students as they plan their time for participating in online courses. The adjusted typing speed was calculated by dividing the number of words typed by the elapsed time and subtracting a penalty for the number of errors.
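
The report does not publish the exact adjustment formula, so the sketch below uses a common net-words-per-minute convention (gross words per minute minus an error penalty); treat the formula as an assumption, not SmarterMeasure's own.

    def adjusted_wpm(words_typed: int, seconds: float, errors: int) -> float:
        """Net words per minute under a common convention: gross WPM minus
        one WPM per error per minute. This is an assumed formula; the
        report does not publish SmarterMeasure's exact calculation."""
        minutes = seconds / 60
        gross_wpm = words_typed / minutes
        return gross_wpm - errors / minutes

    def accuracy(correct_words: int, words_typed: int) -> float:
        """Typing accuracy as a percentage of correctly typed words."""
        return 100 * correct_words / words_typed

    # Hypothetical one-minute sample: 30 words typed, 2 of them in error.
    print(f"Adjusted speed: {adjusted_wpm(30, 60, 2):.1f} WPM")  # 28.0
    print(f"Accuracy: {accuracy(28, 30):.1f}%")                  # 93.3%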

[Adjusted words per minute formula]

Adjusted Typing Speed
  N                   152,130
  Mean                27.64 WPM
  Median              26 WPM
  Mode                21 WPM
  Standard Deviation  11.997

Decile (10%) Typing Scores
  Top 10%      44+ WPM
  2nd 10%      37-43 WPM
  3rd 10%      33-36 WPM
  4th 10%      29-32 WPM
  5th 10%      26-28 WPM
  6th 10%      23-25 WPM
  7th 10%      20-22 WPM
  8th 10%      17-19 WPM
  9th 10%      13-16 WPM
  Bottom 10%   12 WPM or less

Although the average adjusted typing speed of these students is lower than that of the general public, this may be partly explained by high levels of typing accuracy. The nature of academic assignments prompts students to be more concerned with accuracy than speed, since inaccurately typed words could negatively affect their grades on submitted assignments. On a scale of 0 to 100%, the average typing accuracy of these 152,130 students was 92.41%.

Typing Accuracy
  N                   152,130
  Mean                92.41%
  Median              98%
  Mode                100%
  Standard Deviation  16.956


SmarterMeasure Usage Patterns

Schools use SmarterMeasure in a variety of ways. A common model is to embed SmarterMeasure as an assignment in an orientation course. Some schools use it in an orientation course specific to online learners, while others use it in the general orientation course that all students take. SmarterMeasure is a useful student service tool not only for students taking fully online courses, but also for hybrid courses, video conferencing courses, and even face-to-face courses which use the Internet for communication. Several schools make SmarterMeasure available to prospective students through their websites. In May 2009, schools which use SmarterMeasure were asked to describe how SmarterMeasure benefits their students and how they utilize it. Click here to view this report.



Brief Review of Literature on the Need for SmarterMeasure

With the shift toward online learning, it is important to explore the adoption of online education. Previous studies found that 64 percent of academic leaders believe it takes more discipline for a learner to succeed in an online course (Sloan Consortium, 2006), placing additional responsibility on students to be self-directed learners. Before the start of an online program or course, it should be determined whether a learner's instructional need can be met through a distance education approach (Willis & Lockee, 2004). Assessing the prerequisite skills of the distance learner is critical (Hsiu-Mei & Liaw, 2004; Simonson et al., 2003): learners need sufficient technological proficiency and a strong motivation to learn through technology (Hsiu-Mei & Liaw, 2004). Because of the difficulty of accommodating a group of learners with a wide range of acquired skills, requirements for prerequisite skills should be set (Falvo & Solloway, 2004). One researched method of examining online readiness considers three aspects: (a) the student's preference for online instructional delivery as compared to traditional face-to-face instruction; (b) the student's confidence and competence in using the Internet and computer-mediated communication for learning; and (c) the ability to engage in autonomous learning (P. J. Smith et al., 2003). Hall (2008, para. 27) stated that "the primary value of the surveys may lie in raising awareness for any student considering enrolling in a distance education course."

Pamela Dupin-Bryant of Utah State University Tooele conducted a study, published in The American Journal of Distance Education, titled "Pre-entry Variables Related to Retention in Online Distance Education." The study identified pre-entry variables related to course completion and non-completion in university online distance education courses. Four hundred sixty-four students enrolled in online distance education courses participated. Discriminant analysis revealed six pre-entry variables related to retention: cumulative grade point average, class rank, number of previous courses completed online, training in searching the Internet, training in operating systems and file management, and training in Internet applications. The results indicate that prior educational experience and prior computer training may help distinguish between individuals who complete university online distance education courses and those who do not. SmarterMeasure measures all of the variables the study identified except class rank.
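
For context, the kind of discriminant analysis the study describes can be sketched with scikit-learn. The feature names and data below are invented stand-ins for the pre-entry variables the study lists, not Dupin-Bryant's data.

    # Sketch of a discriminant analysis on hypothetical pre-entry variables.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Columns: GPA, prior online courses completed, hours of computer training
    X = np.array([
        [3.4, 2, 10], [2.1, 0, 2], [3.8, 5, 15], [2.5, 1, 4],
        [3.0, 3, 8],  [1.9, 0, 1], [3.6, 4, 12], [2.2, 0, 3],
    ])
    y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = completed, 0 = withdrew

    lda = LinearDiscriminantAnalysis().fit(X, y)
    print("Per-feature weights:", lda.coef_)        # relative contributions
    print("Predicted completion:", lda.predict([[2.8, 1, 5]]))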

Click here to view a paper presented at the 2009 Distance Learning Administration conference about the usage of SmarterMeasure.

Developmental Students

When developmental students enroll in distance classes, they bring with them the same need for support that they have in a conventional classroom (Caverly and MacDonald, 1998; Rhoda and Burns, 2005), and surprisingly little research has been done on how best to facilitate the progress of underprepared students in an online class (Perez and Foshay, 2002). Distance education requires more self-directed learning and higher levels of personal motivation, independence, and self-discipline (Sampson, 2003), in addition to the technical skills required for participation in an online class (Caverly and MacDonald, 1998). These are all skills in which underprepared students may be lacking. Fortunately, the same technology that delivers the class can deliver the support systems.

Additional Research Requests

Additional research on SmarterMeasure is welcomed. If you are interested in conducting research on the topic of online student readiness using SmarterMeasure data, please send a brief research request to Dr. Mac Adkins (mac@smarterservices.com). In the request, describe the purpose and plan for your research, including the proposed subjects, timeline, and plans for dissemination. All research done using SmarterMeasure data must comply with our privacy statement: we never release to third parties any data which identifies individual students or specific schools.

Reference List

  • Caverly, D., & MacDonald, L. (1998). Techtalk: Distance developmental education. Journal of Developmental Education, 2. Retrieved October 12, 2007, from Academic Search Premier.
  • Dupin-Bryant, P. A. (2004). Pre-entry variables related to retention in online distance education. American Journal of Distance Education, 18(4), 199-206.
  • Falvo, D. A., & Solloway, S. (2004). Constructing community in a graduate course about teaching with technology. TechTrends: Linking Research & Practice to Improve Learning, 48(5), 56.
  • Hall, M. (2008, Fall). Predicting student performance in web-based distance education courses based on survey instruments measuring personality traits and technical skills. Online Journal of Distance Learning Administration, 11. Retrieved April 20, 2009, from http://www.westga.edu/%7Edistance/ojdla/fall113/hall113.html
  • Hsiu-Mei, H., & Liaw, S.-S. (2004). Guiding distance educators in building web-based instructions. International Journal of Instructional Media, 31(2), 125.
  • Perez, S., & Foshay, R. (2002). Adding up the distance: Can developmental studies work in a distance learning environment? T.H.E. Journal, 29, 16+. Retrieved May 22, 2007, from Questia.
  • Rhoda, K. R., & Burns, C. N. (2005). Developing an online writing center for distance learning courses. Paper presented at the 21st Annual Conference on Distance Learning and Teaching. Retrieved October 13, 2007, from http://www.uwex.edu/disted/conference/Resource_library/proceedings/05_1923.pdf
  • Sampson, N. (2003). Meeting the needs of distance learners. Language Learning & Technology, 7, 103+. Retrieved June 13, 2007, from Questia.
  • Simonson, M., Smaldino, S., Albright, M., & Zvacek, S. (2003). Teaching and learning at a distance. Upper Saddle River, NJ: Pearson Education.
  • Smith, P. J., Murphy, K. L., & Mahoney, S. E. (2003). Towards identifying factors underlying readiness for online learning: An exploratory study. Distance Education, 24(1), 57.
  • United States Distance Learning Association. (2004). Retrieved March 10, 2004, from http://www.usdla.org
  • Willis, L. L., & Lockee, B. B. (2004). A pragmatic instructional design model for distance learning. International Journal of Instructional Media, 31(1), 9.