NMSU Astronomy


Nicole Vogt » Evaluation of Progress within an Online Tutor


I have developed an online tutor for general astronomy, one that allows students to conduct self-paced studying of a range of astronomy topics covering the solar system, stars and the galaxy, and the universe of galaxies beyond. A cornerstone of the program is the ease of evaluation of progress, both for students examining their own work and for instructors supporting a cohort of students.

Instructors are given detailed progress reports for each student, and students can view a summary version for themselves at any time. Each report contains a set of tables and figures designed to illustrate progress and identify areas for improvement.

The number of completed quiz questions and average scores are computed, by lecture and by week, for both review (practice) and weekly (formal) modes. We also track student success separately for numerical questions (those for which the student calculates and enters a numerical answer), as these scores tend to be lower than average and can alert an instructor to a math comprehension issue.
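The bookkeeping described above can be sketched as a small aggregation pass over the answer log. This is a minimal illustration, not the tutor's actual code: the record fields (`lecture`, `mode`, `numerical`, `score`) are assumed names for the quantities the text says are tracked.

```python
from collections import defaultdict

# Hypothetical record format: each answered question logs its lecture,
# mode ("R" review or "W" weekly), whether it was numerical, and the score.
records = [
    {"lecture": 3, "mode": "R", "numerical": False, "score": 1.0},
    {"lecture": 3, "mode": "R", "numerical": True,  "score": 0.0},
    {"lecture": 3, "mode": "W", "numerical": False, "score": 1.0},
    {"lecture": 4, "mode": "R", "numerical": True,  "score": 0.5},
]

def progress_summary(records):
    """Count questions and average scores per (lecture, mode),
    tracking numerical questions separately."""
    totals = defaultdict(lambda: {"n": 0, "sum": 0.0, "n_num": 0, "sum_num": 0.0})
    for r in records:
        t = totals[(r["lecture"], r["mode"])]
        t["n"] += 1
        t["sum"] += r["score"]
        if r["numerical"]:
            t["n_num"] += 1
            t["sum_num"] += r["score"]
    return {
        key: {
            "count": t["n"],
            "avg": t["sum"] / t["n"],
            # A low numerical average relative to "avg" flags a possible
            # math comprehension issue.
            "avg_numerical": (t["sum_num"] / t["n_num"]) if t["n_num"] else None,
        }
        for key, t in totals.items()
    }

summary = progress_summary(records)
print(summary[(3, "R")])  # {'count': 2, 'avg': 0.5, 'avg_numerical': 0.0}
```

The same grouping keyed by week instead of lecture yields the by-week view.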

A first sample table (shown below) lists an extract of all completed quizzes; instructors can jump to individual questions (shown as green squares when answered correctly and red otherwise) or view the entire quiz, complete with all student answers. Quizzes are taken in either weekly (W) mode for the formal weekly quiz, or in review (R) mode to indicate homework or practice efforts, and cover a range of lectures. The amount of time spent on each quiz is also noted.

Quiz Scores, sorted by Time

Date and Time of Day        Mode   Range    Time (m)   Score   Quiz ID
Saturday  Dec. 14  8:36 am   W     1 - 26     10.4             1387035406
Saturday  Dec. 14  8:33 am   R     1 - 26      2.4             1387035219
Friday    Dec. 13 10:29 pm   R     1 - 26      8.9             1386998994
Friday    Dec. 13 10:25 pm   R     1 - 26      3.9             1386998712
Friday    Dec. 13  9:36 pm   R     1 - 26      4.7             1386995780
...

A second sample table lists an extract of all of the questions that a student (or a cohort) has missed to date, ranked by how often they were missed. This provides an efficient way to jump-start a discussion with a student about challenges, or to begin a group study session before an exam. Note that a "question" in this context represents a family of questions covering a single concept, so the student never sees the exact same question more than once. This ranking is also provided lecture by lecture, allowing instructors to focus on current or selected topics. When studying, students have the option of prioritizing the types of questions they have missed most often in the past, for example to revisit their most challenging topics before an exam.
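The ranking above can be sketched as a sort over per-question tallies. This is an illustrative reconstruction, not the tutor's code; the tally format (`attempts`, `misses` per question-family ID) is an assumption, and the IDs are taken from the sample table.

```python
# Hypothetical per-question tallies: attempts and misses for each
# question-family ID. The report ranks by number of misses, breaking
# ties with the overall success rate (lower rate = more troublesome).
tallies = {
    "2400010049": {"attempts": 9, "misses": 5},
    "2700010002": {"attempts": 5, "misses": 4},
    "2600020006": {"attempts": 6, "misses": 4},
}

def rank_missed(tallies):
    rows = []
    for qid, t in tallies.items():
        rate = (t["attempts"] - t["misses"]) / t["attempts"]
        rows.append((qid, t["misses"], rate))
    # Most misses first; among equal miss counts, lowest success rate first.
    rows.sort(key=lambda row: (-row[1], row[2]))
    return rows

for qid, nmiss, rate in rank_missed(tallies):
    print(f"{qid}  Nmiss={nmiss}  rate={rate:.3f}")
```

Restricting `tallies` to the questions of a single lecture gives the lecture-by-lecture ranking.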

Questions, sorted by Number of Misses and Success Rates

Date and Time of Day        Question      Nmiss   Rate    Time (m)
Wednesday Nov. 20 11:43 pm  2400010049      5     0.444     2.86
Tuesday   Dec. 10  9:59 pm  2700010002      4     0.200     1.63
Thursday  Dec. 12  9:43 am  2600020006      4     0.333     1.80
Monday    Oct. 21  6:53 pm  1700060021      3     0.000     1.43
Monday    Oct. 21  7:01 pm  1800020016      3     0.000     1.15
...

Three figures shown below track student progress over time and through topics.

The average score per lecture for a strong student is shown as green squares. The dark green line shows the average score per lecture achieved by previous students, surrounded by a light green region extending to ± one sigma. The background yellow histogram indicates the relative number of questions answered on each lecture topic.

Some students are strongly motivated by the comparison with a peer group. After initially struggling with a lecture topic, they will continue to test themselves and reinforce the material by successfully solving problems until their average score rises above the peer cohort line.

The average score per week for a strong student is shown as green squares. The background yellow histogram indicates the relative number of questions answered each week. The horizontal black line with a yellow highlight indicates the histogram level equivalent to solving 100 questions per week, the recommended number for success.

The two highest peaks in the yellow histogram represent studying for the midterm and final exams.

The distribution of questions for a strong student is shown as a function of time and lecture topic. Each square represents the questions answered within a certain lecture during a certain week. The size of the square scales with the number of questions it represents. The square color indicates that the average score for these questions was less than 50% (red), between 50% and 65% (orange), between 65% and 80% (yellow), and above 80% (green).

Note the two long column features that appear as the student reviews cumulative material before each exam. We also see that the student struggled with both lectures covered in week 13, but spent additional time on these topics the following week to master them.
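The cell sizes and color bands described above can be sketched as a simple binning step. This is an assumed reconstruction for illustration; only the color thresholds (50%, 65%, 80%) come from the figure description, while the record fields are hypothetical.

```python
from collections import defaultdict

def score_band(avg):
    """Map an average score (0-1) to the figure's color bands:
    red < 50%, orange 50-65%, yellow 65-80%, green >= 80%."""
    if avg < 0.50:
        return "red"
    if avg < 0.65:
        return "orange"
    if avg < 0.80:
        return "yellow"
    return "green"

def heatmap_cells(records):
    """Aggregate answered questions into (week, lecture) cells; each cell
    carries a question count (square size) and a color (average score).
    Assumes records like {"week": 13, "lecture": 20, "score": 0.7}."""
    cells = defaultdict(list)
    for r in records:
        cells[(r["week"], r["lecture"])].append(r["score"])
    return {
        key: {"n": len(scores), "color": score_band(sum(scores) / len(scores))}
        for key, scores in cells.items()
    }

cells = heatmap_cells([{"week": 13, "lecture": 20, "score": 0.4},
                       {"week": 14, "lecture": 20, "score": 0.9}])
print(cells)  # {(13, 20): {'n': 1, 'color': 'red'}, (14, 20): {'n': 1, 'color': 'green'}}
```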

Progress reports are particularly useful for students and instructors working in a distance learning environment, helping to bridge the gap created by remote, and often asynchronous, communication.

A wealth of information is logged within the online tutor, allowing detailed analyses to be conducted on a variety of factors. Results are stored in an open format and can be analyzed with any tool (using a spreadsheet or through a scripting language). One can study the amount of time spent solving problems and reading solution sets, count the number of questions answered per topic or per unit time, plot the amount of pre-quiz practice done versus scores on weekly quizzes, compare scores for self-review and exam questions by topic, and track progress over time on a topic for individuals or for a cohort. One can also "invert the analysis" and evaluate questions, comparing success rates for different questions (or sets) or for a question when answered at different times (on start, or after studying a particular topic for an hour).
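As a sketch of one such analysis, the snippet below pairs the amount of pre-quiz practice with weekly quiz scores, week by week. The CSV layout and column names are assumptions standing in for whatever open format the logs actually use.

```python
import csv
import io
import statistics

# Assume a simple CSV export of the logs, one row per answered question
# (column names are illustrative, not the tutor's actual schema).
log_csv = """week,mode,score
14,R,0.6
14,R,0.8
14,W,0.9
15,R,0.4
15,W,0.5
"""

practice = {}   # week -> number of review-mode (practice) questions
weekly = {}     # week -> scores on the formal weekly quiz
for row in csv.DictReader(io.StringIO(log_csv)):
    wk = int(row["week"])
    if row["mode"] == "R":
        practice[wk] = practice.get(wk, 0) + 1
    else:
        weekly.setdefault(wk, []).append(float(row["score"]))

# (practice count, mean weekly score) per week: the raw material for a
# practice-versus-performance plot.
pairs = [(practice.get(wk, 0), statistics.mean(s)) for wk, s in sorted(weekly.items())]
print(pairs)  # [(2, 0.9), (1, 0.5)]
```

The same pairing works for a whole cohort by adding a student ID column to the grouping key.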


This material is based upon work supported by the National Science Foundation (NSF) under Grant No. AST-0349155 and the National Aeronautics and Space Administration (NASA) under Grant No. NNX09AV36G. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the NSF or NASA.