Online Journal of Distance Learning Administration
Distance writing programs still struggle with assessment strategies that can evaluate student writing as well as students' ability to communicate about that writing with peers at a distance. This article uses Kim, Smith, and Maeng's 2008 distance education program assessment scheme to evaluate a single distance writing program at Old Dominion University. The program's specific assessment needs include the ability to determine how well students are developing expert insider prose and working together as a virtual community. Kim, Smith, and Maeng's assessment scheme was applied to six courses within the writing program, revealing programmatic weaknesses in providing varied methods of embedded assessment and in encouraging collaboration in writing. Findings further showed that few courses were using summative assessment in the form of exams and quizzes, and that several lacked instruments of team or self-assessment. The assessment scheme identified other weaknesses across courses, including a lack of opportunities for these distance students to engage in peer discussion or evaluation and a need for greater variety in assessment instruments. Applying this assessment method to courses within the single program revealed that electronic graduation portfolios best meet this program's unique interdisciplinary assessment needs.
Original Publication Citation
Tucker, V. M. (2012). Listening for the squeaky wheel: Designing distance writing program assessment. Online Journal of Distance Learning Administration, 15(4), 1-13.
Tucker, Virginia M., "Listening for the Squeaky Wheel: Designing Distance Writing Program Assessment" (2012). English Faculty Publications. 36.