My objective was to launch a full-scale pilot of InSite, a web-based program for electronic assessment, in the English department's undergraduate writing program. The department had run a small-scale pilot over the summer and fall of 2004 involving only about fifteen sections. We, the writing program faculty, wanted to conduct a larger pilot of about fifty sections to evaluate the value of electronic assessment in the future of the writing program, specifically in English 200 and, to a lesser degree, English 101.
The InSite online component is paired with the Harbrace Handbook for Writers. This pairing offers learning opportunities for students and additional teaching tools for instructors. Students can submit papers electronically, and teachers can choose from a number of options to advance students' understanding of their progress and their weaknesses. The program also has electronic tutoring capabilities, controlled by the classroom teacher, which we believed might be particularly helpful for our diverse student population at VCU. InSite allows teachers to mark student papers online, not only for content but also for grammar and mechanics. When students make consistent errors, the site provides tutorials targeted to those student-specific problems, along with ongoing quizzes and feedback for both the student and the teacher. This tool does not do the work of the teacher; it is an enhancement. With larger class sizes, tools such as this allow teachers to give more individually geared instruction. InSite also provides an originality-checking feature that allows teachers to check student work against electronic sources and a database of student work. This feature not only lets teachers check for plagiarism but can, at the teacher's discretion, be made visible on the student page so that students can see their own mistakes in using source material appropriately. The tool can also be used to review student progress from draft to draft.
All teachers in the writing program currently participate in portfolio assessment groups each semester. Every student enrolled in English 101 or English 200 has his or her portfolio assessed by a group of writing program faculty. This allows for more consistency across the program and helps maintain high standards. We had hoped that with InSite, teachers could exchange portfolios electronically, making the process less cumbersome and allowing multiple teachers to look at the same portfolio at the same time.
During the Spring 2005 semester we taught approximately forty sections of English 200 using the InSite program. Several instructors also volunteered to use InSite in English 101. In both courses, students submitted all written work via InSite. The InSite representative or I met with each participating class to assist students with sign-on and explain the basic functions of the program. Participating instructors met several times during the semester for instruction and problem-solving, and they took part, via survey, in an assessment at the end of the semester.
The project did meet many of the objectives we had hoped for, but it also taught us that electronic assessment and feedback on student work do not come without hazards. To begin with, electronic submission required additional work from instructors in getting students to participate. While most students participated willingly, nearly every class had stragglers who did not sign on to the program in a timely way, which created extra work for teachers in tracking which students were turning in work electronically and which were not. In addition, many teachers felt that electronic submission reduced physical interaction between teachers and students: teachers learned students' names less quickly, and teachers and students had less face-to-face contact. Teachers felt this probably affected the engagement of already at-risk students.
The capability to look at multiple drafts of student work simultaneously was the benefit teachers liked best, and we felt it gave us a better sense of a student's progress in the class. We were able to point more specifically to changes students made in their work and to assess some of their ongoing work more accurately. However, grading with the InSite online system was often very time-consuming, and most teachers reported that it took much longer to grade online than by hand. We also found that students often had difficulty downloading the classroom teacher's comments and therefore did not always avail themselves of them. We came to learn that we could make our own electronic comments more quickly, and often more accurately, using the Track Changes feature in Microsoft Word.
The originality-checking feature was another function of the program that teachers liked. It not only helped identify changes in student work between drafts, but also helped identify when students were having trouble understanding the fundamentals of documentation. It also allowed students to see their own mistakes, and many teachers cited this as a real aid to teaching.
Ultimately, most teachers found that the online tutorials for grammar and mechanics did not work well for their purposes and did not use them much. InSite also did not lend itself well to the portfolio process, although the InSite staff made every effort to help us with that.
While the writing program decided not to continue using InSite, we learned a good deal about electronic assessment. Most teachers agreed that it was helpful to have some electronic component in their writing classes. It aids ongoing analysis of student work and makes it easier to track student progress, because it enables teachers to keep, see, and compare multiple versions of assignments. We also agreed that an originality-checking component can help assess student understanding of citation and documentation, as well as find actual cases of plagiarism. Perhaps more importantly, our InSite pilot taught us the problems with electronic assessment. It can be a more time-consuming process, cutting into the time teachers have to spend with students. It can also take away from more personal interactions between faculty and students, which likely harms student engagement, especially for at-risk students.
Finally, I would like to thank the Center for Teaching Excellence for the opportunity to pursue this project. It has given us invaluable information about the role of electronic assessment in student writing and we would not have been able to fully participate in this pilot without support from the center.