Assessing General Education Using a Standardized Test: Challenges and Successful Solutions
Jan A. Geesaman, Peter T. Klassen, Russell Watson*
* Published in Assessment Update, November-December 2000, Volume 12, Number 6.
College of DuPage (34,000 students) in Glen Ellyn, Illinois, has developed a general education assessment design that can be used easily at two-year or four-year institutions of any size. This general education assessment model grew from the college's initial classroom assessment initiative, and now its second full year of implementation is complete. This article will review the assessment design, use of results to close the feedback loop, ways faculty cooperation was encouraged, and means of motivating students to give their best efforts during the assessment activities.
A decade ago, two-year colleges were known for their "enrollment churn" and the attendant problems of attempting to use a pretest/posttest model for assessment of student learning. Now that same enrollment churn occurs at many four-year institutions. Because of this type of enrollment pattern at College of DuPage (COD), true pre-testing and post-testing were unrealistic, and therefore a pseudo-pretest/posttest model was designed. This strategy involves administering the six American College Test/Collegiate Assessment of Academic Proficiency (ACT/CAAP) area tests to sections of introductory courses during the fall quarter, and to sections of advanced courses with primarily graduating or transferring students in the spring quarter. A stratified proportionate random sample of class sections is obtained and matched for distribution on day/night and on/off campus variables. In each class section used in the assessment process, each of the six CAAP area tests is administered to a separate group of students. That is, in a given section of 30 students, 5 students each take the Math, Science Reasoning, Writing Skills, Reading, Critical Thinking, and Essay Writing area tests. This method provides for the most representative assessment of general education skills in the student population.
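The within-section splitting described above can be sketched in code. This is an illustrative reconstruction, not the college's actual procedure; the function name and data shapes are assumptions made for the example. It deals each sampled section's roster round-robin across the six CAAP area tests, so a 30-student section yields five takers per test.

```python
import random

def assign_caap_tests(sections, seed=0):
    """Sketch: within each randomly sampled class section, split the roster
    evenly across the six ACT/CAAP area tests so that every test is taken
    by a subset of students in every section."""
    tests = ["Math", "Science Reasoning", "Writing Skills",
             "Reading", "Critical Thinking", "Essay Writing"]
    rng = random.Random(seed)
    assignments = {}
    for section_id, roster in sections.items():
        students = list(roster)
        rng.shuffle(students)  # randomize which students get which test
        # Deal students round-robin: with 30 students, each of the six
        # tests is assigned to exactly 5 of them.
        assignments[section_id] = {
            student: tests[i % len(tests)]
            for i, student in enumerate(students)
        }
    return assignments

# Example: one 30-student section.
plan = assign_caap_tests({"SOC-101": [f"student{i}" for i in range(30)]})
```

Because the split happens inside every sampled section rather than assigning whole sections to a single test, each area test sees students drawn from the full spread of sampled courses.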
A series of additional statistical controls helps maintain a robust examination of those students taking their college sequence at our institution. One control for the "freshman drop-out effect" involves removing scores from freshman students not registered at the college during the following spring quarter. Another control involves removing scores from those students who were tested during spring quarter but who had taken college credit at other institutions. With these and other controls for the "perpetual student," we were able to obtain a pool of entering freshmen who had completed either one or two years of classes at the college, and a pool of completing sophomores who had taken classes only at COD. In addition, there was a middle pool of midyear freshmen and sophomores who were approximately halfway through their COD experience. A more detailed discussion of the controls, results, and methodology is available in the 2001 Assessment Report.
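The two cohort controls just described amount to a pair of filters over the tested students' enrollment records. The sketch below is hypothetical: the field names (`level`, `enrolled_next_spring`, `external_credit`) are illustrative stand-ins for whatever the college's student records actually contain.

```python
def build_assessment_pools(records):
    """Hypothetical sketch of the cohort controls described above.
    Each record is a dict with assumed fields: "id", "level"
    ("freshman" | "midyear" | "sophomore"), "enrolled_next_spring",
    and "external_credit"."""
    pools = {"entering": [], "midyear": [], "completing": []}
    for r in records:
        # "Freshman drop-out effect" control: remove freshmen not
        # registered at the college during the following spring quarter.
        if r["level"] == "freshman" and not r["enrolled_next_spring"]:
            continue
        # "Perpetual student" / transfer control: remove students who
        # had taken college credit at other institutions.
        if r["external_credit"]:
            continue
        pools[{"freshman": "entering",
               "midyear": "midyear",
               "sophomore": "completing"}[r["level"]]].append(r["id"])
    return pools

# Tiny illustration: one freshman drop-out and one transfer are screened out.
pools = build_assessment_pools([
    {"id": 1, "level": "freshman",  "enrolled_next_spring": False, "external_credit": False},
    {"id": 2, "level": "freshman",  "enrolled_next_spring": True,  "external_credit": False},
    {"id": 3, "level": "sophomore", "enrolled_next_spring": True,  "external_credit": True},
    {"id": 4, "level": "sophomore", "enrolled_next_spring": True,  "external_credit": False},
    {"id": 5, "level": "midyear",   "enrolled_next_spring": True,  "external_credit": False},
])
```

The surviving records form the three pools the article compares: entering freshmen, midyear students, and completing sophomores whose coursework was taken only at COD.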
Use of Results
Typically, whenever assessment results are posted at an institution, an often-heard comment from many faculty is "What are we supposed to do with this?" For this initiative, we wanted to ensure that communication was a two-way process. Therefore, with each report of assessment results, a tear-off response sheet is included that may either be submitted on paper or filed electronically through the college intranet. These response sheets ask multiple-choice, short-answer, and open-ended questions about what faculty members think should be done with the results of the assessment data and what they are going to do individually. Finally, written faculty responses are published (anonymously) in a follow-up document. This process allows not only maximum input but also maximum distribution of responses. While initial response rates were low, we have experienced a gradual increase in participation by faculty. Also, after initial responses were published, faculty dialogue about our efforts and results increased. Questions asked and faculty responses may be found on the Website.
In fall 1999, results of the first phase of our ACT/CAAP assessment battery revealed that College of DuPage students scored, on average, lower than ACT two-year college norms in college-level reading skills. As a result, faculty in each division incorporated a reading-related goal in their area objectives. In fall 2000, the latest ACT/CAAP reading scores showed a slight (though not statistically significant) increase in reading skills among sophomore students. This information will be shared at our fall faculty meeting with the message that while there was some increase in reading scores, continued emphasis needs to be placed on improving college-level reading skills across our general education curriculum.
Faculty opposition to giving up in-class time for any non-instructional purpose is universal, and assessment is no exception. However, a request to donate 50 minutes of class time for the purpose of achieving a greater understanding of students' learning in areas of general education skills usually elicits a higher level of faculty cooperation. College of DuPage has about 300 full-time and 1,200 part-time faculty. The selection of classes described earlier involves 30-35 sections each in the fall and spring. Since a list of randomly selected class sections is used, only about 30 faculty from a pool of nearly 1,500 are solicited for each round of testing. Each faculty member thus has only about a 2 percent chance of being selected during each testing round. Given these circumstances and the fact that this assessment process is a faculty-driven initiative, our experience has been one of substantial cooperation. In fact, out of four rounds of ACT/CAAP testing to date involving about 2,400 students and about 120 faculty, only one faculty member has refused to participate.
Another major challenge in using standardized testing as an assessment technique involves motivating students to give their best effort. Unsuccessful incentive systems have included providing discount coupons at the college bookstore, coupons for free pizza, tuition discount coupons, free meals, T-shirts, certificates of participation, and other inducements. The following method used at College of DuPage seems to have yielded appropriate and sincere participation by students. After the class sections are randomly selected, the instructor reads a letter to the class indicating that sometime during the next two weeks that class section has been asked to participate in the CAAP testing. In addition, the students are given a tri-fold brochure describing the CAAP tests, why they are important to the college, and why they should be important to students as individuals. This brochure asks the students to do their best on the test and tells them that they will receive their own results, which will give them a sense of their general education skills in a way that does not affect their classroom grades or grade point average. In addition, the students are informed that if their score exceeds the national mean, ACT will issue a certificate attesting to their performance. Finally, instructors of each class section selected to participate are asked to remain in the room during the testing period. It seems that the presence of their instructor is an additional incentive for students to perform well on the test. There are always students who fail to take assessment testing seriously; however, careful monitoring of completed CAAP answer sheets has revealed that those students account for only about 1 percent of participants at College of DuPage.
While few testing systems are ideal for all cases, the assessment committee has been very pleased with the design, the faculty and student participation, and the usability (and use) of the results of the assessment model at College of DuPage. Although our experience is still somewhat limited, this is a general education assessment model that appears to have overcome several of the major challenges associated with the use of standardized testing for assessment of general education outcomes and that could be adapted to any two- or four-year institution.
Jan A. Geesaman is associate dean of communications and a member of the assessment committee at College of DuPage. Peter T. Klassen is professor emeritus of sociology and interdisciplinary studies at College of DuPage and an assessment consultant in private practice. Russell Watson is professor of psychology and faculty chair of the assessment committee at College of DuPage.