Report on the Pilot Test: Use of the Writing Proficiency Exam to Assess Student Understanding and Appreciation of Diversity in the General Education Program

Version 2.3
7/24/01

This pilot test originated in 1999-2000, when the Ad Hoc GE Assessment Faculty Working Group recommended a pilot study to determine the feasibility of using the Writing Proficiency Exam (WPE) as an appropriate measure to assess students' understanding and appreciation of diversity. The intent of the study was to see whether an appropriate WPE topic could be developed and used, one that would first and foremost meet the WPE requirements for an accessible topic, while also allowing assessment of students' ability to articulate their understanding and appreciation of diversity and/or their concerns and apprehensions about diversity.

The pilot began in Spring 2000 with the committee's development of an appropriate topic. The WPE Director used this question for the WPE in Summer 2000, administering the exam in the normal way. In Fall quarter 2000, an ad hoc task force of faculty who either teach diversity courses or have assessment experience in GE developed a rating rubric for assessment based on the goals and objectives of learning outcomes for diversity in the General Education program. In December 2000, the ad hoc task force, augmented by other faculty, used this rubric to assess a sample of WPE essays written the previous summer. The analysis of the pilot study proceeded in Winter 2001, with accompanying discussion of the results by faculty involved in the pilot. This report summarizes the background, methodology, and results of the pilot study and includes recommendations for subsequent use of this assessment measure.
The Ad Hoc GE Assessment Faculty Working Group worked for two years to develop an assessment plan proposal for the new GE program. A major goal of the group was to review and pilot a variety of instruments in order to determine the range of methods that would be appropriate for assessing student learning outcomes for General Education. After extensive consultation with faculty from across the University on the goals and objectives for the GE program, the group identified a series of "common threads" in the GE program, one of which was diversity. The Working Group then met with groups of faculty, including the WPE Faculty Director and the Faculty Director of the Writing Center, to discuss ways to assess the diversity criteria and outcomes stipulated in the University Senate policy. The Working Group selected the WPE as a possible means to assess diversity for the following reasons. First, in developing the assessment plan, the Working Group strongly recommended that existing measures and instruments, when appropriate, be used to assess GE learning outcomes. Second, the Working Group felt that the structure of the WPE would lend itself to assessing attitudes and feelings about certain topics. Third, the Working Group felt that using the WPE as a possible method of assessment would provide an opportunity to assess learning outcomes of both native and transfer students at the upper division level.
The preparation for the pilot study of the use of the WPE to assess diversity in the GE program began in Spring quarter 2000, supported by a University lottery grant. The Working Group convened a selected group of faculty who teach the GE diversity courses to meet with the WPE Faculty Director; this group became the task force that performed the assessment and analyzed the results. The task force began by discussing and exploring the pilot project, including developing the prompt statement (topic) for the written essay on diversity. Initial discussions began with the development of a common meaning for diversity that reflects the University Senate Policy, and of the aspects of diversity appropriate for essay topics. The Goals and Objectives regarding diversity in the proposed GE Assessment Plan provided direction in the formulation of the prompt and the development of criteria for scoring the essays. (Please see Attachment A, Goals and Objectives.) The topic had to allow students to respond whether or not they had taken a "diversity" course (as so identified in the GE program); otherwise it would have advantaged those students who had taken such a course.

The prompt developed was as follows:

    One of the objectives of a college education is to broaden our perspectives. In a well-developed essay, describe a course you have taken at the college level, or an experience you have had on a college campus, that has caused you to rethink your beliefs or feelings about a person or group of people. Then explain the change in your attitude and how it has affected your life.

All students taking the WPE in Summer quarter 2000 wrote an essay using this prompt. The WPE Coordinator oversaw the regular scoring of the WPE in Summer 2000 according to the normal scoring rubric and established procedures. At the conclusion of the scoring of the WPE for the assessment of writing competency, the task force drew a random sample of 399 essays for independent scoring as part of the Diversity Pilot Study.
The scoring system developed was as follows:

0   An essay that does not discuss diversity in any way.
1   An essay that represents a weak understanding of most of the following concepts:
2   The "2" essay is confused or inconsistent in its understanding and reflects an inadequate understanding of some of the concepts listed under "1" above.
3   The "3" essay is less thoughtful or less solid in content and development than the "4," but it demonstrates an adequate response. It demonstrates adequate understanding of some of the concepts listed under "1" above.
4   The "4" essay demonstrates strong understanding, through its use of relevant examples and details, of many of the concepts listed under "1" above.

Appendix A contains the scoring system in the form used during the assessment.

Training and scoring of the essays took place during Fall quarter 2000. Virginia Hunter, Associate Dean of Undergraduate Studies and executive secretary to the task force, invited additional faculty who teach GE diversity courses to participate in the scoring of the essays using the diversity scoring rubric. The scoring procedure included a "norming" session prior to the actual scoring of the essays, similar to the regular scoring practice of the WPE. The grading procedure was holistic, similar to what is done with the WPE. Two readers rated each essay, so combined scores could range from 0 through 8. The WPE Director adjudicated scores of 5, that is, cases in which one reader gave the essay a 2 and the other reader gave it a 3.
Analysis of the ratings continued into Winter and Spring quarters 2001, focusing on the aspects discussed below.
Scores ranged from 0 to 8, reflecting two raters each using the 0-to-4 scale described above.
Here and elsewhere in this report, percentages may not sum to 100 because of rounding.
There is a small difference (about 5%) in the success rates of women and men. The difference is not statistically significant using a one-way analysis of variance test (F = 0.14, p = 0.71).
The difference between the two groups is highly statistically significant using a one-way analysis of variance test (F = 16.86, p < .0001).
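The group comparisons in this section all rely on one-way analysis of variance. The sketch below shows how such a test might be computed; the scipy call is a standard library function, but the score arrays are made-up placeholders rather than the pilot data.

    # A minimal sketch of a one-way ANOVA like those reported in this section.
    # The scores below are hypothetical placeholders, not the pilot-study data.
    from scipy.stats import f_oneway

    # Hypothetical combined diversity scores (0-8) for two groups of examinees.
    group_a_scores = [4, 5, 6, 3, 7, 5, 4]
    group_b_scores = [2, 3, 4, 3, 5, 2, 3]

    f_stat, p_value = f_oneway(group_a_scores, group_b_scores)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")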
We used OASIS, the student information system, to collect information on whether students were "natives" (students who started their freshman year at CSULA) or "transfers" (those who transferred, almost always with two years or more of transfer credits). Graduate students were distinguished by the presence of a graduate GPA on the information file obtained from Analytical Studies. There is a small difference between native students and transfers, although the Ns are small: native students are equally likely to write about diversity and somewhat more likely to do so successfully, although the differences were not statistically significant using a one-way analysis of variance test (F = 1.46, p = 0.24).
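A sketch of how such a classification might be derived from a student information file is given below. The column names and admit-basis values are hypothetical, since the actual OASIS and Analytical Studies file layouts are not described in this report.

    # A minimal sketch of the native / transfer / graduate classification
    # described above. Column names and values are hypothetical illustrations.
    import pandas as pd

    students = pd.DataFrame({
        "student_id": [101, 102, 103],
        "admit_basis": ["first-time freshman", "transfer", "transfer"],  # hypothetical field
        "graduate_gpa": [None, None, 3.6],  # present only for graduate students
    })

    def classify(row: pd.Series) -> str:
        if pd.notna(row["graduate_gpa"]):
            return "graduate"
        return "native" if row["admit_basis"] == "first-time freshman" else "transfer"

    students["status"] = students.apply(classify, axis=1)
    print(students[["student_id", "status"]])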
Younger students were more likely to write about diversity and more likely to do so successfully; students over 30 were less likely to do both. The differences are not statistically significant using a one-way analysis of variance test (F = 1.86, p = 0.14).
There are sizable observed differences by College of major: Arts and Letters majors did very well on the diversity WPE; E&T, HHS, B&E, and NSS majors were less successful but similar to each other. The few Charter College majors in the group did much less well; it is not clear whether this is a function of the sample size or something unique about these 17 students. The differences are not statistically significant using a one-way analysis of variance test (F = 0.91, p = 0.48).
Multiple subject majors included those majoring in Child and Family Studies, Liberal Studies, and Urban Learning. There is an approximately 5% difference between the multiple subject majors and all others: multiple subject majors were a bit less successful in passing the diversity WPE and were somewhat more likely than others not to write about diversity at all. The differences are not statistically significant using a one-way analysis of variance test (F = 2.72, p = 0.07).
This analysis covers undergraduates only. The relationship is weak; there is a somewhat greater tendency to write successfully about diversity as GPA increases, but the correlation between GPA and the diversity score is only 0.09. The differences are not statistically significant using a one-way analysis of variance test (F = 1.00, p = 0.41).
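The correlation reported above could be computed as in the sketch below, assuming a Pearson correlation is the statistic intended; the GPA and score values are placeholders, not the pilot data.

    # A minimal sketch of the GPA-versus-diversity-score correlation noted above.
    # The values are hypothetical placeholders, not the pilot-study data.
    from scipy.stats import pearsonr

    gpas = [2.4, 2.9, 3.1, 3.3, 3.6, 2.7, 3.8]
    diversity_scores = [3, 5, 4, 6, 6, 4, 7]  # combined 0-8 scores

    r, p = pearsonr(gpas, diversity_scores)
    print(f"r = {r:.2f}, p = {p:.3f}")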
We ran a similar table for the WPE score:
Student self-classifications, obtained as part of the University application process, are the source of the data used in this analysis. "Other/Refused" includes those who specified "Other," made no response, or refused to state any ethnicity. The Ns are small for all except Latino and Asian-American. There are substantial differences between those two groups on both tables. A one-way analysis of variance indicates that the diversity score differences are statistically significant (F = 3.28, p < 0.01), as are the WPE score differences (F = 18.76, p < .00005).