
Contents:

  Introduction
  Background
  Methodology
  The scoring system
  Results
  Analysis of Findings
  Recommendations

Report on the Pilot Test: Use of the Writing Proficiency Exam to Assess Student Understanding and Appreciation of Diversity in the General Education Program

Version 2.3 7/24/01

This Pilot Test originated in 1999-2000, when the Ad Hoc GE Assessment Faculty Working Group recommended a pilot study to determine the feasibility of using the Writing Proficiency Exam (WPE) to assess students' understanding and appreciation of diversity. The intent of the study was to see whether an appropriate WPE topic could be developed and used, one that would first and foremost meet the WPE requirement for an accessible topic, and that would also allow assessment of students' ability to articulate their understanding and appreciation of diversity and/or their concerns and apprehensions about it.

The pilot began in Spring 2000 with the committee's development of an appropriate topic. The WPE Director used this question for the WPE in Summer 2000, administering the exam in the normal way. In fall quarter 2000, an ad hoc task force of faculty who either teach diversity courses or have assessment experience in GE developed a rating rubric based on the goals and objectives for diversity learning outcomes in the General Education program. In December 2000, the ad hoc task force, augmented by other faculty, used this rubric to assess a sample of WPE essays written the previous summer.

The analysis of the pilot study proceeded in winter 2001, with accompanying discussion of the results by faculty involved in the pilot. This report summarizes the background, methodology, and results of the pilot study and includes recommendations for subsequent use of this assessment measure.


Background.

The Ad Hoc GE Assessment Faculty Working Group worked for two years to develop an assessment plan proposal for the new GE program. A major goal of the group was to review and pilot a variety of instruments in order to determine the range of methods that would be appropriate for assessing student learning outcomes for General Education. After extensive consultation with faculty from across the University on the goals and objectives for the GE program, the group identified a series of "common threads" in the GE program, one of which was diversity. The Working Group then met with groups of faculty, including the WPE Faculty Director and the Faculty Director of the Writing Center, to discuss ways to assess the diversity criteria and outcomes stipulated in the University Senate policy.

The Working Group selected the WPE as a possible means to assess diversity for the following reasons. First, in developing the assessment plan, the Working Group strongly recommended that existing measures and instruments, when appropriate, be used to assess GE learning outcomes. Second, the Working Group felt that the structure of the WPE would lend itself to assessing attitudes and feelings about certain topics. Third, the Working Group felt that using the WPE as a possible method of assessment would provide an opportunity to assess learning outcomes of both native and transfer students at the upper division level.


Methodology.

The preparation for the pilot study of the use of the WPE to assess diversity in the GE program began in spring quarter 2000, supported by a University lottery grant. The Working Group convened a selected group of faculty who teach the GE diversity courses to meet with the WPE Faculty Director; this group became the task force that performed the assessment and analyzed the results. The task force began by discussing and exploring the pilot project, including developing the prompt statement (topic) for the written essay on diversity.

Initial discussions began with the development of a common meaning for diversity that reflects the University Senate Policy and the aspects of diversity appropriate for essay topics. The Goals and Objectives regarding diversity in the proposed GE Assessment Plan provided direction in the formulation of the prompt and the development of criteria for scoring the essays. (Please see attachment A, Goals and Objectives.) The topic had to allow students to respond whether or not they had taken a "diversity" course (as so identified in the GE program), or it would have advantaged those students who had taken such a course.

The prompt developed was as follows:

One of the objectives of a college education is to broaden our perspectives. In a well developed essay, describe a course you have taken at the college level, or an experience you have had on a college campus, that has caused you to rethink your beliefs or feelings about a person or group of people. Then explain the change in your attitude and how it has affected your life.

All students taking the WPE in Summer quarter 2000 wrote an essay using this prompt. The WPE coordinator oversaw the regular scoring of the WPE in Summer 2000 according to the normal scoring rubric and established procedures. At the conclusion of the scoring of the WPE for the assessment of writing competency, the task force drew a random sample of 399 essays for independent scoring as part of the Diversity Pilot Study.
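The report does not describe the mechanics of the draw beyond calling it random. A minimal sketch of such a draw, assuming the scored essays can be identified by exam ID (the pool size below is invented, not taken from the report), might look like this:

import random

# Hypothetical exam IDs for the essays scored in Summer 2000; the report
# does not give the size of the full pool, so 1,200 is an assumption.
all_exam_ids = list(range(1, 1201))

random.seed(2000)                                 # fixed seed for a reproducible draw
pilot_sample = random.sample(all_exam_ids, 399)   # the 399 essays used in the pilot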


The scoring system.

The scoring system developed was as follows:

==========================================================

0 An essay that does not discuss diversity in any way.

1 An essay that represents a weak understanding of most of the following concepts:

  • Social, economic, and political forces that shape diversity.
  • Tolerance and/or acceptance of others (including race, ethnicity, culture, gender, sexuality, nationality, and class).
  • An awareness of similarities and differences among groups and individuals.
  • An empathetic identification with others or an understanding of the other’s behavior.
  • An awareness of one's self, based on one's own background, in relation to others.
  • A translation of this knowledge into positive action.

2 The "2" essay is confused or inconsistent in its understanding and reflects an inadequate understanding of some of the concepts listed under #1 above.

3 The "3" essay is less thoughtful or less solid in content and development than the "4," but it demonstrates an adequate response. It demonstrates adequate understanding of some of the concepts listed under "1" above.

4 The "4" essay demonstrates strong understanding, through its use of relevant examples and details, of many of the concepts listed under "1" above.

==========================================================

Appendix A contains the scoring system in the form used during the assessment.

Training and scoring of the essays took place during fall quarter 2000. Virginia Hunter, Associate Dean of Undergraduate Studies and executive secretary to the task force, invited additional faculty who teach GE diversity courses to participate in the scoring of the essays using the diversity scoring rubric. The scoring procedure included a "norming" session preceding the actual scoring of the essays, similar to the regular scoring practice of the WPE. The grading procedure was holistic, similar to what is done with the WPE. Two graders independently rated each essay on the 0-4 scale, so combined scores could range from 0 through 8. The WPE Director adjudicated scores of 5, meaning that one reader gave the essay a 2 and the other reader gave it a 3.
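As a concrete illustration of this scoring scheme, the sketch below sums two raters' 0-4 ratings and maps the total onto the groups used in the results table later in this report. The function name is ours, and the grouping follows that table rather than any official scoring script.

def combine_and_classify(rating_a, rating_b):
    """Combine two holistic 0-4 ratings into the 0-8 score and label it.

    Groupings follow the results table: 0 and 1 are excluded (at least one
    rater felt the essay was not about diversity), 2-4 fail, a 5 goes to the
    WPE Director for adjudication, 6-7 are clear passes, and 8 is excellent.
    """
    total = rating_a + rating_b          # combined two-grader score, 0 through 8
    if total <= 1:
        group = "excluded: not about diversity"
    elif total <= 4:
        group = "fail"
    elif total == 5:
        group = "adjudicated by the WPE Director"
    elif total <= 7:
        group = "clear pass"
    else:
        group = "excellent"
    return total, group

print(combine_and_classify(3, 3))   # (6, 'clear pass')
print(combine_and_classify(2, 3))   # (5, 'adjudicated by the WPE Director')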


Results.

Analysis of the results of the ratings continued into Winter and Spring quarters 2001.

Analysis focused on the following aspects:

  • Demographic differences and similarities,
  • Whether the student had taken diversity courses,
  • The number of diversity courses taken and grades earned,
  • The relationship between a student’s WPE score and the student’s Diversity score,
  • The relationship between how a student did on the WPE and whether the student wrote about diversity.

Analysis of Findings.

  1. Basic results.

Scores ranged from 0 to 8, combining two raters' ratings on the 0-4 scale:

A       B        C         D          E
Score   Number   Percent   Percent    Group
0       167      42%       (Exclude)  Did not write about diversity
1       33       8%        (Exclude)  One rater 0, the other 1
2       8        2%        4%         Failed
3       7        2%        4%         Failed
4       24       6%        12%        Failed
5       3        1%        2%         One rater 2, the other 3 (adjudicated)
6       99       25%       50%        Clear pass
7       43       11%       22%        Clear pass
8       14       4%        8%         Excellent

Interpretation.

  • Column A is the combined, two-grader score, ranging from 0 to 8.
  • Column B is the number of exams that received each of the 9 scores.
  • Column C is the percent based on all 399 sampled WPE exams.
  • Column D is the percent based only on those who clearly wrote about a diversity-related topic, excluding those exams that received a "0" (both readers felt the exam was not about diversity) or a "1" (one reader felt the exam was not about diversity; the other reader felt the exam represented a "weak" understanding of diversity).
  • Column E is the interpretation of the score of 0 to 8.

Thus,

  • According to Column C, 42% of the students clearly did not write about a diversity-related topic. Therefore, 58% wrote about diversity in the view of at least one rater.
  • According to Column C, 40% clearly wrote about diversity and passed (scores of 6, 7, and 8). An additional 1% (or three exams) passed with one rater. 10% failed.
  • If the base is those who clearly wrote about diversity (Column D), 80% passed and 20% failed.

Here and elsewhere in this report, percentages that do not sum exactly to 100 differ because of rounding.
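The two percentage bases in Columns C and D can be recomputed directly from the Column B counts. The short sketch below shows the arithmetic; small differences from the table are possible because of rounding.

# Counts from Column B of the table above.
counts = {0: 167, 1: 33, 2: 8, 3: 7, 4: 24, 5: 3, 6: 99, 7: 43, 8: 14}

base_c = sum(counts.values())              # Column C base: all sampled exams
base_d = base_c - counts[0] - counts[1]    # Column D base: exams scoring 2 or higher

for score, n in sorted(counts.items()):
    pct_c = round(100 * n / base_c)
    pct_d = round(100 * n / base_d) if score >= 2 else None   # excluded scores have no Column D value
    print(score, n, pct_c, pct_d)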

 


2.  Demographics.
 
A. Sex (N=390)                       Females   Males
  Did not write about diversity       51%       50%
  Fail                                 8%       14%
  Pass                                41%       36%
  Total                              100%      100%

There is a small difference (about 5%) in the success rates of women and men.

The difference is not statistically significant using a one-way analysis of variance test (F = 0.14, p < 0.71).
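The report does not say what software produced the F statistics quoted here and in the tables that follow. A minimal sketch of a comparable one-way ANOVA, assuming Python with scipy.stats.f_oneway and invented scores, would be:

from scipy.stats import f_oneway

# Hypothetical combined diversity scores (0-8) grouped by sex, for
# illustration only; the actual pilot data are not reproduced here.
female_scores = [6, 0, 7, 4, 6, 0, 8, 3]
male_scores   = [6, 0, 2, 7, 0, 6, 4, 0]

f_stat, p_value = f_oneway(female_scores, male_scores)
print(f"F = {f_stat:.2f}, p = {p_value:.2f}")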

B. Citizenship (N=395)               Citizen   Non-Citizen
  Did not write about diversity       44%       64%
  Fail                                10         9
  Pass                                46        27
  Total                              100       100
  N                                  257       137

Citizenship is the closest proxy to a non-native speaker variable – citizens were much more likely to write about diversity and to do so successfully. Non-citizens have student visas or exchange visas, are immigrants with I-551 or I-688 forms, have no valid visa or are undocumented, or are refugees with political asylum. The sample contained 257 citizens and 137 non-citizens.

The difference between the two groups is highly statistically significant using a one-way analysis of variance test (F = 16.86, p < .0000).

C. Transfer status (N = 187)         Undergraduates          Graduate
                                      Natives    Transfers   Students
  Did not write about diversity       47%        49%         65%
  Fail                                 7         10          10
  Pass                                47         41          25
  Total                              100        100         100
  N                                   45        102          40

We used OASIS, the student information system, to collect information on whether students were "natives" (students who started their freshman year at CSLA) or "transfers" (those who transferred, almost always with two years or more of transfer credits). Graduate students were distinguished by the presence of a graduate GPA on the information file obtained from Analytical Studies.

There is a small difference between native students and transfers, although the Ns are small: native students are about equally likely to write about diversity and somewhat more likely to do so successfully. The differences are not statistically significant using a one-way analysis of variance test (F = 1.46, p < 0.24).

D. Age (N = 388)                     17-23   24-30   31-45   46-65
  Did not write about diversity       46%     49%     61%     59%
  Fail                                 8      12       8      18
  Pass                                46      40      32      23
  Total                              100     101     101     100
  N                                  138     154      76      22

Younger students were more likely to write about diversity and more likely to do so successfully; students over 30 were less likely to do both.

The differences are not statistically significant using a one-way analysis of variance test (F = 1.86, p < 0.14).

E. School/College (N = 388)          AL     BE     ED     ET     HHS    NSS
  Did not write about diversity       38%    62%    71%    33%    48%    47%
  Fail                                 5      5     12     29     11     16
  Pass                                57     33     18     38     41     37
  Total                              100    100    101    100    100    100
  N                                   61    108     17     21    107     70

There are major differences by College of major – Arts and Letters majors did very well on the diversity WPE; E&T, HHS, B&E and NSS majors were less successful but similar to each other. The few Charter College majors in the group did much less well; it is not clear whether this is a function of the sample size or something unique about these 17 students.

The differences are not statistically significant using a one-way analysis of variance test (F = 0.91, p < 0.48).

F. Major (N = 388)                   Multiple Subject   Other             Graduate
                                      Majors             Undergraduates    Students
  Did not write about diversity       57%                47%               65%
  Fail                                10                 10                10
  Pass                                33                 43                25
  Total                              100                100               100
  N                                   63                285                40

Multiple subject majors included those majoring in Child and Family Studies, Liberal Studies, and Urban Learning. There is an approximate 5% difference between the multiple subject majors and all others: multiple subject majors were a bit less successful in passing the diversity WPE and were somewhat more likely than others not to write about diversity.

The differences are not statistically significant using a one-way analysis of variance test (F = 2.72, p < 0.07).

G. GPA (N=348)                       <2.0   2.0-2.5   2.5-3.0   3.0-3.5   3.5-4.0
  Did not write about diversity       65%    50%       48%       46%       51%
  Fail                                10     16        11         8         0
  Pass                                25     34        41        46        49
  Total                              100    100       100       100       100
  N                                   20     82       118        87        41

This table is for undergraduates only. The relationship is weak: there is a somewhat greater tendency to write successfully about diversity as GPA increases, but the correlation between GPA and the diversity score is only 0.09.

The differences are not statistically significant using a one-way analysis of variance test (F = 1.00, p < 0.41).
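The 0.09 figure above reads as a simple correlation between undergraduate GPA and the combined diversity score. A minimal sketch, assuming a Pearson correlation computed with scipy.stats.pearsonr on invented pairs:

from scipy.stats import pearsonr

# Hypothetical (GPA, combined diversity score) pairs, for illustration only.
gpa    = [1.9, 2.3, 2.6, 2.8, 3.1, 3.4, 3.7, 3.9]
scores = [0,   6,   0,   7,   4,   6,   0,   8]

r, p = pearsonr(gpa, scores)
print(f"r = {r:.2f}, p = {p:.2f}")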

H. Race/Ethnicity - Self-Described Cultural Orientation (N=399)

                                      African-   Latino   Asian-American/    Caucasian      Other/
                                      American            Pacific Islander   Non-Hispanic   Refused
  Did not write about diversity       26%        47%      62%                44%            48%
  Fail                                22         10       10                  6              0
  Pass                                52         43       27                 50             52
  Total                              100        100       99                100            100
  N                                   23        173      132                 34             33

We ran a similar table for the WPE score:

                                      African-   Latino   Asian-American/    Caucasian      Other/
                                      American            Pacific Islander   Non-Hispanic   Refused
  Did not write about diversity       26%        47%      62%                44%            48%
  Fail                                22         20       60                 32             18
  Pass                                78         80       40                 68             82
  Total                              100        100      100                100            100
  N                                   23        173      132                 34             34

Student self-classifications, obtained as part of the University application process, are the source of the data used in this analysis. "Other/Refused" includes those who specified "Other," made no response, or refused to state any ethnicity. The Ns are small for all except Latino and Asian-American. There are substantial differences between those two groups on both tables.

A one-way analysis of variance indicates that the diversity score differences are statistically significant (F = 3.28, p < 0.01), as are the WPE score differences (F=18.76, p < .00005).

3.  Whether students had taken a diversity course.
  • 56% of the sample (224 students) had taken a diversity course.
  • Of those who did NOT write about diversity, 53% had taken a diversity course.
  • Of those who DID write about diversity, 60% had taken a diversity course.

4.  Average grade in the diversity course.

  • Did not write about diversity: 2.76
  • Wrote about diversity but failed: 2.38
  • Wrote about diversity and passed: 2.87

5.  If a student had taken a diversity course, did that student write about a diversity-related topic?

  • 53% of those who had taken a diversity course wrote about a diversity-related topic.
  • Of those who had not taken a diversity course, only 46% did.

6.  WPE score and writing about a diversity-related topic.

  • Of those who failed the WPE, 33% wrote about diversity
  • Of those who passed the WPE, 59% wrote about diversity

 7.  WPE score and diversity score.

  • Of those who failed the WPE, 18% wrote about diversity and passed.
  • Of those who passed the WPE, 50% wrote about diversity and passed.
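Figures like those in items 6 and 7 are cross-tabulations of the WPE outcome against the diversity outcome. A minimal sketch of how such row percentages can be produced, assuming pandas and invented per-student records:

import pandas as pd

# Hypothetical per-student records, for illustration only.
df = pd.DataFrame({
    "wpe_result":      ["pass", "pass", "fail", "pass", "fail", "pass"],
    "diversity_group": ["pass", "did not write", "did not write",
                        "pass", "fail", "fail"],
})

# Row percentages: within each WPE outcome, the share in each diversity group.
table = pd.crosstab(df["wpe_result"], df["diversity_group"], normalize="index") * 100
print(table.round(0))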

Recommendations.

  1. The University should assess diversity using the WPE every second year. The pilot test shows that the WPE is a viable measure for assessing the diversity learning outcomes of the GE program, and this frequency would not hinder the administration of the WPE while keeping the University community up to date on changing trends in how students think about diversity-related topics.

  2. The WPE Director, the Writing Center Director, the GE Assessment Coordinator, and selected faculty teaching diversity courses should develop a pool of several prompts for assessing diversity.

  3. The grading of the WPE for diversity should be consistent with the grading practices of the WPE in general, employing holistic scoring and a GE diversity scoring guide similar to the WPE grading guide.

  4. As the General Education Subcommittee reviews the General Education policy, results from these assessments, as well as developing research, may indicate a need for reshaping or redefining the diversity requirement.