GE Assessment
Contents:

  • Introduction
  • Design
  • English Department Concerns
  • Implementation
  • Results
  • English as a Second Language
  • Number of English Courses
  • Grade point average
  • Multiple regression model
  • Conclusion
 

 

The Collegiate Assessment of Academic Proficiency (CAAP) Test – Draft – 5/6/00

The Faculty Working Group decided in Fall 1999 to ascertain the appropriateness of the Collegiate Assessment of Academic Proficiency (CAAP) test for the assessment of GE writing skills. The test, an ACT product, consists of several parts:

  1. Reading – reading comprehension in fiction, the humanities, the social sciences, and the natural sciences. This section consists of passages with multiple choice questions on them. Questions involve deriving meaning, manipulating information, making comparisons and generalizations, and drawing conclusions.
  2. Writing Skills – several prose passages plus sequences of multiple choice questions measuring students’ understanding of the conventions of standard written English, plus strategy, organization, and style. Passages are pegged to the second-semester sophomore level. The test yields a total score and two subscores (usage/mechanics and rhetorical skills).
  3. Writing Essay test – the test can be graded by ACT, but it is more common for these tests to be graded locally on the particular campus. For CSLA, this test would essentially duplicate the Writing Proficiency Exam.
  4. Mathematics – the test emphasizes the solution of problems using math skills generally taught in the first two years of college. The variety of skills tested only partially duplicates what is done in precollegiate math on this campus, plus Math 100/102. The working group ruled this test out because it duplicated so little of the material taught on campus for any given student.
  5. Science Reasoning – assumes no factual information but is based on principles generally taught in science courses. The test contains passages plus multiple choice questions. Information is conveyed as data representations (graphs, tables, other schemes), research summaries (descriptions of several related experiments), or conflicting viewpoints. The science faculty consulted by the working group members felt this test was not appropriate for our GE program.
  6. Critical Thinking – the ability to clarify, analyze, evaluate, and extend arguments. Includes case studies, debates, dialogues, overlapping positions, statistical arguments, experimental results, and editorials. Since this test would duplicate Prof. Garry’s extensive analyses currently underway, we did not use this section.

Since the working group at the time was focusing on the basic skills part of the GE program, we decided to try the CAAP Writing Skills test, both to determine whether the exam might be appropriate for assessing our GE program and to develop a design for using all or part of the CAAP to assess GE. The Writing Skills test is multiple choice, with 70 questions to be answered in 40 minutes, and is aimed at college sophomores and juniors, making it one of the few exams targeted at mid-level college students.


Design.
Several design possibilities were evident from the CAAP literature.

    1. Cross-sectional – test beginning first-year students at the start of the year and test sophomores who are finishing GE at the end of the same year. Match the two groups at least on GPA (or control for GPA in the analysis) to see whether the scores of sophomores who finish GE are greater than those of first-year students who have not had any GE. We began with a design in which we would give the CAAP Writing Skills test to two groups: first-year students enrolled in English 101 or 102, and first-quarter transfer students in their junior year who were enrolled in the Introduction to Higher Education course for junior-year transfers.
    2. Longitudinal – give the test to incoming first-year students and to the same group when they finish the program, inferring change from the difference in scores. Testing effects could be a problem if students remembered the test, but the alternate forms of the test, combined with the one to two years between pretest and posttest, should mitigate this effect. This option did not seem feasible at Cal State LA because of the absence of an intact group or a testing location, that is, a group of students who could be traced and would be taking the same course at the same time a year or more apart, or a location where students could be tested a year apart.
    3. Longitudinal – give the test to sophomores who have finished the program and compare their scores with scores from any other ACT test. However, at CSLA, the number of students who have scores from other ACT tests on file is small.

Finding a good design is difficult with a General Education program that has as many options and choices as CSLA’s does. As a result of the alternate courses and the lack of a prescribed sequence, it is impossible to test the same students other than at the beginning and the end of the same quarter-length class, making it difficult to study longer-term growth in each individual student. At the same time, the CAAP literature makes it clear that asking students to take a voluntary examination at a time of their own choosing does not yield a valid sample of the abilities and achievements of all students.


English Department Concerns:
Members of the English Department, especially the coordinator of English 101 and 102, expressed concerns about the appropriateness of the examination, in particular:

  • The norming group included few students from the West, and almost no Latino or Asian-American students.
  • The test questions dealt with standard written English, with multiple choice questions on strategy, organization, and style in writing, but at a sophisticated level, higher than the level at which the majority of CSLA students operate.

As a result of the concerns of the English Department faculty, the design was modified to a cross-section of first-quarter transfer students in the Introduction to Higher Education sections for transfer students. The goal was to test 100 students. Because of student absences and tardiness (the latter, in particular, was not planned for) on the day in November 1999 that the test was administered, a total of 86 students took the examination for the full 40 minutes.


Implementation.
Implementation was more complex than first envisioned.

  • Cost – a single CAAP test costs $10 per student; if more than one CAAP test is used, the cost is $15.75 per student for 2-5 tests.
  • Standardizing the instructions – the instructions are sufficiently complex that it was difficult to implement standard instructions in all classrooms and would be particularly difficult if the test were administered by the instructors in each of many classes.
  • Student tardiness – unlike the WPE, the test is not one that students have an incentive to arrive on time for, so students came late to class, arriving while the exam was in progress.
  • Student incentives – while the faculty working group appealed to all eight classes that took the exam to take it seriously, as the endeavor was important for the University, maintaining student incentives to do well on the exam would be very difficult if it were administered year after year. According to the CAAP literature, no clear solution to the student incentive problem has been found.
  • One student did not return the exam; after several phone calls and letters, the exam was returned the next day.

While implementing the CAAP on a "one time" or "special occasion" basis is feasible, it would be difficult to implement the exam on a continuing basis with the current structure of the CSLA GE program.

Eighty-six students took the exam in six classes, all Introduction to Higher Education classes for junior-year transfers to Cal State LA. Almost all of the students were in their first quarter on campus.


Results:

Average Scores. The national average for the Writing Skills test is 64.3 points on a scale from 40 to 80; the CSLA average was 57.5, about 1.4 national standard deviations lower. The standard deviation for the CSLA group was 4.5; the national standard deviation is 4.7. The average student scored at approximately the 11th percentile compared with four-year public college sophomores, or the 21st percentile compared with two-year public (community) college sophomores.
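As a rough check on these summary figures, the gap can be expressed as a z-score. The sketch below assumes a normal approximation; ACT's published percentiles come from its own norm tables for specific comparison groups, so exact agreement with the percentile figures above is not expected.

```python
# Rough normal-approximation check of the reported averages (a sketch only).
from statistics import NormalDist

national_mean = 64.3   # national Writing Skills average (from the report)
national_sd = 4.7      # national standard deviation (from the report)
csla_mean = 57.5       # CSLA average (from the report)

z = (csla_mean - national_mean) / national_sd
pct = NormalDist().cdf(z) * 100
print(f"z = {z:.2f}")                           # about -1.45 SDs below the national mean
print(f"normal-approx percentile = {pct:.0f}")  # roughly the 7th percentile
```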

Since the scores were low, we decided to collect more data to determine whether the results reflected a lack of English courses, whether the scores correlated with English course grades, community college grades, or CSLA grades, and the like. The data collected (from CSLA’s Student Information System) included:

  • CSLA GPA
  • CSLA major
  • Community college attended
  • Community college GPA
  • Number of community college units taken
  • Number of community college English courses; grades in those courses
  • Number of CSLA English courses taken; grades, if any, in those courses

English as a Second Language.

Most students (51, or 61%) did not speak English as their first language. The average CAAP Writing Skills test scores:

  • When English was their first language: 58.6 (N = 33)
  • When English was not their first language: 57.1 (N = 51)

 
The difference between the two groups was not statistically significant (p < .12).
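For illustration, this comparison can be run as a two-sample t-test from the reported summary statistics. The sketch below assumes the overall CSLA standard deviation of 4.5 for both groups, since the per-group standard deviations were not reported.

```python
# Two-sample t-test from summary statistics (a sketch; the per-group
# standard deviations are assumed equal to the overall CSLA SD of 4.5).
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(
    mean1=58.6, std1=4.5, nobs1=33,   # English as the first language
    mean2=57.1, std2=4.5, nobs2=51,   # English not the first language
)
print(f"t = {t:.2f}, p = {p:.3f}")    # p in the vicinity of the reported .12
```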

 


Number of English Courses.

The average student had taken 3.3 English courses. The median was three. Eight students, or 9%, had taken 7-8 English courses. When students had taken more than two or three English courses, most of the additional courses were in vocabulary building, reading, and other pre-collegiate courses.

Consequently, the more English courses students had taken, the lower the CAAP Writing Skills test score:

  • 0, 1, or 2 English courses: 58.8 (N = 33)
  • 3 or 4 English courses: 57.6 (N = 33)
  • 5 English courses: 56.6 (N = 8)
  • 6 English courses: 55.0 (N = 4)
  • 7+ English courses: 53.6 (N = 8)

The difference among the groups above was statistically significant (p < .028), but when the data in the first two rows were not grouped as above, the differences were not significant. In other words, the significance is due in part to the grouping chosen.
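For reference, the grouped comparison corresponds to a one-way ANOVA across the five groups. The sketch below generates hypothetical placeholder scores from the reported group means and Ns, since the raw student-level data are not reproduced here.

```python
# One-way ANOVA across the five course-count groups (illustrative only:
# the individual scores are simulated from the reported means and Ns).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
groups = {                              # label: (reported mean, N)
    "0-2": (58.8, 33), "3-4": (57.6, 33),
    "5":   (56.6, 8),  "6":   (55.0, 4), "7+": (53.6, 8),
}
samples = [rng.normal(mean, 4.5, n) for mean, n in groups.values()]
F, p = f_oneway(*samples)               # tests equality of the group means
print(f"F = {F:.2f}, p = {p:.3f}")
```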


Grade point average.
The correlations of CAAP Writing Skills test scores with the different GPAs are as follows:

  • Overall GPA in the community college: 0.43
  • GPA at CSLA (caveat: most students had completed only one quarter at CSLA): 0.32
  • GPA in English courses: 0.25
  • Number of units taken at the community college: 0.05

 
The following scatterplots tell part of the story:

Notice that the line is steeper – and the relationship stronger – between the overall community college GPA (the top graph) and the CAAP Writing Skills test score than between the GPA in English courses and the CAAP Writing Skills test score (the bottom graph):

 

[Figure: scatterplot of CAAP Writing Skills test score vs. overall community college GPA]

[Figure: scatterplot of CAAP Writing Skills test score vs. GPA in English courses]

Interpretation: we infer from the stronger correlation with the overall community college GPA that the test is tapping a general reading/writing ability, or "readiness for upper division work," rather than strictly assessing "writing" per se as it is taught in California’s community colleges.
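A minimal sketch of how such a scatterplot and correlation could be produced is shown below; the gpa and caap arrays are hypothetical stand-ins for the unpublished student-level records.

```python
# Scatterplot with fitted line and Pearson correlation (hypothetical data).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
gpa = rng.uniform(2.0, 4.0, 86)                  # community college GPA
caap = 50.4 + 3.1 * gpa + rng.normal(0, 4, 86)   # simulated CAAP scores

r = np.corrcoef(gpa, caap)[0, 1]                 # Pearson correlation
slope, intercept = np.polyfit(gpa, caap, 1)      # least-squares line

plt.scatter(gpa, caap)
plt.plot(gpa, slope * gpa + intercept)
plt.xlabel("Community college GPA")
plt.ylabel("CAAP Writing Skills test score")
plt.title(f"Pearson r = {r:.2f}")
plt.show()
```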


Multiple regression model.
A multiple regression model predicts the CAAP Writing Skills test score moderately well.

Dependent variable: CAAP Writing Skills test score

Variable                                   Coefficient   t-value   p > |t|
---------------------------------------------------------------------------
GPA (community college)                         3.12        2.86     0.005
GPA (English)                                   0.43        0.60     0.549
Number of English courses                      -0.81       -3.25     0.002
English as the student’s first language        -0.73       -0.81     0.420
Constant                                       50.38       15.94     0.000

R-square                                        0.27
F value                                         7.01                 0.0001

The CAAP test score turns out to be a function of the student’s overall GPA at the community college and the number of English courses taken. Several other variables were tried in the model; all were insignificant.
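For reference, a model of this form could be fit with statsmodels as sketched below; the file name and column names are illustrative assumptions, not part of the original analysis.

```python
# Fitting the four-predictor regression (a sketch; "caap_students.csv" and
# the column names are hypothetical, as the raw data were not published).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("caap_students.csv")
model = smf.ols(
    "caap_score ~ cc_gpa + english_gpa + n_english_courses + english_first",
    data=df,
).fit()
print(model.summary())   # coefficients, t-values, p-values, R-squared, F
```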


Conclusion:

The CAAP Writing Skills multiple choice examination is a valuable addition to the repertoire of vehicles for assessing writing skills. These include the following at CSLA:

  • the English Placement Test taken by almost all first year students
  • grades in English courses
  • the portfolios presently used in the precollegiate English courses (which could possibly be used for assessing writing as students leave English 101 or 102)
  • the Writing Proficiency Examination (taken by all undergraduate and graduate students)
  • the upper division writing courses required in all majors
  • the senior seminars in some majors
  • the CAAP writing skills test

On the positive side, the CAAP is nationally normed and the questions address classical standard English. On the negative side are the expense, the difficulty of implementation, and the composition of the norming group that ACT used for this test. Moreover, the CSLA sample consisted exclusively of transfer students in their first quarter on campus, presumably before we had had a positive effect on their English skills.

Consequently, the working group recommends that the CAAP be used only as one of several tools to assess writing skills; it has limited value within the CSLA context.