Student Achievement Assessment Committee
Department of Psychology
1. Learning Outcomes in the Department of Psychology
At the completion of baccalaureate degree studies in Psychology, students will:
- Exhibit broad knowledge about human behavior from a variety of psychological perspectives (e.g., biological, cognitive, developmental, social).
- Have the necessary skills in research and other forms of inquiry in order to develop new knowledge about behavior.
- Be able to communicate their knowledge of psychology to others.
- Have the necessary skills and content knowledge to be an informed and critical consumer of existing knowledge.
- Be prepared for post-baccalaureate studies in psychology or related disciplines, or for entering the workforce in areas related to their training.
This report describes the results of the assessment activities carried out in the Department of Psychology over the two-year period 1999-2001. There were two major projects during this time period: development of our questionnaire-based assessment of student learning outcomes and an empirical assessment of critical thinking. Together, these projects allowed us to assess, to various degrees, the following learning outcomes: Research Skills, Inquiry, Communication, Critical Thinking, and Preparedness.
2. Assessment Activities
A) Refining the “Assessment of Student Learning” Inventory. In the spring of 1998, we developed a survey to assess students’ perceptions of what they learned in their courses. The survey, which is administered during the last week of class, asks students to rate the extent to which the items describe their experiences in a given course. A sample item is “I connected what I learned in this course with courses in the natural sciences (e.g., biology, chemistry, physics, geology).” Copies of the most recent version of the inventory are available upon request.
There have been two revisions of the survey since the spring of 1998. One version was used in the fall of 1998 and a second during the Fall 1999, Spring 2000, and Fall 2000 semesters. Analysis of the Fall 1998 data set revealed that one semester’s worth of data yields too little information about courses that our majors take. The third version of the survey was administered over three consecutive semesters. Initial analyses of the large data set were done during the Spring 2001 semester; the results are reported in Section 3 of this report.
B) Assessment of Learning Outcomes from Alumni. In response to the offer from the Office of Institutional Research to include department-specific information with their large-scale survey of 1999 and 1995 graduates, the “Assessment of Learning Outcomes” inventory was modified to reflect what our undergraduates had learned from the entire curriculum. For example, the analogous item to “I connected what I learned in this course with courses in the natural sciences” was “I connected what I learned as a psychology major with what I learned from undergraduate courses in the natural sciences.” A total of 250 alumni surveys were sent out during March; 24 were returned.
C) Assessment of Critical Thinking Skills in First Year and Fourth Year Psychology Majors. This study is the third in a line of studies designed to assess critical thinking. The first study, carried out during the 1997-98 AY, used a critical thinking test developed by Dr. Stuart Keeley. The test consists of a six-paragraph “Letter to the Editor” that students evaluate for the quality of thinking revealed in those paragraphs. (Our view is that a major component of critical thinking is the ability to identify errors in reasoning and argument.) The results of this initial study were encouraging: upper-level undergraduate students (i.e., a mixture of juniors and seniors) consistently performed at a higher level than lower-level undergraduate students (i.e., a mixture of freshmen and sophomores), suggesting that some critical thinking skills are acquired by the time a student reaches the latter half of his or her undergraduate career.
The purpose of the second study, which was carried out during the 1998-99 AY, was to refine the measurement properties of the critical thinking test. Dr. Keeley revised the critical thinking test and also refined the scoring manual. The revised test was given at the beginning and end of the spring semester to students taking a legal studies class in which the instructor emphasized critical thinking. There was very strong agreement between the raters. In addition, the mean score on the end-of-semester tests was significantly higher than on the beginning-of-semester tests, suggesting that these freshmen honors students learned how to think critically during the class.
With a highly reliable and seemingly valid test of critical thinking ability now in hand, we returned to our original population of interest: psychology majors. In the Fall 1999 semester, letters were sent to 60 randomly chosen first-year and 72 randomly chosen fourth-year psychology majors, inviting them to help the department assess how well psychology majors learn to evaluate arguments. Students were paid $10.00 for participating and had their names entered in a lottery to win $100. (This study was conducted under HSRB Project No. H00P078FX2.) Altogether, 38 first-year and 24 fourth-year students participated. The results are described in the next section.
3. Results and Conclusions
A) Analysis of the “Assessment of Student Learning” Inventory. Here we describe how the patterns of responses to the 37 items in the survey yield clusters of similar items, a technique known as factor analysis. Next we report some initial analyses of our curriculum by examining how different groups of students scored on the derived factors.
Sub-scales of the “Assessment of Student Learning” Inventory. Factor analysis of the large data set, done during the Spring 2001 semester, revealed that the items in the survey can be grouped into six different clusters, or factors, containing from three to eight items each. (This is down from the seven-factor solution we reported on earlier, when we used a smaller – and hence noisier – data set.) Upon examining the items that fall into each cluster, we abstracted a name that captures the essence of what those items are measuring. The names of those six factors, and two items from each factor, are presented below. (A full report of the analysis is also available upon request.)
Scientific Inquiry and Problem Solving
I learned how psychologists study the issues pertaining to the content matter of the course.
I learned how to think critically about the ideas raised in the course.
Inter- and Intrapersonal Growth
As a result of taking this course I developed a better understanding of myself.
As a result of taking this course I became more tolerant of ideas differing from mine.
Research Skills and Logical Thinking
I learned how to use logical, mathematical or computational techniques to further my understanding of the content of the course.
I learned how to express myself in writing (e.g., APA-style papers, written assignments, essays).
Integration
I connected what I learned in this course with courses in the social sciences (e.g., ethnic studies, sociology, history).
I connected what I learned in this course with readings outside this course (e.g., books, newspapers, magazines).
Relevance for Career or Future
As a result of taking this course I learned information that I will use in graduate school or in my career.
As a result of taking this course I acquired skills I will use in graduate school or in my career.
Working with Others
As a result of taking this course I worked with other students or discussed course material with other students during class (e.g., in class or small group discussions).
As a result of taking this course I worked with other students or discussed course material with other students outside of class (e.g., in study groups, using E-mail).
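The clustering step described above – grouping survey items by their patterns of co-variation – can be sketched with a standard factor-analysis routine. The sketch below uses synthetic ratings with a known two-factor structure (the item groupings, factor count, and data are hypothetical illustrations, not the department’s actual survey data or six-factor solution):

```python
# A minimal sketch of factor-analyzing survey ratings, using
# scikit-learn on synthetic data. Hypothetical: 6 items driven by
# 2 latent factors (e.g., "inquiry" and "working with others").
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

n_students, n_items, n_factors = 300, 6, 2
latent = rng.normal(size=(n_students, n_factors))
loadings = np.array([
    [0.9, 0.0], [0.8, 0.1], [0.7, 0.0],   # items loading on factor 1
    [0.0, 0.9], [0.1, 0.8], [0.0, 0.7],   # items loading on factor 2
])
ratings = latent @ loadings.T + rng.normal(scale=0.3, size=(n_students, n_items))

# Varimax rotation yields the "simple structure" that makes clusters
# of items interpretable, as in the six-factor solution reported here.
fa = FactorAnalysis(n_components=n_factors, rotation="varimax", random_state=0)
fa.fit(ratings)

# Assign each item to the factor on which it loads most strongly.
cluster = np.abs(fa.components_).argmax(axis=0)
print(cluster)  # items should cluster by their generating factor
```

Once items are clustered this way, naming each factor (as was done above) is a matter of reading the items that load on it.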
Two results of this analysis deserve to be highlighted. First, the factors themselves, and individual items within those factors, appear to capture quite nicely both the learning outcomes for the department that were presented in Section 1 as well as most of the more general learning outcomes for the university, as presented on the assessment web site (viz., investigate, connect, write, present, and participate).
Second, to our surprise, items that we saw as reflecting the learning of course content (e.g., I became much more informed about the subject matter of the course; I learned terms to describe behavior) did not form their own independent category, falling instead in three or four of the categories described above. It would seem that the content of these courses is learned, for example, as part of Scientific Inquiry or as part of Personal Growth. In this sense, the content of psychology courses appears not to be stand-alone facts (i.e., declarative knowledge) but incorporated into skills (i.e., procedural knowledge).
Conclusions: The “Assessment of Student Learning” inventory appears to be a reasonably valid and easily administered instrument for assessing what students believe they have learned. It should be remembered, however, that these are students’ perceptions of what was or was not learned in a given course, and the responses may not be an accurate reflection of what was in fact learned.
A Glance at the Curriculum through the “Assessment of Student Learning” Factors.
Comparisons of Majors to Non-majors. One way of determining whether the empirically derived factors make sense is to compare the responses of our majors (approx. 720 respondents) to those of non-majors (approx. 1750 respondents). Non-majors are likely to take large-lecture courses that are required by their majors (e.g., introductory psychology, child development, abnormal psychology) rather than the smaller research-focused courses designed for our majors (e.g., statistics, research methods, junior-level laboratory courses); majors take both types of courses.
On every factor but Integration and Inter- and Intrapersonal Growth, majors scored significantly higher than non-majors (F’s ranged from 14.51 to 205.37, p’s < .001). The largest difference between the groups was in the Research Skills and Logical Thinking factor (mean difference = 2.44), followed by Relevance for Career or Future (mean difference = 2.11) and Scientific Inquiry and Problem Solving (mean difference = 1.15). Considering that the non-majors DO NOT take our research-focused courses and are not likely to pursue careers in psychology, these results make sense. It is encouraging to see that at least some Scientific Inquiry and Problem Solving outcomes are provided throughout the curriculum – if not, the mean difference between the majors and non-majors would likely have mirrored those for the other factors. The mean difference between majors and non-majors on Working with Others, although significant, was small (.56), suggesting that this outcome is prevalent throughout the curriculum. The same could be said of Integration, even though the difference between majors and non-majors was not significant. Interestingly, our majors scored slightly but significantly LOWER than the non-majors on Inter- and Intrapersonal Growth (mean difference = .58, F = 4.94, p < .05). It appears that the very same research-focused courses that our majors take may provide fewer opportunities for growth outcomes than the large lecture courses that our non-majors take. Alternatively, our majors may begin with a higher level of inter- and intrapersonal knowledge than non-majors – which may be why they are psychology majors in the first place – and thus have less to learn in this area. The first explanation seems more likely.
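For readers unfamiliar with the analysis behind these F statistics, the group comparison is a one-way ANOVA on factor scores. The sketch below simulates such a comparison; the score distributions are invented (only the group sizes echo the approx. 720 majors and 1750 non-majors), so the printed statistics will not match those reported above:

```python
# A hedged sketch of the majors vs. non-majors comparison: a one-way
# ANOVA on (simulated) factor scores for two groups of students.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)

# Hypothetical "Research Skills" factor scores; majors set ~2.4 higher.
majors = rng.normal(loc=12.0, scale=3.0, size=720)
non_majors = rng.normal(loc=9.6, scale=3.0, size=1750)

f_stat, p_value = f_oneway(majors, non_majors)
print(f"mean difference = {majors.mean() - non_majors.mean():.2f}")
print(f"F = {f_stat:.2f}, p = {p_value:.3g}")
```

With two groups, the one-way ANOVA is equivalent to an independent-samples t-test (F = t²); reporting F keeps the analysis uniform across comparisons with more than two groups, such as the year-in-school analyses below.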
Conclusion: The comparisons between majors and non-majors lend credibility to the empirically derived factors, especially when considering the constellation of classes that these different groups of students take. Because this data set is the first one to have been analyzed to this level, it is difficult to make recommendations for change, or even to know whether change is necessary. Recall that the results just described have been averaged over all courses for which we have data, and not every course is expected to exhibit a high profile on all factors. These results will be useful as benchmark data with which we can compare subsequent assessments.
Learning Outcomes Over Time. Given the sequence of courses that our majors take over the course of their undergraduate careers, it seemed reasonable to expect, for example, that juniors and seniors would indicate having learned more research skills than would freshmen and sophomores, or that courses taken during the later years would be more relevant for the future than those taken early on. On the other hand, we would hope that outcomes like learning to think critically within a particular domain or connecting what was learned with other fields would not vary with year as a psychology major. Consequently, we looked at scores on the six learning outcome factors as a function of year in school.
There were no significant differences among the means of first-, second-, third-, and fourth-year students in scores on the Integration, Inter- and Intrapersonal Growth, and Scientific Inquiry and Problem Solving factors (F’s < 2.00, p’s > .10). As expected, scores on the Research Skills and Logical Thinking factor increased each year (F = 4.42, p < .005). Perhaps reflecting the tendency for students in the smaller research-focused courses to work together, or the tendency to know most of one’s classmates by one’s senior year, scores on the Working with Others factor also increased with year in school (F = 5.84, p < .005). A troubling result is that scores on the Relevance for Career or Future factor systematically decreased with year in school (F = 8.33, p < .001). Post hoc analyses indicate that students who are about to enter the workforce or graduate school see the courses that they take as seniors as less useful than those they took as freshmen, sophomores, and juniors. Whether this trend is unique to our majors or reflects some general growing dissatisfaction with year in school is unclear. We note that the magnitude of the decrease in perceived relevance from the first to the fourth year is nearly identical for our majors and for non-majors. Although at first blush this finding could be seen as evidence for general dissatisfaction, it still troubles us to see that our majors find our courses decreasingly relevant as they move through the curriculum.
Demographic data obtained at the same time that the survey is administered shows a steady decrease from the second-year on in the percentage of our majors who plan to attend graduate school in psychology: 75%, 57%, and 32% in the second-, third-, and fourth-years respectively (there were too few first-year students to allow for meaningful comparisons). Not surprisingly, there is a corresponding increase in both the proportion of students who plan to attend graduate school in a related field with year in school (5% to 12% to 20%) and the proportion of students who hope to get a job in a psychology-related field upon graduating (9% to 12% to 38%). If students perceive that the courses that they take during their last two years are geared mostly to psychology majors who plan to pursue graduate work in psychology, the growing dissatisfaction with course relevance could reflect the growing numbers of majors who decide not to pursue graduate work in psychology.
The paucity of responses from first-year students mentioned above led to an examination of the number of responses we obtained from students at each of the years in school. After collecting data over three semesters, we had responses from 71 first-year, 165 second-year, 230 third-year, and 250 fourth-year students. The entry-level course in psychology is PSYC 201, which ought to be taken during the first year. We found that only about 25 of the 100+ first-year students who declare psychology as their major take PSYC 201 during their first year. If our majors are waiting until the second year to take PSYC 201, they will likely be taking the junior-level laboratory courses – which may be seen as irrelevant to one’s future unless one is going into experimental psychology – during the fourth year. Indeed, the available data show that 80% of the students enrolled in the junior-level labs are seniors.
Conclusions: Given the decrease in perceived course relevance with year in school, it appears that we may not be serving the needs of all of our majors as well as we should be. One possibility is that we are not offering the proper assortment of courses for students to take during the fourth year. Another possibility is that students are taking third-year courses during the fourth year instead of taking those courses that are more relevant (assuming that we have them in the curriculum). If students are indeed waiting until the second year to start the major, it is no surprise that they are taking third-year courses as seniors. Using course enrollment data, we will determine whether first-year enrollments in PSYC 201 by our majors are indeed low, as the current data indicate. If so, it would behoove us to find out why. For example, a curriculum modification that changes PSYC 201 to PSYC 101 would be in order if first-year students report being wary of taking a 200-level course. Less cosmetic changes, such as creating tracks for students to follow so that they are taking relevant courses during their last two years, are being discussed.
B) Assessment of Learning Outcomes from Alumni. With only 24 alumni surveys returned, there were too few responses to run a factor analysis on the alumni data. Consequently, we took the items from the student survey that formed the six factors and used the analogous items on the alumni survey to create analogous factors. Responses to those items were scored just as they were with the student survey. Recall that the students who fill out the survey are essentially a captive audience; the 10% of the alumni who returned the survey are not. Because of that, the alumni responses may not be representative of all alumni. Indeed, two-thirds of the respondents had attended graduate school in psychology or a related field, which is considerably higher than the percentage of fourth-year students who were planning to attend graduate school in psychology or a related field.
Compared to the responses from the students, the responses from the alumni were significantly higher (t’s > 3.25, p’s < .01) on four of the six factors: Integration (mean difference = 4.97), Inter- and Intrapersonal Growth (mean difference = 3.17), Working with Others (mean difference = 1.00), and Research Skills and Logical Thinking (mean difference = 2.00). There was no difference between the two groups of respondents on the Scientific Inquiry and Problem Solving (mean difference = 0) and the Relevance for Career or Future (mean difference = -.31) factors (t’s < .500, p’s > .10).
Conclusions: These results can be interpreted in two ways. On the one hand, we could argue that a retrospective look at what was learned by majoring in psychology shows increases in four of the six factors – something along the lines of students not appreciating what they were learning in school until they were out of school. On the other hand, given that the alumni sample was in some sense more successful than a random sample of majors would have been, the higher learning outcome scores are to be expected. Accordingly, that scores on two factors are no higher than those of the student sample can be seen as deficits. Either way, considering that Relevance for Career or Future is one of those factors, and that it also showed up as a possible trouble area in the student sample, these data indicate that course relevance is an issue that needs to be explored. That scores on the Scientific Inquiry and Problem Solving factor did not differ between students and alumni suggests that these outcomes, too, bear watching.
C) Assessment of Critical Thinking Skills in First Year and Fourth Year Psychology Majors. A graduate student in psychology was trained to score the responses to the critical thinking test using the revised scoring manual that Dr. Keeley had created. Unfortunately, this student went on emergency medical leave soon after the spring semester began, which substantially delayed the scoring of the tests. We now have preliminary results from this third study, which mirror those of the initial pilot study. That is, with means of 11.4 and 6.0, fourth-year students scored significantly higher than first-year students on the critical thinking test (t(60) = 2.970, p < .005). On only four of the eight paragraphs in this test, however, were the scores of fourth-year students significantly better than those of first-year students.
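The comparison reported here is an independent-samples t-test, with degrees of freedom 38 + 24 − 2 = 60. The sketch below simulates such a test around the reported group means; the individual scores and their spread are invented, so the computed t will not reproduce the reported t(60) = 2.970:

```python
# A minimal sketch of the first- vs. fourth-year comparison: an
# independent-samples t-test on simulated critical-thinking scores
# centered on the reported means (6.0 and 11.4), with the reported
# group sizes (38 first-year, 24 fourth-year participants).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)

first_year = rng.normal(loc=6.0, scale=3.0, size=38)    # spread is hypothetical
fourth_year = rng.normal(loc=11.4, scale=3.0, size=24)

t_stat, p_value = ttest_ind(fourth_year, first_year)
df = len(first_year) + len(fourth_year) - 2
print(f"t({df}) = {t_stat:.3f}, p = {p_value:.3g}")
```

The same routine, applied paragraph by paragraph, would yield the per-paragraph comparisons on which fourth-year students outperformed first-year students in only four of eight cases.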
Conclusions: The 5.4 point difference between fourth-year and first-year student scores on the critical thinking test was very close to the 4.75 point difference obtained in the initial pilot study. This degree of improvement in test scores that is contemporaneous with obtaining a degree in psychology pales in comparison to the 17.1 point difference seen among freshmen honors students taking a course that emphasizes critical thinking. These differences could be due to the relative inefficacy of trying to embed critical thinking within the curriculum, the nature of the psychology students who volunteer for the study, or differences between the freshmen honors students and our student volunteers.
4. Actions Taken
A) Refining the “Assessment of Student Learning” Inventory.
As a result of analyzing the large data set, a fourth version of the inventory, which is shorter than the other three, was developed for use at the end of the Spring 2001 semester. The Spring 2001 data await analysis. We hope that with fewer items in the survey, students will be more likely to fill it out with care. Eventually we hope to have sufficient data to be able to examine the learning outcomes on a course-by-course basis.
We are currently examining class enrollment by year in college to get a comprehensive picture of the flow of students through the curriculum. If the data obtained from the surveys are at all accurate, student flow is not what was intended when the curriculum was developed, i.e., with 80% of the students in the junior-level lab courses being seniors.
We also looked at psychology curricula from a variety of other colleges and universities. Without exception, every department we examined has a breadth requirement (i.e., where students are required to sample courses from different areas of psychology) and most offer identifiable tracks within areas of psychology -- a sequence of courses for students interested in clinical/counseling psychology to take that differs from the sequences for those interested in developmental, cognitive, I/O, or physiological psychology (behavioral neuroscience). Our curriculum has neither a breadth requirement nor tracks (which probably go together). Creating tracks and populating them with specialized courses may go a long way towards increasing the relevance of our courses to our upper-level students.
B) Assessment of Learning Outcomes from Alumni.
Late last fall, we began to solicit comments from our alumni concerning things that they would change about our curriculum and about things they wish they would have had the opportunity to learn. These comments, which have been coded but not analyzed, may prove useful in addressing the course relevance issue. As with the previous alumni survey, the results are based on a self-selecting sample of alumni. We will report our findings next year.
C) Assessment of Critical Thinking Skills in First Year and Fourth Year Psychology Majors.
One of the many factors that may account for why the first-year honors students’ critical thinking scores improved so much more than did those for our majors is the attitudes that those students have toward critical thinking. We obtained and examined a copy of the California Critical Thinking Disposition Inventory (CCTDI), a 75-item instrument that measures “The Critical Spirit.” Presumably, without such a disposition, the learning of critical thinking skills would be ephemeral at best. We planned to administer the CCTDI to first-year students enrolled in a UNIV 100 course designated for psychology majors with the hope of following at least some of these students through the curriculum, armed with their CCTDI scores. When we ordered the CCTDI, however, we were sent something else by mistake. By the time we discovered the error, it was too late in the semester to administer the inventory. Meanwhile, we plan to continue analyzing the results of the critical thinking study.