Several departments and programs have developed assessment methods tailored to Carleton's culture and to their own needs. In this section, we present a few examples of assessment methods that other programs can adapt for their own use.

Exit Interviews

SOAN

Assessing the SLOs in the SOAN Curriculum: The Student Perspective

Although SOAN faculty believe that the six SLOs are communicated in their courses, do students realize they have acquired these SLOs? To answer this question, we asked our graduating seniors in their individual exit interviews whether and how – i.e., through what curricular and co-curricular activities – they had achieved each objective. The interviews were conducted by our Departmental Administrative Assistant, Liz Musicant, who interviewed fourteen of the eighteen graduating seniors. The questions we developed are listed below, along with a selection of student responses.

  1. Can you give me an example of a time in the major where you felt that you learned to connect information about historical and contemporary socio-cultural phenomena?
  2. Can you give me an example of a time in the major where you felt that you learned to formulate appropriate sociological and/or anthropological research questions about socio-cultural phenomena?
  3. Can you give me an example of a time in the major where you felt that you learned to apply sociological and anthropological theory to analyze socio-cultural phenomena?
  4. Can you give me an example of a time in the major where you felt that you learned to describe how sociology and anthropology interact with one another, interact with other liberal arts disciplines, and contribute to various interdisciplinary conversations?
  5. Can you give me an example of a time in the major where you felt that you learned to draw upon your understanding of historical and contemporary socio-cultural phenomena to engage the world?

Comps Survey

Physics

(Note that the first eight items ask students to rate the degree to which they agree with each statement.)

  1. My Comps experience was educationally valuable, and it gave me the opportunity to use and deepen my knowledge and understanding of physics and astronomy.
  2. The department provided clear instructions and helpful feedback about the Comps process and procedures.
  3. My faculty Comps advisers provided helpful comments and advice during the Comps process.
  4. I found the series of deadlines, with repeated input and revision, helpful during the Comps process.
  5. The oral presentation helped me think through future versions of my written paper.
  6. I revised my written paper significantly after giving the oral presentation and receiving feedback on it.
  7. My student peer reviewer provided helpful advice and comments during the Comps process.
  8. As a peer reviewer, I was able to provide helpful comments during the Comps process.
  9. Do you feel your paper and talk were integrative in nature? Please list the courses (and the topics therein) that you drew upon to produce your talk and paper.
  10. What skills did you develop from taking courses in the Department of Physics and Astronomy that you applied as part of the Comps process? Where did you acquire these skills, and how were they applicable to Comps?
  11. Were there any other learning skills applicable to Comps that you did NOT acquire as part of your major?
  12. If you could do Comps all over again, what would you do differently?
  13. What advice would you give to future physics majors as they begin their Comps process?
  14. Do you have any suggestions to improve the Comps process?

Sampling student work from classes

Computer Science

How we assessed the goal

Since all of the CS majors in the class of 2010 took Programming Languages from Dave Musicant and Software Design from me, Dave and I were responsible for this year’s round of assessment. Here’s what we did:

  1. We selected at random one third of the CS majors in the graduating class. This sample turned out to include a good range of students, including our weakest senior, one of our strongest, and a bunch in the middle.
  2. We had a preliminary discussion of the criteria we would use to determine “intermediate” and “expert” proficiency. This turned out to be tricky.  We both found ourselves in one of those “we know it when we see it” situations familiar to assessors of great art and obscenity. So we sketched a few notes about particular constructs we would like to see in the students’ Python and Java programs, and moved on to the next step. During this meeting, we also decided to skip the programs written by the students for the Programming Languages course. That course is notable for giving the students the chance to study several programming languages (in addition to learning about programming language theory), but does not attempt to guide the students beyond beginner-level proficiency in those languages.
  3. We studied the assignments from Software Design submitted by these students. The assignments came from three different sections of the Software Design class, so there was no single assignment done by all the students. Each student had submitted at least two assignments written in the Python programming language, and two written in Java.
  4. We met to discuss each student’s work. This interesting meeting led us to the following observations:
    • The best programmers often avoid complicated language structures. Coherent and effective design is often best implemented via simple code. Thus, you can’t assess beginner/intermediate/expert proficiency in a programming language simply by looking for fancy language features.
    • That said, there are some intermediate-level language features in Java and Python that were appropriate in the assignments we read, and the best programmers used those features. (For example, students who used regular expressions, built-in date and time processing tools, convenient graphical user interface classes, etc. generally wrote better code overall than did the students who used less specialized language features to achieve the same goals.)
    • Some language features stick out as red flags in student code (usually through their absence), and thus might be especially valuable in assessing language proficiency. (Notably, only the strongest students made reliable and appropriate use of “final” and “static” in Java, list slicing and dictionaries in Python, and exceptions in both languages; a sketch contrasting beginner- and intermediate-level use of such features follows this list.)
    • One student (whose code was certainly of intermediate proficiency) used advanced language features, but not entirely appropriately. So is his proficiency higher because he used tools many students didn’t know about?  Is it lower because he didn’t notice the inappropriateness of his usage?
    • Overall, our feeling was that all but one of the assessed students had intermediate-or-better proficiency in Python, and about half also had intermediate proficiency in Java. This conclusion was reassuring in the sense that we’re nearly reaching our goal (and thus don’t need to panic), but also leaves us with plenty of room for improvement.
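To make those markers concrete, here is a rough Python sketch of the kind of contrast we looked for. The sampling step, roster, function names, and data are hypothetical illustrations, not the department's actual instrument or any student's code.

    import random
    import re
    from datetime import datetime

    # Hypothetical sampling step: copy the roster (list slicing),
    # shuffle, and take one third of the majors at random.
    majors = ["student%02d" % i for i in range(1, 16)]
    shuffled = majors[:]
    random.shuffle(shuffled)
    sample = shuffled[: len(majors) // 3]

    # Beginner-level style: manual string chopping, no error handling.
    def parse_date_beginner(text):
        parts = text.split("/")
        return (int(parts[2]), int(parts[0]), int(parts[1]))

    # Intermediate-level style: regular expressions, built-in date
    # tools, dictionaries, and exceptions -- the marker features
    # noted above.
    DATE_PATTERN = re.compile(r"(\d{1,2})/(\d{1,2})/(\d{4})")

    def parse_dates(lines):
        results = {}  # dictionary keyed by the original line
        for line in lines:
            match = DATE_PATTERN.search(line)
            if match is None:
                continue
            month, day, year = (int(g) for g in match.groups())
            try:
                results[line] = datetime(year, month, day)
            except ValueError:  # e.g., "13/40/2010" is not a real date
                results[line] = None
        return results

    print(sample)
    print(parse_date_beginner("5/28/2010"))
    print(parse_dates(["due 5/28/2010", "no date here", "bad 13/40/2010"]))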

Biochemistry

During the Spring term of 2010, Joe Chihade chose two exam questions from his Chem 320 course related to thermodynamics as applied to biological systems. This particular exam was given first as an in-class, fixed-time test. After it was graded and returned to the students, they were allowed to rewrite their answers over the course of several days for partial credit, so we are confident that the answers we assessed are a good indication of their understanding.

Copies of this work were made and were reviewed by Dave Alberg and Joe Chihade during the December break and the Winter Term. Dave did much of the analysis on his own, and then had follow-up conversations with Joe, who largely agreed with his conclusions.

After reading through the students’ answers, Dave generated a rubric for each of the questions, focusing on the particular thermodynamic concepts addressed. Devising these rubrics was more difficult than expected. With regard to the first question, Dave wrote: “After reading many of their answers, I struggled with devising a useful rubric that could easily be applied to their answers. In the end, I felt that the most straightforward way to handle this learning goal would be simply to read a student’s answer, in total, and then assess based on a qualitative scale of our perception of their level of sophistication.” The three-level rubric Dave arrived at was:

  • Level 1: Student has a basic understanding of ΔG° = –RT ln K.
  • Level 2: Student understands the relationship between ΔG° and ΔG°′ and understands that the direction of spontaneity of a reaction depends on the conditions at hand (i.e., Le Chatelier’s principle).
  • Level 3: Student has a sophisticated understanding of how Le Chatelier’s principle applies in the context of a biochemical pathway.

For the second question, an even simpler two-level rubric was used:

  • Level 1: Student understands ΔG = ΔH – TΔS and can apply it to a chemical transformation.
  • Level 2: Student has a sound understanding of the thermodynamics of protein folding.

Both rubrics follow the same pattern: at the lower levels, the student understands the thermodynamic concept in general terms; at the highest level, the student can apply that concept to the understanding of a biological system.
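To illustrate the distinction both rubrics draw, consider the reaction-quotient form of the first relation, ΔG = ΔG°′ + RT ln Q: a step that is unfavorable under standard conditions can still run forward when the cell keeps its products scarce, which is Le Chatelier’s principle at work in a pathway. The short Python sketch below uses made-up numbers, not values from the exam questions.

    import math

    R = 8.314e-3  # gas constant, kJ/(mol*K)
    T = 310.0     # roughly physiological temperature, K

    def delta_g(delta_g0_prime, q):
        """Free energy under actual conditions: dG = dG0' + RT ln Q."""
        return delta_g0_prime + R * T * math.log(q)

    # A hypothetical pathway step with dG0' = +5 kJ/mol:
    print(delta_g(5.0, 1.0))   # +5.0 kJ/mol: non-spontaneous at Q = 1
    print(delta_g(5.0, 1e-3))  # about -12.8 kJ/mol: spontaneous once
                               # products are kept scarce (Q << 1)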

[The report then discusses what the program found in the student work samples.]

Rubrics

History

Draft Rubric (History goals 1, 3-7) Revised 7/9/10
Analysis of Comps and Comps Defense    

The student…

Formulates a historical question of appropriate significance
  • 5 (distinction): Clearly formulated question; question corresponds to norms in history; question is significant
  • 4 (high pass): Question is acceptably historical, but may be slightly less clearly formulated or slightly less significant
  • 3 (pass): Question is acceptably historical, but may be weaker in clarity or significance
  • 2 (low pass): Question is weak in clarity or significance, or somewhat problematic as a historical question
  • 1 (not passing): Does not formulate a question, or the question is not historical

Proposes a complete and persuasive answer to the question
  • 5 (distinction): Fully answers the question in a persuasive way
  • 4 (high pass): Answers the question but with slightly less completeness or persuasiveness
  • 3 (pass): Answers the question adequately
  • 2 (low pass): Answer is somewhat convincing but not adequately so
  • 1 (not passing): Does not answer the question or provides an unconvincing answer

Bases answer on an analysis of a body of primary sources
  • 5 (distinction): Excellent match of argument and evidence; uses a rich body of sources
  • 4 (high pass): Very good match of argument and evidence; uses a very good body of sources
  • 3 (pass): Adequate match of argument and evidence; uses an adequate body of sources
  • 2 (low pass): Some problems with the match between argument and evidence, and/or a small or weak body of sources
  • 1 (not passing): Little or no connection between argument and sources; too few or no sources

Locates him/herself in a scholarly conversation via secondary literature
  • 5 (distinction): Wide-ranging knowledge of the relevant secondary literature; subtle appreciation of his/her own contribution
  • 4 (high pass): Very good knowledge of the relevant secondary literature; very good appreciation of his/her own contribution
  • 3 (pass): Good knowledge of the relevant secondary literature; good appreciation of his/her own contribution
  • 2 (low pass): Insufficient or less relevant secondary literature; weak appreciation of his/her own contribution
  • 1 (not passing): Very little or no secondary literature; little or no awareness of his/her contribution to a conversation

Documents sources
  • 5 (distinction): Documents sources completely, correctly, and consistently
  • 4 (high pass): Documents sources well; may be less consistent
  • 3 (pass): Documents adequately, but with some lack of completeness or correctness
  • 2 (low pass): Significant errors of omission or significant problems with correct and complete citation form
  • 1 (not passing): Sparse or no documentation; little or no attention to proper form

Writes with sound mechanics*
  • 5 (distinction): Shows strong control of diction, variety of syntax, and transition; may have a few minor flaws
  • 4 (high pass): Shows control of diction, variety of syntax, and transition; may have a few flaws
  • 3 (pass): Demonstrates competent writing; may have some flaws
  • 2 (low pass): May show patterns of flaws in language, syntax, or mechanics
  • 1 (not passing): Work has serious flaws in language, syntax, or mechanics that interfere with comprehension

Presents effectively orally
  • 5 (distinction): Answers all questions; presents points fully and clearly; thinks on her/his feet; demonstrates more knowledge of the topic than contained in the paper
  • 4 (high pass): Answers all questions; presents points fully and clearly; may not think on his/her feet as readily; demonstrates less knowledge of the topic beyond the paper
  • 3 (pass): Answers most questions with reasonable fullness and clarity; may struggle a bit; shows good comprehension of the work contained in the written paper and some knowledge beyond it
  • 2 (low pass): Answers some questions; formulates limited or somewhat unclear answers; knowledge does not extend much beyond the paper
  • 1 (not passing): Fails to answer most of the questions or to do so clearly; shows little knowledge of the topic

*Adapted from E. M. White, Teaching and Assessing Writing, 2nd ed. (Jossey-Bass, 1994).

Spanish Language and Literature

Also used by Latin American Studies

Latin American Studies, 2010-2011
Assessment Rubric for Comps

As part of our assessment of the major, we have chosen to focus on research and the use of secondary sources in our senior comprehensive exercise. We will use the following criteria to assess our students’ research and their incorporation of secondary resources in their essays.

Each criterion is scored on a five-point scale: 1 = Unsatisfactory, 2 = Poor, 3 = Adequate, 4 = Good, 5 = Exceptional.

The student...

Selection
  • Demonstrates use of appropriate search engines.
  • Locates sources specific to the topic or goal.
  • Incorporates a significant number of resources in Spanish (and/or Portuguese) and from original texts.

Format
  • Generates a properly formatted bibliography.
  • Quotes and cites sources correctly in the body of the text.

Application
  • Incorporates sufficient research effectively to support a central argument or thesis.
  • Selects and integrates pertinent evidence, including quoted material, quantitative data, narratives, etc.

Aggregate Raw Score: (Please sum all seven scores).
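Computing the aggregate might look like the short Python sketch below; the abbreviated criterion labels and the scores themselves are hypothetical.

    # Hypothetical 1-5 scores for one essay across the seven criteria.
    scores = {
        "appropriate search engines": 4,
        "sources specific to topic": 5,
        "Spanish/Portuguese and original texts": 3,
        "properly formatted bibliography": 4,
        "correct quotation and citation": 4,
        "research supports central argument": 5,
        "integrates pertinent evidence": 4,
    }

    # The aggregate raw score ranges from 7 (all 1s) to 35 (all 5s).
    aggregate = sum(scores.values())
    print("Aggregate raw score: %d / 35" % aggregate)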

Use of outside assessors

Art History

Art History’s response to a question on the annual report:

A. Which department student learning goal(s) was/were addressed in your department/program’s assessment work this year? (If more than 1, please number them for reference below.)  

We chose to focus on the assessment of [learning outcome] #1 for our majors through the mechanism of our comps exam. We informed our outside examiner, Nicola Courtright, Professor of Art History and Associate Dean at Amherst College, that we wanted to assess this particular learning outcome. 

. . . The examiner not only sets the exam but evaluates the student work.  . . .

This was arguably one of the most challenging Comps exams in the last fifteen years, and our number of senior majors was on the small side – only six students. The exam demanded a very sophisticated approach to discussing historical and cultural contexts, often asking the students to integrate this with our other learning outcomes, especially #2 and #6. Even though the sample is small, all students received a grade on this part of the exam that we would consider acceptable or better for work in the major, and thus achievement of this learning outcome. The range of grades … (from a tough examiner) demonstrates successful achievement of the learning outcome. Please note that this section of the exam also tests learning outcomes #2 and #6, and we will study this evidence (along with more that we collect) in the years that we focus on those learning outcomes. I should also mention that Part I of the exam focused more directly on learning outcomes #2 and #6 and that we will also be studying this evidence in future years.

Automating collection of assessment data

PEAR

The PEAR department has put three kinds of assessment tools on the web, each associated with particular outcomes for activity classes and varsity teams; a sketch of how data from such tools might be aggregated follows the list.

  • Excel spreadsheet (online): documents attendance
  • Student Voice survey: student course evaluations
  • Student Voice rubric: instructor observation rubric related to:
    • Skill acquisition
    • Knowledge and interpersonal skills
    • Behavior changes and self-management
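As a minimal sketch of what aggregating such data might look like, the following Python assumes a hypothetical CSV attendance export with student, date, and present columns; the actual Student Voice and spreadsheet export formats will differ.

    import csv
    from collections import defaultdict

    def attendance_rates(path):
        """Summarize a hypothetical attendance export as per-student rates."""
        attended = defaultdict(int)
        sessions = defaultdict(int)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                sessions[row["student"]] += 1
                if row["present"].strip().lower() == "yes":
                    attended[row["student"]] += 1
        return {s: attended[s] / sessions[s] for s in sessions}

    # Print each student's attendance rate for an activity class.
    for student, rate in sorted(attendance_rates("attendance.csv").items()):
        print("%s: %.0f%%" % (student, 100 * rate))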