Student Evaluations

Online Surveys

Diverse methods, ranging from surveys to observations, help me assess my teaching effectiveness. I request feedback on my teaching from professors and lecturers in addition to departmentally required observations, and I administer written student surveys early in the semester to identify effective methods and to adjust my techniques as necessary.

Qualtrics online surveys at the beginning and end of a course help me gauge how students perceive their own learning progress in a given class. This system allows me to combine Likert-scale and open-ended questions to track students’ perceived progress as well as shifts in engagement. Quantitative items always pair with qualitative ones, often simply asking “why?” I use entrance/exit surveys of this sort to characterize student engagement with Italian writing and Internet culture and to determine the effectiveness of Italian blogging as a pedagogical tool, as part of my Teaching as Research in Higher Education project. To avoid the common pitfall of fetishizing emerging technologies, I will continue to use online surveys to assess the usefulness of new forms and forums of writing as they enter common usage. Because students complete these surveys at home, they do not take up valuable class time, and I can easily identify trends. While online surveys make evaluation manageable and efficient, particularly for large classes, I prefer to supplement them with paper surveys and observations.

Paper Surveys

Cornell University’s Department of Romance Studies conducts in-class student surveys twice per semester. The Feminist, Gender, and Sexuality Studies Program uses long-form student surveys at the end of the semester. I augment this feedback by soliciting informal, biweekly reactions in the form of minute papers at the end of the day’s lesson. This timing allows me to tailor the next day’s activities to the students’ areas of concern and reveals how quickly students are progressing toward learning goals. Typical questions for these casual check-ins include, “Which class activity did you find most challenging today, and why?” or “If you were the teacher, which exercise would you review with the class tomorrow, and why?”

Feminist, Gender, and Sexuality Studies Evaluations

My Fall 2014 student surveys for “Introduction to Feminist, Gender, and Sexuality Studies” surprised me: these students noticed completely different aspects of my teaching than my Romance Studies students did. This group particularly valued my ability to explain complex theories in simple terms. As one student put it, “She was very adept in realizing when the class was not clear, and in that event she would clearly explain the topic at hand.” Students in FGSS appreciated my “passion for the material,” whereas my Romance Studies students interpreted this same enthusiasm as a more generalized personality trait. I was also surprised to learn that many students considered me a tough grader. In the words of one student, “Harsh but fair grader – her critiques and comments are really helpful.” I consider this comment evidence of my progress as a teacher since arriving at Cornell in 2009. We all want to be liked by our students, and offering A’s can provide an easy path to popularity. But I want my students to learn even more than I want them to like me. That means telling them the truth, kindly, and offering clear and specific means to improve. Even on apparently slapdash student papers, I always identify the strongest paragraphs and explain why the argumentation works so well in those places. This feedback technique provides students with models they can use for future papers. Privileging my students’ progress over my own popularity has not been easy. These surveys demonstrated that students can see, and often appreciate, this commitment to their development as scholars.

Romance Studies Evaluations

Returning to Romance Studies: the Department’s twice-per-semester surveys ask students to rate instructors on criteria ranging from effective use of visual aids to demonstration of cultural expertise. Over the past three years, students have consistently remarked on the humming energy of my classrooms and suggested that verbal and physical animation helped them understand Italian without resorting to English. In the same vein, I ask my students to mime new vocabulary or to describe unknown terms with other words they do know. Making my pedagogy explicit helps the class understand why we regularly engage in silliness: studies such as A Kinesthetic Approach to Building Language Power and Moving in(to) Imaginary Worlds suggest that gesture and movement aid vocabulary recall and the encoding of new terms.

My Spring 2015 student surveys for “European Modernism” echoed several themes common to my evaluations over the past few years: students described our class as an expedition, using words like “exciting,” “challenging,” “explore,” “seek,” and “investigate.” They particularly enjoyed how I structured the course so that each piece fit logically into the next: in assignments, for example, solo writing led to group editing, which fed into one-on-one meetings with me. In these meetings, students found me “knowledgeable” and “enthusiastic,” which motivated them to improve. As one student put it, “Through different assignments and instructions from our instructor, I have gained confidence in my writing abilities and actually enjoy writing!” Every student felt that the course had improved their writing, and many named “European Modernism” as their favorite course that year. But the evaluations also presented a paradox: my students and I disagreed about which texts helped them learn. Almost universally, the students wanted to eliminate The Craft of Research. Many claimed to have already mastered the basic lessons that the text covered, whereas I believe that the book’s step-by-step progression dramatically improved their work. Teachers often joke that we have to give students what they need, disguised as what they want, and that may have been the case here. But rather than attempt to “hide the grammar,” I hope to reexamine how I frame this book in the future. In my opinion, the book reads like a Zen proverb: how do I convey the complexity behind the concision? While I do not have an answer yet, I plan to use this question to shape future incarnations of the course.

I have learned from critical feedback as well. In my first year teaching Elementary Italian at Cornell, students requested more strongly structured lesson plans in the Fall 2010 student surveys. I now include pre- and post-segments for each activity and clarify how the exercise contributes to linguistic, literary, or cultural knowledge. Progress became apparent as early as the Fall 2011 student surveys, and my high scores for class structure in the Fall 2012 surveys indicate the continuing success of these changes.

Teaching writing skills in Intermediate Italian presented a new challenge: how to provide students with clear suggestions for improving their written work, a common request in Spring 2013’s midterm surveys. To show my students that I took their comments seriously, I closed the feedback loop the next week in class by asking them to write minute papers describing the type of feedback they found most helpful. This experience taught me that although all TAs follow the same rubric for written comments, combining written feedback with oral explanation clarifies the advice. I now schedule office hours with individual students after the first and second written assignments so that we can go through their writing and my assessment together. By the end of the semester, my Spring 2013 student surveys demonstrated the effectiveness of these changes to my approach to written error correction, as well as students’ enthusiasm for the blog project that I added to our standard curriculum.

Please note that students were instructed to review my teaching effectiveness in “Section II: The Instructor.”  “Section I: The Course” refers to TA coordinator responsibilities.