Assessment at Baruch
Creating an Assessment
This page is meant as a primer on how to design and conduct an assessment, but it is by no means in-depth enough to be used as one's only resource. Accordingly, several links to more detailed information are peppered throughout.
Assessment Guidelines for Programs, Majors, and Minors (.PDF)
This document, prepared in the Spring semester of 2014, goes into deeper detail than this page. It is an essential resource.
Assessment: A Three-Step Cycle
The graphic below illustrates the cyclical process of assessment, which consists of at least three major steps. This process is sometimes referred to as the "Plan-Assess-Improve" method; other models use up to six major steps, but the three-step method is simple, streamlined, and effective.
1. Set Standards for Learning
First, standards (i.e. learning goals) must be set. Without standards, nothing can be assessed. It is best for standards to be enumerated and precisely defined: rather than saying that students should "Acquire a breadth of linguistic knowledge," for example, it is better to say that students should "Acquire a working knowledge of phonology, syntax, and semantics." This way, each standard can be individually evaluated, and subjective judgments are kept to a minimum.
Fundamental to writing effective learning goals is a familiarity with Bloom's taxonomy. Without an understanding of the various levels of knowledge, it is difficult to author learning goals that touch upon the full breadth of a student's potential for learning.
In 1956, Benjamin Bloom headed a group of educational psychologists who developed a classification of levels of intellectual behavior important in learning. Bloom found that most of the test questions that students encounter require them to think only at the lowest possible level: the recall of information.
Bloom identified six levels within the cognitive domain, from the simple recall of facts as the lowest level through increasingly complex and abstract mental levels.
During the 1990s, a new assembly was formed to update the taxonomy to reflect its relevance to 21st-century work. Led by Lorin Anderson (a former student of Bloom), the group contained cognitive psychologists, curriculum theorists, instructional researchers, and testing and assessment specialists. This new taxonomy ("The Revised Bloom's Taxonomy") is the standard today.
Below, the levels are defined in order from least to most complex. Each definition is followed by a list of verbs that exemplify intellectual activity at that level.
- Remember: Recalling relevant knowledge from long-term memory.
Define, duplicate, list, memorize, recall, relate, repeat, reproduce
- Understand: Making sense of what you have learned.
Classify, describe, discuss, explain, identify, locate, recognize, report, select, translate, paraphrase
- Apply: Using knowledge gained in new ways.
Choose, demonstrate, dramatize, employ, illustrate, interpret, operate, schedule, sketch, solve, use, write
- Analyze: Breaking the concept into parts and understanding the relationships between each part.
Appraise, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test
- Evaluate: Making judgments based on a set of guidelines.
Argue, criticize, defend, judge, question, relate, select, solve, support, value
- Create: Putting information together in a creative way.
Assemble, combine, construct, design, develop, formulate, generate, invent, write
Writing Learning Goals
Learning goals should clearly articulate expected outcomes of student learning upon completion of instruction (e.g. major, minor, graduate degree, course). These goals should be directly measurable, such as through student assignments, although indirect measures are also useful and can be used in addition to direct measures. Such indirect measures include student surveys, feedback from student focus groups, and course evaluations.
Learning goals may also be referred to as learning outcomes, learning objectives, or student learning outcomes, although this list of terms is not exhaustive.
Learning goals should follow the model that the college has adopted for learning goals associated with courses: "By the time that students have completed [the program/major/minor/course], they will be able to..." See the faculty handbook's guidelines for writing learning goals. Of course, student learning goals should be appropriate to the level of each course or program.
For each goal, use active verbs that make clear to students and instructors what students will be able to do upon completion of the program. The emphasis is on the student's achievements, not the faculty member's methods: language such as "Faculty members will demonstrate how to distinguish pigeons from mourning doves" should be avoided in favor of student-oriented phrasing, such as "Students will be able to distinguish between pigeons and mourning doves." It is useful to use verbs contained in the typical discussions of the revised Bloom's Taxonomy.
Our document, Assessment Guidelines, provides further information on how to construct and assess learning goals.
2. Systematically Gather, Analyze, and Interpret Evidence
The next step is to evaluate how well your standards are being met. This is the most labor-intensive step and the one most frequently associated with "assessment." Standards must be mapped to a rubric, by which student performance will be graded and ranked to give statistical insight into how well students are meeting standards (and, by extension, how well a program is ensuring that its standards are met).
Data must be collected, analyzed, and interpreted. The collection can be done using surveys, assessment of a final project or portfolio, or standardized testing.
Whatever the means, the final task that must be completed in this step is the assessment report. Completed assessment reports from Baruch can be seen on the Assessment Projects & Surveys page. The purpose of an assessment report is to present and interpret the assessment results, and provide suggestions for improving student learning. Both successes and failures should be highlighted and discussed in an assessment report.
Assessment Framework for Academic and Administrative Support Services (.PDF)
In 2008, the Office of Institutional Research and Program Assessment prepared this document, which provides a framework for conducting assessment and includes information on logic models, improvement strategies, and how assessment differs from standard yearly reporting. Although this document is not recent, it is not outdated and remains a valuable resource.
Bloom's Taxonomy Blooms Digitally
When designing an assessment, it is useful to be familiar with Bloom's Taxonomy, which identifies several different layers of learning, from lower-order (knowledge) to higher-order (evaluation). This way, students can be assessed at all levels of knowledge, rather than just one or two. The Taxonomy has been revised several times, and this article by Andrew Churches, hosted by Techlearning.com, provides in-depth information on several different versions of it.
3. Use Information to Improve Performance
The final step is to put the assessment report into practice. This can take numerous forms, including restructuring the curriculum, redefining learning goals to better match the skills that students are actually acquiring, changing standards for program admission, adding courses, and hiring new faculty.
Starting the Process Over Again
An assessment is only of value insofar as its findings are applied; ideally, whatever changes are made on the basis of your findings will effectively correct any problems that were found while continuing to foster identified successes. However, it is not enough to simply assume that these changes had their intended effect, which is why the entire assessment cycle starts over again.
Assessment is ultimately about identifying which practices are successful and which are unsuccessful, and trying to fix the latter while preserving the former. It is, by necessity, an adaptive process, and must continuously change to reflect the realities of a given program.