Assessment at Baruch

More Information About Assessment Plans

The first step in developing an assessment plan is to decide which part(s) of your program you wish to assess in order to determine whether it is working as planned.  The logic model that you developed in the previous section will serve as a useful framework for deciding which element(s) of your program to assess.

It is not necessary to assess every component or intervention every semester.  However, you need to develop a timeline to indicate when various components will be assessed.  A sample assessment cycle is shown below:

Sample Assessment Cycle

Program Component or Intervention | Outcome (from Logic Model) | When Assessed | Report Due | Who is Responsible
----------------------------------|----------------------------|---------------|------------|-------------------
Math Tutoring Program             | Outcome 1                  | Fall 2008     | Dec 31     | Jimmy
Math Tutoring Program             | Outcome 2                  | Spring 2009   | June 1     | Astrid
Pre-grant Awards                  | All Outcomes               | Fall 2008     | Dec 31     | Alan
Post-grant Awards                 | All Outcomes               | Fall 2009     | Dec 31     | Melissa

Once a timeline has been set for assessing the various program components or interventions, an assessment plan can be developed for the current semester.
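For illustration only, the sample cycle above can be kept as a simple set of records and filtered to find what is due in a given semester. This is a minimal sketch, not part of the original guidance; the field names and the due_in helper are assumptions that simply mirror the columns of the sample cycle.

    # A minimal sketch of the sample assessment cycle as structured records.
    # The field names mirror the table columns; they are illustrative only.
    assessment_cycle = [
        {"component": "Math Tutoring Program", "outcome": "Outcome 1",
         "when_assessed": "Fall 2008", "report_due": "Dec 31", "responsible": "Jimmy"},
        {"component": "Math Tutoring Program", "outcome": "Outcome 2",
         "when_assessed": "Spring 2009", "report_due": "June 1", "responsible": "Astrid"},
        {"component": "Pre-grant Awards", "outcome": "All Outcomes",
         "when_assessed": "Fall 2008", "report_due": "Dec 31", "responsible": "Alan"},
        {"component": "Post-grant Awards", "outcome": "All Outcomes",
         "when_assessed": "Fall 2009", "report_due": "Dec 31", "responsible": "Melissa"},
    ]

    def due_in(semester):
        """Return the cycle entries scheduled for assessment in the given semester."""
        return [row for row in assessment_cycle if row["when_assessed"] == semester]

    # Components to plan for in the current (Fall 2008) semester:
    for row in due_in("Fall 2008"):
        print(row["component"], "-", row["outcome"],
              "- report due", row["report_due"], "-", row["responsible"])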

Cycle for Assessment Planning Template  

A model assessment plan is shown in the chart below; each column in the chart is described here:

  • Assessment Questions – These are questions about your program components, not the questions that appear on student and faculty surveys. A survey is a method for collecting data; responses to survey questions are used to assess whether a program component has achieved its intended outcomes.

      Types of Assessment Questions – Formative and Summative.

Formative Questions (quantity and quality of operations and services) – Focus on the level of resources necessary to carry out program activities; the level of effort necessary to achieve program outcomes; the level of outreach; and the level of satisfaction with services and operations.

Summative Questions (learning outcomes) – Focus on whether and to what extent the program has achieved its goals for student and faculty learning in terms of knowledge, awareness, attitudes, behavior, etc.

  • Indicators of Success – What are the specific, measurable characteristics or changes that represent achievement of an outcome? Indicators are directly related to the outcome and help define it. Examples are provided in the table below entitled ‘Types of Indicators of Success’.
  • Data Sources – Where and from whom will the data needed to address the assessment questions be obtained? Sources might include program documents and existing databases; program participants, students, faculty, and staff; and other records and observations.
  • Methods of Collecting Data – What methods will be used to collect the data from the various data sources?  These may include surveys, focus groups, interviews, tests, and program logs. 
  • Timeline – When will the data be collected? For example, a survey may be administered at the beginning and at the end of the semester.
  • Responsible Person(s) – Who will be responsible for collecting the data? They may include staff within a given administrative unit or staff from other offices and entities within the institution.
  • Data Analysis – How will the data be interpreted? Data analysis methods range from simple counting and comparing to more sophisticated methods such as analysis of variance or regression analysis; a brief sketch of a simple pre/post comparison follows this list.
  • Dissemination of Results – With whom will the data findings be shared and in what format?  They may include faculty, department heads, external evaluators, and accreditation evaluators.  Formats for disseminating the results include technical reports, executive summaries, pamphlets, newsletters, and oral presentations.
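As a concrete illustration of the simpler end of that range, the sketch below compares average satisfaction ratings from a survey administered at the beginning and end of the semester. The ratings and the summarize helper are hypothetical, and this is only a sketch; more sophisticated methods such as analysis of variance or regression would typically be run in a statistics package rather than computed by hand.

    from statistics import mean
    from collections import Counter

    # Hypothetical 1-5 satisfaction ratings from the same survey administered at
    # the beginning and end of the semester (pre/post), for illustration only.
    pre_ratings = [3, 2, 4, 3, 3, 2, 4, 3]
    post_ratings = [4, 3, 5, 4, 4, 3, 5, 4]

    def summarize(label, ratings):
        """Simple counting and comparing: frequency of each rating plus the mean."""
        counts = Counter(ratings)
        print(label, "n =", len(ratings),
              "counts =", dict(sorted(counts.items())),
              "mean = %.2f" % mean(ratings))

    summarize("Pre ", pre_ratings)
    summarize("Post", post_ratings)
    print("Change in mean rating: %.2f" % (mean(post_ratings) - mean(pre_ratings)))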

Fall 2008 Assessment Plan Template