HLC Assessment Focused Visit Results
On April 28-29, 2008, a team from The Higher Learning Commission (HLC, UNK’s accrediting body) visited to evaluate our progress in developing a useful and sustainable assessment process campus-wide. The 2008 “focused visit” was a follow-up to our 2004 accreditation visit, in which assessment was identified as an area that needed further attention. The team’s 2008 focused visit report identified areas where UNK had made acceptable progress and other areas that needed further attention.
The team felt that we had developed a significant infrastructure to support assessment, with 100% participation among academic departments. To ensure sustainability, the team recommended that each academic program evaluate its assessment activities to determine which offer the most valuable information for increasing student learning. The team also recommended that programs periodically review their learning objectives, methods, and measures to be sure that the process is focused on critical learning outcomes and the collection of useful data.
Faculty commitment was another area that the team commented on. The report noted that “faculty spoke openly about how the campus climate related to assessment has changed over the past four years. Faculty and assessment personnel interviewed reported a clear understanding of the current status of the assessment process and the work that remains to be done.” Recognition for exemplary assessment has been established with the annual Assessment Awards Luncheon.
The team noted progress in the assessment of Distance Education, Graduate programs, and Writing Intensive courses. However, the team noted that additional work was needed on assessment in the Cultural Diversity Program.
In addition, the team requested a follow-up report, due in 2011, on the assessment of the General Studies Program. The team noted: “As it currently stands, there does not appear to be a clear, shared definition nor understanding of the learning outcomes for the General Studies program. Each department that is teaching courses in General Studies is assessing their interpretation of the outcomes in their own way, using their own instruments. As such, it is impossible to determine whether UNK students are achieving at an acceptable level when the level shifts from course to course, program to program.” The 2011 report must: 1) provide evidence that we are measuring standardized General Studies learning outcomes and making curricular adjustments that take those outcomes into account; 2) demonstrate that we have made the required changes in the GS program to provide campus-wide agreement on learning outcomes and the assessment of those outcomes; 3) demonstrate that we have moved from a culture of assessment to a culture of learning.
The advancement section of the report included several suggestions for refining the assessment process at UNK. The team made it clear that expectations for the assessment process will be higher at the next HLC accreditation visit. The report discussed three main areas for advancement: the revision of student learning outcomes, the creation of curriculum maps, and the development of assessment rubrics. To move forward in these targeted areas, the assessment newsletter will focus on these issues during the next academic year. This newsletter includes an article on refining student learning outcomes; upcoming newsletters will include articles on developing curriculum maps and developing assessment rubrics. In addition, the Office of Assessment plans to offer online training on rubric development by Spring of 2009.
back to top
Refining Student Learning Outcomes
We want the assessment process at UNK to be useful, efficient, and sustainable. Clearly, if we do not collect information that is useful, or if our process is very inefficient, our assessment work will not be sustainable. As noted in the previous article, the HLC report identified three areas for academic programs to consider when evaluating their assessment process: the revision of student learning outcomes, the creation of curriculum maps, and the development of well-designed assessment rubrics. Each of these areas has the potential to help us toward our goal of developing a sustainable assessment process. These topics will be discussed in this year’s newsletters, starting with refining student learning outcomes.
In this article, we will suggest some criteria for evaluating student learning outcomes and provide some resources related to outcome development. Every academic program at UNK has a list of student learning outcomes that forms the basis of its assessment plan. The process of developing and refining student learning outcomes allows the faculty members of an academic program to reach consensus about the purposes of the program in terms of student learning. The identified outcomes can then become the standards for what is expected of graduates of the program. Student learning outcomes should be dynamic; as a program changes, its outcomes should be revisited.
An academic program interested in refining its learning outcomes might consider the following criteria for well-constructed student learning outcomes:
Outcomes should be stated in terms of what a student should know and be able to do at the end of the program as a result of his/her learning experiences. Because these are outcomes we expect of graduating students, they should typically move beyond statements like “Our graduates have knowledge of…”. Such a statement may be an appropriate outcome for lower-level courses, but by the time a student graduates, higher-level skills and abilities will likely be expected. For example, “Students will analyze and interpret the meaning of certain historical texts or events in their social, political, economic, and cultural contexts,” an outcome from the History Department, requires students to demonstrate higher-level skills.
Bloom’s Taxonomy can be helpful in defining outcomes of varying levels of cognitive complexity. The following links provide suggestions, including sample verbs, for developing outcomes using Bloom’s taxonomy.
Outcomes should be measurable, clear, and useful to learners. The HLC review team recommended that each program review its outcomes with the goal of improving their measurability and clarity. As stated in the HLC Advancement report: “Often, program level student learning outcomes contain multiple issues that not only make them difficult to understand, but also nearly impossible to assess. Starting with simple statements such as ‘Our graduates are able to….’ followed by one (and only one) action verb, and one (and only one) accomplishment will be both easier to understand and assess.” An example from the Marketing Department demonstrates an outcome that focuses on a single issue: “students will develop marketing strategies congruent with the current business environment.”
“Students will develop and demonstrate appropriate writing, communication, critical thinking, and analytical skills linked to subject mastery” is an example of an outcome that contains multiple issues and could be broken down into more than one outcome. More specificity would make the outcome clearer and easier to assess. For example, in a field where students are expected to do original research, an outcome like “students will be able to identify a problem, construct hypotheses, identify variables, construct operational definitions, create a research design, carry out statistical analyses and present the research in a poster at a conference” very specifically and clearly lists the expectations for student learning.
Outcomes at the academic program level should consider the institutional mission, asking the question “how are we supporting the mission of UNK?” It could be helpful to review the UNK Mission statement (http://aaunk.unk.edu/catalogs/current/int/mission.asp) and learning goals from the Phase I Strategic Plan (http://www.unk.edu/academicaffairs/index.php?id=23350).
Outcomes should be supported by the curriculum. Once we have a list of learning outcomes, it is important to evaluate whether students are given the opportunity to achieve these outcomes. Curriculum mapping, or mapping course level outcomes to department/program level outcomes, can be very helpful in gaining a better understanding of where we introduce and reinforce the learning related to outcomes within the curriculum. If students are not given learning opportunities, then either the curriculum should be revisited or the outcome should be revised or removed.
If any department or program would like more guidance as they revisit their student learning outcomes, please feel free to contact the Office of Assessment.
back to top
Assessment reports submitted using WEAVEonline
We are rolling out WEAVEonline at UNK! This year’s assessment reports will be entered directly into WEAVEonline. In an effort to make the transition as smooth as possible, the Office of Assessment staff entered each department’s/program’s assessment plan (mission, objectives, and measures) into WEAVEonline last year. This fall, faculty members responsible for assessment reporting will log in to WEAVEonline and enter their assessment findings and related action plans. We hope that faculty will find WEAVEonline to be user-friendly. To introduce WEAVEonline to new users, the Office of Assessment prepared training materials and ran training sessions.
WEAVEonline training sessions began September 15th, and a total of seven sessions were conducted. One faculty member from each department or program was invited to attend a training session; that person serves as the WEAVEonline “expert” and support person for the department/program, providing WEAVEonline training and support for others in the department who need to use the program to enter assessment results (for example, some programs have different people entering results for general studies and for majors). The deadline for entering department/program assessment reports into WEAVEonline is October 24th. We will try to return feedback to departments as soon as possible after that date; departments submitting their reports earlier will receive feedback within 1-2 weeks of submission.
To view information about WEAVEonline, go to: http://www.weaveonline.com/
back to top
Assessing Critical Thinking
Critical thinking is a student learning outcome that appears at every level at UNK, from the institutional strategic plan to the General Studies program and departmental courses. For example, the UNK strategic plan calls for a curriculum that develops “students’ ability and confidence to think critically”. In support of that institutional goal, one of the four major outcomes of the General Studies program is that “Students will demonstrate the capability for critical thinking, reasoning and analyzing”. Many programs and departments also identify the ability to think critically as an important outcome for their students.
Measuring critical thinking can seem daunting. In a collaborative and ongoing project at Washington State University that began in 1996, faculty have developed a “seven-dimension critical thinking rubric … to provide a process for improving and … measuring students’ higher order thinking skills during the course of their college careers.” The rubric is general enough to be used in many disciplines; in fact, the Biology Department at UNK is using the WSU critical thinking rubric to assess writing in its discipline. To learn more about this project and to view the critical thinking rubric, visit: http://wsuctproject.wsu.edu/ctr.htm.
back to top
Assessment Resources
The internet can be a treasure trove of resources for assessment. Many of these resources can be found at a site maintained by North Carolina State University Planning and Analysis: http://www2.acs.ncsu.edu/UPA/assmt/resource.htm. This site is a great first stop when you have questions related to assessment. Below, we have included a few links related to writing/revising student learning outcomes and developing rubrics.
Resources for writing/revising outcomes/objectives:
Writing Student-Learning Program Objectives: Building the Right Foundation for Assessment. A PowerPoint presentation by Stacey Street & E. J. Keeley, Office of Institutional Research, Eastern Kentucky University. http://www.ir.eku.edu/Assessment/docs/EKU%20writing%20objectives%20presentation%20Apr041.pdf
Creating Student Learning Goals: A Brief Guide for Faculty. From George Mason University Office of Institutional Assessment. https://assessment.gmu.edu/AssessmentLinks/Guide.html
Resources on developing rubrics:
The Winona State University assessment website has several resources that could help with rubric development, including this rubric sampler: http://www.winona.edu/air/resourcelinks/rubric_sampler.pdf. The website also houses a long list of example rubrics from a variety of disciplines: http://www.winona.edu/air/rubrics.htm.
Rubistar for Teachers (http://rubistar.4teachers.org/index.php) is an online tool for developing rubrics. You can also search rubrics within Rubistar to see examples of what others have created. Virginia Commonwealth University has an online module that walks a user through the steps of creating a rubric with Rubistar for Teachers http://www.vcu.edu/cte/resources/videos/Rubistar_tutorial/index.html.
back to top
Upcoming Assessment Conferences
The Office of Assessment sponsors travel stipends for faculty to attend conferences related to assessment of student learning. Funding is competitive, with preference given to faculty who are making a presentation related to assessment or faculty attending a conference or workshop on assessment. Applications should be submitted to the Director of Assessment prior to the event; awards will not be made after an event has occurred. Successful applicants will be asked to deliver their conference presentation, or to present on a topic related to the conference, at a Center for Teaching Excellence seminar.
Indiana University-Purdue University Indianapolis Assessment Institute
The 2008 Assessment Institute
October 26-28, 2008
Association for Institutional Research Assessment Institute
Registration for the 2009 Assessment Institute will open Nov. 1, 2008
American Evaluation Association Annual Conference
Evaluation Policy and Evaluation Practice
November 5-8, 2008
National ACademic ADvising Association (NACADA)
NACADA 6th Annual Assessment of Academic Advising Institute
February 18-20, 2009
Texas A&M University
9th Annual Texas A&M Assessment Conference
February 22-24, 2009
College Station, TX
Association of American Colleges and Universities (AAC&U)
General Education, Assessment and the Learning Students Need
February 26-28, 2009
National Association for Student Personnel Administrators (NASPA)
2009 NASPA Annual Conference: Nourishing Partnerships for Lifelong Learning
March 7-11, 2009
Higher Learning Commission Annual Meeting
Finding Common Ground: Accreditation, Assessment and Accountability
April 17-21, 2009
2009 North Carolina State Undergraduate Assessment Symposium
Aligning Pedagogy, Curriculum & Assessment
April 24-26, 2009
The International Association for Educational Assessment (IAEA)
2009 Annual Conference
September 13-18, 2009
back to top