The North Central Association accreditation visitation team’s preliminary report was received in August, and the final report in September. The report indicated that the visiting team was not satisfied with the progress made in assessing the General Studies Program. The team also noted that assessment of the Writing Intensive program needed to be undertaken as well.
The General Studies assessment strategy developed during the 2003-04 academic year was based upon a distributive education model and depended upon individual department assessment of their General Studies courses. By the time of the North Central accreditation team’s visit in April 2004, only four departments had participated in the assessment process, and there was no schedule for when other departments would participate. The only indirect assessment that had taken place was the use of the National Survey of Student Engagement. However, this was administered through the Senior Vice Chancellor’s Office and had not been correlated with the GSP Objectives.
Initiatives Undertaken in 2004
The Director of Assessment met with the GSP Director in June and discussed what needed to be done to develop an assessment plan. Two of the major elements discussed were the need for a long-term schedule of departmental assessment and the need for the GSC to devote more time to assessment. In August, the Director of Assessment met with the GSC Director and requested that the GSC Director either share assessment responsibilities with the Coordinator of Assessment or relinquish them to the Coordinator in order to accelerate the development of an assessment plan for General Studies. The GSC Director agreed to a co-chair arrangement for the GSC Assessment subcommittee.
The Director of Assessment met with the GSC Assessment subcommittee in late August and indicated that the previous assessment plan was not sufficient to accomplish an assessment process meeting the North Central Association guidelines. The Director of Assessment established a general timeline for the subcommittee: an assessment proposal was to be developed for review, analysis, and adoption by the GSC in October. The remainder of the academic year was to be spent developing the assessment plan, beginning implementation (where possible), piloting instruments, and working with departments to develop individual assessment plans, with the goal of fully implementing assessment of the GSP by the Fall semester of 2005.
A ten-point plan was developed by the Coordinator of Assessment and the Director of General Studies for review by the GSC Assessment subcommittee and the GSC. It was presented to the subcommittee in September and to the GSC in October. It was disseminated across the campus through the GSC minutes to the Faculty Senate, the Assessment Newsletter, and the Council of Chairs. Although it was broadly disseminated, it did not engender a great deal of discussion.
The GSC Assessment subcommittee, working with its co-chairs, began developing surveys, identifying norm-referenced tests, and working with departments on the individual department assessments. A number of activities were initiated, but communication among members of the subcommittee and the Council as a whole tended to be limited, and understanding of the initiatives was therefore lacking.
During the Spring semester, the Coordinator of Assessment had other priorities that needed to be accomplished, and the GSC Director continued to work on the assessment plan and initiatives with the assistance of the Director of Assessment. The GSC Director continued working with the GSC subcommittee through the Spring semester and developed a draft of the complete assessment plan for the April GSC meeting. In April, the Director of General Studies resigned, and the Director of Assessment assumed responsibility for working with the Council on assessment. The Director of Assessment met with the Council regarding how it wished to proceed. It was decided that assessment was important enough to have all members participate in the discussions and that a summer retreat would be held to work through the assessment plan draft developed by the Director of General Studies. The retreat was held in May (17th & 18th) and June (6th) with the purpose of reviewing and revising the plan and assessment instruments to further suit the needs of the GSC.
The following sections report on what was initiated during the 2004-05 academic year. Each section describes what was developed, whether it was implemented, an analysis of findings, and any revisions needed in the assessment instruments.
The assessment plan included the direct measurement of student learning through departmental assessment of General Studies objectives within the individual General Studies courses. This process had been adopted in 2003 but had not been fully implemented during the 2003-04 academic year. The adopted timeline stipulated that each department involved in General Studies Program instruction was to develop an assessment plan by May 1st, 2005, with the goal of administering the assessment during the Fall semester of 2005. Departments were encouraged to pilot or collect some initial data during the Spring semester of 2005. Several collected data from the piloting process as well as through implementation of their proposed assessment plan.
The co-chairs met with each department chair in December and January to present the assessment plan guidelines and a proposal for stipend funding to support faculty development of the assessment plans. The stipend funding was provided on the condition that faculty attend a session on assessment practice and development sponsored by the Center of Teaching Excellence. Eligibility also required the submission of a proposal describing the work to be done by February 15th, and the plans had to be completed by May 1st. The Director of Assessment paid $11,100 in stipends for the completed work. All departments, with the exception of one, had developed some type of plan by May 1st. The plans will be reviewed by the GSC beginning in June (and continuing until the review is complete) to determine whether they measure the GS objectives, have the required components, and identify the level of knowledge being assessed (using Bloom’s taxonomy). No assessment plan was made for the Safety Center course; the course has not been offered for 3-4 years, and it was felt that the appropriate thing to do would be to drop the course from the GS Program.
- Review plans and identify exemplary assessments
- Use Bloom’s taxonomy as a method to determine levels of knowledge being assessed and link to objectives on critical thinking
- Initiate a change of program form to drop the Safety Center course from the GSP
Four norm-referenced assessments were identified in September for potential use. The initial timeline called for evaluating the tests during the fall and piloting one in the spring, with adoption to follow. The tests were to be analyzed to determine their match with the GS objectives and the curriculum. The actual initial review of the tests began during the Spring semester. The tests selected for review included:
- College Basic Academic Subjects Examination (College BASE)
- Collegiate Learning Assessment (CLA)
- Academic Profile (AP)
- Collegiate Assessment of Academic Proficiency (CAAP)
Prices and initial introductory information were obtained by web search. The Director of General Studies selected the College BASE test for initial review and piloting. It was piloted in April by administering it to a group of students enrolled in a psychometrics course in the Psychology program. The students completed the full battery and were interviewed as part of a focus group review of the test. The test protocols were then analyzed by the test publisher, and test profiles were returned to the Director of General Studies. The student comments tended to question the length of time needed to complete the entire test and the validity of the reading assessment component.
The Director of Assessment attended the Higher Learning Commission annual conference in April. He visited with the representatives of each of the test publishers and obtained additional information about each test.
The GSC review of the test and the results at the May retreat concluded that the other tests should be evaluated during the coming year. Comments from the group included:
- the College BASE seemed to be the weakest test of the group
- the GSC should first analyze its entire assessment program to determine what a norm-referenced test could provide that is not already provided
- prior to actually administering a test, it should be shared with the faculty to build “buy-in,” and the logistics should be worked out first
- don’t buy a test just to be buying one
The GSC will review the other tests using the assessment plan guidelines during 2005-06.
Categorical Objectives Analysis:
The categorical survey was developed in the fall and piloted during December. The surveys were to be administered in 16 different courses across all of the required perspectives. The survey had two components: a student component and a matching faculty component. The intent of the two components was to enable the GSC to compare the survey responses of the students with those of the faculty. Very few faculty members completed the survey. The student surveys were piloted using different response formats to determine which might be most effective or practical: individual completion at computer stations, and on-line administration with and without in-class notification. The response rates for the on-line formats were quite similar whether or not the students were notified in class. The students who were administered the survey individually at computer stations were also asked to comment about the survey in a follow-up interview. They generally felt that it was easy to complete and easily understood.
During the Spring semester, the logistics for administration were discussed, and two possible methods were identified. One involved notifying the department chairs and asking them to forward the URL to faculty, who would then share it with students. Two departments were identified for this initial administration. One department refused to participate using that format; the other agreed to try it, but the response rate was not good, and questions also arose regarding the purpose of the survey. The other method was to disseminate the survey using the student announce list and request that students complete the survey sections for the GS categories they had completed. This method was not used because it was believed to be less accurate in obtaining student responses: students might complete sections of the survey for which they had not actually completed the coursework. This method would be simpler, however, and with some modification it can be tried during the next year.
The GSC members at the retreat in May reviewed the efforts (piloting and initial administration) and decided that the faculty component should be dropped, since faculty participation in both the piloting and the initial administration was very poor. The primary focus of the survey should be on student perceptions regarding their learning, and the items should be revised to focus more specifically on the objectives of the category rather than on the course. The members thought the survey was still a good idea but would need significant revision, with the initial thought being to revise it in the fall. Revision ideas include: an explanation for students of why the survey is being administered, more explicit instructions, a thank-you statement, and items that focus on the perspective objectives rather than on the course itself.
- Revise the survey and identify linkages with objectives and other surveys
Survey of Faculty Opinions of the GSP:
The faculty survey was disseminated on-line using the UNK announce list between March 3 and March 13. A final total of 102 faculty members responded, representing approximately one-third of the faculty. The survey results are included in Appendix ___. The GSC members in attendance at the May assessment retreat made some initial interpretations of the data as well as recommendations for initiatives that may be undertaken by the Council. They are as follows:
- There appears to be a lack of faculty community in regard to planning course instruction and planning across disciplines (a concern)-item 22
- General Studies courses are viewed as requirements that must be completed rather than as courses that accomplish the GS learning objectives (a concern)-item 7
- Students primarily learn about the GS Program through the catalog (a concern)-item 9
- The GS Program is seen as a fragmented list of course requirements (a concern)-items 10 and 11
- The GS Program is considered to be a static program that has not changed much since its inception (a concern)-item 18
- Administrative (resource) support for the program appears to be minimal (a concern)-items 19 and 6
- The CD component is viewed as a strength (a strength)-item 14
- There is recognition of efforts to internationalize the curriculum (a strength)-item 13
- The development of political, moral, and ethical dimensions is recognized by faculty as occurring (a strength)-item 12
- Faculty do not always have a sense of the purpose of the GS Program (a concern)-item 19
Recommendations:
- Sponsor panel/roundtable discussions hosting faculty who have taught courses across disciplines describing how to do it, experiences, logistical needs, etc.
- Identify administrative obstacles to interdisciplinary offerings such as student credit hour and FTE allocations, etc. and seek to remove the obstacles
- Develop interdisciplinary courses
- Promote mentoring of faculty (both adjunct as well as full time) in collaborative planning and instruction of GS courses within departments
- Coordinate syllabi and instruction where appropriate
- Identify departments doing an exemplary job of mentoring faculty teaching GS courses
- Identify departments doing an exemplary job of coordinating syllabi
- Address issues of administrative support for the GS Program:
  - Develop a budget for travel to conferences about general education curriculum and assessment
  - Develop initiatives for faculty development with a supporting budget
  - Address administrative obstacles to interdisciplinary course development
  - Address making the Director position more attractive to qualified faculty
- Publicize GS Program goals and objectives:
  - Develop a pamphlet describing benefits and goals of the GS Program for use at summer orientation
  - Revise the website to promote GS Program goals and objectives
  - Sponsor an orientation for new faculty about the GS Program goals and objectives
  - Sponsor panel discussions for faculty about the GS Program goals and objectives
- The GSP graduate assistant developed a pamphlet describing the GS Objectives and benefits. It was reviewed by the members in attendance at the May retreat and, with some revision, approved for distribution this summer at orientation. Two thousand copies will be made for distribution.
- Mary Daake, the Director of Academic Advising, will distribute the pamphlets and promote the GS Program during the orientation sessions. She is also modifying her presentation to more strongly emphasize the benefits and objectives.
- The Director of General Studies presented an overview of the GSP and distributed the pamphlet given to incoming freshmen at the new faculty orientation in August.
Assessment of the Assessment:
The survey should be revised to:
- Include length of service in the demographic characteristics
- Reword items to ask about respondents’ own perceptions rather than how they believe others perceive the program
- Eliminate the neutral (3) point in the Likert scale or add an “I don’t know” category
- Revise items to reflect future issues or changes in the program
Survey of Student Perceptions of the GSP:
The Student Assessment Committee developed the Student Survey to be completed by students during the early Spring semester. The survey was reviewed by the General Studies Assessment subcommittee, and the GSC was notified prior to the survey’s use. It was disseminated on-line using the Opinio format between March 25th and April 23rd; the student announce list was used, and all undergraduates were asked to participate. Of the approximately 4,900 undergraduate students, 607 chose to participate (a response rate of roughly 12 percent). This was considered to be a very successful first attempt at measuring student perceptions of the structure and governance of the GSP. The members of the committee shared their survey results with the GSC and at a poster session at the UNK Platte Valley Assessment Conference.
The members from the GSC who attended the May 17th and 18th retreat reviewed and analyzed the survey results. Several initial interpretations were developed:
- There was less “waffling” in the responses by students than faculty. This may have been a reflection of modifying the faculty survey scale
- Students tended to perceive the GS Program as an obstacle toward taking courses in their major
- Students perceived that faculty viewed teaching in the GSP as a burden
- Students wanted to see major changes in the program
- Some want to see it discontinued
- Many saw it as a set of course requirements rather than as learning goals to be accomplished
- Generally speaking, the members of the GSC interpreted the results as indicating that students did not understand the purpose(s) of the GSP
- There is a public relations problem regarding the efficacy of the GSP and thus more effort needs to be expended describing the benefits and purposes of an effective GSP. Some ideas included:
  - Development of a pamphlet describing the GSP benefits and objectives for distribution to students
  - Taking more time at orientation to explain the benefits and objectives
  - Possible revision of the catalog to more fully explain the program
- The pamphlet developed by Tanis Saldivar was reviewed and revised with the intent of trying it this summer
- The publication of the pamphlet was ordered and it will be used on a trial basis this summer at orientation
- Mary Daake, the Director of Academic Advising, revised her presentation to students and parents to include information from the pamphlet
Assessment of the Assessment:
The members of the GSC in attendance at the retreat revised the Assessment Plan to incorporate the Student Assessment Committee’s student assessment on an annual basis. The members also added a student focus group component. These changes were added to the June draft of the assessment plan. The survey could be revised for future administrations by:
- Changing item 6 to add “it did not help me decide on a major”
- Adding a comment section
- Adding items that focus on the quality or type of learning experiences
- Asking if courses were a replication of high school courses, an extension, or quite different
- Asking about rigor of coursework
- Asking if courses are presented at a higher level
- Recommending to the Student Assessment Committee that the updated survey be administered every Spring semester
A number of concerns were raised about the clarity of the objectives, of which there are apparently three levels. First are the General Objectives for the entire program; the question was raised as to whether they would be better used as goals. Second are the categorical/perspective objectives for each category. Third are additional indicators or objectives for each category; these were developed a few years ago but are not present in official documents such as the catalog. The organizational format of these objectives needs to be discussed and clarified.