CHAPTER 17: IMPROVEMENT DIMENSION

Key Performance Indicators
Current Situation
Opportunities and Challenges
Recommended Actions

Foundations Institutions conduct assessment and maintain associations with other institutions and relevant professional organizations in order to achieve ongoing first-year improvement.  This assessment is specific to the first year as a unit of analysis—a distinct time period and set of experiences, academic and otherwise, in the lives of students.  It is also linked systemically to the institutions’ overall assessment.  Assessment results are an integral part of institutional planning, resource allocation, decision making, and ongoing improvement of programs and policies as they affect first-year students.  As part of the enhancement process and as a way to achieve ongoing improvement, institutions are familiar with current practices at other institutions as well as with research and scholarship on the first college year.


Key Performance Indicators for the Improvement Dimension include: 

  • Assessment of High Impact Initiatives: the first-year initiatives impacting the largest percentages of students incorporate systematic assessment.
  • Using Assessment for Improvement: assessment results have been used to improve or confirm existing practice across first-year initiatives.
  • Student Success: recent assessment activities have improved campus understanding of the following elements of student success: student allocation of their time, student/faculty connections, student use of campus services, and student class attendance patterns.
  • Strategies and External Involvement: the following strategies have been used to improve the first year: attendance at higher education meetings, participation in multi-campus initiatives focused on the first year (e.g., collaborative projects, benchmarking efforts, and peer institution consortia), broad campus exposure to external experts, and broad exposure to campus-based knowledge/expertise about the first year.

Current Situation

Assessment of High Impact Initiatives

The primary focus for the Improvement Dimension was to closely examine five specific initiatives identified by the UNI FoE Steering Committee in order to determine whether assessments were being conducted for each and whether the results were interpreted and used to improve existing practices.  The findings of the committee's evaluation of each initiative are described in further detail below.  The initiatives evaluated include:

  1. Orientation/Registration Programs.
  2. Residence Life Programming.
  3. Liberal Arts Core Category I.
  4. First-year Advising.
  5. Institution’s Vision and Mission statement regarding personalized and engaging learning opportunities for all students.

Orientation/Registration Programs


Currently, orientation programs for first-year students are separately administered and assessed by the administrators and staff of those programs (i.e., Department of New Student Programs, International Services Office, and Jump Start) in an effort to better understand the needs and expectations of first-year students participating in the orientation process.  Consequently, there is no unified model for how to manage and evaluate first-year student orientation, making it difficult to compare or combine results.  The use of a satisfaction survey is one of the most common methods employed by these programs to obtain information. 

The 2008-2010 University Catalog identifies a freshman as any student who has earned fewer than 30 credit hours.[1]  However, the definition of a first-year student varies among undergraduate academic units and, as a result, may create discrepancies in the assessment process and inconsistency in the use of results. 

Residence Life Programming

Residence Life currently uses a variety of methods for assessing first-year students at UNI, both within the Department of Residence (internally) and through other University departments (externally).  Surveys, such as the Association of College and University Housing Officers-International (ACUHO-I)/Educational Benchmarking, Inc. (EBI) Resident Assessment, cover many topics pertinent to Residence Life programming, but focus on satisfaction rather than actual behavior improvements.  Asking students if their behavior has changed as a result of programming is the current method used for determining programming effectiveness and is addressed in several different surveys (i.e., the House survey and Immersion Project survey[2]).  These surveys evaluate perceived changes rather than actual outcomes.  The biggest challenges facing Residence Life are both to use existing assessment information carefully, as it measures perception rather than reality, and to create new assessments that also measure residents’ actual change in behavior.

The Residence Life program conducts many informal assessments among RAs, students, and Residence Life Coordinators (RLCs).  These assessments provide feedback to Residence Life staff on programming effectiveness.  However, only a small subset of the many surveys Residence Life uses applies directly to programming efforts; while some surveys are highly systematic, they do not necessarily systematically assess information useful to Residence Life.  Several surveys focus on student satisfaction with programming rather than on whether students' behavior changed as a result of it, suggesting the need for new questions to be included in such surveys.

According to conversations with Department of Residence staff, participation in some surveys (the House survey and Dive-In Days[3]) was poor and/or varied, ranging from 22-33%.  This makes the results difficult to use, since low response rates may yield an inaccurate representation of the student population.
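The representativeness concern above can be sketched numerically.  The counts and the 50% threshold below are hypothetical, chosen only to illustrate how a low response rate (such as the 22-33% range reported) would be flagged; they are not actual Department of Residence figures.

```python
def response_rate(respondents: int, population: int) -> float:
    """Fraction of the surveyed population that responded."""
    return respondents / population

def usable(rate: float, threshold: float = 0.50) -> bool:
    """Illustrative rule of thumb: treat samples below the
    threshold as potentially unrepresentative."""
    return rate >= threshold

# Hypothetical counts reflecting the 22-33% range from the text.
for name, n_resp, n_pop in [("House survey", 220, 1000),
                            ("Dive-In Days", 330, 1000)]:
    rate = response_rate(n_resp, n_pop)
    print(f"{name}: {rate:.0%} response rate, usable={usable(rate)}")
```

Under this (assumed) threshold, both surveys would be flagged for cautious interpretation rather than treated as campus-representative.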

Liberal Arts Core Category I

The Liberal Arts Core (LAC) Category I includes courses in writing, mathematics, oral communication, and personal wellness, one or more of which is completed by students during their first year of study or in advance of attending UNI.  The University has used the Measure of Academic Proficiency and Progress (MAPP) since 2006 for assessment of the LAC Category I.[4]


Category I courses are also reviewed by the LAC Committee, in conjunction with a faculty committee organized for that specific purpose, once every six years.  The LAC Committee uses the results of this assessment process to make recommendations to the University Faculty Senate and appropriate University administrators.  The Committee provides the Senate with a copy of the final review along with the LAC Committee Review Summary.  The committee found no evidence that assessment results were shared directly with faculty responsible for delivering these courses, or that recommendations were made in direct consultation with this group of faculty.

Some departments with a strong investment in LAC Category I courses have begun to conduct more frequent assessments of the Category I courses they teach as part of departmental student outcomes assessment, but there is confusion across campus as to who should be designing and implementing this assessment process: the departments in which the courses are housed or the LAC Committee.  As a result, there is no consistent, established, student outcomes-based assessment of Category I courses across campus.[5]

First-Year Academic Advising

A National Academic Advising Association (NACADA) document states that a systematic campus-wide advising evaluation program based on student outcomes assessment is needed.[6]  Assessment of first-year academic advising is being conducted throughout campus;[7] however, departments and programs providing first-year advising and programming are conducting assessment independently and without the ability to compare or combine results.  Approximately 66% of first-year students are in programs (through the College of Business and the Office of Academic Advising) that have developed and conduct systematic assessment.[8]  These assessments, as well as many of the assessments evaluated for this dimension report, focus on the specific topics and goals that are deemed important to the individual advising center or program.  Surveys being conducted on an annual basis containing questions related to first-year academic advising include: CBA first-year seminar pre- and post-assessment, Office of Academic Advising pre- and post-outcome survey, New Student Survey, student evaluations for freshman orientation, Jump Start program evaluation, NSSE data (Qs 10b & 12), student satisfaction survey, and the Peer Academic Advisor in Residence survey.

Vision and Mission – Personalized Learning Opportunities


The current UNI mission statement in the 2004-2009 strategic plan includes the following statement: “The University of Northern Iowa is a comprehensive institution dedicated to providing a personalized learning environment, founded on a strong liberal arts curriculum.”[9]  The focused mission statement on the same Web site states: “The University of Northern Iowa offers a world-class university education, providing personalized experiences and creating a lifetime of opportunities.”  As current UNI President Benjamin Allen put it in his September 2006 installation speech, “We must maintain our commitment to a personalized learning environment—an environment with small classes and substantial interaction between the faculty member and student.” [10] 

Institutionally, National Survey of Student Engagement (NSSE) data are the most thorough assessment of the University's success in creating a personalized learning environment.  While NSSE as a whole provides data related to student engagement, the NSSE data most specifically related to the creation of a personalized environment include the following:

1. Data from the NSSE Student-Faculty Interaction Benchmark Score

The NSSE benchmark score for Student-Faculty Interaction is a calculated mean for student responses to the following set of questions concerning how often students have:

  • Discussed grades or assignments with an instructor.
  • Talked about career plans with a faculty member or advisor.
  • Discussed ideas from your readings or classes with faculty members outside of class.
  • Worked with faculty members on activities other than coursework (committees, orientation, student-life activities, etc.).
  • Received prompt written or oral feedback from faculty on your performance.
  • Worked on a research project with a faculty member outside of course or program requirements.
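As a rough illustration of how a mean-based benchmark of this kind is computed, the sketch below averages one student's responses to the six items.  The item keys and response values are hypothetical, and NSSE's published benchmark scores additionally rescale item responses to a 0-100 range before averaging; this is only the underlying idea of a mean across items.

```python
def benchmark_score(responses: dict) -> float:
    """Mean of one student's item responses
    (e.g., 1=never ... 4=very often)."""
    return sum(responses.values()) / len(responses)

# Hypothetical responses to the six student-faculty interaction items.
student = {
    "discussed_grades": 3,
    "talked_career_plans": 2,
    "discussed_ideas_outside_class": 1,
    "worked_on_noncourse_activities": 1,
    "prompt_feedback": 3,
    "research_with_faculty": 2,
}

print(f"Student-faculty interaction score: {benchmark_score(student):.2f}")
```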

2. Data from the NSSE Supportive Campus Environment Benchmark Score

Three questions in particular relate to the creation of a personalized environment, in the sense of responding to individual needs.  Students are asked to what extent their institution emphasizes the following:

10b. Providing the support you need to help you succeed academically.

10d. Helping you cope with your non-academic responsibilities (work, family, etc.).

10e. Providing the support you need to thrive socially.

3. Student overall evaluation of their entire educational experience at the institution. (Q13)

While NSSE data have been made available to faculty and staff through a password-protected Web site, workshops relating NSSE data to Liberal Arts Core outcomes, and presentations to departmental faculty meetings, no evidence to date exists to suggest that these assessment efforts have been used in systematic ways to provide specific information for either personalized or engaged learning.

Many activities and services at UNI contribute to the personalization of learning, including the Academic Learning Center, the CHAMPS Life Skills course, and the use of e-portfolios in some first-year courses, but there is no evidence that these programs and activities are assessed to determine the degree to which UNI provides a personalized learning experience to first-year students.

Using Assessment for Improvement

The FoE faculty/staff survey results (see Table 17.1) show that while assessment is being conducted, the results are not widely used to shape the first-year student experience at UNI.  Only about 26% of faculty/staff rated the institution's ability to disseminate assessment results relevant to the first year of college in a timely manner as good or excellent, with a mean of 2.86 (Q93).  Faculty and staff also rated the institution only slightly higher in its ability to "assess what's relevant" (Q92: mean of 2.94) and in "using results for improvement" (Q94: mean of 2.90) in the first year of college.  These survey results are consistent with the lack of evidence of assessment-related improvements in the FoE evidence library and across campus as a whole.

Table 17.1 Faculty/Staff Survey – Improvement Dimension

Question # | Question Text | 1 or 2 | 3 | 4 or 5 | Mean
50 | To what degree are you engaged in the following professional activities focusing on the first year: attending conferences or workshops at this institution? | 67.9% | 19.7% | 12.4% | 2.05
51 | To what degree are you engaged in the following professional activities focusing on the first year: attending national/regional conferences or meetings? | 72.8% | 13.0% | 14.2% | 1.91
52 | To what degree are you engaged in the following professional activities focusing on the first year: reading professional materials? | 53.2% | 21.7% | 25.1% | 2.55
53 | To what degree are you engaged in the following professional activities focusing on the first year: presenting at conferences or contributing to publications? | 76.1% | 11.9% | 12.1% | 1.80
84 | To what degree has the following information directly influenced your work with first-year students: demographic information from this institution's databases? | 74.2% | 15.6% | 10.2% | 1.85
85 | To what degree has the following information directly influenced your work with first-year students: measures of pre-enrollment academic skills from this institution's databases? | 77.1% | 11.8% | 11.1% | 1.76
86 | To what degree has the following information directly influenced your work with first-year students: academic skills measured after one semester/quarter or more? | 67.5% | 18.4% | 14.1% | 1.96
87 | To what degree has the following information directly influenced your work with first-year students: measures of student time spent studying? | 73.4% | 17.3% | 9.3% | 1.83
88 | To what degree has the following information directly influenced your work with first-year students: measures of student alcohol consumption? | 77.5% | 13.0% | 9.4% | 1.76
89 | To what degree has the following information directly influenced your work with first-year students: current practices at other institutions? | 66.7% | 17.4% | 15.9% | 2.05
90 | To what degree has the following information directly influenced your work with first-year students: professional/published research? | 62.8% | 19.9% | 17.3% | 2.18
91 | To what degree has the following information directly influenced your work with first-year students: student evaluations, assessments, or feedback? | 38.3% | 27.1% | 34.6% | 2.82
(Scale for Q50-91: 1=Not at all; 2=Slight; 3=Moderate; 4=High; 5=Very High)
92 | Overall, please rate this institution's assessment capabilities relevant to the first year of college: assessing what's relevant? | 28.5% | 45.2% | 26.3% | 2.94
93 | Overall, please rate this institution's assessment capabilities relevant to the first year of college: disseminating results in a timely manner? | 34.1% | 40.2% | 25.8% | 2.86
94 | Overall, please rate this institution's assessment capabilities relevant to the first year of college: using results for improvement? | 33.8% | 39.0% | 27.1% | 2.90
(Scale for Q92-94: 1=Very poor; 2=Poor; 3=Fair; 4=Good; 5=Excellent)
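The percentage bands and means in Table 17.1 can be reproduced from raw 1-5 responses as sketched below; the sample responses are hypothetical and serve only to show how the "1 or 2", "3", and "4 or 5" columns and the mean are derived.

```python
from collections import Counter

def summarize(responses: list[int]) -> dict:
    """Collapse 1-5 Likert responses into the bands used in
    Table 17.1 and compute the mean."""
    counts = Counter(responses)  # missing values count as zero
    n = len(responses)
    return {
        "1 or 2": (counts[1] + counts[2]) / n,
        "3": counts[3] / n,
        "4 or 5": (counts[4] + counts[5]) / n,
        "mean": sum(responses) / n,
    }

# Hypothetical raw responses for one survey item.
sample = [1, 2, 2, 3, 3, 4, 5, 1, 2, 3]
print(summarize(sample))
```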

 

Orientation/Registration Programs

Although limited in scope, assessment results are being used informally by the Coordinator of New Student Programs, International Services, and Jump Start to critique and improve programs.  These results are typically used internally (within the department or college) and are not distributed for the purpose of collaboration to the university-wide community.  The Coordinator of New Student Programs uses the results gathered from the Summer Orientation Evaluation[11] and New Student Survey[12] to change and implement new programming as well as improve program sessions for first-year students and their parents (e.g., a resource page for parents was developed as a result of this assessment).  The International Services Office uses the results of their International Student Orientation Evaluation Form to change and improve programming for international students.[13]  Jump Start evaluations are primarily positive in nature, from both first-year[14] and transfer students,[15] but it is unclear how they are being used to improve orientation practices. 

Residence Life Programming

With a few exceptions, the assessments discussed earlier are used regularly by Residence Life staff as a decision-making tool for improving programming.  Some examples of findings made after reviewing current assessments include:[16]

  • The American College Health Association-National Collegiate Health Assessment (ACHA-NCHA) survey[17] confirmed the continued need for alcohol education.
  • House surveys identified that the handling of career-related issues could be improved, and revisions will be made to better communicate this information to students.
  • The Association of College and University Housing Officers-International (ACUHO-I) survey identified the need to address the areas of time management and related factors of academic excellence.  Changes in these areas include Houses being identified for positive academic achievement, the PAIR program’s responsibilities being changed, and grade-related bulletin boards being mandated.

Residence Life Coordinators (RLCs) are given priorities gathered from the previous year's ACUHO-I/EBI results, as well as other related assessment information, such as the House survey results.  In addition, RLCs provide feedback they have gathered from RA staff and through their expertise and experiences.

Liberal Arts Core Category I

Category I-related data from MAPP and NSSE were gathered into handouts and presented in workshops to faculty and staff during the fall semester of the 2008-2009 academic year.[18]  The workshops were not well attended, and it is not clear how departments, or the individual faculty who attended, are using the information they received.  As a result, the Director of Academic Assessment and the Liberal Arts Core Coordinator are visiting all departments to share and discuss these data during the 2009-2010 academic year.  There is no evidence that this information is being used in any organized, systematic way by the departments or faculty who teach Category I classes to make changes in courses or curricula.

No systematic assessments are being conducted for the Oral Communication, Writing, and Mathematics courses, and results from the LAC-generated Category I six-year review process are not shared in any systematic way with faculty responsible for teaching these courses.  Personal Wellness faculty have used student surveys to assess and make changes to content themes and to evaluate whether fitness outcomes are being achieved in the aerobic labs; however, for economic reasons, the majority of labs were suspended as of fall 2009.

First-Year Academic Advising

There is no formal and/or written plan in place for the analysis and use of academic advising assessment results at the institutional or individual levels.  Through anecdotal evidence, it was found that results from assessments are typically used internally (within a department or college) but are not distributed for information and/or collaboration to the university-wide community.  Recently, two assessments have been developed and implemented for purposes of advising improvements:

    1. The First-year Seminar for Business Majors Course Assessment – Pre and Post Assessment[19] and the Office of Academic Advising Pre and Post Outcome Survey of its new Intake model[20] are assessments of new programs.  These assessment tools were established and administered for the first time in the 2008-2009 academic year, so staff from these programs have not yet had the opportunity to review results and improve practices as necessary.  Discussions with both the College of Business and the Office of Academic Advising reveal plans to collect data on a continual basis, analyze responses, and use the results to determine improvements and changes to these programs as necessary.
    2. The New Student Survey conducted in fall 2007 was not conducted in fall 2008 due to the assessments administered for the Foundations of Excellence®.  Although the use of results has been limited and informal, the goal is to collect data to make annual programming changes and determine whether desired outcomes are being met.

Vision and Mission – Personalized Learning Opportunities

There is not a comprehensive, university-wide plan in place to improve the personalized learning opportunities for students at UNI.  The efforts undertaken in academic advising are the most advanced at this point, but not enough longitudinal data exists for that information to be directly used for making improvements in personalized learning.  There is also a need to connect the assessment of academic programs and courses to the critical review and analysis of the program/course, eventually leading to modifications or restructuring specifically aligned to personalized learning.  Likewise, there is no documentation to suggest that the NSSE data are being used to directly improve personalized learning opportunities of first-year students.

Student Success

The Improvement Dimension Committee focused on questions from the FoE faculty/staff survey that address the degree to which recent assessment activities have improved the campus's understanding of student success.  In response to the question, "To what degree has the following [assessment] information directly influenced your work with first-year students: measures of student time spent studying?" (Q87), 73.4% of faculty/staff indicated that assessment results had slight or no influence on their work with first-year students.  Furthermore, 76.1% of faculty/staff respondents did not use demographic information from institutional databases to improve their understanding of first-year students (Q84), and 77.1% did not use measures of pre-enrollment academic skills to aid them in assisting and teaching first-year students (Q85).  These responses indicate that most faculty and staff are not using assessment data in their work with first-year students.  The committee's hypothesis was that assessment information is not used because those data are not readily available to most faculty and staff.

Additionally, when faculty and staff were asked to what degree current practices at other institutions directly influenced their work with first-year students (Q89), 66.7% answered not at all or slight.  Similarly, 62.8% of faculty and staff said their work with first-year students was not at all or only slightly influenced by professional/published research (Q90).  When asked to what degree their work was influenced by student evaluations, assessments, or feedback (Q91), only 34.6% of faculty/staff responded high or very high.  Little evidence was found that faculty/staff systematically collect or use assessment results to improve their understanding of student/student connections or student/faculty connections, especially as these relate to the first year.  Some data do exist in the NSSE,[21] but the results are relatively recent, not widely distributed, and have not been used for process improvement.  The data, however, are useful for institutional benchmarking and as a baseline for future assessment.

Strategies and External Involvement

Approximately 68% of FoE faculty/staff survey respondents indicated they do not attend conferences or workshops at UNI focused on the first year (Q50).  Seventy-three percent indicated that they do not attend national/regional conferences or meetings focused on the first year (Q51).  While 46.8% responded moderate to very high to the question, "To what degree are you engaged in reading professional materials focusing on the first year?" (Q52), 76.1% indicated not at all or slight when asked, "To what degree are you engaged in presenting at conferences or contributing to publications focusing on the first year?" (Q53)  Additionally, no direct evidence was found that verified either broad campus exposure to external experts or broad exposure to campus-based knowledge/expertise about the first year.  In fact, during multiple conversations with faculty who participated in the FoE process, many expressed surprise at the number of activities and programs specific to first-year students, the amount of department-level/program-level assessment despite the lack of institutional coordination, and the lack of communication about institution-level data.


Opportunities and Challenges

  • There are currently no specified learning outcomes for the first year at UNI, and the university strategic plan and mission do not mention first-year students explicitly.  The challenge is to focus on what is salient and specific to first-year outcomes, to develop and maintain a clear channel of communication about the first year across UNI, and to nurture a culture that values consistent, ongoing assessment and improvement of first-year programs.
  • When assessment has been done, it typically has not been done for all first-year students in a unified manner.  There is not a common set of questions for comparison of first-year students’ experiences at UNI.
  • There is an opportunity to align institutional perceptions with individual reality.  While assessments are being done for several of the initiatives studied by this dimension, these data are not being consistently disseminated to those who work with first-year students.  The dissemination of data, along with campus conversations about it, may significantly improve campus understanding of the elements of student success in the first year.
  • The opportunity exists to develop a core of faculty/staff who are interested in improving the first-year experience.  The challenge will be to support interested faculty, staff, and administrators with the time and/or resources that will allow them to participate in professional activities, read and review research, conduct and make use of assessments, and align curriculum with student outcomes.

Recommended Actions

  1. Assessment Activities within Orientation Programs Should Be Standardized throughout All Programs to More Fully Assess Expected Student Outcomes

    1. Assessment results should be made available to the campus community.
    2. The coordinator of new student programs or a committee should be responsible for monitoring and evaluating all orientation assessment activities.
  2. Assessment Activities within Residence Life Should Be Strengthened through the Development of a More Formalized Assessment Review Process

    1. A department-wide assessment committee should be instituted.
    2. Efforts should be made to improve the quality of assessment of learning outcomes of educational programs, not just student satisfaction.
    3. Springboard houses should be used to further assess services offered and program effectiveness in residence life and throughout all first-year programs.
  3. The Current Process for Evaluating the Liberal Arts Core Should Be Refined to Clarify Student Outcomes for Each Category

    1. Annual assessment activities should be developed that specifically evaluate those outcomes.
    2. A systematic, coordinated effort should be put in place in collaboration with faculty to ensure those results are used for improvement.
  4. All Assessment of Academic Advising Should Be Coordinated by a Designated Individual or the Undergraduate Academic Advising Council

    1. A written assessment plan should be developed that includes intended outcomes, strategies to assess outcomes, and a plan to use the results for improvement.
  5. A Common Definition of What It Means to Offer a Personalized Learning Experience Should Be Developed, and a Plan to Assess Student Engagement in Such Experiences during the First Year Should Be Initiated

  6. Assessment Plans for the Five First-Year Initiatives Should Be Widely Disseminated to Students, Faculty, and Staff

    1. Efforts should be made to seek formal and informal feedback.
    2. Emphasis should be placed on sharing information so that departments may gauge their relative effectiveness and be identified for distinction.

[4] https://assessment.uni.edu/