How surveying campus personnel helps you better serve your students
Associate Vice President of Retention Solutions
April 5, 2012
Many campuses have regular assessment plans that focus on student satisfaction. Certainly capturing and documenting student satisfaction is a key variable for accreditation activities, strategic planning, and retention efforts. However, by adding an assessment of the perceptions of your campus personnel, you can gain valuable insights into the priorities of your faculty, administration, and staff regarding the student experience.
As I work with campuses on their assessments, I have found that capturing the perceptions of campus personnel can provide context for student perspectives and can help campuses position the student data when findings are presented on campus. These administrator, faculty, and staff perceptions may be directly and indirectly communicated to students through day-to-day interactions. These data can also tell you whether campus personnel will be surprised by the student results or whether an area has already been identified as a challenge.
Let’s take a closer look at national data from campuses using the Institutional Priorities Survey™ (IPS) to see some examples of how this can play out.
In recent blogs, I have looked more closely at the four-year private and public data, so today I would like to share national data from students and campus personnel at community, junior, and technical colleges. (Similar data are also available for four-year institutions and two-year career schools, and the examples I’m going to give about using comparative data are relevant to all institution types.) The Student Satisfaction Inventory™ (SSI) asks students to indicate their satisfaction as well as the level of importance they place on each item, using a seven-point scale on which a rating of seven means very satisfied or very important. The IPS asks campus personnel to respond to parallel items about the student experience with their level of agreement that the institution is meeting the expectation, as well as an importance score, on the same seven-point scale. (For example, on the SSI, one item reads “I am able to register for classes with few conflicts,” while the parallel item on the IPS is “Students are able to register for classes with few conflicts.”) Areas that are both very important and have high satisfaction or agreement scores are strengths, while items with high importance and low satisfaction/agreement responses are institutional challenges.
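To make the strength/challenge logic concrete, here is a minimal sketch in Python. The cutoff values and the item scores below are illustrative assumptions for demonstration only, not Noel-Levitz’s actual scoring method or data:

```python
# Illustrative sketch: sorting survey items into strengths and challenges.
# Thresholds and scores are hypothetical, not the published methodology.

items = {
    # item text: (mean importance, mean satisfaction/agreement), 7-point scale
    "Nearly all faculty are knowledgeable in their fields": (6.4, 5.9),
    "Academic advisors are knowledgeable about program requirements": (6.3, 5.0),
    "The amount of student parking is adequate": (5.2, 4.1),
}

IMPORTANCE_CUTOFF = 6.0    # assumed: item is a priority for respondents
SATISFACTION_CUTOFF = 5.5  # assumed: divides high from low satisfaction

def classify(importance, satisfaction):
    """Strength = high importance + high satisfaction;
    challenge = high importance + low satisfaction."""
    if importance < IMPORTANCE_CUTOFF:
        return "lower priority"
    return "strength" if satisfaction >= SATISFACTION_CUTOFF else "challenge"

for text, (imp, sat) in items.items():
    print(f"{classify(imp, sat):>14}: {text}")
```

Running the same classification on both the student (SSI) and personnel (IPS) responses is what lets you compare where the two populations’ strengths and challenges overlap.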
When I first look at combined SSI and IPS data, I see which items overlap as strengths and challenges and which items may have the opposite indicators. The following items are viewed as strengths in the 2011 community college national data sets for students and campus personnel:
- Nearly all faculty are knowledgeable in their fields.
- Students are able to experience intellectual growth here.
- The campus is safe and secure for all students.
- There is a good variety of courses provided on this campus.
- Program requirements are clear and reasonable.
- Students are made to feel welcome here.
There is much to be proud of on this list, with areas that all levels of the campus view as positives. If these were a single campus’s results, the institution could celebrate these strengths, knowing that students and campus personnel alike have given these items a thumbs-up.
In the national data set, one item is viewed as a challenge by both students and the campus personnel:
- Academic advisors are knowledgeable about program requirements.
This is a priority area not only for students, but for faculty, administrators, and staff as well. The leadership of a college could move forward to improve this item with the knowledge that everyone is on board with this being an issue that needs to be explored and resolved.
Another interesting finding in the national data set is one item which students view positively as a strength but campus personnel perceive to be a challenge:
- The quality of instruction in most of my classes is excellent.
This presents an opportunity to discuss this perception with the campus personnel, both to let them know that students believe the quality of instruction is an overall strength and to determine why faculty and staff may feel that there is room for further improvement.
There are other items on each list that are not identified by the other population. Often this is because students and campus personnel place different emphasis on what is important. This highlights another interesting way to explore the data: comparing importance rankings. The average scores reported on the individual surveys can be converted to rank indicators to see how important each item is relative to the others. Out of the 50 items that cross over between the surveys, we can focus on those that students and campus personnel have ranked differently by 10 or more spots. For example:
- Classes are scheduled at times that are convenient for me.
- I am able to register for classes with few conflicts.
- Faculty are usually available after class and during office hours.
- On the whole, the campus is well-maintained.
- The amount of student parking is adequate.
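The rank-comparison step described above can be sketched as follows. The importance scores are made-up examples, and with only three items the gap threshold is scaled down from the 10-spot cutoff that applies to the full 50-item list:

```python
# Hypothetical sketch: convert mean importance scores to ranks and flag
# items where student (SSI) and personnel (IPS) ranks differ widely.
# Scores are illustrative, not actual national data.

ssi_importance = {  # item -> mean importance reported by students
    "Classes are scheduled at times that are convenient for me.": 6.5,
    "The amount of student parking is adequate.": 6.2,
    "The college shows concern for students as individuals.": 5.6,
}
ips_importance = {  # same items, mean importance from campus personnel
    "Classes are scheduled at times that are convenient for me.": 5.4,
    "The amount of student parking is adequate.": 5.1,
    "The college shows concern for students as individuals.": 6.6,
}

def ranks(scores):
    # Rank 1 = most important; ties broken by insertion order.
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {item: i + 1 for i, item in enumerate(ordered)}

GAP = 2  # with the full 50-item crossover list, the cutoff would be 10
ssi_rank, ips_rank = ranks(ssi_importance), ranks(ips_importance)
for item in ssi_rank:
    gap = abs(ssi_rank[item] - ips_rank[item])
    if gap >= GAP:
        print(f"rank gap of {gap}: {item}")
```

Items flagged this way are the ones where the two populations’ priorities diverge most, which is exactly where a campus conversation about the data is most useful.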
These are all priority areas for students that are not given equal priority by campus personnel. Faculty, administration, and staff are focused on things such as the college showing concern for students as individuals, security staff responding quickly, and the admissions staff being knowledgeable, which are all important. However, it is also critical to understand how important access to classes is for students, as reflected in the data.
I encourage you to review your current assessment plan and to consider expanding your surveying to include your campus personnel for at least one survey cycle. Most campuses I am working with administer the SSI and IPS at the same time of year to capture a specific moment in time on campus. The IPS works especially well with online administrations since campus personnel regularly use e-mail for communication on campus. Responses to an e-mail invitation and reminder messages average around 50 percent for faculty, administration, and staff, which is a very high response rate.
If you have any questions about surveying students or staff, or taking action with satisfaction data, please e-mail me or leave me a comment.
Note: the National Community College SSI data are based on 186,038 student records from 198 institutions during the fall of 2008 through the spring of 2011. The IPS data reflect 7,299 campus personnel records from 43 institutions during the same time period. Additional national data sets can be found on the Noel-Levitz website.