2006-2007 Assessment Report

Assessment Methodology

The purpose of this assessment project was to improve UPA products, processes and services, to identify ways to become more efficient and effective, and to determine UPA’s strengths and weaknesses.

The goals for this project included:

  • By February 15, 2007, assess UPA to determine UPA’s strengths and weaknesses and the needs of UPA’s constituencies and clients.
  • By April 30, 2007, develop an action plan based on assessment findings and develop a new overall assessment plan (including outcomes and measurements) for UPA.

UPA staff members discussed UPA’s mission, goals and objectives during the fall of 2006. Although there were many issues of interest to UPA, the staff narrowed the focus of the assessment to the following three questions:

  • For what purposes do UPA’s clients use its information, services and resources?
  • Among all of UPA’s data, information, and services (existing or needed), which do the clients think should be given the highest priority?
  • How can UPA improve in order to fit the clients’ needs better?

Focus groups were selected as the method for addressing these assessment questions, and those who most frequently use UPA’s information, services, and resources were invited to participate. The staff developed a list of those they had worked with during the prior 12 months, and those most often listed were invited to one of five focus group sessions. Joni Spurlin and Nancy Whelchel developed the focus group guide, which outlined specific questions and prompts. Nancy Whelchel or Carrie Zelna facilitated each of the focus groups. Melissa Godwin wrote important items on flip charts during all the sessions. Joni Spurlin or a graduate student took notes during the sessions.

Those who attended one of the five focus groups conducted during January 2007 included:

  • 10 Vice Chancellors or directors of non-academic units
  • 5 Vice Provosts
  • 12 Deans or Associate Deans
  • 8 Department Heads
  • 8 faculty or staff, including assessment professionals
  • (The Provost was interviewed separately.)

Summary Of Results From Focus Groups

Following is a summary of the findings. These findings are not in any specific order; all are of importance.

Positive Comments:

  • UPA is Wonderful!
  • Data is useful and we know it is there!
  • Staff is excellent, timely, and does well with the volume of work needed.
  • Impressed with what UPA is doing.
  • Well-integrated system with planning and discussions with college and administration.
  • Layout of data/website is a strength – well organized; well done.
  • Good people to work with!
  • Great customer service!
  • With current staffing, UPA could not do more. UPA needs to increase its staffing.

Challenges, Needs of Clients, Suggested Improvements:

Based on the focus groups, four major categories of challenges were identified. Below is a short explanation of the issues in each category, as expressed by those who attended.

Others’ Understanding of Processes/Data:

Others on campus, especially department heads, need to have increased knowledge about UPA’s services and information. Most clients expressed the need to have an increased understanding of data definitions. There is a need to ensure that everyone knows how to correctly use the provided data for decision-making.

Those who attended the focus groups suggested that UPA take a stronger role in consulting and challenging upper administration, colleges’ administration and those who lead major units. UPA needs to help others “ask the right questions.” UPA should increase its visibility with those who are unfamiliar with what UPA offers.

Data Accuracy:

There is a concern that some clients mistrust the data provided by UPA because it does not match data from other units or sources. Some clients expressed strong concern that the multiple sources of data do not match. They gave examples of these concerns. The examples centered on the differences between client-gathered data, source unit-gathered data/reports (e.g., data reported by graduate school, HR, R&R), and UPA data/reports. Some specific examples included the following:

  • One person said: “I can list, by name, the faculty members in my department. However, the reports about the faculty in my department such as the number of faculty, the faculty FTE and the percentages in each ethnic group do not match my list of faculty members.”
  • Others in the focus group stated that it is critical to use the data from both the Graduate School’s data system and UPA’s reports for graduate program reviews and external program accreditation. However, the data from these two sources do not always match and many users do not understand the differences in the reported numbers.

Other issues about data accuracy stemmed from not always understanding the data definitions, when the data were collected, and how the data were analyzed. Another concern was how data from the new ClassEval system would be analyzed and reported.

Clients’ Need For More Help To Answer Their Specific Questions/Issues:

Department heads said that they could not use UPA data or services for their day-to-day or month-to-month decisions at the department level. They felt UPA data/reports were good for identifying trends and for use at the college and university levels, but not for department-level decisions.

Many expressed that UPA is excellent at helping its clients determine their questions and providing data to address those questions; however, sometimes clients do not know where to begin in determining their questions. The clients therefore felt they should have access to the same data that UPA uses and be able to “mine” the data themselves. Many felt that mining the data themselves would further their understanding of what issues and questions to raise in the future.

Some clients suggested that UPA should spend more time consulting individually with units who don’t know what their goals are or what measures to use. Through identifying units’ goals and needs for measurements and data, UPA, over time, may see ways of consolidating data collection across units and thereby improving the available datasets and reports. However, the focus groups recognized that in order for UPA to conduct more individual consulting, UPA would need to hire more personnel. Some suggested that these newly hired staff members could be assigned to individual clients for 4-6 months so that they can be better aware of the client’s area and special concerns. These newly hired staff should be under UPA supervision and guidance but be available to provide more intense assistance to specific units. Some clients would be willing to pay for this type of service; others would not. Others suggested that UPA consider collaborating with the Institute for Advanced Analytics and its graduate students to help UPA and UPA’s clients.

Others suggested that UPA collaborate with units to increase the usefulness of data, to help clients understand which tools are appropriate to use, and to show how to make appropriate comparisons to other groups. UPA should improve end-users’ ability to query and search UPA datasets and give clients access to a dynamic, navigational datamart.

Others suggested that UPA should be more involved with the major committees on campus. UPA could help major committees by: 1) assisting with deciding on “what are the right questions,” 2) determining where there are existing data, 3) defining ways to collect needed data, and 4) advising on how to interpret data.

Those in the focus group gave several examples of decisions that they felt could not be made with their current understanding of UPA data/information/services:

  • How many course sections are needed, how many seats are needed, and what staffing is needed?
  • How can academic programs be improved? How can they meet retention targets? Improve admissions/financial aid services? Improve DE services/offerings?
  • What are best prediction models for student success at program, college and university levels?
  • How is retention in specific courses (e.g. CHE 101) and retention of majors related? How does the way courses are taught affect retention?
  • How are internal transfers affecting specific departments or colleges? Internal transfers are complex: what are the issues related to race and gender, by programs or colleges?
  • How can department heads make mid-year program corrections?

Institution-Wide Data-Driven Decision-Making Needed:

Those in the focus groups suggested that UPA take a more active role in addressing institutional issues, by helping define questions, gather data, analyze the data and make structured text reports (not just numbers). UPA should encourage data-driven decisions. They gave examples of institutional questions that UPA could help address; these included:

  • What are the impacts of graduate and research programs on resources and on improving student learning?
  • How are resources being over- or under-utilized? Why are classrooms empty on Fridays?
  • How many students are working and how much? How do students’ working patterns affect use of classrooms, lab spaces, on-line courses? How do these patterns affect retention and student learning?

Key Emerging Issues:

The clients in the focus groups were asked to express what they considered emerging issues that their units or the university will be facing in the near future. Below is a summary of the six categories of issues they identified.

Capacity/Resource Needs For Undergraduate And Graduate Students:

Key issues related to better predicting the need for resources included the need to: 1) improve prediction models of student retention and enrollment patterns; 2) hold administrators at the college and departmental levels accountable for enrollment targets; 3) improve the ability to predict the number of program and general education course sections needed; and 4) improve communication throughout the institution when developing new programs or eliminating programs.

Impact of Distance Education/ On-line Learning:

Key issues related to the impact of distance education included the need to: 1) better define and measure distance education so that its impact on enrollment, graduation rates, credit hours, etc., can be calculated; and 2) improve understanding of how increased use of on-line learning will affect predictions of success, resource allocations, and tuition and fee structures.

Better Models To Predict Student Success:

Key issues in developing better models to predict student success included the need to: 1) define student success at various levels; 2) determine the impact of specific student-related programs/services on student success at both the college and institutional levels; 3) gather more data points on enrollment (e.g., middle- and end-of-semester enrollments) to better determine where and why students are lost; 4) make better use of what the research literature indicates are reasons students are retained (e.g., parents’ education levels); and 5) improve understanding of which variables NCSU uses to predict student success (e.g., if SAT scores do not predict success, why use them?).

Efficiency And Effectiveness:

Key issues raised by those in the focus groups included: 1) how to determine the most effective use of resources to help improve student success; 2) how to define efficiency of services; 3) how to make business decisions with limited services; 4) what data UPA can provide to help make effectiveness and efficiency decisions; and 5) how to improve survey efficiency (e.g., should fewer surveys be conducted at the individual unit level, should UPA coordinate the number of surveys conducted across campus, etc.?).

Reporting of Accountability Performance Indicators:

Key issues raised by those in the focus groups included: 1) determine the most important institutional-level analyses needed for accountability purposes; 2) define the best effectiveness measures; 3) determine the effect of US Department of Education decisions on reporting needs; 4) determine how to prove that NCSU is producing a quality education for the price; 5) decide which outcomes NCSU should measure institutionally; and 6) calculate the impact on UPA resources as needs for external reporting increase.

Improved Data-Driven Decision-Making Process:

Key issues raised by those in the focus groups included: 1) how NCSU units can best use data for decision-making; 2) the recognition that improved decision-making requires expanding UPA processes to be more collaborative with others, continually asking what clients need to know and what is needed to answer their questions; and 3) the recognition that UPA is in the best position to see patterns in data gathered from multiple sources.

Issues That Have Been Found Multiple Times From Past And Current Assessment:

UPA continues to receive high marks from upper administration and UNC General Administration for the accuracy and usefulness of its data and for its response time. Timeliness of data has not been raised as an issue.

Others’ lack of understanding of the data was found in past assessments and continues to be an on-going issue. Many of UPA’s clients, especially department heads, need a better understanding of what data, information, reports, and services are available; how to use them; and how to interpret information (including survey data) for unit-level decision-making.

Another issue that has been raised over the years is the need for UPA to work with constituencies and produce more topical reports or analyses. Others suggest that UPA provide more interpretation of data.

UPA staff members have noted over the years that more and more administrators and constituencies are using and expecting UPA data, reports, and services. The downside is that the more information is made available, the more additional information is wanted. UPA continues to discuss how to balance these increasing expectations without compromising the accuracy, timeliness, or usefulness of data, reports, information, and services.

Action Plan Based On Assessment Findings

Suggested Action Items:

UPA staff members discussed the above findings during a day-long retreat and several subsequent meetings. Below is a list of suggested methods to improve UPA.

Improve Communication:

The UPA staff’s discussion about improving communication involved three issues: 1) how best to define the data, variables, and reports, and how to guide the interpretations clients make based on their own assumptions and understandings; 2) how to improve training so that users understand the data, variables, definitions, etc.; and 3) how consulting can improve communication.

UPA staff members have posted data definitions on the website and have provided training to various groups as needed or requested. Each staff member has provided individual consultations when asked. The conversation revolved around what more could be done.

Some suggestions to improve clients’ understanding about the data included:

  • Incorporate more data definitions on UPA website within reports.
  • Develop a guide to all NCSU data, not just what UPA provides.
  • Give examples of how to interpret data and reports.
  • Add new SIS definitions.

Some suggestions on how to develop more pro-active training included:

  • Continue to offer orientation to newly hired administrators on an annual basis.
  • During training, improve explanation of what UPA expects users to do with specific reports and information.
  • During training, clarify what each staff member does and services each person offers.

Some suggestions on how to develop more pro-active consulting included:

  • Develop pro-active consulting with upper administration/colleges/units about what types of data they need, what types of decisions they need to make, and how UPA can help provide answers, data, information, and services.
  • On the UPA website, improve the explanation of the services UPA staff members provide.

Improve Data Accuracy:

The UPA staff feels confident in its ability to provide accurate data. Much of the discussion about data accuracy is, in reality, about improving communication, as discussed above. One important suggestion for ensuring the quality of data, reports, and information from UPA is for the staff to make a determined effort to improve the accuracy, quality, and consistency of data from the original data sources, and to address unnecessary duplication of information across various units, such as Registration and Records, the Graduate School, and Human Resources.

Improve Business Intelligence at NCSU:

The discussion addressed how UPA’s expertise can be used to advise on models, processes, and policies so that other university decision makers have the data needed to make decisions, even when UPA does not have or control the data. The points made during the discussions of improving communication and improving data accuracy also bear on improving business intelligence at NCSU.

Develop Dynamic Datamart:

The way data, reports, and information are communicated continues to change and improve. UPA changed from paper reports to web-based reports in 1999. UPA’s next improvement was an interactive planning website, launched in 2002. The next planned improvement is to work with UPA’s constituencies to conceptualize, design, and implement a dynamic Datamart website with a navigational query system and downloadable results, in order to meet demands for “just in time” decisions and to facilitate formatting of data to meet multiple constituencies’ needs.

Expand Survey Capacity:

The assessment results continue to show that more and more surveys are being conducted at every level of the institution. The UPA staff discussed how it can improve its survey services. More resources are needed to provide expertise and to coordinate surveys across the campus. In addition, as assessment activities continue to grow, so does the need for expertise in this area.

Develop Institutional-Level Assessment And Accountability:

The UPA staff discussed the implications of the US Department of Education’s push toward accountability and transparent data for each higher education institution. It seems likely that NC State will need to demonstrate its effectiveness to the UNC System, parents, alumni, students, and other constituencies. Part of this accountability will be providing transparent data about how well students are achieving outcomes as defined by units throughout the institution. It is likely that UPA will coordinate with others to develop institutional-level assessment and accountability processes.

Final Action Items:

Based on these suggestions and other information, UPA’s Compact Plan for 2007-2010 was developed. See the Compact Plan on UPA’s website.
UPA’s on-going assessment plan with outcomes and measures is to be developed during 2007-2008.