Early insights from the ACES benchmarking survey

There’s still one month left to submit data to the Business Affairs Forum’s Administrative Cost, Efficiency, and Structure (ACES) benchmarking platform, but we’ve generated some broader insights from the inquiries and responses received so far. The number of ACES participants continues to grow, and we will identify specific data trends after the survey submission window closes on August 31.


Participants appreciate a 360-degree perspective on admin spend and operational performance

Higher education leaders recognize the importance of using data to gauge efficiency and to make strategic decisions. But how can administrators identify which targets to aim for? ACES is designed to answer that question and to provide comparison with peers, both now and in the future.

While some benchmarking initiatives restrict their focus to spending on staff, ACES balances staffing metrics with a look at the efficiency and quality of the work being done. An exclusive focus on headcount may not provide a full picture: a low headcount does not guarantee that work is being performed well, and the amount spent on FTEs can differ for any number of reasons. ACES provides that fuller picture, supporting better decision-making by highlighting opportunities for improvement.

Preliminary observations from the ACES benchmarking survey

1. The fastest submissions came from smaller, highly centralized institutions.
Factors like current staffing levels, staff capacity, and maturity of electronic systems certainly influence the time it takes an institution to complete ACES. However, we hypothesize that smaller institutions are able to track more metrics in one central location, and thus complete ACES faster than their larger counterparts. Central unit staff at participating institutions typically complete ACES and have quick access to data and reports to help them answer the survey questions. Centralized access to data seems to be more prevalent at smaller institutions.

2. Larger institutions still find value in ACES, particularly when exploring options like shared services centers, but may need to rely on educated estimates to complete the survey.
We have heard that entering the FTE counts and work distribution metrics for “decentralized units” is more challenging at larger schools, given that staff in central offices are likely to be tasked with completing ACES and do not have readily available data from the units. One strategy that participants have found helpful is to obtain data for two or three average-sized units (e.g., school of engineering, school of business), then multiply the average of those figures by the total number of decentralized units to estimate the “decentralized units” columns, as sketched below. Educated estimates are better than no data!
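As a minimal illustration of that sampling-and-scaling estimate, the sketch below averages a few sampled units and scales up by the unit count. All unit names and FTE figures are invented for the example; ACES does not prescribe these values.

```python
# Hypothetical sampling-and-scaling estimate for decentralized FTE counts.
# All unit names and figures below are invented for illustration only.

sampled_units = {
    "school of engineering": 14.5,   # administrative FTEs counted in this unit
    "school of business": 11.0,
    "school of arts and sciences": 12.5,
}

total_decentralized_units = 9        # assumed number of decentralized units

# Average the sampled units, then scale by the total number of units.
average_fte = sum(sampled_units.values()) / len(sampled_units)
estimated_total_fte = average_fte * total_decentralized_units

print(f"Average FTEs per sampled unit: {average_fte:.1f}")          # 12.7
print(f"Estimated decentralized FTE total: {estimated_total_fte:.1f}")  # 114.0
```

The same averaging approach can be applied to other per-unit survey metrics when unit-level data is not readily available from decentralized offices.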

3. Structural similarities may be the most important factor when determining a peer group.
Some participants have expressed concerns that the sample size will not include enough institutions with certain shared characteristics (e.g., faith-based affiliation, geographic location, Carnegie classification). While peer group comparisons across those dimensions may inform qualitative or cultural decisions, we designed the survey to measure operational activities through a more clinical lens (e.g., looking at work distribution in addition to volume of work) to enable comparisons agnostic of traditional peer groups. When selecting metrics for ACES, we obtained guidance and feedback from a couple dozen chief business officers (CBOs), who recommended the metrics that were most interesting, transferable, and accessible.
