IT Forum Perspectives

Early trends emerge from the IT Functional Diagnostic

Earlier this summer, I wrote about the impetus for the IT Functional Diagnostic and the various ways CIOs could use the tool.

The number of responding institutions continues to grow, enabling more complex analyses and cohort development. In the meantime, I wanted to share some preliminary analysis based on several dozen IT Forum member responses, focusing on the three member questions that originally pushed us to create the IT Functional Diagnostic:

1. CIO reports: We often hear that CIOs are taking on broader responsibilities outside "traditional" IT functions; which "non-IT" offices are most likely to report to the CIO?

2. Degree of centralization: We know that distributed IT presents interesting challenges and opportunities for CIOs to support campus; what relationship does the size and complexity of an institution have with distributed IT?

3. Maturity and urgency map: Of all the possible functional capabilities described in this model, where are the greatest membership-wide opportunities for investment and growth?

Respondent CIOs have received a customized report summarizing the results from their institutions and IT teams across North America, but here we can provide some insight on membership-wide trends.

1. CIO reports

The structure of reporting lines can impact the effectiveness of IT across a number of critical capabilities, and the chart below shows how frequently certain functions report to the CIO. We weren't surprised to see that Information Security was a very common report for members, but we were intrigued to find that Academic and Instructional Technology reports through the CIO as often as Project Management does, and that while 70% of BI and Analytics functions report through the CIO, only 4% of institutional research functions do.

Some of these reporting lines may have developed organically on campus, with responsibilities assigned based on personalities or to accomplish past, narrowly scoped initiatives; however, those legacy structures impact operations today. The first step toward implementing intentional reporting lines is understanding what the current structures are and how they can and should support campus goals. There will be some variability across institutions as they pursue nuanced goals, but it is a worthwhile effort to review reporting lines to ensure they are best set up to support success.

We specifically asked members to identify non-IT reporting lines (i.e., those other than core functions such as networking, applications, and the help desk) to shed light on the less obvious connections.

2. Degree of centralization

Our members have told us that distributed IT across campus, although often necessary, complicates their work. But beyond complication there are often critical questions about the current structures: Are they close to acceptable benchmarks? How does IT’s distribution impact critical functions?

By tracking the percentage of IT full-time equivalent staff (FTEs) in central and distributed units alongside other institutional factors and functional maturity, we have started to build models and benchmarks for distributed IT’s size, role, and impact on functional capabilities.

One of the first things we can report is that as the total (central and distributed) number of IT staff grows, the number and share of those staff who work outside of central IT grows very quickly. At institutions with fewer than 100 total IT FTEs, an average of 7% of those staff work outside of central IT; at institutions with more than 100 total IT FTEs, an average of 40% work outside the central organization.

Over the next few months, we will begin taking a look at how those distributions correlate with maturity on critical functions, and provide members with benchmarks that include the role of distributed IT on campus.

3. Maturity and urgency map

The scatter plot below graphs each capability's average urgency score, as reported by the 48 respondents, against the average gap score on that capability. To generate the gap score, we subtracted each institution's urgency score on a given capability from their maturity score on that capability. We then averaged all the institutions' gap scores to produce the average gap score on each capability.

Graphing the urgency (on the x-axis) with gap (on the y-axis) produced this map highlighting areas where higher education as an industry feels investment is necessary.
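For readers who want to reproduce the scoring, the calculation above can be sketched in a few lines of Python. The scores and capability names below are made-up placeholders for illustration, not actual diagnostic data:

```python
from collections import defaultdict

# Hypothetical responses: each institution reports a maturity and
# urgency score (e.g., on a 1-5 scale) for each capability.
responses = [
    {"capability": "Data governance", "maturity": 2, "urgency": 5},
    {"capability": "Data governance", "maturity": 3, "urgency": 4},
    {"capability": "Reporting and analytics", "maturity": 3, "urgency": 5},
    {"capability": "Reporting and analytics", "maturity": 2, "urgency": 4},
]

gaps = defaultdict(list)
urgencies = defaultdict(list)
for r in responses:
    # Gap = maturity minus urgency; a negative gap means urgency
    # outpaces maturity, i.e., a likely investment opportunity.
    gaps[r["capability"]].append(r["maturity"] - r["urgency"])
    urgencies[r["capability"]].append(r["urgency"])

# Average across institutions to get each capability's map coordinates:
# x = average urgency, y = average gap.
for cap in gaps:
    avg_urgency = sum(urgencies[cap]) / len(urgencies[cap])
    avg_gap = sum(gaps[cap]) / len(gaps[cap])
    print(f"{cap}: avg urgency = {avg_urgency:.1f}, avg gap = {avg_gap:.1f}")
```

Capabilities that land in the high-urgency, large-negative-gap corner of the resulting map are the ones flagged as investment areas.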

It was not surprising to see that the two areas that popped as investment areas were "reporting and analytics" and "data governance." Our research team just spent time focusing on these topics in our latest study, A Common Currency, and at the behest of our members we are continuing to uncover insights and generate best practices that help take analytics from access to adoption.

If you’d like to be involved in our ongoing research, please do not hesitate to contact me at

Participate in Our Diagnostic

The opportunities to harness this data only grow as we receive more responses. We're excited to start analyzing how structures, centralization, size, and maturity on certain capabilities are correlated with maturity and urgency responses from institutions.

If you're interested in getting a customized report to see not only how your institution compares to others in our survey, but also areas for institution-specific prioritization and investment, complete our diagnostic.

Take the diagnostic
