A response to the Historically Black Colleges and Universities dashboard

As researchers committed to providing robust, data-driven analyses that Historically Black Colleges and Universities (HBCUs) can use to better serve their students, we appreciate the thoughtful use of well-contextualized data and rigorous evidence when considering these oft-maligned institutions. In September, the White House Initiative on HBCUs issued an “HBCU Dashboard” at its annual conference. We have several concerns about the “Dashboard” and we outline them below.

Our first concern with the report is its reliance on IPEDS as its sole data source. There’s no question that IPEDS is a fount of information on HBCUs, but such data on its own, with no further context, fails to provide a useful understanding of these institutions. Although the authors of the “Dashboard” note that IPEDS is not a perfect measure and remind readers that “it is important to engage institutions directly to understand the deeper context of the data,” they nonetheless base the report exclusively on this imperfect data, with little rationale for their choices.

We also found a number of troubling inconsistencies in the use and interpretation of data, not to mention a few questionable assumptions, which we detail below.

  • The report does not settle on the number of HBCUs that it covers. At times, it’s 105 and other times it’s 106. IPEDS has data for only 100 HBCUs because Knoxville College, Lewis College of Business and Morris Brown College are unaccredited, and Hinds Community College and American Baptist College are not considered HBCUs by IPEDS. These omissions should be noted when considering the national data on HBCUs.
  • The report lists the average graduation rate for African-American students across the country as 34 percent. In actuality, the graduation rate for African-Americans ranges from 38.9 percent to 42.1 percent, according to the National Center for Education Statistics. This discrepancy undercuts the validity of the report, as it is unclear which data source the authors used as a starting point.

The use of seemingly arbitrary analytical timeframes makes us wonder whether they were chosen randomly or for the startling findings they generate:

  • In their analysis of HBCUs with declining enrollments, the authors measure declines between 1990 and 2012. What is significant about this timespan? And how does it take into account the recent recession — enrollment historically declines after recessions — or institutions that have made substantial gains in the last five years?
  • A closer look at enrollment data reveals that the arbitrary choice of a 22-year window for assessing enrollment sensationalizes the enrollment trends at the institutions listed in Table 1 of the report. Although the authors do not list the variable they chose to estimate these figures, we replicated these analyses with the “Total Enrollment” IPEDS variable from 2003 to 2013 (see the table below). We did this to illustrate the variability and the shifts in recent years and to further contextualize IPEDS data. Surprisingly, the data demonstrates that the changes in enrollment stated in the report are grossly exaggerated.

For instance, all of these institutions have seen at least two consecutive years of enrollment gains. Though not all have been able to sustain those increases, three of the institutions listed as the lowest-performing in the report (Talladega College, LeMoyne-Owen College and Miles College) have either maintained or increased total enrollment since 2003.

(The table above shows the percentage increase or decrease in total enrollment, using the preceding academic year as the baseline. It then lists the average percentage change from 2004 to 2013 and the total change between 2003 and 2013.)
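For readers who want to reproduce this kind of calculation, the short Python sketch below illustrates how year-over-year and total percentage changes might be computed from annual enrollment totals. It is a minimal illustration under our own assumptions: the function name and the sample figures are invented for demonstration and do not come from IPEDS or the “Dashboard.”

# Minimal sketch: year-over-year and total enrollment change from annual totals.
# The sample figures are placeholders, not actual IPEDS values.

def enrollment_changes(enrollment_by_year):
    """Return year-over-year percent changes, their average, and the total change."""
    years = sorted(enrollment_by_year)
    yoy = {}
    for prev, curr in zip(years, years[1:]):
        base = enrollment_by_year[prev]
        yoy[curr] = (enrollment_by_year[curr] - base) / base * 100
    average_change = sum(yoy.values()) / len(yoy)
    first, last = enrollment_by_year[years[0]], enrollment_by_year[years[-1]]
    total_change = (last - first) / first * 100
    return yoy, average_change, total_change

# Example with made-up enrollments spanning 2003 to 2013:
sample = {year: 1000 + 25 * (year - 2003) for year in range(2003, 2014)}
changes, avg_pct, total_pct = enrollment_changes(sample)
print(f"Average annual change, 2004-2013: {avg_pct:.1f}%")
print(f"Total change, 2003-2013: {total_pct:.1f}%")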

Some of the comparisons in the report are wholly misleading:

  • The report calls out the low performance of some open-enrollment HBCUs, comparing them to more selective and better-resourced institutions, without accounting for the student characteristics at these institutions or the services they provide by educating students who are often failed by the K-12 system.
  • Graduation rates at two- and four-year institutions are lumped together and compared, even though these institutions have fundamentally different missions.
  • Public and private institutions’ tuition costs are compared, which makes little sense given their fundamentally different funding models.
  • It’s not clear why stand-alone medical schools — Morehouse School of Medicine and Meharry Medical College — are included alongside the rest of HBCUs. For the purpose of comparing institutional performance, those two institutions should have been excluded from the analysis.
  • Much like a ranking system that highlights elite institutions, the report focuses on and lauds high-resource HBCUs, neglecting to consider the unique mission that many HBCUs have of educating low-income students.

In several places, the authors fail to tease out critical details:

  • The report claims that there are four factors indicative of strong enrollment at HBCUs: (a) enrollment of no fewer than 1,000 students; (b) a steady increase in student enrollment; (c) a balanced ratio of male and female students; and (d) a campus infrastructure that accommodates the student body. However, the report does not explain how these indicators were determined and defined, or why they matter.
  • The report suggests that students are being done a disservice by not having a majority of tenured faculty teaching them. Yet it does not speak to national data on how many undergraduate students are taught by tenured faculty, nor does it present any data to substantiate a relationship between student learning and faculty rank. Moreover, because the definition and requirements for tenure vary from institution to institution, this measure should be used with caution.
  • The report also looks at presidential term length, using a 13-year period as a barometer of strength. Why 13 years, when the average tenure of a college or university president is seven, according to the American Council on Education’s Center for Policy Analysis? What is the significance of the 13-year period? In addition, there is no discussion of why presidents leave their posts: retirement, a new position, resignation or dismissal.

Given the influence of the White House Initiative on Historically Black Colleges and Universities and that of the U.S. Department of Education, we hope that the authors will reach out to all HBCUs and ask for the most up-to-date data to include in future versions of this report. Making this effort would give readers a more realistic and current view of the state of HBCUs than IPEDS alone can provide.

Although one could view the “HBCU Dashboard” as merely a reporting of IPEDS data, it is crucial to consider the influence that any report issued by a White House Initiative can have on the higher education community, funders and policymakers.

This piece has been updated from a previous version. We recently learned that the “Dashboard” is a working document, despite being distributed at a national conference without any statement noting its provisional status. We hope that our comments assist with future iterations of the “Dashboard” as it is improved for the good of HBCUs.

Gasman is professor of higher education at the University of Pennsylvania and director of the Penn Center for Minority Serving Institutions. Samayoa, Nguyen, Commodore and Boland are Ph.D. candidates and research assistants at the Center.
