My mother was a child of the 1960s. Despite not having a college degree, she was determined to teach my sister and me that women could do anything. I guess my sister was really the intended target, but I was certainly along for the ride. Every doctor she ever took us to was a woman—to the point where I didn’t even realize men could be doctors. It was never a conscious thought, just an ingrained assumption that all doctors were women. My realization to the contrary came embarrassingly late.
My family spent a good part of my early life living in a remote part of New York, near Montreal. We were culturally isolated. The TV shows were in French, so I could barely figure out what the Flintstones were up to, never mind the guys living in green tents on M*A*S*H. Somewhere near my 11th birthday, I decided it was time to announce my career choice. “I’m going to be a paleontologist,” I confidently declared. Without looking up from her newspaper, my mother shut it down quickly and viciously. “You’re never going to meet a wife digging around in the desert. You should be a doctor.” Slightly stunned by my mother’s uncharacteristically sexist take, I retorted, “I’m not even looking for girls, I’m looking for DINOSAU… Wait, I can be a doctor?”
It is easy to overlook or minimize the importance of representation. Until you experience a lack of it. As we have seen with the increasing numbers of women in our specialty, representation truly matters. We know that enrolling more Black students in medical school will help recruitment for decades to come. The barriers, though, to increasing the number of Black doctors in the U.S. are deep-rooted.
In the early 1900s, the Carnegie Foundation commissioned Abraham Flexner to evaluate the state of medical education in North America. Flexner visited all 155 medical schools in Canada and the U.S., issuing a detailed report of his findings in 1910. Flexner concluded that the rise of for-profit schools resulted in an overproduction of poorly trained physicians. The ensuing Flexner Report led to radical changes in medical education and became the basis for our now antiquated model, which includes two years of basic science. Critics pointed out that Flexner had received no medical instruction and valued science over clinical training. Pioneering physician William Osler worried that this emphasis on basic science would produce a generation of “clinical prigs.”
While I am unsure of the precise definition of “prig,” I do enthusiastically add my support to Osler’s concerns regarding medical curricula. However, more germane to this discussion is that within 15 years of the Flexner Report, more than half of all U.S. schools had closed. The seven existing Black medical schools, with fewer resources, were disproportionately affected. By 1920, only two, Meharry Medical College and Howard University, remained. These closures had an enduring and tragic effect on our ability to produce Black doctors. By 2018, Black and African American individuals represented 13.4% of the U.S. population but only 5% of physicians.
A 2020 JAMA study estimated that if all seven Black medical schools had remained open, an additional 35,315 Black physicians would have entered the workforce. Flexner’s recommendations that led to the demise of these five medical schools do not appear to have been rooted in any deep-seated bigotry. There is ample evidence that Flexner argued fervently for the survival of Meharry and Howard. Still, he is quoted as stating that Black students should be trained as “sanitarians,” a role in which they would also be protecting white people from disease. Flexner, though, knew Howard and Meharry could not survive without significant financial endowments from white donors, so some have painted these statements as an appeal to that audience.
At best, Flexner was an ignorant idealist, too rigid in his beliefs to appreciate that the need to continue educating Black doctors should supersede his desire to impose daunting standards on the schools willing to train them.
Flexner was not solely responsible for the onerous requirements placed on Black medical schools. In 1906, the American Medical Association (AMA) Council on Medical Education sent a letter to every medical school urging them to require at least a year of college science classes prior to admission. This prerequisite would immediately and dramatically reduce the number of qualified Black applicants since few Black colleges offered these courses, and even fewer white colleges would admit them. Another AMA requirement banned classes after 4 p.m., which disproportionately affected Black students who had to work to pay tuition. Under these new restrictions, three Black medical schools closed before Flexner began his tour.
After the Flexner Report, the two surviving Black medical schools were caught in a tug of war. On one side, Flexner stated the AMA’s regulations were “choking” the few remaining Black schools, particularly Meharry. The AMA countered that Flexner and the Carnegie Foundation should help secure large financial endowments for the schools to help them meet the requirements.
The AMA’s suppression of Black doctors began in the 1800s. Six of the first 13 AMA presidents resided in active slaveholding and slave-trading states. The AMA claimed not to discriminate based on race, leaving admission decisions to its local chapters. Black doctors, however, found few chapters that would accept them, particularly in the South. AMA membership was critical for a physician and was often tied to licensure and hospital admitting privileges.
In 1869, the AMA chapter in the District of Columbia refused admission to three qualified Black doctors. A subsequent U.S. Senate investigation found the group guilty of excluding doctors “solely on account of color.” A bill to repeal their charter was presented to Congress but never passed. In 1870, the three excluded doctors joined other D.C. physicians to form the racially integrated National Medical Society (NMS).
When the NMS attempted to be recognized by the AMA the following year, they were accused of admitting irregulars, specifically those who had not received a formal medical license or education. This was an extremely unfair hurdle because their licenses would have been issued by the D.C. chapter that had already excluded them. That year, the AMA Committee on Ethics reviewed charges against three groups accused of admitting irregulars. The accusations against the two all-white delegations were tabled, and these groups were recognized. The racially integrated NMS was denied entry by a 114–82 vote, in which 36 members of the D.C. chapter were allowed to participate. At the same convention, the AMA voted to issue the statement, “That inasmuch as it has been distinctly stated and proved that the consideration of race and color has had nothing whatsoever to do with the decision of the reception of the (NMS) delegates.” In other words, they voted to declare themselves not racist.
By 1950, Meharry and Howard combined to produce about 100 Black doctors annually, while the remaining white schools added only 10 to 20 more. President Lyndon Johnson began the Great Society programs in the 1960s with the lofty goals of ending poverty and injustice. Johnson believed that education would be the great equalizer. His projects, combined with the advances of the Civil Rights era, led to minority enrollment in U.S. medical schools climbing from 2% to nearly 9%. Unfortunately, this progress then stagnated for 20 years.
In 1991, Robert Petersdorf, president of the Association of American Medical Colleges (AAMC), announced a plan to enroll 3,000 Black and Hispanic medical students annually. He spoke to the 124 U.S. medical school deans, who met him with some skepticism. They did not even receive 3,000 minority applications a year. How were they supposed to double enrollment? Undaunted, Petersdorf launched Project 3000 by 2000.
Under Petersdorf, the AAMC began new efforts at nearly every level of the education system. Pipeline programs were built, and articulation agreements were formed in which medical schools automatically admitted students meeting set academic goals. Race and ethnicity were factored into admission decisions, and for the first time in decades, progress was achieved. Black and Hispanic enrollment in U.S. medical schools increased from 1,500 in 1990 to more than 2,000 in 1995. But then, a seemingly unrelated legal ruling in Louisiana threatened everything.
Project 3000 by 2000 employed a method commonly referred to as affirmative action. Even though affirmative action is the only technique that has ever proven successful in increasing minority enrollment in higher education, it has a long and tangled legal history in the U.S.
The first mention of affirmative action came on May 6, 1961, in Executive Order 10925. President John F. Kennedy called on government contractors to “…take affirmative action to ensure that applicants are employed and that employees are treated during employment without regard to their race, creed, color, or national origin.” In 1966, President Johnson established the Office of Federal Contract Compliance Programs in the U.S. Department of Labor to enforce these requirements. Subsequently, President Richard Nixon issued Executive Order 11478, which called for unilateral affirmative action in all government employment.
In 1978, affirmative action faced its first major legal challenge. Allan Bakke, a white man who had twice been denied admission to the University of California Davis School of Medicine, filed a lawsuit against the university. At that time, the medical school reserved 16 of its 100 spots for minorities. The case found its way to the Supreme Court, where Bakke won and was granted admittance to the school. The court did rule, however, that race could be used as a factor in admissions, but that quotas violated the 14th Amendment’s Equal Protection Clause.
In 1992, Cheryl Hopwood and three other white applicants to the University of Texas Law School filed suit, alleging that the school’s affirmative action admissions process, which placed Black and Mexican American applicants in a separate admissions pool, discriminated against them by accepting members of those groups over non-minority students with comparable grades and test scores. The U.S. District Court for the Western District of Texas found in favor of the university. U.S. District Judge Sam Sparks wrote in the decision, “until society sufficiently overcomes the effects of its lengthy history of pervasive racism, affirmative action is a necessity.” On appeal, however, the U.S. Court of Appeals for the Fifth Circuit reversed the district court in Hopwood v. Texas and rejected affirmative action outright. The ruling read, “the University of Texas School of Law may not use race as a factor in deciding which applicants to admit in order to achieve a diverse student body, to combat the perceived effects of a hostile environment at the law school, to alleviate the law school’s poor reputation in the minority community, or to eliminate any present effects of past discrimination by actors other than the law school.”
The Supreme Court declined to review Hopwood v. Texas, so the ruling became binding law in Texas, Mississippi and Louisiana. Even though the decision applied only to three states, the precedent was far-reaching. Schools across the U.S. adjusted their admissions policies, fearful of lawsuits. The Supreme Court’s 2003 decision in Grutter v. Bollinger later invalidated Hopwood v. Texas, holding that the Equal Protection Clause of the 14th Amendment does not prohibit the narrowly tailored use of race in university admission plans as part of a compelling interest in promoting student diversity.
Still, the reversal came too late. Project 3000 by 2000, which had made the first progress in minority medical school enrollment in over 20 years, was abandoned.
In 2008, the AMA issued an apology for its “past history of racial inequality toward African American physicians.” In 2020, the AAMC removed Abraham Flexner’s name from its award for distinguished service to medical education. Today, though, more than atonement is needed.
Every year, 21,000 students are admitted into the 154 U.S. medical schools. Black men often make up fewer than 500 of them. The AAMC reports that essentially every minority group saw an increase in medical school enrollment between 1978 and 2014, except for Black men. Black men face specific challenges: a higher likelihood of attending underfunded high schools and a lower chance of participation in AP courses, gifted programs and STEM classes. A 2021 UCLA study published in the Journal of General Internal Medicine found that the percentage of doctors who are Black men has remained unchanged since 1940.
Financial considerations are also important. More than 70% of medical students come from homes with an average income of over $74,000 (in 2016 dollars). Underrepresented minorities disproportionately come from lower socioeconomic backgrounds. As such, they carry an average debt of over $200,000 upon graduating from medical school. The Brookings Institution found that the net worth of the average Black family is about 10% of that of the average white family.
The Medical College Admission Test (MCAT) poses another potential fiscal barrier. The registration fee is $320, but more affluent applicants can afford to spend thousands on tutoring, creating another disparity. The AAMC walks a tightrope in defending the role of the MCAT in the admissions process: it has asked medical schools to de-emphasize the scores, yet it also administers the exam.
This fall, the use of race as a consideration in admissions may face its death in the Supreme Court. Edward Blum, an anti-affirmative action crusader, has filed separate lawsuits against Harvard and the University of North Carolina. Both cases were unsuccessful in federal district court, and the litigation against Harvard also failed in the First Circuit Court of Appeals. The Supreme Court has consolidated the lawsuits and will hear them together. Previous Supreme Court rulings allowing affirmative action have been close decisions, usually with a moderate conservative justice holding the swing vote. No moderate conservatives remain on the court. Many legal experts anticipate a 6–3 decision mandating complete “color blindness” in the admissions process. Based on Blum’s own filings, he expects that if his petition succeeds and affirmative action is declared illegal, the share of Harvard students who are Black would fall from approximately 14% to about 3%. While affirmative action’s legal standing hangs in the balance, we should also consider the ethical implications.
To dispute the validity of affirmative action as a tool in higher education admissions, one must successfully defend one of two positions. Either diversity does not matter, or a means other than affirmative action can achieve it. Neither position, however, is scientifically viable. At a certain point, ignorance becomes a choice.
Malachi Sheahan III, MD, is the Claude C. Craighead Jr. professor and chair in the division of vascular and endovascular surgery at Louisiana State University Health Sciences Center in New Orleans. He is medical editor of Vascular Specialist.