Elevated POV (February ’26)
Data without context is not neutrality
by Jocelyn Hines, MD, MBA, FAAFP
Black History Month is about who is building the future
Black History Month is not only about honoring past milestones. It is about recognizing who is shaping healthcare now and deciding whether we will protect the systems that allow progress to continue.
Modern African American leaders are actively transforming healthcare technology, biomedical science, and AI. Scientists like Dr. Kizzmekia Corbett played a central role in advancing mRNA vaccine science, accelerating global responses to COVID-19 and redefining what is possible when diverse expertise is fully supported. Clinician-researchers such as Dr. Rhea Boyd continue to connect clinical data, policy, and lived experience to illuminate how structural forces shape child and population health outcomes.
These contributions are not symbolic. They influence how risk is defined, how algorithms are trained, how trust is built, and how care reaches communities that have historically been excluded.
Innovation does not happen in isolation. It happens inside systems built on data. And data, when stripped of context, becomes unreliable at best and harmful at worst.
You cannot address what you refuse to measure
Health equity begins with visibility.
Disparities in healthcare outcomes exist across race, disability status, socioeconomic position, gender, literacy, obesity, and geography. These factors intersect and compound. When they are not named, they are not measured. When they are not measured, they are not addressed.
Current efforts to eliminate language associated with diversity, equity, and inclusion from research and reporting may appear neutral, but neutrality in data collection is a myth. Removing descriptors does not remove disparity. It removes the ability to detect it.
The result is not fairness. The result is distortion.
What the evidence already shows
A recent article in JAMA warned that suppressing demographic and social identifiers undermines scientific validity and weakens patient safety. Without these variables, researchers and health systems lose the ability to identify gaps in care, assess differential outcomes, or correct bias embedded in clinical decision-making tools.
This risk becomes more pronounced in healthcare AI. Widely cited research by Dr. Ziad Obermeyer demonstrated how algorithms trained on incomplete data systematically underestimated the needs of Black patients, not because of intent, but because of flawed proxies and missing context. AI does not eliminate bias. It scales whatever bias it is given.
Data without context does not improve accuracy. It creates false confidence.
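The proxy problem described above can be made concrete with a small thought experiment. This is a hypothetical sketch, not the actual model from the cited research: it assumes two patient groups with identical underlying health need, where access barriers mean one group generates less recorded spending per unit of need. An algorithm that ranks patients by cost, a common proxy for need, then under-selects that group.

```python
# Illustrative sketch only; all numbers are hypothetical, chosen to show
# how a cost proxy can encode unequal access rather than unequal need.

# Two patient groups with identical underlying health need (scale 1-10).
group_a = [{"need": n, "cost": n * 100} for n in range(1, 11)]
# Group B has the same needs, but access barriers mean each unit of
# need generates less recorded spending (hypothetical 60 vs. 100).
group_b = [{"need": n, "cost": n * 60} for n in range(1, 11)]

# A simple "algorithm" that flags the 10 highest-cost patients
# for extra care management.
patients = [("A", p) for p in group_a] + [("B", p) for p in group_b]
flagged = sorted(patients, key=lambda x: x[1]["cost"], reverse=True)[:10]

share_b = sum(1 for label, _ in flagged if label == "B") / len(flagged)
print(f"Share of flagged patients from group B: {share_b:.0%}")
# Both groups have identical need, so fair selection would flag 50%
# from each; the cost proxy flags group B at only 40%.
```

No intent is required for the disparity to appear; the ranking faithfully optimizes the proxy it was given, which is exactly how bias scales.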
Leadership requires memory and responsibility
Healthcare leaders have an obligation to learn from the past. Healthcare strategists and innovators are charged with something more demanding: transforming that knowledge into systems that perform more effectively, more safely, and more equitably.
That means protecting robust data collection even when it is politically inconvenient.
That means insisting on transparency in AI design and governance.
That means being precise about who is represented in our data and who is missing.
Scholars like Dr. Ruha Benjamin have long warned that technology divorced from social context does not become objective. It becomes opaque. And opacity in healthcare is where harm hides.
Progress requires precision, not erasure
Black History Month is a reminder that progress has always depended on clarity, courage, and truth-telling. The same is true for healthcare innovation today.
If we want technology and AI to reduce disparities rather than automate them, we must preserve the language, data, and frameworks that allow inequity to be seen and corrected.
Innovation without equity is not progress.
Data without context is not truth.
And systems built on erasure will fail the people they claim to serve.
This is the work of modern healthcare leadership. Clear-eyed, data-driven, and forward-thinking.