The digital transformation of health care is advancing at unprecedented speed. Artificial intelligence (AI), mobile health apps, and telemedicine promise to deliver more accurate diagnoses, streamline care delivery, and expand access to health services worldwide. But in the Global South, these promises often collide with stark realities: colonial legacies, systemic inequalities, and uneven digital infrastructure. Unless digital health governance (DHG) is fundamentally reimagined, these technologies risk reinforcing, rather than narrowing, global health disparities.
DHG requires a multi-faceted approach to regulating, coordinating, and guiding the ethical and responsible deployment of digital technologies within health care systems. This article explores why DHG must be reimagined to address structural power imbalances, using sexual and reproductive health and rights (SRHR) as a key example, and to promote equity, diversity, and inclusion in digital innovation.
The Digital Divide as a Governance Failure
Globally, nearly one-third of the population lacks internet access, with women and rural communities disproportionately affected. In Africa, only 31 percent of women use the internet compared to 43 percent of men. This "digital divide" is not just a technological issue; it is a governance failure. When health care systems rely on digital tools to expand access but ignore underlying inequalities, they risk excluding the very communities they aim to serve.
Moreover, many digital health tools are designed by private companies or research teams based in the Global North. The resulting technologies often lack contextual relevance and fail to consider local health needs, languages, and cultural practices. Without participatory design processes, these tools can alienate users or produce biased outputs, especially in AI systems trained on homogenous datasets.
SRHR and the Ethics of Consent in the Age of AI
Digital tools can be powerful enablers for SRHR. SRHR encompasses a wide range of services, including contraception, safe abortion, maternal care, sexually transmitted infection prevention, and fertility treatment, all grounded in bodily autonomy. Mobile apps can help people access contraception information, abortion services, or mental health support in privacy and safety. However, these same tools often collect sensitive data (menstrual cycles, sexual activity, geolocation) without adequate safeguards. In some contexts, such data could be weaponized, especially where abortion is criminalized or gender-based violence is prevalent.
As highlighted by the United Nations Population Fund (UNFPA), technology-facilitated gender-based violence (TFGBV) is on the rise. TFGBV refers to any form of violence that uses technology, such as phones, social media, GPS trackers, or even recording devices, to harm someone because of their gender. Most often, this affects women and girls. Examples include online harassment, defamation, non-consensual sharing of intimate images, and cyberstalking.
Many of these digital health apps lack robust consent frameworks and proper data anonymization. A notable SRHR data breach occurred in 2021, when hackers accessed the records of approximately 400,000 Planned Parenthood Los Angeles patients, highlighting serious privacy risks in digital reproductive health services. Femtech apps, digital applications designed to support women's health, particularly in areas like menstruation, fertility, pregnancy, and menopause, have been criticized for providing inaccurate medical information, sharing sensitive user data with third parties without clear consent, and reinforcing gender stereotypes that contribute to epistemic injustice. Moreover, algorithmic bias can replicate and reinforce disparities related to socioeconomic status, race, ethnicity, religion, gender, disability, or sexual orientation, amplifying existing inequities in health systems and undermining the effectiveness and safety of SRHR interventions.
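To make "proper data anonymization" concrete, the minimal sketch below shows the kind of data-minimization step an SRHR app could apply before any record leaves the device. The field names and coarsening rules are illustrative assumptions, and genuine anonymization requires far stronger guarantees (for example, formal k-anonymity or differential privacy); this is a sketch of the principle, not a privacy-preserving implementation.

```python
def generalize(record):
    """Reduce re-identification risk by coarsening quasi-identifiers
    before a record is exported. Sensitive fields (cycle data, sexual
    activity) are simply never included in the output."""
    return {
        # Coarsen exact age into a decade band: 34 -> "30s"
        "age_band": f"{(record['age'] // 10) * 10}s",
        # Keep only the city, dropping fine-grained location detail
        "region": record["location"].split(",")[0],
    }

out = generalize({"age": 34, "location": "Nairobi, Kilimani", "cycle_day": 12})
```

The design choice illustrated here is minimization at the source: the safest data to govern is data that was never collected or exported in the first place.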
Rethinking Data Sovereignty and Gender Justice
A central challenge in DHG is determining who has control over health data and how it is governed. In many countries across the Global South, personal health data, such as medical records, diagnostic results, and information collected through health apps or wearable devices, is stored on servers owned by multinational corporations and subject to foreign legal jurisdictions. This dynamic undermines national sovereignty and creates significant risks for privacy, security, and evidence-based policy-making.
Frameworks must account for how race, gender, disability, and geography shape both access to, and the impact of, digital health technologies. DHG must also incorporate algorithmic fairness standards as core principles, aligning with international frameworks such as the OECD's AI Principles, the WHO's Ethics & Governance of AI for Health, the EU's ALTAI requirements for trustworthy AI, and ISO standards like ISO/IEC 42001 (AI Management Systems) and ISO/IEC 42005 (AI System Impact Assessment). Together, these frameworks emphasize key elements such as human agency, technical robustness, transparency, diversity, and accountability. Integrating these standards helps ensure that AI-driven health tools are not only effective but also ethically sound and socially responsible.
To ensure health AI tools are equitable, they must be rigorously tested and validated across diverse populations, taking into account differences in gender, race, age, disability, and socioeconomic background. However, equity in design must go beyond surface-level demographic representation. It requires genuine participatory processes, where affected communities are meaningfully engaged throughout development and deployment. This includes mechanisms for feedback, redress, and community accountability, ensuring that technologies not only do no harm but actively contribute to inclusive, safe, and ethical care.
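The disaggregated testing described above can be illustrated with a short sketch that computes a screening model's sensitivity (true positive rate) separately for each demographic group instead of averaging it away. The group labels and toy data are assumptions for illustration only, not a prescribed audit protocol.

```python
from collections import defaultdict

def subgroup_sensitivity(records):
    """Compute per-group sensitivity from (group, label, prediction)
    triples, so disparities between groups stay visible rather than
    being hidden by an aggregate accuracy number."""
    true_pos = defaultdict(int)   # correctly flagged cases per group
    actual_pos = defaultdict(int) # all true cases per group
    for group, label, pred in records:
        if label == 1:
            actual_pos[group] += 1
            if pred == 1:
                true_pos[group] += 1
    return {g: true_pos[g] / actual_pos[g] for g in actual_pos}

# Toy data: a model that misses far more true cases in group "B".
records = [
    ("A", 1, 1), ("A", 1, 1), ("A", 1, 1), ("A", 1, 0),
    ("B", 1, 1), ("B", 1, 0), ("B", 1, 0), ("B", 1, 0),
]
rates = subgroup_sensitivity(records)
# Group A: 3/4 detected; group B: only 1/4, a gap an aggregate
# metric of 4/8 would conceal.
```

A tool with this profile would systematically under-serve group "B" even while its overall accuracy looked acceptable, which is exactly the failure mode disaggregated validation is meant to surface.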
Global Health Governance Must Decolonize
The colonial undercurrents in global health are not a matter of the past; they persist in how research is funded, how technologies are transferred, and how expertise is recognized. Digital health is no exception. The "one-size-fits-all" model of technological intervention often exported from the Global North reproduces dynamics of dependency and disempowerment. Global South development has often resembled a game of Tetris, with rigid, standardized policies imposed on diverse realities, flattening complexity into prefabricated solutions.
The Lancet and Financial Times Commission on Governing Health Futures 2030 argues that digital health should be based on solidarity, sustainability, and sovereignty. This means supporting local tech development, promoting South–South collaboration, and rejecting extractive data models. Instead, data should be governed collectively to ensure fair and inclusive digital health systems that benefit communities.
The transformative potential of digital health can only be realized when equity, rights, and justice are embedded at its core. Advancing equity in digital health governance means moving beyond efficiency to embrace a rights-based approach that prioritizes public health over private profit. This requires embedding inclusion throughout the design and deployment of SRHR technologies, ensuring diverse data, human-centered solutions, and culturally sensitive interfaces that communicate data usage clearly and uphold informed consent.
True accountability demands transparent algorithms, gender-responsive policies, and mechanisms to redress harm. To be meaningful, digital health governance must confront colonial legacies by shifting power to the Global South, investing in local leadership, and fostering global cooperation grounded in ethical standards and shared responsibility.
This post is part of a digital symposium called Innovation, Law, and Ethics in International Bioscience. To read the related posts, click here.
Acknowledgment: The research for this blog post received support from the Novo Nordisk Foundation (NNF) through a grant for the scientifically independent Collaborative Research Program in Biomedical Innovation Law (Inter-CeBIL Program – Grant No. NNF23SA0087056).
