Every day since the inauguration of Donald Trump’s second term has brought unprecedented announcements with dramatic consequences all over the world. It is still impossible at this stage to fully grasp the impact of the closure of USAID, but it will certainly disrupt many life-saving programs that benefited millions of vulnerable people.
The broader assault on the US federal administration by Elon Musk’s Department of Government Efficiency (DOGE) has also highlighted how critical access to data has become: procedural breaches, the summary dismissal of officials who refused to grant access to unaccredited DOGE staff, and numerous lawsuits have placed public data access at the center of the battle for the rule of law and respect of the constitution, as well as of global geopolitical power plays.
Indeed, the Trump administration’s vision of the digital world is now openly endorsed, stated clearly at the Paris summit on artificial intelligence: full leeway must be given to private companies and industrial interests, with no public regulation deemed acceptable whatsoever – any attempt to go against this being met with the threat of retaliation.
This paves the way for a dystopian world where Big Tech giants freely compete to extract data on communities, and where policymaking is decided through gamification and leaderboards that dictate what to cut next.
Risk of a two-tier humanitarian data system
We need to take a step back, think about where this leaves the sector, and consider what we should rethink as we rebuild. Voices in the Global South, but also within the humanitarian community, are calling for this chain of events to be treated as a wake-up call to deeply transform the system and finally end the dependency on aid and its post-colonial structures.
The localization agenda agreed in the 2016 Grand Bargain has notoriously lagged ever since, despite persistent calls to accelerate its implementation, including in the humanitarian data ecosystem. Criticism of “data colonialism” is not new, and the current situation only highlights the concerning dependency of the humanitarian sector – among many others – on a few Western donors as well as a handful of Big Tech companies.
CartONG, in its recent study “Beyond the Numbers: Balancing Innovation, Ethics, and Impact”, has synthesized this risk of a “two-tier humanitarian data system”, in which major international actors (INGOs and UN agencies) would be the only ones able to keep up with digital innovations, leaving the rest of the ecosystem (including all local/national actors but also many smaller INGOs) behind.
Given the current context, we might now question whether even the major international actors can keep up, especially in light – amongst other things – of the closure of websites that were until now central to humanitarian decision-making (such as the Famine Early Warning Systems Network or the Demographic and Health Surveys), and the often questionable space that Big Tech is taking in their place.
The existing humanitarian data system
In a context where humanitarian needs are higher than ever, where competing priorities arise from all sides while funding shrinks, and where seemingly untouchable institutions such as the United Nations or Global North development agencies are now in jeopardy, preserving the data infrastructure might seem a secondary goal.
This would mean neglecting the major advances achieved over the past decades, among them:
- higher-quality data collection;
- open sharing of data to maximize impact (OCHA/HDX);
- greater consideration for local voices, whether by collecting data in people’s own language (ClearGlobal) or involving them in project design and evaluation (Ground Truth Solutions);
- responsible data approaches (MERL Tech, the Engine Room’s or OCHA’s Data Responsibility Working Group work);
- transversal data literacy to support decision-making (e.g. IFRC’s Data Playbook);
- promotion of local data expertise (CartONG or HOTOSM);
- use of data for anticipatory action rather than after a crisis has occurred, etc.
Humanitarian data systems, despite all their imperfections and limitations, have improved needs assessment and resource allocation, and ensured more transparency and accountability in the sector. Adequate data makes it harder to ignore crises and the people who suffer from them, rather than focusing only on those who are most visible and easiest to reach.
One could also be tempted – as is currently happening in the US – to simply replace any non-profit or public effort with for-profit, corporate solutions because “they work better”, don’t require public funding, and are already available.
We can only highlight the risk of such approaches: Big Tech companies have repeatedly demonstrated their lack of interest in developing services where they don’t perceive a solvent market for their products (a well-known example being the coverage and economic model of Google Maps, developed only where a local advertising market exists, compared to the open collaborative platform OpenStreetMap).
Even “for good” ventures often come with strings attached (such as licensing costs that eventually increase once the organization has become completely dependent on the tech in question). Public actors in the Global North are now worryingly examining their dependence on Big Tech solutions, which are openly subordinated to a US government that can no longer be counted on as a reliable partner. Reinforcing such a dependency now would run against the tide.
The challenge of doing more with less
Transforming the global digital economy is an endeavor that goes way beyond what the humanitarian system could achieve. It would require an alliance of all actors that (still) believe in open, fair and responsible digital tools – an alliance that could include the European Union, but also most African, Latin American and Asian governments.
The recent “Statement on Inclusive and Sustainable Artificial Intelligence for People and the Planet” adopted at the Paris Summit, signed by 61 governments including the EU, India, China, the African Union, and many other governments – with the notable exception of the USA – is a possible example (while still relatively lightweight) of such a coalition.
That being said, it would be no solution to escape dependency on US Big Tech only to fall into other closed systems, whether controlled by other corporations or by governments.
Regarding humanitarian data specifically, the current situation, where we are forced to do more with less, could also be an opportunity to refocus on essentials. Many examples of humanitarian digital commons already exist, whether for open data sharing (HDX, OpenStreetMap, etc.), or in terms of software (ODK/KoBo Toolbox, DHIS2, QGIS…).
Several of these platforms are curated or supported by key actors, in particular UN agencies or governments. Many of them benefit from resources shared with other sectors that also use them (public administration, research, the private sector, etc.). There is a clear path to scaling up these tools, bridging the gaps that still exist, while continuing to innovate.
This would of course only work if these platforms and systems are agile enough to be co-constructed with national and local stakeholders – governments and National Statistics Offices, national service providers and tech experts, and local communities.
Support from international organizations and donors will of course be essential. This would also entail a change of positioning from donors: shifting the priority of data collection away from accountability to donors themselves, and towards data that is useful for programs and accountable (above all) to affected populations.
It is worth noting that, in a context of political instability, the heavy compliance and reporting procedures imposed on humanitarian actors for decades have proven of little use in securing international aid funding or demonstrating accountability to taxpayers and public opinion.
Such an approach would also have side benefits: building tools that are more easily adopted and adapted locally, leaner and more sustainable systems (as the dramatic environmental footprint of AI becomes apparent), more scalable solutions (with examples like OpenStreetMap), and last but not least, more ethical and responsible approaches.
In other words, putting the problem before the solution – a simple yet often overlooked principle in a world where innovation is too easily equated with more technology – all the while keeping a human approach to the process, often identified as a necessary condition for AI to be adopted and useful.
How to avoid a digital “Wild West”?
As panelists put it in a recent conversation hosted by The New Humanitarian, it is now about “not throwing the baby out with the bathwater”: preserving the vital humanitarian systems and knowledge that have been built over the years, while using the momentum to reshape the system completely.
The same applies, of course, to humanitarian data: it is essential on the one hand that key systems are preserved, and on the other hand that transformative initiatives, and local actors in particular, are supported and funded. As the humanitarian sector grapples with existential challenges – and as committed donors and governments face mounting demands to bridge the gap – sustaining support for key digital commons and data products is crucial.
This also means listening to the voices of local actors when deciding where remaining funds should be allocated and what support they require. Without this, we risk opening the door to a digital “Wild West” in which the powerful freely extract data from vulnerable communities.
The challenges we face are therefore considerable, but we must take a step back to ensure the choices we make today help create a more equitable system that prioritizes the well-being of communities over the economic value of their data. This is all the more true when NGOs sadly have to close operations, as many must at the moment – a situation that makes following proper responsible data practices all the more essential.
The context requires each donor – when deciding on tech- or data-related funding – and every frontline organization – when reorganizing activities – to ensure that the technological decisions they make are geared towards greater involvement of local communities and less dependency on Big Tech. It is time for the humanitarian sector, in all its diversity, to come together to build a new data ecosystem.
By Maeve de France and Martin Noblecourt of CartONG