Telecommunications is a lucrative business that generates a great deal of profit for network operators. Unfortunately, though not surprisingly, telecommunications infrastructure tends to be concentrated around the greatest opportunity for profit, namely in more densely populated areas where investments in infrastructure can serve the greatest number of people. This also tends to be where wealth is most concentrated. Look at any major city in sub-Saharan Africa and telecom infrastructure is booming. Affordability varies, but access is there.
The problem exists in more sparsely populated rural areas where there is neither the density nor the wealth to justify infrastructure investment to network operator shareholders. Enter national governments and their mission to make access affordable and available to all. Governments have a number of options at their disposal.
- They can do the job themselves by building government-owned networks.
- They can enter into public-private partnerships (PPPs) with industry.
- They can create incentives to encourage the market into under-served areas.
- They can tax existing network operators to finance Universal Service Funds, which in turn pay for infrastructure and services where none exist.
And yet universal access remains elusive. While there are some success stories, no best practice model emerges. This may be simply because complex systems do not respond to formulaic interventions and every country has a unique combination of challenges that include leadership, geography, market development, foreign investment, governance, rule of law, etc.
As both an observer and participant in universal access debates for many years, I have been on the lookout for the “right strategy” for achieving universal access. You may not be surprised to learn that I haven’t found it. What I have found are some general principles that can help to avoid the worst kinds of failures in UA strategies. They are based on the insights of some modern thinkers who have given me new tools for looking at the problem of access.
1. Skin In The Game
The term “skin in the game” has been around for a long time but has been recently taken up by Nassim Nicholas Taleb, author of The Black Swan and Antifragile. He argues that no one should be able to put others at risk without the risk of harm coming to themselves. He gives the example of an architect who designs a house with a weak foundation. The incentive for the architect to design a solid foundation is much higher if the architect has to live in the house. He argues that divorcing risk from decision-making will inevitably lead to bad outcomes. Silicon Valley has long embraced this principle with the concept of “dog-fooding” where developers are obliged to use the tools they are developing for their own work.
This idea also resonates with my experience as a development research funder. Where there is no direct personal stake in the success or failure of the outcome of a project, decision-making tends to ignore risks that otherwise might be red flags. Taleb argues that skin-in-the-game is not just a practical risk management issue but also a moral one. What right do we have to put others at risk if we do not carry an equivalent risk related to our decisions?
When we look at Universal Service Funds that are managed by people who do not have a personal stake in the outcome of the initiatives they fund, it is not surprising to find these haven’t worked out so well. In designing an access strategy, it is worth analysing how close decision-makers are to the risks they are incurring. From a beneficiary point of view, this might mean putting the control of the purse strings much closer to the people who stand to benefit from access.
2. Trust
In their book, Why Nations Fail, Daron Acemoglu and James Robinson argue that prosperity is significantly dependent on political institutions. Wealth is generated by investment and innovation, but these are themselves dependent on trust. Before they will make a commitment, investors and innovators require a sufficiently high level of trust that they will not be exploited by the powerful. This offers a very practical lens on the telecoms sector.
Consider a country like Kenya where, under the leadership of Bitange Ndemo, then Permanent Secretary of the Ministry of Information and Communication (2005-2013), a clear message was communicated to all that Kenya was committed to the development of ICT infrastructure and welcomed a partnership with industry. Contrast this with South Africa, which endured no fewer than five Ministers of Communication with varying agendas during the same period. Any universal access strategy needs to be considered from the point of view of whether it builds or erodes institutional trust.
And trust is not just an issue between industry and government. Trust among large operators is essential as well. I have argued that trust can be the key to understanding interventions to bring down backhaul costs. You can see the issue playing out across the continent.
3. The Categorical Imperative
A recent Econtalk podcast with economist Paul Romer on Urban Growth gave me an unexpected but profound insight. In it, Romer makes the distinction between concession zones and reform zones.
Introducing new policies or regulation in a limited zone, whether geographic or virtual, is a clever idea that allows policies to be tested in the crucible of real life. The distinction Romer makes is between interventions that provide concessions to a group, such as a tax break, and reforms, such as a new policy which, if successful, is worth implementing everywhere. The generalisability of interventions seems like an important criterion, and it makes me suspicious of initiatives that attempt to create a protected environment for business.
This idea is applicable to universal access as many initiatives attempt to treat rural access as a special case that requires a unique subsidy or special rules. It seems to me that any rule that proves good for rural access ought to be generalisable.
4. Plan On Being Surprised
Although we know things will change, we tend to believe that what we have now will also define the future; that the future is largely an extrapolation of the present. Think of technologies we rely on today that didn’t exist ten years ago: smartphones, tablets, social media as we know it; the list goes on.
And yes, we are consistently surprised. Mobiles may be the future of everything, but maybe not in the way that we imagine. Competitive markets cope very well with technological change, and the key to good policy and regulation is to avoid tangling them up with particular technological platforms. Sometimes making technological bets is unavoidable for governments but, where possible, it is a much better option to leave technology choice to the private sector.
This needn’t result in a laissez-faire strategy. By treating the telecom sector for what it is, a complex adaptive system, one can apply lessons from the study of complex systems to cultivate better outcomes. This may involve creating a range of incentives and disincentives to encourage good behaviour in the market without being prescriptive.
5. Fail Small
A corollary to 1, 3, and 4 above is the importance of designing to mitigate failures. If we assume we are going to be wrong, and probably reasonably often, we want a strategy that minimizes failures and allows us to amplify successes. Multi-million dollar contracts such as that awarded for Johannesburg’s metro fibre project are high-risk ventures that are not only disastrous when they fail but get in the way of anything else happening.
Contrast this with the success of the bottom-up, community-driven Parkhurst neighbourhood fibre initiative which was quickly emulated by others and the risk comparison is obvious. This is not to say that large projects should never be undertaken but they need to have built-in “circuit-breakers” that will trip before failure cascades.
6. Enable the Adjacent Possible
Complexity theorist Stuart Kauffman coined the term “the adjacent possible”, which has since been taken up by science writers like Steven Johnson and economists like Ricardo Hausmann. The idea of the adjacent possible is that at any given time in evolution, only a certain number of changes are possible.
For example, prior to the arrival of the opposable thumb, the use of tools was beyond the possible. With the opposable thumb, tool use became part of the adjacent possible. A similar analysis can be made of technology development. Without the development of moveable type, ink, and paper, the printing press was beyond the possible. Or more recently, prior to the development of high-speed broadband, streaming media services such as Netflix were not within the realm of the possible. The more diverse the palette of technologies to choose from, the greater the scope of the adjacent possible.
What I want to suggest is that strategies, policies, and regulation that invite experimentation and “tinkering” are generally smarter bets in a rapidly changing environment than ones that assume a particular outcome. Here we can compare the openness of the unlicensed spectrum bands which have enabled WiFi, Bluetooth, and other hugely successful technologies with the monolithic bets that were made on technologies like WiMax, ISDN, and others. Unlicensed spectrum is also littered with failures but they were low-cost failures. Open platforms for innovation whether simple wholesale fibre access networks or services such as Amazon Web Services are true enablers of the adjacent possible.
Allowing for “tinkering” in the implementation of strategies can enable previously unconsidered combinations of technologies to become successes. Take for example the combination of the spread of metro fibre infrastructure and the growth of WiFi which has enabled alternative wireless success stories. The key point of the adjacent possible is that the more diverse the range of technologies, platforms, and services available, the more different and interesting social and economic innovations may emerge. This is why strategies should embrace a range of approaches from business-led to community-led and a range of technologies from fibre to wireless, licensed to unlicensed, mobile to fixed, etc, etc.
Ok, it’s still rough
The above is not a recipe for success but my evolving insights on avoiding or mitigating failure. Comments, reactions, elaborations, rebuttals very welcome.
Originally published as How to Think About Universal Access
Nice article. Particularly love the Fail Small part.
Insightful thinking Steve. Thanks for sharing. I agree with the logic suggested – if we are restricted to thinking inside the neo-liberal box of a privately developed infrastructure. Tim Unwin has written about the record of failure in PPPs. However, as you point out, government has the option of doing the job themselves. I am old enough to appreciate the great job that the UK government did in providing universal access to postal services, electrical supply and land-line telephony. Every citizen had the right to receive these services – irrespective of how rural the location – and every citizen accessed them at the same price – irrespective of the true cost of delivery. Has any private initiative equalled this success – and if not, why are we thinking inside the neo-liberal box?
Hi Tony. I think I must present myself as more of a neo-liberal than I am. I am agnostic about government vs private sector solutions. My real interest is in trying to understand what works and also what fails less spectacularly. A government broadband initiative worked in South Korea, but actually it is hard to distinguish the private sector from government in that particular case. In SSA, it is hard to point to many successful government-led broadband initiatives. Open to discussing any specific examples you have in mind.
Hi Steve. Sorry for not being clearer. I’m not accusing you of being neo-liberal. I’m saying that USF-thinking is by definition inside the neo-liberal box.
Before Thatcherism and neo-liberal thinking, governments used to build public infrastructure themselves – and in some examples, like the ones I gave, did a good job of delivering against universal service principles.
Post-Thatcher, and since public utilities were privatised, the Universal Service Fund was invented to reach the parts that the profit motive does not. However, this private-sector-plus-USF model seems to have consistently failed to deliver.
My point is that if the private sector plus USFs consistently fails to deliver, then maybe we should stop repeating the same mistake and try thinking outside the neo-liberal box.
The important place where we are in complete agreement is that existing USF models mostly don’t work at all. I want to climb out of the current box whatever the label on the outside. 🙂
Reminds me of what Churchill famously said: “Democracy is the worst form of government, except for all those other forms that have been tried from time to time”! The UK government (and a few others) may have done a great job in providing universal access to postal services, electrical supply and land-line telephony. Otherwise the world is full of examples where governments failed rather miserably. USFs have been “new entrants”; they’ve been stumbling, but learning. I am optimistic!
Thanks Steve, Good food for thought…..and action. I would suggest an additional principle, that being that expertise and technology only have relevance in context, and that context is best understood by engaging all stakeholders in strategy development and implementation. The one group with “skin in the game” that is frequently marginalized in both strategy development and implementation is the intended direct beneficiaries. Their participation is essential to understanding context.
Great point Sam.