Building the public goods of the 21st century: Google DeepMind edition

By Nicholas Gruen - posted Wednesday, 12 April 2017


History plays tricks on us. Just as we think we've got things figured out, everything changes. My favourite example is Malthus's 'principle of population', which explained why most people were mired in poverty despite improving technology. As Malthus explained, population growth was exponential and so would eventually outstrip productivity growth, returning the bulk of the population to where it started – bare subsistence. A theory with considerable explanatory power over the whole of human history! And published in 1798, just as it became obsolete. The century that followed saw a takeoff that has since lifted global living standards around twenty-fold!
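To see the mechanics, here is a toy sketch in Python (mine, not the article's or Malthus's), with invented numbers: population multiplies each generation while output grows by a fixed increment, so output per head is driven back towards subsistence.

    # Toy Malthusian model (illustrative only): geometric population growth
    # versus arithmetic output growth. All numbers are invented.
    population, output = 1.0, 2.0  # index values at generation 0
    for generation in range(5):
        print(f"generation {generation}: output per head = {output / population:.2f}")
        population *= 2  # population doubles each generation (geometric)
        output += 2      # output rises by a fixed increment (arithmetic)
    # Prints 2.00, 2.00, 1.50, 1.00, 0.62 - output per head falls towards subsistence.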

Something similar happened with economic reform and its guiding idea that private goods – cars, cookies, holidays, houses – were the heart of our material living standards and were best produced by competitive markets. Under the sway of these ideas we swept away all manner of silly regulation, controlling everything from airline scheduling to shopping hours. But where market failures persisted – for instance in transport, communications and energy – we looked more like slaves to fashion, creating markets or the appearance of markets where there were none. We've come to rue much of our handiwork.

Meanwhile there was a Big New Development. The Internet and digital technology came of age. And here's the thing. Digital artefacts – whether they're an algorithm, a website, an app or a coding language – are always and everywhere potential public goods. Once produced, digital artefacts are essentially costless to replicate, which raises the question of whether they can or should be made freely available to all.


In fact, since the Internet matured, one entrepreneur after another has chosen to give their service away, even though it could have been provided behind a paywall. We're now in a world of burgeoning public goods privately provided – from platforms, whether for-profit like Google, Facebook and Twitter or not-for-profit like Wikipedia, to open source applications like Linux, Firefox and the Android operating system. There's a paradox here: by the standard assumptions economists make, governments should have built these assets, because the economic value they generate ranges far beyond the revenue – if any – they bring to their creators. But governments wouldn't have had the skills to produce these things, nor the permission to experiment and fail in order to seed the vast success stories we see before us.

However, by definition, the public goods of the 21st century that have so far been built privately are the low-hanging fruit – projects whose cost of production is so low compared with the total economic benefit created that they can be funded from small slivers of the value they create, whether from advertising, philanthropy or the benefit some patch of software does for its author before they donate their code back to the project. There are plenty more digital public goods where they came from, but they're harder to build for two reasons. Firstly, many digital public goods are cognate with standards. Where someone comes along and establishes a 'killer platform' that lots of people want to be part of – like Google, Facebook or Wikipedia – they get to define the standard. But in lots of situations there are numerous incumbent digital platforms and products, and there would be great public benefit in making them interoperable. Yet, as we're coming to see, bringing that interoperability about – creating that digital public good – is far more easily said than done. Secondly, some digital public goods may cost more than the private gains they bring to any one party. I'll provide examples of each case below.

Meanwhile, the great digital innovators are champing at the bit to gain commercial advantage by getting to the future first – by building it. Enter Google's artificial intelligence subsidiary DeepMind, which is delivering a clinical alert app for kidney injury to the Royal Free London NHS Foundation Trust. Yet, as recently outlined in a learned journal article, things look decidedly dodgy. In circumstances that are still poorly documented publicly, DeepMind was able to ingest vastly more patient data than was necessary to build the app – all of it without clear patient consent. As the article points out, the arrangement likely compromises the public interest. Yet while the authors are right that this is no way to develop healthcare technology in the 21st century, it's a much harder business to say how we should properly go about that task. And even if satisfactory policy is necessary for success, so too is execution – which, we're learning, is a lot harder again. So far we're not very well advanced on either.

The article raises concerns about privacy and the domination of private over public considerations. On the first, if we're serious about the actual risks to privacy, I expect they're quite low. Google has a strong interest in protecting its users' privacy and a better record than the British Government. None of that excuses the Trust or Google for the cavalier way they've proceeded. However, the alternatives the authors mention – which are tied up with obtaining greater consent – don't seem all that helpful. One of the fundamental benefits the data revolution unlocks is the capacity for endless interrogation and reconfiguration of data assets to generate new knowledge – putting the extraordinary possibility of personalised medicine within contemplation. And keeping transactions costs near zero is essential to seizing those possibilities. Seeking people's consent for each and every use of their data simply erects barriers to new knowledge for no identifiable benefit. If that had been required of Fiona Stanley when she used old data to identify the patterns that enabled us to reduce the incidence of spina bifida in regional Australia, she would have been stopped in her tracks – the knowledge latent in data we already had would never have been uncovered.

Instead we should provide users with reasonable protection against identifiable harms, pre-eminently violations of their privacy. Given people's justifiable wariness about their data being used or misused for private gain, I expect we do need some legitimating consent from consumers. But if so, we should think about the architecture of that consent – the transactions costs involved for all parties in its being given, and its generality once given. We'll foreclose myriad opportunities for human betterment if we hanker for line-item consent the way some people yearn for the return of reusable milk bottles.

In the meantime, the big story in the excitement about DeepMind and the NHS is the way in which the interests and technical capabilities of private operators are dominating public interests and capabilities. But calling that out, as the authors rightly do, is only the first step towards better outcomes. We need to articulate what those public interests are, and then understand how best to build a world that optimises them. And while the Googles of the world have been building their preferred world for over a decade and show no signs of slowing down, the representation of the public interest has been far more tentative – politically, but also intellectually.


The Productivity Commission is in the process of legitimating a stronger regulatory role for government, calling for a broad-based right for consumers to access data about them. As the Commission's Chairman, Peter Harris, said recently, this is “something of a new departure for the commission”, and I think it's on the right track. As he puts it, it's all part of trying to balance the rights of producers and consumers and to align public and private interests. Yet this still implies an old model in which the public sector sets the rules and the world is then built by firms or other organisations competing with one another. As I wrote a few years ago, in all those areas where market failure abounds – and that's certainly the case in health – “output is better thought of as the joint product of competitive and collective (collaborative and regulatory) activity. Each sector requires the evolution of quite different institutions in which public and private, competitive and collaborative considerations concatenate at every level from high policy down to the life-world of workplaces”.

If that's the case, how much work is there to do to really seize the potential that big data holds for health, let alone all the other areas of the digital economy? Many moons ago we twigged that a widely used, universal e-health record would be a fundamental, infrastructural public good for the future. But we've been at it for over a decade, sinking in about a thousand times as much money as it took Google to get online, and yet the results remain modest. As the PC report is showing, we need more than grand pronouncements, and more than simple expenditure, to bring these things into existence. We need the whole system to start coalescing around these new public goods. I've suggested a slew of digital public goods that might be built by governments taking the lead in configuring public-private digital partnerships, or simply using their convening power to help standards evolve and to encourage public reporting against them. Generally these ideas are low risk and would be net fiscal contributors over any reasonable time frame. Most people exposed to them find them compelling. They're far from the last word on how to build the public goods of the 21st century. But somehow they never find their way into the policy in-tray.

And ultimately this is but one potential line of development. As the Internet of Things burgeons and generates more and more data about us – from our smartphones to our Fitbits – making that data accessible to consumers is the first stage towards being able to combine and recombine it with other data to generate worthwhile insights: insights that improve the convenience of our daily lives, but also our health and the health of others. Health professionals are increasingly using free online services like HealthKit for managing their practices. One might imagine that much of the most useful available data would assemble itself onto some platform for the greater good, driven by the self-interest of all the players. But given the transactions costs involved in negotiating and dividing up the benefits among the owners of the data, to say nothing of the technical challenges of interoperability or the complications of privacy concerns and the 'anti-commons' created by the legal obsolescence of our intellectual property regime in a digital world, the chances of this happening are vanishingly small.

The kinds of rights proposed by the PC will likely help, but there’s plenty more governments can do by using their considerable sway in sectors like health – as funders, regulators, insurers and as conveners – to:

  • delineate common goals;
  • protect people’s legitimate interests, for instance in privacy, with firm and effective but parsimonious regulation;
  • accelerate the evolution of common standards; and
  • encourage those they can influence to take up those standards.

Only then will we be able to endlessly combine and recombine our data to find the hitherto undetected patterns and connections that can unlock a better life for us all.




About the Author

Dr Nicholas Gruen is CEO of Lateral Economics and Chairman of Peach Refund Mortgage Broker. He is working on a book entitled Reimagining Economic Reform.


This work is licensed under a Creative Commons License.
