
Transformation and the moral imperative

11:20am March 21 2018

Advances in artificial intelligence and robotics are leading to testing of robots that access memories and learn from interactions with humans. (Getty Images)

For the last 60 years, computing has been about making analogue things digital, and computers have worked through a command and control infrastructure: we told them what to do, and they did it.

In the world that is coming, they're sometimes going to do more than we tell them. They will be proactive and behave without immediate human control.

This will create a completely different world.

We have to remember that there are choices to make about how we architect that world. Creating a new social contract is at the core of this new digital world. Because when we think about digital transformation, the important question is: transformed to what, for whom, and why?

Part of the way we have measured the success of the last 60 years of computing has been in terms of productivity and efficiency gains: they have enabled us to do things faster. While those may have been good metrics for computing to date, they may not be the right ones for what is coming next.

As a society, we need to think about other values, be they around fairness, equity or sustainability. We also need to focus on notions of citizenship and engagement, not just consumerism. Rightly or wrongly, we are seeing signs of this starting to play out in the debate around the increasing power of “platforms” such as Facebook, Google and Amazon.
 

An executive shows the new Google Home at the annual Google I/O conference in 2016. (Getty Images)


There is a moral imperative which needs to be factored into this digital transformation that’s underway, and I believe institutions and corporates have a moment right now to have a conversation about what their own imperative is.

As part of this, what corporates need to understand is that many of them have a “delta problem”: a gap between who their leaders are and who their customers are, which makes it hard to factor those customers into strategy.

Technology companies may not have structural problems or technical problems, but they could well have problems with their imagination, and this often happens when leaders can no longer put themselves in the shoes of the people they need to connect to. Who your customers are may not be who your leaders are.

The temptation is often to reach out and obtain more data as a solution, but the reality is that most of us don’t trust data if it doesn’t conform to our world view, and we just dismiss it out of hand.

Addressing the delta problem requires not just quantitative research, but also qualitative work that factors in people's stories and voices.

If it doesn't, the delta problem can lead to a breakdown in trust, and trust is one of the most important features of any system. And what trust looks like now is more complicated than ever before.

If you think about the fourth industrial revolution and the move to cyber-physical systems with artificial intelligence, one of the greatest challenges is whether these are systems we can trust.

Trust is going to lie at the heart of our relationship with every piece of technology being built: those which rationalise our relationships, drive our cars, clean our houses, decide if we are creditworthy, and so on.

The development of the iPhone in 2007 has revolutionised the way people communicate and do business. (Getty Images)


In each of these situations, data will be used to render a judgement which affects us. So the question is: how are we going to know how those judgements were reached, and how do we know the process was fair?

How do we think about building a world where technology makes decisions which are moral and ethical, and also legal?

All of this is coming and we are not talking about it enough. We are not having the conversations about the moral and ethical dimensions of data.

We need to make sure that the world we are building is imbued with our values, and is a world we can continue to trust.

For organisations which are transforming and building things as they go, this is a moment to be thoughtful, innovative and careful, and to make sure they transform into the organisations they want to be.

We are standing in the middle of an incredibly significant transformation driven by technology.

But we have a choice: to live in a world we choose to build, or a world where we just let things happen to us.

More than at any time in history, we are faced with that choice now.

The views expressed are those of the author and do not necessarily reflect those of the Westpac Group. This article is general commentary and it is not intended as financial advice and should not be relied upon as such.

Professor Bell is the Director of the 3A Institute, Florence Violet McKenzie Chair, and Distinguished Professor at the Australian National University (ANU), as well as a Vice President and Senior Fellow at Intel. Prof Bell is a cultural anthropologist, best known for her work at the intersection of cultural practice and technology development. She heads the 3A Institute, co-founded by the ANU and CSIRO's Data61, tasked with building a new applied science around the management of artificial intelligence, data, technology and their impact on humanity. Prof Bell also presented the highly acclaimed ABC Boyer Lecture series for 2017.
