Thursday, May 27, 2021

Ethics is the new Quality

This morning I took part in the first panel at the BSI conference The Digital World: Artificial Intelligence.  The subject of the panel was AI Governance and Ethics. My co-panelist was Emma Carmel, and we were expertly chaired by Katherine Holden.

Emma and I each gave short opening presentations prior to the Q&A. The title of my talk was Why is Ethical Governance in AI so hard? Something I've thought about a lot in recent months.

Here are the slides exploring that question.


And here is what I said.

Early in 2018 I wrote a short blog post with the title Ethical Governance: what is it and who's doing it? Good ethical governance is important because in order for people to have confidence in their AI they need to know that it has been developed responsibly. I concluded my piece by asking for examples of good ethical governance. I had several replies, but none were nominating AI companies.

So, why is it that 3 years on we see some of the largest AI companies on the planet shooting themselves in the foot, ethically speaking? I’m not at all sure I can offer an answer but, in the next few minutes, I would like to explore the question: why is ethical governance in AI so hard? 

But from a new perspective. 

Slide 2

In the early 1970s I spent a few months labouring in a machine shop. The shop was chaotic and disorganised. It stank of machine oil and cigarette smoke, and the air was heavy with the coolant spray used to keep the lathe bits cool. It was dirty and dangerous, with piles of metal swarf cluttering the walkways. There seemed to be a minor injury every day.

Skip forward 40 years and machine shops look very different. 

Slide 3

So what happened? Those of you old enough will recall that while British design was world class – think of the British Leyland Mini, or the Jaguar XJ6 – our manufacturing fell far short. "By the mid 1970s British cars were shunned in Europe because of bad workmanship, unreliability, poor delivery dates and difficulties with spares. Japanese car manufacturers had been selling cars here since the mid 60s but it was in the 1970s that they began to make real headway. Japanese cars lacked the style and heritage of the average British car. What they did have was superb build quality and reliability" [1].

What happened was Total Quality Management. The order and cleanliness of modern machine shops like this one is a strong reflection of TQM practices. 

Slide 4

In the late 1970s manufacturing companies in the UK learned - many the hard way - that ‘quality’ is not something that can be introduced by appointing a quality inspector. Quality is not something that can be hired in.

This word cloud reflects the influence from Japan. The words Japan, Japanese and Kaizen – which roughly translates as continuous improvement – appear here. In TQM everyone shares the responsibility for quality. People at all levels of an organization participate in kaizen, from the CEO to assembly line workers and janitorial staff. Importantly, suggestions from anyone, no matter who, are valued and taken equally seriously.

Slide 5

In 2018 my colleague Marina Jirotka and I published a paper on ethical governance in robotics and AI. In that paper we proposed 5 pillars of good ethical governance. The top four are:

  • have an ethical code of conduct, 
  • train everyone on ethics and responsible innovation,
  • practice responsible innovation, and
  • publish transparency reports.

The 5th pillar underpins these four and is perhaps the hardest: really believe in ethics.

Now a couple of months ago I looked again at these 5 pillars and realised that they parallel good practice in Total Quality Management: something I became very familiar with when I founded and ran a company in the mid 1980s [2].

Slide 6 

So, if we replace ethics with quality management, we see a set of key processes which exactly parallel our 5 pillars of good ethical governance, including the underpinning pillar: believe in total quality management.

I believe that good ethical governance needs the kind of corporate paradigm shift that was forced on UK manufacturing industry in the 1970s.

Slide 7

In a nutshell, I think ethics is the new quality.

Yes, setting up an ethics board or appointing an AI ethics officer can help, but on their own these are not enough. Like Quality, everyone needs to understand and contribute to ethics. Those contributions should be encouraged, valued and acted upon. Nobody should be fired for calling out unethical practices.

Until corporate AI understands this we will, I think, struggle to find companies that practice good ethical governance [3]. 

Quality cannot be ‘inspected in’, and nor can ethics.

Thank you.


[1]    I'm quoting here from the excellent history of British Leyland by Ian Nicholls

[2]    My company did a huge amount of work for Motorola and - as a subcontractor - we became certified software suppliers within their six sigma quality management programme.

[3]    It was competitive pressure that forced manufacturing companies in the 1970s to up their game by embracing TQM. Depressingly the biggest AI companies face no such competitive pressures, which is why regulation is both necessary and inevitable.

Saturday, May 15, 2021

The Grim Reality of Jobs in Robotics and AI

The reality is that AI is in fact generating a large number of jobs already. That is the good news. The bad news is that they are mostly - to put it bluntly - crap jobs. 

There are several categories of such jobs. 

At the benign end of the spectrum is the work of annotating images: looking at images, identifying features and labelling them. This work – sometimes called AI tagging – is simple and incredibly dull, but important because it generates the training data sets for machine learning systems. Those systems might be AIs for autonomous vehicles, with the annotators labelling bicycles, traffic lights and so on. The jobs are low-skill and low-pay, and a huge international industry has grown up to allow the high tech companies to outsource this work to what have been called white collar sweatshops in China or developing countries. 

A more skilled version of this kind of job is that of the translators required to ‘assist’ natural language translation systems that get stuck on a particular phrase or word.

And there is another category of such jobs that are positively dangerous: content moderators. These are again outsourced by companies like Facebook, to contractors who employ people to filter abusive, violent or illegal content. This can mean watching video clips and making a decision on whether the clip is acceptable or not (and apparently the rules are complex), over and over again, all day. Not surprisingly content moderators suffer terrible psychological trauma, and often leave the job burned out after a year or two. Publicly Facebook tells us this is important work, yet content moderators are paid a fraction of what staffers working on the company campus earn. So not that important.

But jobs created by AI and automation can also be physically dangerous. The problem with real robots, in warehouses for instance, is that like AIs they are not yet good enough to do everything in the (for the sake of argument) Amazon warehouse. So humans have to do the parts of the workflow that robots cannot yet do and - as we know from press reports - these humans are required to work super fast and behave, in fact, as if they are robots. And perhaps the most dehumanizing part of the job for such workers is that, like the content moderators (and for that matter Uber drivers or Deliveroo riders), their workflows are managed by algorithms, not humans.

We roboticists used to justifiably claim that robots would do jobs that are too dull, dirty and dangerous for humans. It is now clear that working as a human assistant to robots and AIs in the 21st century is dull, and physically or psychologically dangerous – sometimes both. One of the foundational promises of robotics has been broken. This makes me sad, and very angry.

The text above is a lightly edited version of my response to the Parliamentary Office of Science and Technology (POST) request for comments on a draft horizon scanning article. The final piece How technology is accelerating changes in the way we work was published a few weeks ago.

Thursday, May 13, 2021

The Energy Cost of Online Living in Lockdown

Readers of this blog will know that one of the many ethical issues I worry about is the energy cost of AI. As part of the work I'm doing with Claudia Pagliari and her National Expert Group on Digital Ethics for Scotland I've also been looking into the energy costs of what is - for many of us - everyday digital life in lockdown. I don't yet have a complete set of results but what I have found so far is surprising - and not in a good way.

So far I've looked into the energy costs of (i) uploading to the cloud, (ii) streaming video (i.e. from iPlayer or Netflix), and (iii) video conferencing.

(i) Uploading to the cloud. This 2017 article in the Stanford Magazine explains that when you save a 1 GByte file – that’s about 1 hour of video - to your laptop’s disk drive the energy cost is 0.000005 kWh, or 5 milliwatt-hours. Save the same file to the Cloud and the energy cost is between 3 and 7 kWh. For comparison, running your electric kettle for an hour burns about 3 kWh. This means that the energy cost of saving to the cloud is around a million times higher than saving to your local disk drive. 
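As a sanity check, the arithmetic behind that "million times" figure takes only a couple of lines of Python (a quick sketch using the figures quoted above):

```python
# Sanity check of the local-vs-cloud figures quoted from the Stanford article.
local_kwh = 0.000005                        # saving 1 GByte to a local disk: 5 milliwatt-hours
cloud_kwh_low, cloud_kwh_high = 3.0, 7.0    # saving the same 1 GByte file to the cloud

print(f"{cloud_kwh_low / local_kwh:,.0f}x to {cloud_kwh_high / local_kwh:,.0f}x")
# the ratio comes out at roughly 600,000x to 1,400,000x - i.e. about a million times
```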

The huge difference makes sense when you consider that there is a very complex international network of switches, routers and exchange hubs, plus countless amplifiers maintaining signal strength over long distance transmission lines. All of this consumes energy. Then add a slice of the energy costs of the server farm.

(ii) Streaming video. This article in The Times from May 2019 makes the claim that streaming a 2 hour HD movie from Netflix incurs the same energy cost as boiling 10 kettles (based on the sustainable computing research of Mike Hazas). To estimate how much energy that equates to we need to guess how full the kettle is. A half full 3 kW kettle will take about 2 minutes to boil, consuming 0.1 kWh (100 watt-hours). Do that 10 times and you've burned 1 kWh. A DVD player typically consumes 8 watts, so playing the same 2 hour movie from DVD would use about 0.016 kWh - making streaming roughly 60 times more energy hungry.
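The kettle arithmetic is easy to re-run (a rough sketch, assuming the 3 kW kettle rating and 2 minute boil time):

```python
kettle_kw = 3.0                       # typical kettle power rating
boil_hours = 2 / 60                   # ~2 minutes to boil a half-full kettle
one_boil_kwh = kettle_kw * boil_hours           # 0.1 kWh per boil
streaming_kwh = 10 * one_boil_kwh               # "10 kettles" -> about 1 kWh per movie

dvd_watts = 8                         # typical DVD player power draw
dvd_kwh = dvd_watts / 1000 * 2        # the same 2 hour movie played from DVD
print(f"streaming uses {streaming_kwh / dvd_kwh:.1f}x the energy of a DVD player")
```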

Again this makes sense for the same reasons as (i), except that here you are downloading from Netflix's servers rather than uploading. A 2 hour HD movie is a lot of data - around 10 GBytes, so 10 times more than the case for (i) above.

(iii) Video conferencing. This post on David Mytton's excellent blog explores the energy cost of Zoom meetings in some detail. David estimates that a 1 hour video zoom call with 6 participants generates between 5 and 15GB of data and that the data transfer consumes between 0.07 and 0.22 kWh of electricity. Using our benchmark of kettles boiled this is pretty modest - at most around two kettle boils' worth. 

However this estimate makes two assumptions. First, that you are connected via cable or fixed line - which here in the UK costs 0.015 kWh per GByte; a mobile connection costs about seven times that, at 0.1 kWh/GB. Second, the estimate measures only the energy cost of data transmission and takes no account of the energy costs of Zoom's data centres, which - if (i) and (ii) here are anything to go by - could be significant, especially since there aren't any in the UK and the default servers are in the US.
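Those per-GByte transmission figures make it easy to reproduce David's estimate, and to see what the same call would cost over mobile (a quick sketch using the UK figures quoted above):

```python
data_gb = (5, 15)                # data generated by a 1 hour, 6-person Zoom call
fixed_kwh_per_gb = 0.015         # UK fixed-line transmission cost per GByte
mobile_kwh_per_gb = 0.1          # mobile: about seven times the fixed-line cost

for gb in data_gb:
    fixed = gb * fixed_kwh_per_gb
    mobile = gb * mobile_kwh_per_gb
    print(f"{gb} GB: {fixed:.3f} kWh fixed line, {mobile:.2f} kWh mobile")
# fixed line comes out at roughly 0.075 - 0.225 kWh, matching the quoted estimate;
# over mobile the same call costs 0.5 - 1.5 kWh - comparable to streaming an HD movie
```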

As this article on the Zoom blog explains, Zoom calls are not peer to peer. The video from each participant is streamed first to a zoom server then broadcast to every other person on the call. As David Mytton says Zoom don't release information on the overall energy costs of calls. I strongly suspect that if server energy costs were factored in they would be in line with cases (i) and (ii) above. Even so, I feel sure that David Mytton's overall conclusion remains true: that the energy cost of Zoom meetings is significantly lower than all but local or regional travel.


I would like to see networking services like cloud storage, video on demand and video conferencing publish a meaningful energy cost. When we buy packaged food from the supermarket we expect to read the calorific energy value of each item, broken down into fat, salt and so on. It would be great if every online transaction, from sending an email to watching a movie, revealed its energy/carbon cost. Not just for energy geeks like me, but to remind all of us that the Digital Economy is *very* energy hungry.

I would welcome any additional data which either adds to the above (especially the energy costs for smaller online transactions like tweets, emails or card payments), or shows that the estimates above are wrong. 

Related blog posts:

On Sustainable Robotics
Energy and Exploitation: AIs dirty secrets
What's wrong with Consumer Electronics?