Judging by the great response we had to NSW Policy Lab’s second event with Data61 on Tuesday 5 February, we know that plenty of people have been wondering about the use of artificial intelligence (AI) in the public sector.

The presenters covered the future of AI, machine learning, how AI can improve government services, and what can go wrong.

Ethics and empathy in AI were strong themes that need more consideration, along with the opportunity to design a great future for NSW citizens. As public servants, we need to be aware that what we discuss and design now, and the decisions we make about it, can have a great impact in the future.

Below is a short recap of each presentation. You can also watch the entire event on video in the Newsroom, or read the transcript of the event.

Panel discussion. L-R: Zack Thomas (moderator), Dr Audrey Lobo-Pulo, Tim De Sousa, Dr Richard Nock, Dr Tiberio Caetano, Nathan Frick

What the experts said

Pia Andrews, Executive Director of Digital Government Policy and Innovation, DFSI:

  • Trust and ethics are crucial for AI in government, including traceability, accountability and transparency.
  • Personal use of AI helpers will be a near-future reality, so governments need to ensure their rules, services, data and content are consumable externally.

Adrian Turner, CEO, Data61:

  • Australia has the opportunity to capture economic, societal and environmental benefits of AI with ethics at the core, export that thinking around the world, and be seen as leaders in AI.
  • We need to optimise nationally to make real progress.

Dr Richard Nock, Senior Principal Researcher at Data61 and Adjunct Professor at ANU:

  • People in the position to solve machine learning problems will rule the data/tech planet.
  • Machine learning started in the ‘sterile room’. The problem and the data were simple, which didn’t reflect the real world. The real world isn’t sterile – new solutions will create new problems.

Nathan Frick, Project Management Office, Revenue NSW:  

  • Look for opportunities to use AI within other government applications that we use.
  • Get a better handle on our data – clean, quality, and well-understood.

Dr Tiberio Caetano, Chief Scientist, Gradient Institute:  

  • You should learn about AI, but not faster than you learn about ethics.
  • Decisions made by an ethically blind AI system can lead to bias based on race, gender, age and other attributes. Formulas should be clear and transparent (and data sound) to prevent unfair bias.
  • We are still in the early days of AI ethics research and of understanding what it means to create safer AI that minimises harm to society.

Dr Audrey Lobo-Pulo, Founder, Phoensight:

  • We need to make AI less intimidating. What we bring to the table as humans is more important than what AI has the potential to offer.
  • Governments need to think carefully about how they’re designing systems and the interactions citizens have with those systems before AI is considered.
  • Openness and open government are the first step to transparency.

Questions to ask about AI, from the experts

If you’re dabbling in the world of AI in government but aren’t sure where to start, our panellists raised many questions that will get those cogs turning:

  • How does this AI fit with our values?
  • How can we use AI to create better, seamless services?
  • Will this solution help shape a better, more inclusive future for everyone?
  • Is AI the solution to the problem at hand?
  • How do we make people feel safe in an AI environment?
  • How do we design AI that is empathetic and can relate to people’s human-ness?
  • What biases do we need to be mindful of?

Upcoming Ethical AI workshop

The themes of empathy and ethics came through very strongly. Even though the intelligence is artificial, AI needs to be able to relate to humans.

The Policy Lab is looking at the AI ethical needs of NSW Government.  

If you are an NSW Government employee or an advocate for vulnerable citizens, and have an interest in or experience with AI, join us at our Sydney workshop at 10am on 28 February 2019.

Email us at [email protected] if you are interested in participating.