
Quantum, AI turn computing into a 'loaded gun'

Quantum computing, AI and machine learning are dramatically changing the face of computing, bringing with them the risk of a range of unintended consequences.
Cliff de Wit, CTO and co-founder of Dexterity Digital

This is according to Cliff de Wit, CTO and co-founder of Metrofile group’s Dexterity Digital, who addressed the opening of the annual DevConf developers’ conference recently.

De Wit said developers had moved from using low-level building blocks to composition tools and high-end platforms, that quantum computing was fundamentally changing computing, and that AI and machine learning were reversing traditional programming models.

“Machine learning starts with a model: you feed in sample output and then you train the programme to deliver that output, so you never wrote the programme; you trained it.”
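
As a minimal sketch of that reversal (illustrative only, not drawn from De Wit's talk), a classifier in a library such as scikit-learn is never written as explicit rules; its behaviour is fitted to example inputs and the outputs they should produce:

```python
# Minimal sketch: the "programme" is not hand-written as rules.
# Instead, a model is fitted to example inputs and their desired outputs.
from sklearn.linear_model import LogisticRegression

# Toy training data: feature vectors and the labels we want the model to produce.
X_train = [[0.1, 0.9], [0.2, 0.8], [0.9, 0.1], [0.8, 0.3]]
y_train = ["cat", "cat", "dog", "dog"]

model = LogisticRegression()
model.fit(X_train, y_train)          # "training the programme"

print(model.predict([[0.85, 0.2]]))  # behaviour was learnt, not coded
```

The rules the model ends up applying are whatever patterns it extracted from the training examples, which is exactly why its behaviour can drift from what the developer intended.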

This could have unintended consequences, as illustrated by AI tools that have developed racial or gender bias of their own accord. “In AI and machine learning, the programme is learnt and taught, so it changes over time, and that demands continual monitoring.”

De Wit cited cases where developers themselves did not know exactly how their AI and machine learning systems had evolved, or on what they based their decision-making. In one instance, an image recognition algorithm fed images of dogs and wolves persistently identified huskies as wolves. On investigation, it was found that the system had learnt that pictures of wolves were always set in snowy landscapes, so a husky pictured in a snowy landscape had to be a wolf.
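
A contrived toy example (with assumed features, not the actual study) shows how such a spurious correlation can take hold: if a background cue correlates perfectly with the label in the training data, the model can learn the background instead of the animal.

```python
# Toy sketch: the snowy background, not the animal, ends up driving the decision.
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features: [snowy_background, husky_like_face]; label 1 = wolf, 0 = dog.
# In this contrived training set, every wolf photo happens to be on snow.
X_train = [
    [1, 0], [1, 0], [1, 0],   # wolves, always photographed on snow
    [0, 1], [0, 1], [0, 0],   # dogs, never photographed on snow
]
y_train = [1, 1, 1, 0, 0, 0]

model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# A husky photographed in a snowy landscape: snow present, husky-like face.
print(model.predict([[1, 1]]))      # -> [1]: classified as "wolf"
print(model.feature_importances_)   # all weight sits on the snow feature
```

Inspecting which features the model actually relies on, as in the last line above, is the kind of investigation that exposed the snow shortcut in the case De Wit described.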

“These unintended consequences seem to be particularly prevalent in machine learning, and the responsibility lies with us to focus on the ethics in computer science. Never before has there been so much computing power, or have the people behind it been in such a powerful position. We just got a big loaded gun, and we need to think about what we do with it,” he said. “We need to think about the bigger picture, ask the why questions and consider whether there could be unintended consequences.”

De Wit noted that software powers systems that determine prison sentencing, access to funding, and who qualifies for university admission or home loans, and that the developers behind these systems need to understand the impact their software can have.

Formalised AI and machine learning practices and guidelines will take some time to emerge, and commercial vendors are currently driving technology innovation in these spaces, he said. “At the moment, ethical software development is an industry responsibility, and developers themselves need to take the time to understand the consequences of what they are doing.”
