The surge in public conversation about intelligent machines means that these days, the term “artificial intelligence”, or AI, is a catch-all. There are many definitions of AI – and many, many more imaginings about it. When it comes to describing what AI actually is or does, there is usually a more mundane term for it, each with its own field and subfields: machine learning, robotics, virtual reality, data mining. On the other hand, there are technologies that already exist – many of which we use every day – that draw on a range of computational techniques that could fall under the umbrella of AI, but which in practice aren’t called “artificial intelligence”: we call them search engines, drones, web stores, streaming platforms, social networks, voice assistants. And by some definitions, nothing we currently have is AI – artificial intelligence is the promise of something that hasn’t been invented yet.
In that context, we find it helpful to think not of AI, but of the constellation of technologies where data, networks, algorithms, machine learning and edge computing converge to transform the way computers and physical objects work. It is this constellation of technologies that is profoundly changing the world we live in.
So rather than AI, we focus on “cyber-physical systems” (CPS).
The core features of all CPS are an ability to automatically sense the environment (drawing on IoT-connected datasets or creating new data through sensing technology), to infer something from that data, and to act upon it in a way that has real and unmediated effect in the world. CPS work in the world through a process of SENSE–INFER–ACT. Drones, autonomous vehicles, smart city infrastructure, wearable tech: these technologies are just the start of this CPS convergence. With advances in machine learning, these systems are moving rapidly towards being “proactive” – that is, capable of action without immediate reference to human controls – and “intelligent”, in that they can learn and adapt their actions in response to new information.
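The SENSE–INFER–ACT cycle can be illustrated with a minimal sketch. This is not the architecture of any real CPS; the `Sensor`, `ThresholdModel` and `Actuator` classes below are hypothetical stand-ins showing how the three stages compose into a loop:

```python
# Illustrative sketch of a SENSE-INFER-ACT loop. All classes here are
# hypothetical stand-ins, not a real cyber-physical-systems API.

class Sensor:
    """SENSE: produces readings from the environment (here, canned values)."""
    def __init__(self, readings):
        self._readings = iter(readings)

    def sense(self):
        # Returns the next reading, or None when the stream is exhausted.
        return next(self._readings, None)


class ThresholdModel:
    """INFER: derives a simple decision from a reading."""
    def __init__(self, threshold):
        self.threshold = threshold

    def infer(self, reading):
        return "act" if reading > self.threshold else "wait"


class Actuator:
    """ACT: acts on the world; here it just records the actions taken."""
    def __init__(self):
        self.log = []

    def act(self, decision, reading):
        if decision == "act":
            self.log.append(f"responded to {reading}")


def run_loop(sensor, model, actuator):
    """Repeat SENSE -> INFER -> ACT until the sensor has nothing left."""
    while (reading := sensor.sense()) is not None:
        decision = model.infer(reading)
        actuator.act(decision, reading)
    return actuator.log


# Example: the system acts, unprompted, on readings above a threshold.
log = run_loop(Sensor([12, 48, 7, 55]), ThresholdModel(30), Actuator())
```

A “proactive” system in the sense above is one where this loop runs without a human in it: the actuator fires whenever the inference stage decides to, not when an operator presses a button.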
While there is an increasing number of programs forming at universities around the world dedicated to CPS, they generally focus on the system as a combination of robotics and software and its interaction with the physical world (usually in a lab setting). At 3Ai, our research looks at the system beyond the metal – the broader technical, human and environmental implications of emerging CPS. We also look back to the past – all the way to the first human technical systems, such as 35,000-year-old fish traps, examples of which can still be seen in places like Brewarrina. The Brewarrina fish traps were in continuous use until the 1930s, when colonisation disrupted the Indigenous Australians’ way of life – imagine today building a technical system designed to last tens of thousands of years.
This broader perspective means our approach to research into AI and CPS is fundamentally transdisciplinary.