(Image: Mitchell Squire/Private Media)

Say the word “drone” and the average human brain jumps to surveillance, spies, and… well, insert any sinister sentiment you please.

This is no doubt abetted by Hollywood renditions of robots that outwit their programmers and angle for evil, but the reality is that these machines are more often mundane than menacing. So why are humans programmed to react with fear and loathing towards a pre-programmed system?

“We see drones how the Egyptians used to see birds: ‘What do you want, why do you look at me, why are you following me?’ ” Dr Catherine Ball, a scientific futurist at the Australian National University’s recently opened School of Cybernetics, told Crikey.

It’s a relationship wrapped in distrust of what we don’t know and a one-size-fits-all word for anything automated, autonomous, or appearing to be autonomous.

While this gives drones a bad rap, Ball says it also gives them an edge as the human “gateway drug” into an increasingly technological and AI-inclined world.

Move over for the machine

Humans already go head-to-head with machines daily. It could be an automatic door that refuses to admit entry, a revolving door that runs a tight schedule, automatic lights that ignore the idle inhabitant, automatic hand-washing stations that get a little overexcited by human touch, or a self-flushing toilet that goes above and beyond to get bums on seats. These systems serve a purpose, but to function they need humans to actively engage with them.

And yet the notion of having any semblance of a relationship with a machine sends the average human spiralling into their own science-fiction drama.

Humans tend to think of computers in two ways: slave or boss. Both come with connotations of control — either human controls machine or machine controls human — and that’s problematic for both parties because it eliminates the capacity for give and take, trial and error, feedback and improvement.

The Australasian Dance Collective, in collaboration with ANU’s School of Cybernetics, is attempting to dispel these perceived power dynamics through a dance piece pairing humans with drones.

“Is it drones in a dance piece or is it dancers in a drones piece?” Ball pondered.

From the stage to the street

As the world digs its heels into an increasingly automated future, the systems humans engage with are set to become more complex. Be it for manufacturing, transport, delivery, cleaning, emergency services, medical miracles inside our bodies, mapping under the sea, on the street or out of this world, robotics is on the rise.

Machines are moving out of standalone desktop devices and into the world around us as systems that move and act by themselves. The more things a system interacts with, the more moving parts (literally) there are to account for. Understanding the basics of machine interaction (small talk through to finishing one another’s sentences) is therefore key.

And humans already do it.

“When speaking to an AI system, we regulate our voice and our manner of speaking to ensure it can understand us. It’s the same with any type of robotic system, but it’s much easier to learn to talk than to figure out how to move,” said Ball’s colleague Alex Zafiroglu, professor of cybernetics and deputy director of the School of Cybernetics.

Which is why the ANU team have leaned into dance. Renowned for its robotic level of precision (think Swan Lake, where the dancers are so precisely drilled they look like mirror images of one another), it’s an apt medium to explore the field of “co-botics” and draw up a how-to guide (call it a dress rehearsal) for moving with machines. Data from the dance will also be used to inform the design and development of manufacturing settings and infrastructure.

In short: cool, calm, collected humans and operating environments make for level-headed machines. And level-headed machines breed happy humans.

“I think it will be a while before we’re shaking hands with a computer and calling it brother,” said John Billingsley, professor of mechatronic engineering at the University of Southern Queensland, but he added that humans are also pre-programmed, play-tested and subject to automatic system upgrades courtesy of machines.

The art of computers

Billingsley has been in the computing game for more than 50 years. He was part of the 1960s cohort that introduced the 20th century to computing technologies through art and design — the original cybernetics. Jump forward to the 21st century and the ANU is deploying the same methods to entertain a similar (albeit technologically updated) conversation.

It’s simple, says Billingsley: “Computers encourage you to be creative.”

Hark back to the very first Cambridge computer. A human could play noughts and crosses with the machine by putting their hand through a beam of light. All these years later and the same game-like principles apply, only now humans will move in turn with systems like self-driving cars.

But Billingsley says there is nothing to fear, only new things to learn: “All of the great mysteries of science will be answered in the next 20 years, and all of those answers will be wrong.”

Is technology too much? Let us know your thoughts by writing to letters@crikey.com.au. Please include your full name to be considered for publication. We reserve the right to edit for length and clarity.