In this #mtpcon SF+Americas keynote, Cynthia Yeung, Head of Product at Plus One Robotics, gets us thinking about human/robot interaction and how we might discover a deeper understanding of our own humanity through robotics and AI.
Watch the video to see her talk in full, or read on for an overview of her key points:
- Most people don’t actually like every aspect of their job
- Responsibilities are not the same thing as jobs
- We shouldn’t confuse jobs with human livelihood and dignity
- A robot can’t assume all the responsibilities of any given job
- As product managers we have to think about the consequences of our work
Cynthia starts by reflecting on the physical, economic and social context of her talk. She gives it from a customer site to people who could be watching anywhere. There is an opportunity cost both to her and to everyone watching, and of course none of us know each other. She says: “It’s almost private, intimate, even. It might seem as if I’m just two feet away, four feet away, I guess really. And in this moment, there’s only you and me. And so I wonder who is the you that’s on the other side of the screen?”
A dystopian future?
Cynthia read a lot of science fiction as a child. She says that these stories always seemed to centre around the same sort of dystopian themes – killer robots, environmental destruction and the need to venture off to new planets, a lack of community where we’ve lost all our social skills and people don’t talk to each other anymore, or worlds where robots have replaced all the humans and people don’t know what to do with themselves. She says that this last scenario is often in the headlines: “And here’s the rub, the paradox, if you will. And that’s that most people don’t actually like the entirety of their job.”
Responsibilities are not the same thing as jobs, she says. Jobs consist of a set of responsibilities that might evolve over time. Similarly, we shouldn’t confuse jobs with human livelihood and dignity. Jobs are often a means to an end. They are a source of income, and can provide people with a sense of purpose and an avenue for growth and learning. These are all great things, but they don’t have to come from jobs. It’s important to make this distinction between jobs and human livelihood and dignity – because they are not the same thing.
The machines won’t take over
Many people jump to the conclusion that a robot can assume all the responsibilities of any given job, but this is not true. Even the most advanced robots lack the dexterity and the human judgement needed to problem-solve on the fly.
People also assume that automation will lead to unemployment, and that unemployment will foment unrest. Cynthia says the truth here is more nuanced: she describes three classes of job. The first is jobs that are dull, dirty and dangerous, which really should be fully automated as soon as possible. The second is jobs where robots and humans can work together, and where humans can delegate unwanted responsibilities to robots. The third is jobs that will be created in this new economy, like designing robots. Cynthia says that many critics don’t seem to grasp that robots and AI can make jobs more human and more humane once certain responsibilities are automated, and she gives some current examples where this has happened.
From the Industrial Revolution onwards there have been fears about automation and unemployment. In the long run, says Cynthia, these fears typically have not come to pass. But while jobs continue to be created, the wealth gap is widening at the same time. She says: “This should concern us because these facts add up to an increasingly dire situation for the least economically privileged among us.”
While some economists point to automation as a cause of the growing wealth gap, Cynthia believes the cause lies rather in the societal context in which these technologies are used. She gives some examples – Google’s infamous image classification fiasco, when its system categorised Black people as gorillas, being one.
Good intentions aren’t sufficient
Good intentions aren’t enough, she says. “It is entirely possible for a good person to design a bad system. More to the point, it’s entirely possible for a good person to design a vulnerable system, one that can be misused by malicious actors. And this leads to undesirable outcomes. It’s important to design not just for the happy path, but also to poke and prod for vulnerabilities to understand where things might break.”
Data sets are only as good as the thoughtfulness and care of the people who put them together. Automation can be a boon or a curse, depending on what safety net and opportunities are available to the least privileged among us. Technology simply reveals the fissures that are already present in our society.
As product managers, we have to think about the consequences of our work, Cynthia concludes. “We have to be constantly aware of the contextual environment, the physical, the economic, the social. We are accountable, I am accountable for this interaction, this moment of connection, but also for the robots I’ve put out into the world. Robots are a mirror to our soul, they reflect the best and worst of human impulses.”
This article is part of our AI Knowledge Hub, created with Pendo. For similar articles and even more free AI resources, visit the AI Knowledge Hub now.