Google’s ‘Duplex’ And Its Impact on Ethical Use of Technology

Written by Ashwin Krishnan

Silicon Valley recently hosted Google I/O, one of the valley's most well-attended tech conferences. As expected, there were plenty of interesting announcements, but none more interesting than Google Duplex.

In the conference keynote, Google CEO Sundar Pichai played back two phone conversations, which he claimed were completely genuine, in which Google's AI-driven voice service, Duplex, called real-world businesses and scheduled appointments based on a user's data. In both cases, the voice sounded eerily more human than anything we have heard before.

And that raises the critical but often ignored topic of ethics in the use of technology, Duplex included, by enterprises and consumers alike. The need for an ethical framework that is both clearly defined and adhered to has never been more pressing.

Why Duplex will push ethical boundaries for businesses

The impact of organizational transformation on humans has been stark and obvious over the past century. That is about to change dramatically: what was transparent is about to be cloaked in opacity.

From a transparent to an opaque world

The core business goals have endured for decades: lowering OPEX and CAPEX while increasing revenue and profitability. The industrial revolution, the IT revolution and the internet revolution have all led to better use of OPEX and CAPEX, making businesses more efficient and significantly disrupting the kinds of employable skills needed. But in each of these revolutions, it was evident what was happening: tractors replacing humans in the field; computers crunching numbers and allowing for faster, error-free human decisions; online commerce bridging suppliers and producers around the globe.

But Duplex raises the alarming possibility of all that changing. How so? Because for the first time, what is human and what is not is under question. And for consumers, not knowing whether they are truly talking to a human or a computer can be quite disturbing.

The Moucheng conspiracy

This is not a prediction about some distant future; businesses are already crossing boundaries with the technology of today. In January 2018, 21 dating apps, including Moucheng, were shut down by authorities in China after it was discovered that hundreds of thousands of customers, who had paid to chat with women online, had instead been duped into messaging with an AI computer program. The largest among them, Moucheng, had defrauded over a million users who paid a total of 340 million yuan (approximately $50 million USD). What will never be known is the toll this egregious breach of trust and privacy took on the mental well-being of some of these lonely individuals.

The only path that businesses can adopt—transparency

Consumers are growing increasingly vigilant. Sweeping regulations like the GDPR (General Data Protection Regulation) are making consumers more aware of their rights and responsibilities. Scandals like the Facebook–Cambridge Analytica data breach are no longer transient sensational news items; they can take businesses down, as shown by Cambridge Analytica itself, which recently filed for bankruptcy following the huge public and media outcry.

Transparency is the only way forward if businesses are to survive. And if a business truly wants to use Duplex as a human-sounding assistant, which could no doubt revolutionize its customer engagement and operational costs, it would need to make that known to the consumer. Not through some small pop-up screen, but in big bold letters: "You are talking to a computer. Please hit OK to continue."
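If a team were building such an assistant, that disclosure could be enforced in code as a gate at the very start of every conversation, before any scripted dialogue runs. A minimal sketch, with purely hypothetical names (nothing here reflects the actual Duplex API):

```python
# Hypothetical "disclosure-first" conversation gate: the bot identifies
# itself as automated and proceeds only on explicit consent.

DISCLOSURE = "You are talking to a computer. Please reply OK to continue."

def start_conversation(reply_fn):
    """Deliver the disclosure and return True only if the other party
    explicitly consents; otherwise the bot must end the call."""
    answer = reply_fn(DISCLOSURE)
    return answer.strip().lower() == "ok"

# A consenting caller reaches the bot's dialogue...
print(start_conversation(lambda prompt: "OK"))        # prints True
# ...a caller who declines never does.
print(start_conversation(lambda prompt: "no thanks")) # prints False
```

The design choice worth noting is that consent is checked before any other logic executes, so no code path can reach the human-sounding dialogue without the disclosure having happened first.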

The consumer issue—no time to keep up with my network

The distracted consumer, with hundreds of connections across a variety of social media apps, is the subject of some pretty alarming statistics when it comes to mental health. One online education portal recently culled findings from a variety of media and research outlets to reveal some telling numbers. The one that jumps out: 24% of respondents to one survey said they have missed out on enjoying special moments in person because they were too busy trying to document their experiences for online sharing. Much as the business goals of cost and revenue have endured over the ages, maintaining healthy relationships has been an enduring human trait, and it is coming under severe strain.

Duplex to the rescue

If you were to create your own Moucheng, not the dating app per se but an assistant to engage with your friends, would that be wrong? At Google I/O, Google demonstrated automatically tagging a friend and sending her the photos she is featured in, without you lifting a finger. What if Duplex could then call your friend Liza and engage in meaningful conversation while pretending to be you, leaving you free to enjoy some sweet-nothing moments with your significant other?

The only path that social media users can adopt—transparency

The solution is exactly the same: let your friends know that it is not the real you talking to them, but rather your outsourced bot. In a recent study, Michelle Drouin, a psychology professor at Purdue University Fort Wayne, and her fellow researchers split 350 undergraduate students into three groups. The first group was told it would be interacting with a bot, the second was told it would be talking to a real person, and the last was informed only after the interaction that it had been communicating with a bot. The results were very interesting: the first two groups were nearly equally happy with the experience; the last group was not, describing it with words like "eeriness" and "deception."

Technologies like Duplex will be in our digital universe very soon. If used with care and transparency, they will bring amazing benefits, making our businesses more efficient and competitive while deepening customer engagement. Our lives as social media users can also improve if we use these virtual assistants wisely and embrace transparency. The alternative is business jeopardy and social ostracism.

About the author: Ashwin Krishnan is an ex-high-tech executive with security and cloud expertise, now focused on educating business executives on the impact of regulations, AI and IoT.