Confucianism for robots? Better than a constitution, says CMU

Over the past several months, generative AI tools like ChatGPT have jolted the public into something like science fiction on steroids, while career professionals who endured years of schooling to become doctors, lawyers, and computer programmers have sweated over the prospect of having their jobs cannibalized.

Amid the chaos, perhaps only the most zen of souls are asking the question: What about rights for robots?

Few have wondered, mostly philosophers and scholars, according to research, and some of them have advocated for giving robots rights. But now, a study out of Carnegie Mellon University has come to the conclusion that “granting rights to robots is a bad idea.”

It sounds like the type of quandary that exists only in sci-fi Westworld realms, but in real life various nonhumans are already granted moral and legal status, from animals, which are protected against certain kinds of experimentation, to corporations, which are extended constitutional rights.

The ethical conflict reared its head once more after a recent video from Boston Dynamics, famed for its robot dog Spot, sparked controversy by showing its new, 6-foot-tall bipedal humanoid robot Atlas, designed for search-and-rescue missions, being abused by employees, who kicked it, hit it with hockey sticks, and knocked it over with a heavy ball (for testing purposes).

For extremists on either side (those who argue that robots should be granted rights usually reserved for sentient beings, and those who believe robots should be put to work while denied any moral consideration), Carnegie Mellon’s analysis, which was led by a professor of business ethics at the university’s Tepper School of Business, offers a compromise.

It’s rooted in the ancient Chinese philosophy of Confucianism. As the study’s lead author, Tae Wan Kim, explains, Confucianists observe a reverence for rites: performing rituals that are said to bring followers closer to moral transcendence. Thus, robots should be assigned their own rites, or what Kim calls “role obligations.”

It would almost seem like a clever workaround for extracting more robot labor, were that not an overly simplistic and blasphemous reading. But as Kim expounds, it’s clear that this arrangement would see robots as, almost frighteningly, human.

“Assigning role obligations to robots encourages teamwork, which triggers an understanding that fulfilling those obligations should be done harmoniously,” explains Kim. “Artificial intelligence imitates human intelligence, so for robots to develop as rites bearers, they must be powered by a type of AI that can imitate humans’ capacity to recognize and execute team activities—and a machine can learn that ability in various ways.”

Kim knows some might roll their eyes at any mandate to treat automated machines with “respect” and “dignity.” But in true Confucian spirit, Kim says, it’s a mirror of humankind’s own character, reflecting the path on which we find ourselves traveling. The classical Chinese system of thought seeks to achieve balance between ethical yin and yang forces: “Individuals are made distinctively human by their ability to conceive of interests not purely in terms of personal self-interest, but in terms that include a relational and a communal self.”

For Boston Dynamics’ Atlas, meanwhile, Kim’s paper argues, “We should also ask what it means for Atlas to live a good life. . . . Being kicked by a human for testing purposes, hence, is likely to be consistent with its conception of a good life because Atlas exists to augment human capabilities. This does not mean, however, that humans can do anything to robots. . . . In analogy, the robot and the human dance together as a well-coordinated team.”

As robots that we interact with “look and behave similar to humans,” says Kim, “we need to think about what it means to be human. . . . To the extent that we make robots in our image, if we don’t treat them well, as entities capable of participating in rites, we degrade ourselves.” The loss of our humanity? That, perhaps, is the most terrifying dystopia we face.
