Can artificial intelligence learn the moral values of human societies?

Can AI develop a sense of right and wrong?

In short, will artificial intelligence have a conscience?

This question might sound irrelevant when considering today's AI systems, which are only capable of accomplishing very narrow tasks.

But as science continues to break new ground, artificial intelligence is gradually finding its way into broader domains.

And then, the question of conscience and conscientiousness in AI will become even more critical.

But it also shows us how much further we must go to truly understand how humans make moral decisions.

Here's a very quick rundown of what Conscience tells us about the development of moral intuition in the human brain.

But how did humans develop the ability to understand and adopt these rights and wrongs?

Birds and mammals are endotherms: their bodies have mechanisms to preserve their heat.

The great benefit of endothermy is the capability to gather food at night and to survive colder climates.

The tradeoff: endothermic bodies need a lot more food to survive.

This requirement led to a series of evolutionary steps in the brains of warm-blooded creatures that made them smarter.

Most notable among them is the development of the cortex in the mammalian brain.

The cortex learns, integrates, revises, recalls, and keeps on learning.

But again, learning capabilities come with a tradeoff: mammals are born helpless and vulnerable.

And this is why they depend on each other for survival.

The brains of mammals repurposed the circuitry for self-care to adapt for sociality.

Self-love extended into a related but new sphere: other-love.

The main beneficiaries of this change are the offspring.

Evolution has triggered changes in the circuitry of the brains of mammals to reward care for babies.

It is sensitive to long-term as well as short-term considerations.

The larger brain naturally makes us much smarter but also has higher energy requirements.

So how did we come to pay the calorie bill?

Our genetic evolution favored social behavior.

Moral norms emerged as practical solutions to our needs.

The structure of our brain is the result of countless experiments and adjustments.

But none of this means that moral norms are not real.

These virtues remain entirely admirable and worthy to us social humans, regardless of their humble origins.

"They are an essential part of what makes us the humans we are," Churchland writes.

Moral norms emerge in the context of social tension, and they are anchored by the biological substrate.

After reading Conscience, I had many questions in mind about the role of conscience in AI.

Would conscience be an inevitable byproduct of human-level AI?

Does physical experience and sensory input from the world play a crucial role in the development of intelligence?

Fortunately, I had the chance to discuss these topics with Dr. Churchland after reading Conscience.

Is physical experience a requirement for the development of conscience in AI?

In the case of biological systems, the reward system, the system for reinforcement learning, is absolutely crucial.

Feelings of positive and negative reward are essential for organisms to learn about the environment.

That may not be true in the case of artificial neural networks.

We just don't know.
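
Her point about reward signals has a rough parallel in machine learning. Below is a minimal, purely illustrative sketch in Python of reward-driven, reinforcement-style learning: an epsilon-greedy agent that improves its choices using nothing but scalar positive and negative feedback. The action names and reward probabilities are invented for illustration and are not from the book or the interview.

```python
import random

# A toy illustration of learning from reward signals alone.
# Action names and reward probabilities are hypothetical.
ACTIONS = ["share_food", "hoard_food", "ignore"]
TRUE_REWARD_PROB = {"share_food": 0.8, "hoard_food": 0.4, "ignore": 0.1}

values = {a: 0.0 for a in ACTIONS}   # estimated value of each action
counts = {a: 0 for a in ACTIONS}     # how often each action was tried
epsilon = 0.1                        # exploration rate

for step in range(5000):
    # Explore occasionally, otherwise exploit the best-known action.
    if random.random() < epsilon:
        action = random.choice(ACTIONS)
    else:
        action = max(values, key=values.get)

    # Positive or negative feedback stands in for the "reward signal".
    reward = 1.0 if random.random() < TRUE_REWARD_PROB[action] else -1.0

    # Incremental update: nudge the estimate toward the observed reward.
    counts[action] += 1
    values[action] += (reward - values[action]) / counts[action]

print(values)  # the estimate for "share_food" should end up highest
```

Whether this kind of signal-driven learning could ever scale up to anything resembling a conscience is exactly the open question Churchland raises.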

She also pointed out that we still don't know how brains think.

That could just be a numbers problem.

If you're an engineer and you're trying to get some effect, you try all kinds of things.

Do we need to replicate the subtle physical differences of the brain in AI?

But one of the much-touted features of AI is its uniform reproducibility.

Copies of an AI model will all be identical down to the last parameter values of their neural networks.

How much else does it not have to have?

Engineers will try and see what works.

We do not know precisely what the brain does as it learns to balance in a headstand.
