For the latest edition of TechCrunch’s informal book club, we read the fourth story in Ted Chiang’s Exhalation, “The Lifecycle of Software Objects.” The goal of this book club is to expand our minds to new worlds, ideas, and perspectives, and this story does not disappoint. Set in a future where virtual worlds and generalized AI have become commonplace, it is a fantastic example of speculative fiction that forces us to ask all sorts of fundamental questions.
If you missed the earlier parts of this book club series, be sure to read the following:
Some questions about the fifth story in the collection, “Dacey’s Patent Automatic Nanny,” are included below.
And as always, a few notes:
- Want to join the conversation? Email me your thoughts at [email protected], or take part in the discussions on Reddit or Twitter.
- Follow these informal book club articles here: https://techcrunch.com/book-review/. That page also has an integrated RSS feed for articles only in the “Book Review” category, which is very low volume.
- Feel free to add your comments in the TechCrunch comment section below this post.
Thinking about “The Lifecycle of Software Objects”
This is a much longer story than the earlier entries in Exhalation, with a much more linear narrative than the fractal koans we have seen so far. That wider canvas offers us a huge buffet of topics, from empathy to the meaning of humanity and the values we vouch for, to artificial entities, the economics of the digital future, and the future of romance, sex, children, and death. I have pages of notes on this story, but we can’t cover everything, so I want to zoom in on just two themes that I found particularly deep and rewarding.
A key goal of this story is to genuinely interrogate what it means to be a “person.” Chiang positions our main character, Ana, as the mother of a digital entity (a “digient”); in a previous career, she was a zookeeper. That backstory gives us a useful frame: through Ana we can compare humans with animals, which contextualizes the debate over the digients’ personhood throughout the story.
On the one hand, humans prize themselves as a uniquely valuable species, and even the most dedicated digient owner eventually moves on. A particularly revealing passage comes when one digient owner announces a pregnancy:
“Of course you’ll have your hands full,” says Ana, “but what do you think about keeping Lolly? It would be fascinating to see how Lolly reacts to the pregnancy.”
“No,” says Robyn, shaking her head. “I’m past that now.”
“Past that?”
“I’m ready for the real thing, you know what I mean?”
Ana says cautiously, “I’m not sure I do.”
“… cats, dogs, digients, they’re all just substitutes for what we’re supposed to be caring for.”
This owner has drawn a clear line: there is only one kind of entity worth caring for, only one thing a human can regard as a person, and that is another human.
Indeed, throughout the story Chiang notes how the tastes, values, norms, rules, and laws of human society are designed almost exclusively for humans. Yet the story never takes a definitive stance, and even Ana remains conflicted to the end. What the narrative does give us is a model for thinking it through that I found valuable: experience.
What differentiates humans from other animals is that we make decisions based on our own past experiences. We accumulate experience and use it to guide our actions toward the outcomes that we, again from experience, have learned to want. We might want to make money (because experience shows that money is useful), so we decide to go to college to get the kind of training that makes us effective in the job market. Experience is essential to that entire decision.
Chiang makes this point sharply through a company called Exponential, which is intent on finding a “superhuman AI” without doing the work that Ana and the other digient owners have put into raising their companions. Ana eventually realizes that Exponential will never find what it is looking for:
They want something that responds like a person but isn’t owed the obligations of a person, and that is something she can’t give them.
No one can give it to them, because it’s impossible. The years she spent raising Jax didn’t just make him fun to talk to; they gave him hobbies and a sense of humor. They gave him all the qualities Exponential is looking for: fluency at navigating the real world, creativity at solving new problems, judgment you could entrust an important decision to. Every quality that made a person more valuable than a database was a product of experience.
She wants to tell them that Blue Gamma was more right than it knew: experience isn’t merely the best teacher; it’s the only teacher. Experience is algorithmically incompressible.
Indeed, experience becomes the watchword when the owners consider whether to grant their digients the independence to make their own decisions. The digients’ ability to make choices in the context of their past experiences is what defines their personhood.
So when we think about artificial general intelligence and the hope of creating sentient artificial life, I think this litmus test will be the real challenge for what the technology can become. Can we train an AI purely with algorithms, or do we have to guide these open but empty minds every step of the way? Chiang addresses this a bit earlier in the story:
They are blind to a simple truth: complex minds can’t develop on their own. If they could, feral children would be like any other child. And minds don’t grow the way weeds do, flourishing under indifferent attention; otherwise all children in orphanages would thrive. For a mind to even approach its full potential, it needs cultivation by other minds.
Indeed, Ana and the other main character, Derek, are forced to push their digients forward, assigning them homework and steering them toward new activities so that they keep accumulating the kinds of experience they need to succeed in the world. Why should we assume a generalized AI would be any less lazy than a child is today? Why should we expect it to teach itself when humans cannot teach themselves?
Speaking of children, let me move on to the other theme in this story that I found particularly astute. There is clearly a parallel to real human child-rearing running through the whole narrative. That parallel is plain to see, and while it is interesting, many of the conclusions that follow from it are obvious as well.
What is more interesting is what affection and attachment mean in a world where entities don’t have to be “real.” As a zookeeper, Ana had a deep affection for the animals in her care (“Her eyes still water when she thinks of the last time she saw her monkeys, wishing she could have explained why she hoped they would adapt to their new home.”). She vigorously defends her relationship with those animals, just as she does with the digients in the story.
But why are some entities loved more than others when they are all just code running in the cloud? The digients of the story have literally been designed to be appealing to humans. As Blue Gamma sifts through thousands of algorithmically generated digients, it carefully selects the ones that will attract owners: “It was partly a search for intelligence, but it was also a search for temperament, the kind of personality that wouldn’t frustrate customers.”
The reason, of course, is obvious: these creatures need attention to thrive, but they won’t get it unless they are adorable and desirable. Derek spends his time animating the digients’ avatars to make them more appealing, crafting spontaneous and accidental facial expressions that bond their human owners to them.
The story pushes this theme much further, though, in interlocking layers. Derek is attracted to Ana throughout the story, while Ana stays focused on raising her own digient and maintaining her relationship with her boyfriend Kyle. Derek eventually realizes that his obsession with Ana has become unsustainable, a subtle parallel to Ana’s own obsession with her digients:
He no longer has a wife to complain about it, and Ana’s boyfriend Kyle doesn’t seem to mind, so he can call her freely. It is a painful pleasure to spend so much time with her. It might be healthier for him to interact with her less, but he doesn’t want to stop.
In fact, the story’s strongest thesis may be that this kind of love is simply not reproducible. Ana considers joining a company called Polytope to raise the funds needed to port her digient to a new digital platform. Under the employment agreement, she would be expected to wear a smart transdermal called InstantRapport, which uses chemical changes in the brain to rewire a person’s reward centers so that they automatically love a specific person. Ana’s love for her digient pushes her toward rewiring her own brain to get the resources she needs.
And yet the digients eventually develop similar thought processes. Marco and Polo, two of Derek’s digients, ultimately agree to be copied as sex toys to finance the port. In their clones, the reward maps are rewired so that they love the customers who buy them.
The story reminds us that we are, in the end, a bundle of neurons responding to stimuli. Some of those stimuli are under our control, but many are not; they are programmed by our experiences without our conscious intervention. And here we see how the two threads intertwine: only through experience can we create affection, and it is affection, and through it experience, that makes a person at all.
Some questions about “Dacey’s Patent Automatic Nanny”
- Can machines play an important role in raising children?
- Did the scientific method work in this case?
- Connect this story with “The Lifecycle of Software Objects.” What is Chiang trying to say about raising children? Are there similarities or differences between these two stories’ ideas about children and parents?
- Should we be concerned if a child only wants to talk to a machine? Should we care which entities a person feels comfortable with?