The premise is a world much like our own, but with mostly perfected android technology. The synths appear human and have AI sufficiently advanced to interact with humans. They are not, however, self-aware. Except for a select few, who have been captured, wiped, and returned to service. Now they need to find each other while hiding their true identities, especially from the secret synth police chasing them. Apart from some of the worst examples of technobabble posing as computer jargon, the show works and is very well made. I’ll just assume that robotic AI is very complex, and they borrowed jargon from simpler fields.
Unfortunately, some characters haven’t learned the cardinal rule of human-robot interaction. If you have to tell it, “you can’t feel, you’re just a machine,” then it is probably wiser not to say that. There’s some talk of the difference between emotion and the simulation of emotion, but what’s the difference when interacting with it? If it’s complex enough to simulate emotion, it’s complex enough for all sorts of unpredictable emergent behavior to develop. Treating the synth as if it had feelings would still be appropriate.
In the early episodes, the mom is very upset that their synth is doing housework that she used to do. In some cases, like tucking in the children, the cause of the resentment is clear. In other cases, not nearly as much. Why is she angry that the synth prepares dinner? She doesn’t like to cook, and she’s not great at it. In fact, her family is upset at how much time she spends at work and how little she spends with them. If her husband had somehow found the money to hire a human cook, would she be so upset?
This theme recurs at a human rally. The synths haven’t just stolen our jobs, they have stolen our purpose. But the speaker is addressing a group of people who may well be coal miners. Our purpose for existing is certainly a heavy question, but I really hope the answer doesn’t turn out to be mining coal. London in the show is very clean, the result of synth litter pickers. Have we been robbed of our dignity because we’ve built a machine to pick up our trash? At some point there’s a comment on a talk show that we made robots like humans so that humans could be less like robots. Yes, exactly.
Although, a good counterexample is the aging scientist who receives a new synth nurse. Or as he calls her, more jailer than healer. Her programming prohibits him from eating the foods he likes, sitting in dusty rooms with excessive particulates, and everything else he enjoys. This is at least partly a societal problem, however, not a technology problem. The department of health requires him to have a caretaker (and to keep her powered on). The live-in synth is the manifestation of a policy determined by humans. They, not she, have taken his free will.
Overall, the handling of artificial consciousness is well done. It’s not the focus of the show, so we don’t spend too much time tangled up in metaphysical conundrums. Mostly we see how humans react to the almost human. It’s not a unique take, but it doesn’t get preachy and doesn’t aim for head-exploding mind benders. Things are the way they are, and some people like it and some people don’t.
For another take, Uncanny examines the relationship between two humans and an android. It’s in a similar vein as Ex Machina. What I really liked about Humans in contrast to these two movies, perhaps because of its longer run time, is that it wasn’t so focused on the ways people will mistreat sentient robots.