#not shown in this set #root flinching after samaritan says #ive seen that code waver #the machine #has no response to samaritans assertion #amy acker #said roots downward gaze #was meant to indicate #root wondering why TM was letting that go unanswered #and there is some irony in #asserting human right to free will #while root behaves as a human instantiation of the echo command #also was this caption colouring #fortuitous or deliberate #heh #THESE SCENES ARE THE FULCRUM UPON WHICH #THE NEXT SECTION OF THE SEASON TURNS #THERE IS SO MUCH STUFF IN THEM
#a wavering moral code alludes to unpredictability #and unpredictability is undeniably human #which i’m not trying to imply the machine is but it is something that samaritan and all its hubris is not giving enough credit #the machine was given a little extra human touch #and humans surprise you #you can try to predict human nature as much as you want but there’s always going to be at least the one that says fck your odds #and team machine is made up of outliers #BUT GOD YES A DEBATE OF MORALS #HUMAN VS MACHINE #HUMAN VS HUMAN #MACHINE VS MACHINE #because seriously machines might not share human morals but humans don’t share the same morals #the rest of the season is going to be made up of the characters testing each other and testing themselves #and where’s the line and what are they willing to cross #where will they bend #where will they break #and where will they stand uncompromised (via)
There’s an interesting ambivalence here I only noticed after looking at the gifset for a while. “I was built with a moral code,” says the Machine; “I’ve seen that code waver,” Samaritan replies. But which meaning of the word “code” is each of them using, and is it the same one?
“Code” in computer science is like “macromolecules” in biology: it’s simply the class of materials from which a being is made (not even a specific kind within the class). In ethics, however, a “code” is a specific kind of system. In its intuitive, everyday use, a moral or ethical “code” is a rule or a set of rules; these rules are finite; and they are supposed to (well) rule every possible situation.
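To make the ethics-theory sense concrete, here’s a minimal sketch (all names invented for illustration) of a moral “code” as a finite rule set. The weakness is built in: any situation the rules never anticipated gets no answer at all.

```python
# Hypothetical moral "code" in the ethics-theory sense:
# a finite table of rules that is supposed to cover everything.
MORAL_CODE = {
    "lying": "forbidden",
    "killing": "forbidden",
    "helping": "required",
}

def judge(action):
    # A finite rule set is simply silent on cases it never anticipated.
    return MORAL_CODE.get(action, "undefined")

print(judge("lying"))             # forbidden
print(judge("lying to protect"))  # undefined -- the code has no rule here
```

The gap between “forbidden” and “undefined” is exactly the gap the post goes on to discuss: a pure rulebook can’t rule every possible situation.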
A code is one kind of moral system, but there are moral systems - ways of thinking about morality - that are not codes. The best known of these is virtue ethics: morality as personality traits, developed through experience and intent. Being virtuous is a second nature, not a first: no one is born virtuous. Even the best of natural inclinations is not a virtue in and of itself.
Samaritan seems to use the word “code” in its ethics-theory sense; that fits with the accusation of “wavering”. But which meaning did the Machine intend? Perhaps what she meant is that she was built with an ethics module. If that’s the case, then Samaritan just inadvertently betrayed a major weakness without realizing it.
I think this interpretation is the likelier one, for the simple reason that Harold is smarter than that. The Machine wasn’t Harold’s first rodeo: time after time he tried to instill morality in computer code, and time after time he failed. You can’t compose a set of rules that covers every possible situation; that’s absurd. Instead, Harold gave the Machine the capacity to think - and, in order for the Machine’s judgment to be worth anything at all, the capacity to see and a set of natural inclinations.
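The alternative architecture described above can be sketched the same way (again, purely hypothetical names and numbers): instead of an exhaustive rulebook, a handful of “natural inclinations” expressed as weights, applied to whatever features a novel situation happens to present.

```python
# Hypothetical "natural inclinations": a few weighted values,
# not one rule per situation.
INCLINATIONS = {"protect_life": 1.0, "honesty": 0.4, "obey_law": 0.3}

def evaluate(situation):
    # situation maps each value to how strongly the action serves it,
    # from -1 (violates it) to 1 (fulfills it).
    return sum(INCLINATIONS.get(f, 0.0) * v for f, v in situation.items())

# A lie told to protect a life: no rule covers it, but judgment does.
score = evaluate({"honesty": -1.0, "protect_life": 1.0})
print(score)  # 0.6 -- positive overall, despite violating honesty
```

The point of the sketch is that novel situations get a verdict by weighing inclinations against each other, rather than falling through a lookup table.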
If that’s true, the Machine’s code didn’t waver. Rather, she was self-reflecting - re-evaluating her moral heuristics in order to adapt to previously unforeseen situations, in light of her existing personality and values. Moral self-reflection is a very sentient thing. But Samaritan doesn’t think of morality this way: to him, it looked like wavering.
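That kind of self-reflection can be sketched too (an illustrative toy, not a claim about how the Machine actually works): when an unforeseen situation produces an outcome the agent regrets, it nudges a weight rather than rewriting its rulebook. The values persist; the emphasis adapts.

```python
# Toy model of "moral self-reflection": adjust an inclination's weight
# in proportion to regret over an outcome, leaving the rest intact.
weights = {"protect_life": 1.0, "honesty": 0.4}

def reflect(weights, feature, regret, lr=0.1):
    # Return a copy with one inclination nudged down by lr * regret.
    updated = dict(weights)
    updated[feature] = updated[feature] - lr * regret
    return updated

weights = reflect(weights, "honesty", regret=1.0)
print(round(weights["honesty"], 2))  # 0.3 -- slightly less weight on honesty
```

From the outside, the shift from 0.4 to 0.3 looks like a code “wavering”; from the inside, it’s adaptation under stable values.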
Further supporting this interpretation is the fact that Samaritan was seeking machine-learning algorithms: Samaritan is aware that his own learning abilities are suboptimal.
Think of it this way: Samaritan is smart, but the Machine is creative.