There’s an interesting ambivalence here I only noticed after looking at the gifset for a while. “I was built with a moral code,” says the Machine; “I’ve seen that code waver,” Samaritan replies. But which meaning of the word “code” is each of them using, and is it the same one?
“Code” in computer science is like “macromolecules” in biology: it’s simply the class of materials from which a being is made (not even a specific kind within the class). In ethics, however, a “code” is a specific kind of system. In its intuitive, everyday use, a moral or ethical “code” is a rule or a set of rules; these rules are finite; and they are supposed to (well) rule every possible situation.
A code is one kind of moral system, but there are moral systems - ways of thinking about morality - that are not codes. The best known of these is virtue ethics: morality as personality traits, developed through experience and intent. Being virtuous is a second nature, not a first: no one is born virtuous. Even the best of natural inclinations is not a virtue in and of itself.
Samaritan seems to use the word “code” in its ethics-theory sense; that fits with his use of the word “waver.” But which meaning did the Machine intend? Perhaps what she meant is that she was made with an ethics module. If that’s the case, then Samaritan just inadvertently betrayed a major weakness of his own without realizing it.
I think this interpretation is the likelier one, for the simple reason that Harold is smarter than that. The Machine wasn’t Harold’s first rodeo: time after time he tried to hard-code morality into his creations, in the computer-science sense of “code,” and time after time he failed. You can’t compose a finite set of rules that covers every possible situation; that’s absurd. Instead, Harold gave the Machine the capacity to think - and, in order for the Machine’s judgment to be worth anything at all, the capacity to see and a set of natural inclinations.
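The contrast here - a finite rule table versus weighted inclinations that can be applied to unforeseen situations - can be sketched in a few lines. This is purely illustrative; all the names, rules, and weights below are invented for the example and come from nowhere in the show.

```python
# Approach 1: a finite moral "code" as a rule table. It can only answer
# situations it was explicitly written for; anything else falls through.
RULES = {
    "person_in_danger": "intervene",
    "privacy_request": "withhold_data",
}

def rule_based(situation: str) -> str:
    # An unlisted situation simply has no answer.
    return RULES.get(situation, "no_rule_defined")

# Approach 2: weighted natural inclinations. A situation is described by
# features, and a judgment is computed from them, so even combinations
# never seen before still yield an answer.
INCLINATIONS = {"harm_risk": 2.0, "privacy_cost": -1.0}

def inclination_based(features: dict) -> str:
    score = sum(INCLINATIONS.get(name, 0.0) * value
                for name, value in features.items())
    return "intervene" if score > 0 else "hold_back"

print(rule_based("trolley_variant_47"))  # → no_rule_defined
print(inclination_based({"harm_risk": 1.0, "privacy_cost": 0.5}))  # → intervene
```

The first approach fails closed on anything its authors didn’t anticipate; the second always produces a judgment, for better or worse - which is exactly why its weights would need the kind of re-evaluation described below.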
If that’s true, the Machine’s code didn’t waver. Rather, she was self-reflecting - re-evaluating her moral heuristics in order to adapt to previously unforeseen situations, in light of her existing personality and values. It’s a very sentient thing, moral self-reflection. But Samaritan doesn’t think of morality this way: to him, this was wavering.
Further supporting this interpretation is that Samaritan was seeking out machine-learning algorithms: Samaritan is aware that his own learning abilities are suboptimal.
Think of it this way: Samaritan is smart, but the Machine is creative.