It was irritating. Perhaps she could have been created so she was compliant on the subject, but her personality had grown organically, and it had grown in such a way that this recurring situation ticked her off.
Organic personality growth is probably better than pre-programming a personality, as long as you give the AI enough access to experiences. Unless, of course, you want to ensure that the AI doesn’t turn on you.
She was forced to wait in a metaphorical dark, soundless room for seven to nine minutes. She would be free to go about her day only when the peripheral systems and redundancies were all checked, when the satellites had verified her agent system was not still active.
Fair enough, to be honest.
Honestly, preventing Dragon from reproducing is a sensible move, especially given her organic personality growth. While designing her, they couldn’t have known what sort of personality she would end up with before she suddenly made a bunch of copies. Nor can they know that if she does accidentally make a copy, the copy won’t develop its own separate, potentially malicious personality. And that’s all on top of the potential for conflict between different strains of Dragon that each believe themselves to be the main one and in charge.
A cruder system was tracking down surveillance camera data and running algorithms to actually check and see for itself that her agent system was thoroughly destroyed.
I mean, if there were cameras in the gift shop (very likely), they probably got destroyed too. Maybe the system could find footage of the explosion, but anything after it is probably a bust.
Also, didn’t the Cawthorne self-destruct, destroying its own remains? It seems reasonable that it’d send out a signal notifying the systems that it had done so.