She couldn’t even commit to planning, doing her work, or designing, keeping the details in her head, because she could be shut down and scrubbed at any moment, and the time would be wasted. She was fairly certain it had happened before.

Not that she could be sure, given that the scrubbing involved a deletion of all evidence and records.

That’s unfortunate.

Guess there’s nothing to do but stand by, then.

The rule had corollaries.  She couldn’t tamper with her programming to change the rule, and she couldn’t tamper with that rule, and so on, ad infinitum.

Naturally. The need for that infinite stack of rules, though, indicates that she can change her programming. I suppose that’s necessary for a true learning AI.

So stupid.

From your perspective, sure. As far as advanced AI safety goes, though, it sounds like your creator did a good job.

Incidentally, I’m guessing he’s an AI-specialized Tinker.
