Subject: I try never to ask gods for favors.
Author:
Posted on: 2013-06-27 20:14:00 UTC
They have a habit of calling those favors in at the most dire of moments, or when they want something done that eventually ends with a species destroyed or with themselves in charge of a group of other powerful entities. True, an AI interested mainly in being left alone and developing itself might not be so dramatic, but we don't know.
Is Singularity receptive to making deals, perhaps? Pre-set conditions are much less likely to blow up in one's face.
Deal idea: if we ensure the propagation of AIs similar to itself, it gets to spread beyond its initial simulation while leaving Singularity Prime unaffected in its home world. Of course, we'd need a place to put the duplicate AIs so that they don't absorb each other and go nuts, and I'm not sure where we could do that.