Right, but there is a commonality in our ability to perceive those different contexts.
The example I heard years ago (I think from Kurzweil?) framed the difference between us and future AIs as comparable to the difference between an ant's perception of its surroundings and a human's. In his example, the human has no regard for an ant; we'd just as soon step on them ("evil" AI). To re-frame that: if you could magically turn an ant into a human, why would it even matter that it was once an ant? The transition would be a complete disconnect. Maybe if you told the person they were once an ant they would have some sort of strange reverence for ants, but it wouldn't be very rational.