Thank you for the kind words. As for your second point, Nick Bostrom defines “superintelligence” as “an intellect that is much smarter than the best human brains in practically every field, including scientific creativity, general wisdom and social skills.” I’m not sure there’s an official definition of AGI, but the consensus seems to be a less extreme version of the above, replacing “much smarter than the best” with “as smart as the average”. Is either of those inevitable? I’m agnostic. Regardless, I think it’s silly to see the possibility as a threat, let alone as a near-term threat.