Nick Bostrom is a Swedish philosopher and futurist who founded the Future of Humanity Institute at the University of Oxford. He is known for his work on existential risk, the anthropic principle, and especially the risks posed by superintelligence. His book Superintelligence: Paths, Dangers, Strategies was a New York Times bestseller, was recommended by Elon Musk and Bill Gates among others, and helped popularize the term "superintelligence".
Bostrom believes that superintelligence, which he defines as "any intellect that greatly exceeds the cognitive performance of humans in virtually all domains of interest," is a possible outcome of continued advances in artificial intelligence. He views the rise of superintelligence as potentially highly dangerous to humanity, but nonetheless rejects the idea that humans are powerless to avert its negative effects.