The technological singularity is a concept introduced in 1993 by mathematician and author Vernor Vinge. Vinge argues that smarter-than-human systems (computers, augmented humans, or symbiotic human-machine systems) would lead to a runaway acceleration of intelligence improvement.
The argument goes like this: what would a smarter-than-human artificial intelligence do? It might play the stock market, or be the world's greatest artist, politician or general. But it might also become the world's smartest computer-science researcher, working on improving artificial intelligence — that is, making a better version of itself. Rinse and repeat, and interesting things start to happen. Computer systems have roughly doubled in performance every 18 months for decades, under the limited guidance of static human intelligence. With self-improvement, each doubling could perhaps take a much, much shorter time. Think 17 minutes. Or less.
The implications of this idea are profound. It has the potential to make most of today's problems irrelevant (material scarcity and mortality might turn out to be easily solvable). It might also destroy our entire solar system. But just as with nuclear fusion, there is the possibility that it simply won't happen in the foreseeable future. We should guard against passivity among smart people who stop solving problems while waiting for "the rapture of the nerds".