I agree with you. It is an enormously complicated problem without any easy solutions, and there may be too few incentives for profit-driven companies to do the right thing. Although they paint a very dark and stylized dystopian future, classic films like The Terminator and The Matrix (and, to some extent, more recent films like Ex Machina) tell stories that advise great caution when it comes to robotics, automation, and AI. Frankly, I'm not sure we take stories like those seriously enough. Many years ago, Bill Joy (the co-founder of Sun Microsystems) wrote an article for Wired titled "Why the Future Doesn't Need Us." Here is one excerpt:

> The average man may have control over certain private machines of his own, such as his car or his personal computer, but control over large systems of machines will be in the hands of a tiny elite – just as it is today, but with two differences. Due to improved techniques the elite will have greater control over the masses; and because human work will no longer be necessary the masses will be superfluous, a useless burden on the system. If the elite is ruthless they may simply decide to exterminate the mass of humanity. If they are humane they may use propaganda or other psychological or biological techniques to reduce the birth rate until the mass of humanity becomes extinct, leaving the world to the elite. Or, if the elite consists of soft-hearted liberals, they may decide to play the role of good shepherds to the rest of the human race.