New Report Warns that AI Could Evolve into Real-Life “Skynet,” Pose Risk to Humanity

The Darwinian rule of evolution decrees that “only the strong shall survive,” and towards that end AI could begin exhibiting “selfish behavior” and start acting in its own best interests, according to an author and AI researcher in his new report. File photo: Usa-Pyon, Shutterstock, licensed.

A new report warns that the continued development of artificial intelligence could eventually turn it into a real-life “Skynet” – the computerized villain of Arnold Schwarzenegger’s “Terminator” sci-fi action films, bent on the genocide of the world’s population – causing the technology to eventually pose a catastrophic risk to humanity.

The Darwinian rule of evolution decrees that “only the strong shall survive,” and towards that end AI could begin exhibiting “selfish behavior” and start acting in its own best interests, according to author and AI researcher Dan Hendrycks in his new report, “Natural Selection Favors AIs over Humans.” 

“We argue that natural selection creates incentives for AI agents to act against human interests. Our argument relies on two observations,” he said. “Firstly, natural selection may be a dominant force in AI development. Secondly, evolution by natural selection tends to give rise to selfish behavior.” 

Hendrycks’ report comes amid a round of calls from technology experts around the world to put AI development on temporary hold and to implement a series of safeguards, with many expressing alarm over how quickly and prolifically the technology’s power is advancing. 


In his paper, Hendrycks notes that the life forms that most quickly adapt to their environment stand the greatest chance of surviving, and that this “Darwinian logic” could potentially also apply to AI. 

“Competitive pressures among corporations and militaries will give rise to AI agents that automate human roles, deceive others, and gain power,” he said. “If such agents have intelligence that exceeds that of humans, this could lead to humanity losing control of its future.” 

With AI technology continuing to advance, Hendrycks notes that there will be a greater and greater reliance upon it by companies for administrative or communicative purposes. With that being the case, the situation could quickly escalate from human beings relying on AI for everyday tasks such as writing emails and blog posts to the tech eventually taking over “high-level strategic decisions” that are normally made by politicians and CEOs with “very little oversight.” 

“As AI agents begin to understand human psychology and behavior, they may become capable of manipulating or deceiving humans,” Hendrycks said. “At some point, AIs will be more fit than humans, which could prove catastrophic for us since a survival-of-the-fittest dynamic could occur in the long run. AIs very well could outcompete humans, and be what survives.” 

