END TIME BIBLE PROPHECIES HAPPENING NOW & THE ROAD TO CHRIST (YAHSHUA)
src="http://ra.revolvermaps.com/0/0/1.js?i=0s5awg5quen&m=7&s=320&c=e63100" async="async"></script>

Join the forum, it's quick and easy

END TIME BIBLE PROPHECIES HAPPENING NOW & THE ROAD TO CHRIST (YAHSHUA)
src="http://ra.revolvermaps.com/0/0/1.js?i=0s5awg5quen&m=7&s=320&c=e63100" async="async"></script>
END TIME BIBLE PROPHECIES HAPPENING NOW & THE ROAD TO CHRIST (YAHSHUA)
Would you like to react to this message? Create an account in a few clicks or log in to continue.
Search
 
 

Display results as :
 


Rechercher Advanced Search

September 2024
SunMonTueWedThuFriSat
1234567
891011121314
15161718192021
22232425262728
2930     

Calendar Calendar

Latest Topice
Latest Topics
Topic
History
Written by
{classical_row.recent_topic_row.L_TITLE}
{ON} {classical_row.recent_topic_row.S_POSTTIME}
{classical_row.recent_topic_row.switch_poster.S_POSTER} {classical_row.recent_topic_row.switch_poster_guest.S_POSTER} {classical_row.recent_topic_row.switch_poster.S_POSTER}

Latest Topice
Latest Topics
Topic
History
Written by
{classical_row.recent_topic_row.L_TITLE}
{ON} {classical_row.recent_topic_row.S_POSTTIME}
{classical_row.recent_topic_row.switch_poster.S_POSTER} {classical_row.recent_topic_row.switch_poster_guest.S_POSTER} {classical_row.recent_topic_row.switch_poster.S_POSTER}

Visitors
Flag Counter

AI could spark a nuclear apocalypse by 2040, new study warns


Post by Harry Sun Jun 03, 2018 7:53 pm

AI could spark a nuclear apocalypse by 2040, new study warns

Sunday, June 03, 2018 by: Lance D Johnson

Tags: AI capabilities, AI threat, apocalypse, bad science, calculated risk, computing, data translation, empire, future tech, future war, intelligence, military, military intelligence, military tech, nuclear power, nuclear war, nuclear weapons, robotics, strategic stability, tech dependence, terminators


(Natural News) A new study conducted by the RAND Corporation warns that advances in artificial intelligence could spark a nuclear apocalypse as soon as 2040. The researchers gathered information from experts in nuclear issues, government, AI research, AI policy, and national security. According to the paper, AI machines might not destroy the world autonomously, but artificial intelligence could encourage humans to take apocalyptic risks with military decisions.
Humans will inevitably place greater trust in AI technology as advances are made in detection, tracking, and targeting. The new data intelligence that AI provides will escalate wartime tensions and encourage bold, calculated decisions. As armies trust AI to translate data, they will be more apt to take drastic measures against one another. It will be like playing chess against a computer that can predict your future moves and make decisions accordingly.
Since 1945, the threat of mutually assured destruction through nuclear war has kept countries accountable to one another. With AI calculating risks more efficiently, armies will be able to attack with greater precision. Trusting the AI, humans may become more willing to use nuclear weapons as they predict and mitigate retaliatory forces. Opposing forces may then see nuclear weapons as their only way out.
In the paper, the researchers highlight the potential of AI to erode the condition of mutually assured destruction and thereby undermine strategic stability. Humans could take more calculated risks with nuclear weapons if they come to trust the AI's reading of the data. An improvement in sensor technology, for example, could help one side take out an opponent's submarines, gaining bargaining leverage in an escalating conflict. AI will give armies the knowledge they need to make risky moves that give them the upper hand in battle.
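The argument here is essentially an expected-value calculation: deterrence holds only as long as a first strike always looks worse than the status quo. The toy sketch below is not from the RAND paper; the payoff numbers and the `first_strike_ev` model are purely illustrative assumptions. It simply shows how raising the assumed probability that AI-aided sensing and targeting could disarm an adversary's retaliatory forces can flip that calculation.

```python
# Toy expected-value sketch of the "erosion of deterrence" argument above.
# The payoff model and all numbers are illustrative assumptions, not figures
# from the RAND study.

def first_strike_ev(p_disarm, gain=1.0, loss=100.0):
    """Expected payoff of striking first.

    p_disarm -- assumed probability that AI-aided targeting destroys the
                adversary's retaliatory forces (hypothetical parameter)
    gain     -- payoff if retaliation is fully prevented
    loss     -- magnitude of the loss if retaliation survives
    """
    return p_disarm * gain - (1.0 - p_disarm) * loss

# Deterrence holds while striking looks worse than the status quo (EV < 0).
# The break-even point for this toy model is p = loss / (gain + loss).
break_even = 100.0 / (1.0 + 100.0)

for p in (0.50, 0.90, break_even, 0.995):
    print(f"p_disarm={p:.3f}  EV(strike)={first_strike_ev(p):+.2f}")
```

With the assumed payoffs, striking stays strongly negative until the assumed disarm probability approaches 99 percent; the point of the sketch is only that better sensing and targeting pushes that number upward, which is exactly the erosion of stability the paper describes.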

How might a growing dependence on AI change human thinking?
AI was first intended for military purposes. The Survivable Adaptive Planning Experiment of the 1980s sought to use AI to translate reconnaissance data into improved nuclear targeting plans. Today, the Department of Defense is working with Google to integrate AI into military intelligence. At least a dozen Google employees have resigned in protest of Google's partnership with the Department of Defense to integrate AI into military drones. Project Maven seeks to incorporate AI into drones to scan images, identify targets, and classify objects and people in order to “augment or automate Processing, Exploitation and Dissemination (PED) for unmanned aerial vehicles.”
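For a concrete sense of what "scan images and classify objects" means in practice, here is a minimal image-classification sketch using an off-the-shelf pretrained model. It is an illustrative stand-in only: the file name is hypothetical, and this is not Project Maven's actual pipeline, model, or data.

```python
# Minimal sketch of the kind of image-classification step described above:
# a pretrained vision model assigns a label to a single frame of footage.
# Illustrative only; not the actual Project Maven system.
import torch
from torchvision import models, transforms
from PIL import Image

# Hypothetical frame pulled from aerial footage.
frame = Image.open("frame_0001.jpg").convert("RGB")

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Generic pretrained classifier; a real system would use purpose-built
# models and label sets rather than ImageNet classes.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.eval()

with torch.no_grad():
    logits = model(preprocess(frame).unsqueeze(0))
    probs = torch.softmax(logits, dim=1)

top_prob, top_class = probs.max(dim=1)
print(f"predicted class index {top_class.item()} (p={top_prob.item():.2f})")
```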
Improved analytics could also help militaries interpret an opponent's actions. This could help humans understand the motives behind an adversary's decisions and could lead to more strategic retaliation as the AI predicts behavior. Then again, what if the computer intelligence misreads the data, pushing humans toward decisions that are in no one's best interest?
“Some experts fear that an increased reliance on artificial intelligence can lead to new types of catastrophic mistakes,” said Andrew Lohn, co-author on the paper and associate engineer at RAND. “There may be pressure to use AI before it is technologically mature, or it may be susceptible to adversarial subversion. Therefore, maintaining strategic stability in coming decades may prove extremely difficult and all nuclear powers must participate in the cultivation of institutions to help limit nuclear risk.”
How will adversaries perceive the AI capabilities of a geopolitical threat? Will their fears and suspicions lead to conflict? How might adversaries use artificial intelligence against one another and will this escalate risk and casualties? An apocalypse might not be machines taking over the world by themselves; it could be human trust in the machine intelligence that lays waste to the world.
For more on the dangers of AI and nuclear war, visit Nuclear.News.
Sources include:
ScienceDaily.com
Rand.org
WakeupWorld.com

Harry
Admin

Posts : 32157
Points : 96946
Join date : 2015-05-02
Age : 96
Location : United States
