

AI and the Battlefield

  • 21 Dec 2024 04:51
    Message # 13443179


    Comment:  Here is what we grew up on in the '60s: now there is a fourth Law and an AI explanation.

    Back then, robots were not to harm humans -- where are we headed?

    The last article on this page discusses WarGames, the movie that came out in the '80s.

    Asimov's Three Laws of Robotics 

    ______________________________________________________


    "Bring on the unstoppable rise of

    thinking war machines"


    Quote from Air and Space Forces:

    "The war in Ukraine has demonstrated, once and for all, that drones are here to stay—but it has also underlined that battlefield communications are fragile and easily disrupted, leaving drones unable to receive their orders from their operators. ... The solution: The military will eventually need more machines that can think like human soldiers, deciding autonomously what to destroy and who to kill in ambiguous battlefield conditions. And to secure future battlefields, the Pentagon, and the public, will need to get over its ‘Terminator’ fears and embrace reality," writes Clayton Swope, deputy director of the Aerospace Security Project and a senior fellow at the Center for Strategic and International Studies (CSIS).

    Aerospace Security Project CSIS

    _____________________________________________________________________

    "Humans should teach AI how to avoid nuclear war—
    while they still can"


