Should the UN ban weaponized AI/robots?

Because the US can develop them and they can't. If you want to run a droid army you need a huge economy, and only the US will be able to do it.

I think there is a legitimate fear of these things falling into the wrong hands, being hijacked and/or used anonymously to commit crimes, violence, etc.
 

These are not nukes; they're droids. Individually they would not be able to destroy a city. I would imagine they would have a kill switch as well.
 
[h=1]As the UN delays talks, more industry leaders back ban on weaponized AI[/h] Rich Haridy August 20, 2017
A second open letter, this time from 116 founders of AI and robotics companies, is urging the UN to act on banning weaponized AI (Credit: UNSW)
Two years ago, the Future of Life Institute presented an open letter at the 2015 International Joint Conference on Artificial Intelligence (IJCAI) urging the United Nations to ban the development of weaponized artificial intelligence. Now a second open letter has been released, again coinciding with the start of the 2017 IJCAI. This new letter is co-signed by over 100 founders of robotics and AI companies from around the world, and demands the UN stop delaying its talks and take action.

Just a few years ago, the idea of autonomous weaponry resided solely within the realms of science fiction, but the rapidly advancing fields of AI and robotics have turned a frightening fiction into a dawning reality. With global arms manufacturer Kalashnikov recently launching a fully automated range of combat modules and startup Duke Robotics attaching machine guns to drones, the future of robotic and autonomous warfare seems incredibly close.

The original 2015 letter, directed at the UN, was co-signed by over 1,000 different scientists and researchers from around the world, including Stephen Hawking, Noam Chomsky and Steve Wozniak. The UN slowly but surely responded, formally convening a group of experts in late 2016 under the banner of the Convention on Certain Conventional Weapons (CCW), with a view toward discussing and implementing a global ban.

The first discussions of this newly formed UN group were set to take place this month, but they were canceled back in May due to "insufficient funding". This bureaucratic bungle, stemming from several nations apparently falling into arrears with promised contributions, also threatens to cancel the second scheduled meeting on lethal autonomous weapons set for November this year.

These delays inspired this second open letter, which concentrated on recruiting support from those on the business and industry side of robotics and AI. One hundred and sixteen founders of major companies from around the world have already co-signed this new letter, including Elon Musk, Mustafa Suleyman (co-founder of Google's DeepMind), and Esben Østergaard (founder of Denmark's Universal Robots).

"Lethal autonomous weapons threaten to become the third revolution in warfare," the letter states. "Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora's box is opened, it will be hard to close."



Despite getting a notable collection of industry luminaries on board, this appeal is looking like it will face an uphill battle over the coming months and years. Advocates of a ban on lethal autonomous weapons want all development in the field to be considered for prohibition, just as is done with biological and chemical weapons, but not all countries are agreeable.
While most UN member countries, including the US and UK, have agreed to form this panel of experts, any actual proposal for a ban will likely face strong opposition. In 2015 the UK foreign office told The Guardian that the government does not see a need for these new laws. Russia, of course, has not expressed support for this entire process either.

The United States has not communicated a solid position on the matter, and while it supported the convening of this UN group, one can't imagine the world's biggest military power willingly supporting a proposal that would stifle its ability to develop complex new weapons systems – especially when Russia has already indicated support for the Kalashnikov AI systems.
Whether such broad collective support across academic, research, and industry fields actually amounts to anything is yet to be seen, but this second open letter hopefully prompts a conversation on AI weapons development that the world drastically needs to have.

Source: University of New South Wales

http://newatlas.com/letter-ban-weaponized-ai/50972/

The UN? Screw the UN. It should be our Congress that speaks to this issue; the new world order can take a hike.
 
The objective of US military development is to make a war with the US as lopsided as possible and minimize the chance of US casualties in any future conflict. Any weapon development that increases our advantage should be pursued, and any that doesn't should not be. It's that simple. We shouldn't base development on competition against potential adversaries; we should base it on competition against our own current capabilities.

That said, the one technology that scares me the most is the FEL (free-electron laser). The destructive capability of that technology, even with only moderate achievement of its expected potential, is chilling. Until the FEL, laser technology was heavily limited by the medium needed to generate and focus a beam. With an FEL no such medium is needed, so the power potential is greatly increased. We'll have lasers capable of cutting down not only a building in a single pass, but entire skylines.
 
What is FEL? Could you provide a link?
 

I believe Cobra Commander wanted to paint his face on the moon with a laser in one episode, so people could look up and see the true face of evil : )
 
what is FEL? could you provide a link?

https://en.wikipedia.org/wiki/Free-electron_laser

Simple explanation: When electrons oscillate in a magnetic field they generate photons. As electrons bunch up and oscillate in unison, the photons produced also become more organized and travel in a path parallel to the axis of oscillation. So an FEL works by introducing a stream of electrons into a tube with an oscillating magnetic field, which generates a beam of photons. The power of the beam is directly proportional to the number of electrons introduced, since each electron produces photons and the power of a laser is really a measure of photon density. The practical upshot of this type of laser is twofold: 1) the absence of a conventional medium means the laser is limited only by the number of electrons injected into the chamber, and 2) the frequency of the oscillation determines the wavelength of the light produced. The ability to tune the wavelength of the produced light makes it a real-life phaser.
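The "frequency of the oscillation determines the wavelength" point can be sketched with the standard on-axis undulator resonance relation, λ = (λ_u / 2γ²)(1 + K²/2). The function name and the example numbers below are illustrative, not from the Wikipedia article:

```python
# Sketch of the on-axis FEL resonance condition (undulator equation):
#   lambda = (lambda_u / (2 * gamma^2)) * (1 + K^2 / 2)
# where lambda_u is the undulator period, K the undulator strength
# parameter, and gamma the electron beam's Lorentz factor.

ELECTRON_REST_ENERGY_MEV = 0.511  # electron rest energy in MeV

def fel_wavelength(undulator_period_m: float, k_parameter: float,
                   beam_energy_mev: float) -> float:
    """Radiated wavelength (in metres) for an on-axis free-electron laser."""
    gamma = beam_energy_mev / ELECTRON_REST_ENERGY_MEV  # Lorentz factor
    return (undulator_period_m / (2.0 * gamma ** 2)) * (1.0 + k_parameter ** 2 / 2.0)

# A 3 cm undulator period, K = 1, and a 100 MeV beam give visible light:
print(fel_wavelength(0.03, 1.0, 100.0))  # roughly 5.9e-7 m (~590 nm)
```

Because the wavelength scales as 1/γ², doubling the beam energy cuts the output wavelength by a factor of four, which is why the same machine can be tuned from infrared down toward X-rays by raising the electron energy.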
 

:evilnod:
 

 

Of course not... by the way, can I have an autographed picture of you to share with my drone?
 