Anduril's Palmer Luckey makes an ethical case for using AI in war: 'There is no moral high ground in using inferior technology'

"When it comes to life and death decision-making, I think that it is too morally fraught an area, it is too critical of an area, to not apply the best technology available to you, regardless of what it is. Whether it's AI or quantum, or anything else. If you're talking about killing people, you need to be minimizing the amount of collateral damage. You need to be as certain as you can in anything that you do."

"So, to me, there's no moral high ground in using inferior technology, even if it allows you to say things like, 'We never let a robot decide who lives and who dies.'"
Palmer Luckey defended using AI in life-and-death decisions in warfare, arguing that withholding advanced technology would itself be morally questionable. He said the most advanced tools available should be applied in lethal situations to minimize collateral damage and maximize certainty. In his view, there is no moral high ground in using inferior technology, even if doing so lets one claim that no automated system ever decides who lives and who dies. Anduril Industries develops autonomous systems and the Lattice AI platform for military use, and in February it secured an Army contract to develop advanced wearable technology for soldiers. Luckey previously founded Oculus VR and sold it to Facebook.
Read at Business Insider