• ArtRichards 3 days ago

    Reminds me of this: ‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai...

    • runarberg 3 days ago

      I think you are talking about Lavender: https://news.ycombinator.com/item?id=39918245 (7 months ago; 1418 points, 1601 comments)

      • fuzzfactor 3 days ago

        >‘The machine did it coldly’

        I'm not so sure, with the technology they're "shooting for".

        I thought this said it all:

        >They can differentiate between friendly, civilian and enemy, decide to engage or alert based on target type, and even vary their effects.

        Obviously they're only going to be designing, building, and deploying "nice bombs".

      • tdeck 4 days ago
        • aaron695 3 days ago

          [dead]

        • tim333 3 days ago

          In 2001 an S-200 missile fired at a drone during a military exercise missed the drone, picked up an airliner about 160 miles further away, and took that out instead. Not sure if that counts?

          No one actually told it to go for the airliner but it kind of took the initiative. (https://en.wikipedia.org/wiki/Siberia_Airlines_Flight_1812)

          • JCharante 4 days ago

            Did HN automatically strip "Prediction -" from the title? At a glance it makes it look like the author is working on making it happen

            • wryoak 4 days ago

              The best way to be right is to make yourself right

            • simple10 3 days ago

              For military drones, yes, this will certainly happen if it hasn't already.

              I'd love to see some predictions on manufacturing robots intentionally killing someone for the greater good, in a sort of Trolley Problem [1]: the theoretical potential for AI safety protocols to become misaligned and for a robot to decide to sacrifice a human worker to save multiple lives (toy sketch below).

              [1] https://en.wikipedia.org/wiki/Trolley_problem
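
              To make that concrete, here's a toy Python sketch (everything in it is invented for illustration; it's not from any real safety system) of a naive "minimize expected casualties" rule, where the trolley-problem outcome falls straight out of the arithmetic:

                from dataclasses import dataclass

                @dataclass
                class Action:
                    name: str
                    expected_casualties: float  # made-up estimate, purely illustrative

                def choose_action(actions: list[Action]) -> Action:
                    # Naive utilitarian rule: minimize expected casualties, nothing else.
                    # No notion of consent, liability, or the act/omission distinction.
                    return min(actions, key=lambda a: a.expected_casualties)

                options = [
                    Action("halt the line, leave three workers in the hazard zone", 3.0),
                    Action("swing the arm through worker A's station", 1.0),
                ]
                print(choose_action(options).name)  # -> sacrifices worker A

              The misalignment isn't exotic: the objective does exactly what it says, it just says too little.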

              • geon 3 days ago

                That’s the Robotic Civil Wars as imagined by Asimov.

                https://asimov.fandom.com/wiki/Robotic_Civil_Wars

                • fhfjfk 2 days ago

                  That'll happen with self-driving cars. It'll be interesting to see whether pedestrians or occupants are considered more valuable.

                • yawpitch 4 days ago

                  Uh, pretty sure it’s already happened, if by “robot” you mean a programmed (but not necessarily independently mobile) automaton / machine, by “autonomous” you mean not under the direct and realtime control of a human operator, and by “deliberately” you mean the ML came up with a > 0.5 certainty that the target met its targeting criteria (roughly the threshold check sketched below).

                  Doubt a robot will ever actually deliberate, but that’s more of a philosophical issue.
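
                  In code, that kind of "deliberation" is nothing more than a threshold check; a purely hypothetical sketch (names and numbers made up):

                    THRESHOLD = 0.5

                    def engage(target_score: float) -> bool:
                        # "Deliberate" here just means the model's certainty cleared 0.5.
                        return target_score > THRESHOLD

                    print(engage(0.62))  # True -> the machine "decided" to engage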

                  • hollerith 3 days ago

                    I took the title to mean that human commanders will deliberately choose to deploy an autonomous robot configured to kill a person or persons (as opposed to a robot killing a person where the death was unwanted and unforeseen by the commanders).

                    • yawpitch 3 days ago

                      Yeah, my point is that’s kinda already been done, if we include modern “smart” landmines and loitering munitions. Honestly, I think this threshold is arguably behind us, but I’d also say if it isn’t it will be in a lot less than a decade.

                      • potato3732842 3 days ago

                        My money is on an automated AA system near some inland border where commercial flights just so happen never to go, nabbing a crop duster, survey aircraft, or SAR/med chopper because some comedy of errors left the device on way too hair-trigger a setting.

                        • tim333 3 days ago

                          Human-operated AA systems regularly shoot down the wrong thing, so unless the automated systems are much better than us it seems pretty certain.

                  • more_corn 3 days ago

                    This is already happening in Ukraine.

                    • superb_dev 3 days ago

                      And Palestine

                    • gmerc 4 days ago

                      Ya, but tons of people are killed by humans every day, so it's fine.

                      • fuzzfactor 3 days ago

                        You'd know exactly how it was going to happen if you could review every line of code, every comment, and every bit (byte) of data involved, and make sure it was meaningful.

                        So you could precisely pinpoint the exact data path that would carry out such a deed, and how it got that way. And be able to follow the trail of bits throughout the entire chain-of-command and arrive at the root cause quite logically.

                        Oh wait a minute . . . I was thinking about an accidental killing, my bad.

                        For a deliberate killing you don't need any of that.

                        • EvanAnderson 3 days ago

                          It depends on whether the murderer intends to get away with it.

                        • hcfman 3 days ago

                          Which is worse? Killing one human automatically by computer, or deliberately dropping 2,000 lb bombs on civilian areas by human decision?

                          Just a question.

                          • aaron695 4 days ago

                            [dead]