Israel’s AI can produce 100 bombing targets a day in Gaza. Is this the future of war?

PTI
Updated: December 11th, 2023, 09:00 IST
in International, Sci-Tech
Pic: IANS


Canberra: Last week, reports emerged that the Israel Defence Forces (IDF) are using an artificial intelligence (AI) system called Habsora (Hebrew for “The Gospel”) to select targets in the war on Hamas in Gaza. The system has reportedly been used to find more targets for bombing, to link locations to Hamas operatives, and to estimate likely numbers of civilian deaths in advance.

What does it mean for AI targeting systems like this to be used in conflict? My research into the social, political and ethical implications of military use of remote and autonomous systems shows AI is already altering the character of war.


Militaries use remote and autonomous systems as “force multipliers” to increase the impact of their troops and protect their soldiers’ lives. AI systems can make soldiers more efficient, and are likely to enhance the speed and lethality of warfare – even as humans become less visible on the battlefield, instead gathering intelligence and targeting from afar.

When militaries can kill at will, with little risk to their own soldiers, will the current ethical thinking about war prevail? Or will the increasing use of AI also increase the dehumanisation of adversaries and the disconnect between wars and the societies in whose names they are fought?

AI in war

AI is having an impact at all levels of war, from “intelligence, surveillance and reconnaissance” support, like the IDF’s Habsora system, through to “lethal autonomous weapons systems” that can choose and attack targets without human intervention.

These systems have the potential to reshape the character of war, making it easier to enter into a conflict. As complex and distributed systems, they may also make it more difficult to signal one’s intentions – or interpret those of an adversary – in the context of an escalating conflict.

Compounding this, AI can contribute to misinformation and disinformation, creating and amplifying dangerous misunderstandings in times of war.

AI systems may feed the human tendency to trust suggestions from machines (underscored by the Habsora system's name, which refers to the infallible word of God), creating uncertainty over how far autonomous systems should be trusted. The boundaries of an AI system that interacts with other technologies and with people may not be clear, and there may be no way to know who or what "authored" its outputs, however objective and rational they seem.

High-speed machine learning

Perhaps one of the most basic and important changes we are likely to see driven by AI is an increase in the speed of warfare. This may change how we understand military deterrence, which assumes humans are the primary actors and sources of intelligence and interaction in war.

Militaries and soldiers frame their decision-making through what is called the “OODA loop” (for observe, orient, decide, act). A faster OODA loop can help you outmanoeuvre your enemy. The goal is to avoid slowing down decisions through excessive deliberation, and instead to match the accelerating tempo of war.

So the use of AI is potentially justified on the basis it can interpret and synthesise huge amounts of data, processing it and delivering outputs at rates that far surpass human cognition.
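The speed pressure is easiest to see in schematic form. The sketch below is a purely illustrative Python toy, not drawn from any real military system; every function, value and threshold in it is invented. It walks a single pass of the observe-orient-decide-act cycle and times it, to show where automating the "orient" and "decide" phases lets the loop tick far faster than human deliberation would allow.

```python
import time

# A schematic OODA (observe-orient-decide-act) loop. All names and
# values here are illustrative placeholders. The point is only that
# the cycle's tempo is bounded by its slowest phase, and automating
# a phase compresses the whole loop.

def observe() -> dict:
    """Gather raw inputs (sensor feeds, reports, intercepts)."""
    return {"sightings": ["vehicle near checkpoint", "radio burst"]}

def orient(observations: dict) -> dict:
    """Fuse observations with prior context into a situational picture."""
    return {"assessment": "possible staging area", "confidence": 0.6}

def decide(picture: dict) -> str:
    """Choose an action. In a human loop, deliberation (ethical, legal,
    tactical) happens here and takes time; automation compresses it."""
    return "act" if picture["confidence"] >= 0.8 else "hold"

def act(decision: str) -> None:
    print(f"decision: {decision}")

start = time.monotonic()
act(decide(orient(observe())))
print(f"one loop took {time.monotonic() - start:.4f}s")
```

In this toy the full cycle completes in a fraction of a millisecond; the ethical question in the paragraph that follows is where human deliberation fits inside a loop running at that tempo.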

But where is the space for ethical deliberation in an increasingly fast and data-centric OODA loop cycle happening at a safe distance from battle?

Israel’s targeting software is an example of this acceleration. A former head of the IDF has said human intelligence analysts might produce 50 bombing targets in Gaza each year, while the Habsora system can produce 100 targets a day (roughly 36,500 a year, about 730 times the human rate), along with real-time recommendations for which ones to attack.

How does the system produce these targets? It does so through probabilistic reasoning offered by machine learning algorithms.

Machine learning algorithms learn from data: they seek patterns in huge piles of it, and their success is contingent on the quality and quantity of that data. They make recommendations based on probabilities.

The probabilities are based on pattern-matching: if a person shares enough characteristics with other people labelled as enemy combatants, they may be labelled a combatant too.
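To make that pattern-matching logic concrete, here is a minimal, purely hypothetical sketch. Nothing in it is drawn from Habsora or any real system; the feature vectors, labels and threshold are all invented. It labels a new profile by its similarity to previously labelled examples, which is exactly why the output "probability" can only ever be as trustworthy as the labels it inherits.

```python
import math

# Purely illustrative: label a profile by its similarity to already
# labelled examples. Features, labels and the threshold are invented.
# The failure mode to notice: the output probability faithfully
# reproduces whatever bias or error is in the training labels.

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical labelled examples: feature vectors for people previously
# tagged "combatant" (1) or "civilian" (0).
labelled = [
    ([0.9, 0.8, 0.1], 1),
    ([0.8, 0.9, 0.2], 1),
    ([0.1, 0.2, 0.9], 0),
    ([0.2, 0.1, 0.8], 0),
]

def score(profile: list[float]) -> float:
    """Fraction of similarity mass on 'combatant' neighbours:
    a crude stand-in for a model's output probability."""
    sims = [(cosine(profile, vec), label) for vec, label in labelled]
    total = sum(s for s, _ in sims)
    return sum(s for s, label in sims if label == 1) / total

profile = [0.85, 0.85, 0.15]      # someone who "looks like" prior combatants
p = score(profile)
print(f"p(combatant) = {p:.2f}")  # a probability, not ground truth
if p > 0.7:                       # arbitrary threshold
    print("flagged, solely because of resemblance to past labels")
```

The flag fires only because the profile resembles past labels. If those past labels were wrong, contested or biased, the model has no way to know, and neither does the operator reading its confident-looking number.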

The problem of AI-enabled targeting at a distance

Some claim machine learning enables more precise targeting, making it easier to avoid harming innocent people and to use a proportionate amount of force. Yet the promise of more precise airstrikes has not been borne out in the past, as the high toll of declared and undeclared civilian casualties from the global war on terror shows.

Moreover, the difference between a combatant and a civilian is rarely self-evident. Even humans frequently cannot tell who is and is not a combatant.

Technology does not change this fundamental truth. Social categories and concepts are often not objective; they are contested, or specific to a time and place. Computer vision and the algorithms behind it work best in predictable environments where concepts are objective, reasonably stable and internally consistent.

Will AI make war worse?

We live in a time of unjust wars and military occupations, egregious violations of the rules of engagement, and an incipient arms race in the face of US–China rivalry. In this context, the inclusion of AI in war may add new complexities that exacerbate, rather than prevent, harm.

AI systems make it easier for actors in war to remain anonymous, and can render invisible the source of violence or the decisions that lead to it. In turn, we may see a growing disconnect between militaries, soldiers and civilians on the one hand, and the wars fought in the name of the nations they serve on the other.

And as AI grows more common in war, militaries will develop countermeasures to undermine it, creating a loop of escalating militarisation.

What now?

 

Can we control AI systems to head off a future in which warfare is driven by increasing reliance on technology underpinned by learning algorithms? Controlling AI development in any area, particularly via laws and regulations, has proven difficult.

Many suggest we need better laws to account for systems underpinned by machine learning, but even this is not straightforward. Machine learning algorithms are difficult to regulate.

AI-enabled weapons may program and update themselves, evading legal requirements for certainty. The engineering maxim “software is never done” implies that the law may never match the speed of technological change.

The quantitative act of estimating likely civilian deaths in advance, which the Habsora system does, says little about the qualitative dimensions of targeting. In isolation, a system like Habsora cannot tell us whether a strike would be ethical or legal (that is, whether it is proportionate, discriminate and necessary, among other considerations).

AI should support democratic ideals, not undermine them. Trust in governments, institutions and militaries is eroding, and it must be restored if we plan to apply AI across a range of military practices. We need critical ethical and political analysis to interrogate emerging technologies and their effects, so that any form of military violence remains a last resort.

Until then, machine learning algorithms are best kept separate from targeting practices. Unfortunately, the world’s armies are heading in the opposite direction.

By Bianca Baggiarini, Australian National University

The Conversation

Tags: AI, Gaza, Israel, war