DARPA's XAI Explainable Artificial Intelligence Future

Politics / AI May 15, 2018 - 02:00 PM GMT

By: BATR


The popular scenario has AI deploying autonomous killer military robots as storm troopers. The mission of DARPA is to create the cutting edge of weaponized technology. So when a report contends that the Pentagon is now using Jade Helm exercises to teach Skynet how to kill humans, it is not simply a screenplay for a Hollywood blockbuster.

"Simply AI quantum computing technology that can produce the holographic battle simulations and, in addition, "has the ability to use vast amounts of data being collected on the human domain to generate human terrain systems in geographic population centric locations" as a means of identifying and eliminating targets - insurgents, rebels or "whatever labels that can be flagged as targets in a Global Information Grid for Network Centric Warfare environments."


While this assessment may alarm the most fearful, Steven Walker, director of the Defense Advanced Research Projects Agency, presents a far more sedate viewpoint in DARPA: Next-generation artificial intelligence in the works.

"Walker described the current generation of AI as its “second wave,” which has led to breakthroughs like autonomous vehicles. By comparison, “first wave” applications, like tax preparation software, follow simple logic rules and are widely used in consumer technology.

While second-wave AI technology has the potential to, for example, control the use of the electromagnetic spectrum on the battlefield, Walker said the tools aren’t flexible enough to adapt to new inputs.

The third wave of AI will rely on contextual adaptation — having a computer or machine understand the context of the environment it’s working in, and being able to learn and adapt based on changes in that environment."

Here is where the XAI model comes into play. The authoritative publication Jane's states that DARPA's XAI seeks explanations from autonomous systems. "According to DARPA, XAI aims to “produce more explainable models, while maintaining a high level of learning performance (prediction accuracy); and enable human users to understand, appropriately trust, and effectively manage the emerging generation of artificially intelligent partners”.

Mr. David Gunning provides insight into why Explainable Artificial Intelligence (XAI) is the next development.

"XAI is one of a handful of current DARPA programs expected to enable “third-wave AI systems”, where machines understand the context and environment in which they operate, and over time build underlying explanatory models that allow them to characterize real world phenomena.

The XAI program is focused on the development of multiple systems by addressing challenge problems in two areas: (1) machine learning problems to classify events of interest in heterogeneous, multimedia data; and (2) machine learning problems to construct decision policies for an autonomous system to perform a variety of simulated missions. These two challenge problem areas were chosen to represent the intersection of two important machine learning approaches (classification and reinforcement learning) and two important operational problem areas for the DoD (intelligence analysis and autonomous systems)."
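For readers who want a concrete sense of what an "explainable model" means on the classification side of that challenge, the following is a minimal, hypothetical sketch in Python. The scikit-learn library and the iris dataset are illustrative stand-ins chosen for this example and are not part of DARPA's program; the point is only that a deliberately shallow decision tree trades a little predictive accuracy for decision rules a human user can actually read and audit.

# Hypothetical illustration of an "explainable" classifier: a shallow decision
# tree whose learned rules can be printed and inspected by a human user.
# scikit-learn and the iris dataset are illustrative stand-ins, not DARPA's data.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_iris()
X, y = data.data, data.target

# Limiting the depth keeps the rule set small enough to audit by eye.
model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# export_text renders the learned decisions as nested if/else rules,
# one simple form of the "explainable model" DARPA describes.
print(export_text(model, feature_names=list(data.feature_names)))
print("training accuracy:", model.score(X, y))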

The FedBizOpps government site provides this synopsis: "The goal of Explainable AI (XAI) is to create a suite of new or modified machine learning techniques that produce explainable models that, when combined with effective explanation techniques, enable end users to understand, appropriately trust, and effectively manage the emerging generation of AI systems."

The private sector is involved in these developments. The question that gets lost involves national security, since Xerox is being bought by Fujifilm. But why worry over such mere details when the machines are on a path to become self-directed networks?

"PARC, a Xerox company, today announced it has been selected by the Defense Advanced Research Projects Agency (DARPA), under its Explainable Artificial Intelligence (XAI) program, to help advance the underlying science of AI. For this multi-million dollar contract, PARC will aim to develop a highly interactive sense-making system called COGLE (COmmon Ground Learning and Explanation), which may explain the learned performance capabilities of autonomous systems to human users."

With the news that the Xerox sale to Fuji has been called off, could the PARC component have been a deal breaker?

As for trusting the results of the technology, just ask the machine. It will tell the human user what to believe. Another firm involved with XAI is Charles River Analytics. The stated objective is to overcome the current limitations of the human interface. "The Department of Defense (DoD) is investigating the concept that XAI -- especially explainable machine learning -- will be essential if future warfighters are to understand, appropriately trust, and effectively manage an emerging generation of artificially intelligent machine partners."

The Defense Department is developing such a capability, as described in New project wants AI to explain itself.

"Explainable Artificial Intelligence (XAI), which looks to create tools that allow a human on the receiving end of information or a decision from an AI machine to understand the reasoning that produced it. In essence, the machine needs to explain its thinking.

More recent efforts have employed new techniques such as complex algorithms, probabilistic graphical models, deep learning neural networks and other methods that have proved to be more effective but, because their models are based on the machines’ own internal representations, are less explainable.

The Air Force, for example, recently awarded SRA International a contract to focus specifically on the trust issues associated with autonomous systems."
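When the model in question is one of the deep learning neural networks described above, whose internal representations resist direct inspection, one widely used workaround is a post-hoc "surrogate" explanation: fit a small, readable model to mimic the black box's outputs and inspect that approximation instead. The sketch below is a hedged illustration under assumed tooling; scikit-learn, the random forest, and the breast cancer dataset are arbitrary stand-ins, not the systems named in these contracts.

# Hedged sketch of a post-hoc surrogate explanation: a random forest stands in
# for a hard-to-read "black box", and a shallow decision tree is trained to
# imitate its predictions so its approximate reasoning can be inspected.
# Library and dataset are illustrative assumptions, not the DoD's systems.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

data = load_breast_cancer()
X, y = data.data, data.target

# The opaque model: accurate, but its internals resist human inspection.
black_box = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# The surrogate learns to reproduce the black box's outputs, not the true labels.
surrogate = DecisionTreeClassifier(max_depth=3, random_state=0)
surrogate.fit(X, black_box.predict(X))

print("surrogate agreement with black box:",
      surrogate.score(X, black_box.predict(X)))
print(export_text(surrogate, feature_names=list(data.feature_names)))

The agreement score hints at how faithful the readable approximation is; a low score would mean the explanation cannot be trusted, which is precisely the human-trust problem these contracts are meant to address.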

It would be a mistake to equate an AI system with just an advanced autopilot device navigating an aircraft. While the stated objective of creating an AI that can explain itself through a human interface sounds reassuring, the actual risk of generating an entirely independent computerized decision structure is being mostly ignored.

Just look at the dangerous use of AI at Facebook: AI Is Inventing Languages Humans Can’t Understand. Should We Stop It? Can DARPA be confident that it can control a self-generating, thinking Artificial Intelligence entity that may very well see a human component as unnecessary? Imagine a future combat regiment that sees its commanding officer as inferior to the barking of a drill sergeant computer terminal. In such an environment, where would a General Douglas MacArthur fit in?

XAI rests on the overly optimistic belief that humans can always pull the plug on a rogue machine. Well, such a conviction would need to be approved by the AI cloud computer.

SARTRE

Source: http://batr.org/utopia/051518.html

Discuss or comment about this essay on the BATR Forum

http://www.batr.org

"Many seek to become a Syndicated Columnist, while the few strive to be a Vindicated Publisher"

© 2018 Copyright BATR - All Rights Reserved

Disclaimer: The above is a matter of opinion provided for general information purposes only and is not intended as investment advice. Information and analysis above are derived from sources and utilising methods believed to be reliable, but we cannot accept responsibility for any losses you may incur as a result of this analysis. Individuals should consult with their personal financial advisors

BATR Archive


