Nvidia’s Chips Have Powered Nearly Every Major AI Breakthrough

Companies / AI Dec 24, 2020 - 06:02 PM GMT

By: Stephen_McBride

 “Within 20 years, machines will be capable of doing anything man can do.”

Take a stab at when this quote is from. It wasn’t this year, 2010, or even during the ‘90s tech boom. It’s from one of America’s top computer scientists, writing in 1960.

You’ve surely heard about Artificial Intelligence (AI) before. “AI” often conjures up images of intelligent robots taking over the world. You’ll often read that it’s only a matter of time before AI steals all our jobs.

But the idea of humanoid machines is nothing new. It began with the “heartless” Tin Man from The Wizard of Oz. By the 1950s, a generation of scientists and engineers were convinced we’d soon co-exist with clever robots.


The term “artificial intelligence” was coined in 1956 at Dartmouth during the world’s first AI conference. Attendee Marvin Minsky, who later founded MIT’s AI lab, said, “In 3–8 years, we will have a machine with the intelligence of a human.”

A couple of years later, Stanford created its AI project “with the goal of building a fully intelligent machine in a decade.”

This idea gripped Hollywood, too. Ever watch the sci-fi classic 2001: A Space Odyssey? The 1968 movie is best remembered for the intelligent supercomputer, HAL 9000. HAL could think just like a human and had the ability to scheme against anyone who threatened its survival.

Soon novels like I, Robot packed our bookshelves. We got stories of robots gone mad, mind-reading robots, robots with a sense of humor, and robots that secretly run the world.

Even the US military was convinced, so it pumped billions of dollars into AI research. In the ‘50s, we imagined bionic men would soon be running factories. Within a decade, cyborgs would be doing our housework. We were promised a new breed of machines.

Seventy years later, what did we get? Dishwashers, air conditioners, and microwaves!

How Do Robots Learn?

Despite many lofty predictions and billions of dollars in funding, we never got machines with human-like intelligence. You have to dig into how machines learn to see why the idea was a flop from the get-go.

“AI” is a term that’s shrouded in a weird mix of hype and complexity. But the core idea of artificial intelligence is a machine that learns and thinks just like you or me. Most importantly, it learns all by itself, without human intervention.

Of course, learning doesn’t come naturally to machines. To overcome this challenge, scientists created neural networks in the late 1950s. In short, neural networks are computer programs that mimic how the human brain works. They are made of thousands—sometimes millions—of artificial “brain cells” that learn by analyzing examples.

Say you’re creating a machine that can recognize cats. First, you’ll feed tons of cat pictures into the neural network. After analyzing, say, 1,000 examples, it starts to learn what a cat looks like. Then you can show it a real cat it’s never seen before, and it will know what it is.
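That training loop can be sketched in a few lines of Python. Everything below is a toy made up purely for illustration: a single artificial “neuron”—the building block of a neural network—learns to separate two clusters of points from labeled examples, the same way a full network learns “cat” vs. “not cat” from millions of images:

```python
import math, random

random.seed(0)  # make the toy run reproducible

def make_examples(n=200):
    """Synthetic labeled data: two Gaussian clusters ('cat' vs 'not cat')."""
    data = []
    for _ in range(n):
        label = random.random() < 0.5              # 1 = "cat", 0 = "not cat"
        cx, cy = (2.0, 2.0) if label else (-2.0, -2.0)
        x = (cx + random.gauss(0, 1), cy + random.gauss(0, 1))
        data.append((x, 1 if label else 0))
    return data

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.1, epochs=50):
    """One logistic 'neuron' trained by gradient descent on log-loss."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y                            # gradient of the log-loss
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

data = make_examples()
w, b = train(data)

# How many examples does the trained neuron now classify correctly?
correct = sum(
    (sigmoid(w[0] * x1 + w[1] * x2 + b) > 0.5) == (y == 1)
    for (x1, x2), y in data
)
print(f"accuracy: {correct / len(data):.0%}")
```

No rule for telling the clusters apart was ever written down; the neuron found it by adjusting its weights example by example, which is all “learning” means here.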

Scientists who believed neural networks would breed intelligent computers were right on the money. Problem was… they lacked the raw materials needed to fuel their ambitions.

Remember, machines learn by analyzing examples, or data. And it turns out you need to feed them truly enormous amounts of data to kindle any kind of intelligence. Machines need to see hundreds of thousands, if not millions, of cat pictures before they “learn” what a cat looks like. But in the ‘60s and ‘70s, we didn’t have that much data. The internet hadn’t been invented yet, so we had almost no digital text or images. Books, photo libraries, and documents still lived in the physical world, and converting them into digital files was inefficient and expensive.

And get this: the lack of data wasn’t even the greatest hurdle to building intelligent computers. Designing computer programs that mimic the human brain was genius. The drawback was that neural networks needed hyper-fast computers to function.

And even in 1995, supercomputers were shockingly slow. For example, it took a giant “render farm” of 117 Sun Microsystems workstations running 24/7 to produce the original Toy Story. The machines worked non-stop for seven weeks to render the 78-minute film.

A Match Made in Heaven

After 40 years in the wilderness, two huge breakthroughs are fueling an AI renaissance.

The internet handed us a near-unlimited amount of data. A recent IBM paper found 90% of the world’s data has been created in just the last two years. From the 290+ billion photos shared on Facebook to millions of e-books and billions of online articles and images, we now have endless fodder for neural networks.

The breathtaking jump in computing power is the other half of the equation. RiskHedge readers know computer chips are the “brains” of electronics like your phone and laptop. Chips contain billions of “brain cells” called transistors. The more transistors on a chip, the faster it is.

Your phone is more powerful than the render farm that produced Toy Story. Those 117 Sun workstations had 1 billion transistors, combined. There are 8.7 billion packed onto the chip inside the latest iPhone!

And in the past decade, a special type of computer chip emerged as the perfect fit for neural networks.

Do you remember the blocky graphics on video games like Mario and Sonic from the ‘90s? If you have kids who are gamers, you’ll know graphics have gotten far more realistic since then. Here’s each Lara Croft from the Tomb Raider series since 1996:


Source: Epic Games

This incredible jump is due to chips called graphics processing units (GPUs). GPUs can perform thousands of calculations all at once, which is what makes these movie-like graphics possible. That’s different from traditional chips (CPUs), which work through calculations one at a time.
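The difference in programming models can be pictured with a plain-Python sketch. This is only conceptual—Python threads won’t deliver real GPU speedups on math-heavy code—but it shows the shape of the two approaches: one calculation after another versus the same calculation handed out over every element at once:

```python
from concurrent.futures import ThreadPoolExecutor

# A stand-in for a per-pixel graphics calculation (made up for illustration).
def shade(p):
    return (p * 3 + 7) % 256

pixels = list(range(1_000))

# CPU style: one calculation after another.
sequential = [shade(p) for p in pixels]

# GPU style: the same calculation dispatched across many workers "at once".
# (Real GPUs run thousands of these in hardware simultaneously.)
with ThreadPoolExecutor(max_workers=8) as pool:
    parallel = list(pool.map(shade, pixels))

assert sequential == parallel   # same answer either way, arrived at differently
```

Because each pixel’s result is independent of every other pixel’s, the work can be split up freely—exactly the property that makes graphics (and, as it turned out, neural network training) a perfect fit for GPUs.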

Around 2006, Stanford researchers discovered GPUs’ “parallel processing” abilities were perfect for AI training. For example, do you remember Google’s Brain project? The machine taught itself to recognize cats and people by watching YouTube videos. It was powered by one of Google’s giant data centers, running on 2,000 traditional computer chips. In fact, the project cost a hefty $5 billion.

Stanford researchers then built the same machine with GPUs instead. A dozen GPUs delivered the same data-crunching performance as 2,000 traditional chips. And it slashed costs from $5 billion to $33,000! The huge leap in computing power and the explosion of data mean we finally have the “lifeblood” of AI.
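Taking those figures at face value, the back-of-the-envelope arithmetic looks like this (a sketch only; the dollar and chip counts are the quoted ones, not independently verified):

```python
# Figures as quoted above -- not independently verified.
traditional_chips = 2_000
gpus = 12
cost_google = 5_000_000_000    # dollars
cost_stanford = 33_000         # dollars

# How many traditional chips does one GPU effectively replace?
chip_consolidation = traditional_chips / gpus

# How much cheaper was the GPU build overall?
cost_reduction = cost_google / cost_stanford

print(f"~{chip_consolidation:.0f} traditional chips replaced per GPU")
print(f"~{cost_reduction:,.0f}x cheaper")
```

Roughly 167 traditional chips per GPU and a cost reduction on the order of 150,000x—which is why the GPU, not the CPU, became the workhorse of neural network training.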

America’s Most Important Company

Artificial intelligence is the ultimate buzzword in tech these days. Data from Bloomberg shows a record 840 US firms mentioned AI at least once in recent earnings reports. In short, it’s become a “mating call” for companies trying to attract investor dollars.

The reality is few of these companies are building intelligent systems. For example, venture capital firm MMC Ventures recently studied 2,830 AI start-ups. In 40% of cases, it found no evidence AI was an important part of their business.

You only need to ask one simple question to weed out the fakes: What percent of their sales come from AI? I’ve done the work, and I can tell you only a handful make any money from this budding disruption.

The one company with a booming AI business is NVIDIA (NVDA). NVIDIA invented the graphics processing unit back in the 1990s. It’s largely responsible for the realistic video game graphics we have today. Then we discovered these gaming chips were perfect for training neural networks.

NVIDIA stumbled into AI, but it realized early on what a huge opportunity it had. Soon after, NVIDIA started building chips specifically optimized for machine learning. In the first half of 2020, AI-related sales topped $2.8 billion. In fact, more than 90% of neural network training runs on NVIDIA GPUs today.

Its AI chips are light-years ahead of the competition. Its newest system, the A100, is described as an “AI supercomputer in a box.” With more than 54 billion transistors, it’s the most powerful chip system ever created.

In fact, just one A100 packs the same computing power as 300 data center servers. And it does it for one-tenth the cost, takes up one-sixtieth the space, and runs on one-twentieth the power consumption of a typical server room. A single A100 reduces a whole room of servers to one rack.

The Epicenter of Disruption

NVIDIA has a virtual monopoly on neural network training. And every breakthrough worth mentioning has been powered by its GPUs.

Computer vision is one of the world’s most important disruptions. And graphics chips are perfect for helping computers to “see.”

NVIDIA crafted its DRIVE chips specially for self-driving cars. These chips power several robocar startups including Zoox, which Amazon just snapped up for $1.2 billion. With NVIDIA’s backing, vision disruptor Trigo is transforming grocery stores into giant supercomputers.

Trigo outfits stores with a network of cameras and sensors, which feed its neural network with reams of data. In short, the network has learned to “see” what items customers throw in their baskets. So when you’re finished shopping, you simply walk out. Trigo then sends the store a tally, and the store bills you for that amount.

Trigo’s computer vision system is powered by NVIDIA chips and software. The UK’s largest grocer, Tesco, is trialing Trigo in several of its stores, and each system runs on 40–50 GPUs.

But hands-down the biggest breakthroughs are happening in America’s most broken industry—healthcare.

Cancer is the #2 killer in America, responsible for 600,000 deaths last year. Catching the disease early has proven to be an effective way of beating it. But today, spotting tumors is a manual, time-consuming process.

Medical imaging disruptor Paige.AI built an AI system that could revolutionize cancer diagnosis. Paige.AI fed millions of real-life medical images into its neural network. Using 10 NVIDIA GPUs, it trained the system to detect early signs of tumors.

The neural network recently tested itself by scanning 12,000 medical images for potential tumors. It had never seen these images before, yet was able to “achieve near perfect accuracy.” After announcing these results, Paige.AI was granted “Breakthrough Designation” by the FDA, the first ever for an AI in cancer diagnosis.

NVIDIA is also opening the door to early detection of Alzheimer’s. Stanford researchers built an AI system that detects Alzheimer’s disease from scanning MRIs with 94% accuracy. Powered by six GPUs, it “learned” what biomarkers were most commonly associated with early signs of the disease.

The powerful GPU/AI combo is also saving victims of strokes. During a stroke, patients lose roughly 1.9 million brain cells every minute. So interpreting their CT scans even one second faster matters.
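The arithmetic behind that “time is brain” point is simple enough to check, using only the figures quoted here:

```python
# "Time is brain": arithmetic using the figures quoted above.
neurons_lost_per_minute = 1_900_000

per_second = neurons_lost_per_minute / 60
print(f"{per_second:,.0f} brain cells lost per second")

# So a diagnosis that arrives 30 seconds sooner spares on the
# order of a million neurons per patient.
spared_30s = per_second * 30
print(f"{spared_30s:,.0f} cells spared by a 30-second speedup")
```

At roughly 32,000 neurons lost per second, even a half-minute head start on reading a CT scan is clinically meaningful.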

Medical imaging startup Deep01 has created a neural network, DeepCT, which evaluates strokes almost instantly. DeepCT has a 95% accuracy rate within 30 seconds per case, roughly 10x faster than traditional methods. The system was trained on 60,000 medical images, using NVIDIA chips. And get this… Deep01 is the first Asian firm to be granted FDA clearance for an AI product.

I could pepper you with dozens more examples, but you see my point. NVIDIA’s chips have powered almost every major AI breakthrough. It’s a buy today.

The Great Disruptors: 3 Breakthrough Stocks Set to Double Your Money
Get my latest report where I reveal my three favorite stocks that will hand you 100% gains as they disrupt whole industries. Get your free copy here.

By Stephen McBride

http://www.riskhedge.com

© 2020 Copyright Stephen McBride - All Rights Reserved.

Disclaimer: The above is a matter of opinion provided for general information purposes only and is not intended as investment advice. Information and analysis above are derived from sources and utilising methods believed to be reliable, but we cannot accept responsibility for any losses you may incur as a result of this analysis. Individuals should consult with their personal financial advisors.


© 2005-2022 http://www.MarketOracle.co.uk - The Market Oracle is a FREE Daily Financial Markets Analysis & Forecasting online publication.

