
What is artificial intelligence? Here’s everything you need to know

Crazy singularities, robot rebellions, falling in love with computers: Artificial intelligence conjures up a multitude of wild what-ifs. But in the real world, A.I. involves machine learning, deep learning, and many other programmable capabilities that we’re just beginning to explore. Let’s put the fantasy stuff on hold (at least for now) and talk about real-world A.I. Here’s what it is, how it works, and where it’s going.

What is artificial intelligence?

A.I. seeks to process and respond to data much like a human would. That may seem overly broad, but it needs to be: Developers are baking human-like smarts into a wide variety of applications. Generally, A.I. falls into three categories — though we should note there’s still some disagreement over the exact definitions, much less over whether they’re all truly possible.

  • Narrow: Narrow A.I. (sometimes called “weak A.I.”) is where most of humankind’s work so far has been. As its name suggests, it is focused on executing a single task, and interactions with a narrow A.I. are limited. Examples include checking weather reports, controlling smart home devices, or answering general questions pulled from a central database (Wikipedia, etc.). Several narrow A.I.s can be strung together to offer a more comprehensive service: Alexa, Google Assistant, Siri, and Cortana are great examples, as are current forms of the autonomous car. Narrow A.I. can’t think for itself, which is why you’ll sometimes get a nonsensical answer back — it lacks the ability to understand context (the toy sketch after this list illustrates the limitation).
  • General: General A.I. (or “strong A.I.”) is where we’re headed. Here, A.I. gains the ability to understand context and make judgments based on it. Over time, it learns from experience, can make decisions even in times of uncertainty or with no prior data available, uses reason, and is creative. Intellectually, these computers would operate much like the human brain. So far we haven’t been able to build one, although many researchers believe we might sometime this century.
  • Super: In the far distant future, A.I. may become intellectually superior to humans in every way. A.I. robots would be able to think for themselves, attain consciousness, and operate without any human involvement, perhaps at the direction of another A.I. This sounds like the Skynet-style dystopia some warn about, complete with the end of humanity, but it could also be the dawn of an era of innovation that makes previous advancements look pedestrian.
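
To make “narrow” concrete, here’s a toy Python sketch of the keyword-matching idea described above. Everything in it (the intents, the canned responses) is hypothetical and invented for illustration; real assistants are far more sophisticated, but the failure mode is the same: outside its programmed tasks, there is no context to fall back on.

```python
# A toy sketch (not any real assistant's code) of a narrow A.I.:
# it matches keywords against a fixed set of tasks, with no grasp of context.
INTENTS = {
    "weather": "It's 72 degrees and sunny today.",         # canned single-task answer
    "lights": "Okay, turning on the living room lights.",  # another isolated task
}

def narrow_assistant(question: str) -> str:
    """Answer only the tasks it was built for; everything else falls through."""
    q = question.lower()
    for keyword, response in INTENTS.items():
        if keyword in q:
            return response
    # No understanding of what the user meant -- the source of nonsensical replies.
    return "Here's what I found on the web for that."

print(narrow_assistant("What's the weather like?"))  # sensible answer
print(narrow_assistant("Should I wear a jacket?"))   # misses the point entirely
```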

A.I. can also be classified by how it operates, which is particularly important when considering how complex an A.I. system is and its ultimate cost. If a company is creating an A.I. solution, the first question must be, “Will it learn through training or inference?”

  • Training: These A.I.s are designed to learn and improve over time, adjusting their data sets and certain parts of their processes to become more efficient. General and super A.I. platforms will be able to do this; narrow A.I. typically does not, since the amount of processing power necessary is so great that it’s quite expensive.
  • Inference: Most narrow A.I.s are designed to look at data and draw conclusions in careful steps, a much cheaper and less computationally expensive method. For example, to answer the question “What was the score of yesterday’s games?” an A.I. might infer, “To answer this question, I must find data for yesterday’s game scores by searching a list of reliable sports datasets, compare that data to the favorite teams listed in settings, and report the scores back as audio.” While helpful to the end user, if the response wasn’t exactly what the user was looking for, the A.I. has little ability to adapt on its own over time. A human must get involved to make its responses more relevant. (The sketch after this list contrasts the two approaches.)
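
Here’s a minimal sketch of the contrast, with made-up data throughout. The inference pipeline is a fixed chain of hand-written steps (like the sports-score example above), while the training loop adjusts its own parameter from feedback, which is why training demands so much more computation at scale.

```python
# Inference: a fixed, hand-built chain of steps. Cheap to run, but it never
# improves on its own; a human must rewrite the steps to change its behavior.
def score_pipeline(favorite_teams: list[str]) -> list[str]:
    scores = {"City FC": "2-1", "United": "0-0", "Rovers": "3-2"}  # stand-in dataset
    return [f"{team}: {scores[team]}" for team in favorite_teams if team in scores]

print(score_pipeline(["City FC", "Rovers"]))

# Training: the system adjusts its own parameter to reduce its error
# (a one-weight version of gradient descent, on invented example data).
weight = 0.0
examples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # inputs and desired outputs
for _ in range(50):                              # each pass refines the weight
    for x, target in examples:
        error = weight * x - target
        weight -= 0.05 * error * x               # learn from the mistake
print(f"learned weight: {weight:.2f}")           # converges toward 2.0
```

Scale that single weight up to billions of parameters and the cost gap becomes clear: the pipeline runs once per question, while training repeats the adjustment loop over enormous data sets.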

As we’ve noted, these definitions are only meant as a general guide (this Medium article is a great discussion of what we’ve just talked about), and some may have slightly different descriptions. But there are examples of current A.I. that are worth discussing.

Current forms of A.I.

C2Sense tiny artificial nose sensor. Jan Schnorr/C2Sense

Voice assistants: Siri, Cortana, Alexa, and other voice assistants are growing more common, becoming the “face” of modern A.I. A growing subset here are chatbots, which manage messaging on websites and carry on online conversations.

Translation: This isn’t just about translating language. It’s also about translating objects, pictures, and sounds into data that can then be used in various algorithms.

Predictive systems: These A.I.s look at statistical data and draw conclusions for governments, investors, doctors, meteorologists, and nearly every other field where statistics and event prediction prove valuable.
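
At its simplest, that kind of prediction is curve-fitting: learn a trend from past observations, then extrapolate. A hedged sketch, with invented rainfall numbers standing in for real statistical data:

```python
# Fit a straight line y = slope*x + intercept to past observations by
# least squares, then extrapolate one step ahead. All numbers are made up.
past_rainfall = [3.1, 3.4, 3.8, 4.1, 4.5]   # hypothetical monthly readings

n = len(past_rainfall)
xs = range(n)
mean_x = sum(xs) / n
mean_y = sum(past_rainfall) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, past_rainfall)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(f"predicted next month: {slope * n + intercept:.2f} inches")
```

Real predictive systems layer far more sophisticated models on far richer data, but the shape of the job is the same: past data in, forecast out.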

Marketing: These A.I.s analyze buyers and their behavior, then choose tactics, products, and deals that best fit said behavior. There is a lot of crossover between these behind-the-scenes tools and voice assistants at the moment.

Research: Research A.I.s like Iris search through complex documents and studies for specific information, typically at higher speeds than Google’s search engine.

Awareness: These A.I.s watch for and report unusual events when humans can’t keep an eye on them. One of the most complex examples of this is theft detection, which reports unusual behavior. A more exciting example, however, is self-driving cars, which use A.I. systems to scan for dangers and choose the appropriate course of action.
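
One common building block here is anomaly detection: learn what “normal” looks like, then flag readings that stray too far from it. A toy sketch with invented sensor values:

```python
# Flag readings that sit far from the mean -- a crude stand-in for the
# statistical models real monitoring systems use. Values are hypothetical.
def flag_unusual(readings: list[float], threshold: float = 2.0) -> list[int]:
    """Return indices of readings more than `threshold` standard deviations
    from the mean of the series."""
    mean = sum(readings) / len(readings)
    variance = sum((r - mean) ** 2 for r in readings) / len(readings)
    std = variance ** 0.5 or 1.0   # guard against an all-identical series
    return [i for i, r in enumerate(readings) if abs(r - mean) > threshold * std]

door_sensor = [1.0, 1.1, 0.9, 1.0, 9.5, 1.0]   # one reading clearly off
print(flag_unusual(door_sensor))                # -> [4]
```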

Editing software: These basic A.I.s look at pictures or text and locate ways that they could be improved.

Where A.I. is headed

Recently, neural networking expert Charles J. Simon opined on our pages about where he thinks A.I. is headed, which we recommend you read. While we won’t cut and paste the entire article here, we’ll point you to one specific section:

Most people look at the limitations of today’s A.I. systems as evidence that AGI [general A.I.] is a long way off.  We beg to differ. A.I. has most of AGI’s needed pieces already in play, they just don’t work together very well — yet.

This is a key point. As we’ve noted, A.I. is getting better — at least perceptually — because developers are stringing together several narrow A.I. platforms. But the platforms don’t talk with each other. For example, while Alexa might now be able to start your car, it can’t use the current weather conditions to adjust your car’s heater or air conditioning or start the defroster so you’re ready to go as soon as you get in. But Simon argues that we may already have the computational and developmental capability without knowing it, or will within the next decade.

Companies are spending massive amounts of money on A.I. right now, and as long as they’re willing to spend the billions (if not eventually trillions) to advance the technology, things are going to move quickly. But there are all kinds of roadblocks in the way — whether a recessionary economy, computational challenges, or moral and philosophical hurdles — so the road to a real-world Skynet might be a long one.

Is A.I. dangerous?

While we keep coming back to the obvious Skynet references, it’s time for a bit of a reality check. Right now, A.I.s are long strings of programmed responses and collections of data, and they don’t have the ability to make truly independent decisions. That being the case, malice is definitely off the table for the time being. But that’s not to say human error can’t make them dangerous.

For example, if a predictive A.I. tells a team that storms will spawn on the East Coast next week, the team can send resources and warnings there in preparation. But if storms actually appear in the Gulf of Mexico and hit the coast there, that prediction was inaccurate and may have endangered lives. No one would think the A.I. is somehow personally to blame for this; instead, they would look at the various data inputs and algorithm adjustments. Like other types of software, A.I.s remain complex tools for people to use.

At least for now, A.I. is, for the most part, harmless and, if anything, helpful to the world at large. But that could change in the distant future, and at that point we’ll need to have a serious discussion about just how much of our lives we’re willing to turn over to machines.

Ed Oswald