What is AI? Understanding the real-world impact of artificial intelligence

Artificial intelligence is today’s most discussed and debated technology, generating widespread admiration and anxiety as well as serious interest and investment from governments and businesses. Amid the many studies and surveys encouraging immediate adoption, what is AI’s actual impact on businesses?

“2021 was the year that artificial intelligence went from an emerging technology to a mature one. . . with real impact, both positive and negative,” says the 2022 AI Index report. The fifth installment of the index measures AI’s development and impact along a number of dimensions, including private investment in AI, the number of AI patents filed, and the number of AI-related bills passed in the legislatures of 25 countries around the world.

However, there is nothing in the report about real-world impact as I would define it: measurably successful, sustainable, and meaningful AI deployments. Nor does the report define “AI.”

Going back to the first edition of the AI Index, published in 2017, we find that it, too, does not define what it measures. But the reason for the report is stated at the outset: “. . . the field of AI continues to evolve, and even experts struggle to perceive and track progress across it. Without the relevant data for reasoning about the state of AI technology, we are ‘flying blind’ in our AI-related conversations and decision-making.”

“Flying blind” is an apt description, in my opinion, of collecting data about something you do not define.

The 2017 report was “created and launched as a project of the One Hundred Year Study on AI at Stanford University (AI100),” whose first report was published in 2016. That study opened by asking “What is artificial intelligence?” only to offer the classic circular definition: AI is what makes machines intelligent, and intelligence is the “quality that enables an entity to function appropriately and with foresight in its environment.”

So were early computers (popularly called “giant brains”) “intelligent” because they could calculate, even faster than humans? The One Hundred Year Study answers: “Although our broad interpretation places the calculator within the intelligence spectrum. . . the frontier of AI has moved far ahead, and the functions of the calculator are only one among the millions that today’s smartphones can perform.” In other words, everything a computer has ever done or currently does is “AI.”

The study also proposes an “operational definition”: “AI can also be defined by what AI researchers do.” This is why this year’s AI Index measures AI’s “real impact” and “progress,” among other indicators, by counting AI papers and citations (defined as “AI” by the papers’ authors and indexed with the keyword “AI” by the publications).

Beyond circular definitions, however, the study gives us a clear and concise description of what caused the sudden frenzy and anxiety around a term coined in 1955: “Several factors have fueled the AI revolution. Foremost among them is the maturing of machine learning, supported in part by cloud computing resources and large-scale, web-based data gathering. Machine learning has been propelled dramatically forward by ‘deep learning,’ a form of adaptive artificial neural networks trained using a method called backpropagation.”

In fact, “machine learning” (a term coined in 1959), or training a computer to classify data (spam or not spam) and/or make a prediction (if you liked book X, you will like book Y), is what today’s “AI” is all about. Specifically, its most recent variant, “deep learning,” which has dominated since its image-classification breakthrough in 2012 and involves classifying vast amounts of data with many features.
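The classification half of that description can be sketched in a few lines. Below is a toy naive Bayes spam filter in plain Python, one of the simplest forms of “learning from data”; the training phrases, labels, and test messages are invented for illustration and not drawn from any real system.

```python
from collections import Counter
import math

# Toy training data: (text, label) pairs. All examples are made up.
train = [
    ("win cash prize now", "spam"),
    ("free prize click now", "spam"),
    ("meeting agenda attached", "ham"),
    ("lunch tomorrow with the team", "ham"),
]

# "Learning" step: count how often each word appears in each class.
word_counts = {"spam": Counter(), "ham": Counter()}
class_counts = Counter()
for text, label in train:
    class_counts[label] += 1
    word_counts[label].update(text.split())

vocab = set(word_counts["spam"]) | set(word_counts["ham"])

def classify(text):
    """Naive Bayes: pick the class with the higher log-probability."""
    scores = {}
    for label in word_counts:
        # Prior: how common the class is overall.
        score = math.log(class_counts[label] / sum(class_counts.values()))
        total = sum(word_counts[label].values())
        for word in text.split():
            # Likelihood with add-one smoothing so unseen words don't zero out.
            score += math.log((word_counts[label][word] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

print(classify("claim your free prize"))   # classified as spam
print(classify("team meeting tomorrow"))   # classified as ham
```

Deep learning replaces the hand-counted word statistics with features extracted automatically by many layers of a neural network, but the underlying task, classifying new data based on labeled past data, is the same.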

AI learns from data. The 1955 variety of AI, which generated a series of boom-and-bust cycles, was based instead on the assumption that “every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it.” That vision has, by and large, not yet materialized in a meaningful and sustainable way that demonstrates significant “real impact.”

A serious problem with this vision was that it predicted the arrival, in the not-too-distant future, of a machine with human-level intelligence (or even one surpassing humans), a prediction repeated periodically by very intelligent humans, from Turing to Minsky to Hawking. This desire to play God, together with the old-style “AI,” has confused the discussion (and the business and government actions) around today’s “AI.” That’s what happens when you don’t define what you’re talking about (or define AI as whatever AI researchers do).

It is the combination of new backpropagation techniques, specialized hardware (GPUs) better suited to the computations involved and, above all, the availability of vast amounts of data (including pre-labeled data used to teach the computer the correct classifications) that produced the current “AI revolution.”
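At its core, backpropagation adjusts a model’s weights by propagating the prediction error backward through the chain rule, one gradient step at a time. A minimal single-neuron sketch in plain Python (the data points, labels, and learning rate are invented for illustration; real deep learning repeats this across millions of weights and many layers):

```python
import math

# Toy labeled data: hours studied -> passed (1) or failed (0). Made-up numbers.
data = [(1.0, 0), (2.0, 0), (3.0, 1), (4.0, 1)]

w, b = 0.0, 0.0   # weights start at zero
lr = 0.5          # learning rate

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Each iteration: the forward pass computes a prediction, the backward
# pass propagates the error back to the weights via the chain rule.
for _ in range(5000):
    grad_w = grad_b = 0.0
    for x, y in data:
        pred = sigmoid(w * x + b)   # forward pass
        error = pred - y            # derivative of the loss
        grad_w += error * x         # backward pass: chain rule
        grad_b += error
    w -= lr * grad_w / len(data)    # gradient descent step
    b -= lr * grad_b / len(data)

# After training: few hours -> probability near 0, many hours -> near 1.
print(sigmoid(w * 1.0 + b), sigmoid(w * 4.0 + b))
```

The “learning” here is nothing more than repeated statistical error correction against labeled examples, which is why the labeled data matters as much as the algorithm.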

Call it the triumph of statistical analysis. This “revolution” is in fact the culmination of a 60-year evolution of increasingly sophisticated statistical analysis applied to a wide variety of business (or medical, or government, etc.) decisions, actions, and transactions. It has been called “data mining” and “predictive analytics” and, most recently, “data science.”

Last year, a study of 30,000 U.S. manufacturing plants found that “productivity is significantly higher among plants that use predictive analytics.” (Incidentally, Erik Brynjolfsson, the study’s lead author, has served on the AI Index report’s steering committee since its inception.) It turns out it is possible to find a measurable “real impact” of “AI,” as long as you define it correctly.

AI learns from data. And the successful, measurable business application of learning from data is what I would call practical AI.
