IMO, "AI" just happens to be the current term wrapped around so many areas of computing. The term itself doesn't move me one way or another, relative to my positive bias toward investing in the broad computing world. AI simply became a new label for another area of tech, and something new for the investment writers and TV folks to write and talk about as the latest 'miracle' in tech. Oh, well.....this could be a new discussion unto itself.
AI, or whatever one chooses to call investing in this area of tech, involves many sectors and companies. Those (science, research, etc.) who have the ideas for how to use the quantum speed of computing, and others (Nvidia) providing the product(s) to continue this forward progress, will continue to benefit. Quantum computing is more than cloud storage and replay speeds for instant replays of real-time sports action.
Past (back to October, 2019) quantum computing threads.
Searching this word at MFO finds posts that do not involve quantum computing. However, the link above takes one to some relevant posts; one does not have to read past the October, 2019 post as far as computing is concerned. Reading further back than October, 2019 takes one into a Quantum Fund discussion and other unrelated areas (quantum physics).
Enough rambling from me.
Remain curious,
Catch
Comments
On the history part, per this source, the term "AI" was actually coined in 1956.
https://www.sas.com/en_us/insights/analytics/what-is-artificial-intelligence.html#:~:text=Artificial intelligence (AI)%20makes%20it,learning%20and%20natural%20language%20processing.
The buzz these days is referred to as AI, but it is really more specifically about generative AI, as seen below.
Excerpt (BOLD added):
Artificial Intelligence History
The term artificial intelligence was coined in 1956, but AI has become more popular today thanks to increased data volumes, advanced algorithms, and improvements in computing power and storage.
Early AI research in the 1950s explored topics like problem solving and symbolic methods. In the 1960s, the US Department of Defense took interest in this type of work and began training computers to mimic basic human reasoning. For example, the Defense Advanced Research Projects Agency (DARPA) completed street mapping projects in the 1970s. And DARPA produced intelligent personal assistants in 2003, long before Siri, Alexa or Cortana were household names.
This early work paved the way for the automation and formal reasoning that we see in computers today, including decision support systems and smart search systems that can be designed to complement and augment human abilities.
While Hollywood movies and science fiction novels depict AI as human-like robots that take over the world, the current evolution of AI technologies isn’t that scary – or quite that smart. Instead, AI has evolved to provide many specific benefits in every industry. Keep reading for modern examples of artificial intelligence in health care, retail and more.
1950s–1970s — Neural Networks: Early work with neural networks stirs excitement for "thinking machines."
1980s–2010s — Machine Learning: Machine learning becomes popular.
2011–2020s — Deep Learning: Deep learning breakthroughs drive AI boom.
Present Day — Generative AI: Generative AI, a disruptive tech, soars in popularity.
See also
https://www.celonis.com/blog/who-are-the-leaders-in-the-generative-ai-industry/#:~:text=Nvidia: Describing itself as “the,for a variety of applications.
Excerpt (BOLD added):
Nvidia: Describing itself as “the world’s most advanced platform for generative AI”, Nvidia combines accelerated computing, AI software, pre-trained models and AI foundries to enable users to build, customize, and deploy generative AI models for a variety of applications. Nvidia’s own models include StyleGAN, GauGAN and eDiff-I.