
Here's a statement of the obvious: The opinions expressed here are those of the participants, not those of the Mutual Fund Observer. We cannot vouch for the accuracy or appropriateness of any of it, though we do encourage civility and good humor.


Quantum+ computing, what AI has been waiting for; although not new to the tech world

edited February 24 in Other Investing
IMO, "AI" just happens to be the current term wrapping itself into so many areas of computing. I don't let the term itself move me one way or another, relative to my positive bias toward investing in the broad computing world. AI simply became a new label for another area of tech, and something new for the investment writers and TV folks to write and talk about as the latest miracle. Oh, well... this could be a new discussion unto itself.

AI, or whatever one chooses to call investing in this area of tech, involves many sectors and companies. Those (in science, research, etc.) with ideas for how to use the quantum speed of computing, and those (Nvidia and others) providing the products to carry that progress forward, will continue to benefit. Quantum computing is far more than cloud storage and replay speeds for instant replays of real-time sports action.

Past quantum computing threads (back to October 2019).
Searching this term at MFO turns up posts that do not involve quantum computing. The link above, however, leads to some relevant posts; as far as computing is concerned, one does not need to read past the October 2019 post. Reading further takes one into a Quantum Fund discussion and other unrelated areas (quantum physics).

Enough rambling from me.
Remain curious,


  • edited February 24
    I for one appreciate your "rambling" as it spurs talk about this topic.

    On the history part, per this source, the term "AI" was actually coined in 1956.

    The buzz these days is labeled AI, but it is really more specifically about generative AI, as seen below.

    Excerpt (BOLD added):

    Artificial Intelligence History
    The term artificial intelligence was coined in 1956, but AI has become more popular today thanks to increased data volumes, advanced algorithms, and improvements in computing power and storage.

    Early AI research in the 1950s explored topics like problem solving and symbolic methods. In the 1960s, the US Department of Defense took interest in this type of work and began training computers to mimic basic human reasoning. For example, the Defense Advanced Research Projects Agency (DARPA) completed street mapping projects in the 1970s. And DARPA produced intelligent personal assistants in 2003, long before Siri, Alexa or Cortana were household names.

    This early work paved the way for the automation and formal reasoning that we see in computers today, including decision support systems and smart search systems that can be designed to complement and augment human abilities.

    While Hollywood movies and science fiction novels depict AI as human-like robots that take over the world, the current evolution of AI technologies isn’t that scary – or quite that smart. Instead, AI has evolved to provide many specific benefits in every industry. Keep reading for modern examples of artificial intelligence in health care, retail and more.

    Neural Networks

    Early work with neural networks stirs excitement for “thinking machines.”

    Machine Learning

    Machine learning becomes popular.

    Deep Learning

    Deep learning breakthroughs drive AI boom.

    Present Day
    Generative AI

    Generative AI, a disruptive tech, soars in popularity.

    See also this source on generative AI platforms.

    Excerpt (BOLD added):

    Nvidia: Describing itself as “the world’s most advanced platform for generative AI”, Nvidia combines accelerated computing, AI software, pre-trained models and AI foundries to enable users to build, customize, and deploy generative AI models for a variety of applications. Nvidia’s own models include StyleGAN, GauGAN and eDiff-I.
  • edited February 24
    Good posts. In the history of AI, did we ever have the current level of widespread consumer (and, as such, private companies') participation? If not, the current enthusiasm is unique. Maybe the current popularity has to do with consumers directly interacting with AI via services, while prior booms resulted in products the consumer used without needing to appreciate the power of AI (e.g., machine learning of 2010-20).