Sometimes management can't get a clear focus on the proper path, or gets bogged down in battles with others over their competing pathways.
IBM seems to remain stuck, IMHO. The Watson supercomputer was quite the introduction for such a device in the A.I. world.
Apparently, something related to the "human" side of IBM keeps getting in the way.
---I suggest they program Watson with at least five scenarios and let it decide the company's path.
What greater demonstration of the company's insight and future value could there be than this?
https://spectrum.ieee.org/the-human-os/robotics/artificial-intelligence/layoffs-at-watson-health-reveal-ibms-problem-with-ai
Disclosure: no direct investments in IBM at this house
Comments
Nvidia: GPU speeds and advances into healthcare applications. Beyond IBM's difficulties presented in the post above, Nvidia has had, and will have, bangs and bumps along its path too, but it is currently finding some success, to the benefit of health diagnostics and of us humans.
The entry point to real-time, in-depth diagnostics of the human body is approaching. "Bones" from Star Trek would be impressed.
https://medcitynews.com/2018/06/meet-the-preeminent-ai-company-on-earth-but-can-it-succeed-in-healthcare/
"the United States has regained the lead thanks to a new supercomputer built for the Oak Ridge National Laboratory in Tennessee by IBM in a partnership with Santa Clara’s Nvidia."
"The Summit computer, which cost $200 million to build, is not just fast — it is also at the forefront of a new generation of supercomputers that embrace technologies at the center of the friction between the United States and China. The machines are adding artificial intelligence and the ability to handle vast amounts of data to traditional supercomputer technology to tackle the most daunting computing challenges in science, industry and national security."
Link to article in the SF Chronicle
This is nothing new. AI / BlockChain are just the new hype that "most people don't understand". The point is not whether they have potential. The point is the countless dollars that will be wasted in the name of "progress", citing stories about how we wouldn't have been able to land on the moon without such an attitude. The difference is we are talking "IT" here, not moon exploration. You can't make shit up with the latter.
I'm still waiting for the geniuses to do something with these new technologies that benefits society at large. No, the iPhone doesn't count. Neither does Alexa. Not if neither can get people out of poverty. No one was seeking profit from landing on the moon. IBM sought profit where none was to be found. Layoffs.
@VintageFreak - Funny you should ask... I've been wondering exactly the same thing for a while now.
The only main differences since the 1980s are that computers are faster and we have more data from the internet. There have been some incremental advances in machine learning, but no major breakthroughs on the fundamental problems of actual intelligence. Machine learning algorithms are "black boxes".
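To make the "black box" point concrete, here is a minimal sketch: a toy two-layer network learning XOR in plain NumPy (nothing here is specific to Watson or any product). The network ends up working, but everything it "knows" is an opaque matrix of weights, not rules a person can read off.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Hypothetical toy network: 2 inputs -> 4 hidden units -> 1 output.
W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(20000):  # plain batch gradient descent on squared error
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * (h.T @ d_out); b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * (X.T @ d_h);   b1 -= 0.5 * d_h.sum(axis=0)

print(np.round(out, 2).ravel())  # should be close to [0, 1, 1, 0]
print(W1, b1, sep="\n")          # the "explanation": opaque numbers, no rules
```

Inspecting W1 and b1 tells you nothing a human could restate as a rule, which is exactly the complaint.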
A common response back then, to the question of how Lisp machines would handle memory management, was that you could wheel in another rack of memory. Sure enough, since then the memory issue has been "fixed".
Remember Lambda machines?
http://www.computerhistory.org/revolution/artificial-intelligence-robotics/13/290
With the additional processing power and memory there has been a shift from rule-based systems to learning systems, but as @Johann noted, they're not something new.
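For what that shift amounts to in practice, here is a toy sketch; the spam-flagging task, function names, and data are all made up for illustration. In the rule-based style a human authors the decision logic; in the learning style a parameter is fitted from labeled examples.

```python
# Rule-based (expert-system era): a human writes the decision logic.
def rule_based_flag(msg: str) -> bool:
    return "free money" in msg.lower() or msg.isupper()

# Learning-based: the decision threshold is fitted from labeled examples
# instead of being authored by hand.
def fit_threshold(examples):
    # examples: (count of suspicious words in a message, was it spam?)
    spam = [count for count, is_spam in examples if is_spam]
    ham = [count for count, is_spam in examples if not is_spam]
    return (min(spam) + max(ham)) / 2  # midpoint between the two classes

data = [(0, False), (1, False), (3, True), (5, True)]  # tiny labeled set
threshold = fit_threshold(data)

print(rule_based_flag("FREE MONEY NOW"))  # True, by the authored rule
print(4 > threshold)                      # True, by the fitted threshold
```

The learning version is only as good as its data, and the fitted number carries no rationale, which is the same old trade-off, just with faster machines.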
I think we are at a point where we have advances in technology, but we either can't find applications, OR can't find applications that benefit people, OR can't find applications where we can make money - but let's get some investment anyway while we fool everyone. So, BlockChain anyone?