This AI Ecosystem Report, featuring CIO and Acceleration Economy analyst Kenny Mullican, explores groundbreaking technologies like MTJs and C-RAM that could revolutionize AI by dramatically reducing energy consumption and accelerating computational processes.
The AI Ecosystem Q2 2024 Report compiles the innovations, funding, and products highlighted in AI Ecosystem Reports from the second quarter of 2024. Download now for perspectives on the companies, innovations, and solutions shaping the future of AI.
Highlights
00:13 — I want to tell you about some exciting new technology. It's built on two complementary technologies. The first is MTJs, or magnetic tunnel junctions, which are used to build C-RAM (computational random access memory). In a conventional computer, storage and computation have lived on two separate chips: one holds the data, and the other processes it.
01:12 — The computing chip has to request the data stored in the memory chip, pull it over, process it, and then often write results back into that memory chip. All that back-and-forth between chips greatly slows processing and consumes considerably more energy. The University of Minnesota has been working on this technology for more than 20 years.
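The back-and-forth Mullican describes is the classic von Neumann bottleneck: moving data between chips typically costs far more energy than the computation itself. A minimal sketch, using made-up per-operation energy numbers purely for illustration (not measured figures for C-RAM or any real hardware), shows why eliminating transfers matters:

```python
# Toy energy model of the von Neumann bottleneck vs. compute-in-memory.
# All constants below are illustrative assumptions, not measured values.

TRANSFER_ENERGY_PJ = 100.0  # assumed cost to move one operand between chips
COMPUTE_ENERGY_PJ = 1.0     # assumed cost to perform one operation

def von_neumann_energy(num_ops: int) -> float:
    """Each operation fetches two operands from the memory chip,
    computes on the processor chip, then writes the result back."""
    per_op = 2 * TRANSFER_ENERGY_PJ + COMPUTE_ENERGY_PJ + TRANSFER_ENERGY_PJ
    return num_ops * per_op

def in_memory_energy(num_ops: int) -> float:
    """Compute happens inside the memory array itself,
    so there are no inter-chip transfers at all."""
    return num_ops * COMPUTE_ENERGY_PJ

ops = 1_000_000
vn = von_neumann_energy(ops)
im = in_memory_energy(ops)
print(f"von Neumann: {vn / 1e6:.0f} uJ, in-memory: {im / 1e6:.0f} uJ, "
      f"ratio: {vn / im:.0f}x")
```

Even with these rough numbers, the transfer cost dominates; the larger the gap between transfer and compute energy, the closer the savings get to the orders-of-magnitude reductions discussed below.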
02:21 — The energy that many of our machine learning and artificial intelligence processes consume is anticipated, within the next few years, to approach the total energy consumption of Japan. So if this technology allows us to cut that by a factor of 1,000 or 2,000, that is a significant reduction in energy usage, and it can seriously speed up the process as well.
03:21 — This technology is something I believe we will see in the near future, and it could revolutionize what we’re seeing out of artificial intelligence, especially with this generative AI boom.