Keep in mind that a positive research result is not necessarily anywhere close to usability, commercial viability, or even workability outside the lab. Nuclear fusion is a prominent example of this, and I'm still waiting for MRAM, which has been "around the corner" for quite some time now. Until memory glass is actually available, who knows whether it ever will be.
Everspin needs to market it better... but yeah, if you're a Computer Engineer, you can build MRAM systems. The issue now is to build computers that actually take advantage of MRAM.
It's a niche product. No one wants MRAM right now, so the people who produce it can sell it for however much they want.
It's not necessarily better than DDR3 RAM as RAM, and it's significantly less dense than Flash RAM. The practical applications of MRAM are not quite as popular as once thought...
So... it remains a niche product, with a niche price.
Again, MRAM is a niche product. So is a crypto chip. If you want a crypto chip, they are already available with Flash RAM / Microcontroller combos. There's no need to use MRAM for that application, when Flash is already widespread and cheap.
I disagree with your irrelevant comparison to fusion technologies. It's not like they have to create crazy powerful magnetic confinement fields here. It's lasers and glass. The CD went from an initiative at Philips in 1974 to release in 1983. 9 years to fully develop and do a production release.
I think it's inspiring, and I'm thankful I'm alive to see all of it unfold before us.
It's ultrafast femtosecond pulse lasers, spatial light modulators, and lab-quality fused silica glass. Nobody has ever put any of those into a consumer product before.

The CD, by contrast, combined microscopic feature casting in plastic (the same technology used for phonograph records since at least the 1930s, if not the 1890s), metal-plating of plastic (from the 1950s), room-temperature semiconductor lasers (from the 1970s, although I don't know when their first mass-market product was), error-correction codes (commonly used since the 1940s), and PCM (from the 1930s, but, I think, only then being rolled out on a grand scale for digital telephony).

The only one of the component technologies that might have had uncertainty as to its suitability for mass-market use was the semiconductor laser diode, and in theory it wasn't necessary: you could have built CD players with HeNe lasers like the ones then being rolled out in supermarket barcode scanners, which had been used for freight-car barcode scanners for a decade before that. But they would have been heavy and fragile like a tube radio or a fluorescent light, not rugged and lightweight.
Aside from this, the storage technology itself might turn out not to work. It's holographic, and extrapolating is perilous in holography — some small source of noise that isn't significant when you have a megabyte of data recorded might turn out to be overwhelming when you have ten gigabytes of data recorded, let alone hundreds of terabytes.
Also, it occurs to me that megapixel spatial light modulators are also the key element in megapixel projector displays, so that might also be already ready for prime time.
TI just released an FRAM version of the MSP430 (using FRAM in place of flash).
It has several features which make it better suited to ultra-low power operation.
I know FRAM was popular 10 years or so ago, but never really made it big. I'd always assumed someone had a patent locked up that made other companies avoid it. Maybe that's changed recently?
MRAM and FRAM are commercially available solutions, but the status quo of the computer industry is to build SSDs out of much, much slower (but far denser) Flash RAM.
> Keep in mind that a positive research result is not necessarily anywhere close to any of usability, commercial viability, or even possibility outside the lab. Nuclear fusion is a prominent example of this, and I'm still waiting for MRAM
I, for one, am still waiting for those jellyfish-based CPUs :(