Anecdata: Ordered my Mac mini M4 Pro (48GB) on April 1. Was told it wouldn't be available until June 4 but it just came in yesterday—a full month early. So I think there is an "underpromise; overdeliver" thing happening with current orders. Will be curious to see what happens with the Mini M5 release this year.
It seems like the M3 Ultra with 512GB RAM was a unicorn we won't ever see again :( Many skipped buying it in hopes of a 768GB-1TB M5 Ultra, but that looks increasingly unlikely.
I'm really interested to see what Apple eventually does. I don't think Apple will ever again be interested in releasing something as genuinely capable at a reasonable price point. My expectation is that they're working hard on new things they can and will charge vastly more for: not just amazing bandwidth but a ridiculous number of cores too, justifying colossal price points (above and beyond the already unbelievable cost of RAM+NAND).
There are some really wild patents on systems architectures here, spanning 2021 to 2024. A lot of this could definitely go into an Ultra-like design, but there are broader possibilities here too, which I expect they're now working furiously on. https://bsky.app/profile/ogawa-tadashi.bsky.social/post/3mif...
It's just so hard for me to come to terms with what a post-consumer Apple would be like! The Ultra chips are the minicomputers to the mainframe, and in this arena Apple needs to scale up and move upmarket. I cannot imagine how weird it would be to be an Apple torn like that: still the world's favorite consumer computing company, but also selling mini-data-center-like machines at phenomenal prices.
Eventually, if component prices ever settle, that will be a commanding position to be in, to sell widely from, to have the architecture for. But for the next half decade? A torn Apple.
The Mac mini and Studio are due for an update in the coming months; part of this is probably that they'd rather save memory to build up inventory of next-gen models rather than current-gen ones.
Like trvz said, they use different memory. M3 Ultra uses LPDDR5X 6400 MT/s, M4 Max uses LPDDR5X 8533 MT/s, while all the M5 models use LPDDR5X 9600 MT/s.
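Those transfer rates translate into very different peak bandwidths once you account for bus width. A minimal sketch of the arithmetic; the bus widths here (1024-bit for M3 Ultra, 512-bit for the top M4 Max, 128-bit for the base M5) are assumptions drawn from published specs, not from this thread:

```python
def bandwidth_gbs(mts: int, bus_bits: int) -> float:
    """Peak theoretical bandwidth in GB/s: transfers/s x bytes per transfer."""
    return mts * 1e6 * (bus_bits / 8) / 1e9

# Assumed bus widths per chip (not stated in this thread).
chips = {
    "M3 Ultra (LPDDR5X-6400, 1024-bit)": (6400, 1024),
    "M4 Max   (LPDDR5X-8533,  512-bit)": (8533, 512),
    "M5 base  (LPDDR5X-9600,  128-bit)": (9600, 128),
}

for name, (mts, bits) in chips.items():
    print(f"{name}: {bandwidth_gbs(mts, bits):.0f} GB/s")
```

Which is how the M3 Ultra, despite its slower LPDDR5X-6400, still has far more aggregate bandwidth than any M5 so far: its bus is simply much wider.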
People may not remember, but it is ~1980 all over again. There was a massive chip shortage back then, where the minicomputer company I was at (and many others) could not get the chips they needed.
In fact, chips were kept under lock and key to prevent theft. But there was a massive theft anyway, in which 20,000 chips were stolen.
May 15, 1989
FBI SA and US Customs Agents advised Assistant US Attorney that source information and investigation had determined that Atari Corporation was importing 256K DRAMS into the US in false packing containers, and without proper import documents in violation of US import laws and contrary to import agreements between the US and the Japanese Ministry of Industry and Trade. Atari purchases large quantities of DRAMS from Japanese manufacturers for use in their Taiwanese manufacturing plants. Purchasing in Taiwan allows Atari to obtain the DRAMS at a greatly reduced price. There are strict import quotas on the DRAMS, because of Japanese flooding of the market in years previously, but there are no import duties. By shipping the DRAMS into the U.S., Atari can thereby increase the price to approximately four times their purchase price. The original manufacturers, whether Fujitsu or Mitsubishi, would not be allowed to import this quantity at this price into the U.S., because this practice stifles U.S. manufacturers.
Investigation determined that Atari was importing large quantities, 150,000 or more a week, into the U.S. since May 1988. None were declared through U.S. Customs, and it appears telexes and telephones were used to order specific quantities in furtherance of this scheme.
Based on the above, Assistant US Attorney stated he would consider prosecution of this matter under the Wire Fraud Statutes or 1001 Falsification of Import Documents.
SOURCE: FBI Case 87A-SF-40454, Pages 42-43
A lot of discussion surrounding the RAM shortage seems to imply that it will recover, but AI companies slurping up RAM for training hasn't slowed down and probably never will. Are there any signs that the situation is improving, or is this just the new normal?
The boom-bust cycle historically applied only to the latest generation (e.g. DDR5 now), but the current crisis affects previous-gen DDR4 as well (and a bit of DDR3 too).
It feels much more like cartel behaviour, where all the players recognized that blame can be redirected to "AI demand" and "Sam Altman's secret deal".
I hope it won't be this bad forever, but RAM companies are currently slow-walking any booms (not fast-tracking new fabs, etc.) in hopes of avoiding a bust. Seems it'll be more of a slow decay to still-inflated pricing.
From what I understand, the RAM shortage is more about AI inference than AI training. Yes, training created much of the early HBM crunch because frontier-model training clusters need tons of HBM near GPUs, but inference is what is keeping the pressure on now and into the future.
These don't have normal ram, right? The ram is part of the die of the processor? So... what's going on? They're keeping the chips for themselves? They're moving production to other lower memory configurations? But why? That's where demand is? Probably more demand at higher memory though?
I'd buy one or two, but I can't stick them in a colo because they don't have LOM or dual power supplies. I've been seriously thinking about buying one anyway, keeping it at home, and having my colo servers talk to it for local DeepSeek.
Not a high priority though, considering how cheap DeepSeek is.
The RAM is "unified", meaning it's a single pool shared between the CPU and GPU, and it's "on package", meaning the RAM chips are packaged together with the CPU/GPU die. But it's just regular old RAM chips.
It’s still dedicated ram on a separate chip, which is affected by supply shortages.
The RAM is soldered onto the SoC package in close proximity to the main Arm chip. What's different is that it is simultaneously addressable by CPU and GPU cores; it is not part of the same die as the Apple silicon itself.
The base mac mini I got has been one of the best tech purchases I've ever made, and of course as soon as I wanted another [loaded] machine for more serious work this happens.
It's absolutely wild that Apple's desktop machines now cap out at less RAM than their portables, which can't sustain an intensive workload without throttling!
Apple certainly has the financial resources to support other companies in e.g. developing specific innovations or building infrastructure (and has done so in the past), as long as there's an RoI for Apple.
It would surely be a smart move to back the right partner in quickly standing up a new memory factory, built precisely to Apple's specifications, in return for a long-term supply agreement? If Apple could secure their memory supply, and at a lower cost than all of their PC and phone competition, it would be hugely beneficial for them.
Memory designs are pretty entrenched with the various patents involved... I've said a few times that I don't know why Intel hasn't gotten back into DRAM production with their fabs. I suspect they may be contractually limited when they sold off their memory businesses.
Design is not the problem. Having foundry space to manufacture is the bottleneck. It is just all being sucked up (with AI needs being the big additional load).
And to be clear, the foundry space for CPUs/GPUs is not the same as for RAM, which is printed with much larger feature size in order to lower the costs.
I don't think it's that... you have three companies that control over 90% of the market that have been convicted of collusion and price fixing more than once, when there were even more companies in the mix. The memory companies aren't producing at max capacity, they're price fixing.
Beyond this, memory isn't produced on leading-edge nodes; it's a few generations back as it is. For that matter, Intel isn't even near capacity and has/had plenty of opportunity to produce VRAM and SSD storage; they got out of both as they became more commoditized.
For CPUs, they are still licensing ARM's cores, of course with their own modifications, and they bought Intel's modem business, which likely gave them the patents they needed. GPUs I can't speak to on this, though.
To be clear here, Apple doesn't actually license any cores from ARM - they've got an architectural license and implement their own cores. Licenses for cores are a different thing.
Alas, RAM is basically a commodity product; unless they could have some design advantage over others, as with the A and M series chips, there is little incentive to go into RAM.
If Apple had the manufacturing capabilities then sure, but they would still be running into the same resource constraints for inputs that everyone else is having nowadays.
At the moment, there are no solutions, only responses.
Immune to shortages? No. They're not suffering shortages because they don't have their own design; they suffer shortages because the whole supply chain has issues, starting from required minerals and going all the way to shipping.
And just as the final product (commercial RAM) now goes to AI because it pays better, the processes, materials, and factory capacity used to make RAM would continue to go to whichever industry pays better at the time, not to Apple.