
To sell at a loss, or at least at very low profit? Low-end GPUs tend to have tight margins to begin with. Why put the limited supply of DRAM in there when there are products that need it and can actually be sold at a profit?
I guess they can be a loss leader. It’s not a sustainable business model, though, and this DRAM shortage is projected to last a while.

Funny, I was just reading about this sort of thing in “How to Blow Up a Pipeline”. It’s the sort of argument that seems obvious in retrospect.
When someone in the global south uses a coal stove to cook their food, they’re doing it out of necessity. When a billionaire sails out on a mega yacht, it’s pure excess. Yeah, banning them won’t make the difference between 1.5°C and 2.0°C of global warming, but it’s low-hanging fruit.
We can also ban private jets, and the only significant impact on the economy would be that some billionaires have to travel around in first class like some kind of lowly multimillionaire or upgraded plebeian.
It does not matter whether or not you think Valve makes good products.

All of this is going to depend on how RAM prices and tariffs fluctuate, as well as whether or not Valve has an existing stockpile of RAM from 6 months ago.
FWIW, Sony just announced a Japan-only PS5, sans optical drive, for about $350. US prices are staying higher, but the GabeCube is also likely to deliver less performance than a PS5. I can’t see them going much over $600 and still having a value proposition. Even that is going to rest on the gigantic library of Steam games that can be played on it but aren’t available on the PS5.

You’re the one claiming an order-of-magnitude benefit, so you get to provide the proof.
And there isn’t any. There is some evidence that people fool themselves into thinking it makes them faster, and it sounds like you’re one of them.

No, that’s exactly what this is about. They came right out and said as much. It won’t work, but they’ll cause a lot of damage in the process of failing.

Moore’s Law was originally formulated as the cost per integrated component being cut in half every x months. The value of x was revised over the decades, but settled at 24.
That version of the law is completely dead. Density is still going up, but you pay more for it. You can no longer build a console at the same cost while increasing its performance.
High-end PCs can still get faster, but only by spending more money. This is why the only substantial performance gains over the last few GPU generations have come through big jumps in cost.
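As a back-of-the-envelope sketch of what that formulation implies (the starting cost and the 24-month halving period below are illustrative assumptions, not measured figures):

    # Cost-per-component reading of Moore's Law: cost halves every
    # `halving_months`. c0 = 1.0 is an arbitrary starting cost.
    def cost_per_component(months, c0=1.0, halving_months=24.0):
        return c0 * 0.5 ** (months / halving_months)

    # While the halving held, a fixed hardware budget bought roughly
    # twice the components every two years. Once the halving stops,
    # the same console budget stops buying more performance.
    for years in (0, 2, 4, 6):
        print(f"{years} years: {cost_per_component(years * 12)}")
    # 0 years: 1.0
    # 2 years: 0.5
    # 4 years: 0.25
    # 6 years: 0.125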

It’s also, accidentally, a good trainer for motorcycle skills. Not that its physics are good. They’re not. But it does have one thing that is really useful: traffic tends to pull out on you and do unpredictable things.
That makes it a pretty good simulator for training against target fixation. You tend to drive/ride towards whatever you’re looking at. When someone pulls out on you, you will tend to look at the car and hit it. If you train yourself to look to the side instead, you will tend to miss it. This is a good skill for drivers, and it can make the difference between life and death on motorcycles (and on motorcycles pretending to be ebikes).
Most other games with a driving element don’t have cars pulling out on you the way Cyberpunk 2077 does. It makes the game worse overall, but it does have some training value.

The first x86-64 processor came out in 2003. Technology sure does move fast.
Edit: checked my old Newegg orders. I bought my first x86-64 processor, an AMD Athlon 64 3200+, in Jan 2005. I seem to remember games were starting to pick up 64-bit support around then (I think Eve Online in particular, which I played a lot back then), so it made sense to switch.

As far as DNF goes, it was probably an easy profit for the company. They bought it from 3D Realms and patched it up into something releasable. I doubt they spent a lot on the deal, so it didn’t have to sell many copies to break even.
Not a bad business decision, and I’m glad that development story had a definitive ending.

Someday, the industry is going to realize that while transistors might still be getting smaller, they aren’t getting cheaper for it. That was the original formulation of Moore’s Law: the cost per integrated component gets cut in half every x months.
Not just games, but the whole tech industry. Even insofar as faster hardware exists (and it just plain might not in this case), people can’t afford it.

My conclusion is that the US is getting what it wants out of the importation block, regardless of smuggling or “fell off the assembly line” losses.
Universities (in both China and the US) want a warranty on that hardware, and they can’t get a warranty on smuggled hardware. Universities are where you would have researchers building models. The GPUs they have are getting old, and they don’t have replacements lined up.
The other place models get built is corporations, which might choose to ignore the warranty issue, but they can’t possibly get enough high-end GPUs to actually do it. Not while relying on mules who can only bring in one or two at a time. Maybe they could find a way to smuggle things en masse, but then they’d just make themselves a target for US trade authorities.
That leaves Chinese gamers as the only ones who want smuggled GPUs at all. US trade policy doesn’t give a shit about them.
So yes, there’s smuggling, Nvidia certainly knows about it, US trade authorities certainly know about it, but nobody has any reason to care.

It’s not like AMD created this situation. It’s pretty well documented, and the culprits are OpenAI plus the three companies that make all the DRAM. Mostly OpenAI.