
If the game was crashing to desktop, it was probably not a graphics card thing, but that you were running out of RAM and the system was automatically killing the program when that happened (on Linux that's the kernel's out-of-memory killer). That's a system stability thing: it prioritizes the needs of the system and desktop over any one program so the whole machine doesn't go down with it.
The easiest solution to that is to increase the “swap” size (i.e. a bit of storage that gets set aside to act as backup memory). That is a system-level thing, but it’s not really a big deal to change.
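If you want to go the swap-file route, the usual steps look roughly like this. This is just a sketch: the 8G size and the /swapfile path are examples, and some distros ship with zram or a swap partition instead, so adjust to taste.

```
# create an 8 GB swap file (size is just an example;
# on some filesystems you'd need dd instead of fallocate)
sudo fallocate -l 8G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile

# make it permanent across reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab

# check that it's active
swapon --show
free -h
```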
If you were getting to the command line, Linux was running; if it was just the command line, though, it may have been an issue with the desktop or window manager not starting. And if that's what it was, I could see changing some boot settings fixing it, like making sure the system is set to boot into the desktop (the graphical target) instead of stopping at the console. Not sure how you got from messing with the graphics card to the system only booting to the command line, but, shit happens. I've broken my system in weirder ways while pulling at the guts.
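For what it's worth, on a systemd distro (which covers Mint and most others these days; this is just a sketch of where I'd look, not a diagnosis of what actually broke on your machine) you can check whether it's set to boot into the desktop and flip it back:

```
# show the current default boot target
systemctl get-default

# if it says multi-user.target (console only), switch back to the desktop
sudo systemctl set-default graphical.target

# see whether the display manager (login screen) service started cleanly
systemctl status display-manager
```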
By switching from Cinnamon to the XFCE desktop you may have solved the game-crashing issue simply because you had more memory available, as XFCE is a much lighter-weight desktop.
If the game was slowing to a crawl before freezing up completely, rather than outright crashing to desktop, that could have been an issue of running out of video RAM on the GPU; I've had that happen with Helldivers 2. I don't remember exactly what I did to fix it, something with the launch options that affected the graphics settings or capped the frame rate, I think. Not sure how changing between desktops could have fixed that, though.

Skyrim had a narrative; it had stories that raised enough curiosity to make you engage with the gameplay loops. Some of the side quests were even pretty good, though the main quest was meh.
Increasingly, Bethesda seems to be building their games around gameplay loops, with narrative treated as ancillary. They've optimized for grind without giving you a reason to grind.
The reality is that it's often stated that generative AI is an inevitability: that regardless of how people feel about it, it's going to happen and become ubiquitous in every facet of our lives.
That's only true if it turns out to be worth it: if the cost of using it is lower than the alternative, and the market willing to buy the result stays just as big. If the current cloud-hosted tools cease to be massively subsidized and consumers choose to avoid it, then it's inevitably a historical footnote, like turbine-powered cars, Web 3.0, and LaserDisc.
Those heavily invested in it, either literally through shares of Nvidia or figuratively through the potential to deskill and shift power away from skilled workers at their companies, don't want that to be a possibility, so they need to prevent consumers from having a choice.
If it were an inevitability in its own right, if it were just as good and easily substitutable, why would they care about consumers knowing before they paid for it?