Microsoft begins cracking down on people dodging Windows 11's system requirements
(www.xda-developers.com)
I just installed Linux Mint on a 15-year-old desktop that has never been upgraded and was middle-of-the-road when I got it. It shipped with Windows 7, and I tried a couple of times to upgrade to 10 (it failed every time, either losing core hardware functionality, running so slowly as to be unusable, or just refusing to boot altogether). But it runs Linux like a dream. Seriously—it's easily running the latest version of Mint better than it ran an 11-year-old service pack of Windows 7.
What's even crazier is that I installed VirtualBox on it, and put Windows 10 on that, to use some work programs. And that runs Windows 10 a bit slowly, but otherwise more or less flawlessly!
That's right: I'm having a better Windows experience in Linux than I've ever had on bare-metal Windows on this box.
I can't believe I didn't do this...well, 15 years ago.
For what it’s worth, your experience 15 years ago likely would have been very different. It’s only in the past few years that things like drivers for basic hardware have become widely available on Linux without a bunch of weeping and wailing and gnashing of teeth. And even today, there are still certain drivers that often don’t like to play nice.
Ask anyone who had an Nvidia GPU 15 years ago if they'd suggest switching to Linux. The answer would have been a resounding "fuck no, it won't work with your GPU."
Eh, "a few years" here is selling Linux a bit short. I switched about 15 years ago, and while driver issues were a thing, it was still a pretty solid experience. I had to fiddle with my sound card and I replaced the wifi card in my laptop, but other than that, everything else worked perfectly. That still occasionally happens today, but since about 10 years ago I honestly haven't heard of many problems (esp. w/ sound; that seems largely solved, at least within a few months of hardware release).
I don't know what you're talking about WRT GPUs. Bumblebee (graphics switching) was absolutely a thing back in the day for Nvidia GPUs on laptops; it kinda sucked, but it did work, and today there are better options. On desktops, I ran Nvidia because ATI's drivers were more annoying at the time, and Ubuntu would detect your hardware and ask you to install the proprietary drivers for whichever card you had. I ended up getting a laptop w/o a dGPU, mostly because I didn't want to deal with graphics switching, but that doesn't mean it didn't work; it was just a pain. For dedicated systems, though, it was pretty simple: I was able to run beta Minecraft just fine on the GPU that came with my motherboard (ATI?), along with some other simple games.
In short, if you were on a desktop, pretty much everything would work just fine. If you were on a laptop, most things would work just fine, and the better your hardware, the fewer problems you'd have (i.e. my ThinkPad worked just fine ~10 years ago).
Playing games could be a bit trickier, but for just using the machine, pretty much any hardware would work out of the box, even 15 years ago. It has only gotten better since then.
That's a good point. I didn't think about that.