• 0 Posts
  • 59 Comments
Joined 1 year ago
Cake day: June 12th, 2023

  • The underlying issue is that nobody wants to develop using any of the available cross-platform toolkits that compile into native binaries without an entire browser attached. You could use Qt or GTK to build a cross-platform application, but with Electron you can run the same application in the browser AND as a standalone application.

    Me? I’m considering developing my next application in Qt, of all things, because it actually does have web support via WASM and I want to learn C++ and gain some Qt experience. Good idea? Probably not.

  • More limited, but also less enshittified than Windows.

    If you want a good, well-polished experience for certain creative workloads, or even programming, MacOS is great and their Apple Silicon CPUs are excellent.

    If you want to do ANY gaming besides WoW (which surprisingly enough has always had great MacOS support) or you can’t stand the lack of configurability, Linux is immediately the superior choice by far.


  • We just see incremental performance improvements for enthusiasts/professionals and little more than power draw improvements for everyone else.

    For several years we didn’t even get those. When AMD wasn’t competitive, Intel didn’t do shit to improve their performance. Between Sandy Bridge (2011) and Kaby Lake (2016), the performance uplift was so small there really wasn’t any point in upgrading. Coffee Lake on desktop (2017) and Whiskey Lake on laptops (2018) were when they actually started doing… anything, really.

    Now we at least get those incremental performance improvements again, but IMO they’re not worth upgrading a desktop CPU for more often than every five years or so. You get way more from a graphics card upgrade, and unless you’re pushing 1080p at max fps, the gains from a new CPU will be pretty hard to feel.