Considering that git itself doesn’t need federation, and that email is the grandfather of federation, sourcehut has a working version of it at this very moment.
Monitor the room temperature.
Support for the QUIC and HTTP/3 protocols has been available since 1.25.0, including in the Linux binary packages.
https://nginx.org/en/docs/quic.html
2023-05-23 nginx-1.25.0 mainline version has been released, featuring experimental HTTP/3 support.
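For reference, enabling that experimental HTTP/3 support looks roughly like this (a sketch based on the nginx QUIC docs linked above; certificate paths are placeholders):

```nginx
server {
    # QUIC/HTTP/3 listens on UDP; keep the TCP listener for HTTP/1.1 and HTTP/2 fallback
    listen 443 quic reuseport;
    listen 443 ssl;

    ssl_certificate     /path/to/cert.pem;   # placeholder
    ssl_certificate_key /path/to/key.pem;    # placeholder

    # advertise HTTP/3 availability to clients
    add_header Alt-Svc 'h3=":443"; ma=86400';
}
```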
It’s not dev code. It would also take a mere minute to check this before failing to sound smart.
Even better, the dude forked because a security issue in an “experimental” but nonetheless released feature was responsibly announced.
Talk about an ego.
Please correct me if I’m wrong, but doesn’t this allow one to represent virtually any resource as a mail inbox/outbox with access through a generic mail app?
I’m working with a specialized healthcare company right now, and this looks like a way to represent patient treatment data as an intuitive timeline of messages, with a local offline cache in case of outages. Security of local workstations is a weak point, of course, but when is it not…
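To make the idea concrete, here’s a hypothetical sketch of representing one treatment event as a plain RFC 5322 message, so any generic mail client could render the timeline (all addresses, subjects, and values are made up for illustration):

```python
# Hypothetical mapping: one "mailbox" per patient, one message per
# treatment event. Uses only the standard-library email package.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "oncology@clinic.example"      # assumed addressing scheme
msg["To"] = "patient-1234@records.example"   # one inbox per patient record
msg["Subject"] = "Treatment: chemotherapy cycle 3"
msg["Date"] = "Tue, 02 Apr 2024 10:00:00 +0000"
msg.set_content("Administered 75 mg/m2; no adverse reactions observed.")

# Any IMAP/mbox-speaking client can now store, sort, and display this
print(msg.as_string())
```

The appeal is that threading, sorting by date, offline caching, and search all come for free from existing mail tooling.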
Sorry, but you don’t get to claim groupthink while ignoring state of Apache when Nginx got released.
Apache was a mess of modules with confusing documentation, an arsenal of foot guns, and generally a PITA to deal with. Nginx was simpler, more performant, and didn’t have the extra complexity that Apache was failing to manage.
My personal first encounter was about hosting PHP applications in a multiuser environment, and god damn was nginx a better tool.
Apache caught up in a few years, but by then people were already solving different problems. Had nginx arrived merely a year later, it would have been lost to history, but it arrived exactly when everyone was fed up with Apache just the right amount.
Nowadays, when people choose a web server, they choose one they are comfortable with. With both httpds being mature, that’s the strongest objective factor to influence the choice. It’s not groupthink, it’s a consequence of concrete events.
Different disciplines - different thresholds. But yeah, that’s exactly it.
With software engineering, the unknown space is vast, yet the tools are great. So it’s very easy to start tinkering and get lost in the process.
That’s how engineers think in their free time.
When the specific goal is something I can do manually, and it’s not pressing, I would rather spend the time learning how to make a tool to do it. I might not ever need the tool, but I do use the knowledge picked up on those forays every day.
“Teleportation” in that term means “make a thing disappear in one place and appear in another”. Nothing “immediate” is ever implied.
The Wikipedia article has a great diagram on the topic. Add the article on the “no-cloning theorem” to understand why “teleportation” is a fitting term. I recommend reading both without expectations; just read through the steps as if you’re learning a new math tool.
In short, quantum teleportation is a way to take a quantum state (which is fundamentally unforgeable: you can’t simply create a clone of a particle), destroy it while extracting classically communicable data, and then recreate it in another location.
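For concreteness, the protocol those articles walk through compresses into a few lines (sign and ordering conventions vary between textbooks; this is one common form):

```latex
% Alice holds an unknown state and shares a Bell pair with Bob:
\[
  |\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad
  |\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr)
\]
% Alice measures her two qubits in the Bell basis, destroying |psi>
% and obtaining two classical bits (b1, b2), which she sends to Bob.
% Bob applies Pauli corrections to his half of the pair:
\[
  |\psi\rangle_{\text{Bob}} = Z^{b_1} X^{b_2}\,|\phi_{b_1 b_2}\rangle
\]
```

The two classical bits travel over an ordinary channel, which is also why the scheme can’t be used for faster-than-light communication.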
FTL is a weird one.
The speed of light is a singularity in special relativity. Singularities usually indicate model limitations, not fundamentals of reality.
The theory happily describes behaviours below and above this “speed limit”, but insists on it being unapproachable from either side, which is already weird. At the same time, our other models tell us that matter loses a finite amount of energy when it gains mass and stops moving at the speed of light.
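The singularity in question is visible directly in the relativistic energy of a massive particle:

```latex
\[
  E = \frac{m c^{2}}{\sqrt{1 - v^{2}/c^{2}}}
\]
```

The denominator goes to zero as $v \to c$, so the energy diverges from below, and for $v > c$ the square root becomes imaginary: the formula describes each side of the limit but says nothing about crossing it.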
The problem is, we don’t seem to have a vocabulary to discuss ways around this singularity, and the universe is not so forthcoming with clues.
It’s a general crisis of physics lately. We know our models have limitations, we often know exactly where they break, and the universe just giggles along.
But yeah, it’s highly unlikely that any SF will correctly guess a viable FTL, even if it is possible. Especially considering how seemingly every author thinks quantum entanglement is it.
It’s because Unix was created by engineers rather than by ui/ux design professionals.
This is somewhat disingenuous. The Unix terminal is one of the most ergonomic tools out there. It was not “designed by engineers”; it was engineered for a purpose, with user training in mind.
Ergonomics is engineering. UI/UX design is engineering. A UX designer who doesn’t apply the engineering method is called an artist.
RAM is the fastest and most expensive memory in your PC. It uses energy, regardless of whether you use the memory. Not utilising RAM is a waste of resources.
There’s a reason good monitoring tools draw a stacked RAM chart.
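To see why the stacked chart matters, here’s a minimal Linux-only sketch that separates truly “used” RAM from the reclaimable page cache, the way those tools do (field names are from `/proc/meminfo`; the split is simplified, e.g. it ignores `SReclaimable`):

```python
# Read /proc/meminfo and split RAM into used / cache / free.
# "Cache" is memory the kernel is putting to work but can reclaim
# instantly, so counting it as "used" is misleading.
def meminfo():
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            fields[key] = int(value.split()[0])  # values are in kB
    return fields

m = meminfo()
cache = m["Buffers"] + m["Cached"]
used = m["MemTotal"] - m["MemFree"] - cache

print(f"used: {used} kB, cache: {cache} kB, free: {m['MemFree']} kB")
```

A machine showing near-zero “free” here can still be perfectly healthy if most of the total is in the cache band.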
Turns out, I do need therapy.
Constructive feedback doesn’t need to offer a better solution. Almost everyone who uses that “definition” uses it to avoid criticism in the first place.
When I say “according to research A, your policies led to segregation and discrimination” to a politician, I don’t need to provide a better solution. Moreover, their invoking the constructive-criticism sentiment would be a clear deflection.
You’re conflating “data” with “information”.
Repeated re-encoding loses information. “The compression algorithm averages pixel boundaries” is a perfect example of losing information.
That it sometimes results in more bits of data is a separate phenomenon altogether.
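A toy illustration of the distinction, assuming nothing about any real codec: each “re-encoding” pass quantizes samples to a grid, so information is destroyed even though the amount of data stays exactly the same.

```python
# Toy "lossy codec": snap each sample to a grid of the given step size.
# Passing the output through a *different* grid discards more detail,
# like re-encoding an image through a different compressor.
def reencode(samples, step):
    return [round(s / step) * step for s in samples]

original = [0.10, 0.24, 0.57, 0.83]
first = reencode(original, 0.25)   # first lossy pass
second = reencode(first, 0.30)     # second pass, different grid

print(original)
print(first)
print(second)
```

The list length (the data) never changes, but the original values (the information) are unrecoverable after the first pass.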
Every time someone confidently claims that we can cryptographically verify voting, they are deliberately or ignorantly leaving the complexity and necessity of verifying the verifier’s runtime, the data source, and the communication channels out of the picture.
Cryptography doesn’t solve the voting-verification problem; it obscures and shifts it.
By becoming a CTO and having an early retirement. Or not at all.