Just some Internet guy

He/him/them 🏳️‍🌈

  • 1 Post
  • 659 Comments
Joined 1 year ago
Cake day: June 25th, 2023

  • I get about 350-400 Mbps both ways, which AFAIK is about what my Unifi AC-Lite tops out at: it’s WiFi 5, it’s only got 2 spatial streams, and it maxes out at 80 MHz channels. I get about 200-250 on my phone (1+8T), which I think connects with a single stream.

    Everything indicates to me that’s as good as it gets with the set of hardware I have. Signal is solid, latency is solid.

    You’ll need 802.11ax, more MIMO streams and/or 160 MHz/320 MHz channels to get higher speeds (rough math below).
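
    As a rough sanity check, here’s the 802.11ac link-rate math in Python, assuming MCS 9 (256-QAM, 5/6 coding), 80 MHz channels and a short guard interval; the exact MCS your devices negotiate may differ:

    ```python
    # 802.11ac (WiFi 5) PHY rate: data subcarriers x bits per subcarrier x coding
    # rate per OFDM symbol, times the number of spatial streams.
    DATA_SUBCARRIERS_80MHZ = 234   # usable data subcarriers in an 80 MHz channel
    BITS_PER_SUBCARRIER = 8        # 256-QAM (MCS 9)
    CODING_RATE = 5 / 6
    SYMBOL_TIME_S = 3.6e-6         # OFDM symbol duration with the 400 ns short guard interval

    def phy_rate_mbps(spatial_streams: int) -> float:
        bits_per_symbol = DATA_SUBCARRIERS_80MHZ * BITS_PER_SUBCARRIER * CODING_RATE
        return spatial_streams * bits_per_symbol / SYMBOL_TIME_S / 1e6

    print(phy_rate_mbps(2))  # ~866.7 Mbps link rate (2x2, like the AC-Lite)
    print(phy_rate_mbps(1))  # ~433.3 Mbps link rate (single-stream phone)
    # Real TCP throughput is usually around half the link rate, which lines up
    # with roughly 350-400 and 200-250 Mbps respectively.
    ```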

  • It’ll depend a lot on your experience. I can install Arch without reading the wiki at all in about 5 minutes for something fairly vanilla. If you’re comfortable with Linux, following the wiki won’t be too hard; it took me maybe 2-3 hours on my first install before I had my DE and everything set up (12 years ago). If you’ve never used Linux before and take the deep dive, it could take hours or even days depending on how fast you can absorb all that information.

    “Easy” is very subjective; there’s stuff that’s so dumbed down for the sake of “easy” that it makes my life harder when I need to do more complex things. I know people for whom linear algebra in 11 dimensions is easy to do and solve. Easy is relative to your own experience level and what you’re trying to accomplish.

    Install it in a VM as a test run, and you’ll see for yourself.


  • No, simply because even with pure CSS and even pure HTML you can find ways to leak some information about the browser. For example: a background image that only loads at 1920x1080, another at 2560x1440, and so on. Make hundreds of those for every possible resolution (they can be the same file on the server, just at different paths), and there you go: the server logs show that the client downloaded img/background/2448x1280.png (there’s a rough sketch of this at the end of this comment). You can use the same trick for fonts as well, by applying it to a box on the page that is sized based on its text content, then repeating for every font you want to test for.

    There’s just a ton of those little features that exist for performance optimizations, because loading a 4K background on a 480p phone is a bad experience for everyone involved. Sometimes you need to know the size of one element to position other elements relative to it. You need the mouse cursor position to open popups at the right place. You need the window size to realign popups and modals. You’d have to go back to text-only sites like it’s the 80s and 90s to avoid that kind of fingerprinting.

    And thus Tor’s solution: everyone’s got the same window size, same fonts and everything.
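
    To make the resolution trick concrete, here’s a rough sketch in Python (made-up image paths, abbreviated resolution list) that generates one media-query rule per screen size; whichever image path shows up in the server logs tells you the screen size, no JavaScript required:

    ```python
    # Each rule loads a different image path depending on the device resolution.
    # The request that actually hits the server reveals which rule matched.
    RESOLUTIONS = [(1280, 720), (1366, 768), (1920, 1080), (2560, 1440), (3840, 2160)]

    rules = []
    for w, h in RESOLUTIONS:
        rules.append(
            f"@media (device-width: {w}px) and (device-height: {h}px) {{\n"
            f'  body {{ background-image: url("/img/background/{w}x{h}.png"); }}\n'
            "}"
        )

    print("\n".join(rules))  # paste the output into a stylesheet
    ```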

  • The problem with Fedora, and especially the atomic versions, is that when you Google “how to do X on Linux” you pretty much always get answers for Ubuntu and Debian derivatives. The atomic versions have it mildly harder because now you also have to learn how immutable distros work, and you can’t just “make install” something from GitHub (not that that’s recommended, but if you just want your WiFi to work and that’s all you could find, it’s your best option).

    It’s not as bad as it used to be thanks to Flatpak and the like, but if you’re really a complete noob, the best experience is the one where you can Google a problem and get a working answer as easily as possible.

    Once you’re familiar and ready to upgrade, then it makes sense to move to other distros like Fedora, Nobara, Bazzite, Kinoite and whatnot.

    I don’t like Ubuntu, I feel like Mint is to Ubuntu what Manjaro is to Arch, and Pop!_OS is okay when it doesn’t uninstall your DE while installing Steam. But I still recommend those three to noobs because everyone knows how to get things working on them, and the guides are mostly interchangeable, purely because it’s easy to search for help with them. I just tell them: when you’re tired of the bugs and comfortable enough with Linux, go distrohop a bit to find your more permanent home.


  • Ask your admin to turn it off, or if you’re the admin, turn it off.

    They really went with the worst possible way to implement this: it mangles the post to rewrite all images to the image proxy, so it’s not giving you a choice. If you want the original link, you have to reprocess the post to strip the proxy. It’s like when they thought it was a good idea to store the data HTML-encoded, so non-web clients had to try to undo all of it, and that’s lossy. It should be up to the clients to add the proxy as needed and if desired. Never mangle user data for storage; always reprocess it as needed and cache it if the processing is expensive (see the sketch at the end of this comment).

    Now you edit a post and your links are rewritten to the proxy, and if you save it again, you proxy the proxy. Just like when they applied the HTML encoding on save: if you edited a post and saved it again, it became double-encoded.

    Personally I leave it off and let Tesseract do it instead when it renders the images, which is the right way to do it. If the user wants a fresh copy because it’s a dynamic image, they can request one on demand instead of being forced into it. And it works retroactively, whereas the Lemmy server only does it for new posts.
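
    A minimal sketch of render-time proxying, with a made-up proxy endpoint (the real endpoint and markup depend on the instance and the client): the stored post keeps the original URL, and the proxy is only applied when displaying, so re-saving can never double-wrap anything.

    ```python
    from urllib.parse import quote

    # Hypothetical proxy endpoint; the real path depends on the instance/client.
    PROXY_BASE = "https://example.instance/api/image_proxy?url="

    def proxied(url: str) -> str:
        """Wrap an image URL in the proxy for display only; storage is untouched."""
        return PROXY_BASE + quote(url, safe="")

    def render_image(original_url: str, use_proxy: bool) -> str:
        # The decision is made per render (user preference, cache freshness),
        # not baked into the saved post, so the original link stays recoverable.
        src = proxied(original_url) if use_proxy else original_url
        return f'<img src="{src}">'

    print(render_image("https://pics.example.org/cat.png", use_proxy=True))
    ```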