• 5 Posts
  • 59 Comments
Joined 1 year ago
Cake day: June 5th, 2023





  • [Meta] I don’t think there’s a need to cross-post this within Beehaw. Beehaw is low-activity as it is (in terms of new posts), so most people here just browse new/local, which means they’d see this post in their feed twice.

    Even if you’re not browsing by local, most people in this community are likely also subscribed to the Technology community, so again, there’s a double-up.





  • From @SuperIce@lemmy.world:

    If the PoS supports tokens, it’ll use a unique token for each payment. If the PoS doesn’t support tokens, the phone has a virtual credit card number linked to the real one, so if that number does get stolen, you can just remove the card from your Google Wallet to deactivate it. Your real card number is never exposed.

    Even then, credit card numbers on their own aren’t that useful anymore. Any online payment needs the CVC and PoS devices usually require chip or tap cards, which don’t use the number. On top of that, credit card companies have purchase price restrictions when using swipe because of the security risks vs chip (which is why most PoS devices don’t support swipe anymore).
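
    In other words, the wallet never hands over your real card number: it provisions a device token and pairs each transaction with a one-time cryptogram. A minimal sketch of the idea (toy code with an assumed in-memory vault; real schemes like Visa VTS or Mastercard MDES are far more involved):

    ```python
    import secrets

    class TokenVault:
        """Toy model of payment tokenization: the phone holds only a
        device token; the network maps it back to the real PAN."""

        def __init__(self):
            self._tokens = {}  # device token -> real card number (PAN)

        def provision(self, real_pan: str) -> str:
            # The phone stores only this token, never the real PAN.
            token = "".join(secrets.choice("0123456789") for _ in range(16))
            self._tokens[token] = real_pan
            return token

        def cryptogram(self, token: str) -> str:
            # A fresh one-time value accompanies every tap, so a
            # captured transaction can't be replayed.
            if token not in self._tokens:
                raise KeyError("token unknown or deactivated")
            return secrets.token_hex(8)

        def deactivate(self, token: str) -> None:
            # "Remove the card from Google Wallet": the token dies,
            # while the underlying card stays valid.
            self._tokens.pop(token, None)
    ```

    Deactivating the token is what makes a stolen phone a non-event: the real card never has to be reissued.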








  • I did the TV -> projector swap last year: I got a 4K projector that sits above my bed and throws a massive 100" image on the wall opposite my bed, and it’s awesome. I’ve got my PS5 and Switch hooked up to it, and I’m currently living the dream of gaming and watching movies on a giant screen, all from the comfort of my bed. Some games really shine on a screen that size and you see them in a new light, like TotK, the Horizon series, Spider-Man, etc. It’s 100% worth the switch, IMO.

    Now I also have a regular monitor, a nice low-latency QHD 16:10 panel with HDR, hooked up to my PC (which runs a 6600 XT, btw). The main reasons I keep this setup are productivity, running some PC games that don’t have console equivalents, and the fact that the colors look much nicer than on my projector. Maybe if I bought a laser projector and one of those special ALR screens I could get nicer colors, but all that is way beyond my budget.

    Although these days I’m not on my desktop as much as I used to be (I also have a Ryzen 6000 series laptop that I game on, btw), I still like my desktop for its flexibility and upgradability. I did explore switching to a cloud-first setup and ditching my rig, back when I wanted to upgrade my PC during the Covid supply-chain issues, but in the end, cloud gaming didn’t work out for me. In fact, after exploring all the cloud options, I’ve been put off by cloud computing in general, at least the public clouds offered by the likes of Amazon and Microsoft: they’re just in it to squeeze you dry and take control away from you, and I don’t like that one bit. If I were to lean towards cloud anything, it’d be rolling my own, maybe using something like a Linode VM with a GPU, but the pricing doesn’t justify it for anything beyond casual usage (see the rough math below). And that’s one of the things I like about a PC: I could have it running 24x7 if I wanted to and not worry about a $200 bill at the end of the month, like the one I got from Azure, because scummy Microsoft didn’t explain anywhere that you’d keep paying for Bastion even with the VM fully powered off…
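
    For a sense of why the pricing didn’t add up for me, here’s the rough math (every figure below is an illustrative assumption, not an actual Linode or Azure rate):

    ```python
    # Back-of-envelope cloud-vs-desktop cost sketch; all numbers are assumptions.
    GPU_VM_HOURLY = 1.50    # assumed $/hr for a cloud VM with a mid-range GPU
    DESKTOP_WATTS = 350     # assumed average desktop draw under gaming load
    ELECTRICITY_KWH = 0.15  # assumed $/kWh residential rate

    hours = 4 * 30          # ~4 hours of gaming a day, for a month
    cloud = GPU_VM_HOURLY * hours
    home = DESKTOP_WATTS / 1000 * hours * ELECTRICITY_KWH

    print(f"cloud VM: ${cloud:.0f}/mo vs desktop power: ${home:.0f}/mo")
    # cloud VM: $180/mo vs desktop power: $6/mo (hardware cost amortized separately)
    ```

    Even before amortizing the hardware, anything past casual hours pushes the cloud option well past what the desktop costs to run, and that’s without surprises like the Bastion charge.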

    Anyways, back to the topic of CPUs: I don’t really think we’re at the cusp of any re-imagining. What we’ve been seeing is gradual, natural improvement, maybe with the PC taking inspiration from the mobile world. I haven’t seen anything revolutionary yet; it’s all been evolutionary. At most, I think we’ll see more ARM-like designs, like the integrated RAM you mentioned, more SoC/integrated solutions, maybe AI/ML cores becoming the new thing to look for in a CPU, and maybe ARM itself making more inroads into the desktop and laptop space, since Apple have shown that ARM can handle mainstream computing.

    On the revolutionary side, the things I’ve been reading about are quantum CPUs and DNA computers, but these are still very experimental, with very niche use cases. In the future I imagine we might have something like a hybrid semi-organic computer, with a literal brain that forms organic neural networks and evolves as requirements change; I think that would be truly revolutionary, but we’re not there yet, not even at the cusp of it. Everything else I’ve seen from the likes of Intel and AMD has just been evolutionary.






  • If people really felt strongly about this, we would’ve seen it done already. Perhaps the state of Lemmy right now is “good enough”, so folks don’t care too strongly about a missing minor feature, or maybe they find it easier to just migrate to something like Kbin and still stay federated with Lemmy. Or maybe they’d prefer to write a simple patch, which can be maintained and distributed separately, instead of forking the entire codebase. After all, it’s easy enough to make a fork, but a PITA to maintain one; it’s much easier to maintain a separate patch set or standalone utilities.

    Also, frontend features, like the infinite scrolling one that was quoted, are basically non-issues, considering how many good alternative frontends exist: Photon, Alexandrite, mlymm, slemmy, etc. There’s no rule that you have to use the default frontend. In fact, many Lemmy instances already host these frontends on their own servers, and if they wanted to, they could easily switch to one and make it the default landing page.