It’s still not great. Especially on bleeding edge hardware.
Usually it works fine on older hardware as long as nothing you have requires proprietary software. If it does, then lord have mercy.
SSDs are absurdly cheap at the moment. The 2023 demand slump led to a huge oversupply of SSDs and dirt-cheap prices.
As tech shrinks it’s only getting more and more expensive per mm². Unless we get some major improvement we’re kinda at the limit for the moment.
To be fair, they did far overproduce them, which is why they’ve been so dirt cheap lately.
But companies did learn over Covid that if you just don’t make enough of something, you can charge whatever you want for it and people will pay it.
It’s a hell of a lot cheaper to buy an EV with a range/capacity lower than what you need 5% of the time and spend $40 to rent a truck or $100 to rent a car for the occasional trip than it is to buy some ridiculously oversized battery. Sure, the extra range is useful 5% of the time, but getting a rental isn’t that bad.
Plus with a rental you can pick the exact type of car that suits the trip. I took a V6 Camaro on a road trip for Thanksgiving and that thing gets almost 30 mpg doing 80+ on the highway. Versus if I’d taken my one-size-fits-all Outback on that trip, I’d be getting 25 doing only 70, and low 20s at 80 if I’m lucky.
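For a sense of scale, here’s a quick fuel-cost sketch of that Camaro-vs-Outback comparison. The trip length and gas price are assumptions I’ve made up; only the mpg figures come from the comment above:

```python
# Rough fuel math for the rental-vs-daily-driver comparison.
# trip_miles and gas_price are hypothetical; mpg values are from the comment.
trip_miles = 1_000
gas_price = 3.50                    # assumed $/gallon

camaro_mpg = 30                     # V6 Camaro at 80+ mph
outback_mpg = 22                    # Outback at 80 mph ("low 20s if I'm lucky")

for name, mpg in [("Camaro", camaro_mpg), ("Outback", outback_mpg)]:
    gallons = trip_miles / mpg
    print(f"{name:>8}: {gallons:.1f} gal, ${gallons * gas_price:.2f}")
# Under these assumptions the gap is roughly $40, which on its own
# is in the ballpark of that truck rental.
```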
That’s a precautionary limit because of a pending recall. The batteries aren’t actually degraded to 80%.
People really like to overestimate how much range they actually need on a daily basis.
I drive maybe 200 miles a week. Almost all EVs could easily get that range in spring/fall. And even in the worst of winter as long as I have 120 volts to keep the battery warm I’ll make it through the week no problem.
Honestly, big fast-charger networks aren’t the biggest hurdle. We need basic 120 V or 240 V outlets run to every apartment and townhome parking spot. With essentially a trickle charge from 120 V you’ll be fine for 90% of your driving needs.
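To put numbers on that, here’s a minimal back-of-the-envelope sketch; the charging draw, losses, and miles-per-kWh figures are all assumptions, not measurements:

```python
# Can a plain 120 V outlet cover the ~200 miles/week figure above?
volts, amps = 120, 12               # typical Level 1 charging draw (assumed)
efficiency = 0.85                   # assumed charging losses
miles_per_kwh = 3.5                 # assumed EV efficiency; lower in winter
hours_per_night = 12

kwh_per_night = volts * amps / 1000 * hours_per_night * efficiency
miles_per_night = kwh_per_night * miles_per_kwh
print(f"{kwh_per_night:.1f} kWh/night -> {miles_per_night:.0f} miles/night")
print(f"~{7 * miles_per_night:.0f} miles/week vs ~200 needed")
```

Even with winter losses eating into those numbers, an overnight trickle charge comfortably outruns a 200-mile week.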
On T-Mobile, the shitty 5G is just labeled 5G at the top next to your signal strength. The faster mid-band/mmWave 5G is labeled 5G UC. Every carrier does something different.
How rapidly are you breathing? Are you taking 15 breaths per minute or 50?
If you’re breathing in and out slowly, closer to 15, and it’s audible, then something is wonky with your nose. If you’re at 50 just sitting still or casually walking around, then you’re probably just out of shape.
High speeds aren’t just for you to destroy your data plan in 10 seconds.
They’re for when everyone is trying to download something at the same time, so the network can still provide a good experience in the worst-case scenarios.
mmWave 5G has incredibly low latency compared to 4G. You’d be surprised how much of your latency is just the hop from you to the tower.
Right now, with the network not busy, a speed test over mmWave 5G gives me 45 ms latency with 3 ms of jitter; 4G gives 54 ms with 12 ms of jitter. When the network is loaded there’s a HUGE difference: 5G can handle so many more people at once that your latency never really gets that high, but when 4G is loaded down, latency gets huge fast.
Ping from China to me in the US is sub-250 ms on my wired internet connection, so that’s not really the problem. The rest is down to whoever is handling your phone call.
At work we have phones that run web crawlers, and each uses 50+ gigs of data per month, so they’re well within the deprioritized zone. Even then they still get really good speeds unless the network is super congested for some reason.
mmWave 5G or just “5G”? On T-Mobile near me I get anywhere from 200 Mbps to over a gigabit down, depending on the location.
A few areas only have the shitty 5G that’s essentially 4G+, but most areas around me have really good coverage with pretty insane speeds.
OK, good point. Are people using mobile data for that?
Unlimited data. You do whatever you want, whenever you want, wherever you want.
I haven’t seen any carriers charging extra for 5G, and I don’t see why it would be more expensive: the quicker you’re done using the data, the quicker the tower can serve someone else.
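That “airtime, not bytes” point is easy to put numbers on. A tiny sketch; the speeds are illustrative assumptions, not measured figures:

```python
# The scarce resource at the tower is airtime, not bytes delivered,
# so a faster radio frees the cell for the next user sooner.
download_mb = 100                   # hypothetical download size
for label, mbps in [("4G-ish", 50), ("mid-band 5G", 400), ("mmWave 5G", 1000)]:
    airtime_s = download_mb * 8 / mbps
    print(f"{label:>12}: {airtime_s:5.1f} s of airtime for {download_mb} MB")
```

Same bytes, same plan, but the faster radio ties up the cell for a fraction of the time.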
It’s a text mode joke
It’s ARM, so have fun with drivers.
In Windows you’re not sending the signal directly out another port; you’re routing the dGPU’s signal through the iGPU to reach the port.
On a laptop with NVIDIA Optimus or AMD’s equivalent you can see the increased iGPU usage even though the dGPU is doing the heavy lifting. It’s about 30% usage on my 11th-gen i9’s iGPU just routing the 3080’s video out to my 4K display.
No. The video card is only wired to send video out through its own ports (which don’t exist here), and the ports on the motherboard are wired to the nonexistent iGPU on the CPU.
In terms of what matters for compatibility, 12th and 13th gen are largely the same, just with better P-cores. The iGPU is identical, the chipsets are largely the same, etc. 14th gen will almost certainly pose a problem, but 13th gen is a nice step up.
Alternatively, just pay a lot less for the machine. I’m surprised Best Buy has the balls to list it for that much without any discounts. With 14th gen rolling out I’d hope prices will tank on those machines, if not new then used; they’re getting close to the 3-year mark and their value will plummet.
Most people happily will. So the year of the Linux desktop will always be n+1