firmly of the belief that guitars are real
Fuck it, let’s all just start winding our own magnetic core memory arrays.
Open source is just another commons, and companies have a way of uncontrollably exploiting common resources until they collapse.
In the case of open source, it’s healthy in the sense that money is flowing, we have companies sponsoring projects, tons of code is available for inspection and reuse, etc. Very nice. But if you go back to the original concepts of free software, in many cases we struggle with actually exercising the four freedoms. Red Hat has engineered an EULA that basically lets them ban practices that had been thought protected by the GPL for at least a generation, and so on and so forth. So is the open source community healthy or dying? Doesn’t the answer to that depend on your priorities?
I think it would make a lot of sense to try to create an economic model that can fund open source software development without relying on corporate injections of cash. It’s not that companies never pay for it; they pay the bare minimum. E.g., the Heartbleed fiasco – tons of companies were freeloading off one guy, and like half the Internet’s security got fucked for it. Imagine if OpenSSL had had some kind of economic support structure in place that allowed for, uh, more than one guy to manage the encryption library for like half the Internet before something insanely stupid and predictable like that happened. Well, we can never have that with corporate-controlled open source.
Since CRF settings aren’t that useful for hitting a specified file size, you can use the following equation to calculate the average bitrate needed to encode a video of a given runtime at a given file size:

b = (s*8*10^6)/t

where s is the target size in gigabytes, t is the runtime in seconds, and b comes out in kbit/s.
E.g., my copy of Serenity is 01:58:55 long, which is 7135 seconds (see https://www.calculateme.com/time/hours-minutes-seconds/to-seconds), and I want it to be 2.5 GB, so my equation is

b = (2.5*8*10^6)/7135 ≈ 2803 kbit/s
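A quick sketch of the same calculation in Python (function and variable names are mine):

```python
def target_bitrate_kbps(size_gb: float, runtime_s: float) -> float:
    """Average bitrate in kbit/s needed to hit size_gb (SI gigabytes) over runtime_s seconds."""
    # size_gb * 10^9 bytes * 8 bits/byte, divided by runtime_s, gives bit/s;
    # dividing by 10^3 gives kbit/s. The constants collapse to 8 * 10^6.
    return size_gb * 8 * 10**6 / runtime_s

def hms_to_seconds(hours: int, minutes: int, seconds: int) -> int:
    """Runtime conversion, no calculator website needed."""
    return hours * 3600 + minutes * 60 + seconds

runtime = hms_to_seconds(1, 58, 55)               # 7135
print(round(target_bitrate_kbps(2.5, runtime)))   # 2803
```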
You can use any tool (Handbrake, ffmpeg, whatever) and any codec, and this equation will tell you the average bitrate needed to hit that file size. Use bitrate (“VBR”/“ABR”) encoding mode instead of CRF. I’d recommend enabling 2-pass for x264; not sure if that’s needed or available for x265, as I’m a bit of a stick in the mud re: video codecs.
A couple of notes: I’m using SI units (powers of 10 rather than powers of 2) for the conversion, and converting from bytes to bits, since bits are the more common unit for bitrates. If your software expects different units for the bitrate for some reason, or you prefer representing file sizes in gibibytes etc., you’ll need to adjust the equation accordingly.
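For instance, a 2-pass x264 encode with ffmpeg at the 2803 kbit/s figure from the Serenity example might look like this (filenames are placeholders; audio settings are up to you):

```shell
# Pass 1: analysis only, video stats written to disk, output discarded
# (use NUL instead of /dev/null on Windows)
ffmpeg -y -i input.mkv -c:v libx264 -b:v 2803k -pass 1 -an -f null /dev/null
# Pass 2: the real encode, using the stats from pass 1 to distribute bits
ffmpeg -i input.mkv -c:v libx264 -b:v 2803k -pass 2 -c:a aac -b:a 128k output.mkv
```

Note the audio track has its own bitrate, so if you’re being precise, subtract the audio’s share from the target size before computing the video bitrate.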
deleted by creator
You know how scientists announced they got slime molds to solve mazes? I imagine it could be something like that. The slime mold is just looking for food and living its life. What it doesn’t realize is that the food has been put somewhere that will force it to solve certain computational problems along the way.
Now imagine a central scheduler breaking down computational problems into bite-sized chunks and using an immersive storytelling simulator to force a few billion humans to do something similar. I could see it, in theory.
With the SSDs I can afford, there’s what you might call “net negative savings”: I save maybe a couple dollars a month in power but have to replace the drives every few months. We can’t all afford EVOs.
deleted by creator
I mean, with stuff like ZFS, it’s a little hard to justify the outlay for all-solid-state storage when I can build out a large storage array using HDDs and use one mid-size SSD for the ZIL and L2ARC to provide read/write speedups. Who actually cares what the underlying storage mechanism is, as long as the dataset is backed up and the performance is good?
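For the record, bolting an SSD onto an existing HDD pool as SLOG and L2ARC is a couple of one-liners with ZFS (pool and device names here are made up; adjust for your layout):

```shell
# Bulk storage pool on spinning disks (example layout: two-disk mirror)
zpool create tank mirror /dev/sda /dev/sdb
# Two partitions carved out of one mid-size SSD:
zpool add tank log /dev/nvme0n1p1    # SLOG/ZIL: absorbs synchronous writes
zpool add tank cache /dev/nvme0n1p2  # L2ARC: second-level read cache
```

Keep in mind the L2ARC helps most with read-heavy working sets bigger than RAM, and the SLOG only accelerates sync writes, so the benefit depends on the workload.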
Low-end hardware, under-specced hardware, etc. I had so many problems getting a client’s Surface tablet to install updates in a timely fashion because it shipped with only 64GB of storage, which turns out not to be enough for a Windows 10 install + an office suite + download space for updates. The best part is, Microsoft designed the Surface.
I don’t see the hypocrisy. If the universe is a simulation, that wouldn’t make whoever built the universe a god. There would be no analytical reason to conclude that, unless we started from the specially-crafted supposition that any being capable of creating something like the observable universe had to be equivalent to God, but at that point, you’re just defining your way into theism. If the universe is a simulation, which is not a terribly interesting thought experiment tbh, then it could be a simulation for any reason. The simulators could have been interested in the dynamics of gas and dust dispersion within galaxies and just so happened to create a sophisticated enough simulation that it could simulate the evolution of natural life. If the entire Universe had been “created” (although the point of defining it as a simulation is to point to how it doesn’t really exist, ipso facto if God is a simulator, then God is not a Creator in the sense theists mean) to study dust dynamics at the galactic scale, somehow I think theists would be dissatisfied and not feel like they had really found what they meant by “God.”
In theory, any type of Boltzmann brain could assemble itself at any time and start processing information, so in theory, a simulation could also be an entirely natural phenomenon occurring in a higher-order reality. The two ideas are different, even though Christians like to claim everyone is a theist and everything is theism even when they aren’t and it isn’t.
Anyways, the simulation hypothesis is sort of fun to think about sometimes, while “I invoke supernatural powers to explain phenomena I don’t understand” isn’t all that interesting.
Hi, sorry I just saw this. “SFF” is short for “small form factor.” It’s just industry jargon for “a small PC.” They tend to be designed to use less power which makes them a good fit for home servers. Pretty much any line of PC sold to businesses, like Dell Optiplex or HP EliteDesk, will have small form factor variants.
It’s more about the imbalance caused by algae blooms. They breed prolifically, and die off en masse more or less constantly as they bloom. When they die, decomposition consumes dissolved oxygen and releases carbon dioxide back into the water. So algae blooms hoover up carbon dioxide and concentrate it in a specific spot of ocean water, which can cause problems regarding anoxia and also ocean acidification.
The issue is that after a couple hundred years of intentionally eating literally everything in the ocean and dumping tons of our garbage and industrial waste there, oceanic ecosystems are even more fragile than usual and we don’t exactly have the ecological spare room to tinker with wild algae blooms on a scale large enough to make an impact on climate change. It would be trivial to ruin oceanic ecosystems, and by extension, many land-based ecosystems, with a megascale algae bloom.
Vats of algae in controlled environments might be a way to go, though?
deleted by creator
Everyone likes to trash machine learning because the power requirements are high, but what they don’t realize is that we’re in the very first days of this technology (well, the first couple decades of it being around, the first few years of it being advanced enough to have anything to show off). Every technology that got bundled together into your phone was just as useless when it was first invented. Honestly, compared to the development of most other technologies I’ve looked at, the pace of development in AI has been shocking.
Literally once a week, I see some news story about AI researchers delivering an order-of-magnitude speedup in some aspect of AI inference. The technique described here apparently allows for a 20x speedup on GPUs.
Yeah, but I don’t know any other language where the fact that a program is written in that language is used as a selling point. I never cared that Linux was written in C; I cared that it does its job. I’ve heard about Redox many times, yet never once has anything been said about it other than “it’s written in Rust! :D” Literally, the fact that it’s a Unix-y operating system written in Rust is the first thing about the OS on their home page.
Hey, Linux started as a learning project, you learn more about programming by writing code, so I’m not saying it’s bad, I just can’t understand why I’d care about something that at this stage seemingly is just a learning project.
Shout out for ODROID, their product revision cycles take too long (lmao why are they still selling a 32-bit chip that was an iffy investment back in 2013), but when they drop new stuff, it tends to be great.
deleted by creator
Do you also hand out copies of your car and house keys to strangers you meet in parking lots?