People buy specially supported PCs for Linux?
Doing a “hack simulator” would likely be easier in other languages, so you will hopefully run into some problems around acquiring and presenting the information, which I imagine would give you a decent understanding of the flow of data in Python.
I’d say “Go for it”, doesn’t sound too advanced and not “hello world”-simple either.
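To make the “acquiring and presenting” part concrete, here’s a minimal sketch of what such a project could start as: grab some real facts about the machine, then print them with a typewriter effect. Everything here (the function names, the fake “breach” messages) is made up for illustration, not a prescribed design.

```python
# Toy "hack simulator" sketch: gather real system info (the acquiring
# part) and print it hacker-movie style (the presenting part).
import platform
import time

def gather_info():
    """Collect a few real facts about the local machine."""
    return {
        "target": platform.node(),
        "os": platform.system(),
        "arch": platform.machine(),
    }

def type_out(text, delay=0.0):
    """Print text one character at a time, typewriter style."""
    for ch in text:
        print(ch, end="", flush=True)
        time.sleep(delay)
    print()

if __name__ == "__main__":
    info = gather_info()
    type_out("[*] BREACHING TARGET...")
    for key, value in info.items():
        type_out(f"    {key}: {value}")
    type_out("[+] ACCESS GRANTED")
```

From there you can swap `gather_info` for whatever data source interests you (files, network, processes), which is where the “flow of data” lessons come from.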
I’m too stupid to walk backwards through the 4th one, so I guess I only experience 3.5 dimensions.
Extra virgin
I prefer lightly fucked.
Screw it, I’m not picky, give me the full slut.
Yup, that’s the one.
Had quite a few problems with programs not cleaning caches properly and drives behaving weirdly when accessed in an offline state when they first introduced it, though I imagine it surely must have become more robust by now.
Mostly for not losing unsaved work across transit. Though, Windows has kinda blurred the line between shutdown and standby, so now you can do neither (I guess you can still shut down properly by holding the Shift key while pressing the button, but who thinks about that?).
But standby was indeed much more prevalent when booting your laptop took 2~5 min.
I was thinking about Knights of Sidonia. They bought the rights while the second season was being made, and while I’m not sure whether they also labelled the first season as a Netflix Original, they at least took it down from everywhere and instead put it on their own site.
EDIT: Correction, as far as I can see, they bought the rights a month after the first season finished airing, and they did call the first season a Netflix Original.
When the second season aired, I got pretty butthurt that I not only couldn’t see the new season on sites other than Netflix, but I couldn’t even see the first season anymore!
Haven’t seen that behavior myself yet, but yes, that does sound like either a bug or shadowbanning.
Excuse me for not being able to help.
If Netflix can call an existing series to which they bought the rights a “Netflix Original”, then this dude can host you a free website for $1+.
I’ve encountered a few times where the post or a parent comment got deleted, which also appears to hide any sub-comments.
Might that be it?
Was just about to say. Apples keep away the doctor, beans keep away anyone else.
Quite the contrary, it’s properly structured and leaves no room for misinterpretation, given that the reader can, well, read.
Are you assuming that Google, which, as far as I’m aware, is an international company providing service to a multilingual userbase, has less than 1% non-native English speaking users?
I mean, I don’t care much how Google advertises itself; even companies I do like sometimes make an unlucky promotion, and that’s fine. But I do find the arguments in this comment thread to make some wild assumptions.
That just expands the question: do they not know about other countries?
Many of us have certain connotations with Google, and while we know the game in our native language, it’s not the first thing we think about when thinking “Google says: I spy”.
I’m pretty sure almost no nerds use ChatGPT, as ChatGPT kinda takes the nerdiness out of the nerd.
Script kiddie might fit better, looking at Stack Overflow from the past half year.
Right, apologies for dumbing it down so far. I find it hard to properly gauge the knowledge of others on the internet, and just try to play it safe.
I wasn’t aware that one could serially program gate arrays, since, as far as I know, serial programming by definition is code governed by a processor, which prohibits anything but serial execution of commands. So it’s news to me that gate arrays can run serial code without any governing or serializing process, since gate arrays by themselves are anything but serial. Or rather, you need to synchronize anything and everything that is supposed to be serial yourself, or use pre-built and pre-synced blocks, I guess.
Anyway, going by the definition that serial programming can only be performed using some kind of governance or synchronizing authority, that alone would be another layer of security.
As serial implies, it rid us, or lessened the burden, of those timing related issues, some of which included:
And the list goes on, but you know.
Serial also has a lot of pitfalls, and you can definitely screw things up bad, but at least you don’t have to think much about clock or timing, or memory placement, unless communicating between devices or cores, and those sync problems tend to be rather tame and simple compared to intra-processor problems.
At least from my experience.
I think you are misunderstanding me. Are you perhaps thinking about multithreading or multi-core? Because some people have also started calling that “parallel”, even if it is nothing like low-level parallel.
A CPU does not build upon a CPU, a CPU builds upon transistors which are collected into gates, and which can be assembled into the correct order using parallel programming.
EDIT: as an example, you do not actually need a computer to parallel program. Get yourself a box of transistors, some cable, and a soldering iron, and you can build some very rudimentary gate arrays, like a flip-flop.
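To make the flip-flop example concrete, here’s a toy Python simulation of a cross-coupled NOR pair (an SR latch), the kind of “very rudimentary gate array” you could solder from transistors. Note that this is just a sketch of the logic, not real hardware: the feedback loop (each NOR’s output feeds the other’s input) has to be settled iteratively, which is exactly the kind of timing concern that serial programming hides from you.

```python
def nor(a, b):
    """A NOR gate: output is 1 only when both inputs are 0."""
    return 0 if (a or b) else 1

def sr_latch(s, r, q=0, qn=1):
    """Settle the cross-coupled NOR pair for inputs S (set) and R (reset).

    q and qn carry the previous state, which is how the circuit
    "remembers" a bit when S = R = 0.
    """
    for _ in range(4):  # a few passes are enough for this tiny circuit
        q, qn = nor(r, qn), nor(s, q)
    return q, qn

q, qn = sr_latch(s=1, r=0)              # set -> (1, 0)
q, qn = sr_latch(s=0, r=0, q=q, qn=qn)  # hold: state is remembered
q, qn = sr_latch(s=0, r=1, q=q, qn=qn)  # reset -> (0, 1)
print(q, qn)
```

In real hardware there is no settle loop; the gates all switch “at once”, and making sure they settle into the state you wanted is your problem.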
This link might give a better understanding of our confusion.
EDIT 2: One could perhaps illustrate the confusion which this topic is often victim of as such:
Transistors are part of the hardware and are parallel programmed to form complex gate arrays called “Processors”, which feature instruction sets used by machine code, which is made using assembly, which is called “serial programming”, which enables high-complexity operations such as multi-core “parallel” programming.
I’m talking about the former “PGA parallel programming”, and not the latter “multi-core parallel programming”.
A CPU is a very complex gate array which handles bothersome tasks such as synchronization (run conditions) and memory access, and presents you with a very limited set of instructions. All serial programming builds upon this very limited set of instructions, and the instructions have been thoroughly tested over the past 6 decades.
Not to say that CPU architecture or microcode is fail-safe, but the chance of your computer blue-screening because of a failure of your CPU is rather small.
Now, parallel programming (the low level variant, not the hijacked definition) is the art of “wiring” those gate arrays. A CPU is actually made using parallel programming, so all the safeties it presents for serial programming will not be present in parallel programming, as parallel programming does not use a CPU.
EDIT: the above is of course simplified, there exist multiple architectures, collected into more common instruction sets such as amd64, armhf, arm64, etc., but even the most barebones processing unit contains a lot of safeties and niceties that parallel does not have.
Lots of buzzwords indeed, author apparently doesn’t even know what a smart sensor is, as they described a regular sensor in their first paragraph.
That said, you can absolutely program analog ICs, such as by using a Field Programmable Gate Array instead of just your regular gate array (your usual, ‘stupid’ IC). Though, while a random IC might cost you less than half a dollar, an FPGA will cost you around $100 for a simple chip.
On the other hand, skipping any GPU or CPU and their limitations by clock speed should speed up the AI considerably, though parallel programming (not concurrent programming, and not multi-core “parallel” programming either) is much harder and comes with almost no safety when compared to serial programming.
Not even Windows can run all Windows games, so that’s kind of a hard criterion for Linux to achieve.