What’s that? All I see is *******
The usurper delivered us from the evil that was cable or nothing.
You either die a hero or live long enough to become the enemy. The cycle must repeat.
Kind of a “takes one to know one” situation here
I’d imagine you need to point the software to a camera you own. In this case, cities would add a list of the networked cameras they use to the software suite and let it do its thing.
I doubt this program is just scouring the net for unsecured cameras, but who really knows. IP geolocation is getting worse and worse by the year, so that’s an unreliable feature.
I want to say they missed the mark with Shadow, but it indirectly brought us this kind of treasure. It’s like the grandfather of modern cringe
This seems simple for one stream, but scale that up to the number of unique streams Youtube is serving at any given second. 10k?
Google doesn’t own all of the hardware involved in serving video. They push videos to local CDNs, which then push the videos to end users. If we’re stitching advertisements into streams on the fly, we need to push the ads to the CDNs pushing out the content. The two may already be colocated, but they may not be. We need to factor in additional processing, which costs time and money.
I can see this becoming an extremely ugly problem when you’re working with a decentralized service model like Youtube. Nothing is ever easy since they don’t own everything.
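For what it’s worth, the usual way around per-viewer re-encoding is stitching ads at the playlist level rather than into the video itself: each viewer gets a manifest with ad segment URLs spliced in, while the underlying video segments stay shared. This is a generic sketch of that idea, not Youtube’s actual pipeline, and the segment names are made up for illustration:

```python
# Hedged sketch of manifest-level ad insertion: instead of re-encoding the
# stream per viewer, splice ad segment URLs into each viewer's playlist.
# Segment filenames below are invented placeholders.
content = ["seg001.ts", "seg002.ts", "seg003.ts", "seg004.ts"]
ad_break = ["ad_a.ts", "ad_b.ts"]

def splice(content_segs, ad_segs, after_index):
    """Return a per-viewer playlist with the ad break inserted at one point."""
    return content_segs[:after_index] + ad_segs + content_segs[after_index:]

# One viewer's personalized playlist; the .ts segments themselves are unchanged
playlist = splice(content, ad_break, after_index=2)
print("\n".join(playlist))
```

The appeal is that the CDN only ever caches shared segments; the "unique stream" is just a few lines of text per viewer. The catch the comments above describe still applies: the ad segments themselves have to be on (or near) the edge nodes serving the content.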
I’ve seen this but nobody actually likes the older versions either. Vista being an outlier, of course
Thankfully it seems that encoding ads into the video stream is still too expensive for them to implement.
I’m assuming that asking CDNs to combine individualized ads with content and push the unique streams to hosts does not scale well.
Honeypots have gotten really weird lately. Anti-honeypot (along with anti-VM and anti-debugging) techniques and methods are more common than ever. I think something like 80% of all APT-level malware from the past 5 years uses these techniques.
It’s best to purchase an old router that doesn’t support new protocols to learn with. It should only be used for your testing, not for normal use. WEP will be several orders of magnitude easier to crack than WPA2 or WPA3. Tools can help you break certain implementations of encryption regardless of how many bits of entropy are being used, often by exploiting weaknesses in the algorithms or how they’re applied rather than by brute forcing. That’s often the kind of thing demonstrated at conferences and featured in research papers.
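To put rough numbers on why WEP in particular is the easy target: every packet’s RC4 keystream is keyed with a 24-bit IV, so IVs start repeating after only a few thousand captured packets, and reused keystreams are what the published key-recovery attacks (FMS/PTW, as used by tools like aircrack-ng) build on. A back-of-envelope sketch:

```python
import math

# Why WEP falls over before you even attack the key: the per-packet RC4 IV
# is only 24 bits, so IVs collide quickly (birthday bound), and a reused
# keystream leaks plaintext. Real attacks (FMS/PTW) need even less than this.
iv_space = 2 ** 24  # total distinct WEP IVs

# Packets captured before a ~50% chance of at least one IV collision
packets_for_collision = math.sqrt(2 * iv_space * math.log(2))

print(f"{iv_space} possible IVs")
print(f"~{packets_for_collision:.0f} packets for a 50% IV collision chance")
```

That’s a busy network’s traffic for a few seconds, which is the intuition behind “several orders of magnitude easier”: the protocol design, not the key length, is what gives out first.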
As far as everything else is concerned, you’ll get there if you stick with it. I’ll echo what others have said in this thread; there are serious diminishing returns in chasing absolute security, and all of it can be bypassed by attacking you, the human, directly.
Best place to start is by vacuuming up some open courseware from MIT on the topics you’re interested in. RF fundamentals, basic wireless communications, maybe some basics of network security and fundamentals of computer security or cryptology.
You need a knowledge base in order to know what to look for when you run into problems, else you just kind of waste a lot of time.
Then, familiarize yourself with Wireshark. Start a capture and visit a few HTTP websites to see what information your computer is transmitting and how it’s formatted. Your goal is ultimately to snoop on this information and modify it. You need to know how to change a character in the middle of a packet to deliver an effect. If none of that makes sense…
Learning to use an SDR is honestly a bit of a pain. You can get a $30 antenna on Amazon that covers the ~1-6 GHz range, and that will enable a lot of what you want to do. Try to pick up an old router that supports the WEP protocol. It’s old and deprecated, with lots of published information on how to break it.
Combine the SDR with your computer and Wireshark, and visit a webpage over HTTP. You’re almost certainly going to run into problems manually isolating and cleaning up the WiFi signal on your SDR into something that’s useful, but that’s probably enough to start you off on your journey. If you can capture the WiFi traffic and convert it from an analog waveform into a digital bitstream, then you can finally begin doing useful things. Of course… you still need to decrypt the bitstream and account for errors.
Good luck
I came off as pretty aggressive, so I apologize. I’ve been interested in this field for a while and I am still an amateur in most aspects. This isn’t really an area that’s intuitive or easy to pick up for most people.
You’ve come out of the gate swinging. It’s technically possible for people to do the things you’re exploring… but the same people who are publishing these techniques and concepts are professionals. They may not have formal education in computer science, but they have the experience.
Spend time going over things like DEFCON presentations. Sharpen your coding skills. Vacuum up free courseware from sources like MIT.
You can probably pick up “normal” RF with a cheap SDR antenna setup, but then what? You are stuck with some waves and no idea what to do with them. Are you picking up intentional Bluetooth? How would you recognize Bluetooth that’s frequency hopping? Looking at RF waveforms for modern communications is absolutely ugly and tedious.
There’s so much to learn. You need to pick one topic and dig in. All of these things have much more depth than we can cover over Lemmy.
You should try this. I guarantee that it’s nowhere near as easy as you’re thinking.
There’s a huge difference between proof of concept activities and useful, fruitful information gathering and analysis.
If you’re going to be downloading programs and running scripts without doing the work to understand how these tools were built and how to modify them to suit your use cases, then you aren’t actually going to get anything useful out of them.
I don’t think an RTL-SDR is going to help you with any sort of privacy, outside of maybe validating that your devices aren’t emitting typical RF while they’re off. You aren’t realistically going to become an electronic warfare master with some shitty home equipment and no formal training.
Best route is to start combing through security conference presentations for anything relevant to your lifestyle.
A lot of the cutting edge information gathering stuff isn’t exactly practical for widespread use. I guess somebody living a floor above you could capture your wireless traffic, but you’re not interesting enough for them to dedicate high sensitivity antennas and bespoke equipment to phreak your keyboard strokes and break out fucking differential power analysis techniques on your home.
Practice good data and security hygiene, stay off social media when possible, and don’t use IOT devices. If anybody wants to get at you, and I mean really wants to get at you, there’s nothing you’re going to be able to do about it besides giving up all electronics.
What you’re describing doesn’t sound like UBI at all.
UBI is universal. There’s no criteria for eligibility besides maybe residency and being alive.
A homeless person would receive the same UBI that a doctor would. Anything else is a form of aid.
Are you suggesting that we should raise Starfield’s development cost estimate, then, to account for hidden engine costs?
We can do that. I don’t know what a good number would be, but let’s quadruple or quintuple it for fun. Are we sitting at the $1.5 billion mark? This gives us a scenario where Starfield has now cost twice as much to develop as this game.
The game was still developed and released. At some point, long development times start to work against a product. This isn’t a field where consumer expectations and tastes remain constant. The longer a game takes to make, the more dated design decisions may appear. Graphics cannot remain cutting edge for the entirety of a 10 year development cycle without rework, which can be seen as a waste of resources. That time and energy could have gone towards something else. Rework enough systems and you begin to paralyze your ability to actually complete the project.
Does it really though?
Starfield could have been programmed in potato with TI-84 calculators as dev tools. The work has been done to bring a playable game to the market.
What goes on behind the scenes isn’t really important to an end user. They are purchasing an entertainment experience, not an investment into a game engine.
No no, they raise prices in lockstep, such that they’re still technically just barely cheaper.
That being said, I don’t think AMD and Intel have similar game streaming services. It’s pretty much GeForce Now and Xbox Cloud streaming as the big dogs.
I’d think the skull is unusable without any processing anyway. Coffee is slightly acidic, and bone will dissolve over time. Plus, bones are porous, not unlike terra cotta pots. You’d need to glaze the interior or something after sealing the major holes.