It’s actually superior to ChatGPT because of its much larger context window of about 100,000 tokens, whereas GPT-4 tops out at 32,000 tokens. Plus you can upload a document and then query it.
Microsoft: “Gotta keep all of the telemetry and AI running 24/7, of course!”
I don’t think it’s possible for them to do so, because that would mean killing the gaming side of Windows. GPU compute in the cloud is stupidly overpriced and expensive; just look at Standard_NV6 as an example, it easily costs $10,000/yr according to this. (Just look for anything that has “N” in its name for GPU-enabled VMs and they are all expensive.)
If they try to ban everyone from using their own computer hardware, I really doubt people would stay on Windows; they’d most likely go through the five stages of grief and then contemplate switching to either Linux or macOS.
I would honestly hope you bring it up to your agency to start offering chat channels that are end-to-end encrypted and have all history wiped after a certain period of time.
Probably both, and the network effect is still something he would consider.
I think it’s better to create a division between corpo-social media and public social media. Everything that corporations do just muddies the water.
Same, it’s like we’re watching from the sidelines, seeing all of the sheep go headfirst off the cliff.
And so the global enshittification grows…
As in the actual world: providing context about the physics of things, providing logical association/evaluation, and so on. It’s basically something that’s supposed to help the LLM get closer to understanding the “world” rather than just spewing out whatever its training dataset gave it. It does have a direct implication for technical writing, because with a stronger understanding of the things you want to write about, an LLM with a world model would basically auto-fill that.
This is something researchers are pretty much all hands on deck working to create.
Guess that’s all you can do, yep.
There is active research on world models working alongside LLMs. The idea generally is that the LLM is used for generating text, while the world model provides more context for the LLM to understand the world.
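Just to make the split concrete, here’s a minimal sketch of that pairing. Every name in it (`WorldModel`, `WorldState`, `generate_with_world_model`) is made up for illustration, not an API from any real research codebase:

```python
# Hypothetical LLM + world-model pipeline (illustration only).
from dataclasses import dataclass

@dataclass
class WorldState:
    facts: list[str]          # e.g. "the cup is on the table"
    constraints: list[str]    # e.g. "unsupported objects fall"

class WorldModel:
    """Stand-in for a learned model of physics / causal relations."""
    def predict(self, question: str) -> WorldState:
        # A real system would run a learned simulator or knowledge
        # model here; this sketch just returns canned context.
        return WorldState(
            facts=["the cup is on the table"],
            constraints=["unsupported objects fall"],
        )

def generate_with_world_model(llm, world_model: WorldModel, question: str) -> str:
    # 1. Ask the world model for grounded context about the question.
    state = world_model.predict(question)
    # 2. Fold that context into the prompt, so the LLM isn't relying
    #    purely on patterns memorized from its training data.
    prompt = (
        "Known facts: " + "; ".join(state.facts) + "\n"
        "Constraints: " + "; ".join(state.constraints) + "\n"
        "Question: " + question
    )
    # 3. The LLM does what it's good at: producing the text.
    return llm(prompt)
```

The division of labor is the whole point: the world model handles grounding, the LLM handles fluent text.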
Yeah, though it would be more challenging to make a living when it lowers the barrier to entry for writers.
To be fair, it’s a REALLY good hammer.
I thought about bringing up technical writing, then I realized there’s a possibility that even that job isn’t safe within the next 5 years, considering the promising development of spiking neural networks. At this point I would probably suggest your daughter reconsider her chosen field and try to enter biology or some other stable job.
It would make it even more important to have sites like Goodreads where books are recommended by communities.
Yeah, and I am honestly surprised that you could do OK with AI on a Mac, since I was pretty sure TensorFlow/PyTorch are primarily CUDA implementations and have only recently worked on branching out to other APIs.
On the other hand, I kind of want Microsoft and Apple to force it through, charge $60/mo, lose all of their users within a few days, and see the Linux population explode. Ah, one can dream…
I guess it depends on circumstances. For example, I would develop a GUI toolkit that uses Vulkan Compute for calculating various indicators and trading analysis on the front end, taking in billions of candlesticks from second-by-second trading. Having real-time feedback when adjusting an indicator algorithm is very handy in trading software.
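For a sense of the kind of data-parallel work being described, here’s a rough sketch of an indicator kernel, a simple moving average over candlestick closes. NumPy stands in for what would actually be dispatched as a Vulkan compute shader; the field layout and window size are assumptions, not from any real codebase:

```python
# Stand-in for an indicator kernel that would run as a Vulkan compute
# shader in the toolkit described above. NumPy is used only to show
# the data-parallel shape of the work.
import numpy as np

def simple_moving_average(closes: np.ndarray, window: int = 20) -> np.ndarray:
    """Rolling mean over candlestick close prices.

    Each output element is independent, which is exactly the kind of
    work a compute shader can spread across millions of candlesticks
    per frame.
    """
    kernel = np.ones(window) / window
    # 'valid' drops the first (window - 1) positions that lack history.
    return np.convolve(closes, kernel, mode="valid")

# Example: 1 million second-by-second closes around a price of 100.
closes = 100.0 + np.cumsum(np.random.normal(0.0, 0.01, 1_000_000))
sma = simple_moving_average(closes, window=60)
print(sma[:5])
```

Swapping this CPU loop for a compute-shader dispatch is what makes it feasible to recompute the indicator in real time as the user tweaks its parameters.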
Well, I’ve worked for the government (as a contractor), corporations, and small businesses, and I can count on one hand the times I’ve seen people using an Apple Mac Pro (MacBook Pros more often, but very rarely for development), versus more times than I can count seeing Linux or Windows workstations.
We use Linux desktops often, because most of our servers run Linux and it helps to have version conformity when matching the servers’ versioning, and we occasionally use Windows for Visual Studio, proprietary software, and so forth. There have also been a few times where we got discounts for buying software for Linux rather than Windows.
Employees in my office switched from Apple MacBook Pros to Windows/Linux-based laptops like the Framework Laptop, because the MacBook Pro often lacked the GPUs you would find in Linux and Windows workstations. Apple is going off into its own little world with its own Metal API/GPUs, and that doesn’t reflect the reality of emerging technologies. For instance, for some computational challenges in my office we use Vulkan Compute so that we can buy both Nvidia and AMD GPUs to generate real-time data; had we used the Metal API and Apple’s products, it would’ve been cheaper to purchase cloud compute servers. (We wanted to ensure each developer can test the given Vulkan code on their own desktop/workstation.)