This article seems kinda misleading. Even if you could set ChatGPT as your default assistant on Android, it wouldn't be triggered by a hotword the way Google Assistant is.
You didn’t even read the article, huh?
They're getting rid of an experimental feature that was only released in one country. Companies do this all the time; it's only news because "Google kills everything" stories get clicks.
Scrolling any list.
It's like I'm on Reddit again. So sick of hearing this tired line regurgitated over and over for free upvotes.
I mean no disrespect, this is a genuine question: at this point, why limit yourself to HomeSeer hardware when Home Assistant has become so user-friendly to set up and configure?
Why is it a deal breaker? Synthetic benchmarks don’t mean anything. The real world experience of using a Pixel is faster/smoother.
I feel like most people don't understand that the OS provides a default share menu, but the vast majority of apps create their own (which are significantly worse).
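For anyone curious, invoking the system sheet is literally one intent. A minimal Kotlin sketch (the helper function name is mine, but `ACTION_SEND` and `createChooser` are the real framework APIs):

```kotlin
import android.app.Activity
import android.content.Intent

// Launches the OS-provided share sheet (the one with direct-share targets),
// as opposed to an app drawing its own custom bottom sheet.
fun shareText(activity: Activity, text: String) {
    val send = Intent(Intent.ACTION_SEND).apply {
        type = "text/plain"
        putExtra(Intent.EXTRA_TEXT, text)
    }
    activity.startActivity(Intent.createChooser(send, null))
}
```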
The Android 14 share menu works great for me. The people and apps I share stuff with are always there. 🤷
Maybe read the article?
Software updates. Eventually you have to stop supporting older versions of Android; apps do it all the time. Dropping support for Android 7.0 going into 2024 seems reasonable (it's only a couple percent of the total market).
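For context, "dropping support" is usually just a one-line minSdk bump in the build script. A sketch of what that looks like in a Kotlin DSL build file (the exact new floor here is my assumption, not something from the article):

```kotlin
// Module-level build.gradle.kts. Android 7.0/7.1 = API 24/25, so raising
// the floor to API 26 (Android 8.0) is all "dropping 7.0 support" means.
android {
    defaultConfig {
        minSdk = 26 // was 24; devices below this just stop seeing app updates
    }
}
```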
Nice. I'd much rather they had kept the Nearby Share name, but whatever.
How does this have anything to do with market share anyway?
Are you serious or are you just trolling? This is an antitrust lawsuit. Antitrust law is about preventing abuse of monopolies, and a monopoly means controlling most or all of a market.
No, Apple won on some technicalities.
Honestly I love the Pixel Fold because it’s so short. You can actually use it with one hand when closed without contorting.
Reading comprehension is not your strong suit.
You're just moving the goalposts. I ran an LLM on device in an Android app I built a month ago. Does that make me the first to do it? No. They're the first to production with an actual product.
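Since people keep asking what that even looks like: a heavily simplified sketch. `LlamaBridge` and its methods are made-up stand-ins for whatever native binding you wire up (llama.cpp's Android build, MLC LLM, etc.), not a real library API:

```kotlin
import java.io.File

// Made-up stand-in for a real native binding; a real one streams tokens
// from C++ over JNI. The point: the multi-GB weights live on the device,
// and no network call is involved at inference time.
class LlamaBridge(modelFile: File) {
    init {
        require(modelFile.exists()) {
            "quantized weights must be bundled or downloaded first; no server involved"
        }
    }
    fun prompt(text: String): Sequence<String> =
        sequenceOf("stub ", "tokens ", "here") // real binding decodes token by token
}

fun main() {
    val model = LlamaBridge(File("/data/local/tmp/model-q4.gguf"))
    model.prompt("Summarize my notifications").forEach(::print)
}
```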
You didn’t list a single production app in that post…
I can't find a single production app that uses MLC LLM, for the reasons I listed earlier (like models that aren't garbage being multiple GB).
Qualcomm's announcement is a tech demo, and they promised to actually ship it next year…
Where the fuck is "generative" in the title?
LLMs and diffusion models have been in apps for months.
Show me a single example of an app that has an LLM on device. Find a single one that isn't making an API call to a powerful server running the LLM. Show me the app update that adds a multi-gigabyte LLM onto the device. I'll wait…
Feel free to not respond when you realize you are wrong and you have no clue what everyone else is talking about.
That’s the entire point. Running the LLM on device is what’s new here…
At first glance I was confused/angry about why this would only be for the Pixel 8 Pro and not the standard Pixel 8, considering they both have the same Tensor G3.
However, from my own testing, it seems very likely that the full 12 GB of RAM in the Pro (vs the 8 GB in the Pixel 8) is needed for some of these tasks, like summarization.
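Rough math on why the extra 4 GB plausibly matters (the parameter count and quantization below are my guesses, not published specs for Google's on-device model):

```kotlin
// Weights-only footprint: params * bits-per-weight / 8 bytes, converted to GiB.
fun weightsGiB(params: Double, bitsPerWeight: Double): Double =
    params * bitsPerWeight / 8.0 / (1L shl 30)

fun main() {
    val params = 3.25e9 // assume a ~3B-parameter model
    println("4-bit: %.2f GiB".format(weightsGiB(params, 4.0))) // ~1.51
    println("8-bit: %.2f GiB".format(weightsGiB(params, 8.0))) // ~3.03
    // Stack that on top of the OS, the foreground app, and the KV cache,
    // and an 8 GB phone has much less headroom than a 12 GB one.
}
```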
What does this have to do with GrapheneOS?