Anyone else get the feeling that AI is a net loss for humanity? Search already made us lazy; now we don't even have to think about the answer, just regurgitate what the AI said. Don't even read your email; just have AI summarize what an AI probably wrote.
Then there's a strange dismissal of AI's failures. You can blame the AI for a failure, but if you had arrived at the same conclusion or result yourself, it would be your fault. It's become a sort of ownership offloading. Easier to blame the machine than take responsibility?
Where are the environmentalists? AI uses roughly 10x as much energy (and thus produces roughly 10x the carbon emissions) as search. That doesn't include the resources and energy needed to make the hardware and train the models in the first place. I can't think of any other device or tool where people would tolerate a 10x increase in energy usage to complete the same task. People don't care what happens in the datacenter until it's next door to them, causing problems.
I find it very useful for generating boilerplate code, unit tests, etc. It's a great tool for certain work. I don't need or want it shoved into everything just because every Product Manager and Sales team seems to think it's a good idea to drive 'growth factors' or some other BS.
The real uses of AI will be far more subtle.
Conversely, I spent an hour one day trying to prevent Gmail from puking text into every new email I begin. It turned out to be a Chrome feature, and scrapping Chrome is the only way to stop it. Beyond that, more time was spent working out how to disable Gemini and remove its elements, so its unwanted presence isn't triggered into being a problem.
Every month I have to review the list of registry edits I use to keep Copilot's unwanted advances out of my users' workspaces (one such edit is sketched below). Notably, there isn't one for the Win11 Notepad, and Microsoft Copilot has to be disabled there manually through the UI.
Of these two preferences, major tech respects just one. The other one is continually acted on, leveraged and intruded upon - with tech corps showing no more understanding of consent than Harvey Weinstein did. If AI is on the table, what we want is just an obstacle for UI devs to overcome.
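For context, here's a minimal sketch of the kind of registry edit involved, assuming the documented "Turn off Windows Copilot" policy value still applies on your build (per-app keys vary, and as noted above Notepad has no equivalent):

    # Minimal sketch, assuming the documented "Turn off Windows Copilot"
    # group-policy value; availability varies by Windows build, so verify
    # before rolling it out to users.
    import winreg  # Windows-only standard-library module

    KEY_PATH = r"Software\Policies\Microsoft\Windows\WindowsCopilot"

    # Create (or open) the per-user policy key and set the DWORD that the
    # "Turn off Windows Copilot" policy maps to.
    with winreg.CreateKeyEx(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "TurnOffWindowsCopilot", 0, winreg.REG_DWORD, 1)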
I see other products trying to do this as well. My intuition tells me this is not going to end well. I could be entirely wrong. I really hope I am wrong. Maybe it's all just hype and products will not lose what makes them great. Maybe it will just be some burnt development cycles. Or maybe it will really just amount to making new application connectors, formats, and protocols that are more AI-friendly.
[1] - https://www.rsyslog.com/rsyslog-goes-ai-first-a-new-chapter-...
They were so proud to announce in ~February that they would sum up our comments on the internal survey using Copilot (probably)!
I just want to not have it shoved in my face. It's exhausting.
Toasters?
Oh wait. https://www.theguardian.com/uk-news/2022/sep/10/intelligent-...
You're using consumer-tier gear.
So they're assuming consumer-tier desires.
Try some professional gear.
Let's face it. Huge portions of the population are performing unnecessary jobs in order to sustain themselves. Most products and services available to us today are unnecessary, unsustainable, and destructive (economically and socially), and many represent a huge bubble that will hurt us down the line. LLMs have exposed the sort of office jobs that have been automatable for years.
We are killing ourselves mentally and spiritually, as well as the planet we live on, so everybody can drive to offices to respond to emails or to create carbon copies of already existing products while hoping marketing makes the difference.
The environmentalists have been screaming from the rooftops about it for a while. But activists can only do so much compared to huge multinationals with marketing budgets bigger than some countries' GDPs...
Throughout history, we've found ways to guide powerful technologies toward better outcomes, even when there were strong economic incentives pushing in other directions. Think about how we've developed safety standards for cars, regulations for medicines, or international agreements around nuclear technology. It took time and effort, but people working together made a real difference.
When it comes to AI, we still have meaningful choices. We can support leaders who take these issues seriously, back companies that are genuinely trying to develop AI responsibly, and speak up when we see problems. We can also stay informed and help others understand what's happening - sometimes the most important thing is just having honest conversations about what we want our future to look like.
Like, dawg, I don't use Gemini. I just use your email and Google Drive.
Fuck.
We will see one of two things:
(a) a massive market crash that nearly destroys the software industry and for a time wreaks havoc on the global economy, or (b) a soft crash thanks to external smoothing effects (never underestimate the power of fake billionaire money), along with a severe bifurcation of tech markets: slop dominates the low-quality mainstream products while the higher-end luxury products minimize or outright remove all mention of AI because it has become such a low-brow, tainted brand.
(On that last note, we already have evidence that when consumers see something labeled "AI-powered" or whatever, they like it LESS. It will eventually become the kiss of death.)
This says more about how little energy search takes than about the huge environmental toll of AI.
In this day and age, when "but the environment!" is one of your key arguments, you've already lost. The environment is cooked and nobody cares. People (in the societal sense) say they care, but every action they take says the opposite.