Every week there are several new things; most examples are cherry-picked, benchmarks are blown out of proportion, etc.
I run a software company that has not added any AI features, mostly because the usefulness seems to fall below the bar for value that most of our features try to maintain.
I use ChatGPT personally for random searches, and Cursor plus whatever model is deemed best at the time, but even with that it takes a lot of work to get something valuable out of it day to day.
I feel like I’m losing my mind when I see startups posting $10M ARR numbers in just 6 months or whatever.
I’m hearing from VCs that churn is 15-30% in many of these companies, and they’re far from profitable, but the growth is just wild.
Do I succumb and just add yet another text-generation-that-maps-to-an-object feature, like the fill-in-the-blank app of the week?
It feels disingenuous but even companies I know and respect are adding questionable “agent” features that rarely work.
Anyway, how are you feeling?
If AI lives up to its hype, which is a whole other subject, then I expect to see the two things I like about my job vanishing quickly: the pay and the problem solving.
I’m not working on AI, but working with AI.
The leverage feels dramatic for a solo founder in my shoes. I think it’s all the cross-domain context switching: Gemini 2.5 Pro for academic-type research, ChatGPT 4o for rapid-fire creative exploration, o1-pro for one-shot snippets, Copilot for auto-complete.
It’s exciting honestly. I don’t know where we’re going but I do feel free and in a solid strategic position having my own company and not still being a cog in the machine.
- Who does this move the needle for? How does it compare to how things are done now?
- How does the regular person benefit, if at all?
- What's likely to happen to pricing after the initial investor subsidization ends? What does the price history of other 'unicorns' tell us? Airbnb and Uber used to be cheap once too.
- What is the valuation of "AI-first Company X" based on? Who are the insiders and what is their work background?
Too much AI news today is just parroting corporate press releases and CEO keynotes.
It’s ironic, though. VB6 macros in Excel were a major productivity win. Point-and-click forms an MBA could whip up in 20 minutes. Software development libraries used to be much faster to develop with, with far less boilerplate.
Just relax and realize it's mostly FOMO: https://www.theregister.com/2025/05/06/ibm_ai_investments/
LLMs are inherently non-deterministic. In my anecdotal experience, most software boils down to an attempt to codify some sort of decision tree into automation that can produce a reliable result. So the “reliable” part isn’t there yet (and may never be?).
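To make the point concrete, here is a toy sketch (Python, purely illustrative; the llm() call is hypothetical, not a real API): the codified decision tree gives the same answer for the same input on every run, while the model’s output has to be validated before anything downstream can rely on it.

  def refund_policy(days_since_purchase, is_damaged):
      # Codified decision tree: same input, same output, every run.
      if is_damaged:
          return "full refund"
      if days_since_purchase <= 30:
          return "store credit"
      return "no refund"

  # answer = llm("What refund does this customer get? ...")
  # That string may differ from run to run, so it needs validation
  # before it can drive any automated action.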
Then you have the problem of motivation. Where is the motivation to get better at what you do when your manager just wants you to babysit Copilot and skim over diffs as quickly as possible?
Not a great epoch for being a tech worker right now, imo.
Another problem is that we're turning every problem into a black box, which takes the fun out of problem-solving.
I think there are situations where AI as it currently exists is absolutely a value add, but often it does seem like it's been shoehorned into an existing product just to ride the latest trend.
Frankly, as a "user", not a potential employee, I don't give much of a fuck about anything more than what I can do with the thing right now. (Which is quite a bit in fact.)
HKCU\Software\Microsoft\Windows\CurrentVersion\Search\BingSearchEnabled:DWORD = 0
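For anyone who wants to set that from a prompt instead of RegEdit, the stock reg.exe one-liner below should do it (you may need to restart Explorer or sign out before it takes effect):

  reg add "HKCU\Software\Microsoft\Windows\CurrentVersion\Search" /v BingSearchEnabled /t REG_DWORD /d 0 /f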
People wildly overblow the upsides, make wild predictions, cherry-pick results, etc.
I can't read headlines like "Why the fact that some teachers are using AI to check homework means there will be no teachers required".
Word. I’m not a huge Microsoft fan, but it feels like they’re just shoving ChatGPT/Copilot down your throat every chance they get. It’s integrated into everything; it’s useful when it’s useful, okay, but it generally isn’t. It’s one more Microsoft-ism that you have to learn to tolerate or simply ignore.
I can’t tell if Nadella’s really betting the farm on it all, or if he’s just trying to leave his mark.
LLMs as coding assistants are undeniably time saving devices, especially when working in languages/libraries/platforms/frameworks you aren't already very familiar with, or when needing to generate something very boilerplatey as a one-off.
I am not calling the technology useless by any stretch of the imagination, but it's still just so wildly overhyped right now.
It is a pretty common occurrence these days for me to have a blog post open from some "AI industry thought leader" talking about how all developers will be out of work in a year, while at the same time I have a Gemini window open and I'm watching it absolutely flail on relatively simple things like generating a database query or a regex (one that is novel and not something scattered all over its training set, like a simple email validator).
And Gemini 2.5 is, IMO, the best of the models when it comes to programming assistance (having replaced Claude 3.5, which was IMO previously the best), at least for the areas I touch (lots of Kotlin/KMP/Android/etc.).
As goofy as Gemini sometimes gets it is far less frustrating than asking Claude 4 a question and watching it write out a whole ass answer but then correct itself like 7 times before finally coming to a shitty answer that is worse than 2 of the ones it wiped out while blowing through most of its context window on its loop of indecisiveness.
And relatedly... color me completely unsurprised that this thread got dumpstered off the front page so quickly. Gotta keep pretending like the singularity is going to happen next week.
:D
I recommend ignoring them. Despite VCs trying to spend it into existence, we aren’t going to have another internet level event in information technology and the smartphone+laptop combo is peak personal computing.
I think we are near the crest of this wave but that just means the next one is coming, though.
I am having a lot of fun learning about generative AI. It is just a bit thankless because I know the stuff I am building will be dead on arrival. So I will not get any praise regardless of how well I do my job, and may even get blamed.
But hey, after all the junior devs have been starved because no one wants to hire them, I will make bank once the next AI winter comes and companies desperately look for people who can actually code.
If you have your own company you can just ride it out and invest in good talent. Really a good position to be in.