HACKER Q&A
📣 pella_may

Go deep into AI/LLMs or just use them as tools?


I'm a software engineer with a solid background in full-stack web development. With all the noise around LLMs and AI, I’m undecided between two paths:

1. Invest time in learning the internals of AI/LLMs, maybe even switching fields and working on them

2. Continue focusing on what I’m good at, like building polished web apps, and treat AI as just another tool in my toolbox

I’m mostly trying to cut through the hype. Is this another bubble that might burst or consolidate into fewer jobs long-term? Or is it a shift that’s worth betting a pivot on?

Curious how others are approaching this—especially folks who’ve made a similar decision recently.


  👤 jillesvangurp Accepted Answer ✓
Depends on what you want to do. But my 2 cents are that, like all new technology, LLMs will become a commodity. Which means that everybody uses them but few people are able to develop them from scratch. It's no different from other things like databases, GPU drivers, 3D engines for games, etc. Those all involve a lot of hardcore computer science and math, but lots of people use them without needing such skills.

It probably helps a little to understand some of the internals and math. Just to get a feel for what the limitations are.

But your job as a software engineer is probably to stick things together and bang on them until they work. I sometimes describe what I do as being a glorified plumber. It requires skills but surprisingly few skills related to math and algorithms. That stuff comes in library form mostly.

So, get good at using LLMs and integrating what they do into agentic systems. Figure out APIs, limitations, and learn about different use cases. Because we'll all be doing a lot of work related to that in the next few years.


👤 antirez
My 2 cents:

1. Learn basic NNs at a simple level: build from scratch (no frameworks) a feed-forward neural network with backpropagation and train it against MNIST or something equally simple. Understand every part of it. Just use your favorite programming language. (A minimal sketch follows at the end of this comment.)

2. Learn (without having to implement the code or understand the finer parts of the implementations) how the NN architectures work and why they work. What is an encoder-decoder? Why does the first part produce an embedding? How does a transformer work? What are the logits in the output of an LLM, and how does sampling work? (See the second sketch at the end of this comment.) Why is attention quadratic? What is reinforcement learning? What are ResNets, and how do they work? Basically: you need a solid qualitative understanding of all of that.

3. Learn the higher-level layer, both from the POV of open-source models (how to interface with llama.cpp / ollama / ..., how to set the context window, what quantization is and how it affects performance and quality of output) and from the POV of popular provider APIs like DeepSeek, OpenAI, Anthropic, ..., and which model is good for what.

4. Learn prompt engineering techniques that influence the quality of the output when using LLMs programmatically (as a bag of algorithms). This takes patience and practice.

5. Learn how to use AI effectively for coding. This is absolutely non-trivial, and a lot of good programmers are terrible LLM users (and end up believing LLMs are not useful for coding).

6. Don't get trapped by the idea that the news of the day (RAG, MCP, ...) is what you should spend all your energy on. This is just some useful technology surrounded by a lot of hype from people who want to get rich with AI and understand they can't compete with the LLMs themselves, so they pump the part that can be kinda "productized". Never forget that the product is, for the most part, the neural network itself.
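
To make point 1 concrete, here is roughly the level of "from scratch" I mean: a minimal sketch in Python with NumPy only, where random data stands in for MNIST (wire in a real loader yourself).

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for MNIST: 28x28 images flattened to 784 features, 10 classes.
X = rng.normal(size=(1000, 784))
y = rng.integers(0, 10, size=1000)
Y = np.eye(10)[y]  # one-hot labels

# One hidden layer, small random init.
W1 = rng.normal(scale=0.01, size=(784, 128)); b1 = np.zeros(128)
W2 = rng.normal(scale=0.01, size=(128, 10));  b2 = np.zeros(10)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr = 0.1
for epoch in range(20):
    # Forward pass.
    h = np.maximum(0, X @ W1 + b1)   # ReLU hidden layer
    p = softmax(h @ W2 + b2)         # class probabilities
    loss = -np.log(p[np.arange(len(y)), y]).mean()

    # Backward pass: softmax + cross-entropy gives the clean gradient (p - Y).
    dz2 = (p - Y) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (h > 0)     # ReLU derivative
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)

    # Plain SGD update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    print(f"epoch {epoch}: loss {loss:.4f}")
```

Once every line of this makes sense, swapping in real MNIST, minibatches, and more layers is straightforward.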
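
For the logits/sampling part of point 2, the mechanics fit in a few lines (toy vocabulary and made-up logits, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["the", "cat", "sat", "on", "mat"]
logits = np.array([2.0, 1.0, 0.5, 0.2, -1.0])  # raw scores the model emits

def sample(logits, temperature=1.0):
    # Temperature rescales the logits; softmax turns them into probabilities.
    z = logits / temperature
    p = np.exp(z - z.max())
    p = p / p.sum()
    return vocab[rng.choice(len(vocab), p=p)]

print(sample(logits, temperature=0.2))  # near-greedy: almost always "the"
print(sample(logits, temperature=1.5))  # flatter distribution: more variety
```

Greedy decoding, top-k, and nucleus sampling are all small variations on this loop.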


👤 loveparade
I come from a more traditional (PhD) ML/DL background. I wouldn't recommend getting into (1) because the field is incredibly saturated. We have hundreds of new, mostly low-quality, papers each day. If you want to get into AI/ML on a more fundamental level, now is probably the worst time in terms of competition. There are probably 100x more people in this field than there are jobs, and most of them have a stronger background than you if you're just starting out.

👤 NitpickLawyer
> Is this another bubble that might burst

I see this a lot, but I think it's irrelevant. Even if this is a bubble, and even if (when?) it bursts, the underlying tech is not going anywhere. Just like the last dotcom bubble gave us FAANG+, so will this give us the next letters. Sure, agentsdotcom or flowsdotcom or ragdotcom might fail (likely IMO), but the stack is here to stay, and it's only gonna get better, cheaper, more integrated.

What is becoming increasingly clear, IMO, is that you have to spend some time with this. Prompting an LLM is like the old google-fu. You need to gain experience with it, to make the most out of it. Same with coding stacks. There are plenty of ways to use what's available now, as "tools". Play around, see what they can do for you now, see where it might lead. You don't need to buy into the hype, and some skepticism is warranted, but you shouldn't ignore the entire field either.


👤 joshdavham
I’d recommend you simply follow your curiosity and not take this choice too seriously. If you’re simply doing this for career purposes, then the honest answer is that absolutely no one knows where these fields will go in the next couple years so I wouldn’t take anyone’s advice too seriously.

But as for my 2 cents, knowing machine learning has been valuable to me, but not anywhere near as valuable as knowing software dev. Machine learning problems are much more rare and often don’t have a high return on investment.


👤 janalsncm
As an MLE I get a decent amount of LinkedIn messages. I think I got on someone’s list or something. I would bucket the companies into two groups:

1) Established companies (meta/google/uber) with lots of data and who want MLEs to make 0.1% improvements because each of those is worth millions.

2) Startups mostly proxying OpenAI calls.

The first group is definitely not hype. Their core business relies on ML and they don’t need hype for that to be true.

For the second group, it depends on the business model. The fact that you can make an API call doesn’t mean anything. What matters is solving a customer problem.

I also (selfishly) believe a lot of the second group will hire folks to train faster and more personalized models once their business models are proven.


👤 ednite
I think about this a lot. If you're early in your career, it must feel like you're staring at a technological fork in the road, with AI standing there ominously, waving both paths like it's the final boss in an RPG.

Between your two options, I’d lean toward continuing to build what you’re good at and using AI as a powerful tool, unless you genuinely feel pulled toward the internals and research side.

I’ve been lucky to build a fun career in IT, where the biggest threats used to be Y2K, the dot-com bubble, and predictions that mobile phones would kill off PCs. (Spoiler: PCs are still here, and so am I.)

The real question is: what are you passionate enough about to dive into with energy and persistence? That’s what will make the learning worth it. Everything else is noise in my opinion.

If I had to start over today, I'd definitely be in the same uncertain position, but I know I'd still just pick a direction and adapt to the challenges that come with it. That’s the nature of the field.

Definitely learn the fundamentals of how these AI tools work (like understanding how AI tools process context or what transformers actually do). But don’t feel like you need to dive head-first into gradient descent to be part of the future. Focus on building real-world solutions, where AI is a tool, not the objective. And if a cheese grater gets the job done, don’t get bogged down reverse-engineering its rotational torque curves. Just grate the cheese and keep cooking.

That’s my 2 cents, shredded, not sliced.


👤 mdp2021
Building AIs has always been an option - it's a (fuzzy, continuous with its complement) way of engineering things. Now we have a boom around the development of some technologies (some next-layer NN implementations).

If you are considering whether the future will boost the demand to build AIs (i.e. for clients), we could say: probably so, given the regained awareness. It may not be about LLMs - and at this stage it should not be, since they can hardly be made reliable and that can hurt your reputation.

Follow the Classical Artificial Intelligence course, MIT 6.034, from Prof. Patrick Winston - as a first step.


👤 y42
Depends on your goals. :)

If you're good at what you're doing right now and you enjoy it — why change? Some might argue that AI will eventually take your job, but I strongly doubt that.

If you're looking for something new because you are bored, go for it. I tried to wrap my head around the basics of LLMs and how they work under the hood. It’s not that complicated — I managed to understand it, wrote about it, shared it with others, and felt ready to go further in that direction. But the field moves fast. While I grasped the fundamentals, keeping up took a lot of effort. And as a self-taught “expert,” I’d never quite match an experienced data scientist.

So here I am — extensively using AI. It helps me work faster and has broadened my field of operation.


👤 mindcrime
3) Go back to school and study something that isn't done entirely on a computer and requires human physical presence (for now). Plumbing, electrical wiring, welding, etc. are options. Even if you don't make that your primary path, it never hurts to have a fallback plan JUST IN CASE some of the buzz around AI-fueled job displacement turns out to be valid.

Or, if you believe there may be some merit to the "AI is coming for your job" meme, but really don't want to do blue collar / skilled trades work, at least go in with the mindset of "the people who build, operate, and maintain the AI systems will probably stay employed at least a little bit longer than the people who don't". And then figure out how to apply that to deciding between one or both of your (1) and (2) options. There may also be some white collar jobs that will be safe longer due to regulatory reasons or whatever. Maybe get your physician's assistant license or something?

And yes, I'm maybe playing "Devil's Advocate" here a little bit. But I will say that I don't consider the idea of a future where AI has meaningful impact on employment for tech professionals to be entirely out of the question, especially as we extend the timeline. Whatever you think of today's AI, consider that it's as bad right now as it will ever be. And ask what it will be like in 1 year. Or 3 years. Or 7 years. Or 10 years. And then try to work out what position you want to be in at those points in the timeline.


👤 teleforce
Nobody is expecting you to be able to derive and write an automatic differentiation (AD) library from scratch, but it's always good to know the fundamentals [1].

Andriy Burkov has written an excellent trilogy of books on AI/LLMs, namely "The Hundred-Page Machine Learning Book", "Machine Learning Engineering", and the latest, "The Hundred-Page Language Models Book" [2],[3],[4].

Having said that, providing useful AI/LLM solutions for intuitive and interactive learning environments, training portals, standards-documentation exploration, business and industry rules-and-regulations checking, etc., built on open-source, local-first data repositories, is probably the killer application that's truly useful for end users; see the examples in [5],[6].

[1] Automatic differentiation:

https://en.wikipedia.org/wiki/Automatic_differentiation

[2] The Hundred-Page Machine Learning Book:

https://www.themlbook.com/

[3] Machine Learning Engineering:

https://www.mlebook.com/wiki/doku.php

[4] The Hundred-Page Language Models Book:

https://www.thelmbook.com/

[5] Local-first software: You own your data, in spite of the cloud:

https://www.inkandswitch.com/essay/local-first/

[6] AI-driven chat system designed to support students in the Introduction to Computing course (ECE 120) at UIUC, offering assistance with course content, homework, or troubleshooting common problems. It serves as an educational aid integrated into the course’s learning environment:

https://www.uiuc.chat/ece120/chat


👤 carbocation
When I was in my postdoc (applied human genetics), my advisor's rule was that you needed to understand the tools you were using at a layer of abstraction below your interface with them.

For example, if we wanted to conduct an analysis with a new piece of software, it wasn't enough to run the software: we needed to be able to explain the theory behind it (basically, to be able to rewrite the tool).

From that standpoint, I think that even if you stick with #2, you might benefit from taking steps to gain the understanding from #1. It will help you understand the models' real advantages and disadvantages, and help you decide how to incorporate them in #2.


👤 itake
Both are tough.

1/ There aren't many jobs in this space. There are still far more companies (and roles) that need 'full-stack development' than those focused on 'AI/LLM internals.' With low demand for AI internals and a high supply of talent—many people have earned data science certificates in AI hoping to land lucrative jobs at OpenAI, Anthropic, etc.—the bar for accessing these few roles is very high.

2/ The risk here is that AI makes everyone good at full-stack. This means more competition for roles and less demand for them (one inexperienced engineer with AI can now output 1.5x the code an experienced senior engineer could in 2020).

In the short/medium term, 2/ has the best risk/reward profile. But 1/ is more future-proof.

Another important question: where are you in your career? If you're 45 years old, I'd encourage you to switch into leadership roles for 2/; those won't be replaced by AI. If you're early in your career, it could make more sense to switch fields.


👤 JackDanMeier
From my perspective it's a bubble, very similar to the dot-com bubble. All businesses are integrating it into everything, often where it's unnecessary or just confusing.

But I believe that the value will come after the bubble bursts, and the companies which truly create value will survive, same as with webpages after the dot-com bubble.


👤 Jabrov
My recommendation would be to use them as a tool to build applications. There's much more potential there, and it will be easier to get started as an engineer.

If you want to switch fields and work on LLM internals/fundamentals in a meaningful way, you'd probably want to become a research scientist at one of the big companies. This is pretty tough because that's almost always gated by a PhD requirement.


👤 bloppe
AI is a scientific discipline. Software development is an engineering discipline.

Do you like science? Then dive deep into LLMs. Be ready for science, though. It involves shooting a thousand shots in the dark until you discover something new. That's how science gets done. I respect it, but I personally don't love doing it.

Do you like engineering? That's when you approach a problem and can reason about a number of potential solutions, weigh the pros and cons of each, and pick one with the appropriate trade-offs. It's pretty different from science.


👤 rikroots
I posted a recent Show HN[1] detailing why I felt the need to understand the basics of what LLMs do, and how they do it. Even though I've no interest in building or directly training LLMs, I've learned the critical importance of preparing documentation for LLM training to try and stop AI models generating garbage code when working with my canvas library.

[1] https://news.ycombinator.com/item?id=44079296


👤 petesergeant
Focussing on the inner workings of them may well end up being a type of programming you don’t enjoy: endless tweaking of parameters and running experiments.

Learning to work with the outputs of them (which is what I do) can be much more rewarding. Building apps based around generative outputs, working with latency and token costs and rate limits as constraints, writing evals as much as you write tests, RAG systems and embeddings etc.
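
An eval harness is barely more than a test suite that tolerates randomness. A minimal sketch (call_model is a placeholder; wire it to whichever client you actually use):

```python
# Each case pairs a prompt with a checker. Unlike unit tests, each case runs
# several times, because outputs are stochastic: you track a pass *rate*.

def call_model(prompt: str) -> str:
    raise NotImplementedError("plug in your LLM call here")

EVAL_CASES = [
    {"prompt": "Reply with only the word YES.",
     "check": lambda out: out.strip() == "YES"},
    {"prompt": "What is 2+2? Answer with a single digit.",
     "check": lambda out: out.strip() == "4"},
]

def run_evals(cases, n_runs=5):
    for case in cases:
        passes = sum(case["check"](call_model(case["prompt"])) for _ in range(n_runs))
        print(f"{case['prompt'][:40]!r}: {passes}/{n_runs} passed")
```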


👤 eric-burel
Hi, I am working on making the term "LLM developer" more popular in France and training people for this new job. We will need a bunch of them in the months/years to come to implement advanced AI systems, once companies manage to properly pick and set up their AI platforms. Currently people tend to put data scientists on this job, but data scientists are often less versed in the software engineering aspect, e.g. when they work more on notebooks than web apps. The job is akin to being a web developer: a "normal" developer, but specialized in a certain field. Knowing the internals of LLMs is a big bonus, but you can start your journey treating them as black-box tools and still craft relevant solutions. You'll need to learn about running systems with databases (vector, graph, relational, and NoSQL are all useful) and plugging multiple services together (Docker, Kubernetes, cloud hosting).

👤 JFingleton
3. Focus on leveraging AI to solve real world problems.

You don't need to deep-dive into the maths, but you'll need to understand the limitations, the performance bottlenecks, etc.: RAG, vector DBs, and so on.
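
For a flavor of what RAG plus a vector DB boils down to, here is a toy sketch. The embed function is a hypothetical stand-in for a real embedding model, and a production vector DB is essentially this dot product plus indexing at scale:

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Hypothetical: in reality this is a call to an embedding model.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=64)
    return v / np.linalg.norm(v)

docs = ["Invoices are due in 30 days.",
        "Refunds take 5 business days.",
        "Support is open 9-5 on weekdays."]
doc_vecs = np.stack([embed(d) for d in docs])

def retrieve(query: str, k: int = 2):
    scores = doc_vecs @ embed(query)   # cosine similarity (unit-norm vectors)
    top = np.argsort(scores)[::-1][:k]
    return [docs[i] for i in top]

# The retrieved chunks get pasted into the LLM prompt as context.
context = "\n".join(retrieve("How long do refunds take?"))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: How long do refunds take?"
```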


👤 silisili
IMO, you're a woodworker, a craftsman who builds solid products. You've been using a hacksaw and hammer all these years; now someone has invented the circular saw and the drill, and people can move a lot faster. And now even people who were previously inept are able to do woodwork.

Do you need to understand how the circular saw and drill are made?


👤 scrozart
Lots of good answers here about part of your question: is it hype or not? In the sense that it likely isn't going away, and is becoming a valid force multiplier, it's not.

However, this question is better answered by asking yourself what you're interested in. Do you _want_ a deeper understanding of AI/ML? If so, jump in. If you're not genuinely interested, it'll be an interminable slog, and you'll eventually revert to doing whatever you actually want to do.

Nothing wrong with continuing to develop web/full stack apps while leveraging the new tools; that's also quite interesting.


👤 dilsmatchanov
I believe you should do what you genuinely find interesting. Go for 1, dig into internals, read some papers, and see how it goes. Even if you decide not to get into ML/AI, learning how stuff works is always rewarding.

👤 xgb84j
When learning new stuff I consider 2 things:
- How much fun is the actual learning?
- Can I actually apply what I am learning?

So I would learn things that are either fun for you to learn or things that you can directly apply.

For AI this means you probably should learn about it if you are really interested and enjoy going through build-your-own-NN tutorials or if you have good chances of switching to a role where you can use your new skills.

Edit: Basically, investing anything (time included) is risky. So invest in things that either pay off directly or bring you joy regardless of the outcome.


👤 culebron21
Maybe I'm too late to the party, but here are my 2 cents.

AI/LLMs for web dev are like StackOverflow on steroids. Was it worth learning the inner workings of the SO site, or of Google, in order to become a good dev? I'd guess no.

LLM jobs, in this analogy, are like becoming a search engine developer. A completely different area, and a different career, I guess.

To upgrade your CV, and your skills, I would suggest learning another backend language, like Go or Rust, if they're not on your CV already. I chose Rust, and it boosted my understanding of what's happening in code. You might also choose Go and offer speed boosts for Python/JS backend code, as many employers are more willing to use it than Rust. I hesitated to do that, and it turned out I was wrong: I could have offered impressive speedups at one of my jobs, where background jobs would run for hours.

This can be any tech that is nearby but feels like it's beyond a barrier, so people hesitate to try it.

As for other jobs, take a look at college/uni teaching: it brings status (and, if you do it more, even tangible titles) and builds your social network, which will yield crops in 5-15 years.

Another option is anything that requires certification. This is what people hesitate to try, or fear failing the tests for. Jobs protected by certificates are supposedly more secure. And learning as an adult is much cooler, because by now you've seen yourself learn things on your own and can apply those methods to the process.


👤 xiphias2
It's your choice, but it's definitely not "just another tool".

Most LLMs I used made lots of mistakes, but Codex with the $200 subscription changed my workflow totally, and now I'm having 40 pull requests a day merged.

Treat LLMs as interns: increase your test coverage with them to the point that they can't ruin your codebase, get really good at reviewing code and splitting tasks up into smaller digestible ones, and promote yourself to team leader.


👤 jll29
LLMs are part of soft computing, i.e. contrary to traditional (algorithm-based) computing, sometimes you won't get the right result (or, just as bad, the right result in the wrong format). Engineering solutions with LLMs involves a lot of fiddling, which is experimental rather than analytical/logical.

It is worth getting used to that mindset, and then using LLMs as a tool (they are likely here to stay, because big tech has started to integrate features based on them everywhere, for better or worse). So this is your option (2.). Personally, I prefer the software I use NOT to be smart, but to be 100% deterministic.

But already my favorite LaTeX authoring environment (Overleaf) has a "fix this" button that pops up and auto-resolves syntax errors, many of which overwhelm my doctoral students, who no longer read books end-to-end (here, to learn LaTeX).

Gradually, you may dive deeper into the "how", driven by either need or curiosity, so eventually you will probably have done both (2.) and (1.), in the same way that you first learned SQL before learning how replication, transactions, data buffers and caches, or query optimizers are implemented.


👤 elAhmo
I don't really understand why you would even consider going deep if this is not something you have experience with or a strong interest in. Sure, it is good to know how some things work under the hood, but you can be a perfectly capable developer by reading docs and knowing how to use tools, without needing to know how to write them or how they do complex things under the hood. Take databases as an example, or the network stack, etc.

Just because a field is popular doesn't mean you should switch to going deep into it. But that doesn't mean you shouldn't use it: it costs a few dollars to try it out and see whether it fits your workflow. If it does, great, and you can be more productive and focus on the stuff that you can solve and an LLM can't; if it doesn't, that is fine too.


👤 pors
I had the same question and decided to get into the basics at least. I highly recommend the fast.ai course.

👤 YseGuy74000
To be realistic and in the know: judging by current development, we are still 50 years from stable LLMs.

Data drift: over the course of a few months the data deteriorates and the LLM ceases to function in a worthwhile manner.

Currently most LLMs are based upon the core premise that people should not believe anything. This is tokenized above everything else. Then there are other erroneous tokenizations. Although these are not fully documented, people use these tools anyway. You should know what you are getting.

Tokenization: different words are tokenized to have a higher value than other words or configurations.

So, these are the dangers that everyone has ignored. It makes it an unethical tool, because it is based upon someone's erroneous views. Honesty is the best policy. If it can't be honest, how can you trust it?


👤 blueboo
To build great AI products you need to be a fluent, deeply engaged user AND understand how they work and how to bend them to your use case beyond simple prompting.

We’re in a funny moment. Right now, AI tech is so powerful and capable that people are categorically underestimating its value and systematically underusing it, whatever the hype is signalling. If the tech froze right now, there are decades of applications to mine.

Lots of great products being built on that thesis. The strategy is: unlock more of their present capability, harness that for a wider audience’s use case.

In that way you do both — leverage the tools, and in becoming an expert user, you can find yourself a vendor of very valuable guidance — and a builder of desperately sought-after products.


👤 layer8
Focus on what you're good at. Don't switch fields just because of a vague fear that your job will become obsolete (it won't; instead it will evolve). Understand the rough basics of how LLMs fundamentally work (e.g. 3Blue1Brown's videos). Evaluate and use AI as a tool, and get a general feel for what it can and can't do. Even that can become too much of a rabbit hole. Today's prompt engineering and AI workflow techniques may become obsolete in just a few years, and there is a risk in getting caught up in the month-to-month AI tooling developments. Successful techniques will spread quickly enough. Currently people still do a lot of experimenting to find out what's workable.

👤 josefrichter
Well, it's easy: dive into 1 and you will see if you like it and persist. I don't think it's a bubble: the benefits are obvious and immediate, and I don't think there's a single developer on the planet doing 2 and not using AI tools.

👤 jerpint
Interestingly, I'm on the other side of things: I've been training NNs for quite some time and have lately been dedicating much more time to becoming more full-stack, because most tasks these days can be solved with LLMs, which are available to pretty much anyone. It does help to have a good grasp of how things work, especially embeddings, tokens, logprobs, and the like, but I am still impressed at how accessible and good the tooling around LLMs has become.

You don’t need much expertise in NNs to still be able to get huge value out of them today


👤 yubblegum
I am thinking about the same thing. There are actually 3 areas related to the LLM matter:

1 - The magic box itself.

2 - LLM Whispering

3 - Tools/Products/Infrastructure to support (a) R&D in (1) and (b) devops in (2).

I think for an experienced backend (distributed, streaming, large data, ...) SWE type, option 3 is the optimal way to go. Option 2, becoming an LLM whisperer, is obviously the biggest job source, but that space is guaranteed to be filled with phonies and charlatans. Option 3 is solidly within (LLM-neutral, really) software engineering.


👤 px1999
Consider this (possibly very bad) take:

RAG could largely be replaced with tool use to a search engine. You could keep some of the approach around indexing/embeddings/semantic search, but it just becomes another tool call to a separate system.
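
In code the difference is small: instead of you pre-stuffing retrieved chunks into the prompt, the model asks for a search via a tool call. A schematic sketch with hypothetical call_llm and search stubs (not any particular provider's API):

```python
import json

def call_llm(messages: list) -> str:
    # Stub: returns either a final answer, or a JSON tool request
    # like {"tool": "search", "query": "..."}.
    raise NotImplementedError

def search(query: str) -> str:
    # Stub: hit your search engine and return formatted results.
    raise NotImplementedError

def try_parse_tool_call(reply):
    try:
        data = json.loads(reply)
        return data if isinstance(data, dict) and data.get("tool") == "search" else None
    except (json.JSONDecodeError, TypeError):
        return None

def answer(question: str) -> str:
    messages = [{"role": "user", "content": question}]
    for _ in range(5):  # cap the tool-call loop
        reply = call_llm(messages)
        request = try_parse_tool_call(reply)
        if request is None:
            return reply  # model answered directly
        # Model asked to search: run it and feed the results back.
        messages.append({"role": "assistant", "content": reply})
        messages.append({"role": "tool", "content": search(request["query"])})
    return reply
```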

How would you feel about becoming an expert in something that is so in flux and might disappear? That might help give you your answer.

That said, there's a lot of comparatively low-hanging fruit in LLM-adjacent areas atm.


👤 WA
To piggyback on this discussion, what do you all think about option 3:

Work for companies (as a consultant?) to help them implement LLMs/AI into their traditional processes?


👤 bjourne
LLMs are a subset of sequence learning, which is a subset of ML, which is a subset of AI. But even within the LLM subset, the field is so deep and vast that you can only learn one small subset. It's a subset of a subset of a subset of a subset of a... you get the idea. Do you like math? Then pick one subset. Do you like hardware? Pick another.

👤 fructacean
You can call it a bubble to the extent that only major competitors like OpenAI, Google, maybe even Anthropic (I doubt it) will remain and lead the arms race. Their tools grow in power and will soon wipe out most of the AI startups, but it's still important to understand how any of this works.

👤 christophilus
I say, play with and explore the subjects that interest you. Enjoy the process. Don’t worry so much about the hype or career trajectory. It never hurts to learn and grow, as long as you’re enjoying yourself.

Worst case, you’ll be a more interesting, well-rounded, and curious person with a broad set of programming skills.


👤 cess11
Focus on immediate profitable problem solving or do a PhD.

Over like five years we've been promised a revolution that has yet to appear and is still lighting billions of dollars on fire. Don't bet on it materialising tomorrow.

If you need comforting, go read Merleau-Ponty and Heidegger, perhaps condensed down as Hubert Dreyfus.


👤 indrex
I develop AI for a living and I don’t understand the internals of it either, just as I don’t understand the internals of Intel architecture. My job is to build, not to fit information into my mind.

👤 demirbey05
https://stanford-cs336.github.io/spring2025/

Just learn this and its prerequisites


👤 bigstrat2003
Option 2 for sure. Make use of them if you find them useful, or don't if you don't. Personally I find LLMs to be pretty much useless as a tool so I don't use them, but if you get use out of them then more power to you (just be careful that their inherent unreliability isn't costing you more effort than they save). I think you should in no way consider option 1 - this is very much a hype bubble that is going to burst sooner or later. How much later I can't say, but I don't see any way it doesn't happen. I certainly wouldn't advise anyone to hitch their career to a bubble like that.

👤 binary132
If you’re thinking your craft is going to get consumed by LLMs, what reason do you have to believe that LLMs will not eventually take over that craft too?

👤 tom_m
Tools. And read their output. You will save time, be more productive, and actually learn something.

If you don't read, you won't learn, and you'll get things wrong.


👤 seydor
The algebra behind neural networks and LLMs is surprisingly simple, and the rest of it is scaling tricks. So, por que no los dos?
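
For a taste of how simple: the attention operation at the heart of a transformer is just the standard

```latex
\mathrm{Attention}(Q, K, V) = \operatorname{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
```

and the n-by-n QK^T matrix over a length-n sequence is exactly where attention's quadratic cost comes from.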

👤 neom
Which one sounds more interesting to you, and why?

👤 pathikrit
Do you need to know the internals of a database, or do you just use it as a tool? If you hit some bottleneck, you go deep.

👤 wseqyrku
> Go deep into AI/LLMs or just use them as tools?

Are you asking to get a PhD or use them as tools?


👤 insane_dreamer
Tools. You don’t need to be an EE and IC expert to use a computer effectively.

👤 slotrans
Neither. LLMs are destructive.

👤 zerr
I leave that to statisticians.

👤 tmsh
Hot take: based on how fast models are accelerating and replacing large parts of the development process (my two cents: https://x.com/TomHarada1/status/1926193211678023953), I think more and more you want to work backwards from a world where AI does 90% of things. Script kiddies : prompt engineering :: current engineering : future of engineering. Either path makes sense, going deep in research or in development. One is the kernel and one is the rest of the egg.

👤 binary132
emdash detected, please step into the Voight-Kampff booth.

👤 yapyap
You could definitely do 1. if you have the mental patience to surround yourself with the grifters of the AI world and the moral ambiguity to do that work.

It’s up to you where your morals lie and how important money is compared to those morals, but it seems like AI is here to stay.


👤 achooie
Even