HACKER Q&A
📣 greenleaf3

As a developer, what are your aha moments?


To me, the aha moments were as follows:

1. When I was introduced to QBasic

2. When I got to know how simple and amazing Lisp is

3. When I was able to code at the speed of my thoughts with VIM

4. When I got to know Express.js (after learning Django)

5. When I knew that everything in Smalltalk is a message, including if/else statements


  👤 haroldegibbons Accepted Answer ✓
My "aha" moment was realizing most of my ideas and most apps out there are complete garbage. Not needed. Damaging, even. 99.9% of all of it.

For example, most "cutting edge" web apps are better off as PHP monoliths. Facebook was a PHP file for a long time. But most apps in general should never make it past being shell scripts, which are better off staying as spreadsheets or better - text files which are better off as pieces of paper whenever possible. And all paper is better off left as thoughts whenever possible, and most thoughts should be forsaken.


👤 jdmoreira
1. When I realized programming was hard

That’s it. For a long time I thought I was good at programming and I kept making the same mistakes over and over. All-nighters, leet coding, runtime hacks, etc...

And then one day it struck me. It’s hard and my mind can’t keep up with yesterday’s cowboy coding.

Slowly I started putting defense mechanisms everywhere - good type systems, immutability, compile time instead of runtime, never overworking, and sleeping the best I can.

Life is much better now and I'll never go back to thinking I'm good at programming.


👤 kleiba
1. When I realized programming was easy

That's it. For a long time, when I thought of the hardest stuff to program, I was thinking of computer games. Mind you, not the latest AAA billion-dollar-budget, 300+ people on the team kind of games. No - given my age, I was thinking about single-programmer games on a C64, where one person did the code and the graphics and the sound. To me, it was incomprehensible how one person alone could make a machine, especially with those hardware limitations, do all that.

And then one day it struck me. I had started to dabble with programming in my early teens and then never stopped but, I guess, had gotten better at it over time.

Like with every process that is evolutionary rather than revolutionary, there's a good chance of losing track of your progress. I remember clearly that after finishing college, I really thought that everything I knew about computers and programming I had already known even before I entered university. But then one day, a freshman asked me a question about a programming assignment in their class, and it was completely trivial to me. Yet, when I was a freshman, I would probably have asked the same question of someone else.

And yet, that was not my aha moment. It still took many years for me to realize that my idolization of game programmers was perhaps a bit much. Mind you, I do realize of course that someone specialized in any area will be able to produce better code than someone who's not - for any definition of "better". I'm still not a game programmer and never will be, so I still have the highest respect for their profession. But I do realize now that there's no otherworldly skill that separates the game coder from the crop of all other programmers. Anything is within reach - what you need is not some innate talent, it's just dedication.

Life is somewhat better now and I'll never go back to thinking I'm not good at programming.


👤 gonzo41
1. No one working in tech actually wants to solve a problem and move on. The problems are too fun to leave alone once there's an MVP. Enter stage left, all the framework churn. Adoption of graph databases that don't fit the problem. All because the developers are bored and not business aligned.

2. Knowing the business is key to having personal buy-in for work. If you work in a bank writing bank software, understand the bank and banking so you understand the context of the software. It's a real 10x thing to know why a requirement is a requirement.

3. The software you write will live a lot longer than you planned. Your experiments from 1. will haunt you.


👤 lioeters
A few that come to mind:

- Implementing small languages, and how it inevitably leads one to a deeper understanding of all the layers that make computing and programming possible

- The Make-a-Lisp project and the language-agnostic conceptual design at the heart of it - https://github.com/kanaka/mal/blob/master/process/guide.md

- Declarative nature of React (the view as function of state); using centralized unidirectional state management and actions

- Test-driven (and, similarly, type-driven) development

- TypeScript - Benefits and costs of type definitions, dynamic/gradual and structural typing; power of a language integrated with the editor environment

- Virtual machines, QEMU, and later Docker - Commoditized, reproducible builds for applications and their environments

- Build pipeline scripts


👤 DanielBMarkham
1. That programmers were a market, we weren't just a bunch of nerds sharing cool stuff with one another. That speaker at the conference? The one talking about Wheezle-snort 7.0? Yeah it sounded awesome, but it was supposed to. It's a sales pitch, even if the software is free (Actually, especially if the software is free)

2. That programming and making stuff people want are two entirely different things, although programmers always assume that whatever you're building, it's the right thing. You can spend your entire career in programming, learning all sorts of goodness about things like Erlang innards, and never really understand what your job is.

3. That all of that mousing around, learning a new IDE every two or three years when I started was a complete waste of time. I automatically assumed that the more cool and shiny the programming environment, the easier it would be to code and the better my code would be. But no. grep and awk work (mostly) the same way now as they did 30 years ago, and any time I spent learning which hotkeys did that work in some long-lost dev stack was a complete waste of time.

4. Conversely, that UX beats internal architecture, every time. If folks are having a blast using your app, you win, even if it crashes every five minutes (How they could have fun if it crashed every five minutes is a good question. You might want to ask them)

5. The more smart people you throw at a programming project, the bigger mess you end up with. I know when I say that people are thinking of "The Mythical Man-Month", but it goes deeper than that. Even if you somehow manage to stay on-schedule, human communication around innovation stops working at a certain scale, and that looks like a hard limit. There are ways around it, using things like Code Budgets, following CCL, layers and DSLs, but nobody does that, so it doesn't matter. We, as an industry, have absolutely no idea how to staff or run projects.

ADD: One thing that was quite profound that I discovered late: if you code well, the simple act of programming can tell you something about the way you're reasoning about the problem that you couldn't learn any other way. Programming is by no means a simple one-way street where ideas come out of the head of the programmer and end up as bits on the tech stack. It's very, very much bidirectional. Our programs influence us as coders probably much more than we influence them.


👤 cafemachiavelli
1. More of a CompSci aha: Hard problems are hard. Sometimes I get frustrated over getting stuck in a bad local optimum that I can't easily get out of. But then if I take a step back, I notice that the problem is NP-hard, that I'm not gonna solve it by throwing random heuristics at it and I need to either change the constraints to make it P, lower the input size, or lower my expectations. Simplest example: Deciding what goes in which shelf in my home is NP-hard. Solution: Get rid of stuff more liberally -> (n' < n) -> f(n') is much easier than f(n)

2. Doing NAND to Tetris the first time. It taught me not only how computers work, but how powerful recursive layers of abstraction can be. I had absolutely no idea what my system looked like at the RTL level, but I was still able to build it.

3. Also Lisp. I wish Nand to Tetris had picked something more lisp like in the second half to show how simple and powerful it is.

4. When I realized that just coding something - alternating writing and testing - is much quicker than writing a bunch and then debugging it. For bigger projects, setting up CI/CD early can similarly save headaches.

5. The functional big three (map, filter, reduce), but for me even more so closures. I had gotten stuck trying to hardcode coefficients for a polynomial until I noticed I'd end up with a lot of duplicate code, which was what I was trying to avoid with FP in the first place. Then I realized I could just put the polynomial function itself in a closure and call it with the coefficients I wanted, when I wanted.
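To make the closure point concrete, here's a minimal TypeScript sketch (makePolynomial and the sample coefficients are made up for illustration): the returned function closes over the coefficients, so nothing has to be hardcoded.

  // Build a polynomial evaluator by closing over its coefficients.
  function makePolynomial(...coefficients: number[]): (x: number) => number {
    // The returned function captures `coefficients` from the enclosing scope,
    // and evaluates sum(c_i * x^i) with one of the "big three" (reduce).
    return (x) => coefficients.reduce((sum, c, i) => sum + c * x ** i, 0);
  }

  const p = makePolynomial(1, 0, 2); // represents 1 + 0x + 2x^2
  console.log(p(3));                 // 1 + 0 + 18 = 19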


👤 melvinroest
Writing a computer graphics engine during university taught me a couple:

1. I can use math, intuitively, even if I don't know how the exact calculation works.

2. I can use a debugger and replay it again and again to understand what's happening.

3. I can diagram a high level design and not have the complete context in mind and still produce a 10K LoC OpenGL powered computer graphics engine.

After that project, programming never felt impossible to me anymore. After that project, I always had some confidence of being able to learn whatever I needed, as long as I have enough time.

Here is the engine (2013): https://www.youtube.com/watch?v=PH6-dLvZEiA&t=1s&ab_channel=...


👤 superasn
My aha moment was that it is 100x better to focus on one project and make it the absolutely very best instead of trying 10 things.

With these new and easy frameworks it's incredibly easy and quick to create new projects but creating and launching something is just the tip of the iceberg.

The main work comes after that, i.e. traffic, leads, conversions, optimization, etc., which unfortunately can only be done with good old tedious hard work and laser focus.


👤 hirundo
When I learned about the repl. For me 90% of debugging is about figuring out how to break at the crucial line and then shining the repl light on it. 75% of writing new code is about trying stuff in the repl and then stepping through the code in the repl and testing everything.

For me that quick feedback loop makes coding as fun as gaming.


👤 brainwipe
1. Seeing a company do agile right by keeping the customer so close they're almost in-house. All the other stuff is meaningless.

2. Realising I didn't have to code for faceless corporates shifting cash around; I could work for a small company actively trying to make people's lives better.

3. Breaking neural networks out of a predefined topology and letting them grow.

4. If it changes during runtime, put it in the database. If it changes per environment, put it in config. Everything else in code.

5. That programming is a craft and you must treat it like one; just bashing out code to fill a brief isn't enough.

👤 mcv
When I realised that programming is just taking a big problem and chopping it into a couple of smaller problems, and then repeating that until your problems are trivial.

The first time (1993) I was logged in on a remote machine and downloaded a file to it from another remote machine. It was just magic that the machine I was touching had nothing to do with that file.

When I hacked together a mailinglist system in shell script.

When I realised that C is basically just all memory locations.

When I understood closures.


👤 semicolonandson
Becoming proficient with tools that allow me to pry into running processes and see what they are doing under the lid

e.g.

lsof to see files and ports opened up by a process

strace to see what system calls to the kernel a program is making (and ltrace for library calls)

pstree to see subprocesses spawned

gdb to inject myself into infinite loops of programs written in C-based implementations (e.g. Ruby)

I'm a collector of aha-ish moments so if you want more, my YouTube account is linked to in my profile.


👤 namuol
1. Even though it requires precision, the vastness of the solution space effectively makes programming a mushy, tangled mess of technology.

2. End of list.


👤 baron816
I went from Ruby to JavaScript development. First, I’d like to say I think starting with Ruby/Rails is a bad idea. Ruby uses a lot of higher-order functions, but it isn’t super clear from the syntax how that all works. Higher-order functions in JS are much more clear, IMO, on account of having to use parens to call the function (in Ruby, you don’t need parens to call a function; you can just name the function and delineate the arguments with spaces).

So with Ruby, I was just pattern matching—I never had an intuition of what lambdas were doing until I moved over to JS.

The Feynman Method was another aha moment. Learning how to learn, in my case, by writing down stuff in my own words as if I were teaching someone else was probably the most important thing. That helped me develop a much deeper understanding of important concepts.

Similarly, not trying to learn everything was a big deal. Don’t try to learn all the niche technologies you see in job listings expecting you need to know those things in order to qualify for the position. If you just do dozens of tutorials, you’ll end up knowing a lot of things very superficially. It’s much better to know a handful of ubiquitous, related technologies very well and have a strong foundation of programming fundamentals. Those things can transfer over to other technologies, when you need to pick them up.


👤 Regenschirm
- My team lead showed me code I wrote just two years ago, and at first I didn't believe I had written something that crappy.

- When I realized that someone might quit his/her job during the trial phase. It had not occurred to me that both sides are checking whether it's fun/good for them.

- The main reason I'm successful is that, despite my character flaws (which have gotten better over the last 15 years), good software engineers are in very strong demand. When I look at how hard it is to get good people, I just might never have a really hard life.

- Soft skills are crucial: taking responsibility, being on time, being reliable, taking action when it matters without hesitation.

- You had a salary negotiation or a discussion and something was decided? You can still come back to it 1-x days later and say, 'You know, I thought about it and I'm not happy with the outcome at all. We need to discuss this topic again.'

- Don't complain if shit is shitty. Either change it, try to change it, accept it, or quit. Stop telling others that it's shitty while doing nothing.

- Estimation is bullshit: it never works, never aligns, no one really retrospects it, and if you ever find a team where it works, the team gets dismantled for whatever reason and you have to start from zero. Prioritise for relevance, optimize how you work, accept the outcome.

- Never accept a deadline. Without a deadline, your manager can't come back and say 'you promised', which leads to you doing overtime for a mistake your manager made: he/she mismanaged!

- Do less, but better. Whatever you do shoddily now will come back.

- Not building something, because you actually figured out what the other person needed/wanted, is more often than not the better result if it leads to writing less code.

- Understanding how you program a computer game, for three main reasons: 1. memory allocation can fail, 2. how the game loop works and how to program it, 3. randomness.

- Sentences to know:

-- I can try to get it done, but I can't promise.

-- I have a date tonight, I can't stay. (If they insist:) I have expensive tickets for ... (pressuring you into doing overtime just because is mismanagement.)

-- No


👤 daxfohl
PowerShell? (I've done Windows most of my career, but "shell" would apply anywhere.)

It always seemed so lame compared to neat new functional languages or distributed actor frameworks. And, not being much of an ops person, I almost wanted not to be any good at powershell (or any shell) so I wouldn't get ops assignments.

After getting familiar with it, though, it has improved my workflows and ultimately my quality of life. I'm not only more fluent in ops now, but I also get to spend less time on it, which is exactly what my rationale for not learning it was meant to achieve (write a script once and never have to remember what I did when creating a new environment).


👤 gitgud
Learning the Unix Philosophy [1]. Writing & using small tools that can be piped together... was a huge AHA moment for me.

[1] https://en.wikipedia.org/wiki/Unix_philosophy


👤 medymed
When I found that interfaces/abstractions were way more critical to understand than recursion or obscure algorithms for most real projects. Half of the battle of starting (or understanding) any project is clarifying the interfaces in which one's work should be nestled.

👤 philihp
My favorite all-time aha moment was coming from a CVS world and learning how to use git. Oh, you just hash the entire lot of the files together and use that as a version for the repo? Aha! The change set should be for all the files together!
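A toy TypeScript sketch of that idea (this shows the content-addressing principle only, not git's actual object format): hash each file, then hash the combined listing, and the whole tree gets a single version.

  import { createHash } from "node:crypto";

  const sha1 = (s: string) => createHash("sha1").update(s).digest("hex");

  // One hash versions the whole set of files: change any file and the
  // tree hash changes, so the change set covers all files together.
  function hashTree(files: Map<string, string>): string {
    const entries = [...files.entries()]
      .map(([path, content]) => `${path} ${sha1(content)}`)
      .sort(); // deterministic order
    return sha1(entries.join("\n"));
  }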

👤 ralphc
Mid 90s, when I was introduced to Java after C++. With the combination of garbage collection vs malloc/free, and just seeing that Java gave you 80% of C++'s features with 20% of the complexity, I knew that it was going to be big and take over C++ as the language for company and enterprise development.

👤 tootie
I think the simplest realization is that the compiler is always right. Yes, technically, compilers can have bugs but 99.99% of the time the bug is yours. I used to stare at non-working code and think "This should work!" But that's never true. If it doesn't work, I made a mistake.

My next realization was that a well-defined and clearly communicated product definition is 10x more important than good coding.


👤 sedeki
1. When I realized that there's a thing called "computer science", and you can actually study things related to programming. It's not just ad hoc tricks written by experienced and bearded C programmers. You see, I'm self-taught from an early age, and learnt programming via skimming "VB for Dummies" when I was 8 y.o. or so. I didn't know anyone at all that knew how to program. It would take another decade before I met someone who knew how to program.

2. When I read K&R at the age of 14. Before that, I had just followed random tutorials that I found on the internet. From that point on, I went straight to the source (no pun intended) when it comes to learning new stuff.


👤 qw
1. Talking to other developers.

Sometimes using "uncool" technologies and languages is ok, and they often have a good reason.

2. The concept of "innovation tokens" if you solve a business need

https://mcfunley.com/choose-boring-technology

3. There is no silver bullet.

Learned the hard way after switching to the new "flavour of the month"


👤 ChicagoDave
1. Polymorphism

2. Vendors are often pushing bad architecture

3. Architects often push things for the wrong reasons

4. Companies often push risk to their vendors, avoiding collaboration, inherently increasing risk

5. ORMs hide business logic, preventing a company from understanding its business and adapting to change

6. Relational Databases are an operational anti-pattern

7. Graph databases are excellent tools for common problems like security, social media

8. Document databases are excellent tools for Event Stores and CQRS

9. Cloud native development (FaaS) is awesome

10. Domain-Driven Design and Event Storming are excellent disciplines to add to a corporate development group

11. Corporations inherently don't understand Agile because they have to measure everything through strong process definition


👤 LaundroMat
While I was reading "Eloquent JavaScript", I came across this absolute eye-opener (to me at least): https://eloquentjavascript.net/07_robot.html#p_MtO6TwqB5I

In this chapter the author demonstrates that it is wrong to solve a problem by creating a series of objects that do stuff. He shows that the right way is by abstracting the problem into its most essential elements. Do not try to emulate reality with code, but make it its own, abstract thing that solves the issue instead.

This chapter and its code felt like pure poetry to me.


👤 trumpeta
1. Use ADTs instead of OOP, always; a.k.a. composition over inheritance

2. No ORMs

3. No Frameworks, use libraries to build up your project

4. Use strong type systems for modelling (make invalid states not representable)
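For point 4, a minimal TypeScript sketch of making invalid states not representable, using a discriminated union (RemoteData and its states are hypothetical names for illustration):

  // A request can never be both loading and holding data: each state
  // carries only the fields that are valid for it.
  type RemoteData<T> =
    | { state: "idle" }
    | { state: "loading" }
    | { state: "loaded"; data: T }
    | { state: "failed"; error: string };

  function render(r: RemoteData<string>): string {
    switch (r.state) {
      case "idle":    return "nothing yet";
      case "loading": return "spinner";
      case "loaded":  return r.data;  // `data` only exists in this arm
      case "failed":  return r.error; // likewise `error`
    }
  }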


👤 bfuclusion
When I stopped coding, and started thinking. Turns out hacked together stuff is slower to produce than thinking overnight and getting it right.

👤 Raed667
When React/Redux finally "clicked" in my head. Coming from an MVC world, it was quite a paradigm change.

👤 jacobwilliamroy
When I learned what a kernel was, and the implications of proprietary closed source kernels, I was instantly radicalized to the cause of gnu and libre software.

👤 dinkleberg
Finally understanding the appeal of statically typed languages after doing a project in Go using GoLand.

I always thought I loved the freedom of Python and JavaScript (and I still do to some extent; you’ll have to take Django from my cold dead hands). But the power of static types becomes super clear when you’re using a great IDE.

I can hit a hotkey when hovering over any function and it’ll show me the docs, and take me to the full page if I want. With another hotkey it’ll auto-fill parameter options. It’ll auto-import the libraries I need. It’ll complain if I’m using the wrong type.

It gives me so much more confidence that the code I’m writing will actually work. It’s slightly more of a pain to write, no doubt about that, but the payoff is huge.

I was burnt by Java in the past and can’t stand how every project seems to end up like this https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris.... But Go has shown it doesn’t have to be like that.


👤 Oras
I always thought programming was my passion, as I never get tired or lose motivation when I code, until I realised that my passion was DIY and programming was just a tool to create things.

Starting from there, I don't take any programming language seriously anymore. Whatever makes the job easier and faster to accomplish in context.


👤 temporallobe
When I finally understood that browsers only understand HTML, JavaScript, and CSS. Yes that’s a simplification, but it’s essentially true. Early on in my career when I started more front-end development, I was under the impression that there was somehow much more going on under the hood, i.e., more languages, executables, ways of setting styles and layouts. When this finally clicked with me, it was all so much clearer and immediately less intimidating. I’m still surprised to this day that even many experienced developers think that browsers can natively interpret SCSS, TypeScript, or whatever templating language they’re using — heck, some people even used to call jQuery a “language”.

As a relatively new Clojure developer, the meaning of the parentheses was the ah-ha moment. I imagine it’s the same for other Lisp-y languages.


👤 Antoninus
Making the realization that I only do this for money and attaching my self-worth to my output or current role is futile.

👤 mekster
What makes money is good business sense with shitty developer skills, not good developer sense with shitty business skills.

Of course, being purely technically talented is one way to go as I have chosen to be.


👤 manjana
Trying to master non-framework CSS and realizing how dead simple styling a website can be when you are using a framework instead of trying to reinvent the wheel all over.

👤 philholden76
Not really an 'aha' moment, more of a dawning realisation as I gained experience.

When starting out as a developer, I think there is a tendency to see the particular language/framework/syntax that you're using as all-important.

Over time, and with experience, you realise that the language and syntax are just the "fine detail" of how you solve problems as a software developer: as your understanding deepens, it's as if a kind of abstraction happens in your brain, because you stop thinking so much about the fine details of language and syntax, and start to worry about things like managing complexity and optimising design etc.

And at that point, the realisation for me was that I can apply that experience to any language/syntax/framework, which frees me up to pick the best way of solving any particular problem, and to not be stuck in a rut with any particular technology.

An added benefit is that a lot of the debate over stuff like "language X is better than Y", or "this code style is correct and that one is wrong" become unimportant, because you're thinking at a level that's not limited by specifics.


👤 vincent-manis
- when I read Clark Weissman's 'LISP 1.5 Primer' in 1968

- when I finished a student programming assignment a year later and realised that a well-chosen set of procedures constituted what we'd now call a DSL

- when I ported Martin Richards' very clean BCPL compiler and realized that you could write efficient code in something that wasn't Fortran

- when I read the Thompson/Ritchie paper on Unix

- when I completed my first reading of SICP


👤 bilinualcom
When I actually understood the idea behind functional programming, especially being able to think in terms of recursive functions rather than loops.

👤 BiteCode_dev
Programming is using conventions. Lots and lots of it.

Conventions for syntax, names, logic, API, structure, vocabulary and so on.

Conventions are by nature arbitrary, influenced by culture, history, social behavior and a whole lot of human weirdness.

Don't try to learn all the conventions first, it comes faster with practice and exposure to it. Solve the problem, then find the conventions you need to apply the solution in your context.

The beginner's paradox is that they need to learn a bit of specific conventions (e.g. part of a language, one paradigm, and a few libs) to start working on solving a problem, so it's a frustrating experience.

There is no easy way to build up the rest of the conventions from what you know now, because it's all artificial.

It's also what leads people to say "don't learn languages, learn to program". Which makes no sense to you at all, until, well, you finally know how to program. But you got there by learning conventions on the way.


👤 animesh
1. When I understood everything that happened in a custom d3 chart I created - the d3 patterns, animation, SVG - everything else just paled in comparison.

2. When I finally gave in to Power BI and fixed a report someone else created, it felt so good. I don't think there is anything that compares to it, and I've only scratched the surface.


👤 majky538
Spending so much time in meetings, especially the awesome agile scrum standups. Even now, working at probably my 5th company, business processes do not work well and people deal with the same problems of work organization. For me as a developer, this kills my productivity and annoys me sometimes when it's just too much in a week.

👤 Jtsummers
I've had a few, but I'll pick one in particular. I got started with programming via BASIC (various forms), none of which supported recursion. I didn't really know what it was, but knew that I just wanted to call into the same routine again from itself or via mutual recursion (again, didn't know the term). This didn't work as expected, however. Either the program wouldn't compile/execute (threw an error at the recursion) or the recursion just corrupted the data (each call shared the same local data, so they'd clobber each others' work). While playing around on a TI calculator, I programmed something (can't remember the details) where I created a stack using the list data structure. I then looped (rather than recursed), pushing data elements onto the stack or popping elements off of it. The program quit when the stack was empty.

Later, in college, we were learning lower level programming details (like what C translated to in assembly and how it managed calls and the stack frame). Despite this being my third CS course in college, I hadn't really grokked recursion yet. But I had a flashback during one of the classes to the TI-BASIC programs I'd written using a stack, and realized I'd recreated recursion (but manually). After that recursion and loops were synonymous in my mind (as they should be, at least in many cases) since I knew how to translate between them. Whenever I saw someone managing a stack and looping until it was empty, I knew both that it could be and how it could be translated into recursion (and vice versa).

It seems to be one of the hardest topics for many of my colleagues (especially those without a CS degree, so lacking practice with recursion) to understand or ever use. But I can usually get them to understand it once I draw a few things out on paper and show the two solutions to a problem (recursive or iterative). This doesn't mean they like recursion, most still avoided it, but they started to understand that it wasn't magic, it was just the computer managing the same data structure they were manually managing.
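A minimal sketch of that equivalence in TypeScript (tree summing is just an illustrative stand-in): the recursive version leans on the implicit call stack, while the loop manages the same stack by hand.

  type Tree = { value: number; children: Tree[] };

  // Recursion: the call stack remembers where we are.
  function sumRecursive(t: Tree): number {
    return t.children.reduce((acc, c) => acc + sumRecursive(c), t.value);
  }

  // Loop + explicit stack: the same bookkeeping, managed manually.
  function sumIterative(root: Tree): number {
    const stack: Tree[] = [root];
    let total = 0;
    while (stack.length > 0) {
      const node = stack.pop()!;
      total += node.value;
      stack.push(...node.children);
    }
    return total;
  }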


👤 znpy
I was studying operating systems and computer architecture at university (this included a tiny amount of assembly programming) and was interning at a company that used Scala, which I was learning. At the time there was quite a fuss about the fact that the JVM did not support tail-call optimization (at least until Java 7, IIRC?).

So on one side I was programming in Scala, and on the other side I was "hand-compiling" small C functions into m68k assembly...

The aha-moment came when I was hand-compiling a recursive function down to m68k assembly and I saw that I could completely eliminate ALL the recursive calls by re-arranging some register values and some values in the stack frame, inserting a small preamble in the assembly and then at the end of the assembly routine jump back to said preamble instead of making a recursive call.
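The same transformation, sketched at a high level in TypeScript rather than m68k assembly (factorial is a stand-in example): the tail call is replaced by updating the "registers" (here, the parameters) and jumping back to the loop head.

  // Tail-recursive form: the recursive call is the very last operation.
  function factTail(n: number, acc: number = 1): number {
    return n <= 1 ? acc : factTail(n - 1, acc * n);
  }

  // After tail-call elimination: no calls at all, just a jump back to
  // the top of the loop (the "preamble") with updated values.
  function factLoop(n: number): number {
    let acc = 1;
    while (n > 1) {
      acc *= n;
      n -= 1;
    }
    return acc;
  }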


👤 meheleventyone
Realising that 90% of programming advice is unsupported junk.

👤 jariel
1) Complexity is very expensive

2) Written code can take a long time to stabilize and is thus expensive to change and maintain

3) Ostensibly orthodox legacy systems are full of anti-patterns.

4) How easily a small problem can turn into an infinite number of other, small problems, and the intuition required for where to 'draw the line'.


👤 agentultra
Some more recent ones of note...

0. When I learned how to model systems in TLA+

1. When I figured out how to structure programs using monad transformers... sort of the moment where I started reasoning algebraically about programs on my own

2. When I learned how to specify propositions in types and verify my designs and programs


👤 Megalepozy
Finding Anki after 17 years of programming and committing to entering every new piece of data into it and practicing daily.

For me there is no other way of learning anymore, and it serves as my real memory. Basically it gives me the confidence that something I learn now will still be known years later.


👤 l0b0
1. My first job out of university, when I discovered you can do interesting things with programming.

2. Learning XForms, which can do most of what JavaScript is used for in a tiny amount of code.

3. Learning JetBrains IDEA, the only IDE that I ever enjoyed using.

4. Learning red-green-refactor TDD. Now refactoring is something I do as a matter of routine, not dread.

5. Understanding the fractal complexity of Bash. It's weird how a language can make stream processing and parallelization basically trivial, while making things like reliably looping over non-trivial sets of files into PhD-level tasks.

6. Doing pair programming, and then keeping it up for four years because it was brilliant.

7. Installing Arch Linux, the first distro where things weren't constantly broken because of version drift.


👤 daxfohl
Watching a coworker and his Zettelkasten notes. Stuff he did three years ago - he'd pull it right up, find everything related in a couple of keystrokes, and reproduce the results. Unlike my lame attempts to keep notes that I can never find again.

👤 lonelyasacloud
How easy I am to distract when I think I have something important to say ... gosh dang it ;-)

👤 tehlike
The moment when I first wrote my cgi/perl thingy and deployed my first website.

👤 stunt
When I realized I can use a simplified version of my take on Agile and Scrum with a simple Trello board, and apply that on my side projects. And that could help me to build them twice as fast and actually finish them too.

👤 Havoc
Docker & associated devops stuff like docker-compose. Had a bit of that "speak the right incantations and magic appears out of thin air" vibe to it as with learning to do basic programming.

👤 shivenigma
1. When I finally understood how the event loop in JavaScript works, I had an aha moment about the language design.

2. When I read about how garbage collection works in the V8 JavaScript engine, I had an aha moment about how hard things are just one layer below my working area.

3. When I started learning Rust, I had another aha about how the language is designed without a GC. I never thought that was possible in a language.

These may be simple things for many, but I have no educational background in any of this, so I'm amazed by things that other people are used to.


👤 vbsteven
That everything software and most hardware/electronics is just logic all the way down with different levels of abstraction.

This is why I love software so much and struggle with people. Software is all logic and mostly deterministic, while with people feelings are involved (which could be argued to be also deterministic but then we get into philosophic discussions about free will)

But on the software/hardware side, any bug can be resolved by digging through the layers of abstraction and figuring out where the logic error is.


👤 dipiert
1) Some people don't share your passion for software. They consider it just another job; they are not interested in improvement as long as they can do their daily programming tasks. That's ok.

2) For an idea to be communicated, it's not enough that it makes sense to you. People usually hate change; you need to understand all the details before opening your mouth.

3) I remember many aha moments regarding how the Internet works while reading "Computer Networks" by A. Tanenbaum.

4) When I was finally able to exit vim.


👤 Rerarom
1. When I first learned to program (in Pascal)

2. When I learned event-driven programming using Windows Forms in C# and I was able to create programs that resembled the ones I used

3. When I took a course in POSIX, programmed using fork and pipes and learned about stuff like race conditions

4. When I spent a year learning everything there is to know about coding (including assembly, lisp, smalltalk, rust) and realized I would never feel as happy as I felt during 1-3 because I had changed as a person


👤 david_m
1. Figuring out and loving functional programming in JS/Node.js, coming from a Java/OO background

2. Hating working on a Java/Angular/OO project after 5 years of FP


👤 erik_seaberg

  [1]> (+ (expt 2 32) (expt 2 32))
  8589934592
Numeric overflow is a choice. Even a 16-bit program that might have had a 32-bit ALU could simply decide to support larger numbers rather than producing something meaningless and dangerous.

  def retry[T](block: => T): T = ???
Half of what I missed about macros is being able to control (re)evaluation of an expression, and call by name handles that really well.

👤 abetusk
I'm not sure if these are too meta, but here are ones that I think are relevant:

- The free/libre/open-source community cares deeply about fundamental problems of our society and is trying to provide legal and technical tools to help take steps toward creating a better world. When I was younger, I thought "suckers! They're giving their compiler away for free!" It took me a while to internalize the free-software ideals, and even longer to become an active proponent.

- Corporate software, especially Microsoft, is in the business of creating a walled ecosystem, charging end consumers for their product and charging software developers to be part of that ecosystem. The first 'a-ha' moment was when I realized they, and others like them, were a racket, or at least trying to be one. The later 'a-ha' moment was when I realized there was a viable alternative to this game.

- Most (but not all) computer programming language flame wars about which language is better boil down to whether the developer prioritizes speed of program execution or speed of program development. See [1]. Newer language wars center on safety and bug creation, so maybe this point is dating me.

- Programming languages are more about human psychology than about some mathematical proof of correctness. Programming languages are designed as a compromise between how a computer needs to be told to operate and how humans communicate. All the theoretical and mathematical infrastructure around language design is there to justify decisions once the language has passed the "psychology" test. This is the same idea behind JPEG, MP3, etc. compression. Fourier transforms and wavelet transforms don't inherently create savings; it's only when we can throw away higher-order coefficients, because the human perceptual system isn't as sensitive to them as to lower-order coefficients, that they give a benefit. That is, JPEG and MP3 are specifically tailored to the human perceptual system. The math is there as the building block, but which bits/coefficients to throw away is determined empirically. To belabor this point a bit more: programming language discussions arguing over functional vs. imperative, type safety, etc. that don't try to determine measurable metrics of success, preferably by testing their hypothesis with data collected on actual usage, are expressing an opinion only.

[1] https://web.archive.org/web/20200407014302/http://blog.gmarc...


👤 quickthrower2
You can get paid more and have an easier time.

You can’t possibly know what a company you join will be like until you actually have been working there 12 months.

For large code bases you can’t really rearchitect anything. You are stuck with how it works. Maybe on small scales you can refactor.

Don’t blindly apply design patterns. SOLID is good as a thinking framework rather than a code review gate.

Marketing isn’t what you think it is until you study it somewhat. E.g. it’s not glossy ads!


👤 kangnkodos
When I first started compiling code, there were often pages and pages of compiler errors. I felt that I had to read all the errors every time, and then somehow the most important information would magically emerge from taking in the big picture.

I learned to focus in on just the first compiler error, and ignore all the rest.

Read the first error. Resolve the first error. Recompile. Repeat.

This is just one example of breaking big problems into smaller problems.


👤 baq
1. computer science is maths, software engineering is PowerPoint.

2. in software engineering it's always a people problem no matter what they tell you. see point 1.


👤 yodsanklai
There were so many... I learned programming by myself with limited resources when I was a kid. Eventually I got a formal education and there were a lot of aha moments!

For instance:

1. Writing a Scheme interpreter in C

2. Abstract data types and encapsulation

3. Functional programming and recursive algorithms

4. Grokking OOP and design patterns (this took me a long time)

5. Understanding how processes and scheduling are handled in an OS

More recently, not really a "aha" moment, but Git has been a game changer.


👤 chubot
Using shell as a REPL for C.

I learned C on Windows, and before I learned any dynamic languages. And before I had ever written a unit test.

I knew all the rules, but I was not good at making a reasonably sized, correct program in a reasonable amount of time.

---

But then I developed a good workflow for writing Python, shell, etc., and then went back to writing C, and it helped immensely.

C is a dynamic language in many respects anyway!


👤 7sigma
1. lisp

2. realising that it's more about delivering than dreaming up the perfect abstraction (get it done).

3. what you think the user wants vs what the user thinks they want vs what the user actually wants vs what the user actually needs.

4. that there are always tradeoffs

5. building a product by yourself (whether on the side or starting up) will give you invaluable experience.


👤 stack_underflow
1. The first language I ever used/learned was batch, to script things in Windows. When I learned Python shortly after, I recall it taking quite a bit of convincing myself to accept/use the "magic" of control flow automatically jumping back after a function returned, as I was so used to manually wiring up all my gotos (it was more of a painful experience than an 'ah-ha', I guess...)

2. When I was going through Tim Roughgarden's Algorithms course and saw the derivation of the runtime complexity of mergesort, and finally understood/visualized what a logarithm actually did (in school it was just taught as some rote function to help you manipulate equations of the form y=b^x)

3. Learning how TCP works from the bottom up. I think the biggest aha moment was when the textbook I was reading explained the TCP algorithm as a state machine that's just running on the client and server machines with the rest of the underlying network just forwarding packets, i.e. "pushing the complexity to the edge of the network".

4. Working through the nand2tetris project resulted in a lot of "oh X is just basically this at its core"

5. When going through a textbook explaining how relational database engines are implemented and seeing that they're essentially just using disk-persisted hash tables and self-balancing search trees to build indices on columns and make `select`s O(1)/O(log n) time (I wasn't taught this in my uni's database course and assumed there was some fancy magic going on to make queries fast)

6. Realizing that I could just do a form of graph search/dependency resolution when learning a new codebase/trying to understand how a function works. I think before seeing someone do this in front of me I would usually just panic at the thought of "thousands of lines of code" rather than just "keep iteratively diving into the functions being called". Whenever I'm learning a new language, the first thing I'll do is setup the LSP plugin in vim so that I can quickly navigate up and down call graphs. Tbh I don't understand how some developers claim to not need this and instead just manually grep+open file in order to "jump to definition".

7. Forcing myself to derive the rotation logic for AVL trees. I was curious whether, given just the high-level properties of how an AVL tree behaves in order to guarantee O(log n) time lookups, I would be able to figure out all the rotation cases. It was a very rewarding exercise and something I plan on writing a blog post about (eventually...)

(edit)

8. Learning about the log data structure and how state can be replicated by replaying/executing this stream of transformations/updates.
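A minimal TypeScript sketch of point 8, under the simplifying assumption that updates are plain key/value sets: every replica that replays the same ordered log converges to the same state.

  type Op = { key: string; value: number };

  // Rebuild state by applying every logged update in order.
  function replay(log: Op[]): Map<string, number> {
    const state = new Map<string, number>();
    for (const op of log) state.set(op.key, op.value);
    return state;
  }

  const log: Op[] = [
    { key: "x", value: 1 },
    { key: "y", value: 2 },
    { key: "x", value: 3 }, // later entries win
  ];
  console.log(replay(log)); // Map { 'x' => 3, 'y' => 2 } on every replica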


👤 phendrenad2
When I realized that my code won't last forever, and in fact shouldn't. My code will solve a problem now, but in 5-10 years someone else will rewrite it. The company may pivot, get acquired, or merge. So build a solid foundation, but don't plan for it to become the next world wonder.

👤 josgraha
My aha moment as a developer:

- I would learn some technology to a point where it does what I want

- I take what I learned to some new technology but find myself doing things from what I learned in other technologies

- this put me in a strange loop where reality wasn't lining up with my expectations, or I would do things that were more work than necessary because of learned habits

- the aha moment was when I started learning the theory of the thing I was working with

* it was key to getting out of this rut

* turns out this applies to any theory, from abstractions all the way down to computational theory, type theory, automata theory, software analysis theory - hey, if I ever get there, maybe even software synthesis

In a nutshell, I would summarize this aha moment as: "you can keep doing the same old tricks and eat the cost, or you can always dig deeper and see what costs can be avoided."


👤 sys_64738
When I realized Emacs can do everything.

👤 lazharichir
1. When I learnt TypeScript after having used mostly PHP and JS until then – not sure how I lived without stronger typing

2. When I started reading programming and software engineering books after having just "done it" – this brought so much thought and structure to what I used to improvise


👤 sebastien_b
In the '90s, as a teen, I was reading the Amiga ROM Kernel Manuals to try to learn programming on the Amiga.

The section on graphics and UI described BOOPSI - an object-oriented way of constructing UI elements with inheritance, etc. Had never been exposed to that before and it blew my mind.


👤 dave_sid
My favourite Aha moment is the chorus of The Sun Always Shines on TV. Gives me shivers down my spine.

👤 SargeZT
1. Objects finally 'clicking' into place for me.

2. Being an early adopter of Tulip (later renamed asyncio) and gaining an understanding of the event loop and concurrency without threads.

3. Understanding that all code is just a particular representation of some S-expression.


👤 adontz
Microsoft Excel is a functional language IDE where each cell is a function expression.
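A toy rendering of that observation in TypeScript (a drastic simplification of a real spreadsheet, with no caching or dependency tracking): each cell is a function expression, and a formula just calls other cells.

  // Cells as function expressions: A3 is the formula =A1+A2.
  const sheet: Record<string, () => number> = {
    A1: () => 2,
    A2: () => 3,
    A3: () => sheet.A1() + sheet.A2(),
  };

  console.log(sheet.A3()); // 5, recomputed from its inputs on every read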

👤 globular-toast
When I realised it's not magic.

The first computer I had access to was my family's Windows 95 PC. I learnt to write HTML in notepad and see it rendered in Internet Explorer. This was the beginning of my career, but so much of what was happening was accepted by my brain as simply "magic".

I would later chip away at that stack of magic and learn how more and more things worked, but even while being an accomplished programmer I still had this feeling that magic existed. It wasn't until I learnt to build my own computer from discrete logic components (thanks, Ben Eater) that I finally felt like I understood it. Computers are just machines.

I've since revisited textbooks (like Knuth) and the history of computing (starting with Babbage) and feel like my eyes are no longer obscured by my preconceptions about what a computer is.


👤 scott31
When I was introduced to Arc, it was the missing piece that completed LISP for me

👤 tibbydudeza
Dad's HP-67... I could program it... he later got an HP-41C. Dad bringing home his HP-85 for my school vacation, for "work". Turbo C. Unix. What is new is actually old... the "Mother of All Demos".

👤 cryptica
When I realized that the entire software industry is a farce; the only purpose of which is the creation of jobs in an unnatural supply-side economy driven by unequal, preferential access to easy money.

👤 christophilus
Learning Clojure. All of the FP theory I’d learned in university finally clicked and made sense. Also, it challenged my notion that the only sane languages were statically typed.

👤 tmaly
If you are good at communicating, sometimes you do not even need to write a line of code.

Understanding what someone really needs should be a skill taught in CS.


👤 endori97
Reactive programming coming from OOP was it for me.

👤 leeman2016
1. Finding out WPF after years of work on WinForms; in that you can do MVVM stuff, control templating, vector graphics, etc.

2. React and Typescript

3. Jetbrains IDEs


👤 neillyons
Going through Ben Eater's 8-bit computer video course on YouTube. He builds a computer on breadboards. It is amazing.

👤 switch007
I'd be interested to know what the 'aha' was when learning Express (and how you find it compared to Django)?

👤 austincheney
Most recently: HTTP2 has unpredictable performance problems when not used for typical get/response or binary streams.

👤 RickJWagner
When I purchased the most amazing problem solving device I've ever had. A personal whiteboard.

👤 HeadHonchoSP
1. That you only need VIM :qa

👤 cutler
My discovery of Clojure followed by my discovery of Ruby's Lisp roots.

👤 throwaway_pdp09
Realising that functions could be first-class objects - that is, they could be passed around just like integers or strings. Suddenly a whole new world of possibilities to simplify code magically opened up.
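A minimal TypeScript sketch of that new world (compose and the two helpers are made-up examples): once functions are values, you can store them, pass them around, and build new functions out of old ones.

  const double = (n: number) => n * 2;
  const increment = (n: number) => n + 1;

  // compose takes functions as arguments and returns a new function.
  const compose = (...fns: Array<(n: number) => number>) =>
    (x: number) => fns.reduce((acc, f) => f(acc), x);

  const doubleThenIncrement = compose(double, increment);
  console.log(doubleThenIncrement(5)); // 11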

👤 yitchelle
The less code I write, the fewer bugs it will contain.

👤 scollet
Networking and byte packing. Still amazes me.

👤 m00dy
The moment I realised what a syscall is.

👤 slmjkdbtl
(gamedev) When I switched from Rust to C.

👤 username90
Most of the time spent developing is spent making decisions.

To me technical debt is therefore defined as how many decisions you have to make in order to create a feature. Clean is when you don't have to make many decisions to get things done.

Example decisions, I'd say I spend at least 90% of my time developing on these decisions:

- What feature would be good to have?

- Is the feature worth the effort to build?

- Is the feature worth the compute costs?

- What language/framework should we use for this feature?

- How should we structure persistent data related to this feature?

- Where should the code for this feature live?

- How should we test this feature?

- How performant should this feature be?

- What name should this helper function/variable have?

The more of those you have to think about when developing the slower you will make progress. Therefore the main productivity hack is to write down guidelines or roadmaps or design documents for all of those so you don't have to think much about it when developing. This means don't be a manager when coding, let someone else do that work or do it before you start programming.

Things you can do to reduce mental cost of above decisions:

- Product roadmap with features that would be good to have.

- Discussions in the roadmap related to how much value said feature will provide and the effort to produce it.

- Discussions in the roadmap related to how expensive the feature will be to run.

- General guidelines on what language/framework you use.

- Have a very good architectural document describing how you structure persistent data.

- Have list of example commits showing where to put code for different features.

- Have a well documented testing strategy with examples pointing to commits with good integration and unit tests.

- Have guidelines on how long typical actions are allowed to take, like "a page update should take 100ms at most".

- Try to write code where you don't need a lot of long superfluous names; namespaces and anonymous functions are your friends.

- Lastly, as much as possible, try to provide reasonable defaults for shared code. If you have to make 20 configuration decisions in order to use a library, then you won't save a lot of time using it, and likely people will just copy the configuration from other places anyway, since making 20 decisions is too much work. For example, let's say your library has a flag that can speed up processing 2x in some cases but adds extra overhead most of the time. You could think that forcing the developer to decide in each case, to ensure we aren't missing any performance improvements, would be a good thing, but in reality a 2x performance improvement rarely matters. So the cost of having every developer make this decision outweighs the performance benefit. Instead, have it as an optional config that they can set when they actually need the performance.


👤 02020202
Not AHA moments, but some pointers:

- most tech is stupid simple, like absurd levels of simple, with an incredibly complex description

- SOA is stupid and a monolith is always the way to go (maybe with a handful of simple microservices)

- microservices are not SOA

- event sourcing is almost never a good idea

- raw bytes over the wire or in storage are not as complex and mysterious as you might think

- compiled languages are not "smarter" than interpreted languages, don't feel like you are lesser of a programmer if you do php or javascript

- new cool and shiny tech is cool and shiny for 5 minutes, until you implement it; then you realize it brings a ton of new complexity and a ton of wasted time, that you should have stayed with what worked before, and that the people who promoted it were just trying to make a sale

- mysql/mariadb will handle way more than you think

- if you want to waste resources, use java/jvm

- yagni should be tattooed on every programmer's hand so when he types on the keyboard he is constantly reminded that he is wasting time with stupid crap nobody will use

- don't think in "what if" terms or try to predict the future, just code what is needed right now, avoid being a smartass

- you can charge more for your services the more essential you are for the project and the harder it is to onboard new people, ie. your value increases over time

- if you charge less now, you will get paid less tomorrow

- running your project entirely in the cloud will bankrupt you; use it to gather usage data, then move to bare metal. Cloud is cool and "in", but it will eat your wallet for no good reason whatsoever

- single binary is always better than docker

... I could go on and on... but these are more guidelines than aha moments, so that is it for me.


👤 non-entity
Not sure I can pinpoint it to a specific time or anything, but recently nothing feels like "magic" anymore. Many years ago there were all sorts of classes of software, patterns, etc. that looked and sounded arcane and complex, and I figured I'd never understand how they worked. Now nothing really feels like that. I can typically get an "aha" moment by doing a bit of reading and understand, from a very high level, how something works. Not to say that I'm a good programmer who can write and implement anything, because that's far from the truth, but I can typically understand stuff and lose the intrigue. Kinda sucks though, because it has killed the motivators that got me into the craft.

👤 logicslave12
When I watched a world class software engineer build a real time data processing platform using only functional programming.

At a major company, the work of over ten teams was automated in a year.

Beautiful and useful abstractions around data and data processing tasks that provided extreme value.

I could never replicate it, but it reminded me of the power of software and the extreme ability that some have to wield it.