1. When I was introduced to QBasic
2. When I got to know how simple and amazing Lisp is
3. When I was able to code at the speed of my thoughts with VIM
4. When I got to know Express.js (after learning Django)
5. When I learned that everything in Smalltalk is a message, including if/else statements
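If it helps to see that idea outside Smalltalk, here is a rough Python sketch of "if/else is just a message": true and false are ordinary objects, and the conditional is a method they each answer differently (the class and method names below are made up for illustration, not Smalltalk's real hierarchy).

    # Toy imitation of Smalltalk's idea that conditionals are messages.
    class SmalltalkTrue:
        def if_true_if_false(self, true_block, false_block):
            return true_block()    # the "true" object runs the first block

    class SmalltalkFalse:
        def if_true_if_false(self, true_block, false_block):
            return false_block()   # the "false" object runs the second block

    def is_even(n):
        return SmalltalkTrue() if n % 2 == 0 else SmalltalkFalse()

    # No `if` statement in sight: we just send a message to the boolean object.
    print(is_even(4).if_true_if_false(lambda: "even", lambda: "odd"))  # even
    print(is_even(7).if_true_if_false(lambda: "even", lambda: "odd"))  # odd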
For example, most "cutting edge" web apps are better off as PHP monoliths. Facebook was a PHP file for a long time. But most apps in general should never make it past being shell scripts, which are better off staying as spreadsheets, or better, text files, which are better off as pieces of paper whenever possible. And all paper is better off left as thoughts whenever possible, and most thoughts should be forsaken.
That’s it. For a long time I thought I was good at programming, and I kept making the same mistakes over and over: all-nighters, leetcoding, runtime hacks, etc.
And then one day it struck me. Programming is hard, and my mind can't keep up with yesterday's cowboy coding.
Slowly I started putting defense mechanisms everywhere: good type systems, immutability, compile-time checks instead of runtime checks, never overworking, and sleeping the best I can.
Life is much better now and I'll never go back to thinking I'm good at programming.
That's it. For a long time, when I thought of the hardest stuff to program, I was thinking of computer games. Mind you, not the latest AAA, billion-dollar-budget, 300+ people on the team kind of games. No - given my age, I was thinking about a single programmer doing code and graphics and sound on a C64 kind of games. To me, it was incomprehensible how one person alone could manage to make a machine, especially with those hardware limitations, do all that.
And then one day it struck me. I had started to dabble with programming in my early teens and then never stopped, and, I guess, gotten better at it over time.
Like with every process that is evolutionary rather than revolutionary, there's a good chance of losing track of your progress. I remember clearly that after finishing college, I really thought that everything I knew then about computers and programming I had already known even before I entered university. But then one day, a freshman asked me a question about a programming assignment in their class, and it was completely trivial to me. Yet, when I was a freshman, I would probably have asked someone else the same question.
And yet, that was not my aha moment. It still took many years for me to realize that my idolization of game programmers was perhaps a bit much. Mind you, I do realize of course that someone specialized in any area will be able to produce better code there than someone who's not, for any definition of "better". I'm still not a game programmer and never will be, so I still have the highest respect for their profession. But I do realize now that there's no otherworldly skill separating the game coder from all other programmers. Anything is within reach; what you need is not some innate talent, it's just dedication.
Life is somewhat better now and I'll never go back to thinking I'm not good at programming.
2. Knowing the business is key to having personal buy-in for your work. If you work in a bank writing bank software, understand the bank and banking so you understand the context of the software. It's a real 10x thing to know why a requirement is a requirement.
3. The software you write will live a lot longer than you planned. Your experiments from 1. will haunt you.
- Implementing small languages, and how it inevitably leads one to a deeper understanding of all the layers that make computing and programming possible
- The Make-a-Lisp project and the language-agnostic conceptual design at the heart of it - https://github.com/kanaka/mal/blob/master/process/guide.md
- Declarative nature of React (the view as function of state); using centralized unidirectional state management and actions
- Test-driven (and, similarly, type-driven) development
- TypeScript - Benefits and costs of type definitions, dynamic/gradual and structural typing; power of a language integrated with the editor environment
- Virtual machines, QEMU, and later Docker - Commoditized, reproducible builds for applications and their environments
- Build pipeline scripts
2. That programming and making stuff people want are two entirely different things, although programmers always assume that whatever you're building, it's the right thing. You can spend your entire career in programming, learning all sorts of goodness about things like Erlang innards, and never really understand what your job is.
3. That all of that mousing around, learning a new IDE every two or three years when I started, was a complete waste of time. I automatically assumed that the more cool and shiny the programming environment, the easier it would be to code and the better my code would be. But no. grep and awk work (mostly) the same way now as they did 30 years ago, and any time I spent learning which hotkeys did what in some long-lost dev stack was a complete waste of time.
4. Conversely, that UX beats internal architecture, every time. If folks are having a blast using your app, you win, even if it crashes every five minutes. (How they could have fun if it crashed every five minutes is a good question. You might want to ask them.)
5. The more smart people you throw at a programming project, the bigger mess you end up with. I know when I say that people are thinking of "The Mythical Man-Month", but it goes deeper than that. Even if you somehow manage to stay on-schedule, human communication around innovation stops working at a certain scale, and that looks like a hard limit. There are ways around it, using things like Code Budgets, following CCL, layers and DSLs, but nobody does that, so it doesn't matter. We, as an industry, have absolutely no idea how to staff or run projects.
ADD: One thing that was quite profound that I discovered late: if you code well, the simple act of programming can tell you something about the way you're reasoning about the problem that you couldn't learn any other way. Programming is by no means a simple one-way street where ideas come out of the head of the programmer and end up as bits on the tech stack. It's very, very much bidirectional. Our programs influence us as coders probably much more than we influence them.
2. Doing NAND to Tetris the first time. It taught me not only how computers work, but how powerful recursive layers of abstraction can be. I had absolutely no idea how my system looked on RTL, but I was still able to build it.
3. Also Lisp. I wish Nand to Tetris had picked something more Lisp-like in the second half to show how simple and powerful it is.
4. Realizing that when I'm coding something, alternating between writing and testing is much quicker than writing a bunch and then debugging it. For bigger projects, setting up CI/CD early can similarly save headaches.
5. The functional big three (map, filter, reduce), but for me even more so closures. I had gotten stuck trying to hardcode coefficients for a polynomial until I noticed I'd end up with a lot of duplicate code, which was what I was trying to avoid with FP in the first place. Then I realized I could just put the polynomial function itself in a closure and call it with the coefficients I wanted, when I wanted.
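A minimal Python sketch of that closure trick (the function names are mine, just for illustration): the coefficients are captured once, and the returned function can be called with whatever argument you want, whenever you want.

    # Build a polynomial function from its coefficients once, instead of
    # hardcoding the coefficients at every call site.
    def make_polynomial(*coeffs):
        def poly(x):
            # Horner's method: ((c0*x + c1)*x + c2) ...
            result = 0
            for c in coeffs:
                result = result * x + c
            return result
        return poly  # the closure captures coeffs

    square_plus_one = make_polynomial(1, 0, 1)  # x^2 + 1
    cubic = make_polynomial(2, -3, 0, 5)        # 2x^3 - 3x^2 + 5
    print(square_plus_one(3))  # 10
    print(cubic(2))            # 9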
1. I can use math, intuitively, even if I don't know how the exact calculation works.
2. I can use a debugger and replay it again and again to understand what's happening.
3. I can diagram a high-level design, without having the complete context in mind, and still produce a 10K LoC OpenGL-powered computer graphics engine.
After that project, programming never felt impossible to me anymore. After that project, I always had some confidence of being able to learn whatever I needed, as long as I have enough time.
Here is the engine (2013): https://www.youtube.com/watch?v=PH6-dLvZEiA&t=1s&ab_channel=...
With these new and easy frameworks it's incredibly easy and quick to create new projects but creating and launching something is just the tip of the iceberg.
The main work comes after that, i.e. traffic, leads, conversions, optimization, etc., which unfortunately can only be done with good old tedious hard work and laser focus.
For me that quick feedback loop makes coding as fun as gaming.
The first time (1993) I was logged in on a remote machine and downloaded a file to it from another remote machine. It was just magic that the machine I was touching had nothing to do with that file.
When I hacked together a mailing list system in shell script.
When I realised that C is basically just all memory locations.
When I understood closures.
e.g.
lsof to see files and ports opened up by a process
strace to look at what system calls a program is making to the kernel (and ltrace for library calls)
pstree to see spawned subprocesses
gdb to inject myself into infinite loops of programs written in C-based implementations (e.g. Ruby)
I'm a collector of aha-ish moments so if you want more, my YouTube account is linked to in my profile.
2. End of list.
So with Ruby, I was just pattern matching—I never had an intuition of what lambdas were doing until I moved over to JS.
The Feynman Method was another aha moment. Learning how to learn, in my case, by writing down stuff in my own words as if I were teaching someone else was probably the most important thing. That helped me develop a much deeper understanding of important concepts.
Similarly, not trying to learn everything was a big deal. Don’t try to learn all the niche technologies you see in job listings expecting you need to know those things in order to qualify for the position. If you just do dozens of tutorials, you’ll end up knowing a lot of things very superficially. It’s much better to know a handful of ubiquitous, related technologies very well and have a strong foundation of programming fundamentals. Those things can transfer over to other technologies, when you need to pick them up.
- When I realized that someone might quit their job during the trial phase. It had not occurred to me that both sides are checking whether it's a good fit for them
- The main reason I'm successful is that, despite my character flaws (which have improved over the last 15 years), good software engineers are in very strong demand, and when I look at how hard it is to get good people, I just might never have a really hard life
- Soft skills are crucial: taking responsibility, being on time, being reliable, taking action without hesitation when it matters
- You had a salary negotiation or a discussion and something was decided? You can still come back to it a few days later and say 'You know, I thought about it and I'm not happy with the outcome at all. We need to discuss this topic again'
- Don't complain if shit is shitty. Either change it, try to change it, accept it, or quit. Stop telling others that it's shitty while doing nothing.
- Estimation is bullshit: it never works, never aligns, no one really retrospects it, and if you ever find a team where it works, your team gets dismantled for whatever reason and you have to start from zero. Prioritise for relevance, optimize how you work, accept the outcome.
- Never accept a deadline. Without a deadline, your manager can't come back and say 'you promised', which leads to you doing overtime for a mistake your manager made: they mismanaged!
- Do less, but better. Whatever you do badly now will come back.
- Not doing something, because you actually figured out what the other person needed or wanted, is more often than not the better result, if it doesn't lead to writing more code
- Understanding how to program a computer game, for three main reasons: 1. memory allocation can fail, 2. how the game loop works and how to program it (see the sketch after this list), 3. randomness
- Sentences to know:
-- I can try to get it done, but I can't promise
-- I have a date tonight, I can't stay (if they insist: I have expensive tickets for ...)
-- No
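Since the game loop came up above, here is a bare-bones Python sketch of the idea; the frame rate, state fields, and update/render functions are placeholders I made up, not anything from the original comment.

    import time

    TARGET_FPS = 60
    FRAME_TIME = 1.0 / TARGET_FPS

    def update(state, dt):
        # Advance the simulation by dt seconds (placeholder physics).
        state["x"] += state["vx"] * dt
        return state

    def render(state):
        # Draw the current state (here we just print it).
        print(f"x = {state['x']:.3f}")

    def game_loop(frames=5):
        state = {"x": 0.0, "vx": 1.0}
        previous = time.monotonic()
        for _ in range(frames):  # a real loop runs until the player quits
            now = time.monotonic()
            dt, previous = now - previous, now
            state = update(state, dt)
            render(state)
            # Sleep off whatever is left of this frame's time budget.
            time.sleep(max(0.0, FRAME_TIME - (time.monotonic() - now)))

    game_loop()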
It always seemed so lame compared to neat new functional languages or distributed actor frameworks. And, not being much of an ops person, I almost wanted not to be any good at PowerShell (or any shell) so I wouldn't get ops assignments.
After getting familiar with it, though, it's improved my workflows and ultimately my quality of life. I'm not only more fluent in ops now, but I also get to spend less time on it, which, ironically, was my rationale for avoiding it in the first place (write a script once and never have to remember what I did when creating a new environment).
My next realization was that a well-defined and clearly communicated product definition is 10x more important than good coding.
2. When I read K&R at the age of 14. Before that, I had just followed random tutorials that I found on the internet. From that point on, I went straight to the source (no pun intended) when it comes to learning new stuff.
Sometimes using "uncool" technologies and languages is OK; they often exist for a good reason.
2. The concept of "innovation tokens" when you're solving a business need
https://mcfunley.com/choose-boring-technology
3. There is no silver bullet.
Learned the hard way after switching to the new "flavour of the month"
2. Vendors are often pushing bad architecture
3. Architects often push things for the wrong reasons
4. Companies often push risk to their vendors, avoiding collaboration, inherently increasing risk
5. ORM's hide business logic, preventing a company from understanding its business and adapting to change
6. Relational Databases are an operational anti-pattern
7. Graph databases are excellent tools for common problems like security, social media
8. Document databases are excellent tools for Event Stores and CQRS
9. Cloud native development (FaaS) is awesome
10. Domain-Driven Design and Event Storming are excellent disciplines to add to a corporate development group
11. Corporations inherently don't understand Agile because they have to measure everything through strong process definition
In this chapter the author demonstrates that it is wrong to solve a problem by creating a series of objects that do stuff. He shows that the right way is by abstracting the problem into its most essential elements. Do not try to emulate reality with code, but make it its own, abstract thing that solves the issue instead.
This chapter and its code felt like pure poetry to me.
2. No ORMs
3. No Frameworks, use libraries to build up your project
4. Use strong type systems for modelling (make invalid states not representable)
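A small Python sketch of point 4, using a hypothetical connection type as the example (none of these names come from the list above): instead of one class with optional fields that may or may not be set, each state gets its own type, so a "disconnected but has a session id" value simply cannot be constructed.

    from dataclasses import dataclass
    from typing import Union

    @dataclass
    class Disconnected:
        pass

    @dataclass
    class Connecting:
        attempt: int

    @dataclass
    class Connected:
        session_id: str  # only exists when we are actually connected

    ConnectionState = Union[Disconnected, Connecting, Connected]

    def describe(state: ConnectionState) -> str:
        if isinstance(state, Connected):
            return f"connected with session {state.session_id}"
        if isinstance(state, Connecting):
            return f"connecting, attempt {state.attempt}"
        return "disconnected"

    print(describe(Connected(session_id="abc123")))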
I always thought I loved the freedom of Python and JavaScript (and I still do to some extent; you’ll have to take Django from my cold dead hands). But the power of static types becomes super clear when you’re using a great IDE.
I can hit a hotkey when hovering over any function and it’ll show me the docs and take me to the full page if I want. With another hotkey it’ll autofill parameter options. It’ll auto-import the libraries I need. It’ll complain if I’m using the wrong type.
It gives me so much more confidence that the code I’m writing will actually work. It’s slightly more of a pain to write, no doubt about that, but the payoff is huge.
I was burnt by java in the past and can’t stand how every project seems to end up like this https://github.com/EnterpriseQualityCoding/FizzBuzzEnterpris.... But Go has shown it doesn’t have to be like that.
Starting from there, I don't take any programming language seriously anymore. Whatever makes the job easier and faster to accomplish in context.
As a relatively new Clojure developer, the meaning of the parentheses was the ah-ha moment. I imagine it’s the same for other Lisp-y languages.
Of course, being purely technically talented is one way to go, as I have chosen to be.
When starting out as a developer, I think there is a tendency to see the particular language/framework/syntax that you're using being all-important.
Over time, and with experience, you realise that the language and syntax are just the "fine detail" of how you solve problems as a software developer: as your understanding deepens, it's as if a kind of abstraction happens in your brain, because you stop thinking so much about the fine details of language and syntax, and start to worry about things like managing complexity and optimising design etc.
And at that point, the realisation for me was that I can apply that experience to any language/syntax/framework, which frees me up to pick the best way of solving any particular problem, and to not be stuck in a rut with any particular technology.
An added benefit is that a lot of the debate over stuff like "language X is better than Y", or "this code style is correct and that one is wrong", becomes unimportant, because you're thinking at a level that's not limited by specifics.
- when I finished a student programming assignment a year later and realised that a well-chosen set of procedures constituted what we'd now call a DSL
- when I ported Martin Richards' very clean BCPL compiler and realized that you could write efficient code in something that wasn't Fortran
- when I read the Thompson/Ritchie paper on Unix
- when I completed my first reading of SICP
Conventions for syntax, names, logic, API, structure, vocabulary and so on.
Conventions are by nature arbitrary, influenced by culture, history, social behavior and a whole lot of human weirdness.
Don't try to learn all the conventions first; they come faster with practice and exposure. Solve the problem, then find the conventions you need to apply the solution in your context.
The beginner's paradox is that they need to learn some specific conventions (e.g. part of a language, one paradigm, and a few libs) to start working on solving a problem, so it's a frustrating experience.
There is no easy way to build up the rest of the conventions from what you already know, because it's all artificial.
It's also what leads people to say "don't learn languages, learn to program". Which makes no sense to you at all, until, well, you finally know how to program. But you got there by learning conventions on the way.
2. When I gave into Power BI finally and fixed a report someone else created, it felt so good. I don't think there is anything that compares to it and I only scratched the surface.
Later, in college, we were learning lower-level programming details (like what C translated to in assembly and how it managed calls and the stack frame). Despite this being my third CS course in college, I hadn't really grokked recursion yet. But during one of the classes I had a flashback to the TI-BASIC programs I'd written using a stack, and realized I'd recreated recursion (but manually). After that, recursion and loops were synonymous in my mind (as they should be, at least in many cases) since I knew how to translate between them. Whenever I saw someone managing a stack and looping until it was empty, I knew both that it could be, and how it could be, translated into recursion (and vice versa).
It seems to be one of the hardest topics for many of my colleagues (especially those without a CS degree, and so lacking practice with recursion) to understand or ever use. But I can usually get them to understand it once I draw a few things out on paper and show the two solutions to a problem (recursive and iterative). This doesn't mean they like recursion; most still avoid it, but they start to understand that it isn't magic, it's just the computer managing the same data structure they were managing manually.
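Here is a small Python illustration of that equivalence, on a made-up tree-summing problem: the explicit-stack version is exactly the "manage a stack and loop until it's empty" pattern described above, and each version can be mechanically translated into the other.

    # The same tree walk, written recursively and with an explicit stack.
    def total_size_recursive(node):
        return node["size"] + sum(total_size_recursive(c) for c in node["children"])

    def total_size_iterative(root):
        total, stack = 0, [root]
        while stack:                 # loop until the stack is empty
            node = stack.pop()       # our list plays the role of the call stack
            total += node["size"]
            stack.extend(node["children"])
        return total

    tree = {"size": 1, "children": [
        {"size": 2, "children": []},
        {"size": 3, "children": [{"size": 4, "children": []}]},
    ]}
    print(total_size_recursive(tree), total_size_iterative(tree))  # 10 10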
So on one side I was programming in Scala, and on the other side I was "hand-compiling" small C functions into m68k assembly...
The aha-moment came when I was hand-compiling a recursive function down to m68k assembly and I saw that I could completely eliminate ALL the recursive calls by re-arranging some register values and some values in the stack frame, inserting a small preamble in the assembly and then at the end of the assembly routine jump back to said preamble instead of making a recursive call.
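The same transformation can be sketched above the assembly level: when the recursive call is the last thing a function does, you can overwrite the "registers" (the parameters) and jump back to the top instead of calling yourself. Python doesn't do this for you, so the rewrite is literal; the factorial example is mine, not from the original comment.

    # Tail-recursive form: the recursive call is the very last thing done.
    def fact_rec(n, acc=1):
        if n <= 1:
            return acc
        return fact_rec(n - 1, acc * n)

    # Hand-eliminated form: update the "registers" (n, acc) and loop back to
    # the top of the routine, just like the m68k preamble-and-jump trick.
    def fact_loop(n, acc=1):
        while n > 1:
            n, acc = n - 1, acc * n
        return acc

    print(fact_rec(10), fact_loop(10))  # 3628800 3628800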
2) Written code can take a long time to stabilize and is thus expensive to change and maintain
3) Ostensibly orthodox legacy systems are full of anti-patterns.
4) How easily a small problem can turn into an infinite number of other, small problems, and the intuition required for where to 'draw the line'.
0. When I learned how to model systems in TLA+
1. When I figured out how to structure programs using monad transformers... sort of the moment where I started reasoning algebraically about programs on my own
2. When I learned how to specify propositions in types and verify my designs and programs
For me there is no other way of learning anymore, and it serves as my real memory. Basically, it gives me the confidence that something I learn now will still be known years later.
2. Learning XForms, which can do most of what JavaScript is used for in a tiny amount of code.
3. Learning JetBrains IDEA, the only IDE that I ever enjoyed using.
4. Learning red-green-refactor TDD. Now refactoring is something I do as a matter of routine, not dread.
5. Understanding the fractal complexity of Bash. It's weird how a language can make stream processing and parallelization basically trivial, while making something like reliably looping over a non-trivial set of files a PhD-level task.
6. Doing pair programming, and then keeping it up for four years because it was brilliant.
7. Installing Arch Linux, the first distro where things weren't constantly broken because of version drift.
2. When I read about how garbage collection works in V8 Javascript engine, I had an aha moment about how hard things are just one layer below my working area.
3. When I started learning Rust, had another aha about how the language is designed without GC. I never thought that was possible in a language.
These may be simple things for many, but I have no educational background in any of these, so I'm amazed by things that other people have actually gotten used to.
This is why I love software so much and struggle with people. Software is all logic and mostly deterministic, while with people, feelings are involved (which could be argued to be deterministic as well, but then we get into philosophical discussions about free will).
But on the software/hardware side, any bug can be resolved by digging through the layers of abstraction and figuring out where the logic error is.
2) It's not enough that an idea makes sense to you in order to communicate it. People usually hate change; you need to understand all the details before opening your mouth.
3) I remember many aha moments regarding how the Internet works while reading "Computer Networks" by A. Tanenbaum.
4) When I was finally able to exit vim.
2. When I learned event-driven programming using Windows Forms in C# and I was able to create programs that resembled the ones I used
3. When I took a course in POSIX, programmed using fork and pipes and learned about stuff like race conditions
4. When I spent a year learning everything there is to know about coding (including assembly, lisp, smalltalk, rust) and realized I would never feel as happy as I felt during 1-3 because I had changed as a person
2. Hating working on a Java/Angular/OO project after 5 years of FP
[1]> (+ (expt 2 32) (expt 2 32))
8589934592
Numeric overflow is a choice. Even a 16-bit program that might have had a 32-bit ALU could simply decide to support larger numbers rather than producing something meaningless and dangerous.
    def retry(block: =>T): T = ???
Half of what I missed about macros is being able to control (re)evaluation of an expression, and call by name handles that really well.
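The retry signature above is Scala; a rough Python equivalent of the same idea passes a zero-argument function (a thunk), so the body can be re-evaluated on each attempt. The helper and its parameters here are illustrative only, not from any particular library.

    # `block` is a thunk: calling block() re-evaluates the expression each time,
    # which is what call-by-name (or a macro) would give you for free.
    def retry(block, attempts=3):
        last_error = None
        for _ in range(attempts):
            try:
                return block()
            except Exception as err:
                last_error = err
        raise last_error

    calls = {"n": 0}

    def flaky():
        calls["n"] += 1
        if calls["n"] < 3:
            raise RuntimeError("transient failure")
        return f"ok on attempt {calls['n']}"

    print(retry(lambda: flaky()))  # ok on attempt 3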
- The free/libre/open source community care deeply about fundamental problems of our society and are trying to provide legal and technical tools to help take steps to create a better world. When I was younger, I thought "suckers! They're giving their compiler away for free!" It took me a while to internalize the free software ideals and even longer to be an active proponent.
- Corporate software, especially Microsoft, is in the business of creating a walled ecosystem, charging end consumers for their product and charging software developers to be part of that ecosystem. The first 'a-ha' moment was when I realized they, and others like them, were a racket, or at least trying to be one. The later 'a-ha' moment was when I realized there was a viable alternative to this game.
- Most (but not all) computer programming language flame wars about which language is better boil down to whether the developer prioritizes speed of program execution or speed of program development. See [1]. Newer language wars center on safety and bug creation, so maybe this point is dating me.
- Programming languages are more about human psychology than about some mathematical proof of correctness. Programming languages are designed as a compromise between how a computer needs to be told to operate and how humans communicate. All the theoretical and mathematical infrastructure around language design is there to justify decisions once the language has passed the "psychology" test. This is the same idea behind JPEG, MP3, etc. compression. Fourier transforms and wavelet transforms don't inherently create a saving; it's only because we can throw away higher-order coefficients, since the human perceptual system isn't as sensitive to them as to lower-order coefficients, that they give a benefit. That is, JPEG and MP3 are specifically tailored to the human perceptual system. The math is there as the building block, but which bits/coefficients to throw away is determined empirically. To belabor this point a bit more, programming language discussions arguing over functional vs. imperative, type safety, etc. that don't try to define measurable metrics of success, preferably by testing their hypothesis with data collected on actual usage, are expressing an opinion only.
[1] https://web.archive.org/web/20200407014302/http://blog.gmarc...
You can’t possibly know what a company you join will be like until you actually have been working there 12 months.
For large code bases you can’t really rearchitect anything. You are stuck with how it works. Maybe on small scales you can refactor.
Don’t blindly apply design patterns. SOLID is good as a thinking framework rather than a code review gate.
Marketing isn’t what you think it is until you study it somewhat. E.g. it’s not glossy ads!
I learned to focus in on just the first compiler error, and ignore all the rest.
Read the first error. Resolve the first error. Recompile. Repeat.
This is just one example of breaking big problems into smaller problems.
2. In software engineering it's always a people problem, no matter what they tell you. See point 1.
For instance:
1. Writing a Scheme interpreter in C
2. Abstract data types and encapsulation
3. Functional programming and recursive algorithms
4. Groking OOP and design patterns (this took me a long time)
5. Understanding how processes and scheduling are handled in an OS
More recently, not really a "aha" moment, but Git has been a game changer.
I learned C on Windows, and before I learned any dynamic languages. And before I had ever written a unit test.
I knew all the rules, but I was not good at making a reasonably sized, correct program in a reasonable amount of time.
---
But then I developed a good workflow for writing Python, shell, etc., and then went back to writing C, and it helped immensely.
C is a dynamic language in many respects anyway!
2. realising that it's more about delivering than dreaming up the perfect abstraction (get it done).
3. what you think the user wants vs what the user thinks they want vs what the user actually wants vs what the user actually needs.
4. that there are always tradeoffs
5. building a product by yourself (whether on the side or starting up) will give you invaluable experience.
2. When I was going through Tim Roughgarden's Algorithms course and saw the derivation of runtime complexity for mergesort and finally understood/visualized what a logarithm actually did (in school it was just taught as some rote function to help you manipulate equations of the form y=b^x)
3. Learning how TCP works from the bottom up. I think the biggest aha moment was when the textbook I was reading explained the TCP algorithm as a state machine that's just running on the client and server machines with the rest of the underlying network just forwarding packets, i.e. "pushing the complexity to the edge of the network".
4. Working through the nand2tetris project resulted in a lot of "oh X is just basically this at its core"
5. When going through a textbook explaining how relational database engines were implemented and seeing that they're essentially just using disk-persisted hash tables and self-balancing search trees to build indices on columns and make `select`s O(1)/O(log) time (I wasn't taught this in my uni's database course and assumed there was some fancy magic going on to make queries fast)
6. Realizing that I could just do a form of graph search/dependency resolution when learning a new codebase or trying to understand how a function works. I think before seeing someone do this in front of me I would usually just panic at the thought of "thousands of lines of code" rather than just "keep iteratively diving into the functions being called". Whenever I'm learning a new language, the first thing I'll do is set up the LSP plugin in vim so that I can quickly navigate up and down call graphs. Tbh I don't understand how some developers claim not to need this and instead just manually grep and open files in order to "jump to definition".
7. Forcing myself to derive the rotation logic for AVL trees. I was curious whether, given just the high-level properties of how an AVL tree behaves in order to guarantee O(log n) lookups, I would be able to figure out all the rotation cases. It was a very rewarding exercise and something I plan on writing a blog post about (eventually...)
(edit)
8. Learning about the log data structure and how state can be replicated by replaying/executing this stream of transformations/updates.
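A tiny Python sketch of point 8 (the event names are made up): state is never stored directly, only the append-only log of updates, and any replica that replays the same log ends up in the same state.

    # An append-only log of updates; current state is just a fold over the log.
    log = [
        ("deposit", 100),
        ("withdraw", 30),
        ("deposit", 5),
    ]

    def apply_event(balance, event):
        kind, amount = event
        return balance + amount if kind == "deposit" else balance - amount

    def replay(events):
        balance = 0
        for event in events:
            balance = apply_event(balance, event)
        return balance

    # Two replicas that receive the same log converge on the same state.
    print(replay(log))        # 75
    print(replay(list(log)))  # 75, rebuilt independently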
- I would learn some technology to the point where it did what I wanted
- I would take what I learned to some new technology, but find myself doing things the way I had learned them in other technologies
- this put me in a strange loop where reality wasn't lining up with my expectations, or I would do things that were more work than necessary because of learned habits
- the aha moment was when I started learning the theory of the thing I was working with
* it was key to getting out of this rut
* it turns out this applies to any theory, from abstractions all the way down to computational theory, type theory, automata theory, software analysis theory, and, hey, if I ever get there, maybe even software synthesis
In a nutshell, I would summarize this aha moment as "you can keep doing the same old tricks and eat the cost, or you can always dig deeper and see what costs can be avoided."
2. When I started reading programming and software engineering books after having just "done it" – this brought so much thought and structure to what I used to improvise
The section on graphics and UI described BOOPSI - an object-oriented way of constructing UI elements with inheritance, etc. Had never been exposed to that before and it blew my mind.
2. Being an early adopter of Tulip (later renamed asyncio) and gaining an understanding of the event loop and concurrency without threads.
3. Understanding that all code is just a particular representation of some S-expression.
The first computer I had access to was my family's Windows 95 PC. I learnt to write HTML in notepad and see it rendered in Internet Explorer. This was the beginning of my career, but so much of what was happening was accepted by my brain as simply "magic".
I would later chip away at that stack of magic and learn how more and more things worked, but even while being an accomplished programmer I still had this feeling that magic existed. It wasn't until I learnt to build my own computer from discrete logic components (thanks, Ben Eater) that I finally felt like I understood it. Computers are just machines.
I've since revisited textbooks (like Knuth) and the history of computing (starting with Babbage) and feel like my eyes are no longer obscured by my preconceptions about what a computer is.
Understanding what someone really needs should be a skill taught in CS.
2. React and Typescript
3. Jetbrains IDEs
To me technical debt is therefore defined as how many decisions you have to make in order to create a feature. Clean is when you don't have to make many decisions to get things done.
Example decisions (I'd say I spend at least 90% of my development time on these):
- What feature would be good to have?
- Is the feature worth the effort to build?
- Is the feature worth the compute costs?
- What language/framework should we use for this feature?
- How should we structure persistent data related to this feature?
- Where should the code for this feature live?
- How should we test this feature?
- How performant should this feature be?
- What name should this helper function/variable have?
The more of those you have to think about when developing the slower you will make progress. Therefore the main productivity hack is to write down guidelines or roadmaps or design documents for all of those so you don't have to think much about it when developing. This means don't be a manager when coding, let someone else do that work or do it before you start programming.
Things you can do to reduce mental cost of above decisions:
- Product roadmap with features that would be good to have.
- Discussions in the roadmap related to how much value said feature will provide and the effort to produce it.
- Discussions in the roadmap related to how expensive the feature will be to run.
- General guidelines on what language/framework you use.
- Have a very good architectural document describing how you structure persistent data.
- Have list of example commits showing where to put code for different features.
- Have a well documented testing strategy with examples pointing to commits with good integration and unit tests.
- Have guidelines on how long typical actions are allowed to take, like "a page update should take 100ms at most".
- Try to write code where you don't need a lot of long, superfluous names; namespaces and anonymous functions are your friends.
- Lastly, as much as possible, try to provide reasonable defaults for shared code. If you have to make 20 configuration decisions in order to use a library, then you won't save a lot of time using it, and likely people will just copy the configuration from other places anyway, since making 20 decisions is too much work. For example, let's say your library has a flag that can speed up processing 2x in some cases, but with extra overhead most of the time. You could think that forcing the developer to decide in each case, to ensure you aren't missing any performance improvements, would be a good thing, but in reality a 2x performance improvement rarely matters. So the cost of having every developer make this decision outweighs the performance benefit. Instead, have it as an optional config they can set when they actually need the performance.
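A tiny hypothetical Python example of that last point (the helper and its flag are invented for illustration): the faster-but-heavier path exists, but callers who don't care never have to decide about it.

    # Hypothetical shared helper with a sensible default.
    def deduplicate(records, fast_mode=False):
        if fast_mode:
            # Hash-based: faster on large inputs, uses more memory,
            # and requires the records to be hashable.
            return list(dict.fromkeys(records))
        # Default path: no extra assumptions, fine for most callers.
        seen, result = [], []
        for r in records:
            if r not in seen:
                seen.append(r)
                result.append(r)
        return result

    print(deduplicate(["a", "b", "a"]))                      # most call sites: no decision
    print(deduplicate(list("mississippi"), fast_mode=True))  # opt in only where it matters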
- most tech is stupidly simple, with an incredibly complex description
- SOA is stupid and a monolith is always the way to go (maybe with a handful of simple microservices)
- microservices are not SOA
- event sourcing is almost never a good idea
- raw bytes over the wire or in storage are not as complex and mysterious as you might think
- compiled languages are not "smarter" than interpreted languages, don't feel like you are lesser of a programmer if you do php or javascript
- new cool and shiny tech is cool and shiny for five minutes, until you implement it; then you realize it brings a ton of new complexity and a ton of wasted time, that you should have stayed with what worked before, and that the people who promoted it were just trying to make a sale
- mysql/mariadb will handle way more than you think
- if you want to waste resources, use java/jvm
- YAGNI should be tattooed on every programmer's hands, so that whenever they type on the keyboard they are constantly reminded that they are wasting time with stupid crap nobody will use
- don't think in "what if" terms or try to predict the future, just code what is needed right now, avoid being a smartass
- you can charge more for your services the more essential you are to the project and the harder it is to onboard new people, i.e. your value increases over time
- if you charge less now, you will get paid less tomorrow
- running your project entirely in the cloud will bankrupt you; use it to gather usage data, then move to bare metal. cloud is cool and "in", but it will eat your wallet for no good reason whatsoever
- single binary is always better than docker
... I could go on and on, but these are more guidelines than aha moments, so that is it for me.
At a major company, over ten teams were automated in a year.
Beautiful and useful abstractions around data and data processing tasks that provided extreme value.
I could never replicate it, but it reminded me of the power of software and the extreme ability that some have to wield it.