HACKER Q&A
📣 OdinSpecc

How do you manage integrations at scale?


Curious how teams handle integrations as they grow and start relying on multiple systems

And at what point did it become a real problem for you?

Did you solve it with internal tooling, external platforms, or something else?


  👤 PaulHoule Accepted Answer ✓
(1) Regular software development procedure. Same as for all the other software.

(2) The usual problem is that a lot of vendors mistake this experience for security: https://www.youtube.com/watch?v=o2ObCoCm61s No matter what, you need some strategy for managing credentials for dev, test, and production that is not "check them into version control".
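One minimal way to follow that advice is to load credentials from environment variables per deployment stage instead of from the repo. A sketch (the variable names like MYAPP_DEV_API_KEY are made up for illustration):

```python
import os

def load_credentials(env: str) -> dict:
    """Read credentials for one environment (dev/test/prod) from
    environment variables, so nothing secret lives in version control.
    The MYAPP_* naming scheme here is hypothetical."""
    prefix = f"MYAPP_{env.upper()}_"
    creds = {
        "api_key": os.environ.get(prefix + "API_KEY", ""),
        "api_secret": os.environ.get(prefix + "API_SECRET", ""),
    }
    # Fail loudly at startup rather than mysteriously at request time.
    missing = [name for name, value in creds.items() if not value]
    if missing:
        raise RuntimeError(f"Missing credentials for {env}: {missing}")
    return creds
```

In practice the variables come from your shell profile, CI secret store, or container orchestrator, so each environment gets its own set without any code change.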

(3) Never an external platform. Take a minute. Stop. Breathe. Think about it. Before you get the external platform you have to integrate with one thing. After you add the external platform you have to integrate with two things. Don't go there.

(4) Platforms like Zapier rely on management being intimidated by the trauma of developing UI software. In UI software people quote a week to develop something and it takes five. When people quote 30 minutes to make a web scraper they often have it running in 10 minutes.

(5) There is a lot of fear that that web scraper will have to change as the target site changes, and occasionally that's a problem, but (a) it is so expensive to make changes to GUIs that most web sites go for years between major changes, and (b) a site that ranks well on Google today will probably lose its rankings if it changes anything major, so the odds are that 10-minute web scraper will still be running in five years.

(6) In the AI coding age people are discovering just how easy it is to write scrapers and other integrations -- it's not that you need AI to code them, but that AI gives people the courage to try.

(7) Always look at the scraper as an alternative to the API client if it is possible. More often than not the API has restricted functionality and content compared to the web site, and authentication is almost always more difficult. (e.g. to authenticate on a web site you just use an HTTP client that has a cookie jar and submit your username and password to the form -- 20 APIs require implementation of 20 different auth methods, while 20 scrapers use 1 authentication method; the only thing that differs is what the username and password fields are called)