HACKER Q&A
📣 resume384

What's broken about the internet?


I've been interacting with a group of interesting people through the Mozilla Builders / Fix-The-Internet project. There are a lot of great project ideas going on there, and talk of building Web 3.0. Interesting stuff... but I'm curious: what's broken about the internet, the web, and tech? If we're going to fix something, it seems good to have a list of what we're trying to fix. So what do you see as being broken? Let's build a list I can take back to the group looking to fix things!

https://builders.mozilla.community/ https://mozillabuilders.slack.com/


  👤 kaffeeringe Accepted Answer ✓
These days you need an internet device and an anti-internet device like a Pi-Hole, plus half a dozen browser plugins. Every driver for every part of your computer is trying to sneak out some of your data. Every website is full of trackers. You are not only fighting off criminals but also some of the most powerful corporations and every government. A handful of companies run large parts of the infrastructure - you couldn't stop relying on Google, Facebook, Microsoft, Apple and Amazon even if you wanted to. The centralized social networks are hate and manipulation machines. Everything wants to sell you something. And even when you pay to buy it, it tries to steal your data anyway.
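For the curious, the core of what a Pi-Hole does is small: answer DNS lookups for known tracker domains with an unroutable address and pass everything else upstream. A toy sketch of that decision in Python (the blocklist entries here are placeholders; a real Pi-Hole loads millions from published lists):

    import socket

    # Hypothetical blocklist entries, standing in for real tracker lists.
    BLOCKLIST = {"tracker.example", "ads.example"}

    def resolve(domain: str) -> str:
        labels = domain.lower().split(".")
        # Block the domain itself and any subdomain of a blocked domain.
        if any(".".join(labels[i:]) in BLOCKLIST for i in range(len(labels))):
            return "0.0.0.0"  # sinkhole: the tracker request goes nowhere
        return socket.gethostbyname(domain)  # normal upstream resolution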

👤 dfex
In no particular order:

- The WWW - from a delightfully whimsical follow-the-rabbit-hole experience in the early 90s, where discovery was hard but content was king, to a centralised, AI-ranked, viral echo-chamber in the 00s and 10s, where the discovery engines control the whole experience and the content you see is presented based on who pays. The key problem here is the monetisation of discovery. We need to solve search and discovery, but deliver it as an open Internet standard like DNS or HTTP.

- Smart devices - buying an appliance with a closed-source, embedded device that relies on Internet connectivity and the solvency of its manufacturer in order to operate it, patch it, secure it, and maintain it is the antithesis of what this planet needs right now. When the CA root certificates installed in your no-name smart TV expire and the OEM no longer exists or doesn't care to provide a firmware update, these devices become less than worthless and most likely landfill. We need industry to adopt an open framework for smart devices that helps prolong their lifespan, e.g. a public Linux repo for updates to the underlying OS - the OEMs can deploy their own user interface, but end-users should be able to pick and choose if they wish (and most won't).

- Trust - in particular X.509 certificates. A lot of progress has been made in making trust via digital certificates the default rather than a paranoid exception, with a large portion of the web now delivered over HTTPS, RPKI for BGP being deployed by large operators, and DNSSEC showing some (admittedly slow) signs of adoption. What is still a major problem in this area is the complexity of certificate management and renewal. The work Let's Encrypt and the EFF (certbot) have done in automating this process is fantastic, but these tools are still a long way from mainstream usage.
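To make the renewal burden concrete: even just noticing that a certificate is about to expire is left to each operator. A minimal sketch in Python of that check, using only the standard library (the hostname is just an example):

    import socket
    import ssl
    import time

    def days_until_expiry(host: str, port: int = 443) -> float:
        """Days until `host`'s TLS certificate expires."""
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=10) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                not_after = tls.getpeercert()["notAfter"]
        # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'
        return (ssl.cert_time_to_seconds(not_after) - time.time()) / 86400

    print(days_until_expiry("example.com"))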


👤 zzo38computer
Well, the biggest problem with the internet seems to be the World Wide Web itself. NNTP has improved (some of those improvements at my suggestion, such as support for 63-bit article numbers), and Gopher has also improved since the original specification (there are now "i" type lines in menus, which are useful), but the WWW has just gotten worse (and that includes Hacker News to some degree, though not as much as Google and Facebook and so on). I have set up not only HTTP but also NNTP, Gopher, QOTD, SMTP, etc., and may later also set up IRC, Telnet, Viewdata, etc. Additionally, many things I serve over HTTP are just direct downloads anyway; no need to deal with 100 megabytes of web page just to access a text file or other file that you wanted to download. And you can do it too, if you want to, I suppose.
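For anyone who has never looked at it, the Gopher protocol praised above really is this small: open a TCP connection to port 70, send a selector line, and read until the server closes the connection. A minimal client sketch in Python (the host is just a well-known public Gopher server, chosen here as an example):

    import socket

    def gopher_fetch(host: str, selector: str = "", port: int = 70) -> bytes:
        # Send the selector terminated by CRLF, then read until EOF.
        with socket.create_connection((host, port), timeout=10) as sock:
            sock.sendall(selector.encode("ascii") + b"\r\n")
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks)

    # Menu lines are tab-separated: type+display text, selector, host, port.
    # Type "i" lines are the informational text mentioned above.
    for line in gopher_fetch("gopher.floodgap.com").decode("latin-1").splitlines():
        if line == ".":  # a lone dot terminates the menu
            break
        item = line.split("\t")[0]
        print(item[:1], item[1:])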

👤 sixhobbits
Content mills -

Someone wise said that tools that make writing easier turn bad writers into worse writers.

Goodhart's law means that nearly all content is broken. It is judged by its view count and Google rank, so all it is good for is getting clicks and ranking well.

So much effort, so much money, ploughed into creating really, really awful content which hides away the 10% that isn't crap (Sturgeon's law).


👤 musicale
The business model seems to be broken. I pay a giant internet bill each month, with basically insane margins relative to what it should actually cost to deliver much faster connectivity than I get, yet none of that extra money funds the things I actually use the internet for. (Why does it cost that much? Part of the reason may be that our local cable company is a government-enforced monopoly, that it's next to impossible to get right-of-way for new fiber, and that wireless is currently uncompetitive. Perhaps 5G may change things, but I'm not holding my breath.)

Moreover, the internet makes distribution nearly free and lets a nearly unlimited number of people access information and digital media from all over the world, but lengthy copyright terms (70 years or more) make it illegal to do so in many cases. Instead, thousands or millions of person-hours are spent on the impossible task of trying to make bits behave like physical objects in order to satisfy legal and business requirements. When an organization such as the Internet Archive tries to build a digital library whose collection isn't bound by the constraints of physical libraries, it is sued by publishers for copyright infringement and potentially liable for $150k in statutory damages per occurrence.


👤 Shared404
IANAE, but it seems like "the web is for transfer of information" and "the web is for applications" should be split into two separate domains. I shouldn't need what amounts to an OS just to read HN or browse a text-based site.

👤 ricardo81
Privacy and removal of insidious tracking would be a good start.

I feel the web is too centralised, with half a dozen or so platforms essentially being gatekeepers of content on the web.

People link out from their sites much less than in the past, in the belief that outbound links raise the chance of a Google penalty and hurt their rankings. Last I looked, search engines were typically responsible for delivering around 50% of visitors to a site, and Google has a near-monopoly in many countries.

Wikipedia, while great, has a less-than-obvious set of rules and regulations for anyone contributing to it. It typically ranks first on all major search engines for almost any query.

Social media have become moral compasses for what is OK and what is not OK to talk about.

A more diversified web, moving away from these 'decision makers', would IMO make it a healthier place.


👤 tibbydudeza
Make it more decentralized... why should stupid laws made in the US impact my experience?

👤 RandomBacon
The lack of privacy and security. Throwing an ad-blocker on a browser is just a band-aid; the festering wound is still there getting worse.

👤 6510
My world view (or more like a cosmic view) is this: space and matter are fascinating phenomena, but the truly fascinating things happen between the two. Empty space and time are quite boring until you put matter in them.

The "Let the output of your program be the input of mine." philosophy was really good. Web applications do the exact opposite. All of the interesting shit is missing.

Keeping the parts of a thing in separate places can be necessary at times, but usually it is not. There is the layout of an article website, and there are the articles in it. The articles with their videos and images are separate things from the layout, as much as browsers are separate things from web pages (perhaps even more so).

The article could be a separate file: like an iframe, but as inline text.
