Off the top of my head I can think of a number of other ways to store directory metadata: a Finder-specific database mapping values to directory paths, resource forks (à la OS 9), etc.
99% of the reason it exists is that the OS generates thumbnails. I see the file even in, for example, zips that contain no images at all.
You know what I've seen? People do a blanket `git add` and then I find that file in the repo. Why does git not ignore it automatically?! It's never the right option.
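For what it's worth, you can make git ignore it in every repo via a global excludes file. Minimal sketch, assuming ~/.gitignore_global as the path (any path works):

git config --global core.excludesFile ~/.gitignore_global
echo ".DS_Store" >> ~/.gitignore_global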
Anything* that pollutes directories should be absolutely verboten; it's never what the user wants.
Edit: The worst part is, it makes them on network shares, and on filesystems that aren't HFS. The hell, Apple?!
It doesn’t really bother me as it is, but it’s on the less-than-perfect list.
None of your solutions preserve that backwards compatibility -- the database wouldn't, because it presumably stays on your machine, and going back to resource forks wouldn't, because earlier OS releases wouldn't know they exist.
Doesn’t mean it’s right... but this is my anecdotal experience.
Haiku does things differently, and elegantly: https://medium.com/@probonopd/my-sixth-day-with-haiku-under-...
# /etc/crontab entry: every 20 minutes, delete every .DS_Store on the disk
*/20 * * * * root find / -depth -name ".DS_Store" -type f -exec rm {} \;
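If sweeping the whole disk from the system crontab feels heavy-handed, a gentler sketch (assuming you only care about your own files) goes in a user crontab, which takes no user field:

*/20 * * * * find "$HOME" -name ".DS_Store" -type f -delete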
You can tell the OS not to create them on network-mounted drives:

defaults write com.apple.desktopservices DSDontWriteNetworkStores -bool true
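If I remember right there's a companion default for removable drives as well; the key name here is from memory, so verify it before relying on it:

defaults write com.apple.desktopservices DSDontWriteUSBStores -bool true

Either way, log out and back in (or restart) before expecting Finder to honor it.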
Instead I'll say we can have a set of critters that cull them in various interesting ways: critters that watch a filesystem and remove these files if they still exist an hour after creation, ones that remove them all on a schedule, blocklist entries in .gitignore and the like, sync scripts that blocklist matching file and folder names. The solutions are endless (one-liner sketch below).
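To make the hour-old idea concrete, a one-liner sketch; it assumes a find that supports -mmin and -delete, which both macOS's BSD find and GNU find do:

find . -name ".DS_Store" -type f -mmin +60 -delete

Run it from whatever root you care about, and wire it into cron or a filesystem watcher as you like.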
This isn’t only a Mac thing, either. Other OSes create their own debris -- Windows has Thumbs.db and desktop.ini, for example.
A quick Google search finds several blog posts with commands to run.
Apple usually prioritizes bugs by duplicate count.