So let's assume about a TB of data (or a few TB in the future), with slow but regular uploads and very rare download access. I want to minimize cost, but also the effort of keeping things in sync (assume limited technical expertise).
What's the best option?
Having three copies of the data (one of them local) is safe enough for me; you can multiply that by as much as you're willing to pay.
But I think the more important choice is the front-end tool: what format it creates, whether the content is encrypted before it leaves your machines, and so on.
Once you decide which front-end you want to use, then start looking at which back-end might work best with it.
For my part, I believe that it is best to do the first level of backups locally to a NAS device (with encryption before the files hit the server), and then secondarily back up the NAS device to the cloud (with another layer of encryption before the files leave your site).
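To make the "encryption before the files leave your machine" part concrete, here's a minimal Python sketch using the `cryptography` package's Fernet recipe. The key file and paths are hypothetical examples, and in practice a backup front-end (restic, Duplicati, Borg, etc.) handles this for you:

```python
# Minimal sketch: encrypt a file locally before it is copied to the
# NAS or the cloud. Key location and file names are hypothetical.
from pathlib import Path
from cryptography.fernet import Fernet

KEY_FILE = Path("backup.key")  # keep this OFF the NAS and the cloud

def load_or_create_key() -> bytes:
    if KEY_FILE.exists():
        return KEY_FILE.read_bytes()
    key = Fernet.generate_key()  # 32-byte urlsafe base64 key
    KEY_FILE.write_bytes(key)
    return key

def encrypt_file(src: Path, dst: Path) -> None:
    # Fernet = AES-128-CBC plus an HMAC-SHA256 integrity check.
    # Note: reads the whole file into memory, fine for a sketch only.
    f = Fernet(load_or_create_key())
    dst.write_bytes(f.encrypt(src.read_bytes()))

encrypt_file(Path("photos.tar"), Path("photos.tar.enc"))
```

Whatever tool does the encrypting, keep a copy of the key somewhere that isn't the NAS or the cloud bucket, or the backups are unrecoverable.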
It would be cheaper still if I used Amazon Glacier, but I prefer to have the speedier access when needed.
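For reference, in AWS terms that price/speed trade-off is just a storage-class flag on the upload. A rough boto3 sketch, with a made-up bucket and file name:

```python
# Rough sketch: same upload call, different storage class.
# Bucket and object names are hypothetical.
import boto3

s3 = boto3.client("s3")

# Cheapest per GB stored, but restores can take hours:
s3.upload_file("photos.tar.enc", "my-backup-bucket", "photos.tar.enc",
               ExtraArgs={"StorageClass": "GLACIER"})

# Costs more per GB, but the object is immediately readable:
s3.upload_file("photos.tar.enc", "my-backup-bucket", "photos.tar.enc",
               ExtraArgs={"StorageClass": "STANDARD_IA"})
```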
With this backup frequency I would lose some data, but I don't generate much important data, so backing up after the few times I do generate it is fine with me. The only exception is code, but I use GitHub, so that is covered.
If you don't have upstream bandwidth, try USB disks and store them in a safe place somewhere else. It's an expensive drill in terms of commitment, but I've seen it in practice and it works.