But some of you have, and I figured they make for fun and interesting stories ;-)
So what's your favorite method to send large files of at least 5GB or bigger? Though I'm also curious how you'd send 10TB or more.
Transferring the torrent metadata is pretty trivial and can be done via a wide range of methods, and having that flexibility can be nice.
Unlike HTTP, you get reasonable retry behavior on network hiccups, plus more robust data integrity guarantees, though a manual hash check is probably a good idea either way.
At least among people I'm throwing TBs of data around with, torrent infra is common and it's nice to not have to deal with some special-purpose tool that, in practice, is probably a pain in the ass to get compatible versions deployed across a range of OSes. Basically every platform known to man can run a torrent client of some sort.
And obviously, no dependency on an intermediary. This is good if you're trying to avoid Google et al. That does, however, bring a potential con: if my side is fast and your side is slow, I'm seeding until you're done. If I'm uploading something to gdrive or whatever, I can disconnect once the upload is done. If you control an intermediary like a seedbox, that's less of a problem.
In general, though, torrents are pretty great for this sort of thing. Just encrypt your stuff beforehand.
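A minimal sketch of that flow (filenames and tracker URL are made up; mktorrent is just one of several tools that can build the metadata):
$ gpg --symmetric --cipher-algo AES256 big.tar.gz   # writes big.tar.gz.gpg
$ mktorrent -a udp://tracker.example.org:1337/announce -o big.torrent big.tar.gz.gpg
Seed big.tar.gz.gpg in any client, then hand big.torrent (and the passphrase, out of band) to the recipient.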
5GB on Internet: Upload to my OVH VPS and HTTP or Rsync it to its destination
10TB, local or Internet: Physical Hard Drive.
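For the VPS route, the flow is roughly this (hostname and paths are made up):
$ rsync -avP bigfile.tar.gz user@vps.example.com:/var/www/files/
# recipient pulls it down over HTTP:
$ curl -O https://vps.example.com/files/bigfile.tar.gz
rsync's -P flag keeps partial transfers, so an interrupted upload can resume where it left off.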
Never underestimate the bandwidth of a station wagon full of backup tapes!
https://www.tidbitsfortechs.com/2013/09/never-underestimate-...
And since you can now buy 1TB micro SD cards, perhaps I'd split the file 11 ways (no way it'll fit exactly) and send them via carrier pigeon. Or heck, I could just tape them to a drone and hope they aren't damaged in the crash. There's lots of ways to move data around. Maybe you want to UUENCODE it and submit bad URLs to a server's log so that it can be exfiltrated later? It would probably take a very, very long time, but could be done. I call it "CURLyTP"
https://miscdotgeek.com/curlytp-every-web-server-is-a-dead-d...
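A rough sketch of the CURLyTP idea, using base64url instead of UUENCODE so the URLs stay legal (GNU split/base64 assumed; target host and chunk size are made up):
$ split -b 512 -d secret.bin chunk_
$ for f in chunk_*; do curl -s "https://target.example/exfil/$f/$(base64 -w0 "$f" | tr '+/' '-_')" >/dev/null; done
Each 404 in the target's access log now carries one chunk of the file, ready to be reassembled later.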
If it is above 50G and on my personal computer, I encrypt the data and then physically mail a USB stick with the data on it. Trying to arrange the downloading/uploading of 50G of data from my personal computer to another personal computer is a real pain. The people that I would send/receive that much data to/from, that is stored on my personal computer are usually people who don't know much about ftp, scp or easily sharing files over the Internet. Sending a USB stick is just so much easier and in many cases, faster. I make sure the recipients know how to decrypt the data before sending the data.
If it is on a server (for example, a side project instance in AWS), then I drop it into an S3 bucket, export a URL, and send it to the recipient. I just eat the cost of them downloading it. Usually I am making money off the work anyways, so it is the cost of doing business.
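For that S3 step, a pre-signed URL avoids making the bucket public (bucket and key names are made up):
$ aws s3 cp ./export.tar.gz s3://my-side-project-bucket/export.tar.gz
$ aws s3 presign s3://my-side-project-bucket/export.tar.gz --expires-in 604800   # link valid for 7 days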
Between Macs, AirDrop works very well; I have sent >10GB files that way, quite quickly too.
I have never sent a 10TB file, so I wouldn't know. None of my drives are that large yet :)
On the receiving end:
$ nc -l 4242 > dest
And then on the sending end:
$ nc hostname 4242 < file_to_send
This works great when you just need to get a file of any size from one machine to another and you're both on the same network. I used this a lot at one of my offices to schlep files around between a few of the machines we had.
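The same trick extends to whole directories by streaming a tar archive through the pipe (a sketch, assuming GNU tar on both ends):
On the receiving end:
$ nc -l 4242 | tar -xf -
On the sending end:
$ tar -cf - some_directory | nc hostname 4242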
However, no-one has mentioned a super simple service: https://wetransfer.com/ - simple as drag & drop, enter recipient address, SEND. Pretty handy if you want a non-techie to send you something.
It's like a private dropbox.
For files on the order of <= 10GB, magic wormhole is lovely: https://techcrunch.com/2017/06/27/magic-wormhole-is-a-clever...
`sudo pip3 install magic-wormhole` is an automatic "must run this command" on all my servers. Simple, secure way of sending files from A to B, with no bullshit. No account creation, even.
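The whole exchange looks like this (the code shown is illustrative; wormhole generates a fresh one per transfer):
$ wormhole send backup.tar.gz
# prints something like: wormhole receive 7-crossover-clockwork
and on the receiving machine:
$ wormhole receive 7-crossover-clockwork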
Here's a couple of my projects:
https://patchbay.pub code[0]
Sender:
curl https://patchbay.pub/random-channel/filename.bin --data-binary @filename.bin
Receiver:
curl -O https://patchbay.pub/random-channel/filename.bin
https://fbrg.xyz code[1]
This one works in the browser. You select a file and it gives you a link you can share with others. The underlying tech[2] is more complicated (relies on WebSockets) than patchbay, but the server is currently more reliable. I'm in the process of improving the patchbay server to be more robust, and I think it will eventually replace fibridge completely.
My current project is building something along the lines of Google Drive, but much simpler and designed to be self-hosted from within your home.
[0]: https://github.com/patchbay-pub/
I gave that question a few tries, but I feel part of the problem is the market being saturated by giants doing half the job. Another part is the lack of interoperability for file sharing between operating systems. I mean: a native right click > send to > user, and a native notification to that person with a direct p2p download. No software needed, seamless integration. Why is that so hard?
I really wish the big boys would give that a try rather than giving us toy-like features and price bump.
For my 12TiB of data I use Syncthing when I need to sync more often; otherwise, rsync.
I have used rsync several times for billions of smaller files totalling 300GiB, but it really all depends on how I connect the nodes. I prefer Syncthing, but when only ssh is available, rsync is good too.
Currently the largest directory synced by Syncthing (among installs that share usage stats) is over 61384 GiB :)) https://data.syncthing.net/
If it's for more than 1 person I upload it to a VPS if it's small (<20 GB) or make a torrent if it's not.
$ python -m http.server 8000
and then you can start ngrok to expose the file
$ ngrok http 8000
that will give you a URL to share with whoever wants it.
For local network:
I use miniserve ( https://github.com/svenstaro/miniserve ) which is just a simple http server. There are similar tools for when I want to share it from the smartphone.
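Usage is about as simple as it gets (default port is 8080, if I remember right; check miniserve --help):
$ miniserve ~/shared   # serves the directory over HTTP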
Through the internet it really varies:
Sometimes it is Firefox send ( https://send.firefox.com/ )
For photos, I use a self hosted photo solution piwigo ( https://github.com/Piwigo/Piwigo )
In earlier days it has been a self hosted nextcloud ( https://github.com/nextcloud/server ) instance. I still use it when the files are getting too large for Firefox send.
I also tried the already-mentioned wormhole, but that only works with tech-savvy people.
I've never needed to send TBs of data.
I had to keep Netscape open because it showed the status of the file download. People asked why my web browser was open all day; it was for the giant download that was part of my job.
For the thing you did, these could have worked.
Windows now comes with a real sshd; Mac has it, and Linux/BSD of course have it.
For small files (less than 10GB?) generally scp, though I'm trying to get in the habit of using rsync; scp can't resume an interrupted transfer the way rsync can, so it isn't a great idea for anything big.
For larger filesystems, zfs send over ssh (sketch below).
For streaming: dlna/upnp over zerotier vpn.
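A minimal zfs send sketch for that filesystem case (pool/dataset names are made up, and the remote needs an existing backup pool):
$ zfs snapshot tank/data@xfer
$ zfs send tank/data@xfer | ssh user@remote.example.com zfs receive backup/data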
Shame the Firefox Send service imploded - for smaller files, where ssh isn't an option, it was a nice service. But a little too much hassle to self-host.
Didn't realize there was another way.
We use it for copying hundreds of TiBs regularly, occasionally over a PiB.
For ssh, there's https://www.psc.edu/index.php/hpn-ssh for improving wide-area transfer speeds, and something else, I think.
Otherwise I liked Firefox Send while it was running since I mostly trust them to not be full of intrusive nonsense.
When I want to make something large available to someone else, I post it on my local webserver. It points to a NAS on my LAN; anyone in the house can just drop whatever on it and hand out a link.
I've never used it myself, but I've heard it's actually quite impressive; the business model makes it inaccessible to individuals, though.
Apparently they do some sort of complicated trace routing and open a ton of simultaneous routes. It probably also uses something other than TCP for error correction.
Seems like a good place for some disruption
- iCloud drive for personal stuff I need to share between devices. I trust this to sync anything up to a TB reliably.
- Google Drive when I need to share to someone else.
- Occasionally external drives when I need to move data fast locally.
- Some combination of S3 / AWS CLI / EC2 when things go beyond personal computer capacity depending on where the data is coming from and going to.
For bigger things, there are two basic paths:
If it's to someone that's not primarily a tech person, then the data goes on a portable hard drive, and I either drive it to them, or mail it and walk them through accessing the data. In both cases I encrypt the data, generally with Veracrypt.
If it's to someone that won't mind using AWS's toolchain, and has decent connectivity, I'll use Snowcone or Snowball to get the data to S3 and give them S3 URLs.
I tend to get the data to S3 sooner or later so I can migrate it to Glacier. More than once I've had to recover GBs or TBs of data because the customer lost it, so I'm prepared now.
https://www.hightail.com/file-sharing
Hightail offers four subscription levels for file sharing, with varying file upload limits and storage:
- Lite (free): files up to 100MB each, 2GB storage
- Pro (from $15/month): files up to 25GB each
- Teams (from $30/user/month): files up to 50GB each
- Business (from $45/user/month): files up to 500GB each
Pro, Teams and Business plans come with unlimited storage.
pipx install magic-wormhole
wormhole send /some/big/stupid/thing.tar.gz
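On the other end, wormhole prints a one-time code for the receiver to type in (the code here is made up):
wormhole receive 7-guitarist-revenge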
That's good for file transfers in the 5GB range, give or take.
For files I need more control over that are less than, say, 5GB, I tend to scp them to a web server I control, so that I can delete them afterwards.
For files larger than that, I'll use a private BitTorrent transfer. It's very rare that I need to transfer files this large, but I really like this solution.
What about using "Firefox Send"? (I've never used it so far)
https://support.mozilla.org/en-US/kb/send-files-anyone-secur...
I read that the limit is 1-2.5GB => maybe you could break down the file and upload it in multiple pieces... .
EDIT: oops, Firefox Send doesn't seem to be available anymore - https://support.mozilla.org/en-US/kb/what-happened-firefox-s...
You should go to another cloud computing provider if you don't like Google Drive.
My third alternative is even easier, but it's a bit limited since it cuts you off at two gigs either way. It is a service called WeTransfer, which was almost your only choice 15 years ago if you had to send a huge file.
But for something like 10TB or more, I'd see a torrent as the only way. My upload speeds are too slow for anything else; the connection would get reset. The torrent also helps prevent corruption.
When I'm the receiver, I'm using a self-written CLI tool "send me a file" (smaf): https://github.com/nikeee/send-me-a-file
It basically opens up an HTTP server on my machine and lets a user upload a file via curl or a web interface. It also computes a SHA-256 hash while receiving.
These methods only work on a local network. For transfers beyond the local network, I use my VPS and sftp/http.
The only person I'm sharing with lives near me, so sneakernet is the most convenient.
I once hit a transfer speed of 30TB/hour carrying a box of hard drives home from work.
Now I live in the heart of Silicon Valley, a couple km from Google's headquarters, and have a crappy 15 Mbps uplink with no other options available to me, so I typically throw the file on Amazon S3, Google Drive, or Dropbox before sending a link to the other person so that they don't have to put up with a 15 Mbps download.
For larger transfers, I'd look at File Catalyst Spaces. If you have a bigger budget, you could look at IBM Aspera or Media Shuttle.
<10GB - chuck it on S3 and send a link to the other person.
I see a lot of people mentioning magic-wormhole and NAT traversal. I can't find any docs that confirm this. I think it always runs through a relay server?
Everything works as expected, there is a lot of storage space and end-to-end encryption. There are clients for all platforms and it just works.
I can easily create secure share links whenever I want.
The only downsides are that the browser-based app is very slow to start once you have lots of files, and the Android client does not have the same synchronization capabilities.
Is it one 10TB file? Multiple files? How do you handle file integrity? Error correction?
For much larger files I always fall back to BitTorrent. I have also kept an eye on WebWormhole, but haven't been in a situation yet where I needed it.
Edit: I guess it was magic wormhole, discussed elsewhere in this thread.
But HTTP is not a great protocol for really large transfers like 10TB. Ideally you'd want something that parallelizes and does hash checks.
Even then, at 10TB you need about a 2 gigabit connection just to be competitive with SSD-and-overnight-shipping: 10TB over ~12 hours works out to roughly 230MB/s, or a bit under 2 Gbit/s.
Too bad the laptop I am transferring from is in bad condition and runs Windows, so sometimes the iCloud service freezes and I need to restart it.
I tried to create a torrent, but it seems to take forever on such a weak machine.
When I don't I usually just zip it and upload it to the media section of my Wordpress-managed website.
I've never generated files in the terabyte range, so I don't worry about that.
I did it like you "in the past" (on my own fqdn / public-data) but I've given up on maintaining my website.
A torrent works really well if you've got multi-TB files and many recipients.
Simple. Works.
Send help... no seriously... how do I upload a file from a server I have a shell on but can't easily open ports to or install software on (and where the built in cp tool fails on big files...)?
Never needed to send anyone 10TB, nor do I have that much storage, but if you do, you probably have some kind of NAS/server where you can enable FTP access.
Free/open source
a raven with a sufficiently large micro SD card ;)
Technically unlimited storage on OneDrive; you can force the recipient to verify 2FA via their email address, force link expiration, etc.
If it's too big for cloud storage, ship a hard drive in the mail.
You can send a 10TB hard drive by USPS faster than you could transmit it across the internet: even at 100 Mbps, 10TB takes over nine days to push through the pipe.
I think Google Drive is a decent solution as well.
Otherwise, I've mailed a flash drive before, and for <20GB I'll just put it on a public S3 bucket.
Sometimes I'll have to open the file in a hex editor and flip a byte so the hash changes, if it sets off copyright filters.
This is a 10 gig limit.