HACKER Q&A
📣 mettamage

What is your favorite method of sending large files?


I just opened up a simple HTTP server to send someone a large file. Then I figured, I never gave this question proper thought.

But some of you have, and I figured they make for fun and interesting stories ;-)

So what's your favorite method of sending large files, say 5GB or bigger? I'm also curious about how you'd send 10TB or more.


  👤 livueta Accepted Answer ✓
BitTorrent. No, really. Lots of nice behaviors when transferring large amounts of data between arbitrary endpoints.

Transferring the torrent metadata is pretty trivial and can be done via a wide range of methods, and having that flexibility can be nice.

Unlike HTTP, you get reasonable retry behavior on network hiccups. Also, more robust data integrity guarantees, though a manual hash test is probably a good idea either way.

At least among people I'm throwing TBs of data around with, torrent infra is common and it's nice to not have to deal with some special-purpose tool that, in practice, is probably a pain in the ass to get compatible versions deployed across a range of OSes. Basically every platform known to man can run a torrent client of some sort.

And obviously, no dependency on an intermediary. This is good if you're trying to avoid Google et al. That does, however, bring a potential con: if my side is fast and your side is slow, I'm seeding until you're done. If I'm uploading something to gdrive or whatever, I can disconnect once the upload is done. If you control an intermediary like a seedbox, that's less of a problem.

In general, though, torrents are pretty great for this sort of thing. Just encrypt your stuff beforehand.
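
A minimal sketch of making such a torrent with mktorrent (the tracker URL and filenames here are just placeholders):

  # encrypt first, then build a torrent with the private flag set
  gpg --symmetric --output data.tar.gpg data.tar
  mktorrent -p -a udp://tracker.example.com:1337/announce -o data.torrent data.tar.gpg

Then pass data.torrent (or a magnet link) to the other side however you like and keep seeding until they finish.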


👤 geocrasher
5GB on my local network: Windows File Share, Rsync, or HTTP depending on source/destination

5GB on Internet: Upload to my OVH VPS and HTTP or Rsync it to its destination

10TB, local or Internet: Physical Hard Drive.

Never underestimate the bandwidth of a station wagon full of backup tapes!

https://www.tidbitsfortechs.com/2013/09/never-underestimate-...

And since you can now buy 1TB micro SD cards, perhaps I'd split the file 11 ways (no way it'll fit exactly) and send them via carrier pigeon. Or heck, I could just tape them to a drone and hope they aren't damaged in the crash. There are lots of ways to move data around. Maybe you want to UUENCODE it and submit bad URLs to a server's log so that it can be exfiltrated later? It would probably take a very, very long time, but it could be done. I call it "CURLyTP".

https://miscdotgeek.com/curlytp-every-web-server-is-a-dead-d...


👤 mullen
If the data is below 50G in size and on my personal computer, then I just drop it into the Google Drive folder and it syncs to Google overnight. When it is done, I export it and send a link to the person(s). I pay $2 a month for the 100G account and usually have about 50G of disk space unused, so this is not an issue for me.

If it is above 50G and on my personal computer, I encrypt the data and then physically mail a USB stick with the data on it. Trying to arrange the downloading/uploading of 50G of data from my personal computer to another personal computer is a real pain. The people I send/receive that much personally-stored data to/from are usually people who don't know much about ftp, scp, or easily sharing files over the Internet. Sending a USB stick is just so much easier and, in many cases, faster. I make sure the recipients know how to decrypt the data before sending it.

If it is on a server (for example, a side-project instance in AWS), then I drop it into an S3 bucket, export it, and send the URL to the recipient. I just eat the cost of them downloading it. Usually I am making money off the work anyway, so it is the cost of doing business.
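
A rough sketch of that flow with the AWS CLI (bucket and file names are placeholders; pre-signed URLs max out at 7 days):

  aws s3 cp bigfile.bin s3://my-bucket/bigfile.bin
  aws s3 presign s3://my-bucket/bigfile.bin --expires-in 604800

The second command prints a URL the recipient can download from without any AWS credentials.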


👤 kvn_95
Magic Wormhole (https://magic-wormhole.readthedocs.io/en/latest/) if it's between OSes.

Between Macs (macOS), AirDrop works very well. I have sent >10GB files between Macs, quite quickly as well.

I have never sent a 10TB file, so I wouldn't know. None of my drives are that large yet :)


👤 ashton314
Netcat:

    $ nc -l 4242 > dest
And then on the sending end:

    $ nc hostname 4242 < file_to_send
This works great when you just need to get a file of any size from one machine to another and you're both on the same network. I used this a lot at one of my offices to schlep files around between a few of the machines we had.
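
A variant of the same idea with progress and an integrity check, assuming pv and sha256sum are available on both machines:

    $ nc -l 4242 | pv > dest              # receiver: shows throughput while writing
    $ pv file_to_send | nc hostname 4242  # sender
    $ sha256sum file_to_send              # compare against sha256sum dest on the receiver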

👤 BrandonM
I remember using AIM (AOL (America Online) Instant Messenger) and other instant messaging applications for direct peer-to-peer file transfer back in 2002. On fast university connections, you could transfer movies in a few minutes. It's crazy to me that direct, fast file sharing was easier and more ubiquitous almost 20 years ago than it seems to be now. More context for my oft-misunderstood early feedback to dhouston about Dropbox.

👤 jve
Private Nextcloud. Or some free-tier OneDrive/alternative is enough for me.

However, no one has mentioned a super simple service: https://wetransfer.com/ - simple as drag & drop, enter the recipient's address, SEND. Pretty handy if you want a non-techie to send you something.


👤 sillysaurusx
Syncthing. https://syncthing.net/

It's like a private dropbox.

For files on the order of <= 10GB, magic wormhole is lovely: https://techcrunch.com/2017/06/27/magic-wormhole-is-a-clever...

`sudo pip3 install magic-wormhole` is an automatic "must run this command" on all my servers. Simple, secure way of sending files from A to B, with no bullshit. No account creation, even.
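
For anyone who hasn't tried it, the flow looks roughly like this (the code phrase below is the example from the docs; yours will differ):

  wormhole send big-file.tar.gz
  # prints a one-time code, e.g. 7-crossover-clockwork
  # the receiver runs:
  wormhole receive 7-crossover-clockwork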


👤 anderspitman
I've been working on solutions in this space for a couple years now. IMO making data on your local device available to others via HTTP range requests is the sweet spot between storing your data in the cloud and going full p2p.

Here's a couple of my projects:

https://patchbay.pub/

code[0]

Sender:

  curl https://patchbay.pub/random-channel/filename.bin --data-binary @filename.bin
Receiver:

  curl -O https://patchbay.pub/random-channel/filename.bin

https://fbrg.xyz

code[1]

This one works in the browser. You select a file and it gives you a link you can share with others. The underlying tech[2] is more complicated (relies on WebSockets) than patchbay, but the server is currently more reliable. I'm in the process of improving the patchbay server to be more robust, and I think it will eventually replace fibridge completely.

My current project is building something along the lines of Google Drive, but much simpler and designed to be self-hosted from within your home.

[0]: https://github.com/patchbay-pub/

[1]: https://github.com/anderspitman/fibridge-proxy-rs

[2]: https://github.com/omnistreams/omnistreams-spec


👤 BenjiWiebe
Sending a 10TB file on my internet connection would take 2.5 years of constant uploading. Shipping a hard drive is cheaper and quicker.

👤 tbronchain
It doesn't have to be large files; file sharing sucks in general. Even copy/paste across devices, even your own devices, is a pain. How many times have I emailed myself links just to share them across devices? And I'm not even speaking about sharing pictures with people not very accustomed to technology (in other words, don't try to get them to install a cloud storage app).

I gave that question a few tries, but I feel part of the problem is the market being saturated by giants doing half the job. Another part of it is the lack of interoperability in file sharing between operating systems. I mean: a native right click > send to > user, and a native notification to the recipient with a direct p2p download from that person. No software needed, seamless integration. Why is that so hard?

I really wish the big boys would give that a try, rather than giving us toy-like features and price bumps.


👤 akerro
>I'm also curious on how you'd send 10TB or more.

For my 12TiB of data I use Syncthing when I need to sync it more often; otherwise, rsync.

I have used rsync several times for billions of smaller files totalling 300GiB, but it really all depends on how I connect the nodes. I prefer Syncthing, but when only ssh is available, rsync is good too.

Currently the largest directory synced by Syncthing (among installs that share usage stats) is over 61384 GiB :)) https://data.syncthing.net/


👤 superkuh
I have run an internet-reachable whatever.com webserver from my home desktop computer for 20 years. I just copy or softlink the file to a place in ~/www/ and give out the corresponding web link. I have a couple of nginx locations prepared with pre-existing bandwidth throttles, so it's a matter of soft-linking to the appropriate ~/www/dirwithassociatedrate/.

If it's for more than 1 person I upload it to a VPS if it's small (<20 GB) or make a torrent if it's not.


👤 itake
You can create a simple Python HTTP server to expose files in a directory to the local network:

    $ python -m http.server 8000

and then start ngrok to expose it to the internet:

    $ ngrok http 8000

That will give you a URL to share with whoever wants the file.


👤 GRBurst
It depends on whether I want to send it to someone on the local network or through the internet.

For local network:

I use miniserve ( https://github.com/svenstaro/miniserve ) which is just a simple http server. There are similar tools for when I want to share it from the smartphone.

Through the internet it really varies:

Sometimes it is Firefox send ( https://send.firefox.com/ )

For photos, I use a self-hosted photo solution, Piwigo ( https://github.com/Piwigo/Piwigo )

In earlier days it was a self-hosted Nextcloud ( https://github.com/nextcloud/server ) instance. I still use it when the files get too large for Firefox Send.

I also tried the already-mentioned wormhole, but that only works with tech-savvy people.


👤 cpach
Related: If you need to transfer sensitive data over BitTorrent, age is a good tool for encrypting it before transmission.

https://github.com/FiloSottile/age
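
A minimal sketch of the passphrase flow (filenames are placeholders):

  age -p -o data.tar.age data.tar     # encrypt; prompts for a passphrase
  age -d -o data.tar data.tar.age     # recipient decrypts with the same passphrase

Then make the torrent from the .age file.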


👤 Paul-ish
I use Dropbox to send and receive files that are over the email/Slack/etc. attachment limit. Dropbox has a cool "request files" feature that lets people upload files directly to your Dropbox.

I've never needed to send TBs of data.


👤 jedimastert
I asked the same question on HN about 3 years ago[0]. I'm curious to see how the answers have changed and what might have stayed the same.

[0]: https://news.ycombinator.com/item?id=15440571


👤 LinuxBender
SFTP chroot server; lftp client using the mirror subsystem + sftp. It is multi-threaded, even for a single file, and supports rsync-like behavior even in an sftp chroot. I can max out any internet link using lftp (client) + sftp (protocol).
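
For the curious, the segmented-transfer bits look roughly like this (host, user, and paths are placeholders):

  # single file, fetched as 8 parallel segments
  lftp -e 'pget -n 8 /remote/big.iso; quit' sftp://user@host
  # whole directory, rsync-like mirror with segmented downloads
  lftp -e 'mirror --parallel=4 --use-pget-n=8 /remote/dir ./local; quit' sftp://user@host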

👤 erezsh
I'm partial to https://mega.nz/

👤 orionblastar
When I worked as a federal contractor in 1996-1997, we used password-protected FTP servers to store giant text files for import into a database. Later on they moved it to a website you'd log into and download the text file from. Since the file was so large it took almost all day, and if the download aborted you had to start all over again.

I had to keep Netscape open because it showed the status of the file download. People asked why my web browser was open all day; it was for the giant download that was part of my job.


👤 1ark
10TB or more, not sure. Anecdotally, I recently had to send GBs of a home directory locally. The fastest way, which I found out the hard way, was mounting SMBFS and tar-ing it there. But of course it is unreliable, with no resume etc.

For the thing you did, these could have worked.

https://file.pizza

https://instant.io

https://www.sharedrop.io


👤 e12e
Apparently not explicitly mentioned: rsync over ssh.

Windows now comes with real sshd - Mac has it, and linux/bsd of course has it.

For small files (less than 10GB?) generally scp - but I'm trying to get into the habit of using rsync. Generally scp isn't a very good idea.
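
The rsync habit pays off mostly in resumability; a typical invocation (host and path are placeholders):

  rsync -avP big-file.iso user@host:/srv/incoming/

-P keeps partial transfers and shows progress, so rerunning the same command after a dropped connection resumes instead of starting over.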

For larger filesystems zfs send over ssh.
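
A minimal sketch of the zfs route (pool/dataset names are made up):

  zfs snapshot tank/data@xfer
  zfs send tank/data@xfer | ssh user@host zfs receive backup/data

Incremental follow-ups (zfs send -i) keep later syncs cheap.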

For streaming: dlna/upnp over zerotier vpn.

Shame the Firefox Send service imploded - for smaller files, where ssh isn't an option, it was a nice service. But a little too much hassle to self-host.


👤 motohagiography
Bike courier with checkered past, simultaneously on the run from organized crime clans and mercenaries working for retired spies, probably has mirror shade implants, lives in squat, punk haircut, etc.

Didn't realize there was another way.


👤 wilsonnb3
USPS

👤 britmob
Resilio Sync (formerly BitTorrent Sync) is my go-to for any file larger than a few hundred megabytes.

👤 colabiblen
WDT from Facebook - https://github.com/facebook/wdt/

We use it for copying hundreds of TiBs regularly, occasionally over a PiB.


👤 gulerc
https://Sendgb.com is a good way over the internet. Possible to send large files up to 20 GB. No registration or sign-in needed.

👤 gnufx
Not something I've used, and it's difficult to countenance anything called that, but what Globus (globus.org) has become seems to be quite popular for transferring large research datasets between academic sites which subscribe.

For ssh, there's https://www.psc.edu/index.php/hpn-ssh for improving wide-area transfer speeds, and something else, I think.


👤 Groxx
Since 5GB would take quite a long time to upload on DSL: almost universally dropbox, simply because it resumes + it's accessible to people with a url. If the recipient is ssh-friendly and there's a shared machine we can both access, rsync is quite a lot faster and more controllable and doesn't make my CPU angry for hours.

Otherwise I liked Firefox Send while it was running since I mostly trust them to not be full of intrusive nonsense.


👤 cmckn
I run FileBrowser [1] to share media with friends. I would suggest trying IPFS if you don't want to forward a port from the internet. You'll get similar download performance to FileBrowser once your node integrates with the network (this takes 30 minutes or so). Check it out!

[1]: https://github.com/filebrowser/filebrowser
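
A rough sketch of the IPFS flow, assuming the ipfs CLI is installed on both ends:

  ipfs add big-file.bin    # prints a CID (content hash) for the file
  ipfs get <CID>           # receiver fetches it by that CID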


👤 WarOnPrivacy
I and my customers all have symmetric FTTH. When I'm copying a file from me to me (working remotely), I use a VPN. These are usually .vhdx files, ranging from 6GB to 300GB. I copy them here for troubleshooting, then I send them back.

When I want to make something large available to someone else, I post it on my local webserver. It points to a NAS on my LAN; anyone in the house can just drop whatever on it and hand out a link.


👤 codethief
Syncthing works pretty well in my experience, even across NAT. I recently shared 10GB worth of photos with a family member and ~30min later it was done.

👤 danbmil99
Worth noting that there's an expensive but very popular product from IBM called Aspera.

I've never used it myself, but I've heard it's actually quite impressive; the business model makes it inaccessible to individuals, though.

Apparently they do some sort of complicated route tracing and open a ton of simultaneous routes. It probably also uses something other than TCP for error correction.

Seems like a good place for some disruption.


👤 talove
- Airdrop for immediate stuff. Gets dicey around 30-50GB tho.

- iCloud drive for personal stuff I need to share between devices. I trust this to sync anything up to a TB reliably.

- Google Drive when I need to share to someone else.

- Occasionally external drives when I need to move data fast locally.

- Some combination of S3 / AWS CLI / EC2 when things go beyond personal computer capacity depending on where the data is coming from and going to.


👤 banana_giraffe
For things smaller than around 10GB, I tend to use S3. Either with pre-signed links behind a little bespoke portal that people can log into and see what files I've "shared" with them, or just direct S3 links for those that want to automate it all.

For bigger things, there are two basic paths:

If it's to someone that's not primarily a tech person, then the data goes on a portable hard drive, and I either drive it to them or mail it and walk them through accessing the data. In both cases I encrypt the data, generally with VeraCrypt.

If it's to someone that won't mind using AWS's toolchain, and has decent connectivity, I'll use Snowcone or Snowball to get the data to S3 and give them S3 URLs.

I tend to get the data to S3 sooner or later so I can migrate it to Glacier. More than once I've had to recover GBs or TBs of data because the customer lost it, so I'm prepared now.


👤 skmurphy
I have used Hightail for years, from when it was called YouSendIt. Here is info from their website on plans and pricing

https://www.hightail.com/file-sharing

Hightail offers four subscription levels for file sharing, with varying limits on file upload size and storage:

- Lite (free): files up to 100MB each, 2GB of storage.
- Pro (from $15/month): individual files up to 25GB.
- Teams (from $30/user/month): individual files up to 50GB.
- Business (from $45/user/month): individual files up to 500GB.

Pro, Teams, and Business plans come with unlimited storage.


👤 hprotagonist
magic-wormhole: https://magic-wormhole.readthedocs.io/en/latest/

  pipx install magic-wormhole
  wormhole send /some/big/stupid/thing.tar.gz

👤 Qision
Someone proposed WeTransfer, but there is also Tresorit Send (https://send.tresorit.com/), which does the exact same job. Bonus points: they say they encrypt the data, and their servers are hosted in Switzerland.

👤 beginrescueend
Inside a company, I like transfer.sh, which is like an open-source version of file.io: https://github.com/dutchcoders/transfer.sh

That's good for file transfers around 5GB, give or take.


👤 kn100
For files less than 1GB I tend to use Telegram. It obviously has the downside of uploading to server X and then downloading from server X, but if I am sending a file to one person I'll likely be sending it to others too, so the ability to forward the file to others arbitrarily after the fact proves to be pretty useful. If I care about the data's security, an encrypted 7z container or something will do.

For files I need more control over that are less than say 5gb, I tend to scp them to a web server I control, so that I can delete them afterwards.

For files larger than that, I'll use a private torrent. It's very rare that I need to transfer files this large, but I really like this solution.


👤 zepearl
Personally I would use my own Nextcloud instance for up to 20-30GB. Not sure about TBs.

What about using "Firefox Send"? (I've never used it myself.)

https://support.mozilla.org/en-US/kb/send-files-anyone-secur...

I read that the limit is 1-2.5GB => maybe you could break the file down and upload it in multiple pieces...

EDIT: oops, Firefox Send doesn't seem to be available anymore - https://support.mozilla.org/en-US/kb/what-happened-firefox-s...


👤 retouchup
There are many ways of sending big files over the Internet, so to save time I will go into the top three. The first one is the shortest, using Gmail: click the Drive button rather than the paper clip button you would use for regular attachments. It shows an outline of your Drive files, from which you can pick the file to attach. Just as easy.

If you don't like Google Drive, you can go with another cloud provider.

My third alternative is even easier, but a little limiting because it caps you at two gigs either way. It is a service called WeTransfer, which was almost your only choice 15 years ago if you had to send a huge file.


👤 makeworld
I like gofile.io, as it's private and has no limits.

But for something like 10 TB or more, I'd see a torrent as the only way. My upload speeds are too slow for anything else; the connection would get reset. The torrent also helps prevent corruption.


👤 robert_foss
Magic Wormhole - it traverses NATs, is encrypted, and requires no configuration of the source or destination.

https://github.com/warner/magic-wormhole


👤 renewiltord
S3. Sometimes I do that even when I'm transferring on my local network just because I know the flow so well. After all, my speed to the Internet is roughly the same as the speed on my LAN. They're both gigabit duplex.

👤 nikeee
When I'm the sender, I start an HTTP server in the current directory using http-server (Node) or http.server (Python 3).

When I'm the receiver, I'm using a self-written CLI tool "send me a file" (smaf): https://github.com/nikeee/send-me-a-file

It basically opens up an HTTP server on my machine and lets a user upload a file via curl or a web interface. It also computes a SHA-256 hash while receiving.

These methods only work on a local network. For transfers beyond the local network, I use my VPS and sftp/http.


👤 ryankrage77
For anything 5GB plus - external drive & cycle over

The only person I'm sharing with lives near me, so sneakernet is the most convenient.

I once hit a transfer speed of 30 TB/hour carrying a box of hard drives home from work.


👤 dheera
I used to do the HTTP server approach when I was a student at MIT and had a static IPv4 and symmetric gigabit in my dorm room and therefore the recipient could download at usually their full downlink speed.

Now I live in the heart of Silicon Valley, a couple of km from Google's headquarters, and have a crappy 15 Mbps uplink with no other options available to me, so I typically throw the file on Amazon S3, Google Drive, or Dropbox before sending a link to the other person, so that they don't have to put up with a 15 Mbps download.


👤 usmannk
For local networks I use the built-in Python HTTP server, `python3 -m http.server` (or `python -m SimpleHTTPServer` in 2.x). By default it binds to all interfaces, so anyone on your network can access it.

👤 onelastjob
For transfers on the smaller side, I'd have a look at https://massive.io/. I've done a fair bit of looking into options for sending files online, and I like their pricing model a lot because it is purely usage-based and has no caps. It's $0.25/GB for downloads.

For larger transfers, I'd look at File Catalyst Spaces. If you have a bigger budget, you could look at IBM Aspera or Media Shuttle.


👤 m0xte
>10GB - Stick it on a BitLocker-encrypted USB stick and chuck it in the post. Next-day download if you send it 1st class here.

<10GB - chuck it on S3 and send a link to the other person.


👤 esaym
Usually a local HTTP server. But I've wondered before whether binary data could be encoded into a video file and then uploaded to YouTube or some other video service...

👤 Sean-Der
https://webwormhole.io/ has web and native clients. You can build for Windows/Mac/Linux/FreeBSD/Mobile. It uses WebRTC so if you are in the same network it will establish the best path possible.

I see a lot of people mentioning magic-wormhole and NAT traversal. I can't find any docs that confirm this. I think it always runs through a relay server?


👤 Seb-C
I have been using mega.nz to synchronize and backup my files for years, and so far I am very happy with it.

Everything works as expected, there is a lot of storage space and end-to-end encryption. There are clients for all platforms and it just works.

I can easily create secure share links whenever I want.

The only downsides are that the browser-based app is very slow to start once you have lots of files, and the android client does not have the same synchronization capabilities.


👤 kenneth
In the enterprise space, there is dedicated software for this. It uses a UDP-based protocol designed for efficiency (less overhead than TCP) and builds in resilience so that it works over connections with lots of packet drops, high latency, or low reliability.

https://gojetstream.io/


👤 raverbashing
I think the question for 10TB is more complicated than it seems at first

Is it one 10TB file? Multiple files? How do you handle file integrity? Error correction?


👤 eFishCent
I have used rsync on Linux to send up to 100GiB at a time when the connection is reasonably stable. There are a couple of challenges with this method: 1) the way it resumes takes a while to recheck the file before starting again; the larger the file, the longer this takes; 2) you need to know a bit about writing a simple script to loop the rsync until the transfer completes.
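
For 2), something like this is usually enough (paths and host are placeholders):

  until rsync -avP /data/big-file user@host:/dst/; do
    echo "rsync interrupted, retrying in 30s..."
    sleep 30
  done

until reruns rsync whenever it exits non-zero, and -P (--partial --progress) makes each retry resume rather than start over.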

👤 Fornax96
If it's a file less than 10 GB, I will use my own website: https://pixeldrain.com. For things slightly larger, I will split it up using 7-Zip and use pixeldrain too.
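
The 7-Zip split is just volume mode, e.g. (sizes and names are placeholders):

  7z a -v2g parts.7z big-file.bin    # produces parts.7z.001, parts.7z.002, ...
  7z x parts.7z.001                  # recipient extracts from the first volume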

For much larger files I always fall back to BitTorrent. I have also kept an eye on WebWormhole, but haven't been in a situation yet where I needed it.


👤 Ace__
Large files, and I mean from hundreds of GB to terabytes, I send with https://www.filemail.com/. Their desktop app uses UDP, so it is a lot faster than TCP-based protocols like FTP and HTTP. Yeah, you can still use their website to send smaller stuff; it's free for up to 50 GB anyway.

👤 natch
There was some tool for doing this securely on the command line. You run it with the file name and it gives you back a token. You give the token to your friend, they run the same tool with the token, and it downloads the file. Can’t for the life of me find it or remember what it was called. Wormhole?

Edit: I guess it was magic wormhole, discussed elsewhere in this thread.


👤 ohthehugemanate
I have a home nextcloud with a few extra TB of storage. I would put it there and use a share link.

But HTTP is not a great protocol for really large transfers like 10TB. Ideally you'd want something that parallelizes and does hash checks.

Even then, at 10TB you need a 2 gigabit connection to be even competitive with SSD-and-overnight-shipping (about 12hrs).


👤 guar47
I am actually transferring 400GB of data across the world right now. I decided to use Apple iCloud and it works pretty well.

Too bad the laptop I am transferring from is in bad condition and runs Windows, so sometimes the iCloud service freezes and I need to restart it.

I tried to create a torrent, but it seems to take forever on such a weak machine.


👤 egypturnash
Apple Mail has a 30-day storage zone for large email attachments (the Mail Drop feature) that it automatically offers when you attach big stuff, and I usually use that.

When I don't I usually just zip it and upload it to the media section of my Wordpress-managed website.

I've never generated files in the terabyte range, so I don't worry about that.


👤 manexploitsman
Have a look at the "File Sharing and Synchronization" section https://github.com/awesome-selfhosted/awesome-selfhosted#fil...

👤 parliament32
I self-host a Nextcloud instance. If it's too big for a browser download, Bittorrent is the way to go.

👤 laluser
Dropbox transfer supports up to 250GB files now. That's the way to go for most of my needs.

👤 Faaak
Plain and simple: I use swisstransfer.com, which is free (50GB) and Swiss-based.

I did it like you "in the past" (on my own fqdn / public-data) but I've given up on maintaining my website.

A torrent works really well if you've got multi-TB files and many recipients.


👤 hotwire
A station wagon full of tapes hurtling down the highway...

👤 jerven
For 500GB or more, Aspera software/ascp. Used by quite a few large bioinformatics institutes to allow data uploads.

[1] https://www.ibm.com/products/aspera


👤 heavyset_go
Physical media for anything over a few gigs. It's just easier, especially if the recipient isn't technologically inclined. A lot of the media industry ships hard drives around to one another because content can be so massive.

👤 m463
"Never underestimate the bandwidth of a 747 fully loaded with backup tapes"

👤 justinweiss
I was asking myself exactly this a few days ago, to share ~500mb worth of files with a few people on a message board. https://gofile.io/ worked fine.

👤 hosteur
I use http://ifile.dk

Simple. Works.


👤 gpm
5GB from a Kubernetes pod: netcat.

Send help... no, seriously: how do I upload a file from a server I have a shell on but can't easily open ports to or install software on (and where the built-in cp tool fails on big files...)?
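
One workaround, assuming you reach the pod via kubectl (pod and path names are placeholders; kubectl cp itself streams tar over exec, so this is the same mechanism by hand):

  kubectl exec mypod -- cat /data/big-file > big-file
  # or, for directories:
  kubectl exec mypod -- tar cf - /data | tar xf -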


👤 Arkanosis
If the network bandwidth on both ends allows for a transfer in less than 48 hours, most likely rsync with the --partial-dir=.rsync-partial flag. Otherwise, SD card / USB key / hard drive in the mail.

👤 noxer
Telegram. Up to 2GB per file, so just 7-Zip and split the file(s).

Never needed to send anyone 10TB, nor do I have that much storage, but if you do, you probably have some kind of NAS/server where you can enable FTP access.


👤 AnIdiotOnTheNet
Locally: SMB. Over the internet: SendGB. Ridiculously sized: HDDs in a car.

👤 ha-ckernews
Frams' Fast File Exchange (https://fex.rus.uni-stuttgart.de/) works well.

Free/open source


👤 wheels
I have Syncthing syncing to a "/shared" directory on my private web server. If the file is under the 20-ish GB I have free there, that's good enough.

👤 daniel_iversen
Dropbox was literally built for that sort of thing! And with 5GB you can probably even do it with the free account by getting yourself a tiny bit more bonus space.

👤 thedanbob
I send public links from the Seafile instance running on my home server. For 10+ TB, I think the only practical option would be to mail a hard drive or two.

👤 t312227
as a game of thrones fan:

a raven with a sufficiently large micro-SD card ;)


👤 rootsudo
OneDrive/SharePoint Online.

Technically unlimited storage on OneDrive; you can force the recipient to verify 2FA via their email address, force link expiration, etc.


👤 askvictor
Given the quantity of data sent from the LHC to universities across the Atlantic, I'm curious how they (CERN et al) handle their data transfers.

👤 carlreid
I use justbeamit if it's a direct send to someone.

https://justbeamit.com/


👤 nakodari
I am running Jumpshare (https://jumpshare.com) and our users regularly send large files anywhere from 2GB to 100GB, and in some cases even more. My recommendation is to use cloud providers for sharing files that are at most 50GB in size. Please note that although you can share even bigger files, you have to consider the reliability of the internet provider of both the sender and the receiver. For files bigger than 50GB, I would recommend a P2P solution.

👤 ikeboy
I like Mega: it's encrypted, there are command-line tools (megatools), and it's very fast to upload from a server, in my experience.

👤 kyleee
Firefox send works great for this sort of thing though there is an upper limit on file size (can't recall what it is right now)

👤 werber
Maybe I’m old, but physically. Most of my large transfers are photographs, video and audio recordings and I like flash drives.

👤 Apreche
Upload to cloud storage then have the other party download it.

If it's too big for cloud storage, ship a hard drive in the mail.


👤 vbezhenar
Up to a few GBs I'll host them on my server and send a link. More than that: I'll host a torrent.

👤 reportgunner
In person. If it's not possible to share in person I point them to the place where I myself got it.


👤 sumnole
OneDrive has served well enough for my purposes. I've only uploaded files smaller than 1TB though.

👤 pcvarmint
USPS Priority Mail.

You can send a 10TB hard drive faster by USPS than it would take to transmit across the internet.


👤 searchableguy
I have a raspberry pi with a hard drive. I throw the files in that and then use dat/ipfs.

👤 thehappypm
Google Drive! Send a public link.

👤 holychiz
For non-technical folks to send me big files (5GB limit), I point them to https://transfer.pcloud.com/. No registration needed.

👤 anonymoushn
I upload it to my VPS and send someone a link.

I think google drive is a decent solution as well.


👤 simonw
I use Transmit on macOS to upload to an S3 bucket and then send them a link.

👤 notatoad
if i think the recipient is competent enough, i'll ask for or provide ssh access and copy it with rsync

otherwise, i've mailed a flash drive before, and for <20GB i'll just put it on a public S3 bucket.


👤 really3452
Uploading to Sia's Skynet, then emailing or texting the link.

👤 Foivos
At 10 TB I guess your best option is to physically mail an HDD.

👤 simon_acca
Google Cloud Storage. The UI is actually nice, permissions are flexible, and it has both a robust web uploader and a CLI one. Not to mention it's cheap and only billed by the GB/hour or something.

👤 rawoke083600
scp to one of my unrelated websites and send an http link. Remove the file afterwards... #notperfect

👤 badrabbit
ngrok.io and similar reverse tunnels

👤 mcculley
Keybase

👤 summm
Retroshare

👤 bethecloud
I like to use transfer.sh

👤 tobyhinloopen
Either via Google Drive or iCloud Drive. If it doesn’t fit on these, use physical media

👤 villgax
Backblaze/Torrent

👤 cellularmitosis
socat on both ends :)
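
A minimal sketch of that (port and file names are arbitrary):

  socat -u TCP-LISTEN:4242,reuseaddr OPEN:dest,creat    # receiver
  socat -u OPEN:file_to_send TCP:hostname:4242          # sender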

👤 alfg
S3 and signed URLs.

👤 11235813213455
streaming json-lines (for large json array files)

👤 arthurcolle
Magic Wormhole

👤 zacksinclair
Git LFS ;)

👤 ohazi
rsync over ssh with retries

👤 ghthor
IPFS

👤 cheeze
python -m http.server

👤 rocky1138
scp

👤 known
Interesting information; thanks to all commenters!

👤 emerged
upload to s3 bucket, make public, send link

👤 aaron695
Free unused Google Drive

Sometimes I'll have to open the file in a hex editor and change a few bytes so the hash changes, if it sets off copyright restrictions.

This is a 10 gig limit.


👤 jeffbee
Google Drive seems a lot more practical than pretty much every other answer in this thread.