I understand the need for independent fiber ISPs. But are gigabit speeds really necessary? For me, a 300 Mbps connection is way more than enough for a four-person family.
It's usually the upload speeds that are the bottleneck. When the upload is only 10 Mbps, any video or large file upload will saturate the connection. On top of waiting for the upload to finish, if you don't manually rate-limit the upload job or have anti-bufferbloat measures deployed, any concurrent voice or video calls will suffer. During the Covid shutdowns, four people video chatting at once was no longer far-fetched.
The scarcity of upload speed is why I learned how to set up OpenWrt's SQM feature, and upgraded to the highest package my ISP offers.
That's a bit more than I have (Starlink), but anytime a kid is downloading a big Steam game or big ISO file or something, everything else slows to a crawl. I also occasionally rsync large directories to/from cloud storage, and that can also saturate the link. I've tried setting rules/priorities, but it's a constant game of whack-a-mole.
For convenience, rclone[0] is nice for most cloud storage (Google Drive and the like) that makes rsync annoying.
rsync also offers compression[1], and you might want to tune it depending on whether you'd rather be CPU-bound or I/O-bound. You can pick the compression algorithm and level, with more options than just the `-z` flag. You can also speed things up by skipping checksums, or by running without checksums and then running again later with them, or on some interval: say, daily backups without checksums and monthly runs with them.
If you tar your files up first, I have a function that is essentially `tar cf - "${@:2}" | xz -9 --threads $NTHREADS --verbose > "${1}"`, which uses the maximum `xz` compression level. I like to heavily compress things upstream because it also makes downloads faster, and decompression is much easier than compression. I usually prefer being compute-bound.
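A fleshed-out version of that one-liner might look like this (the name `txz` and the `NTHREADS` fallback are my own additions):

```shell
# Pack paths ($2...) into a single xz-compressed tarball ($1).
# xz -9 is the maximum standard compression level; --threads 0
# lets xz use all available cores when NTHREADS is unset.
# Note: "${@:2}" is a bash-ism.
txz() {
  tar cf - "${@:2}" | xz -9 --threads "${NTHREADS:-0}" > "$1"
}

# Usage: txz backup.tar.xz dir1 dir2
```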
Also, a systemd timer is always nice and offers more flexibility than cron. It's what's helped me most with the whack-a-mole game. I like to use OnCalendar events (e.g. daily, weekly) and add a random delay. It's also nice that if an event was missed because the machine was off, it'll run the job once the machine is back on (I usually make it wait at least 15 minutes after the machine comes online).
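As a minimal sketch (the unit name is a placeholder), the timer side of that looks something like:

```ini
# backup.timer
[Unit]
Description=Daily backup with random delay

[Timer]
OnCalendar=daily
# spread the actual start over an hour
RandomizedDelaySec=1h
# if the machine was off at the scheduled time, run shortly after boot
Persistent=true

[Install]
WantedBy=timers.target
```

Pair it with a `backup.service` of the same name and activate it with `systemctl enable --now backup.timer`. I'm not aware of a built-in knob for the 15-minute post-boot wait; a sleep at the start of the service, or an `OnBootSec=` line, is one way to approximate it.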
[0] https://rclone.org/
[1] https://unix.stackexchange.com/a/292020
Thank you! I do love rclone a great deal, and need to start using it more. I mostly use rsync right now because I've got some Contabo storage instances and ssh is already set up, so it's nice and easy, and I've already got the rsync commands burned into memory from decades of use :-) I also typically rsync with the same command to an external hard drive for an on-site backup.
Great tips! I'll definitely be using your tar command
Oh, that reminds me. When I had an Android phone (I already regret my iPhone), I used Termux to write a very basic script that rsyncs data to my machines, but only when connected to WiFi with whitelisted SSIDs. Pretty easy to write and then schedule as a cron job. It can be nice to put that onto other people's devices and give them automated backups; it makes the 3-2-1 backup strategy FAR easier. It's also pretty easy to build a tracking app for a lost phone that way (e.g. if it hasn't connected to home WiFi in x days, send yourself an email). Both work really well when also using Tailscale.
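A sketch of the idea (the SSID list, backup host, and paths are placeholders; the on-device part assumes the Termux:API `termux-wifi-connectioninfo` command):

```shell
#!/bin/sh
# Only sync when connected to a whitelisted WiFi network.
ALLOWED_SSIDS="HomeNet ParentsNet"

ssid_allowed() {
  # succeed only if $1 is on the whitelist
  for s in $ALLOWED_SSIDS; do
    [ "$1" = "$s" ] && return 0
  done
  return 1
}

# On-device part: needs the Termux:API package installed.
if command -v termux-wifi-connectioninfo >/dev/null 2>&1; then
  ssid=$(termux-wifi-connectioninfo | grep -o '"ssid": *"[^"]*"' | cut -d'"' -f4)
  if ssid_allowed "$ssid"; then
    rsync -az ~/storage/shared/DCIM/ backup-host:phone-backup/
  fi
fi
```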
The article says they're a 10 person family.
Depends on what you're doing, right?
Let's take video streaming. I have a pretty compressed version of Arrival that's at 2GB and is a 4k movie ~2hrs long (the original file was ~2x the size). To stream that we need to do 2000Mb / (3600s * 2) = 277.8Mb/s. This also doesn't account for any buffering. This is one of my smaller 4k videos and more typical is going to be 3Gb-5Gb (e.g. Oppenheimer vs Children of Men). Arrival is pretty dark and a slow movie so great for compression.
Now, there's probably some trickery going on that can get better savings and you'll see used with things like degrading the quality. You could probably drop this down to 1.5Gb and have no major visual hits or you can do a variable streaming and drop this even more. On many screens you might not notice a huge difference between 1440 and 4k, and depending on the video, maybe even 1080p and 4k[0].
For comparison, I loaded up a 4k YouTube video (which uses vp9 encoding) and monitored the bandwidth. It is very spiky, but frequently jumped between 150kbps and 200Mbps. You could probably do 2 people on this. I think it'd get bogged down with 4 people. And remember, this is all highly variable. Games, downloads, and many other things can greatly impact all this. It also highly depends on the stability of your network connection. You're paying for *UP TO* 300Mbps, not a fixed rate of 300Mbps. Most people want a bit of headroom.
[0] Any person will 100% be able to differentiate 1080p and 4k when head to head, but in the wild? We're just too used to spotty connections and variable resolutions. It also depends on the screen you're viewing from, most importantly the screen size (e.g. phone).
> 2000Mb / (3600s * 2) = 277.8Mb/s
2000Mb / 7200s = 0.278 Mb/s, or 277.8Kb/s
> It is very spiky, but frequently jumped between 150kbps and 200Mbps. […] I think it’d get bogged down with 4 people
That’s just burst buffering as fast as it can, you didn’t capture the average. It doesn’t suggest it would slow down with 4 people. 4K on YouTube takes 20Mbps, so to parent’s point, you’ll have plenty of bandwidth to spare if 4 people do this at the same time on a 300Mbps line.
> I have a pretty compressed version of Arrival that's at 2GB and is a 4k movie ~2hrs long (the original file was ~2x the size). To stream that we need to do 2000Mb / (3600s * 2) = 277.8Mb/s.
Your math is WAY off.
2 gigabytes / 2 hours is only about 2.22 megabits/sec.
Blu-rays don't do anywhere near 277Mb/s.
First, if it was 2GB * 2 for the source of your recompressed copy, that's 4GB * 8 bits per byte = 32 Gigabits (Gb), or 32,000Mb. Two hours in seconds is 60 * 60 * 2 = 7,200 seconds.
32,000 / 7,200 is 4.444Mb/s. Streaming your 2 hour long 4GB movie could be done with ~5Mbit. A 1Gb/s connection could handle streaming ~200 of these movies.
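That arithmetic is easy to double-check in a shell:

```shell
# 4 GB source * 8 bits/byte = 32,000 Mb, spread over 7,200 seconds
awk 'BEGIN { printf "%.2f Mb/s\n", 4 * 8 * 1000 / 7200 }'
# -> 4.44 Mb/s
```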
Going back to Blu-rays as a source, an Ultra HD Blu-ray maxes out at 144Mbit but in reality most movies are encoded at a much lower bitrate. Most movies will cap out around 40-50Mbit. You could do 20 of these straight Blu-ray movies on a 1Gb connection.
https://en.wikipedia.org/wiki/Ultra_HD_Blu-ray#Specification...
> But are gigabit speeds really necessary? For me, a 300 Mbps connection is way more than enough for a four-person family.
I'm certain that one could make a sound argument that 300 Mbps is not necessary for that four-person family, and they could make do with a much slower connection. Back in the day, folks would be asking if it's necessary for your Internet connection to be always on. After all, it's no hassle at all to plug the modem into the house phone line and unplug it when you're done!
For me, switching from a 1400/40mbps cable connection to a symmetric-but-variable 300-1000mbps Ethernet connection meant that I was doing the same sorts of things, but often spending much less time waiting for them to complete. Related to that, it also made "content creation"-esque things [0] much, much easier.
[0] Which I'm declaring is a category that includes uploading and downloading huge files while working from home as a programmer/"DevOps" guy.
Low upload speed is like a house with only one bathroom.
Gigabit is the slowest reasonable fiber speed. Even 10-gigabit is now cheap enough that some gigabit equipment is just 10-gigabit equipment running at a lower speed. They could artificially throttle you to even less, if it made business sense.
It's not a committed rate. Your individual line is a gigabit, but the upstream from your whole block is 10 gigabit, so you can't all use it at once. Your guaranteed rate is probably more like 20-50 Mbps, if that's what's confusing you. But it's extremely rare that everyone tries to use their gigabit all at once.
If it's a Passive Optical Network, you might be sharing a gigabit download with your block - you all share the same fiber - and you get substantially less than a gigabit upload due to the need for timeslotting. Gigabit PON is obsolete, though; now you'd get at least 10G PON.