akamai yesterday - what in the world was that

Once upon a time, Hugo Slabbert <hugo@slabnet.com> said:

> This just follows the same rules as networks have always seemed to; If
> you build it, they will come, and you'll have to build more. :slight_smile:

Induced demand - Wikipedia

Yep, just like your disk space requirements will always grow to 110% of
available space.

I get annoyed when I'm chatting with friends, waiting to play some game
we decided to download, and it's ONLY downloading at 300 megabits per
second! :stuck_out_tongue:

Except the CDN providing the content did not anticipate this type of influx (how, I am not sure; probably more concerned about new business revenue than about the backend infrastructure) and has pushed it over costly peers for most of us. BTW, we are still waiting for our PNIs with them. Very frustrating…

Peace,

I find it both heartening and disturbing. I remember the first 2.4/2.5G links I turned up, as well as the first 10G and (eventually) the first 100G links.

I was leaving the house earlier this week thinking about how Mbps of traffic used to be a lot, then it became Gbps, and now that's shifted to Tbps.

While it makes me feel old, it's also something that I marvel at periodically.

A bit of perspective on bandwidth and feeling old. The first non-academic connection from Africa (Usenet and Email, pre-Internet) ran at about 9600 bps over a Telebit Trailblazer in my living room.

The first non-academic IP connection was a satellite connection (64Kbps IIRC, not in my living room :-)).

Now we have a bajillion Gbps over submarine fibre landing pretty much everywhere, and my guess is that it is not enough bandwidth.

All this to bring such vital resources as Facebook and Netflix :slight_smile:

  paul

> I get annoyed when I'm chatting with friends, waiting to play some game
> we decided to download, and it's ONLY downloading at 300 megabits per
> second! :stuck_out_tongue:

In this scenario, which mechanism controls the download speed? I hear many users complain that their gigabit internet connection is not maxing out and the update is taking forever. I would never expect a gigabit internet connection to be saturated during a game update, but I'm curious how the throttling works.

Thanks.

It can be that way for large orgs that are dysfunctional as well. This is somewhat typical of large companies, and we usually try not to air all our internal drama in public.

It's entirely possible there are numerous problems, with blame to go around.

It can also take longer than we'd like to get things done.

I can assure you everyone is acting in good faith here, even if it doesn't seem that way. Sometimes errors aren't caught immediately, and sometimes people go on leave, etc. The rest of the dialogue should happen off-list, though.

- Jared

Someone asked about Antarctica recently.

I remember the day in the 80s when they (I'm pretty sure it was McMurdo
Station) got their first "internet" connection. It was announced on
lists like this one.

It was a satellite link that was usable for only so many minutes per day
as the satellite flew in and out of sight, and it exchanged batched email
etc. via Kermit at probably around 9600 bps, if that, likely variable
depending on conditions.

If I may... this also reminds me of a project in Africa which used
some sort of wireless link (probably packet radio) mounted on top of buses.

People with the right equipment could get a batch exchange as a bus
drove by. I'm pretty certain that really was implemented and used.

No idea what the bandwidth was; I think packet radio in that era
was generally glad to achieve around 1200 bps.

Moral: Really, really bad connectivity is a zillion times better than
no connectivity.

Game updates are generally compressed chunks and the client does live decompression on the data.

As such, insufficient CPU or IO performance will result in lower overall speeds, since the client can't keep up with the incoming stream of data.
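
A toy sketch of that bottleneck effect, in Python (purely illustrative; the zlib codec, chunk size, and disk figure below are assumptions, not how any particular game client actually works):

```python
# Illustrative only: effective update speed is bounded by the slowest
# stage in the download -> decompress -> write-to-disk pipeline.
import time
import zlib

link_mbps = 1000        # hypothetical gigabit access line
disk_write_mbps = 800   # assumed sustained disk write rate (made up)

# One hypothetical compressed chunk (highly compressible test data).
chunk = zlib.compress(b"A" * 4_000_000, 9)

# Time the decompression of that chunk to estimate decompression throughput.
start = time.perf_counter()
data = zlib.decompress(chunk)
elapsed = time.perf_counter() - start
decompress_mbps = len(data) * 8 / elapsed / 1e6

# The effective rate is capped by whichever stage is slowest.
effective = min(link_mbps, decompress_mbps, disk_write_mbps)
print(f"decompress ~{decompress_mbps:.0f} Mbit/s, "
      f"effective ~{effective:.0f} Mbit/s")
```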

Regards,
Filip

Speaking with my CDN and service provider hats on, I'd say upstream congestion. If you're an end user with a gigabit internet connection, on a service that presumably has lots of customers with similarly large connections, it's a safe bet nothing upstream of you is provisioned with enough capacity to serve everyone running at full speed. Whether it's peering with the CDN, backhaul on your provider's network from the peering point to wherever you are, etc., something, or multiple somethings, will run out of pipe in a traffic event like yesterday's.
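
To put rough numbers on that (illustrative only; the capacities and subscriber count below are assumptions, not anyone's real provisioning):

```python
# Back-of-the-envelope oversubscription math: a shared upstream link
# can't give every gigabit subscriber line rate at the same time.
peering_gbps = 100           # assumed capacity at the congested hop
active_subscribers = 10_000  # assumed users pulling the update at once
access_mbps = 1000           # each user's access speed

fair_share_mbps = peering_gbps * 1000 / active_subscribers
print(f"~{fair_share_mbps:.0f} Mbit/s per user "
      f"vs. {access_mbps} Mbit/s access")
```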

If true (not arguing), that's really dumb.

Apple did this with the original iPhone. It turned out that even in their own ecosystem they didn't get it right. The full restore images have always been there, and diffs didn't reappear until you could "OTA" the device (WiFi).

I can't imagine how hard a console would be with every random app writing data wherever.

Sandboxes and jails have been escaped for as long as they have been around as well, so they can help, but they are far from perfect.

Paul Nash writes:

> A bit of perspective on bandwidth and feeling old. The first
> non-academic connection from Africa (Usenet and Email, pre-Internet)
> ran at about 9600 bps over a Telebit Trailblazer in my living room.

For your amusement, this latest e-bloodbath, erm, e-sports update, at 48 GB
(the "PC" version), would take about 463 days (~15 months) to complete at
9600 bps (not counting overhead like packet headers, etc.).

At 64kbps (ISDN/Antarctica) you could do it in 69 days, maybe even
finishing before the next - undoubtedly bigger - release comes out.
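
A quick sanity check of those figures (payload only, ignoring protocol overhead, as above):

```python
# 48 GB at 9600 bit/s and at 64 kbit/s, payload only.
size_bits = 48e9 * 8

for label, bps in (("9600 bps", 9_600), ("64 kbps", 64_000)):
    days = size_bits / bps / 86_400  # seconds -> days
    print(f"{label}: ~{days:.0f} days")
```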

And now for our amusement Akamai can do it accidentally.

* ximaera@gmail.com (Töma Gavrichenkov) [Fri 24 Jan 2020, 11:49 CET]:

> And now for our amusement Akamai can do it *accidentally*.

What do you mean? The CDNs don't publish the games nor do they buy the games. The people downloading aren't even their customers. The publishers generally decide the % of traffic share each CDN serves if they contract with multiple CDNs for a project.

  -- Niels.

Interesting… I just found this. It speaks of 800 Gbps, 1.2 Tbps, and 1.6 Tbps Ethernet:

https://en.wikipedia.org/wiki/Terabit_Ethernet

https://ethernetalliance.org/technology/2019-roadmap/

https://ethernetalliance.org/wp-content/uploads/2019/08/EthernetRoadmap-2019-Side1-ToPrint.pdf

https://ethernetalliance.org/wp-content/uploads/2019/08/EthernetRoadmap-2019-Side2-ToPrint.pdf

-Aaron

Thanks Hugo, very interesting. Induced demand. Someone said recently that they've seen that no matter how much bandwidth you give a customer, they will eventually figure out how to use it (whether they realize it or not… I guess it just happens).

-Aaron

Thanks Jared. When I reminisce with my boss, he reminds me that this telco/ISP here initially started with a 56 kbps internet uplink, lol.

-Aaron

> Thanks Jared. When I reminisce with my boss, he reminds me that this telco/ISP here initially started with a 56 kbps internet uplink, lol.

Oh, gods, what have you done?! This comment will bring everyone out of
the woodwork, reminiscing about the good ol' days....

So, I grew up in South Africa, and one of the more fascinating /
cooler things I saw was a modem which would get you ~50bps (bps, not
Kbps) over a single strand of barbed wire -- you'd hammer a largish
nail into the ground, and clip one alligator[0] clip onto that, and
another alligator clip onto the barbed wire. Repeat the process on the
other side (up to ~5km away), plug the modems in, and bits would
flow... I only saw these used a few times, but always thought they
were cool....

One of the first ISPs I worked at had an AGS+ as the "core" router
(and a pile of IGS's) -- the AGS had a metal plate in the front to
access the "line cards", and a monster big squirrel fan - the squirrel
fan was strong enough that the vacuum / airflow would keep the metal
plate sucked on if you didn't do up the screw... anyway, the device
also had a watchdog timer, which would reboot the box if IOS[1] locked
up, and the fan would slow down while this occurred -- so, for a long
time, our fastest / most reliable monitoring was an empty PC case
placed on the floor under the rack -- when the router locked up, the
fan would slow down, the front would fall off[2] and bounce on the PC
case, making an unholy racket - and alerting the NOC that something
bad was happening....

Anyway, so I tied an onion to my belt, which was the style at the
time. Now, to take the ferry cost a nickel, and in those days, nickels
had pictures of bumblebees on 'em. Give me five bees for a quarter,
you'd say.
Now where were we? Oh yeah: the important thing was I had an onion on
my belt, which was the style at the time. They didn't have white
onions because of the war. The only thing you could get was those big
yellow ones...

W
[0]: In .za we called them crocodile clips -- true story.
[1]: For all you young whippersnappers, that's the Cisco IOS, not the
Apple iOS.. :-/
[2]: https://www.youtube.com/watch?v=3m5qxZm_JqM -- Unlike the rest of
this email, this is off-topic, but still great....

Ahhhh hahahaha, that's great Warren !

after all, it is Friday, might as well...

oh my gosh, I cut my teeth on a few of those MGS-type routers... I recall they sounded a bit like a small vacuum cleaner.... and I think I had to set jumpers or flip DIP switches for password recovery!
https://www.flickr.com/photos/pleia2/16858944610

back then, my onion hung on a rope around my waist....

-Aaron

Same with compute resources, tbh. Give 'em a new stack of racks: "Oh, this service that didn't even exist last year now requires 10,000 CPU cores kthxbye."

Also, https://twitter.com/iamdevloper/status/926458505355235328?s=20

"1969:
-whatā€™re you doing with that 2KB of RAM?
-sending people to the moon

2017:
-whatā€™re you doing with that 1.5GB of RAM?
-running Slack"