A spot of tech nostalgia for us, with Google's hirsute chief engineer, Urs Hölzle, discussing his first day in Google's "data center" 15 years ago:
[...] a megabit cost $1200/month and we had to buy two, an amount we didn't actually reach until the summer of 1999. (At the time, 1 Mbps was roughly equivalent to a million queries per day.)

What's interesting here is that the primary criterion for billing was space - square footage taken up on the colocation site's floor. Network was an additional cost, as noted above, but Exodus didn't bill its residents for power - the 3 x 20A required for all the servers was just a scrawled note on the invoice. Nowadays power is one of the most fundamental requirements of a data center, and you don't pour the first bit of concrete before you've got your megawattage lined up. Apple goes as far as sticking its own solar power generation around its North Carolina data center. We've come a long way in fifteen years.
- You'll see a second line for bandwidth, that was a special deal for crawl bandwidth. Larry had convinced the sales person that they should give it to us for "cheap" because it's all incoming traffic, which didn't require any extra bandwidth for them because Exodus traffic was primarily outbound.
You wouldn't be able to get away with a server rack like Google's 1999 design nowadays - just look at the way they cram the hardware into every available space. I've seen one of these racks on display, and you can barely see any daylight through it from front to back. The fire safety inspector would have kittens.
In the comments, Todd Reed calculates that if you tried to run today's YouTube while paying those data rates, you'd be forking over just under $3bn per month...
This just makes the point that the computing world of 15 years ago really was a different generation from today. Google was anticipating that a few megabits per second would be more than enough to keep crawling the entire web and keep up with the addition of content. Let's look at the most content-dense medium of the modern web - Tweets. In 2013 Twitter averaged 5700 Tweets per second. At 140 characters plus maybe 60 characters of timestamp and attribution, that's 200 x 5700 = 1,140,000 characters per second, or about 9 Mbits per second (Mbps). At Exodus's rates it would have cost Google nearly $11,000 per month just to keep up with Twitter's tweets. Nowadays you can get 20 Mbps on your home Internet connection for $75 per month (business class), which should cope comfortably with two Twitters - at least until they started allowing you to attach images...
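If you want to sanity-check those numbers, here's a quick back-of-envelope sketch of my own (not from Urs's post). The per-tweet size, metadata overhead, and 8-bits-per-character figures are my assumptions; the $1200/Mbps/month rate is the Exodus price quoted above.

```python
# Back-of-envelope check of the Twitter-vs-1999-bandwidth numbers above.
# Assumptions: ~140 chars of tweet text plus ~60 chars of timestamp and
# attribution, 8 bits per character, and Exodus's flat $1200/Mbps/month rate.

TWEETS_PER_SECOND = 5_700        # Twitter's 2013 average, as cited above
BYTES_PER_TWEET = 200            # ~140 chars of text + ~60 chars of metadata
COST_PER_MBPS_MONTH = 1_200      # Exodus's 1999 price, in dollars

bits_per_second = TWEETS_PER_SECOND * BYTES_PER_TWEET * 8
mbps = bits_per_second / 1_000_000
monthly_cost = mbps * COST_PER_MBPS_MONTH

print(f"Twitter firehose: ~{mbps:.1f} Mbps")             # ~9.1 Mbps
print(f"At 1999 rates:    ~${monthly_cost:,.0f}/month")   # ~$10,944/month

# Working Todd Reed's ~$3bn/month YouTube figure backwards at the same rate:
youtube_mbps = 3_000_000_000 / COST_PER_MBPS_MONTH
print(f"Implied YouTube traffic: ~{youtube_mbps / 1_000_000:.1f} Tbps")
```

Run it and you get the same ~9 Mbps and ~$11,000/month as above; the reverse calculation also shows that Todd Reed's $3bn/month figure implies YouTube pushing roughly 2.5 Tbps.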