If Xerox PARC Invented the PC, Google Invented the Internet
Inside Google, Jeff Dean is regarded with awe. Outside the company, few even know his name. But they should. Dean is part of a small group of Google engineers who designed the fundamental software and hardware that underpinned the company's rise to the web's most dominant force, and these creations are now mimicked by the rest of the net's biggest names – not to mention countless others looking to bring the Google way to businesses beyond the web.
>"Google did a great job of slurping up some of the most talented researchers in the world at a time when places like Bell Labs and Xerox PARC were dying. It managed to grab not just their researchers, but their lifeblood."
But because Google is so concerned with keeping its latest data center work hidden from competitors
Google's new approach to data center hardware
The difference between it and a Xerox PARC is that Google profited mightily from its creations before the rest of the world caught on.
"Jeff and Sanjay worked together to develop much of Google's infrastructure
In the '90s, both worked at Silicon Valley research labs run by the Digital Equipment Corporation
They came to Google as part of a mass migration from DEC's research arm.
Some went to a Palo Alto startup called VMware, whose virtual servers were about to turn the data center upside-down.
Eric Brewer, the University of California at Berkeley computer science professor who now works alongside Dean and Ghemawat.
several other engineers who arrived at Google from DEC would help design technologies that caused a seismic shift in the web as a whole, including Mike Burrows, Shun-Tak Leung, and Luiz André Barroso
Rather than using big, beefy machines to run its search engine, it broke its software into pieces and spread them across an army of small, cheap machines. This is the fundamental idea behind GFS, MapReduce, and BigTable – and so many other Google technologies that would overturn the status quo.
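The core idea behind MapReduce can be sketched in a few lines: a map phase turns each shard of input into key-value pairs, a shuffle groups pairs by key, and a reduce phase combines each group. The sketch below is a single-machine toy illustrating the model, not Google's implementation; the shard layout and function names are invented for illustration.

```python
from collections import defaultdict

def map_phase(shard):
    # Map: emit (word, 1) pairs from one shard of input
    return [(word, 1) for line in shard for word in line.split()]

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word
    return {word: sum(counts) for word, counts in groups.items()}

# The input is split into shards, as it would be across many cheap machines;
# in the real system each map task runs on a different machine.
shards = [["the quick brown fox"], ["the lazy dog", "the end"]]
mapped = [pair for shard in shards for pair in map_phase(shard)]
counts = reduce_phase(shuffle(mapped))
print(counts["the"])  # 3
```

Because the map tasks share nothing and the reduce tasks each own a disjoint set of keys, every phase can be spread across an army of small machines – which is exactly the design choice the excerpt describes.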
"The architecture challenges that arise when building a data system like Google's that spans thousands of computers
Jeff Dean was the first to arrive from DEC. He came by way of his "academic uncle," Urs Hölzle.
Hölzle was one of Google's first 10 employees, and as the company's first vice president of engineering, he oversaw the creation of the Google infrastructure, which now spans more than 35 data centers across the globe
He was soon hired by the same man who hired Hölzle: Google co-founder Larry Page.
the company's core search technologies, which were already buckling under the weight of a rapidly growing worldwide web
Barroso and Hölzle in their seminal 2009 book on the subject, The Datacenter as a Computer. "In other words, we must treat the data center itself as one massive warehouse-scale computer."
but the details remained hidden until 2009
In the data center, Google isn't content to merely innovate. It keeps the innovations extremely quiet until it's good and ready to share them with the rest of the world.
Larry Page has a thing for Nikola Tesla. According to Steven Levy's behind-the-scenes look at Google – In The Plex – Page regarded Tesla as an inventor on par with Edison, but always lamented his inability to turn his inventions into profits and long-term recognition.
Clearly, the cautionary tale of Nikola Tesla influenced the way Google handles its core technologies. It treats them as trade secrets, and much like Apple, the company has a knack for keeping them secret. But in some cases, after a technology runs inside Google for several years, the company will open the kimono. "We try to be as open as possible – without giving up our competitive advantage," says Hölzle. "We will communicate the idea, but not the implementation."
In 2006, Google published a paper on BigTable, its sweeping database, and together with an Amazon paper describing a data store called Dynamo
In 2003 and 2004, the company published papers on GFS and MapReduce.
We still know relatively little about the inside of Google's data centers, but the company's efforts to design and build its own gear
updating the index in real time
We don't know much about what the company now uses inside these top-secret facilities
In recent years, Google published papers on Caffeine and two other sweeping software platforms that underpin its services: Pregel, a "graph" database for mapping relationships between pieces of data, and Dremel, a means of analyzing vast amounts of data at super high speeds.
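Pregel's programming model is "think like a vertex": each vertex holds a value, vertices exchange messages with their neighbors in synchronized supersteps, and the computation halts when nothing changes. The toy below illustrates that style with a maximum-value propagation over a small graph; it is a single-machine sketch of the model, not Google's system, and the graph and function names are invented for illustration.

```python
def propagate_max(edges, values):
    # edges: {vertex: [neighbors]}, values: {vertex: initial value}
    # Runs synchronized supersteps until no vertex changes its value.
    values = dict(values)
    while True:
        # Message exchange: each vertex sends its value to its neighbors
        messages = {v: [] for v in values}
        for v, neighbors in edges.items():
            for n in neighbors:
                messages[n].append(values[v])
        # Compute: each vertex adopts the largest value it has seen
        changed = False
        for v, inbox in messages.items():
            best = max(inbox, default=values[v])
            if best > values[v]:
                values[v] = best
                changed = True
        if not changed:  # quiescence: every vertex votes to halt
            return values

graph = {"a": ["b"], "b": ["c"], "c": ["a"]}
result = propagate_max(graph, {"a": 3, "b": 7, "c": 1})
print(result)  # every vertex converges to 7
```

In the real system each vertex lives on one of thousands of machines and only the messages cross the network, which is what makes the model practical for web-scale graphs.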
In May of last year, University of California at Berkeley professor Eric Brewer announced he was joining the team building Google's "next gen" infrastructure.
Google was a research operation – and yet it wasn't. "The Google infrastructure work wasn't really seen as research," Ghemawat says. "It was about how do we solve the problems we're seeing in production."
For some, the drawback of working on Google's core infrastructure is that you can't tell anyone else what you're doing. This is one of the reasons an engineer named Amir Michael left Google
wired.com