Synthesis Lectures on Computer Architecture | Book titles in this series
This series covers topics pertaining to the science and art of designing, analyzing, selecting and interconnecting hardware components to create computers ...
Google Throws Open Doors to Its Top-Secret Data Center
If you're looking for the beating heart of the digital age--a physical location where the scope, grandeur, and geekiness of the kingdom of bits become manifest--you could do a lot worse than Lenoir, North Carolina. This rural city of 18,000 was once rife with furniture factories. Now it's the home of a Google data center.
Synchronized timing system for network-interconnected computers (priority before Sep 17, 1998) - Google Patents
A synchronized timing system is disclosed for one or more of a plurality of network interconnected computers. The system utilizes a global satellite system and includes a receiver device for detecting out-of-phase signals from a plurality of satellite sources of the satellite system. A mechanism is provided for processing and phase correlating these signals to generate a single absolute time reference signal therefrom. An interface device is disposed in each computer for receiving the reference signal and adapting this signal as the internal master clock reference for the operating system of the computer. Finally, a mechanism interconnects each computer in the network of computers to synchronize the internal master clocks of the computers to the absolute time reference signal to create a plurality of network interconnected time synchronized computers. These computers may be additionally time synchronized and interconnected to other networks of computers through a global communication system such as the global Internet.
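The mechanism the abstract describes, a satellite-derived absolute time reference adopted as each machine's master clock, can be sketched roughly as follows. This is an illustrative simulation, not the patent's implementation; all class and method names are hypothetical, and real systems must also handle signal reception, phase correlation, and ongoing drift.

```python
# Illustrative sketch: each networked machine disciplines its local clock
# to a shared absolute time reference, as a GPS-derived signal would provide.

class MachineClock:
    """A local clock with a fixed offset (accumulated drift) from true time."""

    def __init__(self, drift_seconds):
        self.drift = drift_seconds    # how far this clock has wandered
        self.correction = 0.0         # offset applied after synchronization

    def local_time(self, true_time):
        # What the machine's uncorrected oscillator reports.
        return true_time + self.drift

    def sync_to_reference(self, reference_time, true_time):
        # Offset between the absolute reference and the local reading;
        # the OS then presents reference-aligned time to applications.
        self.correction = reference_time - self.local_time(true_time)

    def master_time(self, true_time):
        # Time as seen by the operating system after correction.
        return self.local_time(true_time) + self.correction


# Three machines whose clocks have drifted apart by tens of milliseconds.
machines = [MachineClock(d) for d in (0.040, -0.025, 0.013)]

true_now = 1_000_000.0    # "true" absolute time (arbitrary epoch)
reference = true_now      # an ideal reference signal equals true time

for m in machines:
    m.sync_to_reference(reference, true_now)

# After synchronization, every machine reports the same master time.
later = true_now + 5.0
readings = [m.master_time(later) for m in machines]
print(readings)    # all three readings agree
```

Under these simplifying assumptions the correction exactly cancels each machine's drift, so all machines subsequently agree on the absolute time; in practice, clocks keep drifting and must be re-disciplined continuously.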
Supercomputing news and information focused on emerging HPC applications in science, engineering, financial modeling, virtual reality, databases, and other compute-intensive tasks
What is wrong with the Internet today? SCION is a clean-slate internet architecture designed to overcome the security, availability, and performance limitations of the current IP- and BGP-based internet.
Dr. Guru Parulkar – Executive Director, Clean Slate Internet Design Program, Stanford University
Dr. Guru Parulkar joined Stanford University in August 2007 and served as the Executive Director of its Clean Slate Internet Design Program. He has been in the field of networking for over 25 years and cherishes the opportunities he has had to work with great people. He has worked in academia (Washington University in St. […]
Q&A: Stanford applies a clean slate to the Internet
Nobody’s going to tear down the Internet and rebuild it from scratch, but academics at Stanford University are imagining what the new blueprint would look like if they did, and they hope their work will lead to an Internet that works better in 20 years than it does today.
A European approach to a clean slate design for the future internet
The paper discusses a research program for a clean slate design of a “Future Internet” undertaken by Bell Labs in cooperation with a consortium of major European operators, manufacturers, and academia. The research cooperation explores innovative solutions in architectural design, virtualization, and generic connectivity in order to create a future “network of information.” We describe high-level goals and identify technical requirements and the expected business opportunities of this initiative. A basic idea for a new networking concept—the “generic path”—is outlined as an example of how to realize this in the future.
[PDF] Clean-slate Design for the Internet | Semantic Scholar
We believe that the current Internet has significant deficiencies that need to be solved before it can become a unified global communication infrastructure. Further, we believe the Internet’s shortcomings will not be resolved by the conventional incremental and “backward-compatible” style of academic and industrial networking research. The proposed program will focus on unconventional, bold, and long-term research that tries to break the network’s ossification. To this end, the research program can be characterized by two research questions: “With what we know today, if we were to start again with a clean slate, how would we design a global communications infrastructure?”, and “How should the Internet look in 15 years?” We will measure our success in the long term: we intend to look back in 15 years’ time and see significant impact from our program.
Many believe that it is impossible to resolve the challenges facing today's Internet without rethinking the fundamental assumptions and design decisions underlying its current architecture. Therefo...
Scientists throughout the world are looking at a large-scale overhaul of some basic Internet elements to eliminate the need to constantly create workarounds to meet the challenges of coping with technology changes.
You Can't Have Google's Pluto Switch, But You Can Have This
When photos of Google's mystery "Pluto Switch" appeared on the web early last year, it seemed like something from another world -- and not just because Google called it the Pluto Switch. But as alien as the Pluto Switch may seem, it's very much a sign of where the rest of the computer networking world is moving.
SEP 19, 2012 - Google Spans Entire Planet With GPS-Powered Database
Three years ago, a top Google engineer named Vijay Gill was asked what he would do if someone gave him a magic wand. Gill hesitated before answering. And when he did answer, he was coy. But he seemed to say that he would build a single system that could automatically and instantly juggle information across all of Google's data centers -- and he seemed to say that Google had already built one.
NOV 26, 2012 - Exclusive: Inside Google Spanner, the Largest Single Database on Earth
Much like the engineering team that created it, Google Spanner is something that stretches across the globe while behaving as if it's all in one place. Unveiled this fall after years of hints and rumors, it's the first worldwide database worthy of the name -- a database designed to seamlessly operate across hundreds of data centers and millions of machines and trillions of rows of information.