In his new role, he will help ensure that technology implemented by the White House is “efficient, effective and secure,” including “converging overlapping systems, modernizing software used to collaborate and bringing use of new technologies in line with private sector best practices.”
At Facebook, Recordon led teams on projects including productivity tools that aided employees in creating, sharing and finding information, as well as open source, engineering education, human resources, videoconferencing and physical security.
President Barack Obama said in a release announcing Recordon’s appointment:
In our continued efforts to serve our citizens better, we’re bringing in top tech leaders to support our teams across the federal government. Today, I’m pleased to welcome David Recordon as the director of White House information technology. His considerable private sector experience and ability to deploy the latest collaborative and communication technologies will be a great asset to our work on behalf of the American people.
Readers: Are you surprised more Facebook employees — particularly a certain chief operating officer — haven’t left the social network for positions in Washington, D.C.?
Article courtesy of SocialTimes Feed
Day one of the Open Compute Summit in San Jose, Calif., was highlighted by a handful of announcements from Facebook.
The social network used the first day of OCP U.S. Summit 2015 to introduce a new system-on-a-chip compute server, publish a specification for its top-of-rack network switch, open-source its low-level board-management software and the switch's central library, and detail the cost and energy savings from its infrastructure efforts.
Facebook detailed its announcements in a post on its engineering blog, concluding:
Over the past four years, Facebook has been steadily working to revolutionize data-center hardware. We started with new server designs, power handling and cooling; then focused on storage and rack; and over the past year, we have completely opened the data-center network. The result is that today, we have open-sourced every major physical component of our data-center stack — a stack that is powerful enough to connect 1.39 billion people around the world, and is efficient enough to have saved us $2 billion in infrastructure costs over the last three years. But we’re not finished — not even close.
Our mission is to connect the world. We can’t achieve this goal without infrastructure innovation. To go where we want to go, we need to build and deploy infrastructure that is as flexible, efficient and sustainable as possible. To do this, we want to work with not just the best minds under one roof, but the best minds in the world — and that’s where the Open Compute Project comes in.
Four years ago, the major Web companies were working in silos to build the infrastructure necessary for their scale. Now, just four years in, the Open Compute Project has thousands of participants and nearly 200 companies working to increase the pace of innovation in the industry. Facebook is proud to have started this initiative, and we will continue to openly share our technologies and our learnings as we build the infrastructure required to connect the next 5 billion people.
These technologies underpin the software services we use every day. As hardware advances, so do the speed, performance, capabilities and reach of that software. The Open Compute Project is about working together to reimagine, reinvent and build data centers, servers, storage devices and network technologies to support the massive growth of data today and enable the great services of tomorrow.
Readers: What are your initial thoughts on Facebook’s announcements on day one of OCP U.S. Summit 2015?