According to a report on the Wall Street Journal's blog, Google is building out an experimental wireless network at its Mountain View, Calif., campus, using frequencies that would not normally be used for wireless communications in the United States. However, these frequencies are currently being examined for use in countries with highly populated areas, like China and Japan, which makes Google's use of them all the more tantalizing and mysterious.
Naturally, Google is tight-lipped about its plans, but the deployment is set to happen in the same building that houses the Google Fiber team — and that makes Google's infrastructure plans worth speculating about. With the recent roll-out of Google Fiber in Kansas City, Kan., and Kansas City, Mo., Google has been challenging the status quo when it comes to connectivity.
Google's pitch with Google Fiber was simple: gigabit speeds with no caps or throttling allow for more innovative technologies, because access to the Internet becomes a near-instant experience. And while Google's efforts here are likely focused on mobility (the alleged 2423-2625 megahertz frequencies being tested may be licensed by wireless broadband company Clearwire Corp.), Google could be looking to extend broadband beyond the home by blanketing major cities, eliminating the need to rely on traditional mobile carriers and their less-reliable speeds and more restrictive data practices.
Much as ISPs were shocked by the Kansas City roll-outs, wireless providers like AT&T Inc. and Verizon Communications Inc. should be paying attention to what innovative infrastructure looks like — assuming the speculation about mobile infrastructure is accurate.
But big picture time: how does this affect cloud services and cloud providers? In a word, ubiquity, and greater reliance on the public cloud as a utility. Users will demand greater security from the public cloud, since shared airspace and shared resources are dangerous for unencrypted data. With the inevitable rise of mobile devices, this could also be the first sign that on-premises security is going the way of the dodo. If the majority of the world relies on public infrastructure, the best way to protect users is to encapsulate them and their traffic.
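To make the "encapsulate the traffic" idea concrete, here is a minimal, purely illustrative sketch in Python: the client wraps its payload in a signed envelope before it ever crosses shared infrastructure, so the receiving service can detect tampering. This is not Google's or any provider's actual scheme; real deployments would use TLS and authenticated encryption, and the key and function names below are invented for the example.

```python
import base64
import hashlib
import hmac
import json

def wrap(payload: bytes, key: bytes) -> str:
    """Encapsulate a payload with an integrity tag before it leaves the client."""
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    envelope = {"data": base64.b64encode(payload).decode(), "tag": tag}
    return json.dumps(envelope)

def unwrap(envelope_json: str, key: bytes) -> bytes:
    """Verify the integrity tag and return the payload; raise if tampered with."""
    envelope = json.loads(envelope_json)
    payload = base64.b64decode(envelope["data"])
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, envelope["tag"]):
        raise ValueError("envelope failed integrity check")
    return payload

# Placeholder shared secret, for illustration only.
key = b"shared-secret"
sealed = wrap(b"user traffic", key)
assert unwrap(sealed, key) == b"user traffic"
```

The point of the sketch is that protection travels with the data itself rather than depending on the trustworthiness of whatever shared network carries it.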
As such, cloud service providers will need to start offering a more complete solution package. That idea exists in small pockets today, such as a recent partnership between IBM and AT&T for a secure public cloud VPN, but more elegant solutions built into cloud service delivery itself will be the future. Not every company that relies on the public cloud for apps and services will want to rely on an additional company for a secure pipe.
This means cloud service providers will become brokerages of software, security, services and support, all specialized for the particular industry at hand. It's a daunting task, which is why the cloud provider world may shrink in the years ahead as companies merge and conglomerate. Whether or not Google's project comes to fruition won't necessarily matter (though it may accelerate the process); this future is still on the horizon thanks to the existing efforts of mobile carriers and the proliferation of LTE.
Even though partners today are just wrapping their heads around cloud-based business models and delivery, the best-prepared partners will treat an end-to-end software and security model as a realistic long-term project, because when the market suddenly starts demanding it, the providers first to market may corner it.