Is Scalable Cloud Computing Ready for Higher Education?
Dennis Anderson, PhD, and Peter Morales
There seem to be a lot of differing perspectives on cloud computing; ask three experts and you will likely get three views. One certainty is that all of us, especially in academia, are being asked to do more with less. In this article we will attempt to cut through some of these issues as they pertain to higher education, though we are certain that readers from other industries will find that they ring true as well.
We often field questions from our academic peers about whether the cloud is ready for their needs. Before answering, it helps to understand a few characteristics of the academic computing environment.
Spiky – seasonal traffic loads are multiples of daily load
Traffic on our services in higher education follows predictable patterns. We have application spikes, registration spikes, testing, and other scheduled activities that drive loads in our environment. Once in a while, a special speaker or other event will create an unusual spike in data volume, and this is increasingly true as we use more video and multimedia. In general, though, we can predict our 'spikiness'.
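As a rough illustration of what predictable spikiness means in capacity terms, the sketch below computes a peak-to-average ratio from a week of request counts. The figures are invented for illustration, not measured campus data.

```python
# Quantify "spikiness" as the peak-to-average ratio of daily request counts.
# The numbers below are hypothetical, simulating a registration-week spike.
daily_requests = [1200, 1300, 1250, 9800, 10200, 1400, 1350]

average = sum(daily_requests) / len(daily_requests)
peak = max(daily_requests)
peak_to_average = peak / average

print(f"average daily load: {average:.0f}")
print(f"peak daily load:    {peak}")
print(f"peak-to-average:    {peak_to_average:.1f}x")
```

A statically provisioned environment must be sized for the peak, which is why a high peak-to-average ratio translates directly into idle capacity the rest of the year.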
Slow provisioning sets the stage for inefficient use of resources
Even with virtualization, we find that it often takes days to get a new service provisioned in a data center. Yes, we have all the tools: we are running an up-to-date hypervisor, we have a fully stocked SAN, and we have virtualized our storage. Unfortunately, a fully deployed service requires much more than a bare VM instance; many other aspects of deployment require time-consuming manual configuration. As a result, folks almost universally over-provision their environments. The problem is that each server costs more than its initial fixed price: there are power and cooling costs, support costs, operational costs, and so on. These, then, are the challenges we hear from our organizations.
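To make the over-provisioning point concrete, here is a back-of-the-envelope sketch. Every figure in it is hypothetical, chosen only to show how idle capacity compounds hardware cost with the power, cooling, and support costs mentioned above.

```python
# Back-of-the-envelope cost of over-provisioning: each idle server still
# incurs power, cooling, and support costs. All dollar figures are
# hypothetical placeholders, not vendor pricing.
SERVERS_PROVISIONED = 40
SERVERS_ACTUALLY_NEEDED = 25

ANNUAL_COST_PER_SERVER = {
    "hardware_amortization": 1500,  # purchase price spread over useful life
    "power_and_cooling": 900,
    "support_and_operations": 1200,
}

cost_per_server = sum(ANNUAL_COST_PER_SERVER.values())
idle_servers = SERVERS_PROVISIONED - SERVERS_ACTUALLY_NEEDED
wasted_annually = idle_servers * cost_per_server

print(f"annual cost per server:  ${cost_per_server}")
print(f"idle servers:            {idle_servers}")
print(f"wasted on idle capacity: ${wasted_annually}")
```

Even with these modest assumed numbers, a 15-server cushion wastes tens of thousands of dollars a year, which is the economic case for elastic provisioning.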
Cloud requirements from an academic perspective
In order to meet those challenges, we need to address several significant obstacles:
Secure handling of data is critical
As in the financial industry, secure handling of data is a crucial component of any cloud strategy. When it comes to security, cloud computing gets interesting. The problem is, of course, compounded when the cloud extends beyond the internal campus network. The problem with a private cloud inside the firewall is that if its resources are used by a broad user base across the campus, any one virtualized instance can request access to the outside world. An Infrastructure as a Service (IaaS) approach can create situations where one virtualized application service handles secure data while another virtual instance on the same cloud is compromised. Does this pose a security risk to private data? Conceivably, someone hacking a public instance could hop to other instances on the cloud – private or not! There have been developments in cloud security from Intel, the NSA, Red Hat, and others, though, as is usually the case, new vulnerabilities have surfaced and will continue to surface.
The NSA continues to fund work at the National Information Assurance Research Laboratory on multiple security projects directly related to cloud computing and to secure operating systems in general. The Flux Advanced Security Kernel (Flask) architecture has been integrated into several operating systems, including several versions of Linux (most notably as SELinux), Red Hat Enterprise Virtualization (RHEV), and Mac OS X.
Both Microsoft’s and Google’s Software as a Service (SaaS) offerings face considerable security risks, and both vendors have signaled a serious focus on this aspect. From a higher-education perspective there are compelling reasons to leverage these SaaS models. For one, you can free up capital to reinvest in other critical needs. In addition, developing and maintaining the knowledge of and focus on security is not a trivial matter. The providers of IaaS and SaaS have a lot at stake and therefore have every incentive to stay ahead of the curve on information security (not that a university doesn’t). However, the large vendors may also be juicier targets. Microsoft’s Global Foundation Services has standardized some 200 specific controls across its data centers and service provisioning centers. Google, for its part, chose to delay a major cloud application implementation for the city of Los Angeles in part to strengthen its security.
Anything involving security is a continuing arms race between attackers and defenders. For higher education, the risks and the rewards are high. There will be breaches and corresponding responses. Is our data more secure on our own servers? Risk is a daily balance, and there is no such thing as a risk-free environment. The question you need to answer as an organization is: can we do better on our own, or does leveraging a larger organization’s information security resources make us safer, even if we become part of a richer target? This is not the first time we’ve faced this question in the evolution of our species, nor will it be the last. If you choose to go with the larger organization, it is important that you keep up with software updates. Otherwise you risk the worst of both worlds: being a juicy target that doesn’t keep up its defenses.
Guest contributors: Peter Morales, M.S.M.S., PMP; and Rogerio Panigassi, M.S.E.E., PMP
About the writers:
Dennis Anderson, Ph.D. is Chairman and Professor of Management and Information Technology at St. Francis College, New York City. Prior to this appointment, he was a professor of information systems and associate dean at Pace University. He has also taught at the NYU Courant Institute. He received his Ph.D. and M.Phil. from Columbia University and completed Harvard University's Institute for Management and Leadership in Education Program. He has served as an adviser to various organizations, including CIO Magazine, Microsoft, and the United Nations. More information can be found at http://www.drdennisanderson.com.
Peter Morales is the Chief Technologist and Director of the Project Management Office at the NYU School of Law. He has served NYU in multiple roles, including a stint as the CIO of the Polytechnic Institute of NYU, and serves on various executive technology steering committees. He also teaches a management course to NYC managers, directors, and assistant commissioners across all city agencies. Prior to NYU, Peter directed various technology efforts at the NYSE for nearly a decade. He was responsible for the development of an experimental neurological research tool at a major research hospital, and he started his career leading development efforts for an advanced avionics diagnostic and information management system used on the F-18 for a major defense contractor. He is a doctoral candidate in Computer Science at Pace University, holds a Master of Science in the Management of Technology from the Polytechnic Institute of NYU (formerly known as Brooklyn Poly), and holds a degree in Electrical Engineering from the Rochester Institute of Technology.
Rogerio Panigassi is a program manager in Microsoft’s Server and Tools Online Division, working as the TechNet site manager responsible for the website’s learning content dedicated to developing IT professionals’ capabilities with Microsoft products. Having worked for ten years at the company, he has held many different positions in services, marketing, evangelism, and engineering. He obtained his Master’s in Electrical Engineering, in the field of Digital Systems, from the Polytechnic School of the University of Sao Paulo, Brazil, after earning an Electrical Engineering degree from Maua Engineering School, where he later returned to teach computer science for four years.