Why Should We Consider Cloud Computing?

So far, in my earlier articles, we have familiarized ourselves with the various terms of Cloud Computing. If you are new and not sure about the terminology, I would suggest taking a quick look at “Taxonomy of Cloud Computing“. In this article, we will explore why Cloud Computing is worth considering. In other words, we will focus on the advantages of Cloud Computing over the conventional data center. So, let’s start.

What is a Data Center? A quick look at its history!

A data center is a building, a dedicated space within a building, or a group of buildings used to house computer systems and their associated components, such as networking and storage equipment. Since IT operations are critical to business continuity, a data center generally includes redundant or backup components and infrastructure for power supply, data communication connections, environmental controls (e.g. air conditioning, fire suppression) and various security devices. Data centers have their roots in the huge computer rooms of the early 1940s. Later, with the advancement of technology and the growth in demand, those large computer rooms evolved into the modern data center. Below, the timeline is divided into decades.

  • 1940s: ENIAC was the first electronic general-purpose digital computer. It was Turing-complete and able to solve “a large class of numerical problems” through reprogramming. ENIAC was designed and primarily used to calculate artillery firing tables for the United States Army’s Ballistic Research Laboratory; its first program was a study of the feasibility of the thermonuclear weapon. The machine did not use transistors. Instead, it contained almost 18,000 vacuum tubes, 7,200 crystal diodes and 10,000 capacitors, and it occupied 167.2 square meters (1,800 square feet). It is considered a forerunner of the modern data center.
  • 1950s: TRADIC, introduced in 1954, was the first transistorized computer: it used only transistors and diodes, with no vacuum tubes.
  • 1960s: In 1964, the first supercomputer was introduced. It was the CDC 6600, with a sustained performance of 1 MFLOPS and a peak of 3 MFLOPS.
  • 1970s: In 1971, Intel introduced its 4004 processor, the first general-purpose programmable processor on the market. It served as a “building block” that engineers could purchase and then customize with software to perform different functions in a wide variety of electronic devices. In 1973, the Xerox Alto reached the market and presented the first graphical UI; the machine was well ahead of its time and even came with a three-button mouse. In 1977, ARCnet was introduced as the first LAN, put into service at Chase Manhattan Bank. It supported data rates of 2.5 Mbps and connected up to 255 computers on a network. Just a year later, the American software company SunGard established the first commercial disaster recovery business. Prior to the introduction of PC servers, IT decisions revolving around the mainframe had to be made at an enterprise scale for everything from the operating system to hardware and applications. All of these ran on one device for the entire enterprise, offering limited flexibility and forcing difficult IT decisions.
  • 1980s: The massive and expensive mainframes were dying out, replaced by cheaper and easier-to-maintain PCs. Personal computers (PCs) were introduced in 1981, leading to a boom in the microcomputer industry. Sun Microsystems created the Network File System (NFS) protocol, which let client computers access files over the network much as they accessed internal storage. Computers were being installed at a rapid rate everywhere, but minimal attention was given to environmental and operating requirements.
  • 1990s: During the early 1990s, microcomputers began filling old mainframe computer rooms as “servers,” and the rooms became known as data centers. Companies then began assembling these banks of servers within their own walls. During the mid-1990s, the “.com” surge caused companies to demand fast internet connectivity and nonstop operation, which resulted in enterprise construction of server rooms and much larger facilities (hundreds and thousands of servers). The data center as a service model became popular at this time. IT decisions started being made in two separate ways: servers allowed application-based decisions, while hardware (data center) decisions remained enterprise-level decisions. In 1997, Connectix released a program called Virtual PC, which, like SoftPC, allowed users to run a copy of Windows on a Mac in order to work around software incompatibilities. Around 1999, VMware began selling VMware Workstation, which was similar to Virtual PC; initial versions ran only on Windows, but support for other operating systems was added later. Salesforce.com pioneered the concept of delivering enterprise applications via a simple website.
  • 2000s: At the beginning of the decade, power consumption was starting to cause maintenance issues: the data centers of the day consumed too much power. This started a trend toward improving efficiency, building better cooling systems and reducing consumption. In 2002, Amazon Web Services began developing a suite of cloud-based services, which included storage, computation and some human intelligence through “Amazon Mechanical Turk.” In 2006, Amazon Web Services began offering IT infrastructure services to businesses in the form of web services, now commonly known as cloud computing. In 2007, Sun Microsystems introduced the modular data center, transforming the fundamental economics of corporate computing.
  • 2010s: In 2011, Facebook launched the Open Compute Project, an industry-wide initiative to share specifications and best practices for creating the most energy-efficient and economical data centers. In 2012, surveys indicated that 38 percent of businesses were already using the cloud and 28 percent had plans to initiate or expand their use of it. In 2013, Telcordia introduced generic requirements for telecommunications data center equipment and spaces; the document presents minimum spatial and environmental requirements for such equipment and spaces. Google invested a massive $7.35 billion in capital expenditures on its Internet infrastructure during 2013, driven by a huge expansion of its global data center network, which represented perhaps the largest construction effort in the history of the data center industry.
  • 2020s: Today, the data center is moving toward a new, subscription-based client-server model. Companies choose this model to reduce their costs: they do not need to purchase expensive hardware and constantly upgrade it. Instead, they use cloud services, where a third party is responsible for the hardware resources and often for the IT support as well. The future consists of low-power, long-lasting client devices that connect to the cloud (data centers), where all of the processing is done.

Benefits of Cloud

In the timeline above, we have seen the evolution toward cloud computing. In the following sections, we summarize its major advantages.

Infrastructure Cost

One of the major benefits is cost reduction. Cloud computing eliminates capital expenses such as purchasing hardware and software, setting up and running on-site data centers, the racks of servers, the round-the-clock electricity for power and cooling, and the IT experts needed to manage that infrastructure.

Ability to Scale

The benefits of cloud computing services include the ability to scale infrastructure elastically. That means procuring the right amount of IT resources at any given time (more or less computing power, storage or bandwidth) right when they are needed and from the right geographic region, as illustrated by the sketch below.
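To make the idea concrete, here is a minimal, hypothetical sketch of a threshold-based scaling decision. The function name, thresholds and instance limits are made up for illustration; a real deployment would rely on the monitoring and provisioning APIs of the chosen cloud provider.

```python
# Hypothetical threshold-based autoscaler (illustrative sketch only).
# decide_instance_count() mimics the kind of rule a managed autoscaling
# service evaluates; the thresholds and limits below are made-up values.

def decide_instance_count(current_count, avg_cpu,
                          scale_up_at=0.75, scale_down_at=0.25,
                          min_count=1, max_count=20):
    """Return the desired number of instances for the observed CPU load."""
    if avg_cpu > scale_up_at and current_count < max_count:
        return current_count + 1   # load is high: add capacity
    if avg_cpu < scale_down_at and current_count > min_count:
        return current_count - 1   # load is low: release capacity
    return current_count           # within the comfort band: no change


if __name__ == "__main__":
    # Simulated samples of (running instances, average CPU utilization).
    for count, cpu in [(2, 0.90), (3, 0.40), (3, 0.10)]:
        desired = decide_instance_count(count, cpu)
        print(f"{count} instances at {cpu:.0%} CPU -> {desired} instances")
```

In practice, managed autoscaling services evaluate rules of exactly this kind automatically, based on metrics and limits you configure, so the added or released capacity is billed only while it runs.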

Performance

The major cloud computing services run on a worldwide network of secure data centers that are regularly upgraded to the latest generation of fast and efficient computing hardware. This offers several benefits over a single corporate data center, including reduced network latency for applications and greater economies of scale.

Speed

Most cloud computing services are provided self-service and on demand, so even vast amounts of computing resources can be provisioned in minutes, typically with just a few mouse clicks, giving businesses a lot of flexibility and taking the pressure off capacity planning.

Reliability & Security

Cloud computing makes data backup, disaster recovery and business continuity easier and less expensive because data can be mirrored at multiple redundant sites on the cloud provider’s network. Many cloud providers offer a broad set of policies, technologies and controls that strengthen your security posture overall, helping protect your data, apps and infrastructure from potential threats.
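As a simple illustration of the mirroring idea, the following hypothetical sketch writes the same backup object to several regions. The region names, bucket name and `upload_object` helper are invented for the example; a real implementation would call the storage SDK of the chosen provider.

```python
# Hypothetical multi-region backup sketch (illustrative only).
# The region names, bucket name and upload_object() helper are invented;
# a real implementation would call the provider's storage SDK instead.

REGIONS = ["eu-west", "us-east", "ap-south"]   # assumed region identifiers

def upload_object(region, bucket, name, data):
    """Placeholder for an SDK call that stores `data` as `name` in `bucket`."""
    print(f"uploaded {name} ({len(data)} bytes) to {bucket} in {region}")

def mirror_backup(bucket, name, data):
    """Write the same backup object to every configured region."""
    for region in REGIONS:
        upload_object(region, bucket, name, data)

if __name__ == "__main__":
    mirror_backup("nightly-backups", "db-dump-2020-01-01.gz", b"...backup bytes...")
```

Because each copy lives in a different region, a failure at one site does not take the backup (or the recovery plan that depends on it) down with it.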

Productivity

On-site data centers typically require a lot of “rack and stack” work: hardware setup, software patching and other time-consuming IT management chores. Cloud computing removes the need for many of these tasks, so IT teams can spend their time on more important business goals.

References:

  1. Wikipedia
If you like the article, please don’t hesitate to give it a like. If you think it could be useful for someone, please don’t hesitate to share it. Feel free to comment to make this article more complete or informative.