Cloud computing has become an important consideration for most IT and large-scale companies, not least because every organisation must secure its sensitive data. Every organisation holds a set of assets that plays a major role in sustaining its growth.
Cloud Computing is a “newsworthy” term in the IT industry in recent times and it is here to stay!
Cloud Computing is not a technology, or even a set of technologies — it’s an idea. Cloud Computing is not a standard defined by any standards organization.
A basic understanding of the cloud: the "cloud" represents the Internet. Instead of using applications installed on your computer or saving data to your hard drive, you work and store your data on the Web. Data is kept on servers and handled by the service you are using; tasks are performed in your browser through an interface or console provided by the service.
A credit card and Internet access are all you need to make an investment in technology. Businesses will find it easier than ever to provision technology services without involving IT.
There are many definitions available in the market for Cloud Computing but we have aligned it with NIST publication and with our understanding. NIST defines cloud computing by describing five essential characteristics, three cloud service models, and four cloud deployment models.
"Cloud computing is a self-service that is on-demand, elastic, measured, multi-tenant, pay-per-use, cost-effective and efficient." It is the access of data, software applications, and computer processing power through a 'cloud': a group of many online, on-demand resources. Tasks are assigned to a combination of connections, software and services accessed over a network. This network of servers and connections is collectively known as "the cloud."
Cloud service delivery is divided among three fundamental classifications, referred to as the "SPI model": Software, Platform, and Infrastructure as a Service.
Software as a Service is a model of software deployment where an application is hosted as a service provided to customers across the Internet. By eliminating the need to install and run the application on the customer’s own computer, SaaS alleviates the customer’s burden of software maintenance, ongoing operation, and support. Salesforce is a very popular Customer Relationship Management (CRM) product that is offered only as a service.
The PaaS model makes all of the facilities required to support the complete life cycle of building and delivering web applications and services available entirely from the Internet. Google App Engine (GAE) is an example of PaaS: it provides a Python environment within which you can build, test and then deploy your applications.
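To make the PaaS idea concrete, here is a minimal WSGI application of the kind GAE's Python environment can host. This is a hypothetical illustration, not GAE-specific code; a real App Engine deployment also involves the platform's own configuration and tooling.

```python
# Minimal WSGI application: on a PaaS, the platform supplies the server,
# runtime, scaling and operations; the developer supplies only this code.
# Hypothetical illustration -- not tied to any specific provider's API.

def application(environ, start_response):
    """Respond to every request with a plain-text greeting."""
    body = b"Hello from the cloud!"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```

Locally you could serve this with the standard library's `wsgiref.simple_server`; on a PaaS, the platform takes care of hosting, scaling and patching for you.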
Infrastructure as a Service (IaaS) is the delivery of computer infrastructure as a service. Rather than purchasing servers, software, data center space or network equipment, clients instead buy those resources as a fully outsourced service. Amazon Web Services (AWS) is one of the pioneers of such an offering. AWS’ Elastic Compute Cloud (EC2) is “a web service that provides resizable compute capacity”.
There are four deployment models for cloud services regardless of the service model utilized (SPI).
Public clouds refer to shared cloud services that are made available to a broad base of users. Although many organizations use public clouds for private business benefit, they don’t control how those cloud services are operated, accessed or secured. Popular examples of public clouds include Amazon EC2, Google Apps and Salesforce.com.
Private cloud describes an IT infrastructure in which a shared pool of computing resources—servers, networks, storage, applications and software services—can be rapidly provisioned, dynamically allocated and operated for the benefit of a single organization.
Hybrid Cloud represents composition of two or more cloud deployment models (private, community, or public) that remain unique but are bound together by uniform or proprietary technology that enables data and application portability.
Community Cloud represents infrastructure that is shared by several organizations and supports a specific community with shared concerns. For example, FDA compliance requires specific controls whose audit requirements cannot be met by the other deployment models.
Cloud computing brings efficiencies and savings. The diverse benefits of cloud computing are undoubtedly worth pursuing. Cost-cutting is at the top of most companies’ lists of priorities in these challenging economic times. Having turned from revolutionary possibility into increasingly well-established custom, the cost of ‘outsourcing to the cloud’ is now falling dramatically.
By paying only for the resources actually used, operating costs can be reduced; after all, in-house data centres typically leave 85%-90% of available capacity idle. Cloud computing can lead to energy savings too, removing from individual companies the costly burden of running a data centre plus generator back-up and uninterruptible power supplies. The result is a reduction in both CAPEX and OPEX.
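A back-of-the-envelope sketch (with purely hypothetical figures) shows how that idle capacity inflates the effective cost of in-house infrastructure compared with pay-per-use:

```python
# Why idle capacity is expensive: you pay for every hour of owned capacity,
# but only some of those hours do useful work. Figures are hypothetical.

def effective_cost_per_used_hour(hourly_cost, utilization):
    """Cost per *productively used* hour when capacity sits partly idle."""
    return hourly_cost / utilization

on_prem = effective_cost_per_used_hour(hourly_cost=1.00, utilization=0.12)  # ~88% idle
cloud   = effective_cost_per_used_hour(hourly_cost=1.00, utilization=1.00)  # pay only when used

print(f"on-premise effective cost per used hour: ${on_prem:.2f}")
print(f"cloud effective cost per used hour:      ${cloud:.2f}")
```

At 12% utilization, every productive server-hour effectively costs more than eight times its sticker price.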
Cloud computing is in its formative years, but expect it to grow up quickly. The potential of cloud computing is mind-boggling, and the technology and business options will increase exponentially.
Still, the question remains: how are clouds beneficial to enterprises?
Focus on core business
Cloud computing increases profitability by improving resource utilization. Pooling resources into large clouds drives down costs and increases utilization by delivering resources only for as long as they are needed.
Cloud computing is particularly valuable to small and medium businesses, where effective and affordable IT tools are critical to helping them become more productive without spending lots of money on in-house resources and technical equipment.
Ease of availability
Real-time collaboration capabilities
Gain access to latest technologies
We can leverage the sheer processing power of the cloud to do things that traditional productivity applications cannot. For instance, users can instantly search over 25 GB worth of e-mail online, which is nearly impossible to do on a desktop.
To take another example, each document created through Google Apps is easily turned into a living information source, capable of pulling the latest data from external applications, databases and the Web. This revolutionizes processes as simple as creating a Google spreadsheet to compare stock prices from vendors over time, because the cells can be populated and updated as the prices change in real time.
Cloud computing offers almost unlimited computing power and collaboration at a massive scale for enterprises of all sizes.
“Salesforce.com has 1.2m users on its platform. If that’s not scalable show me something that is. Gmail is SaaS and how many users are on that?”
Multi-tenancy enables sharing of resources and costs among a large pool of users, allowing for:
Centralization of infrastructure in areas with lower costs (such as real estate, electricity, etc.)
Peak-load capacity increases (users need not engineer for highest possible load-levels)
Utilization and efficiency improvements for systems that are often only 10-20% utilized.
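The peak-load point can be made concrete with a small sketch: the peak of a pooled workload can never exceed the sum of the tenants' individual peaks, so a shared pool needs less total capacity than everyone provisioning alone. The workload numbers below are invented for illustration.

```python
# Multi-tenant pooling: tenants' peaks rarely coincide, so the pool's peak
# is lower than the sum of individual peaks. Load figures are made up.

loads = {
    "tenant_a": [10, 80, 20, 15],   # load per time slot (e.g. requests/sec)
    "tenant_b": [60, 10, 15, 20],
    "tenant_c": [15, 20, 70, 10],
}

# Capacity needed if every tenant provisions for its own peak:
dedicated_capacity = sum(max(series) for series in loads.values())

# Capacity needed if the tenants share one pool:
pooled_capacity = max(sum(slot) for slot in zip(*loads.values()))

print(f"dedicated: {dedicated_capacity}, pooled: {pooled_capacity}")
```

Here pooling needs capacity of 110 instead of 210: each tenant rides out its peak on capacity the others are not using at that moment.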
Sustainability comes about through improved resource utilization, more efficient systems, and carbon neutrality.
But, are there any issues with Cloud Computing?
The benefits of cloud computing will not be realized if businesses are not convinced that it is secure. Trust is at the centre of success and providers have to prove themselves worthy of that trust if hosted services are going to work.
CIA (Confidentiality, Integrity, Availability)
IT Security Standards – there are multiple security standards for IT systems that have yet to be applied to cloud computing.
Regulatory compliance: the vendor will be required to participate in internal and external audits, and will need to find a way to accommodate auditors from all firms using their service. [FDA compliance is not yet feasible.]
Let’s consider some facts and figures before diving into the finer details of cloud computing. Compare the annual cost of Amazon EC2 with an equivalent deployment in co-located and on-site data centers by entering a few basic inputs (ref: Amazon EC2 Cost Comparison Calculator).
High-CPU Instances: Instances of this family have proportionally more CPU resources than memory (RAM) and are well suited for compute-intensive applications.
Scenario: 20 High-CPU Extra Large Instances (75% utilization), plus 5 peak instances at 10% annual utilization. Each High-CPU Extra Large Instance provides:
7 GB of memory
20 EC2 Compute Units (8 virtual cores with 2.5 EC2 Compute Units each)
1690 GB of instance storage
I/O Performance: High
Avg. monthly data transfer "in" per instance: 10 GB
Avg. monthly data transfer "out" per instance: 20 GB
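To make the comparison concrete, the annual compute cost for the scenario above can be sketched as follows. The hourly rate is a hypothetical placeholder, not Amazon's actual price, which varies by region and changes over time.

```python
# Annual compute-cost sketch: 20 baseline High-CPU Extra Large instances at
# 75% utilization plus 5 peak instances at 10% annual utilization.
# RATE_PER_HOUR is an ASSUMED figure for illustration only.

HOURS_PER_YEAR = 8760
RATE_PER_HOUR = 0.68          # hypothetical on-demand $/hour

baseline = 20 * 0.75 * HOURS_PER_YEAR * RATE_PER_HOUR
peak     = 5 * 0.10 * HOURS_PER_YEAR * RATE_PER_HOUR

print(f"baseline instances: ${baseline:,.0f}/year")
print(f"peak instances:     ${peak:,.0f}/year")
print(f"total compute:      ${baseline + peak:,.0f}/year")
```

Note how cheaply the peak capacity comes: because the 5 burst instances run only 10% of the year, they add only a few percent to the bill, which is exactly the pay-per-use advantage the calculator is designed to expose.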
Economics of cloud computing: scarcity refers to the tension between an organization's limited resources and its endless wants and needs. For an organization, resources include compute, storage and networking infrastructure. In a conventional IT environment, upfront investment is the biggest barrier; on top of that, resources are limited and the utilization of those limited resources is inefficient. Virtualization eases the problem and improves resource utilization over the traditional approach. The emergence of cloud services is again fundamentally shifting the economics of IT: cloud technology standardizes and pools IT resources, automates many of the maintenance tasks done manually today, and facilitates elastic consumption, self-service, and pay-as-you-go pricing.
The introduction of this general-purpose technology can make a fundamental contribution to growth and competition, and it can help the economy recover from a severe downturn. The economic impact of advances in hardware and software is considerable, and it will have a powerful effect on the market structure of many sectors and on global macroeconomic performance in the coming years.
Business economics has always been a powerful force in driving industry transformations, and more and more customers are assessing cloud computing investment strategies that will considerably affect ROI.
Macro and microeconomics are the two vantage points from which the Cloud economy can be observed. Micro and macroeconomics are intertwined; as Cloud economists gain understanding of Cloud phenomena, they can help nations and individuals make more informed decisions while using cloud computing.
Production Possibility Frontier (PPF)
Under the field of macroeconomics, the production possibility frontier (PPF) represents the point at which an economy is most efficiently using resources, therefore, allocating its resources in the best way possible.
The production possibility frontier shows there are limits to production, so an economy, to achieve efficiency, must decide what combination of services can be produced.
If there is a change in technology (in our case, a disruptive innovation named cloud computing) while the other factors remain the same, execution time or time to market is reduced significantly thanks to elasticity, the pay-as-you-go billing model and flexibility. Output increases, and the PPF is pushed outwards; a new curve represents the new efficient allocation of resources.
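The outward shift can be illustrated with a toy model: treat the PPF as a simple linear trade-off between two services, and let the technology change raise productivity. All numbers are illustrative.

```python
# Toy PPF: an economy splits a fixed resource budget between services A and B.
# A technology improvement (cloud adoption) raises productivity, so the same
# budget yields more output -- the frontier shifts outward. Numbers invented.

def ppf(units_of_a, budget=100, productivity=1.0):
    """Max units of service B producible after committing to `units_of_a`."""
    remaining = budget - units_of_a
    return max(0.0, remaining * productivity)

before = ppf(40, productivity=1.0)   # pre-cloud frontier
after  = ppf(40, productivity=1.5)   # same inputs, better technology

print(f"service B output for the same inputs: before={before}, after={after}")
```

With identical inputs, the post-cloud economy produces more of service B at every point: that is the outward shift of the frontier.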
The macroeconomic impact of the diffusion of this new general-purpose technology may be quite large, as happened with the introduction of the Internet.
Economics of the cloud is the field of study concerned with the production, distribution, and consumption of IT services in cloud computing. Its scope is certainly not limited to the financial benefits of the cloud computing paradigm; it is more about the choices cloud customers make, and inquiring into why they make them.
Economics of the cloud should show the way and should not confuse adopters. In the case of the cloud, microeconomics will be especially critical, covering aspects such as agility, creativity, innovation, social impact, trust and risk, economic value, and the scale of trade-offs.
Opportunity cost is the cost of any activity measured in terms of the value of the next best alternative foregone (that is not chosen). The opportunity cost is also the cost of the foregone products after making a choice. Opportunity cost is a key concept in economics, and has been described as expressing “the basic relationship between scarcity and choice”.
In such an environment the opportunity cost of moving to cloud computing is paramount, because it entails a commitment of 'sunk costs': costs that underpin the venture. This must be weighed against the potential yield, i.e. the carrot amid the hype.
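In code, opportunity cost is a one-liner; the figures below are hypothetical.

```python
# Opportunity cost: the value of the best alternative you give up by making
# a choice. All dollar figures here are invented for illustration.

def opportunity_cost(chosen_return, best_alternative_return):
    """What the chosen option forgoes relative to the best alternative."""
    return best_alternative_return - chosen_return

# Hypothetical: keeping an in-house data centre returns $1.0M of value,
# while moving to the cloud and reinvesting the savings would return $1.4M.
cost = opportunity_cost(chosen_return=1_000_000, best_alternative_return=1_400_000)
print(f"opportunity cost of staying on-premise: ${cost:,}")
```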
The solution to the "80-20" problem is "20-80": the Pareto principle says that roughly 80% of effects come from 20% of causes, and in IT the ratio has long run the wrong way, with departments spending 80% of their time and money just keeping systems running and only 20% on core business processes.
Cloud computing is a force that helps flip this ratio, giving IT departments the ability to spend 80% of their time on core business processes, such as business application design. It is this ability to go from 20% of time and money dedicated to core business processes to 80% that makes the economics of cloud computing so compelling. Nowhere is the current model's inefficiency more evident than in the opportunity costs that organizations pay to manage their own computing needs.
The curious question is: what is the definition of cloud computing, with real-world examples? It is a disruptive and innovative model for enabling convenient, on-demand, and flexible access to a shared pool of configurable computing resources (networks, servers, storage, applications, and services) that can be rapidly provisioned and de-provisioned with minimal or no management effort or service provider interaction.
This model is composed of three service models, four deployment models, and five essential characteristics as per NIST.
On-demand self-service: use a credit card and consume resources such as compute, storage and databases from the cloud environment without any interaction with service providers and without delays due to workflows and approvals. It takes less than a MINUTE! And believe me, you are not daydreaming… it is a reality.
Real world example: dial *123*3# and you instantly know your available SMS balance. (In the case of the cloud, every cloud service provider gives you a dashboard / management console / cost-control dashboard with information about your resource usage, cost and much more.)
Broad network access: Computing resources such as compute and storage capacity are available over network / on internet or intranet and they can be accessed via various devices such as smartphones, iPads, mobile phones, tablets, laptops, and workstations.
Real world example: the way we consume applications, wallpapers and ringtones from the Internet on mobile devices, or watch videos over the Web from any device; similarly, we can access cloud resources from anywhere on the network.
Resource pooling: a resource pool is a set of resources that is homogeneous with respect to some activity, action or context. Cloud service providers pool all computing resources to serve multiple consumers in a multi-tenant environment, where different physical and virtual resources are dynamically assigned and re-assigned according to demand.
Real world example: in software engineering, a connection pool is a cache of database connections, maintained so that connections can be reused when future requests to the database are required.
Purpose: to enhance the performance of executing database commands.
In a cloud environment, resources are likewise pooled together and their capacity is used in a unified manner to enhance performance and customer satisfaction.
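The connection-pool analogy can be sketched in a few lines; `DummyConnection` below is a hypothetical stand-in for a real database driver, where constructing a connection is the expensive step.

```python
# Minimal connection-pool sketch: expensive "connections" are created once,
# then reused by later requests instead of being rebuilt each time.

from collections import deque

class DummyConnection:
    created = 0
    def __init__(self):
        DummyConnection.created += 1   # creating a connection is the costly part

class ConnectionPool:
    def __init__(self, size):
        self._idle = deque(DummyConnection() for _ in range(size))

    def acquire(self):
        # Reuse an idle connection when possible; grow only when exhausted.
        return self._idle.popleft() if self._idle else DummyConnection()

    def release(self, conn):
        self._idle.append(conn)        # return the connection for reuse

pool = ConnectionPool(size=2)
for _ in range(100):                   # 100 sequential requests...
    conn = pool.acquire()
    pool.release(conn)

print(f"connections created for 100 requests: {DummyConnection.created}")
```

A hundred requests are served with only two connections ever built, which is exactly the effect pooled cloud resources achieve at data-centre scale.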
Rapid elasticity: an application can expand on demand across all of its tiers (presentation layer, application layer, database layer, as in MVC). It also implies that application components can grow independently of each other: if you require more storage for the database, you should be able to grow that tier without affecting the application's availability, without reconfiguration, and without changing the other tiers.
Real world example: in physics, elasticity (or stretchiness) is the property of a material that returns to its original shape once the stress, e.g. an external force, is removed (think of peak hours as the external forces in a cloud environment).
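A toy autoscaling loop shows this "stretch and return" behaviour: capacity grows as demand rises and shrinks back once the peak passes. The thresholds and load figures are invented for illustration.

```python
# Toy autoscaler: keep per-instance load near a target by adding instances
# under stress and removing them when load subsides. Figures are made up.

def scale(instances, per_instance_load, target=70, min_instances=2):
    """Return the new instance count for the observed per-instance load."""
    if per_instance_load > target:
        return instances + 1                       # scale out under stress
    if per_instance_load < target / 2:
        return max(min_instances, instances - 1)   # scale in when idle
    return instances

instances = 2
history = []
for total_load in [100, 300, 500, 400, 150, 80]:   # demand over time
    instances = scale(instances, total_load / instances)
    history.append(instances)

print("instance counts over time:", history)
```

The count rises to 5 through the peak and relaxes back to 3 afterwards, like the stretched material returning toward its original shape.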
Measured service: cloud environments automatically control and optimize resource use by leveraging a metering or chargeback capability. The basic value proposition of cloud computing is its utility-based pricing model, where you pay for what you use. Resource usage in a cloud environment can be monitored, controlled, and reported via alerts and dashboards. This is a fundamental feature of any cloud environment.
Real world example: consumers pay for electricity as they use it.
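Metering works just like that electricity meter. Here is a sketch of a utility-style bill; the unit rates are hypothetical, not any provider's actual prices.

```python
# Utility-style metering sketch: bill only for measured consumption,
# itemized the way a provider's cost dashboard would show it.
# Unit rates below are hypothetical.

RATES = {"compute_hours": 0.10, "gb_stored": 0.023, "gb_transferred": 0.09}

def monthly_bill(usage):
    """Multiply each metered quantity by its unit rate."""
    return {item: round(qty * RATES[item], 2) for item, qty in usage.items()}

usage = {"compute_hours": 720, "gb_stored": 50, "gb_transferred": 100}
bill = monthly_bill(usage)
print(bill, "total:", round(sum(bill.values()), 2))
```

Halve the usage next month and the bill halves with it: there is no idle capacity to keep paying for.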
Life science applications belong to the life sciences industry.
Life sciences comprise all fields of science that involve the scientific study of living organisms such as human beings, plants, and animals. The study of the behaviour of organisms is included only insofar as it involves a clearly biological aspect. Biology and medicine remain the core of the life sciences, although technological advances in molecular biology and biotechnology have led to a burgeoning of specializations and new interdisciplinary fields.
The R&D process in the life sciences can be a long and expensive undertaking. At a very high level, the product development process follows these basic steps:
• Phase 1 – identification of the candidate molecule, initial testing, and toxicology studies
• Phase 2 – more development, formulation, and human testing
• Phase 3 – double blind clinical trials to test efficacy and submission for FDA approval
The life science industry operates under the regulatory guidelines put forward by the Food & Drug Administration (FDA).
The Food and Drug Administration is a federal agency in the Department of Health and Human Services, established to regulate the release of new foods and health-related products.
The IT organizations in life science companies must adhere to the FDA guidelines put forth in the Code for Federal Regulations 21 Part 11 (CFR 21 Part 11). It defines how systems managing electronic records in life science firms must be validated and verified to ensure that the operation of and the information in these systems can be trusted.
Title 21 CFR Part 11 of the Code of Federal Regulations deals with the Food and Drug Administration (FDA) guidelines on electronic records and electronic signatures in the United States. CFR Part 11, as it is called, defines the criteria under which electronic records and signatures are considered to be reliable and equivalent to paper records.
Part 11 requires drug makers, manufacturers, biologics developers, biotech companies, and other FDA-regulated industries to implement controls such as audits, system validations, audit trails, electronic signatures, and documentation for software and systems involved in processing electronic data that are (a) required to be maintained by the FDA predicate rules or (b) used to demonstrate compliance to a predicate rule with some specific exceptions.
The actual Part 11 compliance process for any application covers the software, the hardware, and the operational environment of the system itself, allowing an IT team to answer an auditor's questions about how the system was installed, how it operates, and how it performs.
To prove these things, the system validation process has three primary components: Installation Qualification (IQ), Operational Qualification (OQ), and Performance Qualification (PQ) scripts. Organizations manage the IT environment for life science applications separately, with proper controls in place.
The CFR does not tell an organization how to do it; it states what needs to be done.
It all comes down to convincing the FDA auditor that the cloud environment conforms to the FDA compliance requirements.
Cloud computing can improve and speed up this process by reducing IT complexity and cost while allowing R&D organizations to focus on the ‘what’ of the R&D process instead of the ‘how’.
But how cloud computing and the FDA can be brought to the same table is the biggest issue, because the following items must be audited and tracked:
Ø Hardware serial number
Ø System configuration
Ø Equipment location
Ø Exact versions of all installed software
FDA compliance in a public cloud has so far been impossible, because you must know detailed information about the hardware and software your system runs on, and even the exact physical location of the resources.
In a private cloud, the owner has control over all resources (hardware and software), so compliance is still possible.
In a nutshell, the public cloud model simply does not fit current validation practices in FDA-regulated organizations. A private cloud environment, however, could give life science companies a shortcut to completing overall system validation, although the public cloud's "economy of scale" benefit will be out of reach in this case.
A community cloud may be established where several organizations have similar requirements and seek to share infrastructure so as to realize some of the benefits of cloud computing. With the costs spread over fewer users than a public cloud (but more than a single tenant), this option is more expensive but may offer a higher level of privacy, security and/or policy (FDA) compliance.
Dedicated Instances are Amazon EC2 instances launched within your Amazon Virtual Private Cloud (Amazon VPC) that run on hardware dedicated to a single customer.
NOTE: hardware dedicated to a single customer
Dedicated Instances let you take full advantage of the benefits of Amazon VPC and the AWS cloud – on-demand elastic provisioning, pay only for what you use, and a private, isolated virtual network, all while ensuring that your Amazon EC2 compute instances will be isolated at the hardware level.
You can easily create a VPC that contains dedicated instances only, providing physical isolation for all Amazon EC2 compute instances launched into that VPC, or you can choose to mix both dedicated instances and non-dedicated instances within the same VPC based on application-specific requirements.
To get started using Dedicated Instances within an Amazon VPC, perform the following steps: