Cloud Computing – Introduction

Although cloud computing is only a different way to deliver computer resources, rather than a new technology, it has sparked a revolution in the way organizations provide information and services.


Originally IT was dominated by mainframe computing. This sturdy configuration eventually gave way to the client-server model. Contemporary IT is increasingly a function of mobile technology, pervasive or ubiquitous computing, and of course, cloud computing. But this revolution, like every revolution, contains components of the past from which it evolved.

Thus, to put cloud computing in the proper context, keep in mind that the DNA of cloud computing is essentially inherited from its predecessor systems. In many ways, this momentous change is a matter of “back to the future” rather than a definitive end of the past. In the brave new world of cloud computing, there is room for innovative collaboration with cloud technology and for the proven utility of predecessor systems, such as the powerful mainframe. This fundamental change in how we compute provides immense opportunities for IT personnel to take the reins of change and use them to their individual and institutional advantage.

Cloud computing is a comprehensive solution that delivers IT as a service. It is an Internet-based computing solution where shared resources are provided like electricity distributed on the electrical grid. Computers in the cloud are configured to work together and the various applications use the collective computing power as if they are running on a single system.

The flexibility of cloud computing is a function of the allocation of resources on demand. This facilitates the use of the system’s cumulative resources, negating the need to assign specific hardware to a task. Before cloud computing, websites and server-based applications were executed on a specific system. With the advent of cloud computing, resources are used as an aggregated virtual computer. This amalgamated configuration provides an environment where applications execute independently without regard for any particular configuration.

Cloud computing building blocks

The cloud computing model is composed of a front end and a back end. These two elements are connected through a network, in most cases the Internet. The front end is the vehicle by which the user interacts with the system; the back end is the cloud itself. The front end is composed of a client computer, or the computer network of an enterprise, and the applications used to access the cloud. The back end provides the applications, computers, servers, and data storage that create the cloud of services.

Layers: Computing as a commodity

The cloud concept is built on layers, each providing a distinct level of functionality. This stratification of the cloud’s components has provided a means for the layers of cloud computing to become a commodity just like electricity, telephone service, or natural gas. The commodity that cloud computing sells is computing power at a lower cost to the user. Cloud computing is poised to become the next mega-utility service.

The virtual machine monitor (VMM) provides the means for simultaneous use of cloud facilities. A VMM is a program on a host system that lets one computer support multiple, identical execution environments. From the user’s point of view, the system is a self-contained computer, isolated from other users. In reality, every user is being served by the same machine. A virtual machine is an operating system (OS) instance managed by an underlying control program that allows one physical machine to appear as multiple operating systems. In cloud computing, the VMM allows users to monitor and thus manage aspects of the process such as data access, data storage, encryption, addressing, topology, and workload movement.

A Virtual Machine Monitor (VMM) is a software program that enables the creation, management and governance of virtual machines (VM) and manages the operation of a virtualized environment on top of a physical host machine.

A VMM is also known as a virtual machine manager or hypervisor, although the architectural implementation and the services provided differ by vendor product.

 Virtual Machine Monitor (VMM)

VMM is the primary software behind virtualization environments and implementations. When installed over a host machine, VMM facilitates the creation of VMs, each with separate operating systems (OS) and applications. VMM manages the backend operation of these VMs by allocating the necessary computing, memory, storage and other input/output (I/O) resources.

VMM also provides a centralized interface for managing the entire operation, status and availability of VMs that are installed over a single host or spread across different and interconnected hosts.
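
As a toy illustration of the resource-allocation role described above, the Python sketch below models a host whose CPU and memory pool is carved up among isolated VM records. It is a conceptual simulation only, with invented class and method names, not the interface of any real hypervisor.

```python
# Conceptual sketch of a VMM allocating host resources to virtual machines.
# This only models the bookkeeping; real hypervisors enforce isolation in hardware.
class VirtualMachineMonitor:
    def __init__(self, total_cpus, total_mem_gb):
        self.free_cpus = total_cpus
        self.free_mem_gb = total_mem_gb
        self.vms = {}

    def create_vm(self, name, cpus, mem_gb, os_image):
        """Carve a new, isolated execution environment out of the host's pool."""
        if cpus > self.free_cpus or mem_gb > self.free_mem_gb:
            raise RuntimeError("host does not have enough free resources")
        self.free_cpus -= cpus
        self.free_mem_gb -= mem_gb
        self.vms[name] = {"cpus": cpus, "mem_gb": mem_gb, "os": os_image}
        return self.vms[name]

    def destroy_vm(self, name):
        """Return the VM's resources to the host pool."""
        vm = self.vms.pop(name)
        self.free_cpus += vm["cpus"]
        self.free_mem_gb += vm["mem_gb"]

# A 16-CPU, 64 GB host shared by several tenants, each with its own OS image.
vmm = VirtualMachineMonitor(total_cpus=16, total_mem_gb=64)
vmm.create_vm("tenant-a", cpus=4, mem_gb=16, os_image="linux")
vmm.create_vm("tenant-b", cpus=2, mem_gb=8, os_image="windows")
print("free CPUs:", vmm.free_cpus, "free memory (GB):", vmm.free_mem_gb)
```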

For example, IBM’s VM/ESA can control multiple virtual machines on an IBM S/390 system.

In Microsoft Virtual Server 2005, Virtual Machine Monitor is the proprietary name for a kernel-mode driver that functions as a firewall between the host OS and the virtual machines. It can prevent any single program, running in one of the virtual machines, from overusing the resources of the host OS.

A cloud server is a logical server that is built, hosted, and delivered through a cloud computing platform over the Internet. Cloud servers possess and exhibit similar capabilities and functionality to a typical server but are accessed remotely from a cloud service provider. A cloud server may also be called a virtual server or virtual private server.
 

A cloud server is primarily delivered through the Infrastructure as a Service (IaaS) cloud service model. There are two types of cloud server: logical and physical. A cloud server is considered logical when it is delivered through server virtualization. In this delivery model, the physical server is logically partitioned into two or more logical servers, each of which has a separate OS, user interface, and applications, although they share physical components of the underlying physical server.

A physical cloud server is also accessed remotely through the Internet, but it is not shared or distributed. This is commonly known as a dedicated cloud server.

The following are the layers the cloud provides:

    The infrastructure layer is the foundation of the cloud. It consists of the physical assets — servers, network devices, storage disks, etc. Infrastructure as a Service (IaaS) has providers such as the IBM® Cloud. Using IaaS, the IT organization does not actually control the underlying infrastructure, but it does control the operating systems, storage, and deployed applications, and, to a limited degree, select networking components.

    Print On Demand (POD) services are an example of organizations that can benefit from IaaS. The POD model is based on the selling of customizable products. PODs allow individuals to open shops and sell designs on products. Shopkeepers can upload as many or as few designs as they can create. Many upload thousands. With cloud storage capabilities, a POD can provide unlimited storage space.

    The middle layer is the platform. It provides the application infrastructure. Platform as a Service (PaaS) provides access to operating systems and associated services. It provides a way to deploy applications to the cloud using programming languages and tools supported by the provider. IT organizations do not have to manage or control the underlying infrastructure, but they do have control over the deployed applications and, to some degree, over the application-hosting environment configuration.

    Providers such as Amazon, with its Elastic Compute Cloud (EC2), supply the underlying infrastructure on which many PaaS environments are built. The small entrepreneurial software house is an ideal enterprise for PaaS: with a ready-made platform, world-class products can be created without the overhead of in-house production.

Amazon Elastic Compute Cloud (Amazon EC2) is a cloud infrastructure offered under Amazon Web Services (AWS) that provides raw computing resources on demand.

Amazon EC2 provides computing instances that can be scalable in terms of computing power and memory, flexible by providing the option to host applications on multiple different platforms, and secure thanks to a tightly coupled multi-tenant architecture. Amazon EC2 enables the provision of a virtual server, which can incorporate massive amounts of computing power. This is available on a subscription-based utility computing model, and the user is billed only for the resources used.

Amazon EC2 is also known as Amazon Web Services EC2 (AWS EC2).

Amazon Elastic Compute Cloud is a pioneering cloud infrastructure product that allows users to create powerful virtual servers on demand. Amazon EC2 is built on the server consolidation/virtualization concept, in which the entire computing power of a piece of server hardware can be divided into multiple instances and offered to end users over the Internet as computing instances.

Because the computing instances provided are software based, each unique instance is scalable and users can create an entire virtual data center in the cloud. EC2-created instances can be accessed through Simple Object Access Protocol (SOAP) application programming interface (API) support, giving developers the freedom to create various types of applications, just as with an on-premises computing infrastructure. The instance provided by EC2, commonly known as a virtual machine, is created from an Amazon Machine Image (AMI) and hosted on the Xen hypervisor, server virtualization software.
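
As a concrete, hedged illustration of provisioning an EC2 instance programmatically, the sketch below uses the boto3 Python SDK. The region, AMI ID, and instance type are placeholders rather than recommendations, and AWS credentials are assumed to be configured in the environment.

```python
# Minimal sketch: launching an on-demand EC2 instance with the boto3 SDK.
# The region, AMI ID, and instance type below are illustrative placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Request a single small instance from a (placeholder) machine image.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched instance:", instance_id)

# Usage is billed only while the instance runs, so shut it down when finished.
ec2.terminate_instances(InstanceIds=[instance_id])
```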

The top layer is the application layer, the layer most people visualize as the cloud. Applications run here and are provided on demand to users. Software as a Service (SaaS) has providers such as Google Pack. Google Pack includes Internet-accessible applications and tools such as Calendar, Gmail, Google Talk, Docs, and many more.

Cloud formations

There are three types of cloud formations: private (on-premises), public, and hybrid.

Public clouds are available to the general public or a large industry group and are owned and provisioned by an organization selling cloud services. A public cloud is what is thought of as the cloud in the usual sense; that is, resources dynamically provisioned over the Internet using web applications from an off-site third-party provider that supplies shared resources and bills on a utility computing basis.

Private clouds exist within a company’s firewall and are managed by the organization. They are cloud services that organizations create and control within the enterprise. Private clouds offer many of the same benefits as public clouds — the major distinction being that your organization is in charge of setting up and maintaining the cloud.

    Hybrid clouds are a combination of the public and the private cloud using services that are in both the public and private space. Management responsibilities are divided between the public cloud provider and the business itself. Using a hybrid cloud, organizations can determine the objectives and requirements of the services to be created and obtain them based on the most suitable alternative.

IT roles in the cloud

Management and administration will likely require greater automation, changing the tasks of the personnel responsible for scripting as code production grows. IT may be consolidating, with less need for hardware and software implementation, but it is also creating new roles. The shift in IT is toward the knowledge worker. In the new paradigm, the technical human assets will have greater responsibility for enhancing and upgrading general business processes.

The developer

The growing use of mobile devices, the popularity of social networking, and other aspects of the evolution of commercial IT processes and systems, will guarantee work for the developer community; however, some of the traditional roles of development personnel will be shifted away from the enterprise’s developers due to the systemic and systematic processes of the cloud configuration model.

A recent IBM survey, “New developerWorks survey shows dominance of cloud computing and mobile application development,” demonstrated that the demand for mobile technology will grow exponentially. This development, along with the rapid acceptance of cloud computing across the globe, will necessitate a radical increase in the number of developers who understand this area. To meet the growing need for mobile connectivity, more developers will be required who understand how cloud computing works.

Cloud computing provides an almost unlimited capacity, eliminating scalability concerns. Cloud computing gives developers access to software and hardware assets that most small and mid-sized enterprises could not afford. Developers, using Internet-driven cloud computing and the assets that are a consequence of this configuration, will have access to resources that most could have only dreamed of in the recent past.

The administrator

Administrators are the guardians and legislators of an IT system. They are responsible for controlling user access to the network. This means overseeing the creation of user passwords and the formulation of rules and procedures for such fundamental functionality as general access to the system assets. The advent of cloud computing will necessitate adjustments to this process, since the administrator in such an environment is no longer concerned merely with internal matters, but also with the external relationship between the enterprise and the cloud computing provider, as well as the actions of other tenants in a public cloud.

This alters the role of the firewall constructs put in place by the administration and the nature of the enterprise’s general security procedures. It does not negate the need for a guardian of the system. With cloud computing comes even greater responsibility, not less. Under cloud computing, administrators must not only secure data and systems internal to the organization; they must also monitor and manage the cloud to ensure the safety of their system and data everywhere.

The architect

The function of the architect is the effective modeling of the given system’s functionality in the real IT world. The basic responsibility of the architect is the development of the architectural framework of the agency’s cloud computing model. The architecture of cloud computing is essentially the abstraction of the three-layer construct (IaaS, PaaS, and SaaS) in such a way that the particular enterprise deploying the cloud computing approach meets its stated goals and objectives. The abstraction of the layers’ functionality is developed so that decision-makers and foot soldiers alike can use it to plan, execute, and evaluate the efficacy of the IT system’s procedures and processes.

The role of the architect in the age of cloud computing is to conceive and model a functional interaction of the cloud’s layers. The architect must use the abstraction as a means to ensure that IT is playing its proper role in the attainment of organizational objectives.

To cloud or not to cloud: Risk assessment

The main concerns voiced by those moving to the cloud are security and privacy. The companies supplying cloud computing services know this and understand that without reliable security, their businesses will collapse. So security and privacy are high priorities for all cloud computing entities.

Governance addresses the question of how industry standards will be monitored.

Governance is the primary responsibility of the owner of a private cloud and the shared responsibility of the service provider and service consumer in the public cloud. However, given elements such as transnational terrorism, denial of service, viruses, worms and the like — which do or could have aspects beyond the control of either the private cloud owner or public cloud service provider and service consumer — there is a need for some kind of broader collaboration, particularly on the global, regional, and national levels. Of course, this collaboration has to be instituted in a manner that will not dilute or otherwise harm the control of the owner of the process or subscribers in the case of the public cloud.

Bandwidth requirements

If you are going to adopt the cloud framework, bandwidth and the potential bandwidth bottleneck must be evaluated in your strategy.

Virtualization implementers found that the key bottleneck to virtual machine density is memory capacity; a new generation of servers with much larger memory footprints has removed memory as a system bottleneck. Cloud computing sidesteps the issue of machine density altogether: sorting that out becomes the responsibility of the cloud provider, freeing the cloud user from worrying about it.

For cloud computing, bandwidth to and from the cloud provider is a bottleneck.

In today’s market, the best answer is the blade server. A blade server is a server that has been optimized to minimize the use of physical space and energy. One of the huge advantages of the blade server for cloud computing use is bandwidth speed improvement. For example, the IBM BladeCenter is designed to run high-performance computing workloads quickly and efficiently. Just as the memory issue had to be overcome to alleviate the bottleneck of high virtual machine density, the bottleneck of cloud computing bandwidth must also be overcome, so look to the capabilities of your provider to determine whether bandwidth will be a major performance issue.

Financial impact

Because a sizable proportion of the cost in IT operations comes from administrative and management functions, the implicit automation of some of these functions will per se cut costs in a cloud computing environment. Automation can reduce the error factor and the cost of the redundancy of manual repetition significantly.

There are other contributors to financial problems such as the cost of maintaining physical facilities, electrical power usage, cooling systems, and of course administration and management factors. As you can see, bandwidth is not alone, by any means.

Mitigate the risk

Consider these possible risks:

  •     Adverse impact of mishandling of data.
  •     Unwarranted service charges.
  •     Financial or legal problems of vendor.
  •     Vendor operational problems or shutdowns.
  •     Data recovery and confidentiality problems.
  •     General security concerns.
  •     Systems attacks by external forces.

With the use of systems in the cloud, there is the ever present risk of data security, connectivity, and malicious actions interfering with the computing processes. However, with a carefully thought out plan and methodology of selecting the service provider, and an astute perspective on general risk management, most companies can safely leverage this technology.

In conclusion

In this revolutionary new era, cloud computing can provide organizations with the means and methods needed to ensure financial stability and high quality service. Of course, there must be global cooperation if the cloud computing process is to attain optimal security and general operational standards. With the advent of cloud computing it is imperative for us all to be ready for the revolution.


Cloud computing – A perspective

Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a utility (like the electricity grid) over a network (typically the Internet).

Cloud computing entrusts services (typically centralized) with a user’s data, software and computation on a published application programming interface (API) over a network. It has considerable overlap with software as a service (SaaS).

End users access cloud-based applications through a web browser or a lightweight desktop or mobile app while the business software and data are stored on servers at a remote location. Cloud application providers strive to give the same or better service and performance than if the software programs were installed locally on end-user computers.

At the foundation of cloud computing is the broader concept of infrastructure convergence (or Converged Infrastructure) and shared services. This type of data centre environment allows enterprises to get their applications up and running faster, with easier manageability and less maintenance, and enables IT to more rapidly adjust IT resources (such as servers, storage, and networking) to meet fluctuating and unpredictable business demand.

Cloud computing providers offer their services according to three fundamental models: Infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS) where IaaS is the most basic and each higher model abstracts from the details of the lower models.

Platform as a Service (PaaS) is a way to rent hardware, operating systems, storage and network capacity over the Internet. The service delivery model allows the customer to rent virtualized servers and associated services for running existing applications or developing and testing new ones.

Platform as a Service (PaaS) is an outgrowth of Software as a Service (SaaS), a software distribution model in which hosted software applications are made available to customers over the Internet. PaaS has several advantages for developers. With PaaS, operating system features can be changed and upgraded frequently. Geographically distributed development teams can work together on software development projects. Services can be obtained from diverse sources that cross international boundaries. Initial and ongoing costs can be reduced by the use of infrastructure services from a single vendor rather than maintaining multiple hardware facilities that often perform duplicate functions or suffer from incompatibility problems. Overall expenses can also be minimized by unification of programming development efforts.

On the downside, PaaS involves some risk of “lock-in” if offerings require proprietary service interfaces or development languages. Another potential pitfall is that the flexibility of offerings may not meet the needs of some users whose requirements rapidly evolve.

 Infrastructure as a Service

In this most basic cloud service model, cloud providers offer computers (as physical or, more often, as virtual machines), raw (block) storage, firewalls, load balancers, and networks. IaaS providers supply these resources on demand from their large pools installed in data centers. Local area networks, including IP addresses, are part of the offer. For wide area connectivity, the Internet can be used or, in carrier clouds, dedicated virtual private networks can be configured.

To deploy their applications, cloud users then install operating system images on the machines as well as their application software. In this model, it is the cloud user who is responsible for patching and maintaining the operating systems and application software. Cloud providers typically bill IaaS services on a utility computing basis, that is, cost will reflect the amount of resources allocated and consumed.
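
To make the utility-billing idea concrete, here is a small illustrative calculation of how an IaaS bill might be metered from allocated resources and hours consumed. The hourly rates and usage figures are hypothetical, not any provider’s actual pricing.

```python
# Illustrative sketch of utility-style IaaS metering.
# The hourly rates and usage figures below are hypothetical examples.
HOURLY_RATES = {
    "vcpu": 0.02,          # price per vCPU-hour
    "ram_gb": 0.005,       # price per GB-hour of memory
    "storage_gb": 0.0001,  # price per GB-hour of block storage
}

def bill(vcpus, ram_gb, storage_gb, hours_used):
    """Cost reflects the amount of resources allocated and the time consumed."""
    per_hour = (vcpus * HOURLY_RATES["vcpu"]
                + ram_gb * HOURLY_RATES["ram_gb"]
                + storage_gb * HOURLY_RATES["storage_gb"])
    return per_hour * hours_used

# A 2-vCPU, 4 GB instance with 100 GB of storage, running for 720 hours.
print(f"Estimated charge: ${bill(2, 4, 100, 720):.2f}")
```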

Platform as a Service (PaaS)

In the PaaS model, cloud providers deliver a computing platform and/or solution stack typically including operating system, programming language execution environment, database, and web server. Application developers can develop and run their software solutions on a cloud platform without the cost and complexity of buying and managing the underlying hardware and software layers. With some PaaS offers, the underlying compute and storage resources scale automatically to match application demand such that the cloud user does not have to allocate resources manually.
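
The automatic scaling described above can be sketched as a simple control loop. The thresholds and instance limits below are arbitrary illustrative values, not figures from any particular PaaS provider.

```python
# Toy autoscaling decision, illustrating how a PaaS platform might match
# the number of running instances to demand without manual intervention.
import math

MIN_INSTANCES, MAX_INSTANCES = 1, 20
TARGET_RPS_PER_INSTANCE = 100  # illustrative capacity of one instance

def desired_instances(observed_rps: float) -> int:
    """How many instances the platform should run for the observed demand."""
    needed = math.ceil(observed_rps / TARGET_RPS_PER_INSTANCE)
    return max(MIN_INSTANCES, min(MAX_INSTANCES, needed))

# Demand rises from 150 to 1,250 requests/second: the platform scales out.
print(desired_instances(150))    # -> 2
print(desired_instances(1250))   # -> 13
```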

 Software as a Service (SaaS)

In this model, cloud providers install and operate application software in the cloud, and cloud users access the software from cloud clients. The cloud users do not manage the cloud infrastructure and platform on which the application is running. This eliminates the need to install and run the application on the cloud user’s own computers, simplifying maintenance and support. What makes a cloud application different from other applications is its elasticity. This can be achieved by cloning tasks onto multiple virtual machines at run time to meet the changing work demand. Load balancers distribute the work over the set of virtual machines. This process is transparent to the cloud user, who sees only a single access point. To accommodate a large number of cloud users, cloud applications can be multitenant, that is, any machine serves more than one cloud user organization.
It is common to refer to special types of cloud-based application software with a similar naming convention: desktop as a service, business process as a service, test environment as a service, communication as a service.
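
As a rough illustration of the cloning and load-balancing behaviour described above, this toy sketch clones worker “VMs” when demand grows and distributes requests round-robin. It is an in-process simulation with made-up names, not any vendor’s implementation.

```python
# Toy simulation of elasticity: clone workers to meet demand and spread
# requests across them with a round-robin load balancer.
from itertools import cycle

class CloudApp:
    def __init__(self):
        self.workers = ["vm-1"]          # start with a single virtual machine
        self._rr = cycle(self.workers)

    def scale_to(self, n):
        """Clone the task onto n virtual machines at run time."""
        self.workers = [f"vm-{i + 1}" for i in range(n)]
        self._rr = cycle(self.workers)

    def handle(self, request):
        # The user sees a single access point; the balancer picks the worker.
        worker = next(self._rr)
        return f"{worker} handled {request}"

app = CloudApp()
app.scale_to(3)                          # demand grows, so clone to 3 VMs
for r in ["req-a", "req-b", "req-c", "req-d"]:
    print(app.handle(r))
```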

The pricing model for SaaS applications is typically a monthly or yearly flat fee per user.

Essential Characteristics:

On-demand self-service. A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service provider.
Broad network access. Capabilities are available over the network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, tablets, laptops, and workstations).
Resource pooling. The provider’s computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand. There is a sense of location independence in that the customer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher level of abstraction (e.g., country, state, or datacenter). Examples of resources include storage, processing, memory, and network bandwidth.
Rapid elasticity. Capabilities can be elastically provisioned and released, in some cases automatically, to scale rapidly outward and inward commensurate with demand. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be appropriated in any quantity at any time.
Measured service. Cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and consumer of the utilized service.


Cloud clients

Users access cloud computing using networked client devices, such as desktop computers, laptops, tablets, and smartphones. Some of these devices – cloud clients – rely on cloud computing for all or a majority of their applications, so that they are essentially useless without it. Examples are thin clients and the browser-based Chromebook. Many cloud applications do not require specific software on the client and instead use a web browser to interact with the cloud application. With Ajax and HTML5, these web user interfaces can achieve a similar or even better look and feel than native applications. Some cloud applications, however, require specific client software dedicated to these applications (e.g., virtual desktop clients and most email clients). Some legacy applications (line-of-business applications that until now have been prevalent in thin client Windows computing) are delivered via a screen-sharing technology.

 Deployment models

Public cloud
Applications, storage, and other resources are made available to the general public by a service provider. Public cloud services may be free or offered on a pay-per-usage model. A limited number of service providers, such as Microsoft and Google, own all the infrastructure at their data centers, and access is provided over the Internet only; no direct connectivity is offered in a public cloud architecture.

Community cloud
Community cloud shares infrastructure between several organizations from a specific community with common concerns (security, compliance, jurisdiction, etc.), whether managed internally or by a third-party and hosted internally or externally. The costs are spread over fewer users than a public cloud (but more than a private cloud), so only some of the cost savings potential of cloud computing are realized.

 Hybrid cloud
Hybrid cloud is a composition of two or more clouds (private, community or public)
that remain unique entities but are bound together, offering the benefits of multiple deployment models.

 Private cloud
Private cloud is cloud infrastructure operated solely for a single organization, whether managed internally or by a third-party and hosted internally or externally.

They have attracted criticism because users “still have to buy, build, and manage them” and thus do not benefit from less hands-on management, essentially “[lacking] the economic model that makes cloud computing such an intriguing concept”.

Private cloud is a computing model that uses resources which are dedicated to your organization. A private cloud shares many of the characteristics of public cloud computing including resource pooling, self-service, elasticity and pay-by-use delivered in a standardized manner with the additional control and customization available from dedicated resources.
While virtualization is an important technological component of the private cloud, the key differentiator is the continued abstraction of computing resources from the infrastructure and the machines (virtual or otherwise) used to deliver those resources.
Only by delivering this abstraction can customers achieve the benefits of private cloud – including improved agility and responsiveness, reduced TCO, and increased business alignment and focus. Most importantly, a private cloud promises to exceed the cost effectiveness of a virtualized infrastructure through higher workload density and greater resource utilization.

With a private cloud, you get many of the benefits of public cloud computing—including self-service, scalability, and elasticity—with the additional control and customization available from dedicated resources.
Two models for cloud services can be delivered in a private cloud:
Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). With IaaS, you can use infrastructure resources (compute, network, and storage) as a service, while PaaS provides a complete application platform as a service.

 Architecture

Cloud architecture, the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over a loose coupling mechanism such as a messaging queue.
Elastic provision implies intelligence in the use of tight or loose coupling as applied to mechanisms such as these and others.
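
As a minimal sketch of loosely coupled components communicating over a message queue, the example below uses Python’s standard-library queue as a stand-in for a hosted messaging service; the component names are invented for illustration.

```python
# Two loosely coupled components exchanging work through a message queue.
# queue.Queue stands in here for a hosted messaging service.
import queue
import threading

messages = queue.Queue()

def web_front_end():
    """Producer: accepts requests and enqueues them without knowing the worker."""
    for order_id in (101, 102, 103):
        messages.put({"type": "process_order", "order_id": order_id})
    messages.put(None)  # sentinel: no more work

def back_end_worker():
    """Consumer: processes whatever arrives on the queue, at its own pace."""
    while (msg := messages.get()) is not None:
        print(f"worker handled order {msg['order_id']}")

t = threading.Thread(target=back_end_worker)
t.start()
web_front_end()
t.join()
```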

 The Inter-cloud

The Inter-cloud is an interconnected global “cloud of clouds” and an extension of the Internet “network of networks” on which it is based.

 Cloud engineering
Cloud engineering is the application of engineering disciplines to cloud computing. It brings a systematic approach to the high level concerns of commercialization, standardization, and governance in conceiving, developing, operating and maintaining cloud computing systems. It is a multidisciplinary method encompassing contributions from diverse areas such as systems, software, web, performance, information, security, platform, risk, and quality engineering.

Private cloud solutions from Microsoft and VMware

Microsoft private cloud solutions are licensed on a per processor basis, so customers get the cloud computing benefits of scale with unlimited virtualization and lower costs – consistently and predictably over time.

VMware private cloud solutions are licensed by either the number of virtual machines or the virtual memory allocated to those virtual machines, charging you more as you grow. This difference in approach means that with Microsoft your private cloud ROI increases as your private cloud workload density increases, while with VMware cost grows as workload density does. The main driver of this cost differential is VMware’s per-VM licensing for private cloud products such as the VMware vCenter Operations Management Suite.

Economics has always been a powerful force in driving industry transformations and as more and more customers evaluate cloud computing investments that will significantly affect ROI, now is the time to provide the information they need to make informed decisions, for today and tomorrow.

Microsoft Private Cloud – Unlimited Virtualization Rights
Microsoft private cloud solutions are built using Windows Server with Hyper-V and System Center – the combination of which provides enterprise-class virtualization, end-to-end service management, and deep insight into applications so users can focus more attention on delivering business value.
Microsoft private cloud solutions are delivered through a wide ecosystem of partners and are offered as custom, pre-configured, or hosted offerings, so no matter the unique business need, there is a Microsoft private cloud solution for it.
The Microsoft private cloud solution is licensed through the Microsoft Enrollment for Core Infrastructure (ECI) licensing program. ECI is a Microsoft Enterprise Agreement (EA) enrollment, available in two editions (Datacenter and Standard), that allows a simple and flexible per-processor licensing option. Its approach is focused on delivering the benefits of scale through unlimited virtualization rights and significantly simplified licensing for Windows Server and System Center.

VMware Private Cloud – Per-VM Licensing
In 2011, VMware announced the latest version of its virtualization platform, vSphere 5.0, along with updated versions of surrounding technologies: vCenter Site Recovery Manager, vShield Security, and vCloud Director.

These products are collectively referred to as the Cloud Infrastructure Suite. VMware has also released several management products, such as vCenter Operations Management Suite and vFabric Application Performance Manager (APM), to provide capabilities like monitoring, application performance management, and configuration management. To build a comparable private cloud solution using VMware technologies, customers require components from the VMware Cloud Infrastructure Suite, vCenter Operations Management Suite, and vFabric APM, since a private cloud solution requires capabilities such as monitoring, configuration, automation, orchestration, and security in addition to the virtualization platform.
Unlike Microsoft ECI Datacenter, the VMware Cloud Infrastructure Suite, vCenter Operations Management Suite, and vFabric APM cannot be licensed as a single SKU; the individual products must be licensed separately. Moreover, VMware private cloud products follow a combination of three different licensing schemes:

  •     vSphere 5.0 is licensed on a per-processor basis with virtual RAM entitlements.
  •     vCenter is licensed on a per-instance basis.
  •     Cloud Infrastructure products (vCloud Director, vCenter Site Recovery Manager, and vShield), as well as the vCenter Operations Management Suite and vFabric APM, are licensed on a per-VM basis.
Microsoft® offers solutions that deliver IaaS and PaaS for both private and public cloud deployments. This discussion focuses on Microsoft solutions for IaaS and provides an overview of Microsoft Hyper-V™ Cloud, a set of programs and initiatives to help customers and partners accelerate deployment of IaaS.

Organizations can build a private cloud today with Windows Server® 2008 R2, Microsoft Hyper-V, and Microsoft System Center.
The foundation is built on the Windows Server platform with the Windows Server Active Directory® identity framework, Hyper-V virtualization capability, and System Center end-to-end service management capabilities.
The new System Center Virtual Machine Manager Self-Service Portal 2.0 simplifies the pooling, allocation, provisioning, and usage tracking of datacenter resources, so that your business units can consume Infrastructure as a Service.

Cloud app vs. web app: Understanding the differences

The line between a cloud app and a web app remains as blurry as ever. This of course stems from the natural similarities that exist between them. However, there are noteworthy differences, especially when looking to leverage cloud apps for a richer user customization experience and seamless integration with the resilient and scalable back-end infrastructure that often characterizes public cloud services.

Cloud app

A cloud app is the evolved web app. Like a web app, it is used to access online services over the Internet, but it is not always exclusively dependent on a web browser to work. A customizable, multi-tenant cloud app may be available solely through the web browser from a service provider, but quite often the web interface is used as an alternative access method to the custom-built cloud app for online services.

Cloud apps are usually characterized by advanced features such as:

  • Data is stored in a cloud / cloud-like infrastructure
  • Data can be cached locally for full-offline mode
  • Support for different user requirements, e.g., data backup cloud app with different features such as data compression, security, backup schedule
  • Can be used from web browser and/or custom built apps installed on Internet connected devices such as desktops, mobile phones
  • Can be used to access a wider range of services such as on-demand computing cycle, storage, application development platforms

Examples of cloud apps

Some common examples include Mozy, Evernote, SugarSync, Salesforce, Dropbox, NetSuite, and Zoho.com. Other qualifying examples, such as web email (Google, Yahoo, Microsoft Hotmail, etc.), may not be so obvious, but they depend on cloud technology and can work offline if consumers choose to configure them that way.

There are numerous websites where you can find useful information on cloud apps; www.getapp.com is particularly informative. It includes cloud app reviews and ratings to evaluate the apps.

Web apps

Web apps, on the other hand, are almost exclusively designed to be used from a web browser. A combination of server-side script (ASP, PHP, etc.) and client-side script (HTML, JavaScript, Adobe Flash) is commonly used to develop the web application. The web browser (thin client) relies on the web server components installed on back-end infrastructure systems for the heavy lifting in providing its core functional web services.
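
As a minimal illustration of this split, here is a tiny server-side component written with the Flask framework (chosen only as an example); the browser acts as the thin client and simply renders the HTML the server returns.

```python
# Minimal server-side component of a web app, using Flask as an example.
# The browser (thin client) only renders the HTML this server returns.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    # Heavy lifting happens server-side; the client just displays the result.
    return "<h1>Hello from the web server</h1>"

if __name__ == "__main__":
    app.run(port=8080)  # illustrative local port
```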

The obvious benefit that this computing model provides over the traditional desktop app is that it is accessible from anywhere via the web browser. Cloud apps can also be accessed this way.

Examples of web apps

For many, web services such as WebEx, electronic banking, online shopping applications, and eBay fall into this category in as much as they are exclusively web-based with limited options for consumer customization.

Conclusion

Application service providers have been quick to exploit advantages brought about by pioneering web app building framework technologies for greater customer reach. However these technologies are not necessarily optimized for building new apps for the cloud era.

Cloud apps are web apps in the sense that they can be used through web browsers but not all web apps are cloud apps. Software vendors often bundle web apps to sell as “cloud” apps simply because it’s the latest buzz-word technology, but web apps do not offer the same richness in functionality and customization you’ll get from cloud apps. So, buyer beware!

Some software application vendors also falsely think that just because their application runs on the web, this automatically qualifies it to be a cloud app. This is not always the case. For your web app to evolve into a cloud app, it should exhibit certain properties such as

  • True multi-tenancy to support various requirements & needs for consumers
  • Support for virtualization technology, which plays a starring role for cloud era apps. Web applications should either be built to support this or re-engineered to do so

The good news is that vendors looking to move into this cloud app space now have rich development platforms and frameworks to choose from, whether migrating from an existing web app or starting from scratch. These new-age cloud app development platforms are affordable and agile, reducing time to market and software development complexity.

VMware Cloud Foundry, Google App Engine, Microsoft Azure, Appcara, Salesforce (Heroku and Force.com), AppFog, Engine Yard, Standing Cloud, and Mendix are examples of such development platforms offering cloud-based technology for building modern applications.

Systems/concepts similar to cloud computing

Cloud computing shares characteristics with:

  • Autonomic computing — Computer systems capable of self-management.
  • Client–server model — Client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients).
  • Grid computing — “A form of distributed and parallel computing, whereby a ‘super and virtual computer’ is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks.”
  • Mainframe computer — Powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, police and secret intelligence services, enterprise resource planning, and financial transaction processing.
  • Utility computing — The packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity.
  • Peer-to-peer — Distributed architecture without the need for central coordination, with participants being at the same time both suppliers and consumers of resources (in contrast to the traditional client–server model).
  • Cloud gaming – Also called on-demand gaming, a way of delivering games to computers. The gaming data is stored on the provider’s servers, so that gaming is independent of the client computers used to play the game.

Characteristics


Cloud computing exhibits the following key characteristics:

  • Agility improves with users’ ability to re-provision technological infrastructure resources.
  • Application programming interface (API) accessibility to software that enables machines to interact with cloud software in the same way the user interface facilitates interaction between humans and computers. Cloud computing systems typically use REST-based APIs (see the sketch after this list).
  • Cost is claimed to be reduced and in a public cloud delivery model capital expenditure is converted to operational expenditure. This is purported to lower barriers to entry, as infrastructure is typically provided by a third-party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained with usage-based options and fewer IT skills are required for implementation (in-house). The e-FISCAL project’s state of the art repository contains several articles looking into cost aspects in more detail, most of them concluding that costs savings depend on the type of activities supported and the type of infrastructure available in-house.
  • Device and location independence enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third-party) and accessed via the Internet, users can connect from anywhere.
  • Virtualization technology allows servers and storage devices to be shared and utilization be increased. Applications can be easily migrated from one physical server to another.
  • Multi-tenancy enables sharing of resources and costs across a large pool of users thus allowing for:
    • Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
    • Peak-load capacity increases (users need not engineer for highest possible load-levels)
    • Utilization and efficiency improvements for systems that are often only 10–20% utilized.
  • Reliability is improved if multiple redundant sites are used, which makes well-designed cloud computing suitable for business continuity and disaster recovery.[33]
  • Scalability and elasticity via dynamic (“on-demand”) provisioning of resources on a fine-grained, self-service basis near real-time, without users having to engineer for peak loads.
  • Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface.
  • Security could improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels. Security is often as good as or better than other traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford. However, the complexity of security is greatly increased when data is distributed over a wider area or greater number of devices and in multi-tenant systems that are being shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users’ desire to retain control over the infrastructure and avoid losing control of information security.
  • Maintenance of cloud computing applications is easier, because they do not need to be installed on each user’s computer and can be accessed from different places.
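
To illustrate the REST-based API style mentioned in the list above, the sketch below queries a hypothetical cloud provider endpoint with the Python requests library. The URL, token, and response fields are invented and do not belong to any real service.

```python
# Illustrative REST API call to a hypothetical cloud provider.
# The endpoint, token, and response fields are made up for this sketch.
import requests

API_BASE = "https://cloud.example.com/api/v1"      # hypothetical endpoint
HEADERS = {"Authorization": "Bearer <your-token>"}  # placeholder credential

# List the caller's virtual machines (GET = read, in REST conventions).
resp = requests.get(f"{API_BASE}/instances", headers=HEADERS, timeout=10)
resp.raise_for_status()
for instance in resp.json().get("instances", []):
    print(instance.get("id"), instance.get("state"))
```
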
On demand self service

On-demand self-service allows users to obtain, configure, and deploy cloud services themselves using cloud service catalogues, without requiring the assistance of IT.[38][39] This feature is listed by the National Institute of Standards and Technology (NIST) as a characteristic of cloud computing.

The self-service requirement of cloud computing prompts infrastructure vendors to create cloud computing templates, which are obtained from cloud service catalogues. Manufacturers of such templates or blueprints include Hewlett-Packard (HP), which names its templates HP Cloud Maps; RightScale; and Red Hat, which names its templates CloudForms.

The templates contain predefined configurations used by consumers to set up cloud services. The templates or blueprints provide the technical information necessary to build ready-to-use clouds. Each template includes specific configuration details for different cloud infrastructures, with information about servers for specific tasks such as hosting applications, databases, websites, and so on. The templates also include predefined web services, the operating system, the database, security configurations, and load balancing.

Cloud consumers use cloud templates to move applications between clouds through a self-service portal. The predefined blueprints define everything an application requires to run in different environments. For example, a template could define how the same application could be deployed on cloud platforms based on Amazon Web Services, VMware, or Red Hat. The user organisation benefits from cloud templates because the technical aspects of cloud configurations reside in the templates, letting users deploy cloud services with the push of a button. Cloud templates can also be used by developers to create a catalog of cloud services.
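
As a toy sketch of the idea, the snippet below represents a cloud template as a plain Python dictionary together with a function that “deploys” it to a named target. All field names and targets are hypothetical; the point is only that the blueprint captures the technical details so the consumer does not have to.

```python
# Toy representation of a cloud template (blueprint): the technical details
# live in the template, so a consumer can deploy with "a push of a button".
# All field names and targets are hypothetical.
web_app_template = {
    "name": "three-tier-web-app",
    "servers": [
        {"role": "web", "count": 2, "os": "linux", "packages": ["nginx"]},
        {"role": "app", "count": 2, "os": "linux", "packages": ["myapp"]},
        {"role": "db", "count": 1, "os": "linux", "packages": ["postgresql"]},
    ],
    "load_balancer": {"listen": 443, "forward_to": "web"},
    "security": {"open_ports": [443]},
}

def deploy(template, target_cloud):
    """Pretend to provision the blueprint on the chosen cloud platform."""
    for server in template["servers"]:
        for i in range(server["count"]):
            print(f"[{target_cloud}] provisioning {server['role']}-{i + 1} "
                  f"({server['os']}, packages={server['packages']})")

# The same blueprint can target different environments.
deploy(web_app_template, "provider-a")
deploy(web_app_template, "provider-b")
```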

Agility in the Cloud: Big Data and the Cloud

The International Data Corporation has stated that the enterprise software market achieved £342bn in revenue in 2012, a 3.6 per cent growth on the previous year. While the overall growth of the market is slowing – it is less than half of the growth rate seen in 2010 and 2011 – evidence suggests that the increased adoption of big data and cloud has contributed to the improved turnover of the last 12 months.

The findings were presented in the organisation’s ‘Worldwide Semi-annual Software Tracker’. It revealed that data access, analysis and delivery, collaborative applications, CRM applications, security software, and system and network management software represent the fastest-expanding areas of the market; each grew by six to seven per cent last year, about double the rate for enterprise software as a whole.

IDC has said that the larger investments in tools that enable organisations to manage, access and analyse their data indicates the prominent role big data is now playing in this market as greater emphasis is placed on uncovering value from vast data sets. Furthermore, the ever-increasing number of cloud deployments, often underpinning the way big data technologies are procured, has also become integral to the market’s success.

Big Data Startups Investment

In terms of the success of software vendors, the top five based on revenue last year were Microsoft, IBM, Oracle, SAP and Symantec. SAP witnessed the largest growth at 5.1 per cent, followed by Oracle at 3.9 per cent.

Commenting on the research, Henry Morris, senior vice president for worldwide software, services and executive advisory research at IDC, said in a statement: “The global software market, comprised of a multi-layered collection of technologies and solutions, is growing more slowly in this period of economic uncertainty. Yet there is strong growth in selective areas.

“The management and leveraging of information for competitive advantage is driving growth in markets associated with big data and analytics.

“Similarly, rapid growth in cloud deployments is fuelling growth in application areas associated with social business and customer experience. The combination of these forces is advancing the growth to what IDC has termed the third platform.”

Recent years have seen a lot of development in the cloud computing sphere. Big data is believed by many to be here to stay, and a lot of real investment is touted to happen in this particular area. Such a trend is quite exciting, as new, better and more powerful infrastructure will be needed to support all this. So,  a lot of further development is on the way to accommodate these computing perimeters.

Does agility matter?

Your company might have been in the business for many years, or it may be a newcomer to the field – JIT (just in time) deployment of services is important for both types of organization. While small business owners may sometimes think that traditional IT gives them cost control, they really need to consider the agility that the cloud brings to their businesses. Cloud computing, when adopted and executed properly, can help companies tap market opportunities to the best possible effect, thanks to the extended flexibility and agility it has to offer. The recent acquisition of Cloud.com by Citrix clearly shows that interest in cloud computing technology is increasing. It is also expected in the networking space that many emerging SDN players will be acquisition targets for major companies keeping an eye on developments in the sphere.

The most important standards in Cloud

There are a lot of important standards that need to be provisioned into the cloud computing sphere, one of which is the use of a standard API. With projects like CloudStack and OpenStack, a lot of progress is being made in this regard. However, there is still work left to be done on how the abstraction of the various layers is defined, especially the security and networking layers. The work being done with OpenStack is quite commendable, and it could serve as a guiding platform for SDN to become aligned with this vision of abstraction in networking and security.

Big data – the buzz

Big data is not new. It has existed for ages and can be traced back even to the initial years of computing. However, one might do well to consider why there is an increased buzz around it now. The answer is quite simple: significant advances brought about by x86 hardware have helped bring computing power to the masses. With new technologies such as vPlane, cloud computing has extended this power. Now, users have extended perimeters while still being able to control costs effectively. Cloud computing platforms have achieved performance standards that were previously only possible by using ASICs built specifically for that purpose.

3Vs is a term used to define the different attributes of big data – volume, variety and velocity. In 2001, the 3Vs term was coined to define the constructs or attributes that make up an organization’s stored and owned data repositories. 3Vs is now used to define the trends and dimensions of big data.

The 3Vs were discussed in the 2001 paper, 3D Data Management: Controlling Data Volume, Velocity and Variety, by Doug Laney. The paper predicted trends in data warehousing techniques that evolved from 2001 to 2006, particularly in e-commerce-based big data.

3Vs is a data management trend that was conceived to help organizations realize and cope with the emergence of big data. The 3Vs compare the storage, utilization and consumption of data with regards to the three base dimensions, and it encompasses all data forms, regardless of storage location or format, that are eventually compiled as a big data repository.

The Verdict

There is no clear-cut verdict in this domain. Cloud computing has become a boon to network infrastructure, supporting its ever-expanding needs while keeping costs in check. Even so, we have not yet reached the time when end-to-end cloud computing operating systems are a common phenomenon and fewer articles stressing their importance need to be written – perhaps like this one!

Big data’s impact will influence cloud computing in 2013

Cloud computing and big data were two of the most important trends to shape the IT industry in 2012. Looking ahead, both will continue to impact the landscape in a number of ways. According to a recent RedHat report, many businesses implemented cloud-based environments last year as a way to manage the influx of structured and unstructured data.

The report suggested that corporate storage will evolve in 2013 into a “data platform,” rather than a “data destination.”

“As a platform for big data and not just a destination for data storage, enterprise storage solutions will need to deliver cost-effective scale and capacity; eliminate data migration and incorporate the ability to grow without bound; bridge legacy storage silos; ensure global accessibility of data; and protect and maintain the availability of data,” the report explained.

Cloud/big data relationship goes deeper
There is a great deal of industry research that continues to point to the growing importance of both cloud computing and big data. In many cases, both are impacting the other. A recent Markets and Markets report indicated that the global public and private cloud storage market will expand at a compound annual growth rate of 40.2 percent between 2012 and 2018, approaching $47 billion.

The research firm explained that the cloud storage market will reach this level as businesses address the emergence of digital trends that have impacted the volume in which unstructured data is generated. The cloud, which is not only cost-effective but also scalable, is an ideal technology to help companies address this need.

US firms using cloud for storage
The influx of big data has reached an important point that has made it necessary for companies to migrate information to hosted environments. A recent Redwood Software survey found that only 35 percent of U.K. firms are using the cloud for private data storage, compared to nearly 60 percent of U.S. organizations.

Looking ahead to the global cloud storage industry, U.S. companies will have a major role in the growing market. MarketsandMarkets predicted that North America will account for nearly $22 billion worth of the global cloud storage landscape by 2018.

There are a few constants that are expected to remain throughout the entire IT industry moving forward. The fact is that companies are producing more data than ever, making it necessary for businesses to not become overwhelmed with storing this information.

Distributed or grid computing has been powering many big data projects, from gene-cracking to the search for extraterrestrial intelligence. The idea is simple: take huge amounts of data that need to be analyzed, break them into small chunks, and let individual computers tackle those chunks. It is the basis of the popular Hadoop platform for big data. Sure, Hadoop is designed for server clusters, and companies like Amazon have huge numbers of servers that can tackle any given problem. Hence, Amazon Elastic Compute Cloud (Amazon EC2) is a prime way for companies to tackle big data and cloud computing. And while there is little doubt that servers are better at handling such massive data throughput than any smartphone, let’s face it, Amazon doesn’t have a billion and growing servers either. The cost of such expansion would be enormous. Consumers, however, continue to purchase additional phones and constantly upgrade their hardware.

There are already grid computing models that take advantage of the fact that a desktop computer may remain on, idle, and connected for extended periods of time. The popular BOINC platform is such an example. Download the client software and choose a project. When the computer is “idle”, it is actually working on analyzing genes, astronomical data, or any number of projects to which volunteers donate their computer cycles.
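
The “break the data into small chunks and let individual computers tackle them” idea can be sketched in a few lines. The example below uses Python’s multiprocessing pool as a stand-in for the many machines of a grid or Hadoop-style cluster, with a made-up word-count task.

```python
# Toy divide-and-conquer sketch: split a large dataset into chunks and let
# separate workers process them, the core idea behind grid computing and
# map-reduce platforms such as Hadoop. A process pool stands in for a cluster.
from multiprocessing import Pool

def count_words(chunk):
    """Work done on one chunk by one 'machine'."""
    return sum(len(line.split()) for line in chunk)

if __name__ == "__main__":
    data = [f"record {i} with some text" for i in range(1_000)]
    chunk_size = 100
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

    with Pool() as pool:
        partial_counts = pool.map(count_words, chunks)   # "map" phase

    print("total words:", sum(partial_counts))           # "reduce" phase
```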