Lift and shift is one approach among many for migrating applications to the cloud. It means moving an application and its related data to a cloud platform without restructuring the app.
There is no one-size-fits-all transition for moving an application from an on-premises data center to the cloud, but there are well-known core migration paths, and lift and shift is widely considered one of them. It is a way for organizations to safeguard their investments in business workflow, logic, and data trapped in on-premises hardware.
The lift-and-shift approach opens pathways to IT modernization by moving to an open and more extensible architecture in the cloud. Organizations consider lift and shift for solid business reasons, including minimized costs and improved performance and resiliency.
WHEN IS THE LIFT-AND-SHIFT CLOUD-MIGRATION MODEL A GOOD FIT?
With the lift-and-shift approach, on-premises applications can move to the cloud without redesign. But they cannot always take complete advantage of cloud-native features, so this may not be the most cost-effective migration path. Gartner estimates that by 2020, organizations lacking cost-optimization processes will average 40% overspend in the public cloud.
STRATEGIES TO CONSIDER
Rehost: This is infrastructure as a service (IaaS), or lift and shift. The user rehosts the application in another hardware environment without changing its architecture. Migration is fast and comparatively low-priced, but ongoing operation can be costly because the user is not leveraging cloud efficiencies.
Refactor: With refactoring, also known as platform as a service (PaaS), the user runs applications on a cloud provider's infrastructure. Developers can reuse languages, frameworks, and containers, leveraging code that is strategic to the company.
Revise: First, the user addresses legacy-modernization requirements by modifying or extending the existing code, then takes the rehost or refactor route to the cloud. This means the user can leverage the cloud characteristics of the provider's infrastructure, but not without upfront development cost.
Rebuild: The user throws out the code for an existing app and re-architects it. The benefit is access to innovative features in the provider's platform that improve developer productivity. The price the user pays is either lock-in or abandoning the application assets if the situation becomes unacceptable.
Replace: Discard the existing application set and adopt commercial software as a service (SaaS). When requirements for a business function change rapidly, this approach avoids the time and investment of mobilizing a development team. But there can be issues such as inconsistent data semantics, difficult data access, and vendor lock-in.
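The five strategies above can be summarized as a small decision helper. The following sketch is purely illustrative: the yes/no questions and the rules mapping them to a strategy are simplifying assumptions, not a prescriptive framework.

```python
# Illustrative decision helper for the five migration strategies.
# The questions and rules are simplified assumptions for demonstration only.

def choose_strategy(code_available: bool, requirements_stable: bool,
                    needs_cloud_native: bool, budget_for_rework: bool) -> str:
    """Pick a migration strategy from a few coarse yes/no questions."""
    if not code_available:
        # No source code to carry forward: adopt commercial SaaS instead.
        return "Replace"
    if not requirements_stable:
        # Requirements churn rapidly: rebuild if funded, else replace.
        return "Rebuild" if budget_for_rework else "Replace"
    if not needs_cloud_native:
        # Fast, low-cost move with no architectural change.
        return "Rehost"
    # Cloud-native features are needed: refactor if funded, else revise first.
    return "Refactor" if budget_for_rework else "Revise"

print(choose_strategy(True, True, False, False))  # lift and shift -> Rehost
print(choose_strategy(True, True, True, True))    # -> Refactor
```

In practice the decision involves far more factors (compliance, team skills, licensing), but the rule-ordering idea carries over.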
Over the last few years, we have seen rapidly growing interest across the enterprise community in what is commonly called a multi-cloud strategy or multi-cloud infrastructure. Enterprises from all industries are considering and putting in place multi-cloud strategies, virtualizing their infrastructures and choosing a mix of cloud providers rather than depending on a single vendor.
WHAT IS MULTI-CLOUD?
Multi-cloud is the next leap of cloud-enabled IT architecture beyond the hybrid cloud. It refers to an IT design where multiple cloud providers and on-premises private cloud resources are used simultaneously to realize certain business objectives and metrics that are hard to achieve with private-only or hybrid cloud designs.
These business objectives include the freedom of choice to pick and choose best of breed cloud services across public cloud providers, as well as allowing data mobility to eliminate concern over vendor lock-in. Organizations are also looking to achieve enhanced data availability and durability with data sets spread across multiple cloud architectures, and cost optimization with the ability to use the most appropriate cloud pricing scheme for each application across providers. These factors put the cloud customer in charge with optimal leverage and control, thereby fuelling the growing momentum of multi-cloud.
PRINCIPLES OF MULTI-CLOUD
The advantages of multi-cloud outlined above hinge on a number of principles that must be adhered to. The first is the need to normalize data access, control, and security across all clouds behind a single interface, the de facto choice being the Amazon S3 API, assuming a full-fledged implementation including Identity and Access Management (IAM). The second is the need to ensure that data always stays in an open, cloud-native format, so that it can be accessed wherever it resides and moved around freely as required. The third is a clear data-brokering capability that allows data to be placed and moved around automatically based on predefined business rules.
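The third principle, rule-driven data brokering, can be sketched as a tiny placement engine. Everything here is an illustrative assumption: the rule fields (`pii`, `size_gb`, `hot`) and the cloud target names are invented for the example, not any product's real API.

```python
# Toy data-brokering engine: place objects on clouds according to
# predefined business rules, first matching rule wins. Rule fields
# and target names are illustrative assumptions only.

RULES = [
    # (predicate over object metadata, target cloud)
    (lambda obj: obj["pii"], "on-prem-private"),         # sensitive data stays home
    (lambda obj: obj["size_gb"] > 500, "cloud-b-cold"),  # big archives to a cheap tier
    (lambda obj: obj["hot"], "cloud-a-standard"),        # frequently read data
]
DEFAULT_TARGET = "cloud-b-standard"

def place(obj: dict) -> str:
    """Return the target cloud for an object: first matching rule wins."""
    for predicate, target in RULES:
        if predicate(obj):
            return target
    return DEFAULT_TARGET

print(place({"pii": True, "size_gb": 1, "hot": True}))    # -> on-prem-private
print(place({"pii": False, "size_gb": 900, "hot": False}))  # -> cloud-b-cold
```

A real broker would also move existing data when rules change and record placement for auditing, but the ordered-rules core is the same.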
If not implemented correctly and with reasonable safeguards, multi-cloud can intensify the drawbacks and challenges for cloud customers: increased complexity and overhead of data management, reduced flexibility in how data can be accessed and used, and difficulty tracking where data is placed.
Many organizations can already be described as multi-cloud users by default, since the deployment of multiple cloud offerings within a single company has grown organically out of practice over the past few years. Those organizations now need a solution to manage these multiple clouds. Meanwhile, a growing segment of customers are clearly better educated and more practical in their use of cloud services, prompting them to explore and implement multi-cloud strategies deliberately.
As multi-cloud becomes the standard for cloud designs and enjoys mainstream adoption, the next couple of years will bring demand for solutions that fundamentally change cloud storage and data management, giving customers the full power and flexibility of both on-premises storage and public clouds so that they get not only the most value from their data but also the optimal experience in doing so. The rise of multi-cloud will drive the need for tools that manage information across different clouds.
A hardware security module (HSM) is a physical computing device that protects and manages digital keys for strong authentication and provides cryptographic processing. These modules usually come in the form of a plug-in card or an external device that attaches directly to a computer or network server. Azure Key Vault first became available as a public preview in January 2015 and became generally available in June of the same year. It is available in Standard and Premium service levels. There is no setup fee, and users are billed for operations and keys. So, let's dive in and learn more about AKV and the HSM benefits it offers.
To spare users this hardware burden, Microsoft offers Azure Key Vault (AKV) in the cloud. It provides the benefits of an HSM without the work of managing one.
HOW SAFE IS STORING SENSITIVE DATA IN AKV?
Storing data in a database and storing it in an HSM are not the same. The data does not simply sit in a file on the user's server. It is stored in a hardware device, and the device offers many features such as auditing, tamper-proofing, and encryption. What Microsoft provides in the form of AKV is an interface through which the user can access the HSM device in a safe way.
If the user wants further assurance about the integrity of a key, the user can generate it right inside the HSM. Isn't that cool? Microsoft processes keys in FIPS 140-2 Level 2 validated HSMs, and even Microsoft cannot see or extract the user's keys. With logging enabled, you can pipe the logs to Azure HDInsight or SIEM solutions for threat detection.
PATCHING, PROVISIONING, AND OTHER INFRASTRUCTURE ISSUES
Microsoft handles these, just as it does for other managed Azure resources. The SLA guarantees that 99.9% of Key Vault transactions are processed within 5 seconds.
WHAT KIND OF DATA SHOULD BE STORED?
The user can store PFX files and manage SSL certificates using AKV. Database connection strings, social network client keys, subscription keys, license keys, and many more secrets can be stored and managed easily using AKV.
CREATING AND MANAGING KEY VAULTS
Azure Key Vault offers a way to securely store credentials and other keys and secrets, but the user's code needs to authenticate to Key Vault to retrieve them. Managed Service Identity (MSI) makes this problem simpler by giving Azure services an automatically managed identity in Azure Active Directory (Azure AD). The user can use this identity to authenticate to any service that supports Azure AD authentication, including Key Vault, without keeping any credentials in code.
Google Cloud Platform is a set of public cloud computing services offered by Google. The platform includes a range of hosted services for compute, storage, and application development that run on Google hardware. Google Cloud Platform services can be accessed by developers, cloud administrators, and other enterprise IT professionals over the public internet or through a dedicated network connection. Explore this article to learn more about Google Cloud Platform and its benefits.
GOOGLE CLOUD PLATFORM OFFERINGS
Google Cloud Platform offers services for compute, storage, networking, big data, machine learning, and the internet of things (IoT), as well as cloud management, security, and developer tools. The core cloud computing products in Google Cloud Platform include:
Google Compute Engine: An infrastructure-as-a-service (IaaS) offering that provides users with virtual machine instances for workload hosting.
Google App Engine: A platform-as-a-service (PaaS) offering that gives software developers access to Google's scalable hosting. Developers can also use a software development kit (SDK) to build products that run on App Engine.
Google Cloud Storage: A cloud storage platform designed to store large, unstructured data sets. Google Cloud also offers database storage options, including Cloud Datastore for NoSQL non-relational storage, Cloud SQL for fully relational MySQL storage, and Google's native Cloud Bigtable database.
Google Container Engine: A management and orchestration system for Docker containers that runs within Google's public cloud. Google Container Engine is based on the Kubernetes container orchestration engine.
Google continues to add higher-level services, such as those related to big data and machine learning, to its cloud platform. Google's big data services include offerings for data processing and analytics, such as Google BigQuery for SQL-like queries against multi-terabyte data sets. In addition, Google Cloud Dataflow is a data processing service intended for analytics; extract, transform, and load (ETL); and real-time computational projects. The platform also includes Google Cloud Dataproc, which offers Apache Spark and Hadoop services for big data processing.
ADVANTAGES OF GOOGLE CLOUD

BETTER PRICING THAN COMPETITORS
Google bills in minute-level increments (with a 10-minute minimum charge), so the user pays only for the compute time actually used. A big advantage is that Google gives discounted prices for long-running workloads with no up-front commitment required: run VMs for a large part of the month and the discount is applied automatically. This makes it attractive for start-ups and for enterprise IT looking to cut costs. AWS, for instance, requires prepayment in the form of "reserved instances" to be eligible for discounts, and Azure offers only a 5% discount on a 12-month prepay.
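The billing model described above can be made concrete with a short calculation. The hourly rate and the discount percentage below are illustrative assumptions, not published Google prices; only the minute-level increments and the 10-minute minimum come from the text.

```python
# Sketch of minute-level billing with a 10-minute minimum charge.
# The hourly rate and the long-running-workload discount are
# illustrative assumptions, not real published prices.

def vm_charge(runtime_minutes: int, rate_per_hour: float,
              month_fraction_used: float = 0.0) -> float:
    """Charge for one VM run: per-minute billing with a 10-minute
    minimum, plus an assumed discount once usage covers the month."""
    billed_minutes = max(runtime_minutes, 10)           # 10-minute minimum
    base = billed_minutes / 60.0 * rate_per_hour        # minute-level increments
    discount = 0.30 if month_fraction_used >= 1.0 else 0.0  # assumed rate
    return round(base * (1 - discount), 4)

print(vm_charge(3, 0.60))    # billed as 10 minutes -> 0.1
print(vm_charge(43200, 0.60, month_fraction_used=1.0))  # full month, discounted
```

The contrast with up-front reserved-instance pricing is that here the discount requires no commitment: it simply appears once the usage accumulates.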
LIVE MIGRATION OF VIRTUAL MACHINES
Another big advantage of Google Cloud Hosting is live migration of virtual machines. Neither AWS, Azure, nor smaller providers such as DigitalOcean offer this functionality, so it is a very important differentiator for Google Cloud compared to other cloud providers. It means users' VMs are essentially always up, with no noticeable degradation in performance while they are being live-migrated between host machines.
Live migration allows the engineers at Google to better address issues such as patching, repairing, and updating software and hardware without needing to worry about machine reboots.
STATE OF THE ART SECURITY
Another big advantage is security. Choosing Google Cloud Platform means the user benefits from a security model that has been built up over more than 15 years and currently secures products and services such as Gmail and Search. Google currently employs more than 500 full-time security experts. Niels Provos, a security engineer at Google, has given a great in-depth look at the security of Google Cloud Platform.
Some of Google Cloud Platform security features include:
All data is encrypted in transit between Google, the customers, and the data centres, as well as at rest within all of the Cloud Platform services. Data stored on persistent disks is encrypted under 256-bit AES, and each encryption key is itself encrypted with a set of regularly changed master keys.
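The key hierarchy in that last sentence, data keys that are themselves encrypted by rotating master keys, can be sketched in a few lines. This is a toy: XOR with an equal-length, single-use master key stands in for a real key-wrap algorithm such as AES, and must never be used to protect real data. The point it illustrates is that rotating the master key never requires re-encrypting the data itself.

```python
import secrets

# Toy illustration of envelope encryption: a data encryption key (DEK)
# is wrapped by a master key (KEK). XOR with a random equal-length,
# single-use key stands in for a real wrap algorithm such as AES.
# Do NOT use this for real data.

def xor_wrap(dek: bytes, kek: bytes) -> bytes:
    assert len(dek) == len(kek)
    return bytes(a ^ b for a, b in zip(dek, kek))

dek = secrets.token_bytes(32)      # 256-bit data encryption key
kek_v1 = secrets.token_bytes(32)   # current master key
wrapped = xor_wrap(dek, kek_v1)    # only the wrapped form is stored

# Master-key rotation: unwrap with the old KEK, re-wrap with the new
# one. The bulk data encrypted under the DEK is never touched.
kek_v2 = secrets.token_bytes(32)
wrapped = xor_wrap(xor_wrap(wrapped, kek_v1), kek_v2)

assert xor_wrap(wrapped, kek_v2) == dek   # DEK recoverable under new KEK
print("rotation preserved the data key")
```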
The layers of the Google application and storage stack require that requests coming from other components are authenticated and authorized.
DEDICATION TO CONTINUED EXPANSION
Google has continued to swiftly build out its infrastructure for Google Cloud Platform. On September 29th, 2016, it announced the locations of new strategically placed Google Cloud Regions. We are very excited to see São Paulo and Sydney included in the list, as this will dramatically decrease latency for Google Cloud Hosting customers across South America and Australia.
As you can see, there are a lot of advantages to Google Cloud Hosting and the Google Cloud Platform. Not only is it inexpensive, but the user also benefits from one of the largest networks in the world, which means less latency and more accurate compute pricing as data is processed in less time. Live migration of virtual machines is currently a unique and very important differentiator compared with other cloud hosting providers. On top of all that, state-of-the-art security and performance able to handle hundreds of thousands of concurrent connections give users a platform that can set their business up for long-term success.
Over the past few years, cloud computing has taken the tech world by storm. This computing model is making it easier to deliver computing resources to businesses worldwide, and costs have dropped dramatically owing to automation. Internet-powered cloud migration strategies are already helping countless businesses worldwide make smooth transitions, and broad adoption of cloud technology is not far away. Today, many companies have started their journey towards an integrated cloud environment by migrating their existing systems, applications, services, and data to the cloud. So, explore this article to learn more about cloud migration and its advantages.
Cloud migration involves the process of moving relevant business data, applications, and other important elements of a company from its desktops and servers to the cloud. This process can sometimes also involve the moving of data between different cloud environments provided by different service providers based upon their area of expertise.
Cloud migration makes cloud computing possible, wherein the cloud performs the computing that was performed earlier by mobile devices, laptops, or desktops. The benefits of cloud computing are many, as most of the background processing now happens in large, secure data centres, thereby saving valuable organizational resources. At the same time, cloud migration challenges remain an important reality that many organizations must confront unless they partner with an experienced service provider.
ADVANTAGES OF CLOUD MIGRATION

IMPROVED SECURITY FEATURES
Most cloud providers take care of the tougher security problems, such as keeping unwanted traffic from accessing the machines on which the user's data and apps reside, and applying automatic security updates so systems are not vulnerable to the latest known security threats.
Keep in mind that the user still needs to have security policies in place on their end, such as keeping mobile devices secure, making sure employees do not reveal passwords or other sensitive information to unauthorized parties, and much more.
LESS INFRASTRUCTURE COMPLEXITY
Cloud systems abstract away the complexity of the infrastructure that underlies the architecture used to provision new machines and make them work together to provide the needed services. Instead, the user can fill in some information about what is needed and launch the necessary services. This can save quite a bit of time, as those particular complexities are no longer part of the process.
AUTOMATIC BACKUP AND LOGGING OF KEY METRICS
Monitoring, backup, and logging services are extremely important, especially if the user needs to perform disaster recovery from an outage and see where things went wrong. Backups allow the user to get things up and running again, and the logs may provide critical information to help the user find out what caused the issue in the first place.
FLEXIBILITY

Flexibility is the first thing that comes to mind when we speak about cloud migration. It is best suited for organizations with changing bandwidth demands, as the services can be customized accordingly. If needs increase, it is easy to scale up cloud capacity.
IMPROVED COST MANAGEMENT
Some cloud providers also offer auto scaling, which provisions more services when they are needed and turns them off when they are not. This highly responsive technique can help even more with cost savings, as the user is charged for the additional systems only while they run, instead of keeping extra machines up all the time to deal with peak load. In this way, the services automatically match the number of resources needed at any given time, preventing both downtime and unnecessary expense.
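The saving described above is easy to quantify with a back-of-the-envelope comparison. The hourly rate and the daily load profile below are illustrative assumptions chosen only to make the arithmetic concrete.

```python
# Sketch of the cost difference described above: paying for machines
# only while they are needed versus keeping peak capacity running
# around the clock. Rates and the load profile are illustrative.

HOURLY_RATE = 0.10        # assumed cost per machine-hour
HOURS = list(range(24))

def machines_needed(hour: int) -> int:
    """Illustrative daily load: 2 machines off-peak, 8 at peak."""
    return 8 if 9 <= hour < 17 else 2

autoscaled = sum(machines_needed(h) for h in HOURS) * HOURLY_RATE
always_on = 8 * len(HOURS) * HOURLY_RATE   # peak capacity 24/7

print(f"auto-scaled daily cost: ${autoscaled:.2f}")
print(f"always-on daily cost:   ${always_on:.2f}")
```

Even with this mild load profile, auto scaling pays for only half the machine-hours that an always-at-peak fleet would.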
The HDInsight service is an implementation of Hadoop that runs on the Microsoft Azure platform. HDInsight is fully compatible with Apache Hadoop because it is built on the Hortonworks Data Platform (HDP).
Microsoft implements Hadoop as a service, which has several advantages:
A user can quickly deploy the system from a portal or through Windows PowerShell scripting, without having to create any physical or virtual machines.
A user can implement a small or large number of nodes in a cluster.
The HDInsight service works with input-output technologies from Microsoft or other vendors.
The user only pays for what they use.
HDINSIGHT AS A CLOUD SERVICE
Microsoft Azure is a cloud service provided by Microsoft. One of the services offered is HDInsight, an Apache Hadoop-based distribution which runs in the cloud on Azure.
Creating an HDInsight cluster is quick and easy: log on to Azure, select the number of nodes, name the cluster, and set permissions. The cluster is available on demand, and once the job is completed it can be deleted while the data remains in Azure storage. Having the data securely stored in the cloud before, during, and after processing gives HDInsight an edge over other types of Hadoop deployments.
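Because the data outlives the cluster, a transient create-run-delete cluster costs far less than one left running. The per-node and storage prices below are illustrative assumptions, not Azure's actual rates; the pattern is what matters.

```python
# Sketch of the on-demand cluster pattern described above: create a
# cluster for a job, delete it afterwards, and keep the data in Azure
# storage throughout. All prices are illustrative assumptions.

NODE_HOUR = 0.50          # assumed per-node compute price
STORAGE_GB_MONTH = 0.02   # assumed blob storage price

def transient_cost(nodes, job_hours, data_gb):
    """Compute + storage cost when the cluster exists only for the job."""
    return nodes * job_hours * NODE_HOUR + data_gb * STORAGE_GB_MONTH

def always_on_cost(nodes, data_gb, month_hours=730):
    """Monthly cost if the same cluster were left running continuously."""
    return nodes * month_hours * NODE_HOUR + data_gb * STORAGE_GB_MONTH

print(round(transient_cost(16, 4, 500), 2))   # pay for 4 hours of compute
print(round(always_on_cost(16, 500), 2))      # pay for the whole month
```

Storage is the cheap, persistent layer in both cases; the delta comes almost entirely from the compute hours the transient cluster never consumes.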
UPGRADES TO MICROSOFT AZURE HDINSIGHT
Microsoft has announced a number of updates to Azure HDInsight, its Hadoop and Spark cloud service, that make Hadoop enterprise-ready in the cloud. Built on the latest Hortonworks Data Platform 2.5 and Spark 2.0, it now offers security capabilities for cloud Hadoop solutions, Big Data query speeds approaching data-warehousing performance, and new notebook experiences for data scientists.
Advanced security features: New features such as authentication and encryption for data protection give the user an increased level of security across authentication, authorization, and encryption.
According to Microsoft, LLAP (Live Long and Process) allows data to stay in a compressed format while being processed in memory, delivering a significant performance boost for Big Data queries.
New features in Azure HDInsight include integration with Azure Active Directory, Microsoft's cloud-hosted directory and identity management service.
Azure HDInsight also supports Apache Kafka and works with Spark and Apache Storm to provide a solution for large, fast-moving streams of data, which is ideal for a wide variety of IoT scenarios.
Microsoft also collaborated with Simba to deliver an ODBC driver for Azure HDInsight that can be used with world-class BI tools such as Power BI, Tableau, and QlikView. It allows business analysts to gain insights over Big Data using the tool of their choice.
Containers are a solution to the problem of how to get software to run reliably when moved from one computing environment to another. This could be from a developer’s laptop to a test environment, from a staging environment into production, and perhaps from a physical machine in a data center to a virtual machine in a private or public cloud.
Managing the unique and innovative changes in both technology and business over the past decade has created an ongoing IT infrastructure challenge for many senior technology executives. Each progressive step has been built on the previous one while adding new challenges, dimensions, and opportunities for IT departments and their business partners.
In the recent past, virtualization has become a widely accepted way to reduce operating costs and increase the reliability of enterprise IT. The speed of innovation and the accelerating introduction of new products have changed the way the market works. Along with the wide acceptance of software as a service (SaaS), these changes have paved the way for the latest IT infrastructure challenge: cloud computing.
What is Cloud Computing?
Cloud computing has become one of the most discussed IT paradigms of the recent past. It builds on many of the advances in the IT industry over the past decade and presents significant opportunities for organizations to minimize time to market and trim costs. Using cloud computing, organizations can consume shared computing and storage resources rather than building, operating, and maintaining infrastructure on their own.
In the simplest terms, cloud computing is storing and accessing data and programs over the internet instead of on your computer's hard drive. Cloud computing enables organizations to obtain flexible, secure, and cost-effective IT infrastructure. It has become a highly demanded service thanks to its high computing power, lower cost of services, high performance, flexibility, accessibility, and availability. Cloud vendors are experiencing annual growth of almost 50%. But because the technology is still in its infancy, it has some pitfalls that must be given proper attention to make cloud computing services more reliable and user-friendly.
Amazon and Cloud Computing
Amazon spent over a decade and millions of dollars building and managing the large-scale, safe, and efficient IT infrastructure that powered one of the world's largest online retail platforms. Amazon launched Amazon Web Services (AWS) in 2006, and today it serves hundreds of thousands of customers worldwide. Amazon.com runs a global web platform serving millions of customers and managing billions of dollars' worth of commerce every year.
The growing AWS collection offers diversified services including:
Cloud Drive: This allows users to upload and access music, videos, documents, and pictures from web-connected devices. The service also enables users to stream music to their devices.
Scalable and elastic: Organizations can quickly add and subtract AWS resources in their applications in order to meet customers' requirements and manage cost.
Cost-effective: With AWS, companies only pay for what they use, without any upfront or long-term commitments.
Secure: In order to provide end-to-end security and privacy, AWS builds services in accordance with security best practices, provides the appropriate security features in those services, and documents how to use them.
The key difference between AWS and traditional IT models is flexibility. The flexibility of AWS allows you to keep the programming models, languages, and operating systems that are better suited to your projects. In other words, migrating legacy applications to the cloud is easy and cost-effective: instead of rewriting applications, you can move them to the AWS cloud and tap into advanced computing capabilities.
Cost is one of the most complicated elements of delivering contemporary IT solutions. For example, developing and deploying an application for an e-commerce site can be a low-cost effort, but a successful deployment can dramatically increase the need for hardware and bandwidth.
The cloud, by contrast, provides an on-demand IT infrastructure that lets you consume only the amount of resources you actually need. You are not bound to a set amount of storage, bandwidth, or computing resources.
Scalable and Elastic
In the conventional IT organization, scalability and elasticity are often equated with investment and infrastructure. In the cloud, scalability and elasticity provide opportunities for savings and improved ROI. AWS uses the word elastic to describe the ability to scale computing resources easily, with minimal friction.
AWS provides a scalable cloud computing platform with end-to-end security and privacy. AWS builds security into its services in accordance with security best practices.
Today, Amazon Web Services provides a highly reliable, scalable, and low-cost infrastructure platform in the cloud that powers hundreds of thousands of businesses all over the world. Customers across all industries are taking advantage of these benefits.
It is an undeniable truth that information technology has transformed the way business is done. The enticing features inherent to the cloud concept, such as storage and processing of data in third-party data centres, access to various services at convenient prices, and easy sharing of necessary resources, have ensured its widespread acceptance and adoption across industries.
Choosing a cloud computing service can be a simple process if one knows what one wants, knows what to expect, and understands the business requirements. Primarily, there are three types of cloud computing service providers offering distinct cloud services:
Software as a service (SaaS)-
These are service providers that offer software as a service, for example Google Docs and Microsoft Office 365. SaaS eliminates the need to install and run applications on individual computers. Though a few plugins might be required, SaaS applications can be run directly from a web browser without any downloads or installations.
Platform as a service (PaaS)-
PaaS is primarily a framework for developers to build, develop, and customize applications. The development, operation, and testing of applications made using PaaS is hassle-free, fast, and economical.
Infrastructure as a service (IaaS)-
IaaS provides a virtual environment for access to computing resources. Clients subscribing to this form of cloud computing are given access to virtualized components so that they can build their own IT platforms.
All these services are cost-efficient and time-saving, as they save clients the trouble of setting up everything from servers to hardware from scratch. This means that the memory required by a website to respond to a sudden increase in end users can be doubled within minutes, given the efficiency and scalability of these services. Most of these services can be availed on a pay-per-hour basis that is exclusive to each server, role, or process.
Deployment Models: Which one to choose?
There are choices to make when it comes to adopting a cloud solution. The deployment of different models depends on the needs of each company. Some of the most effective cloud services available are as follows:
Windows Azure, which functions on the PaaS model and supplies as well as manages the operating system, is a great choice if a specialized OS is not required to build applications. This means one can devote complete attention to the building, deployment, and management of cloud applications without having to worry about OS updates and patches. Primarily, Azure offers three main roles:
Web Role: A Windows Azure-supplied OS running Internet Information Services 7 that enables the development of applications using web technologies such as ASP.NET, PHP, and Node.js.
Worker Role: This Windows Azure-supplied OS enables the hosting of applications such as Apache Tomcat, among several others, and runs arbitrary code.
Virtual Machine Role: By uploading a Windows Server 2008 R2 (Enterprise or Standard) VHD image, the customer supplies the OS in this service. This role, which is presently in beta, makes the customer responsible for updating the OS. Applications can be built on Windows Azure using any language, tool, or framework. Microsoft offers a three-month free trial of Azure, which is ample time for professionals to become well-versed with it. The service can be bought on a pay-as-you-go basis or with a six-month commitment for customized, reduced pricing.
Amazon Web Services (AWS) offers raw infrastructure to run any OS a customer's applications might require. Despite offering OS control, Amazon EC2 lacks automated patching. It supports the import of supported virtual machine images and the creation of instances based on numerous Linux and Windows OSs. The service can be bought at an hourly rate; on a one-time-fee basis, which entails a discounted hourly rate with a commitment of one or three years; or through bidding. The following are key compute services in Amazon Web Services:
Auto Scaling: Auto Scaling helps you maintain application availability and allows you to scale your Amazon EC2 capacity up or down automatically according to conditions you define.
Amazon RDS: This makes it easy to set up, operate, and scale a relational database in the cloud. It provides cost-efficient, resizable capacity while managing time-consuming database administration tasks, freeing users to focus on their applications.
Elastic Load Balancer: This automatically distributes incoming application traffic across multiple Amazon EC2 instances in the cloud.
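The "conditions you define" that Auto Scaling evaluates can be illustrated with a minimal threshold policy. This is a conceptual sketch only: the CPU thresholds, step size of one instance, and capacity bounds are invented for the example and are not how any specific AWS policy is configured.

```python
# Minimal sketch of a threshold-based scaling policy of the kind
# Auto Scaling evaluates: add capacity when average CPU is high,
# remove it when low. All thresholds and bounds are illustrative.

MIN_INSTANCES, MAX_INSTANCES = 2, 10
SCALE_OUT_CPU, SCALE_IN_CPU = 70.0, 30.0

def desired_capacity(current: int, avg_cpu: float) -> int:
    """Return the new instance count for one evaluation period."""
    if avg_cpu > SCALE_OUT_CPU:
        return min(current + 1, MAX_INSTANCES)   # scale out, capped
    if avg_cpu < SCALE_IN_CPU:
        return max(current - 1, MIN_INSTANCES)   # scale in, floored
    return current                               # within the target band

print(desired_capacity(4, 85.0))  # -> 5
print(desired_capacity(4, 20.0))  # -> 3
print(desired_capacity(4, 50.0))  # -> 4
```

Real policies add cooldown periods and multi-instance steps, but the core loop of metric in, capacity out is the same.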
OpenStack is a popular and rapidly developing cloud platform for the creation of IaaS (infrastructure as a service) platforms. It was jointly founded by Rackspace and NASA and is supported by multiple other established vendors, including HP, IBM, Rackspace, Dell, and Red Hat. It is a platform for creating and managing large groups of virtual private servers in a cloud computing environment.
It controls large pools of compute, storage, and networking resources throughout a datacenter, all managed through a dashboard that gives administrators control while empowering users to provision resources through a web interface. Many have deployed OpenStack clouds to achieve control, business agility, and cost savings without the licensing fees and terms of proprietary software. Its massive industry support, AWS compatibility, security, and powerful dashboard have made it highly competitive.
Rackspace Cloud Hosting:
This is a service that provides raw infrastructure with control over the OS. Rackspace, unlike other IaaS providers, does not allow the upload of a customer's own virtual machines. Rather, one of the Windows or Linux versions supported by the company must be chosen.
Rackspace offers numerous server sizes and charges a per hour fee for each server.
Rackspace, unlike others, does not let one stop the per-hour charges by halting instances. To keep an idled server without being charged for it, a backup image must be taken (which again incurs charges from Rackspace) and the server must be deleted from the account. The server can be added back at any time later.
The company presently does not offer a free trial. However, an account can be created for free to gain access to the administrative portal which gives a better understanding of the service’s functioning. Charges would only be incurred if any instances are created or other resources are used.