The Virtual Datacenter Model Explained

There are many tools and technologies that make a virtual datacenter run. Here are a few of the major layers firms will have in their virtual setup.

Textbooks and online instructional manuals often explain the virtual datacenter model as a stack of layers that sit on top of one another. These layers depend on one another for the virtual datacenter to work properly. A virtual datacenter abstracts each of these resources into a pool that an administrator can consume on demand through the hypervisor. So what are the layers of the virtualized datacenter model?

Network virtualization
When you think about the way a virtualized datacenter abstracts the traditional components of a datacenter, the first component that must be virtualized is the network. Without a way to interconnect all of your devices, you have a room full of computers that cannot communicate with each other. Setting up a traditional datacenter network can get confusing quite quickly. Once the network layer has been virtualized, you can manage nodes more easily while dictating the actual function and connectivity of your entire datacenter.
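To make this concrete, here is a minimal sketch of how a virtual network might be defined and started through the libvirt Python bindings. The article does not name any particular tooling, so libvirt, the network name, the bridge, and the address range below are all illustrative assumptions.

import libvirt

# Illustrative placeholders -- none of these values come from the article.
NETWORK_XML = """
<network>
  <name>vdc-net</name>
  <bridge name='virbr10'/>
  <forward mode='nat'/>
  <ip address='10.10.0.1' netmask='255.255.255.0'>
    <dhcp>
      <range start='10.10.0.100' end='10.10.0.200'/>
    </dhcp>
  </ip>
</network>
"""

conn = libvirt.open('qemu:///system')      # connect to the local hypervisor
net = conn.networkDefineXML(NETWORK_XML)   # register the virtual network
net.create()                               # bring the network up
net.setAutostart(True)                     # start it automatically with the host
conn.close()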

Storage virtualization
When you have a rack full of servers that all have internal hard drives, it can be difficult to track what data is stored where using traditional means. Storage virtualization is an important part of this picture because it creates a pool of all your available storage, which allows you to consume storage on demand. Because the storage load is spread across all eligible storage devices, you would only run out of space once every hard drive in the datacenter is completely full.
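Sticking with libvirt purely as an example (the article does not specify a platform), a sketch of pooling storage and carving a volume out of it on demand might look like this; the pool path and volume size are placeholders.

import libvirt

# Illustrative placeholders -- the backend, path, and size are not from the article.
POOL_XML = """
<pool type='dir'>
  <name>vdc-pool</name>
  <target><path>/var/lib/libvirt/vdc-pool</path></target>
</pool>
"""

VOLUME_XML = """
<volume>
  <name>app-server-01.qcow2</name>
  <capacity unit='G'>40</capacity>
  <target><format type='qcow2'/></target>
</volume>
"""

conn = libvirt.open('qemu:///system')
pool = conn.storagePoolDefineXML(POOL_XML, 0)  # register the pool
pool.build(0)                                  # create the backing directory
pool.create(0)                                 # activate the pool
pool.createXML(VOLUME_XML, 0)                  # allocate a volume on demand
print(pool.info())                             # state, capacity, allocation, available
conn.close()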

Processing virtualization
If you have a rack full of servers, wouldn't you be able to do more if you could harness the power of every processor in the rack rather than just one or two at a time? The virtualization layer for processing power is essential to tackling large computing tasks. With a multicore processor, you can dedicate a portion of a core, an entire core, or multiple cores to a single virtual machine. The process is often as easy as moving a slider until the virtual machine has the right amount of resources for your computing task.
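In practice, that slider translates into hypervisor calls. Here is an illustrative sketch, again assuming libvirt as the hypervisor API; the VM name and core counts are placeholders, not details from the article.

import libvirt

conn = libvirt.open('qemu:///system')
dom = conn.lookupByName('app-server-01')   # placeholder VM name

# Give this VM two whole cores in its persistent configuration;
# the change applies the next time the guest starts.
dom.setVcpusFlags(2, libvirt.VIR_DOMAIN_AFFECT_CONFIG)

# Optionally pin vCPU 0 to physical CPU 0 on an assumed 4-core host
# (the tuple is a usability mask over the host's CPUs).
dom.pinVcpuFlags(0, (True, False, False, False), libvirt.VIR_DOMAIN_AFFECT_CONFIG)

conn.close()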

Application and access virtualization
Being able to virtualize your applications gives your end-users a consistent experience regardless of their underlying operating systems. If you think about the way software-as-a-service is delivered in the cloud, you already have a good understanding of application virtualization. When you virtualize the application layer, you can deploy applications almost instantly, without worrying about the hardware requirements and system incompatibilities that come with deploying to bare-metal machines.
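To make the software-as-a-service comparison concrete, here is a deliberately tiny sketch of an application delivered over HTTP: every end-user with a browser sees the same thing, whatever operating system their device runs. Flask and the port number are illustrative choices, not anything the article prescribes.

from flask import Flask

app = Flask(__name__)

@app.route('/')
def dashboard():
    # The same response is served to every client, whether it runs
    # Windows, macOS, or Linux -- the browser is the delivery layer.
    return '<h1>Team dashboard</h1><p>Rendered server-side.</p>'

if __name__ == '__main__':
    # Bind to all interfaces so clients on the datacenter network can
    # reach it; the port is a placeholder.
    app.run(host='0.0.0.0', port=8080)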

How are you going to access this infrastructure? Access has its own virtualization layer in this model as well. Because there are so many different ways to authenticate onto a system, virtualizing the access platform is an efficient way to deliver authentication services to your end-users. With a virtualized access platform, you can integrate popular identity services such as Active Directory, which gives you the ability to offer single sign-on for your users.
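As a final illustrative sketch (the article describes the idea rather than an implementation), an access layer can validate a user's credentials against Active Directory over LDAP and then hand out a session for single sign-on. The server address, domain name, and the ldap3 library are assumptions made for the example.

from ldap3 import Server, Connection, NTLM

def authenticate(username: str, password: str) -> bool:
    # Placeholder domain controller and domain -- not from the article.
    server = Server('ldaps://dc01.example.corp', use_ssl=True)
    conn = Connection(
        server,
        user='EXAMPLE\\' + username,
        password=password,
        authentication=NTLM,
    )
    ok = conn.bind()   # True if the domain controller accepts the credentials
    conn.unbind()
    return ok

# A virtualized access layer would call this once at login and then hand
# the user a session token, so downstream applications see single sign-on.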

Natalie Lehrer is a senior contributor for CloudWedge. In her spare time, Natalie enjoys exploring all things cloud and is a music enthusiast.