SANs and NAS: Improved Efficiency Through Virtualization

SANs, NAS, iSCSI, virtualization, in-band, out-of-band: the terminology seems never-ending when it comes to storage, and what's worse, no one will tell you what's best.

Unfortunately, it's not that simple. The advent of SANs and the introduction of new technology have increased the number of options available, but there are no clear guidelines as to which one to use and when. There is no silver bullet or golden configuration that is good for everyone; the solution has to be tailored to the specific environment.

But all is not lost: a great deal has been written about storage and storage architectures, and if all else fails, look at what you are trying to achieve and how much money you have to spend.

While it is widely thought that SANs are for big enterprises and NAS for smaller ones, this is not true. Most enterprises, whether big or small, now have NAS servers, and many are using them for more than just file serving. The cost of SANs has fallen to the point that they are now a very real prospect for smaller organizations that want to take advantage of improved connectivity and performance and to use technologies such as third-party copy and clustered file systems.

So it is the applications and the business requirements that should drive the architecture, not the "latest and greatest" technology or the cheapest solution. Storage is not just about the online disk: backup (which now might go to disk before going to tape), disaster recovery, and legislative compliance all have their part to play. Without a big picture of what needs to be achieved (from the business perspective), the decisions made will fall short.

Another factor to include is storage growth. If the space required in 12 months is 100% more than you have today, will that influence your architecture decision? What happens if it is 1,000% in three years? How long do you plan to remain with the architecture that has been defined? The immediate logical conclusion is to go for the biggest you can buy now, but that is not a pragmatic business decision. The architecture should be designed so that it can grow, and that might mean starting with NAS and expanding into a SAN just as much as starting with a SAN and acquiring a NAS solution later.
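
To make that arithmetic concrete, here is a back-of-the-envelope sketch in Python. The starting capacity is purely an illustrative assumption; the point is how quickly the requirement compounds.

```python
# Back-of-the-envelope capacity projection (purely illustrative numbers).
current_tb = 10.0                      # capacity in use today, in TB

# "100% more in 12 months" means double today's footprint.
one_year_tb = current_tb * (1 + 1.00)

# "1,000% more in three years" means eleven times today's footprint.
three_year_tb = current_tb * (1 + 10.00)

print(f"Today:                        {current_tb:6.0f} TB")
print(f"In 12 months (100% growth):   {one_year_tb:6.0f} TB")
print(f"In 3 years (1,000% growth):   {three_year_tb:6.0f} TB")
```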

Utility computing is a trend we are hearing a great deal about, with many vendors touting it as the next big thing. When it comes to storage, applying utility computing principles and creating a storage utility is a great place to start. Using storage virtualization tools, storage can be pooled and then provisioned when required; by attaching it to a SAN, it can be allocated to any server that needs it. Additional functionality allows file systems to be grown automatically without taking down the application that uses them.
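
As a purely illustrative sketch of the idea, not any particular vendor's tooling, the following Python toy model shows capacity being pooled, provisioned to whichever server asks for it, and grown in place without touching the application.

```python
# Toy model of a virtualized storage pool: capacity is pooled centrally
# and carved out for any server on the SAN when it is needed.
# Names and sizes are illustrative assumptions only.

class StoragePool:
    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.volumes = {}              # volume name -> size in GB

    @property
    def free_gb(self) -> int:
        return self.capacity_gb - sum(self.volumes.values())

    def provision(self, name: str, size_gb: int) -> None:
        """Carve a new volume out of the pool for a requesting server."""
        if size_gb > self.free_gb:
            raise RuntimeError("pool exhausted: add physical disk to the pool")
        self.volumes[name] = size_gb

    def grow(self, name: str, extra_gb: int) -> None:
        """Grow an existing volume in place, without application downtime."""
        if extra_gb > self.free_gb:
            raise RuntimeError("pool exhausted: add physical disk to the pool")
        self.volumes[name] += extra_gb


pool = StoragePool(capacity_gb=10_000)
pool.provision("erp-db", 500)          # allocate to whichever server needs it
pool.grow("erp-db", 250)               # online growth when the file system fills
print(pool.free_gb)                    # remaining pooled capacity: 9250
```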

Business reporting tools enable departments (or lines of business) to see how much storage they are using. The IT organization can then choose to apply costs to that storage and, if it wished, present each business with a bill (a.k.a. chargeback). More often than not it is the insight into costs that is useful, and it can be an invaluable guide as to where best to invest money in IT to get the greatest return for the business. In addition, utility computing is all about improving efficiency through best practice and automation. Again, storage is a great place to begin, and putting in some best practices and simple automation - e.g., increasing space on servers when they are running out - can save a business a great deal of money, no matter what its size.
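
Again as an illustration only, with assumed rates, thresholds, and department names, a few lines of Python are enough to sketch both the showback report and the simple "grow before it runs out" automation described above.

```python
# Minimal sketch of two utility-computing ideas from the article:
# (1) showback/chargeback reporting per department, and
# (2) simple automation that grows a volume before it fills up.
# The rate, threshold, and usage figures are illustrative assumptions.

COST_PER_GB_MONTH = 0.05   # assumed internal rate per GB per month
GROW_THRESHOLD = 0.90      # act when a volume is 90% full
GROW_STEP_GB = 100         # amount to add each time

usage = {                  # department -> (used GB, provisioned GB)
    "finance":   (1800, 2000),
    "marketing": (450, 1000),
    "research":  (4900, 5000),
}

for dept, (used_gb, provisioned_gb) in usage.items():
    bill = provisioned_gb * COST_PER_GB_MONTH
    print(f"{dept:10s} provisioned {provisioned_gb:5d} GB  monthly charge {bill:8.2f}")

    # Best-practice automation: expand before the application runs out of space.
    if used_gb / provisioned_gb >= GROW_THRESHOLD:
        print(f"{dept:10s} is over {GROW_THRESHOLD:.0%} full - growing by {GROW_STEP_GB} GB")
```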

The grid is also seen as the next big thing, and again, storage is a key component of a grid architecture. However, while most grid applications need a large amount of space to store data centrally, that data is then farmed out and generally processed in memory within the grid, so the storage requirements on the fringe nodes are virtually nonexistent. For the main central storage, ensuring that the application serving out the data is highly available and that the data is sufficiently protected, i.e., backed up or replicated, is generally adequate.

Outside of storage, a general comparison of grid versus utility computing is interesting: the two run very different applications and so look very different from 30,000 feet, yet at ground level there are many similarities. In both cases the questions are the same: what is being used, how much is it being used, and can it be used more, whether to improve efficiency, utilization, or both?

More Stories By Guy Bunker

Dr. Guy Bunker, an Independent Expert at Bunker and Associates, is co-author with Gareth Fraser-King of "Data Leaks For Dummies" (John Wiley & Sons, February 2009). He holds a PhD in Artificial Neural Networks from King’s College London, several patents, and is a Chartered Engineer with the IET.
