PACE your Server Storage I/O decision-making, it's about application requirements

PACE your Server Storage I/O decision-making, it’s about application requirements. Regardless of whether you are looking for physical, software-defined virtual, cloud or container storage, block, file or object, primary, secondary or protection copies, standalone, converged, hyper-converged, cluster in a box or other forms of storage and packaging, when it comes to server storage I/O decision-making, it’s about the applications.

I often see people deciding on the best storage before the questions of requirements, needs and wants are even asked. Sure, the technology is important; so too are the techniques and trends, including using new things in new ways as well as old things in new ways. There are lots of buzzwords on the storage scene these days, but don’t even think about buying until you truly understand your business’s storage needs.

However, when it comes down to it, unless you have a unique need, most environments’ server and storage I/O resources exist to protect, preserve and serve applications and their information or data. Recently I wrote a couple of articles over at Network Computing tied to server and storage I/O decision-making, balancing technology buzzwords with business and application requirements.

PACE and common applications characteristics

PACE your server storage decisions

A theme I mention in the above two articles, as well as elsewhere on servers, storage I/O and applications, is PACE: application Performance, Availability, Capacity and Economics. Different applications have various attributes, in general as well as in how they are used. For example, database transaction activity vs. reporting or analytics, logs and journals vs. redo logs, indices, tables, import/export, scratch and temp space. PACE describes the application’s and data’s characteristics and needs.

Common application PACE attributes

All applications have PACE attributes

  • Those PACE attributes vary by application and usage
  • Some applications and their data are more active vs. others
  • PACE characteristics will vary within different parts of an application

Think of an application, along with its associated data, as having a PACE personality: how it behaves, what it does, how and when it does it, along with its value, benefit or cost and Quality of Service (QoS) attributes. Understanding the applications in different environments, data value and associated PACE attributes is essential for making informed server and storage I/O decisions, from configuration to acquisitions or upgrades; when, where, why and how to protect; performance optimization; capacity planning; reporting; and troubleshooting, not to mention addressing budget concerns.

Data and Application PACE

Primary PACE attributes for active and inactive applications and data:
P – Performance and activity (how things get used)
A – Availability and durability (resiliency and protection)
C – Capacity and space (what things use or occupy)
E – Energy and Economics (people, budgets and other barriers)

Some applications need more performance (server compute, or storage and network I/O) while others need space capacity (storage, memory, network or I/O connectivity). Likewise, some applications have different availability needs (data protection, durability, security, resiliency, backup, BC, DR) that determine the tools, technologies and techniques to use. Budgets are also a concern, which for some applications means enabling more performance per cost, while others focus on maximizing space capacity and protection level per cost. PACE attributes also define or influence policies for QoS (performance, availability, capacity), as well as thresholds, limits, quotas, retention and disposition among others.
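
To make this concrete, below is a minimal sketch in Python (the names, numbers and the tiering policy are illustrative assumptions, not from any product or from my books) of capturing PACE attributes per application so they can drive placement or protection decisions:

```python
from dataclasses import dataclass

@dataclass
class Pace:
    """PACE attributes for one application or data set (illustrative only)."""
    iops: int            # Performance: activity, how things get used
    availability: float  # Availability: target such as 0.9999 (resiliency, protection)
    capacity_gb: int     # Capacity: space things use or occupy
    cost_per_gb: float   # Economics: budget constraint (people, budgets, barriers)

# Different parts of the same application can have different PACE personalities.
database_logs = Pace(iops=20_000, availability=0.99999, capacity_gb=200, cost_per_gb=0.50)
reporting_data = Pace(iops=1_000, availability=0.999, capacity_gb=5_000, cost_per_gb=0.05)

def needs_fast_tier(p: Pace) -> bool:
    # Illustrative policy: high activity or tight availability goes on a faster tier.
    return p.iops > 10_000 or p.availability >= 0.9999

for name, p in [("database logs", database_logs), ("reporting", reporting_data)]:
    print(name, "-> fast tier" if needs_fast_tier(p) else "-> capacity tier")
```

Even a simple inventory like this makes the point: the logs and the reporting data belong to the same application, yet their PACE personalities, and thus the right storage for each, differ.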

Where to learn more

Learn more about data infrastructures and tradecraft-related trends, tools, technologies and topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips, can be found in the Software Defined Data Infrastructure Essentials book.

What this all means

The best storage will be the one that meets or exceeds your application requirements instead of the solution that meets somebody else’s needs or wants. Keep in mind: PACE your Server Storage I/O decision-making, it is about application requirements.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Microsoft MVP Cloud and Data Center Management, VMware vExpert 2010-2018. Author of Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

March 31st is world backup day; when is world recovery day

If March 31st is world backup day, when is world recovery day?

For several years, if not decades, March 31st has been world backup day, a reminder to protect and back up your apps and data. Data protection, including backup, recovery, business continuance (BC), disaster recovery (DR), and business resilience (BR), should be a 365-day-a-year focus. If you have regular data protection, including backup, that is great; when was the last time you tested a restore?

Some related content

Upcoming and past events including webinars, tips and commentary
World Backup Day Reminder Don’t Be an April Fool Test Your Data Recovery
Data Infrastructure Overview, Its What’s Inside of a Data Center
Application Data Value Characteristics Everything Is Not The Same
Data Protection Diaries Topics Tools Techniques Technologies Tips

Reminder to Protect your data and apps and settings

Thus, this is also a reminder to protect your data, apps, and their settings regularly. What’s even better is evolving from no protection, or once-a-year protection, to more frequent data protection, including backup of your critical and noncritical apps and data. Notice I keep mentioning apps and not just the usual focus on data. Broadly, application programs are also data; after all, apps, your settings, and metadata are just data when stored and protected.

There is also often a focus on just the data, which can lead to problems when it comes time to recover an app program, settings, or metadata. Also, a reminder that data protection, including backup, is not just for large enterprises; it applies to organizations and entities of all sizes, including small and medium businesses (SMBs), non-profits, and homes (e.g., your photos, worksheets, and other documents).

What About Recovery

If March 31st is world backup day, when is world recovery day? So far, I have been talking about backup as part of data protection or ensuring your apps, data, and settings are protected; what about recovery?

Sometimes with data protection, discussions can drift into what’s more critical, backup or recovery, which is a bit like a chicken and egg situation. In other words, what’s more important, the chicken or the egg? Similar to data protection, what’s more critical, backup or recovery?

Recovery is only as good as your backup (or snapshot, point-in-time copy, checkpoint, or consistency point), and your backup or protection copy is only as good as its recoverability. Recoverability means not only that there is something to restore from a point in time (e.g., recovery point objective or RPO) within a given amount of time (recovery time objective or RTO).

Recoverability also means that you can pull the data (e.g., bits, bytes, blocks, blobs, objects, files, tables) from the protection medium, media, or service and use it. Recovery means that the data is valid and consistent, has integrity, or is otherwise not bad, missing, damaged, or corrupted (e.g., usable).
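
To illustrate RPO and RTO (the objectives and timestamps below are hypothetical examples, not recommendations), a quick sanity check of whether the most recent protection copy still satisfies the recovery point objective might look like this:

```python
from datetime import datetime, timedelta

# Hypothetical objectives: lose at most 4 hours of data, restore within 2 hours.
rpo = timedelta(hours=4)
rto = timedelta(hours=2)

last_backup = datetime(2024, 3, 31, 6, 0)   # when the last good protection copy was taken
now = datetime(2024, 3, 31, 11, 30)

exposure = now - last_backup  # data created since the last copy is at risk
if exposure > rpo:
    print(f"RPO exceeded: {exposure} since the last copy, objective is {rpo}")
else:
    print(f"Within RPO: {exposure} of exposure")

# RTO is only proven by testing: time an actual restore and compare to the objective.
measured_restore = timedelta(hours=3)       # from your last restore test
print("RTO met" if measured_restore <= rto else "RTO missed: test and tune recovery")
```

Note that the RTO half of the check only means something if the restore time comes from an actual test, which is the whole point of a recovery day.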

What About Recovery Day?

For several years I have mentioned, and will continue to do so, that if March 31st is world backup day, then April 1st should be world recovery day. So why April 1st for world recovery day? Simple: you don’t want to look like a fool the day after world backup day if you can’t restore and use the data backed up the day before.

Not comfortable with April 1st as world recovery day? Then make your world recovery day (or test) a day or so later. The important message is to ensure your apps, data, and settings are protected (e.g., copied, backed up, snapshotted, checkpointed), then trust yet verify: test your restorations.

Why do I mention apps, data, and settings?

The important message here is that it is good if you are already protecting your data, your spreadsheets, worksheets, databases, files, photos, and the application programs that use them. However, also ensure that you are protecting application settings, configurations, metadata, encryption keys, the backup or protection mechanisms, and their data.

For example, when I accidentally delete a data file or configuration settings, I can restore those without recovering everything. Suppose, for instance, I accidentally or intentionally uninstall an application program. In that case, I can reinstall (assuming I have a copy of the program), then restore my settings and pick up where I left off.

Who does this apply to?

This applies to organizations of all sizes and types, as well as individuals. If you have, generate, or save data that is worth having (or that you have to keep), then it should be protected. How often to protect data (the time interval) will be based on your recovery point objective (RPO); likewise, how fast you need to recover is defined by your recovery time objective (RTO).

Remember that it is not a question of if you will need to restore, recover, reload, refresh, or repair your apps, data, and settings, rather when. It might be because of accidental or planned deletion, an accident, a hardware, software, or cloud service situation, ransomware, or malware, among other things that can and do happen.

What to do?

If March 31st is world backup day, when is world recovery day? Ensure you have regular copies of your apps, data, and configuration settings, including encryption keys. Implement a variation of the old-school three-two-one (e.g., 3 2 1) data protection (e.g., backup) scheme: three or more copies, stored on two or more devices, systems, or media, with at least one of them offsite, preferably offline, including in the cloud.

A variation of the new-school 4 3 2 1 data protection scheme has:

  • Four or more versions of your protected data.
  • Three or more copies (feel free to swap the number of copies and versions).
  • Stored on two or more different systems (devices, media, or locations).
  • At least one copy offsite (preferably with one offline), including cloud.

The big difference between the old-school 3 2 1 and the new-school 4 3 2 1 is the emphasis on and distinction between having multiple copies and having various versions (e.g., points in time). For example, storing three copies on two systems with one offsite is good, unless all copies are damaged. Having different versions (e.g., points in time) and multiple copies of those versions stored in different places, including at least one offline (e.g., air-gapped), is essential.
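
As a rough sketch (the copy inventory and names below are made up for illustration), checking a set of protection copies against the 4 3 2 1 rule could look like this:

```python
from dataclasses import dataclass

@dataclass
class Copy:
    system: str    # device, system, or media the copy lives on
    version: str   # point in time (e.g., date of the backup or snapshot)
    offsite: bool
    offline: bool  # air-gapped, not live

copies = [
    Copy("nas-01", "2024-03-29", offsite=False, offline=False),
    Copy("usb-hdd", "2024-03-30", offsite=False, offline=True),
    Copy("cloud-blob", "2024-03-30", offsite=True, offline=False),
    Copy("cloud-blob", "2024-03-31", offsite=True, offline=False),
]

ok_4 = len(copies) >= 4                        # four or more copies
ok_3 = len({c.version for c in copies}) >= 3   # three or more versions (points in time)
ok_2 = len({c.system for c in copies}) >= 2    # two or more systems or media
ok_1 = any(c.offsite for c in copies)          # at least one copy offsite
has_offline = any(c.offline for c in copies)   # preferably one offline (air-gapped)

print(f"4 copies: {ok_4}, 3 versions: {ok_3}, 2 systems: {ok_2}, "
      f"1 offsite: {ok_1}, offline copy: {has_offline}")
```

The counting is trivial; the value is in keeping an honest inventory, since it is easy to have four copies that are all the same point in time on the same system.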

Trust yet verify, test your backups and recovery

Test to verify your data protection is working and that data (apps, data, settings) can be restored. When testing restores, be careful not to overwrite your good data and cause a disaster. Also, ensure your data is encrypted in multiple locations and layers and that you protect your encryption keys. Finally, make sure your backup, protection software, catalog, and settings are encrypted, secured, and protected.

If you have questions or are not sure, learn more in my books Software Defined Data Infrastructure Essentials (CRC Press) and Data Infrastructure Management Insight and Strategies (CRC Press), check out the links listed below, or reach out to me or others. If you are an individual consumer just looking to protect some photos, valuable documents, and heirlooms, get in touch with professionals who specialize in these types of things.

What do I do?

Implement 4 3 2 1 type data protection with different granularities and frequencies. For example, my data protection includes regular point-in-time copies, including backups and snapshots, checkpoints, consistency points of systems, volumes, shares, apps, files, data, and settings at different intervals. Having different types of apps and data, some of which are more static vs. others that are changing, protection is also varied to avoid treating everything the same, reduce cost, and increase coverage.

I protect my apps, data, and settings with multiple versions and copies locally on different systems, devices, and mediums, as well as offsite, including offline and at cloud services. So why do I store data offsite vs. having it all in the cloud? Simple: speed of recovery and flexibility.

If it’s a few files, perhaps a few GB of data, and I don’t have a good copy locally, it is usually faster for me to get it from Microsoft Azure. On the other hand, if I need to restore TBs of data (something terrible happens), then it can be faster to bring an offline, offsite copy back, recover from that, and then pull only the more recent data I need from the cloud.

What are some of the tools and technologies that I use?

Locally I have multiple Microsoft Windows Servers (Server 2022) with various storage (HDDs and SSDs), including removable devices. In addition to on-prem, I have data stored offsite on removable media and cloud copies. For my cloud copies, I have a mix of files and blobs stored at Microsoft Azure.

A challenge moving from AWS to Azure was that Retrospect did not support objects (Azure blobs). Then I realized, no worries: Retrospect supports storing data on local storage (SSD or HDD) on regular filesystems as files. The solution was to set up an Azure file share for Retrospect, and everything has worked fantastically.
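
For anyone doing something similar, here is a hedged sketch (the account, share name, and key are placeholders; the SMB mapping pattern follows Microsoft's Azure Files documentation) of mapping an Azure file share as a Windows drive letter so a file-based backup tool can write to it:

```python
import subprocess

# Placeholders: substitute your storage account, share name, and account key.
account = "mystorageacct"
share = "retrospect-backups"
key = "<storage-account-key>"

# Map the Azure file share as drive Z: over SMB (outbound port 445 must be open).
unc = rf"\\{account}.file.core.windows.net\{share}"
subprocess.run(
    ["net", "use", "Z:", unc, f"/user:AZURE\\{account}", key],
    check=True,
)
# Backup tools that only understand local or network file systems can now target Z:\
```

Once mapped, the share looks like any other drive, which is what makes it usable as a backup target for tools that do not speak blob or object APIs.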

Are there things I need and want to improve? Yes, it’s an ongoing process and journey.

What should you do next?

Make sure you have a data backup; if not, March 31st is a good reminder. Trust yet verify that your backups are working and that you can recover, so you are not an April 1st fool.

Where to learn more

Learn more about world backup day, recovery and data protection along with other related topics via the following links:

Upcoming and past events including webinars, tips and commentary
Next Generation Hybrid Data Infrastructures Are In Your Future
Cloud File Data Storage Consolidation and Economic Comparison Model
New Book Data Infrastructure Management Insight Strategies
World Backup Day Reminder Don’t Be an April Fool Test Your Data Recovery
Virtual, Cloud and IT Availability, it’s a shared responsibility
Don’t Stop Learning Expand Your Skills Experiences Everyday
Data Infrastructure Overview, Its What’s Inside of a Data Center
Application Data Value Characteristics Everything Is Not The Same
Data Protection Diaries Topics Tools Techniques Technologies Tips
Data Infrastructure Server Storage I/O related Tradecraft Overview

Additional learning experiences can be found in the Software Defined Data Infrastructure Essentials book. Also check out Data Infrastructure Management Insight and Strategies.

What this all means

If March 31st is world backup day, when is world recovery day? Every day should be a backup day (e.g., some protection, backup, copy, snapshot, checkpoint, consistency point). Likewise, every day should be able to be a recovery day. World backup day and recovery apply to organizations of all sizes as well as individuals. Remember: if March 31st is world backup day, when is world recovery day?

Ok, nuff said.

Cheers gs

Greg Schulz – Multi-year Microsoft MVP Cloud and Data Center Management, ten-time VMware vExpert. Author of Data Infrastructure Insights (CRC Press), Software Defined Data Infrastructure Essentials (CRC). Cloud and Virtual Data Storage Networking (CRC), The Green and Virtual Data Center (CRC), Resilient Storage Networks (Elsevier). Visit twitter @storageio as well as www.picturesoverstillwater.com to view various UAS/UAV e.g. drone based aerial content created by Greg Schulz. Courteous comments are welcome for consideration. First published on https://storageioblog.com. Any reproduction without attribution or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. Visit our companion site https://picturesoverstillwater.com to view drone based aerial photography and video related topics. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO and UnlimitedIO LLC.

Cloud Ready Data Protection for Hybrid Data Centers Are In Your Future

Cloud Ready Data Protection for Hybrid Data Centers

Join me for a free webinar, Cloud Ready Data Protection for Hybrid Data Centers and Data Infrastructures, 11 AM PT Thursday, July 11th, produced by Redmond Magazine and sponsored by Quest Software.

Hybrid Data Infrastructures and Data Centers

Hybrid cloud and on-prem data centers are in your future, if not already a reality. In addition to using public cloud and on-prem resources, your environment is likely a mix of many different operating systems, applications and servers (virtual and physical), along with multiple backup and recovery technologies.

Cloud Ready Data Protection for Hybrid Data Centers

In this engaging, interactive webinar, we will look at trends, issues, and challenges, as well as provide best practices in what you can do to address them today. You’ll learn how to simplify and streamline your system, application and data protection in both the cloud and data center without compromise, all while removing complexity and cost.

What You Will Learn

Join Microsoft MVP, VMware vExpert and IT analyst Greg Schulz of Server StorageIO along with Michael Gogos, Data Protection expert from Quest, as they discuss how to:

  • Become hybrid and cloud data protection ready
  • Use the cloud for backup and disaster recovery
  • Protect cloud applications and their data
  • Address different hybrid data protection scenarios
  • Take action today to prepare for tomorrow

 

Where to learn more

Learn more about world backup day, recovery and data protection along with other related topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips, can be found in the Software Defined Data Infrastructure Essentials book.

What this all means

I look forward to you joining Michael Gogos of Quest Software and me on Thursday, July 11th at 11 AM PT for our interactive discussion (bring your questions) around Cloud Ready Data Protection for Hybrid Data Centers and what you can do today (Register here).

Ok, nuff said, for now.

Cheers GS

Greg Schulz – Multi-year Microsoft MVP Cloud and Data Center Management, ten-time VMware vExpert. Author of Data Infrastructure Insights (CRC Press), Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Also visit www.picturesoverstillwater.com to view various UAS/UAV e.g. drone based aerial content created by Greg Schulz. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2019 Server StorageIO and UnlimitedIO. Visit our companion site https://picturesoverstillwater.com to view drone based aerial photography and video related topics. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

World Backup Day Reminder Don’t Be an April Fool Test Your Data Recovery

World Backup Day Reminder Don’t Be an April Fool Test Your Data Recovery.

March 31 is the annual world backup day to spotlight awareness around the importance of protecting your data and testing your data recovery. The focus of world backup and recovery day spans from the largest enterprises and cloud service providers (e.g., hyperscalers) to the smallest SMB, SOHO, ROBO, and home consumers (including your photos or other valuable items).

Granted, the technology, tools, techniques, and trends will differ with scope as well as scale.

However, the fundamental data protection approaches apply to all. That is, having multiple copies of different points in time spread across separate storage (systems, servers, devices, media, cloud services) as well as offsite (and offline).

Why The Need For Data Protection And Recovery

Data protection encompasses many different things, from accessibility, durability, resiliency, reliability, and serviceability (RAS) to security and consistency. Availability includes basic and high availability (HA), business continuance (BC), business resiliency (BR), disaster recovery (DR), archiving, backup, logical and physical security, fault tolerance, and isolation and containment spanning systems, applications, data, metadata, settings, and configurations.

From a data infrastructure perspective, the availability of data services spans from local to remote, physical to logical and software-defined, virtual, container, and cloud, as well as mobile devices. On the left side of the following figure are various data protection and security threat risks and scenarios that can impact availability, or result in a data loss event (DLE), data loss access (DLA), or disaster. The figure also shows various techniques, tools, technologies, and best practices to protect data infrastructures, applications, and data from threat risks.

Figure: The need for data protection, backup, BC, and DR

Don’t Become An April 1st Recovery Fool

April 1st, also known as April Fool’s Day, should be a reminder to plan as well as test your recovery, so the joke is not on you. Data protection, including backup, archiving, security, disaster recovery (DR), business continuance (BC), and business resiliency (BR), is not a once-a-year focus, rather a 365-day-a-year continuum. Likewise, the focus needs to expand from just making sure you backed up or made copies of your data to being able to recover. After all, what good is a checkbox that you did a backup on world backup day, only to find out the next day that you cannot recover, or that what you thought was protected is not there.

If you already have good backups and data protection copies, verify that they are in fact good by restoring their contents to a different location. It should go without saying, however all too often common sense needs to be repeated: make sure that in the course of testing data protection, including restores, you do not inadvertently cause a disaster. Also, go a step beyond verifying that you can read the data stored on disk, tape, SSD, or optical; actually try to use, or open, the data. Doing so verifies that you can both access and restore the data from the protection medium or cloud location, as well as unlock, decrypt, uncompress, or re-inflate deduped data.
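
One simple way (a sketch with illustrative paths, not a complete test harness) to go beyond "the restore job completed" is to compare checksums of the restored files against the originals:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files do not exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

original = Path("D:/Data")         # illustrative source location
restored = Path("E:/RestoreTest")  # restore target: a DIFFERENT location

# Compare every original file against its restored counterpart.
for src in original.rglob("*"):
    if src.is_file():
        dst = restored / src.relative_to(original)
        if not dst.exists():
            print(f"MISSING after restore: {dst}")
        elif sha256_of(src) != sha256_of(dst):
            print(f"MISMATCH (corrupt or stale): {dst}")
```

Restoring to a separate location, as above, is also how you avoid the inadvertent-disaster problem of overwriting good live data during a test.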

Evolving Data Protection Including Backup and Recovery

While the emphasis of world backup day is on the importance of data protection, including having backup copies, there also needs to be an emphasis on recovering. It is essential to make sure data is protected, which means having multiple copies from different time intervals stored on several mediums or systems across one or more locations. The previous is the basis of 4 3 2 1 data protection: having four or more copies with three or more time-interval versions spread across two or more different systems or storage mediums, with at least one offsite.

4 3 2 1 data protection (via Software Defined Data Infrastructure Essentials)

4 – At least four copies of data (or more); enables durability in case a copy goes bad, is deleted or corrupted, a device fails, or a site is lost.
3 – Three (or more) versions of the data to retain; enables various recovery points in time to restore, resume, or restart from.
2 – Data located on two or more systems (devices or media/mediums); enables protection against device, system, server, file system, or other fault/failure.
1 – At least one of those copies off-premises and not live (isolated from the active primary copy); enables resiliency across sites, as well as a space, time, and distance gap for protection.

Also, make sure that at least one of those copies is offsite, and preferably offline. Likewise, it is crucial that whatever is protected, backed up, copied, cloned, snapshotted, checkpointed, or replicated is also usable. In addition to having multiple copies and versions, data protection copies should also occur at various altitudes or layers in the data infrastructure stack, from applications to databases, file systems to virtual machines or containers, among others.

What About Individual Data Protection at Home

For consumers and individuals, as well as small businesses, make sure that you are copying your essential data from your computer to some other storage medium (or several). For example, have a local copy on an external hard disk drive (HDD) or a solid-state device (SSD). Better yet, have a couple of copies for different time intervals, both on-site and off-site. Anything important you have stored on-site, including copies of photos, images, video, audio, records, spreadsheets, and other documents, should have extra copies off-site or in the cloud.

Likewise, anything you store in the cloud should have at least one other copy stored elsewhere. Don’t be scared of the cloud; however, do your homework to be prepared. Similar to having only one copy of your data on-site, the other extreme is having only one copy in the cloud. Instead, put a copy in the cloud as well as have one on-site (or on-prem if you prefer) or elsewhere.

Don’t Forget Your Home Photos and Movies

Speaking of photos and other documents, for those that are not yet digitized, scanned, or copied electronically, get them converted. Get in touch with a data protection and backup professional, as well as a photo (and digital asset) organizer. They can provide advice on best practices, techniques, tools, technologies, and services to keep your digital data safe and secure. Some photo organizer professionals can also help with converting your old photos, movies, and videos to new digital formats. For example, get in touch with Holly Corbid at Capture Your Photos (www.captureyourphotos.com), a certified professional photo organizer and member of the Association of Professional Photo Organizers.

Where to learn more

Learn more about world backup day, recovery and data protection along with other related topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips, can be found in the Software Defined Data Infrastructure Essentials book.

What this all means

March 31 world backup day is more than an annual event for vendors to send out press releases on the importance of data protection. The focus should also expand to a world recovery day or something similar, as well as span 365 days a year. Now is a good time to review and verify that your existing data protection, including backup and recovery, works as expected. Keep in mind the world backup day reminder: don’t be an April fool; test your data recovery before you need it.

Ok, nuff said, for now.

Cheers GS

Greg Schulz – Multi-year Microsoft MVP Cloud and Data Center Management, ten-time VMware vExpert. Author of Data Infrastructure Insights (CRC Press), Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Also visit www.picturesoverstillwater.com to view various UAS/UAV e.g. drone based aerial content created by Greg Schulz. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. Visit our companion site https://picturesoverstillwater.com to view drone based aerial photography and video related topics. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

Deliver Data Management Availability For Multi Cloud Environments Webinar

Join me on Thursday, March 14th at 11 AM PT when I host a webinar on the topic Deliver Data Management Availability For Multi Cloud Environments. This is a free webinar (it will also be available for replay) sponsored by Veeam and produced by Redmond Magazine, where I will be joined by Dave Russell, Vice President of Enterprise Strategy at Veeam Software, for an interactive, engaging discussion.

Our discussion, including questions for attendees, will look at how IT landscapes are evolving, how hybrid and multi-cloud have become the new normal, and what can be done to protect, preserve, secure, and serve data spread across on-prem and different public clouds. Topics will include what to do today to prepare for tomorrow, minimizing the risk of hybrid environments, changing environments along with their requirements, and identifying strategies for sound data management and data protection, including backup for hybrid environments.

Register for the Deliver Data Management Availability For Multi Cloud Environments Webinar here (Live Thursday March 14th 11AM PT).

Where to learn more

Learn more about cloud, multi-cloud, hybrid and data protection via the following links:

Additional learning experiences along with common questions (and answers), as well as tips, can be found in the Software Defined Data Infrastructure Essentials book.

What this all means

Remember to register here for the live March 14, 2019 event. Join me for an interactive discussion with Dave Russell as we discuss the trends, issues, challenges and what can be done to put a strategy in place for data protection and to Deliver Data Management Availability For Multi Cloud Environments.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Multi-year Microsoft MVP Cloud and Data Center Management, ten-time VMware vExpert. Author of Data Infrastructure Insights (CRC Press), Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Also visit www.picturesoverstillwater.com to view various UAS/UAV e.g. drone based aerial content created by Greg Schulz. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. Visit our companion site https://picturesoverstillwater.com to view drone based aerial photography and video related topics. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

Announcing My New Book Data Infrastructure Management Insight Strategies

Announcing my new book, Data Infrastructure Management Insight Strategies, published via Auerbach/CRC Press and now available via CRC Press and Amazon.com among other global venues.

My Fifth Solo Book Project – Data Infrastructure Management

Data Infrastructure Management Insight Strategies (e.g., the white book) is my fifth solo published book, in addition to several other collaborative works. Given its title, the focus of this new book is data infrastructures: the tools, technologies, techniques, and trends, including the hardware, software, services, people, and policies inside data centers that get defined to support business and application service delivery. The book (ISBN 9781138486423) is soft-covered (electronic Kindle versions are also available) with 250 pages and over 100 figures, tables, tips, and examples. You can explore the contents via Google Books here.

Data Infrastructure Books by Greg Schulz
Stack of my solo books with common theme around Data Infrastructure topics

Data Infrastructure Management Book
Data Infrastructure Management – Insight and Strategies e.g. the White book (CRC Press 2019)

Some of My Other Books Include

Click on the following book images to learn more about, as well as order your copy.

Software Defined Data Infrastructure Essentials Book – SNIA Recommended Reading List
Software Defined Data Infrastructure Essentials (SDDI) – Cloud, Converged, and Virtual Fundamental Server Storage I/O Tradecraft, e.g., the Blue book (CRC Press 2017), covers software-defined, SDDC, SDDI, and hybrid, among other topics including serverless, containers, NVMe, SSD, flash, PMEM, and SCM. Available at Amazon.com among other global venues.

Cloud and Virtual Data Storage Networking – Intel Recommended Reading List
Cloud and Virtual Data Storage Networking (CVDSN) – Your Journey to Efficient and Effective Information Services, e.g., the Yellow or Gold book (CRC Press 2011), available at Amazon.com among other global venues.

 

The Green and Virtual Data Center Book – Intel Recommended Reading List
The Green and Virtual Data Center (TGVDC) – Enabling Efficient, Effective and Productive Data Infrastructures, e.g., the Green book (CRC Press 2009), available at Amazon.com among other venues.

Resilient Storage Networks Book
Resilient Storage Networks (RSN) – Designing Flexible Scalable Data Infrastructures (Elsevier 2004), e.g., the Red book, is SNIA Education Endorsed Reading, available at Amazon.com among other venues. I have some free copies of RSN for anybody willing to pay shipping and handling; send me a note and we will go from there.

Where to learn more

Learn more via the following links:

Additional learning experiences along with common questions (and answers), as well as tips, can be found in the Software Defined Data Infrastructure Essentials book.

What this all means

Today more than ever, there tends to be a focus on the date something was created or published, as there is a lot of temporal content with a short shelf life. This means a lot of content, including books, is being created that is short-lived and temporal, usually focused on a particular technology, tool, or trend with a life span or attention focus of a couple of years at best.

On the other hand, there is also content still being created today that combines new and emerging technologies, tools, and trends with time-tested strategies, techniques, and processes, some of whose names or buzzwords will evolve. My books fit into the latter category, combining current as well as emerging technologies, tools, trends, and techniques that support a longer shelf life; just insert your new favorite buzzword, buzz trend, or buzz topic as needed.

Data Infrastructure Books by Greg Schulz

You will also notice, looking at the stack of books, that Data Infrastructure Management Insight and Strategies is a smaller soft-covered book compared to the others in my collection. The reason is that this new book can be a quick read to address what you need, as well as a companion to the others in the stack, depending on your focus or requirements.

A common question I get, having written several books, not to mention thousands of articles, tips, reports, blogs, columns, white papers, videos, and webinars among other content, is: what’s next? Good question; see what’s next, as well as check out some other things I’m doing over at www.picturesoverstillwater.com, where I’m generating big data that gets stored and processed in various data infrastructures, including cloud ;).

Will there be another book, and if so, on or about what? As I mentioned, there are some projects I’m exploring; whether they get finished or take different directions, wait and see what’s next.

How do I find the time to create these books, and how long does it take? The time required varies, as does the amount of work and whatever else I’m doing. I try to leverage the book (and other content creation) projects with other things I’m doing to maximize time. Some book projects have been very fast, a year or less. Some take longer, such as Software Defined Data Infrastructure Essentials, as it is a big book with lots of material that will have a long shelf life.

Do I write and illustrate the books, or do I have somebody do them for me? For my books, I do the writing and illustrating (drawings, figures, images) myself, along with some of the layout, relying on external copy editors and production folks.

What do I recommend or advise for those wanting to write a book? Understand that publishing a book is a project: there’s the actual writing, editing, reviews, artwork, research, labs, or other supporting items as book companions. Also understand why you are writing a book: for fame, fortune, acclaim, to share with others, or some other reason. I also recommend, before you write your entire book, talking with others who have been published to test the waters and get feedback. You might find it easier to shop an extended outline than a completed manuscript, unless you are writing a novel or similar.

Want to learn more about writing a book (or other content), get feedback, or have other questions? Drop me a note and I will do what I can to help out.

Data Infrastructure Management Book

There is an old saying, publish or perish; well, I just published my fifth solo book, Data Infrastructure Management Insight Strategies, which you can buy at Amazon.com among other venues.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Microsoft MVP Cloud and Data Center Management, VMware vExpert 2010-2019. Author of Data Infrastructure Insights (CRC Press), Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Also visit www.picturesoverstillwater.com to view various UAS/UAV e.g. drone based aerial content created by Greg Schulz. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2019 Server StorageIO and UnlimitedIO. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

Microsoft Azure Data Box Disk Impressions #blogtobertech

Data Box Disk Test Drive Impressions is the last of a four-post series looking at Microsoft Azure Data Box. View Part 1 Microsoft announced Azure Data Box updates, Part 2 Microsoft Azure Data Box Family, and Part 3 Microsoft Azure Data Box Disk Test Drive Review.

Overall, I liked the Azure Data Box experience, along with the range of options to select the best-fit solution for my needs. A common trend among the major cloud service providers such as AWS, Microsoft Azure, and Google is recognizing that a one-size-fits-all solution does not meet different customer needs.

The only things that I did not like and would like to see improved with Azure Data Box are two items: one at the beginning, the other at the end of the process. Granted, with Data Box Disks still in preview, there is time for those items to be addressed before general availability, and I have passed the feedback on to Microsoft.

At the beginning of the process, things are pretty straightforward with good tools along with resources to help you navigate which type of Data Box to order, how to order, specify your account details and other information.

What I did not like with the up-front experience was, after the quick ordering and notification process, the delay of a week or more until being notified when a Data Box would be arriving. Granted, I was not in a rush, and Microsoft did indicate that it could take about ten days to be informed of availability; still, this is something that should be done quickly as resources become available. Another option would be for Microsoft to add an ordering option for priority or low priority in the future.

The other experience that I did not like was at the very end: perhaps it is stuck in an email spam trap (I checked and could not find it), but the final notification could be better. Not only a final email saying your data is copied, but also a reminder of where your block or page blobs were copied to (e.g., what you set up when ordering).

Monitoring the progress of the process, I knew when the Data Box drives arrived at Microsoft, and when the copy started and completed, including error status. Having gotten used to receiving update notifications from Azure, a closing note saying congratulations, your data has been copied, check here for any errors or other info, along with a reminder of where the data was copied to, would be useful.

Likewise, a follow-up note from Microsoft saying that the Azure Data Box drives used as part of the transfer were securely erased along with a certificate of digital destruction would be useful for compliance purposes.

As mentioned above, overall I found the Data Box Disk experience very positive and a great way to move bulk data faster than what could be done with available networks. My next step is to migrate some of the transferred data to cold long-term archive storage, some to Azure Files, with some staying in block blobs. There are also a couple of VHD and VHDX files that will be moved and attached to VMs for additional testing.

Where to learn more

Learn more about Microsoft Azure Data Box, Clouds and Data Infrastructure related trends, tools, technologies and topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips, can be found in the Software Defined Data Infrastructure Essentials book.

What this all means

For those who need to move large amounts of data, including structured, unstructured, semi-structured, little or big data, to a cloud resource, solutions such as Azure Data Box may be in your future. Likewise, for those looking to support remote and edge workloads, from AI, ML, and DL inferencing to large-scale data pre-processing, data collection and acquisition, video, telemetry, and IoT among others, Data Box type solutions may be in your future. Overall, I found my Microsoft Azure Data Box Disk impressions favorable and was able to address a project that had been on my to-do list for some time.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Microsoft MVP Cloud and Data Center Management, VMware vExpert 2010-2018. Author of Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

Microsoft Azure Data Box Disk Test Drive Review #blogtobertech

Microsoft Azure Data Box Test Drive is part three of a four-part series looking at Data Box. View Part 1 Microsoft announced Azure Data Box updates, Part 2 Microsoft Azure Data Box Family, and Part 4 Microsoft Azure Data Box Disk Impressions.

Getting Started

The workflow for using Data Box involves selecting the type of Data Box to use via the Microsoft Azure portal (here), or the Data Box Family page (here).

Getting Started via the Microsoft Azure Data Box Family Page, image via Microsoft.com

The first step of ordering a Data Box is to specify your Azure subscription, the type of operation (e.g., import data into Azure, or export out), the source country/region, and the destination Azure region.

Selecting Data Box from the Azure Portal

The next step is to determine what type of Data Box; in this test I chose 40 TB Data Box Disks. Make a note of the fees to avoid any surprises.

Selecting Data Box Disks (40 TB) from the Azure Portal

After selecting the type of Data Box, fill in the storage account information using an existing resource, or create new ones as needed. Make a note of these selections, as you will need them after the copy is done; this is where your data will be located.

Specify the Azure storage account information where data will transfer to

Once the order is placed, an email is received confirming the order and, this being a preview, indicating that it might take ten days to hear a status update on the availability of devices.

Email notification received after the order is placed

After about ten days, I was contacted by Microsoft via email (not shown) to confirm the amount of data to be copied, in order to determine how many disks would be needed. Once this was confirmed with Microsoft, a status update was noted on the Azure dashboard.

Azure Data Box dashboard status after the order is placed

After a few days, a box arrived with the Data Box disks, cables, and return shipping labels enclosed. I also received an email notification indicating that the disks had arrived.

Email notice that the Data Box has arrived on site (on-prem if you prefer)

The following is the physical box that contains the Data Box disks that I received from Microsoft.

The shipping box with Data Box Disks arrives

Once you get the Data Box, go to the Azure portal for Data Box and access the tools. There are tools and commands for Windows as well as Linux that are needed for accessing and unlocking the disks. This is also where you obtain device IDs. You will also need the access key phrase you specified in an earlier step as part of placing the order.

Access Data Box software tools and keys from the Azure portal

Inside the shipping box were a pair of 8 TB SATA SSDs, SATA-to-USB cables, and return shipping labels.

Contents inside the shipping box: two Data Box 8 TB disks

From the Azure portal, access the device IDs that will be needed, along with the passphrase for obtaining and unlocking the Data Box disks. You will also want to download the tools and follow the other instructions on the portal for accessing the disks.

Azure Data Box tools, device IDs, and keys

The Windows system I used for testing is a virtual machine hosted on a VMware vSphere ESXi 6.7 host. After physically attaching the Data Box Disks to the VM host, a virtual or software attachment was done by adding USB devices to the VM.

Virtual attach of Data Box Disks to a VMware vSphere ESXi host and guest VM

Once the VM had the Data Box disks attached and mapped, they appeared in Windows. After downloading the Data Box software tools and unlocking the devices, they were ready to have data copied to them. Note that the disks appear as regular Windows devices once unlocked; simply using BitLocker does not unlock the drives, you need to use the Data Box tools. Speaking of Windows disks, there are a couple of folders on the Data Box disk as shipped, including one for Block Blob and one for Page Blob, along with verification items.

View of the Data Box Disks (8 TB each) after attaching to the Windows system

Note that you are given several days as part of the base transfer cost; extra days incur additional fees. Since I had a few extra days, I used some of the excess capacity to do some staging and reorganization of data before the actual copy.

Data copy is done using your choice of tools, for example Robocopy, among many others. I used a combination of Robocopy and Retrospect, among others. Also note that for most data, you place it in the folder or directory structure of your choice within the Block Blob folder. Page Blobs are for VHDX files to be used with virtual machines on Azure. After spending a few days copying the data I wanted to move, along with performing verification, it was time to pack up the devices.
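
For reference, here is a hedged sketch of driving a Robocopy run from Python (the paths are placeholders; the flags shown are common Robocopy options, not the only ones that will work):

```python
import subprocess

# Placeholders: on-prem source and the Block Blob folder on the unlocked Data Box disk.
src = r"D:\Data"
dst = r"F:\BlockBlob\Data"

# /E copies subfolders (including empty ones), /MT:16 runs multi-threaded,
# /R:2 /W:5 limit retries and waits, /LOG captures results for later verification.
result = subprocess.run(
    ["robocopy", src, dst, "/E", "/MT:16", "/R:2", "/W:5",
     r"/LOG:C:\Temp\databox_copy.log"],
    check=False,  # Robocopy exit codes 0-7 are success variants; 8+ means failure
)
print("copy ok" if result.returncode < 8 else "copy failed, check the log")
```

Keeping the log file is worth the small effort; it is the easiest way to verify afterward what was, and was not, copied to the disks.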

As a reminder, blobs are what Microsoft Azure has instead of, and are analogous to, objects (e.g., object storage). Also remember that Azure blob types include block, page (512-byte page aligned, for VHDX), and append (similar to other vendors’ object storage). In addition to blobs, Microsoft Azure supports file (SMB and NFS) access, along with table (database) and queue storage services.

The following shows the return label attached to the shipping box that contains the Data Box disks and cables. I also included a copy of the shipping label inside the box, just in case something happened during shipment. Once prepared for delivery, I took the box to a local UPS store, where I received a shipment receipt (not shown). Later that day I also received an email from Microsoft indicating the shipment was in progress.

Data Box disks packaged with the return receipt (which was in the box)

The Azure portal shows the status of the Data Box shipment being sent to Microsoft, along with a follow-up email notification.

Azure Data Box portal status

Email notification of Data Box on the way to Microsoft.

Notice that the Data Box is on the way to Azure

After a few days, checking the Azure portal showed the Data Box had arrived at Microsoft, with copy operations underway. Remember, the storage account you specified back in the early steps is where you will look for your data. This is something I think Microsoft can improve on by providing a link, or some reminder in the status of where the data is being copied to. Likewise, a copy-completion email notice would be handy, after getting used to the other alerts earlier in the process.

Azure Data Box portal showing disk copy operation status

Looking at the Azure storage account specified during the ordering process, the contents of the Data Box Disks can be found in the Blob storage resources.

Contents of the Data Box disks copied into the specified Azure blobs and storage account
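
To confirm what landed, here is a minimal sketch using the azure-storage-blob Python SDK (the connection string and container name are placeholders; the actual containers reflect the folder structure you used on the disks) to list the transferred blobs:

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string for the storage account specified when ordering
# the Data Box (that account is where the transferred data lands).
conn_str = "<storage-account-connection-string>"
service = BlobServiceClient.from_connection_string(conn_str)

# Example container name; substitute one from your own transfer.
container = service.get_container_client("databoxcopy")

total_bytes = 0
for blob in container.list_blobs():
    total_bytes += blob.size
    print(f"{blob.name}  {blob.size} bytes")
print(f"{total_bytes / 1e9:.1f} GB found in the container")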

The following shows folders that I copied from on-prem systems to the Data Box, now located in the proper Azure block blobs. Not shown are the page blobs, where I moved some VHDXs.

Mission accomplished, data folders now stored in Azure block blobs

Where to learn more

Learn more about Microsoft Azure Data Box, Clouds and Data Infrastructure related trends, tools, technologies and topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips, can be found in the Software Defined Data Infrastructure Essentials book.

What this all means

Overall, the test drive of the Azure Data Box Disk solution was positive, and I look forward to trying out some of the other Data Box solutions, both offline and online options, in the future. Continue reading Part 4 Microsoft Azure Data Box Disk Impressions as the conclusion of this series.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Microsoft MVP Cloud and Data Center Management, VMware vExpert 2010-2018. Author of Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

Microsoft Azure Data Box Family #blogtobertech

Microsoft Azure Data Box Family is part two of a four-part series looking at Data Box. View Part 1 Microsoft announced Azure Data Box updates, Part 3 Microsoft Azure Data Box Disk Test Drive Review, Part 4 Microsoft Azure Data Box Disk Impressions.

Microsoft Azure Data Box Overview

Microsoft has several Data Box solutions, available or in preview, to meet various customer needs. These include both online and offline solutions that combine hardware (except Data Box Gateway), software tools, and cloud services.

Data Box Online

Microsoft has two online Data Box offerings that provide real-time access to Azure cloud storage resources from on-prem locations, including remote and edge sites. The online Data Box solutions are Edge and Gateway, both with local on-prem storage.


Data Box Edge image via Microsoft.com

Data Box Edge (Preview)

Currently in preview, Data Box Edge is a 1U appliance that combines hardware and software resources for deployment on-prem at edge or remote locations. Data Box Edge places converged compute and storage resources locally as an appliance, along with connectivity to Azure cloud-based resources.

Intended workloads and applications for Data Box Edge include remote AI, ML, and DL inferencing; data processing or pre-processing before sending to the Azure cloud; and functioning as an edge compute, data protection, and data transfer platform (e.g., cloud storage gateway) with local compute. Data Box Edge is similar in functionality and focus to other cloud service provider solutions such as AWS Snowball Edge (SBE). Management tools include the Data Box Edge resource in the Azure portal for management from a web UI, to create and manage resources, devices, and shares.

Other Data Box Edge attributes include:

  • Supports Azure Blob or Files via SMB and NFS storage access protocols
  • Dual Intel Xeon processors, each with 10 CPU cores, 64 GB RAM
  • 2 x 10 Gbps SFP+ copper cables, 2 x 1 Gbps RJ45 cables
  • 8 x NVMe SSDs (1.6 TB each), no HA, 12.8 TB total raw capacity
  • 2 x 1 GbE (one for management, one for user access)
  • 2 x 25 GbE ports (can operate at 10 GbE) plus an additional 2 x 25 GbE ports
  • Local web UI for management and configuration

Data Box Gateway (Preview)

Also in preview, Data Box Gateway is a virtual machine (VM) based software-defined appliance that runs on VMware vSphere (ESXi) or Microsoft Hyper-V hypervisors. The functionality of Data Box Gateway is that of a cloud storage gateway, providing access to Azure Blob (page and block) or Files (NAS) via SMB or NFS protocols. Learn more about both Data Box Edge and Data Box Gateway, including pricing, here.

Data Box Offline Solutions

Microsoft has several offline Data Box offerings, including previously available models and new ones in preview. Offline Data Box solutions enable large amounts of data to be moved from on-prem primary, remote and edge locations to Azure cloud storage resources. Bulk data movement operations can be one-time or recurring, in support of big data migrations for energy, research, media & entertainment and other environments with large volumes of data.

Other bulk movement use cases include archive, backup, BC/DR, and virtual machine and application migration, among others. Use Data Box offline solutions when large amounts of data need to be moved from on-prem to the Azure cloud faster than available networks can support in a timely manner.

Offline Data Box solutions include:

  • Data Box Heavy (Preview) 1 PB storage, 800 TB usable
  • Data Box 100 TB (80 TB usable)
  • Data Box Disk (Preview) 40 TB (35 TB usable)


Data Box Heavy 1 PB (Preview) image via Microsoft.com

Data Box Heavy 1 PB (Preview)

  • Appliance with up to 800 TB usable capacity per order
  • One system per order
  • Supports Azure Blob or Files
  • Copy data to up to 10 storage accounts
  • 1 x 1/10 Gbps RJ45 connector, 4 x 40 Gbps QSFP+ connectors
  • AES 256-bit encryption
  • Copies data using NAS SMB and NFS protocols


Data Box 100TB image via Microsoft.com

100 TB Data Box

  • An appliance that supports 80 TB usable storage capacity
  • Supports Azure Blob or Files
  • Copies data to up to 10 storage accounts
  • 1 x 1/10 GbE RJ45 connector
  • 2 x 10 GbE SFP+ connector
  • AES 256-bit encryption
  • Storage access and copy via SMB and NFS NAS protocols

Case of Data Box Disks image via Microsoft.com

Data Box Disk 40 TB (Preview)

  • Up to 35 TB usable capacity per order
  • Up to 5 SSDs per order
  • This is what I tested (2 x 8 TB)
  • Supports Azure Blob storage (Block and Page)
  • Copies data to a single storage account
  • USB/SATA II, III server I/O interface (comes with SATA to USB connector cables)
  • AES 128-bit encryption
  • Copy data with standard tools (see the sketch below)
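
As a hedged illustration of that last point, the following minimal Python sketch copies a local folder tree onto an unlocked Data Box Disk using nothing beyond the standard library. The drive letter, source path and BlockBlob folder name are assumptions for illustration; verify the folder layout against the documentation that comes with your order.

# Minimal sketch, assuming the unlocked Data Box Disk is mounted at E:\ and
# exposes a BlockBlob folder; a top-level folder copied under it is intended
# to become a blob container once the disk is returned and ingested.
import shutil
from pathlib import Path

SOURCE = Path(r"D:\data\projects")       # local data to migrate (placeholder)
TARGET = Path(r"E:\BlockBlob\projects")  # folder name maps to a container

def copy_to_databox(src: Path, dst: Path) -> None:
    """Copy a folder tree to the disk, skipping files copied on a prior run."""
    for f in src.rglob("*"):
        if f.is_file():
            out = dst / f.relative_to(src)
            if not out.exists():
                out.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(f, out)  # copy2 preserves file timestamps

if __name__ == "__main__":
    copy_to_databox(SOURCE, TARGET)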

Where to learn more

Learn more about Microsoft Azure Data Box, Clouds and Data Infrastructure related trends, tools, technologies and topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips can be found in Software Defined Data Infrastructure Essentials book.

Software Defined Data Infrastructure Essentials Book SDDC

What this all means

Which Microsoft Azure Data Box is the best? That depends on your needs and requirements.

Microsoft, along with other major cloud service providers, continues to evolve its data migration services. Recognizing that customers who need, want, or have to get data to the cloud also need barriers removed, solutions such as Azure Data Box are a step toward eliminating those barriers while addressing cloud concerns. Continue reading Part 3 Microsoft Azure Data Box Disk Test Drive Review and Part 4 Microsoft Azure Data Box Disk Impressions as part of this Microsoft Azure Data Box Family series.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Microsoft MVP Cloud and Data Center Management, VMware vExpert 2010-2018. Author of Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

Microsoft announced Azure Data Box updates #blogtobertech

Microsoft announced Azure Data Box updates #blogtobertech

Microsoft announced Azure Data Box updates is the first in a series of four posts looking at Data Box, including a test drive experience. View Part 2 Microsoft Azure Data Box Family, Part 3 Microsoft Azure Data Box Disk Test Drive Review, Part 4 Microsoft Azure Data Box Disk Impressions.

Microsoft Azure Data Box Family Page image via Microsoft.com

At Ignite, Microsoft announced Azure Data Box updates, which means it is time for a test drive and review. Microsoft has several Data Box solutions available or in preview to meet various customer needs. These include both online and offline solutions that comprise hardware (except Data Box Gateway), software tools and cloud services. In general, Data Box enables bulk movement and migration of data from on-prem environments to Azure cloud storage, including blob (e.g., object) and file (e.g., NAS accessible) resources.

What's the Need for a Data Movement Appliance Service?

Some might ask: why do you need a Microsoft Azure Data Box when there are fast networks? Good question, assuming you have fast networks that can move large amounts of bulk data in a timely manner. Microsoft supports traditional Internet-based access to Azure cloud resources for data migration, along with the higher speed ExpressRoute service, similar to Amazon Web Services (AWS) Direct Connect, among other options.

On the other hand, if you need to move an amount of data that would take weeks, months or longer to send over expensive networks, then solutions like Data Box are an option. Microsoft is not alone or unique in having data storage migration or movement services. AWS has Snowball, Snowball Edge with compute, as well as the truck-sized Snowmobile for large-scale data movement. Google also has its Transfer services, including the Google Transfer Appliance.
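
To put weeks or months into perspective, here is a back-of-the-envelope Python sketch of bulk transfer time over a network; the 70% link efficiency figure is an assumption, and real-world results vary with protocols, latency and contention.

# Rough arithmetic: days needed to move a given number of terabytes over a
# network link, assuming a fixed effective link efficiency (an assumption).
def transfer_days(data_tb: float, link_gbps: float, efficiency: float = 0.7) -> float:
    bits = data_tb * 8 * 10**12                       # TB -> bits (decimal units)
    seconds = bits / (link_gbps * 10**9 * efficiency)
    return seconds / 86400

for tb in (35, 100, 800):  # roughly Data Box Disk, Data Box, Data Box Heavy
    print(f"{tb} TB over 1 Gbps: {transfer_days(tb, 1.0):.0f} days")
# Prints about 5, 13 and 106 days respectively; at that point an offline
# appliance plus shipping time is often the faster option.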

Who is Azure Data Box for?

Azure Data Box is for those who need to migrate data to Azure cloud storage and other services on a one-time or recurring basis. Another scenario is for those who need on-prem storage and optional compute at remote or edge locations in support of data acquisition, media & entertainment, energy exploration, AI, ML and DL inferencing, local data processing, or pre-processing before sending to the cloud, among other workloads.

Yet other scenarios involve those who need to move large amounts of data online, offline, or in disconnected (also known as submarine) mode, where a connection to the internet is not always available. Bulk data movement also applies to one-time as well as recurring data protection such as archive, backups and BC/DR, along with data shipping, virtual machine farm relocation, SQL Server data migration to the cloud, and data center consolidation, among many other scenarios.

What is Azure Data Box

Azure Data Box is a combination of hardware, software and cloud services that supports data migration (online and offline) from on-prem environments, including remote or edge locations, to Azure cloud storage resources. There are different Data Box solutions available or in preview to meet various needs for performance, capacity and functionality, with as well as without compute. In addition to being used for data migration, there are also Data Box solutions (e.g., Edge) that converge compute and storage for deployment at remote or edge locations.

Data Box Gateway is a software-defined virtual machine appliance that deploys on VMware (e.g., vSphere ESXi) and Microsoft (e.g., Hyper-V) hypervisors. Offline Data Box solutions scale from single 8 TB SSD disks to a PB of capacity with varying functionality.

As a reminder, blob is what Microsoft Azure calls what others refer to as objects (e.g., object storage). Also remember that Azure blobs include block, page (512-byte page aligned, for VHDX) and append (similar to other vendors' object storage). In addition to blobs, Microsoft Azure supports file (SMB and NFS) access, along with table (database) and queue storage services.
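
For a concrete flavor of block blobs, the following hedged Python sketch writes one using the azure-storage-blob (v12) SDK; the connection string, container and blob names are placeholders for illustration.

# Sketch: upload a file as an Azure block blob (pip install azure-storage-blob)
from azure.storage.blob import BlobServiceClient

conn_str = "<your-storage-account-connection-string>"  # placeholder
service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("demo-container")

with open("report.pdf", "rb") as data:
    # upload_blob writes a block blob by default; overwrite replaces any prior copy
    container.upload_blob(name="reports/report.pdf", data=data, overwrite=True)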

Where to learn more

Learn more about Microsoft Azure Data Box, Clouds and Data Infrastructure related trends, tools, technologies and topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips can be found in Software Defined Data Infrastructure Essentials book.

Software Defined Data Infrastructure Essentials Book SDDC

What this all means

Azure Data Box type solutions and services are becoming more common as well as more diverse. With the addition of compute in some of these solutions to support remote edge workloads, the lines may blur with some converged and hyper-converged infrastructure (HCI) solutions. Likewise, keep an eye on how cloud service providers leverage solutions like Data Box Edge to extend their reach out to the edge, enabling fog (e.g., cloud at the edge) among other converged functionality. Continue reading Part 2 Microsoft Azure Data Box Family, Part 3 Microsoft Azure Data Box Disk Test Drive Review, and Part 4 Microsoft Azure Data Box Disk Impressions as part of Microsoft announced Azure Data Box updates.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Microsoft MVP Cloud and Data Center Management, VMware vExpert 2010-2018. Author of Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

Cloud File Data Storage Consolidation and Economic Comparison Model #blogtobertech

Cloud File Data Storage Consolidation and Economic Comparison Model

The following is a new Industry Trends Perspective White Paper Report titled Cloud File Data Storage Consolidation and Economic Comparison Model.

Cloud File Data Storage Consolidation and Economic Comparison Model

This new report looks at a distributed file server and consolidated cloud storage economic comparison, with a fundamental economic comparison model for remote (on-prem) distributed file server and cloud storage consolidation decision-making. IT data infrastructure resource (servers, storage, I/O network, hardware, software, services) decision-making involves evaluating and comparing technical attributes (speeds, feeds, features) of a solution or service. Another aspect of data infrastructure resource decision-making involves assessing how a solution or service will support and enable a given application workload from a Performance, Availability, Capacity, and Economic (PACE) perspective.

Cloud File Data Storage Consolidation and Economic Comparison Model

Keep in mind that all application workloads have some amount of PACE resource requirements, which may be high, low or various permutations thereof. Performance, Availability (including data protection along with security) and Capacity are addressed via technical speeds, feeds and functionality along with workload suitability analysis. The E in PACE resource decision-making is about the economic analysis of the various costs associated with different solution approaches.
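
To make the E in PACE concrete, here is a minimal Python sketch of the kind of comparison such a model performs; every figure below is a hypothetical placeholder, not a number from the report.

# Toy comparison: total period cost of N remote file servers versus
# consolidated cloud file storage. All unit costs are assumptions.
def remote_file_servers(sites: int, server_cost: float, years: int,
                        admin_per_site_year: float) -> float:
    """Hardware per site plus per-site admin labor over the period."""
    return sites * (server_cost + admin_per_site_year * years)

def cloud_consolidated(tb: int, price_per_tb_month: float, years: int,
                       egress_per_month: float) -> float:
    """Capacity charges plus estimated egress over the period."""
    return years * 12 * (tb * price_per_tb_month + egress_per_month)

print(remote_file_servers(sites=20, server_cost=8_000, years=3,
                          admin_per_site_year=2_000))    # 280000.0
print(cloud_consolidated(tb=100, price_per_tb_month=25.0, years=3,
                         egress_per_month=150.0))        # 95400.0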

Read more in this Server StorageIO Industry Trends and Perspective (ITP) Report.

Where to learn more

Learn more about Clouds and Data Infrastructure related trends, tools, technologies and topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips can be found in Software Defined Data Infrastructure Essentials book.

Software Defined Data Infrastructure Essentials Book SDDC

What this all means

When comparing and making data infrastructure resource decisions, consider the application workload PACE characteristics. Also keep in mind that PACE means Performance (productivity), Availability (data protection), Capacity and Economics. This includes making decisions based on technical features and functionality (speeds and feeds) as well as on how the solution supports your application workload. Leverage resources, including tools, to perform analysis, including Cloud File Data Storage Consolidation and Economic Comparison Model approaches.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Microsoft MVP Cloud and Data Center Management, VMware vExpert 2010-2018. Author of Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

Ten tips to reduce your cloud compute storage costs #blogtobertech

Ten tips to reduce your cloud compute storage costs

The following are ten tips to reduce your cloud compute and storage costs.

In some cases, reducing your cloud costs means spending the same yet getting more value and resources that provide a business benefit. For example, paying the same yet upgrading to fewer, faster servers, storage and I/O network resources to support growth while boosting productivity. In other words, when measured on a cost per unit of work done or service enabled, there should be an improvement.

On the other hand, cost cutting can be measured by an actual reduction in spending, for example, consolidating multiple applications to a lower cost compute instance running at higher utilization. The caveat is that while the spend may be reduced, is the corresponding level of service or application and user productivity negatively impacted?

Other examples are a hybrid of removing complexity and cost, as well as cost-cutting, for instance finding orphan resources that are powered on yet not used. Orphan resources include IP addresses that are assigned, and being charged for, yet not used, or a virtual machine instance powered on but not used. Another orphan example is a powered-off VM instance that is no longer used, along with the disks assigned to it and any snapshots or backups.

Ten tips to reduce your cloud costs

  • Utilize client and remote site data file cache to reduce cloud egress network fees
  • Bring your own software licenses for operating systems and applications
  • Monitor your cloud cost summaries regularly to watch out for surprises
  • Find and remove orphan resources including instances, images, IP addresses, storage volumes and buckets (see the sketch after this list)
  • Revisit whether your data is stored in the appropriate storage class or tier for how it is used. Likewise, leverage less durable storage tiers as locations for additional protection copies instead of merely as a single destination to support cost-cutting. For example, cost cutting would be placing your only data protection copy and archive on a lower-cost, less durable storage tier. Removing cost while boosting availability would be putting copies of your data on two or more economically priced, less durable storage tiers in different locations, instead of a single copy on a highly durable tier in one place.
  • Consolidate many smaller, lower cost instances into fewer larger instances, removing complexity and costs
  • Utilize reserved instances (RI) along with prepayment discounts; also check with your finance department to see if there are benefits to treating them as OpEx or CapEx.
  • Audit your RIs to make sure you have the appropriately sized resources to meet workload needs.
  • Utilize spot instances for spot or ad-hoc interruptible workloads
  • Leverage ephemeral on-instance storage as a cache to boost performance
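
As a sketch of the orphan-resource tip above (using AWS and boto3 as one example provider), the following lists unattached EBS volumes and unassociated Elastic IPs; the region is an assumption, and a real audit would also cover snapshots, images and buckets.

# Find two common kinds of orphan resources (pip install boto3)
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption

# Volumes in the "available" state are not attached to any instance
volumes = ec2.describe_volumes(
    Filters=[{"Name": "status", "Values": ["available"]}]
)["Volumes"]
for v in volumes:
    print(f"orphan volume {v['VolumeId']} ({v['Size']} GiB)")

# Elastic IPs with no association still accrue charges
for addr in ec2.describe_addresses()["Addresses"]:
    if "AssociationId" not in addr:
        print(f"orphan elastic IP {addr['PublicIp']}")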

Additional Tips and Recommendations

Everything is not the same, so why treat everything the same, including assigning it to the same type of resources? Keep in mind that all applications have some level of Performance, Availability, Capacity, and Economic (PACE) resource requirements that need to be balanced.

As with on-prem environments, one of the top mistakes when choosing storage is looking only at cost per capacity, particularly with flash-based SSD and NVMe-accessed storage. Also look into what the storage performance thresholds are, as well as any access, API or service call fees.

Watch out for excessive API and cloud service calls beyond your normal monthly limits. For example, consistently running rsync against some storage classes can result in surprise monthly invoices. Likewise, moving data around, changing encryption or other operations may wipe out the savings from going to a lower storage tier. Look beyond the monthly cost per capacity to what the access fees are, including egress (reading data) fees, as well as API calls such as list, dir or other operations.
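
A toy Python cost model shows why: with hypothetical unit prices (check your provider's current price list), request and egress fees can rival or exceed the capacity charge itself.

# All unit prices below are made-up placeholders for illustration only
def monthly_bill(tb_stored: float, gets_millions: float,
                 puts_millions: float, egress_tb: float) -> float:
    capacity = tb_stored * 23.0                             # $/TB-month (assumed)
    requests = gets_millions * 0.40 + puts_millions * 5.00  # $/million (assumed)
    egress = egress_tb * 90.0                               # $/TB out (assumed)
    return capacity + requests + egress

# Frequent list/sync jobs (e.g., nightly rsync-style scans) drive the request line
print(f"${monthly_bill(tb_stored=10, gets_millions=50, puts_millions=5, egress_tb=1):,.2f}")
# $365.00, of which only $230 is the capacity charge itself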

Likewise, for compute instances, look beyond the base cost, also considering how much memory (DRAM), I/O for storage and networking, and on-instance storage (temporary or persistent) is included, along with bring-your-own-license options and the number of cores or virtual CPUs and their speed. Also watch for any limits on the number of I/O operations per instance, particularly with fast flash SSD including NVMe-accessed storage. Just because it's flash or NVMe does not mean it's going to be fast.

Where to learn more

Learn more about Clouds and Data Infrastructure related trends, tools, technologies and topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips can be found in Software Defined Data Infrastructure Essentials book.

Software Defined Data Infrastructure Essentials Book SDDC

What this all means

Have situational awareness of your on-prem environment, knowing your resource costs as well as service levels, to make informed decisions. Don't be scared; be prepared. Avoid flying blind, plan ahead, and apply the appropriate resources, in the appropriate quantities, to meet application workload needs. Keep in mind that there are more than ten tips to reduce your cloud compute and storage costs; however, these should get you off to a good start.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Microsoft MVP Cloud and Data Center Management, VMware vExpert 2010-2018. Author of Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

How I saved money storing more data on aws s3 simple storage service #blogtobertech

How I saved money storing more data on aws s3 simple storage service

How I saved money storing more data on AWS S3 Simple Storage Service is an example of reducing cloud costs as opposed to merely cutting cloud costs. What this means is that instead of just cutting my cloud storage costs with a focus on how much I could save, I wanted to remove some costs while also storing more data without compromise. For example, since making the changes, storage capacity usage has almost doubled, yet costs remain 37% lower than two years ago, before the changes were made.
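
The arithmetic behind that claim is worth spelling out: if capacity roughly doubles while the bill drops 37%, the effective cost per TB falls by roughly two-thirds, as this quick sketch shows (normalized units, not my actual invoice figures).

# Worked arithmetic: capacity x2, bill x0.63 => cost per TB about 68% lower
before_cost, before_tb = 100.0, 1.0            # normalized baseline
after_cost, after_tb = before_cost * (1 - 0.37), before_tb * 2

per_tb_before = before_cost / before_tb        # 100.0
per_tb_after = after_cost / after_tb           # 31.5
print(f"cost per TB: {1 - per_tb_after / per_tb_before:.0%} lower")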

How did I save money storing more data on AWS S3?

Without adding any context, the typical reaction might be that I saved money storing more data on (or in) AWS S3 as opposed to locally on-site (on-prem). Another typical response would be that I moved all of my data from a different, more expensive cloud service to AWS S3. Yet another common reaction would be that I moved my AWS S3 data into AWS Glacier cold storage, or that I deleted a large amount of data.

Some might even comment that I must have used some type of dedupe, compression or other data footprint reduction (DFR) technology. On the other hand, some might determine that I probably did all or some of the above, or, leveraged AWS tiered storage, aligning different storage classes to the type of data activity.

How I saved money storing more data in AWS S3 actually involved spending some money up front to eventually save money by leveraging different S3 storage classes. As part of rebalancing or moving different data to its new storage class, some one-time charges were incurred, which were recouped after several months of savings. The costs pertained to EC2 compute instances and associated storage used for some of the data tiering; other fees were for access charges along with excessive API calls. For example, some of the data was in storage classes that had fees for early retrieval or deletion, or fees for access, among others. A code sketch after the following list shows how objects can be assigned to, and transitioned between, storage classes.

How I use different AWS S3 storage classes (tiers)

  • Standard – Frequently changing data, or data with frequent access
  • Infrequent Access (IA) – Data that does not change frequently or that is not routinely accessed. In the past, before OZA, I placed data that did not need to be in Standard, yet was too warm for Glacier, in this storage class. After the migrations, I have less data stored in IA, with more in OZA as well as some in Standard.
  • One Zone Availability (OZA) – Data that is frequently accessed for reading, however is static and not yet cold enough to move to Glacier or deep archive. A mix of backups, online and active archives. Note that I use OZA as an additional copy or location and not as the single, lowest-cost place to store data. In other words, anything that I put into OZA has at least one additional copy somewhere else, which may not be in the cloud.
  • Glacier – Very cold, seldom accessed, archives
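
Here is a hedged boto3 sketch of the mechanics: writing an object directly into a given storage class, and adding a lifecycle rule that transitions cooler data automatically. Bucket and prefix names are placeholders, and transition timing should be verified against current S3 documentation.

# Assign storage classes at upload time and via lifecycle rules (pip install boto3)
import boto3

s3 = boto3.client("s3")

# Write an object directly into One Zone-IA (keep another copy elsewhere)
s3.put_object(Bucket="example-bucket", Key="archives/2018/report.bin",
              Body=b"example payload", StorageClass="ONEZONE_IA")

# Transition objects under a prefix to Standard-IA after 30 days, Glacier after 180
s3.put_bucket_lifecycle_configuration(
    Bucket="example-bucket",
    LifecycleConfiguration={"Rules": [{
        "ID": "tier-cooler-data",
        "Filter": {"Prefix": "archives/"},
        "Status": "Enabled",
        "Transitions": [
            {"Days": 30, "StorageClass": "STANDARD_IA"},
            {"Days": 180, "StorageClass": "GLACIER"},
        ],
    }]},
)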

Where to learn more

Learn more about Clouds and Data Infrastructure related trends, tools, technologies and topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips can be found in Software Defined Data Infrastructure Essentials book.

Software Defined Data Infrastructure Essentials Book SDDC

What this all means

I decreased my AWS monthly bill by rebalancing things; there was a one-month period where my costs increased during the changes, followed by a subsequent reduction. However, while I saw my monthly AWS storage invoices decrease, I am also storing more data per month. How I saved money storing more data on AWS S3 Simple Storage Service involved using different storage classes.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Microsoft MVP Cloud and Data Center Management, VMware vExpert 2010-2018. Author of Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.

Next Generation Hybrid Software Defined Data Infrastructures Are In Your Future #blogtobertech

Next Generation Hybrid Software Defined Data Infrastructures Are In Your Future

A few weeks ago I was invited to present a keynote at the 1st annual Minnesota VMware User Group (VMUG) Super VMUG mega event in Minneapolis titled Next Generation Hybrid Software Defined Data Infrastructures Are In Your Future (download PDF presentation here).

Key themes of the presentation focused on data infrastructures (e.g., what's inside physical data centers, including server, storage, I/O networking, hardware, software, policies and procedures) along with industry trends including hybrid software-defined clouds (and containers). Another aspect of the presentation focused on building, refreshing and expanding our fundamental data infrastructure tradecraft skills. Also keep in mind that everything is not the same across different environments, granted there are similarities that can be leveraged.


Data Infrasture’s are defined to support business applications information service delivery

Data Infrastructures

The fundamental role of data infrastructures is to provide a platform environment for applications and data that is resilient, flexible, scalable, agile, efficient as well as cost-effective. Put another way, data infrastructures exist to protect, preserve, process, move, secure and serve data as well as their applications for information services delivery. Technologies that make up data infrastructures include hardware, software, cloud or managed services, servers, storage, I/O and networking, along with people, processes and policies, plus various tools spanning legacy, software-defined virtual, container and cloud environments.

Depending on your role or focus, you may have a different view than somebody else of what is infrastructure, or what an infrastructure is. Generally speaking, people tend to refer to infrastructure as those things that support what they are doing at work, at home, or in other aspects of their lives. For example, the roads and bridges that carry you over rivers or valleys when traveling in a vehicle are referred to as infrastructure.

Similarly, the system of pipes, valves, meters, lifts, and pumps that bring fresh water to you, and the sewer system that takes away waste water, are called infrastructure. The telecommunications network, both wired and wireless (such as cell phone networks), along with electrical generation and transmission networks, is likewise considered infrastructure. Even the airplanes, trains, boats, and buses that transport us locally or globally are considered part of the transportation infrastructure. Anything that is below what you do, or that supports what you do, is considered infrastructure.

The following figure shows various layers or altitudes of encapsulation and abstraction of data infrastructures along with their underlying resources that are defined to support a business enablement outcome, as well as support information services delivery.


Data Infrastructure Stack Layers and Resources Defined To Support Business Information Services

The following figure shows the evolution of data infrastructures from on-prem bare metal to software-defined virtual, cloud, container, converged and hyper-converged packaging, as well as emerging composable infrastructure. Also shown below are hybrid as well as multi-clouds, including bare metal dedicated services in addition to virtual machine instances and container-based services.


Data Infrastructure and Resource Packaging Deployment Evolution

Hybrid Software Defined Industry Trends

Some of the trends discussed in the presentation include:

Clouds – public, private, hybrid and multi-clouds, along with how they are being used, plus technology evolution including virtual machine (VM) instances, bare metal dedicated private servers (DPS) as well as metal as a service. Other cloud trends include data migration appliances such as AWS Snowball Edge and Microsoft Azure Data Box among others, VMware on AWS, as well as fog and edge computing.

Other trend topics included converged, hyper-converged, serverless, containers, persistent memory (PMEM) also known as storage class memory (SCM) along with other server storage I/O topics. Additional trend topics included data protection, Azure Stack, security, NVMe as well as NVMe over Fabrics (NVMeoF) along with composable and Gen-Z.

Tradecraft Skills and Experience

Expanding your data infrastructure tradecraft means evolving from your primary focus area, gaining insight into other technologies, tools and techniques in adjacent areas outside your comfort zone. For industry veterans with several years to many decades of experience, this means refreshing what you know, think you know or need to know with what's new or evolving. On the other hand, for those who are new, expanding your tradecraft means moving beyond memorizing to pass a certification test, to gaining insight on how, when, where and why to apply different tools, technologies and trends to the tasks at hand.

For example, developing tradecraft from knowing the different hardware, software, and services resources as well as tools, to what to use when, where, why, and how. Another dimension of expanding data infrastructure tradecraft skills is gaining the experience and insight to troubleshoot problems, gain insight awareness with dashboard or monitoring tools, as well as how to design and manage to cut or reduce the chance of things going wrong.

From Tools and Technologies to Techniques and Tricks of the Trade

Expanding your awareness of new technologies along with how they work is important, so too is understanding application and organization needs. Developing your tradecraft means balancing the focus on new and old technologies, tools, and techniques with business or organizational application functionality.

This is where using various tools that themselves are applications to gain insight into how your data infrastructure is configured and being used, along with the applications they support, is important.

Data Infrastructure Tools Tradecraft
Data Infrastructure Toolbox (Hardware, Software, Scripts)

Next Generation Hybrid Software Defined Data Infrastructures: What's Next


Balance head in the clouds (thinking, strategy, vision) with feet on the ground (what you can do today)

The following sections offer some additional tips, comments and recommendations to keep in mind for enabling your next generation hybrid software defined data infrastructure.

Where to learn more

Learn more about data infrastructures and tradecraft related trends, tools, technologies and topics via the following links:

Additional learning experiences along with common questions (and answers), as well as tips can be found in Software Defined Data Infrastructure Essentials book.

Software Defined Data Infrastructure Essentials Book SDDC

What this all means

Everything is not the same across different organizations, IT environments, application workloads and the data infrastructures that support them. Data infrastructures span from legacy on-prem to software-defined cloud (public, private, hybrid, multi-cloud), container, serverless, virtual, converged and hyper-converged, as well as central, core and distributed edge or remote office branch office (ROBO) deployments. Even though everything is not the same, there are similarities across different environments, technologies and workloads that can be leveraged. Fundamental tradecraft skills and experience are what enable you to know what to use when, where, why and how, including using new as well as old things in new ways, while not making old mistakes in new ways.

Some other tips: avoid flying blind, particularly in software-defined and cloud environments; have situational awareness and end-to-end (E2E) insight, leveraging metrics that matter and that are relevant, timely, accurate and hold context for the data infrastructures as well as the applications they support. Part of expanding your tradecraft skills is refreshing what you know while also expanding into new adjacent areas, getting out of your comfort zone. Also understand the context of different terms, technologies and tools. For example, SAS can be big data analytics statistical analysis software, a serial attached SCSI storage device, or a shared access signature for Azure clouds, among others.

Also keep in mind that while software-defined things are popular and trendy with the industry, keep the focus on what is being defined to enable an outcome or business enablement. In other words, the emphasis should not be on the software aspect per se, rather on how something (hardware, software, service) is defined to enable something. Also keep in mind, amid software-defined marketing and trends such as serverless, that servers and software still need hardware (somewhere), and hardware still needs software, from microcode to firmware to many other places in the data infrastructure layers or stack. Meanwhile, keep in mind that it is #blogtobertech and Next Generation Hybrid Software Defined Data Infrastructures Are In Your Future.

Ok, nuff said, for now.

Cheers Gs

Greg Schulz – Microsoft MVP Cloud and Data Center Management, VMware vExpert 2010-2018. Author of Software Defined Data Infrastructure Essentials (CRC Press), as well as Cloud and Virtual Data Storage Networking (CRC Press), The Green and Virtual Data Center (CRC Press), Resilient Storage Networks (Elsevier) and twitter @storageio. Courteous comments are welcome for consideration. First published on https://storageioblog.com any reproduction in whole, in part, with changes to content, without source attribution under title or without permission is forbidden.

All Comments, (C) and (TM) belong to their owners/posters, Other content (C) Copyright 2006-2024 Server StorageIO and UnlimitedIO. All Rights Reserved. StorageIO is a registered Trade Mark (TM) of Server StorageIO.