Optimizing Data Center Infrastructure For Big Data Workloads

Big data workloads, as the name suggests, are workloads that involve very large quantities of data. Consequently, they place heavy demands on data center infrastructure, and those demands will only grow as the use of big data expands. With that in mind, here is a quick guide to optimizing data center infrastructure for big data workloads.

Understanding big data workloads

Big data workloads are identified by a set of characteristics often referred to as “the four Vs”. These are volume, variety, veracity, and velocity.

Volume: This is self-explanatory. It’s what puts the “big” into big data. The amount of data created by individuals, businesses, and organizations has vastly increased over recent decades. As recently as the 1990s, the standard 3.5-inch floppy disk used by businesses held just 1.44 MB. Now, individuals may need terabytes of storage while businesses have reached petabytes and beyond.

Variety: Data has become much more diverse. In particular, there has been significant growth in the volume of semi-structured and unstructured data, and the ratio of such data to structured data looks set to increase. This is because semi-structured and unstructured data is often generated by maturing technologies such as sensor-equipped smart devices.

Veracity: Society is generating ever-increasing volumes of data, but not all of that data is effectively checked for quality. As a result, some of it is noisy (corrupted, distorted, or with a low signal-to-noise ratio), incomplete, or inconsistent.

Even data that is technically correct may be out of date, duplicated, or irrelevant. This means that modern data centers must incorporate mechanisms for data cleansing and validation to ensure the accuracy of insights derived from the data.
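
To make the cleansing idea concrete, here is a minimal pandas sketch of a validation pass over an incoming feed; the file name, column names, and thresholds are all hypothetical, and a production pipeline would do far more:

```python
import pandas as pd

# Hypothetical sensor feed with typical veracity problems:
# duplicates, missing readings, stale rows, out-of-range noise.
df = pd.read_csv("sensor_readings.csv", parse_dates=["timestamp"])

# Remove exact duplicates and rows missing critical fields.
df = df.drop_duplicates()
df = df.dropna(subset=["device_id", "timestamp", "reading"])

# Drop readings outside a plausible physical range (noise).
df = df[df["reading"].between(-40.0, 125.0)]

# Discard stale records older than 30 days.
cutoff = pd.Timestamp.now() - pd.Timedelta(days=30)
df = df[df["timestamp"] >= cutoff]

df.to_csv("sensor_readings_clean.csv", index=False)
```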

Velocity: Modern users often want to be served yesterday; now is just their second-best option. Certain businesses need data in real time (or as close to it as possible) to make vital decisions. Consumers want it to improve their experience. For example, they want to feel like they are chatting with someone in person even though they’re online.

Scalability for big data workloads

The traditional approach to scaling is what is now called vertical scaling. This essentially means boosting the capability of your existing resources. In the context of data centers, that could mean increasing the specifications of servers by upgrading the CPU and/or adding memory or storage. This certainly makes a difference, but the extent of that difference may be limited, and it can be a relatively expensive approach.

As a result, vertical scaling is now often replaced by (or combined with) horizontal scaling. This essentially means expanding your infrastructure, for example by adding more servers or nodes. Doing so makes it possible to distribute workloads across multiple machines and hence to handle increased data volumes.

Options for scaling

From a user perspective, probably the most obvious option for horizontal scaling is to use the public cloud. In the context of data centers, however, the two key options are the use of distributed file systems such as the Hadoop Distributed File System (HDFS) and the use of distributed storage solutions.
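
As a rough illustration of how a distributed file system supports horizontal scaling, here is a minimal PySpark sketch. It assumes an existing Spark cluster with HDFS; the namenode address and data path are placeholders:

```python
from pyspark.sql import SparkSession

# Attach to the cluster (configuration comes from the environment
# when the job is launched with spark-submit).
spark = SparkSession.builder.appName("big-data-scan").getOrCreate()

# Spark splits the files into partitions and spreads them across
# the worker nodes, so adding nodes adds capacity.
df = spark.read.parquet("hdfs://namenode:8020/data/events/")

# The aggregation runs in parallel on each node's partitions
# before the partial results are combined.
daily_counts = df.groupBy("event_date").count()
daily_counts.show()

spark.stop()
```

Because both the data and the computation are partitioned, each additional node adds read and processing capacity, which is the essence of horizontal scaling.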

Performance optimization for big data workloads

There are four main considerations when optimizing data center infrastructure for big data workloads. These are hardware, processing, storage systems, and resource allocation.

Hardware

Your first hardware consideration is to ensure that you actually have enough servers. Those servers need to be fitted with high-powered multi-core processors along with plenty of top-quality memory and storage. 

Traditional hard disk drives (HDDs) are likely to be far too slow for big data workloads. Instead, you will need high-speed, high-capacity storage solutions such as solid-state drives (SSDs) and non-volatile memory express (NVMe) drives. 

You will also need robust and high-performance network infrastructure to ensure high-bandwidth connections with minimal latency.

Processing

Processing big data workloads often requires parallel processing, in-memory processing, or both.

Parallel processing breaks large tasks down into smaller ones that can be distributed and processed simultaneously on different resources. In-memory processing means operating on data held in memory rather than on storage; since memory is faster than even fast storage, tasks complete more quickly.
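
Here is a minimal sketch of both ideas using Python’s standard multiprocessing module; the data, the per-chunk work, and the worker count are illustrative stand-ins:

```python
from multiprocessing import Pool

def process_chunk(chunk):
    # Stand-in for real per-chunk work; note it operates entirely
    # on data already in memory rather than reading from storage.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))

    # Break the large task into smaller chunks...
    chunk_size = len(data) // 8
    chunks = [data[i:i + chunk_size]
              for i in range(0, len(data), chunk_size)]

    # ...and process them simultaneously on separate cores.
    with Pool(processes=8) as pool:
        partial_results = pool.map(process_chunk, chunks)

    print(sum(partial_results))
```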

Storage systems

Traditional row-based storage formats (e.g., CSV or Avro) are generally still fine for transactional data. Analytical workloads, however, often benefit from columnar formats (e.g., Parquet or ORC), as these facilitate rapid data scanning and aggregation.

Using the right storage format can significantly improve your performance when processing big data workloads. Implementing effective compression can improve it even further.
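
As a rough sketch of the difference, the following pandas example writes the same synthetic table in a row-oriented format and in a compressed columnar format, then reads back only one column; it assumes pyarrow is installed for Parquet support:

```python
import numpy as np
import pandas as pd

# A synthetic events table, one million rows.
n = 1_000_000
df = pd.DataFrame({
    "user_id": np.arange(n),
    "event_type": np.random.choice(["view", "click", "buy"], size=n),
    "amount": np.random.rand(n),
})

# Row-oriented format: every query must read whole rows.
df.to_csv("events.csv", index=False)

# Columnar format with compression: analytical scans read only
# the columns they need, and similar values compress well.
df.to_parquet("events.parquet", compression="snappy")

# An aggregation loads just the one column it touches.
amounts = pd.read_parquet("events.parquet", columns=["amount"])
print(amounts["amount"].sum())
```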

Resource allocation

Last but definitely not least, it’s vital to allocate resources effectively so that capacity follows demand. Investing in AI-powered monitoring and orchestration tools can help significantly with this.

 
