Overcoming the challenges of Big Data and making the most out of it

Tal Nitzan


Over the better part of the current decade, a growing number of businesses have come to realize the importance of something I encounter constantly in my line of work: the popular buzzword that is big data, in all its glory.

The effect it has on marketing, sales, IT, and other business operations is often negated by the fact that data on its own can’t always solve problems. Wherever you read ‘big data’ online, it’s usually accompanied by some form of the word ‘challenge’. While it’s tempting to look at it this way, the problem is not big data itself – it’s how companies use it, or rather, fail to use it. In this post I’ll cover examples of misuse that will hopefully help you form the right approach to big data. Ready? Let’s get started.

(Mis)Handling the data growth

The first thing that comes to mind is how unaware people are of the enormous proportions of big data. If it were in the form of a robot, I’m pretty sure the 60s Japanese import Gigantor would be a close fit, especially with the matching tagline:

“Bigger than big, taller than tall, quicker than quick, stronger than strong!”

Anime adaptations aside, the proliferation of cloud computing is producing enormous amounts of data from countless data points. We’re at a point where the data is ahead of the technology. There isn’t a platform powerful enough to handle all the big data being collected (skilled people are also scarce – I’ll get to that in a minute), so a lot hinges on how you organize your data.

Some people start working before they have a template or a way to synchronize everyone who is working on the data itself. There are too many complexities in accessing, transmitting, and delivering data from different sources and loading it into a big data platform. Doing it without some kind of established, structured process is risky, to say the least. There’s a lot of noise in big data, so it needs to be cleaned first – there are more than enough bad data points to give you a recurring headache.
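To make the cleaning idea concrete, here’s a minimal sketch of a validation pass run before records ever reach a big data platform. The field names and validity rules are made up for illustration – your own schema will dictate the real ones.

```python
# Hypothetical cleaning pass: drop noisy records before loading them.
def clean(records):
    """Keep only records with the required fields and plausible values."""
    cleaned = []
    for rec in records:
        # Require the fields downstream consumers depend on.
        if not all(key in rec for key in ("user_id", "event", "timestamp")):
            continue
        # Discard impossible values (e.g. negative timestamps) as noise.
        if rec["timestamp"] < 0:
            continue
        cleaned.append(rec)
    return cleaned

raw = [
    {"user_id": 1, "event": "click", "timestamp": 1714000000},
    {"user_id": 2, "event": "click"},                 # missing timestamp
    {"user_id": 3, "event": "view", "timestamp": -5}, # impossible value
]
print(clean(raw))  # only the first record survives
```

Even a simple gate like this, applied consistently at ingestion, spares you from chasing bad data points after the fact.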

You have to organize your big data methodically; otherwise, the mass of data you are bent on analyzing and processing will swell beyond your grasp. What’s worse, it can spike at literally any time. That’s why there needs to be support for both the operational and analytical processing needs of a company, typically within a NoSQL framework whose data models can evolve across different categories. Communication plays a key role as well: it helps organizations educate their people on the many aspects of big data through trainings and workshops, so that it can be integrated into their goals.
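One hypothetical way those evolving models play out in a document-oriented (NoSQL-style) store is to tag each document with a schema version and upgrade old shapes lazily on read, instead of migrating everything at once. The document fields here are invented for the sketch:

```python
# Hypothetical schema-evolution sketch for a document store.
def upgrade(doc):
    """Bring an older document up to the current (v2) shape on read."""
    if doc.get("schema_version", 1) == 1:
        # v1 stored a single "name"; v2 splits it into first/last.
        first, _, last = doc.pop("name").partition(" ")
        doc.update(first_name=first, last_name=last, schema_version=2)
    return doc

old = {"schema_version": 1, "name": "Ada Lovelace"}
print(upgrade(old))  # now carries first_name/last_name and schema_version 2
```

The point of the pattern is that the model can keep evolving without a disruptive, all-at-once migration – a big reason flexible-schema stores are popular for data that spikes unpredictably.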

Make no mistake – this is going to be an endless battle, particularly as the environment around us becomes more interconnected. If there’s a silver lining, it’s that as data grows in volume, the tools will grow in efficiency and sophistication.

Here’s something to think about. In some cases, it’s better to think through a specific data point and go really deep, like Mariana Trench deep.


Such an approach can give you a significant advantage in a particular area, as opposed to operating at the macro level. There’s only so much you can know, so if there’s one specific thing you know better than anyone, it can be your competitive advantage and a key differentiator. I guess what I’m trying to say is that sometimes, less is more.

Not extracting usable information

Another major challenge is filtering big data down to actionable (and relevant!) insights. Big data is neither 100% accurate nor entirely vital to your operations. With so much data in play, a little reverse engineering might solve your problems. In essence, you have to know beforehand what specific output you want, so you can ask the right questions, get the right data, and execute whatever it is you want to do.

Complex as it sounds, it’s actually interesting how much you can accomplish by studying the data. However, you have to pick your battles, because you can’t do everything. Doing too many things at the same time will easily leave you confused and misguided. Trying to do everything with big data is simply too much. You have to know the specific answers and conclusions you want to reach, and work backwards from there.

For example, say an ecommerce business wants to know which accessories people buy together with Apple’s Watch Series 4, so it can promote them and maybe offer discounts. It needs to analyze competitor and social media data for trends, but keep its focus on that one product rather than smartwatches as a whole. That way, it’s easier to account for all the factors and data sources whose analysis will yield the required insights.
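The working-backwards idea above can be sketched in a few lines: start from the question (“which accessories sell alongside this one product?”) and only count over the orders that can answer it. The product names and orders are invented for illustration:

```python
from collections import Counter

# Hypothetical co-purchase count scoped to a single target product.
TARGET = "Apple Watch Series 4"

orders = [
    ["Apple Watch Series 4", "sport band", "charger dock"],
    ["Apple Watch Series 4", "sport band"],
    ["iPhone case", "charger dock"],  # no target product: ignored
]

co_purchases = Counter(
    item
    for order in orders
    if TARGET in order       # filter: only orders that answer the question
    for item in order
    if item != TARGET        # count everything bought alongside the target
)
print(co_purchases.most_common(2))  # sport band appears twice
```

Notice how the question shapes the code: the filter on `TARGET` is exactly the narrowing of focus – one product, not smartwatches as a whole – that keeps the analysis tractable.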

Lack of skills

As with any new profession, a lack of experts is a common occurrence, and big data is no exception. There are not enough educational degrees dedicated to big data, as the majority of modern technology education is still focused on computer science without the necessary specialization.

Very few people are actually experienced and trained in big data, even if they call themselves professionals. I’d argue that not enough time has passed since the advent of big data for anyone to become a true expert. This is largely a trial-and-error job, where you gain experience through implementation time and time again. I view it as such, and take that exact approach when on the lookout for people with big data skills.

Spending too much and not generating ROI

I’ve touched upon this subject a bit already. Big data is very specific. It’s a whole world of collecting information. With so many industries and niches, there isn’t a solution customized enough to get everything in one place. There are tools and solutions that can help you, but you have to look at the ROI – each tool you use costs money. Then there are new hires like developers and engineers, software development and configuration, new hardware, and so on – the expenses keep piling up.


The key to solving this challenge differs from company to company, but the foundation is the same: analyze your needs and goals, and choose the appropriate course of action. For instance, if your focus is on security, then a purely cloud-based big data solution may not be for you – a hybrid setup could suit you better. If you want to scale, there are solutions that handle far bigger data than, say, traditional relational databases, but they take affordability completely out of the picture.

There are numerous examples, but the point is that big data solutions don’t have to cost a lot of money: you have to figure out where the costs are incurred in order to improve performance without wasting money on overkill solutions.

Final words

The thing with big data is that there is no single, all-encompassing solution that tidies up the mess that comes with massive volumes of raw data. It’s incredibly easy to get more than you bargained for. Almost all of the challenges listed here, as well as other miscellaneous issues, can be foreseen and dealt with if there’s a systematic approach and a well-organized architecture in place. That’s how you grapple with the giant beast that is big data.
