
What considerations are most important when deciding which big data solutions to implement?


Every business and organization must weigh its own needs and resources when deciding which issues matter most for a big data implementation. That said, a number of principles are generally considered critical to this kind of technology adoption.


One of the biggest questions is implementation and how much disruption it will cause. Adopters of big data systems always have to compare the tools they are about to use with the ones they use today. In many cases, disruption is the deciding factor in whether big data resources boost productivity and profits, or send a business crashing down under insurmountable implementation hurdles. Vendor support (or the lack of it) has a lot to do with this, but businesses also have to look at the learning curve of the new technologies, how much they would change the operation of legacy systems and, in general, whether the changes are something the enterprise can handle.

Another major question is which data is most valuable to the business or organization. By examining the value of different data sets, those implementing big data can set the scope of their project. Without these kinds of guidelines, big data projects can become bloated and overwhelm an enterprise. Experts recommend focusing on the specific data sets that will deliver the most value, rather than casting an ever wider net.

A corollary issue is the use of structured versus unstructured data. Business leaders can weigh how difficult it would be to bring different pieces of data into a big data context such as a data center. Already-formatted data sets, for example, can be digested easily, while other pieces of data may need extensive manipulation to get them into a useful format, and that effort may not be worth it.
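As a rough illustration of that gap, compare loading an already-structured CSV with extracting the same fields from free-text log lines. The data, field names and log format below are hypothetical, invented for the sketch; the point is only that unstructured input needs a custom parsing step before it is usable.

```python
import csv
import io
import re

# Structured data: a CSV with a header row loads directly into records.
structured_csv = "user,action,ms\nalice,login,120\nbob,search,340\n"
structured_rows = list(csv.DictReader(io.StringIO(structured_csv)))

# Unstructured data: free-text log lines (a made-up format) need a
# pattern-matching step to pull the same fields out.
log_lines = [
    "2023-01-05 alice performed login in 120ms",
    "2023-01-05 bob performed search in 340ms",
]
pattern = re.compile(r"(?P<user>\w+) performed (?P<action>\w+) in (?P<ms>\d+)ms")
unstructured_rows = [
    m.groupdict() for line in log_lines if (m := pattern.search(line))
]

print(structured_rows[0]["user"])      # alice
print(unstructured_rows[1]["action"])  # search
```

In a real project that parsing step is where the cost lives: every new unstructured source means new extraction logic, which is why some data may not be worth onboarding.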

Adopters will also have to plan for advanced big data handling. Big data systems are, by definition, those that are difficult to manage with basic hardware and software infrastructures. That means adopters need adequate talent and resources on hand to use large data sets in ways that won't cause network congestion or otherwise create bottlenecks in operations.
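One common tactic for avoiding such bottlenecks is streaming data in fixed-size chunks rather than loading an entire data set into memory at once. The sketch below assumes a simple line-counting workload over a file; the chunk size, file contents and function name are illustrative, not a prescribed implementation.

```python
import os
import tempfile

def stream_line_count(path, chunk_size=1 << 20):
    """Count lines by reading chunk_size bytes (1 MiB by default) at a time,
    so memory use stays flat no matter how large the file is."""
    count = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            count += chunk.count(b"\n")
    return count

# Demo: a small temporary file stands in for a multi-gigabyte data set.
with tempfile.NamedTemporaryFile("wb", delete=False, suffix=".log") as tmp:
    tmp.write(b"sensor_reading\n" * 1000)
    demo_path = tmp.name

n = stream_line_count(demo_path)
print(n)  # 1000
os.remove(demo_path)
```

The same chunking idea scales up to distributed frameworks, where splitting work into bounded pieces is what keeps any one node or network link from becoming the choke point.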


Techopedia Staff

