How can existing data warehouse environments best scale to meet the needs of big data analytics?
Individual data warehouse projects need to be assessed on a case-by-case basis, but stretching an existing data warehouse design to better handle big data analytics generally comes down to one core decision, which IT professionals frame as "scaling up" versus "scaling out."
Scaling up generally means giving a single server more processing power and more memory so it can handle the larger data sets the business will process. Scaling out, by contrast, means networking clusters of server hardware together to corral big data across many machines.
Some IT experts suggest that the more common approach with Apache Hadoop and other popular big data tools and platforms is to scale out, clustering commodity hardware to reach the desired capacity. Others point out that with today's technology, a data warehouse can also scale up through a procurement strategy that adds resources to a single server, such as a higher number of processing cores and a larger amount of RAM.
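The scale-out arithmetic behind clustering can be sketched roughly. This is a minimal illustration, not a sizing tool: the node capacity figures below are made-up assumptions, though the threefold replication default does reflect HDFS's out-of-the-box setting.

```python
import math

def nodes_needed(workload_tb: float, node_capacity_tb: float,
                 replication: int = 3) -> int:
    """Rough scale-out estimate: how many commodity nodes a cluster
    needs to hold a workload, given that HDFS stores each block
    replication-factor times (3 by default)."""
    raw_tb = workload_tb * replication
    return math.ceil(raw_tb / node_capacity_tb)

# Hypothetical example: a 100 TB workload on 12 TB nodes
print(nodes_needed(100, 12))  # 25
```

The point of the sketch is that scaling out multiplies storage needs by the replication factor, which is part of why the planning described below matters before committing to either strategy.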
Whether they scale up or scale out, data warehouses need additional physical hardware to handle larger data workloads. They also need additional human administration, which means more training for internal teams. A lot of planning should go into determining what kind of stress the larger workloads will place on an existing legacy system before outfitting it for a new big data ecosystem. One big issue is storage bottlenecks, which require upgrades to storage infrastructure; other kinds of performance bottlenecks can also hobble a nascent system if they're not addressed.