Large Scale Data Analysis

What Does Large Scale Data Analysis Mean?

Large scale data analysis is the process of applying data analysis techniques to large amounts of data, typically held in big data repositories. It uses specialized algorithms, systems and processes to review, analyze and present the information in a form that is more meaningful for organizations or end users.


Techopedia Explains Large Scale Data Analysis

Large scale data analysis is a broad term that encompasses a range of tools and systems used to process big data. It is typically performed with one of two popular approaches: parallel database management systems (DBMS) or MapReduce-powered systems. The parallel DBMS approach requires the data to conform to a DBMS-supported schema, whereas MapReduce can handle data in any form. The results of large scale data analysis can be presented in various forms, such as tables, graphs, figures and statistical summaries, depending on the analysis system.
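To illustrate why MapReduce-powered systems can work with data in any form, here is a minimal, single-machine sketch of the MapReduce pattern in Python. The sample records and the word-count task are illustrative assumptions, not a real system's API; production frameworks such as Hadoop distribute the same map, shuffle and reduce phases across a cluster.

```python
from collections import defaultdict

def map_phase(records):
    # Map: turn each raw record into (key, value) pairs.
    # The records need no predefined schema, unlike a parallel DBMS.
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Shuffle: group values by key, then Reduce: aggregate each group.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return {key: sum(values) for key, values in grouped.items()}

if __name__ == "__main__":
    # Hypothetical unstructured input records
    records = [
        "big data needs big systems",
        "data analysis at scale",
    ]
    counts = reduce_phase(map_phase(records))
    print(counts)  # e.g. {'big': 2, 'data': 2, 'needs': 1, ...}
```

In a distributed setting, the map phase runs in parallel on chunks of the input, and the shuffle step routes each key to the worker responsible for reducing it; the single-process version above only sketches the data flow.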


