Large Scale Data Analysis

What Does Large Scale Data Analysis Mean?

Large scale data analysis is the process of applying data analysis techniques to large amounts of data, typically held in big data repositories. It uses specialized algorithms, systems and processes to review, analyze and present the information in a form that is more meaningful for organizations or end users.


Techopedia Explains Large Scale Data Analysis

Large scale data analysis is a broad term that encompasses a range of tools and systems for processing big data. Typically, it is performed through one of two popular techniques: parallel database management systems (DBMS) or MapReduce-powered systems. A parallel DBMS requires that the data conform to a DBMS-supported schema, whereas the MapReduce option accepts data in any form. The results of large scale data analysis can be presented in various forms, such as tables, graphs, figures and statistical summaries, depending on the analysis system.
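To make the MapReduce technique mentioned above concrete, the following is a minimal, single-process sketch of its three phases (map, shuffle, reduce) using a word count over a handful of documents. In a real system such as Hadoop, each phase would run in parallel across many machines; the document strings here are made-up illustrative data.

```python
from collections import defaultdict

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in one document.
    return [(word.lower(), 1) for word in document.split()]

def shuffle(mapped_pairs):
    # Shuffle: group all emitted values by their key.
    groups = defaultdict(list)
    for key, value in mapped_pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values into a single result.
    return {key: sum(values) for key, values in groups.items()}

documents = ["big data needs big systems", "data analysis at scale"]
mapped = [pair for doc in documents for pair in map_phase(doc)]
counts = reduce_phase(shuffle(mapped))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

Because the map step treats each document independently and the reduce step treats each key independently, both can be distributed across a cluster, which is what lets MapReduce handle data that does not fit a predefined DBMS schema.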



Margaret Rouse
Technology Expert

Margaret is an award-winning technical writer and teacher known for her ability to explain complex technical subjects to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles by the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.