What Does Granularity Mean?

In computer science, granularity refers to the ratio of computation to communication in a program, and also, in the classical sense, to the breaking down of larger, holistic tasks into smaller, more finely delegated ones. Granularity can inform development practices and guide technology design by drawing attention to how individual computing tasks fit into the context of an entire project.
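The computation-to-communication ratio can be sketched as a simple formula. The timing numbers below are illustrative assumptions, not measurements:

```python
def granularity(compute_time: float, comm_time: float) -> float:
    """Granularity as the ratio of time spent computing to time spent
    communicating. Higher values mean coarser-grained work."""
    return compute_time / comm_time

# Coarse-grained: each task does a lot of computation per message exchanged.
coarse = granularity(compute_time=100.0, comm_time=5.0)  # ratio of 20.0

# Fine-grained: tasks are small, so communication overhead dominates.
fine = granularity(compute_time=2.0, comm_time=5.0)      # ratio of 0.4

print(coarse, fine)
```

Fine-grained decomposition makes work easier to balance across processors, but the lower ratio signals that communication overhead eats into the gains.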


Techopedia Explains Granularity

Experts note that finer granularity results in smaller tasks with smaller code sizes and faster execution times. Professional programmer George Moromisato, writing on Quora, describes granularity as "the ability to manipulate, display or specify small, discrete pieces as opposed to large groups," using the example of firewall software that targets individual IP addresses instead of massive address blocks.
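The firewall example above can be illustrated with Python's standard `ipaddress` module. The rule sets and addresses here are hypothetical, chosen from a reserved documentation range:

```python
import ipaddress

# Coarse-grained rule: block an entire /24 address block (256 addresses).
coarse_block = ipaddress.ip_network("203.0.113.0/24")

# Fine-grained rules: target only specific, individual addresses.
fine_rules = {ipaddress.ip_address("203.0.113.7")}

addr = ipaddress.ip_address("203.0.113.42")

blocked_by_coarse = addr in coarse_block  # True: the whole block matches
blocked_by_fine = addr in fine_rules      # False: only .7 is targeted
print(blocked_by_coarse, blocked_by_fine)
```

The fine-grained rule set requires more entries to maintain, but it avoids blocking legitimate traffic that merely shares an address block with an offender.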

In general, when IT professionals talk about granularity, they mean bringing a finer level of detail to technologies, computing, or code. This matches the general lexical meaning of granularity: breaking things into finer, and often more manageable, pieces.



Margaret Rouse
