Definition - What does Granularity mean?
In computer science, granularity refers to the ratio of computation to communication in a parallel program – and also, in the classical sense, to the breaking down of larger holistic tasks into smaller, more finely delegated tasks. Granularity can inform development practices and guide the design of technologies by bringing attention to how computing tasks work in the context of an entire project.
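The computation-to-communication ratio can be made concrete with a small sketch. In the hypothetical Python snippet below (the names `process` and `parallel_sum` are illustrative, not from any particular library), the same array is summed in parallel with different chunk sizes: smaller chunks mean finer granularity, with more tasks dispatched and less computation per dispatch.

```python
from concurrent.futures import ThreadPoolExecutor

data = list(range(1_000))

def process(chunk):
    # The "computation" done per task: sum one chunk of the data.
    return sum(chunk)

def parallel_sum(chunk_size):
    # Each submitted chunk is one task. A smaller chunk_size means
    # finer granularity: more tasks, so more dispatch/communication
    # overhead relative to the computation each task performs.
    chunks = [data[i:i + chunk_size]
              for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor() as pool:
        return sum(pool.map(process, chunks))

# Same result either way; only the granularity of the tasks differs.
fine = parallel_sum(chunk_size=10)     # 100 small tasks
coarse = parallel_sum(chunk_size=250)  # 4 large tasks
```

Choosing the chunk size is exactly the granularity trade-off: finer tasks balance load better across workers, while coarser tasks keep overhead low.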
Techopedia explains Granularity
Experts note that "finer granularity" results in smaller tasks with smaller code sizes and faster execution times. Professional programmer George Moromisato, writing on Quora, describes granularity as "the ability to manipulate, display or specify small, discrete pieces as opposed to large groups," using the example of firewall software that targets individual IP addresses instead of massive address blocks.
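The firewall example above can be sketched in a few lines of Python using the standard-library `ipaddress` module. The rule sets and helper functions here are hypothetical, purely to contrast the two levels of granularity: a coarse rule blocks a whole network in one stroke, while fine rules target only specific hosts.

```python
import ipaddress

# Coarse-grained rule: block an entire /24 network with one entry.
coarse_rules = [ipaddress.ip_network("203.0.113.0/24")]

# Fine-grained rules: target only the specific offending hosts.
fine_rules = [ipaddress.ip_address("203.0.113.7"),
              ipaddress.ip_address("203.0.113.42")]

def blocked_coarse(ip):
    """An address is blocked if it falls inside any blocked network."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in coarse_rules)

def blocked_fine(ip):
    """An address is blocked only if it is explicitly listed."""
    return ipaddress.ip_address(ip) in fine_rules

# The coarse rule sweeps up 256 addresses; the fine rules block just 2,
# so an uninvolved host on the same network is only caught by the
# coarse rule.
blocked_coarse("203.0.113.9")  # caught as collateral
blocked_fine("203.0.113.9")    # not listed, so not blocked
```

The finer rule set is more work to maintain but avoids collateral blocking, which is precisely the manageability-versus-precision trade-off that granularity describes.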
In general, when IT professionals talk about granularity, they mean bringing a finer level of detail to technologies, computing, or code. This matches the general lexical meaning of granularity: breaking things into finer, and often more manageable, pieces.