In computer science, granularity refers to the ratio of computation to communication in a task and, in the classical sense, to the breaking down of larger holistic tasks into smaller, more finely delegated ones. Thinking about granularity can inform development practices and guide technology design by drawing attention to how individual computing tasks fit into an entire project.
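As a minimal sketch of that computation-to-communication trade-off, the example below splits the same summation into fine-grained and coarse-grained tasks. The function names and chunk sizes are illustrative assumptions, not drawn from any particular source: a smaller chunk size means finer granularity, so each task does less computation relative to the cost of handing work to a worker.

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    # Each task does some computation; shipping the chunk to a worker
    # process is the "communication" side of the ratio.
    return sum(chunk)

def parallel_sum(data, chunk_size):
    # Smaller chunk_size = finer granularity: more tasks, each carrying
    # less computation per unit of scheduling/communication overhead.
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ProcessPoolExecutor() as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    data = list(range(1_000_000))
    fine = parallel_sum(data, chunk_size=100)        # fine-grained: ~10,000 small tasks
    coarse = parallel_sum(data, chunk_size=250_000)  # coarse-grained: 4 large tasks
    assert fine == coarse == sum(data)
```

Both calls produce the same answer; what changes is how much of the runtime goes to useful computation versus coordinating the tasks.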
Experts note that finer granularity results in smaller tasks with smaller code sizes and faster execution times. Professional programmer George Moromisato, writing on Quora, describes granularity as “the ability to manipulate, display or specify small, discrete pieces as opposed to large groups,” using the example of firewall software that targets individual IP addresses instead of massive address blocks.
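A hedged sketch of that firewall example follows: the toy rule set below can block either a coarse address block or a single host. The Firewall class and its methods are hypothetical illustrations, not any real firewall's API.

```python
import ipaddress

class Firewall:
    """Toy rule set illustrating coarse- vs. fine-grained blocking."""

    def __init__(self):
        self.blocked_networks = []

    def block(self, cidr):
        # A /32 blocks one specific address (fine-grained); a /24 blocks
        # 256 addresses at once (coarse-grained).
        self.blocked_networks.append(ipaddress.ip_network(cidr))

    def is_blocked(self, address):
        ip = ipaddress.ip_address(address)
        return any(ip in net for net in self.blocked_networks)

fw = Firewall()
fw.block("203.0.113.0/24")   # coarse: an entire address block
fw.block("198.51.100.7/32")  # fine: one individual host

assert fw.is_blocked("203.0.113.50")       # caught by the block-wide rule
assert fw.is_blocked("198.51.100.7")       # caught by the host-specific rule
assert not fw.is_blocked("198.51.100.8")   # neighboring host stays reachable
```

The finer-grained rule lets an administrator isolate one misbehaving host without cutting off its neighbors, which is exactly the kind of control Moromisato's definition points at.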
In general, when IT professionals talk about granularity, they mean bringing a finer level of detail to technologies, computing, or code work. This matches the general lexical meaning of granularity: breaking things into finer, and often more manageable, pieces.