Network Degradation

Definition - What does Network Degradation mean?

Network degradation refers to a decrease in connectivity and response speed throughout a given network. Analyzing and diagnosing this kind of deterioration is important for those maintaining networked applications and data services, as well as for preserving network functionality in general.

Techopedia explains Network Degradation

Causes of network degradation include propagation delays, which arise from physically transporting data across an IT architecture, and problems with routing. Components of an IT system that modify or process data in transit can also introduce various kinds of delay. End-point problems are another source: terminals or workstations acting as data destinations may respond slowly because of insufficient memory or processing capacity. Further degradation can result from malware or spyware running on network hosts.
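To put the propagation-delay cause in concrete terms, the sketch below estimates one-way delay over an optical-fiber link. The fiber signal speed (~200,000 km/s, roughly two-thirds the speed of light in a vacuum) and the example distance are illustrative assumptions, not values from this article.

```python
# Rough sketch: one-way propagation delay over an optical-fiber link.
# Assumption: signal speed in fiber is about 200,000 km/s.

SPEED_IN_FIBER_KM_PER_S = 200_000


def propagation_delay_ms(distance_km: float) -> float:
    """One-way propagation delay in milliseconds for a fiber run."""
    return distance_km / SPEED_IN_FIBER_KM_PER_S * 1000


if __name__ == "__main__":
    # A ~4,000 km transcontinental link adds about 20 ms each way,
    # before any queuing, routing, or processing delay is counted.
    print(f"{propagation_delay_ms(4000):.1f} ms")
```

This floor on latency cannot be engineered away, which is why it is usually the first factor ruled in or out when diagnosing slow response times.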

While problems with individual hardware devices don't usually degrade performance across an entire network, other problems can be network-wide. For example, fragmentation of data packets can reduce network performance. In addition, those analyzing network degradation may look at the impact of denial-of-service (DoS) attacks, or various other kinds of outside intrusion that can affect a network.
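As a rough illustration of why fragmentation hurts performance, the sketch below counts how many IPv4 fragments a datagram needs when it exceeds the path MTU, and the extra header bytes that result. The 20-byte header is the standard IPv4 minimum; the payload and MTU values in the usage comment are illustrative assumptions.

```python
# Sketch: per-packet overhead introduced by IPv4 fragmentation.
# Assumption: minimal 20-byte IPv4 headers and no link-layer overhead.
import math

IPV4_HEADER = 20  # bytes, minimum IPv4 header size


def fragments_needed(payload: int, mtu: int) -> int:
    """Number of fragments for a payload sent over links with this MTU."""
    per_fragment = mtu - IPV4_HEADER
    # Fragment offsets are expressed in 8-byte units, so each non-final
    # fragment carries a multiple of 8 payload bytes.
    per_fragment -= per_fragment % 8
    return math.ceil(payload / per_fragment)


def total_bytes_on_wire(payload: int, mtu: int) -> int:
    """Payload plus one IPv4 header per fragment."""
    return payload + fragments_needed(payload, mtu) * IPV4_HEADER


if __name__ == "__main__":
    # e.g. a 4,000-byte payload over a 1,500-byte-MTU path splits into
    # 3 fragments, each adding its own header and per-packet processing.
    print(fragments_needed(4000, 1500), total_bytes_on_wire(4000, 1500))
```

Beyond the raw byte overhead, every fragment is processed and queued separately, and the loss of any one fragment forces retransmission of the whole datagram, which is where the network-wide performance cost comes from.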

To identify and handle different kinds of network degradation, networking professionals may run various tests on the network and on data packet routing. To anticipate some kinds of degradation, developers and engineers may also apply fault-tolerant design or graceful degradation models, in which systems are designed to keep operating well even under significant pressure. This prevents various kinds of natural degradation from causing system failure or interrupting core network services.
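The graceful-degradation idea above can be sketched in a few lines: try the primary service, and if it is slow or unreachable, return a reduced response rather than failing outright. The host, port, and timeout here are illustrative assumptions, not part of any specific product's API.

```python
# Minimal sketch of graceful degradation: attempt the primary service,
# and fall back to a degraded (e.g. cached) response on failure.
# Assumption: hypothetical host/port and a 1-second timeout.
import socket


def fetch_with_fallback(host: str, port: int, timeout_s: float = 1.0) -> str:
    """Return "live" if the service answers in time, else "cached"."""
    try:
        with socket.create_connection((host, port), timeout=timeout_s):
            return "live"
    except OSError:
        # Degraded mode: serve stale or cached data instead of an error,
        # so core services stay up while the network problem is diagnosed.
        return "cached"
```

The same pattern generalizes: timeouts bound how long degradation can stall a request, and the fallback path keeps the user-facing service available while monitoring flags the underlying fault.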