Definition - What does Autonomic Computing mean?
Autonomic computing is a computer system's ability to manage itself automatically through adaptive technologies that extend computing capabilities while reducing the time IT professionals must spend resolving system difficulties and performing other maintenance, such as software updates.
The move toward autonomic computing is driven by the desire to reduce costs and to overcome the complexity of modern computer systems, clearing the way for more advanced computing technology. IBM launched its autonomic computing initiative in 2001.
Techopedia explains Autonomic Computing
The growing maintenance burden of operating systems, combined with a shortage of qualified IT professionals, created the need for autonomic computing. In the 2003 paper "The Vision of Autonomic Computing," based on research at the IBM Thomas J. Watson Research Center, authors Jeffrey Kephart and David Chess caution readers that the limitations of pervasive computing could pose real challenges for interactions between computing systems and devices. They warn that system engineers may eventually be unable to manage increasingly complex architectural designs. The authors also propose autonomic computing for low-level task management, freeing system administrators to focus on more complex tasks.
The autonomic computing initiative (ACI), developed by IBM, demonstrates and advocates networked computer systems that require little human intervention beyond the definition of input rules. The ACI takes its inspiration from the autonomic nervous system of the human body. IBM defines four areas of autonomic computing: self-configuration, self-healing (error correction), self-optimization (automatic resource control for optimal functioning) and self-protection (proactive identification of and defense against attacks). Every autonomic computing system should exhibit three characteristics: automation, adaptivity and awareness.
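The rule-driven, self-managing behavior described above can be sketched as a small control loop. The example below is a minimal, hypothetical illustration (the class, rule names and thresholds are invented for this sketch, not taken from any IBM specification): an autonomic manager repeatedly observes a managed resource, checks it against operator-defined input rules, and then heals or re-optimizes it without further human intervention.

```python
import random

# Hypothetical operator-defined input rules: humans set these once,
# then the autonomic loop acts on them without further intervention.
RULES = {
    "max_load": 0.80,  # self-optimization: add capacity above this load
    "min_load": 0.20,  # self-optimization: remove capacity below this load
}

class ManagedResource:
    """A stand-in for a managed system element (e.g., a server pool)."""
    def __init__(self):
        self.instances = 2
        self.healthy = True

    def load(self):
        # Simulated load reading; a real system would query a monitor.
        return random.uniform(0.0, 1.0)

def autonomic_step(resource):
    """One monitor-analyze-plan-execute cycle of the autonomic manager."""
    # Self-healing: detect and correct a failed element automatically.
    if not resource.healthy:
        resource.healthy = True
        return "restarted"
    # Self-optimization: adjust resources against the input rules.
    load = resource.load()
    if load > RULES["max_load"]:
        resource.instances += 1
        return "scaled up"
    if load < RULES["min_load"] and resource.instances > 1:
        resource.instances -= 1
        return "scaled down"
    return "no action"
```

In this sketch, the human role is limited to writing the `RULES` table; the loop itself handles the low-level corrective actions, which is the division of labor the ACI advocates.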