The “AI winter” was a period of reduced interest in artificial intelligence research lasting from the late 1980s into the 2000s, during which funding for the field became increasingly scarce.
The term was coined by analogy to the hypothesized “nuclear winter,” in which soot lofted by a nuclear war would block sunlight and cause a prolonged period of global cooling.
The “winter” has been attributed to several causes: the overhyping of AI research, the interdisciplinary nature of AI and the resulting conflicts among university departments, university budget cuts, a lack of practical applications for AI research, and cheaper general-purpose computers overtaking expensive Lisp machines in performance.
The AI winter of the late 1980s was precipitated by the failure of “expert systems”: programs purported to reproduce the decision-making ability of a human expert. Expert systems were widely adopted during the 1980s, but they proved expensive to maintain and unreliable in practice, reinforcing the belief that AI research had been overhyped.
The AI winter also saw the collapse of the Lisp machine market. Lisp machines were expensive workstations built to run the Lisp programming language, then the dominant language of AI research and the main platform for expert systems. By the late 1980s, cheaper general-purpose computers based on x86, Motorola 68000, or SPARC processors had outstripped Lisp machines in performance.
These developments led to a steep drop in interest in AI research from the late 1980s through the 1990s. Funding grew harder to come by, and researchers began describing their work under other names to avoid the stigma attached to “artificial intelligence.” The current prominence of the term “machine learning” within AI is one long-standing consequence of this rebranding.
Interest in AI research picked up again in the 2000s. Ironically, the resurgence was sparked by the same force that killed the Lisp machine: continuing improvements in the performance of general-purpose computers, which made new applications and approaches practical.