Cognitive technology is a field of computer science that mimics functions of the human brain through various means, including natural language processing, data mining and pattern recognition. It is expected to have a profound effect on the way humans interact with technology in the coming years, particularly in the fields of automation, machine learning and information technology.
Cognitive technology is a subset of the broader field of artificial intelligence, which itself could be considered a subset of biomimetics. Although artificial intelligence has been an active area of research since the mid-20th century, cognitive technology evolved largely out of the internet era, particularly the growth of the web and cloud computing.
One notable innovation that has become emblematic of cognitive technology is IBM's Watson supercomputer, which uses its processing rate of 80 teraflops to essentially "think" as well as, or better than, a human brain. Cognitive technology has also been applied in the business sector, perhaps most famously by the streaming media service Netflix, which uses it to generate user recommendations, a capability that has contributed significantly to the company's success.
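To make the recommendation idea concrete, here is a minimal sketch of user-based collaborative filtering, one common family of techniques behind recommendation systems like the one Netflix popularized. The rating data, user names and function names below are hypothetical illustrations, not Netflix's actual system, which is far more sophisticated.

```python
import math

# Hypothetical user-item rating matrix (1-5 stars).
ratings = {
    "alice": {"A": 5, "B": 3, "C": 4},
    "bob":   {"A": 4, "B": 2, "C": 5, "D": 4},
    "carol": {"B": 5, "D": 1},
}

def cosine_similarity(u, v):
    """Cosine similarity computed over the items both users rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = math.sqrt(sum(u[i] ** 2 for i in shared))
    norm_v = math.sqrt(sum(v[i] ** 2 for i in shared))
    return dot / (norm_u * norm_v)

def recommend(user, ratings):
    """Score items the user hasn't rated, weighting other users'
    ratings by how similar their tastes are to this user's."""
    scores, weights = {}, {}
    for other, other_ratings in ratings.items():
        if other == user:
            continue
        sim = cosine_similarity(ratings[user], other_ratings)
        for item, r in other_ratings.items():
            if item in ratings[user]:
                continue  # only recommend unseen items
            scores[item] = scores.get(item, 0.0) + sim * r
            weights[item] = weights.get(item, 0.0) + sim
    # Normalize to a predicted rating and sort best-first.
    return sorted(
        ((item, scores[item] / weights[item])
         for item in scores if weights[item] > 0),
        key=lambda pair: -pair[1],
    )

print(recommend("alice", ratings))
```

Running this recommends item "D" to alice, since her closest neighbor (bob) rated it highly. Real systems layer far richer signals (viewing history, time of day, content features) on top of this basic pattern-recognition idea.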