Generative AI

Reviewed by Margaret Rouse | Last updated: June 28, 2022

What Does Generative AI Mean?

Generative AI is a broad label used to describe any type of artificial intelligence that uses unsupervised learning algorithms to create new digital images, video, audio, text or code.

Until recently, most AI learning models have been discriminative. The purpose of a discriminative learning algorithm is to use what was learned during training to make a decision about new input. In contrast, the purpose of a generative AI model is to produce synthetic data that is convincing enough to pass a Turing test. Because generative AI requires more processing power than discriminative AI, it is more expensive to implement.
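The distinction can be illustrated with a deliberately tiny sketch (the class names and measurements below are invented for this example): a discriminative model only learns a decision boundary and can label new inputs, while a generative model learns each class's distribution and can therefore also sample brand-new, synthetic data points.

```python
import random
import statistics

# Toy training data: a single 1-D feature for two classes (numbers invented).
cats = [4.0, 4.2, 3.8, 4.1, 3.9]
dogs = [7.0, 7.3, 6.8, 7.1, 6.9]

# Discriminative approach: learn only a decision boundary between classes.
boundary = (statistics.mean(cats) + statistics.mean(dogs)) / 2

def discriminate(x):
    """Label a new input; this model cannot produce new examples."""
    return "cat" if x < boundary else "dog"

# Generative approach: model the class's distribution itself.
cat_mu, cat_sigma = statistics.mean(cats), statistics.stdev(cats)

def generate_cat():
    """Sample a brand-new, synthetic 'cat' measurement."""
    return random.gauss(cat_mu, cat_sigma)

print(discriminate(4.05))   # classifies an existing input
print(generate_cat())       # creates a new, synthetic data point
```

The extra work the generative model does (modeling the full distribution rather than just a boundary) is one intuition for why generative approaches tend to cost more compute.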

Generative AI models are deliberately given a limited number of parameters to use during training. Essentially, this constraint forces the model to draw its own conclusions about the training data's most important characteristics. Once the model has identified the data's fundamental properties, it can use an architecture such as a generative adversarial network (GAN) or a variational autoencoder (VAE) to improve the accuracy of its output.
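The adversarial idea behind a GAN can be sketched in drastically simplified form: a generator tries to produce data that looks real, while a discriminator tries to tell real from fake, and each learns from the other. The toy below uses a one-parameter-family generator and hand-derived gradients on 1-D data (all numbers invented); real GANs use deep networks and a framework such as PyTorch or TensorFlow.

```python
import math
import random

random.seed(0)

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# "Real" data the generator should learn to imitate (invented distribution).
def real_sample():
    return random.gauss(4.0, 0.5)

a, b = 1.0, 0.0   # generator: g(z) = a*z + b, starts far from the real data
w, c = 0.0, 0.0   # discriminator: d(x) = sigmoid(w*x + c), estimates P(real)
lr = 0.01

for step in range(5000):
    z = random.gauss(0.0, 1.0)
    x_real, x_fake = real_sample(), a * z + b

    # Discriminator update: push d(real) toward 1 and d(fake) toward 0.
    d_real, d_fake = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    w += lr * ((1 - d_real) * x_real - d_fake * x_fake)
    c += lr * ((1 - d_real) - d_fake)

    # Generator update: nudge its output so the discriminator is fooled.
    d_fake = sigmoid(w * x_fake + c)
    grad_x = (1 - d_fake) * w          # gradient of log d(x) w.r.t. x
    a += lr * grad_x * z
    b += lr * grad_x

# After training, the generator's offset b should have drifted toward
# the real data's mean (around 4).
print(round(b, 2))
```

Neither network is ever told what the real distribution looks like; the generator improves only through the discriminator's feedback, which is the core of the adversarial setup.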

While the term generative AI is often associated with deep fakes and data journalism, the technology is playing an increasingly important role in automating the repetitive processes used in digital image correction and digital audio correction. Generative AI is also being used experimentally in manufacturing as a tool for rapid prototyping and in business to improve data augmentation for robotic process automation (RPA).


Techopedia Explains Generative AI

In short, any time an AI technology generates its own content, whether that content is text, images or multimedia, professionals may describe it as "generative AI." This includes technologies that can draw and paint pictures, as well as technologies that use information gathered on the internet to create website articles and article summaries, corporate brochures, press releases and white papers.

For example, when creating text, generative AI will scrutinize existing human-written text for everything from grammar and punctuation to style, word choice, narrative structure and thesis. With the advanced AI now at our disposal, generative AI can create content that seems to have been written by a human and can pass the Turing test, established by the mathematician and cryptographer Alan Turing in the mid-20th century. It is expected that generative AI will take over parts of the creative processes that humans have used for centuries in publishing, broadcasting and communications.
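The "learn from existing human text, then produce new text" idea can be illustrated, in a drastically simplified form, with a word-level Markov chain: the model counts which word tends to follow which in a corpus and then samples new sentences from those statistics. (Modern systems use large neural language models instead; the tiny corpus here is invented.)

```python
import random
from collections import defaultdict

random.seed(1)

# A tiny invented "training corpus" of human-written sentences.
corpus = (
    "the policy covers fire damage . "
    "the policy covers water damage . "
    "the plan covers storm damage . "
)

# Learn which word follows which: a crude statistical picture of "style".
follows = defaultdict(list)
words = corpus.split()
for prev, nxt in zip(words, words[1:]):
    follows[prev].append(nxt)

# Generate a new sentence by sampling from the learned transitions.
word, output = "the", ["the"]
for _ in range(8):
    word = random.choice(follows[word])
    output.append(word)
    if word == ".":
        break

print(" ".join(output))   # a new sentence in the corpus's style
```

The generated sentence may never appear verbatim in the corpus (e.g. "the plan covers water damage"), yet it follows the corpus's grammar and word choice, which is the essence of the behavior described above at toy scale.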

Here's an example of generative AI: suppose you have the task of putting together an insurance brochure. You have a list of policies, costs, benefits and details. Traditionally, a human writer would review all of that raw data, take notes and write something in narrative form that explains each item to the reader. With generative AI, the program can review the raw data, fashion a narrative around it and produce something readable, without a human writer being directly involved.

Some people are afraid of certain generative AI technologies, especially those that simulate human creativity by writing fiction or producing works of art. This feeds a more general debate about the limits of technology and its impact on our lives. While people may think of generative AI as something that will replace human jobs, new technologies like these often include a human-in-the-loop (HITL) element. When AI is characterized as an assistive technology that helps humans produce faster, more accurate results, it is referred to as augmented artificial intelligence.

Arguably, because machine learning and deep learning are inherently focused on generative processes, they can be considered types of generative AI, too.

