Lexical Analysis

What Does Lexical Analysis Mean?

Lexical analysis is a concept that is applied in computer science in much the same way that it is applied in linguistics. Essentially, lexical analysis means grouping a stream of letters or sounds into units that carry meaning. In linguistics, this grouping is part of parsing an utterance; in computer science, it is usually called tokenizing, lexing or scanning, and it is the first stage of parsing a program.
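
For instance, a short line of source code such as "count = count + 1" is only a stream of characters until those characters are grouped into labeled units. A minimal sketch in Python of one plausible grouping (the token names here are illustrative assumptions, not part of any particular language definition):

# Hypothetical token names, chosen only to illustrate the grouping.
source = "count = count + 1"
tokens = [
    ("IDENTIFIER", "count"),
    ("ASSIGN", "="),
    ("IDENTIFIER", "count"),
    ("PLUS", "+"),
    ("NUMBER", "1"),
]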


Techopedia Explains Lexical Analysis

The idea of lexical analysis in computer science is that it breaks a stream of characters down into “lexemes,” and each lexeme is classified as a token, the basic unit of meaning. Source code strings these units together, so the language compiler must first isolate the tokens before it can translate the program into the right computing instructions. Basically, both humans and computers perform lexical analysis, but computers do it differently, and in a much more technical way. The way that computers do lexical analysis does not need to be transparent to humans; it just has to be programmed into the computing system. Programs that perform lexical analysis in computer science are often called lexers, tokenizers or scanners.
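
The sketch below shows how such a program might work. It is a minimal illustration written in Python with the standard re module, assuming a toy language that has only identifiers, integer literals, "=" and "+"; it is not the tokenizer of any real compiler.

import re

# Minimal tokenizer sketch: group a character stream into lexemes and label
# each one with a token type. The token names and patterns are assumptions
# made for illustration, not any particular language's specification.
TOKEN_SPEC = [
    ("NUMBER",     r"\d+"),           # integer literals, e.g. 42
    ("IDENTIFIER", r"[A-Za-z_]\w*"),  # names, e.g. count
    ("ASSIGN",     r"="),             # assignment operator
    ("PLUS",       r"\+"),            # addition operator
    ("SKIP",       r"\s+"),           # whitespace between lexemes, discarded
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pattern})" for name, pattern in TOKEN_SPEC))

def tokenize(text):
    """Yield (token_type, lexeme) pairs for the input character stream."""
    for match in MASTER_RE.finditer(text):
        if match.lastgroup != "SKIP":   # whitespace carries no meaning here
            yield match.lastgroup, match.group()

print(list(tokenize("count = count + 1")))
# [('IDENTIFIER', 'count'), ('ASSIGN', '='), ('IDENTIFIER', 'count'),
#  ('PLUS', '+'), ('NUMBER', '1')]

A compiler's later stages, such as the parser, would then consume this token stream rather than the raw characters.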
