Definition - What does 8-Bit mean?
8-bit is a measure of computer information generally used to refer to hardware and software from an era when computers could store and process only 8 bits of data at a time. This limitation came from the processor technology available at the time, to which software had to conform, and it resulted in blocky graphics and slow processing. So at present, when 8-bit is mentioned, it is generally associated with slow computers, low-resolution graphics and simplistic sound.
Techopedia explains 8-Bit
Eight bits was the maximum word size of many computers widely used from the early 1970s to the late 1980s. This was a limitation of the microprocessor architecture of the day and the major bottleneck for software written for those systems. The processor and its registers could only hold and operate on 8 bits of data at a time, so each computation required many more fetch and execute cycles than on computers with 16-bit or larger word sizes; a far cry from today's 32- and 64-bit processor architectures. In the same vein, 8-bit graphics hardware was limited to 8 bits per pixel, so it could display at most 2^8 = 256 colors.
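The two limits described above follow directly from the arithmetic of 8 bits. A minimal sketch (in Python, chosen only for illustration; the helper name `add_8bit` is hypothetical) shows why an 8-bit value has exactly 256 states and how an 8-bit register wraps around on overflow:

```python
# An 8-bit value has 2**8 = 256 possible states (0 through 255 unsigned),
# which is why an 8-bit palette tops out at 256 colors.
BITS = 8
num_values = 2 ** BITS

def add_8bit(a, b):
    """Add two values the way an 8-bit register would: any result above
    255 overflows and wraps around modulo 256 (hypothetical helper)."""
    return (a + b) % 256

print(num_values)          # 256 possible values / colors
print(add_8bit(200, 100))  # 300 does not fit in 8 bits; it wraps to 44
```

The wraparound in the second call illustrates why larger numbers had to be split across multiple registers on 8-bit machines, multiplying the fetch and execute cycles per computation.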
The 8-bit architecture is particularly popular with gamers, as the first truly classic game consoles, which paved the way for the game industry, were 8-bit machines. The Atari 2600 and the Nintendo Entertainment System (NES), both held in high regard, are considered icons of the 8-bit era of games. Even today, 8-bit-style graphics and audio are still used in new games that run on modern hardware. This style of graphics is called pixel art; it is no longer limited to 8-bit color depths or displays but is simply made to look pixelated, resembling 8-bit graphics and evoking a sense of nostalgia.