Before the advent of modern color monitors, computers communicated with humans through a set of more mechanical, less sophisticated interfaces.
Computer output in the mid-twentieth century started with punch cards and blinking lights. The earliest computers often had a panel of indicator lights that human operators read directly, and some machines also had dials or gauges that showed results.
At the same time, some computer engineers were building punch card systems. Many of the large electromechanical machines that predated ENIAC, for instance, took in and spat out Hollerith punch cards, a format IBM later standardized. Other machines produced different kinds of punched-paper output, such as paper tape, that sometimes had to be translated or interpreted with the help of separate machines or lookup tables.
As computers advanced, engineers added teletype interfaces, in which the computer simply printed out its results. The "print" command became a basic staple of computer programming (and arguably remains so today), as the short sketch below shows. Printed output quickly became popular because it was far easier to read than punch cards.
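As a rough illustration, here is a minimal Python sketch, using modern code purely as a stand-in for those early systems. The humble print statement still works the way a teletype did: one line of text at a time, sent to whatever device is listening. The message text is invented for illustration.

    # Each call emits one line of text, just as a teletype printed
    # one line at a time onto paper.
    print("JOB 42 COMPLETE")    # a status message (invented for illustration)
    print("RESULT:", 6 * 7)     # computed results were printed the same way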
The final step in the evolution of early interfaces into display monitors came when computing pioneers realized they could use cathode-ray tube (CRT) displays as a kind of "virtual teletype." In other words, the same printed results that had previously come out on paper could be displayed on a CRT screen instead. Those were the earliest display monitors, and they became ubiquitous in the early 1980s. From there, display monitor technology advanced into multicolor VGA designs, and then on to flat-panel LCD designs.
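That lineage is still visible today: on Unix-like systems, the terminal device a program prints to is literally named after the teletype. A small Python sketch, using only standard-library calls, makes the connection concrete:

    import os
    import sys

    # Modern terminal emulators are descendants of the "virtual teletype":
    # Unix still calls the terminal device a "tty" (short for teletype).
    if sys.stdout.isatty():
        # os.ttyname returns the device path, e.g. /dev/pts/0 on Linux
        print("Printing to a virtual teletype:", os.ttyname(sys.stdout.fileno()))
    else:
        print("Output is redirected to a file or pipe, not a terminal.")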