What Does Bignum Mean?

Bignum refers to the representation of integers that are too large for the
conventional fixed-size integer type in many mainstream programming languages.
Since a signed 32-bit integer in these languages only goes up to a value of
2,147,483,647, a bignum type represents numbers that are larger than that.
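Ruby, mentioned below, is one language where this promotion happens automatically; a minimal sketch:

```ruby
# Values beyond the 32-bit signed range (2,147,483,647) would overflow a
# conventional fixed-size int; a bignum-capable language keeps exact results.
a = 2_147_483_647   # largest 32-bit signed integer
b = a + 1           # no overflow: Ruby promotes to an arbitrary-precision Integer
puts b              # 2147483648
puts 2**100         # 1267650600228229401496703205376 -- far beyond any machine word
```

In a fixed 32-bit type, `a + 1` would wrap around to a negative number; here the result simply grows as many digits as it needs.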


Techopedia Explains Bignum

Another way to describe bignum is that these numerical values do not fit in a 62-bit binary value (the native integer limit in languages such as Ruby on 64-bit platforms, where part of the machine word is reserved as a type tag). Such values must be stored differently: the language spreads the extra binary digits across additional machine words and performs multi-word arithmetic on them in order to produce the encoded numerical value.

Computer systems also make use of floating-point operations to represent very large numbers, though only with a finite degree of precision: the magnitude can be enormous, but only a limited number of significant digits are kept exactly.
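That precision limit is easy to demonstrate: a double-precision float carries about 15-16 significant decimal digits, so nearby huge integers collapse to the same float, while bignum arithmetic keeps them distinct.

```ruby
# 2**53 + 1 is the first integer a 64-bit float cannot represent exactly.
exact = 2**53 + 1
puts exact == 2**53                  # false: integer arithmetic keeps the +1
puts exact.to_f == (2**53).to_f      # true: as floats, the +1 is lost
```

This is why bignum types exist at all: floating point trades exactness for range, while bignums keep both at the cost of extra storage and slower arithmetic.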

In addition to its use in conventional programming languages, bignum-scale values also appear in data interchange formats such as JSON, whose grammar places no fixed limit on the length of a number.
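For example, Ruby's standard `json` library parses an oversized integer literal from a JSON document into an exact Ruby `Integer` (how other parsers handle such literals varies by implementation):

```ruby
require 'json'

# The JSON grammar allows integer literals of any length; Ruby's parser
# returns them as exact arbitrary-precision Integers.
doc = '{"id": 123456789012345678901234567890}'
id  = JSON.parse(doc)["id"]
puts id.class   # Integer
puts id         # 123456789012345678901234567890
```

Parsers in languages without a bignum type may instead round such a value to a float or reject the document, so portability of very large JSON numbers is not guaranteed.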


Margaret Rouse

Margaret Rouse is an award-winning technical writer and teacher known for her ability to explain complex technical subjects to a non-technical, business audience. Over the past twenty years her explanations have appeared on TechTarget websites, and she has been cited as an authority in articles by the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine and Discovery Magazine. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other's highly specialized languages.