Definition - What does Bignum mean?
Bignum is a data type for representing integers that are too large to fit in the conventional fixed-width integer type of many mainstream programming languages. Since a signed 32-bit integer in these languages only goes up to a value of 2,147,483,647, a bignum type is used to represent numbers larger than that.
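The limit above can be seen directly in a language with built-in bignums. The sketch below uses Python, whose integers are arbitrary-precision by default, and simulates a fixed-width 32-bit integer with `ctypes` for contrast; both choices are illustrative, not tied to any particular language mentioned in this article.

```python
import ctypes

# Python integers are bignums by default, so doubling the 32-bit
# maximum does not overflow:
big = 2_147_483_647 * 2
print(big)  # 4294967294

# Simulating a fixed-width signed 32-bit integer shows why a bignum
# type is needed: the same arithmetic wraps around to a negative value.
wrapped = ctypes.c_int32(2_147_483_647 * 2).value
print(wrapped)  # -2
```

The wraparound happens because a 32-bit register simply discards bits above position 31; a bignum representation keeps all the bits.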
Techopedia explains Bignum
Another way to describe a bignum is a numerical value that does not fit in the machine's native word size (typically 64 bits, or slightly less in languages such as Ruby that reserve a few bits for type tagging). Such values must be stored differently: the language splits the number across multiple machine words and implements arithmetic on those pieces in software, carrying between words as needed to reconstruct the full numerical value.
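The multi-word storage scheme can be sketched as follows. This is a minimal illustration, not how any particular runtime implements it: each "limb" holds nine decimal digits, the limbs are stored little-endian, and addition propagates a carry from one limb to the next.

```python
BASE = 10**9  # each limb holds 9 decimal digits

def to_limbs(n):
    """Split a non-negative integer into little-endian limbs."""
    limbs = []
    while True:
        limbs.append(n % BASE)
        n //= BASE
        if n == 0:
            return limbs

def add_limbs(a, b):
    """Add two limb arrays with manual carry propagation."""
    result, carry = [], 0
    for i in range(max(len(a), len(b))):
        s = carry
        if i < len(a):
            s += a[i]
        if i < len(b):
            s += b[i]
        result.append(s % BASE)
        carry = s // BASE
    if carry:
        result.append(carry)
    return result

def from_limbs(limbs):
    """Recombine little-endian limbs into a single integer."""
    n = 0
    for limb in reversed(limbs):
        n = n * BASE + limb
    return n

x, y = 2**80, 2**81  # both far beyond any native word size
assert from_limbs(add_limbs(to_limbs(x), to_limbs(y))) == x + y
```

Real implementations use binary limbs the size of a machine word and hand-tuned carry handling, but the principle is the same.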
Computer systems can also use floating-point representation for very large values, but only with a finite degree of precision: beyond a certain magnitude, a floating-point type can no longer represent every integer exactly.
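The precision limit is easy to demonstrate. A standard 64-bit float has a 53-bit significand, so above 2^53 not every integer has its own floating-point value; the example below uses Python floats, which follow that 64-bit format on common platforms.

```python
# 2**53 is the last point at which 64-bit floats can represent
# every consecutive integer exactly:
exact = 2**53
print(float(exact) == float(exact + 1))  # True: both round to the same float

# As exact bignum integers, the two values remain distinct:
print(exact == exact + 1)  # False
```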
In addition to its use in conventional programming languages, the bignum concept also comes up in data-interchange formats such as JSON, where a numeric literal in the text may exceed what a given parser or consumer can represent exactly.
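As an illustration of the interchange problem, the sketch below parses a JSON document containing an integer just above 2^53. Python's `json` module happens to decode it as an exact bignum, but a consumer limited to 64-bit floats (such as JavaScript's default number type) would lose the final digit, as converting the value to a float shows.

```python
import json

text = '{"id": 9007199254740993}'  # 2**53 + 1
parsed = json.loads(text)

print(parsed["id"])         # exact as a Python bignum int
print(float(parsed["id"]))  # rounded: the trailing 3 becomes a 2
```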