The ten digits commonly used to represent numbers in counting today, 0, 1, 2, 3, 4, 5, 6, 7, 8, and 9, are generally referred to as the Arabic or Hindu-Arabic numerals. As of the beginning of the twenty-first century, they are the most common symbolic representation of numbers in the world.
With the advent of computers came the binary, octal, and hexadecimal number systems, as well as character encodings such as ASCII, because computing hardware and software are generally easier to design and understand with these alternative representations of numbers and letters. For example, "0101b" represents the number 5 in binary, and "1Ah" represents the number 26 in hexadecimal.
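The correspondences above can be checked directly. The following is a minimal Python sketch, offered only as an illustration of the binary, hexadecimal, and ASCII representations mentioned; the specific literals and formatting calls are standard Python, not drawn from the source.

```python
# Binary: "0101b" denotes the number 5
assert 0b0101 == 5
assert int("0101", 2) == 5          # parse a binary string

# Hexadecimal: "1Ah" denotes the number 26
assert 0x1A == 26
assert int("1A", 16) == 26          # parse a hexadecimal string

# ASCII: letters map to numeric character codes
assert ord("A") == 65               # character to code
assert chr(65) == "A"               # code back to character

# Converting numbers back into these textual representations
print(format(5, "04b"))             # prints "0101"
print(format(26, "X"))              # prints "1A"
```

The same value (here, five or twenty-six) is unchanged; only its written representation differs between systems.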
Numbers, letters, and other information continue to be represented in different forms, generally to best fit a specific need, purpose, or application. For example, because classical computing operates on ones and zeros, numbers and letters are ultimately represented as patterns of ones and zeros. With the growth of quantum computing and artificial intelligence, other means of representing numbers and letters that more appropriately fit those applications are desirable.