Does anyone know of an algorithm to change a binary number into decimal ASCII? I don't need an algorithm per se, just an idea of how to go about it.
Decimal ASCII -> binary number isn't so bad, since you can just isolate each ASCII digit, change it into its numerical binary equivalent, scale it by its order of magnitude, and then add them all up.
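A minimal sketch of that parsing direction, in C. The function name `parse_decimal` is mine, and it folds each digit into a running total (multiplying earlier digits by 10 at each step), which is equivalent to scaling each digit by its power of 10 and summing:

```c
#include <stdio.h>

/* Decimal ASCII -> binary: convert each ASCII digit to its numeric
   value ('7' - '0' == 7) and accumulate, scaling the running total
   by 10 at every step instead of tracking powers of 10 explicitly. */
unsigned parse_decimal(const char *s)
{
    unsigned value = 0;
    while (*s >= '0' && *s <= '9') {
        value = value * 10 + (unsigned)(*s - '0');
        ++s;
    }
    return value;
}

int main(void)
{
    printf("%u\n", parse_decimal("1000"));  /* prints 1000 */
    return 0;
}
```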
But I have no idea how to take a binary number like 0b1111101000 and change it into '1000' without a horrid repetitive function: subtract the largest power of 10 that still fits from the number, over and over, until what's left is smaller than that power; then drop to the next lower power of 10 and repeat, all the way down to 10^0, until the number is zero.
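For reference, here is a sketch of exactly that repeated-subtraction scheme, so it's clear what I mean. The function name `to_decimal_ascii` and the unsigned/buffer choices are just for illustration:

```c
#include <stdio.h>

/* Binary -> decimal ASCII by repeated subtraction: find the largest
   power of 10 that fits, count how many times it can be subtracted
   (that count is one digit), then move to the next lower power of 10,
   until 10^0 is done and the value has reached zero. */
void to_decimal_ascii(unsigned value, char *out)
{
    unsigned power = 1;
    while (value / 10 >= power)       /* largest power of 10 <= value */
        power *= 10;

    while (power > 0) {
        unsigned digit = 0;
        while (value >= power) {      /* subtract the current power repeatedly */
            value -= power;
            ++digit;
        }
        *out++ = (char)('0' + digit); /* emit the digit as ASCII */
        power /= 10;
    }
    *out = '\0';
}

int main(void)
{
    char buf[16];
    to_decimal_ascii(0x3E8u, buf);    /* 0b1111101000 == 1000 */
    printf("%s\n", buf);              /* prints "1000" */
    return 0;
}
```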