Byte Size

Question: Why are there only 8 bits per byte and not something like 12 bits per byte?
Gerald

Answer: Early computers varied in the number of bits per byte, with values ranging from 6 to 9, before 8 was eventually settled on as the standard. The more bits in each byte, the fewer characters a given amount of memory can hold.

The ASCII character set, which many computers use as their default, defines 128 characters and therefore needs 7 bits per character (2^7 = 128). Because early equipment sometimes lost individual bits when transmitting data, the eighth bit was used as a parity bit: it was set either on or off so that each character had an even number of bits on. Whenever the receiving computer found an odd number of bits on, it knew a bit had been lost and could request that the character be resent (the first sketch below shows this check).

As that check is no longer required, the extra bit is now assumed to be zero for the standard 128 characters, which frees it to define a further 128 characters. Where 256 characters are insufficient, Unicode can be used; its most common encoding, UTF-8, represents each character with between one and four bytes, giving the most common characters the fewest bytes (the second sketch below shows this). By doing this, the amount of storage needed to handle the data is kept to a minimum.

Were bytes to be 12 bits instead of 8, every character would take half again as many bits: 96 bits of memory holds twelve 8-bit bytes but only eight 12-bit bytes. All the memory in all the computers in existence would then hold only two thirds of what it can currently hold for most purposes, and only in rare cases would those extra bits allow any saving.
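As a rough illustration of that parity scheme, here is a minimal Python sketch; the names add_parity and check_parity are mine for illustration, not anything from a particular protocol:

    def add_parity(char):
        """Pack a 7-bit ASCII code into 8 bits, using bit 7 as an even-parity bit."""
        code = ord(char)
        assert code < 128, "only 7-bit ASCII fits in the low bits"
        parity = bin(code).count("1") % 2   # 1 if the 7 data bits have an odd number on
        return code | (parity << 7)         # set bit 7 so the total number on is even

    def check_parity(byte):
        """Receiver side: True if the byte still has an even number of bits on."""
        return bin(byte).count("1") % 2 == 0

    sent = add_parity("A")            # 'A' is 1000001: two bits on, so the parity bit stays 0
    assert check_parity(sent)
    corrupted = sent ^ 0b0000100      # one bit flipped in transit
    assert not check_parity(corrupted)  # odd count detected: request a resend

Note that even parity catches any single flipped bit but not two flips that cancel each other out, which was an acceptable trade-off for the error rates of the time.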
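And to see how UTF-8 keeps storage to a minimum, a second sketch using Python's built-in encoder (the sample characters are my choice):

    for ch in "A", "é", "中", "😀":
        print(ch, "->", len(ch.encode("utf-8")), "byte(s)")
    # A  -> 1 byte(s)   plain ASCII
    # é  -> 2 byte(s)   accented Latin letter
    # 中 -> 3 byte(s)   CJK ideograph
    # 😀 -> 4 byte(s)   emoji, outside the Basic Multilingual Plane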

This article was written by Stephen Chapman, Felgall Pty Ltd.
