Explain how to convert between 8-bit binary and decimal.
To convert decimal to binary, divide the number by 2 and record the remainder; that remainder is the next binary digit, working from the least significant (rightmost) bit to the most significant. Repeat with the quotient until it reaches 0, then read the remainders in reverse order. For an 8-bit result, pad with leading zeros to eight digits.
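A minimal Python sketch of this repeated-division method (the function name and the 8-bit range check are illustrative choices, not part of the original):

```python
def decimal_to_binary_8bit(n: int) -> str:
    """Convert a decimal number (0-255) to an 8-bit binary string
    using repeated division by 2."""
    if not 0 <= n <= 255:
        raise ValueError("value does not fit in 8 bits")
    bits = ""
    for _ in range(8):
        bits = str(n % 2) + bits   # remainder becomes the next digit, right to left
        n //= 2                    # divide the quotient by 2 and repeat
    return bits

print(decimal_to_binary_8bit(182))  # 10110110
```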
To convert binary to decimal, write down the binary number and list the powers of 2 from right to left (1, 2, 4, 8, 16, 32, 64, 128 for 8 bits). Write each binary digit below its corresponding power of two, then add up the powers of two that sit under a 1 to get the decimal value.
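A matching Python sketch of this place-value method (again, the helper name is just for illustration):

```python
def binary_to_decimal(bits: str) -> int:
    """Convert a binary string to decimal by summing the powers of 2
    whose digits are 1, working from right to left."""
    total = 0
    for position, digit in enumerate(reversed(bits)):
        total += int(digit) * (2 ** position)   # digit times its place value
    return total

print(binary_to_decimal("10011100"))  # 156
```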
Examples:
Convert 182 to 8-bit binary manually:
182 ÷ 2 = 91 remainder 0
91 ÷ 2 = 45 remainder 1
45 ÷ 2 = 22 remainder 1
22 ÷ 2 = 11 remainder 0
11 ÷ 2 = 5 remainder 1
5 ÷ 2 = 2 remainder 1
2 ÷ 2 = 1 remainder 0
1 ÷ 2 = 0 remainder 1
Reading the remainders from last to first: 182 = 10110110 in 8-bit binary.
Convert 10011100 to decimal:
Powers of two from left to right: 128, 64, 32, 16, 8, 4, 2, 1.
The 1 bits fall under 128, 16, 8 and 4, so 10011100 = 128 + 16 + 8 + 4 = 156.
How many bytes in a kilobyte, megabyte, gigabyte, petabyte and terabyte?
Unit | How many bytes |
---|---|
Kilobyte | 1,000 (10^3) |
Megabyte | 1,000,000 (10^6) |
Gigabyte | 1,000,000,000 (10^9) |
Petabyte | 1,000,000,000,000,000 (10^15) |
Terabyte | 1,000,000,000,000 (10^12) |

These are the decimal (SI) definitions; storage sizes are sometimes quoted in powers of 1024 instead (for example, 1024^4 bytes for a terabyte), but the powers-of-ten values above are the standard ones.
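As a quick check, a small Python sketch expressing the same decimal units as powers of ten (the dictionary layout is my own):

```python
# Decimal (SI) byte units, each 1000 times the previous one.
UNITS = {
    "Kilobyte": 10**3,
    "Megabyte": 10**6,
    "Gigabyte": 10**9,
    "Terabyte": 10**12,
    "Petabyte": 10**15,
}

for name, size in UNITS.items():
    print(f"1 {name} = {size:,} bytes")
```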
Explain the difference between ASCII and Unicode. Give reasons why Unicode is more useful for web use.
The difference between Unicode and ASCII is that Unicode is the international standard that represents characters from many different languages, mathematical symbols, historical scripts, and more, whereas ASCII is limited to a small set of 128 characters: uppercase and lowercase English letters, digits (0-9), punctuation, and control codes. Unicode is more useful for web use because it can accommodate a far larger number of characters, so a single web page can mix text from many languages. Because of this, Unicode currently covers most of the world's written languages and still has room for more.
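To illustrate the difference, here is a small Python sketch (the sample string is my own): a Unicode encoding such as UTF-8 can represent accented and non-Latin characters, while ASCII cannot.

```python
text = "café 履歴書"   # mixes accented Latin and Japanese characters

# UTF-8 is a Unicode encoding, so every character can be represented.
utf8_bytes = text.encode("utf-8")
print(len(utf8_bytes), "bytes as UTF-8")

# ASCII only covers 128 characters, so this raises UnicodeEncodeError.
try:
    text.encode("ascii")
except UnicodeEncodeError as error:
    print("ASCII cannot encode this text:", error)
```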