Representing Letters in Binary (ASCII)

Because everything in a computer is represented in bits, globally agreed-upon standards are needed for representing letters and other characters. ASCII (the American Standard Code for Information Interchange) is one such standard: it maps numbers to characters, e.g., 65 to the letter A.

Question: How does the computer know whether the bits 01000001 mean the number 65 or the letter A? The answer is context: the same pattern of bits is interpreted as a number or as a character depending on what the program expects.
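
For instance, here is a minimal C sketch (the variable name c is just for illustration) that prints the same byte both ways:

    #include <stdio.h>

    int main(void)
    {
        char c = 65;         // the byte 01000001
        printf("%i\n", c);   // interpreted as a number: prints 65
        printf("%c\n", c);   // interpreted as a character: prints A
        return 0;
    }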

Problem: ASCII is quite US-centric. Its 128 standard codes cover English letters, digits, and basic punctuation, but not accented letters or other writing systems.

Measuring Bits

Suppose we send the message: 72 73 33 ("HI!")
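
As a sketch, one way to decode those numbers in C (assuming the codes arrive in an array):

    #include <stdio.h>

    int main(void)
    {
        // The three ASCII codes we "received"
        int codes[] = {72, 73, 33};
        for (int i = 0; i < 3; i++)
        {
            printf("%c", codes[i]);  // interpret each number as a character
        }
        printf("\n");                // prints: HI!
        return 0;
    }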

Each of those three characters takes one byte, so the whole message is 3 bytes, or 24 bits. However, bits are pretty small (physically and mathematically), so we don't usually measure data in individual bits. Instead, we use bytes and larger units: a kilobyte (KB) is about a thousand bytes, a megabyte (MB) about a million, a gigabyte (GB) about a billion, and a terabyte (TB) about a trillion.
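
A small sketch of converting among these units, using the decimal (SI) convention of powers of 1,000 (some systems instead count in powers of 1,024, e.g., 1 KiB = 1,024 bytes); the size here is just an example value, not from the notes:

    #include <stdio.h>

    int main(void)
    {
        // Example size in bytes (an assumed value for illustration)
        long long bytes = 3500000000LL;

        printf("%lld bytes\n", bytes);
        printf("= %.1f MB\n", bytes / 1e6);  // megabytes: millions of bytes
        printf("= %.1f GB\n", bytes / 1e9);  // gigabytes: billions of bytes
        return 0;
    }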

Unicode

The 256 unique values of a byte work for English ASCII, but a broader global standard is needed to support other languages and things like emoji. One solution is Unicode, which assigns a unique number (a "code point") to every character, with room for millions of them; encodings such as UTF-8 then define how those numbers are stored as bytes.
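
As a sketch (assuming a terminal that displays UTF-8), the face-with-tears-of-joy emoji has code point U+1F602, which UTF-8 stores as the four bytes 0xF0 0x9F 0x98 0x82:

    #include <stdio.h>

    int main(void)
    {
        // U+1F602 (face with tears of joy), written out as its UTF-8 bytes
        printf("\xF0\x9F\x98\x82\n");
        return 0;
    }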