I do understand I’m not able to read it myself; I’m more curious about the architecture of how that data is represented and stored, and conceptually how such a representation is practically organized/reified…
The original binary data is split into six-bit chunks (e.g., 100101), each of which, read as a number, is an integer from 0 to 63. These values are simply mapped to characters in order:
000000 = A,
000001 = B,
000010 = C,
000011 = D,
etc.—it goes through the capital letters first, then lower-case letters, then digits, then “+” and “/”. It’s so simple you could do it by hand from the above description, if you were looking at the data in binary format.
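
To make the mapping concrete, here is a minimal sketch in Python of the scheme described above (the standard Base64 alphabet; the trailing "=" padding is a detail of the standard, not part of the six-bit mapping itself):

```python
# The 64-character alphabet, in the order described above.
ALPHABET = (
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ"   # values 0-25
    "abcdefghijklmnopqrstuvwxyz"   # values 26-51
    "0123456789"                   # values 52-61
    "+/"                           # values 62-63
)

def encode(data: bytes) -> str:
    # Concatenate all bytes into one long bit string.
    bits = "".join(f"{byte:08b}" for byte in data)
    # Pad the bit string with zeros so its length is a multiple of 6.
    bits += "0" * (-len(bits) % 6)
    # Split into 6-bit chunks and look each value up in the alphabet.
    chars = [ALPHABET[int(bits[i:i + 6], 2)] for i in range(0, len(bits), 6)]
    # Standard Base64 pads the output with "=" up to a multiple of 4 characters.
    return "".join(chars) + "=" * (-len(chars) % 4)

print(encode(b"Hi"))                      # SGk=
import base64
print(base64.b64encode(b"Hi").decode())   # SGk= (matches the standard library)
```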