OK, I'm not sure I understand this. It makes sense in my mind, but I don't want to get it wrong.
If I have a key that is hexadecimal, it's 16 bits, right? 0-F, right?
If I have a key that takes 5 pairs of hexadecimal digits, it's 10 bits, right? AB CD EF GH IJ
Now, if I have 64-bit encryption, it takes those 5 pairs of hexadecimal digits, right? So 2^5 = 32 bytes, i.e. a 32-byte word or character string is what I mean. And when it encrypts that word with 64-bit encryption, it does this (in programming terms): it bit shifts the 32-byte word 63 places to the right, e.g.
wordtwo = word >> 63
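Spelled out as a complete little C program, the shift I'm picturing would look something like this (just a sketch; the variable names and the value are made up by me for illustration, I'm not saying this is what a real algorithm actually does):

#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint64_t word = 0x41;                            /* the byte value of the letter 'A' */
    uint64_t wordtwo = word >> 63;                   /* shift it 63 places to the right */
    printf("%llu\n", (unsigned long long)wordtwo);   /* print the shifted value */
    return 0;
}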
That's what makes it 64-bit encryption, correct? And it does that to each individual character, correct? But it may also be part of an algorithm that mixes up the letters and adds a key value, mixing the key value in, correct?
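When I say "mix in the key value", this is roughly the kind of thing I'm picturing, again just a sketch (XORing each character with a key byte is my own guess at what "mixing" means, and the word and key bytes below are made up):

#include <stdio.h>

int main(void) {
    unsigned char word[] = "SECRETWORD";                     /* made-up 10-character word */
    unsigned char key[]  = {0xAB, 0xCD, 0xEF, 0x12, 0x34};   /* made-up 5-byte key */
    size_t wordlen = sizeof word - 1;                        /* don't count the trailing '\0' */
    size_t keylen  = sizeof key;

    /* "mix in" the key by XORing each character with one of the key bytes */
    for (size_t i = 0; i < wordlen; i++) {
        word[i] ^= key[i % keylen];
    }

    /* print the mixed-up bytes in hex */
    for (size_t i = 0; i < wordlen; i++) {
        printf("%02X ", word[i]);
    }
    printf("\n");
    return 0;
}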
I need to make sure I have this down, because it makes sense to me, but I'm not sure whether it's right or wrong.