vishweshgm
Member level 4
I have a 3-byte unique ID number for each chip. I want to read this unique chip number, convert it to 2 bytes, and assign the chip that new 2-byte ID. I know it is not possible to fit 3 bytes of information into 2 bytes losslessly, but I want to know if there is any mathematical way to convert most 3-byte numbers to a 2-byte number. I also want to know, for the proposed method, which combinations of 3-byte numbers would map to the same 2-byte number after conversion. Based on that drawback, I want to decide whether to convert to 2 bytes or not.
For example:
Example 1: Assume the 3-byte number is {B2, B1, B0}. I could make a rule that, when the 3 bytes are read in code, if B2 > 0, the conversion is skipped; otherwise, the number is converted to 2 bytes.
So if two chips have the 3-byte unique IDs 0xAB1234 and 0x001234, only the second chip's unique ID will be converted to 2 bytes.
Now suppose I have 100,000 chips and take 60 of them to run this rule on. If all 60 happen to have B2 > 0, I end up with no chip converted to 2 bytes at all, which is bad.
So I am looking for some other method that increases the probability of getting a unique 2-byte ID.