In programming, is there a standard for which bit is the 8th, 7th, first, or second bit?
In programming, when we say "the 7th least significant bit", is there a standard for whether that means bit 7 or bit 6 (if we start counting from bit 0)?
Because if we say "the 2nd least significant bit", it sounds like it is bit 1 (counting from bit 0 again), so if 2nd means bit 1, then 7th means bit 6, not bit 7.
A standard? Like an ISO standard? No, although quite a few of them do start counting bits at b0. But, in English terms, the second least significant bit is one removed from the (first) least significant bit, so that would be b1.
So the seventh would be b6. In an octet, the most significant bit, b7, would be the eighth least significant bit.
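To make that mapping concrete, here's a minimal C sketch (the value 0xC2 and the helper name nth_lsb are purely illustrative, not from any standard): the Nth least significant bit, counted as an ordinal from 1, is bit N-1 in the usual 0-based numbering, so you shift by N-1.

```c
#include <stdio.h>
#include <stdint.h>

/* Returns the Nth least significant bit (1-based ordinal),
 * i.e. bit N-1 in 0-based numbering. */
static unsigned nth_lsb(uint8_t value, unsigned n)
{
    return (value >> (n - 1)) & 1u;
}

int main(void)
{
    uint8_t octet = 0xC2;              /* binary 1100 0010 */
    printf("%u\n", nth_lsb(octet, 2)); /* 2nd LSB = b1 = 1 */
    printf("%u\n", nth_lsb(octet, 7)); /* 7th LSB = b6 = 1 */
    printf("%u\n", nth_lsb(octet, 8)); /* 8th LSB = b7 (the MSB) = 1 */
    return 0;
}
```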
For what it's worth, I don't think I've ever heard the phrase "the 7th least significant bit" in my entire 30-odd-year working life. It's always been bN (where N ranges from 0 to the number of bits minus one) or just the least or most significant bit (not even second most significant).
The standard that I've always used is that bits are numbered 0 through n-1 for an n-bit number, with bit 0 the lowest-order bit; the "1st bit" is bit 0, the "2nd bit" is bit 1, and so on.
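One reason that 0-based convention is so common in code: bit k carries the value 2^k, so the mask for bit k falls straight out of the index. A small illustrative sketch (the BIT macro and helper names here are my own, not a standard API):

```c
#include <assert.h>
#include <stdint.h>

/* With 0-based numbering, bit k carries the value 2^k,
 * so the mask for bit k is simply 1 shifted left by k. */
#define BIT(k) (1u << (k))

static unsigned test_bit(uint32_t x, unsigned k)  { return (x >> k) & 1u; }
static uint32_t set_bit(uint32_t x, unsigned k)   { return x | BIT(k); }
static uint32_t clear_bit(uint32_t x, unsigned k) { return x & ~BIT(k); }

int main(void)
{
    uint32_t x = 0;
    x = set_bit(x, 6);           /* set b6, the "7th least significant bit" */
    assert(test_bit(x, 6) == 1);
    assert(x == 64);             /* b6 alone has the value 2^6 */
    x = clear_bit(x, 6);
    assert(x == 0);
    return 0;
}
```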