
How to represent an integer array using java.util.BitSet?

I need to represent an array of integers using a BitSet. Can somebody explain the logic required to do this?


You can represent a set of integers using a BitSet, but not an arbitrary array: you will lose information about order and repetitions.

Basically, set the nth bit of the BitSet if and only if n appears in your set of integers.

import java.util.BitSet;

BitSet bitSet = new BitSet();
int[] setOfInts = new int[] { /* Your array here */ };
for (int n : setOfInts) {
    bitSet.set(n); // mark n as present in the set
}
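One caveat: BitSet.set(n) throws IndexOutOfBoundsException for negative n, so this only works for non-negative values. To read the distinct values back out (in ascending order), you can iterate over the set bits, continuing the snippet above:

// Iterate over the set bits; nextSetBit returns -1 when there are none left.
for (int n = bitSet.nextSetBit(0); n >= 0; n = bitSet.nextSetBit(n + 1)) {
    System.out.println(n);
}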


I think the logic would be: run through the integer array, test every bit of each element, and set the corresponding bit in the BitSet, e.g. bitSet.set(arrayPos * 32 + bitPos). See the sketch below.
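A minimal sketch of that packing, assuming 32-bit ints and the index formula i * 32 + b (the class and variable names are just for illustration):

import java.util.BitSet;

public class PackedIntArray {
    public static void main(String[] args) {
        int[] array = { 5, -1, 42 };          // hypothetical input, for illustration only
        BitSet bits = new BitSet(array.length * Integer.SIZE);

        // Encode: bit b of array[i] goes to BitSet index i * 32 + b.
        for (int i = 0; i < array.length; i++) {
            for (int b = 0; b < Integer.SIZE; b++) {
                if ((array[i] & (1 << b)) != 0) {
                    bits.set(i * Integer.SIZE + b);
                }
            }
        }

        // Decode: reverse the mapping; order, repetitions, and negative values survive.
        int[] decoded = new int[array.length];
        for (int i = 0; i < array.length; i++) {
            for (int b = 0; b < Integer.SIZE; b++) {
                if (bits.get(i * Integer.SIZE + b)) {
                    decoded[i] |= (1 << b);
                }
            }
        }
        System.out.println(java.util.Arrays.toString(decoded)); // [5, -1, 42]
    }
}

Note that a BitSet does not remember trailing zero bits, so you have to store the array length separately to decode reliably.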


First thought:
use BigInteger: create one with BigInteger.valueOf(value), then call toString(2) on it to get the binary representation, and build a BitSet from that string (I don't see how to do it without parsing the string, however).
--
On re-reading the question, that only helps you create one BitSet per integer, not a single BitSet that contains the whole array.
I don't know how to pack an array of integers into one BitSet. I guess you would need some kind of delimiter, but how to make a good delimiter in binary, that's a good question.
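A minimal sketch of that per-integer idea, assuming non-negative values (the method name toBitSet is made up for illustration):

import java.math.BigInteger;
import java.util.BitSet;

public class SingleIntToBitSet {
    // Convert one non-negative int to a BitSet by parsing its binary string.
    static BitSet toBitSet(int value) {
        String binary = BigInteger.valueOf(value).toString(2); // most significant bit first
        BitSet bits = new BitSet(binary.length());
        for (int i = 0; i < binary.length(); i++) {
            // Bit i of the value is the character i positions from the end of the string.
            if (binary.charAt(binary.length() - 1 - i) == '1') {
                bits.set(i);
            }
        }
        return bits;
    }

    public static void main(String[] args) {
        System.out.println(toBitSet(42)); // prints {1, 3, 5}, since 42 = 0b101010
    }
}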
