Paralleling trading software (continue)

I'm not sure whether I need to use such advanced techniques as PLINQ, which is why I would like to rephrase my previous question, Paralleling trading software. I think my previous question was too complicated and unclear; I hope that this time I have extracted exactly the required information and nothing else.

I have two very similar (I would say almost identical) threads.

Thread1:

while (true) {
    foreach (Object lockObj in lockObjects) {
        lock (lockObj) {
            // do work (may take some time)
        }
    }
}

Thread2 (the same, but doing different work):

while (true) {
    // the same lockObjects from Thread1 are used, so the threads share resources
    foreach (Object lockObj in lockObjects) {
        lock (lockObj) {
            // do another work (may take some time)
        }
    }
}

The profiler says that about 30% of the processor time is spent waiting for locks to be released. How can I avoid that easily? How do I say, "OK, if the object is locked now, postpone processing this object, process another object, and return to this one after a while"?


One approach would be to maintain a queue of free objects and, for each thread, a list of the objects it has already processed. Have each thread take the first object from the queue that it hasn't processed yet, keep it out of the queue while working on it, and put it back at the end of the queue when it's done.
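A minimal sketch of that idea, assuming a shared ConcurrentQueue<object> called freeObjects and a doWork callback (both names are hypothetical, not from the answer above). An object that has been dequeued is invisible to the other thread, so no lock is needed while it is being worked on:

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

class FreeObjectQueueSketch
{
    // shared queue of objects that are currently free (hypothetical name)
    static readonly ConcurrentQueue<object> freeObjects = new ConcurrentQueue<object>();

    static void WorkerLoop(Action<object> doWork, int totalCount)
    {
        // per-thread record of what this thread has already handled in the current pass
        var processed = new HashSet<object>();

        while (true)
        {
            if (!freeObjects.TryDequeue(out var obj))
            {
                Thread.Sleep(1);          // queue momentarily empty: everything is being worked on
                continue;
            }

            if (processed.Contains(obj))
            {
                freeObjects.Enqueue(obj); // already handled in this pass; give it back to the other thread
                if (processed.Count >= totalCount)
                    processed.Clear();    // finished a full pass, start the next one
                Thread.Sleep(1);          // avoid spinning on the same object
                continue;
            }

            doWork(obj);                  // obj is out of the queue, so no other thread can touch it
            processed.Add(obj);
            freeObjects.Enqueue(obj);     // return it so the other thread can process it too
        }
    }
}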


Don't lock. Rewrite your processing to operate without locks. It looks like you could set up a queue for each thread and move objects between the queues. As long as each thread only works on objects from its own queue and an object is never in more than one queue at a time, you won't have to lock anything.
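A minimal sketch of that arrangement. The answer only says "queues", so the choice of ConcurrentQueue for the hand-off, along with the names thread1Queue, thread2Queue, and RunWorker, is my assumption:

using System;
using System.Collections.Concurrent;
using System.Threading;

class PerThreadQueueSketch
{
    // one inbox per thread; the ConcurrentQueue provides the thread-safe hand-off,
    // so no explicit lock is needed around the work itself
    static readonly ConcurrentQueue<object> thread1Queue = new ConcurrentQueue<object>();
    static readonly ConcurrentQueue<object> thread2Queue = new ConcurrentQueue<object>();

    static void RunWorker(ConcurrentQueue<object> myQueue,
                          ConcurrentQueue<object> nextQueue,
                          Action<object> doWork)
    {
        while (true)
        {
            if (myQueue.TryDequeue(out var obj))
            {
                doWork(obj);            // only this thread can see obj right now
                nextQueue.Enqueue(obj); // hand the object over to the other thread
            }
            else
            {
                Thread.Sleep(1);        // nothing to do at the moment
            }
        }
    }

    static void Main()
    {
        for (int i = 0; i < 10; i++)
            thread1Queue.Enqueue(new object());   // seed Thread1's queue

        new Thread(() => RunWorker(thread1Queue, thread2Queue, o => { /* do work */ })).Start();
        new Thread(() => RunWorker(thread2Queue, thread1Queue, o => { /* do another work */ })).Start();
    }
}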


if the object is locked now, postpone processing this object, process another object

Use Monitor.TryEnter / Monitor.Exit instead of lock in both of your threads:

while (true)
{
    foreach (Object lockObj in lockObjects) 
    {
        if (Monitor.TryEnter(lockObj))
        {
            try
            {
                // do another work (may take some time)
            }
            finally
            {
                Monitor.Exit(lockObj);
            }
        }
    }
}

The lock statement is in fact just shorthand for Monitor.Enter / Monitor.Exit wrapped in try/finally, like:

Monitor.Enter(lockObj);
try
{
    // do another work (may take some time)
}
finally
{
    Monitor.Exit(lockObj);
}

Note: TryEnter does not block if the object is already locked; it just returns false. Enter (and the lock statement) blocks until the object is freed, which is why about 30% of the processor time was spent waiting for locks to be released.

Edit: You stated in a comment:

as I understand it, I need to remember whether I have entered or not? Because if TryEnter didn't enter, I need to enter later... or will it do this for me?

TryEnter only acquires the lock if the object is not already locked; otherwise it returns false and does nothing, so it will not enter later by itself. However, you can declare a list at the method level holding the items that have not been processed yet and retry them, something like:

List<object> notQueuedItems = new List<object>(lockObjects);

while (notQueuedItems.Count > 0)
{
    // iterate over a snapshot so items can be removed from the list while looping
    foreach (Object lockObj in notQueuedItems.ToArray())
    {
        if (Monitor.TryEnter(lockObj))
        {
            try
            {
                // do another work (may take some time)
            }
            finally
            {
                Monitor.Exit(lockObj);
                notQueuedItems.Remove(lockObj);
            }
        }
    }

    if (notQueuedItems.Count > 0)
    {
        Thread.Sleep(100); // give it some time to breathe here
    }
}


I think you could use some sort of arbiter to avoid collisions when processing objects.

I've implemented an example for you:

using System;
using System.Collections.Generic;
using System.Linq;

public class Arbiter<T>
{
    private static Random _rnd = new Random();
    private HashSet<T> _all = new HashSet<T>();
    private HashSet<T> _available = new HashSet<T>();        
    private object lockObj = new object();
    private uint _modCnt = 0;
    public void Add(T item)
    {
        lock (lockObj)
        {
            _all.Add(item);
            _available.Add(item);
            _modCnt++;
        }
    }
    public bool Remove(T item)
    {
        lock (lockObj)
        {
            if (_available.Contains(item))
            {
                _available.Remove(item);
                _all.Remove(item);
                _modCnt++;
                return true;
            }
            else if (!_all.Contains(item))
                return false;
            // the item exists but is currently checked out by another thread
            throw new InvalidOperationException("Item is currently in use.");
        }
    }
    private bool CheckOut(T item) 
    {
        lock (lockObj)
        {
            return _available.Remove(item);
        }
    }
    private void CheckIn(T item) 
    {
        lock (lockObj)
        {
            _available.Add(item);
        }
    }
    public IEnumerable<T> getEnumerable()
    {
        LinkedList<T> visited = new LinkedList<T>();
        LinkedList<T> all;
        uint modCnt;
        lock (lockObj)
        {
            //a list of all our items, in random order
            //each enumeration will get a new random order
            //should minimize collisions 
            all = new LinkedList<T>(_all.OrderBy(x => _rnd.Next()));
            modCnt = _modCnt;
        }
        while (all.Count > 0) 
        {
            if (modCnt != _modCnt)
            {//items have been added or removed ...
                modCnt = _modCnt;
                T[] r;
                T[] a;
                lock (lockObj)
                {
                    r = all.Except(_all).ToArray();//items we want to remove
                    a = _all.Except(all.Concat(visited)).ToArray();//items we want to add
                }
                foreach (var item in r)
                    all.Remove(item);
                foreach (var item in a)
                {//random placement for minimized collision probability
                    var node = all.First;
                    int skip = _rnd.Next() % all.Count;
                    for (int i = 0; i < skip && node.Next != null; i++)
                        node = node.Next;
                    all.AddAfter(node, item);
                }
            }
            var current = all.First;
            all.RemoveFirst();
            if (CheckOut(current.Value))
            {//checkout successful -> we can have the item, and no one else will get it until we give it back
                yield return current.Value; // hand the item out for processing
                //note: yield return will _not_ end this method here
                CheckIn(current.Value); // give the item back so others can get it
                visited.AddLast(current); // mark as visited
            }
            else 
            {//someone else has our item ... a.k.a. collision
                all.AddLast(current); // move item to the end of our list to process it later
                //maybe we should take care of the case if there are no other items left
                //we could wait a bit before trying again ... but I don't care right now
            }
        }
    }
}

It should be more or less self-explanatory and show you what I mean (use the result of getEnumerable() in your foreach loops), but it is not meant for production code ... not guaranteed to be bug-free ... ;-)
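For illustration, a minimal usage sketch under my own assumptions (the WorkItem type and the DoWork/DoOtherWork methods are hypothetical placeholders; the answer itself only says to enumerate getEnumerable() in the foreach loops):

using System.Threading;

public static class ArbiterUsageSketch
{
    // hypothetical work item type, just for the example
    public class WorkItem { public int Id; }

    static void DoWork(WorkItem item) { /* do work (may take some time) */ }
    static void DoOtherWork(WorkItem item) { /* do another work (may take some time) */ }

    public static void Main()
    {
        var arbiter = new Arbiter<WorkItem>();
        for (int i = 0; i < 10; i++)
            arbiter.Add(new WorkItem { Id = i });

        // each thread simply enumerates the arbiter instead of locking:
        // an item that the other thread currently holds is skipped and retried later
        new Thread(() => { while (true) foreach (var item in arbiter.getEnumerable()) DoWork(item); }).Start();
        new Thread(() => { while (true) foreach (var item in arbiter.getEnumerable()) DoOtherWork(item); }).Start();
    }
}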
