
What takes up most of the time in this small function? It seems odd, and a profiling breakdown is given below.

blockProperty is a Dictionary&lt;string, string[]&gt;:

    bool BlockMatch(IList<string> container, string block, int cut)
    {
        string[] blockArray = blockProperty[block];
        int length = blockArray.Length - cut;
        if (length > container.Count)
        {
            return false;
        }

        for (int i = 0; i < length; i++)
        {
        if (blockArray[length - 1 - i] != container[container.Count - 1 - i])
            {
                return false;
            }

        }
        return true;
    }


Columns are: inclusive elapsed time, exclusive elapsed time, inclusive percentage (of the whole program), exclusive percentage, and number of calls.

How can I optimize the method according to the profiling breakdown? I find it strange that the exclusive elapsed time of the method (6042) is more than half of the inclusive time (10095).


To break this down any further, you can (for testing purposes) split the function into small one-line subfunctions, so the profiling output is broken down even more. Another help is to double-click the profiling line to see the individual function calls with their relative times.
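For example, a hypothetical split of the method into tiny helpers, just so the profiler attributes time to each step separately (the helper names are made up for illustration):

    // Hypothetical helpers: each step gets its own frame in the profiler output.
    // The JIT may inline methods this small; [MethodImpl(MethodImplOptions.NoInlining)]
    // keeps them visible to the profiler.
    string[] LookupBlock(string block)
    {
        return blockProperty[block];
    }

    bool ItemDiffers(string a, string b)
    {
        return a != b;
    }

    bool BlockMatchSplit(IList<string> container, string block, int cut)
    {
        string[] blockArray = LookupBlock(block);
        int length = blockArray.Length - cut;
        if (length > container.Count)
            return false;

        for (int i = 0; i < length; i++)
        {
            if (ItemDiffers(blockArray[length - 1 - i], container[container.Count - 1 - i]))
                return false;
        }
        return true;
    }

If `LookupBlock` dominates, the dictionary lookup is the cost; if `ItemDiffers` dominates, it is the string comparisons.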


Try:

    if (!blockArray[length - 1 - i].Equals(container[container.Count - 1 - i])) { return false; }


It's possible that traversing the array in reverse order is causing this: as far as I know, arrays are optimized for forward/sequential access. Furthermore, all that index arithmetic might be preventing the JIT from eliminating bounds checks. Try this:

    // Walk forward, comparing blockArray[0..length) against the tail of container.
    int offset = container.Count - length;
    for (int i = 0; i < length; i++)
    {
        if (!StringComparer.Ordinal.Equals(blockArray[i], container[offset + i]))
            return false;
    }

However, in the end it depends on how many items you have in the array: if there are a lot, there isn't much you can do (except use unsafe code, but that is only going to give you a tiny bit extra).

Edit: you can improve the speed of negative cases by caching hash codes, at the cost of extra memory.

class StringAndHash
{
    public int HashCode;
    public string Value;

    public StringAndHash(string value)
    {
        if (value == null)
            HashCode = 0;
        else
            HashCode = StringComparer.Ordinal.GetHashCode(value);
        Value = value;
    }

    public static implicit operator string(StringAndHash value)
    {
        return value == null ? null : value.Value;
    }

    public static implicit operator StringAndHash(string value)
    {
        return new StringAndHash(value);
    }

    public override int GetHashCode()
    {
        return HashCode;
    }

    public override bool Equals(object obj)
    {
        var sah = obj as StringAndHash;
        if (!object.ReferenceEquals(sah, null))
        {
            return Equals(sah);
        }
        return base.Equals(obj);
    }

    public bool Equals(StringAndHash obj)
    {
        return !ReferenceEquals(obj, null)
            && obj.HashCode == HashCode // Comparing cached hashes first improves perf in negative cases.
            && StringComparer.Ordinal.Equals(obj.Value, Value);
    }
}

public Dictionary<string, StringAndHash[]> blockProperty;
bool BlockMatch(IList<StringAndHash> container, string block, int cut)
{
    var blockArray = blockProperty[block];
    var length = blockArray.Length - cut;
    if (length > container.Count)
    {
        return false;
    }

    // Walk forward, comparing blockArray[0..length) against the tail of container.
    int offset = container.Count - length;
    for (int i = 0; i < length; i++)
    {
        if (!blockArray[i].Equals(container[offset + i]))
        {
            return false;
        }
    }
    return true;
}
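A quick sketch of how the hash-caching version might be used, assuming `blockProperty` and `BlockMatch` live on the same class as in the answer (the sample data is made up):

    // The implicit string -> StringAndHash conversion builds the cached hashes.
    blockProperty = new Dictionary<string, StringAndHash[]>
    {
        { "greeting", new StringAndHash[] { "hello", "world" } }
    };

    IList<StringAndHash> container = new List<StringAndHash> { "say", "hello", "world" };

    // cut == 0: compare both elements of the block against the tail of container.
    bool match = BlockMatch(container, "greeting", 0);

The hash comparison is a cheap int-vs-int check, so mismatching strings are usually rejected without a character-by-character comparison; the full string comparison only runs when the hashes collide or actually match.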
