
Why is my return value wrong?

I have a vector class in C# (a fragment is below). My issue is that when I call GetMagnitude(), it always returns 0.0f. Even with the debugger running, I can see that Sq has a valid value inside GetMagnitude(), but as soon as it gets passed back to another function (e.g. Normalize()), the returned value is 0.0f. Can someone explain this and help me fix it? My guess is that it has something to do with the double-to-float conversion, but I just can't figure it out.

public class float3
{
    public float x;
    public float y;
    public float z;


    public float GetMagnitude()
    {
        float SumSquares = (float)(Math.Pow(x, 2) + Math.Pow(y, 2) + Math.Pow(z, 2));
        float Sq = (float)Math.Sqrt(SumSquares);
        return Sq;
    }

    public void Normalize()
    {
        float inverse = 1.0f / GetMagnitude();

        x *= inverse;
        y *= inverse;
        z *= inverse;
    }
}


I just tested your code with this setup and it worked perfectly:

void Main()
{
    var myData = new float3
    {
        x = 1,
        y = 1,
        z = 1
    };
    float result = myData.GetMagnitude();
}

I get the result 1.73...
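(That is the expected answer: √(1² + 1² + 1²) = √3 ≈ 1.7320508, so the double-to-float cast isn't losing the value.)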

Is it possible that the problem is elsewhere? Could you create a small console app and insert that code just to isolate it?
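For what it's worth, the console app could look something like this (a minimal sketch, assuming the float3 class from your question is in the same file):

using System;

public class Program
{
    public static void Main()
    {
        var v = new float3 { x = 1, y = 1, z = 1 };

        // Magnitude of (1, 1, 1) should print 1.7320508.
        Console.WriteLine(v.GetMagnitude());

        // After normalizing, each component should be roughly 0.5773503
        // (i.e. 1 / sqrt(3)), and the magnitude should be ~1.
        v.Normalize();
        Console.WriteLine(v.x + ", " + v.y + ", " + v.z);
        Console.WriteLine(v.GetMagnitude());
    }
}

If that prints the expected values, the bug is almost certainly in the code that calls GetMagnitude(), not in the class itself.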

