Using GPU on Silverlight 5 for general-purpose math

I'm working on an in-browser Silverlight application that has some fairly compute-intensive operations, e.g., running an Inverse Discrete Cosine Transform or a Fast Fourier Transform hundreds of times a second. It would be valuable to offload as much of this as possible onto the computer's GPU. I know there was some discussion of this with Silverlight 3 and 4, using pixel shaders, but the consensus was that because Silverlight 3/4 didn't hardware-accelerate their pixel shaders, and because their shader language was limited to Shader Model 2.0, it wasn't going to yield much of a performance increase, if any.
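For concreteness, the per-block work looks something like this minimal, unoptimized 1-D IDCT in C# (the 2/N scaling with a halved DC term is one common convention; a production version would use a fast O(N log N) factorization):

    // Naive 1-D inverse DCT (a DCT-III), O(N^2) per call.
    // Scaling conventions vary between libraries; this uses the common
    // "2/N with halved DC term" form for an unnormalized forward DCT-II.
    static double[] NaiveIdct(double[] coeffs)
    {
        int n = coeffs.Length;
        var output = new double[n];
        for (int i = 0; i < n; i++)
        {
            double sum = coeffs[0] / 2.0;
            for (int k = 1; k < n; k++)
                sum += coeffs[k] * Math.Cos(Math.PI * k * (2 * i + 1) / (2.0 * n));
            output[i] = 2.0 * sum / n;
        }
        return output;
    }

Running that hundreds of times a second adds up quickly, hence the interest in the GPU.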

However, Silverlight 5 supposedly has a much broader range of hardware-accelerated graphics, including a reasonably complete 3D pipeline. I haven't yet heard whether anyone has been able to leverage this pipeline to accelerate general-purpose mathematical operations (like FFTs, DCTs, and IDCTs). Has anyone tried that yet? Any pointers on where to start looking?


I thought I'd post back what I've discovered so far. The short answer is no: I don't think the 3D pipeline in Silverlight 5 can be leveraged for this sort of thing. To be fair, from what I can tell, the pixel and vertex shaders that are part of the pipeline do, in fact, execute on the GPU (unlike the 2D pixel shaders in Silverlight 4, which executed on the CPU).

But that said:

(1) Everything I've read says that getting data onto the GPU is very fast, but that on most machines, getting that data back out is much slower, on the order of milliseconds. At hundreds of transforms per second, even a millisecond or two of readback per transform would consume most of the time budget before any actual math got done. That makes it unlikely that we could load the GPU with the data for an FFT, run the FFT, and pull the results back faster than we could just do the whole thing on the CPU.

(2) Silverlight 5 has a very limited set of instructions it can execute on the GPU. Specifically, it's limited to HLSL Shader Model 2.0, which caps a pixel shader at roughly 64 arithmetic instructions and offers no dynamic branching or real loops. I doubt it would be possible -- at best, it would be very difficult and very slow -- to model an FFT or a DCT within those limits; you'd have to break the transform into many separate render passes, along the lines of the sketch after this list.

(3) But even if we could get around those two limitations, from what I can tell, Silverlight has no way to read back the results of the calculations the GPU is performing. Full XNA (the framework on which Silverlight's 3D features are based) has various GetData() and GetTexture() methods that you could use to read the results of a set of calculations, but the equivalent methods are missing from the Silverlight 5 classes. From everything I can tell, in Silverlight 5 the GPU is a write-only device: you load your shaders onto it, you load up your data, you pull the trigger, and you wave goodbye. Your code will never see those bytes again.
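To make (2) and (3) concrete, here is roughly the shape a GPGPU FFT would have to take, sketched in C# with full-XNA names (Effect, RenderTarget2D, Texture2D.GetData). None of this is working Silverlight 5 code: fftStageEffect stands in for a set of hand-written ps_2_0 stage shaders, and DrawFullScreenQuad is a hypothetical helper.

    // Hypothetical multi-pass FFT driver, using full-XNA class names.
    // One full-screen draw per butterfly stage, because a ps_2_0 shader
    // has no loops or dynamic branching to iterate internally.
    Texture2D RunFftPasses(GraphicsDevice device, Texture2D inputTexture,
                           Effect fftStageEffect, int n, int log2n)
    {
        var ping = new RenderTarget2D(device, n, 1, false,
            SurfaceFormat.Vector4, DepthFormat.None);
        var pong = new RenderTarget2D(device, n, 1, false,
            SurfaceFormat.Vector4, DepthFormat.None);

        Texture2D current = inputTexture;  // complex values packed into texels
        for (int stage = 0; stage < log2n; stage++)
        {
            device.SetRenderTarget(ping);
            fftStageEffect.Parameters["Source"].SetValue(current);
            fftStageEffect.Parameters["Stage"].SetValue((float)stage);
            fftStageEffect.CurrentTechnique.Passes[0].Apply();
            DrawFullScreenQuad(device);    // hypothetical helper
            device.SetRenderTarget(null);

            current = ping;                // this pass's output feeds the next
            var swap = ping; ping = pong; pong = swap;
        }

        // In full XNA you could now pull the result back to the CPU --
        // this is exactly the step Silverlight 5 omits:
        var results = new Vector4[n];
        current.GetData(results);

        return current;
    }

And even if Silverlight 5 did expose GetData, that final readback is exactly the millisecond-scale transfer described in point (1).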

If it turns out that I'm wrong on this, I'll come back here and update this answer. But at least at the moment, it looks as if this is a dead end.

[Edit 10/10/11 - According to Shawn Hargreaves from MS, this isn't supported in Silverlight 5. His guess as to why is that (a) it would be difficult to get it working consistently across all GPU drivers, and (b) for all but a tiny class of demo-ware-style problems, it wouldn't make any sense. Oh well.]

