I have an NVIDIA GeForce 8400GS graphics card with a DVI output, and I would like to take a video or a series of frames and display them over the DVI output at WUXGA (1,920 × 1,200) @ 120 Hz with GTF timing.
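A rough back-of-the-envelope check (my own numbers, not from the question): even ignoring blanking entirely, the pixel clock needed for that mode is

    1,920 * 1,200 * 120 Hz ≈ 276.5 MHz

Single-link DVI tops out at roughly 165 MHz and dual-link at roughly 330 MHz, and GTF adds non-reduced blanking on top of the active area, which likely pushes the real requirement past even the dual-link limit.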
In Direct2D they recommend drawing similar things together, to avoid unnecessary GPU state changes. They also do some drawing operation reordering behind the scenes just for that.
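As a purely illustrative sketch of that batching idea (plain C++ with hypothetical names, not the actual Direct2D API): queue draw commands with a state key such as the brush or texture, stable-sort them by that key, and then bind each state only once per group when submitting.

    // Conceptual only: "stateKey" stands in for whatever is expensive to
    // switch on the GPU (brush, texture, blend state, ...).
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    struct DrawCmd {
        int stateKey;
        int primitiveId;
    };

    int main() {
        std::vector<DrawCmd> queue = {{2, 0}, {1, 1}, {2, 2}, {1, 3}};

        // Stable sort keeps the original order within each state group.
        std::stable_sort(queue.begin(), queue.end(),
                         [](const DrawCmd& a, const DrawCmd& b) {
                             return a.stateKey < b.stateKey;
                         });

        int boundState = -1;
        for (const DrawCmd& cmd : queue) {
            if (cmd.stateKey != boundState) {
                boundState = cmd.stateKey;
                std::printf("bind state %d\n", boundState);  // once per group
            }
            std::printf("  draw primitive %d\n", cmd.primitiveId);
        }
        return 0;
    }

Note that reordering is only safe when the result does not depend on draw order (e.g. non-overlapping primitives), which is presumably why Direct2D keeps that behind the scenes.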
I might be mixing apples and oranges in this question since I'm a noob in the mentioned areas, so please try to understand what I mean.
The new MacBook Pros come with two graphics adapters, the Intel HD Graphics and the NVIDIA GeForce GT 330M. OS X switches back and forth between them depending on the workload and the detection of an external display.
What are the best IDEs / IDE plugins / tools, etc. for programming with CUDA / MPI? I've been working in these frameworks for a short while but feel like the IDE could be doing more heavy lifting.
For my studies, we have code for matrix multiplication for sizes between 1000 and 10000. It looks pretty fast and uses the GPU for the calculations. As homework we need to find number-crunching applications.
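For reference, a minimal naive CUDA matrix-multiply sketch along those lines (my own version, assuming square N x N float matrices; it is not the coursework code, and for the 1000-10000 sizes mentioned a tiled kernel or cuBLAS would be much faster):

    #include <cuda_runtime.h>
    #include <cstdio>
    #include <vector>

    // One thread computes one element of C = A * B (row-major, n x n).
    __global__ void matmul(const float* A, const float* B, float* C, int n) {
        int row = blockIdx.y * blockDim.y + threadIdx.y;
        int col = blockIdx.x * blockDim.x + threadIdx.x;
        if (row < n && col < n) {
            float sum = 0.0f;
            for (int k = 0; k < n; ++k)
                sum += A[row * n + k] * B[k * n + col];
            C[row * n + col] = sum;
        }
    }

    int main() {
        const int n = 1024;
        size_t bytes = size_t(n) * n * sizeof(float);
        std::vector<float> hA(n * n, 1.0f), hB(n * n, 2.0f), hC(n * n);

        float *dA, *dB, *dC;
        cudaMalloc(&dA, bytes); cudaMalloc(&dB, bytes); cudaMalloc(&dC, bytes);
        cudaMemcpy(dA, hA.data(), bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dB, hB.data(), bytes, cudaMemcpyHostToDevice);

        dim3 block(16, 16);
        dim3 grid((n + block.x - 1) / block.x, (n + block.y - 1) / block.y);
        matmul<<<grid, block>>>(dA, dB, dC, n);
        cudaDeviceSynchronize();

        cudaMemcpy(hC.data(), dC, bytes, cudaMemcpyDeviceToHost);
        // With A all ones and B all twos, every C element should be 2 * n.
        std::printf("C[0] = %f (expect %f)\n", hC[0], 2.0f * n);

        cudaFree(dA); cudaFree(dB); cudaFree(dC);
        return 0;
    }

Build it with nvcc, e.g. nvcc matmul.cu -o matmul.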
If I want to rewrite my application so that it leverages the power of NVIDIA's CUDA SDK, are there any differences at all in runtime performance between the different SDK offerings:
Just wondering - throwing ideas around in my head - about starting a new XNA project for the 360. I would like it to be retro old-school, emulating scanlines, color palettes, and such.
I have a WinForms application that uses XNA to animate 3D models in a control. The app has been doing just fine for months, but recently I've started to experience periodic pauses in the animation.