OpenGL: How to render perfect rectangular gradient?

I can render a triangular gradient with just one triangle, using glColor for each corner.

But how do I render a perfect rectangular gradient? I tried with one quad, but the middle gets an ugly seam. I also tried a 2x2 texture, which blends properly from each corner the way it should be done, but the texture sampling becomes imprecise when stretched too much (I started to see pixels bigger than 1x1).

Is there some way of calculating this in a shader perhaps?

--

Edit: Links to the images were broken (removed).


Indeed, the kind of gradient you want relies on 4 colors at each pixel, whereas OpenGL typically only interpolates inputs over triangles (so 3 inputs). Getting the perfect gradient is not possible with just the standard interpolants.

Now, as you mentioned, a 2x2 texture can do it. If you did see precision issues, I suggest switching the texture to a format with more precision (like a float texture).
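
For illustration, a minimal sketch of such a texture upload (untested; the variable names are mine, GL_RGBA32F assumes OpenGL 3.0+ or the ARB_texture_float extension, and the yellow/red corner colors match the example used in the other answers):

/* 2x2 float texture: yellow at (0,0) and (1,1), red at (1,0) and (0,1). */
GLfloat corners[2][2][4] = {
    { {1.0f, 1.0f, 0.0f, 1.0f}, {1.0f, 0.0f, 0.0f, 1.0f} },  /* bottom row: yellow, red */
    { {1.0f, 0.0f, 0.0f, 1.0f}, {1.0f, 1.0f, 0.0f, 1.0f} },  /* top row:    red, yellow */
};

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, 2, 2, 0, GL_RGBA, GL_FLOAT, corners);

/* Map the quad's texture coordinates to the texel centers (0.25 .. 0.75)
   so the bilinear blend spans the whole quad instead of clamping near the edges. */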

Last, and as you also mentioned in your question, you can solve this with a shader. Say you pass an extra attribute per vertex that corresponds to (u,v) = (0,0), (0,1), (1,0), (1,1), all the way to the pixel shader (with the vertex shader just doing a pass-through).

You can do the following in the pixel shader (note, the idea here is sound, but I did not test the code):

Vertex shader:

attribute vec2 uvIn;
varying vec2 uv;

void main() {
    uv = uvIn;
    gl_Position = ftransform();
}

Fragment shader:

uniform vec3 color0;
uniform vec3 color1;
varying vec2 uv;

void main() {
    // From Wikipedia, bilinear interpolation on the unit square:
    // f(x,y) = f(0,0)(1-x)(1-y) + f(1,0)x(1-y) + f(0,1)(1-x)y + f(1,1)xy
    // Applied here, with color0 at (0,0) and (1,1), color1 at (1,0) and (0,1):
    // color = color0 * ((1-x)*(1-y) + x*y) + color1 * (x*(1-y) + (1-x)*y)
    //       = color0 * (1 - x - y + 2*x*y) + color1 * (x + y - 2*x*y)
    // After simplification, with t = x + y - 2*x*y:
    // color = color0 * (1-t) + color1 * t = mix(color0, color1, t)
    float t = uv.x + uv.y - 2.0 * uv.x * uv.y;
    gl_FragColor = vec4(mix(color0, color1, t), 1.0);
}
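
For completeness, a hedged client-side sketch of how these shaders might be fed (old-style immediate mode, to match the glColor usage in the question; prog is a hypothetical program object with the two shaders above already attached, and attribute location 1 is an arbitrary choice):

/* Pin uvIn to a non-zero generic attribute location before linking
   (index 0 aliases the vertex position in compatibility contexts). */
glBindAttribLocation(prog, 1, "uvIn");
glLinkProgram(prog);

glUseProgram(prog);
glUniform3f(glGetUniformLocation(prog, "color0"), 1.0f, 1.0f, 0.0f);  /* yellow at (0,0) and (1,1) */
glUniform3f(glGetUniformLocation(prog, "color1"), 1.0f, 0.0f, 0.0f);  /* red at (1,0) and (0,1)    */

glBegin(GL_QUADS);
    glVertexAttrib2f(1, 0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
    glVertexAttrib2f(1, 1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
    glVertexAttrib2f(1, 1.0f, 1.0f); glVertex2f( 1.0f,  1.0f);
    glVertexAttrib2f(1, 0.0f, 1.0f); glVertex2f(-1.0f,  1.0f);
glEnd();
glUseProgram(0);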


The problem is that you are using a quad. The quad is drawn as two triangles, but the triangles are not in the orientation that you need.

If I define the quad vertices as:

  • A: bottom left vertex
  • B: bottom right vertex
  • C: top right vertex
  • D: top left vertex

I would say that the quad is composed of the following triangles:

  • A B D
  • D B C

The colors assigned to each vertex are:

  • A: yellow
  • B: red
  • C: yellow
  • D: red

Keeping in mind the geometry (the two triangles), the pixels between D and B are the result of interpolating between red and red: indeed, red!

The solution would be a geometry with two triangles, but oriented differently:

  • A B C
  • A C D

But you will probably not get the exact gradient, since in the middle of the quad you will get full yellow instead of yellow mixed with red. So, I suppose you can approximate the exact result using 4 triangles (or a triangle fan), in which the center vertex is the interpolation between the yellow and the red.
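
A minimal immediate-mode sketch of that last suggestion, using the vertex labels above and a center vertex E colored with the 50/50 mix of yellow and red (the coordinates are my own placeholder values):

/* Triangle fan: center vertex first, then the corners A, B, C, D, and A again to close. */
glBegin(GL_TRIANGLE_FAN);
    glColor3f(1.0f, 0.5f, 0.0f); glVertex2f( 0.0f,  0.0f);  /* E: halfway between yellow and red */
    glColor3f(1.0f, 1.0f, 0.0f); glVertex2f(-1.0f, -1.0f);  /* A: yellow */
    glColor3f(1.0f, 0.0f, 0.0f); glVertex2f( 1.0f, -1.0f);  /* B: red    */
    glColor3f(1.0f, 1.0f, 0.0f); glVertex2f( 1.0f,  1.0f);  /* C: yellow */
    glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-1.0f,  1.0f);  /* D: red    */
    glColor3f(1.0f, 1.0f, 0.0f); glVertex2f(-1.0f, -1.0f);  /* A again, closing the fan */
glEnd();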


Whoops! Effectively, the result is not what I was expecting. I thought the gradient was produced by linear interpolation between colors, but apparently it is not (I really need to set up my LCD color space!). Indeed, the most scalable solution is rendering with fragment shaders.

Keep the solution proposed by Bahbar. I would advise starting with a pass-through vertex/fragment shader (specifying only vertices and colors, you should get the previous result); then start playing with the mix function and the texture coordinates passed to the vertex shader.

You really need to understand the rendering pipeline with programmable shaders: the vertex shader is called once per vertex, and the fragment shader is called once per fragment (without multisampling, a fragment is a pixel; with multisampling, a pixel is made up of several samples that are combined to get the final pixel color).

The vertex shader takes input parameters (uniforms and attributes): uniforms are constant for all vertices issued between glBegin/glEnd; attributes are specific to each vertex shader instance (4 vertices, 4 vertex shader instances).

A fragment shader takes as input the outputs of the vertex shader invocations that produced the fragment (via the rasterization of triangles, lines and points). In Bahbar's answer the only such output is the uv varying (common to both shader sources).

In your case, the vertex shader outputs the vertex texture coordinates UV (passed as-is). These UV coordinates are available for each fragment, computed by interpolating the values output by the vertex shader depending on the fragment position.

Once you have those coordinates, you only need two colors: the yellow and the red in your case (corresponding to the color0 and color1 uniforms in Bahbar's answer). Then mix those colors depending on the UV coordinates of the specific fragment. (*)

(*) Here is the power of shaders: you can use different interpolation methods by simply modifying the shader source. Linear, bilinear or spline interpolation can be implemented by changing the fragment shader's math (and passing any additional uniforms it needs).

Good practice!


Do all of your vertices have the same depth (Z) value, and are all of your triangles completely on-screen? If so, then you should have no problem getting a "perfect" color gradient over a quad made from two triangles with glColor. If not, then it's possible that your OpenGL implementation treats colors poorly.

This leads me to suspect that you may have a very old or strange OpenGL implementation. I recommend that you tell us what platform you're using, and what version of OpenGL you have...?

Without any more information, I recommend you attempt writing a shader, and avoid telling OpenGL that you want a "color." If possible, tell it that you want a "texcoord" but treat it like a color anyway. This trick has worked in some cases where color accuracy is too low.
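
As a rough sketch of that trick, assuming a compatibility-profile context and a hypothetical program prog built from the two shader strings below:

/* Pass the texture coordinate through and treat it as a color in the fragment shader. */
const char *vertSrc =
    "void main() {\n"
    "    gl_TexCoord[0] = gl_MultiTexCoord0;\n"
    "    gl_Position    = ftransform();\n"
    "}\n";
const char *fragSrc =
    "void main() {\n"
    "    gl_FragColor = gl_TexCoord[0];\n"
    "}\n";

/* ... compile and link into prog as usual, then: */
glUseProgram(prog);
glBegin(GL_QUADS);
    glTexCoord3f(1.0f, 1.0f, 0.0f); glVertex2f(-1.0f, -1.0f);  /* "yellow" as a texcoord */
    glTexCoord3f(1.0f, 0.0f, 0.0f); glVertex2f( 1.0f, -1.0f);  /* "red"                  */
    glTexCoord3f(1.0f, 1.0f, 0.0f); glVertex2f( 1.0f,  1.0f);  /* "yellow"               */
    glTexCoord3f(1.0f, 0.0f, 0.0f); glVertex2f(-1.0f,  1.0f);  /* "red"                  */
glEnd();
glUseProgram(0);

Note that the texture coordinate is still interpolated per triangle, so this only works around low color precision; the full bilinear blend still needs something like the mix-based shader in the first answer.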
