
Simple OpenGL code always causes a segmentation fault (C++ on Ubuntu, virtual machine)

I just started using OpenGL in C++ for a class (I have previously used it a fair amount in Java). I started off trying to write something substantial, but I couldn't get it to stop segfaulting, so I wrote this little piece of code, which is nearly a line-for-line copy of an example from the first chapter of the red book. It also segfaults, and my question is why. I have tried both Eclipse and NetBeans, and glut.h is linked in my projects in both. I am running 64-bit Ubuntu 10.04 in a VMware virtual machine; gcc and freeglut are both installed, and both IDEs run regular (non-OpenGL) C++ code I write without segfaulting.

Anyway here is the code:

#include <stdlib.h>
#include <GL/freeglut.h>
#include <stdio.h>

void init(){
    glClearColor(0.0, 0.0, 0.0, 0.0);
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0.0, 1.0, 0.0, 1.0, -1.0, 1.0);

}
void display(){
    glClear(GL_COLOR_BUFFER_BIT);
    glColor3f(1.0, 1.0, 1.0);
    glBegin(GL_POLYGON);
        glVertex3f(0.25, 0.25, 0.0);
        glVertex3f(0.75, 0.25, 0.0);
        glVertex3f(0.75, 0.75, 0.0);
        glVertex3f(0.25, 0.75, 0.0);
    glEnd();
    glFlush();
}
int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowSize(250,250);    //if I comment out this line,
    glutInitWindowPosition(100,100);
    glutCreateWindow(argv[0]);  //this line,
    init();  //this line and the glut main loop line it runs without any errors, but why wouldn't it? It's not doing anything now!
    glutDisplayFunc(display);
    glutMainLoop();    //if I comment out just this line I get illegal instruction instead of segfault but I need this line
    return 0;
}

Thread [1] 28944 (Suspended : Signal : SIGSEGV: Segmentation fault)

XF86DRIQueryVersion() at 0x7ffff7e7412e
XF86DRIQueryExtension() at 0x7ffff7e742c9
0x7ffff7e73c70
0x7ffff7e53ff8
glXGetFBConfigs() at 0x7ffff7e4c71e
glXChooseFBConfigSGIX() at 0x7ffff7e4cd97
fgChooseFBConfig() at freeglut_window.c:205 0x7ffff794a8c7
fgOpenWindow() at freeglut_window.c:768 0x7ffff794aac8
fgCreateWindow() at freeglut_structure.c:106 0x7ffff7948f62
glutCreateWindow() at freeglut_window.c:1183 0x7ffff794a2a2
main() at (project stuff here):54 0x40100b


This could help: add GLUT_DEPTH to your glutInitDisplayMode flags. It works for me with that flag and segfaults at glutCreateWindow without it.


Do you have hardware acceleration inside the virtual machine? Check using glxinfo. The crash inside the DRI suggests that you do not.


I had a similar problem with different code, and using Debian 6.0 remotely from Mac OS X 10.6. The machine in question was using a Radeon HD 5670 card (although this should not be relevant).

Upon closer inspection it seemed that something with OpenGL itself was not working properly, because it was not only my code that failed.

Some of the symptoms I encountered include:

  • Programs using OpenGL crash with SIGSEGV at XF86DRIQueryVersion()
  • glxinfo/glxgears crashes with SIGSEGV as well

For me the solution was to remove the Radeon fglrx driver (remove all packages with aptitude) and install the 'radeon' driver instead. Then run 'sudo Xorg -configure' to generate a new configuration, test it with 'sudo X -config /root/xorg.conf.new' and copy it to your default xorg.conf location.

After that the programs did not crash with SIGSEGV anymore, although I got a different error:

  • glxinfo gives: Got Error: couldn't find RGB GLX visual or fbconfig
  • glxgears gives: Error: couldn't get an RGB, Double-buffered visual

This in turn seemed to be related to direct rendering (DRI) giving trouble. A solution to this was to disable direct rendering by setting 'export LIBGL_ALWAYS_INDIRECT=yes'. An alternative is to remove the library used for DRI (in my case this was libgl1-mesa-dri).

I'm still not quite convinced that this is the whole solution, since the fglrx driver shouldn't matter when only using glxinfo and glxgears from a remote terminal. I suspect that removing fglrx and installing radeon somehow fixed something in the configuration.

Some references I used include:

  • http://forum.parallels.com/showthread.php?p=382913#post382913
  • http://wiki.debian.org/AtiHowTo
  • https://discussions.apple.com/message/10140344#10140344
  • http://forums.debian.net/viewtopic.php?f=7&t=63730


My hunch is that if you compile the following pared down example, you'll still get a segmentation fault:

#include <stdlib.h>
#include <GL/freeglut.h>
#include <stdio.h>

int main(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutInitWindowSize(250,250);
    glutInitWindowPosition(100,100);
    glutCreateWindow("test window");
    return 0;
}

Because your segmentation fault is occurring in:

fgCreateWindow() at freeglut_structure.c:106 0x7ffff7948f62 glutCreateWindow()

In other words, it can't create a window at all (it doesn't matter what's in it). My guess, along the lines of what @Matias replied, is that you need to enable 3D acceleration within your VM. But it might be a display driver issue, a 32- vs 64-bit problem, or something slightly more sinister such as a mismatch between freeglut and your version of OpenGL -- hence my previous barrage of questions. Could you re-compile this pared-down version and post the results?
