
Fixing Segmentation faults in C++

I am writing a cross-platform C++ program for Windows and Unix. On the Windows side, the code compiles and runs with no problem. On the Unix side, it compiles, but when I try to run it I get a segmentation fault. My initial hunch is that there is a problem with pointers.

What are good methodologies to find and fix segmentation fault errors?


  1. Compile your application with -g so that you'll have debug symbols in the binary file.

  2. Use gdb to open the gdb console.

  3. Use file and pass it your application's binary file in the console.

  4. Use run and pass in any arguments your application needs to start.

  5. Do something to cause a Segmentation Fault.

  6. Type bt in the gdb console to get a stack trace of the Segmentation Fault.
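
As a concrete illustration, here is a minimal made-up program that segfaults, together with the commands from the steps above (file and program names are placeholders, output trimmed):

    // crash.cpp: dereferences a null pointer on purpose.
    #include <iostream>

    int length(const char* s) {
        int n = 0;
        while (s[n] != '\0')   // crashes here when s is a null pointer
            ++n;
        return n;
    }

    int main() {
        const char* text = nullptr;   // oops: never assigned a real string
        std::cout << length(text) << std::endl;
        return 0;
    }

    // Typical session (commands only):
    //   g++ -g crash.cpp -o crash      (step 1)
    //   gdb                            (step 2)
    //   (gdb) file ./crash             (step 3)
    //   (gdb) run                      (step 4; this program crashes on its own)
    //   (gdb) bt                       (step 6: shows length() called from main())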


Sometimes the crash itself isn't the real cause of the problem; perhaps the memory got smashed at an earlier point, but it took a while for the corruption to show itself. Check out valgrind, which has lots of checks for pointer problems (including array bounds checking). It'll tell you where the problem starts, not just the line where the crash occurs.
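
For example, here is a hedged sketch (names are illustrative) of the kind of bug Valgrind reports at the bad write rather than at whatever crash it causes later:

    // smash.cpp: writes one byte past the end of a heap array.
    #include <cstring>

    int main() {
        char* buf = new char[8];
        std::strcpy(buf, "12345678");   // 9 bytes including '\0': heap overflow
        delete[] buf;
        return 0;
    }

    // Running "valgrind ./smash" reports an invalid write at the strcpy line,
    // even if the program happens not to crash there.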


Before the problem arises, try to avoid it as much as possible:

  • Compile and run your code as often as you can. It will be easier to locate the faulty part.
  • Try to encapsulate low-level / error-prone routines so that you rarely have to work directly with memory (pay attention to how you model your program).
  • Maintain a test suite. Having an overview of what is currently working, what is no longer working, etc. will help you figure out where the problem is (Boost.Test is a possible solution; I don't use it myself, but its documentation can help you understand what kind of information should be reported; a minimal sketch follows this list).
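
A minimal Boost.Test case might look like the following sketch; the module name and the function under test are invented for illustration:

    // tests.cpp: minimal Boost.Test suite (header-only variant).
    #define BOOST_TEST_MODULE StringUtils
    #include <boost/test/included/unit_test.hpp>
    #include <cstring>

    // Hypothetical function under test.
    int safe_length(const char* s) { return s ? static_cast<int>(std::strlen(s)) : 0; }

    BOOST_AUTO_TEST_CASE(length_handles_null)
    {
        BOOST_CHECK_EQUAL(safe_length(nullptr), 0);
        BOOST_CHECK_EQUAL(safe_length("abc"), 3);
    }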

Use appropriate tools for debugging. On Unix:

  • GDB can tell you where your program crashed and will let you see in what context.
  • Valgrind will help you to detect many memory-related errors.
  • With GCC you can also use mudflap. With GCC, Clang and (experimentally, since October) MSVC you can use AddressSanitizer / MemorySanitizer. It can detect some errors that Valgrind doesn't, and the performance loss is lighter. It is enabled by compiling with the -fsanitize=address flag (see the sketch after this list).
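
As a rough sketch (the file name is a placeholder), here is a use-after-free that AddressSanitizer flags at the bad access:

    // uaf.cpp: reads from memory after it has been freed.
    int main() {
        int* p = new int[4];
        delete[] p;
        return p[0];        // heap-use-after-free
    }

    // Compile and run with something like:
    //   g++ -g -fsanitize=address uaf.cpp -o uaf && ./uaf
    // AddressSanitizer aborts with a heap-use-after-free report and a stack trace.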

Finally, I would recommend the usual things: the more readable, maintainable, clear and neat your program is, the easier it will be to debug.


On Unix you can use valgrind to find issues. It's free and powerful. If you'd rather do it yourself, you can overload the new and delete operators to place a known guard value such as 0xDEADBEEF before and after each object you allocate. Then track what happens at each iteration. This can fail to catch everything (you aren't guaranteed to even touch those bytes) but it has worked for me in the past on a Windows platform.
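
A simplified sketch of that guard idea, assuming a header word plus a trailing marker byte rather than exactly one byte on each side (the constants and names are invented, and a real version would also cover the array forms and alignment more carefully):

    #include <cstdio>
    #include <cstdlib>
    #include <new>

    namespace {
        const std::size_t kGuard = 0xDEADBEEF;   // leading guard word
        const unsigned char kTail = 0xDB;        // trailing marker byte
        struct Header { std::size_t size; std::size_t guard; };
    }

    void* operator new(std::size_t size) {
        unsigned char* raw = static_cast<unsigned char*>(
            std::malloc(sizeof(Header) + size + 1));
        if (!raw) throw std::bad_alloc();
        Header* h = reinterpret_cast<Header*>(raw);
        h->size = size;
        h->guard = kGuard;
        raw[sizeof(Header) + size] = kTail;      // byte just past the user block
        return raw + sizeof(Header);
    }

    void operator delete(void* p) noexcept {
        if (!p) return;
        unsigned char* raw = static_cast<unsigned char*>(p) - sizeof(Header);
        Header* h = reinterpret_cast<Header*>(raw);
        if (h->guard != kGuard || raw[sizeof(Header) + h->size] != kTail)
            std::fprintf(stderr, "heap corruption detected at %p\n", p);
        std::free(raw);
    }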


Yes, there is a problem with pointers. Very likely you're using one that's not initialized properly, but it's also possible that you're messing up your memory management with double frees or some such.

To avoid uninitialized pointers as local variables, try declaring them as late as possible, preferably (and this isn't always possible) when they can be initialized with a meaningful value. Convince yourself, by examining the code, that they will have a value before they're used. If you have difficulty with that, initialize them to a null pointer constant (usually written as NULL or 0) and check them.
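
An illustrative contrast, with invented names: declare the pointer where it can be given a value, and check it if that value can be null:

    #include <iostream>

    struct Widget { void draw() const { std::cout << "drawing\n"; } };

    // Hypothetical lookup that may fail and return a null pointer.
    Widget* findWidget(bool found) {
        static Widget w;
        return found ? &w : nullptr;
    }

    int main() {
        // Declared as late as possible, initialized immediately, then checked.
        Widget* w = findWidget(false);
        if (w != nullptr)
            w->draw();
        else
            std::cout << "no widget\n";
        return 0;
    }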

To avoid uninitialized pointers as member values, make sure they're initialized properly in the constructor, and handled properly in copy constructors and assignment operators. Don't rely on an init function for memory management, although you can for other initialization.
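
A sketch of what that looks like, with an invented class: the pointer member is set in every constructor's initializer list, and the copy operations allocate their own storage:

    #include <cstring>

    class Buffer {
    public:
        explicit Buffer(std::size_t n) : data_(new char[n]), size_(n) {}
        Buffer(const Buffer& other) : data_(new char[other.size_]), size_(other.size_) {
            std::memcpy(data_, other.data_, size_);
        }
        Buffer& operator=(const Buffer& other) {
            if (this != &other) {
                char* copy = new char[other.size_];
                std::memcpy(copy, other.data_, other.size_);
                delete[] data_;
                data_ = copy;
                size_ = other.size_;
            }
            return *this;
        }
        ~Buffer() { delete[] data_; }
    private:
        char* data_;          // never left uninitialized
        std::size_t size_;
    };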

If your class doesn't need copy constructors or assignment operators, you can declare them as private member functions and never define them. That will cause a compiler error if they're explicitly or implicitly used.
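
The pre-C++11 idiom described above, sketched with an invented class name (in C++11 and later you would declare them with = delete instead):

    class Logger {
    public:
        Logger() {}
    private:
        // Declared but never defined: any explicit or implicit copy
        // fails to compile from outside the class, or to link from inside it.
        Logger(const Logger&);
        Logger& operator=(const Logger&);
    };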

Use smart pointers when applicable. The big advantage here is that, if you stick to them and use them consistently, you can completely avoid writing delete and nothing will be double-deleted.
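
For instance, with the standard std::unique_ptr (the class name here is invented):

    #include <iostream>
    #include <memory>

    struct Session { ~Session() { std::cout << "closed\n"; } };

    int main() {
        std::unique_ptr<Session> s(new Session);   // std::make_unique in C++14 and later
        // ... use *s or s-> as you would a raw pointer ...
        return 0;   // the Session is deleted exactly once, with no delete written by hand
    }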

Use C++ strings and container classes whenever possible, instead of C-style strings and arrays. Consider using .at(i) rather than [i], because that will force bounds checking. See if your compiler or library can be set to check bounds on [i], at least in debug mode. Segmentation faults can be caused by buffer overruns that write garbage over perfectly good pointers.
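
A small illustration of the difference: at() reports the mistake, while operator[] silently invokes undefined behaviour:

    #include <iostream>
    #include <stdexcept>
    #include <vector>

    int main() {
        std::vector<int> v(3, 0);
        try {
            std::cout << v.at(10) << "\n";   // out of range: throws
        } catch (const std::out_of_range& e) {
            std::cout << "caught: " << e.what() << "\n";
        }
        // v[10] would be undefined behaviour: it might crash, or silently
        // read or write memory that belongs to something else.
        return 0;
    }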

Doing those things will considerably reduce the likelihood of segmentation faults and other memory problems. They will doubtless fail to fix everything, and that's why you should use valgrind now and then when you don't have problems, and valgrind and gdb when you do.


I don't know of any methodology for fixing things like this, and I don't think it would be possible to come up with one, because the very issue at hand is that your program's behavior is undefined (I don't know of any case where a SEGFAULT wasn't caused by some sort of UB).

There are all kinds of "methodologies" to avoid the issue before it arises. One important one is RAII.
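
A minimal RAII sketch with invented names: the resource is released in the destructor, so every exit path, including exceptions, cleans up:

    #include <cstdio>

    class File {
    public:
        explicit File(const char* path) : f_(std::fopen(path, "r")) {}
        ~File() { if (f_) std::fclose(f_); }   // released on every exit path
        bool ok() const { return f_ != 0; }
        std::FILE* get() const { return f_; }
    private:
        std::FILE* f_;
        File(const File&);                     // non-copyable
        File& operator=(const File&);
    };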

Besides that, you just have to throw your best psychic energies at it.
