
Note: You are looking at a static copy of the former PineWiki site, used for class notes by James Aspnes from 2003 to 2012. Many mathematical formulas are broken, and there are likely to be other bugs as well. These will most likely not be fixed. You may be able to find more up-to-date versions of some of these notes at http://www.cs.yale.edu/homes/aspnes/#classes.

1. Core files

When a program running under Unix encounters certain errors, it "dumps core." This means that the operating system kernel creates a file called core in the current working directory that contains the entire state of the program at the time it failed. You can use these core files to do an autopsy on the dead program; for example, if you have a core file and the executable it was generated from, you can type gdb name-of-program core and use gdb to examine the state of the program when it crashed, just as if you had run it under the debugger to begin with.
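As a minimal sketch (not from the original notes), here is the kind of buggy C program that produces a core dump; the file name crash.c and program name crash below are made up for illustration:

    #include <stdio.h>

    /* Deliberately buggy program: dereferencing a null pointer
     * raises SIGSEGV, and, if core dumps are enabled, the kernel
     * writes out a core file as the process dies. */
    int
    main(void)
    {
        int *p = 0;             /* null pointer */
        printf("%d\n", *p);     /* crash happens here */
        return 0;
    }

Assuming this is saved as crash.c and compiled with gcc -g -o crash crash.c, running ./crash should leave a core file behind (if core dumps are enabled; the exact file name can vary between systems), and gdb crash core will show that the program died at the printf line.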

2. The trouble with core files

Core files can be invaluable when you have a long-running process that mysteriously fails and you need to reconstruct what events led to its demise. Most of the time, though, they just get in the way. Core files can be very big, and can quickly wipe out your quota. This can block writing other files, for example when compiling a program or running submit. If you see a file named core lying around, you should usually just delete it.

3. Preventing core dumps

The best way to prevent core dumps is to write programs without errors. The next best way is to tell the operating system that you don't want a core file generated, even if an error occurs. If your shell is bash, you can turn off core dumps with ulimit -c 0. If you use tcsh, the command is limit coredumpsize 0. You can put this command in your startup file (~/.bashrc for bash and ~/.tcshrc for tcsh) to have it executed every time you start a new shell.
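For the curious, these shell commands work by setting a resource limit, which a program can also do for itself with the POSIX setrlimit call. The sketch below is an assumption-laden illustration, not part of the original notes; it shows roughly the mechanism behind ulimit -c 0:

    #include <stdio.h>
    #include <sys/resource.h>

    /* Disable core dumps for this process by setting the maximum
     * core file size (RLIMIT_CORE) to zero bytes.  This is roughly
     * what "ulimit -c 0" arranges for programs started by the shell. */
    int
    main(void)
    {
        struct rlimit rl;

        rl.rlim_cur = 0;    /* soft limit: largest core file allowed */
        rl.rlim_max = 0;    /* hard limit */

        if (setrlimit(RLIMIT_CORE, &rl) != 0) {
            perror("setrlimit");
            return 1;
        }

        /* ... rest of the program: even if it crashes after this
         * point, no core file should be written. */
        return 0;
    }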

4. Why "core"?

In the old days, "dumping core" meant dumping out the contents of core memory, a mid-twentieth-century computer memory technology that is no longer used. Particularly ancient machines would dump core to a printer, leaving the programmer to try to debug his or her code by looking at a large stack of numbers on paper. (Computer memories were smaller then.)


CategoryProgrammingNotes

