Core Dump


What Does Core Dump Mean?

A core dump is a file that records a computer's memory at the moment a program or the system crashed. It captures the state of working memory at a specific point in time, usually just before the system crashed or the program terminated abnormally.


Aside from the entire system memory, or just the memory of the program that aborted, a core dump file may include additional information such as:

  • The processor’s state
  • The contents of the processor registers
  • Memory management information
  • The program counter and stack pointer
  • Operating system and processor information and flags

A core dump may also be known as a memory dump, a storage dump or simply a dump.

Techopedia Explains Core Dump

Programmers often examine a core dump with a debugger to diagnose the problem. A core dump may contain all of system memory or only the memory of the program that failed. There are several reasons why a computer or program can crash (a minimal example of one such crash follows the list):

  • Corrupted data
  • A severe user error
  • Virus-infected files
  • Problems accessing data files
  • An outdated operating system
  • A segmentation fault or bus error
  • A poorly ventilated or dusty computer tower
  • A system-detected fault in the software or hardware
  • Computer overheating caused by a faulty heat sink or fan
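
As a minimal sketch of one common cause, a segmentation fault: the C program below writes through a null pointer and crashes. On a Unix-like system with core files enabled in the shell (ulimit -c unlimited), running it leaves a core file that a debugger such as gdb can open. The file names and compiler flags here are illustrative assumptions, not prescribed by the article.

    /* crash.c -- a minimal sketch of a crash that produces a core dump.
     * Build with debug symbols:   gcc -g -o crash crash.c
     * Enable core files first:    ulimit -c unlimited
     * Then run ./crash; the kernel writes a core file that a
     * debugger can open:          gdb ./crash core            */
    #include <stdio.h>

    int main(void)
    {
        int *p = NULL;  /* an invalid address */
        *p = 42;        /* write through NULL: raises SIGSEGV and dumps core */
        printf("never reached\n");
        return 0;
    }

Loading the core file with gdb and typing "bt" reconstructs the call stack at the instant of the crash, which is exactly the kind of post-mortem examination described above.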

Generally, a core dump file contains the random access memory (RAM) contents of a particular process, or part of its address space, along with the values of the processor registers. Core dump files can be analyzed to determine the cause of the dump, viewed as text or printed.

Because the address space of a process on a contemporary OS may contain gaps and may share pages with other files and processes, a more intricate image format is used. In Unix-like systems, core dumps typically use the standard executable image format (a small check of the ELF case is sketched after this list):

  • Mach-O in Mac OS X
  • a.out in older versions of Unix
  • Executable and Linkable Format (ELF) in modern Linux, Solaris, Unix System V and Berkeley Software Distribution (BSD) systems
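
To make the ELF case concrete, the following sketch checks whether a given file is an ELF core image: on Linux, a core file's ELF header sets the e_type field to ET_CORE. This assumes a 64-bit ELF file and the Linux <elf.h> header; it is an illustration, not part of the original definition.

    /* iscore.c -- check whether a file is an ELF core dump.
     * Usage: ./iscore <path-to-file>                          */
    #include <elf.h>
    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <file>\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) {
            perror("fopen");
            return 1;
        }

        Elf64_Ehdr hdr;  /* assumes a 64-bit ELF file */
        if (fread(&hdr, sizeof hdr, 1, f) != 1) {
            fprintf(stderr, "file too small to be an ELF image\n");
            fclose(f);
            return 1;
        }
        fclose(f);

        /* Every ELF file starts with the magic bytes 0x7f 'E' 'L' 'F'. */
        if (memcmp(hdr.e_ident, ELFMAG, SELFMAG) != 0) {
            printf("not an ELF file\n");
            return 0;
        }

        /* e_type distinguishes executables (ET_EXEC), shared objects
         * (ET_DYN) and core dumps (ET_CORE). */
        if (hdr.e_type == ET_CORE)
            printf("ELF core dump\n");
        else
            printf("ELF, but not a core dump\n");
        return 0;
    }

This is essentially the same check the file(1) utility performs when it reports a file as an ELF core file.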

Originally, a core dump transferred the contents of memory exactly, in order to record the state of the computer. Core dumps were actual printouts of around a hundred pages or more, consisting of octal or hexadecimal numbers, which programmers studied to find the cause of a crash or an abnormally terminated program. Eventually, the introduction of debuggers eliminated the need for massive stacks of printouts.



Margaret Rouse
Technology Specialist

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is to help IT and business professionals learn to speak each other's highly specialized languages.