Wednesday, July 11, 2012

Xvfb Memory Leak Workaround

I've been using Xvfb, the X virtual frame buffer, for a few projects. Xvfb allows you to run applications that require a display, without actually having a graphics card or a screen. I started noticing that the resident memory used by the Xvfb process would continually go up. I could literally run processes that connect to the display in a loop and watch the memory grow. Clearly, there was some kind of memory leak going on.
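
To see it for yourself, a loop along these lines is enough (the display number here is arbitrary, and xdpyinfo is just a convenient client that connects, prints some info, and disconnects; any X program will do):

Xvfb :99 -screen 0 1024x768x24 -ac &
XVFB_PID=$!
for i in $(seq 1 100); do
    DISPLAY=:99 xdpyinfo > /dev/null
    ps -o rss= -p $XVFB_PID
done

The resident set size reported by ps should creep upward on every iteration.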

I finally found a workaround that prevents this memory issue. If you add -noreset to Xvfb's arguments, the leak vanishes. By default, the server resets itself when its last client disconnects. My guess is that this works fine when video memory is backed by a hardware device, but Xvfb has a bug where it doesn't free the memory it allocated for the frame buffer. With -noreset, the server never resets, so that memory is never lost.

Here's my Xvfb command line:
Xvfb :1 -screen 0 1024x768x24 -ac +extension GLX +render -noreset

An explanation of the other arguments:

  • :1 - Runs the server on display 1. The first X server on a machine normally takes display 0, so using :1 avoids colliding with a real display.
  • -screen 0 1024x768x24 - Sets screen 0 (the default screen) to a resolution of 1024x768 with 24 bits per pixel.
  • -ac - Disables access control. X access control is incredibly painful to get right, and since the server is usually only reachable from localhost, it's not a big deal.
  • +extension GLX - Enables the OpenGL extension, allowing graphics programs that use OpenGL to work inside the virtual display.
  • +render - Enables the X Rendering extension, which provides the image compositing features that most applications will need.
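
With the server running with those arguments, clients just need DISPLAY pointed at display 1. For example (xterm and glxinfo are only stand-ins for whatever you actually want to run headless; glxinfo usually ships in a package like mesa-utils):

DISPLAY=:1 xterm &
DISPLAY=:1 glxinfo | grep "OpenGL renderer"

The second command is a quick sanity check that the GLX extension actually came up inside the virtual display.
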
I also use a simple Xvfb start script. It's available as a gist.
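
For reference, a minimal sketch of that kind of start script might look like the following. The display number, log path, and wrapper-style invocation are illustrative choices here, not necessarily what the gist does:

#!/bin/sh
# Start Xvfb on its own display, run the given command against it, then clean up.
XVFB_DISPLAY=":1"

Xvfb "$XVFB_DISPLAY" -screen 0 1024x768x24 -ac +extension GLX +render -noreset \
    > /tmp/xvfb.log 2>&1 &
XVFB_PID=$!

# Make sure the virtual server dies with the script.
trap 'kill "$XVFB_PID"' EXIT

export DISPLAY="$XVFB_DISPLAY"

# Give the server a moment to create the display before any client connects.
sleep 1

# Run whatever was passed on the command line inside the virtual display.
"$@"

Invoked as ./xvfb-start.sh some-program, it runs some-program against the virtual display and shuts Xvfb down when the program exits.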