I finally found a workaround that prevents this memory issue: add -noreset to Xvfb's arguments and the problem goes away. By default, when the last client disconnects from Xvfb, the server resets itself. My guess is that this reset is harmless when the video memory is backed by a hardware device, but Xvfb has a bug where it never frees the memory it allocated for the framebuffer. With -noreset the server no longer resets between clients, so that memory is never lost.
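If you want to see the reset behaviour for yourself, a rough check is to start Xvfb without -noreset and watch its resident memory while a single client repeatedly connects and disconnects (the display number, the client count, and the use of xdpyinfo as a throwaway client are just my examples):

    Xvfb :1 -screen 0 1024x768x24 &
    XVFB_PID=$!
    for i in $(seq 1 20); do
        DISPLAY=:1 xdpyinfo > /dev/null   # sole client connects then disconnects, triggering a server reset
        ps -o rss= -p "$XVFB_PID"         # Xvfb's resident memory in kilobytes after each reset
    done

If the leak behaves as described above, the reported number keeps climbing; with -noreset it should stay flat.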
Here's my Xvfb command line:
Xvfb :1 -screen 0 1024x768x24 -ac +extension GLX +render -noreset
An explanation of the other arguments:
- :1 - Runs the server on "display 1". Most X servers run on display 0 by default.
- -screen 0 1024x768x24 - Sets screen 0 (the default screen) to a resolution of 1024x768 with 24 bits per pixel.
- -ac - Disables access control because X access control is incredibly painful to get right, and since the server is usually only accessible from localhost, it's not a big deal.
- +extension GLX - Enables the OpenGL extension, allowing graphics programs that use OpenGL to work inside the virtual display.
- +render - Enables the X Rendering extension, which provides the image compositing features that most modern applications need.
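To sanity-check that the virtual display is up and that GLX and RENDER are actually advertised, something along these lines works (xdpyinfo and glxinfo are standard X utilities; glxinfo may ship in a separate package such as mesa-utils on your distribution):

    DISPLAY=:1 xdpyinfo | grep -E 'GLX|RENDER'    # list matching extensions the server advertises
    DISPLAY=:1 glxinfo | grep 'OpenGL renderer'   # confirm OpenGL rendering works inside the virtual display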
I also use a simple Xvfb start script. It's available as a gist.
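The gist itself isn't reproduced here, but a minimal start script along these lines (the display number, resolution, and PID-file path are placeholders of my own, not necessarily what the gist uses) captures the idea:

    #!/bin/sh
    # Start Xvfb in the background on the chosen display and record its PID
    # so it can be shut down cleanly later.
    DISPLAY_NUM=":1"
    PIDFILE="/tmp/xvfb.pid"

    Xvfb "$DISPLAY_NUM" -screen 0 1024x768x24 -ac +extension GLX +render -noreset &
    echo $! > "$PIDFILE"

Stopping it is then just kill $(cat /tmp/xvfb.pid).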