Niels Horn's Blog
Random thoughts, tips & tricks about Slackware-Linux, Lego and Star Wars

NVIDIA "Failed to allocate primary buffer" error
October 12th, 2010 by Niels Horn in Slackware, configuration, kernel
Yesterday I installed Slackware on a computer that came with a Debian-derivative installed. Nothing against Debian, but I'm more used to Slackware, and the hard drive had enough space left…
This box is an all "no-brand / low-price" system, but has very reasonable specs: a quad-core AMD processor, 4GB of memory, a 400GB hard drive, on-board NVIDIA 6150SE nForce 430 graphics, and - best of all - it does *not* come with that paid operating system from Redmond.
The monitor that came with it is a 20″ Philips model, with 1600×900 maximum resolution.
Installation of Slackware 13.1 went smoothly, as always, except for one thing: the maximum resolution with the NVIDIA driver (built with the script from SlackBuilds.org) was 1280×1024. This annoyed me and intrigued me at the same time, as the Debian spin-off worked flawlessly at 1600×900.
In /var/log/Xorg.0.log (the first place to look when X does not work the way you expect) I noticed the following strange error message:
Failed to allocate primary buffer: out of memory
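For anyone chasing a similar problem: X.Org tags errors with "(EE)" and warnings with "(WW)", so a quick grep narrows the log down to the interesting lines. A small sketch (the helper name `xerrs` is just something I made up; the log path is the X.Org default):

```shell
# Print only the error/warning lines from an X.Org log.
# Usage: xerrs /var/log/Xorg.0.log
xerrs() {
  grep -E '\((EE|WW)\)' "$1"
}
```

Running `xerrs /var/log/Xorg.0.log` on this box was how the "Failed to allocate primary buffer" line stood out.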
The /etc/X11/xorg.conf created by the nvidia-settings program was almost the same on both Linux flavors, so that was not the problem.
Then I started to check the differences between the two installations…
I noticed that the Debian spin-off used an older kernel and an older version of the NVIDIA driver, but I never had problems with the newer versions on other systems.
Next step: Google.
I found several reports of the same problem, and the "solution" (more like a work-around) was adding the "nopat" parameter to the kernel command line.
I checked the configuration of the kernel used by Slackware 13.1 (/boot/config) against the configuration used to build the kernel that came with the Debian spin-off, and they were different indeed:
In Slackware it is:
CONFIG_X86_PAT=y
While the other has:
# CONFIG_X86_PAT is not set
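If you want to do the same comparison, the setting is a one-line grep away in the kernel's config file. A sketch (the helper name `pat_setting` is mine; on Slackware the file is /boot/config, on most Debian-style distros it is /boot/config-$(uname -r)):

```shell
# Show the CONFIG_X86_PAT line from a kernel config file.
# Usage: pat_setting /boot/config
pat_setting() {
  grep 'CONFIG_X86_PAT' "$1"
}
```

It prints either "CONFIG_X86_PAT=y" or "# CONFIG_X86_PAT is not set", which is exactly the difference between the two kernels here.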
I added the "nopat" parameter in /etc/lilo.conf:

... append=" vt.default_utf8=0 nopat" ...
After running lilo and rebooting, the resolution went to 1600×900 without problems!
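Remember that editing lilo.conf alone does nothing: lilo has to rewrite the boot map, and the machine has to reboot. Afterwards you can confirm the kernel actually saw the parameter. A sketch of the check (the helper name `has_param` is made up for this post):

```shell
# After editing /etc/lilo.conf, run /sbin/lilo to rewrite the boot map,
# then reboot. Once back up, confirm the kernel got the new parameter.

# Whole-word match, so e.g. "pat" does not match inside "nopat".
has_param() {
  grep -qw "$1" "$2"
}

# Typical checks after the reboot:
#   has_param nopat /proc/cmdline && echo "nopat active"
#   xrandr | grep '\*'    # the active mode, hopefully 1600x900
```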
Now there's a certain irony in having to set "nopat" to run Slackware, but I hope Pat forgives me until I compile my own kernel for this box.