Well, I started rethinking my options for this old PC I've been working on part-time for months now. I took a closer look at GALPon (based in Pontevedra, Galicia, in the north-west corner of Spain, which I personally know to be a rather pretty part of the planet) and discovered their MiniNo Ártabros 2.1 Full distro.
I saw some forum topics about installing NVIDIA drivers, so I downloaded it and discovered that it uses the linux-image-3.2.0-4-686-pae kernel. That caught my attention because my attempt to install the nvidia-glx-legacy-96xx package on wattOS had indicated that it was looking for a 3.2.0-4 version of the kernel.
So, I proceeded to install the nvidia-detect, nvidia-glx-legacy-96xx and nvidia-xconfig packages.
Everything appeared to install successfully, so I executed nvidia-xconfig and rebooted the PC.
You can see the outcome of that in the topic Problema con la Instalación del Controlador NVIDIA ("Problem with the NVIDIA Driver Installation", http://minino.galpon.org/es/node/439) that I posted on the MiniNo forum.
After re-reading the terminal output from the installation of the nvidia-glx-legacy-96xx package more closely, I did some research on the message "Module build for the currently running kernel was skipped since the kernel source for this kernel does not seem to be installed." It indicated that the linux-headers packages needed to be installed, which I had thought would happen automatically.
Why are these linux-headers-* packages not flagged as a dependency for packages such as nvidia-kernel-legacy-96xx-dkms? Is there some logical reason for this?
It was then that I recalled the instructions offered by ausmuso and what I had seen on the Debian Wiki (https://wiki.debian.org/NvidiaGraphicsDrivers).
However, for the purposes of documenting this for inexperienced computer users, I wanted to achieve everything via Synaptic.
After some searching about the removal of NVIDIA driver packages, I used the following command in a terminal:
aptitude purge nvidia-kernel-legacy-96xx-dkms nvidia-glx-legacy-96xx
Using Synaptic, I marked the package "linux-headers-686-pae" for installation. This also marked the required dependencies (linux-headers-3.2.0-4-686-pae, linux-kbuild-3.2, linux-headers-3.2.0-4-common, gcc-4.6, cpp-4.6 and gcc-4.6-base) for installation. Then I clicked the Apply button to activate the installation.
Next, I marked the package nvidia-glx-legacy-96xx and activated its installation.
I executed nvidia-xconfig from the command line again and rebooted.
Success
However, I noticed the screen flickering (it was also happening with the nouveau driver, but I hadn't been paying attention), which led to more analysis and testing to resolve what appears to be a fairly common problem, at least across all the combinations of monitors, graphics cards and computers I have at my disposal. With the default wallpaper enabled there is a lot of flickering on the screen. I can also see this via the following page in my browser.
minino-wallpaper-2_1280x1024
Try downloading it and using it as a wallpaper, and let me know whether you see the same flickering effect and perhaps explain why it is happening with this particular image.
At first, I didn't notice that the flickering was associated with the wallpaper. I looked at the Monitor Settings applet which was displaying a Refresh Rate of 50.0 and the only other options were auto and 51.0. So, that didn't seem correct.
Also, the nvidia-settings GUI was reporting the Monitor as being a CRT type which was strange given that it is an LCD monitor.
Along the way I discovered various command line tools such as cvt, gtf, xresprobe, ddcprobe, i2c-tools, read-edid and parse-edid.
On the target PC, xresprobe reported a segmentation fault, with id, res and freq blank and disptype as crt, and ddcprobe reported a segmentation fault and nothing more.
With the nvidia-settings GUI, it appeared I could extract the EDID data, but opening the file with Leafpad did not produce anything I could read (the EDID is binary data, not text).
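Since the EDID blob is binary, a text editor shows gibberish. For anyone curious, the monitor's supported sync ranges live in a "display range limits" descriptor inside the 128-byte EDID block, and a few lines of Python can pull them out. This is only a sketch; the sample bytes below are fabricated for illustration, not taken from my monitor:

```python
# Sketch: extract HorizSync/VertRefresh limits from a 128-byte EDID block.
# The four 18-byte descriptors start at offsets 54, 72, 90 and 108;
# a "display range limits" descriptor is tagged 0xFD.
def range_limits(edid):
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # Display descriptors start with 00 00 00 <tag>; 0xFD = range limits.
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return {
                "VertRefresh": (d[5], d[6]),  # min/max vertical rate, Hz
                "HorizSync":   (d[7], d[8]),  # min/max horizontal rate, kHz
            }
    return None

# Synthetic example block: all zeros except one fabricated descriptor
# claiming 56-75 Hz vertical and 30-83 kHz horizontal.
edid = bytearray(128)
edid[54:63] = b"\x00\x00\x00\xfd\x00" + bytes([56, 75, 30, 83])
print(range_limits(bytes(edid)))
```

With real hardware you would feed this the EDID file saved via nvidia-settings' Acquire EDID button, or the output of the read-edid tools.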
So, with more searching for solutions, and using my laptop (Manjaro 0.8.10 XFCE with Intel HD4000 graphics and VGA and HDMI ports) and my old desktop (running MiniNo Ártabros from the Live USB, with a graphics card using a GeForce 6200 GPU and VGA and DVI ports), I performed various combinations of connection tests, which allowed me to properly determine the values for the HorizSync, VertRefresh and Modeline parameters by reading the Xorg.0.log files. It was during this phase of analysis and testing that I discovered read-edid and, hence, parse-edid, which enabled me to read the EDID data I had previously acquired and confirm what I had seen by using my old desktop PC.
So, nvidia-settings could extract the correct data for the monitor but nvidia-xconfig couldn't. nvidia-xconfig created and configured the xorg.conf file with values for the HorizSync and VertRefresh parameters that did not match those reported by nouveau, ddcprobe (on my old PC) and nvidia-settings (via the Acquire EDID button that appears for the monitor entry under the GPU entry in the left pane of the interface) on the target PC.
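For anyone who wants to cross-check these values themselves, the HorizSync and VertRefresh implied by a Modeline follow from simple arithmetic: the horizontal sync rate is the pixel clock divided by the horizontal total, and the vertical refresh is the horizontal rate divided by the vertical total. A quick sketch, using the standard 1920x1080 at 60Hz timing as an example (not necessarily the exact Modeline your monitor will report):

```python
# Sketch: derive HorizSync (kHz) and VertRefresh (Hz) from a Modeline.
# Modeline fields: name, pixel clock (MHz), then horizontal timings
# (hdisp hsyncstart hsyncend htotal) and vertical timings
# (vdisp vsyncstart vsyncend vtotal).
def modeline_rates(modeline):
    parts = modeline.split()
    pclk_mhz = float(parts[1])
    htotal = int(parts[5])
    vtotal = int(parts[9])
    hsync_khz = pclk_mhz * 1000.0 / htotal
    vrefresh_hz = hsync_khz * 1000.0 / vtotal
    return round(hsync_khz, 2), round(vrefresh_hz, 2)

# Standard CEA 1080p60 timing as an example.
mode = '"1920x1080" 148.50 1920 2008 2052 2200 1080 1084 1089 1125'
print(modeline_rates(mode))  # (67.5, 60.0)
```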
ddcprobe appears not to fully function for non-VGA connections. When I tried it on my old PC using the DVI connection, it failed when trying to read the EDID data. Using the VGA connection, it reported the same values I had seen in the Xorg.0.log file for the nouveau driver.
With the nouveau driver, Monitor Settings reports 60.0 as the refresh rate.
In the nvidia-settings GUI, the refresh rate is also reported as 60.0.
Also, I discovered the following this morning:
http://forums.linuxmint.com/viewtopic.php?f=59&t=91923&p=528528#p528059
https://bbs.archlinux.org/viewtopic.php?pid=853138#p853138
So, the Monitor Settings (lxrandr) applet is not able to correctly report the refresh rate when the NVIDIA driver is in use, because the NVIDIA driver does not report the real refresh rates to xrandr (apparently it reports made-up values so that each mode gets a unique identifier).
If I run xrandr from the command line, it also reports 50.0 and 51.0 for the 1920x1080 resolution.
Using nvidia-settings -nt -q RefreshRate reported a Refresh Rate of 60.
The NVIDIA driver appears to function properly whether the HorizSync and VertRefresh parameters or a Modeline parameter is used. If no Modeline parameter is used, the nvidia-settings GUI reports the Resolution as auto and defaults to the native resolution. I'm going to use a Modeline parameter for the native resolution of the LCD monitor because instinct says it's possibly more efficient.
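For anyone following along, the Monitor section of xorg.conf ends up looking something like this. Treat it as a sketch: the Modeline below is the standard 1920x1080 at 60Hz timing, not necessarily what your own EDID will report, and the Identifier is just a typical nvidia-xconfig default.

```
Section "Monitor"
    Identifier  "Monitor0"
    # Standard CEA 1080p60 timing; replace with the Modeline your
    # monitor's EDID actually reports.
    Modeline "1920x1080" 148.50 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync
EndSection
```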
So, after all this, I can report that with the NVIDIA driver the graphics are properly rendered without having to scroll panes or move the mouse over a checkbox to force a redraw, as I had to with the nouveau driver.
I also tried configuring a Live USB with persistence with the NVIDIA driver but, yes, it appears that Debian-based Live ISOs use their own Xorg configuration (in /etc/X11/xorg.conf.d), which appears to be created when the Live environment is booted. So, hopefully a remaster will solve that issue. Can anyone advise otherwise, or as to which file(s) would need to be modified to permit this?
Even though the monitor I have been using with the target PC is mine, and not the one that will be used by the people concerned, it was important to go through this process in case they want to upgrade from the old 15" CRT sometime in the future when they can afford it.
For the CRT monitor, I am guessing that I will be creating multiple Modeline parameters for the Monitor section for the xorg.conf file.
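Something along these lines is what I have in mind for the CRT. These are standard VESA 60Hz timings for a couple of common CRT resolutions; the exact list would depend on the monitor, so consider it a sketch:

```
Section "Monitor"
    Identifier "CRT0"
    # Standard VESA timings; a CRT typically needs one Modeline per
    # resolution/refresh combination it should offer.
    Modeline "1024x768"  65.00 1024 1048 1184 1344  768 771 777 806 -hsync -vsync
    Modeline "800x600"   40.00  800  840  968 1056  600 601 605 628 +hsync +vsync
EndSection
```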
As xrandr does not report correct information with the NVIDIA driver, I will "hide" those applications from the Menu.
It will be interesting to see what happens now when I connect the CRT monitor. Fingers crossed
So, MiniNo Ártabros 2.1 is as fast as, or a little faster than, wattOS and somewhat leaner in memory usage. It gets to the desktop with less than 60MB used.
It comes with applets for the management of Language and Keyboard layout.
There is no Update Manager in the default installation, but it's available in the repositories. Does anyone know how Update Manager categorises packages as a Security Update? I can't see anything in Synaptic that allows such filtering, although I suspect it is done by package origin (packages coming from the security.debian.org archive), in which case Synaptic's Origin view might get close to it.
Midori crashed the first time I used it, so I'll probably replace it with QupZilla if I can't install Pale Moon for Linux.
One disappointment, however, is that the translations are not complete with this distro either, which means I still have some work to do in that regard. You would think a distro produced in the Iberian Peninsula could at least get that right - come on!
Most of them appear to be fully translated but there are some with a few missing translations and there is one that is missing most of the translations.
Installation is a bit different, and I will reinstall for documentation purposes once I've completed the translations as far as possible, in preparation for the second post-installation configuration. Unless there is something I missed configuring when I first used the Live environment, the installation interface is only in English. I will communicate with the people at GALPon about this.
One of the installation differences is that it doesn't include the configuration of a User. It uses the same User account as the Live environment. However, there is an applet for the management of Users and Groups including an option to specify that a Password is required when initiating a Session.
I had looked at antiX 14R Alpha 2. It's fast and light on resources too, but not as user-friendly, I feel. However, there are some applications, like CENI for example, that I am interested in exploring.
Well, I think that covers it for now. Hopefully someone else will benefit from this saga.
Thanks for the input guys!
Hasta pronto, Michael.