I have chosen an nVidia card for my setup due to the available drivers for Linux. At least two drivers exist: the proprietary drivers available from nVidia and the open-source nv driver, which is available through the standard apt-get packaging system. While the proprietary driver supports the largest set of features, the nv driver should actually suffice as long as you do not plan to use XvMC for MPEG-2 hardware acceleration. Neither driver supports MPEG-4/AVC/H.264 acceleration under Linux (known as PureVideo under Windows).
As I have experimented with XvMC, I am using the proprietary drivers.

By the way - the LCD display on my HD160XT revision 3 is a 7" LG RV101-WV1 capable of 1024x768.

NVidia drivers

Under Debian Etch, the proprietary NVidia drivers were no problem - download the installer from the nVidia site, make it executable and run it. It will even offer to modify the Xorg setup. If it fails, try running it again.
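For example (the filename below is just a placeholder - use whatever version you downloaded, and stop X first, e.g. by stopping your display manager):

/etc/init.d/gdm stop                            # or kdm/xdm, depending on your setup
chmod +x NVIDIA-Linux-x86-<version>-pkg1.run
sh ./NVIDIA-Linux-x86-<version>-pkg1.run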

But under Debian Lenny and the 2.6.21 kernel, paravirtualization was enabled in the kernel by default, causing the driver to break. You can see a description of the problem here and here. The solution is a patched version of the nVidia installer.

Going for the open-source version (nv) is no solution to this, as it needs a similar patch. I have not tried to compile a kernel without paravirtualization, but this is supposed to work with the original NVidia drivers, giving you access to the newest ones as soon as nVidia releases them.

Note: Using kernel version 2.6.24 and/or NVidia driver version 169.09 or higher seems to have solved this problem.

If you use the proprietary drivers, you should remove the open-source drivers, including the GLX driver. XvMC can be enabled by placing the path to the proprietary library /usr/lib/libXvMCNVIDIA_dynamic.so.1 (or similar) in the file /etc/X11/XvMCConfig.
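The file simply contains that single library path, so it can be created with e.g.:

echo "/usr/lib/libXvMCNVIDIA_dynamic.so.1" > /etc/X11/XvMCConfig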
Note that things break easily when the system is updated while using the proprietary drivers - this includes MythTV. It shows up as errors such as:
mythfrontend: Fatal IO error: client killed
Just re-install the driver, and things will work again.

Dual screens

As the Zalman HD160XT comes with a touch screen, and as you need a big screen for watching movies, you need a dual-head X setup. This can be done in many ways: nVidia TwinView, X Xinerama, X dual-head and two separate X sessions.
TwinView and Xinerama will not let your touchscreen cover just the small display, but will expand its use to the entire virtual desktop, just like your mouse. Two X sessions will not let you use the same keyboard and mouse for both screens. This leaves us with a dual-head setup.

Dual-head has the advantage of letting us use the touchscreen on one screen only (although it HAS to be the first screen), whereas the keyboard and mouse may be shared between both screens. You only log in once and thus have only one active session. The two screens can have different resolutions and color depths. But you cannot drag files and windows from one screen to another, and the hardware acceleration will only work on one screen (at a time?).

Dual-screen is set up using /etc/X11/xorg.conf. This is an example:
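(The following is only a minimal sketch of such a dual-head xorg.conf. The identifiers, the BusID, the DisplaySize and the assumption that the touchscreen sits on the analog (CRT) output while the HDTV sits on the digital DVI (DFP) output are examples - adapt them to your own wiring and card address.)

Section "ServerLayout"
    Identifier   "DualHead"
    Screen  0    "Touchscreen"  0 0
    Screen  1    "HDTV"         RightOf "Touchscreen"
EndSection

Section "Device"
    Identifier   "nvidia-touch"
    Driver       "nvidia"
    BusID        "PCI:1:0:0"              # same physical card for both heads - find the address with lspci
    Screen       0
    Option       "ConnectedMonitor" "CRT" # touchscreen on the analog output (example)
EndSection

Section "Device"
    Identifier   "nvidia-hdtv"
    Driver       "nvidia"
    BusID        "PCI:1:0:0"
    Screen       1
    Option       "ConnectedMonitor" "DFP" # HDTV on the digital DVI output
    Option       "RenderAccel" "True"
    Option       "UseEvents"   "True"
EndSection

Section "Monitor"
    Identifier   "TouchLCD"
    Option       "UseEdidDpi" "false"
    DisplaySize  154 87                   # roughly a 7" 16:9 panel in mm - adjust to your display
EndSection

Section "Monitor"
    Identifier   "HDTVMonitor"
EndSection

Section "Screen"
    Identifier   "Touchscreen"
    Device       "nvidia-touch"
    Monitor      "TouchLCD"
    DefaultDepth 24
    SubSection "Display"
        Depth    24
        Modes    "1024x768"
    EndSubSection
EndSection

Section "Screen"
    Identifier   "HDTV"
    Device       "nvidia-hdtv"
    Monitor      "HDTVMonitor"
    DefaultDepth 24
    SubSection "Display"
        Depth    24
        Modes    "1920x1080"
    EndSubSection
EndSection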
Note that with a dual-head setup on a single graphics card, the Screen 0 and Screen 1 entries are necessary. The two device entries share the same card address and are differentiated by their order in the file (trial and error) or by their connection (DFP for digital DVI and CRT for analog VGA). You can find the board address using e.g. lspci or by looking in the X log.

The Zalman display is a 7" 1024x768 screen. But the format is 16:9, even as the pixel count is 4:3. The Option "UseEdidDpi" "false" and the DisplaySize tells X that this is the case, but it still relies on the programs using this information as they create their screen layout. This is generally not the case, but as the screen is secondary and most for interfacing, it is not crucial. I did however had to manually scale a round clock to get it round..

The setup above relies on the HDTV being connected on the DVI interface using the digital channel (DVI or HDMI, not VGA, at the other end). Only the bare necessities are included - use this to copy parts into your own xorg.conf if you like.

The Option "RenderAccel" "True" is used to enable the hardware acceleration (probably true as default, but anyway..), whereas the Option "UseEvents" "True" is crucial for the performance for GeForce series 6xxx and up!

Please see here for the touchscreen setup.

Getting the resolution you want

The built-in touchscreen is not a problem at all under X. It plays nicely with the EDID interface of the video card. But some HDTVs do not - including my Acer AT3705MGW. The problem is that they are generally rather pessimistic about what they are capable of displaying. They send this information as the EDID block to the video card, and it contains information on maximum sync rates, link bandwidth, timings and resolutions. To test if your HDTV just works, enter your expected resolution in the list under Section "Screen" and see if you are able to change the X resolution to it. If it is not in the list of possible resolutions, or if it does not work, you need to tweak it.
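For example, to test whether the TV accepts 1920x1080, add it to the Modes list for the HDTV screen (the resolutions here are just examples):

SubSection "Display"
    Depth    24
    Modes    "1920x1080" "1280x720" "1024x768"
EndSubSection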

The driver collects a pool of possible video modes from a built-in standard list, a VESA list, the received EDID data and any manually entered modes using the ModeLine option. The driver then validates each mode against the limitations of the video card and of the attached display (using the EDID data or any data overriding it in xorg.conf). The modes that remain are accepted after they are tweaked a little to further fit the timing recommendations from the EDID data. Tweaking the modes yourself consists of disabling the tests and - if this is not enough - entering custom modelines with the desired resolution and timing. For further info, see here.

To disable the checks, enter the following into the Section "Screen":
Option "ModeValidation" "DFP-0: AllowNon60HzDFPModes,NoMaxPClkCheck,NoEdidMaxPClkCheck,NoHorizSyncCheck,NoVertRefreshCheck,NoDFPNativeResolutionCheck,AllowInterlacedModes"

It does have to be on ONE line. Not all of it may be necessary. But using a wrong timing on an LCD display will not ruin the display - you might just not be able to see the screen, or at least not as you expected. But DO NOT DO THIS ON A CRT SCREEN - it might be permanently damaged!

To avoid the driver tweaking our modes, we further need the line:
Option "ExactModeTimingsDVI" "true"

It might also be necessary to disable the nVidia driver's ability to scale to the supposed native display resolution. The option NoDFPNativeResolutionCheck above only disables the validation check. But at least in my setup, the driver by default scaled the image to the apparent native display resolution - even though the virtual screen and the mode were in a different resolution. The solution was to use nvidia-settings and, under the relevant display, uncheck the option "ForceGPUScaling". The screen should change immediately if the scaling was active for the current mode. But this setting is not saved between sessions! So what you do is use the program to set all settings as you desire (verifying that the scaling setting works) and then exit the program. In a terminal under X, run it again with nvidia-settings -r to save the settings in the file ~/.nvidia-settings-rc. Then add nvidia-settings -l to the programs to run on X startup. The setting in the file is "1/GPUScaling[DFP-0]=65537" for screen 1, DFP-0.
Note that the suggested addition of the program to ~/.xinitrc in the link did not work in my case..
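In short, the sequence for making the setting stick is:

nvidia-settings          # set everything up in the GUI, then exit
nvidia-settings -r       # write the current settings to ~/.nvidia-settings-rc
nvidia-settings -l       # run this at every X startup to load them again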

If this is still not enough, you will need a custom modeline. This may be found by googling for "< your HDTV model > modeline" or, if you are lucky, here.

You test them by entering them in the Monitor section and then referring to them in the Screen section. Note that, in order not to clutter things up with all the various modes, it might be a good idea to include some of the following options in the list of ModeValidation parameters to suppress the automatically added modes: NoEdidModes,NoVesaModes,NoXServerModes,NoPredefinedModes. Then you are (more) sure that the modes to choose from are actually your own.
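As an illustration of the structure only, a custom modeline in the Monitor section and the reference to it from the Screen section could look like this (the timing numbers are the standard CEA-861 1080p@60 values, not tuned for any particular TV - verify them against your own set):

Section "Monitor"
    Identifier  "HDTVMonitor"
    # Standard 1080p@60 timing - replace with the modeline found for your TV
    ModeLine "1920x1080" 148.50 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync
EndSection

Section "Screen"
    Identifier  "HDTV"
    Monitor     "HDTVMonitor"
    SubSection "Display"
        Modes   "1920x1080"
    EndSubSection
EndSection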
Below are my definitions:



Finally, if the HDTV has VGA, Component, HDMI and DVI inputs, there might be a difference in which modes are accepted by the TV depending on the input port. Many TVs interpret some inputs as "Video" as opposed to "PC", and therefore deliberately introduce a significant overscan to get rid of video artefacts at the border of the video image (as may be seen on analogue broadcasts and VCRs - I don't know why this is relevant for a 1080p 16:9 HDTV HDMI port?!). So you might want to try all possibilities if you encounter problems.
In my case with the Acer AT3705MGW, this caused me to use the HDMI port, but to run it in the non-standard 1912x1080p or 1928x1080p modes in order to let the TV accept it as a non-video source. But the 1912 mode caused the TV to double some pixel columns with noticeable visual artefacts, whereas the 1928 mode caused some pixel columns to blend without noticeable visual effects - so I strongly prefer the latter. Some Windows users report being able to use 1921 using PowerStrip, and this is of course even better, but X will not allow anything not divisible by 8.

Of course, be sure the TV can actually accept 1080i and 1080p before wasting too much time trying ;-)

Disabling the screensaver

If you are sitting and enjoying a movie, it is rather annoying if the screen blanks or a screensaver is started after 10 minutes or an hour. To disable this, add the following to xorg.conf:

Section "ServerFlags"
    Option "Xinerama" "0"
    Option "BlankTime" "0"
    Option "StandbyTime" "0"
    Option "SuspendTime" "0"
    Option "OffTime" "0"
EndSection
And on X startup, run the following:
/usr/bin/xset s off
/usr/bin/xset -dpms


Note that all these things to be run at X startup can be put in a single script file, which is then inserted into the X start-up menu.
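For example, something like this (the name and location are just a suggestion) can be saved, made executable and called from your X start-up:

#!/bin/sh
# Example X start-up script for the HTPC
/usr/bin/nvidia-settings -l    # load the saved settings from ~/.nvidia-settings-rc
/usr/bin/xset s off            # disable the X screensaver
/usr/bin/xset -dpms            # disable DPMS power management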