[Gllug] Modelines ignored on new GFX card :-(
general_email at technicalbloke.com
Thu Oct 1 21:42:45 UTC 2009
Walter Stanish wrote:
>> Modeline "1280x1024" 109.62 1280 1327 1472 1682 1024 1024 1023 1063
>> And restarted... nada.
>>
>> Any idea how I can force the graphics card to start kicking out
>> 1280x1024 before the blurry edges give me a migraine?
>>
>
> You need to add DefaultMode under another section (Screen IIRC) ...
> where the various Modes are listed (16 bit, 8 bit, etc.). That will
> determine bit-depth. Then you need to make sure the first resolution
> under that section is the resolution that you're after. Check any
> example X11 config to see this...
>
Hi Walter,
Yeah, I have a Modes line; it skips the first entry and proceeds to start
in 1024x768...
Section "Screen"
Identifier "Default Screen"
Monitor "Configured Monitor"
Device "Configured Video Device"
DefaultDepth 24
SubSection "Display"
Depth 24
Modes "1280x1024" "1024x768" "640x480"
EndSubSection
EndSection
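The Modeline itself sits in the Monitor section that "Configured Monitor" points
at, and since the card can't read the EDID I gather the driver also wants explicit
sync ranges in there. Roughly what I mean (the HorizSync/VertRefresh numbers below
are placeholders, not taken from my monitor's manual):

Section "Monitor"
    Identifier   "Configured Monitor"
    HorizSync    30.0 - 82.0    # placeholder range - check the monitor's spec sheet
    VertRefresh  56.0 - 76.0    # placeholder range - check the monitor's spec sheet
    Modeline     "1280x1024" 109.62 1280 1327 1472 1682 1024 1024 1023 1063
EndSection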
> Depending on your X11 driver it should be possible to change without
> restarting X, using a default/VESA driver will cause issues but since
> compiz is working I guess that's not the case.
>
Yeah, I'm restarting it anyway (ALT-GR+PRN_SCR+"K") just to be sure, especially
as hitting 'apply' causes my launcher (Cairo Dock) to go squiffy.
> You can also log your X11 output to get more info on why that mode
> selection is happening, unsure on your distro but this is often
> available in /var/log/X11/... or can be gathered with a manual
> su - user_to_run_x_as startx 1>>/some/log 2>>/some/log
>
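(On this Ubuntu box the server log ends up in the usual place; assuming the
standard /var/log/Xorg.0.log location, something like this pulls out just the
warnings and errors:)

grep -nE '\(WW\)|\(EE\)' /var/log/Xorg.0.log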
The trouble seems to start at line 146 of my most recent log...
(WW) NVIDIA(GPU-0): Unable to read EDID for display device CRT-1
(II) NVIDIA(0): NVIDIA GPU GeForce 9500 GT (G96) at PCI:2:0:0 (GPU-0)
(--) NVIDIA(0): Memory: 524288 kBytes
(--) NVIDIA(0): VideoBIOS: 62.94.28.00.00
(II) NVIDIA(0): Detected PCI Express Link width: 4X
(--) NVIDIA(0): Interlaced video modes are supported on this GPU
(--) NVIDIA(0): Connected display device(s) on GeForce 9500 GT at PCI:2:0:0:
(--) NVIDIA(0): CRT-1
(--) NVIDIA(0): CRT-1: 400.0 MHz maximum pixel clock
(II) NVIDIA(0): Assigned Display Device: CRT-1
(WW) NVIDIA(0): No valid modes for "1280x1024"; removing.
(II) NVIDIA(0): Validated modes:
(II) NVIDIA(0): "1024x768"
(II) NVIDIA(0): "640x480"
(II) NVIDIA(0): Virtual screen size determined to be 1024 x 768
(WW) NVIDIA(0): Unable to get display device CRT-1's EDID; cannot compute DPI
(WW) NVIDIA(0): from CRT-1's EDID.
(==) NVIDIA(0): DPI set to (75, 75); computed from built-in default
I guess I need to stop it "validating" my preferred resolution, "1280x1024",
and removing it! Any ideas how I can accomplish this, or why it won't accept
my modeline?
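Going by the nvidia-driver README (as I remember it - I haven't tried these on
this box yet, so treat it as a sketch), two Device-section options look worth a
go: ModeDebug to make the driver log why each mode is accepted or rejected, and
UseEDID to stop it consulting the EDID it can't read anyway:

Section "Device"
    Identifier  "Configured Video Device"
    Driver      "nvidia"
    Option      "ModeDebug" "true"   # log the reason each mode passes or fails validation
    Option      "UseEDID"   "false"  # don't rely on the (unreadable) EDID at all
EndSection

At the very least ModeDebug ought to say which check is throwing "1280x1024" out.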
> Also IIRC some of the bleeding edge X servers are moving away from
> the old style <X11|xorg>.conf entirely... probably not the case for
> you though.
>
Nah, Ubuntu still reads xorg.conf, which is lucky, as I've found the much-vaunted
autoconfiguration in the new X deeply flawed; don't even get me started on my
Thinkpad X30's display problems!
Cheers,
Roger - still cheerful, on balance, now that it no longer takes 4 jerky seconds
to flip my cube round! :)
--
Gllug mailing list - Gllug at gllug.org.uk
http://lists.gllug.org.uk/mailman/listinfo/gllug