Okay, I've scoured www.dejanews.com and couldn't find anyone else
having a problem like this. I also searched AltaVista and all the
relevant Matrox/X pages (alloy.net, suse.de, xfree86.org...).
Anyway, here's the scoop:
XFCom_Matrox Version 2.2/ X Window System
(protocol Version 11, revision 0, vendor release 6300)
Release Date: 1 October 1998
If the server is older than 6-12 months, or if your card is newer
than the above date, look for a newer version before reporting
problems.
Operating System: Linux 2.1.51 i686 [ELF]
Configured drivers:
SVGA: server for SVGA graphics adaptors (Patchlevel 0):
mga2064w, mga1064sg, mga2164w, mga2164w AGP, mgag200, mgag100
(using VT number 7)
XF86Config: /etc/XF86Config
(**) stands for supplied, (--) stands for probed/default values
(**) XKB: keymap: "xfree86(us)" (overrides other XKB settings)
(**) Mouse: type: IMPS/2, device: /dev/mouse, buttons: 3
(**) SVGA: Graphics device ID: "Matrox Millennium G200 16MB"
(**) SVGA: Monitor ID: "My Monitor"
(--) SVGA: Mode "1600x1200" needs hsync freq of 87.50 kHz. Deleted.
(--) SVGA: Mode "1152x864" needs hsync freq of 89.62 kHz. Deleted.
(--) SVGA: Mode "1280x1024" needs hsync freq of 91.15 kHz. Deleted.
(--) SVGA: Mode "1600x1200" needs hsync freq of 93.75 kHz. Deleted.
(--) SVGA: Mode "1600x1200" needs hsync freq of 105.77 kHz. Deleted.
(--) SVGA: Mode "1280x1024" needs hsync freq of 107.16 kHz. Deleted.
(--) SVGA: Mode "1800X1440" needs hsync freq of 96.15 kHz. Deleted.
(--) SVGA: Mode "1800X1440" needs hsync freq of 104.52 kHz. Deleted.
(**) FontPath set to
"/usr/X11R6/lib/X11/fonts/misc/,/usr/X11R6/lib/X11/fonts/75dpi/:unscaled,/u
(--) SVGA: Linear framebuffer at 0xF0000000
(--) SVGA: MMIO registers at 0xD0200000
(--) SVGA: Video BIOS info block at 0x000c7a60
(--) SVGA: Found and verified enhanced Video BIOS info block
(!!) SVGA: reset VideoRAM to 2 MB for safety!
(--) SVGA: detected an SGRAM card
(--) SVGA: chipset: mgag200
(--) SVGA: videoram: 16384k
(**) SVGA: Option "dac_8_bit"
(**) SVGA: Using 16 bpp, Depth 16, Color weight: 565
(--) SVGA: Maximum allowed dot-clock: 250.000 MHz
(**) SVGA: Mode "640x480": mode clock = 45.800
(**) SVGA: Mode "800x600": mode clock = 69.650
(**) SVGA: Mode "1024x768": mode clock = 115.500
(**) SVGA: Mode "1280x1024": mode clock = 135.000
(--) SVGA: Virtual resolution set to 1280x1024
(--) SVGA: SpeedUp code selection modified because virtualX != 1024
(--) SVGA: Read OPTION 0x4007cd21
(--) SVGA: Using XAA (XFree86 Acceleration Architecture)
(--) SVGA: XAA: Solid filled rectangles
(--) SVGA: XAA: Screen-to-screen copy
(--) SVGA: XAA: 8x8 color expand pattern fill
(--) SVGA: XAA: CPU to screen color expansion (TE/NonTE imagetext,
TE/NonTE polytext)
(--) SVGA: XAA: Using 10 128x128 areas for pixmap caching
(--) SVGA: XAA: Caching tiles and stipples
(--) SVGA: XAA: General lines and segments
(--) SVGA: XAA: Dashed lines and segments
Fatal server error:
Caught signal 11. Server aborting

When reporting a problem related to a server crash, please send
the full server output, not just the last messages.

_X11TransSocketUNIXConnect: Can't connect: errno = 111
giving up.
xinit: Connection refused (errno 111): unable to connect to X server
xinit: No such process (errno 3): Server error.
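For what it's worth, the two errno values in those xinit lines are just
the standard Linux error codes; the refused connection is a symptom of
the server dying, not a separate problem. A quick sketch to decode them
(Python purely for illustration; the numeric values are Linux-specific):

```python
import errno
import os

# Decode the errno values xinit printed. These numbers are
# Linux-specific; other Unices may assign different values.
for code in (111, 3):
    name = errno.errorcode[code]          # symbolic name, e.g. ECONNREFUSED
    print(code, name, os.strerror(code))  # human-readable message
```

On a Linux box this prints `111 ECONNREFUSED Connection refused` and
`3 ESRCH No such process` -- in other words, xinit simply could not
reach a server that had already segfaulted.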
Is this related to the signal 11 problem that gcc has been known to
report? I've tried setting Option "no_accel" in XF86Config and it
makes no difference. I've got symlinks in the right places and I
followed the instructions for XSuSE to a tee!

I guess I should also say I'm running a brand new Red Hat 5.2 release
with a 2.0.36 kernel (??? that's what it says!). It's a Pentium II
based machine w/ 64MB of RAM. It is an MGA G200 with 16MB SGRAM, so
that part is working okay. I just can't figure out why it doesn't
go on from there.
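As a sanity check on those "needs hsync freq ... Deleted" lines: the
server derives a mode's horizontal sync rate as dot clock divided by
the horizontal total, and drops any mode outside the monitor's
HorizSync range. A rough sketch of that arithmetic (the timing numbers
below are the standard VESA 1600x1200@75 modeline, assumed here for
illustration, not copied from my XF86Config):

```python
# hsync (kHz)   = dot clock (kHz) / horizontal total pixels
# refresh (Hz)  = hsync (Hz) / vertical total lines
def mode_rates(dot_clock_mhz, htotal, vtotal):
    hsync_khz = dot_clock_mhz * 1000.0 / htotal
    vrefresh_hz = hsync_khz * 1000.0 / vtotal
    return hsync_khz, vrefresh_hz

# Assumed standard VESA 1600x1200@75 timings: 202.5 MHz dot clock,
# 2160 total pixels per scanline, 1250 total lines per frame.
hsync, vrefresh = mode_rates(202.5, 2160, 1250)
print(f"hsync = {hsync:.2f} kHz, refresh = {vrefresh:.2f} Hz")
```

That works out to 93.75 kHz -- exactly the figure the server rejected
for one of the 1600x1200 modes above -- so the mode pruning itself
looks consistent with the monitor limits; the crash comes later.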