Hi all:
I have found the problem, but not solved yet :-(
This is the init code in the svgalib.c video driver:
----------------
static int init(char *options)
{
    if (detect()) {
        char *frame = NULL;
        vga_modeinfo *mdinfo;
        GrVideoMode mode, *modep = &modes[1];
        GrVideoModeExt ext, *extp = &exts[0];
        int mindex;
        memzero(modep, (sizeof(modes) - sizeof(modes[0])));
        for (mindex = G320x200x16; mindex <= GLASTMODE; mindex++) {
            if (!(vga_hasmode(mindex))) continue;
            if (!(mdinfo = vga_getmodeinfo(mindex))) continue;
            if (!(build_video_mode(mdinfo, &mode, &ext))) continue;
            if (frame == NULL) {
#if 0           /* paranoid hack to really make sure... */
                long endmem = (long)(sbrk(0));
                if ((endmem & 0xffffL) != 0) {
                    brk((void *)((endmem + 0xffffL) & ~0xffffL));
                }
#endif
                vga_setmode(mindex);
                if (ext.flags & GR_VMODEF_LINEAR)
                    vga_setlinearaddressing();
                frame = (char *)vga_getgraphmem();
                vga_setmode(initmode);
            }
            mode.mode = mindex;
            ext.frame = frame;
            add_video_mode(&mode, &ext, &modep, &extp);
        }
        _GrVideoDriverSVGALIB.adapter = isEGA ? GR_EGA : GR_VGA;
        return(TRUE);
    }
    return(FALSE);
}
--------------------
The problem:
GRX needs ext.frame to be set in advance for all video modes, but with Svgalib you have to actually switch to a video mode before you can get its frame pointer. The init code above assumes ext.frame is the same for all modes, which is no longer true on the newest video cards.
I made an attempt: setting the video mode (vga_setmode) for every mode in the loop and reading back the frame pointer each time. It works, and "demogrx" runs fine in 16bpp. But startup takes a long time, and cycling through every video mode consecutively is probably not a good idea.
So, does anyone know how to get the frame pointer without setting the video mode?