Forcing hardware acceleration

Post by Benjamin Lo » Fri, 16 Jul 1999 04:00:00



I'm trying to use the hardware acceleration in my OpenGL app.  When using a
Riva TNT, it automatically picks up that I have one and uses the
NVOPENGL.DLL (when querying OpenGL).  Ditto for the 3dfx.  But with the
Matrox G200 and the ATI Rage PRO it always defaults to the software renderer
from Microsoft.  Anyone have ideas?

2nd thing, if I

 
 
 

Forcing hardware acceleration

Post by Benjamin Lo » Fri, 16 Jul 1999 04:00:00



>I'm trying to use the hardware acceleration in my OpenGL app.  When using a
>Riva TNT, it automatically picks up that I have one and uses the
>NVOPENGL.DLL (when querying OpenGL).  Ditto for the 3dfx.  But with the
>Matrox G200 and the ATI Rage PRO it always defaults to the software renderer
>from Microsoft.  Anyone have ideas?

whoops!

Continuing now:

2nd thing, if I

glGetString(GL_VENDOR)

and it returns something OTHER than Microsoft, can I automatically assume I'm
getting hardware acceleration?

Thanks,
Ben

 
 
 

Forcing hardware acceleration

Post by Andrew F. Vespe » Fri, 16 Jul 1999 04:00:00



> I'm trying to use the hardware acceleration in my OpenGL app.  When using a
> Riva TNT, it automatically picks up that I have one and uses the
> NVOPENGL.DLL (when querying OpenGL).  Ditto for the 3dfx.  But with the
> Matrox G200 and the ATI Rage PRO it always defaults to the software renderer
> from Microsoft.  Anyone have ideas?

For the Rage Pro, I think you have to set the color depth to "TrueColor" to
get the OpenGL ICD.

--
Andy V (OpenGL Alpha Geek)
"In order to make progress, one must leave the door to the unknown ajar."
Richard P. Feynman, quoted by Jagdish Mehra in _The Beat of a Different Drum_.

 
 
 

Forcing hardware acceleration

Post by Benjamin Lo » Fri, 16 Jul 1999 04:00:00


I did:
        DEVMODE         dm;
        memset(&dm, 0, sizeof(dm));
        dm.dmSize = sizeof(dm);

        dm.dmPelsWidth  = pRend->m_pParms->m_uiWidth;
        dm.dmPelsHeight = pRend->m_pParms->m_uiHeight;
        dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;
        dm.dmBitsPerPel = 24; //XXX Only for ATI

        long result;
        result = ChangeDisplaySettings( &dm, CDS_FULLSCREEN);

and I still got only the Microsoft software renderer.  Was setting the
dm.dmBitsPerPel to 24 enough?  I also tried 16 and 32, but to no avail.

Thanks,
Ben

Quote:>For the Rage Pro, I think you have to set the color depth to "TrueColor" to
>get the OpenGL ICD.

 
 
 

Forcing hardware acceleration

Post by Lucian Wisch » Fri, 16 Jul 1999 04:00:00



>and I still got only the Microsoft software renderer.  Was setting the
>dm.dmBitsPerPel to 24 enough?  I also tried 16 and 32, but to no avail.

I think he means:
you must set the screen mode to 16bpp (maybe 32bpp works as well),
otherwise the card won't accelerate anything. I may be wrong.

--
Lucian

 
 
 

Forcing hardware acceleration

Post by Dominic Ludla » Fri, 16 Jul 1999 04:00:00


You'll probably need to set dm.dmFields to

    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;

--
Dominic Ludlam

 
 
 

Forcing hardware acceleration

Post by Benjamin Lo » Fri, 16 Jul 1999 04:00:00


I tried that.  Still nothing at 16,24,32.

In response to Lucian's remark: I'm trying to change the screen resolution
(and associated bit depth), so I think the current resolution/bit depth
does not affect things.  Am I wrong here?  Or does using the pixel format
descriptor determine things?

Thanks,
Ben

 
 
 

Forcing hardware acceleration

Post by Lucian Wisch » Fri, 16 Jul 1999 04:00:00



>In response to Lucian's remark

Sorry! My comment was nonsense because I didn't read your code. This is
what I should have said:

1. Many cards only accelerate at 16bpp and 32bpp. That's why you're trying
to change screen mode, presumably.

2. As per the previous post, you certainly need DM_BITSPERPEL. But you
tried that and said that it didn't work.

3. Most cards under '95 (and I guess some under '98, but I don't know
about your particular card) are not able to change bpp using just
ChangeDisplaySettings. NT always works. In any case, you should always
test the mode first:
  LONG res=::ChangeDisplaySettings(&dm,CDS_TEST);
  if (res!=DISP_CHANGE_SUCCESSFUL) return;

4. You must choose a PIXELFORMATDESCRIPTOR and a pixel format with the
same bpp as the screen. If you don't, then it's very unlikely that you'll
get hardware acceleration on screen.

5. Some cards don't have enough memory for a high-res mode as well as a
big depth buffer. Are you changing to 640x480? Does that work?

--
Lucian

 
 
 

Forcing hardware acceleration

Post by Benjamin Lo » Sat, 17 Jul 1999 04:00:00


Quote:>1. Many cards only accelerate at 16bpp and 32bpp. That's why you're trying
>to change screen mode, presumably.

Yes.  When my app runs, I'd like to switch to fullscreen with hardware
acceleration.  I can get the screen to go to full-screen mode, but it
always uses software rendering.

Quote:>3. Most cards under '95 (and I guess some under '98, but I don't know
>about your particular card) are not able to change bpp using just
>ChangeDisplaySettings. NT always works. In any case, you should always
>test the mode first:
>  LONG res=::ChangeDisplaySettings(&dm,CDS_TEST);
>  if (res!=DISP_CHANGE_SUCCESSFUL) return;

Well, it seems to change the resolution just fine; it's just that it only
uses software rendering with the ATI Rage PRO and the Matrox G200.  I get
hardware acceleration with the Voodoo3 and TNT.  I even used non-16-bit
pixel format descriptors and the Voodoo3 was still accelerated.

Quote:>4. You must choose a PIXELFORMATDESCRIPTOR and a pixel format with the
>same bpp as the screen. If you don't, then its very unlikely that you'll
>get hardware acceleration on screen.

Okay, so I'm correct in assuming I have two things to set: the screen
resolution (+ bit depth), and also the pixel format descriptor for my OpenGL
rendering context?

Quote:>5. Some cards don't have enough memory for a high-res mode as well as a
>big depth buffer. Are you changing to 640x480? Does that work?

Yeah, it can change to 640x480 (both the Rage PRO and G200 have at least 4
MB), and I'm sure memory isn't the problem.  Any more thoughts?

Ben

 
 
 

Forcing hardware acceleration

Post by Benjamin Lo » Thu, 22 Jul 1999 04:00:00


SOLUTION:
Okay I think I found out what was wrong...

I was calling:
    result = ChangeDisplaySettings( &dm, CDS_FULLSCREEN);

but I needed to call:
    result = ChangeDisplaySettings( &dm, 0);

That (along with, I think, twiddling the bit depth to 16, the pixel format
descriptor to 16, etc.) was the cause.  I now get hardware acceleration
using ATI cards.  This wasn't a problem with the TNT or Voodoo3 cards, so
who knows?
Just my experience.

Thanks to all who replied,
Ben

 
 
 

Forcing hardware acceleration

Post by Lucian Wisch » Thu, 22 Jul 1999 04:00:00



>Okay I think I found out what was wrong...
>I was calling:  result = ChangeDisplaySettings( &dm, CDS_FULLSCREEN);
>needed to call: result = ChangeDisplaySettings( &dm, 0);
>That, (and I think along with twiddling the bit depth to 16,
>pixelformatdescriptor to 16, etc.) was the cause.  I now get hardware
>acceleration using ATI cards.  This wasn't a problem with TNT or Voodoo3
>cards, so who knows?

Ben, this all seems really suspicious to me. What do you mean by
"twiddling the bit depth..." ? I think it is more desirable to use
CDS_FULLSCREEN than not, because CDS_FULLSCREEN will not mess up people's
desktop icons but 0 will. Did you also try CDS_TEST before?

1. I've heard a rumour about this issue. Suppose your application starts up.
It is linked to OPENGL32.DLL, so the OpenGL libraries get initialised to
the current screen depth. Then you try to change mode to a different bpp.
Perhaps the OpenGL libraries fail to reinitialise themselves, and end up
kind of screwed up? Perhaps ChangeDisplaySettings(&dm, CDS_FULLSCREEN) was
not enough to reset them, while ChangeDisplaySettings(&dm, 0) is?

- perhaps another solution would be to change screen mode *before* linking
to OPENGL32.DLL? There are two ways you could accomplish this. (1) You
could write a launcher program whose job is to ChangeDisplaySettings and
then run your actual game. Or (2) you could load OPENGL32.DLL dynamically
using LoadLibrary.

2. Are you using ChoosePixelFormat? I never trust that call. The most
reliable way to choose a pixel format is to enumerate them one by one,
call DescribePixelFormat for each, and find out which ones are accelerated
(by checking PFD_GENERIC_FORMAT and PFD_GENERIC_ACCELERATED).

Try changing screen mode using CDS_FULLSCREEN to something or other (16,
probably). Then enumerate all the pixel formats. Does it tell you that it
can accelerate some of them okay?

I ask because I've done exactly this on an ATI card (using CDS_FULLSCREEN,
enumerating the pixel formats, and choosing the one that was accelerated).
It all appeared to work fine, and I did get acceleration.

--
Lucian

 
 
 

Selecting hardware acceleration?

Hi!

I just started using OpenGL, and the stuff I code works, but it's not
hardware accelerated.  It's software only.  I guess I have to select
hardware acceleration somewhere...  how is it done?
I want my apps to run on a Voodoo2 or G200.

Thanks for any info!  :-)
