Double buffering under GLX

Post by Ralf Beye » Fri, 23 Aug 1996 04:00:00



Hi,

one of our users develops an OpenGL program on an Indigo^2 Maximum Impact
running IRIX 5.3 and reports that updating his graphics in double-buffer
mode (using glXSwapBuffers) takes twice as long as updating them in
single-buffer mode.

Having read the manuals, he and I are confused by statements like

  "glXSwapBuffers promotes the contents of the back buffer
  of drawable to become the contents of the front buffer of
  drawable" (from the glXSwapBuffers manpage)

or

  "For drawables that are double-buffered, the front and back
  buffers can be exchanged by calling glXSwapBuffers()" (from
  the OpenGL Programming Guide)

What is the meaning of "promoted" and "exchanged" in this context?

What happens at buffer swap time:

  Is the back buffer copied into the front buffer, i.e. into the
  hardware frame buffer that is actually displayed?

  Or is there a hardware switch that selects the old hardware back
  buffer as the new hardware front buffer for display?

Does anyone have an idea why double buffering takes twice as long to update
the graphics as single-buffer mode?

Thanks for any comments and best regards.


German Aerospace Research Establishment (DLR) e.V.



Double buffering under GLX

Post by Mark D Stadl » Fri, 23 Aug 1996 04:00:00




Quote:
>one of our users develops an OpenGL program on a Indigo^2 Maximum Impact
>running IRIX 5.3 and reports that updating his graphic in double buffer
>mode (using glXSwapBuffers) takes twice the time than updating it in single
>buffer mode.

not surprising that it appears this way.

when running in double-buffer mode, you render to the back buffer
while the display shows what's in the front buffer.  after completely
rendering a frame, you call glXSwapBuffers() to switch the buffers
(back buffer becomes the front buffer, front buffer becomes the back).
for a smooth transition, the switch is timed to happen at vertical
retrace time, while the video is blanked.  since the switching is
synchronized with the video refresh (typically 60-75Hz), you will do
less rendering, and your rendering will be quantized to the rate
of the video.

at 60Hz, you have ~16ms to draw a scene.  if you can draw it in half
that time it doesn't matter; the next frame is always 16ms from the previous.
if it takes you 17ms to draw the scene, you get quantized down to the
next lower frame rate, and you might as well take the full 32ms.
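the quantization can be sketched with a tiny helper (a toy model, not any
GLX call; the name effective_frame_ms is made up for illustration):

```c
/* toy model of vsync quantization (invented helper, not part of GLX):
   render time is rounded up to a whole number of refresh intervals,
   since the buffer swap only happens at vertical retrace. */
double effective_frame_ms(double render_ms, double refresh_hz)
{
    double interval_ms = 1000.0 / refresh_hz;  /* ~16.7ms at 60Hz */
    double t = interval_ms;                    /* you always wait at least
                                                  one retrace */
    while (t < render_ms)                      /* round up to the next
                                                  whole retrace */
        t += interval_ms;
    return t;
}
```

so a 4ms scene and a 16ms scene both display at 60Hz, while a 17ms scene
drops straight to 30Hz.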

when running in single-buffer mode, your rendering is not
synchronized to the monitor's vertical retrace.  so you will be
rendering to pixels in the frame buffer as they are being displayed.
this can cause visual artifacts (tearing) since you may not always
see a coherent frame.

so, even if you can render a scene in 4ms, the display will still
only show the contents of the frame buffer every 16ms.  if you are
running in single buffer mode, 3/4 of all the rendering you do will
never be displayed.  even the frames you do see will likely be incoherent.

so, yes, double-buffer mode effectively limits the amount of rendering
you do to match that which can be displayed on a video monitor.  any rendering
you do beyond that is time wasted.

Quote:

>Having read the manuals he and I are confused by statements like

>  glXSwapBuffers promotes the contents of the back buffer
>  of drawable to become the contents of the front buffer of
>  drawable" (from the glXSwapBuffers manpage)

just kind of a wordy way to say we switch the buffers.

Quote:

>or

>  For drawables that are double-buffered, the front and back
>  buffers can be exchanged by calling glXSwapBuffers() (from
>  the OpenGL Programming Guide)

>What is the meaning of "promoted" and "exchanged" in this context?

another wordy way to say we switch the buffers.

Quote:

>What happens at buffer swap time:

>  Is a back-buffer written into a front-buffer which is actually
>  the hardware frame buffer displayed ?

>  Is there a hardware switch to select the old hardware back buffer
>  as the new hardware front buffer for display?

the hardware has 2 buffers (we refer to them as buffer-A and buffer-B).
think of front-buffer and back-buffer as pointers that can either point
to buffer-A or buffer-B.  after you request a glXSwapBuffers(), we wait
for vertical retrace, then change the pointers for the front and back buffer.
we do not move the data around.
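that pointer exchange can be sketched like this (a toy model, not SGI's
actual driver code; all the names are invented):

```c
/* toy model of the hardware's two color buffers (invented names):
   "front" and "back" are just pointers into buffer A or buffer B. */
typedef struct {
    unsigned char buffer_a[4];  /* stands in for a full color buffer */
    unsigned char buffer_b[4];
    unsigned char *front;       /* scanned out to the monitor */
    unsigned char *back;        /* rendered into by the application */
} toy_framebuffer;

void toy_init(toy_framebuffer *fb)
{
    fb->front = fb->buffer_a;
    fb->back  = fb->buffer_b;
}

/* what the swap conceptually does at vertical retrace:
   exchange the two pointers; no pixel data is moved. */
void toy_swap(toy_framebuffer *fb)
{
    unsigned char *tmp = fb->front;
    fb->front = fb->back;
    fb->back  = tmp;
}
```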

note too that glXSwapBuffers() returns immediately, allowing you to
use the CPU to compute the next frame.  even though the graphics
hardware has to wait for vertical retrace, the application doesn't
have to wait to regain control of the CPU.

Quote:
>Does anyone have an idea why double buffering takes twice the time to update
>the graphics compared to single buffer ?

i'm guessing that you are measuring performance based on a fairly simple scene.
anything under 16ms will appear to take 16ms in double-buffer mode.
anything between 17ms and 32ms will appear to take 32ms, and so on.

remember, there is no point in drawing a scene more times per second than a
display monitor can display it.

by the way, single-buffer mode can be a useful tool in tuning your rendering.
if your application is only running at 30Hz, you can time it in single-buffer
mode to see how close you are to 60Hz.  a 1ms tune could get you from 17ms
to 16ms, which is the difference between 30Hz and 60Hz.

hope this helps

Double buffering under GLX

Post by Allen Ak » Fri, 23 Aug 1996 04:00:00






| >
| >Having read the manuals he and I are confused by statements like
| >
| >  glXSwapBuffers promotes the contents of the back buffer
| >  of drawable to become the contents of the front buffer of
| >  drawable" (from the glXSwapBuffers manpage)
|
| just kind of a wordy way to say we switch the buffers.

Some systems (not SGI's, though) actually copy the back buffer to the
front buffer.  The manuals are worded so that people don't assume the
buffers are always swapped.  Code that depends on true swapping (to
implement incremental update, for example) could break on systems that
use copying.
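The difference is visible to code that updates the back buffer
incrementally.  A toy model (invented names; one int stands in for an
entire color buffer) shows what each style leaves in the back buffer:

```c
/* toy model contrasting the two glXSwapBuffers() implementations
   (invented names; each int holds the number of the frame drawn in it) */
typedef struct {
    int buf_a, buf_b;
    int *front, *back;
} toy_fb;

/* true swap: exchange pointers; the new back buffer holds stale data */
void swap_by_exchange(toy_fb *f)
{
    int *t = f->front;
    f->front = f->back;
    f->back = t;
}

/* copy-style "swap": back is copied to front, so the back buffer still
   holds the frame just drawn and incremental updates to it stay valid */
void swap_by_copy(toy_fb *f)
{
    *f->front = *f->back;
}
```

An application that redraws only the changed parts of the back buffer
keeps working after swap_by_copy but finds stale data after
swap_by_exchange; hence the deliberately loose wording in the manuals.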

Allen


