16 colors vs 256 colors

Post by David Stewa » Thu, 05 Aug 1993 02:19:37



This is a basic question, I know, but I am just totally fuzzy on the whole video issue.  Someone involved in graphics should be able to help, if they would be so kind.

I was always under the impression that a VGA monitor could not support a resolution beyond 640x480 or more than 16 colors.  Now I understand that with the proper driver I can get higher resolutions and 256 colors with standard VGA equipment.  Is this true?

If it is true, does that mean a user with an SVGA card could, if necessary, display the higher resolutions and colors on a VGA monitor?  What would be the point of an SVGA monitor if I could do this?

Thanks for clearing this up for me.  Feel free to reply by email in addition to the post, I'm not sure I'm getting my mail.

Thanks,

Dave

 
 
 

16 colors vs 256 colors

Post by Raymond Bl » Fri, 06 Aug 1993 03:45:24


Quote:>This is a basic question, I know, but I am just totally fuzzy on the whole video issue.  Someone involved in graphics should be able to help, if they would be so kind.

>I was always under the impression that a VGA monitor could not support a resolution beyond 640x480 or more than 16 colors.  Now I understand that with the proper driver I can get higher resolutions and 256 colors with standard VGA equipment.  Is this true?

>If it is true, does that mean a user with an SVGA card could, if necessary, display the higher resolutions and colors on a VGA monitor?  What would be the point of an SVGA monitor if I could do this?

>Thanks for clearing this up for me.  Feel free to reply by email in addition to the post, I'm not sure I'm getting my mail.

>Thanks,

>Dave

   A VGA *MONITOR* can support 640*480 resolution; an SVGA monitor can support
higher resolutions (e.g. 800*600, 1024*768).  The number of CONCURRENT colors
that can be displayed by a VGA card (adapter) is 16; the number of CONCURRENT
colors displayed by a SuperVGA adapter may be higher (e.g. 256, 32,768...).
  The max resolution is a function of both the monitor and the display adapter;
the number of concurrent colors is a function of the display adapter alone.  On
the display adapter side, the limitation is usually imposed by the amount of
RAM that the display adapter has available to store the bitmap that corresponds
to the pixels on the screen.  At least one bit per pixel is needed: a two-color
display of 640*480 would require 307,200 bits (640*480), or 38,400 bytes.  A
16-color image of 640*480 requires 150K bytes (640*480*4 bits per pixel,
divided by 8).  The number of bits per pixel needed to display n colors is
2log(n) bits (?).
  Hope this helps.  Some of the math may be off but I'm pretty sure that the
theory described above is accurate.
---Raymond

---
-------------------------------------------------------
-----                               __   ___       ----
-----                        . o  _/  \_/   \_     ----
----                  \\\\\\\    O   Hey!     \    ----
----                 \\\    \\   | why AM I    |   ----
----                  \\ O O     \_ smiling? _/    ----
----                               \__/\____/      ----
----                     \_/                       ----
----                                               ----
----   Raymond Blum                                ----
----   (201) 896-7594                              ----
-------------------------------------------------------
  Of course these are MY opinions! Whattya think, my EMPLOYER
  could've thought this stuff up?
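Raymond's arithmetic checks out; a short sketch of the same calculation (the helper name is just for illustration):

```python
# Framebuffer size in bytes: width * height * bits per pixel, divided by 8.
def framebuffer_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

# The figures from the post above:
assert framebuffer_bytes(640, 480, 1) == 38400    # 2 colors: 307,200 bits
assert framebuffer_bytes(640, 480, 4) == 153600   # 16 colors: 150K bytes
assert framebuffer_bytes(640, 480, 8) == 307200   # 256 colors
```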

 
 
 

16 colors vs 256 colors

Post by Brandon S. Dewber » Sat, 07 Aug 1993 01:01:03



>>This is a basic question, I know, but I am just totally fuzzy on the whole video issue.  Someone involved in graphics should be able to help, if they would be so kind.

>>I was always under the impression that a VGA monitor could not support a resolution beyond 640x480 or more than 16 colors.  Now I understand that with the proper driver I can get higher resolutions and 256 colors with standard VGA equipment.  Is this true?

>>If it is true, does that mean a user with an SVGA card could, if necessary, display the higher resolutions and colors on a VGA monitor?  What would be the point of an SVGA monitor if I could do this?

>>Thanks for clearing this up for me.  Feel free to reply by email in addition to the post, I'm not sure I'm getting my mail.

>>Thanks,

>>Dave
>   A VGA *MONITOR* can support 640*480 resolution; an SVGA monitor can support
>higher resolutions (e.g. 800*600, 1024*768).  The number of CONCURRENT colors
>that can be displayed by a VGA card (adapter) is 16; the number of CONCURRENT
>colors displayed by a SuperVGA adapter may be higher (e.g. 256, 32,768...).
>  The max resolution is a function of both the monitor and the display adapter;
>the number of concurrent colors is a function of the display adapter alone.  On
>the display adapter side, the limitation is usually imposed by the amount of
>RAM that the display adapter has available to store the bitmap that corresponds
>to the pixels on the screen.  At least one bit per pixel is needed: a two-color
>display of 640*480 would require 307,200 bits (640*480), or 38,400 bytes.  A
>16-color image of 640*480 requires 150K bytes (640*480*4 bits per pixel,
>divided by 8).  The number of bits per pixel needed to display n colors is
>2log(n) bits (?).
>  Hope this helps.  Some of the math may be off but I'm pretty sure that the
>theory described above is accurate.
>---Raymond

I get something different.  The number of bits per pixel (x) required to
display n colors concurrently is x = ln(n)/ln(2) or x = log(n)/log(2).
This makes it approximately x = 3.32log(n) rather than the above.
Here's my reasoning:

    2^x = n
    ln(2^x) = ln(n)
    xln(2) = ln(n)
    x = ln(n)/ln(2) = (approx) 1.44ln(n) = 3.32log(n) ( ln(?) = 2.303log(?) )

So, a high end SVGA adapter with 1024x768 resolution and 32,768 simultaneous
colors needs 1024*768*15 bits ( 15 = ln(32768)/ln(2) ) = 1,474,560 bytes, about 1.4MB.
Usually, these come with 2MB.  What do they do with the rest of the memory?

How do you know you have a 'VGA monitor' or an 'SVGA monitor'?

Brandon
--
Brandon S. Dewberry
NASA/MSFC/EB43                 Vanderbilt University Biomedical Engineering
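Brandon's derivation amounts to x = log2(n); a quick check of his numbers (the function name below is illustrative):

```python
import math

# Bits per pixel needed to display n colors concurrently: ceil(log2(n)).
def bits_per_pixel(n_colors):
    return math.ceil(math.log2(n_colors))

assert bits_per_pixel(2) == 1
assert bits_per_pixel(16) == 4
assert bits_per_pixel(256) == 8
assert bits_per_pixel(32768) == 15

# His 1024x768, 32,768-color example:
total_bytes = 1024 * 768 * bits_per_pixel(32768) // 8
assert total_bytes == 1474560    # 1440K, i.e. roughly 1.4MB
```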

 
 
 

16 colors vs 256 colors

Post by David Stewa » Sat, 07 Aug 1993 21:04:10




>I get something different.  The number of bits per pixel (x) required to
>display n colors concurrently is x = ln(n)/ln(2) or x = log(n)/log(2).
>This makes it approximately x = 3.32log(n) rather than the above.
>Here's my reasoning:

>    2^x = n
>    ln(2^x) = ln(n)
>    xln(2) = ln(n)
>    x = ln(n)/ln(2) = (approx) 1.44ln(n) = 3.32log(n) ( ln(?) = 2.303log(?) )

>So, a high end SVGA adapter with 1024x768 resolution and 32,768 simultaneous
>colors needs 1024*768*15 bits ( 15 = ln(32768)/ln(2) ) = 1,474,560 bytes, about 1.4MB.
>Usually, these come with 2MB.  What do they do with the rest of the memory?

>How do you know you have a 'VGA monitor' or an 'SVGA monitor'?

>Brandon
>--
>--
>Brandon S. Dewberry
>NASA/MSFC/EB43                 Vanderbilt University Biomedical Engineering


*****************************************************************
I don't think you can look at a monitor and physically tell whether you have VGA or SVGA.  You need the documentation.

As far as the original question went: a VGA monitor cannot, regardless of the adapter, produce a resolution higher than 640x480?  But with more memory on the card it can produce more concurrent colors?

DAS

 
 
 

16 colors vs 256 colors

Post by Joey H. Blankensh » Sun, 08 Aug 1993 05:18:20


Video resolution and color depth are functions of both the monitor and the
adapter card (or onboard solution).  Video resolutions supported by the adapter
cards may be very high, although the refresh rate at which a particular
resolution can be driven may be limited by the monitor (e.g. an interlaced
refresh), which can produce visible flickering.  Most manufacturers (including
us) sell video packages in which the video drivers will program the video chip
for the optimum (read highest) refresh rate supported by the monitors that we
offer.  The color depth also affects refresh rate, because more data must be
transferred to the DAC to support higher color depths.

As for the memory needed at a particular resolution at a certain color depth,
the equation is quite simple:  (Vertical * Horizontal * Actual Bits Per Pixel)
divided by 8.  Actual bpp is not the same as color depth because although some
adapters will support 24 bpp, it gets represented internally (in video RAM) as
32 bpp, with either the upper or lower 8 bits discarded.  This is done to
simplify the HW and the SW.  It can also result in some performance gain due
to simplified algorithms.
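Joey's equation, with the 24-to-32 bpp padding he describes (whether a given adapter pads this way is, as he notes, adapter-specific; the flag below is just for illustration):

```python
# (vertical * horizontal * actual bits per pixel) / 8, where a 24 bpp
# color depth may be stored in video RAM as 32 bpp with one byte unused.
def video_ram_bytes(width, height, color_depth_bpp, pad_24_to_32=True):
    actual_bpp = 32 if (color_depth_bpp == 24 and pad_24_to_32) else color_depth_bpp
    return width * height * actual_bpp // 8

assert video_ram_bytes(640, 480, 8) == 307200
# "True color" at 640x480 then takes 4 bytes per pixel, not 3:
assert video_ram_bytes(640, 480, 24) == 1228800
```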

Hope this helps.

Joey.

-----
Joseph H. Blankenship
Project Leader, System Diagnostics and Video Drivers
NCR, WPD-Clemson

 
 
 

16 colors vs 256 colors

Post by Air Bon » Thu, 12 Aug 1993 05:04:05



>As far as the original question went: a VGA monitor cannot, regardless of the adapter, produce a resolution higher than 640x480?  But with more memory on the card it can produce more concurrent colors?

>DAS

The first VGA monitors were fixed frequency and could only
display 640x480.  I think every monitor made now is capable of at
least 72 Hz (or whatever).  Analog monitors are not limited in
the number of colors they can produce, of course.

Back to the original post: a standard VGA card can display 256
colors at 320x200 (I think that's correct).

Tim

 
 
 

16 colors vs 256 colors

Post by Lenny Gr » Fri, 13 Aug 1993 13:31:50


Standard VGA's 640x480 with 16 colors is basically defined by the
simultaneous limits of monitor frequency response _and_ 256K of video
RAM.  Anything that steps beyond the 256K card should be able to
give 640x480 with 256 colors, but is programmed in a totally
different way.  There was also the problem of "no standard", which
made each piece of software have to know about the specific card.
These days, I guess all the cards either support the VESA SVGA
standard or you can get a TSR that emulates it without significant
overhead.  Everything from 640x480x256 on up _is_ SVGA.  The basic
"VGA" monitor _can_ handle 640x480x256, even though the card has to
be SVGA and have 512K to do it.  The term "SVGA monitor"
basically refers to the "beyond 640x480" frequency response that
the monitor has.
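The 256K boundary Lenny describes works out like this (all sizes in bytes):

```python
VGA_RAM = 256 * 1024                 # 262,144 bytes on a standard VGA card

needed_16  = 640 * 480 * 4 // 8      # 153,600 bytes for 16 colors (4 bpp)
needed_256 = 640 * 480 * 8 // 8      # 307,200 bytes for 256 colors (8 bpp)

assert needed_16 <= VGA_RAM          # fits: the standard VGA mode
assert needed_256 > VGA_RAM          # does not fit: hence the 512K SVGA card
```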
 
 
 

16 colors vs 256 colors

Post by Robert Schmi » Mon, 23 Aug 1993 23:59:35



|> >
|> >As far as the original question went: a VGA monitor cannot, regardless of the adapter, produce a resolution higher than 640x480?  But with more memory on the card it can produce more concurrent colors?
|> >
|> >DAS
|> >
|>
|> The first VGA monitors were fixed frequency and could only
|> display 640x480.  I think every monitor made now is capable of at
|> least 72 Hz (or whatever).  Analog monitors are not limited in
|> the number of colors they can produce, of course.
|>
|> Back to the original post: a standard VGA card can display 256
|> colors at 320x200 (I think that's correct).

Standard VGA is maximum 720 dot clocks horizontally and 480 scan lines
vertically.  Since a 256 colour pixel takes two dot clocks, any standard VGA
adapter and standard VGA monitor combination should be able to support
up to 360x480 in 256 colors, or 720x480 in 16 colors.
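Robert's dot-clock budget, worked through; both modes happen to need the same amount of video RAM:

```python
DOT_CLOCKS_H = 720    # horizontal dot clocks in standard VGA timing
LINES_V = 480         # vertical scan lines

# A 256-color pixel takes two dot clocks; a 16-color pixel takes one.
assert DOT_CLOCKS_H // 2 == 360            # widest 256-color mode: 360x480
assert 360 * LINES_V * 8 // 8 == 172800    # bytes at 8 bpp: fits in 256K
assert 720 * LINES_V * 4 // 8 == 172800    # 16-color 720x480: same size
assert 172800 <= 256 * 1024
```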

--

Ztiff Zox Softwear: fast/tiny utilities, games/graphics programming on
                    the DOS platform (C/C++ & asm).  Suggestions welcome!

Everything I write is my opinion only - go make up your own!