> > I was searching for a way to capture Ctrl-<char> keys, and stumbled
> > across a macro definition like this:
> >     #define ctrl(c) ((c) - 64)
> > used as follows:
> >     if ((c = getchar()) == ctrl('G')) {
> >         /* .. handle Ctrl-G .. */
> >     }
> In ASCII, the control characters Ctrl-A through Ctrl-Z have values
> 1-26, and the upper-case letters have values 65-90. Therefore, if you
> subtract 64 from the ASCII code of an upper-case letter, you get the
> corresponding control character. ('@', the character just before 'A',
> has an ASCII value of 64.)
The subtraction yields the desired result only if the
given character is an ASCII upper-case letter. All of the
following will give some sort of garbage:
    ctrl('g')
    ctrl('0')     /* looks like ctrl('O'), i.e. SI, but it isn't */
    ctrl('\n')
One could argue that these are misuses of the macro;
the programmer has made a mistake. This is true, but the
compiler will accept all of them without protest, meaning
the error will be detected later (if at all) rather than
sooner. Debugging sessions take time and effort; wouldn't
it be nicer if the compiler squawked right away when it
first encountered the error?
For this and other reasons, I recommend less cleverness:
#define CTRL_A '\001'
#define CTRL_B '\002'
...
or
#define SOH '\001'
#define STX '\002'
...
or
enum {
NUL, SOH, STX, ... };
With "exhaustive" listings like this, a programmer who
accidentally writes `if (ch == ctrl_g)' will get an error
message right away instead of a program malfunction later on.
Safer, I'd say, and much cheaper in the long run.