Greetings, Master Coders --
I'm about to embark on a software project: an application that I want to
run on several different operating systems (initially just *NIX variants
and MS-Windows, but eventually others -- maybe BeOS, MacOS, etc.). It
will be a network service (think POP mail or LDAP server), so
user-interface issues aren't my main concern. What I am worried about is
what I would call the "basics": endianness, word size, how to do basic
disk I/O, etc.
So as a "for instance", something as simple as an integer variable means
very different things on different systems. Is my int a 32-bit int or a
64-bit int? Is it big-endian or little-endian? The answer depends on the
platform. I obviously want to write the core code for my app in such a
way that I get exactly what I need on every platform without actually
rewriting the entire application for each one.
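To make that concrete, here's a rough sketch of the kind of thing I
picture for the wire format -- assuming C99's <stdint.h> fixed-width
types and the htonl()/ntohl() byte-order routines are available (the
function names below are just placeholders of my own, not anything
settled):

    /* Serializing a 32-bit length field in network (big-endian) byte
       order, independent of the host's endianness or the size of a
       plain "int".  htonl()/ntohl() live in <arpa/inet.h> on *NIX and
       in the winsock headers on MS-Windows. */
    #include <stdint.h>
    #include <string.h>
    #include <arpa/inet.h>

    void put_length(unsigned char *buf, uint32_t len)
    {
        uint32_t net = htonl(len);       /* host order -> big-endian */
        memcpy(buf, &net, sizeof net);   /* memcpy avoids alignment traps */
    }

    uint32_t get_length(const unsigned char *buf)
    {
        uint32_t net;
        memcpy(&net, buf, sizeof net);
        return ntohl(net);               /* big-endian -> host order */
    }

That seems workable for a field or two, but I'd rather follow an
established pattern than scatter hand-rolled conversions through a whole
protocol implementation.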
Now, I could probably work out some system of #ifdefs and #defines to
handle a lot of this, but if somebody has already solved these problems
I'd rather not reinvent the wheel.
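For instance, I imagine I'd end up hand-rolling a little typedef header
along these lines (the platform macros and type names here are just
guesses for illustration):

    /* my_types.h -- the sort of #ifdef'd header I'd rather not maintain */
    #ifndef MY_TYPES_H
    #define MY_TYPES_H

    #if defined(_WIN32)
      /* Microsoft compilers provide their own sized integer types */
      typedef __int32          my_int32;
      typedef unsigned __int32 my_uint32;
    #elif defined(__unix__) || defined(__APPLE__)
      /* C99 fixed-width types, where <stdint.h> exists */
      #include <stdint.h>
      typedef int32_t  my_int32;
      typedef uint32_t my_uint32;
    #else
      #error "Unknown platform -- add a case here"
    #endif

    #endif /* MY_TYPES_H */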
I am wondering if anybody out there could recommend a book, article, web
page, or anything else that gives practical, concrete solutions to these
problems.
Oh, and before somebody suggests I just write the thing in Java -- I've
considered that, but for various reasons I'm sticking with C/C++.
Thanks in advance,
- Cedric