Shared memory on another system

Post by Studio 5 » Sat, 14 Apr 2001 13:44:36



I'm trying to learn about server-side programming for massively multiplayer
online games. Game state has to be controlled in one central location (i.e. a
master server), but there's a lot of processing overhead that has to be done,
like determining what happens when players shoot, what happens when things
explode, etc...

This processing can't be done by the client because it would just use up
more bandwidth to transmit the extra state info. If this processing were
done by the master server then it creates more load on that server that
could be better spent servicing clients. If this processing were delegated
to a secondary server (or servers), that would be preferable, except that
there needs to be a way for the secondary server to access and alter the
state info on the master server. The state info must be stored in RAM on the
master server for best performance, so I'm wondering what the fastest way is
to allow access to that memory from another system. Maybe the overhead on
the master server from communicating with the secondary server would negate
any benefits that come from sharing the processing workload with the
secondary server?

I'm thinking there must be some way to preserve game state in one central
location but still allow different servers to share the workload of serving
clients, processing events, etc..., but I'm having trouble coming up with a
system that will work very well.

Any thoughts?


I'm also interested in discussing game programming in general, related to this topic or not.
Thanks,

LKembel

 
 
 

Shared memory on another system

Post by mind » Sat, 14 Apr 2001 17:19:14


Studio 51 wrote:

>I'm trying to learn about server-side programming for massively
>multiplayer online games.
> ... snip ...

Do you want distributed shared memory?

Distributed shared memory is generally implemented with Memory Channel.

This method needs additional hardware devices and is platform-dependent, and
it has many restrictions, but performance is very good. Programming is also
easy.

I know that Compaq Alpha systems have an MC solution.

For more information, visit

http://tru64unix.compaq.com/faqs/publications/cluster_doc/cluster_51/...
OC.HTM
   Highly Available Applications manual, chapters 9 and 10

 
 
 

Shared memory on another system

Post by Sean O'Donnel » Sun, 15 Apr 2001 03:19:32


You might want to investigate memory mapped files over NFS.  I used this
once to implement remote monitoring in a distributed application.  The basic
idea was that each host in the application ran a daemon that collected
status information periodically (5 second intervals) and dumped the status
info to a file.  The status file was in a file system that was exported via
NFS.  A client GUI running on another host mounted the exported file system
from each host, mapped the status file, and displayed changes to the files
in real time.  Kind of like shared memory between systems.
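
A minimal sketch of that approach, in C (the file name, record layout, and
update interval here are my own illustration, not from any real system): the
daemon maps a file that lives on the NFS-exported file system and rewrites a
status record in place every few seconds; the GUI on the other host maps the
same file read-only and polls it.

    /* writer side: update a status record in a memory-mapped file (error checks omitted) */
    #include <fcntl.h>
    #include <string.h>
    #include <sys/mman.h>
    #include <unistd.h>

    struct status { long heartbeat; double load; char msg[64]; };

    int main(void)
    {
        int fd = open("/export/status/host1.dat", O_RDWR | O_CREAT, 0644);
        ftruncate(fd, sizeof(struct status));      /* make the file large enough to map */
        struct status *st = mmap(NULL, sizeof *st, PROT_READ | PROT_WRITE,
                                 MAP_SHARED, fd, 0);
        for (;;) {
            st->heartbeat++;                       /* counter the remote GUI can watch */
            st->load = 0.42;                       /* stand-in for a real measurement */
            strcpy(st->msg, "ok");
            msync(st, sizeof *st, MS_SYNC);        /* flush so NFS clients see the update */
            sleep(5);                              /* the 5-second interval described above */
        }
    }

The reader does the same open/mmap with PROT_READ and re-reads the struct on a
timer; how quickly it sees changes depends on NFS attribute and data caching,
which is part of why this suits monitoring better than fast-changing game state.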

Sean O'Donnell


>I'm trying to learn about server-side programming for massively
>multiplayer online games.
> ... snip...

>Any thoughts?

>LKembel

 
 
 

Shared memory on another system

Post by Rich Gra » Sun, 15 Apr 2001 04:21:38



> Studio 51 wrote:
> >I'm trying to learn about server-side programming for massively
> >multiplayer online games.
> > ... snip ...

> Do you want distributed shared memory?

> Distributed shared memory is generally implemented with Memory Channel.
> ... snip ...

Probably overkill, but Systran (http://www.systran.com) makes a hardware
shared memory product called SCRAMNet+ (Shared Common RAM Network).  It
was originally designed for high-speed distributed realtime apps such as
flight simulators.

Cheers!
Rich


 
 
 

Shared memory on another system

Post by Studio 5 » Sun, 15 Apr 2001 06:46:37



Quote:> You might want to investigate memory mapped files over NFS.  I used this
> once to implement remote monitoring in a distributed application.
> ... snip ...

Thanks for the suggestion, that's interesting. Unfortunately I don't think
it would meet the speed requirements. On top of the latency of the Internet,
I would need the data to be shared very quickly and very often (many times
per second).

LKembel

 
 
 

Shared memory on another system

Post by Studio 5 » Sun, 15 Apr 2001 06:47:33



Quote:> Probably overkill, but Systran (http://www.systran.com) makes a hardware
> shared memory product called SCRAMNet+ (Shared Common RAM Network).  It
> was originally designed for high-speed distributed realtime apps such as
> flight simulators.

Thanks for both suggestions, I'll have to do a lot more research on these
kinds of systems.

LKembel

 
 
 

Shared memory on another system

Post by Studio 5 » Mon, 16 Apr 2001 19:10:33



Quote:> It may not be a problem in today's systems, but a caution is in order.  I
> once used shared memory between two minicomputers.  Occasionally a
> program would get erroneous results.  It turned out that when storing a
> multi-word variable, i.e. double precision, it was possible for the
> other computer to access the memory in between words.  Yep, you guessed
> it.  Every now and then it accessed a partially changed value :-).

Hmm... I can only assume that in today's systems there would be a very tight
locking system. Thanks for the info.

LKembel

 
 
 

Shared memory on another system

Post by ChromeDom » Tue, 17 Apr 2001 00:26:22



> > program would get erroneous results.  It turned out that when storing a
> > multi-word variable, i.e. double precision, it was possible for the
> > other computer to access the memory in between words.  Yep, you guessed
> > it.  Every now and then it accessed a partially changed value :-).

> Hmm... I can only assume that in today's systems there would be a very tight
> locking system. Thanks for the info.

We had one too.  But the point is that it was on a word basis, since
that was the hardware "atomic" storage quantity.  Any lockout for
multiples of the hardware store would necessarily have to be software.
IIRC, "mov" is an interruptible instruction :-).
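
To make the hazard concrete, here is a minimal sketch in C of the usual
software fix for the single-host shared-memory case (my own illustration, not
from the thread): the multi-word value is only ever touched under a
process-shared mutex, so a reader can never see half of a new value and half
of an old one. Between physically separate machines the same discipline has
to be provided by whatever interconnect or protocol carries the data.

    /* Software lockout for a multi-word value in shared memory (illustrative only). */
    #include <pthread.h>

    struct shared_state {
        pthread_mutex_t lock;   /* lives in the shared segment itself */
        double          value;  /* the multi-word quantity from the anecdote */
    };

    /* One-time setup: the mutex must be marked process-shared so it works
     * between separate processes that map the same memory. */
    void init_state(struct shared_state *s)
    {
        pthread_mutexattr_t attr;
        pthread_mutexattr_init(&attr);
        pthread_mutexattr_setpshared(&attr, PTHREAD_PROCESS_SHARED);
        pthread_mutex_init(&s->lock, &attr);
        pthread_mutexattr_destroy(&attr);
        s->value = 0.0;
    }

    void write_value(struct shared_state *s, double v)
    {
        pthread_mutex_lock(&s->lock);
        s->value = v;                 /* nobody can observe a half-written double */
        pthread_mutex_unlock(&s->lock);
    }

    double read_value(struct shared_state *s)
    {
        pthread_mutex_lock(&s->lock);
        double v = s->value;
        pthread_mutex_unlock(&s->lock);
        return v;
    }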

--
Homo Sapiens is a goal, not a description.

 
 
 

Shared memory on another system

Post by Benjamin Kaufma » Thu, 19 Apr 2001 21:48:22


I don't think that these events take much processing. It's the graphics that
consumes most of the CPU.  The game should be designed so that each of the game
players' machines is fed deltas to the state table.  Sharing the memory across
multiple machines will probably slow things down, because there can only be one
outcome and synchronization is required to ensure identical sequencing.  For
example, without synchronization, if you distribute over multiple machines it is
possible that in a face-off between user1 and user2, server A decides that user1
gets his shot off first while server B decides that user2 does - just because
of the propagation delay.  If you really do need more processing power, this
type of application is a textbook candidate for a more powerful machine.
If anything, figure out how to multi-thread the app and get a multiple-CPU box.
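
A minimal sketch of the "fed deltas" idea in C (the message layout and field
names are made up for illustration): the server stamps every state change with
a sequence number and sends only the change; each machine applies deltas in
that order, so everyone reaches the same outcome regardless of propagation
delay.

    /* Illustrative delta message and state table (not from the thread). */
    #include <stdint.h>

    #define MAX_ENTITIES 1024

    struct entity { float x, y; int16_t health; };

    struct delta {                   /* sent instead of the whole state table */
        uint32_t seq;                /* server-assigned order; settles the user1/user2 face-off */
        uint16_t entity_id;          /* which row of the state table changed */
        struct entity new_state;     /* replacement value for that row */
    };

    struct state_table {
        uint32_t last_seq;                     /* highest delta applied so far */
        struct entity entities[MAX_ENTITIES];
    };

    /* Apply a delta only if it is the next one in the server's order;
     * returns 0 on success, -1 if the receiver must wait or re-sync. */
    int apply_delta(struct state_table *t, const struct delta *d)
    {
        if (d->seq != t->last_seq + 1 || d->entity_id >= MAX_ENTITIES)
            return -1;
        t->entities[d->entity_id] = d->new_state;
        t->last_seq = d->seq;
        return 0;
    }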

Ben


>I'm trying to learn about server-side programming for massively multiplayer
>online games.
> ... snip ...

 
 
 

Shared memory on another system

Post by Studio 5 » Fri, 20 Apr 2001 02:31:41



Quote:> I don't think that these events take much processing. It's the graphics
> that consumes most of the CPU.
> ... snip ...

As I do more research and get other opinions, I'm starting to think you're
right. The problem is that I would like the system to be as scalable as
possible, and eventually there is a limit to how many client machines a
single server can support.

LKembel