The Design of Software (CLOSED)

A public forum for discussing the design of software, from the user interface to the code architecture. Now closed.

The "Design of Software" discussion group has been merged with the main Joel on Software discussion group.

The archives will remain online indefinitely.

In-memory data structures

I am looking for a library (commercial or free) which provides an abstraction for efficient sharing of arrays among multiple processes. We need to keep a single system-wide copy of one or more in-memory arrays. The library must run in user mode on Linux, Solaris, and HPUX. Performance is critical; in the perfect case each process would access the shared memory as fast as its local memory.

Any suggestions?
Dima
Tuesday, October 12, 2004
Carfield Yim
Tuesday, October 12, 2004
You could write your own - it is not that hard: use shared memory + semaphores for synchronization.


For an API example, have a look at the Win32 MapViewOfFile (works with shared memory too). The API is not that nice, but it does the job.
Wednesday, October 13, 2004
Linux, Solaris, and HPUX.  Yup, shared memory under Unix -- shmget(), etc.  Use semaphores to control access.

See Stevens, "Advanced Programming in the UNIX Environment", page 463.

I'm assuming each process is on the same machine, so it actually IS accessing the same shared memory.  If not, then see "Leaky Abstractions" -- you can't REALLY share memory which is in one box with a processor which is in another box.

You can do something similar to this (perhaps fast enough, but MUCH slower than true shared memory) using socket connections to throw data updates around from machine to machine.
Wednesday, October 13, 2004

This topic is archived. No further replies will be accepted.
