<urc>
<title>SAM Distributed Shared Memory System
<description>
<url>http://suif.stanford.edu/~scales/sam.html</url>
<abstract>
SAM is a run-time system that supports a shared name space in software on
distributed-memory multiprocessors. Many scientific applications operate on
complicated data structures and access data in irregular, often data-dependent
ways. SAM provides a global name space that facilitates programming such
applications and caches data as necessary for efficient execution. All shared
data in SAM is communicated in terms of user-defined data types, rather than
fixed-size units such as pages. SAM provides simple primitives for accessing
data, from which more complex types of access can be built. These primitives
directly model the fundamental data relationships in parallel programs:
producer-consumer, mutual exclusion, and chaotic relationships.
<keywords>parallel runtime system; virtual shared memory
<category>ppt-rts
<contact>Dan Scales / scales@cs.stanford.edu
</urc>

<urc>
<title>Jade Parallel Programming Language
<description>
<url>http://suif.stanford.edu/~scales/sam.html</url>
<abstract>
Included with the SAM system is an implementation of the Jade parallel
language for distributed-memory machines, built on SAM. Jade is a parallel
programming language (an extension to C) for exploiting coarse-grain
concurrency in sequential, imperative programs. Jade provides the convenience
of a shared-memory model by allowing any task to access shared objects
transparently. Jade programmers augment their sequential programs with
constructs that decompose the computation into tasks and declare how tasks
access shared objects. The Jade implementation dynamically interprets this
information to execute the program in parallel while preserving the sequential
semantics: if there is a data dependence between two tasks, they run in the
same order as in the sequential execution. The parallelism in a Jade program
thus forms a directed acyclic graph of tasks, whose edges are the data
dependence constraints. Because the constructs that declare data accesses are
executed dynamically, this task graph can itself be dynamic and can therefore
express data-dependent concurrency that is available only at run time.
<keywords>parallel programming language; virtual shared memory; coarse grain parallelism
<category>ppt-pplang
<contact>Dan Scales / scales@cs.stanford.edu
</urc>
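
The SAM entry above describes an access model rather than a concrete API. As a
rough illustration of the data relationships it names, the plain C + pthreads
sketch below emulates, on a single node, a single-assignment "value" over a
user-defined type (the producer-consumer relationship) and a lock-protected
accumulator (mutual exclusion). It is not SAM code: the type and function names
are invented for illustration, and SAM's actual primitives provide these
semantics on user-defined types across a distributed-memory machine, with
caching handled by the run-time system.

#include <pthread.h>
#include <stdio.h>

/* A user-defined data type: SAM communicates shared data in units like
   this rather than in fixed-size pages. */
typedef struct {
    double pressure;
    double temperature;
} CellState;

/* Producer-consumer: a single-assignment "value" that is written once
   and may then be read by any number of consumers. */
typedef struct {
    CellState data;
    int ready;
    pthread_mutex_t m;
    pthread_cond_t c;
} Value;

static void value_produce(Value *v, CellState s)
{
    pthread_mutex_lock(&v->m);
    v->data = s;
    v->ready = 1;
    pthread_cond_broadcast(&v->c);
    pthread_mutex_unlock(&v->m);
}

static CellState value_consume(Value *v)      /* blocks until produced */
{
    pthread_mutex_lock(&v->m);
    while (!v->ready)
        pthread_cond_wait(&v->c, &v->m);
    CellState s = v->data;
    pthread_mutex_unlock(&v->m);
    return s;
}

/* Mutual exclusion: an accumulator that many tasks update, one at a time. */
typedef struct {
    double total;
    pthread_mutex_t m;
} Accumulator;

static void accumulate(Accumulator *a, double x)
{
    pthread_mutex_lock(&a->m);
    a->total += x;
    pthread_mutex_unlock(&a->m);
}

static Value boundary = { {0.0, 0.0}, 0,
                          PTHREAD_MUTEX_INITIALIZER, PTHREAD_COND_INITIALIZER };
static Accumulator residual = { 0.0, PTHREAD_MUTEX_INITIALIZER };

static void *producer(void *arg)
{
    (void)arg;
    CellState s = { 1.5, 300.0 };
    value_produce(&boundary, s);        /* publish the boundary state */
    accumulate(&residual, 0.25);        /* contribute to a global sum */
    return NULL;
}

int main(void)
{
    pthread_t t;
    pthread_create(&t, NULL, producer, NULL);
    CellState s = value_consume(&boundary);   /* waits for the producer */
    pthread_join(t, NULL);
    printf("pressure=%g temperature=%g residual=%g\n",
           s.pressure, s.temperature, residual.total);
    return 0;
}

Chaotic (unsynchronized) access, the third relationship named in the abstract,
has no counterpart in this sketch.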
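
The Jade entry describes how tasks declare their accesses to shared objects.
The fragment below is a minimal sketch of that style, using the withonly ... do
construct and the rd()/wr() access declarations as they appear in the published
Jade papers; the variable names are placeholders and shared-object allocation
is elided, so details may differ from the implementation distributed with SAM.

/* Sketch only: syntax follows the published Jade papers; allocation of
   the shared objects a and b is elided. */
void two_stages(double shared *a, double shared *b)
{
    /* Task 1 declares that it will only write the shared object a. */
    withonly { wr(a); } do (a) {
        *a = 42.0;                 /* produce a */
    }

    /* Task 2 declares that it reads a and writes b.  Because it reads
       what task 1 writes, the Jade run-time runs it after task 1,
       preserving the sequential semantics; tasks with no dependence
       would be free to run concurrently. */
    withonly { rd(a); wr(b); } do (a, b) {
        *b = *a * 2.0;             /* consume a, produce b */
    }
}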