This report includes information concerning experimental use of unregistered pesticides or unregistered uses of pesticides. Experimental results should not be interpreted as recommendations for use. Use of unregistered materials, or use of any registered pesticide in a manner inconsistent with its label, violates both federal and state law.
Topological sorting is a standard computation on finite partial orders, for which efficient algorithms are well known.
This work studies the use of depth-first search on a directed graph to implement topological sorting algorithms. A new algorithm is presented, along with a discussion of how it...
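The abstract does not reproduce the algorithm itself; as a minimal sketch of the standard depth-first-search approach to topological sorting that it builds on (the function name and adjacency-list representation are my own, not from the report):

```python
def topological_sort(graph):
    """Return the vertices of a DAG in topological order via depth-first search.

    graph: dict mapping each vertex to a list of its successors.
    """
    order = []   # vertices in order of DFS finish time
    state = {}   # vertex -> "visiting" (on the stack) or "done"

    def visit(v):
        if state.get(v) == "done":
            return
        if state.get(v) == "visiting":
            raise ValueError("cycle detected; no topological order exists")
        state[v] = "visiting"
        for w in graph.get(v, []):
            visit(w)
        state[v] = "done"
        order.append(v)   # v finishes only after all its successors

    for v in graph:
        visit(v)
    order.reverse()       # reverse finish order is a topological order
    return order
```

For example, `topological_sort({"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []})` places `"a"` first and `"d"` last, since reversing DFS finish times guarantees every edge points forward in the result.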
This grant supported acquisition of a minicomputer system for departmental research. The equipment selected is a DEC VAX-11/750 system, installed in remodeled space in the Computer Science (formerly Farm Crops) Building. Grant funds for equipment acquisition were supplemented by support from the Tektronix Foundation.
After completion of the physical facilities...
A theory for resiliency control of distributed database systems is developed, and schemes that preserve the consistent global database state in the presence of site crashes and message link failures are discussed. The schemes can at least theoretically be paired with any concurrency control scheme that produces y-serializable executions. Further,...
A typical database system maintains target data, which contain information useful for users, and access path data, which facilitate faster accesses to target data. Further, most large database systems support concurrent processing of multiple transactions. For a static database system model, where units of concurrency control are not dynamically created...
A concurrency and resiliency control scheme for a distributed database system with replicated data is discussed. The scheme, called the true-copy token scheme, uses true-copy tokens to designate the physical data copies (true copies) that can be identified with the current, globally unique logical data, and then it...
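As a highly simplified illustration of the core idea, not the report's actual protocol: a single token per logical item designates which physical replica is the current true copy, so only the token holder may serve reads or accept writes. All class and method names below are my own, and the sketch omits the crash-recovery and concurrency-control machinery the scheme actually addresses:

```python
class ReplicatedItem:
    """One logical data item replicated at several sites.

    A single true-copy token designates the current physical copy;
    only the token holder serves reads and writes, which keeps the
    logical data globally unique even though copies are replicated.
    """

    def __init__(self, sites, initial_value=None):
        self.copies = {s: initial_value for s in sites}
        self.token_holder = sites[0]   # token starts at an arbitrary site

    def read(self, site):
        if site != self.token_holder:
            raise PermissionError(f"{site} does not hold the true-copy token")
        return self.copies[site]

    def write(self, site, value):
        if site != self.token_holder:
            raise PermissionError(f"{site} does not hold the true-copy token")
        self.copies[site] = value      # only the true copy is updated

    def migrate_token(self, new_site):
        # moving the token carries the current value, so the new
        # holder's physical copy becomes the true copy
        self.copies[new_site] = self.copies[self.token_holder]
        self.token_holder = new_site
```

Stale replicas at non-holder sites simply refuse access until the token (and with it the current value) migrates to them.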
A software system for studying network algorithms was implemented on UNIX. Each part of a network algorithm can be written as a single C program, which becomes a virtual node in the network. During a simulation, all virtual nodes run as separate processes on a single PDP-11/44. Inter-node communication...
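The report's nodes are C programs on a PDP-11/44; purely to illustrate the design (one OS process per virtual node, messages over an inter-process channel), here is an analogous sketch in Python using `multiprocessing` pipes. The node behavior and names are invented for the example:

```python
from multiprocessing import Pipe, Process


def node(node_id, conn):
    """One virtual network node, running as its own OS process."""
    msg = conn.recv()                        # block until a neighbor sends
    conn.send(f"node {node_id} saw {msg!r}")  # reply over the same link
    conn.close()


if __name__ == "__main__":
    parent_end, child_end = Pipe()            # one inter-node "link"
    p = Process(target=node, args=(1, child_end))
    p.start()                                 # the node is now a real process
    parent_end.send("hello")
    print(parent_end.recv())
    p.join()
```

Running each node as a separate process, as in the original system, means the algorithm code needs no simulation scaffolding: a node just reads and writes its links.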
A computer program complexity measure gauges how easy a program is to understand, test, modify, and maintain. Many of these measures are derived from the control (flow) graph of the program. We describe these measures graph-theoretically, indicate what aspect or aspects of the program they...
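One classic example of a measure derived from the control flow graph is McCabe's cyclomatic complexity, V(G) = E - N + 2P. A minimal sketch (the function and the example graph are mine, chosen to illustrate the formula, not taken from the report):

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic complexity V(G) = E - N + 2P of a flow graph.

    edges: iterable of (u, v) arcs; nodes: iterable of node names;
    components: P, the number of connected components (1 for one program).
    """
    return len(list(edges)) - len(list(nodes)) + 2 * components


# Flow graph of a single if/else: entry branches to two arms, both rejoin.
nodes = ["entry", "then", "else", "exit"]
edges = [("entry", "then"), ("entry", "else"),
         ("then", "exit"), ("else", "exit")]
# E=4, N=4, P=1, so V(G) = 4 - 4 + 2 = 2: two linearly independent paths.
```

Straight-line code (N nodes, N-1 edges, one component) always yields V(G) = 1; each additional decision point adds one.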