Monographs in Computer Science
6 total works
A Basis for Theoretical Computer Science
by M. A. Arbib, A. J. Kfoury, and R. N. Moll
Published 30 September 1981
Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design and programming of computers. The last forty years have seen increasing sophistication in the science, in the microelectronics which has made machines of staggering complexity economically feasible, in the advances in programming methodology which allow immense programs to be designed with increasing speed and reduced error, and in the development of mathematical techniques to allow the rigorous specification of program, process, and machine. The present volume is one of a series, The AKM Series in Theoretical Computer Science, designed to make key mathematical developments in computer science readily accessible to undergraduate and beginning graduate students.
Specifically, this volume takes readers with little or no mathematical background beyond high school algebra, and gives them a taste of a number of topics in theoretical computer science while laying the mathematical foundation for the later, more detailed, study of such topics as formal language theory, computability theory, programming language semantics, and the study of program verification and correctness. Chapter 1 introduces the basic concepts of set theory, with special emphasis on functions and relations, using a simple algorithm to provide motivation. Chapter 2 presents the notion of inductive proof and gives the reader a good grasp on one of the most important notions of computer science: the recursive definition of functions and data structures.
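As an informal illustration of the recursive definitions the blurb mentions (a minimal sketch, not taken from the book), a list's length can be defined by recursion on its structure, mirroring the inductive proofs treated in Chapter 2:

```python
# A minimal sketch (not from the book): a function defined by recursion on
# the structure of a list, the kind of definition Chapter 2 studies.
def length(xs):
    """Length of a list: a base case for the empty list, a recursive case for the rest."""
    if not xs:                     # base case: the empty list has length 0
        return 0
    return 1 + length(xs[1:])      # recursive case: one more than the length of the tail

# An inductive proof of a property of length follows the same two cases.
assert length([]) == 0
assert length(["a", "b", "c"]) == 3
```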
Algebraic Approaches to Program Semantics
by Ernest G. Manes and Michael A. Arbib
Published 10 September 1986
In the 1930s, mathematical logicians studied the notion of "effective computability" using such notions as recursive functions, λ-calculus, and Turing machines. The 1940s saw the construction of the first electronic computers, and the next 20 years saw the evolution of higher-level programming languages in which programs could be written in a convenient fashion independent (thanks to compilers and interpreters) of the architecture of any specific machine. The development of such languages led in turn to the general analysis of questions of syntax, structuring strings of symbols which could count as legal programs, and semantics, determining the "meaning" of a program, for example, as the function it computes in transforming input data to output results. An important approach to semantics, pioneered by Floyd, Hoare, and Wirth, is called assertion semantics: given a specification of which assertions (preconditions) on input data should guarantee that the results satisfy desired assertions (postconditions) on output data, one seeks a logical proof that the program satisfies its specification. An alternative approach, pioneered by Scott and Strachey, is called denotational semantics: it offers algebraic techniques for characterizing the denotation of (i.e., the function computed by) a program; the properties of the program can then be checked by direct comparison of the denotation with the specification. This book is an introduction to denotational semantics. More specifically, we introduce the reader to two approaches to denotational semantics: the order semantics of Scott and Strachey and our own partially additive semantics.
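As a rough illustration of the denotational idea described above (a sketch of our own with an invented expression representation, not the book's order or partially additive semantics), a toy expression language can be mapped to the functions its phrases denote:

```python
# Illustrative sketch (not from the book): a toy denotational semantics.
# Each phrase of a tiny expression language is mapped to the mathematical
# function it denotes: a function from environments (variable -> number)
# to numbers. The tuple representation below is invented for illustration.

def denote(expr):
    """Return the function (environment -> value) denoted by expr."""
    kind = expr[0]
    if kind == "num":                      # a constant denotes a constant function
        _, n = expr
        return lambda env: n
    if kind == "var":                      # a variable denotes a lookup
        _, name = expr
        return lambda env: env[name]
    if kind == "add":                      # a sum denotes pointwise addition of denotations
        _, e1, e2 = expr
        d1, d2 = denote(e1), denote(e2)
        return lambda env: d1(env) + d2(env)
    raise ValueError(f"unknown expression form: {kind}")

# The denotation of "x + 2", applied to an environment in which x = 40:
meaning = denote(("add", ("var", "x"), ("num", 2)))
assert meaning({"x": 40}) == 42
```

The point of the sketch is only that the "meaning" of a compound phrase is built from the meanings of its parts, which is what a denotational semantics makes precise.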
A Programming Approach to Computability
by A. J. Kfoury, Robert N. Moll, and Michael A. Arbib
Published 31 August 1982
Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design and programming of computers. The last 40 years have seen increasing sophistication in the science, in the microelectronics which has made machines of staggering complexity economically feasible, in the advances in programming methodology which allow immense programs to be designed with increasing speed and reduced error, and in the development of mathematical techniques to allow the rigorous specification of program, process, and machine.
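To give a flavour of the while-program model the blurb refers to (an invented toy representation in Python, not the book's actual PASCAL-like syntax), a minimal interpreter for assignment, sequencing, and while-loops might look like this:

```python
# Illustrative sketch (not from the book): an interpreter for a tiny
# "while-program" language of the general kind described above
# (assignment, sequencing, and while-loops over a variable state).
# The tuple-based representation is invented for illustration.

def run(program, state):
    """Execute a while-program, returning the final variable state."""
    kind = program[0]
    if kind == "assign":                   # x := expression(state)
        _, var, expr = program
        state[var] = expr(state)
    elif kind == "seq":                    # run the component statements in order
        for stmt in program[1:]:
            run(stmt, state)
    elif kind == "while":                  # repeat the body while the test holds
        _, test, body = program
        while test(state):
            run(body, state)
    else:
        raise ValueError(f"unknown statement: {kind}")
    return state

# x := 5; y := 1; while x > 0 do begin y := y * x; x := x - 1 end  -- computes 5!
factorial_of_5 = ("seq",
    ("assign", "x", lambda s: 5),
    ("assign", "y", lambda s: 1),
    ("while", lambda s: s["x"] > 0,
        ("seq",
            ("assign", "y", lambda s: s["y"] * s["x"]),
            ("assign", "x", lambda s: s["x"] - 1))))

assert run(factorial_of_5, {})["y"] == 120
```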
An Introduction to Formal Language Theory
by Robert N. Moll, Michael A. Arbib, and A. J. Kfoury
Published 8 August 1988
The study of formal languages and of related families of automata has long been at the core of theoretical computer science. Until recently, the main reasons for this centrality were connected with the specification and analysis of programming languages, which led naturally to the following questions. How might a grammar be written for such a language? How could we check whether a text was or was not a well-formed program generated by that grammar? How could we parse a program to provide the structural analysis needed by a compiler? How could we check for ambiguity to ensure that a program has a unique analysis to be passed to the computer? This focus on programming languages has now been broadened by the increasing concern of computer scientists with designing interfaces which allow humans to communicate with computers in a natural language, at least concerning problems in some well-delimited domain of discourse. The necessary work in computational linguistics draws on studies both within linguistics (the analysis of human languages) and within artificial intelligence. The present volume is the first textbook to combine the topics of formal language theory traditionally taught in the context of programming languages with an introduction to issues in computational linguistics. It is one of a series, The AKM Series in Theoretical Computer Science, designed to make key mathematical developments in computer science readily accessible to undergraduate and beginning graduate students.
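As a small illustration of the membership question raised above (a sketch with an invented recognizer, not material from the book), the grammar S → ( S ) S | ε for balanced parentheses can be checked by recursive descent:

```python
# Illustrative sketch (not from the book): checking whether a string is
# generated by a small context-free grammar. The grammar S -> ( S ) S | epsilon
# describes balanced parentheses; the recognizer is a simple recursive descent.

def parse_s(text, i=0):
    """Try to derive S starting at position i; return the position reached."""
    if i < len(text) and text[i] == "(":
        j = parse_s(text, i + 1)           # inner S
        if j < len(text) and text[j] == ")":
            return parse_s(text, j + 1)    # trailing S
        raise SyntaxError("expected ')'")
    return i                               # S -> epsilon

def is_well_formed(text):
    try:
        return parse_s(text) == len(text)  # accepted only if all input is consumed
    except SyntaxError:
        return False

assert is_well_formed("(()())")
assert not is_well_formed("(()")
```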
The Design of Well-Structured and Correct Programs
by Suad Alagic and Michael A. Arbib
Published 28 March 1978
The major goal of this book is to present the techniques of top-down program design and verification of program correctness hand-in-hand. It thus aims to give readers a new way of looking at algorithms and their design, synthesizing ten years of research in the process. It provides many examples of program and proof development with the aid of a formal and informal treatment of Hoare's method of invariants. Modern widely accepted control structures and data structures are explained in detail, together with their formal definitions, as a basis for their use in the design of correct algorithms. We provide and apply proof rules for a wide range of program structures, including conditionals, loops, procedures and recursion. We analyze situations in which the restricted use of gotos can be justified, providing a new approach to proof rules for such situations. We study several important techniques of data structuring, including arrays, files, records and linked structures. The secondary goal of this book is to teach the reader how to use the programming language Pascal. This is the first text to teach Pascal programming in a fashion which not only includes advanced algorithms which operate on advanced data structures, but also provides the full axiomatic definition of Pascal due to Wirth and Hoare. Our approach to the language is very different from that of a conventional programming text.
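To illustrate Hoare's method of invariants mentioned above (in Python rather than the book's Pascal, with a loop, invariant, and assertions of our own invention), consider summing the integers up to n:

```python
# Illustrative sketch (not from the book): Hoare-style reasoning about a loop,
# with the invariant written down and checked at run time via assertions.
# Precondition:  n >= 0
# Invariant:     s == 0 + 1 + ... + (i - 1), i.e. s == i * (i - 1) // 2
# Postcondition: s == n * (n + 1) // 2

def sum_up_to(n):
    assert n >= 0                           # precondition
    s, i = 0, 0
    while i <= n:
        assert s == i * (i - 1) // 2        # invariant holds at the top of each iteration
        s += i
        i += 1
    assert s == n * (n + 1) // 2            # postcondition: invariant plus the exit test
    return s

assert sum_up_to(10) == 55
```

A proof rule for the while-loop would discharge the run-time assertions once and for all: establish the invariant before the loop, show the body preserves it, and combine it with the negated loop test on exit.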
This book presents a unified collection of concepts, tools, and techniques that constitute the most important technology available today for the design and implementation of information systems. The framework adopted for this integration goal is the one offered by the relational model of data, its applications, and implementations in multiuser and distributed environments. The topics presented in the book include conceptual modeling of application environments using the relational model, formal properties of that model, and tools such as relational languages which go with it, techniques for the logical and physical design of relational database systems and their implementations. The book attempts to develop an integrated methodology for addressing all these issues on the basis of the relational approach and various research and practical developments related to that approach. This book is the only one available today that presents such an integration. The diversity of approaches to data models, to logical and physical database design, to database application programming, and to use and implementation of database systems calls for a common framework for all of them. It has become difficult to study modern database technology without such a unified approach to a diversity of results developed during the vigorous growth of the database area in recent years, let alone to teach a course on the subject.
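As an informal illustration of the relational operations the description alludes to (table and column names are invented for the sketch; this is not code from the book), selection and projection over relations treated as sets of rows might look like this:

```python
# Illustrative sketch (not from the book): relations as lists of rows, with
# selection and projection written as small functions. All names are invented.

def select(relation, predicate):
    """Selection: keep the rows satisfying the predicate."""
    return [row for row in relation if predicate(row)]

def project(relation, columns):
    """Projection: keep only the named columns, removing duplicate rows."""
    seen, result = set(), []
    for row in relation:
        slim = tuple((c, row[c]) for c in columns)
        if slim not in seen:
            seen.add(slim)
            result.append(dict(slim))
    return result

employees = [
    {"name": "Ada",   "dept": "R&D",   "salary": 120},
    {"name": "Grace", "dept": "R&D",   "salary": 130},
    {"name": "Edgar", "dept": "Sales", "salary": 90},
]

# "Names of employees in R&D" as a select-then-project query:
rd_names = project(select(employees, lambda r: r["dept"] == "R&D"), ["name"])
assert rd_names == [{"name": "Ada"}, {"name": "Grace"}]
```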