List of important publications in theoretical computer science

This is a list of important publications in theoretical computer science, organized by field.

Some reasons why a particular publication might be regarded as important:

  • Topic creator – A publication that created a new topic
  • Breakthrough – A publication that changed scientific knowledge significantly
  • Influence – A publication which has significantly influenced the world or has had a massive impact on the teaching of theoretical computer science.

Computability

Cutland's Computability: An Introduction to Recursive Function Theory (Cambridge)

  • Cutland, Nigel J. (1980). Computability: An Introduction to Recursive Function Theory. Cambridge University Press. ISBN 978-0-521-29465-2.

The review of this early text by Carl Smith of Purdue University (in the Society for Industrial and Applied Mathematics Reviews),[1] reports that this is a text with an "appropriate blend of intuition and rigor… in the exposition of proofs" that presents "the fundamental results of classical recursion theory [RT]... in a style... accessible to undergraduates with minimal mathematical background". While he states that it "would make an excellent introductory text for an introductory course in [RT] for mathematics students", he suggests that an "instructor must be prepared to substantially augment the material… " when it is used with computer science students (given a dearth of material on RT applications to this area).[1]

Decidability of second order theories and automata on infinite trees

Description: The paper presented the tree automaton, an extension of finite automata to infinite trees, and used it to establish the decidability of the monadic second-order theory of two successors. Tree automata have had numerous applications to proofs of correctness of programs.

Finite automata and their decision problems

Description: Mathematical treatment of automata, proof of core properties, and definition of non-deterministic finite automaton.

Introduction to Automata Theory, Languages, and Computation

Description: A popular textbook.

On certain formal properties of grammars

Description: This article introduced what is now known as the Chomsky hierarchy, a containment hierarchy of classes of formal grammars that generate formal languages.
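
In now-standard terminology (not the paper's original notation), the hierarchy consists of four strictly nested classes of languages:

    \text{Type 3 (regular)} \subsetneq \text{Type 2 (context-free)}
    \subsetneq \text{Type 1 (context-sensitive)} \subsetneq \text{Type 0 (recursively enumerable)}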

On computable numbers, with an application to the Entscheidungsproblem

Description: This article set out the limits of computation. It defined the Turing machine, a model for all computations, and proved the undecidability of the halting problem and of the Entscheidungsproblem, thereby finding the limits of possible computation.
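
The core of the undecidability proof is a diagonal argument. The sketch below is an illustration in modern terms, not Turing's notation; the halts oracle is hypothetical and is assumed only to be refuted:

    # Sketch of the diagonal argument for the halting problem.
    # `halts` is a hypothetical oracle: by Turing's argument,
    # no such total computable function can exist.

    def halts(program, argument) -> bool:
        """Hypothetically: True iff program(argument) terminates."""
        raise NotImplementedError  # cannot actually be implemented

    def diagonal(program):
        # Do the opposite of whatever `halts` predicts on (program, program).
        if halts(program, program):
            while True:      # predicted to halt: loop forever instead
                pass
        return               # predicted to loop: halt immediately

    # diagonal(diagonal) halts if and only if it does not halt:
    # a contradiction, so `halts` cannot exist.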

Rekursive Funktionen

The first textbook on the theory of recursive functions. The book went through many editions and earned Péter the Kossuth Prize from the Hungarian government.[2] Reviews by Raphael M. Robinson and Stephen Kleene praised the book for providing an effective elementary introduction for students.[3]

Representation of Events in Nerve Nets and Finite Automata

Description: This paper introduced finite automata, regular expressions, and regular languages, and established their connection.
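
As a minimal illustration of these notions (not drawn from the paper), the following deterministic finite automaton accepts exactly the regular language of binary strings containing an even number of 1s, which is also denoted by the regular expression (0*10*1)*0*:

    # A deterministic finite automaton (DFA) over {0, 1} accepting
    # strings with an even number of 1s.

    TRANSITIONS = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd",  "0"): "odd",  ("odd",  "1"): "even",
    }

    def accepts(word: str) -> bool:
        state = "even"                      # start state
        for symbol in word:
            state = TRANSITIONS[(state, symbol)]
        return state == "even"              # accepting state

    assert accepts("0110") and not accepts("010")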

Computational complexity theory

Arora & Barak's Computational Complexity and Goldreich's Computational Complexity (both Cambridge)

  • Sanjeev Arora and Boaz Barak, "Computational Complexity: A Modern Approach", Cambridge University Press, 2009, 579 pages, Hardcover
  • Oded Goldreich, "Computational Complexity: A Conceptual Perspective", Cambridge University Press, 2008, 606 pages, Hardcover

Besides being brought forward by an estimable press, these recent texts are very positively reviewed in ACM's SIGACT News by Daniel Apon of the University of Arkansas,[4] who identifies them as "textbooks for a course in complexity theory, aimed at early graduate… or... advanced undergraduate students… [with] numerous, unique strengths and very few weaknesses," and states that both are:

"excellent texts that thoroughly cover both the breadth and depth of computational complexity theory… [by] authors... each [who] are giants in theory of computing [where each will be] ...an exceptional reference text for experts in the field… [and that] ...theorists, researchers and instructors of any school of thought will find either book useful."

The reviewer notes that there is "a definite attempt in [Arora and Barak] to include very up-to-date material, while Goldreich focuses more on developing a contextual and historical foundation for each concept presented," and that he "applaud[s] all… authors for their outstanding contributions."[4]

A machine-independent theory of the complexity of recursive functions

Description: The Blum axioms.

Algebraic methods for interactive proof systems

Description: This paper showed that PH is contained in IP.

The complexity of theorem proving procedures

Description: This paper introduced the concept of NP-Completeness and proved that the Boolean satisfiability problem (SAT) is NP-Complete. Note that similar ideas were developed independently slightly later by Leonid Levin in "Universal Search Problems", Problemy Peredachi Informatsii 9(3):265–266, 1973.

Computers and Intractability: A Guide to the Theory of NP-Completeness

Description: The main importance of this book is its extensive list of more than 300 NP-Complete problems, which became a common reference and definition. Remarkably, such an extensive list was compiled only a few years after the concept was first defined.

Degree of difficulty of computing a function and a partial ordering of recursive sets

Description: This technical report was the first publication on what was later renamed computational complexity.[5]

How good is the simplex method?

  • Victor Klee and George J. Minty
  • Klee, Victor; Minty, George J. (1972). "How good is the simplex algorithm?". In Shisha, Oved (ed.). Inequalities III (Proceedings of the Third Symposium on Inequalities held at the University of California, Los Angeles, Calif., September 1–9, 1969, dedicated to the memory of Theodore S. Motzkin). New York-London: Academic Press. pp. 159–175. MR 0332165.

Description: Constructed the "Klee–Minty cube" in dimension D, whose 2^D corners are each visited by Dantzig's simplex algorithm for linear optimization.
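
One common textbook formulation of the Klee–Minty cube (later notation, not the paper's original) is the linear program, with a fixed 0 < ε < 1/2:

    \max \; x_D \quad \text{subject to} \quad
    0 \le x_1 \le 1, \qquad
    \varepsilon x_{j-1} \le x_j \le 1 - \varepsilon x_{j-1} \quad (j = 2, \dots, D)

The feasible region is a perturbed unit cube with 2^D vertices, and a suitably initialized simplex method with a natural pivot rule visits all of them, giving exponential worst-case running time.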

How to construct random functions

Description: This paper showed that the existence of one-way functions leads to computational randomness.

IP = PSPACE

Description: IP is a complexity class whose characterization (based on interactive proof systems) is quite different from the usual time/space bounded computational classes. In this paper, Shamir extended the technique of the previous paper by Lund, et al., to show that PSPACE is contained in IP, and hence IP = PSPACE, so that each problem in one complexity class is solvable in the other.

Reducibility among combinatorial problems

  • R. M. Karp
  • In R. E. Miller and J. W. Thatcher, editors, Complexity of Computer Computations, Plenum Press, New York, NY, 1972, pp. 85–103

Description: This paper showed that 21 different problems are NP-Complete and showed the importance of the concept.

The Knowledge Complexity of Interactive Proof Systems

Description: This paper introduced the concept of zero knowledge.[6]

A letter from Gödel to von Neumann

Description: Gödel discusses the idea of an efficient universal theorem prover.

On the computational complexity of algorithms

Description: This paper gave computational complexity its name and served as the seed of the field.

Paths, trees, and flowers

Description: Gave a polynomial-time algorithm to find a maximum matching in a graph that is not bipartite, and took another step toward the idea of computational complexity.

Theory and applications of trapdoor functions

Description: This paper created a theoretical framework for trapdoor functions and described some of their applications, such as in cryptography. Note that the concept of trapdoor functions was introduced in "New directions in cryptography" six years earlier (see section V, "Problem Interrelationships and Trap Doors").

Computational Complexity

Description: An introduction to computational complexity theory; the book explains its author's characterization of PSPACE and other results.

Interactive proofs and the hardness of approximating cliques

Probabilistic checking of proofs: a new characterization of NP

Proof verification and the hardness of approximation problems

Description: These three papers established the surprising fact that certain problems in NP remain hard even when only an approximative solution is required. See PCP theorem.

The Intrinsic Computational Difficulty of Functions

Description: First definition of the complexity class P. One of the founding papers of complexity theory.

Algorithms

"A machine program for theorem proving"

Description: The DPLL algorithm. The basic algorithm for SAT and other NP-Complete problems.
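
A compact sketch of the procedure (illustrative; the clause representation and branching choice are simplifications, and the original's pure-literal rule is omitted):

    # DPLL sketch: a formula is a list of clauses; a clause is a set of
    # nonzero ints, where a negative int denotes a negated variable.

    def dpll(clauses, assignment=frozenset()):
        clauses = [set(c) for c in clauses]
        while True:                         # unit propagation
            unit = next((next(iter(c)) for c in clauses if len(c) == 1), None)
            if unit is None:
                break
            assignment |= {unit}
            clauses = [c - {-unit} for c in clauses if unit not in c]
            if any(not c for c in clauses):
                return None                 # empty clause: conflict
        if not clauses:
            return assignment               # all clauses satisfied
        lit = next(iter(clauses[0]))        # branch on some literal
        for choice in (lit, -lit):
            result = dpll(clauses + [{choice}], assignment)
            if result is not None:
                return result
        return None

    # (x1 or x2) and (not x1 or x2): satisfiable, e.g. with x2 = True.
    print(dpll([{1, 2}, {-1, 2}]))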

"A machine-oriented logic based on the resolution principle"

Description: First description of resolution and unification used in automated theorem proving; used in Prolog and logic programming.

"The traveling-salesman problem and minimum spanning trees"

Description: The use of an algorithm for minimum spanning tree as an approximation algorithm for the NP-Complete travelling salesman problem. Approximation algorithms became a common method for coping with NP-Complete problems.
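
The best-known form of this idea is the "double-tree" 2-approximation for metric TSP: build a minimum spanning tree, then shortcut a depth-first traversal into a tour; the triangle inequality bounds the tour at twice the optimum. The sketch below illustrates that idea (it is not the paper's exact procedure, which uses minimum spanning trees to compute lower bounds):

    # Metric-TSP 2-approximation sketch: MST (Prim) + preorder walk.

    import math

    def approx_tour(points):
        n = len(points)
        dist = lambda i, j: math.dist(points[i], points[j])
        parent = {0: None}
        best = {v: (dist(0, v), 0) for v in range(1, n)}
        while best:                          # Prim's algorithm
            v = min(best, key=lambda u: best[u][0])
            parent[v] = best.pop(v)[1]
            for u in best:
                if dist(v, u) < best[u][0]:
                    best[u] = (dist(v, u), v)
        children = {v: [] for v in range(n)}
        for v, p in parent.items():
            if p is not None:
                children[p].append(v)
        order, stack = [], [0]               # preorder walk of the MST
        while stack:
            v = stack.pop()
            order.append(v)
            stack.extend(reversed(children[v]))
        return order                         # visit cities in this order

    print(approx_tour([(0, 0), (0, 1), (1, 1), (1, 0)]))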

"A polynomial algorithm in linear programming"

Description: For a long time there was no provably polynomial-time algorithm for the linear programming problem. Khachiyan was the first to provide an algorithm that was polynomial (and not just fast enough most of the time, as previous algorithms were). Later, Narendra Karmarkar presented a faster algorithm in: Narendra Karmarkar, "A new polynomial time algorithm for linear programming", Combinatorica, vol 4, no. 4, p. 373–395, 1984.

"Probabilistic algorithm for testing primality"

Description: The paper presented the Miller–Rabin randomized primality test and outlined the broader program of randomized algorithms.
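
A compact sketch of the test (illustrative; each random base catches a composite with probability at least 3/4, so 20 rounds give error probability at most 4^-20):

    # Miller–Rabin probabilistic primality test (sketch).

    import random

    def is_probable_prime(n: int, rounds: int = 20) -> bool:
        if n < 2:
            return False
        if n in (2, 3):
            return True
        if n % 2 == 0:
            return False
        s, d = 0, n - 1                 # write n - 1 = 2**s * d, d odd
        while d % 2 == 0:
            s, d = s + 1, d // 2
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False            # a is a witness: n is composite
        return True                     # probably prime

    assert is_probable_prime(2**61 - 1)      # a Mersenne prime
    assert not is_probable_prime(2**61 + 1)  # divisible by 3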

"Optimization by simulated annealing"

Description: This article described simulated annealing which is now a very common heuristic for NP-Complete problems.
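
A minimal sketch of the heuristic, with illustrative (not canonical) choices of neighborhood, cooling schedule, and objective:

    # Simulated annealing: accept worse moves with probability
    # exp(-delta / temperature); cool the temperature geometrically.

    import math, random

    def anneal(f, x, temp=1.0, cooling=0.995, steps=10_000):
        best = x
        for _ in range(steps):
            candidate = x + random.uniform(-1, 1)     # random neighbor
            delta = f(candidate) - f(x)
            if delta < 0 or random.random() < math.exp(-delta / temp):
                x = candidate                         # accept the move
                if f(x) < f(best):
                    best = x
            temp *= cooling
        return best

    # A bumpy one-dimensional objective with many local minima.
    print(anneal(lambda x: x * x + 3 * math.sin(5 * x), x=10.0))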

The Art of Computer Programming

Description: This monograph has four volumes covering popular algorithms. The algorithms are written in both English and MIX assembly language (or MMIX assembly language in more recent fascicles). This makes algorithms both understandable and precise. However, the use of a low-level programming language frustrates some programmers more familiar with modern structured programming languages.

Algorithms + Data Structures = Programs

Description: An early, influential book on algorithms and data structures, with implementations in Pascal.

The Design and Analysis of Computer Algorithms

Description: One of the standard texts on algorithms for the period of approximately 1975–1985.

How to Solve It By Computer

Description: Explains the whys of algorithms and data structures: the creative process, the line of reasoning, and the design factors behind innovative solutions.

Algorithms

Description: A very popular text on algorithms in the late 1980s. It was more accessible and readable (but more elementary) than Aho, Hopcroft, and Ullman. There are more recent editions.

Introduction to Algorithms

Description: This textbook has become so popular that it is almost the de facto standard for teaching basic algorithms. The 1st edition (with first three authors) was published in 1990, the 2nd edition in 2001, and the 3rd in 2009.

Algorithmic information theory

"On Tables of Random Numbers"

Description: Proposed a computational and combinatorial approach to probability.

"A formal theory of inductive inference"

Description: This was the beginning of algorithmic information theory and Kolmogorov complexity. Note that although Kolmogorov complexity is named after Andrey Kolmogorov, he himself said that the seeds of the idea are due to Ray Solomonoff. Kolmogorov contributed much to the area, but in later articles.

"Algorithmic information theory"

Description: An introduction to algorithmic information theory by one of the important people in the area.

Information theory

"A mathematical theory of communication"

Description: This paper created the field of information theory.

"Error detecting and error correcting codes"

Description: In this paper, Hamming introduced the idea of error-correcting code. He created the Hamming code and the Hamming distance and developed methods for code optimality proofs.
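
An illustrative sketch of the two central notions; the (7,4) encoder below is the classic Hamming code with parity bits in the standard power-of-two positions:

    # Hamming distance, plus a (7,4) Hamming code encoder: 4 data bits
    # gain 3 parity bits, so any single-bit error can be corrected.

    def hamming_distance(a: str, b: str) -> int:
        assert len(a) == len(b)         # defined for equal-length words
        return sum(x != y for x, y in zip(a, b))

    def encode_7_4(d1, d2, d3, d4):
        p1 = d1 ^ d2 ^ d4               # checks positions 1, 3, 5, 7
        p2 = d1 ^ d3 ^ d4               # checks positions 2, 3, 6, 7
        p3 = d2 ^ d3 ^ d4               # checks positions 4, 5, 6, 7
        return [p1, p2, d1, p3, d2, d3, d4]

    assert hamming_distance("10101", "11100") == 2
    print(encode_7_4(1, 0, 1, 1))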

"A method for the construction of minimum redundancy codes"

Description: Introduced Huffman coding, an optimal prefix code built from symbol frequencies.
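
A minimal construction sketch (illustrative) using a binary heap: repeatedly merge the two least frequent subtrees; the root-to-leaf path spells each symbol's code:

    # Huffman coding sketch: build optimal prefix codes from frequencies.

    import heapq
    from collections import Counter

    def huffman_codes(text: str) -> dict:
        # Heap entries: (frequency, tiebreak, {symbol: code_so_far}).
        heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(Counter(text).items())]
        heapq.heapify(heap)
        tiebreak = len(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in left.items()}
            merged.update({s: "1" + c for s, c in right.items()})
            heapq.heappush(heap, (f1 + f2, tiebreak, merged))
            tiebreak += 1
        return heap[0][2]

    print(huffman_codes("abracadabra"))  # frequent symbols get shorter codes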

"A universal algorithm for sequential data compression"

Description: The LZ77 compression algorithm.
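
A toy sketch of the scheme (illustrative; real implementations index the window far more efficiently): emit (offset, length, next character) triples by finding the longest match in a sliding window of already-seen text:

    # LZ77 sketch: naive longest-match search over a sliding window.

    def lz77(data: str, window: int = 255):
        i, out = 0, []
        while i < len(data):
            offset = length = 0
            for j in range(max(0, i - window), i):
                k = 0
                while i + k < len(data) - 1 and data[j + k] == data[i + k]:
                    k += 1              # matches may overlap position i
                if k > length:
                    offset, length = i - j, k
            out.append((offset, length, data[i + length]))
            i += length + 1
        return out

    print(lz77("aacaacabcabaaac"))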

Elements of Information Theory

Description: A popular introduction to information theory.

Formal verification

Assigning Meanings to Programs

Description: Robert Floyd's landmark paper Assigning Meanings to Programs introduces the method of inductive assertions and describes how a program annotated with first-order assertions may be shown to satisfy a pre- and post-condition specification; the paper also introduces the concepts of loop invariant and verification condition.
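
As a small illustration of the method (runtime assertions standing in for Floyd's annotations; the example is not from the paper):

    # Inductive assertions for a loop summing 0 .. n-1.

    def sum_upto(n: int) -> int:
        assert n >= 0                          # precondition
        total, i = 0, 0
        while i < n:
            assert total == i * (i - 1) // 2   # loop invariant
            total += i
            i += 1
        assert total == n * (n - 1) // 2       # postcondition
        return total

    sum_upto(10)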

An Axiomatic Basis for Computer Programming

Description: Tony Hoare's paper An Axiomatic Basis for Computer Programming describes a set of inference (i.e. formal proof) rules for fragments of an Algol-like programming language described in terms of (what are now called) Hoare-triples.
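
In this notation, a triple {P} S {Q} asserts that if P holds before S executes and S terminates, then Q holds afterwards. Two representative rules, written in standard later notation rather than the paper's original:

    \{Q[e/x]\}\; x := e \;\{Q\}
    \qquad\qquad
    \frac{\{P \land B\}\; S \;\{P\}}
         {\{P\}\; \mathbf{while}\ B\ \mathbf{do}\ S \;\{P \land \lnot B\}}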

Guarded Commands, Nondeterminacy and Formal Derivation of Programs

Description: Edsger Dijkstra's paper Guarded Commands, Nondeterminacy and Formal Derivation of Programs (expanded by his 1976 postgraduate-level textbook A Discipline of Programming) proposes that, instead of formally verifying a program after it has been written (i.e. post facto), programs and their formal proofs should be developed hand-in-hand (using predicate transformers to progressively refine weakest pre-conditions), a method known as program (or formal) refinement (or derivation), or sometimes "correctness-by-construction".
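
Representative weakest-precondition equations in standard notation, where wp(S, Q) is the weakest condition guaranteeing that S terminates in a state satisfying Q:

    wp(x := e,\ Q) = Q[e/x]
    \qquad
    wp(S_1 ; S_2,\ Q) = wp(S_1,\ wp(S_2,\ Q))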

Proving Assertions about Parallel Programs

Description: The paper that introduced invariance proofs of concurrent programs.

An Axiomatic Proof Technique for Parallel Programs I

Description: In this paper, along with the same authors' paper "Verifying Properties of Parallel Programs: An Axiomatic Approach", Commun. ACM 19(5): 279–285 (1976), the axiomatic approach to the verification of parallel programs was presented.

A Discipline of Programming

Description: Edsger Dijkstra's classic postgraduate-level textbook A Discipline of Programming extends his earlier paper Guarded Commands, Nondeterminacy and Formal Derivation of Programs and firmly establishes the principle of formally deriving programs (and their proofs) from their specification.

Denotational Semantics

Description: Joe Stoy's Denotational Semantics is the first (postgraduate level) book-length exposition of the mathematical (or functional) approach to the formal semantics of programming languages (in contrast to the operational and algebraic approaches).

The Temporal Logic of Programs

Description: The use of temporal logic was suggested as a method for formal verification.

Characterizing correctness properties of parallel programs using fixpoints (1980)

Description: Model checking was introduced as a procedure to check correctness of concurrent programs.

Communicating Sequential Processes (1978)

Description: Tony Hoare's (original) communicating sequential processes (CSP) paper introduces the idea of concurrent processes (i.e. programs) that do not share variables but instead cooperate solely by exchanging synchronous messages.

A Calculus of Communicating Systems

Description: Robin Milner's A Calculus of Communicating Systems (CCS) paper describes a process algebra permitting systems of concurrent processes to be reasoned about formally, something which had not been possible for earlier models of concurrency (semaphores, critical sections, the original CSP).

Software Development: A Rigorous Approach

Description: Cliff Jones' textbook Software Development: A Rigorous Approach is the first full-length exposition of the Vienna Development Method (VDM), which had evolved (principally) at IBM's Vienna research lab over the previous decade and which combines the idea of program refinement as per Dijkstra with that of data refinement (or reification), whereby algebraically defined abstract data types are formally transformed into progressively more "concrete" representations.

The Science of Programming

Description: David Gries' textbook The Science of Programming describes Dijkstra's weakest-precondition method of formal program derivation, but in a much more accessible manner than Dijkstra's earlier A Discipline of Programming.

It shows how to construct programs that work correctly (without bugs, other than from typing errors). It does this by showing how to use precondition and postcondition predicate expressions and program proving techniques to guide the way programs are created.

The examples in the book are all small-scale, and clearly academic (as opposed to real-world). They emphasize basic algorithms, such as sorting and merging, and string manipulation. Subroutines (functions) are included, but object-oriented and functional programming environments are not addressed.

Communicating Sequential Processes (1985)

Description: Tony Hoare's Communicating Sequential Processes (CSP) textbook (currently the third most cited computer science reference of all time) presents an updated CSP model in which cooperating processes do not even have program variables and which, like CCS, permits systems of processes to be reasoned about formally.

Linear logic (1987)

Description: Girard's linear logic was a breakthrough in designing typing systems for sequential and concurrent computation, especially for resource conscious typing systems.

A Calculus of Mobile Processes (1989)

Description: This paper introduces the Pi-Calculus, a generalisation of CCS which allows process mobility. The calculus is extremely simple and has become the dominant paradigm in the theoretical study of programming languages, typing systems and program logics.

The Z Notation: A Reference Manual

Description: Mike Spivey's classic textbook The Z Notation: A Reference Manual summarises the formal specification language Z notation which, although originated by Jean-Raymond Abrial, had evolved (principally) at Oxford University over the previous decade.

Communication and Concurrency

Description: Robin Milner's textbook Communication and Concurrency is a more accessible, although still technically advanced, exposition of his earlier CCS work.

a Practical Theory of Programming

Description: The up-to-date version of predicative programming, and the basis for C.A.R. Hoare's UTP (Unifying Theories of Programming). Presented by its author as one of the simplest and most comprehensive formal methods.

References

  1. Smith, Carl H. (1982). "Computability: An Introduction to Recursive Function Theory (N. J. Cutland)". SIAM Review. 24: 98. doi:10.1137/1024029.
  2. "Rózsa Péter: Founder of Recursive Function Theory". Women in Science: A Selection of 16 Contributors. San Diego Supercomputer Center. 1997. Retrieved 23 August 2017.
  3. "Reviews of Rózsa Péter's books". www-history.mcs.st-andrews.ac.uk. Retrieved 29 August 2017.
  4. Daniel Apon, 2010, "Joint Review of Computational Complexity: A Conceptual Perspective by Oded Goldreich… and Computational Complexity: A Modern Approach by Sanjeev Arora and Boaz Barak…," ACM SIGACT News, Vol. 41(4), December 2010, pp. 12–15, accessed 1 February 2015.
  5. Shasha, Dennis, "An Interview with Michael O. Rabin", Communications of the ACM, Vol. 53 No. 2, Pages 37–42, February 2010.
  6. SIGACT 2011