ECO中文网

Title: 1993 Richard Stearns

Author: shiyi18    Time: 2022-4-16 17:47
Title: 1993 Richard Stearns
Richard E. Stearns
BIRTH:
July 5, 1936 in Caldwell, New Jersey

EDUCATION:
B.A. in Mathematics (Carleton College, 1958); Ph.D. in Mathematics (Princeton University, 1961).

EXPERIENCE:
General Electric Research Laboratory (1961-1978); State University of New York at Albany (1978-2000; Chair, Computer Science Department, 1982-1989); Editor, SIAM Journal on Computing (1972-1988); Member-at-Large, SIGACT Executive Committee (1973-1975); Visiting Professor, Hebrew University (1975); Adjunct Professor, Rensselaer Polytechnic Institute (1977-1978); Visitor, Mathematical Sciences Research Institute (1985).

HONORS AND AWARDS:
Lanchester Prize in Operations Research, co-winner (1977); ACM Alan M. Turing Award (1993); ACM Fellow (1994); Distinguished Professor, State University of New York (1994).



RICHARD ("DICK") EDWIN STEARNS
United States – 1993
CITATION
With Juris Hartmanis, in recognition of their seminal paper which established the foundations for the field of computational complexity theory.



Richard (“Dick”) Edwin Stearns, born on July 5, 1936 in Caldwell, New Jersey, was the son of Dr. Edwin I. Stearns and Winifred T. Scales. He married Charlotte A. Reed in 1963. They have two grown children, Chris R. Stearns and Dr. Elizabeth R. Gumustop.

Stearns received a B.A. in Mathematics from Carleton College in 1958. In 1961, he received a Ph.D. in Mathematics from Princeton University. His thesis, Three Person Cooperative Games without Side Payments, was done under the supervision of Harold W. Kuhn with mentoring from Robert J. Aumann.

In the summer of 1960 Stearns worked at the General Electric Research Laboratory in Schenectady, N.Y., where he and Juris Hartmanis started joint work on the state assignment problem. After receiving his Ph.D. in 1961, Stearns joined Hartmanis as a permanent employee of the GE Research Laboratory’s Information Studies Branch, managed and led by Richard L. Shuey. The atmosphere at the Research Lab encouraged the free, unbounded kind of research they both enjoyed.

Stearns describes his first exposure to computers and to computer science.        
Stearns and Hartmanis initially worked on the decomposition of sequential machines: how models of simple computers can be decomposed into a set of smaller sequential machines that accomplish the same tasks. They published several papers on the subject, and in 1966 summarized their work in a book [1]. They later turned to computational complexity, and in 1965 published the paper for which they received the Turing Award [4].

The seminal paper “On the Computational Complexity of Algorithms” [4] mentioned in the citation for the Turing Award was written with Juris Hartmanis in 1965. It provides a precise definition of the complexity of an algorithm, defines the concept of a complexity class, and shows that there is an infinite sequence of distinct complexity classes and therefore an infinite sequence of increasingly hard problems.

Stearns describes working with Hartmanis at GE to write “On the Computational Complexity of Algorithms”.        
An algorithm is a step-by-step procedure that takes inputs and produces corresponding outputs. For example, a sorting algorithm will take a list of numbers as input and output the list in numerical order. Such an algorithm is said to solve the sorting problem. Another algorithm might take a number as an input and output YES or NO depending on whether or not the input number is a prime number. Such an algorithm is said to solve the problem of primality testing. The complexity of an algorithm is based on the number of steps required, as a function of the input size, for the algorithm to solve a problem. Such a function is called a time bound. For mathematical purposes, algorithms are imagined to be executed on a Turing machine, a simple model of a computer. The complexity of a problem is the complexity of the fastest algorithm for solving that problem.
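
A minimal sketch of these definitions in Python (the function and the step-counting convention are illustrative additions, not taken from the paper): a trial-division primality test instrumented to report how many steps it takes on a given input.

def is_prime(n):
    """Trial-division primality test; returns (answer, number of steps)."""
    steps = 0
    if n < 2:
        return False, steps
    d = 2
    while d * d <= n:
        steps += 1              # count one "step" per divisibility check
        if n % d == 0:
            return False, steps
        d += 1
    return True, steps

# The step count grows roughly with the square root of n; measured against
# the input size (the number of digits of n), that is exponential growth,
# which is why trial division is impractical for very large numbers.
for n in (97, 1009, 104729):
    print(n, is_prime(n))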

Hartmanis and Stearns defined a complexity class as the set of all problems solvable within a specified time bound. Their paper [4] shows that there is an infinite hierarchy of complexity classes (for example, problems for which the fastest algorithm takes time proportional to n, n log n, n², n³, 2ⁿ, and so on) where a small increase in the time bound enables more problems to be solved.
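
In modern notation, the hierarchy result they initiated is usually stated along the following lines (a paraphrase, not the paper's original formulation): for time-constructible bounds,

    f(n) \log f(n) = o(g(n)) \;\Longrightarrow\; \mathrm{DTIME}(f(n)) \subsetneq \mathrm{DTIME}(g(n)),

so, for example, the problems solvable in time proportional to n² form a strictly larger class than those solvable in time proportional to n.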

After Hartmanis left GE to become Chair of the Computer Science Department at Cornell University, Stearns worked with Daniel J. Rosenkrantz and Philip M. Lewis. One important paper [5] (with Philip M. Lewis) showed that a similar hierarchy exists when the complexity is defined in terms of the amount of memory space required (as a function of input size) to solve the problem on a Turing machine. Changing the simple model of a Turing Machine by separating the “input tape” from the “work tape” allowed sub-linear space-bounded complexity classes to be defined.
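
A standard textbook illustration (not an example from [5]) of why the separate read-only input tape matters: deciding whether an input string of length n is a palindrome needs only O(log n) bits of work space, because the work tape only has to hold two position counters. A rough Python sketch, with two indices standing in for the counters:

def is_palindrome(tape):
    """Scan a read-only input 'tape' using two position counters.

    Each counter holds an index into the input, so it occupies O(log n) bits;
    the input itself is never copied onto the work tape.
    """
    i, j = 0, len(tape) - 1
    while i < j:
        if tape[i] != tape[j]:
            return False
        i += 1
        j -= 1
    return True

print(is_palindrome("abcba"))   # True
print(is_palindrome("abcab"))   # False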

Stearns describes working with Lewis to investigate the space complexity of algorithms.        
These papers led to the establishment of computational complexity theory as a fundamental part of the computer science discipline. Many computer scientists have written papers extending and enhancing the theory. Although the reasoning and the results of many of these papers are highly theoretical, they have practical implications for real-world problems. Many computer scientists and software engineers have used these results in the design and implementation of practical systems.

One of the implications of the hierarchy of complexity classes is that there are many real-life problems whose complexity is so high that it is not practical to solve them exactly. The work on approximating computationally complex problems [9] showed that some problems that would take an impractically long time to solve exactly can be solved approximately by much faster algorithms that yield solutions provably close to the optimum.
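
A generic example of the flavor of such results (illustrative only, not necessarily one of the algorithms analyzed in [9]): the greedy nearest-neighbor heuristic for the traveling salesman problem builds a tour in O(n²) time, whereas finding an optimal tour is believed to require exponential time in the worst case; for distances obeying the triangle inequality, the tour it produces can be shown to be within a factor of the optimum that grows only logarithmically with the number of cities.

import math

def nearest_neighbor_tour(points):
    """Greedy TSP heuristic: from the current city, always visit the closest
    unvisited city next.  Runs in O(n^2) time for n cities."""
    unvisited = set(range(1, len(points)))
    tour = [0]                       # start arbitrarily at city 0
    while unvisited:
        here = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(here, points[i]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

cities = [(0, 0), (2, 1), (5, 0), (3, 4), (1, 3)]
print(nearest_neighbor_tour(cities))   # a tour visiting every city once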

Rosenkrantz, Lewis, and Stearns published a number of papers and, in 1976, a book [2] on compiler design theory. A compiler is the program that translates programs written in a high-level programming language into the machine code of a particular computer. One compiler design problem is parsing a program in the programming language. For natural languages, parsing a sentence involves finding the subject, the verb, the object, and so on. Programming languages are usually specified by context-free grammars, and programs written in such languages must be parsed by the compiler. The computational complexity of parsing a “sentence” (program) of length n written in a general context-free grammar is n³, which is too high to be practical in a compiler. The authors defined a special class of context-free grammars, LL(k) grammars, that can be parsed in linear time, where the complexity is only n. The method they described for parsing LL(k) grammars is called top-down parsing and is now an important part of compiler design. They also showed that much of the desired translation from a programming language into a computer language can be done at the same time as the parsing.
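
A minimal sketch of top-down (recursive-descent) parsing in Python, using a toy LL(1) grammar invented for this example rather than anything from the book [2]: each nonterminal becomes a procedure, and a single token of lookahead selects the production to apply, so every token is examined a bounded number of times and the parse runs in linear time.

def parse(tokens):
    """Recursive-descent parser for the toy grammar
         Expr -> NUM | '(' Expr '+' Expr ')'
    """
    pos = 0

    def peek():
        return tokens[pos] if pos < len(tokens) else None

    def expect(tok):
        nonlocal pos
        if peek() != tok:
            raise SyntaxError(f"expected {tok!r}, got {peek()!r}")
        pos += 1

    def expr():
        nonlocal pos
        if peek() == '(':            # production: '(' Expr '+' Expr ')'
            expect('(')
            left = expr()
            expect('+')
            right = expr()
            expect(')')
            return ('+', left, right)
        tok = peek()                 # production: NUM
        if tok is None or not tok.isdigit():
            raise SyntaxError(f"expected a number, got {tok!r}")
        pos += 1
        return ('num', int(tok))

    tree = expr()
    if pos != len(tokens):
        raise SyntaxError("unexpected trailing input")
    return tree

print(parse(['(', '1', '+', '(', '2', '+', '3', ')', ')']))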

After Stearns left GE in 1978 to go to the State University of New York (SUNY) at Albany (Rosenkrantz had left to go to Albany in 1977), he continued working with Lewis. They did research on concurrent database systems, where a number of transactions simultaneously read and write items in the same database. One goal of such systems is for the transactions to run as concurrently as possible to increase the throughput of the system, but without destroying the correctness of the database.  Their paper on this subject [10] shows that a necessary and sufficient condition for consistency and correctness in concurrent execution is serializability of the transactions—that is, the overall effect of the read and write requests of all the concurrently-running transactions must be the same as if the transactions had been run serially (one after the other) in some order. Previously it had been known that serializability is a sufficient condition for consistency:  if the execution is serializable, it is correct. This paper showed that, with the exception of certain read-only transactions, it is also a necessary condition: the execution must be serializable to be correct. This result is now a key part of all database courses and the design of all database systems.
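
One common way to test a schedule for (conflict) serializability is to build a precedence graph over the transactions and check it for cycles; the sketch below is the standard textbook construction, not the formulation used in [10].

from collections import defaultdict

def conflict_serializable(schedule):
    """schedule: list of (transaction_id, operation, item), e.g. (1, 'R', 'x').

    Add an edge Ti -> Tj whenever an operation of Ti precedes a conflicting
    operation of Tj (same item, at least one write, different transactions).
    The schedule is conflict-serializable iff this graph has no cycle."""
    edges = defaultdict(set)
    for i, (ti, op1, item1) in enumerate(schedule):
        for tj, op2, item2 in schedule[i + 1:]:
            if ti != tj and item1 == item2 and 'W' in (op1, op2):
                edges[ti].add(tj)

    WHITE, GRAY, BLACK = 0, 1, 2     # depth-first search cycle detection
    color = defaultdict(int)

    def has_cycle(node):
        color[node] = GRAY
        for nxt in edges[node]:
            if color[nxt] == GRAY or (color[nxt] == WHITE and has_cycle(nxt)):
                return True
        color[node] = BLACK
        return False

    nodes = {t for t, _, _ in schedule}
    return not any(color[n] == WHITE and has_cycle(n) for n in nodes)

# T1 reads x before T2 writes it, but T2 reads y before T1 writes it,
# so no serial order is equivalent: the schedule is not serializable.
print(conflict_serializable([(1, 'R', 'x'), (2, 'W', 'x'),
                             (2, 'R', 'y'), (1, 'W', 'y')]))   # False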

At SUNY Albany, Stearns began a long collaboration with Harry B. Hunt III. Problems of interest are often proven hard by “reductions” (mappings) from other problems known to be hard (assuming P ≠ NP). Hunt and Stearns studied deeper issues involving reductions. They looked for reductions that mapped to simple subsets of instances, because instances of practical interest might all be simple. They also looked for reductions of small size because the smaller the size, the stronger the conclusions that can be drawn about complexity. They formalized some of these ideas with the concept of a power index. Rather than looking for reductions one at a time, they looked for reduction principles which could be applied to many kinds of objects at once, particularly to various kinds of algebras. In this way, they showed that many problems were hard for the same simple reason.
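
A textbook illustration of a reduction (not one of Hunt and Stearns's): the question "does graph G on n vertices have an independent set of size k?" maps to "does G have a vertex cover of size n - k?", because a set of vertices is independent exactly when the remaining vertices cover every edge. The mapping itself is trivial to compute, so a fast vertex-cover algorithm would immediately give an equally fast independent-set algorithm.

def independent_set_to_vertex_cover(num_vertices, edges, k):
    """Map an Independent Set instance (G, k) to a Vertex Cover instance.

    S is an independent set of G iff the other vertices form a vertex cover,
    so G has an independent set of size k iff it has a cover of size n - k.
    Only the target size changes; the graph is passed through unchanged."""
    return num_vertices, edges, num_vertices - k

# A triangle has a maximum independent set of size 1, and correspondingly
# a minimum vertex cover of size 3 - 1 = 2.
print(independent_set_to_vertex_cover(3, [(0, 1), (1, 2), (0, 2)], 1))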

Stearns recounts his work with Harry Hunt III on computational complexity.        
Hunt and Stearns studied sum-of-products problems in which the plus and times operators come from a commutative semiring. They showed that such problems can be solved with a reduced number of operations if the problem has good structure, as displayed in a structure tree. “Good structure” here means small “weighted depth” for top-down processing or small “channelwidth” for bottom-up processing. Channelwidth is a manifestation of “treewidth” when sum-of-products problems are viewed graphically, but the structure tree concept provides clarity when the problems are viewed algebraically. Then, with the addition of an algebraic condition, they extended the structure tree concept to apply to quantified formulas. They also established a tight connection between sub-linear space and OR-of-AND problems whose treewidth is sub-linearly bounded.
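
The gain from exploiting structure can be seen in a small example (an illustration of the general idea only, not of the structure-tree algorithms themselves): when the factors of a sum of products depend on disjoint sets of variables, the sums can be pushed inside the product, a rewriting that is valid over any commutative semiring and that sharply reduces the operation count.

def naive_sum_of_products(f, g, domain):
    """Sum f(x)*g(y) over all pairs (x, y): about |domain|**2 multiplications."""
    return sum(f(x) * g(y) for x in domain for y in domain)

def factored_sum_of_products(f, g, domain):
    """Because x and y occur in separate factors, the sum distributes:
       sum_{x,y} f(x)*g(y) = (sum_x f(x)) * (sum_y g(y)),
    which needs only about 2*|domain| additions and one multiplication."""
    return sum(f(x) for x in domain) * sum(g(y) for y in domain)

domain = range(100)
f = lambda x: x + 1
g = lambda y: 2 * y
assert naive_sum_of_products(f, g, domain) == factored_sum_of_products(f, g, domain)
print(factored_sum_of_products(f, g, domain))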

In collaboration with S.S. Ravi, Harry Hunt III, Daniel Rosenkrantz, and Madhav Marathe, Stearns has been working on dynamical systems.

In September 2000, Stearns retired to live with his wife in Slingerlands, N.Y. He still visits and collaborates with former colleagues at the University.

Author: Philip M. Lewis


