# Theory of Computing Blog Aggregator

### TCS book: Call for GitHub issues

I had originally planned to spend this summer finishing my Introduction to Theoretical Computer Science book, and in particular writing the two missing chapters on space complexity and interactive proof systems. Needless to say, this summer did not go as planned, and I won’t be able to write these chapters. However, I still intend to go over the existing chapters, fixing typos, adding examples and exercises, and generally making the book friendlier to beginning undergraduate students.

Toward this end, I would be grateful for people posting bugs, typos, and suggestions as GitHub issues (I currently have 267 closed and 14 open issues, which I hope to get to soon). Of course, if you are technically inclined and there’s a simple local fix, you can also make a pull request.

Aside from these fixes, I am making two more “global” changes to the book. First, I am adding a “non-mathy overview” to each chapter. While some students got a lot from reading the book prior to lectures, others were intimidated by the mathematical notation, so I hope this gentler introduction will be helpful. I am also adding more examples and solved exercises toward this end.

Another change is that I now follow the more traditional way of presenting deterministic finite automata before Turing machines – DFAs are still optional and can be skipped without missing anything, but some instructors find them a good introduction to Turing machines. Thus the order of presentation of materials in the book is roughly as follows:

1. Introduction, representing objects as strings – Representing numbers, lists, etc. Specifying computational tasks as functions mapping binary strings to binary strings, Cantor’s theorem.
2. Finite functions and Boolean circuits – Every function can be computed by some circuit, circuits as straightline programs, representing circuits as strings, universal circuit evaluator, counting lower bound.
3. Computing on unbounded inputs – DFAs (optional), Turing Machines, equivalence between Turing machines, RAM machines and programming languages, λ calculus (optional), cellular automata (optional)
4. Uncomputability – Universal Turing machine, Halting problem, reductions, Rice’s Theorem. Optional: Gödel’s incompleteness theorem, uncomputability of quantified arithmetic statements, context free grammars.
5. Efficient computation – Modeling running time, time hierarchy theorem, P and EXP
6. NP and NP completeness – Polynomial-time reductions, Cook-Levin Theorem (using circuits), definition of NP using “proof systems”/“verifying algorithms” (no non-deterministic TMs), NP completeness, consequences of P=NP: search to decision, optimal machine learning, etc.
7. Randomized computation: Worst-case randomized computation, defining BPP, Sipser-Gács, does BPP=P? (a little on derandomization)
8. Cryptography: One time pad, necessity of long keys for information theoretic crypto, pseudorandom generators and stream ciphers, taste of public key and “magic” (ZKP, FHE, MPC)
9. Quantum computing: Some quantum mechanics background – double slit experiment, Bell’s inequality. Modeling quantum computation. Bird’s eye view of Shor’s algorithm and quantum Fourier transform.

by Boaz Barak at July 10, 2020 05:29 PM UTC

### Ron Graham, 1935–2020

from Richard Lipton

Ron Graham passed away, but he lives on…

 Cropped from tribute by Tom Leighton

Ron Graham passed away on Monday at the age of ${84}$ in La Jolla, near UCSD.

Today Ken and I wish to say a few words about Ron.

Tributes are being written as we write, including this from the Simons Foundation. Here is the American Mathematical Society announcement, which we saw first:

Ron Graham, a leader in discrete mathematics and a former president of both the AMS (1993-1994) and the MAA (2003-2004), died on July 6. He was 84. Graham published more than 350 papers and books with many collaborators, including more than 90 with his wife, Fan Chung, and more than 30 with Paul Erdős. He was known for his infectious enthusiasm, his originality, and his accessibility to anyone who had a mathematics question.

A tribute by Brady Haran embeds several short videos of Ron and his work. Fan’s own page for Ron has much more. We have made a collage of images from his life:

Ron was special and will be greatly missed by all. We at GLL send our thoughts to his dear wife, Fan. Ken and I knew Ron for many years. Ken had known Ron since a visit to Bell Labs in the 1980s, and met Fan too at STOC 1990. I had known Ron since I was at Yale in the 1970’s—a long time ago. I recall fondly meeting him for the first time when he was at Bell Labs.

## Some Stories

Ken and I thought we would give some personal stories about Graham.

${\bullet }$ Ken’s story is told here. In breaking a confidence by telling Erdős the secret about Bobby Fischer recounted there, Ken hoped that it would spread behind the scenes to enough people that Fischer would be less blamed for failing to play Anatoly Karpov in 1975. Since Erdős was staying with the Grahams, presumably it would have emerged there. The social excursion during STOC 1990 was a dinner cruise in Baltimore’s harbor. Ron and Fan and Ken found each other right away, and some questions to Ken about chess quickly went to the Fischer topic. At least Ken knows the secret was told at least once.

${\bullet }$ Ron told me once that he was the accountant for Erdős. One of Ron’s jobs was to keep track of the prize money that Erdős owed, and Ron would send out the checks to whoever solved the next problem. One of the brilliant insights of Erdős was to make the problems hard, but such that at least some were solvable. Ron told me that for years no one would actually cash the checks. They would frame them and proudly display them.

Ron said that he liked this for the obvious reason—less cash for Erdős to have to pay. But the advent of color Xerox machines in the 1970’s changed this. He told me that people began cashing the checks and displaying a color copy. Bummer.

${\bullet }$ My first talk at Bell Labs was on my work on the planar separator theorem—joint work with Bob Tarjan. At the beginning of the talk I saw that Ron had a pile of papers on his desk. He was a manager, and I guessed he had some paperwork to do. I gave my talk. At the end I went up to Ron in the back and he said:

I did not get any work done.

I still fondly remember that as high praise.

${\bullet }$ Graham loved to do hand stands. I recall walking around Bell Labs one day when out of the blue Ron did a full handstand. He said that he liked to do these on the hand rail of the stairs. The trick he said was: “To not fall down.”

I searched for him doing handstands and found out he and Fan lived in a modern beautiful house.

When two mathematicians found a circular home designed by architect Kendrick Bangs Kellogg in La Jolla, they treasured their unique discovery.

## Fun and Games

Ron kept a simply organized page of all his papers. They are not sorted by subject or kind, but the titles are so descriptive that you can tell at a glance where the fun is. A number of them are expositions in the popular magazines of the AMS and MAA.

Among them, we’ll mention this note from 2016, titled “Inserting Plus Signs and Adding.” It is joint with Steve Butler, who penned his own reminiscence for Lance and Bill’s blog, and Richard Strong.

Say that a number ${w}$ is “reducible” to a number ${v}$ in one step (in base ${b}$) if there is a way to insert one or more ${+}$ signs into the base-${b}$ representation of ${w}$ so that the resulting numbers add up to ${v}$. For example, 1935 is reducible to 99 via ${1 + 93 + 5}$. The number 99 reduces only to 18 via ${9+9}$, and 18 reduces only to 9, which cannot be reduced further. Thus Ron’s birth year took ${3}$ reduction steps to become a single digit. However, doing ${1+9+3+5}$ gives 18 straightaway and thus saves a step. The paper gives cases where inserting ${+}$ everywhere is not a quickest way to reduce to a single digit.
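To make the reduction operation concrete, here is a short Python sketch (base 10 only; the function names are ours, not from the paper) that enumerates all one-step reductions of a number and finds the fewest steps to a single digit by brute-force search:

```python
from itertools import product

def reductions(w: int) -> set[int]:
    """All numbers reachable from w by one insertion of '+' signs (base 10)."""
    s = str(w)
    results = set()
    # choose, for each of the len(s)-1 gaps between digits, whether to cut there
    for pattern in product([False, True], repeat=len(s) - 1):
        if not any(pattern):
            continue  # at least one '+' must be inserted
        parts, start = [], 0
        for i, cut in enumerate(pattern, start=1):
            if cut:
                parts.append(int(s[start:i]))
                start = i
        parts.append(int(s[start:]))
        results.add(sum(parts))
    return results

def steps_to_digit(w: int) -> int:
    """Fewest reduction steps from w to a single digit (breadth-first search)."""
    if w < 10:
        return 0
    frontier, k = {w}, 0
    while not any(v < 10 for v in frontier):
        frontier = {v for u in frontier for v in reductions(u)}
        k += 1
    return k
```

On Ron’s birth year this confirms the example above: `99 in reductions(1935)` holds, `reductions(99)` is exactly `{18}`, and `steps_to_digit(1935)` is 2, achieved by the all-plus insertion ${1+9+3+5}$.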

Definition 1 For any base ${b}$ and number ${n \geq 1}$ denoting an input length, not magnitude, define ${f_b(n)}$ to be the least integer ${m}$ such that all base-${b}$ numbers of length ${n}$ can be reduced to a single digit within ${m}$ steps.

The question—of a complexity theoretic nature—is:

Given ${b}$, what is the growth rate of ${f_b(n)}$ as ${n \rightarrow \infty}$?

Here are some possible answers—which would you expect to be correct in the case where ${b}$ is base 10?

• ${f_{10}(n) = \Theta(\sqrt{n})}$.

• ${f_{10}(n) = \Theta(n^{1/10})}$.

• ${f_{10}(n) = O(\log n)}$.

• ${f_{10}(n) = O(\log\log n)}$.

• ${f_{10}(n) = O(\alpha(n))}$, where ${\alpha}$ is the inverse Ackermann function.

Your expectation might be wrong—see the paper for the answer and its nifty proof. For a warmup, if you want to answer without looking at the paper, prove that the final reduced digit is the same regardless of the sequence of reductions.
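There is a tidy invariant behind the warmup: in base 10, inserting ${+}$ signs preserves the value modulo 9 (a number is congruent to its digit sum), so every reduction sequence must terminate at the same digit, the digital root. A quick randomized check of this claim (our own illustration, not from the paper):

```python
import random

def one_random_reduction(w: int) -> int:
    """Insert '+' signs at a random nonempty set of digit gaps and add up."""
    s = str(w)
    gaps = list(range(1, len(s)))
    cuts = sorted(random.sample(gaps, random.randint(1, len(gaps))))
    bounds = [0, *cuts, len(s)]
    return sum(int(s[a:b]) for a, b in zip(bounds, bounds[1:]))

def digital_root(w: int) -> int:
    return 1 + (w - 1) % 9  # w mod 9, with 9 standing in for 0

# every random reduction sequence ends at the same single digit
random.seed(1)
for w in (1935, 271828, 999999):
    for _ in range(200):
        x = w
        while x >= 10:
            x = one_random_reduction(x)
        assert x == digital_root(w)
```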

Ron is also known for very big integers, including one that held the record for largest to appear in a published mathematical proof. You can find it among the above tributes and also on a T-shirt. We could also mention his role in the largest proof known to date—at 200 terabytes it almost doubles the size of the tables for proving results of seven-piece chess endgames.

If you desire serious fun, look also to Ron’s books. He wrote several, including co-authoring the nonpareil textbook Concrete Mathematics with Don Knuth and Oren Patashnik.

## Some Prizes

Ron, in the tradition famously followed by Erdős, liked to put money on problems. A $10 problem was much easier than a $100 one; a $1,000 one is extremely hard, and so on. In Ron’s paper on his favorite problems he stated this one: Let ${H_{n} = \sum_{j=1}^{n} \frac{1}{j}}$. Challenge: prove the inequality for all ${n \ge 1}$,

$\displaystyle \sum_{d | n} d \le H_{n} + \exp(H_{n})\log(H_{n}).$

And he put the prize at $1,000,000.
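The prize is no accident of generosity: this inequality is known, by work of Jeffrey Lagarias, to be equivalent to the Riemann Hypothesis. As a sanity check, here is a naive numerical verification for small ${n}$ (the helper names are ours):

```python
from math import exp, log

def sigma(n: int) -> int:
    """Sum of the divisors of n (naive O(n) scan)."""
    return sum(d for d in range(1, n + 1) if n % d == 0)

def harmonic(n: int) -> float:
    """The n-th harmonic number H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / j for j in range(1, n + 1))

def graham_inequality_holds(n: int) -> bool:
    H = harmonic(n)
    # small tolerance: at n = 1 the two sides are exactly equal
    return sigma(n) <= H + exp(H) * log(H) + 1e-9

assert all(graham_inequality_holds(n) for n in range(1, 2001))
```

The bound is tightest at highly composite numbers such as ${n = 12}$, where ${\sigma(12) = 28}$ against a right-hand side of about ${28.3}$.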

### Intellectual Fireworks?

from Richard Lipton

Some different ideas for marking the Fourth

 “Founding Frenemies” source

John Adams and Thomas Jefferson did not use Zoom. Their correspondence, from 1777 up to their deaths hours apart on July 4, 1826, fills a 600-page book.

Today, Independence Day in the US, we consider the kind of intellectual fireworks represented by the correspondence.

Jefferson and Adams were intellectual opposites as well as political rivals. Adams favored a strong central government to bridle human passions, whereas Jefferson’s support for the French Revolution continued beyond its devolution into the Reign of Terror. They debated many other points of politics, philosophy, and culture.

Abigail Adams, the wife of John, joined in some of the exchanges. Because she often stayed in Massachusetts while he was in Philadelphia or New York or elsewhere, the husband and wife exchanged many letters—over 1,100 in all. His letter to her on July 3, 1776, instituted the use of fireworks to celebrate anniversaries of the Declaration of Independence.

Today there is not much in the way of fireworks displays. Most have been canceled because we cannot allow crowds to view them. In the Buffalo area, some townships are having small displays with limited access, and some displays are being set on high points for possible area viewing. So we felt we should write about fireworks of a different kind, a kind that is not restricted by the pandemic and might thrive through it. But first we’ll make a point about the history of fireworks.

## Fireworks: Ancient, Early, and Modern

Fireworks go back at least 1,100 years to China, where chemists discovered the fun of stuffing volatile compounds into tubes of bamboo or paper and setting them off. Some date pyrotechnics back another 1,000 years, to about 200 BCE, insofar as bamboo was known to pop with a loud sound when dried and heated. Gunpowder traveled best of these compounds and made its way into Europe by the 1200s at the latest. The first recorded wide-scale fireworks display in England was in 1486, for the wedding of King Henry VII to Elizabeth of York, which ended the Wars of the Roses. Shakespeare mentions fireworks in Love’s Labour’s Lost. The Mughals in India from the 1500s to the 1800s made fireworks a diversion for noble women on the Diwali holiday:

 Cleveland Museum of Art source

Our point is that 1776 isn’t even halfway back to the beginning of using fireworks for celebrations, even just in the West. Can we even call it “Early”? Lavish displays to mark major events were common by the mid-1700s. A royal display in 1749 was accompanied by orchestral music commissioned from George Frideric Handel and went ahead despite rain. Over 12,000 people also paid to attend the main rehearsal six days earlier, many braving an hours-long traffic jam on approaches to the London Bridge. That feels quite modern to us. Adams’s letter mentioned other social features we know today:

It ought to be solemnized with Pomp and Parade, with Shews, Games, Sports, Guns, Bells, Bonfires and Illuminations from one End of this Continent to the other from this Time forward forever more.

The pandemic has curtailed others of these. The major North American team sports have not resumed either. Some parades have been run in “reverse” mode: the floats and performers stay put while spectators drive by slowly in cars.

Adams’s letter has another, earlier, passage that chills today. The letter begins by saying that the Declaration was supposed to have been made in December, 1775, and enumerates plans the colonies had made contingent on this. He then says that what caused the plans to be aborted was an outbreak of disease:

All these Causes however in Conjunction would not have disappointed Us, if it had not been for a Misfortune, which could not be foreseen, and perhaps could not have been prevented, I mean the Prevalence of the small Pox among our Troops. . . . This fatal Pestilence compleated our Destruction.—It is a Frown of Providence upon Us, which We ought to lay to heart.

The ellipsis is in the letter—as Ken’s children have pointed out, trailing off thought with dots in letters or e-mails or Facebook posts or texts is a distinctive habit of us older folk. Thus a specific outbreak of a contagious disease changed our history then as now.

## Ideas: Ancient, Early, and Modern

We have remarked on how the pandemic has affected opportunities to exchange ideas and how to compensate. One impacted series that both of us intended to visit this spring has been the series of workshops at the Simons Institute in Berkeley.

Still, the Simons Foundation has continued its other ways to stimulate ideas. Here we offer our congratulations to Venkatesan Guruswami, Omer Reingold, and David Woodruff, who have just been appointed as Simons investigators for 2020.

In briefly talking about their work, we want to make a point about how the pandemic enables taking the long view of ideas—in a way that appointments such as these promote. It is easy to get wrapped up in immediate aspects of a current hot problem and not be aware that it has a history. The history may not involve exactly the same ideas as the problem, but related ideas whose importance was appreciated much earlier. “Early” may not mean the Middle Ages or the 1700s as with fireworks, but it can mean times before any of us were born.

Venkatesan and Omer and David each have done some stellar research, broadly in various parts of theory. They each have many results, but we thought we would highlight just one result each. We picked a result that we think is representative, is deep, is beautiful, and is one that we personally admire the most.

### “Ancient” Times

Venkatesan did important work on a problem that was created before complexity theory existed. Our favorite is his ground-breaking work on list decoding.

What is the best way to encode data to protect it against various kinds of errors? This is still open. But Venkatesan changed the landscape.

The questions about error-correcting codes go back to the 1940’s. Usually the first results are credited to Richard Hamming in 1947. Soon afterward the notion of list decoding was introduced. The cool idea is to not require a single answer, but to allow a list of possible answers. The hope is that with other information about the message we might be able to select the right answer.

Venkatesan and Ken’s colleague Atri Rudra found explicit codes that achieve list-decoding capacity, that is, they have optimal redundancy.

What we like so much is that the model is so natural and so powerful. There are many applications of list decoding to complexity theory. See Madhu Sudan’s survey for some additional comments.

### Early Times

Omer did his most important work on problems that were first studied in the early days of complexity theory. Our favorite is his beautiful work on small-memory deterministic graph walks.

Is ${\mathsf{L < NL}}$? This is still open. But Omer made a huge contribution to our understanding of fundamental complexity classes. Romas Aleliunas, Dick Karp, Laszlo Lovasz, and Charlie Rackoff proved earlier that randomized small-space machines could navigate undirected graphs, provided they could flip coins. In a sense Omer removed the coins to get his result that undirected graph connectivity is in ${\mathsf{L}}$. The previous result was easy—I can say that because I (Dick) was a co-author on it—but Omer’s theorem is deep.
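The earlier randomized result can be demonstrated in a few lines: on a connected undirected graph, a random walk of polynomially many steps visits every vertex with high probability. The simulation below tracks the visited set only to verify coverage; a true log-space algorithm stores just the current vertex and a step counter. The graph is a made-up example:

```python
import random

def random_walk_covers(adj: dict[int, list[int]], start: int, steps: int) -> bool:
    """Walk at random for `steps` moves; report whether every vertex was visited."""
    v, visited = start, {start}
    for _ in range(steps):
        v = random.choice(adj[v])  # move to a uniformly random neighbor
        visited.add(v)
    return visited == set(adj)

# a small connected undirected graph: a 5-cycle plus the chord {0, 2}
adj = {0: [1, 2, 4], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [0, 3]}
random.seed(2020)
print(random_walk_covers(adj, start=0, steps=1000))
```

The expected cover time is polynomial in the number of vertices and edges, which is why 1,000 steps is wildly generous for a 5-vertex graph.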

Omer’s proof drew heavily on expander graphs and the zig-zag product from his earlier work with Salil Vadhan and Avi Wigderson for creating them.

### Modern Times

David did his most important work on problems that were only created relatively recently. Our favorite is his work on approximately counting distinct elements. This work is joint with Daniel Kane and Jelani Nelson and appeared at PODS 2010. It was the first streaming algorithm with an optimal combination of space usage and update time; the relevant comparison table appears in their paper (KNW).
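The KNW algorithm itself is intricate. To give a flavor of the problem, here is a much simpler small-space estimator in the same spirit, the classic k-minimum-values sketch; this is an illustration of distinct-element counting, not the KNW algorithm:

```python
import hashlib

def estimate_distinct(stream, k: int = 64) -> float:
    """K-minimum-values sketch: remember only the k smallest distinct hash values.

    If the k-th smallest hash (scaled into [0, 1]) is h, the stream had
    roughly (k - 1) / h distinct elements.
    """
    kept: list[float] = []  # sorted list of at most k hash values
    for x in stream:
        h = int(hashlib.sha256(repr(x).encode()).hexdigest(), 16) / 2.0**256
        if h in kept:
            continue  # duplicate element, already accounted for
        if len(kept) < k:
            kept.append(h)
            kept.sort()
        elif h < kept[-1]:
            kept[-1] = h  # replace the current k-th smallest
            kept.sort()
    if len(kept) < k:
        return float(len(kept))  # fewer than k distinct values: count is exact
    return (k - 1) / kept[-1]
```

Storing ${k}$ hash values gives a typical relative error on the order of ${1/\sqrt{k}}$, no matter how long the stream is—the kind of space/accuracy trade-off that KNW made optimal, together with fast update time.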

Streaming algorithms are relatively new and parts of data science are newer. But working with data is old, as old as codes. This finally leads us to pose an outlandish question:

Can all of this work be usefully interpreted from the standpoint of coding theory?

This is outlandish, because the word “code” does not even appear in either Reingold’s paper or KNW. But part of holding coding theory to be a paradigm, as both Ken and I experienced in graduate school, is that its perspective should expand. Is this capable of creating intellectual fireworks? We’ll see.

## Open Problems

Have a safe and happy Fourth of July.


by RJLipton+KWRegan at July 05, 2020 12:41 AM UTC

### Postdoc in quantum computing at Nagoya and Mie Universities (apply by July 15, 2020)

from CCI: jobs

Nagoya and Mie Universities (Japan) are looking for several postdoctoral researchers to work on quantum computing, especially on the following subjects:

1) quantum algorithms,
2) quantum complexity theory,
3) theoretical aspects of quantum programming languages,
4) development of software verification tools for quantum programs,
5) quantum information theory.

Website: http://francoislegall.com/jobs.html
Email: legall@math.nagoya-u.ac.jp

by shacharlovett at July 03, 2020 11:18 AM UTC