CMSC 39600: Analysis of Boolean Functions (Autumn 2025)
Full Course Title: "Topics in Theoretical Computer Science: Analysis of Boolean Functions"
Term: Autumn 2025 at the University of Chicago
Instructor: William Hoza (williamhoza@uchicago.edu)
Meetings: Tuesdays and Thursdays | 12:30pm - 1:50pm | RY 178
Office Hours: Thursdays | 2pm - 3pm | JCL 311
Textbook: Analysis of Boolean Functions by Ryan O'Donnell, available for free online.
Course Description
A Boolean function can represent a problem we want to solve, an algorithm we want to analyze, a concept we want to learn, etc. In this course, we will use Fourier analysis to reason about Boolean functions, with applications in property testing, learning theory, pseudorandomness, and more.
Some possible topics: Linearity testing. Noise stability/sensitivity. Arrow's theorem. Dictator testing. Decision trees. Spectral concentration. The Goldreich-Levin algorithm. Total influence. Linear threshold functions. Peres's theorem. Chow's theorem. Hypercontractivity. Friedgut's junta theorem. The Kahn-Kalai-Linial theorem. The Friedgut-Kalai-Naor theorem. Bounded-depth circuits. Random restrictions. The Linial-Mansour-Nisan theorem. Fourier growth. Mansour's theorem. Read-once branching programs. Small-bias distributions. Pseudorandom generators from polarizing random walks. The Gaussian noise operator. Hermite polynomials. Invariance principles.
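To make the central object concrete: every Boolean function $f\colon \{-1,1\}^n \to \{-1,1\}$ has a unique Fourier expansion $f(x) = \sum_S \hat{f}(S) \prod_{i \in S} x_i$, where $\hat{f}(S) = \mathbb{E}_x[f(x)\chi_S(x)]$. The sketch below (an illustrative aside, not part of the official course materials; the names `fourier_coefficient` and `maj3` are chosen for this example) computes the coefficients of the 3-bit majority function by brute force:

```python
from itertools import product

def fourier_coefficient(f, S, n):
    """hat{f}(S) = E_x[f(x) * chi_S(x)], averaging over all x in {-1,1}^n."""
    total = 0
    for x in product([-1, 1], repeat=n):
        chi = 1
        for i in S:       # chi_S(x) = product of the coordinates indexed by S
            chi *= x[i]
        total += f(x) * chi
    return total / 2 ** n

# 3-bit majority, with the +/-1 encoding of bits
maj3 = lambda x: 1 if sum(x) > 0 else -1

coeffs = {S: fourier_coefficient(maj3, S, 3)
          for S in [(), (0,), (1,), (2,), (0, 1), (0, 2), (1, 2), (0, 1, 2)]}
# Maj3 = (x1 + x2 + x3)/2 - (x1 x2 x3)/2, so each singleton gets
# coefficient 1/2, the full set gets -1/2, and everything else is 0.
```

By Parseval's theorem, the squared coefficients of a Boolean function sum to 1, which here checks out as $3 \cdot (1/2)^2 + (1/2)^2 = 1$.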
Prerequisites: If you're an undergraduate student and you want to take this course, you'll need to discuss it with me and get approval. If you're a graduate student, there are no firm prerequisites, but it would be helpful to have some familiarity with computational complexity theory and probability theory.
Course Timeline
(The information below will be updated throughout the quarter.)
Tuesday 9/30: The Fourier expansion theorem; characters are orthonormal; Plancherel's theorem; the Fourier coefficient formula; Parseval's theorem; convolution; the Blum-Luby-Rubinfeld linearity test.
Chapter 1.
Thursday 10/2: Condorcet's paradox; noise stability; Arrow's theorem.
Sections 2.0 and 2.1, the first part of Section 2.4 (up to and including the proof of Proposition 2.50), and Section 2.5.
Tuesday 10/7: Dictator testing; learning depth-$k$ decision trees from random examples; size-$s$ decision trees are concentrated at low degree.
Sections 3.0-3.2 and 3.4. (You can ignore the stuff about influence and noise sensitivity for now.)
Thursday 10/9: The low-degree algorithm; Fourier 1-norm of decision trees; spectral concentration based on Fourier 1-norm bounds; the Goldreich-Levin algorithm.
See learnability.pdf from 10/7.
Section 3.5.
Tuesday 10/14: The Kushilevitz-Mansour algorithm; influence; sensitivity; discrete derivatives; total influence of size-$s$ decision trees, width-$w$ DNFs, and unate functions; spectral concentration based on total influence bounds.
Sections 2.2, 2.3, 4.0, and 4.1.
Thursday 10/16: Total influence of unate size-$s$ decision trees; noise sensitivity bounds based on total sensitivity; linear threshold functions; Peres's theorem; spectral concentration based on noise sensitivity; Chow's theorem.
Sections 5.0, 5.1, and 5.5.
Tuesday 10/21:
Thursday 10/23:
Tuesday 10/28:
Thursday 10/30:
Tuesday 11/4:
Thursday 11/6:
Tuesday 11/11:
Thursday 11/13:
Tuesday 11/18:
Thursday 11/20:
(No classes next week due to Thanksgiving Break)
Tuesday 12/2:
Thursday 12/4:
[Finals week]:
Project
Study some topic that we won't have time to discuss in class and that is related to the analysis of Boolean functions. Then write and submit an exposition. Your exposition should be in the ballpark of 5-10 pages, but there are no strict page limits. Think of your classmates as your target audience. The exposition will be due near the end of the quarter (exact deadline TBA).
I will need to approve your topic. I encourage you to pick a specific theorem (or two) to focus on. Your exposition should explain what the theorem says and why the theorem is important. Your exposition should also explain the proof of the theorem, but this explanation need not be complete. For example, you might choose to only explain part of the proof, or you might choose to prove a special case of the theorem. I encourage you to try to develop and explain a new way of thinking about the proof. The best case would be if you manage to genuinely prove something new, but this is not required.
Here are some papers you could look at. I might add more papers a little later in the quarter. I've linked to the most "official" versions available of the papers below, but you might want to also look for other versions, such as arXiv/ECCC preprints.
- Expander random walks: the general case and limitations, by Cohen, Minzer, Peleg, Potechin, and Ta-Shma. ICALP 2022.
- Learning low-degree functions from a logarithmic number of random queries, by Eskenazis and Ivanisvili. STOC 2022.
- Sharper bounds on the Fourier concentration of DNFs, by Lecomte and Tan. FOCS 2021.
- Fractional pseudorandom generators from any Fourier level, by Chattopadhyay, Gaitonde, Lee, Lovett, and Shetty. CCC 2021.
- Fourier growth of parity decision trees, by Girish, Tal, and Wu. CCC 2021.
- The Keller-Klein proof of the Kindler-Safra theorem, by Filmus. Lecture notes, 2021.
- Theorems of KKL, Friedgut, and Talagrand via random restrictions and log-Sobolev inequality, by Kelman, Khot, Kindler, Minzer, and Safra. ITCS 2021.
- Towards a proof of the Fourier-entropy conjecture?, by Kelman, Kindler, Lifshitz, Minzer, and Safra. Geom. Funct. Anal., 2020.
- Biasing Boolean functions and collective coin-flipping protocols over arbitrary distributions, by Filmus, Hambardzumyan, Hatami, Hatami, and Zuckerman. ICALP 2019.
- Improved pseudorandomness for unordered branching programs through local monotonicity, by Chattopadhyay, Hatami, Reingold, and Tal. STOC 2018.
- Making polynomials robust to noise, by Sherstov. ToC, 2013.
- The Chow parameters problem, by O'Donnell and Servedio. SICOMP, 2011.
- An efficient membership-query algorithm for learning DNF with respect to the uniform distribution, by Jackson. JCSS, 1997.
Homework Exercises
I will assign exercises throughout the quarter. Submit your solutions through Gradescope.
- Exercises 1 & 2, due October 10 at 11:59pm
- Exercises 3 & 4, due October 17 at 11:59pm
- Exercises 5 & 6, due October 24 at 11:59pm
Collaboration
You are encouraged to collaborate with your classmates on exercises, but you must adhere to the following rules.
- Work on each exercise on your own for at least five minutes before discussing it with classmates.
- Feel free to explain your ideas to your classmates in person, and feel free to use whiteboards/chalkboards/etc. However, do not share any written/typeset solutions with your classmates for them to study on their own time. This includes partial solutions.
- Write your solutions on your own. While you are writing your solutions, do not consult any notes that you might have taken during discussions with classmates.
- In your write-up, list any classmates who helped you figure out the solution. The fact that student A contributed to student B's solution does not necessarily mean that student B contributed to student A's solution.
Permitted Resources for Full Credit
In addition to discussions with me and with classmates (subject to the collaboration rules above), you may also use the course textbook, any slides or notes posted in the "Course Timeline" section of this webpage, and Wikipedia. To receive full credit on an exercise, you may not use any other resources.
Outside Resources for Partial Credit
If you wish, you may use outside resources (ChatGPT, Stack Exchange, etc.) to solve an exercise for partial credit. If you decide to go this route, you must make a note of which outside resources you used when you were working on each exercise. You must disclose using a resource even if it was ultimately unhelpful for solving the exercise. Furthermore, if you consult an outside resource while working on an exercise, then you must not discuss that exercise with your classmates.
Grading Policies
Your grade will be based on homework exercises (65%), the project (25%), and class participation (10%).
Late Work
You are strongly encouraged to complete each exercise within the allotted time. If you don't master the subject matter of a particular exercise by the deadline, then you might have trouble understanding the topics we discuss in the following classes. Furthermore, if you submit your solutions late, then it might not be possible to give you timely feedback. That being said, I understand that occasionally missing a deadline is inevitable. Late work will ordinarily receive partial credit based on the following principles.
- If you submit your solution to an exercise $x$ days late, where $0 < x < 14$, then your score will be multiplied by $1 - (0.02x)^{1.1} - (0.0694x)^{10}$. Some rounding will occur in the calculation. Work turned in 14 or more days late will ordinarily not receive any credit.
- Homework submitted after 11:59pm on Tuesday, December 9 will not receive any credit.
- Even if two exercises have the same deadline, you do not have to submit your solutions simultaneously. You can submit one solution on time for full credit and submit the other solution late for partial credit. On the other hand, if an exercise is divided into parts, then you must submit your solutions to all parts simultaneously.
- Eventually, I will post official solutions. If you look at the official solution for an exercise, then from that point onward, you are not permitted to submit your own solution for that exercise. Submitting a solution after looking at the official solution would be an academic honesty violation.
- The principles described above might be ambiguous or problematic in some edge cases. Ultimately, I will decide how much credit to grant for late work.
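The lateness formula above can be sketched as a small helper. This is a hypothetical illustration, not official course software; `late_multiplier` is a name chosen for this sketch, and the actual rounding is at the instructor's discretion.

```python
def late_multiplier(x):
    """Score multiplier for a solution submitted x days late,
    per the stated formula 1 - (0.02x)^1.1 - (0.0694x)^10."""
    if x <= 0:
        return 1.0   # on time: full credit
    if x >= 14:
        return 0.0   # 14 or more days late: no credit
    return 1 - (0.02 * x) ** 1.1 - (0.0694 * x) ** 10

# The penalty is gentle at first and steep near the cutoff:
# one week late keeps roughly 88% of the earned score.
```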
The score you see in Gradescope represents your score before applying the lateness penalty, i.e., it is the score that you would have received if you had submitted your solution on time. Lateness penalties will only be recorded in Canvas, possibly after a significant delay.
Extensions
Please get in touch with me if you are having trouble keeping up with the course schedule for any reason. I may be willing to give you an extension, or I may be willing to give you partial credit for late work in excess of what the principles above stipulate. To highlight one example, I am happy to grant an extension for a religious holiday if you give me sufficient notice that you will be observing that holiday.
Regrade Requests
If you think something was graded incorrectly, please submit a regrade request through Gradescope.
Academic Honesty
If you ever find yourself in any kind of irregular / problematic / questionable situation related to academic honesty in this course, send me an email and explain what happened. For example, you should send me an email if you realize that you accidentally used a prohibited resource while you were working on the homework.
I encourage you to follow this simple protocol in every course you take. Depending on the circumstances, you might face consequences such as a lowered score, but don't let this deter you. A clean conscience is worth more than any grade.
Inclusion and Accessibility
This course, like every course you take, ought to be an enjoyable and enriching experience. You have a right to be treated with respect and a responsibility to treat others with respect. I want all students to be able to fully participate in this course. Let me know if you need an accommodation for any reason, so we can explore possible solutions together. If applicable, please follow the protocols established by UChicago Student Disability Services.
To highlight a few facets of these topics:
- You are encouraged to ask questions in class, and you should make your classmates feel comfortable asking questions in class. There is no shame in not knowing something or not understanding something. Questions make the course better for everyone.
- If you are pregnant or parenting, I am happy to work with you to figure out the best way to make the course work for you.
- Computer scientists and mathematicians can come from many diverse backgrounds, and nobody should experience prejudice or discrimination in this course.