In this class, we will learn how to use Lean to prove mathematical theorems.
The first class will include a short introduction to Lean. In the following few weeks, you will learn the ropes by following slides and solving exercises in groups, with the instructors going from group to group to help you understand the material. You will then divide into groups, each choosing a project (a certain theorem or a certain mathematical theory). The rest of the semester will be devoted to completing the project, with help from the instructors.
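To give a taste of what theorem proving in Lean looks like, here is a minimal illustrative sketch (ours, not taken from the course slides):

-- A proof by computation: both sides reduce to the same value.
example : 2 + 2 = 4 := rfl
-- A proof by constructing evidence for a logical statement.
example (p q : Prop) (hp : p) (hq : q) : q ∧ p := ⟨hq, hp⟩
-- A proof that invokes a decision procedure for linear arithmetic.
theorem add_comm' (m n : Nat) : m + n = n + m := by omega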
This course will introduce modern, state-of-the-art research on algorithms for massive data. We shall put an emphasis on works that sit on the border between theory and practice, and on algorithmic problems with applications in machine learning, complex networks, and data science.
Modern computational systems (such as LLMs, social networks, and e-commerce systems) are required to process and analyze huge amounts of highly structured data and to make split-second decisions that affect the lives of individuals and of society as a whole. The amount of data, its complex structure, and the importance of fast decision making together often require the use of heuristics and algorithms that are efficient in the real world, but whose theoretical understanding is currently lacking.
We shall discuss several algorithmic problems where there is a large (and increasing) gap between theory and practice, and study and propose “optimistic” tools from the beyond-worst-case-analysis literature to close this gap.
The course this year will focus on quantum algorithms and on complexity classes (quantum and classical). We will learn Shor’s polynomial-time factoring algorithm, as well as some highly popular quantum algorithms that might provide quantum advantage in the near future. We will also learn the complexity class QMA, which is similar to NP except that the verifier has a quantum computer and the prover can send quantum states as well as classical data.
Our goal is to reach an understanding of the potentially exponential advantage of quantum computing over classical computing, in the near future and in the far future.
There will be two home exercises and a final project (done individually or in pairs).
Registration is via Prof. Tal Mor talmo@cs.technion.ac.il.
This course is the natural follow-up to the basic algorithms course. We will consider basic topics and fundamental techniques in the theory of algorithms, such as randomization and algebraic tools. Possible topics include network flow, cuts in graphs, matchings, linear programming, approximation algorithms, and online algorithms.
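To illustrate the flavor of randomized and algebraic techniques (our sketch, not necessarily on the syllabus), Freivalds’ algorithm verifies a matrix product much faster than recomputing it:

import random

def freivalds(A, B, C, trials=20):
    """Randomized check whether A @ B == C for n x n matrices, in O(n^2)
    time per trial (vs. naive O(n^3) multiplication).  A wrong C is
    caught with probability >= 1 - 2**(-trials)."""
    n = len(A)
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        Bx = [sum(B[i][j] * x[j] for j in range(n)) for i in range(n)]
        ABx = [sum(A[i][j] * Bx[j] for j in range(n)) for i in range(n)]
        Cx = [sum(C[i][j] * x[j] for j in range(n)) for i in range(n)]
        if ABx != Cx:
            return False
    return True

A, B = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
print(freivalds(A, B, [[19, 22], [43, 50]]))  # True: C really equals A @ B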
The course will focus on online allocation problems in stochastic settings, exemplified by the following basic scenario:
Suppose you have an item you wish to sell, and impatient buyers arrive sequentially, with each arriving buyer making a take-it-or-leave-it offer for your item. Were you to know the future, you could sell to the highest bidder and get the optimal revenue from selling your item. Unfortunately, without knowing the future, and without any further assumption, you cannot even approximate the optimal revenue within any finite ratio. To overcome such pathological worst-case examples, economists (and computer scientists, and operations researchers) study such problems under probabilistic models: adversarial input revealed in random order, or stochastic input drawn from some known/unknown distribution.
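For intuition, here is a minimal simulation (our sketch, assuming i.i.d. Uniform[0,1] offers, an assumption not made above) of the classic single-threshold rule behind the prophet inequality:

import random

def simulate(n=10, trials=100_000):
    # Choose the threshold tau so that P(max offer >= tau) = 1/2, i.e.
    # tau = (1/2)**(1/n) for Uniform[0,1] offers.  The prophet inequality
    # guarantees this rule earns at least half the expected revenue of a
    # "prophet" who sees all offers in advance.
    tau = 0.5 ** (1 / n)
    alg = prophet = 0.0
    for _ in range(trials):
        offers = [random.random() for _ in range(n)]
        alg += next((v for v in offers if v >= tau), 0.0)  # first offer >= tau
        prophet += max(offers)
    print(f"threshold rule: {alg/trials:.3f}, prophet: {prophet/trials:.3f}")

simulate()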
The first few lectures will be given by the lecturer, who will give background on the area and on building and presenting effective lectures (see also https://www.davidwajc.com/advice for more tips). The majority of the seminar will be dedicated to students’ presentations of research papers in the area and in-class discussion. See more details here.
Submodular functions, which capture the property of diminishing returns, are ubiquitous in various disciplines, including combinatorics, graph theory, machine learning, economics, algorithmic game theory, and information theory. The family of submodular maximization and minimization problems is a prime example of a unified approach that captures both well-known classic problems, e.g., Max-Cut, Max-DiCut, and Generalized Assignment, and real-world applications in diverse settings, e.g., information gathering, image segmentation, viral marketing in social networks, and recommendation systems. This course deals with the algorithmic foundations of submodular optimization, focusing on basic problems and algorithmic techniques.
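As a small illustration (ours, not taken from the course materials), the greedy algorithm for maximum coverage, a canonical monotone submodular maximization problem, achieves a (1 - 1/e)-approximation:

def greedy_max_coverage(sets, k):
    """Pick k sets greedily by marginal gain.  Coverage is a monotone
    submodular function, so greedy is a (1 - 1/e)-approximation."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max(sets, key=lambda s: len(s - covered))  # largest marginal gain
        chosen.append(best)
        covered |= best
    return chosen, covered

sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
print(greedy_max_coverage(sets, 2))  # picks {1,2,3} then {4,5,6}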
The course deals with the design of algorithms for multi-processor systems and the analysis of their complexity; the study of basic problems in such systems; and lower bounds and impossibility results. Specific topics include: mutual exclusion and resource allocation, agreement problems (Byzantine generals’ problem, approximate agreement, etc.), clock synchronization and logical clocks, broadcast and multicast, and lock-free synchronization and concurrent data structures.
This course gives an introduction to distributed graph algorithms.
We will see various models of computation and the basic problems in this field.
We will study both algorithms and lower bounds.
This is an algorithmic/mathematical course.
There will be no final exam. The grade will be based on 3-4 home assignments (individual submissions) and a final project (submission in pairs).
Suppose we are interested in figuring out whether a given input satisfies a given property, but we are only allowed to read a small part of the input. Can we still solve the problem, at least approximately? In many cases the answer is positive: it is possible to distinguish, with high probability, between inputs satisfying the property and inputs which are far from every input satisfying it. Such algorithms are known as property testers.
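Here is a toy property tester (our illustration, not from the course materials) for the simplest possible property, being the all-zeros string:

import math
import random

def test_all_zeros(s, eps=0.1):
    """Accept if s is the all-zeros string; reject with probability
    >= 1 - e^{-2} if s differs from all-zeros in at least an eps fraction
    of positions.  Reads only O(1/eps) characters, independent of len(s)."""
    for _ in range(math.ceil(2 / eps)):
        if s[random.randrange(len(s))] != '0':
            return False   # found a witness; safe to reject
    return True            # probably all zeros, or close to it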
Meta-complexity refers to the computational complexity of problems that are themselves about computations and their complexity. Such problems include the Minimum Circuit Size Problem and the Time-Bounded Kolmogorov Complexity Problem, whose study originated in the 1950s and 1960s and predates the modern study of complexity theory.
Meta-complexity provides a unifying framework for a variety of central tasks in several areas of computer science, including computational complexity, cryptography, and learning theory, and there has been a recent explosion of works connecting these areas through the lens of meta-complexity. In this course, we will focus on these recent developments, with particular attention to the connections with cryptography.
The course will be taught in English, online, and will follow an unusual schedule.
See more info here.
Every week, one or two graduate students in TCS (writ large) will present their research topic.
Everyone is welcome!
This course studies structural aspects and algorithmic problems in modern social networks. The course involves both theoretical work and (lightweight) hands-on data analysis of real-world social networks data, using Python.
Detailed description: https://cs.technion.ac.il/courses/all/224/236020.pdf
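As a taste of the hands-on side (our sketch, using a random graph as a stand-in for real network data), typical analyses look like this:

import networkx as nx

# The same calls work on a graph loaded from an edge list of a real
# social network.
G = nx.gnp_random_graph(n=1000, p=0.01, seed=42)
print("average degree:", 2 * G.number_of_edges() / G.number_of_nodes())
print("average clustering coefficient:", nx.average_clustering(G))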
In classic complexity theory, coping with NP-hard optimization problems often translates to the following fundamental question: “What is the best approximate solution that we can get in polynomial time?” We will present basic techniques for the design and analysis of approximation algorithms.
This includes combinatorial techniques (such as local search and local ratio) and techniques based on linear programming. We will demonstrate the use of these techniques for solving packing and scheduling problems, as well as problems on graphs.
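As a small taste (our illustration), here is the classic combinatorial 2-approximation for vertex cover, which takes both endpoints of a greedily built maximal matching:

def vertex_cover_2approx(edges):
    """Every edge is covered, and any optimal cover must contain at least
    one endpoint of each matched edge, giving a factor-2 guarantee."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:  # edge not yet covered: match it
            cover |= {u, v}
    return cover

print(vertex_cover_2approx([(1, 2), (2, 3), (3, 4)]))  # {1, 2, 3, 4}; OPT is {2, 3}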
Course covering a wide range of topics in complexity theory, such as space complexity, the polynomial hierarchy, probabilistic computation, interactive proofs, circuit complexity, and more.
We will discuss three main topics:
– Schaefer’s theorem, which shows that any Boolean CSP is either in P or NP-complete.
– Hardness of approximation: we will find the inapproximability thresholds for MAX-3LIN and for MAX-CUT.
– Proof complexity: we will show that some CNFs, including random ones, are hard to refute in Resolution.
If time permits, we will discuss a few more minor topics.
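For context on the hardness thresholds above (our sketch, not from the course materials), the trivial randomized baseline for MAX-CUT cuts half the edges in expectation, which is essentially the benchmark the inapproximability results are measured against:

import random

def random_cut(n, edges, trials=1000):
    """Assign each vertex a uniformly random side; each edge is cut with
    probability 1/2, so the expected cut is half the edges."""
    best = 0
    for _ in range(trials):
        side = [random.randint(0, 1) for _ in range(n)]
        best = max(best, sum(side[u] != side[v] for u, v in edges))
    return best

print(random_cut(4, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]))  # finds the max cut, 4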
We will discuss the question of whether it is possible to design responsible algorithms that process sensitive data while avoiding the risk of exposing personal and sensitive information. We will address this question from a theoretical perspective, focusing on the concept of differential privacy, which offers a formal mathematical definition of algorithmic privacy. Course Content Overview:
• A systematic development of the definition of differential privacy and its basic properties.
• Design and analysis of differentially-private algorithms.
• Can every algorithmic task be solved privately? An analysis of the cost of differential privacy in terms of computational and data efficiency.
• Additional selected topics at the instructor’s discretion.
This is a theoretical course that demands mathematical maturity.
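As a small taste (our sketch, with illustrative parameter values), the Laplace mechanism releases a counting query under epsilon-differential privacy:

import random

def laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.1):
    """Release a count plus Laplace(sensitivity/epsilon) noise.  Adding
    or removing one person changes a counting query by at most 1, so
    this satisfies epsilon-differential privacy."""
    scale = sensitivity / epsilon
    # The difference of two i.i.d. exponentials is Laplace-distributed
    # (the random module has no Laplace sampler).
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

print(laplace_mechanism(1234))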
This course will introduce students to algorithmic foundations for handling large, high-dimensional datasets, sampling a range of topics and questions in this area.
Basic concepts: error correction, error detection and erasure correction. Linear error-correcting codes. Hamming codes. Introduction to finite fields. Reed-Solomon codes, BCH codes and alternant codes. Decoding algorithms. Bounds on the parameters of error-correcting codes.
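For a concrete taste (our sketch, not from the course materials), the [7,4,3] Hamming code encodes 4 data bits with 3 parity bits and corrects any single bit error:

def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit codeword with 3 parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]   # positions 1..7

def hamming74_correct(c):
    """Recompute the parities; the syndrome spells out the error position."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3            # 0 means no error
    if pos:
        c[pos - 1] ^= 1
    return c

word = hamming74_encode([1, 0, 1, 1])
word[2] ^= 1                              # flip one bit in transit
print(hamming74_correct(word))            # the error is repaired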
Probabilistic methods have become a cornerstone of combinatorial theory, while probabilistic algorithms have become central to computer science.
This course introduces an array of already-classical and more modern probabilistic methods, along with their applications to math and CS. The course is centered on proof techniques and problem solving. Among the covered topics: basic methods, concentration inequalities, martingales, the local lemma, information entropy.
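As a one-line taste of the basic (first-moment) method, here is Erdős’s classic lower bound on Ramsey numbers (our sketch, written in LaTeX):

% Color the edges of $K_n$ with two colors uniformly at random.
% The expected number of monochromatic copies of $K_k$ is
\[
  \binom{n}{k} \, 2^{1 - \binom{k}{2}},
\]
% and if this is less than $1$, some coloring has none, so the Ramsey
% number satisfies $R(k,k) > n$; taking $n = \lfloor 2^{k/2} \rfloor$
% gives $R(k,k) > 2^{k/2}$ for all $k \ge 3$.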
Lecture Sunday 14:30-16:30,
Tutorial – TBA.
The course will be given in Hebrew.
The seminar deals with distributed computing in modern computing systems, in the presence of different kinds of failures. Grades are based (mostly) on presenting in-depth a paper from the recent research literature.
The metric method is a powerful tool that has been used extensively in the last two decades in the design and analysis of algorithms.
This course will survey some of the basic techniques in the metric approach, as well as its applications to various topics such as: clustering and graph cuts, balanced graph partitioning, network routing, and online algorithms.
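As a flavor of the metric approach (our sketch of a related tool, not necessarily on the syllabus), a random projection in the spirit of the Johnson-Lindenstrauss lemma approximately preserves all pairwise distances:

import math
import random

def random_projection(points, d_out):
    """Project points in R^d onto d_out random Gaussian directions,
    scaled by 1/sqrt(d_out).  By the Johnson-Lindenstrauss lemma,
    d_out = O(log n / eps^2) preserves all pairwise distances up to a
    (1 +/- eps) factor with high probability."""
    d_in = len(points[0])
    R = [[random.gauss(0, 1) / math.sqrt(d_out) for _ in range(d_in)]
         for _ in range(d_out)]
    return [[sum(r[j] * p[j] for j in range(d_in)) for r in R]
            for p in points]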
Lecture: Monday 15:30-17:30 (Roy Schwartz)
Tutorial: Monday 13:30-14:30 (Dor Katzelnick)
The topic of this course is uncertainty in algorithms. Given a computational problem, the traditional design and analysis of algorithms assumes that the input is fully known in advance. However, this is not the case in many applications, including resource management and allocation, data structures, and game dynamics. In these scenarios, making decisions in the face of uncertainty is of major importance. Take, for example, managing a cache memory: when there is a page miss, the decision of which page to evict from the cache is made without knowing future page requests.
In the course we will go over advanced techniques developed in recent years for dealing with such settings, where information about the future is unavailable or limited. We will mainly focus on competitive analysis of online algorithms and online learning, and study surprising connections between them. We will see how techniques from linear programming and convex optimization are useful for developing online algorithms.
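For the paging example above, here is a sketch (ours) of the Least-Recently-Used rule; LRU is k-competitive, which is optimal among deterministic online paging algorithms:

def lru_faults(requests, k):
    """Least-Recently-Used paging with a cache of size k."""
    cache, faults = [], 0               # front of list = least recently used
    for page in requests:
        if page in cache:
            cache.remove(page)          # refresh recency
        else:
            faults += 1
            if len(cache) == k:
                cache.pop(0)            # evict the least recently used page
        cache.append(page)
    return faults

print(lru_faults([1, 2, 3, 1, 4, 2], k=3))  # 5 faults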
This course will introduce students to advanced algorithms for dynamic graphs. This active research area focuses on questions of the following flavor: “How quickly can we solve algorithmic problems on graphs that change slowly over time? In particular, how much faster than recomputing from scratch after every change in the graph?” This question is central to problems whose input changes over time (for example, in GPS applications, closures and openings of roads change the road network), but is also key to speeding up algorithms with static inputs (in much the same way that basic data structures can be used to speed up other classic algorithms). The course’s objective is to expose students to techniques and results for basic problems in the field (such as matching, shortest paths, and minimum spanning trees). As part of the course, the students will also be exposed to central tools in algorithm design and the theory of computer science more broadly that are relevant to the course topic, including linear programming and duality, randomized algorithms, and the multiplicative weights update method. The course evaluation will be based on homework and a final project, which includes reading a research paper, summarizing it, and simplifying and/or improving on it.
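For a first taste of dynamic graph data structures (our sketch), union-find handles the insertions-only special case of dynamic connectivity:

class UnionFind:
    """Incremental graph connectivity (edge insertions only): queries are
    answered in near-constant amortized time instead of re-running BFS
    from scratch after every change."""
    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, v):
        while self.parent[v] != v:
            self.parent[v] = self.parent[self.parent[v]]  # path halving
            v = self.parent[v]
        return v

    def add_edge(self, u, v):
        self.parent[self.find(u)] = self.find(v)

uf = UnionFind(5)
uf.add_edge(0, 1)
uf.add_edge(3, 4)
print(uf.find(0) == uf.find(1), uf.find(0) == uf.find(3))  # True False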
Basic course on the theory of random graphs, focusing on the Erdős–Rényi model. Based on the nice Introduction to Random Graphs by Frieze and Karoński.
Grading is based on homework and a final project.
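As a small illustration (our simulation sketch, not from the book), G(n,p) exhibits a sharp connectivity threshold at p = log(n)/n:

import math
import random

def gnp_connected(n, p):
    """Sample an Erdős–Rényi G(n,p) graph and check connectivity by DFS."""
    adj = [[] for _ in range(n)]
    for u in range(n):
        for v in range(u + 1, n):
            if random.random() < p:
                adj[u].append(v)
                adj[v].append(u)
    seen, stack = {0}, [0]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == n

n = 500
for c in (0.5, 2.0):   # below / above the threshold p = log(n)/n
    p = c * math.log(n) / n
    print(c, sum(gnp_connected(n, p) for _ in range(10)) / 10)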
Proof-systems are at the very heart of theoretical computer science. For example, the P vs. NP question asks whether every statement that someone can prove to you is one you could have actually solved by yourself.
In this course we will discuss advanced proof-systems such as:
* Interactive proofs, which are a natural generalization of NP proofs in which the verifier can interact with the prover.
* Zero-knowledge proofs, which allow one to prove the correctness of a statement without revealing anything else.
* Probabilistically Checkable Proofs (PCPs), which are proofs that can be verified by reading only a small number of bits.
* Doubly efficient proofs, which allow one to delegate expensive computations to an untrusted server.
Requirements: roughly 3 homework assignments and an exam.
Communication complexity studies the number of bits that must be exchanged in order to perform a computation whose input is distributed among two or more parties. This is studied under a spectrum of computational models: deterministic, non-deterministic, randomized, etc.
Beyond the immediate implications that such questions have for understanding distributed computations, we will present various surprising applications to seemingly unrelated questions, mostly in complexity theory.
The course will use methods from Combinatorics, Probability Theory and Algebra.
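As a classic example (our sketch, not necessarily from the lectures), the randomized fingerprinting protocol for EQUALITY exchanges O(log n) bits, whereas any deterministic protocol needs n bits:

import random

def equality_protocol(x, y, n_bits, trials=3):
    """Alice sends (p, x mod p) for a random small prime p; Bob compares
    with y mod p.  When x != y, each trial errs with probability
    O(log n / n), since x - y has at most n_bits prime factors."""
    primes = [q for q in range(2, 10 * n_bits ** 2)
              if all(q % d for d in range(2, int(q ** 0.5) + 1))]
    for _ in range(trials):
        p = random.choice(primes)
        if x % p != y % p:
            return False      # definitely different
    return True               # equal, up to an unlucky fingerprint collision

print(equality_protocol(10**30, 10**30, 100))      # True
print(equality_protocol(10**30, 10**30 + 1, 100))  # False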
Prerequisites:
Formally: “Complexity Theory” (236313) and a course in probability
(104034 or 094412).
Students with strong theory interests and the course “Computability Theory” (236343) can contact the lecturer for participation.
An introductory course to the topic of Algorithmic Game Theory, the study of algorithms coupled with incentives. In light of the growing influence of algorithms on the economy and society, algorithm design must take into account the interaction with strategic players on top of traditional considerations like runtime complexity. As a prime example, think of eBay auctions, which solicit bids from self-interested buyers.
The first half of the course will be dedicated to mechanism design, the science of designing such online auctions and markets. In the second half we will discuss the fundamental notion of equilibrium, including reaching it through centralized computation and decentralized learning.
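As a minimal taste of mechanism design (our sketch), the sealed-bid second-price (Vickrey) auction makes truthful bidding a dominant strategy:

def second_price_auction(bids):
    """The highest bidder wins but pays the second-highest bid.  Your bid
    decides only whether you win, not what you pay, so bidding your true
    value is a dominant strategy."""
    ranked = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
    winner = ranked[0]
    price = bids[ranked[1]] if len(bids) > 1 else 0
    return winner, price

print(second_price_auction([10, 7, 3]))  # (0, 7): bidder 0 wins and pays 7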
Time: Mondays 10:30-12:30 (lecture) + Mondays 12:30-13:30 (recitation by Konstantin Zabarnyi)
Basic topics in theoretical cryptography.
Machine learning often involves and affects people – a.k.a. “strategic agents” – who have interests, preferences and rights. This seminar covers recent research in the interdisciplinary questions that arise from this combination. Main topics include: Economic mechanisms and learning; strategic classification; incentivizing exploration; fairness in automatic decision making; and more.
Prerequisites: Algorithms 1
Useful but not required: Probability Theory, Learning Theory, Algorithmic Game Theory
The main topics of the course are Gödel’s incompleteness theorems (including an axiomatic system for arithmetic, basics of recursion theory and Church’s thesis), set theory (axiomatic and combinatorial foundations of set theory as a basis for math), and modal logic (Kripke models, completeness theorems and connection to computer science).
For more information, please contact Professor Michael Kaminski.
Zero-knowledge proofs enable one to prove the correctness of a computational statement without revealing any additional information. Zero-knowledge proofs have had a fundamental impact on cryptography and complexity theory, and have very recently started to be deployed in practice as well.
This course will cover various advanced aspects of zero-knowledge proofs including:
1. Definitions, basic constructions and ZKP for NP.
2. Characterization of statistical zero-knowledge.
3. Non-interactive zero-knowledge.
4. A study of various efficiency aspects of zero-knowledge proofs.
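As a concrete taste (our toy sketch with tiny, insecure parameters, not the course’s construction), the Schnorr protocol is an honest-verifier zero-knowledge proof of knowledge of a discrete logarithm:

import random

# Prove knowledge of x with y = g^x mod p, where g = 2 has prime order
# q = 11 in the multiplicative group mod p = 23.
p, q, g = 23, 11, 2
x = 7                      # prover's secret
y = pow(g, x, p)           # public statement

k = random.randrange(q)    # prover commits
t = pow(g, k, p)
c = random.randrange(q)    # verifier's random challenge
s = (k + c * x) % q        # prover's response

assert pow(g, s, p) == (t * pow(y, c, p)) % p   # verifier accepts
# Zero knowledge: a simulator can pick s and c first and set
# t = g^s * y^(-c) mod p, producing identically distributed transcripts.
print("accepted")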
The course will cover advanced algorithmic techniques developed for solving various (discrete) optimization problems. We will discuss linear programming, semidefinite programming, convex programming, and more. We will see these techniques applied in a variety of settings, e.g., developing approximation algorithms for NP-hard problems and optimally coloring perfect graphs. Connections to learning theory will also be discussed.
Requirements: grades will be determined based on homework assignments.
We will study basic mathematical models in machine learning, including supervised learning, online learning, interactive learning, and distribution learning.
We will focus on combinatorial characterizations of the complexity of learning.
The course requires mathematical maturity and will involve discussions about open problems and active research directions.
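For a taste of such combinatorial characterizations (our brute-force sketch, sensible only for tiny finite classes), here is the VC dimension, which characterizes PAC learnability:

from itertools import combinations

def vc_dimension(hypotheses, domain):
    """Largest d such that some d-point subset of the domain is shattered:
    every one of its 2^d labelings is realized by some hypothesis."""
    patterns = lambda S: {tuple(h(x) for x in S) for h in hypotheses}
    d = 0
    for k in range(1, len(domain) + 1):
        if any(len(patterns(S)) == 2 ** k for S in combinations(domain, k)):
            d = k
    return d

# Threshold functions on {0,...,9} shatter any single point but no pair.
thresholds = [lambda x, t=t: int(x >= t) for t in range(11)]
print(vc_dimension(thresholds, range(10)))   # 1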
For students who attended the class with Shay last semester:
This semester we will cover different topics (in particular we will begin with the online learning model).
Also, notice the course number is different from that of last semester, so one can get credit for both classes.
Definability deals with the expressive power of description languages which are used in program specification, databases and axiomatic mathematics. These languages include Regular Expressions, First Order Logic, Relational Algebra, Second Order Logic and various Temporal and Modal Logics.
Computability deals with computations and the resources needed to execute them in various models of computations. These include Finite Automata, Turing Machines, Register Machines, and the like.
The course deals with the interplay between definability in various descriptive languages and computability with various computing devices. It brings together theoretical aspects of Database Theory, Computability Theory, Complexity Theory and Logic.
The course can be given in English if required. The slides are in English.
The course will focus on topics related to the origins of life and in particular the origins of the genetic code.
As we all know, when we see a code, someone wrote it; with one exception: the genetic code.
Relying on the notion of “autocatalytic sets”, we shall explore this topic.
The use of mathematical programming is of paramount importance to the design and analysis of algorithms in general, and approximation algorithms in particular. While linear programming (LP) is the prototypical type of mathematical programming that is utilized, non-linear mathematical programming is also widely used. This course will focus mainly on semidefinite programming (SDP) and survey its basic uses in the design and analysis of approximation algorithms, as well as its applications to various topics such as: graph coloring, satisfiability, graph cuts, and clustering.
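As a taste (our sketch), here is the random-hyperplane rounding step of the Goemans-Williamson MAX-CUT algorithm; solving the SDP itself, which produces the unit vectors below, is omitted:

import random

def hyperplane_round(vectors):
    """Round unit vectors (e.g., an SDP solution for MAX-CUT) to a cut by
    the sign of their inner product with a random Gaussian direction --
    the rounding step behind the 0.878-approximation."""
    d = len(vectors[0])
    r = [random.gauss(0, 1) for _ in range(d)]
    return [int(sum(v[i] * r[i] for i in range(d)) >= 0) for v in vectors]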
Lecture: Monday 15:30-17:30 (Roy Schwartz)
Tutorial: Monday 13:30-14:30 (Dor Katzelnick)
Boolean function analysis is a set of tools coming from probability theory and functional analysis, with applications to theoretical computer science and combinatorics. The course is based, in large part, on the excellent monograph Analysis of Boolean functions by Ryan O’Donnell.
Grading is based on homework and a final project.
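For a flavor of the subject (our sketch, not taken from the monograph), here is a brute-force computation of the Fourier coefficients of a Boolean function:

from itertools import combinations, product
from math import prod

def fourier_coefficients(f, n):
    """All Fourier coefficients of f : {-1,1}^n -> R over uniform inputs:
    f_hat(S) = E_x[ f(x) * prod_{i in S} x_i ]."""
    points = list(product([-1, 1], repeat=n))
    return {S: sum(f(x) * prod(x[i] for i in S) for x in points) / len(points)
            for k in range(n + 1) for S in combinations(range(n), k)}

maj3 = lambda x: 1 if sum(x) > 0 else -1
print(fourier_coefficients(maj3, 3))
# Majority on 3 bits = 0.5*x0 + 0.5*x1 + 0.5*x2 - 0.5*x0*x1*x2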
We will study basic mathematical models in machine learning, including supervised learning, online learning, interactive learning, and distribution learning.
We will focus on combinatorial characterizations of the complexity of learning.
The course requires mathematical maturity and will involve discussions about open problems and active research directions.
Prerequisites: Combinatorics and Probability Theory.
This is a seminar course on combinatorial optimization. The course will focus on two different methodologies: local ratio and primal-dual.
The first three lectures will serve as an introduction to approximation algorithms, via linear programming.
In the fourth lecture we will explore primal-dual techniques, and in the fifth the competing methodology, local ratio.
During the rest of the semester we will encounter many different applications to various optimization problems, such as vertex cover, set cover, Steiner trees, cycle cover, assignment problem, resource allocation, and more.
Property testing algorithms are used to decide if some mathematical object (such as a graph or a Boolean function) has a “global” property, or is “far” from having this property, using only a small number of “local” queries to the object.
Topics in pseudorandomness.
Discovery-based class, on a variety of topics. Students work in groups on handouts, with online help from the lecturer.
Grading based on final project.
Algorithms involving numbers, polynomials and matrices: FFT, fast integer multiplication, fast matrix multiplication, primality testing, integer factorization.
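As an appetizer (our sketch), Karatsuba’s algorithm multiplies integers with three half-size recursive products instead of four:

def karatsuba(x, y):
    """Multiply integers in O(n^log2(3)) ~ O(n^1.585) bit operations
    instead of the schoolbook O(n^2)."""
    if x < 10 or y < 10:
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    xh, xl = x >> m, x & ((1 << m) - 1)
    yh, yl = y >> m, y & ((1 << m) - 1)
    a = karatsuba(xh, yh)
    b = karatsuba(xl, yl)
    c = karatsuba(xh + xl, yh + yl) - a - b   # both cross terms via one product
    return (a << (2 * m)) + (c << m) + b

assert karatsuba(1234567, 7654321) == 1234567 * 7654321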