Publications
Book chapters, surveys, thesis

 Prior-Independent Auctions.
 Book chapter forthcoming in Beyond Worst-Case Analysis, Cambridge University Press, 2019.
[Preprint] 
 Approximately Optimal Mechanism Design. With Tim Roughgarden.
 Survey in Annual Review of Economics 11:355–381, 2019.
[Abstract] [arXiv]
The field of optimal mechanism design enjoys a beautiful and well-developed theory, as well as several killer applications. Rules of thumb produced by the field influence everything from how governments sell wireless spectrum licenses to how the major search engines auction off online advertising. There are, however, some basic problems for which the traditional optimal mechanism design approach is ill suited—either because it makes overly strong assumptions or because it advocates overly complex designs. This article reviews several common issues with optimal mechanisms, including exorbitant communication, computation, and informational requirements; it also presents several examples demonstrating that relaxing the goal to designing an approximately optimal mechanism allows us to reason about fundamental questions that seem out of reach of the traditional theory.

 Robust Mechanism Design: Information and Computation.
 PhD thesis, 2015.
 ⭐ Doctoral Dissertation Award from ACM SIGecom.
[Abstract] [Dissertation]
A fundamental problem in economics is how to allocate precious and scarce resources, such as radio spectrum or the attention of online consumers, to the benefit of society. The vibrant research area of market design, recognized by the 2012 Nobel Prize in economics, aims to develop an engineering science of allocation mechanisms based on sound theoretical foundations. Two central assumptions are at the heart of much of the classic theory on resource allocation: the common knowledge and substitutability assumptions. Relaxing these is a prerequisite for many real-life applications, but involves significant informational and computational challenges. The starting point of this dissertation is that the computational paradigm offers an ideal toolbox for overcoming these challenges in order to achieve a robust and applicable theory of market design. We use tools and techniques from combinatorial optimization, randomized algorithms and computational complexity to make contributions on both the informational and computational fronts:
1. We design simple mechanisms for maximizing seller revenue that do not rely on common knowledge of buyers’ willingness to pay. First we show that across many different markets – including notoriously challenging ones in which the goods are heterogeneous – the optimal revenue benchmark can be surpassed or approximated by adding buyers or limiting supplies, and then applying the standard Vickrey (second-price) mechanism. We also show how, by removing the common knowledge assumption, the classic theory of revenue maximization expands to encompass the realistic but complex case in which buyers are interdependent in their willingness to pay.
2. We prove positive and negative results for maximizing social welfare without substitutability, i.e., without the convexity property known to drive economic efficiency. On the positive side, we design natural greedy mechanisms for two-sided markets with strong incentive properties, whose welfare performance depends on the market’s “distance” from substitutability. On the negative side, we show how computational challenges related to complementarity lead to the economic failure of competitive markets, in the sense that there do not exist simple prices that guide such a market to an efficient allocation.
These results carry implications for the practice of market design, both for revenue-maximizing sellers such as Internet companies running online auctions, and for welfare-maximizing policy makers such as governments running spectrum auctions.
Research papers

 Multiagent Evaluation Mechanisms. With Tal Alon, Magdalen R. C. Dobson, Ariel Procaccia and Jamie Tucker-Foltz.
 Forthcoming in AAAI Conference on Artificial Intelligence (AAAI), 2020.
[Preprint]
Oral presentation (top 5%).

 The Complexity of Contracts. With Paul Duetting and Tim Roughgarden.
 Forthcoming in ACM-SIAM Symposium on Discrete Algorithms (SODA), 2020.
[Preprint] 
 Simple versus Optimal Contracts. With Paul Duetting and Tim Roughgarden.
 In ACM Conference on Economics and Computation (EC), 2019.
[arXiv] [Slides] 
 Settling the Communication Complexity of Combinatorial Auctions with Two Subadditive Buyers. With Tomer Ezra, Michal Feldman, Eric Neyman and S. Matthew Weinberg.
 In IEEE Symposium on Foundations of Computer Science (FOCS), 2019.
[arXiv] [Poster] 
 Fair Allocation through Competitive Equilibrium from Generic Incomes. With Moshe Babaioff and Noam Nisan.
 In ACM Conference on Fairness, Accountability and Transparency (FAT*), 2019.
[Preprint] [Slides]
See also our complementary results here: [arXiv] 
 When Are Welfare Guarantees Robust? With Tim Roughgarden and Jan Vondrák.
 In International Workshop on Approximation Algorithms (APPROX), 2017.
[arXiv] [Slides]
A preliminary version appeared in Workshop on Algorithmic Game Theory and Data Science at GAMES/EC 2016. 
 The Competition Complexity of Auctions: A Bulow-Klemperer Result for Multi-Dimensional Bidders. With Alon Eden, Michal Feldman, Ophir Friedler and S. Matthew Weinberg.
 In ACM Conference on Economics and Computation (EC), 2017.
[Abstract] [arXiv] [Slides]
A seminal result of Bulow and Klemperer [1989] demonstrates the power of competition for extracting revenue: when selling a single item to n bidders whose values are drawn i.i.d. from a regular distribution, the simple welfare-maximizing VCG mechanism (in this case, a second-price auction) with one additional bidder extracts at least as much revenue in expectation as the optimal mechanism. The beauty of this theorem stems from the fact that VCG is a prior-independent mechanism, where the seller possesses no information about the distribution, and yet, by recruiting one additional bidder it performs better than any prior-dependent mechanism tailored exactly to the distribution at hand (without the additional bidder). In this work, we establish the first full Bulow-Klemperer results in multi-dimensional environments, proving that by recruiting additional bidders, the revenue of the VCG mechanism surpasses that of the optimal (possibly randomized, Bayesian incentive compatible) mechanism. For a given environment with i.i.d. bidders, we term the number of additional bidders needed to achieve this guarantee the environment's competition complexity. Using the recent duality-based framework of Cai et al. [2016] for reasoning about optimal revenue, we show that the competition complexity of n bidders with additive valuations over m independent, regular items is at most n + 2m − 2 and at least log(m). We extend our results to bidders with additive valuations subject to downward-closed constraints, showing that these significantly more general valuations increase the competition complexity by at most an additive m − 1 factor. We further improve this bound for the special case of matroid constraints, and provide additional extensions as well.

 A Simple and Approximately Optimal Mechanism for a Buyer with Complements. With Alon Eden, Michal Feldman, Ophir Friedler and S. Matthew Weinberg.
 In ACM Conference on Economics and Computation (EC), 2017.
[Abstract] [arXiv] [Ophir's Video]
We consider a revenue-maximizing seller with m heterogeneous items and a single buyer whose valuation v for the items may exhibit both substitutes (i.e., for some S and T, v(S + T) < v(S) + v(T)) and complements (i.e., for some S and T, v(S + T) > v(S) + v(T)). We show that the mechanism first proposed by Babaioff et al. [2014] – the better of selling the items separately and bundling them together – guarantees an O(d) approximation to the optimal revenue, where d is a measure of the degree of complementarity. Note that this is the first approximately optimal mechanism for a buyer whose valuation exhibits any kind of complementarity, thus extending the work of Rubinstein and Weinberg [2015], who prove that the same simple mechanism achieves a constant factor approximation for subadditive valuations, the most general class of complement-free valuations. Our proof is enabled by the recent duality framework developed in Cai et al. [2016], which we use to obtain a bound on the optimal revenue in this setting. Our main technical contributions are specialized to handle the intricacies of settings with complements, and include an algorithm for partitioning edges in a hypergraph. Even nailing down the right model and notion of "degree of complementarity" to obtain meaningful results is of interest, as the natural extensions of previous definitions provably fail.

 Approximate Modularity Revisited. With Uriel Feige and Michal Feldman.
 In ACM Symposium on the Theory of Computing (STOC), 2017.
[Abstract] [arXiv] [Slides] [Video]
⭐ Invited to Highlights of Algorithms (HALG), 2018.
Set functions with convenient properties (such as submodularity) appear in application areas of current interest, such as algorithmic game theory, and allow for improved optimization algorithms. It is natural to ask (e.g., in the context of data-driven optimization) how robust such properties are, and whether small deviations from them can be tolerated. We consider two such questions in the important special case of linear set functions. One question that we address is whether any set function that approximately satisfies the modularity equation (linear functions satisfy the modularity equation exactly) is close to a linear function. The answer to this is positive (in a precise formal sense) as shown by Kalton and Roberts [1983] (and further improved by Bondarenko, Prymak, and Radchenko [2013]). We revisit their proof idea that is based on expander graphs, and provide significantly stronger upper bounds by combining it with new techniques. Furthermore, we provide improved lower bounds for this problem. Another question that we address is that of how to learn a linear function h that is close to an approximately linear function f, while querying the value of f on only a small number of sets. We present a deterministic algorithm that makes only linearly many (in the number of items) non-adaptive queries, thereby improving over a previous algorithm of Chierichetti, Das, Dasgupta and Kumar [2015] that is randomized and makes more than a quadratic number of queries. Our learning algorithm is based on a Hadamard transform.

 Competitive Equilibria with Indivisible Goods and Generic Budgets. With Moshe Babaioff and Noam Nisan.
 In Workshop on Matching under Preferences (MATCHUP), Microsoft Research New England, 2017.
[Abstract] [arXiv] [Slides]
We study competitive equilibria in the fundamental Fisher market model, in which the standard assumption of divisible goods is replaced with an assumption of indivisible ones. Competitive equilibria fail to exist in one of the simplest possible markets: two players with equal budgets and a single indivisible good. However, this turns out to be a knife-edge instance – an equilibrium exists once budgets are not precisely equal (offering, in effect, a simple tie-breaking rule between the two players). Is nonexistence of equilibria a knife-edge phenomenon in general, i.e., in more complex markets with multiple different goods and general player preferences? The starting point for our theoretical analysis is anecdotal evidence, gleaned from computer simulations and real-life data, that a competitive equilibrium exists when player budgets are "generic". We prove several existence results, focusing in particular on the case of additive preferences. We also relate competitive equilibria to notions of approximate fairness, which are appropriate for allocation of indivisible items among players with unequal entitlements.

 Oblivious Rounding and the Integrality Gap. With Uriel Feige and Michal Feldman.
 In International Workshop on Approximation Algorithms (APPROX), 2016.
[Abstract] [Full version] [Slides]
The following paradigm is often used for handling NP-hard combinatorial optimization problems. One first formulates the problem as an integer program, then one relaxes it to a linear program (LP, or more generally, a convex program), then one solves the LP relaxation in polynomial time, and finally one rounds the optimal LP solution, obtaining a feasible solution to the original problem. Many of the commonly used rounding schemes (such as randomized rounding, threshold rounding and others) are oblivious in the sense that the rounding is performed based on the LP solution alone, disregarding the objective function. The goal of our work is to better understand in which cases oblivious rounding suffices in order to obtain approximation ratios that match the integrality gap of the underlying LP. Our study is information-theoretic – the rounding is restricted to be oblivious but not restricted to run in polynomial time. In this information-theoretic setting we characterize the approximation ratio achievable by oblivious rounding. It turns out to equal the integrality gap of the underlying LP on a problem that is the closure of the original combinatorial optimization problem. We apply our findings to the study of the approximation ratios obtainable by oblivious rounding for the maximum welfare problem, showing that when valuation functions are submodular oblivious rounding can match the integrality gap of the configuration LP (though we do not know what this integrality gap is), but when valuation functions are gross substitutes oblivious rounding cannot match the integrality gap (which is 1).

 Why Prices Need Algorithms. With Tim Roughgarden.
 ⭐ Best student paper award in ACM Conference on Economics and Computation (EC), 2015.
⭐ Invited plenary talk in STOC Theory Fest, 2017.
[Abstract] [Full version] [IJCAI 2016] [SIGecom 2015]
Understanding when equilibria are guaranteed to exist is a central theme in economic theory, seemingly unrelated to computation. This paper shows that the existence of pricing equilibria is inextricably connected to the computational complexity of related optimization problems: demand oracles, revenue maximization, and welfare maximization. This relationship implies, under suitable complexity assumptions, a host of impossibility results. We also suggest a complexity-theoretic explanation for the lack of useful extensions of the Walrasian equilibrium concept: such extensions seem to require the invention of novel polynomial-time algorithms for welfare maximization.

 Welfare and Revenue Guarantees for Competitive Bundling Equilibrium. With Shahar Dobzinski, Michal Feldman and Omri Weinstein.
 In Conference on Web and Internet Economics (WINE), 2015.
[Abstract] [Full version]
Competitive equilibrium, the central equilibrium notion in markets with indivisible goods, is based on pricing each good such that the demand for goods equals their supply and the market clears. This equilibrium notion is not guaranteed to exist beyond the narrow case of substitute goods, might result in zero revenue even when consumers value the goods highly, and overlooks the widespread practice of pricing bundles rather than individual goods. Alternative equilibrium notions proposed to address these shortcomings have either made a strong assumption on the ability to withhold supply in equilibrium, or have allowed an exponential number of prices. In this paper we study the notion of competitive bundling equilibrium – a competitive equilibrium over the market induced by partitioning the goods into bundles. Such an equilibrium is guaranteed to exist, is succinct, and satisfies the fundamental economic condition of market clearance. We establish positive welfare and revenue guarantees for this solution concept: For welfare we show that in markets with homogeneous goods, there always exists a competitive bundling equilibrium that achieves a logarithmic fraction of the optimal welfare. We also extend this result to establish nontrivial welfare guarantees for markets with heterogeneous goods. For revenue we show that in a natural class of markets for which competitive equilibrium does not guarantee positive revenue, there always exists a competitive bundling equilibrium that extracts as revenue a logarithmic fraction of the optimal welfare. Both results are tight.

 Modularity and Greed in Double Auctions. With Paul Duetting and Tim Roughgarden.
 In ACM Conference on Economics and Computation (EC), 2014.
[Abstract] [Full version]
Designing double auctions is a complex problem, especially when there are restrictions on the sets of buyers and sellers that may trade with one another. The goal of this paper is to develop a modular approach to the design of double auctions, by relating it to the exhaustively studied problem of designing one-sided mechanisms with a single seller (or, alternatively, a single buyer). We consider several desirable properties of a double auction: feasibility, dominant-strategy incentive compatibility, the still stronger incentive constraints offered by a deferred-acceptance implementation, exact and approximate welfare maximization, and budget balance. For each of these properties, we identify sufficient conditions on two one-sided algorithms – one for ranking the buyers, one for ranking the sellers – and on a method for their composition into trading pairs, which guarantee the desired property of the double auction. Our framework also offers new insights into classic double auction designs, such as the VCG and McAfee auctions with unit-demand buyers and unit-supply sellers.

 Optimal and Robust Mechanism Design with Interdependent Values. With Tim Roughgarden.
 In ACM Conference on Economics and Computation (EC), 2013.
[Abstract] [Full version]
We study interdependent value settings [Milgrom and Weber, 1982], and extend several fundamental results from the well-studied independent private values model to these settings. For revenue-optimal mechanism design, we give conditions under which Myerson’s virtual value-based mechanism remains optimal with interdependent values. One of these conditions is robustness of the truthfulness and individual rationality guarantees, in the sense that they are required to hold ex post. We then consider an even more robust class of mechanisms called “prior independent” (a.k.a. “detail free”), and show that by simply using one of the bidders to set a reserve price, it is possible to extract near-optimal revenue in an interdependent values setting. This shows that a considerable level of robustness is achievable for interdependent values in single-parameter environments.

 Prediction and Welfare in Ad Auctions. With Mukund Sundararajan.
 In International Symposium on Algorithmic Game Theory (SAGT), 2014.
A preliminary version appeared in Ad Auctions Workshop (AAW), 2013.
[Abstract] [Full version]
We study how standard auction objectives in sponsored search markets are affected by refinement in the prediction of ad relevance (click-through rates). As the prediction algorithm takes more features into account, its predictions become more refined; a natural question is whether this is desirable from the perspective of auction objectives. Our focus is on mechanisms that optimize for a convex combination of economic efficiency and revenue, and our starting point is the observation that the objective of such a mechanism can only improve with refined prediction, making refinement in the best interest of the search engine. We demonstrate that the impact of refinement on market efficiency is not always positive; nevertheless, we are able to identify natural – and to some extent necessary – conditions under which refinement is guaranteed to also improve economic efficiency. Our main technical contribution is in explaining how refinement changes the ranking of advertisers by value (efficiency-optimal ranking), moving it either towards or away from their ranking by virtual value (revenue-optimal ranking). These results are closely related to the literature on signaling in auctions.

 Supply-Limiting Mechanisms. With Tim Roughgarden and Qiqi Yan.
 In ACM Conference on Economics and Computation (EC), 2012.
[Abstract] [Full version]
Most results in revenue-maximizing auction design hinge on "getting the price right" – offering goods to bidders at a price low enough to encourage a sale, but high enough to garner nontrivial revenue. Getting the price right can be hard work, especially when the seller has little or no a priori information about bidders’ valuations. A simple alternative approach is to "let the market do the work", and have prices emerge from competition for scarce goods. The simplest-imaginable implementation of this idea is the following: first, if necessary, impose an artificial limit on the number of goods that can be sold; second, run the welfare-maximizing VCG mechanism subject to this limit. We prove that such "supply-limiting mechanisms" achieve near-optimal expected revenue in a range of single- and multi-parameter Bayesian settings. Indeed, despite their simplicity, we prove that they essentially match the state-of-the-art in prior-independent mechanism design.

 Ad Auctions with Data. With Hu Fu, Patrick Jordan, Mohammad Mahdian, Uri Nadav and Sergei Vassilvitskii.
 In International Symposium on Algorithmic Game Theory (SAGT), 2012.
Preliminary versions appeared in Workshop on the Economics of Networks (NetEcon), 2012 and in Ad Auctions Workshop (AAW), 2012.
[Abstract] [Full version] [Patent App.]
The holy grail of online advertising is to target users with ads matched to their needs with such precision that the users respond to the ads, thereby increasing both advertisers’ and users’ value. The current approach to this challenge utilizes information about the users: their gender, their location, the websites they have visited before, and so on. Incorporating this data in ad auctions poses an economic challenge: can this be done in a way that the auctioneer’s revenue does not decrease (at least on average)? This is the problem we study in this paper. Our main result is that in Myerson’s optimal mechanism, for a general model of data in auctions, additional data leads to additional expected revenue. In the context of ad auctions we show that for the simple and common mechanisms, namely second-price auctions with reserve prices, there are instances in which additional data decreases the expected revenue, but this decrease is by at most a small constant factor under a standard regularity assumption.

 Vertex Sparsifiers: New Results from Old Techniques. With Matthias Englert, Anupam Gupta, Robert Krauthgamer, Harald Raecke and Kunal Talwar.
 In International Workshop on Approximation Algorithms (APPROX), 2010.
[Full version] 
 A Direct Reduction from k-Player to 2-Player Approximate Nash Equilibrium. With Uriel Feige.
 In International Symposium on Algorithmic Game Theory (SAGT), 2010.
[Full version]
Journal papers

 A Simple and Approximately Optimal Mechanism for a Buyer with Complements. With Alon Eden, Michal Feldman, Ophir Friedler and S. Matthew Weinberg.
 Minor revision, Operations Research (OR), 2019.

 Competitive Equilibria with Indivisible Goods and Generic Budgets. With Moshe Babaioff and Noam Nisan.
 Minor revision, Math. of Operations Research (MOR), 2019.
[Abstract] [Preprint]
We study the existence and fairness properties of competitive equilibria in the fundamental Fisher market model, in which players have budgets of artificial currency ("money"), which they do not value. In our model, the standard assumption of divisible goods is replaced with an assumption of indivisibility. A competitive equilibrium fails to exist in one of the simplest possible such markets: two players with equal budgets and a single indivisible good. However, this turns out to be a knife-edge instance – an equilibrium does exist once budgets are made generic by slight perturbation. Is nonexistence of an equilibrium a knife-edge phenomenon more generally? I.e., do generic budgets guarantee equilibrium existence in more complex markets, with multiple different goods, more general player preferences, or far-from-equal budgets? Focusing on the class of additive preferences, we prove several existence results for two players with generic budgets. On the flip side, we demonstrate nonexistence for general preferences, despite generic budgets. We also study fairness properties of competitive equilibria, suggesting new notions of fair allocation among players with different entitlements to the goods being allocated.

 Approximate Modularity Revisited. With Uriel Feige and Michal Feldman.
 Forthcoming in SIAM Journal on Computing (SICOMP), 2019.
[Preprint] 
 Robust Auctions for Revenue via Enhanced Competition. With Tim Roughgarden and Qiqi Yan.
 Forthcoming in Operations Research (OR), 2019.
[Abstract] [Preprint]
Most results in revenue-maximizing mechanism design hinge on "getting the price right" – selling goods to bidders at prices low enough to encourage a sale, but high enough to garner nontrivial revenue. This approach is difficult to implement when the seller has little or no a priori information about bidder valuations, or when the setting is sufficiently complex, such as matching markets with heterogeneous goods. In this paper we apply a robust approach to designing auctions for revenue. Instead of relying on prior knowledge regarding bidder valuations, we "let the market do the work" and let prices emerge from competition for scarce goods.
We analyze the revenue guarantees of one of the simplest imaginable implementations of this idea: first, enhance competition in the market by increasing demand (or alternatively by limiting supply); second, run a standard second-price (Vickrey) auction. In their renowned work, Bulow and Klemperer (1996) apply this method to markets with single goods. As our main result, we give the first application beyond single-parameter settings, proving that simultaneously for many valuation distributions this method achieves expected revenue at least as good as the optimal revenue in the original market.
Our robust and simple approach provides a handle on the elusive optimal revenue in multi-item matching markets, and shows when the use of welfare-maximizing Vickrey auctions is justified even if revenue is a priority. By establishing quantitative tradeoffs, our work provides guidelines for a seller in choosing between two different revenue-extracting strategies: sophisticated pricing based on market research, or advertising to draw additional bidders.
 Oblivious Rounding and the Integrality Gap. With Uriel Feige and Michal Feldman.
 Forthcoming in Theory of Computing (TOC), Special Issue on APPROX’16 (by invitation), 2019.
[Abstract] [Preprint]
The following paradigm is often used for handling NP-hard combinatorial optimization problems. One first formulates the problem as an integer program, then one relaxes it to a linear program (LP, or more generally, a convex program), then one solves the LP relaxation in polynomial time, and finally one rounds the optimal LP solution, obtaining a feasible solution to the original problem. Many of the commonly used rounding schemes (such as randomized rounding, threshold rounding and others) are oblivious in the sense that the rounding is performed based on the LP solution alone, disregarding the objective function. The goal of our work is to better understand in which cases oblivious rounding suffices in order to obtain approximation ratios that match the integrality gap of the underlying LP. Our study is information-theoretic – the rounding is restricted to be oblivious but not restricted to run in polynomial time. In this information-theoretic setting we characterize the approximation ratio achievable by oblivious rounding. It turns out to equal the integrality gap of the underlying LP on a problem that is the closure of the original combinatorial optimization problem. We apply our findings to the study of the approximation ratios obtainable by oblivious rounding for the maximum welfare problem, showing that when valuation functions are submodular oblivious rounding can match the integrality gap of the configuration LP (though we do not know what this integrality gap is), but when valuation functions are gross substitutes oblivious rounding cannot match the integrality gap (which is 1).

 Modularity and Greed in Double Auctions. With Paul Duetting and Tim Roughgarden.
 Games and Economic Behavior (GEB) 105:59–83, 2017.
[Abstract] [Preprint] [Full Version]
Designing double auctions is a complex problem, especially when there are restrictions on the sets of buyers and sellers that may trade with one another. The goal of this paper is to develop a modular approach to the design of double auctions, by relating it to the exhaustively studied problem of designing one-sided mechanisms with a single seller (or, alternatively, a single buyer). We consider several desirable properties of a double auction: feasibility, dominant-strategy incentive compatibility, the still stronger incentive constraints offered by a deferred-acceptance implementation, exact and approximate welfare maximization, and budget balance. For each of these properties, we identify sufficient conditions on two one-sided algorithms – one for ranking the buyers, one for ranking the sellers – and on a method for their composition into trading pairs, which guarantee the desired property of the double auction. Our framework also offers new insights into classic double auction designs, such as the VCG and McAfee auctions with unit-demand buyers and unit-supply sellers.

 Optimal and Robust Mechanism Design with Interdependent Values. With Tim Roughgarden.
 ACM Trans. Economics and Comput. (TEAC) 4(3):18:1–18:34, Special Issue on EC’13 (by invitation), 2016.
[Abstract] [Preprint] [Video]
We study interdependent value settings [Milgrom and Weber, 1982], and extend several fundamental results from the well-studied independent private values model to these settings. For revenue-optimal mechanism design, we give conditions under which Myerson’s virtual value-based mechanism remains optimal with interdependent values. One of these conditions is robustness of the truthfulness and individual rationality guarantees, in the sense that they are required to hold ex post. We then consider an even more robust class of mechanisms called “prior independent” (a.k.a. “detail free”), and show that by simply using one of the bidders to set a reserve price, it is possible to extract near-optimal revenue in an interdependent values setting. This shows that a considerable level of robustness is achievable for interdependent values in single-parameter environments.

 Prediction and Welfare in Ad Auctions. With Mukund Sundararajan.
 Springer Theory of Computing Systems 59(4):664–682, Special Issue on Algorithmic Game Theory (by invitation), 2016.
[Abstract] [Full version]
We study how standard auction objectives in sponsored search markets are affected by refinement in the prediction of ad relevance (click-through rates). As the prediction algorithm takes more features into account, its predictions become more refined; a natural question is whether this is desirable from the perspective of auction objectives. Our focus is on mechanisms that optimize for a convex combination of economic efficiency and revenue, and our starting point is the observation that the objective of such a mechanism can only improve with refined prediction, making refinement in the best interest of the search engine. We demonstrate that the impact of refinement on market efficiency is not always positive; nevertheless, we are able to identify natural – and to some extent necessary – conditions under which refinement is guaranteed to also improve economic efficiency. Our main technical contribution is in explaining how refinement changes the ranking of advertisers by value (efficiency-optimal ranking), moving it either towards or away from their ranking by virtual value (revenue-optimal ranking). These results are closely related to the literature on signaling in auctions.

 Vertex Sparsifiers: New Results from Old Techniques. With Matthias Englert, Anupam Gupta, Robert Krauthgamer, Harald Raecke and Kunal Talwar.
 SIAM Journal on Computing (SICOMP) 43(4):1239–1262, 2014.
[Full version]
© 2019 Inbal Talgam-Cohen