You are currently browsing the tag archive for the ‘incidence geometry’ tag.

Introduction

This is the first of a series of three posts outlining the key ideas behind my recent paper “Growth Estimates in Positive Characteristic via Collisions”, which is joint work with Esen Aksoy Yazici, Misha Rudnev, and Ilya Shkredov.

Briefly, this paper proves a number of new sum-product type estimates by expanding on the point/plane incidence approach developed in Misha’s paper “On the number of incidences between points and planes in three dimensions” and Misha and Ilya’s joint work with Olly Roche-Newton “New sum-product type estimates over finite fields”. The heart of our new paper is a bound on the number of “collisions” between images of lines.

The first post in this series describes an incidence bound, which is one of the applications of our sum-product results. The second post will give the relevant background on the sum-product problem, and the third post will describe the “collision” bound and a sum-product corollary, which implies the incidence bound.

(Acknowledgement: I must thank Adam Sheffer for pointing out that I haven’t written about my own work yet!)

Incidence bound

Perhaps the most important result in the paper is a new incidence bound for points and lines in ${\mathbb{F}_q^2}$.

Theorem 1 Suppose ${P\subseteq{\mathbb F}_q^2}$ is a point set of the form ${P=A\times A}$ with ${|A|\ll p^{2/3}}$, where ${p}$ is the characteristic of ${{\mathbb F}_q}$, and ${L}$ is a set of lines in ${{\mathbb F}_q^2}$. Then

$\displaystyle I(P,L)\ll |P|^{3/4}|L|^{2/3}+|P|+|L|.$

In a sense, this bound is halfway between the Szemerédi-Trotter theorem and the “trivial” Cauchy-Schwarz bound. The Szemerédi-Trotter theorem states that if ${P}$ is a set of points in ${{\mathbb R}^2}$ and ${L}$ is a set of lines in ${{\mathbb R}^2}$, then

$\displaystyle I(P,L)\ll |P|^{2/3}|L|^{2/3}+|P|+|L|. \ \ \ \ \ (1)$

The proof of this bound relies on the ordering of the reals, and doesn’t apply to finite fields. However, there is a bound that applies over ${{\mathbb F}_q^2}$ based on the Cauchy-Schwarz inequality. This bound states that

$\displaystyle I(P,L)\ll |P|^{3/4}|L|^{3/4}+|P|+|L|. \ \ \ \ \ (2)$

(See this previous post.) In the case where ${|P|\approx |L|=N}$, the Cauchy-Schwarz bound was improved to

$\displaystyle I(P,L)\ll N^{3/2-\epsilon} \ \ \ \ \ (3)$

by Bourgain, Katz, and Tao using a sum-product estimate, for some small ${\epsilon > 0}$. Prior to our result, the best value for ${\epsilon}$ was roughly ${1/806}$ (due to Jones). We improve ${\epsilon}$ to ${1/12}$, but our result only applies in the case where ${P}$ is a Cartesian product, while the Cauchy-Schwarz bound and (3) apply to general point sets.
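To make the objects concrete, here is a quick brute-force computation of ${I(P,L)}$ for a Cartesian product ${P=A\times A}$ in a small ${{\mathbb F}_p^2}$. The prime ${p=11}$ and the set ${A}$ are toy choices of mine, and for simplicity only the ${p^2}$ non-vertical lines ${y=mx+b}$ are used:

```python
# Brute-force incidence count I(P, L) for P = A x A in F_p^2, with L the
# p^2 "non-vertical" lines y = m*x + b.  Illustrative toy parameters only.
from itertools import product

p = 11
A = [1, 2, 3, 5, 8]                      # an arbitrary small subset of F_p
P = set(product(A, A))                   # the Cartesian product A x A
L = list(product(range(p), range(p)))    # lines y = m*x + b, coded as (m, b)

# Count incidences: (x, y) lies on (m, b) iff y = m*x + b (mod p).
I = sum(1 for (x, y) in P for (m, b) in L if (m * x + b) % p == y)

# Each point lies on exactly p non-vertical lines (one per slope m),
# so the incidence count here is exactly |P| * p.
assert I == len(P) * p

# The Cauchy-Schwarz-type quantity |P|^(3/4) |L|^(3/4) + |P| + |L| from (2)
# comfortably dominates the actual count in this example.
bound = len(P) ** 0.75 * len(L) ** 0.75 + len(P) + len(L)
print(I, bound)
```

Since every point of ${{\mathbb F}_p^2}$ lies on exactly one line of each slope, the count is forced; the interesting content of the bounds above is, of course, asymptotic.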

Proof of the incidence bound

The proof of Theorem 1 relies on a bound for the number ${T(A)}$ of collinear triples of a point set ${P=A\times A}$.

If the points ${(a,a'),(b,b'),(c,c')}$ are collinear, then

$\displaystyle \det\left( \begin{array}{ccc} 1 & 1& 1\\ a & b & c\\ a' & b' & c' \end{array}\right) = 0.$

The number of solutions to this equation for ${a,a',b,b',c,c'}$ in ${A}$ is the number ${T(A)}$ of collinear triples. Later we will prove that

$\displaystyle T(A)\ll |A|^{9/2}.$

For now, let us assume this upper bound, and use it to prove the incidence bound.
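Before the proof, here is a toy brute-force count of ${T(A)}$ via the determinant condition (an ${O(|A|^6)}$ loop over a small subset of ${{\mathbb F}_{13}}$ of my choosing; the bound ${T(A)\ll|A|^{9/2}}$ is asymptotic, so no constant is checked here):

```python
# Brute-force count of the collinear-triple number T(A) over F_p via the
# vanishing of det [[1,1,1],[a,b,c],[a',b',c']] (mod p).  Toy parameters.
from itertools import product

p = 13
A = [0, 1, 3, 7, 12]

def collinear(a, b, c, ap, bp, cp):
    # expansion of the 3x3 determinant along its first row
    d = (b * cp - c * bp) - (a * cp - c * ap) + (a * bp - b * ap)
    return d % p == 0

T = sum(collinear(a, b, c, ap, bp, cp)
        for a, ap, b, bp, c, cp in product(A, repeat=6))

# Ordered triples with a repeated point are always collinear, which gives
# the degenerate lower bound 3|A|^4 - 2|A|^2.
n = len(A)
assert T >= 3 * n ** 4 - 2 * n ** 2
print(T)
```

Note that ${T(A)}$ as defined counts ordered solutions, so degenerate triples with repeated points are included; they contribute only ${O(|A|^4)}$ and are harmless for the bound.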

Proof: Let ${L_3}$ denote the set of lines in ${L}$ that are incident to at least three points of ${P}$. Then

$\displaystyle I(P,L)\leq I(P,L_3)+2|L|,$

so it suffices to bound ${I(P,L_3)}$.

Now let ${P(\ell)=P\cap\ell}$, so that ${|P(\ell)|}$ is the number of points of ${P}$ contained in the line ${\ell}$. On one hand,

$\displaystyle I(P,L_3)=\sum_{\ell\in L_3}|P(\ell)|.$

On the other hand,

$\displaystyle 3!\sum_{\ell\in L_3}{|P(\ell)|\choose 3}\leq T(A)\ll |A|^{9/2}. \ \ \ \ \ (4)$

The sum on the left hand side counts the number of collinear triples on lines of ${L_3}$, and this number is less than the total number of collinear triples ${T(A)}$.

Since ${{x\choose 3}}$ is convex in ${x}$, the sum on the left side of (4) is minimized when each ${|P(\ell)|}$ equals the average value ${I(P,L_3)/|L_3|}$:

$\displaystyle 3!\sum_{\ell\in L_3}{|P(\ell)|\choose 3}\geq |L_3|3!{I(P,L_3)/|L_3|\choose 3}\geq |L_3|\left(\frac{I(P,L_3)}{|L_3|}-3\right)^3.$

Combining the lower bound with the upper bound, we have

$\displaystyle \vert L_3\vert\left(\frac{I(P,L_3)}{|L_3|}-3\right)^3\leq T(A)\ll |A|^{9/2} = |P|^{9/4}.$

Now we solve for ${I(P,L_3)}$:

$\displaystyle I(P,L_3)\ll |P|^{3/4}|L_3|^{2/3}+3|L_3| \leq |P|^{3/4}|L|^{2/3}+3|L|.$

Combined with the initial reduction, this bound completes the proof. $\Box$

Note that this proof is similar to the proof of the Cauchy-Schwarz bound (or restricted subgraph bounds in general), but we use the non-trivial bound on ${T(A)}$ instead of general geometric information about points and lines.

Next post: background on the sum-product problem

The upper bound on ${T(A)}$ was one of the last things we discovered while working on this paper. We started our investigation by looking for lower bounds on ${|A(A+A)|}$, a certain sum-product type quantity. The next post will give the relevant background on the sum-product problem, and the final post in the series will describe the main technical innovation of the paper and give a proof of the bound ${T(A)\ll |A|^{9/2}}$.

This summer I’ve been studying problems in geometric combinatorics such as the joints problem and the Erdős distinct distance problem. The joints problem was open for 20 years and the distinct distance problem was open for over 60 years before they were resolved by a new technique called the polynomial method. (Algebraic techniques have been around for longer, but their application to incidence problems is new.) In addition to solving hard problems, the polynomial method has the added virtue of producing short and simple proofs. I will showcase two examples of the polynomial method in action: Dvir’s proof of the finite field Kakeya conjecture, and Kaplan, Sharir, and Shustin’s proof of the joints problem in ${{\mathbb R}^d}$.

1. The Kakeya Problem

The finite field Kakeya problem and the joints problem are related to an open problem called the Kakeya conjecture. A Kakeya set, or Besicovitch set, is a subset of ${{\mathbb R}^d}$ that contains a unit line segment in every direction. Besicovitch showed that such sets may have Lebesgue measure zero for ${d\geq 2}$. However, Davies showed that for ${d=2}$, Kakeya sets must have Hausdorff dimension ${2}$, so that a Kakeya set is actually a two dimensional object, despite having no area. The Kakeya conjecture is that for all ${d}$, Kakeya sets must have full Hausdorff dimension. Wolff was able to show that the Hausdorff dimension of a Kakeya set must be at least ${\frac{d+2}{2}}$ for ${d\geq 2}$, but progress beyond this has been slow.
In a survey paper, Wolff introduced a finite field analog of the Kakeya conjecture. Let ${{\mathbb F}_q}$ be the finite field with ${q}$ elements. A Kakeya set in ${{\mathbb F}_q^d}$ is a set that contains a line in every direction. That is, ${K\subseteq {\mathbb F}_q^d}$ is a Kakeya set if for all ${\bold{y}}$ in ${{\mathbb F}_q^d}$, there exists an ${\bold{x}}$ in ${{\mathbb F}_q^d}$ such that the set ${\{\bold{x}+t\bold{y}\colon t\in{\mathbb F}_q\}}$ is a subset of ${K}$. The finite field Kakeya conjecture is that for any Kakeya set ${K}$ we have ${|K|\geq C_dq^d}$ for some fixed constant ${C_d}$ that is independent of ${q}$.
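The finite field definition is easy to test by brute force. The sketch below checks whether a subset of ${{\mathbb F}_q^2}$ contains a line in every direction; for simplicity only the non-zero direction vectors ${\bold{y}}$ are tested, since the ${\bold{y}=0}$ case is essentially vacuous:

```python
# Check whether K, a subset of F_q^2, is a Kakeya set: for every nonzero
# direction y there must be a base point x with the line {x + t*y} inside K.
# A toy sketch; q = 5 is an arbitrary small prime.
from itertools import product

q = 5
Fq2 = list(product(range(q), range(q)))

def is_kakeya(K, q):
    K = set(K)
    for y in product(range(q), range(q)):
        if y == (0, 0):
            continue                     # y = 0 is vacuous for nonempty K
        # look for a base point x whose whole line in direction y lies in K
        if not any(all(((x[0] + t * y[0]) % q, (x[1] + t * y[1]) % q) in K
                       for t in range(q))
                   for x in Fq2):
            return False
    return True

assert is_kakeya(Fq2, q)        # the whole plane is trivially a Kakeya set
assert not is_kakeya(Fq2[:q], q)  # a single line misses most directions
print("ok")
```

The second check illustrates the definition: the set ${\{(0,j)\colon j\in{\mathbb F}_5\}}$ contains a line in one direction but in no other.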

Theorem 1 (Dvir 2008) If ${K}$ is a Kakeya set in ${{\mathbb F}_q^d}$, then ${|K|\geq C_dq^d}$, where ${C_d}$ is a constant that we may choose to be ${\frac{1}{d!}}$.

Wolff showed that ${|K|\geq C_dq^{(d+2)/2}}$ using a proof similar in spirit to his ${\frac{d+2}{2}}$ lower bound on the Hausdorff dimension of Kakeya sets in ${{\mathbb R}^d}$. Since proofs in the finite field case use tools similar to attacks on the ${{\mathbb R}^d}$ case, but involve fewer technicalities, it was hoped that the finite field case would serve as an instructive model. In 2008, Dvir proved the finite field Kakeya conjecture using the so-called polynomial method, which currently has no continuous analog.

2. The Fundamental Lemma

The fundamental lemma of the polynomial method is a higher dimensional analog of the fact that, given ${n}$ numbers, there is a non-zero polynomial of degree ${n}$ vanishing at those numbers:

Lemma 2 (Fundamental Lemma) Let ${k}$ be a field and let ${S}$ be a finite subset of ${k^n}$. If ${|S|=m}$, then there is a non-zero polynomial of degree ${d\leq\lceil(n!m)^{1/n}\rceil}$ that vanishes on ${S}$.

Proof: Let ${d}$ be the least integer such that ${m<{n+d\choose d}}$. By elementary combinatorics, the number of monomials in ${k[x_1,\ldots,x_n]}$ of degree at most ${d}$ is ${{n+d\choose d}}$, so a polynomial ${P}$ in ${k[x_1,\ldots,x_n]}$ with variable coefficients will have ${{n+d\choose d}}$ degrees of freedom. Requiring that ${P}$ vanishes on ${S}$ imposes ${m}$ constraints. Since we have more degrees of freedom than constraints, our linear system for the coefficients of ${P}$ has a non-trivial solution.

By a previous post, we know that if ${d}$ is the least integer such that ${m<{n+d\choose d}}$, then ${d\leq\lceil(n!m)^{1/n}\rceil}$, which completes the proof. $\Box$
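The proof is constructive enough to run. The sketch below sets up the linear system for the coefficients over a small prime field and extracts a non-trivial kernel vector by Gaussian elimination; the points, the prime ${p=7}$, and the degree are toy choices of mine:

```python
# Fundamental lemma over F_p: given points S in F_p^n, find a non-zero
# polynomial of low degree vanishing on S by solving the linear system for
# its coefficients (Gaussian elimination mod p; p must be prime here).
from itertools import product

def monomials(n, d):
    # exponent vectors (e_1, ..., e_n) with e_1 + ... + e_n <= d
    return [e for e in product(range(d + 1), repeat=n) if sum(e) <= d]

def kernel_vector(M, p):
    # one non-trivial solution of M v = 0 over F_p (assumes cols > rank)
    rows, cols = len(M), len(M[0])
    M = [row[:] for row in M]
    pivots, r = {}, 0
    for c in range(cols):
        piv = next((i for i in range(r, rows) if M[i][c] % p), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        inv = pow(M[r][c], p - 2, p)            # inverse mod prime p
        M[r] = [x * inv % p for x in M[r]]
        for i in range(rows):
            if i != r and M[i][c] % p:
                f = M[i][c]
                M[i] = [(x - f * y) % p for x, y in zip(M[i], M[r])]
        pivots[c] = r
        r += 1
    free = next(c for c in range(cols) if c not in pivots)
    v = [0] * cols
    v[free] = 1                                  # one free variable set to 1
    for c, row in pivots.items():
        v[c] = (-M[row][free]) % p
    return v

p, n = 7, 2
S = [(1, 2), (3, 4), (5, 6), (0, 0)]
d = 2                       # C(n + d, d) = 6 coefficients > |S| = 4 constraints
mons = monomials(n, d)
M = [[pow(x, e[0], p) * pow(y, e[1], p) % p for e in mons] for (x, y) in S]
coeffs = kernel_vector(M, p)

def P(pt):
    return sum(c * pow(pt[0], e[0], p) * pow(pt[1], e[1], p)
               for c, e in zip(coeffs, mons)) % p

assert any(coeffs) and all(P(s) == 0 for s in S)
print("vanishing polynomial coefficients:", coeffs)
```

This is exactly the degrees-of-freedom argument from the proof: six monomial coefficients against four vanishing constraints forces a non-trivial kernel.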

3. Dvir’s Proof

The basic argument in Dvir’s proof of the finite field Kakeya conjecture is to assume that you have a Kakeya set that is smaller than ${Cq^d}$. This allows us to find a non-zero polynomial of degree less than ${q}$ vanishing on our Kakeya set. However, we will show that a polynomial of degree less than ${q}$ vanishing on a Kakeya set must vanish at every point of ${{\mathbb F}_q^d}$, contradicting the fact that the polynomial was non-zero.

Proof: Let ${K}$ be a Kakeya set in ${{\mathbb F}_q^d}$. Suppose by way of contradiction that ${|K|<Cq^d}$ for some constant ${C}$, which we will choose during the course of the proof. By the fundamental lemma, there exists a non-zero polynomial ${P}$ in ${{\mathbb F}_q[x_1,\ldots,x_d]}$ of degree ${n}$ vanishing on ${K}$, where ${n\leq\lceil (d!Cq^d)^{1/d}\rceil}$. Choosing ${C\leq\frac 1{d!}}$ will ensure that ${n<q}$.

Fix a point ${\bold{y}}$ in ${{\mathbb F}_q^d}$. We wish to show that ${P(\bold{y})=0}$. Since ${K}$ is a Kakeya set, there exists a point ${\bold{x}}$ in ${{\mathbb F}_q^d}$ such that ${\{\bold{x}+t\bold{y}\colon t\in{\mathbb F}_q\}}$ is a subset of ${K}$. Let ${\tilde{P}(t)=P(\bold{x}+t\bold{y})}$; then ${\tilde{P}}$ is a polynomial of degree at most ${n}$ in ${{\mathbb F}_q[t]}$. Since ${P}$ vanishes on ${K}$ we have ${\tilde{P}(t)=0}$ for each ${t}$ in ${{\mathbb F}_q}$. Thus ${\tilde{P}}$ vanishes identically, because it has more roots (all ${q}$ elements of ${{\mathbb F}_q}$) than its degree ${n<q}$.

Since ${\tilde{P}}$ vanishes identically, its coefficients must be zero. The coefficients of ${\tilde{P}}$ are polynomials in ${\bold{x}}$ and ${\bold{y}}$, and it turns out that the coefficient of ${t^n}$ is the ${n^\text{th}}$ degree homogeneous part of ${P}$ evaluated at ${\bold{y}}$. (Recall that a polynomial ${P}$ may be split up as a sum of its homogeneous parts ${P=\sum_{i=0}^n P_i}$, where all of the monomials in ${P_i}$ are of degree ${i}$.) Thus we have shown that ${P_n(\bold{y})=0}$.

Since ${\bold{y}}$ was arbitrary, ${P_n}$ vanishes identically on ${{\mathbb F}_q^d}$. But ${P_n}$ is non-zero (since ${P}$ has degree ${n}$) and has degree ${n<q}$, and a non-zero polynomial of degree less than ${q}$ cannot vanish at every point of ${{\mathbb F}_q^d}$, a contradiction. $\Box$

4. The Joints Problem

In the same survey, Wolff describes another discrete incidence problem that is related to the Kakeya conjecture: the joints problem. A joint in ${{\mathbb R}^d}$ is a point where ${d}$ lines intersect in independent directions. The joints problem is to find an upper bound on the number of joints formed by ${n}$ lines in ${{\mathbb R}^d}$.

A lower bound on the maximum number of joints in ${{\mathbb R}^3}$ is given by the example of a cube lattice ${[0,\sqrt{n}]\times [0,\sqrt{n}]\times [0,\sqrt{n}]}$. Each face of the cube contains roughly ${n}$ lattice points, and we take as our set of lines the lines through these lattice points that are parallel to either the ${x}$, ${y}$, or ${z}$ axis—this gives us roughly ${3n}$ lines. There are roughly ${n^{3/2}}$ lattice points inside the cube, and these points are exactly the set of joints formed by our lines. The conjecture asserts that this is sharp: ${n}$ lines determine at most ${Cn^{3/2}}$ joints.
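The construction is easy to count explicitly: a ${k\times k\times k}$ grid together with the ${3k^2}$ axis-parallel lines through it has ${k^3}$ joints, which matches ${cn^{3/2}}$ for ${n=3k^2}$ lines. A minimal sketch (the side length ${k=4}$ is arbitrary):

```python
# The grid construction: k^2 axis-parallel lines in each of the three
# coordinate directions through a k x k x k grid give 3k^2 lines whose
# joints are exactly the k^3 grid points.
k = 4
# a line is coded by its direction (axis 0, 1, or 2) and the two coordinates
# it holds fixed
lines = [(axis, i, j) for axis in range(3) for i in range(k) for j in range(k)]
n = len(lines)                                   # 3 k^2 lines
joints = [(x, y, z) for x in range(k) for y in range(k) for z in range(k)]

# each grid point lies on one line of each of the three independent
# directions, hence is a joint
assert n == 3 * k * k and len(joints) == k ** 3
assert len(joints) >= (n / 3) ** 1.5             # k^3 = (n/3)^(3/2)
print(n, len(joints))
```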

After Dvir’s proof of the finite field Kakeya problem, there were hopes that the polynomial method could be applied to problems in ${{\mathbb R}^d}$. The combined genius of Larry Guth and Nets Katz produced the first such successful application, which was further generalized and simplified by a host of researchers. I will give a simplified proof due to Kaplan, Sharir, and Shustin.

The basic outline of their proof is this: assume that the set of joints is larger than we believe it should be, which allows us to find a non-zero polynomial vanishing on all of the joints and all of the lines forming the joints. We will show that if a polynomial vanishes on all the lines forming a joint, the gradient of the polynomial must vanish at that joint too. Iterating this argument, we will show that all of the iterated partial derivatives of the polynomial vanish. Since the top-order partial derivatives are constant multiples of the coefficients, this shows that the coefficients of the polynomial must be zero.

Theorem 3 (Guth and Katz 2009) Let ${L}$ be a set of ${n}$ lines in ${{\mathbb R}^d}$. If ${J_L}$ is the set of joints formed by ${L}$, then there exists a constant ${C}$ such that ${|J_L|\leq Cn^{d/(d-1)}}$.

Proof: Let’s set ${m=|J_L|}$ for convenience. Suppose by way of contradiction that ${m>Cn^{d/(d-1)}}$, where ${C}$ is a constant that we will choose during the course of the proof.

Let ${\rho=m/2n}$. We will refine ${L}$ so that each remaining line contains at least ${\rho}$ joints. We begin by picking a line ${\ell}$ in ${L}$. If ${\ell}$ contains fewer than ${\rho}$ points of ${J_L}$, we will discard ${\ell}$ and remove all of the points on ${\ell}$ from ${J_L}$. We continue this process until all of the remaining lines contain at least ${\rho}$ joints. We will call the resulting sets ${L'}$ and ${J_L'}$. Note that every point of ${J_L'}$ is a joint of ${L'}$. Since we removed at most ${m/2}$ joints from ${J_L}$, we have ${m\geq|J_L'|\geq m/2}$.

Next we will find a non-zero polynomial that vanishes on ${J_L'}$ and all of the lines of ${L'}$. Since ${|J_L'|\leq m}$, the fundamental lemma provides a non-zero polynomial ${P}$ of degree ${r\leq\lceil(d!m)^{1/d}\rceil<2(d!m)^{1/d}}$ that vanishes on ${J_L'}$. We would also like ${P}$ to vanish on all of the lines of ${L'}$. Because ${P}$ is degree ${r}$, ${P}$ will vanish on any line that contains more than ${r}$ zeros of ${P}$. By our refining process, each line of ${L'}$ contains at least ${\rho}$ zeros of ${P}$. Since ${\rho=m/2n>\frac 12 C^{(d-1)/d}m^{1/d}}$, we can make ${\rho>r}$ by choosing ${C>4^{d/(d-1)}(d!)^{1/(d-1)}}$, in which case ${P}$ vanishes on all of the lines of ${L'}$, as desired.

The polynomial ${P}$ has a peculiar property: at each point ${\bold{x}}$ in ${J_L'}$, ${P}$ vanishes along ${d}$ lines ${\ell_1,\ldots,\ell_d}$ in ${L'}$, which run in independent directions. Suppose that ${\ell_i}$ runs in direction ${\bold{y}_i}$, so that ${\ell_i=\{\bold{x}+t\bold{y}_i\colon t\in{\mathbb R}\}}$. The directional derivative of ${P}$ at ${\bold{x}}$ in the direction of ${\bold{y}_i}$ must be zero, since ${P}$ vanishes identically along ${\ell_i}$; that is, ${\nabla P(\bold{x})\cdot\bold{y}_i=0}$. The direction vectors ${\bold{y}_1,\ldots,\bold{y}_d}$ span all of ${{\mathbb R}^d}$, because they are independent. Thus ${\nabla P(\bold{x})}$ is perpendicular to every vector in ${{\mathbb R}^d}$, and so ${\nabla P(\bold{x})=0}$. Since ${\bold{x}}$ was arbitrary, ${\nabla P}$ vanishes at every point of ${J_L'}$.

Now we can iterate the preceding argument: since the components of ${\nabla P}$ vanish on ${J_L'}$ and have degree less than ${r}$, they also vanish on every line of ${L'}$. If we take one component of ${\nabla P}$, say ${P_{x_1}}$, we can use the fact that ${P_{x_1}}$ vanishes on the lines of ${L'}$ to show that ${\nabla P_{x_1}}$ vanishes on ${J_L'}$. Continuing in this way, we see that all of the iterated partial derivatives of ${P}$ must be zero. But the ${r^\text{th}}$ degree partial derivatives of ${P}$ are multiples of the coefficients of ${P}$. Thus the coefficients of ${P}$ must be zero, which contradicts the fact that ${P}$ is a non-zero polynomial. $\Box$

5. Bibliography

Here is a short bibliography for each section.

The Kakeya Problem:

Dvir’s Proof of the Finite Field Kakeya Problem:

The Joints Problem:

• Guth and Katz’s 2009 paper (difficult read)
• Kaplan, Sharir, and Shustin’s short proof
• Quilodran’s summary

-Incidence problems and bipartite graphs-

An important problem in combinatorial geometry is to study the number of incidences between a set of points and a set of lines (or curves, surfaces, etc.). More specifically, let ${P}$ be a finite subset of ${\mathbb{R}^n}$ and let ${L}$ be a finite set of lines (or curves, surfaces, etc.). We define the set of incidences between ${P}$ and ${L}$ by ${I(P,L):=\{(p,\ell)\in P\times L\colon p\in\ell\}}$. Our problem is now to study the size of ${I(P,L)}$. Of course we always have the trivial bound ${|I(P,L)|\leq |P||L|}$, but we would like to do better.

Introducing the set of incidences of ${P}$ and ${L}$ may just seem like good notation, but in fact it is characterizing our incidence problem in terms of graph theory: we define the incidence graph ${G(P,L)}$ of ${P}$ and ${L}$ to be the graph whose vertices are points of ${P}$ and lines of ${L}$, with a point ${p}$ connected to a line ${\ell}$ if ${p\in\ell}$. So the set of edges of ${G(P,L)}$ is ${I(P,L)}$.

The incidence graph ${G(P,L)}$ is a special type of graph called a bipartite graph. A bipartite graph is a graph ${G}$ whose vertices form two disjoint sets, ${A}$ and ${B}$, such that no edge joins two points of ${A}$ and no edge joins two points of ${B}$; that is, the edges of ${G}$ are of the form ${\{a,b\}}$ with ${a}$ in ${A}$ and ${b}$ in ${B}$. We will write ${G=G(A,B)}$ to specify the vertex sets and we write ${E(G)}$ or ${E(A,B)}$ for the edges of ${G}$.

As with incidence graphs, we have the trivial bound ${|E(A,B)|\leq |A||B|}$ and in general we can’t do any better. In the next section we will prove non-trivial bounds on the number of edges of a bipartite graph under certain restrictions. By geometric considerations, we’ll see that many incidence graphs satisfy these restrictions, thus giving us non-trivial bounds on ${|I(P,L)|}$. It turns out that these bounds are actually sharp for point-line incidences over finite fields.

-Restricted subgraph bound-

We noted above that the trivial bound ${|E(A,B)|\leq |A||B|}$ cannot be improved in general. We call a graph that achieves equality a complete bipartite graph. That is, a complete bipartite graph is a bipartite graph with vertex sets ${A}$ and ${B}$ such that every vertex of ${A}$ is connected to every vertex of ${B}$. If ${|A|=s}$ and ${|B|=t}$, then we denote the complete bipartite graph with vertex sets ${A}$ and ${B}$ by ${K_{s,t}}$. If we consider only bipartite graphs ${G(A,B)}$ whose edge set does not contain a copy of ${K_{s,t}}$ for some ${s\leq|A|}$ and ${t\leq|B|}$, we should be able to improve this trivial bound:

Restricted Subgraph Bound: Let ${G=G(A,B)}$ be a bipartite graph that does not contain a ${K_{s,t}}$ as a subgraph, where “${s}$” corresponds to points of ${A}$ and “${t}$” corresponds to points of ${B}$. The cardinality of the set ${E(G)}$ of edges of ${G}$ satisfies

$\displaystyle |E(G)|\leq \sqrt[s]{t-1}|A||B|^{1-1/s}+s|B|.$

Before we prove the restricted subgraph bound, let’s note that a point-line incidence graph contains no ${K_{2,2}}$, because any two points lie on a unique line. Thus ${|I(P,L)|\leq |P||L|^{1/2}+2|L|}$.
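Here is a small sanity check of both observations—that an incidence graph contains no ${K_{2,2}}$, and that the resulting bound holds—on a toy configuration of points and non-vertical lines of my choosing:

```python
# Sanity check: in a point-line incidence graph no two points share two
# common lines (no K_{2,2}), and |I(P,L)| <= |P| |L|^(1/2) + 2|L| holds.
# Toy configuration with integer points and lines y = m*x + b.
from itertools import combinations

points = [(0, 0), (1, 1), (2, 4), (3, 2), (1, 3)]
lines = [(1, 0), (2, -1), (0, 2), (1, 1)]        # coded as (m, b)

incident = {(p, l) for p in points for l in lines
            if p[1] == l[0] * p[0] + l[1]}

# any two distinct points lie on at most one common (non-vertical) line
for p, q in combinations(points, 2):
    common = [l for l in lines if (p, l) in incident and (q, l) in incident]
    assert len(common) <= 1

# the restricted subgraph bound with s = t = 2
P, L = len(points), len(lines)
assert len(incident) <= P * L ** 0.5 + 2 * L
print(len(incident))
```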

Proof: For now, let’s assume that each vertex in ${B}$ has degree at least ${s}$, so that in particular ${s|B|\leq |E(G)|}$. The general case will follow easily from this particular case.

We define the set ${F}$ of “fans” in ${G}$:

$\displaystyle F:=\{(a_1,\ldots,a_s,b)\colon (a_1,b),\ldots,(a_s,b)\in E(G), a_i\not=a_j\mbox{ for }i\not=j\}.$

We will count $|F|$ in two ways. First note that

$\displaystyle |F|=s!\sum_{b\in B}{\deg(b)\choose s}. \ \ \ \ \ (1)$

Next note that any ${s}$-tuple ${(a_1,\ldots, a_s)}$ of distinct vertices of ${A}$ contributes at most ${t-1}$ fans ${(a_1,\ldots,a_s,b_i)}$, ${i=1,\ldots,t-1}$, to ${F}$, since ${t}$ such fans would give a copy of ${K_{s,t}}$. Thus

$\displaystyle |F|\leq (t-1)s!{|A|\choose s}. \ \ \ \ \ (2)$

Combining the counts in (1) and (2), we have

$\displaystyle s!\sum_{b\in B}{\deg(b)\choose s}\leq (t-1)s!{|A|\choose s}. \ \ \ \ \ (3)$

Since ${{x\choose s}}$ is convex in ${x}$, the sum on the left side of (3) is minimized when each degree equals the average ${|E(G)|/|B|}$:

$\displaystyle s!\sum_{b\in B}{\deg(b)\choose s}\geq|B|s!{|E(G)|/|B|\choose s}\geq |B|\left(\frac{|E(G)|}{|B|}-s\right)^s. \ \ \ \ \ (4)$

Combining (4) with (3), we have

$\displaystyle |B|\left(\frac{|E(G)|}{|B|}-s\right)^s\leq (t-1)s!{|A|\choose s}\leq (t-1)|A|^s.$

Solving for ${|E(G)|}$ yields the desired result:

$\displaystyle |E(G)|\leq \sqrt[s]{t-1}|A||B|^{1-1/s}+s|B|.$

$\Box$

By symmetry we have the bound

$\displaystyle |E(G)|\leq \sqrt[t]{s-1}|A|^{1-1/t}|B|+t|A|.$

Thus we may conclude that

$\displaystyle |E(G)|\leq\min\{\sqrt[s]{t-1}|A||B|^{1-1/s}+s|B|, \sqrt[t]{s-1}|A|^{1-1/t}|B|+t|A|\}.$
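As a small helper, the two bounds can be computed side by side and the minimum compared against a concrete ${K_{2,2}}$-free example: the ${3\times 3}$ grid against its ${8}$ rows, columns, and main diagonals (${24}$ incidences in total; this example is mine, not from the text):

```python
# The restricted subgraph bound and its mirror image, for a bipartite graph
# G(A, B) with |A| = a, |B| = b containing no K_{s,t} (s on the A side).
def restricted_subgraph_bounds(a, b, s, t):
    return ((t - 1) ** (1 / s) * a * b ** (1 - 1 / s) + s * b,
            (s - 1) ** (1 / t) * a ** (1 - 1 / t) * b + t * a)

# 3x3 grid (9 points) vs its 3 rows, 3 columns, and 2 main diagonals
# (8 lines, 3 points each): 24 incidences, and the incidence graph is
# K_{2,2}-free since two distinct lines meet in at most one point.
edges = 24
bounds = restricted_subgraph_bounds(9, 8, 2, 2)
assert edges <= min(bounds)
print(bounds)
```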

-Acknowledgements-

The proof I gave here is based on Jacob Fox’s lecture notes from a combinatorics class at MIT.

Given an arbitrary set of points in the plane, we would expect most pairs of points to determine distinct lines–this is the “generic” case. The obvious counterexample is when all of the points lie on a single line. Beck’s lemma says that one of these two situations must occur (up to constants) in any finite set of points in the plane.

Beck’s Lemma: Let ${P}$ be a subset of ${\mathbb{R}^2}$ with ${|P|=n}$. Then one of the following holds:

1. There is a line containing at least ${cn}$ points of ${P}$ for some absolute constant ${c}$,
2. or there are at least ${c'n^2}$ distinct lines containing at least two points of ${P}$ each.

Note that (2) is the generic case, while (1) is a counterexample to the generic case.
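The dichotomy is easy to observe experimentally. The sketch below computes, for a finite point set, the largest number of points on a single line and the number of distinct lines spanned, and checks the two extreme cases (a collinear set and a ${5\times 5}$ grid); the threshold used in the second assertion is illustrative, not Beck’s constant:

```python
# Observe Beck's dichotomy on two extreme examples: a fully collinear set
# (case 1) and a 5x5 grid (case 2).  Exact rational arithmetic keeps the
# canonical line representation collision-free.
from itertools import combinations
from fractions import Fraction

def line_through(p, q):
    # canonical form (a, b, c) of the line a*x + b*y = c through distinct
    # points p, q, normalized by the first non-zero of (a, b)
    (x1, y1), (x2, y2) = p, q
    a, b = Fraction(y2 - y1), Fraction(x1 - x2)
    c = a * x1 + b * y1
    scale = a if a != 0 else b
    return (a / scale, b / scale, c / scale)

def beck_stats(P):
    # (max points of P on one line, number of distinct lines with >= 2 points)
    lines = {}
    for p, q in combinations(P, 2):
        lines.setdefault(line_through(p, q), set()).update((p, q))
    return max(len(pts) for pts in lines.values()), len(lines)

collinear = [(i, 2 * i) for i in range(10)]
grid = [(i, j) for i in range(5) for j in range(5)]

assert beck_stats(collinear) == (10, 1)     # case 1: one line, all points
rich, nlines = beck_stats(grid)
assert nlines > len(grid) ** 2 / 10         # case 2: quadratically many lines
print(rich, nlines)
```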

Proof of Beck’s lemma: We say that ${a,b\in P}$ are ${j}$-connected if the line determined by ${a}$ and ${b}$ contains more than ${2^j}$ and at most ${2^{j+1}}$ points of ${P}$.

Let ${L_j}$ be the set of lines determined by ${P}$ containing more than ${2^j}$ points of ${P}$. Then by the Szemerédi-Trotter theorem:

$\displaystyle |L_j|2^j\leq I(P,L_j)\leq\max\{4n+|L_j|,4n^{2/3}|L_j|^{2/3}+|L_j|\}.$

If the max is ${4n+|L_j|}$, then we have ${|L_j|\leq 4n/(2^j-1)}$ and if the max is ${4n^{2/3}|L_j|^{2/3}+|L_j|}$, then we have ${|L_j|\leq 64n^2/(2^{j}-1)^3}$. Thus

$\displaystyle |L_j|\lesssim\frac n{2^j}+\frac{n^2}{2^{3j}}.$

If ${a}$ and ${b}$ are ${j}$-connected, then they lie on a line of ${L_j}$. Thus the number of lines containing a ${j}$-connected pair is at most ${|L_j|}$. Each such line contains at most ${{2^{j+1}\choose 2}}$ pairs of points, so

$\displaystyle \# j\mbox{-connected pairs}\lesssim 2^jn+\frac{n^2}{2^j}.$

Let ${C}$ be a positive constant that we will determine later. From now on we focus our attention on ${j}$ such that ${C\leq 2^j\leq n/C}$. For this range of ${j}$, the number of ${j}$-connected pairs is bounded above by ${n^2/C}$:

$\displaystyle \sum_{C\leq 2^j\leq n/C }2^jn+\frac{n^2}{2^j}\lesssim \frac{n^2}C,$

where we bounded the sum of the first terms by twice the largest term (i.e. ${1+2+\cdots+2^j\leq 2^{j+1}}$) and the sum of the second terms by a geometric series (${1+2^{-1}+2^{-2}+\cdots = 2}$).
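Numerically, the two geometric series behave exactly as claimed; here is a quick check (with toy values of ${n}$ and ${C}$ that I chose) that the dyadic sum over ${C\leq 2^j\leq n/C}$ is at most ${4n^2/C}$:

```python
# Check the dyadic summation step: summing 2^j * n + n^2 / 2^j over the
# dyadic range C <= 2^j <= n/C stays below 4 n^2 / C.  Toy parameters.
n, C = 10 ** 6, 100

total = sum(2 ** j * n + n ** 2 / 2 ** j
            for j in range(n.bit_length() + 1)
            if C <= 2 ** j <= n / C)

# first piece: at most twice its largest term, which is <= n^2 / C;
# second piece: a geometric series summing to at most 2 n^2 / C
assert total <= 4 * n ** 2 / C
print(total)
```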

Since there are ${{n\choose 2}=n(n-1)/2}$ pairs of points, if ${C}$ is large enough then there are more than ${n^2/4}$ pairs of points that are not ${j}$-connected for any ${j}$ with ${C\leq 2^j\leq n/C}$. If one of these pairs is ${j}$-connected for some ${2^j>n/C}$, then the line that it determines contains at least ${n/C}$ points, which gives us (1). Otherwise more than ${n^2/4}$ pairs lie on lines containing at most ${2C}$ points of ${P}$; each such line accounts for at most ${{2C\choose 2}}$ pairs, so there are at least ${c'n^2}$ distinct lines, which is option (2). QED

Beck’s original proof did not use the Szemerédi-Trotter theorem—in fact, Beck’s paper was published in the same issue of Combinatorica as Szemerédi and Trotter’s paper. Instead he proved a bound on the number of lines containing between ${2^j}$ and ${2^{j+1}}$ points:

$\displaystyle |L_j|-|L_{j+1}|\lesssim\frac{n^2}{2^{(2+\delta)j}},$

where ${\delta=1/20}$ and ${1\leq 2^j\leq(2n)^{1/2}}$. This estimate is sufficient for our argument.