Last time I demonstrated that the compiler could have to do an exponential number of bindings in order to determine whether there was a unique best overload resolution for a function call that takes a lambda. Some of you may have wondered whether we simply were not being clever enough in the compiler. Perhaps there is some way to optimize this problem so that the unique solution is found in less than exponential time.

As it turns out, the question of whether there is a clever way to do this in C# 3.0 is equivalent to solving the most famous open problem in computer science. That problem has stumped generations of the finest minds in academia and is widely believed to have no efficient solution, so I don't feel *particularly* bad about not solving this one myself.

Consider the following *slight* revision of the set of overloads I sketched out last time:

class MainClass
{
    class T {}
    class F {}

    delegate void DT(T t);
    delegate void DF(F f);

    static void M(DT dt)
    {
        System.Console.WriteLine("true");
        dt(new T());
    }

    static void M(DF df)
    {
        System.Console.WriteLine("false");
        df(new F());
    }

    static T Or(T a1, T a2, T a3) { return new T(); }
    static T Or(T a1, T a2, F a3) { return new T(); }
    static T Or(T a1, F a2, T a3) { return new T(); }
    static T Or(T a1, F a2, F a3) { return new T(); }
    static T Or(F a1, T a2, T a3) { return new T(); }
    static T Or(F a1, T a2, F a3) { return new T(); }
    static T Or(F a1, F a2, T a3) { return new T(); }
    static F Or(F a1, F a2, F a3) { return new F(); }

    static T And(T a1, T a2) { return new T(); }
    static F And(T a1, F a2) { return new F(); }
    static F And(F a1, T a2) { return new F(); }
    static F And(F a1, F a2) { return new F(); }

    static F Not(T a) { return new F(); }
    static T Not(F a) { return new T(); }

    static void MustBeT(T t) {}

    static void Main()
    {
        // Introduce enough variables and then encode any Boolean predicate;
        // eg, here we encode (!x3) & ((!x1) & ((x1 | x2 | x1) & (x2 | x3 | x2)))
        M(x1=>M(x2=>M(x3=>MustBeT(
            And(
                Not(x3),
                And(
                    Not(x1),
                    And(
                        Or(x1, x2, x1),
                        Or(x2, x3, x2))))))));
    }
}

This expression *makes the compiler solve the Boolean satisfiability problem* with three terms per "or", aka "3SAT". If the 3SAT instance has a unique solution then the program compiles, and running it prints that solution. If it has more than one solution then compilation fails with an ambiguity error. If it has no solution then compilation fails with an error stating that MustBeT was given an F argument. But no matter how you slice it, the compiler must solve SAT.
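As a sanity check on the claim that this particular formula has exactly one satisfying assignment (and hence that the program above compiles without ambiguity), here is a quick brute-force check — a sketch in Python, not part of the original example:

```python
from itertools import product

# The predicate encoded in the C# example:
# (!x3) & (!x1) & (x1 | x2 | x1) & (x2 | x3 | x2)
def predicate(x1, x2, x3):
    return (not x3) and (not x1) and (x1 or x2 or x1) and (x2 or x3 or x2)

# Try all 2^3 valuations and collect the satisfying ones.
solutions = [v for v in product([False, True], repeat=3) if predicate(*v)]
print(solutions)  # -> [(False, True, False)]
```

The single solution is x1 = F, x2 = T, x3 = F, which matches the lambda types the compiler is forced to pick: the compiled program prints "false", "true", "false" as the nested calls to M resolve.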

Determining whether a given predicate has a set of valuations for its variables which make it true is an NP-complete problem; actually *finding* the set of valuations is an NP-hard problem. Therefore, *overload resolution in C# 3.0 is at the very least NP-hard*. There is no known polynomial-time algorithm for any NP-complete or NP-hard problem and it is widely believed that there is none to be found, though that conjecture has yet to be proven. So our dream of fast lambda type analysis in C# 3.0 is almost certainly doomed, at least as long as we have the current rules for overload resolution.
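To make the reduction concrete, here is a sketch (in Python, with a hypothetical helper name `encode_3cnf`) that turns an arbitrary 3-CNF formula into a C# expression of the shape used above: each variable becomes a nested lambda passed to M, each clause a call to Or (repeating a literal to pad short clauses, as the example does), negation a call to Not, and conjunction a right-nested chain of two-argument Ands:

```python
def encode_3cnf(num_vars, clauses):
    """Emit a C# expression whose overload resolution solves the given
    3-CNF instance. Each clause is a list of three nonzero ints:
    positive k means variable xk, negative k means its negation."""
    def lit(k):
        return f"x{k}" if k > 0 else f"Not(x{-k})"

    def clause(c):
        return f"Or({lit(c[0])}, {lit(c[1])}, {lit(c[2])})"

    # Right-nested chain of two-argument Ands over all clauses.
    body = clause(clauses[-1])
    for c in reversed(clauses[:-1]):
        body = f"And({clause(c)}, {body})"

    # Wrap in MustBeT, then in one nested lambda per variable.
    expr = f"MustBeT({body})"
    for k in range(num_vars, 0, -1):
        expr = f"M(x{k}=>{expr})"
    return expr + ";"

# Example: (x1 | x2 | x3) & (!x1 | !x2 | !x3)
print(encode_3cnf(3, [[1, 2, 3], [-1, -2, -3]]))
```

This is just the generic form of the hand-written Main above; feeding the generated text to the C# 3.0 compiler forces it to decide satisfiability of the input formula.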

Does this really matter in practice? As I mentioned last time, hopefully not. Other languages also have this issue. When I ran an early draft of this post past him, Erik Meijer immediately pointed out that the ML type inference system has similar worst cases, and the Haskell type inference system is even worse. Apparently in Haskell you can encode a Turing machine into the type system and make the compiler run it! In practice these sorts of problems do not arise in real-world code in any of these languages, so I am not too worried about it for C# 3.0.

And here I thought your example was going to show the Halting Problem.

I am pretty sure that overload resolution is decidable in a finite amount of time! It’s not THAT bad.

I thought it was going to be the Halting Problem as well.

‘Apparently in Haskell you can encode a Turing machine into the type system and make the compiler run it!’

Same thing goes for C++ templates. They are Turing-complete too, which is the reason template metaprogramming in C++ works at all. I suppose any language with similar inference abilities, where the compiler must decide which type makes the best candidate (if any), will suffer similar problems.

Partial specialization makes life a living hell for C++ compiler writers. It's one reason the VC++ team never got partial specialization right before VC++ 2003.

Awesome example!

I would just like to note that you have shown the 2-SAT problem, which is actually in P. But since it's trivially extended to N-SAT if you have And(x0,x1,…xn-1) and Or(x0,x1,…xn-1) functions, I'll buy it.

Whoops, you are right. My bad.

In fact, we can trivially extend this to an NP-complete problem by simply adding eight more:

static T Or(T a1, T a2, T a3) { return new T(); }
static T Or(T a1, F a2, T a3) { return new T(); }
…
static F Or(F a1, F a2, F a3) { return new F(); }

Because that makes the problem into 3-SAT, which is NP-complete.