Language Inferencing

I've been involved in a lot of debates over the last few years about the merits of strongly-typed languages versus loosely-typed ones.  For example, VBScript and JavaScript are generally loosely-typed languages.  Local variables defined within a body of code generally obtain their 'type' at runtime as a side effect of an assignment, and it's possible that an assignment two statements later re-assigns the variable to an instance of another type.

a = 10   'Here 'a' is an integer

a = "ten"  'Now 'a' is a string

Lots of programs are written this way.  There are tons of ASP applications written using VBScript and tons more using languages such as JScript, etc.  There are even languages where there are no types at all.  Instance values are just bags of named properties.
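The VBScript snippet above can be mirrored in any dynamically typed language.  Here is the same reassignment in Python, used as a stand-in:

```python
# In a loosely-typed language, a variable's type is a runtime property
# of the value it currently holds, not a declaration.
a = 10           # here 'a' refers to an integer
print(type(a))   # <class 'int'>

a = "ten"        # now 'a' refers to a string; nothing objects
print(type(a))   # <class 'str'>
```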

C# is strongly-typed.  You must declare the type of a variable up front, and the compiler will keep you to your word.  The idea is that with strong typing the compiler will catch numerous bugs for you, where you accidentally put the wrong bit of data in the wrong variable.  There are numerous other benefits to strong typing, such as static analysis of the program, early binding and many optimizations that can be made.

But still many people prefer loosely-typed systems.  They are just easier to get started with.  You don't have to keep repeating yourself by declaring type names all over the place.  You just start writing code.

Recently, there has been a lot of talk about using a technique called "type inferencing" to lessen the burden of always writing type names in strongly-typed languages.  While we have been discussing C# here, similar suggestions have been made for C++ and even Java, and so on.  (And yes, I am aware of many languages that already do this.)

Personally, I love the idea of type inferencing.  I've been pushing for it for a while.  Still, there's so much more that can be done to improve the programming experience; type inferencing is just a start.  I would like to tackle some of these.

If we alleviate much of the burden imposed by overly declarative languages while still maintaining code correctness, we will have done a HUGE thing.  Everyone will benefit.

What I really find burdensome about programming is getting all the formalities right.  You've got to set up your constraints and other structural bits before getting in and writing the meat of your code.  If you could just skip this step, it would save a lot of time.

First of all, get rid of all the using and namespace declarations in C#.  The compiler can figure those out later.  If you reference a type, let the compiler figure out where it came from.

Next, don't bother defining a class to put your code in.  Just write the code.  If you don't write a class, let the compiler invent one for you.  It can infer a silly class and Main method far more quickly than you could type one up yourself.

So now you can just start writing code.

a = 10;

That's a good start.  But why do I need to keep writing semicolons?  The language syntax ought to be smart enough to determine the end of an expression without me slamming these silly 'tweeners' in there.

a = 10

Much better. 

Now that I'm on a roll, let's get into some really qualitative improvements.  Why waste all that effort setting up flow structures like if/else?  All we need to do is introduce the concept of 'FAILURE' into the language.  If a statement or expression fails, it just doesn't have its intended effect.

For example:

a = f(xxx)

a = g(yyy)

Given both these statements, either 'a' is assigned by the first statement or it is assigned by the second.  Whichever doesn't fail, succeeds.
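This 'FAILURE' semantics can actually be emulated in today's languages with exceptions: try each candidate in turn and keep the first one that doesn't blow up.  A minimal Python sketch, where f and g are hypothetical stand-ins for the functions in the example:

```python
def f(x):
    # Hypothetical stand-in: fails (raises) for negative input.
    if x < 0:
        raise ValueError("f failed")
    return x + 1

def g(y):
    # Hypothetical stand-in: always succeeds.
    return y * 2

def first_success(*thunks):
    """Evaluate zero-argument callables in order and return the first
    result that does not raise -- the 'whichever doesn't fail,
    succeeds' rule from the post."""
    last_error = None
    for thunk in thunks:
        try:
            return thunk()
        except Exception as error:
            last_error = error
    raise last_error

# 'a' gets whichever assignment succeeds.
a = first_success(lambda: f(-5), lambda: g(7))
print(a)  # 14: f(-5) failed, so g(7) supplied the value
```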

The next things we want to get rid of are looping constructs and, especially, statement sequencing.  I spend so much effort getting these things right.  The compiler should be able to figure this stuff out with a simple dependency graph.

a = b + 10

b = 20

Now, it's easy to see that unless b is assigned first, the assignment to a will not succeed.  Therefore, the assignment to b should execute first.
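Automatic statement ordering is a real technique (the first comment below points at AsmL, and spreadsheet engines do essentially this).  Here is a toy Python sketch that orders assignments by the variables they read before evaluating them; the statement encoding is my own invention for illustration:

```python
# Each statement is (target, set_of_variables_read, function_of_env).
statements = [
    ("a", {"b"}, lambda env: env["b"] + 10),  # a = b + 10
    ("b", set(), lambda env: 20),             # b = 20
]

def run(statements):
    """Evaluate statements in dependency order, not source order."""
    env, pending = {}, list(statements)
    while pending:
        # Pick any statement whose inputs are already available.
        ready = next((s for s in pending if s[1].issubset(env)), None)
        if ready is None:
            raise RuntimeError("cyclic or unsatisfiable dependencies")
        target, _, compute = ready
        env[target] = compute(env)
        pending.remove(ready)
    return env

print(run(statements))  # {'b': 20, 'a': 30}
```

Even in this toy, a cycle between statements is detected rather than resolved, which hints at why the compiler can't always "figure this stuff out".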

For loops, we just adopt the mathematical notation of subscripting.

a(n) = a(n-1) + 10

Now it's obvious that a loop must be written to perform this calculation.  Let the compiler do it!
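Spelled out, the subscripted statement is just a recurrence, and the loop the compiler would have to invent looks like this.  Note that the base value and the number of iterations are assumptions on my part; the one-liner leaves both unspecified, which is rather the point:

```python
def iterate(a0, n):
    """Compute a(n) for the recurrence a(n) = a(n-1) + 10,
    assuming a(0) = a0."""
    a = a0
    for _ in range(n):
        a += 10
    return a

print(iterate(0, 5))  # 50
```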

You can see where I'm going here.  Type inferencing leads you to the logical conclusion of full language inferencing, which is exactly where I think the industry should be moving.

And while we're at it, let's get rid of variable names.  I think I must spend at least 25% of my time thinking up good variable names.  If we didn't have to name them, I'd be done so much faster.

I know, by now you are thinking: this guy is NUTS.  What an incredibly bad bunch of thinking; I can see all sorts of holes here.  This would never work.  That may be so, but it might just happen anyway.  Check out the XML Conspiracy.

THIS JUST IN:  It's already happening!  Check this out!

But I digress.


Comments (11)

  1. Matej says:

    Take a look at Asml.

    It has type inference and "automatic statement ordering".



  2. RebelGeekz says:

    I agree 100%

    But there are two major issues impeding this from happening:

    One. Language designers are too concentrated in inversed matrix interpolation and differential spatial programming (see: bullshit)

    Two. Joe coders are fighting their holy wars about which language is better and why curlies make you grow your pubic hair thicker.

    We really need to move on…

  3. Dumky says:

    I’m not sure which one is correct between inferencing and inferring…

    Anyways, you’d probably like the Caml language (it has type inference and polymorphic types; it tries to infer the "largest" possible type).

    It runs both in an interpreter and compiled (with good performance in all the benchmarks I’ve seen).

    But it has different operators for different types. For example you can’t add an int and a float with + (ints are added with +, whereas floats use +.). No implicit casting or conversion.

  4. Stephane Rodriguez says:

    Totally disagree with your post.

    How is "a(n) = a(n-1) + 10" supposed to be mathematical? That statement has no value without specifying the value domain for the values of n and for function a.

    In addition, the ‘=’ char itself raises issues since it’s a statement not a comparison. If you are used to that, it’s ok, but I think that if you go as far (in the French Descartes cogito ergo sum manner) as question everything down to the root, then you have to question the use of the ‘=’ character as well.

    Good luck!

  5. Matt says:

    ‘a’ is not actually a function. a(n) is the nth iteration of the value ‘a’. You have to kind of squint your eyes when you look at it. The example was not beholden to any particular language syntax, so the ‘=’ operator is not comparison here, it is assignment.

    Besides, if you follow the logic of the post, you’d come to the conclusion that the proliferation of operators is the next frontier to tame. So in actuality, most operations would eventually be signified using just the ‘=’ symbol.

    It’s like a cosmic grand-overload-theory.


  6. Stephane Rodriguez says:

    "’a’ is not actually a function. a(n) is the nth iteration of the value ‘a’."

    Whether you talk discrete values rather than continuous ones doesn’t change a bit the lack of value of your statement. Either it’s mathematical, and it will stand for itself. Or it’s crap, as in the original post.

    Ironically, a(n) = a(n-1) + 10 is exactly the reason why code has gone so wrong, in the sense it cannot be proven. Whenever you intend to implement a mathematical model, that is a provable one, you could for instance choose to implement preconditions and postconditions. Not that that would be enough, but at least this would make your post look more like from someone who shares a thought, rather than someone who gets back in infancy and finds that Basic is actually a great language to rely on.

  7. Matt says:

    I apologize for troubling you so much over this. The original post was a joke.

  8. Ken Brubaker says:

    Sarcasm as an artform

  9. D. Brian Ellis says:

    Ugh. Been using C++ and C# too long now. Your article makes me cringe (I would miss my semi-colons too much). Cool thinking though. Here’s a funny comparison. It sounds like you want Prolog. No declarations or structures, just a bunch of rules that sort themselves out when given input. I did some Prolog in college. Ouch. Through Ada, Lisp, C++, Java, etc., the only thing I couldn’t get my head around was the logical programming paradigm (Prolog).
