Barnacle Bits


I've been thinking of bits lately.  You know, those fiendish critters that skitter around under the structures of our data.  Take a peek.  Look in the field and you'll see them; thirty-two in every integer.  They are packed closely together, kind of like a swarm, each mindless without the others, or each of two minds, so to speak. Together they are more than the sum of their parts.  At least you start to think so after programming so long, as I have. You start to believe in them, to trust that they will always be there.  They will be there, right?


That's what I used to think, but now I'm not so sure.  It was comforting to know, when learning C, that this integer thing was really these little bits over here, aligned in sequence, part of the overall master plan of the CPU and the memory bank.  I taught myself programming long ago, and I never really quite got it until I finally understood memory.  I know it sounds silly, but once I could finally see in my mind an image of a long bank of bits, strung out from here to the horizon, each blocked into groups representing machine words, all of a sudden the rest of the program seemed trivial.  That's when I could really understand what a variable was, and what the stack was.  I had broken it down to its essence and could understand how it worked.  Kind of like tearing down an engine or a radio, but in your head.


But C gave way to C++, a slightly higher level of abstraction, and of course there were plenty of other languages between them, each with its own level of abstraction.  I mention C++ because, like C, it felt closer to the machine, closer to those bits, and that made all the difference; that was my comfort zone.


Now there is C#, and Java, and you name it.  Everything is just so far removed from the machine now.  I know we've tried to keep to the idea that an int is an int is an int.  But does it need to be?  Is there a reason it could not stray?  I can't really take the address of a field anymore, not really.  Does it really matter anymore that these fields and variables are aligned in sequence along an imaginary one-dimensional graph?  No, it does not.  It does not matter where they are, just that they exist and that they behave according to some common definition.


The bits are gone, swept away.  They are out there now, suspended in the ether, a mist of information, a buzzing swarm.  Left are only the shells, a kind reminder of the world they used to inhabit, plastered together in cascading tiers, a massive coral reef of the imagination.


Matt

Comments (2)

  1. Matt,

    >> I never really quite got it until I finally understood memory

    yup. same here.

    people that start programming today, without really knowing how memory works, have a serious problem at hand, imo. otoh – maybe this makes them free to find some really new ways to do things;

    just imagine: thinking about computing without thinking about memory layout. wow. like star trek 🙂

    WM_MY0.02$

    thomas woelfer

  2. Eric Lippert says:

    I take your point, but hold on a minute — there ISN’T any long line of bits stretching off into the distance. That’s an abstraction, not reality.

    The operating system provides you this nice list of 32 billion bits, but in reality that’s just an operating system construct. Those bits are implemented by a complex interaction of electrons moving through flip-flop circuits, swap files encoded as iron domains on spinning disks, and currents in buses. Those bits have only a tangential correspondence to the actual state of the memory chips.

    That line of bits is just the comforting illusion of what the machine is doing; the reality is much different. The great thing about C is that it lets you write programs that manipulate that line of bits rather than worrying about all those implementation details.

    But that is also what sucks about C — that the memory model is "here’s 32 billion bits — do with them as you please, and if you screw up, if you’re lucky, you’ll crash and die horribly. If you’re unlucky, you’ll introduce security holes."

    That’s not the model I want to work with — whether it is "close to the machine" or not is irrelevant; it’s simply not a model that encourages productivity. We can come up with better models for computation than mere bits.
