20160325

If you don't see sharp you'll be flat



Visual C# is a strongly typed, object-orientated, managed programming language, as opposed to the computer's native assembly language, which is not typed, orientated or managed at all.

If I've lost you already, it's likely only to get worse, so, if I were you, I'd go find something else to do.

The point I am making is that, IMHO, the education of all serious programmers ought to start with assembler before moving on to high-level languages like C#, because otherwise they will inevitably fall into the traps of unnecessary obfuscation, abstraction, bloating, redundancy, and other such anti-Occam's-Razor-isms that force the rest of us to buy ever more complex computers. The young people of today won't believe you when you recall the surprisingly complex software that used to run within the MS-DOS memory limit of 640 Kbytes. Not that complexity is bad per se - nature is full of it!

"Managed" means the language depends on, in this case, the gargantuan Microsoft .NET framework and cannot run without it. Which means the programmer does not have full control of the machine and the very simplest program still needs all that framework code.

The objects in "object orientated" are packages of code that are called by the main program, rather like the subroutines of assembler-speak but with many more bells and whistles, which are nice but only when needed. C# forces the programmer to use objects even when something less heavy would be more efficient, all in the interests of readability and self-documentation.
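For instance, C# has no free-standing subroutines at all; even a two-line helper must be wrapped in a class, whether or not the packaging buys anything. A sketch:

    // Illustrative sketch: there is no way to write Add on
    // its own, as one could in assembler; it has to live
    // inside a class.
    class MathBits
    {
        public static int Add(int a, int b)
        {
            return a + b;
        }
    }

    // Called from elsewhere as:
    //     int sum = MathBits.Add(3, 4);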

"Strongly typed" means that every quantity is forced to be of a specified type. Like when Mickey Mouse is doing his homework and asks "what is 3 + 4?" and Pluto replies "Is it apples or bananas?"

The computer itself only knows about "bits", aka binary digits, that are either "on" or "off". Groups of these can represent either instructions or data according to context. The computer doesn't care whether the data describes apples or bananas; that's the programmer's job, just as it is the mathematician's job to decide how to use an algebraic symbol or what to use the digits 0 to 9 for. So when C# insists that a symbol must describe, e.g., apples, that distinction exists purely in the mind of the compiler (the software that converts a high-level language like C# into assembler).
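C# can even be made to demonstrate the point against itself: the same 32 bits, read once as a float and once as an integer, are perfectly happy either way. A sketch:

    using System;

    class Bits
    {
        static void Main()
        {
            float f = 1.0f;

            // Take the raw bytes of the float and reinterpret
            // them as an int. The bits don't change; only the
            // label in the compiler's mind does.
            byte[] raw = BitConverter.GetBytes(f);
            int asInt = BitConverter.ToInt32(raw, 0);

            Console.WriteLine(asInt);   // prints 1065353216
        }
    }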

Which is all fine and dandy until it gets out of hand. Like when the QuickBooks SDK interface I have been working with requires me to use its programmers' name for a value that, to the computer, is just an integer, and woe betide me if I spell or capitalise it wrongly.
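The shape of the thing is roughly as below - the names here are invented for illustration and are not the SDK's real identifiers:

    // Invented names, not the real QuickBooks SDK ones.
    // Under the hood the "type" is nothing but an integer:
    enum PaymentMethodType
    {
        Cash = 0,
        Cheque = 1,
        CreditCard = 2
    }

    // The interface demands the spelled-out name, e.g.
    //     request.Type.SetValue(PaymentMethodType.CreditCard);
    // Misspell or miscapitalise CreditCard and the compiler
    // balks, even though all the computer ever sees is a 2.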

This morning I read a news article reporting how, when a certain Jennifer Null tries to buy a plane ticket, she gets an error message on most websites. This is a prize example of the point I am making. There is no reason why a programming language should not define a special value to mean that a variable or database field has not yet been assigned, but to confuse a string of characters "Null" with that value is unforgivable.
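One can guess how it happens. Somewhere a lazy exporter writes missing values out as the text "Null", and somewhere else a lazy importer turns that text back into a missing value - something like this sketch:

    // Illustrative only: the bug that bites Mrs Null.
    static string ReadSurname(string field)
    {
        // The unforgivable confusion: a real surname is
        // mistaken for the special "no value" marker.
        if (field == "Null")
            return null;

        return field;
    }

    // ReadSurname("Smith") -> "Smith"
    // ReadSurname("Null")  -> null, and the booking fails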

I came across a new word recently that describes this nonsense nicely: cruft = badly designed, unnecessarily complicated, or unwanted code or software.
