Monday, January 12, 2009

Why I love Clojure

There is a fundamental fact about programming languages, or rather about my interaction with them: I love the newb feeling.

This is a dangerous addiction, because it doesn't take long before you never get the newb feeling; at most I have another couple of years where I can find languages that fascinate me.

I just thought it was damn fun trying things out in the REPL and finding interesting things.

Somehow, C++ has lost that interest for me, and I think I know why.

First off, it is just butt-ugly. Templates look like shit, although I think they are cool as a compiler-extension mechanism.

Second off, you get problems like this. Today at work, a co-worker and I were working with a large piece of legacy code that I didn't write, and we added something that made the program crash during a memory-reclaim operation (it's an embedded game engine, so it partially does its own memory management).

We went through the obvious possibilities, the most likely being that the object was somehow being deleted twice. That wasn't the case, so we wandered around the code. Finally I decided to try changing the order of inheritance for an object with multiple inheritance. That fixed the problem.

Somewhere in the code, there are lines that reinterpret_cast something they should be using static_cast for. The project doesn't have RTTI enabled, so dynamic_cast is out of the question. That pisses me off, but it is only one of about 100 things about this codebase that piss me off.

Anyway, the difference is that under multiple inheritance, the actual pointer value can change during an upcast or a downcast. static_cast knows the object layout and adjusts the pointer; reinterpret_cast just relabels the bits and leaves the address alone. This is because of the way C++ objects are laid out in memory and the vagaries of vtable implementations.
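To make that concrete, here is a minimal sketch; the class names are made up for illustration, not from the codebase in question, and the exact addresses depend on your compiler's ABI:

#include <cstdio>

// Toy classes, invented for illustration.
struct A { virtual ~A() {} int a; };
struct B { virtual ~B() {} int b; };
struct D : public A, public B { int d; };

int main() {
    D d;
    D* pd = &d;
    // static_cast knows D's layout and shifts the pointer to the B subobject.
    B* adjusted = static_cast<B*>(pd);
    // reinterpret_cast keeps the original address, which still points at the
    // start of D; using it as a B* is exactly the kind of bug we were chasing.
    B* unadjusted = reinterpret_cast<B*>(pd);
    printf("D* %p, static_cast %p, reinterpret_cast %p\n",
           (void*)pd, (void*)adjusted, (void*)unadjusted);
    return 0;
}

On a typical compiler the first two pointers differ, and the reinterpret_cast one silently matches neither the B subobject nor anything safe to call through.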

This is, coincidentally, why the diamond of death is such a big deal in C++. You end up with two copies of the top of the diamond in memory. Thus:

  A
 / \
B   C
 \ /
  D

A would be in D's memory allocation twice. Thus, static_cast<A*>(static_cast<B*>(d_instance)) would give you a different answer than static_cast<A*>(static_cast<C*>(d_instance)). Finally, if you want to avoid all of this, you can use virtual inheritance, which looks like class B : virtual public A.
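Here is a sketch of both layouts, again with toy class names of my own; on a typical compiler the two A* values differ in the plain diamond and coincide under virtual inheritance:

#include <cstdio>

// The plain diamond: D ends up holding two separate A subobjects.
struct A { int a; };
struct B : public A {};
struct C : public A {};
struct D : public B, public C {};

// The virtual-inheritance version: one shared A.
struct VB : virtual public A {};
struct VC : virtual public A {};
struct VD : public VB, public VC {};

int main() {
    D d;
    A* via_b = static_cast<A*>(static_cast<B*>(&d));
    A* via_c = static_cast<A*>(static_cast<C*>(&d));
    printf("plain:   via B %p, via C %p\n", (void*)via_b, (void*)via_c); // different

    VD vd;
    A* shared_b = static_cast<A*>(static_cast<VB*>(&vd));
    A* shared_c = static_cast<A*>(static_cast<VC*>(&vd));
    printf("virtual: via B %p, via C %p\n", (void*)shared_b, (void*)shared_c); // same
    return 0;
}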

Then, however, access to the base class's data takes longer, because there is an extra indirection in the middle: the compiler has to look up where the shared base actually lives.

All of these details distract you from getting your algorithm right or doing a very, very good design, and thus I think most C++ programs are fundamentally worse-designed than a lot of programs in other languages.

You can only really concentrate on so much simultaneously. The more you are focusing on the details of an arcane language, the less you are focusing on your algorithm and its failure cases, or on the bigger picture: where this piece of code fits in the system, or how minimally you can accomplish the task.

The pain of refactoring C++ leads you to do an inordinate amount of up-front design, which is always worse unless you are solving the same problem again, which you never are.

Finally, TheEyeStrainCausedByLookingAtCamelCaseClassNames, CamelOrJavaCaseVariableNamesAlongWith half_of_the_language_using_underscores_which_no_one_uses_even_though_the_damn_language's_standard_library_is_written_with_it. Mix this with reading literally about ten times as many characters. And what exactly does this mean?
(this->*item[index])(reinterpret_cast<SomeStupidClassName*>(*data)[4]);
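For what it's worth, here is one hypothetical set of declarations under which that line would even compile; these are guesses for illustration, not the actual code:

struct SomeStupidClassName { int payload; };

struct Widget {
    // An array of pointers to member functions, each taking one argument.
    void (Widget::*item[8])(SomeStupidClassName);
    void** data; // points at a type-erased pointer

    void dispatch(int index) {
        // Reads as: fetch the member-function pointer at item[index], invoke it
        // on this, passing element 4 of *data viewed as a SomeStupidClassName array.
        (this->*item[index])(reinterpret_cast<SomeStupidClassName*>(*data)[4]);
    }
};

Which is to say: a member-function dispatch table plus a type-erased buffer, crammed into one unreadable line.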

And tell your coding conventions to fuck themselves. I can write better, more understandable code without conventions than you will ever touch with your 10,000 lines of coding conventions, most of which have never been proven to improve the readability of code. Every second you spend on coding conventions is a second you could have spent on good or, god forbid, great design; and frankly, it is pure bad design that makes things hard to work with, not code conventions. So unless you have spent the time going through thousands of programs, in different languages, with different coding conventions, so that you have the breadth and depth of knowledge to form even a semi-informed opinion on what makes code understandable to other people, do not write a single line of a coding convention. Because frankly, you don't have any idea what you are talking about.

Man, now that is done with. I doubt I would hate code conventions as much as I do if I hadn't worked in a language that requires so much goddamn typing because its base levels of abstraction are just too low to express the vast majority of concepts clearly or precisely.

It was brilliant of Java and C# to take the route they did: take the worst aspects of C++'s type system and continue them. If you want types, use good type inference. If you don't, use a dynamic language. Piss-poor required static typing is just a waste of characters. Since every line you write is a liability, it leads to code that is overly verbose, hard to refactor, and *requires* sophisticated editors to manipulate in some excuse for efficiency.
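A sketch of the kind of ceremony I mean; this is the standard pre-inference C++ way to walk a map, with the full type spelled out at every use site:

#include <map>
#include <string>

int total(const std::map<std::string, int>& counts) {
    int sum = 0;
    // The iterator type must be written out in full; nothing is inferred.
    for (std::map<std::string, int>::const_iterator it = counts.begin();
         it != counts.end(); ++it) {
        sum += it->second;
    }
    return sum;
}

Every character of that iterator declaration is information the compiler already had.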

I have arrived at the point in my career where I would take slow, interesting, concise code over fast, boring-as-hell, tedious code. I have never had an unsolvable optimization problem; and I have written MMX and x86 assembly, used Intel's SSE and a bit of AMD's 3DNow! vector instructions, not to mention some insanely fast linear algebra on large datasets using CUDA.

Some things are just tedious and kind of suck. Most things can be done elegantly. Clojure makes the tedious no more tedious than it was going to be anyway, and makes the elegant simple and interesting.

Chris

2 comments:

skogs said...

Beautifully written post. I agree completely, and Clojure seems to have a good future.