Fog Creek Software
Discussion Board




Are we all crappy coders?

In the recent thread that asked where all the crappy coders were going, Simon Lucy wrote:

>One of milestones on the Road to Enlightenment in the
>Zen of Software Development is that one is, to use your
>colourful adjective, crappy.
>
>Having realised that one deals with all the results of
>being crappy and ameliorates them to the degree where
>competence is reached.

Simon's comment has been churning around in the back of my mind since I read it, but the more I think about it, the more lost I seem to get. These are some of the questions that have been nagging me.

1. Is every coder, in fact, a 'crappy' coder?

2. By avoiding 'crappiness', does someone become a great developer? Or merely competent? That is, is crap-avoidance sufficient for greatness?

3. By limiting one's potential for crappiness, does one also limit one's potential for greatness?

Comments?

Ken Dyck
Wednesday, October 08, 2003

Everyone has their own strengths and weaknesses.

Everyone.

Mike Swieton
Wednesday, October 08, 2003

Everyone's crappy at some point, especially when doing something for the first time.  What sets the good people apart is that they learn from their mistakes.

And it can never hurt to have words such as "ameliorate" in your vocabulary...;-)

Tim Lara
Wednesday, October 08, 2003

Also, with reference to #3 in the original post, there is definitely a fine line between taking risks and being different for the sake of being different.

Some people manage to err on the side of innovation, whereas others are just overly opinionated, cavalier, and stupid.

Tim Lara
Wednesday, October 08, 2003

One of the things that I think exacerbates the level of crappiness in our industry is the absolute dearth of tools in common use.  By this I don't mean that there's a shortage of languages or libraries or IDEs or whatever.  To my mind those things are mostly variations on a theme. 

A (somewhat over-generalized) example of what I do mean is the use of imperative languages in every conceivable context.  It doesn't matter what's being done, at what level of granularity, under what constraints.  The solution invariably comes down to writing lists of instructions in an imperative language for solving the problem, even when there are clearly better alternatives available.

These kinds of languages were developed to support number crunching applications a long time ago.  They've been extended way past where they represent a clean solution to a problem.  If we could diversify our approaches to problems in this area, and choose the proper approach for a given problem, the level of crappiness would plummet, IMHO.

disturbed
Wednesday, October 08, 2003

Ah yes, the dream of telling a computer what to do without actually telling it what to do -- merely telling it what should be done ...

And then coming back and saying, "no no no, that wasn't what I meant at all" ...

Alyosha`
Wednesday, October 08, 2003

My biggest disappointment with the paid work of creating software for a company is that they typically want it just good enough. They don't want perfect designs and elegant solutions; they just want the problem solved. I have written some crappy code to meet deadlines, but I would like to think I am not a crappy coder, but rather an employed coder who is sometimes asked to make crappy solutions.

m
Wednesday, October 08, 2003

"Ah yes, the dream of telling a computer what to do without actually telling it what to do -- merely telling it what should be done ...

And then coming back and saying, "no no no, that wasn't what I meant at all" ... "

Sounds like exactly the kind of cop-out that leads to the problem I mentioned above.  Maybe it's self-generating.

disturbed
Wednesday, October 08, 2003

Well, disturbed, how about some links to some industrial-strength code that works as you say, so we know what you mean?

I'll go first -- there's a dude somewhere who built a robot in 1 million lines of lisp to draw paintings, paintings of people and flowers and such that were creative and attractive. Saw these paintings in a gallery years ago.

So that's the only one I know of.

Is there a better language than C for creating a game engine?
And beyond that, programming of games is done with proprietary 3-D design languages called level editors, which are not imperative.

Maybe there are already tons of non-imperative programs out there but you aren't noticing them? Just because the interpreters are written in C shouldn't be a problem -- seems to work quite well for that purpose.

Dennis Atkins
Wednesday, October 08, 2003

Hi Dennis,

I'm sure there's lots of stuff out there written in Lisp, and there's probably quite a few folks out there that actually think, "hey, Lisp would do that very well."  In my experience, though, this is the exception, rather than the rule.  What I see most often is large systems written completely in Java or C# or whatever.

Also, I'm not even necessarily talking about language choice.  I just used that as an example.  How about architecture?  The procedural decomposition that goes into most designs is based on one function calling another calling another to get its result.  This ties all of these things together very tightly.  For a lot of software, this kind of thing is unnecessary and undesirable.  For an example of something that might be better, look at all the creative ways you can string together independent Unix programs using pipes while still preserving the independence of each program.
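The pipe idea translates directly into code. As a purely illustrative sketch (the stage names here are invented), the same style in Python: each stage is an independent generator that knows only its input stream, never which stage produced it, so stages can be rewired freely.

```python
# Pipe-style composition: independent stages wired together at the edges,
# much like `producer | filter | transform` in a Unix shell.
def numbers(limit):
    """Source stage: emit the integers 0..limit-1."""
    yield from range(limit)

def evens(stream):
    """Filter stage: pass through even values only."""
    for n in stream:
        if n % 2 == 0:
            yield n

def squared(stream):
    """Transform stage: square each value."""
    for n in stream:
        yield n * n

# The wiring lives here, outside the stages themselves.
pipeline = squared(evens(numbers(10)))
print(list(pipeline))  # [0, 4, 16, 36, 64]
```

Each stage stays reusable on its own because, as with Unix pipes, none of them names its neighbours.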

I'll wager a system that is written using a mix of all these things would be a nice clean, crap-free solution to a lot of problems.  Maybe components written imperatively connected together by pipes and queues and controlled with a rule-engine like the one available from The Haley Enterprise or Forgy's new Java offerings, or even JESS.  Each approach does what it does best and defers on things where it falls short. 

Imagine a GUI system where components are written imperatively for things like rendering, but where a rule-based system handles event processing and state maintenance.
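That split can be sketched in a few lines (a toy, with all names invented): the imperative part would do the rendering, while a table of condition/action rules decides what each event means.

```python
# A toy rule table for GUI-ish event handling: each rule pairs a condition
# over (event, state) with an action on state. No handler calls are hardwired.
rules = [
    (lambda e, s: e == "click" and s["enabled"],
     lambda s: s.update(clicks=s["clicks"] + 1)),
    (lambda e, s: e == "disable",
     lambda s: s.update(enabled=False)),
]

def dispatch(event, state):
    """Fire every rule whose condition matches the event and current state."""
    for condition, action in rules:
        if condition(event, state):
            action(state)

state = {"enabled": True, "clicks": 0}
dispatch("click", state)
dispatch("disable", state)
dispatch("click", state)   # ignored: the click rule's condition no longer holds
print(state)  # {'enabled': False, 'clicks': 1}
```

Adding a new behaviour means adding a row to the table, not threading another call through the imperative code.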

Just to illustrate my point, I'm sure all of you have witnessed endless debates about what language and libraries to use for "the next project" wherever you work.  How many similar debates have you witnessed around the even more fundamental choice of computing model?

Anyhow, I guess a lot of it just depends on what, in your mind, makes code "crappy."  Probably one man's crappy is another man's clever, eh?

disturbed
Wednesday, October 08, 2003

Oh man - I guess we are in a pissing contest between people with completely different perspectives.

Disturbed - I think - has been writing code that models a complex problem domain, and is frustrated because C++, Java or C# makes everything a pain in the butt - right?  He would prefer some kind of Lisp, or Python, Ruby, etc.

Alyosha and Dennis Atkins are more interested in a language/environment that allows them to access the services/APIs they need to use.  A language that lets them express the problem itself in the most elegant way possible is beside the point - maybe because the problems they are trying to solve aren't all that interesting - just CRUD stuff.  Or because performance is an absolute necessity, so you have to go with C.

In my work, I need to find a compromise between the two perspectives.  I've been using C#, and the addition of generics will make things much nicer.

Reluctant bureaucrat
Wednesday, October 08, 2003

Generics would definitely be a nice addition to the language.  I wish they would have implemented Eiffel, but changed the syntax.  Since Bertrand Meyer was involved early on, I wonder why more of the Eiffel concepts didn't make it into the language.  Definitely a lot in there to help clean up crap.

crap-a-holic
Wednesday, October 08, 2003

Man I don't know about that, I'm not ready to start pissing I'm still trying to find the piss bucket.

I guess I don't even know what you mean by imperatives. I thought imperatives were languages that use statements in a sequential-ish order, like C, Python, Basic, etc. And you've got these other languages like "functional languages" (is that the term for lisp and eiffel like languages) and they are thought to be very different for some strange reason. And then you've got Smalltalk/Objective C which is message based or something like that. I've messed with all of them and others and I'm never really sure what these fundamental differences are supposed to be, though I do believe that they are there, I just don't really get what the big deal is.

Now what you're saying about architecture and structure sounds like something I could really dig. Toss language out as a consideration and talk about how all structures are based on functional decomposition/structured programming or something like that as the main model, as opposed to some other model.

It's not all real defined for me yet what we're talking about, so keep talking please.

(And hey, yeah I find the stuff I work on interesting. Add how that always seems to make me seem stupid to others, as if only dumb people find their work interesting.)

Dennis Atkins
Thursday, October 09, 2003

Reluctant,

I think you have the right of it from my point of view, for the most part, except for the pissing.  I'm not sure that I'm frustrated with things being a pain in the butt, though.  I think it's more the waste that drives me crazy, and probably even more, my own inability to do anything about it :)  Probably how a carpenter would feel about the effort wasted driving in nails with a screwdriver because there aren't any hammers available, and no one wants to stop and run to the hardware store to get some.

And Dennis, good catch.  If I'm going to rant about something, I should probably be a little more precise about what I'm talking about.  To my understanding, imperative approaches have a couple hallmarks:

1.  Described in terms of actions: "Set this variable to that value" or "Compute this function for these values"  or "Send this message to that object."  Imperative sentences.
2.  Also, baked in, explicit sequencing of operations:  "First do this, then do that, then do the other thing"

Of these two, I think the second one bugs me the most.  If you look at how other kinds of engineers build stuff, the connections between components are almost never built into the components themselves.  Parts get connected with wires or linkages or cables or hydraulic lines or whatever.

We hardwire connections between objects/functions into our work and then gripe that we can't produce anything that is reusable.  So, we either rewrite code a lot with minor variations or we hack our way around it.  This invariably inflates the crap-factor.  If you look at many of the design patterns, you can see that a lot of them are just ways to loosen up the sequencing of operations, exchanging rigid code for flexible "data" in clever ways.  I guess Lisp gets distinguished because code and data are not really distinct in Lisp, but it still seems pretty imperative to me, at the end of the day.

This imperative approach is great for problems that are algorithmic, but not (IMHO) so good in levels above this where we aren't so much concerned about algorithms as we are about making decisions, routing information here and there, aggregating information, and so on. 

One thing that stands out for me: if you look at a natural language description of a solution to the sorting problem, it reads almost like the Java code that would implement the solution.  If you read a natural language description of a higher-level business problem or a simulation problem, though, there's most often quite a break between this description and the matching Java code, especially for problems involving a lot of decision making.  A solution that uses a model of computing like the one in OPS5, though, is often a high-fidelity match to the natural language version.  (Note, I am not endorsing OPS5 - the syntax is horrible.)

disturbed
Thursday, October 09, 2003

Back to the original point of the thread:

>One of milestones on the Road to Enlightenment in the
>Zen of Software Development is that one is, to use your
>colourful adjective, crappy.
>
>Having realised that one deals with all the results of
>being crappy and ameliorates them to the degree where
>competence is reached.

In essence: accept that you make mistakes, are less than perfect, and can learn and improve; then actively work at identifying those mistakes, making fewer of them, and improving your skill.

The difference between people who realise this and those that don't is that people who don't realise they're crap tend to overestimate their ability, and because they think they're already great they don't try to improve. They're already God's gift; why do they need to be better?

The more you learn the more you come to realise that there's so much more to learn. You may be less crap than you were last week but you're nowhere near as good as you could be...

It's a case of keeping your ego in check ;) Personally I find the best way to keep reminding myself that I have so much more to learn is to openly admit my mistakes; recently I've found that my blog is a good place for such soul searching...

I think there's possibly a more troubling issue at the core of all this 'crap people don't know they're crap' thing; it may be more to do with being true to yourself and being honest with yourself in your head... But it's too early in the day for a discussion like that (not to mention the lack of an adequate supply of Guinness and comfy chairs).

Len Holgate (www.lenholgate.com)
Thursday, October 09, 2003

Well, loath as I might be to comment on such ego stroking in the initial comment, I'd make this point.

You can't avoid crappiness, you can't not be crappy.

Once you know that, you attend to it and work incrementally to smooth out the crappiness until it's lodged in such tiny granular cracks that it's not worth the bother to grind any more.

And yes, the point is there is no 'I' in programming.  The code the processor runs is not you in any sense.  So for all those that take the word crappy and externalise it, into the tools they use, or the platforms they have to implement on, or the management that won't soothe their environment until it's just right, or that produces targets that insult their sense of taste: the point will be missed.

Is this a depressing thing, the crappiness?  No, not at all.  Once you apply Quality to that which you do, instead of praising yourself for 'doing something right' the first time, you'll write code and design systems that seem to those around you as if they came fresh-baked and written once.

And if you doubt this crappiness, think back on those times when the code has poured like a river from you, when your typing speed has increased, when you are fashioning an entire construct of process and state in a kind of trance.

And the next day you look on it and see that its premise is false, that entire trees of conditions have been missed, and you set to with the small chisel to turn it back into the silk purse you thought it was.

Or other occasions when you're just as inspired, just as fluid, but there is the voice hovering over each change, criticising it, setting mental bookmarks to go back and smooth away the roughness and pick up the detritus, to spit, polish and shine.

The improvement is in accepting that one is crappy and then dealing with it.

Simon Lucy
Thursday, October 09, 2003

Disturbed,

Why do programmers insist on using imperative languages when languages better suited to the individual tasks are available?

I think the key reason is the effort of learning a new language.  If the language is Turing complete then I can do anything I want.  Sure, it takes 1000 lines of code, but I can write that code using a familiar syntax.

I may be able to achieve that same objective using LISP in 20 lines of code, but then I have to learn LISP. 

Since I'm learning Scheme at the moment as my LOTY, I know that it really isn't easy.

The implementers of Lisp or Scheme really don't help here.  For example, why the insistence on using CDR, CAR and the 'convenience' functions like CDDR and CDDAR [1]?  These instructions are a nightmare.

Rebol, on the other hand, has functions like HEAD, FIRST, SECOND, THIRD, NEXT and TAIL [2].  It makes the code so much easier to read.

[1] http://www.ccs.neu.edu/home/dorai/t-y-scheme/t-y-scheme-Z-H-4.html#node_sec_2.2.3

[2] http://www.rebol.com/docs/core23/rebolcore-6.html#section-1.3
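The naming complaint is easy to see in a toy sketch (Python standing in for Lisp; every definition here is invented for illustration): the same cons-cell operations under both the traditional and the readable names.

```python
# Cons cells as nested pairs, with the traditional accessors...
def cons(a, d):
    return (a, d)

def car(p):
    return p[0]

def cdr(p):
    return p[1]

# ...and readable aliases for the very same operations.
first, rest = car, cdr

def second(p):
    return car(cdr(p))   # the composition Lisp spells CADR

lst = cons(1, cons(2, cons(3, None)))
print(car(cdr(lst)))   # 2 -- the traditional spelling
print(second(lst))     # 2 -- says what it means
```

Nothing computational changes between the two spellings; only the reader's effort does.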

Ged Byrne
Thursday, October 09, 2003

If you're using DrScheme, you can turn on the Pretty Big language.  It has first-seventh instead of car/cdr.  Common Lisp's standard has first-tenth.  The Scheme standard may not have it, but it's a consciously minimal lisp; there are more conveniences missing than just that.

You might like the upcoming Apress Common Lisp book, which the author opened up for comments.  (It's fun to spot bugs. ;)  http://www.gigamonkeys.com/book/  I don't know if you'll like it, but it's different from the ivory tower approach.

I've always thought the nice thing about Lisp was that you were supposed to forget you're a programmer and instead build abstractions.  Whatever their faults are, the lisps certainly have abstraction-building facilities...

Tayssir John Gabbour
Thursday, October 09, 2003

Since the original question was about improving ourselves as coders, and not the philosophy of choosing languages, I'll try to direct this at improving our software writing ability. 

For those of you arguing about language, mark yourself in the crappy coder column because you're saying that it's the tools that make the coder.  Go play with your lisp compiler and let the rest of us stay on topic.

Something I noticed in myself as a crappy coder was a tendency not to see the minimalist solution.  C on UNIX, for instance, makes it very easy to keep user-specific data in a collection of files in a user's home directory. When I implemented a system a while back, I didn't look at this solution longer than it took to dismiss it.  Instead I had a centralized session-management system for a CGI program where datasets were user-centric.  The centralized session management was problematic, took too long to implement, and ultimately failed.  A colleague with a lot more experience was able to re-implement the same system without sessions at all, just by storing data in files in the user's home directory.  Much less complexity, much easier to debug and support, and it left a nice audit trail when debugging user problems.
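That simpler file-based design can be sketched roughly like this (the `.myapp/prefs.json` path and the data are invented for illustration): per-user state lives in plain files under the user's own directory, so there is no central session store to break.

```python
# Per-user state as plain files in the user's home directory:
# no session manager, and the files double as an audit trail.
import json
import tempfile
from pathlib import Path

def pref_file(home: Path) -> Path:
    return home / ".myapp" / "prefs.json"

def save_prefs(home: Path, prefs: dict) -> None:
    path = pref_file(home)
    path.parent.mkdir(parents=True, exist_ok=True)  # create ~/.myapp if missing
    path.write_text(json.dumps(prefs))

def load_prefs(home: Path) -> dict:
    path = pref_file(home)
    return json.loads(path.read_text()) if path.exists() else {}

# A temporary directory stands in for the user's $HOME here.
home = Path(tempfile.mkdtemp())
save_prefs(home, {"theme": "dark"})
print(load_prefs(home))  # {'theme': 'dark'}
```

Debugging a user problem is then just a matter of looking at their file.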

So my thinking is that one of the things we can do to reduce our own crappiness is to learn from other programmers.  When it's possible, a more experienced programmer is a great resource, but I think that even less experienced programmers have something to bring to the table.  Even when they don't have the skills to bring their ideas to fruition, they do offer a fresh set of eyes and new ideas.

Clay Dowling
Thursday, October 09, 2003

First rule of programming: everyone else's code sucks.

This is because programming is very much a construction of the human mind, which is very individual. Style differences and mental model differences cause a real dissonance between programmers.

somebody
Thursday, October 09, 2003

That's good to know.  The book looks interesting.

The question still remains: when they were devising Scheme, why did they choose to lose FIRST and keep CDR?

Ged Byrne
Thursday, October 09, 2003

"For those of you arguing about language, mark yourself in the crappy coder column because you're saying that it's the tools that make the coder.  Go play with your lisp compiler and let the rest of us stay on topic."

Since the discussion about languages was focused on how this choice contributes to crappiness, I am forced to conclude that your ability to comprehend that discussion is at fault here, not the discussion itself.  That discussion was aimed at a root cause of the problem, rather than some obvious symptom, and so it requires some thinking.  Read it again, you'll see.

Your recommendation about keeping an open mind and learning from others flies in the face of your offhand dismissal of the language discussion, too.  Since you claim this as a characteristic of a crappy coder, I'll leave you to draw your own conclusions.

Aside from that, the discussion of the role of tools as a contributor to the problem is valuable to someone that is actually interested in learning something or thinking things through carefully.  That old saw about not blaming the tools is nonsense.  Under that philosophy, we never would have progressed past  the wheels, inclined planes, and levers that we learned about in grade three science class.

Do you think you could write good poetry in a language that had no adjectives, just because that language seemed sufficient for communicating other things effectively?  Would you recommend building computers out of pipes and valves and fluids, simply because you could?  What do you think drove people to abandon record-oriented databases in favor of relational databases?

disturbed
Thursday, October 09, 2003

somebody,

Surely; First rule of programming: 90% of code is crud. ;)

"Style differences and mental model differences cause a real dissonance between programmers" -- which is why we should be striving for simplicity in all we do. The more complex the mental model you impose on a problem, the less likely it is to fit with my view of the problem. But simple is hard...

Len Holgate (www.lenholgate.com)
Thursday, October 09, 2003

Len, there are several first rules of programming. Take your pick.

Simplicity is a function of your mentality and thus does not suffice as a way to come to agreement. What you find obvious, I probably will not. What you find clear, I probably will not. What you value, which determines tradeoffs, is unlikely to be what I value. What you find pleasing, I probably will not.

It's the same as saying that if an artwork is made simple enough we'll agree it's beautiful.

somebody
Thursday, October 09, 2003

There was someone I kind of knew in an electronic sense, but whose last name is lost in my private black hole where all names reside, so I'll simply identify him as 'J'. He produced applications grammatically.

J spent a long time on Cix (pre-internet conferencing system similar to Bix and which still exists), claiming that his methodology of defining a language, or more accurately a grammar, to solve a particular problem or class of problems was the simplest and best method.

His claim was, if I remember rightly, that designing a grammar for a specific problem or application meant that his solution was guaranteed to be correct.

He attracted a lot of flak for this, a huge amount.  But I think I could see how it could work; I just couldn't get my head around an example, and I don't remember him supplying one.

Perhaps in the intervening time things have become clearer but I can see that an Order Entry Grammar would naturally always create an accurate solution for an accurately defined set of circumstances.

I haven't done it in any real sense (apart from a few meta-languages, which aren't the same thing at all), largely because the time to design the grammar probably feels longer than assembling the well-worn patterns (no, I don't mean Design Patterns; I can never remember what the names mean) into an application.

The reason I bring this up is that if individuals feel a procedural language isn't the right one for a particular application, and the other varieties available don't fit for some reason either, then why not define the grammar?  Unless it's entirely arcane and doesn't use symbols that can be parsed, you can use good old lex and yacc to do all the hard work for you.
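As a rough illustration of the idea (the grammar and all names are invented, and Python stands in for lex and yacc), here is a toy "order entry grammar" in which the only valid sentences are well-formed orders:

```python
# Toy grammar:  order := "order" QTY ITEM ("and" QTY ITEM)*
# Anything the grammar rejects is, by construction, not a valid order.
import re

TOKEN = re.compile(r"\d+|[A-Za-z]+")

def parse_order(text):
    """Return [(qty, item), ...], or raise ValueError on an ill-formed order."""
    tokens = TOKEN.findall(text)
    if not tokens or tokens[0] != "order":
        raise ValueError("expected the sentence to start with 'order'")
    items, i = [], 1
    while i < len(tokens):
        if i + 1 >= len(tokens) or not tokens[i].isdigit():
            raise ValueError("expected QTY ITEM")
        items.append((int(tokens[i]), tokens[i + 1]))
        i += 2
        if i < len(tokens):
            if tokens[i] != "and":
                raise ValueError("expected 'and' between items")
            i += 1
    return items

print(parse_order("order 3 widgets and 2 gadgets"))
# [(3, 'widgets'), (2, 'gadgets')]
```

The correctness claim falls out of the structure: the parser can only ever produce results that match the grammar.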

(Oh and if J happens to read this and recognise themselves and I've committed an atrocity then I apologise).

Simon Lucy
Thursday, October 09, 2003

Definitely seems like an approach like that, done correctly, wouldn't admit a lot of crap :)  So, the basic idea is to create a sort of "domain language" and then a grammar that defines all of the valid and well-formed sentences in that language?  I assume this would be mounted over top of another layer that did all the requisite computation?

disturbed
Thursday, October 09, 2003

No, the grammar itself could be computational as well.

Simon Lucy
Thursday, October 09, 2003

Ged briefly mentioned the Rebol language above, and since the topic is meandering slightly in this direction (simplicity, DSLs, etc.), I'll mention a few things about this technology.

I'm not sure I've fully grasped the mindset of Rebol, but the key elements of its design philosophy are: simplicity, readability, platform-independence and universal communication. These folks are obsessed with elimination of bloat & complexity.

From a language standpoint, Rebol unifies code and data in a system of values. For example, there are over 40 datatypes (values) which represent everything from basic strings and integers to high-level values such as email addresses, urls, money and files. In this system, code/data == values and are interpreted indiscriminately, much as a CPU processes bytes -- rather than code operating on data (procedural langs) or objects performing operations on data via method invocation (OOP).

The result is a system that enables reduced complexity through a clean syntax (no keywords, little punctuation), highly polymorphic functions (e.g., read url, read file, read object), clean flow of execution (functional style) and strong abstraction capabilities, resulting in programs with relatively few lines of code. Rebol itself is only a few hundred KB.

On top of this, a built-in EBNF style parser permits the creation of domain specific languages, so you can model your program using a grammar/vocabulary of your choosing.

Is Rebol simple? Like a Celtic knot, it is simple yet intricate. I'm still digesting this language, so I can't claim to be an expert. To use Rebol like a guru, I think you need to be good at abstraction and have a knack for stripping problems down to the essence. Since this skill cannot be automated by your favorite visual code editor, I'm not sure it will ever be mainstream. If you're interested, the url is: http://www.rebol.com

Edoc
Thursday, October 09, 2003

MDA might be the new little languages approach.

somebody
Thursday, October 09, 2003

By MDA, do you mean Model Driven Architecture / UML stuff? Got a link? UML hasn't done much to reduce complexity in my projects...

Edoc
Thursday, October 09, 2003

yes we all are.

o' my
Thursday, October 09, 2003


There is only one difference between a crappy and a non-crappy coder.  Just one simple test... ask them "why".

The non-crappy coder will give you a sensible, logical and thoughtful reason(s).

The crappy coder will tell you "because I was told to".  The teller can be boss, co-worker, client... or spec.  The crappy coder is only interested in CYA.

Joe AA
Thursday, October 09, 2003

http://www.omg.org/mda/. UML isn't necessary, but it's not UML in the normal sense anyway.

coder
Friday, October 10, 2003

Most of us aren't allowed to choose the language we use, especially when we're on a project that involves more than one programmer.  So the measure of how crappy you are is relative to how well you do with the tools you are allowed to use.

I disagree that crappy code is whatever is written by somebody else because it doesn't fit our mental model.  I have seen code by others that I have to say is quite good code, even though it wasn't the same way I would have done it.  I have also seen code written by others that is just undeniable crap.  I remember encountering this program that was supposed to account for leap years when doing its calculations, and instead of figuring out the leap years properly it did some hardcoded garbage like "if year==2000 or year==2004".  As if the world would end before 2008 (which it indeed might, but that is far from a certainty).
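For the record, the Gregorian leap-year rule that hardcoded version should have used fits in one line:

```python
def is_leap_year(year: int) -> bool:
    # Divisible by 4, except century years, except every fourth century.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print([y for y in (1900, 2000, 2003, 2004, 2100) if is_leap_year(y)])
# [2000, 2004]
```

One correct line versus a list of special cases that silently expires.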

Crappy code can also manifest itself in other undeniable ways ... it simply produces wrong results or performs like a slug when it doesn't have to.

T. Norman
Friday, October 10, 2003

somebody,

"Simplicity is a function of your mentality and thus does not suffice as a way to come to agreement."

I agree. Like I said, it's a hard problem. That doesn't mean it's not something worth working towards; the results of trying hard to make something simple are often more easily understood than the results of not bothering to try.

Certainly I find that others seem to understand my code more easily now than they (or I) used to. Therefore the fact that I try hard to make things simple seems to pay off for my clients. I've found that one way of making yourself improve this kind of thing is to write about the code and make it available, and then deal with the resulting questions from people who are trying to use it.

The fact is, I still have a long way to go, but I know this and I keep trying new ways to make what I do more accessible to others. I think this brings me back to my first message on this topic... Realise that you can do better and strive towards it.

Len Holgate (www.lenholgate.com)
Friday, October 10, 2003

Joe AA,

I think you've just explained what Reloaded was all about!

Ged Byrne
Friday, October 10, 2003

Laziness is the programmer's greatest virtue.

I'm always looking at legacy code, saying to myself "Why weren't these programmers lazier?"  If they had just taken their hands off the keyboard for a few minutes and grabbed a cup of coffee while thinking about the problem, they could have achieved the same effect with, say, 10 lines of code instead of 100.

Laziness also shows up when a programmer gets tired of repeating a certain task.  They just write a script, an automated build process, or whatever to do the job so they don't have to do it anymore.  I believe this is how Perl came about.

In summary, to be a great coder, you must be exceedingly lazy.

Jim Rankin
Friday, October 10, 2003

Another point on crudiness.

There was a study done that shows incompetent people are more likely to overrate their competence than competent people.  In other words, if you're really cruddy you probably think you're fine or great.  And if you're pretty good you probably realize just how cruddy you really are.

So to those of you who see cruddiness in others but not in yourselves: you might need to take a second look :).

Jim Rankin
Friday, October 10, 2003

Yes, Jim, but it is a very special sort of laziness.

From Terry Pratchett's 'Moving Pictures':

"Victor Tugelbend was also the laziest person in the history of the world.
Not simply, ordinarily lazy. Ordinary laziness was merely the absence of effort. Victor had passed through there a long time ago, had gone straight through commonplace idleness and out on the far side. He put more effort into avoiding work than most people put into hard labour."

"He was undoubtedly the most athletically-inclined student in the University. Having to haul around extra poundage was far too much effort, ... things with decent muscles was far less effort than trying to achieve things with bags of flab."

Ged Byrne
Friday, October 10, 2003

I like how the word "cr-r-rrapp-pp-py" sounds :-) ....  "Cr-r-rrapp-pp-py code"

mista
Tuesday, October 14, 2003
