Fog Creek Software
Discussion Board




Is grokking pointers really so hard for some?

In this article here, http://www.joelonsoftware.com/articles/fog0000000006.html Joel describes the phenomenon of many of his fellow college students, otherwise bright, creative individuals, utterly mystified by pointers.

Is the concept of a variable holding an address of where a value can be retrieved from rather than the value itself really so confusing? Yes, it's indirection, but aren't variables themselves an--albeit, subtler--form of indirection (referring to a value by name instead of just using the value itself)? Is pointer arithmetic really an aptitude, or is it something that just needs to be explained carefully?

No thanks.
Thursday, March 25, 2004

Yes, variables and pointers are different.

Technically, a variable always refers to the same address in memory while pointers don't.

This has at least two implications: 1) If I modify a variable's value, the variable keeps that value until I modify it again (assuming there are no pointers pointing at that variable). 2) The value a pointer refers to can change unexpectedly, and the pointer itself can be made to point at totally different memory (which changes the referenced value as well).

Also: do not forget that a variable is always properly initialized, while the memory the pointer points at is not always.

René Nyffenegger
Thursday, March 25, 2004

Pointers are merely a level of indirection.  That indirection for whatever reason blows a lot of people's minds.  Back in the day I used to tutor Modula-2 (aging myself here) and C.  There were a bunch of people who never made it past the for loop, and another bunch who never made it past pointers.  I'd say less than 25% of the class was remaining at that point.

I had a similar experience in the logic classes the philosophy dept offered.  CS students often enrolled to fulfill our liberal arts requirements, as it was a blow-off, yet the other half of the class was bewildered.

What I think is interesting is that boolean logic is beyond a huge segment of the population.  That might explain the current state of the legal system here....

christopher baus (www.baus.net)
Thursday, March 25, 2004

Hmmm...  I asked Joel the same question on 'Ask Joel', but he wasn't interested.  Here were my thoughts:


It is often said - and I believe you yourself have expressed a similar sentiment - that the notion of pointers is the conceptual delimiter between hard-core developers and, perhaps, the VB/Java crowd.  Why is this so?  I agree with you, by the way - I think we've all seen enough anecdotal evidence in the form of developers struggling with pointers - but my question is *literally*, "Why is this so?"

The concept of 'pointer' seems about as simple and easy to grasp as "2400 Main Street, Any City, USA".  Grab a piece of paper, draw a box that represents a memory location.  Now draw an arrow to that box with its address.  Can there be a simpler, more concrete notion to master?  Yet, I work with physicists and mathematicians who regularly and casually discuss topics that make my head spin, and I see them struggling with pointers in C/C++.

Let me guess at one reason why this is so.  Linguists often suggest that we are actually intellectually bound by the utility and expressiveness of our languages.  Mathematics (and thus physics, engineering, and so on) has an elegant, intuitive, and highly expressive language.  History shows that with each advancement in expressing mathematics, there are accompanying leaps in comprehension of the subject.  (Just try to imagine doing number theory with Roman numerals.)  I don't believe that C++ is elegant.  In fact, I believe that it is so inelegant as to obfuscate as simple a notion as pointers to the extent that PhDs in heavyweight fields can barely understand what's going on.

What do you think?

anon
Thursday, March 25, 2004

There is nothing impossible to understand about pointers. I think they are just a wonderful means of working with arrays and big objects. Actually, one can do without them even in C++, but in that case the code will use the computer's memory inefficiently.
Suppose you have an object that has a hundred properties (half of which are also objects) and about 50 methods. If you pass it as a parameter to some function, even if you pass it as const, the whole object will be copied onto the stack. And that's a huge piece of memory!
And if you pass a pointer, only the size of this pointer will be allocated for working with the parameter.
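In C terms, a rough sketch of the difference (the Widget struct and both function names are invented for illustration):

#include <stdio.h>

struct Widget { double data[100]; };   /* stand-in for a big object (~800 bytes) */

void by_value(struct Widget w)          { (void)w; }  /* whole struct is copied  */
void by_pointer(const struct Widget *w) { (void)w; }  /* only an address copied  */

int main(void) {
    struct Widget big = {{0}};
    by_value(big);     /* copies sizeof(struct Widget) bytes onto the stack      */
    by_pointer(&big);  /* copies sizeof(struct Widget *) bytes, typically 4 or 8 */
    printf("%zu vs %zu\n", sizeof big, sizeof &big);
    return 0;
}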

Oxana Seglova
Thursday, March 25, 2004

> the whole object will be copied into the stack

const &

christopher baus (www.baus.net)
Thursday, March 25, 2004

> const &

I was talking about not using references or pointers at all

Oxana Seglova
Thursday, March 25, 2004

> boolean logic is beyond a huge segment of the population

Urk!

Should we allow these people to vote?
Thursday, March 25, 2004

Oxana, the point isn't that anyone thinks there's something fundamentally impossible to understand about pointers; it's that there's very strong evidence that a lot of people find them really, really hard to understand. This is puzzling to those of us who find the concept straightforward and don't remember ever having much trouble with it.

"anon"'s suggestion that the nasty syntax of C++ is at fault is interesting. How could we improve it? COBOLish notation like "location of x" and "contents of p"? Shorthand operators "loc x" and "at p"? Implicit application of those operators, making pointers like C++ references? (But then would pointer arithmetic become impossible, and would that be a good or a bad thing?)

I like to explain pointers in a very low-level way: explain (if necessary) the idea of the computer's address space, and then say that pointers are, more or less, addresses. This has to be festooned about with warnings that, on the one hand, C and C++ try to give you a higher-level view of pointers that *isn't* defined in terms of addresses and, on the other hand, they really do behave a lot like addresses and you can screw yourself badly as a result. Whether this actually works any better than sticking to higher-level explanations, I'm not sure. But it seems like it makes the idea more concrete and therefore easier to grasp.

Probably better, though, is to teach complete novices using a language that doesn't need pointers, and leave explaining pointers until they have some exposure to other forms of indirection in programming. I suspect the following might work well. First teach a language like Python or Scheme where there are no (explicit) pointers and all the indirection happens implicitly in sufficiently natural ways that it doesn't make the students' brains hurt. Then teach a nice clean assembly language (ARM, for instance) where indirection is very explicit and pointers are just numbers. Then you can teach something like C or C++ where pointers are explicit but a bit higher-level than raw addresses, and all the necessary concepts should be in place already.

Gareth McCaughan
Thursday, March 25, 2004

I learned C nearly 20 years ago from the original K&R book, with some "help" from the local C "gurus"...

Pointers made perfect sense to me, probably because my first real programming was in 1802 machine language, then Z-80 assembly language. What confused me about C was the relationship between pointers and arrays.

The local "guru" assured me that, in C, pointers and arrays are exactly the same thing. He told me it's perfectly legal (and common practice) to declare a array like "char foo[123];", then pass it to a function expecting a "char *bar" parameter.

This worked great for the small C programs I initially wrote (they all consisted of a single .c source file). Eventually, I started creating bigger projects and discovered the benefits of using multiple source modules.

This is where I became (briefly) confused.

If one source file has "char foo[123];" and another has "extern char *foo;", bad things happen.

I returned to the "guru".

me: C pointers and arrays are not the same.
him: Yes they are.
me: No they're not.
him: Yes they are.
me: Well, what about this? (Described the problem.)
him: Hmm... That should work.
me: But it doesn't.
him: It should.
me: But it doesn't.
him: It must be a bug in the compiler. Which one are you using?
me: MSC 3 (IIRC) under DOS.
him: Ha! DOS sucks. MS sucks. MSC sucks. You suck. Try a real compiler.
me: OK.

(Later that day, after retrying the experiment under Xenix.)

me: I tried it under Xenix. It still doesn't work.
him: It should. It must be a compiler bug. Go away.

Eventually, I disassembled the offending code and determined exactly what was happening. And, to my surprise, I discovered that C pointers and arrays are not the same thing.

So much for "gurus"...

Keith Moore
Thursday, March 25, 2004

Then I am intrigued, as that is what I was always taught as well. What was the difference?


Thursday, March 25, 2004

The problem is that the extern char *foo in the second source file declares foo as a pointer object, but nothing ever initialises that pointer to point at the array defined as char foo[123] in the first file; the second file ends up treating the first bytes of the array as if they were an address.
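A minimal sketch of the trap (the file names are made up for illustration):

/* file1.c -- the definition */
char foo[123] = "hello";

/* file2.c, broken -- tells the compiler foo is a pointer object */
extern char *foo;
/* using foo here makes the program read the first bytes of the array as
   if they were an address, and then dereference that garbage            */

/* file2.c, fixed -- declare it as what it really is */
extern char foo[];
/* now foo[0] indexes the array's own storage, as intended */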

Pointers rule
Thursday, March 25, 2004

I think those who have trouble with pointers are in that situation because they never learned about the general concepts of computer architecture, CPUs, memory addressing, etc.  They likely started with something like VB (or even HTML!) that almost totally hides the underlying nature of the computer.

So when you start talking about pointers, they are confused because they haven't even got the concept of a memory address yet.

As a teenager I started with C64 BASIC, in which you were forced to POKE and PEEK memory locations to get things done.  So grasping pointers in CS class in college was very easy.

T. Norman
Thursday, March 25, 2004

> I discovered that C pointers and arrays are not the same thing.

Well, the way you initialize arrays and pointers is different. When you write char foo[123], the program automatically reserves space for 123 variables of type char. When you define it as char *foo, you need to allocate the space yourself, using malloc (or new in C++).
That's the only critical difference, I think. Maybe those "gurus" meant that the referencing principles were identical.
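For instance, a small sketch of the two forms:

#include <stdlib.h>

int main(void) {
    char buf[123];            /* 123 bytes reserved automatically            */
    char *p = malloc(123);    /* only the pointer exists until you allocate  */

    buf[0] = 'x';             /* usable immediately                          */
    if (p) p[0] = 'x';        /* usable only after the malloc succeeded      */

    free(p);
    return 0;
}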

Oxana Seglova
Thursday, March 25, 2004

I think the general concept of references might be kind of difficult, but the implementation of pointers in C is another level entirely.

I think the dual use of * for declaring pointers and dereferencing was a confusing choice.

And I wouldn't try teaching someone C pointer arithmetic until they really understand references already. (e.g. why does (char*)foo + 1 add one but (int*)foo + 1 add four?)
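A tiny sketch that shows the scaling (assuming a 4-byte int, as in the question above):

#include <stdio.h>

int main(void) {
    int  nums[4] = {10, 20, 30, 40};
    char *cp = (char *)nums;
    int  *ip = nums;

    /* pointer arithmetic works in units of the pointed-to type:           */
    /* cp + 1 advances 1 byte; ip + 1 advances sizeof(int) bytes (often 4) */
    printf("%p\n%p\n", (void *)(cp + 1), (void *)(ip + 1));
    return 0;
}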

It might actually be easier to teach people pointers in assembly language first, then introduce C later. At least then C pointer arithmetic won't seem so quirky.

(and don't get rid of the . vs -> distinction in C, which I have seen proposed. . means "offset the pointer" whereas -> means "follow the pointer")

Dan Maas
Thursday, March 25, 2004

"Linguists often suggest that we are actually intellectually bound by the utility and expressiveness of our languages."

Linguists who suggest this is so are wrong. As has been proven experimentally.

Mr Jack
Thursday, March 25, 2004

I never quite grasped pointers.

The concept is easy to get down. A variable that holds a memory address, right?

What makes it hard is
A) C-syntax: If it were verbose like "Make X point_to Y" it would be a lot easier to understand.

B) Pointer arithmetic. Doing math on something that merely points to something else requires a special kind of brain, or a lot of training. This is where the extra level of abstraction makes things seriously hard.
I can do linked lists, or iterate through a buffer, but that's about it.

Eric Debois
Thursday, March 25, 2004

Pointers and arrays are not the same in C. A *real* guru will tell you (once you make it clear that you want the truth, not a usually-good-enough more comprehensible approximation) that pointers and arrays are different things but that arrays "decay" into pointers at the slightest provocation. The usual way to get bitten by this is as described above, but there are other differences: for instance, an array of arrays is not the same thing as an array of pointers.
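A short sketch of two places the difference shows up:

#include <stdio.h>

int main(void) {
    char a[123];
    char *p = a;          /* here the array "decays" to a pointer to its first element */

    printf("%zu\n", sizeof a);   /* 123: the whole array   */
    printf("%zu\n", sizeof p);   /* 4 or 8: just a pointer */

    char grid[3][4];      /* array of arrays: 12 contiguous chars            */
    char *rows[3];        /* array of 3 pointers: no char storage of its own */
    (void)grid; (void)rows;
    return 0;
}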

Gareth McCaughan
Thursday, March 25, 2004

I've always understood pointers -- I suppose it was because I was taught with the typical box and arrow routine.  However, where I get tripped up is when someone gets ultra-clever and has something like
*foo = **bar;

And then I have to try and figure out what the hell is being stored in foo.

But I agree that math and physics are more elegant -- they've had hundreds (thousands?) of years to throw away junk ideas whereas we've had less than 50.

Of course, Dijkstra has something to say:
“The ongoing process of becoming more and more an a-mathematical society is more an American specialty than anything else (It is also a tragic accident of history).

The idea of a formal design discipline is often rejected on account of vague cultural/philosophical condemnations such as "stifling creativity"; this is more pronounced in the Anglo-Saxon world where a romantic vision of "the humanities" in fact idealizes technical incompetence. Another aspect of that same trait is the cult of iterative design.

Industry suffers from the managerial dogma that for the sake of stability and continuity, the company should be independent of the competence of individual employees. Hence industry rejects any methodological proposal that can be viewed as making intellectual demands on its work force. Since in the US the influence of industry is more pervasive than elsewhere, the above dogma hurts American computing science most. The moral of this sad part of the story is that as long as the computing science is not allowed to save the computer industry, we had better see to it that the computer industry does not kill computing science.”

MR
Thursday, March 25, 2004

The way to understand pointer arithmetic is to think of it as a rather thin veneer over arithmetic on raw addresses. That can be fiddly and dangerous, but (I think) it's not hard conceptually. If it is, you should go and write some programs in assembly language. :-)

Gareth McCaughan
Thursday, March 25, 2004

I forgot my most important reason:

C) Pointers are no fun! The gap between coding and getting positive feedback is huge.

Eric Debois
Thursday, March 25, 2004

Mr. Jack posts:

>> "Linguists who suggest this is so are wrong. As has been proven experimentally."

I have no idea what you're talking about.  I'd love to see the studies to which you refer.  Perhaps you're thinking about some different concept entirely.  Your assertion is trivially - and quite blatantly - false.  To see this, simply consider one of the most astonishing achievements in history: the proof that the halting problem is undecidable (or, alternatively, Gödel's incompleteness theorems).  This achievement was not only aided by, but dependent on, modeling our notions of 'reasoning' using a formal language.

I sincerely hope that your post was a flame, and, if not, I hope you're not representative of the software development community.

anon
Thursday, March 25, 2004

For the last bloody time, pointers aren't hard.

People have trouble dealing with pointers in C/C++ only because the way pointers are used in those languages is warped, e.g. using pointer arithmetic to access arrays because it's supposed to be faster than iterating through indexes.
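The two styles being compared, as a minimal sketch (the function names are invented):

int sum_indexed(const int *a, int n) {       /* plain indexing           */
    int s = 0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

int sum_pointered(const int *a, int n) {     /* hand-rolled pointer walk */
    int s = 0;
    for (const int *p = a; p < a + n; p++)
        s += *p;
    return s;
}
/* a decent optimizing compiler will usually generate much the same code for both */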

A *real* compiler should optimize that kind of problem away, so you don't have to do it in your source code, but our C/C++ environments aren't *real* compilers.

So you end up doing the equivalent of machine-code optimization by hand, except that the syntax of C/C++ is so bad it is harder to read than assembly.

This is a debugging problem, not a "My God, this concept is too deep for me!" problem.

Which is why people have fewer problems with Java, even though it DOES have pointers - the object references.

PS: Please do not use stupid made-up non-words like "Grok", "Boxen" or "L33t"; it is childish, irritating and a barrier to communication.

Curious
Thursday, March 25, 2004

>> "I think the dual use of * for declaring pointers and dereferencing was a confusing choice."

How about the enlightened decision to use '&' as both the 'address of' operator, and to declare references?  C/C++ design was all about fat, smelly, pony-tailed prima donnas making the language as inaccessible as possible so they could protect their cool, unhygienic clique.


To Curious:
Brilliantly put!!! 

anon
Thursday, March 25, 2004

Please forgive my bad English. It's not my native language.


It is true that many people don't understand pointers.

I think I know why.


In order to explain pointers well, you have to draw boxes. Each box represents 1 byte.

Then, you have to write a number for every box. That number is the memory address.

Then, you have to teach people that the pointer is in fact a memory address.

You have to show people several programs using pointers, and for each program draw the graphical representation - using numbered boxes, memory addresses, etc.

Only after this is very well understood can you teach pointer arithmetic and other things.
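For example, a tiny C program that makes the numbered boxes concrete (the printed addresses will differ on every run):

#include <stdio.h>

int main(void) {
    int  x = 42;
    int *p = &x;                               /* p holds the address of x      */

    printf("value of x:   %d\n", x);
    printf("address of x: %p\n", (void *)&x);  /* the number written on x's box */
    printf("value of p:   %p\n", (void *)p);   /* the same number               */
    printf("value of *p:  %d\n", *p);          /* follow the arrow back to 42   */
    return 0;
}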


The problem is - if somebody doesn't "get" pointers easily, then that person has to exercise.

The person has to make 20 simple programs involving pointers, and draw boxes with numbers 20 times.

After this, I guarantee that the person will understand pointers.


But.. students and teachers are lazy.

They expect that after drawing boxes only 2 times, the students will understand pointers.

The problem is not only to understand pointers, but also to remember this later.

This is why, in order for someone to understand pointers, he or she must do as many simple pointer exercises as possible, write simple pointer programs, etc.

It's just a matter of exercising enough so that the thinking necessary to "get" pointers without a drawing becomes fully automatic.

I didn't need this kind of exercising, but I have taught CS courses, and I know this:

- if you just draw the boxes once and explain, maybe 20% of the people will understand & remember

- if you draw the boxes and explain a few times, and then make the students do lots of exercises, more than 90% of them will understand & remember


It's just a subject that's a little more difficult.

The students have to do simple exercises until they master them.

The teacher must make the students do those exercises.

Teacher
Thursday, March 25, 2004

I wonder if most of the confusion has more to do with C's syntax and less to do with understanding pointers.

Anonymous
Thursday, March 25, 2004

I think teacher may be right.

But anyhow, I firmly believe that the people who figure out pointers after seeing the picture are going to be better programmers than the people who take 20 exercises to figure out pointers.

Joel Spolsky
Fog Creek Software
Thursday, March 25, 2004

>Is the concept of a variable holding an address of where a value can be retrieved from rather than the value itself really so confusing?

I think it is not so much the concept.
Even if you have seen the pictures with boxes and arrows, and even understand pointer arithmetic, that's only half of the story.

-- you see, applying your understanding of pointers takes a lot of self-discipline. Most people don't have that, so it takes _some_ time to get from understanding pointers to _being good at pointers_.

Michael Moser
Thursday, March 25, 2004

Until I actually sat down and USED pointers in C and (professionally)  in Perl, the whole concept seemed like a neat trick, but not worth the effort.  My point being, I would imagine there are a lot of people who don't grok pointers, but I'd bet a few of those just haven't used them enough.

Andrew Burton
Thursday, March 25, 2004

"This achievement was not only aided by, but dependent on modeling our notions of 'reasoning' using a formal language."

Formal languages are fundamentally different from natural languages; it does not do to confuse the two.

Mr Jack
Thursday, March 25, 2004

Dear anon,
                  The theory of which you are so enamored is known as the Sapir-Whorf hypothesis.

                    Edward Sapir was one of the greatest linguists in American and world history, who specialized in native Indian languages. Whorf was an amateur linguist who worked full-time in an insurance company. He devised a theory, and as nobody would take him seriously if he named it after himself, he tagged Sapir's name on to it. Rather  like my developing some whacky theory about computation and calling it the Turing-Jones theory!
                      The popular description of the Sapir-Whorf hypothesis (it's the one given in Encarta, for example) refers to the many different words Inuit Eskimos have for snow, whereas English only has a couple. You sometimes hear of the many Arabic words for camel or sand as well, or the old chestnut about the English being nicer people than the Germans because we don't have a word for "Schadenfreude" (we actually have a one-syllable verb - "gloat" - which describes that glorious feeling perfectly). His hypothesis was that the use of certain lexemes or grammatical categories confined thought, and made it impossible to think outside their boundaries.

                        The theory is disproved daily by the billions of people who take new words into their language (often as straightforward loan words) and use them as the opportunity arises. The grammatical constraint theory simply doesn't take into account the fact that grammaticalization is simply a shortcut to describing concepts that can be described in other ways. "Si pudiera!" in Spanish is translated in English as "If only I could!" - the demise of the subjunctive having had no effect on the descriptive powers of English - and although we may have lost much since the time of the Romans, and our law lacks the clarity of Roman Law as Christopher Baus rightly says, I doubt if it has anything to do with the loss of the ablative.

              Few if any linguists believe, or ever have believed, in the strong version of the Sapir-Whorf hypothesis. There is a so-called "weak version", which says that the structures and lexemes of a language make it easier for some language speakers to understand certain concepts than others. The general consensus is that the effect exists, but many maintain that, like certain sub-atomic particles or forces, it is so rare and so weak it doesn't normally show up on the linguistic radar.

              McConnell did the world a small disservice when he referred to the theory in the second edition of "Code Complete". He is, however, probably right when he says that it does exist for computer languages.

Stephen Jones
Thursday, March 25, 2004

--" But anyhow, I firmly believe that the people who figure out pointers after seeing the picture are going to be better programmers than the people who take 20 exercises to figure out pointers. "-----

True Joel, but people who remember anything the first or second time instead of the twentieth are likely to be better at their field.

Stephen Jones
Thursday, March 25, 2004

Mr Jack writes:
>> "Formal languages are fundamentally different from natural languages; it does not do to confuse the two."

I wrote:
>> "Linguists often suggest that we are actually intellectually bound by the utility and expressiveness of our languages."

Mr Jack, I certainly did not confuse the two.  Exactly where do you think I specified?  Whether the language one uses is formal or natural has absolutely nothing to do with how our MINDS work with the concepts represented in the language.  COMPUTERS work better with formal languages, but this is utterly and totally irrelevant to how we use language to model concepts.

Please Mr Jack, cite the studies to which you refer.  Go on, then, to explain how number theory is just as easy to grasp using Roman numerals as Arabic.  Tell me how addition is just as easy using binary representations for numbers.  Explain how I could just as easily construct a treatise on 'Free Will and Determinism and the Implications for Moral Responsibility' using a primitive African language consisting of sequences of clicks and pops.

Please do.

anon
Thursday, March 25, 2004

Wow, this thread took an interesting turn. :-p

Brad Wilson (dotnetguy.techieswithcats.com)
Thursday, March 25, 2004

> refers to the many different words Inuit Eskimos have for snow, whereas English only has a couple.

Hmm... That person doesn't live in Tahoe... Words for snow:

Snow,
Powder,
Corn,
Sun Cups,
Crud,
Sierra Cement,
Granular,
Corduroy,
Hard Pack,
Moguls,
Champagne,
Bullet proof,
Mashed potatoes,
Baked,
Wind buff,

To name a few...

christopher baus (www.baus.net)
Thursday, March 25, 2004

I think no matter how you look at it, what name you give it, how you wrap it, whatever you do, it is the indirection that messes up people.  Yeah, pointers are easy, but there are a lot of people out there who will never get it.

It is the same with loops.  You can put all the sugar on it you want, but in the end if a student doesn't understand doing something a certain number of times based on input, they aren't going to get your construct.   

It is really frustrating and hard to believe if you are a programmer, but I am convinced it is true. 

Personally I find the -> operator pretty expressive.  There are a lot of ways to wrap up pointers in C++ nowadays and I think that's fine, but in the end you still need to understand the indirection.

christopher baus (www.baus.net)
Thursday, March 25, 2004

Stephen Jones:

Good post...  I want to respond now while the thread is hot, but I have to get some work done.  D'oh!  Check back tonight for my thoughts.

anon
Thursday, March 25, 2004

---"Explain how I could just as easily construct a treatise on 'Free Will and Determinism and the Implications for Moral Responsibility' using a primitive African language consisting of sequences of clicks and pops."----

Dear Anon,
                  you're showing your ignorance again. Firstly you're confusing human languages with computer languages, or numerical systems, exactly as Mr. Jack says you do. Secondly your racism is on a par with your ignorance. All research on languages carried out so far tends to the conclusion that they are equal in complexity. If you want to take "grammatical complexity" as your yardstick, you will find that English is much more "primitive" than the languages of other less technologically developed nations.

                    You will have difficulties writing about unfamiliar concepts in another language, but the difference is cultural, not linguistic. I can't see any difficulty in writing about the particular topic you mention in most world languages that does not have to do with assuming knowledge of a previous corpus of philosophical thought. The first difficulty you will have is that you don't know the language concerned. Should you bother to learn it, you might hit the second limitation of your own intellectual obtuseness.

Stephen Jones
Thursday, March 25, 2004

Dear anon,
                  My last post was written before your second conciliatory posting. Please presume all derogatory judgements are temporarily suspended.

                  Flattery will get you everywhere :)

Stephen Jones
Thursday, March 25, 2004

"Tell me how addition is just as easy using binary representations for numbers."

I can't comment on your other points, but this just seems wrong.  Addition using base 2 representation is no more difficult than base 10, except for the fact that you have been practicing base 10 arithmetic since you were a child.  In fact for addition, the exact same process applies.  Memorize the sum of any two single digits, carry over the extra to the next column, repeat.  In fact I bet if you were to compare, it would be much easier to teach children to add in binary since there are only 4 combinations of digits instead of 100.
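For example, the same digit-and-carry routine in base 2:

    0110   (6)
  + 0011   (3)
  ------
    1001   (9)

Rightmost column: 0+1=1. Next: 1+1=10, write 0, carry 1. Next: 1+0 plus the carry = 10, write 0, carry 1. Last: 0+0 plus the carry = 1.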

MikeMcNertney
Thursday, March 25, 2004

Gloating is not the same as schadenfreude. It is about perceived personal superiority, and does not require others' misfortune. Schadenfreude is the other way round.

Insert half smiley here.
Thursday, March 25, 2004

[PS: Please do not use stupid made-up non-words like "Grok", "Boxen" or "L33t"; it is childish, irritating and a barrier to communication.]

Ahem.  You uncultured swine!

http://dictionary.reference.com/search?q=grok

Richard P
Thursday, March 25, 2004

Yes. Try explaining C pointers to a Java programmer who has no idea how a CPU works and has never had to think about memory. It's impossible.

son of parnas
Thursday, March 25, 2004

I always found it difficult to grasp pointers when I tried to learn C on my own. They usually showed up in chapter 5 or 6 of the book, right after the "Hello World"-a-thousand-times program.

The problem was, I didn't see the use of pointers yet. I mean, it's not as if they're the only way of using arrays.

The subject got much clearer when I started studying computer science and had to implement binary trees and other graphs.

Now I'm spending my days in Python and Java land. Can't really say I miss pointer arithmetic...

Karin
Thursday, March 25, 2004

>> "you're showing your ignorance again. Firstly you're confusing human languages with computer languages, or numerical systems, exactly as Mr. Jack says you do."

This isn't my response to your earlier post yet.  Just popped into Joel for a second...

What's your fetish with formal vs. natural languages?  Unless your brain is running a compiler front end - and I'll guess that it's not - the formal/natural language distinction just DOESN'T ENTER INTO IT.  You're not creating a parse tree up there, are you?  Just think about when you use pseudocode to model an algorithm.  The particular flavor of pseudocode you use might be formal, or it might not.  Doesn't matter because your BRAIN is not using symbol tables and semantic analyzers - your MIND is using the language to model mental constructs.  Computers care about the formal/natural language distinction...  MINDS DON'T.

>> "your racism is on a par with your ignorance."

Yeah.  Wow, you're right.  The next time I want to give an example of a tiny language consisting of sequences of clicks and pops, I'll pick a European one.

Now, given this tiny EUROPEAN language USED EXCLUSIVELY BY WHITE PEOPLE consisting of sequences of clicks and pops - if you think you're going to write a short, meaningful piece on 'Free Will and Determinism and the Implications for Moral Responsibility', then you are sadly mistaken.  English, for instance, already has words and phrases like 'freedom', 'ethical', 'compatibilism', 'praiseworthy', 'responsible', 'resentment', 'second-order desire', 'pleasure', 'Platonic value', etc., each with very precise meanings developed over decades or centuries in the literature.  In ClickyPop, perhaps you have the words 'click-pop-pop-click', 'click-click-pop' and 'click-pop-pop-pop-pop', which mean 'good', 'nice', and 'aggressive' respectively.  I'm sorry, but you will spend thirty pages defining what it is exactly that you mean by 'pessimism' in the sense used when discussing free will.  Note that you're incorrect in your assertion about grammatical complexity (check out McWhorter or Kusters), but, at any rate, who cares?  The important metric is not grammatical complexity, but LEXICAL DIVERSITY - unless you enjoy reading the thirty pages about pessimism.  But, hey, maybe you're a glass-half-empty kinda guy.

anon
Thursday, March 25, 2004

> Unless your brain is running a compiler front end

Ha I love it!  That sounds like some people I know.  I wish more lawyers would run a compiler front end!

christopher baus (www.baus.net)
Thursday, March 25, 2004

I personally found pointers quite easy to learn when I learned C and assembly code after BASIC as a kid. You just imagine the little diagram of memory locations with the boxes and the arrows. The relationship between arrays and pointers in C was a little more tricky to get but still.

What I found hard, then, was the idea of a reference in C++. It seems to abstract away the very concrete notion of a pointer in a slightly confusing sort of a way - in languages like python which just work with references it seems very natural but in C++ it seems like a kinda weird and ugly second way of doing things, once you're used to pointers.

Matt
Thursday, March 25, 2004

Dear anon,
                  The fact that a language uses clicks and pops, or glottal stops, or is tonal is as irrelevant as whether you use magnetic, optical, electronic or punch-card media to keep your data.

                  Your presumption that for some reason African languages don't have abstractions is as laughable as it is sad. Look at most of those English words and you will find that they have been imported from Latin. Platonic values, whatever they are, came from Greek originally, and not the language of the hairy-legged, freckled Saxons who were the first speakers of English. If you want your African-language speakers to understand your cultural references you may have to run them a course on the Humanities, but the same will apply to plenty of native English speakers in the Appalachians or the Thames Estuary.

                  I haven't the least idea what you are babbling on about with regard to the formal/natural language distinction. What I said was that the limits of a computer "language" might put constraints on your use or understanding of certain constructs, but this does not apply in any meaningful sense to human languages.

Stephen Jones
Thursday, March 25, 2004

--"Gloating is not the same as schadenfreude. It is about perceived personal superiority, and does not require others' misfortune. Schadenfreude is the other way round. "----

The SOED gives this as the fourth meaning of gloat:
4. Feast the eyes or mind lustfully, avariciously, malignantly, triumphantly, etc. Usu. foll. by on, upon, over. M18.

It also gives this example:
P. H. GIBBS Our enemies are gloating over our ruin.

Merriam Webster gives this as the second meaning:
"To observe or think about something wth triumphant or often malicious satisfaction, gratification or delight. (gloat over an enemy's misfortune).

Stephen Jones
Thursday, March 25, 2004

mov eax, 1            ; variable: eax holds the value itself
mov ebx, buf          ; pointer: ebx holds the address of some data labelled buf
mov dword [ebx], eax  ; indirection: store the value where ebx points

learn assembly, people ;)

Alex.ro
Friday, March 26, 2004

I don't know either, but I do get the impression that Assembly is less convoluted than C.

Please feel free to disillusion me, and gloat while doing so if you want.

Stephen Jones
Friday, March 26, 2004

[Ahem.  You uncultured swine!
http://dictionary.reference.com/search?q=grok
Richard P]

What on Earth does this posting mean?
You meaningless swine!

Curious
Friday, March 26, 2004

So do both gloat and schadenfreude mean

"Smash your enemies.
See them run before you.
Hear the lamentations of their women."

?

Curious
Friday, March 26, 2004

Hmmm...
888

anon
Friday, March 26, 2004

--"So do both gloat and schadenfraude mean

"Smash your enemies.
See them run before you.
Hear the lamentations of their women."----

No, only the second and third lines. The first line can be outsourced.

Stephen Jones
Friday, March 26, 2004

I think there are two problems with how pointers in C are taught, based on my reading of a dozen textbooks and online tutorials that all seem to take the same approach to the pointer issue.

--There are a hundred pointer-arithmetic tricks that you never, ever have to use. E.g., textbooks and  tutorials always point out that list[3] == 3[list], which is valid but such bad form that it's painful.  Similarly, you can code for your entire life without using the (list + 3) form, or the terms `pointer arithmetic' or `heap'. Extraneous stuff like this is confusing, and needs to be saved for the `advanced notes' section or omitted entirely.

--Pointers are used to implement two entirely different things: arrays and call-by-reference. These should be taught as entirely separate concepts, since one has approximately zero bearing on the other. This also means that the focus of the lesson is on the application instead of the idea of a memory heap and its structure.
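For what it's worth, a sketch of the two uses kept separate (the function names are invented):

/* 1. call-by-reference: let a function change the caller's variable */
void increment(int *n) {
    (*n)++;                          /* follow the pointer, bump the value */
}

/* 2. arrays: ordinary indexing, no pointer-arithmetic tricks needed */
int sum(const int values[], int count) {
    int total = 0;
    for (int i = 0; i < count; i++)
        total += values[i];
    return total;
}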

The standard methods of teaching (from my experience) are based more on the theoretically pleasant aspects of pointers than on the practical aspects of how they're actually used, which distinguishes them from the rest of how programming is taught. Who'd understand function calling if you taught it all in terms of a stack of frames?

Ben
Saturday, March 27, 2004

Ben: I totally agree with you. Very interesting post.

Karin
Monday, March 29, 2004
