Fog Creek Software
Discussion Board




Why don't browsers just make URLs clickable?

Even the right-click in Opera and the double click/middle click Mozilla features seem dumb to me. It also seems silly that everything from Bugzilla to this forum would need to write the same code to find URLs in text and make it a link.

At a minimum, anything that starts with http:// should be a link, period.
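The linkifying code every forum ends up rewriting could be sketched roughly like this (the regex and function name here are my own, deliberately simplified — real implementations also have to worry about trailing punctuation and HTML escaping):

```python
import re

# Deliberately simple: match anything starting with http:// (or https://)
# up to the next whitespace character.
URL_RE = re.compile(r'(https?://\S+)')

def linkify(text):
    """Wrap bare URLs in anchor tags so they become clickable."""
    return URL_RE.sub(r'<a href="\1">\1</a>', text)

print(linkify("See http://www.fogcreek.com for details"))
# → See <a href="http://www.fogcreek.com">http://www.fogcreek.com</a> for details
```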

dmooney
Friday, October 31, 2003

Firebird can be configured to do so.

I think the best answer is to turn it around.  Why should browsers just make URLs clickable?  Unless you are using a forum or similar browser-edited system, it *never* comes up and most forums let you make them clickable.

Flamebait Sr.
Friday, October 31, 2003

how do you configure firebird to make all http's clickable?  i just searched options and couldn't find it...

nathan
Friday, October 31, 2003

With Opera you can highlight the url, right-click and "Go to URL" is on the menu.

Troy King
Friday, October 31, 2003

http://www.fogcreek.com ...testing


Friday, October 31, 2003

Looks like clickable links are back! :)


Friday, October 31, 2003

So now I want to know, whose theory was right.  Did Joel do it on purpose, was it a mistake, or is he playing mind games with us to see how easily he can get our panties in a bunch?

shiggins
Friday, October 31, 2003

Did they always used to go to a redirect.asp page?  Or were they straight <a> tags?

Dignified
Friday, October 31, 2003

The explanation is here: http://www.joelonsoftware.com/items/2003/10/31.html
Conspiracy theorists in this forum certainly will not accept the explanation ;-)

uncronopio
Friday, October 31, 2003

Actually the conspiracy theorists were right on the money...

Anonymizer
Friday, October 31, 2003

I'm not sure I understand Google's PageRank completely, but isn't this baby/bathwater? Because maybe one or two bots a week post URLs here, NOBODY whose page is posted here gets the benefit of it.

I mean, if every blog takes this approach then PageRank becomes worthless, n'est-ce pas?

Philo

Philo
Friday, October 31, 2003

Not really, Philo.  The big important thing to note is when the person writing the blog wants to link to something, not when the people who are commenting want to link.

Flamebait Sr.
Friday, October 31, 2003

Ah, gotcha. Thanks!

Philo

Philo
Friday, October 31, 2003

Here's the way I see it: a couple of blogs, each of which got popular through people posting links to them on other boards (with their Google influence rising each time), got wise. They were in a pretty good ranking position with Google, but each new blog linked in their comments encouraged PageRank inflation, devaluing the benefit of their own site.

Now we have a situation where the posting of links is hypocritically proclaimed "blog spamming", and sites like Joel's actively attempt to ensure that no one gets PageRank points but himself. Seems fair, as long as the flip side (that any other anti-PageRank site shouldn't be directly linked anywhere else) is applied as well.

Anonymizer
Friday, October 31, 2003

Thank you for your conspiracy theory.

Actually, people wrote bots which went around to thousands of blogs and posted irrelevant URLs in irrelevant comments, solely for Google to find so as to promote the PageRank of their advertiser's sites.

Joel Spolsky
Friday, October 31, 2003

Fair enough, but how often does that really happen here? I can't help but agree with Philo's comment that it's throwing out the baby with the bathwater: maybe it does happen, but the ratio seems to be such that it's mostly a non-factor.

Anonymizer
Friday, October 31, 2003

On other blogs it was happening enough to make the comments feature unusable, forcing those bloggers to turn off their comments sections completely. Most of the blogging tool vendors are scrambling for ways to prevent comment spam.

The faster we do this, the more we can nip this whole form of spam in the bud before it becomes popular. If blogs close ranks and shut off this loophole completely, this form of spam may be stillborn. If we don't, I assure you that it will only be a matter of months before all popular discussion boards are unusable thanks to this kind of spam... and then it will be too late, because if you've got a script that posts URLs to 1000 blogs, you're not going to edit it just because 1 of those blogs happens to have disabled URL linking. Spammers don't care about collateral damage.

Joel Spolsky
Friday, October 31, 2003

I'm guessing that spammers also don't care about wasted effort. If bots were expensive, spammers might have to pick and choose their resource usage. Since they don't, the bots will continue posting URLs despite your solution.

They won't notice that their page rank doesn't go up, and it does them no harm to continue posting, so we'll still get the spam. Not that I have a solution  :|

Zahid
Saturday, November 1, 2003

Is there a Blogger or MT plugin under development to prevent thread-jacking?

dmooney
Saturday, November 1, 2003

Movable Type:
http://www.sixapart.com/log/2003/10/comment_spam.shtml

I don't know about blogger.

Joel Spolsky
Saturday, November 1, 2003

thread-jacking? dunno. but there are lots of techniques to reduce comment spamming. i hacked my own 'password' technique; the problem is that everybody needs to use a slightly different technique, so a human can use it but robots can't easily be programmed for it.

but what's weird is that my site got hit twice. once totally robotic--guess my comment URL, post comment to post #1.

the second time was human-assisted. someone browsed the site, then a different user-agent stuck the bogus comment in.

mb
Saturday, November 1, 2003

hmm.

I just wrote a whole post on how to counteract this kind of spam and realized... some spammers might read this board and make their spambots even smarter to counteract the spam.

I guess I'll leave the spam solving up to the experts.

www.MarkTAW.com
Saturday, November 1, 2003

Seems to me that harming the result is closing the barn door after the horse is gone. Why not prevent bots from posting in the first place?

I suspect that's the reason for the growth of the "type the number you see in the image" challenges in web-based forms. And it should be trivially easy to implement here on the comment form.

Then it's not "well, if a bot *does* post, it won't help them" - it's "bots can't post"

Philo

Philo
Saturday, November 1, 2003

Thanks for the work.

Would have been nice if you had warned us though, Joel!

I'm not too sure about your pessimism. I remember when ZDNet forums got invaded by pyramid spam, and on occasion the poster was posting as fast as two or three moderators were taking it off. If it were possible to write to forums just using a script then surely the forums would already be inundated with spam. Or am I missing something?

And wouldn't registration get rid of the bots (although you are right to think of it as a last resort).

Stephen Jones
Saturday, November 1, 2003

the problem with 'image validation' techniques is that you have to be able to see.
and oddly enough, if the result is valuable enough (e.g. people want a yahoo account to collect results from spam), they'll have a person go through the motions at the right page.
i'll spill the beans on my technique... it asks a question in english with a one-word answer. for a site targeted at literate people, that works just fine.
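a rough sketch of that kind of challenge (the question here is made up; the point is that every site picks its own, so a generic bot can't come pre-programmed with the answer):

```python
# A made-up example question; a real site would choose its own.
CHALLENGE = "what animal says 'meow'?"
ANSWER = "cat"

def passes_challenge(response):
    """Accept the comment only if the one-word answer matches."""
    return response.strip().lower() == ANSWER

print(passes_challenge("Cat"))      # → True
print(passes_challenge("v1agra"))   # → False
```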

mb
Saturday, November 1, 2003

Redirected URLs exist at such a fundamental level within HTTP that I would be surprised if Google didn't understand them, and so I expect that PageRank flows straight through them.

jhy
Sunday, November 2, 2003

No one caught on that I was asking about thread-jacking because this thread was jacked. Oh well. I guess no one cares about having browsers make URLs clickable.

dmooney
Sunday, November 2, 2003

If you look at Word, which does this unless you turn the default off, you will realize why people hate it.

A clickable URL is one you can't cut and paste - at least in Word. Not much fun!

Stephen Jones
Sunday, November 2, 2003

True.  However, the redirect server can be disallowed in the robots.txt file, which will then mean that any content there won't add to the PageRank.
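For instance, assuming the links go through the redirect.asp page mentioned earlier in the thread (the actual path may differ), the robots.txt entry would look something like:

```
# Keep crawlers (and thus PageRank) away from the redirect script
User-agent: *
Disallow: /redirect.asp
```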

Flamebait Sr.
Monday, November 3, 2003

Stephen Jones,

In IE you can right-click on any URL and choose Copy Shortcut, which sounds like good UI because most of the time people click on URLs, and only sometimes do people copy them to the clipboard.

dmooney
Monday, November 10, 2003
