Why don't browsers just make URLs clickable?
Even the right-click feature in Opera and the double-click/middle-click features in Mozilla seem dumb to me. It also seems silly that everything from Bugzilla to this forum would need to write the same code to find URLs in text and turn them into links.
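To illustrate the point about duplicated effort: here is a minimal Python sketch of the kind of URL-linkifying code every board and bug tracker ends up rewriting. The regex is deliberately simplified and is my own assumption, not any particular site's implementation:

```python
import re

# Deliberately conservative pattern: grab http(s) URLs, stopping at
# whitespace, angle brackets, and quotes so we don't swallow markup.
URL_RE = re.compile(r'(https?://[^\s<>"]+)')

def linkify(text):
    """Wrap bare http(s) URLs in the text in <a> tags."""
    return URL_RE.sub(r'<a href="\1">\1</a>', text)

print(linkify("See http://www.joelonsoftware.com for details."))
```

Real implementations also have to worry about trailing punctuation, escaping, and malicious input, which is exactly why it is tempting to want the browser to do it once, centrally.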
Firebird can be configured to do so.
How do you configure Firebird to make all URLs clickable? I just searched the options and couldn't find it...
With Opera you can highlight the url, right-click and "Go to URL" is on the menu.
Looks like clickable links are back! :)
So now I want to know, whose theory was right. Did Joel do it on purpose, was it a mistake, or is he playing mind games with us to see how easily he can get our panties in a bunch?
Did they always used to go to a redirect.asp page? Or were they straight <a> tags?
The explanation is here: http://www.joelonsoftware.com/items/2003/10/31.html
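For anyone who hasn't read the explanation: the difference is in how the link is written. These two snippets are illustrative only; the redirect path is a made-up stand-in, not the actual script on this site:

```html
<!-- Direct link: crawlers follow it and PageRank credit flows to the target -->
<a href="http://example.com/">example</a>

<!-- Link routed through a redirect script: the crawler sees only the
     redirect URL, which the site can block in robots.txt -->
<a href="/redirect.asp?url=http%3A%2F%2Fexample.com%2F">example</a>
```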
Actually the conspiracy theorists were right on the money...
I'm not sure I understand Google's PageRank completely, but isn't this baby/bathwater? Because maybe 1-2 bots a week post URLs here, NOBODY whose page is posted here gets the benefit of it.
Not really, Philo. The big important thing to note is when the person writing the blog wants to link to something, not when the people who are commenting want to link.
Ah, gotcha. Thanks!
Here's the way I see it: A couple of blogs, each of which got popular through people posting links to their blog on other boards (with their Google influence rising each time), got wise. They were in a pretty good ranking position with Google, but each new blog linked in the comments on their site encouraged PageRank inflation, devaluing the benefit of their own site.
Thank you for your conspiracy theory.
Fair enough, but how often does that really happen here? I can't help but agree with Philo's comment that it's throwing out the baby with the bathwater: Maybe it does happen, but the ratio seems to be such that it's mostly a non-factor.
On other blogs it was happening enough to make the comments feature unusable, forcing those bloggers to turn off their comments sections completely. Most of the blogging tool vendors are scrambling for ways to prevent comment spam. The faster we do this, the more we can nip this whole form of spam in the bud before it becomes popular. If blogs close ranks and shut off this loophole completely, this form of spam may be stillborn. If we don't, I assure you that it will only be a matter of months before all popular discussion boards are unusable thanks to this kind of spam... and then it will be too late, because if you've got a script that posts URLs to 1000 blogs, you're not going to edit it just because one of those blogs happens to have disabled URL linking. Spammers don't care about collateral damage.
I'm guessing that spammers also don't care about wasted effort. If bots were expensive, spammers might have to pick and choose their resource usage. Since they don't, the bots will continue posting URLs despite your solution.
Is there a Blogger or MT plugin under development to prevent thread-jacking?
Thread-jacking? Dunno. But there are lots of techniques to reduce comment spamming. I hacked my own 'password' technique; the problem is that everybody needs to use a slightly different technique, so a human can use it but robots can't be programmed for it (not easily, at least).
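A hedged sketch of what a homegrown 'password' check like this might look like; the specific question and answer here are made-up examples, not the poster's actual implementation:

```python
# Site-specific challenge shown next to the comment form. The whole point
# is that each site picks a different one, so a generic bot can't be
# pre-programmed with the answer.
CHALLENGE = "What color is an orange?"
EXPECTED = "orange"

def is_probably_human(answer):
    """Accept the comment only if the site-specific question was answered."""
    return answer.strip().lower() == EXPECTED

print(is_probably_human("Orange"))  # a human's answer passes
print(is_probably_human(""))        # a blank bot submission fails
```

The weakness, as the post notes, is that once a technique becomes popular enough, spammers can teach their scripts the common variants, so the value is in everyone's check being slightly different.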
Seems to me that harming the result is closing the barn door after the horse is gone. Why not prevent bots from posting in the first place?
Thanks for the work.
The problem with 'image validation' techniques is that they assume you can see; they lock out blind users entirely.
Redirected URLs exist at such a fundamental level within HTTP that I would be surprised that Google doesn't understand them, and so I expect that PageRank flows straight through them.
No one caught on that I was asking about thread-jacking because this thread was jacked. Oh well. I guess no one cares about having browsers make URLs clickable.
If you look at Word, which does this unless you turn the default off, you will realize why people hate it.
True. However, the redirect server can be disallowed in the robots.txt file, which means that links passing through it won't add to the PageRank of their targets.
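For example, a robots.txt entry along these lines would keep crawlers away from the redirect script; the `/redirect.asp` path is a stand-in for whatever path the site actually uses:

```
# robots.txt on the site hosting the redirect script.
# Disallow is a prefix match, so this also covers
# /redirect.asp?url=... query strings.
User-agent: *
Disallow: /redirect.asp
```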