
Spam will eat itself: spammer requesting spam comments to be deleted

The site you are looking at is rather old. The first posting is from 2002, and that is only because I once deleted the database; the first posts were from around 2000.

So I have seen lots of spam attacks on my site, up to the point that I removed the possibility to add comments. I know about captchas and Bayesian fingerprinting services like Mollom, but this is my site with my rules. You want to express your opinions? Hike to Facebook, Twitter, or start your own blog. My site, my rules.

Back in 2006, though, I had comments enabled, and there was a posting with some rather lame comments ("yes, I agree, Drupal is great"). I never gave it much thought. However, there was also a link in the user's name. ... Indeed, spam links. :-(

Now read this mail I got the other day:

Funny, right? They have been spamming the internet for ages, and now they have found out that the postings they paid to have placed on sites like mine are counterproductive for their Google ranking, so now they want the posts deleted!

Here is my answer; let's see what happens :-)

(PS: I deleted the spam comments all the same :-)

Pong access.log with logstalgia

Let your webserver's access log be the source of a game of pong :-)

If you are a Homebrew user on OS X, it is a one-liner to install :-)
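
For the record, here it is, plus a way to feed it a live log (the log path is just an example; adjust it for your server):

brew install logstalgia
# "-" makes logstalgia read the piped log from stdin
tail -f /var/log/apache2/access.log | logstalgia -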

see logstalgia

Fresh likes... my famous 15 minutes

Me hitting it big time on Facebook :-)

There must have been a glitch in the matrix. Yesterday I did a tweet that got auto-sent to Facebook. Somehow, within a couple of hours, I got 12,000 likes and 130,000 comments on it. The picture above is one I took when the likes storm had just begun. The post itself is now a WSOD (white screen of death) for me, but maybe it will work in due time: http://www.facebook.com/bertboerland/posts/2231777543. Weird to see.

But then, think about a CMS that can handle 130k comments on one single post. And then consider that there are zillions of posts per second. WordPress anyone? ;-)

Willy moved

The End of Willy

I finally moved this website from my 15-year-old Pentium 166, with a load over 20, to some better iron: some decent three-year-old hardware. The new machine had a newer PHP version, and my /ancient/ Drupal install was fine with that. However, I was unable to log in. It took me some time to figure out whether this was caching or cookie related, but I found it: adding
register_shutdown_function('session_write_close');
to sessions.inc did the trick (it forces PHP to write the session data away before shutdown).

So this website should be more stable now, and faster. I will get a new ADSL line in a couple of weeks, upgrading to 20Mb, so it should be even faster in some time. There will be some downtime, however: I will get a new IP address as well and will need to change my DNS zones once I know what that address will be.

Thank you, Bas, for giving me "havenmeester" (aka tug) five years ago. It was dated hardware even then, but it served me well. It is just that the CPU fan makes an enormous amount of noise now.

shutdown -h now.

We salute you, Tug, the oldest webserver on the net :-)

How many RSS readers do you have?

The other day, based on some Twitter messages I was exchanging with some friends, I wondered how many RSS readers I have for my frontpage. I didn't feel like investigating /all/ possible RSS feeds, since every taxonomy item in Drupal (the CMS under this site) is an RSS feed and only very few people are subscribed to those; most follow the frontpage.

The problem with counting RSS readers is that big aggregator sites fetch the feed on behalf of multiple people, so just counting requests does not do the job. Fortunately, the bigger sites mention how many people are reading the feed, like this in the User-Agent header:

(+http://www.google.com/feedfetcher.html; 12 subscribers;

The other problem is that I have three RSS feed URLs for one and the same feed:

/myblog/atom/feed
/myblog/node/feed
/myblog/rss.xml

So I ran the following commands on an access log covering half a day:

[root@tug httpd]# grep "/rss.xml" access_log | awk '{print $1, $11, $12, $13, $14, $15, $16}' | sort | uniq -c | sort -rn | more
[root@tug httpd]# grep "/myblog/node/feed" access_log | awk '{print $1, $11, $12, $13, $14, $15, $16}' | sort | uniq -c | sort -rn | more
[root@tug httpd]# grep "/myblog/atom/feed" access_log | awk '{print $1, $11, $12, $13, $14, $15, $16}' | sort | uniq -c | sort -rn | more

(Note: most RSS readers grab the feed multiple times per day. The one-liners above collapse those duplicate hits, so having only half a day of sampling data is not that bad.)
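
By the way, the three commands can be collapsed into one by matching all three feed URLs at once (a sketch; it assumes the same access_log layout as above):

# one pass over all three feed URLs, same field layout as the one-liners above
grep -E "/myblog/(atom/feed|node/feed|rss\.xml)" access_log | awk '{print $1, $11, $12, $13, $14, $15, $16}' | sort | uniq -c | sort -rn | more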

Then count the "on behalf of xxx subscribers" entries by hand (or script it; see the sketch below) and...

I couldn't believe it: in half a day, some 554 subscribers! Wowsers!
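
For next time, the hand counting can be scripted too. A rough sketch, assuming Apache combined log format, where the request is the second double-quote-separated field and the User-Agent the sixth:

# keep the user agents of feed requests, drop duplicates,
# then pull out the advertised "N subscribers" counts and sum them
awk -F'"' '$2 ~ /\/myblog\/(atom\/feed|node\/feed|rss\.xml)/ {print $6}' access_log \
  | sort -u \
  | grep -oE '[0-9]+ subscribers' \
  | awk '{sum += $1} END {print sum " subscribers via aggregators"}'

(Single-user readers do not advertise a count, so those still have to come from the uniq -c output above.)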
