Nicholas Skinner

Freelance website and web application developer

Archive for June, 2008

How to use Exim with Google’s Postini Spam Protection Service

Monday, June 23rd, 2008

I recently set up Google's Postini message filtering service on another one of my domain names. It is a very effective spam filtering service that sits between the Internet and your mail server (you change your MX records to point at Postini, it filters out the spam, then passes only the good messages on to your server).

However, I had a problem: a number of spam messages were still making it through. On closer inspection, the spam that was getting through was somehow finding my mail server directly and bypassing the Postini service (even though the mail server is not mentioned anywhere in the MX records). The simple, recommended solution is to firewall port 25 against everything but Postini's servers, but this is not entirely practical if authenticated users relay messages through the server, or if some domains on the server use Postini's service and others do not.
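For reference, the firewall approach would look roughly like the following iptables rules. This is only a sketch: the source range shown is an illustrative Postini range, so check Postini's documentation for their current published IP ranges before using anything like it.

```shell
# Sketch only: accept SMTP from a Postini source range, drop the rest.
# The 64.18.0.0/20 range is illustrative -- verify Postini's current ranges.
iptables -A INPUT -p tcp --dport 25 -s 64.18.0.0/20 -j ACCEPT
iptables -A INPUT -p tcp --dport 25 -j DROP
```

As noted above, this blocks authenticated users who relay through port 25, and it applies to every domain on the server, which is why I went for an Exim-level solution instead.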

I am using Exim, and therefore updated my configuration so that, for any domain using the Postini service (listed in a text file), mail is denied unless it comes from a *.postini.com mail server.

How to:

  1. Create an empty text file ready to be populated with the list of domains using the Postini service:

    touch /etc/exim/postini_filtered
    chmod 640 /etc/exim/postini_filtered
    chown root:mail /etc/exim/postini_filtered

  2. Add in domains to the list:

    echo domain_using_postini.com >> /etc/exim/postini_filtered
    echo domain2_using_postini.com >> /etc/exim/postini_filtered

  3. Edit /etc/exim/exim.conf, removing the section that accepts mail for all local domains:

    accept domains = +local_domains
           endpass
           verify = recipient

  4. Add statements that accept mail for the domains in the Postini list only when it arrives from Postini's servers, and that accept mail for local domains not in the list:

    accept domains = +local_domains
           domains = lsearch;/etc/exim/postini_filtered
           hosts = *.postini.com
           endpass
           verify = recipient

    accept domains = +local_domains
           domains = !lsearch;/etc/exim/postini_filtered
           endpass
           verify = recipient
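Before reloading Exim it is worth sanity-checking the lookup file. Exim's lsearch performs an exact whole-line match on the key, which you can approximate with grep. The following is just a sketch using a temporary copy of the list, not Exim itself:

```shell
# Simulate Exim's lsearch lookup (an exact whole-line key match)
# against a temporary copy of the domain list.
list=$(mktemp)
printf '%s\n' 'domain_using_postini.com' 'domain2_using_postini.com' > "$list"

grep -qx 'domain_using_postini.com' "$list" && echo "listed"          # exact match
grep -qx 'sub.domain_using_postini.com' "$list" || echo "not listed"  # subdomains are not
```

Note that plain lsearch only matches whole domain names, so subdomains need to be listed explicitly (Exim also supports partial matching via partial-lsearch if that is required).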

Managing 130+ RSS Feeds

Sunday, June 15th, 2008

In recent months I have been growing increasingly frustrated with my RSS reader and the 130 feeds it contains. I was starting to find it difficult to keep up and was considering deleting feeds – the solution, however, turned out to be as simple as some folder reorganisation.

After seeing a few sites publishing RSS feeds back around 2003, it took me some time to actually work out what they were and how to use them, but ever since then I have not looked back. I have found them to be incredible time savers, allowing me to keep up with the many news sources I want to read without spending hours every day doing so. It got to the stage where I was writing Perl scripts to parse sites that did not have feeds and create them automatically. Things have since moved on and this is no longer necessary, as most sites publish their own feeds, and for those that do not, there are web services such as Ponyfish which provide a simple web interface for the process.

Having used RSS for some time, I have gone through a number of different reader applications: from a custom-coded Perl/PHP script, to a desktop client, back to an updated custom-coded PHP script, and finally to Google Reader, which I am currently using. Google Reader has some issues every now and again, but even so it is very effective. Both the web-based applications and the desktop client I was using started to suffer from scalability issues as more feeds were added; Google Reader, however, certainly does not (at least not at the moment). Google seems to have concentrated on getting the basics stable before adding features, although the scrollbar changing size and moving takes some getting used to. You can also access Google Reader via a cut-down web interface on the move, and use it offline via Google Gears.

Lately, however, I have been finding it difficult to keep up, tending to just skip over all the new items briefly without reading any in depth, just to get through them – not all that useful. I tried using aideRSS, which ranks items by filtering out those that are not so popular, but its idea of popular did not match what I wanted to read. The problem turned out not to be the number of feeds, as many publish only a few new items every day or week, but rather high volume feeds such as Digg, TechCrunch, and Engadget, which monopolised space in the reader. I finally came up with a workable strategy involving a four-folder structure:

  1. "Priority Feeds" – Usually low volume feeds I am interesting in reading immediately when they are published.
  2. "Detailed Reads" – Feeds which generally take some time to get through where I would typically read all / most items in detail.
  3. "Other Feeds" – Mixed assortment of all other feeds that would typically not read in detail but still like to keep an eye on.
  4. "High Volume Feeds" – Feeds which publish a lot of content and I would generally skip over quickly only stopping if there was something really interesting.

With this strategy I can now spend time reading what I am really interested in but also not let high volume feeds with less important content take up too much of my time.

Coding W3C XHTML / CSS Valid Websites

Tuesday, June 10th, 2008

A step-by-step approach I have found to work well when developing static W3C XHTML / CSS compliant websites, or the initial templates for such sites:

  1. Copy a standard set of site template files. This saves repeatedly retyping the standard html / body open / close tags and CSS file includes, and sets up a basic folder structure for the CSS and other resources such as images.
  2. Working from the PSD, copy / paste the textual content into HTML mark-up (i.e. put it into "h1", "li", "p", "div" tags, etc.).
  3. Validate the HTML code at validator.w3.org. Validating at this stage usually saves time in the long run, since any display issues that crop up later must then be the result of CSS problems or browser incompatibilities, not errors in the HTML.
  4. Code the CSS, testing in Internet Explorer 7 / Firefox all the way through, and making any additional HTML changes as required.
  5. Revalidate the HTML code.
  6. Test in Safari, fixing any cross browser compatibility issues by changing the HTML/CSS and retesting in Internet Explorer 7, Firefox, Safari.
  7. Revalidate the HTML code (if it was changed since the last time).
  8. Validate the CSS at jigsaw.w3.org/css-validator
  9. Test in Internet Explorer 6. I use a copy of Windows XP running in the Parallels virtualisation software for this, as I have it installed for other applications. Microsoft also provides a free IE6 Virtual PC image for this purpose.
  10. If any Internet Explorer 6 incompatibility issues are found (generally there are), use conditional comments to load an additional stylesheet just for Internet Explorer 6 to correct them.
  11. If time permits, create a CSS print media stylesheet to prevent unnecessary visual elements such as navigation menus from appearing when the site is printed, and to remove any formatting that makes text look incorrect, fail to line up properly, or unnecessarily span multiple pages.
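To illustrate steps 10 and 11, the stylesheet includes in the document head typically end up looking something like the following. The filenames (css/main.css, css/ie6.css, css/print.css) are hypothetical, but the conditional comment syntax is the standard IE mechanism:

```html
<!-- Main stylesheet, then an IE6-only override loaded via a
     conditional comment (ignored by all other browsers),
     and a print stylesheet applied only when printing. -->
<link rel="stylesheet" type="text/css" href="css/main.css" />
<!--[if lte IE 6]>
<link rel="stylesheet" type="text/css" href="css/ie6.css" />
<![endif]-->
<link rel="stylesheet" type="text/css" media="print" href="css/print.css" />
```

The print stylesheet then just needs rules along the lines of "#nav { display: none; }" (the id is hypothetical) to hide navigation and other screen-only elements.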

When coding sites I make use of the Yahoo CSS Reset, which removes the default styles browsers apply to elements such as h1, h2, etc. It very much simplifies the process of coding cross-browser compatible sites. I also use the EditPlus text editor for all my HTML / CSS / JavaScript.
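The reset needs to load before your own stylesheets so that your rules build on a consistent baseline. Yahoo host a minified copy on their YUI CDN; roughly (the version number here is only an example):

```html
<!-- Load the YUI reset first, then site styles on top of it -->
<link rel="stylesheet" type="text/css"
      href="http://yui.yahooapis.com/2.5.2/build/reset/reset-min.css" />
<link rel="stylesheet" type="text/css" href="css/main.css" />
```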

Staying “connected” on holiday

Tuesday, June 3rd, 2008

I generally use holidays as an excuse to try out new ways of accessing the internet and staying "connected".

Initially this meant a laptop with a PCMCIA (or serial) modem for dial-up. It was not all that different from accessing the internet at home, considering the only real alternative at the time was ISDN, which was not particularly cheap. The costs were the regular per-minute call charges, depending on the time of day, and you needed a phone line.

In 1999 a service called screaming.net launched (from telephone company Localtel, which later became World Online). This was a pretty big step, as it gave people in the UK essentially un-metered internet access. However, to use it you needed to switch your line over to their service, so anywhere else you still paid the per-minute charges.

When going to destinations without a regular phone line, as happened in subsequent years, the only real choice was dial-up via a mobile phone. Needless to say, this was expensive, slow, and unreliable. I used a Nokia 7110 connected to a Toshiba laptop via infrared, and can remember first having to wander about the apartment we were staying in to find a location with a GSM signal, then having to hold the phone at one angle to actually keep the signal, while at the same time positioning the laptop so that its infrared port was aligned with the one on the phone. This was not very convenient; however, it was made more bearable by a script I wrote which allowed a list of sites to be specified and sent to a server, which would zip them up into a single file that could be downloaded via "wget -c". Thankfully, GSM coverage has improved quite a bit since then.

After the previous holidays to UK destinations, America turned out to be fairly straightforward: simply a case of finding a local ISP's dial-up number and achieving, from memory, a ~45Kbps connection.

By the time of the next holiday, PAYG GPRS access was readily available. I had a Bluetooth USB dongle and a Sony Ericsson T610, which was certainly a welcome break from slow mobile dial-up; however, at around £4 per MB it was not cheap.

After that, for 2 years in a row, I signed up for (and promptly cancelled a month later – there was a 1-month contract period) a service from JJTEK Online which offered un-metered dial-up access for around £20.00 per month on an 0808 freephone number. This worked very well in Center Parcs.

This year it looks like we may be heading to Portugal, and while Wi-Fi and in-room network connections are now a lot more available, you generally have to stay in a hotel, pay a premium, or visit a coffee shop, which has led me to look into the various mobile broadband options. If you are in the UK there are now a number of very affordable packages available from most providers, with reasonable usage limits available on PAYG. Only Vodafone, from what I have seen, offers reasonable pricing when abroad (i.e. not £4.00 – £8.00 per MB roaming); however, there is a catch: they do not offer this on the 30-day option, and the 12-month contract it requires ruled it out for me. Some time later I came up with the idea of checking the Vodafone Portugal website (via the Google translator) and found that GPRS access appears to cost 10 Euro cents per MB. It is therefore just a case of purchasing a Portuguese PAYG SIM on arrival, or from eBay.

I also decided to purchase a Novatel Wireless XU870 ExpressCard on eBay. It was T-Mobile branded, but after following the installation instructions I had it up and running in minutes, both on PAYG Orange GPRS (APN: orangeinternet) and PAYG T-Mobile UMTS at 240Kbps (APN: general.t-mobile.uk, Username: user, Password: wap). The XU870 is a fairly nice piece of hardware: well built, and it came with a neat flip-up antenna, a plastic carry case, and a socket for an external antenna if needed. Along with OS X, I also tested it in Windows via Parallels with the MobiLink Lite software (which lets you see the connection speed). With a T-Mobile SIM card it auto-configured and connected straight away; with the Orange SIM card it did not work out of the box, but likely would have if I had spent some time configuring it. There is also a handy little application that allows sending / receiving text messages from the desktop.