CSS Level 3 Media Queries

In my humble opinion, the upcoming CSS Level 3 Media Queries are one of the major improvements in web development.

Yesterday I implemented them on my web site so that it adapts to various browser window widths.
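
For illustration, a rule of the kind I use looks roughly like this; the breakpoint and selectors here are made up for the example, they are not my actual stylesheet:

    /* Below 640px, drop the sidebar and let the content use the full width.
       Breakpoint and selectors are only an example. */
    @media screen and (max-width: 640px) {
        #sidebar { display: none; }
        #content { width: auto; float: none; }
    }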

See them in action on YouTube!

Try it for yourself and tell me what you think :) (CSS 3 media queries are currently supported in Firefox 3.5+, Chrome, Opera, Safari and IE 9+.)

Measures against Slowloris attacks

In a Slowloris attack, a client (or a botnet) opens a large number of connections to a web server and holds them open. It never sends complete requests, so you might not find a single request from the attacker in the Apache log. Quite devious. The malicious client keeps opening new connections with incomplete requests while Apache waits for those requests to complete so it can serve them. Meanwhile, regular clients cannot open new connections and do not get served by the host: the site becomes unresponsive.
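
To make the mechanism concrete, here is a minimal Python sketch of what such a client does. Host, port and connection count are made up, and this is of course only meant for testing against your own server:

    import socket
    import time

    HOST, PORT, CONNECTIONS = "localhost", 80, 200   # test values, adjust as needed

    sockets = []
    for _ in range(CONNECTIONS):
        s = socket.create_connection((HOST, PORT), timeout=5)
        # Send an incomplete request: the headers are never terminated by a
        # blank line, so the server keeps waiting for the rest.
        s.send(b"GET / HTTP/1.1\r\nHost: " + HOST.encode() + b"\r\n")
        sockets.append(s)

    while True:
        time.sleep(10)
        for s in sockets:
            # Trickle one more bogus header now and then to keep each
            # connection from timing out.
            s.send(b"X-a: b\r\n")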

I just wanted to share my experience with anti-Slowloris measures on small-scale Apache web servers.

Apache 2.2.15 ships with mod_reqtimeout. The module’s default settings work out of the box: during my own local Slowloris attack, HTTP latency fluctuated quite a lot, but Apache remained responsive. If you are using Apache 2.2.15, go for mod_reqtimeout and you are done.
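
For reference, mod_reqtimeout is controlled by the RequestReadTimeout directive. A sketch of a configuration in the spirit of its defaults looks like this; the exact numbers are illustrative, so check the documentation of your Apache version for the real default values:

    <IfModule mod_reqtimeout.c>
        # Allow 20 seconds for the request headers, extended up to 40 seconds
        # as long as the client keeps sending at least 500 bytes per second,
        # and 20 seconds plus the same minimum rate for the request body.
        # Values are examples, not necessarily the shipped defaults.
        RequestReadTimeout header=20-40,MinRate=500 body=20,MinRate=500
    </IfModule>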

Debian 5.0, however, only provides Apache 2.2.9, and there is no mod_reqtimeout for that version.

My first choice for Apache 2.2.9 was mod_qos, which compiled smoothly. When an attack is launched, HTTP latency rises sharply for about five seconds; after that, it normalizes quickly. Quite an impressive result. On the other hand, while site users reported no obvious problems, the module spammed Apache’s error log with backtraces, which forced me to look for another solution.
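
For the record, mod_qos fights Slowloris-style clients with directives along these lines. This is a sketch based on my reading of the mod_qos documentation, not my exact configuration, so verify directive names and values against the version you compile:

    <IfModule mod_qos.c>
        # Drop connections whose transfer rate stays below a minimum
        # (the typical slowloris pattern); values are examples only.
        QS_SrvMinDataRate 120 1200
        # Limit the number of parallel connections per client IP.
        QS_SrvMaxConnPerIP 50
    </IfModule>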

The next option was mod_antiloris, a very small module with just over 5 kB of source. It compiled smoothly and worked out of the box, although there are no configuration options. During an attack, HTTP latency rises quickly and remains high; the site becomes somewhat less responsive, but Apache keeps answering requests. Not as impressive as mod_qos, but at least it does not clog the error log with backtraces, so I am sticking with mod_antiloris for the time being.

Keep in mind that I have only tested attacks from a single client. A botnet executing a Slowloris attack is a completely different story.

And BTW: YMMV.

If you have a different approach for plain Debian systems, feel free to comment.

Web fonts slowly picking up pace

Web fonts have been adopted by all major web browsers. What Firefox 3.1+ users running NoScript might not know is that NoScript blocks @font-face by default, so they never get to see the nice fonts (an example of the kind of CSS rule being blocked follows the list below). There are two easy ways to enable web fonts, though:

  1. Left-click the NoScript icon in the bottom right corner of the browser window and enable all “font@” entries under “Blocked Objects”. This temporarily allows web fonts on the current page.
  2. To enable web fonts permanently, left-click the NoScript icon in the bottom right corner of the browser window and select “Options”. Go to the “Embeddings” tab, uncheck “Forbid @font-face” and click “OK”.
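
As announced above, this is what NoScript’s “Forbid @font-face” option actually blocks: a downloadable font declared in CSS. The file path below is made up, and Droid Sans is just an example font name:

    /* Declare a downloadable font and use it for the whole page.
       Path and font name are only an example. */
    @font-face {
        font-family: "Droid Sans";
        src: url("/fonts/DroidSans.ttf");
    }
    body {
        font-family: "Droid Sans", sans-serif;
    }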
 

Do we actually need web fonts?

Yes and no.

Yes: Typography enthusiasts have been waiting for years. Browsers have now widely adopted web fonts, and every now and then even a font that is free to use comes along. The time for web fonts is now. And why should print designers get to pick their fonts while web designers do not? If man can travel into space, the web can have web fonts.

No: Have I personally been waiting for web fonts? Certainly not. The web got along very well without them for 16 years, and I like the web the way it is. Bad readability was mostly caused by bad color choices, not by the fonts. To me, web fonts are primarily just another way of hurting someone’s eyes.

Beware though: most fonts come with strict licenses that do not allow web distribution, so web designers have to resort to fonts with liberal licenses, Bitstream Vera and Droid Sans being two of them. Check font licenses carefully before jumping on the web fonts train!

Browser detection on the verge of IE6’s death

Internet Explorer 6 is on the verge of death. Google will phase out support for this old fellow very soon. FINALLY, I might add. While Google will invite IE6 users to install Chrome, Firefox or some other modern browser, other websites redirect old browsers to their mobile sites. So it is important to minimize false positives in browser detection in order not to confuse or aggravate users.

Just out of curiosity, I examined about 1 million Internet Explorer HTTP requests. The shocking result: about 5% of the user agent strings were far from unambiguous. Want to see examples? Look at this:

Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; Mozilla/4.0 (compatible; MSIE 7.0; Win32; 1&1); .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.04506.30; .NET CLR 3.0.04506.648; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; Mozilla/4.0 (compatible; MSIE 7.0; Win32; 1&1); Mozilla/4.0 (compatible; MSIE 7.0; Win32; 1&1))

This is a mess, to say the least. It is an IE 8 all right, but a program that simply looks for “MSIE 7” would match this user agent too.

IE 6 is affected as well. I have encountered user agent strings with up to three different “MSIE \d” matches, this one for example:

Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; Mozilla/4.0 (compatible; MSIE 8.0; Win32; WEB.DE); SIMBAR={omitted}; Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1) ; .NET CLR 2.0.50727; .NET CLR 3.0.04506.30; .NET CLR 1.1.4322; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729)

This string matches MSIE 8, MSIE 7, and MSIE 6. You cannot even assume that the first MSIE match indicates the browser’s real version.

Conclusion

I do not know what causes all of this; poorly written browser extensions are my best guess. In any case, you cannot reliably detect a browser by matching a single substring anymore. Developers, watch out. Things just got complicated again, thanks to Microsoft :)

One way to detect Internet Explorer’s real version could be to run a regexp match for “MSIE (\d)” against the user agent string and take the highest version number found as the indicator.
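
A minimal sketch of that heuristic in Python; the function name is mine, and the user agent string is a shortened version of the one quoted above:

    import re

    def msie_version(user_agent):
        # Collect every "MSIE <n>" match (\d+ instead of \d so a future
        # two-digit version would still be caught) and take the highest.
        versions = [int(v) for v in re.findall(r"MSIE (\d+)", user_agent)]
        return max(versions) if versions else None

    ua = ("Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; Trident/4.0; "
          "Mozilla/4.0 (compatible; MSIE 8.0; Win32; WEB.DE); "
          "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1))")
    print(msie_version(ua))  # prints 8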

Germany and Amazon AWS (updated)

Anyone in Germany who is interested in using Amazon Web Services (AWS), for example Elastic Computing (EC2), Cloudfront or S3, should know one thing:

There are no invoices from “Amazon Web Services LLC” that qualify for input VAT deduction. Boom! That one hurts…

According to my tax advisor, that would require a reference to § 13b UStG or to the “reverse charge” procedure on the invoice. I do not have much hope that Amazon will make its invoices eligible for input VAT deduction in the foreseeable future: if they wanted to, or saw a need for it, they would have made this fairly simple change long ago.

Update: At least you can request that Amazon stops charging VAT: http://aws.amazon.com/tax-help/

Update 2: A miracle has happened: Amazon now directly offers to register your VAT ID (USt-IdNr.), and once that is done, no VAT is charged anymore. The link is labelled “VAT registration” and can be found below the table in the “Activity Report”. This must be fairly new; I saved an Activity Report on February 2, 2010, and the link was not there yet.

[Screenshot: the “VAT registration” link in the AWS Activity Report]