I love all the stuff one can do with the Microsoft Kinect. Take, for example, this scan of my bay window
(it looks like a snow globe due to the white parts outside)
And without color:
The Kinect can be used for much more than just scanning rooms. Scanning one's girlfriend, for example :-)
Commodity sinks, innovation rises. An old rule. That is why the drop has to keep on moving: shorter release cycles and faster adoption of new technologies, to make sure we don't become what we replaced; old, outdated systems that are slow to adapt and quick to go extinct.
There are two sides to this: we have to embrace new technologies faster and dump older technologies sooner. And we have to be sure that adopting new technologies faster doesn't create a legacy we then have to shed just as fast. Or at least, that is my opinion.
"Nobody" in the world logs in with openID anymore. Many appliaction to application in backend still might be using it, but nobody uses it to authenticate anymore. The last bastion Janrain just announced that it will close the doors. Drupal will drop OpenID form core as well and might have done this as well a long time ago. So when Drupal 8 will see the light in janury 2014, and we still would have release cycles over a 1000 days, the maintainers will still be dealing with an OpenID implementation and supporting it in Drupal core 7 up to the time D9 ships, somewhere around Q4 2017. Extrapolating our current release cycles most likely later, much later.
Want to know another example of an innovation that was once great and is now holding us back? Gzipping pages. It has been in core ever since 4.5 and was a great feature (though I had some problems with it back then :-)). But it is wrong now: it is duplicate functionality that has to be maintained and is better served at another OSI layer.
Back when webservers didn't compress pages and elements by default, it made perfect sense to do so from Drupal: a great way to save bandwidth and deliver pages faster to the user. But now that all webservers compress pages by default (and other elements, like a big Word document served as an attachment from /files/!), it is code that has to go. The innovation was great, but it sank lower in the stack and became a commodity in all major webservers. And that is the risk with all innovations: if one keeps holding on to innovations that have already become commodity, one ends up down there as well.
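To make that concrete, here is a minimal sketch of what the commodity version looks like in nginx (the exact types and thresholds are my choices for illustration, not something from Drupal):

    # nginx: compression handled by the webserver, not the CMS
    gzip on;
    gzip_types text/css application/javascript application/json image/svg+xml;
    gzip_min_length 1024;   # skip tiny responses where the gzip overhead outweighs the gain
    gzip_vary on;           # send "Vary: Accept-Encoding" so caches store both variants

With a few lines like these, the functionality Drupal implements in PHP is done by the webserver, for every file type and for every application on the box.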
This holds true for many elements of frontend performance. Right now it seems like a good thing to combine multiple CSS or JS files into one file. But once SPDY becomes mainstream, this is better done at the HTTP protocol level, not in the CMS.
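And at the protocol level it is barely any work: in an nginx built with SPDY support, enabling it is one extra word on the listen directive (a sketch; the certificate paths are placeholders):

    # requires an nginx built with --with-http_spdy_module
    server {
        listen 443 ssl spdy;    # multiplexed requests make file combining far less important
        ssl_certificate     /etc/ssl/example.crt;
        ssl_certificate_key /etc/ssl/example.key;
    }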
And traditional frontend performance wisdom states that we have to build sprites into the template. While if we add one module and one line of configuration, this is all done at the webserver level with image sprites, as sketched below.
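With mod_pagespeed on Apache, that one module and one line look roughly like this (the module path is an assumption for illustration; sprite_images is a real mod_pagespeed filter):

    LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so
    ModPagespeedEnableFilters sprite_images   # combine CSS background images into sprites on the fly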
And we should use selective data URIs in our template. Most frontend devs will puke; binary data in a template? As if we were some ugly old technology.
Take a look at this impressive list of options that mod_pagespeed -a webserver module- can help you with:
Now for some of these actions there might be a Drupal module (lazy loading), for some one has to write good CSS/HTML/JS (CSS above scripts), some need good content editors or backend processes (de-duplicating inline images, progressive JPEGs) and some are just not yet doable in an easy way in the frontend (data URIs). A sketch of enabling a few of these filters follows below.
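For example, a handful of the filters mentioned above can be switched on like this in Apache config (all filter names are real mod_pagespeed filters; the selection is mine):

    ModPagespeed on
    ModPagespeedEnableFilters lazyload_images              # defer offscreen images
    ModPagespeedEnableFilters inline_images                # turn small images into data URIs
    ModPagespeedEnableFilters convert_jpeg_to_progressive  # progressive JPEGs without editor discipline
    ModPagespeedEnableFilters combine_css,combine_javascript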
So as a frontend dev(ops), do yourself a favour and try the PageSpeed module for Apache or nginx AND keep writing good templates. And as a Drupal community member, make sure that we keep innovating at the top, and set code free at the bottom where it is better served outside of our hands.
I love the idea of using the analog loophole to bypass copy protection / DRM. For example, filming the movie in the cinema (usually very bad quality) or...
Too bad I don't have a Kindle. Nor really the need to copy a work like this. But still, once the worldwide cyber police controls us all, we can still use the analog loophole to get back at them :-)
While "performance" is a very complex topic on its' own, let us in this posting define it as the speed of the website and the process to optimize the speed of the website (or better broader, the experience of the speed by the user as performance.
Given that most sites run on the same hardware for years, this results in slower websites, leading to a lower PageRank, less traffic, fewer pages per visit and lower conversion rates. And in the end, if you have a business case for your website, lower profits. Bottom line: if you make money online, you are losing money due to a slow website.
A page can be just the HTML file (which can be served in 50ms).
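You can measure that bare-HTML time for your own site with curl's built-in timer (a sketch; the URL is a placeholder):

    curl -o /dev/null -s -w 'total: %{time_total}s\n' https://example.com/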
SPDY to the rescue?
The downside of SPDY, however, is that it is hard to troubleshoot and not yet available in all browsers. It is hard to troubleshoot since most implementations use SSL, and the protocol is multiplexed and compressed by default, not made to be read by humans the way HTTP/1.x is. There are some tools that make it possible to test SPDY, but most if not all of the tools you use every day, like ab, curl and wget, will fail to speak SPDY and fall back, as defined in the protocol, to HTTP/1.1.
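One thing you can check from the command line is whether a server advertises SPDY at all, by looking at the NPN negotiation with OpenSSL 1.0.1 or later (a sketch; the hostname is a placeholder):

    openssl s_client -connect example.com:443 -nextprotoneg '' </dev/null
    # look for a line like: Protocols advertised by server: spdy/3, http/1.1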
So more users, fewer errors under load and a lower page load time. What is there not to like about SPDY?
Pinterest itself is doing some good work when it comes to performance as well. Not just speed but also the perception of speed.
Now say this fast after me: SPDY FTW. :-)