
Leap user interface with Flickr: the Minority Report user interface is here (well, sort of :-)

I recently gave a presentation about Natural User Interfaces and dived into my old (?) Leap Motion again.

It can do a Minority Report-style user interface within a web app:

3D scan of my room using the Kinect

I love all the stuff one can do with the Microsoft Kinect. Take, for example, this scan of my bay window:

(it looks like a snow globe due to the white parts outside)

And without color:

The Kinect can be used for much more than just scanning rooms. Scanning one's girlfriend, for example :-)

Zen and the art of innovation

Commodity sinks, innovation rises. An old rule. That is why the Drop has to keep on moving: have shorter release cycles and adopt new technologies faster, to make sure we don't become what we replaced; old, outdated systems that are slow to adapt and quick to go extinct.

There are two sides to this: we have to grab new technologies faster, and we have to dump older technologies sooner, so that adopting new technologies faster doesn't create a legacy of its own. Or at least, that is my opinion.

"Nobody" in the world logs in with openID anymore. Many appliaction to application in backend still might be using it, but nobody uses it to authenticate anymore. The last bastion Janrain just announced that it will close the doors. Drupal will drop OpenID form core as well and might have done this as well a long time ago. So when Drupal 8 will see the light in janury 2014, and we still would have release cycles over a 1000 days, the maintainers will still be dealing with an OpenID implementation and supporting it in Drupal core 7 up to the time D9 ships, somewhere around Q4 2017. Extrapolating our current release cycles most likely later, much later.


This is not to criticise anyone; I think it was a wise step to include OpenID back in the Drupal 6 days, and it is a wise step to remove it from D8. But the time between these releases, and hence between these decisions, should be shorter. And especially the time during which such a decision impacts the code we have to maintain.

Want another example of an innovation that was once great and is now holding us back? Gzipping pages. It has been in core ever since 4.5 and was a great feature (though I had some problems with it back then :-)). But it is wrong now: it holds us back with duplicate functionality that has to be maintained and that is better served at another layer of the stack.

Back when webservers didn't compress pages and elements by default, it made perfect sense to do so from Drupal. A great way to save bandwidth and deliver the pages faster to the user. But now that all webservers compress pages (and other elements, like a big Word document served as an attachment from /files/!) by default, it is code that has to go. The innovation was great, but it sank down the stack and became a commodity in all major webservers. And that is the risk with all innovations: if one keeps holding on to innovations that have already become commodities, one ends up down there as well.
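For example, this is all it takes at the webserver layer nowadays; a minimal sketch for nginx (Apache's mod_deflate offers the same), where the MIME types and size threshold are just illustrative choices:

    # nginx: compress responses at the webserver layer, so the CMS doesn't have to
    # (text/html is compressed by default once gzip is on)
    gzip on;
    gzip_types text/css application/javascript application/msword;
    gzip_min_length 1024;  # skip tiny responses where compression gains nothing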

This holds true for many elements of frontend performance. Right now it seems like a good thing to combine multiple CSS or JS files into one file. But once SPDY becomes mainstream, this can better be done at the HTTP protocol level, not in the CMS.

And traditional frontend performance wisdom states that we have to use image sprites in the template.

While if we add one module and one line of configuration, this is all done at the webserver level.
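That one line, assuming Apache with mod_pagespeed installed, would look like this:

    # Apache with mod_pagespeed: combine small CSS background images
    # into a single sprite on the fly
    ModPagespeedEnableFilters sprite_images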

And we should use selective data URIs in our template. Most frontend devs will puke: binary data in a template? Like we are some ugly old technology.
Again, with one line of configuration, the webserver layer will migrate these smaller images from flat files to inline data URIs.
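Again as a sketch, assuming Apache with mod_pagespeed; the 2 KB threshold is an arbitrary example:

    # Apache with mod_pagespeed: inline small images as data URIs
    ModPagespeedEnableFilters inline_images
    ModPagespeedImageInlineMaxBytes 2048  # only inline images up to 2 KB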

Take a look at this impressive list of options that mod_pagespeed - a webserver module - can help you with; a sample configuration follows after the list:

  • Optimize Caching: Canonicalize JavaScript Libraries, Extend Cache, Extend Cache PDFs, Local Storage Cache, Outline CSS, Outline JavaScript
  • Minimize Round Trip Times: Combine CSS, Flatten CSS @imports, Inline CSS, Combine JavaScript, Inline JavaScript, Move CSS Above Scripts, Configuration file directive to shard domains, Sprite Image, Pre-Resolve DNS
  • Minimize Request Overhead: Rewrite Domains, Configuration file directive to map domains
  • Minimize Payload Size: Collapse Whitespace, Combine Heads, Elide Attributes, Minify JavaScript, Optimize Images, Prioritize Critical CSS, Deduplicate Inlined Images, Remove Comments, Remove Quotes, Rewrite CSS, Rewrite Style Attributes, Trim URLs
  • Optimize Browser Rendering: Convert Meta Tags, Defer Javascript, Inline Preview Images, Lazily Load Images, Move CSS to Head, Optimize Images, Convert JPEG to Progressive, Rewrite Style Attributes
  • Other: Add Head, Add Instrumentation, Inline @import to Link, Make Google Analytics Async, Insert Google Analytics Snippet, Pedantic, Run Experiment
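To give an idea of how this looks in practice on nginx (ngx_pagespeed, the nginx port of the module), a minimal sketch enabling a handful of the filters above; the cache path is an example:

    # nginx built with the ngx_pagespeed module
    pagespeed on;
    pagespeed FileCachePath /var/ngx_pagespeed_cache;  # where rewritten assets are stored
    pagespeed EnableFilters collapse_whitespace,remove_comments;
    pagespeed EnableFilters sprite_images,inline_images,lazyload_images;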

Now for some of these actions there might be a Drupal module (lazyloading), for some one has to write good CSS/HTML/JS (CSS above scripts), some need good content editors or backend processes (deduplicate inlined images, progressive JPEGs) and some are just not doable yet in the frontend in an easy way (data URIs).

So as a frontend dev(ops), do yourself a favour and try the pagespeed module out for Apache or nginx, AND keep writing good templates. And as a Drupal community member, make sure that we keep innovating at the top, and set code free at the bottom where it is better served outside of our hands.

(btw Mike Ryan, there is more retro future at this Pinterest board :-)

The analog loophole: using a Lego Mindstorms kit and a Mac to bypass the DRM on a Kindle

I love the idea of using the analog loophole to bypass copy protection / DRM. For example filming a movie in the cinema (usually very bad quality) or...

From AllThingsD:

Using Lego’s Mindstorms — a basic robotics kit popular with hobbyists — plus a Kindle and a Mac, he has assembled a way to photograph what’s on the screen, and then submit it to a cloud-based text-recognition service.

Too bad I don't have a Kindle. Nor really the need to copy a work like this. But still, once the worldwide cyber police controls us all, we can still use the analog loophole to get back at them :-)

DIY Kindle scanner from Peter Purgathofer on Vimeo.

SPDY and web performance

Robert M. White
TL;DR

  1. Performance matters for all websites
  2. Performance is not just (80%) frontend
  3. SPDY kills 80% of your frontend problems

What
In the Drupal and broader web community, there is a lot of attention towards the performance of websites.

While "performance" is a very complex topic on its' own, let us in this posting define it as the speed of the website and the process to optimize the speed of the website (or better broader, the experience of the speed by the user as performance.

Why
This attention to speed exists for two good reasons. On the one hand, sites are getting bigger and hence slower: the databases get bigger with more content, and the codebase of the website gets extended with new modules and features. On the other hand, more and more money is being made with websites, even if you are not selling goods or running ads.

Given that most sites run on the same hardware for years, this results in slower websites, leading to a lower PageRank, less traffic, fewer pages per visit and lower conversion rates. And in the end, if you have a business case for your website, lower profits. Bottom line: if you make money online, you are losing some of it due to a slow website.
UFO's
When it comes to speed, there are many parameters to take into account; it is not "just" the average page loading time. First of all, the average is a rather useless metric without taking the standard deviation into account. But apart from that, it comes down to what a "page" is.

  • A page can be just the HTML file (this can be done in 50 ms)
  • A page can be the complete webpage with all the elements (for many sites around 10 seconds)
  • A page can be the complete webpage with all elements, including third-party content. Hint: did you know that for displaying the Facebook Like button, more JavaScript is downloaded (non-cacheable!) than the entire jQuery/Backbone/Bootstrap app of this website?
  • And a page can be anything "above the fold"



Moon Retro future
And then there are more interesting metrics than these; the time to first byte, for example, from a technological point of view. But not just from a technical PoV: there is a website one visits every day that optimizes its renderable HTML to fit within 1500 bytes.
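As a quick illustration, one way to eyeball the time to first byte with plain curl (the URL is a placeholder):

    # prints DNS + connect + server think time up to the first byte received
    curl -o /dev/null -s -w "TTFB: %{time_starttransfer}s\n" https://example.com/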
So, ranging from "first byte to glass" to "round trip time", there are many elements to be taken into account when one measures the speed of a website. And that is the main point: web performance is not just for the frontenders like many think, nor just for the backenders like some of them hope, but for all the people who control elements in the chain involved in that speed. All the way down to the networking guys (m/f) in the basement (hint, sysadmins: INITCWND has a huge performance impact!). Speed should be in the core of your team, not just with those who enable gzip compression, aggregate the JavaScript or make the sprites.
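For those networking guys and gals: on Linux, the initial congestion window can be raised per route with iproute2. A sketch, where the gateway and device are example values:

    # show the current default route, then raise the initial
    # congestion window for it to 10 segments
    ip route show default
    ip route change default via 192.168.1.1 dev eth0 initcwnd 10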

Steve Souders (the web performance guru) once stated in his golden rule that 80-90% of the end-user response time is spent on the frontend.

Speedy to the rescue?
This 80% might be a matter of debate in the case of a logged-in user in a CMS. But even if it is true, this 80% can be reduced by 80% with SPDY.
SPDY is an open protocol introduced by Google to overcome the problems with HTTP (up to 1.1, pipelining included, defined in 1999!) and the absence of HTTP/2.0. It speeds up HTTP by using one connection between the client and the server for all the elements in the page served by that server. Originally only built into Chrome, many browsers now support this protocol, which will be the base of HTTP/2.0. Think about it and read about it: a complete webpage with all its elements - regardless of minifying and sprites - served in one stream, with only one TCP handshake and one DNS request. Most of the rules of traditional webperf optimization (CSS aggregation, preloading, prefetching, offloading elements to different hosts, cookie-free domains), all this wisdom is gone, even false, with one simple install. 80% of the 80% gone with SPDY; now one can focus on the hard part: the database, the codebase. :-)
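That one simple install can indeed be small; a sketch for nginx 1.4+ built with its SPDY module (the server name and certificate paths are placeholders):

    # nginx built with --with-http_spdy_module; SPDY runs over TLS in practice
    server {
        listen 443 ssl spdy;
        server_name example.com;
        ssl_certificate     /etc/nginx/ssl/example.com.crt;
        ssl_certificate_key /etc/nginx/ssl/example.com.key;
    }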

The downside of SPDY, however, is that it is hard to troubleshoot and not yet available in all browsers. It is hard to troubleshoot since most implementations use SSL, and the protocol is multiplexed and compressed by default, not made to be read by humans, unlike HTTP/1.x. There are some tools that make it possible to test SPDY, but most if not all tools you use every day, like ab, curl and wget, will fail to use SPDY and fall back, as defined in the protocol, to HTTP/1.1.

Measure
So, can we test whether SPDY is really faster, and how much faster?
Yes: see Evaluating the Performance of SPDY-Enabled Web Servers (a Drupal site :-)
SPDY performance

So: more users, fewer errors under load and a lower page load time. What is there not to like about SPDY?

Drupal
That is why I would love Drupal.org to run with SPDY; see this issue: d.o/2046731. I really do hope that the infra team will find some time to test this and, once it is accepted, install it on the production server.


Performance as a Service
One of the projects I have been active in lately is ProjectPAAS; bonus points if you find the easter egg on the site :-) . ProjectPAAS is a startup that will test a Drupal site, measure 100+ metrics, analyse the data and give the developer an opinionated report on what to change to get better performance. If you like these images around the retro future theme, be sure to check out the Flickr page, like us on Facebook, follow us on Twitter, but most of all, see the moodboard on Pinterest.

Pinterest itself is doing some good work when it comes to performance as well. Not just on speed, but also on the perception of speed.

Pinterest lazyloading with color
Pinterest does lazyload its images, but it also displays the dominant color as the background of a cell before the image is loaded, giving the user a sense of what is to come. For background on this, see webdistortion.
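The markup pattern behind this is simple; an illustrative sketch (class names, color and dimensions are made up, and the real thing swaps the image in with JavaScript):

    <!-- the dominant color fills the cell until the real image is swapped in -->
    <div class="pin" style="background-color: #8a6d3b; width: 236px; height: 354px;">
      <img data-src="/files/pin-1234.jpg" alt="">
    </div>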


Congratulations, you just saved 0.4 seconds
If you are lazyloading images to give your users faster results, be sure to check out this module we made: lazypaas, currently a sandbox project awaiting approval. It extracts the dominant (most used) color of an image and fills the box where the image will be placed with this color. And if you use it and have done a code review, be sure to help it become a full Drupal module.


From 80% to 100%
Lazyloading like this leads to a better user experience. Because even when 80% of the end-user response time is spent on the frontend, 100% of the time is spent in the client, most often the browser. That is the only place where performance should be measured and the only place where performance matters. Hence, all elements that deliver this speed should be optimized, including the webserver and the browser.

Now say this fast after me: SPDY FTW. :-)
