Virtual Firewall with a Kinect and a projector

I am a big fan of all things gesture (Kinect) and augmented reality. So this "firewall" on aaron-sherwood's site is something I dig.

The original concept stems from a performance piece I’m currently developing as Purring Tiger (with Kiori Kawai) titled Mizalu, which will premiere in June 2013. During one scene in the performance dancers will press into the spandex with the audience facing the opposite side. Mizalu is about death and experience of reality, so this membrane represents a plane that you can experience but never get through. As hard as you try to understand what’s in between life and death, you can never fully know.

And a bit more howto

The piece was made using Processing, Max/MSP, Arduino and a Kinect. The Kinect measures the average depth of the spandex from the frame it is mounted on. If the spandex is not being pressed into, nothing happens. When someone presses into it, the visuals react around where the person presses, and the music is triggered. An algorithm created with Max allows the music to speed up and slow down and get louder and softer, based on the depth. This provides a very expressive musical playing experience, even for people who have never played music before. A switch is built into the frame which toggles between two modes. The second mode is a little more aggressive than the first.
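The depth-triggered logic described above can be sketched in a few lines. This is a hypothetical reconstruction, not the actual patch (the real piece uses Processing and Max/MSP, and the distances and thresholds below are made-up numbers):

```python
from statistics import mean

# Hypothetical resting distance of the spandex from the Kinect, in mm,
# and how far someone must press before anything happens (made-up numbers).
RESTING_DEPTH_MM = 1200
TRIGGER_OFFSET_MM = 50

def react_to_frame(depth_frame):
    """Given a list of per-pixel depth readings (mm) for one Kinect frame,
    return None when the membrane is untouched, or an intensity in [0, 1]
    that the music/visuals could scale with."""
    avg = mean(depth_frame)
    # Pressing stretches the membrane away from the sensor, so the
    # average depth grows with the press.
    press = avg - RESTING_DEPTH_MM
    if press < TRIGGER_OFFSET_MM:
        return None  # not pressed: nothing happens
    # Deeper press -> louder/faster, capped at full intensity.
    return min(1.0, press / 400)
```

A frame of readings at the resting distance yields `None`; the harder the press, the closer the returned intensity gets to 1.0.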

Need some nerd time!

Leap user interface with Flickr: the Minority Report user interface is here (well, sort of :-))

I recently gave a presentation about Natural User Interfaces and dived back into my old (?) Leap Motion.

It can do a Minority Report-style user interface within a web app:

3d scan of my room using the kinect

I love all the stuff one can do with the Microsoft Kinect. Take, for example, this scan of my bay window:

(it looks like a snow globe due to the white parts outside)

And without color:

The Kinect can be used for much more than just scanning. Scanning one's girlfriend, for example :-)

Zen and the art of innovation

Commodity sinks, innovation rises. An old rule. That is why the drop has to keep on moving: we need shorter release cycles and must adopt new technologies faster, to make sure we don't become what we replaced: old, outdated systems that are slow to adapt and quick to go extinct.

There are two sides to this: we have to adopt new technologies faster and dump old technologies sooner, and we have to make sure that adopting new technologies faster doesn't create a legacy we then have to shed just as fast. Or at least, that is my opinion.

"Nobody" in the world logs in with OpenID anymore. Many application-to-application backends might still be using it, but nobody uses it to authenticate anymore. The last bastion, Janrain, just announced that it will close its doors. Drupal will drop OpenID from core as well, and might well have done so a long time ago. So if Drupal 8 sees the light in January 2014, and we still have release cycles of over 1000 days, the maintainers will still be dealing with an OpenID implementation, supporting it in Drupal 7 core up to the time D9 ships, somewhere around Q4 2017. Extrapolating our current release cycles: most likely later, much later.

This is not to criticise anyone; I think it was a wise step to include OpenID back in the Drupal 6 days, and it is a wise step to remove it from D8. But the time between these releases, and hence between these decisions, should be shorter. And especially the time that they impact the code we have to maintain.

Want another example of an innovation that was once great and is now holding us back? Gzipping pages. It has been in core ever since 4.5 and was a great feature (though I had some problems with it back then :-)). But it is wrong now: it holds us back with duplicate functionality that has to be maintained and is better served at another OSI layer.

Back when webservers didn't compress pages and elements by default, it made perfect sense to do so from Drupal: a great way to save bandwidth and deliver pages to the user faster. But now that all webservers compress pages by default (and other elements, like a big Word document served as an attachment from /files/!), it is code that has to go. The innovation was great, but it sank lower in the stack and became a commodity in all major webservers. And that is the risk with all innovations: if one keeps holding on to innovations that are already a commodity, one ends up down there as well.
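At the webserver level, taking over gzip from the CMS is only a few lines. A minimal Apache sketch using mod_deflate (the MIME types are an illustrative selection, not a complete list for your site):

```apache
# Compress text-like responses on the fly; binary formats such as
# JPEG or ZIP are already compressed and are left alone.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css text/plain
    AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```

With something like this in place, Drupal's own page compression setting can simply stay off.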

This holds true for many elements of frontend performance. Right now it seems like a good thing to combine multiple CSS or JS files into one file. But once SPDY becomes mainstream, this is better done at the HTTP protocol level, not in the CMS.

And traditional frontend performance wisdom states that we have to build sprites into the template.

While if we add one module and one line of configuration, this is all done at the webserver level with image sprites.
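Assuming mod_pagespeed is installed under Apache, that one line is the sprite filter:

```apache
# mod_pagespeed combines small background images referenced from CSS
# into a single sprite and rewrites the CSS selectors to match.
ModPagespeedEnableFilters sprite_images
```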

And we should use selective DATA URIs in our templates. Most frontend devs will puke: binary data in a template? As if we are some old, ugly technology.
Again, with one line of configuration, the webserver layer will migrate these smaller images from flat files to inline DATA URIs.
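What the webserver does under the hood is easy to illustrate. A small Python sketch (the MIME type default is just an example) that turns raw image bytes into an inline DATA URI like the one that ends up in the CSS or HTML:

```python
import base64

def to_data_uri(image_bytes, mime="image/png"):
    """Encode raw image bytes as an inline DATA URI, the form the
    webserver would embed in place of a separate image request."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return f"data:{mime};base64,{b64}"

# Illustration with a tiny payload (the first bytes of a PNG header):
print(to_data_uri(b"\x89PNG\r\n"))
```

The point of doing this at the webserver layer is exactly that no template ever has to contain the base64 blob itself.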

Take a look at this impressive list of options that mod_pagespeed -a webserver module- can help you with:

  • Optimize Caching: Canonicalize JavaScript Libraries, Extend Cache, Extend Cache PDFs, Local Storage Cache, Outline CSS, Outline JavaScript
  • Minimize Round Trip Times: Combine CSS, Flatten CSS @imports, Inline CSS, Combine JavaScript, Inline JavaScript, Move CSS Above Scripts, Configuration file directive to shard domains, Sprite Image, Pre-Resolve DNS
  • Minimize Request Overhead: Rewrite Domains, Configuration file directive to map domains
  • Minimize Payload Size: Collapse Whitespace, Combine Heads, Elide Attributes, Minify JavaScript, Optimize Images, Prioritize Critical CSS, Deduplicate Inlined Images, Remove Comments, Remove Quotes, Rewrite CSS, Rewrite Style Attributes, Trim URLs
  • Optimize Browser Rendering: Convert Meta Tags, Defer Javascript, Inline Preview Images, Lazily Load Images, Move CSS to Head, Optimize Images, Convert JPEG to Progressive, Rewrite Style Attributes
  • Other: Add Head, Add Instrumentation, Inline @import to Link, Make Google Analytics Async, Insert Google Analytics Snippet, Pedantic, Run Experiment
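The filters in the list above map onto configuration directives. A sketch of enabling a few of them in Apache (this particular selection is illustrative; pick and measure for your own site):

```apache
# Each name enables one of the rewriters from the list above.
ModPagespeedEnableFilters extend_cache,combine_css,inline_css
ModPagespeedEnableFilters lazyload_images,inline_images
ModPagespeedEnableFilters collapse_whitespace,remove_comments
```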

Now, for some of these actions there might be a Drupal module (lazy loading), for some one has to write good CSS/HTML/JS (CSS above scripts), some need good content editors or backend processes (de-duplicating inline images, progressive JPEGs), and some are just not doable yet in the frontend in an easy way (DATA URIs).

So as a frontend dev (ops), do yourself a favour and use the PageSpeed module for Apache and nginx, AND keep writing good templates. And as a Drupal community member, make sure that we keep innovating at the top, and set code free at the bottom, where it is better served outside of our hands.

(btw Mike Ryan, more retro-future at this Pinterest board :-)

The analog loophole: using a Lego Mindstorms kit and a Mac to bypass the DRM on a Kindle

I love the idea of using the analog loophole to bypass copy protection / DRM. For example, filming a movie in the cinema (usually very bad quality) or...

From allthingsd:

Using Lego’s Mindstorms — a basic robotics kit popular with hobbyists — plus a Kindle and a Mac, he has assembled a way to photograph what’s on the screen, and then submit it to a cloud-based text-recognition service.

Too bad I don't have a Kindle. Nor, really, the need to copy a work like this. But still, once the worldwide cyber police controls us all, we can still use the analog loophole to get back at them :-)

DIY kindle scanner from peter purgathofer on Vimeo.
