Wednesday, June 27, 2012

A .NET guy @Velocityconf 2012 - Day 2

So, the first day went great, and the second one was even better. Actually, this was the first official day (not counting the workshops), and we got started with an awesome keynote.

The keynote

Jay Parikh from Facebook gave us a really inspiring speech about their company culture, work environment and, of course, some staggering numbers about their performance load. Something he said, which I have already found very useful when getting new members into the team, is the idea of fast success. Giving them the opportunity to commit a real piece of code in their first days at the new company is a big booster for what comes next. I really enjoyed the short review of Gatekeeper - the software Facebook wrote to manage the rollout of new features. This kind of tool really makes the difference between an ordinary software company that just does what it needs for a living and an aggressive, innovative one - automation is the key. Jay mentioned a lot of corporate values that we try to follow at Telerik as well: point to solutions, not problems; be honest; be a team player, not a player...

Two guys from Google, Arvind Jain and Dominic Hamon, talked about user experience and how we can improve it by preloading things. Something new for me were the
<link rel="prerender" href="some_resource" /> tag for Chrome and the
<link rel="prefetch" href="some_resource" /> tag for Firefox. Basically, this way you can advise the browser to prefetch resources that you believe are likely to be requested by the user next. This is awesome, and as you probably know, Google has been using this technique for some time in their search engine and in the browser when you type a URL. What they do is keep track of which URLs you choose, and with what probability, when you type certain characters. This allows them to preload the resources you are about to request with amazing accuracy. Just an example of how effective this can be: Google measured the seconds saved for each user by preloading resources, and it turned out that in a single day they save their users a combined 30 years of waiting (across the search engine and the browser)! This is something we certainly want to use in our websites.

We also got a very interesting talk from Richard Cook, who is not a programmer but has done deep research on the topic of how complex systems fail. To sum up: aggregate statistics kill the details, and if you really want to study and investigate a problem, you should not observe the data from a bird's-eye view, but get down to the single-item level and study every case individually. It looks like this has been the key to solving lots of mysteries in human history - most famously, it is how the cause of the cholera disease was found. By aggregating data you can even hide the problems and not know when your systems behave badly.

Performance implications of Responsive Web Design

Lately, we have been using this technique a lot. We just released a new version of TelerikTv (which I will blog about later) that uses a responsive layout to adapt to different screen parameters. What we should keep in mind when using responsive web design is that hiding or adapting content doesn't necessarily make it optimized for the particular device, and you can even incur a performance penalty when you don't implement the technique properly.
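A classic pitfall, as a minimal sketch (the class name and image are made up): hiding an image with a media query removes it from the layout, but most browsers today still download its bytes:

  <style>
    /* On small screens the banner is not rendered... */
    @media (max-width: 480px) {
      .hero-banner { display: none; }
    }
  </style>
  <!-- ...but the heavy desktop image below is typically fetched anyway -->
  <img class="hero-banner" src="huge-desktop-banner.jpg" alt="Promo banner" />

So a phone on a slow connection pays for an image it never shows - the layout is responsive, but the payload is not.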

One thing to keep in mind is responsive images. Currently, it is very difficult to serve the proper image size for a specific device and layout. It looks like we need new syntax in the markup in order to express different sources for our images. Otherwise, we can only adapt the visual presentation, while the amount of transferred data remains unnecessarily high. To do this today, you have several options, none of which is an obvious winner, and all of which will probably be replaced in the future by a new standard introduced in HTML, for example. There is an interesting read from A List Apart on this topic.
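Until such a standard exists, one common workaround (a rough sketch with made-up file names, not any specific library) is to start from a small image and upgrade it from script when the screen is big enough:

  <img id="teaser" src="teaser-small.jpg" data-large="teaser-large.jpg" alt="Teaser" />
  <script>
    // Upgrade to the big file only on screens that can make use of it
    var teaser = document.getElementById("teaser");
    if (screen.width > 768) {
      teaser.src = teaser.getAttribute("data-large");
    }
  </script>

The catch is that the browser may have already started downloading the small image by the time the script runs, which is exactly why none of today's approaches feels like a clear winner.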

Roll back: the impossible dream

James Turnbull explained how hard it actually is to do a real rollback of a running system, and speculated on whether it is even possible. Something worth mentioning: a lot of people count on their rollback procedures as a possible last-resort solution, but rollback routines are hardly ever tested and practised. Performing an operation for the very first time when something has already gone terribly wrong and the pressure is enormous is probably not the best thing to do. Maybe we should not count on rollback procedures at all? Maybe we should invest the time in actually solving the problem, instead of trying to roll constantly changing systems back to their initial state without losing the operational data.

As a side note, James mentioned the myths that every company and team has about how to do certain things and how not to do others. His advice was to go and re-think all the "We don't do things this way, because something terribly wrong happens every time we do" statements we have grown used to, find the root cause and fix it.

The expo hall

There is a large number of companies presenting at the expo area. I hadn't heard of most of them, mainly because they are not targeted at the .NET world, but I found out that many of their tools can still be useful in our environment. For example, I got a good impression of a company that offers performance dashboards as a service. You install an agent on every live server you have; it tracks the activity of your web processes, collects data such as Windows performance counters, and sends it to their service, where you get a nice presentation of your current live environment across all running servers. You can also add information from your logs or eCommerce solution and get all this data in one place. Something we should probably check out with our admin team.

Something I forgot to mention yesterday was the webpagetest.org project - an online tool that runs a performance study of your website from different geo-locations with different browsers and gives you nice analytical data about how your website behaves. Another interesting tool (still in beta) that is worth mentioning is http://httparchive.org/ - a nice source of statistical data about HTTP traffic worldwide. According to the trends shown there, Flash is steadily disappearing, websites are using more and more custom font faces for their presentation, etc. Be sure not to miss it!

Well, basically these were the highlights from today at VelocityConf. It has been a very busy day with lots of new information. I'm sure that tomorrow will be even better.