At what point does past knowledge become prohibitive? Experience and wisdom, the fruits of life's lessons, are supposed to guide your future decision-making. But at what point has everything already moved on, so far that the decision they lead you to is flat-out wrong?

I'll be a bit more concrete. I have always believed that Javascript-heavy applications are the worst things. My complaints about them boil down to a list:

  • Accessibility - Javascript is not always enabled in browsers. And as impressive as browser progress has been, the surge of mobile web applications clearly shows that scripting-heavy applications do not carry over into mobile territory.
  • CPU Performance - One application causes your browser to crash? Say goodbye to all your tabs.
  • User experience - Instead of relying on UI patterns people are already familiar with (the "back" button), Javascript applications tend to reinvent UI flows ("click this link to get a permalink, because we don't update the URL field"). Every new application is a potential source of frustration; see the sketch after this list.
  • Development time - Every web application is already written in a server-side language. Do you really want to add another language your app depends on? Not to mention that browsers all interpret ECMAScript in slightly different ways - have fun with compatibility!
  • Network degradation - Whether a server application is 500K or 25K of code makes almost no difference to the client. But try loading a site that ships 500K of Javascript versus 25K. If the response time research is right, a 500K Javascript payload will certainly break a user's "uninterrupted" train of thought.
  • Product direction - Have you ever tried to stop an engineer from implementing a "cool new feature" they hacked up in a couple of minutes (or maybe a late weekend session) which has absolutely nothing to do with anything? Trying to cut code to keep your application speedy requires some social capital.

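To make the permalink complaint above concrete, here's a rough sketch of the bookkeeping a scripted application has to reinvent just so the back button and copy-pasted URLs keep working. Everything here (showSection, the #content element, the polling interval) is made up for illustration, not taken from any particular app:

    // Keep the URL fragment in sync with dynamically loaded content so the
    // back button and copy-pasted permalinks still behave.
    var lastHash = '';

    function showSection(name) {
      // Stand-in for the XMLHttpRequest call that swaps in new content.
      document.getElementById('content').innerHTML = 'Section: ' + name;
    }

    function syncFromHash() {
      var hash = window.location.hash.replace('#', '');
      if (hash !== lastHash) {
        lastHash = hash;
        if (hash) { showSection(hash); }
      }
    }

    // Navigation links point at fragments (<a href="#comments">), which creates
    // real history entries; since not every browser fires an onhashchange event,
    // poll to notice back/forward navigation.
    window.onload = syncFromHash;
    setInterval(syncFromHash, 250);

None of that is needed when the server simply renders a distinct URL for each page.
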
Most of these (with the exception of the last point, which is more of a social problem) will eventually be solved. But when do you make the jump?

At some point, browsers will fix the threading issue and ship faster Javascript interpreters (WebKit is already blazing forward on this front). UI patterns will emerge and be adopted by product managers. Faster networks and smarter caching (if Gears ever catches on) will reduce the network degradation issues with large Javascript applications. Javascript libraries like jQuery already make development across browsers much easier (although jQuery's primary goal is to simplify cool UI effects for CSS/HTML wizards). Assistive technologies for disabled visitors will get much better at handling scripted pages. Section 508 will become less of an engineering drag on web application development.
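
As a small example of what that smoothing-over looks like (the #save-button id is hypothetical), compare hand-rolled event wiring against the jQuery equivalent:

    // Without a library, attaching a click handler means branching on
    // browser quirks:
    function addClick(el, handler) {
      if (el.addEventListener) {
        el.addEventListener('click', handler, false);  // standards browsers
      } else if (el.attachEvent) {
        el.attachEvent('onclick', handler);            // IE's older event model
      }
    }

    // With jQuery, the same wiring is one call, and the event object is
    // normalized for you:
    $('#save-button').click(function (event) {
      event.preventDefault();
      // ... do the actual work ...
    });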

And at some point, applications that didn't make the jump to take advantage of these technologies will be left behind. But it seems to be a matter of timing more than anything: IE6 still has a noticeable presence on the net. Working for an enterprise software company, how far ahead of the curve do we want to be?

With limited resources, we can only pick certain battles - which ones do we pick? Make it work well for older generations of browsers, or build on the latest and greatest features that only work in certain browsers?

The argument from the business side is not as difficult to make as one would imagine when trying to justify accessibility: "Do you want to automatically slice yourself out of 35% (an arbitrary number I made up) of the market by making this *not* work in a particular subset of browsers?"

The easiest path out is always the one that only works for people who share your particular set-up. UI designers are currently burdened with multiple browsers, on multiple platforms, in completely variable environments. That's where expertise really comes into play.

And this doesn't even begin to factor in another huge issue for effective software development: maintenance. We once had an employee who was fully capable of "making things work" for all the different browsers, but the solutions were hacks on top of hacks, to the point that the application became unmaintainable. It became impossible to add new features to the front-end, or to fix other issues.

This is why server-side languages are so awesome to me - you write something, and it works. The same way. Every time.

But the scariest part is that a lot of technologies, like Flash, are actually "catching up" on many of these concerns (they are successfully following the "feature checklist" methodology of software development). At some point, they will become as accessible as heavily Javascripted applications (possibly even more so). At that point, do you build something in Adobe Flex? Ten years from now, will it make more sense to write Flex applications? It's possible - there's a reason Flash's file uploaders kick the living bejesus out of every browser's native uploader. I shudder to think of a world with more examples like that.

Reinventing the wheel is evil, but with enough time, it becomes worth it. With time, you work out the kinks and bring your reinvention up to par with the accepted "standards."

So, the original question, posed again: when do the rules you impose ("Javascript only for progressive enhancement; rely on server-side languages to render pages") become prohibitive to your software's evolution?
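
For concreteness, the rule I mean looks something like this in practice - a rough sketch with made-up ids and URLs, where the server renders a plain link that works with scripting turned off, and Javascript, when it happens to be available, merely upgrades it:

    // Server-rendered markup (works with no Javascript at all):
    //   <a id="more-link" href="/comments?page=2">More comments</a>
    //
    // Enhancement layer: hijack the click only when script actually runs,
    // and fetch the same URL the link already points at.
    $(function () {
      $('#more-link').click(function (event) {
        event.preventDefault();
        $('#comments').load(this.href);  // inject the response into #comments
      });
    });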

Posted by roy on March 3, 2009 at 02:48 AM in Web Development, MindTouch
