The web app I am building out was getting BLOATED just rendering everything as HTML – we were seeing 1.18MB pages, lol. Now I am down to 40KB.
So I went back to the drawing board and now dump all the data into the page as JSON, and with that, dynamically build the views on the client.
The beauty is there are no more postbacks at all – I just use AJAX for everything, and I can rebuild the entire UI with one command when data changes on the server.
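For context, here’s roughly the shape of it – a minimal sketch, where the IDs and the endpoint are made up:

```js
// Server embeds the data instead of rendered HTML:
// <script id="invoice-data" type="application/json">{"invoiceId":42,"items":[...]}</script>

function renderInvoice(data) {
  var $list = $('#line-items').empty();
  $.each(data.items, function (i, item) {
    $list.append($('<li>').text(item.desc + ' x' + item.qty + ' @ $' + item.price));
  });
}

// Initial render straight from the embedded JSON – no extra request.
renderInvoice($.parseJSON($('#invoice-data').html()));

// When data changes on the server, one AJAX call rebuilds the whole view.
function refreshInvoice(id) {
  $.getJSON('/invoices/' + id, renderInvoice);  // hypothetical endpoint
}
```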
It has been years since I’ve done this, any pitfalls to watch out for?
Well, you lose users with JavaScript disabled, there may be accessibility issues, you will have more trouble with older browsers unless your code is bombproof, mobile browsers may have performance issues, and search engines may have a harder time indexing your content.
Assuming you’re still loading the same content, you’re not really saving bandwidth, and it takes at least as long to render. However, if you load things in the proper order, it will appear to load much faster.
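By “proper order” I mean something like this – ship the static shell, paint a placeholder instantly, then fill in the data (a sketch; the endpoint and render function are hypothetical):

```js
$(function () {
  // The shell (header, nav, empty containers) paints with the initial HTML.
  $('#invoice-panel').html('<p class="loading">Loading…</p>');  // instant feedback

  // The heavy data arrives afterwards and replaces the placeholder.
  $.getJSON('/invoices/current', function (data) {
    renderInvoice(data);  // whatever client-side rendering you use
  });
});
```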
A lot of it depends on context. Obviously the page is quite complex – what does it do?
Twitter did the same thing, and not too long ago they went back to a more traditional model. Removing their heavy JavaScript reliance allowed them to significantly reduce their load times.
I am not looking for extensive backwards compatibility – if they lack HTML5, oh well. If there is a big enough demand, I’ll reconsider. I don’t think load times (downloading content) are much of a concern, really.
Basically, I am building a ‘billing’ page to add/remove billing items for invoices (more complex than that, but that is the high level).
I guess load time on the client is fine, IMO – realistically they have a “view one invoice at a time” approach, so this allows them to swap between invoices easily.
This is much, much, much easier now that jQuery is around. Last time I did this kind of work was like 6 years ago… LOL
Lose out on users with JavaScript disabled? For real? That’s such a small slice of users that it really doesn’t matter anymore. 2005? Sure. 2013? Time to stop debating it. Web apps these days are almost all UI, and the industry is moving toward the front end precisely because that statistic has become irrelevant. Plus, if you have some experience developing web applications, you can probably write something that accommodates most of those pitfalls.
And no, Twitter rewrote their front-end stack to more of an event-based, asynchronous model, which requires much heavier JavaScript and integration between their services. They basically did what I’m going to list below: rewrote the things where it made architectural sense.
There are a lot of things to look out for. Especially if you’re a developer who’s knowledgeable in many languages, you’re probably going to code things in a way that will hinder the shit out of your application. Writing JavaScript definitely isn’t like writing most other languages – there’s a ton of shit to look out for in regards to performance. Depending on how extensive your app is, be careful with jQuery; it can really slow shit down, because jQuery attempts to support backwards compatibility with pretty much every function call that’s available. Simple queries can end up running so recursively that they’ll bring some browsers to a halt (for a few seconds).
Simple shit:
– Watch out for the number of DOM injections you do. If there are a lot, use document fragments (see the first sketch after this list).
– Don’t pollute the global namespace (the module-pattern sketch below shows one way to avoid it).
– Don’t excessively use closures, or keep things in memory when they should be garbage collected.
– If you’re using jQuery, make sure to cache your reusable objects and use them as context (sketch below).
– Load JavaScript files at the bottom of the page.
– Since you don’t care about backwards compatibility and can use HTML5, do it – and avoid CSS expressions.
– Do a little research on JavaScript design patterns to see if you can architect your app for speed and security.
– If you’re using a service, minimize the number of calls to the server. I.e., grab as much data via JSON from the user as you can before calling your service (see the batching sketch below).
– If you’re doing extensive UI, look into platforms such as Node.
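To illustrate the document-fragment point – build detached, inject once (a sketch):

```js
var items = ['Widget', 'Gadget', 'Gizmo'];

// BAD: items.forEach(function (i) { $('#list').append('<li>' + i + '</li>'); });
// Each append touches the live DOM and can trigger a reflow.

// BETTER: assemble everything off-DOM, then inject in one shot.
var frag = document.createDocumentFragment();
items.forEach(function (item) {
  var li = document.createElement('li');
  li.textContent = item;     // textContent also avoids HTML injection
  frag.appendChild(li);
});
document.getElementById('list').appendChild(frag);  // one DOM injection
```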
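For the global-namespace point, the usual fix is the module pattern – one global, everything else private (a sketch; the app name is made up):

```js
// One global instead of dozens; internals live in the closure, off window.
var BillingApp = (function ($) {
  var itemCache = {};                       // private state

  function addItem(item) {
    itemCache[item.id] = item;
    // ... DOM work here ...
  }

  function clearCache() {
    itemCache = {};                         // drop references so they can be GC'd
  }

  return { addItem: addItem, clearCache: clearCache };  // public surface only
}(jQuery));

BillingApp.addItem({ id: 1, desc: 'Widget' });
```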
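And the jQuery caching point looks like this – query once, reuse the object, and scope further lookups to it:

```js
// BAD: $('#invoice .price') re-walks the document every time it's called.

// BETTER: cache the container and use it as context.
var $invoice = $('#invoice');          // queried once, reused everywhere
var $prices  = $('.price', $invoice);  // context limits the search to #invoice

// One delegated handler instead of a handler per row.
$invoice.on('click', '.remove', function () {
  $(this).closest('.line-item').remove();
});
```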
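On minimizing server calls: queue the user’s edits client-side and push them as one JSON payload instead of one request per change (the URL and payload shape are hypothetical):

```js
var pending = [];

function queueChange(change) {
  pending.push(change);                    // accumulate edits locally
}

function saveAll() {
  $.ajax({
    url: '/api/invoices/42/items',         // hypothetical endpoint
    type: 'POST',
    contentType: 'application/json',
    data: JSON.stringify(pending),         // one round trip for N edits
    success: function () { pending = []; }
  });
}
```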
I would personally not be worried about JavaScript being disabled, but yes, it is 2013, and yes, it can still be a problem – just like IE6. It just depends on your audience and requirements.
Twitter achieved a massive gain in load times by moving their rendering to the server side. You’re right that they implemented AMD modules – which are great for development, and I love them – but the bulk of their gains came from NOT building everything client-side like they used to.
Here it is from the horse’s mouth:
When you come to twitter.com, we want you to see content as soon as possible. With hashbang URLs, the browser needs to download an HTML page, download and execute some JavaScript, recognize the hashbang path (which is only visible to the browser), then fetch and render the content for that URL. By removing the need to handle routing on the client, we remove many of these steps and reduce the time it takes for you to find out what’s happening on twitter.com.
Here’s an interesting aside on hashbangs/too much reliance on JS:
Hashbangs for Lunch