
Why not AJAX'ify entire websites?


Is there a solid rationale for why websites shouldn't be developed with Ajax functionality that loads major parts of each page (assuming elements like the header, navigation, etc. stay the same)?

Surely it would be less resource-intensive, since the server would not have to serve the content that appears on every page over and over, which benefits both the host and the end user.

Answer the question taking into account:

  • The site's JavaScript behaviour degrades gracefully in every case

  • I'm talking about new websites that would most likely implement this behaviour from the ground up, so it isn't technically expensive; we're not going back to a finished product to implement it.




Answers:


If the content can be reached without JavaScript enabled, your question doesn't really make sense. It isn't "fully Ajaxified" if you can get to its content by other means. What you're really asking is, "Is it okay to use Ajax to improve the user experience?" The answer is obviously "yes".

Edit:

When Google came out with their crawlable Ajax proposal, I thought it was a really bad idea. It makes for an interesting read.





The pros

  • AJAX lets you share a common "base page" and load only the content areas, which can reduce load times for users, since much of the page is already loaded (see the sketch after these lists).
  • Allows for some eye candy, e.g. fading the content area in and out.

The cons

  • Doesn't play nicely while the page is still downloading.
  • Can cause problems for assistive devices used by people with disabilities.
  • Visitors with JavaScript disabled can only use the content if a non-JavaScript version is also provided.
  • Much more work (does that really need to be said?).
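
For illustration, a minimal sketch of the first pro above (sharing a base page and loading only the content area) could look like the following TypeScript. The "#content" container and the "/partials/..." endpoints are assumptions made for this example, not something from the answer itself.

```typescript
// Minimal sketch: swap only the content area of a shared "base page".
// The "#content" container and the "/partials/..." endpoints are assumptions
// made up for this example.

async function loadSection(path: string): Promise<void> {
  const container = document.querySelector<HTMLElement>("#content");
  if (!container) return;

  const response = await fetch(`/partials${path}`); // server returns an HTML fragment
  if (!response.ok) {
    // Fall back to a full page load if the fragment request fails.
    window.location.href = path;
    return;
  }
  container.innerHTML = await response.text();
}

// Example usage: load the "about" content without reloading the header or navigation.
loadSection("/about");
```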

Now let's say your site does degrade gracefully for those without JavaScript. How well that works depends on how it's done. For example, if visitors without JavaScript just see a bare link to a non-JavaScript version, it's inconvenient for them to have to click through to another page. On the other hand, if there is a noscript "base page" that uses traditional links, it works better for most visitors, but it still leaves users of assistive devices without proper support when they follow a link to a particular "page", and so on.

All in all, in a world of ever faster web connections, shaving off a small amount of file size (we're assuming that the JavaScript itself, the CSS, and the images can and will be cached, so only the "base" page itself saves any bytes) isn't really worth the drawbacks it can bring, namely the added difficulty (although not always that much of a challenge) and the reduced support it offers some users.

All in all, I would say it's up to you; it would probably work fairly well, and the vast majority of users would likely see the site as intended, but personally I would say don't bother, since it isn't worth it for such a marginal improvement in file size.


Take a look at http://gawker.com/ - this site loads almost all of its content after the fact. They use "hashbangs" (#!) to determine which content page to load, while the main navigation stays static.

At http://mtrpcic.net/2011/02/fragment-uris-theyre-not-as-bad-as-you-think-really/ you can find a short tutorial on the concept Gawker uses.
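
As a rough sketch of that hashbang concept (everything after "#!" decides which content to fetch), something like the following could work; the "#content" element and the "/fragments/..." endpoint are assumptions for illustration only.

```typescript
// Rough sketch of hashbang routing: the part of the URL after "#!" determines
// which content fragment to fetch. "#content" and "/fragments/..." are assumptions.

async function loadFromHash(): Promise<void> {
  const hash = window.location.hash;      // e.g. "#!/5783905/some-article"
  if (!hash.startsWith("#!")) return;     // not a hashbang URL, nothing to do

  const path = hash.slice(2);             // strip the "#!" prefix
  const response = await fetch(`/fragments${path}`);
  const container = document.querySelector<HTMLElement>("#content");
  if (container && response.ok) {
    container.innerHTML = await response.text();
  }
}

// Re-run whenever the fragment changes, and once on the initial load.
window.addEventListener("hashchange", loadFromHash);
window.addEventListener("DOMContentLoaded", loadFromHash);
```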

There are pros and cons: you have to consider search engines (see http://code.google.com/web/ajaxcrawling/docs/getting-started.html), people with JavaScript disabled, and do lots of testing.

In the end, probably the biggest argument against them is that users are impatient: having to wait for the page to load, and then wait again for the content to load, gets old. In my opinion, the best practice is to load the main site, navigation, and primary content in one go (per request) and save the AJAX for the non-essential extras. This works with the idea of progressive enhancement and mixes the best of both approaches.
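
A sketch of that progressive-enhancement compromise: serve real pages with real links, and let JavaScript intercept clicks to fetch only the content area, falling back to normal navigation if anything fails. The "data-ajax" attribute and the "#content" container are assumptions for the example.

```typescript
// Progressive enhancement sketch: ordinary links keep working without JavaScript;
// with JavaScript, clicks on links marked data-ajax fetch only the content area.
// "data-ajax" and "#content" are assumptions made for this example.

document.addEventListener("click", async (event) => {
  const target = event.target as Element | null;
  const link = target?.closest<HTMLAnchorElement>("a[data-ajax]");
  if (!link) return;

  event.preventDefault();                         // enhanced path: stay on this page
  try {
    const response = await fetch(link.href, { headers: { "X-Requested-With": "fetch" } });
    const container = document.querySelector<HTMLElement>("#content");
    if (!container || !response.ok) throw new Error("fragment load failed");
    container.innerHTML = await response.text();
  } catch {
    window.location.href = link.href;             // fall back to a normal page load
  }
});
```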


Because it probably just isn't necessary.
Loading basic HTML documents is simple and it works. Introducing Ajax adds another layer of process for browsers, more code and maintenance for the JavaScript, back-end handling of the content, weird hashbang URLs, and so on. Sometimes that can be justified, sometimes not. It could (could) save you some server resources, but will that be enough to offset the upkeep? You have to evaluate that per project.

As an example, when Twitter got its latest redesign, they took the view that it wasn't just a web page but an application, and the whole thing is heavily Ajax-based, even though most of what it does could be handled with regular page requests. One of the biggest problems, which is less of an issue now, was arriving at the site and being greeted with a blank page because something in the Ajax had gone wrong.




In practice, building a "fully AJAX" website is a lot of work, especially for large websites that are very complicated. Some websites that try to do this are Google and Facebook, but even they don't do it perfectly.

Common problems include navigation (i.e., the back and forward buttons) and bookmarking, but there are many other bugs that most developers would rather not have to deal with. And it basically means creating two versions of the website, one for JavaScript users and one for non-JavaScript users (plus workarounds for all the browsers with poor AJAX support).



Yes, it should be, but it should be the other way around.

The general parts of the page should be sent over HTTP. Then a small Ajax control (or better yet, websockets) loads the dynamic content asynchronously.

Of course, you must first determine whether Javascript is enabled (via cookie) and only use this method if it is enabled.

So you need the normal full HTTP route and then a websockets route. This will require code duplication unless you are using a tool like node.js.
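
A sketch of the detection step described above: the first response is always full HTML, and a tiny script sets a cookie so the server knows later requests come from a JavaScript-capable browser and can use the lighter dynamic route. The "js" cookie, the "/live" WebSocket endpoint, and the "#content" container are assumptions for this sketch.

```typescript
// Sketch only: mark this browser as JavaScript-capable so the server can choose
// between the full HTTP route and the dynamic route on later requests.
// The "js" cookie, "/live" endpoint, and "#content" container are assumptions.

document.cookie = "js=1; path=/; max-age=31536000";

// Optionally open the asynchronous channel (WebSockets) for dynamic content,
// while the plain HTTP route remains available as the fallback.
const socket = new WebSocket(`wss://${window.location.host}/live`);
socket.addEventListener("message", (event) => {
  const container = document.querySelector<HTMLElement>("#content");
  if (container) container.innerHTML = event.data as string;
});
```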

Most people "think" that it's just not worth the extra effort. And sometimes it isn't.

Apart from that, a lot of people do Ajax pages badly. They decide they don't need a non-JavaScript version, and they end up with all those weird hashbang URLs and broken back/forward buttons. Getting Ajax right requires HTML5 history (IE9 doesn't have it). It also requires the developer to build two complete versions of your website.
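
For what "getting it right" with HTML5 history looks like, here is a minimal sketch: pushState keeps real URLs (no hashbangs) and popstate restores content when the user presses back or forward. The "#content" container and the "/fragments/..." endpoint are assumptions.

```typescript
// Minimal sketch of the HTML5 History API approach: real URLs via pushState,
// and popstate handling so back/forward keep working.
// "#content" and "/fragments/..." are assumptions made for this example.

async function showPage(path: string, push: boolean): Promise<void> {
  const response = await fetch(`/fragments${path}`);
  const container = document.querySelector<HTMLElement>("#content");
  if (!container || !response.ok) {
    window.location.href = path;               // fall back to a normal navigation
    return;
  }
  container.innerHTML = await response.text();
  if (push) {
    history.pushState({ path }, "", path);     // real URL, no hashbang
  }
}

// Restore content when the user navigates with the back/forward buttons.
window.addEventListener("popstate", (event) => {
  const state = event.state as { path?: string } | null;
  if (state?.path) showPage(state.path, false);
});
```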


Since you indicated that the site would degrade gracefully for visitors with JavaScript disabled, I only see two real problems (and one potential problem) that might arise.

Bad for accessibility

Screen readers and other assistive technologies are often affected by dynamic DOM changes. They process and read the page in a linear fashion, and changing the contents of the page after it loads may not be handled correctly.

There may be techniques to work around this, but I haven't looked into them too thoroughly.
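
One workaround that does exist (the answer doesn't go into specifics, so take this as an aside) is to mark the container that receives the AJAX'd content as an ARIA live region, so screen readers announce the change instead of missing it. The "#content" selector is an assumption.

```typescript
// Mark the dynamically updated container as an ARIA live region so assistive
// technologies announce content that changes after the initial load.
// "#content" is an assumption made for this example.

const liveContainer = document.querySelector<HTMLElement>("#content");
if (liveContainer) {
  // "polite" waits for the screen reader to finish speaking before announcing.
  liveContainer.setAttribute("aria-live", "polite");
  liveContainer.setAttribute("aria-atomic", "true"); // read the whole region, not just the changed node
}
```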

Increased complexity

Maintaining this kind of website can be difficult. For example, if you build a new layout and change the ID of the content area that your AJAX links replace, it can break your navigation scheme in rather confusing ways.

This kind of AJAX behaviour would also make any traffic analysis you perform more difficult. Google Analytics would not register these AJAX loads properly without a manual call from your code.
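
As one concrete illustration (the exact call depends on which Analytics version the site uses; this shows the classic asynchronous ga.js style, which the answer does not specify), an AJAX-driven navigation has to be reported by hand or it never appears in the reports.

```typescript
// Illustration only: report an AJAX-driven "page view" manually with the classic
// asynchronous ga.js tracker. The exact API depends on the Analytics version in use.

declare const _gaq: Array<[string, string?]>;   // queue provided by the ga.js snippet

function trackAjaxNavigation(virtualPath: string): void {
  _gaq.push(["_trackPageview", virtualPath]);   // record the virtual page view
}

// Example: call this after swapping in the new content.
trackAjaxNavigation("/about");
```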

As you make your site more complex, it also raises the bar for new developers. Anyone who works on the site probably needs to be made aware of how this behavior affects page loading.

Possible: Slower loading of the page on the first visit

Depending on how you structure things, the AJAX code that retrieves the page content may not be able to run until the document has fully loaded. Only after your visitor has downloaded the entire page, then the JavaScript (if it's an external file), the browser has rendered it, and the content has been fetched via AJAX will the page content be displayed.

Each subsequent link click would be faster, but fetching the first page a user visits would actually take longer than a static version.

The reason I flagged this only as a possible problem is that you can always send the first page statically (since you already have the static version as a fallback) and then use AJAX for subsequent links.


For what it's worth, this doesn't sound like a terrible idea to me - especially for bandwidth-sensitive applications like mobile sites. However, you would have to carefully weigh the cons to make sure it is worth it in your case.


Having Ajax elements on a page is fine if you have a small user base, but once you get more traffic, you'll want a more static approach to reduce the strain on resources.

Example: suppose you have 200 people trying to access a page every second, and your Ajax calls involve about 7 database queries. That's 1,400 database hits per second, which can easily bog down a website.

A website designed for high traffic should have static, outward-facing pages so that content can be served statically. This is done with a server-side script that runs every second to regenerate the static page, which is what gets fetched for the end user. That way you've reduced your load from 1,400 database calls per second to 7.
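
A sketch of that regeneration idea, written for Node.js: runQueries() stands in for the roughly 7 database queries (it is an assumption, as is the output path); the point is that it runs once per second no matter how many visitors hit the resulting static file.

```typescript
// Sketch: regenerate a static page once per second so visitors never hit the
// database directly. runQueries() and the output path are assumptions.

import { writeFile } from "node:fs/promises";

// Placeholder for the real database work (~7 queries in the example above).
async function runQueries(): Promise<string> {
  return `<ul><li>Generated at ${new Date().toISOString()}</li></ul>`;
}

async function rebuildStaticPage(): Promise<void> {
  const html = await runQueries();
  // Visitors are served this file directly; the database is no longer hit per request.
  await writeFile("public/content.html", html, "utf8");
}

// Run once per second, as described above.
setInterval(() => {
  rebuildStaticPage().catch((err) => console.error("rebuild failed:", err));
}, 1000);
```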

It's an SOA approach to building websites.


