ViewState: SEO killer or simply a hidden menace?
We have a client whose site runs on asp.net – Microsoft’s proprietary web framework – and in a recent audit of their website we discovered that many of the pages were of massive weight, although on the surface there doesn’t seem to be much there to make up such huge file sizes.
Looking directly at the code – which you can do by selecting “View Source” from your browser menu, or simply by pressing “Control-U” on a PC or “Command-U” on a Mac – we discovered a huge block of apparently random text: almost 193,000 characters in total. This was Base 64, an encoding used to represent binary data in ASCII format so that it can be safely stored and transferred over text-based channels. In fact, Base 64 is often used by hackers to embed code into a website in the hope that it is not picked up by website security scanners.
In total, this block of characters contributed almost 200Kb to the weight of the HTML file, or around one third of the code on the page. Since the ideal size of a web page is said to be less than 100Kb (a figure quoted in Google’s old “Best Practices” guide), that poses a real problem.
What is worse, this chunk of “useless” code was placed towards the top of the HTML, and we all know that “search engines only spider the first 10% of any page”.
Readers with experience of asp.net will by now have realised that we’re talking about ViewState here.
There are other issues too. Some sloppier programmers have taken to storing sensitive or private information in ViewState, but since Base 64 is an encoding rather than encryption (there are many Base 64 decoders available online), the data, if intercepted, can easily be read.
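To see just how little protection Base 64 offers, here is a quick sketch in Python – the principle is identical whatever produced the string, and the “sensitive” value here is purely hypothetical:

```python
import base64

# Base 64 is an encoding, not encryption: anyone who intercepts the
# string can reverse it with a single library call.
secret = "user=jsmith; role=admin"  # hypothetical "sensitive" value

# Encode to Base 64, as ViewState serialisation effectively does...
encoded = base64.b64encode(secret.encode("ascii")).decode("ascii")

# ...and decode it straight back, no key required.
decoded = base64.b64decode(encoded).decode("ascii")

print(encoded)            # opaque-looking, but not secret
print(decoded == secret)  # True – the round trip loses nothing
```

The point is simply that the scrambled appearance of the block is cosmetic; nothing about it is protected.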
But the problem for discussion here is: Does a huge hidden ViewState affect SEO?
Certainly, good SEOs will always advise that …
- Page sizes should be kept as small as possible (and preferably less than 100Kb) to speed download
- All the vital information on the page, including the spiderable text and the primary keywords, should feature as far up the page as possible.
On this basis, large ViewState fields might be said to break both rules; however, as SEOs we soon learn that all rules are made to be broken.
Firstly, the idea that Google and its rivals can’t get beyond the first 100Kb of a page is a bit of a myth, and we could see as much by examining the text of the page in question: passages of text well below the massive ViewState field were there in the Google index. This is especially true of hidden fields like ViewState, which the bots recognise and skip over anyway.
In other words, even though ViewState did put a 200Kb block in the way of the good stuff, GoogleBot simply skipped over it, allowing a good scout around even before hitting the mythical 100Kb limit.
In fact, Google Webmaster Trends Analyst John Mueller says he can’t remember any crawling, indexing, or ranking issues with regard to ViewState on asp.net sites: “We can crawl pages a few megabytes large, so even large ViewState values generally wouldn’t block that”.
Steps to Take
There is a real problem, however, and that is file size. With more and more of the world browsing on mobile devices – often with limited bandwidth and download capacity – huge page sizes are bad news. Anything that can be done to reduce page size, and thereby make downloads faster and more reliable, is worth doing. Google has explicitly stated that download speed is a ranking factor.
For asp.net developers, that should mean switching off ViewState wherever possible – on a per-control, per-page, or server-wide basis – using the Control.ViewStateMode property (there’s a full technical description on the Microsoft Developer Network: http://msdn.microsoft.com/en-us/library/system.web.ui.control.viewstatemode.aspx). This is now what the client has done, based on our recommendations.
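As a rough illustration of what that switch-off looks like in practice – a minimal sketch only, with a hypothetical control ID – ViewStateMode can be set in the page directive and then selectively re-enabled on the few controls that genuinely need it:

```aspx
<%-- Disable ViewState for the whole page via the @ Page directive --%>
<%@ Page Language="C#" ViewStateMode="Disabled" %>

<%-- ...then re-enable it only for a control that actually needs it
     (the GridView and its ID here are hypothetical examples) --%>
<asp:GridView ID="OrdersGrid" runat="server" ViewStateMode="Enabled" />
```

Because ViewStateMode is inherited down the control tree, disabling it at page level and opting back in per control keeps the hidden field as small as possible.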
Yet if you believe ViewState is absolutely essential – and I really can’t think why – you should at the very least move it to the bottom of the code and try to reduce its size and scope.