So you've finally finished your next masterpiece of a website. Everything is tested and working great. The user interface is immaculate and the design is truly something to behold. Time to push it live and start rolling in the money, right? Well, there is one more user demographic that you still need to satisfy - the search engines! Unless, of course, you don't *need* any of that free targeted search traffic??
One of the strengths of ASP.NET development with Visual Studio is the relative ease with which you can create a functional dynamic site. At the same time, one of its weaknesses is the relative ease with which you can create a functional dynamic site that is confusing, if not completely unusable, to a search engine robot.
Here's a checklist of five common mistakes that the ASP.NET ViewState/postback model of development makes far too easy for unsuspecting developers to make:
1. Overuse of Button Controls
It seems obvious, but when linking between pages, use a plain text link or HyperLink control whenever possible. Button and LinkButton controls navigate via a JavaScript form postback, which search engine robots will not follow, so any page reachable only through them is effectively invisible to the crawler.
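To make the difference concrete, here's a minimal markup sketch (control IDs and URLs are illustrative):

```aspx
<%-- Crawlable: a HyperLink renders as a plain <a href="..."> that robots can follow --%>
<asp:HyperLink ID="ProductLink" runat="server"
    NavigateUrl="~/products/blue-widget.aspx">Blue Widget</asp:HyperLink>

<%-- Not crawlable: a LinkButton renders as a javascript:__doPostBack(...) link --%>
<asp:LinkButton ID="ProductButton" runat="server"
    OnClick="ProductButton_Click">Blue Widget</asp:LinkButton>
```

View the rendered source of your pages: if your navigation is a wall of `__doPostBack` calls, the robot has nowhere to go.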
2. Duplicate Page Titles
With any dynamically generated site, it can be difficult to generate unique page titles for each and every page, but it really is important. If you have a quality site, then the search engines are working hard to drive traffic to your site. After all, that is their core business - to provide links to the best resources on whatever the searcher is looking for.
So you need to make it easy for the search engines to figure out exactly what your pages are about, and the page title is an important part of that. Not only that, but once the search engine does rank your page highly, the title is the primary text that searchers will be seeing and using to determine whether to click on your listing or not!
On dynamically generated pages, try to use a keyword-rich page title, such as the full name of the product on a product page, for best results. If you don't have an appropriate field, give users the ability to specify their own page title for each item being displayed. It's worth their time and effort.
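In practice this can be as simple as setting `Page.Title` (available from ASP.NET 2.0 onward) in the code-behind. A minimal sketch, where `GetProduct` and the `Product` class are hypothetical stand-ins for your own data access:

```csharp
// Code-behind for a hypothetical product page.
protected void Page_Load(object sender, EventArgs e)
{
    // GetProduct is an illustrative lookup method, not part of ASP.NET.
    Product product = GetProduct(Request.QueryString["id"]);

    // A unique, keyword-rich title per page: "Blue Widget - Example Store"
    Page.Title = product.Name + " - Example Store";
}
```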
3. Duplicate Meta Descriptions
Much like the duplicate page title issue, the meta description tag should not be duplicated across your pages either. Like the page title, this text is used (although to a lesser extent) by the search engines to determine the content of your page and also appears underneath your title in the search engine listing. Depending on the number of pages of dynamic content on your site, it might not be practical to add multi-sentence descriptions for every single page. In this case, simply remove the meta description tag altogether. The major search engines are pretty good at improvising when the description tag is missing by displaying portions of the page body that match the user's search keywords instead.
In my experience, the SEO benefit of adding a keyword-rich meta description is not enough to warrant spending a great deal of time creating custom descriptions for sites with 100+ pages.
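If you do have a real description for some pages, you can emit the tag conditionally with the `HtmlMeta` control (ASP.NET 2.0+). A sketch, where `GetProductDescription` is a hypothetical method of your own:

```csharp
// Only emit a meta description when we actually have a custom one.
protected void Page_Load(object sender, EventArgs e)
{
    string description = GetProductDescription(); // hypothetical; may return null

    if (!string.IsNullOrEmpty(description))
    {
        HtmlMeta meta = new HtmlMeta();
        meta.Name = "description";
        meta.Content = description;
        Page.Header.Controls.Add(meta);
    }
    // Otherwise, emit no tag at all and let the search engine improvise
    // a snippet from the page body.
}
```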
4. State-Dependent Pages
Search engines rely heavily on the idea that every unique page has its own unique URL. That means that if you are basing a page's content on session variables or ViewState parameters, you are probably going to have problems getting that content indexed. Once a search engine finds a URL, it will continue spidering that page, but you can bet that the robot will not navigate through your site again to get there. So you need to make sure that any content you want indexed by search engines can be accessed by simply opening your browser and typing in the URL of that content. That means unique URLs for every product in your ecommerce store, every category in your directory, and so on.
My recommendation is to use viewstate rarely and session variables almost never.
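The rule of thumb is that the URL alone should determine the content. A small before/after sketch (page and method names are illustrative):

```csharp
// Sketch: drive page content from the URL, not from Session or ViewState,
// so a robot (or a user pasting the link) sees the same content cold.
protected void Page_Load(object sender, EventArgs e)
{
    // Good: /category.aspx?name=widgets is a unique, indexable URL.
    string category = Request.QueryString["name"];

    // Bad: content that only exists because some earlier page set a
    // session variable can never be reached by a robot requesting the URL.
    // string category = (string)Session["CurrentCategory"];

    BindCategory(category); // hypothetical data-binding method
}
```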
5. Duplicate Content When Rewriting URLs With ASP.NET
When you rewrite a URL, the browser displays a keyword-rich URL, but internally the page being served still has the ugly URL with the querystring parameters. In technical terms, the Request.RawUrl value might be something like:

`/products/blue-widget.aspx`

but the Request.Url value would still be something like:

`/product.aspx?id=42`
All of that is just fine, but a problem can arise if you have a Button or LinkButton control that posts back on that page. By default, the button control will post back to the Request.Url value, causing the URL to change after the postback. This is a problem if some users end up linking to your 'ugly' URLs, because the search engines will find those links and spider them. To a search engine, the two different URLs signify two different pages, and both will be indexed separately, causing a pretty ugly duplicate content problem.
Thankfully, starting with .NET 2.0, there is a PostBackUrl property on the button controls. Set this property to the Request.RawUrl value and your button will post back to the 'pretty' URL.
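A one-line sketch of the fix (the button ID is illustrative):

```csharp
// Keep postbacks on the rewritten 'pretty' URL. Request.RawUrl holds the
// URL the visitor actually requested, even after an internal rewrite.
protected void Page_Load(object sender, EventArgs e)
{
    SubmitButton.PostBackUrl = Request.RawUrl;
}
```

With this in place, both the initial request and every postback stay on the same keyword-rich URL, so there is only one version of the page for the search engines to index.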
What do you do in your daily ASP.NET development to ensure your sites are search engine friendly? Post a comment about it, I'd love to hear it. :-)