Dynamic pages are fantastic. They mean less work for web designers, who can create a single webpage and change its content through a database connection. They load faster for visitors after the first visit, as most web browsers cache the graphical elements and only the content changes. They allow for great flexibility in content, such as shopping sites, blogs and more.
However, before you get excited about dynamic pages and race off to learn PHP, there is a downside to these web pages: Search engine bots cannot see them. And if they can’t be seen by the bots, they can’t be indexed by the engines.
The purpose of this article is to show you how you can index your dynamic pages with the search engines, and some of the pitfalls that you can avoid when submitting these to Google et al.
What is a Dynamic Page?
As mentioned above, a dynamic page is a webpage that dynamically pulls its content from an external source – usually a web database such as MySQL.
This has many benefits, not least of all for the designer, who only needs to create one webpage design. If he wishes to update his site, he need only change the one page and all pages presented to the end user will be changed too (as they all use the same template and change content dynamically, of course).
So you can use dynamic pages any time that you have information that is categorized by date, or if you run an ecommerce shopping site. Dynamic pages also work across hosting providers, so there is no need to worry about cross-browser support or about moving your site to another host (migrating the database is another issue).
You can spot dynamic pages as you surf around the web by looking at their URLs. Dynamic pages contain the character “?” or one of “#&*!%” within their link, such as www.yourdomain.com/?page=10
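As a quick illustration, this short Python sketch (the function name is my own invention) flags URLs that carry a query string or fragment – the usual tell-tale sign of a dynamic page:

```python
from urllib.parse import urlparse

def looks_dynamic(url):
    """Return True when a URL shows the usual signs of a
    dynamically generated page: a query string or a fragment."""
    parts = urlparse(url)
    return bool(parts.query) or bool(parts.fragment)

# A query string after "?" marks this page as dynamic.
print(looks_dynamic("http://www.yourdomain.com/?page=10"))   # True
# A plain path with no parameters looks static.
print(looks_dynamic("http://www.yourdomain.com/about.html")) # False
```

This is only a heuristic, of course – a server can serve database-driven content from a clean-looking URL, as the rewriting section below shows.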
So What is the Issue?
The issue is the characters used in the URL. They instruct the web server to fetch the content from the web database. So, in essence, the content isn’t retrieved unless the web page asks for it.
Googlebot, and other bots for that matter, cannot send these commands to the web server, and so cannot retrieve the content. Remember: if they cannot see it, they cannot index it.
Additionally, some black hat SEO people and spammers use the same characters to trap search engine spiders. When you consider both of these reasons, it’s not hard to see why the spiders are both unable and unwilling (if you can say such a thing about a program) to index your dynamic pages.
So I’m Stuck Then?
Not at all. As webmasters, we have to be a little more imaginative in dealing with the issue of non-indexing and there are plenty of white hat SEO options available to you.
Make regular static pages that link to your dynamic pages
This one is really simple. On areas of your site that are static (and therefore can be indexed), add a link to your dynamic page.
Google and the others will be able to access the page via this link, and index the result. Also, please remember to optimise your anchor text too. You can read how to optimise anchor text in another article.
Make blogs or blog posts that link to your dynamic pages
Similar to the previous suggestion, but involves linking to one dynamic page from another. The difference is that, as well as linking from your blog/post, your blog post would also be submitted to various websites or feeds, such as Digg, DropJack etc.
As these sites are heavily crawled by Google, they will also index your blog articles. Once these are indexed, their “child” links will also be picked up too.
Optimize any dynamic pages that need to be indexed
Just because your page is dynamic, this is no reason to ignore the importance of on-the-page SEO.
Look at your metatags, including the title, and optimise wherever possible. Optimise the content itself. Good content attracts both search engines and visitors. Check your keywords and their densities. In short, do everything that you would do if the page were static.
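For instance, the dynamically generated head section of each page can still carry an optimised title and meta tags. A sketch of what that might look like (the titles and descriptions below are placeholders of my own):

```html
<head>
  <!-- Title and meta values are illustrative placeholders -->
  <title>Blue Widgets | Reviews and Prices</title>
  <meta name="description" content="Independent reviews and prices for blue widgets.">
  <meta name="keywords" content="blue widgets, widget reviews, buy widgets">
</head>
```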
Articles and content pages
If your dynamic pages are articles, such as a blog post, submit the article to webfeeds and sites, such as Digg.
As mentioned earlier, these sites are heavily crawled and indexed, and, even with a link to a dynamic page, Google will index your page with ease.
Link to your dynamic pages with a table of contents
Create a single page sitemap of your dynamic pages. Do this for your web visitors – not for the search engines – and optimise the anchor text.
The benefit of this is that the static sitemap page will be indexed quickly, depending on the directory depth of the page, and, likewise, the pages it links to will also be indexed.
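A minimal hand-made sitemap page might look like the sketch below (the page titles and query strings are invented for illustration). Note the descriptive anchor text on each link:

```html
<!-- sitemap.html: a plain table of contents linking to dynamic pages -->
<ul>
  <li><a href="/?page=10">Blue widget reviews</a></li>
  <li><a href="/?page=11">Red widget buying guide</a></li>
  <li><a href="/?page=12">Widget maintenance tips</a></li>
</ul>
```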
Rewrite Your Page URLs with .htaccess
This is an extremely powerful, but potentially complex, means of dealing with the issue of dynamic page indexing.
Basically, a rewrite can change your URL from yourdomain.com/?page=10 to yourdomain.com/dir/page10/. This works with Apache servers, and is well worth considering, if not implementing.
This rewrite rule lets you publish clean links like yourdomain.com/posts/page1 while the server internally serves yourdomain.com/posts.php?page=1

RewriteEngine On
RewriteRule ^posts/page([0-9]+)/?$ /posts.php?page=$1 [NC,L]
Of course, this assumes that you are using PHP as your dynamic page language.
If you are interested in this option, the Apache documentation covers mod_rewrite in much more detail.
So if you really need dynamic pages, remember to set them up so that the Googlebots can see and record all the information on your site. As illustrated, this is not an impossible task. It is just a question of working within the Google rules to help those bots read all the information on your site.