Which method you choose can be a big factor in how you build a site. There are various ways to create these “friendly” URLs, such as mod_rewrite, URL rewriting in application code, or dynamic 404 handlers, but the real question is: why? Why do it in the first place? Are there any tangible reasons to go to all the effort of doing it?
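As a quick illustration of the first of those techniques, an Apache mod_rewrite rule can map a friendly URL onto the dynamic script that actually serves it. This is only a sketch; the path, script name and parameter are hypothetical:

```apache
# Hypothetical example: a request for /products/blue-widget
# is served internally by product.php?slug=blue-widget.
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]
```

The visitor (and the search engine) only ever sees the friendly form; the query string stays behind the scenes.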

After my discussion I decided to do a little research; here are my findings:

Search Engines

Very often this type of URL is called “search engine friendly”; however, this is a bit misleading. It used to be true that Google kept two separate indexes, one for dynamic URLs and one for static. I also know that Google sometimes had issues indexing dynamic pages (this is still true of some engines). Together these led to a definite advantage for static URLs – THIS IS NO LONGER TRUE! In fact it hasn’t been the case for at least three years.

However… Google DOES take the keywords in the URL into account, to some degree. Matt Cutts of Google has posted a video saying that having keywords in your URL “does help a little bit”. He doesn’t actually say how, though (i.e. whether they directly affect rankings or whether the benefit is an indirect, second-order effect). He also says not to obsess about them or try to stuff your URLs with keywords. So the conclusion remains the same: they don’t make “much” difference to rankings… but still use them!

Here is what the Google help system currently advises webmasters and site owners:

A site’s URL structure should be as simple as possible. Consider organizing your content so that URLs are constructed logically and in a manner that is most intelligible to humans (when possible, readable words rather than long ID numbers). For example, if you’re searching for information about aviation, a URL made of readable words will help you decide whether to click that link, whereas a URL consisting of a long string of IDs and parameters is much less appealing to users.

Another important issue with search ranking is anchor text. Any link to your site from a third-party site is counted by Google as a vote for your site (PageRank). This vote carries more importance if the keyword text in the link is related to the content on the landing page. So you can see that a URL built from keywords is much more beneficial, when used as link text, than one made up only of query-string parameters.


Marketing

At first this may not seem an obvious issue, as it’s usually domain names, rather than URLs, that people think of when they talk about internet marketing. However, anyone with even a rudimentary understanding of marketing psychology will tell you that if you do a search for “grand hotel Brighton”, you are more likely to get a click-through if your URL is listed as something readable, say example.com/hotels/grand-hotel-brighton (an illustrative address), than as something like example.com/index.php?id=73&sid=9f2.

It is also worth remembering that Google makes any of the search keywords bold if they appear in the URL of a result.


Usability

A fairly obvious one, really: a URL that can be easily understood by a human being is far more likely to be remembered, copied and, of course, clicked than a dynamic URL. On the whole, dynamic URLs are longer, more easily truncated and generally more prone to human error than static ones.

When an internet user clicks on a link (any link) they have an expectation of what they will find when the page loads. This expectation is built by the context of the page the link was on, the text used around the link and, of course, the link text itself. Obviously, having relevant keywords in the link will not only help people decide whether a link is worth clicking, it will also help build the expectation of what will follow.

The ability of a user to “guess” where they might find what they are looking for can prove extremely useful. Take, for example, the way the BBC organises its web site: addresses such as bbc.co.uk/news and bbc.co.uk/weather are guessable and very intuitive.


Security

The query string that follows the question mark (?) in a dirty URL is often modified by hackers in an attempt to perform a front-door attack on a web application. The very file extensions used in dynamic URLs, such as .asp, .jsp, .pl and so on, also give away valuable information about the implementation of a dynamic web site.
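The usual defence, whichever URL style you use, is to treat every query-string value as untrusted input and validate it against a whitelist. A minimal sketch in Python (the parameter name and section names are hypothetical, not from any real application):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical whitelist of sections this application actually serves.
ALLOWED_SECTIONS = {"news", "sport", "weather"}

def safe_section(url):
    """Extract the 'section' query parameter, rejecting anything off-whitelist."""
    params = parse_qs(urlparse(url).query)
    section = params.get("section", ["news"])[0]
    # Never trust the raw value: fall back to a safe default.
    return section if section in ALLOWED_SECTIONS else "news"

print(safe_section("http://example.com/index.php?section=sport"))   # sport
print(safe_section("http://example.com/index.php?section=../etc"))  # news
```

The same check belongs in the application regardless of whether the parameter arrives via a visible query string or a rewritten friendly path.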

Abstraction and Maintainability

Because dirty URLs generally expose the technology used (via the file extension) and the parameters used (via the query string), they do not promote abstraction. Instead of hiding such implementation details, they expose the underlying “wiring” of a site. As a result, changing from one technology or platform to another is a difficult and painful process filled with the potential for broken links and numerous required redirects.
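One way to keep that wiring hidden is to route friendly paths through a single lookup table, so the underlying handler (and the technology behind it) can change without the public URL ever changing. A minimal, hypothetical sketch in Python; the route names and handler are made up for illustration:

```python
# Hypothetical internal handler; could be swapped for any implementation
# without the public URL changing.
def show_hotel(slug):
    return f"Hotel page for {slug}"

# Public path segments on the left, internal handlers on the right.
ROUTES = {
    "hotels": show_hotel,
}

def dispatch(path):
    """Map a friendly URL path like /hotels/grand-brighton to a handler."""
    segments = [s for s in path.strip("/").split("/") if s]
    if segments and segments[0] in ROUTES:
        return ROUTES[segments[0]]("/".join(segments[1:]))
    return "404 Not Found"

print(dispatch("/hotels/grand-brighton"))  # Hotel page for grand-brighton
```

Because the mapping lives in one place, migrating from one platform to another means updating the route table, not redirecting every published URL.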

So… to Recap

Cons of Static URLs

  • It’s quite hard to correctly create and maintain rewrites that change dynamic URLs to static-looking URLs. You might mess this up, in which case your users and search engines will struggle to find content properly on your site.
  • It’s much safer to serve the original dynamic URL and let Google handle the problem of detecting and avoiding problematic parameters.
  • The possibility of duplicate content exists, with the same page accessible at both a dynamic and a static URL.

Cons of Dynamic URLs

  • Lower click-through rate in the search results, in emails, and on forums/blogs.
  • Greater chance of being truncated when copied and pasted, resulting in a 404 or other error.
  • Lower keyword relevance and keyword prominence.
  • Nearly impossible to write down and share on stationery or over the phone.
  • Almost impossible to remember.
  • Do not typically create an accurate expectation of what the user will see prior to arriving on the site.
  • Not easily usable in branding or print.
  • Won’t always carry optimised anchor text when used as the link text.

Pros of Dynamic URLs

  • They are quicker to develop, and using them has been, and so far still is, accepted practice in web development.
  • Google says it can now crawl and index them effectively.
  • They reduce the possibility of creating phantom or non-existent content.
  • They can discourage unwanted reuse when the intent is to prevent the user from typing a URL, remembering it or bookmarking it, such as under an access control policy.
  • They can capture complex state, such as a specific search result, in a single portable link.

Pros of Static URLs

  • Higher click-through rates in the SERPs, emails, web pages, etc.
  • Higher keyword prominence and relevancy.
  • Easier to copy, paste and share on or offline.
  • Easy to remember and usable in branding or offline media.
  • Provides guessable entry points.
  • Creates an accurate expectation from users.
  • Can be made to contain anchor text to help page rank.
  • All four of the major search engines (and most of the minor ones) generally handle static URLs more easily than dynamic ones, particularly if there are multiple parameters.

To conclude, it seems that even though Google says it has no problem with dynamic URLs, it still takes keyword-rich URLs into account when ranking sites (if only moderately). Couple this with the benefits for marketing psychology and usability and you end up with a very strong argument for friendly URLs.