In a default phpBB3 installation (without an SEO mod), the same thread can be accessed through many different URLs. For example, the thread Favourite Animorphs fanfiction can be reached through all of these URLs:
- http://animorphsfanforum.com/viewtopic.php?f=5&t=60
- http://animorphsfanforum.com/viewtopic.php?f=5&p=659
- http://animorphsfanforum.com/viewtopic.php?f=5&t=60&p=659
- http://animorphsfanforum.com/viewtopic.php?f=5&t=60&p=659#p659
- http://animorphsfanforum.com/viewtopic.php?t=60
- http://animorphsfanforum.com/viewtopic.php?p=659
- http://animorphsfanforum.com/viewtopic.php?f=5&t=60&st=0&sk=t&sd=a
- http://animorphsfanforum.com/viewtopic.php?f=5&t=60&st=0&sk=t&sd=a#p659
The aggressive Googlebot is likely to pick up almost all of these links, which can lead to duplicate-content and ranking problems in the SERPs.
One way to resolve this duplicate indexing is to install an SEO mod such as the phpBB-SEO.com SEO mod or Handyman’s SEO mod. But installing and re-applying a mod with every phpBB3 update and/or mod update can get harrowing and burdensome. An easier alternative way to prevent duplicate indexing is to add these lines to your robots.txt:
User-agent: *
Disallow: /viewtopic.php?p=
Disallow: /viewtopic.php?=&p=
Disallow: /viewtopic.php?t=
Disallow: /viewtopic.php?start=
Disallow: /*&view=previous
Disallow: /*&view=next
Disallow: /*&sid=
Disallow: /*&p=
Disallow: /*&sd=a
Disallow: /*&start=0
This forbids all bots that honour robots.txt directives from crawling the redundant URLs. Your threads will then be accessible to bots only through URLs of the form http://www.domain.com/viewtopic.php?f=X&t=X.
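It is worth testing rules like these before deploying them. Googlebot treats `*` in a Disallow path as a wildcard matching any character sequence and otherwise does prefix matching. The sketch below is a simplified matcher written for illustration (it is not Google's actual implementation and ignores `$` end-anchors), checking the directives above against the example URLs:

```python
import re

# The Disallow paths from the robots.txt block above.
DISALLOW_RULES = [
    "/viewtopic.php?p=",
    "/viewtopic.php?=&p=",
    "/viewtopic.php?t=",
    "/viewtopic.php?start=",
    "/*&view=previous",
    "/*&view=next",
    "/*&sid=",
    "/*&p=",
    "/*&sd=a",
    "/*&start=0",
]

def is_blocked(path_and_query: str) -> bool:
    """Return True if any Disallow rule matches the URL path + query.

    Simplified Googlebot semantics: each rule is an anchored prefix,
    with '*' standing for any sequence of characters.
    """
    for rule in DISALLOW_RULES:
        regex = "^" + re.escape(rule).replace(r"\*", ".*")
        if re.match(regex, path_and_query):
            return True
    return False

print(is_blocked("/viewtopic.php?f=5&t=60"))   # canonical form: not blocked
print(is_blocked("/viewtopic.php?f=5&p=659"))  # blocked by /*&p=
```

Running this against the URL list at the top shows that every variant except the canonical /viewtopic.php?f=X&t=X form is blocked.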
Note (1): If you display Google AdSense ads on your forum pages, you might need to allow the Google AdSense bot (Mediapartners-Google) to access the ‘forbidden’ URLs, so that it can crawl the pages and display relevant ads. So, you need to add these extra lines to your robots.txt:
User-agent: Mediapartners-Google
Disallow:
This will allow the Mediapartners-Google bot to access the forbidden URLs.
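To sanity-check the two user-agent groups, Python's standard-library urllib.robotparser can parse the combined file. Note that it performs plain prefix matching and does not understand `*` wildcards, so this sketch (a minimal test of my own, reusing the forum URL from the examples above) includes only a prefix rule:

```python
from urllib import robotparser

# Combined robots.txt: block everyone from one redundant URL form,
# but let the AdSense crawler in via an empty Disallow.
ROBOTS_TXT = """\
User-agent: *
Disallow: /viewtopic.php?p=

User-agent: Mediapartners-Google
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

url = "http://animorphsfanforum.com/viewtopic.php?p=659"
print(rp.can_fetch("Googlebot", url))             # blocked by the * group
print(rp.can_fetch("Mediapartners-Google", url))  # allowed: empty Disallow
```

An empty `Disallow:` means "disallow nothing", so the Mediapartners-Google group overrides the catch-all group for that bot.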
Note (2): If your forum lives in a subdirectory (say /forum/) rather than at the root, prefix the paths in the robots.txt directives with /forum. It should look like this:
User-agent: *
Disallow: /forum/viewtopic.php?p=
Disallow: /forum/viewtopic.php?=&p=
Disallow: /forum/viewtopic.php?t=
Disallow: /forum/viewtopic.php?start=
Disallow: /forum/*&view=previous
Disallow: /forum/*&view=next
Disallow: /forum/*&sid=
Disallow: /forum/*&p=
Disallow: /forum/*&sd=a
Disallow: /forum/*&start=0
If you face any problems, leave a comment and I’ll look into it.