search engines (General)

by Alex ⌂ @, Thursday, April 19, 2007, 08:37 (5304 days ago) @ BrianC
edited by Alfie, Thursday, April 19, 2007, 10:55

One of the tricks I learned from Google was placing a file named "robots.txt" in the forum folder with a list of the files that should be excluded when the robot crawls the database. In our case the list of exclusions is:
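Such a robots.txt might look something like this (the paths here are only illustrative, not the actual exclusion list from that installation):

```
User-agent: *
Disallow: /forum/login.php
Disallow: /forum/search.php
```

Each `Disallow` line tells well-behaved crawlers to skip that path; an empty `Disallow:` would allow everything.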

That's interesting! What Google dislikes might be the duplicate content caused by the different views and sort options (different URLs serving the same content). In the current version I reduced the query strings for page numbers, categories and thread order in the URLs as much as possible, but a robots "noindex" on all pages except the index page and the opened messages might also be beneficial.
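The "noindex" idea would be a meta tag in the page head rather than a robots.txt entry; a minimal sketch (the condition for which pages get it is up to the forum script) could be:

```html
<!-- emitted on list/sort views, omitted on the index page and opened messages -->
<meta name="robots" content="noindex, follow">
```

"follow" keeps the crawler following links from those views so the actual messages are still discovered, while the duplicate list views themselves stay out of the index.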
