search engines (General)
One of the tricks I learned from Google was placing a file named "robots.txt" in the forum folder with a list of files that the robot should exclude when it crawls the database. In our case the list of exclusions is:
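For illustration, such a file follows the robots.txt convention of one or more `Disallow` rules per user agent. The paths below are hypothetical examples only, not the actual exclusion list mentioned above:

```text
# Hypothetical sketch of a forum robots.txt – real paths are site-specific
User-agent: *
Disallow: /forum/search.php
Disallow: /forum/post.php
Disallow: /forum/profile.php
```

Most well-behaved crawlers (including Google's) fetch `/robots.txt` from the site root before crawling and skip any URL matching a `Disallow` prefix.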
That's interesting! What Google may dislike is the duplicate content caused by the different views and sort orders (different URLs serving the same content). In the current version I reduced the query strings for page numbers, categories and thread order in the URLs as much as possible, but a robots "noindex" directive on all pages except the index page and the opened messages might also be beneficial.
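The "noindex" idea above is usually done with a robots meta tag in each page's head; a minimal sketch, assuming the forum templates can emit it conditionally:

```html
<!-- Emit on every page EXCEPT the index page and opened messages -->
<!-- "noindex, follow": do not index this page, but still follow its links -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt `Disallow`, the page must actually be crawled for the tag to be seen, so the two mechanisms should not be combined for the same URL.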
Alex