Do we have to have a robot.txt - Google (Technics)

by Göran B, Monday, June 22, 2009, 20:28 (5415 days ago) @ Auge

The database will not be indexed. A robot only indexes the content of your site; that could include the forum (and the thread pages) too.

I don't think that's true. All postings in our forum have been indexed by Google and can be found via a Google search.

That is actually causing us a problem. We have a database with 155,000 entries, and when Googlebot starts crawling it, the forum can't be used by other users for about two hours. This happens every day.

We could stop Google from indexing our forum with robots.txt, but the problem is that we would like to have our cake and eat it too: we want our forum to be indexed!
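
For illustration, a robots.txt could look something like the sketch below. The Disallow paths are only placeholders, not our real mlf URLs, and as far as I know Googlebot ignores the Crawl-delay directive, so its crawl rate would have to be lowered in Google Webmaster Tools instead; the delay only helps with other crawlers.

    User-agent: *
    # Ask well-behaved bots to pause between requests.
    # Googlebot ignores this; set its rate in Google Webmaster Tools.
    Crawl-delay: 10
    # Keep expensive or duplicate views out of the crawl while leaving
    # the normal thread pages indexable (placeholder paths).
    Disallow: /forum/search.php
    Disallow: /forum/login.php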

Does anyone have any good advice on how to keep the forum indexed without running into performance problems? Are there any MySQL or server parameters to adjust when running mlf with this many entries and over 100 concurrent users?
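
To illustrate the kind of settings I mean, here is the sort of my.cnf excerpt I have in mind; the names are standard MySQL options, but the values are only placeholders, not recommendations, and they assume the MyISAM tables a default MySQL install uses.

    [mysqld]
    # Placeholder values - size these to the server's RAM rather than
    # copying them blindly.
    key_buffer_size  = 256M   # index cache for MyISAM tables
    query_cache_size = 32M    # cache results of identical SELECT statements
    table_cache      = 512    # table handles kept open between queries
    max_connections  = 150    # headroom for ~100 users plus the crawler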

