We are finding duplicate pages being listed in Google search results, one under http and one under https. Has anyone got a fix for this? As you are all aware, this dilutes the importance of the pages you do want indexed.
See this thread.
Sorry Alfred, the above thread is not helpful. We have over 20,000 https pages indexed by Google search. The files we submit to Google Base are not the problem. Is there any code we can add to block the Googlebot from indexing https pages? We have FULL access to our web server. We installed the mobile plug-in from Vortx; maybe that is the problem.
Recently I set up URL rewrite rules in IIS 7.5 to handle this.
When there is a request for https://www.(yourdomainnamehere).com/robots.txt, we rewrite it to robots_ssl.txt, which contains:
# robots.txt file for https://www.(yourdomainnamehere).com
User-agent: *
Disallow: /*
This appears to have reduced bandwidth a bit, and I am no longer finding any of our pages listed as https in Google.
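For anyone wanting to do the same, here is a minimal sketch of what such a rule can look like in web.config with the IIS URL Rewrite module (the rule name is my own; adjust paths and conditions for your site):

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Serve robots_ssl.txt when robots.txt is requested over HTTPS -->
        <rule name="HTTPS robots.txt" stopProcessing="true">
          <match url="^robots\.txt$" />
          <conditions>
            <add input="{HTTPS}" pattern="^ON$" />
          </conditions>
          <action type="Rewrite" url="robots_ssl.txt" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>

Because this uses a Rewrite rather than a Redirect, the crawler still requests the https robots.txt URL but receives the disallow-everything file, so the https copies should drop out of the index over time.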
Thanks,
Casey
MS 9.3.0.0 - Next stop MS 9.3.1.0