4 May 2010 at 12:34am
I have a client who needs their secure pages to be accessible to Google so they can appear in Google search results. I have written a bit of code that sniffs the UserAgent, and if it is Googlebot, displays the page. When I test the process using the Firefox plugin "User Agent Switcher" I can browse the site as if I were Google. But when I put the pages up on the live server, the pages are not being indexed.
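For illustration, here is a minimal sketch of the kind of user-agent check described above, written in Python (the actual site would presumably do this in PHP/SilverStripe; the function name and UA strings are just examples). Note that this approach trusts the User-Agent header, which is trivially spoofed, which is exactly why the "User Agent Switcher" test works:

```python
def is_googlebot(user_agent: str) -> bool:
    """Naive check: trust the User-Agent header.

    Anyone can send this header, so this is not a secure way
    to restrict content to Google's crawler.
    """
    return "googlebot" in user_agent.lower()

# A real Googlebot UA string looks like this:
crawler_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
browser_ua = "Mozilla/5.0 (Windows NT 10.0; rv:109.0) Gecko/20100101 Firefox/115.0"

print(is_googlebot(crawler_ua))  # True
print(is_googlebot(browser_ua))  # False
```

Google's own guidance is to verify the crawler by reverse DNS lookup of the requesting IP (it should resolve to a `googlebot.com` host) rather than by the header alone.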
I know this isn't SilverStripe-specific, but any help would be greatly appreciated.
Thanks in advance.
4 May 2010 at 3:37pm
...I have a client need to be able to access secure pages by google...
You know that this means the content of those pages will be accessible via Google's cached view, and to anyone who modifies their user-agent string, right?
4 May 2010 at 4:56pm
banal: I'll have a look and see what the go is. Last time I tried looking at that file I got a server timeout error.
hamish: yep, I know.
4 May 2010 at 4:56pm Last edited: 4 May 2010 4:57pm
Heh, ah well, the client wants what the client wants.
5 May 2010 at 10:48am
Now I have a new problem: when I try fetching the sitemap.xml that is auto-generated by SilverStripe, I get this:
XML Parsing Error: no element found
Line Number 1, Column 1:
What am I doing wrong now?
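For what it's worth, "no element found" at line 1, column 1 usually means the parser received an empty document, i.e. the server returned a blank response (for example, a suppressed PHP error or a timeout) instead of the sitemap XML. A quick Python demonstration of the same parser error on an empty body:

```python
import xml.etree.ElementTree as ET

# Simulate a server that returns an empty response body
# instead of the expected sitemap XML.
empty_response_body = ""

try:
    ET.fromstring(empty_response_body)
except ET.ParseError as err:
    # The parser reports the same kind of error the browser shows:
    # "no element found" at the very start of the document.
    print(err)
```

So the thing to check is what the server actually sends back for sitemap.xml (e.g. view the raw response, or check the server error logs), rather than the XML itself.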