


SS doing an SEO no-no... Duplicate content!




15 Posts   4580 Views

Pixel Labs

Community Member, 14 Posts

9 December 2011 at 9:11pm

Edited: 09/12/2011 9:12pm

Hi, my SEO friend has just pointed out to me that SS is creating duplicate content by the way it serves pages.
E.g. www.mysite.com/contact.php and www.mysite.com/contact/

Both of these pages are 'spiderable' by Google and can make their way into the Google index. This is particularly bad when you upgrade an old site to SS: if the previous site used .php files, those would remain in the index and the new SS versions would be added alongside them.
For example:
Google would treat the old www.mysite.com/services.php page as a different page to www.mysite.com/services/
Even though they are the same page in the new SS site.

The answer is to 301 the .php version to the non-PHP version, but I cannot find any leads on doing this. Any ideas?

I don't know if this is a known issue. I've scoured the web for an answer but cannot find any discussion on the topic.

If you don't know about Google's penalisation of duplicate content (it's really important to know), search for the Google Panda update or Google Farmer update.

Cheers.

DesignerX.com.au

Community Member, 107 Posts

9 December 2011 at 9:18pm

Hi,
I never noticed that SS serves 2 versions of each page...
Anyway, Google's penalisation of duplicate content only applies (as far as I know) to duplicate content across different domains; duplicate content on the same website may be considered 'keyword spamming' (after a few months of the site being live).

Anyway, it would be good to know how to redirect the .php URL to the non-.php URL.
Subscribed & waiting for a reply ;)

Mikel

Devlin

Community Member, 344 Posts

9 December 2011 at 10:44pm

Edited: 10/12/2011 5:00am

I don't think this is really an (edit: silverstripe) issue. You can use any suffix you like and append any $_GET variable with ? and & to the URL. Create a proper sitemap for Google, use vanilla links inside the page, and I'm sure you'll have no problem with duplicated content.

Pixel Labs

Community Member, 14 Posts

10 December 2011 at 2:22am

I've just spoken to my SEO friend and he disagrees. He's seen it through fixing duplicate content on our sites: 'in my experience, on-domain duplicate content is an issue.'

That's the problem with SEO. Nobody actually knows anything for certain; you can only go on your experience and what you can test for. For us, this is an issue, particularly because we have the .php versions in the Google index and we want the 'heat' from these pages to 301 to the new pages.

Devlin

Community Member, 344 Posts

10 December 2011 at 4:52am

Edited: 10/12/2011 5:40am

Well, the best place to do the 301 redirect is the init method of the page controller. Search for any suffix in $_SERVER['REQUEST_URI'], strip the parts you don't want, and redirect.

Edit: Or prep your .htaccess file accordingly.

Greg1

Community Member, 28 Posts

12 December 2011 at 11:20am

Rel=Canonical will fix the SEO problems and is easy to implement.
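
For anyone finding this thread later: a canonical link goes in the <head> of the page template. A minimal sketch, assuming a standard Page.ss template ($AbsoluteLink is the SilverStripe template variable for the page's full URL, but check the docs for your version):

```html
<!-- In templates/Page.ss, inside the <head> section -->
<link rel="canonical" href="$AbsoluteLink" />
```

With this in place, both /contact/ and /contact.php report /contact/ as the canonical URL to search engines.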

Pixel Labs

Community Member, 14 Posts

12 December 2011 at 11:56pm

Thanks for the feedback. We've implemented some rel='canonical' links, but my SEO friend informs me that this really needs a proper fix.
We think refining the .htaccess is the correct place to do this, not the page controller (as it's the open-ended .htaccess rule that's causing the issue).
Nobody at our office is a .htaccess guru, so if anyone knows how to tighten up this rule so it doesn't serve pages with an extension (.php, .html, .anything-you-like), we'd be very grateful for the help.

Thanks.
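
A mod_rewrite sketch along these lines might look like the following. This is an assumption-laden example, not a tested SilverStripe rule: it 301s any .php or .html request to the extensionless, trailing-slash URL, and the file-exists condition is there so real files (e.g. sapphire/main.php, assets) are left alone. Test carefully before deploying:

```apache
RewriteEngine On
# Only rewrite when the requested file does not exist on disk,
# so real scripts and assets (e.g. sapphire/main.php) still work.
RewriteCond %{REQUEST_FILENAME} !-f
# 301-redirect e.g. /services.php or /services.html to /services/
RewriteRule ^(.*)\.(php|html?)$ /$1/ [R=301,L]
```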

(deleted)

Community Member, 473 Posts

13 December 2011 at 10:03am

It's easiest to actually do this in your Page_Controller init() method, so that you don't need to handle exceptions (sitemap.xml, *.css, *.js, sapphire/main.php, etc.).

Something like:

public function init() {
	parent::init();
	// If the request URL ends in a file extension (e.g. /contact.php),
	// strip the extension (plus its dot) and 301-redirect to the clean URL.
	if($this->request->getExtension()) {
		$url = $this->request->getURL();
		$url = substr($url, 0, strlen($url) - (strlen($this->request->getExtension()) + 1));
		return $this->redirect($url, 301);
	}
}
