
General Questions /


robots.txt - help




10 Posts   9271 Views

Web Designer Perth

Community Member, 49 Posts

2 July 2009 at 11:27am

This has been asked in the past with no conclusive answer.

What would be sensible content for a robots.txt file? Can anyone post an example?

Much appreciated.

Ben Gribaudo

Community Member, 181 Posts

3 July 2009 at 12:23am

Web Designer Perth

Community Member, 49 Posts

3 July 2009 at 12:31am

Generally, they do. So thank you.

But I was seeking SilverStripe-specific examples or recommendations.

Ben Gribaudo

Community Member, 181 Posts

3 July 2009 at 2:51am

Is there something specific you're trying to accomplish? Generally, I think a normal SS install should be fine without a robots.txt file unless you wanted to do something like http://silverstripe.com/robots.txt.

Sam

Administrator, 690 Posts

3 July 2009 at 12:37pm

Perhaps we should bundle a default robots.txt file with the installer?

User-agent: *
Disallow: /admin
Disallow: /assets

Ben Gribaudo

Community Member, 181 Posts

7 July 2009 at 12:29am

Sounds like a good idea, Sam.

Not sure if assets/ should be included in the exclusion list, though. I can see some people liking it there because they don't want their assets showing up in search engines separately from the pages holding those assets. On the other hand, there are those who want their assets to be indexed so that they will show up in things like Google image search. The "assets/" directive would prevent that.

Ben

Benedikt

Community Member, 16 Posts

2 February 2010 at 12:05am

Disallowing assets doesn't make sense to me, since there might be PDF or TXT files which are useful content for search engines.
It would make more sense to stop search engines from triggering cache flushes:

User-agent: *
Disallow: /admin
Disallow: /?flush
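
One caveat worth noting (not raised in the thread): robots.txt Disallow lines are plain prefix matches, so `Disallow: /?flush` only covers a flush request on the homepage URL. Python's standard-library `urllib.robotparser` implements the classic matching rules and can illustrate this; the example.com URLs are placeholders:

```python
from urllib import robotparser

# Benedikt's suggested robots.txt from above, parsed in memory.
rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /admin
Disallow: /?flush
""".splitlines())

# The homepage flush URL matches the /?flush prefix and is blocked...
print(rp.can_fetch("*", "http://example.com/?flush=1"))       # False

# ...but a flush on any other page does not match the prefix,
# so it remains crawlable under these rules.
print(rp.can_fetch("*", "http://example.com/home/?flush=1"))  # True
```

Crawlers that honour wildcard patterns (e.g. Googlebot) could additionally be given `Disallow: /*?flush`, though wildcards are not part of the original robots.txt standard.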

kcd

Community Member, 54 Posts

7 February 2011 at 10:00am

I use

User-agent: *
Disallow: /admin
Disallow: /?flush
Disallow: /myotherprivatedirectory
Allow: /

User-Agent: Googlebot-Image
Disallow: /admin
Disallow: /?flush
Disallow: /myotherprivatedirectory
Disallow: /assets
Allow: /
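
As a sanity check on how a file like this behaves: Python's standard-library `urllib.robotparser` follows the usual prefix-matching rules, so the directives above can be tested directly (the example.com URLs and file names are placeholders):

```python
from urllib import robotparser

# kcd's robots.txt from above, as a string.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin
Disallow: /?flush
Disallow: /myotherprivatedirectory
Allow: /

User-Agent: Googlebot-Image
Disallow: /admin
Disallow: /?flush
Disallow: /myotherprivatedirectory
Disallow: /assets
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Ordinary crawlers may fetch pages and assets, but not /admin.
print(rp.can_fetch("*", "http://example.com/about-us/"))         # True
print(rp.can_fetch("*", "http://example.com/assets/photo.jpg"))  # True
print(rp.can_fetch("*", "http://example.com/admin"))             # False

# The image bot is additionally kept out of /assets.
print(rp.can_fetch("Googlebot-Image", "http://example.com/assets/photo.jpg"))  # False
```

Note that the first matching rule wins, which is why the `Disallow` lines take effect despite the trailing `Allow: /`.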
