I just installed SilverStripe for the first time. The demo looked intriguing and I decided to install the program to a test domain. Once I started creating pages I noticed that my subpages were not being placed in directories in the URI.
For example I created a Contact page under an About page and instead of residing at /about/contact/ the page was created at /contact/.
I understand that there are many reasons not to use a directory path. However, there are also several good reasons to use directory paths, such as implementing the CMS into an existing site which already has search engine weight and rankings.
Fuzz10 and wombleme, thank you for your responses to this issue. I feel much more comfortable investing the time to learn the framework and system knowing that this type of functionality is being actively developed.
I do have one more issue on a related note:
I noticed that if I access a page, /page/, I can also access this page at /page/a-page-that-does-not-exist. This concerns me due to content duplication issues with search engine optimization. I am guessing, from the little I know about the system, that this is due to the MVC functionality of the underlying framework.
If this is due to the MVC framework, I feel the proper way to handle it is to return a 404 whenever no controller method matches the remainder of the URI, rather than re-rendering the page's content.
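To make the expected behaviour concrete, here is a minimal sketch in Python (not SilverStripe's actual PHP routing; the page name and the action list are made up for illustration) of the rule I am proposing: once the page itself is resolved, any leftover URL segment must match a known controller action, or the request should get a 404.

```python
# Hypothetical illustration of the proposed routing rule, not SilverStripe code.
KNOWN_ACTIONS = {"index", "edit"}  # made-up actions on the page's controller


def dispatch(path):
    """Return an HTTP status code for a request path like '/page/extra'."""
    segments = [s for s in path.split("/") if s]
    if not segments or segments[0] != "page":
        return 404  # no such page in the site tree
    if len(segments) == 1:
        return 200  # '/page/' itself
    # The key point: an unrecognized trailing segment should be a 404,
    # not a silent duplicate of the page's own content.
    return 200 if segments[1] in KNOWN_ACTIONS else 404


print(dispatch("/page/"))                            # 200
print(dispatch("/page/a-page-that-does-not-exist"))  # 404
```

Under this rule, `/page/a-page-that-does-not-exist` would stop serving a duplicate of `/page/` and would instead be a clean 404, which removes the duplicate-content exposure I describe below.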
This would ordinarily not be a significant issue. However, I have several clients who are in very competitive industries. Having duplicate content "available" poses the risk that a multitude of pages on your website would contain identical HTML and content.
By "available" I mean that if someone discovers this issue on your site, they can create pages that link to non-existent URLs under it and get those URLs indexed in search engines. This could hurt your rankings, because a very large number of indexed pages would carry the exact same content.
For example, suppose I find that a competitor's page holds the number-one ranking for a desired phrase, and that their site produces duplicate content under any URI beneath that page. I could then create dummy sites that generate links to non-existent pages under the page holding the desired ranking. With some basic SEO work to get the dummy sites indexed, I could really harm the competitor.