I'm creating a site that scans a directory and imports the contents of any XML files it finds as new pages. It all works nicely, but now I'm trying to set up a scheduled task to automate the process, and I'm running into an issue.
If I run it through my browser, it all works fine, but when I run it from cron with lynx, since I'm not logged in, the pages are created but not published.
Is there an easy way to publish pages without being logged in?
This is just an idea, I've not tested it: the changelog for 2.4 says "Automatically login user if the 'remember login' cookie is set", so maybe you can use this cookie to log your lynx or wget in automatically (both support cookies). If not, you can always log in by submitting the login form (wget, at least, supports the POST method) and storing the session cookie for later use. You may have to handle all this in a simple shell script.
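Something along these lines might work as that shell script. This is only a sketch: the site URL, the login/import paths, and the form field names (`user`, `pass`) are placeholders I made up, so you'd need to check your installation's login form for the real field names and URLs.

```shell
#!/bin/sh
# Hypothetical cron script: log in once via the login form,
# keep the session cookie, then run the import as that user.
# SITE, the paths, and the form field names are assumptions --
# adjust them to match your installation's login form.

SITE="http://example.com"
COOKIES="/tmp/site-cookies.txt"

# Submit the login form with POST and save the session cookie.
# --keep-session-cookies is needed because session cookies are
# normally discarded when wget exits.
wget --quiet --output-document=/dev/null \
     --save-cookies "$COOKIES" --keep-session-cookies \
     --post-data "user=importbot&pass=secret" \
     "$SITE/login"

# Re-use the saved cookie so the import page sees a logged-in user.
wget --quiet --output-document=/dev/null \
     --load-cookies "$COOKIES" \
     "$SITE/import"
```

Putting the password on the command line means it shows up in the process list, so on a shared machine you'd want to read it from a file with restricted permissions instead.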