Within the web directory, I need to duplicate a directory and its contents, either with a LiveCode script executed by the client, or with a LiveCode script executed by a cron task.
What would be the best strategy?
CLIENT
- LiveCode script in the www/public directory
- relative path
-> problem: needs sudo
-> security issue?
The script could be:
put "echo " & quote & myRootPassword & quote & " | sudo -S cp -rf sourcedir newdir" into theCmd -- quote protects passwords containing spaces, but the password is still exposed
get shell(theCmd)
CRON
- LiveCode script in the root directory
- absolute path
-> no sudo
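For the CRON option, a minimal sketch of the copy script that the cron task would invoke (all paths and the crontab schedule here are hypothetical; adjust to your server layout):

```shell
#!/bin/sh
# Sketch of the cron approach. cron runs this as the user that owns
# both directories, e.g. via a crontab line such as:
#   */10 * * * * /home/deploy/bin/copydir.sh
# so no sudo (and no stored password) is ever needed.

copy_dir() {
    # -R: copy recursively; -p: preserve modes and timestamps
    cp -Rp "$1" "$2"
}

# Demonstration against throwaway directories:
src=$(mktemp -d)
dest="$src-copy"
echo "hello" > "$src/file.txt"
copy_dir "$src" "$dest"
```

Because the job runs under the owning account, the sudo problem from the CLIENT option disappears entirely.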
cp dir, sudo, cron and best strategy
Re: cp dir, sudo, cron and best strategy
Hi,
If you can use a cron script, I'd use that, because shell scripts with passwords are a really tricky thing in LiveCode.
Have you considered using FTP or SSH?
Kind regards,
Mark
The biggest LiveCode group on Facebook: https://www.facebook.com/groups/livecode.developers
The book "Programming LiveCode for the Real Beginner"! Get it here! http://tinyurl.com/book-livecode
Re: cp dir, sudo, cron and best strategy
I think I'd go with a cron job doing an rsync or cp, simply because it's the most easily secured way of doing this, and it should work really well.
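The cron job for the rsync variant could be as small as one crontab line (a sketch; the schedule and paths are placeholders):

```shell
# Hypothetical crontab entry (edit with `crontab -e` as the owning user):
# mirror sourcedir into newdir every night at 02:30, no sudo involved.
# -a: archive mode (recursive, preserves perms/times); --delete keeps the
# copy an exact mirror; the trailing slash on the source copies its contents.
30 2 * * * rsync -a --delete /var/www/site/sourcedir/ /var/www/site/newdir/
```

An advantage of rsync over cp here is that repeat runs only transfer what changed.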
Re: cp dir, sudo, cron and best strategy
Hello Mark. Thank you for your answer.
Mark wrote: because shell scripts with passwords are a really tricky thing in LiveCode.
Are you saying that it's tricky particularly in LiveCode? If so, why?
Mark wrote: Have you considered using FTP or SSH?
That wouldn't fit my process (a customer places an order on a webpage, and if the payment is validated, some operations to copy directories and their contents should take place right after).
Re: cp dir, sudo, cron and best strategy
Mark is right, automatic passwords used in shell scripts are dangerous. I'm not a security expert by any means, but avoiding that method is probably a good thing if at all possible. If it is not possible, I'd change the ownership of the files in question, and of the destination location, to an unprivileged user (possibly a user with no real shell/login permissions) and do it that way. The trickiness is not a LiveCode thing, it's a shell scripting thing.
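The ownership change described above might look like this one-time setup fragment, run once as root (the account name "copybot" and the paths are purely illustrative):

```shell
# One-time setup, run as root. "copybot" is a hypothetical system account
# with no login shell, created solely to own and copy the web directories.
useradd --system --shell /usr/sbin/nologin copybot
chown -R copybot:copybot /var/www/site/sourcedir /var/www/site/newdir
# copybot's own crontab then runs the cp/rsync job with no sudo at all.
```

After this, nothing in the copy path ever needs root or a stored password.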
It also might be possible to set up a remote copy (e.g. rsync) so that you can initiate the copy remotely, but that's another thing that would have to be "open" on the server. You also have to be careful about allowing folks to pass in parameters (like source and target locations). Even if you hard-code the stub locations, you'd have to make sure the user couldn't use ../../.. tricks to get back to the root filesystem, and/or use a symbolic link (if they have the rights to create one) to get to places they aren't supposed to be, especially if you're running things with the root password.
Makes me remember back to when I was a lab assistant at a college. We were given limited privileges in a... well, sort of a shell program overlaid on some Netware stuff. Unfortunately the process itself was running with privileges, and it was pretty easy to break out and get a prompt with those same privileges.
If it's just your home system, it's probably not a huge deal, as long as it's behind a NAT/firewall and doesn't allow easy shell access.
Cron should work well. Or use a web server (with no special privileges, so that hopefully only the web server itself can be hit) so that a client can initiate a copy, but again you'd have to a) make sure only the bare minimum privileges are available to do the job, b) process any input looking for funkiness (like the ../../.. trick to get back to root level), or c) hard-code the paths, or generate the paths yourself with no input from the user, to hopefully keep things from being subverted.
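Point b) above, checking input for the ../../.. trick, can be sketched like this (a minimal example, assuming GNU `realpath`; the base directory is a stand-in for your web root):

```shell
#!/bin/sh
# Resolve any requested path and reject it unless it stays under a
# hard-coded base directory. This catches ../ sequences and symlinks.
BASE=$(mktemp -d)            # stands in for the allowed web root
mkdir -p "$BASE/orders"

is_safe() {
    # realpath -m resolves ../ and symlinks without requiring the
    # target to exist; the result must remain under $BASE.
    target=$(realpath -m "$BASE/$1")
    case "$target" in
        "$BASE"/*) return 0 ;;
        *)         return 1 ;;
    esac
}

is_safe "orders/new" && echo "orders/new: accepted"
if is_safe "../../etc/passwd"; then
    echo "escape: accepted"
else
    echo "escape: rejected"
fi
```

Only after a request passes a check like this should the copy command ever run.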
Also, since I'm not sure what it is you're trying to accomplish, you might consider tar/gzipping the files, backing them up to a hard-coded location, then extracting them again to wherever you wish the copy to be, again being careful not to give the client too much control of the machine.
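The tar/gzip variant can be sketched in two commands (paths here are throwaway stand-ins for the hard-coded locations):

```shell
#!/bin/sh
# Sketch of the archive-then-extract approach: pack the source into a
# staging location, then unpack at the destination. No sudo needed when
# all three locations belong to the same unprivileged user.
SRC=$(mktemp -d); STAGE=$(mktemp -d); DEST=$(mktemp -d)
echo "data" > "$SRC/file.txt"

# -C changes directory first, so the archive holds relative paths only
# (which also prevents the archive from carrying absolute-path surprises).
tar -czf "$STAGE/backup.tar.gz" -C "$SRC" .
tar -xzf "$STAGE/backup.tar.gz" -C "$DEST"
```

A side benefit is that the staged archive doubles as a dated backup if you keep it around.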