Moves files to S3 and deletes them from the local machine. It is fast even with hundreds of files because the script is thread-based.
First of all, install boto3 if it is not already on your system (the threading module ships with the Python standard library, so it needs no separate install).
If you want to create a virtual environment:
$ python3 -m venv env
$ source env/bin/activate
$ pip install boto3
You need the AWS CLI installed on your machine; run the command below and enter the correct Access Key ID and Secret Access Key:
$ aws configure
You can also configure the credentials manually (by editing ~/.aws/credentials).
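For the manual route, a ~/.aws/credentials file with placeholder values looks like this:

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```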
Pass the directory path as an argument to the script; it will collect every file in that directory and upload them to S3.
$ python index.py <directory path>
$ python index.py ./files
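A minimal sketch of what such a thread-based uploader could look like (hypothetical; the bucket name, worker count, and helper names below are assumptions, not the actual index.py):

```python
# Hypothetical sketch of a thread-based uploader like index.py.
# The bucket name and max_workers value are assumptions.
import os
import sys
from concurrent.futures import ThreadPoolExecutor


def list_files(directory):
    """Return the paths of regular files directly inside `directory`."""
    return [
        os.path.join(directory, name)
        for name in sorted(os.listdir(directory))
        if os.path.isfile(os.path.join(directory, name))
    ]


def upload_and_delete(path, bucket="my-bucket"):
    """Upload one file to S3, then delete the local copy."""
    import boto3  # imported here so list_files stays usable without boto3
    boto3.client("s3").upload_file(path, bucket, os.path.basename(path))
    os.remove(path)


def main(directory):
    # A pool of threads makes hundreds of small files fast, because each
    # upload is mostly network-bound.
    with ThreadPoolExecutor(max_workers=10) as pool:
        # list() drains the map so any upload exception surfaces here
        list(pool.map(upload_and_delete, list_files(directory)))


if __name__ == "__main__":
    main(sys.argv[1])
```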
To make the script usable from anywhere, create a global command on Linux (make sure index.py starts with a shebang line such as #!/usr/bin/env python3, or chmod +x will not make it directly runnable):
$ cp index.py uploadToS3 && chmod +x uploadToS3 && sudo cp uploadToS3 /bin
Run this command in the directory that contains the sub-directories with files:
$ for d in */*/*; do uploadToS3 $d; done;
The watcher below monitors a directory; when new files are created, they are uploaded to S3. For example, run it from:
/var/spool/asterisk/monitor
inotifywait -m -r -e create . | while read DIRECTORY EVENT FILE; do
  case $EVENT in
    CREATE*)
      for d in */*/*; do uploadToS3 "$d"; done
      ;;
  esac
done
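If inotifywait is not available, a rough stand-in can be sketched in Python (a hypothetical polling loop, not part of this project; the callback and poll interval are assumptions):

```python
import os
import time


def watch(directory, on_new_file, poll_seconds=1.0, max_polls=None):
    """Poll `directory` (non-recursively) and call on_new_file(path) for
    each file that appears after watching starts. A crude stand-in for
    inotifywait; max_polls=None polls forever."""
    seen = set(os.listdir(directory))
    polls = 0
    while max_polls is None or polls < max_polls:
        time.sleep(poll_seconds)
        current = set(os.listdir(directory))
        for name in sorted(current - seen):
            on_new_file(os.path.join(directory, name))
        seen = current
        polls += 1
```

Here on_new_file could be the uploader itself, e.g. watch(".", upload_and_delete).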
To check the logs, change directory to /home/centos/logs and run the command below. If a log shows an empty array, there was no error; otherwise there is a problem:
$ for f in *; do echo $f; echo " == "; cat "$f"; done