
I have a script that generates a backup, finds files older than 30 days, and compresses them. The problem I'm facing is that if find returns more than one file, how can I edit the script so that it compresses all of those files as well? Below is the find function I'm using. Please help.



Time=+30
PATH_TO_DUMP=/home/tarun/Desktop/Backup

#Find any Backup File defined by the time constraint
file="$(find $PATH_TO_DUMP -type f -mtime $Time)"

#To verify if $file is empty or has some value
if [ ! -n "$file" ]; then
    echo "No Earlier Backups were found to compress" >> $PATH_TO_LOG
else
    echo "Earlier Backups $file will be compressed" >> $PATH_TO_LOG
    gzip $file
fi


 Answers

You can do that with a minor modification: pipe the find output into a while loop that reads one file name per iteration and compresses it.

Time=+30
PATH_TO_DUMP=/home/tarun/Desktop/Backup

#Find any Backup File defined by the time constraint and read
#one file name per line; IFS= and -r keep read from trimming
#whitespace or mangling backslashes in the names
find "$PATH_TO_DUMP" -type f -mtime "$Time" | while IFS= read -r file
do
    #To verify if $file is empty or has some value
    if [ ! -n "$file" ]; then
        echo "No Earlier Backups were found to compress" >> "$PATH_TO_LOG"
    else
        echo "Earlier Backups $file will be compressed" >> "$PATH_TO_LOG"
        gzip "$file"
    fi
done
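
As a side note, if your find supports the POSIX -exec ... + terminator, you can skip the loop entirely and let find hand the matched files straight to gzip. This is a minimal sketch using the same Time and PATH_TO_DUMP variables as above; it also copes with file names containing spaces, since no shell word splitting is involved.

Time=+30
PATH_TO_DUMP=/home/tarun/Desktop/Backup

#Let find invoke gzip directly on every match; the '+' batches
#the matched files into as few gzip invocations as possible
find "$PATH_TO_DUMP" -type f -mtime "$Time" -exec gzip {} +

You would still need your own log lines around it if you want the "backups will be compressed" messages in $PATH_TO_LOG.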
