
I have a backup script that creates a ~300GB backup file in a backup folder daily. That folder also contains some smaller files (less than 50GB).


Using a cronjob, I want to keep the 2 most recent files that are larger than 50GB while ignoring the smaller files, i.e. delete the oldest large file while always keeping at least 2 such files in the directory.


So far I tried this:


find . -type f -size +50G -printf '%T+ %p\n' | sort | head -n 1 | awk '{print $NF}' | xargs rm -f

This sorts the files bigger than 50GB by modification time and deletes the oldest one.


I can't figure out how to make sure that it keeps at least 2 files larger than 50GB. My idea is to count the files bigger than 50GB, sort them by date, and only delete the oldest when there are more than 2. Is there anything I'm missing? Should I rather delete all but the 2 newest files somehow?
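
For reference, this is roughly what I have in mind (a sketch, assuming GNU find and filenames without whitespace; /backup is a placeholder path):

#!/bin/sh
# delete the oldest file larger than 50GB, but only when more than 2 exist
dir=/backup
count=$(find "$dir" -maxdepth 1 -type f -size +50G | wc -l)
if [ "$count" -gt 2 ]; then
  find "$dir" -maxdepth 1 -type f -size +50G -printf '%T+ %p\n' |
    sort | head -n 1 | awk '{print $NF}' | xargs rm -f
fi

A crontab entry such as 0 5 * * * /path/to/cleanup.sh would then run it daily (the time and script path are made up).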


Any tips/hints are appreciated. Thank you.


Answers

You almost made it (lines split for readability):


find . -type f -size +50G -printf '%T+ %p\n' |
sort -r |
awk 'NR>2 {print $2}' |
xargs rm -f

where

  • sort -r will list the newest files first

  • NR>2 will select lines starting from the 3rd, i.e. everything except the 2 newest files

  • {print $2} will print the filename (provided the filename contains no space, tab or newline; a whitespace-safe variant follows below)
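
If filenames may contain whitespace, a NUL-delimited sketch should do the same job, assuming GNU find and GNU coreutils new enough for sort, tail and cut to support -z; the -maxdepth 1 is an assumption to keep it from descending into subfolders:

# keep only the 2 newest files larger than 50GB, delete the rest;
# NUL delimiters make it safe for names with spaces, tabs or newlines
find . -maxdepth 1 -type f -size +50G -printf '%T@\t%p\0' |
  sort -zrn |      # newest first (numeric sort on the epoch timestamp)
  tail -z -n +3 |  # skip the 2 newest records
  cut -z -f2- |    # drop the timestamp field, keep the path
  xargs -0 rm -f   # delete whatever remains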

