I have a backup script that creates a ~300GB backup file in a backup folder daily. That folder also contains some smaller files (less than 50GB).
Using a cron job, I want to keep the 2 most recent files that are larger than 50GB while ignoring the smaller files: delete the oldest large file, but only as long as at least 2 large files remain in the directory.
So far I tried this:
find . -type f -size +50G -printf '%T+ %p\n' | sort | head -n 1 | awk '{print $NF}' | xargs rm -f
This sorts the files bigger than 50GB by date and deletes the oldest one.
I can't figure out how to make sure it keeps at least 2 files of at least 50GB. My thinking is that I should count the files bigger than 50GB, sort them by date, and delete the oldest only if there are more than 2? Is there anything I'm missing? Or should I instead delete all but the newest 2 files somehow?
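To show what I mean by "delete all but the newest 2", here is a rough sketch of what I'm imagining (untested; `prune_backups` and the directory argument are just placeholder names I made up):

```shell
#!/bin/sh
# Sketch: keep the 2 newest files larger than 50GB in the given directory
# and delete any older large ones; smaller files are left alone.
prune_backups() {
    dir=$1
    # %T@ prints the mtime as epoch seconds, so a numeric sort orders by age;
    # sort -rn puts the newest first, and tail -n +3 skips the 2 we keep.
    find "$dir" -maxdepth 1 -type f -size +50G -printf '%T@ %p\n' \
        | sort -rn \
        | tail -n +3 \
        | while IFS= read -r line; do
              # strip the leading timestamp field; this keeps filenames
              # containing spaces intact (filenames with newlines still break)
              rm -f -- "${line#* }"
          done
}
```

Called from cron as e.g. `prune_backups /path/to/backups`, this would never delete anything while 2 or fewer large files exist, since `tail -n +3` produces no output in that case. But I'm not sure this is the right approach.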
Any tips/hints are appreciated. Thank you.