The other day I considered setting up a cronjob for my site to regularly delete some temp files.
Then I reconsidered deleting the files outright and started thinking about how to back them up instead.
A quick search on the internet led me to a PHP function for recursive file copying, recursive_copy_file, provided by Guilherme Serrano on GitHub Gist. I modified my script to copy each file under a different name into a backup folder and to delete the originals afterwards. This way I could, if I ever wanted to, recall the content of the files without them still being accessible in their original location. I don’t care about their actual names or creation dates.
For my script to know which folder it should start checking, I placed it in the folder it should back up and used getcwd().
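A minimal sketch of that starting point: the script treats its own working directory as the backup root, so it can simply be dropped into the target folder. The `backup` subfolder name and variable names here are my own illustration, not taken from the original Gist.

```php
<?php
// The script's own location doubles as the folder to clean up.
$sourceDir = getcwd();               // folder the script was dropped into
$backupDir = $sourceDir . '/backup'; // hypothetical backup target

if (!is_dir($backupDir)) {
    mkdir($backupDir, 0755, true);   // create the backup folder once
}

// List everything in the source folder except . and ..
$entries = array_diff(scandir($sourceDir), ['.', '..']);
```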
However, since the script was going to delete all files in the directory while leaving itself in place, it had to check the extension via pathinfo() and make sure it does nothing for “php” files, or alternatively only processes files of one specific other extension.
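That guard could look something like the sketch below. The function name shouldBackup is my own; the original script may structure the check differently.

```php
<?php
// Skip anything ending in .php (case-insensitive), so the cleanup
// never deletes the script that performs it.
function shouldBackup(string $file): bool
{
    $ext = strtolower(pathinfo($file, PATHINFO_EXTENSION));
    return $ext !== 'php'; // never touch PHP files, incl. this script
}
```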
But even with the copies renamed and the originals deleted, this wasn’t saving much webspace. To solve that, I used rename() instead of copy(), compressed the moved files into gz archives, and unlink()ed the uncompressed copies afterwards.
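The rename-then-compress step might be sketched as follows, under my own assumptions: the file is moved into the backup folder under a generated name (since the original names don’t matter), gzipped there, and the uncompressed copy is removed. Function and variable names are illustrative, not from the original script.

```php
<?php
// Move a file into the backup folder, gzip it, and drop the
// uncompressed copy. Returns the path of the .gz archive.
function archiveFile(string $src, string $backupDir): string
{
    // Anonymised target name; original names are irrelevant here.
    $moved = $backupDir . '/' . uniqid('bak_', true);
    rename($src, $moved);            // move instead of copy

    $gzPath = $moved . '.gz';
    $in  = fopen($moved, 'rb');
    $out = gzopen($gzPath, 'wb9');   // maximum compression level
    while (!feof($in)) {
        gzwrite($out, fread($in, 8192));
    }
    fclose($in);
    gzclose($out);

    unlink($moved);                  // remove the uncompressed copy
    return $gzPath;
}
```

Note that rename() only works as a cheap move when source and target sit on the same filesystem, which is the case here since the backup folder lives inside the webspace.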
Finally, back in the control panel of my webspace, all I had to do was select this script as a new cronjob in the cronjob manager and define the interval at which the action should run automatically.