Find files older than 3 days on Unix


















If you just want to create a single tar archive, use -exec instead of passing the data to xargs:
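A minimal runnable sketch of that -exec approach. The directory and file names are invented for the demo, and appending with tar -r assumes GNU tar, which creates the archive on first use:

```shell
#!/bin/sh
# Demo: bundle files older than 3 days into one tar archive via -exec.
demo=$(mktemp -d)
mkdir "$demo/logs"
touch -t 202001010000 "$demo/logs/old.log"   # back-dated, so older than 3 days
touch "$demo/logs/new.log"                   # fresh, not matched by -mtime +3
# -exec ... {} + passes many filenames per tar invocation; -r appends, so the
# archive accumulates matches even if find has to run tar more than once.
find "$demo/logs" -type f -mtime +3 -exec tar -rf "$demo/old-files.tar" {} +
tar -tf "$demo/old-files.tar"                # lists only old.log
```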

Asked 8 years, 3 months ago. Active 2 years, 1 month ago. Viewed 77k times. Some implementations of cp have a -t option, so you can put the target directory before the files.
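A sketch of that cp -t variant, assuming GNU coreutils cp (where -t is available); the source and destination directories here are temp dirs created for the demo:

```shell
#!/bin/sh
# Demo: copy files older than 3 days into a target directory with cp -t.
src=$(mktemp -d); dest=$(mktemp -d)
touch -t 202001010000 "$src/old.txt"   # back-dated beyond 3 days
touch "$src/new.txt"
# -t names the target first, so -exec ... {} + can append many source files.
find "$src" -type f -mtime +3 -exec cp -t "$dest" {} +
ls "$dest"                             # old.txt only
```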

This also copies folders with old files? Thanks everybody for the help; I got it working using a combination of your suggestions: I went with the below code and it works perfectly. The format is: [[[[[cc]yy]mm]dd]HH]MM[.SS], where cc is the first two digits of the year (the century), yy is the last two digits of the year, the first mm is the month (from 01 to 12), dd is the day of the month (from 01 to 31), HH is the hour of the day (from 00 to 23), MM is the minute (from 00 to 59), and SS is the seconds (from 00 to 60, allowing for a leap second). The minute field MM is required, while the other fields are optional and must be added in the following order: HH, dd, mm, yy, cc. The SS field may be added independently of the other fields.
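One way to use that timestamp format is with touch -t, which accepts the same [[cc]yy]mmddHHMM[.SS] layout: stamp a reference file, then select files no newer than it. A sketch (all names invented for the demo):

```shell
#!/bin/sh
# Demo: pick files by an explicit cutoff time rather than a relative -mtime.
dir=$(mktemp -d)
touch -t 202101151030.45 "$dir/ref"    # 2021-01-15 10:30:45
touch -t 202001010000 "$dir/old.dat"   # earlier than the reference
touch "$dir/new.dat"                   # created now, newer than the reference
# ! -newer matches files whose mtime is at or before the reference file's;
# ! -name ref keeps the reference file itself out of the results.
find "$dir" -type f ! -newer "$dir/ref" ! -name ref
```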

Multiple -T time ranges can be supplied, and checking stops with the first match. Note: if you want to exclude some folder while taking the backup, you can make find skip (prune) that directory. I always keep a copy of what I restored last time in compressed format. You may want to read Why is looping over find's output bad practice? Thanks Stephane, that may be why I couldn't find this in any post, though for my case it worked like a charm.
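A sketch of excluding a folder from the sweep with -prune; the directory name "cache" is invented for the demo:

```shell
#!/bin/sh
# Demo: skip one subdirectory while finding old files everywhere else.
dir=$(mktemp -d)
mkdir "$dir/cache"
touch -t 202001010000 "$dir/keep.log" "$dir/cache/skip.log"
# -prune stops find from descending into the excluded directory; the tests
# after -o are evaluated only for paths that were not pruned.
find "$dir" -path "$dir/cache" -prune -o -type f -mtime +3 -print
```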

I believe the post you shared is about "looping over" find's output; is my code looping over find's result? You are wise to use xargs, as it's much more efficient. What you want is to search in your current directory. If the version of find being used supports it, I usually use the -delete option, as it is usually even faster. Implementations differ, but usually the -exec option of find does a fork and then an execvp, and then waits on the child process.
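The xargs variant being discussed looks roughly like this sketch; -print0 and -0 are GNU/BSD extensions that keep filenames with spaces (or even newlines) intact:

```shell
#!/bin/sh
# Demo: xargs batches many filenames into few rm invocations.
dir=$(mktemp -d)
touch -t 202001010000 "$dir/old file.tmp"   # note the space in the name
touch "$dir/new.tmp"
# NUL-separated names survive the pipe unmangled, whatever they contain.
find "$dir" -type f -mtime +3 -print0 | xargs -0 rm -f
ls "$dir"                                   # new.tmp remains
```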

It starts a child process for each file, which makes it slow. The -delete option simply does an rmdir or unlink as it walks the filesystem tree, and in my experience it is usually as fast as or faster than xargs. Also, by using -delete you don't run into the hard limit on the number of arguments that execvp(3) imposes (which xargs exists to work around).

By default, -delete deletes both files and directories, so if the intention is only to delete files, the -type f test must also be specified. Of course, if it is a script that is being run often, it doesn't hurt to profile and determine which is best. Find files not matching multiple patterns and then delete anything older than 10 days: Hi, I have multiple files in my log folder.
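A sketch of -delete restricted to plain files; -delete is a GNU/BSD find extension (it implies depth-first traversal), and the paths here are invented for the demo:

```shell
#!/bin/sh
# Demo: delete only old regular files, leaving directories in place.
dir=$(mktemp -d)
mkdir "$dir/sub"
touch -t 202001010000 "$dir/sub/old.log"   # back-dated, matched by -mtime +3
touch "$dir/sub/new.log"                   # fresh, not matched
# Without -type f, an old empty directory would be removed as well.
find "$dir" -type f -mtime +3 -delete
ls "$dir/sub"                              # new.log remains; sub survives
```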

I can write multiple find commands, but I am looking to see if it is possible in one line. Next question: find all log files under all file systems older than 2 days and zip them.
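That zip-old-logs task can be done in one line with -exec and gzip. A sketch: a real run would start at / to cover all file systems, but a temp directory stands in here so the demo is safe to execute:

```shell
#!/bin/sh
# Demo: compress log files older than 2 days in place.
dir=$(mktemp -d)
touch -t 202001010000 "$dir/app.log"   # back-dated, older than 2 days
touch "$dir/fresh.log"                 # too new to be matched
# {} + hands gzip many files per invocation; gzip replaces each file
# with a .gz version, preserving the original timestamp.
find "$dir" -name '*.log' -type f -mtime +2 -exec gzip {} +
ls "$dir"                              # app.log.gz and fresh.log
```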

Using -delete is simpler syntax, and no child processes are spawned to do the delete, but it may not work on all systems. Could an attacker craft a filename that would cause this to delete something you didn't want deleted?

So it's completely safe no matter what characters the filenames contain. This will not work with files with spaces in them; you can use a while loop instead, which should take care of files with spaces. The OP asked "I want to delete all the files older than 15 days", which your answer doesn't provide.
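A sketch of the while-loop alternative mentioned above. It assumes bash (read -d '' is a bash feature), and pairing it with -print0 makes spaces, and even newlines, in names safe:

```shell
#!/bin/bash
# Demo: delete files older than 15 days via a NUL-delimited read loop.
dir=$(mktemp -d)
touch -t 202001010000 "$dir/old report.txt"   # space in the name
touch "$dir/new.txt"
find "$dir" -type f -mtime +15 -print0 |
while IFS= read -r -d '' f; do
    rm -f -- "$f"    # -- guards against names that begin with a dash
done
ls "$dir"            # new.txt remains
```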

They also did not ask for a restriction to php files. If that is your blogspot page, please indicate ownership explicitly instead of the generic "below link"; otherwise you become suspect of simply spamming your site.



