Script to pull info out of very large drives
I need a script that can gather folder/file information from large drives (600GB to 1TB). The info I need is:
- Full name/path of the file
- File Size
- Date Created
- Date Modified
- Date last accessed.
So far I have the code below:
dir 'e:\' -recurse |
select FullName,Length,CreationTime,LastWriteTime,LastAccessTime |
Export-CSV e:\test\testit.csv -notype
Would it be possible to adapt the script so that it only outputs data to the CSV if the modified date is xxx days or older?
The xxx would be the number of days, e.g. 365 for one year or older, or 730 for two years or older.
Also, would it be possible to modify the headers for the columns in the output CSV file? Thanks
For the first part of question, you might use
# fixed date
Get-ChildItem c:\temp\ | ? { $_.LastWriteTime -lt [datetime]'2009-02-23' }
# variable depending on current date; it returns only items that are older than 365 days
Get-ChildItem c:\temp\ | ? { $_.LastWriteTime -lt [datetime]::Now.AddDays(-365) }
As for modifying the headers: it is possible to select items under different names but with the same content (i.e. rename the properties):
Select-Object @{Name='ItemName'; Expression={$_.FullName } },
@{Name='Write'; Expression={$_.LastWriteTime } },
@{Name='Access'; Expression={$_.LastAccessTime } }
Note that there is a script block where you can do whatever you want. You can e.g. format the dates {$_.LastAccessTime.ToString('yyyy-MM-dd') }
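For example, to export the last-access date as an ISO yyyy-MM-dd string using the same calculated-property syntax (the path and column names here are just examples):

```powershell
# Select the full path plus a formatted last-access date
Get-ChildItem c:\temp\ |
    Select-Object @{Name='ItemName'; Expression={ $_.FullName }},
                  @{Name='Access';   Expression={ $_.LastAccessTime.ToString('yyyy-MM-dd') }}
```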
If I put it together (only 3 of the properties are 'renamed'):
Get-ChildItem e:\ -rec |
? { $_.LastWriteTime -lt [datetime]::Now.AddDays(-365) } |
Select-Object @{Name='ItemName'; Expression={$_.FullName } },
@{Name='Write'; Expression={$_.LastWriteTime } },
@{Name='Access'; Expression={$_.LastAccessTime } } |
Export-Csv e:\test\testit.csv -notype
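To make the age threshold reusable rather than hard-coded, the pipeline can be wrapped in a small function; a minimal sketch, where the function name Get-OldFiles and the parameter names are my own choices, not anything built in:

```powershell
function Get-OldFiles {
    param(
        [string]$Path,        # root folder to scan
        [int]$Days = 365      # keep only items last written this many days ago or more
    )
    $cutoff = [datetime]::Now.AddDays(-$Days)
    Get-ChildItem $Path -Recurse -ErrorAction SilentlyContinue |
        Where-Object { $_.LastWriteTime -lt $cutoff } |
        Select-Object @{Name='ItemName'; Expression={ $_.FullName }},
                      @{Name='Size';     Expression={ $_.Length }},
                      @{Name='Created';  Expression={ $_.CreationTime }},
                      @{Name='Write';    Expression={ $_.LastWriteTime }},
                      @{Name='Access';   Expression={ $_.LastAccessTime }}
}

# Usage: export everything on e:\ not modified for 2 years
# Get-OldFiles -Path 'e:\' -Days 730 | Export-Csv e:\test\testit.csv -NoTypeInformation
```

-ErrorAction SilentlyContinue skips folders the account cannot read, which is common when scanning whole drives.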