
Make log reading efficient

I have a Perl script that is used to monitor databases, and I'm trying to rewrite it as a PowerShell script.

In the Perl script there is a function that reads through the errorlog, filters out what matters, and returns it. It also saves the current position in the log file so that the next time it reads the log it can start where it left off instead of reading the whole log again. This is done using Perl's tell function.

My idea is to use the Get-Content cmdlet, start reading at the last saved position, process each line until the end of the file, and then save the new position.

Do you know any tricks so that I can get the position in the log file after reading, and make the next read start at that particular location?

Or is there a better and/or easier way to achieve this?

Gísli

EDIT: This has to be done through the script and not with some other tool.

EDIT: So I'm getting somewhere with the .NET API, but it's not quite working for me. I found helpful links here and here.

Here is what I have so far:

function check_logs{
    param($logs, $logpos)
    $count = 1
    $path = $logs.file
    $br = 0
    $reader = New-Object System.IO.StreamReader("$path")
    $reader.DiscardBufferedData()
    $reader.BaseStream.Seek(5270, [System.IO.SeekOrigin]::Begin)
    for(;;){
        $line = $reader.ReadLine()
        if($line -ne $null){$br = $br + [System.Text.Encoding]::UTF8.GetByteCount($line)}
        if($line -eq $null -and $count -eq 0){break}
        if($line -eq $null){$count = 0}
        elseif($line.Contains('Error:')){
            $l = $line.split(',')
            Write-Host "$line  $br"
        }
    }
}

I haven't found a way to use the seek function correctly. Can someone point me in the right direction?

If I run this, it outputs 5270, but if I run it without the line where I seek in the base stream, I get:

2011-08-12 08:49:36.51 Logon       Error: 18456, Severity: 14, State: 38.  5029
2011-08-12 08:49:37.30 Logon       Error: 18456, Severity: 14, State: 38.  5270
2011-08-12 16:11:46.58 spid18s     Error: 1474, Severity: 16, State: 1.  7342
2011-08-12 16:11:46.68 spid18s     Error: 17054, Severity: 16, State: 1.  7634
2011-08-12 16:11:46.69 spid29s     Error: 1474, Severity: 16, State: 1.  7894

The first part of each line is the line read from the log, and the number at the end is the byte count at that point. So as you can see, I'm now trying to use Seek to skip the first error line, but as I said earlier, the only output is 5270 when I use the seek function.

What am I missing?

Gísli


You would probably be able to do this with some .NET objects, as you've started to.

If it's a more standard-format log file, though, I wouldn't look much past LogParser. It was awesome before its time and is still awesome!

You can use it via the command line or via COM from PowerShell. It has the ability to mark where it was in a file and pick up from there (it stores the info in an .lpc checkpoint file).
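As a rough sketch of the command-line route (assuming LogParser 2.2 is installed; the paths, checkpoint file name, and query here are illustrative, not tested):

```powershell
# Hypothetical sketch: ask LogParser for only the lines added since the
# last run, using its checkpoint support (-iCheckpoint) to remember the
# file position between runs. Paths and file names are assumptions.
& 'C:\Program Files (x86)\Log Parser 2.2\LogParser.exe' `
    -i:TEXTLINE -iCheckpoint:'C:\logs\errorlog.lpc' `
    "SELECT Text FROM 'C:\logs\ERRORLOG' WHERE Text LIKE '%Error:%'"
```

On each subsequent run, LogParser consults the .lpc file and returns only lines appended since the previous invocation.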

Maybe someone will come up with a good way of doing this, but if not, you could also look at switching to writing error information to the event log. You can store the last ID or last time you searched the event log and check from there each time.

hopefully there is something better...

EDIT:

If the file is tab-delimited you can use the Import-Csv command and store the last line number (it'd be either count or count-1, depending on whether the header is included in count). With the last number you can jump to the last point in the file:

# use Import-CliXML to get the $last_count saved by the previous run
$last_count = Import-CliXML $path_to_last_count_xml
$file = Import-Csv filename -Delimiter "`t"
for ($i = $last_count; $i -lt $file.count; $i++) {
    $line = $file[$i]
    # do something with $line
    ...
}
$file.count | Export-CliXML $path_to_last_count_xml

# use this to free the memory
Remove-Variable file
[GC]::Collect()

Or you could query the database directly using sp_readerrorlog, using the last check time the same way as the last count above.
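A sketch of that route (assuming the SQL Server PowerShell snap-in's Invoke-Sqlcmd is available; the server name and the saved $lastCheck timestamp are illustrative):

```powershell
# Hypothetical sketch: filter the error log server-side with
# sp_readerrorlog (args: log number, log type 1 = error log, search
# string), then keep only entries newer than the last check time.
$lastCheck = Get-Date '8/16/2011 15:00'   # loaded from a file in practice
Invoke-Sqlcmd -ServerInstance 'ServerName' `
    -Query "EXEC sp_readerrorlog 0, 1, 'Error:'" |
    Where-Object { $_.LogDate -gt $lastCheck }
```

Letting sp_readerrorlog do the string filtering keeps the data sent back to the script small, and you only compare dates client-side.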


If your objective is to read the file only from the last line read in the prior run, please feel free to disregard this answer. However, if you're just trying to get any errors since the last check time, this might help.


[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
$server = 'ServerName'
$chkDate = Get-Date -Date '8/16/2011 15:00'  # time of last check
$srvObj = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $server
$srvObj.ReadErrorLog(0) | foreach { if ($_.LogDate -notlike '' `
   -and $_.LogDate -ge $chkDate `
   -and $_.Text -like 'Error: *') {$_}} | ft -AutoSize

If you pick up the last run time from a file, or just know you run this every hour or whatever, you can adjust $chkDate to show only errors from then to the end of the file.

(watch out for those back-ticks (`) at the end of the $srvObj.ReadErrorLog(0) line and the next line. They don't always come out for me in the HTML)


There are two options I would suggest if you really want to go with PowerShell.

  1. Use the .NET file APIs.

  2. Read the whole contents of the log and then clear it. Store the parsed contents in a database.
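For option 1, here is a minimal sketch of the tell-equivalent: track the byte offset yourself and seek to it on the next run. Two gotchas relative to the code in the question: Seek() returns the new position, which leaks into the function's output unless you discard it, and DiscardBufferedData() must be called *after* seeking, or the reader serves stale buffered text. The function and parameter names here are illustrative.

```powershell
# Sketch: read a log from a saved offset, emit matching lines, and save
# the new offset for the next run. Names and paths are assumptions.
function Read-NewLogLines {
    param([string]$LogPath, [string]$PosFile)

    # load the offset saved by the previous run (0 on the first run)
    $pos = 0
    if (Test-Path $PosFile) { $pos = [long](Get-Content $PosFile) }

    $stream = [System.IO.File]::Open($LogPath, 'Open', 'Read', 'ReadWrite')
    $reader = New-Object System.IO.StreamReader($stream)

    # Seek returns the new position; discard it so it doesn't leak into
    # the pipeline, then flush the reader's buffer AFTER seeking.
    $null = $stream.Seek($pos, [System.IO.SeekOrigin]::Begin)
    $reader.DiscardBufferedData()

    while (($line = $reader.ReadLine()) -ne $null) {
        if ($line.Contains('Error:')) { $line }
    }

    # having read to EOF, the stream position is the end of the file
    $stream.Position | Set-Content $PosFile
    $reader.Close()
}
```

On the next run the function seeks straight to the saved offset, so only lines appended since the last check are scanned.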

