We complete daily health checks for our customers. In one specific case, a BI tool notifies us of major errors, but we also have to scan logs for smaller errors and unhealthy processes. That means checking log files for errors (depending on your luck, there can be about 10 of them a day) and querying service tables in roughly the same number of databases. At first I did this manually. For the first part, I opened a remote desktop session to the server, navigated to the log folder and sorted the log files by modified date. Then I opened the recent files one by one in Notepad and searched for “error” in each of them. For the second part, I opened SQL Server Management Studio, connected to the server and executed a SQL script against each of the databases. It took around 30-45 minutes, but I knew that if I could partially automate some of the tasks, I could reduce that time.
PowerShell to Search in Files
This was my first time dealing with PowerShell, but it was obviously the right tool here. In the script I created two variables: the logs folder path and how far back in time I need to go, in hours.
[code]# Folder with the application logs and how far back to look, in hours
$Path = 'C:\Application\Logs\*.*'
$LastModifiedRange = -24

# Take only recently modified files and search them for "error"
Get-ChildItem -Path $Path |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddHours($LastModifiedRange) } |
    Select-String -Pattern "error"
[/code]The script’s output is a list of all the lines, across all the files, that match the pattern. If you need help with PowerShell syntax, I’m not the right person to ask, but personally I found that the answer to almost any question is easily found on the Internet.
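The raw list can get noisy when several services are misbehaving. One possible variation on top of the script above (not something from my actual checks) is to group the matches per file, so the noisiest logs stand out:
[code]# Possible variation: count how many "error" lines each file contains
Get-ChildItem -Path $Path |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddHours($LastModifiedRange) } |
    Select-String -Pattern "error" |
    Group-Object -Property Path |
    Select-Object Name, Count |
    Sort-Object Count -Descending
[/code]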
SQLCMD Utility to Execute SQL
For the second part, I first improved the query itself. Since the service table has the same name in all databases, the databases can be looped through with a cursor.
[sql]-- List of databases to check
DECLARE @name VARCHAR(50)
DECLARE @Databases TABLE (DBName varchar(128))
INSERT INTO @Databases
VALUES ('DB1'), ('DB2'), ('DB3')

-- Loop through the databases with a cursor
DECLARE db_cursor CURSOR FOR
SELECT DBName FROM @Databases

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @name

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build the query for the current database and execute it
    DECLARE @SQL nvarchar(max)
    SET @SQL = '
    SELECT ''<@@Database>'',
           Table_Schema,
           Table_Name,
           Error_Message,
           Error_Timestamp
    FROM <@@Database>.[dbo].[Service_Table]
    WHERE Error_Timestamp >= DATEADD(day, -2, GETDATE())
    ORDER BY Error_Timestamp DESC'

    SET @SQL = REPLACE(@SQL, '<@@Database>', @name)
    EXEC sp_executesql @SQL

    FETCH NEXT FROM db_cursor INTO @name
END

CLOSE db_cursor
DEALLOCATE db_cursor
[/sql]Although it was easy enough for me to run this in Management Studio, its overloaded interface was too much for my colleagues, who are unfamiliar with SQL Server but still have to perform the daily checks on a roster. Therefore I chose SQLCMD, a command-line utility for executing SQL scripts.
I found it easiest to save my chunky SQL into a file on disk, so I could pass it in as a parameter:
[code]sqlcmd -S server001 -E -i "C:\Daily checks\SQL executable.txt"
[/code]The parameters are: -S for the server name, -E for Windows authentication, and -i for the input file path. The output is the list of errors, if there were any.
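Both checks could even be wrapped into a single PowerShell file for whoever is on the roster that day. This is only a sketch: the daily-checks.ps1 name is made up, while the paths and server name are the same ones as above.
[code]# daily-checks.ps1 -- hypothetical wrapper that runs both checks in one go
$Path = 'C:\Application\Logs\*.*'
$LastModifiedRange = -24

Write-Output '=== Errors in log files ==='
Get-ChildItem -Path $Path |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddHours($LastModifiedRange) } |
    Select-String -Pattern "error"

Write-Output '=== Errors in service tables ==='
sqlcmd -S server001 -E -i "C:\Daily checks\SQL executable.txt"
[/code]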
Is It All That Could Be Done?
Definitely not. As I mentioned at the beginning, the log files are stored on the server. You can establish a connection and run PowerShell scripts on a remote computer. Ideally, you would schedule these tasks to run automatically and send all the collected errors in one email. This is also achievable with PowerShell: it can send email, and it can be scheduled via Windows Task Scheduler (there is a sketch of that idea below). However, neither task was easy to achieve in my case. Business Intelligence administrators don’t belong to the elite IT caste, so they are limited in privileges. Getting permission to have your own (scary, as potentially malicious) scheduled job that executes some code on a customer’s computers just because you want to save your own time is not an easy process. Anyway, I have achieved a significant reduction in the time needed for daily checks, so I’m still happy.
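For those with more privileges, here is a hypothetical sketch of what the remote collection and email step could look like; the server name, email addresses and SMTP host are placeholders, not values from my setup:
[code]# Hypothetical sketch: collect log errors remotely and mail them in one message.
# The server name, addresses and SMTP host below are placeholders.
$errors = Invoke-Command -ComputerName 'server001' -ScriptBlock {
    Get-ChildItem -Path 'C:\Application\Logs\*.*' |
        Where-Object { $_.LastWriteTime -gt (Get-Date).AddHours(-24) } |
        Select-String -Pattern "error"
}

if ($errors) {
    Send-MailMessage -From 'checks@example.com' -To 'bi.team@example.com' `
        -Subject 'Daily check: errors found' `
        -Body ($errors | Out-String) `
        -SmtpServer 'smtp.example.com'
}
[/code]Registering something like this as a daily job in Task Scheduler would complete the picture, but as I said, that is a matter of permissions.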
Kate
Data masseuse