Date: Tue, 9 Jul 2002 11:14:18 -0400
Reply-To: Quentin McMullen <QuentinMcMullen@WESTAT.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: Quentin McMullen <QuentinMcMullen@WESTAT.COM>
Subject: automated log-checking
Content-Type: text/plain; charset="iso-8859-1"
The subject of automated methods to scan a log for errors/warnings/bad notes
seems to come up every month (as in the concurrent thread "Easy Tracking of
Errors"). I thought it might be useful to have a broad conversation about it:
What methods have you used to automate log-checking?
What traps have you fallen into? (As Peter Crawford mentioned to me at
SUGI, a trusted but poorly designed log-scanner is a very dangerous thing.)
What bells-and-whistles have you added that you really like?
I'm thinking about SAS solutions that would be automatically invoked after
a program has been submitted (or as part of that program). But of course
non-SAS solutions are always interesting as well. Here are some of my own
thoughts.
It seems to me that when I review the log-scanning macros I've seen (e.g.
Li and Troxell's NESUG 2001 paper, and posted code by Ron Fehd, Meredith Clark,
Joe DeShon and others, and my own initial attempts), I really like the
approach taken by Li and Troxell. One of the important questions for a
log-scanning utility is what to do with various NOTE: messages. Most of the
code I have seen on SAS-L includes a list of NOTEs that should be included
in an error report. Li and Troxell argue for the opposite approach: give
the macro a list of NOTEs to exclude and report everything else by default.
(Actually, better than that, their macro includes a parameter for
include/exclude, and a parameter to name a dataset listing the notes,
and...). This seems the safer route, especially as I frequently come across
new (to me at least) NOTEs when playing with procedures I haven't used
before, and of course it's not unheard of for the actual text of a NOTE to
change from version to version. In general, it seems much easier to add a
new exclusion than to recognize that you have omitted an inclusion.
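To make the exclude-by-default idea concrete, here is a minimal non-SAS sketch in Python. The exclusion patterns below are my own illustrations of "benign" NOTEs, not the actual list from Li and Troxell's macro; the point is only the shape of the logic: anything that looks like a NOTE, WARNING, or ERROR is reported unless it is explicitly excused.

```python
import re

# Hypothetical exclusion list: NOTEs we have decided are benign.
# Everything else that starts like a log message gets reported.
EXCLUDE = [
    r"^NOTE: .* records were read from the infile",
    r"^NOTE: The data set .* has \d+ observations",
    r"^NOTE: (DATA statement|PROCEDURE \w+) used",
]

def scan_log(lines):
    """Report every NOTE/WARNING/ERROR line not explicitly excluded."""
    flagged = []
    for line in lines:
        if re.match(r"^(NOTE|WARNING|ERROR)\b", line):
            if not any(re.match(p, line) for p in EXCLUDE):
                flagged.append(line)
    return flagged
```

A new or unfamiliar NOTE then shows up in the report by default, and you add an exclusion only after you have looked at it and decided it is harmless.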
As for pitfalls, one that I have fallen into, and I think some others have
as well, was to search for the word "ERROR:" (note the colon as part of the
string), which will miss errors such as:
ERROR 180-322: Statement is not valid or it is used out of proper order.
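A pattern that tolerates the optional message number avoids this trap. The sketch below (my own, in Python rather than SAS) matches both the "ERROR:" and "ERROR 180-322:" forms, and the analogous WARNING forms:

```python
import re

# "ERROR:" alone misses numbered messages like "ERROR 180-322: ...".
# The optional " \d+-\d+" group catches both shapes.
ERROR_RE = re.compile(r"^(ERROR|WARNING)( \d+-\d+)?:")

def is_problem(line):
    """True if the line starts with an ERROR or WARNING message."""
    return bool(ERROR_RE.match(line))
```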
Perhaps this is too general. Here are some more specific questions.
1. How do you "prepare" the log for scanning? That is, do you use altlog
or printto to redirect the log to another file? Or do you run in batch and
then scan programName.log?
2. Once you have the log sitting somewhere, how do you find the
errors/notes? I have seen some code that reads in an entire line, then uses
index() or indexw(). Other code reads in just the first 20 characters or so
of each line. Still other code creates a SAS dataset containing select
lines of the log, which can then be subset via a WHERE statement such as:
Line NOT LIKE "NOTE: % records were read from the infile %."
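The WHERE/LIKE style of subsetting is easy to emulate outside SAS too. Here is a small Python sketch (my own, with an invented helper name) that treats "%" as "any run of characters", mirroring the NOT LIKE condition above:

```python
import re

def like(line, pattern):
    """Emulate a SAS WHERE ... LIKE test: '%' matches any characters."""
    regex = "^" + ".*".join(re.escape(part) for part in pattern.split("%")) + "$"
    return re.match(regex, line) is not None

def suspicious_notes(lines):
    # Keep only NOTE lines, then drop the benign "records were read" NOTEs,
    # mirroring: where Line NOT LIKE "NOTE: % records were read from the infile %."
    notes = [l for l in lines if l[:5] == "NOTE:"]
    return [l for l in notes
            if not like(l, "NOTE: % records were read from the infile %.")]
```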
3. Once you have identified all of the problematic messages, where/how do
you like to report them? I could imagine them being appended to the main
output file, or as a separate file, or emailed, or...
4. How has writing/using an automated log checking program changed your
coding practices? Just having played with this myself, it has made me less
likely to use the word "error" in my code. Instead of coding: IF <bad> then
put "ERROR: something bad", I will code something like: IF <bad> then put
"ER" "ROR: something bad" or in macro, %if &bad %then %put ER%str()ROR:
something bad. Also, although my super-ego knows it's good to avoid
automatic conversions, I find the presence of an error report helps my ego
maintain control over the laziness of my id.
Ok, enough rambling, any thoughts welcome.