Date: Wed, 21 Jul 2010 16:11:43 -0500
Reply-To: Michelle Zunnurain <michelle_zunnurain@HOTMAIL.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: Michelle Zunnurain <michelle_zunnurain@HOTMAIL.COM>
Subject: In search of a more efficient program
Content-Type: text/plain; charset="iso-8859-1"
I am going through the same thing myself: trying to make performance improvements to a large, complex, time-consuming process.
I started out by setting the MPRINTNEST option to make the log easier to trace back to the program, and FULLSTIMER to get more detailed resource statistics in the log. The %LOGPARSE macro was also a nice addition; I wrapped it in loops to summarize my steps and write the results to a spreadsheet.
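A minimal sketch of that setup, assuming the SAS-supplied %LOGPARSE macro has already been compiled (the log path and output dataset name are placeholders, and the parameter names may differ slightly by macro version):

```sas
/* Echo nested macro-generated code in the log so messages trace back
   to the source program, and report full step-level resource stats. */
options mprintnest fullstimer;

/* Parse a saved log into a dataset of per-step timing statistics.
   Paths and dataset names here are placeholders. */
%logparse(/logs/monthly_run.log, work.perf_stats);
```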
%LOGPARSE will show you the stepname, # obs, # vars, real time, user time, CPU time, memory used, and many more fields. It can also be filtered to just stepname='SAS' to get a one-row summary per log. Then run it over multiple time periods, in my case by month, to see the trend.
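For example, the parsed results can be ranked to surface the slowest steps; this is a sketch assuming a work.perf_stats dataset with the field names described above:

```sas
/* Drop the whole-log summary rows (stepname='SAS') and rank the
   remaining steps by descending real time to pick tuning targets. */
proc sql;
  create table work.slowest_steps as
  select stepname, realtime, cputime, memused
  from work.perf_stats
  where stepname ne 'SAS'
  order by realtime desc;
quit;
```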
Next I chose steps to target based on the longest real time, and tested lots of different coding tricks: some found online, some found here, some found in SAS documentation and other papers. I usually compared PROC SQL joins vs. DATA step MERGE vs. hash object joins. If the datasets were too large to work with in a timely manner, I extracted a test batch of records and used it for testing, then proceeded to a full test. I compared the number of records and the number of variables to make sure they matched, then ran a PROC COMPARE to make sure the datasets matched exactly.
If everything was successful and beneficial to that point, I moved the code into production.
The only way to accomplish what you seek is a lot of work, testing, guessing, and running a lot of test code.