Date: Mon, 22 Feb 2010 14:56:20 -0500
Reply-To: Michael Raithel <michaelraithel@WESTAT.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: Michael Raithel <michaelraithel@WESTAT.COM>
Subject: Re: Number of Instructions Used by Datastep
Content-Type: text/plain; charset="us-ascii"
Barry Schwarz posted the following:
> Once you start talking about processing a data set, you introduce a
> whole new set of I/O variables such as throughput, channel
> conflicts/collisions, network performance, etc. These are unrelated
> to processing speeds and instruction counts but will impact run time,
> sometimes dominantly so.
> But getting back to your question, as opposed to your goal, consider
> attacking the problem in reverse. Run a data step with no I/O but
> numerous iterations and time it. Then double the iterations. Then
> triple, etc. Hopefully, you end up with a table that looks like
> K iterations take M seconds
> 2K iterations take M+N seconds
> 3K iterations take M+2N seconds
> From this we infer that K iterations take N seconds and M-N is the
> non-iterating part of the data step (e.g., start up and shut down).
> One iteration would take N/K seconds and after multiplying by your
> MIPS rate you have the number of instructions.
> Then add something to the loop and retest. The difference between the
> two values can be attributed to the new processing.
> But you have to do all this while the system is dedicated to your test
> and not performing any other processing, usually impossible if the
> code is executing on a server.
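Barry's calibration procedure above can be sketched in a few lines. This is a hypothetical Python stand-in for the SAS data step (the loop body, the iteration count K, and the use of `time.process_time` in place of the SAS log's CPU time are all assumptions for illustration, not Barry's actual code):

```python
import time

def run_loop(iterations):
    # Stand-in for a DATA _NULL_ step: a pure-CPU loop with no I/O.
    start = time.process_time()  # CPU time only, like the log's CPU time
    total = 0
    for i in range(iterations):
        total += i * i  # the loop "body" being measured
    return time.process_time() - start

K = 1_000_000
t1 = run_loop(K)       # ~ M seconds:     startup + K iterations
t2 = run_loop(2 * K)   # ~ M + N seconds: startup + 2K iterations
t3 = run_loop(3 * K)   # ~ M + 2N seconds

# Each extra K iterations costs ~N seconds; average the two deltas
# to smooth out measurement noise.
n = ((t2 - t1) + (t3 - t2)) / 2
per_iteration = n / K  # seconds for one iteration

# With a (hypothetical) machine rated at MIPS_RATE million
# instructions per second, one iteration executes roughly:
MIPS_RATE = 1000
instructions = per_iteration * MIPS_RATE * 1_000_000
```

Adding a statement to the loop body and re-running gives a second `per_iteration`; the difference between the two is the cost attributable to the new processing, exactly as described above.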
Barry, great suggestions and good line of attack on this "problem"!
Just so that the OP doesn't get confused, he should specify:
...at the beginning of his program to capture both the User CPU Time and the System CPU Time. Paul should add those two values together and use them in the calculation you describe above to arrive at the number of instructions expended. He should not use Real Time, as that is confounded by the I/Os you mention, by delays from higher-priority tasks, and by other possible factors.
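A minimal sketch of the statement in question, assuming (from the context of "User CPU Time and System CPU Time" in the log) that it was the FULLSTIMER system option:

```sas
/* Assumed reconstruction: FULLSTIMER makes the SAS log report   */
/* User CPU Time and System CPU Time separately for each step.   */
OPTIONS FULLSTIMER;
```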
Barry, best of luck in all your SAS endeavors!
I hope that this suggestion proves helpful now, and in the future!
Of course, all of these opinions and insights are my own, and do not reflect those of my organization or my associates. All SAS code and/or methodologies specified in this posting are for illustrative purposes only and no warranty is stated or implied as to their accuracy or applicability. People deciding to use information in this posting do so at their own risk.
Michael A. Raithel
"The man who wrote the book on performance"
Author: Tuning SAS Applications in the MVS Environment
Author: Tuning SAS Applications in the OS/390 and z/OS Environments, Second Edition
Author: The Complete Guide to SAS Indexes
I know that you believe you understand what you think I said,
but I'm not sure you realize that what you heard is not what
I meant. - Robert McCloskey