While I have previously only responded to this thread with some memories of
past processing, I also remember the excitement when compiling BASIC
programs sped up processing so dramatically.
Since many of us apparently run rather large, repetitive analyses in
environments that are not especially well designed, couldn't at least some
SAS processing be sped up if SAS included a compiler that produced
compiled code, yielding a program capable of doing only what was
programmed?
I, for one, would welcome having such limited but fast routines.
"Magnus Mengelbier" <magnus.mengelbier@FERRING.COM> wrote in message
> Hello Mauro and SAS-L:ers
> Hard to sit still in this discussion... I believe everyone hits on
> important points, but I also feel the need for speed has truly been
> overrated in the one-box-does-it-all approach.
> If I am putting together a summary table for a report, 30 seconds or maybe 3
> minutes may not seem like much in perspective. I can see where a few
> percentage points, or fractions of one, in performance gain can be night and day
> in a transaction-based system like yours, Mauro. I also strongly
> believe that if you feel SAS is slow on your overall system
> (including OS and SAS and database and ... I think you get the point), you
> might want to reconsider the overall design of the system, the system
> components (OS, database, etc.), and especially what is performed when, by
> whom, how, and where.
> My experience is that a lot of companies, and please correct me if I am
> wrong, are capturing a lot of information from almost every source
> conceivable, storing it in a central place like a data warehouse, and
> processing/summarizing it in the same place. That can be an inconceivable
> amount of data to manage and process ... and I hear your point about the
> sorts of benchmarking needed.
> Where is all the knowledge on optimizing input/output flows, algorithms,
> storage, and the "guru" areas from just a few years back? The stuff that
> gave people the apt description "the right stuff" or "steely-eyed missile
> man" (a certain individual in Apollo 13 mission control; I cannot remember
> his name).
> I am afraid of star schemas, and of executives mentioning the
> term "enterprise" or "data warehouse", or both in the same sentence. I
> have seen some hideous data warehouses, and some even worse, when I was
> with previous employers. A few projects that I am hearing of are now
> experimenting with pre-processing: processing the details prior to submitting
> them to a central store. A primary reason is to win back a lot of performance
> by simply minimizing the volume of data they have to wade through each
> time. And the amount of data can be drastically less.
> IMHO Mauro, and for anyone else interested, I would strongly recommend you
> consider how and when you do things... and especially where...
> A very wise man once asked me (courtesy of my mentor at the Temple University
> stat dept):
> "Say a processing job takes 6 months (my note: meaning writing the
> specification, programming, documentation, validation, and finally running
> the job), and you know that in 6 months a bigger, faster computer and
> programs will do exactly what you want and take only 2 weeks to complete.
> Would you write the code, or wait 6 months and buy it off the shelf?"
> For some reason, I believe it is even more applicable today than it was a
> little less than a decade ago.
> A little more than my 2 cents maybe...
> On Fri, 4 Oct 2002 00:56:24 GMT, Mauro Morandin <my_family_name@LIBERO.IT> wrote:
> >Hi there,
> >thanks for the many, many ideas and thoughts.
> >The topic is really interesting and far too broad to be covered
> >in an email or two. I surely feel the need to quantify how fast
> >SAS is compared to other languages, but I want to do it on real
> >problems. So, I really don't understand how you could be so enthusiastic
> >about Dorfman running a useless program and showing everyone
> >that SAS can read a 100MB file in less than a second. Everyone was like
> >"Hurray, SAS is really fast ..." ... at what? Reading a file into
> >its input buffer and throwing it away. So what now?
> >I already hear you: "But that's what you told us to do?"
> >But does it make sense just to read it? To test how fast the interpreter
> >is, YOU HAVE TO USE THE INTERPRETER (... AS MUCH AS YOU CAN, WITH
> >DIFFERENT INSTRUCTIONS AND LOOPS). This makes sense to me. And then do
> >the same thing with other languages. This not only makes sure that you
> >USE the interpreter with as many different instructions as possible, but
> >also makes sure that YOU don't run into some I/O bottleneck, which would
> >of course skew your results, because you don't want to measure your
> >hard disk/memory speed but the speed of your SAS interpreter.
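The kind of CPU-bound benchmark Mauro describes could be sketched in Python like this (a minimal illustration; the loop body and iteration count are arbitrary choices, not anything from this thread):

```python
import time

def cpu_bound_benchmark(n=1_000_000):
    """Exercise the interpreter with arithmetic, string conversion, and
    branching, so the timing reflects interpreter speed rather than disk
    or memory throughput."""
    start = time.perf_counter()
    total = 0
    for i in range(n):
        if i % 2 == 0:
            total += i * 3          # arithmetic path
        else:
            total -= len(str(i))    # string-conversion path
    elapsed = time.perf_counter() - start
    return total, elapsed

result, seconds = cpu_bound_benchmark()
print(f"result={result}, elapsed={seconds:.3f}s")
```

Running the equivalent loop in each language under test, with no file I/O at all, is what isolates the interpreter from the disk.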
> >To all the people who ask: "Why should someone spend
> >their time writing a program that runs a few seconds faster?"
> >I answer: "Because this is just a 100Mbyte test program. You see what
> >happens if you have a 100 Gbyte DW? These 2 seconds could become 24
> >hours. And if you're 24 hours late with your reports they could be
> >That said, let me explain why I sometimes feel disappointed with the
> >performance of SAS. I'm now a freelance SAS consultant. I have been a
> >SAS employee some years ago, for several years. I don't like people not
> >being honest about SAS. And saying that SAS is a compiled language is
> >not honest, because it makes other people (mostly managers) believe that
> >a SAS program is as fast as a program written in C.
> >I have seen SAS "go really fast" with PROC SORT and PROC MEANS. Really
> >fast for me means hitting the I/O bandwidth limit, which can be around
> >50-100 Mbyte/s for a server PC/UNIX with 4 disks in RAID0. This is
> >enough for a lot of application domains, so I don't feel the need to
> >look for something to speed things up a bit. But SAS is not only PROC's.
> >The problems I have to deal with are mostly DW problems, like building
> >fact tables and dimensions with surrogate keys and a lot of computed
> >variables. The fact tables are big beasts and I find myself looking at
> >the performance monitor on AIX to see what SAS does. I look at the SAS
> >log .... hmmmm ... data step ..... I look at the monitor .... less than
> >10Mbyte/s .... then ... proc sql ... hmmmm ... 6 tables star schema join
> >.... hmmm .... monitor says .... 5-8Mbyte/s.
> >My figures on SAS performance on AIX RISC6000 S85 are:
> >PROC SORT (900Mbyte) in 2:00 (2 minutes)
> >DATA STEP (just a SET statement) (900Mbyte) in 0:20 (20 seconds)
> >These are good figures, but I can't build a DW only with PROC SORT's and
> >SET statements.
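For reference, those quoted figures imply roughly 7.5 MB/s for the sort and 45 MB/s for the SET step, a quick arithmetic check:

```python
def throughput_mb_per_s(megabytes, seconds):
    """Sustained throughput implied by a data size and an elapsed time."""
    return megabytes / seconds

# PROC SORT: 900 MB in 2:00 (120 s); DATA step with SET: 900 MB in 0:20 (20 s)
print(throughput_mb_per_s(900, 120))  # 7.5
print(throughput_mb_per_s(900, 20))   # 45.0
```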
> >I can't show you the code, but you can be pretty sure that I know all
> >the tricks for writing tight SAS code. Moreover, we have five SAS
> >programmers on the project who review each other's code.
> >I love SAS, because it makes my life much easier. It is such a powerful
> >framework. But sometimes I feel the need to go faster than that, and I
> >don't like hearing people say that SAS is compiled.
> >The last thing I did, last Wednesday, was write a SAS program (a data
> >step) to split a 660 Mbyte XML file into pieces of 100,000 records each.
> >I can't show you the code, because I don't own the copyright (I'm just
> >the author), but I can surely rewrite the program in Python. This is the
> >first thought I had when I saw the disappointing 1.5-2 Mbyte/s of SAS
> >throughput (on AIX). I wrote a similar program in Python some months
> >ago, and it did more than 5 Mbyte/s (on my laptop).
> >Anyway, I will surely send a copy of my python program to SAS-L. By the
> >way: With Python I have the choice to rewrite part of the code in C if I
> >need more speed.
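Since Mauro cannot share his code, here is only a sketch of the general record-splitting approach he describes, streaming a record-oriented file into fixed-size chunks. The `</record>` closing tag and the chunk size are assumptions for illustration, not details from his program:

```python
def split_records(lines, records_per_chunk=100_000, end_tag="</record>"):
    """Stream lines and yield lists of lines, each list holding up to
    records_per_chunk complete records. A record is assumed to end on
    any line containing end_tag."""
    chunk, count = [], 0
    for line in lines:
        chunk.append(line)
        if end_tag in line:
            count += 1
            if count == records_per_chunk:
                yield chunk
                chunk, count = [], 0
    if chunk:          # flush the final, possibly short, chunk
        yield chunk

# Usage sketch: write each chunk to its own part file.
# with open("big.xml") as src:
#     for i, chunk in enumerate(split_records(src)):
#         with open(f"part_{i:04d}.xml", "w") as out:
#             out.writelines(chunk)
```

Because the input is consumed line by line, memory use stays bounded regardless of the total file size, which is the point when the file is 660 MB.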
> >I suppose you also never saw a SAS project reduce its scope because
> >SAS + HARDWARE + software requirements were not chosen appropriately. So
> >where are you living ... in HARDWARE VALHALLA? :-)
> >mauro morandin
> >SAS consultant
> >red hat certified engineer