Date: Thu, 27 Jan 2005 19:23:10 +0000
Reply-To: toby dunn <tobydunn@HOTMAIL.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: toby dunn <tobydunn@HOTMAIL.COM>
Subject: Re: SAS Merge
Content-Type: text/plain; format=flowed
Well, if memory isn't a problem, load the files into memory via SASFILE. Otherwise,
views may be the best route, or at the very least multiple smaller merges, but those
will take a lot of time.
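A minimal sketch of both ideas, assuming two hypothetical datasets WORK.ONE and WORK.TWO already sorted by a key variable ID (the dataset and variable names are placeholders, not from the original post):

```sas
/* SASFILE pins a dataset in memory so subsequent reads avoid disk I/O. */
sasfile work.one load;

data merged;
   merge work.one work.two;
   by id;
run;

sasfile work.one close;

/* Alternatively, a DATA step view defers execution until the view is   */
/* read, so no intermediate file is written to the work library.        */
data merged_v / view=merged_v;
   merge work.one work.two;
   by id;
run;
```

The view helps most when the merged result is consumed once by a downstream step, since nothing is materialized in WORK.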
Have you tried using a hiperspace?
Using a hash (associative array) would be pretty darn fast compared to not
using one here, I think; Paul Dorfman or Richard D. would be the best to
advise on that end.
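A sketch of the hash-object lookup, assuming a small lookup table WORK.SMALL (key ID, payload variable X) joined against a large table WORK.BIG -- all names here are illustrative, not from the original post. The point is that the small table is loaded into memory once, so the big table needs no sort:

```sas
data out;
   if _n_ = 1 then do;
      if 0 then set work.small;          /* define hash data vars in the PDV */
      declare hash h(dataset: 'work.small');
      h.defineKey('id');
      h.defineData('x');                 /* variable(s) to retrieve */
      h.defineDone();
   end;
   set work.big;
   if h.find() = 0 then output;          /* rc 0 means the key was found */
run;
```

Memory has to hold the whole lookup table, so this works when at least one side of the join is small enough to fit.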
And finally, have you tried using SQL to get what you want? At the very least it will
use a hash join behind the scenes.
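The equivalent PROC SQL join, using the same hypothetical table and variable names as above (the optimizer may choose a hash join on its own when one table is small):

```sas
proc sql;
   create table merged as
   select a.*, b.x
   from work.big   as a
        inner join
        work.small as b
        on a.id = b.id;
quit;
```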
From: Amrita Singh <AmritaS2@AOL.COM>
Reply-To: Amrita Singh <AmritaS2@AOL.COM>
Subject: SAS Merge
Date: Thu, 27 Jan 2005 13:58:45 -0500
I am trying to run a big merge. I have about 20 files with varying record
lengths and record counts. Some files have a record length of only 20 bytes,
whereas other files are over 1000 bytes wide. Record counts vary from 5
million to 140 million. The final output file will be over 5000 bytes wide
and will have 140 million records. This process has to run weekly. I am
running SAS 9 on the mainframe.
I did run a test with just 6 of the files, but the job took too much CPU
time, too much work space, and ran for 13.5 hours, which is too long. I ran
the same merge using views, and that helped with the work space. The job
also ran in 7.5 hours, which is better but still too long, and it still takes
over 1 hour of CPU time, which is not acceptable in our production
environment, especially on a weekly basis.