Date: Mon, 9 Nov 1998 20:41:39 +0000
Reply-To: Peter Crawford <Peter@CRAWFORDSOFTWARE.DEMON.CO.UK>
Sender: "SAS(r) Discussion" <SAS-L@UGA.CC.UGA.EDU>
From: Peter Crawford <Peter@CRAWFORDSOFTWARE.DEMON.CO.UK>
Subject: Re: help with reading a large data set
There is one extra piece of information to consider before deciding on your
optimal solution.
Is the source changing more often than the queries are made?
If the source keeps changing,
then make use of any index in the original storage form;
by that I mean query through the DB2, Oracle, or Informix access.
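To make that concrete, here is a minimal sketch of SQL pass-through, which hands the WHERE clause to the database so its own index does the filtering. The database name, table name, account column, and value are all made up for illustration:

```sas
/* Sketch only: mydb, master_table, acct_no and the value are assumptions. */
proc sql;
  connect to db2 (database=mydb);
  create table work.subset as
  select * from connection to db2
    ( select *
      from master_table
      where acct_no = '12345678'   /* DB2 can use its own index here */
    );
  disconnect from db2;
quit;
```

Only the 200-odd matching rows cross into SAS, instead of all 80K+.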
If there are only a few queries to deal with between updates of the data,
then read all the (likely) data into SAS, creating indexes for the most
popular search criteria in the inputting data step:
data sasuser.mostofit( keep= ...............
                       index=( .............. ) ) ;
  infile ' your external data ' ............. ;
  input ............ ;
run;
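Once the dataset carries an index, a WHERE= dataset option on a later read can use it instead of scanning every observation. A minimal sketch, assuming acct_no was one of the variables named in the index=( ) list above (the variable name and value are illustrative):

```sas
/* Sketch only: acct_no and '12345678' are assumptions. */
data wanted;
  set sasuser.mostofit( where=( acct_no = '12345678' ) );
run;
```

SAS decides whether the index is worth using; for ~200 hits out of 80K+ observations, it normally is.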
Unfortunately, SQL won't be able to help you much with loading from
external files that aren't already in databases.
In article <email@example.com>, Self, Karsten wrote:
>Indexing the dataset by selection criteria should help.
>Not writing out every record when you only want to keep a subset should
>help too.
>You can apply a WHERE= dataset option to the input dataset to streamline
>the read.
>An SQL join between the master list and a list of the key values being
>sought is often efficient:
> create index key on big(key);
> select * from big, small
> where big.key eq small.key;
>Karsten M. Self (Karsten.Self@schwab.com)
> What part of "gestalt" don't you understand?
>> From: Tim Pi[SMTP:timpi@FMRCO.COM]
>> Reply To: timpi@FMRCO.COM
>> Sent: Friday, November 06, 1998 10:02 AM
>> To: SAS-L@UGA.CC.UGA.EDU
>> Subject: help with reading a large data set
>> Hi SAS users,
>> I have a large dataset with 80K+ records. However, I only need fewer
>> than 200 records each time I access it. I use an account number to
>> decide which records are to be kept. It takes a long time to read all
>> 80K+ records and then keep only those I need.
>> Any advice??
>> Tim Pi