Date: Tue, 5 Jun 2001 15:03:58 -0500
Reply-To: "Foy, Thomas M." <foytho@PARKNICOLLET.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: "Foy, Thomas M." <foytho@PARKNICOLLET.COM>
Subject: Re: Windows 95 Limitations and Very Large Files
Content-Type: text/plain; charset="iso-8859-1"
Thanks Mark. It worked perfectly. I had a feeling that it would be
something simple like that.
From: Terjeson, Mark [mailto:TerjeMW@dshs.wa.gov]
Sent: Tuesday, June 05, 2001 2:27 PM
To: 'Foy, Thomas M.'; SAS-L@LISTSERV.UGA.EDU
Subject: RE: Windows 95 Limitations and Very Large Files
Check out your online documentation for the INFILE statement,
and zero in on the FIRSTOBS= and OBS= options.
infile file-specification firstobs=100 obs=200;
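
A minimal sketch of one chunked read using those options (the file path and variable list are placeholders, not from the original posts). Note that OBS= names the number of the *last* record to read, not a count, so FIRSTOBS=100 OBS=200 reads records 100 through 200 inclusive:

data chunk1;
   /* read only raw records 100 through 200 */
   infile 'c:\data\bigfile.txt' firstobs=100 obs=200;
   input var1 var2 var3;   /* placeholder variable list */
run;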
Hope this is helpful,
Washington State Department of Social and Health Services
Division of Research and Data Analysis (RDA)
From: Foy, Thomas M. [mailto:foytho@PARKNICOLLET.COM]
Sent: Tuesday, June 05, 2001 12:24 PM
Subject: Windows 95 Limitations and Very Large Files
I know I've read postings dealing with this topic before, but I have been
unable to find them in the archives.
The issue at hand is this: I am using SAS V8.1 on Windows 95 with plenty of
RAM and storage space. I am attempting to read a text file that is 519 MB in
size. There are, I think, 1,044,596 records and 269 variables. The SAS
data set created by just running one data step to read the data is 3.18 GB,
far beyond the 2 GB limit for files in Windoze 95.
My question is this: Is there a way to read in a limited number of
observations from the raw data in a data step, without first having to read
in the entire input data set?
What I would like to do, if possible, is read in smaller chunks of data and
work with data sets smaller than 2 GB. Since NT is not in my near future, I
need to find a way around this limitation. If I can't, I might be _screwed_.
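
One hypothetical way to script that chunking, using the documented FIRSTOBS= and OBS= options of the INFILE statement in a macro %DO loop (macro name, file path, chunk size, and variable list are all illustrative assumptions, not from the thread):

/* Sketch: read the raw file in chunks of 100,000 records each, */
/* so every resulting data set stays well under the 2 GB limit. */
%macro readchunks(rawfile=, nchunks=11, chunksize=100000);
   %do i = 1 %to &nchunks;
      %let first = %eval((&i - 1) * &chunksize + 1);
      %let last  = %eval(&i * &chunksize);
      data chunk&i;
         /* OBS= is the last record number to read, not a count */
         infile "&rawfile" firstobs=&first obs=&last;
         input var1 var2 var3;   /* placeholder variable list */
      run;
   %end;
%mend readchunks;

%readchunks(rawfile=c:\data\bigfile.txt)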
Any assistance anyone could lend will be greatly appreciated.
Park Nicollet Institute
Health Research Center
Minneapolis, Minnesota 55416