LISTSERV at the University of Georgia
Date:         Fri, 10 Feb 2012 11:46:14 -0700
Reply-To:     Savian <savian.net@GMAIL.COM>
Sender:       "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From:         Savian <savian.net@GMAIL.COM>
Subject:      Re: FW: Desktops with lots of RAM and SASFILE limits
Comments: To: Joe Matise <snoopy369@gmail.com>
In-Reply-To:  <CAM+YpE_AmCaUKtms45trkZSgZ4DeJ06CJTzy6UiTVECjcGdLpQ@mail.gmail.com>
Content-Type: text/plain; charset="us-ascii"

Joe,

Let me look into the SASFILE issue. However, consider skipping the SAS-specific approach and going with an inexpensive RAM disk solution. That way you can use it for things outside of SAS as well.
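
As a rough sketch (the R: drive letter here is just a placeholder for wherever the RAM disk mounts), SAS can use a RAM disk like any other path:

   libname fast "R:\sasdata";               /* library sitting on the RAM disk */
   /* or point WORK there at startup:  sas.exe -work "R:\saswork"              */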

As a final note, here is a consumer-level motherboard:

http://www.newegg.com/Product/Product.aspx?Item=N82E16813188070&Tpk=EVGA%20Classified%20SR-2 (EVGA Classified SR-2)

It can support:

- Dual Xeon CPUs, 12 cores total

- 48GB of DDR3 RAM

- Dual 1Gb network connections

- 2 SATA 6Gb/s ports plus 6 SATA 3Gb/s ports

You could mirror two 3TB disks in RAID 1 and add six more 3TB disks in a RAID 10 configuration, making for a system with roughly 12TB of usable storage. That is at a consumer level.

It will cost a bit of money, though ;-] Probably a $7k machine.

Regardless, amazing stuff.

Thanks,

Savian

(719) 687-5954

From: Joe Matise [mailto:snoopy369@gmail.com]
Sent: Friday, February 10, 2012 11:30 AM
To: Savian
Cc: SAS-L@listserv.uga.edu
Subject: Re: FW: Desktops with lots of RAM and SASFILE limits

Yeah, I didn't think a PCI-E disk was within the realm of possibility ;)

I'm on WIN7-64.

-Joe

On Fri, Feb 10, 2012 at 12:24 PM, Savian <savian.net@gmail.com> wrote:

Joe,

Hash a lot: hash early, hash often. Queries can be complex in a hash object. Also consider using views with hashes or views of hashes.
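
As a rough sketch of the view-plus-hash idea (the dataset and variable names here are made up), a SQL view can feed a hash object directly:

   proc sql;
      create view work.custview as
         select id, sum(amount) as total
         from work.transactions
         group by id;
   quit;

   data scored;
      if 0 then set work.custview;                   /* compile-time: pick up variable attributes */
      if _n_ = 1 then do;
         declare hash h(dataset: 'work.custview');   /* hash loads from the view */
         h.definekey('id');
         h.definedata('total');
         h.definedone();
      end;
      set work.accounts;
      if h.find() ne 0 then total = 0;               /* no match: default to zero */
   run;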

What O/S are you on?

Keep in mind that a PCI-e SSD is a WHOLE different product from a regular SSD. Think in terms of 10-20x the price of a consumer SSD.

A PCI-e SSD array recently set a world record of one billion IOPS.

http://www.theregister.co.uk/2012/01/06/fusion_billion_iops/

Here is a picture of the OCZ R5, a similar drive:

http://www.storagereview.com/ocz_zdrive_r5_kilimanjaro_platform_announced

Consider playing with one of the ram disk technologies as well:

http://memory.dataram.com/products-and-services/software/ramdisk

Thanks,

Savian

(719) 687-5954

From: Joe Matise [mailto:snoopy369@gmail.com]
Sent: Friday, February 10, 2012 10:45 AM
To: Savian
Cc: SAS-L@listserv.uga.edu
Subject: Re: FW: Desktops with lots of RAM and SASFILE limits

Thanks :) I do use hash objects regularly, though probably not as often as I ought to. Mostly I'm using it here for queries that wouldn't easily work with hash objects.

I did think about asking for an SSD, but for some reason it's easier to get large quantities of RAM approved by IT than an SSD. Go figure.

-Joe

On Fri, Feb 10, 2012 at 9:05 AM, Savian <savian.net@gmail.com> wrote:

Consider using hash objects. They are a great way to use the memory. Any of Paul Dorfman's papers will serve you well.
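
A minimal sketch of the basic pattern (table and variable names here are invented): load the smaller table into a hash and look up each row of the big one in memory:

   data matched;
      if 0 then set work.lookup;                  /* compile-time: pick up variable attributes */
      if _n_ = 1 then do;
         declare hash h(dataset: 'work.lookup');  /* small table held in memory */
         h.definekey('id');
         h.definedata('category');
         h.definedone();
      end;
      set work.big;
      if h.find() = 0;                            /* keep only rows with a match */
   run;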

Also, what O/S are you running? I think this may be where your issue lies, but it is just a hunch.

Off-topic a bit:

32GB of desktop memory (4x8GB DDR3) is around $300 on NewEgg, and a lot of motherboards now support it. Capacities of 16GB and above are becoming a lot more common.

The exciting area for SAS datasets is the use of a PCI-E SSD. Prices are still insane, but that is where a lot of bang for the buck can be had. Look at the RevoDrive, the R5, and Fusion-io. If money is no object ;-] that's where I would spend it.

Thanks, Alan

Alan Churchill
719-687-5954 (Work)
719-310-4870 (Cell)

-----Original Message-----
From: Joe Matise [mailto:snoopy369@GMAIL.COM]
Sent: Thursday, February 09, 2012 8:42 AM
Subject: Desktops with lots of RAM and SASFILE limits

Hi folks, I just set up a desktop with 16GB RAM, and am trying to figure out the best ways to use it. I mostly process large datafiles, in the 1GB-10GB range. One of my main reasons for going with 16GB RAM was to be able to use SASFILE to store some of these large datasets in memory (and open) while running many SQL queries against them (and frequently PROC TABULATE/MEANS as well).
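
For reference, the pattern I have in mind looks roughly like this (library and dataset names are placeholders):

   sasfile mylib.big load;          /* pull the data set into memory */

   proc sql;
      select count(*) from mylib.big where flag = 1;
   quit;

   proc means data=mylib.big;
      var amount;
   run;

   sasfile mylib.big close;         /* release the memory */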

I've managed to change the settings to get MEMLIB to work pretty well (I can stick a 12GB dataset in there - I decided that was the most RAM I would give it and safely have enough for Windows to behave normally, plus maintain the ability to have some lower-memory-limit SAS sessions). It definitely speeds up things like sorts, although I don't see a lot of improvement on PROCs (I suppose because read access is pretty fast, particularly with 64kb block sizes).
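
Roughly, the MEMLIB setup looks like this (the path and sizes here are just illustrative): MEMMAXSZ is set at startup and the library gets the MEMLIB option:

   /* startup (sasv9.cfg or command line) */
   -memmaxsz 12G                             /* ceiling on memory MEMLIB/MEMCACHE may use */

   /* session code */
   libname inmem "C:\memdata" memlib;        /* library pages held in memory */
   proc sort data=inmem.big out=inmem.big_s; by id; run;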

What I haven't been able to do is get SASFILE to behave. I set MEMSIZE to 12G, as well as MEMMAXSZ (for MEMLIB); LOADMEMSIZE and REALMEMSIZE are 0. However, I can't seem to get more than ~1.2GB into a SASFILE. If I try to load anything over that, I get one of two messages: either a warning that I ran out of memory, or an error that the dataset is damaged and I/O processing did not complete. The dataset is not actually damaged (I can view it, run PROC MEANS on it, etc., with no problem), and datasets of 1GB or so load without issue.

Is there a setting other than MEMSIZE that controls SASFILE? All of the documentation and papers I've read suggest MEMSIZE is the limiting factor, but most of them really just say "the memory available to SAS," which is not that helpful. I definitely have plenty of memory available, as demonstrated by MEMLIB working as expected, and I've been testing SASFILE with the MEMLIB library cleared to make sure it's not the issue.
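
For what it's worth, the relevant startup options and a quick way to verify them in the session (the values shown are just what I described above):

   /* startup (sasv9.cfg or command line) */
   -memsize 12G
   -memmaxsz 12G
   -loadmemsize 0
   -realmemsize 0

   /* verify inside the session */
   proc options group=memory; run;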

Any thoughts? Also, any suggestions for how to optimize my use of memory? I've increased SORTSIZE to 1/3 of MEMSIZE and increased the block size to 64K, which seems to speed up even normal disk operations significantly. I've read a few papers on optimizing SAS for performance and tuning, but I haven't really seen anything on tuning a desktop SAS install with large amounts of memory (I suppose it's only recently that a desktop might have this much RAM). If there are any SUG papers out there that my googling didn't turn up, I'd love to read them.
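
Concretely, the tuning I've applied so far amounts to just this (values as described above):

   options sortsize=4G;       /* roughly 1/3 of MEMSIZE */
   options bufsize=64k;       /* larger page size for newly created data sets */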

Thanks!

-Joe

