SQL Server – Recommended hardware configuration for Analysis and Reporting Services

Tags: sql-server-2008, ssrs

What would be a good way to calculate the recommended minimum hardware configuration for a Windows Server 2008 R2 64-bit machine that will be running SQL Server 2008 R2 Analysis and Reporting Services? The database server is on another machine.

The current configuration is:

  • Virtual machine running on a vmware7 host
  • 2 vCPUs
  • 4 GB of RAM
  • 80 GB of hard disk space

4 GB of RAM appears to be insufficient.

Does no one have an answer to this question? Perhaps I need to provide more information. Is there a white paper on minimum requirements for SSAS and SSRS?

What started this question was essentially the customer saying, "Hey, the RAM on my machine is all used up." I checked the processes and noticed owstimer.exe. A quick search suggested that owstimer.exe is a SharePoint component. I say that with some doubt because I have not verified that information.

So I have a Windows Server 2008 R2 64-bit machine with 4 GB of RAM and 4 vCPUs, running SSAS and SSRS. Among the processes I find one called owstimer.exe, which might belong to SharePoint. I read somewhere that SharePoint should be installed on a system with a minimum of 8 GB of RAM:

http://technet.microsoft.com/en-us/library/cc262485.aspx

But SharePoint isn't on this machine, just SSAS and SSRS.

The databases for this client are stored on a dedicated SQL Server 2008 R2 server.

More information: I can limit the memory used by SSRS and SSAS via their configuration files.

For SSRS

Control SSRS according to memory pressure:

http://msdn.microsoft.com/en-us/library/ms159206.aspx

In C:\Program Files\Microsoft SQL Server\MSRS10.MSSQLSERVER\Reporting Services\ReportServer\rsreportserver.config:

<MemorySafetyMargin>80</MemorySafetyMargin> -- percentage of memory reserved for the system

<MemoryThreshold>90</MemoryThreshold> -- if report server memory use reaches this value, the report server slows down request processing and changes the amount of memory allocated to different server applications

Min and max memory control in SSRS:

<WorkingSetMaximum>2097152</WorkingSetMaximum>
<WorkingSetMinimum>1048576</WorkingSetMinimum>

Both values are specified in kilobytes, so the settings above cap the report server's working set between 1 GB and 2 GB.
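
SSAS has its own equivalent: the memory limits in msmdsrv.ini (also exposed as server properties in Management Studio). As a minimal sketch - Memory\LowMemoryLimit and Memory\TotalMemoryLimit are real SSAS server properties, and the values shown are the SQL Server 2008 defaults rather than a recommendation:

<Memory>
    <!-- Values of 100 or less are interpreted as a percentage of total physical memory -->
    <LowMemoryLimit>65</LowMemoryLimit>     <!-- below this, SSAS does not trim its caches -->
    <TotalMemoryLimit>80</TotalMemoryLimit> <!-- above this, SSAS evicts memory aggressively -->
</Memory>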

So we're going to try this out on SSRS and see how it goes.

Best Answer

Intuitively, if I were doing an OLAP solution for a retail chain, I'd say your infrastructure is really inappropriate for a system with substantial data volumes. This sort of kit struggles with the data volumes you get in insurance, which are probably a couple of orders of magnitude smaller than I would expect to see in retail.

As gbn states in the comments, SSRS is just a web application. You can set up a farm - start with one server with a few GB of RAM and a couple of virtual CPUs. Monitor it, and expand it if it's overloaded.

The amount of disk space used by SSAS depends on the data volume and the aggregations you put on the cubes. The storage format is quite compact - more so than SQL Server's - but with large volumes of data it will still get quite big. If it's getting into the 100+ GB range then you should also look at partitioning.

A surprisingly applicable generic solution

Now, your client probably doesn't want to hear this, but VMs are not my recommended hardware configuration for any business intelligence solution. They work OK for transactional applications or anything that is not too computationally intensive, but BI for a retail chain is probably a bit aggressive for this sort of infrastructure.

As a generic 'starter for 10', my recommended configuration is a bare-metal 2-4 socket server such as an HP DL380 or DL580 with direct-attached SAS storage. The principal reason is that a machine of this sort is by far the best bang for buck as a BI platform and has a relatively modest entry price. If you put an HBA in the machine then you can mount a LUN off the SAN for backups.

IOPS for IOPS, this sort of kit is an order of magnitude cheaper than any SAN-based virtual solution, particularly on sequential workloads like BI. The entry level for a setup of this configuration is peanuts - maybe £10-20,000 - and it's a lot cheaper and easier to get performance out of something like this than out of a VM-based solution. To a first approximation, the only situation where this kit is inappropriate for BI work is when the data volumes get too large for it and you need something like Teradata or Netezza.

What can you do with your VMs, though?

  • A good start would be more RAM - try 32 GB. SSAS is a biblical memory hog, and you've got slow infrastructure. If you're stuck with a VM, put as much RAM on it as possible, because disk access is slow on VMs.

  • A second avenue is to partition the cube over multiple LUNs, where the LUNs are on separate physical arrays. Not all SANs will give you this much control, though. 80 GB is getting into the range where you might get a benefit from partitioning, particularly if you've got a platform with slow I/O (see the XMLA sketch after this list).

  • Tune the buggery out of your cube aggregations - try usage-based optimisation. The more hits you can get from aggregations, the more efficient the server will be (the query-log settings that feed usage-based optimisation are sketched after this list too).
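
The per-LUN partitioning suggested above can be scripted in XMLA. This is a minimal sketch, not a drop-in script: the RetailDW database, Sales cube, FactSales measure group, query, and G: path are all hypothetical, and the StorageLocation folder must be listed in the server's AllowedBrowsingFolders property before SSAS will accept it.

<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <ParentObject>
    <DatabaseID>RetailDW</DatabaseID>
    <CubeID>Sales</CubeID>
    <MeasureGroupID>FactSales</MeasureGroupID>
  </ParentObject>
  <ObjectDefinition>
    <Partition>
      <ID>FactSales2010</ID>
      <Name>FactSales2010</Name>
      <Source xsi:type="QueryBinding">
        <DataSourceID>RetailDW</DataSourceID>
        <QueryDefinition>
          SELECT * FROM dbo.FactSales
          WHERE SaleDateKey BETWEEN 20100101 AND 20101231
        </QueryDefinition>
      </Source>
      <StorageMode>Molap</StorageMode>
      <!-- Point this partition's data files at a folder on a separate LUN -->
      <StorageLocation>G:\OLAPData\FactSales2010</StorageLocation>
    </Partition>
  </ObjectDefinition>
</Create>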
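
Usage-based optimisation needs the query log switched on first, which is another msmdsrv.ini job. The Log\QueryLog properties below are real server properties; the connection string and catalog name are hypothetical. Once the log has captured a representative workload, run the Usage-Based Optimization Wizard against it.

<Log>
    <QueryLog>
        <!-- Hypothetical connection string; any SQL Server database will do -->
        <QueryLogConnectionString>Provider=SQLNCLI10;Data Source=DBSERVER;Initial Catalog=SSASQueryLog;Integrated Security=SSPI</QueryLogConnectionString>
        <QueryLogTableName>OlapQueryLog</QueryLogTableName>
        <QueryLogSampling>10</QueryLogSampling>      <!-- log one query in every ten (the default) -->
        <CreateQueryLogTable>1</CreateQueryLogTable> <!-- have SSAS create the table itself -->
    </QueryLog>
</Log>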

Without measuring your workload, I doubt anyone here is in a position to make recommendations that are any more specific than that. Although generic, the pointers above should be a reasonable start if you haven't implemented them yet.