One question that comes up frequently is how to determine how much physical RAM is required for the database server running SQL Server and the Jet Analytics BI Solution.
When considering this, though, there is a sizable difference between the "minimum requirements" and the "suggested requirements." The difference depends on the size and number of the databases in use, the number of users, and the existing infrastructure in place.
The minimum requirements are easy to meet: at least 3 GB of RAM and enough hard drive space to hold your data warehouse, staging database, and cubes. However, meeting the bare minimum is rarely ideal, and giving your server better hardware will allow for shorter run-times and improved efficiency.
The ideal setup will obviously vary by client, but the following are some general guidelines. The main thing to keep in mind when reading them and designing your server is scalability: if better performance is desired in the future, you'll need to be able to add more memory or processors without a complete server rebuild.
- Number of Users: If 15 or fewer users will be accessing and pulling data from the cubes, you shouldn't need to budget extra memory per user. When more than 15 concurrent users are likely to be using the cubes, consider adding 1-2 GB of RAM for every 5 additional users.
- Execution Packages: How often does your data need to be updated? If updates are required every 30 minutes, you will need hardware capable of completing them that quickly. If updates run after hours, the hardware requirements are lower. SQL Server ETL transformations load the data being processed into your server's RAM before operating on it, so the more RAM your server has, the faster your execution packages will complete.
- Size of Database(s): This is generally the most important consideration because it directly impacts how much processing must be done to populate the data warehouse. In general, if your database is 50 GB or less, 16 GB of RAM is probably sufficient, assuming nightly updates and around 10 concurrent users.
- Rate of Growth: At what rate is the database growing per year? Be sure to factor increasing business into this growth as well. Building out a server to house your data only in its current state is short-sighted and could become problematic in the future.
- OS Requirements: A good rule of thumb is to reserve 1 GB of RAM for the OS by default, plus 1 GB for every 4 GB of RAM installed up to 16 GB, and another 1 GB for every 8 GB installed above 16 GB. On a server with 32 GB of RAM, that works out to 7 GB reserved for the OS, leaving 25 GB dedicated to your SQL Server.
- Dedicated or Shared?: Will other software be running on your server? When figuring out your RAM requirements and server hardware needs, be sure to take this into consideration. If additional software will operate on the same server, you will want to allocate enough RAM and hard drive space to each application.
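The user-count and OS-reservation rules of thumb above can be sketched as a small calculation. The function names and the 1.5 GB midpoint of the "1-2 GB per 5 users" range are illustrative assumptions, not part of any official formula:

```python
def os_reserved_gb(total_ram_gb):
    """Rule of thumb from the text: 1 GB for the OS, plus 1 GB for
    every 4 GB installed up to 16 GB, plus 1 GB for every 8 GB
    installed above 16 GB."""
    reserved = 1
    reserved += min(total_ram_gb, 16) // 4
    reserved += max(total_ram_gb - 16, 0) // 8
    return reserved

def extra_user_ram_gb(concurrent_users, gb_per_5_users=1.5):
    """No extra RAM is needed for 15 or fewer concurrent users; add
    1-2 GB (1.5 here, as a midpoint) per 5 users beyond that."""
    extra_users = max(concurrent_users - 15, 0)
    blocks_of_5 = -(-extra_users // 5)  # ceiling division
    return blocks_of_5 * gb_per_5_users

# The example from the text: a 32 GB server reserves 7 GB for the OS,
# leaving 25 GB for SQL Server.
print(os_reserved_gb(32))        # 7
print(32 - os_reserved_gb(32))   # 25
# 25 concurrent users is 10 over the 15-user baseline, i.e. two
# blocks of 5, so roughly 3 GB of extra RAM.
print(extra_user_ram_gb(25))     # 3.0
```

This is only a planning aid; actual memory needs depend on the workload, so treat the output as a starting point rather than a spec.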
For an 'average' client who has a database in the 100 GB range, we recommend something similar to the following:
- High speed drives for the data (SAN is ideal) with separate storage for SQL log, data, and temp files.
- 4-8 cores on a high-performing processor (or processors).
- 16-64 GB RAM.
Even so, the main thing to remember is that you want your BI solution to be scalable and extensible. If better performance is desired, the ability to simply add more RAM becomes crucial. Many of our clients have found that doubling RAM from 16 GB to 32 GB cut the run-time of an execution package from 4 hours down to 2. As mentioned previously, SQL Server loads the data being processed into your server's RAM before operating on it. This is mainly a performance optimization: the calculations happen in memory rather than requiring a disk read for each new row. If your server has enough RAM, and it's within the limits of your SQL Server edition, SQL Server will eventually read your entire database into RAM. To figure out this 'best-case' amount of RAM, add up the size of all your active databases (data files only, not logs); that total is the amount of RAM you could potentially use, subject to your SQL Server edition's limits. This is obviously a best-case scenario and is neither realistic nor necessary for every company.
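The 'best-case' RAM estimate described above amounts to a simple capped sum. The database names, sizes, and the edition cap below are illustrative placeholders; look up the actual memory limit for your SQL Server version and edition:

```python
# Illustrative data-file sizes (GB) for the active databases --
# data only, not log files. These names and numbers are examples.
active_db_sizes_gb = {
    "DataWarehouse": 60,
    "StagingDB": 30,
    "SourceERP": 100,
}

# Hypothetical memory cap for your edition; substitute the real
# limit for the SQL Server version and edition you run.
edition_ram_cap_gb = 128

# Best case: enough RAM to hold every active database, but never
# more than the edition can actually use.
best_case_ram_gb = min(sum(active_db_sizes_gb.values()), edition_ram_cap_gb)
print(best_case_ram_gb)  # 128 (the 190 GB total exceeds the cap)
```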
Additionally, it is important to consider which version of SQL Server you will be using when determining your server's hardware. The following image details the maximum RAM supported by the various supported versions of SQL Server. These numbers apply to 64-bit installations only.
When sizing the server, you ideally want something that will last you a while. If your current database is 100 GB with an annual growth rate of 15 GB, you'll want a hard drive larger than 150 GB, or you will outgrow it within 3 years, if not sooner. If a SAN array is an option (or one is already in place), that is the best choice for hosting your database files. Otherwise, the recommended setup is 3 separate high-speed disk drives (i.e. 15,000 RPM). In either solution, our recommendation is to have dedicated drives for the following:
- Drive 1: Data drive (size of source databases x2)
- Drive 2: Tempdb Database (variable, but generally 100 GB or less)
- Drive 3: Standard C: drive to hold OS, Program Files, etc. (variable, but generally 100 GB or less)
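The drive-sizing rules above, combined with the earlier 100 GB / 15 GB-per-year growth example, can be sketched as follows. The function names and the 5-year planning horizon are illustrative assumptions:

```python
def projected_db_gb(current_gb, annual_growth_gb, years):
    """Simple linear growth projection for capacity planning."""
    return current_gb + annual_growth_gb * years

def data_drive_gb(source_db_gb):
    """Drive 1 guideline: twice the size of the source databases."""
    return source_db_gb * 2

# A 100 GB database growing 15 GB per year reaches 175 GB in 5 years,
# so a 150 GB drive would be outgrown well before then.
print(projected_db_gb(100, 15, 5))  # 175
# Sizing the data drive against the projected size, not today's size,
# leaves room for that growth.
print(data_drive_gb(175))           # 350
```

Sizing against the projected figure rather than the current one is what keeps the "build for scalability" advice from earlier in the article practical.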
The location of these databases is set during installation of SQL Server. To move databases after installation please refer to this KB article.
Processor choice isn't as important as the considerations above, but it is worth noting when sizing your server's hardware. At a minimum, the BI Solution will run on virtually any speed and size of processor. However, 4-8 cores are recommended for smaller companies, while larger companies should aim closer to 8-16 cores.
Again, the key thing to remember is scalability. If better performance is required down the road, being able to expand on what you already have is important. When possible, the general recommendation is to start smaller and add capacity as better performance is required. The "ideal" server will vary in components between companies, but hopefully the above gives you a good basis for your decisions.