Awesome response Mic, really great advice on how to maximise services and performance of servers. I'm sure most elements can be used, as many setups I've worked with have a SQL Server that is NOT dedicated to just SIMS. Reading down your recommendation list, I remember doing about half of those things (in particular spreading out the physical location of the default and required databases, and the log file setup) when I was still in schools. Really, really useful.
It would be great if a Capita bod could comment on things that should NOT be done from this list, due to SIMS not being able to cope with it, and/or offering alternatives.
Thanks Graham, absolutely agree; if someone from Capita wants to pop along and comment on anything they'd disagree with, I'll subtitle that section of the post with a note saying "don't apply this to a SIMS instance, as Capita do not recommend it". That way it stays relevant for anyone running a SQL instance of something else, but doesn't hinder anyone with a SIMS install.
I've already had a PM thanking me which is nice, as I said to him if it helps one school to get information out to those who need it faster than previously, and kills some earache for our colleagues in the server rooms, it was worth the post :)
On another note, I have no EduLime, so I won't waste a Corona.
Good Stuff @michael2k6.
Not to troll, but it is almost always the case that database performance issues are due to database design issues. It really is that simple.
This is generally true. As I said, though, I have no idea about the database, having never used it, let alone written it, so I shan't cast aspersions. It's also quite difficult to build a database for this kind of use in a generically performance-tuned manner, as these systems perform such a vast array of tasks and each school will use each bit of it to a different extent. So it's about finding the best balance across the board.
But hopefully regardless of the db design and structure, the best practice principles in my post will apply to anything; if you buy a 400bhp car and put £50 a corner rubber on it, you're coming off at the next roundabout. At least you are as soon as you start to push it :)
While it is difficult to some degree, I would hope that suppliers employ people who have skill, knowledge and experience enough that what is difficult for most, is fairly mundane for them. For a supplier to turn round and say "your DB is slow because your use is 'unusual'" sounds like an excuse. Their job is to provide a product that copes with the demands of their customers - even 'unusual' demands.
Originally Posted by michael2k6
So when you have:
"Hi just to report back, did this on our SIMS db last night; for example, a small report I ran took around 4 minutes before and 1 minute 30 seconds after reindexing."
The fact that a "small report" takes a minute and a half, not seconds, should ring alarm bells. It's not that it's impossible to ask a SQL database questions which involve scanning hundreds of millions of records to collect and aggregate data, but in a school it should be... 'unusual'. And in these days of multi-core, multi-threaded processors capable of executing 10^9 instructions per second, with enough RAM to hold the entirety of the data in memory, mundane everyday queries should be dispatched by the DB server in fractions of a second.
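For anyone wanting to try the reindexing mentioned in that quote, here's a minimal sketch; the database and table names are placeholders, not real SIMS objects, so substitute your own:

```sql
-- Rebuild all indexes on a heavily-used table and refresh optimiser statistics.
-- "SIMS" and "dbo.ExampleTable" are placeholder names.
USE SIMS;
GO
ALTER INDEX ALL ON dbo.ExampleTable REBUILD;
GO
-- Refresh out-of-date statistics across the whole database
EXEC sp_updatestats;
GO
```

Run it out of hours; rebuilds take locks and generate a fair amount of transaction log.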
When I was investigating the slow queries, I dug deep into them using execution plans. These showed that SIMS uses an awful lot of full table scans, and statistics were missing for many of the tables involved.
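If you want to look for the same thing on your own instance, a sketch of the tools involved (the table name is a placeholder):

```sql
-- In SSMS, Ctrl+M includes the actual execution plan with the results.
-- Alternatively, get per-query costs in the Messages tab:
SET STATISTICS IO ON;    -- logical reads per table (watch for huge scan counts)
SET STATISTICS TIME ON;  -- CPU and elapsed time
-- run the slow query here, then look for Table Scan operators in the plan

-- Where the plan warns of missing statistics, refresh them:
UPDATE STATISTICS dbo.ExampleTable WITH FULLSCAN;
```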
Moreover, some of the queries were inefficiently constructed. For example, when you chose all classes, Profiler showed that rather than a plain SELECT *, the app constructed a WHERE IN (class1, class2, etc.) even when every class was chosen. Safer coding, but... well...
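To illustrate the pattern (table and class names invented for the example):

```sql
-- What Profiler showed: an IN list enumerating every class,
-- even when "all classes" was selected in the UI
SELECT * FROM dbo.Class
WHERE ClassName IN ('7A', '7B', '7C', '8A', '8B' /* ...every class... */);

-- What would do the same job when everything is selected:
SELECT * FROM dbo.Class;
```

The first form makes the optimiser evaluate a long predicate for no benefit, and a long enough IN list can hurt plan caching too.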
Anyway, I found this fantastic free PDF book kindly published by RedGate. It's deep, but you'll really understand how SQL ticks.
Red Gate book store - SQL Server Execution Plans Second Edition
Good write-up, @michael2k6
I'd suggest that the log files are sized correctly at the start, though, and that instead of setting a percentage increase for auto-grow, you set the auto-grow value to a couple of hundred meg. (10% of not much is still not much, and every auto-grow causes a throttle-back in SQL Server.) You don't want auto-grows following auto-grows in quick succession.
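A sketch of setting a fixed increment rather than a percentage; the database and logical file names here are assumptions, so check `sys.database_files` for yours:

```sql
-- Pre-size the log up front so auto-grow rarely fires at all
ALTER DATABASE SIMS
MODIFY FILE (NAME = SIMS_log, SIZE = 4096MB);

-- And when it does fire, grow by a fixed 256 MB rather than a percentage
ALTER DATABASE SIMS
MODIFY FILE (NAME = SIMS_log, FILEGROWTH = 256MB);
```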
Isn't this so you only get current?
Originally Posted by jinnantonnixx
One of the good things about cloud is that inefficient queries now cost the supplier money, so they've become incentivised to improve performance.
Only with true cloud, where load is dynamically spread across servers and additional servers are spun up when required (Arbor). With grey cloud - web-based or client/server (like SIMS) - you're stuffed. You have to hope the guys and gals at the other end know what they're doing and are actually willing to resolve it - i.e. spend the time/money.
Still costs them money... Running on their servers, they have to buy capacity to deliver performance, and once bought it cannot easily be unbought!
Let's say you need 8 cores and 16 GB of RAM per customer because you haven't optimised your indexes and queries... That's several grand of hardware per customer, which could be reduced to, say, 1 core and 4 GB, saving far more than the cost of the developer (for companies with 100+ customers).
Also, for companies that operate primarily in one geographic market, not having proper cloud (i.e. dynamic provisioning) means they will be paying for compute time they are not using (MIS loads from 4pm to 6am are tiny by comparison to during the school day). Again, a driver for improving the efficiency of the software.
My advice: whack an SSD in the server for the database and log files.
Improved our performance no end... ;)
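For anyone wanting to do the same, a rough sketch of relocating the files; the paths and logical file names are made up, and take a backup first:

```sql
-- Point SQL Server at the new locations on the SSD volume
ALTER DATABASE SIMS MODIFY FILE (NAME = SIMS_data, FILENAME = 'S:\SQLData\sims.mdf');
ALTER DATABASE SIMS MODIFY FILE (NAME = SIMS_log,  FILENAME = 'S:\SQLLogs\sims.ldf');
ALTER DATABASE SIMS SET OFFLINE;
-- copy the .mdf/.ldf files to the SSD at the OS level, then:
ALTER DATABASE SIMS SET ONLINE;
```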
You feeling ok? Sure I didn't see Intel or Samsung being mentioned ;)
Originally Posted by zag