A few of our admin staff are complaining that Facility is slow. They mainly use it to generate reports and for general use, navigating through the different menus.
The machines they are using are some of the fastest in the school, all connected via Gigabit LAN. The machines' specs are:
Intel Core 2 Duo E6400, 2GB RAM.
This spec should be more than enough to handle this.
The backend server is in an ESX 4 cluster, running 32-bit Windows Server 2003 with 4GB RAM and a dual-core processor.
Are there any adjustments I could make to improve performance? The server was originally physical, and when we moved it into the ESX cluster it was a simple P2V migration, done about six months ago. The original server was dual-core with 2GB RAM; when it was virtualised the memory was increased to 4GB, which has made a difference, but I am still having a few isolated problems with heavy CMIS users that I am not able to sort out.
Any advice would be great. I doubt this problem is server-side. The server is located a good 150m away, but there is a fibre link from the server room to the nearest cabinet where the offices are, and from there a 1Gb link to the desktop.
More RAM, 6 or 8GB, is what it really needs now, I think. I am just going to replace my server this new financial year; I will be getting dual CPUs (most likely quad-core) and 8GB RAM as a minimum, and I know Serco will munch that up quite happily.
How have you got it set up? Are all three (Facility, SQL and ePortal) on the same box? Are you using Behaviour Management, as that does seem to gobble up the system very nicely.
We used to find that clearing the audit trail helped; also consider shrinking the DB if it's got a lot of free space.
I like the sound of that. I reckon that could do the trick. Are you able to point me to where I need to go to make this change, and does it require all users to be logged out before I do it? I assume it will.
Ranj, if you contact the Link2ICT Service Desk we can provide you with a script that automates the clearing down of your audit trail, limiting the table to just the last month's worth of entries. This should keep the audit trail table to somewhere in the hundreds of thousands of rows (yes, you can get that many in just one month). It backs up the contents of the table to an archive database in case you need them for auditing purposes, or to identify issues with your database.
And yes, you will need to run this script out of hours, not specifically due to problems with users in the database, but mainly because the script causes a big performance hit whilst it runs. We recommend that you set it up as a scheduled task to run overnight when the system is otherwise quiet.
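For anyone curious what "clear down but archive first" looks like in practice, here is a minimal sketch of the pattern. Note this is NOT the Link2ICT script: CMIS sits on SQL Server, whereas this illustration uses Python's built-in sqlite3 so it is self-contained, and the `audit_trail` / `audit_archive` table names and `entry_date` column are hypothetical.

```python
import sqlite3
from datetime import datetime, timedelta

def trim_audit_trail(conn, cutoff_days=31):
    """Copy audit rows older than the cutoff into an archive table,
    then delete them from the live table. Returns rows removed."""
    cutoff = (datetime.now() - timedelta(days=cutoff_days)).isoformat()
    cur = conn.cursor()
    # Create the archive table with the same shape as the live one, if missing.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS audit_archive AS "
        "SELECT * FROM audit_trail WHERE 0"
    )
    # Archive first, so nothing is lost if the delete is interrupted.
    cur.execute(
        "INSERT INTO audit_archive SELECT * FROM audit_trail WHERE entry_date < ?",
        (cutoff,),
    )
    cur.execute("DELETE FROM audit_trail WHERE entry_date < ?", (cutoff,))
    conn.commit()
    return cur.rowcount
```

The archive-before-delete ordering is the important bit, and it is also why the job hits performance while it runs: on a table with hundreds of thousands of rows, both statements do a lot of I/O, hence the advice to schedule it overnight.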
Ranj, if you want to use more than 4GB of RAM, you will need Windows 2003 Enterprise 32-bit Edition. Windows 2003 Standard 64-bit Edition will use more than 4GB of RAM, but secure ePortal may not work correctly (I think it is to do with ISAPI filters).
John, do you want a job? You're starting to scare me now... what he said. Though I'm not sure whether it's generally available enough to be on first line, but if you call support and ask them for the 64-bit ISAPIs they'll gladly send them to you, along with the relevant document.
Just to clear up the thing about security -
It will still work over Apache on port 8080 (the default), or if you know what you're doing or do some Googling, you can use SSL on Apache (defaults to 8443). The effect 64-bit has is that it uses different ISAPIs, and IIS depends on them. So essentially IIS isn't going to work, secure (SSL/HTTPS/port 443) or otherwise (HTTP/port 80). But as I said, those new filters sort that out.
In terms of the original post -
Search the website for "eportal performance". Although it's not ePortal performance issues you're experiencing, the first thing I'd look at is how busy the server is when you're running these reports etc. Following the performance document will help reduce the overhead on the server and should have a positive knock-on effect.
Maybe also look at things such as: if you try it on the server itself, is it also slow there?
If you do it out of hours, is it quicker, or just as slow? Is it only slow on some workstations (regardless of spec)?
Also, as a little aside, what antivirus do you have on the server? I've come across more and more of them actually scanning the DB and the log files unless you tell them otherwise. And more and more are scanning all network traffic, including HTTP and HTTPS.