MIS Systems thread: SIMS Discover (Technical)
23rd June 2011, 07:29 PM #31
Cheers Phil. I do remember seeing it at a Capita conference and it was a little slow then (running on a laptop)
Obviously, with so many variations of machinery and OS it'll run on, no doubt there will be some combinations that won't work quite as well as others. Hopefully in our case we can narrow down what's dragging it back a little and then nail it.
23rd June 2011, 09:31 PM #32
Performance, especially how quickly you can update the screen with whatever, is a key factor in ease-of-use - always has been, always will be, but..
..I think that is where it often all goes wrong. Manglers, sales-droids et al who don't really get what goes on behind the scenes respond to bells & whistles hitting their screens quickly (so they can make important exec decisions about colours, fonts and the like), and getting those there and kind-of-working all too often takes precedence over a really well-considered underlying architecture. It really doesn't take long before the "weight" of it all makes it increasingly difficult to go back and do what you should have taken your time over in the first place, i.e. performance fixes are often compromised by limited "wriggle-room" later down the line. I'll leave how that could possibly be true in the fabulous, brave new world of reusable, extendable, flexible, abstract OO-everything as an exercise for the reader.
first releases of products are often compromises and performance problems can be addressed in future versions
IME, which is similar, devs have long had that tendency to shrug and respond "throw more h/w at it" - heard it over and over forever, but it's by no means unique to Capita. That said, it's been much more common for server-side - the pragmatic realities on client-side have generally been better recognised and better respected.
dotNet can be pretty fast, and generally sufficiently fast, serious number-crunching (which this scenario clearly isn't) excluded. I'm generally happy to write in pure dotNet for user-land code now, but were someone else of the current generation to implement the same code, I would expect it to go around the houses quite a bit e.g. why write a line to add two and two together, when you can implement an ever so clever, general purpose arithmetic engine, which of course despite the ever-lasting intention will still get thrown away within a few years to make way for the latest new and exciting dev-tech.
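The "arithmetic engine" jibe above can be made concrete. A tongue-in-cheek sketch (deliberately over-engineered; `ArithmeticEngine` is an invented illustration, not any real API) contrasting the one-liner with the general-purpose version:

```python
import operator

# The direct version: one line, obvious, fast, done.
total = 2 + 2

# The "ever so clever" version: a general-purpose engine built for a one-off sum.
# Purely illustrative of the over-abstraction being mocked above.
class ArithmeticEngine:
    OPS = {"add": operator.add, "sub": operator.sub}

    def evaluate(self, op, a, b):
        """Dispatch a named operation - machinery that a literal '+' already does."""
        return self.OPS[op](a, b)

engine = ArithmeticEngine()
assert engine.evaluate("add", 2, 2) == total  # same answer, far more moving parts
```

Both produce 4; the second just adds a dispatch table, a class, and a vocabulary of operation names to maintain, which is the point being made.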
SQL has a lot to answer for re. resources and performance too, but if customers have an "easy/standard access to our data" tick-box there's not much choice.
23rd June 2011, 10:13 PM #33
I don't think that's the fault of SQL. My experience is that programmers are often skilled at procedural programming but treat SQL as if it were just another procedural tool. They write some SQL and it gives the results they expect; they don't care about the cost, and often enough in a development environment they aren't aware of what the cost will be in production. This can be exacerbated by naive developers using multiple round trips (ok, first we will ask the database for a list of students, then we will take that result set as an array and ask the database for the subject results for each of those students, one by one) where a single set-based query will do the job.

Lack of database skills also leads to poor schema design and poor index selection. Well designed relational databases - even ones run on MS SQL - are quite capable of churning through result sets consisting of millions of rows in very short order, IF they are well designed and implemented. If not, go make a coffee or two.
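The "one query per student" pattern described above (often called N+1 querying) can be sketched with Python's built-in sqlite3 module. The `students`/`results` tables and their columns are made up for illustration, not any real MIS schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE results (student_id INTEGER, subject TEXT, grade TEXT)")
cur.executemany("INSERT INTO students VALUES (?, ?)", [(1, "Ann"), (2, "Bob")])
cur.executemany("INSERT INTO results VALUES (?, ?, ?)",
                [(1, "Maths", "A"), (1, "English", "B"), (2, "Maths", "C")])

# N+1 approach: one query for the student list, then one more query PER student.
students = cur.execute("SELECT id, name FROM students").fetchall()
slow = {name: cur.execute(
            "SELECT subject, grade FROM results WHERE student_id = ?",
            (sid,)).fetchall()
        for sid, name in students}

# Set-based approach: a single JOIN fetches the same data in one round trip.
fast = {}
for name, subject, grade in cur.execute(
        "SELECT s.name, r.subject, r.grade "
        "FROM students s JOIN results r ON r.student_id = s.id "
        "ORDER BY r.rowid"):
    fast.setdefault(name, []).append((subject, grade))

assert slow == fast  # identical data, N+1 fewer round trips
```

With two students the difference is invisible; with a few thousand, the per-student round-trip latency is exactly the kind of cost that never shows up on a developer's machine.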
Originally Posted by PiqueABoo
15th July 2011, 12:18 PM #34
I'd like to drag this back to the original question from synaesthesia about hardware.
Our data manager/exams officer went for training/demo on this, and because the laptop they demoed it on struggled, he's asked for a new PC for it.
His PC is getting on a bit: it's a 3GHz Pentium D (the original dual core pre-C2D, just a pair of P4s jammed onto the substrate under the IHS), 2GB of RAM and an AGP 4x ATi Radeon 7200 (yes, the original Radeon, which is DX7).
The great info given to him about minimum spec was:

Core2Duo or AMD equivalent
4GB RAM
A separate graphics card

As a technical spec that is a total joke. Going by that it should work: it's a dual core CPU, 2GB of RAM is a bit low but should work, and it has got a separate graphics card.
I called up the Capita "help desk" about this and they could tell me nothing more than that.
So we're going to overspec. I've spoken to our main supplier and we're looking at getting him a Sandy Bridge i5 2400, 8GB DDR3 1333MHz and . . .
I was thinking GTS 250: for cost, cores and raw performance it's great value for money. They've suggested a Quadro FX 380 instead.

It's a far weaker card (it's essentially a 9500 GT), however being a workstation card the drivers and support do make a major difference in most non-gaming applications (a Quadro will perform around 2x-4x better than a gaming card with CAD or 3D rendering software).
Anyone know if there's any advantage to a workstation card over a consumer card with Discover?
15th July 2011, 12:24 PM #35
Sorry to break it to you, but when a company gives you a minimum spec of C2D or AMD equiv and 4GB RAM, a Pentium D and 2GB of RAM is not 'a bit low'! And expecting a DX7 era card to be fine for a program released this year is expecting a lot also.
Originally Posted by RobG
That tech spec is perfectly fine from what I can see!
15th July 2011, 12:30 PM #36
Yup - falls on its @r5e by not being able to do behaviour and achievement points between dates... (OK, I suppose I could do an Excel macro report to count up the points between dates then import them into an AM aspect - but come on!)
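For what it's worth, the workaround being grumbled about above is a few lines once the points are exported. A minimal sketch, assuming a hypothetical export of (date, points) records rather than any actual SIMS/Discover report format:

```python
from datetime import date

# Hypothetical export of behaviour/achievement events as (event_date, points).
# The data and field layout are illustrative, not Discover's actual output.
events = [
    (date(2011, 1, 10), 2),
    (date(2011, 2, 14), 5),
    (date(2011, 6, 1), 3),
]

def points_between(events, start, end):
    """Sum points for events whose date falls within [start, end] inclusive."""
    return sum(points for event_date, points in events
               if start <= event_date <= end)

print(points_between(events, date(2011, 1, 1), date(2011, 3, 31)))  # 7
```

Hardly a substitute for the feature being built in, but it shows how small the missing query is.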
Originally Posted by ajp233
15th July 2011, 12:38 PM #37
@BatchFiles - it's the first major release; give it a few years and it'll do more fancy things than you can shake a stick at.
15th July 2011, 12:46 PM #38
I'd also like to suggest openly to any Capita employees following this thread (@PhilNeal, I mean you) that you add "64-bit Windows environment or Windows Server environment" to the list of minimum requirements. The reason? You can't have 4GB of RAM and a dedicated graphics card without one: 32-bit Windows (with the exception of server versions) is limited to 4GB of addressable memory in total, and that includes system memory and graphics memory.
That being said, I now have Discover running on my dual core 3.2GHz Intel with 4GB RAM (3GB accessible) and a 1GB GeForce GT240, and it runs flawlessly.
Last edited by LosOjos; 15th July 2011 at 12:47 PM.
15th July 2011, 01:06 PM #39
Originally Posted by matt40k
15th July 2011, 01:12 PM #40
It's a joke because C2D covers a rather large range of CPUs. Go back and try using one of the old low-end 1.8GHz C2Ds, or one of the 1.2GHz laptop ones; there's a massive difference between those and a C2D running at 2.6GHz or higher.
Originally Posted by localzuk
And it's nothing to do with "expecting" the card to work. I don't expect the card to work, but they just list "A separate graphics card." (and yes, that is the exact wording on the documentation I got for the technical spec) - they don't say DX9 minimum.
So yes, it is a TOTAL JOKE; it's like listing the minimum requirement to enter a Formula 1 race as "a car".
I'll also point out that some of the top-end onboard/on-die GPUs are actually better than some bottom-end graphics cards.
Last edited by RobG; 15th July 2011 at 02:15 PM.
15th July 2011, 01:26 PM #41
At a meeting I was at on Wednesday, someone was saying they had it running with an integrated graphics chip on an i5 (though it could have been an i7). I'll be happier with that, as a decent chip and lots of RAM is really a given today; a dedicated graphics card requirement is odd in my mind.
Originally Posted by RobG
The issue is they openly admit they don't know the min spec, as they have had all sorts of results on differing hardware. So the only advice seems to be: test it and see if it works.
15th July 2011, 01:36 PM #42
So how essential is a graphics card? I'm stuck, as I have been asked to find a PC near the £300 mark, but they all come with on-board graphics or insufficient RAM.
15th July 2011, 01:40 PM #43
It'll run on most things that would be up to spec to regularly run SIMS - it's just a bit jerky flipping graphs over.
15th July 2011, 01:53 PM #44
Yes, that's very true as well. You'll lose roughly 250MB of the 4GB to motherboard-addressable resources, plus the video card's RAM on top, so with a graphics card with 1GB of VRAM you're generally down to around 2.75GB of system RAM on a 32-bit OS.
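The back-of-envelope figure above is just subtraction against the 32-bit address space. A quick sketch (the 250MB reservation is the thread's rough estimate; the exact figure varies by motherboard and chipset):

```python
# 32-bit address space budget, in GB. A 32-bit client OS can address 2**32
# bytes total; hardware reservations and mapped VRAM come out of that budget.
ADDRESS_SPACE_GB = 4.0          # 2**32 bytes
MOTHERBOARD_RESERVED_GB = 0.25  # chipset/PCI/MMIO reservations (~250MB, rough)
VRAM_GB = 1.0                   # graphics card memory mapped into the space

usable_system_ram_gb = ADDRESS_SPACE_GB - MOTHERBOARD_RESERVED_GB - VRAM_GB
print(usable_system_ram_gb)  # 2.75
```

Which is why the poster above with a 1GB GT240 on 32-bit Windows sees only around 3GB or less of his 4GB.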
Originally Posted by LosOjos
Nice to hear the GT240 is doing well for you.
That's sort of my exact point: not having had experience of this before, and with them themselves not seeming to know, I came here to ask others, especially on the point of workstation vs consumer card.
Originally Posted by TechMonkey
15th July 2011, 02:22 PM #45
Just as we are deploying these into our offices. Should be fun.
If it doesn't run, then we shall wait for the govt. to increase school funding again before using Discover. Would be a shame, as it is supposed to be nice.