@PhillipHamlyn the biggest benefit of cloud is the "big data" idea. Yet no-one has yet mastered it in the MIS market.
I take your point, but some of the offerings on G-Cloud are virtual desktop environments with full desktop versions of Microsoft Office (not just browser) available to the end user (vendor is InsightCloud). I'm not sure whether this falls within NIST or not, but if a remote desktop version of a SoSaaS is deemed "Cloud" by G-Cloud then I think it's a useful extra definition of "what makes a cloud application cloud'y". In this example the vendor provides user interfaces "Browser, Other, Client Side Virtual Application", which does indicate they support some level of client-side deployment (even if it's App Streaming or some such).
Personally I'd love to know how those chaps offering virtual desktops full of Office productivity software avoid the inefficiencies of hosted desktop software that you described in your earlier post - in general it's very difficult to achieve elasticity for remote application hosting. Time to get hunting for some technical background on their offering, I think :-)
Web-based application: You are welcome to use Bromcom Cloud MIS or any other web-based MIS.
It will not be practical to guide you on this forum through simulating a 'hosted' environment, adding users, taking memory measurements, etc. But in essence you need to measure the free memory in the 'hosted' environment after an initial cold start, and then again each time a user logs in.
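The arithmetic behind that method can be sketched in a few lines. This is a minimal illustration of the described procedure (sample free RAM at cold start, then after each login, then derive the average RAM consumed per additional session); the figures below are made up for illustration and are not from any test report.

```python
def per_user_memory(free_mb_samples):
    """free_mb_samples[0] is free RAM (MB) after cold start; each later
    entry is free RAM after one more user has logged in. Returns the
    average MB consumed per additional user session."""
    first, last = free_mb_samples[0], free_mb_samples[-1]
    users = len(free_mb_samples) - 1
    return (first - last) / users

# Hypothetical host with 6000 MB free at cold start, then three logins
samples = [6000, 5750, 5510, 5255]
print(per_user_memory(samples))  # average MB per user session (~248 here)
```

Averaging over several logins like this smooths out per-session variation; a longer run on identical hardware (as discussed later in the thread) would be needed for numbers anyone could rely on.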
PM me and I will provide you with the test report/procedure followed and the results.
Does that mean data across schools, or just the data within a school? I can see that, for example, Fischer Family Trust provide a "Big Data Insight" service to the education market - is this the kind of thing you mean?
@PhillipHamlyn Thank you for joining in, interesting comments.
I must say, though, the Gartner definition of SaaS you cite is so impossibly broad as to be totally useless. If all it takes is a subscription contract and remote management, then my family's Microsoft Office, for which I pay an annual subscription and which is kept up-to-date by Office Update, counts as a SaaS application - which patently it is not. The SaaS version is Office 365 although it is only partially SaaS since the client (unless you use the web version) still runs on the desktop.
I believe the NIST definition often gets misinterpreted because it describes three separate cumulative layers: IaaS, PaaS and SaaS. While it's possible to host a client-server application on IaaS and deliver it from the cloud, that doesn't make it SaaS. To be described as a cloud application per the NIST definition, an application has to include the core architectural features in that definition, such as resource pooling and elasticity. Otherwise it's just SoSaaS sitting on IaaS - and hugely inefficient and unwieldy.
I'm interested to see integration and big data cited as advantages of cloud SaaS - I think those are good points. I know I'm a bit of an interloper here but in the interests of stimulating discussion, here's a list of cloud-native SaaS benefits that I included in a recent article on client-server vs cloud SaaS:
◾It’s quick and easy to get started and to add or reduce resources due to on-demand, elastic provisioning and highly automated operations.
◾A high-performance, globally connected infrastructure comes built in as standard.
◾The application always remains bang up-to-date with the latest functionality, updated automatically to add new features without disruption to existing settings.
◾Intelligent resource usage across a common, demand-pooled application infrastructure allows for cost-effective service delivery.
◾Collective testing and feedback on the shared infrastructure ensures robust security and reliable operation.
◾A highly composable service-oriented architecture provides a flexible, fault tolerant platform with extensive scope for continuous innovation at all layers.
◾The ecosystem of users and partners can easily share add-ons and templates that all run on the same shared, homogenous operational instance.
◾Open, cloud-ready APIs and high-bandwidth connectivity make it easy to plug in external resources and functionality.
Thanks for your comments. I can see that the concept of "software as a service" means much more to you in its architectural design than the raw definition implies. You are pulling in concepts from open API systems, software design structures, low operational cost (as opposed to low price), shared infrastructure (not necessarily, in my opinion, a good thing on a business-critical system), and high-bandwidth connectivity. I'm not pushing back on your definition of SaaS, but it's a far more extensive shopping list than I'd expected to be loaded onto it. Very much food for thought.
So it is down to matt40 to enlighten the forum on the facts of the claimed 50-fold memory wastage when client-server MIS is hosted rather than web-based MIS.
Press Release: Laboratory tests prove up to 50-fold saving for Web-Based School MIS over legacy Windows ‘fat client’
Meanwhile we have this press release that provides the essence of the report: http://www.bromcom.com/press/pdf/130112_Press.pdf - so the findings have been in the public domain for some time.
So we now have two active members of the EduGeek MIS forum on the job, and hopefully both will soon share with the forum their comments and take on the facts presented in the report regarding the claimed up-to-50-fold memory wastage when client-server MIS is hosted rather than web-based MIS.
Could I have a copy?!
I wouldn't bother @PhilNeal. If it was worth while, they'd publish it publicly.
First, thanks @Bromcom-PR for kindly sharing the document with me and offering access to Bromcom to replicate the tests.
I think we got a bit muddled up about the objective of this topic. As I understand it, we all agree: a web-based MIS system written from the ground up to be cloud-optimised, like Bromcom, is better at running as a hosted service than a traditional client/server application, like SIMS .net. I'm pretty sure we're all 100% on this point - even @PhilNeal is with us. So great, why are we using SIMS again?
Now we go downhill into what I like to refer to as a Bromcom special - simply put, overselling.
So let's start with http://www.bromcom.com/press/pdf/130112_Press.pdf - MY issue is with statements like:

"50-fold reduction in RAM needed per user for the web-based application"

The crux of how you got this number is that you ran SIMS on RDS, did a few things, looked at Resource Monitor and noted the memory used. My issue is: if, say, I were Cambridgeshire LA and I put in a request to change SIMS for Bromcom, I couldn't use this as a basis for cost savings. It's laughable. You're comparing apples to oranges. There is no talk about what the requirements on the end client are, and I've yet to find a web browser that doesn't eat RAM. There is no talk about disk I/O - I mean, I could allocate SIMS 20MB and I'm sure I could get it working in that. And, most importantly, there are no actual response times for how long things take to perform - it could take 10 minutes to save a single attendance mark for all I know!
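The apples-to-oranges point can be made concrete with a toy calculation: counting only server-side RAM can produce a dramatic ratio that shrinks once the end-user machine's RAM is counted too. All four figures below are hypothetical placeholders, not measurements from the report or from either product.

```python
# Hypothetical per-user RAM (MB), split into server-side and client-side.
def total_ram_per_user(server_mb, client_mb):
    """Total RAM a single user session consumes across both ends."""
    return server_mb + client_mb

# Fat client hosted on RDS: heavy server session, thin RDP viewer at home.
rds = total_ram_per_user(server_mb=500, client_mb=50)
# Web app: light server footprint, but a browser tab is not free either.
web = total_ram_per_user(server_mb=10, client_mb=300)

print(500 / 10)   # 50.0 - the headline ratio if only the server is counted
print(rds / web)  # a far smaller ratio once both ends are counted
```

With these made-up numbers the 50-fold server-only gap collapses to under 2-fold end to end, which is exactly why a buyer would need both sides measured before treating the headline figure as a cost saving.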
OK, let's put that to the side for now and move on - let's say it's all correct. Massive savings: any Bromcom quote should look like a Capita quote, only with the decimal point in a place that makes bursars everywhere happy.
So let's go back to the PDF above: "Legacy desktop – locally hosted £1,500 per primary"
Sorry, that's £300,000 for a 300-pupil primary?! Seriously? Just on the admin side? Really? So let's say that's 15 classrooms of 20 students, plus another 3 office staff, a bursar and the head teacher - so 20 machines at, say, £1k each. SIMS is, say, £1k; let's throw on a few add-ons, so that's say another £3k; say another £10k for a server - so that's £34k. Oh, let's slap some Capita consultant on top - what's that these days, £1k a day? Hell, let's have 14 days - so that's £48k, still another £252k to spend! Hey, how much is SLG these days?
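The back-of-envelope sums above, written out so anyone can check them. Every figure is the poster's own rough guess (in pounds), not a quoted price from Capita or Bromcom.

```python
# Rough legacy-desktop cost for a hypothetical 300-pupil primary.
machines    = 20 * 1_000   # ~20 desktops at ~£1k each
sims        = 1_000        # SIMS licence, roughly
addons      = 3_000        # a few add-ons
server      = 10_000       # a server
hardware_sw = machines + sims + addons + server

consultancy = 14 * 1_000   # 14 days of consultancy at ~£1k/day
total       = hardware_sw + consultancy

print(hardware_sw)       # 34000 - kit and software
print(total)             # 48000 - with consultancy on top
print(300_000 - total)   # 252000 - left over against the implied £300k
```

Even with generous guesses the total lands at £48k, which is why the implied £300k figure in the press release looks so far out of line.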
At this point I'm starting to question Bromcom a lot more. Now, they offered me access to Bromcom - by access they mean over a GoToMeeting-type thing or by visiting and using a laptop. So I can't actually compare like for like or do any real testing, i.e. point a monitoring system at it and then run over a long period of time on identical hardware. It's all right playing the cloud card, but us taxpayers end up paying for the resources when you boil it down. I'm basically not bothering to test it unless I can get both systems on my own hardware. It's not worth my time running a biased test scenario, and it certainly isn't worth public money. It then makes me wonder why they wouldn't want to reveal the application - I mean, I'm happy to sign an NDA. Which then makes me think: has it been pen tested?
Amazing how a bit of overselling with some over-optimistic "facts" has turned me from a lover to a hater lol.
(Pen testing is penetration testing. There are different types; the type I am referring to is application penetration testing. Whilst other testing and standards cover how you patch, handle data, secure your firewall and blah blah blah, application pen testing is where a security expert inspects the application and basically tries to hack in. They don't care if what they find causes your project to overrun or break. The idea is that your security is only as good as your weakest point. A really good case of this not occurring is Tesco's - Troy Hunt: The Tesco hack – here's how it (probably) happened.)