JDS Australia’s work at RMIT, performance testing the MyResults (student results) website, has been written up by HP as a case study.
Twice a year, RMIT publishes course results for its students on the student portal of the university’s website. These two days generate the highest volume of website traffic in the calendar. With such huge spikes concentrated in a 24-hour period, RMIT’s student portal was being saturated and could not handle the volume effectively. The result was poor performance, and students reported difficulty accessing their results in a timely fashion.
Here is a short excerpt from the case study:
In seeking to prevent future student portal performance problems, RMIT had identified a number of potential solutions.
Tim explains, “Due to the complex nature of our student portal, it was unclear which solution design would provide the best performing architecture.
HP LoadRunner was used to obtain an accurate picture of end-to-end system performance by emulating the types of loads we receive on result days. We tested all the options. Our objective was to handle the loads and satisfy 100 per cent of our students’ results queries in under five seconds.”
Leveraging the experience of JDS, RMIT used HP LoadRunner to emulate peak loads of more than 20,000 student logins per hour, over six times the average load experienced on non-results days.
“The data from our tests revealed that we needed our student portal platform to have the ability to scale considerably, to handle traffic up to six times the usual volume on result days,” explains Tim. “As we drove loads against the various design options, we also captured end-user response times.
“Based on this, we selected the design solution that had the best performance. There were several options to choose from, some of which routed traffic through a portal that put unnecessary load on the system. The brand new application chosen is called MyResults, which, during testing, met our performance objective and delivered response times of less than two seconds.”
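The case study does not publish RMIT’s actual LoadRunner scripts, but the approach it describes — pacing virtual users to hit a target login rate, capturing end-user response times, and checking them against the five-second objective — can be illustrated with a minimal sketch. Everything here is hypothetical: the virtual-user count, the stubbed `login_and_fetch_results` function (standing in for the real HTTP login and results request), and the simulated latencies are assumptions for illustration only.

```python
import random
import statistics
import threading
import time

TARGET_LOGINS_PER_HOUR = 20_000   # peak rate cited in the case study
VIRTUAL_USERS = 50                # hypothetical sizing; not stated in the source
OBJECTIVE_SECONDS = 5.0           # RMIT's stated response-time objective

def login_and_fetch_results():
    """Stub for the real login + results request (endpoint not published)."""
    time.sleep(random.uniform(0.05, 0.3))  # simulate server latency

def run_load_test(duration_s=10):
    """Run paced virtual users for duration_s seconds; return response times."""
    rate_per_s = TARGET_LOGINS_PER_HOUR / 3600.0   # ~5.6 logins/second overall
    pacing = VIRTUAL_USERS / rate_per_s            # seconds between one user's iterations
    response_times = []
    lock = threading.Lock()
    stop = time.monotonic() + duration_s

    def virtual_user():
        while time.monotonic() < stop:
            start = time.monotonic()
            login_and_fetch_results()
            elapsed = time.monotonic() - start
            with lock:
                response_times.append(elapsed)
            # Sleep off the remainder of the pacing interval to hold the target rate.
            time.sleep(max(0.0, pacing - elapsed))

    threads = [threading.Thread(target=virtual_user) for _ in range(VIRTUAL_USERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return response_times

if __name__ == "__main__":
    times = run_load_test(duration_s=5)
    p95 = statistics.quantiles(times, n=20)[-1]
    verdict = "met" if p95 < OBJECTIVE_SECONDS else "missed"
    print(f"{len(times)} logins, 95th percentile {p95:.2f}s, objective {verdict}")
```

The pacing calculation is the key idea: with a fixed pool of virtual users, spacing each user’s iterations at `users / target_rate` seconds holds the aggregate login rate steady, which is how a tool like LoadRunner sustains a defined load profile while recording per-transaction response times.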
JDS account manager for RMIT, Dave Melgaard says, “As well as testing the performance of the various design solutions, optimisation opportunities were identified that enabled the student portal to be scaled more appropriately. Using HP LoadRunner, JDS consultants identified bottlenecks in the application and platform. By working with RMIT, they were able to direct efforts to remediate the problems prior to going live.”
Tim says, “By using HP LoadRunner, we significantly decreased the risk of deploying an application that would not meet our performance requirements. On results day, MyResults proved an outstanding success. It handled the loads and spikes extremely well, consistently delivering results in under two seconds. This would not have happened if we had not validated performance beforehand.
“Our students certainly noticed the difference. We received a number of tweets praising the system’s performance. Many of our students couldn’t believe how quickly they obtained their results. Another great indication of our success was the low impact on our helpdesk. They didn’t receive complaints or issues regarding the system and that’s a big plus.”
As a result of deploying HP LoadRunner to validate the performance of MyResults, RMIT has realised considerable benefits: better-informed decision-making, since performance information is more readily available, and improved operational efficiency.
Tim says, “We are delighted with the outcomes of using HP LoadRunner. First and foremost, we rectified the performance issues to provide a results system that exceeded our own goals and those of our students. Thanks to the preparative measures we put in place, our system thrived and delivered 100 per cent uptime. This enabled us to provide a high quality student experience, which culminated in increased user satisfaction.
“Operationally, HP LoadRunner helped us to identify the most suitable option to improve our performance. It gave us confidence, prior to release, in MyResults’ ability, and allowed us to make informed business decisions, which reduced our deployment risk. In addition, we saved money, because we were more efficient and did not experience any downtime. Plus, we were able to fix issues in development, which is always a cheaper option, and we saved considerable time during testing by being able to re-use and repeat tests.”
Download the Case Study here [469 KB].