LoadRunner

The benefits of performance testing with LoadRunner

Often in IT projects, the last item to be considered is a performance test. This is a mistake! Poor performance leads to unhappy users. The key to any go-live event is to ensure that your system or application is ready and able to perform well under load (i.e. under the pressure of multiple users). By performance testing with LoadRunner and the help of an experienced performance engineer, you can ensure that your system or application—no matter how simple or complex—is ready to go live.

Let’s set up a scenario.

You’re undertaking a large project with high-pressure expectations and a multitude of stakeholders—and unfortunately, you’re over budget and behind schedule. Do you still schedule in performance testing instead of going straight to production?

What if you have an internal company system that is slowing down, and with it your staff’s ability to complete their work? Considering the bad press that accompanies an application or system crashing under load, is it worth the risk of attempting an upgrade without testing it first?

In both of these scenarios, the answer is yes—performance testing always de-risks projects, making them more likely to succeed and less likely to negatively impact your users.

How does LoadRunner work?

Using LoadRunner, we can emulate traffic from a user’s perspective. It allows us to create load and volume using the business processes that users perform, giving us the ability to see how the system will cope. Its capability to emulate user traffic and correlate associated metrics (system resources, web servers, network, etc.) is extremely powerful. It’s as close as you can get to a crystal ball in IT—letting you predict what will happen within your system before it goes to production and affects your users.

LoadRunner comes with a wide range of protocols that it can emulate such as WEB/HTML, RDP, Citrix, SAPGUI, and more. Using its scripting tool, LoadRunner Virtual User Generator (VuGen), performance engineers are able to record and script the different business processes so they accurately represent what users action. Then, in the LoadRunner Controller, a scenario is created so it models the anticipated load on a system. This is used to execute the test and collate results. Finally, in the LoadRunner Analysis tool, we can go over the results of the test and correlate metrics together to view the various aspects of the test. We can also work with the data to discover trends within the results that will hopefully pinpoint any potential performance issues.
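To make the Controller’s role concrete, here is a minimal conceptual sketch in Python, not actual VuGen or Controller code: a number of virtual users each repeat a timed business process concurrently, and the response times are collated for analysis. The `business_process` function is a hypothetical stand-in for what VuGen would record as protocol-level script code.

```python
import random
import statistics
import threading
import time

def business_process():
    """Hypothetical stand-in for a recorded business process;
    in VuGen this would be generated protocol code."""
    time.sleep(random.uniform(0.01, 0.03))  # simulated server work

def virtual_user(iterations, samples, lock):
    """One 'Vuser': repeat the business process, timing each pass."""
    for _ in range(iterations):
        start = time.perf_counter()
        business_process()
        elapsed = time.perf_counter() - start
        with lock:
            samples.append(elapsed)

def run_scenario(vusers=10, iterations=5):
    """A tiny 'Controller': launch concurrent Vusers and collate results."""
    samples, lock = [], threading.Lock()
    threads = [
        threading.Thread(target=virtual_user, args=(iterations, samples, lock))
        for _ in range(vusers)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return {
        "transactions": len(samples),
        "avg_s": statistics.mean(samples),
        "p90_s": statistics.quantiles(samples, n=10)[-1],  # 90th percentile
    }

if __name__ == "__main__":
    print(run_scenario())
```

A real LoadRunner scenario adds ramp-up schedules, think time, pacing, and server-side monitoring on top of this basic shape.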

Take the example below captured from LoadRunner’s analysis tool after a test:

LoadRunner has a long history in IT, with the first version released in 1989. It's an incredibly mature tool, with a solid reputation.

This was a test conducted for a large data-heavy web-based system. Notice the high response times across the board for each of the different processes—can you imagine the reaction from users of this system if response times are going over 100 seconds for each request?

Another example is testing a new version of an application. The expectation is that the new version should just work…right? A performance test later, the graph below showed that one of the transactions had picked up a performance defect: its response times now spiked constantly. Users had to wait an extended period to view documents, something that was not an issue before and that functional testing did not find. It was only while executing and analysing a performance test that this was noticed.


In 2000, a simplified version of LoadRunner called Astra LoadTest was launched. Today, if you kick LoadRunner hard enough you might even see an error dialog referring to Astra LoadTest.

So how do you go about solving the above problems? Firstly, ensure you get performance testing done to begin with! Then comes the analysis of the data you have collected:

  • Are resources high?
  • Has code been modified?
  • Is the database under-performing?
  • Is there a particular business process that contains high response times?
  • Is it a common component of the application that struggles?

Sometimes the answer is easy and obvious; other times it requires in-depth technical understanding of the environment as well as the results gathered by LoadRunner. Being able to combine performance test execution with in-depth technical analysis is crucial to finding meaningful results.
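As an illustration of the last two questions above, raw response-time samples can be grouped per transaction and ranked by a high percentile to highlight which business process is struggling. This is a hedged sketch in Python, not LoadRunner Analysis output; the transaction names and figures are invented:

```python
from collections import defaultdict
import statistics

# Invented sample data: (transaction name, response time in seconds),
# as might be exported from an analysis session.
samples = [
    ("Login", 1.2), ("Login", 1.4), ("Login", 1.1),
    ("Search", 2.0), ("Search", 2.2), ("Search", 1.9),
    ("View Document", 9.5), ("View Document", 11.0), ("View Document", 10.2),
]

def slowest_transactions(samples, top=3):
    """Rank transactions by 90th-percentile response time, worst first."""
    by_txn = defaultdict(list)
    for name, seconds in samples:
        by_txn[name].append(seconds)
    ranked = sorted(
        ((name, statistics.quantiles(times, n=10)[-1])
         for name, times in by_txn.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return ranked[:top]

if __name__ == "__main__":
    for name, p90 in slowest_transactions(samples):
        print(f"{name}: p90 = {p90:.1f}s")
```

Ranking by a high percentile rather than the average makes intermittent spikes, like the document-viewing defect described above, stand out even when most requests are fast.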

JDS consultants have more than a decade of experience working with LoadRunner, and we are one of the only Premier Partners of Micro Focus in Australia. This gives us an edge in performance testing that other providers cannot match. If you are in the process of introducing a new system or application, make sure you schedule a performance test with JDS. Contact our Micro Focus team today on 1300 780 432 or at microfocus@jds.net.au.

Already have a system or application in place, but looking to make it faster? Take advantage of our One Second Faster solution—five days for less than $5k to get a performance health check.

Our team on the case

Brad Halkett

Length of Time at JDS

Since January 2015

Skills

  • Highly experienced performance tester
  • Application performance monitoring
  • Strong technical skills across a range of technologies

Workplace Solutions

  • AppDynamics
  • HP LoadRunner
  • HP SiteScope
  • HP Virtual User Generator
  • JMeter

Every day, do something that people want.

Nick Wilton

Consultant

Length of Time at JDS

8.5 years

Skills

Primary: Software security, Performance optimisation

Secondary: DevOps, Software development, Technical sales

Workplace Solutions

I help clients to solve problems like:
  • Is my application secure?
  • How do I manage threats?
  • Will my application perform when I need it to?

Workplace Passion

It’s all about managing risk whilst driving business confidence in technology and software solutions. That’s what I’m passionate about.

Our Micro Focus stories

Posted by Amy Clarke in Micro Focus, Tech Tips
Case Study: Flash Group optimises performance of the Global Corporate Challenge Website

HP LoadRunner ensures performance and availability of Global Corporate Challenge’s website

HP LoadRunner software takes the guesswork out of the GCC website’s development. It provides confidence that the application will work as intended and it gives us the data we need to support our decisions.

August 2010

In 2009, Flash experienced some performance issues with the Global Corporate Challenge (GCC) website—the world’s first virtual health program that encourages corporations to help their employees get active and healthy—that resulted in speed degradation, functionality errors and site downtime.

With the number of GCC participants predicted to double in 2010 to 120,000, Flash needed to drive a higher level of application performance and mitigate the risks it had previously faced. The company turned to HP Preferred Partner JDS Australia for a solution, adopting a Business Technology Optimization (BTO) approach to application performance with HP LoadRunner software to predict the behaviour and performance of the GCC website under load.

Objective

Flash Group wanted to mitigate the risk of performance issues for the launch of the 2010 Global Corporate Challenge (GCC) website.

Approach

Flash engaged HP Preferred Partner JDS Australia and adopted a Business Technology Optimization (BTO) strategy with HP LoadRunner software to obtain an accurate picture of end-to-end system performance.

IT improvements

  • Ensured the quality and performance of the GCC website for the 2010 programme.
  • Established a standardised procedure for load testing the website.
  • Identified and eliminated performance bottlenecks to tune for better performance.
  • Matured its website development methodology.
  • Raised its profile and credibility as an organisation that produces high-performing, user-friendly websites.
  • Delivered 99.99 percent uptime on its systems with web servers only reaching 20 percent system capacity, and page response times of less than two seconds, which resulted in a high-quality user experience and enhanced the programme’s brand value.

About Flash Group

Flash Group (Flash) is one of Australia’s fastest growing full-service advertising agencies, offering integrated services including above and below the line advertising with in-house digital, strategy and design.

The company’s 30 staff are dedicated to servicing a group of high profile clients that spans retail, healthcare, travel, fashion, hardware, consumer electronics and entertainment. This includes leading brands such as Pioneer, Stanley, Global Corporate Challenge, Contiki, Origin Energy, Clive Peeters, and more.

Every year, the company assists Global Corporate Challenge (GCC), a world-first virtual health programme that encourages corporations to help their employees get active and healthy. The programme sees people from around the globe form teams and don pedometers for a period of 16 weeks and record their daily step count on the GCC website, which was designed and built by Flash.

Industry

Marketing and Advertising

Primary hardware

  • Multiple Virtual Web and Database Servers hosted externally running Windows Server 2008

Primary software

  • HP LoadRunner software

Predicting system behaviour and application performance

“The stability and performance of the GCC website is critical to the long-term success of the programme,” explains Carla Clark, digital producer, Flash Group.

“While we undertook some basic testing in 2009, we did not have adequate visibility to obtain an accurate end-to-end picture of the website’s performance, particularly at peak loads. This was apparent when we experienced issues during the 2009 program and it was the impetus for us to seek a performance validation solution.

“Despite the broad experience of our team, we wanted to leverage specialised expertise in performance validation, so we invited JDS Australia to recommend an appropriate software solution. We settled on HP LoadRunner software, due to its functionality, reliability and versatility.”

Partnership provides expertise and speeds time to benefit

An HP Platinum Partner and winner of the coveted HP Software and Solutions Partner of the Year Award for the past four years, JDS is widely regarded as a leader in the BTO space. The company provides extensive and in-depth knowledge of HP’s suite of testing and monitoring solutions, offering support to clients in a variety of industries.

The account manager at JDS Australia believes this is quite an unusual project, as Flash is one of the first creative agencies he has come across that realised the importance of performance validation for a website it had developed. “Ensuring that mission-critical systems such as the GCC website are available and performing as intended is something that all organisations grapple with. However, we don’t often see creative agencies trying to predict system behaviour and application performance at this level – that’s usually the domain of IT teams or developers.

“For organisations (such as Flash) that don’t have in-house performance testing expertise, getting a partner on-board takes the hassle out of deployment. In this instance, JDS provided a roadmap to help Flash mitigate the risk of deploying the GCC website and prevent the costly performance problems it had previously incurred. We helped the team stress test the website to handle the large increase in participants and determine the peak loads and transactional volumes, which in turn enabled us to recommend how best to set up the IT infrastructure. The testing also identified bottlenecks, which the website developers rectified this year.”

Carla Clark believes that having an HP partner involved made all the difference to this project. She says, “Having JDS on board meant that we could focus on our core competencies, while allowing them to do what they do best—provide the services needed to ensure the GCC website would be available and performing as and when required. JDS has assisted Flash in getting the most out of HP LoadRunner in a short space of time.”

Mitigating risk and gaining confidence

The company’s vision in adopting HP LoadRunner was to ensure the GCC website would be scalable in line with the rising number of users. “We wanted to adopt a long-term approach to this project and create a robust website to keep pace with the programme’s planned growth,” explains Tim Bigarelli, senior developer at Flash. “This also entailed the migration to a new IT infrastructure to further enhance our ability to support the website’s evolution.”

Flash began preparations for the launch of this year’s website by having JDS test the previous application on the old infrastructure to establish performance benchmarks. The next round of tests was applied to the new code base using both the old and new infrastructure. “The results uncovered were extremely beneficial as they enabled us to redevelop the website for maximum performance and functionality. But more importantly, it provided us with complete visibility into the performance of the application from end-to-end, which enabled us to verify that the new application would sustain loads of 1,000 concurrent users over the first peak hour on the launch day with an average login time of 7-8 minutes per user and average response times for all pages under two seconds to avoid abandonment,” adds Bigarelli.


Better decisions, operational efficiencies and improved client satisfaction

As a result of deploying HP LoadRunner to validate the performance of the GCC website, Flash has realised considerable benefits. The organisation has facilitated better decision-making, particularly on the development side, experienced operational efficiencies and improved client satisfaction.

Clark says, “HP LoadRunner software takes the guesswork out of the GCC website’s development. It provides confidence that the application will work as intended and it gives us the data we need to support our decisions. In short, it helps us avoid application performance problems at the deployment stage.

“By giving us a true picture of end-to-end performance, diagnosing application and systems bottlenecks and enabling us to tune for better performance, we mitigated the risk of failure for the GCC website. And with access to facts, figures and baseline measurements, we were able to tune the application for success.”

Putting the website to the test

Following considerable testing, Flash launched the GCC website on May 13, 2010. As expected, traffic was extremely high, with an average of 130,000 visitors on the first two days, and a peak of 8,403 visitors in the first hour.

“The GCC website performed according to our expectations and we are delighted with the business outcomes of HP LoadRunner software,” says Clark.

“Thanks to the preparative measures we put in place, our systems thrived and delivered 99.99 percent uptime, with our web servers only reaching 20 percent system capacity and page response times of under two seconds. This enabled us to provide a high-quality user experience, which is enhancing the programme’s brand value.

“Overall, HP LoadRunner software helped us solve key issues this year and identify areas for performance improvements for next year. We have benefited from knowing that performance testing prevents potential failures - such as the ones we experienced last year. As a result, we have considerably reduced the opportunity cost of defects, while driving productivity and quality in our operational environment to deliver a robust GCC website this year, that performs as intended.”

99.99% Uptime
< 2s Page response times
8,403 Visitors in first hour
130,000 Visitors in first two days

Project Outcomes

Business Benefits

  • Mitigated the risks of poor performance by adopting a consistent approach to load testing, enabling confident, informed decisions about the performance and scalability of the GCC website.
  • Gained a true picture of end-to-end performance, which enabled better decision-making and functionality changes.
  • Increased client satisfaction through a fast, high-performing website.
  • Resolved issues with the production architecture and configuration before users were impacted.
  • Gained understanding and confidence in the performance characteristics of the website prior to going live.

Looking ahead

HP will continue to play a key role as the performance validation backbone of the GCC website. By leveraging the functionality and flexibility of HP LoadRunner software, Flash will continue to derive value from predicting system behaviour and application performance. The company is also exploring options to extend its HP investment by utilising the HP LoadRunner scripts with HP Business Availability Center software to monitor the performance and availability of the GCC website from an end user perspective.

In the future, Clark is keen to have someone in the team take the lead on testing. She says: “This project has demonstrated to us just how important testing really is, so we are focused on ensuring it becomes part of our routine development. We are also keen to share the functionality of HP LoadRunner to other clients with similar-sized projects.

“On the whole, HP LoadRunner software has helped Flash mature its website development methodology. We deployed a higher quality GCC website, improved client satisfaction and raised our profile and credibility as an organisation that produces high-performing, user-friendly and scalable websites,” concludes Clark.

Our team on the case

Work smarter, not harder. (I didn't even come up with that. That's smart.)

Daniel Spavin

Performance Test Lead

Length of Time at JDS

7 years

Skills

IT: HPE LoadRunner, HPE Performance Center, HPE SiteScope, HPE BSM, Splunk

Personal: Problem solving, Analytical thinking

Workplace Solutions

I care about quality and helping organisations get the best performance out of their IT projects.

Organisations spend a great deal of time and resources developing IT solutions. You want IT speeding up the process, not holding it up. Ensuring performance is built in means you spend less time fixing your IT solutions, and more time on the problems they solve.

I solve problems in our customers’ solutions, so customers can use their solutions to solve problems.

Take the path of least resistance.

Michael Lee

Consultant

Length of Time at JDS

9 years

Skills

OMi, BSM, NNMi, SiteScope, VuGen, QTP/UFT


Why choose JDS?

At JDS, our purpose is to ensure your IT systems work wherever, however, and whenever they are needed. Our expert consultants will help you identify current or potential business issues, and then develop customised solutions to suit you.

JDS is different from other providers in the market. We offer 24/7 monitoring capabilities and support throughout the entire application lifecycle. We give your IT Operations team visibility into the health of your IT systems, enabling them to identify and resolve issues quickly.

We are passionate about what we do, working seamlessly with you to ensure you are getting the best possible performance from your environment. All products sold by JDS are backed by our local Tier One support desk, ensuring a stress-free solution for the entire product lifecycle.

Posted by Laura Skillen in Case Study, Entertainment, Micro Focus, Professional Services
Case Study: Bendigo Bank delivers a higher quality customer experience with HP

A quality and performance assurance process optimises the next-generation CRM system at Bendigo Bank.

HP has helped Bendigo Bank set the benchmark for ensuring our mission critical applications are high in quality and give the best performance to support our users in delivering excellent products and services.

June 2011

Bendigo Bank provides banking and wealth management services to individuals and small to medium businesses. It is represented in all states and territories with almost 900 outlets, including more than 190 company-owned branches, 250 locally-owned Community Bank® branches, 90 agencies and 800 ATMs.

With a tradition of adding value for customers through quality personal service, the bank recently began to look to technology as the enabler of service delivery and business performance. Realising its existing systems were account-centric and not customer-focused, the bank embarked on an ambitious program to align technology more closely with its business strategy. The result? It purchased Siebel Customer Relationship Management (CRM) and Universal Customer Master (UCM) applications to streamline customer-facing operations.

Known as ‘Enable Customer Phase 1’, the objective of this 18-month project was to introduce CRM and UCM capability across the organisation. As this would significantly impact 5,000 users and require considerable change management, the bank knew it had to deliver high-quality applications that functioned and performed at the levels demanded by the business.

“Enable Customer Phase 1 is the single largest implementation undertaken across the Bank in the past 15 years,” explains Robert Murphy, the project’s Technical Implementation Manager. “We had one chance to get it right and we knew quality assurance had to play a big part in the equation. We decided to make use of HP Quality Center software, which has been in the organisation for the past seven years. By leveraging an existing quality management solution, we could reduce our total cost of ownership and ensure a smoother transition to our new CRM platform.”

Objective

To drive the business value of its new customer-facing solutions, Bendigo Bank sought to standardise system quality and performance.

Approach

Bendigo Bank adopted a quality and performance assurance approach using HP Quality Center software and HP LoadRunner software.

IT improvements

  • Standard platform manages every aspect of system quality and performance
  • Centralisation enhances productivity
  • Isolated and fixed defects quickly
  • Established benchmarks for future enhancements
  • Fine-tuned testing efforts around data migration

About Bendigo Bank

Bendigo Bank is the retail arm of the Bendigo and Adelaide Bank Group, an Australian company formed in November 2007 as a result of the merger between Bendigo Bank and Adelaide Bank. A publicly listed company, the group is owned by more than 82,000 shareholders.

Industry

Banking and Finance

Primary applications

  • Siebel CRM
  • Siebel URM

Primary software

  • HP LoadRunner software
  • HP Quality Center v9

Partnership provides valuable and timely expertise

To complete the quality approach and ensure all aspects of the new system were tested, the bank appointed JDS Australia to provide services in load testing and performance management.

An HP Platinum Partner and winner of the coveted HP Software Partner of the Year Award for the past four years, JDS is widely regarded as a leader in the quality and performance testing space. The company provides extensive and in-depth knowledge of the HP suite of testing and monitoring solutions, offering support to clients in a variety of industries.

Steve Smith, JDS Australia’s Account Manager, believes that validating performance of newly deployed mission-critical systems is the key to achieving high user adoption and enhancing the consumer experience. Ensuring that applications are available and performing as intended is something that all organisations grapple with. JDS assisted Bendigo Bank by deploying HP LoadRunner to stress-test its Siebel CRM/URM system to ensure it could handle the peak loads and transactional volumes it would be subjected to, once live.

A quality ownership imperative

Prior to the adoption of HP Quality Center, the bank performed quality assurance on its core systems using a mixture of spreadsheets and documents. Following two mergers, the bank expanded rapidly and decided it needed to standardise its approach to quality assurance as a way of gaining some unity across the business and driving competitive advantage in a tough financial market. Today, the bank is firmly focused on retaining and growing its customer relationships, increasing loyalty and delivering personalised and consistent service experiences.

“We began the quality assurance part of the Enable Customer Phase 1 project by putting the ownership of quality in the hands of the business. We sought to make the business accountable for its operational outcomes. In short, we wanted quality management to be part of everyone’s mandate and HP Quality Center enabled us to do just that,” says Murphy.

“The quality management structure of this project was somewhat unusual. We used an iterative approach to development and put the business analysts, testers and developers into the one team. This allowed us to fast-track time to success by facilitating communication and collaboration. But more importantly, it bridged the gap between business and technology experts, aligning testing more closely to business outcomes.”

Almost 900 Outlets
800 ATMs
82,000+ Shareholders

Standardised processes improve decision-making

By providing a seamless, repeatable process for gathering requirements, planning and scheduling tests, analysing results and managing defects, HP has brought structure to managing quality for this project.

Murphy explains, “HP Quality Center creates an end-to-end quality management infrastructure to enforce standardised processes and best practices, such as our policy of ‘no work without a ticket’. It has given us the ability to streamline the management of defects, so that we can make effective ‘go/no-go’ decisions.

“By standardising on one quality platform we can do a lot of work in a short space of time, knowing that it is all contributing to our overall quality objectives. We can monitor the advancement of our work against these objectives to determine whether we are on track, on budget and on time. Having such insight into our progress delivers good governance and greatly improves decision-making.”

Testing what’s needed reduces risk

With quality firmly embedded in the centre of the organisation’s development mandate, ensuring that testing is prioritised according to business need was vital to achieving timely results for the bank.

HP Quality Center provides risk-based quality management to objectively assess and prioritise the highest-risk, highest-priority requirements, so testing efforts can be fine-tuned based on quantifiable business risk.

“HP Quality Center supports our approach of not wanting to test everything,” adds Murphy. “It enabled us to marry testing priorities with risk. We focused our testing efforts around data migration from our legacy systems into Siebel, as this was an integral part of future functionality.

“Prioritising our testing was also cost-effective in terms of centralisation and reusability. It meant that our people could store tests in one central location, review test planning information and reuse entire test plans or amend test cases across project components. Plus, having access to quality metrics put the business at ease because we could show that elements had been effectively tested and would work as intended.”

Validating performance

Gaining an understanding of how the Enable Customer project would meet the performance and scalability needs of the business was another objective the bank sought to achieve. Specifically, it wanted to obtain an accurate picture of end-to-end system performance before going live.

HP LoadRunner software was used to emulate the bank’s working environment with thousands of concurrent users. It stressed the application from end-to-end, applying consistent, measurable and repeatable workloads and identified issues that would affect its users in production.

“As we drove loads against the system, HP LoadRunner captured end-user response times for key transactions. It showed us that had we gone live, our users would have experienced slow performance when printing following a query. We rectified the issue in five days, but without HP LoadRunner it could easily have taken us a month or more to fix it.

“In the end, HP LoadRunner verified that our new Siebel CRM/URM system would meet specified performance requirements including sub-second response times,” confirms Murphy.

Project Outcomes

Quality, confidence, and success

After extensive testing and a successful pilot in two branches, the bank recently went live on Enable Customer Phase 1 without any showstoppers.

“We are delighted with the success of the project’s deployment and have achieved good outcomes through quality and performance testing,” adds Murphy. “Throughout the course of the project, we were able to isolate and fix defects quickly, automate quality processes and establish benchmarks for future enhancements. Quite simply, we delivered a high-quality, high-performing, robust system to support our people.”

“The value that HP Quality Center has brought to Bendigo Bank can be summarised in terms of standardisation, visibility and insight. We gained an end-to-end quality management infrastructure that gave us visibility into every element of the system and the insight we needed to make good decisions.”

Business Benefits

  • Gained 360-degree visibility into application quality
  • Went live on the single largest IT implementation in 15 years (Siebel CRM/URM) which functioned and performed at levels demanded by 5,000 users
  • Rectified performance issue in five days instead of a month
  • Aligned testing to business outcomes by facilitating communication and collaboration among business analysts, testers and developers
  • Reduced application deployment risk
  • Streamlined management process to assist with go/no-go decisions
  • Monitored the progress of work against objectives to track timeliness, budget and readiness

Looking ahead

HP Software will continue to play a key role as the backbone of Bendigo Bank’s quality and performance validation engine.

“We have successfully deployed one of the largest customer-facing projects in the history of the bank. Our focus now is on continuing to manage quality and performance of this system on a quarterly basis, ensuring that updates, changes, and upgrades are validated prior to release.

“Overall, HP has helped Bendigo Bank set the benchmark for ensuring our mission critical applications are high in quality and give the best performance to support our users in delivering excellent products and services,” concludes Murphy.

Our team on the case

Ensure IT works.

Nick Johnson

IT Consultant

Length of Time at JDS

10+ years

Skills

IT monitoring architecture / design / implementation, IT performance testing architecture / design / implementation, support services, presales, IT Sales, IT monitoring strategy development.

Every day, do something that people want.

Nick Wilton

Consultant

Length of Time at JDS

8.5 years

Skills

Primary: Software security, Performance optimisation

Secondary: DevOps, Software development, Technical sales

Workplace Solutions

I help clients to solve problems like:
  • Is my application secure?
  • How do I manage threats?
  • Will my application perform when I need it to?

Workplace Passion

It’s all about managing risk whilst driving business confidence in technology and software solutions. That’s what I’m passionate about.

Why choose JDS?

At JDS, our purpose is to ensure your IT systems work wherever, however, and whenever they are needed. Our expert consultants will help you identify current or potential business issues, and then develop customised solutions to suit you.

JDS is different from other providers in the market. We offer 24/7 monitoring capabilities and support throughout the entire application lifecycle. We give your IT Operations team visibility into the health of your IT systems, enabling them to identify and resolve issues quickly.

We are passionate about what we do, working seamlessly with you to ensure you are getting the best possible performance from your environment. All products sold by JDS are backed by our local Tier One support desk, ensuring a stress-free solution for the entire product lifecycle.

Posted by Laura Skillen in Case Study, Financial Services, Micro Focus
Case Study: RMIT implementation of LoadRunner

Case Study: RMIT implementation of LoadRunner

RMIT University used an HP LoadRunner implementation to drive student satisfaction, through fast access to results.

We have a good relationship with JDS, having worked together on many assignments in the past five years. We see them as an extension of our Test Management group and collaborate closely to maximise results. Their specialist testing skills complement RMIT’s broader technology skills to provide better business outcomes.

August 2011

Twice a year, RMIT publishes course results for its students on the student portal of the university’s website. These two days in the calendar generate the highest volume of website traffic. With such huge spikes experienced in a 24-hour period, RMIT’s student portal was being saturated and not handling the volume effectively. This resulted in poor performance and students reported difficulties in accessing their results in a timely fashion.

Tim Ash, RMIT’s senior production assurance and test manager, explains, “The single most important aspect of our students’ lives, apart from the education they receive, is knowing how well they are doing in their studies. Having a stable student portal is critical to satisfy our students’ needs and deliver a high level of service. It also impacts on our credibility and future success.

“For this reason, we identified performance testing as an essential part of future application delivery. With limited resources available in-house and a tight deadline, we invited JDS Australia (JDS), an HP Platinum Partner, to help us to predict future system behaviour and application performance."

Objective

Predict system behaviour and application performance to improve student satisfaction

Approach

RMIT University verified that its new critical student results application (called MyResults) met specified performance requirements

IT improvements

  • Achieved 100 percent uptime on MyResults
  • Delivered student results 60 per cent faster than predicted
  • Gained a true picture of end-to-end performance of MyResults
  • Emulated peak loads of 20,000+ student logins per hour – more than six times the average loads
  • Uncovered and rectified functional issues prior to going live
  • No helpdesk complaints logged for poor system performance
  • Optimisation opportunities were identified that enabled the student portal to be scaled more appropriately

About RMIT University

As a global university of technology and design, RMIT University is one of Australia’s leading educational institutions, enjoying an international reputation for excellence in work-relevant education.

RMIT employs almost 4,000 people and has 74,000 students studying at its three Melbourne campuses and in its Vietnam campuses. The university also has strong links with partner institutions which deliver RMIT award programmes in Singapore, Hong Kong, mainland China, and Malaysia, as well as research and industry partnerships on every continent.

RMIT has a broad technology footprint, counting its website and student portal as mission-critical applications essential to the day-to-day running of the university.

Industry

Higher Education

Primary applications

  • SAP Financials
  • PeopleSoft Student Administration

Primary software

  • HP LoadRunner software
  • HP Quality Center
  • HP QuickTest Professional software

Selecting an industry standard

A long-term user of HP software including HP Quality Center, HP QuickTest Professional and HP Business Availability Center, RMIT chose HP LoadRunner software as its performance validation platform.

“HP LoadRunner is the industry standard for performance testing,” explains Tim. “It was a natural choice for RMIT, due to its functionality and integration capabilities as well as our investment in other HP software.”

Predicting system behaviour

In seeking to prevent future student portal performance problems, RMIT had identified a number of potential solutions.

Tim explains, “Due to the complex nature of our student portal, it was unclear which solution design would provide the best performing architecture. HP LoadRunner was used to obtain an accurate picture of end-to-end system performance by emulating the types of loads we receive on result days. We tested all the options. Our objective was to handle the loads and satisfy 100 percent of our students’ results queries in under five seconds.”

Emulating loads

By leveraging the experience of JDS, HP LoadRunner was used to emulate peak loads of 20,000+ student logins per hour, which is more than six times the average loads experienced on non-results days. “The data from our tests revealed that we needed our student portal platform to have the ability to scale considerably, to handle traffic up to six times the usual volume on result days,” explains Tim. “As we drove loads against the various design options, we also captured end-user response times.

“Based on this, we selected the design solution that had the best performance. There were several options to choose from with some going through a portal which put unnecessary load on the system. The brand new application chosen is called MyResults, which, during testing, met our performance objective and delivered response times in less than two seconds.”

The JDS account manager for the case says, “As well as testing the performance of the various design solutions, optimisation opportunities were identified that enabled the student portal to be scaled more appropriately. Using HP LoadRunner, JDS consultants identified bottlenecks in the application and platform. By working with RMIT, they were able to direct efforts to remediate the problems prior to going live.”


Successful go-live drives student satisfaction

While MyResults is a simple application that delivers results, its structure is quite complex as it receives inputs from various databases. Ensuring it would perform as intended on results day was key.

Tim says, “By using HP LoadRunner, we significantly decreased the risk of deploying an application that would not meet our performance requirements. On results day, MyResults proved an outstanding success. It handled the loads and spikes extremely well, consistently delivering results in timeframes sub-two seconds. This would not have happened if we had not validated performance beforehand.

“Our students certainly noticed the difference. We received a number of tweets praising the system’s performance. Many of our students couldn’t believe how quickly they obtained their results. Another great indication of our success was the low impact on our helpdesk. They didn’t receive complaints or issues regarding the system and that’s a big plus.”

Boosting reputation

As a result of deploying HP LoadRunner to validate the performance of MyResults, RMIT has realised considerable benefits. The institution has facilitated better decision-making as information is more readily available, and experienced operational efficiencies.

Tim says, “We are delighted with the outcomes of HP LoadRunner. First and foremost, we rectified poor performance issues to provide a results system that exceeded our own goals and those of our students. Thanks to the preparative measures we put in place, our system thrived and delivered 100 percent uptime. This enabled us to provide a high-quality student experience, which culminated in increased user satisfaction.

“Operationally, HP LoadRunner helped us to identify the most suitable option to improve our performance. It gave us confidence (prior to release) in MyResults’ ability and allowed us to make informed business decisions, which reduced our deployment risk. In addition, we saved money, because we were more efficient and did not experience any downtime. Plus, we were able to fix issues in development which is always a cheaper option, and we saved considerable time during testing by having the ability to re-use and repeat tests."

Project Outcomes

  • 100% MyResults uptime
  • Student results delivered 60 per cent faster than predicted
  • 20,000+ student logins per hour handled

Business Benefits

  • The university's reputation was boosted by providing students with reliable and fast access to results—students tweeted their satisfaction.
  • Money was saved through having zero downtime and the ability to fix issues early.
  • RMIT could make informed decisions and reduce the risk of deploying poor quality applications.
  • University and student confidence in IT systems was improved.

Next steps

HP LoadRunner will continue to play a key role as the performance validation backbone of MyResults. Tim believes RMIT is up to date with testing technology, thanks to the HP solutions it utilises and its adoption of a stringent testing methodology.

“Students are early adopters of technology. So it’s logical that our next big push is mobility and wireless applications. More and more, our students are accessing RMIT’s website using mobile devices and we need to make sure our applications are optimised accordingly. We want students to be able to log in from home and attend a lecture from their laptop or access their results from their mobile phones. HP LoadRunner will be used to ensure we can meet the wireless requirements of the university’s future.

“Looking at the big picture, HP LoadRunner is an essential component of RMIT’s technology footprint. It allows us to perform due diligence on our applications, make go live decisions with confidence and provide statistical information that is trusted and relied upon by the institution.

“Ultimately, HP LoadRunner gives me peace of mind that RMIT’s systems will work as intended and deliver the quality of service our students and staff expect.”

Our team on the case

Stop, collaborate and listen.

Felix Wawolangi

Consultant

Length of Time at JDS

Since 2010

Skills

IT: Splunk, HPE NNMi, SiteScope, BSM, Network Automation

Personal: Connecting the dots, reading between the lines, thinking outside of the square

Workplace Passion

Ensuring that the monitoring solution implemented provides value to the organisation instead of being a burden for the operators.

Every day, do something that people want.

Nick Wilton

Consultant

Length of Time at JDS

8.5 years

Skills

Primary: Software security, Performance optimisation

Secondary: DevOps, Software development, Technical sales

Workplace Solutions

I help clients to solve problems like:
  • Is my application secure?
  • How do I manage threats?
  • Will my application perform when I need it to?

Workplace Passion

It’s all about managing risk whilst driving business confidence in technology and software solutions. That’s what I’m passionate about.

Why choose JDS?

At JDS, our purpose is to ensure your IT systems work wherever, however, and whenever they are needed. Our expert consultants will help you identify current or potential business issues, and then develop customised solutions to suit you.

JDS is different from other providers in the market. We offer 24/7 monitoring capabilities and support throughout the entire application lifecycle. We give your IT Operations team visibility into the health of your IT systems, enabling them to identify and resolve issues quickly.

We are passionate about what we do, working seamlessly with you to ensure you are getting the best possible performance from your environment. All products sold by JDS are backed by our local Tier One support desk, ensuring a stress-free solution for the entire product lifecycle.

Posted by Laura Skillen in Case Study, Higher Education, Micro Focus, 0 comments
What’s new in LoadRunner 12.53

What’s new in LoadRunner 12.53

HPE recently announced the next update to LoadRunner – bringing it to version 12.53. While not a major release, there were quite a few updates, enhancements and additions.

Here’s a rundown and some initial thoughts on the new release.

Brand New

  • GitHub Integration
    You can now use GitHub as your script repository straight from VuGen. This means teams can share the same script and take full advantage of GitHub’s versioning features as well as the extras like the issues register and Wiki.
  • New PCoIP protocol
    PC-over-IP is a new GUI protocol similar to Citrix and RDS. It uses a technology developed by Teradici to simulate users from remote locations (including mobile and tablets) on a centralised server.

  • New HTTP/2 support
    LoadRunner was an early supporter of SPDY – the precursor to HTTP/2 – and now LoadRunner includes recording and scripting support for HTTP/2.
  • New methods to manipulate JSON
    With AJAX commonplace among web apps, and JSON extensively used in AJAX, LoadRunner now comes with some methods to help manage these requests. Some examples include: lr_eval_json, lr_json_get_values, lr_json_stringify.
  • New C Interpreter
    The C Interpreter has changed from the LCC compiler to the latest Microsoft C Runtime. What does this mean for you? Probably not a lot. 64-bit integers are now supported, so you can declare a ‘long long’ – but scripts that use them won’t be backwards compatible.
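The lr_eval_json, lr_json_get_values and lr_json_stringify functions named above are the real LoadRunner APIs; the snippet below is a plain-Python sketch of the same parse/query/stringify flow (the field names are illustrative, and this is not LoadRunner code):

```python
import json

# A typical AJAX response body (illustrative data)
body = '{"user": {"id": 42, "name": "jsmith"}, "items": [1, 2, 3]}'

# lr_eval_json-style: parse the response buffer into an object
obj = json.loads(body)

# lr_json_get_values-style: query a value out of the object
user_name = obj["user"]["name"]

# lr_json_stringify-style: serialise part of the object back to a string
items_payload = json.dumps(obj["items"])

print(user_name)       # jsmith
print(items_payload)   # [1, 2, 3]
```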

Browser Support

  • Microsoft Edge (new to this version)
  • Internet Explorer 10 & 11 (IE 9 is no longer supported)
  • Chromium 46
  • Firefox 40

VuGen Improvements

  • REST GUI editor – giving you a better way to record a REST script.
  • Improved support for creating a script via a packet capture (PCAP, HTML Archive, SAZ).
  • Check compatibility with Linux load gens (e.g. lr_load_dll).
  • General tidyups with UI, reports, NV Analytics and correlation.

 


 
Other Improvements

  • Better object identification in TruClient – including selecting a random item through descriptors.
  • Validate text in Citrix scripts by using OCR: ctrx_get_text_ocr().
  • Support for Java 8.
  • Support for HTTP video streaming on HTML 5, including metrics on lagging and buffering.
  • New cloud provider – DigitalOcean joins Amazon EC2, Microsoft Azure and Google Compute Engine.
  • New look Analysis HTML report – shows NV location for each transaction, graph titles have a larger font size and the background colour defaults to white.

 
 
See the official readme in the LoadRunner Help Center.

Posted by Daniel Spavin in Tech Tips, 0 comments
Tips for replaying RDP VuGen scripts in BSM or LoadRunner

Tips for replaying RDP VuGen scripts in BSM or LoadRunner

If you’ve ever worked with RDP VuGen scripts, you’ll know it can be challenging to develop a reliable script. This applies if you need to generate high load via LoadRunner RDP scripts for performance testing. It also applies if you need reliable RDP scripts over time as part of a monitoring solution using Business Process Monitors (BPMs) in HP BSM. Differentiating between false errors and real errors with RDP VuGen scripts can be time consuming and drive you insane.

Over and above writing a robust RDP VuGen script, here are some tips to more reliably run RDP scripts in either LoadRunner or HP BSM. Using a combination of the tips below, we have successfully run 250+ concurrent RDP user tests in LoadRunner with no errors, even without the RDP agent. There are a number of additional techniques to make RDP scripts more reliable, but these are beyond the scope of this article.
 
Script Run-Time Settings:

RDP Synchronization Image Tolerance:

Set the default tolerance for image synchronization to Exact. This should be the global setting. You can relax the tolerance on a case-by-case basis in the rdp_sync_on_image function using "Tolerance=Medium", where the tolerance can be Exact, Low, Medium or High.

RDP Synchronization Typing Speed:

Set the typing speed to at least 300 msec/char. In some cases the script will still run faster in LoadRunner than you want it to, in which case you can try increasing the value further.
 
RDP Server settings:

RDP-tcp Session Properties:

Set your Terminal Server settings to automatically end disconnected sessions after 1 minute. This ensures that any disconnected or hung sessions are automatically logged out so the next user can pick up a “clean” session. If allowed, also set the Active and Idle session limits to Never, so there are no limits on how long a user can be active or idle in the RDP session.

These options are found in Remote Desktop Session Host Configuration > Properties > Sessions Tab.

  • End a disconnected session: 1 minute
  • Set Active session limit: Never
  • Idle session limit: Never

 
Still having problems?

Connection reset by server errors

During test execution on the LoadRunner Controller, you are getting the following error:

-202930 xxx.c(xx): Error: Connection reset by the server


 

Generally this is a difficult error to root-cause, but if you have a scenario with large think time and/or pacing (minutes) and find that your vusers are failing with the above error, we suggest you try this first.

Increase the Socket receive buffer size (bytes) in Run-Time settings. This option is found under Run-Time Settings > RDP > Advanced > Socket receive buffer size (bytes).

VuGen has a description as follows:

The amount of bytes to allocate for the socket’s receive buffer. If the buffer is too small, it can fill up causing the server to disconnect. If the buffer is too large, it uses more local system resources (non-paged memory) and can slow system performance.


 

Tech tips from JDS

Posted by Yauseng Chew in Micro Focus, Tech Tips, 0 comments
LoadRunner Correlation with web_reg_save_param_regexp

LoadRunner Correlation with web_reg_save_param_regexp

Do you have a correlation you can’t solve because the values of the left and right boundaries are dynamic? Correlation is an essential part of performance test scripting, and it presents plenty of different challenges. Imagine having a value of “GraphA123567EndGraphA” where the goal is to correlate 123567.

From the example above, the left and right boundaries would be “LB=GraphA” and “RB=EndGraphA”.
But what if the word GraphA is dynamic and can be anything from GraphA to GraphZ?

There is a solution at hand!

Using web_reg_save_param_regexp allows you to grab a correlation value using dynamic left and right boundaries. This function uses the power of regular expressions; below are a few examples:

Example 1:
Source: “GraphA123567EndGraphA”
Solution: web_reg_save_param_regexp("ParamName=CorrValue", "RegExp=Graph[A-Za-z]([0-9]+)EndGraph[A-Za-z]", LAST);
Result: 123567
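You can sanity-check the pattern outside VuGen with any regex engine. For instance, in Python (an illustrative equivalent, not LoadRunner code):

```python
import re

# The dynamic boundaries and the capture group live in one pattern:
# [A-Za-z] absorbs the changing letter after "Graph" and "EndGraph"
source = "GraphA123567EndGraphA"
match = re.search(r"Graph[A-Za-z]([0-9]+)EndGraph[A-Za-z]", source)
print(match.group(1))  # 123567
```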

Example 2:
Correlate the values from a drop down list of a form
Source: dropdown >>> red, blue, green
Solution: web_reg_save_param_regexp("ParamName=ColourList", "RegExp=option=[0-9]+>([A-Za-z]+)", "Ordinal=All", LAST);

  • {ColourList_1}=red
  • {ColourList_2}=blue
  • {ColourList_3}=green

Example 3:
Correlate up till the end of 642
Source: J\u002blsGd3zj1qdP\u002bvk0vDRaKyJFde5tCa6spDEy08SNab1hP8j5GTs4j6\u002f\u002bTqOwvxMHEQZLWd\u002btu8NlHJrVAarIQ==|634998513832503642"];
Solution: web_reg_save_param_regexp("ParamName=SecurityString", "RegExp=\"([A-Z0-9a-z\\\\+]+==\\|[0-9]+)\"\\];", LAST);
Result:J\u002blsGd3zj1qdP\u002bvk0vDRaKyJFde5tCa6spDEy08SNab1hP8j5GTs4j6\u002f\u002bTqOwvxMHEQZLWd\u002btu8NlHJrVAarIQ==|634998513832503642

Example 4:
Correlate only “634998513832503642”
Source:

J\u002blsGd3zj1qdP\u002bvk0vDRaKyJFde5tCa6spDEy08SNab1hP8j5GTs4j6\u002f\u002bTqOwvxMHEQZLWd\u002btu8NlHJrVAarIQ==|634998513832503642"];

Solution:

web_reg_save_param_regexp("ParamName=SecurityString",
    "RegExp=\"[A-Z0-9a-z\\\\+]+==\\|([0-9]+)\"\\];",
    LAST);

Result: 634998513832503642
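Again, the capture can be verified outside VuGen. Here is the same match in Python (illustrative only; note the doubled backslashes in the C string above collapse to single regex-level backslashes):

```python
import re

# The source string contains literal backslash-u sequences, not unicode escapes
source = (r'"J\u002blsGd3zj1qdP\u002bvk0vDRaKyJFde5tCa6spDEy08SNab1hP8j5GTs4j6'
          r'\u002f\u002bTqOwvxMHEQZLWd\u002btu8NlHJrVAarIQ==|634998513832503642"];')

# Character class covers letters, digits, backslash and plus; the group
# captures only the digits after the pipe
match = re.search(r'"[A-Za-z0-9\\+]+==\|([0-9]+)"\];', source)
print(match.group(1))  # 634998513832503642
```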

So what is a Regular Expression?
Also known as regex, a regular expression is a search pattern used to match strings. Think of it as an advanced search function which can pick out values from a string of characters.

Examples of regex:

  • \d matches a single digit
  • \w matches a single word character (letter, digit or underscore)
  • [A-Z]+ matches one or more uppercase letters
  • [a-z]+ matches one or more lowercase letters
  • [0-9]+ matches one or more digits
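These building blocks behave the same way in any regex engine; a quick Python check:

```python
import re

assert re.fullmatch(r"\d", "7")            # a single digit
assert re.fullmatch(r"\w", "_")            # a single word character
assert re.fullmatch(r"[A-Z]+", "ABC")      # one or more uppercase letters
assert re.fullmatch(r"[a-z]+", "abc")      # one or more lowercase letters
assert re.fullmatch(r"[0-9]+", "12345")    # one or more digits
assert not re.fullmatch(r"[A-Z]+", "abc")  # character classes are case sensitive
```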

There are other alternatives to web_reg_save_param_regexp. However, these functions are limited and not as flexible.

LB/DIG or RB/DIG – # acts as a wildcard for a single digit, e.g. “LB/DIG=session_id_##”

  • Adding IC (i.e. LB/IC/DIG) will also ignore case
  • e.g. “LB/IC/DIG=session_id_##=” matches “Session_id_20=”

LB/ALNUM or RB/ALNUM – ^ will be a wildcard for an alphanumeric value

  • ALNUMIC – ignore case
  • ALNUMLC – match only lower case
  • ALNUMUC – match only upper case

SaveOffset

  • Use when a dynamic value immediately follows the boundary, e.g. “session_id_2” where the digit could be 3, 4, 5…
  • SaveOffset=2 skips the two characters (“2=”) after the left boundary before capturing
  • web_reg_save_param(“SessionID”, “LB=session_id_”, “RB=\””, “SaveOffset=2”, LAST);
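The offset logic is easy to picture with a small helper (extract is a hypothetical function for illustration, not a LoadRunner API; it mimics what LB/RB plus SaveOffset do):

```python
def extract(text, lb, rb, save_offset=0):
    """Capture text between boundaries, skipping save_offset characters
    after the left boundary (mimics LB/RB/SaveOffset behaviour)."""
    start = text.index(lb) + len(lb) + save_offset
    end = text.index(rb, start)
    return text[start:end]

# "session_id_" is followed by a changing digit plus "=", so skip 2 characters
html = 'var token = "session_id_2=abc123";'
print(extract(html, 'session_id_', '"', save_offset=2))  # abc123
```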

LR implementation

  • Perl-based regular expression syntax
  • LR 11 does not support multiple capture groups; however, this is now supported in LR 11.52 (example below)

Example Multiple Capture Groups
Source: rows":[["NW,RO,RA","DLY","10/07/2011","10/17/2011","10/01/2011","RA","Y","FR","AMEA","AC","1945","50","50","AC 100IOSH-08","UserDefined","10/07/2011","Reassigned"…"

Solution: web_reg_save_param_regexp("ParamName=ParamValue", "RegExp=rows\":\\[\\[\"[^\"\\r\\n]*\",\"([A-Z]{3})\",\"[^\"\\r\\n]*\",\"[^\"\\r\\n]*\",\"[^\\/]+\\/[\\d]+?\\/2011\",\"[A-Za-z]*\",\"[^\"\\r\\n]*\",\"[^\"\\r\\n]*\",\"([^\"\\r\\n]*)\",\"[^\"\\r\\n]*\",\"([^\"\\r\\n]*)\"", LAST);

Result:

  • {ParamValue1} = DLY
  • {ParamValue2} = AMEA
  • {ParamValue3} = 1945
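Conceptually, multiple capture groups all populate from a single search, one numbered parameter per group. A simplified Python sketch with a shortened row (not the full pattern above):

```python
import re

# A shortened version of the grid row, with three fields to capture
row = '"NW,RO,RA","DLY","10/07/2011","AMEA","1945"'

# Three capture groups -> three values from one search,
# mirroring ParamValue1..ParamValue3 in LR 11.52+
m = re.search(r'"([A-Z]{3})","[^"]*","([A-Z]+)","([0-9]+)"', row)
print(m.groups())  # ('DLY', 'AMEA', '1945')
```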
Posted by Lionel Lim in Micro Focus, Tech Tips, 11 comments
Understanding LoadRunner Virtual User Days (VUDs)

Understanding LoadRunner Virtual User Days (VUDs)

Previously we wrote about the different types of LoadRunner licenses available. There were still a lot of questions about VUDs, so this Tech Tip aims to help people better understand this virtual user license type.

What is a VUD

A VUD is a Virtual User Day – a virtual user license that is valid for a maximum of 24 hours.

VUDs provide a cost-effective method of boosting the number of virtual users available in a performance test, or a set of performance tests run over a period of time. In addition to the license for VUDs, a license for the LoadRunner Controller or Performance Center must be in place.

How to order VUDs

VUDs are available to any LoadRunner or Performance Center installation. Licenses can be ordered through HP or an HP Software partner like JDS Australia. The license key will be issued by HP in the same way as the other LoadRunner license types.

Configuring VUDs

Virtual User Days have a daily expiration/start time, which can be set through HP support. Set it to an hour or two after the performance tester first arrives at the office – ideally, a time when you are not running a test. If a performance test involving VUDs has been run, the LoadRunner Controller should be shut down before the expiration/start time occurs.

VUDs can be used in conjunction with permanent/perpetual virtual users; there is no conflict between the two licenses. VUDs are only expended after all of the eligible permanent virtual users have been used.

VUDs are activated by running a virtual user: each virtual user run is decremented from the license.

Spending VUDs

Running a test with a number of VUs less than or equal to the amount on a permanent license will not result in an expenditure of any VUDs.

Poor Expenditures of VUDs

Running a test which starts before the expiration time and ends after it results in a double charge of VUDs.

When the Controller is open at the rollover time – even if the test run has completed – the VUs allocated to the current test will again be decremented from the license. It is therefore unwise to run tests on a Friday night unless you plan to return to the office to shut down the Controller before the start time occurs.

Wise Expenditures of VUDs

Run multiple tests in a day – make use of the VUDs in more than one test per day. If possible, run your first test using VUDs, analyze the results, recommend and implement changes to the system under test, and then rerun the same test. This gives multiple uses of the same VUDs while testing configuration changes of the system under test. Make certain the test scenario is closed before the expiration/start time for the VUDs ticks over.

Analysis is more important than rerunning tests with VUDs: accurate identification of issues within the system returns more value than rerunning tests to confirm results. When writing the test plan, take the VUDs into account and try to maximize the value of each by stacking tests that use similar quantities of VUDs but accomplish different goals.

Examples

The following examples are designed to test your knowledge of VUDs.

Scenario 1
  • Permanent Web VUs available: 200
  • Web VUDs available: 1000
  • VUD expiration/start time: 11:00

300 VU Endurance Run: Kick off test at 17:00, watch rampup and head home. Test completes at 05:00 the next day. Enter the office at 09:00. Review VUser errors through the Controller for an hour. Shutdown the controller or reset the scenario before 10:59:59.

  • VUDs expended: 100
  • VUDs remaining: 900
Scenario 2
  • Permanent Web VUs available: 200
  • Web VUDs available: 1000
  • VUD expiration/start time: 11:00

300 VU Endurance Run: Kick off test at 17:00, watch rampup and head home. Test completes at 11:00 the next day. Enter the office at 09:00. Review VUser errors through the Controller for an hour. Shutdown the controller or reset the scenario after 11:00.

  • VUDs expended: 200 – although 100 remain for the rest of the day.
  • VUDs remaining: 800
Scenario 3
  • Permanent Web VUs available: 200
  • Web VUDs available: 1000
  • VUD expiration/start time: 11:00

700 VU Peak Run: Kick off test at 11:30, watch rampup and analysis of the run while running. Test completes at 14:30. Controller is shutdown or reset following the scenario completion.

  • VUDs expended: 500
  • VUDs remaining: 500
Scenario 4
  • Permanent Web VUs available: 200
  • Web VUDs available: 1000
  • VUD expiration/start time: 11:00

700 VU Peak Run: Kick off test at 11:30, watch rampup and analysis of the run while running. Test completes at 14:30. Controller is shutdown or reset following the scenario completion.
700 VU Peak Run: Make changes to the application. Kick off test at 15:00, watch rampup and analysis of the run while running. Test completes at 18:30. Controller is shutdown or reset prior to 11:00 the next day, following the scenario completion.

  • VUDs expended: 500
  • VUDs remaining: 500
Scenario 5
  • Permanent Web VUs available: 200
  • Web VUDs available: 1000
  • VUD expiration/start time: 11:00

700 VU Peak Run: Kick off test at 11:30, watch rampup and analysis of the run while running. Test completes at 14:30. Controller is shutdown or reset following the scenario completion.

700 VU Peak Run: Make changes to the application. Kick off test at 15:00, watch rampup and analysis of the run while running. Test completes at 18:30. Controller is not shutdown or reset prior to 11:00 the next day, following the scenario completion.

  • VUDs expended: 1000
  • VUDs remaining: 0 - although 500 remain available for the remainder of the day.
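The per-test charging rules in these scenarios can be summarised in a small model (a deliberate simplification for illustration – it ignores same-day reuse across tests, and the actual license accounting is done by LoadRunner itself):

```python
def vuds_expended(test_vus, permanent_vus, open_past_expiration):
    """VUDs cover only the users beyond the permanent license; leaving the
    Controller open across the daily expiration time charges them again."""
    vuds = max(0, test_vus - permanent_vus)
    return vuds * 2 if open_past_expiration else vuds

print(vuds_expended(300, 200, False))  # Scenario 1: 100
print(vuds_expended(300, 200, True))   # Scenario 2: 200 (double charge)
print(vuds_expended(700, 200, False))  # Scenario 3: 500
```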

Please leave a comment if you still have questions about VUDs.

Tech tips from JDS

Posted by Ward Gushee in Micro Focus, Tech Tips, 0 comments
Problems recording HTTPS with VuGen

Problems recording HTTPS with VuGen

Recently a client had an urgent request to monitor an HTTPS URL due to poor availability and performance. No problem – give me the URL and I’ll have the monitor set up and running in 10 minutes. However, a simple task turned into an investigation of VuGen certificates and Windows security patching.

For any HTTPS request, VuGen would not show any events after code generation, and the recording browser would show an error.

The recording environment was:

  • Vugen 11.04
  • Windows XP
  • Internet Explorer 7

As HTTPS requests worked from normal browsing, the problem pointed towards a certificate issue somewhere between VuGen and the requested site. Investigation discovered that a recent Windows security patch (http://support.microsoft.com/kb/2661254) now blocks all RSA certificates less than 1024 bits long.

This is a problem for VuGen, as it uses an RSA private key of length 512 bits in the files wplusCA.crt and wplusCAOnly_.crt.

Note: In VuGen 11.50 these files are called wplusCA_Expiration_2020.crt and wplusCAOnly_Expiration_2020.crt.

You can find the VuGen certificates in the following directory:

<LoadRunner installation folder>\bin\certs\

Fortunately, HP is aware of the problem and has issued critical updates to increase the private key length to 2048 bits.

Note: you will need a valid HP Support account to download these patches.

Tech tips from JDS