Author: JDS

Case Study: RMIT implementation of LoadRunner

Twice a year, RMIT publishes course results for its students on the student portal of the university’s website. These two days generate the highest volume of website traffic in the calendar. With such huge spikes concentrated in a 24-hour period, RMIT’s student portal was saturated and not handling the volume effectively. This resulted in poor performance, and students reported difficulties accessing their results in a timely fashion.

Tim Ash, RMIT’s senior production assurance and test manager, explains, “The single most important aspect of our students’ lives, apart from the education they receive, is knowing how well they are doing in their studies. Having a stable student portal is critical to satisfy our students’ needs and deliver a high level of service. It also impacts on our credibility and future success.

“For this reason, we identified performance testing as an essential part of future application delivery. With limited resources available in-house and a tight deadline, we invited JDS Australia (JDS), an HP Platinum Partner, to help us to predict future system behaviour and application performance."

Objective

Predict system behaviour and application performance to improve student satisfaction

Approach

RMIT University verified that its new critical student results application (called MyResults) met specified performance requirements

IT improvements

  • Achieved 100 percent uptime on MyResults
  • Delivered student results 60 percent faster than predicted
  • Gained a true picture of end-to-end performance of MyResults
  • Emulated peak loads of 20,000+ student logins per hour – more than six times the average loads
  • Uncovered and rectified functional issues prior to going live
  • Received no helpdesk complaints about poor system performance
  • Identified optimisation opportunities that enabled the student portal to be scaled more appropriately

About RMIT University

As a global university of technology and design, RMIT University is one of Australia’s leading educational institutions, enjoying an international reputation for excellence in work-relevant education.

RMIT employs almost 4,000 people and has 74,000 students studying at its three Melbourne campuses and in its Vietnam campuses. The university also has strong links with partner institutions which deliver RMIT award programmes in Singapore, Hong Kong, mainland China, and Malaysia, as well as research and industry partnerships on every continent.

RMIT has a broad technology footprint, counting its website and student portal as mission-critical applications essential to the day-to-day running of the university.

Industry

Higher Education

Primary applications

  • SAP Financials
  • PeopleSoft Student Administration

Primary software

  • HP LoadRunner software
  • HP Quality Center
  • HP QuickTest Professional software

Selecting an industry standard

A long-term user of HP software, including HP Quality Center, HP QuickTest Professional and HP Business Availability Center, RMIT chose HP LoadRunner software as its performance validation platform.

“HP LoadRunner is the industry standard for performance testing,” explains Tim. “It was a natural choice for RMIT, due to its functionality and integration capabilities as well as our investment in other HP software.”

Predicting system behaviour

In seeking to prevent future student portal performance problems, RMIT had identified a number of potential solutions.

Tim explains, “Due to the complex nature of our student portal, it was unclear which solution design would provide the best performing architecture. HP LoadRunner was used to obtain an accurate picture of end-to-end system performance by emulating the types of loads we receive on result days. We tested all the options. Our objective was to handle the loads and satisfy 100 percent of our students’ results queries in under five seconds.”

Emulating loads

By leveraging the experience of JDS, HP LoadRunner was used to emulate peak loads of 20,000+ student logins per hour, which is more than six times the average loads experienced on non-results days. “The data from our tests revealed that we needed our student portal platform to have the ability to scale considerably, to handle traffic up to six times the usual volume on result days,” explains Tim. “As we drove loads against the various design options, we also captured end-user response times.

“Based on this, we selected the design solution that had the best performance. There were several options to choose from, with some going through a portal that put unnecessary load on the system. The brand-new application chosen is called MyResults, which, during testing, met our performance objective and delivered response times of less than two seconds.”

The JDS account manager for the case says, “As well as testing the performance of the various design solutions, optimisation opportunities were identified that enabled the student portal to be scaled more appropriately. Using HP LoadRunner, JDS consultants identified bottlenecks in the application and platform. By working with RMIT, they were able to direct efforts to remediate the problems prior to going live.”

Successful go-live drives student satisfaction

While MyResults is a simple application that delivers results, its structure is quite complex as it receives inputs from various databases. Ensuring it would perform as intended on results day was key.

Tim says, “By using HP LoadRunner, we significantly decreased the risk of deploying an application that would not meet our performance requirements. On results day, MyResults proved an outstanding success. It handled the loads and spikes extremely well, consistently delivering results in under two seconds. This would not have happened if we had not validated performance beforehand.

“Our students certainly noticed the difference. We received a number of tweets praising the system’s performance. Many of our students couldn’t believe how quickly they obtained their results. Another great indication of our success was the low impact on our helpdesk. They didn’t receive complaints or issues regarding the system and that’s a big plus.”

Boosting reputation

As a result of deploying HP LoadRunner to validate the performance of MyResults, RMIT has realised considerable benefits. The institution has improved its decision-making, as information is more readily available, and has gained operational efficiencies.

Tim says, “We are delighted with the outcomes of HP LoadRunner. First and foremost, we rectified poor performance issues to provide a results system that exceeded our own goals and those of our students. Thanks to the preparative measures we put in place, our system thrived and delivered 100 percent uptime. This enabled us to provide a high-quality student experience, which culminated in increased user satisfaction.

“Operationally, HP LoadRunner helped us to identify the most suitable option to improve our performance. It gave us confidence (prior to release) in MyResults’ ability and allowed us to make informed business decisions, which reduced our deployment risk. In addition, we saved money, because we were more efficient and did not experience any downtime. Plus, we were able to fix issues in development which is always a cheaper option, and we saved considerable time during testing by having the ability to re-use and repeat tests."

Business Benefits

  • The university's reputation was boosted by providing students with reliable and fast access to results—students tweeted their satisfaction.
  • Money was saved through having zero downtime and the ability to fix issues early.
  • RMIT could make informed decisions and reduce the risk of deploying poor quality applications.
  • University and student confidence in IT systems was improved.

Next steps

HP LoadRunner will continue to play a key role as the performance validation backbone of MyResults. Tim believes that, with the HP solutions it utilises and its adoption of a stringent testing methodology, RMIT is certainly up to date with testing technology.

“Students are early adopters of technology. So it’s logical that our next big push is mobility and wireless applications. More and more, our students are accessing RMIT’s website using mobile devices and we need to make sure our applications are optimised accordingly. We want students to be able to log in from home and attend a lecture from their laptop or access their results from their mobile phones. HP LoadRunner will be used to ensure we can meet the wireless requirements of the university’s future.

“Looking at the big picture, HP LoadRunner is an essential component of RMIT’s technology footprint. It allows us to perform due diligence on our applications, make go live decisions with confidence and provide statistical information that is trusted and relied upon by the institution.

“Ultimately, HP LoadRunner gives me peace of mind that RMIT’s systems will work as intended and deliver the quality of service our students and staff expect.”

The Australian Defence Reserves Support Council Awards JDS

Lt. Samuel Abdelsayed receives the ‘Employer Support Award – Medium Business’ on behalf of JDS.

JDS has again been recognised by the Australian Defence Reserves Support Council, an organisation which assists Reservists both within Australia and overseas.

This is the second time JDS has received an ‘Employer Support Award’. Staff member Lieutenant Samuel Abdelsayed nominated JDS for being supportive of his Reserve activities and enabling him to attend all training activities and courses.

JDS Managing Director John Bearsley says that Sam’s service in the defence forces is to be highly commended, highlighting the role of service in developing professionalism and problem-solving skills.

Thanks go to both the Defence Reserves Support Council, and to Sam for his demonstration of leadership and commitment.


JDS is now a CAUDIT Splunk Provider

Splunk Enterprise provides universities with a fast, easy and resilient way to collect, analyse and secure the streams of machine data generated by their IT systems and infrastructure.  JDS, as one of Australia’s leading Splunk experts, has a tradition of excellence in ensuring higher education institutions have solutions that maximise the performance and availability of campus-critical IT systems and infrastructure.

The CAUDIT Splunk offering provides Council of Australian University Directors of Information Technology (CAUDIT) Member Universities with the opportunity to buy on-premises Splunk Enterprise on a discounted, three-year basis. In acknowledgement of JDS’ expertise and dedication to client solutions, Splunk Inc. has elevated JDS to a provider of this sector-specific offering, meaning we are now better placed than ever to help the higher education sector reach their data collection and analysis goals.

What does this mean for organisations?

Not-for-profit higher education institutions that are members of CAUDIT can now use JDS to access discounted prices for on-premises deployments of Splunk Enterprise. JDS is able to leverage its expertise in Splunk and customised solutions built on the platform, combined with its insight into the higher education sector, to ensure that organisations have a Splunk solution that meets their specific needs.

Secure organisational applications and data, gain visibility over service performance, and ensure your organisation has the information to inform better decision-making.  JDS and Splunk are here to help.

You can learn more about JDS’ custom Splunk solutions here:  JDS Splunkbase Apps

JDS wins AppDynamics Partner Award

JDS Wins AppDynamics 2016 Emerging Markets Partner of the Year Award

We are delighted to announce that JDS received the AppDynamics 2016 Emerging Markets Partner of the Year Award at AppSphere 2016 in Las Vegas on 15 November. As a team, we are very proud of this award, as it is the result of hard work from many talented people at JDS.

John Bearsley, Managing Director of JDS, said, “AppDynamics has stormed the market in the past few years as one of the leading next-generation application performance management tools. We have seen first-hand how it can transform the monitoring landscape for our customers, providing them with great visibility into the end-user experience and the performance and availability of their business-critical applications. AppDynamics is a vendor we are proud to support, and we look forward to helping many more customers derive its significant benefits in the years to come.”

To see how JDS can help ensure it works with AppDynamics, check out our AppDynamics page.


VuGen and GitHub Integration

With the release of LoadRunner 12.53, VuGen now has built-in GitHub integration. That means you not only have access to GitHub for saving versions of your script, but also to other tools like bug tracking, access control and wikis.

Here’s an overview of VuGen’s GitHub integration to get you up and running.

Getting Started

First off, you’ll need a personal GitHub login. You can sign up for free at github.com/join. Note that free repositories are publicly available.

You’ll also need LoadRunner 12.53 or higher from HPE.

GitHub Overview

VuGen’s GitHub integration (and GitHub in general) works by managing three versions of your script at a time.

Vugen and GitHub Integration 1

  1. The one you see in VuGen is your working copy. You can develop / replay your script as usual, and save it locally when you want.
  2. When you want a backup of your script – e.g. before doing correlation or re-recording – you can create a checkpoint by committing the script. This saves the script to a local staging area.
  3. After several commits, or perhaps at the end of the day, you might be ready to save everything to GitHub. To do this you Push the script.
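The three steps above map roughly onto standard Git commands. Here is a minimal command-line sketch of the same flow (the script name, file contents and remote URL are placeholders for illustration, not what VuGen actually generates):

```shell
# A rough command-line equivalent of what VuGen does for you.
mkdir BankofSunshine && cd BankofSunshine
git init                                  # what "Create a local Git Repository" does
git config user.email "you@example.com"   # identity for commits (placeholder)
git config user.name "Your Name"

echo 'Action() { return 0; }' > Action.c  # your working copy
git add .                                 # stage the script files
git commit -m "Initial recording"         # VuGen's "Commit" step

# VuGen's "Push" step, once the GitHub remote is configured:
# git remote add origin https://github.com/<your-user>/BankofSunshine.git
# git push -u origin master
```

VuGen drives all of this through its Version Control menu, so you never need to run these commands yourself; the sketch just shows where your script ends up at each step.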

Using GitHub integration straight from VuGen

The following example shows you how to push your first VuGen script to GitHub.

1. Starting out – Creating a Repository

Log into GitHub and create a new repository:

Vugen and GitHub Integration 2

Fill in the details and click “Create Repository”. With a free GitHub account the script will be publicly available; with a paid account you can make it private.

Vugen and GitHub Integration 3

2. VuGen Version Control – Create Local Git Repository
Now create a script in VuGen – in our example it’s ‘BankofSunshine’.

You’ll see a new ‘Version Control’ menu available in VuGen. Choose the option to ‘Create a local Git Repository’.

Vugen and GitHub Integration 4

VuGen manages the files to include so you don’t need to create a .gitignore file. If you prefer to manage it yourself, you can do that too.

3. Commit Changes to the Staging Area

Now you need to commit your script to the local repository. Do this each time you’ve made changes that you might want to push to GitHub, or if you want to be able to roll back any changes.

When you commit, your local repository is ready to be pushed up to GitHub – but is still only available to you.

Vugen and GitHub Integration 5

4. Push Changes to GitHub

Once you are ready to save your script up to GitHub, you will need to Push the changes.

The first time you do this with your script you will need to tell VuGen some details about the GitHub repository.

Enter your details that you created in Step 1:

Vugen and GitHub Integration 7

Now your newly created script is saved to GitHub.

5. There’s no step 5.

That’s all you need to do. When you go to GitHub and click on your repository, you will see all the files that you just pushed:

Vugen and GitHub Integration 8

To keep track of changes made locally to the script, VuGen will show you which files have updated with a red tick:

Vugen and GitHub Integration 9

While you can access your scripts in ALM from the Controller, you can’t yet access your scripts in GitHub from the Controller. You’ll need to pull down a local copy before you run your test.

Now that you are up and running, why not explore more of what GitHub has to offer? Each script saved to GitHub comes with a wiki and an issues register. These can come in handy when you have large or tricky scripts, or for handover to another team.

Share your thoughts on VuGen GitHub integration below.

Filtered Reference Fields in ServiceNow

One common requirement JDS sees when working with ServiceNow customers is the need to dynamically filter the available choices in reference fields. This can be useful in both general form development and record producers.

Consider the following business requirement:
A national charity wants to implement a feedback form for its staff.
As staff work across a variety of areas, it’s important to match staff with the correct department, so the correct department manager can be notified about the feedback.

The charity has provided us with some sample data for testing:

Departments

Filtered Reference Fields in ServiceNow 1

Department Staff

Filtered Reference Fields in ServiceNow 2

As this feedback form is available to anyone working within the charity, JDS has built a record producer to capture feedback using the Service Catalog within ServiceNow.

Filtered Reference Fields in ServiceNow 3

To limit the available choices to only staff within a particular department, we’ll need a script that is triggered whenever anyone starts typing in the field or accesses a value lookup for the variable “Who would you recommend?”

Although there are numerous ways to approach this, always look for the simplest option. Rather than calling a custom script, use the Reference qualifier field in ServiceNow to set up the relationship.

Filtered Reference Fields in ServiceNow 4

Normally, fields on a form can be referenced using current.fieldname, but record producers within a service catalog are slightly different and need a more qualified reference of current.variables.fieldname.
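With that in mind, the qualifier itself can be a one-liner in the variable’s Reference qualifier field. A sketch, assuming the staff table has a reference field named department and the department choice on the record producer is a variable also named department (both names are assumptions for illustration):

```javascript
// Advanced reference qualifier (entered in the variable's Reference qual field).
// Filters the staff lookup to the department chosen earlier on the record producer.
// 'department' (both the staff field and the catalog variable) are assumed names.
javascript:'department=' + current.variables.department
```

As the user changes the selected department, the lookup list on “Who would you recommend?” is re-filtered to staff whose department matches.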

Let’s look at the results…

If we select “Community Outreach” the list of staff available under “Who would you recommend?” is limited to Jane Brown and Sarah Smith.

Filtered Reference Fields in ServiceNow 5

If we select “Youth Action” the list of staff available under “Who would you recommend?” is limited to Matthew Jones and Sarah Smith.

Filtered Reference Fields in ServiceNow 6

ServiceNow is a powerful platform that allows for advanced customisations to ensure a good fit with business requirements.

Completing the APM picture with AppDynamics EUM

How are your customers really experiencing your apps right now? How are your business reputation, loyalty and revenue being affected by the world between your application servers and your customer? AppDynamics End User Monitoring (EUM) completes the APM picture by letting you See, Act on, and Know about your customers’ real experiences in real time.

With EUM, AppDynamics extends the valuable insights it provides by tracing all the way out to the browser or mobile app, giving complete end-to-end visibility for every customer experience.

Watch this session to learn:

  • Why you need to understand the full customer experience on top of what is happening in your data centre.
  • The customer experience metrics that matter most.
  • How AppDynamics EUM gives visibility of the complete picture from your customer's devices to and through your backend APIs and applications.
  • How EUM Synthetics further helps you to Know at all times that your apps are working at their best.

Discover exactly how your end users experience and engage with your applications.

Straight-Through Processing with ServiceNow

How much time is your organisation wasting on manual processes?

What is the true cost of lost productivity?

Straight-through processing (STP) has long been recognised as the Holy Grail of business processes, as it promises to eliminate manual paper forms along with human intervention at the back end. By avoiding the duplication of effort, straight-through processing promises to drastically improve the speed and accuracy of business processes.

Although straight-through processing has been attempted by numerous enterprise application vendors, such as SAP, IBM and Oracle, the problem has been that no single system handles data collection in a consistent, comprehensive manner.

Single System of Record
ServiceNow is unique among enterprise applications in that it uses a single system of record with common/shared supporting processes, such as workflows and notifications. Although there are hundreds of tables in the ServiceNow database, they are all closely related, and are often based on the two core tables — CI (Configuration Item in a CMDB) and Task. In ordinary business terms, CI and Task translate into objects and actions—and ultimately, all business processes revolve around things/people, and actions applied to them.


By using a common architecture for all applications, ServiceNow avoids the problem of integrating with conflicting data structures/applications. In essence, everything speaks the same language.

Straight-through processing
ServiceNow has a single system of record, making it ideally suited to straight-through processing. ServiceNow has the flexibility to adapt to multiple different application requirements whilst leveraging common structures and components. For example, fleet car management, purchase orders, and timesheet entries are entirely different business processes, but in essence they deal with objects (cars/orders/people) and activities (driving/tracking expenditure/working hours), and have similar requirements (approvals, workflow, records management, notifications, common master data, etc).

Essentially, the ServiceNow Automation Platform allows different business processes to be captured without changing the underlying system.

JDS recommends deploying straight-through processing in an iterative manner, using an agile approach.

  1. Validate user input with the target system
  2. Digitise paper forms
  3. Straight-through processing
  4. Automation

By adopting an agile approach with implementation in phases, organisations can see incremental benefits throughout the project.

Phase One: Validate user input with the target system
For straight-through processing to be successful, ServiceNow needs to validate incoming information to ensure it’s compatible with the target system. To do this, ServiceNow forms need to be pre-populated with values taken from the core system.

Straight-Through Processing with ServiceNow 2

For example, organisational data such as cost centres and approvers would be integrated with ServiceNow overnight to provide defaults/selection values within ServiceNow form fields. This approach provides the best performance, as access to up-to-the-minute data is not typically required, and the data integration ultimately ensures the consistency and accuracy of straight-through processing.

Phase Two: Digitise paper forms
Once ServiceNow has default values to validate end-user input at the point of entry, existing paper forms can be digitised.

In this phase, the front end is transformed with pre-filled forms built on a responsive UI, while the back-end process remains unchanged.

Straight-Through Processing with ServiceNow 3

Although it is possible to automate the entire business process at once, in practice most organisations prefer a phased approach so they can manage change and reduce the risk of any inadvertent impact on their core system during this time of transition.

Note that this approach has an immediate positive impact on end users: the old paper forms have been transformed and are now available through a searchable, user-friendly service catalog, while back-end processes stay largely the same, avoiding any disruption to existing services. The organisation has the opportunity to introduce streamlined processing without causing upheaval in the back office.

Phase Three: Straight-through processing
Once the digitisation of paper forms has been established, it is time to automate the process to introduce efficiency to the back office.

Straight-Through Processing with ServiceNow 4

As straight-through processing is often interacting with core systems associated with Finance and HR, JDS recommends establishing an approval workflow process. Instead of manually entering all the request information, back office staff now provide oversight and approval of straight-through processing.

Phase 4: Automation
There may be some business processes which involve multiple parties and integrate with multiple systems end-to-end that could be automated in their entirety, such as ordering of software or employee on-boarding. In this case, ServiceNow can streamline straight-through processing with the use of ServiceNow Orchestration.

Orchestration is implemented by workflows in ServiceNow, allowing JDS to configure complex processes that span multiple systems. These may include activities such as approval and follow-on tasks which utilise data outputs from a system called inside the workflow to determine the next action.

Straight-Through Processing with ServiceNow 5

How much are manual processes costing your organisation at the moment?

Think of how much productivity is lost because users are forced to fill out paper forms, or back office staff are required to enter information into multiple systems. By implementing straight-through processing with ServiceNow, JDS can help you streamline your business processes, saving time and money, while radically improving your customer satisfaction.

How operational health builds business revenue

Splunk IT Service Intelligence (ITSI) is a next-generation monitoring and analytics solution that provides new levels of visibility into the health and key performance indicators of IT services. Use powerful visualizations and advanced analytics to highlight anomalies, accelerate investigations and pinpoint the root causes that impact service levels critical to the business.

Join this session to learn how to:

  • Translate operational data into business impact.
  • Provide cross-silo visibility into the health of services by integrating data across the enterprise.
  • Visually map services and KPIs to discover new insights.
  • Transform machine data into actionable intelligence.

What is Service Intelligence?
IT operations is responsible for running and managing the services that businesses depend on. Service intelligence improves service quality, helps IT make well-informed decisions and supports business priorities.

Why Splunk IT Service Intelligence?
Splunk ITSI transforms machine data into actionable intelligence. It provides cross-silo visibility into the health of services by integrating data across the enterprise, visually mapping services and KPIs to discover new insights, and translating operational data into business impact. With timely, correlated information on services that impact the business, Splunk ITSI unifies data silos, reduces time to resolution, improves service operations and enables service intelligence.


Splunk: Using Regex to Simplify Your Data

Splunk is an extremely powerful tool for extracting information from machine data, but machine data is often structured in a way that makes sense to a particular application or process while appearing as a garbled mess to the rest of us. Splunk allows you to cater for this and retrieve meaningful information using regular expressions (regex).

You can write your own regex to retrieve information from machine data, but it’s important to understand that Splunk does this behind the scenes anyway, so rather than writing your own regex, let Splunk do the heavy lifting for you.

The Splunk field extractor is a WYSIWYG regex editor.

Splunk 1

When working with something like NetScaler log files, it’s not uncommon to see the same information represented in different ways within the same log file. For example, userIDs and source IP addresses can appear in two different formats on different lines.

  • Source 192.168.001.001:10426 – Destination 10.75.001.001:2598 – username:domainname x987654:int – applicationName Standard Desktop – IE10
  • Context [email protected] – SessionId: 123456- desktop.biz User x987654: Group ABCDEFG

As you can see, the log file contains the following userID and IP address, represented in different formats:

  • userID = x987654
  • IP address = 192.168.001.001

The question then arises, how can we combine these into a single field within Splunk?
The answer is: regex.

You could write these regex expressions yourself, but be warned: although Splunk adheres to the PCRE (Perl-compatible) implementation of regex, in practice there are subtle differences (such as limited support for lookahead and lookbehind).

So how can you combine two different regex strings to build a single field in Splunk? The easiest way is to let Splunk build the regex for you in the field extractor.

Splunk 2

If you work through the wizard using the Regular Expression option and select the user value in your log file (from in front of “username:domainname”), you’ll reach the save screen.
(Be sure to use the validate step in the wizard, as this will allow you to eliminate any false positives and automatically refine your regex to ensure it works accurately.)
Stop when you get to the save screen.
Don’t click Finish.

Splunk 3

Copy the regular expression from this screen and save it in Notepad.
Then use the small back arrow (highlighted in red) to move back through the wizard to the Select Sample screen again.
Now go through the same process, but selecting the userID from in front of “User.”
When you get back to the save screen, you’ll notice Splunk has built another regex for this use case.

Splunk 4

Take a copy of this regex and put it in Notepad.
Those with an eagle eye will notice Splunk has inadvertently included a trailing space with this capture (underlined above). We’ll get rid of this when we merge these into a single regex using the following logic.
[First regex]|[Second regex], where each part contains a named capture group of the form (?P<FieldName>capture regex)

Splunk 5

Essentially, all we’ve done is join the two Splunk-created regex strings together using the pipe ‘|’ character.
Copy the new joined regex expression, then again click the back arrow to return to the first screen.

Splunk 6

But this time, click on I prefer to write the regular expression myself.

Splunk 7

Then paste your regex and be sure to click on the Preview button to confirm the results are what you’re after.

Splunk 8

And you’re done! Click Save and then Finish, and you can now search on a field that combines multiple regexes to ensure your field is correctly populated.
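As a sanity check outside Splunk, the same merge logic can be sketched in Python. This is an illustration rather than Splunk’s actual generated regex: Python’s built-in re module rejects duplicate named groups within one pattern, so the sketch tries each simplified Splunk-style pattern in turn, which has the same effect as the pipe-joined PCRE expression.

```python
import re

# Simplified versions of the two patterns Splunk's field extractor might
# generate, each capturing the userID into the same logical field.
patterns = [
    re.compile(r"username:domainname\s+(?P<userID>\w+):"),
    re.compile(r"User\s+(?P<userID>\w+):"),
]

def extract_userid(line):
    """Return the userID from whichever pattern matches first, mimicking
    the behaviour of the pipe-joined regex."""
    for p in patterns:
        m = p.search(line)
        if m:
            return m.group("userID")
    return None

# Sample lines based on the log formats shown earlier (email simplified).
line1 = ("Source 192.168.001.001:10426 - Destination 10.75.001.001:2598 - "
         "username:domainname x987654:int - applicationName Standard Desktop - IE10")
line2 = ("Context user@domain.biz - SessionId: 123456- desktop.biz "
         "User x987654: Group ABCDEFG")

print(extract_userid(line1))  # x987654
print(extract_userid(line2))  # x987654
```

Either log format yields the same userID value, which is exactly what the combined field extraction gives you inside Splunk.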

Case Study: Australian Red Cross Blood Service

The prompt and decision

The Australian Red Cross Blood Service performs a critical role in Australia’s health system. It provides quality blood products, essential services and leading-edge research to improve the lives of patients. A non-profit organisation with more than 4,000 employees, ARCBS must be ready to respond around the clock to deliver blood and search its extensive records for specialised requirements for particular patients. More than 520,000 Australians donate blood across 120 collection sites every year.

Over a number of years, and utilising a number of HP Software solutions, JDS has helped the ARCBS adopt a lifecycle approach to quality, performance and availability of key business applications. This has enhanced the ARCBS’s IT capability and there is now a focus on extending the discipline of validation to other systems in the future.

Some of the IT improvements achieved include:

  • Obtaining a single point of truth for application validation records
  • Unifying functional, performance and quality management
  • Gaining operational efficiencies by migrating to a paperless testing environment
  • Providing visibility into application code issues and the supporting information required for application vendors to ensure timely fixes

This has provided a number of tangible business benefits such as:

  • Improved ability to meet regulatory audits through access to validation data in hours, rather than days or weeks
  • Achieved the 99.8 per cent availability service level for the new National Blood Management System (NBMS)

The achievements in this project were realised through the timely and effective deployment of a number of HP Software solutions, including:

  • HP Quality Center as the overarching test management solution
  • HP QuickTest Professional for test automation driven out of Quality Center
  • HP LoadRunner for performance testing across a variety of protocols and application architectures
  • HP Diagnostics to triage performance bottlenecks in application code and configuration
  • HP SiteScope to collect real time metrics from various infrastructure components
  • HP Business Service Management to monitor the performance and availability of the critical applications from both the end user and infrastructure perspectives concurrently.