Tech Tips

Service Portal Simplicity

The introduction of the Service Portal, using AngularJS and Bootstrap, has given ServiceNow considerable flexibility, allowing customers to develop portals to suit their own specific needs.

While attribution depends on whether you subscribe to Voltaire, Winston Churchill, or Uncle Ben from Spider-Man, “With great power comes great responsibility.” This is especially true when it comes to the Service Portal: customers should tread carefully to ensure they use the portal’s flexibility wisely.

A good example of this came up during a recent customer engagement, where the requirement was to allow some self-service portal users to see all the incidents within their company. This particular customer provides services across a range of other smaller companies, and wanted key personnel in those companies to see all their incidents (without being able to see those belonging to other companies, and without being able to update or change others’ incidents).

The temptation was to build a custom widget from scratch using AngularJS, but simplicity won the day. Instead of coding, testing, and debugging a custom widget, JDS reused an existing widget, wrapping it with a simple security check, and reduced the code required to implement this requirement to one line of HTML and two lines of server-side code.

The JDS approach was to instantiate a simple list widget on-the-fly, but only if the customer’s security requirements were met.

Normally, portal pages are designed using Service Portal’s drag-and-drop page designer, visible when you navigate to portal configuration. In this case, we’re embedding a widget within a widget.

We’ll create a new widget called secure-list that dynamically adds the simple-list-widget only if needed when a user visits the page.

Let’s look at the code—all three lines of it:

HTML

<div><sp-widget widget="data.listWidget"></sp-widget></div>

By dynamically creating a widget only if it meets certain requirements, we can control when this particular list widget is built.

Server-Side Code

(function() {
    if (gs.getUser().isMemberOf('View all incidents')) {
        data.listWidget = $sp.getWidget('widget-simple-list', {table:'incident', display_field:'number', secondary_fields:'short_description,category,priority', glyph:'user', filter:'active=true^opened_by.company=javascript:gs.getUser().getCompanyID()', sp_page:'ticket', color:'primary', size:'medium'});
    }
})();

This code will only execute if the current user is a member of the group View all incidents, but the magic occurs in $sp.getWidget(), as this is where ServiceNow builds a widget on the fly. The real challenge is: where can we find all these options?

How do you know what options are needed for any particular widget, given those available change from one widget to another? The answer is surprisingly simple, and can be applied to any widget.

When viewing the Service Portal, administrators can use Ctrl+right-click to examine any widget on the page. In this case, we’ll examine the way the simple list widget is used by selecting Instance Options.

Immediately, we can see there are a lot of useful values that could be set, but how do we know which are critical, and what additional options there are?

How do we know what we should reference in our code?

Is “Maximum entries” called max, or max_entry, or max_entries? Also, are there any other configuration options that might be useful to us?

By looking at the design of the widget itself in the ServiceNow platform (by navigating to Service Portal > Widgets), we can scroll down and see both the mandatory and optional configuration values that can be set per widget, along with the correct names to use within our script.

Looking at the actual widget we want to build on-the-fly, we can see both required fields and a bunch of optional schema fields. All we need to do to make our widget work is supply these as they relate to our requirements. Simple.

{table:'incident',display_field:'number',secondary_fields:'short_description,category,priority',glyph:'user',filter:'active=true^opened_by.company=javascript:gs.getUser().getCompanyID()',sp_page:'ticket',color:'primary',size:'medium'}

Also, notice the filter in our widget settings, as that limits results shown according to the user’s company…

opened_by.company=javascript:gs.getUser().getCompanyID()

This could be replaced with a dynamic filter to allow even more flexibility, should requirements change over time, allowing the filter to be modified without changing any code.
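
Another way to keep the query out of the code (a sketch only, and distinct from ServiceNow’s dynamic filter feature) is to read the encoded query from a system property; the property name here is hypothetical and would need to be created in your instance:

(function() {
    if (gs.getUser().isMemberOf('View all incidents')) {
        // 'x_jds.secure_list.incident_filter' is a hypothetical property name;
        // the second argument is the fallback, matching the hard-coded filter above.
        var companyFilter = gs.getProperty('x_jds.secure_list.incident_filter',
            'active=true^opened_by.company=javascript:gs.getUser().getCompanyID()');
        data.listWidget = $sp.getWidget('widget-simple-list', {table:'incident', display_field:'number', secondary_fields:'short_description,category,priority', glyph:'user', filter:companyFilter, sp_page:'ticket', color:'primary', size:'medium'});
    }
})();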

The moral of the story is to keep things simple:

  • Don’t reinvent the wheel
  • Document as you go

Peter Cawdron

Consultant

Length of Time at JDS

5 years

Skills

ServiceNow, LoadRunner, HP BSM, Splunk.

Workplace Passion

I enjoy working with the new AngularJS portal in ServiceNow.

Posted by Peter Cawdron in ServiceNow, Tech Tips
What’s new in ServiceNow for 2017?

ServiceNow has released its latest version, Istanbul, which introduces a game-changing innovation for the platform—the Automated Test Framework, which provides regression testing.

The Automated Test Framework will drastically simplify future upgrades, as customers will be able to test releases quickly and efficiently prior to upgrading, with a high level of confidence in the results. This will allow customers to stay in sync with the latest releases from ServiceNow without fear of broken functionality adversely impacting their business processes.

What is the Automated Test Framework?

The Automated Test Framework is a module within ServiceNow that allows customers to define their own tests against any other ServiceNow module, including bespoke modules developed in-house or by third-parties such as JDS.

Tests can be reused in test suites to provide comprehensive coverage of a variety of business processes. In this example, a test has been developed to verify that critical incidents are properly identified in the upgraded version. Notice how each step contains a detailed description of what the test will do in plain English.

Be sure to always include a validation step!

Opening a form, setting values, and successfully submitting that form is not enough to ensure a test is successful. The key to successful regression testing is to have a point of validation—confirmation that the test has produced the expected outcome. In this case, that means validating that the priority of the incident has been set to Critical.

When defining test steps, ServiceNow will walk you through a wizard that will show you all the possible automated steps that can be undertaken. Note that ServiceNow will automatically insert this next step after the last step, unless you specify otherwise.

The automated framework can open new or existing records and undertake any action an end-user would normally complete.

Also, note how it is possible to run automated regression tests directly against the server and not just through a regular ServiceNow form.

Be careful when defining server tests as, ideally, automated regression tests should be run through a form, mimicking a transaction undertaken by an actual user. This ensures UI policies and actions are all taken into account in the test results. Tests conducted directly against the server will include business rules, but not UI scripts/actions, which may lead to inconsistent results.

Server tests are ideal for testing integration with third-party systems via web services.
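
To give a feel for what a server-side step can look like, here is a rough sketch of a validation step using the “Run Server Side Script” step type. The step sys_id placeholder, field names, and expected values are assumptions rather than anything from a real instance, so verify the details against your own test:

(function(outputs, steps, params, stepResult, assertEqual) {
    // Sketch only: '<insert_step_sys_id>' is a placeholder for the sys_id of the
    // earlier "Insert a record" step in the same test.
    var incidentId = steps('<insert_step_sys_id>').record_id;

    var gr = new GlideRecord('incident');
    gr.get(incidentId);

    // Fails this step (and therefore the test) if the priority is not Critical (1).
    assertEqual({
        name: 'priority',
        shouldbe: '1',
        value: gr.getValue('priority')
    });

    stepResult.setOutputMessage('Priority verified for ' + gr.getValue('number'));
})(outputs, steps, params, stepResult, assertEqual);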

Test results include screenshots so you can see precisely what occurred during your test run. Also, notice that the test has been run against a particular browser engine, and the overall time for the entire process has been captured.

 

 

The individual runtime of each step can be estimated from the start times, although it is possible to have this calculated automatically if need be.

Please note, whether you are creating a new record or modifying an existing record, the results of your automated regression test will NOT become part of that particular table’s records. For example, if you insert a new incident, you will see screenshots of the new incident complete with a new incident number, but if you go to the incident table, it won’t be present. If you update an existing incident, the update will be visible in the test results, but the actual record in the table will not change.

When it comes to best practices, JDS recommends structuring your tests carefully to ensure your coverage is comprehensive and represents an end-to-end business transaction, taking into account the impact on workflows.

If you have any questions about Automated Test Frameworks in ServiceNow, please contact JDS.

 

Posted by Peter Cawdron in ServiceNow, Tech Tips
Learning Analytics

CAUDIT Analysis Part 3

View Part 1 Here
View Part 2 Here

Since 2006, the Council of Australian University Directors of Information Technology (CAUDIT) has undertaken an annual survey of higher education institutions in Australia and New Zealand to determine the Top Ten issues affecting their usage of Information Technology (IT).

To help our customers get the most out of this report, JDS has analysed the results and is offering a series of insights into how best to address these pressing issues and associated challenges.

Issue 8: Learning Analytics

student-success

“Supporting improved student progress through establishing & utilising learning analytics.”

Every interaction from students, staff and researchers, through any of an institution’s systems, is a data point of interest. These points of interest are a wealth of information that can be used to improve services and experiences institution wide.

Numerous studies show the importance of basing decisions on quantifiable measurements, which allow you to target, change and measure outcomes. For example, measuring student performance beyond the standard results-orientated structure can allow an institution to intervene earlier in a student’s progress. One study showed that a university was able to intervene as early as two weeks into a semester (Sclater, 2016). They identified students who were at high risk of dropping out and helped these students adjust their program.

These areas may be outside of the typical ICT purview, but are becoming increasingly ICT-driven. ICT tends to own the analytics platform and should be responsible for ensuring it meets an institute’s requirements.

CAUDIT Ranking Trend: 2014 – | 2015 #17 | 2016 #8
 

Challenges and Considerations

challenges
Challenge: Manage increased data inflow from a vast array of sources across campuses.
Students and teachers can bring in a multitude of devices for use on campus, including laptops, tablets, smartphones and more. They can also access institutional facilities, applications, systems, and data whilst at university, at work, or at home. These are just a few examples of different institutional data points. Combining this multitude of devices with the variety of locations they are used from shows how quickly data points are multiplying.

Each data point, when analysed in the right manner, holds valuable information – information that determines how an institution should respond and/or intervene to ensure it is providing the right solution to the matter at hand. In order to gain this value, an institution must first understand how to manage many different data inputs.

JDS has the experience in building and implementing platforms that can provide a universal solution – a solution that can collect, index and analyse any machine data at massive scale, without limitation. Institutions can then search through this data in real time, correlate and analyse with the benefit of machine learning to better understand institutional trends and ultimately make better informed decisions.

Challenge: Cross-collaboration between institutional stakeholders (e.g. students, teachers, administrators)
An institution such as a university has a wide variety of internal and external stakeholders, all with the need to share information between one another. This information also often originates from a large variety of sources.

The challenge here is making this data from students, teachers, researchers and administration staff accessible to each other in a controlled, normalised manner. These stakeholders will also have different questions to ask of the same data, each with their own requirements. ICT does not necessarily have, nor require, a deep understanding of the content of this data, but it can provide a single platform for collaboration. A common platform can address these issues, allowing cross-collaboration between stakeholders while improving an institution’s products and services.

With the help of JDS, ICT can provide such a platform, making it easy to generate dashboards, reports and alerts. JDS can enable the institution to not only ask various questions of the same data but also create visualisations and periodic reports to help interpret and use the information gleaned.

JDS has implemented such solutions across various Australian organisations, unleashing the powerful data collection functionalities of a variety of industry-leading monitoring tools. We have our own proprietary solution, JDash, purpose-built to help organisations collaborate to understand service impact levels, increase visibility into IT environments, allow efficient decision making and provide cross-silo data access and visualisations. JDash is also fully customisable to various institutional needs and requirements.

Challenge: Using analytics for early intervention with student difficulties.
According to a recent article in The Australian, the attrition rate at some universities is above 20%. Numerous factors contribute to this, and attrition can ultimately be reduced through early intervention. Institutions can make better-informed decisions if they have the right data at the right time.

The CAUDIT report highlighted the need to identify student performance early. This allows an institution to intervene by either advising a student of a different direction or providing the right resources to handle their difficulties.

In order to provide this level of insight into student drop-out rates and early intervention, institutions need to build and manage a learning analytics platform. A learning analytics platform collects, measures, analyses, and reports data on the progress of learners within context.

JDS can assist institutions with building a centralised, scalable, and secure learning analytics platform. This platform can ingest the growing volume of big datasets and digital signals generated by students interacting with numerous services. From the centralised platform, institutions can interpret the information to make decisions on intervention and ultimately reduce student attrition rates.

About JDS

about-jds
With extensive experience across the higher education sector, JDS assists institutions of all sizes to streamline operations, boost performance and gain competitive advantage in an environment of tight funding and increasing student expectations. With more than 10 years’ experience in optimising IT service performance and availability, JDS is the partner of choice for trusted IT solutions and services across many leading higher education institutions.

 


 

Posted by Joseph Banks in Higher Education, News, Tech Tips
Educational Technology

CAUDIT Analysis Part 2

View Part 1 Here
View Part 3 Here

Since 2006, the Council of Australian University Directors of Information Technology (CAUDIT) has undertaken an annual survey of higher education institutions in Australia and New Zealand to determine the Top Ten issues affecting their usage of Information Technology (IT).

To help our customers get the most out of this report, JDS has analysed the results and is offering a series of insights into how best to address these pressing issues and associated challenges.

Our second insight featured here discusses the importance of educational technology in maximising the student experience and helping universities compete in the digital age.

 

Issue 4: Educational Technology

student-success

“Supporting the use of innovative technology in teaching and learning.”

ICT in higher education is subject to the same disruption that’s affecting other industries. The difference is, universities are also at the frontline of generational disruptions including ‘Digital Natives’.

Such generational and technological shifts bring a primary challenge to institutions like yours: how do you maintain an agile technology platform and reliable services? These services must transcend technology shifts and allow students to mix and match their own personal requirements with what the university has to offer.

Ultimately, universities must offer students emerging technologies and services that can be used by both digital and non-digital natives. What doesn’t change, however, is that both groups are seeking tools that are easy to use, fast to access and always available.

CAUDIT Ranking Trend: 2016 #4 | 2015 – | 2014 –
 

Challenges and Considerations

challenges
Challenge: Embrace emerging technologies throughout learning spaces as well as virtual and remote laboratories.

The dramatic rise of the Bring Your Own Device (BYOD) phenomenon in both the public and private sector presents both challenges and opportunities. Today, organisations and education institutions alike are struggling with deploying technologies that are in sync with what people use in their everyday lives.

The key to resolving this challenge is to have a core platform that offers, tracks and manages these new services. To achieve this, new technologies need to be robustly aligned with current processes and existing tools and apps. Unfortunately, we’re seeing agencies re-invent the wheel just to get something out there to meet student expectations.

Having an agile platform in place will allow you to standardise new product offerings while at the same time accelerating the adoption of emerging technologies. This platform should facilitate the on-boarding of technologies with minimal effort – backed by orchestration and automation capabilities. In turn, this will give your institution a clear understanding of how technologies are being used and the agility to shift rapidly into new areas as required.

Challenge: Provide reliable and robust systems for learning that are interactive, collaborative and personalised.

So you’ve got the latest and greatest technologies…but they’re not always working. What now?

The objective here is that, on top of facilitating the use of new technologies through a platform that allows you to distribute, track and manage them, you need to ensure they are operating efficiently.

Bear in mind that ‘fast to deliver’ does not always translate to ‘fast to use’. Systems that are deployed need to be reliable so they don’t create more unplanned work.

Unfortunately, unplanned work will impact your institution. For example, full-time employees will be removed from project work to concentrate on fixing issues in production. This has the effect of slowing down the process of adopting and offering emerging technologies – making ICT absent from the innovation discussion.

JDS provides numerous services to ensure the performance of your services – ranging from testing and diagnosing to monitoring. In cases where there are performance issues, we can help you diagnose and narrow down the root cause. Once you proceed to production, JDS can give you visibility into the performance and availability of your systems so you can keep an eye on how your services are being used and address any issues early – reducing unplanned work.

About JDS

about-jds
With extensive experience across the higher education sector, JDS assists institutions of all sizes to streamline operations, boost performance and gain competitive advantage in an environment of tight funding and increasing student expectations. With more than 10 years’ experience in optimising IT service performance and availability, JDS is the partner of choice for trusted IT solutions and services across many leading higher education institutions.

 


 

Posted by Joseph Banks in Higher Education, News, Tech Tips
Static Variables and Pointers in ServiceNow

When scripting in ServiceNow, be careful how you create your variables. If you refer to an object such as a GlideRecord field/column, you are creating a pointer to the current value as the record moves, rather than a static value captured at a point in time. Perhaps an example will help explain…

In our fictitious example, Ranner is the head of marketing and wants to know the first and last person with a first name starting with “Ra…” Obscure, I know, but it will demonstrate the point.

So we develop a script we can test using the Background Script module in ServiceNow to check a subset of records to make sure our logic is correct. We grab the first value and the last after cycling through the table.

var userGlide = new GlideRecord('sys_user');
userGlide.addEncodedQuery('first_nameSTARTSWITHRA');
userGlide.orderBy('first_name');
userGlide.setLimit('10');
userGlide.query();

var i = 0;
var firstUser, lastUser;

while (userGlide.next()) {

    if (i == 0) {
        firstUser = userGlide.first_name;
        gs.info('The firstUser is ' + firstUser);
    }

    gs.info('Iteration ' + i + ' current user = ' + userGlide.first_name + '\t firstUser = ' + firstUser);

    if (!userGlide.hasNext()) {
        lastUser = userGlide.first_name;
        gs.info('The last user is ' + lastUser);
    }
    i++;
}
gs.info('-----------------------------------------------');

gs.info('First user is ' + firstUser);
gs.info('Last user is ' + lastUser);

If you look at the output when this runs as a Background Script, you can see the results are inconsistent. The firstUser variable changes every time the GlideRecord moves to a new record, even though it was set only once at the beginning.

*** Script: The firstUser is Ra
*** Script: Iteration 0 current user = Ra firstUser = Ra
*** Script: Iteration 1 current user = Ra-cheal firstUser = Ra-cheal
*** Script: Iteration 2 current user = Raamana firstUser = Raamana
*** Script: Iteration 3 current user = Raamon firstUser = Raamon
*** Script: Iteration 4 current user = Rabi firstUser = Rabi
*** Script: Iteration 5 current user = Rabia firstUser = Rabia
*** Script: Iteration 6 current user = Rabin firstUser = Rabin
*** Script: Iteration 7 current user = Racahael firstUser = Racahael
*** Script: Iteration 8 current user = Rachael firstUser = Rachael
*** Script: Iteration 9 current user = Rachael firstUser = Rachael
*** Script: The last user is Rachael
*** Script: -----------------------------------------------
*** Script: First user is Rachael
*** Script: Last user is Rachael

The reason for this is that although we defined our variable at the start, all we did was assign a pointer to the GlideRecord element, not a static value. Every time the GlideRecord moves on to a new record, so does the value within our variable.

The solution is to make sure we’re only grabbing a value, and not an object, from the GlideRecord by using getValue() in our script:

var userGlide = new GlideRecord('sys_user');
userGlide.addEncodedQuery('first_nameSTARTSWITHRA');
userGlide.orderBy('first_name');
userGlide.setLimit('10');
userGlide.query();

var i = 0;
var firstUser, lastUser;

while (userGlide.next()) {

    if (i == 0) {
        firstUser = userGlide.getValue('first_name');
        gs.info('The firstUser is ' + firstUser);
    }

    gs.info('Iteration ' + i + ' current user = ' + userGlide.first_name + '\t firstUser = ' + firstUser);

    if (!userGlide.hasNext()) {
        lastUser = userGlide.getValue('first_name');
        gs.info('The last user is ' + lastUser);
    }
    i++;
}
gs.info('-----------------------------------------------');

gs.info('First user is ' + firstUser);
gs.info('Last user is ' + lastUser);

Now our results are correct:

*** Script: The firstUser is Ra
*** Script: Iteration 0 current user = Ra firstUser = Ra
*** Script: Iteration 1 current user = Ra-cheal firstUser = Ra
*** Script: Iteration 2 current user = Raamana firstUser = Ra
*** Script: Iteration 3 current user = Raamon firstUser = Ra
*** Script: Iteration 4 current user = Rabi firstUser = Ra
*** Script: Iteration 5 current user = Rabia firstUser = Ra
*** Script: Iteration 6 current user = Rabin firstUser = Ra
*** Script: Iteration 7 current user = Racahael firstUser = Ra
*** Script: Iteration 8 current user = Rachael firstUser = Ra
*** Script: Iteration 9 current user = Rachael firstUser = Ra
*** Script: The last user is Rachael
*** Script: -----------------------------------------------
*** Script: First user is Ra
*** Script: Last user is Rachael

When working with GlideRecords, be sure to use getValue() to avoid headaches with pointers.
Posted by Peter Cawdron in Tech Tips
Student Success Technologies

CAUDIT Analysis Part 1

View Part 2 Here
View Part 3 Here

Since 2006, the Council of Australian University Directors of Information Technology (CAUDIT) has undertaken an annual survey of higher education institutions in Australia and New Zealand to determine the Top Ten issues affecting their usage of Information Technology (IT).

To help our customers get the most out of this report, JDS has analysed the results and is offering a series of insights into how best to address these pressing issues and associated challenges.

Here we take a look at the first technology-related issue.

 

Issue 1: Student Success Technologies

student-success

“Improving student outcomes through an institutional approach that strategically leverages technology.”

For the third year in a row, “student success technologies” has been identified as the highest priority issue within the higher education sector in Australia and New Zealand. In addition to having the appropriate technologies in place, focus is turning to the analysis of the vast volume of data that most institutions collect with these technologies. Doing this effectively and in real time can assist in the decision-making process, ensuring appropriate actions are taken and limited resources are used to their maximum benefit.
In many cases, institutions have already invested in technologies to support student success, but often these are not being leveraged to their fullest potential, or have not been integrated with other available technologies to provide an improved solution to support student, teacher and researcher success. If institutions persist with these technologies as they are, the attraction and benefit of new and emerging technologies may not be fully realised, as they will be built upon an underperforming foundation.

 

It is one thing for institutions to provide technologies that support student success, but it is all for nothing if those technologies are not accessible, available or performing at the expected levels.

 

CAUDIT Ranking Trend: 2014 #1 | 2015 #1 | 2016 #1
 

Challenges and Considerations

challenges
Challenge: Analysing multiple disparate solutions, each providing insight into different aspects of student, faculty, and administration performance.

Many institutions today have a range of applications supporting different aspects of the student experience (for example: student learning portals, ERP and student administration systems, email and document management, BYOD enablers, etc). While consolidating this information and interpreting it through intelligent dashboards can help display the data, it won’t be enough to find correlations and provide beneficial analysis.

JDS can assist with designing and implementing solutions to combine real-time data from multiple disparate sources to provide a unified view via dashboards, analyse trends and interpret complex data relationships to improve the overall student experience.

Challenge: Ensuring that investments made in technology enablers and new emerging technologies are actually being used and are being used effectively.

It is one thing for an institution to commission supporting technologies and deploy them for use; however, it is all for nothing if those systems are frequently unavailable when required or performing so poorly that they effectively become unusable. IT departments and vendors alike are often guilty of assuming that the way a service has been designed is the way the service will actually be used. Understanding how services are actually being consumed, both in real time and through historic analysis, can lead to proactive modification of essential services to improve effectiveness.

JDS has a rich history of providing and supporting End-User-Monitoring (EUM) and Application Performance Management (APM) solutions that improve service performance and availability and provide valuable insight into the effectiveness of delivered services.

Challenge: Using data as information to assist with early intervention to ensure student academic completion.

Institutions have access to large volumes of data on their student population and their progress, both current and historic, that are often maintained in multiple, disparate applications and technologies. Combining this internally sourced data with additional external sources (e.g. QILT) improves the identification of students requiring additional assistance or those likely to require future assistance.

Being able to combine, ‘slice’ and ‘dice’ this information – across dimensions of discipline, faculty, geographic location, learning methods or any other demographic – can provide valuable insight into where to invest limited support resources to maximise the likelihood of student retention and success.
JDS can assist by leveraging industry best-of-breed technologies to interrogate, consolidate and analyse these disparate data silos to deliver information that can be acted on to improve overall graduation rates.

Challenge: Managing multiple arrangements and contracts with service providers and vendors

With new technologies and cloud point solutions, institutions face the challenge of managing multiple relationships between internal service providers and external partners and vendors. Services being delivered require monitoring and the technologies that underpin them will necessitate upgrades, patches and monitoring of SLAs. New processes need to be created and maintained. Help desks require additional training and knowledge base articles to be created.

 

JDS can assist institutions through the identification of routine processes that can be automated and ensure that new applications and services are integrated into their current service management solution.

 

About JDS

about-jds
With extensive experience across the higher education sector, JDS assists institutions of all sizes to streamline operations, boost performance and gain competitive advantage in an environment of tight funding and increasing student expectations. With more than 10 years’ experience in optimising IT service performance and availability, JDS is the partner of choice for trusted IT solutions and services across many leading higher education institutions.

 


 

Posted by Joseph Banks in Higher Education, News, Tech Tips
Citrix and web client engagement on an Enterprise system

JDS were engaged by a leading superannuation firm to conduct performance testing of their enterprise applications migrating to a new platform. This was part of a merger with a larger superannuation firm. The larger firm was unaware of its application performance needs, and until recently, performance had not always been a high priority during the test lifecycle.

JDS were brought in to provide:

  • Guidance on performance testing best practice
  • Assistance with performance testing applications before the migration of each individual super fund across to the new platform
  • Understanding the impact on performance for each fund prior to migration

During the engagement, the consultants faced multiple challenges. Listed below are a few key challenges encountered, along with general tips for performance testing Citrix.

Synchronisation

You should have synchronisation points prior to ANY user interaction, i.e. a mouse click or keystroke. This will ensure the correct timing of your scripts during replay. You don’t want to be clicking on windows or buttons that don’t exist or haven’t completely loaded yet, e.g.

ctrx_sync_on_window("Warning Message", ACTIVATE, 359, 346, 312, 123, "", CTRX_LAST);
ctrx_key("ENTER_KEY", 0, "", CTRX_LAST);

Screen resolution and depth

Set your desktop colour settings to 16-bit. A higher colour setting adds unneeded complexity to bitmap syncs, making them less robust. Ensure that the display settings are identical for the controller and all load generators. Use the “Windows Classic” theme and disable all the “Effects” (Fading, ClearType, etc.).

Recording

Your transactions should follow the pattern of:

  • Start Transaction
  • Do something
  • Synchronise
  • Check that it worked
  • End Transaction

If you synchronise outside of your transaction timers, then the response times you measure will not include the time it took for the application to complete the action.
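
Putting that pattern together, a minimal sketch might look like the following; the transaction name, window titles, coordinates and bitmap hash are all illustrative rather than taken from the engagement:

lr_start_transaction("Submit_Payment");

// Do something: click the Submit button in the payment window (coordinates are examples only)
ctrx_mouse_click(420, 385, LEFT_BUTTON, 0, "Payment Entry", CTRX_LAST);

// Synchronise: wait for the confirmation window to appear
ctrx_sync_on_window("Payment Confirmation", ACTIVATE, 359, 346, 312, 123, "", CTRX_LAST);

// Check that it worked: verify the expected bitmap region (hash captured at record time)
ctrx_sync_on_bitmap(380, 360, 80, 20, "example_hash", CTRX_LAST);

lr_end_transaction("Submit_Payment", LR_AUTO);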

Runtime settings

JDS recommends the following runtime settings for Citrix:

Logging

  • Enable Logging = Checked
  • Only send messages when an error occurs = Selected
  • Extended logging -> Parameter substitution = Checked
  • Extended logging -> Data returned by server = Checked

Citrix 1

 

Think Time

Think time should not be needed if synchronisation has been added correctly.

  • Ignore think time = Selected

Citrix 2

Miscellaneous

  • Error Handling -> Fail open transactions on lr_error_message = Checked
  • Error Handling -> Generate snapshot on error = Checked
  • Multithreading -> Run Vuser as a process = Selected

Citrix 3

ICA files

At times you may need to build your own ICA files. Create the connection in the Citrix Program Neighborhood, then take the wfclient.ini file from C:\Documents and Settings\username\Application Data\ICAClient and rename it to an .ica file. Then add it to the script with Files -> Add Files to Script. Use the ICA file option for BPMs/load generators over the “native” VuGen Citrix login details for playback whenever possible, as this gives you control over both the resolution and colour depth.
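
For reference, here is a stripped-down sketch of what such an .ica file can contain. The server address and application name are placeholders, and the exact keys and values vary between Citrix client versions, so treat this as illustrative only:

; Illustrative fragment only - key names can vary with the Citrix client version.
[WFClient]
Version=2

[ApplicationServers]
FinanceApp=

[FinanceApp]
Address=citrix.example.com
DesiredColor=4       ; 16-bit colour depth, matching the recommendation above
DesiredHRES=1024     ; horizontal resolution
DesiredVRES=768      ; vertical resolution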

Citrix server setup

Make sure the MetaFrame server (1.8, XP, 3, or 4) is installed. Check the manual to ensure the version you are installing is supported. Citrix sessions should always begin with a new connection, rather than picking up from wherever a previously disconnected session left off, which will most likely not be where the script expects it to be.

Black screen of death

Black snapshots may appear during record or replay when using Citrix Presentation Server 4.0 and 4.5 (before Rollup Pack 3). As a potential workaround, on the Citrix server select Start Menu > Settings > Control Panel > Administrative Tools > Terminal Services Configuration > Server Settings > Licensing and change the setting Per User or Per Device to the alternative setting (i.e. If it is set to Per User, change it to Per Device and vice versa.)

Lossy Compression

A script might play back successfully in VuGen on the load generator; however, when running it in a scenario on the same load generator, it could fail on every single image check. This is probably a result of lossy compression, so make sure to disable it on the Citrix server.

Script layout

Put clean-up code in vuser_end to close the connection if the actions fail. Don’t put login code in vuser_init. If the login fails in vuser_init, you can’t clean up anything in vuser_end because it won’t run after a failed vuser_init.

 

JDS found performance issues with the applications during performance tests; however, these issues leaned towards functional performance problems rather than volume-related ones. These issues were still investigated to provide an understanding of why the applications were experiencing performance problems.

The performance team then worked with action teams to assist with possible performance resolutions, for example:

  • Database indexing
  • Improvements to method calls
  • Improving Database queries
Posted by Lionel Lim in Financial Services, Tech Tips
Understanding Outbound Web Services in ServiceNow

ServiceNow has the ability to manage both inbound and outbound web services, but as they’re handled slightly differently, they can be confusing. The terms “inbound” and “outbound” do not describe the type of HTTP method being used but rather the point of origin for the web service. For example, both inbound and outbound services can use GET and POST.

Inbound web services are designed to provide third parties with the ability to retrieve (GET) or create (POST) data in ServiceNow, while outbound web services allow ServiceNow to initiate a transaction with a third party (also using either GET or POST, etc.)

Although they are closely related, inbound and outbound web services are handled differently through the ServiceNow GUI even though they’re essentially the same under the covers. Most developers are comfortable with inbound web services because ServiceNow provides a walk-through wizard for them, but they struggle with outbound services, even though these can be handled in the same way.

If a third-party wants to send information to ServiceNow using web services, then an inbound web service will allow them to POST that information. By directing incoming web services to a staging table, you can massage and transform incoming information into a format that suits your ServiceNow instance and avoid duplicates.

snow2

If ServiceNow wants to retrieve information from a third party using web services, then an outbound web service will allow ServiceNow to GET that information. In the same way as inbound web services, outbound GET web services should write to a staging table so the incoming data can be sanitized and duplicates avoided.

 

snow3

 

In both scenarios, the end result is the same—ServiceNow is populated with information from a third-party.

This tutorial assumes you have a basic knowledge of how to create applications in ServiceNow as we’re going to make an outbound web service to retrieve a weather report for use in ServiceNow.

To complete this tutorial, you will need to sign up for a free account with http://openweathermap.org/api and subscribe to the Current weather data.

Step One: Create an outbound web service

The external web service made available by Open Weather is:

http://api.openweathermap.org/data/2.5/forecast/city?q=Brisbane,AU&APPID={yourAppID}

If you sign up for an account with Open Weather and view the results in a browser, you’ll get a wall of JSON data in response, but if you copy and paste this into Notepad++ and use the JSON formatter plugin, you’ll see there’s structure to the data. This is important, as this will allow us to extract information later on.

snow4

As you can see, the temperature is in Kelvin rather than either Celsius or Fahrenheit, as 294K relates to a balmy 21C or 70F.

Also, the date time (dt) is using a numeric format known as Epoch time, so we’re going to need to transform the results from our outbound web service before they’re going to be useful.

Creating an outbound web service is easy enough: simply provide the web service with a name like outbound_get_weather and the endpoint URL, and tell ServiceNow what to accept and what the content type will be.

http://api.openweathermap.org/data/2.5/forecast/city

show5

Under HTTP Methods further down the page, you’ll see get, delete, put, and post. Delete all but get, as all we want to do is get information from Open Weather.
Edit the HTTP method get and add the query parameters for “q” and “APPID.” Don’t forget to set the order in which you want these parameters to occur.

  • q is the query parameter for the city, so we’re going to set this to be a variable rather than a static value. Make this ${q}
  • APPID is a static value, so this can be hard coded to match the application ID you got from Open Weather

Also, notice I’ve added a Variable Substitution and set this to be Sydney,AU, which allows me to click test and see fresh results for Sydney.

snow6

This is important as when we get to our scripted REST web service, we want to be able to change q to refer to other cities as appropriate.

Step Two: A Tale of Two Tables

Next, we’ll create a regular ServiceNow table called current weather with three columns to capture the data from our web services. This is the table that will be referenced by users and/or other applications.

snow7

Our second table is a staging table. This is where the raw results of our web service will reside.

When you make an inbound web service, ServiceNow automatically creates a staging table, but when it comes to outbound web services, you need to create a staging table for yourself and manually assign a transform map.

Be careful with this step. You must extend the Import Set Row table to create a valid staging transform table for your web service.

snow8

Notice how our data types in the staging table match those we saw in the web service results, so the epoch date is being stored as an integer.

Step Three: Transform

Now we can set up the transformation from our staging table to the final table in ServiceNow. Under System Import Sets click on Create Transform Map and build the following transform.

snow9

Initially, there’s only one source field we can match directly against a target field; for the others, we’ll need to use transform scripts.

To convert Kelvin to Celsius, add a new Field Map as follows, using this script…

var Celsius = source.temperature - 273.15;
return Celsius;

snow10

To convert the epoch date to a date ServiceNow will recognize, add another field map using…

snow11

var regularDate = new Date(0); // 0 forces JavaScript to use an epoch date
regularDate.setUTCSeconds(source.weather_date_in_epoch);
var formattedDate = regularDate.getFullYear() + '-' + (regularDate.getMonth() + 1) + '-' + regularDate.getDate() + ' ' + regularDate.getHours() + ':' + regularDate.getMinutes() + ':' + regularDate.getSeconds();
return formattedDate;
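
As an aside, if the server-side GlideDateTime API is available to your transform script, a sketch of an alternative is to let the platform format the date rather than assembling the string by hand (the field name comes from the staging table above; verify the behaviour in your own instance):

// Alternative sketch only: GlideDateTime returns the internal
// 'yyyy-MM-dd HH:mm:ss' format, so there's no need to build the string manually.
var gdt = new GlideDateTime();
gdt.setNumericValue(parseInt(source.getValue('weather_date_in_epoch'), 10) * 1000); // epoch seconds -> milliseconds
return gdt.getValue();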

When you’re finished, your transform map should appear as…

snow12

Step Four: Tying it all together

We’re going to create a scripted REST API to bring everything together.

snow13

The resource we’re developing here is going to be called outbound_request_for_weather and will require some scripting, but the basic logic is pretty simple.

snow14

The logic of our outbound request for the weather is…

  • Time the transaction and record that in the application log
  • Use the ServiceNow RESTMessageV2 library to undertake the web service
  • Delete any old records
  • Set a custom parameter to grab the weather for a specific location
  • Retrieve the results from the response body and add these to the staging table
  • Our transform map will then run automatically in the background and populate the actual table with results

Here’s the script I used.

(function process(/*RESTAPIRequest*/ request, /*RESTAPIResponse*/ response) {

    var requestBody, responseBody, status, sm;
    try {
        //Time the overall transaction so there's a record of how long our outbound web service takes
        var startTimer = new Date();

        sm = new sn_ws.RESTMessageV2("outbound_get_weather", "get");
        //sm.setBasicAuth("admin", "admin");
        sm.setStringParameter("q", "Townsville,AU");
        sm.setHttpTimeout(10000); //In milliseconds. Wait at most 10 seconds for a response from the http request.

        var getresponse = sm.execute();
        responseBody = getresponse.getBody();

        //clear out old records
        var currentweather = new GlideRecord("x_48036_weather_current_weather");
        currentweather.query();
        currentweather.deleteMultiple();

        var gr = new GlideRecord("x_48036_weather_staging_current_weather");
        gr.query();
        gr.deleteMultiple();

        //Convert the response into JSON
        var JSONdata = new global.JSON().decode(responseBody);

        //Loop through the records and add to the staging table
        for (var i = 0; i < JSONdata.list.length; i++) {
            gr.setValue("weather_date_in_epoch", JSONdata.list[i].dt);
            gr.setValue("temperature", JSONdata.list[i].main.temp);
            gr.insert();
        }
        //Post a message to the application log noting how long this activity took
        var endTimer = new Date() - startTimer;
        gs.info('***Weather updated in ' + endTimer + ' msec');
    } catch (ex) {
        responseBody = "An error has occurred";
        status = '500';
    }
})(request, response);

Notice how the JSONdata.list[i].main.temp matches the structure of web service results we saw earlier.

If we look at the current weather table, we can see our results.

snow15

If we look at the application log we can see how long this activity took.

snow16

This script can now be set up under the System Definition as a Scheduled Job and run as often as needed.

In summary, we built our outbound web service using the same components as found in an inbound web service.

snow17

One common question that arises when dealing with web services is, “What approach should I use?” The answer is: think about who is initiating the web service and what you want to accomplish.

snow18

Posted by Peter Cawdron in Tech Tips
How to solve SSL 3 recording issues in HPE VuGen

With web application security becoming more important, you may find servers refusing to accept the SSL 3.0 protocol due to security vulnerabilities such as POODLE (https://en.wikipedia.org/wiki/POODLE).

Older versions of VuGen will refuse to record the application and display an error page similar to the one below, giving only vague information as to what the problem is.

SSL3a updated

VuGen 12.50 will now show a popup giving a hint where the problem is:

SSL3b

Using Wireshark, it becomes clear that the issue is with the SSL handshake:

SSL3c

Compared with a successful secure handshake recording when using the browser:

SSL3d

By default VuGen has the following “Recording -> Network -> Mapping and Filtering” settings:

SSL3e

The problem is that VuGen will not try later TLS versions after the first handshake has failed, unlike a browser, which will start from the highest TLS version and work down until the server accepts the handshake:

SSL3f

The simple solution is to change the VuGen Recording -> Network -> Mapping and Filtering setting to at least TLS 1.0.

Posted by David Batty in Hewlett Packard Enterprise, Tech Tips
How to record Angular JS Single Page Applications (SPA)

These days VuGen offers a number of ways to record SPA web applications, including TruClient and importing Session Archive Zip (SAZ) files from Fiddler. However, you can still use the HTTP/HTML protocol in VuGen. In doing so, you may encounter a number of issues when recording the HTTP RESTful service calls.

To create robust, maintainable scripts, JDS recommends using HTML-based recording rather than URL-based scripts. ‘HTML Mode’ will not extract every resource URL into a separate step but will either encapsulate them within the initial calling URL or add them to the EXTRARES section of the step:

web_url("home",
"URL=https://xxxxxxx.xxx",
"TargetFrame=","Resource=0",
"RecContentType=text/html",
"Referer=",
"Snapshot=t1.inf",
"Mode=HTML",
EXTRARES,
"Url=../fonts/OpenSans/OpenSans-Regular-webfont.eot", ENDITEM,
"Url=../fonts/fontawesome-webfont.eot", ENDITEM,
"Url=../media/bootstrap.min.css", ENDITEM,
"Url=../css/jds-logo-white.png", ENDITEM,
"Url=../fonts/OpenSans/OpenSans-Light-webfont.eot", ENDITEM,
"Url=../media/jds_bg.jpeg", ENDITEM,
"Url=../fonts/OpenSans/OpenSans-Semibold-webfont.eot", ENDITEM,
"Url=../css/bg3.jpg", ENDITEM,
"Url=../css/32px.png", ENDITEM,
"Url=../fonts/OpenSans/OpenSans-Bold-webfont.eot", ENDITEM,
"Url=../css/background.jpg", ENDITEM,
LAST);

This is a problem if you want to measure the transaction times of RESTful service calls used in SPA frameworks.

To enable the VuGen recorder to extract the AngularJS calls as separate steps, we must change the “Recording Options -> HTTP Properties -> Advanced” recording scheme so that responses with the following header are not recorded as resources:

Content-Type: application/javascript
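
Once the recording scheme is changed, the RESTful calls appear as their own steps and can be wrapped in transactions. A rough sketch of what that can look like follows; the endpoint, transaction name and check text are assumptions, not part of the recorded application above:

// Sketch only - the URL, transaction name and check text are assumptions.
lr_start_transaction("get_accounts");

web_reg_find("Text=accountId", LAST);          // basic check that the JSON payload came back

web_custom_request("get_accounts",
    "URL=https://xxxxxxx.xxx/api/accounts?page=1",
    "Method=GET",
    "Resource=0",
    "RecContentType=application/json",
    "Snapshot=t2.inf",
    "Mode=HTML",
    LAST);

lr_end_transaction("get_accounts", LR_AUTO);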

Posted by David Batty in Hewlett Packard Enterprise, Tech Tips
Calculating Pacing for Performance Tests

If it takes 4 people 12 days to dig a trench, how long would it take 6 people?

Did you ever have this question in maths at school? Well, the teacher may have been right that you’d need this later in life. When setting up our scenarios, we need to calculate pacing. How often should your script run if you need to complete 1000 business processes an hour with 250 virtual users?

Our Pacing Calculator tool will help you work out the pacing with the minimum and maximum range calculated for you.

As long as your script execution time (including think time) doesn’t take longer than the minimum pacing, you’ll achieve your throughput. If the pacing interval doesn’t allow enough time to complete your business process, then you should consider reducing the think time or adding more virtual users.
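
For those who prefer to see the arithmetic, here is a rough sketch of the calculation the tool performs, using the figures from the question above (the function name is ours, not part of any LoadRunner API):

// 1000 business processes per hour across 250 virtual users:
//   iterations per user per hour = 1000 / 250 = 4
//   pacing interval              = 3600 / 4   = 900 seconds between iteration starts
double pacing_seconds(double processes_per_hour, double virtual_users)
{
    return (3600.0 * virtual_users) / processes_per_hour;   // 3600 * 250 / 1000 = 900
}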

Posted by Daniel Spavin in Products, Tech Tips
Vugen and GitHub Integration

With the release of LoadRunner 12.53, VuGen now has built-in GitHub integration. That means you not only have access to GitHub for saving versions of your script, but also other tools like bug tracking, access control and wikis.

Here’s an overview of VuGen’s GitHub integration to get you up and running.

Getting Started

First off, you’ll need a personal GitHub login. You can sign up for free at github.com/join. Note that free repositories are publicly available.

You’ll also need LoadRunner 12.53 or higher from HPE.

GitHub Overview

VuGen’s GitHub integration (and Git in general) works by managing three versions of your script at a time.

Vugen and GitHub Integration 1

  1. The one you see in VuGen is your working copy. You can develop / replay your script as usual, and save it locally when you want.
  2. When you want a backup of your script – e.g. before doing correlation or re-recording – you can make a checkpoint by committing the script. This saves the script to a local staging area.
  3. After several commits, or perhaps at the end of the day, you might be ready to save everything to GitHub. To do this, you push the script (the equivalent plain Git commands are sketched below for reference).
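
For reference, the same flow expressed as plain Git commands looks roughly like this; VuGen performs the equivalent operations for you through its Version Control menu, and the repository URL and commit message are placeholders:

git init                                              # create the local repository
git add -A                                            # stage the script files
git commit -m "Initial BankofSunshine script"         # commit to the local staging area
git remote add origin https://github.com/<your-account>/<your-repo>.git
git push -u origin master                             # push the commits up to GitHub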

Using GitHub integration straight from VuGen

The following example shows you how to push your first VuGen script to GitHub.

1. Starting out – Creating a Repository

Log into GitHub and create a new repository:

Vugen and GitHub Integration 2

Fill in the details and click “Create Repository”. With a free GitHub account the script will be publicly available; with a paid account you can make it private.

Vugen and GitHub Integration 3

2. VuGen Version Control – Create Local Git Repository
Now create a script in VuGen – in our example it’s ‘BankofSunshine’.

You’ll see a new ‘Version Control’ menu available in VuGen. Choose the option to ‘Create a local Git Repository’.

Vugen and GitHub Integration 4

VuGen manages the files to include so you don’t need to create a .gitignore file. If you prefer to manage it yourself, you can do that too.

3. Commit Changes to the Staging Area

Now you need to commit your script to the local repository. Do this each time you’ve made changes that you might want to push to GitHub, or if you want to be able to roll back any changes.

When you commit, your local repository is ready to be pushed up to GitHub – but is still only available to you.

Vugen and GitHub Integration 5

4. Push Changes to GitHub

Once you are ready to save your script up to GitHub, you will need to Push the changes.

The first time you do this with your script you will need to tell VuGen some details about the GitHub repository.

Enter your details that you created in Step 1:

Vugen and GitHub Integration 7

Now your newly created script is saved to GitHub.

5. There’s no step 5.

That’s all you need to do. When you go to GitHub and click on your repository, you will see all the files that you just pushed:

Vugen and GitHub Integration 8

To keep track of changes made locally to the script, VuGen will show you which files have been updated with a red tick:

Vugen and GitHub Integration 9

While you can access your scripts in ALM from the Controller, you can’t yet access your scripts in GitHub from the Controller. You’ll need to pull down a local copy before you run your test.

Now that you are up and running, how about exploring more of what GitHub has to offer? Each repository saved to GitHub comes with a wiki and an issues register. These could come in handy when you have large or tricky scripts, or for handover to another team.

Share your thoughts on VuGen GitHub integration below.

Posted by Daniel Spavin in Hewlett Packard Enterprise, Tech Tips