Performance tuning tips for Qualys Import Jobs

Summary

Users may observe that Qualys import jobs run slowly or stop partway through in an error state when importing vulnerability data from Qualys. Many factors can cause this slowness; the tips below help to troubleshoot the root cause and improve Qualys data import performance.

Release

Applies to all releases.

Instructions

Qualys Host Detection integration job

Among all the Qualys integration jobs, the Qualys Host Detection integration job normally brings in the largest data volume, including both Qualys Host and Vulnerability data. The tips below are mainly for this job.

1. Check for slow scripts or queries

Slow scripts can be found in the sys_script_pattern.LIST table and slow queries in the sys_query_pattern.LIST table. The [Average execution time (ms)] and [Execution count] fields show which business rules, database queries, scripts, etc., run most frequently and most slowly during a Qualys import.

You can also check the localhost logs for slow business rules running on any vulnerability table (table names start with "sn_vul_"). Below is a sample log snippet showing a slow business rule running on the sn_vul_third_party_entry table.

======================
yyyy-mm-dd hh:mm:ss (567) worker.1 worker.1 txid=14fbc5761b44 Slow business rule '<business_rule_name>' on sn_vul_third_party_entry: QID-13497, time was: 0:00:00.124
2020-10-27 15:03:35 (568) worker.4 worker.4 txid=61bd893a1b44 WARNING *** WARNING *** (82)ExceptionHandler - exception at service ImpactManager: com.snc.cmdb.CmdbRuntimeException: Internal Error: expected record 'b642ada5b01c4909fb7ba8f1090203ce' is missing from table 'svc_model_obj_service' at com.snc.cmdb.service.modeling.persistence.BaseBlob$BaseLazyFetcher.load(BaseBlob.java:555) at com.snc.cmdb.service.modeling.persistence.BaseBlob$BaseLazyFetcher.getRaw(BaseBlob.java:445)... ... ...
======================

When a slow business rule or query is found, first complete the basic checks below. If the root cause is still unknown, raise a task to involve the Performance team for further investigation.
a. Check whether any indexes are missing on the vulnerability table compared with an OOTB (Out-Of-The-Box) instance
b. Check whether any heavy database queries or script logic in that business rule can be optimized
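As a quick starting point, the pattern tables above can also be queried from Scripts - Background. The sketch below is only a minimal example, not part of the product: the column names 'table', 'average_execution_time' and 'execution_count' are assumptions mapped from the field labels mentioned above and may differ on your instance, so verify them against the sys_query_pattern dictionary before running.

======================
// Background-script sketch: list the slowest recorded query patterns that
// touch Vulnerability Response tables (assumed column names, verify first).
var gr = new GlideRecord('sys_query_pattern');
gr.addQuery('table', 'STARTSWITH', 'sn_vul_');   // vulnerability tables only
gr.orderByDesc('average_execution_time');        // slowest patterns first
gr.setLimit(20);
gr.query();
while (gr.next()) {
    gs.info(gr.getValue('table') + ' | avg ms: ' +
            gr.getValue('average_execution_time') + ' | count: ' +
            gr.getValue('execution_count'));
}
// A similar query against sys_script_pattern (with that table's own column
// names) can be used to surface slow scripts.
======================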
2. Check slow vulnerability rules

During a Qualys import, the related Vulnerability Rules run after the vulnerability data is imported. Below are the three typical Vulnerability Rules.
Vulnerability Risk Rules (table: sn_vul_calc_risk.LIST)
Vulnerability Assignment Rules (table: sn_vul_assignment_rule.LIST)
Vulnerability Group Rules (table: sn_vul_grouping_rule.LIST)

Follow the steps below to check where the bottleneck is on a Qualys Integration Run record.
a. Open the sn_vul_integration_run.LIST table and open the corresponding Qualys Integration Run record, OR go to the menu Qualys Vulnerability Integration > Primary Integrations, open the Qualys Integration record (for example, Qualys Host Detection Integration), and find the Qualys Integration Run records in the [Vulnerability Integration Runs] tab
b. Add the fields below to the "Vulnerability Integration Process" list view if they are not already there:
Assignment rules time
Group rules time
Risk rules time
Other fields, such as "Import queue processing time", "VI creation time", and "CI lookup time", can also be added for more processing information, which helps diagnose where the slowness happens while the Qualys data is being processed. Below is a screenshot showing the field information.

If slow processing is found on one specific assignment rule, try disabling it and re-running the Qualys integration. If multiple assignment rules are all running slowly, try the options below.
a. Disable the business rule "Run assignment rules" and re-run the Qualys integration.
b. Mark all assignment rules and group rules as inactive and re-run the initial import. This imports all the Vulnerability Items for the first time. After that, enable all of these rules (group + assignment rules) and click the [Apply Changes] button on them. This applies the rules to the vulnerability items already imported on the instance and reduces the workload during the initial import. Please refer to the screenshot below for this option.
Once the initial import is done, subsequent imports only bring in the smaller delta of new records, so they are processed faster.
If any of the Vulnerability Rules are found to be customized, please raise a task to involve the dev team to check the logic further.
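If it helps to compare several runs quickly, the timing fields above can also be read with a short background script. This is only a sketch under assumptions: the column names 'assignment_rules_time', 'group_rules_time' and 'risk_rules_time' are guesses mapped from the field labels listed above, so confirm them in the sn_vul_integration_run dictionary before use.

======================
// Background-script sketch: print the rule-processing durations of the most
// recent integration runs so the slowest phase stands out.
// Column names are assumptions; verify them on your instance first.
var run = new GlideRecord('sn_vul_integration_run');
run.orderByDesc('sys_created_on');   // newest runs first
run.setLimit(10);
run.query();
while (run.next()) {
    gs.info(run.getDisplayValue() +
            ' | assignment: ' + run.getValue('assignment_rules_time') +
            ' | group: ' + run.getValue('group_rules_time') +
            ' | risk: ' + run.getValue('risk_rules_time'));
}
======================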
3. Increase data sources

The number of data sources determines the number of threads used to process the Qualys data. A node can run multiple threads depending on its free scheduler workers. If the instance has more nodes available, try increasing the data sources for a specific Qualys Integration by following the steps below.
a. Navigate to the menu Qualys Vulnerability Integration > Primary Integrations
b. Open the Qualys Integration record (for example, Qualys Host Detection Integration)
c. Go to the [Data Sources] tab and click the New button to add more data sources for this integration record
Please refer to the screenshot below, and to KB0995003, for more details.

4. Increase scheduled import pool

Scheduled import templates are used to process the data delivered by the data sources. The pool is shared by all active Vulnerability Response applications on the same instance, such as Qualys, Rapid7, and Tenable. ServiceNow provides 10 scheduled import templates by default, which means 10 parallel threads run at a time to process 10 attachments. If it is confirmed that multiple Vulnerability Response applications are competing for system resources and affecting the Qualys jobs' performance, consider adding more import templates. For more details, please refer to KB0995644.
Note: Each scheduled import template consumes a scheduler worker on a node, so check carefully how many nodes the customer's instance has before increasing the scheduled import pool.

5. Decrease truncation limit

The truncation limit is used for Qualys data import pagination and is set to 500 on an OOTB (Out-Of-The-Box) instance. This value controls how many Qualys records are imported at one time. Decrease it if the system is overloaded, and increase it again cautiously to get better throughput. It can be changed by following the steps below.
a. Navigate to the menu Qualys Vulnerability Integration > Integration Instances
b. Open the Qualys record and go to the [Integration Instance Parameters] tab
c. Change the value of "truncation_limit"
Please refer to the screenshot below for more details.

6. Increase Qualys Attachment evaluation time

During a Qualys import, the vulnerability data retrieved from the Qualys server is saved as an XML attachment in the sn_vul_ds_import_q_entry.LIST table. The attachment evaluation time is set to 3600 seconds (60 minutes) by default in the script include "VulnerabilityDSAttachmentManager". When a processing issue occurs and processing runs beyond this time, the error message below is written to the "Processing notes" field of the sn_vul_ds_import_q_entry records.
Error, "Job exceeded processing time and was forced to complete status"
In this case, you can increase the evaluation time by following the steps below to handle a larger volume of Qualys data.
a. Open the script include below on your instance (replace <instance_name> with your instance name in the URL)
https://<instance_name>.service-now.com/nav_to.do?uri=sys_script_include.do?sys_id=aa1b81669f31020034c6b6a0942e7014
b. Increase the default hard-coded limit of 3600 seconds (the variable _MAX_PROC_TIME_S: 3600)
Please refer to the screenshot below for this setting.
After the timeout value in the script include "VulnerabilityDSAttachmentManager" has been changed, the same value in the script include "VulnerabilityIntegrationUtils" should be changed as well. Please refer to the screenshot below.

Qualys Knowledge Base integration job

The Qualys Knowledge Base integration job retrieves KB data only, so its data volume is comparatively much smaller. However, performance issues are still possible depending on usage. For example, some customers want to retrieve 10-20 years of KB data from their Qualys server, which makes the payload attachment very large and can cause memory issues. ServiceNow therefore provides a "max_delta_days" parameter on the Qualys integration instance page (please refer to the screenshot below). By default the value is 365, which means each run retrieves at most 365 days of data. When retrieving a large volume of KB data from Qualys, reduce the number to 6 months (184 days) or even less to keep the payload of each request small and avoid memory issues. Once a run completes, the job automatically moves the start date/time forward by that window so the next run retrieves the following 6 months of data.
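To make that windowing behaviour concrete, below is a rough, illustrative background-script sketch of how a 184-day max_delta_days value splits a long KB backlog into several smaller requests. It is not the shipped integration code; the start date and the loop are purely examples of the arithmetic involved.

======================
// Illustrative sketch only: how a max_delta_days window of 184 days advances
// the retrieval start date run after run, so a long KB history is pulled in
// several smaller payloads instead of one oversized attachment.
var maxDeltaDays = 184;                                       // reduced from the default 365
var windowStart  = new GlideDateTime('2015-01-01 00:00:00');  // example backlog start date
var now          = new GlideDateTime();
while (windowStart.getNumericValue() < now.getNumericValue()) {
    var windowEnd = new GlideDateTime(windowStart);
    windowEnd.addDaysUTC(maxDeltaDays);
    gs.info('Would request KB data from ' + windowStart + ' to ' + windowEnd);
    windowStart = windowEnd;                                  // next run starts where this one ended
}
======================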