An acquaintance of mine was saying that he didn’t find automated vulnerability assessments useful and that they were a waste of time. He was spending his time scrubbing false positives and said he would rather focus his energy on manual security testing activities. Compliance, however, mandates that we run automated assessments. A manual assessment cannot provide a high level of assurance because of the human factor involved; there is always a chance of lapses or oversight. So we had to mandate automated assessments in addition to manual testing.

I also decided to help him fine-tune his scan configuration so that his scans produce better results next time. If you are going through a similar situation, the options below in your scan tool can help you, whether you are using HP WebInspect, Acunetix, or IBM AppScan.

  1. Limit Maximum Single URL Hits

Highly dynamic sites can have URLs like http://www.example.com/index.php?article_id=1, where article_id can range from 1 into the millions. If you let your scanning tool crawl every such URL, your scan will never complete.

For a typical site, it is better to limit this to 3 hits or fewer.
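As a rough sketch of what this setting does, here is a toy crawler filter that caps hits per base URL. The function, the limit, and the URLs are illustrative only, not any scanner's actual implementation:

```python
from collections import Counter
from urllib.parse import urldefrag, urlparse, urlunparse

MAX_SINGLE_URL_HITS = 3  # the conservative default suggested above

hit_counts = Counter()

def should_crawl(url):
    """Return True if this URL's base form is still under the hit limit."""
    # Strip the query string and fragment so that
    # /index.php?article_id=1 and /index.php?article_id=2
    # count as hits against the same base URL.
    parts = urlparse(urldefrag(url)[0])
    base = urlunparse((parts.scheme, parts.netloc, parts.path, "", "", ""))
    hit_counts[base] += 1
    return hit_counts[base] <= MAX_SINGLE_URL_HITS

urls = ["http://www.example.com/index.php?article_id=%d" % i
        for i in range(1, 11)]
crawled = [u for u in urls if should_crawl(u)]
# Only the first 3 article URLs pass the filter; the rest are skipped.
```

With the limit in place, ten article URLs collapse to three crawl requests, which is why scans finish in reasonable time.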

  2. Include Query Parameters in Hit Count while limiting Single URL hits

What if your application takes a different action depending on the parameter passed to a single URL? Take the URL http://www.example.com/index.php?action=add, where action can take values like add, edit, delete, archive, import, or export.

In this specific case, you want your scan tool to hit all six action values. If your site is structured like this, it makes sense to include the query parameters in the hit count while you limit single URL hits. In this example, the limit would be 6 instead of 3.
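The difference between the two counting modes can be sketched with a deduplication key. Again, this is a conceptual illustration (the function name and key shape are my own), not how any particular scanner stores its crawl state:

```python
from urllib.parse import urlparse, parse_qsl

def crawl_key(url, include_query=True):
    """Build the deduplication key for a URL.

    With include_query=True, /index.php?action=add and
    /index.php?action=edit count as different URLs, so the
    scanner exercises every action the application supports.
    With include_query=False, they collapse into one URL.
    """
    parts = urlparse(url)
    key = (parts.netloc, parts.path)
    if include_query:
        key += (tuple(sorted(parse_qsl(parts.query))),)
    return key

actions = ["add", "edit", "delete", "archive", "import", "export"]
urls = ["http://www.example.com/index.php?action=" + a for a in actions]

distinct = {crawl_key(u) for u in urls}                        # 6 keys
collapsed = {crawl_key(u, include_query=False) for u in urls}  # 1 key
```

If query parameters are ignored, all six actions collapse to one URL and five of them never get audited; including them in the hit count (and raising the limit to 6) covers every action.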

  3. Limit maximum CRAWL directory depth

If your site is structured like http://www.example.com/parent/child/grand-child/great-grandchild/great-great-grandchild/ and so on, and the code and content displayed in each of these sub-directories is virtually identical, it makes sense to limit the maximum crawl depth. That way, the scan tool will not endlessly crawl every possible sub-directory.

Set your default option to 3 and no more. If you know for certain that all your code lives in a single directory, you can even reduce this to 1.


  4. Limit maximum CRAWL Count

This often happens with a content management system such as a newspaper site: the content can span hundreds of thousands or even millions of pages, and the scan will never complete. In such a situation it is better to limit the crawl count.
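The effect of a crawl-count cap can be shown with a toy breadth-first crawler. The cap value and the simulated "endless article" site below are hypothetical:

```python
from collections import deque

def crawl(start, get_links, limit):
    """Breadth-first crawl that stops after `limit` pages, so a site
    with millions of article pages cannot keep the scan running forever."""
    seen, queue = set(), deque([start])
    while queue and len(seen) < limit:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        queue.extend(get_links(url))
    return seen

# Simulated CMS where every article links to the next one, endlessly.
def next_article(url):
    n = int(url.rsplit("/", 1)[1])
    return ["/article/%d" % (n + 1)]

pages = crawl("/article/1", next_article, limit=10)
# Exactly 10 pages are visited, no matter how many articles exist.
```

Without the `limit` check, this loop would never terminate on such a site, which is exactly the failure mode the scan option prevents.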


  5. Limit maximum web form submissions

What if your web form has drop-downs for country, every possible state, and even cities? You cannot have your scan tool submit the web form for every possible permutation and combination of country, state, and city. Limit the maximum number of web form submissions to 1 or 2.
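The combinatorial explosion is easy to see with some made-up drop-down values (a real form would have far more, making the blow-up worse):

```python
from itertools import islice, product

MAX_FORM_SUBMISSIONS = 2  # the cap suggested above

# Hypothetical drop-down contents for illustration only.
countries = ["US", "IN", "DE"]
states = ["CA", "NY", "TX", "KA", "BY"]
cities = ["City-%d" % i for i in range(20)]

# Exhaustive submission would mean 3 * 5 * 20 = 300 form posts.
total = len(countries) * len(states) * len(cities)

# With the cap, only the first few combinations are submitted.
submissions = list(islice(product(countries, states, cities),
                          MAX_FORM_SUBMISSIONS))
```

Even this tiny example yields 300 combinations; with real country/state/city lists the count runs into the millions, so capping submissions at 1 or 2 is the only way to keep the scan bounded.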

Most scan tools also offer options such as ‘Limit Scan Folder’, ‘Enable or Disable Scan Logs’, and ‘Limit link traversal depth’, or even let you modify their scan policies so that you can skip certain audits. Remember, a scanner is only a tool; it is up to the tester to use it to his or her best.
