Application Security Metrics

This is something that I worked on last year when stakeholders in the risk management group wanted to measure the success of the Application Security Program.

But how do you measure application security? Or rather, the success of an application security center of excellence program? What tells you that it is working? Is it OK to allocate the same budget every year? Should it be reduced? Is the program on track? Is it improving? How would one know? Just having a secure SDLC process, doing secure code analysis, and running security testing alone does not mean you have a sustainable application security program. To sustain any task or activity, one needs to know where they are and where they need to reach. And that is exactly what application security metrics will give you.

What should be done first? Answer: Inventory.

  1. Take an inventory of your assets first. Whether an asset is secure, insecure, or you don't even know what it is used for, it doesn't really matter at this stage. It is amazing to ask any CISO whether he has a fair understanding of how many assets the organization has. Here, we are not getting into hardware or software assets, but just the basic web applications/services that an organization's IT floats on the internet or intranet.

Once the inventory is finalized, come up with an asset classification using a risk-based approach. Some assets could be critical, some public. Some could be accessed by all, and some only within a closed, trusted environment. Some assets are used by millions of users, and some are used just by the CISO (ya, you read it right. His dashboard).
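The risk-based classification above can be sketched as a simple scoring function. This is a minimal, hypothetical example: the field names, weights, and tier thresholds are my own assumptions, not a standard scheme, and any real program would tune them to its own risk appetite.

```python
# Hypothetical sketch of risk-based asset classification.
# Weights and thresholds are illustrative assumptions, not a standard.

def classify_asset(internet_facing: bool, handles_sensitive_data: bool,
                   user_count: int) -> str:
    """Return a coarse risk tier for an application asset."""
    score = 0
    if internet_facing:
        score += 2          # exposed to anyone on the internet
    if handles_sensitive_data:
        score += 2          # PII, payments, credentials, etc.
    if user_count > 100_000:
        score += 1          # large blast radius
    if score >= 4:
        return "critical"
    if score >= 2:
        return "high"
    return "moderate"

# A public, data-heavy app with millions of users vs. an internal one-user dashboard:
print(classify_asset(True, True, 1_000_000))   # critical
print(classify_asset(False, False, 1))         # moderate
```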

  2. Once the classification is done, go figure out your security processes for each of your assets. Did all applications undergo all aspects of the secure SDLC?

In other words, ‘Security Coverage‘. Let’s say you do code analysis for only 50 of your 100 applications; then your coverage is only 50% and you have no idea about the rest of the apps. With this simple metric, it becomes fairly clear what one needs to do.
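The coverage metric is just a ratio over the inventory. A minimal sketch, using made-up application names and a made-up set of analyzed apps to mirror the 50-of-100 example:

```python
# Minimal sketch of the 'security coverage' metric described above.
# The inventory and the 'code_analyzed' set are illustrative, not real data.

inventory = ["app-%03d" % i for i in range(100)]   # 100 applications in inventory
code_analyzed = set(inventory[:50])                # only 50 underwent code analysis

def coverage(assets, covered):
    """Percentage of inventoried assets that completed a given security activity."""
    return 100.0 * len([a for a in assets if a in covered]) / len(assets)

print("Code analysis coverage: %.0f%%" % coverage(inventory, code_analyzed))
# → Code analysis coverage: 50%
```

The same function works for any activity in the secure SDLC (threat modeling, security testing, and so on); tracking each one per release turns coverage into a trend rather than a snapshot.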



Scanning Large or Highly Dynamic Sites

An acquaintance of mine was saying that he didn’t find automated vulnerability assessments useful and that they were really a waste of time. He was spending his time scrubbing false positives and said he would rather focus his energy on manual security testing. But compliance mandates that we run automated assessments. A manual assessment offers no high level of assurance because of the human factor involved; there is always a chance of lapses or oversight. So we had to mandate automated assessments in addition to manual testing.

I also decided to help him fine-tune his scan configuration so that his scans produce better results next time. If you are going through a similar situation, the options below in your scan tool can help you. You can use them whether you are running HP WebInspect, Acunetix, or IBM AppScan.

  1. Limit Maximum Single URL Hits:

Highly dynamic sites can have URLs whose article IDs range from 1 to even millions. In this case, if you let your scanning tool crawl every such URL, your scan will never complete.

In a typical site, it is better to limit it to 3 or even less.
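A per-URL hit limit might be enforced along these lines. This is an illustrative sketch of the general idea, not how any of the named scanners actually implement it; the limit of 3 matches the suggestion above.

```python
# Illustrative sketch: a crawler enforcing a per-URL hit limit,
# counting hits by path only and ignoring the query string.
from collections import Counter
from urllib.parse import urlsplit

MAX_HITS_PER_URL = 3    # the limit suggested above
hits = Counter()

def should_crawl(url: str) -> bool:
    """Allow at most MAX_HITS_PER_URL requests to the same path."""
    path = urlsplit(url).path
    if hits[path] >= MAX_HITS_PER_URL:
        return False
    hits[path] += 1
    return True

# Requests 4 and 5 to the same article path are skipped:
results = [should_crawl("http://example.com/article?id=%d" % i) for i in range(5)]
print(results)   # [True, True, True, False, False]
```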

  2. Include Query Parameters in Hit Count while limiting Single URL hits.

What if your application takes a different action depending on a parameter passed to a single URL? Take the case of a URL whose ‘action’ parameter can take values like add, edit, delete, archive, import, or export.

In this specific case, you want your scan tool to hit all six action cases. If your site is structured like this, it makes sense to include your query parameters in the hit count while you limit single URL hits. In this example, the limit would be 6 instead of 3.
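The difference from the previous option is just what the hit counter is keyed on. A sketch under the same assumptions as before (the ‘action’ parameter and its values come from the example above; the URL is a placeholder):

```python
# Sketch: keying the hit count on path plus sorted query parameters,
# so each distinct 'action' value gets its own crawl budget.
from collections import Counter
from urllib.parse import urlsplit, parse_qsl

MAX_HITS = 1
hits = Counter()

def should_crawl(url: str) -> bool:
    """Allow MAX_HITS requests per unique (path, query-parameters) combination."""
    parts = urlsplit(url)
    key = (parts.path, tuple(sorted(parse_qsl(parts.query))))
    if hits[key] >= MAX_HITS:
        return False
    hits[key] += 1
    return True

actions = ["add", "edit", "delete", "archive", "import", "export"]
urls = ["http://example.com/item?action=" + a for a in actions]
crawled = sum(should_crawl(u) for u in urls)
repeat = should_crawl(urls[0])
print(crawled, repeat)   # 6 False — all six actions crawled once, the repeat skipped
```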

  3. Limit maximum CRAWL directory depth.

If your site is structured with many levels of sub-directories, and the code and content displayed in each of these sub-directories is virtually identical, it makes sense to limit the maximum crawl depth. By doing so, the scan tool will not endlessly crawl every possible sub-directory.

Set your default option to 3 and no more. If you know for certain that all your code is in a single directory, you can even reduce this to 1.
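The depth check itself is simple. A hypothetical sketch that counts directory segments before the final path component, using the default of 3 suggested above:

```python
# Sketch: rejecting URLs deeper than a maximum directory depth.
from urllib.parse import urlsplit

MAX_DEPTH = 3   # the default suggested above

def within_depth(url: str) -> bool:
    """Depth = number of directory segments before the final path component."""
    segments = [s for s in urlsplit(url).path.split("/") if s]
    return len(segments) - 1 <= MAX_DEPTH

print(within_depth("http://example.com/a/b/c/page"))       # True  (depth 3)
print(within_depth("http://example.com/a/b/c/d/e/page"))   # False (depth 5)
```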


  4. Limit maximum CRAWL Count

This always happens with a content management system, such as a newspaper site. The content can span hundreds of millions of pages, and the scan will never complete. In such a situation, it is better to limit the crawl count.


  5. Limit maximum web form submission

What if your web form has drop-downs for country, the states of every possible country, and even cities? You cannot have your scan tool submit the form for every possible permutation and combination of country, state, and city. Limit maximum web form submissions to 1 or 2.
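A quick back-of-the-envelope shows why the cap matters. The counts below are rough assumptions just to make the arithmetic concrete:

```python
# Why exhaustive form submission explodes: assumed drop-down sizes.
countries = 195
states_per_country = 30     # rough average, an assumption
cities_per_state = 100      # rough average, an assumption

exhaustive = countries * states_per_country * cities_per_state
print(exhaustive)           # 585000 submissions to cover every combination
print(min(exhaustive, 2))   # 2 — capped by the limit suggested above
```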

Most scan tools also come with options like ‘Limit Scan Folder’, ‘Enable or Disable Scan Logs’, and ‘Limit link traversal depth’, or even let you modify their scan policies so that you can skip running certain audits. Remember, a scanner is just a tool, and it is up to the tester to use it to her/his best advantage.