
The Downfall of DAST Security Testing

Many organizations gather data from a variety of sources, such as their users, and then store it in data lakes. Data is everything these days, as is the ability to process huge volumes of it in a short time.

That data is almost always consumed through APIs. APIs hold all application components together and expose them as a single interface to any type of client, such as a user.

Of course, APIs did not come alone. The traditional monolithic architecture has given way to other approaches, such as microservices. Everything is modular these days. Microservices make an application modular and composable, bringing better scalability, greater flexibility, and shorter time-to-market.

Microservices combined with APIs are considered the future of software development, so much so that in 2019 OWASP released an API-focused counterpart to its well-known Top 10: the OWASP API Security Top 10. In this scenario, application security needs both to adapt to this new way of building software and to keep up with it.

Over the last decade, application security has also evolved, in the form of several automated and specialized types of security testing. However, the evolution of some security controls seems to lag behind, and that is the case of Dynamic Application Security Testing (DAST).


DAST, a nice guy

DAST came out as a «fairly» automatic black-box tool to find specific web application vulnerabilities at a time when almost the only alternatives were SAST and manual pentesting. DAST tools emerged with the promise of assisting the pentester and filling some of the gaps of SAST tools, such as reducing false positives and scanning time.

The process is very simple: driven by predefined rulesets, the tool sends HTTP requests to the web application and watches for certain strings in the responses that indicate a vulnerability. In other words, DAST attempts to simulate a pentester. The sketch below shows the essence of that loop.
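
As a rough illustration, here is a minimal sketch of that loop in Python. The target URL, the parameter name and the two rules are made-up placeholders, not a real product's ruleset.

```python
# Minimal sketch of the core DAST loop: send payloads from a predefined
# ruleset and look for indicator strings in the responses.
import requests

RULES = [
    {
        "name": "Reflected XSS (basic)",
        "payload": "<script>alert(1)</script>",
        "indicator": "<script>alert(1)</script>",  # payload echoed back unencoded
    },
    {
        "name": "SQL error disclosure",
        "payload": "'",
        "indicator": "SQL syntax",  # typical database error fragment
    },
]

def scan_parameter(url: str, param: str) -> list[str]:
    """Send each payload in the given query parameter and report matches."""
    findings = []
    for rule in RULES:
        resp = requests.get(url, params={param: rule["payload"]}, timeout=10)
        if rule["indicator"] in resp.text:
            findings.append(f"{rule['name']} suspected at {resp.url}")
    return findings

if __name__ == "__main__":
    # Hypothetical target: a local app with a searchable "q" parameter.
    for finding in scan_parameter("http://localhost:8080/search", "q"):
        print(finding)
```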

Needless to say, there are a couple of major drawbacks:

  • False positives. Any findings need some sort of review after the scan, the so-called triage. That triage usually requires the expert eye of a cybersecurity analyst, and even for an expert it takes precious time that may end up slowing down the post-scan steps.
  • Time. Scans take time to run, usually from 30 minutes to 2 hours, or even days depending on the application’s size. The duration also depends on how good the configuration is: the better the configuration, the less time the scan will take. But setting up a good pre-scan configuration is not that easy and most of the time requires a cybersecurity expert.


DAST is far behind Software Development

The way of developing software has changed. DAST is now commonly integrated into a CI/CD pipeline, and in CI/CD environments agility and speed are the cornerstones. Any build-and-deployment pipeline shouldn’t last longer than just a few minutes.

This is not exactly DAST’s strong suit. As mentioned before, DAST scans take time to run, time to review the findings, and time to configure. And these are not the only traits that run counter to agility and speed:

  • Discovery & crawling. One of the great features of DAST is its ability to search and crawl almost every part of the application during scanning. By applying heuristic rules for rewriting URLs and following links, DAST tools can discover and crawl many subdomains and sections of a web application (a simplified sketch of this discovery phase follows this list). Another variant of the discovery process is proxying the traffic and collecting the endpoints to be scanned. But now that security is shifting left in the SDLC, these features can be considered a shortcoming because they take time. Luckily, over the years most DAST tools have added the option of specifying exactly which endpoints of the application to scan, in various formats.
  • Developers use DAST. Under the new *DevSecOps* paradigm, developers can use some of the tools present in the pipelines on their own, including security tools. The intent is to make the software development process more agile: if developers themselves can review the findings of a security analysis like DAST, the whole development should speed up. In theory. In practice, developers struggle to tell a false positive from a real finding, or they need more time to do it than a security expert would. This drastically reduces the tolerance of DevSecOps teams for false positives and undermines their trust in security testing.
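
To make the crawling point concrete, here is a deliberately simplified Python sketch of what the discovery phase boils down to. The start URL and the crawl limit are placeholders, and a real scanner does far more (forms, cookies, JavaScript rendering, URL rewriting).

```python
# Toy illustration of DAST-style discovery: a breadth-first crawl that
# collects same-host links by following anchors.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import requests

class LinkExtractor(HTMLParser):
    """Collect href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url: str, limit: int = 50) -> set[str]:
    host = urlparse(start_url).netloc
    seen, queue = set(), deque([start_url])
    while queue and len(seen) < limit:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        parser = LinkExtractor()
        parser.feed(resp.text)
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == host:  # stay on the same host
                queue.append(absolute)
    return seen
```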

Finally, the software itself has changed too. As stated at the beginning, APIs and microservices are the present and the future, and DAST is not well adapted to them:

  • DAST cannot discover dynamically generated content in a web front end, which is very common today due to the widespread use of JavaScript frameworks like Angular, React, Next.js or jQuery.
  • DAST is not able to detect some common kinds of API vulnerabilities, like IDOR/BOLA, because detecting them requires context about the application’s business logic, such as user roles and privileges (a minimal illustration follows this list).
  • DAST also struggles to get past certain protections, like anti-CSRF tokens, and several authentication/authorization mechanisms typical of APIs, like OAuth2, SSO and multi-factor authentication. Although it is possible to overcome some of these barriers, doing so almost certainly increases the time needed to prepare the scan, and every application needs its own configuration.
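
The IDOR/BOLA point becomes clear with a small sketch. The API URL, the token and the object ID below are invented for illustration; the key is that the test only works if you already know which user owns which object, which is exactly the business-logic context a generic DAST tool lacks.

```python
# Sketch of why IDOR/BOLA detection needs business-logic context.
import requests

API = "https://api.example.com"      # hypothetical API
TOKEN_USER_A = "eyJ...A"             # placeholder token for user A
ORDER_OWNED_BY_B = 4242              # an order we know belongs to user B

def check_bola() -> bool:
    """Return True if user A can read user B's order (a likely BOLA)."""
    resp = requests.get(
        f"{API}/orders/{ORDER_OWNED_BY_B}",
        headers={"Authorization": f"Bearer {TOKEN_USER_A}"},
        timeout=10,
    )
    # A generic scanner has no way of knowing that a 200 here is a finding,
    # because nothing in the response itself says "this order belongs to B".
    return resp.status_code == 200

if __name__ == "__main__":
    print("Possible BOLA" if check_bola() else "Access correctly denied")
```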


How to do DAST in 2023 and not die trying?

At this point, it is quite tempting to think that DAST is useless, but it is not. Many of the above shortcomings can be overcome by using DAST in some other ways:

  • Repurpose DAST to find the low-hanging fruit. Some vulnerabilities are easy and quick for any DAST tool to find and have a reasonably low false-positive ratio. Examples include insecure or missing HTTP headers, Cross-Site Scripting, and even some kinds of SQL injection.
  • Test for specific security requirements. If a catalogue of security requirements exists, DAST can run a very specific set of tests to verify those high-value requirements across many applications.
  • Create configuration templates beforehand. As stated before, the better the configuration, the less time a scan will take. It is a good idea to invest a bit of time in preparing configurations that can be reused to scan many applications with similar features or architecture. With a single proper configuration, execution time and false positives are considerably reduced in future scans.
  • Avoid full scans. Scanning the entire application can be very time-consuming, and each step in a CI/CD pipeline should take just a few seconds or minutes. Instead, narrow the scope of the scan to the changes recently made to the application.
  • Feed the DAST tool the exact API routes to scan. If the tool supports it (recommended), instruct the scanner to test only the API endpoints you care about, such as those that have changed. This makes it possible to build up full scan coverage gradually without slowing down the CI/CD process (see the sketch after this list).
  • Run DAST asynchronously. If the scan is launched in a CI/CD step and is going to take a while, a good option is simply to execute it without waiting for it to finish. Later, when it is completed, the corresponding team can review the findings and do the triage.
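
As a sketch of the last two ideas combined, the Python snippet below trims an OpenAPI spec down to the routes that changed and then launches the scan without blocking the pipeline. The scanner command (`dast-scanner`), its flags, and the file names are hypothetical placeholders, not a real tool.

```python
# Scan only the API routes that changed, and launch the scan asynchronously.
import json
import subprocess

def reduced_spec(spec_path: str, changed_paths: set[str], out_path: str) -> None:
    """Write a copy of the OpenAPI spec containing only the changed routes."""
    with open(spec_path) as f:
        spec = json.load(f)
    spec["paths"] = {p: item for p, item in spec["paths"].items() if p in changed_paths}
    with open(out_path, "w") as f:
        json.dump(spec, f, indent=2)

if __name__ == "__main__":
    changed = {"/orders", "/orders/{id}"}          # e.g. derived from the current diff
    reduced_spec("openapi.json", changed, "openapi.scan.json")
    # Fire and forget: the pipeline step returns immediately and the team
    # reviews the findings later (scanner name and flags are hypothetical).
    subprocess.Popen(
        ["dast-scanner", "--openapi", "openapi.scan.json", "--report", "report.json"]
    )
```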

Apart from this, DAST remains a really handy tool for any pentester, since it can fuzz a vast number of input parameters across many applications in no time using predefined rule sets, which allows it to find many types of vulnerabilities, like injections and misconfigurations. A short sketch of that kind of fuzzing follows.
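
Here is a minimal Python sketch of that fuzzing angle: hammer every known parameter with a list of canned payloads and flag anything that makes the server misbehave. The endpoints, parameter names and payloads are placeholders, and a 5xx response is only a crude signal that would still need manual review.

```python
# Toy parameter fuzzer: iterate over endpoints x parameters x payloads.
import requests

PAYLOADS = ["'", '"', "<svg/onload=alert(1)>", "../../etc/passwd", "{{7*7}}"]
ENDPOINTS = {
    "https://app.example.com/search": ["q", "page"],
    "https://app.example.com/profile": ["id"],
}

def fuzz() -> list[str]:
    suspicious = []
    for url, params in ENDPOINTS.items():
        for param in params:
            for payload in PAYLOADS:
                resp = requests.get(url, params={param: payload}, timeout=10)
                if resp.status_code >= 500:  # server mishandled the input
                    suspicious.append(
                        f"{url} param={param} payload={payload!r} -> {resp.status_code}"
                    )
    return suspicious

if __name__ == "__main__":
    for line in fuzz():
        print(line)
```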


What to look for in a DAST tool for API Security Testing

When evaluating DAST or similar tools for API Security Testing, it can be challenging to know which tool is the best choice, so below are some criteria that you can use:

  • Easy to integrate into a CI/CD pipeline.
  • Lets you choose what kind of application is being scanned: an API or a web application with a front end.
  • Supports several API specification formats to specify the exact API routes to scan: Postman collections, OpenAPI/Swagger, GraphQL introspection, WADL, WSDL, etc.
  • Lets you select the specific type of API being tested: REST, GraphQL, SOAP.
  • Provides the ability to define pre- and post-scan steps to fine-tune configurations for detecting business logic vulnerabilities.


To recap

The way software is developed has changed, and so has software itself. Agility and speed are now key features of any SDLC thanks to the advantages of CI/CD pipelines. APIs have become the core of any new piece of software, so delivering secure applications depends on the absence of vulnerabilities in the underlying APIs.

APIs must be secured very quickly, and although DAST is not well suited for that, it is possible to adjust the configuration of the scans and the way they are integrated into pipelines to scan APIs better and faster.

AI: A New Hope

This post cannot end without mentioning Artificial Intelligence. The truth is that any security solution could leverage AI to close many of DAST’s gaps: improving discovery, crawling and URL rewriting, avoiding duplicated or repetitive requests, decreasing false positives, detecting complex business logic vulnerabilities…

Perhaps AI will become the fire that revives the DAST phoenix. What do you think?

Ernesto Rubio, Cybersecurity Analyst

