Review: Burp Suite Certified Practitioner (Part 3 Final)

Wednesday, January 5, 2022

Failure is hard to swallow. After failing my first attempt at the Burp Suite Certified Practitioner exam, I decided to try the certification exam again... and again... and again.

If you've not already read part one or part two of my review, please do so before continuing. This is a direct continuation of the previous articles.

Attempts Summary

$9 per attempt at the Burp Suite Certified Practitioner exam was a hard deal to pass up, especially when there was so much to learn about the process and share with others, regardless of whether or not I ever passed it. So after my initial failure, I decided to purchase another attempt and take a more serious approach. I didn't study nearly as hard as I should have the first time and really thought that my years of experience in the field would be enough. As outlined in part two of this series of articles, this was not the case. My real world experience was essentially useless in the exam environment. Those looking to attempt the certification exam should expect the same experience if they do the kind of work that I do. Again, see the previous articles for details.

Before my second attempt, I went through every apprentice and practitioner level lab in the Web Security Academy and made honest attempts to complete them with no help from the solutions. This took a long time, but given that discovery during the exam is all about linking what you see in the labs to behaviors in the target applications, this rote memorization was required. Once I had a comfortable level of familiarity with all of the labs, I made my second attempt. I failed the exam again. This time I got a little further by completing two of the challenges, the first stage of each application.

Once again frustrated, but still with resolve and feeling like I could do better, I adjusted my approach to prepare for a third attempt. Even though Portswigger responded to my initial review and extended the exam time limit to four hours, I still felt like I was too slow. I needed a faster way to approach the challenges. I proceeded to create an index of the labs as a way to quickly identify potential vulnerabilities and reference exploitation approaches. I'll expand on the idea of indexing below. With the index in hand, I made my third attempt. I failed the exam again. However, this time was different. Timing wasn't an issue. I powered through the first application in approximately 45 minutes and completed the first stage of the second application in about 15 minutes. I then proceeded to stare at the second challenge of the second application for three hours. Yes, three hours with zero progress. I was so desperate and had so much time that I literally went through all of the labs again TWICE and tried every automated tool I could think of just for the heck of it. I reached out to support and a point of contact within Portswigger to verify that the challenge I received was solvable. Support confirmed that it was, but I had my doubts.

Again frustrated, but encouraged by my progress and knowing that the tests were randomized, I decided to make a fourth attempt and hope that I didn't receive the same problematic challenge as the previous attempt. I failed the exam again. The EXACT same way. I powered through the first application and the first stage of the second application in about an hour, then stared at that same challenge for another three hours. It was at this point that I gave up.

I had never put so much effort into something that ultimately resulted in failure. Sure, I've experienced a lot of failure in life. Success rarely comes without failure. But in all previous circumstances, with continued effort and resolve, I had always eventually succeeded. This was a real taste of failure, and my first experience with facing something that I simply could not do no matter how hard I tried. I was done. Perhaps my biggest concern was not knowing what I missed. At the time of this writing, I still have no idea what I missed on all those attempts. I don't know if what I missed represents something that I'll never find in the assessments I conduct, if the challenges were bugged, or if they were simply gamified in such a way that my brain had trouble reasoning with them. I don't know. And that's scary to me.

After a couple of days of getting used to the feeling of failure, I received a message from Portswigger thanking me for helping to identify an issue with the exam and offering me a free attempt. What?! What issue? I didn't report anything. Well, as it turns out, the email I sent to support after my third attempt, or perhaps the message I sent to my Portswigger contact, led someone to discover that something wasn't right. Portswigger fixed the issue and offered those affected by it another attempt at the exam. I'll speak more to this in a bit, but the bottom line was, I had another attempt to make.

I made my fifth attempt and failed again. This time my heart just wasn't in it. I had become comfortable with the fact that I was never going to pass it. About an hour into the exam, I got tired of it, walked away to play with my kids, and never went back.

Certification Maturity

As I mentioned above, Portswigger extended the certification exam time limit to four hours after my second attempt. I don't know whether my review had anything to do with it or if it was just a coincidence, but regardless, they felt it was necessary to extend the time limit and made the adjustment.

Also as mentioned above, after being notified of a potential issue, Portswigger looked at the system, found a problem, fixed the problem, and attempted to make up for losses to their customers. At the time, I was curious how many of my attempts had been impeded by the issue, because I was only offered one additional attempt even though I had failed multiple times at that point. Regardless, Portswigger didn't have to acknowledge or offer anything in response to the issue, so I was content with how they handled it. Portswigger later clarified with me that the issue they fixed was not the issue that led to my multiple failures. In fact, the issue may not have been responsible for any of my failures. They could only determine that the issue existed for one of my attempts, and offered a replacement voucher for that attempt. I certainly appreciate this additional information, and recommend that Portswigger include something to this effect in the original email they send out in response to identified problems. It would remove any uncertainty or reason for distrust.

Note: The above paragraph was updated based on new information provided by Portswigger after the original article was published.

Web application security is a dynamic, ever-changing field. So any certification within it is going to be dynamic and ever-changing as well if it wants to remain relevant. Between my third and fourth attempts at the exam, Portswigger released an entire block of content on file upload vulnerabilities to the Web Security Academy. Since the exam is tied directly to the academy content and labs, I was concerned that I would have to go back and do for file uploads everything I had done for the rest of the academy up to that point. I reached out to Portswigger and was told that there is a "couple week" grace period between when content shows up in the Web Security Academy and when it becomes eligible for the exam. This isn't documented anywhere, and it should be. If you've read my previous articles in this review, then you know that this is not the first case of something needing clarification within the documentation. The good thing is that Portswigger has taken much of this into account and has expanded the documentation to include many of the various items I've pointed out.

All of this goes to show a couple of things. First, that Portswigger is responsive to the community. This is huge. In a day and age where everyone is capitalizing on money grabs while paying very little attention to end users and customers, Portswigger chooses to respond. And that's awesome. Second, it shows that the entire certification process is still relatively untested, pun intended. The Burp Suite Certified Practitioner certification is still very much in its infancy. Call it a beta level product if you will. I'll expand on this a little more in my final thoughts.

Indexing

I mentioned the concept of indexing in my attempts summary above. For those of you that have taken SANS courses and certifications, instructors will tell you the best way to prepare for the exams is to "index the books," because there is not enough time during a SANS certification exam to search the books for answers. Indexing a SANS book consists of identifying key words and ideas from the book and putting them in a spreadsheet, ordered alphabetically, along with the page number. The idea is that when you see the word or idea in an exam question, you can quickly find where it is in the book and look up the answer. This works really well for traditional open book tests where the quantity of material is simply too much to commit to memory or search manually.

The Burp Suite Certified Practitioner exam is open book, and the quantity of material is too much to commit to memory or search manually. Sounds like a good candidate for indexing, right? The problem is, it isn't a traditional exam. However, what if we treat it like a traditional exam? That's exactly what I did. I treated the Web Security Academy as a book. As I went through the labs, I noted the application, page, behavior, resource, keywords, client-side code snippets, and solution payloads for each vulnerability category and lab. I then organized each vulnerability category by stage. Given that the exam is structured so that some vulnerabilities cannot exist in some stages for one reason or another (see the full exam documentation to understand why), it was quite handy knowing which vulnerability categories were valid at each stage based on what had been accomplished in the previous stage. This organized list of vulnerabilities became a table of contents for my index.

During the exam, I would look at the vulnerabilities that were valid at the current stage given the conditions and focus in on areas of interest that made sense for the valid vulnerabilities. This pretty much ALWAYS identified the vulnerability. Portswigger actually makes this pretty obvious, so nothing major was gained at this point. The next step was to use the index to look up what I was seeing in the targeted portion of the application. This often led me straight to a lab or labs that the challenge was based on. Then it was a matter of figuring out the little extra that Portswigger put into the exam challenge that made it different from the lab.
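To make the idea a little more concrete, here is a minimal sketch, in Python, of the kind of structure and lookup I'm describing. My actual index wasn't code, and every field, category, keyword, and payload below is a hypothetical placeholder, but the shape is the same: entries grouped by vulnerability category and stage, keyed on the behaviors you'd notice in the target.

```python
from dataclasses import dataclass

@dataclass
class IndexEntry:
    """One row of the index: an observable lab behavior mapped back to its solution."""
    category: str   # vulnerability category, e.g. "DOM XSS" (hypothetical example)
    stages: set     # exam stages where this category is plausible, e.g. {1} for the foothold
    keywords: set   # behaviors, resources, and code snippets noted while doing the lab
    lab: str        # the Web Security Academy lab the entry came from
    payload: str    # solution payload or approach to reference quickly

# A couple of hypothetical entries for illustration; a real index covers every lab.
INDEX = [
    IndexEntry("DOM XSS", {1}, {"postMessage", "addEventListener", "innerHTML"},
               "DOM XSS using web messages", "<iframe> posting a crafted web message"),
    IndexEntry("SQL injection", {2}, {"TrackingId cookie", "error on single quote"},
               "Blind SQLi with conditional responses", "' AND (SELECT ...)--"),
]

def lookup(stage, observations):
    """Return entries valid at this stage whose keywords match anything observed."""
    observations = set(observations)
    return [entry for entry in INDEX
            if stage in entry.stages and entry.keywords & observations]

# Example: the target's client-side code uses postMessage and we're on stage 1.
for entry in lookup(1, {"postMessage"}):
    print(f"{entry.category}: see '{entry.lab}' -> {entry.payload}")
```

The value is less in any tooling and more in the discipline: filter by what is possible at the current stage, then match what you actually see against what you saw in the labs.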

It should be noted that once I had the index, time was no longer an issue.

Pass Requirement

While it probably should have surfaced earlier in this review, the biggest takeaway from all of my attempts is what is required to pass the exam. The completion requirement feels incredibly unfair. There is absolutely zero margin for error. It is 100% success or fail. You have to be perfect. You cannot miss anything. Period. I've tried finding something that compares to this in the industry and I can't. It's unprecedented, and probably for good reason. It's a bad idea.

I think most people in Information Security would agree that Offensive Security has been a benchmark for what a good technical certification should require. Their certifications are hands on, challenging, and require practical skill and knowledge to succeed. Look at OSCP for example. It has had the respect of the community for a very long time and has been referred to by many as the standard bearer for hands on certifications. As tough as OSCP is, it doesn't require 100% completion. It requires a high level of completion, but not 100%. Yet it remains a great indicator for whether or not someone is capable of doing the job that the certification says they can.

No one is perfect. We all have our strengths and weaknesses. This is no different when we begin breaking things down into specific technical items within a field of study, like application security. I may have a knack for sniffing out authorization or authentication issues while another person has a knack for finding business logic flaws. It's why consultancies will rotate consultants for the same client, and why software companies will rotate consultancies for the same application. A different set of eyes brings a fresh set of skills and strengths to uncover things that previous tests didn't.

As mentioned in my previous articles, the Burp Suite Certified Practitioner certification exam contains two applications with three random challenges each, and every challenge has to be completed, in order, to pass. The test is linear. Therefore, if one of the challenges happens to be based on a technical area where you are weak, then you're done. There is no progressing. There is no hope of passing. Every moment you spend searching for the answer not only takes time away from the current challenge, but also from the remaining challenges should you somehow complete the current one. This causes anxiety, and there comes a point where even if you did solve the challenge, there would be no hope of completing the exam. You can't skip that challenge and move to another while your subconscious grinds away at the previous one. Your only option is to take the exam again and hope you win the challenge lottery.

This is a really poor approach to certification. Does it certify that one person is more qualified than another? Or does it certify that one person got a lucky set of challenges that aligned with their strengths and another didn't? It could be either, right? That's my point. You can't tell by the result of the exam alone. You'd have to know more than the results. And that makes the result of the exam meaningless, which makes the certification meaningless.

I understand that excessive weakness should prevent someone from being certified, but at what level? Everyone has some weakness. That's why there are variable grading and rating systems. The question is at what level the acceptable amount of weakness or strength has been validated. 100% perfection is an unrealistic metric for this and turns the exam into a game of playing the randomized challenge lottery. Perhaps that's good for selling additional certification attempts, but it's a terrible and frustrating experience for the customer.

Feedback

If anyone from Portswigger reads this, here is my official feedback on how I believe the certification process and overall value of the certification could be improved.

First, rather than require 100% completion of everything, add a third app and either require two to be completed, or perhaps require a percentage of the provided challenges across all three applications to be completed. That way, there is a tolerance for imperfection that better relates to the human condition.

Second, change the name of the certification. I feel strongly about this. The title "Burp Suite Certified Practitioner" would lead one to believe that the certification validates one's ability to use Burp Suite. I have not talked to a single person that has prepared for, attempted, and/or passed the exam that believes this title accurately describes the certification. While Burp Suite makes the labs and exams easier, they can be completed without it using freely available alternatives. Using the provided built-in word list because it includes the required password for the account does not validate the need for Burp Suite. Based on all of the labs, practice exam challenges, and certification exam challenges that I completed during the process, plus years of using, researching, and teaching people about Burp Suite Pro, nothing in the certification process required anything beyond what I would consider very basic apprentice level usage. As someone that has led teams at several security consultancies, if someone only knew of Burp Suite what was required for the labs and exams, then I would have very little confidence in their ability to test web applications. The process simply does not certify what it says it does. I've said it over and over again, and most who have experienced it would agree: this certification is about memorizing and applying a large number of exploitation techniques and combining them to solve a puzzle. It's a CTF, as indicated by the final answer of each application being a secret from a static file on the file system. Nothing in the title indicates this. This is something that I believe the community will self-police if Portswigger does not, or the certification will lose its relevance.

Final Thoughts

After three articles and two months of my professional life, the bottom line is this. The Burp Suite Certified Practitioner certification is not without issue, and until the community has determined that it is a technically rock solid solution that truly validates the real world skills it claims to certify, it should be approached with caution as a metric for any kind of policy or qualification. Otherwise, it's an incredibly well put together and challenging technical puzzle that will require you to learn a large number of exploitation techniques across the majority of web application vulnerability categories. While I do not walk away from the experience certified, I don't feel as if I lost anything. I gained a lot of new exploitation knowledge, I learned a bit about myself, and I have shared all of that with you.
