AutoLogon to a Website and Send a Request, Defeating the CSRF Token
Recently at work there was a test case where we needed to send a request to an internal Web App every so often to refresh the internal LDAP database. This was already implemented and required a button to be pressed after logging in to the Web App. Now you can either stay up for a few nights and press that button manually, or do something to automate it. I chose to automate it :)
Python is great for this kind of thing, providing modules like requests and BeautifulSoup for sending web requests and scraping websites.
Here is how this was done:
- Figure out the login URL and the parameters required to log in. This was easy: just capture the request in Burp or any other intercepting proxy.
- Figure out the URL that the refresh request (a POST) goes to, along with its parameters. This is also easy: capture the request in Burp and note all the required parameters. There may be hidden parameters that never show up in the address bar, which is why it is important to record all of them from the intercepted request.
- Many sites issue a CSRF token when we send requests to them, to protect against Cross-Site Request Forgery. We need to send the same token back to prove we are the actual client that requested the page. This is where BeautifulSoup shines: we can scrape the HTML that comes back when we request the login page and submit that same [csrf] token in the POST request that carries the credentials (see the first sketch after this list).
- Once authenticated, which can be checked by searching the response to that POST for some specific text ('Logout' in our case), we can send the actual refresh request with the proper parameters filled in (second sketch below). Validate on the Web App that the request did the job it was supposed to; if yes, we are good.
- One thing to note here is that some web servers reject requests sent without proper headers. This is because requests like these usually come from automated scripts or bots, which website owners often don't want to allow for various reasons. To avoid this we can just add a browser-like User-Agent header, as in the last snippet below.
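Putting the first three steps together, here is a minimal sketch. The URL, credentials, and the token field name `csrf_token` are all placeholders; substitute whatever your Burp capture shows for the real application.

```python
import requests
from bs4 import BeautifulSoup

# Placeholder values -- use the URL and field names captured in Burp
LOGIN_URL = "https://intranet.example.com/login"

session = requests.Session()  # a Session keeps cookies across requests

# GET the login page and scrape the CSRF token out of its HTML
resp = session.get(LOGIN_URL)
soup = BeautifulSoup(resp.text, "html.parser")
csrf_token = soup.find("input", {"name": "csrf_token"})["value"]

# POST the credentials along with the token we just scraped
resp = session.post(LOGIN_URL, data={
    "username": "svc_account",  # placeholder credentials
    "password": "s3cret",
    "csrf_token": csrf_token,
})
```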
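Continuing from the sketch above, checking the login and firing the refresh request might look like this. `REFRESH_URL` and its parameters are again invented for illustration; note that some apps rotate the CSRF token on every page, in which case it has to be re-scraped from the post-login page before this POST.

```python
# Crude login check: 'Logout' only appears on authenticated pages
if "Logout" not in resp.text:
    raise SystemExit("Login failed -- recheck the captured parameters")

# The actual refresh request (URL and parameters are illustrative)
REFRESH_URL = "https://intranet.example.com/ldap/refresh"
refresh = session.post(REFRESH_URL, data={
    "action": "refresh",
    "csrf_token": csrf_token,  # re-scrape first if tokens rotate per page
})
refresh.raise_for_status()  # raise on HTTP-level errors
print("Refresh sent, status:", refresh.status_code)
```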
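And for the last point, a browser-like User-Agent can be set once on the session, ideally before the very first request in the sketch above, so that every request carries it:

```python
# Some servers reject clients with a missing or odd User-Agent,
# so present ourselves as an ordinary browser
session.headers.update({
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                  "AppleWebKit/537.36 (KHTML, like Gecko) "
                  "Chrome/120.0.0.0 Safari/537.36",
})
```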