This blog is NOT the OFFICIAL website of Kali Linux. We just share tutorials to learn cybersecurity.

Parsero -- Scan for Vulnerability


The world of cybersecurity is thrilling, where every click, tap, and byte counts. Today we are going to learn the basics with a nifty tool called Parsero on our Kali Linux system.

Parsero is like a digital bloodhound with a mission to sniff out vulnerabilities in websites. It's basically our cyber detective buddy, equipped with the skills to uncover hidden threats lurking in the depths.

Parsero Scan for Vulnerability on Kali Linux
Now let's get our hands dirty and dive into the action.

First of all, we need to have the Parsero tool on our system. Don't worry, it comes pre-installed with the full version of Kali Linux; if not, we can simply install it by using the following command in our Kali Linux terminal:

sudo apt install parsero -y

The command will prompt for our sudo password, and the tool will be installed within a few seconds.

Before using Parsero on our Kali Linux system, let's check the options of this tool by using the following command:

parsero -h

The above command shows the help of the Parsero tool, as we can see in the following screenshot.

Parsero help options on Kali Linux

Let's run it against a target. Lord Google can be an example, just for scanning purposes; we are not really attacking the Lord of the surface internet. We should never attack any website without proper written legal permission; we can create our own vulnerable site for that instead. So the command will be as follows:

parsero -u https://www.google.com

We can see the result of the above command in the following screenshot:

parsero performing against a target

In the above screenshot we can see that Parsero is performing well and finding some directories.

Parsero is actually a Python script that reads the robots.txt of a website and looks at the Disallow entries. The Disallow entries tell search engines which directories or files hosted on a web server must not be indexed. For example, "Disallow: /portal/login" means that the content at www.example.com/portal/login is not allowed to be indexed by crawlers like Google, Bing, Yahoo, etc. This is how an administrator can avoid sharing sensitive or private information with search engines.
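To make that idea concrete, here is a minimal Python sketch of the parsing step (this is our own illustration, not Parsero's actual code): it pulls the paths out of the Disallow lines of a robots.txt body.

```python
# Illustrative sketch of the robots.txt parsing idea behind Parsero.
# Not Parsero's actual implementation.

def extract_disallows(robots_txt: str) -> list:
    """Return the paths listed in Disallow entries of a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:  # a bare "Disallow:" means "allow everything"
                paths.append(path)
    return paths

sample = """User-agent: *
Disallow: /portal/login
Disallow: /admin/   # private area
Disallow:
"""
print(extract_disallows(sample))  # ['/portal/login', '/admin/']
```

Parsero then takes each of these paths and probes it over HTTP, which is where the status codes below come in.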

But sometimes the paths typed in the Disallow entries are directly accessible by users without a search engine, just by visiting the URL and path, and sometimes they are not available to anybody. Because it is really common for administrators to write a lot of Disallow entries, some available and some not, we can use Parsero to check the HTTP status code of each Disallow entry and automatically determine whether these directories are accessible.

Also, the fact that the administrator writes a robots.txt doesn't mean that the files or directories listed in the Disallow entries will not be indexed by Bing, Google, Yahoo, etc. For this reason, Parsero is capable of searching Bing to locate content indexed without the web administrator's authorization. Parsero checks the HTTP status code of each Bing result in the same way.

We can see a lot of lines in Parsero's results with different HTTP status codes, which indicate the following:

  1. 200 OK: The request has succeeded.
  2. 403 Forbidden: The server understood the request, but is refusing to fulfill it.
  3. 404 Not Found: The server hasn't found anything matching the Request-URI.
  4. 302 Found: The requested resource resides temporarily under a different URI (Uniform Resource Identifier).
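As a rough illustration of what Parsero reports per Disallow entry, here is a small Python sketch (the helper name and messages are our own, not Parsero's): it joins a Disallow path onto the base URL and maps the status code to one of the meanings above.

```python
# Hypothetical helper; sketches how a status code maps to the
# availability of each Disallow entry, as in the list above.
from urllib.parse import urljoin

STATUS_MEANINGS = {
    200: "OK - the resource is directly accessible",
    302: "Found - redirects to another URI",
    403: "Forbidden - the server refuses to fulfill the request",
    404: "Not Found - nothing matches this path",
}

def classify(base_url: str, disallow_path: str, status: int) -> str:
    """Build the full URL for a Disallow entry and describe its status code."""
    url = urljoin(base_url, disallow_path)
    meaning = STATUS_MEANINGS.get(status, "other status")
    return f"{url} -> HTTP {status} ({meaning})"

print(classify("https://www.example.com", "/portal/login", 200))
# https://www.example.com/portal/login -> HTTP 200 (OK - the resource is directly accessible)
```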

If we want to see only the "HTTP 200" status codes, then we have to use the -o flag, just like the following:

parsero -o -u https://www.google.com

In the following screenshot we can see only the "HTTP 200" status codes.

parsero http 200 status codes only

Also, if we have a list of domains to run Parsero against, we can note those websites down in a text file, one per line, just like the following screenshot:

Parsero target list
If we have other targets, we can add them in the same way. Now we can scan the list with Parsero. Before that, we need to specify our list of websites, named 'targets.txt' and stored on our Desktop; we also want to see only "HTTP 200" status codes. So our command will be the following:
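For reference, a targets.txt like the one in the screenshot is just a plain text file with one target per line (the domains here are placeholders, not real targets):

```
https://www.example.com
https://www.example.org
https://www.example.net
```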

parsero -o -f ~/Desktop/targets.txt

After running the above command, Parsero will start scanning the websites given in the list, as we can see in the following screenshot.

Parsero on multiple targets
Once Parsero completes its scan, it'll spit out a detailed report highlighting any potential vulnerabilities it found. We need to pay close attention to these findings, as they give us valuable insights into how secure (or not-so-secure) the website is.

And there we have it, folks! We've just dipped our toes into the world of cybersecurity with Parsero on Kali Linux. But remember, this is just the beginning. The cyber realm is vast and ever-evolving, so we need to stay curious, keep learning, and never underestimate the power of a good cyber tool in our arsenal. Happy hunting, and may the digital winds be ever in our favor!

Love our article? Make sure to follow us on Twitter and GitHub, where we post article updates. To join our KaliLinuxIn family, join our Telegram Group and WhatsApp Channel; we are trying to build a community for Linux and cybersecurity. We are always happy to help everyone in the comment section, which is open to everyone. We read each and every comment, and we always reply.
