Parsons Software Security Consulting Blog

How to find Robots.txt with O2


We already discussed a script to find the crossdomain.xml file with O2. Today we are going to talk about how to find the robots.txt file. Many websites have a robots.txt file, and these files sometimes contain sensitive information. Today we are going to write a script that searches Google for these files.



Below is a sample robots.txt from a sample web application.
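For example, a hypothetical robots.txt might look like this (the paths are illustrative, not taken from any real application):

```
User-agent: *
Disallow: /admin/
Disallow: /backup/
Disallow: /internal/reports/
```

Entries like these tell crawlers what to skip, which is exactly why they can point an attacker at interesting paths.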







// using O2.XRules.Database.Utils.O2

var ie = panel.clear().add_IE().silent(true);
ie.open("http://www.google.com");
ie.field("Search").value("inurl:robots.txt filetype:txt");
ie.button("Google Search").click();

var targetUrls = new List<string>();
foreach(var link in ie.links().urls())
    if (link.ends("robots.txt"))
        targetUrls.Add(link);

return targetUrls;
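The O2 script above collects candidate URLs. As a minimal sketch outside of O2 (the function names here are my own, not part of any library), you could then fetch each robots.txt with the Python standard library and pull out its Disallow entries:

```python
from urllib.request import urlopen
from urllib.error import URLError

def parse_disallows(robots_txt):
    """Return the Disallow paths listed in a robots.txt body."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

def disallowed_paths(base_url):
    """Fetch base_url's robots.txt and return its Disallow paths (empty on failure)."""
    try:
        with urlopen(base_url.rstrip("/") + "/robots.txt", timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except (URLError, OSError):
        return []
    return parse_disallows(body)
```

This is only a sketch of the idea; O2 handles the browser automation and link scraping for you, while this version assumes you already have a list of target hosts.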





I don’t plan on showing you how to exploit robots.txt, but the O2 script above is a simple way to find robots.txt files out in the wild.


Parsons Software Security Consulting, LLC

Securing the Internet one Application at a time.


mparsons [at]


Written by mparsons1980

December 8, 2010 at 11:07 pm

