25cbfe064306473f81efc9c575d164d23e365422,w3af/plugins/crawl/robots_txt.py,robots_txt,crawl,#robots_txt#Any#,42

Before Change


        dirs.append(robots_url)
        for line in http_response.get_body().split("\n"):

            line = line.strip()

            if len(line) > 0 and line[0] != "//" and \
                (line.upper().find("ALLOW") == 0 or
                 line.upper().find("DISALLOW") == 0):
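The Before Change excerpt above filters robots.txt lines down to Allow/Disallow directives. A minimal runnable sketch of that filtering logic (the helper name `extract_robots_paths` is hypothetical, not part of the w3af plugin):

```python
def extract_robots_paths(body):
    """Return the paths listed in Allow/Disallow directives of a robots.txt body."""
    paths = []
    for line in body.split("\n"):
        line = line.strip()
        # Skip blanks and "#" comments; keep only Allow/Disallow directives,
        # mirroring the condition in the Before Change snippet
        if len(line) > 0 and line[0] != "#" and \
                (line.upper().find("ALLOW") == 0 or
                 line.upper().find("DISALLOW") == 0):
            # Take everything after the first ":" as the path
            parts = line.split(":", 1)
            if len(parts) == 2 and parts[1].strip():
                paths.append(parts[1].strip())
    return paths
```

For example, a body of `"User-agent: *\nDisallow: /admin\n# note\nAllow: /public"` yields `["/admin", "/public"]`; the `User-agent` line and the comment are skipped.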

After Change


            return

        # Send the new knowledge to the core!
        self.worker_pool.map(self.http_get_and_parse, urls)

        # Save it to the kb!
        desc = ("A robots.txt file was found at: \"%s\", this file might"
                " expose private URLs and requires a manual review. The"
In pattern: SUPERPATTERN

Frequency: 3

Non-data size: 3

Instances


Project Name: andresriancho/w3af
Commit Name: 25cbfe064306473f81efc9c575d164d23e365422
Time: 2018-04-13
Author: self@andresriancho.com
File Name: w3af/plugins/crawl/robots_txt.py
Class Name: robots_txt
Method Name: crawl


Project Name: okfn-brasil/serenata-de-amor
Commit Name: 17d1f7fed5e55519bc3a25a70b3dc655a72170f3
Time: 2016-11-09
Author: cuducos@gmail.com
File Name: jarbas/core/management/commands/loadsuppliers.py
Class Name: Command
Method Name: to_date


Project Name: sobhe/hazm
Commit Name: 1fe3285cad930eae051be2027a793b51720a3d4a
Time: 2013-11-26
Author: alireza.nournia@gmail.com
File Name: hazm/Lemmatizer.py
Class Name: Lemmatizer
Method Name: __init__