SARA v1.0

About ℹ️ • Detailed Description 🔬 • Installation 🛠️ • Versions 📦

About SARA

SARA (Security Assistant Researcher Analyzer) is an asynchronous web crawler with enumeration capabilities.

The primary goal of the tool is to crawl target addresses, analyze source code and JavaScript files, and search for keywords, comments, and inline scripts. It also supports custom headers for flexibility.

A simple animation indicates that the program is running: the message I’m researching gains an additional . every second to show activity.

Key Features:

  • Crawling and Analysis: SARA scans pages, extracts valuable information such as links and JavaScript files, and analyzes their content.
  • Directory and Subdomain Enumeration: Well suited for discovering specific endpoints such as /api or /graphql, as well as for enumerating subdomains.
  • Simulated User Behavior:
    • The User-Agent header is randomly chosen from a preset list to mimic real browsers (e.g., Chrome, Firefox) and operating systems (e.g., Macintosh, Linux, Windows).
    • Users can specify their own User-Agent, overriding the default one.
    • The tool introduces a delay between requests (1–5 seconds) to simulate natural user behavior, minimizing noise and suspicion (see the example after this list).
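
For example, a plain crawl already benefits from the randomized User-Agent and the 1–5 second delays, and the default User-Agent can be overridden with a custom header. The invocations below are hypothetical sketches (placeholder target, and the exact argument placement may differ); run the tool with -h for the authoritative syntax:

    # default behavior: random User-Agent and 1–5 s delays, nothing extra to configure
    python3 sara.py -t https://example.com -c

    # hypothetical: override the default User-Agent via -H (header string format assumed; see the -H flag below)
    python3 sara.py -t https://example.com -c -H "User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0"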

Ideal For:

  • Conducting penetration tests.
  • Bug bounty programs that require deep automation with minimal detection and noise.

Example Output

Here is an example of the --help command output:

[Screenshot: SARA --help command output]

Detailed Description

SARA provides extensive functionality through its flags and modes. Below is a detailed description of each, followed by example invocations:
  1. -t
     Accepts a single URL, a domain, or a file containing multiple targets. Works in conjunction with the three primary modes: -c, --enum-s, and --enum-d.

  2. -c (Crawling Mode)
     Accepts either a single URL or a file containing multiple URLs.

     In crawling mode, SARA gathers key data about the target, including:

     • HTTP response codes
     • Links on the target web page
     • HTTP response headers and their analysis
     • JavaScript files and their analysis
     • Inline scripts
     • Keywords and comments

  3. --enum-d (Directory Enumeration)
     Performs directory enumeration. Since the tool intentionally uses slow request rates (1–5 seconds per request), it is recommended for highly specific endpoint enumeration, such as APIs.

     • Requires a target with a full URL, including the protocol (e.g., https://example.com).
     • Accepts one target at a time.
     • Can use a default wordlist or a custom file for directories.
     • Default wordlist: /admin, /login, /dashboard, /config, /api, /robots.txt, /sitemap.xml, /env, /private, /uploads, /tmp, /health, /metrics, /status, /graphql, /graphiql

  4. --enum-s (Subdomain Enumeration)
     Performs subdomain enumeration. Unlike --enum-d, this mode accepts only a domain as input (e.g., example.com).

     • By default, it enumerates subdomains over the HTTPS protocol.
     • Accepts one target at a time.
     • Can use a default wordlist or a custom file for subdomains.
     • Default wordlist: dev, test, staging, qa, admin, dashboard, api, auth, mail, ftp, vpn, status

  5. --http
     Works with --enum-s to enable subdomain enumeration over HTTP instead of HTTPS.

  6. -H
     Adds custom HTTP headers. If provided, the custom User-Agent header will replace the default.

     • Accepts either a string or a file containing headers.

  7. -o
     Saves the output to a user-specified file.

     • Output is also printed to the terminal.
     • JSON is the recommended format for easier post-processing.

  8. -kw
     Adds custom keywords for analysis during crawling.

     • Accepts either a string or a file containing keywords.

  9. -wjs
     Disables JavaScript file analysis during crawling.

  10. -wha
     Disables HTTP header analysis during crawling.

  11. -h (Help)
     Displays the manual, including command examples for easier usage.
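
Hypothetical crawling examples (targets.txt, headers.txt, keywords.txt, and results.json are placeholder names; the exact argument placement, for instance whether the target goes to -t or directly to -c, may differ from these sketches, so run the tool with -h for the authoritative examples):

    # crawl multiple targets listed in a file
    python3 sara.py -t targets.txt -c

    # crawl with custom headers and keywords, saving results to a JSON file
    python3 sara.py -t https://example.com -c -H headers.txt -kw keywords.txt -o results.json

    # crawl without JavaScript-file analysis and without HTTP-header analysis
    python3 sara.py -t https://example.com -c -wjs -wha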
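
Hypothetical enumeration examples (exact syntax may differ here as well; --enum-d expects a full URL including the protocol, while --enum-s expects a bare domain):

    # enumerate specific endpoints on one target using the default directory wordlist
    python3 sara.py -t https://example.com --enum-d

    # enumerate subdomains over HTTPS (default), or over HTTP with --http
    python3 sara.py -t example.com --enum-s
    python3 sara.py -t example.com --enum-s --http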

Installation Instructions

Follow these steps to set up and use SARA:

  1. Clone the Repository

     git clone https://github.com/Kode-n-Rolla/sara.git

  2. Navigate to the Source Directory

     cd sara/src

  3. Install Dependencies

     pip install -r requirements.txt

     Alternatively, if you encounter an error like × This environment is externally managed, use:

     pip install -r requirements.txt --break-system-packages

  4. Run the Tool
     Execute the script with python3 to see available options:

     python3 sara.py -h

  5. Optional: Create a System-wide Command Shortcut
     To make running SARA easier, you can create a symbolic link (it is recommended to keep the repository in /opt or your usual tools directory first; see the note after these steps):

     sudo ln -s "$(pwd)/sara.py" /usr/local/bin/sara

     Now you can run the tool from anywhere using:

     sara --help
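
Note: for the bare sara command to work through the symlink, the script generally needs to be executable and to begin with a python3 shebang. This is an assumption about sara.py rather than something documented here, so if it does not hold, keep invoking the script via python3. A minimal sketch:

    # assumes sara.py starts with a line like '#!/usr/bin/env python3'
    chmod +x sara.py
    sara --help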

Versions

Version 1.0

Key Features:
  • -t, -c: Crawling and analysis
  • --enum-d, --enum-s (--http): Enumeration modes
  • -H, -kw: Custom headers and keywords
  • -wha, -wjs: Disable header or JS analysis
  • -o: Save results to a file
  • --help: Comprehensive manual with examples

Release Notes: Initial release with core functionality, including crawling, enumeration, and customizable features for penetration testing and bug bounty tasks.

From Pentester and Bug Hunter to Pentesters and Bug Hunters with love ❤️