A bash script to spider a site, follow links, and fetch urls (with built-in filtering) into a generated text file.
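The spidering idea can be approximated with standard tools. Below is a minimal sketch, assuming wget is available; the depth, regex filter, and output file name are illustrative stand-ins for the script's built-in filtering, not taken from the actual project.

```bash
#!/usr/bin/env bash
# Minimal sketch: spider a site, follow links, and collect the visited URLs
# into a text file. Depth, filter, and output name are assumptions.
site="${1:?usage: spider.sh <url> [filter-regex]}"
filter="${2:-.}"          # optional regex to keep only matching URLs
outfile="urls.txt"

# wget in spider mode follows links recursively without saving pages;
# its log (on stderr) contains every URL it visits.
wget --spider --recursive --level=2 --no-verbose "$site" 2>&1 \
  | grep -oE 'https?://[^ ]+' \
  | grep -E "$filter" \
  | sort -u > "$outfile"

echo "Wrote $(wc -l < "$outfile") URLs to $outfile"
```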
🔗 GitHub Action to extract and check URLs in code and documentation.
Clean a series of links, resolving redirects and finding Wayback results if a page is gone. Originally written to aid with importing from ArchiveBox.
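A rough sketch of that redirect-resolution and Wayback-fallback flow, using curl and the public Wayback availability API; the input file links.txt and the output format are assumptions, not the tool's actual interface.

```bash
#!/usr/bin/env bash
# Sketch: follow redirects to each link's final URL and, if the page is gone
# (4xx/5xx), look up the closest Wayback Machine snapshot.
while read -r url; do
  # Follow redirects and report the final URL plus its status code.
  read -r final code < <(curl -sIL -o /dev/null \
      -w '%{url_effective} %{http_code}\n' "$url")
  if [[ "$code" =~ ^[23] ]]; then
    echo "$final"
  else
    # Page is gone: ask the Wayback availability API for the closest snapshot.
    snap=$(curl -s "https://archive.org/wayback/available?url=${url}" \
      | jq -r '.archived_snapshots.closest.url // empty')
    echo "${snap:-$url}"   # fall back to the original URL if no snapshot exists
  fi
done < links.txt
```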
HTTP FILTER is a fast and efficient Bash tool that automates HTTP response code analysis for security researchers, penetration testers, and bug bounty hunters. It processes a list of URLs concurrently, categorizing them into separate files based on HTTP status codes (e.g., 200.txt, 404.txt, 500.txt).
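The status-code bucketing can be sketched in a few lines of bash with curl and xargs for concurrency; the input file name (urls.txt), timeout, and parallelism below are assumptions, not the tool's actual defaults.

```bash
#!/usr/bin/env bash
# Sketch: probe each URL concurrently and append it to a file named after
# its HTTP status code (200.txt, 404.txt, 500.txt, ...).
probe() {
  local url="$1"
  local code
  code=$(curl -s -o /dev/null -m 10 -w '%{http_code}' "$url")
  echo "$url" >> "${code:-000}.txt"
}
export -f probe

# Run up to 20 probes in parallel over the URL list.
xargs -a urls.txt -n 1 -P 20 bash -c 'probe "$0"'
```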
Using the power of AlienVaultOTX technology, Alopra can extract URLs quickly. 🔥
GitHub Action to verify that links in your repo are up and available.
This bash script downloads the private reward-eligible scope from Bugcrowd. The behavior can be modified by changing the URL https://bugcrowd.com/programs.json?vdp[]=false&sort[]=promoted-desc&hidden[]=false&page[]=0. The script stores the URLs under the code name of each program, creating multiple text files in the bugcrowd_recon folder, as sketched below.
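A minimal sketch of the per-program layout that description implies, assuming the response exposes a programs array whose entries carry a code field and that program pages live at bugcrowd.com/<code>; these field names and URL pattern are guesses, not confirmed from the original script.

```bash
#!/usr/bin/env bash
# Sketch: fetch the program listing and create one text file per program
# under bugcrowd_recon/, keyed by the program's code name.
# NOTE: .programs[].code and the program URL pattern are assumptions.
mkdir -p bugcrowd_recon
curl -s 'https://bugcrowd.com/programs.json?vdp[]=false&sort[]=promoted-desc&hidden[]=false&page[]=0' \
  | jq -r '.programs[].code' \
  | while read -r code; do
      echo "https://bugcrowd.com/${code}" >> "bugcrowd_recon/${code}.txt"
    done
```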
A command-line tool that opens URLs based on pattern matching, designed to be used with keyboard shortcuts (e.g., Win+O to open URLs from selected text). It's particularly useful for quickly opening issue trackers, tickets, or other pattern-based URLs. It can also open local files. This project was created using AI assistance (Cursor).
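A toy illustration of that pattern-matching idea, assuming a Jira-style ticket pattern, a hypothetical tracker URL, and xdg-open as the opener; none of these reflect the tool's real configuration.

```bash
#!/usr/bin/env bash
# Sketch: map a matched pattern in the selected text to a URL and open it.
text="$*"

if [[ "$text" =~ ([A-Z]+-[0-9]+) ]]; then
  # Looks like a ticket ID such as PROJ-123: open it in a hypothetical tracker.
  xdg-open "https://tracker.example.com/browse/${BASH_REMATCH[1]}"
elif [[ "$text" =~ (https?://[^[:space:]]+) ]]; then
  # A literal URL in the selection: open it directly.
  xdg-open "${BASH_REMATCH[1]}"
elif [[ -e "$text" ]]; then
  # A local file path: open it with the default application.
  xdg-open "$text"
fi
```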