- config.yaml - Config file for trickest-cli (set the repository name here initially)
- domains.txt - List of root domains (TLD-level domains, e.g. example.com)
- hostnames.txt - List of hostnames found for the provided root domains (updated by the workflow; if updated manually, the changes propagate through the entire workflow)
- servers.txt - List of live web servers found for the discovered hostnames
- reports.txt - List of vulnerabilities found on the discovered servers
- blacklist.txt - List of strings to exclude from all results (hostnames and servers are filtered out with grep -vFf; see the sketch after this list)
- templates (folder) - Place where you push nuclei templates (folders are supported)
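As a rough illustration of how the blacklist is applied (file names mirror the list above; the exact node wiring inside the workflow may differ), the grep -vFf filtering looks like this:

```bash
# Hedged sketch: drop any result that contains a fixed string from blacklist.txt
grep -vFf blacklist.txt hostnames.txt > hostnames.filtered.txt
grep -vFf blacklist.txt servers.txt  > servers.filtered.txt
```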
Add the TRICKEST_TOKEN variable to the repository's Actions secrets:
- Create a new repository from the template
- Open https://github.com/YOUR_USERNAME/YOUR_REPOSITORY/settings/secrets/actions
- Add TRICKEST_TOKEN, which can be found at https://trickest.io/dashboard/settings/my-account
Set up a GitHub deploy key with write access to your Bug Bounty Setup repository and add the private SSH key to the SSH_KEY action secret.
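A minimal sketch of that key setup, assuming you use the GitHub CLI for the secret (the key type, file name, and comment are just examples; the file name id_rsa matches the workflow input shown in the config below):

```bash
# Generate a dedicated key pair; id_rsa matches the file referenced in config.yaml
ssh-keygen -t ed25519 -f id_rsa -C "asm-deploy-key" -N ""

# Add id_rsa.pub as a deploy key with write access in the repository settings,
# then store the private half as the SSH_KEY action secret:
gh secret set SSH_KEY < id_rsa
```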
Replace REPOSITORY_NAME with your GitHub repository name inside the config.yaml file.
```yaml
inputs:
  string-to-file-1.string: REPOSITORY_NAME
  recursively-cat-all-5:
    file:
      - id_rsa
machines:
  large: 1
```
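If you prefer doing the replacement from the command line, a one-liner like this works (the repository name below is a placeholder):

```bash
# Swap the placeholder for your actual repository name (GNU sed shown; on macOS use sed -i '')
sed -i 's/REPOSITORY_NAME/your-repo-name/' config.yaml
```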
All of the domains will be picked up automatically by the workflow; you only need to push new root domains to the domains.txt file.
echo "trickest.com" > domains.txt
All of the nuclei templates will be picked up automatically by the workflow. Push the new nuclei templates to the templates folder.
```bash
cd templates
wget "https://raw.githubusercontent.com/projectdiscovery/nuclei-templates/master/cves/2022/CVE-2022-35416.yaml"
```
When you're done adding your data/templates, commit and push:
```bash
git add *
git commit -m "Add target"
git push
```
The workflow is triggered on the workflow_dispatch event; feel free to change the trigger to whatever best suits your use case (the push event might be a suitable option if you want the workflow to run automatically on every commit).
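With the default workflow_dispatch trigger you can also kick off a run from the command line via the GitHub CLI (the workflow file name below is an assumption; use whatever your workflow file is actually called):

```bash
# Manually trigger the workflow_dispatch workflow (file name is an example)
gh workflow run asm.yml --ref main
```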
Initially, you need to gather the data about the repository:
- String inputs are colored purple; you need to connect your repository name and email to string-to-file (this file will be used as a variable value when cloning the repository inside of get-repo-data)
- id_rsa is directly uploaded through the client/workflow and is used when cloning the repository
NOTE: Keep in mind that out/output.txt is reserved for the output file port, so you can cat the content of domains.txt to out/output.txt to make it available to Amass and SubFinder.
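A minimal sketch of that script-node body, assuming the repository has already been cloned into the node's working directory (the directory name is a placeholder):

```bash
# Copy the tracked root domains to the reserved output port
cat YOUR_REPOSITORY/domains.txt > out/output.txt
```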
Now that you have your root domains, you should use SubFinder and Amass to collect all of the results from passive sources. Their outputs are merged through recursively-cat-all-1 with an additional sort -n | uniq, which concatenates and deduplicates the results.
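A rough local equivalent of that merge step, assuming current subfinder and amass flags (verify against your installed versions):

```bash
# Collect passive results and merge/deduplicate them, mirroring recursively-cat-all-1
subfinder -d trickest.com -silent -o subfinder.txt
amass enum -passive -d trickest.com -o amass.txt
cat subfinder.txt amass.txt | sort -n | uniq > hostnames-passive.txt
```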
This part consists of taking all the results from Passive Recon and then (a rough command-line sketch follows this list):
- Executing dsieve with the 2:4 flag for subdomain levels to extract all of the environments
- Executing mksub to create a wordlist of potential hostnames
- Resolving the wordlist with puredns
- Generating permutations of the brute-forced hostnames with gotator
- Resolving the gotator output with puredns
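A hedged command-line equivalent of this chain (flags are taken from the tools' public READMEs and may differ across versions; the wordlist, permutation, and resolver files are placeholders):

```bash
# 1. Extract higher-level environments (2nd to 4th subdomain level) from the passive results
dsieve -if hostnames-passive.txt -f 2:4 > environments.txt

# 2. Generate candidate hostnames by combining a wordlist with those environments
mksub -df environments.txt -w wordlist.txt -l 1 -o candidates.txt

# 3. Resolve the candidates
puredns resolve candidates.txt -r resolvers.txt -w brute.txt

# 4. Permute the brute-forced hostnames and resolve the permutations
gotator -sub brute.txt -perm permutations.txt -depth 1 -silent > permuted.txt
puredns resolve permuted.txt -r resolvers.txt -w hostnames.txt
```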
Now you've got your hostnames for this run, and you can pass them to httpx to get all of the web servers. The get-repo-data node will provide the nuclei templates you already pushed to the repository.
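A rough local equivalent of the web-server probing and template scanning (flags per current httpx and nuclei releases; verify against your versions):

```bash
# Probe resolved hostnames for live web servers, then scan them with the pushed templates
httpx -l hostnames.txt -silent -o servers.txt
nuclei -l servers.txt -t templates/ -o reports.txt
```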
Finally, the update node will get the data and push it to the repository.
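A hedged sketch of what that update step boils down to (the actual node may differ; the commit message is just an example):

```bash
# Commit and push the refreshed result files back to the repository
git add hostnames.txt servers.txt reports.txt
git commit -m "Update results $(date -u +%Y-%m-%d)" || echo "Nothing new to commit"
git push
```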
Integrations are a crucial part of Attack Surface Management; check out our post on how to pick the right one.
Commit messages will show what has changed, which means you will have insight into all the new data (vulnerabilities). How awesome is that!
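For example, you can review what the latest automated run added straight from git history (file name as used in this setup):

```bash
# Show the most recent commits touching the findings file and the exact new lines
git log --oneline -3 -- reports.txt
git diff HEAD~1 -- reports.txt
```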