Ransomware leak site monitoring


RansomWatch


RansomWatch is a ransomware leak site monitoring tool. It will scrape all of the entries on various ransomware leak sites, store the data in a SQLite database, and send notifications via Slack or Discord when a new victim shows up, or when a victim is removed.

Configuration

In config_vol/, copy config.sample.yaml to config.yaml and add the following:

  • Leak site URLs. I decided not to make this list public in order to prevent the sites from gaining even more notoriety, so if you have them, add them in. If not, this tool isn't for you.
  • Notification destinations. RansomWatch currently supports notifying via the following:
    • Slack: Follow these instructions to add a new app to your Slack workspace and add the webhook URL to the config.
    • Discord: Follow these instructions to add a new app to your Discord server and add the webhook URL to the config.
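
For reference, a config.yaml following the bullets above might be shaped roughly like this. The key names and URLs here are hypothetical; the authoritative schema is config.sample.yaml in the repo.

```yaml
# Hypothetical shape only -- copy config.sample.yaml for the real schema.
sites:
  conti: "http://<leak-site-onion-address>/"   # add the URLs you have
notifications:
  slack:
    my-workspace:
      webhook_url: "https://hooks.slack.com/services/..."
  discord:
    my-server:
      webhook_url: "https://discord.com/api/webhooks/..."
```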

Additionally, there are a few environment variables you may need to set:

  • RW_DB_PATH: Path for the SQLite database to use
  • RW_CONFIG_PATH: Path to the config.yaml file

These are both set in the provided docker-compose.yml.

Usage

This is intended to be run in Docker via a cronjob, at whatever interval you choose.

First, build the container: docker-compose build app

Then, add it to your crontab. Example crontab entry (running every 8 hours):

0 */8 * * * cd /path/to/ransomwatch && docker-compose up --abort-on-container-exit

If you'd prefer, you can use the image published on Docker Hub (captaingeech/ransomwatch) instead, with a docker-compose.yml that looks something like this:

version: "3"

services:
  app:
    image: captaingeech/ransomwatch:latest
    depends_on:
      - proxy
    volumes:
      - ./db_vol:/db
      - ./config_vol:/config
    environment:
      PYTHONUNBUFFERED: 1
      RW_DB_PATH: /db/ransomwatch.db
      RW_CONFIG_PATH: /config/config.yaml

  proxy:
    image: captaingeech/tor-proxy:latest

This can also be run via the command line, but that requires you to have your own Tor proxy (with the control service) running. Example execution:

$ RW_DB_PATH=./db_vol/ransomwatch.db RW_CONFIG_PATH=./config_vol/config.yaml python3 src/ransomwatch.py

Example Slack Messages

Slack notification for new victim

Slack notification for removed victim

Slack notification for site down

Slack notification for an error

The messages sent to Discord are similar in style and identical in content.

Leak Site Implementations

The following leak sites are (planned to be) supported:

  • Conti
  • MAZE
  • Egregor
  • Sodinokibi/REvil
  • DoppelPaymer (uses a captcha, so it probably won't be supported for a while)
  • NetWalker
  • Pysa
  • Avaddon
  • DarkSide
  • CL0P
  • Nefilim
  • Mount Locker
  • Suncrypt
  • Everest
  • Ragnarok
  • Ragnar_Locker
  • BABUK LOCKER
  • Pay2Key
  • Cuba
  • RansomEXX
  • Ranzy Locker
  • Astro Team
  • LV

If there are other leak sites you want implemented, feel free to open a PR or DM me on Twitter, @captainGeech42

Comments
  • Pysa timestamp format change

    Pysa timestamp format change

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/pysa.py", line 38, in scrape_victims
        published_dt = datetime.strptime(
      File "/usr/local/lib/python3.9/_strptime.py", line 568, in _strptime_datetime
        tt, fraction, gmtoff_fraction = _strptime(data_string, format)
      File "/usr/local/lib/python3.9/_strptime.py", line 349, in _strptime
        raise ValueError("time data %r does not match format %r" %
    ValueError: time data '22/03/21' does not match format '%m/%d/%y'
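
    The failure above is a strict strptime format mismatch: '22/03/21' has no valid month under '%m/%d/%y'. A minimal sketch of one way to tolerate both date orders (this helper and its format list are hypothetical, not the project's code):

```python
from datetime import datetime

# Hypothetical helper: try several formats instead of assuming '%m/%d/%y'.
def parse_pysa_date(raw: str) -> datetime:
    for fmt in ("%m/%d/%y", "%d/%m/%y"):
        try:
            return datetime.strptime(raw, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognized timestamp: {raw!r}")

# '22/03/21' fails '%m/%d/%y' (there is no month 22) but parses day-first.
```

    Ambiguous values like 04/05/21 resolve to whichever format matches first, so the ordering is itself a judgment call; per the v1.2 release notes below, the project eventually switched the Pysa date parsing to a library.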
    
    opened by captainGeech42 4
  • Something broken with REvil

    Something broken with REvil

    app_1    | 2021/04/20 18:36:25 [ERROR] Got an error while scraping REvil, notifying
    app_1    | 2021/04/20 18:36:25 [ERROR] Error sending Discord notification (400): {"embeds": ["0"]}
    app_1    | 2021/04/20 18:36:25 [ERROR] Failed to send error notification to Discord guild "test-discord"
    app_1    | 2021/04/20 18:36:25 [ERROR] Traceback (most recent call last):
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
    app_1    |     httplib_response = self._make_request(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 445, in _make_request
    app_1    |     six.raise_from(e, None)
    app_1    |   File "<string>", line 3, in raise_from
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 440, in _make_request
    app_1    |     httplib_response = conn.getresponse()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 1347, in getresponse
    app_1    |     response.begin()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 307, in begin
    app_1    |     version, status, reason = self._read_status()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 276, in _read_status
    app_1    |     raise RemoteDisconnected("Remote end closed connection without"
    app_1    | http.client.RemoteDisconnected: Remote end closed connection without response
    app_1    |
    app_1    | During handling of the above exception, another exception occurred:
    app_1    |
    app_1    | Traceback (most recent call last):
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 439, in send
    app_1    |     resp = conn.urlopen(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 755, in urlopen
    app_1    |     retries = retries.increment(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/util/retry.py", line 532, in increment
    app_1    |     raise six.reraise(type(error), error, _stacktrace)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/packages/six.py", line 734, in reraise
    app_1    |     raise value.with_traceback(tb)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
    app_1    |     httplib_response = self._make_request(
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 445, in _make_request
    app_1    |     six.raise_from(e, None)
    app_1    |   File "<string>", line 3, in raise_from
    app_1    |   File "/usr/local/lib/python3.9/site-packages/urllib3/connectionpool.py", line 440, in _make_request
    app_1    |     httplib_response = conn.getresponse()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 1347, in getresponse
    app_1    |     response.begin()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 307, in begin
    app_1    |     version, status, reason = self._read_status()
    app_1    |   File "/usr/local/lib/python3.9/http/client.py", line 276, in _read_status
    app_1    |     raise RemoteDisconnected("Remote end closed connection without"
    app_1    | urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
    app_1    |
    app_1    | During handling of the above exception, another exception occurred:
    app_1    |
    app_1    | Traceback (most recent call last):
    app_1    |   File "/app/ransomwatch.py", line 52, in main
    app_1    |     s.scrape_victims()
    app_1    |   File "/app/sites/revil.py", line 62, in scrape_victims
    app_1    |     r = p.get(f"{self.url}?page={i}", headers=self.headers)
    app_1    |   File "/app/net/proxy.py", line 101, in get
    app_1    |     return self.session.get(*args, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 555, in get
    app_1    |     return self.request('GET', url, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
    app_1    |     resp = self.send(prep, **send_kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/sessions.py", line 655, in send
    app_1    |     r = adapter.send(request, **kwargs)
    app_1    |   File "/usr/local/lib/python3.9/site-packages/requests/adapters.py", line 498, in send
    app_1    |     raise ConnectionError(err, request=request)
    app_1    | requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
    app_1    | 2021/04/20 18:36:25 [INFO] Finished all sites, exiting
    

    Not sure what's going on. Similar error with Slack.

    bug 
    opened by captainGeech42 3
  • Conti - Scraping Error

    Conti - Scraping Error

    Describe the bug

    Error Message Below:

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/conti.py", line 56, in scrape_victims
        last_li = page_list.find_all("li")[-1]
    AttributeError: 'NoneType' object has no attribute 'find_all'

    To Reproduce Steps to reproduce the behavior: This error has happened several times over the last 24 hours while ransomwatch has been run on a cron job.

    Expected behavior Parse the contents of the Conti site with no errors or have additional error handling built in to handle this error.

    Screenshots If applicable, add screenshots to help explain your problem.

    Logs

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/conti.py", line 56, in scrape_victims
        last_li = page_list.find_all("li")[-1]
    AttributeError: 'NoneType' object has no attribute 'find_all'
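
    The root cause is that soup.find(...) returned None (the expected element wasn't in the page), and the scraper called .find_all() on the result anyway. A self-contained sketch of the guard pattern (FakeTag stands in for a BeautifulSoup tag so the sketch runs without bs4; all names are hypothetical, not the project's code):

```python
# Sketch of a None-guard for the pattern in conti.py; illustrative only.
def last_list_item(page_list):
    """Return the last <li> from a pagination tag, or None if absent."""
    if page_list is None:        # soup.find() found nothing
        return None
    items = page_list.find_all("li")
    return items[-1] if items else None

# Stand-in for a BeautifulSoup tag, so the sketch runs without bs4.
class FakeTag:
    def __init__(self, items):
        self._items = items
    def find_all(self, name):
        return self._items
```

    Returning None (or raising a site-specific error) lets the caller log "site layout may have changed" instead of crashing the whole run.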

    Environment

    • OS: Ubuntu 20.04
    • How you are running it: Docker via cron job (README best-practices setup)

    Additional context Add any other context about the problem here.

    opened by GRIT-5ynax 2
  • Dockerhub image out of date

    Dockerhub image out of date

    Running the Dockerhub image results in

    app_1 | Traceback (most recent call last):
    app_1 |   File "/app/ransomwatch.py", line 98, in <module>
    app_1 |     NotificationManager.send_error_notification(f"Non-scraping failure", tb, fatal=True)
    app_1 |   File "/app/notifications/manager.py", line 30, in send_error_notification
    app_1 |     for workspace, params in Config["slack"].items():
    app_1 | KeyError: 'slack'

    Works if the image is built

    bug 
    opened by nhova 2
  • New sites

    New sites

    • [x] Ranzy
    • [x] Astro
    • [x] Pay2Key
    • [x] Cuba
    • [x] RansomEXX
    • [x] Mount Locker
    • [x] Ragnarok
    • [ ] Ragnar Locker
    • [x] Suncrypt
    • [x] Everest
    • [x] Nefilim
    • [x] CL0P
    • [x] Pysa
    opened by captainGeech42 2
  • New Scraper: BLACKMATTER // ARVIN // EL COMETA // LORENZ // XING // LOCKBIT

    New Scraper: BLACKMATTER // ARVIN // EL COMETA // LORENZ // XING // LOCKBIT

    New Scraper: BLACKMATTER // ARVIN // EL COMETA // LORENZ // XING // LOCKBIT

    This pull request adds support for BLACKMATTER, ARVIN, EL COMETA, LORENZ, XING, LOCKBIT.

    • [x] The URL for the site is nowhere in the git history
    • [x] The site is added to config.sample.yaml
    • [x] There aren't any debug logging statements/etc.
    • [x] The data going into the DB is properly parsed and is accurate
    enhancement 
    opened by x-originating-ip 1
  • cl0p scraper broken

    cl0p scraper broken

    Describe the bug Cl0p scraper out of date

    Logs

    Traceback (most recent call last):
      File "/app/ransomwatch.py", line 66, in main
        s.scrape_victims()
      File "/app/sites/cl0p.py", line 21, in scrape_victims
        victim_list = soup.find("div", class_="collapse-section").find_all("li")
    AttributeError: 'NoneType' object has no attribute 'find_all'
    

    should probably just update this to the v3 site as well

    bug 
    opened by captainGeech42 1
  • Enhance pysa datetimes processing (#50)

    Enhance pysa datetimes processing (#50)

    Describe the changes

    Adds some logic to pysa.py to process the datetimes better. Exception handling has also been added to avoid crashing the script.

    Related issue(s)

    #50

    How was it tested?

    Before: scraping failed at some point if pysa was defined in the yaml config file (see related issue).

    Now:

    • [x] scraping works
    • [x] dates look good (although since we don't know the true values, we can only assume they're accurate)
    • [x] the script no longer crashes, thanks to the try/except handling.
    opened by biligonzales 1
  • Handle missing notifications element in the yaml config file (#52)

    Handle missing notifications element in the yaml config file (#52)

    Describe the changes

    Added minor changes to manager.py so that it does not fail if we do not want to configure notifications. Basically, the presence of the notifications element in the YAML config is tested.
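
    A minimal sketch of what such a guard can look like (function and key names are illustrative, not the actual manager.py code): absent or empty "notifications"/"slack" sections yield an empty list instead of a KeyError.

```python
# Illustrative only: read notification destinations defensively.
def slack_destinations(config: dict) -> list:
    """Return (workspace, webhook_url) pairs, or [] if none are configured."""
    notifications = config.get("notifications") or {}
    slack = notifications.get("slack") or {}
    return [(ws, params["webhook_url"]) for ws, params in slack.items()]
```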

    Related issue(s)

    #52

    How was it tested?

    • [x] Docker started with an empty notifications element
    • [x] Docker started without any notifications element
    opened by biligonzales 1
  • Unable to run without configured notifications

    Unable to run without configured notifications

    The notifications part in the config.yaml file needs to be present and configured to avoid any error at runtime. Would be great to be able to leave the notifications part empty (or even not to set it in the yaml config).

    opened by biligonzales 1
  • Conti: scraper fixed (#73)

    Conti: scraper fixed (#73)

    Describe the changes

    Fixed the Conti scraper to use the newsList JavaScript object, because no HTML elements were available any longer.

    Related issue(s)

    This fixes issue #73

    How was it tested?

    1. Add Conti url to config.yaml
    2. Run docker-compose build app
    3. Run docker-compose up --abort-on-container-exit
    4. Conti results are pushed again in the database

    Checklist for a new scraper (delete if N/A)

    • [x] The URL for the site is nowhere in the git history
    • [x] The site is added to config.sample.yaml
    • [x] There aren't any debug logging statements/etc. (there was one logging.debug there, I left it as it was)
    • [x] The data going into the DB is properly parsed and is accurate
    opened by biligonzales 0
  • Lockbit scraper fixed (now uses playwright) #74

    Lockbit scraper fixed (now uses playwright) #74

    Describe the changes

    Lockbit 2.0 now uses a DDoS protection mechanism, so the regular HTTP GET method no longer works.

    As a workaround, I have implemented Microsoft's Playwright library, which behaves as if a proper browser made the request.

    Summary of the changes:

    1. lockbit.py: replaced the use of requests by playwright
    2. requirements.txt: added playwright
    3. Dockerfile: added playwright chromium support as well as required libraries.

    I have also upgraded the base image at the top of the Dockerfile from python3.9-buster to python3.10-bullseye.

    Related issue(s)

    It fixes Issue #74

    Note that the scraping engine for Lockbit has been left untouched, as it still works perfectly; only the web page retrieval method has been altered.

    How was it tested?

    • [x] docker-compose build app
    • [x] docker-compose up --abort-on-container-exit
    • [x] Checked that Lockbit entries have been inserted into the database
    opened by biligonzales 3
  • new victims monitoring is broken, alert only when sites are down

    new victims monitoring is broken, alert only when sites are down

    Describe the bug The app doesn't alert when new victims are added to the ransom sites (we noticed that new victims have been added on some of the sites); we only get alerts when the sites are down.

    Expected behavior The app alerts when new victims are added to the ransom sites being monitored.

    Logs

    Starting ransomwatch_proxy_1 ... done
    Starting ransomwatch_app_1 ... done
    Attaching to ransomwatch_proxy_1, ransomwatch_app_1
    proxy_1 | Feb 07 14:50:31.819 [notice] Tor 0.4.5.7 running on Linux with Libevent 2.1.12-stable, OpenSSL 1.1.1i, Zlib 1.2.11, Liblzma 5.2.5, Libzstd 1.4.5 and Unknown N/A as libc.
    proxy_1 | Feb 07 14:50:31.822 [notice] Tor can't help you if you use it wrong! Learn how to be safe at https://www.torproject.org/download/download#warning
    proxy_1 | Feb 07 14:50:31.822 [notice] Read configuration file "/etc/tor/torrc".
    proxy_1 | Feb 07 14:50:31.825 [notice] Opening Socks listener on 0.0.0.0:9050
    proxy_1 | Feb 07 14:50:31.825 [notice] Opened Socks listener connection (ready) on 0.0.0.0:9050
    proxy_1 | Feb 07 14:50:31.825 [notice] Opening Control listener on 0.0.0.0:9051
    proxy_1 | Feb 07 14:50:31.825 [notice] Opened Control listener connection (ready) on 0.0.0.0:9051
    app_1 | 2022/02/07 14:50:33 [INFO] Initializing
    app_1 | 2022/02/07 14:50:33 [INFO] Found 30 sites
    app_1 | 2022/02/07 14:50:33 [INFO] Starting process for Avaddon
    app_1 | 2022/02/07 14:50:33 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:50:33 [INFO] Starting process for Conti
    app_1 | 2022/02/07 14:50:38 [INFO] Scraping victims
    app_1 | 2022/02/07 14:51:48 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:51:48 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:51:48 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:51:48 [INFO] Finished Conti
    app_1 | 2022/02/07 14:51:48 [INFO] Starting process for DarkSide
    app_1 | 2022/02/07 14:51:48 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:48 [INFO] Starting process for REvil
    app_1 | 2022/02/07 14:51:48 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:48 [INFO] Starting process for Babuk
    app_1 | 2022/02/07 14:51:50 [INFO] Scraping victims
    app_1 | 2022/02/07 14:51:51 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:51:51 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:51:51 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:51:51 [INFO] Finished Babuk
    app_1 | 2022/02/07 14:51:51 [INFO] Starting process for Ranzy
    app_1 | 2022/02/07 14:51:51 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:51 [INFO] Starting process for Astro
    app_1 | 2022/02/07 14:51:51 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:51:51 [INFO] Starting process for Pay2Key
    app_1 | 2022/02/07 14:51:53 [INFO] Scraping victims
    app_1 | 2022/02/07 14:51:54 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:51:54 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:51:54 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:51:54 [INFO] Finished Pay2Key
    app_1 | 2022/02/07 14:51:54 [INFO] Starting process for Cuba
    app_1 | 2022/02/07 14:51:57 [INFO] This is the first scrape for Cuba, no victim notifications will be sent
    app_1 | 2022/02/07 14:51:57 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:08 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:08 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:08 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:08 [INFO] Finished Cuba
    app_1 | 2022/02/07 14:52:08 [INFO] Starting process for RansomEXX
    app_1 | 2022/02/07 14:52:10 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:13 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:13 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:13 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:13 [INFO] Finished RansomEXX
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Mount
    app_1 | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Ragnarok
    app_1 | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Ragnar
    app_1 | 2022/02/07 14:52:13 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:13 [INFO] Starting process for Suncrypt
    app_1 | 2022/02/07 14:52:15 [INFO] This is the first scrape for Suncrypt, no victim notifications will be sent
    app_1 | 2022/02/07 14:52:15 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:17 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:17 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:17 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:17 [INFO] Finished Suncrypt
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Everest
    app_1 | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Nefilim
    app_1 | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Cl0p
    app_1 | 2022/02/07 14:52:17 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:17 [INFO] Starting process for Pysa
    app_1 | 2022/02/07 14:52:19 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:23 [WARNING] couldn't parse timestamp: 00/00/00
    app_1 | /usr/local/lib/python3.9/site-packages/dateparser/date_parser.py:35: PytzUsageWarning: The localize method is no longer necessary, as this time zone supports the fold attribute (PEP 495). For more details on migrating to a PEP 495-compliant implementation, see https://pytz-deprecation-shim.readthedocs.io/en/latest/migration.html
    app_1 |   date_obj = stz.localize(date_obj)
    app_1 | 2022/02/07 14:52:24 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:24 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:24 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:24 [INFO] Finished Pysa
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Hive
    app_1 | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Lockbit
    app_1 | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Xing
    app_1 | 2022/02/07 14:52:24 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:24 [INFO] Starting process for Lorenz
    app_1 | 2022/02/07 14:52:26 [INFO] This is the first scrape for Lorenz, no victim notifications will be sent
    app_1 | 2022/02/07 14:52:26 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:27 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:27 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:27 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:27 [INFO] Finished Lorenz
    app_1 | 2022/02/07 14:52:27 [INFO] Starting process for ElCometa
    app_1 | 2022/02/07 14:52:27 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:27 [INFO] Starting process for Arvin
    app_1 | 2022/02/07 14:52:30 [INFO] This is the first scrape for Arvin, no victim notifications will be sent
    app_1 | 2022/02/07 14:52:30 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:33 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:33 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:33 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:33 [INFO] Finished Arvin
    app_1 | 2022/02/07 14:52:33 [INFO] Starting process for Blackmatter
    app_1 | 2022/02/07 14:52:33 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:33 [INFO] Starting process for Avoslocker
    app_1 | 2022/02/07 14:52:33 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:33 [INFO] Starting process for LV
    app_1 | 2022/02/07 14:52:35 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:37 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:37 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:37 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:37 [INFO] Finished LV
    app_1 | 2022/02/07 14:52:37 [INFO] Starting process for Marketo
    app_1 | 2022/02/07 14:52:37 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:37 [INFO] Starting process for LockData
    app_1 | 2022/02/07 14:52:40 [INFO] Scraping victims
    app_1 | 2022/02/07 14:52:42 [INFO] There are 0 new victims
    app_1 | 2022/02/07 14:52:42 [INFO] Identifying removed victims
    app_1 | 2022/02/07 14:52:42 [INFO] There are 0 removed victims
    app_1 | 2022/02/07 14:52:42 [INFO] Finished LockData
    app_1 | 2022/02/07 14:52:42 [INFO] Starting process for Rook
    app_1 | 2022/02/07 14:52:42 [WARNING] No URL found in config for this actor, skipping
    app_1 | 2022/02/07 14:52:42 [INFO] Finished all sites, exiting

    Environment

    • OS: Ubuntu 20.04.3
    • How you are running it: Docker with cronjob
    opened by Deventual 1
  • Victim removal detection doesn't work properly when onion changes

    Victim removal detection doesn't work properly when onion changes

    Victim removal detection currently uses the full URL, which includes the onion domain. One side effect of this is that whenever the onion address for a site changes, all of that site's victims are considered removed and then new on the next scrape, which is problematic.

    Change this to just use the URI + site ID.
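
    One way to sketch that keying change (the helper name is hypothetical, not the project's code): drop the host portion so the key survives an onion address rotation.

```python
from urllib.parse import urlparse

# Hypothetical victim key: site ID plus URL path, ignoring the onion host.
def victim_key(site_id: str, victim_url: str) -> str:
    return f"{site_id}:{urlparse(victim_url).path}"
```

    With this, http://aaa.onion/v/acme and http://bbb.onion/v/acme map to the same victim record.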

    bug 
    opened by captainGeech42 0
  • LOCKBIT 2.0 Support

    LOCKBIT 2.0 Support

    Site Info (no URL) LOCKBIT 2.0 was released some time ago. It should be confirmed whether the scraper works with the new site; if not, the module should be rewritten.

    Is the site currently online? Yes

    opened by wersas1 5
Releases (v1.2)
  • v1.2 (Dec 4, 2021)

    This release fixes a few different bugs on the following scrapers:

    • Ragnar
    • Lorenz
    • Pysa
    • Arvin

    What's Changed

    • fixed #79 by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/80
    • fixed #76 by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/81
    • fixed #77, changed dateparsing to use lib by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/82
    • changed arvin date parsing to use lib (fixes #75) by @captainGeech42 in https://github.com/captainGeech42/ransomwatch/pull/83

    Full Changelog: https://github.com/captainGeech42/ransomwatch/compare/v1.1...v1.2

    Source code(tar.gz)
    Source code(zip)
  • v1.1 (Dec 2, 2021)

    Ransomwatch v1.1

    This release adds support for many new sites, and has a critical security update. For details on the security update, see here.

    Supported Sites

    This release supports the following shame sites:

    • Conti
    • Sodinokibi/REvil
    • Pysa
    • Avaddon
    • DarkSide
    • CL0P
    • Nefilim
    • Mount Locker
    • Suncrypt
    • Everest
    • Ragnarok
    • Ragnar_Locker
    • BABUK LOCKER
    • Pay2Key
    • Cuba
    • RansomEXX
    • Ranzy Locker
    • Astro Team
    • BlackMatter
    • Arvin
    • El_Cometa
    • Lorenz
    • Xing
    • Lockbit
    • AvosLocker
    • LV
    • Marketo
    • Lockdata
    Source code(tar.gz)
    Source code(zip)
  • v1.0 (Apr 18, 2021)

    v1.0 Ransomwatch Release

    This initial version of Ransomwatch supports the following sites:

    • Conti
    • REvil/Sodinokibi
    • Avaddon
    • DarkSide

    This release supports notifying via:

    • Slack Webhooks

    More sites/notification capabilities will be added over time. However, this release has been tested in a production capacity and should be suitable to start collections.

    If you find any bugs or run across any problems, please open an issue to help improve Ransomwatch.

    Source code(tar.gz)
    Source code(zip)
Owner
Zander Work
@osusec / @OSU-SOC