Threat Intel Platform for T-POTs

Overview

GreedyBear

The goal of the project is to extract data about the attacks detected by a T-Pot, or a cluster of them, and to generate feeds that can be used to prevent and detect attacks.

Official announcement here.

Feeds

Public feeds

Public feeds are provided by The Honeynet Project on this site: greedybear.honeynet.org (example).

Please do not make too many requests to extract the feeds, or you will be banned.

If you want to be updated regularly, please download the feeds only once every 10 minutes (this is the interval between internal updates).

Available feeds

The feeds are reachable through the following URL (an example request is sketched after the parameter lists below):

https://<greedybear_site>/api/feeds/<feed_type>/<attack_type>/<age>.<format>

The available feed_type values are:

  • log4j: attacks detected by the Log4Pot honeypot.
  • cowrie: attacks detected by the Cowrie honeypot.
  • all: all feed types at once.

The available attack_type values are:

  • scanner: IP addresses captured by the honeypots while performing attacks.
  • payload_request: IP addresses and domains extracted from payloads that would have been executed had a specific attack succeeded.
  • all: all attack types at once.

The available age values are:

  • recent: the most recent IOCs, seen in the last 3 days.
  • persistent: IOCs that have been seen regularly by the honeypots. This feed starts empty when no prior data has been collected and grows over time.

The available format values are:

  • txt: plain text (one line per IOC)
  • csv: CSV-like file (one line per IOC)
  • json: JSON file with additional information about each IOC
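
As a quick illustration, here is a minimal Python sketch (not part of the official documentation) that downloads one of the public feeds using the URL pattern and parameter values listed above; the chosen values (log4j, scanner, recent, txt) are examples only.

    import requests

    # Example values for <feed_type>/<attack_type>/<age>.<format>
    BASE_URL = "https://greedybear.honeynet.org/api/feeds"
    feed_type, attack_type, age, fmt = "log4j", "scanner", "recent", "txt"

    url = f"{BASE_URL}/{feed_type}/{attack_type}/{age}.{fmt}"
    response = requests.get(url, timeout=30)
    response.raise_for_status()

    # txt format: one IOC per line
    iocs = [line.strip() for line in response.text.splitlines() if line.strip()]
    print(f"Downloaded {len(iocs)} IOCs from {url}")

Remember the note above: fetch the feeds at most once every 10 minutes.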

Run GreedyBear in your environment

The tool was created not only to provide the feeds from The Honeynet Project's cluster of T-Pots.

If you manage one or more T-Pots of your own, you can get the code of this application and run GreedyBear in your own environment, making it possible to provide new feeds of your own.

Comments
  • Added Basic Testcases

    Description

    Added Testcases for Views and Models

    Related issues

    Fixes #21

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [ ] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [ ] I have read and understood the rules about how to Contribute to this project
    • [ ] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review.
    opened by uzaxirr 11
  • Create authenticated enrichment service

    We could provide a service that could be queried via API key. In this way, it would be possible to check whether an IOC is in the GreedyBear database without having to download and manage all the feeds.

    It would be a simple enrichment service.

    We would need:

    • a basic GUI (#11) to allow people to register and get an API key.
    • limit API usage to avoid abuse.
    • allow different kinds of API usage limits.
    • create a new API endpoint (#17).
    • Integrate it in IntelOwl (https://github.com/intelowlproject/IntelOwl/issues/817)
    opened by mlodic 9
  • Create feeds for other honeypot types

    GreedyBear works by extracting the data from the T-Pot logs generated by the honeypots.

    For the first alpha release we integrated just Log4Pot + Cowrie.

    We should also integrate all the other honeypots available in T-Pot. Glutton should be the first.

    opened by mlodic 8
  • Fixes #17: Added API for Enrichment

    Description

    Added an enrichment endpoint to get the details of an observable by its name. Endpoint: /api/enrichment?query=<observable_name> (a usage sketch follows after this item).

    Please ignore the unrelated changes in settings.py regarding env vars; they were made because of #23 and I'll revert them when the PR is good to go.

    Added fake data to the DB through the admin panel for testing purposes.

    Related issues

    Fixes and Closes #17

    Type of change

    Please delete options that are not relevant.

    • [x] New feature (non-breaking change which adds functionality).

    Checklist

    • [x] I have read and understood the rules about how to Contribute to this project
    • [x] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Screenshots

    API response for a record that exists in the DB and for a record that does not exist in the DB; details of the searched observable in the DB; all records in the DB (screenshots omitted).

    opened by uzaxirr 7
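
    A minimal sketch of how the enrichment endpoint described in this PR could be queried, assuming a hypothetical local GreedyBear instance at http://localhost and the query parameter shown above; the response structure is illustrative, not a documented schema. If token authentication is enabled (see the django-rest-durin PR below), an Authorization header would also be required.

        import requests

        # Hypothetical local instance; adjust the host to your deployment
        ENRICHMENT_URL = "http://localhost/api/enrichment"

        def enrich(observable_name: str) -> dict:
            # The endpoint takes the observable name via the "query" parameter
            resp = requests.get(ENRICHMENT_URL, params={"query": observable_name}, timeout=10)
            resp.raise_for_status()
            return resp.json()

        print(enrich("1.2.3.4"))  # example observable, purely illustrative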
  • Configured Read the Docs

    Description

    Configured Read the Docs

    Changes I have made:

    added the .readthedocs.yaml file, made some changes to docs/source/conf.py, and added a documentation link in the README.

    Things to complete:

    I created only the empty .md files in docs but haven't added any documentation to them; the OpenAPI and ReDoc docs still need to be added.

    Related issues

    This PR partially solves issue #27

    Type of change

    • [x] New feature (non-breaking change which adds functionality).

    Checklist

    • [x] I have read and understood the rules about how to Contribute to this project
    • [x] The pull request is for the branch dev
    • [x] The tests gave 0 errors.
    • [x] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review.
    opened by yaswanthsaivendra 4
  • Added elasticsearch container for development

    Description

    Added elasticsearch container for development

    Related issues

    closes #23

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [X] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [X] I have read and understood the rules about how to Contribute to this project
    • [X] The pull request is for the branch dev
    • [X] The tests gave 0 errors.
    • [X] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review.
    opened by devmrfitz 4
  • Elasticsearch installation error

    I'm encountering an error while setting up GreedyBear locally, after running the docker-compose -p greedybear up command. It originates from settings.py, where the Elasticsearch client is initialized; the ELASTIC_ENDPOINT variable in my env file is empty (screenshot omitted; see the sketch after this item).

    opened by uzaxirr 4
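
    Not the project's actual settings.py, but a minimal sketch of one way to guard the client initialization so that an empty ELASTIC_ENDPOINT produces a clear message instead of a stack trace; it assumes the official elasticsearch Python client.

        import os

        from elasticsearch import Elasticsearch

        ELASTIC_ENDPOINT = os.environ.get("ELASTIC_ENDPOINT", "")

        if ELASTIC_ENDPOINT:
            # e.g. ELASTIC_ENDPOINT=http://elasticsearch:9200
            elastic_client = Elasticsearch(ELASTIC_ENDPOINT)
        else:
            # Make the misconfiguration explicit instead of failing later
            elastic_client = None
            print("ELASTIC_ENDPOINT is not set; Elasticsearch extraction is disabled")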
  • Updated feeds view to make use of DRF and added durin authentication

    Description

    • Made changes to the feeds view to make use of DRF.
    • Added token authentication from django-rest-durin (see the sketch after this item).

    Related issues

    This PR solves issue #26.

    Type of change

    • [ ] New feature (non-breaking change which adds functionality).

    Checklist

    • [ ] I have read and understood the rules about how to Contribute to this project
    • [ ] The pull request is for the branch dev
    • [ ] The tests gave 0 errors.
    • [ ] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [ ] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)
    opened by yaswanthsaivendra 3
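
    A minimal sketch of what a DRF view protected by durin token authentication might look like (the view name and response are hypothetical, not the code from this PR); it assumes django-rest-durin is installed and configured.

        from durin.auth import TokenAuthentication
        from rest_framework.permissions import IsAuthenticated
        from rest_framework.response import Response
        from rest_framework.views import APIView

        class FeedsView(APIView):
            # Clients authenticate with an "Authorization: Token <token>" header issued by durin
            authentication_classes = [TokenAuthentication]
            permission_classes = [IsAuthenticated]

            def get(self, request, feed_type, attack_type, age, format_):
                # Placeholder: the real view would query the IOC models and
                # serialize them in the requested format
                return Response({"feed_type": feed_type, "attack_type": attack_type, "age": age})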
  • Rate limiting for admin and API

    Description

    Rate limiting for admin and API (a configuration sketch follows after this item).

    Related issues

    #31

    Type of change

    Please delete options that are not relevant.

    • [ ] Bug fix (non-breaking change which fixes an issue).
    • [X] New feature (non-breaking change which adds functionality).
    • [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected).

    Checklist

    • [X] I have read and understood the rules about how to Contribute to this project
    • [X] The pull request is for the branch dev
    • [X] The tests gave 0 errors.
    • [X] Linters (Black, Flake, Isort) gave 0 errors. If you have correctly installed pre-commit, it does these checks and adjustments on your behalf.
    • [X] The commits were squashed into a single one (optional, they will be squashed anyway by the maintainer)

    Important Rules

    • If your changes decrease the overall tests coverage (you will know after the Codecov CI job is done), you should add the required tests to fix the problem
    • Every time you make changes to the PR and you think the work is done, you should explicitly ask for a review.
    opened by devmrfitz 2
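
    One common way to rate-limit DRF endpoints is through DRF's built-in throttling settings; this is a sketch with illustrative values, not necessarily what this PR implements (the Django admin side would need a separate mechanism).

        # settings.py (illustrative values)
        REST_FRAMEWORK = {
            "DEFAULT_THROTTLE_CLASSES": [
                "rest_framework.throttling.AnonRateThrottle",
                "rest_framework.throttling.UserRateThrottle",
            ],
            "DEFAULT_THROTTLE_RATES": {
                "anon": "20/minute",   # unauthenticated clients
                "user": "100/minute",  # authenticated clients
            },
        }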
  • Add CONTRIBUTING.md file

    Can we please add, or refer to, a URL containing the guidelines for future contributors? I see nothing about this in the README or docs for this repo.

    opened by ManishShah120 2
  • Integrate GreedyBear inside T-Pot installation

    This would require that all of these issues be solved first:

    • #11 , #12 , #10 , #21 , #27

    Plus, we would need to work with the T-Pot team to properly integrate the project there. The goal is to reduce the complexity of the overall application to allow easy integration.

    opened by mlodic 2
  • Allow to do customized feeds lookups

    We could add more ways to extract data feeds from GreedyBear other than "recent" and "persistent", which are free.

    These new ways must be protected with authentication to avoid abuse.

    We could give the users the chance to:

    • download the data extracted in the last X hours (a customization of "recent")
    • download the data that was seen more than X times in the last Y days (a customization of "persistent")
    opened by mlodic 0
  • Filter IP addresses from known scanners

    We should periodically download this batch of data: https://raw.githubusercontent.com/stamparm/maltrail/master/trails/static/mass_scanner.txt and add those IPs to a whitelist to reduce the number of false positives (a download sketch follows after this item).

    opened by mlodic 0
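
    A minimal sketch of the periodic download described above, using the URL from the issue and assuming the list contains one entry per line with optional '#' comments; how the resulting whitelist is stored is left out.

        import requests

        # URL taken from the issue above
        MASS_SCANNER_URL = (
            "https://raw.githubusercontent.com/stamparm/maltrail/master/"
            "trails/static/mass_scanner.txt"
        )

        def fetch_mass_scanner_ips() -> set:
            resp = requests.get(MASS_SCANNER_URL, timeout=30)
            resp.raise_for_status()
            ips = set()
            for line in resp.text.splitlines():
                line = line.strip()
                # Skip blanks and comment lines
                if not line or line.startswith("#"):
                    continue
                # Keep only the address in case a trailing comment is present
                ips.add(line.split()[0])
            return ips

        # These IPs would then be added to the whitelist to reduce false positives
        print(len(fetch_mass_scanner_ips()), "known scanner IP addresses downloaded")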
  • Add the chance to select which honeypot we want to extract data from

    Right now there is no way to do that: GreedyBear automatically extracts data from all the configured honeypots.

    We should allow the app administrator to enable or disable extraction per honeypot from the Django admin. That way we can also filter out logs stating that a honeypot is not running.

    opened by mlodic 0
Releases (v1.0.2)

Owner: The Honeynet Project