Fuzzing JavaScript Engines with Aspect-preserving Mutation

Overview

DIE

Repository for "Fuzzing JavaScript Engines with Aspect-preserving Mutation" (in S&P'20). You can check the paper for technical details.
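
For intuition, the sketch below is a toy, hypothetical illustration of the idea only (it is not DIE's actual mutator, which mutates test cases using profiled type information): an aspect-preserving mutation keeps the seed's structure and the types of its expressions, and only swaps in new values of the same kind.

# Toy sketch only (NOT DIE's implementation): mutate a JS seed while
# preserving its structure and the type of the mutated expression.
import random
import re

seed = "var a = [1, 2, 3]; a.length = 100; for (var i = 0; i < 4; i++) a[i] = i * 2;"

def mutate_number(src):
    # Pick one numeric literal and replace it with another number,
    # leaving every other token of the seed untouched.
    numbers = list(re.finditer(r"\b\d+\b", src))
    target = random.choice(numbers)
    new_value = str(random.choice([0, 1, 256, 0x7fffffff, 2**32]))
    return src[:target.start()] + new_value + src[target.end():]

print(mutate_number(seed))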

Environment

Tested on Ubuntu 18.04 with the following environment:

  • Python v3.6.10
  • npm v6.14.6
  • n v6.7.0

General Setup

For nodejs and npm,

$ sudo apt-get -y install npm
$ sudo npm install -g n
$ sudo n stable

For redis-server,

$ sudo apt install redis-server

We use clang-6.0 to compile AFL and the target JS engines smoothly.

$ sudo apt-get -y install clang-6.0

DIE Setup

To set up the environment for AFL,

$ cd fuzz/scripts
$ sudo ./prepare.sh

To compile the whole project,

$ ./compile.sh

Server Setup

  • Make a corpus directory (we used DIE-corpus as the corpus)
$ git clone https://github.com/sslab-gatech/DIE-corpus.git
$ python3 ./fuzz/scripts/make_initial_corpus.py ./DIE-corpus ./corpus
  • Make an SSH tunnel for the connection with the redis-server
$ ./fuzz/scripts/redis.py
  • Dry run with corpus
$ ./fuzz/scripts/populate.sh [target binary path] [path of DIE-corpus dir] [target js engine (ch/jsc/v8/ffx)]
# Example
$ ./fuzz/scripts/populate.sh ~/ch ./DIE-corpus ch

That's it! Your corpus has been executed, and the resulting data should now be stored in the redis-server.

Tips

To check the redis-data,

$ redis-cli -p 9000
127.0.0.1:9000> keys *

If the result contains the "crashBitmap", "crashQueue", "pathBitmap", and "newPathsQueue" keys, the fuzzer was registered and executed correctly.
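
If you prefer to check this from a script, below is a minimal sketch using the redis-py package (an extra dependency, not part of this repo), assuming the tunnel created by redis.py is listening on local port 9000:

# Sanity-check sketch (assumes `pip install redis` and the ssh tunnel on port 9000).
import redis

r = redis.Redis(host="127.0.0.1", port=9000)
present = {k.decode() for k in r.keys("*")}

# These keys are created once the corpus has been populated.
for key in ("crashBitmap", "crashQueue", "pathBitmap", "newPathsQueue"):
    print(key, "ok" if key in present else "MISSING")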

Client Setup

  • Make an SSH tunnel for the connection with the redis-server
$ ./fuzz/scripts/redis.py
  • Usage
$ ./fuzz/scripts/run.sh [target binary path] [path of DIE-corpus dir] [target js engine (ch/jsc/v8/ffx)]
# Example
$ ./fuzz/scripts/run.sh ~/ch ./DIE-corpus ch
  • Check if it's running
$ tmux ls

You can find a session named fuzzer if it's running.

Typer

We used d8 to profile type information, so please change d8_path in fuzz/TS/typer/typer.py before execution.

$ cd fuzz/TS/typer
$ python3 typer.py [corpus directory]

A *.jsi file will be created if instrumentation works, and a *.t file will be created if profiling works.
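
As a quick sanity check, the sketch below lists corpus files whose outputs are missing; it assumes the typer writes foo.jsi and foo.t next to each foo.js, so adjust the naming if your setup differs:

# Sanity-check sketch: report .js files whose .jsi (instrumented) or .t (type
# profile) outputs are missing. Assumes outputs sit next to each .js file.
import sys
from pathlib import Path

corpus = Path(sys.argv[1]) if len(sys.argv) > 1 else Path("./corpus")

for js in sorted(corpus.rglob("*.js")):
    missing = [ext for ext in (".jsi", ".t") if not js.with_suffix(ext).exists()]
    if missing:
        print(f"{js}: missing {', '.join(missing)}")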

CVEs

If you find bugs and get CVEs by running DIE, please let us know.

  • ChakraCore: CVE-2019-0609, CVE-2019-1023, CVE-2019-1300, CVE-2019-0990, CVE-2019-1092
  • JavaScriptCore: CVE-2019-8676, CVE-2019-8673, CVE-2019-8811, CVE-2019-8816
  • V8: CVE-2019-13730, CVE-2019-13764, CVE-2020-6382

Contacts

Citation

@inproceedings{park:die,
  title        = {{Fuzzing JavaScript Engines with Aspect-preserving Mutation}},
  author       = {Soyeon Park and Wen Xu and Insu Yun and Daehee Jang and Taesoo Kim},
  booktitle    = {Proceedings of the 41st IEEE Symposium on Security and Privacy (Oakland)},
  month        = may,
  year         = 2020,
  address      = {San Francisco, CA},
}
Comments
  • Failed to generate test case

    Hi,

    I set up DIE by following your README, and after executing ./fuzz/scripts/run.sh ~/ch I got the following logs.

    [*] Command: timeout 30 node ./fuzz/afl/../TS/esfuzz.js output-1/.cur_input.js output-1/fuzz_inputs 100 385230076 > /dev/null
    /home/hacker/DIE/fuzz/TS/base/utils.js:136
                throw new Error("[-] " + msg);
                ^
    
    Error: [-] file doesn't exist: output-1/.cur_input.js
        at Object.assert (/home/hacker/DIE/fuzz/TS/base/utils.js:136:19)
        at new Code (/home/hacker/DIE/fuzz/TS/base/estestcase.js:17:17)
        at main (/home/hacker/DIE/fuzz/TS/esfuzz.js:22:16)
        at Object.<anonymous> (/home/hacker/DIE/fuzz/TS/esfuzz.js:46:1)
        at Module._compile (internal/modules/cjs/loader.js:1137:30)
        at Object.Module._extensions..js (internal/modules/cjs/loader.js:1157:10)
        at Module.load (internal/modules/cjs/loader.js:985:32)
        at Function.Module._load (internal/modules/cjs/loader.js:878:14)
        at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:71:12)
        at internal/main/run_main_module.js:17:47
    

    It seems it failed to generate a new test case.

    opened by zr950624 10
  • Afl-clang-fast (based on clang-6.0) can't be used to compile the target v8 engine

    The v8 engine uses clang (version 12.0.0), which is downloaded while fetching the source code to build the whole project. But afl-clang was built on clang-6.0, so there are too many options it does not recognize. How did you deal with that?

    opened by qiaoguanli 4
  • Instructions to use new corpus

    Hi, I found that https://github.com/sslab-gatech/DIE/blob/master/fuzz/scripts/make_initial_corpus.py#L59 simply copies the .js and .t files. If we want to add new test cases, how do we generate the .t files for the .js files?

    opened by Changochen 3
  • How to save testcases generated by DIE?

    Hello! I just finished reading your paper; it's great! I hope to run DIE on my local machine, but there are some problems I can't solve, and I hope you can help me. My questions are as follows:

    1. Have I installed DIE successfully?

      First, I instrumented my JS engine with afl-clang-fast from the original AFL. When running the populate script and attaching to the tmux corpus session, I received the following messages:

      [*] Insert a new path: ./corpus/output-x/00xxxx-corpus.js
      [*] Command: node ./fuzz/afl/../TS/redis_ctrl.js insertPath ./corpus/output-x/00xxxx-corpus.js output-x/.cov_diff
      [*] Checking corpus: ./corpus/output-x/00xxxx-corpus.js
      [*] Insert a new path: ./corpus/output-x/00xxxx-corpus.js
      [*] Command: node ./fuzz/afl/../TS/redis_ctrl.js insertPath ./corpus/output-x/00xxxx-corpus.js output-x/.cov_diff
      [*] Checking corpus: ./corpus/output-x/00xxxx-corpus.js
      [*] Insert a new path: ./corpus/output-x/00xxxx-corpus.js
      [*] Command: node ./fuzz/afl/../TS/redis_ctrl.js insertPath ./corpus/output-x/00xxxx-corpus.js output-x/.cov_diff
      
      +++ Testing aborted by user +++
      [+] We're done here. Have a nice day!
      

      And when connecting to redis database with redis-cli -p 9000 I see the following keys:

       1) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9592"
       2) "pathBitmap"
       3) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9602"
       4) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9562"
       5) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9587"
       6) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9552"
       7) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9597"
       8) "crashQueue"
       9) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9542"
      10) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9532"
      11) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9572"
      12) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9582"
      13) "crashBitmap"
      14) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9537"
      15) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9567"
      16) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9547"
      17) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9557"
      18) "fuzzers"
      19) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9527"
      20) "fuzzers:fuzzer-ws-X299-WU8-3f35a809c8b14ce3-9577"
      21) "newPathsQueue"
      

      Does this mean that the fuzzer was registered and executed correctly?

      Next, I set up the client. I built the server and client on the same machine.

      So I skipped executing ./fuzz/scripts/redis.py, and when running ./fuzz/scripts/run.sh ~/ch ./DIE-corpus ch I got the following messages:

      [*] No -t option specified, so I'll use exec timeout of 1000 ms.
      [+] All set and ready to roll!
      [*] Command: node ./fuzz/afl/../TS/redis_ctrl.js reportStatus fuzzer-$(hostname)-$(cat /etc/machine-id|cut -c 1-16)-16583 output-15/fuzzer_stats
      [*] Get a next testcase
      [*] Command: node ./fuzz/afl/../TS/redis_ctrl.js getNextTestcase output-15/.cur_input.js
      [*] Generating testcases...
      [*] Command: timeout 30 node ./fuzz/afl/../TS/esfuzz.js output-15/.cur_input.js output-15/fuzz_inputs 100 2079661984 > /dev/null
      [*] Scanning 'output-15/fuzz_inputs'...
      [*] Spinning up the fork server...
      [+] All right - fork server is up.
      [*] Command: node ./fuzz/afl/../TS/redis_ctrl.js downloadBitmap crashBitmap output-15/.gcov_crash
      [*] Command: node ./fuzz/afl/../TS/redis_ctrl.js reportStatus fuzzer-$(hostname)-$(cat /etc/machine-id|cut -c 1-16)-16583 output-15/fuzzer_stats
      [*] Time - Generation: 202.00 ea/s, Execution: 20.20 ea/s
      

      The contents of the fuzzer_stats file are:

      start_time        : 1600228579
      last_update       : 1600238323
      fuzzer_pid        : 16513
      cycles_done       : 0
      execs_done        : 48460
      execs_per_sec     : 9.41
      paths_total       : 0
      paths_favored     : 0
      paths_found       : 0
      paths_imported    : 0
      max_depth         : 0
      cur_path          : 484
      pending_favs      : 0
      pending_total     : 0
      variable_paths    : 0
      stability         : 100.00%
      bitmap_cvg        : 0.00%
      unique_crashes    : 0
      unique_hangs      : 2
      last_path         : 0
      last_crash        : 0
      last_hang         : 1600231336
      execs_since_crash : 48460
      exec_timeout      : 1000
      afl_banner        : ch
      afl_version       : 2.52b
      target_mode       : default
      command_line      : ./fuzz/afl/afl-fuzz -m none -o output-1 ./engines/chakracore-1.11.5/out/Debug/ch -lib=/path/to/DIE/DIE-corpus/lib.js -lib=/path/to/DIE/DIE-corpus/jsc.js -lib=/path/to/DIE/DIE-corpus/v8.js -lib=/path/to/DIE/DIE-corpus/ffx.js -lib=/path/to/DIE/DIE-corpus/chakra.js @@
      

      My installation is complete at this point. Do those look correct?

    2. How do I save mutated seeds before they are executed by the instrumented JS engine?

    • Following the installation steps above, I finally got some files at path/to/DIE/output-1/hangs. They are named like id:000000,src:0000xx,op:js,pos:0. Are these the files that cause the engine to time out?
    • If I want to save every test case generated by DIE, regardless of the JS engine's performance, what should I do?

    Looking forward to your reply, thank you in advance.

    opened by QuXing9 2
  • issues with resuming

    Hi @thdusdl1219 ,

    Is there any way to resume a fuzzing session? If I try to use something like -i- it seems to be the equivalent of rerunning the populate.sh script.

    thanks

    opened by adriantdlg 2
  • Failed to use afl-clang-fast

    There's no afl-clang-fast or afl-clang-fast++ after running compile.sh:

    pushd fuzz/afl
    make clean
    make CC=clang-6.0
    #pushd llvm_mode
    #make clean
    #make CC=clang-6.0 CXX=g++ 
    #popd
    popd
    
    afl-analyze    afl-fuzz.c     afl-tmin      experimental   QuickStartGuide.txt
    afl-analyze.c  afl-g++        afl-tmin.c    hash.h         README
    afl-as         afl-gcc        alloc-inl.h   init           README-JS.md
    afl-as.c       afl-gcc.c      as            libdislocator  run
    afl-as.h       afl-gotcpu     config.h      libtokencap    test-instr.c
    afl-clang      afl-gotcpu.c   debug.h       llvm_mode      types.h
    afl-clang++    afl-showmap    dictionaries  Makefile
    afl-fuzz       afl-showmap.c  docs          qemu_mode
    

    When I tried to compile them myself, I got those two binaries, but the build of afl-llvm-pass.so failed:

    $ make LLVM_CONFIG=llvm-config-6.0 CC=clang-6.0 CXX=g++          
    [*] Checking for working 'llvm-config'...
    [*] Checking for working 'clang-6.0'...
    [*] Checking for '../afl-showmap'...
    [+] All set and ready to build.
    g++ `llvm-config-6.0 --cxxflags` -fno-rtti -fpic -O3 -funroll-loops -Wall -D_FORTIFY_SOURCE=2 -g -Wno-pointer-sign -DVERSION=\"2.52b\" -Wno-variadic-macros -shared afl-llvm-pass.so.cc -o ../afl-llvm-pass.so `llvm-config-6.0 --ldflags` 
    cc1plus: error: -Werror=date-time: no option -Wdate-time
    cc1plus: warning: command line option ‘-Wno-pointer-sign’ is valid for C/ObjC but not for C++ [enabled by default]
    Makefile:83: recipe for target '../afl-llvm-pass.so' failed
    make: *** [../afl-llvm-pass.so] Error 1
    

    When trying to instrument v8:

    [-] PROGRAM ABORT : Unable to find 'afl-llvm-rt.o' or 'afl-llvm-pass.so'. Please set AFL_PATH
             Location : find_obj(), afl-clang-fast.c:90
    
    opened by GenoWang 1
  • error when compiling

    I tried multiple clang versions; I installed clang-6.0 especially for this.

    [*] Testing the CC wrapper and instrumentation output...
    unset AFL_USE_ASAN AFL_USE_MSAN; AFL_QUIET=1 AFL_INST_RATIO=100 AFL_PATH=. ./afl-clang -O3 -funroll-loops -Wall -Wno-unused-variable -Wno-unused-function -Wno-unused-result -D_FORTIFY_SOURCE=2 -g -Wno-pointer-sign -DAFL_PATH="/usr/local/lib/afl" -DDOC_PATH="/usr/local/share/doc/afl" -DBIN_PATH="/usr/local/bin" test-instr.c -o test-instr -ldl
    echo 0 | ./afl-showmap -m none -q -o .test-instr0 ./test-instr
    echo 1 | ./afl-showmap -m none -q -o .test-instr1 ./test-instr

    Oops, the instrumentation does not seem to be behaving correctly!

    Please ping [email protected] to troubleshoot the issue.

    make: *** [Makefile:95: test_build] Error 1 ~/Downloads/DIE

    opened by adrian-rt 1
  • Getting coverage data

    I'm trying to determine how coverage data for the engine is determined. I've read the paper and understand that DIE is based on AFL which uses gcov. I've also found the .gcov_crash and .gcov_path files in the output directory, but I don't know where to go from there.

    I'm fairly new to fuzzing tools overall, and would appreciate any guidance!

    opened by bryanmylee 0
  • error compiling type.ts

    I tried to debug #7 and thought that maybe doing a git pull (I was already up to date) and recompiling would fix my issues.

    When I tried to recompile I got this error:

    $ ./compile.sh
    ~/Downloads/DIE/fuzz/TS ~/Downloads/DIE
    npm WARN [email protected] No repository field.
    npm WARN [email protected] No license field.
    npm WARN The package vm2 is included as both a dev and production dependency.

    audited 129 packages in 0.714s

    1 package is looking for funding
      run `npm fund` for details

    found 0 vulnerabilities

    base/engine/type.ts:39:24 - error TS1005: ';' expected.

    39 return this.44(step, type); ~~~

    opened by adrian-rt 0
  • errors when running the fuzzer

    Error: [-] file doesn't exist: output-11/.cur_input.js
        at Object.assert (/home/adrian/Downloads/DIE/fuzz/TS/base/utils.js:135:19)
        at new Code (/home/adrian/Downloads/DIE/fuzz/TS/base/estestcase.js:16:17)
        at main (/home/adrian/Downloads/DIE/fuzz/TS/esfuzz.js:22:16)
        at Object.<anonymous> (/home/adrian/Downloads/DIE/fuzz/TS/esfuzz.js:46:1)
        at Module._compile (internal/modules/cjs/loader.js:1137:30)
        at Object.Module._extensions..js (internal/modules/cjs/loader.js:1157:10)
        at Module.load (internal/modules/cjs/loader.js:985:32)
        at Function.Module._load (internal/modules/cjs/loader.js:878:14)
        at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:71:12)
        at internal/main/run_main_module.js:17:47
    [*] Scanning 'output-11/fuzz_inputs'...
    [*] Time - Generation: inf ea/s, Execution: inf ea/s

    [*] Get a next testcase
    [*] Command: node ./fuzz/afl/../TS/redis_ctrl.js getNextTestcase output-11/.cur_input.js
    [-] getNextTestcase - Need to populate first
    [*] Generating testcases...
    [*] Command: timeout 30 node ./fuzz/afl/../TS/esfuzz.js output-11/.cur_input.js output-11/fuzz_inputs 100 381215867 > /dev/null
    /home/adrian/Downloads/DIE/fuzz/TS/base/utils.js:135
                throw new Error("[-] " + msg);
                ^

    Error: [-] file doesn't exist: output-11/.cur_input.js
        at Object.assert (/home/adrian/Downloads/DIE/fuzz/TS/base/utils.js:135:19)
        at new Code (/home/adrian/Downloads/DIE/fuzz/TS/base/estestcase.js:16:17)
        at main (/home/adrian/Downloads/DIE/fuzz/TS/esfuzz.js:22:16)
        at Object.<anonymous> (/home/adrian/Downloads/DIE/fuzz/TS/esfuzz.js:46:1)
        at Module._compile (internal/modules/cjs/loader.js:1137:30)
        at Object.Module._extensions..js (internal/modules/cjs/loader.js:1157:10)
        at Module.load (internal/modules/cjs/loader.js:985:32)
        at Function.Module._load (internal/modules/cjs/loader.js:878:14)
        at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:71:12)
        at internal/main/run_main_module.js:17:47
    [*] Scanning 'output-11/fuzz_inputs'...
    [*] Time - Generation: inf ea/s, Execution: inf ea/s

    [*] Get a next testcase
    [*] Command: node ./fuzz/afl/../TS/redis_ctrl.js getNextTestcase output-11/.cur_input.js
    [-] getNextTestcase - Need to populate first
    [*] Generating testcases...
    [*] Command: timeout 30 node ./fuzz/afl/../TS/esfuzz.js output-11/.cur_input.js output-11/fuzz_inputs 100 1684276898 > /dev/null
    /home/adrian/Downloads/DIE/fuzz/TS/base/utils.js:135
                throw new Error("[-] " + msg);
                ^

    Error: [-] file doesn't exist: output-11/.cur_input.js
        at Object.assert (/home/adrian/Downloads/DIE/fuzz/TS/base/utils.js:135:19)
        at new Code (/home/adrian/Downloads/DIE/fuzz/TS/base/estestcase.js:16:17)
        at main (/home/adrian/Downloads/DIE/fuzz/TS/esfuzz.js:22:16)
        at Object.<anonymous> (/home/adrian/Downloads/DIE/fuzz/TS/esfuzz.js:46:1)
        at Module._compile (internal/modules/cjs/loader.js:1137:30)
        at Object.Module._extensions..js (internal/modules/cjs/loader.js:1157:10)
        at Module.load (internal/modules/cjs/loader.js:985:32)
        at Function.Module._load (internal/modules/cjs/loader.js:878:14)
        at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:71:12)
        at internal/main/run_main_module.js:17:47
    [*] Scanning 'output-11/fuzz_inputs'...
    [*] Time - Generation: inf ea/s, Execution: inf ea/s

    [*] Get a next testcase
    [*] Command: node ./fuzz/afl/../TS/redis_ctrl.js getNextTestcase output-11/.cur_input.js
    [-] getNextTestcase - Need to populate first
    [*] Generating testcases...
    [*] Command: timeout 30 node ./fuzz/afl/../TS/esfuzz.js output-11/.cur_input.js output-11/fuzz_inputs 100 1208457964 > /dev/null

    opened by adrian-rt 0
  • Redis Server initial setup

    What should the redis-server ID be?

    #!/usr/bin/env python3
    import os
    import subprocess
    # export REDIS_URL=redis://localhost:9000

    p = subprocess.Popen(["tmux", "ls"], stdout=subprocess.PIPE)
    out, err = p.communicate()
    if "ssh-tunneling" in out.decode("utf-8"):
        print("ssh-tunneling already exists")
        exit()

    print("This script makes ssh-tunneling between your redis-server and this machine.")

    server = input("redis-server URL : ")
    login = input("redis-server ID : ")

    os.system("tmux new-session -s ssh-tunneling -d 'ssh -L 9000:localhost:6379 " + login + "@" + server + "'")

    opened by arvindk459895 0
  • A problem in initial type analysis stage

    Hello, I have recently been using DIE to test my project. It works well with the corpus you provided, but there are some errors when I try to use a custom corpus. The situation seems to be the same as in #22. Is there anything missing for parsing a raw seed? @thdusdl1219

    Looking forward to your reply.

    opened by Anderson-Xia 0
  • Questions about dynamic analysis

    Hello, I want to learn how DIE performs dynamic analysis. However, the type file (.t) generated by calling typer.py is very different from the type file in the seed library. More precisely, the generated type files are similar to .raw files. So, what went wrong in this process? How do you infer types bottom-up at the AST level?

    Hope that can be answered. thank you very much.

    opened by 1789120321 0
  • How to calculate coverage?

    Hi @thdusdl1219, I have a question about the method of coverage calculation.

    I tried to run your program, but the coverage rate is approximately a straight line.

    First, the target program is JavaScriptCore. By adjusting the afl-llvm-pass.so.cc code, the number of instrumented locations is recorded each time instrumentation is inserted, i.e., the variable inst_blocks is summed. The result is that JavaScriptCore has 875,102 instrumentation points in total. I use ((MAP_SIZE << 3) - count_bits(virgin_bits)) to record the positions covered during fuzzing. Of course, the coverage of the initial seeds is recorded first. I run the program with a single thread.

    After running the original seed, the number of locations covered by the record is 147,483, and the coverage rate is about 17%. After running for 10 hours, the number of locations covered was 148,864, and the coverage rate was still around 17%. There was no significant improvement as mentioned in the paper. Of course, it may be because the target program is different, but I don't think it should have such a big impact.

    It may be that the calculation method is different, so I want to know how you calculate the coverage of the target program.

    Thank you.

    opened by 1789120321 0
  • help

    I want to test some other software. What can I do to make DIE support it? For example: 1) https://github.com/Moddable-OpenSource/moddable 2) https://github.com/jerryscript-project/jerryscript 3) https://github.com/pcmacdon/jsish

    opened by rain6851 1
  • How to setup redis for the fuzzer?

    I ran every command in the README.md on Ubuntu 20.04. What URL do I use for fuzz/scripts/redis.py? I tried 127.0.0.1:9000 and 127.0.0.1:6379 and neither worked. What is the ID of the server? How do I run the server?

    sudo apt-get -y install npm
    sudo npm install -g n
    sudo n stable
    sudo apt install redis-server
    sudo apt-get -y install clang-6.0
    cd fuzz/scripts
    sudo ./prepare.sh
    ./compile.sh
    git clone https://github.com/sslab-gatech/DIE-corpus.git
    python3 ./fuzz/scripts/make_initial_corpus.py ./DIE-corpus ./corpus
    ./fuzz/scripts/populate.sh ~/mujs-fuzzilli/build/release/mujs ./DIE-corpus ch
    ./fuzz/scripts/run.sh ~/mujs-fuzzilli/build/release/mujs ./DIE-corpus ch
    

    No redis server or tmux connection? Is there a step I missed? If so, please add it to the README.md.

    redis-cli -p 9000
    Could not connect to Redis at 127.0.0.1:9000: Connection refused
    not connected> quit
    ➜  DIE git:(master) ✗ /etc/init.d/redis-server status
    ● redis-server.service - Advanced key-value store
         Loaded: loaded (/lib/systemd/system/redis-server.service; enabled; vendor preset: enabled)
         Active: active (running) since Sat 2021-02-13 23:22:53 UTC; 7min ago
           Docs: http://redis.io/documentation,
                 man:redis-server(1)
        Process: 73976 ExecStart=/usr/bin/redis-server /etc/redis/redis.conf (code=exited, status=0/SUCCESS)
       Main PID: 73986 (redis-server)
          Tasks: 4 (limit: 9512)
         Memory: 2.0M
         CGroup: /system.slice/redis-server.service
                 └─73986 /usr/bin/redis-server 127.0.0.1:6379
    ➜  DIE git:(master) ✗ netstat -plant
    (Not all processes could be identified, non-owned process info
     will not be shown, you would have to be root to see it all.)
    Active Internet connections (servers and established)
    Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name    
    tcp        0      0 127.0.0.1:6379          0.0.0.0:*               LISTEN      -                   
    tcp        0      0 127.0.0.53:53           0.0.0.0:*               LISTEN      -                   
    tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN      -                   
    tcp        0    632 157.230.212.231:22      76.87.216.104:52161     ESTABLISHED -                   
    tcp6       0      0 ::1:6379                :::*                    LISTEN      -                   
    tcp6       0      0 :::22                   :::*                    LISTEN      -                   
    ➜  DIE git:(master) ✗ sudo netstat -plant
    Active Internet connections (servers and established)
    Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name    
    tcp        0      0 127.0.0.1:6379          0.0.0.0:*               LISTEN      73986/redis-server  
    tcp        0      0 127.0.0.53:53           0.0.0.0:*               LISTEN      583/systemd-resolve 
    tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN      758/sshd: /usr/sbin 
    tcp        0    576 157.230.212.231:22      76.87.216.104:52161     ESTABLISHED 28487/sshd: root@pt 
    tcp6       0      0 ::1:6379                :::*                    LISTEN      73986/redis-server  
    tcp6       0      0 :::22                   :::*                    LISTEN      758/sshd: /usr/sbin 
    ➜  DIE git:(master) ✗ ps aux | grep redis
    redis      73986  0.1  0.0  51700  4592 ?        Ssl  23:22   0:01 /usr/bin/redis-server 127.0.0.1:6379
    thelshe+   79222  0.0  0.0   8160   736 pts/0    S+   23:35   0:00 grep --color=auto --exclude-dir=.bzr --exclude-dir=CVS --exclude-dir=.git --exclude-dir=.hg --exclude-dir=.svn --exclude-dir=.idea --exclude-dir=.tox redis
    ➜  DIE git:(master) ✗ redis-cli -p 6379
    127.0.0.1:6379> keys *
    (empty list or set)
    127.0.0.1:6379> quit
    
    opened by docfate111 3
Owner
gts3.org (SSLab@Gatech)
https://gts3.org