
Commit a2031bb

update readme
1 parent d73db42 commit a2031bb

1 file changed: README.md (+52, -1 lines)

# BANND

A project for the course "Trustworthy Machine Learning". We investigate defenses against
backdoor attacks on neural networks during training.

# Dependencies

The project requires Python 3.10, and dependencies are installed using `pip`.

To install the dependencies, run `python -m pip install -r requirements`.

We recommend using a virtual environment for the project. To do that, install
`virtualenv` with `python -m pip install virtualenv` and create a new virtualenv in the
project with `python -m virtualenv ./venv`. Then activate the virtualenv with
`source ./venv/bin/activate` and run all the `python` and `pip` commands within it.
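
Putting these steps together, a typical setup from a fresh checkout might look like this
(assuming `python` points at a Python 3.10 interpreter):

```sh
# Install virtualenv, create and activate a virtual environment,
# then install the project's dependencies into it.
python -m pip install virtualenv
python -m virtualenv ./venv
source ./venv/bin/activate
python -m pip install -r requirements
```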

# Running

Run the project with `python bannd.py`. Pass the `-h`/`--help` flag to see all the
parameters. Note that, for brevity's sake, some parameters, such as the number of epochs
and the batch size, are omitted and cannot be passed.
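
For instance, to list every parameter the script accepts:

```sh
python bannd.py --help
```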

We support three modes of running:

- Baseline: pass `--runtype baseline` to see the network's baseline accuracy without any
  attack or defense.
- Attack: pass `--runtype attack` and `--poison_rate X` (any number between `0.01` and
  `0.99`; we tested with `0.1`, `0.3`, and `0.5`) to run the BadNets attack on the
  network by poisoning that fraction of the samples with the backdoor. Expect the attack
  success rate to increase over time.
- Defense: pass `--runtype defense`, `--poison_rate X`, and `--quantile-threshold Y`
  (any number between `0.00` and `0.99`; we tested with `0.0`, `0.5`, `0.7`, and `0.85`)
  to run the attack and defend against it.

## Examples

To run the baseline:

```sh
python bannd.py --runtype baseline
```

To run the attack:

```sh
python bannd.py --runtype attack --poison_rate 0.1
```

To run the defense:

```sh
python bannd.py --runtype defense --poison_rate 0.1 --quantile-threshold 0.7
```
