PGA

Source code for "Simple and Efficient Partial Graph Adversarial Attack: A New Perspective".

Main Structure

  • models: implementations of the GNN models
  • victims: scripts for training victim GNN models
    • configs: hyperparameters for each model
    • models: saved model checkpoints
  • attackers: implementations of the attack methods
  • attack: scripts for performing attacks
    • configs: hyperparameters for each attack
    • perturbed_adjs: adversarial graphs generated by the attackers

Running Steps

  1. Train the GNN models
> cd victims
> python train.py --model=gcn --dataset=cora
  2. Perform an attack
> cd attack
> python gen_attack.py

PGA Attack

  1. Train the GNN models
> cd victims
> python train.py
  2. Generate graph statistics, such as node degrees and classification margins
> cd analysis
> python gen_statistics.py --dataset=cora
  3. Perform the PGA attack
> cd attack
> python gen_attack.py --attack=pga --dataset=cora
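Run end to end, the three steps above amount to the following command sequence. This is a sketch, assuming the three directories sit under the repository root, commands are issued from there, and the default hyperparameters in victims/configs and attack/configs are used:

```shell
# Sketch of the full PGA pipeline, run from the repository root (assumption).
cd victims && python train.py && cd ..                            # 1. train victim models
cd analysis && python gen_statistics.py --dataset=cora && cd ..   # 2. generate graph statistics
cd attack && python gen_attack.py --attack=pga --dataset=cora     # 3. run the PGA attack
```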

Evaluate (evasion attack)

> python evasion_attack.py --victim=robust --dataset=cora
> python evasion_attack.py --victim=normal --dataset=cora

Evaluate (poisoning attack)

> python poison_attack.py --victim=gcn --dataset=cora
> python poison_attack.py --victim=gat --dataset=cora

Requirements

  • deeprobust
  • torch_geometric
  • torch_sparse
  • torch_scatter
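The dependencies can typically be installed with pip. Note that torch_sparse and torch_scatter ship prebuilt wheels matched to a specific PyTorch/CUDA combination, so the plain command below is a sketch, assuming a compatible PyTorch build is already installed:

```shell
# Sketch: install the listed dependencies (assumes PyTorch is already installed;
# torch_sparse and torch_scatter wheels must match your torch/CUDA version).
pip install deeprobust torch_geometric torch_sparse torch_scatter
```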
