IGNNITION 1.7.0

Getting Started

  • IGNNITION at a glance
    • Why IGNNITION?
    • Main functionalities:
      • High-level abstraction
      • No coding is needed
      • High flexibility
      • Easy debugging
      • Easy integration
      • High performance
      • Next step
  • Install IGNNITION
    • Pip
    • Source files
      • Download the source files
      • Prepare the environment
      • Install IGNNITION
    • Next step
  • Quick step-by-step tutorial
    • Understanding the problem
    • Building the Dataset
    • Designing and implementing the GNN
      • 1. Hidden state initialization
      • 2. MPNN architecture
        • Message phase
        • Update phase
        • Readout phase
    • Training and evaluation
    • Debugging

Background On GNNs

  • Motivation for GNNs
  • What is a GNN?
  • Additional material
    • Related papers
    • Blogs
    • Courses

User Guide

  • User-guide introduction
    • STEP 1: Design your model
    • STEP 2: Adapt your dataset
    • STEP 3: Train and evaluate
    • Optional: Debugging assistant
  • Model Description
    • Multi-stage Message Passing
    • Generate your GNN
      • Step 1: Entity definition
      • Step 2: Message passing definition
        • What is a single message-passing?
        • How to define a single message-passing?
        • Using stages to define chronological orderings
        • Defining the message-passing phase
      • Step 3: Readout definition
      • Step 4: Internal neural networks definition
      • Putting it into practice
  • Keyword definition
    • Step 1: Entity definition
      • Parameter: name
      • Parameter: state_dim
      • Parameter: initial_state
    • Step 2: Message-passing phase
      • Parameter: num_iterations
      • Parameter: stages
        • Stage:
      • Parameter: stage_message_passings
        • Single message-passing:
        • Parameter: source_entities
        • Parameter: destination_entity
        • Parameter: aggregation
        • Parameter: update
    • Step 3: Readout
      • Parameter: output_label
    • Source entity object
      • Parameter: name
      • Parameter: message
      • Message function object:
        • Operation: Direct_assignment
      • Aggregation operation:
        • Option 1: sum
        • Option 2: mean
        • Option 3: min
        • Option 4: max
        • Option 5: ordered
        • Option 6: attention
        • Option 7: edge-attention
        • Option 8: convolution
        • Option 9: concat
        • Option 10: interleave
        • Option 11: neural_network
      • Update operation:
        • Parameter: type
        • Parameter: nn_name
    • Operation objects
      • Operation 1: product
        • Parameter: input
        • Parameter: output_name
        • Parameter: type_product
      • Operation 2: neural_network
        • Parameter: input
        • Parameter: nn_name
        • Parameter: output_name
      • Operation 3: pooling
        • Parameter: type_pooling
  • Generate your dataset
    • Format of the dataset
    • What should the dataset include?
    • How to generate a sample?
      • Create the graph
      • Create the nodes
      • Create the edges
      • Defining the label
        • Node level
        • Graph level
    • Serializing the graph
    • Compress the file
    • Practical example
  • Train and evaluate your model
    • 1. Run the training and evaluation
      • Create the model
      • Create the computational graph
      • Training and validation
      • Evaluate
      • Predict
        • Feeding an array of samples
    • 2. Configuration file
      • Definition of the paths
        • Path to the training dataset
        • Path to the validation dataset
        • Path to the prediction dataset
        • Load model path
        • Output path
        • Additional file path
        • Path to the model description file
      • Model training parameters
        • Loss
        • Optimizer
        • Metrics
      • Advanced options
        • Batch size
        • Number of epochs
        • Epoch size
        • Training shuffling
        • Validation shuffling
        • Validation samples
        • Validation frequency
        • K-best checkpoints
        • Batch normalization
        • Initial epoch
  • Debugging assistant
    • Tensorboard visualization
      • Visualization of Shortest-Path
        • State creation
        • Message passing
        • Readout
    • Error checking
      • Wrong Neural Network reference
      • Wrong entity
  • Use of global variables
    • What are global variables and why are they useful?
    • How to adapt my model
      • Basic working principle
      • Adaptation of the Shortest-Path example
  • Examples
    • 1. Shortest-Path
      • Brief description
    • 2. Graph Query Neural Networks
      • Brief description
      • Contextualization
      • MSMP Graph
      • Try Graph Query Neural Network
    • 3. RouteNet
      • Brief description
      • Contextualization
      • MSMP Graph
    • 4. Q-size
      • Brief description
      • Contextualization
      • MSMP Graph
    • 5. QM9
      • Brief description
      • Contextualization
    • 6. Radio Resource Allocation
      • Brief description

Misc

  • About
  • Contact and issues
  • Mailing list
  • Contributing
    • Managing branches in IGNNITION
    • Versioning
  • Community by-laws
    • Community Roles
    • Project Management Committee
  • License
  • Citing


© Copyright 2021, Barcelona Neural Networking Center.
