Detect · Enforce · Share
A three-component network defence system that combines machine-learning-based attack detection, kernel-level firewall enforcement, and real-time cloud threat intelligence sharing across firewall instances.
The Problem
Traditional firewalls filter traffic using fixed rules based on port numbers, IP addresses, and protocols. They cannot detect attacks whose identity only emerges from patterns of behaviour across many packets over time. A port scan looks normal packet by packet. A brute-force attempt looks like ordinary login traffic. A Slowloris attack looks like a slow but legitimate connection. We built a system that learns what these attacks actually look like in practice.
The system captures live network traffic, aggregates packets into ten-second windows per source address, computes statistical features, and classifies each window as benign or one of five attack types using a hybrid rule-plus-machine-learning approach. When an attack is detected, the system automatically blocks the source at the kernel level and shares the event with peer firewall instances through a cloud database.
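The windowing step above can be sketched in a few lines. This is a minimal illustration, not the deployed code: the real detector computes twelve features per window, and the three shown here (plus their names) are assumptions chosen to make the idea concrete.

```python
import statistics
from collections import defaultdict

WINDOW_SECONDS = 10

def window_key(timestamp: float) -> int:
    """Bucket a packet timestamp into a ten-second window index."""
    return int(timestamp // WINDOW_SECONDS)

def aggregate(packets):
    """Group (timestamp, src_ip, size, dst_port) tuples into per-source
    ten-second windows and compute a few example statistical features."""
    windows = defaultdict(list)
    for ts, src, size, dport in packets:
        windows[(src, window_key(ts))].append((size, dport))
    features = {}
    for key, pkts in windows.items():
        sizes = [s for s, _ in pkts]
        ports = {p for _, p in pkts}
        features[key] = {
            "pkt_count": len(pkts),               # volume in this window
            "mean_size": statistics.fmean(sizes), # flood traffic skews this
            "unique_dst_ports": len(ports),       # high for port scans
        }
    return features
```

Keying on (source address, window index) is what lets slow, distributed behaviour surface: a port scan that looks benign packet by packet produces an obviously abnormal `unique_dst_ports` once ten seconds of traffic are viewed together.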
Unlike a pure rule-based firewall, the model learns from real captured attack traffic generated in the actual deployment environment rather than a generic public dataset. Unlike a pure ML detector, it layers deterministic rules on top of the model so obvious attacks are caught immediately. The result is a hybrid that is both fast and accurate, with a dashboard for human oversight and approval.
System Design
The system is split into three independent components that communicate through documented contracts. Each component can be run, tested, and modified without changing the others.
The detection layer. Opens a raw socket on the network interface and captures packets in real time using Scapy. Aggregates packets per source address into ten-second rolling windows and computes twelve statistical flow features per window. Runs deterministic rules first, then a Random Forest model for ambiguous cases. When an attack is detected above a confidence threshold, appends a JSON event to a shared contract file and writes a heartbeat so the agent can verify the detector is alive.
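The rules-then-model ordering can be sketched as below. The rule bodies, feature names, and the 0.8 threshold are illustrative assumptions, not the deployed values; the point is the control flow, in which obvious attacks short-circuit the model entirely.

```python
def feature_vector(features):
    # Hypothetical fixed feature ordering fed to the model.
    return [features["pkt_count"], features["syn_ratio"], features["unique_dst_ports"]]

def classify_window(features, model, threshold=0.8):
    """Hybrid classifier sketch: deterministic rules first, then the
    trained Random Forest for everything the rules pass over."""
    # Rule tier: unambiguous signatures are flagged with full confidence.
    if features["unique_dst_ports"] > 100:
        return "port_scan", 1.0
    if features["syn_ratio"] > 0.9 and features["pkt_count"] > 500:
        return "syn_flood", 1.0
    # ML tier: pick the most probable class from the model.
    probs = model.predict_proba([feature_vector(features)])[0]
    best = max(range(len(probs)), key=probs.__getitem__)
    label, confidence = model.classes_[best], probs[best]
    if label != "benign" and confidence < threshold:
        return "benign", confidence  # below threshold: do not alert
    return label, confidence
```

Keeping the rules deterministic means a textbook port scan never waits on model inference, while the confidence threshold keeps marginal model outputs from triggering kernel-level blocks.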
The enforcement layer. Tails the shared JSON events file, parses each new alert, and applies iptables DROP rules at the kernel level. Supports automatic mode (blocks immediately) and manual mode (queues for human approval). Blocks auto-expire after five minutes. A tabbed dashboard shows live alerts, pending approvals, blocked addresses, and the detector log. The agent also launches and monitors the detector subprocess.
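The tail-parse-block loop can be sketched as follows. The `src_ip` field name is an assumption about the shared contract, and the command runner is injectable so the sketch can be exercised without root; the real agent also handles manual-approval queuing, the SQLite log, and the dashboard.

```python
import json
import subprocess
import time
from pathlib import Path

BLOCK_SECONDS = 300  # blocks auto-expire after five minutes

def drop_cmd(ip):
    """iptables rule that drops all traffic from a source address."""
    return ["iptables", "-I", "INPUT", "-s", ip, "-j", "DROP"]

def undrop_cmd(ip):
    return ["iptables", "-D", "INPUT", "-s", ip, "-j", "DROP"]

class Enforcer:
    """Consumes detector events from the shared JSONL file and applies
    kernel-level blocks (automatic mode only in this sketch)."""
    def __init__(self, events_path, run=subprocess.run, now=time.time):
        self.path = Path(events_path)
        self.offset = 0    # byte offset: only new lines are read each poll
        self.blocked = {}  # ip -> expiry timestamp
        self.run, self.now = run, now

    def poll(self):
        """Read new events, block new attackers, expire old blocks."""
        with self.path.open() as f:
            f.seek(self.offset)
            for line in f:
                event = json.loads(line)
                ip = event["src_ip"]  # field name assumed, per the contract
                if ip not in self.blocked:
                    self.run(drop_cmd(ip), check=True)
                    self.blocked[ip] = self.now() + BLOCK_SECONDS
            self.offset = f.tell()
        for ip, expiry in list(self.blocked.items()):
            if self.now() >= expiry:
                self.run(undrop_cmd(ip), check=True)
                del self.blocked[ip]
```

Tracking a byte offset rather than re-reading the file keeps each poll O(new events), and storing expiry timestamps makes the five-minute auto-unblock a simple sweep.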
The cloud intelligence layer. Publishes every detection event and block decision to Firebase Realtime Database. Subscribes to block decisions from every other connected firewall node and applies them locally, so an attack seen on one node pre-emptively protects the others. Runs in a background thread and degrades gracefully if the cloud is unreachable, keeping the core system working in offline mode.
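The graceful-degradation pattern can be sketched as a background publisher that never blocks the detection/enforcement path. The `publish` callable stands in for the real Firebase client (e.g. a Realtime Database push); everything else here is an illustrative wrapper, not the project's actual class.

```python
import queue
import threading

class CloudSync:
    """Background publisher sketch: enqueue events on the hot path,
    push them to the cloud from a worker thread, and swallow network
    failures so the firewall keeps working offline."""
    def __init__(self, publish):
        self.publish = publish          # e.g. a Firebase RTDB push callable
        self.outbox = queue.Queue()
        self.online = True
        self._worker = threading.Thread(target=self._drain, daemon=True)

    def start(self):
        self._worker.start()

    def send(self, event):
        """Called by detector/agent code; never blocks on the network."""
        self.outbox.put(event)

    def _drain(self):
        while True:
            event = self.outbox.get()
            if event is None:           # shutdown sentinel
                self.outbox.task_done()
                return
            try:
                self.publish(event)
                self.online = True
            except Exception:
                self.online = False     # offline mode: core system unaffected
            finally:
                self.outbox.task_done()

    def stop(self):
        self.outbox.put(None)
        self._worker.join(timeout=5)
```

Because the worker is a daemon thread and every exception is contained, a cloud outage costs shared intelligence but never local detection or blocking.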
Data Flow
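The flow runs detector → shared events file → agent → iptables, with every decision mirrored to Firebase and back to peers. An illustrative event record on the detector-to-agent contract is shown below; every field name and value here is an assumption for illustration, not the documented schema.

```python
import json

# Hypothetical detector event; the real contract's fields may differ.
event = {
    "timestamp": "2024-03-01T12:00:00Z",
    "src_ip": "203.0.113.7",
    "attack_type": "port_scan",  # one of the five attack classes
    "confidence": 0.97,          # must exceed the alert threshold
    "window_seconds": 10,
}

# The detector appends one JSON object per line (JSONL); line-oriented
# appends keep the detector-to-agent handoff simple to tail.
line = json.dumps(event)
```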
Evaluation
The model was evaluated under five-fold stratified cross-validation and tested live against thirteen attack variants, including tools and protocols not seen during training.
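Stratified folds matter here because attack classes are far rarer than benign traffic: each fold must keep the overall class balance or some folds would contain no attacks at all. A dependency-free sketch of the fold assignment (the project would in practice use a library implementation such as scikit-learn's StratifiedKFold):

```python
from collections import defaultdict

def stratified_folds(labels, k=5):
    """Spread each class's sample indices evenly across k folds so
    every fold preserves the dataset's overall class proportions."""
    by_class = defaultdict(list)
    for idx, label in enumerate(labels):
        by_class[label].append(idx)
    folds = [[] for _ in range(k)]
    for indices in by_class.values():
        for pos, idx in enumerate(indices):
            folds[pos % k].append(idx)   # round-robin within each class
    return folds
```

Each fold then serves once as the held-out test set while the remaining four train the model, so every captured window is tested exactly once.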
Honest Assessment
These are real constraints, not disclaimers.
Technology
The Team
Designed and built the entire AI component: dataset capture in a controlled lab environment, feature engineering, model training and cross-validation, the hybrid rule-plus-ML classifier, and the live detector script. Responsible for the integration contract between the detection and enforcement tiers.
Built the firewall enforcement layer: the iptables-based blocker, auto-unblock timer, event consumer, local SQLite log, alert store, and the customtkinter dashboard with auto and manual blocking modes. Integrated the agent with the AI detector through the shared events contract.
Implemented the Firebase Realtime Database integration: publishing threat events and block decisions from each node, maintaining the peer block index, and pulling shared blocklists so every connected firewall benefits from what any node detects. Also contributed to project documentation and research.