Grid Disruption Analysis is a comprehensive Flask-based web application that visualizes and analyzes U.S. power grid outage data spanning from 2014 to 2023. This project combines interactive data visualization with advanced machine learning techniques to provide actionable insights for researchers, utility companies, and policymakers.
- Energy researchers and analysts
- Utility companies and grid operators
- Policy makers and government agencies
- Data science enthusiasts
- State-wise Monthly Heatmap: Track outage patterns across states and time
- Hourly Outage Analysis: Identify peak disruption times
- Choropleth Maps: Geographic visualization of outage distribution
- Network Graphs: Visualize connections between outage events
- Forecasting: Prophet-based time series prediction for outage trends
- Anomaly Detection: Isolation Forest algorithm to identify unusual events
- Clustering Analysis: K-Means clustering to group states by outage patterns
- Duration Prediction: Predictive models for outage duration estimation
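To illustrate the anomaly-detection idea, here is a minimal sketch using scikit-learn's `IsolationForest` on synthetic monthly outage counts. The data, column shape, and `contamination` value are illustrative assumptions, not the app's actual code or dataset.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic stand-in for monthly outage counts
# (the real app works on the 2014-2023 outage dataset)
rng = np.random.default_rng(42)
normal_months = rng.poisson(lam=20, size=(118, 1))   # typical activity
extreme_months = np.array([[95], [110]])             # storm-like spikes
X = np.vstack([normal_months, extreme_months])

# contamination is a tunable guess at the anomaly fraction
model = IsolationForest(contamination=0.02, random_state=42)
labels = model.fit_predict(X)   # -1 = anomaly, 1 = normal

anomalies = X[labels == -1].ravel()
print("Flagged monthly counts:", sorted(anomalies))
```

With a clear separation like this, the two injected spikes end up flagged; on real data the contamination rate would need tuning against known extreme events.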
- Seasonal and temporal outage patterns
- Regional vulnerability analysis
- Peak disruption time identification
- Grid resilience enhancement recommendations
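For the regional vulnerability angle, a toy version of the K-Means state grouping might look like the following. The feature values and state assignments are made up for illustration; the real analysis derives features from the outage dataset.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-state features: [avg outages/month, avg duration (hrs)]
# Values are illustrative, not from the real dataset.
features = np.array([
    [12.0, 3.5],   # frequent, short outages
    [11.5, 4.0],
    [2.0, 10.0],   # rare, long outages
    [2.5, 9.0],
])
states = ['California', 'Texas', 'Vermont', 'Maine']

# Group states into two outage-pattern clusters
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = kmeans.fit_predict(features)
for state, label in zip(states, labels):
    print(f"{state}: cluster {label}")
```

The two well-separated pairs land in different clusters, which is the behavior the clustering page visualizes at full scale.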
- Backend: Flask (Python web framework)
- Data Processing: Pandas, NumPy
- Visualization: Plotly, Matplotlib, Seaborn
- Machine Learning: Scikit-learn, Prophet
- Network Analysis: NetworkX
- Frontend: HTML, CSS, Bootstrap
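As a taste of the NetworkX piece of the stack, the sketch below links hypothetical outage events that share a cause; the event IDs, causes, and linking rule are assumptions for illustration, not the project's real schema.

```python
import networkx as nx

# Hypothetical outage events: (event_id, cause) - illustrative only
events = [
    ('E1', 'severe weather'),
    ('E2', 'severe weather'),
    ('E3', 'equipment failure'),
    ('E4', 'severe weather'),
]

# Connect events that share a cause
G = nx.Graph()
G.add_nodes_from(eid for eid, _ in events)
for i, (eid_a, cause_a) in enumerate(events):
    for eid_b, cause_b in events[i + 1:]:
        if cause_a == cause_b:
            G.add_edge(eid_a, eid_b, cause=cause_a)

print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")
```

Here the three severe-weather events form a triangle while the equipment failure stays isolated, the kind of structure the network-graph page renders interactively.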
```bash
# Check Python version (3.8+ required)
python --version

# Ensure pip is installed
pip --version
```

- Clone the repository

```bash
git clone https://github.com/kraryal/Grid_Disruption_Analysis.git
cd Grid_Disruption_Analysis
```

- Set up virtual environment (Recommended)
# Create virtual environment
python -m venv venv
# Activate virtual environment
# On Windows:
venv\Scripts\activate
# On macOS/Linux:
source venv/bin/activate- Install dependencies
```bash
# Install from requirements.txt
pip install -r requirements.txt

# OR install manually if requirements.txt is missing
pip install flask pandas prophet plotly seaborn matplotlib networkx scikit-learn numpy
```

- Verify installation

```bash
python -c "import flask, pandas, prophet, plotly, seaborn, matplotlib, networkx, sklearn; print('✅ All dependencies installed successfully!')"
```

```bash
# Ensure you're in the correct directory
ls -la
# You should see: CODE/ folder, README.md, etc.

# Navigate to CODE directory
cd CODE

# Verify Flask app exists
ls -la app.py
```

```bash
# If data folder doesn't exist, create it
mkdir -p data

# Verify data files are present
ls -la data/
# Should contain your grid outage datasets
```

Optional: create a config check script (`config_check.py`):
```python
import os


def check_setup():
    """Verify the application setup."""
    # Check that we're in the right directory
    if not os.path.exists('app.py'):
        print("❌ app.py not found. Make sure you're in the CODE directory.")
        return False

    # Check data directory
    if not os.path.exists('data'):
        print("⚠️ Data directory not found. Creating...")
        os.makedirs('data', exist_ok=True)

    # Check templates directory
    if not os.path.exists('templates'):
        print("⚠️ Templates directory not found.")
        return False

    # Check static directory
    if not os.path.exists('static'):
        print("⚠️ Static directory not found.")
        return False

    print("✅ Setup verification complete!")
    return True


if __name__ == "__main__":
    check_setup()
```

Create a `.env` file for configuration (optional):
```bash
cat > .env << EOF
FLASK_APP=app.py
FLASK_ENV=development
FLASK_DEBUG=1
PORT=5000
EOF
```

```bash
# Load environment variables
# On Windows:
set FLASK_APP=app.py
set FLASK_ENV=development

# On macOS/Linux:
export FLASK_APP=app.py
export FLASK_ENV=development
```

```bash
# Navigate to CODE directory
cd CODE

# Run the Flask application
python app.py

# Expected output:
# * Running on http://127.0.0.1:5000
# * Debug mode: on
# * Restarting with stat
# * Debugger is active!
```

```text
# Local access
http://127.0.0.1:5000
# or
http://localhost:5000

# Network access (if running with --host=0.0.0.0)
http://YOUR_IP_ADDRESS:5000
```
```python
# Example API calls using the requests library
import requests

# Get monthly trends for a specific state
response = requests.get('http://127.0.0.1:5000/api/monthly_trend?state=California')
data = response.json()

# Predict outage duration
payload = {
    'duration': 120,
    'customers_affected': 5000,
    'demand_loss': 250
}
response = requests.post('http://127.0.0.1:5000/api/predict', json=payload)
prediction = response.json()
```
```python
# Example: programmatic access to core functions
from app import app

# Create an application context
with app.app_context():
    # Example: get data for a specific state
    def get_state_data(state_name):
        # Your data processing logic here
        pass

    # Example: run a prediction model
    def predict_outage_duration(features):
        # Your ML prediction logic here
        pass
```
```bash
# Install development dependencies
pip install -r requirements-dev.txt

# Or install individual dev packages
pip install pytest flask-testing black flake8 isort

# Run tests (if available)
python -m pytest tests/

# Format code with Black
black app.py

# Check code style
flake8 app.py

# Sort imports
isort app.py
```
```python
# config.py - custom configuration file
import os


class Config:
    SECRET_KEY = os.environ.get('SECRET_KEY') or 'dev-secret-key'
    DEBUG = True
    TESTING = False


class ProductionConfig(Config):
    DEBUG = False
    SECRET_KEY = os.environ.get('SECRET_KEY')


class DevelopmentConfig(Config):
    DEBUG = True


class TestingConfig(Config):
    TESTING = True
    DEBUG = True


# Usage in app.py:
# app.config.from_object('config.DevelopmentConfig')
```

- Port Already in Use
```bash
# Find the process using port 5000
lsof -i :5000                  # macOS/Linux
netstat -ano | findstr :5000   # Windows

# Kill the process, or run on a different port
# (app.py binds port 5000; with FLASK_APP=app.py set, flask run can override it)
flask run --port 8000
```

- Module Not Found Errors
```bash
# Reinstall dependencies
pip install --force-reinstall -r requirements.txt

# Check Python path
python -c "import sys; print('\n'.join(sys.path))"

# Ensure the virtual environment is activated
which python  # Should show the venv path
```

- Data Loading Issues
```python
# Debug data loading
import os

print("Current directory:", os.getcwd())
print("Files in data/:", os.listdir('data/') if os.path.exists('data/') else "Data directory not found")
```

- Template Not Found
```bash
# Verify template structure
ls -la templates/
# Should contain: index.html, about.html, etc.
```
```python
# Add debug prints to app.py
import os

if __name__ == '__main__':
    print("Starting Grid Disruption Analysis App...")
    print("Current directory:", os.getcwd())
    print("Checking data availability...")
    app.run(debug=True, host='127.0.0.1', port=5000)
```

```text
data/
├── grid_outages_2014_2023.csv     # Main dataset
├── state_coordinates.json         # Geographic data
├── processed/                     # Processed datasets
│   ├── monthly_aggregated.csv
│   ├── hourly_patterns.csv
│   └── anomaly_scores.csv
└── models/                        # Saved ML models
    ├── prophet_model.pkl
    ├── isolation_forest.pkl
    └── kmeans_clusters.pkl
```
```python
# Example data processing script
import pandas as pd


def load_and_preprocess_data():
    """Load and preprocess grid outage data."""
    # Load the main dataset
    df = pd.read_csv('data/grid_outages_2014_2023.csv')

    # Convert date columns
    df['date'] = pd.to_datetime(df['date'])

    # Create additional features
    df['year'] = df['date'].dt.year
    df['month'] = df['date'].dt.month
    df['hour'] = df['date'].dt.hour
    df['day_of_week'] = df['date'].dt.dayofweek

    return df


# Usage
if __name__ == "__main__":
    data = load_and_preprocess_data()
    print(f"✅ Loaded {len(data)} records")
```

```text
Grid_Disruption_Analysis/
├── CODE/
│   ├── app.py                       # Main Flask application
│   ├── requirements.txt             # Python dependencies
│   ├── config.py                    # Configuration settings
│   ├── models/                      # ML models directory
│   │   ├── __init__.py
│   │   ├── forecasting.py           # Prophet models
│   │   ├── anomaly_detection.py     # Isolation Forest
│   │   └── clustering.py            # K-Means clustering
│   ├── templates/                   # HTML templates
│   │   ├── base.html                # Base template
│   │   ├── index.html               # Homepage
│   │   ├── monthly_trend.html       # Monthly analysis
│   │   ├── predict.html             # Prediction interface
│   │   ├── heatmap.html             # State heatmap
│   │   ├── choropleth.html          # Geographic map
│   │   ├── hourly_heatmap.html      # Hourly patterns
│   │   ├── network_graph.html       # Network visualization
│   │   ├── anomaly.html             # Anomaly detection
│   │   ├── clustering.html          # State clustering
│   │   └── about.html               # About page
│   ├── static/                      # Static files
│   │   ├── css/
│   │   │   └── style.css            # Custom styles
│   │   ├── js/
│   │   │   └── app.js               # JavaScript functionality
│   │   └── images/                  # Images and icons
│   ├── data/                        # Dataset files
│   │   ├── grid_outages_2014_2023.csv
│   │   └── processed/               # Processed data
│   └── utils/                       # Utility functions
│       ├── __init__.py
│       ├── data_processing.py       # Data manipulation
│       └── visualization.py         # Chart generation
├── tests/                           # Test files
│   ├── __init__.py
│   ├── test_app.py                  # App tests
│   └── test_models.py               # Model tests
├── docs/                            # Documentation
│   ├── api.md                       # API documentation
│   └── deployment.md                # Deployment guide
├── .env.example                     # Environment variables template
├── .gitignore                       # Git ignore rules
├── requirements.txt                 # Production dependencies
├── requirements-dev.txt             # Development dependencies
├── Dockerfile                       # Docker configuration
├── docker-compose.yml               # Docker Compose setup
├── README.md                        # This file
└── LICENSE                          # License information
```
```python
# train_models.py - script to retrain models with new data
from models.forecasting import ProphetForecaster
from models.anomaly_detection import AnomalyDetector
from utils.data_processing import load_and_preprocess_data


def retrain_models():
    """Retrain all ML models with the latest data."""
    # Load the latest data
    data = load_and_preprocess_data()

    # Retrain the forecasting model
    forecaster = ProphetForecaster()
    forecaster.fit(data)
    forecaster.save('models/prophet_model_updated.pkl')

    # Retrain anomaly detection
    detector = AnomalyDetector()
    detector.fit(data)
    detector.save('models/isolation_forest_updated.pkl')

    print("✅ All models retrained successfully!")


if __name__ == "__main__":
    retrain_models()
```
```python
# batch_analysis.py - process multiple states or time periods
def batch_analyze_states(states_list):
    """Analyze multiple states in batch."""
    results = {}
    for state in states_list:
        print(f"Processing {state}...")
        # Your analysis logic here (analyze_state is a placeholder)
        results[state] = analyze_state(state)
    return results


# Usage
states = ['California', 'Texas', 'New York', 'Florida']
batch_results = batch_analyze_states(states)
```

Team 155 - Data Analysis Initiative
- Krishna Aryal - @kraryal - Project Lead & Backend Development
- Crystal Vandekerkhove - Data Analysis & Visualization
- Jinesh Patel - Machine Learning & Modeling
This project is licensed under the MIT License - see the LICENSE file for details.
We welcome contributions! Please see our Contributing Guidelines for details.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Real-time data integration with utility APIs
- Docker containerization for easy deployment
- RESTful API endpoints for external integration
- Mobile-responsive UI improvements
- Advanced ML model comparison dashboard
- Automated testing and CI/CD pipeline
- Performance optimization and caching
- Multi-language support
Need help? Here's how to get support:
- Check the Documentation
- Report bugs via GitHub Issues
- Join discussions in GitHub Discussions
- Contact: aryalkris9@gmail.com

⭐ Star this repository if you find it helpful!

Made with ❤️ by Team 155