Table of Contents

  • 1. Introduction

  • 2. Importing Data in a Colab Notebook

    • 2.1 Importing data from local storage

    • 2.2 Importing data from Google Drive

    • 2.3 Importing data from a remote URL

    • 2.4 Importing data from a GitHub repository

    • 2.5 Importing data from Google Cloud Storage (GCS)

  • 3. Data Wrangling

    • 3.1 Handling missing values

    • 3.2 Dealing with outliers

    • 3.3 Resampling and aggregation

    • 3.4 Handling inconsistent formats

    • 3.5 Feature engineering

    • 3.6 Lag Plots

    • 3.7 Normalization and scaling

  • 4. Time Series Concepts

    • 4.1 What is a Time Series?

    • 4.2 Applications of Time Series Analysis

    • 4.3 Types of Time Series Data

    • 4.4 Time Series Components

    • 4.5 Time Series Decomposition

    • 4.6 Stationarity and Non-stationarity

  • 5. Exploratory Data Analysis (EDA) for Time Series

    • 5.1 Visualizing Time Series Data

    • 5.2 Identifying Trends and Seasonality

      • 5.2.1 Additive and Multiplicative Decomposition

      • 5.2.2 Moving Averages (Smoothing)

      • 5.2.3 STL Decomposition (Seasonal-Trend decomposition using LOESS)

    • 5.3 ACF and PACF plots

  • 6. Data Preparation

    • 6.1 Data cleaning

    • 6.2 Time alignment

    • 6.3 Resampling

    • 6.4 Smoothing

    • 6.5 Differencing

    • 6.6 Normalization

    • 6.7 Feature engineering

  • 7. Stationarity

    • 7.1 How to make a time series stationary

      • 7.1.1 Differencing and its Importance

        • Example Code

      • 7.1.2 Seasonal Decomposition

  • 8. A Suite of Classical Time Series Forecasting Methods

    • 8.1 Time Series Forecasting

    • 8.2 Concerns of Forecasting

    • 8.3 Examples of Time Series Forecasting

    • 8.4 Classical time series forecasting methods

  • 9. Metrics

    • 9.1 Mean Absolute Error (MAE)

    • 9.2 Mean Error (ME)

    • 9.3 Root Mean Squared Error (RMSE)

    • 9.4 Mean Absolute Percentage Error (MAPE)

    • 9.5 Symmetric Mean Absolute Percentage Error (SMAPE)

    • 9.6 Theil’s U-Statistic

    • 9.7 Mean Relative Absolute Error (MRAE)

  • 10. Classical models

    • 10.1 Autoregression (AR)

    • 10.2 Moving Average (MA)

    • 10.3 Autoregressive Moving Average (ARMA)

    • 10.4 Autoregressive Integrated Moving Average (ARIMA)

    • 10.5 Seasonal Autoregressive Integrated Moving-Average (SARIMA)

    • 10.6 Seasonal Autoregressive Integrated Moving-Average with Exogenous Regressors (SARIMAX)

    • 10.7 Vector Autoregression (VAR)

    • 10.8 Vector Autoregression Moving-Average (VARMA)

    • 10.9 Vector Autoregression Moving-Average with Exogenous Regressors (VARMAX)

    • 10.10 Simple Exponential Smoothing (SES)

    • 10.11 Holt-Winters Exponential Smoothing (HWES)

  • 11. Deep Learning for Time Series Forecasting

  • 12. Model selection

    • 12.1 Recurrent Neural Networks (RNNs)

      • 12.1.1 Single layer LSTM

      • 12.1.2 Stacked LSTM

      • 12.1.3 Bidirectional LSTM

      • 12.1.4 Encoder-Decoder LSTM

      • 12.1.5 Attention-based LSTM

      • 12.1.6 Hybrid RNN model

    • 12.2 Convolutional Neural Networks (CNNs)

      • 12.2.1 1D CNN

      • 12.2.2 Dilated CNN

      • 12.2.3 Temporal Convolutional Network (TCN)

      • 12.2.4 ConvLSTM

      • 12.2.5 Hybrid CNN models

    • 12.3 Transformer Models

      • 12.3.1 Transformer for Time Series Analysis

      • 12.3.2 Implementation Details for TSA

      • 12.3.3 Challenges and Considerations

      • 12.3.4 Recent Developments

      • 12.3.5 Conclusion

    • 12.4 Autoencoders

      • 12.4.1 What is an Autoencoder?

      • 12.4.2 Autoencoders in Time Series Analysis

      • 12.4.3 Implementation Details for TSA

      • 12.4.4 Challenges and Considerations

      • 12.4.5 Conclusion

    • 12.5 Generative Adversarial Networks (GANs)

      • 12.5.1 Basics of GAN

      • 12.5.2 GANs in Time Series Analysis

      • 12.5.3 Implementation Details for TSA

      • 12.5.4 Challenges and Considerations

      • 12.5.5 Conclusion

  • 13. Feature Engineering

  • 14. Preprocessing using Deep Learning

    • 14.1 Handling missing values using an RNN

    • 14.2 Outlier detection using an autoencoder

    • 14.3 Handling seasonality and trends using a CNN

  • 15. Time Series Analysis toolkits

    • 15.1 Scikit-learn

      • 15.1.1 How to use

      • 15.1.2 Example Code

    • 15.2 Statsmodels

      • 15.2.1 How to use

      • 15.2.2 Example Code

    • 15.3 Pandas

      • 15.3.1 How to use

      • 15.3.2 Example Code

    • 15.4 Prophet

      • 15.4.1 How to use

      • 15.4.2 Example Code

    • 15.5 sktime

      • 15.5.1 How to use

      • 15.5.2 Example Code

    • 15.6 Tslearn

      • 15.6.1 How to use

      • 15.6.2 Example Code

    • 15.7 Darts

      • 15.7.1 How to use

      • 15.7.2 Example Code

    • 15.8 PyFlux

      • 15.8.1 How to use

      • 15.8.2 Example Code

    • 15.9 tsfresh

      • 15.9.1 How to use

      • 15.9.2 Example Code

    • 15.11 Pastas

      • 15.11.1 How to use

      • 15.11.2 Example Code

  • 16. NeuralProphet

    • 16.1 Installation

    • 16.2 Basic Usage

      • 16.2.1 Importing Libraries

      • 16.2.2 Generating Synthetic Data

      • 16.2.3 Model Training

      • 16.2.4 Forecasting

      • 16.2.5 Manual Visualization of Forecast

      • 16.2.6 Visualizing Components

      • 16.2.7 Visualizing Change Points

      • 16.2.8 Manual Visualization of Change Points

      • 16.2.9 Plotting with the Plotly Backend