💡 What are FLOPs (Floating Point Operations)?
FLOPs = Floating Point Operations
They count how many floating-point math operations (multiplications, additions, and so on) your model performs.
More FLOPs = more computing power = more time and money needed.
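To build intuition, here is a minimal sketch (with made-up layer sizes, not from any real model) that counts FLOPs for one fully connected layer by hand: each output needs one multiply and one add per input, so a layer with n inputs and m outputs costs about 2·n·m FLOPs.

# Hand-counting FLOPs for a single linear layer (hypothetical sizes)
n_in, n_out = 1024, 512
flops = 2 * n_in * n_out  # one multiply + one add per (input, output) pair
print(f"~{flops:,} FLOPs per forward pass")  # ~1,048,576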
🧠 Why do FLOPs matter?
- They show how “heavy” a model is.
- Bigger models (like GPT, BERT, ResNet) need more FLOPs.
- More FLOPs = slower training = higher cloud costs.
⚙️ How to check FLOPs in PyTorch
!pip install ptflops

import torch
import torchvision.models as models
from ptflops import get_model_complexity_info

# Measure the cost of one forward pass on a 3x224x224 image
model = models.resnet50()
macs, params = get_model_complexity_info(
    model, (3, 224, 224), as_strings=True, print_per_layer_stat=False
)
# Note: ptflops reports MACs (multiply-accumulate operations), not FLOPs
print(f"MACs: {macs}, Parameters: {params}")

MACs: 4.09 GMac = the model performs about 4.09 billion multiply-accumulate operations for one image (roughly 8.2 billion FLOPs, since one MAC is one multiply plus one add).
Parameters: 25.56 M = the model has 25.56 million values it learns during training.
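If you want raw numbers instead of formatted strings (for example, to convert MACs to FLOPs yourself), here is a small sketch continuing from the snippet above. It assumes as_strings=False makes ptflops return plain numeric values, and uses the common multiply-plus-add approximation of 2 FLOPs per MAC:

# Raw numeric values instead of formatted strings
macs, params = get_model_complexity_info(
    model, (3, 224, 224), as_strings=False, print_per_layer_stat=False
)
flops = 2 * macs  # approximation: 1 MAC = 1 multiply + 1 add = 2 FLOPs
print(f"FLOPs ≈ {flops / 1e9:.2f} G, Parameters: {params / 1e6:.2f} M")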
You can run the full example on Google Colab: https://colab.research.google.com/drive/1uDZ1Q6FjKTE2y_tVCfnZ0OkufUImKafH?usp=sharing
Thank you.