Stage 1: Model Development
This stage covers building new models, adapting existing ones, and compressing them for efficient deployment.

Model Zoo
- A curated collection of pre-trained models across various tasks such as classification, detection, and segmentation.
- It lets users begin optimization immediately rather than training from scratch.
NetsPresso advantages:
- Offers a wide range of lightweight, mobile-friendly backbones ready for immediate training
- Seamless integration with the full optimization and deployment workflow
Trainer
- Used to train new models or fine-tune existing ones.
- It supports transfer learning and hyperparameter control.
NetsPresso advantages:
- Hardware-aware training workflow; ensures all models are compatible with downstream compression and export
- Training-graph conversion (via torch.fx) maximizes flexibility and enables downstream optimization
- Supports local datasets and Hugging Face integration
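The torch.fx-based graph conversion mentioned above can be illustrated with a minimal sketch. This uses plain PyTorch, not the NetsPresso API, and the toy model is a hypothetical example:

```python
import torch
import torch.nn as nn


class TinyNet(nn.Module):
    """A hypothetical toy model used only to illustrate tracing."""

    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        return self.relu(self.conv(x))


# Symbolically trace the model into a torch.fx GraphModule.
# The resulting graph is an editable intermediate representation,
# which is what makes compression and export transformations possible.
model = TinyNet().eval()
traced = torch.fx.symbolic_trace(model)

# The traced module is functionally equivalent to the original.
x = torch.randn(1, 3, 16, 16)
assert torch.allclose(model(x), traced(x))

# Each node in the graph can be inspected or rewritten.
for node in traced.graph.nodes:
    print(node.op, node.target)
```

Because the traced graph is a data structure rather than opaque Python code, tools can rewrite it (e.g. replacing or removing layers) before re-exporting a working module.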
Compressor
- Reduces the size and complexity of a trained model through methods like Structured Pruning and Filter Decomposition.
- These techniques help maintain accuracy while significantly reducing model size and inference time.
NetsPresso advantages:
- Hardware-aware compression: Structured pruning and filter decomposition methods are tailored for target hardware, maximizing real-world efficiency on edge devices.
- Visual compression profiling: Instantly see which layers can be compressed and preview the impact, using Studio’s visual interface.
- Automatic fine-tuning support: After compression, models can be fine-tuned to recover accuracy with just a few clicks or a single command.
- Seamless workflow: Compressed models remain fully compatible with the next steps (quantization, conversion, benchmarking) with no manual rework needed.
- Extensive model support: Wide range of architectures supported, including MobileNet, ResNet, YOLO, EfficientFormer, and more.
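The two compression methods above can be sketched with NumPy on a single convolution weight tensor. This is an illustrative sketch of the underlying math, not the NetsPresso implementation; the shapes, keep ratio, and target rank are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conv weight: (out_channels, in_channels, kH, kW)
W = rng.standard_normal((16, 8, 3, 3))

# --- Structured pruning: drop whole filters with the smallest L2 norm ---
keep_ratio = 0.5  # assumed ratio; normally chosen per layer and per hardware
norms = np.linalg.norm(W.reshape(W.shape[0], -1), axis=1)
n_keep = int(W.shape[0] * keep_ratio)
keep_idx = np.sort(np.argsort(norms)[-n_keep:])  # strongest filters survive
W_pruned = W[keep_idx]  # (8, 8, 3, 3): the layer now emits 8 channels

# --- Filter decomposition: low-rank SVD factorization of the weight matrix ---
rank = 4  # assumed target rank
M = W.reshape(W.shape[0], -1)       # (16, 72) flattened weight matrix
U, S, Vt = np.linalg.svd(M, full_matrices=False)
A = U[:, :rank] * S[:rank]          # (16, 4) factor
B = Vt[:rank]                       # (4, 72) factor
M_approx = A @ B                    # low-rank reconstruction of M

print(f"pruned filters: {W.shape[0]} -> {W_pruned.shape[0]}")
print(f"decomposition params: {M.size} -> {A.size + B.size}")
```

Pruning removes entire output channels, so the next layer's input shrinks too; decomposition replaces one large layer with two smaller ones (here 1152 parameters become 352). In practice both are followed by fine-tuning to recover accuracy.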