Bit-serial Weight Pools: Compression and Arbitrary Precision Execution of Neural Networks on Resource Constrained Processors
Shurui Li, Puneet Gupta
2022 · Open Access · DOI: https://doi.org/10.48550/arxiv.2201.11651
Applications of neural networks on edge systems have proliferated in recent years, but ever-increasing model sizes make it difficult to deploy neural networks efficiently on resource-constrained microcontrollers. We propose bit-serial weight pools, an end-to-end framework that combines network compression with acceleration at arbitrary sub-byte precision. By sharing a pool of weights across the entire network, the framework achieves up to 8x compression compared to 8-bit networks. We further propose a bit-serial, lookup-based software implementation that allows a runtime-bitwidth tradeoff and achieves more than 2.8x speedup and 7.5x storage compression compared to 8-bit weight-pool networks, with less than 1% accuracy drop.
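The abstract does not spell out the mechanics, but the two ingredients it names, a single pool of weights shared across the whole network and bit-serial execution that processes one weight bit-plane at a time, can be sketched in a few lines. The C snippet below is a minimal illustration under assumed details: an 8-entry pool of unsigned 8-bit weights addressed by small per-weight indices, and a plain bit-plane dot product in `bitserial_pool_dot`. The pool size, data types, and function names are illustrative choices, not the paper's actual lookup-based implementation.

```c
#include <stdint.h>
#include <stdio.h>

/* Illustrative sizes only (assumptions, not values from the paper). */
#define POOL_SIZE 8   /* shared weight pool: 8 entries -> 3-bit indices */
#define VEC_LEN   16  /* length of one dot product                      */
#define MAX_BITS  8   /* full weight precision in bits                  */

/* Bit-serial dot product of activations x with weights drawn from a
 * shared pool. Each weight is stored only as a small index into `pool`,
 * so per-layer storage shrinks to the index array. Weights are processed
 * one bit-plane at a time; the caller chooses how many bit-planes
 * (`bits`) to evaluate, trading accuracy for speed. */
int32_t bitserial_pool_dot(const int16_t *x,    /* activations          */
                           const uint8_t *idx,  /* pool indices         */
                           const uint8_t *pool, /* shared 8-bit weights */
                           int len, int bits)
{
    int32_t acc = 0;
    /* Start from the most significant bit-plane so that truncating
     * `bits` keeps the highest-value contributions. */
    for (int b = MAX_BITS - 1; b >= MAX_BITS - bits; --b) {
        int32_t plane = 0;
        for (int i = 0; i < len; ++i) {
            uint8_t w = pool[idx[i]];      /* fetch shared pool weight */
            if ((w >> b) & 1u)             /* bit b of the weight set? */
                plane += x[i];             /* add the activation once  */
        }
        acc += plane * (int32_t)(1 << b);  /* scale by the plane value */
    }
    return acc;
}

int main(void)
{
    int16_t x[VEC_LEN];
    uint8_t idx[VEC_LEN];
    uint8_t pool[POOL_SIZE] = {3, 17, 42, 64, 90, 128, 200, 255};

    for (int i = 0; i < VEC_LEN; ++i) {
        x[i] = (int16_t)(i * 5 - 30);
        idx[i] = (uint8_t)(i % POOL_SIZE);
    }

    /* Full precision vs. a 4-bit-plane approximation of the same sum. */
    printf("8-bit planes: %ld\n", (long)bitserial_pool_dot(x, idx, pool, VEC_LEN, 8));
    printf("4-bit planes: %ld\n", (long)bitserial_pool_dot(x, idx, pool, VEC_LEN, 4));
    return 0;
}
```

Evaluating the most significant bit-planes first means the `bits` argument can be lowered at runtime to trade a little accuracy for fewer additions, which is the kind of runtime-bitwidth tradeoff the abstract describes; only the small pool and the per-weight indices need to be stored, which is where the weight-pool compression comes from.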
Concepts
- Computer science
- Speedup
- Artificial neural network
- Byte
- Parallel computing
- Compression (physics)
- Data compression
- Enhanced Data Rates for GSM Evolution
- Lookup table
- Algorithm
- Computer hardware
- Operating system
- Artificial intelligence
- Materials science
- Composite material
Metadata
- Type: preprint
- Language: en
- Landing Page: http://arxiv.org/abs/2201.11651
- PDF: https://arxiv.org/pdf/2201.11651
- OA Status: green
- Cited By: 2
- Related Works: 10
- OpenAlex ID: https://openalex.org/W4221142407