This repository contains the architectures, models, logs, etc. pertaining to the SimpleNet paper
(Let's keep it simple: Using simple architectures to outperform deeper architectures): https://arxiv.org/abs/1608.06037
SimpleNet-V1 outperforms deeper and heavier architectures such as AlexNet, VGGNet, ResNet, GoogLeNet, etc. on a series of benchmark datasets, such as CIFAR-10/100, MNIST, and SVHN.
It also achieves higher accuracy (currently [71.50/90.05 and 78.88/93.43*](https://github.com/Coderx7/SimpleNet_Pytorch#imagenet-result)) on ImageNet than VGGNet, ResNet, MobileNet, AlexNet, NIN, SqueezeNet, etc., with only 5.7M parameters.
Slimmer versions of the architecture also hold up very decently against more complex architectures such as ResNet, WRN, and MobileNet.
*78.88/93.43 was achieved using real-imagenet-labels.
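The paired figures above (e.g. 71.50/90.05) are top-1/top-5 accuracy. As a minimal sketch of what those two metrics mean, here is a hypothetical `topk_accuracy` helper with a toy batch; this is illustrative only, not code from this repository:

```python
# Hypothetical illustration (not from the SimpleNet repo): how paired
# "top-1/top-k" classification numbers are computed from classifier scores.

def topk_accuracy(scores, labels, k):
    """Fraction of samples whose true label is among the k highest scores."""
    hits = 0
    for row, label in zip(scores, labels):
        # Indices of the k largest scores for this sample.
        topk = sorted(range(len(row)), key=lambda i: row[i], reverse=True)[:k]
        hits += label in topk
    return hits / len(labels)

# Toy batch: 3 samples, 4 classes.
scores = [
    [0.1, 0.7, 0.1, 0.1],  # top prediction is class 1
    [0.5, 0.2, 0.2, 0.1],  # top prediction is class 0
    [0.3, 0.3, 0.2, 0.2],  # true class 2 only appears in the top 3
]
labels = [1, 3, 2]

top1 = topk_accuracy(scores, labels, 1)
top3 = topk_accuracy(scores, labels, 3)  # ImageNet reporting uses k=5
print(f"{top1:.2f}/{top3:.2f}")  # → 0.33/0.67
```

Top-k accuracy is always at least as high as top-1, which is why the second number in each pair above is larger.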
## Citation
If you find SimpleNet useful in your research, please consider citing:
year={2016}
}
(Check the successor of this architecture at [Towards Principled Design of Deep Convolutional Networks: Introducing SimpNet](https://github.com/Coderx7/SimpNet))
## Other Implementations:
**PyTorch**:
ImageNet result was achieved using simple SGD without hyperparameter tuning for …
| Dataset | Accuracy |
| --- | --- |
| CIFAR100* | **78.37** |
| MNIST | 99.75 |
| SVHN | 98.21 |
| ImageNet | **71.50/90.05 - 78.88/93.43*** |
* Achieved using the PyTorch implementation.
* The second ImageNet result was achieved using real-imagenet-labels.
#### Top CIFAR10/100 results:
…achieved using an ensemble or extreme data-augmentation.
| Model | Accuracy |
| --- | --- |
| VGGNet16 (138M) | 70.5 |
| GoogLeNet (8M) | 68.7 |
| Wide ResNet (11.7M) | 69.6/89.07 |
| SimpleNet (5.7M) | **71.50/90.05** |
Table 6: Slimmed-version results on different datasets