Artificial neural networks (ANNs) are well known for their strong classification abilities, and recent advances in deep learning have brought about a second renaissance of ANNs. However, neural networks pose practical problems, such as choosing hyperparameters like the number and sizes of layers, which can greatly influence classification accuracy. Pruning techniques were therefore developed to reduce network size, improve generalization, and combat overfitting. In contrast to approaches that grow neural networks, pruning assumes that a sufficiently large ANN has already been trained and can be simplified with an acceptable loss of classification accuracy. This paper compares node pruning and weight pruning algorithms and reports experimental accuracy rates for pruned networks versus their non-pruned counterparts. We conclude that node pruning is the preferable solution, with some caveats.
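
The distinction between the two pruning families compared here can be sketched as follows. This is a minimal illustrative example, not the paper's actual method: the layer sizes, the magnitude threshold for weight pruning, and the outgoing-weight-norm importance score for node pruning are all assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single hidden layer: W maps inputs to 6 hidden units,
# V maps those hidden units to 2 outputs.
W = rng.normal(size=(4, 6))
V = rng.normal(size=(6, 2))

def prune_weights(W, fraction=0.5):
    """Weight pruning (sketch): zero out the smallest-magnitude weights.
    The matrix keeps its shape; the network merely becomes sparse."""
    threshold = np.quantile(np.abs(W), fraction)
    return np.where(np.abs(W) < threshold, 0.0, W)

def prune_nodes(W, V, keep=4):
    """Node pruning (sketch): drop hidden units whose outgoing weights
    have the smallest norm, shrinking both weight matrices.
    The importance score here is an assumption for illustration."""
    importance = np.linalg.norm(V, axis=1)   # one score per hidden unit
    kept = np.sort(np.argsort(importance)[-keep:])
    return W[:, kept], V[kept, :]

Wp = prune_weights(W)          # same shape as W, but with zeroed entries
Ws, Vs = prune_nodes(W, V)     # structurally smaller: 4x4 and 4x2
```

The contrast motivates the comparison in the paper: weight pruning yields a sparse network of unchanged dimensions, whereas node pruning actually reduces layer sizes.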