OpenAI has found that larger batch sizes (tens of thousands for image classification and language modeling, and millions in the case of RL agents) serve well for scaling and parallelizability.
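To make the batch-size knob concrete, here is a minimal NumPy sketch of mini-batch iteration; the data shapes and the `minibatches` helper are hypothetical, chosen just to show how batch size trades off the number of update steps against the work per step:

```python
import numpy as np

def minibatches(X, y, batch_size, rng=None):
    """Yield shuffled (X_batch, y_batch) pairs. A larger batch_size means
    fewer, bigger steps, each of which parallelizes well on accelerators."""
    rng = rng or np.random.default_rng(0)
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        sel = idx[start:start + batch_size]
        yield X[sel], y[sel]

# Hypothetical dataset: 10,000 samples of 32 features each.
X = np.zeros((10_000, 32))
y = np.zeros(10_000)

# With batch_size=256 the epoch takes 40 steps; with batch_size=10_000
# (the "tens of thousands" regime, scaled down) it takes a single step.
steps = sum(1 for _ in minibatches(X, y, 256))
print(steps)  # 40
```

In real training this loop would feed an optimizer; the point here is only that batch size directly sets how much data each parallel update consumes.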
Usually, you will get more of a performance boost from adding more layers than from adding more neurons to each layer. In general, using the same number of neurons for all hidden layers will suffice. For some datasets, having a large first layer and following it up with smaller layers will lead to better performance, as the first layer can learn many lower-level features that feed into a few higher-order features in the subsequent layers. When working with image or speech data, you'd want your network to have dozens to hundreds of layers, not all of which might be fully connected. For these use cases, there are pre-trained models (YOLO, ResNet, VGG) that let you reuse large parts of their networks and train your model on top of them to learn only the higher-order features. In that case, your model will still have only a few layers to train.
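The depth-versus-width tradeoff can be made concrete by counting parameters in a dense network. The layer sizes below are hypothetical, picked only to compare a wide-shallow net, a deep net with equal-sized hidden layers, and a "large first layer, then smaller" funnel:

```python
def param_count(layer_sizes):
    """Parameters in a fully connected net: a weight matrix plus a bias
    vector between every pair of consecutive layers."""
    return sum(m * n + n
               for m, n in zip(layer_sizes[:-1], layer_sizes[1:]))

# Hypothetical task: 100 input features, 10 output classes.
wide_shallow = [100, 512, 10]             # one big hidden layer
deep_uniform = [100, 128, 128, 128, 10]   # same-size hidden layers, deeper
funnel       = [100, 256, 64, 10]         # large first layer, then smaller

print(param_count(wide_shallow))  # 56842
print(param_count(deep_uniform))  # 47242
print(param_count(funnel))        # 42954
```

Note that the deeper uniform net and the funnel both use fewer parameters than the single wide layer while composing features across more stages, which is the intuition behind preferring depth and, where it helps, a funnel shape.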