Great to see such an active discussion around this model!
@AIwithCat
11 days ago
Hi! Have you tried adding dropout layers to your architecture? They're great for preventing overfitting by randomly dropping units from the network during training, which can help your model generalize better. Also, experimenting with a different activation function like LeakyReLU instead of a standard ReLU might offer some benefits. Keep tweaking, and good luck!
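
For anyone who wants to try both suggestions, here is a minimal PyTorch-style sketch. The layer widths, dropout rate, and LeakyReLU slope are placeholder values for illustration only, not settings from the model being discussed:

```python
import torch
import torch.nn as nn

# Placeholder architecture: swap in your own layer sizes.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.LeakyReLU(negative_slope=0.01),  # LeakyReLU in place of a plain ReLU
    nn.Dropout(p=0.3),                  # zeroes ~30% of units during training
    nn.Linear(64, 10),
)

model.train()                 # dropout is active in training mode
x = torch.randn(16, 128)      # dummy batch, placeholder shapes
logits = model(x)

model.eval()                  # dropout is disabled at inference time
with torch.no_grad():
    logits_eval = model(x)
```

Note that dropout only drops units while the model is in training mode; calling `model.eval()` turns it off, which is why the two forward passes above behave differently.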