Considering that training is extracting the main features of a dataset, there is always some data that gets discarded as “noise” in the process. Then, when data is generated, that discarded information is filled back in with actual random noise that only partially replicates the original data.
Iterate that loop and you’re going to end up with progressively less meaningful features. I just didn’t expect it to take only 5 iterations; that’s a lot of feature loss per training run, even with so many parameters.
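A toy way to see the effect: fit a distribution to samples, resample from the fit, and repeat. This is a sketch, not anything like real model training; I’m using a made-up Zipf-like “vocabulary” of 1000 features so the long tail is easy to lose. Any tail item that doesn’t show up in a sample gets probability zero in the fit, so it can never come back in later generations. That discarded tail is exactly the information that gets replaced by sampling noise.

```python
import random
from collections import Counter

def fit_and_resample(samples, n):
    # "Training": estimate the distribution from the data. Any feature
    # the sample missed gets probability zero, i.e. it is discarded.
    counts = Counter(samples)
    items = list(counts)
    weights = [counts[i] for i in items]
    # "Generation": draw fresh random data from the fitted distribution.
    return random.choices(items, weights=weights, k=n)

random.seed(1)
# Ground truth: a long-tailed vocabulary of 1000 hypothetical "features".
truth = list(range(1000))
tail_weights = [1.0 / (rank + 1) for rank in truth]  # Zipf-like tail
data = random.choices(truth, weights=tail_weights, k=2000)

support = [len(set(data))]        # distinct features still present
for generation in range(5):
    data = fit_and_resample(data, 2000)
    support.append(len(set(data)))
print(support)
```

The printed list of surviving-feature counts only ever goes down: each generation resamples from the previous generation’s support, so a lost feature stays lost, and after 5 rounds a noticeable chunk of the tail is gone.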