former glory
Jul 11, 2011

That seems like overfitting. Not knowing the data here, I'd start by checking that it's homogeneous: that you don't have a big set of similar samples in training and then a different-looking set in test. This can happen if you don't randomly split the two sets and instead flat-copy files into test, since filenames often group classes (or similar samples within a class) together. Your test set could just be the tail end of each class listing.
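If that's the problem, shuffling within each class before carving off the test fraction fixes it. A minimal sketch in plain Python (the `samples` list of (filename, label) pairs and the function name are made up for illustration):

```python
import random

def stratified_split(samples, test_frac=0.2, seed=0):
    """Split (filename, label) pairs into train/test, shuffling within
    each class so test isn't just the tail end of each class listing."""
    by_class = {}
    for fname, label in samples:
        by_class.setdefault(label, []).append(fname)

    rng = random.Random(seed)  # fixed seed so the split is reproducible
    train, test = [], []
    for label, files in by_class.items():
        rng.shuffle(files)
        n_test = max(1, int(len(files) * test_frac))
        test += [(f, label) for f in files[:n_test]]
        train += [(f, label) for f in files[n_test:]]
    return train, test
```

Same idea as `sklearn.model_selection.train_test_split` with `stratify=labels`, if you'd rather not roll it yourself.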

You could also be feeding in very high-res data, in which case you'd either need a lot more data, or you could just downscale at the input if that's not already happening.

Augmentation could improve your acc, but we're talking single-digit % typically. I'm mostly familiar with TF, but PyTorch has that built into the front end as well (torchvision's transforms). But this disparity points to some more fundamental issue with the data imo.
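For what it's worth, those front-end ops boil down to cheap array work. Here's a crude NumPy sketch of a downscale plus random horizontal flip, with nearest-neighbor striding standing in for real interpolation (in PyTorch you'd just chain `torchvision.transforms.Resize` and `RandomHorizontalFlip` instead):

```python
import random
import numpy as np

def downscale_and_flip(img, rng, target=128):
    """Crude stand-in for a resize + random-horizontal-flip augmentation.

    img: HxWxC uint8 array. Striding approximates nearest-neighbor
    downscaling; a real pipeline would use proper interpolation.
    """
    step_h = max(1, img.shape[0] // target)
    step_w = max(1, img.shape[1] // target)
    small = img[::step_h, ::step_w]
    if rng.random() < 0.5:  # flip left-right half the time
        small = small[:, ::-1]
    return small
```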

e: I just saw that you're trying to ID cars by mfg. with a CNN. Assuming it's a bunch of random images off the net at all sorts of different angles and views, that's going to be hard to get accurate with 2000 samples. But I think you should still be able to see better test acc. than that. Lowering the input res and making sure the classes are uniformly distributed across train and test should really boost it.

former glory fucked around with this message at 15:56 on Jun 23, 2022
