Day 22 (Ahhhhhh)
The original model fared better. Here’s a result from using the unpaired dataset.
The colors are pretty dull and there are weird outlines between the sky and the trees/buildings, but it looks better at least.
And here's an RGB-to-thermal translation.
I realize I should probably include the ground truth for all of these. There are thousands of test images, but I might cut the set down so it's easier for me to match up images for the presentation.
As for the paired dataset, the output images somehow look worse, and the graphs from training were also strange. Most of the images have white dots over them, and while there are warmer colors, the blue is gone… I'm wondering if I need to change the model more for it to properly use the two groups of images.
I spent the rest of the day working on my presentation and testing different changes to the model/training code, ultimately ending up with two new models I want to try. They're going to be trained for 50 epochs with the unpaired dataset.
- The first one saves the model every 5 epochs instead of only after high PSNRs, and uses LeakyReLU in the generator instead of ReLU [session 2, saved under 6]
- The second one saves the model every 5 epochs only (not affected by PSNR at all), and uses LeakyReLU in the generator [session 1, saved under 7]
- I'm also retraining the original model with the paired dataset because I messed up the file holding the saved parameters before I could save the RGB to thermal images [session 4, saved under 3]
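To keep the two checkpointing policies straight for myself, here's a minimal sketch of the logic I have in mind. The function names and signatures are my own shorthand, not the actual training code, and the plain-Python PSNR is just a stand-in for whatever the framework computes:

```python
import math

def psnr(img_a, img_b, max_val=255.0):
    """Peak signal-to-noise ratio between two equally sized images,
    given here as flat lists of pixel values (a toy stand-in for the
    usual NumPy/torch implementation)."""
    mse = sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_val ** 2 / mse)

def should_save(epoch, current_psnr, best_psnr, interval=5, use_psnr=True):
    """Checkpointing policy sketch: save every `interval` epochs, and
    (if use_psnr) also whenever PSNR sets a new best. The second new
    model corresponds to use_psnr=False, i.e. interval-only saving."""
    on_schedule = epoch % interval == 0
    new_best = use_psnr and current_psnr > best_psnr
    return on_schedule or new_best
```

So the first model would call `should_save(..., use_psnr=True)` and the second `should_save(..., use_psnr=False)`; the original behavior was PSNR-only, with no fixed interval.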
Hopefully I'll see some improvement in the output images when I test tomorrow.