Monday, January 2nd, 2023 - Saturday, January 7th, 2023
Okay, I found out why the model performed so badly: shuffling the validation dataset produced terrible results. Different batch_size values and normalization choices also led to different outcomes.
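The validation-shuffle bug can be reproduced without TensorFlow at all: if the validation labels get shuffled independently of the predictions they are compared against, even a perfect model scores at chance level. A minimal sketch in plain Python with made-up data (not my actual pipeline):

```python
import random

random.seed(0)

# A "perfect" model: every prediction equals the true label.
labels = [i % 5 for i in range(1000)]   # 5 balanced classes
predictions = list(labels)

def accuracy(preds, targets):
    """Fraction of positions where prediction matches the label."""
    return sum(p == t for p, t in zip(preds, targets)) / len(targets)

# Aligned validation set: the metric is perfect.
print(accuracy(predictions, labels))     # 1.0

# Shuffling the labels independently of the predictions misaligns
# every (prediction, label) pair, so accuracy collapses to roughly
# chance level (~0.2 for 5 classes).
shuffled = list(labels)
random.shuffle(shuffled)
print(accuracy(predictions, shuffled))
```

This is why the validation split should stay unshuffled (or at least have its features and labels shuffled together): the metric is only meaningful when each prediction is scored against its own label.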
The error said I had hit the 2 GB buffer-size limit of the TensorFlow dataset. I tried the from_generator function, but unfortunately it worked even worse; I couldn't even finish the micro-expression training.
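For context, the 2 GB limit shows up when the whole dataset is materialized into the graph at once (e.g. via from_tensor_slices on a giant array); a generator instead yields samples one at a time, so memory stays bounded. The idea behind from_generator can be sketched in plain Python — the sample shapes and helper names here are hypothetical, not from my actual pipeline:

```python
def sample_stream(n_samples, frame_shape=(4, 4)):
    """Yield (frame, label) pairs one at a time instead of
    materializing the whole dataset in memory (the idea behind
    tf.data.Dataset.from_generator)."""
    for i in range(n_samples):
        # Placeholder "frame": in a real pipeline this would be
        # a clip loaded lazily from disk.
        frame = [[float(i)] * frame_shape[1] for _ in range(frame_shape[0])]
        label = i % 3
        yield frame, label

def batched(stream, batch_size):
    """Lazily group a sample stream into fixed-size batches."""
    batch = []
    for item in stream:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:                      # final partial batch
        yield batch

batches = list(batched(sample_stream(10), batch_size=4))
print([len(b) for b in batches])   # [4, 4, 2]
```

The trade-off, which matches what I saw, is that generators feed data through Python at run time, so they can be much slower than an in-memory tensor pipeline.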
Furthermore, the learning rate seemed to have a huge impact on the outcomes, so I had no idea whether the model itself was bad or the learning rate just didn't fit.
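To show how much the learning rate alone can change the outcome, here is a toy gradient descent on f(x) = x² (nothing to do with my actual model): for this loss the update is x ← x·(1 − 2·lr), so any lr below 1 converges while anything above it diverges.

```python
def gradient_descent(lr, steps=50, x0=1.0):
    """Minimize f(x) = x^2 (gradient 2x) with a fixed learning rate."""
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x        # equivalent to x *= (1 - 2 * lr)
    return x

# Converges: |1 - 2*0.1| = 0.8 < 1, so x shrinks toward the minimum.
print(abs(gradient_descent(lr=0.1)))   # ~1.4e-5

# Diverges: |1 - 2*1.1| = 1.2 > 1, so x grows without bound.
print(abs(gradient_descent(lr=1.1)))   # ~9.1e3
```

With the same "model" and the same data, one learning rate looks fine and the other looks hopeless, which is exactly why a bad run can't distinguish a bad model from a bad learning rate.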
Finally, I finished the NTCE test on Saturday, January 7th, 2023. Then I could focus more on the experiment.
Eddie He