Eddie's Learning Record 40
1. Duration
Monday, December 5th, 2022 - Saturday, December 10th, 2022
2. Learning Records
Read the paper [1] and worked through the code. Refactored the code to accept inputs from different datasets, and also turned eager execution off.
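The core of [1] is routing-by-agreement between capsule layers. Below is a minimal NumPy sketch of the squash nonlinearity and the routing loop; the capsule counts, dimensions, and random inputs are illustrative, not from the paper's architecture.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squash from Sabour et al.: shrinks short vectors toward zero and long
    # vectors to just under unit length, preserving direction.
    sq_norm = np.sum(s * s, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, num_iters=3):
    # u_hat: prediction vectors from lower capsules,
    # shape (n_lower, n_upper, dim).
    n_lower, n_upper, _ = u_hat.shape
    b = np.zeros((n_lower, n_upper))                           # routing logits
    for _ in range(num_iters):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)   # coupling coeffs
        s = (c[..., None] * u_hat).sum(axis=0)                 # weighted sum
        v = squash(s)                                          # output capsules
        b = b + np.einsum('ijk,jk->ij', u_hat, v)              # agreement update
    return v

rng = np.random.default_rng(0)
u_hat = rng.normal(size=(6, 3, 4))   # 6 lower capsules, 3 upper, dim 4
v = dynamic_routing(u_hat)
print(v.shape)   # (3, 4); every output vector has length below 1
```

The agreement term `u_hat · v` raises the logit of any route whose prediction already points the way the upper capsule does, which is what lets the iterations converge without learned routing weights.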
Read the paper [2] and worked through the code. The network attained excellent results on the Fashion-MNIST dataset in only five epochs, but the loss quickly became NaN on the other datasets. On the plus side, my laptop has enough compute power to run the training.
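One common source of NaN losses in capsule networks (not necessarily the cause in my runs) is the division by the vector norm inside squash: an all-zero capsule produces 0/0. A small epsilon under the square root fixes it, as this sketch shows.

```python
import numpy as np

def squash_naive(s):
    norm = np.linalg.norm(s, axis=-1, keepdims=True)
    # Divides by zero when norm == 0, yielding NaN.
    return (norm**2 / (1 + norm**2)) * s / norm

def squash_stable(s, eps=1e-8):
    sq = np.sum(s * s, axis=-1, keepdims=True)
    # eps keeps the denominator strictly positive for zero vectors.
    return (sq / (1 + sq)) * s / np.sqrt(sq + eps)

z = np.zeros((1, 8))                      # a dead capsule
print(np.isnan(squash_naive(z)).any())    # True  - 0/0 propagates NaN
print(np.isnan(squash_stable(z)).any())   # False - stays at zero
```

The same guard applies anywhere a norm appears in the loss (e.g. margin loss), since a single NaN activation poisons every later batch.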
Read the paper [3] and worked through the code. Compared to the Vision Transformer, this code is much simpler.
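CBAM applies channel attention and then spatial attention to a feature map. The NumPy sketch below follows that order; the shared-MLP weights and input are random placeholders, and the paper's 7x7 convolution in the spatial branch is simplified to an unweighted sum of the pooled maps.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    # x: (C, H, W). Avg- and max-pooled channel descriptors pass through a
    # shared two-layer MLP (w1 reduces, w2 restores), then are summed.
    avg = x.mean(axis=(1, 2))
    mx = x.max(axis=(1, 2))
    att = sigmoid(w2 @ np.maximum(w1 @ avg, 0) + w2 @ np.maximum(w1 @ mx, 0))
    return x * att[:, None, None]

def spatial_attention(x):
    # Channel-wise avg and max maps; the paper convolves their concatenation
    # with a 7x7 kernel - replaced here by a plain sum for brevity.
    avg = x.mean(axis=0)
    mx = x.max(axis=0)
    att = sigmoid(avg + mx)
    return x * att[None, :, :]

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 5, 5))        # C=8, H=W=5
w1 = rng.normal(size=(2, 8)) * 0.1    # reduction ratio r=4: 8 -> 2
w2 = rng.normal(size=(8, 2)) * 0.1
y = spatial_attention(channel_attention(x, w1, w2))
print(y.shape)   # (8, 5, 5) - same shape as the input, so CBAM drops into
                 # any conv block as a residual refinement
```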
I had some new ideas about micro-expression spotting, but it seems I don't have much time to pursue them.
[1] S. Sabour, N. Frosst, and G. E. Hinton, "Dynamic Routing Between Capsules." arXiv, 2017. doi: 10.48550/ARXIV.1710.09829.
[2] V. Mazzia, F. Salvetti, and M. Chiaberge, "Efficient-CapsNet: capsule network with self-attention routing," Scientific Reports, vol. 11, no. 1, p. 14634, Jul. 2021, doi: 10.1038/s41598-021-93977-0.
[3] S. Woo, J. Park, J.-Y. Lee, and I. S. Kweon, "CBAM: Convolutional Block Attention Module," in Computer Vision – ECCV 2018, Cham, 2018, pp. 3–19.
Eddie He