# Eddie's Learning Record 24

**Published by:** [Eddie's Learning Records](https://paragraph.com/@eddiehe/)
**Published on:** 2022-11-21
**URL:** https://paragraph.com/@eddiehe/eddies-learning-record-24

## Content

### 1. Duration

Monday, November 14th, 2022 – Saturday, November 19th, 2022

### 2. Learning Record

#### 2.1 Fine-Tuned the ViT Model

Fine-tuned the vision transformer (ViT) model on the cat_and_dog dataset and the flowers dataset. The model achieved approximately 86% accuracy on the flowers dataset. I also refactored the micro-expression spotting code to fit the input shape of the vision transformer, but the results were very poor. I need more time to tune the hyperparameters.

#### 2.2 Learned the Swin Vision Transformer

I found a different vision transformer architecture called the Swin Transformer. I watched a video about it and planned to read the code the following week.

#### 2.3 Learned SL-ViT

I read the paper [1] and refactored the code in TensorFlow so that its structure is similar to the vision transformer code shown in the d2l notebook. The code worked well and gave similar results on the cat_and_dog dataset and the flowers dataset. Reading the code carefully, I found that SL-ViT changes the patch embedding and the multi-head self-attention module of the vision transformer.

### 3. Feeling

I don't feel great even though the code ran well, because I was confined to my dorm for ten days.

### 4. Reference

[1] S. H. Lee, S. Lee, and B. C. Song, "Vision Transformer for Small-Size Datasets." arXiv, 2021. doi: 10.48550/ARXIV.2112.13492.
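The first SL-ViT change mentioned in section 2.3, the patch embedding, is what the paper calls Shifted Patch Tokenization: the image is shifted in the four diagonal directions, the shifted copies are concatenated with the original along the channel axis, and the result is cut into flattened patches. This is a minimal NumPy sketch of that idea, not the author's actual TensorFlow code; the function name, shapes, and the use of `np.roll` (the paper pads with zeros instead) are my assumptions.

```python
import numpy as np

def shifted_patch_tokenization(img, patch=4, shift=2):
    """Sketch of Shifted Patch Tokenization (SPT) from SL-ViT.

    img: (H, W, C) array with H and W divisible by `patch`.
    Returns: (num_patches, patch * patch * 5 * C) array.
    """
    h, w, c = img.shape
    # four diagonal shifts: up-left, up-right, down-left, down-right
    shifts = [(-shift, -shift), (-shift, shift), (shift, -shift), (shift, shift)]
    copies = [img]
    for dy, dx in shifts:
        # assumption: np.roll for simplicity; the paper zero-pads the shifted image
        copies.append(np.roll(img, (dy, dx), axis=(0, 1)))
    stacked = np.concatenate(copies, axis=-1)  # (H, W, 5*C)
    # split into non-overlapping patches and flatten each one
    ph, pw = h // patch, w // patch
    patches = stacked.reshape(ph, patch, pw, patch, 5 * c)
    patches = patches.transpose(0, 2, 1, 3, 4).reshape(ph * pw, patch * patch * 5 * c)
    return patches
```

Because each token now sees the shifted neighbours as extra channels, the embedding carries more local spatial information, which is the paper's remedy for training ViTs on small datasets.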
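The second SL-ViT change in section 2.3, the self-attention module, is the paper's Locality Self-Attention: the fixed softmax scaling is replaced by a learnable temperature, and the diagonal of the score matrix (each token attending to itself) is masked out. A minimal NumPy sketch of a single head, with the temperature passed in as a plain scalar; the function name and shapes are my assumptions:

```python
import numpy as np

def locality_self_attention(q, k, v, temperature):
    """Sketch of Locality Self-Attention (LSA) from SL-ViT.

    q, k, v: (N, d) arrays for N tokens.
    temperature: scalar; learnable in the paper, fixed to sqrt(d) in vanilla ViT.
    """
    scores = q @ k.T / temperature                # (N, N) attention scores
    n = scores.shape[0]
    scores[np.arange(n), np.arange(n)] = -np.inf  # diagonal masking: no self-attention
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability before softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Masking the diagonal forces attention weight onto other tokens, and a learnable (typically smaller) temperature sharpens the attention distribution; together these counteract the overly uniform attention maps ViTs tend to produce on small datasets.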