2023-10-16 internship Keria fighting! 🔗: Hoping the results match the effort put in.

TODO 10.16
[ ] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
[ ] SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation
[ ] SpikeBERT: A Language Spikformer Learned from BERT with Knowledge Distillation