Dongyue (Oliver) Li
I am a Ph.D. student in the Khoury College of Computer Sciences at Northeastern University, where I started in Fall 2021. I am fortunate to work with Professor Hongyang R. Zhang.
My current research interest lies in designing principled methodologies for
(i) learning with weakly supervised datasets, such as limited labeled data or noisy data;
(ii) identifying task transfers and constructing multitask learning systems;
(iii) analyzing and learning from network data.
I am interested in developing such approaches in transfer learning, multitask learning, contrastive learning, and data augmentation.
I obtained my bachelor's degree in Computer Science from Shanghai Jiao Tong University, with a minor in Mathematics and Applied Mathematics. I was a member of the IEEE Honor Class, an elite student program in the School of Electronics, Information and Electrical Engineering.
Email / CV / Github
- Identification of Negative Transfers in Multitask Learning Using Surrogate Models
Dongyue Li, Huy L. Nguyen, Hongyang R. Zhang
Transactions on Machine Learning Research (TMLR), 2023 (Featured Certification)
[Code] [Presented in NeurIPS Workshop on Distribution Shifts (DistShift), 2022]
- Generalization in Graph Neural Networks: Improved PAC-Bayesian Bounds on Graph Diffusion
Haotian Ju, Dongyue Li, Aneesh Sharma, Hongyang R. Zhang
International Conference on Artificial Intelligence and Statistics (AISTATS), 2023
[Code]
- Optimal Intervention on Weighted Networks via Edge Centrality
Dongyue Li, Tina Eliassi-Rad, Hongyang R. Zhang
SIAM International Conference on Data Mining (SDM), 2023
[Code] [Presented in KDD Workshop on Epidemiology meets Data Mining and Knowledge Discovery (epiDAMIK), 2022]
- Robust Fine-Tuning of Deep Neural Networks with Hessian-based Generalization Guarantees
Haotian Ju*, Dongyue Li*, Hongyang R. Zhang
International Conference on Machine Learning (ICML), 2022
[Code] [Presented in ICML Workshop on Updatable Machine Learning (UpML), 2022]
- Improved Regularization and Robustness for Fine-tuning in Neural Networks
Dongyue Li, Hongyang R. Zhang
Advances in Neural Information Processing Systems (NeurIPS), 2021
[Code]
Prior to my Ph.D. studies
- DTQAtten: Leveraging Dynamic Token-based Quantization for Efficient Attention Architecture
Tao Yang, Dongyue Li, Zhuoran Song, Yilong Zhao, Fangxin Liu, Zongwu Wang, Zhezhi He, and Li Jiang
Conference on Design, Automation and Test in Europe (DATE), 2022
- AdaptiveGCN: Efficient GCN Through Adaptively Sparsifying Graphs
Dongyue Li*, Tao Yang*, Lun Du, Zhezhi He, and Li Jiang
International Conference on Information and Knowledge Management (CIKM), 2021. Short paper.
- PIMGCN: A ReRAM-based PIM Design for Graph Convolutional Network Acceleration
Tao Yang, Dongyue Li, Yibo Han, Yilong Zhao, Fangxin Liu, Xiaoyao Liang, Zhezhi He, and Li Jiang
Design Automation Conference (DAC), 2021
- ReRAM-Sharing: Fine-Grained Weight Sharing for ReRAM-Based Deep Neural Network Accelerator
Dongyue Li*, Zhuoran Song*, Zhezhi He, Xiaoyao Liang, and Li Jiang
International Symposium on Circuits and Systems (ISCAS), 2021
- Personalized and Environment-Aware Battery Prediction for Electric Vehicles
Dongyue Li*, Guangyu Li*, Bo Jiang*, Zhengping Che, and Yan Liu
KDD Workshop on Mining and Learning from Time Series (MiLeTS), 2021
Asterisk (*) indicates equal contribution.
Teaching Assistant
CS 7140: Advanced Machine Learning, Northeastern University, Spring 2023
Full-time Researcher
Shanghai Qi Zhi Institute, Shanghai, Aug. 2020 to Jun. 2021
Research Focus: Graph Neural Networks, Efficient Machine Learning, and System Co-design
Advisor: Prof. Li Jiang
Research Intern
DiDi Chuxing AI Labs, Beijing, Jul. 2019 to Sep. 2019
Research Focus: Spatio-Temporal Data Mining
Advisor: Prof. Yan Liu