Vincent-Daniel Yun

Graduate Research Assistant at USC

MSCS student at University of Southern California

I am Daniel, a first-year MSAI student at the University of Southern California, researching deep learning foundations and LLM optimization. I am fortunate to be supervised by Prof. Sai Praneeth Karimireddy and Prof. Vatsal Sharan at USC. I also work closely with Prof. Sunwoo Lee at Inha University.

My research focuses on improving optimization, generalization, and model compression techniques to make neural networks lighter, faster, and more efficient at learning from data.

My current research interests:

  • Efficient and effective optimization and training dynamics of neural networks
  • Model pruning, quantization, and compression techniques
  • Deep learning foundations

My current affiliations include:

  • Theory Group at University of Southern California (CS Theory)
  • Large-Scale Machine Learning Systems Lab at Inha University (LMLS)
  • Open Neural Network Research Lab at MODULABS (OpenNN)

Prior to joining USC, I completed my undergraduate studies in Computer Science and Applied Mathematics at Stony Brook University, where I was supervised by Prof. Chao Chen.



Recent news

Nov, 2025
Two papers were submitted to a top computer vision conference.
Sep, 2025
Our paper entitled "MedCLM: Learning to Localize and Reason via a CoT-Curriculum in Medical Vision-Language Models" was submitted to an NLP conference.
[Paper]
Sep, 2025
[NeurIPS 2025 OPT]
Congratulations! Our paper entitled "Sharpness-Aware Minimization with Z-Score Gradient Filtering" has been accepted at the Conference on Neural Information Processing Systems (NeurIPS) 2025 OPT Workshop.
[Paper] [Workshop]
Sep, 2025
[NeurIPS 2025 OPT]
Congratulations! Our paper entitled "SGD Convergence under Stepsize Shrinkage in Low-Precision Training" has been accepted at the Conference on Neural Information Processing Systems (NeurIPS) 2025 OPT Workshop.
[Paper] [Workshop]
Sep, 2025
[NeurIPS 2025 OPT]
Congratulations! Our paper entitled "Insights from Gradient Dynamics: Gradient Autoscaled Normalization" has been accepted at the Conference on Neural Information Processing Systems (NeurIPS) 2025 OPT Workshop.
[Paper] [Workshop]
Sep, 2025
[CIKM 2025 HCAI]
Congratulations! Our paper entitled "Fast Fourier Transform-Based Spectral and Temporal Gradient Filtering for Differential Privacy" has been accepted at the CIKM 2025 Human-Centric AI Workshop.
[Paper] [Workshop]
Jul, 2025
[ICONIP 2025]
Congratulations! Our paper entitled "Revisiting 16-bit Neural Network Training: A Practical Approach for Resource-Limited Learning" has been accepted as an oral presentation at the International Conference on Neural Information Processing (ICONIP) 2025 (top 8% acceptance rate).
[Paper]
Nov, 2024
[AAAIW 2025]
Congratulations! Our paper entitled "ZNorm: Z-Score Gradient Normalization Accelerating Skip-Connected Network Training Without Architectural Modification" has been accepted at the AAAI 2025 Workshops: AI for Research and Scalable, Efficient Systems.
[Paper] [Workshop]
Nov, 2024
[IEEE BigDataW 2024]
Congratulations! Our paper entitled "Mitigating Gradient Overlap in Deep Residual Networks with Gradient Normalization for Improved Non-Convex Optimization" has been accepted at the IEEE International Conference on Big Data (BigData) Optimization Workshop (BPOD).
[Paper] [Workshop]
Sep, 2024
[ESA SPAICE 2024]
Congratulations! Our paper entitled "Analysis and Predictive Modeling of Solar Coronal Holes Using Computer Vision and ARIMA-LSTM Networks" has been accepted at SPAICE2024: The First Joint European Space Agency / IAA Conference on AI in and for Space.
[Paper] [Conference]
Mar, 2024
[IJCNN 2024]
Congratulations! Our paper entitled "Robust Neural Pruning with Gradient Sampling Optimization for Residual Neural Networks" has been accepted as an oral presentation at the International Joint Conference on Neural Networks (IJCNN).
[Paper]
Mar, 2024
[CVPRW 2024]
Congratulations! Our paper entitled "Uncertainty Estimation for Tumor Prediction with Unlabeled Data" has been accepted at the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshop.
[Paper]