Yiping Lu

Undergraduate

Department of Scientific & Engineering Computing
School of Mathematical Sciences
Peking University
Elite Undergraduate Training Program of the School of Mathematical Sciences, Peking University (both applied math and pure math tracks)

Email: luyiping9712 [at] pku [dot] edu [dot] cn, yplu [at] stanford [dot] edu
Contact: No. 5 Yiheyuan Road, Beijing, 100871, People's Republic of China


Produced by L0 image smoothing and a Canny edge detector.

Original image: see here.

Biography [CV]

I am a final-year undergraduate student in the School of Mathematical Sciences at Peking University, majoring in Information and Computing Science, advised by Prof. Bin Dong. I also enjoy working with Prof. Liwei Wang on theoretical machine learning. I was a visiting student at MIT CSAIL under the supervision of Prof. Justin Solomon during summer 2018. I am currently a research intern at Microsoft Research Asia (MSRA), working on Bayesian deep learning in the Visual Computing Group (mentor: David Wipf). During the internship, I also work closely with the Machine Learning Group at MSRA. In summer 2019, I will visit the University of Tokyo and RIKEN AIP to work with Prof. Taiji Suzuki.

I am always open to visiting research positions and collaboration opportunities. If you are interested, please don't hesitate to contact me by e-mail or WeChat; my contact information is listed above, and more details can be found in the introductions to my projects.

Here is a slide deck summarizing my current research. [slide]

Research Statement (2019/05): [pdf link]

I will join the Stanford Institute for Computational and Mathematical Engineering as a Ph.D. student in 2019.

Research Interest

 •(Stochastic) Dynamical Systems View of Deep Learning (Neural ODEs)

Computational Tools For Imaging And Graphics

 •Sparse Representation and Dictionary Learning of Images.
 •Geometric Partial Differential Equations/Control Problems on Graphs.
 •Kernel Learning, Nonlocal PDEs, Gaussian Processes, and Deep Learning.
 •Machine Learning with Limited Labels.

Keywords: Numerical Differential Equations, Image Processing, Deep Learning, Wavelet Analysis, Inverse Problems.

Recent And Representative Publications

Full Publication List

To see the full publication list, click here.

Reviews:

1. Dynamic System and Optimal Control Perspective of Deep Learning: [pdf] (ACML 2018, Tutorial Track [link])

Research Papers:

      
2019
Yiping Lu*, Zhuohan Li*, Di He, Zhiqing Sun, Bin Dong, Tao Qin, Liwei Wang, Tie-Yan Liu. "Understanding and Improving Transformer From a Multi-Particle Dynamic System Point of View." (*equal contribution) Submitted. arXiv preprint arXiv:1906.02762.

[ paper] [ arXiv] [ slide] [Code]

Highlight! ODEs can also be used in NLP! We show that the Transformer can be mathematically interpreted as a numerical ordinary differential equation (ODE) solver for a convection-diffusion equation in a multi-particle dynamical system.
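The core correspondence can be sketched in a few lines: a residual update of the form x + F(x), as used in Transformer sublayers, coincides with one explicit Euler step of the ODE dx/dt = F(x) with step size 1. Here F is a toy stand-in for an attention/feed-forward sublayer, not the paper's actual model.

```python
import numpy as np

def F(x):
    # toy stand-in for an attention / feed-forward sublayer
    return np.tanh(x)

def residual_layer(x):
    # Transformer-style residual update: x_{l+1} = x_l + F(x_l)
    return x + F(x)

def euler_step(x, dt=1.0):
    # one explicit (forward) Euler step of the ODE dx/dt = F(x)
    return x + dt * F(x)

x0 = np.array([0.5, -1.0, 2.0])
# with dt = 1 the two updates are identical
assert np.allclose(residual_layer(x0), euler_step(x0))
```

Viewing the step size dt as tunable is what opens the door to other, higher-order numerical schemes.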
Dinghuai Zhang*, Tianyuan Zhang*, Yiping Lu*, Zhanxing Zhu, Bin Dong. "You Only Propagate Once: Painless Adversarial Training Using Maximal Principle." (*equal contribution) Submitted. arXiv preprint arXiv:1905.00877.

ICML 2019 Workshop on Security and Privacy of Machine Learning

[ paper] [ arXiv] [ slide] [ poster] [Code]

Highlight! ODEs can help accelerate adversarial training! Adversarial training does not require excessive computational resources. We fully exploit the structure of deep neural networks by recasting adversarial training as a differential game, and propose a novel strategy to decouple the adversary update from the gradient back-propagation. Try our code!
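For context, here is a minimal sketch of the PGD inner loop that standard adversarial training repeats many times per batch, and whose repeated full back-propagation this line of work aims to avoid. The toy linear model and all names are illustrative assumptions, not from the paper.

```python
import numpy as np

def loss_grad(x, w, y):
    # gradient of the squared loss (w.x - y)^2 with respect to the input x
    return 2 * (w @ x - y) * w

def pgd_attack(x, w, y, eps=0.1, alpha=0.02, steps=10):
    # projected gradient ascent on the loss within an L-infinity eps-ball
    x_adv = x.copy()
    for _ in range(steps):
        x_adv = x_adv + alpha * np.sign(loss_grad(x_adv, w, y))
        x_adv = np.clip(x_adv, x - eps, x + eps)  # project back onto the ball
    return x_adv

x = np.array([1.0, -0.5])
w = np.array([0.3, 0.8])
y = 0.0
x_adv = pgd_attack(x, w, y)
assert (w @ x_adv - y) ** 2 >= (w @ x - y) ** 2  # loss did not decrease
```

In a deep network, each `loss_grad` call is a full backward pass, which is exactly the cost that decoupling the adversary update targets.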
Xiaoshuai Zhang*, Yiping Lu*, Jiaying Liu, Bin Dong. "Dynamically Unfolding Recurrent Restorer: A Moving Endpoint Control Method for Image Restoration." Seventh International Conference on Learning Representations (ICLR), 2019. (*equal contribution)

[ paper] [ arXiv] [code] [ slide] [ project page] [Open Review]

Highlight! In this paper, we propose a new control framework, called moving endpoint control, to restore images corrupted by different degradation levels with a single model. The proposed control problem contains a restoration dynamics modeled by an RNN. The moving endpoint, which is essentially the terminal time of the associated dynamics, is determined by a policy network. We call the proposed model the dynamically unfolding recurrent restorer (DURR). Numerical experiments show that DURR achieves state-of-the-art performance on blind image denoising and JPEG image deblocking. Furthermore, DURR generalizes well to images with higher degradation levels than those included in the training stage.
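A toy illustration of the moving-endpoint idea (all names and dynamics here are hypothetical stand-ins): iterate a restoration step and let a stopping rule, playing the role of the policy network, choose the terminal time, so heavier degradation automatically receives more iterations.

```python
def restore_step(x):
    # toy restoration dynamics: contract toward the "clean" signal 0
    return 0.5 * x

def stop_policy(x, tol=1e-3):
    # stand-in for the learned policy network deciding the endpoint
    return abs(x) < tol

def durr_like(x, max_steps=100):
    # unfold the dynamics until the policy fires; the step count t
    # is the moving endpoint (terminal time)
    t = 0
    while not stop_policy(x) and t < max_steps:
        x = restore_step(x)
        t += 1
    return x, t

x_clean, steps_light = durr_like(0.1)   # light degradation: stops sooner
_, steps_heavy = durr_like(10.0)        # heavy degradation: runs longer
assert steps_light < steps_heavy
```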
2018
Yiping Lu, Aoxiao Zhong, Quanzheng Li, Bin Dong. "Beyond Finite Layer Neural Networks: Bridging Deep Architectures and Numerical Differential Equations." Thirty-fifth International Conference on Machine Learning (ICML), 2018.

[paper] [arXiv] [project page] [slide][ bibtex][Poster]

Highlight! This work bridges deep neural network design with numerical differential equations. We show that many effective networks can be interpreted as different numerical discretizations of differential equations. This finding gives us a brand-new perspective on the design of effective deep architectures: we can take advantage of the rich knowledge in numerical analysis to guide the design of new and potentially more effective deep networks. As an example, we propose a linear multi-step architecture (LM-architecture), inspired by the linear multi-step method for solving ordinary differential equations.
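As a sketch of the correspondence (toy dynamics and fixed coefficients, assumed for illustration only): a plain residual block is a forward Euler step, while a two-step linear multi-step update also reuses the previous state; in the paper such coefficients are learned rather than fixed.

```python
def f(x):
    return -x            # toy dynamics dx/dt = -x; the exact solution decays to 0

def resnet_step(x, dt=0.1):
    # plain residual block = forward Euler: x_{n+1} = x_n + dt * f(x_n)
    return x + dt * f(x)

def lm_step(x_prev, x_curr, dt=0.1, k=0.5):
    # two-step update in the spirit of the LM-architecture; k is a fixed
    # scalar here, whereas the paper learns such mixing coefficients
    return (1 - k) * x_curr + k * x_prev + dt * f(x_curr)

# bootstrap the two-step scheme with one Euler step, then iterate
x_prev, x_curr = 1.0, resnet_step(1.0)
for _ in range(50):
    x_prev, x_curr = x_curr, lm_step(x_prev, x_curr)
assert abs(x_curr) < 0.5   # the trajectory decays, as the ODE predicts
```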
Zichao Long*, Yiping Lu*, Xianzhong Ma*, Bin Dong. "PDE-Net: Learning PDEs From Data." Thirty-fifth International Conference on Machine Learning (ICML), 2018. (*equal contribution)

[paper] [arXiv] [code] [Supplementary Materials][ bibtex]

Highlight! This paper is an initial attempt to learn evolution PDEs from data. Inspired by the latest developments in neural network design, we propose a new feed-forward deep network, called PDE-Net, to fulfill two objectives at the same time: to accurately predict the dynamics of complex systems and to uncover the underlying hidden PDE models. The basic idea of PDE-Net is to learn differential operators by learning convolution kernels (filters), and to apply neural networks or other machine learning methods to approximate the unknown nonlinear responses.

We have released a new version of PDE-Net that focuses more on model discovery; please see: Zichao Long, Yiping Lu, Bin Dong. "PDE-Net 2.0: Learning PDEs from Data with a Numeric-Symbolic Hybrid Deep Network." arXiv
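The kernel-as-differential-operator idea can be illustrated in one dimension with a hand-written central-difference filter; PDE-Net instead learns such filters from data (under moment constraints), but the mechanism is the same convolution.

```python
import numpy as np

# sample u(x) = sin(2*pi*x) on a uniform grid
h = 0.01
x = np.arange(0, 1, h)
u = np.sin(2 * np.pi * x)

# a 3-tap convolution kernel acting as the differential operator d/dx:
# convolving with [1, 0, -1] / (2h) yields (u[i+1] - u[i-1]) / (2h)
kernel = np.array([1.0, 0.0, -1.0]) / (2 * h)
du = np.convolve(u, kernel, mode='same')   # approximate du/dx

# compare with the exact derivative away from the zero-padded boundaries
exact = 2 * np.pi * np.cos(2 * np.pi * x)
err = np.max(np.abs(du[2:-2] - exact[2:-2]))
assert err < 1e-2
```

PDE-Net stacks such learnable operator filters with a trainable nonlinear response to form one forward time step of the unknown PDE.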

My Calendar

Please schedule meetings via my Google Calendar account: 2prime97@gmail.com.


© Yiping Lu | Last updated: 04/01/2019

Powered By Bootstrap & Jemdoc


Theory without practice is empty, but equally, practice without theory is blind. ---- I. Kant

People who wish to analyze nature without using mathematics must settle for a reduced understanding. ---- Richard Feynman