Changqing Fu


Research Interests: I'm interested in visual generative models and a broad range of topics in deep learning. Currently I'm working on improving neural network architectures for better parameter, training, and inference efficiency and more effective control, within a unified geometric framework.

I'm a senior-year PhD student at CEREMADE, Paris Dauphine University - PSL, and the Paris AI Institute (PRAIRIE), fortunately advised by Laurent D. Cohen. Before that I was a graduate student at the University at Albany, advised by Yiming Ying. I received a Master's degree in Applied and Theoretical Mathematics from PSL University, advised by Olga Mula and Robin Ryder, and a Bachelor's degree from the School of Mathematical Sciences at Fudan University.

Here are my Institutional Page, CV, GitHub, Twitter, and LinkedIn.

Email: cfu-at-ceremade-dot-dauphine-dot-fr, evergreencqfu_at_gmail_dot_com

Projects


DeepPrism. A network design for generative models with unprecedented (roughly 1000x) parameter efficiency, obtained by leveraging an equivariance along the channel dimension; training and inference FLOPs are also greatly reduced. The design applies equally to attention layers and maintains generation performance across a variety of models such as VAEs and Stable Diffusion (LDM).
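For intuition only, here is a minimal PyTorch sketch of one way to tie parameters along the channel dimension so that a convolution becomes equivariant to channel permutations. It illustrates the general idea of channel-wise weight sharing, not the actual DeepPrism layer.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ChannelSharedConv2d(nn.Module):
        # Toy illustration (not the DeepPrism layer): a single k x k spatial
        # kernel is reused for every channel (depthwise), so the layer has
        # k*k parameters instead of C_out*C_in*k*k and is equivariant to
        # permutations of the channels.
        def __init__(self, channels, kernel_size=3):
            super().__init__()
            self.channels = channels
            self.kernel = nn.Parameter(torch.randn(1, 1, kernel_size, kernel_size))
            self.padding = kernel_size // 2

        def forward(self, x):
            # Broadcast the shared kernel to a depthwise weight of shape (C, 1, k, k).
            weight = self.kernel.expand(self.channels, -1, -1, -1).contiguous()
            return F.conv2d(x, weight, padding=self.padding, groups=self.channels)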


Tip of the Iceberg. An improvement on the widely used perceptual loss that achieves roughly 10x better training efficiency for generative models by computing non-singular symmetric features, rather than raw activations, in the path integral along layers.
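As a rough illustration of replacing raw activations with symmetric features, the sketch below computes a perceptual-style loss from Gram-type second-order statistics of intermediate feature maps. The exact features and normalization in the paper may differ, and feature_extractor is an assumed helper returning a list of intermediate activations.

    import torch

    def gram_features(feats):
        # Symmetric second-order statistics (Gram matrix) of a feature map,
        # normalized by the number of spatial positions.
        n, c, h, w = feats.shape
        f = feats.reshape(n, c, h * w)
        return f @ f.transpose(1, 2) / (h * w)

    def symmetric_perceptual_loss(feature_extractor, x, y):
        # Compare symmetric features of x and y at every intermediate layer.
        loss = 0.0
        for fx, fy in zip(feature_extractor(x), feature_extractor(y)):
            loss = loss + torch.mean((gram_features(fx) - gram_features(fy)) ** 2)
        return loss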


Unifying Activation and Attention. A projective-geometric point of view that unifies different operators in neural networks, under which the self-attention function intrinsically coincides with activation-convolution layers once proper kernel functions are defined. This connection does not change the form of commonly used neural networks, can be confirmed visually, and leads to a unified view of the dynamics of neural processes.
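For concreteness, the standard kernel reading of self-attention, which is the kind of identification this viewpoint builds on: for a kernel function κ on query-key pairs,

    \mathrm{Attn}(x)_i \;=\; \sum_j \frac{\kappa(q_i, k_j)}{\sum_{j'} \kappa(q_i, k_{j'})}\, v_j,
    \qquad \kappa(q, k) = \exp\!\left(\frac{q^\top k}{\sqrt{d}}\right),

and different choices of κ then correspond to different operators, in the spirit of the unification above.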


Conic Linear Units. A non-pointwise activation function with an unprecedented infinite-order symmetry group, which improves generation quality and enables the alignment of neural networks of different widths. Applications include more flexible Federated Learning via Optimal Transport.
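As a toy example (not the Conic Linear Unit itself), the following non-pointwise activation rescales each feature vector through its channel-wise norm; since the gate depends only on the norm and the output keeps the input direction, it commutes with every orthogonal transformation of the channel dimension, so its symmetry group is continuous rather than finite.

    import torch
    import torch.nn as nn

    class NormGatedActivation(nn.Module):
        # Toy non-pointwise activation: f(x) = x * relu(||x|| - b) / ||x||,
        # where the norm is taken over the channel axis. Rotating the channels
        # before or after applying f gives the same result.
        def __init__(self, bias=0.0, eps=1e-8):
            super().__init__()
            self.bias = bias
            self.eps = eps

        def forward(self, x):
            # x: (batch, channels, ...)
            norm = x.norm(dim=1, keepdim=True).clamp_min(self.eps)
            return x * torch.relu(norm - self.bias) / norm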


Contour-Based Image Manipulation. A robust, unsupervised two-stage image manipulation model that edits images by modifying contour constraints. Contour constraints are chosen because the edited constraint is sparse, while a coarse-grained post-processing single-image GAN alleviates the distortion caused by out-of-distribution inputs.

Publications

Talks

Reviewer Duties