Yao Lai
The University of Hong Kong

I am a Ph.D. candidate in the Department of Computer Science at the University of Hong Kong (HKU), affiliated with mmlab@HKU and fortunately advised by Prof. Ping Luo. I am also honored to have the opportunity to work with the UTDA lab at The University of Texas at Austin (UT Austin), advised by Prof. David Z. Pan. Previously, I received my M.Eng. degree from the School of Software at Tsinghua University (THU) and my B.Eng. degree from the Department of Microelectronics at Fudan University (FDU). My research interests include AI for Electronic Design Automation (AI4EDA), AI for security, and other possible applications.

Curriculum Vitae

Education
  • The University of Hong Kong
    Ph.D. in Computer Science
    Sep. 2021 - Jul. 2025
  • Tsinghua University
    M.Eng. in Software Engineering
    Sep. 2017 - Jul. 2020
  • Fudan University
    B.Eng. in Microelectronics Science and Engineering
    Sep. 2013 - Jul. 2017
Experience
  • The University of Texas at Austin
    Visiting Research Student
    Feb. 2024 - Jul. 2024
Honors & Awards
  • NeurIPS Scholar Award
    2024
  • Hong Kong PhD Fellowship
    2021
  • Outstanding Graduate of Software School, Tsinghua University
    2020
  • Outstanding Bachelor Thesis Award, Fudan University
    2017
  • Outstanding Graduate of Shanghai, China
    2017
  • National Scholarship, China
    2015
Selected Publications
AnalogCoder: Analog Circuit Design via Training-Free Code Generation

Yao Lai, Sungyoung Lee, Guojin Chen, Souradip Poddar, Mengkang Hu, David Z. Pan, Ping Luo

AAAI Conference on Artificial Intelligence (AAAI) 2025

AnalogCoder is a training-free LLM agent for analog circuit design that uses feedback-driven prompts and a circuit library to achieve high success rates, outperforming GPT-4o and successfully designing 25 circuits.

Scalable and Effective Arithmetic Tree Generation for Adder and Multiplier Designs

Yao Lai, Jinxin Liu, David Z. Pan, Ping Luo

Conference on Neural Information Processing Systems (NeurIPS) 2024 Spotlight

This work uses reinforcement learning to optimize adder and multiplier designs, formulated as tree generation tasks, achieving up to 49% faster speed and 45% smaller size, with scalability to 7nm technology.

ChiPFormer: Transferable Chip Placement via Offline Decision Transformer

Yao Lai, Jinxin Liu, Zhentao Tang, Bin Wang, Jianye Hao, Ping Luo

International Conference on Machine Learning (ICML) 2023

ChiPFormer is an offline RL-based method that achieves 10x faster chip placement with superior quality and transferability to unseen circuits.

MaskPlace: Fast Chip Placement via Reinforced Visual Representation Learning

Yao Lai, Yao Mu, Ping Luo

Conference on Neural Information Processing Systems (NeurIPS) 2022 Spotlight

MaskPlace is a method that leverages pixel-level visual representation for chip placement, achieving superior performance with simpler rewards, 60%-90% wirelength reduction, and zero overlaps.
