The WashU NLP Group
The Natural Language Processing Group at Washington University is a team of faculty and students working on understanding and modeling natural language. We work on a broad range of fundamental NLP topics, including large language models, deep learning models, language representation and generation, structured prediction, knowledge graphs, language and vision, responsible models (including security, robustness, interpretability, and fairness), neural-symbolic models, and code understanding and generation, as well as applications of NLP to science, biomedicine, math, economics, and blockchain.
We now have postdoc openings! Please see this page for details.
Image credit: the group's passport to the Metaverse, based on readyplayer.me
Recent News
- Nov, 2024: Invited talks at Google DeepMind and Adobe.
- Nov, 2024: Glad to receive the X-Camp Academy Research Award.
- Oct, 2024: We released a preprint on evaluating LLM-based judges.
- Sep, 2024: Our research is the top story in the WashU Record! Congrats to Nick and Kyle.
- Sep, 2024: One paper is accepted to IEEE S&P. [Paper].
- Sep, 2024: Our research was covered by the WashU Record.
- Aug, 2024: Re-Tuning is out! [Paper] [Code] [Slides] [Poster].
- May, 2024: Social is now released! [Paper] [Dataset] [Code] [Slides] [Poster].
- May, 2024: Re-Tuning is accepted to ACL 2024.
- Apr, 2024: AgentInstruct is accepted to ICML 2024.
- Apr, 2024: Take a look at the recent interview with MIT Technology Review China (DeepTech) about our recent research on STEM and other projects [Originally in Chinese] [Google Translate]!
- Apr, 2024: Congrats to Kyle on his successful defense!
- Apr, 2024: STEM is now released; try it out to better understand your foundation models! [Paper] [Leaderboard] [Dataset] [Code] [Slides] [Poster].
- Apr, 2024: Thrilled to receive the Google Research Scholar Award!
- Apr, 2024: The Social paper is out. Stay tuned for our code and datasets!
- Apr, 2024: Congrats to David and Josh on receiving WashU CSE Awards!
- Mar, 2024: “Enhancing Global Estimation of Fine Particulate Matter Concentrations by Including Geophysical a Priori Information in Deep Learning” is accepted to ACS ES&T Air 2024.
- Mar, 2024: “RoZ benchmark” is accepted to a CVPR 2024 workshop.
- Mar, 2024: “Measuring Social Norms of Large Language Models” is accepted to NAACL 2024.
- Mar, 2024: “Benchmarking Zero-Shot Robustness of Multimodal Foundation Models: A Pilot Study” is released. 📃 [Paper] 💻 [Github].
- Mar, 2024: “Agent Instructs Large Language Models to be General Zero-Shot Reasoners” is accepted to an ICLR 2024 workshop.
- Mar, 2024: “Evaluating Large Language Models in an Emerging Domain: A Pilot Study in Decentralized Finance” is accepted to an ICLR 2024 workshop.
- Feb, 2024: The STEM paper is out. Stay tuned for our code and dataset release!
- Feb, 2024: “Preference Poisoning Attacks on Reward Model Learning” is released.
- Jan, 2024: “Measuring Vision-Language STEM Skills of Neural Models” is accepted to ICLR 2024.
- Nov, 2023: AgentInstruct’s code is out! 📃 [Paper] 💻 [Github] 🤗 [HuggingFace] 📌 [Blog] 📽 [Slides] 📋 [Poster]
- Oct, 2023: “Agent Instructs Large Language Models to be General Zero-Shot Reasoners” is released.
- Jul, 2023: “Practical Membership Inference Attacks Against Large-Scale Multi-Modal Models: A Pilot Study” is accepted to ICCV 2023.
- Apr, 2023: “CodeIPPrompt: Intellectual Property Infringement Assessment of Code Language Models” is accepted to ICML 2023.
- Oct, 2022: Four papers are accepted to EMNLP 2022.
- Oct, 2022: The WashU NLP group was launched.