
# Alignment

## Papers

### 2023

- (2023-08) Aligning Large Language Models with Human: A Survey — paper
- (2023-05) LIMA: Less Is More for Alignment — paper
- (2023-05) RL4F: Generating Natural Language Feedback with Reinforcement Learning for Repairing Model Outputs — paper
- (2023-05) Principle-Driven Self-Alignment of Language Models from Scratch with Minimal Human Supervision — paper
- (2023-05) Improving Language Model Negotiation with Self-Play and In-Context Learning from AI Feedback — paper
- (2023-04) Fundamental Limitations of Alignment in Large Language Models — paper

## Useful Resources

- Awesome-Align-LLM-Human — a collection of papers and resources about aligning large language models (LLMs) with humans.