Leaderboard Comparing LLM Performance at Producing Hallucinations when Summarizing Short Documents
Updated Apr 30, 2024
✨✨Woodpecker: Hallucination Correction for Multimodal Large Language Models. The first work to correct hallucinations in MLLMs.
Official implementation for the paper "DoLa: Decoding by Contrasting Layers Improves Factuality in Large Language Models"
Trustworthy Retrieval Augmented Generation (RAG) with Safeguards
A curated list of trustworthy deep learning papers, updated daily.
Benchmarking the Hallucination of Chinese Large Language Models via Unconstrained Generation
An attack to induce hallucinations in LLMs
TruthX: Alleviating Hallucinations by Editing Large Language Models in Truthful Space
mPLUG-HalOwl: Multimodal Hallucination Evaluation and Mitigation
Initiative to evaluate and rank the most popular LLMs across common task types based on their propensity to hallucinate.
Repository for the paper "Cognitive Mirage: A Review of Hallucinations in Large Language Models"
An Easy-to-use Hallucination Detection Framework for LLMs.
[ICML 2024] Official implementation for "HALC: Object Hallucination Reduction via Adaptive Focal-Contrast Decoding"
Official repo for SAC3: Reliable Hallucination Detection in Black-Box Language Models via Semantic-aware Cross-check Consistency
The implementation for the EMNLP 2023 paper "Beyond Factuality: A Comprehensive Evaluation of Large Language Models as Knowledge Generators"
DCR-Consistency: Divide-Conquer-Reasoning for Consistency Evaluation and Improvement of Large Language Models
Code for Controlling Hallucinations at Word Level in Data-to-Text Generation (C. Rebuffel, M. Roberti, L. Soulier, G. Scoutheeten, R. Cancelliere, P. Gallinari)
A PyTorch implementation of the paper Thinking Hallucination for Video Captioning.