This is the repository for the paper "DiaHalu: A Dialogue-level Hallucination Evaluation Benchmark for Large Language Models"
[ACL 2024] Benchmarking the Hallucination of Chinese Large Language Models via Unconstrained Generation
An attack method for inducing hallucinations in LLMs
A curated list of trustworthy deep learning papers, updated daily
Leaderboard Comparing LLM Performance at Producing Hallucinations when Summarizing Short Documents
[ICML 2024] Official implementation for "HALC: Object Hallucination Reduction via Adaptive Focal-Contrast Decoding"
Trustworthy Retrieval Augmented Generation (RAG) with Safeguards
Official implementation for the paper "DoLa: Decoding by Contrasting Layers Improves Factuality in Large Language Models"
An Easy-to-use Hallucination Detection Framework for LLMs.
TruthX: Alleviating Hallucinations by Editing Large Language Models in Truthful Space
The full pipeline for creating the UHGEval hallucination dataset
Official repo for SAC3: Reliable Hallucination Detection in Black-Box Language Models via Semantic-aware Cross-check Consistency
mPLUG-HalOwl: Multimodal Hallucination Evaluation and Mitigating
The implementation for the EMNLP 2023 paper "Beyond Factuality: A Comprehensive Evaluation of Large Language Models as Knowledge Generators"
[TruthGPT](https://github.com/SingularityLabs-ai/TruthGPT-mini) for Google
✨✨Woodpecker: Hallucination Correction for Multimodal Large Language Models. The first work to correct hallucinations in MLLMs.
DCR-Consistency: Divide-Conquer-Reasoning for Consistency Evaluation and Improvement of Large Language Models
An application for testing LLM-generated interpretations of medical observations. The explanations are generated fully automatically by a large language model. This application is intended for experimental use only; it does not support real-world cases and does not replace advice from medical professionals.