hallucination
Here are 35 public repositories matching this topic...
QuantHaLL: Quantifying Hallucination in machine translation for Low-resource Languages
Updated Apr 22, 2024 - Jupyter Notebook
Hallucinate - GPT - LLM - AI Chat - OpenAI - Sam Altman info
Updated Jan 1, 2024
OpenAI Assistant using the Code Interpreter tool.
Updated May 20, 2024 - Python
🔢Hallucination detector for Large Language Models.
Updated Mar 5, 2024
Hallucination-free LLM: TruthGPT for Google is a browser-extension version of TruthGPT (developed by Labs) that integrates TruthGPT with Google search results.
Updated Jan 22, 2024
[NAACL24] Official Implementation of Mitigating Hallucination in Abstractive Summarization with Domain-Conditional Mutual Information
Updated Mar 27, 2024 - Python
Verify outputs generated by LLMs, backed by real-time data.
Updated Apr 22, 2024 - Python
Knowledge Verification to Nip Hallucination in the Bud
Updated Mar 10, 2024 - Python
Controlled HALlucination-Evaluation (CHALE) Question-Answering Dataset
Updated May 22, 2024 - Python
Re-implementation of the paper "Chain-of-Verification Reduces Hallucination in Large Language Models". Developed as the final project for the Advanced Deep Learning course (DD3412) at KTH.
Updated Dec 15, 2023 - Python
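As a rough sketch of the Chain-of-Verification (CoVe) loop that the repository above re-implements: draft an answer, plan verification questions, answer them independently of the draft so its errors cannot leak into the checks, then revise. The `llm` callable here is an assumption standing in for any text-generation API, not the project's actual interface.

```python
def chain_of_verification(query, llm):
    """Reduce hallucination by drafting, self-checking, and revising.

    `llm` is any callable mapping a prompt string to a response string
    (an illustrative stand-in for a real model API).
    """
    # 1. Draft an initial, possibly hallucinated, baseline response.
    draft = llm(f"Answer the question: {query}")
    # 2. Plan verification questions probing the facts in the draft.
    questions = [q for q in
                 llm(f"List verification questions for: {draft}").splitlines()
                 if q.strip()]
    # 3. Answer each question independently, WITHOUT showing the draft,
    #    so mistakes in the draft cannot bias the checks.
    answers = [llm(q) for q in questions]
    # 4. Revise the draft in light of the collected evidence.
    evidence = "\n".join(f"Q: {q}\nA: {a}" for q, a in zip(questions, answers))
    return llm(f"Revise the answer to '{query}'.\nDraft: {draft}\nChecks:\n{evidence}")
```

Plugging in a mock `llm` shows the four-stage flow without calling a real model.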
[ACL 2024] An Easy-to-use Hallucination Detection Framework for LLMs.
Updated May 18, 2024 - Python
This repository contains the code of our paper 'Skip \n: A simple method to reduce hallucination in Large Vision-Language Models'.
Updated Feb 12, 2024 - Python
An explainable sentence similarity measurement
Updated May 8, 2021 - Jupyter Notebook
[NLPCC 2024] Shared Task 10: Regulating Large Language Models
Updated May 19, 2024
[CVPR 2018] Face Super-resolution with Supplementary Attributes
Updated Jun 26, 2018 - Lua
"Enhancing LLM Factual Accuracy with RAG to Counter Hallucinations: A Case Study on Domain-Specific Queries in Private Knowledge-Bases" by Jiarui Li, Ye Yuan, and Zehua Zhang.
Updated Mar 18, 2024 - HTML
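The retrieval-augmented generation (RAG) pattern this repository applies can be sketched minimally: retrieve passages from a private knowledge base and constrain the model to answer only from them, so answers are grounded in documents rather than parametric memory. The word-overlap retriever and `llm` callable below are illustrative assumptions, not the authors' code (a real system would use dense embeddings).

```python
def retrieve(query, knowledge_base, k=2):
    """Return the k passages with the most word overlap with the query.

    Toy keyword retriever; a production RAG stack would rank by
    embedding similarity instead.
    """
    words = set(query.lower().split())
    ranked = sorted(knowledge_base,
                    key=lambda p: len(words & set(p.lower().split())),
                    reverse=True)
    return ranked[:k]

def rag_answer(query, knowledge_base, llm, k=2):
    """Ground the model's answer in retrieved context to curb hallucination."""
    context = "\n".join(retrieve(query, knowledge_base, k))
    # Instructing the model to answer only from the retrieved context
    # is the anti-hallucination lever in this pattern.
    return llm(f"Using only this context:\n{context}\nAnswer: {query}")
```

The `llm` argument is again a stand-in callable, so the grounding step can be tested independently of any model provider.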