<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Hallucination on Blowfish</title>
    <link>https://huggingaha.github.io/tags/%E5%B9%BB%E8%A7%89/</link>
    <description>Recent content in Hallucination on Blowfish</description>
    <generator>Hugo -- gohugo.io</generator>
    <language>zh-cn</language>
    <managingEditor>huggingaha@gmail.com (时影)</managingEditor>
    <webMaster>huggingaha@gmail.com (时影)</webMaster>
    <copyright>© 2026 时影</copyright>
    <lastBuildDate>Sun, 07 Sep 2025 00:00:00 +0000</lastBuildDate>
    <atom:link href="https://huggingaha.github.io/tags/%E5%B9%BB%E8%A7%89/index.xml" rel="self" type="application/rss+xml"/>
    <item>
      <title>Why Language Models Hallucinate: Deconstructing LLM Hallucinations</title>
      <link>https://huggingaha.github.io/blogs/llm/llm-hallucinate/</link>
      <pubDate>Sun, 07 Sep 2025 00:00:00 +0000</pubDate>
      <author>huggingaha@gmail.com (时影)</author>
      <guid>https://huggingaha.github.io/blogs/llm/llm-hallucinate/</guid>
      <description/>
    </item>
    <item>
      <title>Are LLM Hallucinations Inevitable?</title>
      <link>https://huggingaha.github.io/blogs/llm/llm-hallucinations-taxonomy/</link>
      <pubDate>Sat, 16 Aug 2025 00:00:00 +0000</pubDate>
      <author>huggingaha@gmail.com (时影)</author>
      <guid>https://huggingaha.github.io/blogs/llm/llm-hallucinations-taxonomy/</guid>
      <description/>
    </item>
  </channel>
</rss>