# Chapter 1: From Sequence Modeling to the Transformer

Natural language is humanity's most important carrier of information, and one of its defining characteristics is **sequentiality**: the order of words carries meaning. How to make machines understand and generate these ordered, variable-length symbol sequences is a core problem of Natural Language Processing (NLP).

The path toward solving this problem traces one of the most exciting stretches of deep learning's history: from the serial processing of recurrent neural networks (RNNs), to long short-term memory (LSTM) networks overcoming the vanishing gradient problem, to attention mechanisms teaching models "where to look", finally converging in the arrival of the Transformer architecture. None of these innovations appeared out of thin air; each was designed to fix a fundamental shortcoming of the previous generation.

This chapter retraces that evolution to help readers understand: why is the Transformer designed the way it is? What problems did it solve that earlier approaches could not? And why was it able to displace every previously dominant architecture so quickly?
