# Part IV: Frontiers and Practice

- [Chapter 12: Encoder Model Family](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/12_encoder_models.md)
  - [12.1 BERT: A Breakthrough in Bidirectional Understanding](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/12_encoder_models/12.1_bert.md)
  - [12.2 RoBERTa, ALBERT, and ELECTRA: Improving on BERT](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/12_encoder_models/12.2_roberta_albert.md)
  - [12.3 Long-Text Encoders: Longformer and BigBird](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/12_encoder_models/12.3_longformer_bigbird.md)
  - [Chapter Summary](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/12_encoder_models/summary.md)
- [Chapter 13: Decoder Models and Mainstream LLMs](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/13_decoder_models.md)
  - [13.1 The GPT Series: Scaling from Language Models Toward General Intelligence](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/13_decoder_models/13.1_gpt_series.md)
  - [13.2 The Llama Family: How Open Source Reshaped the LLM Landscape](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/13_decoder_models/13.2_llama.md)
  - [13.3 DeepSeek, Gemini, and Other Frontier Models](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/13_decoder_models/13.3_deepseek_gemini.md)
  - [13.4 Encoder-Decoder Models: Design Choices in T5 and BART](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/13_decoder_models/13.4_t5_bart.md)
  - [Chapter Summary](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/13_decoder_models/summary.md)
- [Chapter 14: Architectural Innovations and Future Trends](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/14_future_trends.md)
  - [14.1 Efficient Attention: Breaking the Quadratic-Complexity Bottleneck](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/14_future_trends/14.1_efficient_attention.md)
  - [14.2 Mixture-of-Experts: Why You Need Not Activate Every Parameter](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/14_future_trends/14.2_moe.md)
  - [14.3 State Space Models and Hybrid Architectures: Challengers to Attention](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/14_future_trends/14.3_ssm_hybrid.md)
  - [14.4 Multimodal Transformers: Unifying Representations Across Modalities](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/14_future_trends/14.4_multimodal.md)
  - [14.5 AI Agents and Tool Use: Taking Models from "Saying" to "Doing"](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/14_future_trends/14.5_agent_tool_use.md)
  - [14.6 Test-Time Compute Scaling: Teaching Models to Think Deeply](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/14_future_trends/14.6_test_time_scaling.md)
  - [14.7 Long-Context Techniques: From Theory to Engineering Practice](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/14_future_trends/14.7_long_context.md)
  - [14.8 Outlook](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/14_future_trends/14.8_outlook.md)
  - [Chapter Summary](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/14_future_trends/summary.md)
- [Appendix](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/appendix.md)
  - [A.1 Math Quick Reference](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/appendix/a1_math_basics.md)
  - [A.2 PyTorch Implementation Examples](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/appendix/a2_pytorch_examples.md)
  - [A.3 Parameter Reference for Mainstream Models](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/appendix/a3_model_reference.md)
  - [A.4 Recommended Reading and References](https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian/appendix/a4_references.md)


---

# Agent Instructions: Querying This Documentation

If you need information that is not directly available on this page, you can query the documentation dynamically by asking a question.

Perform an HTTP GET request on the current page URL with the `ask` query parameter:

```
GET https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian.md?ask=<question>
```

The question should be specific, self-contained, written in natural language, and URL-encoded before it is placed in the query string.
The response contains a direct answer to the question, along with relevant excerpts and sources from the documentation.
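As a minimal sketch, the request above can be issued from Python. The endpoint URL and the `ask` parameter come from this page; the helper name `build_ask_url` and the sample question are our own, and fetching the response requires network access:

```python
from urllib.parse import quote
from urllib.request import urlopen

# Endpoint of the current page, as given above.
BASE = "https://yeasy.gitbook.io/llm_internals/di-si-bu-fen-qian-yan-yu-shi-jian-pian.md"

def build_ask_url(question: str) -> str:
    """Percent-encode a natural-language question into the `ask` parameter."""
    return f"{BASE}?ask={quote(question)}"

url = build_ask_url("How does Mixture-of-Experts routing work?")
print(url)

# To actually fetch the answer (requires network access):
# answer = urlopen(url).read().decode("utf-8")
```

Any HTTP client works equally well; the only requirement is a plain GET with the question percent-encoded in the `ask` query parameter.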

Use this mechanism when the answer is not explicitly present in the current page, when you need clarification or additional context, or when you want to retrieve related documentation sections.
