Start with AI

EdgeOne Pages ships with built-in AI tools such as MCP deployment, an AI IDE plugin, and the continuously evolving Edge AI. Combined with flexible AI context files, these tools form a free and open intelligent ecosystem that helps you deliver high-quality Web applications faster.


Using Pages MCP

MCP (Model Context Protocol) is an open protocol that enables AI models to securely interact with local and remote resources.

EdgeOne Pages Deploy MCP is a dedicated service that enables quick deployment of Web applications to EdgeOne Pages and generates public access links. This allows you to preview and share AI-generated Web content immediately. For details, refer to the document Pages MCP.
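For reference, MCP-capable clients are usually registered through a JSON configuration. The snippet below is a minimal sketch that assumes the EdgeOne Pages MCP server is distributed as an npm package named edgeone-pages-mcp and launched via npx; check the Pages MCP document for the exact package name and configuration options supported by your client.

```json
{
  "mcpServers": {
    "edgeone-pages-mcp-server": {
      "command": "npx",
      "args": ["edgeone-pages-mcp"]
    }
  }
}
```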



AI Context File

In Pages, AI context files serve as the bridge between you and AI IDEs like CodeBuddy, Cursor, and Windsurf. These files are written in Markdown format, allowing you to define project-specific rules, best practices, code specifications, API definitions, and even business logic descriptions. Through context files, you can provide AI with accurate context information to ensure the generated code, suggestions, and automated operations better align with your project requirements and platform features.

You can download the Pages context file from this URL: https://docs.edgeone.app/pages-llms.mdc
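As an illustration, a project-level context file might add rules like the following on top of the downloaded pages-llms.mdc. The rule names and contents here are hypothetical and should be adapted to your own project:

```markdown
# Project rules (example)

- Framework: Next.js with TypeScript; UI built with Tailwind CSS.
- Server-side logic lives in Pages Functions; prefer Edge Functions unless Node APIs are required.
- Use the platform's KV Storage for lightweight persistence; do not introduce other databases.
- API routes must validate input and return JSON errors with an explicit error code.
```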


Using .mdc Files in an AI IDE

For most AI development tools, simply place the pages-llms.mdc file in the project root directory as a project-level rule, or set it as a global rule in the development tool (see the command sketch after this list).
For use in CodeBuddy: open the settings interface in the IDE, locate the rules tab, and add the .mdc file under project rules. For more information, see the document Using CodeBuddy IDE.
For use in Cursor: create a .cursor/rules/ folder in the project root directory and place your .mdc file there, or add it to your home directory (such as ~/.cursor/rules/). For more information, see the Cursor official documentation: https://docs.cursor.com/context/rules.
For use in Windsurf: create a .windsurf/rules/ folder in the project root directory and place your .mdc file there, or add the file manually through the Windsurf UI. For more information, see the Windsurf official documentation: https://windsurf.com/editor/directory.
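For example, the context file can be downloaded and placed with a couple of commands. This is a sketch assuming a Unix-like shell, using the Cursor and Windsurf paths mentioned above:

```bash
# Download the Pages context file into the project root
curl -O https://docs.edgeone.app/pages-llms.mdc

# Cursor: project-level rules
mkdir -p .cursor/rules && cp pages-llms.mdc .cursor/rules/

# Windsurf: project-level rules
mkdir -p .windsurf/rules && cp pages-llms.mdc .windsurf/rules/
```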


Using llms.txt

You can also reference https://docs.edgeone.app/llms.txt in AI conversations to share Pages documentation context. As long as the AI conversation tool supports this format, llms.txt will be loaded as a network resource and used as context.



AI Development Assistance and Deployment Practice

The following best practices will guide you in collaborating with AI effectively for faster and better project development.


Optimizing Prompt Content

Clear, structured prompt content is key to effective communication with AI. Follow the rules below to help AI understand your intent more accurately (a combined example follows this list):
Use a general-to-specific structure: state the overall goal first, then gradually refine the specific requirements, such as "create a responsive blog homepage with navigation, an article list, and a footer."
Use precise terminology: use accurate descriptions for key instructions, such as "implement a blue color scheme with flat design using Tailwind CSS."
Provide existing examples: supply existing code or design pattern samples to guide AI toward generating content that matches your style.
Continuously adjust and iterate: try different ways of organizing the prompt content until you get the best results from AI.
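Putting these rules together, a prompt might look like the following. The specific requirements and file path are illustrative only:

```
Goal: create a responsive blog homepage with navigation, an article list, and a footer.
Style: implement a blue color scheme with flat design using Tailwind CSS.
Reference: follow the card layout used in components/PostCard.tsx.
Constraints: keep the page static; no client-side data fetching.
```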



Refining Project Information

Providing the right AI context helps AI better understand your project.
Make full use of context files: supplement the project structure, API interfaces, or business logic in pages-llms.mdc as appropriate. The AI IDE will automatically load these rules to ensure that AI-generated code meets project requirements.
Write a high-quality README: this is the key entry point for AI to understand the project. A clear, detailed README.md should include the project introduction, technology stack, deployment guide, and a description of key modules, helping AI quickly grasp the overall picture and provide accurate recommendations (a skeleton is sketched below).
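A minimal README.md skeleton covering the sections named above might look like this; the section contents are placeholders to be filled in for your project:

```markdown
# Project Name

## Introduction
One or two sentences on what the application does and who it is for.

## Technology Stack
Framework, styling, data layer, and any EdgeOne Pages features used (Functions, KV Storage, etc.).

## Deployment Guide
How the project is built and deployed to EdgeOne Pages (build command, output directory, environment variables).

## Key Modules
A short description of each major directory or module and how they fit together.
```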