Node Functions Developer Guide

Ethan Mercer
30 min read
Sep 3, 2025

What are Node Functions?

Node Functions is EdgeOne's serverless function solution, designed to give developers seamless dynamic backend capabilities. Simply create Node functions within your project's node-functions directory, and your code is automatically transformed into Node.js-based API endpoints.

These endpoints can efficiently handle a wide range of backend operations including database queries, third-party API integrations, form submissions, and data processing tasks—all while eliminating the complexity of server management. This serverless approach makes Node Functions the perfect foundation for building scalable, modern applications.
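As a minimal sketch of that mapping (the file name and route here are illustrative assumptions; consult the EdgeOne documentation for the exact routing rules), a file such as node-functions/hello.js exporting an onRequest handler would typically be served at /hello:

```javascript
// node-functions/hello.js (hypothetical path for illustration)
// EdgeOne invokes onRequest with a context object containing
// the incoming Fetch-style Request.
export async function onRequest(context) {
  const { request } = context;
  const url = new URL(request.url);
  const name = url.searchParams.get('name') || 'world';

  return new Response(JSON.stringify({ message: `Hello, ${name}!` }), {
    headers: { 'Content-Type': 'application/json' }
  });
}
```

No routing table or server setup is involved: the directory location of the file is the route.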

Key Advantages of Integrating Node Functions

Experience true full-stack development with unified frontend and backend code management. House your frontend static assets and backend function logic within a single Git repository, seamlessly sharing configurations, environment variables, and deployment pipelines. Deploy your entire application stack with just one "git push" command, dramatically streamlining your development workflow while minimizing operational overhead and maintenance complexity.

Harness the full power of the vast npm ecosystem at your fingertips. Seamlessly integrate millions of npm packages including database drivers, utility libraries, and SDKs into your serverless functions. This rich ecosystem empowers developers to concentrate on crafting business logic instead of rebuilding existing solutions from scratch.

Ten Essential Use Cases and Capabilities of Node Functions

Node Functions extend far beyond traditional CRUD operations, empowering developers to execute sophisticated scheduled tasks, process secure payments, deliver real-time notifications, and implement a wide array of advanced functionalities. Discover the ten most practical use cases below—each one designed to deliver transformative value and measurable improvements to your project's capabilities and user experience. 

Visit the sample code repository, which contains the complete structure of the sample code below.

API Request Processing and Management

API request handling serves as the foundational cornerstone of modern web applications, seamlessly receiving and responding to diverse HTTP requests (GET, POST, PUT, DELETE). This essential functionality powers robust data interface services for frontend applications, establishing the critical bridge for secure and efficient frontend-backend data communication. 

Explore the following example demonstrating a streamlined GET request implementation.

export async function onRequest(context) {
  const { request } = context;
  
  try {
    if (request.method === 'GET') {
      const users = [
        { id: 1, name: 'John', email: 'john@example.com' },
      ];
      return new Response(JSON.stringify(users), {
        headers: { 'Content-Type': 'application/json' }
      });
    }
    
    return new Response(JSON.stringify({ error: 'Resource not found' }), {
      status: 404,
      headers: { 'Content-Type': 'application/json' }
    });
  } catch (error) {
    // Handle errors
    return new Response(JSON.stringify({ error: 'Internal server error' }), {
      status: 500,
      headers: { 'Content-Type': 'application/json' }
    });
  }
}

Integrating OpenAI API Services

Node.js environments enable seamless integration of the standard OpenAI SDK with zero-configuration setup, delivering instant access to powerful AI capabilities. By leveraging environment variables for secure API Key management, this approach ensures enterprise-grade security while maintaining code flexibility and deployment agility. 

Explore the following example demonstrating DeepSeek text generation model integration.

import OpenAI from "openai";
import getRequestBody from '../getRequestBody.js';

export async function onRequest(context) {
  const { request } = context;
  
  try {
    // Parse request body
    const { prompt, maxTokens = 500 } = await getRequestBody(request);
    
    if (!prompt) {
      return new Response(JSON.stringify({ error: 'No prompt provided' }), { status: 400 });
    }
    
    // Initialize OpenAI client
    const openai = new OpenAI({
      baseURL: process.env.OPENAI_API_URL,
      apiKey: process.env.OPENAI_API_KEY
    });
    
    // Call API to generate text
    const completion = await openai.chat.completions.create({
      model: "deepseek-ai/DeepSeek-R1-0528",
      messages: [
        { role: "system", content: "You are a helpful AI assistant." },
        { role: "user", content: prompt }
      ],
      max_tokens: maxTokens
    });
    
    // Return generated text
    return new Response(JSON.stringify({
      generatedText: completion.choices[0].message.content
    }), {
      headers: { 'Content-Type': 'application/json' }
    });
    
  } catch (error) {
    return new Response(JSON.stringify({ error: error.message }), { status: 500 });
  }
}
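The examples in this guide import a shared getRequestBody helper from the repository root. Its actual implementation lives in the sample repository; an assumed sketch covering JSON and plain-text bodies might look like this:

```javascript
// getRequestBody.js (assumed sketch; the sample repository's version may differ)
// Reads a Fetch-style Request body, parsing JSON when the
// Content-Type header indicates it, and falling back to plain text.
export default async function getRequestBody(request) {
  const contentType = request.headers.get('content-type') || '';

  if (contentType.includes('application/json')) {
    return request.json();
  }
  return request.text();
}
```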

Database Integration and Connectivity

Seamlessly integrate database operations within functions, with support for mainstream database connections and CRUD operations. This lets serverless architectures easily implement data persistence, real-time queries, and complex data processing logic, providing comprehensive backend data support for dynamic applications.

For database operations, we provide a complete full-stack database deployment template that you can use directly, or reference the template code when developing your own projects.
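A minimal sketch of the CRUD routing pattern follows, using an in-memory Map in place of a real database driver (in production you would swap in pg, mysql2, mongodb, or a similar client; the connection details are omitted here):

```javascript
// In-memory store standing in for a real database connection
const db = new Map([[1, { id: 1, name: 'John' }]]);

export async function onRequest(context) {
  const { request } = context;
  const url = new URL(request.url);
  const id = Number(url.searchParams.get('id'));
  const json = (data, status = 200) =>
    new Response(JSON.stringify(data), {
      status,
      headers: { 'Content-Type': 'application/json' }
    });

  switch (request.method) {
    case 'GET':
      return db.has(id) ? json(db.get(id)) : json({ error: 'Not found' }, 404);
    case 'POST': {
      const record = await request.json();
      db.set(record.id, record);
      return json(record, 201);
    }
    case 'DELETE':
      db.delete(id);
      return json({ deleted: id });
    default:
      return json({ error: 'Method not allowed' }, 405);
  }
}
```

The same handler shape works regardless of the storage backend; only the bodies of the switch cases change.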

Identity Authentication and Session Control

Implement stateless authentication in functions through JWT tokens. This supports core functionality including user registration, login verification, and token generation and validation. It can work with cloud database services to store user information, and can integrate with third-party auth providers like Supabase to accelerate user management development.

Explore the following streamlined login authentication implementation.

import crypto from 'crypto';
import jwt from 'jsonwebtoken';
import getRequestBody from '../getRequestBody.js';

const JWT_SECRET = process.env.JWT_SECRET;
const JWT_EXPIRES_IN = '24h';

function hashPassword(password) {
  return crypto.createHash('sha256').update(password).digest('hex');
}

function generateToken(payload) {
  return jwt.sign(payload, JWT_SECRET, { expiresIn: JWT_EXPIRES_IN });
}

export async function onRequest(context) {
  const { request } = context;
  
  try {
    return await handleLogin(request);
  } catch (error) {
    // Handle errors
    return new Response(JSON.stringify({ error: 'Internal server error' }), { status: 500 });
  }
}


async function handleLogin(request) {
  const data = await getRequestBody(request);
  const { email, password } = data;
  
  if (!email || !password) {
    return new Response(JSON.stringify({ error: 'Email and password are required' }), { status: 400 });
  }
  
  // `users` is assumed to be loaded from your data store
  const user = users.find(u => u.email === email);
  if (!user) {
    return new Response(JSON.stringify({ error: 'Invalid credentials' }), { status: 401 });
  }
  
  const hashedPassword = hashPassword(password);
  if (user.password !== hashedPassword) {
    return new Response(JSON.stringify({ error: 'Invalid credentials' }), { status: 401 });
  }
  
  const token = generateToken({ userId: user.id });
  
  const { password: _, ...userWithoutPassword } = user;
  return new Response(JSON.stringify({
    message: 'Login successful',
    user: userWithoutPassword,
    token
  }), {
    headers: { 'Content-Type': 'application/json' }
  });
}
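The example above generates tokens; validation is the symmetric half. As a dependency-free sketch, an HS256 JWT can be signed and verified with node:crypto alone (in practice the jsonwebtoken package shown above also handles expiry, clock skew, and other edge cases for you):

```javascript
import crypto from 'node:crypto';

const b64url = (data) => Buffer.from(data).toString('base64url');
const hmac = (input, secret) =>
  crypto.createHmac('sha256', secret).update(input).digest('base64url');

// Produce a compact HS256 JWT: header.payload.signature
export function signToken(payload, secret) {
  const header = b64url(JSON.stringify({ alg: 'HS256', typ: 'JWT' }));
  const body = b64url(JSON.stringify(payload));
  return `${header}.${body}.${hmac(`${header}.${body}`, secret)}`;
}

// Return the payload when the signature checks out, otherwise null
export function verifyToken(token, secret) {
  const [header, body, signature] = token.split('.');
  const expected = hmac(`${header}.${body}`, secret);
  const a = Buffer.from(signature || '');
  const b = Buffer.from(expected);
  if (a.length !== b.length || !crypto.timingSafeEqual(a, b)) return null;
  return JSON.parse(Buffer.from(body, 'base64url').toString());
}
```

Note the constant-time comparison: comparing signatures with `===` would leak timing information about how many leading characters match.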

Email/Push Notification Delivery

Implement email delivery within functions through email service provider APIs such as Resend, Nodemailer, SendGrid, AWS SES, supporting features like HTML rich text, attachments, and template rendering. Additionally, integrate multi-channel messaging services including SMS and push notifications, suitable for scenarios like user registration verification, order notifications, and system alerts.

Discover the following Resend implementation example.

import { Resend } from 'resend';
import getRequestBody from '../getRequestBody.js';

export async function onRequest(context) {
  const { request } = context;
  const resend = new Resend(process.env.RESEND_API_KEY);

  try {
    const requestBody = await getRequestBody(request);
    
    // Validate request parameters
    if (!requestBody || !requestBody.email) {
      return new Response(JSON.stringify({ error: 'Missing email address' }), {
        status: 400,
        headers: { 'Content-Type': 'application/json' }
      });
    }
    
    const { email } = requestBody;
    const name = requestBody.name || 'User';
    const subject = requestBody.subject || 'Welcome to our service';
    const customMessage = requestBody.message || '';
    
    // Send email
    const { data, error } = await resend.emails.send({
      from: 'noreply@wenyiqing.email', // Replace with your sender email
      to: [email],
      subject: subject,
      html: `
        <div style="font-family: sans-serif; max-width: 600px; margin: 0 auto;">
          <h2>Hello, ${name}!</h2>
          <p>Thank you for using our service.</p>
          ${customMessage ? `<p>${customMessage}</p>` : ''}
          <p>If you have any questions, please feel free to contact us.</p>
          <p>Best regards,<br>The Team</p>
        </div>
      `
    });
    if (error) {
      return new Response(JSON.stringify({ error: error.message }), {
        status: 502,
        headers: { 'Content-Type': 'application/json' }
      });
    }
    // Return success response
    return new Response(JSON.stringify({ success: true, messageId: data.id }), {
      status: 200,
      headers: { 'Content-Type': 'application/json' }
    });
  } catch (error) {
    return new Response(JSON.stringify({ error: 'Failed to send email' }), { status: 500 });
  }
}

Data Format Transformation

Unleash powerful universal format transformation capabilities within your serverless functions. Harness the full potential of Node.js's extensive parsing ecosystem—including industry-proven libraries like xml2js, csv-parser, xlsx, and pdfkit—to effortlessly orchestrate complex data workflows spanning file processing, API integration, and enterprise reporting scenarios. 

Deploy as a dedicated data transformation microservice that delivers standardized format conversion APIs, empowering frontend applications and external systems while dramatically reducing cross-platform data exchange complexity.

Explore this streamlined CSV-to-XLSX conversion example.

import getRequestBody from '../getRequestBody.js';
import * as XLSX from 'xlsx';

function convertCsvToExcel(csvData, options = {}) {
  // Parse the CSV text into a workbook, then serialize it as an .xlsx buffer
  const workbook = XLSX.read(csvData, { type: 'string' });
  return XLSX.write(workbook, { type: 'buffer', bookType: 'xlsx' });
}

export async function onRequest(context) {
  const { request } = context;
  
  try {
    // Get CSV data from request body
    const requestBody = await getRequestBody(request);
    const csvData = requestBody;
    
    // Get options from URL query parameters
    const params = new URL(request.url).searchParams;
    const options = { delimiter: params.get('delimiter') || ',' }; // illustrative option
    
    // Convert CSV to Excel
    const excelBuffer = convertCsvToExcel(csvData, options);
    
    // Return Excel file as a download
    const filename = params.get('filename') || 'data.xlsx';
    return new Response(excelBuffer, {
      status: 200,
      headers: {
        'Content-Type': 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
        'Content-Disposition': `attachment; filename="${filename}"`
      }
    });
  } catch (error) {
    // Simple error response
    return new Response(JSON.stringify({ error: error.message }), { status: 400 });
  }
}

// Note: CSV validation logic omitted for brevity
// In production, you would validate CSV format before conversion
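Simpler transforms in this category need no dependencies at all. A rough CSV-to-JSON sketch (naive splitting, which assumes no quoted fields containing commas):

```javascript
// Convert simple CSV text into an array of row objects.
// Assumes the first line is a header and no quoted/escaped fields.
export function csvToJson(csvText) {
  const [headerLine, ...rows] = csvText.trim().split(/\r?\n/);
  const headers = headerLine.split(',').map((h) => h.trim());

  return rows.map((row) => {
    const values = row.split(',');
    return Object.fromEntries(headers.map((h, i) => [h, (values[i] || '').trim()]));
  });
}
```

For production inputs with quoting and escapes, a battle-tested parser like csv-parser is the safer choice.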

Webhook Event Handling

Architect enterprise-grade, highly available Webhook endpoints within your serverless functions, engineered to seamlessly align with event-driven architectural paradigms. Enable real-time event streaming from leading platforms including GitHub, Stripe, and beyond, while implementing cryptographic signature verification to guarantee authenticated request origins and bulletproof security protocols. 

Discover this GitHub push event integration example that showcases our Webhook processing capabilities.

export async function onRequest(context) {
  const { request, env } = context;
  try {
    const webhookSecret = env.GITHUB_WEBHOOK_SECRET;
    
    const clonedRequest = cloneRequest(request);
    const isSignatureValid = await verifyGitHubSignature(clonedRequest, webhookSecret);
    
    if (!isSignatureValid) {
      return new Response(JSON.stringify({ error: 'Invalid signature' }), { status: 401 });
    }
    // Get event type
    const eventType = request.headers.get('x-github-event');
    if (!eventType) {
      return new Response(JSON.stringify({ error: 'Missing x-github-event header' }), { status: 400 });
    }
    // Parse request body
    const payload = await getRequestBody(request);
    // Process based on event type
    let result;
    switch (eventType) {
      case 'ping':
        result = { status: 'success', message: 'Pong! Webhook configured successfully' };
        break;
      case 'push':
        result = handlePushEvent(payload);
        break;
      case 'pull_request':
        result = handlePullRequestEvent(payload);
        break;
      case 'issues':
        result = handleIssueEvent(payload);
        break;
      default:
        result = { 
          status: 'received', 
          message: `Received ${eventType} event, but no handler implemented` 
        };
    }
    
    // Return processing result
    return new Response(JSON.stringify(result), {
      status: 200,
      headers: {
        'Content-Type': 'application/json'
      }
    });
  } catch (error) {
    return new Response(JSON.stringify({ error: 'Webhook processing failed' }), { status: 500 });
  }
}
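The verifyGitHubSignature helper above is defined in the sample repository. A plausible sketch, based on GitHub's documented x-hub-signature-256 scheme (HMAC-SHA256 of the raw request body, hex-encoded with a sha256= prefix):

```javascript
import crypto from 'node:crypto';

// Verify GitHub's x-hub-signature-256 header against the raw request body.
// Uses a constant-time comparison to avoid leaking timing information.
export async function verifyGitHubSignature(request, secret) {
  const signature = request.headers.get('x-hub-signature-256');
  if (!signature || !secret) return false;

  const rawBody = await request.text();
  const expected =
    'sha256=' + crypto.createHmac('sha256', secret).update(rawBody).digest('hex');

  const a = Buffer.from(signature);
  const b = Buffer.from(expected);
  return a.length === b.length && crypto.timingSafeEqual(a, b);
}
```

Because this consumes the body stream, the caller passes a cloned request (as the example above does) so the payload can still be parsed afterward.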

File Upload and Processing Service

Engineer high-performance file upload infrastructure within your serverless functions, featuring comprehensive multipart/form-data processing capabilities for diverse media types including images, videos, and documents. Leverage industry-standard middleware solutions like Multer and Busboy to orchestrate sophisticated upload stream parsing. Seamlessly integrate with leading cloud storage platforms—including Tencent Cloud COS, AWS S3, and Alibaba Cloud OSS—to deliver resilient persistent storage solutions.

Explore this end-to-end implementation showcasing seamless user file uploads with direct Tencent Cloud COS integration through our node function architecture.

import busboy from 'busboy';
import COS from 'cos-nodejs-sdk-v5'; // Tencent Cloud COS SDK

// COS Configuration (simplified)
const cos = new COS({...});

const MAX_FILE_SIZE = 100 * 1024 * 1024;

function generateSafeFilename(originalFilename, mimeType) {
	...
}

async function uploadToCOS(fileData, filename, mimeType) {
  // Tencent Cloud COS upload logic...
}

export async function onRequest(context) {
  const req = context.request;

  try {
    return new Promise((resolve, reject) => {
      const bb = busboy({ headers: req.headers });
      const uploadResults = [];
      
      // Process file fields
      bb.on('file', (fieldname, file, { filename, mimeType }) => {
        const chunks = [];
        let fileSize = 0;
        
        file.on('data', (data) => {
          chunks.push(data);
          fileSize += data.length;
          
          if (fileSize > MAX_FILE_SIZE) {
            file.resume(); // Stop receiving
          }
        });

        file.on('end', async () => {
          if (fileSize > MAX_FILE_SIZE) {
            return; // Skip oversized files
          }
          
          try {
            // Process and upload file
            const fileData = Buffer.concat(chunks);
            const safeFilename = generateSafeFilename(filename, mimeType);
            const result = await uploadToCOS(fileData, safeFilename, mimeType);
            
            uploadResults.push({
              fieldname,
              ...result
            });
          } catch (err) {
            console.error('Upload error:', err);
          }
        });
      });

      bb.on('finish', () => {
        resolve(new Response(JSON.stringify({
          success: true,
          files: uploadResults
        }), { status: 200 }));
      });

      req.pipe(bb);
    });
  } catch (error) {
    return new Response(JSON.stringify({ error: 'Upload failed' }), { status: 500 });
  }
}
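The generateSafeFilename helper above is elided. A reasonable sketch derives the extension from the MIME type and replaces the user-supplied name with a random one, which sidesteps path traversal and name collisions (the extension map here is an illustrative assumption; extend it for your accepted upload types):

```javascript
import crypto from 'node:crypto';

// Assumed mapping from MIME type to file extension.
const EXTENSIONS = {
  'image/jpeg': '.jpg',
  'image/png': '.png',
  'video/mp4': '.mp4',
  'application/pdf': '.pdf'
};

export function generateSafeFilename(originalFilename, mimeType) {
  // Never trust originalFilename for the stored name; use it only as a
  // fallback source for the extension when the MIME type is unknown.
  const fallbackExt = (originalFilename.match(/\.[A-Za-z0-9]+$/) || [''])[0].toLowerCase();
  const ext = EXTENSIONS[mimeType] || fallbackExt;
  return `${Date.now()}-${crypto.randomBytes(8).toString('hex')}${ext}`;
}
```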

Image and Video Processing Engine

Harness the full potential of industry-leading image processing libraries—Sharp and Jimp—within your serverless functions to orchestrate sophisticated visual transformations including precision scaling, intelligent cropping, dynamic watermarking, seamless format conversion, and adaptive quality compression. Deploy advanced video processing capabilities through FFmpeg and Fluent-ffmpeg integration, enabling professional-grade video editing workflows, high-performance transcoding pipelines, intelligent frame extraction, and automated thumbnail generation systems.

Explore this example showcasing base64 image ingestion and text watermarking through a node function.

import sharp from 'sharp';
import getRequestBody from '../getRequestBody.js';

export async function onRequest(context) {
  const { request } = context;

  try {
    const requestBody = await getRequestBody(request);
    
    let base64Data = requestBody.image;
    if (base64Data.includes(';base64,')) {
      base64Data = base64Data.split(';base64,')[1];
    }
    
    const imageBuffer = Buffer.from(base64Data, 'base64');
    
    // Get watermark parameters
    const watermarkText = requestBody.text || 'Watermark Example';
    const opacity = parseFloat(requestBody.opacity || '0.5');
    const textColor = requestBody.textColor || 'white';
    const fontSize = parseInt(requestBody.fontSize || '36', 10);
    
    const imageInfo = await sharp(imageBuffer).metadata();
    const imageMimeType = `image/${imageInfo.format}`;
    
    const processedImageBuffer = await addTextWatermark(
      imageBuffer,
      watermarkText,
      opacity,
      textColor,
      fontSize
    );
    
    // Return processed image
    return new Response(processedImageBuffer, {
      status: 200,
      headers: { 'Content-Type': imageMimeType }
    });
  } catch (error) {
    return new Response(JSON.stringify({ error: 'Image processing failed' }), { status: 500 });
  }
}
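The addTextWatermark function above is assumed from the repository. A common sharp pattern is to composite an SVG text overlay onto the image; the overlay itself is plain string building and needs no dependencies:

```javascript
// Build an SVG overlay that sharp can composite onto an image, e.g.:
//   sharp(imageBuffer)
//     .composite([{ input: Buffer.from(svg), gravity: 'southeast' }])
//     .toBuffer();
// (sharp usage shown in comments only; the builder below is dependency-free)
export function buildWatermarkSvg(
  text,
  { width = 400, height = 80, opacity = 0.5, textColor = 'white', fontSize = 36 } = {}
) {
  return `<svg width="${width}" height="${height}" xmlns="http://www.w3.org/2000/svg">
  <text x="50%" y="50%" text-anchor="middle" dominant-baseline="middle"
        font-family="sans-serif" font-size="${fontSize}"
        fill="${textColor}" fill-opacity="${opacity}">${text}</text>
</svg>`;
}
```

SVG overlays scale cleanly and avoid bundling font-rendering logic into the function itself.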

Server-Sent Events

Build real-time communication capabilities within functions, implementing unidirectional data stream pushing through Server-Sent Events (SSE), or achieving bidirectional real-time communication by integrating with WebSocket gateway services.

Utilizing message queues (such as Redis Pub/Sub, Kafka, RabbitMQ) as intermediaries, Functions can subscribe to and push real-time data including stock quotes, sports scores, order statuses, and system monitoring. Support long-polling mechanisms for compatibility requirements, implementing data filtering, aggregation, and transformation through function chain invocation for stream processing.

Below is a code example demonstrating SSE server-side functionality implementation.

const clients = new Set();

// Simulate message queue
const messageQueue = {
  subscribers: {},
  publish: function(channel, message) {
    (this.subscribers[channel] || []).forEach(callback => callback(message));
  },
  subscribe: function(channel, callback) {
    (this.subscribers[channel] = this.subscribers[channel] || []).push(callback);
    // Return an unsubscribe function
    return () => {
      this.subscribers[channel] = this.subscribers[channel].filter(cb => cb !== callback);
    };
  }
};

// Simulate stock data generator
function generateStockData() {
  return {
    symbol: 'DEMO',
    price: (100 + Math.random() * 10).toFixed(2),
    timestamp: Date.now()
  };
}

// Start simulated data publishing
let stockDataInterval;
function startStockDataSimulation() {
  // Send data every 2 seconds
  stockDataInterval = setInterval(() => {
    messageQueue.publish('stocks', generateStockData());
  }, 2000);
}

// Stop simulated data publishing
function stopStockDataSimulation() {
  if (stockDataInterval && clients.size === 0) {
    clearInterval(stockDataInterval);
    stockDataInterval = null;
  }
}

export async function onRequest(context) {
  const { request, env } = context;
  
  const url = new URL(request.url, 'http://www.example.com');
  const channel = url.searchParams.get('channel') || 'stocks';
  
  const responseStream = new TransformStream();
  const writer = responseStream.writable.getWriter();
  
  // Set SSE response headers
  const response = new Response(responseStream.readable, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      'Connection': 'keep-alive',
    }
  });
  
  // Create client object
  const clientId = Date.now().toString();
  const client = {
    id: clientId,
    writer
  };
  
  // Add client to connection collection
  clients.add(client);
  
  if (clients.size === 1) {
    startStockDataSimulation();
  }
  
  // Send connection success message
  const connectMessage = `event: connected\ndata: {"clientId":"${clientId}","message":"Connection successful"}\n\n`;
  await writer.write(new TextEncoder().encode(connectMessage));
  
  // Subscribe to message queue
  const unsubscribe = messageQueue.subscribe(channel, async (data) => {
    const eventData = `event: message\ndata: ${JSON.stringify(data)}\n\n`;
    await writer.write(new TextEncoder().encode(eventData));
  });
  
  // Listen for connection close
  context.waitUntil(
    (async () => {
      // Wait until the client disconnects
      await new Promise((resolve) => request.signal.addEventListener('abort', resolve));
      // Clean up resources
      unsubscribe();
      clients.delete(client);
      stopStockDataSimulation();
      try { await writer.close(); } catch (error) { /* stream already closed */ }
    })()
  );
  
  return response;
}

Validate your real-time communication infrastructure using this client-side implementation for seamless connection testing.

const eventSource = new EventSource('/api/sse?channel=stocks');

// Listen for connection success event
eventSource.addEventListener('connected', (event) => {
  const data = JSON.parse(event.data);
  console.log('SSE connection successful:', data);
});

// Listen for message events
eventSource.addEventListener('message', (event) => {
  const stockData = JSON.parse(event.data);
  console.log('Received stock data:', stockData);
  // Update UI here
});

// Listen for errors
eventSource.onerror = (error) => {
  console.error('SSE connection error:', error);
  eventSource.close();
};

Conclusion

Node Functions represent an evolutionary leap in contemporary web development. Through serverless computing, development teams transcend traditional infrastructure constraints: no server provisioning, no scaling complexity, no load balancer orchestration. That frees them to focus on business logic and innovation.

Node Functions' capabilities span a comprehensive spectrum: from streamlined API microservices to AI integration, from foundational data persistence to real-time event streaming, and from intelligent file processing to advanced multimedia conversion, covering every dimension of application development.

A singular Git repository, unified deployment pipeline, and centralized environment configuration orchestrate complete application lifecycles from development to global production deployment. 

Whether you are architecting a personal project, scaling a startup product, or engineering mission-critical enterprise solutions, Node Functions provides a comprehensive end-to-end development platform spanning rapid prototyping through production-scale deployment, realizing the industry's long-standing vision: "Architect once, deploy everywhere, scale infinitely."