Vercel's recently released Workflows feature is really interesting: it solves a major pain point for developers by replacing an entire backend orchestration infrastructure with just two lines of code.
I looked into how it works: developers add a "use workflow" directive at the top of a TypeScript function, then mark each execution step with "use step" inside sub-functions. The framework automatically handles queue scheduling, failure retries, and state persistence, so there is no need to deploy separate orchestration services, message queues, or state databases; everything lives in the application code.
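The two directives described above can be sketched roughly as follows. Note this is an illustrative sketch, not confirmed Vercel sample code: outside the Vercel build pipeline the directives are inert string literals, and the function and variable names here are made up for the example.

```typescript
// Hypothetical sketch of the "use workflow" / "use step" pattern.
// In Vercel's build, each "use step" function becomes a durable,
// independently retried unit; here the strings are no-ops.

export async function processOrder(orderId: string) {
  "use workflow"; // marks the whole function as a durable workflow
  const chargeRef = await chargeCard(orderId);
  return await ship(orderId, chargeRef);
}

async function chargeCard(orderId: string): Promise<string> {
  "use step"; // this step's result is persisted; failures are retried
  return `charge-${orderId}`;
}

async function ship(orderId: string, chargeRef: string) {
  "use step";
  return { orderId, chargeRef, status: "shipped" };
}
```

The appeal is that the retry and persistence boundaries are expressed in ordinary application code rather than in a separate orchestration config.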
The core pain point it addresses is quite practical: when moving AI agents or backend tasks from prototype to production, developers often spend more time building orchestration infrastructure than optimizing the product itself. Traditional solutions scatter logic across queues, workers, state tables, and retry mechanisms; Vercel's approach merges orchestration logic directly into the business logic.
Since the public beta started last October, adoption has been impressive: over 100 million executions and 500 million steps processed, more than 1,500 customers served, and npm weekly downloads exceeding 200k.
For AI agent scenarios, Vercel added several features. Persistent streams guarantee that agent output is durably stored, even if the browser is closed. Built-in encryption automatically encrypts all data before it leaves the deployment environment. Pause-and-resume support lets a workflow wait for manual approval, or sleep for days or months, at zero compute cost during the wait. Each step supports payloads up to 50MB, and an entire execution is limited to 2GB, enough for multimodal agents handling image and video transmission.
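The pause-and-resume behavior can be sketched like this. To be clear, `waitForApproval` and `sleep` below are hypothetical helpers standing in for the capability described above, not confirmed Vercel API names, and the stubs resolve immediately so the sketch runs standalone.

```typescript
// Illustrative sketch only: in a real durable workflow, suspension points
// like these would cost zero compute while waiting.

type Approval = { approved: boolean; approver: string };

// Stub: a real workflow would suspend here until an external
// approval event (e.g. a human click) resumes it.
async function waitForApproval(requestId: string): Promise<Approval> {
  return { approved: true, approver: `reviewer-of-${requestId}` };
}

// Stub: a real workflow could sleep for days or months; here it is a
// plain timer purely for illustration.
function sleep(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

export async function publishReport(requestId: string): Promise<string> {
  "use workflow"; // inert string outside the Vercel build
  const approval = await waitForApproval(requestId); // pause point
  if (!approval.approved) return "rejected";
  await sleep(10); // e.g. wait out an embargo window
  return `published by ${approval.approver}`;
}
```

The design point is that long waits become ordinary `await` expressions in the workflow body instead of callbacks wired through an external scheduler.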
At the same time, the new AI SDK v7 integrates WorkflowAgent, and the Python SDK is also in public beta. Interestingly, the Workflow SDK is open source, and the community is already developing adapters for MongoDB, Redis, Cloudflare, and others. The next version will add concurrency control, global deployment, and snapshot runtime, further reducing the cost of event reprocessing.
The pricing model is also attractive: you pay only for actual execution time, with no ongoing cost for keeping an orchestration service running. This is especially appealing for teams looking to iterate quickly.