Edge computing means processing data near its source rather than sending it to a centralized cloud data center. Instead of uploading user data to the cloud, processing it there, and sending results back, you process it locally, on the edge device itself or on a server close to the user. This reduces latency: the round trip to a distant data center disappears.
It reduces bandwidth, because you're not uploading terabytes of raw data. It improves privacy, because sensitive data never leaves the local network. Content delivery networks (CDNs) are a familiar form of edge computing: they cache content on servers distributed globally, so when a user requests a video, the CDN serves it from a nearby server rather than from the origin, and latency stays low.
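The CDN pattern can be sketched as a cache node in front of a slow origin: serve locally on a hit, fetch and store on a miss. This is a minimal illustrative sketch, not a real CDN API; `fetch_from_origin`, the `ORIGIN` dictionary, and the TTL value are assumptions for the example.

```python
import time

# Stand-in for the distant origin server's content store.
ORIGIN = {"/videos/intro.mp4": b"...video bytes..."}

class EdgeCache:
    """Toy edge node: answer from local cache when possible."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self.store = {}  # path -> (expires_at, content)

    def fetch_from_origin(self, path):
        # In a real CDN this is a network round trip to the origin data center.
        return ORIGIN[path]

    def get(self, path):
        now = time.monotonic()
        entry = self.store.get(path)
        if entry and entry[0] > now:
            return entry[1], "HIT"      # served from the nearby edge node
        content = self.fetch_from_origin(path)
        self.store[path] = (now + self.ttl, content)
        return content, "MISS"          # first request pays the origin latency

cache = EdgeCache()
_, status1 = cache.get("/videos/intro.mp4")  # first request misses
_, status2 = cache.get("/videos/intro.mp4")  # repeat request hits locally
```

Only the first request for a path pays the origin latency; every user nearby afterward is served from the edge node until the entry expires.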
Edge computing is increasingly important as IoT and mobile applications proliferate. A camera monitoring a warehouse processes video locally to detect anomalies, then sends only alerts to the cloud. A phone processes voice locally for immediate response, then sends audio to the cloud for further processing if needed. Edge computing requires a different architecture than cloud computing.
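The warehouse-camera pattern above, analyze everything locally and upload only alerts, can be sketched in a few lines. The anomaly check (mean pixel brightness against a threshold) and the `send_alert` callback are hypothetical placeholders for a real on-device model and cloud endpoint.

```python
def is_anomalous(frame, threshold=200):
    # Cheap local check standing in for a real on-device detection model.
    return sum(frame) / len(frame) > threshold

def process_stream(frames, send_alert):
    """Examine every frame at the edge; upload only compact alerts."""
    bytes_seen = 0
    alerts = 0
    for i, frame in enumerate(frames):
        bytes_seen += len(frame)        # raw data stays local
        if is_anomalous(frame):
            send_alert({"frame": i})    # a few bytes go to the cloud
            alerts += 1
    return bytes_seen, alerts

# Ten fake "frames" of pixel values; only the last is bright enough to alert on.
frames = [[10] * 1000] * 9 + [[255] * 1000]
sent = []
bytes_seen, alerts = process_stream(frames, sent.append)
```

Here 10,000 units of raw video data are examined locally but only one tiny alert crosses the network, which is the bandwidth and privacy win the pattern is after.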
You can't assume unlimited compute at the edge, so you have to be selective about which processing happens where. But the benefits (latency, bandwidth, and privacy) often justify the complexity.