Currently, "single-point storage" is the biggest threat to enterprise data security. Hardware failures, cyberattacks, human error, and natural disasters can all instantly destroy critical data. Remote backup servers, which store data copies on independent devices physically far from the main server, represent the last line of defense against these risks. This article will systematically analyze the requirements for remote backup servers, the applicable scenarios, and the key considerations when purchasing one.
A qualified remote backup server node cannot simply be "another computer that can be powered on." It needs to meet standards in four dimensions: hardware, software, network, and security.
Hardware Requirements: Computing, Storage, and Power
Computing Power (CPU and Memory)
Backup tasks involve more than just simple file copying; they also involve complex operations such as encrypted transmission, data compression, deduplication, and incremental scanning, placing clear demands on the server's computing power. For basic scenarios using common backup tools like rsync, a CPU with at least 4 cores is recommended. If the plan is to use the ZFS file system for snapshots and verification, at least 16GB of memory is recommended. For big data platforms or enterprise-level scenarios, a multi-node cluster architecture should be adopted to improve backup efficiency through horizontal scaling.
Storage Architecture (Disks and RAID)
Storage is the core of the backup server. SSDs are recommended to ensure read/write speeds meet daily incremental backup needs. RAID configuration is crucial for hardware-level data protection:
- RAID 5: Minimum 3 drives, allows for one drive failure, balancing capacity utilization and redundancy.
- RAID 6: Minimum 4 drives, tolerates two simultaneous drive failures, suitable for critical business data.
- RAID 10: Combines mirroring and striping, balancing performance and redundancy, suitable for scenarios with high recovery speed requirements.
For large-scale backup scenarios, hot-swappable design for online drive replacement and sufficient drive bays for future expansion should also be considered.
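The trade-off between these RAID levels becomes concrete with a quick usable-capacity calculation; the drive count and size below are illustrative, not a recommendation:

```shell
# Usable capacity per RAID level for n drives of size s TB (illustrative figures).
awk 'BEGIN {
  n = 4; s = 8                               # four 8 TB drives (assumed)
  printf "RAID 5: %d TB usable\n", (n-1)*s   # one drive worth of parity
  printf "RAID 6: %d TB usable\n", (n-2)*s   # two drives worth of parity
  printf "RAID 10: %d TB usable\n", n*s/2    # mirrored pairs halve capacity
}'
```

With the same four drives, RAID 5 yields the most space but the least protection; RAID 6 and RAID 10 give identical capacity here, so the choice between them comes down to rebuild behavior and recovery speed.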
Power Supply and Network Ports
Dual redundant power supplies are recommended to avoid single-point power failures. For network interfaces, at least gigabit Ethernet ports are required; medium to large enterprises are advised to configure 10 gigabit fiber optic ports to meet large-scale data transfer needs.
Software Requirements: Automation and Recoverability
Hardware is merely the carrier; software determines whether the backup system can truly be used.
Operating System and File System
Linux systems (CentOS, Ubuntu, Debian) are recommended, as they natively support backup toolchains such as rsync, cron, and SSH. For file systems, ZFS (open source, supports snapshots, verification, compression, and encryption) is ideal for backup scenarios, while Btrfs and XFS are common alternatives.
Backup Toolchain
- Basic Layer: rsync + SSH + crontab, suitable for scheduled synchronization of website files and code repositories.
- Advanced Layer: rclone (supports encryption, rate limiting, multi-threading, and multiple protocols) and BorgBackup (deduplication + compression + encryption).
- Enterprise Layer: Bacula, Duplicity, Veeam, and other professional backup software, supporting database hot backups and virtual machine full-system protection.
Monitoring and Alerting
Backup systems cannot be "deployed and then ignored." It is recommended to deploy Prometheus + Alertmanager + Grafana to track backup task status, storage growth trends, disk SMART health, and other key indicators in real time.
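One lightweight way to feed backup status into that stack is node_exporter's textfile collector. The sketch below is an assumption-laden example: the collector directory must match your `--collector.textfile.directory` setting, and the backup command itself is simulated:

```shell
#!/bin/sh
# Sketch: publish backup job status for Prometheus via node_exporter's
# textfile collector. TEXTFILE_DIR is an assumed path; point it at the
# directory node_exporter actually scans in production.
TEXTFILE_DIR="${TEXTFILE_DIR:-/tmp/node_exporter}"
mkdir -p "$TEXTFILE_DIR"

# ...run the real backup command here; we simulate a successful run...
status=0
finished=$(date +%s)

cat > "$TEXTFILE_DIR/backup.prom" <<EOF
# HELP backup_last_run_timestamp_seconds Unix time the last backup finished.
# TYPE backup_last_run_timestamp_seconds gauge
backup_last_run_timestamp_seconds $finished
# HELP backup_last_run_status Exit code of the last backup run (0 = success).
# TYPE backup_last_run_status gauge
backup_last_run_status $status
EOF
```

An Alertmanager rule can then fire when `backup_last_run_status` is nonzero or the timestamp is older than the backup interval, catching silently failing jobs.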
Network Conditions: Bandwidth, Latency, and Stability
Remote backup essentially involves cross-network transmission; network quality directly determines backup efficiency and reliability.
Bandwidth Requirements
For basic scenarios, a dedicated bandwidth of 100Mbps is recommended. For enterprise multi-region deployments, a BGP multi-line data center should be chosen. For cross-border backups, it is recommended to test upload speeds; at least 5MB/s is needed to meet daily incremental backup requirements.
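A back-of-envelope calculation helps sanity-check a bandwidth choice; the figures below (a 100 Mbps line and a 50 GB daily incremental) are illustrative assumptions:

```shell
# Transfer-time estimate: convert line rate from Mbps to MB/s, then
# compute the hours needed for a given daily incremental volume.
awk 'BEGIN {
  mbps = 100                      # dedicated line (assumed)
  mb_per_s = mbps / 8             # 100 Mbps ~= 12.5 MB/s
  daily_gb = 50                   # assumed daily incremental volume
  hours = daily_gb * 1024 / mb_per_s / 3600
  printf "%.1f MB/s -> %.1f h for %d GB\n", mb_per_s, hours, daily_gb
}'
# prints: 12.5 MB/s -> 1.1 h for 50 GB
```

If the estimated window exceeds the time available between business hours, either the bandwidth or the compression/deduplication strategy needs revisiting.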
Latency and Packet Loss Rate
The latency between the backup node and the master server should be kept within 50ms, with a packet loss rate below 0.5% even during peak evening hours. Premium lines such as CN2 GIA perform significantly better than ordinary international routes under peak-hour load, with tested packet loss approaching 0.02%.
Transmission Optimization Technologies
- Incremental Backup: Only changed data blocks are transmitted, significantly reducing network load.
- Compression: Data is compressed using gzip, LZ4, or zstd algorithms before transmission.
- Deduplication: Hash-based deduplication technology avoids duplicate data transmission.
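The deduplication idea can be sketched in a few lines: compute a digest per file and transmit only when the digest is unseen. Here a local index file stands in for the remote side's digest index, and the "transmission" is just an echo:

```shell
#!/bin/sh
# Hash-based deduplication sketch: send a file only if its SHA-256 digest
# is not already recorded. INDEX is a local stand-in for the remote index.
INDEX="${INDEX:-/tmp/dedup-index.txt}"
: > "$INDEX"

send_if_new() {
  sum=$(sha256sum "$1" | awk '{print $1}')
  if grep -q "$sum" "$INDEX"; then
    echo "skip $1 (duplicate)"
  else
    echo "$sum" >> "$INDEX"
    echo "send $1"            # a real tool would transmit the data here
  fi
}

echo "same" > /tmp/a.txt
echo "same" > /tmp/b.txt      # identical content, so it deduplicates
send_if_new /tmp/a.txt        # send /tmp/a.txt
send_if_new /tmp/b.txt        # skip /tmp/b.txt (duplicate)
```

Production tools such as BorgBackup apply the same principle at the chunk level rather than per file, so even partially changed files avoid full retransmission.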
Security Conditions: End-to-End Protection from Transmission to Storage
Remote backup involves data leaving the local environment; security cannot be ignored.
Transmission Encryption
All backup traffic should be encrypted via TLS 1.3 or SSH tunneling to prevent man-in-the-middle attacks. For sensitive data, file-level encryption can be performed using GPG or OpenSSL before transmission.
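File-level encryption before transfer can be sketched with OpenSSL's symmetric mode. The passphrase and paths below are demo values only; a real pipeline should pull the secret from key management rather than an inline password:

```shell
#!/bin/sh
# Encrypt a file with AES-256-CBC (PBKDF2 key derivation) before it leaves
# the local environment; decrypt on the other side to verify the round trip.
# "demo-secret" and the /tmp paths are placeholders for illustration.
echo "customer data" > /tmp/payload.txt

openssl enc -aes-256-cbc -pbkdf2 -salt -pass pass:demo-secret \
  -in /tmp/payload.txt -out /tmp/payload.enc

# ...transfer /tmp/payload.enc with rsync or scp over SSH...

openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:demo-secret \
  -in /tmp/payload.enc -out /tmp/restored.txt

cmp /tmp/payload.txt /tmp/restored.txt && echo "round-trip OK"
```

Encrypting before transmission means the backup server operator never sees plaintext, which matters when the remote node is rented infrastructure.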
Storage Encryption
Data on the backup server should have full-disk encryption (such as LUKS) or file system-level encryption (such as ZFS native encryption).
Access Control
- Enforce key-based SSH login, disable password authentication.
- Allow only specific source IPs and ports through the firewall.
- Deploy Fail2ban to prevent brute-force attacks.
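The first two rules translate into a short configuration fragment. It is shown as comments because the exact files and tooling vary by distribution; the account name and source IP (203.0.113.10 is a documentation address) are placeholders, and ufw is only one of several firewall front ends:

```shell
# Fragment of /etc/ssh/sshd_config on the backup node (key-only login):
#   PasswordAuthentication no
#   PermitRootLogin prohibit-password
#   AllowUsers backup-admin          # hypothetical dedicated account
#
# ufw rules restricting SSH to the primary server's address:
#   ufw default deny incoming
#   ufw allow from 203.0.113.10 to any port 22 proto tcp
#   ufw enable
```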
Ransomware Prevention Strategy
Follow the "3-2-1-1-0" principle: keep 3 copies of the data, on 2 different types of storage media, with 1 copy off-site, 1 copy immutable (tamper-proof), and 0 errors found during recovery verification. Snapshots should likewise be kept on a separate, isolated dataset or partition, so that even if ransomware encrypts the original files, it cannot reach the snapshot data.
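The snapshot isolation described above maps naturally onto ZFS. The commands below are a sketch shown as comments, since they require an existing pool; the dataset name `backup/data` is an assumption:

```shell
# ZFS snapshot sketch (assumes an existing dataset named backup/data):
#   zfs snapshot backup/data@daily-$(date +%F)   # read-only point-in-time copy
#   zfs hold keep backup/data@daily-2025-01-01   # guard against deletion
#   zfs rollback backup/data@daily-2025-01-01    # recover after an incident
# Snapshots cannot be modified in place, so files encrypted by ransomware
# inside backup/data leave earlier snapshots untouched.
```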
Remote backup servers are not optional "icing on the cake," but the cornerstone of a modern enterprise data security system. When selecting and deploying, a comprehensive evaluation should be conducted around four dimensions: hardware configuration, software tools, network quality, and security protection. The "3-2-1-1-0" backup principle should be followed, and daily monitoring and regular recovery drills should be emphasized.