When we decided to move Hope Channel International's (HCI’s) video delivery infrastructure away from commercial cloud providers, we faced a crucial decision: how to build a global content delivery network (CDN) that would serve our mission not just technically, but strategically. Here's a behind-the-scenes look at the technical decisions that are shaping this transition.
Rethinking CDN Architecture
Most organizations that need a content delivery network choose between two options: using established CDN providers or building their own network on virtual servers from cloud providers. A few might venture into colocation facilities with their own hardware. Hope Channel International took a different approach entirely: deploying our physical servers across Adventist-owned properties worldwide.
This unconventional decision serves a specific purpose. In an era where digital platforms face increasing scrutiny and potential restrictions, hosting our infrastructure on church-owned properties across multiple countries and internet providers creates a resilient network that can withstand future challenges to our message. A server in an Adventist hospital in Australia, another in a school in Washington State, and others in administrative offices across Europe, Asia, and the Americas – this diversity of locations and jurisdictions provides a level of mission security that commercial solutions can't match.
Enterprise Hardware, Smart Economics
For this distributed infrastructure, we needed servers that would be both reliable and cost-effective. We chose refurbished Dell PowerEdge enterprise servers – specifically the R640 and R740xd models. Why used enterprise hardware instead of brand-new commodity gear? The math is compelling: Enterprise-grade servers from the secondary market cost roughly one-fifth of their original price while retaining their core advantages.
The Dell R640 servers we're using feature dual Intel Xeon processors, redundant power supplies, and iDRAC Enterprise remote management. The R740xd models add expanded storage capacity and support for up to 6 GPUs for video transcoding at scale. These aren't just specifications – they translate directly into mission capability. iDRAC Enterprise remote management means we can resolve most issues without local IT intervention. Redundant power supplies keep a server running even if one fails. Hot-swappable components mean repairs can happen without taking the server offline.
Zero-Touch Deployment
How do you efficiently deploy and configure servers across multiple continents? We developed an automated system that handles everything from initial setup to ongoing management. Here's how it works:
When a new server arrives at our headquarters, we connect it to our in-house provisioning network. Our PXE boot server presents a custom GRUB menu that lets us select the server's intended role – whether it's a MegaPOP router, CDN node, or transcoding server. The selected option loads Ubuntu 24.04 Server with a cloud-init configuration that handles basic setup: configuring mirrored OS drives, setting up networking, and establishing baseline security.
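Concretely, the role-selection step can be sketched as GRUB menu entries that boot the Ubuntu installer with a role-specific autoinstall (cloud-init) seed. This is an illustrative fragment only – the kernel paths and the provisioning hostname are hypothetical placeholders, not our actual configuration:

```
# Illustrative grub.cfg fragment for the PXE menu; paths and the
# seed-server hostname are hypothetical.
menuentry "CDN Node - Ubuntu 24.04 autoinstall" {
    linux /ubuntu-24.04/vmlinuz ip=dhcp autoinstall \
        ds=nocloud-net\;s=http://provision.internal/seeds/cdn-node/
    initrd /ubuntu-24.04/initrd
}
menuentry "MegaPOP Router - Ubuntu 24.04 autoinstall" {
    linux /ubuntu-24.04/vmlinuz ip=dhcp autoinstall \
        ds=nocloud-net\;s=http://provision.internal/seeds/megapop/
    initrd /ubuntu-24.04/initrd
}
```

Each seed directory would hold the user-data that performs the role's baseline setup – mirroring the OS drives, configuring networking, and applying initial hardening.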
After the operating system installation, Ansible playbooks take over. These playbooks configure everything from ZFS storage arrays (using dual-parity RAIDZ2, which tolerates the loss of any two disks) to our WireGuard-based Netbird private overlay network. The result? A server that's fully configured and integrated into our management infrastructure before it leaves headquarters.
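As a rough illustration, one such playbook might look like the fragment below. The host group, disk IDs, and setup-key variable are hypothetical placeholders rather than our production inventory; the Netbird install one-liner follows the project's documented installer script:

```
# Illustrative Ansible playbook fragment; hosts, disk paths, and
# variables are placeholders, not production values.
- name: Configure storage and overlay network on a new node
  hosts: new_nodes
  become: true
  tasks:
    - name: Create a dual-parity RAIDZ2 pool across six data disks
      ansible.builtin.command: >
        zpool create tank raidz2
        /dev/disk/by-id/disk1 /dev/disk/by-id/disk2 /dev/disk/by-id/disk3
        /dev/disk/by-id/disk4 /dev/disk/by-id/disk5 /dev/disk/by-id/disk6
      args:
        creates: /tank

    - name: Install the Netbird agent
      ansible.builtin.shell: curl -fsSL https://pkgs.netbird.io/install.sh | sh
      args:
        creates: /usr/bin/netbird

    - name: Join the WireGuard-based overlay network
      ansible.builtin.command: netbird up --setup-key {{ netbird_setup_key }}
```

Because each task declares a `creates:` condition, the playbook is idempotent – it can be re-run safely without rebuilding the pool or reinstalling the agent.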
Streamlined Local Installation
We've also simplified the process for local IT teams receiving these servers. Each server ships with a welcome letter and asset tag featuring our custom QR code. Scanning this code provides immediate access to detailed installation instructions and a live dashboard showing the server's health metrics.
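In spirit, the QR code simply encodes a per-server link that bundles the asset tag into a dashboard URL. The sketch below shows one way that mapping might work – the domain, path, and tag format are hypothetical, and a QR library (such as the third-party `qrcode` package) would then render the resulting URL onto the printed welcome letter:

```python
# Sketch: build the link a server's QR code might encode.
# The domain, path scheme, and asset-tag format are hypothetical.
from urllib.parse import urlencode

def dashboard_url(asset_tag: str) -> str:
    """Return a link to install docs and live health metrics for one server."""
    base = "https://servers.example.org/install"
    return f"{base}?{urlencode({'tag': asset_tag})}"

print(dashboard_url("HCI-R640-0042"))
# https://servers.example.org/install?tag=HCI-R640-0042
```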
Why did HCI go to the extra trouble of creating QR codes, welcome letters, and server dashboards? It's not just about convenience – it's about empowering our global partners.
We are incredibly grateful to the local IT professionals partnering with us to host our servers. We wanted to make sure they get clear, step-by-step guidance for physical installation and network integration, along with real-time feedback that everything is correct. They can watch as the server establishes its connections and begins serving content, confirming their contribution to our global mission.
The Technical Path Forward
This infrastructure represents more than just a collection of technical decisions – it's a blueprint for technological independence in service of mission.
With the Lord’s help, by the end of 2024, Hope Channel International will complete our transition from commercial cloud services to this self-hosted network. Every technical choice, from the hardware we selected to the automation we've built, serves a dual purpose: delivering content efficiently today while preparing for the challenges of tomorrow. Together with the efforts of the global church, this network brings us closer to achieving Hope Channel International’s vision to reach one billion people with the message of eternal hope by 2030.