Few of us understood how foundational information infrastructure would become to the economy and our daily lives—and our inability to predict its eventual importance is why we remain encumbered by a system that no longer suits our needs. The original technology revolutionized how information was transported and disseminated, but when it was conceived, it considered only the need to send pre-packaged content from a central provider down and out to a consumer. The way we want to consume content and services today is marked by a greater desire for control; the networks we need now must support this desire to design the manner and method of our interactions with content and services. To develop these networks, we must challenge the status quo and rebuild our information infrastructure from first principles: the only solution is open access, content- and service-agnostic, fiber-based networks.
To this point, hundreds of billions of dollars have been spent deploying networks with a single application: internet access. On these networks, any additional application must be provided over the internet and is therefore dependent on an internet service provider (ISP) to complete the transaction. It’s almost exactly like needing a switchboard operator to plug phone lines into the correct circuits to connect a call—a technology we replaced 40 years ago. Since neither the application provider nor the user can rely on the ISP to keep their connection secure, most applications require data encryption for security. But this model is not appropriate for every application: either the data is too sensitive to expose on the public web at all, or it’s too time-critical to tolerate the latency that encryption and subsequent decryption introduce.
As a solution, many large organizations invest significant resources to build intranets between locations. Critical data like financial information, health records, intellectual property, and autonomous systems are delivered in this way. But smaller companies, and most towns and cities, are constrained by the cost of this private network infrastructure.
The Networks We Need Now
All new networks must be designed to re-democratize the flow of content, returning control to the end consumer. To achieve this goal, these networks must be able to provision private, dedicated connections between application providers and end users. In other words, future-ready networks will bring the security, speed, and robustness of a private network to every use case. These networks will empower employees to connect directly with their business on a shared private network, and they will deliver smart city applications over common infrastructure while maintaining dedicated, private connections that carry sensitive edge-device data to the party that needs it in real time. In addition, maintenance costs will be low, technological upgrades will require minimal capital expenditure, and the base asset will remain useful for more than fifty years without improvement. These networks must also be operated by a third party that is neutral to the services provided across the infrastructure, and they must support multiple network protocols. And finally, these networks must encourage competition among network services across urban and rural communities.
So, what does this network look like? The best contender is open access, software defined, neutrally operated, and fiber dense.
Open Access
In direct contrast to how networks are structured today, the open access model creates a competitive marketplace for services across shared infrastructure. Today, each ISP constructs or leases its own network, exclusively, to reach each premises it serves. We would not expect DHL, FedEx, and UPS to construct their own roads to deliver packages to your door, yet we expect ISPs to construct their own pathways to deliver content to your home. As a result, some places have duplicative infrastructure, while other areas are left with nothing, or with just a single choice of provider.
Open access networks—as shared infrastructure—mitigate the overbuild problem. In this model, service providers can enter new markets without constructing their own network, consumer choice expands, and new types of services come to life with increased access to subscribers: think telehealth, financial services, entertainment, and public service solutions.
Software Defined
Software defined networking (SDN) powers the competitive marketplace for services, is critical to support the privacy and power of dedicated networks, and supports systems of edge devices in an “intelligent” community.
When software, not hardware, controls how data is routed, each piece of the network can be programmatically driven; issues that once required a truck roll to fix, or processes that once required a technician to visit a data center, can be dealt with immediately and automatically. Software code is a critical part of the infrastructure: it ensures maximum performance across the lifecycle of a network, reduces operating costs, and makes upgrades easier.
This technology powers open access operations: device hardware can be neutral, and subscribers can effortlessly change providers with the click of a button—no new installation or configuration required.
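To make that idea concrete, here is a minimal sketch, not any real vendor API: a toy software-defined control plane in which the operator re-points a subscriber’s logical circuit to a new provider entirely in software, while the physical fiber drop never changes. The class and method names (OpenAccessController, switch_provider) and the port identifiers are illustrative assumptions.

```python
# Illustrative sketch only: switching a subscriber between service providers
# on shared open access infrastructure is a software change, not a truck roll.
from dataclasses import dataclass


@dataclass
class Subscriber:
    port_id: str                  # the physical fiber port never changes
    provider: str | None = None   # the logical service riding on that port
    vlan: int | None = None


class OpenAccessController:
    """Toy control plane for a shared, software-defined fiber network."""

    def __init__(self) -> None:
        self._subscribers: dict[str, Subscriber] = {}
        self._next_vlan = 100

    def connect_port(self, port_id: str) -> None:
        # The home is lit with fiber once; everything after this is software.
        self._subscribers[port_id] = Subscriber(port_id=port_id)

    def switch_provider(self, port_id: str, provider: str) -> Subscriber:
        # Re-provision the logical circuit only; no installer visit required.
        sub = self._subscribers[port_id]
        sub.provider = provider
        sub.vlan = self._next_vlan
        self._next_vlan += 1
        return sub


if __name__ == "__main__":
    controller = OpenAccessController()
    controller.connect_port("olt-3/port-17")
    print(controller.switch_provider("olt-3/port-17", "ISP-A"))
    print(controller.switch_provider("olt-3/port-17", "ISP-B"))  # the "click of a button"
```

The design point is the separation of the physical port from the logical service: because the service is only a software assignment, competition can happen without anyone rebuilding the last mile.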
Software can also create private bridge networks between a consumer and their service provider—keeping sensitive information away from intermediaries on the public internet. When a doctor wants to establish a HIPAA-compliant network between their hospital and another for the transfer of patient information, a software defined network provides that solution at a fraction of what the hospital group would have spent on a private intranet between locations. In another application, consumers within the same network footprint (which can be national in scale) can create dedicated connections to each other. So, each employee of a company within the network footprint can spin up a dedicated private connection to the central office (which does not have to be a physical location) and to their colleagues. This supports a distributed workforce and removes the need for a separate intranet.
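As a rough illustration of how such a private bridge could be expressed in software, the sketch below reserves a dedicated path between two hospital sites across a toy shared-fiber topology. The topology, the site names, and the reserve_private_bridge function are hypothetical; a production controller would also enforce isolation, encryption, and bandwidth guarantees along the reserved segments.

```python
# A minimal sketch, assuming the control plane can reserve a dedicated path
# across shared fiber so that traffic never touches the public internet.
from collections import deque

# Toy topology: fiber segments between nodes on the shared network.
TOPOLOGY = {
    "hospital-a": ["hub-1"],
    "hub-1": ["hospital-a", "hub-2"],
    "hub-2": ["hub-1", "hospital-b", "datacenter"],
    "hospital-b": ["hub-2"],
    "datacenter": ["hub-2"],
}


def reserve_private_bridge(src: str, dst: str) -> list[str]:
    """Find a path across the shared fiber and treat it as a dedicated circuit."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path  # traffic is confined to these segments only
        for neighbor in TOPOLOGY[path[-1]]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    raise ValueError(f"no path between {src} and {dst}")


if __name__ == "__main__":
    # A patient-record transfer rides a reserved path, never the public web.
    print(reserve_private_bridge("hospital-a", "hospital-b"))
```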
Finally, SDN will power dynamic edge devices in an “intelligent” community, because distributed sensors can be triggered to create private networks on the fly when sensitive data is being created. Imagine edge sensors working together automatically to track the perpetrator of a crime: a gunshot rings out and triggers multiple acoustic sensors deployed within a city; as the sensors triangulate the sound, they spin up a private network between all security cameras, weather cameras, traffic cameras, and the local law enforcement command center. While the event is tracked, that same connection can encompass streetlights with LED beacons, traffic signals, and other resources to direct automobile and pedestrian traffic away from the emergency. The command center has now called on a network of resources it does not own to temporarily create a private connection for its use, underscoring the critical importance of shared information infrastructure in every community.
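A simplified sketch of that event-driven behavior might look like the following, where an incident location pulls nearby shared devices into a temporary private slice handed to the command center. The device list, the 500-meter radius, and the build_incident_slice function are illustrative assumptions rather than a description of any real deployment.

```python
# Sketch of event-triggered provisioning: sensors detect an incident, and the
# control plane assembles a temporary private "slice" of nearby shared devices.
import math

# Shared city devices on the open access network: (name, x, y) in meters.
DEVICES = [
    ("security-cam-12", 120, 80),
    ("traffic-cam-03", 300, 450),
    ("weather-cam-01", 2200, 1900),
    ("streetlight-88", 90, 60),
    ("traffic-signal-7", 260, 410),
]


def build_incident_slice(x: float, y: float, radius: float = 500.0) -> dict:
    """Group every device near the incident into a temporary private network."""
    members = [
        name for name, dx, dy in DEVICES
        if math.hypot(dx - x, dy - y) <= radius
    ]
    return {
        "members": members + ["law-enforcement-command-center"],
        "expires_after_s": 3600,  # the slice is torn down once the event ends
    }


if __name__ == "__main__":
    # Acoustic sensors triangulate a gunshot near coordinates (100, 70).
    print(build_incident_slice(100, 70))
```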
Neutrally Operated
The role of a neutral operator is to deploy the shared infrastructure, ensure equal access for qualified vendors, and create a marketplace for those services to be consumed by end users.
To encourage this competitive landscape, the network operator must be impartial to the services being provided across it. The existing system is dominated by vertically integrated operator-service providers with an inherent opposition to diversifying the number and types of applications. By contrast, a neutral operator of open access infrastructure encourages content, service, and connectivity providers to develop and expand within the network footprint. Consumers will benefit from better value for the price as a result of this competitive marketplace offered over shared infrastructure.
Neutral operation across shared infrastructure also maximizes the value of deployed edge devices. With a neutral operator, a single device can be accessed by multiple application developers. Under the current system, applications drive the deployment of devices, and those devices are then tied to a single service; with shared infrastructure, a single set of devices can encourage application innovation.
This can be illustrated by the emerging fiber sensing market, where a single sensing device can monitor 30 kilometers (18.6 miles) of fiber, listening for disruptions in the signal caused by sound waves. With this technology, we can record the speed and size of every car on a road, listen for seismic activity, monitor adjacent water and gas lines for leaks, detect strikes on the network from digging equipment, and hear gunshots and pinpoint their location. In the current system, a separate device would be deployed for each application. Only in an open access setting can the constant data stream from one device be opened to multiple approved application developers, so that a single device can serve different applications and meet the demands of the market without heavy capital requirements and overbuild.
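In software terms, this is a fan-out problem: one sensing stream, many approved readers. The sketch below models it with a toy publish/subscribe feed; the SensingFeed class and the example subscribers are assumptions used only to show one device serving several applications at once, not the interface of any actual sensing product.

```python
# Minimal fan-out sketch: a single fiber-sensing stream consumed by every
# approved application, instead of one dedicated device per application.
from typing import Callable

Reading = dict  # e.g. {"position_m": 12400, "amplitude": 0.8}


class SensingFeed:
    """One fiber-sensing device publishing to all approved applications."""

    def __init__(self) -> None:
        self._subscribers: list[Callable[[Reading], None]] = []

    def approve(self, app: Callable[[Reading], None]) -> None:
        # The neutral operator decides which applications may read the stream.
        self._subscribers.append(app)

    def publish(self, reading: Reading) -> None:
        for app in self._subscribers:
            app(reading)  # each application applies its own analysis


if __name__ == "__main__":
    feed = SensingFeed()
    feed.approve(lambda r: print("traffic app:", r))
    feed.approve(lambda r: print("leak-detection app:", r))
    feed.approve(lambda r: print("gunshot-location app:", r))
    feed.publish({"position_m": 12400, "amplitude": 0.8})
```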
Fiber Dense
To fully realize the benefits of an open access, software defined system, the network must be designed with the capacity to emulate point-to-point connections between all edges. Fit-for-future networks must also be prepared for increased capacity and expansion to support new applications with ever-increasing bandwidth requirements. This architecture, combined with software defined networking, allows for dramatic network modifications without physically altering the network. Only fiber provides the solution.
Over a few years, the savings from dramatically lower operation and maintenance costs outweigh fiber’s higher upfront cost. In addition, labor expenses associated with initial construction can be offset with proper network engineering. While it might not be economical to deploy enough fiber for expansion into areas years away from completion, a well-designed conduit network will minimize the labor costs of bringing fiber into developing areas over a number of years. Future networks will also utilize conduit paths in both underground and aerial applications, greatly reducing the permits and labor required to bring additional fiber capacity to any point on the network.
Focusing on Long-Term Benefit
As municipalities and emerging companies contemplate solutions for deploying fiber into communities, it can be tempting to optimize for the lowest short-term cost. But there’s a more important question to ask: what kind of network will optimize long-term economic, social, and technological growth? With fiber as a fifty-year asset, it’s a missed opportunity not to design and construct networks to ensure maximum benefit over that lifespan. In other words, the networks we put in place now must fulfill all current needs and be ready to facilitate new solutions to new needs. That’s why, here at Underline, we’re implementing networks designed to be open access, neutrally operated, and powered by software—with optimism and excitement for everything they will be able to do for communities. Join us.