Market Report’s Global Telecom Trends for 2016 indicates that wireless devices are set to overtake wired devices in terms of IP traffic, accounting for over 50% of Internet traffic in 2016. What does this mean for providers? It means that solutions must be in place to enable the activation, provisioning, monitoring, and management of wireless data and WiFi services at any wireless connection point, including community WiFi hotspots, cafes, and restaurants. In addition to the demand for wireless services on the go, providers must also be ready for more WiFi connections within their subscribers’ homes. The Internet of Things (IoT) continues to grow unabated: recent research shows that the smart home market was valued at $20 billion in 2014 and is expected to reach $58 billion by 2020. Existing protocols like TR-069 are already helping enable this market, but protocols alone aren’t enough. To satisfy WiFi quality-of-experience (QoE) expectations both in the home and on the go, service providers need platforms that dive deep into wireless device diagnostics. Furthermore, customer service representatives (CSRs) need platforms that let them view, diagnose, and then remotely resolve any wireless service or device issue. This not only keeps subscribers happy, it also reduces the OPEX associated with lengthy service calls and costly truck rolls.
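For readers curious what that diagnostic plumbing actually looks like, here is a minimal sketch, in Python, of the TR-069 (CWMP) GetParameterValues request a management platform would send to a home gateway during a session to pull basic WiFi health data. The parameter paths follow the TR-098 data model; the helper name and request ID are illustrative, not any particular vendor's implementation.

```python
# Sketch: building the TR-069 (CWMP) GetParameterValues RPC an ACS would
# send to a gateway to pull basic WiFi diagnostics. Parameter paths are
# from the TR-098 data model; real deployments may use TR-181 paths or
# vendor extensions instead.

WLAN = "InternetGatewayDevice.LANDevice.1.WLANConfiguration.1."

DIAGNOSTIC_PARAMS = [
    WLAN + "SSID",
    WLAN + "Channel",
    WLAN + "Status",
    WLAN + "TotalAssociations",
]

def get_parameter_values_envelope(params, request_id="1"):
    """Return the SOAP envelope for a CWMP GetParameterValues request."""
    names = "\n".join(f"        <string>{p}</string>" for p in params)
    return f"""<soap:Envelope
    xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:soap-enc="http://schemas.xmlsoap.org/soap/encoding/"
    xmlns:cwmp="urn:dslforum-org:cwmp-1-0">
  <soap:Header>
    <cwmp:ID soap:mustUnderstand="1">{request_id}</cwmp:ID>
  </soap:Header>
  <soap:Body>
    <cwmp:GetParameterValues>
      <ParameterNames soap-enc:arrayType="xsd:string[{len(params)}]">
{names}
      </ParameterNames>
    </cwmp:GetParameterValues>
  </soap:Body>
</soap:Envelope>"""

print(get_parameter_values_envelope(DIAGNOSTIC_PARAMS))
```

The gateway's response to an RPC like this gives a CSR the SSID, channel, and association count needed to start diagnosing a WiFi complaint without dispatching a technician.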
The unpredictability of subscriber bandwidth utilization is an increasing challenge for service providers, and it is compounded by the growing number of on-the-go and wearable IP devices. According to a new report from Tractica, the global wearable devices market will grow from 17 million device shipments in 2013 to 187 million units annually by 2020. The implications for service providers are vast: these devices connect without warning, and the types of services they regularly use are incredibly hard to predict. Some subscribers might consume only minimal bandwidth with apps that check the weather or send text messages, while others might use bandwidth-heavy apps such as OTT content streaming, online gaming, or live video conferencing. Unfortunately for operators, consumers don’t normally consider the implications of network and bandwidth congestion; they just want their services to work. To offer reliable, high-quality services to an ever-growing number of unpredictable subscribers, providers need platforms that dig deep into network analytics, providing insight into hardware, subscriber, and even individual user-usage patterns. Furthermore, providers must be able to actively manage congestion as it happens to avoid service degradation throughout their networks. With this insight, the unpredictable nature of the on-the-go device market becomes less troublesome, and future requirements become far easier to predict accurately.
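As a toy illustration of the kind of rollup such an analytics platform performs, the Python sketch below aggregates per-subscriber flow records and flags heavy users. The record format, the sample data, and the 1 GB threshold are all invented for the example, not any real platform's schema.

```python
# Sketch: a toy per-subscriber usage rollup of the kind a network
# analytics platform might run over exported flow records. Records and
# the "heavy user" threshold are hypothetical.
from collections import defaultdict

flows = [  # (subscriber_id, app_category, bytes_transferred)
    ("sub-001", "weather", 12_000),
    ("sub-001", "messaging", 45_000),
    ("sub-002", "video-streaming", 1_800_000_000),
    ("sub-002", "gaming", 350_000_000),
]

HEAVY_BYTES = 1_000_000_000  # 1 GB in the sample window

# Accumulate bytes per subscriber, broken down by app category.
usage = defaultdict(lambda: defaultdict(int))
for sub, app, nbytes in flows:
    usage[sub][app] += nbytes

for sub, apps in usage.items():
    total = sum(apps.values())
    label = "heavy" if total >= HEAVY_BYTES else "light"
    top_app = max(apps, key=apps.get)
    print(f"{sub}: {total / 1e9:.2f} GB ({label}); top app: {top_app}")
```

Rollups like this, fed by live flow exports rather than a hard-coded list, are what let an operator spot congestion as it builds instead of after subscribers complain.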
Fiber optics continue to gain traction in the communications industry, offering far more bandwidth delivered at much faster speeds; FTTx optics are expected to exceed $1 billion this year, already up nearly $50 million from a record-breaking 2014. The best part for providers? Deployment costs have become more economical, making fiber a much more viable option. But there’s a caveat: switching to a fiber infrastructure means realigning service activation and fulfillment platforms as well as B/OSS processes. I’m concerned that providers might fall into the trap of focusing only on the next-gen technology and neglect to assess how it fits into their existing networks. Service providers will be confronted with multiple complex access technologies, and retaining task automation throughout the service activation and fulfillment process, as well as integrating automatic communication with billing systems, is essential when balancing deployment costs against operational expenses. I encourage service providers to brush up on their fiber deployment strategies before deploying next-gen services to their subscribers. The good news for consumers is that we can expect to see much faster and more reliable services in the very near future.
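To make the automation point concrete, here is a hypothetical Python sketch of a single activation step that only triggers the billing hand-off once provisioning succeeds. Every function and field name here (provision_ont, notify_billing, the order fields) is a stand-in; a real B/OSS integration would use the operator's own APIs and messaging.

```python
# Sketch: an automated fiber activation step that keeps billing in sync.
# All names are hypothetical stand-ins for an operator's real B/OSS APIs.
from dataclasses import dataclass

@dataclass
class ServiceOrder:
    order_id: str
    subscriber_id: str
    access_tech: str   # e.g. "GPON", "VDSL2"
    downstream_mbps: int

def provision_ont(order: ServiceOrder) -> bool:
    """Stand-in for pushing the service profile to the OLT/ONT."""
    print(f"[activation] {order.order_id}: provisioning "
          f"{order.access_tech} @ {order.downstream_mbps} Mbps")
    return True

def notify_billing(order: ServiceOrder) -> None:
    """Stand-in for the automatic hand-off to the billing system."""
    print(f"[billing] {order.order_id}: start billing "
          f"subscriber {order.subscriber_id}")

def activate(order: ServiceOrder) -> None:
    # Billing is triggered only after activation succeeds, so subscribers
    # are never billed for a service that was not delivered.
    if provision_ont(order):
        notify_billing(order)

activate(ServiceOrder("ORD-1001", "sub-002", "GPON", 1000))
```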
My final prediction, and this is an easy one, is that we can expect to see the continued evolution of virtual systems and technologies interacting with every part of communication services. Last year, the industry became educated on the many OPEX-reduction benefits offered by virtual technologies using software-defined networking (SDN) and network functions virtualization (NFV). Analysys Mason reports that the 82% cost savings achievable with residential vCPE can improve margins, helping providers respond to competitive pricing threats. While that may be compelling enough on its own, it only scratches the surface of virtualization’s potential. I predict that this year, the QoE benefits offered by NFV and SDN will start to take shape. Virtualization gives providers a way to personalize each subscriber’s service experience, providing new insights into usage patterns and habits that can be used to tailor and improve it. Whether completely or partially, the road to the future of communication services is being paved in the cloud, and the newest way to raise QoE lies somewhere within.
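As a rough sketch of what that personalization might look like in code, the hypothetical Python below maps a subscriber's observed traffic mix to a QoE policy that a virtualized platform could apply per subscriber. The categories, thresholds, and policy names are invented for illustration, not any vendor's actual API.

```python
# Sketch: mapping a subscriber's usage mix (fractions of total traffic)
# to a per-subscriber QoE policy, as a vCPE/NFV platform might.
# Categories, thresholds, and policy names are hypothetical.
def recommend_policy(usage_by_app: dict) -> str:
    """Pick a QoE policy from a subscriber's observed traffic mix."""
    if usage_by_app.get("video-streaming", 0.0) > 0.5:
        return "prioritize-streaming"
    if usage_by_app.get("gaming", 0.0) > 0.3:
        return "low-latency"
    return "default"

print(recommend_policy({"video-streaming": 0.7, "gaming": 0.1}))  # prioritize-streaming
print(recommend_policy({"gaming": 0.4, "web": 0.6}))              # low-latency
```

Because the network function runs in software rather than in fixed hardware at the customer premises, a policy like this can be updated per subscriber, per day, without touching the device in the home.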
As usual, this is going to be one exciting year!