Archive for 2013

Look for video to be strong, Polycom to rebound, and network management to be a growth area–finally.

Well, it’s that time of year again. I’ve broken out my snow blower, decked the halls, wrapped (and opened) presents–now it’s time to put on my Kreskin hat and predict the future of unified communications. Drum roll please…

Video Communications Rebounds

In some ways I’m cheating on this prediction because I don’t think video usage has actually declined at all, other than in perception. The research firms show the video conferencing market as flat, and hardware revenue may be flat, but utilization rates are through the roof.

Frankly, I think it’s time we looked at different metrics to determine market share and market size, such as a combination of hardware, software and services. I recently finished a survey asking the question, “How do you expect the utilization of video communications will change over the next 12 months?” and a whopping 90% expect an increase in utilization, with 68% projecting an increase of 10% or more.

A year of M&A activity, executive changes, new products and more.

The year is almost over and oh, what a year we had–some M&A activity, executive changes, new products and a bunch of other events. Here are the newsworthy items and other trends that I thought stood out above all others:

The Year of Lync

Without a doubt, 2013 will be remembered as the year Lync moved out of the labs and into the mainstream. Almost every reseller and systems integrator I speak to tells me that a significant number of their customers are asking for Lync today. Prior to 2013, Microsoft had focused on getting customers to deploy Lync for chat and presence. This year, Microsoft and many of its partners pushed customers to trial Lync voice, and while I think Microsoft still has some challenges with voice, the company certainly legitimized itself as a voice vendor.

Lync mobile also has much better feature parity on non-Microsoft devices, which had previously been a huge Achilles heel for the company. These are the primary reasons that reseller and customer interest in Lync is at an all-time high. Additionally, a number of vendors, such as Polycom and Aastra, launched Lync-compatible phones to complement the Lync-optimized phones on the market, giving customers a broader set of IP phones to choose from.

This week, traffic visibility solution provider Gigamon announced its Unified Visibility Fabric, which provides Traffic Intelligence to help enterprises and service providers get a better handle on what traffic is flowing across the network. Gigamon has beefed up the application and services layer of its visibility fabric with new applications and features that offer advanced filtering capabilities, such as stateful correlation, user-level awareness and deep packet visibility. The Traffic Intelligence provides more granular filtering and forwarding to make sure the tools and applications network managers use to manage and secure the network receive only the data they need to operate.

Gigamon’s focus is to provide fabric-wide, integrated applications that send the correct data to the correct tools so organizations can optimize the performance of the tools, including network and application performance.

There’s no question that the trends of video, virtualization, software-defined networking, BYOD, 40 Gig and 100 Gig have all added significantly more traffic to networks today. The challenge created by the increased volume of traffic, combined with increased network speeds, is that the management, performance, and security tools customers use can’t capture the volume of data being pushed to them. Think of network traffic having to pass through a tollbooth; when it gets through, it’s directed to the right tool(s). If there’s too much traffic, the cars get backed up and the tools on the other side of the toll plaza won’t operate as well.
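The tollbooth idea can be sketched in a few lines of code: classify each traffic record and forward it only to the tools whose filter it matches, so no single tool has to ingest the full firehose. This is a minimal conceptual sketch, not Gigamon's implementation; the tool names and filter rules below are hypothetical.

```python
# Minimal sketch of filter-and-forward: each monitoring tool registers a
# filter, and traffic is routed only to the tools that match. Tool names
# and rules are invented for illustration.

TOOL_FILTERS = {
    "voip_monitor": lambda pkt: pkt["protocol"] == "rtp",
    "ids":          lambda pkt: pkt["port"] in (80, 443),
    "perf_monitor": lambda pkt: pkt["bytes"] > 1_000_000,
}

def route(packets):
    """Return {tool: [matching packets]} rather than sending everything everywhere."""
    routed = {tool: [] for tool in TOOL_FILTERS}
    for pkt in packets:
        for tool, match in TOOL_FILTERS.items():
            if match(pkt):
                routed[tool].append(pkt)
    return routed

sample = [
    {"protocol": "rtp", "port": 5004, "bytes": 2_000},        # a voice flow
    {"protocol": "tcp", "port": 443,  "bytes": 5_000_000},    # a large web flow
]
out = route(sample)
print(len(out["voip_monitor"]), len(out["ids"]), len(out["perf_monitor"]))  # 1 1 1
```

The point of the design is the same as the toll plaza analogy: the filtering happens once, up front, so each downstream tool sees only the lanes of traffic it actually cares about.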

Keeping the company network up and running is, by far, the most important task that a network manager has today. However, the largest cause of downtime is actually self-inflicted. ZK Research recently ran a survey that asked what the primary cause of downtime with networks is today, and the No. 1 response was “human error,” with 29% of the 1,320 respondents citing this as the top issue. This is down from the 37% that my research showed a couple of years ago, but it’s still top dog.

There are a number of reasons why human error causes downtime, and they all tend to revolve around the fact that network managers typically have very poor visibility holistically across the network. Additionally, change management, documenting processes and auditing tend to be done on an ad hoc basis. Some do it well, but most don’t. Now, in many ways, this really isn’t the fault of the IT department, as the tools to manage network changes and to see what’s going on with the network also tend to be pretty poor.

Last week, ActionPacked Networks announced the 3.1 version of its LiveAction network management product to address some of these issues in Cisco environments. ActionPacked Networks is a Cisco Developer Network partner and has added a number of new features to improve the visibility and manageability of Cisco networks.

One of my favorite holiday poems starts something like “’Twas the week before Christmas and all through the air, the IEEE was stirring to ensure 802.11ac would be there.” I say this because the wave 1 version of the next-generation Wi-Fi standard, 802.11ac, was ratified last week. Given the ratification, many network managers are now asking, “Should I deploy 802.11ac or should I just stick with tried-and-true 802.11n?” Well, there’s no right answer, so I thought I would go through some points network managers should consider before making the decision to AC or not to AC.

The first consideration is to understand the technology itself and what’s different about AC versus the current specification. 802.11ac is the next generation of Wi-Fi and extends many of the features introduced with the 802.11n specification. This is similar to the shift when the industry went from b to a/g.

802.11ac will be backwards compatible with previous versions of Wi-Fi, allowing for a gradual migration away from legacy devices, although many new features will not be available on non-AC devices. The technology brings gigabit speeds to wireless for the first time through the use of more antennas, wider channels and more spatial streams, as well as a number of new features such as beamforming and better signal-to-noise ratio to greatly improve performance.
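The arithmetic behind "gigabit speeds" is worth seeing. Assuming typical wave 1 parameters (80 MHz channels with 234 data subcarriers, 256-QAM at 5/6 coding, and a 3.6 µs symbol with short guard interval), the peak PHY rate works out to roughly 433 Mbps per spatial stream, so three streams land at the headline 1.3 Gbps. This is an illustrative back-of-the-envelope sketch, not a capacity-planning tool.

```python
# Back-of-the-envelope 802.11ac PHY-rate arithmetic.
# Assumed wave 1 defaults: 80 MHz channel (234 data subcarriers),
# 256-QAM (8 bits/subcarrier) at 5/6 coding, 3.6 us symbol (short GI).

def phy_rate_mbps(spatial_streams, data_subcarriers=234,
                  bits_per_symbol=8, coding_rate=5/6,
                  symbol_time_us=3.6):
    """Peak PHY rate in Mbit/s for one 802.11ac configuration."""
    bits_per_ofdm_symbol = data_subcarriers * bits_per_symbol * coding_rate
    return spatial_streams * bits_per_ofdm_symbol / symbol_time_us

print(round(phy_rate_mbps(1), 1))  # 433.3 -- one stream at 80 MHz
print(round(phy_rate_mbps(3), 1))  # 1300.0 -- the familiar "1.3 Gbps" wave 1 number
```

Real-world throughput will of course be well below the PHY rate once protocol overhead, contention and signal conditions are factored in, but the formula shows where each lever (streams, channel width, modulation) contributes.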

Solving these complexity challenges, particularly with multi-vendor deployments, doesn’t create a threat–rather, it generates opportunities.

I think it’s time the industry finally called a spade a spade. UC is complex and getting more complex every time we add a new feature or enhance the product somehow. Much of the complexity comes from the fact that these “systems” aren’t self-contained units anymore. Today’s solutions can operate in the cloud, on virtual appliances, as physical servers or on multifunction devices such as Cisco’s ISR or an Avaya IP Office. Additionally, clients can be dedicated IP phones or soft phones, operating on wired networks, Wi-Fi or cellular. Also, most customers want to run multi-vendor systems, and no matter how closely a solution provider follows the standards, there are a number of vendor-specific issues when it comes to implementation of the standard, signaling methods and so on.

Now don’t get me wrong, the solutions today have much higher levels of flexibility and agility, and we can do so many more things today than we could ever have done in the past, but there’s no doubt complexity is at an all-time high. At Cisco’s Collaboration Summit, it was refreshing to hear Rowan Trollope, the company’s GM of Collaboration, actually talk about how hard implementing UC is for customers–that’s something I rarely hear the vendors talk about. However, complexity isn’t just a Cisco problem; it’s a problem for all the vendors today, as the solutions have gotten broader and more flexible.

Saying that multi-vendor UC is hard to deploy is as gross an understatement as saying the Yankees overpaid for Jacoby Ellsbury. In both cases it makes sense on paper and may work out OK in the short term, but over the long haul, both may not provide the original value that was sought out.

Unlike the Yankees, telecom managers now have some help with their problem, as this week Oracle announced the Oracle Enterprise Communications Broker, a product of the acquisition of Acme Packet made earlier this year. The Communications Broker is designed to act almost as a “Rosetta Stone” for the various UC vendors who claim to interoperate with one another, but only do so to a limited extent.
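The "Rosetta Stone" role can be illustrated with a toy sketch: each vendor signals the same event in a slightly different dialect, and a broker normalizes those dialects into one canonical form before handing the call to the other side. The vendor names, fields and quirks below are invented for illustration; this is a conceptual sketch, not Oracle's product logic.

```python
# Toy sketch of dialect normalization between UC platforms.
# Hypothetical quirk: "vendor A" signals call hold with a zeroed media
# address, while "vendor B" uses an explicit flag. A broker maps both
# onto one canonical representation.

def normalize(msg: dict, vendor: str) -> dict:
    """Map one vendor's signaling dialect onto a shared canonical message."""
    canonical = dict(msg)
    if vendor == "vendor_a":
        canonical["on_hold"] = msg.get("media_ip") == "0.0.0.0"
    elif vendor == "vendor_b":
        canonical["on_hold"] = msg.get("hold_flag", False)
    return canonical

a = normalize({"media_ip": "0.0.0.0"}, "vendor_a")
b = normalize({"hold_flag": True}, "vendor_b")
print(a["on_hold"], b["on_hold"])  # True True
```

Once every dialect is mapped to a common form, adding an nth vendor means writing one normalizer rather than n-1 pairwise translations, which is the economic argument for a broker in the middle.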

In its press release, Oracle cited a Frost & Sullivan data point that 69% of enterprises use multiple UC vendors. This may be true in the aggregate, but ZK Research shows that once you get into large enterprises, the number using multiple vendors is over 90%. Even organizations that try to standardize on a single vendor eventually wind up bringing in others.
