As more things and people connect, vast amounts of data will be generated, creating opportunities to learn about customers, streamline business processes, flatten organizational structures, and transform industries.
To be competitive, organizations need to be able to "know" everything – to understand this deluge of data and put it into the context of the business. This requires a clear vision of the strategic goals of Big Data and Analytics (BDA), especially as BDA technology investment across Asia/Pacific is growing at 34% year on year and is expected to keep that pace for the next few years. This rapid growth in investment is creating a divide between the organizations that "know" and the ones that do not.
In an interview with Networks Asia, Jun Shi, Vice President, Sales Engineering and Chief Technology Officer (CTO) for Juniper Networks across APAC, discusses why big data and analytics are becoming increasingly important and how businesses can successfully derive insights from them.
“To handle the big data dilemma, companies must implement networks that are highly agile, flexible and scalable,” said Jun.
The following is an excerpt of the interview.
1. We've seen businesses collect data from multiple internal sources, but this is nothing new and has been going on for a while. Companies like Splunk have been trying to help us make sense of network logs for years.
So what is causing this change? What is driving the need for external sources of information?
A major catalyst for this change has been the impact of Digital Disruption. The likes of Airbnb, Uber and Bitcoin have radically altered traditional business models, gaining market share faster than ever before by successfully eliminating customer pain points. Specifically, we now see four forces accelerating the need for external sources of information – user expectations, competitive pressure, economics and technology innovation.
All of this has widened the gap between the ever-shorter time-to-adoption of new business models and rapidly rising user expectations of value in the form of cost and time savings. In response, companies prioritise the user experience, which fuels competition to provide the same value at a better price, or more value at the same price. Companies will increasingly seek knowledge and insight to accelerate decision making, driving the need to manipulate external sources of information in near real time.
2. Why are current or legacy analytics tools insufficient for the vast amounts of data we are expecting? Are the current tools sufficient to combine divergent information sources? How does the IT department decide what more it needs? How will big data and its tools integrate with existing analytics tools?
System administrators are continually looking for ways to process and present data usably and reliably – and often without accounting for the time taken or the resources consumed. As the amount of both structured and unstructured data grows exponentially, effective delivery is obstructed: users expect data to be available quickly, whenever and wherever it’s needed, on networks that weren’t built to handle that much traffic.
One way to manage usage is through applications that deliver information tailored to an individual’s interests. Administrators can provide employees with the big data equivalent of Uber or Waze – apps that are easily downloadable and serve specific needs. This removes the IT middleman to some extent, but it accomplishes the overall mission of giving end users easy access to information, delivered in bite-sized chunks.
However, even that doesn’t change the fact that companies will need networks equipped to handle the delivery of those applications and the data they present – a challenge just as big as the data itself. To handle the big data dilemma, companies must implement networks that are highly agile, flexible and scalable. The network must become the critical piece that allows intelligent data to be pushed out on demand, at any time, to any place. Software-defined networking (SDN) is the ideal conduit for this type of service because it creates a network that is elastic, resilient and built for delivering applications and data on demand.
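To make the idea of an "elastic, on-demand" network concrete – as an illustration only, not a description of Juniper's or any vendor's product – here is a minimal Python sketch of how an application might ask an SDN controller's northbound REST API to steer analytics traffic on demand. The controller URL, endpoint path, and flow-rule schema are hypothetical assumptions.

    # Hypothetical sketch: steering analytics traffic via an SDN
    # controller's northbound REST API. The controller address,
    # /api/flows endpoint, and rule schema are illustrative assumptions.
    import requests

    CONTROLLER = "https://sdn-controller.example.com:8443"  # hypothetical

    def prioritise_analytics_traffic(src_subnet: str, dst_subnet: str) -> None:
        """Ask the (hypothetical) controller to install a flow rule giving
        analytics traffic between two subnets a low-latency path."""
        rule = {
            "match": {"ipv4_src": src_subnet, "ipv4_dst": dst_subnet},
            "actions": {"queue": "low-latency", "priority": 900},
        }
        resp = requests.post(f"{CONTROLLER}/api/flows", json=rule, timeout=5)
        resp.raise_for_status()

    if __name__ == "__main__":
        # Route traffic from the data-collection tier to the analytics
        # cluster over the low-latency queue, on demand.
        prioritise_analytics_traffic("10.1.0.0/16", "10.2.0.0/16")

The point of the sketch is the design choice SDN enables: the network's behaviour becomes an API call that applications can invoke on demand, rather than a manual reconfiguration.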
3. How fast can responses be? Are in-memory or cloud-based analytics the way forward? What about analytics at the edge?
Real-time consumption of data from multi-dimensional sources is becoming more important than ever for accelerating the decision-making process; and faster response times and feedback loops from data analytics go a long way towards tuning behavioural learning and decision making – a massive competitive advantage for business owners.
Each business and each network is different, requiring different response times – although the acquisition, storage and functionality of each setup are always mapped closely to the costs of implementation.
Certain elements of fast logic therefore have to be implemented close to the source – on the chip or at the edge. Other workloads, however, including big data analytics, are increasingly moving to the cloud, especially given the volumes of data involved and the economic advantages of cloudification.
There’s no single hard-and-fast answer, as the various implementations complement each other – whether the driver is economic advantage or simply the differing nature of the data, requests or behaviour involved.
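As a rough illustration of this edge/cloud split – a sketch under assumed thresholds, window size and upload endpoint, all hypothetical – the Python below keeps fast alerting logic at the edge and ships only compact aggregates to a cloud analytics service.

    # Minimal sketch of the edge/cloud split described above: simple, fast
    # logic runs at the edge; bulk analysis happens in the cloud on
    # summarised data. Threshold, window and endpoint are assumptions.
    import statistics
    import time
    import requests

    CLOUD_ENDPOINT = "https://analytics.example.com/ingest"  # hypothetical
    ALERT_THRESHOLD = 90.0  # react immediately at the edge above this value
    WINDOW = 60             # readings per aggregate batch sent to the cloud

    def handle_reading(value: float, buffer: list) -> None:
        # Fast logic at the edge: act on anomalies without a cloud round trip.
        if value > ALERT_THRESHOLD:
            print(f"edge alert: reading {value} exceeds {ALERT_THRESHOLD}")

        buffer.append(value)
        if len(buffer) >= WINDOW:
            # Ship only a compact summary upstream, keeping bandwidth and
            # response time in check; heavier analytics runs in the cloud.
            summary = {
                "mean": statistics.mean(buffer),
                "max": max(buffer),
                "count": len(buffer),
                "ts": time.time(),
            }
            requests.post(CLOUD_ENDPOINT, json=summary, timeout=5)
            buffer.clear()

The design choice is that the edge reacts instantly without waiting on the network, while the bandwidth-heavy analysis scales in the cloud on summarised data.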
4. It's often said that to make sense of big data, the right questions need to be asked. But how do lines of business (LoBs) decide what the right questions are? What is the right argument for additional investment in big data?
The decision-making process is increasingly shifting to the LoBs, as they are the stakeholders at the centre of the business, with the full context, background, awareness and understanding to make the right decisions and set the right priorities.
Big data can and will better serve business purposes, with the aim of solving today’s problems more effectively. However, to make sense of big data, data scientists and IT departments have to work closely with the subject-matter experts within the business itself to understand business behaviour and the nature of the consumption model. Only with tighter integration can companies truly maximize value from big data.
Companies will need to pose the right questions, rooted in the right understanding of the business itself. Big data can then provide unprecedented business value by surfacing actionable insights that were previously indiscernible or not readily attainable.
5. What will happen to businesses that fail to invest in big data? Can they keep pace by using legacy analytics solutions? They've worked so far.
Once connected, billions of devices will produce huge amounts of data, feeding Big Data, Analytics, Machine Learning and AI systems. In fact, global IP traffic is forecast to reach 2.3 zettabytes per year by 2020 – more than double today’s volume!
Companies will need to consume, analyse and manipulate this data in near real time to produce knowledge that drives services while meeting the rising expectations of users. Legacy computing, storage and networking infrastructure will all come under pressure, but networking in particular will become critical and require new design approaches.
It will therefore become essential to move compute and storage to the cloud, where applications, analytics, AI and service creation/delivery can scale and provide virtually unlimited compute and storage capacity for the end user. This means that the connection between the end user and the cloud, as well as the network between clouds, must provide the required performance. Network design and performance to, from, and within the cloud will need to be rethought in order to provide the instantaneous responses and services end users demand.