
Managing data amid increasingly complex regulations


Companies are facing increasingly complex regulations that are also growing in number and scope, as regulatory frameworks continue to catch up with the rapid speed of data creation and technology changes.

“When it comes to assessing compliance with data privacy and protection regulations, the General Data Protection Regulation (GDPR) is likely to be used as the gold standard, as it is probably the world’s most expansive data privacy law,” says Ravi Rajendran, managing director for Asia South Region, Veritas, in an email interview with Networks Asia. “Given the long arm of GDPR with its extraterritorial scope, businesses in Singapore should not be so quick to dismiss the regulation.”

According to an earlier Veritas GDPR Report, 56% of respondents in Singapore fear they will be unable to meet GDPR regulatory deadlines, placing them well behind their global counterparts in terms of GDPR readiness.

In Veritas’ latest Value of Data Study, nearly one quarter of the respondents in Singapore said that their businesses were exposed to compliance and regulatory fines as a result of data management challenges, higher than the global average of 18%.

Excerpts from the Q&A follow:

1. How ready are local businesses for local and international standards of compliance?

When it comes to assessing compliance with data privacy and protection regulations, the General Data Protection Regulation (GDPR) is likely to be used as the gold standard, as it is probably the world’s most expansive data privacy law. 

Given the long arm of GDPR with its extraterritorial scope, businesses in Singapore should not be so quick to dismiss the regulation. According to an earlier Veritas GDPR Report, 56% of respondents in Singapore fear they will be unable to meet GDPR regulatory deadlines, placing them well behind their global counterparts in terms of GDPR readiness.

In our latest Value of Data Study, nearly one quarter of the respondents in Singapore said that their businesses were exposed to compliance and regulatory fines as a result of data management challenges, higher than the global average of 18%.

Singapore is an open economy with global trade linkages. Local businesses with operations or dealings in external markets will need to comply with different data privacy laws, such as the California Consumer Privacy Act of 2018 and the Notifiable Data Breaches scheme in Australia. Achieving both local and international standards of compliance is certainly an ongoing journey for local businesses.

2. Given the time frame allocated, if they aren't ready, why is this so? What have they been spending on instead?

Companies are facing increasingly complex regulations that are also growing in number and scope, as regulatory frameworks continue to catch up with the rapid speed of data creation and technology changes.

A case in point is the GDPR, which requires organizations to be able to locate, search, minimize, protect and monitor any and all personal data they collect about EU residents. The sheer scale and scope of complying with the GDPR is enormous. For example, EU residents can request to see all their personal data and ask that it be corrected, moved or deleted. Organizations are required to respond quickly to such requests, while also keeping personal data only for a stipulated time related to the purpose for which it was collected. Besides protecting personal data from damage, loss or breach, the rules also require companies to notify authorities of a breach within 72 hours and provide specific details about it.
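To make those data-subject obligations concrete, here is a minimal Python sketch of how an organization might model rights such as access, rectification and erasure. It is a sketch under assumptions only: the in-memory index and the names PersonalRecord and DataSubjectRequestHandler are illustrative and are not taken from Veritas or any specific product.

```python
# Illustrative sketch only: a toy model of GDPR data-subject rights
# (access, rectification, erasure). Class and field names are hypothetical.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class PersonalRecord:
    subject_id: str          # the EU resident the data relates to
    system: str              # where the record lives (CRM, billing, ...)
    payload: dict            # the personal data itself
    collected_at: datetime   # kept so retention decisions can be made later


class DataSubjectRequestHandler:
    """Answers access, rectification and erasure requests from one index."""

    def __init__(self, records):
        self.records = list(records)

    def access(self, subject_id):
        # Right of access: return everything held about the subject.
        return [r.payload for r in self.records if r.subject_id == subject_id]

    def rectify(self, subject_id, updates):
        # Right to rectification: correct the stored data.
        matches = [r for r in self.records if r.subject_id == subject_id]
        for r in matches:
            r.payload.update(updates)
        return len(matches)

    def erase(self, subject_id):
        # Right to erasure: drop every record held about the subject.
        before = len(self.records)
        self.records = [r for r in self.records if r.subject_id != subject_id]
        return before - len(self.records)


handler = DataSubjectRequestHandler([
    PersonalRecord("alice", "crm", {"email": "alice@example.com"}, datetime(2018, 5, 1)),
])
print(handler.access("alice"))   # everything held about "alice"
print(handler.erase("alice"))    # 1 record removed
```

In practice the hard part is the index itself: knowing every system that holds a given subject's data, which is exactly the visibility problem the survey findings describe.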

According to our survey findings, many organizations don’t have the proper technologies and resources in place to meet such stringent requirements. In fact, 34% of global organizations (44% in Singapore) surveyed say their organizations lack the right skills and technology to harness the power of data, hindering their ability to search, discover and manage data effectively.

3. How are they looking at data classification wrongly?

Data classification helps organizations understand the key questions related to data – spanning from understanding what data they have, where data is located, the age of data, who is accessing the data, data retention period to how data is protected and whether it is adhering to compliance regulations.

When it comes to data classification, the lack of a strategic mindset has led to data being treated roughly the same, and organizations do not have an in-depth understanding of their data. For example, organizations could lose sight of which are the latest copies of financial statements or a specific customer’s data. In many instances, these are important requirements for overall risk management, data security and compliance. However, the process of classifying an organization’s entire volume of data can often run into obstacles due to lack of full visibility, insights and well-defined policies.
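As an illustration of the inventory questions above (what data exists, where it lives, how old it is, what category it falls into), the following Python sketch walks a file share and records a rough classification. The category patterns, paths and CSV output are assumptions for demonstration, not any vendor's classification engine.

```python
# Illustrative sketch: build a simple classification inventory for files
# under a given root. Categories and patterns are assumed examples.
import csv
import os
import re
import time

# Very rough content categories keyed on file-name patterns (assumed).
CATEGORY_PATTERNS = {
    "financial": re.compile(r"(invoice|statement|payroll)", re.I),
    "customer":  re.compile(r"(customer|client|crm)", re.I),
}


def classify(path):
    name = os.path.basename(path)
    for category, pattern in CATEGORY_PATTERNS.items():
        if pattern.search(name):
            return category
    return "unclassified"


def build_inventory(root, out_csv):
    """Record what data exists, where it is, how old it is and its category."""
    with open(out_csv, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["path", "size_bytes", "age_days", "category"])
        now = time.time()
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                full = os.path.join(dirpath, name)
                try:
                    stat = os.stat(full)
                except OSError:
                    continue  # unreadable files are skipped, not fatal
                age_days = int((now - stat.st_mtime) / 86400)
                writer.writerow([full, stat.st_size, age_days, classify(full)])


if __name__ == "__main__":
    build_inventory("/data/shares", "inventory.csv")  # paths are examples
```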

4. What should they be doing and considering?

With the exponential growth of unstructured data, organizations have inevitably been storing information across different IT environments, be it on-premise or in the cloud. For any compliance strategy and data classification initiative to be successful, a holistic and integrated approach should be adopted.

Locate

The critical first step is gaining a holistic understanding of where data is located. Next, organizations need to identify which types of data are relevant and important, and whether compliance regulations need to be factored into the classification phase.

Minimize

Upon building a data map of where information is stored, who has the right to access it, how long it is being retained and where it is being moved, organizations should look to delete data that is no longer needed as part of data minimization. This means keeping data only for the period of time directly related to its original intended purpose. The enforcement of retention policies that automatically expire data over time is a cornerstone of any compliance strategy. More often than not, data has accumulated over many years and, in many instances, is no longer needed.
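A minimal sketch of such retention enforcement might look like the following, assuming per-category retention periods and a classification callable such as the classify() function from the inventory example above. The retention windows and the dry-run default are illustrative assumptions, not policy recommendations.

```python
# Illustrative sketch: enforce per-category retention by expiring files older
# than the allowed window. Retention periods are assumed for demonstration.
import os
import time

RETENTION_DAYS = {            # assumed policy, keyed by data category
    "financial": 7 * 365,     # e.g. keep financial records seven years
    "customer": 3 * 365,
    "unclassified": 365,
}


def expire(root, category_of, dry_run=True):
    """Return (and optionally delete) files past their retention period.

    `category_of` is any callable mapping a path to a category name.
    """
    now = time.time()
    expired = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                age_days = (now - os.stat(path).st_mtime) / 86400
            except OSError:
                continue  # skip files that vanish or cannot be read
            limit = RETENTION_DAYS.get(category_of(path), 365)
            if age_days > limit:
                expired.append(path)
                if not dry_run:
                    os.remove(path)  # irreversible: review with dry_run=True first
    return expired
```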

Risk factors

De-structured data, which originates from a structured source but is exported into files such as spreadsheets and text files, can pose quite a risk. External applications that contain personal data are highly likely to generate significant amounts of personal data in unstructured locations. This type of data can accumulate and be forgotten as individuals leave or move around the organization, but the risk remains. This underscores the need to understand where personal data is stored in structured locations, because it will inevitably leak out into unstructured locations.
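To illustrate how such exported, de-structured data might be found, the sketch below scans CSV exports on a share for values that look like personal data. The regular expressions are deliberately simplistic examples, not a complete personal-data detector.

```python
# Illustrative sketch: flag CSV exports in unstructured shares that appear
# to contain personal data. Patterns are simplistic examples only.
import csv
import os
import re

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}


def scan_csv(path):
    """Count apparent personal-data values in one exported CSV file."""
    hits = {label: 0 for label in PII_PATTERNS}
    with open(path, newline="", errors="ignore") as fh:
        for row in csv.reader(fh):
            for cell in row:
                for label, pattern in PII_PATTERNS.items():
                    if pattern.search(cell):
                        hits[label] += 1
    return hits


def scan_share(root):
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(".csv"):
                path = os.path.join(dirpath, name)
                hits = scan_csv(path)
                if any(hits.values()):
                    print(path, hits)  # candidate "de-structured" risk


if __name__ == "__main__":
    scan_share("/data/shares")  # example path
```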

User behavior is another critical element in managing data and compliance risks. Classification work in a Veritas GDPR project showed that users often store files that are considered business records on their own personal drives. This is not really a technology problem but a case of poor data habits. With increasingly stringent data regulations in place, it is timely to train and educate employees and improve the organization’s culture around how data should be managed. This is an effort that requires ongoing data classification to measure improvement.

The Value of Data Study reveals that there are several key areas that organizations could improve on:

  • Ensuring data compliance (Globally: 83%, Singapore: 94%).
  • Data security and managing risks (Globally: 82%, Singapore: 93%).
  • Level of data visibility and control (Globally: 80%, Singapore: 89%).
  • Processes for data recoverability from data loss or a ransomware attack (Globally: 71%, Singapore: 91%).
  • Data sharing practices across business functions (Globally: 78%, Singapore: 92%).

According to the survey findings, organizations in Singapore acknowledge that they have more room for improvement than their global counterparts in day-to-day data management.

5. If they strive to meet local standards, will this also provide coverage for standards like GDPR?

Globally, there are different data privacy standards with varying levels of maturity and requirements, some could be more onerous than others. For instance, the GDPR is widely seen as the world’s most sweeping data privacy law.

For local businesses, it is critical to be mindful that complying with local standards such as the Personal Data Protection Act (PDPA) does not imply coverage for other standards such as the GDPR, the California Consumer Privacy Act of 2018 or China’s Cybersecurity Law.

For instance, there is one key difference between the PDPA and the GDPR in terms of consent. While the PDPA prohibits organizations from collecting, using or disclosing personal data unless the data subject gives consent, it provides exemptions where consent is deemed to have been given or is not required, such as for an investigation or proceedings by the organization. Under the GDPR, however, consent must be clearly obtained without such exemptions, and it is usually limited and strictly defined.

6. There is talk of appointing a compliance or data officer. Is this now the new norm? How should data be viewed?

According to the 2019 annual survey conducted by NewVantage Partners, there is an increase in the number of chief data officers (CDOs) across organizations. However, key challenges remain despite having more CDOs on board. For instance, there appears to be a profound lack of consensus on the nature of the role and responsibilities, mandate and background that qualifies an executive to operate as a successful CDO.

In many organizations, there is confusion over who should be responsible for data. Executives tend to assume the CIO or CDO is the key person responsible for data and compliance, while the CIO or CDO will look right back at them. In reality, the responsibility is very much cross-functional. This cross-functionality implies a change in mindset, where organizations build a culture of compliance and data governance. Dealing with data is no longer the work of just the CDO or CIO, but of all departments combined.

7. How should it also be protected? What are we lacking now?

Note: In the context of our latest survey, data management is an umbrella term that encompasses several key capabilities, including data protection, data resiliency and data compliance.

The exponential growth of unstructured data has led to organizations storing information across a variety of different environments, spanning on-premises infrastructure, the cloud and mobile devices. As data becomes more siloed and sprawled, it is tougher to see, manage, access and protect, creating significant challenges. While most organizations probably have data backup and recovery solutions in place, many lack full visibility into, and insight about, the types of data generated and stored, and so cannot effectively unlock the value of that data.

Notably, there are some common issues that organizations need to tackle. For instance, 21% of global organizations (Singapore: 25%) do not know where data is located. Such visibility is essential in determining how best to manage all your data and in building the relevant policies to achieve this. Globally, 35% of organizations say they lack a centralized strategy for data management; in Singapore, 44% of organizations face the same issue. Furthermore, 26% of global organizations and more than one-third (34%) in Singapore cited the inability to back up and recover data reliably as another key challenge.

8. Are we collecting it from enough sources and are we analyzing it the way it needs?

According to the Value of Data Study, 40% of respondents globally and more than half (52%) in Singapore – the highest among all the surveyed countries – say that there are too many different data management tools and systems in place, both legacy and new. At the same time, organizations are finding it increasingly challenging to deal with data sources: 38% of global organizations say they have too many data sources that are difficult to analyze, and close to half (49%) of organizations in Singapore face the same challenge.

The vast amount of data collected poses enormous risks and challenges to organizations globally. According to an earlier Veritas Global Databerg Report, global organizations hold on average 52% dark data and 33% ROT (redundant, obsolete and trivial) data, with just 15% identifiable as business-critical data. If left unchecked, this could equate to USD $3.3 trillion of avoidable storage and management costs by 2020.

Unfortunately, everyday attitudes toward data and behaviors at the strategic, organizational and employee levels are causing dark data and ROT levels to increase. These include strategies and budgets based solely on data volumes rather than the value of data, and misperceptions such as the belief that storage is relatively free, especially in the cloud. It is essential for organizations to eliminate dark data to reduce risks and to gain insights from critical data to drive the business forward.

9. Are we holding back ML and AI with legacy approaches to data?

As businesses continue to adopt more complex IT environments, such as hyperconverged infrastructure and other modern workloads, data protection will also need to adapt, or risk holding back ML and AI with legacy approaches to data. AI continuously learns from the system as these dynamic IT environments adapt and change.

At Veritas, we use AI and ML to improve the reliability and portability of our solutions, providing customers with an enriched support and user experience. Our recent offering, Veritas Predictive Insights, as well as the latest acquisition of APTARE to enhance the analytics, reporting and protection of data, is a testament to our ongoing commitment to innovation.

Today, most IT managers take a rearview-mirror approach, reacting to unplanned downtime caused by interruptions related to software or hardware errors, component failure or something even more catastrophic in the data center. Incorporating predictive technologies enables proactive monitoring for downtime and faults, so IT managers can take preventative action before a disruption ever occurs. Being more prescriptive can lead to fewer disruptions and less downtime in operations.
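As a loose illustration of that idea (this is not Veritas Predictive Insights or any vendor's model), the sketch below flags a component when a health metric trends toward a failure threshold instead of waiting for the failure itself. The window size, threshold and sample readings are arbitrary assumptions.

```python
# Illustrative sketch: flag a component whose metric is trending toward a
# failure threshold, so action can be taken before disruption occurs.
from collections import deque


class TrendMonitor:
    def __init__(self, window=12, threshold=85.0):
        self.samples = deque(maxlen=window)   # recent metric readings
        self.threshold = threshold            # e.g. % capacity or temperature

    def add(self, value):
        """Return True when the extrapolated trend crosses the threshold soon."""
        self.samples.append(value)
        if len(self.samples) < self.samples.maxlen:
            return False
        # Simple linear extrapolation over the window: average slope per sample.
        n = len(self.samples)
        slope = (self.samples[-1] - self.samples[0]) / (n - 1)
        projected = self.samples[-1] + slope * n   # value one window ahead
        return projected >= self.threshold


monitor = TrendMonitor()
for reading in [60, 62, 63, 65, 68, 70, 73, 75, 77, 80, 82, 84]:
    if monitor.add(reading):
        print("raise a ticket before the component fails")
```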

Increasingly, AI is enabling predictability and will play a key role in data protection in 2019 and beyond. Data protection stands to benefit the most from AI-enabled predictive insights, which reduce the risk to data during a power disruption. With regulations such as GDPR guaranteeing data protection for users at a business’s expense, it is becoming increasingly important to keep data under lock and key.

In Singapore, the Personal Data Protection Commission (PDPC) has recognized the benefits of AI and is taking strides to ensure that both businesses and the public are well educated about the AI value chain (developers, businesses and consumers). On top of this, it has also developed an AI governance framework that considers important issues in the commercial deployment and adoption of AI in Singapore. Proactive strategies to avoid the repercussions of even a moment of downtime will be critical in 2019 for businesses that need to provide round-the-clock data support.

10. Are we backing it up and storing it correctly?

If we use the costs associated with ineffective data management as a proxy, it is reasonable to assume that we have a long way to go when it comes to backing up data and storing it correctly.

Globally, organizations estimate that they lose over USD $2 million per year as they struggle with data management challenges (Singapore: USD $2.66 million). At the same time, respondents globally (including in Singapore) highlighted that employees waste two hours on average per day searching for data, resulting in a 16% drop in workforce efficiency.

There are wider consequences of poor data management. Over one-third (35%) of global organizations admit to losing out on new revenue opportunities, and the percentage is even higher in Singapore (45%). 39% of respondents globally say their data management challenges have caused an increase in operating costs, and 43% of organizations in Singapore report the same. From a business perspective, organizations are also losing out in terms of productivity, with 37% of respondents globally and more than half (55%) in Singapore saying that they are missing their efficiency goals.

11. Is the cloud replacing tape? Will it ever replace tape?

Tape continues to be a staple for two key reasons. Firstly, it is the lowest-cost storage medium, and businesses seeking cost-effective ways to achieve regulatory compliance will still turn to tape. Secondly, because tape is stored offline, it is proving to be a less risky option as organizations face enormous and ever-changing cyber threats. In fact, based on the survey findings, 37% of organization data is stored on-premises globally. In Singapore, we are seeing a similar trend, with 36% of data located on-premises.

However, cloud has replaced tape as the preferred storage medium and continues to grow in popularity. In Singapore, 51% of organization data resides in public and private clouds, slightly above the global average of 48%.

12. In 2019, where should we be headed and what should we expect to see happen with our data?

As we navigate our paths in 2019, the technology market will continue to transform and adapt to new customer demands. In particular, IT and data companies will be collecting, analyzing and providing insights about vast volumes of data – more so than ever before.

Businesses in Singapore and across the region will need to start thinking about the future of how they perform these tasks, and how to take advantage of new solutions that can make the jobs and lives of the people responsible for these tasks easier. New solutions can also guarantee more security and reliability, enabling better relationships with customers. This is especially pertinent for Singapore as it plans to reinvent and digitally transform 23 sectors that cover about 80 per cent of the country’s economy.

In particular, data and IT staff will need to leverage technologies that enable machine learning and predictive insights – to know how or when to upgrade technology and ensure business continuity.

Today, we are living in an era where a strategic and holistic data management approach is essential for growth. With the soaring growth of data, the ability to manage data effectively – from collecting and storing it to analyzing and utilizing it – will inspire innovation and, in turn, help businesses capitalize on their data.

To achieve this, organizations can consider a new approach of managing data with data, in an integrated three-step process:

  1. Classifying data. Classifying helps enterprises understand what they have, where it is located, who is using it, the number of copies that exist, if it’s valuable or not, and more.
  2. Enabling policies. Organizations can use the insights gained from data classification to intelligently understand, protect, and maintain their data.
  3. Automating. With every petabyte of enterprise data, there are roughly three billion files – beyond human capacity to manage. Automation, through artificial intelligence and machine learning, can take on the tasks that an IT workforce cannot and will further unlock the capabilities of an organization’s data.

PAM adoption is becoming increasingly cloud-based


Thycotic, a provider of privileged access management (PAM) solutions, has released the findings from its 2019 RSA Conference survey. More than 200 security professionals were surveyed at the 2019 RSA Conference in March, and the findings indicate that nearly half (47 percent) are already using, planning or exploring deployment of PAM solutions in the cloud. The survey quantified and evaluated security professionals’ current PAM practices, including challenges in adopting PAM solutions and their interest and progress in transitioning to cloud-based PAM solutions.

According to the RSA survey, PAM adoption is becoming increasingly cloud-based, and the results suggest this cloud migration will continue: 21 percent of companies have adopted a PAM solution hosted in the cloud as a service or plan to implement such technology, and an additional 26 percent are looking to transition to a cloud-based PAM solution. Only 36 percent say they plan to keep their PAM solution on-premises. The results indicate that organisations see a future with cloud-based security solutions and expect to increase their use of cloud security solutions to 65 percent over the next one to two years.

“As more companies move to a cloud-first strategy, momentum is accelerating for adopting cloud-based PAM solutions,” said Joseph Carson, Chief Security Scientist at Thycotic. “Going with cloud allows an organisation to pay as you go, reduce wasted time, minimise huge upfront investments and automate updates – along with assuring high availability and geo-redundancy from a genuine SaaS PAM solution.”

With more than 90 percent of the world’s data stored in the cloud, moving to PAM in the cloud coincides with Thycotic’s survey results that show unauthorised access – which should be protected by a PAM solution – is the top business risk to cloud environments.

Other results from the survey reveal that IT and security teams are still struggling with getting management and daily users to understand the absolute necessity of PAM solutions: 28 percent said the biggest challenge was convincing team members to use the PAM solution, 24 percent said educating organisations’ leaders, and 19 percent said finding the budget for PAM technology. As the threat of cyber-attacks on privileged accounts continues to rise, naturally the deployment of PAM solutions must continue to increase. As a result, PAM awareness is rapidly growing, but still 85 percent of organisations fail to meet basic PAM security standards. Fortunately, 66 percent of respondents are increasing their knowledge of PAM technologies.

Nutanix hyperconverges secondary storage with Nutanix Mine


Nutanix , Inc. has announced Nutanix Mine, a new open solution that integrates secondary storage operations with the Nutanix Enterprise Cloud Platform, delivering a complete platform for primary and secondary storage within the private cloud.

Through native integration with backup vendors Veeam, HYCU, Commvault, Veritas and Unitrends, customers will be able to manage their HCI environment and backup operations from a single management console, reducing the operational cost and complexity of standalone back-up and recovery solutions. Mine will streamline overall deployment, and simplify the full lifecycle of data backup operations, including on-going management, scaling and troubleshooting - all while preserving the customer’s freedom to choose the right back-up service for their particular infrastructure environment.

According to Gartner, “There are many challenges with current backup and recovery solutions deployed today. The top concerns are most often related to the cost, complexity and capability.” Gartner goes on to state, “These costs often continue to rise despite overall cost improvements in the IT industry, and many organizations continue to perceive backup as an expensive insurance policy.”

As companies modernize their datacenter with HCI to realize the simplicity, performance and scalability needed to run modern applications, their backup and data protection strategies have remained siloed from their core datacenter environment. With Nutanix Mine, customers will be able to converge secondary storage operations into the Nutanix Enterprise Cloud Platform to deliver intelligent data backup services for all business applications. Integrated with Nutanix’s native HCI data fabric that provides intelligent tiering and advanced data reduction capabilities, Mine will enable enterprises to choose the best data backup software for their organization, optimized for their Nutanix HCI environment; reducing the time and expense of configuring standalone secondary storage solutions.

“Backup and recovery strategies are often regarded as a painful necessity for most businesses because of their complexity and lack of integration into the overall IT infrastructure. Even more modern solutions, which tout the same customer-centric principles of hyperconverged infrastructure, only address the challenges of the secondary storage environment,” said Phil Goodwin, Research Director, Cloud Data Management for Protection at IDC. “With more applications and use cases converging on a single platform within the private cloud, integrating secondary storage tightly with primary datacenter operations introduces even more flexibility and ease-of-use for customers.”

“As companies modernize their datacenter, they look to new solutions which make it easy to run business critical applications, eliminating silos and reducing complexity in their private cloud environment so IT can serve as the foundation of business success,” said Carey Stanton, Vice President of Global Alliances at Veeam. “By integrating our software so closely with Nutanix through Mine, we’re delivering our joint customers a full platform for their primary and secondary storage needs, dramatically reducing the complexity of running and protecting applications.”

Embracing an open platform strategy, Nutanix Mine will integrate with back-up solutions from Veeam, HYCU, Commvault, Veritas and Unitrends.

Through tight integration with Nutanix’s HCI data fabric and advanced Prism management console, Mine will provide:

  • A reduction in the complexity of separate systems to back-up and recover business data, so secondary storage can be managed as easily as primary data storage
  • The simplification of the full lifecycle of data backup operations, including initial sizing and procurement, streamlined installation, and easy provisioning of the full solution, and
  • The ability to easily scale-out both primary and secondary storage to accommodate business growth

“The hyperconverged market’s rapid growth is largely attributable to its promise of modernizing infrastructure and reducing complexity by eliminating silos within the datacenter. But even as customers embraced HCI, the secondary storage silo persisted,” said Sunil Potti, Chief Product and Development Officer, Nutanix. “With Nutanix Mine, customers will get all the benefits of collapsing this silo into a single platform - reduced management complexity, simplified operations and reduced TCO - without the requirement that they forgo the backup solution best suited for their business needs.”

Hitachi Vantara introduces Lumada Video Insights


Hitachi Vantara, a wholly owned subsidiary of Hitachi, Ltd., has introduced Lumada Video Insights, an end-to-end, intelligent and adaptable suite of applications that delivers operational safety and business intelligence using internet of things (IoT), video, artificial intelligence (AI) and analytics.

Lumada Video Insights is an integral part of Hitachi Vantara’s DataOps strategy, which redefines data management for the AI era by seamlessly connecting data consumers with data creators to rapidly monetize customers’ data.

Lumada Video Insights brings together Hitachi’s video offerings for smart spaces with new technology updates, innovations and integrations into Hitachi’s Lumada portfolio. The solution complements and extends the Lumada platform and services ecosystem with expanded AI, computer vision, advanced analytics, data integration and orchestration capabilities to help enterprise and industrial customers accelerate their IoT initiatives and cultivate their own smart spaces and ecosystems.

With multiple customer success stories and increasing demand, Lumada Video Insights demonstrates significant momentum in the emerging smart spaces industry. And when put together with DataOps, it unlocks new opportunities for customers to maximize the value of the vast amounts of data collected as IoT disrupts numerous industries worldwide.

According to Gartner, Inc., “A smart space is a physical or digital environment in which humans and technology-enabled systems interact in increasingly open, connected, coordinated and intelligent ecosystems. Multiple elements — including people, processes, services and things — come together in a smart space to create a more immersive, interactive and automated experience for a target set of personas or industry scenarios.

This trend has been coalescing for some time around elements such as smart cities, digital workplaces, smart homes and connected factories. Gartner believes the market is entering a period of accelerated delivery of robust smart spaces, with technology becoming an integral part of our daily lives, whether as employees, customers, consumers, community members or citizens. AI-related trends, the expansion of IoT-connected edge devices, the development of digital twins of things and organizations, and the maturing of blockchain offer increasing opportunities to drive more connected, coordinated and intelligent solutions across target environments.”

Lumada Video Insights Supports Rapid Smart Spaces Growth

With the capabilities to collect, store, manage and analyze video data, Lumada Video Insights delivers valuable analysis and alerts to help organizations be more effective, efficient and secure. 

Advances in computer vision and machine learning have allowed video to become a rich source of insights that provide operational and customer-experience intelligence, as well as real-time situational awareness and alerts that enable swift, effective and proactive responses to incidents, emergencies and customer issues. With Lumada Video Insights, customers can gather data insights to improve planning, identify opportunities for cost and waste reductions, and pinpoint factors that improve operational excellence.

Key features of Lumada Video Insights include:

  • IoT, video and historical data can be visualized geospatially and graphically in a single-pane-of-glass view.
  • Video analytics that turns existing or new video data into insights and real-time alerts through AI analysis (a generic illustration follows this list).
  • Video data storage and management solutions that keep high volumes of foundational data available, complete and fault-tolerant at speed, while simplifying data management.
  • Smart edge devices and video intelligence platforms that gather data and insights from anywhere.
  • Mining of publicly available data from social media or the deep web for open-source intelligence and awareness of brand-related conversations in target areas.
  • Support for data-driven decision-making about safety, operations and customer experience.
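As a generic illustration of turning video data into real-time alerts (this is not Lumada Video Insights’ API), the following OpenCV sketch raises an alert when enough pixels change between frames. The source path and pixel threshold are assumptions for demonstration.

```python
# Illustrative sketch: simple motion-detection loop that converts a video
# stream into basic real-time alerts. Not any vendor's analytics engine.
import cv2


def alert_on_motion(source="camera.mp4", min_changed_pixels=5000):
    capture = cv2.VideoCapture(source)
    subtractor = cv2.createBackgroundSubtractorMOG2()
    frame_index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        mask = subtractor.apply(frame)             # foreground = moving pixels
        if cv2.countNonZero(mask) > min_changed_pixels:
            print(f"motion alert at frame {frame_index}")
        frame_index += 1
    capture.release()


if __name__ == "__main__":
    alert_on_motion()
```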

Lumada Video Insights strengthens the Lumada portfolio of data-driven applications, which is further supported by Hitachi Vantara’s data integration, orchestration, storage and management offerings to provide end-to-end solutions and business value to customers around the world.

“Smart spaces innovation is a focus area for Hitachi Vantara as we continue to see a demand to use new and diverse sources of data blended with more traditional data to gain rich insights,” said Brad Surak, chief product and strategy officer at Hitachi Vantara. “With Lumada Video Insights and our focus on DataOps, we are excited to expand our work with public and private organizations to transform how retail, government and transportation use data-driven intelligence to innovate and achieve greater outcomes.”

Hitachi Smart Spaces Customer Success

Hitachi Vantara works hand-in-hand with customers to deliver purpose-built, outcome-driven solutions that generate actionable insights to improve safety, operations and business intelligence. Recent Smart Spaces customer successes include:

  • City of Las Vegas (Nevada): By deploying Hitachi Smart Spaces in its Innovation District, the city tapped traffic, parking and passenger flow data to optimize operations. In one area, a spike of bike delivery activity was detected, highlighting the need for more bike lanes along delivery routes.
  • City of Moreno Valley (California): For several years, this city has been using elements of Lumada Video Insights which have helped decrease crime, improve emergency-response time in traffic, and address a missing person case using Hitachi technology.
  • Tequila Intelligente (Jalisco, Mexico): This heavily touristed home of the tequila industry uses Lumada Video Insights to gather foot and vehicle traffic data to enhance the visitor and citizen experience.
  • Dallas Housing Authority (Texas): To keep residents safe throughout its properties, Dallas Housing Authority deployed Lumada Video Insights, including smart cameras with compute and storage to analyze and monitor video data.

Lumada Video Insights solutions are available for customers today with global general availability for most of the portfolio, and limited availability for edge devices in some countries. 

SAS launches cloud offering in Singapore with first local data center


SAS has launched its SAS® Cloud offering in an Amazon Web Services data center in Singapore. This marks the first SAS data center in the market, providing Singapore enterprises with long-awaited cloud capabilities that address data residency issues and improve network latency.

Cloud technology is enabling businesses of all sizes to compete on a global level. Yet, IDC found that more than 85% of Asia Pacific organizations are still in early stages of cloud adoption maturity, needing more consistent, standardized and available automated cloud resources to execute at speed and cost[1]. Bridging this gap, SAS® Cloud combines optimized software, infrastructure and services, for enhanced performance and value. Enterprises in Singapore can now benefit from dedicated computing infrastructure, secure and hassle-free access to data and tailored alerts for early and pre-emptive incident detection.

“Organizations in Singapore and the region are realizing the value of disruptive technologies such as cloud and advanced analytics. At SAS, we aim to empower customers with the right tools to be the disruptors themselves,” said Randy Goh, Managing Director, SAS Singapore. “The launch of SAS Cloud and our first data center in Singapore provides new opportunities for us to deliver increased value to local customers. This is testament to our commitment in Singapore’s rapidly growing analytics economy.”

SAS Cloud provides customers with the flexibility to customize and manage their cloud infrastructure, software and services through these offerings:

  • Hosted Managed Services – a fully outsourced enterprise solution delivered through SAS Cloud. SAS manages all data assets and supports businesses with hosting and analytics expertise
  • Remote Managed Software and Services – for businesses with regulatory compliance issues, mandating in-house data residency. SAS implements and remotely manages any SAS software solution, with high-speed analytics services and 24/7 support
  • Results as a Service – for businesses that require flexibility, technology and expertise to turn data into insights or results. Organizations can gain access to rich analytical insights by combining SAS’ award-winning software and support from SAS experts, without investment in software licenses or infrastructure
  • Software as a Service (SAS Data Centers) – standardized, off-the-shelf offerings available for immediate use, with no customization to deliver tangible results

Addressing growing security concerns and a complex threat landscape, SAS also works with customers during implementation to determine the most appropriate data classification levels for information hosted in the cloud. Customers are required to access SAS Cloud offerings through site-to-site connections or whitelisted Internet Protocol (IP) addresses, which limit SAS’ visibility to online threats. SAS Cloud is equipped with data protection warranties for personally identifiable information, covering all non-public data maintained in any of SAS’ data centers.
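The whitelisted-IP access model described can be illustrated with a short sketch; the networks listed are documentation-range placeholders, not SAS infrastructure, and real deployments would enforce this at the network edge rather than in application code.

```python
# Illustrative sketch: check an inbound address against an allowlist of
# whitelisted networks before granting access. Networks are placeholders.
import ipaddress

ALLOWED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # example corporate egress range
    ipaddress.ip_network("198.51.100.0/24"),  # example site-to-site peer
]


def is_allowed(client_ip):
    addr = ipaddress.ip_address(client_ip)
    return any(addr in network for network in ALLOWED_NETWORKS)


print(is_allowed("203.0.113.42"))   # True: inside an allowed range
print(is_allowed("192.0.2.10"))     # False: not whitelisted
```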

Through a Service Level Agreement (SLA) that promises 99% uptime, defined standards and 24/7 operation and incident management, SAS ensures customers are getting the most value in the cloud. In Singapore, SAS worked with local energy services start-up Barghest Building Performance (BBP) through a series of Results as a Service projects to deliver analytical models and rich insights, helping BBP gain the flexibility to roll out the solution tailored to its needs. As a result, BBP has created millions of dollars in energy savings for its customers and has been accredited by several national governing bodies such as Singapore’s National Environment Agency (NEA).

These are 4 questions you need to consider after a data breach


In the EU, GDPR requires organisations to disclose data breaches within 72 hours in an effort to protect personal data. Other areas of the world – notably the US – are also considering regulations that compel companies to disclose data breaches sooner. In Singapore, the Personal Data Protection Commission (PDPC) plans to introduce revisions to the Personal Data Protection Act (PDPA) mandating that companies notify the PDPC of breaches within 72 hours – in line with the EU standard – and affected customers sometime after.

This means that regardless of where your business is located, it’s time to make a plan that will enable you to investigate incidents quickly and with greater accuracy. Company decision-makers need to understand ahead of time where critical assets lie and what information may need to be reported, so that the incident response (IR) team isn’t significantly burdened after a breach.
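One small piece of that preparation can be automated: computing the disclosure deadline the moment a breach is detected. The sketch below assumes the 72-hour windows described above; the regime list and timestamps are illustrative only.

```python
# Illustrative sketch: compute regulator-notification deadlines for a
# detected breach under 72-hour regimes. Regime list is an assumption.
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOWS = {
    "GDPR (EU)": timedelta(hours=72),
    "PDPA (Singapore, proposed)": timedelta(hours=72),
}


def disclosure_deadlines(detected_at):
    """Map each regime to the latest time the regulator must be notified."""
    return {regime: detected_at + window
            for regime, window in NOTIFICATION_WINDOWS.items()}


detected = datetime(2019, 5, 20, 9, 30, tzinfo=timezone.utc)  # example time
for regime, deadline in disclosure_deadlines(detected).items():
    print(f"{regime}: notify regulator by {deadline.isoformat()}")
```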

Here are the four questions IR teams should be asking from the moment a breach occurs to ensure all of the information needed for disclosing it to relevant stakeholders is readily available:

1. What’s the scope of this incident?

There’s only one thing worse than announcing leaked records, and that’s needing to make the same announcement more than once. Organisations need to understand exactly how extensive the breach was in order to avoid this faux pas—or, like some companies, be comfortable with announcing the maximum possible number of affected users before investigations are complete. There are pros and cons to playing it safe, but the best solution is to see what roadblocks exist in the IR team’s ability to investigate breaches and remove them wherever possible.

2. What kind of violation is it (e.g. PCI-DSS or HIPAA)?

If the IR team only has 72 hours to gather as much information as possible about a breach before reporting, it’s critical to know which policies to address. Requiring companies to report breaches does not just mean there’s less time before customers know about an incident. It also means that the organization will be expected to answer more specific, technical questions about the incident in a shorter timeframe.

3. Who is affected?

Identifying which customers have been affected will require precision in order to mitigate the damage to the company’s reputation. Security breaches are a fact of modern life, but customers still expect stringent protections and data privacy. When a breach does occur, company leaders across functions will need deep visibility to answer these questions right away.

4. What did the attack campaign look like—and are the attackers still present?

According to a recent report from Enterprise Management Associates (EMA), only 23 percent of organisations investigate all critical security incidents after the initial detection. That means over 75 percent of organisations don’t really understand how an attacker made it past their defenses, and often aren’t certain whether the attacker is still inside the environment. This goes hand in hand with the current breach detection gap: in 2018, attackers could dwell inside an environment for three months on average before the breach was detected.

As Singapore and other governments around the world continue to strengthen consumer protections and privacy rules, this last question will grow more and more important. We’re moving away from a time when security was considered primarily the responsibility of companies, and the increase in publicised breach reporting will ultimately lead customers to put the organisations they trust under more scrutiny. Implementing frameworks like the Center for Internet Security (CIS) Top 20 Critical Security Controls can help organisations answer these questions quickly, but many need help extracting value from ambitious frameworks that require better visibility and a more efficient use of security resources. We have seen how an emerging category of security analytics can help.

Albert Kuo is VP Asia Pacific, ExtraHop


NetApp rolls out new midrange, end-to-end NVMe AFF A320 storage system


NetApp has announced NetApp ONTAP 9.6, the new midrange, end-to-end NVMe AFF A320 storage system and an expanded portfolio of services to help businesses maximise the value of their data.

“In advances that are coming our way, like 5G networks, we can see that a company’s ability to generate, gather and disseminate a massive volume of data will be enabled like never before, and companies that aren’t ready are going to be overwhelmed,” said Joel Reich, executive vice president, Storage Systems and Software, NetApp. “With NetApp ONTAP, organisations can overcome the challenges introduced by these data-intensive technologies and cutting-edge innovations with a smart, powerful, and trusted solution that maximises the value that organisations can derive from data.”

Maximising Value in the Hybrid Cloud

By accelerating applications painlessly and by delivering simplicity and security for a future-proof infrastructure, ONTAP 9.6 data management software allows organisations of all sizes to maximise the value of data and take on new data-driven initiatives. ONTAP 9.6 includes the following enhancements:

  • Expanded NVMe over Fibre Channel (NVMe/FC) ecosystem now includes VMware ESXi, Microsoft Windows, and Oracle Linux hosts, in addition to Red Hat and SUSE Linux, with storage path resiliency. Organisations can experience NVMe/FC performance for most workloads.
  • FabricPool now supports Google Cloud Platform and Alibaba Cloud, in addition to Microsoft Azure, Amazon Web Services (AWS), and IBM Cloud Storage. Organisations can lower the cost of primary storage by automatically tiering cold data to any major public cloud or to a NetApp StorageGRID private cloud.
  • NetApp FlexCache software now supports NetApp Cloud Volumes ONTAP, allowing organisations to experience the benefits of FlexCache in hybrid cloud deployments.
  • Over-the-wire encryption for NetApp SnapMirror technology and FlexCache increases security for data replication and remote caching.
  • NetApp MetroCluster IP support for entry-level NetApp AFF and FAS systems now makes business continuity a cost-effective option for organisations, taking advantage of existing customer IP networks between sites. 

New Midrange End-to-End NVMe System

Last year NetApp introduced the AFF A800, an end-to-end NVMe system backed by an Efficiency Guarantee. With this announcement, NetApp has extended these benefits to the midrange market. The AFF A320 system enables customers to:

  • Accelerate traditional and emerging enterprise applications such as artificial intelligence and deep learning, analytics, and databases with extremely low latency.
  • Reduce data center costs by consolidating applications with a powerful system.
  • Future-proof their environment with NVMe technology, 100GbE Ethernet, and cloud integration.

Expanded Services Portfolio to Meet Broader Customer Needs

NetApp also announced that its expanded services portfolio now includes:

  • SupportEdge Prestige offers a high-touch, concierge level of technical support that resolves issues faster through priority call routing. Customers are assigned a designated team of NetApp experts and receive specialised reporting, tools, and storage environment health assessments.
  • Tiered Deployment Service accelerates time to value for new NetApp technology and reduces the risk of improper installation or misconfiguration. Three new high-quality options include Basic, Standard and Advanced Deployment, each aligned to customer business objectives.
  • Managed Upgrade Service is a remotely delivered service that reduces security risks by ensuring NetApp software is always up to date with all security patches and firmware upgrades.

Dell seeks to make hybrid cloud simpler to deploy and manage


Dell Technologies has unveiled Dell Technologies Cloud, a new set of cloud infrastructure solutions to make hybrid cloud environments simpler to deploy and manage.

Combining VMware and Dell EMC infrastructure, Dell Technologies Cloud removes cloud complexity by offering consistent infrastructure and operations for IT resources, across public and private clouds and the edge, regardless of location.

According to IDC, more than 70% of companies are using multiple cloud environments, and the largest data center challenge most companies face is developing a successful multi-cloud strategy. Operating in multiple clouds has caused organizations to onboard many management consoles and disparate processes, which stifles innovation and adds complexity. The hybrid cloud approach offers an ideal solution: a familiar management interface that extends across clouds for a simplified overall experience. With VMware research finding that 83 percent of cloud adopters are seeking consistent infrastructure and operations from the data center to the cloud, Dell Technologies Cloud is designed specifically to address this challenge.

“For many organizations, the increasingly diverse cloud landscape is resulting in an enormous amount of IT complexity, and no one is more qualified or capable to help customers solve this challenge than Dell Technologies,” said Jeff Clarke, vice chairman of products and operations, Dell Technologies. “Cloud is not a destination; it’s an operating model. With Dell Technologies Cloud and joint engineering between Dell EMC and VMware, we are offering a unified hybrid cloud experience. This provides consistent infrastructure and operations at every location the cloud resides—from on-premises data centers to public clouds and the emerging edge—allowing our customers to have greater control of their multi-cloud journey.”

The Dell Technologies Cloud portfolio consists of the new Dell Technologies Cloud Platforms and the new Data Center-as-a-Service offering, VMware Cloud on Dell EMC. These enable a flexible range of IT and management options with tight integration and a single vendor experience for purchasing, deployment, services and financing. Dell Technologies Cloud gives customers more control as the operational hub of their hybrid clouds, on premises, with consistent cloud infrastructure across all cloud types and a broad set of more than 4,200 VMware Cloud Provider Program providers and hyperscalers.

This hybrid cloud approach is delivered through a powerful integration of hardware, software, services and consumption options from Dell, Inc., the No. 1 global revenue leader in cloud IT infrastructure, and VMware, ranked No. 1 in revenue of cloud systems management software based on IDC’s latest research.

The Dell Technologies Cloud complements this core technology with a broad set of value-added services, such as security, data protection and lifecycle management. It helps ensure success through consulting, infrastructure deployment, management, support and education services while offering public cloud-like consumption of IT infrastructure.


Digital transformation hindered by cyber risks


Asia Pacific (APAC) organizations’ failure to prioritize cybersecurity is hindering their digital transformation journeys, according to the findings of an Asia Pacific study conducted by IT analyst firm Frost & Sullivan.

The study, commissioned by Forcepoint, finds that most APAC organizations (83%) don’t think about cybersecurity while embarking on digital transformation projects. Although a majority of organizations (72%) conduct regular breach assessments to protect themselves against cyberattacks, 55% of them were still at risk. The study reveals that cloud is a key component of digital transformation (69% of respondents have adopted cloud), but most organizations think cybersecurity is the responsibility of their cloud service provider.

“It’s clear from this study that many APAC organizations are on the back foot when it comes to enterprise cybersecurity in the borderless organization,” said Kenny Yeo, Industry Principal, APAC ICT, Frost & Sullivan. “Security leaders need to look beyond perimeter security, leverage automation, and have a better grasp of the psychology of both cybercriminals and their business users. Incorporating behavior modelling into their IT security architecture is certainly a way to identify potential risks and fend off cyberattacks.”

Digital transformation hindered by cyber risks

The study reveals a big push among APAC organizations, with 95% of respondents having embarked on a digital transformation journey, adopting emerging technologies including cloud computing, mobility, the internet of things, and artificial intelligence/machine learning. However, most organizations (65% of respondents) acknowledged that they are seriously hampered in executing digital transformation projects due to rising cyberattacks.

One of the key reasons for this is business leaders’ less mature approach to involving cybersecurity when designing digital transformation projects: 83 percent of organizations did not consider cybersecurity until after their digital transformation projects had begun.

“Organizations today urgently need to embrace ‘secure by design’ in their digital transformation projects. Adopting a behavior-centric security approach that focuses on understanding users’ behavior on the network and within applications to identify behavioral anomalies can mitigate cyberattacks before they happen,” said Alvin Rodrigues, senior director and security strategist at Forcepoint Asia Pacific.

Serious misconceptions around security in the cloud

Cloud has become one of the key components driving digital transformation, with 69% of organizations adopting it. However, 54% of respondents perceive that their cloud service provider bears full responsibility for security. In reality, security and compliance are a shared responsibility between an organization and its cloud service provider. This serious misconception about responsibility for security in the cloud is resulting in a higher number of cyberattacks.

Existing cybersecurity measures are not proving enough for enterprises to protect against cyber incidents

The findings suggest that the majority of organizations have taken measures to protect themselves against cyber incidents, with 72% performing breach assessments at least once per quarter. Despite this readiness, 55% of organizations were at risk: either they had encountered a security incident before, or they had not done any checks to assess whether they had been breached.

  • 35% of APAC organizations suffered at least one cybersecurity incident in the last 12 months.
  • On a country level, Indian (69%) and Australian (63%) firms were found to be most at risk of cyberattack.

Security blind spots in digital transformation

The study reveals the impact digital transformation is having on each organization’s risk posture. As more digital technology, such as cloud and mobility, is built into the business, it opens each organization up to more threats. Data exfiltration, impersonation (both theft of digital identity and online brand impersonation), loss of intellectual property and malware infection emerged as the top security blind spots for organizations rolling out digital transformation. These five incidents, the study states, have high levels of business impact and long recovery times.

Organizations demand more from their data


Global organizations are demanding more from their data management investments, despite most estimating that they achieve more than double the amount they invest, finds research from Veritas Technologies.

The Value of Data study, conducted by Vanson Bourne for Veritas, surveyed 1,500 IT decision makers and data managers across 15 countries. It reveals that although companies see an average return of USD $2.18 for every USD $1 they invest in improving data management, an overwhelming majority (82 percent) of businesses expect to see an even higher return. In Singapore, while companies see a similar ROI of USD $2.16, 86 percent of businesses still expect more from every USD $1 they invest in data management.

Just 15 percent achieved the ROI they expected to receive, while only 1 percent said the ROI they achieved exceeded expectations.

Globally, businesses admit the key factors preventing them from improving their ROI are a lack of the right technology to support data management (40 percent), a lack of internal processes (36 percent) and inadequate employee engagement or training (57 percent). A third (33 percent) also cited an absence of support from senior management as a barrier to achieving a higher return on data management investment. In Singapore, 70 percent of the businesses cited inadequate employee engagement or training as a significant factor, in addition to a lack of right technology to support data management (48 percent), insufficient internal processes (37 percent) and a lack of buy-in from senior management (38 percent).

“Mismanaging data can cost businesses millions in security vulnerabilities, lost revenues and missed opportunities, but those that invest wisely are seeing the incredible potential of their data estates. Unfortunately, technological or people-related challenges have hindered the ability of organizations to realize the full value of their data,” said Sheena Chin, country director for Singapore, Veritas.

“To fully reap the benefits of effective data management, organizations must arm themselves with the ability to access, protect and derive insights from their data. By promoting a cultural shift in the way data is managed, which includes buy-in from leadership as well as tools, processes and training, companies can empower employees with full visibility and control of data.”

Take care of your data, and it will take care of you

Organizations that are investing in the proper management of their data say they are already benefiting from their investment and are meeting the objectives they set out to achieve. Globally, respondents ranked increased data compliance, reduced security risks, cost savings, and the ability to drive new revenue streams and market opportunities as the most attractive benefits of improving data management. In Singapore, organizations cited similar key benefits, in addition to empowering employees to be more productive through improved data management.

Of the organizations that are investing in looking after their data, four in five (81 percent globally and 79 percent in Singapore) say they are already experiencing increased data compliance and reduced data security risks, while 70 percent globally (69 percent in Singapore) are seeing reduced costs. Nearly three-quarters (72 percent globally and 66 percent in Singapore) are driving new revenue streams or market opportunities as a result of investing in data management.

“As cases of high-profile data breaches and threats of hefty fines for regulatory non-compliance continue to plague the headlines, one of the biggest drivers for businesses to invest in data management is to protect their data. Importantly, many are also benefitting greatly from the ability to use their data more intelligently. Organizations that invest in overcoming the barriers to effective data management will reap significant competitive advantages in today’s digital economy,” added Chin.




