IDG Contributor Network: What can be uncovered when big data meets the blockchain

As defined by the World Economic Forum (WEF), “Blockchain technology allows parties to transfer assets to each other in a secure way without intermediaries. It enables transparency, immutable records, and autonomous execution of business rules.”

Investments in the blockchain are on the rise. Banks, private businesses, and even governments are investing in the technology. The WEF predicts that smart contracts alone on the blockchain could equal 10 percent of the global GDP by 2027. As with any new technology, it’s important to note the value of the data that comes with it. And arguably the most valuable data involved with blockchain technology is that of virtual currency use.

Over the past few years, driven by the development of bitcoin, the use of virtual currencies has gained momentum around the world. However, the currencies remain unpredictable and only partially understood, even by experts. To create clarity, organizations interested in the blockchain, and more specifically bitcoin, are beginning to use big data to provide insights into the virtual currency’s future performance. IDC recently reported that global IT spending is in the trillions and that big data revenues will grow to more than $203 billion by 2020. By 2027, the value of big data as a service is predicted to be between $500 billion and $1 trillion, and managing virtual currency on the blockchain could account for a significant portion of that revenue.

One of the best ways to realize the full potential of the virtual currency, and alleviate some of the risks involved, may be to harness its big data. Because the blockchain is essentially a ledger that every bitcoin transaction must pass through for verification by millions of peer-to-peer users, the insights to be collected from that ledger are potentially endless. And even though a transaction on the blockchain carries no identifying information (the blockchain records only the two wallets that exchanged the currency and the amount), all transactions are public and each user’s activities are visible to everyone on the blockchain.
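
To make that transparency concrete, here is a minimal, hypothetical Python sketch of what analyzing the public ledger might look like. The transaction structure is deliberately simplified (real bitcoin transactions carry multiple inputs and outputs, fees, and timestamps), and the wallet addresses and helper names are illustrative only.

    from collections import defaultdict
    from dataclasses import dataclass

    # Simplified view of a public ledger entry: no identities, just the two
    # wallet addresses and the amount exchanged. Real bitcoin transactions
    # have multiple inputs/outputs, fees, and timestamps.
    @dataclass
    class Transaction:
        sender: str        # sending wallet address (pseudonymous)
        receiver: str      # receiving wallet address (pseudonymous)
        amount_btc: float

    def wallet_activity(ledger):
        """Aggregate total coins sent and received per wallet address."""
        totals = defaultdict(lambda: {"sent": 0.0, "received": 0.0})
        for tx in ledger:
            totals[tx.sender]["sent"] += tx.amount_btc
            totals[tx.receiver]["received"] += tx.amount_btc
        return dict(totals)

    # Three public transactions between pseudonymous wallets.
    ledger = [
        Transaction("1WalletA", "1WalletB", 0.5),
        Transaction("1WalletB", "1WalletC", 0.2),
        Transaction("1WalletA", "1WalletC", 1.0),
    ]
    print(wallet_activity(ledger))

Even this trivial aggregation shows how flows between wallets can be traced without ever knowing who controls them, which is exactly the property analysts exploit.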

Organizations are beginning to perform data analysis on virtual currency activities to uncover powerful insights into trends and events surrounding currencies like bitcoin. So what type of trends can be revealed? Here’s a brief overview of what data enthusiasts are beginning to unmask in the world of the blockchain and cryptocurrencies with a big data mindset and the right data analytics tools.

Uncovering transactional data

The data within the blockchain is predicted to be worth trillions of dollars as it continues to make its way into banking, micropayments, remittances, and other financial services. In fact, the blockchain ledger could be worth up to 20% of the total big data market by 2030, producing up to $100 billion in annual revenue. To put this into perspective, that potential revenue surpasses what Visa, Mastercard, and PayPal currently generate combined.

Big data analytics will be crucial in tracking these activities and helping organizations using the blockchain make more informed decisions. SpreadCoin, a cryptocurrency that is more decentralized than bitcoin, proposes reinventing bitcoin mining as big data mining. Its whitepaper outlines the development of an incentive system to support a broad system of node managers at a time when bitcoin usage is on the rise and the number of nodes is declining.

Data intelligence services are emerging to help financial institutions, governments, and all kinds of organizations delve into who they might be interacting with on the blockchain and uncover “hidden” patterns.

Uncovering social data

As the popularity of bitcoin advanced in 2013 and 2014, the virtual currency’s value began to rise and fall in response to real-world events and the general public’s sentiment about the technology. These fluctuations showed that the virtual currency has several characteristics that make it well suited to social data predictions. According to Rick Burgess of Freshminds: “Using social data to predict consumer behavior is nothing new, and many traders have been looking to include social metrics into their trading algorithms. However, because there are so many factors involved in pricing most financial instruments, it can be extremely difficult to predict how markets will change.”

Fortunately, bitcoin users and social media users tend to align quite well, and it may be beneficial to use them both for data analysis, as he further explains:

  • Bitcoin traders tend to be in the same demographic as social media users, and so their attitudes, opinions, and sentiment towards bitcoin are well documented.
  • The value of bitcoins is determined almost solely by market demand, because the number of coins on the market is predictable and the coins are not tied to any physical goods.
  • Bitcoin is predominantly traded by individuals rather than large institutions.
  • Events that affect bitcoin value are disseminated first and foremost on social media.

Data analysts are now mining social data for insights into key cryptocurrency trends. This, in turn, helps organizations uncover powerful demographic information and link bitcoin’s performance to world events.
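
One of the points listed above, the predictability of bitcoin’s supply, is easy to make concrete: the protocol fixes the block subsidy at 50 BTC and halves it every 210,000 blocks, so the eventual total is capped at roughly 21 million coins. The short Python sketch below reproduces that schedule; it is an illustration of the issuance rule, not of any analytics product.

    # Bitcoin's issuance schedule: the block subsidy starts at 50 BTC and
    # halves every 210,000 blocks, so total supply is predictable and
    # asymptotically approaches roughly 21 million coins.
    def approximate_max_supply(halving_eras: int = 33) -> float:
        subsidy = 50.0
        blocks_per_era = 210_000
        total = 0.0
        for _ in range(halving_eras):
            total += subsidy * blocks_per_era
            subsidy /= 2.0
        return total

    print(f"Approximate maximum supply: {approximate_max_supply():,.0f} BTC")
    # -> Approximate maximum supply: 21,000,000 BTC

Because the supply side is fixed by code, price movements largely reflect demand, which is why sentiment signals from social media carry so much predictive weight for bitcoin.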

Uncovering new forms of data monetization

According to Bill Schmarzo, CTO of Dell EMC Services, blockchain technology also “has the potential to democratize the sharing and monetization of data and analytics by removing the middleman from facilitating transactions.” In the business world, this gives consumers stronger negotiating power over companies. It allows consumers to control who has access to their data through the blockchain. They could then demand pricing discounts in exchange for revealing data on their personal consumption of a company’s product or service.
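
As a purely hypothetical sketch of that idea, the Python snippet below models an append-only “consent ledger” in which a consumer records which company may read which data set, with later entries overriding earlier ones. The field names, hashing scheme, and access rule are assumptions made for illustration; they do not describe any specific blockchain or product.

    import hashlib
    import json
    import time

    # Hypothetical append-only consent ledger: each entry records a grant or
    # revocation of access to one of the consumer's data sets, chained to the
    # previous entry by its hash. Illustrative only.
    def make_entry(prev_hash, owner, grantee, dataset, allowed):
        entry = {
            "prev_hash": prev_hash,
            "timestamp": time.time(),
            "owner": owner,       # identity/wallet of the data owner
            "grantee": grantee,   # company requesting access
            "dataset": dataset,   # e.g. "purchase_history"
            "allowed": allowed,   # True = grant, False = revoke
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        return entry

    def has_access(ledger, owner, grantee, dataset):
        """Latest matching entry wins: scan for the most recent grant/revoke."""
        allowed = False
        for e in ledger:
            if (e["owner"], e["grantee"], e["dataset"]) == (owner, grantee, dataset):
                allowed = e["allowed"]
        return allowed

    ledger = [make_entry("0" * 64, "consumer_1", "retailer_A", "purchase_history", True)]
    ledger.append(make_entry(ledger[-1]["hash"], "consumer_1", "retailer_A", "purchase_history", False))
    print(has_access(ledger, "consumer_1", "retailer_A", "purchase_history"))  # False: access was revoked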

Schmarzo also explains how the blockchain may lead to new forms of data monetization because it has the following big data ramifications:

  • All parties involved in a transaction have access to the same data. This accelerates data acquisition and sharing and improves data quality and analytics.
  • A detailed register of all transactions is kept in a single “file” or blockchain. This provides a complete overview of a transaction from start to finish, eliminating the need for multiple systems.
  • Individuals can manage and control their personal data without the need for a third-party intermediary or centralized repository.

Ultimately, the blockchain could become a key enabler of data monetization by creating new marketplaces where companies and individuals can share and sell their data and analytical insights directly with one another.

Spearheaded by the large-scale adoption of bitcoin, blockchain technologies are gaining ground throughout the business and financial worlds. The fast, secure transactions they facilitate could potentially revolutionize traditional data systems. According to a survey by KPMG and Forrester Consulting, only one-third of decision makers trust their company’s data. But with blockchain technologies, this trust can be considerably strengthened, and real applications will become much more commonplace.

This article is published as part of the IDG Contributor Network.

Source: InfoWorld Big Data

Report: Most Businesses Will Increase Cloud Computing Spending In 2017

Businesses have become less skeptical of cloud computing, more confident in its security, and more inclined to invest money in it, according to new findings. Nearly 70% of U.S. businesses surveyed by B2B ratings and reviews firm Clutch say that they plan to increase spending on cloud computing in 2017. One in five of those businesses report that their cloud computing spending this year will likely increase by more than 30%.

The increased spending on cloud computing is likely due to a shift in perspective, according to experts. The cloud is no longer seen by many businesses as simply an alternative option, but as the next logical step for data storage and management.

“Cloud is the new normal,” said Jeremy Przygode, CEO of Stratalux, Inc., a California-based managed service provider (MSP). “When businesses need to evaluate new solutions, or need to do a hardware refresh on existing solutions … Cloud is the go-to solution to figure out how to do that.”

Businesses also report having greater confidence in cloud security, possibly influencing more migration to the cloud. In the survey, the largest percentage of businesses identify security as a benefit of using the cloud.

This attitude is a shift from past years, when cloud security was often treated with skepticism and distrust.

When it comes to which type of cloud to use, most businesses are using a private cloud, where services and infrastructure are maintained on a private network. However, over 80% indicate that they are considering or planning to implement a hybrid cloud option in the future. A hybrid cloud spreads services and infrastructure between a private network and an off-site cloud provider.

A hybrid cloud offers flexibility and customizable features.

“Customizing your cloud experience allows the customer to leverage different toolsets that’s truly drilled down to their department, their individuals, and how they do business,” said Kevin Rubin, President and COO of Stratosphere Networks, a Chicago-based IT MSP.

Experts emphasize, however, that a business should select the option that best fits its particular needs.

Once the decision is made to adopt cloud computing, many businesses seek outside help for the subsequent installation. Over half of businesses surveyed (57%) say they hire an external consulting firm to help them implement their cloud strategy.

This is a wise decision, especially if a business does not have internal expertise, says Haresh Kumbhani, founder and CEO of Zymr, Inc., a San Francisco-based cloud consulting and agile software development agency. He believes that no business will have all the expertise required.

“The idea of bringing experts in different dimensions, from strategy to implementation, delivery and security, surveillance and so on, is essential for making purchase and implementation decisions,” he said.

Read the full report here.

Source: CloudStrategyMag

TierPoint Cited As A Strong Performer In The Hosted Private Cloud Sector

TierPoint LLC has announced that it was recognized as a “Strong Performer” in The Forrester Wave™: Hosted Private Cloud Services, North America, Q2 2017. In the report, analysts from Forrester Research evaluated nine providers that met the qualifications to participate in the Wave.

In the report, TierPoint received the top score among all participants in the “Services and Customer Experience” criterion. TierPoint was cited for “customer reference reviews that rave about its innovative and technically astute customer-centric support teams.” In their comments, Forrester analysts pointed to TierPoint’s strength across onboarding, ongoing support, and customer service.

TierPoint earned the highest possible score (5 out of 5) for its data center locations and for “Business Technology (BT) Vision.” The latter is a Forrester criterion that considers an evaluated vendor’s vision for hosted private cloud and how that vision aligns “with near and future trends, customer demands, and Forrester’s business technology (BT) vision of the hosted private cloud service.” Forrester also noted TierPoint for “delivering high levels of network and onsite security along with a massive North American data center footprint.”

“I believe our recognition in this report reflects the strength of our strategy and the importance we place on collaborating with our customers to provide the best customized cloud solutions for them,” said Terry Morrison, chief technology officer, TierPoint. “Our hosted private cloud is built from market-leading technologies that, along with our capabilities and experience, make us an industry leader in hybrid IT solutions.”

Source: CloudStrategyMag

Cirba™ Unveils Optimization Service For Public Cloud

Cirba, Inc. has unveiled an analytics service, Densify™, for optimizing public cloud and on-premise virtual infrastructure. The Densify service is the next evolution of SaaS, combining the company’s predictive and real-time optimization analytics with an assigned Densification Advisor™ and eliminating the burden of learning, operating, managing, and maintaining software. Cirba also announced it is rebranding to Densify, reflecting the company’s strategic value proposition of reducing cloud and infrastructure costs while also improving application performance. The Densify service offers fast time to value, ease of adoption and use, and the greatest potential for savings in the market through the industry’s most powerful optimization engine. For a limited time, customers can try the Densify service free for the first 14 days.

“Based on our experience, we know that organizations desperately want to reduce their cloud bill and show highly utilized infrastructure, but there are only so many hours in the day,” said Gerry Smith, CEO of Densify. “We understand that IT is looking for more than a software product — they want outcomes. They are looking for experts who can tune and watch over the automated analytics; they want to know that someone is there for them, helping them deliver results – not another management and maintenance burden.”

Driving Immediate Business Value

With a unique combination of SaaS-based analytics and the Densification Advisor service, Densify is a complete service for automated optimization of public, on-premise or hybrid cloud infrastructures. With Densify, companies automate virtual machine (VM) placement and resource allocation actions to proactively remove risk and drive the lowest unit cost. The new service also offers precise, data-driven cloud migration and transformation strategies and real-time hybrid and multi-cloud workload placements. One of the key benefits of Densify is that it delivers better application performance with the highest asset utilization and lowest public cloud spend without requiring any special training. The service is up and running within 15 minutes, maximizing efficiency, simplicity and results for customers under the guidance of the Densification Advisor.

“Densify serves as the ‘brain’ of our customers’ environments, right-sizing them, taking the guess work out of adding new applications, and determining the best way to leverage cloud,” said Andrew Hillier, CTO and co-founder of Densify. “The analytics engine is extremely robust and has the ability to model some really advanced modernization techniques such as stacking workloads in bare metal clouds or optimizing container placement within a public cloud instance, which can save upwards of 80 percent of the cloud bill for some customers. Our approach is completely different from other products that are reactionary and can’t offer any visibility or insight into the reasoning or best practices of cloud utilization.”

The new service offering delivers immediate business value in these key areas:

  • Reduced Cloud Costs – Densify examines detailed public cloud utilization and billing data to help customers actively reduce costs for services such as Amazon Web Services (AWS)®, Microsoft Azure®, Google® Cloud Platform and IBM® SoftLayer®. Densify leverages workload pattern analysis and industry benchmarks to enable more advanced optimization of cloud usage. For customers, this translates into savings of 41% on average.
  • Reduced Infrastructure Requirements – In bare metal clouds and on-premise infrastructure, such as an internal VMware® environment, Densify analyzes workload patterns to right-size allocations and strategically place VMs (a simplified right-sizing sketch follows this list). Densify is the only solution that can dovetail workloads to increase density by an average of 48%, resulting in a savings of 33%.
  • Better Performing Applications – Densify’s predictive analytics leverage historical patterns to model what a workload is going to do in the future. This enables Densify to automatically and proactively place and size VMs to avoid compute and storage risks. It also provides real-time responses to address operational anomalies and unexpected resource shortfalls.
  • Optimized Application Placement and Transformation – Densify reviews all application requirements and workload patterns to automatically place workloads in the best hosting environments whether in the cloud or on-premise according to requirements, cost and strategic priorities. It also provides detailed cloud migration and technology refresh plans.
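
The following Python sketch illustrates the general idea of utilization-based right-sizing that this kind of analysis relies on: pick the cheapest instance type whose capacity covers the observed peak demand plus some headroom. The instance names, prices, and headroom factor are assumptions made for the example; this is not a description of Densify’s actual analytics.

    # Hypothetical utilization-based right-sizing: choose the cheapest instance
    # type that still covers peak CPU demand with some headroom. Instance names,
    # prices, and the headroom factor are illustrative assumptions.
    INSTANCE_TYPES = [          # (name, vCPUs, hourly price in USD)
        ("small",  2, 0.096),
        ("large",  4, 0.192),
        ("xlarge", 8, 0.384),
    ]

    def right_size(vcpus_provisioned, peak_cpu_utilization, headroom=1.2):
        """Return the cheapest instance whose vCPUs cover peak demand plus headroom."""
        needed_vcpus = vcpus_provisioned * peak_cpu_utilization * headroom
        for name, vcpus, price in sorted(INSTANCE_TYPES, key=lambda t: t[2]):
            if vcpus >= needed_vcpus:
                return name, price
        name, vcpus, price = INSTANCE_TYPES[-1]  # nothing smaller fits
        return name, price

    # A workload on 8 provisioned vCPUs that peaks at 30% utilization needs
    # only ~2.9 vCPUs, so the 4-vCPU "large" type is sufficient.
    print(right_size(8, 0.30))  # -> ('large', 0.192)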

Addressing Cloud Infrastructure Market Challenges

Despite a growing desire to move to the cloud, many organizations remain concerned about the complexity and risk of such a transition, the scarcity of skilled resources that truly understand new cloud environments, and the time and costs required to manage these new implementations. Other cloud optimization solutions require companies to learn and manage tools, which is resource-intensive. At the same time, these offerings lack deep analytics to truly identify how to reduce and control operating costs and risk, putting customers in an ongoing reactive mode with ever-escalating operating expenses. In contrast, Densify’s deep, pattern-based workload analysis provides sophisticated cloud optimization capabilities through right-sizing and advanced hosting strategies, such as leveraging containers. With this unique approach, Densify safely reduces cloud spend by 20 to 80 percent.

“Enterprises are often painfully reminded by their monthly AWS and Azure bills that shoveling their apps to public cloud can lead to unpredictable OPEX,” said Torsten Volk, managing director, Hybrid Cloud & Infrastructure Management, Enterprise Management Associates (EMA). “Our research shows that cost control is the number one priority in hybrid cloud operations, driving demand for a solution that increases the efficiency of public cloud use and provides guidance regarding which public cloud service offers the desired compromise between risk and cost. Densify’s service provides exactly this analytics and control layer to instantly optimize existing application environments and show how future environments will be deployed in a policy driven and cost effective manner. This gets even more interesting when considering the new economics introduced by container-as-a-service offerings and by server-less functions.”

Source: CloudStrategyMag

Epsilon Selects 1025Connect For Direct Access To Cloud Connectivity

1025Connect has announced the availability of Epsilon’s Cloud Link eXchange Platform (CloudLX) service offering at its Long Island facility, which is designed for network interconnection and colocation. Epsilon, a privately owned global communications service provider, also recently announced the establishment of its United States headquarters within the same property where 1025Connect is located.

This location will allow Epsilon to expand its services throughout the U.S. with direct access to multiple subsea cables for enhanced global connectivity at the Continental Edge. 1025Connect customers can now leverage CloudLX, available through Epsilon’s on-demand connectivity platform, Infiny by Epsilon, to rapidly interconnect new services and benefit from direct connectivity to leading global cloud providers, including Amazon Web Services (AWS) and Microsoft Azure. This deployment enables greater quality and high-speed connectivity for Long Island businesses and international network traffic.

“The CloudLX module of Infiny accelerates access to global cloud service providers and gives customers a friction-free model for connecting the cloud,” states Carl Roberts, chief commercial officer of Epsilon. “It is making connecting the cloud simple and removes the limits on how service providers and enterprises grow in the U.S., and around the world.”

“Long Island is host to a great number of leading enterprise businesses and is a key landing point for subsea systems interconnecting North America to Europe,” comments Dan Lundy, managing director, 1025Connect. “The location of 1025Connect on this Continental Edge offers a unique opportunity to bypass traditionally congested network routes, eliminating multiple points of failure while saving customers on cross connect fees. 1025Connect is the pinnacle of diverse, reliable, network enablement sitting at the true nexus of subsea cable systems and the cloud.” 

1025Connect delivers direct access to multiple submarine cable systems connecting North America, Europe and Latin America, as well as the ability to bypass Manhattan fiber routes for greater network redundancy and diversity. 1025Connect is also home to the easternmost peering point in the New York metro area, enabling easier access and delivery of content distributed to the Continental Edge.

Source: CloudStrategyMag

Nvidia's new TensorRT speeds machine learning predictions

Nvidia has released a new version of TensorRT, a runtime system for serving inferences using deep learning models through Nvidia’s own GPUs.

Inferences, or predictions made from a trained model, can be served from either CPUs or GPUs. Serving inferences from GPUs is part of Nvidia’s strategy to get greater adoption of its processors, countering what AMD is doing to break Nvidia’s stranglehold on the machine learning GPU market.

Nvidia claims the GPU-based TensorRT is better across the board for inferencing than CPU-only approaches. In one of Nvidia’s proffered benchmarks, the AlexNet image classification test under the Caffe framework, Nvidia claims TensorRT is 42 times faster than a CPU-only version of the same test (16,041 images per second vs. 374) when run on its Tesla P40 GPU. (Always take industry benchmarks with a grain of salt.)

Serving predictions from a GPU is also more power-efficient and delivers results with lower latency, Nvidia claims.

TensorRT doesn’t work with anything other than Nvidia’s own GPU lineup, and is a proprietary, closed-source offering. AMD, by contrast, has been promising a more open-ended approach to how its GPUs can be used for machine learning applications, by way of the ROCm open source hardware-independent library for accelerating machine learning.

Source: InfoWorld Big Data

GlobalSCAPE, Inc. Launches First iPaaS Solution

Rapid cloud adoption has made powerful applications readily accessible and easy to implement across an organization, but many of these applications do not natively communicate or connect, requiring significant IT development time and cost to integrate. Addressing this dilemma, GlobalSCAPE, Inc. has announced that it has extended the company’s reach further into the cloud with the launch of Kenetix, its new integration platform as a service (iPaaS) solution.

Kenetix is a markedly different kind of data integration platform, focused on bringing rapid digital transformation to businesses with the quick connection to, and integration of, new and existing technologies like cloud-based applications, Internet of Things (IoT) devices and third-party application interfaces. With Kenetix, customers are empowered to rapidly create complex application and data integrations in the cloud, with the utmost security, reliability, transparency, and control.

“Globalscape’s decades-long leadership in secure, flexible and highly manageable data movement remains unmatched. However, as more and more customer applications move to the cloud, we saw the need and opportunity to apply our expertise to helping customers accelerate that transition. With Kenetix, we are delivering an extremely powerful and agile platform that not only allows customers to unleash the power of their data with very little technical expertise, but also establishes a platform through which we can extend and expand our own portfolio for maximum customer value. We see this as a natural extension to our work in managed file transfer and our larger desire to help our customers overcome their most complex data obstacles,” said Matt Goulet, president and CEO at Globalscape.

Kenetix supports application connectivity in an effort to synchronize data across systems, and empowers business teams to create and integrate applications without having to wait on IT.

The new platform differentiates itself from other iPaaS solutions with easy-to-use yet powerful functionality that meets the demands of large enterprises, while also being simple and intuitive enough to satisfy the needs of non-technical staff or line-of-business managers who require the flexibility to develop their own data integrations quickly. Kenetix comes with a suite of features and benefits that make it the ideal solution for application and data integration, business process automation, and microservices orchestration. Among the many features are:

  • Access to a library of connectors for more than 120 applications, such as Salesforce, Marketo, and Basecamp
  • Simple drag-and-drop data mapping to ensure data integration flows without the need for coding
  • Ability to launch and orchestrate microservices as well as automate workflows quickly
  • Granular controls to ensure administrators can manage what application data individuals or teams have access to
  • Built-in reporting to facilitate audits and compliance mandates
  • Military-grade security with stringent audit and validation tools

Source: CloudStrategyMag

SolarWinds Simplifies Enterprise Network Management With New Products

SolarWinds has announced two new unified network management solutions — SolarWinds Network Operations Manager and SolarWinds Network Automation Manager — that the company will showcase at Cisco Live® US, June 25-29, 2017, in Las Vegas at the Mandalay Bay® Convention Center.

Ensuring mission-critical enterprise networks remain optimized and running at scale in today’s increasingly complex hybrid IT environments can be a daunting task. Selecting, deploying, and integrating software to help monitor such networks often adds to the challenge. These new SolarWinds offerings build on the comprehensive, single-pane-of-glass visibility of the SolarWinds® Orion® Platform to simplify large-scale network management. These unified products provide everything needed to monitor and manage networks in a seamlessly integrated solution available as a single purchase.

“Leveraging our experience with thousands of large-scale customers, we’ve designed our new unified network management solutions to streamline almost everything about network management software,” said Christoph Pfister, executive vice president of products, SolarWinds. “In essence, we’ve taken our most popular network monitoring and management capabilities and unified them into products specifically designed for large-scale networks. We’ve applied the SolarWinds philosophy of powerful, affordable, and easy-to-use software to enterprise network management.”

SolarWinds Network Operations Manager

Based on the SolarWinds Orion Platform, SolarWinds Network Operations Manager provides:

  • Fault, availability, and performance monitoring
  • Critical path hop-by-hop monitoring both on premises and in the cloud thanks to the SolarWinds NetPath™ feature
  • Bandwidth and traffic monitoring
  • Switch port monitoring with end-user tracking
  • Dynamic drag-and-drop dashboard creation with the SolarWinds PerfStack™ dashboard to better visualize networking, systems, and application relationships

SolarWinds Network Automation Manager

SolarWinds Network Automation Manager includes all the features of SolarWinds Network Operations Manager and adds:

  • Network configuration automation
  • DHCP, DNS, and IP address management
  • High availability for instantaneous failover

See SolarWinds Unified Network Management Solutions in Action at Cisco Live US, Booth 1111

Meet SolarWinds Head Geeks™ Patrick Hubbard, Leon Adato, and Destiny Bertucci, as well as other SolarWinds product experts at booth 1111 to participate in live demonstrations of SolarWinds Network Operations Manager and SolarWinds Network Automation Manager, along with the complete portfolio of products from SolarWinds. The company was recently named the global market leader in network management software by industry analyst firm International Data Corporation® (IDC®) in its Worldwide Semi-Annual Software Tracker.

Join “Blame the Network, Hybrid IT Edition”

On Tuesday, June 27, 2017 from 5-5:30 p.m. Pacific Time, Hubbard will also present, “Blame the Network, Hybrid IT Edition,” as part of the Cisco Live US 2017 Think Tank at booth 1601. During the talk, Hubbard will address the following:

“Hybrid IT, the conflagration of distributed on-premises, cloud, and Software-as-a-Service (SaaS) applications has a major downside: the opportunity for businesses to blame network engineers for service issues. Troubleshooting and resolution are further complicated by application evolution, causing many of the go-to tools we’ve relied on for years to become less effective, or worse, obsolete. In this session, you’ll learn how both IT organizations and technology vendors are trying to not only catch up, but get ahead in this time of IT transition, and continue to assure visibility of the complete Hybrid IT network.”

Source: CloudStrategyMag

IBM Named A Leader In Gartner Magic Quadrant For DRaaS

IBM has announced that Gartner, Inc. has positioned IBM as a Leader in the Gartner Magic Quadrant for Disaster Recovery as a Service (DRaaS) for the third consecutive year. And this year, IBM was once again positioned as one of the highest in ability to execute and the furthest in completeness of vision.

IBM continues to offer one of the most comprehensive sets of related professional and managed services in the industry, and it enhanced its long-standing history in disaster recovery with its 2016 purchase of Sanovi Technologies to complement its IT and applications resiliency orchestration capabilities. IBM also continues to integrate IBM Watson into its resiliency services offerings to enhance disaster avoidance.

“We believe IBM’s recognition in Gartner’s Magic Quadrant for Disaster Recovery as a Service report speaks to the value, consistency and reliability our clients find in our resiliency services across the globe, as we work to innovate and deliver for them every day,” said Laurence Guihard-Joly, general manager, IBM Resiliency Services. “Beyond natural disasters and other outages, the increasing number of cyberattacks and other threats today requires the most advanced and most proactive set of solutions possible so our clients can combat those threats and focus on differentiating in the era of ‘always-on’ expectations.”

IBM Resiliency Services offers an innovative portfolio of resiliency and business continuity solutions and services with expanded public cloud options that give clients greater flexibility and agility in managing their backup and disaster recovery workloads. Today, IBM operates over 300 global delivery data centers across 54 countries to help organizations of any size maintain continuous business operations and improve overall resiliency.

Source: CloudStrategyMag

Databarracks Recognised In Gartner’s Magic Quadrant For DRaaS For 2017

London-based provider Databarracks has been recognised in Gartner’s Magic Quadrant for Disaster Recovery as a Service 2017.

Peter Groucutt, managing director at Databarracks, commented on the announcement, “We are very proud to be included in the Magic Quadrant for DRaaS for a third consecutive year. We feel it is an incredible achievement for us as a specialist provider of disaster recovery and business continuity services and is a testament to the hard work and commitment of our team.

“There has been a great deal of change in the last year. We have seen some commoditisation in the market and this year’s MQ reflects that with many more service providers evaluated. I completely agree that DRaaS has now become a ‘mainstream offering’. Organisations now recognise that DRaaS is a mature service and the benefits extend beyond pure cost savings. The organisations we speak to appreciate the value a specialist DRaaS provider brings. Customers of DRaaS know that in the event of a disaster, they not only have the technology to allow for rapid recovery but the skills of recovery specialists to bring their IT back online, minimising data loss and downtime.

“In the last 12 months we have also seen further change in both the risk landscape and the attitude towards IT downtime. The recent WannaCry attack raised the level of awareness of ransomware beyond the IT team and into the wider business consciousness, both here in the UK and around the world. Major IT outages have also highlighted just how reliant organisations are on technology to deliver their services – whether that is an airline, a bank or a university.

“Our focus has been to continue to develop the next generation of DRaaS services. That means research and development to find new methods to protect against growing cyber threats. It also means extending our services to help our customers build out their Business Continuity Plans and be truly resilient businesses. We have some further exciting announcements to make on that front in the coming months.”

Source: CloudStrategyMag