Harness Hadoop and Spark for user-friendly BI

Big data shouldn’t be an area for only academics, data scientists, and other specialists. In fact, it can’t be. If we want big data to benefit industry at large, it needs to be accessible by mainstream information workers. Big data technology must fit into the workflows, habits, skill sets, and requirements of business users across enterprises.

Datameer is a big data analytics application that does exactly that. Combining the user interface metaphors of a file browser and a spreadsheet, Datameer runs natively on open source big data technologies like Hadoop and Spark, while hiding their complexity and facilitating their use in enterprise IT environments and business user scenarios.

In other words, Datameer creates an abstraction layer over open source big data technologies that integrates them into the stable of platforms and toolchains in use in enterprise business environments. Business users tap the power of big data analytics through a familiar spreadsheet workbook and formula interface, while also benefiting from enterprise-grade management, security, and governance controls.  

Before we dive into the details of the platform, we should note that Datameer supports the full data lifecycle, including data acquisition and import (sometimes referred to as “ingest”), data preparation, analysis, and visualization, as well as export to other systems, such as databases, file stores, and even other BI tools. 

Data import is achieved with more than 70 connectors to databases, file formats, and applications, providing diversified connectivity for structured, semistructured, and unstructured data. Nevertheless, once a given set of data is in Datameer, it can migrate through all of the data lifecycle stages mentioned previously, right along with data from other sources.

System architecture

At the heart of Datameer is the core server referred to internally as the conductor. The conductor orchestrates all work and manages the configuration of all jobs performed on the Hadoop cluster. It lets users interact with the underlying data sources via Datameer’s user interface, and it lets tools interact with the data via its API.

The conductor also has a special interactive mode that accommodates the user’s incidental work in the spreadsheet user interface. This interactivity is facilitated by Datameer’s Smart Sampling technology, which allows the user to work with a manageable and representative subset of the data in memory. When the design work is done, the workbook is executed against the full data set via a job submitted to the cluster.
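
Datameer's Smart Sampling is proprietary, but the general idea of working interactively against a representative in-memory subset of a much larger data set can be sketched with a simple reservoir sample. The function and parameter names below are illustrative, not Datameer's API:

```python
import random

def reservoir_sample(rows, k, seed=None):
    """Return k rows chosen uniformly at random from an iterable of
    unknown length, holding only k rows in memory at once."""
    rng = random.Random(seed)
    sample = []
    for i, row in enumerate(rows):
        if i < k:
            sample.append(row)
        else:
            # Replace an existing element with decreasing probability,
            # keeping the sample uniform over all rows seen so far.
            j = rng.randint(0, i)
            if j < k:
                sample[j] = row
    return sample

# Design work happens against the small sample; the full data set is
# only processed when the workbook job is submitted to the cluster.
subset = reservoir_sample(range(1_000_000), k=1000, seed=42)
```

The point of the technique is that memory use is bounded by the sample size, not the source size, which is what makes interactive spreadsheet-style work on big data feasible.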

This fluid movement between interactive design work by the user and bulk execution on the Hadoop cluster (orchestrated by the conductor) is the key to Datameer’s harmonization of open source big data and enterprise BI sensibilities and workflows.

Although Datameer works cooperatively with a Hadoop cluster, the application itself executes on a standalone server (“edge node”) or desktop PC running Windows, Linux, Unix, or macOS. It is compatible with modern browsers, including Safari and Chrome, as well as Microsoft’s Internet Explorer 11 and the new Edge browser in Windows 10.

Security and governance

From the very early days of the Datameer product – when Hadoop itself offered only file-level security – Datameer provided for role-based access controls on specific subsets of data, a non-negotiable requirement for most enterprises.

By sharing data through multiple workbooks, each of which may contain a different subset of data, and assigning permissions on each workbook to unique security groups, Datameer provides for the row-level security that enterprises need. Column-level security is accommodated as well, either through inclusion of select columns in a group-specific workbook or via masking of data in particular columns, for particular security groups.
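
Datameer's implementation of these controls is internal to the product, but the combination of row-level filtering, column projection, and masking described above can be sketched generically. The function, the group rule, and the sample data below are all hypothetical:

```python
def secure_view(rows, group, row_filter, visible_columns, masked_columns=()):
    """Produce a per-group view of tabular data: keep only rows the
    group may see, drop columns it may not, and mask sensitive ones."""
    view = []
    for row in rows:
        if not row_filter(group, row):
            continue  # row-level security: skip rows outside the group's scope
        projected = {}
        for col in visible_columns:
            value = row[col]
            if col in masked_columns:
                value = "***"  # column-level security via masking
            projected[col] = value
        view.append(projected)
    return view

rows = [
    {"region": "EMEA", "customer": "Acme", "ssn": "123-45-6789"},
    {"region": "APAC", "customer": "Globex", "ssn": "987-65-4321"},
]
emea_view = secure_view(
    rows,
    group="emea-analysts",
    row_filter=lambda g, r: r["region"] == "EMEA",
    visible_columns=("region", "customer", "ssn"),
    masked_columns=("ssn",),
)
# emea_view contains only the EMEA row, with the ssn column masked.
```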

While Datameer allows users, roles, and groups to be created and maintained, it can also integrate with Microsoft Active Directory or other LDAP user stores, authenticating users against, and assigning permissions to, the groups that are defined in those systems. Datameer can also integrate with enterprise single-sign-on (SSO) systems.

As a web application, Datameer can be served over HTTPS (SSL/TLS), encrypting all traffic between the user’s browser and the application.

Datameer provides full data lineage information, rendered in diagrammatic or columnar views (see figure below), so data can be tracked from import job to workbooks to individual chart widgets in business infographic data visualizations.

[Image: Datameer data lineage view. Credit: Datameer]

For audit control, Datameer supports an “event bus” listener-based API, wherein all user interface and data entity events (creation of workbooks, addition of users to groups, assigning or revoking of permissions) are broadcast to all API subscribers as they occur.

This event bus facilitates integration with external governance systems that may be in use at particular customer sites. For more standalone audit management, Datameer records these events in its own log files, which can in turn be imported into Datameer itself, then analyzed and visualized there.
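
Datameer's actual event-bus API is not documented here, but the listener pattern it describes is straightforward. A minimal publish/subscribe sketch, with hypothetical event names and payloads, might look like this:

```python
from collections import defaultdict

class EventBus:
    """Minimal listener-based event bus: subscribers register a callback
    for an event type and receive every matching event as it occurs."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_type, callback):
        self._subscribers[event_type].append(callback)

    def publish(self, event_type, payload):
        # Deliver the event to every subscriber for this event type.
        for callback in self._subscribers[event_type]:
            callback(payload)

audit_log = []
bus = EventBus()
# An external governance system subscribes to permission changes.
bus.subscribe("permission.granted", audit_log.append)
bus.publish("permission.granted", {"user": "alice", "workbook": "sales"})
# audit_log now holds the event for downstream audit analysis.
```

In this pattern the governance system is decoupled from the application: it only needs to register callbacks, which is why such an API integrates cleanly with whatever audit tooling a customer already runs.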

Data integration architecture

Because Datameer is designed to work natively with big data technology, even its data import and export functionality is run on the cluster. This allows for limitless horizontal scaling to facilitate data processing at very high volume. It’s an approach that sets Datameer apart from many of its competitors. 

Nonetheless, for smaller data sets, Datameer does provide file upload jobs and the ability to download the content of any sheet in a workbook on an ad hoc basis (in the form of a simple file).

Datameer accommodates workflows where the source data set remains in its home repository (database, file, and so on) and is queried only for Smart Sampling purposes and during workbook execution. These “data links” assure that data movement and duplication are minimized while still allowing for interactive work against the data source, and cluster-based processing against the full data set when the workbook is executed.

Data preparation, analytics, and visualization

The key to Datameer’s “aesthetic” is the use of successive columns in a sheet, and successive sheets in a workbook, to yield a gradual (and self-documenting) evolution of source data sets into the analyzed whole.

This approach combines the tasks of data preparation and analysis in a single environment, by providing a library of more than 270 spreadsheet formula functions that serve each purpose (and sometimes both purposes). An example of the Datameer workbook is shown below. 

[Image: A Datameer workbook. Credit: Datameer]

Formula functions run the gamut from mundane standbys like functions for manipulating text, formatting numbers, and doing simple arithmetic to functions that group data in specific ways for aggregational analysis to specialized functions for parsing file names, HTML content, and XML- and JSON-formatted data. You’ll even find functions that can mine text for organization names and parts of speech and provide sentiment analysis on individual words.

Formulas can be entered in a formula bar or built with the assistance of the Formula Builder dialog (below), which allows pointing-and-clicking on individual columns in particular sheets to supply their data as formula parameter values.

[Image: The Datameer Formula Builder. Credit: Datameer]

Each sheet in a workbook serves as a view on the data set because workbooks don’t alter the original data. Sheets have a cascading relationship where, for example, data from sheet A is used and transformed by sheet B, which is further used and transformed by sheet C. In this way, every transformation and analysis step is made transparent and easily discoverable.

What’s more, each sheet in a workbook can have its data profiled at any time by switching to that sheet’s Flipside view, which provides histogram visualizations for each column. Along with the histogram, Flipside shows the distribution of values the column contains, the data type, the total number of values and distinct values in the column, and the minimum, maximum, and average value for all data within it.
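
Flipside's internals aren't public, but the per-column statistics it surfaces (total and distinct counts, minimum, maximum, and average) are simple to compute. A sketch, with an illustrative function name:

```python
def profile_column(values):
    """Summarize a column the way a data-profiling view might:
    total count, distinct count, and min/max/average for numeric data."""
    values = list(values)
    numeric = [v for v in values if isinstance(v, (int, float))]
    return {
        "count": len(values),
        "distinct": len(set(values)),
        "min": min(numeric) if numeric else None,
        "max": max(numeric) if numeric else None,
        "avg": sum(numeric) / len(numeric) if numeric else None,
    }

print(profile_column([3, 1, 4, 1, 5]))
# {'count': 5, 'distinct': 4, 'min': 1, 'max': 5, 'avg': 2.8}
```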

Moving between different workbooks, the file browser, and various business infographics is made easy by Datameer’s context tabs, which allow the user to shift fluidly between the different views and return to each, in context, whenever that may be called for.

Context tabs, when combined with Datameer’s full data lifecycle functionality, perfectly facilitate working on data from multiple angles at once, rather than being forced into a linear, assembly-line approach of doing preparation, analysis, and visualization in a particular order. Many people prefer to work on each phase in a piecemeal fashion, bringing it all together at the end. Datameer fully supports that scenario.

Built using the latest HTML5 technologies, Datameer’s Infographic Designer (below) supports the creation and viewing of informative visualizations from any browser on a multitude of devices. The pixel-perfect design interface allows users to combine chart widgets with text annotations, images, videos, and other elements. Infographics are composed from more than 20 fully customizable drag-and-drop chart widgets, each based on the popular D3 visualization standard. The Infographic Designer can create high-end dashboards, operational reports, and beautiful, customized, special-purpose infographics.

[Image: The Datameer Infographic Designer. Credit: Datameer]

Smart Analytics

Datameer’s Smart Analytics technology provides four major algorithms that make it even easier to find the signal in the noise of big data: clustering, decision trees, column dependencies, and recommendations.

Models based on these algorithms manifest the same way other analytical assets within Datameer do: as sheets in a workbook. The sheets show all model data, along with predicted values, and the Flipside view will render a graphical representation of the model and its content.

By incorporating machine learning functionality in situ, within the workbook user experience, Datameer provides machine learning capabilities without forcing users to have vastly specialized skills or endure abrupt user interface context switches.

This integration is further extended through the use of an optional Predictive Model Markup Language (PMML) plugin, provided by our partner, Zementis. The plugin allows scoring against machine learning models built in other tools (and published in PMML format) by exposing them within Datameer as additional spreadsheet functions.

A patent-pending execution framework

Datameer simplifies selection of execution frameworks through its patent-pending Smart Execution engine, which picks the best framework for users along each step in the analytics workflow. It takes full advantage of Apache Tez, Apache Spark, and Datameer’s own single-node, in-memory engine, freeing users from having to evaluate the best engine for any given analytics job or task.
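
Smart Execution's selection logic is proprietary; as a toy illustration of the idea of routing each task to the cheapest adequate engine, consider a rule-based chooser. The thresholds and engine labels below are arbitrary assumptions, not Datameer's actual cost model:

```python
def choose_engine(row_count, interactive):
    """Toy stand-in for a cost-based engine selector: small interactive
    jobs run in-memory on a single node, larger batch jobs go to the
    cluster frameworks. All thresholds are illustrative only."""
    if interactive and row_count < 1_000_000:
        return "single-node-in-memory"
    if row_count < 100_000_000:
        return "spark"
    return "tez"

# A sampling-sized interactive task stays local; bulk jobs hit the cluster.
engine = choose_engine(10_000, interactive=True)
```

A real optimizer would weigh many more signals (operator types, data locality, cluster load), but the decoupling is the point: because the workbook never names an engine, new frameworks can be slotted in behind the same selection step.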

Smart Execution provides a future-proof approach to big data analytics. By decoupling the design experience from processing on a particular execution engine, Datameer permits workbooks developed today to be functional against new execution frameworks tomorrow, as they emerge and take their place in the Smart Execution platform.

While open source big data technologies hold the keys to answering new business questions, they weren’t designed with business users in mind. Combining a spreadsheet workbook and formula interface with a cost-based query optimizer that picks the right engine for a particular set of tasks, Datameer turns Hadoop, Spark, and company into user-friendly BI tools for the business at large.

Datameer makes the big data aspiration a reality by harnessing the power of these platforms and working with them in their native capacities, not merely treating them as relational databases. At the same time, Datameer embeds these open source technologies into a business-user-oriented application, premised on familiar spreadsheet constructs, for working with data across its lifecycle and extracting relevant information from it.

New Tech Forum provides a venue to explore and discuss emerging enterprise technology in unprecedented depth and breadth. The selection is subjective, based on our pick of the technologies we believe to be important and of greatest interest to InfoWorld readers. InfoWorld does not accept marketing collateral for publication and reserves the right to edit all contributed content. Send all inquiries to newtechforum@infoworld.com.

Source: InfoWorld Big Data

Faction® Earns Patent For Its Hybrid And Multi-Cloud Solutions

Faction® has announced that the U.S. Patent and Trademark Office (USPTO) has granted Faction a patent for its pioneering work on hybrid and multi-cloud networking.

“We’re proud that Faction’s technology has essentially created the hybrid cloud category more than five years ago, and we look forward to exercising control of this technology now that we have received our initial patent issuance,” said Luke Norris, Faction’s CEO and founder.

Faction’s hybrid and multi-cloud technology, now patented under USPTO Patent #9,571,301, powers the Faction Cloud Bloc and Faction Internetwork eXchange (FIX) product sets. It allows service providers and enterprises to seamlessly connect the best features of various private and public clouds and to design a robust cloud architecture that still operates as a single unified cloud.

Faction’s groundbreaking approach to cloud networking greatly reduces the cost and complexity of composing true hybrid cloud and multi-cloud solutions for customers. The technology is also broadly utilized to allow access from datacenters into private and public clouds, which Faction utilizes to connect customers currently in 22 datacenters in the United States directly into the company’s offering.

With this technology, cloud architects are now freed from rigid networking constructs and burdensome administrative tasks that have perpetually frustrated infrastructure and operations teams and slowed important business initiatives due to network constraints. Faction’s customers benefit from Faction’s composable cloud technology, allowing the combination of datacenter, private, and public cloud resources without sacrificing security or performance, and without incurring substantial migration or interconnection costs typical to traditional solutions.

This initial patent demonstrates the company’s commitment to advancing cutting-edge cloud infrastructure technology and simplifying how enterprise IT cloud infrastructures are designed, built and managed. Specifically, it details how physical resources that may be hosted within a datacenter or colocation site can connect to one or more cloud providers creating a seamless, single pool of resources. Additionally, once an enterprise is connected to the Faction composable cloud fabric, it gains the ability to easily mix other third-party cloud services together creating a true multi-cloud solution.  Further patent applications are in process, and last week a second Notice of Allowance was received on a Faction patent application to further expand the scope of issued patent claims, with others expected to follow.

The Faction technology also provides a unified data fabric for composing true hybrid and multi-cloud solutions. Enterprise IT can leverage public cloud resources on-demand while retaining the control and security of their private cloud infrastructure. The network technology enabling composable clouds complements Faction’s private cloud offering, which fully decouples compute, storage capacity, storage performance, and network capacity, enabling enterprises to compose their ideal private cloud. The Faction Internetwork eXchange makes these composable resources available not only within the private cloud environments, but the data center and colocation environments and the public clouds as well.

The announcement of the USPTO issuing a first patent follows last month’s announcement that Faction raised $11 million in capital, which will be used to expand the company and help meet strong customer demand. In 2016, Faction saw 44% year-over-year growth, and it projects similar growth in 2017.

Source: CloudStrategyMag

Bigleaf Expands Cloud Access Network In Dallas

Bigleaf Networks has announced that it is expanding its Cloud Access Network with a fifth gateway Cluster in Dallas, Texas.

By tunneling all customer traffic through the Bigleaf-owned Cloud Access Network, Bigleaf’s patent-pending SD-WAN platform is uniquely able to prioritize and seamlessly failover all applications, including hosted VoIP, Virtual Desktop, Point of Sale, VPNs, and SaaS. Bigleaf hosts its redundant gateway clusters in major Internet peering hubs across North America for optimal performance. With the addition of this Dallas Gateway Cluster, Bigleaf customers will automatically see higher performance to applications hosted in the southern U.S., and Bigleaf customers in the southern U.S. will gain increased performance for all traffic.

“We’re excited to invest in this expansion, and other upgrades coming later this year,” said Bigleaf founder and CEO Joel Mulkey. “Our unique Cloud Access Network gives customers highly-peered connections directly to all of the major cloud providers, so with our service, customers don’t have to struggle with the expense and complexity of private circuits into those cloud networks to get peak performance.”

Bigleaf’s plug-and-play SD-WAN offering is delivered as an SLA-backed 24/7 managed service. Features include:

  • Intelligent Load-Balancing across multiple Internet paths from different ISPs
  • Dynamic Quality-of-Service (QoS) to prioritize VoIP and other performance-sensitive traffic over commodity Internet
  • Same-IP Failover to ensure uninterrupted VoIP and other real-time sessions while moving between ISP circuits
  • Transparent outside-the-firewall deployment for easy zero-breach installs
  • Real-time intelligence on ISP circuit latency, packet loss, jitter and throughput

Source: CloudStrategyMag

Malaysian Web Hosting Firm Exabytes Acquires HT Internet to Grow Managed Services

Malaysian web hosting provider Exabytes Group has acquired domains and web hosting company HT Internet, Telecompaper reports. The deal will allow Exabytes to branch into e-commerce and offer fully managed website and e-commerce services to its current client base of 75,000 customers and 200,000 websites. Financial details of the acquisition were not disclosed.

Exabytes will take over HT Internet’s Grow, DomainPlus, and other brands, and integrate the company’s team with its own, according to Telecompaper. The companies will share back-end resources, and the HT team will focus on growing the managed services business.

See also: Exabytes Acquires Singapore Cloud Hosting Provider Signetique

Growth in online services in emerging markets like Malaysia, and continued momentum in markets like Singapore, will contribute to Asia Pacific making up a larger share of the colocation data center market than North America by 2020, according to Structure Research.

The Malaysian government has made efforts to boost adoption of new technologies in the country, and Frost & Sullivan research found Malaysia’s IT market growing at a rate of roughly 9.5 percent (CAGR) from 2013-2017. Malaysia is also Southeast Asia’s largest e-commerce market, with $2.3 billion in revenues in 2015.

HT Internet was founded in 2011, and manages over 5,000 domain names. The company also offers website development, and online marketing and advertising services.

Exabytes, which was founded in 2001, is already piloting a fully managed website service in Malaysia, and Telecompaper reports it has plans to expand it to other markets, including Singapore and Indonesia.

Source: TheWHIR

WHIR Networking Event Austin Attracts Best of Local Hosting Industry

WHIR Networking Events made its first stop of 2017 in one of our favorite cities: Austin, TX!

The event had a solid turnout and was a great way for attendees to spend a few hours on a Thursday night, mingling with like-minded industry folks and talking shop in a casual environment. Thanks to our sponsors we were able to provide complimentary food and drinks, and give away some fantastic prizes.

Here are the winners:

  • Lenovo gave away a Yoga Tab 3 Pro to Richard Hernandez of Sokorro
  • Samsung gave away a Galaxy S7 Phone to Kevin Hazard of Hazard Investments
  • OnRamp gave away an Apple Watch to Pooja Goel of DPR Construction
  • Digital Realty gave away a Yeti Hopper Flip 12 cooler to Jay Newman of Hostway
  • Apricity Recruitment gave away a Google Home to Keith Watson of Eaton

If you’re in the Phoenix area, make sure to RSVP to our next event in Scottsdale, AZ, at Bottled Blonde on Mar. 16, 2017. And bring a business card if you want to be entered in the prize draw!

If you are interested in sponsoring our Phoenix event, or any of our other events coming up throughout the year, please get in touch.

Source: TheWHIR

AWS Takes Down Hundreds of Sites in Massive S3 Outage

Availability issues with the US-EAST-1 region of AWS’ S3 storage service caused downtime or slow performance for many websites on Tuesday.

Affected sites include Airbnb, Business Insider, Chef, Docker, Expedia, Heroku, Mailchimp, News Corp, Pantheon, Pinterest, Slack, and Trello, as well as parts of AWS’ own site and, ironically, the outage trackers isitdownrightnow.com and Down Detector, VentureBeat reports.

AWS acknowledged the issues before 7:30 a.m. Pacific, saying it was investigating. Shortly after 10:30 a.m. Pacific, the company updated the statement on its status page.

“We’re continuing to work to remediate the availability issues for Amazon S3 in US-EAST-1. AWS services and customer applications depending on S3 will continue to experience high error rates as we are actively working to remediate the errors in Amazon S3,” AWS service health dashboard said.

An hour later, AWS updated the message: “We continue to experience high error rates with S3 in US-EAST-1, which is impacting various AWS services. We are working hard at repairing S3, believe we understand root cause, and are working on implementing what we believe will remediate the issue.”

AWS suffered a service disruption lasting over five hours in 2015, Google App Engine was down for nearly two hours in August, and problems at Telia Carrier affected many popular sites and services in June of last year.

Source: TheWHIR

Report: Hackers Take Less than 6 Hours on Average to Compromise Targets

Most hackers can compromise a target in less than six hours, according to a survey of hackers and penetration testers released Tuesday by security awareness training firm KnowBe4.

The Black Report was compiled from 70 surveys taken at Black Hat USA and Defcon, and shows that phishing is the preferred method for 40 percent of hackers. A further 43 percent said they sometimes use social engineering, while only 16 percent do not use social engineering at all. Forty percent sometimes use vulnerability scanners, 60 percent use open-source tools, and just over 20 percent use custom tools for hacking.

A majority of those surveyed (53 percent) said they sometimes encounter systems they are unable to crack, while 9 percent say they never do, and 22 percent said they “rarely” encounter such targets. KnowBe4 chief hacking officer Kevin Mitnick performs penetration testing with a separate company (Mitnick Security), with a 100 percent success rate. Mitnick will present the keynote address at the upcoming HostingCon Global 2017 in Los Angeles. [Register now for HostingCon Global and save $100 on your all-access pass]

Once they have gained access to a system, one in three penetration testers said their presence was never detected, and only 2 percent say they are detected more than half of the time. Exfiltrating data after a compromise takes less than 2 hours for 20 percent of respondents, and two to six hours for 29 percent, while 20 percent take longer than 12 hours.

See also: Pentagon Hires Hackers to Target Sensitive Internal Systems

When asked about effective protection against breaches, endpoint protection was named by 36 percent of those surveyed, while 29 percent identified intrusion detection and prevention systems.  Only 2 percent consider anti-virus software an obstruction to hacking networks.

One-quarter of those surveyed said their advice to corporate boards would be to recognize that it is inevitable that they will be hacked, it is only a question of when it will happen. Roughly the same number urged boards to consider the return on investment in security, while 10 percent said boards should realize that detection capability is much more important than deflection capability.

KnowBe4 also commissioned a study from Forrester on the Total Economic Impact of breaches to put numbers to the potential return on investment (ROI) of security spending. The study is available from the KnowBe4 website.

See also: Data Breaches Hit Record in 2016 as DNC, Wendy’s Co. Hacked

Source: TheWHIR

Unlike big data, IoT may live up to the hype

Big data has long promised more than it delivers, at least for most enterprises. While a shift to cloud pledges to help, big data deployments are still more discussed than realized, with Gartner insisting that only 14 percent of enterprises have gotten Hadoop off the ground.

Will the other darling of the chattering class, IoT (internet of things), meet the same fate? In fact, IoT might deliver, according to new data from Talend compiled in conjunction with O’Reilly. Dubbing 2016 “the year IoT ‘grew up,'” the report declares 2017 the year that “IoT starts to become essential to modern business.”

How and where IoT gets real, however, may surprise you.

The new hyped kid on the block

IoT has been proclaimed the $11 trillion savior of the global economy, which has translated into IoT becoming even bigger than big data, at least in terms of general interest. This Google Trends chart shows IoT surpassing big data in search instances around the middle of last year:

[Image: Google Trends chart comparing search interest in “IoT” and “big data.” Credit: Google Trends]

If we get more specific on “big data” and instead use Apache Hadoop, Apache Spark, or MongoDB, all hugely popular big data technologies, the crossover is even more pronounced. IoT has arrived (without its security intact, but why quibble?). Indeed, as the Talend report avers, “[W]hile the buzz around big data is louder, the actual adoption of big data in industry isn’t much larger than the adoption of IoT.”

That’s right: IoT is newer, yet sees nearly as much adoption as big data. In fact, IoT, as the source for incredible amounts of data, could actually be what makes big data real. The question is where.

Betting on boring

The answer to that question, according to the Talend report, which trawled through more than 300TB of live data to glean its insights, is not where the analysts keep insisting:

We found that IoT spending today is for use cases that are much different than those predicted by McKinsey, Gartner, and others. For example, the greatest value/consumer surplus predicted by McKinsey was in factories around predictive maintenance and inventory management, followed by healthcare and smart city–related use cases like public safety and monitoring. While these use cases may be the top producers of surplus in 2025, we do not see much spend on those use cases today. In contrast, home energy and security is low on the McKinsey list, but that’s where the market is today, in addition to defense and retail.

It’s not that the analysts are wrong when they pick out details like industrial automation as incredibly ripe for IoT disruption, so long as we don’t assume “ripe” means “developed to the point of readiness for harvesting or eating.” Given the complexity of introducing significant changes into something like factory automation, such industries most definitely are not “ripe” for IoT. The potential is huge, but so are the pitfalls holding back change.

Home energy and security, by contrast, are relatively straightforward. Or, as the report continues, areas like health care are in desperate need of disruption, but the likes of online patient monitoring “seems 100 times more complex than simple home monitoring or personalized displays for in-store customers.”

Hence, home energy (9 percent) and security (25 percent) accounted for the biggest chunk of IoT deployments in 2016, with defense (14 percent) and retail (11 percent) also significant. Health care? A mere 4 percent.

Given that regulation and complexity are inimical to real-world IoT adoption, it’s perhaps not surprising that unlike big data, which is mostly a big company phenomenon, IoT shows “more continuous adoption … across large and small companies.” As such, IoT deployments are more evenly spread across geographies, rather than following big data’s concentration on the coasts.

In sum, IoT could well end up being a truly democratizing trend, a “bottom-up” approach to innovation.

Source: InfoWorld Big Data

Webair Partners With Data Storage Corporation

Webair has announced a partnership with Data Storage Corporation to enhance its high availability (HA) Disaster Recovery and overall support capabilities for IBM Power Systems (iSeries, AS/400, AIX) environments. The demand for Webair’s Disaster Recovery-as-a-Service (DRaaS) solution has grown exponentially over recent years, and the addition of IBM Power Systems support positions it for even further expansion.

Many companies require both x86 and IBM Power Systems platforms to run mission-critical applications, making disaster recovery critical to these environments. The partnership between Webair and Data Storage Corporation provides x86 and IBM Power Systems users with:

  • Recovery point objectives (RPO) and recovery time objectives (RTO) of one hour, including continuous replication, network automation and orchestration
  • Seventy-two hours of monthly recovery site usage before incurring additional fees
  • Fully managed quarterly recovery site testing with attestation report
  • Per-IP failover, public BGP failover, DNS failover, L2 stretch, and VPN(s)
  • One week of checkpoints
  • Recovery site network architecture customization to enable customer infrastructure integration
  • The ability to replicate data to any Webair DR location, including U.S. East and West Coasts, Canada, Europe, and Asia. 

Through this new partnership, Webair now also offers customers fully managed IBM Power Systems solutions and services backed by a premier IBM Managed Services Provider.

“Webair’s strategic partnership with Data Storage Corporation broadens our Disaster Recovery support capabilities,” explains Michael Christopher Orza, CEO of Webair. “While most providers only offer a limited range of services encompassing specific operating systems and workloads, this partnership delivers a true DRaaS solution that is customized to mirror customers’ specific production environments and supports both state-of-the-art and legacy platforms.”

Webair customers can also take advantage of ancillary services available at its data centers as part of their larger DRaaS solutions, including public and private cloud infrastructure, snapshot-based storage replication, colocation, authoritative DNS, third-party cloud connectivity, Backups-as-a-Service, network connectivity, and Seeding. These services are fully managed and can be tied directly into DRaaS infrastructure via private and secure cross-connects.

“I am excited about the Data Storage Corporation / Webair partnership and the opportunities it will provide to both companies as our services are in high demand across all markets and industries with tremendous growth forecasted,” says Hal Schwartz, president of DSC. “Because of this partnership, the combined teams can now provide first-class management and technical support, allowing them to deliver and fully manage cloud, hybrid cloud and cloud backup solutions with the highest confidence and service levels.”

Source: CloudStrategyMag