DataCore Chooses JULABO For Software-Defined Storage

DataCore Software has announced that SANsymphony™ was chosen by JULABO GmbH as the software-defined storage platform best suited to support its flexible virtual infrastructure. JULABO, a Germany-based worldwide provider of high-quality temperature control solutions, chose SANsymphony to ensure business continuity and high performance for VMware vSphere virtual machines synchronized across German and U.S. data centers. The VMs support Microsoft Dynamics, Microsoft SQL Server, and Matrix42 software and workspace management applications. A JULABO data center in Allentown, Pennsylvania, mirrors the headquarters systems in Germany’s Black Forest using DataCore replication software and Riverbed WAN technology.

“The complete infrastructure project, covering the network, servers, and storage with DataCore SANsymphony, was implemented cost-effectively. The entire project was carried out at the same cost as what we would have paid for storage alone if we had chosen a major storage hardware provider,” said Jürgen Jonescheit, CIO, JULABO GmbH.

JULABO GmbH is one of the foremost providers of high-quality, reliable and high-performance “Made in Germany” temperature control solutions for high-tech industries. With 10 branch offices on three continents and more than 400 employees, the central IT team provides support for Microsoft Dynamics ERP and other business-critical applications.

JULABO was faced with the need to move data between its remote sites and headquarters, while also supporting rapid growth and new business requirements in its local markets. Growing workloads increased demands for high availability and flexibility, prompting the company to implement a comprehensive virtual infrastructure across the entire organization. Today JULABO maintains mirrored computer centers at two different sites, protected by separate fire compartments, as well as a third computer center for off-site storage of backups. All of the sites are connected via a high-performance Fibre Channel network.

Parallel I/O, Fusion-io and High-Speed Caching Drive 5x Performance Increase

DataCore partner and system integrator Leitwerk AG was commissioned to implement VMware vSphere and DataCore SANsymphony at the main site. Today all systems are virtualized, with workloads running across 170 virtual machines on redundantly designed VMware ESX servers on HP ProLiant hardware, each with 768 GB of RAM and 16 cores.

SANsymphony also runs on HP ProLiant servers and virtualizes 50 TB of storage capacity (SAS, SATA, SSD/flash) while providing synchronized, highly available data to applications. DataCore provides automatic failover protection and high availability to ensure continued access to the two HP D2700 disk shelves. In addition to DataCore’s high-speed caching, Fusion-io PCIe flash memory cards were integrated into the DataCore software-defined storage nodes; DataCore auto-tiering manages these tiers to optimize performance and resource utilization cost-effectively.
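
To make the auto-tiering idea concrete, here is a deliberately simplified sketch of a frequency-based tier-placement policy. The tier names and thresholds are hypothetical illustrations, not DataCore’s actual algorithm:

```python
# Illustrative only: place data on a tier according to how hot it is.
# Thresholds and tier names are made-up values, not DataCore's internals.
TIERS = ["pcie_flash", "ssd", "sas", "sata"]  # fastest to slowest

def choose_tier(accesses_per_hour: int) -> str:
    """Map a block's access rate to a storage tier (hypothetical thresholds)."""
    if accesses_per_hour > 1000:
        return "pcie_flash"   # hottest data lands on PCIe flash
    if accesses_per_hour > 100:
        return "ssd"
    if accesses_per_hour > 10:
        return "sas"
    return "sata"             # cold data settles on the cheapest spindles

print(choose_tier(5000))  # -> pcie_flash
print(choose_tier(3))     # -> sata
```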

DataCore’s Parallel I/O technology leverages multiple cores to deliver the I/O performance and response times demanding applications need, by processing I/O requests in parallel. JULABO estimates that, in comparison with its previous hardware-bound architecture, DataCore has delivered a five-fold increase in performance. Up to 40,000 IOPS with 64K blocks were achieved in tests under realistic, real-world conditions.
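
As a quick back-of-the-envelope check on those figures (assuming sustained rates; real throughput also depends on queue depth, read/write mix, and latency):

```python
# 40,000 IOPS at 64 KiB per operation, expressed as sustained throughput.
iops = 40_000
block_size = 64 * 1024                    # 64K blocks, in bytes
throughput = iops * block_size            # bytes per second
print(f"{throughput / 2**30:.2f} GiB/s")  # -> 2.44 GiB/s sustained
```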

“Hardware solutions were not able to achieve the DataCore performance results and value, even with twice the number of spindles,” Jonescheit added.

“We chose DataCore because of its excellent price/performance ratio and its outstanding technical design. Good examples are the automatic failover for high availability and auto-tiering for performance. We have put this functionality into practice successfully. However, the flexibility resulting from hardware independence does not only have technical and functional advantages. It also allows us to flexibly extend our infrastructure cost-effectively to maximize our economic efficiency, as we are now vendor-independent,” Jonescheit concluded.

Benefits at a glance:

• DataCore reduces storage-related costs by 50%.

• Performance has improved 5-fold.

• Storage-related failures were reduced by 90%.

• Time spent on routine administrative tasks was reduced by 90%.

• Planned downtime (data migrations, upgrades, updates) was reduced by over 95%.

• DataCore managed to convert 50% of unused storage space into available free storage capacity.

Source: CloudStrategyMag

Cloud Business Communications Growing By 23% Per Year

New Q1 data from Synergy Research Group shows that the cloud business communications market is growing at an annualized rate of 23%, as strong UCaaS adoption continues to drive the market. Retail UCaaS services account for well over 70% of the total market, though the smaller wholesale UCaaS segment is growing considerably faster. The balance of the market comprises cloud communications (virtualized call control and UC applications), which is growing at a similar rate to retail UCaaS. The total cloud communications market is now generating revenues in excess of $2 billion per year. RingCentral is the market leader based on quarterly revenues and has been consistently growing its market share, thanks to its leadership in retail UCaaS and a strong position in wholesale UCaaS. It is followed by 8×8, Mitel, and ShoreTel.

Retail and wholesale UCaaS business suite services now generate revenue of well over $1.7 billion on an annualized basis. These include private, public and hybrid cloud service offerings. Public UCaaS is the largest of the three sub-segments and is also growing the most rapidly. Overall the cloud business communications market continues to be heavily concentrated in the United States, which accounted for 80% of total revenues in the first quarter of the year.

“UCaaS continues to be a force for change within the business communications market,” said Jeremy Duke, Synergy Research Group’s founder and chief analyst. “Major barriers to cloud adoption are now almost a thing of the past and consequently we are seeing continued erosion of on-premises PBX-based systems. Clearly these trends will continue over the next few years.”

Source: CloudStrategyMag

IBM Launches New Service For Hybrid Cloud Networks

IBM has launched a new offering designed to help companies more efficiently manage applications and services across a hybrid cloud environment by gaining real-time visual insights on the performance of supporting infrastructure.

Agile Service Manager for IBM Netcool Operations Insight will help companies in industries that are rapidly moving applications to the cloud — such as telecommunications and financial services — better monitor changes in their underlying systems so they can avoid customer service disruptions. It is one of the few offerings in the IT industry to provide a detailed topology of infrastructure across a broad set of domains, including different public clouds and on-premises systems, with both real-time updates and historical views.

By having this view of virtual and physical resources, companies can now better manage applications and services deployed across highly distributed and increasingly complex hybrid cloud networks. Agile Service Manager also can help companies prevent and more quickly resolve problems by providing context about the resources that underpin a hybrid cloud network. This is especially important for companies where applications put significant demands on network resources while requiring continuous uptime to meet customer expectations.

For example, a communications service provider could use IBM Netcool Operations Insight with Agile Service Manager to support launching and scaling a new, cloud-based video conferencing service. As the company expands use of the conferencing service, Agile Service Manager could provide real-time insights about the infrastructure supporting the new service, such as response time of specific servers and capacity of storage, letting the company know if these assets are meeting customer needs.
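
To illustrate the kind of view described above, here is a minimal sketch of a topology whose nodes keep timestamped state, supporting both the real-time and the historical ("state at time T") queries the article mentions. All names are hypothetical, and this is not IBM’s actual data model:

```python
# Illustrative only: a tiny topology with timestamped resource state.
class Resource:
    def __init__(self, name):
        self.name = name
        self.history = []  # (timestamp, state) pairs, appended in time order

    def record(self, ts, state):
        self.history.append((ts, state))

    def state_at(self, ts):
        """Most recent state at or before ts (the historical view)."""
        state = None
        for t, s in self.history:
            if t > ts:
                break
            state = s
        return state

# Edges capture "depends on" relationships across domains (cloud and on-prem).
topology = {"video-svc": ["vm-17", "blob-store"], "vm-17": ["host-3"]}

vm = Resource("vm-17")
vm.record(100, "healthy")
vm.record(250, "degraded")
print(vm.state_at(200))   # -> healthy   (state within a past time window)
print(vm.state_at(9999))  # -> degraded  (current, real-time view)
```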

“In today’s fast-changing marketplace, companies are under increasing pressure to roll out new and enhanced applications for customers,” said Denis Kennelly, general manager of IBM Hybrid Cloud. “Agile Service Manager gives valuable new insights about what is going on in the network and how it is impacting service quality and customers in real time.”

Companies can use Agile Service Manager to present a configurable topology view that shows the relationships and states of resources, both in real time and within a defined time window. Agile Service Manager extends the capabilities of IBM Netcool Operations Insight, analytics-driven software that helps organizations gain actionable insights from the massive amounts of operational data generated by their hybrid cloud environments.

Source: CloudStrategyMag

Equinix Partners With Alibaba Cloud

Equinix, Inc. has announced a collaboration with Alibaba Cloud to provide enterprises with direct, scalable access to Alibaba Cloud via the Equinix Cloud Exchange™ in the company’s Hong Kong, Silicon Valley, Sydney, and Washington, D.C., International Business Exchange™ (IBX®) data centers, with the addition of Frankfurt and London available soon.

The cloud computing market continues to grow rapidly in Asia, and in China specifically. According to the U.S. International Trade Administration, the Chinese cloud market is expected to grow at 40% per year through 2020, reaching $20 billion. Access to Alibaba Cloud is critical for multinational customers looking to expand their cloud-based applications into this growing region in a secure and high-performing manner. By providing multinational enterprise customers with secure and direct access to Alibaba Cloud, Equinix delivers enterprises connectivity to the full suite of Alibaba Cloud services. Furthermore, Alibaba Cloud Express Connect enables companies to leverage the extensive Alibaba Cloud network in mainland China.

“As one of the world’s leading and fastest growing cloud computing companies, Alibaba Cloud represents a significant partnership for Equinix as we continue to empower businesses around the globe to build secure, private clouds, without compromising network and application performance. We are thrilled to offer direct access in markets across the U.S., Asia and Europe, and look forward to bringing additional markets online in the near future,” said Greg Adgate, vice president of global technology partners and alliances, Equinix.

With the addition of direct access to Alibaba Cloud on Cloud Exchange in these new markets across Asia Pacific, EMEA and the Americas, Equinix now offers private access to Alibaba Cloud in five markets. The company previously offered access in its Singapore IBX. Alibaba Cloud is also a colocation customer in Dubai with Emirates Integrated Telecommunications Company PJSC (“du”).

“The global reach of Equinix Cloud Exchange makes it simple for Alibaba Cloud to access new markets. We are pleased to provide greater value and bring our services closer to enterprises by leveraging Equinix’s powerful, on-demand cloud connectivity, and in particular to provide greater connectivity to the Chinese market,” said Yeming Wang, deputy general manager of Alibaba Cloud Global.


Source: CloudStrategyMag

Talend Unveils Multi-Cloud Big Data Integration Solution

Talend has unveiled a new version of its Talend Data Fabric platform that is optimized to manage cloud and multi-cloud enterprise IT environments. Talend Summer ’17 helps seamlessly manage information across Amazon Web Services (AWS), Cloudera Altus, Google Cloud Platform, Microsoft Azure, and Snowflake platforms, enabling customers to rapidly integrate, cleanse, and analyze data to fuel innovation and gain a competitive edge.

As businesses continue to evolve their cloud strategies, many find that it’s imperative to use services from several different cloud providers to deliver value across their business units. In fact, according to IDC, more than 50% of IT organizations already utilize a multi-cloud approach and another 20% have plans to implement [a multi-cloud strategy] within 12 months.1

“Companies are adopting cloud platforms at an unprecedented pace and as they do so, they are selecting different platforms to address varying business needs,” said Ashley Stirrup, CMO, Talend. “In this environment, CIOs must design their IT infrastructure with agility to deliver in a hybrid, multi-cloud world. Using Talend Data Fabric, companies can develop data pipelines on any of the leading cloud platforms with peace of mind in knowing whatever they develop will be able to run on the latest cloud and open source innovations.”

The Summer ’17 release of Talend Data Fabric allows customers to access a rich and growing library of Talend native cloud components, using intuitive drag-and-drop visual tools to build big data workflows that run in nearly any cloud, almost anywhere. Adding to its already robust support for AWS, the new version of Talend Data Fabric includes new capabilities for Cloudera Altus, Google Cloud Platform, Microsoft Azure, and Snowflake that allow customers to:

  • Speed Multi-Cloud Pipeline Development: Talend Summer ’17 delivers a range of new connectors for Azure, Cloudera, and Google Cloud that span big data, cloud data warehousing, NoSQL, and messaging platforms. Additionally, the newest version of Talend Data Fabric includes the industry’s first support for Cloudera Altus, a platform-as-a-service offering that simplifies customers’ construction and deployment of intelligent data pipelines on Cloudera — minimizing the need for operational support.
  • Accelerate Migration to the Cloud: Talend Summer ’17 helps rapidly migrate on-premises data to the cloud, so customers can simply and intuitively build cloud data warehouses, power richer analytics, and speed time-to-insight. With this release, Talend delivers the fastest bulk loader connector for Snowflake, which accelerates data migration by up to 20X; a sketch of the underlying bulk-load pattern follows this list. Using data quality and visual extract, transform, and load tools, Talend reduces the time needed to migrate on-premises and cloud data to Snowflake.
  • Build Once, Use Everywhere: Talend Summer ’17 provides the flexibility and portability to take development work designed for one cloud platform and reuse it with other cloud platforms. This is designed to help customers rapidly adopt new innovations in cloud services and future-proof current development work. Consequently, businesses can rapidly combine and analyze data in new ways to innovate faster, while reducing their maintenance and development costs.
  • Deliver Data Quality with Machine Learning at Big Data Scale: As data stores grow, customers must find new ways to clean and combine data at scale with even less human intervention. Talend Summer ’17 utilizes Apache Spark-powered machine learning algorithms to automate and accelerate data matching and cleansing, improving scale, performance, and accuracy. Over time, these algorithms monitor decisions made by data curators to become more intelligent and accurate. These new algorithms and machine learning capabilities are designed to work seamlessly with Talend’s intuitive, role-based Data Stewardship app, to enable continuous improvement within each data quality model.
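
As a rough illustration of the stage-then-copy pattern that makes such bulk loads fast (staging files, then issuing one set-based COPY instead of row-by-row inserts), here is a minimal sketch using the Snowflake Python connector. The credentials, table, and file are hypothetical, and this is not Talend’s connector:

```python
import snowflake.connector

# Hypothetical connection details; replace with real account credentials.
conn = snowflake.connector.connect(
    user="LOADER", password="...", account="my_account",
    warehouse="LOAD_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# 1. Stage the local file; the connector compresses and uploads it.
cur.execute("PUT file:///tmp/orders.csv @%ORDERS AUTO_COMPRESS=TRUE")

# 2. One set-based COPY instead of millions of single-row INSERTs.
cur.execute("""
    COPY INTO ORDERS
    FROM @%ORDERS
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
conn.close()
```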

Enterprises worldwide, across all industries, use Talend together with its partners’ products to create truly data-driven solutions.

“Talend and Snowflake have a solid partnership that continues to benefit our joint customers,” said Walter Aldana, Snowflake’s vice president of Alliances. “With Talend’s newest Snowflake connector, enterprises will see even more efficiencies when moving data into Snowflake. Talend’s architecture capitalizes on Snowflake’s parallel loading capabilities, which enables our joint customers to easily load a diverse set of data types so they can jumpstart their cloud data warehousing projects much faster.”

1 “Multi-cloud Infrastructure as a Service as a Public Cloud Adoption Pattern,” IDC, August 2016, Doc # WC20160825.

Source: CloudStrategyMag

Altair And Daffron Partner

Altair has signed an agreement with Daffron & Associates to expand the reach of Envision, its cloud-based business intelligence solution, in the utilities market. Envision will be integrated with Daffron’s utility software solutions to power their data analytics and visualization capabilities, enabling utility companies to gain insight from their data.

“Daffron & Associates is very excited to collaborate with Altair. We appreciate Altair’s commitment to the utility industry, their deep experience in this field, and their willingness to help achieve the goals of Daffron and its customers,” said Paul Kluba, chief operating officer, Daffron. “Data analytics is integral to Daffron’s customers, who interact with large amounts of data and need to evaluate data quickly and derive strategic insight to gain a holistic view of their organization.”

Daffron provides tightly integrated utility software solutions, including CIS/billing, customer self-service (web and mobile), financial management, materials and workflow management, prepaid metering, meter data management, and field solutions. In addition to software, Daffron provides application and data hosting as well as information technology services.

Envision provides secure, free-form data visualization in real time. The cloud-based solution is scalable throughout the enterprise in real time thanks to its innovative licensing system. Integrated with Daffron, Envision will enable utilities to easily perform customer billing analysis, meter data aggregation, rate schedule simulation and optimization, asset and outage management, and more.

“It is my pleasure to welcome Daffron to our collaborative ecosystem,” said Joe Sorovetz, senior vice president for enterprise business, Altair. “Daffron is known for their strength in process-driven user interfaces and highly integrated functional systems, and this association adds strong support for our business, offering our Utility community broader data analytics capabilities that will help reduce manual processes and overall costs.”

Source: CloudStrategyMag

Q&A: Hortonworks and IBM double down on Hadoop

Hortonworks and IBM recently announced an expanded partnership. The deal pairs IBM’s Data Science Experience (DSX) analytics toolkit and the Hortonworks Data Platform (HDP), with the goal of extending machine learning and data science tools to developers across the Hadoop ecosystem. IBM’s Big SQL, a SQL engine for Hadoop, will be leveraged as well.

InfoWorld Editor at Large Paul Krill recently met with Hortonworks CEO Rob Bearden and IBM Analytics general manager Rob Thomas at the DataWorks Summit conference in Silicon Valley, to talk about the state of big data analytics, machine learning, and Hadoop’s standing among the expanding array of technologies available for large-scale data processing.

InfoWorld: What does IBM Data Science Experience bring to the Hadoop Data Platform?

Thomas: We launched Data Science Experience last year and the idea was we saw a change coming in the data science market. Traditionally, organizations were either SPSS users or SAS users but the whole market was moving toward open languages. We built Data Science Experience on Jupyter. It’s focused on Python data scientists, R, Spark, Scala programmers. You can use whatever language you want.

And you can use whatever framework you want for the machine learning underneath. You can use TensorFlow or Caffe or Theano … It’s really an open platform for data science. We focus on the collaboration, how you get data scientists working as a team as part of doing that. Think about Hadoop. Hadoop has had an enormous run in the last five to six years in enterprises. There is a lot of data in Hadoop now. There is not super value for the client by just having data there. Sometimes, there is some cost savings. Where there is super value for the client is they actually start to change how they’re interacting with that data, how they’re building models, discovering what’s happening in there.
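
For flavor, here is a minimal, notebook-style sketch of the kind of open-language workflow Thomas describes: a PySpark model trained directly against data already sitting in Hadoop. The path and column names are hypothetical:

```python
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("dsx-style-sketch").getOrCreate()

# Data already landed in Hadoop (hypothetical path and schema).
df = spark.read.csv("hdfs:///warehouse/churn.csv", header=True, inferSchema=True)

# Build features and fit the model where the data lives, instead of moving it.
assembler = VectorAssembler(inputCols=["tenure", "usage"], outputCol="features")
model = LogisticRegression(labelCol="churned").fit(assembler.transform(df))
print(model.coefficients)
```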

InfoWorld: IBM has a well-known experience with machine learning with Watson. Hortonworks has positioned Apache Spark and Hadoop as its entrance into the machine learning space. Can you discuss the company’s future plans for machine learning, AI, and data science?

Bearden:  It’s going to be through the DSX framework and the IBM platforms that come through that. Hadoop and HDP will continue to be the platform. We’ll leverage some of the other processing platforms collectively like Spark and there’s a tremendous amount of work that IBM’s done to advance Spark. We’ll continue to embody that inside of HDP through YARN but then on top of all of these large data sets, we’ll leverage DSX and the rest of the IBM tool suite. We expressed that DSX and the rest of the tool suite from IBM for machine learning, deep learning, and AI will be our strategic platforms going forward and we’re going to co-invest very deeply to make sure all the integration is done properly. That goes back to being able to bring all resources into a focused distribution so that we can not only innovate horizontally but integrate vertically.

InfoWorld: InfoWorld ran a story late last year claiming that Hadoop had peaked and that other big data infrastructure, including Spark, MongoDB, Cassandra, and Kafka, was marching past it. InfoWorld asked Hortonworks CTO Scott Gnau a similar question last year. What can you say about the continued vitality of Hadoop?

Bearden: We’re a public company and we’re continuing to grow at 24 to 30 percent a year. The way we get paid is by bringing data under management. That’s one vector and it’s just a quantitative data point. I think what you have to then revert backwards to is, is the volume of data growing in the enterprise? According to just about any CIO you’ll speak with or any of the traditional industry analysts, and I think Rob will back this up, about every 18 months the volume of data doubles across the enterprise. About 70 to 80 percent of that data is not going to go into the traditional data platforms, the traditional SQL transactional EDW, etc., and they’re looking for that new area to come to rest, if you will. Hadoop is the right platform and architecture for that to happen. That’s why this partnership is so important. We’re great at landing that data, bringing it under management, securing it, providing the governance, etc., and being able to drive mission-critical marks on some pretty good economics. But what the enterprise really wants is the ability to gain insight from it, to get access to it, to have visibility, to be able to act on a decision and create an action that drives value for an application.

Thomas: Maybe the hype peaked but the hype always peaks when the hard work starts. I think Hadoop is still in its early days. We’ll look back at some point and it will be like sitting here in 1992 saying relational warehouses have peaked. It was just the start. We’re in the same place but the hard work has begun, which is—all right, now we’ve got the data there, how do I actually integrate this across my whole data landscape, which is why Scott talked a lot about Big SQL and what we’re doing there. That’s a really hard problem and if people don’t solve that then there’s probably a natural limitation to how much they could do with Hadoop. But together we solve that problem to the point of the whole discussion on data science, data governance. When you bring those things to Hadoop and you do it at scale, it again changes the opportunity for how fast and how widely Hadoop can be deployed.

InfoWorld: What’s going to happen with the evolution of YARN? What’s next on the roadmap for it?

Bearden: The notion of containers and having the ability to then take a container-based approach to applications and being able to do that as an extension through YARN is actually part of the roadmap today. We published that and we think that opens up new use cases and applications that can leverage Hadoop.

You go back to the ability to get to existing applications, whether it be fraud detection, money laundering, two of the typical ones that you look at in financial services. Rapid diagnostics in the healthcare world, being able to get to better processing for genomics… analyzing the genome for certain kinds of diseases and being able to take those existing algorithms or applications and moving them over to the data via a container approach. You can do that much cleaner with YARN.

InfoWorld: Is there anything else you want to mention?

Thomas: I’d mention just one more point around data governance. We started working with Hortonworks over the last, oh, 18 months around a project called Atlas. I’d say it’s just coming into form as we’ve both been working with a lot of clients and we view it as a key part of our joint strategy around how we’re going to approach data governance. You use data governance for compliance. You use data governance for insights. There’s a big compliance mandate with things like GDPR (General Data Protection Regulation) that’s happening right now in Europe. I think you’ll see more and more on this topic in the future from us.

Source: InfoWorld Big Data

Report: Hybrid Cloud Becomes A Strategic Imperative

Cloud computing is no longer a decision made solely for tactical reasons like cost savings or ease of implementation. Key strategic business demands — the need for greater business agility, data capabilities, and better customer and user experiences — are compelling companies to embrace cloud systems, according to the Insight-sponsored report by Harvard Business Review Analytic Services.

Respondents are nearly split when it comes to hybrid and private cloud adoption, while total public cloud usage captures only a fraction of the market:

  • Hybrid approach with systems hosted in both the public and private clouds (42%)
  • Host most of their systems in a private cloud (40%)
  • Host most in the public cloud (13%)

The types of systems most likely to reside in the cloud include:

  • Email and communication tools (54%)
  • Billing and invoicing (29%)
  • Business intelligence (29%)
  • Payroll (26%)
  • Customer service (24%)
  • Project management (24%)

Cloud Improves Company Performance

“A company’s IT environment should work for them by enabling them to both run and innovate. Large and small to mid-sized companies need to focus on managing and modernizing their IT infrastructure, so that it becomes a transformative part of their business that can directly improve results,” said David Lewerke, director, Hybrid Cloud Consulting Practice at Insight. “While we knew there were a number of benefits, we wanted to better understand from respondents exactly how cloud systems were impacting their business outcomes.”

The benefits of cloud adoption are even more pronounced among small to mid-sized companies:

  • Time to market: large companies 15%, small to mid-sized companies 47%
  • Business/revenue/profit growth: large companies 17%, small to mid-sized companies 38%
  • End customer experience: large companies 32%, small to mid-sized companies 48%
  • Ability to manage security: large companies 26%, small to mid-sized companies 39%
  • Ability to mitigate risk: large companies 28%, small to mid-sized companies 39%

For all respondents, nearly half (49%) say cloud or hybrid cloud systems have significantly improved collaboration, followed by business agility and flexibility (45%), their ability to manage, analyze, act on and share data (43%) and their ability to empower employees and create a better user experience (42%).

Methodology

A total of 347 respondents were drawn from the Harvard Business Review audience of readers (magazine/e-newsletter readers, customers, HBR.org users). Twenty-nine percent of respondents were executive management or board members, 28% were senior management, 31% were middle management, and 12% came from other grades.

Source: CloudStrategyMag

MapR Technologies Releases MapR-XD

MapR Technologies, Inc. has announced MapR-XD, a cloud-scale data store for managing files and containers. As part of the MapR Converged Data Platform, MapR-XD uniquely supports any data type from the edge to the data center and multiple cloud environments, with automatic policy-driven tiering across hot, warm, and cold data. MapR-XD enables customers to create vast, global data fabrics that are inherently ready for analytical and operational applications, making it easier to operationalize data.

“MapR-XD will deliver a single view of data activity, which is critical for immediately identifying potential fraudulent payments,” said BD Goel, chief product strategy and innovation officer, Paysafe. “The ability to unify, manage and act on data very quickly — whether it originated from the cloud, on-premises or at the edge — is a compelling value proposition for us. Reliably delivering a millisecond advantage in data analysis is how MapR helps us stay several steps ahead of cybercriminals.”

“As applications become more intelligent and take advantage of more diverse data in real time, for both analytical and operational uses, there arises the need for new approaches to data processing,” said Matt Aslett, research director, data platforms and analytics, 451 Research. “MapR-XD is designed to eliminate data silos and support new use cases as they emerge that require data processing from the edge, data center and to the cloud.”

Storage and data management are in the midst of a generational re-platforming for the data age. Key shifts underway include the increasingly appealing economics of flash and NVMe technologies; the adoption of clouds and of IoT at the edge; evolving use cases and workloads; new demands imposed by deep learning technologies; and radical changes in the scale and types of data.

“MapR-XD is the result of years of technical innovation and collaboration with our customers to develop a mission-critical modern data platform,” said Anil Gadre, chief product officer, MapR Technologies. “By providing a robust solution to manage data movement across multiple locations with security, high performance and multi-tenancy, we believe MapR is a strategic solution for enterprises embarking on crafting and implementing a next gen data strategy. Our unique Converged Data Platform enables creating data fabric with a global view of data and metadata, supporting a wide diversity of data types for both analytics and operations.”

“Cisco UCS provides the ideal platform for data intensive workloads, while MapR-XD Cloud-Scale Data Store creates the data fabric for managing files, and containers for these workloads and legacy applications,” said Raghunath Nambiar, CTO, Cisco UCS. “This represents a positive direction for the industry in the continuing evolution of Software Defined Storage.”

The new MapR-XD Cloud-scale Data Store includes:

  • Files, Container Support: MapR-XD eliminates data silos and simplifies management across files and containers. MapR-XD provides unified security, data protection, and high availability across diverse data types. The same underlying data can be accessed through a wide range of industry-standard APIs, including NFS, POSIX, and HDFS, to simplify development and administration and to eliminate data sprawl (a sketch of this multi-API access follows this list).
  • Global Exabyte Scale: MapR-XD easily scales to support trillions of files and exabytes of data on thousands of commodity servers or cloud instances, all accessible through a single global namespace. Additionally, it reduces operational complexity and provides a single, scalable view of resources, simplifying access for users, applications, and containers.
  • Cloud-grade Reliability: MapR-XD delivers high availability, data protection and disaster recovery with no single points of failure, fully distributed metadata, point-in-time snapshots and high-performance, distributed mirroring.
  • Speed at Scale with Flash: MapR-XD utilizes the full power of network interconnects and takes advantage of the available performance of underlying heterogeneous hardware, such as disk and flash to meet the demands of GPU-based architectures. Automated capabilities, such as logical partitioning, parallel processing for disparate workloads, and bottleneck avoidance with I/O shaping and optimizations, ensure maximum performance across a cluster. MapR-XD includes an extremely high-performance POSIX Client that provides up to 10x the performance of a typical NFS gateway.
  • Stateful Persistence for Containerized Applications: MapR-XD includes a secure, optimized container client that gives containers access to persisted data. The client supports both legacy and new containerized, event-based microservices applications; multiple data types, including files, containers, databases, and event streams; multiple schedulers such as Kubernetes, Mesos, and Docker Swarm; and any infrastructure, whether on-premises, multiple clouds, or the edge.
  • Flexibility to Leverage Multiple Infrastructures: MapR-XD supports edge, on-premises and cloud environments with the same platform. It enables multi-temperature capabilities across flash, disk and cloud tiers with support for containers and automated data movement to address performance, cost and compliance concerns.
  • IoT Edge Made Easy: MapR-XD for the edge provides the ability to deploy processing and storage capabilities close to an IoT data source, such as in a car, medical device or jet engine. MapR-XD can store and process machine or sensor-generated data for seamless integration with a centralized Converged Data Platform where global aggregation and analysis would be performed.
  • Extensible Architecture: MapR-XD is a powerful component of the Converged Data Platform enabling customers to easily and seamlessly leverage additional capabilities including database, stream processing and integrated analytics on the same platform.
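
As a minimal sketch of the multi-API access referenced above, the snippet below writes through a POSIX mount and reads the same file back through the HDFS-compatible interface. The mount point, volume path, and file are hypothetical:

```python
# Illustrative only: the same MapR-XD file, reached via POSIX and via HDFS APIs.
# /mapr/cluster1 is a hypothetical mount point for the cluster's file system.
posix_path = "/mapr/cluster1/apps/events/2017-06-01.log"

# 1. Write through the POSIX interface: ordinary file I/O on the mounted volume.
with open(posix_path, "w") as f:
    f.write("sensor-42,2017-06-01T12:00:00Z,ok\n")

# 2. Read the same data back through the HDFS-compatible API with PySpark.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("maprxd-sketch").getOrCreate()
df = spark.read.csv("maprfs:///apps/events/2017-06-01.log")
df.show()
```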

Source: CloudStrategyMag

NetApp Expands Collaboration With Microsoft On Hybrid Cloud Data Services

NetApp has announced plans to expand its strategic alliance with Microsoft to help enterprises accelerate digital transformation for hybrid cloud.

Customers are evolving their data centers to integrate hybrid cloud delivery models for greater agility. As they build out these more flexible architectures, they want to retain the benefits of proven on-premises methods that ensure data efficiencies, protection, and insight. These integrations, however, can create new compatibility issues, particularly with existing applications that depend on file services.

NetApp intends to expand its collaboration with Microsoft to include hybrid cloud data services, built on NetApp’s proven ONTAP® software, that will deliver enterprise-grade data visibility and insights, data access and control, and data protection and security for customers moving to Microsoft Azure.

Collaboration areas include:

  • Developing new cloud data services, based on NetApp ONTAP innovation that will be offered on the Azure cloud.
  • Engineering collaboration to deliver a solution architecture that will speed the migration of enterprise applications to Azure and Azure Stack so that customers can unlock greater value from their data.
  • Integration of NetApp’s newly launched FabricPool functionality with Azure Blob Storage, which reduces the cost of cold data by automatically tiering it from on-premises systems to the cloud.
  • Enablement of Azure as a backup destination for NetApp’s Cloud Control SaaS offering, which provides backup, archive and compliance services to enhance Microsoft Office 365 environments.

“Enterprises depend on Microsoft’s cloud innovation and its broad portfolio of Azure cloud-integrated services to keep them competitive in the fast-moving digital age,” said Anthony Lye, senior vice president, Cloud Business Unit, NetApp. “This new development in our strategic alliance will extend the reach of NetApp’s world-class data services for Azure cloud and support customers as they modernize their businesses and pursue new opportunities for growth.”

Scott Guthrie, executive vice president, Cloud and Enterprise Group, Microsoft Corp., added, “NetApp is a strategic hybrid cloud data services partner for Microsoft Azure and a company whose data solutions are used every day by enterprise customers around the world. Working together we will deliver new solutions that give customers using NetApp and Microsoft Azure even more freedom to build and deploy applications, however they want.”

Both companies share a deep understanding of the needs of global enterprises and offer numerous solutions that help customers maximize the power of their data to achieve competitive advantage. Their collaboration has helped to protect and increase the availability of Microsoft application data assets, and provides a flexible infrastructure to support virtualization and private cloud deployments.

NetApp provides a number of hybrid cloud data services today that support Azure:

  • NetApp ONTAP Cloud for Azure, a virtual appliance that combines data control with enterprise-class storage features such as workload portability, dedupe, compression and backup. It works in combination with NetApp OnCommand® Cloud Manager to provide a simple, point-and-click environment to manage storage and ease control of data in Azure.
  • NetApp AltaVault™ hybrid cloud appliances for Azure reduce time, cost, and risk with an efficient and secure approach to backing up cloud-based workloads. Using customers’ existing backup software, AltaVault deduplicates, encrypts, and rapidly migrates data to Azure Storage.
  • NetApp Private Storage for Microsoft Azure provides a cloud-connected storage architecture that allows enterprises to build an agile cloud infrastructure that combines the scalability and flexibility of Azure with the control and performance of NetApp on-premises storage hosted in a co-location facility. NetApp Private Storage for Azure solution architecture can be used with data controlled by the US International Traffic in Arms Regulations (ITAR) program.

The companies intend to provide additional updates on their hybrid cloud initiatives in the fall at Microsoft Ignite.

Source: CloudStrategyMag