Survey: Lack Of Preparedness By IT Execs Prevalent

SolarWinds MSP has published survey findings outlining the preparedness of UK and U.S. businesses in dealing with cybersecurity breaches. The report reveals that businesses are dangerously optimistic about their ability to deter and cope with malicious attacks, despite the majority experiencing a breach over the last year and nearly one-fourth experiencing more than 10.

The potent combination of this lack of preparedness, the frequency of breaches, and the potential commercial impact of each one ($76k/£59k for small to medium-sized businesses [SMBs] and $939k/£724k for enterprises)¹ heightens the risk of an “extinction event,” i.e., a massive business failure stemming from a breach.

Commenting on the survey, John Pagliuca, SolarWinds MSP general manager, said, “Our findings underscore the problems that contributed to the ‘WannaCry’ ransomware’s ability to cause so much damage around the globe. These results beg the question, ‘How can IT leaders feel so prepared yet still be exposed?’ One of the main reasons is that people are confusing IT security with cybersecurity. The former is what companies are talking about when they think about readiness. However, what they often don’t realize is that cybersecurity protection requires a multi-pronged, layered approach to security that involves prevention, protection, detection, remediation, and the ability to restore data and systems quickly and efficiently. The overconfidence and failure to deploy adequate cybersecurity technologies and techniques at each layer of a company’s cybersecurity strategy could be fatal.”

The research, conducted by Sapio Research among 400 SMBs and enterprises in the UK and U.S., reveals that 87% of the IT executives questioned are confident in the resilience of their security technology and processes, and that 59% believe they are less vulnerable than they were 12 months ago. With a further 61% of businesses anticipating a substantial boost to their cybersecurity budgets, respondents are confident this position will improve.

However, 71% of the same respondents said they have experienced a breach in the last 12 months.

These breaches are significant and shouldn’t be discounted. Of the businesses that have been breached and could identify an immediately traceable impact, 77% revealed that they had suffered a tangible loss, such as monetary impact, operational downtime, legal actions, or the loss of a customer or partner.

SolarWinds MSP also investigated why this overconfidence is occurring and identified seven basic faults:

  • Inconsistency in enforcing security policies
  • Negligence in the approach to user security awareness training
  • Shortsightedness in the application of cybersecurity technologies
  • Complacency around vulnerability reporting
  • Inflexibility in adapting processes and approach after a breach
  • Stagnation in the application of key prevention techniques
  • Lethargy around detection and response

The full report from SolarWinds MSP, entitled “2017 Survey Results: Cybersecurity: Can Overconfidence Lead to an Extinction Event? A SolarWinds MSP Report on Cybersecurity Readiness for U.K. and U.S. Businesses” is available here for download.

About SolarWinds MSP
SolarWinds MSP empowers MSPs of every size and scale worldwide to create highly efficient and profitable businesses that drive a measurable competitive advantage. Integrated solutions including automation, security, and network and service management — both on-premises and in the cloud, backed by actionable data insights, help MSPs get the job done easier and faster. SolarWinds MSP helps MSPs focus on what matters most — meeting their SLAs and creating a profitable business.

Methodology and Sample
In early 2017, SolarWinds MSP investigated the cybersecurity preparedness, experiences and failings of 400 SMBs and enterprises, split equally across the U.S. and the U.K. SMBs were categorized as having fewer than 250 employees.


1. The cost-per-stolen-record data was taken from IBM/Ponemon’s “2016 Cost of Data Breach Study: Global Analysis.”

Source: CloudStrategyMag

Webair Announces Microsoft Azure ExpressRoute Partnership

Webair has announced that it is a Microsoft Azure ExpressRoute Partner. Azure ExpressRoute allows Webair customers to easily and securely utilize Microsoft cloud services, including Azure, Office 365, and Dynamics 365, with increased reliability and performance.

ExpressRoute is a private, dedicated network connection between the Microsoft Cloud and Webair’s customers’ IT environments. The decision to become a Microsoft Azure ExpressRoute Partner is consistent with Webair’s overarching strategy of providing customers with direct, private and secure access to hybrid cloud services, and expands its ability to mix and match its own local, low-latency enterprise public cloud as well as third-party hyperscale cloud services.

“By becoming a Microsoft Azure ExpressRoute Partner, Webair’s customers are provided with redundant and diverse paths to the Microsoft Cloud,” explains Michael Christopher Orza, CEO of Webair. “Azure ExpressRoute will allow our customers to utilize Microsoft cloud services with increased confidence in network performance and security.”

Webair’s cloud infrastructure is housed in Webair-owned facilities and runs on enterprise-grade hardware dedicated to customers and deployed directly into customer environments. Its direct network connectivity model and ability to deploy dedicated hardware per customer allow the secure and private consumption of scalable and SLA-backed cloud services with no physical connectivity to the public internet or to other customers. Today, many of Webair’s healthcare provider and enterprise customers, for example, need to bypass the public internet and consume cloud services as if they were on-premises. Becoming a Microsoft Azure ExpressRoute Partner and gaining a private, dedicated network connection between Microsoft Azure data centers and Webair customers’ IT environments now provides the best of both options without having to sacrifice existing network security models.

Webair has executed many hybrid cloud solutions for its customers, which often include a hybrid of services such as Enterprise Private Cloud, Managed Security, Disaster Recovery-as-a-Service (DRaaS) and Colocation as well as connectivity to Microsoft Azure, air-gapped and bypassing the public internet where possible. An air gap means that the customer’s network or system is physically isolated from the internet, thus providing added security against intruders.

In becoming a Microsoft Azure ExpressRoute Partner, Webair has established a more formal relationship with Microsoft to meet its clients’ ongoing growth and demand for future implementations. Webair plans on offering more managed services on top of Microsoft Azure as its customers, including healthcare and enterprise organizations, seek more hybrid services. Becoming a Microsoft Azure ExpressRoute Partner is but one critical first step in meeting these demands.


Source: CloudStrategyMag

PacketFabric Expands Cloud Networking Platform To Colo Atl

Colo Atl has announced that PacketFabric’s Software-defined Networking (SDN) based network platform is now available at its downtown data center location. A NantWorks company and provider of next-generation Ethernet-based cloud networking services, PacketFabric can now interconnect with network service providers within the Colo Atl Meet-Me Area (MMA) with no monthly recurring cross-connect fees. The collaboration also enables seamless access to the platform for Colo Atl’s enterprise, cloud, and XaaS provider customers.

“As a carrier-neutral facility offering key interconnection opportunities with no monthly recurring cross connect fees, Colo Atl is an ideal partner for PacketFabric in the Atlanta market,” comments William Charnock, CEO of PacketFabric. “Colo Atl’s strategic location is critical to further extending the PacketFabric network and providing more customers with access to scalable, next-generation cloud networking services and simplified provisioning and maintenance of network infrastructure.”

PacketFabric’s fully automated network platform enables instantaneous, direct and secure provisioning of terabit-scale connectivity between any of the 128 locations on its network. PacketFabric customers can dynamically design and quickly deploy any network configuration leveraging an advanced Application Program Interface (API) and web-based portal for unmatched visibility and control over their network traffic and services. Real-time analytics and interactive troubleshooting capabilities allow PacketFabric to offer the robustness of a packet-switched network, while ensuring consistent and reliable performance.
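
The API-driven provisioning model described above can be illustrated with a minimal sketch. The endpoint, field names, and port identifiers below are hypothetical placeholders, not PacketFabric’s actual API; a real client would authenticate and submit the request over HTTPS.

```python
import json

# Hypothetical sketch of provisioning a point-to-point Ethernet virtual
# circuit through an SDN platform's REST API. The base URL and field
# names are illustrative only, not PacketFabric's real interface.
API_BASE = "https://api.example-fabric.net/v1"  # hypothetical endpoint

def build_connection_request(src_port, dst_port, mbps, vlan):
    """Assemble the JSON body for a virtual-circuit provisioning call."""
    return {
        "source_port": src_port,
        "destination_port": dst_port,
        "bandwidth_mbps": mbps,
        "vlan_id": vlan,
    }

# Example: a 10 Gbps circuit between two (hypothetical) PoP ports.
req = build_connection_request("colo-atl-pop1", "nyc-pop3", 10000, 301)
body = json.dumps(req)
# A real client would POST `body` to f"{API_BASE}/connections" with an
# auth token, then poll the returned circuit ID until it is active.
print(body)
```

The appeal of this model is that the same request can be scripted, versioned, and repeated, which is what makes "instantaneous" provisioning and web-portal control possible.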

“PacketFabric is an excellent addition to our Colo Atl family,” states Tim Kiser, owner and founder of Colo Atl. “Here at Colo Atl, in addition to our highly qualified staff, outstanding customer support, and critical interconnection opportunities, we aim to provide our customers with industry-leading infrastructure and solutions that will meet their considerable data demands. PacketFabric’s innovative cloud networking platform’s ability to deliver hundreds of terabits per second of on-demand connectivity surely fits the bill.”

Founded in November 2001, Colo Atl provides a reasonable, accommodating and cost-effective interconnection environment for more than 90 local, regional and global network operators. In 2016, the company celebrated its 15-year anniversary of providing service excellence and growth.

Colo Atl is an Atlanta Telecom Professionals Award Nominee and Winner of the 2016 TMT News Award for Best Colocation & Data Center – Georgia and the 2016 Georgia Excellence Award by the American Economic Institute (AEI).

Source: CloudStrategyMag

Markley And Cray Partner To Provide Supercomputing As A Service

Cray Inc. and Markley have announced a partnership to provide supercomputing as a service solutions that combine the power of Cray supercomputers with the premier hosting capabilities of Markley. Through the partnership, Markley will offer Cray supercomputing technologies, as a hosted offering, and both companies will collaborate to build and develop industry-specific solutions.

The availability of sought-after supercomputing capabilities both on-premises and in the cloud has become increasingly desirable across a range of industries, including life sciences, bio-pharma, aerospace, government, banking, and more – as organizations work to analyze complex data sets and research, and reduce time to market for new products. Through the new supercomputing as a service offering, Cray and Markley will make it easier and more affordable for research scientists, data scientists, and IT executives to access dedicated, powerful compute and analytic capability to shorten time to discovery and decision.

“The need for supercomputers has never been greater,” said Patrick W. Gilmore, chief technology officer at Markley. “For the life sciences industry especially, speed to market is critical. By making supercomputing and big data analytics available in a hosted model, Markley and Cray are providing organizations with the opportunity to reap significant benefits, both economically and operationally.”

Headquartered in Boston, Markley delivers best-of-breed cloud and data center offerings, including its enterprise-class, on-demand Infrastructure-as-a-Service solution that helps organizations maximize IT performance, reduce upfront capital expenses, increase speed to market, and improve business continuity. In addition, Markley guarantees 100% uptime, backed by the industry’s best Service Level Agreement.

“Cray and Markley are changing the game,” said Fred Kohout, Cray’s senior vice president of products and chief marketing officer. “Now any company that has needed supercomputing capability to address their business-critical research and development needs can easily and efficiently harness the power of a Cray supercomputer. We are excited to partner with Markley to create this new market for Cray.”

The first industry solution built by Cray and hosted by Markley will feature the Cray® Urika®-GX for life sciences — a complete, pre-integrated hardware-software solution. In addition, Cray has integrated the Cray Graph Engine (CGE) with essential pattern-matching capability and tuned it to leverage the highly-scalable parallelization and performance of the Urika-GX platform. Cray and Markley have plans for the collaboration to quickly expand and include Cray’s full range of infrastructure solutions.

The Cray Urika-GX system is the first agile analytics platform that fuses supercomputing abilities with open enterprise standards to provide an unprecedented combination of versatility and speed for high-frequency insights, tailor-made for life sciences research and discovery.

“Research and development, particularly within life sciences, biotech and pharmaceutical companies, is increasingly data driven. Advances in genome sequencing technology mean that the sheer volume of data and analysis continues to strain legacy infrastructures,” said Chris Dwan, who led research computing at both the Broad Institute and the New York Genome Center. “The shortest path to breakthroughs in medicine is to put the very best technologies in the hands of the researchers, on their own schedule. Combining the strengths of Cray and Markley into supercomputing as a service does exactly that.”

“HPC environments are increasingly being used for high-performance analytics use cases that require real-time decision making such as cybersecurity, real-time marketing, digital twins, and emerging needs driven by big data and Internet of Things (IoT) use cases. Augmenting your on-premises infrastructure with HPC clouds enables you to meet your existing SLAs while scaling up performance-driven analytics for emerging use cases,” notes Gartner, in Follow These Three Steps to Optimize Business Value from Your HPC Environments, by Chirag Dekate, September 16, 2016.

Source: CloudStrategyMag

Global Capacity Expands Seven Ethernet Access Points

Global Capacity has announced the expansion of seven One Marketplace™ Points of Presence (PoPs) in its extensive North American network, including six Ethernet local-access aggregation points and three high-performance Ethernet backbone points purpose-built for the most demanding cloud, over-the-top applications, and data services. The locations now Ethernet-enabled include Pittsburgh and Philadelphia, PA; Minneapolis, MN; and three new PoPs in Boston, MA; Kansas City, MO; and Vienna, VA. The locations added to the high-performance Ethernet backbone include Kansas City, MO; Minneapolis, MN; and Toronto, Ontario, Canada.

These high-demand, key aggregation points enable delivery of diverse route options, competitive pricing and a broad selection of network access services to One Marketplace customers. Ethernet is the technology of choice for SD-WAN, Hybrid WAN, and Cloud Connectivity solutions. The popularity of these enterprise services drives Global Capacity’s continued expansion and investment. The company’s investment in expanding its backbone PoPs and enabling greater Ethernet access is a testament to Global Capacity’s commitment to deliver ubiquitous coverage, flexible access options, and simplified service activation and management to both enterprise and service provider customers.

“Last year, Global Capacity achieved 37% growth in installed Ethernet revenue driven by cloud and data center connectivity, and the higher traffic needs of today’s data-driven society,” comments Jack Lodge, president of Global Capacity. “Global Capacity will continue to invest in the One Marketplace network in ways that will connect business locations in more markets to key destinations, over greater bandwidth and high performance Ethernet.”

Global Capacity’s award-winning marketplace of networks, One Marketplace, eliminates the complexity and inefficiency of the network market by delivering unprecedented transparency, efficiency and simplicity to the complex and highly fragmented data connectivity market. By combining intelligent information analytics and service automation through a suite of customer and supplier applications, along with network delivery, One Marketplace streamlines the process of designing, pricing, buying, delivering, and managing data connectivity solutions.


Source: CloudStrategyMag

Review: Tableau takes self-service BI to new heights

Since I reviewed Tableau, Qlik Sense, and Microsoft Power BI in 2015, Tableau and Microsoft have solidified their leadership in the business intelligence (BI) market: Tableau with intuitive interactive exploration, Microsoft with low price and Office integration. Qlik is still a leader compared to the other 20 vendors in the sector, but trails both Tableau and Power BI.

[Badge: InfoWorld Editors’ Choice]

In addition to new analytics, mapping, and data connection features, Tableau has added better support for enterprises and mobile devices over the last two years. In this review, I’ll give you a snapshot of Tableau as it now stands, drill into the features added since version 9, and explore the Tableau road map.

Source: InfoWorld Big Data

NoSQL, no problem: Why MySQL is still king

MySQL is a bit of an attention hog. With relational databases supposedly put on deathwatch by NoSQL, MySQL should have been edging gracefully to the exit by now (or not so gracefully, like IBM’s DB2).

Instead, MySQL remains neck-and-neck with Oracle in the database popularity contest, despite nearly two decades less time in the market. More impressive still, while Oracle’s popularity keeps falling, MySQL is holding steady. Why?

An open gift that keeps on giving

While both MySQL and Oracle lost favor relative to their database peers, as measured by DB-Engines, MySQL remains hugely popular, second only to Oracle (and not by much):

[Chart: database popularity ranking, via DB-Engines]

Looking at how these two database giants are trending and adding in Microsoft SQL Server, only MySQL continues to consistently grow in popularity:

[Chart: search interest over time, via Google]

While general search interest in MySQL has fallen over the years, roughly in line with falling general search interest in Oracle and Microsoft SQL Server, professional interest (as measured by Stack Overflow mentions) has remained relatively firm. More intriguing, it dwarfs every other database:

[Chart: database mentions on Stack Overflow]

The script wasn’t written this way. NoSQL, as I’ve written, boomed in the enterprise as companies struggled to manage the volume, velocity, and variety of modern data (the three V’s of big data, according to Gartner). Somehow MySQL not only survived, but thrived.

Like a comfortable supershoe

Sure, NoSQL found a ready audience. MongoDB, in particular, has attracted significant interest, so much so that the company is now reportedly past $100 million in revenue and angling to IPO later this year.

Yet MongoDB hasn’t toppled MySQL, nor has Apache Cassandra or Apache Hadoop, as former MySQL executive Zack Urlocker told me: “MongoDB, Cassandra, and Hadoop all have worthwhile specialized use cases that are sufficiently hard to do in [a] relational database. So they can be decent sized businesses (less than $100 million) but they are unlikely to be as common as relational.” Partly this stems from the nature of most big data today: still transactional in nature, and hence well-suited to the neat rows and columns of an RDBMS.
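
The point about transactional data fitting neatly into rows and columns can be made concrete with a minimal sketch. It uses Python’s built-in sqlite3 as a stand-in for MySQL, since both speak the same relational, ACID-transaction model; the schema is illustrative.

```python
import sqlite3

# Minimal sketch of the transactional, rows-and-columns workload that
# relational databases handle well; sqlite3 stands in for MySQL here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("alice", 100), ("bob", 50)])

def transfer(conn, src, dst, amount):
    """Move funds atomically: both updates commit, or neither does."""
    try:
        with conn:  # opens a transaction; commits on success, rolls back on error
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                         (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                         (amount, dst))
            # Enforce an invariant; raising aborts the whole transaction.
            (bal,) = conn.execute("SELECT balance FROM accounts WHERE name = ?",
                                  (src,)).fetchone()
            if bal < 0:
                raise ValueError("insufficient funds")
    except ValueError:
        pass  # transaction rolled back, balances unchanged

transfer(conn, "alice", "bob", 30)   # succeeds
transfer(conn, "alice", "bob", 500)  # fails and rolls back
print(dict(conn.execute("SELECT name, balance FROM accounts")))
```

This all-or-nothing guarantee is precisely what is "sufficiently hard to do" in many NoSQL stores, and why so much business data stays relational.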

This coincides with the heart of MySQL’s popularity: It’s a great database that fits the skill sets of the broadest population of database professionals. Even better, they can take all they learned growing up with Oracle, IBM DB2, and Microsoft SQL Server and apply it to an omnipresent, free, and open source database. What’s not to love?

Scale, for one.

Actually, that was the original rap against MySQL and all relational databases: They could scale up but not out, and we live in a scale-out world. As it turns out, “It actually can scale” quite well, Linux Foundation executive Chris Aniszczyk affirmed to me. While it may have started from an architecturally underprivileged standpoint, engineers at the major web companies like Google and Facebook had huge incentives to engineer scale into it. As examples of MySQL at scale proliferated, Pivotal vice president James Bayer suggested to me, it bred confidence that MySQL was a strong go-to option for demanding workloads.

This isn’t to suggest that MySQL is an automatic winner when it comes to scale. As developer DJ Walker-Morgan puts it, “NoSQL takes care of scaling like me buying diet food takes care of weight loss: only if strict disciplines and careful management is applied.” Again, enough examples exist that developers are motivated to give it a try, especially since it’s so familiar to a broad swath of the DBA community. Also, as Server Density CEO David Mytton underscored to me, “[M]anaged services like RDS … [and] Aurora in particular solve[] a lot of scale pain” for MySQL.

Which is why, 22 years after it first hit the proverbial shelves, MySQL is arguably the most popular database on earth. It doesn’t have the “enterprise grade” label that Oracle likes to slap on its database, and it doesn’t have the “built for horizontal scale” marketing that carried NoSQL so far, but it’s the default choice for yesterday’s and today’s generation of developers.

The fact that it’s free doesn’t hurt, but the fact that it’s a free, powerful, familiar relational database? That’s a winning combination.

Source: InfoWorld Big Data

Oracle's next big business is selling your info

There’s a decent chance you’re part of Oracle’s next big business. Not selling products to you, but selling you as a product. That’s the idea behind the Oracle Data Cloud, a massive pool of information about consumers and companies.

The tech titan has put it together by tracking people across the web and buying data from a variety of sources. People who have their data included may not even know that they’ve opted in for that data collection.

There’s no big red button that someone has to click in order to be a part of the company’s data collection machine. Instead, its base of user data is fed by a network of third parties. The Data Cloud is primarily fed by three types of sources: publishers, like Forbes and Edmunds, retail loyalty programs, and traditional data brokers like Experian and IHS.

All of that adds up to a database of 5 billion consumer profiles, fed by 15 million data sources. Not every profile corresponds to a unique person — people can have multiple profiles — but Oracle has information on billions of people, according to Eric Roza, the vice president of Data Cloud. Using data science techniques, Oracle works to match activity from one browser to others, so companies can make sure the same ads get shown to people on their smartphones, tablets, and computers.

Oracle sees Data Cloud as a key part of its future. The service is being used to help advertisers and publishers better target ads, and it’s attractive to businesses because it’s not tied to a major advertising platform like Google’s or Facebook’s.

The Data Cloud also forms the foundation of machine learning features inside other Oracle software. One of the challenges for companies doing machine learning is getting data sets that are large enough to build accurate models, and Data Cloud can help solve that problem.

But the benefits are mostly borne by Oracle’s business customers, who stand to make more money as a result of using Data Cloud enhanced services. The boon to consumers whose data are being used is less defined.

Oracle isn’t alone in this sort of tracking. There are dozens of companies that exist for the sole purpose of collecting consumer data and then reselling that to other businesses. Google, Facebook, Microsoft, and other tech titans have made big money from accumulating customer data and using it to sell ads.

But what makes the Data Cloud different from something like Google’s ad business is that consumers might not know their behavior is being stored for resale, or how broadly it’s shared. Just because someone visits a page on Forbes doesn’t mean they’d expect that information to influence a marketing campaign on a radically different website, but that’s what the Data Cloud enables.

Partners feeding data into Oracle’s Data Cloud must agree they have user permission to collect information. But acquiring that permission is as simple as burying a few sentences deep in a privacy policy. While some might call out Oracle Data Cloud by name, most don’t. 

“Typically, because these things are quite common practice now, there’s a more generalized statement [like] some version of ‘we use this data to inform our own advertising, and select third-party partners,'” Roza said.

Users can opt out from the data collection in a variety of ways, according to Roza. Oracle allows people to install a special cookie in each of the browsers they use to prevent tracking. Deleting the cookie or using a new browser would erase that protection, however. Some publishers may allow customers to opt out of data sharing, and advertising industry groups also support opting out.

But actually knowing whether or not you’re included in the Data Cloud is the first part of the battle. And that’s not the easiest thing to figure out. Meanwhile, Oracle is continuing to pour money into the business and tout it to customers. The company has spent billions on acquisitions to build the Data Cloud, which was created through bringing companies like BlueKai, Datalogix, and Moat into the fold.

Source: InfoWorld Big Data

IBM Named To CRN's 2017 Big Data 100 List

IBM has announced that CRN, a brand of The Channel Company, has named IBM to its 2017 Big Data 100 list, ranking the company as one of the 15 Coolest Big Data Platform Vendors. The annual Big Data 100 list recognizes the ingenuity of tech suppliers bringing to market innovative offerings for harnessing the increasingly huge amounts of data generated in today’s digital world, raising the bar for data management and challenging established IT practices.    

Businesses are constantly grappling with the exploding volume, speed and variety of information they produce and utilize on a daily basis to remain competitive. Solution providers are on a never-ending quest to tame this big data with innovative tools, technologies and services that can convert it into meaningful, usable statistics.

In response to this challenge, the CRN editorial team has identified the IT vendors at the forefront of data management, business analytics and infrastructure technologies and services. The resulting Big Data 100 list is a valuable guide for solution providers seeking out key big data technology suppliers.

IBM was recognized as one of the top 15 Coolest Big Data Platform Vendors for its big data initiatives, many of which are centered on Watson, including the Watson Data Platform and Watson Analytics. CRN also noted recent enhancements to “Watson’s data analysis and discovery capabilities on the IBM Cloud.”

Through the IBM Cloud platform, IBM delivers scale, security, choice, and consistency for enterprises worldwide, offering a global network of more than 50 Cloud Data Centers for public, private, or hybrid cloud deployments. IBM Cloud’s data-first architecture provides data diversity, data control, and isolation capabilities that, coupled with cognitive capabilities, help enterprises unravel key business insights from unstructured data to make smarter business decisions.

“Businesses everywhere are faced with managing information streams of unprecedented volume and complexity, requiring more powerful and efficient tools than ever before for capturing, storing, organizing, securing and analyzing data,” said Robert Faletra, CEO of The Channel Company. “CRN is pleased to present the 2017 Big Data 100, a list of vendors whose ingenuity and creative problem-solving have introduced remarkable new ways to help solution providers tackle this mammoth task. Congratulations to these Big Data aces, who have not only kept pace with the rapidly evolving demands of the data management field, but also innovated and challenged the status quo.”   

“We are honored to be recognized among CRN’s Big Data 100 for our IBM Cloud platform’s data-first architecture and cognitive capabilities,” said David Wilson, vice president, IBM Cloud Business Partners and Channel Innovation. “We continue to help our business partners unlock the power of data through initiatives like the Watson Build, a new challenge that gives our partners the training, tools and mentorship they need to bring new cognitive solutions to market quickly.”

Source: CloudStrategyMag

Equinix And Oracle Collaborate To Offer Direct Access To Oracle Cloud

Equinix, Inc. has announced the immediate availability of dedicated, private access to Oracle Cloud Infrastructure — Oracle’s next-generation IaaS offering — via the Equinix Cloud Exchange™. This direct access enables enterprise customers to migrate applications and data to Oracle Cloud in a high-performance, low latency manner for optimal user experience. This announcement builds on previous collaborations between Equinix and Oracle to offer direct access to Oracle’s full suite of cloud services — both PaaS and SaaS solutions — in multiple markets around the globe. Access to Oracle Cloud Infrastructure will initially be available via Oracle Cloud Network Service – FastConnect in the Equinix Washington, D.C., International Business Exchange™ (IBX®) data center, with additional markets expected throughout the year.

Cloud deployments are continually on the rise within the enterprise. Yet, traditionally, enterprise customers with large database workloads have not had a viable way to migrate data and applications easily from their on-premises infrastructure into the cloud. Now, through Equinix Cloud Exchange and its API integration with FastConnect, customers can establish direct connectivity between their on-premises infrastructure and Oracle Cloud environments. This enables them to fully realize the benefits of hybrid cloud — moving application, middleware and database workloads seamlessly between on-premises and Oracle Cloud — with a reliable, low latency, production-ready experience.

Whether a customer is looking to incorporate direct connection to Oracle as part of a broader interconnection strategy, or specifically needing to migrate data-heavy applications to the cloud, connection to Oracle Cloud inside Equinix is an ideal solution for many enterprise users. Specific examples of ideal hybrid deployments include:

  • Customers looking to migrate and host complex, multi-tier solutions in the cloud while minimizing production downtime can work with Equinix and Oracle to extend their network and communicate with the Oracle Cloud over a high-speed connection inside the Equinix data center.

  • Customers needing to perform analytics on large data sets residing in Oracle databases can host their Oracle technology solutions on-premises inside Equinix and, with FastConnect on Equinix Cloud Exchange, extend their existing network directly into the Oracle Cloud, removing data size limits and providing lower latency, higher throughput and network-level security protection — all in a highly scalable solution. This is ideal for data privacy, regulatory compliance and data sovereignty scenarios.

  • Customers looking to consolidate databases into Oracle Exadata Cloud Service yet have little or no capacity in their current data center can place their Oracle Exadata Cloud Service racks inside Equinix and directly connect to Oracle cloud for high availability, disaster recovery and backup.

“As more and more enterprise CIOs deploy interconnection oriented architectures, direct connections like these are critical to their success. The ability to connect directly to Oracle is an essential strategy for CIOs as they deploy workloads to the cloud edge, closer to users and adjacent to related cloud services. Oracle’s common cloud architecture allows customers the choice to leverage Oracle cloud, deploy customer-owned assets inside Equinix, or enable a hybrid solution utilizing both. Offering direct connectivity to Oracle from Platform Equinix is another key step in helping CIOs who are seeking to capture the tremendous benefits of hybrid cloud as the IT architecture of choice. We look forward to ongoing collaborations with Oracle, bringing this solution to Equinix data centers across the globe,” said Charles Meyers, chief operating officer.

Oracle Cloud delivers nearly 1,000 SaaS applications and 50 enterprise-class PaaS and IaaS services to customers in more than 195 countries around the world, and supports 55 billion transactions each day. Oracle Cloud Infrastructure is also part of the fast-growing cloud computing sector. According to a recent Gartner report, the highest cloud growth is expected to come from IaaS, with growth of 38.4% in 2016.

“Cloud is the fastest growing part of Oracle’s business. Customers require seamless connectivity from their data centers and networks to Oracle Cloud for their most demanding workloads and applications. This relationship will help our customers leveraging the Oracle Cloud, execute on their business strategies by taking advantage of the Equinix Cloud Exchange for optimal user experience,” said Thomas Kurian, president of product development, Oracle.

Oracle and Equinix have a long-standing relationship, and Equinix was recently recognized as an Oracle World Wide Gold Level Partner in the Oracle PartnerNetwork (OPN). Oracle FastConnect, providing private access to Oracle Cloud on Equinix Cloud Exchange, is scheduled to be available in six markets globally – Washington, D.C., Chicago, Amsterdam, London, Sydney, and others – by the end of 2017.

Source: CloudStrategyMag