AWS Direct Connect Service Now Available At Equinix In Munich

Equinix, Inc. has announced the immediate availability of Amazon Web Services (AWS) Direct Connect cloud service in its Munich International Business Exchange (IBX®) data centers. With AWS Direct Connect, companies can connect their customer-owned and managed infrastructure directly to AWS, establishing a private connection to the cloud that can reduce costs, increase performance, and deliver a more consistent network experience. The Munich data center is Equinix’s second location in Germany to offer AWS Direct Connect. The company announced availability in Frankfurt in 2014. Munich brings the total number of Equinix metros offering the Direct Connect service to 14, globally.

“We are thrilled to be adding another AWS Direct Connect site in our European footprint of data centers. It has always been our goal to help enterprises realize the full benefits of the cloud, while alleviating concerns of application latency or data privacy. By providing access to AWS via the Direct Connect service, we are empowering our customers to achieve improved performance of cloud-based applications,” said Eric Schwartz, president, EMEA, Equinix.

Cloud computing continues to grow at a steady pace across Europe. In fact, IDC predicts that the EMEA public cloud market will expand 26% over the next four years. Germany remains the second-largest cloud-consuming territory, and it has the highest growth rate (23.8% CAGR) among the major territories. According to IDC, based on current trends, Germany could be the largest buyer of public cloud services by 2026*. Yet one of the top CIO concerns in moving to the cloud continues to be compliance with German and European data protection laws. By expanding AWS Direct Connect within Germany, Equinix and Amazon are enabling enterprise CIOs to advance their hybrid cloud strategies by seamlessly and safely incorporating public cloud services into their existing architectures.

Equinix’s Munich campus consists of two data centers comprising 6,500 square meters (70,000 sq ft) of colocation space. Together, the Munich data centers are business hubs for more than 70 companies. Equinix Munich customers can choose from a broad range of network services from more than 30 network providers. They can also interconnect with customers and partners in their digital supply chain. By adding AWS Direct Connect to these facilities, Equinix is providing customers with an ideal location to host cloud services and help address data privacy and data sovereignty concerns.

Equinix Munich data centers have prime locations in one of Germany’s most important banking and insurance industry hubs. The region is also home to thriving automotive, Internet, digital media and electronics industries. Companies in these sectors rely on Equinix’s Munich colocation facilities and interconnection services to make highly reliable, low-latency connections with business partners and customers locally, nationally and internationally.

* Source: IDC, “Western Europe Public Cloud Services Forecast, 2016–2020” (Doc #EMEA41308316 / Jun 7, 2016)

Source: CloudStrategyMag

Outscale Opens New Data Center In Silicon Valley

Outscale has announced the opening of its new data center in San Jose, CA. The addition of a fourth data center in the U.S. will allow Outscale to support its customers’ increasing needs for cloud computing infrastructures. This new data center enables Outscale to offer enterprise customers access to its premium storage and compute services, delivering best-in-class performance, availability and customer support.

This latest infrastructure build-out in San Jose further demonstrates the company’s commitment to the U.S. market and better equips Outscale to address the cloud computing and Big Data requirements of its west coast customers. The cloud provider has once again selected Equinix to be its local partner and host of the point-of-presence (PoP) due to its certifications of SSAE 16, ISAE 3402 SOC 1 Type, SOC 2 Type and ISO 27001.

“Since we opened our doors in 2010, we have committed to helping our customers harness the power of cloud computing by providing a premium, enterprise-class IaaS and unparalleled customer support,” said Laurent Seror, founder and president, Outscale. “As the epicenter of technology innovation, Silicon Valley is a logical choice as we expand our presence in the United States to meet our customers’ needs.”

According to IDC, worldwide spending on public cloud services is expected to eclipse $203.4 billion in 2020, with a compound annual growth rate (CAGR) of 21.5% for public cloud services over the period 2015-2020. This is nearly seven times the overall growth rate of IT spending, signaling a clear demand for cloud services.

With the addition of the Silicon Valley location, Outscale now has nine data centers globally spread across Europe, North America, and Asia. Outscale delivers business agility through:

  • Unparalleled performance – When it comes to building a cloud computing infrastructure to meet high performance computing demands, Outscale stands alone. The company has built its cloud with a highly reliable infrastructure based on Intel, NetApp and Cisco technologies, and holds the Cisco Managed Services Program (CMSP) Advanced Certification.
  • Unsurpassed scalability – Outscale offers an automated and scalable cloud, designed to support complex IT projects, while controlling operational costs.
  • Flexible pricing – Outscale offers a unique pricing model, one that allows its customers to pay by the year, month, hour or even second. Per hour and per second billing means customers only pay for the resources they use, potentially saving them hundreds of thousands of dollars per year.

Outscale was founded in France in 2010, and was one of the first pure play cloud providers in the world. Today, the company delivers the cloud computing infrastructure for dozens of global brands, including strategic partner Dassault Systemes, Airbus, and OpenDataSoft. In addition, Outscale serves as the vendor of choice for ISVs, VARs and startups around the world.

“We pride ourselves as an innovative cloud provider, one that is there for our customers with superior 24/7 support,” said Rob Rosborough, U.S. CEO, Outscale. “The expansion to Silicon Valley allows us to get closer to many of our customers, as we help them embark upon their Cloud journeys and maximize their investments.”

Source: CloudStrategyMag

SolarWinds Releases IT Trends Report 2017

SolarWinds has revealed the findings of its SolarWinds IT Trends Report 2017: Portrait of a Hybrid IT Organization. Featuring insights from public sector IT practitioners, managers, and directors, this year’s annual state-of-the-industry study explores the variety of ways in which IT departments around the world are integrating the cloud, and the effect hybrid IT has had on their organizations and IT job roles.

Overall, North American public sector organizations are moving further into the cloud, with 96% of respondents reporting they have migrated critical applications and infrastructure over the past year. However, while nearly 60% say they have experienced the expected benefits of the cloud, hybrid IT is increasing the complexity of IT roles, and introducing challenges like a lack of visibility between on-premises and cloud infrastructure, as well as the need to develop new skillsets to keep pace with changing environments.

“No job is more affected by ongoing technology disruptions than the role of the IT professional, which is why we explore these dynamics year after year,” said Joe Kim, senior vice president and chief technology officer, SolarWinds. “By creating this portrait of today’s public sector hybrid IT organization, we get to the heart of the shifts occurring so we can better understand and cater to the unique needs of these government agencies. For today’s IT professionals, it’s absolutely critical not only to put the right solutions in place to best manage hybrid IT environments, but to prepare organizations — and themselves — for continued technology advancements, even as we move beyond cloud.”

2017 Key Findings for the Public Sector

The SolarWinds IT Trends Report 2017: Portrait of a Hybrid IT Organization explores significant trends, developments, and movements related to and directly affecting IT and IT professionals. Key findings show that today’s public sector hybrid IT organizations are:

Moving applications, storage, and databases further into the cloud.

  • In the past 12 months, IT professionals have migrated applications (73%), storage (51%), and databases (29%) to the cloud more than any other areas of IT.
  • By weighted rank, the top three reasons for prioritizing these areas of their IT environments for migration were greatest potential for ROI/cost efficiency, availability, and increased reliability, respectively.

Experiencing the cost efficiencies of the cloud.

  • Nearly all (96%) public sector organizations have migrated critical applications and IT infrastructure to the cloud over the past year, yet three-fourths (75%) spend less than 40% of their annual IT budgets on cloud technology.
  • Two-fifths (40%) said their organizations spend 70% or more of their annual IT budgets on on-premises (traditional) applications and infrastructure.
  • Nearly three in five (58%) organizations have received either most or all expected cloud benefits (such as cost efficiency, availability, or scalability).
  • Cost efficiency is at times not enough to justify migration to the cloud: 29% migrated areas to the cloud that were ultimately brought back on-premises due mostly to security/compliance issues, poor performance, and technical challenges with the migration.

Building and expanding cloud roles and skillsets for IT professionals.

  • Over three-fifths (62%) of IT professionals indicated that hybrid IT has required them to acquire new skills, while 11% said it has altered their career path.
  • Nearly three-fifths (57%) of public sector organizations have already hired/reassigned IT personnel, or plan to do so, for the specific purpose of managing cloud technologies.
  • The top two cloud-related skills IT professionals improved over the past 12 months were data analytics (40%) and monitoring/management tools and metrics (40%).
  • Sixty-three percent said an IT staff skills gap was one of the five biggest hybrid IT challenges, while 47% cited increased workload/responsibilities.
  • More than a third (38%) do not believe that IT professionals entering the workforce now possess the skills necessary to manage hybrid IT environments.

Increasing in complexity and lacking visibility across the entire hybrid IT infrastructure:

  • Nearly two-thirds (65%) said their organizations currently use up to three cloud provider environments, with the largest percentage using two to three; however, one out of every 10 (9%) use 10 or more.
  • By weighted rank, the number one challenge created by hybrid IT is increased infrastructure complexity, followed by an IT skills gap and lack of control/visibility into the performance of cloud-based applications and infrastructure, respectively.

To explore and interact with all of the 2017 findings, please visit the SolarWinds IT Trends Index, a dynamic web experience that presents the study’s findings by region, including charts, graphs, socially shareable elements, and additional insights into the data.

The findings of this year’s North America public sector report are based on a survey fielded in December 2016, which yielded responses from 75 IT practitioners, managers, and directors at small, mid-size, and large public-sector organizations in the U.S. and Canada that use cloud-based services for at least some of their IT infrastructure (including applications).

Source: CloudStrategyMag

Report: Disconnect On Cloud Security And IoT

AlienVault® has released the results of a survey showing that cloud security remains a thorn in the side of security professionals, with many still struggling to monitor this environment effectively.

Conducted at RSA 2017, the survey gathered input from 974 conference participants on cloud security and IoT monitoring, providing an inside look at the challenges and concerns plaguing companies today, along with the opportunities and benefits associated with each technology.

Perhaps the most startling survey statistic is that one third of show attendees describe the state of security monitoring within their organization as “complex and chaotic.” Survey results also reveal a major disconnect between respondents’ beliefs and their actions when it comes to cloud security and IoT, which is likely a significant factor in this outcome. For example:

Forty-two percent of respondents are less confident in their ability to detect threats in the cloud vs. on-premises, yet 47% would rather monitor a cloud environment than an on-premises network.

Sixty-two percent state that they are worried about IoT devices in their environment, yet 45% believe IoT benefits outweigh the risks. Frighteningly, 43% of respondents say their company does not monitor IoT network traffic at all, and an additional 20% aren’t even sure of the answer.

“The driving force behind cloud and IoT is the availability and analysis of information, but they must be managed and monitored in the right way. If data is misused, or inadequately protected, the consequences can be severe,” said Javvad Malik, security advocate at AlienVault. “According to the survey findings, many companies are using these impactful technologies to reap the technological and business benefits they provide, but they are doing so without proper monitoring – leaving their company at greater risk of attack.”

When it comes to monitoring security threats in the cloud, an alarming number of respondents reported being left in the dark when decisions are made. According to the survey, 39% of respondents are using more than 10 different cloud services within their organization, and 21% don’t know how many cloud applications are being used. In addition, 40% state that their IT team is not always consulted before a cloud platform is deployed, meaning that they are unable to offer guidance and advice, or do due diligence on a platform or service.

The survey also asked participants what concerned them most about cloud security. While malware was rated as the highest concern, with 47% of respondents worrying about it, some of the other responses shed light on why so many security professionals view their environments as complex and chaotic. Forty-two percent of respondents are concerned about a lack of visibility in the cloud, and 21% are worried about the cloud-based services they use producing “too many logs.” This finding also points to the problems associated with auditing cloud environments in the event of an incident.

“Most organizations are drowning in ineffective preventative measures and draining resources with investments in expensive, disjointed solutions. This unfortunate combination is likely a tremendous factor in producing the chaos, complexity and confusion experienced by so many companies,” continued Malik. “It’s time for organizations to focus on what they do have control over – threat detection and incident response — and implement a unified solution that can monitor on-premises, cloud and hybrid environments. Simplifying security in this way enables companies to immediately identify and respond to threats, and in today’s cybersecurity landscape, this is the best strategy to mitigate risk.”

Source: CloudStrategyMag

451 Research Highlights IndependenceIT’s Ability To Simplify WaaS

IndependenceIT has highlighted a new 451 Research report on the company’s flagship software platform, Cloud Workspace® Suite (CWS) 5.1. The research report, titled, “IndependenceIT Aims to Ease Deployment for Microsoft CSP Partners with CWS 5.1,” provides an overview of new CWS features and points out the company’s strength in multi-platform App and WaaS automation and workflow.

CWS 5.1 provides automation and workflow to simplify administrative tasks from infrastructure management to end-user support. The cloud enablement and management platform combines application, end-user, and infrastructure oversight into a seamless, easy-to-manage platform with a unified control interface and a robust API for integration with existing systems, minimizing deployment time and controlling data center costs. The company recently discussed the solution with 451 Research and provided an overview of the software, which resulted in the new report.

According to the report by 451 Research, “IndependenceIT has been on a mission to focus its development efforts on ease of use and manageability. A new UI that comprises a specific dropdown menu (for services/functions), color themes and icon graphics has been available since the release of CWS v.5.0. Instead of competing with other VDI and cloud platforms, the company has carved out a niche by facilitating workflow orchestration across multiple VDI and cloud environments. As it sees growing interest in Microsoft Azure, IndependenceIT has deepened its support for Microsoft ARM — a self-provisioning cloud portal for Azure (public cloud) and Azure Stack (private cloud) — in the latest version.”

“For system administrators, IndependenceIT has further invested in driving greater automation by creating an additional layer that enables system administrators to trigger the cloud workspace interface to perform a number of automated tasks such as server execution and application installation using scripting, rather than an API. The company notes that this can significantly speed deployments and reduce overhead costs, and points to the fact that not many system administrators can support full-blown development using an API,” said the report.

“IndependenceIT remains focused on demonstrating its strength in platform automation while being nimble when it comes to aligning its platform capabilities with partner requirements,” said Agatha Poon, research director, 451 Research. “With the availability of CWS 5.1, the company has enhanced the usability and manageability of the software when deployed on Azure cloud services using Azure Resource Manager (ARM). This should be a welcome call for Microsoft CSP partners.”

“With CWS 5.1, we are dedicated to driving greater automation for IT solution providers in order to streamline IT lifecycle management,” said Seth Bostock, CEO, IndependenceIT. “Many of the world’s leading IT service providers recognize these advantages and put them to work for their organizations. We appreciate the recent analysis performed by 451 Research on our latest generation software, which showcases our efforts to help partners deliver best-in-class cloud-based solutions.”

Source: CloudStrategyMag

Spacemetric Selects Interoute’s Cloud To Manage Space Data

Interoute has been selected by Spacemetric to support its data storage and distribution needs. Spacemetric is a Swedish software company which streamlines the transformation of raw data from satellite and airborne sensors into imagery products ready for analytics.

This secure storage solution will be integrated with the web-based SWEA (Swedish Earth data Access) platform, developed by Spacemetric on behalf of the Swedish National Space Board. The platform is part of the existing EU earth observation program Copernicus managed by the European Space Agency (ESA). The data is collected and used to support environmental and humanitarian research. The ESA’s archive of images is available to scientists and businesses across the world via the cloud. The archive, hosted by Interoute’s private cloud network, is shared widely – from governments to entrepreneurs looking for ways to turn the data into business opportunities. SWEA can now ensure the availability of data with specific relevance to Swedish users.

The Interoute Virtual Data Centre zone in Stockholm launched six months ago and is one of 17 global zones that make up Interoute’s private networked cloud.

“We chose Interoute as they could offer an efficient hybrid solution combining physical storage with the Interoute Virtual Data Centre in Stockholm. As a result, we are guaranteed secure storage of local data as well as superior access due to low latency. It also means that the development process is more agile, making it possible to quickly and easily scale our efforts up or down depending on demand”, said Mikael Stern, CEO at Spacemetric.

Matthew Finnie, Interoute CTO, commented, “It’s fantastic to be selected by Spacemetric for this exciting project supporting the EU earth observation program for environmental and humanitarian research. Interoute was the first global cloud provider to launch a zone in the Nordic region that offers both public and private cloud on one platform. And our Stockholm cloud zone is one of 17 global zones that make up our private networked cloud. This new project is validation that the ‘local presence, global reach’ approach to cloud is key to meeting the needs of the market in Europe.”

Source: CloudStrategyMag

Equinix Cloud Exchange Expands In Europe

Equinix, Inc. has announced that it will significantly expand the availability of the Equinix Cloud Exchange™, bringing the innovative solution to three new markets: Dublin, Milan, and Stockholm. Bringing direct, private access to multiple cloud providers to these new markets supports European businesses undergoing digital transformation and enables global enterprises to orchestrate hybrid and multi-cloud solutions across multiple locations simultaneously, gaining the global scale, performance, and security they need to compete. The new markets also give enterprises additional flexibility in where they locate their cloud workloads and data across the region, depending on their business needs and local regulations.

“When Equinix introduced Cloud Exchange, businesses were just beginning to leverage the cloud but there was still serious scepticism around performance and security. Over the last few years however, global business adoption of cloud has been tremendous and now we see customers securely connecting to many clouds and across multiple regions all on the Equinix Cloud Exchange. We believe these three new markets will make Cloud Exchange that much more appealing for enterprises looking to distribute their cloud infrastructure across Europe,” said Eric Schwartz, president, Equinix EMEA.

Equinix data centers house a growing cloud ecosystem of over 500 Cloud Service Providers (CSPs) and SaaS solutions globally. The Equinix Cloud Exchange offers direct, private connections to more than 50 leading CSPs, including Amazon Web Services (AWS), Google Cloud Platform, IBM SoftLayer, Microsoft Azure ExpressRoute and Office 365, Oracle Cloud, and Salesforce. This interconnected approach allows companies to boost cloud application performance, reduce latency, scale, and improve network control and visibility — delivering a quality cloud experience to end users.

Equinix Cloud Exchange is an advanced interconnection solution that enables seamless, on-demand and direct access to multiple clouds and multiple networks across the globe. By bringing together cloud service providers with enterprises consuming cloud and enabling them to establish private, high-performance connections, the Equinix Cloud Exchange gives enterprises direct access to the services they need to build sophisticated hybrid cloud solutions inside Equinix International Business Exchange™ (IBX®) data centers.

Since its initial launch in 2014, more than 625 businesses have connected globally to the Equinix Cloud Exchange including Aon, Beeks Financial Cloud, CDM Smith, Ellie Mae, Hathaway Dinwiddie, and Walmart. Equinix has continued to see strong demand for cloud connectivity — year-over-year growth of Equinix Cloud Exchange more than doubled from 2015 to 2016.

Adding Cloud Exchange in three new markets, alongside the five existing European markets (London, Frankfurt, Amsterdam, Paris, and Zurich), extends companies’ ability to access low-latency, private cloud connectivity without going over the public internet or a WAN. More Europe-based businesses can now leverage the performance, security, and consistent quality of experience that only direct, private connectivity can deliver.

Equinix Cloud Exchange offers software-defined direct connections to multiple cloud services from a single physical port at both Layer 2 and Layer 3, so enterprises can easily access cloud-based services. Real-time provisioning of connections gives enterprises the flexibility to ramp services up and down according to their changing business requirements.

The European Commission states, “The huge potential of the digital economy is underexploited in Europe, with 41% of enterprises being non-digital, and only two percent taking full advantage of digital opportunities.”1 As European businesses undergo digital transformation, many will turn to cloud-based technologies, as the cloud has been a catalyst for global IT transformation. To support this shift, Equinix will roll out Cloud Exchange in the following key European metros, making Cloud Exchange available in 24 markets by the end of 2017:

  • Dublin (available March) – At a time when global data volumes are mushrooming, Ireland presents both indigenous and foreign direct investment (FDI) companies with an abundance of opportunity. Coupled with the country’s favorable corporate tax structure, data privacy laws and potential for mining big data, Ireland continues to be an attractive destination for digital business infrastructure.
  • Milan (available September) – Milan is the economic and financial heart of Italy and is the country’s second largest city, playing a vital role in the Italian economy, which is the third largest in the Eurozone. Furthermore, Italian enterprises’ use of cloud computing (40%) is more than double the Eurozone average (19%).2
  • Stockholm (available September) – Stockholm’s status as a key international business and technology hub is growing dramatically. The city has been ranked as a top “future region” by the Financial Times, as well as Europe’s best region in terms of prospects for inward investment and for economic and business expansion.

Equinix’s interconnection and data center platform has become the home of the interconnected cloud globally, with more than 150 colocation facilities located across 41 markets. According to a recent report by Gartner – Deliver Data Center Modernization Using Three Cloud-Complementary Approaches (February 2017) – digital businesses should, “Enhance data center interconnection capabilities with cloud providers and digital business partners by building colocation network hubs.”

Source: CloudStrategyMag

Machine learning proves its worth to business

Machine learning couldn’t be hotter. A type of artificial intelligence that enables computers to learn to perform tasks and make predictions without explicit programming, machine learning has caught fire among the hip tech set, but remains a somewhat futuristic concept for most enterprises. But thanks to technological advances and emerging frameworks, machine learning may soon hit the mainstream.

Consulting firm Deloitte expects to see a big increase in the use and adoption of machine learning in the coming year. This is in large part because the technology is becoming much more pervasive. The firm’s latest research shows that worldwide more than 300 million smartphones, or more than one-fifth of units sold in 2017, will have machine learning capabilities on board.

“New chip technology in the form of central processing units, graphics processing units, or dedicated field-programmable gate arrays will be able to provide neural network processing at prices, sizes, and power consumption that fit smartphones,” says Stuart Johnston, leader of the technology, media, and telecommunications practice at Deloitte.

“This hardware added to machine learning software will enable native programs designed to mimic aspects of the human brain’s structure and function, and will be applied to areas such as indoor navigation, image classification, augmented reality, speech recognition, and language translation,” Johnston says. “What that means from a day-to-day user perspective is that complicated tasks will be easier, will be more personalized, faster, and have greater privacy.”

Companies in various industries are already using or experimenting with machine learning technologies. Here is a look at how three companies are tapping machine learning to great business effect.

Pinning hopes on data-rich images

Social media site Pinterest began dabbling with machine learning in 2014, when it started investing heavily in computer vision technology and created a small team of engineers focused on reinventing the ways people find images.

Less than a year later the company launched “visual search,” a new tool that does not require text queries to search for information. “For the first time, visual search gave people a way to get results even when they can’t find the right words to describe what they’re looking for,” says Mohammad Shahangian, head of data science at Pinterest.

Visual search is powered by deep learning, a version of machine learning that taps into deeper neural networks, and allows Pinterest to automatically detect objects, colors, and patterns in any pin’s image and recommend related objects. There are more than 200 million visual searches on Pinterest every month, in addition to 2 billion text searches, Shahangian says.
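To make the idea concrete, the following is a minimal sketch of the general technique behind visual similarity search: embed each image with a pretrained convolutional network and rank candidates by cosine similarity. It is illustrative only; the model choice and helper names are assumptions, and Pinterest's production system is far more elaborate than this code.

```python
# Illustrative sketch of embedding-based visual search (not Pinterest's system).
# Assumes torchvision >= 0.13 for the `weights` argument.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Use a pretrained ResNet-50 up to its pooled features as an image encoder.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

@torch.no_grad()
def embed(path: str) -> torch.Tensor:
    """Return an L2-normalized embedding for one image file."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    return torch.nn.functional.normalize(backbone(x), dim=1).squeeze(0)

def most_similar(query_path: str, catalog: dict, k: int = 5):
    """Rank catalog images (name -> embedding) by cosine similarity to the query."""
    q = embed(query_path)
    scores = {name: float(q @ vec) for name, vec in catalog.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

A production system would additionally run object detection to crop regions of interest and use an approximate nearest-neighbor index so that billions of embeddings can be searched at interactive speed.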

In the summer of 2016, visual search evolved as Pinterest introduced object detection, which finds all the objects in a pin’s image in real time and provides related results.

“Today, visual search has become one of our most-used features, with hundreds of millions of visual searches every month, and billions of objects detected,” Shahangian says. “Now, we’re introducing three new products on top of our visual discovery infrastructure.”

Pinterest has one of the largest collections of data-rich images on the internet. “We use machine learning to constantly rank and scale 75 billion dynamic objects, from buyable pins to video, and show the right pin to the right person at the best time,” Shahangian says. “Our core focus is helping people discover compelling content, such as products to buy, recipes to make, and projects to try, and machine learning helps us provide a more personalized experience.”

As Pinterest expands its international audience, it’s vital that its service be personalized for people regardless of where they live, what language they speak, or what their interests are, Shahangian says. “Using machine-learned models, we’ve increased the number of localized pins for countries outside the U.S. by 250 percent over the past year,” he says. “Now each of the more than 150 million people who visit Pinterest monthly see pins most relevant to their country and language.”

In addition, machine learning predicts the relevance of a promoted pin on the site as well as its performance, helping improve the user experience with promoted ideas from businesses.

“We recently added deep learning to our recommendations candidate pipeline to make related pins even more relevant,” Shahangian says. “Pinterest engineers have developed a scalable system that evolves with our product and people’s interests, so we can surface the most relevant recommendations. By applying this new deep learning model, early tests show an increase in engagement with related pins by 5 percent globally.”

Pinterest is constantly developing technologies with the latest in machine learning “to build a visual discovery engine, including making advancements in object detection and scaling an ever-growing corpus of data and the world’s data-rich set of images, to people around the world,” Shahangian says.

Building high-dimensional models

Another company using machine learning, software provider Adobe Systems, has worked with supervised and unsupervised machine learning, as well as statistical models to help run its business for years, according to Anandan Padmanabhan, vice president of Adobe Research.

With the transition of Adobe’s business to a cloud-based subscription offering, there were two fundamental drivers that resulted in a need for large-scale machine learning within the company: online channels becoming the primary source for acquiring customers, and the need for driving product engagement and retention at scale across millions of customers. In addition, the data captured on customer engagement with a particular product are far more detailed through machine learning.

“Adobe captures this event-level longitudinal data across product usage, marketing, and customer support to build various types of predictive models,” Padmanabhan says. These include paid conversion and retention models, customer retention models, automated feature extraction and segmentation, upsell and cross-sell models, and optimal allocation and segment-based forecasting models.

The tools the company has used for its machine learning efforts include Python Scikit-learn, Spark ML, SAS, and proprietary in-house methods.
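As a rough illustration of the kind of propensity modeling described here, the short scikit-learn sketch below trains a logistic-regression retention model on synthetic usage features. The feature names and data are invented for the example; they are not Adobe's.

```python
# Hypothetical propensity-model sketch using scikit-learn (one of the tools named above).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.poisson(12, n),          # sessions in last 30 days (hypothetical feature)
    rng.exponential(20.0, n),    # minutes per session (hypothetical feature)
    rng.integers(0, 2, n),       # opened last marketing email (hypothetical feature)
])
# Synthetic label: heavier usage makes renewal more likely.
logits = 0.15 * X[:, 0] + 0.03 * X[:, 1] + 0.8 * X[:, 2] - 2.5
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```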

Machine learning methods have helped the company build individual-level, high-dimensional models, Padmanabhan says. “Previously, Adobe leveraged statistical tools for building more aggregated models that would ignore individual-level heterogeneity altogether,” he says.

Among the key benefits of machine learning for Adobe is a greater understanding of the marginal impact of paid media, which has resulted in the improved allocation of media touchpoints across various selling channels; and the ability to understand individual customer propensities and lifecycle stages, which helps drive marketing campaigns.

The company has also seen improved customer engagement through a better understanding of how individual products are used and through responses to marketing campaigns, which has led to more customized products and customer support experiences. That, in turn, has helped with customer retention.

In addition, Adobe has seen improvements in enterprise sales and territory planning, which drive higher sales efficiencies; and the development of a consistent way of defining and analyzing key performance indicators across the business, which has allowed the company to evaluate all campaigns in a common framework.

Given the success so far, the company is looking for other options to take advantage of machine learning. “There is a strong push within Adobe to leverage machine learning in managing all aspects of the customer experience,” Padmanabhan says.

Managing risk for customers

At LexisNexis Risk Solutions (LNRS), a provider of financial risk management services, machine learning helps customers protect against identity theft, money laundering, benefit scams, health care fraud, bad debt, and other risks.

LNRS began using machine learning several years ago to analyze and extract information from extremely large and heterogeneous data pools, to create graphs and make predictions about events, says Flavio Villanustre, vice president of technology architecture and product at LNRS.

The company uses mostly homegrown machine learning tools based on HPCC Systems, an open source, massive parallel-processing computing platform for big data processing and analytics.

The platform “gives us advantages when dealing with complex models and needing scalability to apply to very large and diverse data sets,” Villanustre says. On top of the HPCC platform, LNRS designed its own domain-specific languages, such as Scalable Automated Linking Technology, a sophisticated record-linkage tool, and Knowledge Engineering Language, which combines graph analysis with machine learning capabilities.

Prior to machine learning, modeling through algorithms required people to understand the particular problem domain, extract facts from the existing data, and write large, “heuristics based” programs that used conditional rules to model different possible outcomes from the incoming data, Villanustre says. “These earlier systems required experts to sift through data to understand reality and describe it through conditional statements that a computer could understand,” he says. “This was very tedious, hard work, and better left to computers.”

Machine learning changed that by letting computers extract those facts and represent reality through models based on statistical equations instead, Villanustre says. “This saves countless hours of domain experts’ time and allows them to work with data sets that humans would struggle to deal with otherwise,” he says. “The resulting computer programs are more compact, easier to implement, and more efficient.”

LNRS uses machine learning to describe complete networks of organizations and individuals to identify fraud rings. It also uses the technology to assess and make predictions on credit and insurance risk, identify fraud in health-care-related transactions, and help capture criminals.

“Machine learning is at the core of everything that we do,” Villanustre says. And the company is looking into the latest iterations of the technology. Some of the recent developments around deep belief networks — generative graphical models composed of multiple layers of latent variables with connections between the layers — and deep learning are proving to be promising fields of applications, he says.

“It is always important for us to validate these new methodologies with the laws and regulations of the respective countries in which we work to ensure that they can be used in ways that maximize the benefit to individuals and society,” Villanustre says.

Machine learning in the mainstream

The adoption of machine learning is likely to be diverse and across a range of industries, including retail, automotive, financial services, and health care, says Johnston of Deloitte.

In some cases, it will help transform the way companies interact with customers, Johnston says. For example, in the retail industry, machine learning could completely reshape the retail customer experience. The improved ability to use facial recognition as a customer identification tool is being applied in new ways by companies such as Amazon at its Amazon Go stores or through its Alexa platform.

“Amazon Go removes the need for checkouts through the use of computer vision, sensor fusion, and deep or machine learning, and I expect many shopping centers and retailers to start exploring similar options this year,” Johnston said.

The fact that common devices such as smartphones will be equipped with machine learning capabilities means the technology will no longer be limited to theoretical or highly selective applications.

“Examples of emerging smartphone technologies powered by machine learning include things like programs that determine users’ moods and emotions through pressure sensors, programs that make health and life predictions using health data, and programs that detect surrounding objects,” Johnston says.

Outside of smartphones, we will also see machine learning emerge in drones, tablets, cars, virtual or augmented reality devices, medical tools, and a range of IoT devices, making it available to industries that use those products, Johnston says.

Source: InfoWorld Big Data

MIT-Stanford project uses LLVM to break big data bottlenecks

The more cores you can use, the better — especially with big data. But the easier a big data framework is to work with, the harder it is for the resulting pipelines, such as TensorFlow plus Apache Spark, to run in parallel as a single unit.

Researchers from MIT CSAIL, the home of envelope-pushing big data acceleration projects like Milk and Tapir, have paired with the Stanford InfoLab to create a possible solution. Written in the Rust language, Weld generates code for an entire data analysis workflow that runs efficiently in parallel using the LLVM compiler framework.

The group describes Weld as a “common runtime for data analytics” that takes the disjointed pieces of a modern data processing stack and optimizes them in concert. Each individual piece runs fast, but “data movement across the [different] functions can dominate the execution time.”

In other words, the pipeline spends more time moving data back and forth between pieces than actually doing work on it. Weld creates a runtime that each library can plug into, providing a common method to run key data across the pipeline that needs parallelization and optimization.

Frameworks don’t generate code for the runtime themselves. Instead, they call Weld via an API that describes what kind of work is being done. Weld then uses LLVM to generate code that automatically includes optimizations like multithreading or the Intel AVX2 processor extensions for high-speed vector math.
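The snippet below is a toy Python sketch of the underlying idea of deferring work and fusing it into a single pass; it is not Weld's actual Rust/LLVM implementation or API. Each call records an operation instead of materializing a result, and evaluate() then runs the whole chain at once, avoiding intermediate data movement between steps.

```python
# Toy illustration of "describe the work, fuse it later" (hypothetical classes and names).
class Lazy:
    def __init__(self, data, ops=None):
        self.data = data
        self.ops = ops or []            # deferred element-wise operations

    def map(self, fn):
        # Record the operation instead of producing an intermediate collection.
        return Lazy(self.data, self.ops + [fn])

    def evaluate(self):
        out = []
        for x in self.data:             # single pass: no intermediate lists between steps
            for fn in self.ops:
                x = fn(x)
            out.append(x)
        return out

# Steps from different "libraries" compose without materializing data between them.
result = (Lazy(range(10_000))
          .map(lambda x: x * 2)         # library A's transformation
          .map(lambda x: x + 1)         # library B's transformation
          .evaluate())
print(result[:5])
```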

InfoLab put together preliminary benchmarks comparing the native versions of Spark SQL, NumPy, TensorFlow, and the Python math-and-stats framework Pandas with their Weld-accelerated counterparts. The most dramatic speedups came with the NumPy-plus-Pandas benchmark, where the work could be accelerated “by up to two orders of magnitude” when parallelized across 12 cores.

Those who are familiar with Pandas and want to take Weld for a spin can check out Grizzly, a custom implementation of Weld with Pandas.

It’s not the pipeline, it’s the pieces

Weld’s approach comes out of what its creators believe is a fundamental problem with the current state of big data processing frameworks. The individual pieces aren’t slow; most of the bottlenecks arise from having to hook them together in the first place.

Building a new pipeline integrated from the inside out isn’t the answer, either. People want to use existing libraries, like Spark and TensorFlow. Dumping them would mean abandoning the culture of software already built around those products.

Instead, Weld proposes making changes to the internals of those libraries, so they can work with the Weld runtime. Application code that, say, uses Spark wouldn’t have to change at all. Thus, the burden of the work would fall on the people best suited to making those changes — the library and framework maintainers — and not on those constructing apps from those pieces.

Weld also shows that LLVM is a go-to technology for systems that generate code on demand for specific applications, instead of forcing developers to hand-roll custom optimizations. MIT’s previous project, Tapir, used a modified version of LLVM to automatically generate code that can run in parallel across multiple cores.

Another cutting-edge aspect to Weld: it was written in Rust, Mozilla’s language for fast, safe software development. Despite its relative youth, Rust has an active and growing community of professional developers frustrated with having to compromise safety for speed or vice versa. There’s been talk of rewriting existing applications in Rust, but it’s tough to fight the inertia. Greenfield efforts like Weld, with no existing dependencies, are likely to become the standard-bearers for the language as it matures.

Source: InfoWorld Big Data

Fujitsu Develops Database Integration Technology to Accelerate IoT Data Analysis

Fujitsu Laboratories Ltd. has announced the development of technology to integrate and rapidly analyze NoSQL databases, used for accumulating large volumes of unstructured IoT data, with relational databases, used for data analysis for mission-critical enterprise systems.

NoSQL databases are used to store large volumes of data, such as IoT data output from various IoT devices in a variety of structures. However, due to the time required for structural conversion of large volumes of unstructured IoT data, there was an issue with the processing time of analysis involving data across NoSQL and relational databases.

Now Fujitsu Laboratories has developed technology that optimizes data conversion and reduces the amount of data transferred by analyzing SQL queries, enabling seamless access to relational and NoSQL databases. It has also developed technology that automatically partitions the data and distributes its execution efficiently on Apache Spark(1), a distributed parallel execution platform. Together, these technologies enable rapid analysis that integrates NoSQL databases with relational databases.

When this newly developed technology was implemented in PostgreSQL(2), an open source relational database, and its performance was evaluated using open source MongoDB(3) as the NoSQL database, query processing was accelerated by 4.5 times due to the data conversion optimization and data transfer reduction technology. In addition, acceleration proportional to the number of nodes was achieved with the efficient distributed execution technology on Apache Spark.

With this technology, a retail store, for example, could continually roll out a variety of IoT devices in order to understand information such as customers’ in-store movements and actions, enabling the store to quickly try new analyses relating this information with data from existing mission-critical systems. This would contribute to the implementation of one-to-one marketing strategies that offer products and services suited for each customer.

Details of this technology were announced at the 9th Forum on Data Engineering and Information Management (DEIM2017), which was held in Takayama, Gifu, Japan, March 6-8.

Development Background

In recent years, IoT and sensor technology have been improving day by day, enabling the collection of new information that was previously difficult to obtain. It is expected that connecting this new data with data in existing mission-critical and information systems will enable analyses on a number of fronts that were previously impossible.

For example, in a retail store, it is now becoming possible to obtain a wide variety of IoT data, such as understanding where customers are lingering in the store by analyzing the signal strength of the Wi-Fi on the customers’ mobile devices, or understanding both detailed actions, such as which products the customers looked at and picked up, and individual characteristics, such as age, gender, and route through the store, by analyzing image data from surveillance cameras. By properly combining this data with existing business data, such as goods purchased and revenue data, and using the result, it is expected that businesses will be able to implement one-to-one marketing strategies that offer products and services suited for each customer.

Issues

When analyzing queries that span relational and NoSQL databases, it is necessary to have a predefined data format for converting the unstructured data stored in the NoSQL database into structured data that can be handled by the relational database in order to perform fast data conversion and analysis processing. However, as the use of IoT data has grown, it has been difficult to define formats in advance, because new information for analysis is often being added, such as from added sensors, or from existing sensors and cameras receiving software updates to provide more data, for example, on customers’ gazes, actions, and emotions. At the same time, data analysts have been looking for methods that do not require predefined data formats, in order to quickly try new analyses. If, however, a format cannot be defined in advance, the conversion processing overhead is very significant when the database is queried, creating issues with longer processing times when undertaking an analysis.
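As a simplified illustration of this conversion problem, the hypothetical Python sketch below converts only the JSON fields a query actually references instead of materializing every field of every document into relational form; the field names, documents, and query are invented for the example.

```python
# Toy illustration of query-driven conversion of schemaless JSON (hypothetical data).
import json

docs = [
    '{"customer_id": 1, "dwell_seconds": 42.5, "gaze": "shelf-3", "emotion": "neutral"}',
    '{"customer_id": 2, "dwell_seconds": 97.0, "path": ["entrance", "dairy"]}',
]

# Fields referenced by the (hypothetical) SQL query, with their expected types.
needed = {"customer_id": int, "dwell_seconds": float}

def project(doc: str) -> dict:
    """Convert only the queried fields, ignoring everything else in the document."""
    record = json.loads(doc)
    return {name: cast(record[name]) for name, cast in needed.items() if name in record}

rows = [project(d) for d in docs]
print(rows)  # [{'customer_id': 1, 'dwell_seconds': 42.5}, {'customer_id': 2, 'dwell_seconds': 97.0}]
```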

About the Technology

Now Fujitsu Laboratories has developed technology that can quickly run a seamless analysis spanning relational and NoSQL databases without a predefined data format, as well as technology that accelerates analysis using Apache Spark clusters as a distributed parallel platform. In addition, Fujitsu Laboratories implemented its newly developed technology in PostgreSQL, and evaluated its performance using MongoDB databases storing unstructured data in JSON(4) format as the NoSQL databases.

Details of the technology are as follows:

  • Data Conversion Optimization Technology
    This technology analyzes database queries (SQL queries) that include access to data in a NoSQL database to extract the portions that specify the necessary fields and their data type, and identify the data format necessary to convert the data. The query is then optimized based on these results, and overhead is reduced through bulk conversion of the NoSQL data, providing performance equivalent to existing processing with a predefined data format.
  • Technology to Reduce the Amount of Data Transferred from NoSQL Databases
    Fujitsu Laboratories developed technology that migrates some of the processing, such as filtering, from the PostgreSQL side to the NoSQL side by analyzing the database query. With this technology, the amount of data transferred from the NoSQL data source is minimized, accelerating the process.
  • Technology to Automatically Partition Data for Distributed Processing
    Fujitsu Laboratories developed technology for efficient distributed execution of queries across multiple relational databases and NoSQL databases on Apache Spark. It automatically determines the optimal data partitioning that avoids unbalanced load across the Apache Spark nodes, based on information such as the data’s placement location in each database’s storage.

Effects

Fujitsu Laboratories implemented this newly developed technology in PostgreSQL, and evaluated performance using MongoDB as the NoSQL database. When evaluated using TPC-H benchmark queries, which measure the performance of decision support systems, application of the first two technologies made overall processing 4.5 times faster than with existing technology. In addition, when the third technology was used to run this evaluation on an Apache Spark cluster with four nodes, performance improved by 3.6 times compared with a single node.

Using this newly developed technology, IoT data such as sensor data can now be accessed efficiently through the SQL interface common throughout the enterprise field, while flexibly supporting frequent changes in IoT data formats and enabling fast analyses that incorporate that data.

Source: CloudStrategyMag