Fujitsu Develops Database Integration Technology to Accelerate IoT Data Analysis

Fujitsu Laboratories Ltd. has announced the development of technology to integrate and rapidly analyze NoSQL databases, used for accumulating large volumes of unstructured IoT data, with relational databases, used for data analysis for mission-critical enterprise systems.

NoSQL databases are used to store large volumes of data, such as IoT data output from various IoT devices in a variety of structures. However, because converting large volumes of unstructured IoT data into structured form takes time, analyses spanning NoSQL and relational databases have suffered from long processing times.

Now Fujitsu Laboratories has developed technology that optimizes data conversion and reduces the amount of data transferred by analyzing SQL queries, providing seamless access to relational and NoSQL databases. It has also developed technology that automatically partitions data for efficient distributed execution on Apache Spark(1), a distributed parallel execution platform. Together, these technologies enable rapid analysis that integrates NoSQL databases with relational databases.

When this newly developed technology was implemented in PostgreSQL(2), an open source relational database, and its performance was evaluated using open source MongoDB(3) as the NoSQL database, query processing was accelerated by 4.5 times due to the data conversion optimization and data transfer reduction technology. In addition, acceleration proportional to the number of nodes was achieved with the efficient distributed execution technology on Apache Spark.

With this technology, a retail store, for example, could continually roll out a variety of IoT devices in order to understand information such as customers’ in-store movements and actions, enabling the store to quickly try new analyses relating this information with data from existing mission-critical systems. This would contribute to the implementation of one-to-one marketing strategies that offer products and services suited for each customer.

Details of this technology were announced at the 9th Forum on Data Engineering and Information Management (DEIM2017), which was held in Takayama, Gifu, Japan, March 6-8.

Development Background

In recent years, IoT and sensor technology have been improving day by day, enabling the collection of new information that was previously difficult to obtain. It is expected that connecting this new data with data in existing mission-critical and information systems will enable analyses on a number of fronts that were previously impossible.

For example, in a retail store, it is now becoming possible to obtain a wide variety of IoT data. Analyzing the signal strength of the Wi-Fi on customers’ mobile devices reveals where customers are lingering in the store, while analyzing image data from surveillance cameras reveals both detailed actions, such as which products customers looked at and picked up, and individual characteristics, such as age, gender, and route through the store. By properly combining this data with existing business data, such as goods purchased and revenue data, businesses are expected to be able to implement one-to-one marketing strategies that offer products and services suited to each customer.

Issues

When analyzing queries that span relational and NoSQL databases, it is necessary to have a predefined data format for converting the unstructured data stored in the NoSQL database into structured data that can be handled by the relational database in order to perform fast data conversion and analysis processing. However, as the use of IoT data has grown, it has been difficult to define formats in advance, because new information for analysis is often being added, such as from added sensors, or from existing sensors and cameras receiving software updates to provide more data, for example, on customers’ gazes, actions, and emotions. At the same time, data analysts have been looking for methods that do not require predefined data formats, in order to quickly try new analyses. If, however, a format cannot be defined in advance, the conversion processing overhead is very significant when the database is queried, creating issues with longer processing times when undertaking an analysis.

About the Technology

Now Fujitsu Laboratories has developed technology that can quickly run a seamless analysis spanning relational and NoSQL databases without a predefined data format, as well as technology that accelerates analysis using Apache Spark clusters as a distributed parallel platform. In addition, Fujitsu Laboratories implemented its newly developed technology in PostgreSQL, and evaluated its performance using MongoDB databases storing unstructured data in JSON(4) format as the NoSQL databases.

Details of the technology are as follows:

  • Data Conversion Optimization Technology
    This technology analyzes database queries (SQL queries) that include access to data in a NoSQL database to extract the portions that specify the necessary fields and their data type, and identify the data format necessary to convert the data. The query is then optimized based on these results, and overhead is reduced through bulk conversion of the NoSQL data, providing performance equivalent to existing processing with a predefined data format.
  • Technology to Reduce the Amount of Data Transferred from NoSQL Databases
    Fujitsu Laboratories developed technology that migrates some of the processing, such as filtering, from the PostgreSQL side to the NoSQL side by analyzing the database query. With this technology, the amount of data transferred from the NoSQL data source is minimized, accelerating the process.
  • Technology to Automatically Partition Data for Distributed Processing
    Fujitsu Laboratories developed technology for efficient distributed execution of queries across multiple relational databases and NoSQL databases on Apache Spark. It automatically determines the optimal data partitioning that avoids unbalanced load across the Apache Spark nodes, based on information such as the data’s placement location in each database’s storage.

Effects

Fujitsu Laboratories implemented this newly developed technology in PostgreSQL, and evaluated performance using MongoDB as the NoSQL database. When evaluated using TPC-H benchmark queries, which measure the performance of decision support systems, applying the first two technologies made overall processing 4.5 times faster than existing technology. In addition, when the third technology was used to run this evaluation on a four-node Apache Spark cluster, performance improved 3.6 times over a single node.
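The load-balancing idea behind the third technology can also be sketched. The release only says partitioning is derived from each database's storage placement; the greedy strategy below is an illustrative stand-in, not Fujitsu's algorithm: assign storage chunks to Spark workers largest-first, always to the currently least-loaded node, so no node receives a disproportionate share of the data.

```python
# Illustrative sketch (not Fujitsu's algorithm): balance storage chunks
# across Spark worker nodes using a greedy "least-loaded first" strategy.
import heapq

def partition_chunks(chunk_sizes, num_nodes):
    """Return one list of chunk indices per node, balancing total size."""
    heap = [(0, node) for node in range(num_nodes)]  # (assigned size, node id)
    heapq.heapify(heap)
    assignment = [[] for _ in range(num_nodes)]
    # Place the largest chunks first, each on the currently least-loaded node.
    for idx in sorted(range(len(chunk_sizes)), key=lambda i: -chunk_sizes[i]):
        load, node = heapq.heappop(heap)
        assignment[node].append(idx)
        heapq.heappush(heap, (load + chunk_sizes[idx], node))
    return assignment

sizes = [800, 300, 300, 200, 100, 100]  # e.g. megabytes per storage chunk
plan = partition_chunks(sizes, 4)
loads = sorted(sum(sizes[i] for i in part) for part in plan)
```

With a naive round-robin split, one node could end up with the 800 MB chunk plus more; the greedy plan isolates the large chunk and spreads the rest, which is the kind of imbalance avoidance the announcement describes.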

Using this newly developed technology, IoT data such as sensor data can now be accessed efficiently through SQL, the common interface throughout the enterprise field. The technology flexibly supports frequent format changes in IoT data, enabling fast processing of analyses that include IoT data.

Source: CloudStrategyMag

Woolpert Earns Google Cloud 2016 Fastest Growing Company Award

Woolpert has been awarded the Google Cloud 2016 Fastest Growing Company Award for Maps Customer Success for North America. This award recognizes Woolpert for its demonstrated sales, marketing, technical, and support excellence in helping customers of all sizes transform their businesses and solve a wide range of challenges with the adoption of Maps.

Woolpert helps customers navigate the Google Maps for Work licensing process and advises them on the proper implementation of the Google Maps API within their Cloud solutions.

The national architecture, engineering and geospatial (AEG) firm saw its Google sales grow 250% in 2016 compared with 2015. The firm’s sales were $3.25 million for its Google division and just shy of $150 million overall last year.

Woolpert, which has been a Google for Work Partner since March 2015 and a Google for Work Premier Partner since last summer, also was named a Premier Partner in the Google Cloud Program for 2017.

Jon Downey, director of the Google Geospatial Sales Team at Woolpert, said he is honored by this recognition and excited to see dynamic growth.

“This award represents our continued commitment to our Google partnership, and our ability to steadily grow in this market,” Downey said. “What sets Woolpert apart in the Google Cloud ecosystem is that approximately half of our firm’s business is geospatial, so this extension of our work makes sense. We’re not a sales organization and we’re not here to push software. We’re here to help.”

This extensive geospatial background enables Woolpert to add value and dimension to its Google Cloud services.

“We don’t just have the knowledge related to the Google data and deliverables, but we have a professional services staff capable of elevating that data,” he said. “We’re able to offer consultation on these services and that takes the relationship a step further, benefitting all involved.”

Bertrand Yansouni, vice president of global partner sales and strategic alliances, Google Cloud, said partners are vital contributors to Google Cloud’s growing ecosystem.

“Partners help us meet the needs of a diverse range of customers, from up-and-coming startups to Fortune 500 companies,” Yansouni said. “We are proud to provide this recognition to Woolpert, who has consistently demonstrated customer success across Maps.”

Source: CloudStrategyMag

Interoute Launches Managed Container Platform At Cloud Expo Europe

Interoute will announce the integration of its global cloud infrastructure platform with Rancher Labs’ container management platform, Rancher, at Cloud Expo Europe 2017. This innovative approach enables enterprises to accelerate their digital transformation and infrastructure investments.

The advent of containers has revolutionised the way enterprises build and deploy software applications, bringing greater agility, quicker deployment times and lower operational costs. In the past, enterprise operations and infrastructure teams building new applications and software services had to manage all the cloud infrastructure building blocks (the virtual server, OS, and application libraries) necessary to create their application development environment. Using a container-based approach, enterprise developers can now focus on writing applications and deploying the code straight into a container. The container is then deployed across the underlying Interoute cloud infrastructure, dramatically improving the time to develop and launch new applications and software.

The Interoute Container platform is part of the Interoute Enterprise Digital Platform, a secure global infrastructure that combines a Software Defined Core Network integrated into a global mesh of 17 cloud zones to optimise applications and services. Interoute makes it possible for organisations to integrate legacy, third-party and digital IT environments onto a single, secure, privately connected global cloud infrastructure, creating the foundation for Enterprise Digital Transformation.

By integrating Rancher software, Interoute is now able to provide access to a full set of orchestration and infrastructure services for containers, enabling users to deploy containers in any of Interoute’s 17 cloud zones across the world. Rancher is an open-source container management platform that makes it simple to deploy and manage containers in production.

“Enterprises developing and building apps in the cloud and those on a path to Digital Transformation need Digital ICT Infrastructure that allows them to build, test and deploy faster than ever before. The integration of Rancher software with the Interoute Digital Platform gives developers access to a managed container platform that sits on a global, privately networked cloud, enabling true distributed computing,” said Matthew Finnie, Interoute CTO.

“We’re thrilled to partner with Interoute and provide users of the Interoute Enterprise Digital Platform with a complete and turn-key container management platform. We look forward to seeing those users accelerate all aspects of their software development pipeline, from writing and testing code to running complex microservices-based applications,” said Louise Westoby, VP of marketing, Rancher Labs.

Source: CloudStrategyMag

CloudVelox Releases One Hybrid Cloud™ 4.0

CloudVelox has announced new enterprise-grade automated cloud workload mobility and optimization capabilities, with enhanced management and control features, for its One Hybrid Cloud™ (OHC) software. Through automation, OHC accelerates workload mobility and optimization in the cloud by matching data center environments with optimal cloud services to deliver cost savings or improved application performance, without requiring specialized cloud skills. New features for cloud optimization include application-centric instance tagging, placement groups, multiple security groups, and Identity and Access Management (IAM) roles. New features for managing workload mobility include comprehensive system reporting and alerts for the successful completion of workload migrations to the cloud. With the new suite of OHC features, enterprises can accelerate time to value, are better equipped to meet regulatory and compliance requirements, and can reduce IT effort while enhancing system visibility, management, and control.

According to an IDC study(1), nearly 68 percent of organizations are using some form of cloud to help drive business outcomes; however, only three percent have optimized cloud strategies in place today. Businesses are challenged by unexpected cloud costs, the complexity of mapping security and data policies from the data center to the cloud, a scarcity of skilled cloud engineers, and a lack of visibility into monitoring the status of mass workload migrations.

Enterprises want to benefit from the advantages of the public cloud, but without optimization they risk paying for services they don’t need, or not provisioning enough of the services they do need to support the availability and performance required for mission-critical applications. Automation is the key to addressing these challenges, enabling accelerated workload mobility and optimization at scale and completing “mass migrations” successfully in a matter of weeks instead of up to 12 months.

“’Lift and Shift’ alone to the cloud has provided limited business value and control,” said Raj Dhingra, CloudVelox CEO. “When enterprises migrate brownfield applications to the cloud there can be dramatic inefficiencies if they are not optimized for the new environment. Now businesses can execute migrations with an unprecedented, automated ‘Lift and Optimize’ approach that ensures they receive the full benefits of the public cloud, whether that means reduced costs or improved performance. By matching the application environment in the datacenter to the optimal cloud compute and storage infrastructure whether based on cost or performance, and mapping data center network and security policies to cloud services — One Hybrid Cloud enhances management and control over applications without sacrificing cloud agility and accelerates the payback for even the most complex environments.”

In addition to automated workload migration to the cloud, CloudVelox is the industry’s first automation solution to combine workload mobility and workload optimization. CloudVelox approaches workload optimization in three phases of the cloud optimization lifecycle. The first, pre-migration optimization, is available now; the company will build on it with additional features in the continuous optimization and fully optimized phases later this year:

  • Pre-migration optimization (available now): leverages CloudVelox’s automated application blueprinting capabilities, matching the application’s data center infrastructure characteristics to the appropriate cloud compute, storage, network, and security services prior to migrating the workloads to the cloud
  • Continuous optimization (available summer 2017): enables continuous optimization of migrated workloads by monitoring key areas such as instance, storage, availability, and security policy to deliver actionable insights that can yield cost savings, better performance and availability, as well as compliance with regulatory requirements
  • Fully optimized (available summer 2017): further leverages cloud native services to deliver additional agility, cost savings, and higher availability. For example, future features in the company’s cloud optimization roadmap include support for autoscale, RDS (MySQL and Oracle), and automated ELB across multiple instances

One Hybrid Cloud 4.0 includes new application-centric security groups and application-centric placement groups, along with comprehensive status reporting and alerts. Security groups can be assigned to a single system or a group of systems to control the flow of traffic between and to apps in the cloud, and enable security policies to be mapped from the data center to the cloud to meet regulatory and compliance requirements. An app or a group of systems can be assigned to a placement group in a selected Amazon Web Services (AWS) region to enable performance optimization for applications requiring high-performance compute, low latency, and high network I/O. Automating the assignment of placement groups prior to migration also reduces the IT effort of migrating and re-hosting these apps in the cloud.

New features to offer comprehensive reporting, alerts and enhanced management and control include:

  • An inventory of selected applications for replication with cloud characteristics such as CPU, RAM, instance type, storage type and other variables
  • An application launch report of currently replicating applications showing infrastructure services used by each app
  • Application launch status reports providing current status and time taken since launch and other information
  • A sync report that lists the various systems that have synced and their consistency points
  • System connect or disconnect alerts to proactively report on disconnected systems
  • Replication alerts indicating if a replication has started, not started or stopped
  • Application launch activity alerts indicating successful, failed, or suspended “launch” and “migration successful” alerts

New features also include application-centric instance tagging and IAM roles. Instance tagging allows single systems or a group of systems to be assigned tags that classify and categorize the migrated workloads. Tags can specify the type of application, line of business, owner, and up to 50 other categories, and can be used for billing, reporting, utilization analysis, and creating policies for cost and performance optimization.
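As a rough sketch of how application-centric tagging might be driven programmatically (the helper below is hypothetical, not CloudVelox's API), one tag set can be fanned out to every system in an application group while enforcing a 50-tag ceiling of the kind described above:

```python
# Hypothetical sketch of application-centric instance tagging: fan one tag
# dictionary out to every system in an application group, with a tag limit.
MAX_TAGS = 50  # illustrative ceiling, mirroring the "up to 50 categories" note

def tag_application(systems, tags):
    """Attach the same tag dictionary to each system in an application group."""
    if len(tags) > MAX_TAGS:
        raise ValueError(f"too many tags: {len(tags)} > {MAX_TAGS}")
    # Each system gets its own copy so later per-system edits don't alias.
    return {system: dict(tags) for system in systems}

app_tags = {"Application": "order-entry", "LineOfBusiness": "retail",
            "Owner": "app-team", "CostCenter": "cc-1042"}
tagged = tag_application(["web-01", "web-02", "db-01"], app_tags)
```

Tagging the whole group in one call is what makes the tags usable for billing and policy creation: every system in the application carries an identical, consistent label set.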

Source: CloudStrategyMag

Atomic Data Selects Corero’s Real-Time DDoS Mitigation Solution

Corero Network Security has announced that Atomic Data has selected the Corero SmartWall® Threat Defense System (TDS) to protect its own network and its tenant networks from DDoS attacks.

Atomic Data provides data centers, hosting services, Atomic Cloud® technology and 24×7 managed IT service and support. “We were driven to seek out a DDoS mitigation solution due to the increasing severity and frequency of DDoS attacks against our hosted client base. DDoS attacks can create service interruptions for customers and create unpredictable work efforts for the engineers tasked with resolving them,” said Larry Patterson, chief technology officer and co-founder, Atomic Data.

Previously, Atomic Data used manual techniques for dealing with DDoS attacks. After an attack was identified using network flow monitoring technology, upstream null routing required network engineers and architects to intervene. This approach made mitigation slow and ineffective, so Atomic Data sought a DDoS solution that was not only more effective, but also more scalable and affordable.

Atomic Data selected the Corero SmartWall TDS as its dedicated real-time DDoS mitigation solution because it delivers a granular level of protection, and the product is flexible, affordable and scalable, with an easy-to-understand user interface. The Corero solution features attack dashboards for Atomic Data and their tenants. Atomic Data can assign subscriber/tenant service levels, and distribute reporting and analytics to tenants so they can see the value of the protection they are receiving.

“The key benefit of the Corero solution is that it automatically mitigates DDoS attack traffic, and surgically removes it at the network edge, before it can be impactful to our customers. We not only keep our networks clean of attack traffic, but our network engineering team now has more time to dedicate to servicing other customer needs and scaling our network to accommodate business growth,” added Patterson.

“One emerging trend is that enterprise customers are increasingly calling on their service providers to assist them in defeating DDoS attacks, and they are eager to adopt service-based DDoS mitigation from their native providers,” said Stephanie Weagle, vice president at Corero. “Hence, Corero’s real-time mitigation capabilities set Atomic Data apart from their competition when it comes to protection against damaging DDoS attacks,” added Weagle.

 “Because we can offer DDoS protection as a standard service with all Atomic Cloud® instances, we now have a competitive advantage in the cloud marketplace,” said Patterson.

Source: CloudStrategyMag

Microsoft Leads In Burgeoning SaaS Market

New Q4 data from Synergy Research Group shows that the enterprise SaaS market grew 32% year on year to reach almost $13 billion in quarterly revenues, with ERP and collaboration being the highest-growth segments. For the third successive quarter, Microsoft is the clear leader in overall enterprise SaaS, having overtaken long-time market leader Salesforce. Other leading SaaS providers include SAP, Oracle, Adobe, ADP, IBM, Workday, Intuit, Cisco and Google. Among the major SaaS vendors, those with the highest growth rates were Oracle and Google, the latter thanks to a big push for its G Suite collaborative apps.

The enterprise SaaS market is somewhat mature compared to other cloud markets like IaaS and PaaS and consequently has a lower growth rate. Nonetheless, Synergy forecasts that it will more than double in size over the next three years, with strong growth across all segments and all geographic regions.

“There are a variety of factors driving the SaaS market which will guarantee substantial growth for many years to come,” said John Dinsdale, a chief analyst and research director at Synergy Research Group. “Traditional enterprise software vendors like SAP, Oracle and IBM are all pushing to convert their huge base of on-premise software customers to a SaaS subscription relationship. Meanwhile, relatively new cloud-based vendors like Workday and Zendesk are aggressively targeting the enterprise market, and industry giants Microsoft and Google are on a charge to grow their subscriber bases, especially in the collaboration market.”

Source: CloudStrategyMag

CTP Achieves Google Cloud Partner Specialization In Application Development

Cloud Technology Partners (CTP) has announced that it has achieved Google’s Cloud Application Development Specialization. CTP is one of the first Google consulting partners to earn this specialization, highlighting the success of its Digital Innovation services, which help clients design, build and run cloud-native applications.

“Google is a leading public cloud platform for building and deploying cloud-native applications, and is often the platform of choice for our clients wanting to develop data-intensive workloads,” said Rob Lancaster, vice president of Global Alliances at Cloud Technology Partners. “Achieving the Google Cloud Application Development Specialization reaffirms to the marketplace that CTP has the vision, the skills, and a track record of customer success building and deploying solutions on Google.”

The Google Cloud Partner Specialization program is designed to provide Google Cloud Customers with qualified partners that have demonstrated technical proficiency and proven success in key service areas.

CTP has worked with Google on a number of innovative client projects, including developing an IoT and data analytics application for Land O’Lakes which was featured on stage at last year’s Google Cloud Next conference. By leveraging cloud, IoT and big data technologies, Land O’Lakes farmers are now producing 650 percent more corn today on 13 percent fewer acres than they were 50 years ago.

“CTP helped us build applications that streamline data capture and knowledge transfer, all in real time,” said Teddy Bekele, vice president of IT at Land O’Lakes.

“We welcome the recognition by Google in response to the tremendous results we’ve delivered for our clients that leverage Google Cloud Platform, and we look forward to continuing to expand our Google Cloud expertise and offerings,” said John Treadway, senior vice president of Cloud Technology Partner’s Digital Innovation practice.

Source: CloudStrategyMag

Zoomdata Announces Expanded Support For Google Cloud Platform

Zoomdata has announced support for Google’s Cloud Spanner and PostgreSQL on the Google Cloud Platform (GCP), as well as enhancements to the existing Zoomdata Smart Connector for Google BigQuery. With these new capabilities, Zoomdata is one of the first visualization analytics partners to offer such deeply integrated and optimized support for Google Cloud Platform’s Cloud Spanner, PostgreSQL, Google BigQuery, and Cloud Dataproc services.

Google Cloud Spanner is the first and only relational database service that is both strongly consistent and horizontally scalable. Zoomdata’s Smart Connector for Cloud Spanner is available today for testing on Google Cloud Launcher. It supports key data analytic capabilities, including streaming analytics (Live Mode), aggregate analytics (group by), time series handling, and federated data blending of data from Cloud Spanner and other data sources via Zoomdata Fusion.

Zoomdata has also added a Zoomdata Smart Connector for PostgreSQL to its Google Cloud Platform launcher. The connector is optimized to take full advantage of the powerful object-relational database system, so users can now easily connect to, quickly visualize, and explore data from PostgreSQL running on GCP. In addition, Zoomdata enhanced its Smart Connector for Google BigQuery to support visual drill-through to full record “details,” as well as to speed up the generation of visualizations.

“The Zoomdata team is committed to delivering a big data visualization experience that optimizes GCP’s core data management services, including support for Google BigQuery,” said Russ Cosentino, Zoomdata co-founder and VP, Channels. “As a launch partner for Google Cloud Dataproc, and now offering optimized support for Google Cloud Spanner and PostgreSQL on GCP, Zoomdata is an ideal choice for helping business users deliver value against their data workloads on Google.”

Zoomdata is an open platform that provides visual analytics solutions for big and fast data. Architected for both cloud and on-premise deployments, it delivers visual analysis of huge datasets in seconds. Zoomdata’s patented Data Sharpening™ technology delivers the industry’s fastest visual analytics for real-time streaming and historical data. Zoomdata’s microservices architecture makes this possible by using Apache Spark as a complementary high-performance engine. Zoomdata Fusion enables users to perform analytics across disparate data sources in a single view — without the need to move or transform data.
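The core idea of federated blending like Zoomdata Fusion's can be sketched simply (this is an illustration of the concept, not Zoomdata's implementation): rows pulled from two disparate sources are joined on a shared key into one unified view, without either dataset being moved into the other system first.

```python
# Illustrative sketch (not Zoomdata's implementation): blend result rows
# from two disparate sources on a shared key into a single unified view.
def blend(left_rows, right_rows, key):
    """Inner-join two lists of dicts on `key`, merging their fields."""
    right_index = {}
    for row in right_rows:
        right_index.setdefault(row[key], []).append(row)
    blended = []
    for row in left_rows:
        for match in right_index.get(row[key], []):
            blended.append({**row, **match})  # right-side fields win on clash
    return blended

# e.g. live metrics from Cloud Spanner blended with reference data in PostgreSQL
spanner_rows = [{"store_id": 7, "visits": 120}, {"store_id": 9, "visits": 80}]
postgres_rows = [{"store_id": 7, "region": "east"}, {"store_id": 9, "region": "west"}]
view = blend(spanner_rows, postgres_rows, "store_id")
```

Only the (typically small) aggregated result sets cross the wire and get combined, which is what lets a federated view avoid bulk data movement or transformation.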

Source: CloudStrategyMag

Dataguise DgSecure Is Now Integrated With Google Cloud Storage

Dataguise has announced that DgSecure Detect now supports sensitive data detection on Google Cloud Storage (GCS). Integration with GCS extends the range of platforms supported by DgSecure Detect, which helps data-driven enterprises move to the cloud with confidence by providing precise sensitive data detection across the enterprise, both on premises and in the cloud. With DgSecure Detect, organizations can leverage Google’s powerful, simple, and cost-effective object storage service with a complete understanding of where sensitive data is located — an important first step to ensuring data protection and privacy compliance.

DgSecure Detect discovers, counts, and reports on sensitive data assets in real time within the unified object-based storage of GCS. The highly scalable, resilient, and customizable solution precisely identifies and summarizes the location of this data, down to the element level. DgSecure allows organizations to comb through structured, semi-structured, or unstructured content to find any data deemed “sensitive” by the organization. The range of sensitive data that is discoverable by DgSecure Detect is nearly unlimited using the solution’s custom sensitive data type definition capabilities.

Sensitive Data Detection Capabilities for Google Cloud Storage:

  • Detects high volumes of disparate, constantly moving, and changing data with time-stamping to support incremental change and life cycle management
  • Supports a flexible information governance model that has a mix of highly invested (curated) data as well as raw, unexplored (gray) data, such as IoT (Internet of Things) data, clickstreams, feeds, and logs
  • Processes structured, semi-structured, and unstructured or free-form data formats
  • Provides automated detection and processing of a variety of file formats and file/directory structures, leveraging metadata and schema-on-read where applicable
  • Provides deep content inspection using patent-pending techniques, such as neural-like network (NLN) technology, and dictionary-based and weighted keyword matches to detect sensitive data more accurately
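The last capability can be illustrated with a toy scanner (purely illustrative; DgSecure's actual detection engine is proprietary and far more sophisticated): regex patterns find candidate matches, and weighted dictionary keywords in the surrounding text raise confidence that a match really is sensitive data.

```python
# Toy sketch (not DgSecure's engine): combine regex patterns with weighted
# dictionary keywords to flag likely sensitive fields in free-form text.
import re

PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}
# Context keywords raise confidence that a pattern match is truly sensitive.
KEYWORD_WEIGHTS = {"ssn": 2.0, "social security": 2.0, "patient": 1.5}

def scan(text, threshold=1.0):
    """Return detected sensitive-data types whose evidence passes a threshold."""
    lowered = text.lower()
    keyword_score = sum(w for kw, w in KEYWORD_WEIGHTS.items() if kw in lowered)
    hits = {}
    for dtype, pattern in PATTERNS.items():
        matches = pattern.findall(text)
        # Evidence = number of raw matches plus weighted keyword context.
        if matches and len(matches) + keyword_score >= threshold:
            hits[dtype] = matches
    return hits

record = "Patient SSN 123-45-6789, contact: jane@example.com"
found = scan(record)
```

Weighting keywords rather than relying on patterns alone is what reduces false positives: a nine-digit number near the word "patient" or "SSN" is far more likely to be a real identifier than the same digits in a log line.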

These new capabilities enable enterprises from a range of industries — including finance, insurance, healthcare, government, technology, and retail — to gain accurate insight on where sensitive data resides in GCS so it can be protected properly. DgSecure helps organizations comply with regulatory mandates for PII, PHI, and PCI data, such as the European Union’s General Data Protection Regulation (GDPR), the Health Insurance Portability and Accountability Act (HIPAA), and other data privacy and data residency laws.

“With support for GCS, Dataguise provides broad cross-platform support of sensitive data detection within the industry’s most popular data repositories and platforms, both on premises and in the cloud,” said JT Sison, VP, marketing and business development, Dataguise. “Demonstration of DgSecure Detect at Google Cloud Next will be the first public display of the technology, and we invite attendees to meet with Dataguise and Google regarding this innovative solution.”

 

Source: CloudStrategyMag

SBS Group Selected As An Indirect Cloud Solution Provider By Microsoft

Microsoft has named SBS Group an Indirect Cloud Solution Provider (ICSP). Formerly called a Tier-2 Distribution Partner, the ICSP provides the connection between Microsoft and resellers of Microsoft’s cloud solutions including Azure, Office 365, PowerBI, and the recently launched Dynamics 365 service.

The ICSP program is built to help ease the complexity of selling Microsoft solutions. Technology resellers can partner with an ICSP for support with sales, service, administration, and billing. The Stratos Cloud Alliance, SBS Group’s new ICSP program, is the only ICSP specializing in Dynamics 365, Microsoft’s service focused on business solutions. SBS Group has vast experience in the Dynamics landscape, having operated in the Microsoft ERP and CRM spaces for over 30 years. The Stratos Cloud Alliance will leverage that knowledge and experience to provide superior Dynamics 365 implementation, training and support services for technology partners to resell. Additionally, the Stratos Cloud Alliance will offer unique partner enablement services, giving partners the option to develop, market, and deliver their own Dynamics 365 solutions and services.

“We serve several communities including customers, partners, independent software vendors, and Microsoft,” said James Bowman, president and CEO of SBS Group. “It is our mission to deliver innovative solutions that serve the evolving needs of these communities. Seven years ago, we pioneered the Master VAR program, enabling other Dynamics partners to grow their businesses. Last year, we led the Microsoft Dynamics community into the ‘cloud’ when we launched the first online Cloud Solution Provider (CSP) Marketplace focused on Dynamics solutions. We are leveraging these experiences in launching the Stratos Cloud Alliance. This program will help ERP and CRM-focused partners in their digital transformation process and enable Managed Service Providers (MSPs) and IT-focused solution providers to expand their solution portfolios for their customers.”

The Stratos Cloud Alliance (SCA) features a comprehensive portfolio of Microsoft Cloud Business and Productivity Solutions, ISV Products and Tools, and Partner and Customer Services.  The SCA offers three flexible partner models (including a white-label option) with value-added features and benefits for ERP and CRM resellers, Managed Service Providers, Accounting and Consulting firms. All partner tiers are powered by best-in-class e-commerce capabilities and include dedicated partner teams and support services designed to simplify onboarding and streamline the partner experience.

Source: CloudStrategyMag