Google Cloud Machine Learning hits public beta, with additions

Google today unveiled machine learning additions to its cloud platform, both to enrich its own cloud-based offerings and to give businesses expanded toolsets for building their own machine learning-powered products.

The most prominent offering was the public beta of Google Cloud Machine Learning, a platform for building and training machine learning models with the TensorFlow machine learning framework and data stored in the BigQuery and Cloud Storage back ends.

Google says its system simplifies the whole process of creating and deploying machine learning back ends for apps. Some of this comes simply from making models faster to train: Google claims Cloud Machine Learning’s distributed training “can train models on terabytes of data within hours, instead of waiting for days.”

Much of it, however, is about Cloud Machine Learning’s APIs reducing the amount of programming required to build useful things. In a live demo, Google built and demonstrated a five-layer neural net for stock market analysis with just a few lines of code.
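
Google hasn’t published the demo’s code, but a hypothetical sketch in TensorFlow’s Keras API (layer sizes, input width, and target are invented for illustration) shows how compactly such a five-layer network can be expressed:

```python
# A made-up five-layer network for a stock-analysis-style regression;
# layer widths, input size, and target are illustrative only.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(20,)),  # 20 market features
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),  # e.g., predicted next-day price movement
])
model.compile(optimizer="adam", loss="mse")
# model.fit(features, targets, epochs=10)  # with real historical data
```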

Another announced feature, HyperTune, removes a further source of drudgery often associated with building machine learning models. Models typically need hyperparameters, knobs such as the learning rate, tuned to yield the best results. Google claims HyperTune “automatically improves predictive accuracy” by automating that step, as the sketch below suggests.
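
Google hasn’t detailed HyperTune’s internals in this announcement; the chore it automates looks roughly like the hypothetical random search below, where all names and ranges are invented for illustration:

```python
# Generic hyperparameter random search -- a stand-in for the manual
# tweaking HyperTune automates, not Google's actual algorithm.
import random

def train_and_evaluate(learning_rate, hidden_units):
    """Hypothetical stand-in for a full training run; returns a score."""
    return random.random()  # replace with real training and validation

best_score, best_params = -1.0, None
for _ in range(20):  # 20 random trials
    params = {
        "learning_rate": 10 ** random.uniform(-4, -1),
        "hidden_units": random.choice([32, 64, 128, 256]),
    }
    score = train_and_evaluate(**params)
    if score > best_score:
        best_score, best_params = score, params

print("best score %.3f with %s" % (best_score, best_params))
```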

Google Cloud Machine Learning was previously available only as an alpha-level tech preview; even at that stage, InfoWorld’s Martin Heller was impressed with its pre-trained APIs for artificial vision, speech, natural language, and language translation.

Many of the machine learning tools Google now offers to customers, such as TensorFlow, arose from Google’s internal work to bolster its own projects. The revamped version of Google’s office applications, G Suite, is one of the latest to be dressed up with machine learning-powered features. Most of these additions automate common busywork, such as finding a free time slot on a calendar for a meeting.

Google’s machine learning offerings pit it against several other big-league cloud vendors offering their own variations on the same themes, from IBM’s Bluemix and Watson services to Microsoft’s Azure Machine Learning. All of them, along with Amazon, Facebook, and others, recently announced the Partnership on AI, an effort to “study and formulate best practices on AI technologies” — although it seems more like a general clearinghouse for public awareness about machine learning than a way for those nominal rivals to collaborate on shared projects.

Source: InfoWorld Big Data

Clarient Global Adopts IBM Cloud, VMware

IBM has announced that Clarient Global LLC (“Clarient”), a joint venture established to transform client data and document management in the financial services industry, has selected VMware Cloud Foundation on IBM Cloud to enhance the existing SoftLayer private cloud implementation behind its Clarient Entity Hub platform.

Clarient Entity Hub allows users to upload, maintain, and share information about legal entities through a secure, easy-to-use interface. The platform automates the validation of client data and documentation, giving users greater transparency and control as well as improved risk management capabilities.

With this new implementation, Clarient runs its Clarient Entity Hub application on VMware Cloud Foundation on IBM Cloud, deployed on bare metal servers. Through this private cloud, Clarient will continue to improve its security, scale, and flexibility while achieving greater server density thanks to direct control over its hypervisors.

In addition to this private cloud solution, which was streamlined by the strategic partnership IBM and VMware announced earlier this year, Clarient has integrated IBM’s Business Process Management (BPM) offering to give clients even greater visibility into, and management of, their data and processes.

“Clarient creates efficiency in the client entity data and document management space by providing transparency, control and standardization,” said Natalia Kory, CTO, Clarient. “As the Clarient Entity Hub community grows, we continually assess ways to further enhance the solution workflow in order to improve the overall client experience and increase processing efficiencies. IBM’s BPM solution, in conjunction with the VMware Cloud Foundation on IBM Cloud solution, will help Clarient to achieve these requirements while reducing the cost of platform maintenance.”

In addition to flexibility, IBM Cloud provides Clarient with a fully redundant, low-latency network that allows near real-time communication between data center locations, making it easier to keep replication sites in sync, with no charge for network usage.

“The innovative solution that Clarient provides financial institutions around the world is increasingly critical, as the need for accurate and compliant client entity data continues to grow,” said Bill Karpovich, general manager, IBM Cloud Platform. “By leveraging a partnership with IBM and VMware, Clarient is able to extend its global and controllable infrastructure footprint.”

“Our cloud partnership with IBM continues to grow and evolve as we look to enable clients, such as Clarient, to solve the key industry challenges,” said Geoff Waters, vice president, global service provider channel, VMware. “The Clarient Entity Hub is a new way to address the unique requirements within the financial services sector and provide fast, automated and accurate client entity data. We look forward to joint success with IBM and enabling clients to continue to adopt the cloud while preserving their existing investments.”

Source: CloudStrategyMag

OneNeck® IT Solutions Releases OneNeck Connect

OneNeck IT Solutions has announced the general availability of OneNeck Connect, the company’s newest service. With OneNeck Connect, businesses get access to a 1 Gbps broadband pipe into the company’s Tier III data center in Eden Prairie.

“With OneNeck Connect, businesses throughout the metro area gain fast and reliable access into our state-of-the-art facility,” says Clint Harder, CTO and senior vice president at OneNeck. “This new offering provides a quick and easy way to connect into a comprehensive array of hybrid cloud and managed services provided by OneNeck.”

Currently, OneNeck Connect is available to businesses throughout the Twin Cities metro area that are on-net with Zayo or Comcast. Over the coming months, OneNeck expects to broaden availability to other providers in the metro area. The company already offers OneNeck Connect in Denver and plans to introduce the high-speed data solution in its other Tier III data centers in Arizona, Iowa, and Wisconsin.

Source: CloudStrategyMag

Zetta Launches Zetta Disaster Recovery

Zetta has announced Zetta Disaster Recovery, a new cloud-first disaster recovery (DR) solution that offers sub-five-minute failover with the push of a button. The new solution enables small and mid-sized enterprise (SME) customers and partners to continue accessing business-critical applications with minimal disruption during a downtime event. The cost-effective service offers high availability and reliability for even the most demanding recovery time objectives (RTOs).

“From human-led malicious attacks to unexpected system downtime to natural disasters, unforeseen events can be costly, even devastating, for today’s data-driven business,” said Mike Grossman, CEO, Zetta. “With the new Zetta Disaster Recovery, applications and databases can fail over in less than five minutes, so businesses and their employees can continue working without interruption. This delivers true peace of mind without the cost and complexity traditionally associated with disaster recovery solutions.”

“At EMA we have estimated that the cost of downtime can vary from as much as $90,000 to $6 million an hour, depending on the industry and its application environment. But, no matter how you slice it, downtime is a cost most businesses simply can’t endure,” said Jim Miller, senior analyst, Enterprise Management Associates. “Disaster Recovery in the cloud can be an efficient and cost-effective way to avoid the potentially high cost of downtime. With Zetta Disaster Recovery, Zetta delivers cloud-based business continuance with a complete service that features both simplicity and affordability.”

Easy-to-Achieve Disaster Readiness and Recovery

Zetta Disaster Recovery is an end-to-end service that provides complete deployment-to-failback coverage. It includes upfront network, firewall, VPN, and connectivity configuration, plus automated DR testing, all of which can be customized to an organization’s unique network environment, ensuring that in the event of a disaster of any kind, a company can be fully operational in the cloud.

The new DR service also supports incremental failback, allowing companies to continue to run their systems in the cloud, while Zetta Disaster Recovery manages sequential failback in the background. As a result, final switchover from cloud to local operations can happen painlessly – in minutes. 

Enterprise-Grade DR Solution at an Affordable Price

Zetta Disaster Recovery is a cost-effective option for companies that cannot afford to invest in a secondary DR site but still require rapid failover with truly dynamic scalability and rapid throughput. The service bundles network and VPN configuration with DR testing and planning, eliminating the need to engage outside professional services firms for these functions.

Optimized for Complex IT Environments

Zetta Disaster Recovery has been architected with the needs of larger enterprises in mind: to rapidly protect very large data sets and complex IT environments using fewer system resources and in less time than alternative options. Key features of Zetta Disaster Recovery include:

  • Comprehensive support for end-to-end DR including backup, failover and failback
  • Failback flexibility with support for incremental failbacks
  • High-performing IO, CPU and RAM resources to support workload demands of SME organizations
  • Pre-provisioned virtual VPN and firewall to ensure that an organization’s workers have on-demand access to applications running in the Zetta Cloud
  • Power-on DR testing to validate that systems and applications will be operational in the event of a disaster

All protected data is encrypted via SSL in flight to the Zetta Cloud and via AES at rest once it is there. For additional security, options for secure VPN connectivity to the recovered environment in the Zetta Cloud include point-to-site, site-to-site, and IP takeover.
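
As a rough illustration of at-rest encryption of backup data (not Zetta’s actual implementation, whose details aren’t public), a minimal Python sketch using the widely used cryptography library might look like this:

```python
# Hypothetical at-rest encryption sketch; Fernet wraps AES-128-CBC with
# an HMAC for integrity. Key handling is simplified here -- a real
# service would use a key management system.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

with open("backup.img", "rb") as f:
    ciphertext = cipher.encrypt(f.read())  # encrypt before hitting disk

with open("backup.img.enc", "wb") as f:
    f.write(ciphertext)
```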

Source: CloudStrategyMag

Ensono Announces New Client Advisory Board

Ensono™ has announced its new Client Advisory Board, emphasizing the company’s commitment to working collaboratively with its clients to deliver progressive IT solutions that help them operate their infrastructure for today and optimize it for tomorrow.

The Board gives Ensono clients a platform to influence a number of strategic issues, including the development of services and solutions, areas of expansion, and market transformation. With insight into Ensono’s business strategy and direction, as well as its product and technology roadmaps, members lend their opinions and expertise to shape future offerings.

“It is rare to work with a company that not only values our feedback, but actively seeks it out in order to better serve its clients and help us progress our business initiatives,” said Chuck Musciano, chief information officer, Osmose Utility Services. “We chose to become part of Ensono’s Client Advisory Board because we not only believe in the services Ensono offers, but because this collaborative approach underscores that client relationships are a top priority.”

Ensono’s Client Advisory Board comprises clients from a cross section of the industries Ensono serves, such as telecommunications and financial services, and spans the range of services it offers, including mainframe and cloud. Companies such as Acxiom, CCCIS, Dun & Bradstreet, Exelon, Hub Group, Inc., Osmose Utility Services, RR Donnelley, Sonoco, and Windstream Corporation participate on the Board.

The Board meets quarterly, giving Ensono a regular venue for company and service updates and advancing its mission to serve as an innovative hybrid IT services provider.

“We formed the Client Advisory Board to continue our longstanding practice of becoming a seamless extension of our clients’ IT teams,” said Brian Klingbeil, chief operating officer for Ensono. “We are thrilled clients actively participate and contribute in meetings and we look forward to continuing to solicit feedback so we can continue to help our clients do what they do best.” 

“Ensono is already a trusted partner for our business technology team,” said Ben Chan, chief information officer, Global Business Technology, for Sonoco. “Being able to be directly involved in Ensono’s process is further proof of their commitment to working side-by-side with us.”

Source: CloudStrategyMag

Accelerite Concert IoT SCEP Now Available On Microsoft Azure Cloud

Accelerite has announced that its Concert IoT Service Creation and Enrichment Platform (SCEP) can now be deployed on the Microsoft Azure cloud.

In addition to accelerating time to market by simplifying coding, Concert IoT enables rapid development of service-oriented IoT applications (SOIAs): applications designed around APIs and delivered via a platform-as-a-service (PaaS) model. Accelerite Concert IoT:

  • Facilitates efficient integration of third-party web services APIs into a Concert IoT SOIA to enrich the new platform’s capabilities and enable the evolution of powerful new IoT application ecosystems.
  • Enables partners to quickly monetize the data and insights generated by an IoT application, creating additional revenue streams. Customers benefit from richer apps and services, and device vendors benefit from monetizing both the app and the data.

For example, a large-scale farming operation may initially deploy IoT sensors to reduce water consumption and improve crop yields. Once that data is collected, it would be of great value to partners seeking to offer additional solutions, such as fertilizer or seeds customized to a targeted locale. Concert IoT provides the API management, payments, and partner settlements needed to create and monetize a growing, revenue-generating IoT application ecosystem.
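
Concert IoT’s own APIs aren’t shown in the announcement, but a purely hypothetical sketch of exposing such sensor data as a web service partners could pay for might look like this (endpoint, data, and names are all invented):

```python
# Illustrative only -- not Concert IoT's API. A tiny Flask service that
# serves aggregated soil-moisture readings to paying partners.
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical readings aggregated from field sensors.
SOIL_MOISTURE = {"field-7": 0.31, "field-8": 0.27}

@app.route("/v1/soil-moisture/<field_id>")
def soil_moisture(field_id):
    # A real platform would add API keys, metering, and settlement here.
    return jsonify(field=field_id, moisture=SOIL_MOISTURE.get(field_id))

if __name__ == "__main__":
    app.run()
```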

“Concert IoT will greatly accelerate the creation of innovative apps and rich new, vertically-focused IoT platforms built on Microsoft’s Azure public and private cloud implementations,” said Dean Hamilton, general manager of the Service Creation Business Unit at Accelerite. “These new platforms will empower IoT vendors across a wide spectrum of consumer and enterprise vertical markets to create their own IoT application partner ecosystems to continually deliver additional value.”

In addition, for compute and storage requirements, Concert IoT leverages Microsoft Azure’s IoT Hub for secure device onboarding and device data ingestion, Stream Analytics for real-time filtering and event detection, and HDInsight and Cortana AI for analytics and machine learning on the data.
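
For a sense of the device-ingestion side, here is a minimal sketch of sending telemetry to Azure IoT Hub using Microsoft’s azure-iot-device Python SDK (a later SDK than existed at announcement time; the connection string and payload are placeholders):

```python
# Minimal device-to-cloud telemetry via Azure IoT Hub.
import json
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder connection string from the IoT Hub device registry.
CONN_STR = "HostName=<hub>.azure-devices.net;DeviceId=<id>;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
client.connect()
client.send_message(Message(json.dumps({"temperature": 21.5})))  # one reading
client.shutdown()
```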

“Accelerite’s support for Microsoft Azure environments within its Concert IoT service enabling platform offers value to the expanding community of Azure users and partners. Concert IoT is meant to serve as an enabling layer that rides ‘on top’ of the expanding set of IoT features Microsoft can now deliver via the Azure IoT suite,” said Brian Partridge, vice president, 451 Research. “The API management and monetization services that Concert IoT brings to Azure environments will be crucial to unlocking value for developers and service providers as the IoT industry matures. Microsoft’s incumbency across all vertical industries and the growing market share of Azure in cloud services made it a natural target for prioritized support from Accelerite.”

Source: CloudStrategyMag

8 'new' enterprise products we don't want to see

I get a lot of press releases. Most of them are from startups with the same old enterprise product ideas under different names. Some are for “new” products from existing companies (by “new,” I mean new implementations of old ideas).

Think you have a great idea? Please tell me it isn’t one of these:

1. A column-family or key-value store database

You have a brand-new take on how to store data, and it starts with keys associated with something. It’s revolutionary because blah blah blah.

No — stop it. Don’t start any more of these; don’t fund any more of these. That ship has sailed; the market is beyond saturated.

2. ETL/monitoring/data catalogs

The market might bear a totally new approach, but I’ve yet to see one (I mean actually a new approach, not simply saying that). I recently watched a vendor drone on for more than an hour before telling us what it was pitching. The more times a vendor says “revolutionary,” the more you know the only thing that’s new is the pricing. It’s an ETL tool with a catalog and monitoring that only works with their cloud, but they support open source and community! Sad, man.

Seriously, you can’t dress up your ETL/governance tool as a brand-new product idea — you’ve now invented Informatica. I’m not saying you should use Informatica (I’d never say that), but I am saying: Zzzz, don’t start another one. If you’re a big enough vendor to build your own, that’s nice, but no one cares.

3. On-prem clouds

OpenShift, Cloud Foundry, and so on have all become “new and interesting ways to manage Docker or Docker images.” Also, we say “hybrid” because if you try hard you might get it up to Amazon, but the tools for doing that will certainly suck. Frankly, I’m skeptical that the “hybrid cloud” is anything but a silly marketing gimmick in terms of practicality, implementation, or utility.

4. Hadoop/Spark management with performance enhancements

Management in this area is a real problem, but if you’re starting now, you’re late to the game. This is a niche market. [Disclosure: I’m an adviser for one of these.]

5. Generic data visualization tool

In truth, I’m not super happy with any product in this area (Tableau in particular sucks). This is a market that has had 1,000 false starts along with a handful of good players that charge too much. Amazon and others are getting into this game as well, although I’m dubious anyone wants to pay by the cycle to draw a chart. Anyhow, the usefulness of these tools will fade as we move to more automated decision-making.

6. Content management systems by any other name

People are still writing me about how they started these things. They have new names for them — but no, I’m not writing about them. If I covered consumer electronics I probably wouldn’t write about the various toasters you can buy at Target either. Are you people joking?

7. Another streaming tool

Between Kafka, Spark, Apex, Storm, and so on, whatever streaming you need in big data software is covered. Your “revolutionary” new way to stream is probably not new.

8. Server-side blah blah with mobile added

Yes, mobile exists, but with maybe one or two exceptions, a mobile app is mainly a client to the server, like a web browser. If this means you added sync or notifications to your existing product, cool. If you launched a new product line with “mobile” in the name, please go sell it to journalists and analysts with no technical background.

If you’re about to build any of those, please stop. Don’t tell anyone about it. Walk away from the keyboard before you bore someone.

Source: InfoWorld Big Data

Meet Apache Spot, a new open source project for cybersecurity

Hard on the heels of the discovery of the largest known data breach in history, Cloudera and Intel on Wednesday announced that they’ve donated a new open source project to the Apache Software Foundation with a focus on using big data analytics and machine learning for cybersecurity.

Originally created by Intel and launched as the Open Network Insight (ONI) project in February, the effort is now called Apache Spot and has been accepted into the ASF Incubator.

“The idea is, let’s create a common data model that any application developer can take advantage of to bring new analytic capabilities to bear on cybersecurity problems,” Mike Olson, Cloudera co-founder and chief strategy officer, told an audience at the Strata+Hadoop World show in New York. “This is a big deal, and could have a huge impact around the world.”

Based on Cloudera’s big data platform, Spot taps Apache Hadoop for log management and data storage at effectively unlimited scale, along with Apache Spark for machine learning and near real-time anomaly detection. The software can analyze billions of events to detect unknown and insider threats and provide new network visibility.

Essentially, it uses machine learning as a filter to separate bad traffic from benign and to characterize network traffic behavior. It also applies context enrichment, noise filtering, whitelisting, and heuristics to produce a shortlist of the most likely security threats.
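
Spot’s own models aren’t described in detail here, but the general shape of Spark-based anomaly detection on flow records can be sketched as below, with k-means clustering as a hypothetical stand-in (data and features invented):

```python
# Generic Spark anomaly-detection sketch -- not Spot's actual pipeline.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("flow-anomalies").getOrCreate()

# Hypothetical netflow-style records: bytes and packets per connection.
flows = spark.createDataFrame(
    [(1024.0, 10.0), (2048.0, 12.0), (9.9e7, 3.0)],  # last row looks odd
    ["bytes", "packets"],
)

features = VectorAssembler(
    inputCols=["bytes", "packets"], outputCol="features"
).transform(flows)

model = KMeans(k=2, seed=42).fit(features)
scored = model.transform(features)  # adds a "prediction" cluster column

# Very small clusters become candidates for an analyst's shortlist.
scored.groupBy("prediction").count().show()
```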

By providing common open data models for network, endpoint, and user, meanwhile, Spot makes it easier to integrate cross-application data for better enterprise visibility and new analytic functionality. Those open data models also make it easier for organizations to share analytics as new threats are discovered.
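
Spot’s actual schema isn’t reproduced in this article; as a purely illustrative example, a normalized flow record under a common open data model might carry fields like these (names invented):

```python
# Hypothetical normalized network event -- field names are illustrative,
# not Spot's published open data model.
flow_event = {
    "timestamp": "2016-09-28T14:03:22Z",
    "src_ip": "10.0.0.5",
    "dst_ip": "198.51.100.7",
    "src_port": 49152,
    "dst_port": 443,
    "protocol": "tcp",
    "bytes": 5120,
    "packets": 14,
}
```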

Other contributors to the project so far include eBay, Webroot, Jask, Cybraics, Cloudwick, and Endgame.

“The open source community is the perfect environment for Apache Spot to take a collective, peer-driven approach to fighting cybercrime,” said Ron Kasabian, vice president and general manager for Intel’s Analytics and Artificial Intelligence Solutions Group. “The combined expertise of contributors will help further Apache Spot’s open data model vision and provide the grounds for collaboration on the world’s toughest and constantly evolving challenges in cybersecurity analytics.”

Source: InfoWorld Big Data

10% off SAP Crystal Reports 2016, Through Friday Only – Deal Alert

SAP Crystal Reports software is the de facto standard in reporting, and it’s currently discounted 10% through Friday if you use the code CRYSTAL10 at checkout. With SAP Crystal Reports you can create powerful, richly formatted, dynamic reports from virtually any data source, delivered in dozens of formats and in up to 24 languages. A robust production reporting tool, SAP Crystal Reports turns almost any data source into interactive, actionable information that can be accessed offline or online, from applications, portals, and mobile devices.

Through Friday, September 30th, use the code CRYSTAL10 at checkout and receive 10% off the purchase. Click here to see Crystal Reports now in the SAP Store.

This story, “10% off SAP Crystal Reports 2016, Through Friday Only – Deal Alert” was originally published by TechConnect.

Source: InfoWorld Big Data