How big data is changing the game for backup and recovery

It’s a well-known fact in the IT world: Change one part of the software stack, and there’s a good chance you’ll have to change another. For a shining example, look no further than big data.

First, big data shook up the database arena, ushering in a new class of “scale-out” technologies. That’s the model exemplified by products like Hadoop, MongoDB, and Cassandra, where data is distributed across multiple commodity servers rather than packed into one massive machine. The beauty there, of course, is the flexibility: To accommodate more petabytes, you just add another inexpensive server or two rather than “scaling up” and paying big bucks for ever-larger hardware.
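The scale-out idea can be illustrated with a toy hash-based sharding scheme. This is a simplified sketch, not how any of the products above actually work — real systems like Cassandra use consistent hashing with virtual nodes and replication — but it shows the core trick of spreading keys across cheap nodes:

```python
import hashlib


class ShardedStore:
    """Toy key-value store that spreads data across commodity nodes."""

    def __init__(self, nodes):
        # Each "node" is just a dict here; in practice it would be a server.
        self.nodes = {name: {} for name in nodes}

    def _node_for(self, key):
        # Hash the key and map it deterministically onto one node.
        digest = int(hashlib.md5(key.encode()).hexdigest(), 16)
        names = sorted(self.nodes)
        return names[digest % len(names)]

    def put(self, key, value):
        self.nodes[self._node_for(key)][key] = value

    def get(self, key):
        return self.nodes[self._node_for(key)].get(key)


store = ShardedStore(["node-a", "node-b", "node-c"])
store.put("user:42", {"name": "Ada"})
print(store.get("user:42"))
```

Note the catch with this naive modulo scheme: adding a node changes which node most keys map to, forcing mass data movement. That is precisely why production scale-out databases use consistent hashing, which relocates only a small fraction of keys when the cluster grows.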

That’s all been great, but now there’s a new sticking point: backup and recovery.

“Traditional backup products have challenges with very large amounts of data,” said Dave Russell, a vice president with Gartner. “The scale-out nature of the architecture can also be difficult for traditional backup applications to handle.”

SugarCRM is planning a Siri-like agent named Candace

SugarCRM has put AI at the core of its product plans and is working on a new intelligence service along with a Siri-like agent named Candace.

Tapping the company’s recent acquisitions of Stitch and Contastic, the new technology will be designed to help businesses spend less time entering data into their customer relationship management software and more time learning from and acting upon it.

SugarCRM is scheduled to demonstrate the new capabilities Wednesday at its SugarCon conference in San Francisco.

“In the CRM space, we want people to focus on what they’re good at: relating to others, such as customers and partners,” Rich Green, SugarCRM’s chief product officer, said in an interview last week.

Microsoft rolls out SQL Server 2016 with a special deal to woo Oracle customers

The next version of Microsoft’s SQL Server relational database management system is now available, and along with it comes a special offer designed specifically to woo Oracle customers.

Until the end of this month, Oracle users can migrate their databases to SQL Server 2016 and receive the necessary licenses for free with a subscription to Microsoft’s Software Assurance maintenance program.

Microsoft announced the June 1 release date for SQL Server 2016 early last month. Among the more notable enhancements it brings are updatable, in-memory columnstores and advanced analytics. As a result, applications can now deploy sophisticated analytics and machine-learning models within the database at performance levels as much as 100 times faster than they would achieve outside it, Microsoft said.

The software’s new Always Encrypted feature helps protect data at rest and in memory, while Stretch Database aims to reduce storage costs by moving cold data to Microsoft’s Azure cloud while keeping it available for querying. A new PolyBase tool allows you to run queries on external data in Hadoop or Azure Blob Storage.

Need more analytics speed? Cray wants to light a fire under your big data

It’s no secret that analytics is eating the enterprise world, but if there’s anything in perpetually short supply, it’s speed. Enter Cray, which on Tuesday unveiled a new supercomputing platform designed with that in mind.

Dubbed Urika-GX, the new system is the first agile analytics platform to fuse supercomputing with an open, enterprise framework, Cray said.

Due to be available in the third quarter, Urika-GX promises data scientists new levels of performance and the ability to find insight in massive data sets quickly. The system is tuned for highly iterative and interactive analytics, and integrated graph analytics offers rapid pattern matching.

“In the past, you’d run some types of analytics every 24 hours or even every week,” said Ryan Waite, Cray’s senior vice president of products. “Today, you might want to run them every six hours or every hour to be more in tune with what customers are doing.”

Who's in your store right now? SAP's new data service can tell you

Marketers can tap virtually limitless volumes of data about customers’ online activities, but the offline world isn’t nearly as forthcoming. That’s where SAP aims to help.

The company on Thursday unveiled a new service that offers demographic data in near real time about the people currently inside a store or at a particular venue or event. Called SAP Digital Consumer Insight, the service taps consumers’ mobile data to deliver details on where they’re coming from, their age groups and gender, and the devices they’re using. 

Marketers can also benchmark one store location against another, compare two potential new locations, or see how well their marketing efforts stack up against the competition.

Data is anonymized and aggregated, so individual privacy is protected. Equipped with the results, marketers can tailor their advertising, proximity marketing, location planning, and sales strategies and campaigns to the people currently present at a particular location.

SAP seeks to simplify IT with a beefier new version of Hana

SAP has updated its flagship Hana in-memory computing platform with a raft of new features designed to make IT simpler while giving organizations a better handle on their data.

The updates, announced Tuesday at the company’s annual Sapphire Now conference in Florida, include a new hybrid data management service in the cloud and a new version of the company’s Hana Edge edition for SMBs.

“We’ve taken an already rock solid platform and further hardened security, enhanced availability, unified the development and administration experience, and expanded advanced analytic capabilities,” Michael Eacrett, vice president of product management for SAP, wrote in a blog post detailing the new release.

Launched more than five years ago, Hana forms the basis for S/4Hana, the enterprise suite SAP released in early 2015.

Adobe adds data-science muscle to its cloud services

Finding insights in an ocean of data has become one of today’s most pressing business challenges, and software vendors are rushing to help. The latest is Adobe, which has added a host of algorithms in its cloud services to help brands uncover patterns and put them to work.

Adobe’s Creative, Document and Marketing Cloud services already use data science to help brands hone their message to customers, and the algorithms announced Wednesday add more capabilities.

In the Marketing Cloud, for instance, a new auto-allocate capability billed as a “content traffic cop” helps marketers identify the best offers, messaging and creative materials for engaging with customers. An online retailer could use it to determine that a particular promotional video is driving the most purchases, and automatically push that video to more online visitors even as it tests other content.
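Adobe hasn’t detailed the algorithm behind auto-allocate, but the behavior described — steering traffic toward the best-performing content while still testing the alternatives — is characteristic of a multi-armed bandit. A minimal epsilon-greedy sketch, with purely illustrative variant names and numbers (not Adobe’s API):

```python
import random


def choose_content(stats, epsilon=0.1):
    """Pick a content variant: usually the best performer, sometimes explore.

    stats maps variant name -> (purchases, impressions).
    """
    if random.random() < epsilon:
        # Explore: keep testing other content at a small, fixed rate.
        return random.choice(list(stats))
    # Exploit: serve the variant with the highest observed conversion rate.
    return max(stats, key=lambda v: stats[v][0] / max(stats[v][1], 1))


stats = {
    "promo_video": (120, 1000),  # 12% observed conversion
    "banner": (40, 1000),        # 4%
    "carousel": (55, 1000),      # 5.5%
}
random.seed(0)
picks = [choose_content(stats) for _ in range(1000)]
print(picks.count("promo_video"))
```

With epsilon at 0.1, roughly 90 percent of traffic flows to the winning promotional video while the remaining slice keeps measuring the other variants — so the system can still notice if a challenger starts outperforming the incumbent.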

Users of Adobe’s Campaign tool, meanwhile, can now test predictive subject lines through a new beta program. The technology suggests which email subject line will yield the best results from a marketing campaign.

Is your data safe when it's at rest? MarkLogic 9 aims to make sure it is

The database landscape is much more diverse than it once was, thanks in large part to big data, and on Tuesday, one of today’s newer contenders unveiled an upcoming release featuring a major boost in security.

Version 9 of MarkLogic’s namesake NoSQL database will be available at the end of this year, and one of its key new features is the inclusion of Cryptsoft’s KMIP (Key Management Interoperability Protocol) technology.

MarkLogic has placed its bets on companies’ need to integrate data from dispersed enterprise silos — a task that has often required the use of so-called ETL tools to extract, transform and load data into a traditional relational database. Aiming to offer an alternative approach, MarkLogic’s technology combines the flexibility, scalability, and agility of NoSQL with enterprise-hardened features like government-grade security and high availability, it says.

The next generation of the software will bring a variety of improvements in data integration, manageability, and security, the company says, but the most notable among them is the addition of Cryptsoft’s KMIP support.
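KMIP itself is a wire protocol for talking to external key-management servers, but the pattern it enables — keeping encryption keys outside the database and fetching them by identifier — can be sketched in miniature. This toy stand-in is an illustration of the concept only, not Cryptsoft’s or MarkLogic’s actual API:

```python
import secrets


class KeyManager:
    """Toy stand-in for an external KMIP-style key server."""

    def __init__(self):
        self._keys = {}

    def create_key(self):
        key_id = secrets.token_hex(8)                 # server-assigned ID
        self._keys[key_id] = secrets.token_bytes(32)  # 256-bit key material
        return key_id

    def get_key(self, key_id):
        return self._keys[key_id]

    def revoke(self, key_id):
        # Destroying the key renders data encrypted under it unreadable.
        del self._keys[key_id]


kms = KeyManager()
key_id = kms.create_key()
print(len(kms.get_key(key_id)))
```

The point of the design is separation of duties: the database stores only the key identifier, so a stolen backup or compromised database host yields ciphertext without the keys needed to read it.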

IBM's Watson is going to cybersecurity school

It’s no secret that much of the wisdom of the world lies in unstructured data, or the kind that’s not necessarily quantifiable and tidy. So it is in cybersecurity, and now IBM is putting Watson to work to make that knowledge more accessible.

Toward that end, IBM Security on Tuesday announced a new yearlong research project through which it will collaborate with eight universities to help train its Watson artificial-intelligence system to tackle cybercrime.

Knowledge about threats is often hidden in unstructured sources such as blogs, research reports and documentation, said Kevin Skapinetz, director of strategy for IBM Security.

“Let’s say tomorrow there’s an article about a new type of malware, then a bunch of follow-up blogs,” Skapinetz explained. “Essentially what we’re doing is training Watson not just to understand that those documents exist but to add context and make connections between them.”

How the skills shortage is transforming big data

In the early days of computing, developers were often jacks of all trades, handling virtually any task needed for software to get made. As the field matured, jobs grew more specialized. Now we’re seeing a similar pattern in a brand-new domain: big data.

That’s according to P.K. Agarwal, regional dean and CEO of Northeastern University’s recently formed Silicon Valley campus, who says big data professionals so far have commonly handled everything from data cleaning to analytics, and from Hadoop to Apache Spark.

“It’s like medicine,” said Agarwal, who at one time was California’s CTO under former Governor Arnold Schwarzenegger. “You start to get specialties.”

That brings us to today’s data-scientist shortage. Highly trained data scientists are now in acute demand as organizations awash in data look for meaning in all those petabytes. In part as a response to this, other professionals are learning the skills to answer at least some of those questions for themselves, earning the informal title of citizen data scientist.