IDG Contributor Network: Letting go: trusting AI to do its thing while humans do theirs

Not even a year ago, businesses across industries were still fairly united in their skepticism of artificial intelligence. It took Salesforce announcing its Einstein AI platform for them to really take note and accept that AI was not only officially here, but here to stay.

Businesses that have since adopted AI quickly realized that part of the unwritten contract is surrendering control of data-related tasks and decision-making to a machine, whether at the tactical level or across an entire process. They’re also learning to let go of their need to interpret and act on data insights. As a result, the decisions they’re making post-AI adoption are drastically different in nature from the ones they were making pre-AI. But somewhere in between AI adoption and full AI integration, these companies are having to cope with their fear of relinquishing control to a machine.

This fear has been here since day one, but as businesses become more educated about how AI works, the “letting go” narrative is now more prevalent than ever, even among those who are already using AI to unprecedented success.

The fact of the matter is that humans, by our very nature, want to be in control. I’m sure there’s some Darwinian, self-preservationist reasoning behind our need to carefully choreograph the elements around us; something to do with eliminating threats and carrying on as a species. But when it comes to surrendering control to AI, specifically machines that automate overwhelmingly complex, data-oriented business processes, our willingness to let go is likely to drive our ongoing success rather than prevent it.

In business, AI’s role is to maximize our productivity by taking away the minutiae created by data, freeing humans to work on higher-level strategic tasks, thereby making businesses and the individuals behind them “fitter,” in the Darwinian sense, over time.

It’s important to recognize that many people’s resistance to giving up control isn’t just a garden-variety power struggle, or even a need to micromanage a given situation. It’s a matter of being cautious and establishing a foundation of trust rather than going in blindly. Having worked with dozens of brands at varying phases of the AI adoption process, I’ve seen a few common themes invariably unfold among them, offering insight into how man and machine can work together effectively, and how to make the ‘letting go’ process easier.

Humans don’t want to do robots’ jobs—or for robots to do theirs

To opponents of AI, the machine is the competition. From this perspective, AI is either going to take man’s jobs completely, or AI is a literal competitor that must be competed with (and beaten). The current trajectory of AI adoption in business reveals neither to be true.

Businesses that take a hybrid approach to the division of responsibilities are seeing better results from AI and humans working together than from either working independently. When it comes to data gathering, analysis, and insights, man trying to keep up with robots is a losing proposition. Man can attempt to become robot, or robot can be forced to adopt the unique characteristics of man: creativity, reasoning, emotion, and intuition. But the better approach is for them to work together and produce the best possible outcome as a result.

Humans need proof, and quickly

In my experience working specifically with marketers, they are very interested in AI when they’re introduced to it through the lens of the day-to-day tasks it will alleviate for them. Their enthusiasm dissipates, however, when they realize that the AI tool taking over these tasks is going to do so in a way they don’t understand. They don’t want to actively manage and implement the nitty-gritty tasks, but in order to give them up they need to know that the technology will execute those tasks better than they or their teams can.

One way of combating these fears is to introduce highly targeted, quick-turnaround programs as trials that allow the AI to show what it’s made of. (Depending on the solution, “quick” could be a weekend or six months.) The sooner the machine can demonstrate its ability to not only understand what seems to be a complex problem, but also out-produce man’s ability to solve it manually, the sooner humans will be able to relinquish control.

Transparency into what the machine is doing is imperative

Giving up day-to-day execution is a lot easier for humans than giving up their decision-making privileges. For marketers, relying on a wholly autonomous AI to process data and then act on its insights without bothering to ask for its human colleague’s thoughts on approach is the biggest obstacle to letting go of control.

This is important for AI providers to consider as they design the outputs shared with businesses. As it turns out, humans don’t necessarily need or want to be involved in every decision their AI is making, but they do want to understand how and why the AI made those decisions. It would, of course, be impossible for a human to keep up with the pace of a machine’s decision-making, but making them privy to key insights goes a long way in creating trust between robot and man. It also creates a foundation for collaboration and idea sharing, as the human learns from the AI’s insights and is able to complement the AI’s work as a result.

Ultimately, it is this type of collaboration that makes man and machine allies rather than competitors. It’s also what establishes trust, so that humans can rest assured they’re evolving, rather than endangered.

This article is published as part of the IDG Contributor Network.

Source: InfoWorld Big Data

Navisite Launches VMware vCloud Availability Program At VMworld

Navisite has launched managed services for VMware-native, cloud-based replication and disaster recovery, supporting VMware vSphere and VMware vCloud Availability for vCloud Director, at VMworld, Aug. 28-31. This complements Navisite’s recent launch of the VMware NSX network and security virtualization platform.

With Navisite’s VMware cloud services running VMware vCloud Availability for vCloud Director, Navisite clients will be able to test replicated environments and move applications seamlessly, all while production is actively running.

“Our clients have been looking for a native solution to replicate and migrate protected virtual machines to the cloud — without the need for costly dedicated hardware or complicated third-party solutions,” said Sumeet Sabharwal, group vice president and general manager, Navisite. “The spike in recent cyberattacks and ransomware has heightened interest in cloud replication disaster recovery implementations. Navisite’s VMware vCloud Availability for vCloud Director-based replication service provides clients with a simple and cost-effective solution to enable availability of their VMware environments.”

Navisite’s VMware vCloud Availability for vCloud Director solutions offer a variety of benefits for clients, including:

  • On-premises installation and complete VMware compatibility: Using a Navisite VMware Cloud as a vCloud Availability for vCloud Director replication target, clients can replicate virtual machines (VMs) from primary VMware vSphere environments to a highly scalable, geographically diverse site based in one of Navisite’s enterprise-grade data centers.
  • Simplified failover switch capabilities: Clients can gain access to a remote site for disaster recovery without the painful process of configuring a VPN. After an interruption at a client’s primary site, users can simply initiate a failover switch to keep business applications running in a remote Navisite environment. Once the primary site is restored, clients can then initiate a failback to return to running applications as usual. Leveraging a distributed architecture provides Navisite clients with unique failover testing capabilities.
  • VMware-native solution with enhanced functionality: Unlike alternative third-party replication tools that can be costly and difficult to integrate, vCloud Availability for vCloud Director is a native solution developed with the full support of the vSphere stack.

“Navisite’s innovative program gives their clients access to VMware’s latest replication and Network Virtualization technologies,” said Ajay Patel, senior vice president, general manager, Cloud Provider Software Business Unit, VMware. “Navisite’s vCloud Availability for vCloud Director solution offers clients an end-to-end, VMware-native solution that can surpass existing third-party data protection solutions.” 

Clients are already taking advantage of this cloud replication and Disaster Recovery as a Service (DRaaS) solution. One example is Ceridian, a global human capital management company that has turned to Navisite to help meet the needs of its changing business model.

“As the only provider in the U.S. with a VMware vCloud Availability replication offering, Navisite is exceeding our expectations yet again,” said Warren Perlman, CIO, Ceridian. “Their new multi-cloud solution combines the power of VMware NSX and vCloud Availability for vCloud Director to deliver seamless and rapid recovery time in the event of a disaster, allowing us to keep the focus on our core business.”

Navisite is a Gold Sponsor at VMworld 2017, the industry’s largest virtualization and cloud computing event, taking place August 28-31 in Las Vegas. Navisite will be discussing and demonstrating its VMware expertise along with new VMware vCloud Availability for vCloud Director at booth #212 during the conference.

Source: CloudStrategyMag

What is data mining? How analytics uncovers insights

Organizations today are gathering ever-growing volumes of information from all kinds of sources, including websites, enterprise applications, social media, mobile devices, and increasingly the internet of things (IoT).

The big question is: How can you derive real business value from this information? That’s where data mining can contribute in a big way. Data mining is the automated process of sorting through huge data sets to identify trends and patterns and establish relationships, to solve business problems or generate new opportunities through the analysis of the data.

It’s not just a matter of looking at data to see what has happened in the past to be able to act intelligently in the present. Data mining tools and techniques let you predict what’s going to happen in the future and act accordingly to take advantage of coming trends.

The term “data mining” is used quite broadly in the IT industry. It is often applied to a variety of large-scale data-processing activities such as collecting, extracting, warehousing, and analyzing data. It can also encompass decision-support applications and technologies such as artificial intelligence, machine learning, and business intelligence.

Data mining is used in many areas of business and research, including product development, sales and marketing, genetics, and cybernetics—to name a few. If it’s used in the right ways, data mining combined with predictive analytics can give you a big advantage over competitors that are not using these tools.

Deriving business value from data mining

The real value of data mining comes from being able to unearth hidden gems in the form of patterns and relationships in data, which can be used to make predictions that can have a significant impact on businesses.

For example, if a company determines that a particular marketing campaign resulted in extremely high sales of a particular model of a product in certain parts of the country but not in others, it can refocus the campaign in the future to get the maximum returns.

The benefits of the technology can vary depending on the type of business and its goals. For example, sales and marketing managers in retail might mine customer information in different ways to improve conversion rates than those in the airline or financial services industries.

Regardless of the industry, data mining that’s applied to sales patterns and client behavior in the past can be used to create models that predict future sales and behavior.
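To make that concrete, here is a minimal, hypothetical Python sketch of the modeling step: fitting a simple regression to past monthly sales and projecting the next quarter. The figures are invented, and a real model would draw on far richer mined features (seasonality, promotions, customer segments), but the workflow is the same.

```python
# Fit a trend to historical sales, then predict future periods.
# All numbers here are made up for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

months = np.arange(1, 13).reshape(-1, 1)  # months 1..12 as the lone feature
sales = np.array([110, 115, 123, 130, 128, 140,
                  148, 151, 160, 158, 170, 176])  # units sold each month

model = LinearRegression().fit(months, sales)

# Project the next three months from the fitted trend.
future = np.arange(13, 16).reshape(-1, 1)
print(model.predict(future))  # projected unit sales for months 13-15
```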

There’s also the potential for data mining to help eliminate activities that can harm businesses. For example, you can use data mining to enhance product safety, or detect fraudulent activity in insurance and financial services transactions.

The applications of data mining

Data mining can be applied to a variety of applications in virtually every industry.

  • Retailers can deploy data mining to better identify which products people are likely to purchase based on their past buying habits, or which goods are likely to sell at certain times of the year. This can help merchandisers plan inventories and store layouts.
  • Banks and other financial services providers can mine data related to their clients’ accounts, transactions, and channel preferences to better meet their needs. They can also gather and then analyze data from their websites and social media interactions to help increase the loyalty of existing customers and attract new ones.
  • Manufacturing companies can use data mining to look for patterns in the production process, so they can precisely identify bottlenecks and flawed methods and find ways to increase efficiencies. They can also apply knowledge from data mining to the design of products, and make tweaks based on feedback from customer experiences.
  • Educational institutions can benefit from data mining by analyzing data sets to predict the future learning behaviors and performance of students, and then using this knowledge to make improvements in teaching methods or curricula.
  • Health care providers can mine and analyze data to determine better ways of delivering care to patients and cutting costs. With the help of data mining, they can predict how many patients they will need to care for and what type of services those patients will need. In the life sciences, mining can be used to glean insights from massive biological data, to help develop new medicines and other treatments.
  • In multiple industries, including health care and retail, you can use data mining to detect fraud and other abuses—much more quickly than with traditional methods for identifying such activities.
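As a toy illustration of that last item, the sketch below flags unusually large transaction amounts with scikit-learn’s IsolationForest, a common anomaly-detection technique. The data is fabricated and one-dimensional; real fraud models mine many more features, but the flag-the-outliers principle is the same.

```python
# Flag transactions that look anomalous relative to the bulk of the data.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=50, scale=15, size=(500, 1))   # routine amounts
suspicious = np.array([[900.0], [1250.0], [1800.0]])   # planted outliers
X = np.vstack([normal, suspicious])

detector = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = detector.predict(X)  # 1 = looks normal, -1 = flagged as anomalous

print(X[labels == -1].ravel())  # the flagged transaction amounts
```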

The key components of data mining

The process of data mining includes several distinct components that address different needs:

  • Preprocessing. Before you can apply data mining algorithms, you need to build a target data set. One common source for data is a data mart or warehouse. You need to perform preprocessing to be able to analyze the data sets.
  • Data cleansing and preparation. The target data set must be cleaned and otherwise prepared, to remove “noise,” address missing values, filter outlying data points (for anomaly detection) to remove errors or do further exploration, create segmentation rules, and perform other functions related to data preparation.
  • Association rule learning (also known as market basket analysis). These tools search for relationships among variables in a data set, such as determining which products in a store are often purchased together.
  • Clustering. This feature of data mining is used to discover groups and structures in data sets that are in some way similar to each other, without using known structures in the data.
  • Classification. Tools that perform classification generalize known structures to apply to new data points, such as when an email application tries to classify a message as legitimate mail or spam.
  • Regression. This data mining technique is used to predict a range of numeric values, such as sales, housing values, temperatures, or prices, when given a particular data set.
  • Summarization. This technique provides a compact representation of a data set, including visualization and report generation.
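To make a couple of these components concrete, here is a brief, hypothetical scikit-learn sketch of clustering (discovering groups without labels) and classification (generalizing known labels to new data points), using the bundled iris data set as a stand-in for business data.

```python
# Clustering and classification on a small sample data set.
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)

# Clustering: discover three groups with no knowledge of the true labels.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(clusters[:10])  # cluster assignment for the first ten rows

# Classification: train on labeled rows, then score on held-out rows.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(clf.score(X_test, y_test))  # accuracy on data the model never saw
```

Association rule learning and regression follow the same pattern with different estimators; regression, for instance, swaps in a model that predicts a numeric value rather than a class.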

Dozens of vendors provide data mining software tools, some offering proprietary software and others delivering products via open source efforts.

Among the key vendors that offer proprietary data-mining software applications are Angoss, Clarabridge, IBM, Microsoft, Open Text, Oracle, RapidMiner, SAS Institute, and SAP.

Organizations that provide open source data mining software and applications include Carrot2, Knime, Massive Online Analysis, ML-Flex, Orange, UIMA, and Weka.

The risks and challenges of data mining

Data mining comes with its share of risks and challenges. As with any technology that involves the use of potentially sensitive or personally identifiable information, security and privacy are among the biggest concerns.

At a fundamental level, the data being mined needs to be complete, accurate, and reliable; after all, you’re using it to make significant business decisions and often to interact with the public, regulators, investors, and business partners. Modern forms of data also require new kinds of technologies, such as for bringing together data sets from a variety of distributed computing environments (aka big data integration) and for more complex data, such as images and video, temporal data, and spatial data.

Getting the right data and then pulling it together so it can be mined isn’t the end of the challenge for IT. The cloud, storage, and network systems need to enable high performance of the data mining tools. And the resulting information from the data mining needs to be presented clearly to the wide range of users expected to act on and interpret it. You’ll need people with skills in data science and related areas.

From a privacy standpoint, the idea of mining information that relates to how people behave, what they buy, what websites they visit, and so on can set off concerns about companies gathering too much information. That affects not just your technological implementation but your business strategy and risk profile.

Beyond the ethics of tracking individuals so thoroughly, there are also legal requirements about how data can be gathered, identified to a person, and shared. The United States’ Health Insurance Portability and Accountability Act (HIPAA) and the European Union’s General Data Protection Regulation (GDPR) are among the best known.

In data mining, the initial act of preparation itself, such as aggregating and then rationalizing data, can disclose information or patterns that might compromise the confidentiality of the data. Thus, it’s possible to inadvertently run afoul of ethical concerns or legal requirements.

Data mining also requires data protection every step of the way, to make sure data is not stolen, altered, or accessed secretly. Security tools include encryption, access controls and network security mechanisms.
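As one illustrative (and deliberately simplified) example of such protection, the sketch below encrypts a mined data extract at rest using the Fernet recipe from the widely used Python cryptography package. In practice the key would be held in a secrets manager and access-controlled, not generated next to the data.

```python
# Symmetric encryption of a data extract before it is written to storage.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in production, fetch from a secrets manager
cipher = Fernet(key)

record = b"customer_id=42,basket=milk;bread,spend=18.40"  # made-up extract
token = cipher.encrypt(record)   # safe to persist to disk or object storage

assert cipher.decrypt(token) == record  # round-trips back to the original
```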

Data mining is a key differentiator

Despite these challenges, data mining has become a vital component of the IT strategies at many organizations that seek to gain value from all the information they’re gathering or can access. This drive will no doubt accelerate with ongoing advancements in predictive analytics, artificial intelligence, machine learning, and other related technologies.

Source: InfoWorld Big Data

IDG Contributor Network: The next big challenge in building the data-driven economy

The importance of data continues to grow, so much so that The Economist recently declared “the world’s most valuable resource is no longer oil, but data.” The rise in connected devices, from mobile phones to information-gathering sensors, is producing more data than ever with the potential to provide new insights about our economy.

As with any valuable resource, though, production is only part of the equation. Logistics — moving the resource to the right place at the right time — is equally important. After all, a gallon of gas only has value when you can get it to people with cars.

When it comes to data, most industries still lag in logistics. While connected devices are driving up data production, many businesses are still using outdated file transfer software, such as FTP, to move vast amounts of data between locations — the digital equivalent of sending packages cross-country by horse and buggy.

Cloud and cognitive fuel need for data transfer

To realize the full potential of data, the technology used to move data will need to catch up. Unlike older transfer systems, these new technologies must provide fast, predictable rates of transfer regardless of the amount of data, locations involved or competing traffic. This need for faster and more advanced data transfer technologies will become even more important as two major trends take hold: the shift to the cloud and the rise of cognitive business.

Cloud is enabling significant cost savings, but also new opportunities. Mass storage archives are being created in the cloud every day as clients move vast amounts of data to cheaper storage. Additionally, cloud-based analytics makes it possible to create temporary or permanent analytic stores that answer problems on a much more real-time basis than ever before. Cognitive applications being built in the cloud are only as smart as the data they are given. Incomplete or late-arriving data is at times worse than no data at all.

Industry transformation starting to take hold

A few industries have traditionally led the way in making data transfer a priority. IBM Aspera and other technologies that specialize in high-speed data transfer have largely built their businesses through clients in the media and entertainment industry. For these companies, the fast, secure transfer of massive video files has transformed the industry, enabling global collaboration and speeding time to market for new projects.

The media and entertainment industry, though, is just the beginning. Some retailers, for example, are now using new data transfer technology to distribute large video and software files to stores globally to more quickly refresh HD video walls and interactive displays. It’s helping them engage shoppers with the latest campaigns and product content and stand out in an intensely competitive field.

Improving health care, advancing auto safety, and fighting fraud

The health care industry also is starting to take advantage of data transfer. It’s helping researchers more efficiently share findings, building their collective body of knowledge more quickly to uncover trends, treatments and cures. Improved data sharing also has the potential to improve care for individual patients by allowing them to get real-time second opinions from specialists thousands of miles away.

In the auto industry, engineers developing next-generation driver assistance software are making use of advanced data transfer tools to collect valuable sensor data from test facilities around the globe. As more technology reaches production, fast data transfer also will be needed between vehicles and infrastructure to underpin new safety features and eventually enable self-driving cars. A lag in even a second could be the difference between a car stopping at the light and cruising into the intersection.

And in banking and financial services, faster and more secure data transfer will become necessary to keep up with the boom in digital transactions and help companies analyze data more quickly. Instead of investigating fraud after it occurs, fast data transfer could help these companies catch criminals in the act.

Data transfer will become a competitive advantage

Data production will continue to grow across all industries. That part of the equation is certain as mobile devices spread globally and information-gathering sensors fuel the internet of things. The next challenge for businesses is creating the infrastructure to move data to the right place at the right time.

Those companies that master both the mining and movement of data will separate themselves from the pack. They’ll have a more informed view of the market and gain insights more quickly, keeping them in step with the needs of their customers and ahead of their competition.

This article is published as part of the IDG Contributor Network.

Source: InfoWorld Big Data

IDG Contributor Network: Vertically challenged: the pace of digital transformation across industries

As I’ve outlined in my previous posts, we are in the early stages of a digital enterprise transformation tsunami and data is the new currency. By 2020, IDC predicts 1.7 megabytes of new information will be created for every human being on the planet, every second. That’s more data than five times the print collection of the Library of Congress. Every. Year. Never has there been such a motherlode of data for businesses to use to better serve their users and be more efficient. It is a massive challenge for IT organizations to seize this opportunity and fundamentally transform how they manage data and deliver it as a service to their users. It’s a challenge companies in every industry face, but there are some notable differences in how quickly certain industries have embraced the transformation.

Some industries fall into the category of fast “digital transformers”: take the fashion industry, for example. In an article on technology trends shaping the fashion industry, the World Economic Forum notes it is “one major sector being fundamentally transformed from the inside out by technology,” citing as evidence the fashion capitals of the world (New York, Paris, and Milan) being usurped in importance by digital platforms like Snapchat, Instagram, Pinterest, and Periscope. The very nature of stores, and of the shoppers who used to occupy them, is changing: now that the same clothing items you used to purchase at a store are available online, what incentive does a consumer have to shop at physical stores? Digital transformation is an essential aspect of the success and future of the business, and it’s something the fashion industry, generally, has been very good at recognizing.

What happens to those who don’t take the leap to transform? Let’s take J.Crew and its struggle to adapt to a tech-first fashion industry as a recent example. Amidst company turmoil, Mickey Drexler, chairman and chief executive of J.Crew Group, went as far as to say, “If I could go back 10 years, I might have done some things earlier,” when referring to embracing technology. Sales at the company have fallen for the past 10 quarters, and the retail veteran who turned J.Crew into a household name recently stepped down as chief executive after failing to stop the brand’s decline.

Competitors with high-tech, data-driven supply chains can copy styles faster, move them into stores more quickly, and outmaneuver slower rivals. By not pursuing a digital-first approach to its business, J.Crew fell a step behind all the other fashion companies that chose to take the leap sooner. A failure to innovate and digitally transform means companies like this can’t maximize datanomics to their advantage for business intelligence and acceleration; most data simply goes unutilized, gathering dust rather than adding value.

While J.Crew provides us with an example of a digital transformation that may have come too late, the fashion industry has plenty of examples of faster digital transformations as well. Industries like finance and health care, however, often prove slower to complete the digital transformation necessary to harness modern datanomics. According to a Harvard Business Review article on the biggest health care challenges, health care leaders see outdated or ineffective IT infrastructure as their major roadblock. Often, for industries battling slow digital transformation, getting sign-off from executives to try something new is the biggest hurdle. Knowing the benefits of transforming digitally, regardless of the industry, can be the first step in convincing your company it’s the right step to take. For health care, critical data underpins the complex interchange of patient records, research, and medical advances, and that data must be reliable, immediately accessible, and secure. With this in mind, health care agencies have begun to turn into their own disruptors by implementing data virtualization solutions to collapse costs and time factors, transforming the way they operate and deliver their services in the process.

As an example, a customer of ours, Access Community Health Network, supports 40 community health centers throughout metropolitan Chicago. It is responsible for organizing all patient and financial information. By undertaking digital transformation and implementing a data-as-a-service solution, it was able to streamline, automate, and create 24/7 high-performance access to critical data with no downtime, which is essential when lives depend on the services it provides.

The finance industry typically mirrors the pace of health care transformation, with the challenge for large banks and financial institutions coming in many forms: for example, a lack of responsiveness and agility in the marketplace worries IT leadership, or slow restores of large databases threaten devops teams. An Actifio customer, a top 20 global consumer and investment bank with data centers around the world, faced the challenge of staying agile in the marketplace and developing new capabilities powered by new software. By taking the leap to transform digitally and improve datanomics, this bank tested and deployed a new database-as-a-service cloud across its entire infrastructure, improving compliance, recovery times, and most importantly the agility of its devops efforts. In the process, the global firm saved well over $25 million in infrastructure, software licensing and operational costs—in the very first year.

No matter the industry and the reputation of industries to transform fast or slow, there is exponentially more data to be managed in digital enterprises today and making sure your organization is equipped to manage that data as a service is crucial. Data is increasingly the most strategic asset of any enterprise, or even an individual. Without proper management of data, it is impossible for organizations to make the most of it to grow a business and leverage the benefits of data-as-a-service. For businesses, managing the datanomics to drive digital transformation is very black and white: either they thrive, or they die. 

This article is published as part of the IDG Contributor Network.

Source: InfoWorld Big Data

Microsoft Azure ExpressRoute Now Available Across Seven CoreSite Markets

CoreSite Realty Corporation has announced the expanded availability of Microsoft Azure ExpressRoute, which can now be privately accessed from seven of CoreSite’s markets across the country, including Northern Virginia, Chicago, Silicon Valley, Denver, Los Angeles, New York, and Boston.

CoreSite customers can privately connect to Microsoft Azure, Office 365 and Dynamics 365 via the CoreSite Open Cloud Exchange, which provides high-performance, SLA-backed virtual connections and on-demand provisioning. The integration of Azure ExpressRoute and the CoreSite Open Cloud Exchange provides CoreSite customers with the opportunity to establish a fast and reliable private connection into Microsoft Azure, Office 365 and Dynamics 365. With an Azure ExpressRoute connection, customers have a natural extension of their data centers and can build hybrid applications that span on-premises infrastructure and Microsoft Cloud services without compromising privacy or performance.

CoreSite customers can efficiently transfer large data sets for high-performance computing, migrate virtual machines from dev-test environments in Azure to production environments housed in a CoreSite data center, and optimize replication for business continuity, disaster recovery, and other high-availability strategies.

“We are excited to announce the expanded availability of Microsoft Azure ExpressRoute connectivity to our customers across seven of our key markets,” said Brian Warren, senior vice president of engineering & products at CoreSite. “We are enabling our customers with the solutions necessary to bring together all of their applications, data, devices, and resources, both on-premises and in the cloud, with predictable, reliable, and secure high-throughput connections.”

Source: CloudStrategyMag

Rackspace Expands Private Cloud Capabilities

Rackspace® has announced the general availability of Rackspace Private Cloud powered by VMware®, which will now be built on VMware Cloud Foundation™. With this, customers can enhance the foundational technology enabling their move out of the data center and into the cloud with the newest VMware capabilities. Rackspace Private Cloud powered by VMware built on VMware Cloud Foundation will enable full software-defined data center (SDDC) capabilities, including compute, storage, and networking, that span the public and private cloud.

VMware Cloud Foundation accelerates IT’s time-to-market by providing a factory-integrated cloud infrastructure stack that is simple to use and includes a complete set of software-defined services for compute, storage, networking and security. Rackspace Private Cloud powered by VMware helps businesses maximize their VMware deployments by helping build, operate and optimize customers’ physical and virtual infrastructure, freeing IT resources from day-to-day infrastructure management so they can focus on their core business. Rackspace is one of the largest global providers in the VMware Cloud Provider™ Program and has partnered with VMware for more than 10 years delivering valuable solutions for mutual customers.

Built on VMware Cloud Foundation, Rackspace Private Cloud provides mutual customers with enhanced capabilities and management benefits including:

  • Standardized Architecture: Rackspace Private Cloud powered by VMware is built on VMware Validated Designs, which are based on best practices, making deployments more predictable and lower risk.
  • Continuous Updates and Lifecycle Management: Continuous updates allow for the most up-to-date VMware capabilities through lifecycle management of VMware components, thereby helping to improve users’ security posture.
  • Leverage Existing VMware Investments: Users leverage the control, flexibility, and choice needed to run VMware as easily as they would in their own data center. IT departments can migrate or extend to the VMware cloud with consistent tooling and skills. Consistent infrastructure architecture can be leveraged across multiple locations without the need to refactor code. Mutual customers maintain the value of existing investments made in training, VMware technology, and familiar tools by accelerating adoption of software-defined infrastructure.
  • Offload Physical and Virtual Infrastructure Operations: Rackspace delivers a hosted model, which eliminates many of the procurement and integration challenges that IT organizations face in their own data centers. Mutual customers also benefit from the ability to scale their solution quickly and as needed without the need for significant upfront CAPEX investments in data centers and hardware.
  • Managed by Rackspace, Powered by VMware: With Rackspace Private Cloud powered by VMware, customers have access to Fanatical Support® provided 24x7x365 from more than 150 VMware Certified Professionals (VCPs) to help migrate, architect, secure and operate Rackspace hosted clouds powered by VMware technologies.

“Provisioning hardware quickly is no longer considered a value for customers, it’s expected,” said Peter FitzGibbon, vice president and general manager of VMware at Rackspace. “The enhancement in our VMware private cloud delivery model through VMware Cloud Foundation will provide further value to new and existing Rackspace Private Cloud powered by VMware customers by giving them access to the most streamlined and innovative VMware SDDC capabilities and lifecycle management. We are excited to use VMware Cloud Foundation and look forward to continued innovation on the platform.”

“With a decade of proven success in helping customers meet their business demands, VMware and Rackspace are taking another step together to help mutual customers dramatically shorten the path to hybrid cloud,” said Geoffrey Waters, vice president of Global Cloud Sales at VMware. “VMware Cloud Foundation is the industry’s most advanced cloud infrastructure platform that unlocks the benefits of hybrid cloud by establishing a common, simple operational model across private and public clouds. Together with Rackspace and its renowned Fanatical Support, we will add great value to mutual customers in their digital transformation journey.”

Source: CloudStrategyMag

ByteGrid Chosen By Re-Quest, Inc. For Highly Secure Hosting Solutions

ByteGrid Holdings LLC has announced an agreement with Re-Quest, Inc. to provide highly secure hosting and technical expertise supporting Re-Quest’s delivery of Oracle hybrid cloud solutions for its customers.

Re-Quest has been successfully assisting customers around the world since 1991, leveraging their investment in Oracle Technology and Infrastructure assets to gain higher returns on investment, lower total cost of ownership and measurable improvement in their business processes.

“Re-Quest prides itself on the high level of business process and technical expertise we bring to every client engagement,” said Ron Zapar, CEO. “Which is why we chose to partner with ByteGrid, providing our customers with high value services across a complete spectrum of Oracle Hybrid Cloud solutions.”

“We know it’s important for Re-Quest to provide their customers with the technical perspective to implement projects that deliver complete customer satisfaction and partner success,” said Jason Silva, ByteGrid’s CTO. “We’re proud to partner with Re-Quest to ensure they’re successful in bringing that satisfaction by hosting their Oracle Hybrid Cloud technology solutions.”

In addition to this new agreement with Re-Quest, ByteGrid serves some of the world’s largest companies and government agencies, including numerous Fortune 50 companies.

Source: CloudStrategyMag

Rob Kakareka Joins Qligent As Manager Of Business Development

Qligent has announced that Robert “Rob” J. Kakareka has joined the company as its new manager/business development. A broadcast industry veteran with extensive sales experience, Kakareka is tasked with developing U.S. sales, customer relationships and market opportunities.

Kakareka reports directly to John Shoemaker, Qligent’s director of sales. Atlanta-based Kakareka will focus on selling the company’s innovative Vision cloud-based monitoring and compliance platform to U.S. broadcasters, including major networks and call-letter stations.

Qligent’s Vision platform gathers and analyzes data from high-end probes that monitor distinct points along the distribution signal path, out to the last mile. This data enables broadcasters to ascertain that they are delivering an optimal Quality of Experience (QoE) for their viewers, and pinpoint technical issues they need to address.   

“I’m excited to be promoting the value and benefits of Qligent’s flagship product, Vision, at a time of rapid change in the broadcast industry,” said Kakareka. “Vision is uniquely positioned to support mission-critical broadcast distribution in a cost-efficient SaaS model as the industry expands from traditional over-the-air, cable and satellite channels to new digital, mobile and over-the-top (OTT) outlets.

“Despite this dramatic IP-centric shift, the broadcast industry remains a close-knit community with unique requirements and workflows,” Kakareka continued. “My goal is to show broadcasters that not only is our technology exceptional, but we have their backs as they venture into new and emerging market opportunities — including a true Monitoring as a Service business model that offloads monitoring, analysis and troubleshooting responsibilities to our managed services layer.”

With a career spanning over 20 years, Kakareka is no stranger to the broadcast industry, having held strategic sales and business development positions for many high-profile brands. These prior posts include Avid (Orad) Graphics Systems (from February 2014 to February 2016), Miranda (February 2012 to March 2014), Pixel Power (February 2008 to February 2012) and BarcoNet (February 2001 to February 2002).

In these national sales roles, Kakareka regularly outperformed sales quotas, broadened customer bases, boosted sales revenues, and built strong customer relationships with broadcasters nationwide. He’s also knowledgeable in all aspects of broadcast television operations, including graphics and virtual reality studio workflows, SaaS digital media services, big data-scaled storage, TV/film production and OTT/cloud workflows.

Kakareka has also tackled complex business development challenges, such as developing new business for Comprehensive Technical Group, while creating new business plans for this system integration firm’s existing clients. While working for systems integrator Technical Innovations/Broadcast Solutions Group (from February 2002 to February 2008), he implemented a sales plan for the rollout of ATSC compliant DTV systems, sold and integrated them at hundreds of stations across North America, among other sales achievements. 

 “In his stellar career, Rob has witnessed this industry’s many transitions firsthand, and that experience will be especially valuable as we engage with broadcasters to demonstrate how our unique, groundbreaking cloud software can solve today’s ‘uncontained’ distribution challenges,” said Shoemaker. “Our company has experienced rapid growth in a short time, and we’re confident that Rob’s industry expertise, insight and track record will help us capitalize on this momentum and significantly expand our U.S. customer base.”    

Source: CloudStrategyMag

IDG Contributor Network: Responsible retail: treating customer data with care

Retailers have become so adept at capturing and analyzing consumer data that there is now a real risk they might alienate customers by revealing just how much they know about our lifestyles, habits, and preferences. So if retailers want their big data investments to pay off, they must tread carefully.

Big data exploitation in retail is no longer restricted to tracking and responding to broad trends; it’s become very personal. Which is great if the result is that customers find exactly what they were looking for; less so if it feels intrusive or invasive.

Analytics technology is now so sophisticated that, by drawing on an individual’s loyalty-card records, payment histories, and browsing habits, retail marketing programs can detect an alcohol problem, whether someone has lost their job (because spending drops and premium brands are replaced by “value” purchases), if they’re away on holiday, and much more besides. (A few years ago, Target famously worked out that a teenage girl was pregnant before her father knew.)

This is not to imply that retailers are necessarily doing anything wrong or sinister (customers may well have given consent for this kind of data usage). But it can be unnerving to think that every time we browse online or in a store, that activity is being monitored to build a picture of our entire lives. Just think how often we are pestered with unsolicited promotions related to a product we may have glanced at only once.

Even in Europe, where measures to protect consumer privacy are fairly robust, customers are now being tracked via their mobiles as they enter or pass by stores. Their activity can be registered—even if they don’t have a loyalty card or store app. In the US, meanwhile, regulations are becoming looser rather than more stringent now that safeguards protecting internet search histories are being dismantled. So the scope for overstepping the mark is growing.

Snooping vs. problem-solving

If retailers want to impress and retain customers, rather than undermine their trust, they need to turn their attention to more beneficial ways of applying algorithms and data discovery.

In fashion, retailers are exploring ways of minimizing sales returns, a problem so costly across e-commerce that the likes of Amazon have gone so far as to ban customers who return items too often. In the US alone, merchandise returns were valued at $260.5 billion in 2015, roughly 8 percent of total sales, according to the National Retail Federation. Returns are a pain for customers, too: who wants the disappointment and hassle of having to send something back because it’s not quite right? A common cause of apparel returns is over-ordering, because consumers haven’t been confident of getting the right size; this is something the industry is now trying to address with new combinations of technology and new data insight.

Another option is to use customer intelligence to provide a more responsive logistics service. Amazon has patented a shipping model that anticipates what goods certain customers are going to order, so it can have the products waiting in a nearby warehouse for faster delivery. Combine this type of strategy with automated drone deliveries and the customer experience might soar while the cost of logistics (even the need for delivery partners) diminishes.

Greater empathy, better service

To the customer, real service innovation reduces the sense of being spied upon because of the perceived personal benefit: the end justifies the means. If I go to my regular bar, it suits me that they’ll have my favorite drink ready before I’ve even taken a seat, because of how well they know me. If that happened in a bar I’d never been to before, though, it would be unsettling. Context and consent matter.

If the result of deeper customer insight is something genuinely useful to the consumer, surrendering anonymity and sharing data becomes a lot more palatable. People do appreciate easier access to the items they want, it does make their life easier if they don’t have to parcel up returns, and a timely recommendation can be useful in the right circumstances. So really, retailers just need to be a bit more thoughtful about how they apply their knowledge.

What isn’t in dispute is the strategic value of data. Figures from Gallup Behavioral Economics suggest that organizations that are able to exploit customer behavioral insights outperform their peers by 85 percent in sales growth, and more than 25 percent in gross margin. So keep building those data vaults and adding ever more sophisticated real-time analytics; the rest is down to using the insights to best effect.

This article is published as part of the IDG Contributor Network.

Source: InfoWorld Big Data