New R extension gives data scientists quick access to IBM's Watson

Data scientists have a lot of tools at their disposal, but not all of them are equally accessible. Aiming to put IBM’s Watson AI within closer reach, analytics firm Columbus Collaboratory on Thursday released a new open-source R extension called CognizeR.

R is an open-source language widely used by data scientists for statistical and analytics applications. Previously, data scientists had to leave R to tap Watson’s capabilities, coding the calls to Watson’s APIs in another language, such as Java or Python.
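
For readers unfamiliar with that workflow, the sketch below shows roughly what such an out-of-R call looks like in Python using the generic requests library; the endpoint URL, credentials, and response fields are illustrative assumptions, not the actual Watson or CognizeR API.

    # Illustrative sketch only: the endpoint, credentials, and payload shape below are
    # assumptions, not the real Watson or CognizeR API.
    import requests

    WATSON_URL = "https://example.watsonplatform.net/sentiment/api/v1/analyze"  # hypothetical endpoint
    API_KEY = "your-api-key"  # placeholder credential

    def analyze_text(text):
        """Send a block of unstructured text to a (hypothetical) Watson text service."""
        response = requests.post(
            WATSON_URL,
            auth=("apikey", API_KEY),  # simple HTTP basic auth for the sketch
            json={"text": text},       # assumed request body
            timeout=30,
        )
        response.raise_for_status()
        return response.json()         # assumed JSON result, e.g. sentiment scores

    if __name__ == "__main__":
        print(analyze_text("Customers love the new release, but shipping was slow."))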

Now, CognizeR lets them tap into Watson’s so-called “cognitive” artificial-intelligence services without leaving their native development environment.

“Data scientists can now seamlessly tap into our cognitive services to unlock data that lives in unstructured forms like chats, emails, social media, images, and documents,” wrote Rob High, vice president and CTO for Watson, in a blog post.

Red Hat Positioned in the Visionaries Quadrant of Gartner's 2016 Magic Quadrant

Red Hat, Inc. has announced that Red Hat Enterprise Virtualization has been positioned by Gartner, Inc. in the “Visionaries” quadrant of the August 2016 x86 Server Virtualization Infrastructure Magic Quadrant. With Red Hat Enterprise Virtualization, companies around the world are building a foundation for future technologies while integrating existing technologies through an open, scalable, and high-performance virtualization infrastructure.

Gartner’s Magic Quadrants are based on rigorous analysis of a vendor’s completeness of vision and ability to execute. According to Gartner, “About 80% of x86 server workloads are virtualized, but virtualization technologies are becoming more lightweight, supporting more workloads and agile development. Price, modernization and specific use cases are driving enterprises to deploy different, and often multiple, virtualization technologies.”

“Many enterprises are looking for an open alternative to proprietary virtualization solutions to obtain better efficiencies and interoperability, and to bridge their traditional infrastructure to cloud-native workloads using OpenStack or other platforms. As the only company included in the Visionaries quadrant, we believe Red Hat’s Magic Quadrant position reinforces our continued innovation, momentum and strong vision for an open, high-performance virtualization alternative,” said Gunnar Hellekson, director of product management, Linux and Virtualization, Red Hat.

Red Hat Enterprise Virtualization is an open infrastructure and management platform for servers and workstations with robust security capabilities. It is built on Red Hat Enterprise Linux and Kernel-based Virtual Machine (KVM) technologies, and it enables customers to virtualize both traditional and cloud-native applications. As an open alternative, it offers a high-performing, fault-tolerant, and more secure platform for mission-critical, virtualized Linux and Windows environments, and it reduces the cost and complexity of proprietary virtualization through improved economics, interoperability, and agility. Backed by Red Hat’s certified ecosystem of software and hardware partners, Red Hat Enterprise Virtualization offers unparalleled performance, scale, and flexibility to support a broad range of critical workloads.

Source: CloudStrategyMag

Nimbix Expands Market Presence In Cloud-based Machine Learning

Nimbix has announced a significant increase in its presence in the machine learning market as more customers use its JARVICE platform to address the need for an easier, more cost-efficient way of working with machine learning.

Using JARVICE to manage their machine learning process, customers can leverage JARVICE’s turnkey workflows, reducing time to deployment from weeks to hours. The platform is built on NVIDIA GPUs for optimal neural network training, and Nimbix’s per-second billing enables customers to capture the best economics for the neural network evaluation phase of machine learning.
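
As a back-of-the-envelope illustration of why per-second billing matters for short evaluation runs, the sketch below compares the cost of a brief GPU job under per-second and whole-hour billing; the hourly rate and job length are made-up assumptions, not Nimbix pricing.

    # Rough cost comparison for a short GPU job: per-second vs. whole-hour billing.
    # The hourly rate and job duration are illustrative assumptions, not Nimbix pricing.
    import math

    HOURLY_RATE = 3.00      # assumed $/hour for a GPU node
    JOB_SECONDS = 7 * 60    # a 7-minute neural-network evaluation run

    per_second_cost = (HOURLY_RATE / 3600) * JOB_SECONDS
    per_hour_cost = HOURLY_RATE * math.ceil(JOB_SECONDS / 3600)  # billed in whole hours

    print(f"Per-second billing: ${per_second_cost:.2f}")  # ~$0.35
    print(f"Whole-hour billing: ${per_hour_cost:.2f}")    # $3.00 for the same job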

Hugh Perkins, an experienced machine learning developer and author of the popular open-source OpenCL libraries DeepCL and cltorch, is an avid user of the Nimbix cloud. Perkins chose Nimbix for machine learning because of its powerful platform API, industry-leading selection of GPUs, and superior performance and economics.

“Nimbix is a breath of fresh air,” said Perkins. “The per-second billing, spin up times of seconds, and the availability of high end GPUs, make Nimbix an awesome choice for machine learning developers.”

The Nimbix cloud platform is democratized and developer-friendly, allowing users to monetize their trained neural networks in the application marketplace. Democratizing machine learning APIs will distribute the power of neural networks to smaller organizations, allowing for more breakthroughs in life sciences, IoT, automotive, and more.

“The Nimbix Cloud was a great choice for our research tasks in conversational AI. They are one of the first cloud services to provide NVIDIA Tesla K80 GPUs that were essential for computing neural networks that are implemented as part of Luka’s AI,” said Phil Dudchuck, co-founder at Luka.ai.

Nimbix provides on-demand and scalable compute resources that enable organizations to run large-scale HPC workloads in the cloud. High-performance computing (HPC) allows scientists, developers, and engineers to solve complex science, engineering, development, big data, and business problems using applications that require high compute capabilities. Nimbix’s growing number of integrated cloud services allows organizations to increase the speed and effectiveness of research by running high-performance computing in the Nimbix Cloud.

“The Nimbix Cloud, powered by JARVICE, is an ideal platform for machine learning applications, as it provides access to true supercomputing with GPU- and FPGA-based accelerated computing and the large amounts of memory that are paramount for training deep neural networks,” said Leo Reiter, chief technology officer at Nimbix. “The high-performance execution of the Nimbix Cloud is key to delivering timely results to our end users. We are continuing to improve JARVICE by allowing our customers to utilize the platform to democratize these resources and provide a processing API to help add cognitive features to any application seamlessly and cost-effectively.”

Source: CloudStrategyMag

Microsoft Named A Leader By Gartner

Gartner has positioned Microsoft in the Leaders Quadrant in the 2016 Magic Quadrant for Cloud Infrastructure as a Service (IaaS) based on its completeness of vision and ability to execute in the IaaS market. Microsoft is the only vendor recognized as a leader across Gartner’s Magic Quadrants for IaaS, PaaS, and SaaS solutions for enterprise cloud workloads.

Source: CloudStrategyMag

Rootstock Appoints BT Partners To Represent Its Cloud ERP Solutions

Rootstock has announced that BT Partners has been appointed to help manufacturers and distributors learn about, buy, and implement Rootstock’s Cloud Manufacturing ERP software. For 30 years, the Chicago provider of technology-based business consulting services has been helping manufacturers and distribution organizations improve the ways their businesses operate with ERP and financial software solutions. 

“With Rootstock’s Manufacturing and Distribution Cloud ERP, a manufacturer or distributor can reduce its IT operational costs by outsourcing hardware and software maintenance and support to the cloud provider and get their ERP solution up and running faster,” emphasizes Todd Perlman, president of BT Partners. “With its ERP built on the Salesforce platform, Rootstock customers leverage investment in their Salesforce platform by integrating it seamlessly with other Salesforce cloud applications.”

“Today, Cloud ERP buyers need to be cautious,” adds Pat Garrehy, CEO of Rootstock Software. “Not all ERP systems are developed from a manufacturing core, especially those residing in the Cloud.  Some vendors begin with accounting software while others start out in maintenance or human resources. Rootstock’s core functions have been architected based on the needs and environment of supply chain-based organizations. Our solution is extended and enhanced by BT Partners’ technical architects as they are leaders in their field and familiar with the global governance, data management and complex integrations necessary for manufacturers to be successful.”

BT Partners is experienced in working with manufacturing and distribution companies that have complex inventory and shipping needs. BT Partners will assess ERP utilization, review processes, and identify needs that are unmet today or emerging. The firm focuses on developing recommendations for optimizing clients’ investments and managing the implementation process.

Source: CloudStrategyMag

IDG Contributor Network: Five core attributes of a streaming data platform

As your data-driven organization considers incorporating new data sources like mobile apps, websites that serve a global audience, or sensor information from the internet of things, technologists will have questions about the required attributes of a streaming data platform.

Five core attributes are necessary to implement an integrated streaming platform and to enable both the acquisition of streaming data and the analytics that make streaming applications possible:

Low latency: Streaming data platforms need to match the pace of the data sources they acquire data from as part of a stream. A key requirement is the ability to match the speed of data acquisition with the needs of the near-real-time analytics required to disrupt particular business models or markets. The value of real-time streaming analytics diminishes when you have to wait for the data to land in a data warehouse or a Hadoop-based data lake architecture. For location-based services and predictive maintenance applications in particular, the lag between when data is created and when it lands in a data management environment represents, at best, a missed customer opportunity and, at worst, a stranded multimillion-dollar asset critical to your business operations.

Scalable: Streaming data platforms are not just connecting a couple of data sources behind the corporate firewall. They need to match the projected growth of connected devices and the internet of things, which means streaming data from a very large number of sources, potentially millions or even billions, both internal and external.
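
To make these attributes concrete, here is a minimal consumption sketch using the open-source kafka-python client as one way to act on events as they arrive rather than after they land in a warehouse; the topic name, broker addresses, and event fields are assumptions chosen for illustration, not a recommendation of any particular platform.

    # Minimal streaming-consumption sketch using the kafka-python client.
    # Topic name, broker addresses, and event fields are illustrative assumptions.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "sensor-events",                              # hypothetical topic
        bootstrap_servers=["broker1:9092", "broker2:9092"],
        group_id="streaming-analytics",               # consumer groups allow horizontal scale-out
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="latest",                   # analyze new events as they arrive
    )

    for event in consumer:
        reading = event.value
        # Act on the event immediately instead of waiting for it to land in a warehouse,
        # e.g. flag a sensor reading that suggests a pending maintenance problem.
        if reading.get("vibration", 0.0) > 0.8:
            print(f"ALERT device={reading.get('device_id')} vibration={reading['vibration']}")

Scaling out such a consumer is then largely a matter of running more instances in the same consumer group, so that partitions of the stream are divided among them.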

4 Winners and 3 Losers in Gartner's Magic Quadrant for IaaS

Gartner has released the results of its Magic Quadrant for Infrastructure as a Service for 2016. The winners in the public cloud space are innovating and adding new features rapidly, while the losers are falling further and further behind. Here’s a look at some of the highlights of the report.

Winner: Amazon Web Services

AWS is the clear leader in the IaaS space with “a diverse customer base and the broadest range of use cases.” Its partner ecosystem combined with its training and certification programs “makes it easier to adopt and operate AWS in a best-practice fashion,” Gartner says.

Gartner notes that optimal use of AWS may require professional services, and recommends the use of third-party cost management tools to keep track of cloud expenses. Of course, all of this is good news for the latest crop of cloud service providers who are positioning their services around providing better support for AWS.

Winner: Microsoft

Microsoft Azure is considered one of the big three IaaS providers right now. Gartner says Microsoft’s strengths include integrated IaaS and PaaS components that “operate and feel like a unified whole”, rapid addition of new features and services, and becoming more open – including its support of Red Hat earlier this year.

And Gartner isn’t the only one that recognizes Azure’s mass appeal: other recent research has predicted that adoption of Azure by CIOs could surpass AWS by 2019.

As with AWS, successful implementation of Azure relies on customers forming relationships with partners. But Gartner says that while Microsoft “has been aggressively recruiting managed service and professional services partners… many of these partners lack extensive experience with the Azure platform, which can compromise the quality of the solutions they deliver to customers.”

But it’s not necessarily the fault of the partners; Gartner says that “CMP vendors and MSPs report challenges in working with Azure, particularly in the areas of API reliability and secure authentication, which are slowing their ability to deliver solutions.”

Winner: Google

Google’s capabilities in the IaaS space rely heavily on its own experience running the back end of its behemoth search engine. In other words, Google allows other companies to “run like Google,” which makes it the top contender for cloud-native use cases and applications.

But Google is lacking in key areas, which could hold back further adoption by established organizations and startups; namely, “user management suitable for large organizations, granular and customizable role-based access control (RBAC), complex network topologies equivalent to those in enterprise data centers, and software licensing via a marketplace and license-portability agreement.”

Unlike AWS and Microsoft, who have been fairly supportive of partners, Google has focused more on delivering its cloud services direct, even pushing some MSPs to vow to never work with the company.

Winner: Rackspace

Rackspace has its roots in OpenStack cloud and had worked to be more technology-neutral, but it has since shifted to embrace “its roots as ‘a company of experts,’” offering managed AWS support and other managed services for third-party clouds. Rackspace is also strong when it comes to private cloud offerings.

What has held Rackspace back? According to Gartner, it has not been able to keep up with the pace of innovation of the market leaders.

Gartner also hinted that Rackspace could become an acquisition target – which was at least partially confirmed this week as reports surfaced that it is close to a deal with private equity firm Apollo.

Loser: VMware

While Gartner acknowledges that VMware is the market share leader in virtualization, vCloud Air has “limited appeal to the business managers and application development leaders who are typically the key decision makers for cloud IaaS sourcing.”

“VMware is no longer significantly expanding the geographic footprint of vCloud Air, nor investing in the engineering necessary to expand its feature set beyond basic cloud IaaS,” Gartner says.

Loser: NTT Communications

Though NTT Communications (NTT Com) has a strong presence in Asia-Pacific – a challenging market for many IaaS providers – its basic cloud IaaS offering is not enough to set it apart from its competitors.

Gartner says it is “missing capabilities that would make it attractive to enterprise IT operations organizations” – a gap that could be partially addressed by its cloud services brokerage (CSB) portal, which brings together its own offerings and third-party clouds and is expected to launch this year.

Loser: Fujitsu

Gartner says that Fujitsu’s cloud IaaS capabilities “lag significantly behind those of the market leaders” and “it will continue to need to aggressively invest in acquiring and building technology in order to be competitive in this market.”

Source: TheWHIR

Cambridge Semantics Names Steve Hamby Managing Director Government

Cambridge Semantics has announced the appointment of Steve Hamby as managing director, government.

In this newly created position, Hamby will serve Cambridge Semantics’ federal government customers seeking insights from big data discovery, analysis, and data management solutions, such as the Anzo Smart Data Lake™, to provide timely, accurate and customizable information to staff, citizens, media and businesses.

“We are delighted to have Steve join us as managing director government,” said Alok Prasad, president of Cambridge Semantics. “With our rapidly expanding client roster in the public space, Steve’s addition to the team will permit us to further develop our market presence as big data analysis becomes indispensable to delivering effective and efficient government services.”

Hamby brings over 30 years of experience in the information technology industry to the company, most recently serving public sector customers as the CEO of G Software, Inc. and as chief technology officer for Orbis Technologies, Inc. In 2013, he was recognized by the American Business Awards™ as Technology Executive of the Year, Silver Award for his pioneering efforts on cloud-based HUMINT- and OSINT-centric fusion products at Orbis Technologies. Hamby is also a published author who often speaks at major industry conferences. He holds a bachelor’s degree in management from the University of North Alabama and a master’s degree from Jacksonville State University.

“It’s an exciting time for Cambridge Semantics to step up its presence in the public sector,” said Hamby. “Government agencies have a tremendous interest in semantic-based smart data discovery and analytic solutions, and I look forward to working with these organizations to help them simplify data access and discovery for the citizenry.”

Source: CloudStrategyMag

SAIC Introduces Cloud Migration Edge

Science Applications International Corp. (SAIC) has launched Cloud Migration Edge™, a multi-tiered methodology that migrates and transforms customers’ current IT applications and systems to a cloud environment securely and effectively. As a cloud services integrator, SAIC teams with the best cloud technology providers to engineer solutions that meet customers’ individual needs.

Cloud Migration Edge is a holistic, five-step approach that encompasses specialized tools, processes, and best practices to guide the cloud migration life cycle and provide ongoing improvement. This formalized framework supports the step-by-step implementation of a mission-centric cloud computing environment by breaking down the cloud migration process into standardized components at each layer of the IT service life cycle.

“Our advanced cloud expertise and proven methodology allow our federal government customers to rapidly and securely integrate and adapt cloud technologies to improve delivery of their IT services,” said Charles Onstott, SAIC senior vice president and general manager of the Cyber, Cloud, and Data Science Service Line. “To accomplish this, we have taken our IT business transformation, cybersecurity, and cloud computing expertise to deliver a systematic approach to cloud migration, while applying IT Infrastructure Library best practices.”

Additionally, SAIC’s customized approach includes several aspects of business transformation such as policies, processes, security, governance, architecture, applications, and change/risk management.

“Our cloud services integration solution creates a comprehensive and secure IT environment, crafted to meet our customers’ unique requirements, using both existing customer investments and modern cloud technologies,” said Coby Holloway, SAIC vice president of Cloud Computing and Business Transformation Services.

SAIC works with customers to analyze their requirements and business needs, develop the appropriate architecture, design the migration approach, and implement the transition plans to include change and risk management. SAIC also establishes a new operations and maintenance model based on the target architecture that includes cloud management and continuous service improvement.

“Migration is not just about the applications, it is about transforming the way business and missions are performed while providing new capabilities that cloud-based systems enable,” Onstott continued. “We evaluate the current system and requirements, future needs, what makes sense to migrate and how, the risks involved, the transition process needed, policies, people, processes and how those are affected, and develop the best implementation plan to transition the business with the lowest impacts on productivity and current operations.”

As part of SAIC’s total solution, Cloud Migration Edge uses industry-leading capabilities from Amazon Web Services, EMC, NetApp, Red Hat, VMware, and others. As a cloud services integrator, SAIC is able to bring the best solutions from its partners across the cloud computing industry, avoiding vendor bias and lock-in.

SAIC Cloud Migration Edge five-phase methodology:

  • Assess and Strategize: SAIC defines objectives and builds a cloud strategy that meets technical, regulatory compliance, and security requirements. This involves creating assessments, building requirements, developing a business case, and outlining a return on investment.
  • Design: SAIC tailors a solution that includes the cloud platform, security, management, monitoring, and final design to achieve each customer’s goals. SAIC uses a comprehensive systems engineering approach to create both a final cloud-enabled infrastructure as well as a detailed migration strategy that includes transformation of the customer’s IT processes and organization to a cloud service delivery model.
  • Transition: During this step, SAIC migrates IT services to the cloud with minimal disruption using a unique managed business transformation approach, including an implementation plan, operational testing, and final execution.
  • Operate: SAIC orchestrates cloud services to meet performance levels using proven processes to mitigate risk with constant monitoring. SAIC will organize, monitor, verify, report, and manage various operational and governance activities that ensure the production environment meets or exceeds performance metrics. SAIC also introduces heavy automation to increase the efficiency and consistency of the new services, and to facilitate onboarding and cloud service adoption.
  • Improve: SAIC capitalizes on the flexibility of cloud-enabled architectures to optimize service value. During this phase, SAIC provides customers with services including project management, staff augmentation, data migration, workload migration, independent verification and validation testing, and concept of operations updates. Customers benefit from the lessons learned and best practices developed across all of SAIC’s cloud work, which are used to continually update the Cloud Migration Edge approach and implementations. This phase involves evaluating service delivery and identifying and implementing opportunities for improvement.

Source: CloudStrategyMag

Qligent Integrates Big Data Capabilities Into Vision Cloud Monitoring Platform

Qligent is building big data conditioning into its Vision cloud monitoring platform in time for IBC2016 (September 9-13, RAI Exhibition Center, Stand 8.E47). The integration of this new software will help broadcasters and media businesses leverage big data insights much more quickly and easily for multiplatform content delivery.

As has always been the case in television, viewers quickly lose patience and tune out if broadcast quality suffers. The challenge for broadcasters and new media businesses, including OTT service-providers, is the sheer cost and complexity of monitoring a quickly escalating density of streams and channels. The Vision cloud monitoring platform gives users a wider palette to monitor these many streams from the studio headend to the last mile more effectively — and cost-efficiently.

At IBC2016, visitors to the Qligent stand can learn how the new big data and other advanced capabilities built into Vision enhance analysis across both linear and non-linear TV and video streams. This includes rich, detailed, and customized presentations that combine and structure specific QoE parameters to present the data in a meaningful and actionable way, including the following (a simple illustrative sketch appears after this list):

  • Percentage of macroblocking, freeze, black, and other artifacts in a program stream
  • Quality of advertising playout over a specific period of time
  • Presentation of off-air time over a broadcast day or week
  • Capture, verification and correlation of embedded metadata
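
As a rough illustration of how per-frame artifact flags might be rolled up into the stream-level percentages described above, here is a short sketch; the input record format and artifact names are assumptions for illustration, not Qligent’s data model.

    # Illustrative aggregation of per-frame QoE flags into stream-level percentages.
    # The input format (one record per analyzed frame) is an assumption, not Qligent's data model.
    from collections import Counter

    ARTIFACTS = ("macroblocking", "freeze", "black")

    def summarize_artifacts(frames):
        """frames: iterable of dicts like {'macroblocking': False, 'freeze': True, 'black': False}."""
        totals = Counter()
        count = 0
        for frame in frames:
            count += 1
            for artifact in ARTIFACTS:
                if frame.get(artifact):
                    totals[artifact] += 1
        if count == 0:
            return {}
        return {artifact: 100.0 * totals[artifact] / count for artifact in ARTIFACTS}

    # Example: one frozen frame and one black frame out of four analyzed frames.
    sample = [
        {"macroblocking": False, "freeze": False, "black": False},
        {"macroblocking": False, "freeze": True,  "black": False},
        {"macroblocking": False, "freeze": False, "black": True},
        {"macroblocking": False, "freeze": False, "black": False},
    ]
    print(summarize_artifacts(sample))  # {'macroblocking': 0.0, 'freeze': 25.0, 'black': 25.0}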

“Many of our current and prospective customers in the broadcast space share that big data is the only way to reconnect and stay connected with what used to be their captive audiences,” said Ted Korte, COO, Qligent. “There has been an explosion of non-linear avenues for content delivery across gaming consoles, mobile devices and hundreds of social media sites, all stealing eyeballs, time and attention. Everything the linear TV service provider once understood is now completely fragmented, and these customers need a new set of data-centric tools to understand the quality of the viewing experience—and how to monetize that data moving forward.”

Vision users can opt to create and manage big data widgets for on-site analysis, or farm out the application to Qligent’s managed service layer via the company’s Oversight MaaS (“Monitoring as a Service”) platform. This further drives down the costs and labor associated with monitoring multiple streams and sources across many delivery platforms.

“The fact is that the stress around multiplatform monitoring can cause many headaches in understaffed and under-skilled facilities, to the point where they may not know they are off the air on a specific platform until receiving a complaint,” said Korte. “While the tried-and-true linear model still catches more eyeballs and viewers on initial impressions, to remain competitive, broadcasters and MVPDs need to be on as many of these emerging platforms as possible with engaging, high-quality content. Our big data capabilities in Vision will help our customers understand what the quality of experience is across these many platforms. That data really represents the viewer feedback that isn’t typically received, and will help our customers understand when and why viewers tuned out—and how to rectify any viewing quality problems.”

Source: CloudStrategyMag