Data science could keep United out of more trouble

I’ve avoided flying United for many years. On my last trip to Japan about 10 years back, somewhere along the way an employee took my ticket and said I’d get another one in Japan. Wrong! On my return, United told me I had to buy a new ticket for around $7,000.

Anyhow, we’ve all heard about United’s overbooking disaster, where a passenger faced a lot worse abuse than I did. With the right data and analytics, another outcome could have been possible.

When the tickets were sold, United’s ticketing system could have seen there was a high probability that the other flight would arrive late and that crew members frequently bumped passengers. The ticketing system could have reserved a number of seats as standby or told the last four passengers booking them that they might be bumped. Then, when the other flight was coming in with the crew that needed to get back home, United simply could have avoided boarding the last four.
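A minimal sketch of what such a rule might look like, assuming the airline can estimate two probabilities from its historical data: the chance the inbound flight runs late and the chance a crew needs repositioning on this departure. The function names, thresholds, and numbers below are illustrative assumptions, not United's actual logic.

```python
# Hypothetical sketch: decide how many seats to hold back at booking time,
# based on historical estimates for a route. Function names, thresholds, and
# probabilities are illustrative assumptions, not United's actual logic.

def seats_to_hold(p_inbound_late: float, p_crew_repositioning: float,
                  typical_crew_size: int = 4, threshold: float = 0.5) -> int:
    """Return how many seats to keep unsold (or flagged as bumpable)."""
    # Joint chance that a late inbound flight forces a crew onto this departure.
    p_need_crew_seats = p_inbound_late * p_crew_repositioning
    return typical_crew_size if p_need_crew_seats >= threshold else 0

# Example: 70% chance the inbound runs late, 80% chance its crew gets rebooked here.
print(seats_to_hold(0.7, 0.8))  # -> 4: stop selling the last four seats up front
```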

In fact, flight data is a cornucopia of statistical information. You could learn a lot about the following:

  1. Weather patterns by season and even in unseasonable years. Sure, we have radar, but how do these patterns affect objects in the air?
  2. Flight delays (travel sites already report this).
  3. Domino effects, such as how a delayed flight or weather pattern impacts other flights.
  4. Maintenance issues, such as how frequently by plane type (or airline) parts have to be replaced or fail.

Also, you can glean a lot of customer and customer preference information. The company I work for calls these “signals,” which I like better than “events,” because they aren’t always events and “time series” is too generic. You could learn the following:

  1. Which customers will likely cancel if assigned a middle seat (my bladder is small in the air and I have broad shoulders). This goes beyond my profile preference for an aisle or a window to identify how much I prefer an aisle.
  2. Which customers are most price sensitive.
  3. How frequently a customer flies your airline after being bumped or experiencing other customer service problems.

Using statistics, machine learning, and a simple rules engine, and by connecting some of these data sets, airlines could do the following (a small scoring sketch follows the list):

  1. Automatically offer discounts and other incentives to passengers with flexible schedules to fill empty seats.
  2. Offer status upgrades to passengers who are likely to be incentivized to fly your airline over others (American is doing this, but I don’t know how targeted it is).
  3. Detect probable weather problems, automatically hold seats, and start rebooking before the connection even lands. (Delta does this once the delay happens, but it does so poorly with suboptimal routes.)
  4. Avoid overbooking and simply offer preselected seats. Also, instead of “dumb bidding” in the open air, send a text message to passengers who are likely to take a lower offer. This prevents people from sitting around and waiting for higher compensation.
  5. When you have to select someone, choose the person least likely to care. You have the data.
  6. Detect problematic decision-making or identify employees who frequently do stupid things (like drag people off airplanes).
  7. Assuming there’s a connection between complaints and bad PR, detect when a policy or practice is likely to cause your stock to drop should it go viral on video.
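Here is a small sketch of item 5, choosing the passenger least likely to care, scored from signals like the ones described above. The feature names, weights, and example values are assumptions for illustration, not any airline's real model.

```python
# Hypothetical sketch of item 5: when someone must be bumped, rank passengers
# by a predicted "cost of bumping" built from signals the airline already has.
# Feature names, weights, and values are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Passenger:
    name: str
    churn_risk_after_bump: float   # learned from past service failures, 0..1
    schedule_flexibility: float    # e.g., no tight connection onward, 0..1
    lifetime_value: float          # revenue signal, arbitrary units

def bump_cost(p: Passenger) -> float:
    # Lower score = cheaper to bump: flexible schedule, low churn risk,
    # lower lifetime value. Real weights would come from a fitted model.
    return (0.5 * p.churn_risk_after_bump
            + 0.3 * (1 - p.schedule_flexibility)
            + 0.2 * (p.lifetime_value / 10_000))

manifest = [
    Passenger("A", churn_risk_after_bump=0.9, schedule_flexibility=0.1, lifetime_value=8_000),
    Passenger("B", churn_risk_after_bump=0.2, schedule_flexibility=0.8, lifetime_value=1_500),
]
print(min(manifest, key=bump_cost).name)  # -> "B": the passenger least likely to care
```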

I realize that not every problem can be solved by search (full disclosure: I work for a search company) and math, but a lot of the dumbest stuff and everyday annoyances could. All it takes is motivation. Unfortunately, so far, U.S.-based airlines seem to lack a strong economic reason to care about customer service.

Source: InfoWorld Big Data

Report Shows Need For Enterprise-Wide Plans To Combat Network Intrusions

The BakerHostetler 2017 Data Security Incident Response Report highlights the critical need for senior executives in all industries to understand and be ready to tackle the legal and business risks associated with cyberthreats and to have enterprisewide tactics in place to address intrusions before they happen.

The report provides a broad range of lessons to help executives identify risks, appraise response metrics and apply company-specific risk mitigation strategies based on an analysis of more than 450 cyber incidents that BakerHostetler’s Privacy and Data Protection team handled last year. The firm’s experience shows that companies should be focused on the basics, such as education and awareness programs, data inventory efforts, risk assessments, and threat information sharing.

Theodore Kobus, leader of the Privacy and Data Protection team, said, “Like other material risks companies face, cybersecurity readiness requires an enterprisewide approach tailored to the culture and industry of the company. There is no one-size-fits-all approach.”

Why incidents occur

Phishing/hacking/malware incidents accounted for the plurality of incidents for the second year in a row, at 43 percent – a 12 percentage point jump from a year earlier. The only category for which phishing/hacking/malware was not the most common incident cause was finance and insurance, where employee action/mistake was the top reason.

Ransomware attacks — where malware prevents or limits users from accessing their system until a ransom is paid — have increased by 500% from a year earlier, according to industry research. The BakerHostetler report details the typical ransomware scenario and the challenges that such incidents present. “Having a regularly scheduled system backup and a bitcoin wallet to pay a ransom will help with operational resiliency. Ransomware is not likely to go away, and incidents will probably increase over the short term, so companies should be prepared,” added Kobus.
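As a minimal illustration of the "regularly scheduled system backup" advice, the sketch below writes a timestamped archive and prunes old copies. The paths and the retention window are assumptions for the example, not recommendations from the report.

```python
# Illustrative only: a bare-bones scheduled backup with simple retention, the
# kind of basic resiliency measure the report recommends. The paths and the
# 14-day retention window are assumptions for the example.

import shutil
import time
from pathlib import Path

SOURCE = Path("/var/data")      # assumed data directory to protect
DEST = Path("/mnt/backups")     # assumed offsite/offline backup mount
RETENTION_DAYS = 14

def run_backup() -> Path:
    """Write a timestamped .tar.gz archive of SOURCE into DEST."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = shutil.make_archive(str(DEST / f"data-{stamp}"), "gztar", root_dir=SOURCE)
    return Path(archive)

def prune_old_backups() -> None:
    """Delete archives older than the retention window."""
    cutoff = time.time() - RETENTION_DAYS * 86400
    for f in DEST.glob("data-*.tar.gz"):
        if f.stat().st_mtime < cutoff:
            f.unlink()

if __name__ == "__main__":
    print("wrote", run_backup())
    prune_old_backups()
```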

Included in the report is a checklist of actions companies can take to minimize their risk against these attacks and to respond promptly and thoroughly should a cyber breach occur. Topping the list is increasing awareness of cybersecurity issues through training and education. In addition, the report lists six other core steps most businesses should take to prepare for an incident and mitigate risk.

“It’s no longer a question of which industries are most at risk. All industries are faced with the task of managing dynamic data security risks. Even companies in the retail, restaurant and hospitality industries, while highly regulated, had the fourth-highest rate of data security incidents,” Kobus added.

Key statistics from BakerHostetler’s 2017 Data Security Incident Response Report:

  • Incident causes: Phishing/hacking/malware 43%, employee action/mistake 32%, lost/stolen device or records 18%, other criminal acts 4%, internal theft 3%.
  • Industries affected: Health care 35%, finance and insurance 16%, education 14%, retail/restaurant/hospitality 13%, other 9%, business and professional services 8%, and government 5%.
  • Company size by revenue: Less than $100 million 39%, between $100 million and $500 million 33%, $500 million to $1 billion 17%, and greater than $1 billion 11%.
  • Most breaches discovered internally: 64% of breaches were internally discovered (and self-reported) compared with 36% that were externally discovered. In 2015, only 52% of incidents were self-reported.
  • Incident response timeline: On average 61 days from occurrence to discovery; eight days from discovery to containment; 40 days from engagement of forensics until investigation is complete; 41 days from discovery to notification.
  • Notifications and lawsuits filed: In 257 incidents where notification to individuals was given, only nine lawsuits were filed. This is partially explained by companies being prepared to better manage incidents.
  • No notification required: 44% of incidents covered by the report required no notification to individuals — similar to 2015 results.
  • Average size of notification: Incidents in the retail/restaurant/hospitality industry had the highest average notification at 297,000, followed by government at 134,000 and healthcare at 61,000. All other industries had less than 10,000 notifications per incident.
  • Forensic investigation costs: The average total cost of forensic investigations in 2016 was $62,290, with the highest costs in excess of $750,000.
  • Health care: The number of incidents rose last year, but the average size of the incidents decreased. Of the incidents analyzed by the BakerHostetler report, 35% were in healthcare, yet the average size of the incident notification was 61,000 — only the third highest of all industries surveyed.
  • Triggering state breach notification laws: Just over half of cyber incidents last year (55%) were subject to state breach notification statutes, down slightly from the year prior. Of the incidents where notification was required, the highest percentages were those involving Social Security numbers (43%) and health care information (37%). Only 12% of cases involved payment card data.
  • Active state attorneys general: AGs made inquiries after notifications were made in 29% of incidents, although overall regulatory investigations and inquiries were down to 11% in 2016 from 24% in 2015, and litigation was down to 3% last year compared with 6% the prior year.

Back to the basics

The first line of defense in protecting a company’s data and reputation during a cybersecurity incident is to outfit the organization with baseline procedures and processes to reduce the company’s risk profile. By focusing on key areas like employee awareness and education, companies can help prevent incidents while laying the groundwork for a successful response and reducing the likelihood events will be severe should they happen.

“Employees are often cited as a company’s greatest asset. In the cybersecurity arena, they can also be a liability. The report’s numbers reinforce the ongoing need to focus on effective employee awareness and training. They also show that a defense-in-depth approach is necessary, because even well-trained employees can make mistakes or be tricked,” said Kobus.

The full 2017 BakerHostetler Data Security Incident Response Report can be found here. The Privacy and Data Protection team will host a webinar on the findings on May 9 at noon ET. Kobus also will be participating in a morning panel titled, “Shakedown Street: Cyber Extortion, Data Breach and the Dirty Business of Bitcoin” on April 20 at the Global Privacy Summit in Washington, D.C.

Source: CloudStrategyMag

Unitas Global And Canonical Partner

Unitas Global and Canonical have announced they will provide a new fully-managed and hosted OpenStack private cloud to enterprise clients around the world.

This partnership, developed in response to growing enterprise demand to consume open source infrastructure, OpenStack and Kubernetes without the need to build in-house development or operations capabilities, will enable enterprise organizations to focus on strategic Digital Transformation initiatives rather than day-to-day infrastructure management.

This partnership, along with Unitas Global’s large ecosystem of system integrators and partners, will enable customers to choose an end-to-end infrastructure solution to design, build, and integrate custom private cloud infrastructure based on OpenStack. It can then be delivered as a fully-managed solution anywhere in the world, allowing organizations to easily consume the private cloud resources they need without building and operating the cloud themselves.

Private cloud solutions provide predictable performance, security and the ability to customize the underlying infrastructure. This new joint offering combines Canonical’s powerful automated deployment software and infrastructure operations with Unitas Global’s infrastructure and guest-level managed services in data centers globally.
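To give a sense of what consuming a managed OpenStack private cloud looks like from the client side, here is a minimal sketch using the standard openstacksdk library. The cloud name, image, flavor, and network are placeholders, and nothing here is specific to Unitas Global's or Canonical's tooling.

```python
# Minimal sketch: consuming a managed OpenStack private cloud with the
# standard openstacksdk client. The cloud name, image, flavor, and network
# below are placeholders; this is generic OpenStack usage, not the
# Unitas Global/Canonical offering itself.

import openstack

# Credentials and endpoint come from a clouds.yaml entry named "managed-private".
conn = openstack.connect(cloud="managed-private")

# List what is already running in the tenant.
for server in conn.compute.servers():
    print(server.name, server.status)

# Boot a new instance from an assumed image and flavor on an assumed network.
image = conn.compute.find_image("ubuntu-22.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")
server = conn.compute.create_server(
    name="demo-vm",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
conn.compute.wait_for_server(server)
```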

“Canonical and Unitas Global combine automated, customizable OpenStack software alongside fully-managed private cloud infrastructure, providing enterprise clients with a simplified approach to cloud integration throughout their business environment,” explains Grant Kirkwood, CTO and founder, Unitas Global. “We are very excited to partner with Canonical to bring this much-needed solution to market, enabling enhanced growth and success for our clients around the world.”

“By partnering with Unitas Global, we are able to deliver a flexible and affordable solution for enterprise cloud integration utilizing cutting-edge software built on fully-managed infrastructure,” comments Arturo Suarez, BootStack product manager, Canonical. “At Canonical, it is our mission to drive technological innovation throughout the enterprise marketplace by making flexible, open source software available for simplified consumption wherever needed, and we are looking forward to working side-by-side with Unitas Global to deliver upon this promise.”

Source: CloudStrategyMag

Online Tech Acquires Echo Cloud

Online Tech has announced it has acquired Echo Cloud, an enterprise cloud company based in Kansas City, MO. The acquisition is set to boost Online Tech’s reach in the Midwest market and provide further geographical diversity to its growing cloud infrastructure.

Yan Ness, CEO of Online Tech, said the acquisition provides several benefits for his company, including an expanded product line and geographic reach to the Kansas City and Missouri markets. “I’m very pleased with this deal,” Ness said. “We really like the Kansas City market. There’s lots of demand that we think is underserved, and this is a great opportunity to provide companies in the area with our secure, compliant hybrid IT services.”

Additional benefits include adding two Kansas City data centers and extra cloud infrastructure to Online Tech’s existing data centers across Michigan and Indiana.

Bill Severn, CEO of Echo Cloud, is equally enthusiastic for the two companies to come together. “Echo Cloud is extremely excited to be joining Online Tech,” he said. “I believe the values we hold match Online Tech’s very well, and I think this will be a great partnership moving forward.”

The companies will combine their existing services into Online Tech’s client portal to allow for easy account viewing and service management. Echo Cloud will also offer its existing clients new Online Tech services that comply with standards such as PCI and HIPAA. “Online Tech has been a leader in compliance for many years,” Severn said. “I’m pleased we can now offer HIPAA- or PCI-compliant data hosting to our existing customers here in Kansas City.”

Source: CloudStrategyMag

NTT Communications Extends Multi-Cloud Connect To Oracle Cloud

NTT Communications Corporation (NTT Com) has announced the extension of its Multi-Cloud Connect service to Oracle Cloud, helping multinational customers take advantage of the performance, cost, and innovation benefits of the cloud.

While enterprises understand the promise and many benefits of the cloud, most experience issues such as latency, packet loss, and security threats because connectivity to cloud services still depends heavily on the public Internet. With Multi-Cloud Connect, Oracle Cloud users will be able to leverage NTT Com’s secure, reliable, high-performing MPLS network to access their business-critical applications.

Multi-Cloud Connect will connect directly to Oracle Cloud’s platform through Oracle Network Cloud Service FastConnect, enabling private connections to its broad portfolio of services: platform as a service (PaaS) and infrastructure as a service (IaaS). This includes middleware such as “Oracle Database Cloud Service” and “Oracle Java Cloud Service,” as well as integration and business analytics features. Furthermore, NTT Com and Oracle will enable hybrid deployment of Oracle Cloud and Oracle software hosted on premises or on “Oracle Cloud at Customer,” under one global network.

Source: CloudStrategyMag

IDG Contributor Network: 3 reasons why data scientist remains the top job in America

Glassdoor recently revealed its report highlighting the 50 best jobs in America, and unsurprisingly, data scientist claimed the top spot for the second year in a row. Every year, the jobs site releases this report based on each job’s overall “Glassdoor Job Score.” The score is determined by three key factors: the number of job openings, the job satisfaction rating, and the median annual base salary.

With a job score of 4.8 out of 5, a job satisfaction score of 4.4 out of 5, and a median base salary of $110,000, data scientist jobs came in first, followed by other technology jobs, such as data engineers and DevOps engineers.

In fact, data-related roles are dominating similar jobs reports released over the past year as well. A new study by CareerCast.com revealed data scientist jobs have the best growth potential over the next seven years, as they are one of the toughest jobs to fill. Statistics from rjmetrics.com show that there were anywhere from 11,400 to 19,400 data scientists in 2015, and over 50% of those roles were filled in the last four years.

A quick search for data scientist jobs in the United States on LinkedIn reveals over 13,700 open positions. Additionally, this job trends tool by Indeed, which showcases the demand for data scientists, reveals that both data scientist job listings and job seeker interest are showing no signs of slowing down.

It’s estimated there will be one million more computing jobs than employees to fill those computing jobs in the next ten years, according to Computer Science Zone. So how did the role of the data scientist rise to the top of the rankings? Let’s examine a few of the reasons and trends that led the data scientist position to claim the number one spot for the best job in America again this year.

Reason #1: There’s a shortage of talent

Individuals with skills in statistics and analytics are highly sought after, and those with the soft skills to match are in even greater demand. Business leaders are after professionals who can not only understand the numbers but also communicate their findings effectively. Because there is still such a shortage of talent combining these two skill sets, salaries for data scientists are projected to grow more than 6% this year alone.

So where are all the data scientists to fill these jobs? The main answer is that they aren’t trained yet. While computer science programs are on the rise, it will still take some time for supply to catch up with demand. Big data and analytics courses have started making their way into the classroom only in the past couple of years, so addressing the data science talent shortage won’t happen overnight. The number of job openings will certainly continue to outweigh the number of professionals with a sophisticated understanding of data and analysis over the next couple of years.

Reason #2: Organizations continue to face enormous challenges in organizing data

The role of the data scientist is evolving, and organizations desperately need professionals who can take on data organizing as well as preparing data for analysis. Data wrangling, or cleaning data and connecting tools to get the data into a usable format, is still highly in demand.

Data preparation may require many steps, from translating specific system codes into usable data to handling incomplete or erroneous data, but the costs of bad data are high. Some research shows that analyzing bad data can cost a typical organization more than $13 million every year.
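A small pandas sketch of the two preparation steps just mentioned, translating system codes into usable labels and handling incomplete or erroneous values; the column names and code table are invented for illustration.

```python
# Illustrative data-wrangling sketch: map opaque system codes to labels and
# handle incomplete or erroneous values. Column names and the code table are
# invented for the example.

import pandas as pd

raw = pd.DataFrame({
    "status_code": ["01", "02", "01", "99", None],
    "revenue": [120.0, -5.0, 310.5, None, 87.2],   # -5.0 and None are bad values
})

STATUS_LABELS = {"01": "active", "02": "churned"}  # assumed code table

clean = (
    raw.assign(status=raw["status_code"].map(STATUS_LABELS))       # unknown codes -> NaN
       .assign(revenue=raw["revenue"].where(raw["revenue"] >= 0))  # drop negative revenue
       .dropna(subset=["status"])                                  # discard unmapped rows
)
clean["revenue"] = clean["revenue"].fillna(clean["revenue"].median())  # impute gaps
print(clean)
```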

Therefore, there will always be a demand for individuals who can weed out bad data that can alter results or lead to inaccurate insights for an organization. There’s no doubt it’s time-consuming work. In fact, data preparation accounts for about 80% of the work of data scientists. But even with the increased availability of highly sophisticated analytics dashboards and data collection tools, there will always be a demand for professionals who possess the advanced skill sets needed to clean and organize data before being able to extract valuable insights from it.

Reason #3: The need for data scientists is no longer restricted to tech giants

The demand for data scientists has finally pushed beyond large technology firms, such as Google or Facebook, as smaller organizations realize that they too can use data to make better, more informed decisions. This HBR feature on big data reported that “companies in the top third of their industry in the use of data-driven decision making were, on average, 5% more productive and 6% more profitable than their competitors.”

While small-to-medium sized organizations are not churning out nearly as much data as larger enterprises, sifting through that data to extract meaningful insights into their businesses can be a powerful competitive advantage nonetheless.

We’re also seeing entry-level data scientists flock towards startups and smaller firms because of the perception that they will be able to tackle higher-level work earlier in their careers. Data scientists possess a broad range of skills, and they want to be able to put all of those skills to use right away.

Smaller firms are also hiring fast. Large organizations looking to recruit entry-level data scientists are taking note that their multistep, legacy hiring and recruiting processes may need some updating if they are going to attract the top talent that they desire. So for now, as the demand for data professionals continues to surge, agile organizations continue to be the more favorable choice for data scientists, regardless of their size.

How to get into the field

The demand for data scientists is high, and professionals can enter the world of data science a number of ways. University programs are a good start, but a data science position often requires a mixture of skills that many schools are unable to package all together.

One way to develop all of the necessary skills is by attending a data science boot camp. Not only will you learn the analytical skills required for a data science position but you’ll also receive training for the softer skills that are becoming more and more common in data science roles – skills such as managing projects and teams across multiple departments, consulting with clients, assisting with business development, and taking abstract business issues and turning them into analytical solutions.

So if you’re still deciding on the right career path, or thinking about making a career change in 2017, consider exploring what it takes to be a data scientist, one of the fastest-growing and highest-paid jobs in America right now.

This article is published as part of the IDG Contributor Network. Want to Join?

Source: InfoWorld Big Data

Faction® Receives New Patent

Faction® has announced that the U.S. Patent and Trademark Office (USPTO) has granted Faction a new US Patent #9,621,374, which extends the functionality of hybrid and multi-cloud to new protocols including Virtual Extensible LAN (VXLAN) and Software Defined Networking (SDN). The new patent further reinforces Faction’s place at the forefront of hybrid and multi-cloud solutions. 

Faction’s new patent allows customers to leverage two key innovations in the networking and cloud arenas: VXLAN and SDN. VXLAN technology is attractive to enterprises as it works to improve scalability challenges associated with large cloud deployments. SDN technology enables networks to add agility and flexibility, while allowing administrators to respond quickly through a centralized control location.
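To make the two technologies a bit more concrete, here is a hedged sketch of creating a VXLAN overlay segment on a Linux host with the standard iproute2 tool; the interface names, VXLAN ID, and multicast group are placeholders, and this is generic Linux networking rather than Faction's patented method.

```python
# Generic illustration of VXLAN (not Faction's patented implementation):
# create a layer-2 overlay segment on a Linux host using iproute2 commands.
# Interface names, the VXLAN ID, and the multicast group are placeholders.

import subprocess

def run(cmd: str) -> None:
    print("+", cmd)
    subprocess.run(cmd.split(), check=True)

# Encapsulate tenant traffic in VXLAN ID 100 over the physical uplink eth0.
run("ip link add vxlan100 type vxlan id 100 dev eth0 dstport 4789 group 239.1.1.1")
run("ip link set vxlan100 up")

# Attach the overlay to a bridge so local VMs or containers can join the segment.
run("ip link add br-tenant type bridge")
run("ip link set vxlan100 master br-tenant")
run("ip link set br-tenant up")
```

An SDN controller would then program forwarding for segments like this from a central control point instead of configuring each host by hand.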

The new patent comes on the heels of Faction’s recently announced USPTO Patent #9,571,301 for the company’s pioneering work on hybrid and multi-cloud networking, which allows users to tap into the best features of private and public clouds to create one unified, optimized cloud. Enterprises and service providers using hybrid and multi-cloud topologies benefit from establishing a seamless extension of infrastructure to and between public clouds. These networks are especially useful in data center configurations to connect physical resources to one or more cloud providers.

“Our newest patent further validates Faction’s intellectual property leadership in hybrid and multi-cloud solutions,” states Luke Norris, CEO and founder, Faction. “By choosing Faction cloud, enterprises and service providers can take advantage of the many benefits inherent in combining the best features of private and public clouds – truly creating a cutting-edge approach to IT cloud transformation. Our patent leadership also paves the way for clients to leverage future cloud capabilities and allows Faction to confidently meet the sharp increase in demand we are seeing from enterprises seeking to establish hybrid and multi-cloud cloud strategies.”

Source: CloudStrategyMag

3W Infra Migrates Its IaaS Infrastructure To New Data Center

3W Infra has announced the migration of its complete IaaS infrastructure to a newly commissioned, high-redundancy data center in Amsterdam, the Netherlands. The migration to this new data hall is part of an infrastructural upgrade to support 3W Infra’s rapid customer growth.

3W Infra has migrated its IaaS infrastructure including dedicated servers and network infrastructure to a new data hall within Switch Datacenters’ Amsterdam-based data center complex, Switch AMS1. This new data center features a highly energy-efficient data center design through the use of indirect adiabatic cooling technology and hot aisle containment (calculated pPUE = 1.04). Its efficiency and highly redundant 2N power configuration would cater to the uptime needs of customers with demanding applications including enterprises, cloud providers, gaming companies, and financials.

Designed for flexibility and scalability, this new data center hall in Switch AMS1 offers an extended capacity of about 400 data center racks and enables 3W Infra to offer companies considerable room for growth. Its scalable power modules, which start at 5 Ampere per cabinet and go up to 25 Ampere for high-density requirements (scalable in steps of 5 Ampere), are aimed at a wide range of customer types, from startups to enterprises and companies with demanding applications.

3W Infra expects to complete its phased data center infrastructure migration at the end of April 2017.

ISO 27001, 100% Uptime Guarantee

“The 2N power configuration gives 3W Infra customers a robust 100% uptime guarantee instead of the 99.999% we had before,” said Roy Premchand, managing director, 3W Infra. “The easily scalable power modules available onsite allow our clients to grow their power infrastructure in a cost-efficient way. They can start with 5 Ampere and grow their infrastructure with 5 Ampere each time they need to add more power for their equipment.”

“Power, cabling, security…really all data center infrastructure included in this newly built data hall is very robust,” added Premchand. “The robustness and high security features will help us meet ISO 27001 requirements, as we’re currently aiming to achieve ISO/IEC 27001 certification for Information Security Management.”

OCP Specifications

The newly commissioned data hall in Amsterdam, Switch AMS1, is one of the first colocation data centers in Europe suitable for Open Rack Systems based on OCP principles. The Open Compute Project (OCP) originated with Facebook, and companies like Intel and Microsoft joined the OCP at an early stage. A variety of industry giants joined the OCP later on, including Google, Cisco, IBM, NetApp, Lenovo, Ericsson, and AT&T. This makes Switch AMS1 a modern facility well suited to housing OCP-specified equipment based on open standards. Its infrastructural efficiency is also a good fit for 3W Infra’s ‘pure-play’ IaaS business, which delivers traditional, highly customized dedicated server technology, colocation services, and IP Transit.

Fast Growing Company 

The announcement follows the news that 3W Infra published the results of its latest server count. 3W Infra now has 4,000 dedicated servers under management, 1,000 more than half a year ago. Although quite a young company (founded in 2014), 3W Infra has been able to show significant growth in the second half of 2016.

According to Premchand, 3W Infra’s rapid growth has been driven by the company’s ‘pure-play’ IaaS hosting approach, in which cloud delivery is left to customers, along with its high-volume, transit-only network with global reach and its ability to deliver highly customized IT infrastructures.

“We expect to continue our exponential growth rate, as we have quite some large sales opportunities in our sales pipeline,” added Premchand. “A variety of potential customers has shown interest in the new data hall already, companies with extensive and complex infrastructures I must say, but they were waiting until we have finished our migration processes.”

Source: CloudStrategyMag

Telehouse Launches Advanced Cloud Interconnection Solution In The U.S.

TELEHOUSE has announced the launch of Telehouse Cloud Link in the United States. Currently available in the EMEA region, Telehouse Cloud Link is a multi-cloud connectivity exchange that allows enterprises to manage access to multiple cloud services through a single, secure and dedicated private connection.

Telehouse Cloud Link helps customers simplify their hybrid cloud infrastructure and accelerate data transfer between their network and cloud services by establishing direct, private connections to multiple Cloud Service Providers (CSPs), including Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform, as well as network on-demand including TELEHOUSE’s own NYIIX and LAIIX.

According to the Cisco Global Cloud Index, cloud data center traffic associated with cloud consumer and business applications is growing at 30% CAGR, and global cloud IP traffic is expected to account for more than 92% of total data center traffic by 2020.  As an increasing number of enterprises utilize cloud services to perform business-critical operations, data centers must enable fast and cost-effective access to key CSPs.
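As a quick sanity check on what a 30% CAGR implies, the short calculation below compounds that rate over a few years; only the growth rate comes from the cited index, and the normalized starting value of 1.0 is an arbitrary assumption.

```python
# Quick worked example: what 30% compound annual growth implies.
# The 30% CAGR comes from the cited Cisco index; the base value of 1.0
# (normalized traffic in year zero) is an arbitrary assumption.

CAGR = 0.30

def projected(base: float, years: int, cagr: float = CAGR) -> float:
    return base * (1 + cagr) ** years

for yr in range(1, 6):
    print(yr, round(projected(1.0, yr), 2))
# After 5 years, traffic is roughly 3.7x the base (1.3 ** 5 ≈ 3.71).
```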

“Cloud computing has become a necessary requirement for many of today’s enterprises,” explains Aki Yamaguchi, COO, Telehouse America.  “After its success in Europe, we’re very excited about the launch of Telehouse Cloud Link in the U.S., as it enables fast, easy and secure cloud connectivity for all of our customers.”

Source: CloudStrategyMag

WPS Expands WPS Office

WPS Office Software has announced that it has expanded WPS Office with WPS Cloud, enabling users to work faster, more flexibly, and more productively with added storage, file roaming, and sharing capabilities.

In a report by Allied Research, cloud adoption continues to soar, with the personal cloud market in particular expected to reach $89.9 billion, globally, by 2020. In line with this trend, WPS Cloud is designed to work with WPS Office as a companion, offering users the benefits of cloud computing such as flexibility and collaboration.

WPS Cloud is a complete document storage and management service, enabling users to view and edit files anywhere, at any time. Automatic backup and link-sharing tools support real-time collaboration across multiple devices and platforms. Files are protected from loss with enterprise-level data security and multiple backups.

Users can access files either directly from within WPS Office, or through the WPS Cloud web portal. Specific features include:

  • Free storage space: Users receive 1GB of free storage space on WPS Cloud, where they can store files and rich media.
  • File roaming and cross device support: File roaming allows users to open or create documents in WPS Office, which are then automatically saved to WPS Cloud. Recent files are shown and may be accessed across all connected devices including Windows PCs, Android or iOS mobile devices, or through a web browser.
  • Easy one-click file sharing: Once a document is uploaded to WPS Cloud, the user can create a unique URL that can be shared with anyone to view and download the file.

“Users are increasingly adopting and using apps and documents in the cloud, and growing accustomed to accessing content across devices regardless of their environment,” said Cole Armstrong, senior director of marketing, WPS Office Software. “More and more, they want equal access to office software whether on a mobile device or traditional PC. We’re pleased to offer WPS Cloud to address this trend as we continue to look for ways to improve our users’ computing experience and productivity.”

WPS Office is a full office software suite for desktop and mobile devices that provides a high-performing yet considerably more affordable solution for consumers and prosumers and a preferred alternative to Microsoft Office. Users prefer WPS Office over software like Microsoft Office, citing that WPS is more reliable, efficient, and faster. Unique features enable users to create, view, edit, store, and share documents for greater productivity from their office software. WPS Writer, Presentation, Spreadsheets, and PDF Reader/Converter are fully compatible with other office software solutions, and are the ideal free office software alternative for Microsoft Office users whose trials have ended.

Source: CloudStrategyMag