Report: Asia, North America to Lead 5G Adoption through 2021

5G may eventually underpin huge amounts of communication in support of the Internet of Things (IoT), but tech consultancy Ovum said it will initially be used to enhance mobile broadband services, reaching 24 million subscriptions worldwide in 2021. The company’s inaugural 5G Subscription Forecast predicts that significant immediate adoption of 5G in North America and Asia will drive global subscription numbers, with each accounting for 40 percent of the market in 2021.

5G is not scheduled to launch officially until 2020, so commercial services using networks and devices fully compliant with 5G are still a few years away, though Ovum notes that a number of operators have announced plans to launch services marketed as 5G sooner.

READ MORE: IoT to Drive Next Wave of Connected Devices: Report

“The main use case for 5G through 2021 will be enhanced mobile broadband services, although fixed broadband services will also be supported, especially in the US,” said Mike Roberts, Ovum Practice Leader covering carrier strategy and technology. “Over time 5G will support a host of use cases including Internet of Things and mission-critical communications, but Ovum does not believe those use cases will be supported by standardized 5G services through 2021.”

The forecast suggests that 5G will be available in 20 national markets by the end of 2021, spanning all four of the major regions into which the company divides the world. While North America and Asia will each have nearly 10 million 5G subscribers at that point, Europe and the Middle East and Africa will account for 10 percent each, with close to 2.5 million apiece. Major operators in the US, South Korea, China, and Japan have announced plans to move aggressively on 5G launches, which significantly influences the forecast.

China’s Huawei, for instance, began announcing major 5G plans in April 2015.
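As a quick sanity check, Ovum's regional subscriber counts can be reproduced from its 24 million global figure and the regional shares cited above; this is a back-of-envelope sketch using only the forecast's own numbers.

```python
# Reproduce Ovum's regional 5G subscriber split from its 2021 forecast.
total = 24e6  # forecast global 5G subscriptions in 2021
shares = {"North America": 0.40, "Asia": 0.40,
          "Europe": 0.10, "Middle East & Africa": 0.10}

for region, share in shares.items():
    print(f"{region}: {share * total / 1e6:.1f}M subscriptions")
# North America and Asia come out to 9.6M each ("nearly 10 million"),
# Europe and MEA to 2.4M each ("close to 2.5 million").
assert abs(sum(shares.values()) - 1.0) < 1e-9  # shares cover the whole market
```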

Ovum defines a 5G subscription as applying to both the connection and the device, and 5G as a system based on 3GPP 5G standards. These will only begin to become available with 3GPP Release 15, which is slated to be finalized in 2018.

The company considers all of its subscription figures conservative, though it notes that uncertainty goes along with predicting technology several years away.

“5G is at an early stage and there is a high degree of uncertainty around 5G deployment and adoption, including significant upside and downside risks,” Roberts added.

Ericsson called for “global spectrum harmonization to secure early 5G deployments” in a recent report.

Source: TheWHIR

Newsmakers: Q&A with DreamHost VP of Cloud and Development

Jonathan LaCour, DreamHost VP of Cloud and Development

Welcome to Penton Technology Newsmakers, a monthly series of informative one-on-one interviews with industry experts. For this month’s edition, we talked to DreamHost VP of cloud and development Jonathan LaCour. Since joining the company in 2011, LaCour has been instrumental in building out DreamHost’s cloud services and its development organization. This Q&A is a must-read if you’re looking at building out an in-house dev team, considering whether to build or buy when it comes to cloud, or wanting an insider’s view of the cloud market over the next 12-18 months.

The WHIR: Can you give us a little bit of a background on yourself and tell us a bit about what you do at DreamHost?

Jonathan LaCour: I started out in the software industry on the technical side of things. I was actually a developer, in software engineering; I started in the very early days of software-as-a-service doing medical software, and then I started doing some startups. I founded a cloud-based, web 2.0 solution for business people, kind of like a Salesforce for the little industry I was in, and I was the CTO there. I developed a lot of skills in how to operate a business above and beyond just doing the technical stuff, making sure that we could create a sustainable company. I went from doing purely development to product and also operations, deploying our products to early cloud solutions and growing the customer base. I sold that product and the company back in 2010 to a company here in Los Angeles, where I served as vice president of software product, so I was kind of in charge of product management and development, bridging into operations as well, for all of their web-based software.

From there I departed and came here to DreamHost. The guy who led the acquisition of my company, Simon Anderson, became CEO here at DreamHost, and so I joined to run DreamHost’s product and development team. I built out a product management discipline that didn’t exist and started bringing some discipline, organization, and efficiencies to the development team. That was a great experience, and I took over the cloud business unit in late 2014. That was basically our first experiment with branching off part of the company to be more of a vertically integrated business unit: putting together everything from the very base layer of operations and engineering up through software development, product management, sales, marketing, and finance. That has also been a great experience, and now we have finished going through a reorganization of DreamHost where we’ve done that with the entire business. We have three discrete business units now, and I run the cloud one.

WHIR: Sounds like you’ve been busy. In your day-to-day role what does your job look like? What’s kind of a typical day in your life?

LaCour: Yes, I have been very busy [laughs]. DreamHost has been around for 17-18 years now, and our primary business has always been web hosting: VPS, shared, and dedicated, the standard set of services that people in hosting have come to know and love. My part of the business, you can think of it as the emerging technologies, the new platforms and new business. Cloud services are really focused on people who are interested in deploying applications to infrastructure. It’s more in line with the Amazon Web Services of the world, although we don’t view them as a competitor. We’ve got two products, DreamObjects and DreamCompute, which are basically cloud storage and cloud computing, respectively. Basically, they give people like developers the ability to create virtual servers and virtual networks and to store lots and lots of data in our cloud storage platform, so it really targets developers, more technical folks, people who maybe have an existing application running on premise or in another cloud and are looking to learn about cloud or wanting to save money.

READ MORE: DreamHost’s Open Source Public Cloud DreamCompute Comes Out of Beta

My day-to-day really is all about continuing to iterate on those products, make them better, and get them out to a broader audience and grow the base. Right now I represent a very small chunk of the total user base because my products have not been in market for very long. Any given week I spend a significant amount of time doing strategy, figuring out what we’re going to work on next, and then a lot of blocking and tackling in terms of execution. Whatever we’re currently working on, making sure we’re tracking and making sure we’re executing against our plan. I’m pretty hands on when it comes to that, especially on the technical side of things: understanding where the software development is at, and understanding how the deployment is going. There’s also a lot of finance work: time spent assessing the margin on the product, price points, packaging, ensuring that we are only building out just ahead of capacity so we maximize the profit we can get from our services.

I spend a significant amount of my time surveying the landscape and the marketplace and understanding what the industry looks like right now when it comes to cloud services. In the web hosting industry a lot of the other players are starting to dip their toe in the water. We also have a lot of emergent players who have burst onto the scene in the last couple of years selling just unmanaged virtual servers, and they have done really well. So I spend a lot of time looking at the marketplace and feeding that information back into the product.

WHIR: Speaking about the marketplace, maybe you can give me an overview of where you see the hosting industry at this point and then looking forward 12-18 months from now?

LaCour: We’ve just gone through a huge exercise doing this exact thing; really digging in and seeing where the industry is and where we want to go. What we’ve concluded is that if you look at the standard shared, managed, and VPS hosting products we have all seen for the past decade, the products that provided all the growth up until this point, that growth has been slowing for basically everyone, industry-wide. Why is that? Well, there are a number of reasons, but I think the primary reason is that hosting is a platform play. You aren’t signing up for a solution to your problem; you’re signing up for something that can be a building block toward a solution to your problem. Say you sign up for managed hosting because your problem is ‘I want a website’: you still have to find a content management system. So you may one-click install WordPress; our platforms are great for that. We provide people with lots of choice and lots of capabilities, and for a very low cost they have something they can put lots and lots of domains on and solve lots and lots of problems, but the actual solving of that problem is very much in their hands. What we’re finding now is that the middle-of-the-road option, for people who want a lot of control but need some help, is being squeezed. People are moving up the stack to more application-level, software-as-a-service type things: site-builder services like the Squarespaces, Weeblys, and Wixes of the world. Someone says ‘I want a website’ instead of ‘I want shared hosting,’ and they get the ability to create a website easily. They’re not looking for the platform; they’re looking for the actual solution to their problem.

That’s one way the industry is shifting. For the people who want more control, they actually want more and more control, and more capabilities and power, and so they’re shifting to the cloud. Instead of signing up for something that’s middle of the road, and getting limited technical capability, they want to pick which version of Linux is running on their server, and they want to have their own server, or ten servers, and they want to be able to spin them up and down at will, and really kind of embrace the technology. And they don’t want anyone messing with their stuff; they don’t want to be on a platform that’s intermingled with a bunch of other people.

Cloud is becoming bigger and bigger, and that’s why you’re seeing companies like DigitalOcean and Linode emerge: companies that provide more technical platforms aimed at IT managers, developers, the highly technical who want to have all that control and don’t want the platform. So the core product the hosting industry has had for years is being siphoned away simultaneously by things that are more technical and things that are less technical. People either want to build it all themselves without your help or want to have it all built for them and get on with their lives.

In 12-18 months I see a continued flattening of growth in traditional managed hosting and a continued rise in software-as-a-service platforms and in cloud.

WHIR: How do you see DreamHost’s position in the market? You didn’t always go after developers, and it sounds like you’re starting to go after more of those technical users. How much of a challenge was it to focus on that group [developers]?

LaCour: In terms of being focused on that group, it wasn’t too difficult for us to make the leap because we’re a bunch of technical users ourselves, so in many ways it’s serving our own needs. I think the big difference between us and a DigitalOcean type, even though in many ways it appears we’re doing the same sort of thing, is that it is not just a product we’re developing to sell to our customers; it’s a product we’re developing to sell to ourselves. We have tons of expertise selling open source apps to people, notably WordPress. We are great WordPress hosts, probably the best in the market. And we have this infrastructure that we’ve been building that underpins our software as a service and our other products.

We actually developed our cloud to go after developers but also to provide an infrastructure for our own use. That’s a wonderful thing, because we have a customer in the building that we can talk to at all times and ask if we’re serving them well. While we’ve been in market purely selling our infrastructure to developers for a bit now, we are just starting to use it extensively ourselves internally. So the shift wasn’t really all that difficult as a mental shift, but the technology was big, heavy lifting to get good enough for us to run our own services on.

In terms of the people we’re going after I’ll say that while developers are certainly part of the target I’ll also note that we’ve been attracting customers such as little startups that have been running their software on dedicated racks or colo or something of that nature and they’re looking for someone like them who can understand them and who embraces open source platforms – they want more of what cloud can give them.

WHIR: When you set out to create this cloud business unit and brought forward some of the experience you had on the product management side of things, what were some of the challenges that you faced?

LaCour: Back in 2014, the reason for doing that was frankly that building out these technology platforms was very heavy lifting. We chose to build our cloud storage and cloud computing platforms on top of open source software, including Ceph, which is a project we created here at DreamHost, and OpenStack, which is a huge, successful open source project. That said, both of those projects are very difficult to deploy, manage, maintain, and run. It was a huge ordeal to go through the process of building out these new platforms. The effort in 2014 was really to free me up to focus fully on getting those products finished and out the door. That was job number one.

It took the better part of a year, from 2014 to 2015, to really get that done. We got into beta, and that’s when we started learning a bit about the people who were using the software, and that’s when you start to think, ‘OK, what do people want out of this service? Who is the persona, the customer type, that is going to be using this?’ We started to do some assessments and demographics, everything from what browser people were using to age groups and screen resolutions. We started to understand our customer a lot better. What we did was take that data and build out real personas with real names, and now everything we do is really based on satisfying those customers. It’s these external customers, and our internal customers at DreamHost who are building out new services, that we are going to serve. So the biggest challenge was that learning experience: getting things deployed and into the hands of customers and then rapidly figuring out who those customers are and what they’re trying to accomplish. And that’s guiding everything we do from this point forward.

WHIR: You spoke a little bit about building the cloud storage platform and basing a lot of that on Ceph and OpenStack. The conversation in hosting for a long time has always been the build versus buy dilemma. What’s your take on that and how do you think cloud has changed that conversation?

LaCour: DreamHost is kind of a unique bird on this particular question. This has been a longstanding thing at DreamHost: before I even came here, DreamHost always had a strong open source bent and preference, and we have always fancied ourselves engineers.

Over time we’ve developed a lot of software. We have our own control panel; many people out there buy things like cPanel or Plesk, but we never have. We wrote our own control panel; we own our own user experience. We developed our own configuration management, all these technical things on the backend, cobbling together mostly open source tools and so on. I think it’s been both a blessing and a curse for DreamHost to build everything. Sometimes we decide to build when we should have just bought something, but we’ve gotten a lot better at that over the last four or five years especially. As we’ve brought in some new blood, including myself, we’ve done a much better job of looking at things and saying, ‘OK, do we really need to have our own bug tracker that we wrote? Probably not.’ Let’s focus on the things that our customers want and that are going to make us money. We’ve definitely shifted in that regard, but I think DreamHost very much is and always will be a place built on open source software; we take open source software and we make it accessible and great.

WHIR: One thing that has stood out to me about DreamHost is its diversity, and I think that culture is a huge part of your DNA as well. Obviously there are skill shortages in specific areas of development and engineering; can you talk a bit about that and your approach to hiring?

LaCour: It all starts from culture. If I had to name the one thing I am most proud of about being part of DreamHost, it would be our culture. That culture existed before I got here, but since I joined 4.5 years ago, DreamHost’s leadership team actually sat down and decided we wanted to really articulate our values and put them down on paper. We had a focus group, if you will, a group of people from all parts of the company representing our very diverse employee base, and they solidified it and came up with something called the DreamHost Way. We have eight values and one core statement about who we are as a company, which includes things like empower people, give everyone a voice, speak hacker, embrace open source, practice shameless honesty, practice flexibility, be irreverent and fun, and provide superhero service. We actually have those values painted on the walls of all of our offices. We frequently take big decisions to the group, to the company itself; we’ll vote on things from time to time. We try to really involve everyone in the process. That has all now permeated the handbook and the hiring process. When people read about DreamHost and encounter us, we want them to know who we are. [The values] very quickly attract a top-quality person [in the hiring process], someone who wants to be part of something like this, where it’s built on mutual respect and being genuine in who we are.

Being present in open source communities has been a huge thing as well. I’ve hired many people for our cloud team over these three or four years who came out of those open source communities, places where we were already participating and contributing. That resonates strongly with people. Being a part of those things in an honest way is a great way to attract people to join you in working on them as well.

WHIR: Can you share anything about what DreamHost is working on next as you look at the upcoming trends?

LaCour: I think what you’re going to see out of DreamHost in the next 12-18 months is a real embracing of who we are at our core. That’ll show up in terms of embracing open source and empowering people. Following on what we talked about earlier in terms of those market patterns, seeing people move up into software as a service and down into infrastructure, you’ll see us really trying to provide value to our customers along those lines. We’re going to continue to be the best WordPress host out there and I think our customers are going to be really excited to see what we have in the works there. Now that we have this mature and powerful cloud platform some of the things we’re going to be able to do and enable WordPress users to do on top of the DreamHost platform are going to be pretty amazing.

Source: TheWHIR

Amazon Committed to U.K. Data Center Opening Despite Brexit

By Aaron Ricadela

(Bloomberg) — Amazon.com Inc.’s cloud computing division remains “committed” to opening a London data center by early next year, even after the British public’s vote for the U.K. to leave the EU.

It will also offer local customers the option of hosting data in Germany or Ireland, a company executive said Thursday.

“Demand for all our services is growing across all Europe. For us it’s business as usual,” Stephen Orban, head of enterprise strategy at Amazon Web Services, said in an interview at a customer conference Thursday in Frankfurt.

TRENDING: AWS Expansion Just Getting Started, CEO Andy Jassy Says

Britain’s vote last week to break away from the EU has raised concerns about how data-storage regulations in Europe and the U.K. will evolve. Orban confirmed that, despite the uncertainty, Amazon’s planned U.K. data center will still open later this year or early next, as planned.

“We’re watching the situation but I can’t speculate how everything is going to unfold,” Orban said. AWS is discussing the matter with the EU’s Article 29 working group on data protection, he said.

READ MORE: Brexit and Europe: Business as Usual

Britain’s decision to leave the EU could make using the planned London center more complicated for customers if the U.K. adheres to different rules than the rest of Europe for storing and safeguarding data. The Brexit process raises concerns that Europe’s new General Data Protection Regulation, approved in April and set to take effect in 2018, will no longer apply in the U.K. once it leaves the EU. If the U.K. needs to create its own set of protection rules, it could complicate business in Europe for cloud providers such as AWS and Microsoft Corp.’s Azure service, as well as other data businesses.

Customer Considerations

“We were considering using the U.K. one, mostly for U.K. customers,” said Charles Phillips, Chief Executive of Infor, a New York-based business software company that’s one of AWS’s biggest customers. “It’s less likely given what’s going on. I don’t want to rush in there and then have customers tell us to do something different.”

Phillips, a former high-ranking executive at Oracle Corp., said Infor plans to keep hosting British customers’ data from Amazon’s Dublin center, though he had hoped Amazon’s U.K. center would offer faster Web response times.

“Most of our customers we know are OK with Ireland,” he said. “We don’t know if they’re OK with the U.K.”

Orban said European customers that don’t want to tap computing capacity from the upcoming U.K. cluster of data centers could still host their applications in Amazon’s Frankfurt and Dublin locations.

Amazon’s cloud division, which the company says is on track for $10 billion in revenue this year, rents computing capacity and software that businesses can tap online instead of installing and maintaining their own servers. Amazon serves customers from 13 data-center clusters worldwide and plans to open four additional locations in the coming year.

“Everything new we’re building on AWS,” said Eric Bowman, vice-president of engineering at Zalando SE, the fast-growing Berlin-based Web apparel retailer. Each of its 110 developer teams has its own AWS account, which lets the company build new capabilities quickly without getting tangled in one another’s code or waiting weeks for new servers to arrive.

The U.K. cluster would offer EU customers “strong data sovereignty,” Amazon’s chief technology officer, Werner Vogels, said in a blog post last year. That’s an important issue in Europe, where regulations require that data reside on computers in the EU, and many businesses, especially in Germany, insist it stay in their own country.

Source: TheWHIR

Entisys360 Unveils ‘Cloud First’ Strategy

Entisys360, formerly Entisys Solutions and Agile360, has announced the unveiling of its “Cloud First” strategy.

“Entisys360 is known throughout the industry for quickly responding to the evolving needs of our clients, while consistently staying ahead of the technology curve,” said Mike Strohl, CEO, Entisys360. “In becoming a ‘Cloud First’ company, we are again placing ourselves at the forefront of technology innovation — staying one step ahead of our clients by meeting tomorrow’s business needs, today.”

He added, “Advanced virtualization and cloud solutions from best-of-breed vendors, coupled with our innovative solutions architecture and professional services, are also enabling us to drive the adoption of cloud-based infrastructure within our clients’ enterprises, regardless of whether it is a public, private or hybrid configuration.”

Entisys360’s “Cloud First” strategy includes the following areas:

  • Microsoft: Focused on helping customers move workloads to the Microsoft Azure, Office 365, and StorSimple platforms, this includes the development and implementation of private and hybrid cloud configurations where IT infrastructure is housed on-premise and/or in a variety of Microsoft cloud-based platforms.
  • Amazon Web Services (AWS): The AWS platform offers a wide range of core services for cloud-based infrastructures. This technology is used to implement Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), Disaster-Recovery-as-a-Service (DRaaS), Database-as-a-Service (DBaaS), DevOps-as-a-Service and other public and private cloud services. And, through AWS and Entisys360’s professional services, customers can also benefit from instructor-led workshops, migration planning and services, and Enterprise Managed Services (EMS).
  • Entisys360 Desktop-in-the-Cloud: Entisys360 offers a full-featured Desktop-as-a-Service (DaaS) running in the AWS Cloud. Instant-on capabilities mean that clients can sign up and start using the Entisys360 Desktop-in-the-Cloud immediately, regardless of whether they have a single desktop or a network of thousands.
  • Private Cloud Solutions: For customers deploying a combination of private and public cloud solutions within a hybrid model, Entisys360 offers complete solutions for on-premise private cloud configurations, instructor-led workshops, architectural planning and design, implementation, training and support services.
  • As-a-Service Cloud Solutions: Based on the specific business requirements of the client, Entisys360 offers a number of “as-a-service” solutions including PaaS, DRaaS, SaaS, DBaaS and DevOps-as-a-Service. Prior to implementing these solutions, Entisys360’s expert cloud architects and consultants work with each client to evaluate their existing environment and develop strategic plans around when and how to move their enterprise to the cloud.
  • Strategic Planning: Entisys360 solution architects and consultants evaluate the client’s overall business and technology environment, working together to create a vision, strategic roadmaps and provide operational readiness services. These services do not focus on any single cloud-based service, but instead are geared toward optimizing the business by leveraging cloud-based services.

“In becoming a ‘Cloud First’ company, we want to assure our valued clients and partners that nothing has changed in terms of our core offerings,” said Matt General, CTO, Entisys360. “In fact, as a trusted advisor in the deployment of advanced infrastructure, virtualization and cloud solutions, we are in a position where we can identify the best combination of products and services required to solve our clients’ most pressing IT challenges regardless of whether their infrastructure resides on-premise, in a data center or in the cloud.”

Source: CloudStrategyMag

IBM Named Leader In Public Cloud IaaS Adoption By TBR

IBM has announced that it has been named the market leader in public cloud IaaS (Infrastructure as a Service) adoption by leading independent technology market research firm Technology Business Research, Inc. (TBR). In TBR’s semiannual Public Cloud Customer Research survey of 2,169 enterprises, respondents identified IBM as a leader because of its ability to provide customers with hybrid management tools while addressing data location and privacy concerns through a global network of data centers.

“We continue to see strong enterprise adoption of public cloud services being driven by the need for hybrid management tools and solutions,” said Kelsey Mason, analyst at TBR. “The market is also consolidating, which leaves very few players remaining that can address customer demands around global reach and scale, as well as data security and compliance.”

Other TBR evaluation criteria included the ability of IaaS providers to attract developers and build out more data center locations while adding more professional and managed services. Enterprise customers also looked at the scalability and cost of public cloud, which make it a logical deployment model for initiatives around the Internet of Things (IoT) and analytics.

“This report validates a much broader trend where companies are turning to the IBM Cloud for choice and consistency across public, private and hybrid environments,” said Jim Comfort, chief technology officer for IBM Cloud. “We’ve become the backbone and onramp to the cloud, giving enterprise customers the global reach, capacity and scale they need. Our cloud offering bridges the gap between on premise IT and cloud with versatility and ease.”

TBR’s Public Cloud Customer Research report is designed to provide a view into cloud adoption metrics and trends.

Source: CloudStrategyMag

The Post Safe Harbor Era: New Opportunities for Service Providers

At the beginning of June, Reuters reported that, for the first time, fines were imposed on companies in Germany that continue to act as if the Safe Harbor privacy principles were still valid. The US-EU agreement was invalidated by the EU high court in October 2015.

Immediately after the ban, new negotiations began, for obvious reasons. The stakes are very high: the transfer of EU personal data to the US without the explicit consent of every user in each case is prohibited. Every citizen, consumer organization, or regulator can, in principle, start legal proceedings against companies that ignore the ruling.

This truly is a sword of Damocles hanging over the market and nearly everyone in the IT industry wants it removed as quickly as possible.

The complexity is apparent. The public and policy makers focus their attention on companies that have data. The relevance of the current situation to the hosting and data center industry is, however, still underexposed.

Let’s have a closer look at three common issues:

  1. The Infrastructure

Companies that use software or hardware to process personal data should be aware of the “call home” function built into many applications and devices, mainly for maintenance and monitoring. Do they know that during that process, parts of that personal data can be transferred as well?

Even if all the data is stored and processed in the EU, some of it may be transferred to non-EU countries. That possibility concerns just about anything in a server rack: servers, switches, routers, storage.

Who is responsible for that hardware? Some providers and data centers have already received inquiries about this matter; still, not all have answers that will reassure users and end users. Switching off these features is an option. We know that, for example, in the public sector of several EU member states, that option, with off as the default mode, is required for all new appliances.

  2. Cloud Marketplaces

Data centers and large providers increasingly offer connectivity via their platforms with the infrastructure of clouds from third parties. Providers should analyze all components of those marketplaces to find out if there is a possibility of personal data being transferred to non-EU countries.

Sound complicated? You’re right.

  3. Data of Employees

Let’s turn to one detail of those first fines in Germany. The imposed amount, equivalent to $32,000 USD, is not particularly high. What is striking is that the local supervisor (under Germany’s federal structure, each state has its own regulator) found violations of two kinds at Adobe, Punica, and Unilever: customer data and employee data were being exposed.

Most press attention to date has gone to the data of customers and/or website visitors. For all multinationals, the second category is extremely relevant. Some multinationals, even before the October court ruling, decided to abandon the idea of centrally controlled payrolling and the like. They opted instead for decentralized solutions to avoid possible violations of EU legislation. Imagine the costs of that decision and the work that IT has to do to make it happen.

So what does this mean for the daily business of service providers, hosters and data centers in this complex situation?

If you are based in the US, the bad news is that chances are you have end-user data on your systems that your customer is not allowed to store or process. That is the customer’s responsibility, but potentially, their decision to terminate could hurt you. The second point is about the marketplaces that might include services that move or copy data to other regions. You have to be transparent about that, because a misinformed customer could cost you sales.

There is some good news as well: transparency and clear communication are more rewarding than ever.

EU companies are confused by post-Safe Harbor implications and the upcoming GDPR situation and are looking for clear answers. US companies are also looking for future-proof solutions for dealing with customers and situations in the EU.

There are providers, both in the US and the EU, that consider their proven knowledge of this lesser-known data traffic, and their ability to give advice on application and data migration to specific geographical areas, a unique selling proposition.

Each year, HostingCon Europe focuses on the issues, trends, and legislation that affect your business. Attend to get cutting-edge information about changing market conditions and how to navigate challenges in the EU marketplace. Learn about post-Safe Harbor security issues with our panel of experts, including US attorney David Snead and Alban Schmutz, SVP at OVH, the number one hosting provider in Europe.

Source: TheWHIR

Google-Backed FASTER Submarine Cable to Go Live This Week

Brought to you by Data Center Knowledge

FASTER, the Google-backed submarine cable that adds much needed network bandwidth between data centers in the US and data centers in Japan, Taiwan, and the broader Asia-Pacific market, has been completed, about two years after the project was first announced. The cable will start carrying traffic on Thursday, a Google spokesperson said via email.

As more and more data is generated and transferred around the world, demand for connectivity is skyrocketing. There has been an increase in submarine cable construction activity in response, with major internet and cloud services companies like Google, who are the biggest consumers of bandwidth, playing a bigger and bigger role in this industry.

The FASTER system lands in Bandon, Oregon; two cities in Japan, Chikura and Shima; and Taishui, Taiwan, according to TeleGeography’s submarine cable map. The cable landing stations are connected to nearby data centers, from which traffic is carried to other locations in their respective regions.

On the US side, data center providers Telx (owned by Digital Realty Trust), CoreSite, and Equinix have made deals to support the new system. A Telx data center in Hillsboro, Oregon, is connected to the landing station in Bandon. FASTER traffic will be backhauled to Equinix data centers in Silicon Valley, Los Angeles, and Seattle. CoreSite’s big connectivity hub in Los Angeles will also have direct access to the system.

The FASTER submarine cable system lands in the US, Japan, and Taiwan (Source: TeleGeography, Submarine Cable Map)

Google invested $300 million as a member of the consortium of companies that financed the submarine cable’s construction. Other members are China Mobile International, China Telecom Global, Malaysian telco Global Transit Communications, Japanese telco KDDI, and Singaporean telco Singtel.

Both KDDI and Singtel are also major data center services players. Singtel is the biggest data center provider in Singapore, one of Asia’s most important network interconnection hubs, and has a partnership with Aliyun, the cloud services arm of China’s internet giant Alibaba. KDDI subsidiary Telehouse operates data centers throughout Asia, as well as in Europe, the US, and Africa.

The rate of growth in global internet traffic has been breathtaking. Cisco’s latest Global Cloud Index projects the amount of traffic flowing between the world’s data centers and their end users to grow from 3.4 zettabytes in 2014 to 10.4 zettabytes in 2019. It would take the world’s entire 2019 population streaming music for 26 months straight to generate 10.4 zettabytes of traffic, according to Cisco’s analysts.
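Cisco’s streaming-music comparison holds up to back-of-envelope arithmetic. The sketch below assumes roughly 7.7 billion people in 2019 and a 160 kbps music stream; both figures are this sketch’s assumptions rather than inputs from the report, but they land close to the 26-month claim.

```python
# Back-of-envelope check of Cisco's "26 months of streaming music" comparison.
# Population and bitrate are assumptions of this sketch, not Cisco's inputs.
ZETTABYTE = 10 ** 21                      # bytes
traffic = 10.4 * ZETTABYTE                # projected 2019 traffic, in bytes
population = 7.7e9                        # approximate 2019 world population
bytes_per_minute = 160_000 / 8 * 60       # 160 kbps stream = 1.2 MB per minute

minutes_each = traffic / (population * bytes_per_minute)
months_each = minutes_each / (60 * 24 * 30.44)  # 30.44 days per average month
print(f"~{months_each:.0f} months of continuous streaming per person")  # ~26
```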

Learn more: Data Center Network Traffic Four Years From Now: 10 Key Figures

Cloud will be responsible for the majority of all that traffic four years from now, according to Cisco, so it comes as no surprise that the cloud giants have ramped up investment in global network infrastructure.

Amazon Web Services, the biggest cloud provider, made its first major investment in a submarine cable project earlier this year. The Hawaiki Submarine Cable, expected to go live in June 2018, will increase bandwidth on the network route between the US, Australia, and New Zealand. Amazon agreed to become the cable’s fourth anchor customer, which finalized the financing necessary to build the system.

Microsoft and Facebook announced in May a partnership to finance construction of a transatlantic cable called MAREA, which will land in Virginia and Bilbao, Spain.

Microsoft is also an investor in the New Cross Pacific Cable System, a transpacific cable that will land in Oregon, China, South Korea, Taiwan, and Japan, and the transatlantic system called Express, which will land in Canada, the UK, and Ireland.

Source: TheWHIR

How to get your mainframe's data for Hadoop analytics

Many so-called big data — really, Hadoop — projects have patterns. Many are merely enterprise integration patterns that have been refactored and rebranded. Of those, the most common is the mainframe pattern.

Because most organizations run the mainframe and its software as a giant single point of failure, the mainframe team hates everyone. Its members hate change, and they don’t want to give you access to anything. However, there is a lot of data on that mainframe, and, if it can be done gently, the mainframe team is interested in people learning to use the system rather than starting from scratch. After all, the company has only begun to scratch the surface of what the mainframe and the existing system have available.

Many great data integration techniques can’t be used in an environment where new software installs are highly discouraged, as is the case in the mainframe pattern. However, rest assured that there are plenty of techniques to get around these limitations.

Sometimes the goal of mainframe-to-Hadoop or mainframe-to-Spark projects is just to look at the current state of the world. More frequently, however, the aim is to do trend analysis and track changes in a way that the existing system doesn’t, which requires the techniques covered by change data capture (CDC).
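The article doesn’t show what CDC looks like in practice, but a common lightweight approach when you can only get periodic flat-file extracts off the mainframe is to diff successive snapshots. Below is a minimal PySpark sketch of that idea; the table layout, key column, and file paths are hypothetical placeholders, and log-based CDC tools would be preferable where installing software is allowed.

```python
# Snapshot-diff change data capture (CDC): compare two daily extracts of the
# same mainframe table and classify rows as inserts, deletes, or updates.
# Column names and paths below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("mainframe-cdc-diff").getOrCreate()

old = spark.read.csv("/landing/accounts_day1.csv", header=True)  # prior extract
new = spark.read.csv("/landing/accounts_day2.csv", header=True)  # fresh extract

key = "acct_id"  # hypothetical business key

# Full outer join keeps rows that exist in only one of the two snapshots.
joined = old.alias("o").join(
    new.alias("n"), F.col("o." + key) == F.col("n." + key), "full_outer"
)

changes = joined.select(
    F.coalesce(F.col("o." + key), F.col("n." + key)).alias(key),
    F.when(F.col("o." + key).isNull(), "insert")        # only in new snapshot
     .when(F.col("n." + key).isNull(), "delete")        # only in old snapshot
     .when(F.col("o.balance") != F.col("n.balance"), "update")
     .otherwise("unchanged")
     .alias("change_type"),
)

# Keep only real changes; appending each day's diff builds the history that
# makes trend analysis possible downstream.
changes.filter(F.col("change_type") != "unchanged") \
       .write.mode("append").parquet("/warehouse/accounts_cdc")
```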

Amazon's Elastic File System is now open for business

Following an extended preview period, Amazon’s Elastic File System is now generally available in three geographical regions, with more on the way.

Originally announced last year, EFS is a fully managed elastic file storage service for deploying and scaling durable file systems in the Amazon Web Services cloud. It’s currently available in the U.S. East (northern Virginia), U.S. West (Oregon), and EU (Ireland) regions, the company announced Wednesday.

Customers can use EFS to create file systems that are accessible to multiple Amazon Elastic Compute Cloud (Amazon EC2) instances via the Network File System (NFS) protocol. They can also scale those systems up or down without needing to provision storage or throughput.
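As a rough sketch of what that provisioning flow looks like through the boto3 SDK (the subnet and security-group IDs are placeholders, and production code would also wait for the file system to become available before adding mount targets):

```python
# Sketch: create an EFS file system and expose it to a subnet, using boto3.
# IDs are placeholders; error handling and waiting logic are omitted.
import boto3

efs = boto3.client("efs", region_name="us-east-1")

# CreationToken makes the request idempotent: retries won't create duplicates.
fs = efs.create_file_system(
    CreationToken="shared-app-storage",
    PerformanceMode="generalPurpose",
)
fs_id = fs["FileSystemId"]

# In real code, poll describe_file_systems() here until the file system's
# LifeCycleState is "available" before creating mount targets.

# One mount target per subnet; the security group must allow NFS (TCP 2049).
efs.create_mount_target(
    FileSystemId=fs_id,
    SubnetId="subnet-0123456789abcdef0",      # placeholder
    SecurityGroups=["sg-0123456789abcdef0"],  # placeholder
)

# Any EC2 instance in that subnet can then mount the file system with a
# standard NFSv4 client, along the lines of:
#   sudo mount -t nfs4 <mount-target-dns-or-ip>:/ /mnt/efs
```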

EFS is designed for a wide range of file workloads, including big data analytics, media processing, and genomics analysis, AWS said.

BigCommerce Helps Online Merchants Boost Pinterest Profits

BigCommerce is helping its merchants convert Pinterest fans into paying customers with the launch of Pinterest Buyable Pins on desktop. BigCommerce launched Pinterest Buyable Pins in May, but it was initially only available for mobile devices.

According to an announcement by BigCommerce this week, while the majority of Pinterest’s traffic (80 percent) comes from its mobile app, the majority of checkout experiences still take place on desktop.

SEE ALSO: Salesforce Acquires Demandware for Ecommerce Expertise

With Pinterest Buyable Pins, BigCommerce merchants give shoppers the ability to browse and purchase products directly on Pinterest.

Buyable Pins was Pinterest’s first major move toward monetizing its platform and converting its some 100 million monthly active users into shoppers. With more than 1 in 5 consumers engaging with brands on Pinterest, this functionality could not come soon enough for retailers.

Pinterest has also released a new multi-device shopping cart, available on Android now and coming to iOS in the months ahead. With this release, shoppers can add items to a persistent shopping cart that they can access by logging into their Pinterest account on multiple devices. The cart also lets buyers purchase from multiple merchants simultaneously.

“This type of holistic multi-device shopping experience, where a user’s login saves product information and activity, positions Pinterest more actively as an ecommerce marketplace, presenting a wide variety of brands and products on a single platform,” BigCommerce managing editor Tracey Wallace said in a blog post on Tuesday.

For ecommerce hosting providers to stay relevant in the new digital marketplace, where more people are making purchase decisions based on social media, being able to help your merchant customers sell where their customers are buying will be key.

Source: TheWHIR