HostingCon 2016: The Silk Road Takedown, and Why Hosts Should Know their Local FBI Agent

Remember the basics when it comes to security, and take your local law enforcement out for lunch. These are two strategies that will help service providers deal with the increasing security risks and immediate threats to their businesses, according to industry experts who spoke at HostingCon this week.

It is critical to get to know your local law enforcement before there is an issue and they show up at your data center with a search warrant. Doing so can help them understand your business better, and what your policies are, Jane Shih, assistant general counsel at Endurance International Group, said in a panel on Tuesday.

“The best way to not have FBI come in and take a whole rack of servers is education,” David Snead, general counsel of cPanel, said in agreement.

Jay Sudowski, CEO of Handy Networks, says educating staff is also important, so that they are prepared for what to do if the FBI does come knocking.

So who are these FBI agents and what are they like? The HostingCon audience got a peek behind the curtain at how the FBI captures some of the world’s most wanted cyber targets – including hackers behind LulzSec and Anonymous. Chris Tarbell, a former FBI agent involved in the Silk Road bust, spoke on Tuesday about his career in the FBI, where he started in computer evidence and international terrorism before moving into cybercrime.

These early career stints were instrumental in learning where evidence is stored on a computer and how to find it, as well as the importance of log information, he says.

In 2010, Anonymous appeared more prominently on the FBI’s radar after Operation Payback, in which the hacking group launched massive DDoS attacks against payment providers like Visa, PayPal and MasterCard after they cut off support for WikiLeaks.

Around the same time, HBGary Federal sought to deanonymize the hacking group, only for Anonymous to hack CEO Aaron Barr’s email and, within 20 minutes, shut down his entire online life, Tarbell says. Shortly afterward Barr was forced to resign, in one of many examples of the true cost of cybercrime.

In 2011, another hacking group, called LulzSec, started to make headlines for its attacks on targets such as Sony, Fox, and the CIA.

Tarbell describes getting a tip from another hacker – a kid in New Jersey who said he knew Hector Xavier Monsegur, aka Sabu, the leader of LulzSec. The tipster knew only that Monsegur lived in New York, but that was enough for the FBI.

“We dug through all the logs, we found one IP address that was in New York: it was Hector,” he says.

Once the FBI tracked him down to his apartment, Sabu spent two hours trying to convince agents that he didn’t know anything about computers. He eventually agreed to become an informant and teach the FBI how groups like LulzSec hack.

With Sabu’s help, the FBI arrested Jeremy Hammond, one of its most-wanted cybercriminals, who used Tor to protect his identity, and in 2013 arrested Silk Road founder Ross Ulbricht, aka Dread Pirate Roberts. Silk Road was a $1.2 billion marketplace that operated on Tor and used bitcoin so money couldn’t be traced. It offered hacking services, murder for hire and drugs. Ulbricht is currently serving a life sentence with no chance of parole.

So what advice does Tarbell have for hosting providers when it comes to security? Bring it back to the basics. The number of hacks that happen because users reuse the same password across multiple sites is staggering. Simple tweaks can help prevent an organization or hosting end-user from becoming a hacker’s next target.
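
One concrete basic, as a minimal illustrative sketch rather than anything from Tarbell’s talk: screening passwords against a public breach corpus. The Python example below queries the Have I Been Pwned range API, whose k-anonymity design means only the first five characters of the password’s SHA-1 hash ever leave the machine.

```python
import hashlib
import urllib.request

def breach_count(password: str) -> int:
    """Return how many times a password appears in the Pwned Passwords corpus."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    # Only the 5-character prefix is sent; matching hash suffixes come back in bulk.
    req = urllib.request.Request(
        f"https://api.pwnedpasswords.com/range/{prefix}",
        headers={"User-Agent": "password-check-example"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp.read().decode("utf-8").splitlines():
            candidate, _, count = line.partition(":")
            if candidate == suffix:
                return int(count)
    return 0

if __name__ == "__main__":
    # A password reused across many sites will show a very large count.
    print(breach_count("password123"))
```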

Source: TheWHIR

Real-World User Cloud Challenges and How to Overcome Them

Today’s modern organization is in the midst of a digital revolution. Cloud services are becoming much more mature, users are utilizing more digital tools to stay productive, and organizations are constantly tasked with keeping up with demand. Accenture’s research model and analysis shows that digital now dominates every sector of the economy. In fact, the global digital economy accounted for 22 percent of the world’s economy in 2015, up from 15 percent in 2005, and Accenture forecasts that share to reach 25 percent by 2020.

Accenture goes on to say that organizations that embrace these digital trends will come out on top in today’s very competitive market. Winners will create corporate cultures where technology empowers people to evolve, adapt, and drive change.

There’s a key operative word in that previous sentence – people. People are the consumers of these digital tools and are the ones who use them to be productive and impact the modern business. So, with this in mind, what sort of issues are users experiencing when utilizing virtual workloads and cloud services?

Ultimately, it all comes down to the end-user and their digital experience. A project can have the most advanced systems in place, but if the end-user experience is poor, the project may be considered a failure. With more devices, applications and data to work with, the inherent challenge has become managing this end-user digital environment.

We’re no longer managing just the end-user; rather, we are attempting to control their entire experience. The biggest challenge facing IT now is the number of settings that a user carries with them at any given time. Users have significantly more settings and personalization to work with than they did just a few years ago. When we say settings, we mean the entire range of what the user might be working with to stay productive:

  • Physical peripheral settings
  • Folder/content/data redirection
  • Data availability (file sharing)
  • Personalization settings (specific to devices, applications, desktops, and more)
  • Profile settings
  • Application settings (virtual, cloud-based, locally installed)
  • Hardware and device settings (personal vs corporate devices)
  • Desktop settings (virtual, physical, hosted)
  • Performance settings (optimization)
  • And much more

Now, how can the IT department control all of this? How can management create a truly robust user experience? The challenge is not only to manage the above end-user requirements, but also to continuously provide a positive and productive end-user experience.

A great example would be a customer with numerous locations. By having visibility into the end-user, their policy and their settings, administrators can control what they see and how they access the environment. One of the biggest complaints is the constantly varying experience of users coming in from remote locations using various devices. By centralizing the user management process, admins can deliver the same experience regardless of the location, OS or hardware device.
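
As a minimal sketch of what centralizing the user management process can look like – the schema, field names, and rendering step below are illustrative assumptions, not any vendor’s actual model – a user’s settings can live in one device-independent record that is rendered for whatever endpoint they log into:

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class UserProfile:
    """Device-independent record of the settings a user carries with them.

    Field names are illustrative assumptions, not a vendor schema.
    """
    username: str
    folder_redirects: dict = field(default_factory=dict)  # folder/content/data redirection
    app_settings: dict = field(default_factory=dict)      # virtual, cloud, or local apps
    printers: list = field(default_factory=list)          # physical peripheral settings
    desktop: dict = field(default_factory=dict)           # virtual/physical/hosted desktop prefs

def render_for_device(profile: UserProfile, device_os: str) -> dict:
    """Apply the same centrally stored profile to any device or OS."""
    rendered = asdict(profile)
    rendered["target_os"] = device_os  # only the rendering varies, never the source of truth
    return rendered

profile = UserProfile(
    "jdoe",
    printers=["HQ-Floor2-Laser"],
    app_settings={"mail": {"signature": "Jane Doe"}},
)
print(json.dumps(render_for_device(profile, "Windows 10"), indent=2))
```

Because every device renders from the same record, a user moving between a thin client, a laptop and a virtual desktop sees one consistent experience.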

Designing an IT Architecture Which Supports Digital Users

More applications, more data, more end-user devices, and more operations being done on a computing platform all mean more help-desk calls. One of the biggest shifts, however, has been where these calls are coming from. Originally, many calls could be fielded and resolved at the data center level. Now, more users are experiencing issues related to their experience rather than to the application or platform they’re operating on. For example, profile corruption or missing settings are extremely common calls. Similarly, users are requesting cloud services, which often results in a fragmented delivery architecture.

We now see users within healthcare, law, and other large verticals using more and more devices. Because of this, we began to see a new issue arise: user fragmentation – broken profiles, improperly aligned settings, and missing application components caused by the number of varying devices.

Why is this happening? As mentioned earlier, the availability of more devices on the network creates greater diversity in how data and information are handled. Unfortunately, without a good control platform, IT is stuck fielding more and more help-desk calls. Users are demanding that more of their devices share the same experience as the corporate desktop. Even in that scenario, the corporate desktop has to deal with ever more information as data usage increases. The biggest challenge facing IT now is how to control the user layer while still facilitating the growth in the number of devices being utilized.

Overcoming the Digital Dilemma

Instead of just deploying a piece of technology, design your solutions around your users. Know the devices they’re deploying, know the resources they’re accessing, and always think about their experience. New tools and technologies help control the entire digital experience between physical devices and virtual resources. Most of all, these new systems help create a bridge between on-premises resources and those sitting in the cloud.

For example, new types of portals can present services, applications (on-premises, cloud, SaaS), virtual desktops, and much more from a single location. These portals act as service stores where users can request everything from printers to headsets. Furthermore, they can request specific cloud and business resources like applications and desktops. Ultimately, this removes barriers to adoption by reducing complexity. In designing a digital-ready end-user ecosystem, the rule of thumb is actually pretty straightforward: start your planning with your end-users. You’ll not only learn even more about your own business, but you’ll also learn how users stay productive, the devices they’re using, and how to proactively keep these users happy.

Source: TheWHIR

Nearly Half of All Corporate Data is Out of IT Department's Control

Many organizations are not responding to the continuing spread of “Shadow IT” and cloud use with appropriate governance and security measures, and more than half do not have a proactive approach, according to research released Tuesday. The 2016 Global Cloud Data Security Study, compiled by the Ponemon Institute on behalf of Gemalto, shows that nearly half of all cloud services (49 percent) and nearly half of all corporate data stored in the cloud (47 percent) are beyond the reach of IT departments.

The report is drawn from a survey of more than 3,400 IT and IT security practitioners from around the world. It shows only 34 percent of confidential data on SaaS is encrypted, and members of the security team are only involved in one-fifth of choices between cloud applications and platforms.

READ MORE: Shadow IT: Embrace Reality – Detect and Secure the Cloud Tools Your Employees Use

IT departments are making gains in visibility, with 54 percent saying the department is aware of all cloud applications, platforms, and infrastructure services in use, up from 45 percent two years ago. Also, the number of respondents saying it is more difficult to protect data using cloud services fell from 60 to 54 percent; however, those gains were offset by more broadly reported challenges in controlling end-user access.

“Cloud security continues to be a challenge for companies, especially in dealing with the complexity of privacy and data protection regulations,” Dr. Larry Ponemon, chairman and founder, Ponemon Institute said. “To ensure compliance, it is important for companies to consider deploying such technologies as encryption, tokenization or other cryptographic solutions to secure sensitive data transferred and stored in the cloud.”
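
As a minimal illustrative sketch of the kind of safeguard Ponemon describes – it assumes the third-party Python cryptography package and is not drawn from the report itself – sensitive data can be encrypted before it leaves for the cloud, so the provider only ever stores ciphertext:

```python
from cryptography.fernet import Fernet

# The key stays with the organization (e.g., in a KMS or HSM), never with the cloud provider.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"customer: Jane Doe, card: 4111-1111-1111-1111"
ciphertext = fernet.encrypt(record)  # this is all the cloud service ever sees

# Only holders of the key can recover the plaintext.
assert fernet.decrypt(ciphertext) == record
```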

The number of companies storing customer data in the cloud is increasing, with nine percent more organizations reporting the practice than in 2014, despite 53 percent still saying that is where it is most at risk.

Almost three-quarters say encryption and tokenization are important, and even more think it will be important over the next two years. However, almost two-thirds (64 percent) said their company does not have policies requiring safeguards like encryption for certain cloud applications.

Seventy-seven percent say managing identities is harder in the cloud than on-premises, yet only 55 percent have adopted multi-factor authentication.
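
For context on what multi-factor adoption involves, here is a minimal standard-library Python sketch of TOTP (RFC 6238), the algorithm behind most authenticator apps; it is an illustration, not something the survey prescribes:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """Derive the current time-based one-time password from a shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period          # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and authenticator app share this secret once, then derive codes independently.
print(totp("JBSWY3DPEHPK3PXP"))
```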

“Organizations have embraced the cloud with its benefits of cost and flexibility but they are still struggling with maintaining control of their data and compliance in virtual environments,” said Jason Hart, Vice President and Chief Technology Officer for Data Protection at Gemalto. “It’s quite obvious security measures are not keeping pace because the cloud challenges traditional approaches of protecting data when it was just stored on the network. It is an issue that can only be solved with a data-centric approach in which IT organizations can uniformly protect customer and corporate information across the dozens of cloud-based services their employees and internal departments rely on every day.”

The report recommends organizations set comprehensive policies for data governance and compliance, as well as guidelines for sourcing cloud services, and cloud data storage rules.

A study released in June by Alert Logic indicated that workloads were subject to the same security operations strategy regardless of the infrastructure they are on.

Source: TheWHIR

ZENEDGE Launches Single IP Protection for DDoS Mitigation

ZENEDGE launched Single IP Protection to general availability on Tuesday at HostingCon to provide enterprise-class network DDoS mitigation to organizations with smaller networks.

Network DDoS mitigation traditionally requires Border Gateway Protocol for routing decisions, which means it only works on networks with at least a Class C subnet – 256 total IP addresses, 254 of them usable – according to the company.

READ MORE: Evolution of DDoS Protection, and the Modern Opportunity

With the new offering, ZENEDGE assigns clients a DDoS-protected IP address range from its IP pool. It establishes a GRE tunnel to route traffic between the company’s servers and the ZENEDGE-protected IP network, and then directs new traffic through ZENEDGE via a DNS change.
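
To make the plumbing concrete, here is a minimal sketch of bringing up a GRE tunnel and steering a protected range through it. The addresses, interface name, and the choice of driving Linux iproute2 from Python are illustrative assumptions, not ZENEDGE’s actual provisioning:

```python
import subprocess

def sh(cmd: str) -> None:
    """Run a single iproute2 command (requires root), raising on failure."""
    subprocess.run(cmd.split(), check=True)

LOCAL_IP = "198.51.100.10"   # customer's server (hypothetical)
REMOTE_IP = "203.0.113.5"    # mitigation provider's tunnel endpoint (hypothetical)

# Encapsulate traffic between the origin and the protected network in GRE.
sh(f"ip tunnel add gre1 mode gre local {LOCAL_IP} remote {REMOTE_IP} ttl 255")
sh("ip link set gre1 up")
sh("ip addr add 10.10.10.2/30 dev gre1")   # inner tunnel addressing
sh("ip route add 10.20.0.0/24 dev gre1")   # return path toward the protected range

# The final step described in the article – a DNS change – then points users
# at the provider-assigned protected IP instead of the origin address.
```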

“ZENEDGE serves many gaming companies, SaaS providers and organizations who are hosting their solutions in a colocated data center or in the cloud,” Leon Kuperman, CTO of ZENEDGE said in a statement. “While these organizations operate smaller networks and don’t control their routers, they are nevertheless consistently targeted with volumetric DDoS attacks.”

SEE ALSO: DDoS Attack Victims Have 82 Percent Chance of Being Hit Again: Report

The company says the offering suits gaming companies and others using proprietary protocols, UDP, VPN, or non-standard TCP ports.

With network layer DDoS attacks costing up to $40,000 per hour, according to a 2015 report, the solvency of smaller organizations without protection could be at risk.

ZENEDGE received $4 million in a Series B funding round late last year.

Source: TheWHIR

Skyscape Launches Assured Oracle Cloud Platform

Skyscape Cloud Services Limited has announced the launch of its new cloud platform, based on Oracle’s Virtual Machine (OVM) technology. Skyscape’s Oracle platform will ensure that its public sector customers can host their workloads on Skyscape’s secure platform, while meeting the stringent licensing and hardware requirements specified by Oracle’s software solutions.

A wide variety of citizen-facing applications and UK public sector back-office systems are based on Oracle’s database technology. Traditionally these systems require dedicated hardware, which not only involves initial capital investment but also poses a challenge in terms of flexibility and capacity management – if more capacity is required, so is more hardware, which takes time and resources to put in place. The very specific hardware and licensing requirements associated with Oracle solutions also make it difficult for public sector organizations to change hosting suppliers or deploy a diverse solution hosted across two platforms. Skyscape’s new Oracle platform provides a solution to these problems. Its customers can host their Oracle-based services with Skyscape and realize the benefits of true cloud hosting – increased agility and consumption-based pricing, as well as significantly reduced costs as the need for investment in hardware is removed – all while complying with Oracle’s licensing requirements.

“Our new Oracle Platform, which is built on OVM technology has been specifically designed with our commitment to the public sector in mind,” said Simon Hansford, CEO of Skyscape Cloud Services. “Our platform is used to deliver many critical services to citizens so it’s imperative that we continue to adapt to the evolving requirements of our public sector customers. This new platform will respond to these demands from customers using Oracle technologies, helping us to continue to support the public sector in the digital transformation of public services, ultimately benefiting UK citizens and tax payers.”

Matt Howell, head of public sector at Capgemini said “We applaud the launch of Skyscape’s Oracle platform. While we have Oracle loads on the Skyscape platform, we observe a lag in cloud adoption amongst our public sector customers due to the lack of an assured platform designed specifically to meet the needs of Oracle applications. As Skyscape’s new platform complies with Oracle licencing it will enable those customers to migrate services to the cloud quicker and with less risk.”

The new Skyscape Oracle platform offers the same unique assurance, sovereignty and connectivity capabilities offered by Skyscape’s other platforms, all within an environment that is specifically optimised for Oracle workloads. Use of the Skyscape Oracle platform can enable organisations to prolong the lifecycle of existing applications by removing the reliance on legacy hardware. More importantly, it can support further innovation across the public sector, as it can be used for projects that require high levels of security and assurance combined with flexibility and scalability – projects that can be up and running far quicker, as test, development and production environments can be established without the need to procure hardware.

“We’re really excited about the development of Skyscape’s Oracle based platform” said Vikram Setia, partner and chief commercial officer at Infomentum. “As an Oracle platinum partner, we understand the complexities of successfully delivering Oracle solutions in the cloud – especially for customers who have specific requirements around data management and location. Skyscape’s new Oracle based platform, and its sole focus on the requirements of the UK public sector, is great news for those looking to utilise a proven UK based cloud service provider to host their Oracle workloads.”

Additional benefits of Skyscape’s Oracle platform include:

  • Compatibility with a wide variety of Oracle technologies including, but not limited to, Oracle Database, WebLogic, Fusion Applications and E-Business Suite
  • Service Level Agreements that are sensitive to the requirements of Oracle applications
  • Advanced Cross Domain Security Zone — a secure managed or self-managed area that enables citizen access over the internet to data which is securely hosted on the elevated domain
  • Broad networking – connect via the internet (with DDoS protection provided as standard); via government community networks (PSN Assured service, PSN Protected service, N3, Janet, RLI, or legacy networks including PNN); or via HybridConnect

Source: CloudStrategyMag

EdgeConneX®, Comcast Business, Datapipe, And Megaport Partner To Bring The Cloud Local To Boston Enterprises

EdgeConneX® has announced that it has entered into an agreement to facilitate the availability of direct cloud connectivity to Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform within its Boston Edge Data Center® via the deployment of Megaport’s SDN-based elastic interconnection fabric. With this new offering, Datapipe and Comcast Business will offer localized cloud migration services to ease enterprise adoption and simplify network connectivity to the cloud.

Datapipe, a leading global managed hosting and cloud services provider, will offer Boston-based cloud integration services, facilitating and managing local enterprises through the appropriate migration to public and hybrid solutions. Comcast Business, which offers Ethernet, Internet, Wi-Fi, Voice, TV and Managed Enterprise Solutions to help organizations of all sizes transform their business, is offering its direct and private Ethernet services to the EdgeConneX Boston Edge Data Center (EDC) and will be training its vast sales team to speed enterprise adoption of hybrid cloud services. Megaport, the pioneer in elastic bandwidth services, will provide its elastic interconnection platform, which enables on-demand, highly reliable, scalable and private direct connections to leading Cloud Service Providers. As a Microsoft Azure ExpressRoute connectivity partner, Amazon AWS Direct Connect partner and Google Cloud Platform Carrier Interconnect partner, Megaport makes it simple to directly connect to cloud services with right-sized capacity.

This collective partnership results in the availability of a cloud ecosystem that makes Boston one of the most effectively and efficiently connected markets in the world for handling the complex needs of IT organizations as they continue to advance their use of the cloud and take advantage of its many benefits. Previously, cloud connectivity investments of this magnitude were found only in a few global core markets, leaving the majority of enterprises outside those geographies without an ideal solution.

“It’s amazing that a top-five enterprise market relies on services hundreds of miles away for critical IT infrastructure,” remarks Clint Heiden, chief commercial officer, EdgeConneX. “Effective cloud solutions, and thus adoption by enterprises, depends on reliable, fast and secure connectivity. Furthermore, an effective hybrid implementation is still one that is local to the customer. Boston marks our third roll out of local cloud services, following the May and June announcements of Portland and Detroit, respectively. We plan to continue rolling out these services across our entire global EDC footprint.”

The Internet of Everywhere requires a highly diverse and distributed content and cloud architecture, with the network Edge extended beyond traditional major peering hubs to ensure the service quality and experience expected by today’s enterprises and consumers. According to the Cisco Global Cloud Index (GCI): Forecast and Methodology, 2014-2019 White Paper, network latency is a critical factor in cloud adoption and limits the use of advanced services in the cloud. “Reducing delay in delivering packets to and from the cloud is crucial to delivering today’s advanced services and ensuring a high-quality end-user experience,” the GCI stated.

“Boston enterprise growth is on the rise with the city’s high-tech sector growing nine percent every year since 2010,” comments Robb Allen, CEO, Datapipe. “EdgeConneX has built a compelling multi-cloud ecosystem around their data center facilities to meet this increase in demand. Datapipe’s experience in migrating, optimizing and managing IT infrastructure can help enterprises take full advantage of this ecosystem.”

The EdgeConneX Boston EDC is optimized to offer security, speed and performance improvements. These innovations enable customers to deliver digital content, cloud and applications to end-users as fast as possible. Edge Data Centers are proximity-based, strategically located nearest to the end-user’s point of access to reduce network latency and optimize performance. Local proximity access also brings the cloud closer to the enterprise, enabling more secure, real-time access to cloud applications and services while offering reduced backbone transport costs.

“Until recently, cloud services were consumed on-demand while connectivity to those cloud services remained static and fixed,” explains Denver Maddux, CEO, Megaport. “Megaport enables enterprises to deploy on-demand capacity that is right-sized and reliable. We’re excited to support this initiative to enable Boston businesses with a powerful set of tools to unlock next-generation cloud architectures.”

“Our extensive Boston network has direct on-net access to thousands of enterprises,” states Steve Walsh, vice president for Comcast Business in Greater Boston. “This broad reach allows us to offer reliable Ethernet services that can meet virtually any business demand. Having a local cloud offering in Boston adds a tremendous solution for us to bring to our customers and further highlights how Comcast Business is keeping pace with the growth needs of the market.”

Source: CloudStrategyMag

Would Google Cloud Win Spell Doom For MSPs?

Brought to you by MSPmentor

There’s a dirty little secret that has long been whispered, even if not widely discussed outside close circles of managed services providers (MSPs): Don’t work with Google.

For years, MSPs have bristled at the tech behemoth’s approach, long seen as ignoring the channel to market a growing array of user-friendly, self-serve technology products directly to consumers.

The disruption from tsunami-like adoption of public cloud services is bringing those old fears back front and center.

This year, Google placed huge bets on public cloud, vowing to catch up with and overtake rivals Amazon, Microsoft and others, and raising questions about the effect on MSPs should Google Cloud Platform (GCP) become the market leader.

READ MORE: New Google AI Services Bring Automation to Customer Service

“I think MSPs are generally afraid of working with Google because of their direct company history,” said Jim Lippie, principal at Clarity Channel Advisors, a consultancy that advises MSPs. “The fear from an MSP perspective is that Google will ramp up efforts on the direct side and freeze out MSPs.”

In fact, Lippie said he knows of MSPs that actively refuse to work with Google, citing perceived anti-channel practices.

For its part, Google offered nods to its partners on separate occasions this year.

In April, Google held an event in New York City for partners who sell products in its ecosystem, including Google Apps for Work.

The following month, a two-day event was held in Dublin, Ireland, for EMEA partners.

Google used the events to tout the partner programs, discuss upcoming product releases and offer market overviews on how MSPs can succeed with cloud services.

“I do know that Google takes care of its partners,” said Fabrice Vanegas, a software integrator with Canadian technology solution provider Gestion-bsp.

Google, Vanegas explained, relies on its partners to sell to and cultivate relationships with truly small- and medium-sized businesses – those with fewer than 1,000 employees.

“We sort of do Google’s work for them,” he said. “Google having partners is crucial to its growth.”

At the same time, Google is pressing full speed ahead with an ambitious cloud strategy that includes cheap computing and storage, and a dizzying array of constantly improving, user-friendly, self-serve tools and applications.

SEE ALSO: Google Launches Its First Cloud Data Center on West Coast

“I definitely think Google is making things easier,” said Jovan Hernandez, an engineer with Google partner Mac-tech. “It’s really a toss up about how MSPs are going to play a role in the cloud revolution.”

Is Google ‘anti-channel’?

Right or wrong, perceptions that Google is less than committed to channel are easy to find.

“Google is not a partner company,” said Jamison West, president of Seattle cloud services provider Arterian CSP. “That’s not their culture.”

In March, Google, which trails Amazon Web Services (AWS) and Microsoft’s Azure division for market share in the public cloud competition, announced a massive escalation of its cloud effort.

VMware co-founder Diane Greene, newly hired to head Google Cloud Platform, was tasked with leading an expansion of Google’s worldwide data centers from three to five by the end of 2016, with 10 more data centers coming online by the end of 2017.

Along with dramatically increasing the supply of public cloud, Google has rolled out tools that let users do everything from building cloud-based web and mobile apps that scale with demand, to deploying massive fully managed databases, to monitoring and managing workloads on their own cloud-based virtual networks.

Vanegas, of Gestion-bsp, acknowledged that partners must come to accept Google’s alternate market strategy.

“It does do its fair share of direct deals,” he said.

Arterian’s West noted the contrast between Google and its top cloud competitors.

Microsoft, of which Arterian is a long-time partner, is well known for its channel sales program and commitment to cultivating partner relationships.

Public cloud market leader Amazon is another direct-sale company that not so long ago was also seen as ambivalent toward channel partners.

But in recent years, its Amazon Web Services cloud division has built a thriving cloud partner program that appears to be gaining traction, West said.

“They’re taking steps and there are visible signs that they’re making progress,” he said.

‘Economics’ to dictate Google moves

In the cases of Amazon and Microsoft, managing cloud workloads and designing relevant virtual infrastructure remains sufficiently complex that third-party service providers can reasonably expect to find recurring revenue after completing the migration.

Google, on the other hand, has work to do to convince channel companies that it is a profitable partner choice.

Data from this year’s MSP501 worldwide ranking of top MSPs shows 82 percent of the firms on the list reported leveraging Office 365 services in 2015. By comparison, just 15.7 percent of companies that made the ranking offered Google Apps last year.

Clarity Channel Advisors’ Lippie, who made a presentation to Google partners during the Ireland gathering, said he’s seen signs that the tech giant is carefully considering cloud partner strategy.

“I think Google is trying to remedy that and I think you might see them come out with a stronger channel program to reverse that,” he said. “If they decide that they want to, it’ll depend on the economics.”

That is, whether the razor-thin margins on granular cloud activity that GCP is relying upon leave enough room for MSPs, other channel companies and Google itself to turn a profit.

Time will tell if the recent overtures to channel amount to lucrative new veins of revenue or empty gestures, West said.

“I’ve seen a lot of direct companies try to become channel companies and vice versa with mixed results,” he said.

It’s a question that has crossed the mind of at least one Google partner.

“I’d like to believe that we will remain useful in the equation,” Vanegas said. “Will we get cut out as partners? Is that something that could happen? Who knows?”

Source: TheWHIR

Inside the High Stakes Auction for .Web

Some very deep-pocketed internet giants are facing off on July 27, 2016 for a high stakes game of poker. The pot isn’t cash but the rights to sell the coveted .web top level domain (TLD) extension to eager website owners, domain speculators, online entrepreneurs, developers, designers and digital ad agencies. Google, Web.com, United Internet and Afilias are among the seven competing entities who will bid in real time on July 27 via an online auction conducted by the non-profit organization ICANN (Internet Corporation for Assigned Names and Numbers) to confer the rights to sell .web.

The auction

If you have a ton of time on your hands and want to brush up on the legal details of how the auction process works, you can read all about it here. For those who aren’t lawyers, here’s a tl;dr version of how it works.

Step 1 – Become eligible to participate in the auction. The criteria are basically that you must have an extra-large sum of U.S. dollars (auctions are all conducted in U.S. dollars regardless of the top level domain) and be in good standing with ICANN.

Step 2 – Log in to the auction interface on the day of the auction to bid. The larger your deposit, the higher you can bid; a deposit of $2 million gives you unlimited bidding potential. Bids are made through a series of “rounds” in which the floor and ceiling of the round are specified. If all bidders meet the ceiling of the round, a new round is started after a short break, with the floor set at the previous round’s ceiling. The rounds continue at higher and higher floors until there is only one bidder remaining. That bidder pays the second-place bidder’s highest bid.
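
A toy simulation makes the mechanics clearer. In the sketch below the bidder names, budgets and round increment are invented for illustration – the real auction’s rounds and timing differ – but the shape is the same: the floor rises until one bidder is left, who pays roughly the runner-up’s highest bid.

```python
def run_auction(max_bids: dict, increment: int = 1_000_000):
    """Ascending rounds: bidders whose private maximum falls below the rising
    floor drop out; the last bidder standing pays the runner-up's maximum."""
    active = dict(max_bids)
    floor = 0
    while len(active) > 1:
        floor += increment
        survivors = {b: m for b, m in active.items() if m >= floor}
        if not survivors:       # all remaining bidders dropped in the same round
            break
        active = survivors
    winner = max(active, key=active.get)
    price = sorted(max_bids.values())[-2]   # second-place bidder's highest bid
    return winner, price

# Hypothetical private maximums, in dollars:
bids = {"RegistryA": 30_000_000, "RegistryB": 41_500_000, "RegistryC": 57_000_000}
print(run_auction(bids))   # -> ('RegistryC', 41500000)
```

Note the second-price character of the outcome: the winner’s own ceiling never becomes the sale price, only the runner-up’s does.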

Big money bids and big money profits

So exactly what would the rights to sell the .web TLD be worth, and what might the winning bid be? Consider that on Jan. 27, 2016, a number of large firms, including Amazon, bid via an ICANN auction for the rights to the .shop TLD. After 14 rounds of bidding, GMO Registry, Inc. won the rights with a bid of $41,501,000. Clearly the expectation is that the revenues derived from .shop domains would well exceed the price paid. Note also that the current champion of newly minted TLDs is .xyz, which had registered a total of nearly 6.5 million domains as of July 20, 2016. At a conservative estimate of only a one-year registration period and an average price of $10 per domain, that works out to around $65 million so far. Clearly the current bidders for .web hope that the number of .web registrations will surpass those of .xyz, making it potentially worth in excess of $65 million.

So what could a winning bid look like? Using .shop as a proxy, it is certainly possible that .web could fetch a higher bid than .shop ($41,501,000) – but how much higher? Only the bidders know their upper limits. It is clear that the bidders all have substantial funds to bring to bear on the auction. Here are the recent market caps of three of the bidders that are publicly traded:

Alphabet Inc Class A (Google) – $514 Billion
United Internet AG – $8 Billion
Web.com – $950 Million

Would Google with its massive war chest of cash even blink at paying $50 million or more? Not likely. In fact Google paid over $18 million just to submit a list of TLDs that it wanted to pursue before ever arriving at the final sale price.

Could .web become the new .com?

Is it likely that .web will be a standout among new TLDs? Here are a few points that may indicate .web is poised to gain traction relative to other recently introduced TLDs.

1. We’re already used to using the term ‘web’ for internet-related activities. We refer to online properties as ‘websites’ or ‘web pages’ and the talent who create them are ‘web designers’ and ‘web developers’. We use ‘web servers’ and ‘web browsers’ and even ‘web apps’. The common references make a transition to a .web domain a natural activity for a mass online and mobile audience.

2. .Web is short and memorable. With the explosion of new top level domains, it’s hard to keep track of them all or their proper use. A short generic term like .web could cut through all the clutter. It’s just simpler to type yourcompany.web than, say, yourcompany.company or yourcompany.solutions. It’s certainly less prone to confusion as well. Was it yourcompany.solution or yourcompany.solutions?

3. Large companies set standards. Imagine if Google won the auction and decided that every time someone searched for anything related to ‘domain names’ on Google, it would suggest trying the .web TLD as an alternative to .com. Standard set.

4. Dictionary names and short phrases are still available on .web. This is true of all new TLDs so it’s not unique to .web. However, simply offering a short, memorable and generic alternative to .com could be enough if the momentum gets behind this new domain.

Stuart Melling, co-founder of UK domain name firm 34SP.com, has decades of domain name experience, and he offered his expert opinion on whether .web could be the next .com.

“There’s such a huge array of new domains available to buyers now making it very difficult for them to really understand the selection on offer. Likewise, I’ve yet to see any registrar (ourselves included) deliver a domain search tool that really nails domain discovery,” he says. “It boils down to marketing might at this point. The registries that will win are most likely going to be those that have the heftiest budgets to market and promote their domains. I personally see .com being the de facto domain for any new website for some time to come. Right now, the new TLDs seem to represent a fallback, a secondary area to secure a relevant domain if the .com space isn’t viable. I’d imagine it would take years to unseat this kind of approach; but then this is the web, and making predictions is really a fool’s game.”

What other domain experts think

Mark Medina, Director of Product, Domain Names with DreamHost has been selling domain names to web businesses for over 15 years. Medina has some strong predictions for .web: “The winning bid for .shop was $41.5M, so I think the winning bid will definitely be north of $50M. Because there are multiple bidders, one of them being the mighty Google, I can foresee some pretty aggressive bids, which I think will take the final winning bid into the $80M – $100M range.”

“Everyone still wants a .com. We’ve done user testing on people searching for domains, where users speak their thoughts during the test, and almost all of them say ‘Where’s the .com?’ With that said, I can’t foresee .web becoming the new .com, but I think it will be one of the more popular new TLDs that could overtake .net in a few years,” Medina says. “The .net TLD has been losing its popularity, and I think TLDs like a .web or a .xyz could become more popular than .net in a few years time. .Com will remain number 1 but number 2 is up for the taking.”

Chris Sheridan is currently Head of Channel Sales at Weebly.com and has also held senior positions at domain registrars eNom and VeriSign.

Sheridan shares his take: “When new TLDs first launched, the larger registrars had to dedicate themselves to just focusing on the integration of hundreds of new TLDs per quarter. I look at 2014 as a year basically focused on integrating as many of the new TLDs as possible so that 2015 and 2016 could be more focused on marketing and sales. What I see today is more focus by the larger registrars on marketing the new TLDs and raising their visibility to their existing customer base. Since new TLDs are typically priced higher than a ‘.com’ they give the advantage to the registrars of driving higher revenue sales and allowing them to capture more margin on each individual domain name sale as well.”

He continues: “I think the .web TLD has big potential. For starters, there is no consumer education hurdle here. I think people will just get it…so that is a major advantage. I think we will have to see how the future .web registry addresses two key areas: pricing and marketing.”

“In regards to pricing, the wholesale cost to registrars will be key to adoption by larger registrars and its inclusion in key hosting bundles managed by the larger registrars (which impacts distribution). In regards to marketing, there will need to be a big effort to raise awareness of .web globally. This will require the help of the larger registrars (marketing programs) but will also require the .web registry to be involved as well,” Sheridan says. “The manner in which the future .web registry addresses pricing and marketing could potentially dictate its success. The future delegation of .web to a registry provider represents the final batch of remaining new TLDs to go live. I think it is great to have a big TLD like .web being delegated toward the end of this long new TLD rollout. It generates more media attention to the overall program and re-ignites excitement around domains. So that is a good thing on all levels.”

Source: TheWHIR

ServerHub Brings Sixth Global Location Online in New York

ServerHub announced Monday at HostingCon Global 2016 that it has launched its sixth global data center this week, in New York City, marking its first true East Coast location.

According to ServerHub CEO John Brancela, the new location provides financial services customers with low latency to the New York market, and was highly requested by customers.

“We now have services in Chicago that we’ve been offering customers, and it’s been highly successful but it’s still not a true east coast location,” Brancela says.

Microsoft Can't Shield User Data From Government, U.S. Says

By Kartikay Mehrotra

(Bloomberg) — The U.S. says there’s no legal basis for the government to be required to tell Microsoft Corp. customers when it intercepts their e-mail.

The software giant’s lawsuit alleging that customers have a constitutional right to know if the government has searched or seized their property should be thrown out, the government said in a court filing. The U.S. said federal law allows it to obtain electronic communications without a warrant or without disclosure of a specific warrant if it would endanger an individual or an investigation.

READ MORE: Microsoft Wins Big in Fight for User Privacy as Irish Search Warrant Found Invalid

Microsoft sued the Justice Department and Attorney General Loretta Lynch in April, escalating a feud with the U.S. over customer privacy and its ability to disclose what it’s asked to turn over to investigators. Last week, Microsoft persuaded an appeals court to overturn an order to turn over e-mails stored on servers in Ireland as part of a Manhattan drug prosecution.

The Justice Department’s reply Friday underscores the government’s willingness to fight back against tech companies it sees obstructing national security and law enforcement investigations. Tensions remain high following a series of court confrontations between the FBI and Apple Inc. over whether the company could be compelled to help unlock iPhones in criminal probes, including a phone used by one of the attackers in last December’s terrorist attack in San Bernardino, California.

The industry’s push against government intrusion into customers’ private information began at least two years ago, in the wake of Edward Snowden’s disclosures about covert data collection that put tech companies on the defensive.

SEE ALSO: Microsoft Rises Most in Nine Months After Profit Beat Estimates

Microsoft and Apple argue the very future of mobile and cloud computing is at stake if customers can’t trust that their data will remain private, while investigators seek digital tools to help them fight increasingly sophisticated criminals and terrorists savvy at using technology to communicate and hide their tracks.

Kathy Roeder, a spokeswoman for Microsoft, didn’t immediately respond to an e-mail after regular business hours Friday seeking comment on the filing.

Court Orders

“Microsoft’s challenge effectively asks this court to adjudicate the lawfulness of thousands of such court orders from across the U.S., without regard to the basis for, and terms of, those orders, which necessarily vary from case to case,” the Justice Department said in Friday’s court filing.

The government also said Microsoft doesn’t have the authority to sue over whether its users’ constitutional protections against unlawful search and seizure are being violated.

Secrecy orders on government warrants for access to private e-mail accounts generally prohibit Microsoft from telling customers about the requests for lengthy or even unlimited periods, the company said when it sued. At the time, federal courts had issued almost 2,600 secrecy orders to Microsoft alone, and more than two-thirds had no fixed end date, cases the company can never tell customers about, even after an investigation is completed.

Microsoft conceded that there may be times when the government is justified in seeking a gag order to prevent customers under investigation from tampering with evidence or harming another person. Still, the Redmond, Washington-based company says the statute authorizing the gag orders is too broad and sets too low of a standard for secrecy.

The case is Microsoft Corp. v. U.S. Department of Justice, 16-cv-00538, U.S. District Court, Western District of Washington (Seattle).

Source: TheWHIR