Cisco to Acquire Cloud Access Security Broker CloudLock for $293 Million

Brought to you by Talkin’ Cloud

Cisco (CSCO) is set to acquire cloud access security broker CloudLock for $293 million in cash and equity, bringing the team under its networking and security business group, according to an announcement on Tuesday.

CloudLock, based in Waltham, Massachusetts, has more than 700 customers with tens of millions of users under management, and was recently named among the Best Places to Work by the Boston Business Journal and the Best Small & Medium Companies to Work For by Glassdoor. Cisco plans to pay additional retention-based incentives to CloudLock employees who join Cisco, Reuters reports.

READ MORE: Cisco Expands Cloud and Hyperconverged Infrastructure Play

“As enterprises are retooling themselves and increasingly building their futures in the cloud, security has not only become a top business priority, it is now universally demanded by businesses and individuals alike, as a necessity to keep their cloud applications, their data, and their businesses safe,” CloudLock co-founder and CEO Gil Zimmerman said. “The ability to protect all of those assets in the cloud now has a name, Cloud Access Security Broker (CASB).”

According to Gartner, CASB solutions will go from five percent penetration in 2015 to over 85 percent by 2020. Zimmerman said that when CloudLock started talking to Cisco about potential opportunities to collaborate, it realized how much the companies had in common:

“We share the same vision on the future of security that focuses on a platform that scales to any size, provides immediate value, is simple to manage and leverage, and is easily extensible through APIs. We also realized that by joining forces, we could accelerate the execution of our vision with greater investments in research and development, the CloudLock CyberLab, partner enablement, and global reach which is far greater than we could have ever achieved on our own. Customers will experience new capabilities and offerings that the combined Cisco Security offerings bring, unparalleled in the industry. On all fronts, the customer experience will only improve, even with the bar already set so high.”
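
That Gartner projection implies a striking adoption curve. As a quick back-of-the-envelope check, here is a minimal sketch that computes the compound annual growth rate implied by the two figures cited above (the calculation is ours, not Gartner's):

```python
# Implied compound annual growth rate of CASB penetration,
# using only the Gartner figures quoted in the article.
start, end, years = 0.05, 0.85, 5  # 2015 -> 2020
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~76.2% per year
```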

In an FAQ page about the deal, CloudLock said the acquisition will not impact existing partners, and that it will work to “enable and leverage the amazing partner network Cisco has already created.”

The deal comes shortly after Cisco acquired CliQr to add its application-defined cloud orchestration platform.

Source: TheWHIR

Report: Cloud Requires New Approach to Security Operations

Three-quarters of businesses using public cloud apply the same security operations strategy to workloads regardless of the infrastructure they run on, according to research sponsored by Alert Logic and released on Tuesday.

The study, Evolve Your Security Operations Strategy to Account for Cloud, shows many security teams appear to be adapting slowly to increasingly complex service delivery models.

Forrester Consulting recently surveyed 100 cloud security infrastructure decision-makers in the US and UK on behalf of Alert Logic about the impact of cloud adoption on security operations. It found that 51 percent of companies are increasing security spending as a result of cloud adoption, 49 percent are instituting new policies and controls for cloud security, and 46 percent are re-evaluating security operations and controls for all environments.

A CIO survey released by Nomura in March showed that security and cloud computing are among the biggest drivers of IT spending increases.

“Cloud computing enables businesses to invest more time in innovation and less time managing IT infrastructure,” Ben Matheson, Alert Logic CMO said in a statement. “In the same way, many businesses are finding that supplementing or outsourcing their security operations with cloud security vendors that offer cloud-native technologies and fully managed services is an increasingly strategic option.”

More than half (53 percent) of companies surveyed have their own in-house security operations center (SOC). The challenges most often faced by companies bringing security operations in-house are managing security content, such as signatures and whitelists, and identifying multi-vector attacks, cited by 44 percent each. High costs were cited by 41 percent, followed by a trio of skills-related challenges: building out threat intelligence skills (40 percent), making sense of data (33 percent), and staffing the SOC (33 percent).

Both business and technical skills are necessary to support a SOC, according to the study, and a number of skills from each set were identified: on the business side, risk management expertise led (46 percent); on the technical side, network security (42 percent) came just ahead of virtualization and cloud infrastructure experience, threat intelligence and analytics, and application/infrastructure security.

Four out of five respondents said they would seek help from a security expert for threat intelligence analysis, public cloud security, security operations, network security, and data privacy and compliance.

Security is becoming less of an obstacle to public cloud adoption, according to a study released earlier this month by HyTrust. Combined with the challenges of one-strategy-fits-all security operations and of finding the right skills, this may reflect confidence that the right security solutions are out there, rather than an accurate picture of what is actually in place.

Source: TheWHIR

China's Globalization Means Shrinking Web Access

By Justin Fox

(Bloomberg View) — I wrote most of this column at the Meijiang Convention and Exhibition Center in Tianjin, the giant port city (population: 15 million) a half-hour bullet-train ride southeast of Beijing. It’s a sleek aircraft-hangar of a building that’s hosting the World Economic Forum’s Annual Meeting of the New Champions, what the Chinese call “summer Davos.”

That all sounds pretty modern and global and connected, doesn’t it? Technologically sophisticated, too: I arrived too late this morning (lots of traffic in Tianjin) to get a seat at the question-and-answer session with Lei Jun, the founder and chief executive officer of smartphone maker Xiaomi, so I sat in a comfy chair in one of the cafés strewn about the convention center, drinking a coffee and tapping into the conference Wi-Fi to watch live on my laptop instead.

SEE ALSO: 5 Cybersecurity Stories You Need to Know Now, June 27

If I wanted to use that Wi-Fi connection to reach the outside world, though, things deteriorated pretty quickly. I could search on Bing, but not Google. Sometimes my Bloomberg e-mail functioned OK, but Gmail never did. Evernote worked, Dropbox didn’t. And if I wanted to check Facebook or Twitter, or read something on a Western news site, or — God forbid — watch a show on Netflix, I was completely out of luck.

Such are the workings of the Great Firewall — the Chinese government’s way of keeping the free-for-all of the internet within bounds it finds comfortable. For most people in China, I get the sense that the firewall seldom interferes with their shopping and gaming and digital socializing. For Chinese internet companies, it may even be a net positive, providing a defense against the colonization by U.S.-based internet giants that has been experienced in most of the rest of the world.

READ MORE: Baidu Creates Own Indexes to Paint Picture of China’s Economy

For foreigners visiting or living in China, or for Chinese citizens trying to maintain business or personal relationships outside the country, it’s a different story. This isn’t my first visit to China, but it’s the first time I’ve tried to keep doing my job while here, and I can testify that the Great Firewall is a gigantic pain. Lots of China-based businesspeople I’ve talked to report similar aggravation.

There are workarounds. Virtual private networks that connect users directly to servers outside the country are a necessity for those aiming to remain connected to the outside world. But while the VPN I subscribe to has worked — slowly — about a third of the time at my hotel in Tianjin, it’s been completely blocked at the World Economic Forum meeting.
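
For readers who want to reproduce this kind of reachability check, here is a minimal sketch, assuming a hypothetical VPN client exposing a local SOCKS proxy on port 1080; the site list and proxy address are our assumptions, not from the column, and it requires the third-party requests library with SOCKS support (pip install requests[socks]):

```python
import requests

SITES = ["https://www.bing.com", "https://www.google.com",
         "https://www.dropbox.com", "https://twitter.com"]
PROXIES = {"https": "socks5://127.0.0.1:1080"}  # hypothetical VPN/SOCKS endpoint

def probe(url, proxies=None):
    """Return a rough reachability verdict for one URL."""
    try:
        requests.head(url, proxies=proxies, timeout=5)
        return "reachable"
    except requests.RequestException:
        return "blocked or timed out"

for site in SITES:
    print(f"{site}: direct={probe(site)}, via proxy={probe(site, PROXIES)}")
```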

Cellular data networks are another option for visitors. It’s easy for Chinese mobile operators to tell which users are from other countries, so non-Chinese smartphones are allowed to bypass the firewall. But cellular is slower and can be expensive. Also, the fact that Chinese cellular operators can easily identify foreigners isn’t necessarily good news; some people buy phones just for China and discard them when they leave because they’re worried about inadvertently downloading spyware.

In past years, the World Economic Forum was allowed to bypass the Great Firewall and give participants at its summer Davos meetings unfettered internet access. But with the continuing tightening of control under President Xi Jinping, and lots of Chinese citizens at the meetings, that’s apparently no longer an option.

This is pretty remarkable, when you think about it. The Annual Meeting of the New Champions, now in its 10th year, is one of China’s biggest opportunities to showcase the country to the global elite. Yet the government is now perfectly willing to deny that global elite access to Google.

This strikes me as a useful indication of how China’s current leadership sees its relationship with the rest of the world. It wants to participate in globalization, but to do so entirely on its own terms. If that means it’s a place where it’s really difficult for outsiders to do business and live their lives, well, tough.

This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.

Source: TheWHIR

Snowden Blasts Russia's Proposed Anti-Terror Laws

Former NSA contractor and whistleblower Edward Snowden has condemned a proposed set of new anti-terror laws in Russia which would require ISPs to store users’ data for a year.

The proposed laws would also require phone companies to store the contents of all calls and texts for six months, and metadata for three years. The Russian Duma (or lower legislative assembly) voted 325-1 on Friday to approve the bill, which also requires any company encrypting digital communications to assist the government with decryption.

SEE ALSO: Tech Companies Speak Out Against “Dangerous” Anti-Encryption Bill

The law has been presented as a response to the bombing of a Russian passenger jet over Egypt in October, the Guardian reports, and includes provisions requiring individuals to warn authorities of plans by others to commit crimes, which some are calling a throwback to Soviet-era repression.

Snowden took to Twitter over the weekend to denounce the “Big Brother law,” saying it will cost money and liberty “without improving safety,” and urging Russian President Vladimir Putin not to sign it into law. He also suggested it could require a “tiny 50Gbps ISP” to set up and run around 100PB of storage to comply.
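
Snowden’s 100PB figure holds up as rough arithmetic. A minimal sketch of the calculation (the 50 percent average-utilization assumption is ours, not from his tweet):

```python
# Storage needed for a 50Gbps ISP to retain a full year of traffic.
link_bps = 50e9                      # 50 Gbps link
seconds_per_year = 365 * 24 * 3600
bytes_per_year = link_bps / 8 * seconds_per_year
print(f"At 100% utilization: {bytes_per_year / 1e15:.0f} PB")      # ~197 PB
print(f"At ~50% utilization: {bytes_per_year / 2 / 1e15:.0f} PB")  # ~99 PB, i.e. ~100PB
```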

Russian telecommunication companies have responded critically. The founder of Russian instant messaging service Telegram told a Russian newspaper that “Telegram does not provide data and encryption keys to third parties, including governments,” RT reports. A company refusing to assist in decryption can be fined up to a million rubles ($15,000) under the proposed law.

Russia’s largest mobile phone operators, including MTS, MegaFon, VympelCom and Tele2, sent a joint letter protesting the law to Valentina Matvienko, the head of the Russian Federation Council (the upper assembly), the Moscow Times reports via the newspaper Kommersant. The letter called the measures “technically and economically impractical.” MTS estimated its cost of storage at 2.2 trillion rubles ($33.8 billion), and several of the companies claimed they would cease to be profitable, depriving the Russian government of billions of rubles in tax revenue.

They also pointed out that such mass storage creates a data breach risk, which they argued could threaten national security.

Operating in Russia already poses some unique challenges to technology companies, including data storage laws that led Apple to lease data center space in the country in September.

Snowden was revealed in March as the target of an investigation which resulted in the closure of email provider Lavabit in 2013.

Source: TheWHIR

Brexit and Europe: Business as Usual

Brexit. That’s the word on everyone’s lips, especially in Europe, since last Friday, when the results of the UK’s referendum revealed that the “will of the people” was to exit the European Union.

Truth be told, this “will of the people” is deeply divided: Scotland, Northern Ireland, and major cities like London, Manchester, Liverpool and Cardiff voted overwhelmingly to stay in the EU, while the rest of the UK voted to leave. Opinion also split sharply along age lines, with most voters over 60 choosing to leave while younger generations preferred to stay.

The chances that Brexit will happen are high, but to actually set things in motion the UK needs to invoke Article 50 of the Lisbon Treaty, something it has so far been reluctant to do; the referendum itself only gauged the opinion of the UK population on the EU and is not legally binding.

At this point, anything goes. Scotland wants to stay in the EU even if it means leaving the UK and standing on its own; similar voices are coming from Northern Ireland. To make matters worse, many people who voted to leave now want to reverse their votes; a petition calling for a new referendum on the same question has already gathered more than three million signatures. David Cameron, the current UK Prime Minister, said that as an advocate of the “stay” camp he no longer feels he is the right person to lead the UK into a future outside of the European Union, and announced his resignation.

The UK government still has a way out of the decision to leave Europe, and the coming days, weeks, and months will determine what actually happens. Right now anything is possible, as there is no legally binding obligation just yet. While EU leaders are growing impatient and want clarity on the UK’s final decision, the country has been put before a life-altering choice. From the possibility of the UK splitting apart, going from Great Britain to Little England, to possible repercussions from the EU side limiting the UK’s ability to trade with the “continent” as a show of strength and a lesson to any other EU members entertaining similar ideas of leaving, to losing its powerful seat on the EU council, the consequences of the UK’s final decision could be felt by more than one generation.

Threats of sudden changes to the UK government, a possible split of the country, and the prospect of a less favorable position for trading with the European Union (which accounts for 16.5 percent of the world’s imports and exports) had an immediate impact on global markets, bringing uncertainty not just to the UK but to the EU as a whole. Fears of financial and political instability sent a shockwave across financial markets, dropping the British pound to a 30-year low and increasing volatility on global stock exchanges. Several credit rating agencies stripped the UK of its AAA rating, citing “weakened predictability, stability, and effectiveness of UK policymaking.”

Europe

Does the prospect of Brexit make Europe less attractive as a single market? I don’t believe it does. Although the UK is a significant player within the EU, it is just one of the 28 countries that make up the bloc, and countries like Switzerland and Norway maintain free trade agreements with the EU from outside it. Europe and the United States have very similar GDPs, around €18.9 trillion for the EU and €18.3 trillion for the U.S. at the end of 2015. So while a British exit would dent these numbers, it will neither make nor break Europe’s attractiveness as a single trading market next to the U.S. or China. Europe is and will remain a very attractive market for foreign businesses looking for growth outside of their current geographical region, because access to a large single market is key to an “easy” international expansion.

Companies that want to expand to Europe need to understand that Europe’s resiliency is not built on a single country, and while businesses that opted to headquarter in the UK might face some unexpected challenges, there is more than one way to shine a penny. While even the shadow of a Brexit scared the financial markets, business in Europe will continue as usual, even under somewhat unusual circumstances.

Lessons learned

Some UK citizens are learning new facts, and coming to new conclusions, only after casting their votes. From a technological perspective, businesses do not need to take the hard way, nor learn everything from their own mistakes. For many companies, trading in the regional and global economy has become a digital business, and while some struggle to adapt to the era of digitalization, others embrace transformation and take the lead.

Agility is key in today’s shifting markets, and understanding technological solutions well enough to leverage them for competitive advantage is more important than ever, because business environments change continuously: customer needs shift, behavioral patterns evolve, and commoditized access to technology allows markets to be disrupted on a whim.

Companies that already operate internationally or have global ambitions need to focus on what’s important for their core business and limit their risks through partnerships. Although this is not possible for every company or vertical, no company should be investing in technology that is not core to its business and is already globally available as a commodity.

Whether accessing the European or the U.S. market from abroad, finding partners that can support your business locally limits your risks and lowers your investments, not to mention that it makes you far more resilient to hazards like regulatory changes. This rings especially true for tech companies, which tend to operate internationally and only recently had to deal with the invalidation of the Safe Harbor agreement. The Brexit scenario will add to the confusion and will certainly not make matters easier in the short or medium term.

Yet despite all these obstacles, the internet and cloud industries keep growing rapidly. Even though businesses are predicted to spend slightly less on Information Technology (IT) this year, we are still talking about a $3.49 trillion market for 2016, according to Gartner’s forecasts, with $22.4 billion going into IaaS expenditures this year.
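
To put those forecasts in proportion, IaaS is still a small slice of the overall IT market; a quick check using only the Gartner numbers cited above:

```python
# IaaS as a share of forecast 2016 IT spending, per the figures cited.
iaas_spend = 22.4e9   # forecast 2016 IaaS expenditure
total_it = 3.49e12    # forecast 2016 total IT market
print(f"IaaS share of IT spend: {iaas_spend / total_it:.2%}")  # ~0.64%
```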

After all, changing a data center or cloud services location is relatively “easy” when infrastructure is consumed as a service, compared to having to pack up, migrate, and sell off facilities. Partnering with IaaS and managed services providers (MSPs) is a way to avoid many predicaments from technology, legal, and market perspectives. Companies that offer IT as a service are generally much more versatile in dealing with local regulatory challenges.

Business as usual

The results of the UK’s referendum were rather unexpected, and while Brexit is not yet a fact, it is clear that even the mere threat of change can have a massive impact on the markets and that the business environment can literally change overnight. There is nothing businesses fear more than uncertainty and lack of stability.

There is a lesson to be learned here about designing for failure. Being at least somewhat ready for sudden market shifts and for political changes that impact trade and economic stability is an important part of becoming future-proof, and it should be incorporated into every company’s strategy. In the digital era and the information-driven economy, this is easier than ever before. Leveraging access to Infrastructure-as-a-Service, Software-as-a-Service, Network-as-a-Service, and commoditized IT in general is part of becoming agile as a company.

Bottom line: uncertainty and change are part of life and business (now more than ever), and even though you cannot foresee everything or avoid all risks, you can certainly plan for uncertainty and unexpected change by staying lean, mean, and agile as a business. To achieve that, find a trustworthy and experienced partner who will assist you in your business ventures.

As the English would say, “Keep Calm and Carry On”; just make sure to do your due diligence.

About the Author
Martin Wielomski is a Manager of Business Development for the EMEA region at PhoenixNAP Global IT Services. He has years of experience in the Information Technology and hosting industries and specializes in strategy development, international business, sales leadership, and product management. He also writes for several international IT-oriented magazines and blogs, and advises on technology, management, customer relations, sales, and international business expansion, leveraging the potential of human connection in his corporate strategies. Martin believes in lifelong learning and leadership through engagement, while maintaining a realistic, down-to-earth approach to people. He can be reached at: linkedin.com/in/martinwielomski

Source: TheWHIR

NASCAR is digitizing race day

In a sport where winning often comes down to thousandths of a second, data matters. NASCAR is going high-tech with new race management software.

NASCAR GOES HIGH TECH

At the Toyota/Save Mart 350, something new was on display: a race management system that gives officials a single screen from which to monitor the racetrack, see where cars are, review infractions, and share that information with teams.

It’s the result of 18 months of development that started here, in the inspection tent where NASCAR officials check cars to make sure everything is within the sport’s rules.

This all used to be done on slips of paper, so the sport worked with Microsoft to come up with a tablet-based app that collects, stores, and transmits the results of every test.
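
To illustrate what replacing slips of paper means in practice, here is a minimal sketch of an inspection record such an app might collect, store, and transmit. The fields and values are hypothetical; neither NASCAR nor Microsoft has published the actual schema:

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class InspectionResult:
    car_number: str   # hypothetical fields, not the actual NASCAR schema
    station: str      # e.g. "ride_height_mm", "weight_kg", "body_template"
    measurement: float
    within_spec: bool
    inspector: str
    timestamp: str

record = InspectionResult(
    car_number="48", station="ride_height_mm", measurement=101.6,
    within_spec=True, inspector="official_07",
    timestamp=datetime.now(timezone.utc).isoformat(),
)

# Serialized for storage on the tablet and later sync to the cloud.
print(json.dumps(asdict(record), indent=2))
```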

“Giving us data that we never had before, allowing us to see things from the inspection that we were never able to see before, because everything was paper-based.”

NASCAR says its inspection times have been cut in half.

So attention turned to the racetrack, where several independent systems were already in use, tracking different elements of the race. A new Windows 10 app brings them all together, with powerful results.

“During the race itself I can come down, I can look at info about a particular car, their last lap speed, average lap time, all of the pit stops they made.”

Cars are shown in their actual locations on this screen, and officials can pair that data with live video of the race. The app also pulls in video from pit stops so officials can see whether drivers broke any rules and, if so, share that with the teams in real time.

“If there was an infraction on pit row, we’d explain it verbally; we’d get back to you Monday or Tuesday, you’d have a question, ‘What really happened?’, and we’d try to explain it. In this case, with real-time video we can get it to the race team, hopefully have that communication: here was the infraction, meet with the team post-race, and here’s why the call was made.”

And it also allows officials to figure out exactly where cars were when race holds were called, perhaps in response to an accident. Sometimes cars end up in the wrong order, and the system helps sort that out so that when the green flag falls again, no car has lost a place.
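
At its core, restoring the running order is a sort over the field’s state at the instant the caution came out. A minimal sketch of that logic (our reconstruction; NASCAR’s actual algorithm is not public):

```python
def frozen_running_order(cars):
    """Freeze the field at the caution: most laps completed first,
    then furthest along the track."""
    return sorted(cars, key=lambda c: (-c["laps_completed"],
                                       -c["track_position_m"]))

# Hypothetical field state captured when the caution flag flew.
field = [
    {"car": "11", "laps_completed": 42, "track_position_m": 1800.0},
    {"car": "22", "laps_completed": 43, "track_position_m": 350.0},
    {"car": "4",  "laps_completed": 42, "track_position_m": 2100.0},
]
for pos, car in enumerate(frozen_running_order(field), start=1):
    print(pos, car["car"])  # 22 leads on laps, then 4, then 11
```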

It’s a powerful tool for a sport that runs in real time.

“NASCAR is the only sport where every second of every race counts; we don’t have time to call a timeout or a TV timeout and let the officials sit over on the side. We’ve gotta make a quick call, and this allows us to do that.”

Like most major sports, NASCAR is keen to use new technology to innovate and bring fans closer to the action. And that could happen in the future if this data is shared online, so fans can keep track during the race. There’s even the idea of employing machine learning to help officials predict things before they happen.

Source: InfoWorld Big Data

Comodo Drops Trademark Applications, Avoiding Legal Battle with Certificate Authority Let's Encrypt

Comodo has withdrawn applications for three trademarks involving the term “Let’s Encrypt” – a move that seems to be related to a plea by an open certificate authority of the same name urging Comodo to abandon its applications.

Let’s Encrypt is a free, automated, and open certificate authority run by the non-profit Internet Security Research Group (ISRG). Comodo’s Requests for Express Abandonment came within 24 hours of a blog post by the Let’s Encrypt project on Thursday last week, but it is unclear if the two are directly related.

The Let’s Encrypt project said in a blog post that it contacted Comodo regarding the trademark applications in March, and asked directly and through attorneys for Comodo to drop its applications, saying it is “the first and senior user” of the term.

Comodo filed applications to trademark the terms “Let’s Encrypt,” “Let’s Encrypt with Comodo,” and “Comodo Let’s Encrypt” for certificate authority and related services. The company acknowledges in its applications that these phrases had not been part of its branding before the filings in October.

The United States Patent and Trademark Office (USPTO) responded to Comodo’s application in February, asking for clarification of “identification and classification of goods and services.”

“We’ve forged relationships with millions of websites and users under the name Let’s Encrypt, furthering our mission to make encryption free, easy, and accessible to everyone,” ISRG executive director Josh Aas said in the blog post. “We’ve also worked hard to build our unique identity within the community and to make that identity a reliable indicator of quality. We take it very seriously when we see the potential for our users to be confused, or worse, the potential for a third party to damage the trust our users have placed in us by intentionally creating such confusion. By attempting to register trademarks for our name, Comodo is actively attempting to do just that.”

The Let’s Encrypt project was announced in November 2014, and it issued over a million SSL/TLS certificates in its first three months after launching late last year.

The organization argued it is most commonly associated with the term and has been using it longer, and will “vigorously defend” its brand.

Comodo did not respond to an email seeking comment.

Source: TheWHIR

Here's How Much Energy All US Data Centers Consume

Brought to you by Data Center Knowledge

It’s no secret that data centers, the massive but bland, unremarkable-looking buildings housing the powerful engines that pump blood through the arteries of the global economy, consume a huge amount of energy. But while our reliance on this infrastructure and its ability to scale capacity grows at a maddening pace, it turns out that, on the whole, the data center industry’s ability to improve energy efficiency as it scales is extraordinary.

The demand for data center capacity in the US grew tremendously over the last five years, while total data center energy consumption grew only slightly, according to results of a new study of data center energy use by the US government, released today. This is the first comprehensive analysis of data center energy use in the US in about a decade.

US data centers consumed about 70 billion kilowatt-hours of electricity in 2014, the most recent year examined, representing 2 percent of the country’s total energy consumption, according to the study. That’s equivalent to the amount consumed by about 6.4 million average American homes that year. Total data center energy consumption grew just 4 percent from 2010 to 2014, a huge change from the preceding five years, during which it grew 24 percent, and an even bigger change from the first half of the last decade, when it grew nearly 90 percent.
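
The report’s household equivalence can be verified from its own numbers; a quick sketch using only the figures above:

```python
# Implied consumption per home from the study's own figures.
total_kwh = 70e9   # US data center electricity use, 2014
homes = 6.4e6      # equivalent number of average American homes
print(f"Implied use per home: {total_kwh / homes:,.0f} kWh/year")  # ~10,938 kWh
```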

Efficiency improvements have played an enormous role in taming the growth rate of the data center industry’s energy consumption. Without these improvements, staying at the efficiency levels of 2010, data centers would have consumed close to 40 billion kWh more than they did in 2014 to do the same amount of work, according to the study, conducted by the US Department of Energy in collaboration with researchers from Stanford University, Northwestern University, and Carnegie Mellon University.

Energy efficiency improvements will have saved 620 billion kWh between 2010 and 2020, the study forecasts. The researchers expect total US data center energy consumption to grow by 4 percent between now and 2020 – the same growth rate as over the last five years – reaching about 73 billion kWh.

This chart shows past and projected growth rate of total US data center energy use from 2000 until 2020. It also illustrates how much faster data center energy use would grow if the industry, hypothetically, did not make any further efficiency improvements after 2010. (Source: US Department of Energy, Lawrence Berkeley National Laboratory)

Counting Electrons

Somewhere around the turn of the century, data center energy consumption started attracting a lot of public attention. The internet was developing fast, and many started asking questions about the role it was playing in the overall picture of the country’s energy use.

Many, including public officials, started ringing alarm bells, worried that continuing to power the growth of the internet would soon become a big problem. These worries were stoked further by the coal lobby, which funded pseudo-scientific research by “experts” with questionable motives, who said the internet’s power consumption was out of control, and that if society wanted it to keep growing, it wouldn’t be wise to continue shutting down coal-burning power plants.

The DOE’s first attempt to quantify just how much energy data centers were consuming, whose results were published in a 2008 report to Congress, was a response to those rising concerns. It showed that yes, this infrastructure was consuming a lot of energy, and that its energy use was growing quickly, but the problem wasn’t nearly as big as those studies of murky origins had suggested.

“The last [DOE] study … was really the first time data center energy use for the entire country was quantified in some way,” Arman Shehabi, research scientist at the DOE’s Lawrence Berkeley National Laboratory and one of the new study’s lead authors, said in an interview with Data Center Knowledge.

What authors of both the 2008 report and this year’s report did not anticipate was how much the growth curve of the industry’s total energy use would flatten between then and now. This was the biggest surprise for Shehabi and his colleagues when analyzing the most recent data.

“It’s slowed down, and right now the rate of increase is fairly steady,” he said. “There’s more activity occurring, but that activity is happening in more efficient data centers.”

See also: Cleaning Up Data Center Power is Dirty Work

Fewer Servers

There’s a whole list of factors that contributed to the flattening of the curve, but the most obvious one is that the number of servers being deployed in data centers is simply not growing as quickly as it used to. Servers have gotten a lot more powerful and efficient, and the industry has figured out ways to utilize more of each server’s total capacity, thanks primarily to server virtualization, which enables a single physical server to host many virtual ones.

Each year between 2000 and 2005, companies bought 15 percent more servers on average than the previous year, the study says, citing server shipment estimates by the market research firm IDC. The total number of servers deployed in data centers just about doubled in those five years.

Growth in annual server shipments dropped to 5 percent over the second half of the decade, due in part to the 2008 market crash but also to server virtualization, which emerged during that period. Annual shipment growth has been 3 percent since 2010, and the researchers expect it to remain there until at least 2020.
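
Compounding those annual rates over five-year spans shows why the fleet “just about doubled” in the first period and grew far more slowly afterward (a sketch using only the rates cited):

```python
# Five-year multiplier implied by each annual shipment growth rate.
for label, rate in [("2000-2005", 0.15), ("2005-2010", 0.05), ("2010-2020 est.", 0.03)]:
    print(f"{label}: x{(1 + rate) ** 5:.2f} over five years")
# 15%/yr -> x2.01 (a doubling); 5%/yr -> x1.28; 3%/yr -> x1.16
```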

The Hyperscale Factor

The end of the last decade and beginning of the current one also saw the rise of hyperscale data centers, the enormous facilities designed for maximum efficiency from the ground up. These are built by cloud and internet giants, such as Google, Facebook, Microsoft, and Amazon, as well as data center providers, companies that specialize in designing and building data centers and leasing them to others.

According to the DOE study, most of the servers that have been responsible for that 3 percent annual increase in shipments have been going into hyperscale data centers. The cloud giants have created a science out of maximizing server utilization and data center efficiency, contributing in a big way to the slow-down of the industry’s overall energy use, while data center providers have made improvements in efficiency of their facilities infrastructure, the power and cooling equipment that supports their clients’ IT gear. Both of these groups of data center operators are well-incentivized to improve efficiency, since it has direct impact on their bottom lines.

The amount of applications companies deployed in the cloud or in data center provider facilities started growing as well. A recent survey by the Uptime Institute found that while enterprise-owned data centers host 71 percent of enterprise IT assets today, 20 percent is hosted by data center providers, and the remaining 9 percent is hosted in the cloud.

This chart shows the portion of energy use attributed to data centers of various types over time. SP data centers are data centers operated by service providers, including both colocation and cloud service providers, while internal data centers are typical single-user enterprise data centers. (Source: US Department of Energy, Lawrence Berkeley National Laboratory)

Additionally, while growth in server deployments has slowed, the amount of power each server needs has also stopped climbing the way it used to. Server power requirements increased from 2000 to 2005 but have been relatively static since then, according to the DOE. Servers have gotten better at reducing power consumption when running idle or at low utilization, while the underlying data center power and cooling infrastructure has gotten more efficient. Storage devices and networking hardware have also seen significant efficiency improvements.

See also: After Break, Internet Giants Resume Data Center Spending

From IT Closet to Hyperscale Facilities

To put this new data in perspective, it’s important to understand the trajectory of the data center industry’s development. It was still a young field in 2007, when the first DOE study was published, Shehabi said. There was no need for data centers not too long ago, when instead of a data center there was a single server sitting next to somebody’s desk. They would soon add another server, and another, until they needed a separate room or a closet. Eventually, that footprint increased to a point where servers needed dedicated facilities.

All this happened very quickly, and the main concern of the first data center operators was keeping up with demand, not keeping the energy bill low. “Now that [data centers] are so large, they’re being designed from a point of view of looking at the whole system to find a way to make them as efficient and as productive as possible, and that process has led to a lot of the efficiencies that we’re seeing in this new report,” Shehabi said.

Efficiency Won’t Be the Final Answer

While the industry as a whole has managed to flatten the growth curve of its energy use, it’s important to keep in mind that a huge portion of all existing software still runs in highly inefficient data centers, the small enterprise IT facilities built a decade ago or earlier that support applications for hospitals, banks, insurance companies, and so on. “The lowest-hanging fruit will be trying to address efficiency of the really small data centers,” Shehabi said. “Even though they haven’t been growing very much … it’s still millions of servers that are out there, and those are just very inefficient.” Going forward, it will be important to find ways to either make those smaller data centers more efficient or to replace them with footprint in efficient hyperscale facilities.

See also: The Problem of Inefficient Cooling in Smaller Data Centers

As with the first data center study by the DOE, the new results are encouraging for the industry, but they don’t indicate that it has effectively addressed energy problems it is likely to face in the future. There are only a “couple of knobs you can turn” to improve efficiency – you can design more efficient facilities and improve server utilization – and operators of the world’s largest data centers have been turning them both, but demand for data center services is increasing, and there are no signs that it will be slowing down any time soon. “We can only get to 100 percent efficiency,” Shehabi said.

Writing in the report on the study, he and his colleagues warn that as information and communication technologies continue to evolve rapidly, new systems and services are likely being deployed “without much consideration of energy impacts.” Unlike 15 years ago, however, the industry now has far more knowledge about deploying these systems efficiently; waiting too long to identify efficient deployment strategies could lead to setbacks in the future.

“The potential for data center services, especially from a global perspective, is still in a fairly nascent stage, and future demand could continue to increase after our current strategies to improve energy efficiency have been maximized. Understanding if and when this transition may occur and the ways in which data centers can minimize their costs and environmental impacts under such a scenario is an important direction for future research.”

Source: TheWHIR

5 Cybersecurity Stories You Need to Know Now, June 27

It’s Monday, and it’s almost July if you can believe it. But even though you may feel like you’re in summer vacation mode with the Fourth of July just around the corner, hackers really don’t seem to take a holiday. Here are the 5 cybersecurity stories you need to know as you start your week.

1. China Is Another Step Closer to Controversial Cybersecurity Law

Here’s something to keep you up at night: China is going through the steps to bring a controversial cybersecurity draft law into practice, which would require network operators to “comply with social morals and accept the supervision of the government and public” according to a report by Fortune. The law would also require data belonging to Chinese citizens to be stored domestically. It is not clear when it will be passed as parliament just held a second draft reading of the bill, but it will be something to watch closely if you do business in China.

SEE ALSO: U.S. Closely Eyeing China’s Corporate Hacking Vow, Official Says

2. Intel Considers Sale of Cybersecurity Division: Report

Intel is looking at selling its Intel Security division, which it formed after acquiring McAfee back in 2010. The deal could fetch the company – which is shifting away from PCs to data centers and IoT – as much as the $7.7 billion it paid six years ago.

3. Everyone’s Waiting for the Next Cybersecurity IPO

Cybersecurity is hot, but there have still been only two US tech IPOs this year. An uncertain market is keeping would-be IPOs from moving forward, according to a report by Fortune, putting a damper on “an otherwise vibrant cybersecurity sector.”

READ MORE: Dell’s Cybersecurity Unit SecureWorks Files for IPO

4. Security Sense: The Ethics and Legalities of Selling Stolen Data

The WHIR sister site Windows IT Pro has a really interesting take on the mass data breaches we’ve been seeing lately (think LinkedIn, MySpace) and the ethics around those selling the data, challenging common defenses used by those who profit off stolen credentials. It’s definitely worth a read.

5. How Healthcare Cybersecurity is Affected by Cyber Sharing Law

The Cybersecurity Act was signed into law in December 2015 and several industry stakeholders gathered earlier this month to discuss its impact on healthcare cybersecurity, according to a report by HealthITSecurity. If you’ve got clients in the healthcare sector, you will want to take a look for sure.

Source: TheWHIR

Two-Thirds of Companies See Insider Data Theft, Accenture Says

By Matthew Kalman

(Bloomberg) — As businesses spend billions of dollars a year trying to protect their data from hacking that’s costing trillions, they face another threat closer to home: data theft by their own employees.

That’s one of the findings in a survey to be published by management consultant Accenture Plc and HfS Research on Monday.

Of 208 organizations surveyed, 69 percent “experienced an attempted or realized data theft or corruption by corporate insiders” over the past 12 months, the survey found, compared to 57 percent that experienced similar risks from external sources. Media and technology firms, and enterprises in the Asia-Pacific region reported the highest rates — 77 percent and 80 percent, respectively.

READ MORE: Basic Security Training for Employees Not Enough to Stop Data Breaches: Report

“Everyone’s always known that part of designing security starts with thinking that your employees could be a risk but I don’t think anyone could have said it was quite that high,” Omar Abbosh, Accenture chief strategy officer, said in an interview in Tel Aviv, where he announced Accenture’s purchase of Maglan Information Defense & Intelligence Group, an Israeli security company.

Businesses currently spend an estimated $84 billion a year to defend against data theft that costs them about $2 trillion, damage that could rise to $90 trillion a year by 2030 if current trends continue, Abbosh forecast. He recommended that corporations change their approach to cybersecurity by cooperating with competitors to develop joint strategies to outwit increasingly sophisticated cyber-criminals.
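
Abbosh’s forecast implies a steep compound growth rate in breach costs. A rough check (a sketch; the 2016 baseline year is our assumption):

```python
# Implied annual growth: $2 trillion today to $90 trillion by 2030.
current_cost, future_cost = 2e12, 90e12
years = 2030 - 2016   # assumed 2016 baseline
cagr = (future_cost / current_cost) ** (1 / years) - 1
print(f"Implied annual growth in breach costs: {cagr:.0%}")  # ~31%
```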

SEE ALSO: Shadow IT: Embrace Reality – Detect and Secure the Cloud Tools Your Employees Use

“There’s a huge business rationale to share and collaborate,” he said. “If one bank is fundamentally breached in a way that collapses its trust with its customer base, I could be happy and say they’re all going to come to me, but that’s a false comfort” because “it pollutes the whole sphere of customers because it makes everyone fearful,” he said.

Despite recent high-profile data breaches of Sony Corp., Target Corp. and the U.S. Office of Personnel Management, many corporations do not yet consider cybersecurity a top business priority, Accenture found. Seventy percent of the survey’s respondents said they lacked adequate funding for technology, training or personnel needed to maintain their company’s cybersecurity, while 36 percent said their management considers cybersecurity “an unnecessary cost.”

Source: TheWHIR