Upgrade to 5G Costs $200 Billion a Year and May Not Be Worth It

(Bloomberg) — In the wildest dreams of wireless engineers, the mobile network of the future controls our cars, lets our refrigerators talk to the grocery store to order more milk, and provides fast, reliable broadband connections to our homes so we can sever ties with cable companies.

But it’s going to cost the mobile-phone companies, chipmakers, device manufacturers and software developers about $200 billion a year in research and capital spending to get to that point, with engineers laboring to work around interference from trees and rain and provide a strong enough signal to handle so much demand.

Even if they’re successful, making a profit on that investment will be difficult in an industry that isn’t growing much anymore. In most developed countries, like the U.S., the wireless market has reached saturation, and there are few new subscribers to sign up without undercutting rivals on price.

“Historically, 1G to 4G, it’s been a pretty straightforward evolution from the point of view of business and technology,” said Chetan Sharma, a wireless consultant. “The revenue grew proportionate to the usage.”

5G, as the next-generation wireless network is known, is already beginning to arrive, as a handful of carriers including Verizon Communications Inc. move from trials to deployments. The first technical standards everyone can use to design their networks, phones and chips for 5G will be released at a summit that starts Monday in Lisbon.

Most mobile-phone companies are targeting 2020 for the initial rollout of the technology, which promises 10 times faster speeds and lower latency, or lag time in transferring data when it’s requested. After that, wireless carriers’ revenue will grow about 2.5 percent a year through 2025 — only about half a percentage point more than their growth in the prior five years, according to industry group GSMA.
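To put the GSMA figures above in perspective, compounding the two growth rates over the rollout window shows how small the 5G revenue bump is. This is a hedged arithmetic sketch (the function and the indexed-to-100 framing are mine, not GSMA's):

```python
def projected_revenue(base: float, annual_growth: float, years: int) -> float:
    """Compound a base revenue figure at a fixed annual growth rate."""
    return base * (1.0 + annual_growth) ** years

# Illustrative only: index carrier revenue to 100 and compare the cited
# ~2.5% annual rate with the prior period's ~2.0% rate over seven years.
print(round(projected_revenue(100.0, 0.025, 7), 2))  # about 118.87
print(round(projected_revenue(100.0, 0.020, 7), 2))  # about 114.87
```

Roughly four points of cumulative difference after seven years, against about $200 billion a year in spending.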

This time around, it’s not clear that 5G will translate into more revenue until perhaps five or 10 years from now, Sharma said. New applications like the Internet of Things — using wireless connectivity to let machines on the factory floor talk to each other, and for autonomous cars on the freeway to talk to light signals — may take years to materialize, and may not pay that much.

First, engineers have to figure out how to make 5G work. Rain, fog and trees have long been the enemy of high frequency radio waves. AT&T Inc. is among the companies that have been exploring the problem. With environmental conditions “you get degradations but we haven’t lost signals completely,” said Andre Fuetsch, president of AT&T Labs and chief technology officer.

Given the relatively short, fragile nature of high-frequency 5G signals, carriers have to configure networks differently. They’re shifting more of the network hardware from tall towers that are scattered to spread signals over broad areas, to smaller, more clustered sites like rooftops and street poles.

These “small cells” use cabinets that look like mini-refrigerators mounted on poles or rooftops. Inside the cabinets there’s an array of more than 1,000 antennas, says Ed Chan, senior vice president of network planning for Verizon. In dense, urban areas, network engineers will have to install lots of small cells to handle demand for data, adding to the costs of 5G.

Some companies, including Verizon, aim to make money by offering up 5G as an alternative to home broadband connections, competing with cable and landline phone providers. High costs could make that commercially unviable.

“Carriers are all looking at 5G for fixed wireless broadband, even though the technology isn’t particularly well suited to that application,” said Craig Moffett, an analyst at MoffettNathanson LLC. “That’s largely because it is almost impossible to identify any other real revenue opportunities for the technology.”

Verizon says it will take a targeted approach, which will require a few years of spending before there’s enough 5G service to prove that it can be a viable source of revenue.

The company has told analysts and investors that the goal for the first phase of its 5G network is to build a coverage area of 30 million homes starting next year in Sacramento, California, and possibly four additional cities. It will take a “short number of years” of network investment to have 5G within reach of that many homes and a few years before it shows some returns, Verizon Chief Financial Officer Matt Ellis said at a UBS investor conference earlier this month.

“As we’ve seen with other products, you build up to your penetration levels over some period of time,” Ellis said. After enough customers sign on, it “starts to be significant to our financials in the next two to three years,” he said.

Each of the top four wireless carriers in the U.S. has decided to take a different path with 5G. AT&T plans to sell both a direct-to-home wireless service and a mobile service. T-Mobile US Inc. is adapting its network to 5G through a software update, and Sprint Corp. is planning to upgrade its antenna towers with advanced network gear. Neither T-Mobile nor Sprint has specified how it plans to generate revenue from 5G.

Verizon and AT&T have been testing the 5G technology in controlled settings. The trials have gone well enough to take the service to consumers starting late next year.

Sprint plans to boost network spending from about $4 billion this year to as much as $6 billion in 2018 as it kicks off a four-part upgrade plan, Chief Financial Officer Tarek Robbiati said at an investor conference earlier this month.

T-Mobile, meanwhile, says it will be the first to offer 5G service nationwide, though it’s not clear whether the company is referring to the same type of technology its rivals are implementing.

“The capital we’re putting in the ground, starting now, is future ready,” Chief Operating Officer Mike Sievert said this month at the same conference. “We’re not hyping it right now like our competitors because we actually have a story in the part of a business where the revenues and profits are.”

Source: TheWHIR

2017 Holiday Gift Guide: 35 Tech Gifts for Every Budget

Happy Holidays everyone! It’s that fateful time of year when snow comes around (even in Texas) and we scramble to check off our holiday lists. If you’re like me, you may be stuck on what to gift your favorite people.

This challenge becomes even more of a head-scratcher if you’re a techie and everything new coming out looks so cool.

Before we get into this year’s awesome list, a couple of points to remember:

  1. Creativity goes a long way. It doesn’t have to be large and shiny for your friends and family to enjoy. In fact, the more thought you put into it, the more valuable your present will become. There are so many cool, inexpensive ideas out there for you to get creative with. Let me give you an example: Star Wars gadgets and gifts. There’s a fun Star Wars BB-8 USB Car Charger, or maybe an awesome Millennium Falcon Star Wars Lighting Gadget Lamp. Both are under $50 and are super fun. If you’re a slightly bigger Star Wars spender, definitely check out the Sphero Star Wars BB-8 App Controlled Robot with Star Wars Force Band. The fun doesn’t have to end there: there are other super fun ideas around popular shows like Stranger Things and even Doctor Who (check out this Doctor Who TARDIS Bottle Opener with Sound FX Effects).
  2. Before you get them a gadget, exercise some caution. I’ve mentioned this before, but it’s an important yearly reminder. Just because you think it’s awesome and amazing doesn’t mean your friends or relatives will like the gift. Put yourself in their shoes. Will they use the device? Will they enjoy it? Is it really a great idea to get your grandparents an Amazon Echo? My best piece of advice when getting a techie gift for someone is to spend a second thinking from their perspective. This will give you a better idea of what they want and how they’ll use the gift. Remember, creativity goes further than just getting them a ‘thing.’ Finally, there are some gifts you should either avoid or be ready to see re-gifted. Before you get anyone another Bluetooth speaker, a battery pack, or a light-up fidget spinner, see if they don’t already have five of them. A lot of these kinds of presents have really become the ‘scented candle’ of techie gifts.

Finally, if you’re curious to see our list for amazing gifts for the previous two years – many of which are still very applicable – you can see them here:

And now – for our holiday gift list!

  1. Home automation is still really cool, and a lot less complicated (and less expensive!) Let’s start with all the cool things Amazon has been doing. It started off with the Echo, and Amazon now has an entire line of fun products. I’ve already worked with the original Echo, the new updated Echo, the old and new Echo Dot, and I’ve really enjoyed the new Echo Show. Beyond that, there are updated Fire Tablets, the Amazon Cloud Camera, and the new Echo Spot (think of that as a mini Echo Show). I’ve honestly been enjoying all of these as great devices to introduce greater levels of home automation. When it comes to gift ideas, you don’t have to splurge. The new Echo Dot is on sale for a limited time (40 percent off) and you can even bundle it with a few other smart home gadgets like a Bose speaker, the Logitech Harmony Hub, and a Sonos Play:5 system. One of the coolest bundles that I really enjoy is the one with the Philips Hue. At less than $80, you can get a smart home Echo Dot and pair it with a very fun Philips Hue lighting system. From there, just ask Alexa to change the lights for you.

If you’re looking to go a bit further with home automation, I absolutely recommend browsing the Wink website for lots of great ideas. Wink can act as a centralized hub that will aggregate almost all the smart devices in your home. This can range from everything like your smart lock and Nest thermostats, to your Alexa-enabled devices.

  1. Home security is getting a lot smarter. It feels like everyone is getting into the smart home security business. And it’s a great business to be in. I’m a big fan of Ring and its security products. I was one of the early adopters of its doorbell and now have the new Video Doorbell Pro. Ring was one of the very first to get into the video doorbell market and continues to make waves. It has since evolved past just doorbells and now offers Spotlight cameras, Floodlight cameras, and even its own security system. The entire protection security kit starts at $199. From there, there are lots of fun sensors you can add for flooding, freezing, smoke/CO, and more.

Ring isn’t the only one in this game. Nest continues to lead with smart home innovation. I’ve been a huge fan of its cameras, thermostats, and smoke/CO systems. I’ve even pre-ordered a few of the new Nest Cam IQ outdoor cameras. Right now, Nest has a lot of new offerings to explore if you’re looking for an awesome gift. This includes a new line of thermostats: the new Nest Thermostat E is designed to be a less expensive, yet still powerful, version of the original Nest Learning Thermostat. Similarly, you’ll see new editions of its cameras as well. The new Nest Secure alarm system comes with lots of sensors, is designed to be super easy to use, and has some unique gadgets like the Nest Tag, which lets you arm and disarm your system quickly. Plus, Nest has come a long way with things like facial recognition, awareness, and alerting. Otherwise, solutions from folks like Alarm.com and SimpliSafe also make great gifts by making home security even easier.

  1. Something for the baby: So many people in my life are having kids, and lots of my millennial friends are also looking to leverage some smart baby tools to make parenting just a tiny bit easier. To help out, there are lots of fun baby-related techie gift ideas. First of all, check out this Smart Infrared Ear Thermometer with Touchable LED Display. Priced at just under $60, it’s non-invasive, accurate, and super-fast.

Another really cool gift idea is the technology from Owlet. The Owlet Smart Sock silently tracks your infant’s heart rate and oxygen levels while they sleep and notifies parents if something is wrong. It’s not the least expensive gift out there, but for connected parents it’s pretty cool.

Finally, we come to the baby monitor. Invented in 1937, and reinvented (according to Nanit) in 2017. We’ve all heard of traditional baby monitors being hacked, where intruders not only see the kid but can actually talk to the child as well. Nanit takes a completely different approach to baby monitoring. According to Nanit, the device was designed to live next to your newborn: from the stable camera stand and wall mount to its infrared light, every hardware detail has been crafted with safety in mind, and Nanit’s data is secured with end-to-end encryption. With safe cable management and a shatter-resistant lens included, this is definitely the next generation in baby monitors.

  1. Something for the pets: Where would we be without our beloved pets? If you have a puppy or a kitty in your life that deserves an awesome gift, you have to check these out. First is the Pawbo Life Wi-Fi Pet Camera. This fun little gadget comes with 720p HD video, 2-way audio, video recording, a treat dispenser, and a laser game. If you or your friends have a cat at home, the manual laser game that you can control from your phone is so much fun.

Next is the Furbo Dog Camera. This is a treat-tossing, full HD Wifi pet camera and 2-way audio gadget. Designed for dogs, Furbo uses dog recognition technology to send you dog activity alerts, person alerts, and even a dog selfie alert! Oh, and guess what – it works with Amazon Alexa.

This next gift idea isn’t entirely cheap. But, when it comes to spoiling a loved pet – this is definitely a posh gift! The PetChatz HD & PawCall Bundle is a two-way audio/video pet treat camera with DogTV. It also includes brain games, recording capabilities, aromatherapy scents, motion & sound detection, and even a call mode. One of the really cool features here is the included PawCall function. This is basically an interactive brain game with your pup where your dog can actually contact you. No, I’m not kidding.

If you’re looking for a bit more of a reasonable stocking stuffer for your pet, check out the Pooch Selfie: The Original Dog Selfie Stick. This is the first smartphone accessory that actually ups your pup’s selfie game. Priced at a reasonable $12, you basically strap a squeaker ball for maximum attention grabbing to the top of your smartphone. From there, you get the absolute best selfies you will ever take with your pup.

  1. Great gifts under $50: You don’t need to spend too much to get something cool for your friends. There are lots of fun ideas for gifts under $50. For those friends that love to cook – get them a fun Wireless Meat Thermometer. I actually use mine all the time for baking, grilling, and overall cooking. This lets you keep the oven door closed while still getting updates on temperatures. The best part is that there’s an app, so you can keep an eye on your cooking from anywhere in the house.

Another fun idea is a VR system. Yes, you can get the new Oculus Rift+Touch Virtual Reality System for $400, but there are others which are still fun and a lot less expensive. This VR Headset with Bluetooth Remote Controller is priced at $38. This little guy is great for large viewing immersive experiences, looking at 3D movies, and even playing 3D games on your smartphone.

Here’s another fun one to help you calm down over the holidays. The Ocean Wave Night Light creates a relaxing, comforting, soothing, and brilliant light show for viewing entertainment and pleasure. At $25, this is an awesome gift to create ambience and a relaxing atmosphere for your friends.

Finally, as I mentioned earlier, check out what Amazon has as far as deals. The new Fire HD 8 Tablet with Alexa is priced at $49 (30 percent off right now) and is a great little device. There is even a kid-proof version of the Fire Tablet.

Whatever you decide to gift, the most important part here is that you have fun, spend time with your friends and family, and really enjoy the holiday season. Remember, if you can’t think of a gift, donating to a charity in someone’s name is always an amazing gift.

Source: TheWHIR

MATRIXX Software Launches Digital Commerce Solution On Google Cloud

MATRIXX Software has announced the availability of MATRIXX Digital Commerce on the Google Cloud Platform. A comprehensive solution, MATRIXX Digital Commerce is a single platform that brings together traditionally separate functions including: product lifecycle management, customer engagement, service delivery and monetization. By offering this innovative cloud-deployment capability, MATRIXX Software now makes it possible for Communications Service Providers (CSPs) to leverage the benefits of Google Cloud for rapid digital transformation.     

CSPs embarking on digital transformation journeys are seeking cloud-native, fast-start solutions in lieu of large-scale IT transformations, which carry substantial risk. By leveraging MATRIXX Digital Commerce in Google Cloud, CSPs have a fast-start, low-cost option to digitize IT operations. MATRIXX Software gives telcos an alternative to continuing steep investments in decades-old BSS infrastructure, which is inherently too complex and slow to support today’s dynamic market requirements.

“We are seeing more demand for real-time commerce capabilities deployed via the cloud,” explained Dave Labuda, founder, CTO and CEO of MATRIXX Software. “We’ve chosen to work with Google because they are the disruptive cloud vendor in terms of price, innovation and performance. In addition, Google’s well-provisioned global network, with hundreds of thousands of miles of fiber optic cable, makes them the right partner to bring our network-grade application into the public cloud.”

Leveraging a public cloud option, CSPs can quickly automate processes and streamline operations. By replacing outdated and complex technology with MATRIXX Digital Commerce in Google Cloud, CSPs deploy a simpler IT architecture that enables business agility and delivers a much-improved experience to customers — at a fraction of the operational costs incurred via traditional BSS.

Rich Karpinski, principal analyst for mobile operator strategies at 451 Research, said, “While some operators continue to focus solely on NFV, many are looking for a quicker win with their IT operations transformation — an activity that all too often in the past has been a long, expensive project that failed to pay promised dividends. As NFV is still evolving and maturing, going with a public cloud offering that supports network-grade applications and operations today could provide CSPs a faster, more cost-effective path forward.”

MATRIXX Software chose Google Cloud because of its unique ability to provide telco-grade scalability and performance with network-grade application and security support, embedded analytics, and a level of price performance that provides a new benchmark of economics for telco transformation and operations.

MATRIXX Software has been working with the Google Cloud Platform since the beginning of 2017 to define, test, benchmark and validate the environment. Using Compute Engine, Cloud Load Balancing and Virtual Private Cloud services from Google Cloud, MATRIXX Software has benchmarked supporting tens of millions of subscribers, using a small number of VMs, and full support for both local and geographical redundancies to provide telco-grade reliability and availability.

MATRIXX Digital Commerce in Google Cloud provides:

  • A quick-start environment and architecture to fast-track digital transformation
  • Virtual managed network capabilities
  • Strong organizational and platform security
  • Local and geo-redundancy to provide five-nines of availability
  • Seamless horizontal scalability
  • The ability to start small but scale to support Tier One telco customer and network traffic volumes
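The “five-nines of availability” claim in the list above corresponds to a concrete downtime budget. A minimal sketch of the arithmetic (the function name and minutes-per-year framing are mine, not MATRIXX’s):

```python
def downtime_budget_minutes(availability: float,
                            period_minutes: float = 365 * 24 * 60) -> float:
    """Maximum downtime, in minutes, permitted per period at a given availability."""
    return (1.0 - availability) * period_minutes

# "Five nines" (99.999%) allows roughly 5.26 minutes of downtime per year;
# "three nines" (99.9%), by contrast, allows nearly nine hours.
print(round(downtime_budget_minutes(0.99999), 2))  # about 5.26
print(round(downtime_budget_minutes(0.999) / 60, 1))  # about 8.8 hours
```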

The first live deployments of MATRIXX Digital Commerce in Google Cloud will launch in early 2018.  

Source: CloudStrategyMag

Masergy Integrates Managed Cloud Workload Protection Into MDR Platform

Masergy has announced the availability of its Managed Cloud Workload Protection solution. The managed offering reduces threats to enterprise cloud services by delivering rigorous protection, detection and response capabilities for AWS, Azure, Google and private cloud environments. This innovative solution is the latest integration into Masergy’s comprehensive managed detection and response (MDR) platform to mitigate risk, offload incident monitoring and improve security outcomes.

“Migration to IaaS/PaaS environments offer tremendous business benefits, but as we’ve seen with recent high-profile cloud breaches, a single misconfiguration or missing control can be detrimental,” said Jay Barbour, director of security product management, Masergy. “Our managed solution delivers best practices for tracking vulnerabilities and ensuring proper configuration. With 24/7 monitoring from our global team of certified security experts we provide real-time alerting and incident response.”

According to Gartner’s “How to Make Cloud IaaS Workloads More Secure Than Your Own Data Center” (G00300337), “95% of cloud security failures will be the [IaaS] customer’s fault.” In addition, “adopting the best practices outlined in this research will require changes to security culture, mindsets and processes, thus slowing adoption.”

Masergy addresses Gartner’s recommendations and goes even further by extending patented security analytics capabilities and mature processes to cloud computing environments in partnership with CloudPassage. This continues Masergy’s expansion of its Managed Detection Response ecosystem with a cost-effective solution that provides scalable coverage for servers, virtual machines, cloud operating systems and containers.

Benefits include:

  • Automates best practices and accelerates enterprise migration to cloud computing while also supporting legacy on-premise servers and data centers
  • Frees up critical in-house security resources from time-consuming 24/7 alert monitoring and triage
  • Protects against sophisticated attackers and security misconfigurations with extensive security controls and audit capabilities

Gartner, “How to Make Cloud IaaS Workloads More Secure Than Your Own Data Center,” Neil MacDonald, Lydia Leong, Terrence Cosgrove, Refreshed: 4 October 2017 | Published: 24 June 2016.

Source: CloudStrategyMag

Cybersecurity for Novices Has U.K. Firm Trouncing Silicon Valley

(Bloomberg) — In a world where protecting against cyber crime is high on most big business agendas, a U.K. provider of IT security to clients as small as dentists and neighborhood stores is outpacing the best that Silicon Valley has to offer.

Sophos Group Plc shares have more than doubled in 2017, beating every other stock in the Nasdaq CEA Cybersecurity Index, including larger California-based peers such as Symantec Corp. and Palo Alto Networks Inc. The stock has also left domestic equities trailing, being one of the top five performers in the U.K.’s FTSE All-Share Index.

Investors’ appetite is understandable. After this year’s global WannaCry ransomware attacks and headline-grabbing hacks at Uber Technologies Inc. and Equifax Inc., demand for cyber security has never been greater — whether you are a multinational corporation or a local shop owner. It’s a platform that’s giving Sophos some lofty ambitions in a British technology sector that was jolted by the $32 billion Japanese takeover of ARM Holdings in July 2016.

“We should be, we will be, the U.K. tech champion,” Chief Financial Officer Nick Bray said in an interview.

To get there, Bray will need to overtake software giants including Sage Group Plc and his former employer Micro Focus International Plc, whose market value of about 10.8 billion pounds ($14.5 billion) dwarfs Sophos’s 2.5 billion pounds.

The executive’s optimism is mostly shared by analysts, with nine out of 10 having buy recommendations on the stock and none advising clients to sell. Morgan Stanley named Sophos its top European technology sector pick for 2018 in a note on Friday. Yet, after this year’s gains, not all are bullish: KeyBanc’s Rob Owens cut Sophos to sector weight last month when it was trading about 12 percent above its current price of 541 pence.

“It’s had a heck of a run,” Owens said in an interview. “It’s not overly expensive, but it’s not overly cheap anymore.” Trading about 31 times calendar 2018 free cash flow, the stock is “fairly valued” compared with companies like Symantec and Qualys Inc., he said.

Demand for Sophos’s services is growing as cyber crime tactics evolve. According to Bray, criminal gangs are changing tack and aiming hacks at a large number of smaller companies instead of a handful of bigger corporations, making cybersecurity “very relevant’’ for smaller firms.

Sophos’s products are aimed at mid-market businesses with up to 5,000 employees, but also include very small companies that are “playing catch up” with the need to protect against cyberattacks, he said. A dental practice could lose access to its patient records, for example.

While knowledge of IT security remains in the “very embryonic” stages, both general awareness and the sophistication of customers is increasing, Bray said.

The same game of catch-up appears to be happening with investors. Bray said Sophos was initially “misunderstood” when it sold stock in a 2015 initial public offering at 225 pence a share, having pulled a previous attempt at an IPO in 2007. A challenge of listing in London over the U.S. was that “we had to get the awareness up of what we did,” Bray said.

Sophos has “spent a lot of time educating” investors on the size of its addressable market, its competitive position and its financial model, specifically the size of its customer base and its renewal rate, the executive said. In November, the company raised its outlook for the 2018 financial year, reporting 22 percent first-half billings growth and hailing its renewal rate, a growing base of subscription revenue and a 220 percent rise in sales of its Sophos Central cloud platform.

After riding a wave in technology stocks for much of the year, Sophos shares have slipped back with the sector in recent weeks, also weighed down by a share placing by early investor Apax Partners. That and some recent share sales by directors led to oversupply, a weak share price and “misplaced concern about the fundamentals, which are very strong,” said Numis analyst David Toms, who upgraded the stock to add from hold last week.

Organic growth remains Sophos’s primary focus, according to Bray. The company is always evaluating “targeted technology tuck-ins” to boost its offering via mergers and acquisitions. Any deals it does pursue, like the 2015 purchase of Dutch endpoint protection firm SurfRight and this year’s acquisition of the software product arm of Invincea, would be done to expand the product offering and boost cross-selling scope, burnishing its organic growth potential.

Yet the pace of the company’s growth raises the question of whether Sophos may itself become a target. While no board members want to sell, “it’s not impossible,” Bray said. “You can never stop somebody knocking at the door.”

Source: TheWHIR

Net Neutrality Rules Swept Aside by Republican-Led U.S. FCC

(Bloomberg) — The U.S. Federal Communications Commission swept aside rules barring broadband providers from favoring the internet traffic of websites willing to pay for speedier service, sending the future of net neutrality on to a likely court challenge.

The Republican-led commission voted 3-to-2 on Thursday to remove Obama-era prohibitions on blocking web traffic, slowing it or demanding payment for faster passage via their networks. Over objections from its Democrats, the FCC gave up most authority over broadband providers such as AT&T Inc. and Comcast Corp. and handed enforcement to other agencies. The changes won’t take place for at least two months.

“It is time for us to restore internet freedom,” said FCC Chairman Ajit Pai, who was chosen by President Donald Trump to lead the agency, and who dissented when the FCC adopted the rules under Democratic leadership in 2015. “We are restoring the light-touch framework that has governed the internet for most of its existence.”

“This decision puts the Federal Communications Commission on the wrong side of history, the wrong side of the law, and the wrong side of the American public,” said Jessica Rosenworcel, a Democratic member who voted against changing the rules.

The change frees broadband providers to begin charging websites for smooth passage over their networks. Critics said that threatens to pose barriers for smaller companies and startups, which can’t afford fees that established web companies may pay to broadband providers, or won’t have the heft to brush aside demands for payment. Broadband providers said they have no plans for anti-competitive “fast lanes,” since consumers demand unfettered web access.

The FCC’s vote concludes a tumultuous eight-month passage since Pai proposed gutting the earlier rules. The agency took in nearly 24 million comments, but many of those appeared to be of dubious origin, including almost half a million routed through Russia. Dozens of Democratic lawmakers expressed opposition, while Republicans lauded Pai’s plan.

The FCC’s action will “return the internet to a consumer-driven marketplace free of innovation-stifling regulations,” Senate Majority Leader Mitch McConnell, a Kentucky Republican, said in remarks prepared before the agency’s vote.

Democratic Senator Amy Klobuchar, of Minnesota, said the FCC with its vote “will put internet service providers, not consumers, in charge of determining the future of the internet.”

Pai argued that the Obama-era rules brought needless government intrusion to a thriving sector, and discouraged investment in broadband. Supporters said investment has flowed unhindered, and that rules are needed to keep internet service providers from unfairly exploiting their position as gateways to homes and businesses.

The FCC with its 2015 rules claimed powers that could include regulating rates charged by internet service providers. The agency said it wouldn’t immediately do so, but the prospect helped propel broadband providers’ opposition.

The cable and telephone companies also criticized the breadth of what critics called utility-style regulations, including a portion written to allow the FCC to vet data-handling practices it couldn’t yet envision. Companies supporting Pai’s rollback proposal included AT&T, Verizon Communications Inc. and cable providers led by Comcast and Charter Communications Inc.

Web companies such as Alphabet Inc.’s Google, Facebook Inc. and Amazon.com Inc. wanted to keep the previous regulations. “Having clear, legally sustainable rules in place finally established rules of the road and provided legal certainty,” the Internet Association, a trade group for web companies, said in comments to the FCC. “The commission should maintain its existing net neutrality rules and must not weaken their firm legal basis.”

With its vote the FCC rescinded its 2015 decision to treat internet service providers using a portion of the laws designed to regulate utilities. Much of the debate over net neutrality has revolved around this question of classification: whether Washington regulators can wield the kind of intrusive rulemaking that’s also used, for instance, to tell telephone providers when and where they can stop offering service.

The FCC also abandoned the bulk of its oversight role, saying antitrust authorities and the Federal Trade Commission can monitor for anti-competitive practices. Critics say those agencies don’t have expertise and act only after abuses occur, rather than setting rules that guide behavior.

In addition, the authority of the FTC is under question in a case before federal judges in California, where AT&T is contesting a sanction from the FTC for deceiving smartphone consumers who paid for unlimited data only to have their download speeds cut.

Opponents of Pai’s rules are expected to ask U.S. judges to overturn the ruling and restore the old rules. Issues before the judges will include whether the FCC has adequate grounds to reverse a decision taken less than three years earlier. Judges last year upheld the previous rules.

Congress could write a law to overrule the FCC’s action, but it hasn’t acted, as Democrats dismiss Republican invitations to legislate to permanently weaken the 2015 rules. The Democrats’ “wall of resistance” may weaken in the new year after the partisan fervor heightened by Thursday’s vote has a chance to abate, Cowen & Co. analyst Paul Gallant said in a Nov. 21 note. A bill might restore some basic net neutrality protections and also bar the FCC from regulating rates, Gallant said.

The new rules are to take effect 60 days after being published in the Federal Register that chronicles regulatory activity, the FCC said in its draft order for Thursday’s vote.

Source: TheWHIR

Squarespace Is Said to Raise Funding at $1.7 Billion Valuation

(Bloomberg) — For years, Squarespace Inc. has been a leader in the old-school art of designing websites. Its main rival, Wix.com Ltd., has been public since 2013, but Squarespace remains private.

Now in its teenage years, Squarespace is giving early employees and investors a way to cash out. The New York-based company said General Atlantic LLC, an investment firm and Squarespace backer, is injecting a new round of funding, most of which will go toward buying stock from other investors and employees.

General Atlantic will commit about $200 million to the deal, and the new shares value the business at $1.7 billion, said a person with knowledge of the deal. Squarespace Chief Executive Officer Anthony Casalena declined to comment on terms of the funding.

As many technology companies postpone initial public offerings indefinitely, early backers are getting restless. The Bloomberg U.S. Startups Barometer, an index tracking the private technology industry, shows IPOs and acquisitions are at a three-year low. More startups are arranging deals similar to Squarespace’s to appease shareholders. Uber Technologies Inc. recently offered stockholders the option to sell to a group of investors led by SoftBank Group Corp. at a 30 percent discount to the most recent valuation. At least two high-profile backers have already agreed to participate.

Unlike Uber, Squarespace is profitable. “If anything, we actually would have been interested in buying more,” Anton Levy, managing director and head of internet and technology at General Atlantic, said in an interview. “Even people that were early investors that have made a fabulous return sold a small percentage.”

Revenue in the past year increased 50 percent to about $300 million, Casalena said. Squarespace isn’t far behind Wix, which is expected to generate about $424 million this year. The Squarespace brand has gained recognition among consumers for its ubiquitous podcast ads and a Super Bowl commercial featuring John Malkovich.

But Casalena suggested Squarespace has more to do before a potential IPO. He said the company is focused on helping customers sell products through their sites. This effort puts Squarespace more directly in competition with Bigcommerce Inc., Etsy Inc., Shopify Inc. and, most terrifyingly, Amazon.com Inc. “It’s the most requested feature on the platform right now,” Casalena said. “A lot of people are there building a brand. They want to sell something.”

Its broad customer base, which ranges from wedding photographers to family-run pizza parlors, gives Squarespace a unique group of customers to grow its commerce products into, whereas a company like Shopify is focused on online-only e-commerce sites that sell physical goods, Casalena said.

The funding round is a milestone for the 13-year-old business and its 700 employees, Casalena said. It brings Squarespace into the realm of Buzzfeed Inc. and Reddit Inc., both with similar valuations, according to research firm CB Insights. “We’ve been a little under the radar for a lot of people,” Casalena said.

The next stage of growth is to build into more international markets like France and Germany, he said. Right now, around 30 percent of the company’s business is outside the U.S.

Analysts and bankers have been expecting the company to go public since at least 2016. Casalena declined to comment on IPO plans but said he’s building a company capable of doing so. “We want to do this on our own time table,” he said. “We’re not in a rush.”

Source: TheWHIR

In the rush to big data, we forgot about search

I read David Linthicum’s post “Data integration is the one thing the cloud makes worse” with great interest. This very problem is a huge reason I decided my next job would be at a search company. (That’s why I now work for LucidWorks, which produces Solr- and Spark-based search tools.) While working with clients, I realized that with big data and the cloud, finding things, already a tough problem, was getting worse. I had seen the coming meltdown as the use of Hadoop formed yet another data silo and, as a result, produced few actual insights.

Part of the problem is that the technology industry is trend-driven rather than problem-solving. A few years ago, it was all about client/server under the guise of distributed computing à la Enterprise JavaBeans, followed by web services and then big data. Now it is all about machine learning. Many of these steps were important, and machine learning is an important tool for solving problems.

We lost indexing and search as big data emerged

But sadly, the most important problem-solving trend got lost in the shuffle: indexing and search.

The modern web began with search. The web would be a lot smaller if Yahoo and the search portals of the late 1990s had triumphed. The dot-com bomb happened, and yet Google was born from its ashes. Search also birthed big data and, arguably, the modern machine learning trend. Google, Facebook, and other companies needed better ways to handle their indexing jobs and their large amounts of data at internet scale. Meanwhile, they needed better ways to find and organize data after they ran up against the limits of crowdsourcing and human intelligence.

Amazon.com blew away the retail market in part because it dared to invest in search technology. The main reason I go to Amazon and not other vendors is that I’ll almost certainly find what I’m looking for. In fact, Amazon may suggest what I want before I get around to searching for it. (Though I have to say that Amazon.com’s recommendations are now falling behind the curve.) Yet many retailers still use the built-in search in their commerce suites and then wonder why customer conversion and engagement are off. (Hint: Customers can’t find anything to buy.)

Meanwhile, many companies continue to rely on old-style enterprise search products. Some of these products aren’t even maintained, belonging to dead or acquired companies. Most people still operate with bookmarks. So if you move some of your data to SaaS solutions, some to PaaS solutions, and some to IaaS solutions across multiple vendors’ cloud platforms, all while keeping some of your data behind the firewall, then no one is going to find anything!

How to redefine “integration”

To address what Linthicum raised in his post, we need to redefine “integration” for the distributed and cloud computing era.

Data integration used to mean just that: grabbing all the data and dumping it into a big, fat, single area. First this was with databases, then data warehouses, and then Hadoop. Ironically, we moved further away from indexed technology when doing this.

Now, integration must mean that we can index and find the data where it lives, deduplicate it, and derive a result. To find a single source of truth, we need to capture timestamps and source IDs.
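Deduplicating by timestamp and source ID can be pictured with a minimal sketch. The record layout here (`key`, `source_id`, `timestamp` fields) is an illustrative assumption, not any particular product’s schema:

```python
# Minimal sketch: deduplicate records indexed from multiple sources,
# keeping the most recent copy of each logical record. The field names
# and sample records below are hypothetical.

def deduplicate(records):
    """Keep the newest record per logical key, whichever source it came from."""
    latest = {}
    for rec in records:
        key = rec["key"]
        if key not in latest or rec["timestamp"] > latest[key]["timestamp"]:
            latest[key] = rec
    return list(latest.values())

records = [
    {"key": "cust-42", "source_id": "crm-onprem",   "timestamp": 1700000000, "name": "Acme Corp"},
    {"key": "cust-42", "source_id": "saas-billing", "timestamp": 1700005000, "name": "Acme Corporation"},
    {"key": "cust-7",  "source_id": "crm-onprem",   "timestamp": 1699990000, "name": "Globex"},
]

result = deduplicate(records)  # two records survive; cust-42 comes from saas-billing
```

The timestamp decides which copy wins, and the surviving record’s source ID tells you where the single source of truth currently lives.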

To integrate, we need a single search solution that can reach our on-premises data and our cloud data. The worst thing we can do is deploy a search tool that only searches one source of data, serves only one use case, or can’t be used behind our firewall.

In the cloud era, we need to look at search to be the glue that lets us find the data and analyze it together, no matter where it lives. We can’t just dump everything into one place; we need tools to let us get to exactly the right data where it lives.

Source: InfoWorld Big Data

IDG Contributor Network: The clash of big data and the cloud

Recently, I visited a few conferences and I noticed a somewhat hidden theme. While a lot of attention was being paid to moving to a (hybrid) cloud-based architecture and what you need for that (such as cloud management platforms), a few presentations showed an interesting overall development that everybody acknowledges but that does not get a lot of close attention: the enormous growth of the amount of digital data stored in the world.

What especially caught my attention was a presentation from PureStorage (a storage vendor) that combined data points from two other vendors: first, a June 2017 Cisco white paper, The Zettabyte Era: Trends and Analysis, which extrapolates the growth of internet bandwidth; second, a Seagate-sponsored IDC study, Data Age 2025, which extrapolates the trend of data growth in the world. PureStorage combined both extrapolations in the following figure (reused with permission):

PureStorage’s depiction of the clash between world data growth and world internet bandwidth growth.

These trends—if they become reality, and there are reasons enough to think these predictions to be reasonable—are going to have a major impact on the computing and data landscapes in the years to come. And they will especially impact the cloud hoopla that is still in full force. Note: The cloud is real and will be an important part of future IT landscapes, but simplistic ideas about it being a panacea for every IT ailment are strongly reminiscent of the “new economy” dreams of the dot-com boom. And we know how that ended.

The inescapable issue

Anyway, there are two core elements of all IT: the data and the logic working with/on the data. Big data is not just about the data. Data is useless (or as Uncle Ludwig would have it: meaningless) unless it can be used. What everybody working with big data already knows: To use huge amounts of data, you need to bring the processing to the data and not the data to the processing. Having the processing at any “distance” creates such a transport bottleneck that performance decreases to almost nothing and any function of that logic becomes a purely theoretical affair.

Even with small amounts of data, this already may happen because of latency. For instance, moving your application server to the cloud while retaining your database server on premises may work on paper, but when the application is sensitive to latency between it and the database it doesn’t work at all. And that can already be the case for small amounts of data. This is why many organizations are trying to adapt software so it becomes less latency-sensitive, thus enabling a move into the cloud. But with huge amounts of data, you need to bring processing and data close to each other, else it just does not work. Add the need for massive parallelism to handle that data and you get Hadoop and other architectures that tackle the problem of processing huge amounts of data.
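The latency effect described above is easy to quantify with a toy calculation. The latency and query-time figures are illustrative assumptions, not measurements:

```python
# Toy illustration of why a chatty application suffers when its database
# moves far away: for serial, dependent queries, total time is dominated
# by the per-query round trip. All numbers below are assumed for illustration.

def batch_time_ms(num_queries, round_trip_ms, query_ms=1.0):
    """Total time for a sequence of dependent (serial) queries."""
    return num_queries * (round_trip_ms + query_ms)

same_rack = batch_time_ms(1000, round_trip_ms=0.5)   # DB next to the app server
cross_wan = batch_time_ms(1000, round_trip_ms=40.0)  # app in cloud, DB on premises

print(same_rack, cross_wan)  # 1500.0 vs 41000.0 ms for identical work
```

A 40 ms round trip turns a 1.5-second batch into a 41-second one, which is why making software less chatty (fewer, larger requests) is the usual path to tolerating a cloud move.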

Now, the amount of data in the world is growing exponentially. If IDC is to be believed, in a few years’ time the world is expected to store about 50ZB (zettabytes, or 50,000,000,000,000,000,000,000 bytes). On the other hand, while the total capacity of the internet to move data around grows too, it does so at a far more leisurely pace. In the same period that world data size grows to 50ZB, total internet bandwidth will reach something like 2.5ZB per year (if Cisco is to be believed).

The conclusion from those two (not unreasonable) expectations is that the available internet bandwidth is nowhere near enough to move a sizeable fraction of that data around. And that ignores the fact that about 80 percent of current bandwidth is used for streaming video. So, even if you have coded your way around the latency issues in your core application, for cases with larger amounts of data there will be a bandwidth issue as well.
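The back-of-the-envelope arithmetic behind that conclusion can be checked directly, using the article’s own figures (the 80 percent streaming-video share is applied as stated):

```python
# Back-of-the-envelope check of the bandwidth argument, using the figures
# cited above: ~50 ZB of stored data vs ~2.5 ZB/year of total internet
# transfer capacity, of which ~80% is consumed by streaming video.

stored_data_zb = 50.0          # projected world data (IDC, Data Age 2025)
bandwidth_zb_per_year = 2.5    # projected internet transfer per year (Cisco)
video_share = 0.80             # share of bandwidth used by streaming video

usable_zb_per_year = bandwidth_zb_per_year * (1 - video_share)

years_all_bandwidth = stored_data_zb / bandwidth_zb_per_year  # ~20 years
years_non_video = stored_data_zb / usable_zb_per_year         # ~100 years

print(years_all_bandwidth, years_non_video)
```

Even if the entire internet did nothing but move stored data, shifting it all once would take about 20 years; on the non-video remainder it would take about a century. Hence the argument that the data, for the most part, cannot travel.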

Now, is this issue actually a problem? Not if the processing or use of that data happens locally—that is, in the same datacenter that holds the data. But while on the one hand the amount of data is growing exponentially, the world is also aggressively pursuing cloud strategies; that is, putting all kinds of workloads in the cloud, in the absolute extreme even going “serverless” (for example, AWS Lambda).

Assuming that only small-sized results (calculated from huge data sets) need to move around probably helps only a bit, because the real value of huge amounts of data comes from combining them. And that may mean combining data from different owners (your customer records with a feed from Twitter, for instance). It is the aggregation of all those different sets that is the issue.

So, what we see is two opposing developments. On the one hand, everybody is busy adapting to a cloud-based architecture that in the end is based on distributed processing of distributed data. On the other hand, the amount of data we use is getting so large that we have to consolidate data and its processing in a single physical location.

So, what does that imply?

Well, we may expect that what Hadoop does at the application architecture level will also happen at the world level: the huge data sets will be attractors for the logic that makes them meaningful. And those huge data sets will gravitate together.

Case in point: Many are now scrambling to minimize the need to move that data around. So, in the IoT world there is a lot of talk about edge computing: handling data locally, where the sensors and other IoT devices are. Of course, that also means the processing must be local, and you can safely assume that you will not bring the same level of computing power to bear in a (set of) sensors as you can in big analytics setups. Or: you probably won’t see a Hadoop cluster under the hood of your car anytime soon. So, yes, you can minimize data traffic that way, but at the expense of how much you can compute.

There is another solution for this issue: Stick together in datacenters. And that is also what I see happening.  Colocation providers are on the rise. They offer large datacenters with optimized internal traffic capabilities where both cloud providers and large cloud users are sticking together. Logically, you may be in the cloud, but physically you are on the same premises as your cloud provider. You don’t want to run your logic just on AWS or Azure; you want to do that in a datacenter where you also have your own private data lake so all data is local to the processing and data aggregation is local as well. I’ve written elsewhere (Reverse Cloud | Enterprise Architecture Professional Journal) on the possibility that cloud providers might be extending into your datacenters, but the colocation pattern is another possible solution for solving the inescapable bandwidth and latency issues arising from the exponential growth of data.

The situation may not be as dire as I’m sketching it. For example, maybe the actual average volatility of all that data will ultimately be very low. On the other hand, you would not want to run your analytics on stale data. But one conclusion can be drawn already: Simply assuming that you can distribute your workloads to a host of different cloud providers (the “cloud … yippy!” strategy) is risky, especially if at the same time the amount of data you are working with grows exponentially (which it certainly will, if everyone wants to combine their own data with streams from Twitter, Facebook, etc., let alone if those combinations spawn all sorts of new streams).

Therefore, making good strategic design decisions (also known as “architecture”) about the locations of your data and processing (and what can and can’t be isolated from other data) is key. Strategic design decisions … hmm, that sounds like a job for architecture.

This article is published as part of the IDG Contributor Network. Want to Join?

Source: InfoWorld Big Data

Enter Fortifies Carrier-Neutral Interconnection Capabilities At Its Milan Data Center

Enter has announced enhanced connectivity capabilities at its Milan Caldera data center campus with the addition of its neutral interconnection facility, MIL2.

An expansion of Enter’s existing MIL1 data center, MIL2 is purpose-built to facilitate cost-effective cross-connects to a multitude of carriers and content providers via its Meet-Me Room.

Enter’s MIL1 and MIL2 data centers provide a reliable environment for telco colocation, with redundant power distribution and a generator that provides the facility with at least five days of backup power at full load for business continuity.

Leveraging Enter’s expansive backbone and metro dark fiber network, MIL2 customers can access hundreds of networks within the Caldera campus and seamlessly connect to additional local network providers and remote facilities in the Milan area.  Additionally, nearby landing points in Bari and Palermo enable access to submarine cables Southeast Asia-Middle East-Western Europe 5 (SEA-ME-WE 5) and Asia-Africa-Europe (AAE-1) by way of Enter’s strategic provider partnerships.

“We expanded our neutral interconnection facility to provide customers with cost-effective, reliable interconnection opportunities. Located in one of Italy’s key connectivity and fiber hubs, our Milan Caldera data centers offer a strategic and cost-effective alternative to Frankfurt and Marseille,” says Milko Ilari, Head of International Business & Strategy at Enter.  “In addition to serving as a bridge for operators looking to expand their reach to or within Western Europe, MIL2 is also designed to accommodate unique project requirements as well as facilitate mutually beneficial partnerships amongst our customers.”

Enter’s transparent, partner-centric approach is also evident in its recent Open Compute Project (OCP) hardware deployment for Enter Cloud Suite (ECS) in MIL2.  The OCP was launched by Facebook and provides open source designs for servers, racks and data center facilities, lowering vendor lock-in and increasing community participation in data center design.  By adopting OCP, companies can dramatically reduce CAPEX and OPEX, while driving innovation through the incremental contributions of the open source community.

ECS is the first European, OpenStack-based cloud Infrastructure-as-a-Service (IaaS) solution.  With one connection to Enter, small to mid-size communication service providers can affordably expand their network footprint and reach all of Europe’s leading IXs.  In addition to ECS, Enter also offers Colocation, Ethernet and internet access services, Virtual Private Networks (VPNs), dark fiber, and data center services at its MIL2 data center.

Source: CloudStrategyMag