Green House Data Acquires Cirracore

Green House Data has announced the acquisition of Cirracore, an Atlanta-based provider of enterprise-ready Infrastructure-as-a-Service (IaaS) and hybrid cloud products. The Cirracore customer list includes a strong presence in the Southeast as well as large national and international brands.

“As a high-growth market and innovation hub, Atlanta has been a target expansion market for us,” said Shawn Mills, CEO, Green House Data. “Integrating Cirracore and its management team into Green House Data will allow us to deliver a larger set of products with greater geographic diversity, ultimately to provide higher value for all of our customers, both existing and future.”

Cirracore was founded in 2008 by Fred Tanzella, a veteran technology executive with deep experience in information security and infrastructure technology start-ups. Under his leadership, Cirracore has beaten revenue projections and boasts an exceptionally low customer churn rate. Mr. Tanzella serves on the Board of Directors for the Technology Association of Georgia, and the Executive Advisory Board for the National Association of Telecom Professionals. He will join the Green House Data executive team.

Green House Data will celebrate its 10-year anniversary in July of 2017. The company has expanded from a single facility in Cheyenne, Wyoming, to over 100,000 sq ft of white space and cloud connectivity. With access to over 250 carriers, service providers, and content providers spread across west, central, east, and now Southeast locations, Green House Data customers are uniquely positioned for rapid scale and location-specific workloads. All Green House Data facilities are compliant with standards including HIPAA, SSAE 16 Type II, and PCI-DSS. Committed to sustainability, the company is the nation’s 25th largest green power buyer within the technology and telecom space.¹

“As we entered this next phase, we looked for a strategic acquisition partner,” said Tanzella. “We’ve more than doubled our footprint in the last two years, and it was critical to my team that Cirracore’s model of enterprise-focused, VMware-based, and hyper-growth IaaS be pulled forward in any merger or acquisition scenario.”

Cirracore’s compute and storage infrastructure includes high-speed carrier-redundant network connectivity together with managed carrier-grade security. From Equinix’s AT1 and AT3 facilities in Atlanta, Cirracore can cross-connect customers directly into cloud resources from over 180 network and service providers.

“We’re thrilled to bring yet another location into our portfolio, and have Fred’s leadership and vision added to our team,” said Mills. “It’s an exciting time for both of our companies, and the industry as a whole.”

¹ https://www.epa.gov/greenpower/green-power-partnership-top-30-tech-telecom

Source: CloudStrategyMag

LLVM-powered Pocl puts parallel processing on multiple hardware platforms

LLVM, the open source compiler framework that powers everything from Mozilla’s Rust language to Apple’s Swift, is emerging in yet another powerful role: an enabler of code deployment systems that target multiple classes of hardware for speeding up jobs like machine learning.

To write code that can run on CPUs, GPUs, ASICs, and FPGAs alike—something hugely useful with machine learning apps—it’s best to use something like OpenCL, which allows a program to be written once and then automatically deployed across all those different types of hardware.

Pocl, an implementation of OpenCL that was recently revamped to version 0.14, is using the LLVM compiler framework to do that kind of targeting. With Pocl, OpenCL code can be automatically deployed to any hardware platform with LLVM back-end support.

Pocl uses LLVM’s own Clang front end to take in C code that uses the OpenCL standard. Version 0.14 works with both LLVM 3.9 and the recently released LLVM 4.0. It also offers a new binary format for OpenCL executables, so they can be run on hosts that don’t have a compiler available.

Aside from being able to target multiple processor architectures and hardware types automatically, another reason Pocl uses LLVM is that it aims to “[improve] performance portability of OpenCL programs with the kernel compiler and the task runtime, reducing the need for target-dependent manual optimizations,” according to the release notes for version 0.14.

There are other projects that automatically generate OpenCL code tailored to multiple hardware targets. The Lift project, written in Java, is one such code generation system. Lift generates a specially tailored IL (intermediate language) that allows OpenCL abstractions to be readily mapped to the behavior of the target hardware. LLVM itself works this way: it generates an IL from source code, which is then compiled for a given hardware platform. Another such project, Futhark, generates GPU-specific code.

LLVM is also being used as a code-generating system for other aspects of machine learning. The Weld project generates LLVM-deployed code that is designed to speed up the various phases of a data analysis framework. Code spends less time shuttling data back and forth between components in the framework and more time doing actual data processing.

The development of new kinds of hardware targets is likely to continue driving the need for code generation systems that can target multiple hardware types. Google’s Tensor Processing Unit, for instance, is a custom ASIC devoted to speeding one particular phase of a machine learning job. If hardware types continue to proliferate and become more specialized, having code for them generated automatically will save time and labor.

Source: InfoWorld Big Data

Hyperscale Operators Continue Ramping Up Share Of Cloud Markets

New data from Synergy Research Group shows that hyperscale operators are aggressively growing their share of key cloud service markets, which are themselves growing at impressive rates. Synergy’s new research has identified 24 companies that meet its definition of hyperscale, and in 2016 those companies in aggregate accounted for 68% of the cloud infrastructure services market (IaaS, PaaS, private hosted cloud services) and 59% of the SaaS market. In 2012 those hyperscale operators accounted for just 47% of each of those markets. Hyperscale operators typically have hundreds of thousands of servers in their data center networks, while the largest, such as Amazon and Google, have millions of servers.

In aggregate those 24 hyperscale operators now have almost 320 large data centers in their networks, with many of them having substantial infrastructure in multiple countries. The companies with the broadest data center footprint are the leading cloud providers — Amazon, Microsoft, and IBM. Each has 45 or more data center locations with at least two in each of the four regions (North America, APAC, EMEA, and Latin America). The scale of infrastructure investment required to be a leading player in cloud services or cloud-enabled services means that few companies are able to keep pace with the hyperscale operators, and they continue to both increase their share of service markets and account for an ever-larger portion of spend on data center infrastructure equipment — servers, storage, networking, network security, and associated software.

“Hyperscale operators are now dominating the IT landscape in so many different ways,” said John Dinsdale, a chief analyst and research director at Synergy Research Group. “They are reshaping the services market, radically changing IT spending patterns within enterprises, and causing major disruptions among infrastructure technology vendors. Our latest forecasts show these factors being accentuated over the next five years.”

Source: CloudStrategyMag

ViaWest Adds Cloud Node To Colorado Data Center

Bringing clients closer to their customers, ViaWest launched a new cloud node in the Colorado region, enabling Rocky Mountain users to access applications more efficiently.

“The speed that applications can be accessed can have a very real impact on the bottom line,” said Michele Corvino, director of product management for ViaWest. “This cloud node enables our clients to place their applications closer to their users in the Rocky Mountain region so they can deliver service excellence to their internal and external customers.”

Hosted from the ViaWest Compark Data Center, the cloud node brings lower latency, premium connectivity and advanced hardware for clients serving the Rocky Mountain region. Cloud solutions include private, public and compliant clouds, disaster recovery, and managed services for public hyperscale cloud providers, such as Amazon Web Services, Microsoft Azure, and Google Cloud.

“This cloud node enables our clients to achieve speed, scale and efficiency and is connected to our network of cloud nodes and data centers located across the country by our high speed national backbone,” said Gunnar Stinnett, vice president of sales for the Rocky Mountain Region. “This investment enhances our ongoing commitment to serving clients in our home state of Colorado.”

ViaWest operates and manages five data centers in Colorado, offering a combined raised floor capacity of over 278,000 sq ft. The Compark Data Center opened in Centennial in 2014 and is among the most innovative and efficient in the data center industry, with a Power Usage Effectiveness of 1.3 and power density capabilities of more than 1,500 watts per sq ft.

Source: CloudStrategyMag

Gartner Says Worldwide IT Spending Forecast To Grow 1.4% In 2017

Worldwide IT spending is projected to total $3.5 trillion in 2017, a 1.4% increase from 2016, according to Gartner, Inc. This growth rate is down from the previous quarter’s forecast of 2.7%, due in part to the rising U.S. dollar (see Table 1).

“The strong U.S. dollar has cut $67 billion out of our 2017 IT spending forecast,” said John-David Lovelock, research vice president at Gartner. “We expect these currency headwinds to be a drag on earnings of U.S.-based multinational IT vendors through 2017.”

The Gartner Worldwide IT Spending Forecast is the leading indicator of major technology trends across the hardware, software, IT services, and telecom markets. For more than a decade, global IT and business executives have been using these highly anticipated quarterly reports to recognize market opportunities and challenges, and base their critical business decisions on proven methodologies rather than guesswork.

The data center system segment is expected to grow 0.3% in 2017. While this is up from negative growth in 2016, the segment is experiencing a slowdown in the server market. “We are seeing a shift in who is buying servers and who they are buying them from,” said Lovelock. “Enterprises are moving away from buying servers from the traditional vendors and instead renting server power in the cloud from companies such as Amazon, Google, and Microsoft. This has created a reduction in spending on servers which is impacting the overall data center system segment.”

Driven by strength in mobile phone sales and smaller improvements in sales of printers, PCs, and tablets, worldwide spending on devices (PCs, tablets, ultramobiles, and mobile phones) is projected to grow 1.7% in 2017, to reach $645 billion. This is up from a 2.6% decline in 2016. Mobile phone growth in 2017 will be driven by increased average selling prices (ASPs) for phones in emerging Asia/Pacific and China, together with iPhone replacements and the 10th anniversary of the iPhone. The tablet market continues to decline significantly, as replacement cycles remain extended and sales of desktop PCs and laptops stay negative throughout the forecast. Through 2017, business Windows 10 upgrades should provide underlying growth, although increased component costs will push PC prices up.

The worldwide IT services market is forecast to grow 2.3% in 2017, down from 3.6% growth in 2016. The modest changes to the IT services forecast this quarter can be characterized as adjustments to particular geographies as a result of potential changes of direction anticipated regarding U.S. policy — both foreign and domestic. The business-friendly policies of the new U.S. administration are expected to have a slightly positive impact on the U.S. implementation service market as the U.S. government is expected to significantly increase its infrastructure spending during the next few years.

Gartner’s IT spending forecast methodology relies heavily on rigorous analysis of sales by thousands of vendors across the entire range of IT products and services. Gartner uses primary research techniques, complemented by secondary research sources, to build a comprehensive database of market size data on which to base its forecast.

Source: CloudStrategyMag

Unitas Global Collaborates With Equinix And Microsoft Azure™

Unitas Global has announced a strategic partnership with Equinix and Microsoft Azure™ to deliver an end-to-end hybrid cloud solution that enables enterprises to consume public and private cloud services with full transparency and ease. Combining Unitas Global’s expertise in designing, deploying, and managing hybrid cloud solutions, Equinix’s data center and interconnection services, and Microsoft’s Azure public cloud, the enterprise solution delivers application enablement and multi-cloud orchestration capabilities, as well as extensive global reach enabling connectivity into any market.

In a recent survey, RightScale found hybrid cloud to be the preferred enterprise strategy in 2017: among the enterprises surveyed that have a strategy to use multiple clouds, the majority plan to go hybrid this year. The partnership between Unitas Global, Equinix, and Microsoft Azure eliminates the common challenges enterprises face in designing, deploying, and managing hybrid cloud solutions, including security, connectivity, scale, and management of multiple cloud and infrastructure providers, simplifying the process for enterprise customers.

“The partnership between Unitas, Equinix, and Microsoft Azure creates the optimal trifecta for a successful enterprise hybrid cloud solution: global reach, multi-cloud orchestration, and end-to-end management and provisioning,” says Patrick Shutt, CEO, Unitas Global.  “Unitas works hand-in-hand with customers to design secure, easy-to-consume hybrid cloud solutions that are fully managed and monitored 24x7x365.”

The enterprise hybrid cloud solution leverages Equinix’s global footprint of 150 International Business Exchange™ (IBX®) data centers and interconnection capabilities to enable end users to deploy performance-driven, mission-critical applications in any market around the world.

“As a longtime partner of Unitas Global and one of the first data center providers to offer direct and private interconnection to Microsoft Azure via Microsoft ExpressRoute on Equinix Cloud Exchange™, Equinix is an ideal partner for end-to-end hybrid cloud solutions,” says Greg Adgate, vice president of global technology partners and alliances.  “Our real-time interconnection capabilities and global footprint of best-in-class data centers spanning 21 countries and 41 business metros enable enterprise customers to accelerate their hybrid cloud strategies on a global scale.”

Source: CloudStrategyMag

451 Releases Cloud Transformation Study

According to recent results of 451’s Voice of the Enterprise: Cloud Transformation study, 80% of organizations polled report that their IT environments require moderate or significant transformation to meet business requirements over the next five years, involving migrating workloads to the cloud and keeping cloud spending on an upward trend.

The latest survey finds that 22% of enterprises have adopted a ‘cloud first’ approach and infrastructure as a service (IaaS)/public cloud is the fastest-growing cloud model. With organizations increasingly choosing IaaS for mission-critical applications, service provider selection is a critical part of digital transformation. However, the respondents indicate concern about service providers’ perceived ability to align IT and business requirements, noting that there is scope for a better understanding of the customers’ businesses to support IT service delivery.

“As organizations implement IT transformation in earnest, they are increasingly relying on strategic partners for operational assistance. Those IaaS service providers who position infrastructure and technological innovation alongside meeting business requirements will be best positioned to capitalize on this market opportunity,” said Melanie Posey, research vice president and lead analyst for 451’s Voice of the Enterprise: Cloud Transformation service.

Survey respondents rated their IaaS providers on a range of attributes prior to purchase (promise) and after implementation (fulfillment). IaaS providers received consistently high ratings on both the promise and fulfillment of service attributes such as uptime/performance, security and technical expertise. Although these are table stakes requirements, they are also make-or-break factors that can adversely affect brand reputation. 

However, highlighting the importance of the overall customer experience, 451 Research’s survey found that organizations gave IaaS providers lower ratings on service-delivery factors such as understanding business requirements, multi-cloud/hybrid cloud support and enterprise-level customer support.

451 Research’s IaaS Vendor Window underscores AWS’s dominant market position: AWS outpaced competitors on multiple promise and fulfillment rating attributes, most notably breadth of services/features, brand/reputation, technical expertise, and innovation. Of respondents surveyed, 55.8% are using AWS for IaaS.

However, for the first time, AWS’s customer ratings fell behind those of other IaaS providers on ‘value for money/cost,’ where Google Cloud Platform was rated highest, and ‘understands my business,’ where IBM/SoftLayer and Microsoft Azure obtained higher scores. 

Many organizations use multiple IaaS providers and Microsoft Azure is emerging as a formidable challenger, closing the market adoption gap with AWS. While AWS remains the top pick as respondents’ most important IaaS provider (39%), nearly 35% of respondents named Microsoft their most important IaaS provider, up from 20.2% in the previous survey. 

Microsoft is also closing the customer perceptions gap with AWS, posting slightly above-average scores for overall promise and fulfillment, as does Google Cloud Platform, which secured particularly strong marks for service reliability and value for money/cost. Google and Microsoft are seeing higher overall adoption rates since the previous survey in Q1 2015.

Commenting on Microsoft’s position, Posey added, “It will be interesting to assess the impact of Azure Stack (scheduled for launch by mid-2017) on Microsoft’s overall positioning and individual attribute ratings for multi-cloud/hybrid support, as well as technical expertise and innovation.”

Among the European respondents surveyed, Microsoft emerged as the predominant primary IaaS provider, with 43.7% of European respondents citing it vs. 32% naming AWS. Microsoft’s efforts to address European customers’ data sovereignty concerns no doubt contributed to its elevated positioning in the region. 

Source: CloudStrategyMag

DXC Technology To Extend Microsoft Azure Services

DXC Technology has announced that it will extend its current portfolio of offerings for Microsoft Azure with a hybrid cloud solution from its data centers based on the Microsoft Azure Stack. This new offering, DXC Services for Azure, provides business clients with a consistent and flexible platform for enterprise workloads across both private infrastructure and the Azure public cloud.

DXC and Microsoft will enable a common experience and consistent service management of Azure services across private DXC data centers and Microsoft public cloud. DXC Services for Azure will create a significantly more efficient and simplified path for clients to realize the benefits of public and hybrid cloud, and will ensure streamlined access to Azure services across both the Microsoft public cloud and DXC private data centers, complemented by a comprehensive set of managed services. DXC will train and certify thousands of consultants and developers on Azure in support of this initiative.

As traditional hosting and outsourcing services give way to the new hybrid cloud operating model, businesses need a cloud platform with a common operating model that can be consistently delivered across all deployment architectures, from traditional infrastructure to public cloud. To unlock value and give clients a full set of solutions for their digital transformation needs, DXC and Microsoft will offer a comprehensive portfolio of Microsoft solutions, including Azure, Secure Productive Enterprise, Windows 10, Office 365, and Dynamics 365.

Today’s announcement strengthens a strategic partnership that builds on more than 50 years of combined Microsoft relationships through CSC and HPE Enterprise Services. Together, DXC and Microsoft will co-invest in DXC Services for Azure and work with DXC clients to accelerate their transition to public and hybrid cloud.

“DXC is building on CSC’s and HPE’s strategic partnership with Microsoft to accelerate our clients’ adoption of cloud services,” said Mike Lawrie, DXC Technology chairman, president, and CEO. “The result is DXC Services for Azure, a hybrid cloud offering which extends across Azure’s hyper-scale public cloud and private cloud. It facilitates real benefits, such as allowing clients to choose the right infrastructure landing zone for their workloads, all operated in a uniformly managed and consistent cloud platform.”

DXC will offer services based on Microsoft Azure Stack when the solution is released later this year. Because DXC Services for Azure offers true consistency across Microsoft Azure environments, it will be especially attractive to organizations that want a uniform programming and deployment model that effectively addresses regulatory requirements, data residency concerns, latency, and custom functionality demands. A consistent cloud platform means that applications, processes and skills can be equally applied without cumbersome translation between environments — paving the way for true acceleration of the cloud IT revolution.

“At Microsoft, our mission is to empower every person and every organization on the planet to achieve more,” said Satya Nadella, CEO of Microsoft. “Pursuing this mission means forging strong partnerships with leading organizations like DXC. We look forward to helping accelerate our mutual customers’ digital transformations and empowering them to seize all of the opportunities ahead.”

DXC-Microsoft Strategic Alliance

The DXC Technology and Microsoft global strategic partnership offers clients the integrated and innovative solutions they need to succeed in a cloud-first world. Informed by its 30-plus year strategic relationship with Microsoft and recognized internationally as a key Microsoft alliance, DXC builds on Microsoft’s mission to empower every person and organization on the planet to do more and achieve more. Together, DXC and Microsoft help organizations worldwide transform their IT and business processes to drive success orchestrated by the cloud.

Source: CloudStrategyMag

21 hot programming trends—and 21 going cold

Programmers love to sneer at the world of fashion where trends blow through like breezes. Skirt lengths rise and fall, pigments come and go, ties get fatter, then thinner. But in the world of technology, rigor, science, math, and precision rule over fad.

That’s not to say programming is a profession devoid of trends. The difference is that programming trends are driven by greater efficiency, increased customization, and ease of use. The new technologies that deliver one or more of these eclipse the previous generation. It’s a meritocracy, not a whimsy-ocracy.

What follows is a list of what’s hot and what’s not among today’s programmers. Not everyone will agree with what’s A-listed, what’s D-listed, and what’s been left out. That’s what makes programming an endlessly fascinating profession: rapid change, passionate debate, sudden comebacks.

Hot: Preprocessors

Not: Full language stacks

It wasn’t long ago that people who created a new programming language had to build everything that turned code into the bits fed to the silicon. Then someone figured out they could piggyback on the work that came before. Now people with a clever idea simply write a preprocessor that translates the new code into something old with a rich set of libraries and APIs.

Scripting languages like Python and JavaScript were once limited to little projects, but now they’re the foundation for serious work. And those who didn’t like JavaScript created CoffeeScript, a preprocessor that lets them code, again, without the onerous punctuation. There are dozens of variations, each preslicing and predicing the syntax in a different way.

The folks who loved dynamic typing created Groovy, a simpler version of Java without the overly insistent punctuation. There seem to be dozens of languages like Scala or Clojure that run on the JVM, but there’s only one JVM. You can run many languages on .Net’s VM. Why reinvent the wheel?
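At its core, a preprocessor is just a source-to-source translator. The toy sketch below (the `fn` shorthand and the `preprocess` function are made up for illustration, not any real tool) shows the shape of the idea: rewrite a terser syntax into ordinary JavaScript that any engine can run.

```javascript
// A toy source-to-source translator: rewrites a made-up
// "fn name(args) -> expression" shorthand into plain JavaScript.
// Real preprocessors like CoffeeScript parse a full syntax tree;
// a regex is enough to demonstrate the principle.
function preprocess(source) {
  return source.replace(
    /fn\s+(\w+)\((.*?)\)\s*->\s*(.+)/g,
    'function $1($2) { return $3; }'
  );
}

const translated = preprocess('fn double(x) -> x * 2');
// translated: "function double(x) { return x * 2; }"
```

The payoff is the same one the article describes: the new syntax gets the entire existing JavaScript ecosystem of libraries and runtimes for free.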

Hot: Docker

Not: Hypervisors

This isn’t exactly true. The hypervisors have their place, and many Docker containers run inside of operating systems running on top of hypervisors. However, Docker containers are soooo much smaller than virtual machine images, and that makes them much easier to use and deploy.

When developers can, they prefer to ship only Docker containers, thanks to the ease with which they can be juggled during deployment. Clever companies such as Joyent are figuring out how to squeeze even more fat out of the stack so that the containers can run, as they like to say, on “bare metal.”

Hot: JavaScript MV* frameworks

Not: JavaScript files

Long ago, everyone learned to write JavaScript to pop up an alert box or check to see that the email address in the form contained an @ sign. Now HTML AJAX apps are so sophisticated that few people start from scratch. It’s simpler to adopt an elaborate framework and write a bit of glue code to implement your business logic.

There are now dozens of frameworks like Kendo, Sencha, jQuery Mobile, AngularJS, Ember, Backbone, Meteor JS, and many more, all ready to handle the events and content for your web apps and pages.

Those are merely the web apps. A number of frameworks are also tuned to cross-platform development for the smartphone/tablet world. Technologies like NativeScript, PhoneGap, and Sencha Touch are a few of the options for creating apps out of HTML5 technology.
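The glue-code pattern these frameworks formalize can be sketched without any framework at all. The names below are illustrative, not any particular library’s API: a model holds the state and notifies listeners, and a view re-renders when the model changes.

```javascript
// A framework-free sketch of the model/view split that MV*
// frameworks such as Backbone formalize: the model owns state
// and notifies listeners; the view re-renders on every change.
class Model {
  constructor(data) { this.data = data; this.listeners = []; }
  set(key, value) {
    this.data[key] = value;
    this.listeners.forEach(fn => fn(this.data));
  }
  onChange(fn) { this.listeners.push(fn); }
}

// The "view" here just builds a string; in a browser it would
// write into the DOM instead.
function render(data) {
  return `<h1>${data.title}</h1><p>${data.body}</p>`;
}

const post = new Model({ title: 'Hello', body: 'World' });
let html = render(post.data);
post.onChange(data => { html = render(data); });
post.set('title', 'Updated');
// html now reflects the new title with no manual re-wiring
```

Your business logic lives in the model and the template; the framework handles the event plumbing in between.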

Hot: CSS frameworks

Not: Generic Cascading Style Sheets

Once upon a time, adding a bit of pizzazz to a web page meant opening the CSS file and including a new command like font-style:italic. Then you saved the file and went to lunch after a hard morning’s work. Now web pages are so sophisticated that it’s impossible to fill a file with such simple commands. One tweak to a color and everything goes out of whack. It’s like what they say about conspiracies and ecologies: Everything is interconnected.

That’s where CSS frameworks like Sass and its cousin Compass have found solid footing. They encourage literate, stable coding by offering programming constructs such as real variables, nesting blocks, and mix-ins. It may not sound like much newness in the programming layer, but it’s a big leap forward for the design layer.

Hot: Video tags

Not: Static tags

Once upon a time, video was something you watched on YouTube or Vimeo. It was a separate thing that lived on its own in a dedicated page. That’s changing as more and more websites use video as building blocks like static GIFs or JPGs. All of a sudden, the screen starts to move as the people or dogs come alive.

Designers are discovering that the modern video tag is simply another rectangle, albeit a rectangle that often needs a bit more JavaScript code from the programmer to control it. We’re only beginning to understand that video isn’t the main course for that box in front of the living room couch, but a decorating option everywhere.

Hot: Almost big data (analysis without Hadoop)

Not: Big data (with Hadoop)

Everyone likes to feel like the Big Man on Campus, and if they aren’t, they’re looking for a campus of the appropriate size where they can stand out. It’s no surprise then that when the words “big data” started flowing through the executive suite, the suits started asking for the biggest, most powerful big data systems as if they were purchasing a yacht or a skyscraper.

The funny thing is many problems aren’t big enough to use the fanciest big data solutions. Sure, companies like Google or Yahoo track all of our web browsing; they have data files measured in petabytes or yottabytes. But most companies have data sets that can easily fit in the RAM of a basic PC. I’m writing this on a PC with 16GB of RAM—enough for a billion events with a handful of bytes. In most algorithms, the data doesn’t need to be read into memory because streaming it from an SSD is fine.

There will be instances that demand the fast response times of dozens of machines in a Hadoop cloud running in parallel, but many will do fine plugging along on a single machine without the hassles of coordination or communication.
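The single-machine point is easy to make concrete. In this sketch (a generator standing in for event lines streamed off an SSD), a running aggregate over a million events needs only one pass and constant memory, with no cluster in sight:

```javascript
// Sketch of the "almost big data" argument: a running aggregate
// over a stream of events needs one machine and constant memory.
// The generator here stands in for lines streamed from disk.
function* events(n) {
  for (let i = 0; i < n; i++) {
    yield { user: i % 100, bytes: (i * 37) % 500 };
  }
}

// One pass: total bytes per user, without ever holding the
// full event log in memory at once.
function totalsByUser(stream) {
  const totals = new Map();
  for (const e of stream) {
    totals.set(e.user, (totals.get(e.user) || 0) + e.bytes);
  }
  return totals;
}

const totals = totalsByUser(events(1000000));
// totals holds one running sum per user, not a million events
```

Only when a single pass on a single box is too slow does the coordination overhead of a Hadoop-style cluster start to pay for itself.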

Hot: Spark

Not: Hadoop

It’s not so much that Hadoop is cooling off. It’s more that Spark is red hot, making the Hadoop model look a bit old. Spark borrows some of the best ideas of Hadoop’s approach to extracting meaning from large volumes of data and updates them with a few solid improvements that make the code run much, much faster. The biggest may be the way that Spark keeps data in fast memory instead of requiring everything be written to the distributed file system.

Of course many people are merging the two by using Spark’s processing speed on data stored in Hadoop’s distributed file system. They’re more partners than competitors.

Hot: Artificial intelligence/machine learning

Not: Big data

No one knows what the phrase “artificial intelligence” means, and that helps the marketers, especially since the term “big data” has run its course. They’re grabbing terms from artificial intelligence and upgrading the sophistication of the big, number-crunching algorithms that plow through our log files and clickstreams. By borrowing the more sophisticated algorithms from the 50-odd years of AI research, we stand a better chance than ever of finding that signal in the noise. Tools run the gamut from machine learning frameworks to cognitive computing, all the way up to IBM’s Watson, which you can now ping to solve your problems. Each offers its own level of machine intelligence and, with this, the promise of taking over more of the data analysis and forensics for us.
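None of this requires Watson to demonstrate. A toy 1-nearest-neighbor classifier (names illustrative, data invented) is about the simplest machine learning algorithm there is, and it already shows the shape of "finding signal in the noise": label a new point by the labeled example it sits closest to.

```javascript
// A toy 1-nearest-neighbor classifier over 2-D points: label a
// query by its closest training example (squared distance is
// enough, since we only compare distances).
function nearestNeighbor(training, point) {
  let best = null;
  let bestDist = Infinity;
  for (const { x, y, label } of training) {
    const d = (x - point.x) ** 2 + (y - point.y) ** 2;
    if (d < bestDist) { bestDist = d; best = label; }
  }
  return best;
}

const training = [
  { x: 0, y: 0, label: 'noise' },
  { x: 1, y: 0, label: 'noise' },
  { x: 10, y: 10, label: 'signal' },
  { x: 11, y: 10, label: 'signal' },
];

nearestNeighbor(training, { x: 9, y: 9 });  // → 'signal'
```

The frameworks the article mentions scale this same idea up, with better algorithms and far bigger training sets doing the heavy lifting.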

Hot: Robotics

Not: Virtual things

Was it only a few minutes ago that we were all going to be living in virtual reality where everything was drawn on our retinas by some video card? It still might happen, but in the meantime the world of robotics is exploding. Every school has a robotics team, and every corner of the house is now open to a robotics invasion. The robot vacuum cleaners are old news and the drones are taking off.

That means programmers need to start thinking about how to write code to control the new machines. For the time being, that often means writing scripts for lightweight controllers like the Raspberry Pi, but that’s bound to change as the libraries grow more sophisticated. Many roboticists, for instance, like hacking on OpenCV, a machine vision library written in C and C++. All of this means new rules, new libraries, new protocols, and plenty of other new topics to think about.

Hot: Single-page web apps

Not: Websites

Remember when URLs pointed to web pages filled with static text and images? How simple and quaint to put all information in a network of separate web pages called a website. The design team would spend hours haggling over the site map and trying to make it easy enough to navigate.

New web apps are front ends to large databases filled with content. When the web app wants information, it pulls it from the database and pours it into the local mold. There’s no need to mark up the data with all the web extras needed to build a web page. The data layer is completely separate from the presentation and formatting layer. The rise of mobile computing is another factor here: a single, responsively designed web page that works like an app helps sites avoid the turmoil of the app stores.
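The split between the data layer and the presentation layer can be sketched in a few lines of JavaScript. Everything here is hypothetical — `fetchArticle` stands in for a real API call, and the template is deliberately trivial — but it shows the shape: plain data in, formatted HTML out.

```javascript
// Data layer: returns plain JSON with no markup attached.
// fetchArticle is a hypothetical stand-in for a real network call,
// e.g. fetch(`/api/articles/${id}`).then(r => r.json()).
function fetchArticle(id) {
  const fakeDatabase = {
    1: { title: "Hot: Spark", body: "Spark keeps data in fast memory." }
  };
  return Promise.resolve(fakeDatabase[id]);
}

// Presentation layer: pours the data into the local HTML mold.
// It knows nothing about where the data came from.
function renderArticle(article) {
  return `<article><h1>${article.title}</h1><p>${article.body}</p></article>`;
}

fetchArticle(1).then(article => console.log(renderArticle(article)));
```

Swapping the mold, say for a mobile layout, never touches the data layer, and moving to a new storage backend never touches the template.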

Hot: Mobile web apps

Not: Native mobile apps

Let’s say you have a great idea for mobile content. You could rush off and write separate versions for iOS, Android, Windows 8, and maybe even BlackBerry OS or one of the others. Each requires a separate team speaking a different programming language. Then each platform’s app store exacts its pound of flesh before the app can be delivered to the users.

Or you could build one HTML app and put it on a website to run on all the platforms. If there’s a change, you don’t need to return to the app store, begging for a quick review of a bug fix. Now that the HTML layer is getting faster and running on faster chips, this approach can compete with native apps even on more complicated, interactive projects.

Hot: Android

Not: iOS

Was it only a few years ago that lines snaked out of Apple’s store? Times change. While the iPhone and iPad continue to have dedicated fans who love their rich, sophisticated UI, the raw sales numbers continue to favor Android. Some reports even say that more than 80 percent of phones sold were Androids.

The reason may be as simple as cost. While iOS devices still cost a pretty penny, the Android world is flooded with plenty of competition that’s producing tablets for as low as one-fifth the price. Saving money is always a temptation.

But another factor may be the effect of open source. Anyone can compete in the marketplace—and they do. There are big Android tablets and little ones. There are Android cameras and even Android refrigerators. No one has to say, “Mother, may I?” to Google to innovate. If they have an idea, they can pursue it.

Apple, though, is learning from Android. The iPhone 6 comes with different screen sizes, and what do you know? The lines are starting to reappear.

Hot: GPU

Not: CPU

When software was simple and the instructions were arranged in a nice line, the CPU was king of the computer because it did all of the heavy lifting. Now that video games are filled with extensive graphical routines that can run in parallel, the video card runs the show. It’s easy to spend $500, $600, or more on a fancy video card, and some serious gamers use more than one. That’s more than double the price of many basic desktops. Gamers aren’t the only ones bragging about their GPU cards. Computer scientists are now converting many parallel applications to run hundreds of times faster on the GPU.

Hot: GitHub

Not: Résumés

Sure, you could learn about a candidate by reading a puffed-up list of accomplishments that includes vice president of the junior high chess club. But reading someone’s actual code is so much richer and more instructive. Do they write good comments? Do they waste too much time breaking items into tiny classes that do little? Is there a real architecture with room for expansion? All these questions can be answered by a glimpse at their code.

This is why participating in open source projects is becoming more and more important for finding a job. Sharing the code from a proprietary project is hard, but open source code can go everywhere.

Hot: Renting

Not: Buying

When Amazon rolled out its sales for computers and other electronics on Black Friday, the company forgot to include hypeworthy deals for its cloud. Give it time. Not so long ago, companies opened their own datacenters and hired their own staff to run the computers they purchased outright. Now they rent the computers, the datacenter, the staff, and even the software by the hour. No one wants the hassles of owning anything. It’s all a good idea, at least until the website goes viral and you realize you’re paying for everything by the click. If Amazon ever finds a way to deliver the cloud with its drones, the trends will converge.

Hot: Cloud complexity

Not: Cloud simplicity

The early days of cloud computing saw vendors emphasizing how easy it was to click a button and get a running machine. Simplicity was king.

Now choosing the right machine and figuring out the right discount program can take more time than writing the code. There are dozens of machine profiles available, and most cloud providers still support some of the older models. All offer different levels of performance, so you’d better be ready to benchmark them to decide which is the most cost-effective for you. Is it worth saving 12 cents per hour to get by with less RAM? It could be if you’re spinning up 100 machines for months at a time.
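The 12-cent question is easy to check with back-of-the-envelope arithmetic. The numbers below are illustrative, not any provider’s actual prices:

```javascript
// Small per-hour savings compound quickly across a fleet.
// Working in cents keeps the arithmetic exact in floating point.
const savingsCentsPerHour = 12;  // hypothetical saving per machine-hour
const machines = 100;
const hoursPerMonth = 24 * 30;   // ~720 hours

const monthlySavingsDollars =
  (savingsCentsPerHour * machines * hoursPerMonth) / 100;
console.log(`$${monthlySavingsDollars} per month`); // prints "$8640 per month"
```

At $8,640 a month, the smaller-RAM profile is worth benchmarking carefully.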

To make matters more complex, the cloud companies offer several options for getting discounts by paying in advance or buying in bulk. You have to put those in the spreadsheet too. It’s almost enough to justify an online course on cloud cost engineering.

Hot: Data movement experts

Not: Backup tapes

When data was small, we didn’t have to think about moving it. We could back it up to a tape or maybe install a RAID hard drive. Now data is so big that it’s not so easy to assume it is wherever we need it. This is becoming increasingly important because more services take place somewhere off in the cloud, not in the rack where the RAID array sits.

Consider Amazon’s new Snowmobile, a cute name for a shipping container filled with hard disks that can hold 100 petabytes of data. Amazon also makes a smaller box called the Snowball that holds 80TB. Both move data as a physical object rather than a signal over fiber, and at that size, physical shipping scales better. One estimate suggests that it would take 28 years to move the 100 petabytes down a 1Gbps fiber line, while a tractor trailer could move the container across the country in a few days.
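The 28-year figure is easy to sanity-check. Taking the petabytes as binary units (2^50 bytes each), a rough sketch:

```javascript
// How long would 100 PB take over a 1Gbps line?
const bytes = 100 * 2 ** 50;            // 100 binary petabytes
const bits = bytes * 8;
const lineRateBitsPerSec = 1e9;         // 1 Gbps
const seconds = bits / lineRateBitsPerSec;
const years = seconds / (365 * 24 * 3600);
console.log(years.toFixed(1));          // ≈ 28.6 years
```

A truck crossing the country in a few days works out to hundreds of gigabits per second of effective bandwidth, which is why the container wins.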

All of this means that developers should start thinking about where data is collected and where it needs to be. We’re gathering much more data than before, and moving it to the right location is more important than ever. As Wayne Gretzky said, his success depended on planning ahead and skating where the puck was going to be, not where it happened to be right now.

Hot: Audio

Not: Websites

Websites aren’t really dying; it’s just that the new audio interfaces are booming. Amazon, Google, and Apple are pushing everyone to speak their questions instead of getting up, walking over to the computer, and flexing those fingers.

This means a bit more work for programmers, because all these mechanisms have new APIs, like Alexa’s new one for controlling light switches. If your company wants to connect with these audio interfaces, you’d better start hacking. Keyboards and URLs were invented in the last century, after all.

Hot: Node.js

Not: JavaEE, Ruby on Rails

The server world has always thrived on the threaded model that let the operating system indulge any wayward, inefficient, or dissolute behavior by programmers. Whatever foolish loop or wasteful computation programmers coded, the OS would balance performance by switching between the threads.

Then Node.js came along with the JavaScript callback model of programming, and the code ran really fast—faster than anyone expected was possible from a toy language once used only for alert boxes. Suddenly the overhead of creating new threads became obvious and Node.js took off. Problems arise when programmers don’t behave well, but the responsibility has largely been good for them. Making resource constraints obvious to programmers usually produces faster code.
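The callback model is easy to sketch. `simulateQuery` below is a hypothetical stand-in for any asynchronous I/O call (a database query, a file read); the point is the ordering — the event loop is free for the next request before the result arrives.

```javascript
// Instead of blocking a thread while I/O completes, Node registers a
// callback and moves on. simulateQuery fakes a slow query with setTimeout.
function simulateQuery(sql, callback) {
  setTimeout(() => callback(null, { rows: [sql.length] }), 10);
}

console.log("request received");
simulateQuery("SELECT 1", (err, result) => {
  if (err) throw err;
  console.log("query done:", result.rows);
});
console.log("event loop free for the next request"); // logs before the callback fires
```

One thread, no context switches — but any long-running synchronous loop in a callback stalls every other request, which is the "programmers must behave well" trade-off.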

The Node.js world also benefits from offering harmony between browser and server. The same code runs on both, so it’s easier for developers to move around features and duplicate functionality. As a result, Node.js layers have become the hottest stacks on the internet.

Hot: PHP 7.0

Not: Old PHP

In the past, PHP was a simple way to knock out a few dynamic web pages. If you needed a bit of variety, you could embed simple code between HTML tags. It was basic enough for web developers to embrace it, but slow enough to draw sneers from hard-core programmers.

That’s old news because some PHP lovers at places like WordPress and Facebook have been competing to execute PHP code faster than ever. Facebook’s HipHop Virtual Machine brought the kind of Just-in-Time compiler technology that once made Java such a high-performing solution, while PHP 7.0 rewrote the language’s engine from the ground up. Both deliver speeds that may be twice as fast as the old versions. Take that, Node.js and Java.

Hot: Just-in-time education

Not: Four years up front

The computer-mediated courses aren’t new anymore, and everyone is enjoying the advantage of watching a video lecture with buttons for speeding up, slowing down, or asking the prof to repeat that last point. The online forums are also an improvement over the old seminar rooms, where only one blowhard could dominate the discussion at a time.

But it’s not only the nature of and technology behind online coursework that’s upending the education industrial complex; it’s also the flexibility to learn whenever and wherever you need to. This is changing the dynamic as people no longer have to invest four years of outrageous tuition on a big collection of courses that may or may not be relevant to their lives. Why take courses on compilers until you know you’ll actually work on a compiler? If the boss wants to switch from a relational database to a NoSQL engine, then you can invest the time in a course in modern data stores. You get fresh information when you need it and don’t clutter your brain with quickly rotting ideas.

Source: InfoWorld Big Data

Review: Amazon QuickSight covers the BI basics

When I reviewed self-service exploratory business intelligence (BI) products in 2015, I covered the strengths and weaknesses of Tableau 9.0, Qlik Sense 2.0, and Microsoft Power BI. As I pointed out at the time, these three products offer a range of data access, discovery, and visualization capabilities at a range of prices, with Tableau the most capable and expensive, Qlik Sense in the middle, and Power BI the least capable but a very good value.

A new entry, Amazon QuickSight, runs entirely in the AWS cloud, has good access to Amazon data sources and fair access to other data sources, and offers basic analysis and data manipulation at a basic price. Of the three products I reviewed in 2015, QuickSight most closely resembles Power BI, only without the dependence on a desktop product to create data sets—or the level of analysis power provided by the Power BI Desktop/Service combination.