A tech brief for 2017 and beyond


Wednesday, December 14, 2016

Richard Harris

The IEEE Computer Society, a leading source of technology information and career-development resources for the computing industry, has released its latest report highlighting up-and-coming technological advances that could take hold.

As technology evolves rapidly, knowing what is here now and what is coming next matters more than ever. Here is an outline of the society's predictions, not only for 2017 but for the next five years.

Technology trends that will reach adoption in 2017:


1) Industrial IoT

With many millions of IoT sensors deployed across dozens of industrial-strength, real-world applications, this is one of the largest and most impactful arenas for big data analytics in 2017.

2) Self-driving Cars

In Silicon Valley, it is already possible to spot several self-driving cars on the same street. While widespread adoption in general traffic is less likely in the near term, broader adoption will probably come first in constrained environments such as airports and factories.

3) Artificial Intelligence, Machine Learning, Cognitive Computing

These overlapping areas are a fundamental requirement for big data analytics and for other areas of control and management. Machine learning, and deep learning in particular, are quickly transitioning from research lab to commodity products. On the software side, advanced engines and libraries from industry leaders, such as Facebook and Google, are making it to open source. On the hardware side, we see continually improving performance and scalability from existing technologies (CPUs and GPUs), as well as emerging accelerators. Consequently, writing domain-specific applications that can learn, adapt, and process complex and noisy inputs in near real time is easier than ever and a wide range of new applications is emerging.
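To make the commoditization point concrete, here is a minimal, illustrative sketch of the kind of learning algorithm those open-source engines package up: logistic regression trained by gradient descent, written in pure Python. It is a toy stand-in, not the report's own example; real applications would use the open-source frameworks the industry leaders have released.

```python
# Toy sketch: logistic regression fit by gradient descent on a tiny,
# hand-made 2-D dataset (illustrative only; real workloads use open-source
# ML frameworks and far larger data).
import math

def train_logistic(samples, labels, lr=0.5, epochs=500):
    """Fit weights w and bias b to binary-labeled 2-D samples."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            z = w[0] * x1 + w[1] * x2 + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid prediction
            err = p - y                       # gradient of the log-loss
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Classify a point with the learned linear boundary."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# "Noisy input": points roughly below/above the line x1 + x2 = 1.
samples = [(0.1, 0.2), (0.2, 0.3), (0.9, 0.8), (0.7, 0.9)]
labels = [0, 0, 1, 1]
w, b = train_logistic(samples, labels)
```

The whole learn-adapt-predict loop fits in a page; the frameworks mentioned above wrap exactly this kind of optimization behind GPU-accelerated, production-grade APIs.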

4) 5G

While it is unlikely that 5G will see immediate adoption in the next year, its roadmaps and standards are being developed, influencing the applications that will eventually evolve. Some early deployment use cases are also being pursued.

5) Accelerators

While, in the long term, the end of Moore's law is being addressed by novel technologies such as those covered by rebooting computing (see item 1 in the five-year trends below), in the near term heterogeneous computing built on accelerators is stretching the performance boundaries of today's technologies.

6) Disaggregated Memory – Fabric-attached Nonvolatile Memory (NVM)

While NVM has achieved mixed success in productization in the past year, the number of companies working in this arena, be it on materials, architecture, or software, makes it a strong candidate for imminent adoption. Fast, nonvolatile storage bridges the gap between RAM and SSDs, with a performance-cost ratio lying somewhere in between. It will initially be configured either as "a disk," accessed by the OS like any other permanent storage device, or as "RAM" in DIMM slots, accessed by the OS as memory. But once the hardware and OS support is fully worked out, this technology will open the door to applications that aren't currently possible.
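The "as a disk" versus "as RAM" distinction can be sketched with today's closest commodity analogue, a memory-mapped file: the OS maps persistent bytes into the address space, and the program reads and writes them like ordinary memory. This is an illustration of the programming model, not an actual NVM driver.

```python
# Sketch of the two NVM access modes using a memory-mapped file as a
# stand-in (illustrative; real fabric-attached NVM needs hardware/OS support).
import mmap
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "nvm_sim.bin")

# "As a disk": create and size a persistent backing store through the OS.
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)

# "As RAM": map it and update bytes in place, with no read()/write() calls.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 4096) as mem:
        mem[0:5] = b"hello"   # a plain in-memory store
        mem.flush()           # persistence point, akin to a cache flush

# The data outlives the mapping, as it would survive power loss on real NVM.
with open(path, "rb") as f:
    first = f.read(5)
```

Once memory and storage share one address space like this, the software stack above it (filesystems, databases, caches) can be rethought, which is where the new applications come from.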

7) Sensors Everywhere and Edge Compute

From smart transportation and smart homes, to retail innovations, surveillance, sports and entertainment, and industrial IoT, we are starting to see intelligence being aggressively deployed at the edge. With intelligence comes the need to compute at the edge, and a variety of edge compute offerings are opening up new disruptive opportunities.

8) Blockchain (beyond Bitcoin)

While known as the technology behind Bitcoin, blockchain has far more disruptive uses, potentially changing the way we implement processes like voting, financial transactions, title and ownership, anti-counterfeiting, and digital rights management, securing these processes without the need for (and bottleneck of) a central authority.
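The tamper-evidence that removes the central authority can be shown in a few lines. This is an illustrative hash chain, not Bitcoin's actual block format: each block stores the hash of its predecessor, so altering any earlier record (a vote, a transaction, a title transfer) invalidates every later link, and anyone can detect it.

```python
# Minimal hash-chain sketch of a blockchain's tamper-evidence
# (illustrative; real blockchains add signatures, consensus, etc.).
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose hash commits to its data and its predecessor."""
    body = {"data": data, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def chain_is_valid(chain):
    """Recompute every hash and check each block points at its predecessor."""
    for i, block in enumerate(chain):
        body = {"data": block["data"], "prev": block["prev"]}
        expect = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expect:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block("title: parcel 42 -> Alice", "0" * 64)
chain = [genesis, make_block("title: parcel 42 -> Bob", genesis["hash"])]
valid_before = chain_is_valid(chain)

chain[0]["data"] = "title: parcel 42 -> Mallory"   # attempted tampering
valid_after = chain_is_valid(chain)
```

Because validity can be checked by any participant, no single institution has to be trusted to keep the ledger honest, which is precisely what makes the voting, title, and anti-counterfeiting use cases plausible.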

9) Hyper-converged Systems

Also known as "software-defined everything," hyper-converged systems are bundles of hardware and software that contain elements of compute, storage and networking together with an orchestration system that lets IT administrators manage them using cloud tools and dev/ops practices. While they have been on the roadmap for major IT players for the last three to five years, we see major adoption trends that may cause their growth to explode in 2017.

Technology trends that will reach adoption in five years:


1) Rebooting Computing (includes quantum computing)

The end of Moore's law has resulted in the end of the ITRS (International Technology Roadmap for Semiconductors) and its transformation into the IRDS (International Roadmap for Devices and Systems), which focuses on new technologies such as quantum, neuromorphic, and adiabatic computing, among many others.

2) Human Brain Interface

Many types of interfaces are under development, but the one that could be most impactful is a human brain interface that drives and controls machines directly. This will be enabled by the rebooting-computing technologies above, but it will also require separate innovation to connect the human brain to hardware.

3) Capabilities – Hardware protection

Protecting data at rest and in flight requires more sophisticated security technologies based on more robust hardware protection, such as capabilities. Capabilities were popular in the 1960s but were abandoned in favor of paging, which was sufficient when physical memory was small. Rapid advances in memory, interconnects, and processors, as well as the requirements of big data applications, open up new opportunities for capabilities.
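To make the idea concrete, here is a hypothetical software model of a capability: instead of a raw pointer, code holds an unforgeable token carrying a base address, bounds, and permissions, and every access is checked against it. Real capability machines enforce these checks in hardware; this Python class only illustrates the access rules.

```python
# Hypothetical model of a hardware capability: base + bounds + permissions
# checked on every access (illustrative; real designs do this in silicon).
class Capability:
    def __init__(self, memory, base, length, perms):
        self._mem = memory
        self._base = base
        self._len = length
        self._perms = frozenset(perms)   # e.g. {"r", "w"}

    def _check(self, offset, perm):
        """Reject accesses outside the bounds or beyond the permissions."""
        if perm not in self._perms:
            raise PermissionError(f"capability lacks '{perm}' permission")
        if not 0 <= offset < self._len:
            raise IndexError("access outside capability bounds")

    def load(self, offset):
        self._check(offset, "r")
        return self._mem[self._base + offset]

    def store(self, offset, value):
        self._check(offset, "w")
        self._mem[self._base + offset] = value

memory = bytearray(64)
cap = Capability(memory, base=16, length=8, perms={"r", "w"})
cap.store(0, 0xAB)   # allowed: in bounds, writable
ro = Capability(memory, base=16, length=8, perms={"r"})  # read-only view
```

An out-of-bounds load or a store through the read-only capability faults immediately, which is the fine-grained protection that paging, with its coarse page-sized granularity, cannot provide.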

4) The Year of Exascale

The scientific community is starting to converge on 2022 as the year when it can expect the first wave of exascale systems to be deployed. An exascale machine would nearly double the performance of all of 2016's Top500 supercomputers put together, enabling breakthroughs in scientific fields such as weather, genomics, life sciences, energy, and manufacturing.

5) NVM Reaches Maturity

There are indicators that the long-predicted adoption of NVM is coming, and by 2022 we'll be at least in the second or third generation of true nonvolatile memory devices, which will change the entire memory-storage hierarchy, and the associated software stack, across the IT industry.

6) Silicon Photonics Becomes a Reality

While bridging technologies (such as VCSEL-based photonics) may be sufficient to address the needs for the next five years, we see 2022 as the pivot point where highly integrated silicon photonics components will be necessary to meet the combined cost, energy, and performance requirements of Exascale systems.

7) Smart NICs

Networking equipment, such as the kind seeing explosive growth in data centers, is becoming more commoditized and open. Ever more sophisticated chips in network interface cards (NICs) allow more offloading of traditional networking tasks from the CPU to the NIC, including encryption, compression, packet management, and so on. We've seen this trend before with graphics cards: commodity specialized hardware mated with good library support enabled an explosion of applications and libraries in domains far from graphics, earning the nickname "GPGPU." Similarly, GPNICs may allow newly accelerated software to take advantage of the unique hardware properties of NICs, both within classical network applications, such as key-value stores, and in new domains, such as text processing.

8) Power Conservative Multicores

The number of processor cores integrated on a chip will grow into the hundreds and thousands for Top500 and Green500 HPC machines. With more processors on a chip, memory architectures and data transfer will become key hardware technologies. In software, parallelizing compilers that let users employ the many cores efficiently and easily will rein in rapidly increasing software development costs. Automatic power reduction, achieved through collaboration between the architecture and the compiler, will become crucial for applying clock gating, power gating, or frequency and voltage lowering to idle processor cores.
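The payoff from voltage and frequency lowering follows from the standard first-order model of dynamic CMOS power, P = C · V² · f. The constants below are illustrative placeholders, not measured values; the point is that power falls with the square of voltage, so modest voltage drops on idle cores yield outsized savings.

```python
# Back-of-the-envelope sketch of compiler-directed voltage/frequency scaling
# using the first-order dynamic power model P = C_eff * V^2 * f.
# All constants are illustrative, not measured.
def dynamic_power(c_eff, voltage, freq_hz):
    """First-order dynamic power of a core: P = C_eff * V^2 * f (watts)."""
    return c_eff * voltage ** 2 * freq_hz

# Hypothetical core: 1 nF effective switched capacitance, 1.0 V, 2 GHz.
full = dynamic_power(1e-9, 1.0, 2e9)    # power at full speed

# The same core scaled down to 0.8 V and 1 GHz while near-idle.
scaled = dynamic_power(1e-9, 0.8, 1e9)

savings = 1 - scaled / full             # fraction of dynamic power saved
```

Here halving the frequency alone would save 50%, but combining it with a 20% voltage drop saves roughly 68%, which is why architecture and compiler must cooperate to lower both on idle cores.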
