Fueling Science and Research through Machine Perception

 
Our Interconnected Planet

The concept of computers being aware of their surroundings and acting on their perceptions began as fiction built around advanced intelligence and consciousness. That fiction, perhaps, inspired scientists to begin seriously discussing the possibility of designing intelligence into an electronic device. The computers we are all familiar with are programmed with a very specific instruction set, and applications leverage those instructions to carry out a wide variety of narrowly defined tasks. Until recently, machines could not perceive their surroundings and modify their actions accordingly, nor improve their accuracy by learning from mistakes. That is changing with the arrival of autonomous vehicles, and it is machine perception that may fuel the next wave of innovation.

Machines are already pushing the boundaries of human-level performance across the entire spectrum of perception, and in some cases surpassing it. Humans, for example, cannot perceive radar, ultrasound, or infrared signals without the assistance of machines. Computers, on the other hand, can harness a powerful array of perceptive capabilities, and great progress has already been made in image recognition, speech analysis, and similar tasks. The improvements of the past ten years have been staggering, and the pace shows every sign of continuing over the next decade.

To understand what is possible in the future, let's first define the terms being used today, such as Artificial Intelligence (AI), Machine Learning (ML), and Deep Learning (DL), and how they relate to each other. AI is the broadest term, with ML and DL as subsets; likewise, DL is a subset of ML. Simple definitions of each are listed below, followed by a short code sketch that makes the distinction concrete:

Artificial Intelligence – machines performing tasks that would otherwise require human intelligence.

Machine Learning – a system that grows its own intelligence by learning from data rather than following explicitly programmed rules.

Deep Learning – a machine learning approach inspired by how the brain works, built on multi-layered artificial neural networks.
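
The sketch below is a minimal, hypothetical illustration (plain Python and NumPy, not Mellanox code) of what "learning from data" means: the program is given examples rather than rules, and it adjusts its own parameters to fit them. Deep learning applies the same idea, but stacks many trainable layers into an artificial neural network.

    # Minimal machine-learning sketch: the model improves itself from example
    # data instead of following hand-written rules. (Hypothetical illustration.)
    import numpy as np

    # Example data: inputs x and the targets y we want the model to learn (y = 3x + 1).
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = 3.0 * x + 1.0

    w, b = 0.0, 0.0          # model parameters, initially "untrained"
    learning_rate = 0.03

    for step in range(2000):
        pred = w * x + b                         # model prediction
        error = pred - y                         # how wrong it currently is
        w -= learning_rate * (error * x).mean()  # gradient step on w
        b -= learning_rate * error.mean()        # gradient step on b

    print(f"learned w={w:.2f}, b={b:.2f}")       # approaches w=3, b=1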

AI applications are based on training computers, which requires complex computations and fast, efficient data delivery. This is accomplished through fast, efficient, and lightweight protocols that streamline communication and data movement. Mellanox's heritage has been in meeting exactly these needs, and the company is uniquely positioned to provide end-to-end solutions for organizations seeking a competitive advantage from their data. Smart offloads such as RDMA and GPUDirect®, together with in-network computing capabilities, dramatically improve network performance and deliver the characteristics AI workloads require.
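
To show where such a fabric comes into play, here is a hedged sketch of a multi-node training job using PyTorch's DistributedDataParallel with the NCCL backend; NCCL can take advantage of RDMA and GPUDirect transfers between servers when the interconnect, drivers, and firmware support them. The script name, node counts, and model are hypothetical placeholders.

    # Hypothetical multi-node training sketch. Launch with torchrun, one process
    # per GPU, e.g.:  torchrun --nnodes=4 --nproc_per_node=8 train.py
    import os
    import torch
    import torch.distributed as dist
    from torch.nn.parallel import DistributedDataParallel as DDP

    def main():
        # torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE for each process.
        dist.init_process_group(backend="nccl")
        local_rank = int(os.environ["LOCAL_RANK"])
        torch.cuda.set_device(local_rank)

        model = torch.nn.Linear(1024, 1024).cuda(local_rank)
        model = DDP(model, device_ids=[local_rank])  # gradients all-reduced across nodes
        optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

        for _ in range(10):                          # toy training loop
            x = torch.randn(64, 1024, device=local_rank)
            loss = model(x).pow(2).mean()
            optimizer.zero_grad()
            loss.backward()                          # triggers inter-node all-reduce
            optimizer.step()

        dist.destroy_process_group()

    if __name__ == "__main__":
        main()

The inter-node all-reduce during backward() is exactly the traffic that offloads such as RDMA are designed to accelerate.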

If you’ve been following the Mellanox Interconnected Planet campaign, which explores the impact of technology on global issues, you’ve likely read a lot about InfiniBand’s role in AI. Mellanox has proven to be a game-changer in the race to solve the challenges of scalable DL. The most popular DL frameworks today make good use of multiple GPUs within a server, but have difficulty scaling across multiple servers. IBM Research recently announced unprecedented performance and scaling with new distributed DL solutions built on Mellanox InfiniBand, achieving a record 95 percent scaling efficiency on the Caffe deep learning framework. This is the challenge where Mellanox has been the clear leader, as the only interconnect able to deliver the performance and offload capabilities required to scale DL to a 64-node system.
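
For reference, scaling efficiency is typically defined as the measured speedup divided by the number of nodes. The short calculation below illustrates the definition with made-up timings (not IBM's actual data):

    # Illustrative scaling-efficiency calculation (hypothetical timings).
    # efficiency = speedup / N, where speedup = T_1 / T_N.
    t_single_node = 64.0      # hypothetical training time on 1 node (hours)
    t_64_nodes = 1.05         # hypothetical training time on 64 nodes (hours)
    n_nodes = 64

    speedup = t_single_node / t_64_nodes
    efficiency = speedup / n_nodes
    print(f"speedup: {speedup:.1f}x, scaling efficiency: {efficiency:.0%}")
    # -> speedup: 61.0x, scaling efficiency: 95%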

 

Ethernet for AI Applications

Mellanox Ethernet products provide similar capabilities: the most efficient data movement via offloads, accelerators, and RDMA-enabled features, all of which support the complex computations and fast, efficient data delivery that world-leading AI platforms depend on. Using Ethernet for supercomputing is not a novel idea. In 2013, Australia’s National Computational Infrastructure built a world-class, high-end research platform on Mellanox interconnect solutions to study coral bleaching in Australia’s Great Barrier Reef.

More recently, Tencent invested in an end-to-end 100GbE infrastructure for analytics and ML deployments and participated in the annual TeraSort benchmark competition. The Mellanox Ethernet NIC and switch solution was a key component in delivering dramatic improvements over the previous year’s results. Tencent achieved more than 1TB/sec of TeraSort performance, a record-breaking result that no other company has been able to demonstrate. By leveraging recent advances in CPU, memory, storage, and networking from industry leaders such as IBM and Mellanox, they achieved 5X the performance of the previous record. Solution highlights can be seen below, and a solution brief covering the world-record results can be viewed here.

Mellanox Ethernet solutions have also been selected by Baidu, the leading Chinese-language Internet search provider, for its ML platforms. With the Mellanox Ethernet solution in place, Baidu demonstrated a 200 percent improvement in machine learning training times over previous solutions, enabling faster decision making. Baidu’s Institute for Deep Learning uses machine learning to improve voice and image recognition as well as natural language processing, which in turn helps provide smarter, more useful, and more personalized search results. Baidu is also applying it to research in augmented reality and autonomous vehicles, and in health care to develop a medical assistant that patients can chat with to receive a reliable diagnosis, similar to a doctor’s visit.

 

A Promise for Breakthrough Research

AI is all around us, even if most of us are not aware of it. The speech recognition on our Android and Apple phones is a form of DL that turns speech into text and provides directions through Google Maps. The future is here now, with many applications using AI as a building block that is poised to further intertwine our lives with technology. AI brings the promise of breakthrough research for just about everything, from the way credit card companies detect fraud to the exploration of space. A prime example of how AI will help research in the future can be seen in health care. Research on the human genome will be combined with an individual’s medical record and with access to enormous bodies of clinical studies. AI can then parse through all this data (something that would be impossible for a human to do) to determine an individual’s diagnosis and tailor a specialized treatment.

The amount of data in the health care industry is growing rapidly; electronic health and imaging records, along with data from wearable devices and sensors, are contributing to a flood of information. While this tsunami of data can be overwhelming to humans, it is full of possibilities for AI. For health care providers, such research could improve patient treatment, simplify regulatory compliance, boost operational efficiency, and increase collaboration between medical teams. Studying the health of large populations will enable AI to recognize potential risks and suggest powerful preventive care. All of this will help us address the challenges of caring for an aging population while preserving independence for much longer.

 

Conclusion

As data volumes continue to increase and processing them becomes more affordable, many more industries are likely to benefit from AI. We can already see this in improved experiences from consumer mobile applications such as Apple’s Siri and Amazon’s Alexa. Many of the world’s top organizations, including Google, Microsoft, Yahoo, Facebook, Twitter, Tencent, and Baidu, are betting big by investing significant resources in AI, and at the center of this investment is Mellanox Technologies. Before long, the ability for machines to achieve perception will be an essential feature in everyday devices, applications, and the web. Autonomous cars and voice assistants are only scratching the surface of what AI can do.


About Tim Lustig

Tim Lustig is the Director of Corporate Ethernet Marketing at Mellanox Technologies, Inc. A professional in the networking industry, he has been at the forefront of marketing new networking technologies for over two decades. From his start in network and database administration to product marketing and corporate technologist roles, his experience includes outbound marketing, third-party testing and validation, strategic product marketing, market research, and technical writing. He has written many papers and articles on networking technologies and has been a featured speaker at industry conferences around the world. Prior to Mellanox, Tim held positions at Brocade Communications as Senior Product Marketing Manager and at QLogic Corporation as Director of Corporate Marketing. He currently sits on the 25 Gigabit Ethernet Consortium marketing committee, where he is an industry steward for the promotion of 25, 50, and 100Gb Ethernet. Follow Tim on Twitter: @tlustig
