AI, ML and Data Engineering InfoQ Trends Report

02 Oct 2024

Deep Learning Frameworks

  • There are two major deep learning frameworks: PyTorch and TensorFlow. PyTorch is more popular in academic research, while TensorFlow is more dominant in commercial applications. (2m58s)
  • ONNX is an open model format, developed by Facebook and Microsoft, that allows deep learning models to be transferred between platforms such as PyTorch and TensorFlow. (3m51s)
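
As a minimal sketch of that interoperability, the snippet below exports a small PyTorch model to an ONNX file; the model and the tensor names are illustrative, not from the discussion.

```python
import torch
import torch.nn as nn

# A tiny illustrative model; any torch.nn.Module exports the same way.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()

# The exporter traces the model with a sample input of the right shape
# and writes a portable .onnx file that other platforms can load.
dummy_input = torch.randn(1, 4)
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["features"],
    output_names=["logits"],
)
```

The resulting `model.onnx` file can then be loaded outside PyTorch, for example with ONNX Runtime via `onnxruntime.InferenceSession("model.onnx")`.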

Deep Learning Trends

  • Deep learning is becoming the assumed machine learning method for many projects, leading to data being stored in a way that makes it easier for deep learning algorithms to process. (5m9s)
  • Frameworks are adding tools for large-scale training, including Mesh TensorFlow, Microsoft's DeepSpeed, and Facebook's FairScale. (6m21s)
  • Pre-trained models, such as GPT-3, are being used for a variety of applications, including natural language processing and vision. (7m53s)
  • There is a growing trend of using pre-trained models and fine-tuning them for specific applications, especially for those new to the field. (8m19s)
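
A minimal fine-tuning sketch in PyTorch/torchvision is shown below (assuming torchvision 0.13+ for the `weights` API); the 10-class head is a hypothetical target task, not one mentioned in the discussion.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a model pre-trained on ImageNet and freeze its backbone.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for a hypothetical 10-class task.
model.fc = nn.Linear(model.fc.in_features, 10)

# Only the new head's parameters are updated during fine-tuning.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
```

Freezing the backbone keeps the general features learned during pre-training and makes fine-tuning feasible on small, task-specific datasets.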

Edge AI

  • Quantization for machine learning on edge devices often involves reducing the precision of numbers from 32-bit or 64-bit floats to 8-bit integers for greater efficiency (see the conversion sketch after this list). (12m7s)
  • Small drones are a prime example of edge AI, with applications like road following and gas leak detection using simple neural networks on microcontrollers like the STM32. (13m55s)
  • While Docker is used for development and quantization toolchains, it is not typically run directly on resource-constrained edge devices like small drones. (15m24s)
  • There is a need for a lighter-weight version of Kubernetes for use at the edge and in fields like AI and ML. (16m47s)
  • Companies like Apple are adapting their hardware to support machine learning with features like tensor accelerators. (17m52s)
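
As a concrete illustration of that precision reduction, here is a minimal post-training quantization sketch using TensorFlow Lite; the tiny Keras model is a stand-in, and full int8 conversion would additionally require a representative dataset.

```python
import tensorflow as tf

# A stand-in for a trained model; in practice you would start from a
# model already trained on your task.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])

# Post-training quantization: the converter stores weights in reduced
# precision; full int8 conversion also needs a representative dataset.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

# The resulting flat buffer is what ships to the edge device.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```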

Autonomous Vehicles

  • Self-driving cars are seeing massive success with companies like Waymo having driverless cars on the road, but scaling the technology and proving its safety remain challenges. (20m47s)
  • Autonomous vehicles will likely utilize GPUs for their processing power, both at the edge and centrally, to handle the complex data processing required for self-driving. (26m21s)

CUDA and GPUs

  • CUDA is NVIDIA's parallel computing platform and programming model for GPUs, which are designed for massively parallel processing. (22m8s)
  • CUDA is here to stay due to the widespread availability of GPUs and their ability to perform complex tasks, such as deep learning and image recognition, efficiently. (23m11s)
  • Docker and Kubernetes make it easier to incorporate GPU and CUDA workloads into deployments. (27m9s)
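
In day-to-day practice, frameworks hide CUDA behind high-level APIs. A minimal PyTorch sketch of dispatching work to a GPU, falling back to the CPU when none is present:

```python
import torch

# PyTorch runs CUDA kernels when tensors live on the GPU; the same
# code works on the CPU when no GPU is available.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)

# On a GPU, this matrix multiply executes as a massively parallel kernel.
c = a @ b
print(c.device)
```

In a containerized setup, the NVIDIA Container Toolkit exposes GPUs to Docker (e.g., `docker run --gpus all ...`), and Kubernetes schedules them through device plugins.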

Natural Language Processing

  • The current trend in natural language processing is to use large Transformer models, such as GPT-3, which are replacing older models like LSTM. (28m0s)
  • GPT-3 is being used for a variety of tasks, including chatbots, grammar correction, and image generation. (29m52s)
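
As a small illustration of working with such models, the snippet below uses the Hugging Face `transformers` library; GPT-3 itself is only reachable through OpenAI's API, so the open GPT-2 model (the same Transformer architecture at smaller scale) stands in here.

```python
from transformers import pipeline

# GPT-2 is an open model built on the same Transformer architecture
# as GPT-3, so it serves as a freely runnable stand-in.
generator = pipeline("text-generation", model="gpt2")

result = generator("Machine learning is", max_new_tokens=20)
print(result[0]["generated_text"])
```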

AI Performance

  • AI models excel at benchmarks due to their training, but their real-world performance might differ. (32m52s)
  • GitHub Copilot, potentially powered by an architecture similar to GPT-3, is an AI pair programmer that completes code based on context and rules. (36m54s)

Kubernetes and Cloud Platforms

  • Kubernetes, initially designed for stateless applications, has evolved to support diverse workloads, including machine learning, making it a standard platform for deployment. (37m56s)
  • Cloud databases such as Cassandra and CockroachDB can be deployed as containers on cloud platforms that utilize Kubernetes. (38m12s)
  • Apache Spark can be containerized and run as a Kubernetes application, and frameworks such as Kubeflow have gained popularity. (38m38s)
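
As a minimal sketch, the PySpark job below is the kind of application that can be containerized and run on Kubernetes; the job itself is illustrative.

```python
from pyspark.sql import SparkSession

# A minimal PySpark job; when submitted against a Kubernetes master,
# Spark launches its executors as Kubernetes pods.
spark = SparkSession.builder.appName("trend-report-demo").getOrCreate()

df = spark.createDataFrame(
    [(1, "pytorch"), (2, "tensorflow")], ["id", "framework"]
)
df.show()

spark.stop()
```

Submitted with Spark's standard Kubernetes syntax, e.g. `spark-submit --master k8s://https://<api-server> --deploy-mode cluster ...` (the API server address is a placeholder), Spark schedules its executors as pods.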

MLOps

  • MLOps, similar to the widely adopted DevOps, aims to enhance efficiency in machine learning processes by automating the cycle of model training, deployment, monitoring, and retraining based on data changes. (38m54s)
  • There are two roles in data science: data scientists who focus on the problem domain and data, and data engineers who focus on infrastructure and pipelines. (44m11s)
  • The trend in MLOps is to automate the data engineer role, making it easier for data scientists to handle both roles, similar to how DevOps aimed to simplify developers taking on operations tasks. (44m46s)
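
A heavily simplified sketch of that retrain-on-data-change trigger is below; it assumes a CSV training set with a `label` column, and the file names are illustrative rather than taken from any standard MLOps tool.

```python
import hashlib
import pathlib

import joblib
import pandas as pd
from sklearn.linear_model import LogisticRegression

def fingerprint(path: str) -> str:
    """Hash the training data so the pipeline can detect changes."""
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

def retrain_if_changed(data_path: str, state_file: str = "last_hash.txt") -> None:
    current = fingerprint(data_path)
    state = pathlib.Path(state_file)
    if state.exists() and state.read_text() == current:
        return  # data unchanged: keep serving the deployed model
    df = pd.read_csv(data_path)  # assumes feature columns plus a "label" column
    model = LogisticRegression(max_iter=1000)
    model.fit(df.drop(columns=["label"]), df["label"])
    joblib.dump(model, "model.joblib")  # artifact handed off to deployment
    state.write_text(current)
```

Real MLOps platforms add monitoring, versioning, and rollout on top, but the train–deploy–retrain loop they automate follows this shape.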

AutoML

  • AutoML aims to automate the machine learning lifecycle, including tasks like hyperparameter search, but experts believe human involvement will always be necessary for defining problems and providing industry knowledge. (46m15s)
  • AutoML automates routine tasks in machine learning, such as running input data against different algorithms, and can recommend the best model for a given dataset. (48m4s)
  • AutoML does not eliminate the need for human involvement in machine learning but shifts the focus to finding the best data and ensuring its quality, balance, and comprehensiveness. (48m36s)
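
The sketch below mirrors, in miniature, the algorithm-comparison step that AutoML systems automate; the candidate models and the built-in dataset are illustrative.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Run the same data against several algorithms and report the best,
# which is the routine work AutoML automates at much larger scale.
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(),
    "svm": SVC(),
}
scores = {
    name: cross_val_score(est, X, y, cv=5).mean()
    for name, est in candidates.items()
}
best = max(scores, key=scores.get)
print(scores, "-> best:", best)
```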

Learning Machine Learning

  • While many resources are available for learning machine learning, finding the right ones can be challenging, especially for those with some background knowledge who want to get up to speed on the latest advancements. (50m53s)
  • Beginners in machine learning are advised to start with simpler tools like scikit-learn and linear or logistic regression before moving on to more complex frameworks like TensorFlow and deep neural networks. (55m50s)
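
In that spirit, here is a minimal scikit-learn starter: logistic regression on one of the library's built-in datasets.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load a small built-in dataset and split it for honest evaluation.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Logistic regression: a simple, interpretable baseline to learn on
# before reaching for deep neural networks.
model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")
```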

Resources for Learning

  • Multiple GPUs can be used in the cloud to experiment with CUDA programming through frameworks like TensorFlow, PyTorch, Theano, and CNTK (see the multi-GPU sketch after this list). (53m6s)
  • Google's Teachable Machine website allows users to train a computer to recognize images, sounds, or poses using a webcam, enabling them to collect data and deploy a machine learning model on their local laptop quickly. (54m28s)
  • There are many resources available to learn more about this topic, including Google Code Labs, Udemy, Pluralsight, and InfoQ. (58m4s)
  • A summary of the discussion and relevant links are available at infoq.com. (58m25s)
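
As a minimal multi-GPU sketch, TensorFlow's `MirroredStrategy` distributes training across all visible GPUs; the model below is a placeholder.

```python
import tensorflow as tf

# MirroredStrategy replicates the model across all visible GPUs and keeps
# gradients in sync; without a GPU it falls back to a single replica.
strategy = tf.distribute.MirroredStrategy()
print("replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(20,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
```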

Acknowledgements

  • The speaker thanks Shini, Anthony, Rex, and Kimberly for participating in the discussion. (58m15s)
