
Green AI on the Edge-Cloud Continuum: Orchestrating Between Edge and Cloud using Container Technologies

This thesis investigates how to orchestrate AI workloads between an edge device (Raspberry Pi) and a more powerful system (desktop or cloud) using container technologies, in order to realize the benefits of Green AI across the edge-cloud continuum.


Master Project Description

Green AI aims to make artificial intelligence more environmentally sustainable by reducing computational costs and energy consumption. As the demand for AI applications grows, edge devices such as the Raspberry Pi are emerging as vital components for local, real-time computation, reducing the need to transfer data to centralized cloud systems. However, there is a clear trade-off: edge devices draw little power but offer far less computational capability than cloud infrastructure.
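To make this trade-off concrete, a first-order model can compare the energy of running an AI task locally against offloading it: the local cost is roughly the edge device's power draw times its runtime, while offloading costs the energy to transmit the input plus the cloud's share of compute energy. The Python sketch below illustrates this model; all power figures, runtimes, and data sizes are illustrative assumptions, not measurements from this project.

```python
# First-order edge-vs-cloud energy model (illustrative numbers only).

def edge_energy_j(power_w: float, runtime_s: float) -> float:
    """Energy (J) to run the workload locally on the edge device."""
    return power_w * runtime_s

def offload_energy_j(data_mb: float, net_j_per_mb: float,
                     cloud_power_w: float, cloud_runtime_s: float) -> float:
    """Energy (J) to transmit the input to the cloud and run it there."""
    return data_mb * net_j_per_mb + cloud_power_w * cloud_runtime_s

if __name__ == "__main__":
    # Hypothetical figures: a Raspberry Pi drawing ~5 W for 4 s of inference,
    # versus sending 2 MB over the network and using 60 W of cloud compute for 0.2 s.
    e_edge = edge_energy_j(power_w=5.0, runtime_s=4.0)                   # 20 J
    e_cloud = offload_energy_j(data_mb=2.0, net_j_per_mb=3.0,
                               cloud_power_w=60.0, cloud_runtime_s=0.2)  # 18 J
    print(f"edge: {e_edge:.1f} J, offload: {e_cloud:.1f} J")
    print("run locally" if e_edge <= e_cloud else "offload to cloud")
```

In practice, these parameters would have to be measured rather than assumed, for example with a power meter on the edge device and energy-profiling tools in the cloud, which is part of the first research task below.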

Research Topic Focus

  • Understand the energy consumption dynamics of AI workloads on edge devices versus cloud infrastructures.
  • Investigate the role of container technologies in enabling seamless workload migration between edge devices and cloud/desktop systems.
  • Design and implement orchestration strategies that balance energy efficiency and computational performance using container technologies (a minimal sketch follows this list).
  • Experiment with various AI workloads to evaluate the energy and computational efficiency of the proposed orchestration strategies.
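As a starting point for such an orchestration strategy, the sketch below uses the Docker SDK for Python to place a containerized workload either on the local edge device or on a remote Docker host, choosing whichever side an energy estimate (such as the model above) says is cheaper. The image name, remote host URL, and energy figures are placeholder assumptions; a full solution would more likely build on Kubernetes scheduling (e.g. node selectors for a Raspberry Pi node) than on raw Docker calls.

```python
# Energy-aware placement sketch using the Docker SDK for Python (pip install docker).
# The image name, host URLs, and energy figures below are illustrative assumptions.
import docker

EDGE_URL = "unix:///var/run/docker.sock"     # local Docker daemon on the Raspberry Pi
CLOUD_URL = "tcp://cloud.example.org:2375"   # hypothetical remote Docker daemon
IMAGE = "example/green-ai-inference:latest"  # hypothetical containerized AI workload

def place_workload(edge_energy_j: float, offload_energy_j: float) -> str:
    """Pick the Docker host with the lower estimated energy cost."""
    return EDGE_URL if edge_energy_j <= offload_energy_j else CLOUD_URL

def run_workload(target_url: str) -> str:
    """Start the containerized workload on the chosen host and return its container ID."""
    client = docker.DockerClient(base_url=target_url)
    container = client.containers.run(IMAGE, detach=True)
    return container.id

if __name__ == "__main__":
    target = place_workload(edge_energy_j=20.0, offload_energy_j=18.0)
    print(f"placing workload on {target}")
    print(f"started container {run_workload(target)}")
```

The same decision logic could instead be expressed as a scheduling policy in a Kubernetes cluster spanning the edge device and the cloud; which layer of the container stack should make the placement decision is one of the design questions this project would explore.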

Expected Results

  • A detailed analysis of the energy consumption and computational efficiency of AI workloads across edge and cloud environments.
  • Successful implementation of container-based orchestration strategies that enable Green AI applications.
  • Demonstrable energy savings and computational efficiency gains from the proposed orchestration techniques across diverse AI workloads.

Learning Outcomes

  • Develop a deep understanding of the edge-cloud continuum and its importance in Green AI.
  • Master the application of container technologies in orchestrating AI workloads.
  • Gain practical experience in designing energy-efficient strategies for AI applications.
  • Enhance problem-solving and system design skills pertinent to real-world AI orchestration challenges.

Qualifications

  • Strong foundation in AI, edge computing, and cloud infrastructure.
  • Proficiency in relevant programming languages and tools.
  • Familiarity with container technologies, preferably Docker and Kubernetes.
  • An analytical mindset and interest in Green AI and sustainable computing.


Contact persons/supervisors

Sagar Sen, Arda Goknil, Erik Johannes Husom