Optimization




Abstract

The design and control of winged aircraft and drones is an iterative process aimed at identifying a compromise among mission-specific costs and constraints. When agility is required, shape-shifting (morphing) drones represent an efficient solution. However, morphing drones require the addition of actuated joints that increase the coupling between topology and control, making the design process more complex. We propose a co-design optimization method that assists engineers by proposing a morphing drone's conceptual design, including topology, actuation, morphing strategy, and controller parameters. The method applies multi-objective, constraint-based optimization to a multi-body winged drone, combined with trajectory optimization to solve the motion-intelligence problem under diverse flight-mission requirements such as energy consumption and mission completion time. We show that co-designed morphing drones outperform fixed-wing drones in terms of energy efficiency and mission time, suggesting that the proposed co-design method could be a useful addition to the aircraft engineering toolbox.
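As a loose, toy illustration of the trajectory-optimization half of such a co-design pipeline (this is not the paper's method, drone model, or cost function), the sketch below scalarizes two mission objectives, completion time and control energy, for a 1-D double integrator and minimizes them with gradient-based optimization. JAX and optax are my choice of tools here; all weights, names, and values are made up.

```python
import jax
import jax.numpy as jnp
import optax  # gradient-processing library from the JAX ecosystem

def mission_cost(z, n=50, w_time=1.0, w_energy=0.1, w_goal=100.0):
    """Scalarized multi-objective trajectory cost for a 1-D double integrator:
    trade off mission completion time against control energy, with the goal
    ("reach x = 1 at rest") handled as a soft penalty."""
    T, u = jnp.exp(z[0]), z[1:]                 # free final time (kept positive) and controls
    dt = T / n
    def step(state, ui):
        x, v = state
        v = v + dt * ui
        x = x + dt * v
        return jnp.stack([x, v]), ui ** 2
    final, u_sq = jax.lax.scan(step, jnp.zeros(2), u)
    x, v = final
    energy = dt * jnp.sum(u_sq)
    goal_violation = (x - 1.0) ** 2 + v ** 2
    return w_time * T + w_energy * energy + w_goal * goal_violation

z = jnp.zeros(51)                               # [log T, u_1, ..., u_50]
opt = optax.adam(5e-2)
opt_state = opt.init(z)
grad_fn = jax.jit(jax.grad(mission_cost))
for _ in range(2000):
    updates, opt_state = opt.update(grad_fn(z), opt_state)
    z = optax.apply_updates(z, updates)
print(jnp.exp(z[0]))                            # mission time chosen by the time/energy trade-off
```

Changing the weights w_time and w_energy traces out the same kind of trade-off the abstract describes, which is why a scalarized cost is a convenient stand-in for the multi-objective formulation in a quick sketch.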


Abstract

This book is a self-contained introduction to the design of modern (deep) neural networks. Because the term "neural" comes with a lot of historical baggage, I prefer the simpler term "differentiable models" in the text. The focus of this 250-page volume is on building efficient blocks for processing nD data, including convolutions, transformers, graph layers, and modern recurrent models (including linearized transformers and structured state-space models). Because the field is evolving quickly, I have tried to strike a good balance between theory and code, historical considerations and recent trends. I assume the reader has some exposure to machine learning and linear algebra, but I try to cover the preliminaries when necessary. The volume is a refined draft of a set of lecture notes for a course called Neural Networks for Data Science Applications that I teach at Sapienza. I do not cover many advanced topics (generative modeling, explainability, prompting, agents), which will be published over time on the companion website.

2024/05/02 13:59 · Horea Caramizaru


Course Description

We all know that calculus courses such as 18.01 Single Variable Calculus and 18.02 Multivariable Calculus cover univariate and vector calculus, respectively. Modern applications such as machine learning and large-scale optimization require the next big step, "matrix calculus" and calculus on arbitrary vector spaces.

This class covers a coherent approach to matrix calculus showing techniques that allow you to think of a matrix holistically (not just as an array of scalars), generalize and compute derivatives of important matrix factorizations and many other complicated-looking operations, and understand how differentiation formulas must be reimagined in large-scale computing. We will discuss reverse/adjoint/backpropagation differentiation, custom vector-Jacobian products, and how modern automatic differentiation is more computer science than calculus (it is neither symbolic formulas nor finite differences).
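As a minimal sketch of the reverse/adjoint viewpoint and of custom vector-Jacobian products described above (the course itself is framework-agnostic; JAX is merely my choice for the example), here is a hand-written VJP for the log-determinant, whose adjoint is the textbook matrix-calculus formula d/dA log det A = A^{-T}:

```python
import jax
import jax.numpy as jnp

# log-determinant with a hand-written vector-Jacobian product:
# d/dA logdet(A) = A^{-T} for invertible A with positive determinant.
@jax.custom_vjp
def logdet(A):
    sign, value = jnp.linalg.slogdet(A)
    return value

def logdet_fwd(A):
    # Forward pass: compute the value and save A for the backward pass.
    return logdet(A), A

def logdet_bwd(A, g):
    # g is the scalar cotangent; the VJP is g * A^{-T}.
    return (g * jnp.linalg.inv(A).T,)

logdet.defvjp(logdet_fwd, logdet_bwd)

A = jnp.array([[2.0, 0.5], [0.5, 1.0]])
print(jax.grad(logdet)(A))     # hand-written adjoint
print(jnp.linalg.inv(A).T)     # matches the textbook formula
```

Saving A and inverting it in the backward pass is only for clarity; in large-scale computing one would reuse a factorization instead of forming the inverse, which is exactly the kind of reimagining the description refers to.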


Abstract

Artificial intelligence has recently experienced remarkable advances, fueled by large models, vast datasets, accelerated hardware, and, last but not least, the transformative power of differentiable programming. This new programming paradigm enables end-to-end differentiation of complex computer programs (including those with control flows and data structures), making gradient-based optimization of program parameters possible. As an emerging paradigm, differentiable programming builds upon several areas of computer science and applied mathematics, including automatic differentiation, graphical models, optimization and statistics. This book presents a comprehensive review of the fundamental concepts useful for differentiable programming. We adopt two main perspectives, that of optimization and that of probability, with clear analogies between the two. Differentiable programming is not merely the differentiation of programs, but also the thoughtful design of programs intended for differentiation. By making programs differentiable, we inherently introduce probability distributions over their execution, providing a means to quantify the uncertainty associated with program outputs.
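To make "end-to-end differentiation of complex computer programs" concrete, here is a minimal, hedged sketch in JAX (one possible framework, not the book's notation): a small program containing a loop and a data-dependent branch, differentiated as a whole with respect to its parameter.

```python
import jax
import jax.numpy as jnp

def program(theta, x):
    """A tiny "program" with control flow: a fixed-length loop and a
    data-dependent branch, both expressed with differentiable primitives."""
    y = jnp.float32(x)
    for _ in range(5):                       # loop, unrolled by tracing
        y = jax.lax.cond(y > 0.0,            # branch chosen by the data
                         lambda v: theta * v,
                         lambda v: 0.1 * v,
                         y)
    return y ** 2

# Gradient of the whole program with respect to its parameter theta.
print(jax.grad(program)(0.9, 1.3))
```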


2023/12/23 21:32 · Horea Caramizaru




Abstract:

Smooth optimization on manifolds naturally generalizes smooth optimization in Euclidean spaces in a manner that is of interest in a variety of applications, including modal analysis, blind source separation (via independent component analysis), pose estimation in computer vision, and model reduction in dynamical systems. Manifolds of interest include the Stiefel manifolds and Grassmann manifolds.

After presenting a number of motivating applications, we will introduce the basics of differential manifolds and Riemannian geometry, and describe how methods in optimization, such as line search, Newton's method, and trust-region methods, can be generalized to the case of manifolds. The course will assume only a basic knowledge of matrix algebra and real analysis.
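To make the generalization concrete, here is a small sketch (not taken from the course; JAX's numpy and plain NumPy are my choice of tools) of Riemannian gradient descent on the Stiefel manifold {X : X^T X = I}: the Euclidean gradient is projected onto the tangent space and the iterate is pulled back onto the manifold with a QR retraction.

```python
import numpy as np
import jax.numpy as jnp

def stiefel_gradient_descent(A, p, steps=500, lr=0.1, seed=0):
    """Minimize f(X) = -trace(X^T A X) over the Stiefel manifold
    {X in R^{n x p} : X^T X = I}, i.e. find a dominant invariant subspace of A."""
    n = A.shape[0]
    X0 = np.random.default_rng(seed).normal(size=(n, p))
    X, _ = jnp.linalg.qr(jnp.asarray(X0))                 # random feasible start
    for _ in range(steps):
        G = -2.0 * A @ X                                  # Euclidean gradient of f
        riem = G - X @ (0.5 * (X.T @ G + G.T @ X))        # project onto the tangent space
        X, _ = jnp.linalg.qr(X - lr * riem)               # QR retraction back onto the manifold
    return X

A = jnp.diag(jnp.array([5.0, 3.0, 1.0, 0.5]))
X = stiefel_gradient_descent(A, p=2)
print(jnp.round(X, 3))   # columns span the subspace of the two leading eigenvectors
```

The projection formula and the QR retraction are the standard choices for the Stiefel manifold with the embedded metric; Newton and trust-region variants replace the fixed step with a model of the cost on the tangent space.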

2023/12/04 16:59 · Horea Caramizaru



Abstract:

Designing robots with extreme performance in a given task has long been an exciting research problem drawing attention from researchers in robotics, graphics, and artificial intelligence. As a robot is a combination of its hardware and software, an optimal robot requires both an excellent implementation of its hardware (e.g., morphological, topological, and geometrical designs) and an outstanding design of its software (e.g., perception, planning, and control algorithms). While we have seen promising breakthroughs for automating a robot's software design with the surge of deep learning in the past decade, exploration of optimal hardware design is much less automated and is still mainly driven by human experts, a process that is both labor-intensive and error-prone. Furthermore, experts typically optimize a robot's hardware and software separately, which may miss optimal designs that can only be revealed by optimizing its hardware and software simultaneously. This thesis argues that it is time to rethink robot design as a holistic process where a robot's body and brain should be co-optimized jointly and automatically. In this thesis, we present a computational robot design pipeline with differentiable simulation as a key player. We first introduce the concept of computational robot design on a real-world copter whose geometry and controller are co-optimized with a differentiable simulator, resulting in a custom copter that outperforms designs suggested by human experts by a substantial margin. Next, we push the boundary of differentiable simulation by developing advanced differentiable simulators for soft-body and fluid dynamics. Contrary to traditional belief, we show that deriving gradients for such intricate, high-dimensional physics systems can be both science and art. Finally, we discuss challenges in transferring computational designs discovered in simulation to real-world hardware platforms. We present a solution to this simulation-to-reality transfer problem using our differentiable simulator on an example of modeling and controlling a real-world soft underwater robot. We conclude this thesis by discussing open research directions in differentiable simulation and envisioning a fully automated computational design pipeline for real-world robots in the future.
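As a hedged, toy illustration of co-optimizing body and brain through a differentiable simulator (nothing here reproduces the thesis' copter, soft-body, or fluid simulators; the model and parameters are invented), the sketch below differentiates a tiny hand-written rollout with respect to a hardware-like parameter and a controller gain at the same time, using JAX.

```python
import jax
import jax.numpy as jnp

def rollout(params, dt=0.02, steps=200):
    """Differentiable "simulation" of a 1-D mass on a spring. 'stiffness' stands
    in for a hardware (design) parameter and 'gain' for a controller parameter;
    the same backward pass provides gradients for both."""
    stiffness, gain = params
    x, v = jnp.float32(1.0), jnp.float32(0.0)
    cost = 0.0
    for _ in range(steps):                           # unrolled, differentiable time stepping
        u = -gain * x - 0.5 * v                      # simple feedback controller
        a = -stiffness * x + u                       # unit-mass "physics"
        v = v + dt * a
        x = x + dt * v                               # semi-implicit Euler
        cost = cost + dt * (x ** 2 + 0.01 * u ** 2)  # tracking error + control effort
    return cost

params = jnp.array([1.0, 1.0])                       # initial [stiffness, gain]
grad_fn = jax.jit(jax.grad(rollout))
for _ in range(200):
    params = params - 0.1 * grad_fn(params)          # co-optimize body and brain jointly
print(params)
```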


2023/11/01 23:08 · Horea Caramizaru



2022 was the year when I defended my thesis, "Multi-body modeling of robot dynamics and system identification during MPC", in Scientific Computing. The updated translation of the poem can be found here.

Abstract

Due to external influences on the parameters that characterize dynamical systems, online parameter estimation must be added as part of model predictive control strategies. In this thesis, we show how continuous parameter estimation based on inverse dynamics can be used to identify the inertial parameters (mass, inertia, and center of mass) of multi-body systems as part of an adaptive control strategy. For this, an equivalent model in Featherstone's spatial algebra, based on screw theory, was used. System identification was carried out with a linear least-squares approach, using the Recursive Newton-Euler Algorithm to obtain a generic implementation. The procedure applies to open-loop robots and is tested with an optimal control algorithm based on multiple shooting.
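The property this relies on is that rigid-body inverse dynamics is linear in the inertial parameters, tau = Y(q, qd, qdd) * theta, so theta can be recovered by linear least squares from measured trajectories. Below is a deliberately tiny single-pendulum sketch in JAX/NumPy (not the thesis' RNEA-based implementation); the model, noise level, and values are illustrative only.

```python
import numpy as np
import jax.numpy as jnp

# Toy illustration: for a single pendulum the inverse dynamics
#   tau = (m*l^2) * qdd + (m*l) * g * cos(q)
# is linear in the inertial parameters theta = [m*l^2, m*l], so theta can be
# recovered from measured trajectories by linear least squares.
g = 9.81
m_true, l_true = 1.5, 0.4                      # "unknown" ground truth
theta_true = jnp.array([m_true * l_true**2, m_true * l_true])

rng = np.random.default_rng(0)
q   = jnp.asarray(rng.uniform(-np.pi, np.pi, size=200))   # sampled positions
qdd = jnp.asarray(rng.uniform(-5.0, 5.0, size=200))        # sampled accelerations

Y   = jnp.stack([qdd, g * jnp.cos(q)], axis=1)             # regressor matrix
tau = Y @ theta_true + 0.01 * jnp.asarray(rng.normal(size=200))  # noisy torque "measurements"

theta_hat, *_ = jnp.linalg.lstsq(Y, tau)
print(theta_hat, theta_true)   # estimate matches [m*l^2, m*l] up to noise
```

In the multi-body case the regressor Y is built column by column by running an inverse-dynamics algorithm such as RNEA with unit parameters, which is what makes the approach generic across robot structures.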
