Danilo Pau
Optimized FreeMark Post-Training White-Box Watermarking of Tiny Neural Networks
Neural networks are powerful, high-accuracy systems whose trained parameters represent valuable intellectual property. Building models that reach top-level performance is a complex task and requires substantial investments of time and mo…
Robust Watermarking of Tiny Neural Networks by Fine-Tuning and Post-Training Approaches
Because neural networks pervade many industrial domains and are increasingly complex and accurate, the trained models themselves have become valuable intellectual properties. Developing highly accurate models demands increasingly higher in…
Learning Online MEMS Calibration with Time-Varying and Memory-Efficient Gaussian Neural Topologies
This work devised an on-device learning approach to self-calibrate Micro-Electro-Mechanical Systems-based Inertial Measurement Units (MEMS-IMUs), integrating a digital signal processor (DSP), an accelerometer, and a gyroscope in the same p…
Compensating PI Controller’s Transients with Tiny Neural Network for Vector Control of Permanent Magnet Synchronous Motors
Recent advancements in neural networks (NNs) have underscored their potential for deployment in domains that demand computationally intensive operations, including applications on resource-constrained edge devices. This study investigates …
Quantitative Analysis of Deeply Quantized Tiny Neural Networks Robust to Adversarial Attacks
Reducing the memory footprint of Machine Learning (ML) models, especially Deep Neural Networks (DNNs), is imperative to facilitate their deployment on resource-constrained edge devices. However, a notable drawback of DNN models lies in the…
Transitioning from TinyML to Edge GenAI: A Review
Generative AI (GenAI) models are designed to produce realistic and natural data, such as images, audio, or written text. Due to their high computational and memory demands, these models traditionally run on powerful remote compute servers.…
Biases in Edge Language Models: Detection, Analysis, and Mitigation
The integration of large language models (LLMs) on low-power edge devices such as Raspberry Pi, known as edge language models (ELMs), has introduced opportunities for more personalized, secure, and low-latency language intelligence that is…
Enhancing Field-Oriented Control of Electric Drives with Tiny Neural Network Optimized for Micro-controllers
The deployment of neural networks on resource-constrained micro-controllers has gained momentum, driving many advancements in Tiny Neural Networks. This paper introduces a tiny feed-forward neural network, TinyFC, integrated into the Field…
Air Quality Prediction via Embedded ML/DL and Quantized Models
One of the most significant and, unfortunately, well-known environmental problems affecting people’s everyday lives is air pollution. To this end, extensive research (e.g., conducted by the World Health Org…
Watermarking Tiny MLCommons Image Applications Without Extra Deployability Costs
The tasks assigned to neural network (NN) models are increasingly challenging due to the growing demand for their applicability across domains. Advanced machine learning programming skills, development time, and expensive assets are requir…
Inertial Measurement Unit Self-Calibration by Quantization-Aware and Memory-Parsimonious Neural Networks
This paper introduces a methodology to compensate inertial Micro-Electro-Mechanical System (IMU-MEMS) time-varying calibration loss, induced by stress and aging. The approach relies on a periodic assessment of the sensor through specific s…
Benchmarking In-Sensor Machine Learning Computing: An Extension to the MLCommons-Tiny Suite
This paper proposes a new benchmark specifically designed for in-sensor digital machine learning computing to meet an ultra-low embedded memory requirement. With the exponential growth of edge devices, efficient local processing is essenti…
Efficient Tiny Machine Learning for Human Activity Recognition on Low-Power Edge Devices
Human Activity Recognition (HAR) continues to capture the attention of academic and industrial researchers because of its practical applications in healthcare and everyday living environments. To prepare this technology for widespread use, …
Tiny Machine Learning Battery State-of-Charge Estimation Hardware Accelerated
Electric mobility is pervasive and strongly affects everyone in everyday life. Motorbikes, bikes, cars, humanoid robots, etc., feature specific battery architectures composed of several lithium nickel oxide cells. Some of them are connecte…
Calibrating Glucose Sensors at the Edge: A Stress Generation Model for Tiny ML Drift Compensation
Background: Continuous glucose monitoring (CGM) systems offer the advantage of noninvasive monitoring and continuous data on glucose fluctuations. This study introduces a new model that enables the generation of synthetic but realistic dat…
Forward Learning of Large Language Models by Consumer Devices
Large Language Models achieve state-of-the-art performance on a broad variety of Natural Language Processing tasks. In the pervasive IoT era, their deployment on edge devices is more compelling than ever. However, their gigantic model footpri…
Towards Full Forward On-Tiny-Device Learning: A Guided Search for a Randomly Initialized Neural Network
In the context of TinyML, many research efforts have been devoted to designing forward topologies to support On-Device Learning. Reaching this target would bring numerous advantages, including reductions in latency and computational comple…
Developing a TinyML Image Classifier in an Hour
Tiny machine learning technologies are bringing intelligence ever closer to the sensor, thus enabling the key benefits of edge computing (e.g., reduced latency, improved data security, higher energy efficiency, and lower bandwidth consumpt…
Mathematical Formulation of Learning and Its Computational Complexity for Transformers’ Layers
Transformers are the cornerstone of natural language processing and of many other, far more complex sequential modelling tasks. The training of these models, however, requires an enormous number of computations, with substantial economic and e…
Tiny Machine Learning Zoo for Long-Term Compensation of Pressure Sensor Drifts
Pressure sensors embodied in very tiny packages are deployed in a wide range of advanced applications. Examples of applications range from industrial to altitude location services. They are also becoming increasingly pervasive in many othe…
IEEE COINS 2023 Contest for In Sensor Machine Learning Computing
There is a vast consensus in the embedded system community about the need to support artificial intelligence at the edge. Demand for IoT and automotive devices is increasing, together with the mandatory need to provide additional v…
Online Machine Learning for Dynamic Line Rating of the Overhead Lines
In the context of energy transmission, Line Rating addresses the capacity of conductors to safely transmit energy under different conditions. Two distinct rating methods are known: Static Line Rating and Dynamic Line Rating, corresponding…