arXiv: 2201.07677
Billions of distributed, heterogeneous, and resource-constrained IoT devices deploy on-device machine learning (ML) for private, fast, and offline inference on personal data. On-device ML is highly context dependent and sensitive to user, usage, hardware, and environment attributes. This sensitivity, together with the propensity toward bias in ML, makes it important to study bias in on-device settings. Our study is one of the first investigations of bias in this emerging domain and lays important foundations for building fairer on-device ML. We apply a software engineering lens, investigating the propagation of bias through design choices in on-device ML workflows. We first identify reliability bias as a source of unfairness and propose a measure to quantify it. We then conduct empirical experiments for a keyword spotting task to show how complex and interacting technical design choices amplify and propagate reliability bias. Our results validate that design choices made during model training, like the sample rate and input feature type, and choices made to optimize models, like lightweight architectures, the pruning learning rate, and pruning sparsity, can result in disparate predictive performance across male and female groups. Based on our findings, we suggest low-effort strategies for engineers to mitigate bias in on-device ML.
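The abstract refers to a proposed measure for quantifying disparate predictive performance across male and female groups. The sketch below is not the paper's definition of reliability bias; it is only a generic, hedged illustration of the underlying idea: compute a performance metric per demographic group and report the gap between groups. All names and data here are hypothetical.

```python
# Generic group-wise accuracy disparity for a binary protected attribute.
# Illustrative only -- not the paper's "reliability bias" measure.

def group_accuracy(y_true, y_pred, groups, group):
    """Accuracy restricted to samples belonging to `group`."""
    pairs = [(t, p) for t, p, g in zip(y_true, y_pred, groups) if g == group]
    return sum(t == p for t, p in pairs) / len(pairs)

# Toy keyword-spotting predictions (1 = keyword detected, 0 = not).
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 1, 0, 1, 1, 0]
groups = ["f", "f", "f", "f", "m", "m", "m", "m"]

acc_f = group_accuracy(y_true, y_pred, groups, "f")  # 1.0
acc_m = group_accuracy(y_true, y_pred, groups, "m")  # 0.75
disparity = acc_f - acc_m  # 0.25: the model performs worse for group "m"
```

A design choice (e.g. a lower sample rate or heavier pruning) that widens such a gap, even while overall accuracy stays flat, is the kind of effect the paper studies.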
FOS: Computer and information sciences, Computer Science - Machine Learning, personal data, fairness, 006, on-device machine learning, Machine Learning (cs.LG), design choices, Software Engineering (cs.SE), Computer Science - Computers and Society, Computer Science - Software Engineering, Bias, Audio and Speech Processing (eess.AS), Computers and Society (cs.CY), FOS: Electrical engineering, electronic engineering, information engineering, embedded machine learning, audio keyword spotting, Electrical Engineering and Systems Science - Audio and Speech Processing
| Indicator | Description | Value |
| --- | --- | --- |
| selected citations | Citations derived from selected sources; an alternative to the "influence" indicator, which reflects the overall/total impact of an article based on the underlying citation network (diachronically). | 9 |
| popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Top 10% |
| influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Top 10% |
| views | | 17 |
| downloads | | 11 |

Views provided by UsageCounts
Downloads provided by UsageCounts