
This paper analyzes the representational structures that emerge in neural networks trained by evolutionary selection rather than gradient-based optimization. Whereas standard deep learning avoids activation saturation because of the vanishing gradient problem, we show that under gradient-free evolution saturation serves as a functional mechanism for discovering hybrid digital-analog representations. We formalize this as a partitioned state space in which k saturated neurons establish discrete operational modes while the remaining n-k continuous neurons provide fine-grained modulation. Through systematic experiments across 13 configurations, we empirically validate that saturation emerges when networks must selectively attend to a subset of the available inputs. Our results demonstrate that evolution allocates k dynamically according to task demands, converging to k=0 for clean continuous tasks in which all inputs are relevant and to k→n when selective filtering becomes necessary.
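As a rough illustration of the partitioned state space described above (the symbols h, σ, and the tolerance ε below are illustrative placeholders, not necessarily the paper's own notation), one way to make the split precise is to count the units sitting near the bounds of a squashing nonlinearity σ:

```latex
% Sketch only: a unit i is "saturated" when its activation lies within a
% tolerance \epsilon of either bound of the squashing nonlinearity \sigma.
\[
S_\epsilon(h) = \bigl\{\, i \in \{1,\dots,n\} :
  \min\bigl(\lvert h_i - \sigma_{\min}\rvert,\; \lvert h_i - \sigma_{\max}\rvert\bigr) < \epsilon \,\bigr\},
\qquad k = \lvert S_\epsilon(h) \rvert.
\]
% The k saturated units then behave as an effectively discrete mode selector,
% while the remaining n - k units carry continuous, fine-grained modulation:
\[
h \;\in\;
\underbrace{\{\sigma_{\min}, \sigma_{\max}\}^{k}}_{\text{discrete modes}}
\times
\underbrace{(\sigma_{\min}, \sigma_{\max})^{\,n-k}}_{\text{continuous modulation}}
\quad \text{(up to the tolerance } \epsilon\text{)}.
\]
```

Under this reading, k=0 corresponds to a fully analog state space and k→n to a fully digital one, matching the two regimes the abstract reports.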
