Data sources: ZENODO

Linear Programming Formulation of Relu Function

Authors: Tan, Sing Kuang


Abstract

The ReLU function is non-linear, but if we formulate it as a hard constraint in a linear program, the problem becomes a convex optimization problem, since linear programming is convex. Pictured as a constraint in a linear program, the ReLU function takes the form of a set of linear inequalities. Graphically, a feasible region in linear programming is bounded by several inequality lines, with green strips marking the invalid regions. So in a BSnet ReLU network, the ReLU activation functions act like inequalities, and the BSnet can be represented by a linear program. The linear transform applied by the weights of a BSnet layer reorients the inequality hyperplanes of the linear program.
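The core idea above can be sketched for a single scalar input. This is a minimal illustration, not the paper's BSnet formulation: the graph of y = ReLU(x0) = max(x0, 0) is the lower envelope of the two half-planes y ≥ x0 and y ≥ 0, so minimizing y subject to those constraints recovers the ReLU value with an off-the-shelf LP solver (here `scipy.optimize.linprog`; the function name `relu_via_lp` is our own).

```python
# Sketch: recovering y = ReLU(x0) for a fixed input x0 via linear programming.
# ReLU(x0) = max(x0, 0) is the smallest y satisfying y >= x0 and y >= 0,
# so minimizing y under those two linear inequalities yields ReLU exactly.
from scipy.optimize import linprog

def relu_via_lp(x0: float) -> float:
    # One decision variable y; objective: minimize y (cost vector c = [1]).
    # Constraints rewritten in the solver's A_ub @ [y] <= b_ub form:
    #   y >= x0  ->  -y <= -x0
    #   y >= 0   ->  -y <= 0
    res = linprog(c=[1.0],
                  A_ub=[[-1.0], [-1.0]],
                  b_ub=[-x0, 0.0],
                  bounds=[(None, None)])  # y itself is unbounded a priori
    return float(res.x[0])

print(relu_via_lp(2.5))   # positive input: LP returns x0 itself
print(relu_via_lp(-1.3))  # negative input: LP returns 0
```

For a whole network, each ReLU unit would contribute such a pair of inequalities, and the layer weights would enter as the coefficients of those inequality hyperplanes.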
