Cross-entropy loss (binary) §

$$L_{CE}(\hat{y}, y) = -\bigl(y \log(\hat{y}) + (1 - y)\log(1 - \hat{y})\bigr)$$

$\hat{y}$: estimated distribution (the model's predicted probability of the positive class)
$y$: true distribution (the ground-truth label, 0 or 1)

Why? §

It measures the distance between two probability distributions: the true label distribution and the model's estimated one. A minimal numeric sketch follows the links below.

🔗 Links

🚶 Logistic Regression
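A minimal sketch of the formula above, assuming Python with NumPy; the function name and the clipping constant `eps` are illustrative choices, not part of the note:

```python
import numpy as np

def binary_cross_entropy(y_hat: float, y: float, eps: float = 1e-12) -> float:
    """Binary cross-entropy for a single example.

    y_hat: estimated probability of the positive class (model output)
    y:     true label (0 or 1)
    eps:   small constant to keep log() away from 0 (assumption for numerical safety)
    """
    y_hat = np.clip(y_hat, eps, 1 - eps)
    return float(-(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat)))

# A confident, correct prediction gives a small loss;
# a confident, wrong prediction gives a large loss.
print(binary_cross_entropy(0.9, 1))  # ~0.105
print(binary_cross_entropy(0.9, 0))  # ~2.303
```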