Alex' Garden


Cross-entropy loss (binary)

$$L_{CE}(\hat{y}, y) = -\left(y \log \hat{y} + (1-y)\log(1-\hat{y})\right)$$

  • ŷ: estimated distribution (the model's predicted probability)
  • y: true distribution (the label, 0 or 1)

Why?

  • it measures the dissimilarity between two probability distributions, here the true label and the model's predicted probability
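A minimal sketch of the formula above in plain Python (the `eps` clipping is an assumption added here to avoid `log(0)` for extreme predictions; it is not part of the formula itself):

```python
import math

def binary_cross_entropy(y_hat: float, y: float, eps: float = 1e-12) -> float:
    """L_CE(y_hat, y) = -(y*log(y_hat) + (1-y)*log(1-y_hat)).

    y_hat: predicted probability in (0, 1); y: true label, 0 or 1.
    eps clips y_hat away from exactly 0 or 1 so log() stays finite.
    """
    y_hat = min(max(y_hat, eps), 1.0 - eps)
    return -(y * math.log(y_hat) + (1.0 - y) * math.log(1.0 - y_hat))

# A confident correct prediction gives a small loss,
# a confident wrong prediction a large one.
print(binary_cross_entropy(0.9, 1))  # ≈ 0.105
print(binary_cross_entropy(0.1, 1))  # ≈ 2.303
```

Note how the loss grows without bound as the predicted probability for the true class approaches zero; this is exactly the "distance" intuition above.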

🔗 Links

🚶 Logistic Regression
