Report · Open Access

The convergence of the Stochastic Gradient Descent (SGD) : a self-contained proof

Gabriel Turinici


JSON-LD (schema.org) Export

{
  "inLanguage": {
    "alternateName": "eng", 
    "@type": "Language", 
    "name": "English"
  }, 
  "description": "<p>We give&nbsp; here a proof of&nbsp; the convergence of the Stochastic Gradient Descent (SGD) in a self-contained manner.</p>\n\n<p>&nbsp;</p>", 
  "license": "https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode", 
  "creator": [
    {
      "affiliation": "CEREMADE, Universit\u00e9 Paris Dauphine - PSL", 
      "@type": "Person", 
      "name": "Gabriel Turinici"
    }
  ], 
  "headline": "The convergence of the Stochastic Gradient Descent (SGD) : a self-contained proof", 
  "image": "https://zenodo.org/static/img/logos/zenodo-gradient-round.svg", 
  "datePublished": "2021-03-26", 
  "url": "https://zenodo.org/record/4638695", 
  "version": "1", 
  "keywords": [
    "Stochastic Gradient Descent", 
    "Neural Network", 
    "SGD", 
    "Adam", 
    "RMSprop", 
    "Gabriel TURINICI"
  ], 
  "@context": "https://schema.org/", 
  "identifier": "https://doi.org/10.5281/zenodo.4638695", 
  "@id": "https://doi.org/10.5281/zenodo.4638695", 
  "@type": "ScholarlyArticle", 
  "name": "The convergence of the Stochastic Gradient Descent (SGD) : a self-contained proof"
}
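
For context, the object of the report is the classical SGD recursion theta_{k+1} = theta_k - eta_k * g_k, where g_k is an unbiased noisy estimate of the gradient and the step sizes typically satisfy the Robbins-Monro conditions (sum of eta_k diverges, sum of eta_k^2 converges). The short Python sketch below only illustrates this generic recursion on a toy quadratic objective; the noise model, the step-size schedule eta_k = 1/k and the objective are assumptions of this sketch, not taken from the report.

import numpy as np

# Minimize f(theta) = 0.5 * ||theta||^2 with SGD using noisy gradient estimates.
rng = np.random.default_rng(seed=0)

def noisy_gradient(theta):
    # Unbiased gradient estimate: true gradient (= theta) plus zero-mean Gaussian noise.
    return theta + rng.normal(loc=0.0, scale=0.1, size=theta.shape)

theta = np.array([5.0, -3.0])
for k in range(1, 10_001):
    eta_k = 1.0 / k  # Robbins-Monro steps: sum eta_k diverges, sum eta_k**2 converges
    theta = theta - eta_k * noisy_gradient(theta)

print(theta)  # ends up close to the unique minimizer [0.0, 0.0]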
Record statistics (all versions / this version):
Views: 95 / 95
Downloads: 28 / 28
Data volume: 8.9 MB / 8.9 MB
Unique views: 86 / 86
Unique downloads: 27 / 27
