<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <title>SGD: Convex Loss Functions — scikits.learn v0.6.0 documentation</title> <link rel="stylesheet" href="../../_static/nature.css" type="text/css" /> <link rel="stylesheet" href="../../_static/pygments.css" type="text/css" /> <script type="text/javascript"> var DOCUMENTATION_OPTIONS = { URL_ROOT: '../../', VERSION: '0.6.0', COLLAPSE_INDEX: false, FILE_SUFFIX: '.html', HAS_SOURCE: true }; </script> <script type="text/javascript" src="../../_static/jquery.js"></script> <script type="text/javascript" src="../../_static/underscore.js"></script> <script type="text/javascript" src="../../_static/doctools.js"></script> <link rel="shortcut icon" href="../../_static/favicon.ico"/> <link rel="author" title="About these documents" href="../../about.html" /> <link rel="top" title="scikits.learn v0.6.0 documentation" href="../../index.html" /> <link rel="up" title="Examples" href="../index.html" /> <link rel="next" title="Ordinary Least Squares with SGD" href="plot_sgd_ols.html" /> <link rel="prev" title="Plot multi-class SGD on the iris dataset" href="plot_sgd_iris.html" /> </head> <body> <div class="header-wrapper"> <div class="header"> <p class="logo"><a href="../../index.html"> <img src="../../_static/scikit-learn-logo-small.png" alt="Logo"/> </a> </p><div class="navbar"> <ul> <li><a href="../../install.html">Download</a></li> <li><a href="../../support.html">Support</a></li> <li><a href="../../user_guide.html">User Guide</a></li> <li><a href="../index.html">Examples</a></li> <li><a href="../../developers/index.html">Development</a></li> </ul> <div class="search_form"> <div id="cse" style="width: 100%;"></div> <script src="http://www.google.com/jsapi" type="text/javascript"></script> <script type="text/javascript"> 
google.load('search', '1', {language : 'en'}); google.setOnLoadCallback(function() { var customSearchControl = new google.search.CustomSearchControl('016639176250731907682:tjtqbvtvij0'); customSearchControl.setResultSetSize(google.search.Search.FILTERED_CSE_RESULTSET); var options = new google.search.DrawOptions(); options.setAutoComplete(true); customSearchControl.draw('cse', options); }, true); </script> </div> </div> <!-- end navbar --></div> </div> <div class="content-wrapper"> <!-- <div id="blue_tile"></div> --> <div class="sphinxsidebar"> <div class="rel"> <a href="plot_sgd_iris.html" title="Plot multi-class SGD on the iris dataset" accesskey="P">previous</a> | <a href="plot_sgd_ols.html" title="Ordinary Least Squares with SGD" accesskey="N">next</a> | <a href="../../genindex.html" title="General Index" accesskey="I">index</a> </div> <h3>Contents</h3> <ul> <li><a class="reference internal" href="#">SGD: Convex Loss Functions</a></li> </ul> </div> <div class="content"> <div class="documentwrapper"> <div class="bodywrapper"> <div class="body"> <div class="section" id="sgd-convex-loss-functions"> <span id="example-linear-model-plot-sgd-loss-functions-py"></span><h1>SGD: Convex Loss Functions<a class="headerlink" href="#sgd-convex-loss-functions" title="Permalink to this headline">¶</a></h1> <p>Plot the convex loss functions supported by <cite>scikits.learn.linear_model.stochastic_gradient</cite>.</p> <img alt="auto_examples/linear_model/images/plot_sgd_loss_functions.png" class="align-center" src="auto_examples/linear_model/images/plot_sgd_loss_functions.png" /> <p><strong>Python source code:</strong> <a class="reference download internal" href="../../_downloads/plot_sgd_loss_functions.py"><tt class="xref download docutils literal"><span class="pre">plot_sgd_loss_functions.py</span></tt></a></p> <div class="highlight-python"><div class="highlight"><pre><span class="k">print</span> <span class="n">__doc__</span> <span class="kn">import</span> <span
class="nn">numpy</span> <span class="kn">as</span> <span class="nn">np</span> <span class="kn">import</span> <span class="nn">pylab</span> <span class="kn">as</span> <span class="nn">pl</span> <span class="kn">from</span> <span class="nn">scikits.learn.linear_model.sgd_fast</span> <span class="kn">import</span> <span class="n">Hinge</span><span class="p">,</span> \ <span class="n">ModifiedHuber</span><span class="p">,</span> <span class="n">SquaredLoss</span> <span class="c">###############################################################################</span> <span class="c"># Define loss functions</span> <span class="n">xmin</span><span class="p">,</span> <span class="n">xmax</span> <span class="o">=</span> <span class="o">-</span><span class="mi">3</span><span class="p">,</span> <span class="mi">3</span> <span class="n">hinge</span> <span class="o">=</span> <span class="n">Hinge</span><span class="p">()</span> <span class="n">log_loss</span> <span class="o">=</span> <span class="k">lambda</span> <span class="n">z</span><span class="p">,</span> <span class="n">p</span><span class="p">:</span> <span class="n">np</span><span class="o">.</span><span class="n">log2</span><span class="p">(</span><span class="mf">1.0</span> <span class="o">+</span> <span class="n">np</span><span class="o">.</span><span class="n">exp</span><span class="p">(</span><span class="o">-</span><span class="n">z</span><span class="p">))</span> <span class="n">modified_huber</span> <span class="o">=</span> <span class="n">ModifiedHuber</span><span class="p">()</span> <span class="n">squared_loss</span> <span class="o">=</span> <span class="n">SquaredLoss</span><span class="p">()</span> <span class="c">###############################################################################</span> <span class="c"># Plot loss functions</span> <span class="n">xx</span> <span class="o">=</span> <span class="n">np</span><span class="o">.</span><span class="n">linspace</span><span class="p">(</span><span
class="n">xmin</span><span class="p">,</span> <span class="n">xmax</span><span class="p">,</span> <span class="mi">100</span><span class="p">)</span> <span class="n">pl</span><span class="o">.</span><span class="n">plot</span><span class="p">([</span><span class="n">xmin</span><span class="p">,</span> <span class="mi">0</span><span class="p">,</span> <span class="mi">0</span><span class="p">,</span> <span class="n">xmax</span><span class="p">],</span> <span class="p">[</span><span class="mi">1</span><span class="p">,</span> <span class="mi">1</span><span class="p">,</span> <span class="mi">0</span><span class="p">,</span> <span class="mi">0</span><span class="p">],</span> <span class="s">'k-'</span><span class="p">,</span> <span class="n">label</span><span class="o">=</span><span class="s">"Zero-one loss"</span><span class="p">)</span> <span class="n">pl</span><span class="o">.</span><span class="n">plot</span><span class="p">(</span><span class="n">xx</span><span class="p">,</span> <span class="p">[</span><span class="n">hinge</span><span class="o">.</span><span class="n">loss</span><span class="p">(</span><span class="n">x</span><span class="p">,</span><span class="mi">1</span><span class="p">)</span> <span class="k">for</span> <span class="n">x</span> <span class="ow">in</span> <span class="n">xx</span><span class="p">],</span> <span class="s">'g-'</span><span class="p">,</span> <span class="n">label</span><span class="o">=</span><span class="s">"Hinge loss"</span><span class="p">)</span> <span class="n">pl</span><span class="o">.</span><span class="n">plot</span><span class="p">(</span><span class="n">xx</span><span class="p">,</span> <span class="p">[</span><span class="n">log_loss</span><span class="p">(</span><span class="n">x</span><span class="p">,</span><span class="mi">1</span><span class="p">)</span> <span class="k">for</span> <span class="n">x</span> <span class="ow">in</span> <span class="n">xx</span><span class="p">],</span> <span 
class="s">'r-'</span><span class="p">,</span> <span class="n">label</span><span class="o">=</span><span class="s">"Log loss"</span><span class="p">)</span> <span class="n">pl</span><span class="o">.</span><span class="n">plot</span><span class="p">(</span><span class="n">xx</span><span class="p">,</span> <span class="p">[</span><span class="n">modified_huber</span><span class="o">.</span><span class="n">loss</span><span class="p">(</span><span class="n">x</span><span class="p">,</span><span class="mi">1</span><span class="p">)</span> <span class="k">for</span> <span class="n">x</span> <span class="ow">in</span> <span class="n">xx</span><span class="p">],</span> <span class="s">'y-'</span><span class="p">,</span> <span class="n">label</span><span class="o">=</span><span class="s">"Modified Huber loss"</span><span class="p">)</span> <span class="c">#pl.plot(xx, [2.0*squared_loss.loss(x,1) for x in xx], 'c-',</span> <span class="c"># label="Squared loss")</span> <span class="n">pl</span><span class="o">.</span><span class="n">ylim</span><span class="p">((</span><span class="mi">0</span><span class="p">,</span> <span class="mi">5</span><span class="p">))</span> <span class="n">pl</span><span class="o">.</span><span class="n">legend</span><span class="p">(</span><span class="n">loc</span><span class="o">=</span><span class="s">"upper right"</span><span class="p">)</span> <span class="n">pl</span><span class="o">.</span><span class="n">xlabel</span><span class="p">(</span><span class="s">r"$y \cdot f(x)$"</span><span class="p">)</span> <span class="n">pl</span><span class="o">.</span><span class="n">ylabel</span><span class="p">(</span><span class="s">"$L(y, f(x))$"</span><span class="p">)</span> <span class="n">pl</span><span class="o">.</span><span class="n">show</span><span class="p">()</span> </pre></div> </div> </div> </div> </div> </div> <div class="clearer"></div> </div> </div> <div class="footer"> <p style="text-align: center">This documentation is relative to
scikits.learn version 0.6.0.</p> <p>© 2010, scikits.learn developers (BSD License). Created using <a href="http://sphinx.pocoo.org/">Sphinx</a> 1.0.5. Design by <a href="http://webylimonada.com">Web y Limonada</a>.</p> </div> </body> </html>