  <div class="section" id="skip-and-xfail-dealing-with-tests-that-can-not-succeed">
<span id="skipping"></span><span id="skip-and-xfail"></span><h1>Skip and xfail: dealing with tests that can not succeed<a class="headerlink" href="#skip-and-xfail-dealing-with-tests-that-can-not-succeed" title="Permalink to this headline">¶</a></h1>
<p>If you have test functions that cannot be run on certain platforms
or that you expect to fail you can mark them accordingly or you
may call helper functions during execution of setup or test functions.</p>
<p>A <em>skip</em> means that you expect your test to pass unless a certain
configuration or condition (e.g. wrong Python interpreter, missing
dependency) prevents it to run.  And <em>xfail</em> means that your test
can run but you expect it to fail because there is an implementation problem.</p>
<p>py.test counts and lists <em>skip</em> and <em>xfail</em> tests separately. However,
detailed information about skipped/xfailed tests is not shown by default
to avoid cluttering the output.  You can use the <tt class="docutils literal"><span class="pre">-r</span></tt> option to see
details corresponding to the &#8220;short&#8221; letters shown in the test
progress:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="n">py</span><span class="o">.</span><span class="n">test</span> <span class="o">-</span><span class="n">rxs</span>  <span class="c"># show extra info on skips and xfails</span>
</pre></div>
</div>
<p>(See <a class="reference internal" href="customize.html#how-to-change-command-line-options-defaults"><em>How to change command line options defaults</em></a>)</p>
<div class="section" id="marking-a-test-function-to-be-skipped">
<span id="skipif"></span><h2>Marking a test function to be skipped<a class="headerlink" href="#marking-a-test-function-to-be-skipped" title="Permalink to this headline">¶</a></h2>
<p>Here is an example of marking a test function to be skipped
when run on a Python3 interpreter:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="kn">import</span> <span class="nn">sys</span>
<span class="nd">@pytest.mark.skipif</span><span class="p">(</span><span class="s">&quot;sys.version_info &gt;= (3,0)&quot;</span><span class="p">)</span>
<span class="k">def</span> <span class="nf">test_function</span><span class="p">():</span>
    <span class="o">...</span>
</pre></div>
</div>
<p>During test function setup the skipif condition is
evaluated by calling <tt class="docutils literal"><span class="pre">eval('sys.version_info</span> <span class="pre">&gt;=</span> <span class="pre">(3,0)',</span> <span class="pre">namespace)</span></tt>.
(<em>New in version 2.0.2</em>) The namespace contains all the module globals of the test function so that
you can for example check for versions of a module you are using:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="kn">import</span> <span class="nn">mymodule</span>

<span class="nd">@pytest.mark.skipif</span><span class="p">(</span><span class="s">&quot;mymodule.__version__ &lt; &#39;1.2&#39;&quot;</span><span class="p">)</span>
<span class="k">def</span> <span class="nf">test_function</span><span class="p">():</span>
    <span class="o">...</span>
</pre></div>
</div>
<p>The test function will not be run (&#8220;skipped&#8221;) if
<tt class="docutils literal"><span class="pre">mymodule</span></tt> is below the specified version.  The reason
for specifying the condition as a string is mainly that
py.test can report a summary of skip conditions.
For information on the construction of the <tt class="docutils literal"><span class="pre">namespace</span></tt>
see <a class="reference internal" href="#evaluation-of-skipif-xfail-conditions">evaluation of skipif/xfail conditions</a>.</p>
<p>You can of course create a shortcut for your conditional skip
decorator at module level like this:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="n">win32only</span> <span class="o">=</span> <span class="n">pytest</span><span class="o">.</span><span class="n">mark</span><span class="o">.</span><span class="n">skipif</span><span class="p">(</span><span class="s">&quot;sys.platform != &#39;win32&#39;&quot;</span><span class="p">)</span>

<span class="nd">@win32only</span>
<span class="k">def</span> <span class="nf">test_function</span><span class="p">():</span>
    <span class="o">...</span>
</pre></div>
</div>
</div>
<div class="section" id="skip-all-test-functions-of-a-class">
<h2>Skip all test functions of a class<a class="headerlink" href="#skip-all-test-functions-of-a-class" title="Permalink to this headline">¶</a></h2>
<p>As with all function <a class="reference internal" href="mark.html#mark"><em>marking</em></a> you can skip test functions at the
<a class="reference external" href="mark.html#scoped-marking">whole class- or module level</a>.  Here is an example
for skipping all methods of a test class based on the platform:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="k">class</span> <span class="nc">TestPosixCalls</span><span class="p">:</span>
    <span class="n">pytestmark</span> <span class="o">=</span> <span class="n">pytest</span><span class="o">.</span><span class="n">mark</span><span class="o">.</span><span class="n">skipif</span><span class="p">(</span><span class="s">&quot;sys.platform == &#39;win32&#39;&quot;</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">test_function</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
        <span class="s">&quot;will not be setup or run under &#39;win32&#39; platform&quot;</span>
</pre></div>
</div>
<p>The <tt class="docutils literal"><span class="pre">pytestmark</span></tt> special name tells py.test to apply it to each test
function in the class.  If your code targets python2.6 or above you can
more naturally use the skipif decorator (and any other marker) on
classes:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="nd">@pytest.mark.skipif</span><span class="p">(</span><span class="s">&quot;sys.platform == &#39;win32&#39;&quot;</span><span class="p">)</span>
<span class="k">class</span> <span class="nc">TestPosixCalls</span><span class="p">:</span>

    <span class="k">def</span> <span class="nf">test_function</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
        <span class="s">&quot;will not be setup or run under &#39;win32&#39; platform&quot;</span>
</pre></div>
</div>
<p>Using multiple &#8220;skipif&#8221; decorators on a single function is generally fine - it means that if any of the conditions apply the function execution will be skipped.</p>
</div>
<div class="section" id="mark-a-test-function-as-expected-to-fail">
<span id="xfail"></span><h2>Mark a test function as expected to fail<a class="headerlink" href="#mark-a-test-function-as-expected-to-fail" title="Permalink to this headline">¶</a></h2>
<p>You can use the <tt class="docutils literal"><span class="pre">xfail</span></tt> marker to indicate that you
expect the test to fail:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="nd">@pytest.mark.xfail</span>
<span class="k">def</span> <span class="nf">test_function</span><span class="p">():</span>
    <span class="o">...</span>
</pre></div>
</div>
<p>This test will be run but no traceback will be reported
when it fails. Instead terminal reporting will list it in the
&#8220;expected to fail&#8221; or &#8220;unexpectedly passing&#8221; sections.</p>
<p>By specifying on the commandline:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="n">pytest</span> <span class="o">--</span><span class="n">runxfail</span>
</pre></div>
</div>
<p>you can force the running and reporting of an <tt class="docutils literal"><span class="pre">xfail</span></tt> marked test
as if it weren&#8217;t marked at all.</p>
<p>As with <a class="reference internal" href="#skipif">skipif</a> you can also mark your expectation of a failure
on a particular platform:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="nd">@pytest.mark.xfail</span><span class="p">(</span><span class="s">&quot;sys.version_info &gt;= (3,0)&quot;</span><span class="p">)</span>
<span class="k">def</span> <span class="nf">test_function</span><span class="p">():</span>
    <span class="o">...</span>
</pre></div>
</div>
<p>You can furthermore prevent the running of an &#8220;xfail&#8221; test or
specify a reason such as a bug ID or similar.  Here is
a simple test file with the several usages:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="kn">import</span> <span class="nn">pytest</span>
<span class="n">xfail</span> <span class="o">=</span> <span class="n">pytest</span><span class="o">.</span><span class="n">mark</span><span class="o">.</span><span class="n">xfail</span>

<span class="nd">@xfail</span>
<span class="k">def</span> <span class="nf">test_hello</span><span class="p">():</span>
    <span class="k">assert</span> <span class="mi">0</span>

<span class="nd">@xfail</span><span class="p">(</span><span class="n">run</span><span class="o">=</span><span class="bp">False</span><span class="p">)</span>
<span class="k">def</span> <span class="nf">test_hello2</span><span class="p">():</span>
    <span class="k">assert</span> <span class="mi">0</span>

<span class="nd">@xfail</span><span class="p">(</span><span class="s">&quot;hasattr(os, &#39;sep&#39;)&quot;</span><span class="p">)</span>
<span class="k">def</span> <span class="nf">test_hello3</span><span class="p">():</span>
    <span class="k">assert</span> <span class="mi">0</span>

<span class="nd">@xfail</span><span class="p">(</span><span class="n">reason</span><span class="o">=</span><span class="s">&quot;bug 110&quot;</span><span class="p">)</span>
<span class="k">def</span> <span class="nf">test_hello4</span><span class="p">():</span>
    <span class="k">assert</span> <span class="mi">0</span>

<span class="nd">@xfail</span><span class="p">(</span><span class="s">&#39;pytest.__version__[0] != &quot;17&quot;&#39;</span><span class="p">)</span>
<span class="k">def</span> <span class="nf">test_hello5</span><span class="p">():</span>
    <span class="k">assert</span> <span class="mi">0</span>

<span class="k">def</span> <span class="nf">test_hello6</span><span class="p">():</span>
    <span class="n">pytest</span><span class="o">.</span><span class="n">xfail</span><span class="p">(</span><span class="s">&quot;reason&quot;</span><span class="p">)</span>
</pre></div>
</div>
<p>Running it with the report-on-xfail option gives this output:</p>
<div class="highlight-python"><pre>example $ py.test -rx xfail_demo.py
=========================== test session starts ============================
platform linux2 -- Python 2.7.3 -- pytest-2.3.5
collected 6 items

xfail_demo.py xxxxxx
========================= short test summary info ==========================
XFAIL xfail_demo.py::test_hello
XFAIL xfail_demo.py::test_hello2
  reason: [NOTRUN]
XFAIL xfail_demo.py::test_hello3
  condition: hasattr(os, 'sep')
XFAIL xfail_demo.py::test_hello4
  bug 110
XFAIL xfail_demo.py::test_hello5
  condition: pytest.__version__[0] != "17"
XFAIL xfail_demo.py::test_hello6
  reason: reason

======================== 6 xfailed in 0.05 seconds =========================</pre>
</div>
</div>
<div class="section" id="evaluation-of-skipif-xfail-expressions">
<span id="evaluation-of-skipif-xfail-conditions"></span><h2>Evaluation of skipif/xfail expressions<a class="headerlink" href="#evaluation-of-skipif-xfail-expressions" title="Permalink to this headline">¶</a></h2>
<p class="versionadded">
<span class="versionmodified">New in version 2.0.2.</span></p>
<p>The evaluation of a condition string in <tt class="docutils literal"><span class="pre">pytest.mark.skipif(conditionstring)</span></tt>
or <tt class="docutils literal"><span class="pre">pytest.mark.xfail(conditionstring)</span></tt> takes place in a namespace
dictionary which is constructed as follows:</p>
<ul class="simple">
<li>the namespace is initialized by putting the <tt class="docutils literal"><span class="pre">sys</span></tt> and <tt class="docutils literal"><span class="pre">os</span></tt> modules
and the pytest <tt class="docutils literal"><span class="pre">config</span></tt> object into it.</li>
<li>updated with the module globals of the test function for which the
expression is applied.</li>
</ul>
<p>The pytest <tt class="docutils literal"><span class="pre">config</span></tt> object allows you to skip based on a test configuration value
which you might have added:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="nd">@pytest.mark.skipif</span><span class="p">(</span><span class="s">&quot;not config.getvalue(&#39;db&#39;)&quot;</span><span class="p">)</span>
<span class="k">def</span> <span class="nf">test_function</span><span class="p">(</span><span class="o">...</span><span class="p">):</span>
    <span class="o">...</span>
</pre></div>
</div>
</div>
<div class="section" id="imperative-xfail-from-within-a-test-or-setup-function">
<h2>Imperative xfail from within a test or setup function<a class="headerlink" href="#imperative-xfail-from-within-a-test-or-setup-function" title="Permalink to this headline">¶</a></h2>
<p>If you cannot declare xfail-conditions at import time
you can also imperatively produce an XFail-outcome from
within test or setup code.  Example:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="k">def</span> <span class="nf">test_function</span><span class="p">():</span>
    <span class="k">if</span> <span class="ow">not</span> <span class="n">valid_config</span><span class="p">():</span>
        <span class="n">pytest</span><span class="o">.</span><span class="n">xfail</span><span class="p">(</span><span class="s">&quot;unsupported configuration&quot;</span><span class="p">)</span>
</pre></div>
</div>
</div>
<div class="section" id="skipping-on-a-missing-import-dependency">
<h2>Skipping on a missing import dependency<a class="headerlink" href="#skipping-on-a-missing-import-dependency" title="Permalink to this headline">¶</a></h2>
<p>You can use the following import helper at module level
or within a test or test setup function:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="n">docutils</span> <span class="o">=</span> <span class="n">pytest</span><span class="o">.</span><span class="n">importorskip</span><span class="p">(</span><span class="s">&quot;docutils&quot;</span><span class="p">)</span>
</pre></div>
</div>
<p>If <tt class="docutils literal"><span class="pre">docutils</span></tt> cannot be imported here, this will lead to a
skip outcome of the test.  You can also skip based on the
version number of a library:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="n">docutils</span> <span class="o">=</span> <span class="n">pytest</span><span class="o">.</span><span class="n">importorskip</span><span class="p">(</span><span class="s">&quot;docutils&quot;</span><span class="p">,</span> <span class="n">minversion</span><span class="o">=</span><span class="s">&quot;0.3&quot;</span><span class="p">)</span>
</pre></div>
</div>
<p>The version will be read from the specified module&#8217;s <tt class="docutils literal"><span class="pre">__version__</span></tt> attribute.</p>
</div>
<div class="section" id="imperative-skip-from-within-a-test-or-setup-function">
<h2>Imperative skip from within a test or setup function<a class="headerlink" href="#imperative-skip-from-within-a-test-or-setup-function" title="Permalink to this headline">¶</a></h2>
<p>If for some reason you cannot declare skip-conditions
you can also imperatively produce a skip-outcome from
within test or setup code.  Example:</p>
<div class="highlight-python"><div class="highlight"><pre><span class="k">def</span> <span class="nf">test_function</span><span class="p">():</span>
    <span class="k">if</span> <span class="ow">not</span> <span class="n">valid_config</span><span class="p">():</span>
        <span class="n">pytest</span><span class="o">.</span><span class="n">skip</span><span class="p">(</span><span class="s">&quot;unsupported configuration&quot;</span><span class="p">)</span>
</pre></div>
</div>
</div>
</div>


          </div>
        </div>
      </div>
      <div class="clearer"></div>
    </div>
    <div class="related">
      <h3>Navigation</h3>
      <ul>
        <li class="right" style="margin-right: 10px">
          <a href="recwarn.html" title="Asserting deprecation and other warnings"
             >next</a></li>
        <li class="right" >
          <a href="mark.html" title="Marking test functions with attributes"
             >previous</a> |</li>
        <li><a href="contents.html">pytest-2.3.4.1</a> &raquo;</li>
          <li><a href="apiref.html" >py.test reference documentation</a> &raquo;</li>
 
<g:plusone></g:plusone>

      </ul>
    </div>

    <div class="footer">
        &copy; Copyright 2012, holger krekel.
      Created using <a href="http://sphinx.pocoo.org/">Sphinx</a> 1.1.3.
    </div>
<script type="text/javascript">

  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-7597274-13']);
  _gaq.push(['_trackPageview']);

  (function() {
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();

</script>
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>

  </body>
</html>