Sam Holder

My feedback

  1. 3 votes

    0 comments  ·  Feature Requests
    Sam Holder shared this idea
  2. 4 votes

    0 comments  ·  Feature Requests
    Sam Holder supported this idea
  3. 3 votes

    0 comments  ·  Feature Requests
    Sam Holder supported this idea
  4. 1 vote

    0 comments  ·  Feature Requests
    Sam Holder shared this idea
  5. 9 votes

    4 comments  ·  Feature Requests
    Sam Holder commented:

    I've validated that this can indeed be fixed from the SpecFlow side (at least for MSTest tests) and will be submitting a pull request for SpecFlow shortly.

    I'm assuming that frameworks that allow the [DataRow] annotation (which SpecFlow uses to generate the Scenario Outlines in the compatible frameworks) would not be able to make use of this functionality, as there is no difference in the lines of code that are actually executed in these cases (unless the tool can tell that it executed a method because a particular [DataRow] annotation existed).

    Sam Holder commented:

    Thanks for the reply, Remco. There seems to be a one-to-one mapping between the scenario examples and the variant number in the test name (and the TestProperty attribute on the test), i.e. the first line of the scenario examples is Test_Variant0, the second is Test_Variant1, and so on. I'd thought that the existing SpecFlow integration, which allows the dots to appear in the scenarios, could be extended to place a dot by the example row which corresponds to the test that ran.

    Does NCrunch use metadata output by SpecFlow to allow integration with the feature files? Could SpecFlow be extended to output the required data to allow this?
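The naming convention described above can be sketched as follows (a hypothetical helper; only the `_Variant<N>` suffix comes from the observed test names, the rest is illustrative):

```python
def variant_test_name(scenario_name, example_index):
    # As described above, the generated MSTest method name appears to carry
    # a zero-based variant index matching the scenario example row.
    return f"{scenario_name}_Variant{example_index}"

# The first example row maps to Variant0, the second to Variant1, and so on.
example_rows = ["row 1", "row 2", "row 3"]
names = [variant_test_name("Test", i) for i in range(len(example_rows))]
print(names)  # ['Test_Variant0', 'Test_Variant1', 'Test_Variant2']
```

Given such a mapping, a tool that knows which named test ran could in principle attribute the result back to the corresponding example row in the feature file.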

    Sam Holder supported this idea
    Sam Holder shared this idea
  6. 8 votes

    1 comment  ·  Feature Requests
    Sam Holder commented:

    You can just add the [Ignore] attribute to the tests; this works in most unit testing frameworks I'm aware of.
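As a cross-framework analogy, Python's unittest has an equivalent skip mechanism (this uses unittest's `skip` decorator, not MSTest's [Ignore] itself):

```python
import unittest

class CalculatorTests(unittest.TestCase):
    @unittest.skip("temporarily excluded, analogous to MSTest's [Ignore]")
    def test_not_run(self):
        self.fail("never executed while skipped")

    def test_runs(self):
        self.assertEqual(1 + 1, 2)

# Run the suite: the skipped test is collected but its body never executes.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(CalculatorTests)
)
print(result.testsRun, len(result.skipped))  # 2 1
```

Most xUnit-style frameworks (MSTest, NUnit, xUnit.net, JUnit, pytest) expose the same idea under names like Ignore, Skip, or Disabled.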
