
Bug 240916 - Recording Tutorial and Test Specification reviews
Status: RESOLVED FIXED
Alias: None
Product: qa
Classification: Unclassified
Component: Synergy
Version: 8.0
Hardware: PC Windows 8
Importance: P3 normal
Assignee: Vladimir Riha
URL:
Keywords:
Depends on: 243285 243286 243565
Blocks:
 
Reported: 2014-01-24 09:22 UTC by mienamoo
Modified: 2014-04-04 13:54 UTC
CC List: 1 user

See Also:
Issue Type: ENHANCEMENT
Exception Reporter:


Description mienamoo 2014-01-24 09:22:12 UTC
I think that it would be useful to be able to record tutorial and TS review results in Synergy too. (For completeness, [1] is the NetCAT forum thread about this.)

[1] http://forums.netbeans.org/viewtopic.php?t=58804

Could you please add a new type of test/assignment for reviewing a tutorial or TS? It should at least record who did the review, whether it was Go/NoGo, and free text comments (to be sent to the owner of the document).
Comment 1 Vladimir Riha 2014-03-26 07:05:01 UTC
This should be done now, at least as a first version.

How does it work:
 - the tester clicks on the "create volunteer assignment" button
 - then they select the type of assignment: test or review (modeled in the sketch after this list)
  - the test assignment option is the same as before this fix
  - a review assignment only requires the user to specify the URL of the page being reviewed
 - the test run page (e.g. [1]) then shows a new table, Review assignments
  - the tester can again continue/restart reviewing. Continue preserves already created comments, Restart removes them
  - the button with the eye icon is the "view" action: it displays the reviewed URL and all comments without any editing. Clicking a comment highlights it in the reviewed page
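
For illustration, the two assignment kinds could be modeled roughly like this (a TypeScript sketch; all field and type names are assumptions, not Synergy's actual schema):

  // Hypothetical model of the two volunteer assignment kinds described above;
  // names are illustrative, not Synergy's actual data model.
  interface TestAssignment {
    kind: "test";
    testCaseIds: number[]; // test assignments are based on test cases
  }

  interface ReviewAssignment {
    kind: "review";
    url: string; // the only required input: URL of the page being reviewed
  }

  type VolunteerAssignment = TestAssignment | ReviewAssignment;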

How to review:
 - create a volunteer review assignment
 - on the test run page, click on Continue (or Restart)
 - the displayed page contains 2 columns: the left one with an iframe of the page being reviewed, the right one with comments
 - to add a comment, click on an element in the embedded reviewed page => it gets highlighted (click again to deselect it, or select multiple elements)
 - add the comment text and press Add => every comment is bound to a certain element on the page. XPath is used to identify elements, so if the reviewed page changes while being reviewed, it could cause problems (a comment might end up bound to a different element); see the sketch after this list
 - you can then submit the data, or delete or edit comments
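
Roughly, the comment-to-element binding could work like the sketch below (an assumption about the XPath mechanism described above, not Synergy's actual code):

  // Compute a simple positional XPath for a clicked element,
  // e.g. /html[1]/body[1]/div[2]/p[3].
  function xpathFor(el: Element): string {
    const parts: string[] = [];
    for (let node: Element | null = el; node; node = node.parentElement) {
      let index = 1;
      for (let sib = node.previousElementSibling; sib; sib = sib.previousElementSibling) {
        if (sib.tagName === node.tagName) index++;
      }
      parts.unshift(`${node.tagName.toLowerCase()}[${index}]`);
    }
    return "/" + parts.join("/");
  }

  interface ReviewComment {
    xpath: string; // identifies the commented element
    text: string;
  }

  // Resolving the XPath later (e.g. for the eye-icon "view" action) can return
  // null or a different element if the reviewed page changed in the meantime,
  // which is exactly the limitation noted above.
  function resolve(doc: Document, c: ReviewComment): Element | null {
    const result = doc.evaluate(c.xpath, doc, null, XPathResult.FIRST_ORDERED_NODE_TYPE, null);
    return result.singleNodeValue as Element | null;
  }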


The flow is a bit different than with a classical test assignment. As a result, these review assignments are not counted in any statistics (those are all based on test cases), and time to review is not measured either (I could add it if needed).


It should work fine for tutorial pages, but pages with dynamic content (e.g. generated by JavaScript) would need some changes: right now, the reviewed page is preprocessed before being embedded in Synergy (Synergy's own JavaScript and CSS are injected, and any other JavaScript is removed).
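
A rough illustration of that preprocessing step, assuming it operates on a parsed DOM of the fetched page (the function and parameter names are hypothetical):

  // Strip the page's own scripts and inject Synergy's CSS/JS before embedding.
  function preprocess(doc: Document, synergyJsUrl: string, synergyCssUrl: string): void {
    // Remove all script elements so the page's JavaScript cannot run in the iframe.
    doc.querySelectorAll("script").forEach(s => s.remove());
    // Also drop inline event handlers such as onclick (assumed to count as
    // "any other JavaScript" mentioned above).
    doc.querySelectorAll("*").forEach(el => {
      for (const attr of Array.from(el.attributes)) {
        if (attr.name.toLowerCase().startsWith("on")) el.removeAttribute(attr.name);
      }
    });
    // Inject Synergy's own CSS (highlighting) and JavaScript (click handling).
    const css = doc.createElement("link");
    css.rel = "stylesheet";
    css.href = synergyCssUrl;
    doc.head.appendChild(css);
    const js = doc.createElement("script");
    js.src = synergyJsUrl;
    doc.body.appendChild(js);
  }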




[1] http://services.netbeans.org/synergy/client/app/#/run/5/v/2
Comment 2 mienamoo 2014-03-26 07:19:18 UTC
Brilliant, thanks! I will try it out soon. :)
Comment 3 Jiri Kovalsky 2014-03-26 12:00:38 UTC
Thanks a lot Lado! How do I mark a tutorial as reviewed? After a tutorial is reviewed, it should send a notification to our docs team if there are comments, of course. And it would be great to include some information in the test run statistics anyway. Something like: # of tutorials reviewed, # of page downs (or some other "objective :)" indicator of a tutorial's complexity or length), plus a table of reviewers similar to the one for completed test cases.
Comment 4 Vladimir Riha 2014-03-26 12:15:25 UTC
You're welcome

(In reply to Jiri Kovalsky from comment #3)
> Thanks a lot Lado! How do I mark the tutorial reviewed? After the tutorial
> is reviewed, it should send notification to our docs team if there are some
> comments of course. 

Right now there is no such option. How about when you freeze a test run, it would send notifications to the docs team about all reviewed tutorials? Or do you think a "mark as reviewed" action would be better? (But then I guess it wouldn't be possible to add/edit comments once a tutorial has been marked as reviewed.)

Where can I get an email address for the docs team (or the tutorial owner)? Is there some general email address? Or would the tester be required to specify it? Another way would be to have a fixed set of <tutorial, tutorial_owner> pairs, but I originally wanted to avoid that.

> And it would be great to include some information in the
> test run statistic anyway

Candidates could be: # of reviewed tutorials, # of comments, time taken
Comment 5 Jiri Kovalsky 2014-03-26 12:31:00 UTC
The ideal solution would be to have a preloaded stack of tutorials with their writers. Before Beta, Ken Ganfield prepares [1] and the review starts. It takes more time than a full test run, so I am afraid it was a design mistake to bind tutorial reviews to a test run. It would be better to have a special tutorial kind of test run instead. Adding a "Mark as reviewed" button would be ideal. This action would then trigger the notification to the docs writer.

[1] http://wiki.netbeans.org/NetBeansTutorialsForCommunityReview

# of comments is not suitable, because even a person who reviews 5 tutorials, each 15 page downs long, with no added comments deserves a lot of credit. Time taken is better from that point of view. I would still welcome knowing how long a reviewed tutorial is, though.
Comment 6 Vladimir Riha 2014-03-26 12:52:29 UTC
(In reply to Jiri Kovalsky from comment #5)
> Ideal solution would be to have a preloaded stack of tutorials with their
> writers. Before Beta Ken Ganfield prepares [1] and review starts. 

OK, I'll create a parser for this page, and the "create review assignment" page will then offer only these tutorials (parsing will be triggered from administration).
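
A rough sketch of what such a parser could look like (the link selector is an assumption about the wiki page's markup, not a confirmed format):

  interface Tutorial {
    title: string;
    url: string;
  }

  // Fetch the wiki page and collect tutorial links from it.
  async function parseTutorials(wikiUrl: string): Promise<Tutorial[]> {
    const html = await (await fetch(wikiUrl)).text();
    const doc = new DOMParser().parseFromString(html, "text/html");
    const tutorials: Tutorial[] = [];
    // Assumption: reviewable tutorials link to netbeans.org documentation pages.
    doc.querySelectorAll<HTMLAnchorElement>('a[href*="netbeans.org/kb/docs"]').forEach(a => {
      const href = a.getAttribute("href") ?? "";
      tutorials.push({ title: a.textContent?.trim() || href, url: href });
    });
    return tutorials;
  }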

> more time than a full test run so I am afraid it was a design mistake to
> bind tutorial reviews with test run. It would be better to have a special
> tutorial kind of test run instead. 

We can have 2 test runs: "testing" and "reviewing". It's possible but not required to mix these 2 assignment types in a single run.

> 
> # of comments is not suitable, because even person who reviews 5 tutorials
> 15 page downs long each with no added comments deserves a lot of credit.
> Time taken is better from that point of view. I would still welcome to know
> how long a reviewed tutorial is though.

OK, I'll add time taken and # of reviews. How can I determine the length of a tutorial?
Comment 7 Jiri Kovalsky 2014-03-26 14:29:54 UTC
> OK, I'll create some parser for this page and "create review assignment"
> page will offer only these tutorials (parsing will be triggered from
> administration)

Great idea.

> OK, I'll add time taken and # of reviews. How can I determine a length of
> tutorial?

No idea. Maybe count words?
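
For instance, a minimal word-count sketch (purely illustrative):

  // Estimate a tutorial's length by stripping markup and counting words.
  function wordCount(html: string): number {
    const doc = new DOMParser().parseFromString(html, "text/html");
    const text = doc.body.textContent ?? "";
    return text.trim().split(/\s+/).filter(w => w.length > 0).length;
  }

  // Usage, e.g.: wordCount(await (await fetch(tutorialUrl)).text())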