Trac Ticket Queries

In addition to reports, Trac provides support for custom ticket queries, used to display lists of tickets meeting a specified set of criteria.

To configure and execute a custom query, switch to the View Tickets module from the navigation bar, and select the Custom Query link.

Filters

When you first go to the query page the default filter will display tickets relevant to you:

  • If logged in, it will display all open tickets assigned to you.
  • If not logged in but you have specified a name or email address in the preferences, it will display all open tickets where your email (or name, if no email is defined) is in the CC list.
  • If not logged in and no name/email is defined in the preferences, all open issues are displayed.

Current filters can be removed by clicking the button to the left with the minus sign on the label. New filters are added from the pulldown lists at the bottom corners of the filters box ('And' conditions on the left, 'Or' conditions on the right). Filters with either a text box or a pulldown menu of options can be added multiple times to perform an OR of the criteria.
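For example, adding the status filter twice, once for new and once for assigned, corresponds to the illustrative query string below (see Query Language for the syntax) and matches tickets in either state:

status=new&status=assigned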

You can use the fields just below the filters box to group the results based on a field, or display the full description for each ticket.

Once you've edited your filters click the Update button to refresh your results.

Clicking on one of the query results will take you to that ticket. You can navigate through the results by clicking the Next Ticket or Previous Ticket links just below the main menu bar, or click the Back to Query link to return to the query page.

You can safely edit any of the tickets and continue to navigate through the results using the Next/Previous/Back to Query links after saving your changes. When you return to the query, any tickets which were edited will be displayed with italicized text. If one of the tickets was edited such that it no longer matches the query criteria, the text will also be greyed. Lastly, if a new ticket matching the query criteria has been created, it will be shown in bold.

The query results can be refreshed and cleared of these status indicators by clicking the Update button again.

Saving Queries

Trac allows you to save the query as a named query accessible from the reports module. To save a query ensure that you have Updated the view and then click the Save query button displayed beneath the results. You can also save references to queries in Wiki content, as described below.

Note: an easy way to build queries like the ones below is to build and test them in the Custom Query module and, when ready, click Save query. This will build the query string for you; all you need to do is remove the extra line breaks.

Note: you must have the REPORT_CREATE permission in order to save queries to the list of default reports. The Save query button will only appear if you are logged in as a user that has been granted this permission. If your account does not have permission to create reports, you can still use the methods below to save a query.
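If you administer the Trac installation, REPORT_CREATE can be granted from the command line with trac-admin (the environment path and username below are placeholders):

trac-admin /path/to/projenv permission add someuser REPORT_CREATE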

You may want to save some queries so that you can come back to them later. You can do this by making a link to the query from any Wiki page.

[query:status=new|assigned|reopened&version=1.0 Active tickets against 1.0]

Which is displayed as:

Active tickets against 1.0

This uses a very simple query language to specify the criteria (see Query Language).

Alternatively, you can copy the query string of a query and paste that into the Wiki link, including the leading ? character:

[query:?status=new&status=assigned&status=reopened&group=owner Assigned tickets by owner]

Which is displayed as:

Assigned tickets by owner

Using the [[TicketQuery]] Macro

The TicketQuery macro lets you display lists of tickets matching certain criteria anywhere you can use WikiFormatting.

Example:

[[TicketQuery(version=0.6|0.7&resolution=duplicate)]]

This is displayed as:

#69 Add orientation to 1D plots
#202 Fix auto build of sphinx documentation
#207 Batch fitting does not display the correct data
#258 Windows installer problems when previous versions are not uninstalled
#343 SESANS data loader
#369 clean up ImageViewer interface
#380 Build number is still 1 in installer version
#388 Need BUMPS usage documentation
#403 Two "Fitting" menus
#415 Save Project and Save Analysis don't open on double clicking
#537 Magnetic sld not implemented for python models
#630 rev-parse problem when running from non-git/unitilized directory
#655 Polydispersity array doesn't work
#739 Load project issues
#748 plug-in model unit tests from python shell not always working?
#752 "compute" button still not working
#755 S(q) models should not have scale and background parameters added
#781 blank fit page when selecting old style plugin after reload
#831 only process ticket updates from checkin messages when merged to master
#863 Load Plugin error
#899 Igor Reader q calculation
#904 Separated P(Q) and S(Q) not updated when model changes
#944 non fitting or derived parameters in models
#954 cross check dll/opencl/python polydispersity and orientation results
#1101 Batch results page not displaying polydispersity values
#1127 File Loader is not loading valid NXcanSAS
#1181 Add FitPage name to Graph Title
#1189 some magnetic parameters need to be hidden for multiplicity models

Just like the query: wiki links, the parameter of this macro expects a query string formatted according to the rules of the simple ticket query language. This also allows displaying the link and description of a single ticket:

[[TicketQuery(id=123)]]

This is displayed as:

No results

A more compact representation without the ticket summaries is also available:

[[TicketQuery(version=0.6|0.7&resolution=duplicate, compact)]]

This is displayed as:

#69, #202, #207, #258, #343, #369, #380, #388, #403, #415, #537, #630, #655, #739, #748, #752, #755, #781, #831, #863, #899, #904, #944, #954, #1101, #1127, #1181, #1189

Finally, if you wish to receive only the number of defects that match the query, use the count parameter.

[[TicketQuery(version=0.6|0.7&resolution=duplicate, count)]]

This is displayed as:

28

Customizing the table format

You can also customize the columns displayed in the table format (format=table) by using col=<field>. You can specify multiple fields, and the order in which they are displayed, by separating them with pipes (|), as below:

[[TicketQuery(max=3,status=closed,order=id,desc=1,format=table,col=resolution|summary|owner|reporter)]]

This is displayed as:

Full rows

In table format you can also have full rows for a given field by using rows=<field>, as below:

[[TicketQuery(max=3,status=closed,order=id,desc=1,format=table,col=resolution|summary|owner|reporter,rows=description)]]

This is displayed as:

Results (1 - 3 of 721)

Ticket Resolution Summary Owner Reporter
#1194 invalid console log hides bottom of data explorer butler
Description

in 5.0-beta.1 the console log always stays on top, even if clicking on the data explorer. Further the console cannot be shrunk (nor does it seem to want to expand despite an arrow indicating it should), nor can the data explorer be easily resized.

The default layout should not have the console covering such a key area of the GUI anyway. Can we make the data explorer smaller by default? and expandable? and also the console?

#1191 obsolete Correct erroneous Scale reported by Spinodal model smk78
Description

This ticket is being used to report changed (for the better!) behaviour in 4.2.0 that does not appear to have been previously ticketed or documented, so that the issue can be appropriately reported in future release notes, and to draw attention to the change.

This issue was verified in the presence of @richardh.

To reproduce the issue: Load the attached data in 4.1.2, send it for fitting, select the spinodal model. Increase Qmin to 0.05. Select scale, background & q_0. Fit. Now repeat the fit in 4.2.0.

The Chi2/Npts, Npts(Fit), theory curve, residuals, and indeed the theory intensities reported by DataInfo, are all the same. As are the fitted background and q_0 parameters (and their uncertainties). But the scale (and scale uncertainty) values reported are DIFFERENT. However, simple visual inspection shows that it is 4.2.0 which is giving the correct scale.

Closer inspection shows that 4.1.2 is actually reporting the square root of the correct scale.

The underlying code did not change (apart from the inclusion of an inconsequential numpy import):

This is 4.1.2

from numpy import inf, errstate

name = "spinodal"
title = "Spinodal decomposition model"
description = """\
      I(q) = scale ((1+gamma/2)x^2)/(gamma/2+x^(2+gamma))+background

      List of default parameters:
      scale = scaling
      gamma = exponent
      x = q/q_0
      q_0 = correlation peak position [1/A]
      background = Incoherent background"""
category = "shape-independent"

# pylint: disable=bad-whitespace, line-too-long
#             ["name", "units", default, [lower, upper], "type", "description"],
parameters = [["scale",    "",      1.0, [-inf, inf], "", "Scale factor"],
              ["gamma",      "",    3.0, [-inf, inf], "", "Exponent"],
              ["q_0",  "1/Ang",     0.1, [-inf, inf], "", "Correlation peak position"]
             ]
# pylint: enable=bad-whitespace, line-too-long

def Iq(q,
       scale=1.0,
       gamma=3.0,
       q_0=0.1):
    """
    :param q:              Input q-value
    :param scale:          Scale factor
    :param gamma:          Exponent
    :param q_0:            Correlation peak position
    :return:               Calculated intensity
    """
    
    with errstate(divide='ignore'):
        x = q/q_0
        inten = scale * ((1 + gamma / 2) * x ** 2) / (gamma / 2 + x ** (2 + gamma)) 
    return inten
Iq.vectorized = True  # Iq accepts an array of q values

This is 4.2.0

import numpy as np
from numpy import inf, errstate

name = "spinodal"
title = "Spinodal decomposition model"
description = """\
      I(q) = Imax ((1+gamma/2)x^2)/(gamma/2+x^(2+gamma)) + background

      List of default parameters:
      
      Imax = correlation peak intensity at q_0
      background = incoherent background
      gamma = exponent (see model documentation)
      q_0 = correlation peak position [1/A]
      x = q/q_0"""
      
category = "shape-independent"

# pylint: disable=bad-whitespace, line-too-long
#             ["name", "units", default, [lower, upper], "type", "description"],
parameters = [["gamma",      "",    3.0, [-inf, inf], "", "Exponent"],
              ["q_0",  "1/Ang",     0.1, [-inf, inf], "", "Correlation peak position"]
             ]
# pylint: enable=bad-whitespace, line-too-long

def Iq(q,
       gamma=3.0,
       q_0=0.1):
    """
    :param q:              Input q-value
    :param gamma:          Exponent
    :param q_0:            Correlation peak position
    :return:               Calculated intensity
    """

    with errstate(divide='ignore'):
        x = q/q_0
        inten = ((1 + gamma / 2) * x ** 2) / (gamma / 2 + x ** (2 + gamma))
    return inten
Iq.vectorized = True  # Iq accepts an array of q values

@richardh theorised that this might have something to do with the spinodal model being a non-SLD model, so we also tested the behaviour of the gaussian_peak model. The scale values from that reported by 4.1.2 and 4.2.0 were the same.

Does anyone remember making a change elsewhere that might account for this issue?

#1190 invalid documentation for magnetism need update smk78 richardh
Description

docs for magnetism need changing as they claim that magnetism only works for 5 models, when in fact they should now work for all discrete particles i.e. all of cylinder, ellipsoid, parallelepiped & sphere categories.

Magnetic sld's are automatically generated for any parameter name starting sld_.

Except need to check whether there are any pure python models - see #1048

Paul K said: The calculation works by setting the effective SLD for each particle SLD based on the spin and scattering angle of the neutron and the angle of the magnetic field for that SLD. The scattering patterns for the different spin states (++, +-, -+, --) are then added proportionally. Assuming uniform magnetism within each particle component, I see no reason that this wouldn't apply to all particle types.


Query Language

query: TracLinks and the [[TicketQuery]] macro both use a mini “query language” for specifying query filters. Filters are separated by ampersands (&). Each filter consists of the ticket field name, an operator, and one or more values. Multiple values are separated by a pipe (|), meaning that the filter matches any of the values. To include a literal & or | in a value, escape the character with a backslash (\).
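The joining and escaping rules above can be sketched in Python. This is a hypothetical helper for illustration only (escape_value and build_query are not part of Trac's API):

```python
# Sketch: build a Trac query string from a list of filters,
# backslash-escaping literal '&' and '|' inside values as the
# query language requires. Illustrative helper, not Trac code.

def escape_value(value):
    """Backslash-escape literal '&' and '|' inside a filter value."""
    return value.replace("&", r"\&").replace("|", r"\|")

def build_query(filters):
    """filters: list of (field, operator, [values]) tuples.
    Multiple values for one filter are joined with '|' (OR);
    separate filters are joined with '&' (AND)."""
    parts = []
    for field, op, values in filters:
        joined = "|".join(escape_value(v) for v in values)
        parts.append(f"{field}{op}{joined}")
    return "&".join(parts)

query = build_query([
    ("status", "=", ["new", "assigned", "reopened"]),
    ("summary", "~=", ["R&D"]),
])
print(query)  # status=new|assigned|reopened&summary~=R\&D
```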

The available operators are:

= the field content exactly matches one of the values
~= the field content contains one or more of the values
^= the field content starts with one of the values
$= the field content ends with one of the values

All of these operators can also be negated:

!= the field content matches none of the values
!~= the field content does not contain any of the values
!^= the field content does not start with any of the values
!$= the field content does not end with any of the values
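For example, the following link (an illustrative query, not one saved on this site) lists tickets that are not closed and whose summary does not contain the word duplicate:

[query:status!=closed&summary!~=duplicate Open tickets not mentioning duplicate]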

The date fields created and modified can be constrained by using the = operator and specifying a value containing two dates separated by two dots (..). Either end of the date range can be left empty, meaning that the corresponding end of the range is open. The date parser understands a few natural date specifications like "3 weeks ago", "last month" and "now", as well as Bugzilla-style date specifications like "1d", "2w", "3m" or "4y" for 1 day, 2 weeks, 3 months and 4 years, respectively. Spaces in date specifications can be left out to avoid having to quote the query string.

created=2007-01-01..2008-01-01 query tickets created in 2007
created=lastmonth..thismonth query tickets created during the previous month
modified=1weekago.. query tickets that have been modified in the last week
modified=..30daysago query tickets that have been inactive for the last 30 days

See also: TracTickets, TracReports, TracGuide

Last modified on Oct 5, 2016 1:31:12 PM