Changeset 4b001f3 in sasview for src


Timestamp:
Sep 27, 2017 10:47:44 AM
Author:
Paul Kienzle <pkienzle@…>
Branches:
master, ESS_GUI, ESS_GUI_Docs, ESS_GUI_batch_fitting, ESS_GUI_bumps_abstraction, ESS_GUI_iss1116, ESS_GUI_iss879, ESS_GUI_iss959, ESS_GUI_opencl, ESS_GUI_ordering, ESS_GUI_sync_sascalc, magnetic_scatt, release-4.2.2, ticket-1009, ticket-1094-headless, ticket-1242-2d-resolution, ticket-1243, ticket-1249, ticket885, unittest-saveload
Children:
fca1f50
Parents:
48154abb (diff), ad476d1 (diff)
Note: this is a merge changeset; the changes displayed below correspond to the merge itself.
Use the (diff) links above to see all the changes relative to each parent.
Message:

Merge branch 'master' into ticket-915

Location:
src/sas
Files:
1 added
16 edited

  • src/sas/sascalc/corfunc/corfunc_calculator.py

    ra859f99 r92eee84  
    124124 
    125125        params, s2 = self._fit_data(q, iq) 
     126        # Extrapolate to 100*Qmax in experimental data 
    126127        qs = np.arange(0, q[-1]*100, (q[1]-q[0])) 
    127128        iqs = s2(qs) 
  • src/sas/sasgui/perspectives/corfunc/media/corfunc_help.rst

    rf80b416e rad476d1  
    99----------- 
    1010 
    11 This performs a correlation function analysis of one-dimensional 
    12 SAXS/SANS data, or generates a model-independent volume fraction 
    13 profile from the SANS from an adsorbed polymer/surfactant layer. 
     11This currently performs correlation function analysis on SAXS/SANS data,  
     12but in the future it is also planned to generate model-independent volume  
     13fraction profiles from the SANS from adsorbed polymer/surfactant layers.  
     14The two types of analyses differ in the mathematical transform that is  
     15applied to the data (Fourier vs Hilbert). However, both functions are  
     16returned in *real space*. 
    1417 
    1518A correlation function may be interpreted in terms of an imaginary rod moving 
    16 through the structure of the material. Γ\ :sub:`1D`\ (R) is the probability that 
    17 a rod of length R moving through the material has equal electron/neutron scattering 
    18 length density at either end. Hence a frequently occurring spacing within a structure 
    19 manifests itself as a peak. 
    20  
    21 A volume fraction profile :math:`\Phi`\ (z) describes how the density of polymer segments/surfactant molecules varies with distance from an (assumed locally flat) interface. 
    22  
    23 Both functions are returned in *real space*. 
    24  
    25 The analysis is performed in 3 stages: 
    26  
    27 *  Extrapolation of the scattering curve to :math:`Q = 0` and 
     19through the structure of the material. Γ(x) is the probability that a rod of  
     20length x has equal electron/neutron scattering length density at either end.  
     21Hence a frequently occurring spacing within a structure will manifest itself  
     22as a peak in Γ(x). *SasView* will return both the one-dimensional ( Γ\ :sub:`1`\ (x) )  
     23and three-dimensional ( Γ\ :sub:`3`\ (x) ) correlation functions, the difference  
     24being that the former is only averaged in the plane of the scattering vector. 
     25 
     26A volume fraction profile :math:`\Phi`\ (z) describes how the density of polymer  
     27segments/surfactant molecules varies with distance, z, normal to an (assumed  
     28locally flat) interface. The form of :math:`\Phi`\ (z) can provide information  
     29about the arrangement of polymer/surfactant molecules at the interface. The width  
     30of the profile provides measures of the layer thickness, and the area under  
     31the profile is related to the amount of material that is adsorbed. 
     32 
     33Both analyses are performed in 3 stages: 
     34 
     35*  Extrapolation of the scattering curve to :math:`Q = 0` and toward  
    2836   :math:`Q = \infty` 
    2937*  Smoothed merging of the two extrapolations into the original data 
    3038*  Fourier / Hilbert Transform of the smoothed data to give the correlation 
    31    function / volume fraction profile, respectively 
    32 *  (Optional) Interpretation of the 1D correlation function based on an ideal 
    33    lamellar morphology 
     39   function or volume fraction profile, respectively 
     40*  (Optional) Interpretation of Γ\ :sub:`1`\ (x) assuming the sample conforms  
     41   to an ideal lamellar morphology 
    3442 
    3543.. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ 
     44 
    3645 
    3746Extrapolation 
     
    4150................ 
    4251 
    43 The data are extrapolated to Q = 0 by fitting a Guinier model to the data 
    44 points in the low-Q range. 
     52The data are extrapolated to q = 0 by fitting a Guinier function to the data 
     53points in the low-q range. 
    4554 
    4655The equation used is: 
    4756 
    4857.. math:: 
    49     I(Q) = Ae^{Bq^2} 
    50  
    51 The Guinier model assumes that the small angle scattering arises from particles 
    52 and that parameter :math:`B` is related to the radius of gyration of those 
    53 particles. This has dubious applicability to polymer systems. However, the 
    54 correlation function is affected by the Guinier back-extrapolation to the 
    55 greatest extent at large values of R and so only has a 
    56 small effect on the final analysis. 
     58    I(q) = A e^{Bq^2} 
     59 
     60Where the parameter :math:`B` is related to the effective radius-of-gyration of  
     61a spherical object having the same small-angle scattering in this region. 
     62         
     63Note that as q tends to zero this function tends to a limiting value and is  
     64therefore less appropriate for use in systems where the form factor does not  
     65do likewise. However, because of the transform, the correlation functions are  
     66most affected by the Guinier back-extrapolation at *large* values of x where  
     67the impact on any extrapolated parameters will be least significant. 
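
Since :math:`\ln I(q) = \ln A + Bq^2` is linear in :math:`q^2`, the Guinier back-extrapolation described above amounts to a straight-line fit over the low-q points. A minimal sketch (illustrative only, not the SasView implementation; the array names and cut-off value are hypothetical)::

    import numpy as np

    def guinier_fit(q, iq, q_low_max):
        """Fit I(q) = A*exp(B*q^2) to the points with q <= q_low_max."""
        mask = (q > 0) & (q <= q_low_max) & (iq > 0)
        # ln I = ln A + B*q^2, so an ordinary linear fit in q^2 is enough
        B, lnA = np.polyfit(q[mask]**2, np.log(iq[mask]), 1)
        return np.exp(lnA), B

    # q, iq = experimental arrays (assumed); back-extrapolate towards q = 0
    # A, B = guinier_fit(q, iq, q_low_max=0.01)
    # q_low = np.linspace(0, q[0], 50)
    # iq_low = A * np.exp(B * q_low**2)
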
    5768 
    5869To :math:`Q = \infty` 
    5970..................... 
    6071 
    61 The data are extrapolated to Q = :math:`\infty` by fitting a Porod model to 
    62 the data points in the high-Q range. 
     72The data are extrapolated towards q = :math:`\infty` by fitting a Porod model to 
     73the data points in the high-q range and then computing the extrapolation to 100  
     74times the maximum q value in the experimental dataset. This should be more than  
     75sufficient to ensure that on transformation any truncation artefacts introduced  
     76are at such small values of x that they can be safely ignored. 
    6377 
    6478The equation used is: 
    6579 
    6680.. math:: 
    67     I(Q) = K Q^{-4}e^{-Q^2\sigma^2} + Bg 
    68  
    69 Where :math:`Bg` is the background, :math:`K` is the Porod 
    70 constant, and :math:`\sigma` (which must be > 0) describes the width of the electron or neutron scattering length density profile at the interface between the crystalline and amorphous 
    71 regions as shown below. 
     81    I(q) = K q^{-4}e^{-q^2\sigma^2} + Bg 
     82 
     83Where :math:`Bg` is the background, :math:`K` is the Porod constant, and :math:`\sigma` (which  
     84must be > 0) describes the width of the electron/neutron scattering length density  
     85profile at the interface between the crystalline and amorphous regions as shown below. 
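
The Porod tail-fit and the forward extrapolation to 100 times the maximum experimental q can be sketched along the following lines (illustrative only, not the SasView code; the use of scipy's curve_fit and the variable names are assumptions)::

    import numpy as np
    from scipy.optimize import curve_fit

    def porod(q, K, sigma, bg):
        """I(q) = K * q^-4 * exp(-q^2 * sigma^2) + Bg"""
        return K * q**-4 * np.exp(-q**2 * sigma**2) + bg

    # q, iq = experimental arrays (assumed); fit only the high-q points
    # mask = q >= q_high_min
    # (K, sigma, bg), _ = curve_fit(porod, q[mask], iq[mask], p0=[1e-4, 1.0, 0.0])
    # qs = np.arange(0, q[-1] * 100, q[1] - q[0])     # out to 100 * q_max
    # iqs = porod(qs[1:], K, sigma, bg)               # skip q = 0 where q^-4 diverges
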
    7286 
    7387.. figure:: fig1.png 
     
    7892--------- 
    7993 
    80 The extrapolated data set consists of the Guinier back-extrapolation from Q~0 
    81 up to the lowest Q value in the original data, then the original scattering data, and the Porod tail-fit beyond this. The joins between the original data and the Guinier/Porod fits are smoothed using the algorithm below to avoid the formation of ripples in the transformed data. 
     94The extrapolated data set consists of the Guinier back-extrapolation from q ~ 0 
     95up to the lowest q value in the original data, then the original scattering data,  
     96and then the Porod tail-fit beyond this. The joins between the original data and  
      97the Guinier/Porod extrapolations are smoothed using the algorithm below to try  
      98to avoid the formation of truncation ripples in the transformed data: 
    8299 
    83100Functions :math:`f(x_i)` and :math:`g(x_i)` where :math:`x_i \in \left\{ 
     
    94111 
    95112 
    96 Transform 
    97 --------- 
     113Transformation 
     114-------------- 
    98115 
    99116Fourier 
    100117....... 
    101118 
    102 If "Fourier" is selected for the transform type, the analysis will perform a 
     119If "Fourier" is selected for the transform type, *SasView* will perform a 
    103120discrete cosine transform on the extrapolated data in order to calculate the 
    104 1D correlation function: 
    105  
    106 .. math:: 
    107     \Gamma _{1D}(R) = \frac{1}{Q^{*}} \int_{0}^{\infty }I(q) q^{2} cos(qR) dq 
    108  
    109 where Q\ :sup:`*` is the Scattering Invariant. 
     1211D correlation function as: 
     122 
     123.. math:: 
     124    \Gamma _{1}(x) = \frac{1}{Q^{*}} \int_{0}^{\infty }I(q) q^{2} cos(qx) dq 
     125 
     126where Q\ :sup:`*` is the Scattering (also called Porod) Invariant. 
    110127 
    111128The following algorithm is applied: 
     
    116133    N-1, N 
    117134 
    118 The 3D correlation function is also calculated: 
    119  
    120 .. math:: 
    121     \Gamma _{3D}(R) = \frac{1}{Q^{*}} \int_{0}^{\infty}I(q) q^{2} 
    122     \frac{sin(qR)}{qR} dq 
     135The 3D correlation function is calculated as: 
     136 
     137.. math:: 
     138    \Gamma _{3}(x) = \frac{1}{Q^{*}} \int_{0}^{\infty}I(q) q^{2} 
     139    \frac{sin(qx)}{qx} dq 
     140 
     141.. note:: It is always advisable to inspect Γ\ :sub:`1`\ (x) and Γ\ :sub:`3`\ (x)  
     142    for artefacts arising from the extrapolation and transformation processes: 
     143         
     144        - do they tend to zero as x tends to :math:`\infty`? 
      145        - do they smoothly curve onto the ordinate at x = 0? (if not, check that the value  
      146          of :math:`\sigma` is sensible) 
      147        - are there ripples at x values corresponding to (2 :math:`\pi` over) the two  
      148          q values at which the extrapolated and experimental data are merged? 
      149        - are there any artefacts at x values corresponding to 2 :math:`\pi` / q\ :sub:`max` in  
     150          the experimental data?  
     151        - and lastly, do the significant features/peaks in the correlation functions  
      152          actually correspond to anticipated spacings in the sample? 
     153 
     154Finally, the program calculates the interface distribution function (IDF) g\ :sub:`1`\ (x) as  
     155the discrete cosine transform of: 
     156 
     157.. math:: 
     158    -q^{4} I(q) 
     159 
     160The IDF is proportional to the second derivative of Γ\ :sub:`1`\ (x). 
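
As a rough numerical illustration of the two integrals above (simple trapezoid-rule quadrature rather than the discrete cosine transform that SasView actually applies; array names are hypothetical)::

    import numpy as np

    def correlation_functions(qs, iqs, x):
        """Evaluate Gamma_1(x) and Gamma_3(x) from extrapolated data qs, iqs."""
        q_star = np.trapz(iqs * qs**2, qs)          # scattering (Porod) invariant
        gamma1 = np.array([np.trapz(iqs * qs**2 * np.cos(qs * xi), qs)
                           for xi in x]) / q_star
        # sin(qx)/(qx) written via np.sinc, which is sin(pi*z)/(pi*z)
        gamma3 = np.array([np.trapz(iqs * qs**2 * np.sinc(qs * xi / np.pi), qs)
                           for xi in x]) / q_star
        return gamma1, gamma3

    # x = np.linspace(0, 200, 401)   # real-space grid in units of 1/q
    # g1, g3 = correlation_functions(qs, iqs, x)

The IDF would follow analogously from the cosine transform of :math:`-q^4 I(q)`, as noted above.
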
    123161 
    124162Hilbert 
    125163....... 
    126  
     164         
    127165If "Hilbert" is selected for the transform type, the analysis will perform a 
    128166Hilbert transform on the extrapolated data in order to calculate the Volume 
    129167Fraction Profile. 
    130168 
    131 .. note:: This functionality is not yet implemented in SasView. 
     169.. note:: The Hilbert transform functionality is not yet implemented in SasView. 
    132170 
    133171 
     
    138176.................... 
    139177 
    140 Once the correlation function has been calculated it may be interpreted by clicking the "Compute Parameters" button. 
    141  
    142 The correlation function is interpreted in terms of an ideal lamellar 
    143 morphology, and structural parameters are obtained from it as shown below. 
    144 It should be noted that a small beam size is assumed; ie, no de-smearing is 
    145 performed. 
     178Once the correlation functions have been calculated *SasView* can be asked to  
      179try to interpret Γ\ :sub:`1`\ (x) in terms of an ideal lamellar morphology  
     180as shown below. 
    146181 
    147182.. figure:: fig2.png 
    148183   :align: center 
    149184 
    150 The structural parameters obtained are: 
     185The structural parameters extracted are: 
    151186 
    152187*   Long Period :math:`= L_p` 
     
    160195....................... 
    161196 
    162 SasView does not provide any automatic interpretation of volume fraction profiles in the same way that it does for correlation functions. However, a number of structural parameters are obtainable by other means: 
     197SasView does not provide any automatic interpretation of volume fraction profiles  
     198in the same way that it does for correlation functions. However, a number of  
     199structural parameters are obtainable by other means: 
    163200 
    164201*   Surface Coverage :math:`=\theta` 
     
    175212   :align: center 
    176213 
     214The reader is directed to the references for information on these parameters. 
    177215 
    178216References 
    179217---------- 
    180218 
     219Correlation Function 
     220.................... 
     221 
    181222Strobl, G. R.; Schneider, M. *J. Polym. Sci.* (1980), 18, 1343-1359 
    182223 
     
    189230Baltá Calleja, F. J.; Vonk, C. G. *X-ray Scattering of Synthetic Polymers*, Elsevier. Amsterdam (1989), 260-270 
    190231 
     232Göschel, U.; Urban, G. *Polymer* (1995), 36, 3633-3639 
     233 
     234Stribeck, N. *X-Ray Scattering of Soft Matter*, Springer. Berlin (2007), 138-161 
     235 
    191236:ref:`FDR` (PDF format) 
     237 
     238Volume Fraction Profile 
     239....................... 
     240 
     241Washington, C.; King, S. M. *J. Phys. Chem.*, (1996), 100, 7603-7609 
     242 
     243Cosgrove, T.; King, S. M.; Griffiths, P. C. *Colloid-Polymer Interactions: From Fundamentals to Practice*, Wiley. New York (1999), 193-204 
     244 
     245King, S. M.; Griffiths, P. C.; Cosgrove, T. *Applications of Neutron Scattering to Soft Condensed Matter*, Gordon & Breach. Amsterdam (2000), 77-105 
     246 
     247King, S.; Griffiths, P.; Hone, J.; Cosgrove, T. *Macromol. Symp.* (2002), 190, 33-42 
    192248 
    193249.. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ 
     
    198254Upon sending data for correlation function analysis, it will be plotted (minus 
    199255the background value), along with a *red* bar indicating the *upper end of the 
    200 low-Q range* (used for back-extrapolation), and 2 *purple* bars indicating the range to be used for forward-extrapolation. These bars may be moved my clicking and 
    201 dragging, or by entering appropriate values in the Q range input boxes. 
     256low-Q range* (used for Guinier back-extrapolation), and 2 *purple* bars indicating  
     257the range to be used for Porod forward-extrapolation. These bars may be moved by  
     258grabbing and dragging, or by entering appropriate values in the Q range input boxes. 
    202259 
    203260.. figure:: tutorial1.png 
    204261   :align: center 
    205262 
    206 Once the Q ranges have been set, click the "Calculate" button to determine the background level. Alternatively, enter your own value into the field. If the box turns yellow this indicates that background subtraction has resulted in some negative intensities. 
    207  
    208 Click the "Extrapolate" button to extrapolate the data and plot the extrapolation in the same figure. The values of the parameters used for the Guinier and Porod models will also be shown in the "Extrapolation Parameters" section of the window. 
     263Once the Q ranges have been set, click the "Calculate Bg" button to determine the  
     264background level. Alternatively, enter your own value into the box. If the box turns  
     265yellow this indicates that background subtraction has created some negative intensities. 
     266 
     267Now click the "Extrapolate" button to extrapolate the data. The graph window will update  
     268to show the extrapolated data, and the values of the parameters used for the Guinier and  
     269Porod extrapolations will appear in the "Extrapolation Parameters" section of the SasView  
     270GUI. 
    209271 
    210272.. figure:: tutorial2.png 
     
    214276buttons: 
    215277 
    216 *   **Fourier** Perform a Fourier Transform to calculate the correlation 
    217     function 
    218 *   **Hilbert** Perform a Hilbert Transform to calculate the volume fraction 
     278*   **Fourier**: to perform a Fourier Transform to calculate the correlation 
     279    functions 
     280*   **Hilbert**: to perform a Hilbert Transform to calculate the volume fraction 
    219281    profile 
    220282 
    221 Click the "Transform" button to perform the selected transform and plot 
    222 the result in a new graph window. 
    223  
    224 If a Fourier Transform was performed, the "Compute Parameters" button can now be clicked to interpret the correlation function as described earlier. 
     283and click the "Transform" button to perform the selected transform and plot 
     284the results. 
    225285 
    226286 .. figure:: tutorial3.png 
    227287    :align: center 
    228288 
     289If a Fourier Transform was performed, the "Compute Parameters" button can now be  
     290clicked to interpret the correlation function as described earlier. The parameters  
     291will appear in the "Output Parameters" section of the SasView GUI. 
     292 
     293 .. figure:: tutorial4.png 
     294    :align: center 
     295 
    229296 
    230297.. note:: 
    231     This help document was last changed by Steve King, 08Oct2016 
     298    This help document was last changed by Steve King, 26Sep2017 
  • src/sas/sascalc/dataloader/data_info.py

    r17e257b5 rdeaa0c6  
    11761176        final_dataset.yaxis(data._yaxis, data._yunit) 
    11771177        final_dataset.zaxis(data._zaxis, data._zunit) 
    1178         final_dataset.x_bins = data.x_bins 
    1179         final_dataset.y_bins = data.y_bins 
     1178        if len(data.data.shape) == 2: 
     1179            n_rows, n_cols = data.data.shape 
     1180            final_dataset.y_bins = data.qy_data[0::int(n_cols)] 
     1181            final_dataset.x_bins = data.qx_data[:int(n_cols)] 
    11801182    else: 
    11811183        return_string = "Should Never Happen: _combine_data_info_with_plottable input is not a plottable1d or " + \ 
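
The replacement block above rebuilds the axis bins from the flattened, row-major q arrays instead of copying the incoming x_bins/y_bins. A small stand-alone illustration of the slicing (hypothetical values, not SasView code)::

    import numpy as np

    x_vals = np.array([0.01, 0.02, 0.03])       # unique qx values (columns)
    y_vals = np.array([0.1, 0.2])               # unique qy values (rows)
    qx_grid, qy_grid = np.meshgrid(x_vals, y_vals)
    qx_data, qy_data = qx_grid.flatten(), qy_grid.flatten()

    n_rows, n_cols = qx_grid.shape
    x_bins = qx_data[:int(n_cols)]       # first row gives one qx per column
    y_bins = qy_data[0::int(n_cols)]     # every n_cols-th element gives one qy per row

    assert np.allclose(x_bins, x_vals) and np.allclose(y_bins, y_vals)
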
  • src/sas/sascalc/dataloader/file_reader_base_class.py

    rae69c690 rdeaa0c6  
    167167                    dataset.x_bins = dataset.qx_data[:int(n_cols)] 
    168168                dataset.data = dataset.data.flatten() 
     169                if len(dataset.data) > 0: 
     170                    dataset.xmin = np.min(dataset.qx_data) 
     171                    dataset.xmax = np.max(dataset.qx_data) 
     172                    dataset.ymin = np.min(dataset.qy_data) 
     173                    dataset.ymax = np.max(dataset.qx_data) 
    169174 
    170175    def format_unit(self, unit=None): 
     
    191196        self.output = [] 
    192197 
    193     def remove_empty_q_values(self, has_error_dx=False, has_error_dy=False, 
    194                               has_error_dxl=False, has_error_dxw=False): 
     198    def data_cleanup(self): 
     199        """ 
     200        Clean up the data sets and refresh everything 
     201        :return: None 
     202        """ 
     203        self.remove_empty_q_values() 
     204        self.send_to_output()  # Combine datasets with DataInfo 
     205        self.current_datainfo = DataInfo()  # Reset DataInfo 
     206 
     207    def remove_empty_q_values(self): 
    195208        """ 
    196209        Remove any point where Q == 0 
    197210        """ 
    198         x = self.current_dataset.x 
    199         self.current_dataset.x = self.current_dataset.x[x != 0] 
    200         self.current_dataset.y = self.current_dataset.y[x != 0] 
    201         if has_error_dy: 
    202             self.current_dataset.dy = self.current_dataset.dy[x != 0] 
    203         if has_error_dx: 
    204             self.current_dataset.dx = self.current_dataset.dx[x != 0] 
    205         if has_error_dxl: 
    206             self.current_dataset.dxl = self.current_dataset.dxl[x != 0] 
    207         if has_error_dxw: 
    208             self.current_dataset.dxw = self.current_dataset.dxw[x != 0] 
     211        if isinstance(self.current_dataset, plottable_1D): 
     212            # Booleans for resolutions 
     213            has_error_dx = self.current_dataset.dx is not None 
     214            has_error_dxl = self.current_dataset.dxl is not None 
     215            has_error_dxw = self.current_dataset.dxw is not None 
     216            has_error_dy = self.current_dataset.dy is not None 
     217            # Create arrays of zeros for non-existent resolutions 
     218            if has_error_dxw and not has_error_dxl: 
     219                array_size = self.current_dataset.dxw.size - 1 
     220                self.current_dataset.dxl = np.append(self.current_dataset.dxl, 
     221                                                    np.zeros([array_size])) 
     222                has_error_dxl = True 
     223            elif has_error_dxl and not has_error_dxw: 
     224                array_size = self.current_dataset.dxl.size - 1 
     225                self.current_dataset.dxw = np.append(self.current_dataset.dxw, 
     226                                                    np.zeros([array_size])) 
     227                has_error_dxw = True 
     228            elif not has_error_dxl and not has_error_dxw and not has_error_dx: 
     229                array_size = self.current_dataset.x.size - 1 
     230                self.current_dataset.dx = np.append(self.current_dataset.dx, 
     231                                                    np.zeros([array_size])) 
     232                has_error_dx = True 
     233            if not has_error_dy: 
     234                array_size = self.current_dataset.y.size - 1 
     235                self.current_dataset.dy = np.append(self.current_dataset.dy, 
     236                                                    np.zeros([array_size])) 
     237                has_error_dy = True 
     238 
     239            # Remove points where q = 0 
     240            x = self.current_dataset.x 
     241            self.current_dataset.x = self.current_dataset.x[x != 0] 
     242            self.current_dataset.y = self.current_dataset.y[x != 0] 
     243            if has_error_dy: 
     244                self.current_dataset.dy = self.current_dataset.dy[x != 0] 
     245            if has_error_dx: 
     246                self.current_dataset.dx = self.current_dataset.dx[x != 0] 
     247            if has_error_dxl: 
     248                self.current_dataset.dxl = self.current_dataset.dxl[x != 0] 
     249            if has_error_dxw: 
     250                self.current_dataset.dxw = self.current_dataset.dxw[x != 0] 
     251        elif isinstance(self.current_dataset, plottable_2D): 
     252            has_error_dqx = self.current_dataset.dqx_data is not None 
     253            has_error_dqy = self.current_dataset.dqy_data is not None 
     254            has_error_dy = self.current_dataset.err_data is not None 
     255            has_mask = self.current_dataset.mask is not None 
     256            x = self.current_dataset.qx_data 
     257            self.current_dataset.data = self.current_dataset.data[x != 0] 
     258            self.current_dataset.qx_data = self.current_dataset.qx_data[x != 0] 
     259            self.current_dataset.qy_data = self.current_dataset.qy_data[x != 0] 
     260            self.current_dataset.q_data = np.sqrt( 
     261                np.square(self.current_dataset.qx_data) + np.square( 
     262                    self.current_dataset.qy_data)) 
     263            if has_error_dy: 
     264                self.current_dataset.err_data = self.current_dataset.err_data[x != 0] 
     265            if has_error_dqx: 
     266                self.current_dataset.dqx_data = self.current_dataset.dqx_data[x != 0] 
     267            if has_error_dqy: 
     268                self.current_dataset.dqy_data = self.current_dataset.dqy_data[x != 0] 
     269            if has_mask: 
     270                self.current_dataset.mask = self.current_dataset.mask[x != 0] 
    209271 
    210272    def reset_data_list(self, no_lines=0): 
  • src/sas/sascalc/dataloader/readers/abs_reader.py

    rad92c5a rffb6474  
    109109                # Sample thickness in mm 
    110110                try: 
    111                     value = float(line_toks[5]) 
     111                    value = float(line_toks[5][:-1]) 
    112112                    if self.has_converter and \ 
    113113                            self.current_datainfo.sample.thickness_unit != 'cm': 
     
    202202                is_data_started = True 
    203203 
    204         self.remove_empty_q_values(True, True) 
     204        self.remove_empty_q_values() 
    205205 
    206206        # Sanity check 
  • src/sas/sascalc/dataloader/readers/ascii_reader.py

    rf994e8b1 r7b07fbe  
    156156            raise FileContentsException(msg) 
    157157 
    158         self.remove_empty_q_values(has_error_dx, has_error_dy) 
     158        self.remove_empty_q_values() 
    159159        self.current_dataset.xaxis("\\rm{Q}", 'A^{-1}') 
    160160        self.current_dataset.yaxis("\\rm{Intensity}", "cm^{-1}") 
  • src/sas/sascalc/dataloader/readers/cansas_reader.py

    rae69c690 r62160509  
    104104            xml_file = self.f_open.name 
    105105        # We don't use f_open since lxml handles opening/closing files 
    106         if not self.f_open.closed: 
    107             self.f_open.close() 
    108  
    109         basename, _ = os.path.splitext(os.path.basename(xml_file)) 
    110  
    111106        try: 
    112107            # Raises FileContentsException 
    113108            self.load_file_and_schema(xml_file, schema_path) 
    114             self.current_datainfo = DataInfo() 
    115             # Raises FileContentsException if file doesn't meet CanSAS schema 
     109            # Parse each SASentry 
     110            entry_list = self.xmlroot.xpath('/ns:SASroot/ns:SASentry', 
     111                                            namespaces={ 
     112                                                'ns': self.cansas_defaults.get( 
     113                                                    "ns") 
     114                                            }) 
    116115            self.is_cansas(self.extension) 
    117             self.invalid = False # If we reach this point then file must be valid CanSAS 
    118  
    119             # Parse each SASentry 
    120             entry_list = self.xmlroot.xpath('/ns:SASroot/ns:SASentry', namespaces={ 
    121                 'ns': self.cansas_defaults.get("ns") 
    122             }) 
    123             # Look for a SASentry 
    124             self.names.append("SASentry") 
    125116            self.set_processing_instructions() 
    126  
    127117            for entry in entry_list: 
    128                 self.current_datainfo.filename = basename + self.extension 
    129                 self.current_datainfo.meta_data["loader"] = "CanSAS XML 1D" 
    130                 self.current_datainfo.meta_data[PREPROCESS] = self.processing_instructions 
    131118                self._parse_entry(entry) 
    132119                self.data_cleanup() 
     
    150137                    invalid_xml = self.find_invalid_xml() 
    151138                    if invalid_xml != "": 
     139                        basename, _ = os.path.splitext( 
     140                            os.path.basename(self.f_open.name)) 
    152141                        invalid_xml = INVALID_XML.format(basename + self.extension) + invalid_xml 
    153142                        raise DataReaderException(invalid_xml) # Handled by base class 
     
    164153        except Exception as e: # Convert all other exceptions to FileContentsExceptions 
    165154            raise FileContentsException(e.message) 
    166  
     155        finally: 
     156            if not self.f_open.closed: 
     157                self.f_open.close() 
    167158 
    168159    def load_file_and_schema(self, xml_file, schema_path=""): 
     
    209200        if not self._is_call_local() and not recurse: 
    210201            self.reset_state() 
     202        if not recurse: 
     203            self.current_datainfo = DataInfo() 
     204            # Raises FileContentsException if file doesn't meet CanSAS schema 
     205            self.invalid = False 
     206            # Look for a SASentry 
    211207            self.data = [] 
    212             self.current_datainfo = DataInfo() 
     208            self.parent_class = "SASentry" 
    213209            self.names.append("SASentry") 
    214             self.parent_class = "SASentry" 
     210            self.current_datainfo.meta_data["loader"] = "CanSAS XML 1D" 
     211            self.current_datainfo.meta_data[ 
     212                PREPROCESS] = self.processing_instructions 
     213        if self._is_call_local() and not recurse: 
     214            basename, _ = os.path.splitext(os.path.basename(self.f_open.name)) 
     215            self.current_datainfo.filename = basename + self.extension 
    215216        # Create an empty dataset if no data has been passed to the reader 
    216217        if self.current_dataset is None: 
    217             self.current_dataset = plottable_1D(np.empty(0), np.empty(0), 
    218                 np.empty(0), np.empty(0)) 
     218            self._initialize_new_data_set(dom) 
    219219        self.base_ns = "{" + CANSAS_NS.get(self.cansas_version).get("ns") + "}" 
    220220 
     
    228228            tagname_original = tagname 
    229229            # Skip this iteration when loading in save state information 
    230             if tagname == "fitting_plug_in" or tagname == "pr_inversion" or tagname == "invariant": 
     230            if tagname in ["fitting_plug_in", "pr_inversion", "invariant", "corfunc"]: 
    231231                continue 
    232232            # Get where to store content 
     
    258258                self._add_intermediate() 
    259259            else: 
     260                # TODO: Clean this up to make it faster (fewer if/elifs) 
    260261                if isinstance(self.current_dataset, plottable_2D): 
    261262                    data_point = node.text 
     
    502503            self.sort_two_d_data() 
    503504            self.reset_data_list() 
    504             empty = None 
    505             return self.output[0], empty 
    506  
    507     def data_cleanup(self): 
    508         """ 
    509         Clean up the data sets and refresh everything 
    510         :return: None 
    511         """ 
    512         has_error_dx = self.current_dataset.dx is not None 
    513         has_error_dxl = self.current_dataset.dxl is not None 
    514         has_error_dxw = self.current_dataset.dxw is not None 
    515         has_error_dy = self.current_dataset.dy is not None 
    516         self.remove_empty_q_values(has_error_dx=has_error_dx, 
    517                                    has_error_dxl=has_error_dxl, 
    518                                    has_error_dxw=has_error_dxw, 
    519                                    has_error_dy=has_error_dy) 
    520         self.send_to_output()  # Combine datasets with DataInfo 
    521         self.current_datainfo = DataInfo()  # Reset DataInfo 
     505            return self.output[0], None 
    522506 
    523507    def _is_call_local(self): 
     
    553537            self.aperture = Aperture() 
    554538        elif self.parent_class == 'SASdata': 
    555             self._check_for_empty_resolution() 
    556539            self.data.append(self.current_dataset) 
    557540 
     
    609592        if 'unit' in attr and attr.get('unit') is not None: 
    610593            try: 
    611                 local_unit = attr['unit'] 
     594                unit = attr['unit'] 
     595                unit_list = unit.split("|") 
     596                if len(unit_list) > 1: 
     597                    self.current_dataset.xaxis(unit_list[0].strip(), 
     598                                               unit_list[1].strip()) 
     599                    local_unit = unit_list[1] 
     600                else: 
     601                    local_unit = unit 
    612602                unitname = self.ns_list.current_level.get("unit", "") 
    613603                if "SASdetector" in self.names: 
     
    665655        return node_value, value_unit 
    666656 
    667     def _check_for_empty_resolution(self): 
    668         """ 
    669         a method to check all resolution data sets are the same size as I and q 
    670         """ 
    671         dql_exists = False 
    672         dqw_exists = False 
    673         dq_exists = False 
    674         di_exists = False 
    675         if self.current_dataset.dxl is not None: 
    676             dql_exists = True 
    677         if self.current_dataset.dxw is not None: 
    678             dqw_exists = True 
    679         if self.current_dataset.dx is not None: 
    680             dq_exists = True 
    681         if self.current_dataset.dy is not None: 
    682             di_exists = True 
    683         if dqw_exists and not dql_exists: 
    684             array_size = self.current_dataset.dxw.size 
    685             self.current_dataset.dxl = np.zeros(array_size) 
    686         elif dql_exists and not dqw_exists: 
    687             array_size = self.current_dataset.dxl.size 
    688             self.current_dataset.dxw = np.zeros(array_size) 
    689         elif not dql_exists and not dqw_exists and not dq_exists: 
    690             array_size = self.current_dataset.x.size 
    691             self.current_dataset.dx = np.append(self.current_dataset.dx, 
    692                                                 np.zeros([array_size])) 
    693         if not di_exists: 
    694             array_size = self.current_dataset.y.size 
    695             self.current_dataset.dy = np.append(self.current_dataset.dy, 
    696                                                 np.zeros([array_size])) 
    697  
    698657    def _initialize_new_data_set(self, node=None): 
    699658        if node is not None: 
  • src/sas/sasgui/perspectives/corfunc/corfunc_state.py

    r2a399ca r1fa4f736  
    289289                namespaces={'ns': CANSAS_NS}) 
    290290            for entry in entry_list: 
    291                 sas_entry, _ = self._parse_entry(entry) 
    292291                corstate = self._parse_state(entry) 
    293292 
    294293                if corstate is not None: 
     294                    sas_entry, _ = self._parse_entry(entry) 
    295295                    sas_entry.meta_data['corstate'] = corstate 
    296296                    sas_entry.filename = corstate.file 
  • src/sas/sasgui/perspectives/fitting/fitpage.py

    r13374be r48154abb  
    20422042            # Save state_fit 
    20432043            self.save_current_state_fit() 
     2044            self.onSmear(None) 
     2045            self._onDraw(None) 
    20442046        except: 
    20452047            self._show_combox_helper() 
  • src/sas/sasgui/perspectives/fitting/pagestate.py

    rda9b239 r1fa4f736  
    12971297                                            namespaces={'ns': CANSAS_NS}) 
    12981298                    for entry in entry_list: 
    1299                         try: 
    1300                             sas_entry, _ = self._parse_save_state_entry(entry) 
    1301                         except: 
    1302                             raise 
    13031299                        fitstate = self._parse_state(entry) 
    1304  
    13051300                        # state could be None when .svs file is loaded 
    13061301                        # in this case, skip appending to output 
    13071302                        if fitstate is not None: 
     1303                            try: 
     1304                                sas_entry, _ = self._parse_save_state_entry( 
     1305                                    entry) 
     1306                            except: 
     1307                                raise 
    13081308                            sas_entry.meta_data['fitstate'] = fitstate 
    13091309                            sas_entry.filename = fitstate.file 
  • src/sas/sasgui/perspectives/fitting/simfitpage.py

    ra9f9ca4 r9804394  
    10731073        """ 
    10741074 
    1075         model_map = {} 
     1075        init_map = {} 
     1076        final_map = {} 
    10761077        if fit.fit_panel.sim_page is None: 
    10771078            fit.fit_panel.add_sim_page() 
     
    10871088                save_id = self._format_id(save_id) 
    10881089                if save_id == model_id: 
    1089                     model_map[saved_model.pop('fit_page_source')] = \ 
    1090                         model[3].name 
     1090                    inter_id = str(i) + str(i) + str(i) + str(i) + str(i) 
     1091                    init_map[saved_model.pop('fit_page_source')] = inter_id 
     1092                    final_map[inter_id] = model[3].name 
    10911093                    check = bool(saved_model.pop('checked')) 
    10921094                    sim_page.model_list[i][0].SetValue(check) 
     
    11061108                param = item.pop('param_cbox') 
    11071109                equality = item.pop('egal_txt') 
    1108                 for key, value in model_map.iteritems(): 
    1109                     model_cbox.replace(key, value) 
    1110                     constraint_value.replace(key, value) 
     1110                for key, value in init_map.items(): 
     1111                    model_cbox = model_cbox.replace(key, value) 
     1112                    constraint_value = constraint_value.replace(key, value) 
     1113                for key, value in final_map.items(): 
     1114                    model_cbox = model_cbox.replace(key, value) 
     1115                    constraint_value = constraint_value.replace(key, value) 
    11111116 
    11121117                sim_page.constraints_list[index][0].SetValue(model_cbox) 
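
The revised mapping above substitutes each saved page id with a unique intermediate token before writing in the new model name, which avoids one substitution being re-matched by a later one when the old and new identifiers overlap. A minimal sketch of the same two-stage idea (hypothetical names, not the SasView code)::

    def two_stage_replace(text, mapping):
        """Replace keys of `mapping` with their values via unique placeholders."""
        init_map, final_map = {}, {}
        for i, (old, new) in enumerate(mapping.items()):
            token = "\x00{}\x00".format(i)   # placeholder that cannot appear in text
            init_map[old] = token
            final_map[token] = new
        for old, token in init_map.items():
            text = text.replace(old, token)
        for token, new in final_map.items():
            text = text.replace(token, new)
        return text

    # Naive sequential replacement of "M1"->"M2" then "M2"->"M3" would clobber the
    # first substitution; two_stage_replace("M1.scale = M2.scale", {"M1": "M2", "M2": "M3"})
    # returns "M2.scale = M3.scale" as intended.
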
  • src/sas/sasgui/perspectives/invariant/invariant_state.py

    r7432acb r1fa4f736  
    728728 
    729729                for entry in entry_list: 
    730  
    731                     sas_entry, _ = self._parse_entry(entry) 
    732730                    invstate = self._parse_state(entry) 
    733  
    734731                    # invstate could be None when .svs file is loaded 
    735732                    # in this case, skip appending to output 
    736733                    if invstate is not None: 
     734                        sas_entry, _ = self._parse_entry(entry) 
    737735                        sas_entry.meta_data['invstate'] = invstate 
    738736                        sas_entry.filename = invstate.file 
  • src/sas/sasgui/perspectives/pr/inversion_state.py

    ra0e6b1b r1fa4f736  
    472472 
    473473                for entry in entry_list: 
    474                     sas_entry, _ = self._parse_entry(entry) 
    475474                    prstate = self._parse_prstate(entry) 
    476475                    #prstate could be None when .svs file is loaded 
    477476                    #in this case, skip appending to output 
    478477                    if prstate is not None: 
     478                        sas_entry, _ = self._parse_entry(entry) 
    479479                        sas_entry.meta_data['prstate'] = prstate 
    480480                        sas_entry.filename = prstate.file 