- Timestamp: Sep 27, 2017 10:47:44 AM
- Branches: master, ESS_GUI, ESS_GUI_Docs, ESS_GUI_batch_fitting, ESS_GUI_bumps_abstraction, ESS_GUI_iss1116, ESS_GUI_iss879, ESS_GUI_iss959, ESS_GUI_opencl, ESS_GUI_ordering, ESS_GUI_sync_sascalc, magnetic_scatt, release-4.2.2, ticket-1009, ticket-1094-headless, ticket-1242-2d-resolution, ticket-1243, ticket-1249, ticket885, unittest-saveload
- Children: fca1f50
- Parents: 48154abb (diff), ad476d1 (diff)
Note: this is a merge changeset; the changes displayed below correspond to the merge itself.
Use the (diff) links above to see all the changes relative to each parent.
- Location: src/sas
- Files: 1 added, 16 edited
Legend:
- lines with no prefix: unmodified (context)
- lines prefixed with "+": added
- lines prefixed with "-": removed
- "…": unchanged lines omitted
src/sas/sascalc/corfunc/corfunc_calculator.py
ra859f99 → r92eee84

          params, s2 = self._fit_data(q, iq)
+         # Extrapolate to 100*Qmax in experimental data
          qs = np.arange(0, q[-1]*100, (q[1]-q[0]))
          iqs = s2(qs)

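The new comment documents what the surrounding lines do: the fitted extrapolation is evaluated on a q grid that keeps the experimental spacing but extends out to 100 times the largest measured q. A minimal sketch of that grid construction, using invented data and a stand-in for the fitted s2(q) callable (neither is the SasView implementation):

    import numpy as np

    # Invented experimental grid and a stand-in for the extrapolation returned
    # by _fit_data(); the real s2 is a fitted Porod tail, this is just a shape.
    q = np.linspace(0.01, 0.5, 50)                  # measured q (1/Angstrom)
    s2 = lambda qq: 1e-4 / (qq + 1e-3)**4 + 0.1     # hypothetical smooth tail

    # Same spacing as the data, extended out to 100 * q_max, as the comment says
    qs = np.arange(0, q[-1] * 100, q[1] - q[0])
    iqs = s2(qs)                                    # intensity on the extended grid

    print(len(qs), qs[-1])                          # ~5000 points, reaching ~50 A^-1
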
src/sas/sasgui/perspectives/corfunc/media/corfunc_help.rst
rf80b416e → rad476d1

  -----------

- This performs a correlation function analysis of one-dimensional
- SAXS/SANS data, or generates a model-independent volume fraction
- profile from the SANS from an adsorbed polymer/surfactant layer.
+ This currently performs correlation function analysis on SAXS/SANS data,
+ but in the the future is also planned to generate model-independent volume
+ fraction profiles from the SANS from adsorbed polymer/surfactant layers.
+ The two types of analyses differ in the mathematical transform that is
+ applied to the data (Fourier vs Hilbert). However, both functions are
+ returned in *real space*.

  A correlation function may be interpreted in terms of an imaginary rod moving
- through the structure of the material. Γ\ :sub:`1D`\ (R) is the probability that
- a rod of length R moving through the material has equal electron/neutron scattering
- length density at either end. Hence a frequently occurring spacing within a structure
- manifests itself as a peak.
-
- A volume fraction profile :math:`\Phi`\ (z) describes how the density of polymer segments/surfactant molecules varies with distance from an (assumed locally flat) interface.
-
- Both functions are returned in *real space*.
-
- The analysis is performed in 3 stages:
-
- * Extrapolation of the scattering curve to :math:`Q = 0` and
+ through the structure of the material. Γ(x) is the probability that a rod of
+ length x has equal electron/neutron scattering length density at either end.
+ Hence a frequently occurring spacing within a structure will manifest itself
+ as a peak in Γ(x). *SasView* will return both the one-dimensional ( Γ\ :sub:`1`\ (x) )
+ and three-dimensional ( Γ\ :sub:`3`\ (x) ) correlation functions, the difference
+ being that the former is only averaged in the plane of the scattering vector.
+
+ A volume fraction profile :math:`\Phi`\ (z) describes how the density of polymer
+ segments/surfactant molecules varies with distance, z, normal to an (assumed
+ locally flat) interface. The form of :math:`\Phi`\ (z) can provide information
+ about the arrangement of polymer/surfactant molecules at the interface. The width
+ of the profile provides measures of the layer thickness, and the area under
+ the profile is related to the amount of material that is adsorbed.
+
+ Both analyses are performed in 3 stages:
+
+ * Extrapolation of the scattering curve to :math:`Q = 0` and toward
    :math:`Q = \infty`
  * Smoothed merging of the two extrapolations into the original data
  * Fourier / Hilbert Transform of the smoothed data to give the correlation
-   function /volume fraction profile, respectively
- * (Optional) Interpretation of the 1D correlation function based on an ideal
-   lamellar morphology
+   function or volume fraction profile, respectively
+ * (Optional) Interpretation of Γ\ :sub:`1`\ (x) assuming the sample conforms
+   to an ideal lamellar morphology

  .. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ
+

  Extrapolation
…
  ................

- The data are extrapolated to Q = 0 by fitting a Guinier model to the data
- points in the low-Q range.
+ The data are extrapolated to q = 0 by fitting a Guinier function to the data
+ points in the low-q range.

  The equation used is:

  .. math::
-     I(Q) = Ae^{Bq^2}
-
- The Guinier model assumes that the small angle scattering arises from particles
- and that parameter :math:`B` is related to the radius of gyration of those
- particles. This has dubious applicability to polymer systems. However, the
- correlation function is affected by the Guinier back-extrapolation to the
- greatest extent at large values of R and so only has a
- small effect on the final analysis.
+     I(q) = A e^{Bq^2}
+
+ Where the parameter :math:`B` is related to the effective radius-of-gyration of
+ a spherical object having the same small-angle scattering in this region.
+
+ Note that as q tends to zero this function tends to a limiting value and is
+ therefore less appropriate for use in systems where the form factor does not
+ do likewise. However, because of the transform, the correlation functions are
+ most affected by the Guinier back-extrapolation at *large* values of x where
+ the impact on any extrapolated parameters will be least significant.

  To :math:`Q = \infty`
  .....................

- The data are extrapolated to Q = :math:`\infty` by fitting a Porod model to
- the data points in the high-Q range.
+ The data are extrapolated towards q = :math:`\infty` by fitting a Porod model to
+ the data points in the high-q range and then computing the extrapolation to 100
+ times the maximum q value in the experimental dataset. This should be more than
+ sufficient to ensure that on transformation any truncation artefacts introduced
+ are at such small values of x that they can be safely ignored.

  The equation used is:

  .. math::
-     I(Q) = K Q^{-4}e^{-Q^2\sigma^2} + Bg
-
- Where :math:`Bg` is the background, :math:`K` is the Porod
- constant, and :math:`\sigma` (which must be > 0) describes the width of the electron or neutron scattering length density profile at the interface between the crystalline and amorphous
- regions as shown below.
+     I(q) = K q^{-4}e^{-q^2\sigma^2} + Bg
+
+ Where :math:`Bg` is the background, :math:`K` is the Porod constant, and :math:`\sigma` (which
+ must be > 0) describes the width of the electron/neutron scattering length density
+ profile at the interface between the crystalline and amorphous regions as shown below.

  .. figure:: fig1.png
…
  ---------

- The extrapolated data set consists of the Guinier back-extrapolation from Q~0
- up to the lowest Q value in the original data, then the original scattering data, and the Porod tail-fit beyond this. The joins between the original data and the Guinier/Porod fits are smoothed using the algorithm below to avoid the formation of ripples in the transformed data.
+ The extrapolated data set consists of the Guinier back-extrapolation from q ~ 0
+ up to the lowest q value in the original data, then the original scattering data,
+ and then the Porod tail-fit beyond this. The joins between the original data and
+ the Guinier/Porod extrapolations are smoothed using the algorithm below to try
+ and avoid the formation of truncation ripples in the transformed data:

  Functions :math:`f(x_i)` and :math:`g(x_i)` where :math:`x_i \in \left\{
…

- Transform
- ---------
+ Transformation
+ --------------

  Fourier
  .......

- If "Fourier" is selected for the transform type, the analysis will perform a
+ If "Fourier" is selected for the transform type, *SasView* will perform a
  discrete cosine transform on the extrapolated data in order to calculate the
- 1D correlation function:
-
- .. math::
-     \Gamma_{1D}(R) = \frac{1}{Q^{*}} \int_{0}^{\infty }I(q) q^{2} cos(qR) dq
-
- where Q\ :sup:`*` is the Scattering Invariant.
+ 1D correlation function as:
+
+ .. math::
+     \Gamma_{1}(x) = \frac{1}{Q^{*}} \int_{0}^{\infty }I(q) q^{2} cos(qx) dq
+
+ where Q\ :sup:`*` is the Scattering (also called Porod) Invariant.

  The following algorithm is applied:
…
  N-1, N

- The 3D correlation function is also calculated:
-
- .. math::
-     \Gamma_{3D}(R) = \frac{1}{Q^{*}} \int_{0}^{\infty}I(q) q^{2}
-     \frac{sin(qR)}{qR} dq
+ The 3D correlation function is calculated as:
+
+ .. math::
+     \Gamma_{3}(x) = \frac{1}{Q^{*}} \int_{0}^{\infty}I(q) q^{2}
+     \frac{sin(qx)}{qx} dq
+
+ .. note:: It is always advisable to inspect Γ\ :sub:`1`\ (x) and Γ\ :sub:`3`\ (x)
+    for artefacts arising from the extrapolation and transformation processes:
+
+    - do they tend to zero as x tends to :math:`\infty`?
+    - do they smoothly curve onto the ordinate at x = 0? (if not check the value
+      of :math:`\sigma` is sensible)
+    - are there ripples at x values corresponding to (2 :math:`pi` over) the two
+      q values at which the extrapolated and experimental data are merged?
+    - are there any artefacts at x values corresponding to 2 :math:`pi` / q\ :sub:`max` in
+      the experimental data?
+    - and lastly, do the significant features/peaks in the correlation functions
+      actually correspond to anticpated spacings in the sample?!!!
+
+ Finally, the program calculates the interface distribution function (IDF) g\ :sub:`1`\ (x) as
+ the discrete cosine transform of:
+
+ .. math::
+     -q^{4} I(q)
+
+ The IDF is proportional to the second derivative of Γ\ :sub:`1`\ (x).

  Hilbert
  .......

  If "Hilbert" is selected for the transform type, the analysis will perform a
  Hilbert transform on the extrapolated data in order to calculate the Volume
  Fraction Profile.

- .. note:: This functionality is not yet implemented in SasView.
+ .. note:: The Hilbert transform functionality is not yet implemented in SasView.

…
  ....................

- Once the correlation function has been calculated it may be interpreted by clicking the "Compute Parameters" button.
-
- The correlation function is interpreted in terms of an ideal lamellar
- morphology, and structural parameters are obtained from it as shown below.
- It should be noted that a small beam size is assumed; ie, no de-smearing is
- performed.
+ Once the correlation functions have been calculated *SasView* can be asked to
+ try and interpret Γ\ :sub:`1`\ (x) in terms of an ideal lamellar morphology
+ as shown below.

  .. figure:: fig2.png
     :align: center

- The structural parameters obtained are:
+ The structural parameters extracted are:

  * Long Period :math:`= L_p`
…
  .......................

- SasView does not provide any automatic interpretation of volume fraction profiles in the same way that it does for correlation functions. However, a number of structural parameters are obtainable by other means:
+ SasView does not provide any automatic interpretation of volume fraction profiles
+ in the same way that it does for correlation functions. However, a number of
+ structural parameters are obtainable by other means:

  * Surface Coverage :math:`=\theta`
…
     :align: center

+ The reader is directed to the references for information on these parameters.

  References
  ----------

+ Correlation Function
+ ....................
+
  Strobl, G. R.; Schneider, M. *J. Polym. Sci.* (1980), 18, 1343-1359
…
  Baltá Calleja, F. J.; Vonk, C. G. *X-ray Scattering of Synthetic Poylmers*, Elsevier. Amsterdam (1989), 260-270

+ Göschel, U.; Urban, G. *Polymer* (1995), 36, 3633-3639
+
+ Stribeck, N. *X-Ray Scattering of Soft Matter*, Springer. Berlin (2007), 138-161
+
  :ref:`FDR` (PDF format)
+
+ Volume Fraction Profile
+ .......................
+
+ Washington, C.; King, S. M. *J. Phys. Chem.*, (1996), 100, 7603-7609
+
+ Cosgrove, T.; King, S. M.; Griffiths, P. C. *Colloid-Polymer Interactions: From Fundamentals to Practice*, Wiley. New York (1999), 193-204
+
+ King, S. M.; Griffiths, P. C.; Cosgrove, T. *Applications of Neutron Scattering to Soft Condensed Matter*, Gordon & Breach. Amsterdam (2000), 77-105
+
+ King, S.; Griffiths, P.; Hone, J.; Cosgrove, T. *Macromol. Symp.* (2002), 190, 33-42

  .. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ
…
  Upon sending data for correlation function analysis, it will be plotted (minus
  the background value), along with a *red* bar indicating the *upper end of the
- low-Q range* (used for back-extrapolation), and 2 *purple* bars indicating the range to be used for forward-extrapolation. These bars may be moved my clicking and
- dragging, or by entering appropriate values in the Q range input boxes.
+ low-Q range* (used for Guinier back-extrapolation), and 2 *purple* bars indicating
+ the range to be used for Porod forward-extrapolation. These bars may be moved by
+ grabbing and dragging, or by entering appropriate values in the Q range input boxes.

  .. figure:: tutorial1.png
     :align: center

- Once the Q ranges have been set, click the "Calculate" button to determine the background level. Alternatively, enter your own value into the field. If the box turns yellow this indicates that background subtraction has resulted in some negative intensities.
-
- Click the "Extrapolate" button to extrapolate the data and plot the extrapolation in the same figure. The values of the parameters used for the Guinier and Porod models will also be shown in the "Extrapolation Parameters" section of the window.
+ Once the Q ranges have been set, click the "Calculate Bg" button to determine the
+ background level. Alternatively, enter your own value into the box. If the box turns
+ yellow this indicates that background subtraction has created some negative intensities.
+
+ Now click the "Extrapolate" button to extrapolate the data. The graph window will update
+ to show the extrapolated data, and the values of the parameters used for the Guinier and
+ Porod extrapolations will appear in the "Extrapolation Parameters" section of the SasView
+ GUI.

  .. figure:: tutorial2.png
…
  buttons:

- * **Fourier** Perform a Fourier Transform to calculate the correlation
-   function
- * **Hilbert** Perform a Hilbert Transform to calculate the volume fraction
+ * **Fourier**: to perform a Fourier Transform to calculate the correlation
+   functions
+ * **Hilbert**: to perform a Hilbert Transform to calculate the volume fraction
    profile

- Click the "Transform" button to perform the selected transform and plot
- the result in a new graph window.
-
- If a Fourier Transform was performed, the "Compute Parameters" button can now be clicked to interpret the correlation function as described earlier.
+ and click the "Transform" button to perform the selected transform and plot
+ the results.

  .. figure:: tutorial3.png
     :align: center

+ If a Fourier Transform was performed, the "Compute Parameters" button can now be
+ clicked to interpret the correlation function as described earlier. The parameters
+ will appear in the "Output Parameters" section of the SasView GUI.
+
+ .. figure:: tutorial4.png
+    :align: center
+

  .. note::
-     This help document was last changed by Steve King, 08Oct2016
+     This help document was last changed by Steve King, 26Sep2017

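The updated help text defines the 1D correlation function as Γ1(x) = (1/Q*) ∫ I(q) q² cos(qx) dq, with Q* the scattering invariant. SasView evaluates this with a discrete cosine transform of the extrapolated data; the sketch below instead does the same integral by direct quadrature on an invented smooth I(q), which is enough to check that Γ1(0) = 1 and that the function decays with x. Nothing here is the SasView implementation:

    import numpy as np

    def gamma_1(q, iq, x):
        """Direct quadrature of Gamma_1(x) = (1/Q*) * integral I(q) q^2 cos(q x) dq,
        where Q* is the scattering (Porod) invariant, as defined in the help text."""
        invariant = np.trapz(iq * q**2, q)                      # Q*
        integrand = iq[None, :] * q[None, :]**2 * np.cos(np.outer(x, q))
        return np.trapz(integrand, q, axis=1) / invariant

    # Invented "extrapolated" data: a smooth, monotonically decaying I(q)
    q = np.linspace(1e-4, 5.0, 2000)        # 1/Angstrom
    iq = 1.0 / (1.0 + (q / 0.05)**4)        # hypothetical intensity
    x = np.linspace(0.0, 300.0, 301)        # Angstrom

    print(gamma_1(q, iq, x)[:3])            # Gamma_1(0) is ~1 by construction
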
src/sas/sascalc/dataloader/data_info.py
r17e257b5 → rdeaa0c6

          final_dataset.yaxis(data._yaxis, data._yunit)
          final_dataset.zaxis(data._zaxis, data._zunit)
-         final_dataset.x_bins = data.x_bins
-         final_dataset.y_bins = data.y_bins
+         if len(data.data.shape) == 2:
+             n_rows, n_cols = data.data.shape
+             final_dataset.y_bins = data.qy_data[0::int(n_cols)]
+             final_dataset.x_bins = data.qx_data[:int(n_cols)]
      else:
          return_string = "Should Never Happen: _combine_data_info_with_plottable input is not a plottable1d or " + \

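The replacement logic reconstructs the axis bins of a regular 2-D data set from its flattened qx/qy arrays: with the data stored row-major, the first n_cols entries of qx_data hold every column value once, and every n_cols-th entry of qy_data gives one value per row. A small self-contained illustration of that slicing (the grid here is invented):

    import numpy as np

    # Build a toy regular detector grid and flatten it the way the loaders do
    x_bins = np.linspace(-0.1, 0.1, 5)            # 5 columns of qx
    y_bins = np.linspace(-0.2, 0.2, 4)            # 4 rows of qy
    qx_grid, qy_grid = np.meshgrid(x_bins, y_bins)
    data = np.random.rand(4, 5)                   # shape (n_rows, n_cols)
    qx_data, qy_data = qx_grid.flatten(), qy_grid.flatten()

    n_rows, n_cols = data.shape
    recovered_x_bins = qx_data[:int(n_cols)]      # first row holds every qx once
    recovered_y_bins = qy_data[0::int(n_cols)]    # one qy value per row

    assert np.allclose(recovered_x_bins, x_bins)
    assert np.allclose(recovered_y_bins, y_bins)
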
src/sas/sascalc/dataloader/file_reader_base_class.py
rae69c690 → rdeaa0c6

          dataset.x_bins = dataset.qx_data[:int(n_cols)]
          dataset.data = dataset.data.flatten()
+         if len(dataset.data) > 0:
+             dataset.xmin = np.min(dataset.qx_data)
+             dataset.xmax = np.max(dataset.qx_data)
+             dataset.ymin = np.min(dataset.qy_data)
+             dataset.ymax = np.max(dataset.qx_data)

      def format_unit(self, unit=None):
…
          self.output = []

-     def remove_empty_q_values(self, has_error_dx=False, has_error_dy=False,
-                               has_error_dxl=False, has_error_dxw=False):
+     def data_cleanup(self):
+         """
+         Clean up the data sets and refresh everything
+         :return: None
+         """
+         self.remove_empty_q_values()
+         self.send_to_output()  # Combine datasets with DataInfo
+         self.current_datainfo = DataInfo()  # Reset DataInfo
+
+     def remove_empty_q_values(self):
          """
          Remove any point where Q == 0
          """
-         x = self.current_dataset.x
-         self.current_dataset.x = self.current_dataset.x[x != 0]
-         self.current_dataset.y = self.current_dataset.y[x != 0]
-         if has_error_dy:
-             self.current_dataset.dy = self.current_dataset.dy[x != 0]
-         if has_error_dx:
-             self.current_dataset.dx = self.current_dataset.dx[x != 0]
-         if has_error_dxl:
-             self.current_dataset.dxl = self.current_dataset.dxl[x != 0]
-         if has_error_dxw:
-             self.current_dataset.dxw = self.current_dataset.dxw[x != 0]
+         if isinstance(self.current_dataset, plottable_1D):
+             # Booleans for resolutions
+             has_error_dx = self.current_dataset.dx is not None
+             has_error_dxl = self.current_dataset.dxl is not None
+             has_error_dxw = self.current_dataset.dxw is not None
+             has_error_dy = self.current_dataset.dy is not None
+             # Create arrays of zeros for non-existent resolutions
+             if has_error_dxw and not has_error_dxl:
+                 array_size = self.current_dataset.dxw.size - 1
+                 self.current_dataset.dxl = np.append(self.current_dataset.dxl,
+                                                      np.zeros([array_size]))
+                 has_error_dxl = True
+             elif has_error_dxl and not has_error_dxw:
+                 array_size = self.current_dataset.dxl.size - 1
+                 self.current_dataset.dxw = np.append(self.current_dataset.dxw,
+                                                      np.zeros([array_size]))
+                 has_error_dxw = True
+             elif not has_error_dxl and not has_error_dxw and not has_error_dx:
+                 array_size = self.current_dataset.x.size - 1
+                 self.current_dataset.dx = np.append(self.current_dataset.dx,
+                                                     np.zeros([array_size]))
+                 has_error_dx = True
+             if not has_error_dy:
+                 array_size = self.current_dataset.y.size - 1
+                 self.current_dataset.dy = np.append(self.current_dataset.dy,
+                                                     np.zeros([array_size]))
+                 has_error_dy = True
+
+             # Remove points where q = 0
+             x = self.current_dataset.x
+             self.current_dataset.x = self.current_dataset.x[x != 0]
+             self.current_dataset.y = self.current_dataset.y[x != 0]
+             if has_error_dy:
+                 self.current_dataset.dy = self.current_dataset.dy[x != 0]
+             if has_error_dx:
+                 self.current_dataset.dx = self.current_dataset.dx[x != 0]
+             if has_error_dxl:
+                 self.current_dataset.dxl = self.current_dataset.dxl[x != 0]
+             if has_error_dxw:
+                 self.current_dataset.dxw = self.current_dataset.dxw[x != 0]
+         elif isinstance(self.current_dataset, plottable_2D):
+             has_error_dqx = self.current_dataset.dqx_data is not None
+             has_error_dqy = self.current_dataset.dqy_data is not None
+             has_error_dy = self.current_dataset.err_data is not None
+             has_mask = self.current_dataset.mask is not None
+             x = self.current_dataset.qx_data
+             self.current_dataset.data = self.current_dataset.data[x != 0]
+             self.current_dataset.qx_data = self.current_dataset.qx_data[x != 0]
+             self.current_dataset.qy_data = self.current_dataset.qy_data[x != 0]
+             self.current_dataset.q_data = np.sqrt(
+                 np.square(self.current_dataset.qx_data) + np.square(
+                     self.current_dataset.qy_data))
+             if has_error_dy:
+                 self.current_dataset.err_data = self.current_dataset.err_data[x != 0]
+             if has_error_dqx:
+                 self.current_dataset.dqx_data = self.current_dataset.dqx_data[x != 0]
+             if has_error_dqy:
+                 self.current_dataset.dqy_data = self.current_dataset.dqy_data[x != 0]
+             if has_mask:
+                 self.current_dataset.mask = self.current_dataset.mask[x != 0]

      def reset_data_list(self, no_lines=0):

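The rewritten remove_empty_q_values applies one q != 0 boolean mask to every parallel array, so intensities, resolutions and masks stay the same length after the cut. The essential pattern, reduced to plain numpy arrays with invented values (this is not the reader class itself):

    import numpy as np

    q = np.array([0.0, 0.01, 0.02, 0.0, 0.03])
    intensity = np.array([9.9, 1.0, 0.8, 9.9, 0.6])
    dq = np.array([0.0, 0.001, 0.001, 0.0, 0.002])

    keep = q != 0                       # one boolean mask, reused everywhere
    q, intensity, dq = q[keep], intensity[keep], dq[keep]

    print(q)          # [0.01 0.02 0.03] -- all arrays stay aligned
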
src/sas/sascalc/dataloader/readers/abs_reader.py
rad92c5a → rffb6474

              # Sample thickness in mm
              try:
-                 value = float(line_toks[5])
+                 value = float(line_toks[5][:-1])
                  if self.has_converter and \
                          self.current_datainfo.sample.thickness_unit != 'cm':
…
                  is_data_started = True

-         self.remove_empty_q_values(True, True)
+         self.remove_empty_q_values()

          # Sanity check

src/sas/sascalc/dataloader/readers/ascii_reader.py
rf994e8b1 → r7b07fbe

              raise FileContentsException(msg)

-         self.remove_empty_q_values(has_error_dx, has_error_dy)
+         self.remove_empty_q_values()
          self.current_dataset.xaxis("\\rm{Q}", 'A^{-1}')
          self.current_dataset.yaxis("\\rm{Intensity}", "cm^{-1}")

src/sas/sascalc/dataloader/readers/cansas_reader.py
rae69c690 → r62160509

          xml_file = self.f_open.name
          # We don't sure f_open since lxml handles opnening/closing files
-         if not self.f_open.closed:
-             self.f_open.close()
-
-         basename, _ = os.path.splitext(os.path.basename(xml_file))
-
          try:
              # Raises FileContentsException
              self.load_file_and_schema(xml_file, schema_path)
-             self.current_datainfo = DataInfo()
-             # Raises FileContentsException if file doesn't meet CanSAS schema
+             # Parse each SASentry
+             entry_list = self.xmlroot.xpath('/ns:SASroot/ns:SASentry',
+                                             namespaces={
+                                                 'ns': self.cansas_defaults.get(
+                                                     "ns")
+                                             })
              self.is_cansas(self.extension)
-             self.invalid = False  # If we reach this point then file must be valid CanSAS
-
-             # Parse each SASentry
-             entry_list = self.xmlroot.xpath('/ns:SASroot/ns:SASentry', namespaces={
-                 'ns': self.cansas_defaults.get("ns")
-             })
-             # Look for a SASentry
-             self.names.append("SASentry")
              self.set_processing_instructions()
-
              for entry in entry_list:
-                 self.current_datainfo.filename = basename + self.extension
-                 self.current_datainfo.meta_data["loader"] = "CanSAS XML 1D"
-                 self.current_datainfo.meta_data[PREPROCESS] = self.processing_instructions
                  self._parse_entry(entry)
                  self.data_cleanup()
…
              invalid_xml = self.find_invalid_xml()
              if invalid_xml != "":
+                 basename, _ = os.path.splitext(
+                     os.path.basename(self.f_open.name))
                  invalid_xml = INVALID_XML.format(basename + self.extension) + invalid_xml
                  raise DataReaderException(invalid_xml)  # Handled by base class
…
          except Exception as e:  # Convert all other exceptions to FileContentsExceptions
              raise FileContentsException(e.message)
-
+         finally:
+             if not self.f_open.closed:
+                 self.f_open.close()

      def load_file_and_schema(self, xml_file, schema_path=""):
…
          if not self._is_call_local() and not recurse:
              self.reset_state()
+         if not recurse:
+             self.current_datainfo = DataInfo()
+             # Raises FileContentsException if file doesn't meet CanSAS schema
+             self.invalid = False
+             # Look for a SASentry
              self.data = []
-             self.current_datainfo = DataInfo()
+             self.parent_class = "SASentry"
              self.names.append("SASentry")
-             self.parent_class = "SASentry"
+             self.current_datainfo.meta_data["loader"] = "CanSAS XML 1D"
+             self.current_datainfo.meta_data[
+                 PREPROCESS] = self.processing_instructions
+         if self._is_call_local() and not recurse:
+             basename, _ = os.path.splitext(os.path.basename(self.f_open.name))
+             self.current_datainfo.filename = basename + self.extension
          # Create an empty dataset if no data has been passed to the reader
          if self.current_dataset is None:
-             self.current_dataset = plottable_1D(np.empty(0), np.empty(0),
-                                                 np.empty(0), np.empty(0))
+             self._initialize_new_data_set(dom)
          self.base_ns = "{" + CANSAS_NS.get(self.cansas_version).get("ns") + "}"
…
          tagname_original = tagname
          # Skip this iteration when loading in save state information
-         if tagname == "fitting_plug_in" or tagname == "pr_inversion" or tagname == "invariant":
+         if tagname in ["fitting_plug_in", "pr_inversion", "invariant", "corfunc"]:
              continue
          # Get where to store content
…
              self._add_intermediate()
          else:
+             # TODO: Clean this up to make it faster (fewer if/elifs)
              if isinstance(self.current_dataset, plottable_2D):
                  data_point = node.text
…
          self.sort_two_d_data()
          self.reset_data_list()
-         empty = None
-         return self.output[0], empty
-
-     def data_cleanup(self):
-         """
-         Clean up the data sets and refresh everything
-         :return: None
-         """
-         has_error_dx = self.current_dataset.dx is not None
-         has_error_dxl = self.current_dataset.dxl is not None
-         has_error_dxw = self.current_dataset.dxw is not None
-         has_error_dy = self.current_dataset.dy is not None
-         self.remove_empty_q_values(has_error_dx=has_error_dx,
-                                    has_error_dxl=has_error_dxl,
-                                    has_error_dxw=has_error_dxw,
-                                    has_error_dy=has_error_dy)
-         self.send_to_output()  # Combine datasets with DataInfo
-         self.current_datainfo = DataInfo()  # Reset DataInfo
+         return self.output[0], None

      def _is_call_local(self):
…
              self.aperture = Aperture()
          elif self.parent_class == 'SASdata':
-             self._check_for_empty_resolution()
              self.data.append(self.current_dataset)
…
          if 'unit' in attr and attr.get('unit') is not None:
              try:
-                 local_unit = attr['unit']
+                 unit = attr['unit']
+                 unit_list = unit.split("|")
+                 if len(unit_list) > 1:
+                     self.current_dataset.xaxis(unit_list[0].strip(),
+                                                unit_list[1].strip())
+                     local_unit = unit_list[1]
+                 else:
+                     local_unit = unit
                  unitname = self.ns_list.current_level.get("unit", "")
                  if "SASdetector" in self.names:
…
          return node_value, value_unit

-     def _check_for_empty_resolution(self):
-         """
-         a method to check all resolution data sets are the same size as I and q
-         """
-         dql_exists = False
-         dqw_exists = False
-         dq_exists = False
-         di_exists = False
-         if self.current_dataset.dxl is not None:
-             dql_exists = True
-         if self.current_dataset.dxw is not None:
-             dqw_exists = True
-         if self.current_dataset.dx is not None:
-             dq_exists = True
-         if self.current_dataset.dy is not None:
-             di_exists = True
-         if dqw_exists and not dql_exists:
-             array_size = self.current_dataset.dxw.size
-             self.current_dataset.dxl = np.zeros(array_size)
-         elif dql_exists and not dqw_exists:
-             array_size = self.current_dataset.dxl.size
-             self.current_dataset.dxw = np.zeros(array_size)
-         elif not dql_exists and not dqw_exists and not dq_exists:
-             array_size = self.current_dataset.x.size
-             self.current_dataset.dx = np.append(self.current_dataset.dx,
-                                                 np.zeros([array_size]))
-         if not di_exists:
-             array_size = self.current_dataset.y.size
-             self.current_dataset.dy = np.append(self.current_dataset.dy,
-                                                 np.zeros([array_size]))
-
      def _initialize_new_data_set(self, node=None):
          if node is not None:

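Part of this diff teaches the reader to accept unit attributes of the form "label | unit" as well as a bare unit string. A stripped-down sketch of that parsing rule (the helper name and example values are invented and are not part of the reader's API):

    def split_unit(attr_value):
        """Return (axis_label, unit) from a CanSAS-style unit attribute.
        Accepts either a plain unit string or a "label | unit" pair."""
        parts = attr_value.split("|")
        if len(parts) > 1:
            return parts[0].strip(), parts[1].strip()
        return None, attr_value.strip()

    print(split_unit("1/A"))            # (None, '1/A')
    print(split_unit("Q | 1/A"))        # ('Q', '1/A')
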
src/sas/sasgui/perspectives/corfunc/corfunc_state.py
r2a399ca → r1fa4f736

                                        namespaces={'ns': CANSAS_NS})
          for entry in entry_list:
-             sas_entry, _ = self._parse_entry(entry)
              corstate = self._parse_state(entry)

              if corstate is not None:
+                 sas_entry, _ = self._parse_entry(entry)
                  sas_entry.meta_data['corstate'] = corstate
                  sas_entry.filename = corstate.file

src/sas/sasgui/perspectives/fitting/fitpage.py
r13374be → r48154abb

                  # Save state_fit
                  self.save_current_state_fit()
+                 self.onSmear(None)
+                 self._onDraw(None)
              except:
                  self._show_combox_helper()

src/sas/sasgui/perspectives/fitting/pagestate.py
rda9b239 → r1fa4f736

                                        namespaces={'ns': CANSAS_NS})
          for entry in entry_list:
-             try:
-                 sas_entry, _ = self._parse_save_state_entry(entry)
-             except:
-                 raise
              fitstate = self._parse_state(entry)
-
              # state could be None when .svs file is loaded
              # in this case, skip appending to output
              if fitstate is not None:
+                 try:
+                     sas_entry, _ = self._parse_save_state_entry(
+                         entry)
+                 except:
+                     raise
                  sas_entry.meta_data['fitstate'] = fitstate
                  sas_entry.filename = fitstate.file

src/sas/sasgui/perspectives/fitting/simfitpage.py
ra9f9ca4 → r9804394

          """

-         model_map = {}
+         init_map = {}
+         final_map = {}
          if fit.fit_panel.sim_page is None:
              fit.fit_panel.add_sim_page()
…
                  save_id = self._format_id(save_id)
                  if save_id == model_id:
-                     model_map[saved_model.pop('fit_page_source')] = \
-                         model[3].name
+                     inter_id = str(i) + str(i) + str(i) + str(i) + str(i)
+                     init_map[saved_model.pop('fit_page_source')] = inter_id
+                     final_map[inter_id] = model[3].name
                      check = bool(saved_model.pop('checked'))
                      sim_page.model_list[i][0].SetValue(check)
…
              param = item.pop('param_cbox')
              equality = item.pop('egal_txt')
-             for key, value in model_map.iteritems():
-                 model_cbox.replace(key, value)
-                 constraint_value.replace(key, value)
+             for key, value in init_map.items():
+                 model_cbox = model_cbox.replace(key, value)
+                 constraint_value = constraint_value.replace(key, value)
+             for key, value in final_map.items():
+                 model_cbox = model_cbox.replace(key, value)
+                 constraint_value = constraint_value.replace(key, value)

              sim_page.constraints_list[index][0].SetValue(model_cbox)

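The move from a single model_map to an init_map/final_map pair, together with assigning the results of str.replace (which the old code discarded, since Python strings are immutable), avoids renaming collisions: substituting new page names directly into a constraint string can re-match text that an earlier substitution just produced. Routing every old id through a unique intermediate token first makes the second pass unambiguous. A toy illustration with invented ids, not the SimultaneousFitPage code itself:

    # Naive single-pass renaming can clobber an earlier substitution:
    constraint = "M1.scale = 2 * M2.scale"
    naive = constraint.replace("M1", "M2").replace("M2", "M3")
    print(naive)                     # 'M3.scale = 2 * M3.scale'  -- wrong

    # Two passes through unique intermediate tokens keep the mapping unambiguous
    init_map = {"M1": "<<0>>", "M2": "<<1>>"}
    final_map = {"<<0>>": "M2", "<<1>>": "M3"}
    fixed = constraint
    for old, inter in init_map.items():
        fixed = fixed.replace(old, inter)
    for inter, new in final_map.items():
        fixed = fixed.replace(inter, new)
    print(fixed)                     # 'M2.scale = 2 * M3.scale'  -- intended result
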
src/sas/sasgui/perspectives/invariant/invariant_state.py
r7432acb → r1fa4f736

-
-             sas_entry, _ = self._parse_entry(entry)
              invstate = self._parse_state(entry)
-
              # invstate could be None when .svs file is loaded
              # in this case, skip appending to output
              if invstate is not None:
+                 sas_entry, _ = self._parse_entry(entry)
                  sas_entry.meta_data['invstate'] = invstate
                  sas_entry.filename = invstate.file

src/sas/sasgui/perspectives/pr/inversion_state.py
ra0e6b1b → r1fa4f736

-             sas_entry, _ = self._parse_entry(entry)
              prstate = self._parse_prstate(entry)
              #prstate could be None when .svs file is loaded
              #in this case, skip appending to output
              if prstate is not None:
+                 sas_entry, _ = self._parse_entry(entry)
                  sas_entry.meta_data['prstate'] = prstate
                  sas_entry.filename = prstate.file
