Changeset 4b001f3 in sasview

Ignore:
Timestamp:
Sep 27, 2017 10:47:44 AM (9 months ago)
Branches:
master, ESS_GUI, ESS_GUI_CategroyManager, ESS_GUI_Docs, ESS_GUI_Pr, ESS_GUI_corf, ESS_GUI_model_editor, ESS_GUI_py2_OLD, ESS_GUI_reporting, SVCC-1, SasView-664, ticket-1069, ticket-1094-headless, ticket-818, ticket-976, ticket885, ticket885b, unittest-saveload, win64bit_conda_vm
Children:
497e06d
Parents:
48154abb (diff), ad476d1 (diff)
Note: this is a merge changeset, the changes displayed below correspond to the merge itself.
Use the (diff) links above to see all the changes relative to each parent.
Message:

Merge branch 'master' into ticket-915

Files:
16 edited
2 moved

• src/sas/sascalc/corfunc/corfunc_calculator.py

 ra859f99
    params, s2 = self._fit_data(q, iq)
    # Extrapolate to 100*Qmax in experimental data
    qs = np.arange(0, q[-1]*100, (q[1]-q[0]))
    iqs = s2(qs)
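The hunk above extends the q-grid from zero to 100 times the maximum experimental q before evaluating the Porod extrapolation. A minimal standalone sketch of that grid construction, using hypothetical data in place of the loaded dataset:

```python
import numpy as np

# Hypothetical experimental q values with uniform spacing
q = np.linspace(0.01, 0.5, 50)

# Extend from 0 to 100*Qmax using the experimental spacing,
# mirroring the np.arange call in the hunk above
qs = np.arange(0, q[-1] * 100, q[1] - q[0])

# qs now runs from 0 up to (just below) 100*Qmax with the same step
```

The step is taken from the first two experimental points, so the extrapolated grid matches the measured resolution.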
• src/sas/sasgui/perspectives/corfunc/media/corfunc_help.rst

 rf80b416e

Description
-----------

This currently performs correlation function analysis on SAXS/SANS data, but in the future it is also planned to generate model-independent volume fraction profiles from the SANS from adsorbed polymer/surfactant layers. The two types of analyses differ in the mathematical transform that is applied to the data (Fourier vs Hilbert). However, both functions are returned in *real space*.

A correlation function may be interpreted in terms of an imaginary rod moving through the structure of the material. Γ(x) is the probability that a rod of length x moving through the material has equal electron/neutron scattering length density at either end. Hence a frequently occurring spacing within a structure will manifest itself as a peak in Γ(x). *SasView* will return both the one-dimensional ( Γ\ :sub:`1`\ (x) ) and three-dimensional ( Γ\ :sub:`3`\ (x) ) correlation functions, the difference being that the former is only averaged in the plane of the scattering vector.

A volume fraction profile :math:`\Phi`\ (z) describes how the density of polymer segments/surfactant molecules varies with distance, z, normal to an (assumed locally flat) interface. The form of :math:`\Phi`\ (z) can provide information about the arrangement of polymer/surfactant molecules at the interface. The width of the profile provides measures of the layer thickness, and the area under the profile is related to the amount of material that is adsorbed.

Both analyses are performed in 3 stages:

*  Extrapolation of the scattering curve to :math:`Q = 0` and toward :math:`Q = \infty`
*  Smoothed merging of the two extrapolations into the original data
*  Fourier / Hilbert Transform of the smoothed data to give the correlation function or volume fraction profile, respectively
*  (Optional) Interpretation of Γ\ :sub:`1`\ (x) assuming the sample conforms to an ideal lamellar morphology

.. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ

Extrapolation
-------------

To :math:`Q = 0`
................

The data are extrapolated to q = 0 by fitting a Guinier function to the data points in the low-q range. The equation used is:

.. math::
    I(q) = A e^{Bq^2}

Where the parameter :math:`B` is related to the effective radius-of-gyration of a spherical object having the same small-angle scattering in this region. Note that as q tends to zero this function tends to a limiting value and is therefore less appropriate for use in systems where the form factor does not do likewise. However, because of the transform, the correlation functions are most affected by the Guinier back-extrapolation at *large* values of x where the impact on any extrapolated parameters will be least significant.

To :math:`Q = \infty`
.....................

The data are extrapolated towards q = :math:`\infty` by fitting a Porod model to the data points in the high-q range and then computing the extrapolation to 100 times the maximum q value in the experimental dataset. This should be more than sufficient to ensure that on transformation any truncation artefacts introduced are at such small values of x that they can be safely ignored. The equation used is:

.. math::
    I(q) = K q^{-4} e^{-q^2\sigma^2} + Bg

Where :math:`Bg` is the background, :math:`K` is the Porod constant, and :math:`\sigma` (which must be > 0) describes the width of the electron/neutron scattering length density profile at the interface between the crystalline and amorphous regions as shown below.

.. figure:: fig1.png
   :align: center

Smoothing
---------

The extrapolated data set consists of the Guinier back-extrapolation from q ~ 0 up to the lowest q value in the original data, then the original scattering data, and then the Porod tail-fit beyond this. The joins between the original data and the Guinier/Porod extrapolations are smoothed using the algorithm below to try and avoid the formation of truncation ripples in the transformed data:

Functions :math:`f(x_i)` and :math:`g(x_i)` where :math:`x_i \in \left\{`

Transformation
--------------

Fourier
.......

If "Fourier" is selected for the transform type, *SasView* will perform a discrete cosine transform on the extrapolated data in order to calculate the 1D correlation function as:

.. math::
    \Gamma_1(x) = \frac{1}{Q^{*}} \int_{0}^{\infty} I(q) q^{2} \cos(qx) dq

where Q\ :sup:`*` is the Scattering (also called Porod) Invariant.

The 3D correlation function is calculated as:

.. math::
    \Gamma_3(x) = \frac{1}{Q^{*}} \int_{0}^{\infty} I(q) q^{2} \frac{\sin(qx)}{qx} dq

.. note:: It is always advisable to inspect Γ\ :sub:`1`\ (x) and Γ\ :sub:`3`\ (x) for artefacts arising from the extrapolation and transformation processes:

   - do they tend to zero as x tends to :math:`\infty`?
   - do they smoothly curve onto the ordinate at x = 0? (if not check the value of :math:`\sigma` is sensible)
   - are there ripples at x values corresponding to (2 :math:`\pi` over) the two q values at which the extrapolated and experimental data are merged?
   - are there any artefacts at x values corresponding to 2 :math:`\pi` / q\ :sub:`max` in the experimental data?
   - and lastly, do the significant features/peaks in the correlation functions actually correspond to anticipated spacings in the sample?

Finally, the program calculates the interface distribution function (IDF) g\ :sub:`1`\ (x) as the discrete cosine transform of:

.. math::
    -q^{4} I(q)

The IDF is proportional to the second derivative of Γ\ :sub:`1`\ (x).

Hilbert
.......

If "Hilbert" is selected for the transform type, the analysis will perform a Hilbert transform on the extrapolated data in order to calculate the volume fraction profile.

.. note:: The Hilbert transform functionality is not yet implemented in SasView.

Interpretation
--------------

Correlation Function
....................

Once the correlation functions have been calculated *SasView* can be asked to try and interpret Γ\ :sub:`1`\ (x) in terms of an ideal lamellar morphology as shown below. It should be noted that a small beam size is assumed; i.e., no de-smearing is performed.

.. figure:: fig2.png
   :align: center

The structural parameters extracted are:

*   Long Period :math:`= L_p`

Volume Fraction Profile
.......................

SasView does not provide any automatic interpretation of volume fraction profiles in the same way that it does for correlation functions. However, a number of structural parameters are obtainable by other means:

*   Surface Coverage :math:`=\theta`

The reader is directed to the references for information on these parameters.

References
----------

Correlation Function
....................

Strobl, G. R.; Schneider, M. *J. Polym. Sci.* (1980), 18, 1343-1359

Baltá Calleja, F. J.; Vonk, C. G. *X-ray Scattering of Synthetic Polymers*, Elsevier. Amsterdam (1989), 260-270

Göschel, U.; Urban, G. *Polymer* (1995), 36, 3633-3639

Stribeck, N. *X-Ray Scattering of Soft Matter*, Springer. Berlin (2007), 138-161

:ref:`FDR` (PDF format)

Volume Fraction Profile
.......................

Washington, C.; King, S. M. *J. Phys. Chem.*, (1996), 100, 7603-7609

Cosgrove, T.; King, S. M.; Griffiths, P. C. *Colloid-Polymer Interactions: From Fundamentals to Practice*, Wiley. New York (1999), 193-204

King, S. M.; Griffiths, P. C.; Cosgrove, T. *Applications of Neutron Scattering to Soft Condensed Matter*, Gordon & Breach. Amsterdam (2000), 77-105

King, S.; Griffiths, P.; Hone, J.; Cosgrove, T. *Macromol. Symp.* (2002), 190, 33-42

.. ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ

Usage
-----

Upon sending data for correlation function analysis, it will be plotted (minus the background value), along with a *red* bar indicating the *upper end of the low-Q range* (used for Guinier back-extrapolation), and 2 *purple* bars indicating the range to be used for Porod forward-extrapolation. These bars may be moved by grabbing and dragging, or by entering appropriate values in the Q range input boxes.

.. figure:: tutorial1.png
   :align: center

Once the Q ranges have been set, click the "Calculate Bg" button to determine the background level. Alternatively, enter your own value into the box. If the box turns yellow this indicates that background subtraction has created some negative intensities.

Now click the "Extrapolate" button to extrapolate the data. The graph window will update to show the extrapolated data, and the values of the parameters used for the Guinier and Porod extrapolations will appear in the "Extrapolation Parameters" section of the SasView GUI.

.. figure:: tutorial2.png
   :align: center

Now select which transform type you would like to perform using the radio buttons:

*   **Fourier**: to perform a Fourier Transform to calculate the correlation functions
*   **Hilbert**: to perform a Hilbert Transform to calculate the volume fraction profile

and click the "Transform" button to perform the selected transform and plot the results.

.. figure:: tutorial3.png
   :align: center

If a Fourier Transform was performed, the "Compute Parameters" button can now be clicked to interpret the correlation function as described earlier. The parameters will appear in the "Output Parameters" section of the SasView GUI.

.. figure:: tutorial4.png
   :align: center

.. note:: This help document was last changed by Steve King, 26Sep2017
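The help text above defines the 1D correlation function as a cosine transform of I(q)q² normalised by the scattering (Porod) invariant Q*. A minimal numerical sketch of that definition, using hypothetical smooth data (SasView itself applies a discrete cosine transform to the extrapolated curve; this direct quadrature is only illustrative):

```python
import numpy as np

def _trapz(y, x):
    """Simple trapezoidal quadrature (avoids NumPy version differences)."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def gamma1(q, iq, x):
    """Gamma_1(x) = (1/Q*) * integral of I(q) q^2 cos(qx) dq,
    where Q* = integral of I(q) q^2 dq is the scattering invariant."""
    integrand = iq * q**2
    q_star = _trapz(integrand, q)
    return _trapz(integrand * np.cos(q * x), q) / q_star

# Hypothetical smooth intensity curve standing in for extrapolated data
q = np.linspace(0.0, 50.0, 20000)
iq = np.exp(-q**2 / 4.0)

# Gamma_1(0) is exactly 1: cos(0) = 1, so the integral equals Q*
print(round(gamma1(q, iq, 0.0), 6))  # -> 1.0
```

This also illustrates the sanity check in the note above: a well-behaved Γ₁(x) starts at 1 at x = 0 and decays with increasing x.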

 r17e257b
    final_dataset.yaxis(data._yaxis, data._yunit)
    final_dataset.zaxis(data._zaxis, data._zunit)
    final_dataset.x_bins = data.x_bins
    final_dataset.y_bins = data.y_bins
    if len(data.data.shape) == 2:
        n_rows, n_cols = data.data.shape
        final_dataset.y_bins = data.qy_data[0::int(n_cols)]
        final_dataset.x_bins = data.qx_data[:int(n_cols)]
    else:
        return_string = "Should Never Happen: _combine_data_info_with_plottable input is not a plottable1d or " + \
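The hunk above rebuilds the axis bins of a 2D dataset from the flattened qx/qy arrays: one qy value per detector row (stride of `n_cols`) and the first row's qx values. A standalone sketch with a hypothetical 2×3 grid:

```python
import numpy as np

# Hypothetical flattened 2x3 detector grid (row-major order)
n_rows, n_cols = 2, 3
qx = np.tile([0.1, 0.2, 0.3], n_rows)   # qx varies along each row
qy = np.repeat([1.0, 2.0], n_cols)      # qy is constant within a row

# Same slicing as the changeset hunk above
y_bins = qy[0::int(n_cols)]             # first element of every row -> one qy per row
x_bins = qx[:int(n_cols)]               # the first row -> one qx per column
# y_bins == [1.0, 2.0], x_bins == [0.1, 0.2, 0.3]
```

The stride trick works only because the flattened arrays are in row-major order, which is why the hunk guards on `len(data.data.shape) == 2` before computing `n_cols`.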

 rae69c690
    dataset.x_bins = dataset.qx_data[:int(n_cols)]
    dataset.data = dataset.data.flatten()
    if len(dataset.data) > 0:
        dataset.xmin = np.min(dataset.qx_data)
        dataset.xmax = np.max(dataset.qx_data)
        dataset.ymin = np.min(dataset.qy_data)
        dataset.ymax = np.max(dataset.qy_data)

    def format_unit(self, unit=None):

    self.output = []

    def remove_empty_q_values(self, has_error_dx=False, has_error_dy=False,
                              has_error_dxl=False, has_error_dxw=False):

    def data_cleanup(self):
        """
        Clean up the data sets and refresh everything
        :return: None
        """
        self.remove_empty_q_values()
        self.send_to_output()  # Combine datasets with DataInfo
        self.current_datainfo = DataInfo()  # Reset DataInfo

    def remove_empty_q_values(self):
        """
        Remove any point where Q == 0
        """
        x = self.current_dataset.x
        self.current_dataset.x = self.current_dataset.x[x != 0]
        self.current_dataset.y = self.current_dataset.y[x != 0]
        if has_error_dy:
            self.current_dataset.dy = self.current_dataset.dy[x != 0]
        if has_error_dx:
            self.current_dataset.dx = self.current_dataset.dx[x != 0]
        if has_error_dxl:
            self.current_dataset.dxl = self.current_dataset.dxl[x != 0]
        if has_error_dxw:
            self.current_dataset.dxw = self.current_dataset.dxw[x != 0]

        if isinstance(self.current_dataset, plottable_1D):
            # Booleans for resolutions
            has_error_dx = self.current_dataset.dx is not None
            has_error_dxl = self.current_dataset.dxl is not None
            has_error_dxw = self.current_dataset.dxw is not None
            has_error_dy = self.current_dataset.dy is not None
            # Create arrays of zeros for non-existent resolutions
            if has_error_dxw and not has_error_dxl:
                array_size = self.current_dataset.dxw.size - 1
                self.current_dataset.dxl = np.append(self.current_dataset.dxl,
                                                     np.zeros([array_size]))
                has_error_dxl = True
            elif has_error_dxl and not has_error_dxw:
                array_size = self.current_dataset.dxl.size - 1
                self.current_dataset.dxw = np.append(self.current_dataset.dxw,
                                                     np.zeros([array_size]))
                has_error_dxw = True
            elif not has_error_dxl and not has_error_dxw and not has_error_dx:
                array_size = self.current_dataset.x.size - 1
                self.current_dataset.dx = np.append(self.current_dataset.dx,
                                                    np.zeros([array_size]))
                has_error_dx = True
            if not has_error_dy:
                array_size = self.current_dataset.y.size - 1
                self.current_dataset.dy = np.append(self.current_dataset.dy,
                                                    np.zeros([array_size]))
                has_error_dy = True
            # Remove points where q = 0
            x = self.current_dataset.x
            self.current_dataset.x = self.current_dataset.x[x != 0]
            self.current_dataset.y = self.current_dataset.y[x != 0]
            if has_error_dy:
                self.current_dataset.dy = self.current_dataset.dy[x != 0]
            if has_error_dx:
                self.current_dataset.dx = self.current_dataset.dx[x != 0]
            if has_error_dxl:
                self.current_dataset.dxl = self.current_dataset.dxl[x != 0]
            if has_error_dxw:
                self.current_dataset.dxw = self.current_dataset.dxw[x != 0]
        elif isinstance(self.current_dataset, plottable_2D):
            has_error_dqx = self.current_dataset.dqx_data is not None
            has_error_dqy = self.current_dataset.dqy_data is not None
            has_error_dy = self.current_dataset.err_data is not None
            has_mask = self.current_dataset.mask is not None
            x = self.current_dataset.qx_data
            self.current_dataset.data = self.current_dataset.data[x != 0]
            self.current_dataset.qx_data = self.current_dataset.qx_data[x != 0]
            self.current_dataset.qy_data = self.current_dataset.qy_data[x != 0]
            self.current_dataset.q_data = np.sqrt(
                np.square(self.current_dataset.qx_data) +
                np.square(self.current_dataset.qy_data))
            if has_error_dy:
                self.current_dataset.err_data = self.current_dataset.err_data[x != 0]
            if has_error_dqx:
                self.current_dataset.dqx_data = self.current_dataset.dqx_data[x != 0]
            if has_error_dqy:
                self.current_dataset.dqy_data = self.current_dataset.dqy_data[x != 0]
            if has_mask:
                self.current_dataset.mask = self.current_dataset.mask[x != 0]

    def reset_data_list(self, no_lines=0):
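The q == 0 removal in the hunks above applies a single boolean mask to every per-point array so the arrays stay the same length. A self-contained sketch of that pattern on hypothetical arrays:

```python
import numpy as np

def remove_empty_q_values(x, y, dy=None):
    """Drop every point where q == 0, applying one boolean mask to all
    per-point arrays so the dataset stays internally consistent."""
    mask = x != 0
    x, y = x[mask], y[mask]
    if dy is not None:
        dy = dy[mask]
    return x, y, dy

x = np.array([0.0, 0.1, 0.2, 0.0, 0.3])
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2, y2, _ = remove_empty_q_values(x, y)
# x2 == [0.1, 0.2, 0.3], y2 == [2.0, 3.0, 5.0]
```

The mask is computed once from q before any array is overwritten; reusing the already-filtered x would silently pass all points.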

 rad92c5a
    # Sample thickness in mm
    try:
        value = float(line_toks[5])
        value = float(line_toks[5][:-1])
        if self.has_converter and \
                self.current_datainfo.sample.thickness_unit != 'cm':

    is_data_started = True
    self.remove_empty_q_values(True, True)
    self.remove_empty_q_values()
    # Sanity check
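The thickness hunk above switches to parsing `line_toks[5][:-1]`, i.e. dropping a trailing character (presumably a unit suffix) before converting to float. A hypothetical sketch of that fallback pattern, not the reader's actual code:

```python
def parse_thickness(token):
    """Parse a thickness field that may carry a trailing unit character,
    e.g. '2.5m' -> 2.5 (a sketch of the [:-1] fallback in the hunk above)."""
    try:
        return float(token)
    except ValueError:
        # Strip one trailing non-numeric character and retry
        return float(token[:-1])

print(parse_thickness("2.5"))   # -> 2.5
print(parse_thickness("2.5m"))  # -> 2.5
```

A real reader would also handle tokens with longer unit suffixes; this only shows the single-character case the diff addresses.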

 rf994e8b1
    raise FileContentsException(msg)
    self.remove_empty_q_values(has_error_dx, has_error_dy)
    self.remove_empty_q_values()
    self.current_dataset.xaxis("\\rm{Q}", 'A^{-1}')
    self.current_dataset.yaxis("\\rm{Intensity}", "cm^{-1}")

 rae69c690
    xml_file = self.f_open.name
    # We don't use f_open since lxml handles opening/closing files
    if not self.f_open.closed:
        self.f_open.close()
    basename, _ = os.path.splitext(os.path.basename(xml_file))
    try:
        # Raises FileContentsException
        self.load_file_and_schema(xml_file, schema_path)
        self.current_datainfo = DataInfo()
        # Raises FileContentsException if file doesn't meet CanSAS schema
        # Parse each SASentry
        entry_list = self.xmlroot.xpath(
            '/ns:SASroot/ns:SASentry',
            namespaces={'ns': self.cansas_defaults.get("ns")})
        self.is_cansas(self.extension)
        self.invalid = False
        # If we reach this point then file must be valid CanSAS
        # Parse each SASentry
        entry_list = self.xmlroot.xpath(
            '/ns:SASroot/ns:SASentry',
            namespaces={'ns': self.cansas_defaults.get("ns")})
        # Look for a SASentry
        self.names.append("SASentry")
        self.set_processing_instructions()
        for entry in entry_list:
            self.current_datainfo.filename = basename + self.extension
            self.current_datainfo.meta_data["loader"] = "CanSAS XML 1D"
            self.current_datainfo.meta_data[PREPROCESS] = \
                self.processing_instructions
            self._parse_entry(entry)
            self.data_cleanup()
        invalid_xml = self.find_invalid_xml()
        if invalid_xml != "":
            basename, _ = os.path.splitext(
                os.path.basename(self.f_open.name))
            invalid_xml = INVALID_XML.format(basename + self.extension) + invalid_xml
            raise DataReaderException(invalid_xml)  # Handled by base class
    except Exception as e:
        # Convert all other exceptions to FileContentsExceptions
        raise FileContentsException(e.message)
    finally:
        if not self.f_open.closed:
            self.f_open.close()

    def load_file_and_schema(self, xml_file, schema_path=""):

    if not self._is_call_local() and not recurse:
        self.reset_state()
    if not recurse:
        self.current_datainfo = DataInfo()
        # Raises FileContentsException if file doesn't meet CanSAS schema
        self.invalid = False
        # Look for a SASentry
        self.data = []
        self.current_datainfo = DataInfo()
        self.parent_class = "SASentry"
        self.names.append("SASentry")
        self.parent_class = "SASentry"
        self.current_datainfo.meta_data["loader"] = "CanSAS XML 1D"
        self.current_datainfo.meta_data[PREPROCESS] = \
            self.processing_instructions
    if self._is_call_local() and not recurse:
        basename, _ = os.path.splitext(os.path.basename(self.f_open.name))
        self.current_datainfo.filename = basename + self.extension
    # Create an empty dataset if no data has been passed to the reader
    if self.current_dataset is None:
        self.current_dataset = plottable_1D(
            np.empty(0), np.empty(0), np.empty(0), np.empty(0))
        self._initialize_new_data_set(dom)
    self.base_ns = "{" + CANSAS_NS.get(self.cansas_version).get("ns") + "}"
    tagname_original = tagname
    # Skip this iteration when loading in save state information
    if tagname == "fitting_plug_in" or tagname == "pr_inversion" or tagname == "invariant":
    if tagname in ["fitting_plug_in", "pr_inversion", "invariant", "corfunc"]:
        continue
    # Get where to store content
    self._add_intermediate()
    else:
        # TODO: Clean this up to make it faster (fewer if/elifs)
        if isinstance(self.current_dataset, plottable_2D):
            data_point = node.text
    self.sort_two_d_data()
    self.reset_data_list()
    empty = None
    return self.output[0], empty

    def data_cleanup(self):
        """
        Clean up the data sets and refresh everything
        :return: None
        """
        has_error_dx = self.current_dataset.dx is not None
        has_error_dxl = self.current_dataset.dxl is not None
        has_error_dxw = self.current_dataset.dxw is not None
        has_error_dy = self.current_dataset.dy is not None
        self.remove_empty_q_values(has_error_dx=has_error_dx,
                                   has_error_dxl=has_error_dxl,
                                   has_error_dxw=has_error_dxw,
                                   has_error_dy=has_error_dy)
        self.send_to_output()  # Combine datasets with DataInfo
        self.current_datainfo = DataInfo()  # Reset DataInfo
        return self.output[0], None

    def _is_call_local(self):

    self.aperture = Aperture()
    elif self.parent_class == 'SASdata':
        self._check_for_empty_resolution()
        self.data.append(self.current_dataset)
    if 'unit' in attr and attr.get('unit') is not None:
        try:
            local_unit = attr['unit']
            unit = attr['unit']
            unit_list = unit.split("|")
            if len(unit_list) > 1:
                self.current_dataset.xaxis(unit_list[0].strip(),
                                           unit_list[1].strip())
                local_unit = unit_list[1]
            else:
                local_unit = unit
    unitname = self.ns_list.current_level.get("unit", "")
    if "SASdetector" in self.names:
    return node_value, value_unit

    def _check_for_empty_resolution(self):
        """
        A method to check all resolution data sets are the same size as I and q
        """
        dql_exists = False
        dqw_exists = False
        dq_exists = False
        di_exists = False
        if self.current_dataset.dxl is not None:
            dql_exists = True
        if self.current_dataset.dxw is not None:
            dqw_exists = True
        if self.current_dataset.dx is not None:
            dq_exists = True
        if self.current_dataset.dy is not None:
            di_exists = True
        if dqw_exists and not dql_exists:
            array_size = self.current_dataset.dxw.size
            self.current_dataset.dxl = np.zeros(array_size)
        elif dql_exists and not dqw_exists:
            array_size = self.current_dataset.dxl.size
            self.current_dataset.dxw = np.zeros(array_size)
        elif not dql_exists and not dqw_exists and not dq_exists:
            array_size = self.current_dataset.x.size
            self.current_dataset.dx = np.append(self.current_dataset.dx,
                                                np.zeros([array_size]))
        if not di_exists:
            array_size = self.current_dataset.y.size
            self.current_dataset.dy = np.append(self.current_dataset.dy,
                                                np.zeros([array_size]))

    def _initialize_new_data_set(self, node=None):
        if node is not None:
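The `_check_for_empty_resolution` hunk above zero-pads whichever slit-smearing component (dxl or dxw) is missing so both arrays match the data length. A standalone sketch of that symmetric padding, detached from the reader class:

```python
import numpy as np

def check_for_empty_resolution(dxl, dxw):
    """If only one slit-smearing component is present, fill the other
    with zeros of the same size (sketch of the hunk above)."""
    if dxw is not None and dxl is None:
        dxl = np.zeros(dxw.size)
    elif dxl is not None and dxw is None:
        dxw = np.zeros(dxl.size)
    return dxl, dxw

# Hypothetical dataset with dxw present but dxl missing
dxl, dxw = check_for_empty_resolution(None, np.ones(5))
# dxl is now a zero array the same size as dxw
```

Zero is a safe fill value here: a zero resolution component contributes nothing to the smearing computation, so downstream code can treat both arrays uniformly.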
• src/sas/sasgui/perspectives/corfunc/corfunc_state.py

 r2a399ca
    namespaces={'ns': CANSAS_NS})
    for entry in entry_list:
        sas_entry, _ = self._parse_entry(entry)
        corstate = self._parse_state(entry)
        if corstate is not None:
            sas_entry, _ = self._parse_entry(entry)
            sas_entry.meta_data['corstate'] = corstate
            sas_entry.filename = corstate.file
• src/sas/sasgui/perspectives/fitting/fitpage.py

 r13374be
    # Save state_fit
    self.save_current_state_fit()
    self.onSmear(None)
    self._onDraw(None)
    except:
        self._show_combox_helper()
• src/sas/sasgui/perspectives/fitting/pagestate.py

 rda9b239
    namespaces={'ns': CANSAS_NS})
    for entry in entry_list:
        try:
            sas_entry, _ = self._parse_save_state_entry(entry)
        except:
            raise
        fitstate = self._parse_state(entry)
        # state could be None when .svs file is loaded
        # in this case, skip appending to output
        if fitstate is not None:
            try:
                sas_entry, _ = self._parse_save_state_entry(entry)
            except:
                raise
            sas_entry.meta_data['fitstate'] = fitstate
            sas_entry.filename = fitstate.file
• src/sas/sasgui/perspectives/fitting/simfitpage.py

 ra9f9ca4
    """
    model_map = {}
    init_map = {}
    final_map = {}
    if fit.fit_panel.sim_page is None:
        fit.fit_panel.add_sim_page()
    save_id = self._format_id(save_id)
    if save_id == model_id:
        model_map[saved_model.pop('fit_page_source')] = model[3].name
        inter_id = str(i) + str(i) + str(i) + str(i) + str(i)
        init_map[saved_model.pop('fit_page_source')] = inter_id
        final_map[inter_id] = model[3].name
    check = bool(saved_model.pop('checked'))
    sim_page.model_list[i][0].SetValue(check)
    param = item.pop('param_cbox')
    equality = item.pop('egal_txt')
    for key, value in model_map.iteritems():
        model_cbox.replace(key, value)
        constraint_value.replace(key, value)
    for key, value in init_map.items():
        model_cbox = model_cbox.replace(key, value)
        constraint_value = constraint_value.replace(key, value)
    for key, value in final_map.items():
        model_cbox = model_cbox.replace(key, value)
        constraint_value = constraint_value.replace(key, value)
    sim_page.constraints_list[index][0].SetValue(model_cbox)
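The simfitpage hunk above renames fit-page ids inside constraint strings in two passes, first to intermediate ids (`init_map`) and then to final names (`final_map`), presumably so overlapping old/new names (e.g. M1→M2 while M2→M3) cannot clobber each other. A self-contained sketch of that two-pass technique; the function name and placeholder scheme are illustrative, not the actual SasView code:

```python
def remap_ids(text, mapping):
    """Rename ids via unique intermediate placeholders so that
    overlapping old/new names don't collide during replacement."""
    # Pass 1: old name -> unique placeholder that cannot occur in text
    init_map = {old: "\x00{}\x00".format(i)
                for i, old in enumerate(mapping)}
    # Pass 2: placeholder -> final name
    final_map = {init_map[old]: new for old, new in mapping.items()}
    for old, tmp in init_map.items():
        text = text.replace(old, tmp)
    for tmp, new in final_map.items():
        text = text.replace(tmp, new)
    return text

print(remap_ids("M1.scale = M2.scale", {"M1": "M2", "M2": "M3"}))
# -> M2.scale = M3.scale
```

A single-pass replace would first turn M1 into M2 and then incorrectly turn that new M2 into M3; note also that `str.replace` returns a new string, which is the bug the hunk's `model_cbox = model_cbox.replace(...)` reassignment fixes.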
• src/sas/sasgui/perspectives/invariant/invariant_state.py

 r7432acb
    for entry in entry_list:
        sas_entry, _ = self._parse_entry(entry)
        invstate = self._parse_state(entry)
        # invstate could be None when .svs file is loaded
        # in this case, skip appending to output
        if invstate is not None:
            sas_entry, _ = self._parse_entry(entry)
            sas_entry.meta_data['invstate'] = invstate
            sas_entry.filename = invstate.file
• src/sas/sasgui/perspectives/pr/inversion_state.py

 ra0e6b1b
    for entry in entry_list:
        sas_entry, _ = self._parse_entry(entry)
        prstate = self._parse_prstate(entry)
        # prstate could be None when .svs file is loaded
        # in this case, skip appending to output
        if prstate is not None:
            sas_entry, _ = self._parse_entry(entry)
            sas_entry.meta_data['prstate'] = prstate
            sas_entry.filename = prstate.file
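The corfunc, fitting, invariant, and pr hunks above all apply the same guard: only parse and append an entry when its saved analysis state is not None (which happens when a combined .svs save-state file carries entries for other perspectives). A minimal sketch of that shared pattern, with stand-in objects and callables in place of the real reader classes:

```python
def collect_entries(entries, parse_entry, parse_state):
    """Keep only entries whose perspective state parses to non-None,
    mirroring the 'state could be None when .svs file is loaded'
    guard added in the hunks above."""
    output = []
    for entry in entries:
        state = parse_state(entry)
        if state is None:
            continue  # entry belongs to a different perspective; skip it
        sas_entry, _ = parse_entry(entry)
        sas_entry.meta_data['prstate'] = state
        output.append(sas_entry)
    return output

class Entry:
    """Hypothetical stand-in for a parsed XML SASentry."""
    def __init__(self, state):
        self.state = state
        self.meta_data = {}

entries = [Entry("fit"), Entry(None), Entry("pr")]
kept = collect_entries(entries,
                       parse_entry=lambda e: (e, None),
                       parse_state=lambda e: e.state)
# kept contains only the two entries with a non-None state
```

Checking the state *before* calling `parse_entry` also avoids the wasted work of parsing entries that will be discarded, which is the structural change all four hunks make.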